npe2-0.7.2/.github_changelog_generator

# generate a new changelog with
# https://github.com/github-changelog-generator/github-changelog-generator
# set `CHANGELOG_GITHUB_TOKEN` to a github token, then
# e.g. `github_changelog_generator --future-release v0.1.0`
user=napari
project=npe2
issues=false
exclude-labels=duplicate,question,invalid,wontfix,hide
add-sections={"tests":{"prefix":"**Tests & CI:**","labels":["tests"]},"refactor":{"prefix":"**Refactors:**","labels":["refactor"]},"documentation":{"prefix":"**Documentation:**","labels":["documentation"]}}

npe2-0.7.2/.pre-commit-config.yaml

ci:
  autoupdate_schedule: monthly
  autofix_commit_msg: "style: [pre-commit.ci] auto fixes [...]"
  autoupdate_commit_msg: "ci: [pre-commit.ci] autoupdate"
exclude: _docs/example_plugin/some_module.py
repos:
  - repo: https://github.com/pre-commit/pre-commit-hooks
    rev: v4.4.0
    hooks:
      - id: check-docstring-first
      - id: end-of-file-fixer
      - id: trailing-whitespace
  - repo: https://github.com/psf/black
    rev: 23.3.0
    hooks:
      - id: black
  - repo: https://github.com/charliermarsh/ruff-pre-commit
    rev: v0.0.263
    hooks:
      - id: ruff
  - repo: https://github.com/pre-commit/mirrors-mypy
    rev: v1.2.0
    hooks:
      - id: mypy
        additional_dependencies:
          - types-toml
          - types-PyYAML
        exclude: npe2/implements.pyi|_docs/render.py

npe2-0.7.2/CHANGELOG.md

# Changelog

## [v0.7.1](https://github.com/napari/npe2/tree/v0.7.1) (2023-07-16)

[Full Changelog](https://github.com/napari/npe2/compare/v0.7.0...v0.7.1)

**Implemented enhancements:**

- feat: support python3.11 [\#293](https://github.com/napari/npe2/pull/293) ([tlambert03](https://github.com/tlambert03))
- add graph layer [\#292](https://github.com/napari/npe2/pull/292) ([JoOkuma](https://github.com/JoOkuma))

**Fixed bugs:**

- fix: use constraints in napari tests [\#298](https://github.com/napari/npe2/pull/298)
([Czaki](https://github.com/Czaki))
- Use full `plugin_name` when finding chosen `reader` rather than `startswith` [\#297](https://github.com/napari/npe2/pull/297) ([DragaDoncila](https://github.com/DragaDoncila))
- Change ArrayLike type to use read only properties [\#289](https://github.com/napari/npe2/pull/289) ([andy-sweet](https://github.com/andy-sweet))
- Bugfix: use .lower\(\) to make paths & pattern fnmatch case insensitive [\#275](https://github.com/napari/npe2/pull/275) ([psobolewskiPhD](https://github.com/psobolewskiPhD))

**Documentation:**

- Fix typo in `DynamicPlugin` [\#304](https://github.com/napari/npe2/pull/304) ([lucyleeow](https://github.com/lucyleeow))
- DOCS: Widget guide should consistently use parent=None kwarg in examples [\#302](https://github.com/napari/npe2/pull/302) ([GenevieveBuckley](https://github.com/GenevieveBuckley))

**Merged pull requests:**

- remove tomlpp [\#294](https://github.com/napari/npe2/pull/294) ([tlambert03](https://github.com/tlambert03))
- Use hatchling as build backend [\#282](https://github.com/napari/npe2/pull/282) ([tlambert03](https://github.com/tlambert03))

## [v0.7.0](https://github.com/napari/npe2/tree/v0.7.0) (2023-04-14)

[Full Changelog](https://github.com/napari/npe2/compare/v0.6.2...v0.7.0)

**Fixed bugs:**

- fix: pass command registry to rdr.exec in io\_utils.\_read [\#285](https://github.com/napari/npe2/pull/285) ([tlambert03](https://github.com/tlambert03))
- fix: use logger instead of warning for TestPluginManager.discover [\#283](https://github.com/napari/npe2/pull/283) ([tlambert03](https://github.com/tlambert03))
- Add specific error when reader plugin was chosen but failed [\#276](https://github.com/napari/npe2/pull/276) ([DragaDoncila](https://github.com/DragaDoncila))

**Tests & CI:**

- Fix tests: use npe1 version \(0.1.2\) of napari-ndtiffs [\#277](https://github.com/napari/npe2/pull/277) ([psobolewskiPhD](https://github.com/psobolewskiPhD))
- ci: update pre-commit, use ruff and fix mypy [\#273](https://github.com/napari/npe2/pull/273) ([tlambert03](https://github.com/tlambert03))
- Switch from GabrielBB/xvfb-action to aganders3/headless-gui [\#269](https://github.com/napari/npe2/pull/269) ([Czaki](https://github.com/Czaki))

**Refactors:**

- refactor: use src layout and pyproject.toml [\#281](https://github.com/napari/npe2/pull/281) ([tlambert03](https://github.com/tlambert03))

**Documentation:**

- Fix link to magicgui objects.inv in intersphinx [\#270](https://github.com/napari/npe2/pull/270) ([melissawm](https://github.com/melissawm))

**Merged pull requests:**

- chore: changelog v0.7.0 [\#286](https://github.com/napari/npe2/pull/286) ([tlambert03](https://github.com/tlambert03))
- ci\(dependabot\): bump peter-evans/create-pull-request from 4 to 5 [\#284](https://github.com/napari/npe2/pull/284) ([dependabot[bot]](https://github.com/apps/dependabot))
- Pin pydantic bellow 2.0 [\#279](https://github.com/napari/npe2/pull/279) ([Czaki](https://github.com/Czaki))

## [v0.6.2](https://github.com/napari/npe2/tree/v0.6.2) (2023-01-12)

[Full Changelog](https://github.com/napari/npe2/compare/v0.6.1...v0.6.2)

**Implemented enhancements:**

- Expose `syntax_style` [\#261](https://github.com/napari/npe2/pull/261) ([brisvag](https://github.com/brisvag))
- enable keybinding contribution [\#254](https://github.com/napari/npe2/pull/254) ([kne42](https://github.com/kne42))
- Add count of discovered plugins [\#248](https://github.com/napari/npe2/pull/248) ([Czaki](https://github.com/Czaki))
- fix: relax display name validation [\#242](https://github.com/napari/npe2/pull/242) ([tlambert03](https://github.com/tlambert03))
- feat: add more fetch sources [\#240](https://github.com/napari/npe2/pull/240) ([tlambert03](https://github.com/tlambert03))
- feat: add category [\#239](https://github.com/napari/npe2/pull/239) ([tlambert03](https://github.com/tlambert03))
- bug: ignore extra fields on manifest [\#237](https://github.com/napari/npe2/pull/237) ([tlambert03](https://github.com/tlambert03))
- feat: add icon to manifest [\#235](https://github.com/napari/npe2/pull/235) ([tlambert03](https://github.com/tlambert03))
- add visibility field [\#234](https://github.com/napari/npe2/pull/234) ([tlambert03](https://github.com/tlambert03))

**Fixed bugs:**

- Fix PackageMetadata validation error with extra provided field [\#256](https://github.com/napari/npe2/pull/256) ([aganders3](https://github.com/aganders3))
- fix: alternate fix for dotted plugin\_name [\#238](https://github.com/napari/npe2/pull/238) ([tlambert03](https://github.com/tlambert03))

**Tests & CI:**

- fix flaky fetch tests [\#255](https://github.com/napari/npe2/pull/255) ([nclack](https://github.com/nclack))

**Documentation:**

- Move to sphinx-design tabs [\#259](https://github.com/napari/npe2/pull/259) ([melissawm](https://github.com/melissawm))
- Fix a few broken links [\#258](https://github.com/napari/npe2/pull/258) ([melissawm](https://github.com/melissawm))

**Merged pull requests:**

- changelog v0.6.2 [\#268](https://github.com/napari/npe2/pull/268) ([github-actions[bot]](https://github.com/apps/github-actions))
- ci\(dependabot\): bump styfle/cancel-workflow-action from 0.10.1 to 0.11.0 [\#252](https://github.com/napari/npe2/pull/252) ([dependabot[bot]](https://github.com/apps/dependabot))
- ci\(dependabot\): bump styfle/cancel-workflow-action from 0.10.0 to 0.10.1 [\#246](https://github.com/napari/npe2/pull/246) ([dependabot[bot]](https://github.com/apps/dependabot))
- Add setuptools plugin to compile manifest at build [\#194](https://github.com/napari/npe2/pull/194) ([tlambert03](https://github.com/tlambert03))

## [v0.6.1](https://github.com/napari/npe2/tree/v0.6.1) (2022-08-08)

[Full Changelog](https://github.com/napari/npe2/compare/v0.6.0...v0.6.1)

**Fixed bugs:**

- fix command id validation when dot in package name [\#230](https://github.com/napari/npe2/pull/230) ([tlambert03](https://github.com/tlambert03))

**Merged pull requests:**

- changelog v0.6.1 [\#231](https://github.com/napari/npe2/pull/231) ([tlambert03](https://github.com/tlambert03))

## [v0.6.0](https://github.com/napari/npe2/tree/v0.6.0) (2022-08-07)

[Full Changelog](https://github.com/napari/npe2/compare/v0.5.2...v0.6.0)

**Implemented enhancements:**

- Add \(refactor\) configuration contribution, allowing plugins to declare a schema for their configurables [\#219](https://github.com/napari/npe2/pull/219) ([tlambert03](https://github.com/tlambert03))
- npe1 module ast visitor \(for faster fetch without install\) [\#217](https://github.com/napari/npe2/pull/217) ([tlambert03](https://github.com/tlambert03))
- Compile plugins using `npe2.implements` [\#186](https://github.com/napari/npe2/pull/186) ([tlambert03](https://github.com/tlambert03))

**Fixed bugs:**

- fix name validator to allow periods [\#227](https://github.com/napari/npe2/pull/227) ([tlambert03](https://github.com/tlambert03))
- fix: delay inspection of command params [\#223](https://github.com/napari/npe2/pull/223) ([tlambert03](https://github.com/tlambert03))
- Improve error message on schema validation [\#220](https://github.com/napari/npe2/pull/220) ([Czaki](https://github.com/Czaki))

**Tests & CI:**

- ci: remove fetch\_manifests [\#224](https://github.com/napari/npe2/pull/224) ([tlambert03](https://github.com/tlambert03))
- run io\_utils tests first [\#222](https://github.com/napari/npe2/pull/222) ([tlambert03](https://github.com/tlambert03))

**Merged pull requests:**

- v0.6.0 changelog [\#229](https://github.com/napari/npe2/pull/229) ([github-actions[bot]](https://github.com/apps/github-actions))
- build: remove magicgui dependency [\#218](https://github.com/napari/npe2/pull/218) ([tlambert03](https://github.com/tlambert03))

## [v0.5.2](https://github.com/napari/npe2/tree/v0.5.2) (2022-07-24)

[Full Changelog](https://github.com/napari/npe2/compare/v0.5.1...v0.5.2)

**Implemented enhancements:**

- feat: deactivate on disable [\#212](https://github.com/napari/npe2/pull/212) ([tlambert03](https://github.com/tlambert03))
- feat: add register\_disposable [\#211](https://github.com/napari/npe2/pull/211) ([tlambert03](https://github.com/tlambert03))
- add back command enablement, category, short\_title, and icon [\#210](https://github.com/napari/npe2/pull/210) ([tlambert03](https://github.com/tlambert03))
- add version to cli [\#205](https://github.com/napari/npe2/pull/205) ([tlambert03](https://github.com/tlambert03))

**Fixed bugs:**

- Fix `npe2 list` when a dotted field key is empty [\#203](https://github.com/napari/npe2/pull/203) ([tlambert03](https://github.com/tlambert03))

**Refactors:**

- Split out `from_npe1` setuptools package inspection into new module [\#206](https://github.com/napari/npe2/pull/206) ([tlambert03](https://github.com/tlambert03))

**Documentation:**

- Fix documentation links [\#208](https://github.com/napari/npe2/pull/208) ([melissawm](https://github.com/melissawm))

**Merged pull requests:**

- changelog v0.5.2 [\#213](https://github.com/napari/npe2/pull/213) ([tlambert03](https://github.com/tlambert03))

## [v0.5.1](https://github.com/napari/npe2/tree/v0.5.1) (2022-06-27)

[Full Changelog](https://github.com/napari/npe2/compare/v0.5.0...v0.5.1)

**Implemented enhancements:**

- Add PluginManager `dict()` method to export state of manager [\#197](https://github.com/napari/npe2/pull/197) ([tlambert03](https://github.com/tlambert03))
- Add `npe2 list` command to discover/display all currently installed plugins [\#192](https://github.com/napari/npe2/pull/192) ([tlambert03](https://github.com/tlambert03))

**Fixed bugs:**

- index npe1 stuff on `npe2 list` [\#198](https://github.com/napari/npe2/pull/198) ([tlambert03](https://github.com/tlambert03))

**Tests & CI:**

- Always mock cache in tests [\#199](https://github.com/napari/npe2/pull/199) ([tlambert03](https://github.com/tlambert03))

**Documentation:**

- Add mised theme type in description [\#200](https://github.com/napari/npe2/pull/200) ([Czaki](https://github.com/Czaki))
- Add docs clarifying menus `when` and `group` [\#195](https://github.com/napari/npe2/pull/195) ([tlambert03](https://github.com/tlambert03))
- Add docs about length requirements to display name [\#191](https://github.com/napari/npe2/pull/191) ([tlambert03](https://github.com/tlambert03))

**Merged pull requests:**

- changelog v0.5.1 [\#201](https://github.com/napari/npe2/pull/201) ([tlambert03](https://github.com/tlambert03))
- Remove typing-extensions imports [\#193](https://github.com/napari/npe2/pull/193) ([tlambert03](https://github.com/tlambert03))

## [v0.5.0](https://github.com/napari/npe2/tree/v0.5.0) (2022-06-21)

[Full Changelog](https://github.com/napari/npe2/compare/v0.4.1...v0.5.0)

**Implemented enhancements:**

- Prevent runtime-arg checking on npe2.implements decorators by default [\#188](https://github.com/napari/npe2/pull/188) ([tlambert03](https://github.com/tlambert03))
- Add `npe2 fetch` command to cli to fetch remote manifests [\#185](https://github.com/napari/npe2/pull/185) ([tlambert03](https://github.com/tlambert03))
- allow `npe2 parse` to output to file, add format option [\#183](https://github.com/napari/npe2/pull/183) ([tlambert03](https://github.com/tlambert03))
- Add `npe1_shim` field to schema [\#182](https://github.com/napari/npe2/pull/182) ([tlambert03](https://github.com/tlambert03))
- allow `npe2.write` to take layer instances [\#181](https://github.com/napari/npe2/pull/181) ([tlambert03](https://github.com/tlambert03))
- Add `npe2pm` `TestPluginManager` fixture [\#180](https://github.com/napari/npe2/pull/180) ([tlambert03](https://github.com/tlambert03))
- Add `@npe.implements` decorators, for opt-in manifest validation and/or AST-based-generation [\#75](https://github.com/napari/npe2/pull/75) ([tlambert03](https://github.com/tlambert03))

**Fixed bugs:**

- Fail silently when caching throws `OSError` [\#184](https://github.com/napari/npe2/pull/184) ([DragaDoncila](https://github.com/DragaDoncila))

## [v0.4.1](https://github.com/napari/npe2/tree/v0.4.1) (2022-06-13)

[Full Changelog](https://github.com/napari/npe2/compare/v0.4.0...v0.4.1)

**Fixed bugs:**

- hide docs again in napari menus \(Fix napari docs build\) [\#178](https://github.com/napari/npe2/pull/178) ([tlambert03](https://github.com/tlambert03))

## [v0.4.0](https://github.com/napari/npe2/tree/v0.4.0) (2022-06-13)

[Full Changelog](https://github.com/napari/npe2/compare/v0.3.0.rc0...v0.4.0)

**Implemented enhancements:**

- Turn menus contributions into dict of arbitrary key to list of MenuItems [\#175](https://github.com/napari/npe2/pull/175) ([tlambert03](https://github.com/tlambert03))
- Add minor conveniences for DynamicPlugin [\#173](https://github.com/napari/npe2/pull/173) ([tlambert03](https://github.com/tlambert03))
- Add `plugin_manager` module for global singleton convenience [\#164](https://github.com/napari/npe2/pull/164) ([tlambert03](https://github.com/tlambert03))
- Allow arbitrary menu locations in npe2 [\#160](https://github.com/napari/npe2/pull/160) ([sofroniewn](https://github.com/sofroniewn))

**Fixed bugs:**

- Fix writer order preference [\#172](https://github.com/napari/npe2/pull/172) ([tlambert03](https://github.com/tlambert03))
- Fix potential error with `npe2 cache --list` with uninstalled plugin [\#165](https://github.com/napari/npe2/pull/165) ([tlambert03](https://github.com/tlambert03))

**Tests & CI:**

- Pre commit, flake8, and mypy updates [\#171](https://github.com/napari/npe2/pull/171) ([tlambert03](https://github.com/tlambert03))
- add dependabot [\#167](https://github.com/napari/npe2/pull/167) ([tlambert03](https://github.com/tlambert03))
- Auto update changelog workflow [\#151](https://github.com/napari/npe2/pull/151) ([Carreau](https://github.com/Carreau))

**Refactors:**

- Revert menu restriction \(\#160\) [\#174](https://github.com/napari/npe2/pull/174) ([tlambert03](https://github.com/tlambert03))
- Push stack=... through all the reader internal API, take II [\#142](https://github.com/napari/npe2/pull/142) ([Carreau](https://github.com/Carreau))

**Documentation:**

- Add doc links to README [\#158](https://github.com/napari/npe2/pull/158) ([nclack](https://github.com/nclack))
- Fix codeblock directive in docstring [\#156](https://github.com/napari/npe2/pull/156) ([melissawm](https://github.com/melissawm))

## [v0.3.0.rc0](https://github.com/napari/npe2/tree/v0.3.0.rc0) (2022-04-05)

[Full Changelog](https://github.com/napari/npe2/compare/v0.3.0...v0.3.0.rc0)

## [v0.3.0](https://github.com/napari/npe2/tree/v0.3.0) (2022-04-05)

[Full Changelog](https://github.com/napari/npe2/compare/v0.2.2...v0.3.0)

**Implemented enhancements:**

- NPE1Adapter Part 3 - caching of adapter manifests [\#126](https://github.com/napari/npe2/pull/126) ([tlambert03](https://github.com/tlambert03))
- NPE1Adapter Part 2 - adding the NPE1Adapter object. [\#125](https://github.com/napari/npe2/pull/125) ([tlambert03](https://github.com/tlambert03))
- NPE1Adapter Part 1 - updated \_from\_npe1 conversion logic to prepare for locally defined objects [\#124](https://github.com/napari/npe2/pull/124) ([tlambert03](https://github.com/tlambert03))

**Fixed bugs:**

- Avoid use of inspect.signature on CommandContribution class [\#146](https://github.com/napari/npe2/pull/146) ([tlambert03](https://github.com/tlambert03))

**Tests & CI:**

- Fix black problem on pre-commit CI [\#147](https://github.com/napari/npe2/pull/147) ([Czaki](https://github.com/Czaki))
- Fix ci for testing all plugins [\#134](https://github.com/napari/npe2/pull/134) ([tlambert03](https://github.com/tlambert03))
- Add ci to test all plugins on PR label part1 [\#133](https://github.com/napari/npe2/pull/133) ([tlambert03](https://github.com/tlambert03))

**Refactors:**

- Turn off npe1 discovery by default [\#145](https://github.com/napari/npe2/pull/145) ([tlambert03](https://github.com/tlambert03))
- Reorganize contributions into submodule [\#130](https://github.com/napari/npe2/pull/130) ([tlambert03](https://github.com/tlambert03))
- drop py3.7 & misc small reformats [\#123](https://github.com/napari/npe2/pull/123) ([tlambert03](https://github.com/tlambert03))

**Merged pull requests:**

- add v0.3.0 changelog [\#150](https://github.com/napari/npe2/pull/150) ([tlambert03](https://github.com/tlambert03))

## [v0.2.2](https://github.com/napari/npe2/tree/v0.2.2) (2022-03-14)

[Full Changelog](https://github.com/napari/npe2/compare/v0.2.1...v0.2.2)

**Implemented enhancements:**

- Add DynamicPlugin object/context for ease of testing & dynamic plugin creation [\#128](https://github.com/napari/npe2/pull/128) ([tlambert03](https://github.com/tlambert03))

**Refactors:**

- Disallow mutation on plugin manifest name [\#127](https://github.com/napari/npe2/pull/127) ([tlambert03](https://github.com/tlambert03))
- Clarify Typing. [\#105](https://github.com/napari/npe2/pull/105) ([Carreau](https://github.com/Carreau))

## [v0.2.1](https://github.com/napari/npe2/tree/v0.2.1) (2022-03-11)

[Full Changelog](https://github.com/napari/npe2/compare/v0.2.0...v0.2.1)

**Fixed bugs:**

- Fix auto-discovery of plugins for napari \<= 0.4.15 [\#120](https://github.com/napari/npe2/pull/120) ([tlambert03](https://github.com/tlambert03))

## [v0.2.0](https://github.com/napari/npe2/tree/v0.2.0) (2022-03-10)

[Full Changelog](https://github.com/napari/npe2/compare/v0.1.2...v0.2.0)

**Implemented enhancements:**

- add py.typed [\#115](https://github.com/napari/npe2/pull/115) ([tlambert03](https://github.com/tlambert03))
- Suggest to run npe2 validate when errors present. [\#104](https://github.com/napari/npe2/pull/104) ([Carreau](https://github.com/Carreau))
- Add enable disable [\#101](https://github.com/napari/npe2/pull/101) ([tlambert03](https://github.com/tlambert03))
- make package meta hashable [\#97](https://github.com/napari/npe2/pull/97) ([tlambert03](https://github.com/tlambert03))
- add min\_ver to PackageMetadata [\#96](https://github.com/napari/npe2/pull/96) ([tlambert03](https://github.com/tlambert03))
- set display\_name to plugin name when empty [\#92](https://github.com/napari/npe2/pull/92) ([nclack](https://github.com/nclack))

**Fixed bugs:**

- add back deprecated \_samples on contributions index [\#116](https://github.com/napari/npe2/pull/116) ([tlambert03](https://github.com/tlambert03))
- Make conversion robust to entry\_point string entries [\#94](https://github.com/napari/npe2/pull/94) ([nclack](https://github.com/nclack))

**Tests & CI:**

- Fix test warning [\#118](https://github.com/napari/npe2/pull/118) ([tlambert03](https://github.com/tlambert03))
- Test napari during CI [\#117](https://github.com/napari/npe2/pull/117) ([tlambert03](https://github.com/tlambert03))
- Separate dev test from integration test. [\#114](https://github.com/napari/npe2/pull/114) ([Carreau](https://github.com/Carreau))

**Refactors:**

- Simplify Reader/writer internal logic. [\#107](https://github.com/napari/npe2/pull/107) ([Carreau](https://github.com/Carreau))

**Documentation:**

- Update reader plugin contribution doc to mention `[(None,)]` sentinel [\#113](https://github.com/napari/npe2/pull/113) ([tlambert03](https://github.com/tlambert03))
- DOC: typo missing backtick [\#102](https://github.com/napari/npe2/pull/102) ([Carreau](https://github.com/Carreau))
- Fix some typos and dead links [\#99](https://github.com/napari/npe2/pull/99) ([andy-sweet](https://github.com/andy-sweet))

## [v0.1.2](https://github.com/napari/npe2/tree/v0.1.2) (2022-01-28)

[Full Changelog](https://github.com/napari/npe2/compare/v0.1.1...v0.1.2)

**Fixed bugs:**

- add include\_package\_data to setup.cfg in npe2 convert [\#89](https://github.com/napari/npe2/pull/89) ([tlambert03](https://github.com/tlambert03))
- Handle list of paths in iter\_compatible\_reader [\#87](https://github.com/napari/npe2/pull/87) ([ppwadhwa](https://github.com/ppwadhwa))

**Tests & CI:**

- update gh release action to include schema [\#90](https://github.com/napari/npe2/pull/90) ([tlambert03](https://github.com/tlambert03))

**Documentation:**

- use latest release schema for docs [\#85](https://github.com/napari/npe2/pull/85) ([tlambert03](https://github.com/tlambert03))
- Better way to find templates folder when building docs [\#84](https://github.com/napari/npe2/pull/84) ([tlambert03](https://github.com/tlambert03))
- Move some \_docs files [\#78](https://github.com/napari/npe2/pull/78) ([tlambert03](https://github.com/tlambert03))

## [v0.1.1](https://github.com/napari/npe2/tree/v0.1.1) (2022-01-07)

[Full Changelog](https://github.com/napari/npe2/compare/v0.1.0...v0.1.1)

**Implemented enhancements:**

- Add option to validate python\_name imports [\#76](https://github.com/napari/npe2/pull/76) ([tlambert03](https://github.com/tlambert03))
- Extract out ImportExport logic from PluginManifest, don't sort yaml fields alphabetically [\#72](https://github.com/napari/npe2/pull/72) ([tlambert03](https://github.com/tlambert03))

**Tests & CI:**

- Bring test coverage to 100%, reorganize tests [\#70](https://github.com/napari/npe2/pull/70) ([tlambert03](https://github.com/tlambert03))

**Refactors:**

- change sample plugin name from `my_plugin` to `my-plugin` [\#74](https://github.com/napari/npe2/pull/74) ([tlambert03](https://github.com/tlambert03))
- split contributions/io into readers and writers [\#73](https://github.com/napari/npe2/pull/73) ([tlambert03](https://github.com/tlambert03))
- remove SPDX licenses [\#71](https://github.com/napari/npe2/pull/71) ([tlambert03](https://github.com/tlambert03))
- change engine to schema\_version [\#69](https://github.com/napari/npe2/pull/69) ([tlambert03](https://github.com/tlambert03))
- Replace entry\_point with activate/deactive function [\#68](https://github.com/napari/npe2/pull/68) ([tlambert03](https://github.com/tlambert03))

**Documentation:**

- Autogen docs [\#77](https://github.com/napari/npe2/pull/77) ([tlambert03](https://github.com/tlambert03))

## [v0.1.0](https://github.com/napari/npe2/tree/v0.1.0) (2021-12-15)

[Full Changelog](https://github.com/napari/npe2/compare/v0.1.0rc1...v0.1.0)

**Implemented enhancements:**

- Remove semver dependency, vendor small portion [\#62](https://github.com/napari/npe2/pull/62) ([tlambert03](https://github.com/tlambert03))
- Make `npe2 convert` modify a repository [\#60](https://github.com/napari/npe2/pull/60) ([tlambert03](https://github.com/tlambert03))
- Delay import of `cmd.python_name` until needed [\#55](https://github.com/napari/npe2/pull/55) ([tlambert03](https://github.com/tlambert03))
- Add autogenerate\_from\_command field to Widget contribution [\#51](https://github.com/napari/npe2/pull/51) ([tlambert03](https://github.com/tlambert03))
- Update error messages [\#46](https://github.com/napari/npe2/pull/46) ([ppwadhwa](https://github.com/ppwadhwa))
- PackageMetadata field [\#44](https://github.com/napari/npe2/pull/44) ([tlambert03](https://github.com/tlambert03))

**Tests & CI:**

- add changelog generator config [\#65](https://github.com/napari/npe2/pull/65) ([tlambert03](https://github.com/tlambert03))
- Test conversion for all plugins [\#52](https://github.com/napari/npe2/pull/52) ([tlambert03](https://github.com/tlambert03))

**Refactors:**

- Start to make command APIs clearer [\#61](https://github.com/napari/npe2/pull/61) ([tlambert03](https://github.com/tlambert03))
- rename autogenerate field \(\#53\) [\#58](https://github.com/napari/npe2/pull/58) ([nclack](https://github.com/nclack))
- Schema review [\#49](https://github.com/napari/npe2/pull/49) ([nclack](https://github.com/nclack))

## [v0.1.0rc1](https://github.com/napari/npe2/tree/v0.1.0rc1) (2021-12-03)

[Full Changelog](https://github.com/napari/npe2/compare/v0.0.1rc1...v0.1.0rc1)

**Implemented enhancements:**

- add `get_callable` to Executable mixin [\#34](https://github.com/napari/npe2/pull/34) ([tlambert03](https://github.com/tlambert03))
- Sample data [\#31](https://github.com/napari/npe2/pull/31) ([tlambert03](https://github.com/tlambert03))
- support for Dock Widgets [\#26](https://github.com/napari/npe2/pull/26) ([tlambert03](https://github.com/tlambert03))
- Manifest cli [\#20](https://github.com/napari/npe2/pull/20) ([ppwadhwa](https://github.com/ppwadhwa))

**Tests & CI:**

- use pytomlpp, and test toml/json round trips [\#43](https://github.com/napari/npe2/pull/43) ([tlambert03](https://github.com/tlambert03))
- prep for release [\#42](https://github.com/napari/npe2/pull/42) ([tlambert03](https://github.com/tlambert03))

**Refactors:**

- Change 'publisher' to 'author' \(\#39\) [\#40](https://github.com/napari/npe2/pull/40) ([nclack](https://github.com/nclack))
- Cleanup manifest [\#38](https://github.com/napari/npe2/pull/38) ([nclack](https://github.com/nclack))

## [v0.0.1rc1](https://github.com/napari/npe2/tree/v0.0.1rc1) (2021-11-17)

[Full Changelog](https://github.com/napari/npe2/compare/cdbe96c3f0ea8c0e3ad050e91c24b40029cc0387...v0.0.1rc1)

**Implemented enhancements:**

- Small updates for napari [\#25](https://github.com/napari/npe2/pull/25) ([tlambert03](https://github.com/tlambert03))
- Add display\_name validation [\#23](https://github.com/napari/npe2/pull/23) ([nclack](https://github.com/nclack))
- Prevent extra fields in Commands. [\#15](https://github.com/napari/npe2/pull/15) ([Carreau](https://github.com/Carreau))
- More Validation. [\#14](https://github.com/napari/npe2/pull/14) ([Carreau](https://github.com/Carreau))
- Add debug to help diagnosing non-validation errors. [\#12](https://github.com/napari/npe2/pull/12) ([Carreau](https://github.com/Carreau))
- Add support for writer plugins [\#3](https://github.com/napari/npe2/pull/3) ([nclack](https://github.com/nclack))
- Some extra validation and allow to execute module with -m [\#1](https://github.com/napari/npe2/pull/1) ([Carreau](https://github.com/Carreau))

**Tests & CI:**

- Better pytest error on invalid schema. [\#11](https://github.com/napari/npe2/pull/11) ([Carreau](https://github.com/Carreau))
- Misc validation and testing. [\#5](https://github.com/napari/npe2/pull/5) ([Carreau](https://github.com/Carreau))
- Implement linting, CI, add basic tests [\#4](https://github.com/napari/npe2/pull/4) ([tlambert03](https://github.com/tlambert03))

**Refactors:**

- General refactor, Exectuable mixin, io\_utils APIs, remove some globals [\#18](https://github.com/napari/npe2/pull/18) ([tlambert03](https://github.com/tlambert03))
- Rename command command field to id. [\#10](https://github.com/napari/npe2/pull/10) ([Carreau](https://github.com/Carreau))
- Rename contributes to contributions ?
[\#8](https://github.com/napari/npe2/pull/8) ([Carreau](https://github.com/Carreau))

\* *This Changelog was automatically generated by [github_changelog_generator](https://github.com/github-changelog-generator/github-changelog-generator)*

npe2-0.7.2/Makefile

.PHONY: docs
docs:
	rm -rf docs/plugins
	python _docs/render.py
	rm -rf ./napari
	git clone --depth 1 --filter=blob:none --sparse -b update-plugin-docs https://github.com/tlambert03/napari.git
	(cd napari && git sparse-checkout set docs/plugins)
	cp napari/docs/plugins/* docs/plugins
	rm -rf ./napari
	jb build docs

# by default this will make a minor version bump (e.g v0.4.16 -> v0.4.17)
LAST := $(shell git tag -l | grep "v[0-9]+*" | awk '!/rc/' | sort -V | tail -1)
SINCE := $(shell git log -1 -s --format=%cd --date=format:'%Y-%m-%d' $(LAST))
NEXT := $(shell echo $(LAST) | awk -F. -v OFS=. '{$$NF += 1 ; print}')

changelog:
	github_changelog_generator --future-release=$(NEXT) --since-commit=$(SINCE)

npe2-0.7.2/codecov.yml

coverage:
  status:
    project:
      default:
        target: 100%

npe2-0.7.2/.github/ISSUE_TEMPLATE.md

* npe2 version:
* Python version:
* Operating System:

### Description

Describe what you were trying to get done.
Tell us what happened, what went wrong, and what you expected to happen.

### What I Did

```
Paste the command(s) you ran and the output.
If there was a crash, please include the traceback here.
```

npe2-0.7.2/.github/dependabot.yml

# https://docs.github.com/github/administering-a-repository/configuration-options-for-dependency-updates
version: 2
updates:
  - package-ecosystem: "github-actions"
    directory: "/"
    schedule:
      interval: "weekly"
    commit-message:
      prefix: "ci(dependabot):"

npe2-0.7.2/.github/workflows/ci.yml

name: CI

on:
  push:
    branches:
      - main
    tags:
      - "v*"
  pull_request: {}
  workflow_dispatch:

jobs:
  check-manifest:
    name: Check Manifest
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v3
      - name: Check
        run: pipx run check-manifest

  test:
    name: ${{ matrix.platform }} (${{ matrix.python-version }})
    runs-on: ${{ matrix.platform }}
    strategy:
      fail-fast: false
      matrix:
        python-version: [3.8, 3.9, "3.10", "3.11"]
        platform: [ubuntu-latest, macos-latest, windows-latest]
    steps:
      - uses: actions/checkout@v3
      - name: Set up Python ${{ matrix.python-version }}
        uses: actions/setup-python@v4
        with:
          python-version: ${{ matrix.python-version }}
      - name: Install dependencies
        run: |
          python -m pip install --upgrade pip
          pip install -e .[json,docs,testing]
      - name: Test Main docs build
        run: pytest --color yes -m github_main_only
      - name: Test
        run: |
          coverage run --source=npe2 -m pytest --color yes
          coverage xml
          coverage report --show-missing
      - name: Coverage
        uses: codecov/codecov-action@v3
        with:
          fail_ci_if_error: true

  test_napari:
    name: napari tests
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v3
      - uses: actions/checkout@v3
        with:
          repository: napari/napari
          path: napari-from-github
          fetch-depth: 0
      - uses: actions/setup-python@v4
        with:
          python-version: "3.10"
      - name: Install
        run: |
          python -m pip install --upgrade pip
          python -m pip install -e ./napari-from-github -c "./napari-from-github/resources/constraints/constraints_py3.10.txt"
          python -m pip install -e .[json]
          # bare minimum required to test napari/plugins
          python -m pip install pytest scikit-image[data] zarr
          xarray hypothesis matplotlib
      - name: Run napari plugin headless tests
        run: pytest -W 'ignore::DeprecationWarning' napari/plugins napari/settings napari/layers napari/components
        working-directory: napari-from-github

  test_docs_render:
    name: docs render
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v3
      - uses: actions/setup-python@v4
        with:
          python-version: "3.x"
      - name: Build schema
        run: |
          python -m pip install --upgrade pip
          pip install -e .[docs]
          python -m npe2.manifest.schema > _schema.json
      - name: Test rendering docs
        run: python _docs/render.py
        env:
          NPE2_SCHEMA: "_schema.json"

  deploy:
    name: Deploy
    needs: test
    if: "success() && startsWith(github.ref, 'refs/tags/')"
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v3
      - name: Set up Python
        uses: actions/setup-python@v4
        with:
          python-version: "3.x"
      - name: install
        run: |
          git tag
          pip install -U pip
          pip install -U twine build
          python -m build .
          twine check dist/*
          ls -lh dist
      - name: write schema
        run: |
          pip install -e .
          python -m npe2.manifest.schema > schema.json
      - name: Build and publish
        run: twine upload dist/*
        env:
          TWINE_USERNAME: __token__
          TWINE_PASSWORD: ${{ secrets.TWINE_API_KEY }}
      - uses: softprops/action-gh-release@v1
        if: startsWith(github.ref, 'refs/tags/')
        with:
          generate_release_notes: true
          files: schema.json

npe2-0.7.2/.github/workflows/test_all_plugins.yml

name: Test all plugins

on:
  pull_request:
    types: [ labeled ]
  workflow_dispatch:

jobs:
  get-plugins:
    if: github.event.label.name == 'test-all-plugins' || github.event_name == 'workflow_dispatch'
    runs-on: ubuntu-latest
    steps:
      - uses: styfle/cancel-workflow-action@0.11.0
        with:
          access_token: ${{ github.token }}
      - id: plugin_names
        run: echo "::set-output name=plugins::$(curl -s https://api.napari-hub.org/plugins | jq -c 'keys')"
    outputs:
      plugins: ${{ steps.plugin_names.outputs.plugins }}

  test_all:
    needs: get-plugins
    name: ${{ matrix.plugin }}
    runs-on: ubuntu-latest
    strategy:
      fail-fast: false
matrix: plugin: ${{ fromJson(needs.get-plugins.outputs.plugins) }} defaults: run: shell: bash -l {0} steps: - uses: actions/checkout@v3 - uses: tlambert03/setup-qt-libs@v1 - uses: conda-incubator/setup-miniconda@v2 with: python-version: 3.9 miniforge-variant: Mambaforge miniforge-version: latest use-mamba: true - name: Install npe2 run: pip install -e .[testing] - run: sudo apt-get install -y xvfb - name: Run tests run: xvfb-run --auto-servernum pytest tests/test_all_plugins.py -s -v --color=yes env: TEST_PACKAGE_NAME: ${{ matrix.plugin }} npe2-0.7.2/.github/workflows/test_conversion.yml0000644000000000000000000000634213615410400016655 0ustar00name: Test Plugin Conversion on: # push: # pull_request: workflow_dispatch: jobs: get-plugins: runs-on: ubuntu-latest steps: - uses: styfle/cancel-workflow-action@0.11.0 with: access_token: ${{ github.token }} - id: plugin_names run: echo "::set-output name=plugins::$(curl -s https://api.napari-hub.org/plugins | jq -c 'keys')" outputs: plugins: ${{ steps.plugin_names.outputs.plugins }} convert: needs: get-plugins name: convert ${{ matrix.plugin }} runs-on: ubuntu-latest strategy: fail-fast: false matrix: # plugin: ["napari-dv"] plugin: ${{ fromJson(needs.get-plugins.outputs.plugins) }} steps: - uses: tlambert03/setup-qt-libs@v1 - uses: actions/setup-python@v4 with: python-version: 3.9 - name: Install dependencies run: | pip install -U pip # just in case... 
we ask them not to depend on this or Pyside # since it's up to the end user to have it with napari pip install PyQt5 pip install git+https://github.com/napari/npe2.git@refs/pull/60/head#egg=npe2 - name: Fetch repo URL run: | URL=$(curl -s https://api.napari-hub.org/plugins/${{ matrix.plugin }} | jq '.code_repository') URL=${URL#'"https://github.com/'} URL=${URL%'"'} echo "plugin_repo=$URL" >> $GITHUB_ENV - name: Checkout plugin repo uses: actions/checkout@v3 with: repository: ${{ env.plugin_repo }} path: 'plugin_repo' fetch-depth: 0 - name: Install ${{ matrix.plugin }} run: pip install -e ./plugin_repo - name: Test Conversion id: test-without-napari uses: aganders3/headless-gui@v1 continue-on-error: true with: run: npe2 convert ./plugin_repo - name: Install napari if: ${{ steps.test-without-napari.outcome == 'failure' }} run: pip install napari - name: Test Conversion again with napari id: test-with-napari if: ${{ steps.test-without-napari.outcome == 'failure' }} uses: aganders3/headless-gui@v1 with: run: npe2 convert ./plugin_repo - name: Test Conversion again with napari if: ${{ steps.test-without-napari.outcome == 'failure' && steps.test-with-napari.outcome == 'failure' }} uses: aganders3/headless-gui@v1 with: # try without modifying directory run: npe2 convert -n ${{ matrix.plugin }} # this won't work, we'd need to first fork the repo somewhere we have write permissions # then push changes to that repository, and then create a PR to the original repo # - name: Create Pull Request # if: success() # uses: peter-evans/create-pull-request@v3 # with: # commit-message: convert plugin to npe2 format # title: 'Convert to npe2 plugin' # body: | # This PR adds an (autogenerated) npe2 manifest, and updates setup.cfg (if setup.cfg is used). 
# If you use setup.py instead, please update the entry_point manually: # entry_points = {'napari.manifest': "your-package = your_package:napari.yaml"} npe2-0.7.2/.github/workflows/update_changelog.yml0000644000000000000000000000146713615410400016725 0ustar00name: Update Changelog on: workflow_dispatch: inputs: next_tag: description: "Next version tag (`vX.Y.Z`)" required: true jobs: changelog: runs-on: ubuntu-20.04 steps: - name: Checkout uses: actions/checkout@v3 - name: "✏️ Generate release changelog" uses: heinrichreimer/github-changelog-generator-action@v2.3 with: futureRelease: ${{ github.event.inputs.next_tag }} token: ${{ secrets.GITHUB_TOKEN }} repo: napari/npe2 - name: Create Pull Request uses: peter-evans/create-pull-request@v5 with: token: ${{ secrets.GITHUB_TOKEN }} commit-message: Automatic changelog update title: changelog ${{ github.event.inputs.next_tag }} branch: update-changelog npe2-0.7.2/_docs/example_manifest.yaml0000644000000000000000000000407313615410400014604 0ustar00name: example-plugin display_name: Example Plugin contributions: commands: - id: example-plugin.hello_world title: Hello World - id: example-plugin.read_xyz title: Read ".xyz" files python_name: example_plugin.some_module:get_reader - id: example-plugin.write_points title: Save points layer to csv python_name: example_plugin.some_module:write_points - id: example-plugin.my_widget title: Open my widget python_name: example_plugin.some_module:MyWidget - id: example-plugin.do_threshold title: Perform threshold on image, return new image python_name: example_plugin.some_module:threshold - id: example-plugin.threshold_widget title: Make threshold widget with magic_factory python_name: example_plugin.some_module:widget_factory - id: example-plugin.data.fractal title: Create fractal image python_name: example_plugin.some_module:create_fractal readers: - command: example-plugin.read_xyz filename_patterns: ["*.xyz"] accepts_directories: false writers: - command: 
example-plugin.write_points filename_extensions: [".csv"] layer_types: ["points"] widgets: - command: example-plugin.my_widget display_name: Wizard - command: example-plugin.threshold_widget display_name: Threshold - command: example-plugin.do_threshold display_name: Threshold autogenerate: true themes: - label: "Monokai" id: "monokai" type: "dark" syntax_style: "monokai" colors: canvas: "#000000" console: "#000000" background: "#272822" foreground: "#75715e" primary: "#cfcfc2" secondary: "#f8f8f2" highlight: "#e6db74" text: "#a1ef34" icon: "#a1ef34" warning: "#f92672" current: "#66d9ef" sample_data: - key: fractal display_name: Fractal command: example-plugin.data.fractal - key: napari display_name: Tabueran Kiribati uri: https://en.wikipedia.org/wiki/Napari#/media/File:Tabuaeran_Kiribati.jpg npe2-0.7.2/_docs/render.py0000644000000000000000000001573313615410400012235 0ustar00from __future__ import annotations import inspect import json import os import re import sys from contextlib import contextmanager from functools import lru_cache, partial from inspect import getsource from pathlib import Path from types import FunctionType from typing import Dict, Optional, Set from urllib.request import urlopen import yaml from jinja2 import Environment, FileSystemLoader, select_autoescape from npe2 import PluginManager, PluginManifest from npe2.manifest.contributions import ContributionPoints from npe2.manifest.utils import Executable SCHEMA_URL = "https://github.com/napari/npe2/releases/latest/download/schema.json" DOCS = Path(__file__).parent TEMPLATES = DOCS / "templates" _BUILD = DOCS.parent / "docs" / "plugins" EXAMPLE_MANIFEST = PluginManifest.from_file(DOCS / "example_manifest.yaml") @contextmanager def _mocked_qtwidgets(): # just mocking a "qtpy.QtWidgets" so we don't need to include PyQt just to build # documentation. 
from types import ModuleType mock = ModuleType("qtpy.QtWidgets") mock.__dict__["QWidget"] = object before, sys.modules["qtpy.QtWidgets"] = sys.modules.get("qtpy.QtWidgets"), mock try: yield finally: if before is not None: sys.modules["qtpy.QtWidgets"] = mock else: del sys.modules["qtpy.QtWidgets"] @lru_cache def type_strings() -> Dict[str, str]: """Return map of type name to source code for all types in types.py""" from npe2 import types as _t type_strings = {} type_lines = getsource(_t).splitlines() for r, line in enumerate(type_lines): if not line or line.startswith((" ", "#", "]", ")", "if", "from")): continue end = 0 if r + 1 >= len(type_lines): continue next_line = type_lines[r + 1] if next_line.startswith(" "): end = next( ( i for i, x in enumerate(type_lines[r + 1 :]) if not x.startswith((" ", "#")) ) ) if end: end += 1 name = line.split()[0] if name == "class": name = line.split()[1].split("(")[0] type_strings[name] = "\n".join(type_lines[r : r + end + 1]) return type_strings def _get_needed_types(source: str, so_far: Optional[Set[str]] = None) -> Set[str]: """Return the names of types in the npe2.types.py that are used in `source`""" so_far = so_far or set() for name, string in type_strings().items(): # we treat LayerData specially if ( name != "LayerData" and name not in so_far and re.search(rf"\W{name}\W", source) ): so_far.add(name) so_far.update(_get_needed_types(string, so_far=so_far)) return so_far def _build_example(contrib: Executable) -> str: """Extract just the source code for a specific executable contribution""" if not isinstance(contrib, Executable): return "" with _mocked_qtwidgets(): func = contrib.get_callable() if not callable(func): return "" if isinstance(func, partial): func = func.keywords["function"] source = inspect.getsource(func) # additionally get source code of all internally referenced functions # i.e. for get_reader we also get the source for the returned reader. 
if isinstance(func, FunctionType): for name in func.__code__.co_names: if name in func.__globals__: f = func.__globals__[name] source += "\n\n" + inspect.getsource(f) needed = _get_needed_types(source) lines = [v for k, v in type_strings().items() if k in needed] if lines: lines.extend(["", ""]) lines.extend(source.splitlines()) return "\n".join(lines) def example_implementation(contrib_name: str) -> str: """Build an example string of python source implementing a specific contribution.""" contrib = getattr(EXAMPLE_MANIFEST.contributions, contrib_name) if isinstance(contrib, list): return "\n\n".join([_build_example(x) for x in contrib]).strip() return _build_example(contrib) def example_contribution( contrib_name: str, format="yaml", manifest: PluginManifest = EXAMPLE_MANIFEST ) -> str: """Get small manifest example for just contribution named `contrib_name`""" assert manifest.contributions contribs = getattr(manifest.contributions, contrib_name) # only take the first command example ... the rest are for executables if contrib_name == "commands": contribs = [contribs[0]] ex = ContributionPoints(**{contrib_name: contribs}) # for "executables", include associated command ExampleCommands = manifest.contributions.commands assert ExampleCommands for c in contribs or (): if isinstance(c, Executable): associated_command = next(i for i in ExampleCommands if i.id == c.command) if not ex.commands: ex.commands = [] ex.commands.append(associated_command) output = {"contributions": json.loads(ex.json(exclude_unset=True))} if format == "yaml": return yaml.safe_dump(output, sort_keys=False) if format == "toml": import tomli_w return tomli_w.dumps(output) if format == "json": return json.dumps(output) raise ValueError("Invalid format: {format}. 
Must be 'yaml', 'toml' or 'json'.") def has_guide(contrib_name: str) -> bool: """Return true if a guide exists for this contribution.""" return (TEMPLATES / f"_npe2_{contrib_name}_guide.md.jinja").exists() def main(dest: Path = _BUILD): """Render all jinja docs in ./templates and output to `dest`""" # register the example plugin so we can use `.get_callable()` in _build_example sys.path.append(str(DOCS.absolute())) PluginManager.instance().register(EXAMPLE_MANIFEST) env = Environment( loader=FileSystemLoader(TEMPLATES), autoescape=select_autoescape() ) env.filters["example_contribution"] = example_contribution env.filters["example_implementation"] = example_implementation env.filters["has_guide"] = has_guide dest.mkdir(exist_ok=True, parents=True) schema = PluginManifest.schema() if local_schema := os.getenv("NPE2_SCHEMA"): with open(local_schema) as f: schema = json.load(f) else: with urlopen(SCHEMA_URL) as response: schema = json.load(response) contributions = schema["definitions"]["ContributionPoints"]["properties"] context = { "schema": schema, "contributions": contributions, "example": EXAMPLE_MANIFEST, # "specs": _get_specs(), "specs": {}, } for t in TEMPLATES.glob("*.jinja"): template = env.get_template(t.name) _dest = dest / f"{t.stem}" _dest.write_text(template.render(context), encoding="utf-8") print(f"Rendered {_dest}") if __name__ == "__main__": dest = Path(sys.argv[1]).absolute() if len(sys.argv) > 1 else _BUILD main(dest) npe2-0.7.2/_docs/example_plugin/__init__.py0000644000000000000000000000000013615410400015503 0ustar00npe2-0.7.2/_docs/example_plugin/some_module.py0000644000000000000000000000404613615410400016272 0ustar00# python_name: example_plugin._data:fractal from typing import TYPE_CHECKING, Any, Dict, List, Optional from magicgui import magic_factory from qtpy.QtWidgets import QWidget from npe2.types import LayerData, PathOrPaths, ReaderFunction if TYPE_CHECKING: import napari.types import napari.viewer def write_points(path: str, layer_data: 
Any, attributes: Dict[str, Any]) -> List[str]: with open(path, "w"): ... # save layer_data and attributes to file # return path to any file(s) that were successfully written return [path] def get_reader(path: PathOrPaths) -> Optional[ReaderFunction]: # If we recognize the format, we return the actual reader function if isinstance(path, str) and path.endswith(".xyz"): return xyz_file_reader # otherwise we return None. return None def xyz_file_reader(path: PathOrPaths) -> List[LayerData]: data = ... # somehow read data from path layer_attributes = {"name": "etc..."} return [(data, layer_attributes)] class MyWidget(QWidget): """Any QtWidgets.QWidget or magicgui.widgets.Widget subclass can be used.""" def __init__(self, viewer: "napari.viewer.Viewer", parent=None): super().__init__(parent) ... @magic_factory def widget_factory( image: "napari.types.ImageData", threshold: int ) -> "napari.types.LabelsData": """Generate thresholded image. This pattern uses magicgui.magic_factory directly to turn a function into a callable that returns a widget. """ return (image > threshold).astype(int) def threshold( image: "napari.types.ImageData", threshold: int ) -> "napari.types.LabelsData": """Generate thresholded image. This function will be turned into a widget using `autogenerate: true`. """ return (image > threshold).astype(int) def create_fractal() -> List[LayerData]: """An example of a Sample Data Function. Note: Sample Data with URIs don't need python code. """ data = ... # do something cool to create a fractal return [(data, {"name": "My cool fractal"})] npe2-0.7.2/_docs/templates/_npe2_contributions.md.jinja0000644000000000000000000000463313615410400020000 0ustar00# Contributions Reference **Contributions** are a set of static declarations that you make in the `contributions` field of the [Plugin Manifest](./manifest). Your extension registers **Contributions** to extend various functionalities within napari. 
Here is a list of all available **Contributions**: {# list all contributions first #} {%- for name, contrib in contributions.items() %} {%- if not contrib.hide_docs %} - [`{{ name }}`](contributions-{{ name|replace('_', '-') }}) {%- endif -%} {% endfor %} You may add as many contributions as you'd like to a single manifest. For clarity, the following examples include only the specific contribution that is being discussed. {# now, iterate through all contributions and show fields and #} {%- for contrib_name, contrib in contributions.items() -%} {% if not contrib.hide_docs -%} (contributions-{{ contrib_name|replace('_', '-') }})= ## `contributions.{{contrib_name}}` {# check if this is a single type, or a union #} {%- if contrib['items']['anyOf'] is defined %} {%- set type_names = contrib['items']['anyOf']|map(attribute='$ref')|map("replace", "#/definitions/", "")|list %} {%- set union = True %} {%- else %} {%- set type_names = [contrib['items']['$ref']|replace("#/definitions/", "")] %} {%- set union = False %} {%- endif -%} {%- if union %} ```{tip} This contribution accepts {{ type_names|length }} schema types ``` {%- endif -%} {%- for tname in type_names -%} {% set type = schema['definitions'][tname] %} {% if union %}##### {{loop.index}}. {{type.title}}{% endif %} {{ type.description }} {% if contrib_name|has_guide %} See the [{{ contrib_name.title()|replace('_', ' ') }} Guide]({{ contrib_name|replace('_', '-') }}-contribution-guide) for more details on implementing this contribution. 
{% endif %} {# Using bold instead of headers in this case to avoid right-side nav #} **Fields** {%- for field_name, field_props in type.properties.items() -%} {% set required = field_name in type.required %} - **`{{contrib_name}}.{{field_name}}`** : {% if not required %} *(Optional: default={{ field_props.default if field_props.default is not undefined else 'None' }}).* {% endif -%} {{- field_props.description -}} {% endfor -%} {% endfor %} ### {{ contrib_name.title()|replace("_", " ") }} example ::::{tab-set} {% for format in ['yaml', 'toml'] %} :::{tab-item} {{format}} ```{{format}} {{ contrib_name|example_contribution(format) }} ``` ::: {% endfor -%} :::: {% endif %} {% endfor %} npe2-0.7.2/_docs/templates/_npe2_manifest.md.jinja0000644000000000000000000000373013615410400016701 0ustar00# Manifest Reference ```{important} Plugin manifests are a feature of the second generation napari plugin engine ("npe2"). If you are still using the first generation `napari-plugin-engine` (i.e. the `napari.plugin` entrypoint, along with `@napari_hook_implementation` decorators) then this page does not apply to your plugin. ``` Every napari plugin needs to ship a manifest file with their package. By convention, this file is called `napari.yaml` and it is placed in the top level module of the package, but it can be named anything and placed anywhere. You tell napari where to find your manifest by adding a `napari.manifest` [entry point](https://packaging.python.org/en/latest/specifications/entry-points/) to your package metadata: ```ini # tell napari where to find to your manifest [options.entry_points] napari.manifest = example-plugin = example_plugin:napari.yaml # make sure it gets included in your package [options.package_data] example-plugin = napari.yaml ``` ## Fields All fields are optional except those in **bold**. 
| Name | Details | |------|---------| {%- for key, field in schema.properties.items() %} {%- if not field.hide_docs %} | {% if key in schema.required %} **`{{ key }}`** {%else%} `{{ key }}` {%endif%} | {{ field.description }}| {%- endif %} {%- endfor %} ```{note} Standard python [package metadata](https://packaging.python.org/en/latest/specifications/core-metadata/) from your `setup.cfg` file will also be parsed for version, license, and other info. ``` ## Example Here is a complete example of what the manifest of a plugin providing *all* contributions might look like. (Note: a plugin needn't implement more than a single contribution type). ```{tip} Both [YAML](https://yaml.org/) and [TOML](https://toml.io/en/) are supported manifest formats, though YAML is the "default" and more common format. ``` ::::{tab-set} {% for format in ['yaml', 'toml'] %} :::{tab-item} {{format}} ```{{format}} {{ example[format]() }} ``` ::: {% endfor -%} :::: npe2-0.7.2/_docs/templates/_npe2_readers_guide.md.jinja0000644000000000000000000000444313615410400017677 0ustar00(readers-contribution-guide)= ## Readers Reader plugins may add support for new filetypes to napari. They are invoked whenever `viewer.open('some/path')` is used on the command line, or when a user opens a file in the graphical user interface by dropping a file into the canvas, or using `File -> Open...` The `command` provided by a reader contribution is expected to be a function that accepts a path (`str`) or a list of paths and: * returns `None` (if it does not want to accept the given path) * returns a *new function* (a `ReaderFunction`) that is capable of doing the reading. The `ReaderFunction` will be passed the same path (or list of paths) and is expected to return a list of {ref}`LayerData tuples `. In the rare case that a reader plugin would like to "claim" a file, but *not* actually add any data to the viewer, the `ReaderFunction` may return the special value `[(None,)]`. 
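As a minimal sketch of that contract (the `.log` suffix and the function names here are illustrative, not part of any real plugin): the entry point declines anything it does not recognize by returning `None`, and the returned `ReaderFunction` may use the special `[(None,)]` value to claim a path without adding layers.

```python
from typing import Callable, List, Optional, Tuple


def get_reader(path) -> Optional[Callable]:
    """Return a ReaderFunction for single ``.log`` paths, else None."""
    if isinstance(path, str) and path.endswith(".log"):
        return log_file_reader
    return None  # decline: napari will try other readers


def log_file_reader(path: str) -> List[Tuple]:
    # "claim" the path without adding any layers to the viewer:
    # the special return value described above.
    return [(None,)]
```

Note that a list of paths also falls through to `None` here, since this hypothetical reader only declares support for a single path.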
```{admonition} Accepting directories A reader may indicate that it accepts directories by setting `contributions.readers..accepts_directories` to `True`; otherwise, they will not be invoked when a directory is passed to `viewer.open`. ``` ### Reader example ::::{tab-set} :::{tab-item} npe2 **python implementation** ```python # example_plugin.some_module {{ 'readers'|example_implementation }} ``` **manifest** See [Readers contribution reference](contributions-readers) for field details. ```yaml {{ 'readers'|example_contribution }} ``` ::: :::{tab-item} napari-plugin-engine ```{admonition} Deprecated! This demonstrates the now-deprecated `napari-plugin-engine` pattern. ``` **python implementation** [hook specification](https://napari.org/stable/plugins/npe1.html#napari.plugins.hook_specifications.napari_get_reader) ```python from napari_plugin_engine import napari_hook_implementation @napari_hook_implementation def napari_get_reader(path: PathOrPaths) -> Optional[ReaderFunction]: # If we recognize the format, we return the actual reader function if isinstance(path, str) and path.endswith(".xyz"): return xyz_file_reader # otherwise we return None. return None def xyz_file_reader(path: PathOrPaths) -> List[LayerData]: data = ... # somehow read data from path layer_properties = {"name": "etc..."} return [(data, layer_properties)] ``` ::: :::: npe2-0.7.2/_docs/templates/_npe2_sample_data_guide.md.jinja0000644000000000000000000000330513615410400020520 0ustar00(sample-data-contribution-guide)= ## Sample Data This contribution point allows plugin developers to contribute sample data that will be accessible in the napari interface via the `File > Open Sample` menu, or via the command line with `viewer.open_sample`. Sample data can be useful for demonstrating the functionality of a given plugin. 
It can take the form of a **Sample Data URI** that points to a static resource (such as a file included in the plugin distribution, or a remote resource), or **Sample Data Function** that generates layer data on demand. ### Sample Data example ::::{tab-set} :::{tab-item} npe2 **python implementation** ```python # example_plugin.some_module {{ 'sample_data'|example_implementation }} ``` **manifest** See [Sample Data contribution reference](contributions-sample-data) for field details. ```yaml {{ 'sample_data'|example_contribution }} ``` ::: :::{tab-item} napari-plugin-engine ```{admonition} Deprecated! This demonstrates the now-deprecated `napari-plugin-engine` pattern. ``` **python implementation** [hook specification](https://napari.org/stable/plugins/npe1.html#napari.plugins.hook_specifications.napari_provide_sample_data) ```python import numpy as np from napari_plugin_engine import napari_hook_implementation def _generate_random_data(shape=(512, 512)): data = np.random.rand(*shape) return [(data, {'name': 'random data'})] @napari_hook_implementation def napari_provide_sample_data(): return { 'random data': _generate_random_data, 'random image': 'https://picsum.photos/1024', 'sample_key': { 'display_name': 'Some Random Data (512 x 512)', 'data': _generate_random_data, } } ``` ::: :::: npe2-0.7.2/_docs/templates/_npe2_widgets_guide.md.jinja0000644000000000000000000001007613615410400017717 0ustar00(widgets-contribution-guide)= ## Widgets Widget plugins allow developers to contribute novel graphical elements (aka "widgets") to the user interface. These widgets can request access to the viewer instance in which they are docked, enabling a broad range of functionality: essentially, anything that can be done with the napari `Viewer` and `Layer` APIs can be accomplished with widgets. ```{important} Because this is a powerful and open-ended plugin specification, we ask that plugin developers take additional care when providing widget plugins. 
Make sure to only use public methods on the `viewer` and `layer` instances. Also, be mindful of the fact that the user may be using your plugin along with other plugins or workflows: try to only modify layers added by your plugin, or specifically requested by the user. ``` The widget specification requires that the plugin provide napari with a *callable* object that, when called, returns an *instance* of a widget (where a "widget" is an instance of `QtWidgets.QWidget` or `magicgui.widgets.Widget`). There are a few commonly used patterns that fulfill this `Callable[..., Widget]` specification: 1. Provide a `class` object directly, such as a `QtWidgets.QWidget` or `magicgui.widgets.Widget` subclass: ```python from qtpy.QtWidgets import QWidget class MyPluginWidget(QWidget): def __init__(self, viewer: 'napari.viewer.Viewer', parent=None): super().__init__(parent) self._viewer = viewer ``` 2. Provide a wrapper function, or `magicgui.magic_factory` object: ```python from magicgui import magic_factory @magic_factory def create_widget(image: 'napari.types.ImageData') -> 'napari.types.ImageData': ... ``` *(reminder, in the example above, each time the `magic_factory`-decorated `create_widget()` function is called, it returns a new widget instance –– just as we need for the widget specification.)* 3. Lastly, you can provide an arbitrary function and request that napari autogenerate a widget using `magicgui.magicgui`. In the first generation `napari_plugin_engine`, this was the `napari_experimental_provide_function` hook specification. In the new `npe2` pattern, one uses the `autogenerate` field in the [WidgetContribution](contributions-widgets). ### Widget example ::::{tab-set} :::{tab-item} npe2 **python implementation** ```python # example_plugin.some_module {{ 'widgets'|example_implementation }} ``` **manifest** See [Widgets contribution reference](contributions-widgets) for field details. 
```yaml {{ 'widgets'|example_contribution }} ``` ::: :::{tab-item} napari-plugin-engine ```{admonition} Deprecated! This demonstrates the now-deprecated `napari-plugin-engine` pattern. ``` **python implementation** [hook_specification](https://napari.org/stable/plugins/npe1.html#napari.plugins.hook_specifications.napari_experimental_provide_dock_widget) ```python from qtpy.QtWidgets import QWidget from napari_plugin_engine import napari_hook_implementation class AnimationWizard(QWidget): def __init__(self, viewer: "napari.viewer.Viewer", parent=None): super().__init__(parent) ... @magic_factory def widget_factory( image: "napari.types.ImageData", threshold: int ) -> "napari.types.LabelsData": """Generate thresholded image. This pattern uses magicgui.magic_factory directly to turn a function into a callable that returns a widget. """ return (image > threshold).astype(int) def threshold( image: "napari.types.ImageData", threshold: int ) -> "napari.types.LabelsData": """Generate thresholded image. This function will be turned into a widget using `autogenerate: true`. """ return (image > threshold).astype(int) # in the first generation plugin engine, these widgets were declared # using special `napari_hook_implementation`-decorated functions. @napari_hook_implementation def napari_experimental_provide_dock_widget(): return [AnimationWizard, widget_factory] @napari_hook_implementation def napari_experimental_provide_function(): return [threshold] ``` ::: :::: npe2-0.7.2/_docs/templates/_npe2_writers_guide.md.jinja0000644000000000000000000001163413615410400017751 0ustar00(writers-contribution-guide)= ## Writers Writer plugins add support for exporting data from napari. 
They are invoked whenever `viewer.layers.save('some/path.ext')` is used on the command line, or when a user requests to save one or more layers in the graphical user interface with `File -> Save Selected Layer(s)...` or `Save All Layers...` ```{important} This guide describes the second generation (`npe2`) plugin specification. New plugins should no longer use the old `napari_get_writer` hook specification from the first generation `napari_plugin_engine`. ``` ### Writer plugin function signatures Writer plugins are *functions* that: 1. Accept a destination path and data from one or more layers in the viewer 2. Write layer data and associated attributes to disk 3. Return a list of strings containing the path(s) that were successfully written. They must follow one of two calling conventions (where the convention used is determined by the [`layer_type` constraints](layer-type-constraints) provided by the corresponding writer contribution in the manifest). #### 1. single-layer writer Single-layer writers will receive a **path**, layer **data**, and a `dict` of layer **attributes**, (e.g. `{'name': 'My Layer', 'opacity': 0.6}`) ```python def single_layer_writer(path: str, data: Any, attributes: dict) -> List[str]: ... ``` The formal type is as follows: ```python DataType = Any # usually something like a numpy array, but varies by layer LayerAttributes = dict SingleWriterFunction = Callable[[str, DataType, LayerAttributes], List[str]] ``` #### 2. multi-layer writer Multi-layer writers will receive a **path**, and a list of full [layer data tuples](layer-data-tuples). ```python def multi_layer_writer(path: str, layer_data: List[FullLayerData]) -> List[str]: ... 
``` The formal type is as follows: ```python DataType = Any # usually something like a numpy array, but varies by layer LayerAttributes = dict LayerName = Literal["graph", "image", "labels", "points", "shapes", "surface", "tracks", "vectors"] FullLayerData = Tuple[DataType, LayerAttributes, LayerName] MultiWriterFunction = Callable[[str, List[FullLayerData]], List[str]] ``` (layer-type-constraints)= ### Layer type constraints Individual writer contributions are determined to be **single-layer writers** or **multi-layer writers** based on their **`writer.layer_types`** constraints provided in the [contribution metadata](contributions-writers). A writer plugin declares that it can accept between *m* and *n* layers of a specific *type* (where *0 ≤ m ≤ n*), using regex-like syntax with the special characters **`?`**, **`+`** and **`*`**: - **`image`**: Writes exactly 1 image layer. - **`image?`**: Writes 0 or 1 image layers. - **`image+`**: Writes 1 or more image layers. - **`image*`**: Writes 0 or more image layers. - **`image{k}`**: Writes exactly k image layers. - **`image{m,n}`**: Writes between *m* and *n* layers (inclusive range). Must have *m <= n*. A writer plugin will *only* be invoked when its `layer_types` constraint is compatible with the layer type(s) that the user is saving. When a type is not present in the list of constraints, it is assumed the writer is **not** compatible with that type. **Consider this example contributions section in a manifest:** ```yaml contributions: writers: - command: example-plugin.some_writer layer_types: ["image+", "points?"] filename_extensions: [".ext"] ``` This writer would be considered when 1 or more `Image` layers and 0 or 1 `Points` layers are selected (i.e. the `Points` layer is optional). This writer would *not* be selected when the user tries to save an `image` and a `vectors` layer, because `vectors` is not listed in the `layer_types`. 
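Read mechanically, each `layer_types` entry maps one layer type to a `(min, max)` occurrence range. The helper below is a hypothetical illustration of that mapping (it is not part of the npe2 API):

```python
import math
import re


def parse_constraint(expr: str):
    """Map an entry like 'image+' or 'points{1,3}' to (type, min, max).

    Illustrative only: shows how the regex-like quantifiers documented
    above translate into occurrence ranges.
    """
    m = re.fullmatch(r"(\w+)(\?|\+|\*|\{(\d+)(?:,(\d+))?\})?", expr)
    if not m:
        raise ValueError(f"invalid layer_types constraint: {expr!r}")
    name, quant, lo, hi = m.group(1), m.group(2), m.group(3), m.group(4)
    if quant is None:          # bare type -> exactly 1
        return name, 1, 1
    if quant == "?":           # 0 or 1
        return name, 0, 1
    if quant == "+":           # 1 or more
        return name, 1, math.inf
    if quant == "*":           # 0 or more
        return name, 0, math.inf
    if hi is None:             # {k} -> exactly k
        return name, int(lo), int(lo)
    return name, int(lo), int(hi)  # {m,n} -> inclusive range
```

Under this reading, the example above (`["image+", "points?"]`) requires at least one image layer and tolerates at most one points layer, and any layer type absent from the list is disallowed.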
### Writer example ::::{tab-set} :::{tab-item} npe2 **python implementation** ```python # example_plugin.some_module {{ 'writers'|example_implementation }} ``` **manifest** See [Writers contribution reference](contributions-writers) for field details. ```yaml {{ 'writers'|example_contribution }} ``` ::: :::{tab-item} napari-plugin-engine ```{admonition} Deprecated! This demonstrates the now-deprecated `napari-plugin-engine` pattern. ``` **python implementation** [hook specification](https://napari.org/stable/plugins/npe1.html#single-layers-io) ```python from napari_plugin_engine import napari_hook_implementation @napari_hook_implementation def napari_write_points(path: str, data: Any, meta: dict) -> Optional[str]: """Write points data and metadata into a path. Parameters ---------- path : str Path to file, directory, or resource (like a URL). data : array (N, D) Points layer data meta : dict Points metadata. Returns ------- path : str or None If data is successfully written, return the ``path`` that was written. Otherwise, if nothing was done, return ``None``. """ ``` ::: :::: npe2-0.7.2/docs/_config.yml0000644000000000000000000000371013615410400012364 0ustar00# Book settings # Learn more at https://jupyterbook.org/customize/config.html title: napari author: The napari team logo: images/logo.png # only_build_toc_files: true # Force re-execution of notebooks on each build. 
# See https://jupyterbook.org/content/execute.html execute: execute_notebooks: force # Information about where the book exists on the web repository: url: https://github.com/napari/napari # Online location of your book path_to_book: docs # Optional path to your book, relative to the repository root branch: main # Which branch of the repository should be used when creating links (optional) # Exclude files from build (prevents Sphinx warnings about missing # files in table of contents _toc.yml) exclude_patterns: [ 'ORGANIZATION.md', 'guides/_viewer_events.md', 'guides/_layer_events.md', ] # Add GitHub buttons to your book # See https://jupyterbook.org/customize/config.html#add-a-link-to-your-repository html: use_issues_button: true use_repository_button: true sphinx: extra_extensions: - sphinx.ext.intersphinx - sphinx.ext.napoleon - sphinx.ext.autodoc - sphinx_tabs.tabs config: autosummary_generate: True autosummary_imported_members: True html_theme: furo html_theme_options: {} pygments_style: solarized-dark suppress_warnings: ["myst.header"] exclude_patterns: ['**/_*.md'] # includes templates_path: - '_templates' intersphinx_mapping: python: - "https://docs.python.org/3" - null numpy: - "https://numpy.org/doc/stable/" - null napari_plugin_engine: - "https://napari-plugin-engine.readthedocs.io/en/latest/" - "https://napari-plugin-engine.readthedocs.io/en/latest/objects.inv" magicgui: - "https://pyapp-kit.github.io/magicgui/" - "https://pyapp-kit.github.io/magicgui/objects.inv" napari: - "https://napari.org/" - "https://napari.org/docs/dev/objects.inv" npe2-0.7.2/docs/_toc.yml0000644000000000000000000000055613615410400011711 0ustar00format: jb-article root: index sections: # this is meant to match the plugins toc at napari - file: plugins/index sections: - file: plugins/first_plugin - file: plugins/manifest - file: plugins/contributions - file: plugins/guides - file: plugins/test_deploy - file: plugins/best_practices - file: plugins/npe2_migration_guide - file: 
plugins/npe1 npe2-0.7.2/docs/index.md0000644000000000000000000000022713615410400011666 0ustar00# napari This is a stub file for building the plugin documentation. It is not used and should not be edited [go to plugins section](./plugins/index) npe2-0.7.2/docs/requirements.txt0000644000000000000000000000005613615410400013521 0ustar00jupyter-book furo Jinja2 sphinx-tabs magicgui npe2-0.7.2/src/npe2/__init__.py0000644000000000000000000000145413615410400013054 0ustar00from importlib.metadata import PackageNotFoundError, version try: __version__ = version("npe2") except PackageNotFoundError: # pragma: no cover __version__ = "unknown" __author__ = "Talley Lambert" __email__ = "talley.lambert@gmail.com" from ._dynamic_plugin import DynamicPlugin from ._inspection._fetch import fetch_manifest, get_manifest_from_wheel from ._plugin_manager import PluginContext, PluginManager from .io_utils import read, read_get_reader, write, write_get_writer from .manifest import PackageMetadata, PluginManifest __all__ = [ "__version__", "DynamicPlugin", "fetch_manifest", "get_manifest_from_wheel", "PackageMetadata", "PluginContext", "PluginManager", "PluginManifest", "read_get_reader", "read", "write_get_writer", "write", ] npe2-0.7.2/src/npe2/_command_registry.py0000644000000000000000000001116613615410400015023 0ustar00from __future__ import annotations from dataclasses import dataclass from functools import partial from typing import TYPE_CHECKING, Any, Callable, Dict, Optional, Union from psygnal import Signal from .manifest import utils from .manifest._validators import DOTTED_NAME_PATTERN from .types import PythonName PDisposable = Callable[[], None] if TYPE_CHECKING: from .manifest.schema import PluginManifest @dataclass class CommandHandler: id: str function: Optional[Callable] = None python_name: Optional[PythonName] = None def resolve(self) -> Callable: if self.function is not None: return self.function if self.python_name is None: raise RuntimeError("cannot resolve command 
without python_name")
        try:
            self.function = utils.import_python_name(self.python_name)
        except Exception as e:
            raise RuntimeError(
                f"Failed to import command at {self.python_name!r}: {e}"
            ) from e
        return self.function


class CommandRegistry:
    command_registered = Signal(str)
    command_unregistered = Signal(str)

    def __init__(self) -> None:
        self._commands: Dict[str, CommandHandler] = {}

    def register(self, id: str, command: Union[Callable, str]) -> PDisposable:
        """Register a command under `id`.

        Parameters
        ----------
        id : str
            A unique key with which to refer to this command
        command : Union[Callable, str]
            Either a callable object, or (if a string) the fully qualified name
            of a python object. If a string is provided, it is not imported
            until the command is actually executed.

        Returns
        -------
        PDisposable
            A callable that, when called, unregisters the command.

        Raises
        ------
        ValueError
            If the id is not a non-empty string, or if it already exists.
        TypeError
            If `command` is not a string or a callable object.
        """
        if not (isinstance(id, str) and id.strip()):
            raise ValueError(
                f"Invalid command id for {command}, must be non-empty string"
            )
        if id in self._commands:
            raise ValueError(f"Command {id} already exists")
        if isinstance(command, str):
            if not DOTTED_NAME_PATTERN.match(command):
                raise ValueError(
                    f"String command {command!r} is not a valid qualified python path."
                )
            cmd = CommandHandler(id, python_name=PythonName(command))
        elif not callable(command):
            raise TypeError(f"Cannot register non-callable command: {command}")
        else:
            cmd = CommandHandler(id, function=command)

        # TODO: validate arguments and type constraints
        # possibly wrap command in a type validator?
        self._commands[id] = cmd
        self.command_registered.emit(id)
        return partial(self.unregister, id)

    def unregister(self, id: str):
        """Unregister command with key `id`.
No-op if key doesn't exist.""" if id in self._commands: del self._commands[id] self.command_unregistered.emit(id) def register_manifest(self, mf: PluginManifest) -> None: """Register all commands in a manifest""" if mf.contributions and mf.contributions.commands: for cmd in mf.contributions.commands: if cmd.python_name and cmd.id not in self: self.register(cmd.id, cmd.python_name) def unregister_manifest(self, mf: PluginManifest) -> None: """Unregister all commands in a manifest""" if mf.contributions and mf.contributions.commands: for cmd in mf.contributions.commands: if cmd.id in self: self.unregister(cmd.id) def get(self, id: str) -> Callable: """Get callable object for command `id`.""" # FIXME: who should control activation? if id not in self._commands: from ._plugin_manager import PluginManager pm = PluginManager.instance() if id in pm._contrib._commands: _, plugin_key = pm._contrib._commands[id] pm.activate(plugin_key) if id not in self._commands: # sourcery skip raise KeyError(f"command {id!r} not registered") return self._commands[id].resolve() def execute(self, id: str, args=(), kwargs=None) -> Any: if kwargs is None: kwargs = {} return self.get(id)(*args, **kwargs) def __contains__(self, id: str): return id in self._commands npe2-0.7.2/src/npe2/_dynamic_plugin.py0000644000000000000000000002342313615410400014456 0ustar00from __future__ import annotations from typing import ( Any, Callable, Dict, Generic, List, Literal, Optional, Type, TypeVar, Union, overload, ) from pydantic import BaseModel, ValidationError from ._plugin_manager import PluginManager from .manifest.contributions import ( CommandContribution, ContributionPoints, ReaderContribution, SampleDataGenerator, WidgetContribution, WriterContribution, ) from .manifest.schema import PluginManifest C = TypeVar("C", bound=BaseModel) T = TypeVar("T", bound=Callable[..., Any]) # a mapping of contribution type to string name in the ContributionPoints # e.g. 
{ReaderContribution: 'readers'} CONTRIB_NAMES = {v.type_: k for k, v in ContributionPoints.__fields__.items()} for key in list(CONTRIB_NAMES): if getattr(key, "__origin__", "") == Union: v = CONTRIB_NAMES.pop(key) for t in key.__args__: CONTRIB_NAMES[t] = v class DynamicPlugin: """A context manager that creates and modifies temporary plugin contributions. Parameters ---------- name : str Optional name for this temporary plugin., by default "temp-plugin" plugin_manager : Optional[PluginManager] A plugin manager instance with which to associate this plugin. If `None` (the default), the global `PluginManager.instance()` will be used. manifest: Optional[PluginManifest], Optionally provide a manifest to use for this plugin. If not provided, a new manifest will be created. Examples -------- >>> with TemporaryPlugin('name') as p: >>> @p.contribute.sample_data >>> def make_data() -> np.ndarray: ... """ def __init__( self, name: str = "temp-plugin", plugin_manager: Optional[PluginManager] = None, manifest: Optional[PluginManifest] = None, ) -> None: if isinstance(manifest, PluginManifest): self.manifest = manifest else: self.manifest = PluginManifest(name=name) self.contribute = ContributionDecorators(self) self._pm = plugin_manager @property def name(self) -> str: """Name of the plugin.""" return self.manifest.name @property def display_name(self) -> str: """Display name of the plugin.""" return self.manifest.display_name def cleanup(self) -> None: """Remove this plugin from its plugin manager.""" self.plugin_manager.unregister(self.manifest.name) def register(self) -> None: """Register this plugin with its plugin manager.""" self.plugin_manager.register(self.manifest) def clear(self) -> None: """Clear contributions.""" self.plugin_manager.deactivate(self.manifest.name) self.plugin_manager._contrib.remove_contributions(self.manifest.name) self.manifest.contributions = ContributionPoints() @property def plugin_manager(self) -> PluginManager: """Return the plugin manager 
this plugin is registered in. If unset, will use the global plugin manager instance. """ return self._pm if self._pm is not None else PluginManager.instance() @plugin_manager.setter def plugin_manager(self, pm: Optional[PluginManager]) -> None: """Set the plugin manager this plugin is registered in.""" if pm is self._pm: # pragma: no cover return my_cmds: Dict[str, Callable] = { k: v.function for k, v in self.plugin_manager.commands._commands.items() if k.startswith(self.manifest.name) and v.function } self.cleanup() self._pm = pm self.register() for k, v in my_cmds.items(): self.plugin_manager.commands.register(k, v) def __enter__(self) -> DynamicPlugin: self.register() return self def __exit__(self, *_) -> None: self.cleanup() def spawn( self, name: Optional[str] = None, plugin_manager: Optional[PluginManager] = None, register: bool = False, ) -> DynamicPlugin: """Create a new DynamicPlugin instance with the same plugin manager. Parameters ---------- name : Optional[str] If not provided, will increment current name, by default None plugin_manager : Optional[PluginManager] Plugin manager, by default the same as this plugin's plugin manager register : bool Whether to register the new plugin, by default False Returns ------- DynamicPlugin A new DynamicPlugin instance. """ pm = plugin_manager or self.plugin_manager assert isinstance(pm, PluginManager), "plugin_manager must be a PluginManager" if name: assert name not in pm._manifests, f"name {name!r} already in plugin manager" _name = name else: i = 1 while (_name := f"{self.name}-{i}") in pm._manifests: i += 1 new = type(self)(_name, plugin_manager=pm) if register: new.register() return new class ContributionDecorators: """A set of decorators that facilitate adding contributions to a TemporaryPlugin. Examples -------- >>> with TemporaryPlugin('name') as p: >>> @p.contribute.sample_data >>> def make_data() -> np.ndarray: ... 
>>> """

    def __init__(self, plugin: DynamicPlugin) -> None:
        self.plugin = plugin
        self.command = ContributionDecorator(plugin, CommandContribution)
        self.reader = ContributionDecorator(plugin, ReaderContribution)
        self.writer = ContributionDecorator(plugin, WriterContribution)
        self.widget = ContributionDecorator(plugin, WidgetContribution)
        self.sample_data = ContributionDecorator(plugin, SampleDataGenerator)


class ContributionDecorator(Generic[C]):
    """An actual instance of a contribution decorator.

    This holds the logic for actually adding a decorated function as a
    contribution of a specific `contrib_type` to a temporary plugin.
    """

    def __init__(self, plugin: DynamicPlugin, contrib_type: Type[C]) -> None:
        self.plugin = plugin
        self.contrib_type = contrib_type
        self._contrib_name = CONTRIB_NAMES[self.contrib_type]

    @overload
    def __call__(self, func: T, **kwargs) -> T: ...

    @overload
    def __call__(self, func: Literal[None] = None, **kwargs) -> Callable[[T], T]: ...

    def __call__(
        self, func: Optional[T] = None, **kwargs
    ) -> Union[T, Callable[[T], T]]:
        """Decorate function as providing this contribution type.

        This is the actual decorator used when one calls, e.g.

        >>> @npe2plugin.contribute.reader
        >>> def some_func(path):
        >>>     ...
        """

        def _mark_contribution(_func: T, _kwargs=kwargs) -> T:
            try:
                self._set_defaults(_func, _kwargs)
                _kwargs = self._store_command(_func, _kwargs)
                self._store_contrib(_kwargs)
                self.plugin.plugin_manager._contrib.reindex(self._mf)
            except ValidationError as e:
                # cleanup any added commands
                if "command" in _kwargs:
                    new = [c for c in self.commands if c.id != _kwargs["command"]]
                    self._mf.contributions.commands = new
                    self.plugin.plugin_manager.commands.unregister(_kwargs["command"])
                raise AssertionError(
                    f"Invalid decorator for {self.contrib_type.__name__}.\n{e}"
                ) from e
            return _func

        return _mark_contribution if func is None else _mark_contribution(func)

    def _set_defaults(self, _func: T, kwargs: dict) -> None:
        """Populate contribution kwargs with reasonable type-specific defaults"""
        if self.contrib_type is ReaderContribution:
            kwargs.setdefault("filename_patterns", ["*"])
        if self.contrib_type is SampleDataGenerator:
            kwargs.setdefault("key", _func.__name__)
            kwargs.setdefault("display_name", _func.__name__)
        if self.contrib_type is WriterContribution:
            kwargs.setdefault("layer_types", [])

    def _store_contrib(self, kwargs: dict) -> None:
        """Store the new contribution in the manifest"""
        if self.contrib_type is not CommandContribution:
            self.contribution_list.append(self.contrib_type(**kwargs))

    def _store_command(self, func: T, kwargs: dict) -> dict:
        """Create a new command contribution for `func`"""
        kwargs.setdefault("title", func.__name__)
        kwargs.setdefault("id", f"{self.plugin.manifest.name}.{func.__name__}")
        cmd_kwargs = {
            k: kwargs.pop(k)
            for k in list(kwargs)
            if k in CommandContribution.__fields__
        }
        cmd = CommandContribution(**cmd_kwargs)
        self.commands.append(cmd)
        self.plugin.plugin_manager.commands.register(cmd.id, func)
        kwargs["command"] = cmd.id
        return kwargs

    @property
    def _mf(self) -> PluginManifest:
        """Return the manifest currently held by the temporary plugin"""
        return self.plugin.manifest

    @property
    def contribution_list(self) -> List[C]:
        """Return contributions of
this type in the associated manifest.""" if not getattr(self._mf.contributions, self._contrib_name): setattr(self._mf.contributions, self._contrib_name, []) return getattr(self._mf.contributions, self._contrib_name) @property def commands(self) -> List[CommandContribution]: """Return the CommandContributions in the associated manifest.""" if not self._mf.contributions.commands: self._mf.contributions.commands = [] return self._mf.contributions.commands npe2-0.7.2/src/npe2/_plugin_manager.py0000644000000000000000000007031313615410400014444 0ustar00from __future__ import annotations import contextlib import os import urllib import warnings from collections import Counter from fnmatch import fnmatch from importlib import metadata from pathlib import Path from typing import ( TYPE_CHECKING, AbstractSet, Any, Callable, DefaultDict, Dict, Iterable, Iterator, List, Mapping, Optional, Sequence, Set, Tuple, Union, ) from psygnal import Signal, SignalGroup from ._command_registry import CommandRegistry from .manifest import PluginManifest from .manifest._npe1_adapter import NPE1Adapter from .manifest.contributions import LayerType, WriterContribution from .types import PathLike, PythonName if TYPE_CHECKING: from .manifest.contributions import ( CommandContribution, MenuItem, ReaderContribution, SampleDataContribution, SubmenuContribution, ThemeContribution, WidgetContribution, ) IntStr = Union[int, str] AbstractSetIntStr = AbstractSet[IntStr] DictIntStrAny = Dict[IntStr, Any] MappingIntStrAny = Mapping[IntStr, Any] InclusionSet = Union[AbstractSetIntStr, MappingIntStrAny, None] DisposeFunction = Callable[[], None] __all__ = ["PluginContext", "PluginManager"] PluginName = str # this is `PluginManifest.name` class _ContributionsIndex: def __init__(self) -> None: self._indexed: Set[str] = set() self._commands: Dict[str, Tuple[CommandContribution, PluginName]] = {} self._readers: List[Tuple[str, ReaderContribution]] = [] self._writers: List[Tuple[LayerType, int, int, 
WriterContribution]] = [] # DEPRECATED: only here for napari <= 0.4.15 compat. self._samples: DefaultDict[str, List[SampleDataContribution]] = DefaultDict( list ) def reindex(self, manifest): self.remove_contributions(manifest.name) self.index_contributions(manifest) def index_contributions(self, manifest: PluginManifest): ctrb = manifest.contributions if not ctrb or manifest.name in self._indexed: return # pragma: no cover self._indexed.add(manifest.name) for cmd in ctrb.commands or (): self._commands[cmd.id] = cmd, manifest.name for reader in ctrb.readers or (): for pattern in reader.filename_patterns: self._readers.append((pattern, reader)) if reader.accepts_directories: self._readers.append(("", reader)) for writer in ctrb.writers or (): for c in writer.layer_type_constraints(): self._writers.append((c.layer_type, *c.bounds, writer)) # DEPRECATED: only here for napari <= 0.4.15 compat. if ctrb.sample_data: self._samples[manifest.name] = ctrb.sample_data def remove_contributions(self, key: PluginName) -> None: """This must completely remove everything added by `index_contributions`.""" if key not in self._indexed: return # pragma: no cover for cmd_id, (_, plugin) in list(self._commands.items()): if key == plugin: del self._commands[cmd_id] self._readers = [ (pattern, reader) for pattern, reader in self._readers if reader.plugin_name != key ] self._writers = [ (layer_type, min_, max_, writer) for layer_type, min_, max_, writer in self._writers if writer.plugin_name != key ] self._indexed.remove(key) # DEPRECATED: only here for napari <= 0.4.15 compat. self._samples.pop(key, None) def get_command(self, command_id: str) -> CommandContribution: return self._commands[command_id][0] def iter_compatible_readers(self, paths: List[str]) -> Iterator[ReaderContribution]: assert isinstance(paths, list) if not paths: return # pragma: no cover if len({Path(i).suffix for i in paths}) > 1: raise ValueError( "All paths in the stack list must have the same extension." 
) path = paths[0] if not path: return assert isinstance(path, str) if os.path.isdir(path): yield from (r for pattern, r in self._readers if pattern == "") else: # ensure not a URI if not urllib.parse.urlparse(path).scheme: # lower case the extension for checking manifest pattern base = os.path.splitext(Path(path).stem)[0] ext = "".join(Path(path).suffixes) path = base + ext.lower() # not sure about the set logic as it won't be lazy anymore, # but would we yield duplicate anymore. # above does not have have the unseen check either. # it's easy to make an iterable version if we wish, or use more-itertools. # match against pattern.lower() to make matching case insensitive yield from { r for pattern, r in self._readers if fnmatch(path, pattern.lower()) } def iter_compatible_writers( self, layer_types: Sequence[str] ) -> Iterator[WriterContribution]: """Attempt to match writers that consume all layers.""" if not layer_types: return # First count how many of each distinct type are requested. We'll use # this to get candidate writers compatible with the requested count. counts = Counter(layer_types) def _get_candidates(lt: LayerType) -> Set[WriterContribution]: return { w for layer, min_, max_, w in self._writers if layer == lt and (min_ <= counts[lt] < max_) } # keep ordered without duplicates candidates = list({w: None for _, _, _, w in self._writers}) for lt in LayerType: if candidates: candidates = [i for i in candidates if i in _get_candidates(lt)] else: break def _writer_key(writer: WriterContribution) -> Tuple[bool, int]: # 1. writers with no file extensions (like directory writers) go last no_ext = len(writer.filename_extensions) == 0 # 2. 
more "specific" writers first nbounds = sum(not c.is_zero() for c in writer.layer_type_constraints()) return (no_ext, nbounds) yield from sorted(candidates, key=_writer_key) class PluginManagerEvents(SignalGroup): plugins_registered = Signal( set, description="Emitted with a set of PluginManifest instances " "whenever new plugins are registered. 'Registered' means that a " "manifest has been provided or discovered.", ) activation_changed = Signal( set, set, description="Emitted with two arguments: a set of plugin " "names that were activated, and a set of names that were " "deactivated. 'Activated' means the plugin has been *imported*, its " "`on_activate` function was called.", ) enablement_changed = Signal( set, set, description="Emitted with two arguments: a set of plugin names " "that were enabled, and a set of names that were " "disabled. 'Disabled' means the plugin remains installed, but it " "cannot be activated, and its contributions will not be indexed.", ) class PluginManager: __instance: Optional[PluginManager] = None # a global instance _contrib: _ContributionsIndex events: PluginManagerEvents def __init__( self, *, disable: Iterable[str] = (), reg: Optional[CommandRegistry] = None ) -> None: self._disabled_plugins: Set[PluginName] = set(disable) self._command_registry = reg or CommandRegistry() self._contexts: Dict[PluginName, PluginContext] = {} self._contrib = _ContributionsIndex() self._manifests: Dict[PluginName, PluginManifest] = {} self.events = PluginManagerEvents(self) self._npe1_adapters: List[NPE1Adapter] = [] # up to napari 0.4.15, discovery happened in the init here # so if we're running on an older version of napari, we need to discover try: nv = metadata.version("napari") except metadata.PackageNotFoundError: # pragma: no cover pass else: # pragma: no cover vsplit = nv.split(".")[:4] if ( "dev" in nv and vsplit < ["0", "4", "16", "dev4"] or "dev" not in nv and vsplit < ["0", "4", "16"] ): self.discover() @classmethod def instance(cls) -> 
PluginManager: """Return global PluginManager singleton instance.""" if cls.__instance is None: cls.__instance = cls() return cls.__instance @property def commands(self) -> CommandRegistry: return self._command_registry # Discovery, activation, enablement def discover( self, paths: Sequence[str] = (), clear=False, include_npe1=False ) -> int: """Discover and index plugin manifests in the environment. Parameters ---------- paths : Sequence[str] Optional list of strings to insert at front of sys.path when discovering. clear : bool Clear and re-index the environment. If `False` (the default), calling discover again will only register and index newly discovered plugins. (Existing manifests will not be re-indexed) include_npe1 : bool Whether to detect npe1 plugins as npe1_adapters during discovery. By default `False`. Returns ------- discover_count : int Number of discovered plugins """ if clear: self._contrib = _ContributionsIndex() self._manifests.clear() count = 0 with self.events.plugins_registered.paused(lambda a, b: (a[0] | b[0],)): for result in PluginManifest.discover(paths=paths): if ( result.manifest and result.manifest.name not in self._manifests and (include_npe1 or not isinstance(result.manifest, NPE1Adapter)) ): self.register(result.manifest, warn_disabled=False) count += 1 return count def index_npe1_adapters(self): """Import and index any/all npe1 adapters.""" with warnings.catch_warnings(): warnings.showwarning = lambda e, *_: print(str(e).split(" Please add")[0]) while self._npe1_adapters: self._contrib.index_contributions(self._npe1_adapters.pop()) def register( self, manifest_or_package: Union[PluginManifest, str], warn_disabled=True ) -> None: """Register a plugin manifest, path to manifest file, or a package name. Parameters ---------- manifest_or_package : Union[PluginManifest, str] Either a PluginManifest instance or a string. If a string, should be either the name of a plugin package, or a path to a plugin manifest file. 
warn_disabled : bool, optional If True, emits a warning if the plugin being registered is marked as disabled, by default True. Raises ------ ValueError If a plugin with the same name is already registered. """ if isinstance(manifest_or_package, str): if Path(manifest_or_package).is_file(): manifest = PluginManifest.from_file(manifest_or_package) else: manifest = PluginManifest.from_distribution(manifest_or_package) elif isinstance(manifest_or_package, PluginManifest): manifest = manifest_or_package else: # pragma: no cover raise TypeError( "The first argument to register must be a string or a PluginManifest." ) if manifest.name in self._manifests: raise ValueError(f"A manifest with name {manifest.name!r} already exists.") self._manifests[manifest.name] = manifest if self.is_disabled(manifest.name): if warn_disabled: warnings.warn( f"Disabled plugin {manifest.name!r} was registered, but will not " "be indexed. Use `warn_disabled=False` to suppress this message.", stacklevel=2, ) elif isinstance(manifest, NPE1Adapter): self._npe1_adapters.append(manifest) else: self._contrib.index_contributions(manifest) self.events.plugins_registered.emit({manifest}) def unregister(self, key: PluginName): """Unregister plugin named `key`.""" if key not in self._manifests: raise ValueError(f"No registered plugin named {key!r}") # pragma: no cover self.deactivate(key) self._contrib.remove_contributions(key) self._manifests.pop(key) def activate(self, key: PluginName) -> PluginContext: """Activate plugin with `key`. This does the following: - finds the manifest for the associated plugin key - gets or creates a PluginContext for the plugin - bails if it's already activated - otherwise calls the plugin's activate() function, passing the Context. - imports any commands that were declared as python_name: - emits an event """ # TODO: this is an important function... 
should be carefully considered if key not in self._manifests: raise KeyError(f"Cannot activate unrecognized plugin: {key!r}") if self.is_disabled(key): raise ValueError(f"Cannot activate disabled plugin: {key!r}") # create the context that will be with this plugin for its lifetime. ctx = self.get_context(key) if ctx._activated: # prevent "reactivation" return ctx mf = self._manifests[key] try: if mf.on_activate: _call_python_name(mf.on_activate, args=(ctx,)) except Exception as e: # pragma: no cover self._contexts.pop(key, None) raise type(e)(f"Activating plugin {key!r} failed: {e}") from e self.commands.register_manifest(mf) ctx._activated = True self.events.activation_changed({mf.name}, {}) return ctx def get_context(self, plugin_name: PluginName) -> PluginContext: """Return PluginContext for plugin_name""" if plugin_name not in self._contexts: self._contexts[plugin_name] = PluginContext(plugin_name, reg=self.commands) return self._contexts[plugin_name] def deactivate(self, plugin_name: PluginName) -> None: """Deactivate `plugin_name` This does the following: - unregisters all commands from the associated manifest - calls the plugin's on_deactivate() func, passing the Context. - calls and cleanup functions in the context's `_dispose` method. - emits an event This does not: - "unindex" contributions (i.e. the contributions of a deactivated plugin are still visible in the index) - "disable" the plugin (i.e. it can still be used). """ mf = self._manifests[plugin_name] self.commands.unregister_manifest(mf) if plugin_name not in self._contexts: return ctx = self._contexts.pop(plugin_name) if mf.on_deactivate: _call_python_name(mf.on_deactivate, args=(ctx,)) ctx._activated = False ctx._dispose() self.events.activation_changed({}, {mf.name}) def enable(self, plugin_name: PluginName) -> None: """Enable a plugin (which mostly means just `un-disable` it). This is a no-op if the plugin wasn't already disabled. 
""" if not self.is_disabled(plugin_name): return # pragma: no cover self._disabled_plugins.remove(plugin_name) mf = self._manifests.get(plugin_name) if mf is not None: self._contrib.index_contributions(mf) self.events.enablement_changed({plugin_name}, {}) def disable(self, plugin_name: PluginName) -> None: """Disable a plugin. 'Disabled' means the plugin remains installed, but it cannot be activated, and its contributions will not be indexed. Menu items and keybindings and such will not be available. In napari, plugin disablement is persisted across sessions. """ if self.is_disabled(plugin_name): return # pragma: no cover with contextlib.suppress(KeyError): self.deactivate(plugin_name) self._disabled_plugins.add(plugin_name) self._contrib.remove_contributions(plugin_name) self.events.enablement_changed({}, {plugin_name}) def is_disabled(self, plugin_name: str) -> bool: """Return `True` if plugin_name is disabled.""" return plugin_name in self._disabled_plugins # Getting manifests def get_manifest(self, plugin_name: str) -> PluginManifest: """Get manifest for `plugin_name`""" key = str(plugin_name).split(".")[0] if key not in self._manifests: msg = f"Manifest key {key!r} not found in {list(self._manifests)}" raise KeyError(msg) return self._manifests[key] def iter_manifests( self, disabled: Optional[bool] = None ) -> Iterator[PluginManifest]: """Iterate through registered manifests. Parameters ---------- disabled : Optional[bool] If `False`, yield only enabled manifests. If `True`, yield only disabled manifests. If `None` (the default), yield all manifests. Yields ------ PluginManifest """ for key, mf in self._manifests.items(): if disabled is True and not self.is_disabled(key): continue elif disabled is False and self.is_disabled(key): continue yield mf def dict( self, *, include: Optional[InclusionSet] = None, exclude: Optional[InclusionSet] = None, ) -> Dict[str, Any]: """Return a dictionary with the state of the plugin manager. 
`include` and `exclude` will be passed to each `PluginManifest.dict()` See pydantic documentation for details: https://pydantic-docs.helpmanual.io/usage/exporting_models/#modeldict `include` and `exclude` may be a set of dotted strings, indicating nested fields in the manifest model. For example: {'contributions.readers', 'package_metadata.description'} will be expanded to { 'contributions': {'readers': True}, 'package_metadata': {'description': True} } This facilitates selection of nested fields on the command line. Parameters ---------- include : InclusionSet, optional A set of manifest fields to include, by default all fields are included. exclude : InclusionSet, optional A set of manifest fields to exclude, by default no fields are excluded. Returns ------- Dict[str, Any] Dictionary with the state of the plugin manager. Keys will include - `'plugins'`: dict of `{name: manifest.dict()} for discovered plugins - `'disabled'`: set of disabled plugins - `'activated'`: set of activated plugins """ # _include = out: Dict[str, Any] = { "plugins": { mf.name: mf.dict( include=_expand_dotted_set(include), exclude=_expand_dotted_set(exclude), ) for mf in self.iter_manifests() } } if not exclude or "disabled" not in exclude: out["disabled"] = set(self._disabled_plugins) if not exclude or "activated" not in exclude: out["activated"] = { name for name, ctx in self._contexts.items() if ctx._activated } return out def __contains__(self, name: str) -> bool: return name in self._manifests def __getitem__(self, name: str) -> PluginManifest: return self.get_manifest(name) def __len__(self) -> int: return len(self._manifests) # Accessing Contributions def get_command(self, command_id: str) -> CommandContribution: """Retrieve CommandContribution for `command_id`""" return self._contrib.get_command(command_id) def get_submenu(self, submenu_id: str) -> SubmenuContribution: """Get SubmenuContribution for `submenu_id`.""" for mf in self.iter_manifests(disabled=False): for subm in 
mf.contributions.submenus or (): if submenu_id == subm.id: return subm raise KeyError(f"No plugin provides a submenu with id {submenu_id}") def iter_menu(self, menu_key: str, disabled=False) -> Iterator[MenuItem]: """Iterate over `MenuItems` in menu with id `menu_key`.""" for mf in self.iter_manifests(disabled=disabled): yield from mf.contributions.menus.get(menu_key, ()) def menus(self, disabled=False) -> Dict[str, List[MenuItem]]: """Return all registered menu_key -> List[MenuItems].""" _menus: DefaultDict[str, List[MenuItem]] = DefaultDict(list) for mf in self.iter_manifests(disabled=disabled): for key, menus in mf.contributions.menus.items(): _menus[key].extend(menus) return dict(_menus) def iter_themes(self) -> Iterator[ThemeContribution]: """Iterate over discovered/enabled `ThemeContributions`.""" for mf in self.iter_manifests(disabled=False): yield from mf.contributions.themes or () def iter_compatible_readers( self, path: Union[PathLike, Sequence[str]] ) -> Iterator[ReaderContribution]: """Iterate over ReaderContributions compatible with `path`. Parameters ---------- path : Union[PathLike, Sequence[str]] Pathlike or list of pathlikes, with file(s) to read. """ if isinstance(path, (str, Path)): path = [path] assert isinstance(path, list) return self._contrib.iter_compatible_readers(path) def iter_compatible_writers( self, layer_types: Sequence[str] ) -> Iterator[WriterContribution]: """Iterate over compatible WriterContributions given a sequence of layer_types.
Parameters ---------- layer_types : Sequence[str] list of lowercase Layer type names like `['image', 'labels']` """ return self._contrib.iter_compatible_writers(layer_types) def iter_widgets(self) -> Iterator[WidgetContribution]: """Iterate over discovered WidgetContributions.""" for mf in self.iter_manifests(disabled=False): yield from mf.contributions.widgets or () def iter_sample_data( self, ) -> Iterator[Tuple[PluginName, List[SampleDataContribution]]]: """Iterates over (plugin_name, [sample_contribs]).""" for mf in self.iter_manifests(disabled=False): if mf.contributions.sample_data: yield mf.name, mf.contributions.sample_data def get_writer( self, path: str, layer_types: Sequence[str], plugin_name: Optional[str] = None ) -> Tuple[Optional[WriterContribution], str]: """Get Writer contribution appropriate for `path`, and `layer_types`. When `path` has a file extension, find a compatible writer that has that same extension. When there is no extension and only a single layer, find a compatible writer and append the extension. Otherwise, find a compatible no-extension writer and write to that. No-extension writers typically write to a folder. Parameters ---------- path : str Path to write layer_types : Sequence[str] Sequence of layer type strings (e.g. ['image', 'labels']) plugin_name : Optional[str], optional Name of plugin to use. If provided, only writers from `plugin_name` will be considered, otherwise all plugins are considered. by default `None`. Returns ------- Tuple[Optional[WriterContribution], str] WriterContribution and path that will be written. """ ext = Path(path).suffix.lower() if path else "" for writer in self.iter_compatible_writers(layer_types): if not plugin_name or writer.command.startswith(plugin_name): if ( ext and ext in writer.filename_extensions or not ext and len(layer_types) != 1 and not writer.filename_extensions ): return writer, path elif not ext and len(layer_types) == 1: # No extension, single layer. 
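# (Hypothetical illustration, not part of the original source: with
#  path="out", layer_types=["image"], and a compatible writer whose
#  filename_extensions is [".tif", ".png"], the branch below takes the
#  writer's first extension and returns the pair (writer, "out.tif").)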
ext = next(iter(writer.filename_extensions), "") return writer, path + ext # When the list of extensions for the writer doesn't match the # extension in the filename, keep searching. # Nothing got found return None, path class PluginContext: """An object that can contain information for a plugin over its lifetime.""" # stores all created contexts (currently cleared by `PluginManager.deactivate`) def __init__( self, plugin_key: PluginName, reg: Optional[CommandRegistry] = None ) -> None: self._activated = False self.plugin_key = plugin_key self._command_registry = reg or PluginManager.instance().commands self._imports: Set[str] = set() # modules that were imported by this plugin # functions to call when deactivating self._disposables: Set[DisposeFunction] = set() def _dispose(self): while self._disposables: self._disposables.pop()() def register_command(self, id: str, command: Optional[Callable] = None): """Associate a callable with a command id.""" def _inner(command): self._disposables.add(self._command_registry.register(id, command)) return command return _inner if command is None else _inner(command) def register_disposable(self, func: DisposeFunction): """Register `func` to be executed when this plugin is deactivated.""" self._disposables.add(func) def _call_python_name(python_name: PythonName, args=()) -> Any: """convenience to call `python_name` function. eg `module.submodule:funcname`.""" from .manifest.utils import import_python_name if not python_name: # pragma: no cover return None func = import_python_name(python_name) if callable(func): return func(*args) def _expand_dotted_set(inclusion_set: InclusionSet) -> InclusionSet: """Expand a set of strings with dots to a dict of dicts. 
Examples -------- >>> _expand_dotted_set({'a.b', 'c', 'a.d'}) {'a': {'b': True, 'd': True}, 'c': True} >>> _expand_dotted_set({'a.b', 'a.d.e', 'a'}) {'a'} >>> _expand_dotted_set({'a.b', 'a.d', 'x.y.z'}) {'x': {'y': {'z': True}}, 'a': {'d': True, 'b': True}} """ if not isinstance(inclusion_set, set) or all( "." not in str(s) for s in inclusion_set ): return inclusion_set result: Dict[IntStr, Any] = {} # sort the strings based on the number of dots, # so that higher level keys take precedence # e.g. {'a.b', 'a.d.e', 'a'} -> {'a'} for key in sorted(inclusion_set, key=lambda i: i.count("."), reverse=True): if isinstance(key, str): parts = key.split(".") if len(parts) == 1: result[key] = True else: cur = result for part in parts[:-1]: # integer keys are used in pydantic for lists # they must remain integers _p: IntStr = int(part) if part.isdigit() else part cur = cur.setdefault(_p, {}) cur[parts[-1]] = True return result npe2-0.7.2/src/npe2/_pytest_plugin.py0000644000000000000000000000703213615410400014360 0ustar00import logging import warnings from typing import Optional, Union from unittest.mock import patch import pytest from npe2 import DynamicPlugin, PluginManager, PluginManifest logger = logging.getLogger(__name__) class TestPluginManager(PluginManager): """A PluginManager subclass suitable for use in testing.""" def discover(self, *_, **__) -> int: """Discovery is blocked in the TestPluginManager.""" logger.warning( "NOTE: TestPluginManager refusing to discover plugins. " "You may add plugins to this test plugin manager using `tmp_plugin()`." ) return 0 def tmp_plugin( self, manifest: Optional[Union[PluginManifest, str]] = None, package: Optional[str] = None, name: Optional[str] = None, ) -> DynamicPlugin: """Create a DynamicPlugin instance using this plugin manager. If providing arguments, provide only one of 'manifest', 'package', 'name'. Parameters ---------- manifest : Union[PluginManifest, str] A manifest to use for this plugin. 
If a string, it is assumed to be the path to a manifest file (which must exist), otherwise must be a PluginManifest instance. package : str Name of an installed plugin/package. name : str If neither `manifest` or `package` is provided, a new DynamicPlugin is created with this name, by default "tmp_plugin" Returns ------- DynamicPlugin be sure to enter the DynamicPlugin context to register the plugin. Examples -------- >>> def test_something_with_only_my_plugin_registered(npe2pm): ... with npe2pm.tmp_plugin(package='my-plugin') as plugin: ... ... >>> def test_something_with_specific_manifest_file_registered(npe2pm): ... mf_file = Path(__file__).parent / 'my_manifest.yaml' ... with npe2pm.tmp_plugin(manifest=str(mf_file)) as plugin: ... ... """ if manifest is not None: if package or name: # pragma: no cover warnings.warn( "`manifest` overrides the `package` and `name` arguments. " "Please provide only one.", stacklevel=2, ) if isinstance(manifest, PluginManifest): mf = manifest else: mf = PluginManifest.from_file(manifest) elif package: if name: # pragma: no cover warnings.warn( "`package` overrides the `name` argument. Please provide only one.", stacklevel=2, ) mf = PluginManifest.from_distribution(package) else: name = name or "tmp_plugin" i = 0 while name in self._manifests: # pragma: no cover # guarantee that name is unique name = f"{name}_{i}" i += 1 mf = PluginManifest(name=name) return DynamicPlugin(mf.name, plugin_manager=self, manifest=mf) @pytest.fixture def npe2pm(): """Return mocked Global plugin manager instance, unable to discover plugins. Examples -------- >>> @pytest.fixture(autouse=True) ... def mock_npe2_pm(npe2pm): ... # Auto-use this fixture to prevent plugin discovery. ... 
return npe2pm """ _pm = TestPluginManager() with patch("npe2.PluginManager.instance", return_value=_pm): yield _pm npe2-0.7.2/src/npe2/_setuptools_plugin.py0000644000000000000000000001520113615410400015246 0ustar00""" # pyproject.toml [build-system] requires = ["setuptools", "wheel", "setuptools_scm", "npe2"] build-backend = "setuptools.build_meta" [tool.npe2] """ from __future__ import annotations import os import re import sys import warnings from typing import TYPE_CHECKING, Optional, Tuple, cast from setuptools import Distribution from setuptools.command.build_py import build_py if TYPE_CHECKING: from distutils.cmd import Command from typing import Any, Union PathT = Union["os.PathLike[str]", str] NPE2_ENTRY = "napari.manifest" DEBUG = bool(os.environ.get("SETUPTOOLS_NPE2_DEBUG")) EP_PATTERN = re.compile( r"(?P<module>[\w.]+)\s*(:\s*(?P<attr>[\w.]+)\s*)?((?P<extras>\[.*\])\s*)?$" ) def trace(*k: object) -> None: if DEBUG: print(*k, file=sys.stderr, flush=True) def _lazy_tomli_load(data: str) -> dict[str, Any]: try: import tomllib except ImportError: import tomli as tomllib # type: ignore [no-redef] return tomllib.loads(data) def _read_dist_name_from_setup_cfg() -> str | None: # minimal effort to read dist_name off setup.cfg metadata import configparser parser = configparser.ConfigParser() parser.read(["setup.cfg"]) return parser.get("metadata", "name", fallback=None) def _check_absolute_root(root: PathT, relative_to: PathT | None) -> str: trace("abs root", repr(locals())) if relative_to: if ( os.path.isabs(root) and os.path.commonpath([root, relative_to]) != relative_to ): warnings.warn( f"absolute root path '{root}' overrides relative_to '{relative_to}'", stacklevel=2, ) if os.path.isdir(relative_to): warnings.warn( "relative_to is expected to be a file," f" it's the directory {relative_to!r}\n" "assuming the parent directory was passed", stacklevel=2, ) trace("dir", relative_to) root = os.path.join(relative_to, root) else: trace("file", relative_to) root =
os.path.join(os.path.dirname(relative_to), root) return os.path.abspath(root) class Configuration: """Global configuration model""" def __init__( self, relative_to: PathT | None = None, root: PathT = ".", write_to: PathT | None = None, write_to_template: str | None = None, dist_name: str | None = None, template: str | None = None, ): self._relative_to = None if relative_to is None else os.fspath(relative_to) self._root = "." self.root = os.fspath(root) self.write_to = write_to self.write_to_template = write_to_template self.dist_name = dist_name self.template = template @property def relative_to(self) -> str | None: return self._relative_to @property def root(self) -> str: return self._root @root.setter def root(self, value: PathT) -> None: self._absolute_root = _check_absolute_root(value, self._relative_to) self._root = os.fspath(value) trace("root", repr(self._absolute_root)) trace("relative_to", repr(self._relative_to)) @property def absolute_root(self) -> str: return self._absolute_root @classmethod def from_file( cls, name: str = "pyproject.toml", dist_name: str | None = None, **kwargs: Any ) -> Configuration: """ Read Configuration from pyproject.toml (or similar). Raises exceptions when file is not found or toml is not installed or the file has invalid format or does not contain the [tool.npe2] section. 
""" with open(name, encoding="UTF-8") as strm: data = strm.read() defn = _lazy_tomli_load(data) try: section = defn.get("tool", {})["npe2"] except LookupError as e: raise LookupError(f"{name} does not contain a tool.npe2 section") from e if "dist_name" in section: if dist_name is None: dist_name = section.pop("dist_name") else: assert dist_name == section["dist_name"] del section["dist_name"] if dist_name is None and "project" in defn: # minimal pep 621 support for figuring the pretend keys dist_name = defn["project"].get("name") if dist_name is None: dist_name = _read_dist_name_from_setup_cfg() return cls(dist_name=dist_name, **section, **kwargs) def _mf_entry_from_dist(dist: Distribution) -> Optional[Tuple[str, str]]: """Return (module, attr) for a distribution's npe2 entry point.""" eps: dict = getattr(dist, "entry_points", {}) if napari_entrys := eps.get(NPE2_ENTRY, []): if match := EP_PATTERN.search(napari_entrys[0]): return match.group("module"), match.group("attr") return None class npe2_compile(build_py): def run(self) -> None: trace("RUN npe2_compile") if ep := _mf_entry_from_dist(self.distribution): from npe2._inspection._compile import compile module, attr = ep src = self.distribution.src_root or os.getcwd() dest = os.path.join(self.get_package_dir(module), attr) compile(src, dest, template=self.distribution.config.template) else: name = self.distribution.metadata.name trace(f"no {NPE2_ENTRY!r} found in entry_points for {name}") def finalize_npe2(dist: Distribution): # this hook is declared in the setuptools.finalize_distribution_options # entry point in our setup.cfg # https://setuptools.pypa.io/en/latest/userguide/extension.html#customizing-distribution-options trace("finalize hook", vars(dist.metadata)) dist_name = dist.metadata.name if dist_name is None: dist_name = _read_dist_name_from_setup_cfg() if not os.path.isfile("pyproject.toml"): return if dist_name == "npe2": # if we're packaging npe2 itself, don't do anything return try: # config will 
*only* be detected if there is a [tool.npe2] # section in pyproject.toml. This is how plugins opt in # to the npe2 compile feature during build config = Configuration.from_file(dist_name=dist_name) except LookupError as e: trace(e) else: # inject our `npe2_compile` command to be called whenever we're building an # sdist or a wheel dist.config = config for cmd in ("build", "sdist"): if base := dist.get_command_class(cmd): cast("Command", base).sub_commands.insert(0, ("npe2_compile", None)) npe2-0.7.2/src/npe2/cli.py0000644000000000000000000003730213615410400012065 0ustar00import builtins import sys import warnings from enum import Enum from pathlib import Path from typing import TYPE_CHECKING, Iterator, List, Optional, Sequence import typer from npe2 import PluginManager, PluginManifest, __version__ if TYPE_CHECKING: from rich.console import RenderableType app = typer.Typer(no_args_is_help=True) def _show_version_and_exit(value: bool) -> None: if value: typer.echo(f"npe2 v{__version__}") raise typer.Exit() @app.callback() def _main( version: Optional[bool] = typer.Option( None, "-v", "--version", callback=_show_version_and_exit, help="Show version and exit.", is_eager=True, ), ): """npe2: napari plugin engine (v{version}) For additional help on a specific command: type 'npe2 [command] --help' """ _main.__doc__ = typer.style( (_main.__doc__ or "").format(version=__version__), fg="bright_yellow" ) SYNTAX_THME = "monokai" SYNTAX_BACKGROUND = "black" class Format(str, Enum): """Valid manifest file formats.""" yaml = "yaml" json = "json" toml = "toml" class ListFormat(str, Enum): """Valid out formats for `npe2 list`.""" table = "table" json = "json" # alias for json in pandas "records" format yaml = "yaml" compact = "compact" def _pprint_formatted(string, format: Format = Format.yaml): # pragma: no cover """Print yaml nicely, depending on available modules.""" from rich.console import Console from rich.syntax import Syntax syntax = Syntax( string, format.value, 
theme=SYNTAX_THME, background_color=SYNTAX_BACKGROUND ) Console().print(syntax) def _pprint_exception(err: Exception): from rich.console import Console from rich.traceback import Traceback e_info = (type(err), err, err.__traceback__) trace = Traceback.extract(*e_info, show_locals=True) Console().print(Traceback(trace)) def _pprint_table( headers: Sequence["RenderableType"], rows: Sequence[Sequence["RenderableType"]] ): from itertools import cycle from rich.console import Console from rich.table import Table COLORS = ["cyan", "magenta", "green", "yellow"] EMOJI_TRUE = ":white_check_mark:" EMOJI_FALSE = "" table = Table() for head, color in zip(headers, cycle(COLORS)): table.add_column(head, style=color) for row in rows: strings = [] for r in row: val = "" if isinstance(r, dict): val = ", ".join(f"{k} ({v})" for k, v in r.items()) elif r: val = str(r).replace("True", EMOJI_TRUE).replace("False", EMOJI_FALSE) strings.append(val) table.add_row(*strings) console = Console() console.print(table) @app.command() def validate( name: str, imports: bool = typer.Option( False, help="Validate all `python_name` entries by importing. This runs python code! " "package must be importable on sys.path.", ), debug: bool = typer.Option( False, "--debug", help="Print tracebacks on error.", ), ): """Validate manifest for a distribution name or manifest filepath.""" err: Optional[Exception] = None try: pm = PluginManifest._from_package_or_name(name) msg = f"✔ Manifest for {(pm.display_name or pm.name)!r} valid!" if imports: pm.validate_imports() except PluginManifest.ValidationError as e: msg = f"🅇 Invalid! 
{e}" err = e except Exception as e: msg = f"🅇 Unexpected error in {name!r}.\n{type(e).__name__}: {e}" err = e typer.secho(msg, fg=typer.colors.RED if err else typer.colors.GREEN, bold=True) if err is not None: if debug: _pprint_exception(err) raise typer.Exit(1) def _check_output(output: Path) -> Format: if output.suffix.lstrip(".") not in Format._member_names_: typer.echo( f"Invalid output extension {output.suffix!r}. Must be one of: " + ", ".join(Format._member_names_) ) raise typer.Exit(1) return Format(output.suffix.lstrip(".")) @app.command() def parse( name: str = typer.Argument( ..., help="Name of an installed package, or path to a manifest file." ), format: Format = typer.Option( "yaml", "-f", "--format", help="Manifest format to use." ), indent: Optional[int] = typer.Option( None, "--indent", help="Number of spaces to indent (for json)", min=0, max=10, ), output: Optional[Path] = typer.Option( None, "-o", "--output", exists=False, help="If provided, will write manifest to filepath (must end with .yaml, " ".json, or .toml). Otherwise, will print to stdout.", ), ): """Show parsed manifest as yaml.""" fmt = _check_output(output) if output else format pm = PluginManifest._from_package_or_name(name) manifest_string = getattr(pm, fmt.value)(indent=indent) if output: output.write_text(manifest_string) else: _pprint_formatted(manifest_string, fmt) def _make_rows(pm_dict: dict, normed_fields: Sequence[str]) -> Iterator[List]: """Clean up output from pm.dict() into rows for table. Outside of just extracting the fields we care about, this also: - handles nested fields expressed as dotted strings: `package_metadata.version` - negates fields that are prefixed with `!` - simplifies contributions to a {name: count} dict. """ for info in pm_dict["plugins"].values(): row = [] for field in normed_fields: val = info.get(field.lstrip("!")) # extract nested fields if not val and "."
in field: parts = field.split(".") val = info while parts and hasattr(val, "__getitem__"): val = val[parts.pop(0)] # negate fields starting with ! if field.startswith("!"): val = not val # simplify contributions to just the number of contributions if field == "contributions": val = {k: len(v) for k, v in val.items() if v} row.append(val) yield row @app.command() def list( fields: str = typer.Option( "name,version,npe2,contributions", help="Comma-separated list of fields to include in the output. " "Names may contain dots, indicating nested manifest fields " "(`contributions.readers`). Field names prefixed with `!` will be " "negated in the output. Fields will appear in the table in the order in " "which they are provided.", metavar="FIELDS", ), sort: str = typer.Option( "0", "-s", "--sort", help="Field name or (int) index on which to sort.", metavar="KEY", ), format: ListFormat = typer.Option( "table", "-f", "--format", help="Output format to use. When using 'compact', `--fields` is ignored.", ), ): """List currently installed plugins.""" if format == ListFormat.compact: fields = "name,version,contributions" requested_fields = [f.lower() for f in fields.split(",")] # check for sort values that will not work bad_sort_param_msg = ( f"Invalid sort value {sort!r}.
" f"Must be column index (<{len(requested_fields)}) or one of: " + ", ".join(requested_fields) ) try: if (sort_index := int(sort)) >= len(requested_fields): raise typer.BadParameter(bad_sort_param_msg) except ValueError: try: sort_index = requested_fields.index(sort.lower()) except ValueError as e: raise typer.BadParameter(bad_sort_param_msg) from e # some convenience aliases ALIASES = { "version": "package_metadata.version", "summary": "package_metadata.summary", "license": "package_metadata.license", "author": "package_metadata.author", "npe2": "!npe1_shim", "npe1": "npe1_shim", } normed_fields = [ALIASES.get(f, f) for f in requested_fields] pm = PluginManager.instance() pm.discover(include_npe1=True) pm.index_npe1_adapters() pm_dict = pm.dict(include={f.lstrip("!") for f in normed_fields}) rows = sorted(_make_rows(pm_dict, normed_fields), key=lambda r: r[sort_index]) if format == ListFormat.table: heads = [f.split(".")[-1].replace("_", " ").title() for f in requested_fields] _pprint_table(headers=heads, rows=rows) return # standard records format used for the other formats # [{column -> value}, ... 
, {column -> value}] data: List[dict] = [dict(zip(requested_fields, row)) for row in rows] if format == ListFormat.json: import json _pprint_formatted(json.dumps(data, indent=1), Format.json) elif format == ListFormat.yaml: import yaml _pprint_formatted(yaml.safe_dump(data, sort_keys=False), Format.yaml) elif format == ListFormat.compact: template = " - {name}: {version} ({ncontrib} contributions)" for r in data: ncontrib = sum(r.get("contributions", {}).values()) typer.echo(template.format(**r, ncontrib=ncontrib)) def _fetch_all_manifests(doit: bool): """Fetch all manifests and dump to "manifests" folder.""" if not doit: return from npe2._inspection import _fetch dest = "manifests" if "-o" in sys.argv: dest = sys.argv[sys.argv.index("-o") + 1] elif "--output" in sys.argv: # pragma: no cover dest = sys.argv[sys.argv.index("--output") + 1] _fetch.fetch_all_manifests(dest) raise typer.Exit(0) @app.command() def fetch( name: List[str], version: Optional[str] = None, include_package_meta: Optional[bool] = typer.Option( False, "-m", "--include-package-meta", help="Include package metadata in the manifest.", ), format: Format = typer.Option( "yaml", "-f", "--format", help="Manifest format to use." ), indent: Optional[int] = typer.Option( None, "--indent", help="Number of spaces to indent (for json)", min=0, max=10, ), output: Optional[Path] = typer.Option( None, "-o", "--output", exists=False, help="If provided, will write manifest to filepath (must end with .yaml, " ".json, or .toml). Otherwise, will print to stdout.", ), all: Optional[bool] = typer.Option( None, "--all", help="Fetch manifests for ALL known plugins (will be SLOW)", callback=_fetch_all_manifests, is_eager=True, ), ): """Fetch manifest from remote package. If an npe2 plugin is detected, the manifest is returned directly, otherwise it will be installed into a temporary directory, imported, and discovered.
""" from npe2 import fetch_manifest fmt = _check_output(output) if output else format kwargs: dict = {"indent": indent} if include_package_meta: kwargs["exclude"] = set() for n in name: mf = fetch_manifest(n, version=version) manifest_string = getattr(mf, fmt.value)(**kwargs) if output: output.write_text(manifest_string, encoding="utf-8") else: _pprint_formatted(manifest_string, fmt) @app.command() def convert( path: Path = typer.Argument( ..., help="Path of a local repository to convert (package must also be installed in" " current environment). Or, the name of an installed package/plugin. If a " "package is provided instead of a directory, the new manifest will simply be " "printed to stdout.", ), dry_run: Optional[bool] = typer.Option( False, "--dry-run", "-n", help="Just print manifest to stdout. Do not modify anything.", ), ): """Convert first generation napari plugin to new (manifest) format.""" from ._inspection._from_npe1 import convert_repository, manifest_from_npe1 try: with warnings.catch_warnings(record=True) as w: if path.is_dir(): pm, mf_path = convert_repository(path, dry_run=dry_run) else: pm = manifest_from_npe1(str(path)) if w: from textwrap import indent typer.secho("Some issues occurred:", fg=typer.colors.RED, bold=False) for r in w: typer.secho( indent(str(r.message), " "), fg=typer.colors.MAGENTA, bold=False, ) typer.echo() except Exception as e: msg = f"Conversion failed:\n{type(e).__name__}: {e}" typer.secho(msg, fg=typer.colors.RED, bold=True) raise typer.Exit(1) from e if dry_run or not path.is_dir(): if path.is_dir(): typer.secho( f"# Manifest would be written to {mf_path}", fg=typer.colors.BRIGHT_GREEN, bold=False, ) _pprint_formatted(pm.yaml(), Format.yaml) else: msg = f"✔ Conversion complete! New manifest at {mf_path}."
typer.secho(msg, fg=typer.colors.GREEN, bold=True) typer.secho( "If you have any napari_plugin_engine imports or hook_implementation " "decorators, you may remove them now.", fg=typer.colors.GREEN, bold=False, ) @app.command() def cache( clear: Optional[bool] = typer.Option( False, "--clear", "-d", help="Clear the npe1 adapter manifest cache" ), names: List[str] = typer.Argument( None, help="Name(s) of distributions to list/delete" ), list_: Optional[bool] = typer.Option( False, "--list", "-l", help="List cached manifests" ), verbose: Optional[bool] = typer.Option(False, "--verbose", "-v", help="verbose"), ): """Cache utils""" from npe2.manifest._npe1_adapter import ADAPTER_CACHE, clear_cache if clear: if _cleared := clear_cache(names): nf = "\n".join(f" - {i.name}" for i in _cleared) typer.secho("Cleared these files from cache:") typer.secho(nf, fg=typer.colors.RED) else: msg = "Nothing to clear" if names: msg += f" for plugins: {','.join(names)}" typer.secho(msg, fg=typer.colors.RED) raise typer.Exit() if list_: files = builtins.list(ADAPTER_CACHE.glob("*.yaml")) if names: files = [f for f in files if any(f.name.startswith(n) for n in names)] if not files: if names: typer.secho(f"Nothing cached for plugins: {','.join(names)}") else: typer.secho("Nothing cached") raise typer.Exit() for fname in files: mf = PluginManifest.from_file(fname) if verbose: _pprint_formatted(mf.yaml(), Format.yaml) # pragma: no cover else: typer.secho(f"{mf.name}: {mf.package_version}", fg=typer.colors.GREEN) @app.command() def compile( src_dir: str, output: Optional[Path] = typer.Option( None, "-o", "--output", exists=False, help="If provided, will write manifest to filepath (must end with .yaml, " ".json, or .toml). Otherwise, will print to stdout.", ), format: Format = typer.Option( "yaml", "-f", "--format", help="Manifest format to use." ), ): """Compile @npe2.implements contributions to generate a manifest.""" from .
import _inspection manifest = _inspection.compile(src_dir, dest=output) manifest_string = getattr(manifest, format.value)(indent=2) _pprint_formatted(manifest_string, format) def main(): app() npe2-0.7.2/src/npe2/implements.py0000644000000000000000000000765213615410400013500 0ustar00import contextlib from inspect import Parameter, Signature from typing import Any, Callable, List, Sequence, Type, TypeVar from pydantic import BaseModel from .manifest import contributions __all__ = [ "CHECK_ARGS_PARAM", "on_activate", "on_deactivate", "reader", "sample_data_generator", "widget", "writer", ] T = TypeVar("T", bound=Callable[..., Any]) CHECK_ARGS_PARAM = "ensure_args_valid" def _build_decorator(contrib: Type[BaseModel]) -> Callable: """Create a decorator (e.g. `@implements.reader`) to mark an object as a contrib. Parameters ---------- contrib : Type[BaseModel] The type of contribution this object implements. """ # build a signature based on the fields in this contribution type, mixed with # the fields in the CommandContribution contribs: Sequence[Type[BaseModel]] = (contributions.CommandContribution, contrib) params: List[Parameter] = [] for contrib in contribs: # iterate over the fields in the contribution types for field in contrib.__fields__.values(): # we don't need python_name (since that will be gleaned from the function # we're decorating) ... and we don't need `command`, since that will just # be a string pointing to the contributions.commands entry that we are # creating here. if field.name not in {"python_name", "command"}: # ensure that required fields raise a TypeError if they are not provided default = Parameter.empty if field.required else field.get_default() # create the parameter and add it to the signature. 
param = Parameter( field.name, Parameter.KEYWORD_ONLY, default=default, annotation=field.outer_type_ or field.type_, ) params.append(param) # add one more parameter to control whether the arguments in the decorator itself # are validated at runtime params.append( Parameter( CHECK_ARGS_PARAM, kind=Parameter.KEYWORD_ONLY, default=False, annotation=bool, ) ) signature = Signature(parameters=params, return_annotation=Callable[[T], T]) # creates the actual `@npe2.implements.something` decorator # this just stores the parameters for the corresponding contribution type # as attributes on the function being decorated. def _deco(**kwargs) -> Callable[[T], T]: def _store_attrs(func: T) -> T: # If requested, assert that we've satisfied the signature when # the decorator is invoked at runtime. # TODO: improve error message to provide context if kwargs.pop(CHECK_ARGS_PARAM, False): signature.bind(**kwargs) # TODO: check if it's already there and assert the same id # store these attributes on the function with contextlib.suppress(AttributeError): setattr(func, f"_npe2_{contrib.__name__}", kwargs) # return the original decorated function return func return _store_attrs # set the signature and return the decorator _deco.__signature__ = signature # type: ignore return _deco # builds decorators for each of the contribution types that are essentially just # pointers to some command. 
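# Example (hedged sketch, not part of the original source): a plugin author
# would use the generated decorators roughly like
#
#     from npe2 import implements
#
#     @implements.reader(
#         id="my-plugin.read_tiff",       # hypothetical command id
#         title="Read TIFF",
#         filename_patterns=["*.tif"],
#     )
#     def read_tiff(path): ...
#
# which simply stores the keyword arguments on ``read_tiff`` as the
# ``_npe2_ReaderContribution`` attribute, to be collected later when the
# manifest is compiled.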
reader = _build_decorator(contributions.ReaderContribution) writer = _build_decorator(contributions.WriterContribution) widget = _build_decorator(contributions.WidgetContribution) sample_data_generator = _build_decorator(contributions.SampleDataGenerator) def on_activate(func): """Mark a function to be called when a plugin is activated.""" func.npe2_on_activate = True return func def on_deactivate(func): """Mark a function to be called when a plugin is deactivated.""" func.npe2_on_deactivate = True return func npe2-0.7.2/src/npe2/implements.pyi0000644000000000000000000000256713615410400013651 0ustar00from typing import Any, Callable, List, TypeVar from pydantic import BaseModel as BaseModel from .manifest import PluginManifest as PluginManifest from .manifest import contributions as contributions T = TypeVar("T", bound=Callable[..., Any]) CHECK_ARGS_PARAM: str def reader( *, id: str, title: str, filename_patterns: List[str], accepts_directories: bool = False, ensure_args_valid: bool = False, ) -> Callable[[T], T]: """Mark a function as a reader contribution""" def writer( *, id: str, title: str, layer_types: List[str], filename_extensions: List[str] = [], # noqa: B006 display_name: str = "", ensure_args_valid: bool = False, ) -> Callable[[T], T]: """Mark function as a writer contribution""" def widget( *, id: str, title: str, display_name: str, autogenerate: bool = False, ensure_args_valid: bool = False, ) -> Callable[[T], T]: """Mark a function as a widget contribution""" def sample_data_generator( *, id: str, title: str, key: str, display_name: str, ensure_args_valid: bool = False, ) -> Callable[[T], T]: """Mark a function as a sample data generator contribution""" def on_activate(func): """Mark a function to be called when a plugin is activated.""" def on_deactivate(func): """Mark a function to be called when a plugin is deactivated.""" npe2-0.7.2/src/npe2/io_utils.py0000644000000000000000000002347213615410400013150 0ustar00from __future__ import annotations 
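# Example (hedged sketch, not part of the original source): the public helpers
# in this module are typically used as
#
#     import npe2
#
#     layers = npe2.read(["cells.tif"], stack=False)   # -> List[LayerData]
#     paths = npe2.write("out.tif", layer_data)        # -> list of written paths
#
# Both calls consult the global PluginManager for compatible Reader/Writer
# contributions; the file names above are illustrative only.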
from typing import ( TYPE_CHECKING, List, Literal, Optional, Sequence, Tuple, Union, cast, overload, ) from . import PluginManager from .manifest.utils import v1_to_v2 from .types import FullLayerData, LayerData if TYPE_CHECKING: import napari.layers from .manifest.contributions import ReaderContribution, WriterContribution def read( paths: List[str], *, stack: bool, plugin_name: Optional[str] = None ) -> List[LayerData]: """Try to read file at `path`, with plugins offering a ReaderContribution. Parameters ---------- paths : list of str Path to the file or resource being read. stack : bool Should the readers stack the read files. plugin_name : str, optional Optional plugin name. If provided, only readers from this plugin will be tried (it's possible that none will be compatible). by default None Returns ------- List[LayerData] If a compatible reader is found and executed, a list of LayerDataTuples is returned Raises ------ ValueError If no readers are found or none return data """ assert isinstance(paths, list) return _read(paths, plugin_name=plugin_name, stack=stack) def read_get_reader( path: Union[str, Sequence[str]], *, plugin_name: Optional[str] = None, stack: Optional[bool] = None, ) -> Tuple[List[LayerData], ReaderContribution]: """Variant of `read` that also returns the `ReaderContribution` used.""" if stack is None: # "npe1" old path # Napari 0.4.15 and older, hopefully we can drop this and make stack mandatory new_path, new_stack = v1_to_v2(path) return _read( new_path, plugin_name=plugin_name, return_reader=True, stack=new_stack ) else: assert isinstance(path, list) for p in path: assert isinstance(p, str) return _read(path, plugin_name=plugin_name, return_reader=True, stack=stack) def write( path: str, layer_data: List[Union[FullLayerData, napari.layers.Layer]], *, plugin_name: Optional[str] = None, ) -> List[str]: """Write layer_data tuples to `path`. Parameters ---------- path : str The path (file, directory, url) to write. 
layer_data : list of layer data tuples List of tuples in the form (data, metadata_dict, layer_type_string) plugin_name : str, optional Name of the plugin to write data with. If `None` then all plugins corresponding to appropriate hook specification will be looped through to find the first one that can write the data. Returns ------- list of str List of file paths that were written Raises ------ ValueError If no suitable writers are found. """ return _write(path, layer_data, plugin_name=plugin_name) def write_get_writer( path: str, layer_data: List[Union[FullLayerData, napari.layers.Layer]], *, plugin_name: Optional[str] = None, ) -> Tuple[List[str], WriterContribution]: """Variant of write that also returns the `WriterContribution` used.""" return _write(path, layer_data, plugin_name=plugin_name, return_writer=True) # ----------------------------------------------------------------------------------- @overload def _read( paths: Union[str, Sequence[str]], *, stack: bool, plugin_name: Optional[str] = None, return_reader: Literal[False] = False, _pm=None, ) -> List[LayerData]: ... @overload def _read( paths: Union[str, Sequence[str]], *, stack: bool, plugin_name: Optional[str] = None, return_reader: Literal[True], _pm=None, ) -> Tuple[List[LayerData], ReaderContribution]: ... def _read( paths: Union[str, Sequence[str]], *, stack: bool, plugin_name: Optional[str] = None, return_reader: bool = False, _pm: Optional[PluginManager] = None, ) -> Union[Tuple[List[LayerData], ReaderContribution], List[LayerData]]: """Execute the `read...` functions above.""" if _pm is None: _pm = PluginManager.instance() # get readers compatible with paths and chosen plugin - raise errors if # choices are invalid or there's nothing to try chosen_compatible_readers = _get_compatible_readers_by_choice( plugin_name, paths, _pm ) assert ( chosen_compatible_readers ), "No readers to try. Expected an exception before this point." 
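Once compatible readers are collected, `_read` tries each one in turn until a reader both accepts the paths and returns non-empty layer data. That dispatch can be sketched with plain callables (the `decline`/`accept` getters here are hypothetical stand-ins for executed reader contributions):

```python
def try_readers(paths, reader_getters):
    # each getter returns a reader callable for `paths`, or None if it declines
    for get_reader in reader_getters:
        read_func = get_reader(paths)
        if read_func is not None:
            # a reader may still return no data; keep trying the rest
            if layer_data := read_func(paths):
                return layer_data
    raise ValueError(f"No readers returned data for {paths!r}")

decline = lambda paths: None
accept = lambda paths: (lambda p: [([1, 2, 3], {}, "image")])

result = try_readers(["a.tif"], [decline, accept])
```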
for rdr in chosen_compatible_readers: read_func = rdr.exec( kwargs={"path": paths, "stack": stack, "_registry": _pm.commands} ) if read_func is not None: # if the reader function raises an exception here, we don't try to catch it if layer_data := read_func(paths, stack=stack): return (layer_data, rdr) if return_reader else layer_data if plugin_name: raise ValueError( f"Reader {plugin_name!r} was selected to open " + f"{paths!r}, but returned no data." ) raise ValueError(f"No readers returned data for {paths!r}") def _get_compatible_readers_by_choice( plugin_name: Union[str, None], paths: Union[str, Sequence[str]], pm: PluginManager ): """Returns compatible readers filtered by validated plugin choice. Checks that plugin_name is an existing plugin (and command if a specific contribution was passed), and that it is compatible with paths. Raises ValueError if given plugin doesn't exist, it is not compatible with the given paths, or no compatible readers can be found for paths (if no plugin was chosen). Parameters ---------- plugin_name: Union[str, None] name of chosen plugin, or None paths: Union[str, Sequence[str]] paths to read pm: PluginManager plugin manager instance to check for readers Raises ------ ValueError If the given reader doesn't exist ValueError If there are no compatible readers ValueError If the given reader is not compatible Returns ------- compat_readers : List[ReaderContribution] Compatible readers for plugin choice """ passed_contrib = plugin_name and ("." in plugin_name) compat_readers = list(pm.iter_compatible_readers(paths)) compat_reader_names = sorted( {(rdr.command if passed_contrib else rdr.plugin_name) for rdr in compat_readers} ) helper_error_message = ( f"Available readers for {paths!r} are: {compat_reader_names!r}." if compat_reader_names else f"No compatible readers are available for {paths!r}." ) # check whether plugin even exists. if plugin_name: try: # note that get_manifest works with a full command e.g. 
my-plugin.my-reader pm.get_manifest(plugin_name) if passed_contrib: pm.get_command(plugin_name) except KeyError: raise ValueError( f"Given reader {plugin_name!r} does not exist. {helper_error_message}" ) from None # no choice was made and there's no readers to try if not plugin_name and not len(compat_reader_names): raise ValueError(helper_error_message) # user didn't make a choice and we have some readers to try, return them if not plugin_name: return compat_readers # user made a choice and it exists, but it may not be a compatible reader plugin, _, _ = plugin_name.partition(".") chosen_compatible_readers = [ rdr for rdr in compat_readers if rdr.plugin_name == plugin and (not passed_contrib or rdr.command == plugin_name) ] # the user's choice is not compatible with the paths. let them know what is if not chosen_compatible_readers: raise ValueError( f"Given reader {plugin_name!r} is not a compatible reader for {paths!r}. " + helper_error_message ) return chosen_compatible_readers @overload def _write( path: str, layer_data: List[Union[FullLayerData, napari.layers.Layer]], *, plugin_name: Optional[str] = None, return_writer: Literal[False] = False, _pm: Optional[PluginManager] = None, ) -> List[str]: ... @overload def _write( path: str, layer_data: List[Union[FullLayerData, napari.layers.Layer]], *, plugin_name: Optional[str] = None, return_writer: Literal[True], _pm: Optional[PluginManager] = None, ) -> Tuple[List[str], WriterContribution]: ... 
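The selection rule above — match on plugin name alone, or on the full `plugin.contribution` command when the choice contains a dot — can be sketched with plain records (the reader entries below are hypothetical):

```python
from dataclasses import dataclass
from typing import List, Optional

@dataclass
class Reader:  # minimal stand-in for a ReaderContribution
    plugin_name: str
    command: str  # fully qualified id, e.g. "my-plugin.my-reader"

def filter_readers(plugin_name: Optional[str], compat: List[Reader]) -> List[Reader]:
    if not plugin_name:
        return compat  # no choice made: try everything compatible
    passed_contrib = "." in plugin_name
    plugin, _, _ = plugin_name.partition(".")
    chosen = [
        r
        for r in compat
        if r.plugin_name == plugin
        and (not passed_contrib or r.command == plugin_name)
    ]
    if not chosen:
        names = sorted({r.command if passed_contrib else r.plugin_name for r in compat})
        raise ValueError(f"{plugin_name!r} not compatible; available: {names}")
    return chosen

compat = [
    Reader("tiff-plugin", "tiff-plugin.read"),
    Reader("zarr-plugin", "zarr-plugin.read_dir"),
]
```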
def _write( path: str, layer_data: List[Union[FullLayerData, napari.layers.Layer]], *, plugin_name: Optional[str] = None, return_writer: bool = False, _pm: Optional[PluginManager] = None, ) -> Union[List[str], Tuple[List[str], WriterContribution]]: if not layer_data: raise ValueError("Must provide layer data") if _pm is None: _pm = PluginManager.instance() _layer_tuples: List[FullLayerData] = [ cast("napari.layers.Layer", x).as_layer_data_tuple() if hasattr(x, "as_layer_data_tuple") else x for x in layer_data ] layer_types = [x[2] for x in _layer_tuples] writer, new_path = _pm.get_writer( path, layer_types=layer_types, plugin_name=plugin_name ) if not writer: raise ValueError(f"No writer found for {path!r} with layer types {layer_types}") # Writers that take at most one layer must use the single-layer api. # Otherwise, they must use the multi-layer api. n = sum(ltc.max() for ltc in writer.layer_type_constraints()) args = (new_path, *_layer_tuples[0][:2]) if n <= 1 else (new_path, _layer_tuples) res = writer.exec(args=args) # napari_get_writer-style writers don't always return a list # though strictly speaking they should? 
result = [res] if isinstance(res, str) else res or [] # type: ignore return (result, writer) if return_writer else result npe2-0.7.2/src/npe2/plugin_manager.py0000644000000000000000000001042213615410400014300 0ustar00# mypy: disable-error-code=empty-body """Convenience module to access methods on the global PluginManager singleton.""" from __future__ import annotations from typing import TYPE_CHECKING, Dict if TYPE_CHECKING: from os import PathLike from typing import Any, Iterator, List, NewType, Optional, Sequence, Tuple, Union from npe2 import PluginManifest from npe2._plugin_manager import InclusionSet, PluginContext from npe2.manifest import contributions from ._plugin_manager import PluginManager PluginName = NewType("PluginName", str) def instance() -> PluginManager: """Return global PluginManager singleton instance.""" from ._plugin_manager import PluginManager return PluginManager.instance() def discover(paths: Sequence[str] = (), clear=False, include_npe1=False) -> None: """Discover and index plugin manifests in the environment.""" def dict( self, *, include: Optional[InclusionSet] = None, exclude: Optional[InclusionSet] = None, ) -> Dict[str, Any]: """Return a dictionary with the state of the plugin manager.""" def index_npe1_adapters() -> None: """Import and index any/all npe1 adapters.""" def register(manifest: PluginManifest, warn_disabled=True) -> None: """Register a plugin manifest""" def unregister(key: PluginName) -> None: """Unregister plugin named `key`.""" def activate(key: PluginName) -> PluginContext: """Activate plugin with `key`.""" def get_context(plugin_name: PluginName) -> PluginContext: """Return PluginContext for plugin_name""" def deactivate(plugin_name: PluginName) -> None: """Deactivate `plugin_name`""" def enable(plugin_name: PluginName) -> None: """Enable a plugin (which mostly means just `un-disable` it.""" def disable(plugin_name: PluginName) -> None: """Disable a plugin""" def is_disabled(plugin_name: str) -> bool: """Return 
`True` if plugin_name is disabled."""

def get_manifest(plugin_name: str) -> PluginManifest:
    """Get manifest for `plugin_name`"""

def iter_manifests(disabled: Optional[bool] = None) -> Iterator[PluginManifest]:
    """Iterate through registered manifests."""

def get_command(command_id: str) -> contributions.CommandContribution:
    """Retrieve CommandContribution for `command_id`"""

def get_submenu(submenu_id: str) -> contributions.SubmenuContribution:
    """Get SubmenuContribution for `submenu_id`."""

def iter_menu(menu_key: str, disabled=False) -> Iterator[contributions.MenuItem]:
    """Iterate over `MenuItems` in menu with id `menu_key`."""

def menus(disabled=False) -> Dict[str, List[contributions.MenuItem]]:
    """Return all registered menu_key -> List[MenuItems]."""

def iter_themes() -> Iterator[contributions.ThemeContribution]:
    """Iterate over discovered/enabled `ThemeContributions`."""

def iter_compatible_readers(
    path: Union[PathLike, Sequence[str]]
) -> Iterator[contributions.ReaderContribution]:
    """Iterate over ReaderContributions compatible with `path`."""

def iter_compatible_writers(
    layer_types: Sequence[str],
) -> Iterator[contributions.WriterContribution]:
    """Iterate over compatible WriterContributions given a sequence of layer_types."""

def iter_widgets() -> Iterator[contributions.WidgetContribution]:
    """Iterate over discovered WidgetContributions."""

def iter_sample_data() -> (
    Iterator[Tuple[PluginName, List[contributions.SampleDataContribution]]]
):
    """Iterate over (plugin_name, [sample_contribs]) pairs."""

def get_writer(
    path: str, layer_types: Sequence[str], plugin_name: Optional[str] = None
) -> Tuple[Optional[contributions.WriterContribution], str]:
    """Get the WriterContribution appropriate for `path` and `layer_types`."""

def _populate_module():
    """Convert all functions in this module into global plugin manager methods."""
    import functools
    import sys

    from ._plugin_manager import PluginManager

    _module = sys.modules[__name__]
    for key in dir(_module):
        if
key.startswith(("_", "instance")) or not hasattr(PluginManager, key): continue @functools.wraps(getattr(_module, key)) def _f(*args, _key=key, **kwargs): return getattr(instance(), _key)(*args, **kwargs) setattr(_module, key, _f) _populate_module() del _populate_module, TYPE_CHECKING, annotations npe2-0.7.2/src/npe2/py.typed0000644000000000000000000000000013615410400012424 0ustar00npe2-0.7.2/src/npe2/types.py0000644000000000000000000000425313615410400012461 0ustar00from typing import ( TYPE_CHECKING, Callable, Dict, List, Literal, NewType, Optional, Protocol, Sequence, Tuple, Union, ) if TYPE_CHECKING: import magicgui.widgets import numpy as np import qtpy.QtWidgets # General types # PathLike = Union[str, pathlib.Path] # we really have to pick one PathLike = str PathOrPaths = Union[PathLike, Sequence[PathLike]] PythonName = NewType("PythonName", str) # Layer-related types class ArrayLike(Protocol): @property def shape(self) -> Tuple[int, ...]: ... @property def ndim(self) -> int: ... @property def dtype(self) -> "np.dtype": ... def __array__(self) -> "np.ndarray": ... 
# pragma: no cover LayerName = Literal[ "graph", "image", "labels", "points", "shapes", "surface", "tracks", "vectors" ] Metadata = Dict DataType = Union[ArrayLike, Sequence[ArrayLike]] FullLayerData = Tuple[DataType, Metadata, LayerName] LayerData = Union[Tuple[DataType], Tuple[DataType, Metadata], FullLayerData] # ########################## CONTRIBUTIONS ################################# # WidgetContribution.command must point to a WidgetCreator Widget = Union["magicgui.widgets.Widget", "qtpy.QtWidgets.QWidget"] WidgetCreator = Callable[..., Widget] # ReaderContribution.command must point to a ReaderGetter ReaderFunction = Callable[[PathOrPaths], List[LayerData]] ReaderGetter = Callable[[PathOrPaths], Optional[ReaderFunction]] # SampleDataGenerator.command must point to a SampleDataCreator SampleDataCreator = Callable[..., List[LayerData]] # WriterContribution.command must point to a WriterFunction # Writers that take at most one layer must provide a SingleWriterFunction command. # Otherwise, they must provide a MultiWriterFunction. 
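The single-vs-multi distinction is resolved in `io_utils._write` by summing the writer's per-layer-type maxima. A minimal sketch of that argument-shaping rule (the constraint values passed in are hypothetical; real code derives them from `WriterContribution.layer_type_constraints()`):

```python
def build_writer_args(path, layer_tuples, max_per_type):
    # max_per_type: maximum accepted layer count per layer type
    n = sum(max_per_type)
    if n <= 1:
        data, meta, _layer_type = layer_tuples[0]
        return (path, data, meta)       # SingleWriterFunction call shape
    return (path, layer_tuples)         # MultiWriterFunction call shape

tuples = [([1, 2], {"name": "img"}, "image"), ([3], {}, "labels")]
single = build_writer_args("out.tif", tuples[:1], [1])
multi = build_writer_args("out.zarr", tuples, [1, 1])
```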
# where the number of layers they take is defined as
#   n = sum(ltc.max() for ltc in WriterContribution.layer_type_constraints())
SingleWriterFunction = Callable[[str, DataType, Metadata], List[str]]
MultiWriterFunction = Callable[[str, List[FullLayerData]], List[str]]
WriterFunction = Union[SingleWriterFunction, MultiWriterFunction]

# ##########################################################################
npe2-0.7.2/src/npe2/_inspection/__init__.py0000644000000000000000000000044713615410400015367 0ustar00
from ._compile import compile
from ._visitors import (
    NPE2PluginModuleVisitor,
    find_npe1_module_contributions,
    find_npe2_module_contributions,
)

__all__ = [
    "NPE2PluginModuleVisitor",
    "find_npe2_module_contributions",
    "find_npe1_module_contributions",
    "compile",
]
npe2-0.7.2/src/npe2/_inspection/_compile.py0000644000000000000000000000754513615410400015425 0ustar00
from pathlib import Path
from typing import Iterator, List, Sequence, Tuple, Union, cast

from npe2.manifest import PluginManifest, contributions
from npe2.manifest.utils import merge_contributions, merge_manifests

from ._setuputils import get_package_dir_info
from ._visitors import find_npe2_module_contributions

def find_packages(where: Union[str, Path] = ".") -> List[Path]:
    """Return all folders that have an __init__.py file."""
    return [p.parent for p in Path(where).resolve().rglob("**/__init__.py")]

def get_package_name(where: Union[str, Path] = ".") -> str:
    return get_package_dir_info(where).package_name

def compile(
    src_dir: Union[str, Path],
    dest: Union[str, Path, None] = None,
    packages: Sequence[str] = (),
    plugin_name: str = "",
    template: Union[str, Path, None] = None,
) -> PluginManifest:
    """Compile a plugin manifest from `src_dir`, which is a top-level repo root.

    This will discover all the contribution points in the repo and output a
    manifest object.

    Parameters
    ----------
    src_dir : Union[str, Path]
        Repo root. Should contain a pyproject or setup.cfg file.
    dest : Union[str, Path, None]
        If provided, path where the output manifest should be written.
    packages : Sequence[str]
        List of packages to include in the manifest. By default, all packages
        (subfolders that have an `__init__.py`) will be included.
    plugin_name : str
        Name of the plugin. If not provided, the name will be derived from the
        package structure (this assumes a setuptools package).
    template : Union[str, Path, None]
        If provided, path to a template manifest file to use. This file can
        contain "non-command" contributions, like `display_name`, or `themes`,
        etc. In the case of conflicts (discovered, decorated contributions with
        the same id as something in the template), discovered contributions
        will take precedence.

    Returns
    -------
    PluginManifest
        Manifest including all discovered contribution points, combined with
        any existing contributions explicitly stated in the manifest.
    """
    src_path = Path(src_dir)
    assert src_path.exists(), f"src_dir {src_dir} does not exist"

    if dest is not None:
        pdest = Path(dest)
        suffix = pdest.suffix.lstrip(".")
        if suffix not in {"json", "yaml", "toml"}:
            raise ValueError(
                f"dest {dest!r} must have an extension of .json, .yaml, or .toml"
            )

    if template is not None:
        template_mf = PluginManifest.from_file(template)

    _packages = find_packages(src_path)
    if packages:
        _packages = [p for p in _packages if p.name in packages]

    if not plugin_name:
        plugin_name = get_package_name(src_path)

    contribs: List[contributions.ContributionPoints] = []
    for pkg_path in _packages:
        top_mod = pkg_path.name
        # TODO: add more tests with more complicated package structures
        # make sure we're not double detecting and/or missing stuff.
for mod_path, mod_name in _iter_modules(pkg_path): contrib = find_npe2_module_contributions( mod_path, plugin_name=plugin_name, module_name=f"{top_mod}.{mod_name}" if mod_name else top_mod, ) contribs.append(contrib) mf = PluginManifest( name=plugin_name, contributions=merge_contributions(contribs), ) if template is not None: mf = merge_manifests([template_mf, mf], overwrite=True) if dest is not None: manifest_string = getattr(mf, cast(str, suffix))(indent=2) pdest.write_text(manifest_string) return mf def _iter_modules(path: Path) -> Iterator[Tuple[Path, str]]: """Return all python modules in path""" for p in path.glob("*.py"): yield p, "" if p.name == "__init__.py" else p.stem npe2-0.7.2/src/npe2/_inspection/_fetch.py0000644000000000000000000003565113615410400015065 0ustar00from __future__ import annotations import io import json import os import re import subprocess import tempfile from concurrent.futures import ProcessPoolExecutor from contextlib import contextmanager from functools import lru_cache from importlib import metadata from logging import getLogger from pathlib import Path from typing import ( TYPE_CHECKING, Any, ContextManager, Dict, Iterator, List, Optional, Tuple, Union, ) from unittest.mock import patch from urllib import error, parse, request from zipfile import ZipFile from npe2.manifest import PackageMetadata if TYPE_CHECKING: from npe2.manifest import PluginManifest logger = getLogger(__name__) NPE1_ENTRY_POINT = "napari.plugin" NPE2_ENTRY_POINT = "napari.manifest" __all__ = [ "fetch_manifest", "get_pypi_url", "get_hub_plugin", "get_hub_plugins", "get_pypi_plugins", ] def _manifest_from_npe2_dist( dist: metadata.PathDistribution, ep: metadata.EntryPoint ) -> PluginManifest: """Extract plugin manifest from a distribution with an npe2 entry point.""" from npe2.manifest import PluginManifest logger.debug("pypi wheel has npe2 entry point.") # python 3.8 fallbacks match = ep.pattern.match(ep.value) assert match module: str = 
match.groupdict()["module"] attr: str = match.groupdict()["attr"] mf_file = Path(dist.locate_file(Path(module) / attr)) if not mf_file.exists(): raise ValueError( # pragma: no cover f"manifest {mf_file.name!r} does not exist in distribution " f"for {dist.metadata['Name']}" ) mf = PluginManifest.from_file(str(mf_file)) # manually add the package metadata from our distribution object. mf.package_metadata = PackageMetadata.from_dist_metadata(dist.metadata) return mf def _manifest_from_npe1_dist(dist: metadata.PathDistribution) -> PluginManifest: """Extract plugin manifest from a distribution with an npe1 entry point.""" from npe2.manifest import PluginManifest from npe2.manifest.utils import merge_contributions from . import find_npe1_module_contributions name = dist.metadata["Name"] contribs = [] for ep in dist.entry_points: if ep.group == NPE1_ENTRY_POINT and (match := ep.pattern.match(ep.value)): module = match.group("module") contribs.append(find_npe1_module_contributions(dist, module)) mf = PluginManifest( name=name, contributions=merge_contributions(contribs), npe1_shim=True ) mf.package_metadata = PackageMetadata.from_dist_metadata(dist.metadata) return mf def _manifest_from_extracted_wheel(wheel_dir: Path) -> PluginManifest: """Return plugin manifest from an extracted wheel.""" # create a PathDistribution from the dist-info directory in the wheel dist = metadata.PathDistribution(next(Path(wheel_dir).glob("*.dist-info"))) has_npe1 = False for ep in dist.entry_points: # if we find an npe2 entry point, we can just use # PathDistribution.locate_file to get the file. 
if ep.group == NPE2_ENTRY_POINT: return _manifest_from_npe2_dist(dist, ep) elif ep.group == NPE1_ENTRY_POINT: has_npe1 = True # pragma: no cover if has_npe1: return _manifest_from_npe1_dist(dist) raise ValueError("No npe2 or npe1 entry point found in wheel") # pragma: no cover @contextmanager def _guard_cwd() -> Iterator[None]: """Protect current working directory from changes.""" current = os.getcwd() try: yield finally: os.chdir(current) def _build_wheel(src: Union[str, Path]) -> Path: """Build a wheel from a source directory and extract it into dest.""" from build.__main__ import build_package dest = Path(src) / "extracted_wheel" class _QuietPopen(subprocess.Popen): """Silence all the noise from build.""" def __init__(self, *args, **kwargs): kwargs["stdout"] = subprocess.DEVNULL kwargs["stderr"] = subprocess.DEVNULL super().__init__(*args, **kwargs) with patch("subprocess.Popen", _QuietPopen), _guard_cwd(): dist = Path(src) / "dist" build_package(src, dist, ["wheel"]) with ZipFile(next((dist).glob("*.whl"))) as zf: zf.extractall(dest) return dest def get_manifest_from_wheel(src: str) -> PluginManifest: """Extract a manifest from a .whl file.""" with tempfile.TemporaryDirectory() as td: with ZipFile(src) as zf: zf.extractall(td) return _manifest_from_extracted_wheel(Path(td)) def _build_src_and_extract_manifest(src_dir: Union[str, Path]) -> PluginManifest: """Build a wheel from a source directory and extract the manifest.""" return _manifest_from_extracted_wheel(_build_wheel(src_dir)) def _get_manifest_from_zip_url(url: str) -> PluginManifest: """Extract a manifest from a remote source directory zip file. Examples -------- $ npe2 fetch https://github.com/org/project/archive/refs/heads/master.zip """ with _tmp_zip_download(url) as zip_path: src_dir = next(Path(zip_path).iterdir()) # find first directory return _build_src_and_extract_manifest(src_dir) def _get_manifest_from_wheel_url(url: str) -> PluginManifest: """Extract a manifest from a remote wheel file. 
Examples -------- $ npe2 fetch https://files.pythonhosted.org/packages/b0/93/a00a1ee154d5ce3540dd5ae081dc53fcfa7498f34ba68a7345ac027a4f96/pycudadecon-0.3.0-py3-none-any.whl """ with _tmp_zip_download(url) as wheel_dir: return _manifest_from_extracted_wheel(wheel_dir) def _get_manifest_from_targz_url(url: str) -> PluginManifest: """Extract a manifest from a remote source directory tar.gz file. Examples -------- $ npe2 fetch https://files.pythonhosted.org/packages/4a/84/de031ba465f183c319cb37633c49dfebd57f1ff42bc9744db3f80f7f4093/pycudadecon-0.3.0.tar.gz """ with _tmp_targz_download(url) as targz_path: src_dir = next(Path(targz_path).iterdir()) # find first directory return _build_src_and_extract_manifest(src_dir) def _get_manifest_from_git_url(url: str) -> PluginManifest: """Extract a manifest from a remote git repository. Examples -------- $ npe2 fetch https://github.com/tlambert03/napari-dv $ npe2 fetch https://github.com/tlambert03/napari-dv.git $ npe2 fetch git+https://github.com/tlambert03/napari-dv.git """ if url.startswith("git+"): url = url[4:] with tempfile.TemporaryDirectory() as td: subprocess.run(["git", "clone", url, td], stdout=subprocess.DEVNULL) return _build_src_and_extract_manifest(td) def fetch_manifest( package_or_url: str, version: Optional[str] = None ) -> PluginManifest: """Fetch a manifest for a pypi package name or URL to a wheel or source. Parameters ---------- package_or_url : str package name or URL to a git repository or zip file. version : Optional[str] package version, by default, latest version. Returns ------- PluginManifest Plugin manifest for package `specifier`. 
Examples -------- >>> fetch_manifest("napari-dv") >>> fetch_manifest("napari-dv", "0.3.0") >>> fetch_manifest("https://github.com/tlambert03/napari-dv") >>> fetch_manifest("git+https://github.com/tlambert03/napari-dv.git") >>> fetch_manifest("https://github.com/org/project/archive/refs/heads/master.zip") >>> fetch_manifest("https://files.pythonhosted.org/.../package-0.3.0-py3-none-any.whl") >>> fetch_manifest("https://files.pythonhosted.org/.../package-0.3.0.tar.gz") """ # not on PyPI check various URL forms if package_or_url.startswith(("http", "git+http")): if package_or_url.endswith(".zip"): return _get_manifest_from_zip_url(package_or_url) if package_or_url.endswith(".whl"): return _get_manifest_from_wheel_url(package_or_url) if package_or_url.endswith(".tar.gz"): return _get_manifest_from_targz_url(package_or_url) if ( package_or_url.startswith("git+") or package_or_url.endswith(".git") or "github.com" in package_or_url ): return _get_manifest_from_git_url(package_or_url) else: try: with _tmp_pypi_wheel_download(package_or_url, version) as td: return _manifest_from_extracted_wheel(td) except metadata.PackageNotFoundError: return _manifest_from_pypi_sdist(package_or_url, version) except error.HTTPError: pass # pragma: no cover raise ValueError( # pragma: no cover f"Could not interpret {package_or_url!r} as a PYPI package name or URL to a " "wheel or source distribution/zip file." ) def _manifest_from_pypi_sdist( package: str, version: Optional[str] = None ) -> PluginManifest: """Extract a manifest from a source distribution on pypi.""" with _tmp_pypi_sdist_download(package, version) as td: src = next(p for p in td.iterdir() if p.is_dir()) return _build_src_and_extract_manifest(src) @lru_cache def _pypi_info(package: str) -> dict: with request.urlopen(f"https://pypi.org/pypi/{package}/json") as f: return json.load(f) def get_pypi_url( package: str, version: Optional[str] = None, packagetype: Optional[str] = None ) -> str: """Get URL for a package on PyPI. 
Parameters ---------- package : str package name version : str, optional package version, by default, latest version. packagetype : str, optional one of `'sdist'`, `'bdist_wheel'`, or `None`, by default `None`, which means 'bdist_wheel' will be tried first, then 'sdist' Returns ------- str URL to download the package. Raises ------ ValueError If packagetype is not one of 'sdist', 'bdist_wheel', or if version is specified and does not match any available version. KeyError If packagetype is specified and no package of that type is available. """ if packagetype not in {"sdist", "bdist_wheel", None}: raise ValueError( # pragma: no cover f"Invalid packagetype: {packagetype}, must be one of sdist, bdist_wheel" ) data = _pypi_info(package) if version: version = version.lstrip("v") try: _releases: List[dict] = data["releases"][version] except KeyError as e: # pragma: no cover raise ValueError(f"{package} does not have version {version}") from e else: _releases = data["urls"] releases = {d.get("packagetype"): d for d in _releases} if packagetype: if packagetype not in releases: # pragma: no cover version = version or "latest" raise metadata.PackageNotFoundError( f'No {packagetype} releases found for version "{version}"' ) return releases[packagetype]["url"] return (releases.get("bdist_wheel") or releases["sdist"])["url"] @contextmanager def _tmp_zip_download(url: str) -> Iterator[Path]: """Extract remote zip file to a temporary directory.""" with tempfile.TemporaryDirectory() as td, request.urlopen(url) as f: with ZipFile(io.BytesIO(f.read())) as zf: zf.extractall(td) yield Path(td) @contextmanager def _tmp_targz_download(url: str) -> Iterator[Path]: """Extract remote tar.gz file to a temporary directory.""" import tarfile with tempfile.TemporaryDirectory() as td, request.urlopen(url) as f: with tarfile.open(fileobj=f, mode="r:gz") as tar: tar.extractall(td) yield Path(td) def _tmp_pypi_wheel_download( package: str, version: Optional[str] = None ) -> ContextManager[Path]: 
url = get_pypi_url(package, version=version, packagetype="bdist_wheel") logger.debug(f"downloading wheel for {package} {version or ''}") return _tmp_zip_download(url) def _tmp_pypi_sdist_download( package: str, version: Optional[str] = None ) -> ContextManager[Path]: url = get_pypi_url(package, version=version, packagetype="sdist") logger.debug(f"downloading sdist for {package} {version or ''}") return _tmp_targz_download(url) @lru_cache def _get_packages_by_classifier(classifier: str) -> Dict[str, str]: """Search for packages declaring ``classifier`` on PyPI. Returns ------- packages : List[str] name of all packages at pypi that declare ``classifier`` """ PACKAGE_NAME_PATTERN = re.compile('class="package-snippet__name">(.+)') PACKAGE_VERSION_PATTERN = re.compile('class="package-snippet__version">(.+)') packages = {} page = 1 url = f"https://pypi.org/search/?c={parse.quote_plus(classifier)}&page=" while True: try: with request.urlopen(f"{url}{page}") as response: html = response.read().decode() names = PACKAGE_NAME_PATTERN.findall(html) versions = PACKAGE_VERSION_PATTERN.findall(html) packages.update(dict(zip(names, versions))) page += 1 except error.HTTPError: break return dict(sorted(packages.items())) def get_pypi_plugins() -> Dict[str, str]: """Return {name: latest_version} for all plugins found on pypi.""" NAPARI_CLASSIFIER = "Framework :: napari" return _get_packages_by_classifier(NAPARI_CLASSIFIER) @lru_cache def get_hub_plugins() -> Dict[str, str]: """Return {name: latest_version} for all plugins on the hub.""" with request.urlopen("https://api.napari-hub.org/plugins") as r: return json.load(r) @lru_cache def get_hub_plugin(plugin_name: str) -> Dict[str, Any]: """Return hub information for a specific plugin.""" with request.urlopen(f"https://api.napari-hub.org/plugins/{plugin_name}") as r: return json.load(r) def _try_fetch_and_write_manifest(args: Tuple[str, str, Path, int]): name, version, dest, indent = args FORMAT = "json" try: # pragma: no cover mf = 
fetch_manifest(name, version=version) manifest_string = getattr(mf, FORMAT)(exclude=set(), indent=indent) (dest / f"{name}.{FORMAT}").write_text(manifest_string) print(f"✅ {name}") except Exception as e: print(f"❌ {name}") return name, {"version": version, "error": str(e)} def fetch_all_manifests(dest: str = "manifests", indent: int = 2) -> None: """Fetch all manifests for plugins on PyPI and write to ``dest`` directory.""" _dest = Path(dest) _dest.mkdir(exist_ok=True, parents=True) args = [ (name, ver, _dest, indent) for name, ver in sorted(get_pypi_plugins().items()) ] # use processes instead of threads, because many of the subroutines in build # and setuptools use `os.chdir()`, which is not thread-safe with ProcessPoolExecutor() as executor: errors = list(executor.map(_try_fetch_and_write_manifest, args)) _errors = {tup[0]: tup[1] for tup in errors if tup} (_dest / "errors.json").write_text(json.dumps(_errors, indent=indent)) npe2-0.7.2/src/npe2/_inspection/_from_npe1.py0000644000000000000000000005543013615410400015657 0ustar00import ast import inspect import re import sys import warnings from configparser import ConfigParser from functools import lru_cache, partial from importlib import import_module, metadata from logging import getLogger from pathlib import Path from types import ModuleType from typing import ( Any, Callable, DefaultDict, Dict, Iterator, List, Optional, Tuple, Union, cast, ) from npe2.manifest import PluginManifest from npe2.manifest.contributions import ( CommandContribution, ThemeColors, WidgetContribution, ) from npe2.manifest.utils import ( SHIM_NAME_PREFIX, import_python_name, merge_manifests, safe_key, ) from npe2.types import WidgetCreator from ._setuputils import PackageInfo, get_package_dir_info logger = getLogger(__name__) NPE1_EP = "napari.plugin" NPE2_EP = "napari.manifest" NPE1_IMPL_TAG = "napari_impl" # same as HookImplementation.format_tag("napari") class HookImplementation: def __init__( self, function: Callable, plugin: 
Optional[ModuleType] = None,
        plugin_name: Optional[str] = None,
        **kwargs,
    ):
        self.function = function
        self.plugin = plugin
        self.plugin_name = plugin_name
        self._specname = kwargs.get("specname")

    def __repr__(self) -> str:  # pragma: no cover
        return (
            f"<HookImplementation plugin_name={self.plugin_name!r}, "
            f"specname={self.specname!r}>"
        )

    @property
    def specname(self) -> str:
        return self._specname or self.function.__name__


def iter_hookimpls(
    module: ModuleType, plugin_name: Optional[str] = None
) -> Iterator[HookImplementation]:
    # yield all routines in module that have "{self.project_name}_impl" attr
    for name in dir(module):
        method = getattr(module, name)
        if hasattr(method, NPE1_IMPL_TAG) and inspect.isroutine(method):
            hookimpl_opts = getattr(method, NPE1_IMPL_TAG)
            if isinstance(hookimpl_opts, dict):
                yield HookImplementation(method, module, plugin_name, **hookimpl_opts)


@lru_cache
def plugin_packages() -> List[PackageInfo]:
    """List of all packages with napari entry points.

    This is useful to help resolve naming issues (due to the terrible
    confusion around *what* an npe1 plugin name actually was).
    """
    packages: List[PackageInfo] = []
    for dist in metadata.distributions():
        packages.extend(
            PackageInfo(package_name=dist.metadata["Name"], entry_points=[ep])
            for ep in dist.entry_points
            if ep.group == NPE1_EP
        )
    return packages


def manifest_from_npe1(
    plugin: Union[str, metadata.Distribution, None] = None,
    module: Optional[Any] = None,
    adapter=False,
) -> PluginManifest:
    """Return manifest object given npe1 plugin or package name.

    One of `plugin` or `module` must be provided.

    Parameters
    ----------
    plugin : Union[str, metadata.Distribution, None]
        Name of package/plugin to convert, or a `metadata.Distribution` object.
        If a string, this function should be prepared to accept both the name
        of the package, and the name of an npe1 `napari.plugin` entry_point.
by default None module : Optional[Module] namespace object, to directly import (mostly for testing.), by default None adapter : bool If True, the resulting manifest will be used internally by NPE1Adapter, but is NOT necessarily suitable for export as npe2 manifest. This will handle cases of locally defined functions and partials that don't have global python_names that are not supported natively by npe2. by default False """ if module is not None: modules: List[str] = [module] package_name = "dynamic" plugin_name = getattr(module, "__name__", "dynamic_plugin") elif isinstance(plugin, str): modules = [] plugin_name = plugin for pp in plugin_packages(): if plugin in (pp.ep_name, pp.package_name): modules.append(pp.ep_value) package_name = pp.package_name if not modules: _avail = [f" {p.package_name} ({p.ep_name})" for p in plugin_packages()] avail = "\n".join(_avail) raise metadata.PackageNotFoundError( f"No package or entry point found with name {plugin!r}: " f"\nFound packages (entry_point):\n{avail}" ) elif hasattr(plugin, "entry_points") and hasattr(plugin, "metadata"): plugin = cast(metadata.Distribution, plugin) # don't use isinstance(Distribution), setuptools monkeypatches sys.meta_path: # https://github.com/pypa/setuptools/issues/3169 NPE1_ENTRY_POINT = "napari.plugin" plugin_name = package_name = plugin.metadata["Name"] modules = [ ep.value for ep in plugin.entry_points if ep.group == NPE1_ENTRY_POINT ] assert modules, f"No npe1 entry points found in distribution {plugin_name!r}" else: raise ValueError("one of plugin or module must be provided") # pragma: no cover manifests: List[PluginManifest] = [] for mod_name in modules: logger.debug( "Discovering contributions for npe1 plugin %r: module %r", package_name, mod_name, ) parser = HookImplParser(package_name, plugin_name or "", adapter=adapter) _mod = import_module(mod_name) if isinstance(mod_name, str) else mod_name parser.parse_module(_mod) manifests.append(parser.manifest()) assert manifests, "No npe1 
entry points found in distribution {name}" return merge_manifests(manifests) class HookImplParser: def __init__(self, package: str, plugin_name: str, adapter: bool = False) -> None: """A visitor class to convert npe1 hookimpls to a npe2 manifest Parameters ---------- package : str Name of package plugin_name : str Name of plugin (will almost always be name of package) adapter : bool, optional If True, the resulting manifest will be used internally by NPE1Adapter, but is NOT necessarily suitable for export as npe2 manifest. This will handle cases of locally defined functions and partials that don't have global python_names that are not supported natively by npe2. by default False Examples -------- >>> parser = HookImplParser(package, plugin_name) >>> parser.parse_callers(plugin_manager._plugin2hookcallers[_module]) >>> mf = PluginManifest(name=package, contributions=dict(parser.contributions)) """ self.package = package self.plugin_name = plugin_name self.contributions: DefaultDict[str, list] = DefaultDict(list) self.adapter = adapter def manifest(self) -> PluginManifest: return PluginManifest(name=self.package, contributions=dict(self.contributions)) def parse_module(self, module: ModuleType): for impl in iter_hookimpls(module, plugin_name=self.plugin_name): if impl.plugin_name == self.plugin_name: # call the corresponding hookimpl parser try: getattr(self, impl.specname)(impl) except Exception as e: # pragma: no cover warnings.warn( f"Failed to convert {impl.specname} in {self.package!r}: {e}", stacklevel=2, ) def napari_experimental_provide_theme(self, impl: HookImplementation): ThemeDict = Dict[str, Union[str, Tuple, List]] d: Dict[str, ThemeDict] = impl.function() for name, theme_dict in d.items(): colors = ThemeColors(**theme_dict) clr = colors.background or colors.foreground luma = _luma(*clr.as_rgb_tuple()[:3]) if clr else 0 self.contributions["themes"].append( { "label": name, "id": name.lower().replace(" ", "_"), "type": "dark" if luma < 128 else "light", 
"colors": colors, } ) def napari_get_reader(self, impl: HookImplementation): patterns = _guess_fname_patterns(impl.function) self.contributions["readers"].append( { "command": self.add_command(impl), "accepts_directories": True, "filename_patterns": patterns, } ) def napari_provide_sample_data(self, impl: HookImplementation): module = sys.modules[impl.function.__module__.split(".", 1)[0]] samples: Dict[str, Union[dict, str, Callable]] = impl.function() for idx, (key, sample) in enumerate(samples.items()): _sample: Union[str, Callable] if isinstance(sample, dict): display_name = sample.get("display_name") _sample = sample.get("data") # type: ignore else: _sample = sample display_name = key _key = safe_key(key) s = {"key": _key, "display_name": display_name} if callable(_sample): # let these raise exceptions here immediately if they don't validate id = f"{self.package}.data.{_key}" py_name = _python_name( _sample, impl.function, hook_idx=idx if self.adapter else None ) cmd_contrib = CommandContribution( id=id, python_name=py_name, title=f"{key} sample", ) self.contributions["commands"].append(cmd_contrib) s["command"] = id else: assert module.__file__ package_dir = module.__file__.rsplit("/", 1)[0] s["uri"] = str(_sample).replace(package_dir, r"${package}") self.contributions["sample_data"].append(s) def napari_experimental_provide_function(self, impl: HookImplementation): items: Union[Callable, List[Callable]] = impl.function() items = [items] if not isinstance(items, list) else items for idx, item in enumerate(items): try: cmd = f"{self.package}.{item.__name__}" py_name = _python_name( item, impl.function, hook_idx=idx if self.adapter else None ) docsum = item.__doc__.splitlines()[0] if item.__doc__ else None cmd_contrib = CommandContribution( id=cmd, python_name=py_name, title=docsum or item.__name__ ) self.contributions["commands"].append(cmd_contrib) wdg_contrib = WidgetContribution( command=cmd, display_name=item.__name__.replace("_", " "), autogenerate=True, ) 
self.contributions["widgets"].append(wdg_contrib) except Exception as e: # pragma: no cover msg = ( f"Error converting function [{idx}] " f"from {impl.function.__module__!r}:\n{e}" ) warnings.warn(msg, stacklevel=2) def napari_experimental_provide_dock_widget(self, impl: HookImplementation): WidgetCallable = Union[Callable, Tuple[Callable, dict]] items: Union[WidgetCallable, List[WidgetCallable]] = impl.function() if not isinstance(items, list): items = [items] # pragma: no cover # "wdg_creator" will be the function given by the plugin that returns a widget # while `impl` is the hook implementation that returned all the `wdg_creators` for idx, item in enumerate(items): if isinstance(item, tuple): wdg_creator = item[0] kwargs = item[1] if len(item) > 1 else {} else: wdg_creator, kwargs = (item, {}) if not callable(wdg_creator) and isinstance( kwargs, dict ): # pragma: no cover warnings.warn( f"Invalid widget spec: {wdg_creator}, {kwargs}", stacklevel=2 ) continue try: func_name = getattr(wdg_creator, "__name__", "") wdg_name = str(kwargs.get("name", "")) or _camel_to_spaces(func_name) self._create_widget_contrib( wdg_creator, display_name=wdg_name, idx=idx, hook=impl.function ) except Exception as e: # pragma: no cover msg = ( f"Error converting dock widget [{idx}] " f"from {impl.function.__module__!r}:\n{e}" ) warnings.warn(msg, stacklevel=2) def _create_widget_contrib( self, wdg_creator: WidgetCreator, display_name: str, idx: int, hook: Callable, ): # we provide both the wdg_creator object itself, as well as the hook impl that # returned it... In the case that we can't get an absolute python name to the # wdg_creator itself (e.g. it's defined in a local scope), then the py_name # will use the hookimpl itself, and the index of the object returned. py_name = _python_name( wdg_creator, hook, hook_idx=idx if self.adapter else None ) if not py_name: # pragma: no cover raise ValueError( "No suitable python name to point to. 
" "Is this a locally defined function or partial?"
            )
        func_name = getattr(wdg_creator, "__name__", "")
        cmd = f"{self.package}.{func_name or display_name.lower().replace(' ', '_')}"

        # let these raise exceptions here immediately if they don't validate
        cmd_contrib = CommandContribution(
            id=cmd, python_name=py_name, title=f"Create {display_name}"
        )
        wdg_contrib = WidgetContribution(command=cmd, display_name=display_name)
        self.contributions["commands"].append(cmd_contrib)
        self.contributions["widgets"].append(wdg_contrib)

    def napari_get_writer(self, impl: HookImplementation):
        warnings.warn(
            f"Found a multi-layer writer in {self.package!r} - {impl.specname!r}, "
            "but it's not convertible. Please add the writer manually.",
            stacklevel=2,
        )
        return NotImplemented  # pragma: no cover

    def napari_write_image(self, impl: HookImplementation):
        self._parse_writer(impl, "image")

    def napari_write_labels(self, impl: HookImplementation):
        self._parse_writer(impl, "labels")

    def napari_write_points(self, impl: HookImplementation):
        self._parse_writer(impl, "points")

    def napari_write_shapes(self, impl: HookImplementation):
        self._parse_writer(impl, "shapes")

    def napari_write_vectors(self, impl: HookImplementation):
        self._parse_writer(impl, "vectors")

    def _parse_writer(self, impl: HookImplementation, layer: str):
        id = self.add_command(impl)
        self.contributions["writers"].append(
            {
                "command": id,
                "layer_types": [layer],
                "display_name": layer,
                "filename_extensions": [],
            }
        )

    def add_command(self, impl: HookImplementation, py_name: str = "") -> str:
        name = impl.specname.replace("napari_", "")
        id = f"{self.package}.{name}"
        title = " ".join(name.split("_")).title()
        if not py_name:
            py_name = _python_name(impl.function)
        c = CommandContribution(id=id, python_name=py_name, title=title)
        self.contributions["commands"].append(c)
        return id


def _is_magicgui_magic_factory(obj):
    return "magicgui" in sys.modules and isinstance(obj, partial)


def _python_name(
    obj: Any, hook: Optional[Callable] = None, hook_idx: 
Optional[int] = None
) -> str:
    """Get resolvable python name for `obj` returned from an npe1 `hook` implementation.

    Parameters
    ----------
    obj : Any
        a python obj
    hook : Callable, optional
        the npe1 hook implementation that returned `obj`, by default None.
        This is used both to search the module namespace for `obj`, and also
        in the shim python name if `obj` cannot be found.
    hook_idx : int, optional
        If `obj` cannot be found and `hook_idx` is not None, then a shim name
        of the form "__npe1shim__.{_python_name(hook)}_{hook_idx}" will be
        returned. by default None.

    Returns
    -------
    str
        a string that can be imported with npe2.manifest.utils.import_python_name

    Raises
    ------
    AttributeError
        If a resolvable string cannot be found
    """
    obj_name: Optional[str] = None
    mod_name: Optional[str] = None

    # first, check the global namespace of the module where the hook was declared;
    # if we find `obj` itself, we can just use it.
    if hasattr(hook, "__module__"):
        hook_mod = sys.modules.get(hook.__module__)
        if hook_mod:
            for local_name, _obj in vars(hook_mod).items():
                if _obj is obj:
                    obj_name = local_name
                    mod_name = hook_mod.__name__
                    break

    # trick if it's a magic_factory
    if _is_magicgui_magic_factory(obj):
        f = obj.keywords.get("function")
        if f:
            v = getattr(f, "__globals__", {}).get(getattr(f, "__name__", ""))
            if v is obj:  # pragma: no cover
                mod_name = f.__module__
                obj_name = f.__qualname__

    # if that didn't work, get the qualname of the object and, if it's not a
    # locally defined qualname, get the name of the module in which it is defined
    if not (mod_name and obj_name):
        obj_name = getattr(obj, "__qualname__", "")
        if obj_name and "<locals>" not in obj_name:
            mod = inspect.getmodule(obj) or inspect.getmodule(hook)
            if mod:
                mod_name = mod.__name__

    if not (mod_name and obj_name) and (hook and hook_idx is not None):
        # we weren't able to resolve an absolute name...
        # if we are shimming, then we can create a special py_name of the
        # form `__npe1shim__.hookfunction_idx`
        return f"{SHIM_NAME_PREFIX}{_python_name(hook)}_{hook_idx}"

    if obj_name and "<locals>" in obj_name:
        raise ValueError("functions defined in local scopes are not yet supported.")
    if not mod_name:
        raise AttributeError(f"could not get resolvable python name for {obj}")
    pyname = f"{mod_name}:{obj_name}"
    if import_python_name(pyname) is not obj:  # pragma: no cover
        raise AttributeError(f"could not get resolvable python name for {obj}")
    return pyname


def _luma(r, g, b):
    # https://en.wikipedia.org/wiki/Luma_(video)
    return 0.2126 * r + 0.7152 * g + 0.0722 * b  # per ITU-R BT.709


_camel_to_spaces_pattern = re.compile(r"((?<=[a-z])[A-Z]|(?<!\A)[A-R,T-Z](?=[a-z]))")


def _camel_to_spaces(val: str) -> str:
    return _camel_to_spaces_pattern.sub(r" \1", val)


def get_top_module_path(package_name: str, top_module: Optional[str] = None) -> Path:
    dist = metadata.distribution(package_name)
    if not top_module:
        top_mods = (dist.read_text("top_level.txt") or "").strip().splitlines()
        if not top_mods:
            raise ValueError(  # pragma: no cover
                "Could not detect a top level module in distribution metadata "
                f"for {package_name}"
            )
        top_module = top_mods[0]

    path = Path(dist.locate_file(top_module))
    assert path.is_dir()
    return path


def convert_repository(
    path: Union[Path, str], mf_name: str = "napari.yaml", dry_run=False
) -> Tuple[PluginManifest, Path]:
    """Convert repository at `path` to new npe2 style."""
    path = Path(path)

    # get the info we need and create a manifest
    info = get_package_dir_info(path)
    if not (info.package_name and info._ep1):
        msg = f'Could not detect first gen napari plugin package at "{path}".'
        if info._ep2 is not None:
            msg += f" Found a {NPE2_EP} entry_point. Is this package already converted?"
        raise ValueError(msg)
    manifest = manifest_from_npe1(info.package_name)
    top_module = get_top_module_path(info.package_name, info.top_module)
    if not top_module.is_dir():
        raise ValueError(  # pragma: no cover
            f"Detection of top-level module failed. {top_module} is not a directory."
) mf_path = top_module / mf_name if dry_run: return manifest, mf_path # update the entry_points in setup.cfg/setup.py if info.setup_cfg: _write_new_setup_cfg_ep(info, mf_name) # or tell them to do it themselves in setup.py else: # tell them to do it manually warnings.warn( "\nCannot auto-update setup.py, please edit setup.py as follows:\n" " 1. remove the `napari.plugin` entry_point\n" " 2. add the following entry_point:" f""" entry_points={{ "{NPE2_EP}": [ "{info.package_name} = {info.top_module}:{mf_name}", ], }}, package_data={{"{info.top_module}": ["{mf_name}"]}}, """, stacklevel=2, ) # write the yaml to top_module/napari.yaml mf_path.write_text(manifest.yaml()) return manifest, mf_path def _write_new_setup_cfg_ep(info: PackageInfo, mf_name: str): assert info.setup_cfg p = ConfigParser(comment_prefixes="/", allow_no_value=True) # preserve comments p.read(info.setup_cfg) mf_path = f"{info.top_module}:{mf_name}" new_ep = f"\n{info.package_name} = {mf_path}" if "options.entry_points" not in p.sections(): p.add_section("options.entry_points") # pragma: no cover p.set("options.entry_points", NPE2_EP, new_ep) if "options.package_data" not in p.sections(): p.add_section("options.package_data") p.set("options.package_data", info.top_module, mf_name) if "options" not in p.sections(): p.add_section("options") p.set("options", "include_package_data", "True") p.remove_option("options.entry_points", NPE1_EP) with open(info.setup_cfg, "w") as fh: p.write(fh) def _guess_fname_patterns(func): """Try to guess filename extension patterns from source code. Fallback to "*".""" patterns = ["*"] # try to look at source code to guess file extensions _, *b = inspect.getsource(func).split("endswith(") if b: try: middle = b[0].split(")")[0] if middle.startswith("("): middle += ")" files = ast.literal_eval(middle) if isinstance(files, str): files = [files] if files: patterns = [f"*{f}" for f in files] except Exception: # pragma: no cover # couldn't do it... 
just accept all filename patterns
            pass
    return patterns
npe2-0.7.2/src/npe2/_inspection/_full_install.py0000644000000000000000000001063313615410400016455 0ustar00"""This module is mostly superseded by the static NPE1ModuleVisitor pattern.

It is left here for reference, but could be removed in the future.
"""
from __future__ import annotations

import site
import sys
import warnings
from contextlib import contextmanager
from importlib import metadata
from logging import getLogger
from typing import TYPE_CHECKING, Iterator, Optional

if TYPE_CHECKING:
    from build.env import IsolatedEnv

    from npe2.manifest import PluginManifest

logger = getLogger(__name__)

__all__ = [
    "fetch_manifest_with_full_install",
    "isolated_plugin_env",
]

NPE1_ENTRY_POINT = "napari.plugin"
NPE2_ENTRY_POINT = "napari.manifest"


@contextmanager
def isolated_plugin_env(
    package: str,
    version: Optional[str] = None,
    validate_npe1_imports: bool = True,
    install_napari_if_necessary: bool = True,
) -> Iterator[IsolatedEnv]:
    """Isolated env context with a plugin installed.

    The site-packages folder of the env is added to sys.path within the context.

    Parameters
    ----------
    package : str
        package name
    version : Optional[str]
        package version, by default, latest version.
    validate_npe1_imports : bool
        Whether to try to import an npe1 plugin's entry points. by default True.
    install_napari_if_necessary : bool
        If `validate_npe1_imports` is True, whether to install napari if the
        import fails. (It's not uncommon for plugins to fail to specify napari
        as a dependency. Other times, they simply need a qt backend.)
        by default True.

    Yields
    ------
    build.env.IsolatedEnv
        env object that has an `install` method.
""" # it's important that this import be lazy, otherwise we'll get a circular # import when serving as a setuptools plugin with `python -m build` from build.env import IsolatedEnvBuilder with IsolatedEnvBuilder() as env: # install the package pkg = f"{package}=={version}" if version else package logger.debug(f"installing {pkg} into virtual env") env.install([pkg]) # temporarily add env site packages to path prefixes = [getattr(env, "path")] # noqa if not (site_pkgs := site.getsitepackages(prefixes=prefixes)): raise ValueError("No site-packages found") # pragma: no cover sys.path.insert(0, site_pkgs[0]) try: if validate_npe1_imports: # try to import the plugin's entry points dist = metadata.distribution(package) ep_groups = {ep.group for ep in dist.entry_points} if NPE1_ENTRY_POINT in ep_groups and NPE2_ENTRY_POINT not in ep_groups: try: _get_loaded_mf_or_die(package) except Exception: # pragma: no cover # if loading contributions fails, it can very often be fixed # by installing `napari[all]` into the environment if install_napari_if_necessary: env.install(["napari[all]"]) # force reloading of qtpy sys.modules.pop("qtpy", None) _get_loaded_mf_or_die(package) else: raise yield env finally: # cleanup sys.path sys.path.pop(0) def _get_loaded_mf_or_die(package: str) -> PluginManifest: """Return a fully loaded (if npe1) manifest, or raise an exception.""" from npe2 import PluginManifest from npe2.manifest._npe1_adapter import NPE1Adapter mf = PluginManifest.from_distribution(package) if isinstance(mf, NPE1Adapter): with warnings.catch_warnings(): warnings.filterwarnings("error", message="Error importing contributions") warnings.filterwarnings("error", message="Failed to convert") warnings.filterwarnings("ignore", message="Found a multi-layer writer") mf._load_contributions(save=False) return mf def fetch_manifest_with_full_install( package: str, version: Optional[str] = None ) -> PluginManifest: """Fetch manifest for plugin by installing into an isolated environment.""" 
# create an isolated env in which to install npe1 plugin with isolated_plugin_env( package, version, validate_npe1_imports=True, install_napari_if_necessary=True ): return _get_loaded_mf_or_die(package) npe2-0.7.2/src/npe2/_inspection/_setuputils.py0000644000000000000000000001073713615410400016213 0ustar00import ast from configparser import ConfigParser from dataclasses import dataclass, field from functools import cached_property from importlib.metadata import EntryPoint from pathlib import Path from typing import Any, Dict, List, Optional, Union NPE1_EP = "napari.plugin" NPE2_EP = "napari.manifest" @dataclass class PackageInfo: src_root: Optional[Path] = None package_name: str = "" entry_points: List[EntryPoint] = field(default_factory=list) setup_cfg: Optional[Path] = None setup_py: Optional[Path] = None pyproject_toml: Optional[Path] = None # @property # def packages(self) -> Optional[List[Path]]: # return Path(self.top_module) @cached_property def _ep1(self) -> Optional[EntryPoint]: return next((ep for ep in self.entry_points if ep.group == NPE1_EP), None) @cached_property def _ep2(self) -> Optional[EntryPoint]: return next((ep for ep in self.entry_points if ep.group == NPE2_EP), None) @property def ep_name(self): if ep := self._ep1: return ep.name @property def ep_value(self): if ep := self._ep1: return ep.value @property def top_module(self) -> str: if ep := (self._ep1 or self._ep2): return ep.value.split(".", 1)[0].split(":", 1)[0] return "" # pragma: no cover def get_package_dir_info(path: Union[Path, str]) -> PackageInfo: """Attempt to *statically* get plugin info from a package directory.""" path = Path(path).resolve() if not path.is_dir(): # pragma: no cover raise ValueError(f"Provided path is not a directory: {path}") info = PackageInfo(src_root=path) p = None # check for setup.cfg setup_cfg = path / "setup.cfg" if setup_cfg.exists(): info.setup_cfg = setup_cfg p = ConfigParser() p.read(setup_cfg) info.package_name = p.get("metadata", "name", 
fallback="") if p.has_section("options.entry_points"): for group, val in p.items("options.entry_points"): name, _, value = val.partition("=") info.entry_points.append(EntryPoint(name.strip(), value.strip(), group)) # check for setup.py setup_py = path / "setup.py" if setup_py.exists(): info.setup_py = setup_py node = ast.parse(setup_py.read_text()) visitor = _SetupVisitor() visitor.visit(node) if not info.package_name: info.package_name = visitor.get("name") if not info.entry_points: for group, vals in visitor.get("entry_points", {}).items(): for val in vals if isinstance(vals, list) else [vals]: name, _, value = val.partition("=") info.entry_points.append( EntryPoint(name.strip(), value.strip(), group) ) return info class _SetupVisitor(ast.NodeVisitor): """Visitor to statically determine metadata from setup.py""" def __init__(self) -> None: super().__init__() self._names: Dict[str, Any] = {} self._setup_kwargs: Dict[str, Any] = {} def visit_Assign(self, node: ast.Assign) -> Any: if len(node.targets) == 1: target = node.targets[0] if isinstance(target, ast.Name) and isinstance(target.ctx, ast.Store): self._names[target.id] = self._get_val(node.value) def visit_Call(self, node: ast.Call) -> Any: if getattr(node.func, "id", "") == "setup": for k in node.keywords: key = k.arg value = self._get_val(k.value) self._setup_kwargs[str(key)] = value def _get_val(self, node: Optional[ast.expr]) -> Any: if isinstance(node, ast.Constant): return node.value if isinstance(node, ast.Name): return ( self._names.get(node.id) if isinstance(node.ctx, ast.Load) else node.id ) if isinstance(node, ast.Dict): keys = [self._get_val(k) for k in node.keys] values = [self._get_val(k) for k in node.values] return dict(zip(keys, values)) if isinstance(node, ast.List): return [self._get_val(k) for k in node.elts] if isinstance(node, ast.Tuple): # pragma: no cover return tuple(self._get_val(k) for k in node.elts) return str(node) # pragma: no cover def get(self, key: str, default: Optional[Any] = 
None) -> Any: return self._setup_kwargs.get(key, default) npe2-0.7.2/src/npe2/_inspection/_visitors.py0000644000000000000000000004643213615410400015655 0ustar00import ast import inspect from abc import ABC, abstractmethod from importlib.metadata import Distribution from pathlib import Path from types import ModuleType from typing import TYPE_CHECKING, Any, DefaultDict, Dict, List, Tuple, Type, Union from npe2.manifest import contributions if TYPE_CHECKING: from pydantic import BaseModel CONTRIB_MAP: Dict[str, Tuple[Type["BaseModel"], str]] = { "writer": (contributions.WriterContribution, "writers"), "reader": (contributions.ReaderContribution, "readers"), "sample_data_generator": (contributions.SampleDataGenerator, "sample_data"), "widget": (contributions.WidgetContribution, "widgets"), } class _DecoratorVisitor(ast.NodeVisitor, ABC): """Visitor that finds decorators that match something. 1. Visit all `Import` and `ImportFrom` nodes in the module, storing their import names in `_names` so we can look them up later. For example, if the module had the line `from npe2 import implements as impls` at the top, then the `visit_ImportFrom` method would add the entry: `self._names['impls'] = 'npe2.implements'` This way, we know that an `@impls` found later in the module is referring to `npe2.implements`. 2. Visit all `FunctionDef` and `ClassDef` nodes in the module and check their decorators with `_find_decorators` 3. In `_find_decorators` we check to see if the name of any of the decorators resolves to the something from self._match (i.e., if it's being decorated with something from `npe2.implements`). 
If it is, then we call `_process_decorated` implement `_process_decorated` in subclasses """ def __init__(self, module_name: str, match: str) -> None: self.module_name = module_name self._match = match self._names: Dict[str, str] = {} def visit_Import(self, node: ast.Import) -> Any: # https://docs.python.org/3/library/ast.html#ast.Import for alias in node.names: self._names[alias.asname or alias.name] = alias.name return super().generic_visit(node) def visit_ImportFrom(self, node: ast.ImportFrom) -> Any: # https://docs.python.org/3/library/ast.html#ast.ImportFrom module = node.module if node.level > 0: root = self.module_name.rsplit(".", node.level)[0] module = f"{root}.{module}" for alias in node.names: self._names[alias.asname or alias.name] = f"{module}.{alias.name}" return super().generic_visit(node) def visit_FunctionDef(self, node: ast.FunctionDef) -> Any: # https://docs.python.org/3/library/ast.html#ast.FunctionDef self._find_decorators(node) def visit_ClassDef(self, node: ast.ClassDef) -> Any: self._find_decorators(node) def _find_decorators(self, node: Union[ast.ClassDef, ast.FunctionDef]): # for each in the decorator list ... for call in node.decorator_list: # https://docs.python.org/3/library/ast.html#ast.Call if isinstance(call, ast.Name) and self._names.get(call.id, "").startswith( self._match ): self._process_decorated(call.id, node, {}) if not isinstance(call, ast.Call): continue # if the function is an attribute ... # (e.g in `@npe2.implements.reader`, `reader` is an attribute of # implements, which is an attribute of npe2) if isinstance(call.func, ast.Attribute): # then go up the chain of attributes until we get to a root Name val = call.func.value _names = [] while isinstance(val, ast.Attribute): _names.append(val.attr) val = val.value if isinstance(val, ast.Name): _names.append(val.id) # finally, we can build the fully resolved call name of the # decorator (e.g. 
`@npe2.implements`, or `@implements`) call_name = ".".join(reversed(_names)) # we then check the `_names` we gathered during imports to resolve # these call names to their fully qualified names (e.g. `implements` # would resolve to `npe2.implements` if the name `implements` was # imported from `npe2`.) # If the name resolves to whatever `self._match` is, # then we have a hit! process the function. # In the case of an NPE2 module visitor, the name we're trying to # match will be `npe2.implements` or `implements`. # In the case of a npe1 visitor, the name we're trying to match # will be `napari_plugin_engine.napari_hook_implementation`. if self._names.get(call_name) == self._match: kwargs = self._keywords_to_kwargs(call.keywords) self._process_decorated(call.func.attr, node, kwargs) elif isinstance(call.func, ast.Name): # if the function is just a direct name (e.g. `@reader`) # then we can just see if the name points to something imported from # this module. if self._names.get(call.func.id, "").startswith(self._match): kwargs = self._keywords_to_kwargs(call.keywords) self._process_decorated(call.func.id, node, kwargs) return super().generic_visit(node) def _keywords_to_kwargs(self, keywords: List[ast.keyword]) -> Dict[str, Any]: return {str(k.arg): ast.literal_eval(k.value) for k in keywords} @abstractmethod def _process_decorated( self, decorator_name: str, node: Union[ast.ClassDef, ast.FunctionDef], decorator_kwargs: dict, ): """Process a decorated function. This is a hook for subclasses to do something with the decorated function. """ class NPE2PluginModuleVisitor(_DecoratorVisitor): """AST visitor to find all contributions in an npe2 module. This visitor will find all the contributions (things decorated with `@npe2.implements`) in a module and store them in `contribution_points`. It works as follows: See how decorators are found in the docstring for `_DecoratorVisitor`. when a decorator is found that matches `npe2.implements` (e.g. `@implements.reader`) 4. 
`_store_contrib` first calls `_store_command` which does the job of storing the fully-resolved `python_name` for the function being decorated, and creates a CommandContribution. `_store_contrib` will then create the appropriate contribution type (e.g. `npe2.implements.reader` will instantiate a `ReaderContribution`), set its `command` entry to the `id` of the just-created `CommandContribution`, then store it in `contribution_points`. 5. When the visitor is finished, we can create an instance of `ContributionPoints` using `ContributionPoints(**visitor.contribution_points)`, then add it to a manifest. """ def __init__( self, plugin_name: str, module_name: str, match: str = "npe2.implements" ) -> None: super().__init__(module_name, match) self.plugin_name = plugin_name self.contribution_points: Dict[str, List[dict]] = {} def _process_decorated( self, decorator_name: str, node: Union[ast.ClassDef, ast.FunctionDef], decorator_kwargs: dict, ): self._store_contrib(decorator_name, node.name, decorator_kwargs) def _store_contrib(self, contrib_type: str, name: str, kwargs: Dict[str, Any]): from npe2.implements import CHECK_ARGS_PARAM # circ import kwargs.pop(CHECK_ARGS_PARAM, None) ContribClass, contrib_name = CONTRIB_MAP[contrib_type] contrib = ContribClass(**self._store_command(name, kwargs)) existing: List[dict] = self.contribution_points.setdefault(contrib_name, []) existing.append(contrib.dict(exclude_unset=True)) def _store_command(self, name: str, kwargs: Dict[str, Any]) -> Dict[str, Any]: cmd_params = inspect.signature(contributions.CommandContribution).parameters cmd_kwargs = {k: kwargs.pop(k) for k in list(kwargs) if k in cmd_params} cmd_kwargs["python_name"] = self._qualified_pyname(name) cmd = contributions.CommandContribution(**cmd_kwargs) if cmd.id.startswith(self.plugin_name): n = len(self.plugin_name) cmd.id = cmd.id[n:] cmd.id = f"{self.plugin_name}.{cmd.id.lstrip('.')}" cmd_contribs: List[dict] = self.contribution_points.setdefault("commands", []) 
cmd_contribs.append(cmd.dict(exclude_unset=True)) kwargs["command"] = cmd.id return kwargs def _qualified_pyname(self, obj_name: str) -> str: return f"{self.module_name}:{obj_name}" class NPE1PluginModuleVisitor(_DecoratorVisitor): """AST visitor to find all contributions in an npe1 module.""" def __init__(self, plugin_name: str, module_name: str) -> None: super().__init__(module_name, "napari_plugin_engine.napari_hook_implementation") self.plugin_name = plugin_name self.contribution_points: DefaultDict[str, list] = DefaultDict(list) def _process_decorated( self, decorator_name: str, node: Union[ast.ClassDef, ast.FunctionDef], decorator_kwargs: Dict[str, Any], ): self.generic_visit(node) # do this to process any imports in the function hookname = decorator_kwargs.get("specname", node.name) getattr(self, hookname)(node) def _add_command(self, node: ast.FunctionDef) -> contributions.CommandContribution: cmd_id = f"{self.plugin_name}.{node.name}" py_name = f"{self.module_name}:{node.name}" cmd_contrib = contributions.CommandContribution( id=cmd_id, python_name=py_name, title=node.name ) self.contribution_points["commands"].append(cmd_contrib) return cmd_contrib def napari_get_reader(self, node: ast.FunctionDef): cmd = self._add_command(node) rdr_contrib = contributions.ReaderContribution( command=cmd.id, filename_patterns=["*"], accepts_directories=True ) self.contribution_points["readers"].append(rdr_contrib) def napari_get_writer(self, node: ast.FunctionDef): # we can't convert this to an npe2 command contribution pass # pragma: no cover def napari_write_graph(self, node: ast.FunctionDef): self._parse_writer(node, "graph") # pragma: no cover def napari_write_image(self, node: ast.FunctionDef): self._parse_writer(node, "image") def napari_write_labels(self, node: ast.FunctionDef): self._parse_writer(node, "labels") # pragma: no cover def napari_write_points(self, node: ast.FunctionDef): self._parse_writer(node, "points") # pragma: no cover def 
napari_write_shapes(self, node: ast.FunctionDef): self._parse_writer(node, "shapes") # pragma: no cover def napari_write_vectors(self, node: ast.FunctionDef): self._parse_writer(node, "vectors") # pragma: no cover def _parse_writer(self, node, layer_type: str): cmd = self._add_command(node) wrt_contrib = contributions.WriterContribution( command=cmd.id, layer_types=[layer_type], display_name=layer_type ) self.contribution_points["writers"].append(wrt_contrib) def napari_provide_sample_data(self, node: ast.FunctionDef): from npe2.manifest.utils import safe_key return_ = next(n for n in node.body if isinstance(n, ast.Return)) if not isinstance(return_.value, ast.Dict): raise TypeError( # pragma: no cover f"napari_provide_sample_data must return a dict, not {type(return_)}" ) contrib: contributions.SampleDataContribution for key, val in zip(return_.value.keys, return_.value.values): if isinstance(val, ast.Dict): raise NotImplementedError("npe1 sample dicts-of-dicts not supported") assert isinstance(key, ast.Constant) display_name = key.value key = safe_key(display_name) # type: ignore # sample should now either be a callable, or a string if isinstance(val, ast.Name): cmd_id = f"{self.plugin_name}.{val.id}" py_name = f"{self.module_name}:{val.id}" cmd_contrib = contributions.CommandContribution( id=cmd_id, python_name=py_name, title=val.id ) self.contribution_points["commands"].append(cmd_contrib) contrib = contributions.SampleDataGenerator( command=cmd_id, key=key, display_name=display_name ) else: uri = "__dynamic__" # TODO: make this a real uri contrib = contributions.SampleDataURI( key=key, display_name=display_name, uri=uri ) self.contribution_points["sample_data"].append(contrib) def napari_experimental_provide_function(self, node: ast.FunctionDef): return_ = next(n for n in node.body if isinstance(n, ast.Return)) items = ( list(return_.value.elts) if isinstance(return_.value, ast.List) else [return_.value] # type: ignore ) for item in items: if not 
isinstance(item, ast.Name): raise NotImplementedError( # pragma: no cover "provide function got non-name" ) py_name = self._names.get(item.id) py_name = ( ":".join(py_name.rsplit(".", 1)) if py_name else f"{self.module_name}:{node.name}" ) cmd_id = f"{self.plugin_name}.{node.name}" cmd_contrib = contributions.CommandContribution( id=cmd_id, python_name=py_name, title=item.id ) wdg_contrib = contributions.WidgetContribution( command=cmd_id, display_name=item.id, autogenerate=True ) self.contribution_points["commands"].append(cmd_contrib) self.contribution_points["widgets"].append(wdg_contrib) def napari_experimental_provide_dock_widget(self, node: ast.FunctionDef): return_ = next(n for n in node.body if isinstance(n, ast.Return)) items = ( list(return_.value.elts) if isinstance(return_.value, ast.List) else [return_.value] # type: ignore ) for item in items: wdg_creator = item.elts[0] if isinstance(item, ast.Tuple) else item if isinstance(wdg_creator, ast.Name): # eg `SegmentationWidget` obj_name = wdg_creator.id if py_name := self._names.get(wdg_creator.id): py_name = ":".join(py_name.rsplit(".", 1)) else: py_name = f"{self.module_name}:{obj_name}" elif isinstance(wdg_creator, ast.Attribute): # eg `measurement.analyze_points_layer` py_name = wdg_creator.attr tmp = wdg_creator assert isinstance(tmp.value, ast.Name) py_name = f"{self._names[tmp.value.id]}.{py_name}" py_name = ":".join(py_name.rsplit(".", 1)) obj_name = tmp.value.id else: raise TypeError( # pragma: no cover f"Unexpected widget creator type: {type(wdg_creator)}" ) cmd_id = f"{self.plugin_name}.{obj_name}" cmd_contrib = contributions.CommandContribution( id=cmd_id, python_name=py_name, title=obj_name ) wdg_contrib = contributions.WidgetContribution( command=cmd_id, display_name=obj_name ) self.contribution_points["commands"].append(cmd_contrib) self.contribution_points["widgets"].append(wdg_contrib) def find_npe2_module_contributions( path: Union[ModuleType, str, Path], plugin_name: str, module_name: 
str = "" ) -> contributions.ContributionPoints: """Visit an npe2 module and extract contribution points. Parameters ---------- path : Union[ModuleType, str, Path] Either a path to a Python module, a module object, or a string plugin_name : str Name of the plugin module_name : str Module name, by default "" Returns ------- ContributionPoints ContributionPoints discovered in the module. """ if isinstance(path, ModuleType): assert path.__file__ assert path.__name__ module_name = path.__name__ path = path.__file__ visitor = NPE2PluginModuleVisitor(plugin_name, module_name=module_name) visitor.visit(ast.parse(Path(path).read_text())) if "commands" in visitor.contribution_points: compress = {tuple(i.items()) for i in visitor.contribution_points["commands"]} visitor.contribution_points["commands"] = [dict(i) for i in compress] return contributions.ContributionPoints(**visitor.contribution_points) def find_npe1_module_contributions( dist: Distribution, module_name: str = "" ) -> contributions.ContributionPoints: """Statically visit an npe1 module and extract contribution points. Parameters ---------- dist: Distribution A distribution object representing an npe1 plugin to be visited. module_name : str Module name, by default "" Returns ------- ContributionPoints ContributionPoints discovered in the module. """ plugin_name = dist.metadata["Name"] file = _locate_module_in_dist(dist, module_name) visitor = NPE1PluginModuleVisitor(plugin_name, module_name=module_name) visitor.visit(ast.parse(Path(file).read_text())) # now check all of the modules that were imported by `module_name` to see # if any of those had npe1 decorated functions. # NOTE: we're only going 1 level deep here... for name, target in visitor._names.items(): if not name.startswith("_"): target_module = target.rsplit(".", 1)[0] try: file = _locate_module_in_dist(dist, target_module) except FileNotFoundError: # if the imported module is not in the same distribution # just skip it. 
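As a standalone illustration of the de-duplication trick that `find_npe2_module_contributions` uses on the `"commands"` list (a set of `tuple(d.items())` keys; note this assumes every value in the dicts is hashable, and the order of the result is not preserved):

```python
# De-duplicate a list of flat dicts by round-tripping each dict through a
# hashable tuple of its items.  The example data is made up for
# illustration; values must be hashable for this pattern to work.
commands = [
    {"id": "pkg.read", "python_name": "pkg.io:read"},
    {"id": "pkg.read", "python_name": "pkg.io:read"},  # exact duplicate
    {"id": "pkg.write", "python_name": "pkg.io:write"},
]
compress = {tuple(c.items()) for c in commands}
unique = [dict(t) for t in compress]
print(len(unique))  # 2
```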
continue # NOTE: technically, this time we should restrict the allowable names # to those that are imported from the original module_name ... # but that's probably overkill v2 = NPE1PluginModuleVisitor(plugin_name, target_module) v2.visit(ast.parse(file.read_text())) visitor.contribution_points.update(v2.contribution_points) return contributions.ContributionPoints(**visitor.contribution_points) def _locate_module_in_dist(dist: Distribution, module_name: str) -> Path: root = dist.locate_file(module_name.replace(".", "/")) assert isinstance(root, Path) if not (file := root / "__init__.py").exists(): if not (file := root.with_suffix(".py")).exists(): raise FileNotFoundError( f"Could not find {module_name!r} in " f"the distribution for {dist.metadata['Name']}" ) return file npe2-0.7.2/src/npe2/manifest/__init__.py from ._package_metadata import PackageMetadata from .schema import ENTRY_POINT, PluginManifest __all__ = ["PackageMetadata", "PluginManifest", "ENTRY_POINT"] npe2-0.7.2/src/npe2/manifest/_bases.py import json from contextlib import contextmanager from pathlib import Path from typing import Callable, Dict, Optional, Union import yaml from pydantic import BaseModel, PrivateAttr class ImportExportModel(BaseModel): """Model mixin/base class that provides read/write from toml/yaml/json. To force the inclusion of a given field in the exported toml/yaml use: class MyModel(ImportExportModel): some_field: str = Field(..., always_export=True) """ _source_file: Optional[Path] = PrivateAttr(None) def toml(self, pyproject=False, **kwargs) -> str: """Generate serialized `toml` string for this model. Parameters ---------- pyproject : bool, optional If `True`, output will be in pyproject format, with all data under `tool.napari`, by default `False`.
**kwargs passed to `BaseModel.json()` """ import tomli_w d = self._serialized_data(**kwargs) if pyproject: d = {"tool": {"napari": d}} return tomli_w.dumps(d) def yaml(self, **kwargs) -> str: """Generate serialized `yaml` string for this model. Parameters ---------- **kwargs passed to `BaseModel.json()` """ return yaml.safe_dump(self._serialized_data(**kwargs), sort_keys=False) @classmethod def from_file(cls, path: Union[Path, str]): """Parse model from a metadata file. Parameters ---------- path : Path or str Path to file. Must have extension {'.json', '.yaml', '.yml', '.toml'} Returns ------- object The parsed model. Raises ------ FileNotFoundError If `path` does not exist. ValueError If the file extension is not in {'.json', '.yaml', '.yml', '.toml'} """ path = Path(path).expanduser().absolute().resolve() if not path.exists(): raise FileNotFoundError(f"File not found: {path}") loader: Callable if path.suffix.lower() == ".json": loader = json.load elif path.suffix.lower() == ".toml": try: import tomllib except ImportError: import tomli as tomllib # type: ignore [no-redef] loader = tomllib.load elif path.suffix.lower() in (".yaml", ".yml"): loader = yaml.safe_load else: raise ValueError(f"unrecognized file extension: {path}") # pragma: no cover with open(path, mode="rb") as f: data = loader(f) or {} if path.name == "pyproject.toml": data = data["tool"]["napari"] obj = cls(**data) obj._source_file = Path(path).expanduser().absolute().resolve() return obj def _serialized_data(self, **kwargs): """using json encoders for all outputs""" kwargs.setdefault("exclude_unset", True) with self._required_export_fields_set(): return json.loads(self.json(**kwargs)) @contextmanager def _required_export_fields_set(self): fields = self.__fields__.items() required = {k for k, v in fields if v.field_info.extra.get("always_export")} was_there: Dict[str, bool] = {} for f in required: was_there[f] = f in self.__fields_set__ self.__fields_set__.add(f) try: yield finally: for f in 
required: if not was_there.get(f): self.__fields_set__.discard(f) npe2-0.7.2/src/npe2/manifest/_npe1_adapter.py import contextlib import logging import os import site import warnings from importlib import metadata from pathlib import Path from shutil import rmtree from typing import List, Sequence from appdirs import user_cache_dir from npe2._inspection._from_npe1 import manifest_from_npe1 from npe2.manifest import PackageMetadata from .schema import PluginManifest, discovery_blocked logger = logging.getLogger(__name__) ADAPTER_CACHE = Path(user_cache_dir("napari", "napari")) / "npe2" / "adapter_manifests" NPE2_NOCACHE = "NPE2_NOCACHE" def clear_cache(names: Sequence[str] = ()) -> List[Path]: """Clear cached NPE1Adapter manifests. Parameters ---------- names : Sequence[str], optional selection of plugin names to clear, by default, all will be cleared Returns ------- List[Path] List of filepaths cleared """ _cleared: List[Path] = [] if ADAPTER_CACHE.exists(): if names: for f in ADAPTER_CACHE.glob("*.yaml"): if any(f.name.startswith(f"{n}_") for n in names): f.unlink() _cleared.append(f) else: _cleared = list(ADAPTER_CACHE.iterdir()) rmtree(ADAPTER_CACHE) return _cleared class NPE1Adapter(PluginManifest): """PluginManifest subclass that acts as an adapter for 1st gen plugins. During plugin discovery, packages that provide a first generation 'napari.plugin' entry_point (but do *not* provide a second generation 'napari.manifest' entrypoint) will be stored as `NPE1Adapter` manifests in the `PluginManager._npe1_adapters` list. This class is instantiated with only a distribution object, but lacks contributions at construction time. When `self.contributions` is accessed for the first time, `_load_contributions` is called, triggering an import and indexing of all plugin modules using the same logic as `npe2 convert`. After import, the discovered contributions are cached in a manifest for use in future sessions.
(The cache can be cleared using `npe2 cache --clear [plugin-name]`). Parameters ---------- dist : metadata.Distribution A Distribution object for a package installed in the environment. (Minimally, the distribution object must implement the `metadata` and `entry_points` attributes.). It will be passed to `manifest_from_npe1` """ _is_loaded: bool = False _dist: metadata.Distribution def __init__(self, dist: metadata.Distribution): """_summary_""" meta = PackageMetadata.from_dist_metadata(dist.metadata) super().__init__( name=dist.metadata["Name"], package_metadata=meta, npe1_shim=True ) self._dist = dist def __getattribute__(self, __name: str): if __name == "contributions": self._load_contributions() return super().__getattribute__(__name) def _load_contributions(self, save=True) -> None: """import and inspect package contributions.""" if self._is_loaded: return self._is_loaded = True # if we fail once, we still don't try again. if self._cache_path().exists() and not os.getenv(NPE2_NOCACHE): mf = PluginManifest.from_file(self._cache_path()) self.contributions = mf.contributions logger.debug("%r npe1 adapter loaded from cache", self.name) return with discovery_blocked(): try: mf = manifest_from_npe1(self._dist, adapter=True) except Exception as e: warnings.warn( "Error importing contributions for first-generation " f"napari plugin {self.name!r}: {e}", stacklevel=2, ) return self.contributions = mf.contributions logger.debug("%r npe1 adapter imported", self.name) if save and not _is_editable_install(self._dist): with contextlib.suppress(OSError): self._save_to_cache() def _save_to_cache(self): cache_path = self._cache_path() cache_path.parent.mkdir(exist_ok=True, parents=True) cache_path.write_text(self.yaml()) def _cache_path(self) -> Path: """Return cache path for manifest corresponding to distribution.""" return _cached_adapter_path(self.name, self.package_version or "") def _serialized_data(self, **kwargs): if not self._is_loaded: # pragma: no cover 
self._load_contributions(save=False) return super()._serialized_data(**kwargs) def _cached_adapter_path(name: str, version: str) -> Path: """Return cache path for manifest corresponding to distribution.""" return ADAPTER_CACHE / f"{name}_{version}.yaml" def _is_editable_install(dist: metadata.Distribution) -> bool: """Return True if dist is installed as editable. i.e: if the package isn't in site-packages or user site-packages. """ root = str(dist.locate_file("")) installed_paths = [*site.getsitepackages(), site.getusersitepackages()] return all(loc not in root for loc in installed_paths) npe2-0.7.2/src/npe2/manifest/_package_metadata.py0000644000000000000000000002156513615410400016522 0ustar00from importlib import metadata from typing import Dict, List, Literal, Optional, Union from pydantic import BaseModel, Extra, Field, constr, root_validator from pydantic.fields import SHAPE_LIST # https://packaging.python.org/specifications/core-metadata/ MetadataVersion = Literal["1.0", "1.1", "1.2", "2.0", "2.1", "2.2"] _alphanum = "[a-zA-Z0-9]" PackageName = constr(regex=f"^{_alphanum}[a-zA-Z0-9._-]*{_alphanum}$") class PackageMetadata(BaseModel): """Pydantic model for standard python package metadata. https://www.python.org/dev/peps/pep-0566/ https://packaging.python.org/specifications/core-metadata/ The `importlib.metadata` provides the `metadata()` function, but it returns a somewhat awkward `email.message.Message` object. """ class Config: extra = Extra.ignore metadata_version: MetadataVersion = Field( "1.0", description="Version of the file format" ) name: PackageName = Field( # type: ignore ..., description="The name of the distribution. The name field " "is the primary identifier for a distribution.", ) # technically there is PackageVersion regex at # https://www.python.org/dev/peps/pep-0440/#id81 # but it will fail on some dev versions, and it's not worth it. version: str = Field( ..., description="A string containing the distribution's version number. 
" "This field must be in the format specified in PEP 440.", ) dynamic: Optional[List[str]] = Field( None, description="A string containing the name of another core metadata " "field. The field names Name and Version may not be specified in this field.", min_ver="2.2", ) platform: Optional[List[str]] = Field( None, description="A Platform specification describing an operating system " "supported by the distribution which is not listed in the “Operating System” " "Trove classifiers. See “Classifier” below.", ) supported_platform: Optional[List[str]] = Field( None, description="Binary distributions containing a PKG-INFO file will use the " "Supported-Platform field in their metadata to specify the OS and CPU for " "which the binary distribution was compiled", min_ver="1.1", ) summary: Optional[str] = Field( None, description="A one-line summary of what the distribution does." ) description: Optional[str] = Field( None, description="A longer description of the distribution that can " "run to several paragraphs.", ) description_content_type: Optional[str] = Field( None, description="A string stating the markup syntax (if any) used in the " "distribution's description, so that tools can intelligently render the " "description. 
The type/subtype part has only a few legal values: " "text/plain, text/x-rst, text/markdown", min_ver="2.1", ) keywords: Optional[str] = Field( None, description="A list of additional keywords, separated by commas, to be used " "to assist searching for the distribution in a larger catalog.", ) home_page: Optional[str] = Field( None, description="A string containing the URL for the distribution's home page.", ) download_url: Optional[str] = Field( None, description="A string containing the URL from which THIS version of the " "distribution can be downloaded.", min_ver="1.1", ) author: Optional[str] = Field( None, description="A string containing the author's name at a minimum; " "additional contact information may be provided.", ) author_email: Optional[str] = Field( None, description="A string containing the author's e-mail address. It can contain " "a name and e-mail address in the legal forms for a RFC-822 From: header.", ) maintainer: Optional[str] = Field( None, description="A string containing the maintainer's name at a minimum; " "additional contact information may be provided.", min_ver="1.2", ) maintainer_email: Optional[str] = Field( None, description="A string containing the maintainer's e-mail address. It can " "contain a name and e-mail address in the legal forms for a " "RFC-822 From: header.", min_ver="1.2", ) license: Optional[str] = Field( None, description="Text indicating the license covering the distribution where the " "license is not a selection from the “License” Trove classifiers. See " "“Classifier” below. This field may also be used to specify a particular " "version of a license which is named via the Classifier field, or to " "indicate a variation or exception to such a license.", ) classifier: Optional[List[str]] = Field( None, description="Each entry is a string giving a single classification value for " "the distribution. 
Classifiers are described in PEP 301, and the Python " "Package Index publishes a dynamic list of currently defined classifiers.", min_ver="1.1", ) requires_dist: Optional[List[str]] = Field( None, description="The field format specification was relaxed to accept the syntax " "used by popular publishing tools. Each entry contains a string naming some " "other distutils project required by this distribution.", min_ver="1.2", ) requires_python: Optional[str] = Field( None, description="This field specifies the Python version(s) that the distribution " "is guaranteed to be compatible with. Installation tools may look at this " "when picking which version of a project to install. " "The value must be in the format specified in Version specifiers (PEP 440).", min_ver="1.2", ) requires_external: Optional[List[str]] = Field( None, description="The field format specification was relaxed to accept the syntax " "used by popular publishing tools. Each entry contains a string describing " "some dependency in the system that the distribution is to be used. This " "field is intended to serve as a hint to downstream project maintainers, and " "has no semantics which are meaningful to the distutils distribution.", min_ver="1.2", ) project_url: Optional[List[str]] = Field( None, description="A string containing a browsable URL for the project and a label " "for it, separated by a comma.", min_ver="1.2", ) provides_extra: Optional[List[str]] = Field( None, description="A string containing the name of an optional feature. Must be a " "valid Python identifier. 
May be used to make a dependency conditional on " "whether the optional feature has been requested.", min_ver="2.1", ) # rarely_used provides_dist: Optional[List[str]] = Field(None, min_ver="1.2") obsoletes_dist: Optional[List[str]] = Field(None, min_ver="1.2") @root_validator(pre=True) def _validate_root(cls, values): if "metadata_version" not in values: fields = cls.__fields__ mins = { fields[n].field_info.extra.get("min_ver", "1.0") for n in values if n in fields } values["metadata_version"] = str(max(float(x) for x in mins)) return values @classmethod def for_package(cls, name: str) -> "PackageMetadata": """Get PackageMetadata from a package name.""" return cls.from_dist_metadata(metadata.metadata(name)) # note, the metadata.PackageMetadata hint is only valid for py 3.10 # before that, it was email.message.Message @classmethod def from_dist_metadata(cls, meta: "metadata.PackageMetadata") -> "PackageMetadata": """Generate PackageMetadata from importlib.metadata.PackageMetadata.""" manys = [f.name for f in cls.__fields__.values() if f.shape == SHAPE_LIST] d: Dict[str, Union[str, List[str]]] = {} # looks like py3.10 changed the public protocol of metadata.PackageMetadata # and they don't want you to rely on the Mapping interface... however, the # __iter__ method doesn't iterate key value pairs, just keys, and I can't figure # out how to get multi-valued fields from that (e.g.
Classifier) # might need to change this in the future for key, value in meta.items(): # type: ignore key = _norm(key) if key in manys: d.setdefault(key, []).append(value) # type: ignore else: d[key] = value return cls.parse_obj(d) def __hash__(self) -> int: return id(self) def _norm(string: str) -> str: return string.replace("-", "_").replace(" ", "_").lower() npe2-0.7.2/src/npe2/manifest/_validators.py import re _package_name = "([A-Z0-9]|[A-Z0-9][A-Z0-9._-]*[A-Z0-9])" _python_identifier = "([a-zA-Z_][a-zA-Z_0-9]*)" # how do we deal with keywords ? # do we try to validate ? Or do we just # assume users won't try to create a command named # `npe2_tester.False.if.for.in` ? _identifier_plus_dash = "(?:[a-zA-Z_][a-zA-Z_0-9-]+)" _dotted_name = f"(?:(?:{_identifier_plus_dash}\\.)*{_identifier_plus_dash})" PACKAGE_NAME_PATTERN = re.compile(f"^{_package_name}$", re.IGNORECASE) DOTTED_NAME_PATTERN = re.compile(_dotted_name) DISPLAY_NAME_PATTERN = re.compile(r"^[^\W_][\w -~:.'\"]{1,88}[^\W_]$") PYTHON_NAME_PATTERN = re.compile(f"^({_dotted_name}):({_dotted_name})$") COMMAND_ID_PATTERN = re.compile( f"^(({_package_name}\\.)*{_python_identifier})$", re.IGNORECASE ) def command_id(id: str) -> str: if id and not COMMAND_ID_PATTERN.match(id): raise ValueError( f"{id!r} is not a valid command id. It must begin with the package name " "followed by a period, and may only contain alphanumeric " "characters and underscores." ) return id def package_name(name: str) -> str: """Assert that `name` is a valid package name in accordance with PEP-0508.""" if name and not PACKAGE_NAME_PATTERN.match(name): raise ValueError( f"{name!r} is not a valid python package name. " "See https://peps.python.org/pep-0508/#names " ) return name def python_name(name: str) -> str: """Assert that `name` is a valid python name: e.g.
`module.submodule:funcname`""" if name and not PYTHON_NAME_PATTERN.match(name): msg = ( f"{name!r} is not a valid python_name. A python_name must " "be of the form '{obj.__module__}:{obj.__qualname__}' (e.g. " "'my_package.a_module:some_function')." ) if ".." in name: *_, a, b = name.split("..") a = a.split(":")[-1] msg += ( " Note: functions defined in local scopes are not yet supported. " f"Please move function {b!r} to the global scope of module {a!r}" ) raise ValueError(msg) return name def display_name(v: str) -> str: if not DISPLAY_NAME_PATTERN.match(v): raise ValueError( f"{v} is not a valid display_name. It must be 3-90 characters long, " "and must not begin or end with an underscore, white space, or " "non-word character." ) return v def icon_path(v: str) -> str: if not v: return "" if v.startswith("http"): if not v.startswith("https://"): raise ValueError( f"{v} is not a valid icon URL. It must start with 'https://'" ) return v assert isinstance(v, str), f"{v} must be a string" return v npe2-0.7.2/src/npe2/manifest/menus.py0000644000000000000000000000026513615410400014251 0ustar00import warnings warnings.warn( "please import menus from npe2.manifest.contributions", DeprecationWarning, stacklevel=2, ) from .contributions._menus import * # noqa npe2-0.7.2/src/npe2/manifest/package_metadata.py0000644000000000000000000000027713615410400016360 0ustar00import warnings warnings.warn( "Please import PackageMetadata from 'npe2' or from 'npe2.manifest'", DeprecationWarning, stacklevel=2, ) from ._package_metadata import * # noqa npe2-0.7.2/src/npe2/manifest/schema.py0000644000000000000000000004722413615410400014370 0ustar00from __future__ import annotations import sys from contextlib import contextmanager, suppress from enum import Enum from importlib import metadata, util from logging import getLogger from pathlib import Path from typing import Iterator, List, Literal, NamedTuple, Optional, Sequence, Union from pydantic import Extra, Field, ValidationError, 
root_validator, validator from pydantic.error_wrappers import ErrorWrapper from pydantic.main import BaseModel, ModelMetaclass from npe2.types import PythonName from . import _validators from ._bases import ImportExportModel from ._package_metadata import PackageMetadata from .contributions import ContributionPoints from .utils import Executable, Version logger = getLogger(__name__) SCHEMA_VERSION = "0.2.0" ENTRY_POINT = "napari.manifest" NPE1_ENTRY_POINT = "napari.plugin" class Category(str, Enum): """Broad plugin categories, values for PluginManifest.categories.""" # drive devices (webcams, microscopes, etc) to acquire data Acquisition = "Acquisition" # tools that facilitate labeling, marking, and, erm, "annotating" data within napari Annotation = "Annotation" # Sample datasets for training, demonstration, learning, etc... Dataset = "Dataset" # Routines that take in numerical arrays and generally return new arrays or datasets # (e.g. scikit image stuff, deconvolution, super-res reconstruction, etc...) Image_Processing = "Image Processing" # Plugins that read from and/or write to files or data streams # not supported natively by napari IO = "IO" # Plugins that employ machine learning: may facilitate either training or prediction # Machine_Learning = "Machine Learning" # Tools that extract measurements (i.e. into tabular, graph, or other data formats), # such as region properties, etc... Measurement = "Measurement" # tools that identify objects and/or boundaries in datasets # (including, but not limited to, images) Segmentation = "Segmentation" # tools that simulate some physical process. # microscope/PSF generators, optics simulators, astronomy simulations, etc... Simulation = "Simulation" # plugins that modify the look and feel of napari Themes = "Themes" # linear and or nonlinear registrations, general data transformations and mappings Transformations = "Transformations" # Conveniences, widgets, etc... 
stuff that could conceivably be "core" # but which is community-supported Utilities = "Utilities" # tools for plotting, rendering, and visualization # (on top of those provided by napari) Visualization = "Visualization" def __str__(self) -> str: return self.value # pragma: no cover class DiscoverResults(NamedTuple): manifest: Optional[PluginManifest] distribution: Optional[metadata.Distribution] error: Optional[Exception] class PluginManifest(ImportExportModel): class Config: underscore_attrs_are_private = True extra = Extra.ignore validate_assignment = True # VS Code uses . as a unique ID for the extension # should this just be the package name ... not the module name? (yes) # do we normalize this? (i.e. underscores / dashes ?) (no) # TODO: enforce that this matches the package name name: str = Field( ..., description="The name of the plugin. Though this field is mandatory, it *must*" " match the package `name` as defined in the python package metadata.", allow_mutation=False, ) _validate_name = validator("name", pre=True, allow_reuse=True)( _validators.package_name ) display_name: str = Field( "", description="User-facing text to display as the name of this plugin. " "Must be 3-90 characters long and must not begin or end with an underscore, " "white space, or non-word character. If not provided, the manifest `name` " "will be used as the display name.", min_length=3, max_length=90, ) _validate_display_name = validator("display_name", allow_reuse=True)( _validators.display_name ) visibility: Literal["public", "hidden"] = Field( "public", description="Whether this plugin should be searchable and visible in " "the built-in plugin installer and the napari hub. By default (`'public'`) " "all plugins are visible. To prevent your plugin from appearing in search " "results, change this to `'hidden'`.", ) icon: str = Field( "", description="The path to a square PNG icon of at least 128x128 pixels (256x256 " "for Retina screens). 
May be one of:\n" " - a secure (https) URL\n" " - a path relative to the manifest file, (must be shipped in the sdist)\n" " - a string in the format `{package}:{resource}`, where `package` and " "`resource` are arguments to `importlib.resources.path(package, resource)`, " "(e.g. `top_module.some_folder:my_logo.png`).", ) _validate_icon_path = validator("icon", allow_reuse=True)(_validators.icon_path) categories: List[Category] = Field( default_factory=list, description="A list of categories that this plugin belongs to. This is used to " "help users discover your plugin. Allowed values:\n" f"`[{', '.join(Category)}]`", ) # Plugins rely on certain guarantees to interoperate properly with the # plugin engine. These include the manifest specification, conventions # around python packaging, command api's, etc. Together these form a # "contract". The version of this contract is the "schema version." # # The first release of npe2 defines the first schema version. # As the contract around plugins evolves the SCHEMA_VERSION should be # increased following SemVer rules. Note that sometimes the version number # will change even though no npe2 code changes. # # The `schema_version` field declares the version of the contract that this # plugin targets. schema_version: str = Field( SCHEMA_VERSION, description="A SemVer compatible version string matching the napari plugin " "schema version that the plugin is compatible with.", always_export=True, ) # TODO: # Perhaps we should version the plugin interface (not so much the manifest, but # the actual mechanism/consumption of plugin information) independently # of napari itself on_activate: Optional[PythonName] = Field( default=None, description="Fully qualified python path to a function that will be called " "upon plugin activation (e.g. `'my_plugin.some_module:activate'`). The " "activate function can be used to connect command ids to python callables, or" " perform other side-effects.
A plugin will be 'activated' when one of its " "contributions is requested by the user (such as a widget, or reader).", ) _validate_activate_func = validator("on_activate", allow_reuse=True)( _validators.python_name ) on_deactivate: Optional[PythonName] = Field( default=None, description="Fully qualified python path to a function that will be called " "when a user deactivates a plugin (e.g. `'my_plugin.some_module:deactivate'`)" ". This is optional, and may be used to perform any plugin cleanup.", ) _validate_deactivate_func = validator("on_deactivate", allow_reuse=True)( _validators.python_name ) contributions: ContributionPoints = Field( default_factory=ContributionPoints, description="An object describing the plugin's " "[contributions](./contributions)", ) package_metadata: Optional[PackageMetadata] = Field( None, description="Package metadata following " "https://packaging.python.org/specifications/core-metadata/. " "For normal (non-dynamic) plugins, this data will come from the package's " "setup.cfg", hide_docs=True, ) npe1_shim: bool = Field( False, description="Whether this manifest was created as a shim for an npe1 plugin.", hide_docs=True, ) def __init__(self, **data): super().__init__(**data) if self.package_metadata is None and self.name: with suppress(metadata.PackageNotFoundError): meta = metadata.distribution(self.name).metadata self.package_metadata = PackageMetadata.from_dist_metadata(meta) if not self.npe1_shim: # assign plugin name on all contributions that have a private # _plugin_name field. 
for _, value in self.contributions or (): for item in value if isinstance(value, list) else [value]: if isinstance(item, Executable): item._plugin_name = self.name def __hash__(self): return hash((self.name, self.package_version)) @property def license(self) -> Optional[str]: return self.package_metadata.license if self.package_metadata else None @property def package_version(self) -> Optional[str]: return self.package_metadata.version if self.package_metadata else None @property def description(self) -> Optional[str]: return self.package_metadata.summary if self.package_metadata else None @property def author(self) -> Optional[str]: return self.package_metadata.author if self.package_metadata else None @property def is_visible(self) -> bool: return self.visibility == "public" @validator("contributions", pre=True) def _coerce_none_contributions(cls, value): return [] if value is None else value @root_validator def _validate_root(cls, values: dict) -> dict: mf_name = values.get("name") # validate schema version declared_version = Version.parse(values.get("schema_version", "")) current_version = Version.parse(SCHEMA_VERSION) if current_version < declared_version: import warnings warnings.warn( f"The declared schema version for plugin {mf_name!r} " f"({declared_version}) is newer than npe2's schema version " f"({current_version}). Things may break, you should upgrade npe2.", stacklevel=2, ) invalid_commands: List[str] = [] if values.get("contributions") is not None: invalid_commands.extend( command.id for command in values["contributions"].commands or [] if not command.id.startswith(f"{mf_name}.") ) if invalid_commands: raise ValueError( "Command identifiers must start with the current package name " f"followed by a dot: '{mf_name}'. 
The following commands do not: " f"{invalid_commands}" ) if not values.get("display_name"): values["display_name"] = mf_name return values @classmethod def from_distribution(cls, name: str) -> PluginManifest: """Return PluginManifest given a distribution (package) name. Parameters ---------- name : str Name of a python distribution installed in the environment. Note: this is the package name, not the top level module name, (e.g. "scikit-image", not "skimage"). Returns ------- PluginManifest The parsed manifest. Raises ------ ValueError If the distribution exists, but does not provide a manifest PackageNotFoundError If there is no distribution found for `name` ValidationError If the manifest is not valid """ dist = metadata.distribution(name) # may raise PackageNotFoundError pm = _from_dist(dist) if not pm: raise ValueError( f"Distribution {name!r} exists but does not provide a napari manifest" ) return pm @classmethod def discover( cls, paths: Sequence[Union[str, Path]] = () ) -> Iterator[DiscoverResults]: """Discover manifests in the environment. This function searches for installed python packages with a matching entry point group and then attempts to resolve the manifest file. The manifest file should be specified in the plugin's ``setup.cfg`` or ``setup.py`` file using the [entry point group][1]: "napari.manifest". For example, this would be the section for a plugin "npe2-tester" with "napari.yaml" as the manifest file: .. code-block:: cfg [options.entry_points] napari.manifest = npe2-tester = npe2_tester:napari.yaml The manifest file is specified relative to the submodule root path. So for the example it will be loaded from: ``<npe2_tester module root>/napari.yaml``. [1]: https://packaging.python.org/specifications/entry-points/ Parameters ---------- paths : Sequence[str], optional paths to add to sys.path while discovering. Yields ------ DiscoverResults: 3-tuples of (manifest, distribution, error), with either manifest or (distribution and error) being None. 
""" with _temporary_path_additions(paths): for dist in metadata.distributions(): try: pm = _from_dist(dist) if pm: yield DiscoverResults(pm, dist, None) except ValidationError as e: logger.warning( "Invalid schema for package %r, please run" " 'npe2 validate %s' to check for manifest errors.", dist.metadata["Name"], dist.metadata["Name"], ) yield DiscoverResults(None, dist, e) except Exception as e: logger.error( "{} -> {!r} could not be imported: {}".format( ENTRY_POINT, dist.metadata["Name"], e ) ) yield DiscoverResults(None, dist, e) @classmethod def _from_entrypoint( cls, entry_point: metadata.EntryPoint, distribution: Optional[metadata.Distribution] = None, ) -> PluginManifest: assert (match := entry_point.pattern.match(entry_point.value)) module = match.group("module") spec = util.find_spec(module or "") if not spec: # pragma: no cover raise ValueError( f"Cannot find module {module!r} declared in " f"entrypoint: {entry_point.value!r}" ) assert (match := entry_point.pattern.match(entry_point.value)) fname = match.group("attr") for loc in spec.submodule_search_locations or []: mf_file = Path(loc) / fname if mf_file.exists(): mf = PluginManifest.from_file(mf_file) if distribution is not None: meta = PackageMetadata.from_dist_metadata(distribution.metadata) mf.package_metadata = meta if mf.name != meta.name: raise ValueError( # pragma: no cover f"The name field in the manifest ({mf.name!r}) " f"must match the package name ({meta.name!r})" ) return mf raise FileNotFoundError( # pragma: no cover f"Could not find file {fname!r} in module {module!r}" ) @classmethod def _from_package_or_name( cls, package_or_filename: Union[Path, str] ) -> PluginManifest: """Internal convenience function, calls both `from_file` and `from_distribution` Parameters ---------- package_or_filename : Union[Path, str] Either a filename or a package name. Will be tried first as a filename, and then as a distribution name. 
Returns ------- PluginManifest The parsed manifest. Raises ------ ValidationError If the name can be resolved as either a distribution name or a file, but the manifest is not valid. ValueError If the name does not resolve to either a distribution name or a filename. """ from pydantic import ValidationError from npe2 import PluginManifest try: return PluginManifest.from_file(package_or_filename) except ValidationError: # pragma: no cover raise except (FileNotFoundError, ValueError): try: return PluginManifest.from_distribution(str(package_or_filename)) except ValidationError: # pragma: no cover raise except Exception as e: raise ValueError( f"Could not find manifest for {package_or_filename!r} as either a " "package name or a file." ) from e def _serialized_data(self, **kwargs): kwargs.setdefault("exclude", {"package_metadata"}) return super()._serialized_data(**kwargs) def validate_imports(self) -> None: """Checks recursively that all `python_name` fields are actually importable.""" from .utils import import_python_name errors = [] def check_pynames(m: BaseModel, loc=()): for name, value in m: if not value: continue if isinstance(value, BaseModel): return check_pynames(value, (*loc, name)) field = m.__fields__[name] if isinstance(value, list) and isinstance(field.type_, ModelMetaclass): return [check_pynames(i, (*loc, n)) for n, i in enumerate(value)] if field.outer_type_ is PythonName: try: import_python_name(value) except (ImportError, AttributeError) as e: errors.append(ErrorWrapper(e, (*loc, name))) check_pynames(self) if errors: raise ValidationError(errors, type(self)) ValidationError = ValidationError # for convenience of access def _noop(*_, **__): return [] # pragma: no cover @contextmanager def discovery_blocked(): orig = PluginManifest.discover PluginManifest.discover = _noop # type: ignore [method-assign] try: yield finally: PluginManifest.discover = orig # type: ignore [method-assign] @contextmanager def _temporary_path_additions(paths: Sequence[Union[str, 
Path]] = ()): if paths and (not isinstance(paths, Sequence) or isinstance(paths, str)): raise TypeError("paths must be a sequence of strings") # pragma: no cover for p in reversed(paths): sys.path.insert(0, str(p)) try: yield finally: for p in paths: sys.path.remove(str(p)) def _from_dist(dist: metadata.Distribution) -> Optional[PluginManifest]: """Return PluginManifest or NPE1Adapter for a metadata.Distribution object. ...depending on which entry points are available. """ _npe1, _npe2 = [], None for ep in dist.entry_points: if ep.group == NPE1_ENTRY_POINT: _npe1.append(ep) elif ep.group == ENTRY_POINT: _npe2 = ep if _npe2: return PluginManifest._from_entrypoint(_npe2, dist) elif _npe1: from ._npe1_adapter import NPE1Adapter return NPE1Adapter(dist=dist) return None if __name__ == "__main__": print(PluginManifest.schema_json(indent=2)) npe2-0.7.2/src/npe2/manifest/utils.py0000644000000000000000000002770313615410400014270 0ustar00from __future__ import annotations import re from dataclasses import dataclass from functools import total_ordering from importlib import import_module from typing import ( TYPE_CHECKING, Any, Callable, Dict, Generic, Optional, Sequence, SupportsInt, Tuple, TypeVar, Union, ) from pydantic import PrivateAttr from pydantic.generics import GenericModel from npe2.types import PythonName if TYPE_CHECKING: from npe2._command_registry import CommandRegistry from npe2.manifest.schema import PluginManifest from .contributions import ContributionPoints def v1_to_v2(path): return (path, True) if isinstance(path, list) else ([path], False) def v2_to_v1(paths, stack): if stack: return paths assert len(paths) == 1 return paths[0] R = TypeVar("R") SHIM_NAME_PREFIX = "__npe1shim__." 
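# The `v1_to_v2` / `v2_to_v1` helpers defined above adapt reader paths between
# the npe1 convention (a single path or a list of paths) and the npe2 convention
# (always a list of paths plus a `stack` flag). A minimal standalone sketch of the
# round-trip, with the two helpers copied here verbatim so the snippet runs on its own:

```python
def v1_to_v2(path):
    # npe1 readers receive either a single path or a list of paths;
    # npe2 always uses (list_of_paths, stack), where `stack` records
    # whether the original argument was a list.
    return (path, True) if isinstance(path, list) else ([path], False)


def v2_to_v1(paths, stack):
    # Invert v1_to_v2: a stacked call keeps the list; otherwise the
    # single wrapped path is unwrapped again.
    if stack:
        return paths
    assert len(paths) == 1
    return paths[0]


paths, stack = v1_to_v2("image.tif")
print(paths, stack)  # ['image.tif'] False
print(v2_to_v1(paths, stack))  # image.tif
print(v2_to_v1(*v1_to_v2(["a.tif", "b.tif"])))  # ['a.tif', 'b.tif']
```

# The round-trip is lossless in both directions, which is why npe2 can expose a
# single list-based reader API while still calling legacy npe1 readers.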
# TODO: add ParamSpec when it's supported better by mypy class Executable(GenericModel, Generic[R]): command: str # plugin_name gets populated in `PluginManifest.__init__` _plugin_name: str = PrivateAttr("") def exec( self, args: tuple = (), kwargs: Optional[dict] = None, _registry: Optional[CommandRegistry] = None, ) -> R: if kwargs is None: kwargs = {} reg = _registry or kwargs.pop("_registry", None) return self.get_callable(reg)(*args, **kwargs) def get_callable( self, _registry: Optional[CommandRegistry] = None, ) -> Callable[..., R]: if _registry is None: from npe2._plugin_manager import PluginManager _registry = PluginManager.instance().commands return _registry.get(self.command) @property def plugin_name(self) -> str: """Return the manifest/plugin name for this contribution.""" if not self._plugin_name: # we will likely never get here if the contribution was created # as a member of a PluginManifest. # But if not, we can use a couple heuristics... from importlib.metadata import distributions # look for a package name in the command id dists = sorted( (d.metadata["Name"] for d in distributions()), key=len, reverse=True ) for name in dists: if self.command.startswith(name): # pragma: no cover self._plugin_name = name break else: # just split on the first period. # Will break for package names with periods self._plugin_name = self.command.split(".")[0] return self._plugin_name @total_ordering @dataclass class Version: """A semver compatible version class. mostly vendored from python-semver (BSD-3): https://github.com/python-semver/python-semver/ """ major: SupportsInt minor: SupportsInt = 0 patch: SupportsInt = 0 prerelease: Union[bytes, str, int, None] = None build: Union[bytes, str, int, None] = None _SEMVER_PATTERN = re.compile( r""" ^ (?P<major>0|[1-9]\d*) \. (?P<minor>0|[1-9]\d*) \. (?P<patch>0|[1-9]\d*) (?:-(?P<prerelease> (?:0|[1-9]\d*|\d*[a-zA-Z-][0-9a-zA-Z-]*) (?:\.(?:0|[1-9]\d*|\d*[a-zA-Z-][0-9a-zA-Z-]*))* ))? (?:\+(?P<build> [0-9a-zA-Z-]+ (?:\.[0-9a-zA-Z-]+)* ))? 
$ """, re.VERBOSE, ) @classmethod def parse(cls, version: Union[bytes, str]) -> Version: """Convert string or bytes into Version object.""" if isinstance(version, bytes): version = version.decode("UTF-8") match = cls._SEMVER_PATTERN.match(version) if match is None: raise ValueError(f"{version} is not valid SemVer string") matched_version_parts: Dict[str, Any] = match.groupdict() return cls(**matched_version_parts) # NOTE: we're only comparing the numeric parts for now. # ALSO: the rest of the comparators come from functools.total_ordering def __eq__(self, other) -> bool: return self.to_tuple()[:3] == self._from_obj(other).to_tuple()[:3] def __lt__(self, other) -> bool: return self.to_tuple()[:3] < self._from_obj(other).to_tuple()[:3] @classmethod def _from_obj(cls, other): if isinstance(other, (str, bytes)): other = Version.parse(other) elif isinstance(other, dict): other = Version(**other) elif isinstance(other, (tuple, list)): other = Version(*other) elif not isinstance(other, Version): raise TypeError( f"Expected str, bytes, dict, tuple, list, or {cls} instance, " f"but got {type(other)}" ) return other def to_tuple(self) -> Tuple[int, int, int, Optional[str], Optional[str]]: """Return version as tuple (first three are int, last two Opt[str]).""" return ( int(self.major), int(self.minor), int(self.patch), str(self.prerelease) if self.prerelease is not None else None, str(self.build) if self.build is not None else None, ) def __iter__(self): yield from self.to_tuple() def __str__(self) -> str: v = f"{self.major}.{self.minor}.{self.patch}" if self.prerelease: # pragma: no cover v += str(self.prerelease) if self.build: # pragma: no cover v += str(self.build) return v def _import_npe1_shim(shim_name: str) -> Any: """Import npe1 shimmed python_name Some objects returned by npe1 hooks (such as locally defined partials or other objects) don't have globally accessible python names. 
In such cases, we create a "shim" python_name of the form: `__npe1shim__.<python_name>_<index>` The implication is that the hook should be imported, called, and indexed to return the corresponding item in the hook results. Parameters ---------- shim_name : str A string in the form `__npe1shim__.<python_name>_<index>` Returns ------- Any The `<index>`th object returned from the callable `<python_name>`. Raises ------ IndexError If ``len(<python_name>()) <= <index>`` """ assert shim_name.startswith(SHIM_NAME_PREFIX), f"Invalid shim name: {shim_name}" python_name, idx = shim_name[13:].rsplit("_", maxsplit=1) # TODO, make a function index = int(idx) hook = import_python_name(python_name) result = hook() if isinstance(result, dict): # things like sample_data hookspec return a dict, in which case we want the # "idxth" item in the dict (assumes ordered dict, which is safe now) result = list(result.values()) if not isinstance(result, list): result = [result] # pragma: no cover try: out = result[index] except IndexError as e: # pragma: no cover raise IndexError(f"invalid npe1 shim index {index} for hook {hook}") from e if "dock_widget" in python_name and isinstance(out, tuple): return out[0] if "sample_data" in python_name and isinstance(out, dict): # this was a nested sample data return out.get("data") return out def import_python_name(python_name: Union[PythonName, str]) -> Any: from . 
import _validators if python_name.startswith(SHIM_NAME_PREFIX): return _import_npe1_shim(python_name) _validators.python_name(python_name) # shows the best error message match = _validators.PYTHON_NAME_PATTERN.match(python_name) module_name, funcname = match.groups() # type: ignore [union-attr] mod = import_module(module_name) return getattr(mod, funcname) def deep_update(dct: dict, merge_dct: dict, copy=True) -> dict: """Merge possibly nested dicts""" from copy import deepcopy _dct = deepcopy(dct) if copy else dct for k, v in merge_dct.items(): if k in _dct and isinstance(dct[k], dict) and isinstance(v, dict): deep_update(_dct[k], v, copy=False) elif isinstance(v, list): if k not in _dct: _dct[k] = [] _dct[k].extend(v) else: _dct[k] = v return _dct def merge_manifests( manifests: Sequence[PluginManifest], overwrite=False ) -> PluginManifest: """Merge a sequence of PluginManifests into a single PluginManifest.""" from npe2.manifest.schema import PluginManifest if not manifests: raise ValueError("Cannot merge empty sequence of manifests") if len(manifests) == 1: return manifests[0] assert len({mf.name for mf in manifests}) == 1, "All manifests must have same name" assert ( len({mf.package_version for mf in manifests}) == 1 ), "All manifests must have same version" if not overwrite: assert ( len({mf.display_name for mf in manifests}) == 1 ), "All manifests must have same display_name" mf0 = manifests[0] info = mf0.dict(exclude={"contributions"}, exclude_unset=True) info["contributions"] = merge_contributions( [m.contributions for m in manifests], overwrite=overwrite ) return PluginManifest(**info) # TODO: refactor this ugly thing def merge_contributions( contribs: Sequence[Optional[ContributionPoints]], overwrite=False ) -> dict: """Merge a sequence of contribution points in a single dict. Parameters ---------- contribs : Sequence[Optional[ContributionPoints]] A sequence of contribution points. None values are ignored. 
overwrite : bool, optional If `True`, when existing command id's are encountered, they overwrite the previous command. By default (`False`), the command id is incremented until it is unique. Returns ------- dict Kwargs that can be passed to `ContributionPoints(**kwargs)` """ _contribs = [c for c in contribs if c and c.dict(exclude_unset=True)] if not _contribs: return {} # pragma: no cover out_dict = _contribs[0].dict(exclude_unset=True) if len(_contribs) <= 1: # no need to merge a single contribution return out_dict # pragma: no cover for ctrb in _contribs[1:]: _renames = {} existing_cmds = {c["id"] for c in out_dict.get("commands", {})} new_ctrb_dict = ctrb.dict(exclude_unset=True) for cmd in list(new_ctrb_dict.get("commands", ())): cmd_id = cmd["id"] if cmd_id in existing_cmds: if overwrite: # remove existing command new_ctrb_dict["commands"].remove(cmd) else: # if we're not overwriting, we need to rename the command # to avoid collisions i = 1 while cmd_id in existing_cmds: if i != 1: cmd_id = cmd_id[:-2] i += 1 cmd_id = f"{cmd_id}_{i}" _renames[cmd["id"]] = cmd_id cmd["id"] = cmd_id for key, val in new_ctrb_dict.items(): if isinstance(val, list): for item in val: if "command" in item: cmd_id = item["command"] if cmd_id in _renames: cmd_id = _renames[cmd_id] item["command"] = cmd_id if overwrite: for existing_item in list(out_dict.get(key, [])): if all(item[k] == existing_item[k] for k in item): out_dict[key].remove(existing_item) out_dict = deep_update(out_dict, new_ctrb_dict) return out_dict def safe_key(key: str) -> str: """Remove parentheses and brackets from a string.""" key = re.sub(r"[ -]", "_", key.lower()) return re.sub(r"[\[\]\(\)]", "", key) npe2-0.7.2/src/npe2/manifest/contributions/__init__.py0000644000000000000000000000175313615410400017566 0ustar00from ._commands import CommandContribution from ._configuration import ConfigurationContribution, ConfigurationProperty from ._contributions import ContributionPoints from ._menus import MenuCommand, 
MenuItem, Submenu from ._readers import ReaderContribution from ._sample_data import SampleDataContribution, SampleDataGenerator, SampleDataURI from ._submenu import SubmenuContribution from ._themes import ThemeColors, ThemeContribution from ._widgets import WidgetContribution from ._writers import LayerType, LayerTypeConstraint, WriterContribution __all__ = [ "CommandContribution", "ConfigurationContribution", "ConfigurationProperty", "ContributionPoints", "LayerType", "LayerTypeConstraint", "MenuCommand", "MenuItem", "ReaderContribution", "SampleDataContribution", "SampleDataGenerator", "SampleDataURI", "Submenu", "SubmenuContribution", "ThemeColors", "ThemeContribution", "WidgetContribution", "WriterContribution", ] npe2-0.7.2/src/npe2/manifest/contributions/_commands.py0000644000000000000000000001033513615410400017763 0ustar00from typing import TYPE_CHECKING, Any, Optional, Union from pydantic import BaseModel, Extra, Field, validator from npe2.manifest import _validators from npe2.types import PythonName from ._icon import Icon if TYPE_CHECKING: from npe2._command_registry import CommandRegistry class CommandContribution(BaseModel): """Contribute a **command** (a python callable) consisting of a unique `id`, a `title` and (optionally) a `python_path` that points to a fully qualified python callable. If a `python_path` is not included in the manifest, it *must* be registered during activation with `register_command`. Note, some other contributions (e.g. `readers`, `writers` and `widgets`) will *point* to a specific command. The command itself (i.e. the callable python object) will always appear in the `contributions.commands` section, but those contribution types may add additional contribution-specific metadata. ```{admonition} Future Plans Command contributions will eventually include an **icon**, **category**, and **enabled** state. 
Enablement is expressed with *when clauses* that capture a conditional expression determining whether the command should be enabled or not, based on the current state of the program. (i.e. "*If the active layer is a `Labels` layer*") Commands will eventually be available in a Command Palette (accessible with a hotkey) but they can also show in other menus. ``` """ id: str = Field( ..., description="A unique identifier used to reference this command. While this may" " look like a python fully qualified name this does *not* refer to a python " "object; this identifier is specific to napari. It must begin with " "the name of the package, and include only alphanumeric characters, plus " "dashes and underscores.", ) _valid_id = validator("id", allow_reuse=True)(_validators.command_id) title: str = Field( ..., description="User-facing title representing the command. This might be used, " "for example, when searching in a command palette. Examples: 'Generate lily " "sample', 'Read tiff image', 'Open gaussian blur widget'. ", ) python_name: Optional[PythonName] = Field( None, description="Fully qualified name to a callable python object " "implementing this command. This usually takes the form of " "`{obj.__module__}:{obj.__qualname__}` " "(e.g. `my_package.a_module:some_function`)", ) _valid_pyname = validator("python_name", allow_reuse=True)(_validators.python_name) short_title: Optional[str] = Field( None, description="Short title by which the command is represented in " "the UI. Menus pick either `title` or `short_title` depending on the context " "in which they show commands.", ) category: Optional[str] = Field( None, description="Category string by which the command may be grouped in the UI.", ) icon: Optional[Union[str, Icon]] = Field( None, description="Icon used to represent this command in the UI, on " "buttons or in menus. 
These may be [superqt](https://github.com/napari/superqt)" " fonticon keys, such as `'fa6s.arrow_down'`; though note that plugins are " "expected to depend on any fonticon libraries they use, e.g. " "[fonticon-fontawesome6](https://github.com/tlambert03/fonticon-fontawesome6).", ) enablement: Optional[str] = Field( None, description=( "Expression which must evaluate as true to enable the command in the UI " "(menu and keybindings). Does not prevent executing the command " "by other means, like the `execute_command` api." ), ) class Config: extra = Extra.forbid def exec( self, args: tuple = (), kwargs: Optional[dict] = None, _registry: Optional["CommandRegistry"] = None, ) -> Any: if kwargs is None: kwargs = {} if _registry is None: from npe2._plugin_manager import PluginManager _registry = PluginManager.instance().commands return _registry.execute(self.id, args, kwargs) npe2-0.7.2/src/npe2/manifest/contributions/_configuration.py0000644000000000000000000001456413615410400021041 0ustar00from typing import Any, Dict, List, Literal, Optional, Union from pydantic import BaseModel, Field, conlist, root_validator, validator from ._json_schema import ( Draft07JsonSchema, JsonType, JsonTypeArray, ValidationError, _coerce_type_name, ) class ConfigurationProperty(Draft07JsonSchema): """Configuration for a single property in the plugin settings. This is a super-set of the JSON Schema (draft 7) specification. https://json-schema.org/understanding-json-schema/reference/index.html """ type: Union[JsonType, JsonTypeArray] = Field( None, description="The type of this variable. Either JSON Schema type names ('array'," " 'boolean', 'object', ...) or python type names ('list', 'bool', 'dict', ...) " "may be used, but they will be coerced to JSON Schema types. Numbers, strings, " "and booleans will be editable in the UI, other types (lists, dicts) *may* be " "editable in the UI depending on their content, but may only be editable " "as text in the napari settings file. 
For boolean entries, the description " "(or markdownDescription) will be used as the label for the checkbox.", ) _coerce_type_name = validator("type", pre=True, allow_reuse=True)(_coerce_type_name) default: Any = Field(None, description="The default value for this property.") description: Optional[str] = Field( None, description="Your `description` appears after the title and before the input " "field, except for booleans, where the description is used as the label for " "the checkbox. See also `markdown_description`.", ) description_format: Literal["markdown", "plain"] = Field( "markdown", description="By default (`markdown`) your `description`, will be parsed " "as CommonMark (with `markdown_it`) and rendered as rich text. To render as " "plain text, set this value to `plain`.", ) enum: Optional[conlist(Any, min_items=1, unique_items=True)] = Field( # type: ignore # noqa: E501 None, description="A list of valid options for this field. If you provide this field," "the settings UI will render a dropdown menu.", ) enum_descriptions: List[str] = Field( default_factory=list, description="If you provide a list of items under the `enum` field, you may " "provide `enum_descriptions` to add descriptive text for each enum.", ) enum_descriptions_format: Literal["markdown", "plain"] = Field( "markdown", description="By default (`markdown`) your `enum_description`s, will be parsed " "as CommonMark (with `markdown_it`) and rendered as rich text. To render as " "plain text, set this value to `plain`.", ) deprecation_message: Optional[str] = Field( None, description="If you set deprecationMessage, the setting will get a warning " "underline with your specified message. It won't show up in the settings " "UI unless it is configured by the user.", ) deprecation_message_format: Literal["markdown", "plain"] = Field( "markdown", description="By default (`markdown`) your `deprecation_message`, will be " "parsed as CommonMark (with `markdown_it`) and rendered as rich text. 
To " "render as plain text, set this value to `plain`.", ) edit_presentation: Literal["singleline", "multiline"] = Field( "singleline", description="By default, string settings will be rendered with a single-line " "editor. To render with a multi-line editor, set this value to `multiline`.", ) order: Optional[int] = Field( None, description="When specified, gives the order of this setting relative to other " "settings within the same category. Settings with an order property will be " "placed before settings without this property set; and settings without `order`" " will be placed in alphabetical order.", ) pattern_error_message: Optional[str] = Field( None, description="When restricting string types to a given regular expression with " "the `pattern` field, this field may be used to provide a custom error when " "the pattern does not match.", ) @root_validator(pre=True) def _validate_root(cls, values): values = super()._validate_root(values) # we don't allow $ref and/or $defs in the schema for ignored in {"$ref", "ref", "definition", "$def"}: if ignored in values: import warnings del values[ignored] warnings.warn( f"ignoring {ignored} in configuration property. " "Configuration schemas must be self-contained.", stacklevel=2, ) return values def validate_instance(self, instance: object) -> dict: """Validate an object (instance) against this schema.""" try: return super().validate_instance(instance) except ValidationError as e: if e.validator == "pattern" and self.pattern_error_message: e.message = self.pattern_error_message raise e class ConfigurationContribution(BaseModel): """A configuration contribution for a plugin. This enables plugins to provide a schema for their configurables. Configuration contributions are used to generate the settings UI. """ title: str = Field( ..., description="The heading used for this configuration category. 
Words like " '"Plugin", "Configuration", and "Settings" are redundant and should not be' "used in your title.", ) properties: Dict[str, ConfigurationProperty] = Field( ..., description="Configuration properties. In the settings UI, your configuration " "key will be used to namespace and construct a title. Though a plugin can " "contain multiple categories of settings, each plugin setting must still have " "its own unique key. Capital letters in your key are used to indicate word " "breaks. For example, if your key is 'gitMagic.blame.dateFormat', the " "generated title for the setting will look like 'Blame: Date Format'", ) # order: int # vscode uses this to sort multiple configurations # ... I think we can just use the order in which they are declared npe2-0.7.2/src/npe2/manifest/contributions/_contributions.py0000644000000000000000000000416213615410400021065 0ustar00from typing import Dict, List, Optional from pydantic import BaseModel, Field, validator from ._commands import CommandContribution from ._configuration import ConfigurationContribution from ._keybindings import KeyBindingContribution from ._menus import MenuItem from ._readers import ReaderContribution from ._sample_data import SampleDataContribution, SampleDataGenerator, SampleDataURI from ._submenu import SubmenuContribution from ._themes import ThemeContribution from ._widgets import WidgetContribution from ._writers import WriterContribution __all__ = [ "ContributionPoints", "CommandContribution", "KeyBindingContribution", "MenuItem", "ReaderContribution", "SampleDataContribution", "SubmenuContribution", "ThemeContribution", "WidgetContribution", "WriterContribution", "SampleDataGenerator", "SampleDataURI", ] class ContributionPoints(BaseModel): commands: Optional[List[CommandContribution]] readers: Optional[List[ReaderContribution]] writers: Optional[List[WriterContribution]] widgets: Optional[List[WidgetContribution]] sample_data: Optional[List[SampleDataContribution]] themes: 
Optional[List[ThemeContribution]] menus: Dict[str, List[MenuItem]] = Field(default_factory=dict, hide_docs=True) submenus: Optional[List[SubmenuContribution]] = Field(None, hide_docs=True) keybindings: Optional[List[KeyBindingContribution]] = Field(None, hide_docs=True) configuration: List[ConfigurationContribution] = Field( default_factory=list, hide_docs=True, description="Configuration options for this plugin. " "This section can either be a single object, representing a single category of " "settings, or an array of objects, representing multiple categories of " "settings. If there are multiple categories of settings, the Settings editor " "will show a submenu in the table of contents for that extension, and the title " "keys will be used for the submenu entry names.", ) @validator("configuration", pre=True) def _to_list(cls, v): return v if isinstance(v, list) else [v] npe2-0.7.2/src/npe2/manifest/contributions/_icon.py0000644000000000000000000000022413615410400017106 0ustar00from typing import Optional from pydantic import BaseModel class Icon(BaseModel): light: Optional[str] = None dark: Optional[str] = None npe2-0.7.2/src/npe2/manifest/contributions/_json_schema.py0000644000000000000000000002526613615410400020464 0ustar00from __future__ import annotations from typing import TYPE_CHECKING, Any, Dict, List, Literal, Optional, Type, Union from pydantic import BaseModel, Field, PrivateAttr, conlist, root_validator, validator if TYPE_CHECKING: from jsonschema.exceptions import ValidationError from jsonschema.protocols import Validator else: try: from jsonschema.exceptions import ValidationError except ImportError: ValidationError = Exception __all__ = [ "ValidationError", "Draft04JsonSchema", "Draft06JsonSchema", "Draft07JsonSchema", ] JsonType = Literal["array", "boolean", "integer", "null", "number", "object", "string"] JsonTypeArray = conlist(JsonType, min_items=1, unique_items=True) StringArrayMin1 = conlist(str, unique_items=True, min_items=1) StringArray = 
conlist(str, unique_items=True) PY_NAME_TO_JSON_NAME = { "list": "array", "bool": "boolean", "int": "integer", "float": "number", "dict": "object", "str": "string", "NoneType": "null", "None": "null", } def _to_json_type(type_: Union[str, Type]) -> JsonType: if isinstance(type_, type): type_ = type_.__name__ type_ = str(type_).lower() return PY_NAME_TO_JSON_NAME.get(type_, type_) # type: ignore # (validated later) def _coerce_type_name(v): """Coerce python type names to json schema type names.""" if isinstance(v, list): return [_to_json_type(t) for t in v] return _to_json_type(v) def _to_camel(string: str) -> str: words = string.split("_") return words[0] + "".join(w.capitalize() for w in words[1:]) _CONSTRAINT_FIELDS = { "exclusive_minimum", "minimum", "exclusive_maximum", "maximum", "multiple_of", "min_items", "max_items", "min_length", "max_length", "pattern", } _python_equivalent: Dict[Optional[str], Type] = { "array": list, "boolean": bool, "integer": int, "null": type(None), "number": float, "object": dict, "string": str, None: object, } class _JsonSchemaBase(BaseModel): class Config: alias_generator = _to_camel allow_population_by_field_name = True # underscore here to avoid name collision with pydantic's `schema` method schema_: Optional[str] = Field(None, alias="$schema") title: Optional[str] = Field(None) description: Optional[str] = Field(None) default: Any = Field(None) multiple_of: Optional[float] = Field(None, ge=0) maximum: Optional[float] = Field(None) minimum: Optional[float] = Field(None) max_length: Optional[int] = Field(None, ge=0) min_length: Optional[int] = Field(0, ge=0) # could be Pattern. 
but it's easier to work with as str pattern: Optional[str] = Field(None) max_items: Optional[int] = Field(None, ge=0) min_items: Optional[int] = Field(0, ge=0) unique_items: bool = Field(False) max_properties: Optional[int] = Field(None, ge=0) min_properties: Optional[int] = Field(0, ge=0) enum: Optional[conlist(Any, min_items=1, unique_items=True)] = Field(None) # type: ignore # noqa type: Union[JsonType, JsonTypeArray] = Field(None) # type: ignore format: Optional[str] = Field(None) _json_validator: Type[Validator] = PrivateAttr() # these will be redefined in subclasses with specific subschema types # just here for type-checking in the methods of this base class if TYPE_CHECKING: items: Any properties: Any all_of: Any any_of: Any one_of: Any _coerce_type_name = validator("type", pre=True, allow_reuse=True)(_coerce_type_name) @root_validator(pre=True) def _validate_root(cls, values: Dict[str, Any]) -> Any: if "type" not in values: if "properties" in values: values["type"] = "object" elif "items" in values: values["type"] = "array" # Get around pydantic bug wherein `Optional[conlists]`` throw a # 'NoneType' object is not iterable error if `None` is provided in init. 
for conlists in ("enum", "required"): if conlists in values and not values[conlists]: values.pop(conlists) return values @property def has_constraint(self) -> bool: """Return True if this schema has any constraints.""" return bool(_CONSTRAINT_FIELDS & self.__fields_set__) @property def has_default(self) -> bool: """Return True if the schema has a default value.""" return "default" in self.__fields_set__ @property def python_type(self) -> Union[Type, List[Type]]: """Return Python type equivalent(s) for this schema (JSON) type.""" if isinstance(self.type, list): return [_python_equivalent[t] for t in self.type] else: return _python_equivalent[self.type] @property def is_array(self) -> bool: """Return True if this schema is an array schema.""" return self.items is not None or self.type == "array" @property def is_object(self) -> bool: """Return True if this schema is an object schema.""" return ( self.properties is not None or self.type == "object" and not self.all_of and not self.one_of and not self.any_of and not getattr(self, "ref", False) # draft 6+ ) @property def json_validator(self) -> Type[Validator]: """Return jsonschema validator class for this schema. See also `validate_instance`. 
""" if not hasattr(self, "_json_validator"): from jsonschema.validators import validator_for schema = self.dict(by_alias=True, exclude_unset=True) schema["$schema"] = self.schema_ cls = validator_for(schema) cls.check_schema(schema) self._json_validator = cls(schema) return self._json_validator def validate_instance(self, instance: Any) -> dict: """Validate an object (instance) against this schema.""" from jsonschema.exceptions import best_match error: ValidationError = best_match(self.json_validator.iter_errors(instance)) if error is not None: raise error return instance class Draft04JsonSchema(_JsonSchemaBase): """Model for Draft 4 JSON Schema.""" schema_: str = Field("http://json-schema.org/draft-04/schema#", alias="$schema") id: Optional[str] = Field(None) exclusive_maximum: Optional[bool] = Field(None) exclusive_minimum: Optional[bool] = Field(None) required: Optional[StringArrayMin1] = Field(None) # type: ignore dependencies: Optional[ # type: ignore Dict[str, Union[Draft04JsonSchema, StringArrayMin1]] ] = Field(None) # common to all schemas (could go in _JsonSchemaBase) # except we need the self-referrential type to be this class additional_items: Union[bool, Draft04JsonSchema, None] = Field(None) items: Union[Draft04JsonSchema, List[Draft04JsonSchema], None] = Field(None) additional_properties: Union[bool, Draft04JsonSchema, None] = Field(None) definitions: Optional[Dict[str, Draft04JsonSchema]] = Field(None) properties: Optional[Dict[str, Draft04JsonSchema]] = Field(None) pattern_properties: Optional[Dict[str, Draft04JsonSchema]] = Field(None) all_of: Optional[List[Draft04JsonSchema]] = Field(None) any_of: Optional[List[Draft04JsonSchema]] = Field(None) one_of: Optional[List[Draft04JsonSchema]] = Field(None) not_: Optional[Draft04JsonSchema] = Field(None, alias="not") class _Draft06JsonSchema(_JsonSchemaBase): id: Optional[str] = Field(None, alias="$id") # ref: Optional[str] = Field(None, alias="$ref") examples: Optional[List[Any]] = Field(None) 
exclusive_maximum: Optional[float] = Field(None) exclusive_minimum: Optional[float] = Field(None) contains: Optional[Draft06JsonSchema] = Field(None) required: Optional[StringArray] = Field(None) # type: ignore dependencies: Optional[ # type: ignore Dict[str, Union[Draft06JsonSchema, StringArray]] ] = Field(None) property_names: Optional[Draft06JsonSchema] = Field(None) const: Any = Field(None) class Draft06JsonSchema(_Draft06JsonSchema): """Model for Draft 6 JSON Schema.""" schema_: str = Field("http://json-schema.org/draft-06/schema#", alias="$schema") # common to all schemas (could go in _JsonSchemaBase) # except we need the self-referrential type to be this class # and... technically, all subschemas may also be booleans as of Draft 6, # not just additional_properties and additional_items additional_items: Union[bool, Draft06JsonSchema, None] = Field(None) items: Union[Draft06JsonSchema, List[Draft06JsonSchema], None] = Field(None) additional_properties: Union[bool, Draft06JsonSchema, None] = Field(None) # definitions: Optional[Dict[str, Draft06JsonSchema]] = Field(None) properties: Optional[Dict[str, Draft06JsonSchema]] = Field(None) pattern_properties: Optional[Dict[str, Draft06JsonSchema]] = Field(None) all_of: Optional[List[Draft06JsonSchema]] = Field(None) any_of: Optional[List[Draft06JsonSchema]] = Field(None) one_of: Optional[List[Draft06JsonSchema]] = Field(None) not_: Optional[Draft06JsonSchema] = Field(None, alias="not") class Draft07JsonSchema(_Draft06JsonSchema): """Model for Draft 7 JSON Schema.""" schema_: str = Field("http://json-schema.org/draft-07/schema#", alias="$schema") comment: Optional[str] = Field(None, alias="$comment") read_only: bool = Field(False) write_only: bool = Field(False) content_media_type: Optional[str] = Field(None) content_encoding: Optional[str] = Field(None) if_: Optional[Draft07JsonSchema] = Field(None, alias="if") then: Optional[Draft07JsonSchema] = Field(None) else_: Optional[Draft07JsonSchema] = Field(None, 
alias="else") # common to all schemas (could go in _JsonSchemaBase) # except we need the self-referrential type to be this class # and... technically, all subschemas may also be booleans as of Draft 6, # not just additional_properties and additional_items additional_items: Union[bool, Draft07JsonSchema, None] = Field(None) items: Union[Draft07JsonSchema, List[Draft07JsonSchema], None] = Field(None) additional_properties: Union[bool, Draft07JsonSchema, None] = Field(None) # definitions: Optional[Dict[str, Draft07JsonSchema]] = Field(None) properties: Optional[Dict[str, Draft07JsonSchema]] = Field(None) pattern_properties: Optional[Dict[str, Draft07JsonSchema]] = Field(None) all_of: Optional[List[Draft07JsonSchema]] = Field(None) any_of: Optional[List[Draft07JsonSchema]] = Field(None) one_of: Optional[List[Draft07JsonSchema]] = Field(None) not_: Optional[Draft07JsonSchema] = Field(None, alias="not") npe2-0.7.2/src/npe2/manifest/contributions/_keybindings.py0000644000000000000000000000167513615410400020477 0ustar00from typing import Optional from pydantic.fields import Field from npe2.manifest.utils import Executable class KeyBindingContribution(Executable): command: str = Field( description="Identifier of the command to run when keybinding is triggered." ) # the existence of the command is not validated at registration-time, # but rather at call time... (since commands from other extensions can be called) key: str = Field( description="Key or key sequence (separate simultaneous key presses with " "a plus-sign e.g. Ctrl+O and sequences with a space e.g. Ctrl+L L for a chord)." 
) mac: Optional[str] = Field(description="Mac specific key or key sequence.") linux: Optional[str] = Field(description="Linux specific key or key sequence.") win: Optional[str] = Field(description="Windows specific key or key sequence.") when: Optional[str] = Field(description="Condition when the key is active.") npe2-0.7.2/src/npe2/manifest/contributions/_menus.py0000644000000000000000000000425513615410400017315 0ustar00from typing import Optional, Union from pydantic import BaseModel, Field from npe2.manifest.utils import Executable # user provides this class _MenuItem(BaseModel): """Generic menu item contribution.""" when: Optional[str] = Field( description="Condition which must be true to *show* this item in the menu. " "Note that ``when`` clauses apply to menus and ``enablement`` clauses to " "commands. The ``enablement`` applies to all menus and even keybindings while " "the ``when`` only applies to a single menu." ) # TODO: declare groups for every menu exposed by napari: # e.g. `2_compare`, `4_search`, `6_cutcopypaste` group: Optional[str] = Field( description="The `group` property defines sorting and grouping of menu items. " "The `'navigation'` group is special: it will always be sorted to the " "top/beginning of a menu. By default, the order *inside* a group depends on " "the `title`. The group-local order of a menu item can be specified by " "appending @ to the group identifier: e.g. `group: 'myGroup@2'`." ) class Submenu(_MenuItem): """Contributes a submenu placement in a menu.""" submenu: str = Field( ..., description="Identifier of the submenu to display in this item." "The submenu must be declared in the 'submenus' -section", ) # if submenu doesn't exist, you get: # Menu item references a submenu ...` which is not defined in the 'submenus' section class MenuCommand(_MenuItem, Executable): """Contributes a command in a menu.""" command: str = Field( ..., description="Identifier of the command to execute. 
" "The command must be declared in the 'commands' section", ) # if command doesn't exist, you get: # "Menu item references a command `...` which is not defined in the # 'commands' section." alt: Optional[str] = Field( description="Identifier of an alternative command to execute. " "It will be shown and invoked when pressing Alt while opening a menu." "The command must be declared in the 'commands' section" ) MenuItem = Union[MenuCommand, Submenu] npe2-0.7.2/src/npe2/manifest/contributions/_readers.py0000644000000000000000000000423113615410400017605 0ustar00from functools import wraps from typing import List, Optional from pydantic import Extra, Field from npe2.manifest.utils import Executable, v2_to_v1 from npe2.types import ReaderFunction class ReaderContribution(Executable[Optional[ReaderFunction]]): """Contribute a file reader. Readers may be associated with specific **filename_patterns** (e.g. "*.tif", "*.zip") and are invoked whenever `viewer.open('some/path')` is used on the command line, or when a user opens a file in the graphical user interface by dropping a file into the canvas, or using `File -> Open...` """ command: str = Field( ..., description="Identifier of the command providing `napari_get_reader`." ) filename_patterns: List[str] = Field( ..., description="List of filename patterns (for fnmatch) that this reader can " "accept. Reader will be tried only if `fnmatch(filename, pattern) == True`. " "Use `['*']` to match all filenames.", ) accepts_directories: bool = Field( False, description="Whether this reader accepts directories" ) class Config: extra = Extra.forbid def __hash__(self): return hash( (self.command, tuple(self.filename_patterns), self.accepts_directories) ) def exec(self, *, kwargs): """ We are trying to simplify internal npe2 logic to always deal with a (list[str], bool) pair instead of Union[PathLike, Seq[Pathlike]]. We thus wrap the Reader Contributions to still give them the old api. 
Later on we could add a "if manifest.version == 2" or similar to not have this backward-compatibility logic for new plugins. """ kwargs = kwargs.copy() stack = kwargs.pop("stack", None) assert stack is not None kwargs["path"] = v2_to_v1(kwargs["path"], stack) callable_ = super().exec(kwargs=kwargs) if callable_ is None: # pragma: no cover return None @wraps(callable_) def npe1_compat(paths, *, stack): path = v2_to_v1(paths, stack) return callable_(path) # type: ignore return npe1_compat npe2-0.7.2/src/npe2/manifest/contributions/_sample_data.py0000644000000000000000000000447213615410400020441 0ustar00from abc import ABC, abstractmethod from typing import TYPE_CHECKING, List, Optional, Union from pydantic.fields import Field from pydantic.generics import GenericModel from npe2.manifest.utils import Executable from npe2.types import LayerData if TYPE_CHECKING: from npe2._command_registry import CommandRegistry class _SampleDataContribution(GenericModel, ABC): """Contribute sample data for use in napari. Sample data can take the form of a **command** that returns layer data, or a simple path or **uri** to a local or remote resource (assuming there is a reader plugin capable of reading that path/URI). """ key: str = Field(..., description="A unique key to identify this sample.") display_name: str = Field( ..., description="String to show in the UI when referring to this sample" ) @abstractmethod def open( self, *args, _registry: Optional["CommandRegistry"] = None, **kwargs ) -> List[LayerData]: ... class SampleDataGenerator(_SampleDataContribution, Executable[List[LayerData]]): """Contribute a callable command that creates data on demand.""" command: str = Field( ..., description="Identifier of a command that returns layer data tuple." 
) def open( self, *args, _registry: Optional["CommandRegistry"] = None, **kwargs ) -> List[LayerData]: return self.exec(args, kwargs, _registry=_registry) class Config: title = "Sample Data Function" class SampleDataURI(_SampleDataContribution): """Contribute a URI to static local or remote data. This can be data included in the plugin package, or a URL to remote data. The URI must be readable by either napari's builtin reader, or by a plugin that is included/required.""" uri: str = Field( ..., description="Path or URL to a data resource. " "This URI should be a valid input to `io_utils.read`", ) reader_plugin: Optional[str] = Field( None, description="Name of plugin to use to open URI", ) def open(self, *args, **kwargs) -> List[LayerData]: from npe2.io_utils import read return read([self.uri], plugin_name=self.reader_plugin, stack=False) class Config: title = "Sample Data URI" SampleDataContribution = Union[SampleDataGenerator, SampleDataURI] npe2-0.7.2/src/npe2/manifest/contributions/_submenu.py0000644000000000000000000000125013615410400017634 0ustar00from typing import Optional, Union from pydantic import BaseModel from pydantic.fields import Field from ._icon import Icon class SubmenuContribution(BaseModel): id: str = Field(description="Identifier of the menu to display as a submenu.") label: str = Field( description="The label of the menu item which leads to this submenu." ) icon: Optional[Union[str, Icon]] = Field( None, description=( "(Optional) Icon which is used to represent the command in the UI." " Either a file path, an object with file paths for dark and light" "themes, or a theme icon references, like `$(zap)`" ), ) npe2-0.7.2/src/npe2/manifest/contributions/_themes.py0000644000000000000000000000472313615410400017453 0ustar00from typing import Literal, Optional, Union from pydantic import BaseModel, color from pydantic.fields import Field # pydantic doesn't implement color equality? 
class Color(color.Color): def __eq__(self, __o: object) -> bool: if not isinstance(__o, color.Color): return False # pragma: no cover return self.as_rgb_tuple() == __o.as_rgb_tuple() class ThemeColors(BaseModel): canvas: Optional[Color] console: Optional[Color] background: Optional[Color] foreground: Optional[Color] primary: Optional[Color] secondary: Optional[Color] highlight: Optional[Color] text: Optional[Color] icon: Optional[Color] warning: Optional[Color] current: Optional[Color] _color_keys = ", ".join([f"`{k}`" for k in ThemeColors.__fields__]) _color_args = """ - name: `"Black"`, `"azure"` - hexadecimal value: `"0x000"`, `"#FFFFFF"`, `"7fffd4"` - RGB/RGBA tuples: `(255, 255, 255)`, `(255, 255, 255, 0.5)` - RGB/RGBA strings: `"rgb(255, 255, 255)"`, `"rgba(255, 255, 255, 0.5)`" - HSL strings: "`hsl(270, 60%, 70%)"`, `"hsl(270, 60%, 70%, .5)`" """ class ThemeContribution(BaseModel): """Contribute a color theme to napari. You must specify an **id**, **label**, and whether the theme is a dark theme or a light theme **type** (such that the rest of napari changes to match your theme). Any color keys omitted from the theme contribution will use the default napari dark/light theme colors. """ # TODO: do we need both id and label? id: str = Field( description="Identifier of the color theme as used in the user settings." ) label: str = Field(description="Label of the color theme as shown in the UI.") type: Union[Literal["dark"], Literal["light"]] = Field( description="Base theme type, used for icons and filling in unprovided colors. " "Must be either `'dark'` or `'light'`." ) syntax_style: Optional[str] colors: ThemeColors = Field( description=f"Theme colors. Valid keys include: {_color_keys}. All keys " "are optional. 
Color values can be defined via:\n" ' - name: `"Black"`, `"azure"`\n' ' - hexadecimal value: `"0x000"`, `"#FFFFFF"`, `"7fffd4"`\n' " - RGB/RGBA tuples: `(255, 255, 255)`, `(255, 255, 255, 0.5)`\n" ' - RGB/RGBA strings: `"rgb(255, 255, 255)"`, `"rgba(255, 255, 255, 0.5)`"\n' ' - HSL strings: "`hsl(270, 60%, 70%)"`, `"hsl(270, 60%, 70%, .5)`"\n' ) npe2-0.7.2/src/npe2/manifest/contributions/_widgets.py0000644000000000000000000000423513615410400017632 0ustar00from __future__ import annotations from typing import TYPE_CHECKING, Callable, Optional from pydantic import Extra, Field from npe2.manifest.utils import Executable from npe2.types import Widget if TYPE_CHECKING: from npe2._command_registry import CommandRegistry class WidgetContribution(Executable[Widget]): """Contribute a widget that can be added to the napari viewer. Widget contributions point to a **command** that, when called, returns a widget *instance*; this includes functions that return a widget instance, (e.g. those decorated with `magicgui.magic_factory`) and subclasses of either [`QtWidgets.QWidget`](https://doc.qt.io/qt-5/qwidget.html) or [`magicgui.widgets.Widget`](https://napari.org/magicgui/api/_autosummary/magicgui.widgets._bases.Widget.html). Optionally, **autogenerate** may be used to create a widget (using [magicgui](https://napari.org/magicgui/)) from a command. (In this case, the command needn't return a widget instance; it can be any function suitable as an argument to `magicgui.magicgui()`.) """ command: str = Field( ..., description="Identifier of a command that returns a widget instance. " "Or, if `autogenerate` is `True`, any command suitable as an argument " "to `magicgui.magicgui()`.", ) display_name: str = Field( ..., description="Name for the widget, as presented in the UI." 
)
    autogenerate: bool = Field(
        default=False,
        description="If true, a widget will be autogenerated from the signature of "
        "the associated command using [magicgui](https://napari.org/magicgui/).",
    )

    class Config:
        extra = Extra.forbid

    def get_callable(
        self, _registry: Optional[CommandRegistry] = None
    ) -> Callable[..., Widget]:
        func = super().get_callable()
        if self.autogenerate:
            try:
                from magicgui import magic_factory
            except ImportError as e:
                raise ImportError(
                    "To use autogeneration, you must have magicgui installed."
                ) from e
            return magic_factory(func)
        return func
npe2-0.7.2/src/npe2/manifest/contributions/_writers.py0000644000000000000000000001725713615410400017663 0ustar00from enum import Enum
from typing import List, Tuple

from pydantic import BaseModel, Extra, Field, validator

from npe2.manifest.utils import Executable


class LayerType(str, Enum):
    graph = "graph"
    image = "image"
    labels = "labels"
    points = "points"
    shapes = "shapes"
    surface = "surface"
    tracks = "tracks"
    vectors = "vectors"


class LayerTypeConstraint(BaseModel):
    """Layer type constraints.

    A writer plugin can declare that it will write 0 or more layers of a
    specific type. For example:

    ```
    image       Write exactly 1 image layer.
    image?      Write 0 or 1 image layers.
    image+      Write 1 or more image layers.
    image*      Write 0 or more image layers.
    image{k}    Write exactly k image layers.
    image{m,n}  Write between m and n layers (inclusive range). Must have m<=n.
    ```

    When a type is not present in the list of constraints, that corresponds
    to a writer that is not compatible with that type. For example, a writer
    declaring:

    ```
    layer_types=["image+", "points*"]
    ```

    would not be selected when trying to write an `image` and a `vector`
    layer because the above only works for cases with 0 `vector` layers.

    Note that just because a writer declares compatibility with a layer type
    does not mean it actually writes that type.
In the example above, the writer might accept a set of layers containing `image`s and `point`s, but the write command might just ignore the `point` layers """ layer_type: LayerType bounds: Tuple[int, int] = Field( ..., description="This writer consumes between bounds[0] and bounds[1] " "layers of `layer_type`", ) @validator("bounds") def check_bounds(cls, v): mn, mx = v assert mn >= 0, "min must be >= 0" assert mx > mn, "max must be > min" return v @classmethod def zero(cls, layer_type: LayerType) -> "LayerTypeConstraint": return cls(layer_type=layer_type, bounds=(0, 1)) def is_zero(self) -> bool: return self.bounds == (0, 1) def max(self) -> int: return max(0, self.bounds[1] - 1) @classmethod def from_str(cls, expr: str) -> "LayerTypeConstraint": """Parse layer-type constraint expressions. These have the form '' where is one of: '?', '+', '*', '{k}', '{m,n}'. '?' means 0 or 1. '+' means 1 or more. '*' means 0 or more. '{k}' means exactly k. '{m,n}' means between m and n (inclusive). """ # Writers won't accept more than this number of layers. MAX_LAYERS = 1 << 32 def parse(expr): if expr.endswith("?"): return (0, 2), LayerType(expr[:-1]) elif expr.endswith("+"): return (1, MAX_LAYERS), LayerType(expr[:-1]) elif expr.endswith("*"): return (0, MAX_LAYERS), LayerType(expr[:-1]) elif expr.endswith("}"): rest, _, range_expr = expr[:-1].rpartition("{") if "," in range_expr: m, n = range_expr.split(",") return (int(m), int(n) + 1), LayerType(rest) else: k = int(range_expr) return (k, k + 1), LayerType(rest) else: return (1, 2), LayerType(expr) bounds, lt = parse(expr) return cls(layer_type=lt, bounds=bounds) class WriterContribution(Executable[List[str]]): r"""Contribute a layer writer. Writers accept data from one or more layers and write them to file. Writers declare support for writing one or more **layer_types**, may be associated with specific **filename_patterns** (e.g. 
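The constraint grammar parsed by `LayerTypeConstraint.from_str` above can be exercised standalone. This hypothetical mirror of the inner `parse()` returns `(layer_type, (lo, hi))` using the same exclusive-upper-bound convention as the original:

```python
MAX_LAYERS = 1 << 32  # writers won't accept more layers than this


def parse_constraint(expr: str):
    """Parse 'image', 'image?', 'image+', 'image*', 'image{k}' or 'image{m,n}'.

    Standalone sketch mirroring LayerTypeConstraint.from_str: returns the
    layer-type string and a (lo, hi) bound with an *exclusive* upper bound.
    """
    if expr.endswith("?"):
        return expr[:-1], (0, 2)          # 0 or 1
    if expr.endswith("+"):
        return expr[:-1], (1, MAX_LAYERS)  # 1 or more
    if expr.endswith("*"):
        return expr[:-1], (0, MAX_LAYERS)  # 0 or more
    if expr.endswith("}"):
        rest, _, range_expr = expr[:-1].rpartition("{")
        if "," in range_expr:
            m, n = range_expr.split(",")
            return rest, (int(m), int(n) + 1)  # m..n inclusive
        k = int(range_expr)
        return rest, (k, k + 1)            # exactly k
    return expr, (1, 2)                    # bare type: exactly 1
```

For example, `parse_constraint("points{2,4}")` yields `("points", (2, 5))`.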
"\*.tif", "\*.zip") and are invoked whenever
    `viewer.layers.save('some/path.ext')` is used on the command line, or when a
    user requests to save one or more layers in the graphical user interface with
    `File -> Save Selected Layer(s)...` or `Save All Layers...`
    """

    command: str = Field(
        ..., description="Identifier of the command providing a writer."
    )
    layer_types: List[str] = Field(
        ...,
        description="List of layer type constraints. These determine what "
        "layers (or combinations thereof) this writer handles.",
    )
    # An empty filename_extensions list matches any file extension. Making the
    # default something like ['.*'] is tempting but we don't actually use
    # these for glob matching and supporting this default ends up making the
    # code more complicated.
    filename_extensions: List[str] = Field(
        default_factory=list,
        description="List of filename extensions compatible with this writer. "
        "The first entry is used as the default if necessary. Empty by default. "
        "When empty, any filename extension is accepted.",
    )
    display_name: str = Field(
        default="",
        description="Brief text used to describe this writer when presented. "
        "Empty by default. When present, this string is presented in the save "
        "dialog alongside the plugin name and may be used to distinguish the "
        "kind of writer for the user. E.g. “lossy” or “lossless”.",
    )

    def layer_type_constraints(self) -> List[LayerTypeConstraint]:
        spec = [LayerTypeConstraint.from_str(lt) for lt in self.layer_types]
        unspecified_types = set(LayerType) - {lt.layer_type for lt in spec}
        return spec + [LayerTypeConstraint.zero(lt) for lt in unspecified_types]

    def __hash__(self):
        return hash(
            (
                self.command,
                str(self.layer_types),
                str(self.filename_extensions),
                self.display_name,
            )
        )

    class Config:
        extra = Extra.forbid

    @validator("layer_types")
    def _parsable_layer_type_expr(cls, layer_types: List[str]) -> List[str]:
        try:
            # a successful parse means the string is valid
            for lt in layer_types:
                LayerTypeConstraint.from_str(lt)
        except Exception as e:
            raise ValueError(f"Could not parse layer_types: {layer_types}. {e}") from e
        return layer_types

    @validator("layer_types")
    def _nonempty_layer_types(cls, layer_types: List[str]) -> List[str]:
        """If layer_types is empty, raise a ValueError."""
        if not layer_types:
            raise ValueError("layer_types must not be empty")
        return layer_types

    @validator("layer_types")
    def _layer_types_unique(cls, layer_types: List[str]) -> List[str]:
        """Each layer type can be referred to at most once."""
        from collections import Counter

        c = Counter(LayerTypeConstraint.from_str(lt).layer_type for lt in layer_types)
        if any(c[lt] > 1 for lt in c):
            raise ValueError(f"Duplicate layer type in {layer_types}")
        return layer_types

    @validator("filename_extensions")
    def _coerce_common_glob_patterns(cls, exts: List[str]) -> List[str]:
        """Coerce common glob patterns to plain file extensions.

        Coercions:
        1. File extensions beginning with '*' have their leading '*' removed.
        2. File extensions lacking a leading '.' have a leading '.' added.

        Rules:
        3. File extensions must start with '.' or '*.'
        4. File extensions must be at least two characters long.
""" exts = [f".{e.lstrip('*.')}" for e in exts] if any(len(e) < 2 for e in exts): raise ValueError( "Invalid file extension: Must have one character past the '.'" ) return exts npe2-0.7.2/tests/conftest.py0000644000000000000000000001210413615410400012643 0ustar00import shutil import sys from importlib import abc, metadata from pathlib import Path from unittest.mock import patch import pytest from npe2 import PluginManager, PluginManifest from npe2.manifest import _npe1_adapter FIXTURES = Path(__file__).parent / "fixtures" @pytest.fixture def sample_path(): return Path(__file__).parent / "sample" @pytest.fixture def sample_manifest(sample_path): return PluginManifest.from_file(sample_path / "my_plugin" / "napari.yaml") @pytest.fixture def compiled_plugin_dir(tmp_path): shutil.copytree(FIXTURES / "my-compiled-plugin", tmp_path, dirs_exist_ok=True) return tmp_path @pytest.fixture def uses_sample_plugin(sample_path): sys.path.append(str(sample_path)) try: pm = PluginManager.instance() pm.discover() yield finally: sys.path.remove(str(sample_path)) @pytest.fixture def plugin_manager(): pm = PluginManager() pm.discover() return pm @pytest.fixture(autouse=True) def mock_discover(): _discover = PluginManifest.discover def wrapper(*args, **kwargs): before = sys.path.copy() # only allowing things from test directory in discover sys.path = [x for x in sys.path if str(Path(__file__).parent) in x] try: yield from _discover(*args, **kwargs) finally: sys.path = before with patch("npe2.PluginManifest.discover", wraps=wrapper): yield 1 @pytest.fixture def npe1_repo(): return Path(__file__).parent / "npe1-plugin" @pytest.fixture def uses_npe1_plugin(npe1_repo): import site class Importer(abc.MetaPathFinder): def find_spec(self, *_, **__): return None def find_distributions(self, ctx, **k): if ctx.name == "npe1-plugin": pth = npe1_repo / "npe1-plugin-0.0.1.dist-info" yield metadata.PathDistribution(pth) return sys.meta_path.append(Importer()) sys.path.append(str(npe1_repo)) try: pkgs 
= [*site.getsitepackages(), str(npe1_repo)]
        with patch("site.getsitepackages", return_value=pkgs):
            yield
    finally:
        sys.path.remove(str(npe1_repo))


@pytest.fixture
def npe1_plugin_module(npe1_repo):
    import sys
    from importlib.util import module_from_spec, spec_from_file_location

    npe1_module_path = npe1_repo / "npe1_module" / "__init__.py"
    spec = spec_from_file_location("npe1_module", npe1_module_path)
    assert spec
    module = module_from_spec(spec)
    sys.modules[spec.name] = module
    spec.loader.exec_module(module)  # type: ignore
    try:
        yield module
    finally:
        del sys.modules[spec.name]


@pytest.fixture
def mock_npe1_pm():
    from napari_plugin_engine import PluginManager, napari_hook_specification

    # fmt: off
    class HookSpecs:
        def napari_provide_sample_data(): ...  # type: ignore
        def napari_get_reader(path): ...
        def napari_get_writer(path, layer_types): ...
        def napari_write_graph(path, data, meta): ...
        def napari_write_image(path, data, meta): ...
        def napari_write_labels(path, data, meta): ...
        def napari_write_points(path, data, meta): ...
        def napari_write_shapes(path, data, meta): ...
        def napari_write_surface(path, data, meta): ...
        def napari_write_vectors(path, data, meta): ...
        def napari_experimental_provide_function(): ...  # type: ignore
        def napari_experimental_provide_dock_widget(): ...  # type: ignore
        def napari_experimental_provide_theme(): ...  # type: ignore
    # fmt: on

    for m in dir(HookSpecs):
        if m.startswith("napari"):
            setattr(HookSpecs, m, napari_hook_specification(getattr(HookSpecs, m)))

    pm = PluginManager("napari")
    pm.add_hookspecs(HookSpecs)
    yield pm


@pytest.fixture
def mock_npe1_pm_with_plugin(npe1_repo, npe1_plugin_module):
    """Mocks a fully installed local repository"""
    from npe2._inspection._from_npe1 import plugin_packages

    mock_dist = metadata.PathDistribution(npe1_repo / "npe1-plugin-0.0.1.dist-info")

    def _dists():
        return [mock_dist]

    def _from_name(name):
        if name == "npe1-plugin":
            return mock_dist
        raise metadata.PackageNotFoundError(name)

    setup_cfg = npe1_repo / "setup.cfg"
    new_manifest = npe1_repo / "npe1_module" / "napari.yaml"
    with patch.object(metadata, "distributions", new=_dists):
        with patch.object(metadata.Distribution, "from_name", new=_from_name):
            cfg = setup_cfg.read_text()
            plugin_packages.cache_clear()
            try:
                yield mock_npe1_pm
            finally:
                plugin_packages.cache_clear()
                setup_cfg.write_text(cfg)
                if new_manifest.exists():
                    new_manifest.unlink()
                if (npe1_repo / "setup.py").exists():
                    (npe1_repo / "setup.py").unlink()


@pytest.fixture(autouse=True)
def mock_cache(tmp_path, monkeypatch):
    with monkeypatch.context() as m:
        m.setattr(_npe1_adapter, "ADAPTER_CACHE", tmp_path)
        yield tmp_path

npe2-0.7.2/tests/test__io_utils.py0000644000000000000000000002446713615410400014052 0ustar00# extra underscore in name to run this first
from pathlib import Path
from unittest.mock import patch

import pytest

from npe2 import (
    DynamicPlugin,
    PluginManager,
    io_utils,
    read,
    read_get_reader,
    write,
    write_get_writer,
)
from npe2.types import FullLayerData

SAMPLE_PLUGIN_NAME = "my-plugin"


def test_read(uses_sample_plugin):
    assert read(["some.fzzy"], stack=False) == [(None,)]


def test_read_with_unknown_plugin(uses_sample_plugin):
    # no such plugin name.... skips over the sample plugin & error is specific
    paths = ["some.fzzy"]
    chosen_reader = "not-a-plugin"
    with pytest.raises(
        ValueError, match=f"Given reader {chosen_reader!r} does not exist."
    ) as e:
        read(paths, plugin_name=chosen_reader, stack=False)
    assert f"Available readers for {paths!r} are: {[SAMPLE_PLUGIN_NAME]!r}" in str(e)


def test_read_with_unknown_plugin_no_readers(uses_sample_plugin):
    paths = ["some.nope"]
    chosen_reader = "not-a-plugin"
    with pytest.raises(
        ValueError, match=f"Given reader {chosen_reader!r} does not exist."
    ) as e:
        read(paths, plugin_name=chosen_reader, stack=False)
    assert "No compatible readers are available" in str(e)


def test_read_with_no_plugin():
    # no plugin passed and none registered
    paths = ["some.nope"]
    with pytest.raises(ValueError, match="No compatible readers are available"):
        read(paths, stack=False)


def test_read_uses_correct_passed_plugin(tmp_path):
    pm = PluginManager()
    long_name = "gooby-again"
    short_name = "gooby"
    long_name_plugin = DynamicPlugin(long_name, plugin_manager=pm)
    short_name_plugin = DynamicPlugin(short_name, plugin_manager=pm)
    long_name_plugin.register()
    short_name_plugin.register()

    path = "something.fzzy"
    mock_file = tmp_path / path
    mock_file.touch()

    @long_name_plugin.contribute.reader(filename_patterns=["*.fzzy"])
    def get_read_long(path=mock_file):
        raise ValueError(
            f"Uhoh, {long_name} was chosen, but given plugin was {short_name}"
        )

    @short_name_plugin.contribute.reader(filename_patterns=["*.fzzy"])
    def get_read(path=mock_file):
        def read(paths):
            return [(None,)]

        return read

    # "gooby-again" isn't used even though given plugin starts with the same name
    # if an error is thrown here, it means we selected the wrong plugin
    io_utils._read(["some.fzzy"], plugin_name=short_name, stack=False, _pm=pm)


def test_read_fails():
    pm = PluginManager()
    plugin_name = "always-fails"
    plugin = DynamicPlugin(plugin_name, plugin_manager=pm)
    plugin.register()

    @plugin.contribute.reader(filename_patterns=["*.fzzy"])
    def get_read(path):
        return None

    with pytest.raises(ValueError, match=f"Reader {plugin_name!r} was selected"):
        io_utils._read(["some.fzzy"], plugin_name=plugin_name, stack=False, _pm=pm)

    with pytest.raises(ValueError, match="No readers returned data"):
        io_utils._read(["some.fzzy"], stack=False, _pm=pm)


def test_read_with_incompatible_reader(uses_sample_plugin):
    paths = ["some.notfzzy"]
    chosen_reader = f"{SAMPLE_PLUGIN_NAME}"
    with pytest.raises(
        ValueError, match=f"Given reader {chosen_reader!r} is not a compatible reader"
    ):
        read(paths, stack=False, plugin_name=chosen_reader)


def test_read_with_no_compatible_reader():
    paths = ["some.notfzzy"]
    with pytest.raises(ValueError, match="No compatible readers are available"):
        read(paths, stack=False)


def test_read_with_reader_contribution_plugin(uses_sample_plugin):
    paths = ["some.fzzy"]
    chosen_reader = f"{SAMPLE_PLUGIN_NAME}.some_reader"
    assert read(paths, stack=False, plugin_name=chosen_reader) == [(None,)]

    # if the wrong contribution is passed we get useful error message
    chosen_reader = f"{SAMPLE_PLUGIN_NAME}.not_a_reader"
    with pytest.raises(
        ValueError,
        match=f"Given reader {chosen_reader!r} does not exist.",
    ) as e:
        read(paths, stack=False, plugin_name=chosen_reader)
    assert "Available readers for" in str(e)


def test_read_assertion_with_no_compatible_readers(uses_sample_plugin):
    paths = ["some.noreader"]
    with patch("npe2.io_utils._get_compatible_readers_by_choice", return_value=[]):
        with pytest.raises(AssertionError, match="No readers to try."):
            read(paths, stack=False)


def test_available_readers_show_commands(uses_sample_plugin):
    paths = ["some.fzzy"]
    chosen_reader = "not-a-plugin.not-a-reader"
    with pytest.raises(
        ValueError,
        match=f"Given reader {chosen_reader!r} does not exist.",
    ) as e:
        read(paths, stack=False, plugin_name=chosen_reader)
    assert "Available readers " in str(e)
    assert f"{SAMPLE_PLUGIN_NAME}.some_reader" in str(e)

    chosen_reader = "not-a-plugin"
    with pytest.raises(
        ValueError,
        match=f"Given reader {chosen_reader!r} does not exist.",
    ) as e:
        read(paths, stack=False, plugin_name=chosen_reader)
    assert "Available readers " in str(e)
    assert f"{SAMPLE_PLUGIN_NAME}.some_reader" not in str(e)
    assert f"{SAMPLE_PLUGIN_NAME}" in str(e)


def test_read_return_reader(uses_sample_plugin):
    data, reader = read_get_reader("some.fzzy")
    assert data == [(None,)]
    assert reader.command == f"{SAMPLE_PLUGIN_NAME}.some_reader"


def test_read_return_reader_with_stack(uses_sample_plugin):
    data, reader = read_get_reader(["some.fzzy"], stack=True)
    assert data == [(None,)]
    assert reader.command == f"{SAMPLE_PLUGIN_NAME}.some_reader"


def test_read_list(uses_sample_plugin):
    data, reader = read_get_reader(["some.fzzy", "other.fzzy"])
    assert data == [(None,)]
    assert reader.command == f"{SAMPLE_PLUGIN_NAME}.some_reader"


null_image: FullLayerData = ([], {}, "image")


def test_writer_exec(uses_sample_plugin):
    # the sample writer knows how to handle two image layers
    result = write("test.tif", [null_image, null_image])
    assert result == ["test.tif"]

    result, contrib = write_get_writer("test.tif", [null_image, null_image])
    assert result == ["test.tif"]
    assert contrib.command == f"{SAMPLE_PLUGIN_NAME}.my_writer"


@pytest.mark.parametrize("layer_data", [[null_image, null_image], []])
def test_writer_exec_fails(layer_data, uses_sample_plugin):
    # the sample writer doesn't accept no extension
    with pytest.raises(ValueError):
        write("test/path", layer_data)


def test_writer_exec_fails2(uses_sample_plugin):
    # the sample writer doesn't accept 5 images
    with pytest.raises(ValueError):
        write("test.tif", [null_image, null_image, null_image, null_image, null_image])


def test_writer_single_layer_api_exec(uses_sample_plugin):
    # This writer doesn't do anything but type check.
    paths = write("test/path", [([], {}, "labels")])
    assert len(paths) == 1


def test_read_non_global_pm():
    pm = PluginManager()
    plugin = DynamicPlugin("my-plugin", plugin_manager=pm)

    @plugin.contribute.reader
    def read_path(path):
        def _reader(path):
            return [(None,)]

        return _reader

    assert io_utils._read(["some.fzzy"], stack=False, _pm=pm) == [(None,)]


def test_read_uppercase_extension(tmp_path: Path):
    pm = PluginManager()
    plugin = DynamicPlugin("tif-plugin", plugin_manager=pm)
    path = "something.TIF"
    mock_file = tmp_path / path
    mock_file.touch()

    # reader should be compatible despite lowercase pattern
    @plugin.contribute.reader(filename_patterns=["*.tif"])
    def get_read(path=mock_file):
        if path.lower() != path:
            # if this error is raised we can be certain path is unchanged
            raise ValueError("Given path contains capitals.")

        def read(paths):
            return [(None,)]

        return read

    with pytest.raises(ValueError, match="Given path contains capitals."):
        io_utils._read([str(mock_file)], stack=False, _pm=pm)


@pytest.mark.parametrize(
    "path", ["some_zarr_directory.ZARR", "some_zarr_directory.Zarr"]
)
def test_read_zarr_variants(path: str, tmp_path: Path):
    new_dir = tmp_path / path
    new_dir.mkdir()
    pm = PluginManager()
    plugin = DynamicPlugin("zarr-plugin", plugin_manager=pm)

    # reader should be compatible despite lowercase pattern
    @plugin.contribute.reader(filename_patterns=["*.zarr"], accepts_directories=True)
    def get_read(path):
        if path.lower() != path:
            # if this error is raised we can be certain path is unchanged
            raise ValueError("Given path contains capitals.")

        def read(paths):
            return [(None,)]

        return read

    with pytest.raises(ValueError, match="Given path contains capitals."):
        io_utils._read([str(new_dir)], stack=False, _pm=pm)


@pytest.mark.parametrize(
    "path", ["some_two_ext_file.TAR.gz", "some_two_ext_file.TAR.GZ"]
)
def test_read_tar_gz_variants(path: str, tmp_path: Path):
    pm = PluginManager()
    plugin = DynamicPlugin("tar-gz-plugin", plugin_manager=pm)
    mock_file = tmp_path / path
    mock_file.touch()

    # reader should be compatible despite lowercase pattern
    @plugin.contribute.reader(filename_patterns=["*.tar.gz"])
    def get_read(path=mock_file):
        if path.lower() != path:
            # if this error is raised we can be certain path is unchanged
            raise ValueError("Given path contains capitals.")

        def read(paths):
            return [(None,)]

        return read

    with pytest.raises(ValueError, match="Given path contains capitals."):
        io_utils._read([str(mock_file)], stack=False, _pm=pm)


@pytest.mark.parametrize("path", ["some_directory.Final", "some_directory.FINAL"])
def test_read_directory_variants(path: str, tmp_path: Path):
    new_dir = tmp_path / path
    new_dir.mkdir()
    pm = PluginManager()
    plugin = DynamicPlugin("directory-plugin", plugin_manager=pm)

    # reader should be compatible despite lowercase pattern
    @plugin.contribute.reader(filename_patterns=["*"], accepts_directories=True)
    def get_read(path):
        if path.lower() != path:
            # if this error is raised we can be certain path is unchanged
            raise ValueError("Given path contains capitals.")

        def read(paths):
            return [(None,)]

        return read

    with pytest.raises(ValueError, match="Given path contains capitals."):
        io_utils._read([str(new_dir)], stack=False, _pm=pm)

npe2-0.7.2/tests/test_all_plugins.py0000644000000000000000000000271713615410400014371 0ustar00import os
from importlib import metadata
from subprocess import CalledProcessError
from typing import TYPE_CHECKING

import pytest

from npe2._inspection._full_install import isolated_plugin_env
from npe2.cli import app

if TYPE_CHECKING:
    from pathlib import Path

PLUGIN: str = os.getenv("TEST_PACKAGE_NAME") or ""
if not PLUGIN:
    pytest.skip("skipping plugin specific tests", allow_module_level=True)


@pytest.fixture(scope="session")
def plugin_env():
    try:
        with isolated_plugin_env(PLUGIN) as env:
            yield env
    except CalledProcessError as e:
        if "Failed building wheel" in str(e.output):
            yield None


def test_entry_points_importable(plugin_env):
    if plugin_env is None:
        pytest.mark.xfail()
        return

    entry_points = [
        ep
        for ep in metadata.distribution(PLUGIN).entry_points
        if ep.group in ("napari.plugin", "napari.manifest")
    ]
    if PLUGIN not in {"napari-console", "napari-error-reporter"}:
        assert entry_points
        for ep in entry_points:
            if ep.group == "napari.plugin":
                ep.load()


def test_fetch(tmp_path: "Path"):
    from typer.testing import CliRunner

    mf_file = tmp_path / "manifest.yaml"
    result = CliRunner().invoke(app, ["fetch", PLUGIN, "-o", str(mf_file)])
    assert result.exit_code == 0
    assert PLUGIN in mf_file.read_text()

    result2 = CliRunner().invoke(app, ["validate", str(mf_file)])
    assert result2.exit_code == 0

npe2-0.7.2/tests/test_cli.py0000644000000000000000000002022313615410400012625 0ustar00import sys
from concurrent.futures import ThreadPoolExecutor
from unittest.mock import patch

import pytest
from typer.testing import CliRunner

from npe2 import __version__
from npe2.cli import app, main
from npe2.manifest.schema import PluginManifest

runner = CliRunner()


@pytest.mark.parametrize("debug", ["--debug", ""])
@pytest.mark.parametrize("imports", ["--imports", "--no-imports"])
def test_cli_validate_ok(sample_path, debug, imports, monkeypatch):
    cmd = ["validate", str(sample_path / "my_plugin" / "napari.yaml"), imports]
    if debug:
        cmd += [debug]
    with monkeypatch.context() as m:
        m.setattr(sys, "path", [*sys.path, str(sample_path)])
        result = runner.invoke(app, cmd)
    assert "✔ Manifest for 'My Plugin' valid!" in result.stdout
    assert result.exit_code == 0


def test_cli_validate_invalid(tmp_path, capsys):
    (tmp_path / "manifest.yaml").write_text("name: hi??\n")
    cmd = ["validate", str(tmp_path / "manifest.yaml")]
    result = runner.invoke(app, cmd)
    assert "'hi??' is not a valid python package name." in result.stdout


def test_cli_validate_load_err(tmp_path):
    non_existent = str(tmp_path / "manifest.yaml")
    result = runner.invoke(app, ["validate", non_existent])
    assert result.exit_code == 1
    assert "🅇 Unexpected error in" in result.stdout
    assert "Could not find manifest for" in result.stdout

    result = runner.invoke(app, ["validate", non_existent, "--debug"])
    assert "Could not find manifest for" in result.stdout


@pytest.mark.parametrize("format", ["json", "yaml", "toml", "csv"])
@pytest.mark.parametrize("to_file", [True, False])
def test_cli_parse(sample_path, format, tmp_path, to_file):
    cmd = ["parse", str(sample_path / "my_plugin" / "napari.yaml")]
    if to_file:
        dest = tmp_path / f"output.{format}"
        cmd.extend(["-o", str(dest)])
    else:
        cmd.extend(["-f", format])
    result = runner.invoke(app, cmd)
    if format == "csv":
        assert result.exit_code
        return
    assert result.exit_code == 0
    if to_file:
        assert dest.exists()
        assert PluginManifest.from_file(dest)
    else:
        assert "my-plugin" in result.stdout  # just prints the yaml


@pytest.mark.parametrize("format", ["json", "yaml", "toml", "csv"])
@pytest.mark.parametrize("to_file", [True, False])
@pytest.mark.parametrize("include_meta", [True, False])
def test_cli_fetch(format, tmp_path, to_file, include_meta):
    cmd = ["fetch", "napari-omero"]
    if to_file:
        dest = tmp_path / f"output.{format}"
        cmd.extend(["-o", str(dest)])
    else:
        cmd.extend(["-f", format])
    if include_meta:
        cmd.extend(["--include-package-meta", "--indent=2"])
    result = runner.invoke(app, cmd)
    if format == "csv":
        assert result.exit_code
        return
    assert result.exit_code == 0
    if to_file:
        assert dest.exists()
        assert PluginManifest.from_file(dest)
    else:
        assert "napari-omero" in result.stdout  # just prints the yaml
        if include_meta:
            assert "package_metadata" in result.stdout


def test_cli_fetch_all(tmp_path, monkeypatch):
    dest = tmp_path / "output"
    with patch("npe2._inspection._fetch.get_pypi_plugins") as mock_hub:
        mock_hub.return_value = {"a": "0.1.0", "b": "0.2.0", "c": "0.3.0"}
        with patch("npe2._inspection._fetch.ProcessPoolExecutor", ThreadPoolExecutor):
            cmd = ["fetch", "--all", "-o", str(dest)]
            monkeypatch.setattr(sys, "argv", cmd)
            result = runner.invoke(app, cmd)
    mock_hub.assert_called_once()
    assert result.exit_code == 0
    assert dest.exists()
    assert (dest / "errors.json").exists()


@pytest.mark.filterwarnings("default:Failed to convert")
def test_cli_convert_repo(npe1_repo, mock_npe1_pm_with_plugin):
    result = runner.invoke(app, ["convert", str(npe1_repo)])
    assert result.exit_code == 0
    assert "✔ Conversion complete!" in result.stdout


@pytest.mark.filterwarnings("default:Failed to convert")
def test_cli_convert_repo_dry_run(npe1_repo, mock_npe1_pm_with_plugin):
    result = runner.invoke(app, ["convert", str(npe1_repo), "-n"])
    assert result.exit_code == 0
    assert "# Manifest would be written to" in result.stdout


@pytest.mark.filterwarnings("ignore:The distutils package is deprecated")
@pytest.mark.filterwarnings("default:Failed to convert napari_get_writer")
def test_cli_convert_svg():
    result = runner.invoke(app, ["convert", "napari-svg"])
    assert "Some issues occured:" in result.stdout
    assert "Found a multi-layer writer in 'napari-svg'" in result.stdout
    assert result.exit_code == 0


def test_cli_convert_repo_fails(npe1_repo, mock_npe1_pm_with_plugin):
    (npe1_repo / "setup.cfg").unlink()
    result = runner.invoke(app, ["convert", str(npe1_repo)])
    assert result.exit_code == 1
    assert "Could not detect first gen napari plugin package" in result.stdout


@pytest.mark.filterwarnings("default:Failed to convert")
def test_cli_convert_package_name(npe1_repo, mock_npe1_pm_with_plugin):
    result = runner.invoke(app, ["convert", "npe1-plugin"])
    assert "name: npe1-plugin" in result.stdout  # just prints the yaml
    assert result.exit_code == 0


def test_cli_main(monkeypatch, sample_path):
    cmd = ["npe2", "validate", str(sample_path / "my_plugin" / "napari.yaml")]
    monkeypatch.setattr(sys, "argv", cmd)
    with pytest.raises(SystemExit) as e:
        main()
    assert e.value.code == 0


def test_cli_cache_list_empty():
    result = runner.invoke(app, ["cache", "--list"])
    assert "Nothing cached" in result.stdout
    assert result.exit_code == 0


def test_cli_cache_list_full(uses_npe1_plugin, mock_cache):
    (mock_cache / "npe1-plugin.yaml").write_text("name: npe1-plugin\n")
    result = runner.invoke(app, ["cache", "--list"])
    assert result.stdout == "npe1-plugin: 0.1.0\n"
    assert result.exit_code == 0


def test_cli_cache_list_named(uses_npe1_plugin, mock_cache):
    (mock_cache / "npe1-plugin.yaml").write_text("name: npe1-plugin\n")
    result = runner.invoke(app, ["cache", "--list", "not-a-plugin"])
    assert result.stdout == "Nothing cached for plugins: not-a-plugin\n"
    assert result.exit_code == 0


def test_cli_cache_clear_empty():
    result = runner.invoke(app, ["cache", "--clear"])
    assert "Nothing to clear" in result.stdout
    assert result.exit_code == 0


def test_cli_cache_clear_full(mock_cache):
    (mock_cache / "npe1-plugin.yaml").write_text("name: npe1-plugin\n")
    result = runner.invoke(app, ["cache", "--clear"])
    assert "Cleared these files from cache" in result.stdout
    assert "- npe1-plugin.yaml" in result.stdout
    assert result.exit_code == 0


def test_cli_cache_clear_named(mock_cache):
    (mock_cache / "npe1-plugin.yaml").write_text("name: npe1-plugin\n")
    result = runner.invoke(app, ["cache", "--clear", "not-a-plugin"])
    assert result.stdout == "Nothing to clear for plugins: not-a-plugin\n"
    assert result.exit_code == 0


@pytest.mark.parametrize("format", ["table", "compact", "yaml", "json"])
@pytest.mark.parametrize("fields", [None, "name,version,author"])
def test_cli_list(format, fields, uses_npe1_plugin):
    result = runner.invoke(app, ["list", "-f", format, "--fields", fields])
    assert result.exit_code == 0
    assert "npe1-plugin" in result.output
    if fields and "author" in fields and format != "compact":
        assert "author" in result.output.lower()
    else:
        assert "author" not in result.output.lower()


def test_cli_list_sort(uses_npe1_plugin):
    result = runner.invoke(app, ["list", "--sort", "version"])
    assert result.exit_code == 0

    result = runner.invoke(app, ["list", "--sort", "7"])
    assert result.exit_code
    assert "Invalid sort value '7'" in result.output

    result = runner.invoke(app, ["list", "--sort", "notaname"])
    assert result.exit_code
    assert "Invalid sort value 'notaname'" in result.output


def test_cli_version():
    result = runner.invoke(app, ["--version"])
    assert result.exit_code == 0
    assert __version__ in result.output


def test_compile(compiled_plugin_dir):
    result = runner.invoke(app, ["compile", str(compiled_plugin_dir)])
    assert "id: my_compiled_plugin.my-plugin.generate_random_data" in result.output

npe2-0.7.2/tests/test_compile.py0000644000000000000000000000216613615410400013514 0ustar00from pathlib import Path

import pytest

from npe2._inspection import compile
from npe2.manifest.schema import PluginManifest


def test_compile(compiled_plugin_dir: Path, tmp_path: Path):
    """
    Test that the plugin manager can be compiled.
    """
    with pytest.raises(ValueError, match="must have an extension of .json, .yaml, or"):
        compile(compiled_plugin_dir, "bad_path")

    dest = tmp_path / "output.yaml"
    mf = compile(compiled_plugin_dir, dest=dest, packages=["my_module"])
    assert isinstance(mf, PluginManifest)
    assert mf.name == "my_compiled_plugin"
    assert mf.contributions.commands and len(mf.contributions.commands) == 5
    assert dest.exists()
    assert PluginManifest.from_file(dest) == mf


def test_compile_with_template(compiled_plugin_dir: Path, tmp_path: Path):
    """Test building from a template with npe2 compile."""
    template = tmp_path / "template.yaml"
    template.write_text("name: my_compiled_plugin\ndisplay_name: Display Name\n")
    mf = compile(compiled_plugin_dir, template=template)
    assert mf.name == "my_compiled_plugin"
    assert mf.display_name == "Display Name"

npe2-0.7.2/tests/test_config_contribution.py0000644000000000000000000000647513615410400016127 0ustar00import pytest

from npe2.manifest.contributions import (
    ConfigurationContribution,
    ConfigurationProperty,
)
from npe2.manifest.contributions._json_schema import ValidationError

PROPS = [
    {
        "plugin.heatmap.location": {
            "type": "string",
            "default": "right",
            "enum": ["left", "right"],
            "enumDescriptions": [
                "Adds a heatmap indicator on the left edge",
                "Adds a heatmap indicator on the right edge",
            ],
        }
    }
]


@pytest.mark.parametrize("props", PROPS)
def test_config_contribution(props):
    cc = ConfigurationContribution(
        title="My Plugin",
        properties=props,
    )
    assert cc.title == "My Plugin"
    for key, val in cc.properties.items():
        assert val.dict(exclude_unset=True, by_alias=True) == props[key]


def test_warn_on_refs_defs():
    with pytest.warns(UserWarning):
        ConfigurationProperty(
            type="string",
            default="baz",
            description="quux",
            ref="http://example.com",
        )


CASES = [
    (
        {
            "type": str,
            "pattern": "^(\\([0-9]{3}\\))?[0-9]{3}-[0-9]{4}$",
            "pattern_error_message": "custom error",
        },
        "555-1212",
        "(888)555-1212 ext. 532",
    ),
    ({"type": "string", "minLength": 2}, "AB", "A"),
    ({"type": "string", "maxLength": 3}, "AB", "ABCD"),
    ({"type": "integer"}, 42, 3.123),
    ({"type": float}, 42.45, "3.123"),
    ({"type": int, "multipleOf": 10}, 30, 23),
    ({"type": "number", "minimum": 100}, 100, 99),
    ({"type": "number", "exclusiveMaximum": 100}, 99, 100),
    (
        {"properties": {"number": {"type": "number"}}},
        {"number": 1600},
        {"number": "1600"},
    ),
    (
        {
            "type": dict,
            "properties": {
                "number": {"type": "number"},
            },
            "additional_properties": False,
        },
        {"number": 1600},
        {"number": 1600, "street_name": "Pennsylvania"},
    ),
    ({"type": "array"}, [3, "diff", {"types": "of values"}], {"Not": "an array"}),
    ({"items": {"type": "number"}}, [1, 2, 3, 4, 5], [1, 2, "3", 4, 5]),
    (
        {
            "items": [
                {"type": "number"},
                {"type": "string"},
                {"enum": ["Street", "Avenue", "Boulevard"]},
                {"enum": ["NW", "NE", "SW", "SE"]},
            ]
        },
        [1600, "Pennsylvania", "Avenue", "NW"],
        [24, "Sussex", "Drive"],
    ),
    ({"type": [bool, int]}, True, "True"),
]


@pytest.mark.parametrize("schema, valid, invalid", CASES)
def test_config_validation(schema, valid, invalid):
    cfg = ConfigurationProperty(**schema)
    assert cfg.validate_instance(valid) == valid
    match = schema.get("pattern_error_message", None)
    with pytest.raises(ValidationError, match=match):
        assert cfg.validate_instance(invalid)

    assert cfg.is_array is ("items" in schema or cfg.type == "array")
    assert cfg.is_object is (cfg.type == "object")
    assert isinstance(cfg.has_constraint, bool)

    # check that we can convert json type to python type
    for t in (
        cfg.python_type if isinstance(cfg.python_type, list) else [cfg.python_type]
    ):
        assert t.__module__ == "builtins"

    assert cfg.has_default is ("default" in schema)

npe2-0.7.2/tests/test_contributions.py0000644000000000000000000001220413615410400014760 0ustar00import json
from functools import partial
from unittest.mock import Mock

import pytest

from npe2 import DynamicPlugin, PluginManager, PluginManifest
from npe2.manifest.contributions import (
    CommandContribution,
    SampleDataGenerator,
    SampleDataURI,
)

SAMPLE_PLUGIN_NAME = "my-plugin"


def test_writer_empty_layers():
    pm = PluginManager()
    pm.discover()
    writers = list(pm.iter_compatible_writers([]))
    assert not writers


@pytest.mark.parametrize(
    "param",
    [
        (["image"] * 2, 1),
        (["labels"], 0),
        (["image"] * 4, 1),
        (["image"] * 5, 0),
        (["points", "surface"], 1),
        (["points", "surface", "points"], 0),
    ],
)
def test_writer_ranges(param, uses_sample_plugin, plugin_manager: PluginManager):
    layer_types, expected_count = param
    nwriters = sum(
        w.command == f"{SAMPLE_PLUGIN_NAME}.my_writer"
        for w in plugin_manager.iter_compatible_writers(layer_types)
    )
    assert nwriters == expected_count


def test_writer_priority():
    """Contributions listed earlier in the manifest should be preferred."""
    pm = PluginManager()
    with DynamicPlugin(name="my_plugin", plugin_manager=pm) as plg:

        @plg.contribute.writer(filename_extensions=["*.tif"], layer_types=["image"])
        def my_writer1(path, data): ...

        @plg.contribute.writer(filename_extensions=["*.abc"], layer_types=["image"])
        def my_writer2(path, data): ...
        writers = list(pm.iter_compatible_writers(["image"]))
        assert writers[0].command == "my_plugin.my_writer1"
    assert len(pm) == 1


@pytest.mark.parametrize(
    "expr",
    ["vectors", "vectors+", "vectors*", "vectors?", "vectors{3}", "vectors{3,8}"],
)
def test_writer_valid_layer_type_expressions(expr, uses_sample_plugin):
    result = next(
        result
        for result in PluginManifest.discover()
        if result.manifest and result.manifest.name == SAMPLE_PLUGIN_NAME
    )
    assert result.error is None
    assert result.manifest is not None
    pm = result.manifest

    data = json.loads(pm.json(exclude_unset=True))
    assert "contributions" in data
    assert "writers" in data["contributions"]
    data["contributions"]["writers"][0]["layer_types"].append(expr)
    PluginManifest(**data)


def test_basic_iter_reader(uses_sample_plugin, plugin_manager: PluginManager, tmp_path):
    tmp_path = str(tmp_path)
    assert not list(plugin_manager.iter_compatible_readers(""))
    reader = list(plugin_manager.iter_compatible_readers(tmp_path))[0]
    assert reader.command == f"{SAMPLE_PLUGIN_NAME}.some_reader"

    reader = list(plugin_manager.iter_compatible_readers([tmp_path, tmp_path]))[0]
    assert reader.command == f"{SAMPLE_PLUGIN_NAME}.some_reader"

    with pytest.raises(ValueError):
        list(plugin_manager.iter_compatible_readers(["a.tif", "b.jpg"]))


def test_widgets(uses_sample_plugin, plugin_manager: PluginManager):
    widgets = list(plugin_manager.iter_widgets())
    assert len(widgets) == 2
    assert widgets[0].command == f"{SAMPLE_PLUGIN_NAME}.some_widget"
    w = widgets[0].exec()
    assert type(w).__name__ == "SomeWidget"

    assert widgets[1].command == f"{SAMPLE_PLUGIN_NAME}.some_function_widget"
    w = widgets[1].get_callable()
    assert isinstance(w, partial)


def test_sample(uses_sample_plugin, plugin_manager: PluginManager):
    plugin, contribs = list(plugin_manager.iter_sample_data())[0]
    assert plugin == SAMPLE_PLUGIN_NAME
    assert len(contribs) == 2
    ctrbA, ctrbB = contribs
    # ignoring types because .command and .uri come from different sample provider
    # types... they don't both have "command" or "uri"
    assert isinstance(ctrbA, SampleDataGenerator)
    assert ctrbA.command == f"{SAMPLE_PLUGIN_NAME}.generate_random_data"
    assert ctrbA.plugin_name == SAMPLE_PLUGIN_NAME
    assert isinstance(ctrbB, SampleDataURI)
    assert ctrbB.uri == "https://picsum.photos/1024"

    assert isinstance(ctrbA.open(), list)
    assert isinstance(ctrbB.open(), list)


def test_directory_reader(uses_sample_plugin, plugin_manager: PluginManager, tmp_path):
    reader = list(plugin_manager.iter_compatible_readers(str(tmp_path)))[0]
    assert reader.command == f"{SAMPLE_PLUGIN_NAME}.some_reader"


def test_themes(uses_sample_plugin, plugin_manager: PluginManager):
    theme = list(plugin_manager.iter_themes())[0]
    assert theme.label == "SampleTheme"


def test_command_exec():
    """Test CommandContribution.exec()"""
    pm = PluginManager.instance()
    try:
        cmd_id = "pkg.some_id"
        cmd = CommandContribution(id=cmd_id, title="a title")
        mf = PluginManifest(name="pkg", contributions={"commands": [cmd]})
        pm.register(mf)

        some_func = Mock()
        pm._command_registry.register(cmd_id, some_func)

        cmd.exec(args=("hi!",))
        some_func.assert_called_once_with("hi!")
    finally:
        pm.__instance = None


def test_menus(uses_sample_plugin, plugin_manager: PluginManager):
    menus = plugin_manager.menus()
    assert len(menus) == 2
    assert set(menus) == {"/napari/layer_context", "mysubmenu"}
    items = list(plugin_manager.iter_menu("/napari/layer_context"))
    assert len(items) == 2

npe2-0.7.2/tests/test_conversion.py0000644000000000000000000001163513615410400014246 0ustar00from importlib.metadata import PackageNotFoundError

import pytest

from npe2._inspection import _from_npe1
from npe2._inspection._from_npe1 import (
    convert_repository,
    get_top_module_path,
    manifest_from_npe1,
)


@pytest.mark.filterwarnings("ignore:The distutils package is deprecated")
@pytest.mark.filterwarnings("ignore:Found a multi-layer writer in")
@pytest.mark.parametrize("package", ["svg"])
def test_conversion(package):
    assert manifest_from_npe1(package)
@pytest.mark.filterwarnings("ignore:Failed to convert napari_provide_sample_data")
@pytest.mark.filterwarnings("ignore:Error converting function")
@pytest.mark.filterwarnings("ignore:Error converting dock widget")
def test_conversion_from_module(mock_npe1_pm, npe1_plugin_module):
    mf = manifest_from_npe1(module=npe1_plugin_module)
    assert isinstance(mf.dict(), dict)


def test_conversion_from_obj_with_locals(mock_npe1_pm):
    from napari_plugin_engine import napari_hook_implementation

    class MyPlugin:
        @staticmethod
        @napari_hook_implementation
        def napari_experimental_provide_function():
            def f(x: int): ...

            return [f]

    with pytest.warns(UserWarning) as record:
        mf = manifest_from_npe1(module=MyPlugin)

    msg = str(record[0].message)
    assert "functions defined in local scopes are not yet supported." in msg
    assert isinstance(mf.dict(), dict)


@pytest.mark.filterwarnings("ignore:Failed to convert napari_provide_sample_data")
@pytest.mark.filterwarnings("ignore:Error converting function")
@pytest.mark.filterwarnings("ignore:Error converting dock widget")
def test_conversion_from_package(npe1_repo, mock_npe1_pm_with_plugin):
    setup_cfg = npe1_repo / "setup.cfg"
    before = setup_cfg.read_text()
    convert_repository(npe1_repo, dry_run=True)
    assert setup_cfg.read_text() == before
    assert not (npe1_repo / "npe1_module" / "napari.yaml").exists()

    convert_repository(npe1_repo, dry_run=False)
    new_setup = setup_cfg.read_text()
    assert new_setup != before
    assert (
        "[options.entry_points]\n"
        "napari.manifest = \n npe1-plugin = npe1_module:napari.yaml"
    ) in new_setup
    assert "[options.package_data]\nnpe1_module = napari.yaml" in new_setup
    assert (npe1_repo / "npe1_module" / "napari.yaml").is_file()

    with pytest.raises(ValueError) as e:
        convert_repository(npe1_repo)
    assert "Is this package already converted?" in str(e.value)


def _assert_expected_errors(record: pytest.WarningsRecorder):
    assert len(record) == 4
    msg = str(record[0].message)
    assert "Error converting dock widget [2] from 'npe1_module'" in msg
    msg = str(record[1].message)
    assert "Error converting function [1] from 'npe1_module'" in msg
    msg = str(record[2].message)
    assert "Failed to convert napari_provide_sample_data in 'npe1-plugin'" in msg
    assert "could not get resolvable python name" in msg
    msg = str(record[3].message)
    assert "Cannot auto-update setup.py, please edit setup.py as follows" in msg
    assert "npe1-plugin = npe1_module:napari.yaml" in msg


def test_conversion_from_package_setup_py(npe1_repo, mock_npe1_pm_with_plugin):
    (npe1_repo / "setup.cfg").unlink()
    (npe1_repo / "setup.py").write_text(
        """from setuptools import setup

NAME = 'npe1-plugin'
setup(
    name=NAME,
    entry_points={"napari.plugin": ["npe1-plugin = npe1_module"]}
)
"""
    )
    with pytest.warns(UserWarning) as record:
        convert_repository(npe1_repo)
    _assert_expected_errors(record)


def test_conversion_entry_point_string(npe1_repo, mock_npe1_pm_with_plugin):
    (npe1_repo / "setup.cfg").unlink()
    (npe1_repo / "setup.py").write_text(
        """from setuptools import setup

setup(
    name='npe1-plugin',
    entry_points={"napari.plugin": "npe1-plugin = npe1_module"}
)
"""
    )
    with pytest.warns(UserWarning) as record:
        convert_repository(npe1_repo)
    _assert_expected_errors(record)


def test_conversion_missing():
    with pytest.raises(ModuleNotFoundError), pytest.warns(UserWarning):
        manifest_from_npe1("does-not-exist-asdf6as987")


def test_conversion_package_is_not_a_plugin():
    with pytest.raises(PackageNotFoundError):
        manifest_from_npe1("pytest")


def test_get_top_module_path(mock_npe1_pm_with_plugin):
    get_top_module_path("npe1-plugin")


def test_python_name_local():
    def f():
        return lambda x: None

    with pytest.raises(ValueError) as e:
        _from_npe1._python_name(f())
    assert "functions defined in local scopes are not yet supported" in str(e.value)


def test_guess_fname_patterns():
    def get_reader1(path):
        if isinstance(path, str) and path.endswith((".tiff", ".tif")):
            return 1

    def get_reader2(path):
        if path.endswith(".xyz"):
            return 1

    assert _from_npe1._guess_fname_patterns(get_reader1) == ["*.tiff", "*.tif"]
    assert _from_npe1._guess_fname_patterns(get_reader2) == ["*.xyz"]

npe2-0.7.2/tests/test_docs.py0000644000000000000000000000100313615410400013005 0ustar00import sys
from pathlib import Path

import pytest

from npe2 import PluginManifest

DOCS_DIR = Path(__file__).parent.parent / "_docs"


def test_example_manifest():
    example = DOCS_DIR / "example_manifest.yaml"
    assert PluginManifest.from_file(example)


@pytest.mark.github_main_only
def test_render_docs(tmp_path, monkeypatch):
    sys.path.append(str(DOCS_DIR.parent))
    from _docs.render import main

    assert not list(tmp_path.glob("*.md"))
    main(tmp_path)
    assert list(tmp_path.glob("*.md"))

npe2-0.7.2/tests/test_fetch.py0000644000000000000000000001056413615410400013152 0ustar00import os
import urllib.request
from importlib.metadata import PackageNotFoundError
from unittest.mock import patch

import pytest

from npe2 import PluginManifest, fetch_manifest
from npe2._inspection._fetch import (
    _manifest_from_pypi_sdist,
    get_hub_plugin,
    get_hub_plugins,
    get_manifest_from_wheel,
    get_pypi_plugins,
    get_pypi_url,
)
from npe2._inspection._full_install import fetch_manifest_with_full_install
from npe2.manifest._npe1_adapter import NPE1Adapter


def test_fetch_npe2_manifest():
    mf = fetch_manifest("napari-omero")
    assert mf.name == "napari-omero"
    assert any(mf.contributions.dict().values())
    assert mf.npe1_shim is False


def test_fetch_npe1_manifest_with_writer():
    mf = fetch_manifest("example-plugin")
    assert mf.name == "example-plugin"
    assert mf.contributions.writers
    # Test will eventually fail when example-plugin is updated to npe2
    # This is here as a sentinel
    assert mf.npe1_shim is True


def test_fetch_npe1_manifest_with_sample_data():
    mf = fetch_manifest("napari-pyclesperanto-assistant")
    assert mf.name == "napari-pyclesperanto-assistant"
    assert mf.contributions.sample_data
    # Test will eventually fail when napari-pyclesperanto-assistant is updated to npe2
    # This is here as a sentinel
    assert mf.npe1_shim is True


def test_fetch_npe1_manifest_dock_widget_as_attribute():
    # This test is just to add coverage of a specific branch of code in the
    # napari_experimental_provide_dock_widget parser, (where the return value
    # is a dotted attribute, rather than a direct name). I only saw it in
    # brainreg-segment.
    mf = fetch_manifest("brainreg-segment")
    assert mf.name == "brainreg-segment"
    assert mf.contributions.widgets
    # Test will eventually fail when brainreg-segment is updated to npe2
    # This is here as a sentinel
    assert mf.npe1_shim is True


@pytest.mark.parametrize("version", [None, "0.1.0"])
@pytest.mark.parametrize("packagetype", ["sdist", "bdist_wheel", None])
def test_get_pypi_url(version, packagetype):
    assert "npe2" in get_pypi_url("npe2", version=version, packagetype=packagetype)


def test_from_pypi_wheel_bdist_missing():
    error = PackageNotFoundError("No bdist_wheel releases found")
    with patch("npe2._inspection._fetch.get_pypi_url", side_effect=error):
        with pytest.raises(PackageNotFoundError):
            fetch_manifest("my-package")


@pytest.mark.skipif(not os.getenv("CI"), reason="slow, only run on CI")
def test_fetch_manifest_with_full_install():
    # TODO: slowest of the tests ... would be nice to provide a local mock
    # use version 0.1.2 which is npe1
    mf = fetch_manifest_with_full_install("napari-ndtiffs", version="0.1.2")
    assert isinstance(mf, NPE1Adapter)
    assert mf.name == "napari-ndtiffs"
    assert mf.contributions


@pytest.mark.skipif(not os.getenv("CI"), reason="slow, only run on CI")
def test_manifest_from_sdist():
    mf = _manifest_from_pypi_sdist("zarpaint")
    assert mf.name == "zarpaint"


def test_get_manifest_from_wheel(tmp_path):
    url = "https://files.pythonhosted.org/packages/f0/cc/7f6fbce81be3eb73266f398e49df92859ba247134eb086704dd70b43819a/affinder-0.2.3-py3-none-any.whl"  # noqa
    dest = tmp_path / "affinder-0.2.3-py3-none-any.whl"
    urllib.request.urlretrieve(url, dest)
    mf = get_manifest_from_wheel(dest)
    assert mf.name == "affinder"


def test_get_hub_plugins():
    plugins = get_hub_plugins()
    assert len(plugins) > 0


def test_get_hub_plugin():
    info = get_hub_plugin("napari-svg")
    assert info["name"] == "napari-svg"


def test_get_pypi_plugins():
    plugins = get_pypi_plugins()
    assert len(plugins) > 0


@pytest.mark.skipif(not os.getenv("CI"), reason="slow, only run on CI")
@pytest.mark.parametrize(
    "url",
    [
        "https://files.pythonhosted.org/packages/fb/01/e59bc1d6ac96f84ce9d7a46cc5422250e047958ead6c5693ed386cf94003/napari_dv-0.3.0.tar.gz",  # noqa
        "https://files.pythonhosted.org/packages/5d/ae/17779e12ce60d8329306963e1a8dec608465caee582440011ff0c1310715/example_plugin-0.0.7-py3-none-any.whl",  # noqa
        "git+https://github.com/DragaDoncila/example-plugin.git",
        # this one doesn't use setuptools_scm, can check direct zip without clone
        "https://github.com/jo-mueller/napari-stl-exporter/archive/refs/heads/main.zip",
    ],
)
def test_fetch_urls(url):
    assert isinstance(fetch_manifest(url), PluginManifest)

npe2-0.7.2/tests/test_implements.py0000644000000000000000000000561113615410400014231 0ustar00import sys
from contextlib import nullcontext
from pathlib import Path

import pytest

import npe2.implements
from npe2 import PluginManifest
from npe2._inspection import
find_npe2_module_contributions SAMPLE_PLUGIN_NAME = "my-plugin" SAMPLE_MODULE_NAME = "my_plugin" SAMPLE_DIR = Path(__file__).parent / "sample" def test_extract_manifest(): module_with_decorators = SAMPLE_DIR / "_with_decorators.py" extracted = find_npe2_module_contributions( module_with_decorators, plugin_name=SAMPLE_PLUGIN_NAME, module_name=SAMPLE_MODULE_NAME, ) assert extracted.commands assert extracted.readers assert extracted.writers assert extracted.widgets assert extracted.sample_data # get expectations from manually created manifest known_manifest = Path(__file__).parent / "sample" / "my_plugin" / "napari.yaml" expected = PluginManifest.from_file(known_manifest).contributions non_python = ("my-plugin.hello_world", "my-plugin.another_command") expected.commands = [c for c in expected.commands if c.id not in non_python] expected.sample_data = [c for c in expected.sample_data if not hasattr(c, "uri")] # check that they're all the same _id = lambda x: x.id # noqa assert sorted(extracted.commands, key=_id) == sorted(expected.commands, key=_id) k = lambda x: x.command # noqa assert sorted(extracted.readers, key=k) == sorted(expected.readers, key=k) assert sorted(extracted.writers, key=k) == sorted(expected.writers, key=k) assert sorted(extracted.widgets, key=k) == sorted(expected.widgets, key=k) assert sorted(extracted.sample_data, key=k) == sorted(expected.sample_data, key=k) def test_dynamic(monkeypatch): with monkeypatch.context() as m: m.setattr(sys, "path", [*sys.path, str(SAMPLE_DIR)]) import _with_decorators assert hasattr(_with_decorators.get_reader, "_npe2_ReaderContribution") info = _with_decorators.get_reader._npe2_ReaderContribution assert info == { "id": "some_reader", "title": "Some Reader", "filename_patterns": ["*.fzy", "*.fzzy"], "accepts_directories": True, } # we can compile a module object as well as a string path extracted = find_npe2_module_contributions( _with_decorators, plugin_name=SAMPLE_PLUGIN_NAME, module_name=SAMPLE_MODULE_NAME, ) 
        assert extracted.commands


@pytest.mark.parametrize("check", [True, False])
def test_decorator_arg_check(check):
    """Check that the decorators don't check arguments at runtime unless instructed."""
    # tilde is wrong and filename_patterns is missing
    kwargs = {"id": "some_id", "tilde": "some_title"}
    kwargs[npe2.implements.CHECK_ARGS_PARAM] = check
    ctx = pytest.raises(TypeError) if check else nullcontext()
    with ctx:
        npe2.implements.reader(**kwargs)(lambda: None)

npe2-0.7.2/tests/test_manifest.py

from importlib import metadata
from pathlib import Path
from unittest.mock import patch

import pytest
from pydantic import ValidationError

from npe2 import PluginManifest
from npe2.manifest import PackageMetadata
from npe2.manifest.schema import ENTRY_POINT

SAMPLE_PLUGIN_NAME = "my-plugin"
SAMPLE_MODULE_NAME = "my_plugin"


def test_sample_plugin_valid(sample_manifest):
    assert sample_manifest


def test_discover_empty():
    # sanity check to make sure sample_plugin must be in path to be discovered
    results = PluginManifest.discover()
    manifests = [result.manifest.name for result in results if result.manifest]
    assert SAMPLE_PLUGIN_NAME not in manifests


def test_schema():
    assert isinstance(PluginManifest.schema_json(), str)

    dschema = PluginManifest.schema()
    assert isinstance(dschema, dict)
    assert "name" in dschema["properties"]


def test_discover(uses_sample_plugin):
    discover_results = list(PluginManifest.discover())
    assert len(discover_results) == 1
    [(manifest, distribution, error)] = discover_results
    assert manifest and manifest.name == SAMPLE_PLUGIN_NAME
    assert distribution
    entrypoint = tuple(distribution.entry_points)[0]
    assert entrypoint and entrypoint.group == "napari.manifest" == ENTRY_POINT
    assert entrypoint.value == f"{SAMPLE_MODULE_NAME}:napari.yaml"
    assert error is None


def test_discover_errors(tmp_path: Path):
    """testing various discovery errors"""
    # package with proper `napari.manifest` entry_point, but invalid pointer to
    # a manifest should yield an error in results
    a = tmp_path / "a"
    a.mkdir()
    a_ep = a / "entry_points.txt"
    bad_value = "asdfsad:blahblahblah.yaml"
    a_ep.write_text(f"[napari.manifest]\n{SAMPLE_PLUGIN_NAME} = {bad_value}")

    # package with proper `napari.manifest` entry_point, but invalid manifest
    b = tmp_path / "b"
    b.mkdir()
    b_ep = b / "entry_points.txt"
    b_ep.write_text("[napari.manifest]\nsome_plugin = module:napari.yaml")
    module = tmp_path / "module"
    module.mkdir()
    (module / "napari.yaml").write_text("name: hi??")

    # a regular package, without a napari.manifest entry_point, should just be skipped
    c = tmp_path / "c"
    c.mkdir()
    c_ep = c / "entry_points.txt"
    c_ep.write_text("[console.scripts]\nsomething = something")

    dists = [
        metadata.PathDistribution(a),
        metadata.PathDistribution(b),
        metadata.PathDistribution(c),
    ]
    with patch.object(metadata, "distributions", return_value=dists):
        discover_results = list(PluginManifest.discover(paths=[tmp_path]))

    assert len(discover_results) == 2
    res_a, res_b = discover_results

    assert res_a.manifest is None
    assert res_a.distribution
    assert tuple(res_a.distribution.entry_points)[0].value == bad_value
    assert "Cannot find module 'asdfsad'" in str(res_a.error)

    assert res_b.manifest is None
    assert res_b.distribution
    assert tuple(res_b.distribution.entry_points)[0].value == "module:napari.yaml"
    assert isinstance(res_b.error, ValidationError)


def test_package_meta(uses_sample_plugin):
    direct_meta = PackageMetadata.for_package(SAMPLE_PLUGIN_NAME)
    assert direct_meta.name == SAMPLE_PLUGIN_NAME
    assert direct_meta.version == "1.2.3"
    discover_results = list(PluginManifest.discover())
    [(manifest, *_)] = discover_results
    assert manifest
    assert manifest.package_metadata == direct_meta

    assert manifest.author == direct_meta.author == "The Black Knight"
    assert manifest.description == direct_meta.summary == "My napari plugin"
    assert manifest.package_version == direct_meta.version == "1.2.3"
    assert manifest.license == direct_meta.license == "BSD-3"


def test_all_package_meta():
    """make sure PackageMetadata works for whatever packages are in the environment.

    just a brute force way to get a little more validation coverage
    """
    for d in metadata.distributions():
        assert PackageMetadata.from_dist_metadata(d.metadata)


@pytest.mark.parametrize("format", ["toml", "json", "yaml", "pyproject"])
def test_export_round_trip(sample_manifest, tmp_path, format):
    """Test that an exported manifest can be round-tripped."""
    if format == "pyproject":
        out_file = tmp_path / "pyproject.toml"
        out_file.write_text(sample_manifest.toml(pyproject=True))
    else:
        out_file = tmp_path / f"napari.{format}"
        out_file.write_text(getattr(sample_manifest, format)())
    assert sample_manifest == PluginManifest.from_file(out_file)


def test_from_distribution(uses_sample_plugin):
    mf = PluginManifest.from_distribution(SAMPLE_PLUGIN_NAME)
    assert mf.name == SAMPLE_PLUGIN_NAME
    assert mf.package_metadata == PackageMetadata.for_package(SAMPLE_PLUGIN_NAME)

    with pytest.raises(metadata.PackageNotFoundError):
        _ = PluginManifest.from_distribution("not-an-installed-package")

    with pytest.raises(ValueError) as e:
        # valid package, but doesn't have a manifest
        _ = PluginManifest.from_distribution("pytest")
    assert "exists but does not provide a napari manifest" in str(e.value)


def test_from_package_name_err():
    with pytest.raises(ValueError) as e:
        PluginManifest._from_package_or_name("nonsense")
    assert "Could not find manifest for 'nonsense'" in str(e.value)


def test_dotted_name_with_command():
    with pytest.raises(ValidationError, match="must start with the current package"):
        PluginManifest(
            name="plugin.plugin-sample",
            contributions={"commands": [{"id": "plugin.command", "title": "Sample"}]},
        )

    with pytest.raises(ValidationError, match="must begin with the package name"):
        PluginManifest(
            name="plugin.plugin-sample",
            contributions={
                "commands": [{"id": "plugin.plugin-samplecommand", "title": "Sample"}]
            },
        )

    PluginManifest(
        name="plugin.plugin-sample",
        contributions={
            "commands": [{"id": "plugin.plugin-sample.command", "title": "Sample"}]
        },
    )


def test_visibility():
    mf = PluginManifest(name="myplugin")
    assert mf.is_visible

    mf = PluginManifest(name="myplugin", visibility="hidden")
    assert not mf.is_visible

    with pytest.raises(ValidationError):
        mf = PluginManifest(name="myplugin", visibility="other")


def test_icon():
    PluginManifest(name="myplugin", icon="my_plugin:myicon.png")


def test_dotted_plugin_name():
    """Test that a dotted (namespaced) plugin name works with contributions."""
    name = "some.namespaced.plugin"
    cmd_id = f"{name}.frame_rate_widget"
    mf = PluginManifest(
        name=name,
        contributions={
            "commands": [
                {
                    "id": cmd_id,
                    "title": "open my widget",
                }
            ],
            "widgets": [
                {
                    "command": cmd_id,
                    "display_name": "Plot frame rate",
                }
            ],
        },
    )
    assert mf.contributions.widgets
    assert mf.contributions.widgets[0].plugin_name == name

npe2-0.7.2/tests/test_npe1_adapter.py

from functools import partial
from importlib import metadata
from pathlib import Path
from unittest.mock import patch

import numpy as np
import pytest

from npe2 import PluginManager
from npe2.manifest import _npe1_adapter, utils
from npe2.manifest.contributions import SampleDataGenerator
from npe2.manifest.utils import SHIM_NAME_PREFIX


def test_adapter_no_npe1():
    pm = PluginManager()
    pm.discover()
    assert not pm._npe1_adapters


def test_npe1_adapter(uses_npe1_plugin, mock_cache: Path):
    """Test that the plugin manager detects npe1 plugins, and can index contribs"""
    pm = PluginManager()
    pm.discover(include_npe1=True)

    # we've found an adapter
    assert len(pm._npe1_adapters) == 1
    mf = pm.get_manifest("npe1-plugin")
    assert isinstance(mf, _npe1_adapter.NPE1Adapter)
    assert mf.package_metadata
    assert mf.package_metadata.version == "0.1.0"
    assert mf.package_metadata.name == "npe1-plugin"

    # it's currently unindexed and unstored
    assert not mf._cache_path().exists()
    assert not list(mock_cache.iterdir())

    with patch.object(
        _npe1_adapter,
        "manifest_from_npe1",
        wraps=_npe1_adapter.manifest_from_npe1,  # type: ignore
    ) as mock:
        pm.index_npe1_adapters()
        # the adapter has been cleared by the indexing
        assert len(pm._npe1_adapters) == 0
        # manifest_from_npe1 was called
        mock.assert_called_once_with(mf._dist, adapter=True)
        assert mf._cache_path().exists()
        # NOTE: accessing the `.contributions` object would have also triggered
        # importing, like pm.index_npe1_adapters() above, but it would not have
        # injected the contributions into the pm._contrib object.
        assert mf.contributions.sample_data

        mock.reset_mock()
        # clear and rediscover... this time we expect the cache to kick in
        pm.discover(clear=True, include_npe1=True)
        assert len(pm._npe1_adapters) == 1
        pm.index_npe1_adapters()
        assert len(pm._npe1_adapters) == 0
        mock.assert_not_called()


def test_npe1_adapter_cache(uses_npe1_plugin, mock_cache: Path):
    """Test that we can clear cache, etc.."""
    pm = PluginManager()
    pm.discover(include_npe1=True)

    with patch.object(
        _npe1_adapter,
        "manifest_from_npe1",
        wraps=_npe1_adapter.manifest_from_npe1,  # type: ignore
    ) as mock:
        # if we clear the cache, it should import again
        mf = pm.get_manifest("npe1-plugin")
        assert isinstance(mf, _npe1_adapter.NPE1Adapter)
        pm.index_npe1_adapters()
        mock.assert_called_once_with(mf._dist, adapter=True)
        assert mf._cache_path().exists()
        _npe1_adapter.clear_cache()
        assert not mf._cache_path().exists()

        mock.reset_mock()
        pm.discover(clear=True, include_npe1=True)
        pm.index_npe1_adapters()
        mf = pm.get_manifest("npe1-plugin")
        assert isinstance(mf, _npe1_adapter.NPE1Adapter)
        mock.assert_called_once_with(mf._dist, adapter=True)
        assert mf._cache_path().exists()
        _npe1_adapter.clear_cache(names=["not-our-plugin"])
        assert mf._cache_path().exists()
        _npe1_adapter.clear_cache(names=["npe1-plugin"])
        assert not mf._cache_path().exists()


def _get_mf() -> _npe1_adapter.NPE1Adapter:
    pm = PluginManager.instance()
    pm.discover(include_npe1=True)
    pm.index_npe1_adapters()
    mf = pm.get_manifest("npe1-plugin")
    assert isinstance(mf, _npe1_adapter.NPE1Adapter)
    return mf


def test_adapter_pyname_sample_data(uses_npe1_plugin, mock_cache):
    """Test that objects defined locally in npe1 hookspecs can be retrieved."""
    mf = _get_mf()
    samples = mf.contributions.sample_data
    assert samples
    sample_generator = next(s for s in samples if s.key == "local_data")
    assert isinstance(sample_generator, SampleDataGenerator)

    ONES = np.ones((4, 4))
    with patch.object(utils, "_import_npe1_shim", wraps=utils._import_npe1_shim) as m:
        func = sample_generator.get_callable()
        assert isinstance(func, partial)
        # this is how it was defined in npe1-plugin
        pyname = f"{SHIM_NAME_PREFIX}npe1_module:napari_provide_sample_data_1"
        m.assert_called_once_with(pyname)
        assert np.array_equal(func(), ONES)

    # test nested sample data too
    sample_generator = next(s for s in samples if s.display_name == "Some local ones")
    func = sample_generator.get_callable()
    assert np.array_equal(func(), ONES)


def test_adapter_pyname_dock_widget(uses_npe1_plugin, mock_cache):
    """Test that objects defined locally in npe1 hookspecs can be retrieved."""
    mf = _get_mf()
    widgets = mf.contributions.widgets
    assert widgets
    wdg_contrib = next(w for w in widgets if w.display_name == "Local Widget")

    with patch.object(utils, "_import_npe1_shim", wraps=utils._import_npe1_shim) as m:
        caller = wdg_contrib.get_callable()
        assert isinstance(caller, partial)
        assert ".local_widget" in caller.keywords["function"].__qualname__
        pyname = (
            f"{SHIM_NAME_PREFIX}npe1_module:napari_experimental_provide_dock_widget_2"
        )
        m.assert_called_once_with(pyname)

        m.reset_mock()
        wdg_contrib2 = next(
            w for w in widgets if w.display_name == "local function" and w.autogenerate
        )
        caller2 = wdg_contrib2.get_callable()
        assert isinstance(caller2, partial)
        assert ".local_function" in caller2.keywords["function"].__qualname__
        pyname = f"{SHIM_NAME_PREFIX}npe1_module:napari_experimental_provide_function_1"
        m.assert_called_once_with(pyname)


def test_adapter_error_on_import():
    class FakeDist(metadata.Distribution):
        def read_text(self, filename):
            if filename == "METADATA":
                return "Name: fake-plugin\nVersion: 0.1.0\n"

        def locate_file(self, *_):
            ...

    adapter = _npe1_adapter.NPE1Adapter(FakeDist())

    def err():
        raise ImportError("No package found.")

    with pytest.warns(UserWarning) as record:
        with patch.object(_npe1_adapter, "manifest_from_npe1", wraps=err):
            _ = adapter.contributions
    assert "Error importing contributions for" in str(record[0])


def test_adapter_cache_fail(uses_npe1_plugin, mock_cache):
    pm = PluginManager()
    pm.discover(include_npe1=True)
    mf = pm.get_manifest("npe1-plugin")

    def err(obj):
        raise OSError("Can't cache")

    with patch.object(_npe1_adapter.NPE1Adapter, "_save_to_cache", err):
        # shouldn't reraise the error
        _ = mf.contributions

npe2-0.7.2/tests/test_package_meta.py

from npe2 import PackageMetadata


def test_package_metadata_version():
    """Test that we intelligently pick the min required metadata version"""
    assert PackageMetadata(name="test", version="1.0").metadata_version == "1.0"

    pm2 = PackageMetadata(name="test", version="1.0", maintainer="bob")
    assert pm2.metadata_version == "1.2"

    pm3 = PackageMetadata(
        name="test",
        version="1.0",
        maintainer="bob",
        description_content_type="text/markdown",
    )
    assert pm3.metadata_version == "2.1"


def test_hashable():
    hash(PackageMetadata(name="test", version="1.0"))


def test_package_metadata_extra_field():
    pkg = {
        "name": "test",
        "version": "1.0",
        "maintainer": "bob",
        "extra_field_that_is_definitely_not_in_the_model": False,
    }
    try:
        p = PackageMetadata(**pkg)
    except Exception as e:
        raise AssertionError(
            "failed to parse PackageMetadata from a dict with an extra field"
        ) from e
    assert p.name == "test"
    assert p.version == "1.0"
    assert p.maintainer == "bob"
    assert not hasattr(p, "extra_field_that_is_definitely_not_in_the_model")

npe2-0.7.2/tests/test_plugin_manager.py

import sys
from unittest.mock import Mock, patch

import pytest

from npe2._command_registry import CommandHandler, CommandRegistry
from npe2._plugin_manager import PluginManager
from npe2.manifest.schema import PluginManifest
from npe2.types import PythonName

SAMPLE_PLUGIN_NAME = "my-plugin"


@pytest.fixture
def pm(sample_path):
    pm = PluginManager()
    pm.discover()
    assert len(pm._manifests) == 0
    sys.path.append(str(sample_path))
    try:
        pm.discover()
        yield pm
    finally:
        sys.path.remove(str(sample_path))


def test_discover_clear(uses_sample_plugin):
    pm = PluginManager.instance()
    assert SAMPLE_PLUGIN_NAME in pm._manifests
    reg_mock = Mock()
    pm.events.plugins_registered.connect(reg_mock)
    with patch.object(pm, "register", wraps=pm.register) as mock:
        pm.discover()
        mock.assert_not_called()  # nothing new to register
        reg_mock.assert_not_called()
        mock.reset_mock()
        pm.discover(clear=True)  # clear forces reregister
        mock.assert_called_once()
        reg_mock.assert_called_once_with({pm._manifests[SAMPLE_PLUGIN_NAME]})


def test_plugin_manager(pm: PluginManager):
    assert pm.get_command(f"{SAMPLE_PLUGIN_NAME}.hello_world")
    assert SAMPLE_PLUGIN_NAME not in pm._contexts
    ctx = pm.activate(SAMPLE_PLUGIN_NAME)
    assert SAMPLE_PLUGIN_NAME in pm._contexts
    assert pm.get_manifest(SAMPLE_PLUGIN_NAME)

    # dual activation is prevented
    assert pm.activate(SAMPLE_PLUGIN_NAME) is ctx

    assert pm.get_command(f"{SAMPLE_PLUGIN_NAME}.hello_world")
    assert pm.get_submenu("mysubmenu")
    with pytest.raises(KeyError):
        pm.get_submenu("not-a-submenu")
    assert len(list(pm.iter_menu("/napari/layer_context"))) == 2

    # deactivation
    assert SAMPLE_PLUGIN_NAME in pm._contexts
    pm.deactivate(SAMPLE_PLUGIN_NAME)
    assert SAMPLE_PLUGIN_NAME not in pm._contexts
    pm.deactivate(SAMPLE_PLUGIN_NAME)  # second time is a no-op
    assert SAMPLE_PLUGIN_NAME not in pm._contexts


def test_plugin_manager_register(sample_path):
    sys.path.append(str(sample_path))
    try:
        pm = PluginManager()
        pm.register(str(sample_path / "my_plugin" / "napari.yaml"))
        assert "my-plugin" in pm._manifests
        pm.unregister("my-plugin")
        assert "my-plugin" not in pm._manifests
        pm.register("my_plugin")
        assert "my-plugin" in pm._manifests
    finally:
        sys.path.remove(str(sample_path))


def test_plugin_manager_raises(pm: PluginManager):
    with pytest.raises(KeyError):
        pm.get_manifest("not-a-pluginxxx")
    with pytest.raises(KeyError):
        pm.activate("not a thing")
    with pytest.raises(KeyError):
        pm.get_command(f"{SAMPLE_PLUGIN_NAME}.not_a_thing")
    with pytest.raises(ValueError) as e:
        pm.register(PluginManifest(name=SAMPLE_PLUGIN_NAME))
    assert f"A manifest with name {SAMPLE_PLUGIN_NAME!r} already" in str(e.value)


def test_command_handler():
    with pytest.raises(RuntimeError):
        # cannot resolve something without either a python_name or function
        CommandHandler("hi").resolve()
    with pytest.raises(RuntimeError):
        # cannot resolve something without either a python_name or function
        CommandHandler("hi", python_name=PythonName("cannot.import.this")).resolve()


def test_command_reg_register():
    reg = CommandRegistry()
    with pytest.raises(ValueError):
        # must register non empty string id
        reg.register(1, lambda: None)  # type: ignore
    with pytest.raises(TypeError):
        # neither a string or a callable
        reg.register("other.id", 8)  # type: ignore
    with pytest.raises(ValueError):
        # must register a valid python name
        reg.register("some.id", "1_is_not.a_valid_python_name")
    reg.register("some.id", "this.is.a_valid_python_name")
    with pytest.raises(ValueError):
        # already registered
        reg.register("some.id", "this.is.a_valid_python_name")
    with pytest.raises(KeyError) as e:
        reg.get("not.a.command")
    assert "command 'not.a.command' not registered" in str(e.value)


def test_command_reg_get():
    def f(x, y):
        return x + y

    reg = CommandRegistry()
    reg.register("id", f)
    assert "id" in reg
    assert reg.get("id") is f
    assert reg.execute("id", (1, 2)) == 3


def _assert_sample_enabled(plugin_manager: PluginManager, enabled=True):
    i = SAMPLE_PLUGIN_NAME in plugin_manager._contrib._indexed
    assert i if enabled else not i
    _not = "not " if not enabled else ""

    # command
    if enabled:
        assert plugin_manager.get_command(f"{SAMPLE_PLUGIN_NAME}.hello_world")
    else:
        with pytest.raises(KeyError):
            assert plugin_manager.get_command(f"{SAMPLE_PLUGIN_NAME}.hello_world")

    # reader
    cmds = [r.command for r in plugin_manager.iter_compatible_readers("*.fzy")]
    b = f"{SAMPLE_PLUGIN_NAME}.some_reader" in cmds
    assert b if enabled else not b, f"Reader should {_not}be enabled"

    # writer
    cmds = [r.command for r in plugin_manager.iter_compatible_writers(["image"] * 2)]
    c = f"{SAMPLE_PLUGIN_NAME}.my_writer" in cmds
    assert c if enabled else not c, f"Writer should {_not}be enabled"

    d = "SampleTheme" in [t.label for t in plugin_manager.iter_themes()]
    assert d if enabled else not d, f"Theme should {_not}be enabled"


def test_enable_disable(uses_sample_plugin, plugin_manager: PluginManager, tmp_path):
    _assert_sample_enabled(plugin_manager)

    # just to test the enabled= kwarg on iter_manifests
    # (this would show *only* disabled plugins)
    assert not list(plugin_manager.iter_manifests(disabled=True))

    # Do disable
    mock = Mock()
    plugin_manager.events.enablement_changed.connect(mock)
    plugin_manager.disable(SAMPLE_PLUGIN_NAME)
    mock.assert_called_once_with({}, {SAMPLE_PLUGIN_NAME})  # enabled, disabled
    _assert_sample_enabled(plugin_manager, False)

    # stuff you can't do while disabled:
    with pytest.raises(ValueError):
        plugin_manager.activate(SAMPLE_PLUGIN_NAME)

    # re-enable
    mock.reset_mock()
    plugin_manager.enable(SAMPLE_PLUGIN_NAME)
    mock.assert_called_once_with({SAMPLE_PLUGIN_NAME}, {})  # enabled, disabled
    _assert_sample_enabled(plugin_manager)


def test_warn_on_register_disabled(uses_sample_plugin, plugin_manager: PluginManager):
    assert SAMPLE_PLUGIN_NAME in plugin_manager
    mf = plugin_manager[SAMPLE_PLUGIN_NAME]
    plugin_manager.disable(SAMPLE_PLUGIN_NAME)
    plugin_manager._manifests.pop(SAMPLE_PLUGIN_NAME)  # NOT a good way to "unregister"
    with pytest.warns(UserWarning):
        plugin_manager.register(mf)


def test_plugin_manager_dict(uses_sample_plugin, plugin_manager: PluginManager):
    """Test exporting the plugin manager state with `dict()`."""
    d = plugin_manager.dict()
    assert SAMPLE_PLUGIN_NAME in d["plugins"]
    assert "disabled" in d
    assert "activated" in d

    d = plugin_manager.dict(
        include={"contributions", "package_metadata.version"},
        exclude={"contributions.writers", "contributions.readers"},
    )
    plugin_dict = d["plugins"][SAMPLE_PLUGIN_NAME]
    assert set(plugin_dict) == {"contributions", "package_metadata"}
    contribs = set(plugin_dict["contributions"])
    assert "readers" not in contribs
    assert "writers" not in contribs


def test_plugin_context_dispose():
    pm = PluginManager()
    mf = PluginManifest(name="test")
    pm.register(mf)
    mock = Mock()
    pm.get_context("test").register_disposable(mock)
    pm.deactivate("test")
    mock.assert_called_once()

npe2-0.7.2/tests/test_pm_module.py

from npe2 import PluginManager


def test_pm_module():
    from npe2 import plugin_manager as pm

    assert pm.instance() is PluginManager.instance()
    # smoke-test checking that a few of the argument-free things work
    # they may or may not be empty depending on other tests in this suite.
    pm.iter_widgets()
    pm.iter_sample_data()

    # make sure we have it covered.
    for k, v in vars(PluginManager).items():
        if k.startswith("_") or isinstance(v, (classmethod, property)):
            continue
        assert hasattr(pm, k), f"pm.py module is missing function {k!r}"

npe2-0.7.2/tests/test_pytest_plugin.py

import pytest

pytest_plugins = "pytester"

CASE1 = """
from npe2._pytest_plugin import TestPluginManager
from npe2 import PluginManager

def test_something_1(npe2pm):
    assert isinstance(npe2pm, TestPluginManager)
    assert PluginManager.instance() is npe2pm
"""

CASE2 = """
import pytest

def test_something_2(npe2pm, caplog):
    npe2pm.discover()
    assert "TestPluginManager refusing to discover plugins" in caplog.text
    assert len(caplog.records) == 1
    assert caplog.records[0].levelname == "WARNING"
"""

CASE3 = """
from npe2 import DynamicPlugin

def test_something_3(npe2pm):
    with npe2pm.tmp_plugin(name='some_name') as plugin:
        assert isinstance(plugin, DynamicPlugin)
        assert plugin.name in npe2pm._manifests
"""

CASE4 = """
from npe2 import PluginManifest

def test_something_4(npe2pm):
    mf = PluginManifest(name='some_name')
    with npe2pm.tmp_plugin(manifest=mf) as plugin:
        assert plugin.name in npe2pm._manifests
        assert plugin.manifest is mf
"""

CASE5 = """
import pytest
from importlib.metadata import PackageNotFoundError

def test_something_5(npe2pm):
    with pytest.raises(PackageNotFoundError):
        npe2pm.tmp_plugin(package='somepackage')
"""

CASE6 = """
import pytest

def test_something_6(npe2pm):
    with pytest.raises(FileNotFoundError):
        npe2pm.tmp_plugin(manifest='some_path.yaml')
"""


@pytest.mark.parametrize("case", [CASE1, CASE2, CASE3, CASE4, CASE5, CASE6])
def test_npe2pm_fixture(pytester: pytest.Pytester, case):
    """Make sure that the npe2pm fixture works."""
    # create a temporary pytest test file
    pytester.makepyfile(case)
    pytester.runpytest().assert_outcomes(passed=1)

npe2-0.7.2/tests/test_setuptools_plugin.py

import os
import subprocess
import sys
import zipfile
from pathlib import Path import pytest from npe2 import PluginManifest ROOT = Path(__file__).parent.parent TEMPLATE = Path("my_module") / "_napari.yaml" PYPROJECT = """ [build-system] requires = ["setuptools", "wheel", "npe2 @ file://{}"] build-backend = "setuptools.build_meta" [tool.npe2] template="{}" """.format( ROOT, TEMPLATE ) @pytest.mark.skipif(not os.getenv("CI"), reason="slow, only run on CI") @pytest.mark.parametrize("dist_type", ["sdist", "wheel"]) def test_compile(compiled_plugin_dir: Path, tmp_path: Path, dist_type: str) -> None: """ Test that the plugin manager can be compiled. """ pyproject = compiled_plugin_dir / "pyproject.toml" pyproject.write_text(PYPROJECT) template = compiled_plugin_dir / TEMPLATE template.write_text("name: my_compiled_plugin\ndisplay_name: My Compiled Plugin\n") os.chdir(compiled_plugin_dir) subprocess.check_call([sys.executable, "-m", "build", f"--{dist_type}"]) dist_dir = compiled_plugin_dir / "dist" assert dist_dir.is_dir() if dist_type == "sdist": # for sdist, test pip install into a temporary directory # and make sure the compiled manifest is there dist = next(dist_dir.glob("*.tar.gz")) site = tmp_path / "site" subprocess.check_call( [sys.executable, "-m", "pip", "install", str(dist), "--target", str(site)] ) mf_file = site / "my_module" / "napari.yaml" else: # for wheel, make sure that the manifest is included in the wheel dist = next(dist_dir.glob("*.whl")) with zipfile.ZipFile(dist) as zip: zip.extractall(dist_dir) mf_file = dist_dir / "my_module" / "napari.yaml" assert mf_file.exists() mf = PluginManifest.from_file(mf_file) assert mf.display_name == "My Compiled Plugin" assert len(mf.contributions.readers) == 1 assert len(mf.contributions.writers) == 2 npe2-0.7.2/tests/test_tmp_plugin.py0000644000000000000000000000641313615410400014241 0ustar00import pytest from npe2 import DynamicPlugin, PluginManager from npe2.manifest.contributions import SampleDataGenerator TMP = "tmp" @pytest.fixture def tmp_plugin(): local_pm = 
PluginManager() with DynamicPlugin(TMP, plugin_manager=local_pm) as tp: assert TMP in local_pm # make sure it registered yield tp assert TMP not in local_pm # make sure it cleaned up def test_temporary_plugin(tmp_plugin: DynamicPlugin): """Test that we can use tmp_plugin to register commands for testing""" # everything is empty to begin with pm = tmp_plugin.plugin_manager contribs = tmp_plugin.manifest.contributions # everything is empty to begin with assert not contribs.commands assert not contribs.sample_data assert not contribs.readers assert not contribs.writers # we can populate with the contribute.x decorators @tmp_plugin.contribute.sample_data def make_image(x): return x @tmp_plugin.contribute.reader def read_path(path): ... # can override args ID = f"{TMP}.random_id" @tmp_plugin.contribute.command(id=ID) def some_command(): return "hi!" # some require args with pytest.raises(AssertionError) as e: @tmp_plugin.contribute.writer def write_path_bad(path, layer_data): ... assert "layer_types must not be empty" in str(e.value) # it didn't get added assert "tmp.write_path_bad" not in pm.commands @tmp_plugin.contribute.writer(layer_types=["image"]) def write_path(path, layer_data): ... # now it did assert "tmp.write_path" in pm.commands # contributions have been populated assert contribs.commands assert contribs.sample_data assert contribs.readers assert contribs.writers # and the commands work samples = next(contribs for plg, contribs in pm.iter_sample_data() if plg == TMP) gen = samples[0] assert isinstance(gen, SampleDataGenerator) assert gen.exec((1,), _registry=pm.commands) == 1 cmd = pm.get_command(ID) assert cmd.exec(_registry=pm.commands) == "hi!" def test_temporary_plugin_change_pm(tmp_plugin: DynamicPlugin): """We can change the plugin manager we're assigned to. Probably not necessary, but perhaps useful in tests. """ start_pm = tmp_plugin.plugin_manager new_pm = PluginManager() @tmp_plugin.contribute.command def some_command(): return "hi!" 
    assert "tmp.some_command" in start_pm.commands
    assert "tmp.some_command" not in new_pm.commands
    tmp_plugin.plugin_manager = new_pm
    assert "tmp.some_command" not in start_pm.commands
    assert "tmp.some_command" in new_pm.commands
    tmp_plugin.clear()
    assert not tmp_plugin.manifest.contributions.commands


def test_temporary_plugin_spawn(tmp_plugin: DynamicPlugin):
    new = tmp_plugin.spawn("another-name", register=True)
    assert new.name == "another-name"
    assert new.display_name == "another-name"
    assert new.plugin_manager == tmp_plugin.plugin_manager

    assert (t1 := tmp_plugin.spawn(register=True)).name == f"{tmp_plugin.name}-1"
    assert (t2 := tmp_plugin.spawn()).name == f"{tmp_plugin.name}-2"
    assert t1.name in tmp_plugin.plugin_manager._manifests
    assert t2.name not in tmp_plugin.plugin_manager._manifests

npe2-0.7.2/tests/test_utils.py

import pytest

from npe2.manifest.contributions import ContributionPoints
from npe2.manifest.schema import PluginManifest
from npe2.manifest.utils import (
    Version,
    deep_update,
    merge_contributions,
    merge_manifests,
)


def test_version():
    v = Version.parse(b"0.1.2")
    assert v == "0.1.2"
    assert v > {"major": 0, "minor": 1, "patch": 0}
    assert v <= (0, 2, 0)
    assert v == Version(0, 1, 2)
    assert list(v) == [0, 1, 2, None, None]
    assert str(v) == "0.1.2"

    with pytest.raises(TypeError):
        assert v == 1.2
    with pytest.raises(ValueError):
        Version.parse("alkfdjs")
    with pytest.raises(TypeError):
        Version.parse(1.2)  # type: ignore


def test_merge_manifests():
    with pytest.raises(ValueError):
        merge_manifests([])
    with pytest.raises(AssertionError) as e:
        merge_manifests([PluginManifest(name="p1"), PluginManifest(name="p2")])
    assert "All manifests must have same name" in str(e.value)

    pm1 = PluginManifest(
        name="plugin",
        contributions={
            "commands": [{"id": "plugin.command", "title": "some writer"}],
            "writers": [{"command": "plugin.command", "layer_types": ["image"]}],
        },
    )
    pm2 = PluginManifest(
        name="plugin",
        contributions={
            "commands": [{"id": "plugin.command", "title": "some reader"}],
            "readers": [{"command": "plugin.command", "filename_patterns": [".tif"]}],
        },
    )
    expected_merge = PluginManifest(
        name="plugin",
        contributions={
            "commands": [
                {"id": "plugin.command", "title": "some writer"},
                {"id": "plugin.command_2", "title": "some reader"},  # no dupes
            ],
            "writers": [{"command": "plugin.command", "layer_types": ["image"]}],
            "readers": [{"command": "plugin.command_2", "filename_patterns": [".tif"]}],
        },
    )
    assert merge_manifests([pm1]) is pm1
    assert merge_manifests([pm1, pm2]) == expected_merge


def test_merge_contributions():
    a = ContributionPoints(
        commands=[
            {"id": "plugin.command", "title": "some writer"},
        ],
        writers=[{"command": "plugin.command", "layer_types": ["image"]}],
    )
    b = ContributionPoints(
        commands=[
            {"id": "plugin.command", "title": "some writer"},
        ],
        writers=[{"command": "plugin.command", "layer_types": ["image"]}],
    )
    c = ContributionPoints(
        commands=[
            {"id": "plugin.command", "title": "some writer"},
        ],
        writers=[{"command": "plugin.command", "layer_types": ["image"]}],
    )
    expected = ContributionPoints(
        commands=[
            {"id": "plugin.command", "title": "some writer"},
            {"id": "plugin.command_2", "title": "some writer"},
            {"id": "plugin.command_3", "title": "some writer"},
        ],
        writers=[
            {"command": "plugin.command", "layer_types": ["image"]},
            {"command": "plugin.command_2", "layer_types": ["image"]},
            {"command": "plugin.command_3", "layer_types": ["image"]},
        ],
    )
    d = ContributionPoints(**merge_contributions((a, b, c)))
    assert d == expected

    # with overwrite, later contributions with matching command ids take precedence.
    e = ContributionPoints(**merge_contributions((a, b, c), overwrite=True))
    expected = ContributionPoints(
        commands=[
            {"id": "plugin.command", "title": "some writer"},
        ],
        writers=[
            {"command": "plugin.command", "layer_types": ["image"]},
        ],
    )
    assert e == a


def test_deep_update():
    a = {"a": {"b": 1, "c": 2}, "e": 2}
    b = {"a": {"d": 4, "c": 3}, "f": 0}
    c = deep_update(a, b, copy=True)
    assert c == {"a": {"b": 1, "d": 4, "c": 3}, "e": 2, "f": 0}
    assert a == {"a": {"b": 1, "c": 2}, "e": 2}
    deep_update(a, b, copy=False)
    assert a == {"a": {"b": 1, "d": 4, "c": 3}, "e": 2, "f": 0}

npe2-0.7.2/tests/test_validations.py

import json

import pytest
from pydantic import ValidationError

from npe2 import PluginManifest
from npe2.manifest import _validators

# the docstrings here are used to assert the validation error that is printed.

SAMPLE_PLUGIN_NAME = "my-plugin"


def _mutator_invalid_package_name(data):
    """'invalid??' is not a valid python package name."""
    data["name"] = "invalid??"
def _mutator_invalid_package_name2(data): """'_invalid' is not a valid python package name.""" data["name"] = "_invalid" def _mutator_command_not_begin_with_package_name(data): """Commands identifiers must start with the current package name""" assert "contributions" in data c = data["contributions"]["commands"][0]["id"] data["contributions"]["commands"][0]["id"] = ".".join( ["not_packagename", *c.split(".")[1:]] ) def _mutator_python_name_no_colon(data): """'this.has.no.colon' is not a valid python_name.""" assert "contributions" in data data["contributions"]["commands"][0]["python_name"] = "this.has.no.colon" def _mutator_python_name_locals(data): """functions defined in local scopes are not yet supported""" assert "contributions" in data data["contributions"]["commands"][0]["python_name"] = "mod:func..another" def _mutator_python_name_starts_with_number(data): """'1starts_with_number' is not a valid python_name.""" assert "contributions" in data data["contributions"]["commands"][0]["python_name"] = "1starts_with_number" def _mutator_writer_requires_non_empty_layer_types(data): """layer_types must not be empty""" data["contributions"]["writers"][0]["layer_types"] = [] def _mutator_writer_invalid_layer_type_constraint(data): """'image{' is not a valid LayerType""" data["contributions"]["writers"][0]["layer_types"].append("image{") def _mutator_writer_invalid_file_extension_1(data): """Invalid file extension: Must have one character past the '.'""" data["contributions"]["writers"][0]["filename_extensions"] = ["*"] def _mutator_writer_invalid_file_extension_2(data): """Invalid file extension: Must have one character past the '.'""" data["contributions"]["writers"][0]["filename_extensions"] = ["."] def _mutator_invalid_icon(data): """is not a valid icon URL. 
It must start with 'https://'""" data["icon"] = "http://example.com/icon.png" @pytest.mark.parametrize( "mutator", [ _mutator_invalid_package_name, _mutator_invalid_package_name2, _mutator_command_not_begin_with_package_name, _mutator_python_name_no_colon, _mutator_python_name_locals, _mutator_python_name_starts_with_number, _mutator_writer_requires_non_empty_layer_types, _mutator_writer_invalid_layer_type_constraint, _mutator_writer_invalid_file_extension_1, _mutator_writer_invalid_file_extension_2, _mutator_invalid_icon, ], ) def test_invalid(mutator, uses_sample_plugin): result = next( result for result in PluginManifest.discover() if result.manifest and result.manifest.name == SAMPLE_PLUGIN_NAME ) assert result.error is None assert result.manifest is not None pm = result.manifest data = json.loads(pm.json(exclude_unset=True)) mutator(data) with pytest.raises(ValidationError) as excinfo: PluginManifest(**data) assert mutator.__doc__ in str(excinfo.value) def test_schema_version_too_high(): with pytest.warns( UserWarning, match=r"\(999.999.999\) is newer than npe2's schema version" ): PluginManifest(name="sample", schema_version="999.999.999") def test_invalid_python_name(uses_sample_plugin): mf = next( result for result in PluginManifest.discover() if result.manifest and result.manifest.name == SAMPLE_PLUGIN_NAME ).manifest assert mf and mf.contributions and mf.contributions.commands assert mf.contributions.commands[-1].python_name mf.validate_imports() mf.contributions.commands[-1].python_name += "_whoops" # type: ignore with pytest.raises(ValidationError) as e: mf.validate_imports() assert "has no attribute 'make_widget_from_function_whoops'" in str(e.value) def _valid_mutator_no_contributions(data): """ Contributions can be absent, in which case the Pydantic model will set the default value to an empty Contributions model """ del data["contributions"] def _valid_mutator_no_contributions_empty(data): """ Contributions can be an empty list, in which case the 
Pydantic model will set the default value to an empty Contributions model """ data["contributions"] = [] def _valid_mutator_no_contributions_None(data): """ Contributions can be None, in which case the Pydantic model will set the default value to None, and not the empty list, make sure that works. """ # This is no longer recommended. A missing contributions is fine, and an empty # list is fine. We preserve this for backwards compatibility, # but providing None explicitly shouldn't be used data["contributions"] = None @pytest.mark.parametrize( "mutator", [_valid_mutator_no_contributions, _valid_mutator_no_contributions_None], ) def test_valid_mutations(mutator, uses_sample_plugin): assert mutator.__name__.startswith("_valid") pm = list(PluginManifest.discover())[0] assert pm.manifest # make sure the data is a copy as we'll mutate it data = json.loads(pm.manifest.json(exclude_unset=True)) mutator(data) PluginManifest(**data) @pytest.mark.parametrize( "display_name", [ "Here there everywhere and more with giggles and friends on top of a mountainside drinking tea", # noqa "ab", "a ", " abc", "_abc", "abc♱", ], ) def test_invalid_display_names(display_name, uses_sample_plugin): field = PluginManifest.__fields__["display_name"] value, err = field.validate(display_name, {}, loc="display_name") assert err is not None @pytest.mark.parametrize( "display_name", [ "Some Cell & Stru买cture Segmenter", "Segment Blobs and Things with Membranes", "Segment: and Things.2 with Membranes ~= 8", "abc", "abc䜁䜂", ], ) def test_valid_display_names(display_name, uses_sample_plugin): field = PluginManifest.__fields__["display_name"] value, err = field.validate(display_name, {}, loc="display_name") assert err is None def test_display_name_default_is_valid(): PluginManifest(name="") @pytest.mark.parametrize( "expr", [ "vectors{", "image", # should parse fine, but be a duplication error "vectors{8,3}", "vectors{-1}", "vectors??", "other?", ], ) def 
test_writer_invalid_layer_type_expressions(expr, uses_sample_plugin): result = next( result for result in PluginManifest.discover() if result.manifest and result.manifest.name == SAMPLE_PLUGIN_NAME ) assert result.error is None assert result.manifest is not None pm = result.manifest data = json.loads(pm.json(exclude_unset=True)) assert "contributions" in data assert "writers" in data["contributions"] data["contributions"]["writers"][0]["layer_types"].append(expr) with pytest.raises(ValidationError): PluginManifest(**data) @pytest.mark.parametrize("id", ["badchar!?", "-bad-start", "has space"]) def test_invalid_command_id(id): with pytest.raises(ValueError): _validators.command_id(id) npe2-0.7.2/tests/fixtures/my-compiled-plugin/setup.cfg0000644000000000000000000000027113615410400017653 0ustar00[metadata] name = my_compiled_plugin version = 0.1.0 [options.entry_points] napari.manifest = my-compiled-plugin = my_module:napari.yaml [options.package_data] my_module = *.yaml npe2-0.7.2/tests/fixtures/my-compiled-plugin/my_module/__init__.py0000644000000000000000000000000013615410400022123 0ustar00npe2-0.7.2/tests/fixtures/my-compiled-plugin/my_module/_a.py0000644000000000000000000000207713615410400020763 0ustar00# mypy: disable-error-code=empty-body from typing import TYPE_CHECKING, Any, Dict, List, Tuple # alternative pattern that does not require npe2 at runtime if TYPE_CHECKING: from npe2 import implements else: # create no-op `implements.anything(**kwargs)` decorator D = type("D", (), {"__getattr__": lambda *_: (lambda **_: (lambda f: f))}) implements = D() @implements.on_activate def activate(ctx): ... @implements.on_deactivate def deactivate(ctx): ... @implements.reader( id="some_reader", title="Some Reader", filename_patterns=["*.fzy", "*.fzzy"], accepts_directories=True, ) def get_reader(path: str): ... 
@implements.writer(
    id="my_writer",
    title="My Multi-layer Writer",
    filename_extensions=["*.tif", "*.tiff"],
    layer_types=["image{2,4}", "tracks?"],
)
@implements.writer(
    id="my_writer",
    title="My Multi-layer Writer",
    filename_extensions=["*.pcd", "*.e57"],
    layer_types=["points{1}", "surface+"],
)
def writer_function(path: str, layer_data: List[Tuple[Any, Dict, str]]) -> List[str]:
    ...

npe2-0.7.2/tests/fixtures/my-compiled-plugin/my_module/_b.py

from typing import TYPE_CHECKING

if TYPE_CHECKING:
    from npe2 import implements
else:
    D = type("D", (), {"__getattr__": lambda *_: (lambda **_: (lambda f: f))})
    implements = D()


@implements.widget(id="some_widget", title="Create my widget", display_name="My Widget")
class SomeWidget:
    ...


@implements.sample_data_generator(
    id="my-plugin.generate_random_data",  # the plugin-name is optional
    title="Generate uniform random data",
    key="random_data",
    display_name="Some Random Data (512 x 512)",
)
def random_data():
    ...


@implements.widget(
    id="some_function_widget",
    title="Create widget from my function",
    display_name="A Widget From a Function",
    autogenerate=True,
)
def make_widget_from_function(x: int, threshold: int):
    ...
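The `TYPE_CHECKING` shim used in `_a.py` and `_b.py` above works because attribute access on the one-line `D` class returns a keyword-only decorator factory whose decorator hands the function back unchanged. A minimal standalone sketch of that trick (the names `demo`/`get_reader` here are illustrative, not part of npe2):

```python
from typing import TYPE_CHECKING

if TYPE_CHECKING:
    from npe2 import implements  # real decorators, for type checkers only
else:
    # no-op `implements.anything(**kwargs)` decorator:
    # attribute access -> kwargs-only factory -> identity decorator
    D = type("D", (), {"__getattr__": lambda *_: (lambda **_: (lambda f: f))})
    implements = D()


@implements.reader(id="demo", title="Demo Reader", filename_patterns=["*.xyz"])
def get_reader(path: str):
    return path.upper()


# at runtime the decorator is a no-op, so the function behaves as if undecorated
print(get_reader("data.xyz"))  # prints "DATA.XYZ"
```

Note the factory only accepts keyword arguments (`lambda **_`), which matches how every `implements.*` decorator in these fixtures is invoked.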
npe2-0.7.2/tests/npe1-plugin/setup.cfg

[metadata]
name = npe1-plugin
version = 0.1.0

[options.entry_points]
napari.plugin =
    npe1-plugin = npe1_module

npe2-0.7.2/tests/npe1-plugin/npe1-plugin-0.0.1.dist-info/METADATA

Metadata-Version: 2.1
Name: npe1-plugin
Version: 0.1.0

npe2-0.7.2/tests/npe1-plugin/npe1-plugin-0.0.1.dist-info/entry_points.txt

[napari.plugin]
npe1-plugin = npe1_module

npe2-0.7.2/tests/npe1-plugin/npe1-plugin-0.0.1.dist-info/top_level.txt

npe1_module

npe2-0.7.2/tests/npe1-plugin/npe1_module/__init__.py

from functools import partial

import numpy as np
from magicgui import magic_factory
from napari_plugin_engine import napari_hook_implementation


class MyWidget:
    ...


def some_function(x: int):
    ...


def gen_data():
    ...


@napari_hook_implementation
def napari_get_reader(path):
    ...


@napari_hook_implementation
def napari_write_image(path, data, meta):
    ...


@napari_hook_implementation
def napari_write_labels(path, data, meta):
    ...
@napari_hook_implementation def napari_provide_sample_data(): return { "random data": gen_data, "local data": partial(np.ones, (4, 4)), "random image": "https://picsum.photos/1024", "sample_key": { "display_name": "Some Random Data (512 x 512)", "data": gen_data, }, "local_ones": { "display_name": "Some local ones", "data": partial(np.ones, (4, 4)), }, } @napari_hook_implementation def napari_experimental_provide_theme(): return { "super_dark": { "name": "super_dark", "background": "rgb(12, 12, 12)", "foreground": "rgb(65, 72, 81)", "primary": "rgb(90, 98, 108)", "secondary": "rgb(134, 142, 147)", "highlight": "rgb(106, 115, 128)", "text": "rgb(240, 241, 242)", "icon": "rgb(209, 210, 212)", "warning": "rgb(153, 18, 31)", "current": "rgb(0, 122, 204)", "syntax_style": "native", "console": "rgb(0, 0, 0)", "canvas": "black", }, "pretty_light": { "background": "rgb(192, 223, 139)", }, } factory = magic_factory(some_function) @napari_hook_implementation def napari_experimental_provide_dock_widget(): @magic_factory def local_widget(y: str): ... return [ MyWidget, (factory, {"name": "My Other Widget"}), (local_widget, {"name": "Local Widget"}), ] @napari_hook_implementation def napari_experimental_provide_function(): def local_function(x: int): ... return [some_function, local_function] npe2-0.7.2/tests/sample/_with_decorators.py0000644000000000000000000000463113615410400015644 0ustar00# mypy: disable-error-code=empty-body """This module mimics all of the contributions my-plugin... but is used to reverse-engineer the manifest.""" from typing import TYPE_CHECKING, Any, Dict, List, Tuple # to test various ways that this can be imported, since we're using static parsing. 
import npe2.implements import npe2.implements as impls from npe2 import implements from npe2.implements import reader # alternative pattern that does not require npe2 at runtime if TYPE_CHECKING: from npe2 import implements as noimport else: # create no-op `implements.anything(**kwargs)` decorator D = type("D", (), {"__getattr__": lambda *_: (lambda **_: (lambda f: f))}) noimport = D() @implements.on_activate def activate(ctx): ... @implements.on_deactivate def deactivate(ctx): ... @implements.reader( id="some_reader", title="Some Reader", filename_patterns=["*.fzy", "*.fzzy"], accepts_directories=True, ) def get_reader(path: str): ... @reader( id="url_reader", title="URL Reader", filename_patterns=["http://*", "https://*"], accepts_directories=False, ensure_args_valid=True, ) def url_reader(path: str): ... @noimport.writer( id="my_writer", title="My Multi-layer Writer", filename_extensions=["*.tif", "*.tiff"], layer_types=["image{2,4}", "tracks?"], ) @implements.writer( id="my_writer", title="My Multi-layer Writer", filename_extensions=["*.pcd", "*.e57"], layer_types=["points{1}", "surface+"], ) def writer_function(path: str, layer_data: List[Tuple[Any, Dict, str]]) -> List[str]: ... @implements.writer( id="my_single_writer", title="My single-layer Writer", filename_extensions=["*.xyz"], layer_types=["labels"], ) def writer_function_single(path: str, layer_data: Any, meta: Dict) -> List[str]: ... @npe2.implements.widget( id="some_widget", title="Create my widget", display_name="My Widget" ) class SomeWidget: ... @npe2.implements.sample_data_generator( id="my-plugin.generate_random_data", # the plugin-name is optional title="Generate uniform random data", key="random_data", display_name="Some Random Data (512 x 512)", ) def random_data(): ... @impls.widget( id="some_function_widget", title="Create widget from my function", display_name="A Widget From a Function", autogenerate=True, ) def make_widget_from_function(x: int, threshold: int): ... 
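The `_with_decorators.py` module above exists because npe2 can compile a manifest from `@implements.*` decorators by static analysis, without importing the plugin. This is not npe2's actual compiler, but a toy sketch using only the stdlib `ast` module shows the core idea of recovering decorator keywords without executing the code (the `SRC` snippet and `extract_contributions` helper are hypothetical):

```python
import ast

# hypothetical miniature of static decorator extraction; npe2's real
# compiler handles far more cases than this sketch
SRC = '''
@implements.reader(
    id="some_reader",
    title="Some Reader",
    filename_patterns=["*.fzy", "*.fzzy"],
)
def get_reader(path):
    ...
'''


def extract_contributions(source: str):
    """Collect keyword arguments of `implements.*` decorator calls."""
    found = []
    for node in ast.walk(ast.parse(source)):
        if not isinstance(node, ast.FunctionDef):
            continue
        for deco in node.decorator_list:
            if (
                isinstance(deco, ast.Call)
                and isinstance(deco.func, ast.Attribute)
                and isinstance(deco.func.value, ast.Name)
                and deco.func.value.id == "implements"
            ):
                # literal_eval safely evaluates the constant keyword values
                kwargs = {k.arg: ast.literal_eval(k.value) for k in deco.keywords}
                found.append((deco.func.attr, node.name, kwargs))
    return found


print(extract_contributions(SRC))
```

Because the source is only parsed, never imported, this approach works even when the plugin's runtime dependencies are not installed — the same reason the fixtures can use the no-op `implements` shim.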
npe2-0.7.2/tests/sample/my_plugin/__init__.py0000644000000000000000000000332613615410400016047 0ustar00from typing import TYPE_CHECKING, Any, Dict, List, Tuple from pydantic import BaseModel from npe2 import PluginContext from npe2.types import PathOrPaths if TYPE_CHECKING: import napari.types def activate(context: PluginContext): @context.register_command("my_plugin.hello_world") def _hello(): ... context.register_command("my_plugin.another_command", lambda: print("yo!")) def deactivate(context: PluginContext): """just here for tests""" def get_reader(path: PathOrPaths): if isinstance(path, list): def read(path): assert isinstance(path, list) return [(None,)] return read assert isinstance(path, str) # please mypy. if path.endswith(".fzzy"): def read(path): assert isinstance(path, str) return [(None,)] return read else: raise ValueError("Test plugin should not receive unknown data") def url_reader(path: str): if path.startswith("http"): def read(path): return [(None,)] return read def writer_function(path: str, layer_data: List[Tuple[Any, Dict, str]]) -> List[str]: class Arg(BaseModel): data: Any meta: Dict layer_type: str for e in layer_data: Arg(data=e[0], meta=e[1], layer_type=e[2]) return [path] def writer_function_single(path: str, layer_data: Any, meta: Dict) -> List[str]: class Arg(BaseModel): data: Any meta: Dict Arg(data=layer_data, meta=meta) return [path] class SomeWidget: ... def random_data(): import numpy as np return [(np.random.rand(10, 10))] def make_widget_from_function(image: "napari.types.ImageData", threshold: int): ... 
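The reader functions in `my_plugin/__init__.py` above follow napari's two-step reader protocol: the contributed command receives a path and returns either `None` (to decline) or a callable that produces layer-data tuples. A self-contained sketch of that flow (a simplified stand-in, not the actual plugin; lowercasing before `fnmatch` mirrors the case-insensitive matching fix noted in the changelog):

```python
from fnmatch import fnmatch


def get_reader(path):
    # step 1: decline paths we don't recognize by returning None
    if not fnmatch(path.lower(), "*.fzzy"):
        return None

    def read(path):
        # step 2: a real reader returns [(data, meta, layer_type), ...];
        # a single-element tuple means "data only, default meta/layer_type"
        return [(None,)]

    return read


reader = get_reader("example.FZZY")  # matching is case-insensitive here
assert reader is not None
print(reader("example.FZZY"))
```

Consumers always check the step-1 result before calling it, which is how multiple plugins can be offered the same path until one accepts.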
npe2-0.7.2/tests/sample/my_plugin/napari.yaml0000644000000000000000000000567513615410400016105 0ustar00name: my-plugin display_name: My Plugin on_activate: my_plugin:activate on_deactivate: my_plugin:deactivate icon: https://picsum.photos/256 contributions: commands: - id: my-plugin.hello_world title: Hello World - id: my-plugin.another_command title: Another Command - id: my-plugin.some_reader title: Some Reader python_name: my_plugin:get_reader - id: my-plugin.url_reader title: URL Reader python_name: my_plugin:url_reader - id: my-plugin.my_writer title: My Multi-layer Writer python_name: my_plugin:writer_function - id: my-plugin.my_single_writer title: My single-layer Writer python_name: my_plugin:writer_function_single - id: my-plugin.generate_random_data title: Generate uniform random data python_name: my_plugin:random_data - id: my-plugin.some_widget title: Create my widget python_name: my_plugin:SomeWidget - id: my-plugin.some_function_widget title: Create widget from my function python_name: my_plugin:make_widget_from_function configuration: - title: My Plugin properties: my_plugin.reader.lazy: type: boolean default: false title: Load lazily description: Whether to load images lazily with dask readers: - command: my-plugin.some_reader filename_patterns: ["*.fzy", "*.fzzy"] accepts_directories: true - command: my-plugin.url_reader filename_patterns: ["http://*", "https://*"] accepts_directories: false writers: - command: my-plugin.my_writer filename_extensions: ["*.tif", "*.tiff"] layer_types: ["image{2,4}", "tracks?"] - command: my-plugin.my_writer filename_extensions: ["*.pcd", "*.e57"] layer_types: ["points{1}", "surface+"] - command: my-plugin.my_single_writer filename_extensions: ["*.xyz"] layer_types: ["labels"] widgets: - command: my-plugin.some_widget display_name: My Widget - command: my-plugin.some_function_widget display_name: A Widget From a Function autogenerate: true menus: /napari/layer_context: - submenu: mysubmenu - command: 
my-plugin.hello_world mysubmenu: - command: my-plugin.another_command submenus: - id: mysubmenu label: My SubMenu themes: - label: "SampleTheme" id: "sample_theme" type: "dark" syntax_style: "default" colors: canvas: "#000000" console: "#000000" background: "#272822" foreground: "#75715e" primary: "#cfcfc2" secondary: "#f8f8f2" highlight: "#e6db74" text: "#a1ef34" icon: "#a1ef34" warning: "#f92672" current: "#66d9ef" sample_data: - display_name: Some Random Data (512 x 512) key: random_data command: my-plugin.generate_random_data - display_name: Random internet image key: internet_image uri: https://picsum.photos/1024 npe2-0.7.2/tests/sample/my_plugin-1.2.3.dist-info/METADATA0000644000000000000000000000016713615410400017373 0ustar00Metadata-Version: 2.1 Name: my-plugin Version: 1.2.3 Summary: My napari plugin License: BSD-3 Author: The Black Knight npe2-0.7.2/tests/sample/my_plugin-1.2.3.dist-info/entry_points.txt0000644000000000000000000000006413615410400021562 0ustar00[napari.manifest] my-plugin = my_plugin:napari.yaml npe2-0.7.2/tests/sample/my_plugin-1.2.3.dist-info/top_level.txt0000644000000000000000000000001213615410400021007 0ustar00my_plugin npe2-0.7.2/.gitignore0000644000000000000000000000245213615410400011277 0ustar00# Byte-compiled / optimized / DLL files __pycache__/ *.py[cod] *$py.class # C extensions *.so # Distribution / packaging .Python env/ build/ develop-eggs/ dist/ downloads/ eggs/ .eggs/ lib/ lib64/ parts/ sdist/ var/ wheels/ *.egg-info/ .installed.cfg *.egg .DS_Store # PyInstaller # Usually these files are written by a python script from a template # before PyInstaller builds the exe, so as to inject date/other infos into it. 
*.manifest *.spec # Installer logs pip-log.txt pip-delete-this-directory.txt # Unit test / coverage reports htmlcov/ .tox/ .coverage .coverage.* .cache nosetests.xml coverage.xml *.cover .hypothesis/ .pytest_cache/ # Translations *.mo *.pot # Django stuff: *.log local_settings.py # Flask stuff: instance/ .webassets-cache # Scrapy stuff: .scrapy # Sphinx documentation docs/_build/ # PyBuilder target/ # Jupyter Notebook .ipynb_checkpoints # pyenv .python-version # celery beat schedule file celerybeat-schedule # SageMath parsed files *.sage.py # dotenv .env # virtualenv .venv venv/ ENV/ # Spyder project settings .spyderproject .spyproject # Rope project settings .ropeproject # mkdocs documentation /site # mypy .mypy_cache/ .ruff_cache/ # IDE settings .vscode/ src/npe2/_version.py # ignore everything that gets rendered from _docs docs/plugins/*.md schema.json npe2-0.7.2/LICENSE0000644000000000000000000000275213615410400010317 0ustar00 BSD License Copyright (c) 2021, Talley Lambert All rights reserved. Redistribution and use in source and binary forms, with or without modification, are permitted provided that the following conditions are met: 1. Redistributions of source code must retain the above copyright notice, this list of conditions and the following disclaimer. 2. Redistributions in binary form must reproduce the above copyright notice, this list of conditions and the following disclaimer in the documentation and/or other materials provided with the distribution. 3. Neither the name of the copyright holder nor the names of its contributors may be used to endorse or promote products derived from this software without specific prior written permission. THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS "AS IS" AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE ARE DISCLAIMED. 
IN NO EVENT SHALL THE COPYRIGHT HOLDER OR CONTRIBUTORS BE LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE. npe2-0.7.2/README.md0000644000000000000000000000232113615410400010561 0ustar00# npe2 [![CI](https://github.com/napari/npe2/actions/workflows/ci.yml/badge.svg)](https://github.com/napari/npe2/actions/workflows/ci.yml) [![codecov](https://codecov.io/gh/napari/npe2/branch/main/graph/badge.svg?token=FTH635x542)](https://codecov.io/gh/napari/npe2) napari plugin refactor see also https://github.com/napari/napari/issues/3115 ## Documentation For documentation on authoring npe2 plugins, see the [napari plugin docs](https://napari.org/plugins/index.html). These include: - the [manifest reference](https://napari.org/plugins/manifest.html) - the [contribution guide](https://napari.org/plugins/contributions.html) ## Command line tool Includes a command line tool `npe2` with the following commands: ```bash Commands: cache Cache utils convert Convert first generation napari plugin to new (manifest) format. parse Show parsed manifest as yaml validate Validate manifest for a distribution name or manifest filepath. ``` examples: ```bash # convert current directory to an npe2-ready plugin # (note: the repo must also be installed and importable in the current environment) npe2 convert . 
``` ```bash npe2 validate your-plugin-package ``` ```bash npe2 parse your-plugin-package ``` npe2-0.7.2/pyproject.toml0000644000000000000000000000745213615410400012230 0ustar00# pyproject.toml [build-system] requires = ["hatchling", "hatch-vcs"] build-backend = "hatchling.build" [tool.hatch.version] source = "vcs" # https://peps.python.org/pep-0621/ [project] name = "npe2" dynamic = ["version"] description = "napari plugin engine v2" readme = "README.md" requires-python = ">=3.8" license = { text = "BSD-3-Clause" } authors = [ { name = "Talley Lambert", email = "talley.lambert@gmail.com" }, { name = "Nathan Clack" }, ] classifiers = [ "Development Status :: 3 - Alpha", "License :: OSI Approved :: BSD License", "Natural Language :: English", "Programming Language :: Python :: 3", "Programming Language :: Python :: 3 :: Only", "Programming Language :: Python :: 3.8", "Programming Language :: Python :: 3.9", "Programming Language :: Python :: 3.10", "Programming Language :: Python :: 3.11", "Typing :: Typed", ] dependencies = [ "PyYAML", "appdirs", "build", "psygnal>=0.3.0", "pydantic<2", "tomli-w", "tomli; python_version < '3.11'", "rich", "typer", ] [project.urls] homepage = "https://github.com/napari/npe2" repository = "https://github.com/napari/npe2" # https://peps.python.org/pep-0621/#dependencies-optional-dependencies [project.optional-dependencies] testing = [ "magicgui", "napari-plugin-engine", "napari-svg==0.1.5", "numpy", "pytest", "pytest-cov", "jsonschema", ] dev = ["black", "ipython", "isort", "mypy", "pre-commit"] docs = ["Jinja2", "magicgui>=0.3.3"] json = ["jsonschema"] # Entry points # https://peps.python.org/pep-0621/#entry-points # same as console_scripts entry point [project.scripts] npe2 = "npe2.cli:main" [project.entry-points."distutils.commands"] npe2_compile = "npe2._setuptools_plugin:npe2_compile" [project.entry-points."pytest11"] npe2 = "npe2._pytest_plugin" [project.entry-points."setuptools.finalize_distribution_options"] finalize_npe2 = 
"npe2._setuptools_plugin:finalize_npe2" [tool.check-manifest] ignore = [] [tool.pytest.ini_options] filterwarnings = ["error:::npe2"] addopts = "-m 'not github_main_only'" markers = [ "github_main_only: Test to run only on github main (verify it does not break latest napari docs build)", ] [tool.black] target-version = ['py38', 'py39', 'py310'] line-length = 88 # https://github.com/charliermarsh/ruff [tool.ruff] line-length = 88 target-version = "py38" fix = true src = ["src/npe2", "tests"] select = [ "E", "F", "W", #flake8 "UP", # pyupgrade "I", # isort "B", # flake8-bugbear "C4", # flake8-comprehensions "TID", # flake8-tidy-imports "RUF", # ruff-specific rules ] [tool.ruff.per-file-ignores] "src/npe2/cli.py" = ["B008", "A00"] [tool.ruff.pyupgrade] # Preserve types, even if a file imports `from __future__ import annotations`. keep-runtime-typing = true [tool.ruff.isort] known-first-party = ['npe2'] # https://mypy.readthedocs.io/en/stable/config_file.html [tool.mypy] files = "src/**/*.py" warn_unused_configs = true warn_unused_ignores = true check_untyped_defs = true implicit_reexport = false show_column_numbers = true ignore_missing_imports = true show_error_codes = true pretty = true [tool.coverage.run] source = ["src"] omit = [ "src/npe2/manifest/contributions/_keybindings.py", "src/npe2/manifest/menus.py", "src/npe2/manifest/package_metadata.py", # due to all of the isolated sub-environments and sub-processes, # it's really hard to get coverage on the setuptools plugin. 
"src/npe2/_setuptools_plugin.py", ] # https://coverage.readthedocs.io/en/6.4/config.html [tool.coverage.report] exclude_lines = [ "pragma: no cover", "if TYPE_CHECKING:", "raise AssertionError", "@overload", "@abstractmethod", "except ImportError", "\\.\\.\\.", "raise NotImplementedError", "if __name__ == .__main__.:", ] npe2-0.7.2/PKG-INFO0000644000000000000000000000574513615410400010414 0ustar00Metadata-Version: 2.1 Name: npe2 Version: 0.7.2 Summary: napari plugin engine v2 Project-URL: homepage, https://github.com/napari/npe2 Project-URL: repository, https://github.com/napari/npe2 Author: Nathan Clack Author-email: Talley Lambert License: BSD-3-Clause License-File: LICENSE Classifier: Development Status :: 3 - Alpha Classifier: License :: OSI Approved :: BSD License Classifier: Natural Language :: English Classifier: Programming Language :: Python :: 3 Classifier: Programming Language :: Python :: 3 :: Only Classifier: Programming Language :: Python :: 3.8 Classifier: Programming Language :: Python :: 3.9 Classifier: Programming Language :: Python :: 3.10 Classifier: Programming Language :: Python :: 3.11 Classifier: Typing :: Typed Requires-Python: >=3.8 Requires-Dist: appdirs Requires-Dist: build Requires-Dist: psygnal>=0.3.0 Requires-Dist: pydantic<2 Requires-Dist: pyyaml Requires-Dist: rich Requires-Dist: tomli-w Requires-Dist: tomli; python_version < '3.11' Requires-Dist: typer Provides-Extra: dev Requires-Dist: black; extra == 'dev' Requires-Dist: ipython; extra == 'dev' Requires-Dist: isort; extra == 'dev' Requires-Dist: mypy; extra == 'dev' Requires-Dist: pre-commit; extra == 'dev' Provides-Extra: docs Requires-Dist: jinja2; extra == 'docs' Requires-Dist: magicgui>=0.3.3; extra == 'docs' Provides-Extra: json Requires-Dist: jsonschema; extra == 'json' Provides-Extra: testing Requires-Dist: jsonschema; extra == 'testing' Requires-Dist: magicgui; extra == 'testing' Requires-Dist: napari-plugin-engine; extra == 'testing' Requires-Dist: napari-svg==0.1.5; 
extra == 'testing' Requires-Dist: numpy; extra == 'testing' Requires-Dist: pytest; extra == 'testing' Requires-Dist: pytest-cov; extra == 'testing' Description-Content-Type: text/markdown # npe2 [![CI](https://github.com/napari/npe2/actions/workflows/ci.yml/badge.svg)](https://github.com/napari/npe2/actions/workflows/ci.yml) [![codecov](https://codecov.io/gh/napari/npe2/branch/main/graph/badge.svg?token=FTH635x542)](https://codecov.io/gh/napari/npe2) napari plugin refactor see also https://github.com/napari/napari/issues/3115 ## Documentation For documentation on authoring npe2 plugins, see the [napari plugin docs](https://napari.org/plugins/index.html). These include: - the [manifest reference](https://napari.org/plugins/manifest.html) - the [contribution guide](https://napari.org/plugins/contributions.html) ## Command line tool Includes a command line tool `npe2` with the following commands: ```bash Commands: cache Cache utils convert Convert first generation napari plugin to new (manifest) format. parse Show parsed manifest as yaml validate Validate manifest for a distribution name or manifest filepath. ``` examples: ```bash # convert current directory to an npe2-ready plugin # (note: the repo must also be installed and importable in the current environment) npe2 convert . ``` ```bash npe2 validate your-plugin-package ``` ```bash npe2 parse your-plugin-package ```