pytest-dependency-0.5.1/0000755000175000001440000000000013621604715015307 5ustar rolfusers00000000000000pytest-dependency-0.5.1/.gitignore0000644000175000001440000000026313621604204017271 0ustar rolfusers00000000000000*.pyc *~ .cache/ __pycache__/ /.version /MANIFEST /build/ /dist/ /doc/doctest/ /doc/doctrees/ /doc/html/ /doc/latex/ /doc/linkcheck/ /pytest_dependency.egg-info/ /python2_6.patch pytest-dependency-0.5.1/.travis.yml0000644000175000001440000000027013621604204017410 0ustar rolfusers00000000000000language: python python: - "2.7" - "3.4" - "3.5" - "3.6" - "3.7" - "3.8" install: pip install -r requirements.txt script: make test # Local Variables: # mode: yaml # End: pytest-dependency-0.5.1/.version0000644000175000001440000000000513621604714016767 0ustar rolfusers000000000000000.5.1pytest-dependency-0.5.1/LICENSE.txt0000644000175000001440000002613613621604204017133 0ustar rolfusers00000000000000 Apache License Version 2.0, January 2004 http://www.apache.org/licenses/ TERMS AND CONDITIONS FOR USE, REPRODUCTION, AND DISTRIBUTION 1. Definitions. "License" shall mean the terms and conditions for use, reproduction, and distribution as defined by Sections 1 through 9 of this document. "Licensor" shall mean the copyright owner or entity authorized by the copyright owner that is granting the License. "Legal Entity" shall mean the union of the acting entity and all other entities that control, are controlled by, or are under common control with that entity. For the purposes of this definition, "control" means (i) the power, direct or indirect, to cause the direction or management of such entity, whether by contract or otherwise, or (ii) ownership of fifty percent (50%) or more of the outstanding shares, or (iii) beneficial ownership of such entity. "You" (or "Your") shall mean an individual or Legal Entity exercising permissions granted by this License. 
"Source" form shall mean the preferred form for making modifications, including but not limited to software source code, documentation source, and configuration files. "Object" form shall mean any form resulting from mechanical transformation or translation of a Source form, including but not limited to compiled object code, generated documentation, and conversions to other media types. "Work" shall mean the work of authorship, whether in Source or Object form, made available under the License, as indicated by a copyright notice that is included in or attached to the work (an example is provided in the Appendix below). "Derivative Works" shall mean any work, whether in Source or Object form, that is based on (or derived from) the Work and for which the editorial revisions, annotations, elaborations, or other modifications represent, as a whole, an original work of authorship. For the purposes of this License, Derivative Works shall not include works that remain separable from, or merely link (or bind by name) to the interfaces of, the Work and Derivative Works thereof. "Contribution" shall mean any work of authorship, including the original version of the Work and any modifications or additions to that Work or Derivative Works thereof, that is intentionally submitted to Licensor for inclusion in the Work by the copyright owner or by an individual or Legal Entity authorized to submit on behalf of the copyright owner. For the purposes of this definition, "submitted" means any form of electronic, verbal, or written communication sent to the Licensor or its representatives, including but not limited to communication on electronic mailing lists, source code control systems, and issue tracking systems that are managed by, or on behalf of, the Licensor for the purpose of discussing and improving the Work, but excluding communication that is conspicuously marked or otherwise designated in writing by the copyright owner as "Not a Contribution." 
"Contributor" shall mean Licensor and any individual or Legal Entity on behalf of whom a Contribution has been received by Licensor and subsequently incorporated within the Work. 2. Grant of Copyright License. Subject to the terms and conditions of this License, each Contributor hereby grants to You a perpetual, worldwide, non-exclusive, no-charge, royalty-free, irrevocable copyright license to reproduce, prepare Derivative Works of, publicly display, publicly perform, sublicense, and distribute the Work and such Derivative Works in Source or Object form. 3. Grant of Patent License. Subject to the terms and conditions of this License, each Contributor hereby grants to You a perpetual, worldwide, non-exclusive, no-charge, royalty-free, irrevocable (except as stated in this section) patent license to make, have made, use, offer to sell, sell, import, and otherwise transfer the Work, where such license applies only to those patent claims licensable by such Contributor that are necessarily infringed by their Contribution(s) alone or by combination of their Contribution(s) with the Work to which such Contribution(s) was submitted. If You institute patent litigation against any entity (including a cross-claim or counterclaim in a lawsuit) alleging that the Work or a Contribution incorporated within the Work constitutes direct or contributory patent infringement, then any patent licenses granted to You under this License for that Work shall terminate as of the date such litigation is filed. 4. Redistribution. 
You may reproduce and distribute copies of the Work or Derivative Works thereof in any medium, with or without modifications, and in Source or Object form, provided that You meet the following conditions: (a) You must give any other recipients of the Work or Derivative Works a copy of this License; and (b) You must cause any modified files to carry prominent notices stating that You changed the files; and (c) You must retain, in the Source form of any Derivative Works that You distribute, all copyright, patent, trademark, and attribution notices from the Source form of the Work, excluding those notices that do not pertain to any part of the Derivative Works; and (d) If the Work includes a "NOTICE" text file as part of its distribution, then any Derivative Works that You distribute must include a readable copy of the attribution notices contained within such NOTICE file, excluding those notices that do not pertain to any part of the Derivative Works, in at least one of the following places: within a NOTICE text file distributed as part of the Derivative Works; within the Source form or documentation, if provided along with the Derivative Works; or, within a display generated by the Derivative Works, if and wherever such third-party notices normally appear. The contents of the NOTICE file are for informational purposes only and do not modify the License. You may add Your own attribution notices within Derivative Works that You distribute, alongside or as an addendum to the NOTICE text from the Work, provided that such additional attribution notices cannot be construed as modifying the License. You may add Your own copyright statement to Your modifications and may provide additional or different license terms and conditions for use, reproduction, or distribution of Your modifications, or for any such Derivative Works as a whole, provided Your use, reproduction, and distribution of the Work otherwise complies with the conditions stated in this License. 5. 
Submission of Contributions. Unless You explicitly state otherwise, any Contribution intentionally submitted for inclusion in the Work by You to the Licensor shall be under the terms and conditions of this License, without any additional terms or conditions. Notwithstanding the above, nothing herein shall supersede or modify the terms of any separate license agreement you may have executed with Licensor regarding such Contributions. 6. Trademarks. This License does not grant permission to use the trade names, trademarks, service marks, or product names of the Licensor, except as required for reasonable and customary use in describing the origin of the Work and reproducing the content of the NOTICE file. 7. Disclaimer of Warranty. Unless required by applicable law or agreed to in writing, Licensor provides the Work (and each Contributor provides its Contributions) on an "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied, including, without limitation, any warranties or conditions of TITLE, NON-INFRINGEMENT, MERCHANTABILITY, or FITNESS FOR A PARTICULAR PURPOSE. You are solely responsible for determining the appropriateness of using or redistributing the Work and assume any risks associated with Your exercise of permissions under this License. 8. Limitation of Liability. 
In no event and under no legal theory, whether in tort (including negligence), contract, or otherwise, unless required by applicable law (such as deliberate and grossly negligent acts) or agreed to in writing, shall any Contributor be liable to You for damages, including any direct, indirect, special, incidental, or consequential damages of any character arising as a result of this License or out of the use or inability to use the Work (including but not limited to damages for loss of goodwill, work stoppage, computer failure or malfunction, or any and all other commercial damages or losses), even if such Contributor has been advised of the possibility of such damages. 9. Accepting Warranty or Additional Liability. While redistributing the Work or Derivative Works thereof, You may choose to offer, and charge a fee for, acceptance of support, warranty, indemnity, or other liability obligations and/or rights consistent with this License. However, in accepting such obligations, You may act only on Your own behalf and on Your sole responsibility, not on behalf of any other Contributor, and only if You agree to indemnify, defend, and hold each Contributor harmless for any liability incurred by, or claims asserted against, such Contributor by reason of your accepting any such warranty or additional liability. END OF TERMS AND CONDITIONS APPENDIX: How to apply the Apache License to your work. To apply the Apache License to your work, attach the following boilerplate notice, with the fields enclosed by brackets "[]" replaced with your own identifying information. (Don't include the brackets!) The text should be enclosed in the appropriate comment syntax for the file format. We also recommend that a file or class name and description of purpose be included on the same "printed page" as the copyright notice for easier identification within third-party archives. 
Copyright [yyyy] [name of copyright owner] Licensed under the Apache License, Version 2.0 (the "License"); you may not use this file except in compliance with the License. You may obtain a copy of the License at http://www.apache.org/licenses/LICENSE-2.0 Unless required by applicable law or agreed to in writing, software distributed under the License is distributed on an "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the License for the specific language governing permissions and limitations under the License. pytest-dependency-0.5.1/MANIFEST.in0000644000175000001440000000026113621604204017035 0ustar rolfusers00000000000000include .version include LICENSE.txt include MANIFEST.in include README.rst include doc/examples/*.py include tests/conftest.py include tests/pytest.ini include tests/test_*.py pytest-dependency-0.5.1/Makefile0000644000175000001440000000114713621604204016743 0ustar rolfusers00000000000000PYTHON = python BUILDDIR = $(CURDIR)/build build: $(PYTHON) setup.py build test: build PYTHONPATH=$(BUILDDIR)/lib $(PYTHON) -m pytest tests sdist: $(PYTHON) setup.py sdist doc-html: .version $(MAKE) -C doc html clean: rm -f *~ tests/*~ rm -rf build $(MAKE) -C doc clean distclean: clean rm -rf .cache tests/.cache .pytest_cache tests/.pytest_cache rm -f *.pyc tests/*.pyc rm -rf __pycache__ tests/__pycache__ rm -f MANIFEST .version rm -rf dist rm -rf pytest_dependency.egg-info $(MAKE) -C doc distclean .version: $(PYTHON) setup.py check .PHONY: build test sdist doc-html clean distclean pytest-dependency-0.5.1/PKG-INFO0000644000175000001440000000271413621604715016410 0ustar rolfusers00000000000000Metadata-Version: 1.2 Name: pytest-dependency Version: 0.5.1 Summary: Manage dependencies of tests Home-page: https://github.com/RKrahl/pytest-dependency Author: Rolf Krahl Author-email: rolf@rotkraut.de Maintainer: Rolf Krahl Maintainer-email: rolf@rotkraut.de License: Apache Software License 2.0 Project-URL: Documentation, 
https://pytest-dependency.readthedocs.io/ Project-URL: Source Code, https://github.com/RKrahl/pytest-dependency Description: pytest-dependency - Manage dependencies of tests This pytest plugin manages dependencies of tests. It allows you to mark some tests as dependent on other tests. These tests will then be skipped if any of the dependencies failed or were skipped. Platform: UNKNOWN Classifier: Development Status :: 4 - Beta Classifier: Framework :: Pytest Classifier: Intended Audience :: Developers Classifier: Topic :: Software Development :: Testing Classifier: Programming Language :: Python Classifier: Programming Language :: Python :: 2 Classifier: Programming Language :: Python :: 2.7 Classifier: Programming Language :: Python :: 3 Classifier: Programming Language :: Python :: 3.4 Classifier: Programming Language :: Python :: 3.5 Classifier: Programming Language :: Python :: 3.6 Classifier: Programming Language :: Python :: 3.7 Classifier: Programming Language :: Python :: 3.8 Classifier: Operating System :: OS Independent Classifier: License :: OSI Approved :: Apache Software License pytest-dependency-0.5.1/README.rst0000644000175000001440000000464113621604577017005 0ustar rolfusers00000000000000.. image:: https://travis-ci.org/RKrahl/pytest-dependency.svg?branch=master :target: https://travis-ci.org/RKrahl/pytest-dependency pytest-dependency - Manage dependencies of tests ================================================ This pytest plugin manages dependencies of tests. It allows you to mark some tests as dependent on other tests. These tests will then be skipped if any of the dependencies failed or were skipped. Download -------- The latest release version can be found at PyPI, see https://pypi.python.org/pypi/pytest_dependency System requirements ------------------- + Python 2.7 or 3.4 and newer. + `setuptools`_. + `pytest`_ 3.6.0 or newer. Optional library packages: + `setuptools_scm`_ The version number is managed using this package. 
All source distributions include a static text file with the version number and fall back to using it if `setuptools_scm` is not available. So this package is only needed to build out of the plain development source tree as cloned from GitHub. Installation ------------ 1. Download the sources, unpack, and change into the source directory. 2. Build (optional):: $ python setup.py build 3. Test (optional):: $ python -m pytest 4. Install:: $ python setup.py install The last step might require admin privileges in order to write into the site-packages directory of your Python installation. Documentation ------------- The documentation can be found at https://pytest-dependency.readthedocs.io/ The example test modules used in the documentation can be found in doc/examples in the source distribution. Copyright and License --------------------- - Copyright 2013-2015 Helmholtz-Zentrum Berlin für Materialien und Energie GmbH - Copyright 2016-2020 Rolf Krahl Licensed under the Apache License, Version 2.0 (the "License"); you may not use this file except in compliance with the License. You may obtain a copy of the License at http://www.apache.org/licenses/LICENSE-2.0 Unless required by applicable law or agreed to in writing, software distributed under the License is distributed on an "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the License for the specific language governing permissions and limitations under the License. .. _setuptools: http://pypi.python.org/pypi/setuptools/ .. _pytest: http://pytest.org/ .. _setuptools_scm: https://github.com/pypa/setuptools_scm/ pytest-dependency-0.5.1/doc/0000755000175000001440000000000013621604715016054 5ustar rolfusers00000000000000pytest-dependency-0.5.1/doc/Makefile0000644000175000001440000000210113621604204017503 0ustar rolfusers00000000000000# Makefile for Sphinx documentation # # You can set these variables from the command line. 
SPHINXOPTS = SPHINXBUILD = sphinx-build PAPER = a4 # Internal variables. PAPEROPT_a4 = -D latex_paper_size=a4 PAPEROPT_letter = -D latex_paper_size=letter ALLSPHINXOPTS = -d doctrees $(PAPEROPT_$(PAPER)) $(SPHINXOPTS) src # Subdirectories of src that are supposed to be there but that may be # empty and may thus be missing after a git checkout. SRCDIRS = src/_static src/_templates html: $(SRCDIRS) $(SPHINXBUILD) -b html $(ALLSPHINXOPTS) html latex: $(SRCDIRS) $(SPHINXBUILD) -b latex $(ALLSPHINXOPTS) latex latexpdf: latex $(MAKE) -C latex all-pdf linkcheck: $(SRCDIRS) $(SPHINXBUILD) -b linkcheck $(ALLSPHINXOPTS) linkcheck doctest: $(SRCDIRS) $(SPHINXBUILD) -b doctest $(ALLSPHINXOPTS) doctest clean: rm -f *~ examples/*~ src/*~ distclean: clean rm -rf doctrees html latex linkcheck doctest rm -f examples/*.pyc rm -rf examples/__pycache__ src/_static: mkdir $@ src/_templates: mkdir $@ .PHONY: html latex latexpdf linkcheck doctest clean distclean pytest-dependency-0.5.1/doc/examples/0000755000175000001440000000000013621604715017672 5ustar rolfusers00000000000000pytest-dependency-0.5.1/doc/examples/all_params.py0000644000175000001440000000221413621604204022347 0ustar rolfusers00000000000000import pytest def instances(name, params): def vstr(val): if isinstance(val, (list, tuple)): return "-".join([str(v) for v in val]) else: return str(val) return ["%s[%s]" % (name, vstr(v)) for v in params] params_a = range(17) @pytest.mark.parametrize("x", params_a) @pytest.mark.dependency() def test_a(x): if x == 13: pytest.xfail("deliberate fail") assert False else: pass @pytest.mark.dependency(depends=instances("test_a", params_a)) def test_b(): pass params_c = list(zip(range(0,8,2), range(2,6))) @pytest.mark.parametrize("x,y", params_c) @pytest.mark.dependency() def test_c(x, y): if x > y: pytest.xfail("deliberate fail") assert False else: pass @pytest.mark.dependency(depends=instances("test_c", params_c)) def test_d(): pass params_e = ['abc', 'def'] @pytest.mark.parametrize("s", 
params_e) @pytest.mark.dependency() def test_e(s): if 'e' in s: pytest.xfail("deliberate fail") assert False else: pass @pytest.mark.dependency(depends=instances("test_e", params_e)) def test_f(): pass pytest-dependency-0.5.1/doc/examples/basic.out0000644000175000001440000000141613621604204021477 0ustar rolfusers00000000000000$ pytest -rsx basic.py ============================= test session starts ============================== platform linux -- Python 3.8.1, pytest-5.3.4, py-1.8.1, pluggy-0.13.1 rootdir: /home/user/tests plugins: dependency-0.4.0 collected 5 items basic.py x.s.s [100%] =========================== short test summary info ============================ SKIPPED [1] /usr/lib/python3.8/site-packages/pytest_dependency.py:87: test_c depends on test_a SKIPPED [1] /usr/lib/python3.8/site-packages/pytest_dependency.py:87: test_e depends on test_c XFAIL basic.py::test_a deliberate fail =================== 2 passed, 2 skipped, 1 xfailed in 0.06s ==================== pytest-dependency-0.5.1/doc/examples/basic.py0000644000175000001440000000057513621604204021325 0ustar rolfusers00000000000000import pytest @pytest.mark.dependency() @pytest.mark.xfail(reason="deliberate fail") def test_a(): assert False @pytest.mark.dependency() def test_b(): pass @pytest.mark.dependency(depends=["test_a"]) def test_c(): pass @pytest.mark.dependency(depends=["test_b"]) def test_d(): pass @pytest.mark.dependency(depends=["test_b", "test_c"]) def test_e(): pass pytest-dependency-0.5.1/doc/examples/dyn-parametrized.py0000644000175000001440000000256513621604204023524 0ustar rolfusers00000000000000import pytest # Test data # Consider a bunch of Nodes, some of them are parents and some are children. 
class Node(object): NodeMap = {} def __init__(self, name, parent=None): self.name = name self.children = [] self.NodeMap[self.name] = self if parent: self.parent = self.NodeMap[parent] self.parent.children.append(self) else: self.parent = None def __str__(self): return self.name parents = [ Node("a"), Node("b"), Node("c"), Node("d"), ] childs = [ Node("e", "a"), Node("f", "a"), Node("g", "a"), Node("h", "b"), Node("i", "c"), Node("j", "c"), Node("k", "d"), Node("l", "d"), Node("m", "d"), ] # The test for the parent shall depend on the test of all its children. # Create enriched parameter lists, decorated with the dependency marker. childparam = [ pytest.param(c, marks=pytest.mark.dependency(name="test_child[%s]" % c)) for c in childs ] parentparam = [ pytest.param(p, marks=pytest.mark.dependency( name="test_parent[%s]" % p, depends=["test_child[%s]" % c for c in p.children] )) for p in parents ] @pytest.mark.parametrize("c", childparam) def test_child(c): if c.name == "l": pytest.xfail("deliberate fail") assert False @pytest.mark.parametrize("p", parentparam) def test_parent(p): pass pytest-dependency-0.5.1/doc/examples/group-fixture.py0000644000175000001440000000064613621604204023063 0ustar rolfusers00000000000000import pytest from pytest_dependency import depends @pytest.fixture(scope="module", params=range(1,10)) def testcase(request): param = request.param return param @pytest.mark.dependency() def test_a(testcase): if testcase % 7 == 0: pytest.xfail("deliberate fail") assert False @pytest.mark.dependency() def test_b(request, testcase): depends(request, ["test_a[%d]" % testcase]) pass pytest-dependency-0.5.1/doc/examples/group-fixture2.py0000644000175000001440000000107113621604204023136 0ustar rolfusers00000000000000import pytest from pytest_dependency import depends @pytest.fixture(scope="module", params=range(1,10)) def testcase(request): param = request.param return param @pytest.fixture(scope="module") def dep_testcase(request, testcase): depends(request, 
["test_a[%d]" % testcase]) return testcase @pytest.mark.dependency() def test_a(testcase): if testcase % 7 == 0: pytest.xfail("deliberate fail") assert False @pytest.mark.dependency() def test_b(dep_testcase): pass @pytest.mark.dependency() def test_c(dep_testcase): pass pytest-dependency-0.5.1/doc/examples/named.py0000644000175000001440000000062713621604204021326 0ustar rolfusers00000000000000import pytest @pytest.mark.dependency(name="a") @pytest.mark.xfail(reason="deliberate fail") def test_a(): assert False @pytest.mark.dependency(name="b") def test_b(): pass @pytest.mark.dependency(name="c", depends=["a"]) def test_c(): pass @pytest.mark.dependency(name="d", depends=["b"]) def test_d(): pass @pytest.mark.dependency(name="e", depends=["b", "c"]) def test_e(): pass pytest-dependency-0.5.1/doc/examples/nodeid.out0000644000175000001440000000172713621604204021665 0ustar rolfusers00000000000000$ pytest --verbose ============================= test session starts ============================== platform linux -- Python 3.8.1, pytest-5.3.4, py-1.8.1, pluggy-0.13.1 -- /usr/bin/python3 cachedir: .pytest_cache rootdir: /home/user plugins: dependency-0.4.0 collected 7 items tests/test_nodeid.py::test_a PASSED [ 14%] tests/test_nodeid.py::test_b[7-True] PASSED [ 28%] tests/test_nodeid.py::test_b[0-False] PASSED [ 42%] tests/test_nodeid.py::test_b[-1-False] XFAIL [ 57%] tests/test_nodeid.py::TestClass::test_c PASSED [ 71%] tests/test_nodeid.py::TestClass::test_d[order] PASSED [ 85%] tests/test_nodeid.py::TestClass::test_d[disorder] PASSED [100%] ========================= 6 passed, 1 xfailed in 0.08s ========================= pytest-dependency-0.5.1/doc/examples/nodeid.py0000644000175000001440000000105013621604204021473 0ustar rolfusers00000000000000import random import pytest def test_a(): pass @pytest.mark.parametrize("i,b", [ (7, True), (0, False), pytest.param(-1, False, marks=pytest.mark.xfail(reason="nonsense")) ]) def test_b(i, b): assert bool(i) == b ordered = 
list(range(10)) unordered = random.sample(ordered, k=len(ordered)) class TestClass: def test_c(self): pass @pytest.mark.parametrize("l,ll", [(ordered, 10), (unordered, 10)], ids=["order", "disorder"]) def test_d(self, l, ll): assert len(l) == ll pytest-dependency-0.5.1/doc/examples/parametrized.py0000644000175000001440000000336313621604204022731 0ustar rolfusers00000000000000import pytest @pytest.mark.parametrize("x,y", [ pytest.param(0, 0, marks=pytest.mark.dependency(name="a1")), pytest.param(0, 1, marks=[pytest.mark.dependency(name="a2"), pytest.mark.xfail]), pytest.param(1, 0, marks=pytest.mark.dependency(name="a3")), pytest.param(1, 1, marks=pytest.mark.dependency(name="a4")) ]) def test_a(x,y): assert y <= x @pytest.mark.parametrize("u,v", [ pytest.param(1, 2, marks=pytest.mark.dependency(name="b1", depends=["a1", "a2"])), pytest.param(1, 3, marks=pytest.mark.dependency(name="b2", depends=["a1", "a3"])), pytest.param(1, 4, marks=pytest.mark.dependency(name="b3", depends=["a1", "a4"])), pytest.param(2, 3, marks=pytest.mark.dependency(name="b4", depends=["a2", "a3"])), pytest.param(2, 4, marks=pytest.mark.dependency(name="b5", depends=["a2", "a4"])), pytest.param(3, 4, marks=pytest.mark.dependency(name="b6", depends=["a3", "a4"])) ]) def test_b(u,v): pass @pytest.mark.parametrize("w", [ pytest.param(1, marks=pytest.mark.dependency(name="c1", depends=["b1", "b2", "b6"])), pytest.param(2, marks=pytest.mark.dependency(name="c2", depends=["b2", "b3", "b6"])), pytest.param(3, marks=pytest.mark.dependency(name="c3", depends=["b2", "b4", "b6"])) ]) def test_c(w): pass pytest-dependency-0.5.1/doc/examples/runtime.py0000644000175000001440000000061313621604204021720 0ustar rolfusers00000000000000import pytest from pytest_dependency import depends @pytest.mark.dependency() def test_a(): pass @pytest.mark.dependency() @pytest.mark.xfail(reason="deliberate fail") def test_b(): assert False @pytest.mark.dependency() def test_c(request): depends(request, ["test_b"]) pass 
@pytest.mark.dependency() def test_d(request): depends(request, ["test_a", "test_c"]) pass pytest-dependency-0.5.1/doc/examples/scope_class.py0000644000175000001440000000111313621604204022527 0ustar rolfusers00000000000000import pytest @pytest.mark.dependency() @pytest.mark.xfail(reason="deliberate fail") def test_a(): assert False class TestClass1(object): @pytest.mark.dependency() def test_b(self): pass class TestClass2(object): @pytest.mark.dependency() def test_a(self): pass @pytest.mark.dependency(depends=["test_a"]) def test_c(self): pass @pytest.mark.dependency(depends=["test_a"], scope='class') def test_d(self): pass @pytest.mark.dependency(depends=["test_b"], scope='class') def test_e(self): pass pytest-dependency-0.5.1/doc/examples/scope_module.py0000644000175000001440000000065513621604204022721 0ustar rolfusers00000000000000import pytest @pytest.mark.dependency() @pytest.mark.xfail(reason="deliberate fail") def test_a(): assert False @pytest.mark.dependency() def test_b(): pass @pytest.mark.dependency(depends=["test_a"], scope='module') def test_c(): pass @pytest.mark.dependency(depends=["test_b"], scope='module') def test_d(): pass @pytest.mark.dependency(depends=["test_b", "test_c"], scope='module') def test_e(): pass pytest-dependency-0.5.1/doc/examples/scope_session_mod_01.py0000644000175000001440000000053213621604204024250 0ustar rolfusers00000000000000# test_mod_01.py import pytest @pytest.mark.dependency() def test_a(): pass @pytest.mark.dependency() @pytest.mark.xfail(reason="deliberate fail") def test_b(): assert False @pytest.mark.dependency(depends=["test_a"]) def test_c(): pass class TestClass(object): @pytest.mark.dependency() def test_b(self): pass pytest-dependency-0.5.1/doc/examples/scope_session_mod_02.py0000644000175000001440000000106113621604204024247 0ustar rolfusers00000000000000# test_mod_02.py import pytest @pytest.mark.dependency() @pytest.mark.xfail(reason="deliberate fail") def test_a(): assert False @pytest.mark.dependency( 
depends=["tests/test_mod_01.py::test_a", "tests/test_mod_01.py::test_c"], scope='session' ) def test_e(): pass @pytest.mark.dependency( depends=["tests/test_mod_01.py::test_b", "tests/test_mod_02.py::test_e"], scope='session' ) def test_f(): pass @pytest.mark.dependency( depends=["tests/test_mod_01.py::TestClass::test_b"], scope='session' ) def test_g(): pass pytest-dependency-0.5.1/doc/examples/testclass.py0000644000175000001440000000202513621604204022241 0ustar rolfusers00000000000000import pytest class TestClass(object): @pytest.mark.dependency() @pytest.mark.xfail(reason="deliberate fail") def test_a(self): assert False @pytest.mark.dependency() def test_b(self): pass @pytest.mark.dependency(depends=["TestClass::test_a"]) def test_c(self): pass @pytest.mark.dependency(depends=["TestClass::test_b"]) def test_d(self): pass @pytest.mark.dependency(depends=["TestClass::test_b", "TestClass::test_c"]) def test_e(self): pass class TestClassNamed(object): @pytest.mark.dependency(name="a") @pytest.mark.xfail(reason="deliberate fail") def test_a(self): assert False @pytest.mark.dependency(name="b") def test_b(self): pass @pytest.mark.dependency(name="c", depends=["a"]) def test_c(self): pass @pytest.mark.dependency(name="d", depends=["b"]) def test_d(self): pass @pytest.mark.dependency(name="e", depends=["b", "c"]) def test_e(self): pass pytest-dependency-0.5.1/doc/src/0000755000175000001440000000000013621604715016643 5ustar rolfusers00000000000000pytest-dependency-0.5.1/doc/src/about.rst0000644000175000001440000000716513621604204020511 0ustar rolfusers00000000000000About pytest-dependency ======================= This module is a plugin for the popular Python testing framework `pytest`_. It manages dependencies of tests: you may mark some tests as dependent on other tests. These tests will then be skipped if any of the dependencies failed or were skipped. What is the purpose? 
-------------------- In the theory of good test design, tests should be self-contained and independent. Each test should cover one single issue, verifying either that a single feature works or that a single bug is fixed. Tests should be designed to work in any order, independent of each other. So much for the theory. Practice is often more complicated than that. Sometimes, the principle of independence of tests is simply unrealistic or impractical. Program features often depend on each other. If some feature B depends on another feature A in such a way that B cannot work without A, then it may simply be pointless to run the test for B unless the test for A has succeeded. Another case may be that the subject of the tests has an internal state that is unavoidably influenced by the tests. In this situation it may happen that test A, as a side effect, sets the system to some state that is the precondition for being able to run test B. Again, in this case it would be pointless to try running test B unless test A has been run successfully. It should be emphasized, however, that the principle of independence of tests is still valid. Before using pytest-dependency, it is still advisable to reconsider your test design and to avoid dependencies of tests whenever possible, rather than to manage these dependencies. How does it work? ----------------- The pytest-dependency module defines a marker that can be applied to tests. The marker accepts an argument that allows you to list the dependencies of the test. Both tests, the dependency and the dependent test, should be decorated with the marker. Behind the scenes, the marker arranges for the result of the test to be recorded internally. If a list of dependencies has been given as an argument, the marker verifies that a successful outcome of all the dependencies has been registered previously and causes a skip of the test if this was not the case. Why is this useful? 
------------------- The benefit of skipping dependent tests is the same as for skipping tests in general: it avoids cluttering the test report with useless and misleading failure reports from tests that were known beforehand not to work in this particular case. If tests depend on each other in such a way that test B cannot work unless test A has been run successfully, a failure of test A will likely result in failure messages from both tests. But the failure message from test B will not be helpful in any way. It will only distract the user from the real issue, the failure of test A. Skipping test B in this case will help the user concentrate on those results that really matter. Copyright and License --------------------- - Copyright 2013-2015 Helmholtz-Zentrum Berlin für Materialien und Energie GmbH - Copyright 2016-2018 Rolf Krahl Licensed under the Apache License, Version 2.0 (the "License"); you may not use this file except in compliance with the License. You may obtain a copy of the License at http://www.apache.org/licenses/LICENSE-2.0 Unless required by applicable law or agreed to in writing, software distributed under the License is distributed on an "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the License for the specific language governing permissions and limitations under the License. .. _pytest: http://pytest.org/ pytest-dependency-0.5.1/doc/src/advanced.rst Advanced usage ============== This section contains some advanced examples for using pytest-dependency. Dynamic compilation of marked parameters ---------------------------------------- Sometimes, the parameter values for parametrized tests cannot easily be typed as a simple list. They may need to be compiled at run time depending on a set of test data. This also works together with marking dependencies in the individual test instances.
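As a rough, self-contained sketch of this idea (the test data, the parent/child mapping, and all names such as `children`, `test_child`, and `test_parent` are invented for illustration and are not the `dyn-parametrized.py` example included below), a marked parameter list might be compiled at run time like this:

```python
import pytest

# Hypothetical test data: each child is mapped to its parent.
children = {"a": "d", "b": "d", "l": "e"}

# Compile the child parameter list at run time, attaching a dependency
# marker (and possibly other markers) to each test instance.
child_params = []
for c in sorted(children):
    marks = [pytest.mark.dependency(name="child_%s" % c)]
    if c == "l":
        marks.append(pytest.mark.xfail(reason="deliberate fail"))
    child_params.append(pytest.param(c, marks=marks))

# Each parent instance depends on the instances of all of its children.
parent_params = [
    pytest.param(p, marks=pytest.mark.dependency(
        name="parent_%s" % p,
        depends=["child_%s" % c for c, q in children.items() if q == p]))
    for p in sorted(set(children.values()))
]

@pytest.mark.parametrize("child", child_params)
def test_child(child):
    assert child != "l"

@pytest.mark.parametrize("parent", parent_params)
def test_parent(parent):
    pass
```

Run under pytest with the plugin active, the instance `test_child[l]` would fail deliberately, so the dependent instance `test_parent[e]` would be expected to be skipped.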
Consider the following example test module: .. literalinclude:: ../examples/dyn-parametrized.py In principle, this example works the very same way as the basic example for :ref:`usage-parametrized`. The only difference is that the lists of parameters are dynamically compiled beforehand. The test for child `l` deliberately fails, just to show the effect. As a consequence, the test for its parent `d` will be skipped. .. _advanced-grouping-fixtures: Grouping tests using fixtures ----------------------------- pytest features the `automatic grouping of tests by fixture instances`__. This is particularly useful if there is a set of test cases and a series of tests shall be run for each of these test cases respectively. Consider the following example: .. literalinclude:: ../examples/group-fixture.py The test instances of `test_b` depend on `test_a` for the same parameter value. The test `test_a[7]` deliberately fails; as a consequence, `test_b[7]` will be skipped. Note that we need to call :func:`pytest_dependency.depends` to mark the dependencies, because there is no way to use the :func:`pytest.mark.dependency` marker on the parameter values here. If many tests in the series depend on a single test, it might be an option to move the call to :func:`pytest_dependency.depends` into a fixture of its own. Consider: .. literalinclude:: ../examples/group-fixture2.py In this example, both `test_b[7]` and `test_c[7]` are skipped, because `test_a[7]` deliberately fails. .. __: https://docs.pytest.org/en/stable/fixture.html#automatic-grouping-of-tests-by-fixture-instances Depend on all instances of a parametrized test at once ------------------------------------------------------ If a test depends on all instances of a parametrized test at once, listing all of them explicitly in the :func:`pytest.mark.dependency` marker might not be the best solution. But you can dynamically compile these lists from the parameter values, as in the following example: ..
literalinclude:: ../examples/all_params.py Here, `test_b`, `test_d`, and `test_f` will be skipped because they depend on all instances of `test_a`, `test_c`, and `test_e` respectively, but `test_a[13]`, `test_c[6-5]`, and `test_e[def]` fail. The list of the test instances is compiled in the helper function `instances()`. Unfortunately, you need to know how pytest encodes parameter values in test instance names in order to write this helper function. Note in particular how lists of parameter values are compiled into one single string in the case of multi-parameter tests. But also note that this example of the `instances()` helper will only work for simple cases. It requires the parameter values to be scalars that can easily be converted to strings. And it will fail if the same list of parameters is passed to the same test more than once, because then pytest will add an index to the name to disambiguate the parameter values. pytest-dependency-0.5.1/doc/src/changelog.rst History of changes to pytest-dependency ======================================= 0.5.1 (2020-02-14) Bug fixes and minor changes + Fix failing documentation build. 0.5.0 (2020-02-14) New features + `#3`_, `#35`_: add a scope to dependencies. (Thanks to JoeSc and selenareneephillips!) Bug fixes and minor changes + `#34`_: failing test with pytest 4.2.0 and newer. + Use setuptools_scm to manage the version number. .. _#3: https://github.com/RKrahl/pytest-dependency/issues/3 .. _#34: https://github.com/RKrahl/pytest-dependency/issues/34 .. _#35: https://github.com/RKrahl/pytest-dependency/pull/35 0.4.0 (2018-12-02) Incompatible changes + Require pytest version 3.6.0 or newer. This implicitly drops support for Python 2.6 and for Python 3.3 and older. Bug fixes and minor changes + `#24`_, `#25`_: get_marker no longer available in pytest 4.0.0. (Thanks to Rogdham!)
+ `#28`_: Applying markers directly in parametrize is no longer available in 4.0. .. _#24: https://github.com/RKrahl/pytest-dependency/issues/24 .. _#25: https://github.com/RKrahl/pytest-dependency/pull/25 .. _#28: https://github.com/RKrahl/pytest-dependency/issues/28 0.3.2 (2018-01-17) Bug fixes and minor changes + `#5`_: properly register the dependency marker. + Do not add the documentation to the source distribution. .. _#5: https://github.com/RKrahl/pytest-dependency/issues/5 0.3.1 (2017-12-26) Bug fixes and minor changes + `#17`_: Move the online documentation to Read the Docs. + Some improvements in the documentation. .. _#17: https://github.com/RKrahl/pytest-dependency/issues/17 0.3 (2017-12-26) New features + `#7`_: Add a configuration switch to implicitly mark all tests. + `#10`_: Add an option to ignore unknown dependencies. Incompatible changes + Prepend the class name to the default test name for test class methods. This fixes a potential name conflict, see `#6`_. If your code uses test classes and you reference test methods by their default name, you must add the class name. E.g. if you have something like: .. code-block:: python class TestClass(object): @pytest.mark.dependency() def test_a(): pass @pytest.mark.dependency(depends=["test_a"]) def test_b(): pass you need to change this to: .. code-block:: python class TestClass(object): @pytest.mark.dependency() def test_a(): pass @pytest.mark.dependency(depends=["TestClass::test_a"]) def test_b(): pass If you override the test name in the pytest.mark.dependency() marker, nothing need to be changed. Bug fixes and minor changes + `#11`_: show the name of the skipped test. (Thanks asteriogonzalez!) + `#13`_: Do not import pytest in setup.py to make it compatible with pipenv. + `#15`_: tests fail with pytest 3.3.0. + `#8`_: document incompatibility with parallelization in pytest-xdist. + Clarify in the documentation that Python 3.1 is not officially supported because pytest 2.8 does not support it. 
There is no known issue with Python 3.1 though. .. _#6: https://github.com/RKrahl/pytest-dependency/issues/6 .. _#7: https://github.com/RKrahl/pytest-dependency/issues/7 .. _#8: https://github.com/RKrahl/pytest-dependency/issues/8 .. _#10: https://github.com/RKrahl/pytest-dependency/issues/10 .. _#11: https://github.com/RKrahl/pytest-dependency/pull/11 .. _#13: https://github.com/RKrahl/pytest-dependency/issues/13 .. _#15: https://github.com/RKrahl/pytest-dependency/issues/15 0.2 (2017-05-28) New features + `#2`_: Add documentation. + `#4`_: Add a depend() function to add a dependency to a test at runtime. .. _#2: https://github.com/RKrahl/pytest-dependency/issues/2 .. _#4: https://github.com/RKrahl/pytest-dependency/issues/4 0.1 (2017-01-29) + Initial release as an independent Python module. This code was first developed as part of a larger package, python-icat, at Helmholtz-Zentrum Berlin für Materialien und Energie, see https://icatproject.org/user-documentation/python-icat/ pytest-dependency-0.5.1/doc/src/conf.py0000644000175000001440000001754313621604577020162 0ustar rolfusers00000000000000# # pytest-dependency documentation build configuration file. # # This file is execfile()d with the current directory set to its containing dir. # # Note that not all possible configuration values are present in this # autogenerated file. # # All configuration values have a default; values that are commented out # serve to show the default. import sys, os import os.path # The top source directory. This file is exec'ed with its directory # as cwd. This is "doc/src" relativ to the top source directory. So # we need to go 2 dirs up. topdir = os.path.dirname(os.path.dirname(os.getcwd())) # If extensions (or modules to document with autodoc) are in another directory, # add these directories to sys.path here. If the directory is relative to the # documentation root, use os.path.abspath to make it absolute, like shown here. 
#sys.path.insert(0, os.path.abspath('.')) # -- General configuration ----------------------------------------------------- # If your documentation needs a minimal Sphinx version, state it here. #needs_sphinx = '1.0' # Add any Sphinx extension module names here, as strings. They can be extensions # coming with Sphinx (named 'sphinx.ext.*') or your custom ones. extensions = ['sphinx.ext.autodoc', 'sphinx.ext.doctest'] # Add any paths that contain templates here, relative to this directory. templates_path = ['_templates'] # The suffix of source filenames. source_suffix = '.rst' # The encoding of source files. #source_encoding = 'utf-8-sig' # The master toctree document. master_doc = 'index' # General information about the project. project = u'pytest-dependency' copyright = u'2016-2020, Rolf Krahl' # The version info for the project you're documenting, acts as replacement for # |version| and |release|, also used in various other places throughout the # built documents. with open(os.path.join(topdir, ".version"), "rt") as f: release = f.read() version = ".".join(release.split(".")[0:2]) # The language for content autogenerated by Sphinx. Refer to documentation # for a list of supported languages. #language = None # There are two options for replacing |today|: either, you set today to some # non-false value, then it is used: #today = '' # Else, today_fmt is used as the format for a strftime call. #today_fmt = '%B %d, %Y' # List of patterns, relative to source directory, that match files and # directories to ignore when looking for source files. exclude_patterns = [] # The reST default role (used for this markup: `text`) to use for all documents. #default_role = None # If true, '()' will be appended to :func: etc. cross-reference text. #add_function_parentheses = True # If true, the current module name will be prepended to all description # unit titles (such as .. function::). 
#add_module_names = True # If true, sectionauthor and moduleauthor directives will be shown in the # output. They are ignored by default. #show_authors = False # The name of the Pygments (syntax highlighting) style to use. pygments_style = 'sphinx' # A list of ignored prefixes for module index sorting. #modindex_common_prefix = [] # -- Options for HTML output --------------------------------------------------- # The theme to use for HTML and HTML Help pages. See the documentation for # a list of builtin themes. html_theme = 'default' # Theme options are theme-specific and customize the look and feel of a theme # further. For a list of options available for each theme, see the # documentation. #html_theme_options = {} # Add any paths that contain custom themes here, relative to this directory. #html_theme_path = [] # The name for this set of Sphinx documents. If None, it defaults to # " v documentation". #html_title = None # A shorter title for the navigation bar. Default is the same as html_title. #html_short_title = None # The name of an image file (relative to this directory) to place at the top # of the sidebar. #html_logo = None # The name of an image file (within the static path) to use as favicon of the # docs. This file should be a Windows icon file (.ico) being 16x16 or 32x32 # pixels large. #html_favicon = None # Add any paths that contain custom static files (such as style sheets) here, # relative to this directory. They are copied after the builtin static files, # so a file named "default.css" will overwrite the builtin "default.css". html_static_path = ['_static'] # If not '', a 'Last updated on:' timestamp is inserted at every page bottom, # using the given strftime format. #html_last_updated_fmt = '%b %d, %Y' # If true, SmartyPants will be used to convert quotes and dashes to # typographically correct entities. #html_use_smartypants = True # Custom sidebar templates, maps document names to template names. 
#html_sidebars = {} # Additional templates that should be rendered to pages, maps page names to # template names. #html_additional_pages = {} # If false, no module index is generated. #html_domain_indices = True # If false, no index is generated. #html_use_index = True # If true, the index is split into individual pages for each letter. #html_split_index = False # If true, links to the reST sources are added to the pages. html_show_sourcelink = False # If true, "Created using Sphinx" is shown in the HTML footer. Default is True. #html_show_sphinx = True # If true, "(C) Copyright ..." is shown in the HTML footer. Default is True. #html_show_copyright = True # If true, an OpenSearch description file will be output, and all pages will # contain a tag referring to it. The value of this option must be the # base URL from which the finished HTML is served. #html_use_opensearch = '' # This is the file name suffix for HTML files (e.g. ".xhtml"). #html_file_suffix = None # Output file base name for HTML help builder. htmlhelp_basename = 'pytest-dependency-doc' # -- Options for LaTeX output -------------------------------------------------- latex_elements = { # The paper size ('letterpaper' or 'a4paper'). #'papersize': 'letterpaper', # The font size ('10pt', '11pt' or '12pt'). #'pointsize': '10pt', # Additional stuff for the LaTeX preamble. #'preamble': '', } # Grouping the document tree into LaTeX files. List of tuples # (source start file, target name, title, author, documentclass [howto/manual]). latex_documents = [ ('index', 'pytest-dependency.tex', u'pytest-dependency Documentation', u'Rolf Krahl', 'manual'), ] # The name of an image file (relative to this directory) to place at the top of # the title page. #latex_logo = None # For "manual" documents, if this is true, then toplevel headings are parts, # not chapters. #latex_use_parts = False # If true, show page references after internal links. 
#latex_show_pagerefs = False # If true, show URL addresses after external links. #latex_show_urls = False # Documents to append as an appendix to all manuals. #latex_appendices = [] # If false, no module index is generated. #latex_domain_indices = True # -- Options for manual page output -------------------------------------------- # One entry per manual page. List of tuples # (source start file, name, description, authors, manual section). man_pages = [ ('index', 'pytest-dependency', u'pytest-dependency Documentation', [u'Rolf Krahl'], 1) ] # If true, show URL addresses after external links. #man_show_urls = False # -- Options for Texinfo output ------------------------------------------------ # Grouping the document tree into Texinfo files. List of tuples # (source start file, target name, title, author, # dir menu entry, description, category) texinfo_documents = [ ('index', 'pytest-dependency', u'pytest-dependency Documentation', u'Rolf Krahl', 'pytest-dependency', 'Call pytest from a distutils setup.py script.', 'Miscellaneous'), ] # Documents to append as an appendix to all manuals. #texinfo_appendices = [] # If false, no module index is generated. #texinfo_domain_indices = True # How to display URL addresses: 'footnote', 'no', or 'inline'. #texinfo_show_urls = 'footnote' pytest-dependency-0.5.1/doc/src/configuration.rst0000644000175000001440000000376713621604204022252 0ustar rolfusers00000000000000Configuring pytest-dependency ============================= This section explains configuration options for pytest-dependency, but also options for pytest itself or other plugins that are recommended for the use with pytest-dependency. Notes on configuration for other plugins ---------------------------------------- pytest-xdist Test run parallelization in pytest-xdist is incompatible with pytest-dependency, see :ref:`install-other-packages`. By default, parallelization is disabled in pytest-xdist (`--dist=no`). You are advised to leave this default. 
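If you want your project configuration to make that choice explicit, a hypothetical `pytest.ini` fragment might look as follows; the `addopts` line is only meaningful when pytest-xdist is installed, since `--dist` is an option added by that plugin:

```ini
# Hypothetical pytest.ini: --dist=no is already the pytest-xdist
# default, this merely pins the choice explicitly.
[pytest]
addopts = --dist=no
```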
Configuration file options -------------------------- Configuration file options can be set in the pytest `ini file`. minversion This is a builtin configuration option of pytest itself. Since pytest-dependency requires pytest 3.6.0 or newer, it is recommended to set this option accordingly, either to 3.6.0 or to a newer version, if required by your test code. automark_dependency This is a flag. If set to `False`, the default, the outcome of a test will only be registered if the test has been decorated with the :func:`pytest.mark.dependency` marker. As a result, all tests, both the dependencies and the dependent tests, must be decorated. If set to `True`, the outcome of all tests will be registered. It has the same effect as implicitly decorating all tests with :func:`pytest.mark.dependency`. .. versionadded:: 0.3 Command line options -------------------- The following command line options are added by pytest-dependency: `--ignore-unknown-dependency` By default, a test will be skipped unless all the dependencies have been run successfully. If this option is set, a test will only be skipped if any of the dependencies has been skipped or failed. That is, dependencies that have not been run at all will be ignored. This may be useful if you run only a subset of the testsuite and some tests in the selected set are marked to depend on other tests that have not been selected. .. versionadded:: 0.3 pytest-dependency-0.5.1/doc/src/index.rst pytest-dependency - Manage dependencies of tests ================================================ This pytest plugin manages dependencies of tests. It allows marking some tests as dependent on other tests. These tests will then be skipped if any of the dependencies failed or was skipped. Content of the documentation ~~~~~~~~~~~~~~~~~~~~~~~~~~~~ ..
toctree:: :maxdepth: 2 about install usage scope advanced names configuration changelog reference pytest-dependency-0.5.1/doc/src/install.rst0000644000175000001440000000255613621604204021044 0ustar rolfusers00000000000000Installation instructions ========================= System requirements ------------------- + Python 2.7 or 3.4 and newer. + `setuptools`_. + `pytest`_ 3.6.0 or newer. .. _install-other-packages: Interaction with other packages ------------------------------- pytest-xdist pytest-xdist features test run parallelization, e.g. distributing tests over separate processes that run in parallel. This is based on the assumption that the tests can be run independent of each other. Obviously, if you are using pytest-dependency, this assumption is not valid. Thus, pytest-dependency will only work if you do not enable parallelization in pytest-xdist. Download -------- The latest release version of pytest-dependency source can be found at PyPI, see https://pypi.python.org/pypi/pytest_dependency Installation ------------ 1. Download the sources, unpack, and change into the source directory. 2. Build (optional):: $ python setup.py build 3. Test (optional):: $ python -m pytest 4. Install:: $ python setup.py install The last step might require admin privileges in order to write into the site-packages directory of your Python installation. For production use, it is always recommended to use the latest release version from PyPI, see above. .. _setuptools: http://pypi.python.org/pypi/setuptools/ .. _pytest: http://pytest.org/ pytest-dependency-0.5.1/doc/src/names.rst0000644000175000001440000000665613621604204020506 0ustar rolfusers00000000000000.. _names: Names ===== Dependencies of tests are referenced by name. The default name is the `node id`__ assigned to the test by pytest. This default may be overridden by an explicit `name` argument to the :func:`pytest.mark.dependency` marker. The references also depend on the scope. .. 
__: https://docs.pytest.org/en/latest/example/markers.html#node-id Node ids -------- The node ids in pytest are built from several components, separated by a double colon "::". For test functions, these components are the relative path of the test module and the name of the function. In the case of a method of a test class, the components are the module path, the name of the class, and the name of the method. If the function or method is parameterized, the parameter values, separated by a minus "-" and enclosed in square brackets "[]", are appended to the node id. The representation of the parameter values in the node id may be overridden using the `ids` argument to the `pytest.mark.parametrize()`__ marker. .. __: https://docs.pytest.org/en/latest/reference.html#pytest-mark-parametrize-ref One may check the node ids of all tests by calling pytest with the `--verbose` command line option. As an example, consider the following test module: .. literalinclude:: ../examples/nodeid.py If this module is stored as `tests/test_nodeid.py`, the output will look like: .. literalinclude:: ../examples/nodeid.out .. note:: Old versions of pytest used to include an extra "()" component in the node ids of methods of test classes. This has been `removed in pytest 4.0.0`__. pytest-dependency strips this if present. Thus, when referencing dependencies, the new-style node ids as described above may (and must) be used, regardless of the pytest version. .. __: https://docs.pytest.org/en/latest/changelog.html#pytest-4-0-0-2018-11-13 References and scope -------------------- When referencing dependencies of tests, the names to be used in the `depends` argument to the :func:`pytest.mark.dependency` marker or the `other` argument to the :func:`pytest_dependency.depends` function depend on the scope as follows: `session` The full node id must be used. `package` The full node id must be used. `module` The node id with the leading module path including the "::" separator removed must be used.
`class` The node id with the module path and the class name including the "::" separator removed must be used. That is, in the example above, when referencing `test_a` as a dependency, it must be referenced as `tests/test_nodeid.py::test_a` in session scope and as `test_a` in module scope. When referencing the first invocation of `test_d` as a dependency, it must be referenced as `tests/test_nodeid.py::TestClass::test_d[order]` in session scope, as `TestClass::test_d[order]` in module scope, and as `test_d[order]` in class scope. If the name of the dependency has been set with an explicit `name` argument to the :func:`pytest.mark.dependency` marker, this name must always be used as is, regardless of the scope. .. note:: The module path in the node id is the path relative to the current working directory. This depends on the invocation of pytest. In the example above, if you change into the `tests` directory before invoking pytest, the module path in the node ids will be `test_nodeid.py`. If you use references in session scope, you'll need to make sure pytest is always invoked from the same working directory. pytest-dependency-0.5.1/doc/src/reference.rst0000644000175000001440000000270513621604204021330 0ustar rolfusers00000000000000Reference ========= .. py:decorator:: pytest.mark.dependency(name=None, depends=[], scope='module') Mark a test to be used as a dependency for other tests or to depend on other tests. This will cause the test results to be registered internally and thus other tests may depend on the test. The list of dependencies for the test may be set in the depends argument. :param name: the name of the test to be used for referencing by dependent tests. If not set, it defaults to the node ID defined by pytest. The name must be unique. :type name: :class:`str` :param depends: dependencies, a list of names of tests that this test depends on. The test will be skipped unless all of the dependencies have been run successfully. 
The dependencies must also have been decorated by the marker. The names of the dependencies must be adapted to the scope. :type depends: iterable of :class:`str` :param scope: the scope to search for the dependencies. Must be either `'session'`, `'package'`, `'module'`, or `'class'`. :type scope: :class:`str` See Section :ref:`names` for details on the default name if the `name` argument is not set and on how references in the `depends` argument must be adapted to the scope. .. versionchanged:: 0.5.0 the scope parameter has been added. .. py:module:: pytest_dependency .. autofunction:: pytest_dependency.depends pytest-dependency-0.5.1/doc/src/scope.rst Defining the scope of dependencies ================================== In the previous examples, we didn't specify a scope for the dependencies. All dependencies were taken in module scope, which is the default. As a consequence, tests were constrained to depend only on other tests in the same test module. The :func:`pytest.mark.dependency` marker as well as the :func:`pytest_dependency.depends` function take an optional `scope` argument. Possible values are `'session'`, `'package'`, `'module'`, or `'class'`. .. versionadded:: 0.5.0 the scope of dependencies has been introduced. In earlier versions, all dependencies were implicitly in module scope. Explicitly specifying the scope -------------------------------- The default value for the `scope` argument is `'module'`. Thus, the very first example from Section :ref:`usage-basic` could also be written as: .. literalinclude:: ../examples/scope_module.py It works exactly the same. The only difference is that the default scope has been made explicit. Dependencies in session scope ----------------------------- If a test depends on another test in a different test module, the dependency must either be in session or package scope. Consider the following two test modules: ..
literalinclude:: ../examples/scope_session_mod_01.py and .. literalinclude:: ../examples/scope_session_mod_02.py Let's assume the modules are stored as `tests/test_mod_01.py` and `tests/test_mod_02.py` relative to the current working directory respectively. The test `test_e` in `tests/test_mod_02.py` will be run and succeed. It depends on `test_a` and `test_c` in `tests/test_mod_01.py`, which both succeed. It does not matter that there is another `test_a` in `tests/test_mod_02.py` that fails. Test `test_f` in `tests/test_mod_02.py` will be skipped, because it depends on `test_b` in `tests/test_mod_01.py`, which fails. Test `test_g` in turn will be run and succeed. It depends on the test method `test_b` of class `TestClass` in `tests/test_mod_01.py`, not on the test function of the same name. The `scope` argument only affects the references in the `depends` argument of the marker. It does not matter which scope is set for the dependencies: the dependency of `test_e` in `tests/test_mod_02.py` on `test_a` in `tests/test_mod_01.py` is in session scope. It is not necessary to also set the scope for `test_a`. Note that the references in session scope must use the full node id of the dependencies. This node id is composed of the module path, the name of the test class if applicable, and the name of the test, separated by a double colon "::"; see Section :ref:`names` for details. References in module scope, on the other hand, must omit the module path in the node id, because that is implied by the scope. Package scope is only available if the test is in a package and then restricts the dependencies to tests within the same package. Otherwise it works the same as session scope. The class scope --------------- Test dependencies may also be in class scope. This is only available for methods of a test class and restricts the dependencies to other test methods of the same class. Consider the following example: ..
literalinclude:: ../examples/scope_class.py The test method `test_c` of class `TestClass2` will be skipped because it depends on `test_a`. The marker does not have a `scope` argument, so this dependency defaults to module scope. The dependency thus resolves to the function `test_a` at module level, which failed. The fact that there is also a method `test_a` in this class does not matter, because that would need to be referenced as `TestClass2::test_a` in module scope. The test method `test_d` of class `TestClass2` depends on `test_a` in class scope. This resolves to the method `test_a` of `TestClass2`, which succeeds. As a result, `test_d` will be run and succeed as well. Test method `test_e` of class `TestClass2` will be skipped, because it depends on `test_b` in class scope, but there is no method by that name in this class. The fact that there is another class `TestClass1` having a method by that name is irrelevant. pytest-dependency-0.5.1/doc/src/usage.rst Using pytest-dependency ======================= The plugin defines a new marker :func:`pytest.mark.dependency`. .. _usage-basic: Basic usage ----------- Consider the following example test module: .. literalinclude:: ../examples/basic.py All the tests are decorated with :func:`pytest.mark.dependency`. This will cause the test results to be registered internally and thus other tests may depend on them. The list of dependencies of a test may be set in the optional `depends` argument to the marker. Running this test module, we will get the following result: .. literalinclude:: ../examples/basic.out The first test has deliberately been set to fail to illustrate the effect. We will get the following results: `test_a` deliberately fails. `test_b` succeeds. `test_c` will be skipped because it depends on `test_a`. `test_d` depends on `test_b`, which did succeed. It will be run and succeed as well. `test_e` depends on `test_b` and `test_c`.
`test_b` did succeed, but `test_c` has been skipped. So this one will also be skipped. Naming tests ------------ Tests are referenced by their name in the `depends` argument. The default for this name is the node id defined by pytest, that is, the name of the test function, extended by the parameters if applicable; see Section :ref:`names` for details. In some cases, it's not easy to predict the names of the node ids. For this reason, the names of the tests can be overridden by an explicit `name` argument to the marker. The names must be unique. The following example works exactly as the last one, only the test names are explicitly set: .. literalinclude:: ../examples/named.py Using test classes ------------------ Tests may be grouped in classes in pytest. Marking the dependencies of methods in test classes works the same way as for simple test functions. In the following example we define two test classes. Each works in the same manner as the previous examples respectively: .. literalinclude:: ../examples/testclass.py In `TestClass` the default names for the tests are used, which are built from the name of the class and the respective method in this case, while in `TestClassNamed` these names are overridden by an explicit `name` argument to the :func:`pytest.mark.dependency` marker. .. versionchanged:: 0.3 The name of the class is prepended to the method name to form the default name for the test. .. _usage-parametrized: Parametrized tests ------------------ In the same way as the :func:`pytest.mark.skip` and :func:`pytest.mark.xfail` markers, the :func:`pytest.mark.dependency` marker may be applied to individual test instances in the case of parametrized tests. Consider the following example: .. literalinclude:: ../examples/parametrized.py The test instance `test_a[0-1]`, named `a2` in the :func:`pytest.mark.dependency` marker, is going to fail. As a result, the dependent tests `b1`, `b4`, `b5`, and in turn `c1` and `c3` will be skipped.
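Since the example above is included from a separate file, here is a condensed sketch of the general pattern it illustrates, not the shipped `parametrized.py` file itself; the parameter values and the names `a1`, `a2`, `b1`, and `b2` are invented for this sketch. Per-instance markers are attached with `pytest.param`:

```python
import pytest

# Attach a dependency marker (and, for one instance, an xfail marker)
# to individual test instances via pytest.param().
params_a = [
    pytest.param(0, 0, marks=pytest.mark.dependency(name="a1")),
    pytest.param(0, 1, marks=[pytest.mark.dependency(name="a2"),
                              pytest.mark.xfail(reason="deliberate fail")]),
]

@pytest.mark.parametrize("x, y", params_a)
def test_a(x, y):
    assert y <= x

# Dependent instances reference the dependencies by their marker names.
params_b = [
    pytest.param(1, marks=pytest.mark.dependency(name="b1", depends=["a1"])),
    pytest.param(2, marks=pytest.mark.dependency(name="b2", depends=["a2"])),
]

@pytest.mark.parametrize("w", params_b)
def test_b(w):
    pass
```

Run under pytest with the plugin active, the instance `test_a[0-1]` (named `a2`) would fail deliberately, so the dependent instance named `b2` would be expected to be skipped.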
Marking dependencies at runtime ------------------------------- Sometimes, dependencies of test instances are too complicated to be formulated explicitly beforehand using the :func:`pytest.mark.dependency` marker. It may be easier to compile the list of dependencies of a test at run time. In such cases, the function :func:`pytest_dependency.depends` comes in handy. Consider the following example: .. literalinclude:: ../examples/runtime.py Tests `test_c` and `test_d` set their dependencies at runtime by calling :func:`pytest_dependency.depends`. The first argument is the value of the `request` pytest fixture, the second argument is the list of dependencies. It has the same effect as passing this list as the `depends` argument to the :func:`pytest.mark.dependency` marker. The present example is certainly somewhat artificial, as the use of the :func:`pytest_dependency.depends` function would not be needed in such a simple case. For a more involved example that cannot as easily be formulated with a static `depends` argument, see :ref:`advanced-grouping-fixtures`. pytest-dependency-0.5.1/pytest_dependency.egg-info/0000755000175000001440000000000013621604715022527 5ustar rolfusers00000000000000pytest-dependency-0.5.1/pytest_dependency.egg-info/PKG-INFO0000644000175000001440000000271413621604714023627 0ustar rolfusers00000000000000Metadata-Version: 1.2 Name: pytest-dependency Version: 0.5.1 Summary: Manage dependencies of tests Home-page: https://github.com/RKrahl/pytest-dependency Author: Rolf Krahl Author-email: rolf@rotkraut.de Maintainer: Rolf Krahl Maintainer-email: rolf@rotkraut.de License: Apache Software License 2.0 Project-URL: Documentation, https://pytest-dependency.readthedocs.io/ Project-URL: Source Code, https://github.com/RKrahl/pytest-dependency Description: pytest-dependency - Manage dependencies of tests This pytest plugin manages dependencies of tests. It allows marking some tests as dependent on other tests.
These tests will then be skipped if any of the dependencies has failed or been skipped. Platform: UNKNOWN Classifier: Development Status :: 4 - Beta Classifier: Framework :: Pytest Classifier: Intended Audience :: Developers Classifier: Topic :: Software Development :: Testing Classifier: Programming Language :: Python Classifier: Programming Language :: Python :: 2 Classifier: Programming Language :: Python :: 2.7 Classifier: Programming Language :: Python :: 3 Classifier: Programming Language :: Python :: 3.4 Classifier: Programming Language :: Python :: 3.5 Classifier: Programming Language :: Python :: 3.6 Classifier: Programming Language :: Python :: 3.7 Classifier: Programming Language :: Python :: 3.8 Classifier: Operating System :: OS Independent Classifier: License :: OSI Approved :: Apache Software License pytest-dependency-0.5.1/pytest_dependency.egg-info/SOURCES.txt0000644000175000001440000000251213621604714024412 0ustar rolfusers00000000000000.gitignore .travis.yml .version LICENSE.txt MANIFEST.in Makefile README.rst pytest_dependency.py requirements.txt setup.py doc/Makefile doc/examples/all_params.py doc/examples/basic.out doc/examples/basic.py doc/examples/dyn-parametrized.py doc/examples/group-fixture.py doc/examples/group-fixture2.py doc/examples/named.py doc/examples/nodeid.out doc/examples/nodeid.py doc/examples/parametrized.py doc/examples/runtime.py doc/examples/scope_class.py doc/examples/scope_module.py doc/examples/scope_session_mod_01.py doc/examples/scope_session_mod_02.py doc/examples/testclass.py doc/src/about.rst doc/src/advanced.rst doc/src/changelog.rst doc/src/conf.py doc/src/configuration.rst doc/src/index.rst doc/src/install.rst doc/src/names.rst doc/src/reference.rst doc/src/scope.rst doc/src/usage.rst pytest_dependency.egg-info/PKG-INFO pytest_dependency.egg-info/SOURCES.txt pytest_dependency.egg-info/dependency_links.txt pytest_dependency.egg-info/entry_points.txt pytest_dependency.egg-info/requires.txt
pytest_dependency.egg-info/top_level.txt tests/conftest.py tests/pytest.ini tests/test_01_marker.py tests/test_02_simple_dependency.py tests/test_03_class.py tests/test_03_multiple_dependency.py tests/test_03_param.py tests/test_03_runtime.py tests/test_03_scope.py tests/test_03_skipmsgs.py tests/test_04_automark.py tests/test_04_ignore_unknown.pypytest-dependency-0.5.1/pytest_dependency.egg-info/dependency_links.txt0000644000175000001440000000000113621604714026574 0ustar rolfusers00000000000000 pytest-dependency-0.5.1/pytest_dependency.egg-info/entry_points.txt0000644000175000001440000000005313621604714026022 0ustar rolfusers00000000000000[pytest11] dependency = pytest_dependency pytest-dependency-0.5.1/pytest_dependency.egg-info/requires.txt0000644000175000001440000000001613621604714025123 0ustar rolfusers00000000000000pytest>=3.6.0 pytest-dependency-0.5.1/pytest_dependency.egg-info/top_level.txt0000644000175000001440000000002213621604714025252 0ustar rolfusers00000000000000pytest_dependency pytest-dependency-0.5.1/pytest_dependency.py0000644000175000001440000001405613621604715021415 0ustar rolfusers00000000000000"""pytest-dependency - Manage dependencies of tests This pytest plugin manages dependencies of tests. It allows marking some tests as dependent on other tests. These tests will then be skipped if any of the dependencies has failed or been skipped. """ __version__ = "0.5.1" import pytest _automark = False _ignore_unknown = False def _get_bool(value): """Evaluate string representation of a boolean value. """ if value: if value.lower() in ["0", "no", "n", "false", "f", "off"]: return False elif value.lower() in ["1", "yes", "y", "true", "t", "on"]: return True else: raise ValueError("Invalid truth value '%s'" % value) else: return False class DependencyItemStatus(object): """Status of a test item in a dependency manager.
""" Phases = ('setup', 'call', 'teardown') def __init__(self): self.results = { w:None for w in self.Phases } def __str__(self): l = ["%s: %s" % (w, self.results[w]) for w in self.Phases] return "Status(%s)" % ", ".join(l) def addResult(self, rep): self.results[rep.when] = rep.outcome def isSuccess(self): return list(self.results.values()) == ['passed', 'passed', 'passed'] class DependencyManager(object): """Dependency manager, stores the results of tests. """ ScopeCls = { 'session': pytest.Session, 'package': pytest.Package, 'module': pytest.Module, 'class': pytest.Class, } @classmethod def getManager(cls, item, scope): """Get the DependencyManager object from the node at scope level. Create it, if not yet present. """ node = item.getparent(cls.ScopeCls[scope]) if not node: return None if not hasattr(node, 'dependencyManager'): node.dependencyManager = cls(scope) return node.dependencyManager def __init__(self, scope): self.results = {} self.scope = scope def addResult(self, item, name, rep): if not name: # Old versions of pytest used to add an extra "::()" to # the node ids of class methods to denote the class # instance. This has been removed in pytest 4.0.0. nodeid = item.nodeid.replace("::()::", "::") if self.scope == 'session' or self.scope == 'package': name = nodeid elif self.scope == 'module': name = nodeid.split("::", 1)[1] elif self.scope == 'class': name = nodeid.split("::", 2)[2] else: raise RuntimeError("Internal error: invalid scope '%s'" % self.scope) status = self.results.setdefault(name, DependencyItemStatus()) status.addResult(rep) def checkDepend(self, depends, item): for i in depends: if i in self.results: if self.results[i].isSuccess(): continue else: if _ignore_unknown: continue pytest.skip("%s depends on %s" % (item.name, i)) def depends(request, other, scope='module'): """Add dependency on other test. Call pytest.skip() unless a successful outcome of all of the tests in other has been registered previously. 
This has the same effect as the `depends` keyword argument to the :func:`pytest.mark.dependency` marker. In contrast to the marker, this function may be called at runtime during a test. :param request: the value of the `request` pytest fixture related to the current test. :param other: dependencies, a list of names of tests that this test depends on. The names of the dependencies must be adapted to the scope. :type other: iterable of :class:`str` :param scope: the scope to search for the dependencies. Must be either `'session'`, `'package'`, `'module'`, or `'class'`. :type scope: :class:`str` .. versionadded:: 0.2 .. versionchanged:: 0.5.0 the scope parameter has been added. """ item = request.node manager = DependencyManager.getManager(item, scope=scope) manager.checkDepend(other, item) def pytest_addoption(parser): parser.addini("automark_dependency", "Add the dependency marker to all tests automatically", default=False) parser.addoption("--ignore-unknown-dependency", action="store_true", default=False, help="ignore dependencies whose outcome is not known") def pytest_configure(config): global _automark, _ignore_unknown _automark = _get_bool(config.getini("automark_dependency")) _ignore_unknown = config.getoption("--ignore-unknown-dependency") config.addinivalue_line("markers", "dependency(name=None, depends=[]): " "mark a test to be used as a dependency for " "other tests or to depend on other tests.") @pytest.hookimpl(tryfirst=True, hookwrapper=True) def pytest_runtest_makereport(item, call): """Store the test outcome if this item is marked "dependency". 
""" outcome = yield marker = item.get_closest_marker("dependency") if marker is not None or _automark: rep = outcome.get_result() name = marker.kwargs.get('name') if marker is not None else None for scope in DependencyManager.ScopeCls: manager = DependencyManager.getManager(item, scope=scope) if (manager): manager.addResult(item, name, rep) def pytest_runtest_setup(item): """Check dependencies if this item is marked "dependency". Skip if any of the dependencies has not been run successfully. """ marker = item.get_closest_marker("dependency") if marker is not None: depends = marker.kwargs.get('depends') if depends: scope = marker.kwargs.get('scope', 'module') manager = DependencyManager.getManager(item, scope=scope) manager.checkDepend(depends, item) pytest-dependency-0.5.1/requirements.txt0000644000175000001440000000003613621604577020600 0ustar rolfusers00000000000000pytest >=3.6.0 setuptools_scm pytest-dependency-0.5.1/setup.cfg0000644000175000001440000000004613621604715017130 0ustar rolfusers00000000000000[egg_info] tag_build = tag_date = 0 pytest-dependency-0.5.1/setup.py0000755000175000001440000000542613621604204017024 0ustar rolfusers00000000000000#! /usr/bin/python """pytest-dependency - Manage dependencies of tests This pytest plugin manages dependencies of tests. It allows to mark some tests as dependent from other tests. These tests will then be skipped if any of the dependencies did fail or has been skipped. 
""" import distutils.log import os import os.path import re import string from setuptools import setup import setuptools.command.sdist as st_sdist try: import setuptools_scm version = setuptools_scm.get_version() with open(".version", "wt") as f: f.write(version) except (ImportError, LookupError): try: with open(".version", "rt") as f: version = f.read() except (OSError, IOError): distutils.log.warn("warning: cannot determine version number") version = "UNKNOWN" class sdist(st_sdist.sdist): def make_release_tree(self, base_dir, files): st_sdist.sdist.make_release_tree(self, base_dir, files) if not self.dry_run: src = "pytest_dependency.py" dest = os.path.join(base_dir, src) if hasattr(os, 'link') and os.path.exists(dest): os.unlink(dest) subst = {'DOC': __doc__, 'VERSION': version} with open(src, "rt") as sf, open(dest, "wt") as df: df.write(string.Template(sf.read()).substitute(subst)) setup( name='pytest-dependency', version=version, description='Manage dependencies of tests', author='Rolf Krahl', author_email='rolf@rotkraut.de', maintainer='Rolf Krahl', maintainer_email='rolf@rotkraut.de', url='https://github.com/RKrahl/pytest-dependency', license='Apache Software License 2.0', long_description=__doc__, project_urls={ 'Documentation': 'https://pytest-dependency.readthedocs.io/', 'Source Code': 'https://github.com/RKrahl/pytest-dependency', }, py_modules=['pytest_dependency'], install_requires=['pytest >= 3.6.0'], classifiers=[ 'Development Status :: 4 - Beta', 'Framework :: Pytest', 'Intended Audience :: Developers', 'Topic :: Software Development :: Testing', 'Programming Language :: Python', 'Programming Language :: Python :: 2', 'Programming Language :: Python :: 2.7', 'Programming Language :: Python :: 3', 'Programming Language :: Python :: 3.4', 'Programming Language :: Python :: 3.5', 'Programming Language :: Python :: 3.6', 'Programming Language :: Python :: 3.7', 'Programming Language :: Python :: 3.8', 'Operating System :: OS Independent', 'License :: 
OSI Approved :: Apache Software License', ], entry_points={ 'pytest11': [ 'dependency = pytest_dependency', ], }, cmdclass = {'sdist': sdist}, ) pytest-dependency-0.5.1/tests/0000755000175000001440000000000013621604715016451 5ustar rolfusers00000000000000pytest-dependency-0.5.1/tests/conftest.py0000644000175000001440000000055513621604204020646 0ustar rolfusers00000000000000import pytest pytest_plugins = "pytester" @pytest.fixture def ctestdir(testdir): testdir.makefile('.ini', pytest=""" [pytest] console_output_style = classic """) testdir.makeconftest(""" import sys if "pytest_dependency" not in sys.modules: pytest_plugins = "pytest_dependency" """) return testdir pytest-dependency-0.5.1/tests/pytest.ini0000644000175000001440000000003213621604204020466 0ustar rolfusers00000000000000[pytest] minversion = 3.6 pytest-dependency-0.5.1/tests/test_01_marker.py0000644000175000001440000000141513621604204021635 0ustar rolfusers00000000000000"""The most basic test: check that the marker works. """ import pytest def test_marker_registered(ctestdir): result = ctestdir.runpytest("--markers") result.stdout.fnmatch_lines(""" @pytest.mark.dependency* """) def test_marker(ctestdir): ctestdir.makepyfile(""" import pytest from pytest_dependency import DependencyManager @pytest.mark.dependency() def test_marker(request): node = request.node.getparent(pytest.Module) assert hasattr(node, 'dependencyManager') assert isinstance(node.dependencyManager, DependencyManager) assert 'test_marker' in node.dependencyManager.results """) result = ctestdir.runpytest("--verbose") result.assert_outcomes(passed=1) pytest-dependency-0.5.1/tests/test_02_simple_dependency.py0000644000175000001440000001154213621604204024046 0ustar rolfusers00000000000000"""Simple dependencies between tests. """ import pytest def test_no_skip(ctestdir): """One test is skipped, but no other test depends on it, so all other tests pass. 
""" ctestdir.makepyfile(""" import pytest @pytest.mark.dependency() def test_a(): pytest.skip("explicit skip") @pytest.mark.dependency() def test_b(): pass @pytest.mark.dependency(depends=["test_b"]) def test_c(): pass @pytest.mark.dependency(depends=["test_c"]) def test_d(): pass """) result = ctestdir.runpytest("--verbose") result.assert_outcomes(passed=3, skipped=1, failed=0) result.stdout.fnmatch_lines(""" *::test_a SKIPPED *::test_b PASSED *::test_c PASSED *::test_d PASSED """) def test_skip_depend(ctestdir): """One test is skipped, other dependent tests are skipped as well. This also includes indirect dependencies. """ ctestdir.makepyfile(""" import pytest @pytest.mark.dependency() def test_a(): pass @pytest.mark.dependency() def test_b(): pytest.skip("explicit skip") @pytest.mark.dependency(depends=["test_b"]) def test_c(): pass @pytest.mark.dependency(depends=["test_c"]) def test_d(): pass """) result = ctestdir.runpytest("--verbose") result.assert_outcomes(passed=1, skipped=3, failed=0) result.stdout.fnmatch_lines(""" *::test_a PASSED *::test_b SKIPPED *::test_c SKIPPED *::test_d SKIPPED """) def test_fail_depend(ctestdir): """One test fails, other dependent tests are skipped. This also includes indirect dependencies. """ ctestdir.makepyfile(""" import pytest @pytest.mark.dependency() def test_a(): pass @pytest.mark.dependency() def test_b(): assert False @pytest.mark.dependency(depends=["test_b"]) def test_c(): pass @pytest.mark.dependency(depends=["test_c"]) def test_d(): pass """) result = ctestdir.runpytest("--verbose") result.assert_outcomes(passed=1, skipped=2, failed=1) result.stdout.fnmatch_lines(""" *::test_a PASSED *::test_b FAILED *::test_c SKIPPED *::test_d SKIPPED """) def test_named_fail_depend(ctestdir): """Same as test_fail_depend, but using custom test names. 
""" ctestdir.makepyfile(""" import pytest @pytest.mark.dependency(name="a") def test_a(): pass @pytest.mark.dependency(name="b") def test_b(): assert False @pytest.mark.dependency(name="c", depends=["b"]) def test_c(): pass @pytest.mark.dependency(name="d", depends=["c"]) def test_d(): pass """) result = ctestdir.runpytest("--verbose") result.assert_outcomes(passed=1, skipped=2, failed=1) result.stdout.fnmatch_lines(""" *::test_a PASSED *::test_b FAILED *::test_c SKIPPED *::test_d SKIPPED """) def test_explicit_select(ctestdir): """Explicitly select only a single test that depends on another one. Since the other test has not been run at all, the selected test will be skipped. """ ctestdir.makepyfile(""" import pytest @pytest.mark.dependency() def test_a(): pass @pytest.mark.dependency() def test_b(): pass @pytest.mark.dependency() def test_c(): pass @pytest.mark.dependency(depends=["test_c"]) def test_d(): pass """) result = ctestdir.runpytest("--verbose", "test_explicit_select.py::test_d") result.assert_outcomes(passed=0, skipped=1, failed=0) result.stdout.fnmatch_lines(""" *::test_d SKIPPED """) def test_depend_unknown(ctestdir): """Depend on an unknown test that is not even defined in the test set. Note that is not an error to depend on an undefined test, but the dependent test will be skipped since the non-existent dependency has not been run successfully. """ ctestdir.makepyfile(""" import pytest @pytest.mark.dependency() def test_a(): pass @pytest.mark.dependency() def test_b(): pass @pytest.mark.dependency() def test_c(): pass @pytest.mark.dependency(depends=["test_x"]) def test_d(): pass """) result = ctestdir.runpytest("--verbose") result.assert_outcomes(passed=3, skipped=1, failed=0) result.stdout.fnmatch_lines(""" *::test_a PASSED *::test_b PASSED *::test_c PASSED *::test_d SKIPPED """) pytest-dependency-0.5.1/tests/test_03_class.py0000644000175000001440000000663413621604204021473 0ustar rolfusers00000000000000"""Usage with test classes. 
""" import pytest def test_class_simple(ctestdir): """Simple dependencies of test methods in a class. test_a() deliberately fails, some other methods depend on it, some don't. """ ctestdir.makepyfile(""" import pytest class TestClass(object): @pytest.mark.dependency() def test_a(self): assert False @pytest.mark.dependency() def test_b(self): pass @pytest.mark.dependency(depends=["TestClass::test_a"]) def test_c(self): pass @pytest.mark.dependency(depends=["TestClass::test_b"]) def test_d(self): pass @pytest.mark.dependency(depends=["TestClass::test_b", "TestClass::test_c"]) def test_e(self): pass """) result = ctestdir.runpytest("--verbose") result.assert_outcomes(passed=2, skipped=2, failed=1) result.stdout.fnmatch_lines(""" *::TestClass::test_a FAILED *::TestClass::test_b PASSED *::TestClass::test_c SKIPPED *::TestClass::test_d PASSED *::TestClass::test_e SKIPPED """) def test_class_simple_named(ctestdir): """Mostly the same as test_class_simple(), but name the test methods now explicitely. """ ctestdir.makepyfile(""" import pytest class TestClassNamed(object): @pytest.mark.dependency(name="a") def test_a(self): assert False @pytest.mark.dependency(name="b") def test_b(self): pass @pytest.mark.dependency(name="c", depends=["a"]) def test_c(self): pass @pytest.mark.dependency(name="d", depends=["b"]) def test_d(self): pass @pytest.mark.dependency(name="e", depends=["b", "c"]) def test_e(self): pass """) result = ctestdir.runpytest("--verbose") result.assert_outcomes(passed=2, skipped=2, failed=1) result.stdout.fnmatch_lines(""" *::TestClassNamed::test_a FAILED *::TestClassNamed::test_b PASSED *::TestClassNamed::test_c SKIPPED *::TestClassNamed::test_d PASSED *::TestClassNamed::test_e SKIPPED """) def test_class_default_name(ctestdir): """Issue #6: for methods of test classes, the default name used to be the method name. This could have caused conflicts if there is a function having the same name outside the class. 
In the following example, before fixing this issue, the method test_a() of class TestClass would have shadowed the failure of function test_a(). Now the class name is prepended to the default test name, removing this conflict. """ ctestdir.makepyfile(""" import pytest @pytest.mark.dependency() def test_a(): assert False class TestClass(object): @pytest.mark.dependency() def test_a(self): pass @pytest.mark.dependency(depends=["test_a"]) def test_b(): pass """) result = ctestdir.runpytest("--verbose") result.assert_outcomes(passed=1, skipped=1, failed=1) result.stdout.fnmatch_lines(""" *::test_a FAILED *::TestClass::test_a PASSED *::test_b SKIPPED """) pytest-dependency-0.5.1/tests/test_03_multiple_dependency.py0000644000175000001440000000317713621604204024416 0ustar rolfusers00000000000000"""A complicated scenario with tests having multiple dependencies. """ import pytest def test_multiple(ctestdir): ctestdir.makepyfile(""" import pytest @pytest.mark.dependency(name="a") def test_a(): pytest.skip("explicit skip") @pytest.mark.dependency(name="b") def test_b(): assert False @pytest.mark.dependency(name="c") def test_c(): pass @pytest.mark.dependency(name="d") def test_d(): pass @pytest.mark.dependency(name="e") def test_e(): pass @pytest.mark.dependency(name="f", depends=["a", "c"]) def test_f(): pass @pytest.mark.dependency(name="g", depends=["b", "d"]) def test_g(): pass @pytest.mark.dependency(name="h", depends=["c", "e"]) def test_h(): pass @pytest.mark.dependency(name="i", depends=["f", "h"]) def test_i(): pass @pytest.mark.dependency(name="j", depends=["d", "h"]) def test_j(): pass @pytest.mark.dependency(name="k", depends=["g", "i", "j"]) def test_k(): pass """) result = ctestdir.runpytest("--verbose") result.assert_outcomes(passed=5, skipped=5, failed=1) result.stdout.fnmatch_lines(""" *::test_a SKIPPED *::test_b FAILED *::test_c PASSED *::test_d PASSED *::test_e PASSED *::test_f SKIPPED *::test_g SKIPPED *::test_h PASSED *::test_i SKIPPED *::test_j PASSED 
*::test_k SKIPPED """) pytest-dependency-0.5.1/tests/test_03_param.py0000644000175000001440000000361513621604204021462 0ustar rolfusers00000000000000"""A scenario featuring parametrized tests. """ import pytest def test_multiple(ctestdir): ctestdir.makepyfile(""" import pytest _md = pytest.mark.dependency @pytest.mark.parametrize("x,y", [ pytest.param(0, 0, marks=_md(name="a1")), pytest.param(0, 1, marks=_md(name="a2")), pytest.param(1, 0, marks=_md(name="a3")), pytest.param(1, 1, marks=_md(name="a4")) ]) def test_a(x,y): assert x==0 or y==0 @pytest.mark.parametrize("u,v", [ pytest.param(1, 2, marks=_md(name="b1", depends=["a1", "a2"])), pytest.param(1, 3, marks=_md(name="b2", depends=["a1", "a3"])), pytest.param(1, 4, marks=_md(name="b3", depends=["a1", "a4"])), pytest.param(2, 3, marks=_md(name="b4", depends=["a2", "a3"])), pytest.param(2, 4, marks=_md(name="b5", depends=["a2", "a4"])), pytest.param(3, 4, marks=_md(name="b6", depends=["a3", "a4"])) ]) def test_b(u,v): pass @pytest.mark.parametrize("w", [ pytest.param(1, marks=_md(name="c1", depends=["b1", "b3", "b5"])), pytest.param(2, marks=_md(name="c2", depends=["b1", "b3", "b6"])), pytest.param(3, marks=_md(name="c3", depends=["b1", "b2", "b4"])) ]) def test_c(w): pass """) result = ctestdir.runpytest("--verbose") result.assert_outcomes(passed=7, skipped=5, failed=1) result.stdout.fnmatch_lines(""" *::test_a?0-0? PASSED *::test_a?0-1? PASSED *::test_a?1-0? PASSED *::test_a?1-1? FAILED *::test_b?1-2? PASSED *::test_b?1-3? PASSED *::test_b?1-4? SKIPPED *::test_b?2-3? PASSED *::test_b?2-4? SKIPPED *::test_b?3-4? SKIPPED *::test_c?1? SKIPPED *::test_c?2? SKIPPED *::test_c?3? PASSED """) pytest-dependency-0.5.1/tests/test_03_runtime.py0000644000175000001440000000177413621604204022051 0ustar rolfusers00000000000000"""Using depends() to mark dependencies at runtime. """ import pytest def test_skip_depend_runtime(ctestdir): """One test is skipped, other dependent tests are skipped as well. 
This also includes indirect dependencies. """ ctestdir.makepyfile(""" import pytest from pytest_dependency import depends @pytest.mark.dependency() def test_a(): pass @pytest.mark.dependency() def test_b(): pytest.skip("explicit skip") @pytest.mark.dependency() def test_c(request): depends(request, ["test_b"]) pass @pytest.mark.dependency() def test_d(request): depends(request, ["test_a", "test_c"]) pass """) result = ctestdir.runpytest("--verbose") result.assert_outcomes(passed=1, skipped=3, failed=0) result.stdout.fnmatch_lines(""" *::test_a PASSED *::test_b SKIPPED *::test_c SKIPPED *::test_d SKIPPED """) pytest-dependency-0.5.1/tests/test_03_scope.py0000644000175000001440000004147613621604204021500 0ustar rolfusers00000000000000"""Specifying the scope of dependencies. """ import pytest def test_scope_module(ctestdir): """One single module, module scope is explicitly set in the pytest.mark.dependency() marker. """ ctestdir.makepyfile(""" import pytest @pytest.mark.dependency() def test_a(): assert False @pytest.mark.dependency() def test_b(): pass @pytest.mark.dependency(depends=["test_a"], scope='module') def test_c(): pass @pytest.mark.dependency(depends=["test_b"], scope='module') def test_d(): pass @pytest.mark.dependency(depends=["test_b", "test_c"], scope='module') def test_e(): pass """) result = ctestdir.runpytest("--verbose") result.assert_outcomes(passed=2, skipped=2, failed=1) result.stdout.fnmatch_lines(""" test_scope_module.py::test_a FAILED test_scope_module.py::test_b PASSED test_scope_module.py::test_c SKIPPED test_scope_module.py::test_d PASSED test_scope_module.py::test_e SKIPPED """) def test_scope_session(ctestdir): """Two modules, some cross module dependencies in session scope.
""" ctestdir.makepyfile(test_scope_session_01=""" import pytest @pytest.mark.dependency() def test_a(): pass @pytest.mark.dependency() def test_b(): assert False @pytest.mark.dependency(depends=["test_a"]) def test_c(): pass class TestClass(object): @pytest.mark.dependency() def test_b(self): pass """, test_scope_session_02=""" import pytest @pytest.mark.dependency() def test_a(): assert False @pytest.mark.dependency( depends=["test_scope_session_01.py::test_a", "test_scope_session_01.py::test_c"], scope='session' ) def test_e(): pass @pytest.mark.dependency( depends=["test_scope_session_01.py::test_b"], scope='session' ) def test_f(): pass @pytest.mark.dependency( depends=["test_scope_session_02.py::test_e"], scope='session' ) def test_g(): pass @pytest.mark.dependency( depends=["test_scope_session_01.py::TestClass::test_b"], scope='session' ) def test_h(): pass """) result = ctestdir.runpytest("--verbose") result.assert_outcomes(passed=6, skipped=1, failed=2) result.stdout.fnmatch_lines(""" test_scope_session_01.py::test_a PASSED test_scope_session_01.py::test_b FAILED test_scope_session_01.py::test_c PASSED test_scope_session_01.py::TestClass::test_b PASSED test_scope_session_02.py::test_a FAILED test_scope_session_02.py::test_e PASSED test_scope_session_02.py::test_f SKIPPED test_scope_session_02.py::test_g PASSED test_scope_session_02.py::test_h PASSED """) def test_scope_package(ctestdir): """Two packages, some cross module dependencies within the package and across package boundaries. 
""" ctestdir.mkpydir("test_scope_package_a") ctestdir.mkpydir("test_scope_package_b") srcs = { 'test_scope_package_a/test_01': """ import pytest @pytest.mark.dependency() def test_a(): pass """, 'test_scope_package_b/test_02': """ import pytest @pytest.mark.dependency() def test_c(): pass @pytest.mark.dependency() def test_d(): assert False """, 'test_scope_package_b/test_03': """ import pytest @pytest.mark.dependency( depends=["test_scope_package_a/test_01.py::test_a"], scope='session' ) def test_e(): pass @pytest.mark.dependency( depends=["test_scope_package_a/test_01.py::test_a"], scope='package' ) def test_f(): pass @pytest.mark.dependency( depends=["test_scope_package_b/test_02.py::test_c"], scope='package' ) def test_g(): pass @pytest.mark.dependency( depends=["test_scope_package_b/test_02.py::test_d"], scope='package' ) def test_h(): pass """, } ctestdir.makepyfile(**srcs) result = ctestdir.runpytest("--verbose") result.assert_outcomes(passed=4, skipped=2, failed=1) result.stdout.fnmatch_lines(""" test_scope_package_a/test_01.py::test_a PASSED test_scope_package_b/test_02.py::test_c PASSED test_scope_package_b/test_02.py::test_d FAILED test_scope_package_b/test_03.py::test_e PASSED test_scope_package_b/test_03.py::test_f SKIPPED test_scope_package_b/test_03.py::test_g PASSED test_scope_package_b/test_03.py::test_h SKIPPED """) def test_scope_class(ctestdir): """Dependencies in class scope. 
""" ctestdir.makepyfile(""" import pytest @pytest.mark.dependency() def test_a(): assert False @pytest.mark.dependency() def test_b(): pass class TestClass1(object): @pytest.mark.dependency() def test_c(self): pass class TestClass2(object): @pytest.mark.dependency() def test_a(self): pass @pytest.mark.dependency() def test_b(self): assert False @pytest.mark.dependency(depends=["test_a"]) def test_d(self): pass @pytest.mark.dependency(depends=["test_b"]) def test_e(self): pass @pytest.mark.dependency(depends=["test_a"], scope='class') def test_f(self): pass @pytest.mark.dependency(depends=["test_b"], scope='class') def test_g(self): pass @pytest.mark.dependency(depends=["test_c"], scope='class') def test_h(self): pass """) result = ctestdir.runpytest("--verbose") result.assert_outcomes(passed=5, skipped=3, failed=2) result.stdout.fnmatch_lines(""" test_scope_class.py::test_a FAILED test_scope_class.py::test_b PASSED test_scope_class.py::TestClass1::test_c PASSED test_scope_class.py::TestClass2::test_a PASSED test_scope_class.py::TestClass2::test_b FAILED test_scope_class.py::TestClass2::test_d SKIPPED test_scope_class.py::TestClass2::test_e PASSED test_scope_class.py::TestClass2::test_f PASSED test_scope_class.py::TestClass2::test_g SKIPPED test_scope_class.py::TestClass2::test_h SKIPPED """) def test_scope_nodeid(ctestdir): """The default name of a test is the node id. The references to the default names must be adapted according to the scope. 
""" ctestdir.makepyfile(""" import pytest @pytest.mark.dependency() def test_a(): pass @pytest.mark.dependency( depends=["test_a"], scope='module' ) def test_b(): pass @pytest.mark.dependency( depends=["test_scope_nodeid.py::test_a"], scope='module' ) def test_c(): pass @pytest.mark.dependency( depends=["test_a"], scope='session' ) def test_d(): pass @pytest.mark.dependency( depends=["test_scope_nodeid.py::test_a"], scope='session' ) def test_e(): pass class TestClass(object): @pytest.mark.dependency() def test_f(self): pass @pytest.mark.dependency( depends=["test_f"], scope='class' ) def test_g(self): pass @pytest.mark.dependency( depends=["TestClass::test_f"], scope='class' ) def test_h(self): pass @pytest.mark.dependency( depends=["test_scope_nodeid.py::TestClass::test_f"], scope='class' ) def test_i(self): pass @pytest.mark.dependency( depends=["test_f"], scope='module' ) def test_j(self): pass @pytest.mark.dependency( depends=["TestClass::test_f"], scope='module' ) def test_k(self): pass @pytest.mark.dependency( depends=["test_scope_nodeid.py::TestClass::test_f"], scope='module' ) def test_l(self): pass @pytest.mark.dependency( depends=["test_f"], scope='session' ) def test_m(self): pass @pytest.mark.dependency( depends=["TestClass::test_f"], scope='session' ) def test_n(self): pass @pytest.mark.dependency( depends=["test_scope_nodeid.py::TestClass::test_f"], scope='session' ) def test_o(self): pass """) result = ctestdir.runpytest("--verbose") result.assert_outcomes(passed=7, skipped=8, failed=0) result.stdout.fnmatch_lines(""" test_scope_nodeid.py::test_a PASSED test_scope_nodeid.py::test_b PASSED test_scope_nodeid.py::test_c SKIPPED test_scope_nodeid.py::test_d SKIPPED test_scope_nodeid.py::test_e PASSED test_scope_nodeid.py::TestClass::test_f PASSED test_scope_nodeid.py::TestClass::test_g PASSED test_scope_nodeid.py::TestClass::test_h SKIPPED test_scope_nodeid.py::TestClass::test_i SKIPPED test_scope_nodeid.py::TestClass::test_j SKIPPED 
test_scope_nodeid.py::TestClass::test_k PASSED test_scope_nodeid.py::TestClass::test_l SKIPPED test_scope_nodeid.py::TestClass::test_m SKIPPED test_scope_nodeid.py::TestClass::test_n SKIPPED test_scope_nodeid.py::TestClass::test_o PASSED """) def test_scope_named(ctestdir): """Explicitely named tests are always referenced by that name, regardless of the scope. """ ctestdir.makepyfile(""" import pytest @pytest.mark.dependency(name="a") def test_a(): pass @pytest.mark.dependency( depends=["a"], scope='module' ) def test_b(): pass @pytest.mark.dependency( depends=["test_a"], scope='module' ) def test_c(): pass @pytest.mark.dependency( depends=["a"], scope='session' ) def test_d(): pass @pytest.mark.dependency( depends=["test_scope_named.py::test_a"], scope='session' ) def test_e(): pass class TestClass(object): @pytest.mark.dependency(name="f") def test_f(self): pass @pytest.mark.dependency( depends=["f"], scope='class' ) def test_g(self): pass @pytest.mark.dependency( depends=["test_f"], scope='class' ) def test_h(self): pass @pytest.mark.dependency( depends=["f"], scope='module' ) def test_i(self): pass @pytest.mark.dependency( depends=["TestClass::test_f"], scope='module' ) def test_j(self): pass @pytest.mark.dependency( depends=["f"], scope='session' ) def test_k(self): pass @pytest.mark.dependency( depends=["test_scope_named.py::TestClass::test_f"], scope='session' ) def test_l(self): pass """) result = ctestdir.runpytest("--verbose") result.assert_outcomes(passed=7, skipped=5, failed=0) result.stdout.fnmatch_lines(""" test_scope_named.py::test_a PASSED test_scope_named.py::test_b PASSED test_scope_named.py::test_c SKIPPED test_scope_named.py::test_d PASSED test_scope_named.py::test_e SKIPPED test_scope_named.py::TestClass::test_f PASSED test_scope_named.py::TestClass::test_g PASSED test_scope_named.py::TestClass::test_h SKIPPED test_scope_named.py::TestClass::test_i PASSED test_scope_named.py::TestClass::test_j SKIPPED test_scope_named.py::TestClass::test_k 
PASSED test_scope_named.py::TestClass::test_l SKIPPED """) def test_scope_dependsfunc(ctestdir): """Test the scope argument to the depends() function. """ ctestdir.makepyfile(test_scope_dependsfunc_01=""" import pytest @pytest.mark.dependency() def test_a(): pass @pytest.mark.dependency() def test_b(): assert False @pytest.mark.dependency(depends=["test_a"]) def test_c(): pass class TestClass(object): @pytest.mark.dependency() def test_b(self): pass """, test_scope_dependsfunc_02=""" import pytest from pytest_dependency import depends @pytest.mark.dependency() def test_a(): assert False @pytest.mark.dependency() def test_b(): pass @pytest.mark.dependency() def test_e(request): depends(request, ["test_scope_dependsfunc_01.py::test_a", "test_scope_dependsfunc_01.py::test_c"], scope='session') pass @pytest.mark.dependency() def test_f(request): depends(request, ["test_scope_dependsfunc_01.py::test_b"], scope='session') pass @pytest.mark.dependency() def test_g(request): depends(request, ["test_scope_dependsfunc_02.py::test_e"], scope='session') pass @pytest.mark.dependency() def test_h(request): depends(request, ["test_scope_dependsfunc_01.py::TestClass::test_b"], scope='session') pass @pytest.mark.dependency() def test_i(request): depends(request, ["test_a"], scope='module') pass @pytest.mark.dependency() def test_j(request): depends(request, ["test_b"], scope='module') pass class TestClass(object): @pytest.mark.dependency() def test_a(self): pass @pytest.mark.dependency() def test_b(self): assert False @pytest.mark.dependency() def test_c(self, request): depends(request, ["test_a"], scope='class') pass @pytest.mark.dependency() def test_d(self, request): depends(request, ["test_b"], scope='class') pass """) result = ctestdir.runpytest("--verbose") result.assert_outcomes(passed=10, skipped=3, failed=3) result.stdout.fnmatch_lines(""" test_scope_dependsfunc_01.py::test_a PASSED test_scope_dependsfunc_01.py::test_b FAILED test_scope_dependsfunc_01.py::test_c PASSED 
test_scope_dependsfunc_01.py::TestClass::test_b PASSED test_scope_dependsfunc_02.py::test_a FAILED test_scope_dependsfunc_02.py::test_b PASSED test_scope_dependsfunc_02.py::test_e PASSED test_scope_dependsfunc_02.py::test_f SKIPPED test_scope_dependsfunc_02.py::test_g PASSED test_scope_dependsfunc_02.py::test_h PASSED test_scope_dependsfunc_02.py::test_i SKIPPED test_scope_dependsfunc_02.py::test_j PASSED test_scope_dependsfunc_02.py::TestClass::test_a PASSED test_scope_dependsfunc_02.py::TestClass::test_b FAILED test_scope_dependsfunc_02.py::TestClass::test_c PASSED test_scope_dependsfunc_02.py::TestClass::test_d SKIPPED """) pytest-dependency-0.5.1/tests/test_03_skipmsgs.py0000644000175000001440000000176213621604204022223 0ustar rolfusers00000000000000"""Verify the messages issued when a dependent test is skipped. """ import pytest def test_simple(ctestdir): """One test fails, other dependent tests are skipped. This also includes indirect dependencies. """ ctestdir.makepyfile(""" import pytest @pytest.mark.dependency() def test_a(): pass @pytest.mark.dependency() def test_b(): assert False @pytest.mark.dependency(depends=["test_b"]) def test_c(): pass @pytest.mark.dependency(depends=["test_c"]) def test_d(): pass """) result = ctestdir.runpytest("--verbose", "-rs") result.assert_outcomes(passed=1, skipped=2, failed=1) result.stdout.fnmatch_lines(""" *::test_a PASSED *::test_b FAILED *::test_c SKIPPED *::test_d SKIPPED """) result.stdout.fnmatch_lines_random(""" SKIP* test_c depends on test_b SKIP* test_d depends on test_c """) pytest-dependency-0.5.1/tests/test_04_automark.py0000644000175000001440000000457213621604204022211 0ustar rolfusers00000000000000"""Test the automark_dependency option. """ import pytest def test_not_set(ctestdir): """No pytest.ini file, e.g. automark_dependency is not set. Since automark_dependency defaults to false and test_a is not marked, the outcome of test_a will not be recorded. 
As a result, test_b will be skipped due to a missing dependency. """ ctestdir.makepyfile(""" import pytest def test_a(): pass @pytest.mark.dependency(depends=["test_a"]) def test_b(): pass """) result = ctestdir.runpytest("--verbose", "-rs") result.assert_outcomes(passed=1, skipped=1, failed=0) result.stdout.fnmatch_lines(""" *::test_a PASSED *::test_b SKIPPED """) def test_set_false(ctestdir): """A pytest.ini is present, automark_dependency is set to false. Since automark_dependency is set to false and test_a is not marked, the outcome of test_a will not be recorded. As a result, test_b will be skipped due to a missing dependency. """ ctestdir.makefile('.ini', pytest=""" [pytest] automark_dependency = false console_output_style = classic """) ctestdir.makepyfile(""" import pytest def test_a(): pass @pytest.mark.dependency(depends=["test_a"]) def test_b(): pass """) result = ctestdir.runpytest("--verbose", "-rs") result.assert_outcomes(passed=1, skipped=1, failed=0) result.stdout.fnmatch_lines(""" *::test_a PASSED *::test_b SKIPPED """) def test_set_true(ctestdir): """A pytest.ini is present, automark_dependency is set to false. Since automark_dependency is set to true, the outcome of test_a will be recorded, even though it is not marked. As a result, test_b will be skipped due to a missing dependency. """ ctestdir.makefile('.ini', pytest=""" [pytest] automark_dependency = true console_output_style = classic """) ctestdir.makepyfile(""" import pytest def test_a(): pass @pytest.mark.dependency(depends=["test_a"]) def test_b(): pass """) result = ctestdir.runpytest("--verbose", "-rs") result.assert_outcomes(passed=2, skipped=0, failed=0) result.stdout.fnmatch_lines(""" *::test_a PASSED *::test_b PASSED """) pytest-dependency-0.5.1/tests/test_04_ignore_unknown.py0000644000175000001440000000352613621604204023426 0ustar rolfusers00000000000000"""Test the ignore-unknown-dependency command line option. 
""" import pytest def test_no_ignore(ctestdir): """No command line option, e.g. ignore-unknown-dependency is not set. Explicitly select only a single test that depends on another one. Since the other test has not been run at all, the selected test will be skipped. """ ctestdir.makepyfile(""" import pytest @pytest.mark.dependency() def test_a(): pass @pytest.mark.dependency() def test_b(): pass @pytest.mark.dependency() def test_c(): pass @pytest.mark.dependency(depends=["test_c"]) def test_d(): pass """) result = ctestdir.runpytest("--verbose", "test_no_ignore.py::test_d") result.assert_outcomes(passed=0, skipped=1, failed=0) result.stdout.fnmatch_lines(""" *::test_d SKIPPED """) def test_ignore(ctestdir): """Set the ignore-unknown-dependency command line option. Explicitly select only a single test that depends on another one. The other test has not been run at all, but since unknown dependencies will be ignored, the selected test will be run nevertheless. """ ctestdir.makepyfile(""" import pytest @pytest.mark.dependency() def test_a(): pass @pytest.mark.dependency() def test_b(): pass @pytest.mark.dependency() def test_c(): pass @pytest.mark.dependency(depends=["test_c"]) def test_d(): pass """) result = ctestdir.runpytest("--verbose", "--ignore-unknown-dependency", "test_ignore.py::test_d") result.assert_outcomes(passed=1, skipped=0, failed=0) result.stdout.fnmatch_lines(""" *::test_d PASSED """)