././@PaxHeader0000000000000000000000000000003400000000000011452 xustar000000000000000028 mtime=1644391241.5757504 treq-22.2.0/0000775000175000017500000000000000000000000012213 5ustar00twmtwm00000000000000././@PaxHeader0000000000000000000000000000002600000000000011453 xustar000000000000000022 mtime=1573445122.0 treq-22.2.0/.coveragerc0000644000175000017500000000020600000000000014330 0ustar00twmtwm00000000000000[run] source = treq branch = True [paths] source = src/ .tox/*/lib/python*/site-packages/ .tox/pypy*/site-packages/ ././@PaxHeader0000000000000000000000000000002600000000000011453 xustar000000000000000022 mtime=1644391001.0 treq-22.2.0/CHANGELOG.rst0000664000175000017500000002001500000000000014232 0ustar00twmtwm00000000000000========= Changelog ========= .. currentmodule:: treq .. default-role:: any .. towncrier release notes start 22.2.0 (2022-02-08) =================== Features -------- - Python 3.10 and PyPy 3.8 are now supported. (`#338 `__) Bugfixes -------- - Address a regression introduced in Treq 22.1.0 that prevented transmission of cookies with requests to ports other than 80, including HTTPS (443). (`#343 `__) Deprecations and Removals ------------------------- - Support for Python 3.6, which has reached end of support, is deprecated. This is the last release with support for Python 3.6. (`#338 `__) 22.1.0 (2022-01-29) =================== Bugfixes -------- - Cookies specified as a dict were sent to every domain, not just the domain of the request, potentially exposing them on redirect. See `GHSA-fhpf-pp6p-55qc `_. (`#339 `__, CVE-2022-23607) 21.5.0 (2021-05-24) =================== Features -------- - PEP 517/518 ``build-system`` metadata is now provided in ``pyproject.toml``. (`#329 `__) Bugfixes -------- - ``treq.testing.StubTreq`` now persists ``twisted.web.server.Session`` instances between requests. 
(`#327 `__) Improved Documentation ---------------------- - The dependency on Sphinx required to build the documentation has been moved from the ``dev`` extra to the new ``docs`` extra. (`#296 `__) Deprecations and Removals ------------------------- - Support for Python 2.7 and 3.5 has been dropped. treq no longer depends on ``six`` or ``mock``. (`#318 `__) 21.1.0 (2021-01-14) =================== Features -------- - Support for Python 3.9: treq is now tested with CPython 3.9. (`#305 `__) - The *auth* parameter now accepts arbitrary text and `bytes` for usernames and passwords. Text is encoded as UTF-8, per :rfc:`7617`. Previously only ASCII was allowed. (`#268 `__) - treq produces a more helpful exception when passed a tuple of the wrong size in the *files* parameter. (`#299 `__) Bugfixes -------- - The *params* argument once more accepts non-ASCII ``bytes``, fixing a regression first introduced in treq 20.4.1. (`#303 `__) - treq request APIs no longer mutates a :class:`http_headers.Headers ` passed as the *headers* parameter when the *auth* parameter is also passed. (`#314 `__) - The agent returned by :func:`treq.auth.add_auth()` and :func:`treq.auth.add_basic_auth()` is now marked to provide :class:`twisted.web.iweb.IAgent`. (`#312 `__) - treq's package metadata has been updated to require ``six >= 1.13``, noting a dependency introduced in treq 20.9.0. (`#295 `__) Improved Documentation ---------------------- - The documentation of the *params* argument has been updated to more accurately describe its type-coercion behavior. (`#281 `__) - The :mod:`treq.auth` module has been documented. (`#313 `__) Deprecations and Removals ------------------------- - Support for Python 2.7, which has reached end of support, is deprecated. This is the last release with support for Python 2.7. (`#309 `__) - Support for Python 3.5, which has reached end of support, is deprecated. This is the last release with support for Python 3.5. 
(`#306 `__) - Deprecate tolerance of non-string values when passing headers as a dict. They have historically been silently dropped, but will raise TypeError in the next treq release. Also deprecate passing headers other than :class:`dict`, :class:`~twisted.web.http_headers.Headers`, or ``None``. Historically falsy values like ``[]`` or ``()`` were accepted. (`#294 `__) - treq request functions and methods like :func:`treq.get()` and :meth:`HTTPClient.post()` now issue a ``DeprecationWarning`` when passed unknown keyword arguments, rather than ignoring them. Mixing the *json* argument with *files* or *data* is also deprecated. These warnings will change to a ``TypeError`` in the next treq release. (`#297 `__) - The minimum supported Twisted version has increased to 18.7.0. Older versions are no longer tested in CI. (`#307 `__) 20.9.0 (2020-09-27) =================== Features -------- - The *url* parameter of :meth:`HTTPClient.request()` (and shortcuts like :meth:`~HTTPClient.get()`) now accept :class:`hyperlink.DecodedURL` and :class:`hyperlink.URL` in addition to :class:`str` and :class:`bytes`. (`#212 `__) - Compatibility with the upcoming Twisted 20.9.0 release (`#290 `__). Improved Documentation ---------------------- - An example of sending and receiving JSON has been added. (`#278 `__) 20.4.1 (2020-04-16) =================== Bugfixes -------- - Correct a typo in the treq 20.4.0 package metadata that prevented upload to PyPI (`pypa/twine#589 `__) 20.4.0 (2020-04-16) =================== Features -------- - Support for Python 3.8 and PyPy3: treq is now tested with these interpreters. (`#271 `__) Bugfixes -------- - `treq.client.HTTPClient.request()` and its aliases no longer raise `UnicodeEncodeError` when passed a Unicode *url* and non-empty *params*. Now the URL and query parameters are concatenated as documented. (`#264 `__) - In treq 20.3.0 the *params* argument didn't accept parameter names or values that contain the characters ``&`` or ``#``. 
Now these characters are properly escaped. (`#282 `__) Improved Documentation ---------------------- - The treq documentation has been revised to emphasize use of `treq.client.HTTPClient` over the module-level convenience functions in the `treq` module. (`#276 `__) 20.3.0 (2020-03-15) =================== Features -------- - Python 3.7 support. (`#228 `__) Bugfixes -------- - `treq.testing.RequestTraversalAgent` now passes its memory reactor to the `twisted.web.server.Site` it creates, preventing the ``Site`` from polluting the global reactor. (`#225 `__) - `treq.testing` no longer generates deprecation warnings about ``twisted.test.proto_helpers.MemoryReactor``. (`#253 `__) Improved Documentation ---------------------- - The ``download_file.py`` example has been updated to do a streaming download with *unbuffered=True*. (`#233 `__) - The *agent* parameter to `treq.request()` has been documented. (`#235 `__) - The type of the *headers* element of a response tuple passed to `treq.testing.RequestSequence` is now correctly documented as `str`. (`#237 `__) Deprecations and Removals ------------------------- - Drop support for Python 3.4. (`#240 `__) Misc ---- - `#247 `__, `#248 `__, `#249 `__ ././@PaxHeader0000000000000000000000000000002600000000000011453 xustar000000000000000022 mtime=1611559720.0 treq-22.2.0/CONTRIBUTING.rst0000664000175000017500000000157700000000000014666 0ustar00twmtwm00000000000000Developing ========== This project uses `Tox `_ to manage virtual environments. To run the tests:: tox -e py38-twisted_latest Lint:: tox -e flake8 Build docs:: tox -e docs firefox docs/html/index.html To do it all:: tox -p Release notes ------------- We use `towncrier`_ to manage our release notes. Basically, every pull request that has a user visible effect should add a short file to the `changelog.d/ <./changelog.d>`_ directory describing the change, with a name like ..rst. See `changelog.d/README.rst `_ for details. 
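Concretely, adding a news fragment looks something like the following sketch. The issue number ``999`` and the fragment text are invented for illustration, and ``feature`` is one of towncrier's conventional change types (the filename pattern is ``<issue number>.<change type>.rst``) — check ``changelog.d/README.rst`` for the types this project actually accepts:

```shell
# Sketch: create a towncrier news fragment for a hypothetical change.
# "999" stands in for the real issue/PR number; "feature" for the type.
mkdir -p changelog.d
echo "Added an example news fragment." > changelog.d/999.feature.rst
cat changelog.d/999.feature.rst
```

At release time towncrier collects every fragment in ``changelog.d/`` into a single CHANGELOG entry and deletes the fragment files, which is why the directory is empty in a released tarball.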
This way we can keep a good list of changes as we go, which makes the release manager happy, which means we get more frequent releases, which means your change gets into users’ hands faster. .. _towncrier: https://pypi.org/project/towncrier/ ././@PaxHeader0000000000000000000000000000002600000000000011453 xustar000000000000000022 mtime=1609024904.0 treq-22.2.0/LICENSE0000664000175000017500000000207500000000000013224 0ustar00twmtwm00000000000000This is the MIT license. Copyright (c) 2012-2014 David Reid Permission is hereby granted, free of charge, to any person obtaining a copy of this software and associated documentation files (the "Software"), to deal in the Software without restriction, including without limitation the rights to use, copy, modify, merge, publish, distribute, sublicense, and/or sell copies of the Software, and to permit persons to whom the Software is furnished to do so, subject to the following conditions: The above copyright notice and this permission notice shall be included in all copies or substantial portions of the Software. THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE. ././@PaxHeader0000000000000000000000000000002600000000000011453 xustar000000000000000022 mtime=1642980426.0 treq-22.2.0/MANIFEST.in0000664000175000017500000000050000000000000013744 0ustar00twmtwm00000000000000include pyproject.toml include *.rst include *.md include LICENSE include .coveragerc recursive-include docs * prune docs/_build prune docs/html exclude tox.ini exclude .github exclude .readthedocs.yml # This directory will be empty at release time. 
prune changelog.d global-exclude .DS_Store *.pyc *.pyo __pycache__ ././@PaxHeader0000000000000000000000000000003400000000000011452 xustar000000000000000028 mtime=1644391241.5757504 treq-22.2.0/PKG-INFO0000664000175000017500000000566600000000000013325 0ustar00twmtwm00000000000000Metadata-Version: 2.1 Name: treq Version: 22.2.0 Summary: High-level Twisted HTTP Client API Home-page: https://github.com/twisted/treq Author: David Reid Author-email: dreid@dreid.org Maintainer: Tom Most Maintainer-email: twm@freecog.net License: MIT/X Platform: UNKNOWN Classifier: Development Status :: 5 - Production/Stable Classifier: Intended Audience :: Developers Classifier: License :: OSI Approved :: MIT License Classifier: Operating System :: OS Independent Classifier: Framework :: Twisted Classifier: Programming Language :: Python Classifier: Programming Language :: Python :: 3.6 Classifier: Programming Language :: Python :: 3.7 Classifier: Programming Language :: Python :: 3.8 Classifier: Programming Language :: Python :: 3.9 Classifier: Programming Language :: Python :: 3.10 Classifier: Programming Language :: Python :: Implementation :: CPython Classifier: Programming Language :: Python :: Implementation :: PyPy Requires-Python: >=3.6 Description-Content-Type: text/x-rst Provides-Extra: dev Provides-Extra: docs License-File: LICENSE treq: High-level Twisted HTTP Client API ======================================== |pypi|_ |calver|_ |coverage|_ |documentation|_ ``treq`` is an HTTP library inspired by `requests `_ but written on top of `Twisted `_'s `Agents `_. It provides a simple, higher level API for making HTTP requests when using Twisted. .. code-block:: python >>> import treq >>> def done(response): ... print(response.code) ... reactor.stop() >>> treq.get("https://github.com").addCallback(done) >>> from twisted.internet import reactor >>> reactor.run() 200 For more info `read the docs `_. Contributing ------------ ``treq`` development is hosted on `GitHub `_. 
We welcome contributions: feel free to fork and send contributions over. See `CONTRIBUTING.rst `_ for more info. Code of Conduct --------------- Refer to the `Twisted code of conduct `_. Copyright and License --------------------- ``treq`` is made available under the MIT license. See `LICENSE <./LICENSE>`_ for legal details and copyright notices. .. _pypi: https://pypi.org/project/treq/ .. |pypi| image:: https://img.shields.io/pypi/v/treq.svg :alt: PyPI .. _calver: https://calver.org/ .. |calver| image:: https://img.shields.io/badge/calver-YY.MM.MICRO-22bfda.svg :alt: calver: YY.MM.MICRO .. _coverage: https://coveralls.io/github/twisted/treq .. |coverage| image:: https://coveralls.io/repos/github/twisted/treq/badge.svg :alt: Coverage .. _documentation: https://treq.readthedocs.org .. |documentation| image:: https://readthedocs.org/projects/treq/badge/ :alt: Documentation ././@PaxHeader0000000000000000000000000000002600000000000011453 xustar000000000000000022 mtime=1640750759.0 treq-22.2.0/README.rst0000664000175000017500000000361500000000000013707 0ustar00twmtwm00000000000000treq: High-level Twisted HTTP Client API ======================================== |pypi|_ |calver|_ |coverage|_ |documentation|_ ``treq`` is an HTTP library inspired by `requests `_ but written on top of `Twisted `_'s `Agents `_. It provides a simple, higher level API for making HTTP requests when using Twisted. .. code-block:: python >>> import treq >>> def done(response): ... print(response.code) ... reactor.stop() >>> treq.get("https://github.com").addCallback(done) >>> from twisted.internet import reactor >>> reactor.run() 200 For more info `read the docs `_. Contributing ------------ ``treq`` development is hosted on `GitHub `_. We welcome contributions: feel free to fork and send contributions over. See `CONTRIBUTING.rst `_ for more info. Code of Conduct --------------- Refer to the `Twisted code of conduct `_.
Copyright and License --------------------- ``treq`` is made available under the MIT license. See `LICENSE <./LICENSE>`_ for legal details and copyright notices. .. _pypi: https://pypi.org/project/treq/ .. |pypi| image:: https://img.shields.io/pypi/v/treq.svg :alt: PyPI .. _calver: https://calver.org/ .. |calver| image:: https://img.shields.io/badge/calver-YY.MM.MICRO-22bfda.svg :alt: calver: YY.MM.MICRO .. _coverage: https://coveralls.io/github/twisted/treq .. |coverage| image:: https://coveralls.io/repos/github/twisted/treq/badge.svg :alt: Coverage .. _documentation: https://treq.readthedocs.org .. |documentation| image:: https://readthedocs.org/projects/treq/badge/ :alt: Documentation ././@PaxHeader0000000000000000000000000000002600000000000011453 xustar000000000000000022 mtime=1642980426.0 treq-22.2.0/SECURITY.md0000664000175000017500000000015600000000000014006 0ustar00twmtwm00000000000000# Security Policy ## Reporting a Vulnerability Please report security issues to `security@twistedmatrix.com`././@PaxHeader0000000000000000000000000000003400000000000011452 xustar000000000000000028 mtime=1644391241.5717504 treq-22.2.0/docs/0000775000175000017500000000000000000000000013143 5ustar00twmtwm00000000000000././@PaxHeader0000000000000000000000000000002600000000000011453 xustar000000000000000022 mtime=1474002258.0 treq-22.2.0/docs/Makefile0000664000175000017500000001266400000000000014614 0ustar00twmtwm00000000000000# Makefile for Sphinx documentation # # You can set these variables from the command line. SPHINXOPTS = SPHINXBUILD = sphinx-build PAPER = BUILDDIR = _build # Internal variables. PAPEROPT_a4 = -D latex_paper_size=a4 PAPEROPT_letter = -D latex_paper_size=letter ALLSPHINXOPTS = -d $(BUILDDIR)/doctrees $(PAPEROPT_$(PAPER)) $(SPHINXOPTS) . # the i18n builder cannot share the environment and doctrees with the others I18NSPHINXOPTS = $(PAPEROPT_$(PAPER)) $(SPHINXOPTS) . 
.PHONY: help clean html dirhtml singlehtml pickle json htmlhelp qthelp devhelp epub latex latexpdf text man changes linkcheck doctest gettext help: @echo "Please use \`make ' where is one of" @echo " html to make standalone HTML files" @echo " dirhtml to make HTML files named index.html in directories" @echo " singlehtml to make a single large HTML file" @echo " pickle to make pickle files" @echo " json to make JSON files" @echo " htmlhelp to make HTML files and a HTML help project" @echo " qthelp to make HTML files and a qthelp project" @echo " devhelp to make HTML files and a Devhelp project" @echo " epub to make an epub" @echo " latex to make LaTeX files, you can set PAPER=a4 or PAPER=letter" @echo " latexpdf to make LaTeX files and run them through pdflatex" @echo " text to make text files" @echo " man to make manual pages" @echo " texinfo to make Texinfo files" @echo " info to make Texinfo files and run them through makeinfo" @echo " gettext to make PO message catalogs" @echo " changes to make an overview of all changed/added/deprecated items" @echo " linkcheck to check all external links for integrity" @echo " doctest to run all doctests embedded in the documentation (if enabled)" clean: -rm -rf $(BUILDDIR)/* html: $(SPHINXBUILD) -b html $(ALLSPHINXOPTS) $(BUILDDIR)/html @echo @echo "Build finished. The HTML pages are in $(BUILDDIR)/html." dirhtml: $(SPHINXBUILD) -b dirhtml $(ALLSPHINXOPTS) $(BUILDDIR)/dirhtml @echo @echo "Build finished. The HTML pages are in $(BUILDDIR)/dirhtml." singlehtml: $(SPHINXBUILD) -b singlehtml $(ALLSPHINXOPTS) $(BUILDDIR)/singlehtml @echo @echo "Build finished. The HTML page is in $(BUILDDIR)/singlehtml." pickle: $(SPHINXBUILD) -b pickle $(ALLSPHINXOPTS) $(BUILDDIR)/pickle @echo @echo "Build finished; now you can process the pickle files." json: $(SPHINXBUILD) -b json $(ALLSPHINXOPTS) $(BUILDDIR)/json @echo @echo "Build finished; now you can process the JSON files." 
htmlhelp: $(SPHINXBUILD) -b htmlhelp $(ALLSPHINXOPTS) $(BUILDDIR)/htmlhelp @echo @echo "Build finished; now you can run HTML Help Workshop with the" \ ".hhp project file in $(BUILDDIR)/htmlhelp." qthelp: $(SPHINXBUILD) -b qthelp $(ALLSPHINXOPTS) $(BUILDDIR)/qthelp @echo @echo "Build finished; now you can run "qcollectiongenerator" with the" \ ".qhcp project file in $(BUILDDIR)/qthelp, like this:" @echo "# qcollectiongenerator $(BUILDDIR)/qthelp/treq.qhcp" @echo "To view the help file:" @echo "# assistant -collectionFile $(BUILDDIR)/qthelp/treq.qhc" devhelp: $(SPHINXBUILD) -b devhelp $(ALLSPHINXOPTS) $(BUILDDIR)/devhelp @echo @echo "Build finished." @echo "To view the help file:" @echo "# mkdir -p $$HOME/.local/share/devhelp/treq" @echo "# ln -s $(BUILDDIR)/devhelp $$HOME/.local/share/devhelp/treq" @echo "# devhelp" epub: $(SPHINXBUILD) -b epub $(ALLSPHINXOPTS) $(BUILDDIR)/epub @echo @echo "Build finished. The epub file is in $(BUILDDIR)/epub." latex: $(SPHINXBUILD) -b latex $(ALLSPHINXOPTS) $(BUILDDIR)/latex @echo @echo "Build finished; the LaTeX files are in $(BUILDDIR)/latex." @echo "Run \`make' in that directory to run these through (pdf)latex" \ "(use \`make latexpdf' here to do that automatically)." latexpdf: $(SPHINXBUILD) -b latex $(ALLSPHINXOPTS) $(BUILDDIR)/latex @echo "Running LaTeX files through pdflatex..." $(MAKE) -C $(BUILDDIR)/latex all-pdf @echo "pdflatex finished; the PDF files are in $(BUILDDIR)/latex." text: $(SPHINXBUILD) -b text $(ALLSPHINXOPTS) $(BUILDDIR)/text @echo @echo "Build finished. The text files are in $(BUILDDIR)/text." man: $(SPHINXBUILD) -b man $(ALLSPHINXOPTS) $(BUILDDIR)/man @echo @echo "Build finished. The manual pages are in $(BUILDDIR)/man." texinfo: $(SPHINXBUILD) -b texinfo $(ALLSPHINXOPTS) $(BUILDDIR)/texinfo @echo @echo "Build finished. The Texinfo files are in $(BUILDDIR)/texinfo." @echo "Run \`make' in that directory to run these through makeinfo" \ "(use \`make info' here to do that automatically)." 
info: $(SPHINXBUILD) -b texinfo $(ALLSPHINXOPTS) $(BUILDDIR)/texinfo @echo "Running Texinfo files through makeinfo..." make -C $(BUILDDIR)/texinfo info @echo "makeinfo finished; the Info files are in $(BUILDDIR)/texinfo." gettext: $(SPHINXBUILD) -b gettext $(I18NSPHINXOPTS) $(BUILDDIR)/locale @echo @echo "Build finished. The message catalogs are in $(BUILDDIR)/locale." changes: $(SPHINXBUILD) -b changes $(ALLSPHINXOPTS) $(BUILDDIR)/changes @echo @echo "The overview file is in $(BUILDDIR)/changes." linkcheck: $(SPHINXBUILD) -b linkcheck $(ALLSPHINXOPTS) $(BUILDDIR)/linkcheck @echo @echo "Link check complete; look for any errors in the above output " \ "or in $(BUILDDIR)/linkcheck/output.txt." doctest: $(SPHINXBUILD) -b doctest $(ALLSPHINXOPTS) $(BUILDDIR)/doctest @echo "Testing of doctests in the sources finished, look at the " \ "results in $(BUILDDIR)/doctest/output.txt." ././@PaxHeader0000000000000000000000000000003400000000000011452 xustar000000000000000028 mtime=1644391241.5717504 treq-22.2.0/docs/_static/0000775000175000017500000000000000000000000014571 5ustar00twmtwm00000000000000././@PaxHeader0000000000000000000000000000002600000000000011453 xustar000000000000000022 mtime=1474002258.0 treq-22.2.0/docs/_static/.keepme0000664000175000017500000000000000000000000016026 0ustar00twmtwm00000000000000././@PaxHeader0000000000000000000000000000002600000000000011453 xustar000000000000000022 mtime=1611559119.0 treq-22.2.0/docs/api.rst0000664000175000017500000001101700000000000014446 0ustar00twmtwm00000000000000API Reference ============= This page lists all of the interfaces exposed by the `treq` package. Making Requests --------------- The :py:mod:`treq` module provides several convenience functions for making requests. These functions all create a default :py:class:`treq.client.HTTPClient` instance and pass their arguments to the appropriate :py:class:`~treq.client.HTTPClient` method. .. module:: treq .. autofunction:: request .. autofunction:: get .. 
autofunction:: head .. autofunction:: post .. autofunction:: put .. autofunction:: patch .. autofunction:: delete Accessing Content ----------------- .. autofunction:: collect .. autofunction:: content .. autofunction:: text_content .. autofunction:: json_content The HTTP Client =============== .. module:: treq.client :class:`treq.client.HTTPClient` has methods that match the signatures of the convenience request functions in the :mod:`treq` module. .. autoclass:: HTTPClient(agent, cookiejar=None, data_to_body_producer=IBodyProducer) .. automethod:: request .. automethod:: get .. automethod:: head .. automethod:: post .. automethod:: put .. automethod:: patch .. automethod:: delete Augmented Response Objects -------------------------- :func:`treq.request`, :func:`treq.get`, etc. return an object which provides :class:`twisted.web.iweb.IResponse`, plus a few additional convenience methods: .. module:: treq.response .. class:: _Response .. automethod:: collect .. automethod:: content .. automethod:: json .. automethod:: text .. automethod:: history .. automethod:: cookies Inherited from :class:`twisted.web.iweb.IResponse`: :ivar version: See :attr:`IResponse.version ` :ivar code: See :attr:`IResponse.code ` :ivar phrase: See :attr:`IResponse.phrase ` :ivar headers: See :attr:`IResponse.headers ` :ivar length: See :attr:`IResponse.length ` :ivar request: See :attr:`IResponse.request ` :ivar previousResponse: See :attr:`IResponse.previousResponse ` .. method:: deliverBody(protocol) See :meth:`IResponse.deliverBody() ` .. method:: setPreviousResponse(response) See :meth:`IResponse.setPreviousResponse() ` Authentication -------------- .. module:: treq.auth .. autofunction:: add_auth .. autofunction:: add_basic_auth .. autoexception:: UnknownAuthConfig Test Helpers ------------ .. module:: treq.testing The :mod:`treq.testing` module contains tools for in-memory testing of HTTP clients and servers. StubTreq Objects ~~~~~~~~~~~~~~~~ .. 
class:: treq.testing.StubTreq(resource) :class:`StubTreq` implements the same interface as the :mod:`treq` module or the :class:`~treq.client.HTTPClient` class, with the limitation that it does not support the ``files`` argument. .. method:: flush() Flush all data between pending client/server pairs. This is only necessary if a :obj:`Resource` under test returns :obj:`NOT_DONE_YET` from its ``render`` method, making a response asynchronous. In that case, after each write from the server, :meth:`flush()` must be called so the client can see it. As the methods on :class:`treq.client.HTTPClient`: .. method:: request See :func:`treq.request()`. .. method:: get See :func:`treq.get()`. .. method:: head See :func:`treq.head()`. .. method:: post See :func:`treq.post()`. .. method:: put See :func:`treq.put()`. .. method:: patch See :func:`treq.patch()`. .. method:: delete See :func:`treq.delete()`. RequestTraversalAgent Objects ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ .. autoclass:: treq.testing.RequestTraversalAgent :members: RequestSequence Objects ~~~~~~~~~~~~~~~~~~~~~~~ .. autoclass:: treq.testing.RequestSequence :members: StringStubbingResource Objects ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ .. autoclass:: treq.testing.StringStubbingResource :members: HasHeaders Objects ~~~~~~~~~~~~~~~~~~ .. autoclass:: treq.testing.HasHeaders :members: MultiPartProducer Objects ------------------------- :class:`treq.multipart.MultiPartProducer` is used internally when making requests which involve files. .. automodule:: treq.multipart :members: ././@PaxHeader0000000000000000000000000000002600000000000011453 xustar000000000000000022 mtime=1585205055.0 treq-22.2.0/docs/changelog.rst0000644000175000017500000000003600000000000015621 0ustar00twmtwm00000000000000.. 
include:: ../CHANGELOG.rst ././@PaxHeader0000000000000000000000000000002600000000000011453 xustar000000000000000022 mtime=1594785923.0 treq-22.2.0/docs/conf.py0000644000175000017500000001754100000000000014450 0ustar00twmtwm00000000000000# -*- coding: utf-8 -*- # # treq documentation build configuration file, created by # sphinx-quickstart on Mon Dec 10 22:32:11 2012. # # This file is execfile()d with the current directory set to its containing dir. # # Note that not all possible configuration values are present in this # autogenerated file. # # All configuration values have a default; values that are commented out # serve to show the default. import sys, os # If extensions (or modules to document with autodoc) are in another directory, # add these directories to sys.path here. If the directory is relative to the # documentation root, use os.path.abspath to make it absolute, like shown here. sys.path.insert(0, os.path.abspath('..')) # -- General configuration ----------------------------------------------------- # If your documentation needs a minimal Sphinx version, state it here. #needs_sphinx = '1.0' # Add any Sphinx extension module names here, as strings. They can be extensions # coming with Sphinx (named 'sphinx.ext.*') or your custom ones. extensions = ['sphinx.ext.viewcode', 'sphinx.ext.autodoc', 'sphinx.ext.intersphinx'] # Add any paths that contain templates here, relative to this directory. templates_path = ['_templates'] # The suffix of source filenames. source_suffix = '.rst' # The encoding of source files. #source_encoding = 'utf-8-sig' # The master toctree document. master_doc = 'index' # General information about the project. project = u'treq' copyright = u'2014–2020 David Reid' # The version info for the project you're documenting, acts as replacement for # |version| and |release|, also used in various other places throughout the # built documents. # # The full version, including alpha/beta/rc tags. 
from treq import __version__ as release version = '.'.join(release.split('.')[:2]) # The language for content autogenerated by Sphinx. Refer to documentation # for a list of supported languages. #language = None # There are two options for replacing |today|: either, you set today to some # non-false value, then it is used: #today = '' # Else, today_fmt is used as the format for a strftime call. #today_fmt = '%B %d, %Y' # List of patterns, relative to source directory, that match files and # directories to ignore when looking for source files. exclude_patterns = ['_build'] # The reST default role (used for this markup: `text`) to use for all documents. #default_role = None # If true, '()' will be appended to :func: etc. cross-reference text. #add_function_parentheses = True # If true, the current module name will be prepended to all description # unit titles (such as .. function::). #add_module_names = True # If true, sectionauthor and moduleauthor directives will be shown in the # output. They are ignored by default. #show_authors = False # The name of the Pygments (syntax highlighting) style to use. pygments_style = 'sphinx' # A list of ignored prefixes for module index sorting. #modindex_common_prefix = [] # -- Options for HTML output --------------------------------------------------- # The theme to use for HTML and HTML Help pages. See the documentation for # a list of builtin themes. html_theme = 'default' # Theme options are theme-specific and customize the look and feel of a theme # further. For a list of options available for each theme, see the # documentation. #html_theme_options = {} # Add any paths that contain custom themes here, relative to this directory. #html_theme_path = [] # The name for this set of Sphinx documents. If None, it defaults to # " v documentation". #html_title = None # A shorter title for the navigation bar. Default is the same as html_title. 
#html_short_title = None # The name of an image file (relative to this directory) to place at the top # of the sidebar. #html_logo = None # The name of an image file (within the static path) to use as favicon of the # docs. This file should be a Windows icon file (.ico) being 16x16 or 32x32 # pixels large. #html_favicon = None # Add any paths that contain custom static files (such as style sheets) here, # relative to this directory. They are copied after the builtin static files, # so a file named "default.css" will overwrite the builtin "default.css". html_static_path = ['_static'] # If not '', a 'Last updated on:' timestamp is inserted at every page bottom, # using the given strftime format. #html_last_updated_fmt = '%b %d, %Y' # If true, SmartyPants will be used to convert quotes and dashes to # typographically correct entities. #html_use_smartypants = True # Custom sidebar templates, maps document names to template names. #html_sidebars = {} # Additional templates that should be rendered to pages, maps page names to # template names. #html_additional_pages = {} # If false, no module index is generated. #html_domain_indices = True # If false, no index is generated. #html_use_index = True # If true, the index is split into individual pages for each letter. #html_split_index = False # If true, links to the reST sources are added to the pages. #html_show_sourcelink = True # If true, "Created using Sphinx" is shown in the HTML footer. Default is True. #html_show_sphinx = True # If true, "(C) Copyright ..." is shown in the HTML footer. Default is True. #html_show_copyright = True # If true, an OpenSearch description file will be output, and all pages will # contain a tag referring to it. The value of this option must be the # base URL from which the finished HTML is served. #html_use_opensearch = '' # This is the file name suffix for HTML files (e.g. ".xhtml"). #html_file_suffix = None # Output file base name for HTML help builder. 
htmlhelp_basename = 'treqdoc' # -- Options for LaTeX output -------------------------------------------------- latex_elements = { # The paper size ('letterpaper' or 'a4paper'). #'papersize': 'letterpaper', # The font size ('10pt', '11pt' or '12pt'). #'pointsize': '10pt', # Additional stuff for the LaTeX preamble. #'preamble': '', } # Grouping the document tree into LaTeX files. List of tuples # (source start file, target name, title, author, documentclass [howto/manual]). latex_documents = [ ('index', 'treq.tex', u'treq Documentation', u'David Reid', 'manual'), ] # The name of an image file (relative to this directory) to place at the top of # the title page. #latex_logo = None # For "manual" documents, if this is true, then toplevel headings are parts, # not chapters. #latex_use_parts = False # If true, show page references after internal links. #latex_show_pagerefs = False # If true, show URL addresses after external links. #latex_show_urls = False # Documents to append as an appendix to all manuals. #latex_appendices = [] # If false, no module index is generated. #latex_domain_indices = True # -- Options for manual page output -------------------------------------------- # One entry per manual page. List of tuples # (source start file, name, description, authors, manual section). man_pages = [ ('index', 'treq', u'treq Documentation', [u'David Reid'], 1) ] # If true, show URL addresses after external links. #man_show_urls = False # -- Options for Texinfo output ------------------------------------------------ # Grouping the document tree into Texinfo files. List of tuples # (source start file, target name, title, author, # dir menu entry, description, category) texinfo_documents = [ ('index', 'treq', u'treq Documentation', u'David Reid', 'treq', 'One line description of project.', 'Miscellaneous'), ] # Documents to append as an appendix to all manuals. #texinfo_appendices = [] # If false, no module index is generated. 
#texinfo_domain_indices = True # How to display URL addresses: 'footnote', 'no', or 'inline'. #texinfo_show_urls = 'footnote' RTD_NEW_THEME = True intersphinx_mapping = { 'python': ('https://docs.python.org/3/', None), 'twisted': ('https://twistedmatrix.com/documents/current/api/', None), 'hyperlink': ('https://hyperlink.readthedocs.io/en/latest/', None), } ././@PaxHeader0000000000000000000000000000003400000000000011452 xustar000000000000000028 mtime=1644391241.5717504 treq-22.2.0/docs/examples/0000775000175000017500000000000000000000000014761 5ustar00twmtwm00000000000000././@PaxHeader0000000000000000000000000000002600000000000011453 xustar000000000000000022 mtime=1584938368.0 treq-22.2.0/docs/examples/_utils.py0000644000175000017500000000032400000000000016627 0ustar00twmtwm00000000000000from __future__ import print_function import treq def print_response(response): print(response.code, response.phrase) print(response.headers) return treq.text_content(response).addCallback(print) ././@PaxHeader0000000000000000000000000000002600000000000011453 xustar000000000000000022 mtime=1585205055.0 treq-22.2.0/docs/examples/basic_auth.py0000644000175000017500000000043500000000000017435 0ustar00twmtwm00000000000000from twisted.internet.task import react from _utils import print_response import treq def main(reactor, *args): d = treq.get( 'https://httpbin.org/basic-auth/treq/treq', auth=('treq', 'treq') ) d.addCallback(print_response) return d react(main, []) ././@PaxHeader0000000000000000000000000000002600000000000011453 xustar000000000000000022 mtime=1585205055.0 treq-22.2.0/docs/examples/basic_get.py0000644000175000017500000000033700000000000017254 0ustar00twmtwm00000000000000from twisted.internet.task import react from _utils import print_response import treq def main(reactor, *args): d = treq.get('https://httpbin.org/get') d.addCallback(print_response) return d react(main, []) ././@PaxHeader0000000000000000000000000000002600000000000011453 xustar000000000000000022 
mtime=1601233447.0 treq-22.2.0/docs/examples/basic_post.py0000644000175000017500000000040300000000000017454 0ustar00twmtwm00000000000000from twisted.internet.task import react from _utils import print_response import treq def main(reactor): d = treq.post("https://httpbin.org/post", data={"form": "data"}) d.addCallback(print_response) return d react(main, []) ././@PaxHeader0000000000000000000000000000002600000000000011453 xustar000000000000000022 mtime=1594785923.0 treq-22.2.0/docs/examples/basic_url.py0000644000175000017500000000065100000000000017276 0ustar00twmtwm00000000000000# -*- encoding: utf-8 -*- from hyperlink import DecodedURL from twisted.internet.task import react from _utils import print_response import treq def main(reactor): url = ( DecodedURL.from_text(u"https://httpbin.org") .child(u"get") # add path /get .add(u"foo", u"&") # add query ?foo=%26 ) print(url.to_text()) return treq.get(url).addCallback(print_response) react(main, []) ././@PaxHeader0000000000000000000000000000002600000000000011453 xustar000000000000000022 mtime=1585426944.0 treq-22.2.0/docs/examples/custom_agent.py0000644000175000017500000000077400000000000020031 0ustar00twmtwm00000000000000from treq.client import HTTPClient from _utils import print_response from twisted.internet.task import react from twisted.web.client import Agent def make_custom_agent(reactor): return Agent(reactor, connectTimeout=42) def main(reactor, *args): agent = make_custom_agent(reactor) http_client = HTTPClient(agent) d = http_client.get( 'https://secure.example.net/area51', auth=('admin', "you'll never guess!")) d.addCallback(print_response) return d react(main, []) ././@PaxHeader0000000000000000000000000000002600000000000011453 xustar000000000000000022 mtime=1585205055.0 treq-22.2.0/docs/examples/disable_redirects.py0000644000175000017500000000037500000000000021005 0ustar00twmtwm00000000000000from twisted.internet.task import react from _utils import print_response import treq def main(reactor, *args): d = 
treq.get('https://httpbin.org/redirect/1', allow_redirects=False) d.addCallback(print_response) return d react(main, []) ././@PaxHeader0000000000000000000000000000002600000000000011453 xustar000000000000000022 mtime=1585205055.0 treq-22.2.0/docs/examples/download_file.py0000644000175000017500000000057000000000000020141 0ustar00twmtwm00000000000000from twisted.internet.task import react import treq def download_file(reactor, url, destination_filename): destination = open(destination_filename, 'wb') d = treq.get(url, unbuffered=True) d.addCallback(treq.collect, destination.write) d.addBoth(lambda _: destination.close()) return d react(download_file, ['https://httpbin.org/get', 'download.txt']) ././@PaxHeader0000000000000000000000000000002600000000000011453 xustar000000000000000022 mtime=1584938368.0 treq-22.2.0/docs/examples/iresource.py0000644000175000017500000000065000000000000017332 0ustar00twmtwm00000000000000import json from zope.interface import implementer from twisted.web.resource import IResource @implementer(IResource) class JsonResource(object): isLeaf = True # NB: means getChildWithDefault will not be called def __init__(self, data): self.data = data def render(self, request): request.setHeader(b'Content-Type', b'application/json') return json.dumps(self.data).encode('utf-8') ././@PaxHeader0000000000000000000000000000002600000000000011453 xustar000000000000000022 mtime=1601233447.0 treq-22.2.0/docs/examples/json_post.py0000644000175000017500000000051400000000000017347 0ustar00twmtwm00000000000000from pprint import pprint from twisted.internet import defer from twisted.internet.task import react import treq @defer.inlineCallbacks def main(reactor): response = yield treq.post( 'https://httpbin.org/post', json={"msg": "Hello!"}, ) data = yield response.json() pprint(data) react(main, []) ././@PaxHeader0000000000000000000000000000002600000000000011453 xustar000000000000000022 mtime=1610695252.0 
treq-22.2.0/docs/examples/query_params.py0000664000175000017500000000226000000000000020043 0ustar00twmtwm00000000000000from twisted.internet.task import react from twisted.internet.defer import inlineCallbacks import treq @inlineCallbacks def main(reactor): print('List of tuples') resp = yield treq.get('https://httpbin.org/get', params=[('foo', 'bar'), ('baz', 'bax')]) content = yield resp.text() print(content) print('Single value dictionary') resp = yield treq.get('https://httpbin.org/get', params={'foo': 'bar', 'baz': 'bax'}) content = yield resp.text() print(content) print('Multi value dictionary') resp = yield treq.get('https://httpbin.org/get', params={b'foo': [b'bar', b'baz', b'bax']}) content = yield resp.text() print(content) print('Mixed value dictionary') resp = yield treq.get('https://httpbin.org/get', params={'foo': [1, 2, 3], 'bax': b'quux', b'bar': 'foo'}) content = yield resp.text() print(content) print('Preserved query parameters') resp = yield treq.get('https://httpbin.org/get?foo=bar', params={'baz': 'bax'}) content = yield resp.text() print(content) react(main, []) ././@PaxHeader0000000000000000000000000000002600000000000011453 xustar000000000000000022 mtime=1585205055.0 treq-22.2.0/docs/examples/redirects.py0000644000175000017500000000034600000000000017320 0ustar00twmtwm00000000000000from twisted.internet.task import react from _utils import print_response import treq def main(reactor, *args): d = treq.get('https://httpbin.org/redirect/1') d.addCallback(print_response) return d react(main, []) ././@PaxHeader0000000000000000000000000000002600000000000011453 xustar000000000000000022 mtime=1585205055.0 treq-22.2.0/docs/examples/response_history.py0000644000175000017500000000053700000000000020755 0ustar00twmtwm00000000000000from twisted.internet.task import react from _utils import print_response import treq def main(reactor, *args): d = treq.get('https://httpbin.org/redirect/1') def cb(response): print('Response history:') print(response.history()) 
return print_response(response) d.addCallback(cb) return d react(main, []) ././@PaxHeader0000000000000000000000000000002600000000000011453 xustar000000000000000022 mtime=1584938368.0 treq-22.2.0/docs/examples/testing_seq.py0000644000175000017500000000377400000000000017671 0ustar00twmtwm00000000000000from twisted.internet import defer from twisted.trial.unittest import SynchronousTestCase from twisted.web import http from treq.testing import StubTreq, HasHeaders from treq.testing import RequestSequence, StringStubbingResource @defer.inlineCallbacks def make_a_request(treq): """ Make a request using treq. """ response = yield treq.get('http://an.example/foo', params={'a': 'b'}, headers={b'Accept': b'application/json'}) if response.code == http.OK: result = yield response.json() else: message = yield response.text() raise Exception("Got an error from the server: {}".format(message)) defer.returnValue(result) class MakeARequestTests(SynchronousTestCase): """ Test :func:`make_a_request()` using :mod:`treq.testing.RequestSequence`. 
""" def test_200_ok(self): """On a 200 response, return the response's JSON.""" req_seq = RequestSequence([ ((b'get', 'http://an.example/foo', {b'a': [b'b']}, HasHeaders({'Accept': ['application/json']}), b''), (http.OK, {b'Content-Type': b'application/json'}, b'{"status": "ok"}')) ]) treq = StubTreq(StringStubbingResource(req_seq)) with req_seq.consume(self.fail): result = self.successResultOf(make_a_request(treq)) self.assertEqual({"status": "ok"}, result) def test_418_teapot(self): """On an unexpected response code, raise an exception""" req_seq = RequestSequence([ ((b'get', 'http://an.example/foo', {b'a': [b'b']}, HasHeaders({'Accept': ['application/json']}), b''), (418, {b'Content-Type': b'text/plain'}, b"I'm a teapot!")) ]) treq = StubTreq(StringStubbingResource(req_seq)) with req_seq.consume(self.fail): failure = self.failureResultOf(make_a_request(treq)) self.assertEqual(u"Got an error from the server: I'm a teapot!", failure.getErrorMessage()) ././@PaxHeader0000000000000000000000000000002600000000000011453 xustar000000000000000022 mtime=1585205055.0 treq-22.2.0/docs/examples/using_cookies.py0000644000175000017500000000073200000000000020174 0ustar00twmtwm00000000000000from twisted.internet.task import react from _utils import print_response import treq def main(reactor, *args): d = treq.get('https://httpbin.org/cookies/set?hello=world') def _get_jar(resp): jar = resp.cookies() print('The server set our hello cookie to: {}'.format(jar['hello'])) return treq.get('https://httpbin.org/cookies', cookies=jar) d.addCallback(_get_jar) d.addCallback(print_response) return d react(main, []) ././@PaxHeader0000000000000000000000000000002600000000000011453 xustar000000000000000022 mtime=1640750759.0 treq-22.2.0/docs/howto.rst0000664000175000017500000001305500000000000015041 0ustar00twmtwm00000000000000Use Cases ========= Handling Streaming Responses ---------------------------- In addition to `receiving responses `_ with :meth:`IResponse.deliverBody`, treq provides a 
helper function :py:func:`treq.collect` which takes a ``response`` and a
single argument function which will be called with all new data available from
the response.

Much like :meth:`IProtocol.dataReceived`, :py:func:`treq.collect` knows
nothing about the framing of your data and will simply call your collector
function with any data that is currently available.

Here is an example which simply passes a file object's write method to
:py:func:`treq.collect` to save the response body to a file.

.. literalinclude:: examples/download_file.py
    :linenos:
    :lines: 6-11

Full example: :download:`download_file.py `

URLs, URIs, and Hyperlinks
--------------------------

The *url* argument to :py:meth:`HTTPClient.request` accepts three URL
representations:

- High-level: :class:`hyperlink.DecodedURL`
- Mid-level: :class:`str`
- Low-level: ASCII :class:`bytes` or :class:`hyperlink.URL`

The high-level :class:`~hyperlink.DecodedURL` form is useful when
programmatically generating URLs. Here is an example that builds a URL that
contains a ``&`` character, which is automatically escaped properly.

.. literalinclude:: examples/basic_url.py
    :linenos:
    :pyobject: main

Full example: :download:`basic_url.py `

Query Parameters
----------------

:py:func:`treq.HTTPClient.request` supports a ``params`` keyword argument
which will be URL-encoded and added to the ``url`` argument in addition to any
query parameters that may already exist.

The ``params`` argument may be either a ``dict`` or a ``list`` of
``(key, value)`` tuples.

If it is a ``dict`` then the values in the dict may either be scalar values or
a ``list`` or ``tuple`` thereof. A scalar value is a ``str``, ``bytes``, or
anything else (even ``None``), which will be coerced to ``str``. Strings are
UTF-8 encoded.

.. literalinclude:: examples/query_params.py
    :linenos:
    :lines: 7-37

Full example: :download:`query_params.py `

If you prefer a strictly-typed API, try :class:`hyperlink.DecodedURL`.
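For comparison, the standard library applies the same escaping and flattening rules described above. This is only a rough illustration of the encoding behavior, not how treq or hyperlink implement it:

```python
from urllib.parse import quote, urlencode

# A "&" used as a query *value* must be percent-encoded so it is not
# mistaken for a parameter separator:
print(quote("&", safe=""))  # %26

# List values flatten into repeated keys, mirroring the coercion rules
# described above (scalars are stringified, lists become repeated pairs):
print(urlencode({"foo": ["bar", "baz"], "baz": "bax"}, doseq=True))
# foo=bar&foo=baz&baz=bax
```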
Use its :meth:`~hyperlink.URL.add` and :meth:`~hyperlink.URL.set` methods to add query parameters without risk of accidental type coercion. JSON ---- :meth:`HTTPClient.request() ` supports a *json* keyword argument that gives a data structure to serialize as JSON (using :func:`json.dumps()`). This also implies a ``Content-Type: application/json`` request header. The *json* parameter is mutually-exclusive with *data*. The :meth:`_Response.json()` method decodes a JSON response body. It buffers the whole response and decodes it with :func:`json.loads()`. .. literalinclude:: examples/json_post.py :linenos: :pyobject: main Full example: :download:`json_post.py ` Auth ---- HTTP Basic authentication as specified in :rfc:`2617` is easily supported by passing an ``auth`` keyword argument to any of the request functions. The ``auth`` argument should be a tuple of the form ``('username', 'password')``. .. literalinclude:: examples/basic_auth.py :linenos: :lines: 7-15 Full example: :download:`basic_auth.py ` Redirects --------- treq handles redirects by default. The following will print a 200 OK response. .. literalinclude:: examples/redirects.py :linenos: :lines: 7-12 Full example: :download:`redirects.py ` You can easily disable redirects by simply passing `allow_redirects=False` to any of the request methods. .. literalinclude:: examples/disable_redirects.py :linenos: :lines: 7-12 Full example: :download:`disable_redirects.py ` You can even access the complete history of treq response objects by calling the :meth:`~treq.response._Response.history()` method on the response. .. literalinclude:: examples/response_history.py :linenos: :lines: 7-15 Full example: :download:`response_history.py ` Cookies ------- Cookies can be set by passing a ``dict`` or ``cookielib.CookieJar`` instance via the ``cookies`` keyword argument. Later cookies set by the server can be retrieved using the :py:meth:`~treq.response._Response.cookies()` method of the response. 
The object returned by :py:meth:`~treq.response._Response.cookies()` supports
the same key/value access as `requests cookies `_.

.. literalinclude:: examples/using_cookies.py
    :linenos:
    :lines: 7-20

Full example: :download:`using_cookies.py `

Customizing the Twisted Agent
-----------------------------

The main :py:mod:`treq` module has helper functions that automatically
instantiate an instance of :py:class:`treq.client.HTTPClient`. You can create
an instance of :py:class:`~treq.client.HTTPClient` directly in order to
customize the parameters used to initialize it.

Internally, the :py:class:`~treq.client.HTTPClient` wraps an instance of
:py:class:`twisted.web.client.Agent`. When you create an instance of
:py:class:`~treq.client.HTTPClient`, you must initialize it with an instance
of :py:class:`~twisted.web.client.Agent`. This allows you to customize its
behavior.

.. literalinclude:: examples/custom_agent.py
    :linenos:
    :lines: 6-19

Full example: :download:`custom_agent.py `

treq-22.2.0/docs/index.rst

treq: High-level Twisted HTTP Client API
========================================

Release v\ |release| (:doc:`What's new? `).

`treq `_ depends on a recent Twisted and functions on Python 3.6+ (including
PyPy 3).

Why?
----

`requests`_ by Kenneth Reitz is a wonderful library. I want the same ease of
use when writing Twisted applications. treq is, of course, not a perfect
clone of `requests`_. I have tried to stay true to the do-what-I-mean spirit
of the `requests`_ API and also kept the API familiar to users of `Twisted`_
and :class:`twisted.web.client.Agent` on which treq is based.

.. _requests: https://requests.readthedocs.io/en/master/
.. _Twisted: https://twistedmatrix.com/

Quick Start
-----------

Installation

.. code-block:: console

    $ pip install treq

GET
+++

..
literalinclude:: examples/basic_get.py :pyobject: main Full example: :download:`basic_get.py ` POST ++++ .. literalinclude:: examples/basic_post.py :pyobject: main Full example: :download:`basic_post.py ` Why not 100% requests-alike? ---------------------------- Initially when I started off working on treq I thought the API should look exactly like `requests`_ except anything that would involve the network would return a :class:`~twisted.internet.defer.Deferred`. Over time while attempting to mimic the `requests`_ API it became clear that not enough code could be shared between `requests`_ and treq for it to be worth the effort to translate many of the usage patterns from `requests`_. With the current version of treq I have tried to keep the API simple, yet remain familiar to users of Twisted and its lower-level HTTP libraries. Feature Parity with Requests ---------------------------- Even though mimicking the `requests`_ API is not a goal, supporting most of its features is. Here is a list of `requests`_ features and their status in treq. 
+----------------------------------+----------+----------+
|                                  | requests | treq     |
+----------------------------------+----------+----------+
| International Domains and URLs   | yes      | yes      |
+----------------------------------+----------+----------+
| Keep-Alive & Connection Pooling  | yes      | yes      |
+----------------------------------+----------+----------+
| Sessions with Cookie Persistence | yes      | yes      |
+----------------------------------+----------+----------+
| Browser-style SSL Verification   | yes      | yes      |
+----------------------------------+----------+----------+
| Basic Authentication             | yes      | yes      |
+----------------------------------+----------+----------+
| Digest Authentication            | yes      | no       |
+----------------------------------+----------+----------+
| Elegant Key/Value Cookies        | yes      | yes      |
+----------------------------------+----------+----------+
| Automatic Decompression          | yes      | yes      |
+----------------------------------+----------+----------+
| Unicode Response Bodies          | yes      | yes      |
+----------------------------------+----------+----------+
| Multipart File Uploads           | yes      | yes      |
+----------------------------------+----------+----------+
| Connection Timeouts              | yes      | yes      |
+----------------------------------+----------+----------+
| HTTP(S) Proxy Support            | yes      | no       |
+----------------------------------+----------+----------+
| .netrc support                   | yes      | no       |
+----------------------------------+----------+----------+
| Python 3.x                       | yes      | yes      |
+----------------------------------+----------+----------+

Table of Contents
-----------------

..
toctree:: :maxdepth: 3 howto testing api changelog Indices and tables ================== * :ref:`genindex` * :ref:`modindex` * :ref:`search` ././@PaxHeader0000000000000000000000000000002600000000000011453 xustar000000000000000022 mtime=1474002258.0 treq-22.2.0/docs/make.bat0000664000175000017500000001174400000000000014557 0ustar00twmtwm00000000000000@ECHO OFF REM Command file for Sphinx documentation if "%SPHINXBUILD%" == "" ( set SPHINXBUILD=sphinx-build ) set BUILDDIR=_build set ALLSPHINXOPTS=-d %BUILDDIR%/doctrees %SPHINXOPTS% . set I18NSPHINXOPTS=%SPHINXOPTS% . if NOT "%PAPER%" == "" ( set ALLSPHINXOPTS=-D latex_paper_size=%PAPER% %ALLSPHINXOPTS% set I18NSPHINXOPTS=-D latex_paper_size=%PAPER% %I18NSPHINXOPTS% ) if "%1" == "" goto help if "%1" == "help" ( :help echo.Please use `make ^` where ^ is one of echo. html to make standalone HTML files echo. dirhtml to make HTML files named index.html in directories echo. singlehtml to make a single large HTML file echo. pickle to make pickle files echo. json to make JSON files echo. htmlhelp to make HTML files and a HTML help project echo. qthelp to make HTML files and a qthelp project echo. devhelp to make HTML files and a Devhelp project echo. epub to make an epub echo. latex to make LaTeX files, you can set PAPER=a4 or PAPER=letter echo. text to make text files echo. man to make manual pages echo. texinfo to make Texinfo files echo. gettext to make PO message catalogs echo. changes to make an overview over all changed/added/deprecated items echo. linkcheck to check all external links for integrity echo. doctest to run all doctests embedded in the documentation if enabled goto end ) if "%1" == "clean" ( for /d %%i in (%BUILDDIR%\*) do rmdir /q /s %%i del /q /s %BUILDDIR%\* goto end ) if "%1" == "html" ( %SPHINXBUILD% -b html %ALLSPHINXOPTS% %BUILDDIR%/html if errorlevel 1 exit /b 1 echo. echo.Build finished. The HTML pages are in %BUILDDIR%/html. 
goto end ) if "%1" == "dirhtml" ( %SPHINXBUILD% -b dirhtml %ALLSPHINXOPTS% %BUILDDIR%/dirhtml if errorlevel 1 exit /b 1 echo. echo.Build finished. The HTML pages are in %BUILDDIR%/dirhtml. goto end ) if "%1" == "singlehtml" ( %SPHINXBUILD% -b singlehtml %ALLSPHINXOPTS% %BUILDDIR%/singlehtml if errorlevel 1 exit /b 1 echo. echo.Build finished. The HTML pages are in %BUILDDIR%/singlehtml. goto end ) if "%1" == "pickle" ( %SPHINXBUILD% -b pickle %ALLSPHINXOPTS% %BUILDDIR%/pickle if errorlevel 1 exit /b 1 echo. echo.Build finished; now you can process the pickle files. goto end ) if "%1" == "json" ( %SPHINXBUILD% -b json %ALLSPHINXOPTS% %BUILDDIR%/json if errorlevel 1 exit /b 1 echo. echo.Build finished; now you can process the JSON files. goto end ) if "%1" == "htmlhelp" ( %SPHINXBUILD% -b htmlhelp %ALLSPHINXOPTS% %BUILDDIR%/htmlhelp if errorlevel 1 exit /b 1 echo. echo.Build finished; now you can run HTML Help Workshop with the ^ .hhp project file in %BUILDDIR%/htmlhelp. goto end ) if "%1" == "qthelp" ( %SPHINXBUILD% -b qthelp %ALLSPHINXOPTS% %BUILDDIR%/qthelp if errorlevel 1 exit /b 1 echo. echo.Build finished; now you can run "qcollectiongenerator" with the ^ .qhcp project file in %BUILDDIR%/qthelp, like this: echo.^> qcollectiongenerator %BUILDDIR%\qthelp\treq.qhcp echo.To view the help file: echo.^> assistant -collectionFile %BUILDDIR%\qthelp\treq.ghc goto end ) if "%1" == "devhelp" ( %SPHINXBUILD% -b devhelp %ALLSPHINXOPTS% %BUILDDIR%/devhelp if errorlevel 1 exit /b 1 echo. echo.Build finished. goto end ) if "%1" == "epub" ( %SPHINXBUILD% -b epub %ALLSPHINXOPTS% %BUILDDIR%/epub if errorlevel 1 exit /b 1 echo. echo.Build finished. The epub file is in %BUILDDIR%/epub. goto end ) if "%1" == "latex" ( %SPHINXBUILD% -b latex %ALLSPHINXOPTS% %BUILDDIR%/latex if errorlevel 1 exit /b 1 echo. echo.Build finished; the LaTeX files are in %BUILDDIR%/latex. goto end ) if "%1" == "text" ( %SPHINXBUILD% -b text %ALLSPHINXOPTS% %BUILDDIR%/text if errorlevel 1 exit /b 1 echo. 
echo.Build finished. The text files are in %BUILDDIR%/text. goto end ) if "%1" == "man" ( %SPHINXBUILD% -b man %ALLSPHINXOPTS% %BUILDDIR%/man if errorlevel 1 exit /b 1 echo. echo.Build finished. The manual pages are in %BUILDDIR%/man. goto end ) if "%1" == "texinfo" ( %SPHINXBUILD% -b texinfo %ALLSPHINXOPTS% %BUILDDIR%/texinfo if errorlevel 1 exit /b 1 echo. echo.Build finished. The Texinfo files are in %BUILDDIR%/texinfo. goto end ) if "%1" == "gettext" ( %SPHINXBUILD% -b gettext %I18NSPHINXOPTS% %BUILDDIR%/locale if errorlevel 1 exit /b 1 echo. echo.Build finished. The message catalogs are in %BUILDDIR%/locale. goto end ) if "%1" == "changes" ( %SPHINXBUILD% -b changes %ALLSPHINXOPTS% %BUILDDIR%/changes if errorlevel 1 exit /b 1 echo. echo.The overview file is in %BUILDDIR%/changes. goto end ) if "%1" == "linkcheck" ( %SPHINXBUILD% -b linkcheck %ALLSPHINXOPTS% %BUILDDIR%/linkcheck if errorlevel 1 exit /b 1 echo. echo.Link check complete; look for any errors in the above output ^ or in %BUILDDIR%/linkcheck/output.txt. goto end ) if "%1" == "doctest" ( %SPHINXBUILD% -b doctest %ALLSPHINXOPTS% %BUILDDIR%/doctest if errorlevel 1 exit /b 1 echo. echo.Testing of doctests in the sources finished, look at the ^ results in %BUILDDIR%/doctest/output.txt. goto end ) :end ././@PaxHeader0000000000000000000000000000002600000000000011453 xustar000000000000000022 mtime=1621403682.0 treq-22.2.0/docs/testing.rst0000664000175000017500000000641400000000000015357 0ustar00twmtwm00000000000000Testing Helpers =============== The :mod:`treq.testing` module provides some tools for testing both HTTP clients which use the treq API and implementations of the `Twisted Web resource model `_. Writing tests for HTTP clients ------------------------------ The :class:`~treq.testing.StubTreq` class implements the :mod:`treq` module interface (:func:`treq.get()`, :func:`treq.post()`, etc.) but runs all I/O via a :class:`~twisted.internet.testing.MemoryReactor`. 
It wraps a :class:`twisted.web.resource.IResource` provider which handles each
request. You can wrap a pre-existing `IResource` provider, or write your own.

For example, the :class:`twisted.web.resource.ErrorPage` resource can produce
an arbitrary HTTP status code. :class:`twisted.web.static.File` can serve
files or directories. And you can easily achieve custom responses by writing
trivial resources yourself:

.. literalinclude:: examples/iresource.py
    :linenos:
    :pyobject: JsonResource

However, those resources don't assert anything about the request. The
:class:`~treq.testing.RequestSequence` and
:class:`~treq.testing.StringStubbingResource` classes make it easy to
construct a resource which encodes the expected request and response pairs.
Note that most parameters to these functions must be bytes; it's safest to
use the ``b''`` string syntax.

For example:

.. literalinclude:: examples/testing_seq.py
    :linenos:

This may be run with ``trial testing_seq.py``.
Download: :download:`testing_seq.py `.

Loosely matching the request
~~~~~~~~~~~~~~~~~~~~~~~~~~~~

If you don't care about certain parts of the request, you can pass
:data:`unittest.mock.ANY`, which compares equal to anything. This sequence
matches a single GET request with any URL, parameters, or headers:

.. code-block:: python

    from unittest.mock import ANY

    RequestSequence([
        ((b'get', ANY, ANY, ANY, b''),
         (200, {}, b'ok'))
    ])

If you care about headers, use :class:`~treq.testing.HasHeaders` to make
assertions about the headers present in the request. It compares equal to a
superset of the headers specified, which helps make your test robust to
changes in treq or Agent. Right now treq adds the ``Accept-Encoding: gzip``
header, but as support for additional compression methods is added, this may
change.

Writing tests for Twisted Web resources
---------------------------------------

Since :class:`~treq.testing.StubTreq` wraps any resource, you can use it to
test your server-side code as well.
This is superior to calling your resource's methods directly or passing mock objects, since it uses a real :class:`~twisted.web.client.Agent` to generate the request and a real :class:`~twisted.web.server.Site` to process the response. Thus, the ``request`` object your code interacts with is a *real* :class:`twisted.web.server.Request` and behaves the same as it would in production. Note that if your resource returns :data:`~twisted.web.server.NOT_DONE_YET` you must keep a reference to the :class:`~treq.testing.RequestTraversalAgent` and call its :meth:`~treq.testing.RequestTraversalAgent.flush()` method to spin the memory reactor once the server writes additional data before the client will receive it. ././@PaxHeader0000000000000000000000000000002600000000000011453 xustar000000000000000022 mtime=1620797689.0 treq-22.2.0/pyproject.toml0000664000175000017500000000057500000000000015136 0ustar00twmtwm00000000000000[build-system] requires = [ "setuptools >= 35.0.2", "wheel >= 0.29.0", "incremental >= 21.3.0", ] build-backend = "setuptools.build_meta" [tool.towncrier] package = "treq" package_dir = "src" filename = "CHANGELOG.rst" directory = "changelog.d" title_format = "{version} ({project_date})" issue_format = "`#{issue} `__" ././@PaxHeader0000000000000000000000000000003400000000000011452 xustar000000000000000028 mtime=1644391241.5757504 treq-22.2.0/setup.cfg0000664000175000017500000000004600000000000014034 0ustar00twmtwm00000000000000[egg_info] tag_build = tag_date = 0 ././@PaxHeader0000000000000000000000000000002600000000000011453 xustar000000000000000022 mtime=1644389736.0 treq-22.2.0/setup.py0000664000175000017500000000344000000000000013726 0ustar00twmtwm00000000000000from setuptools import find_packages, setup classifiers = [ "Development Status :: 5 - Production/Stable", "Intended Audience :: Developers", "License :: OSI Approved :: MIT License", "Operating System :: OS Independent", "Framework :: Twisted", "Programming Language :: Python", "Programming 
Language :: Python :: 3.6", "Programming Language :: Python :: 3.7", "Programming Language :: Python :: 3.8", "Programming Language :: Python :: 3.9", "Programming Language :: Python :: 3.10", "Programming Language :: Python :: Implementation :: CPython", "Programming Language :: Python :: Implementation :: PyPy", ] if __name__ == "__main__": with open('README.rst') as f: readme = f.read() setup( name="treq", packages=find_packages('src'), package_dir={"": "src"}, setup_requires=["incremental"], use_incremental=True, python_requires='>=3.6', install_requires=[ "incremental", "requests >= 2.1.0", "hyperlink >= 21.0.0", "Twisted[tls] >= 18.7.0", "attrs", ], extras_require={ "dev": [ "pep8", "pyflakes", "httpbin==0.5.0", ], "docs": [ "sphinx>=1.4.8", ], }, package_data={"treq": ["_version"]}, author="David Reid", author_email="dreid@dreid.org", maintainer="Tom Most", maintainer_email="twm@freecog.net", classifiers=classifiers, description="High-level Twisted HTTP Client API", license="MIT/X", url="https://github.com/twisted/treq", long_description=readme, long_description_content_type='text/x-rst', ) ././@PaxHeader0000000000000000000000000000003400000000000011452 xustar000000000000000028 mtime=1644391241.5717504 treq-22.2.0/src/0000775000175000017500000000000000000000000013002 5ustar00twmtwm00000000000000././@PaxHeader0000000000000000000000000000003400000000000011452 xustar000000000000000028 mtime=1644391241.5757504 treq-22.2.0/src/treq/0000775000175000017500000000000000000000000013755 5ustar00twmtwm00000000000000././@PaxHeader0000000000000000000000000000002600000000000011453 xustar000000000000000022 mtime=1573445122.0 treq-22.2.0/src/treq/__init__.py0000644000175000017500000000063000000000000016063 0ustar00twmtwm00000000000000from __future__ import absolute_import, division, print_function from ._version import __version__ from treq.api import head, get, post, put, patch, delete, request from treq.content import collect, content, text_content, json_content 
__version__ = __version__.base() __all__ = ['head', 'get', 'post', 'put', 'patch', 'delete', 'request', 'collect', 'content', 'text_content', 'json_content'] ././@PaxHeader0000000000000000000000000000002600000000000011453 xustar000000000000000022 mtime=1611529172.0 treq-22.2.0/src/treq/_agentspy.py0000664000175000017500000000636000000000000016325 0ustar00twmtwm00000000000000# Copyright (c) The treq Authors. # See LICENSE for details. from typing import Callable, List, Optional, Tuple import attr from twisted.internet.defer import Deferred from twisted.web.http_headers import Headers from twisted.web.iweb import IAgent, IBodyProducer, IResponse from zope.interface import implementer @attr.s(frozen=True, order=False, slots=True) class RequestRecord: """ The details of a call to :meth:`_AgentSpy.request` :ivar method: The *method* argument to :meth:`IAgent.request` :ivar uri: The *uri* argument to :meth:`IAgent.request` :ivar headers: The *headers* argument to :meth:`IAgent.request` :ivar bodyProducer: The *bodyProducer* argument to :meth:`IAgent.request` :ivar deferred: The :class:`Deferred` returned by :meth:`IAgent.request` """ method = attr.ib() # type: bytes uri = attr.ib() # type: bytes headers = attr.ib() # type: Optional[Headers] bodyProducer = attr.ib() # type: Optional[IBodyProducer] deferred = attr.ib() # type: Deferred @implementer(IAgent) @attr.s class _AgentSpy: """ An agent that records HTTP requests :ivar _callback: A function called with each :class:`RequestRecord` """ _callback = attr.ib() # type: Callable[Tuple[RequestRecord], None] def request(self, method, uri, headers=None, bodyProducer=None): # type: (bytes, bytes, Optional[Headers], Optional[IBodyProducer]) -> Deferred[IResponse] # noqa if not isinstance(method, bytes): raise TypeError( "method must be bytes, not {!r} of type {}".format(method, type(method)) ) if not isinstance(uri, bytes): raise TypeError( "uri must be bytes, not {!r} of type {}".format(uri, type(uri)) ) if headers is not None 
and not isinstance(headers, Headers): raise TypeError( "headers must be {}, not {!r} of type {}".format( Headers, headers, type(headers) ) ) if bodyProducer is not None and not IBodyProducer.providedBy(bodyProducer): raise TypeError( ( "bodyProducer must implement IBodyProducer, but {!r} does not." " Is the implementation marked with @implementer(IBodyProducer)?" ).format(bodyProducer) ) d = Deferred() record = RequestRecord(method, uri, headers, bodyProducer, d) self._callback(record) return d def agent_spy(): # type: () -> Tuple[IAgent, List[RequestRecord]] """ Record HTTP requests made with an agent This is suitable for low-level testing of wrapper agents. It validates the parameters of each call to :meth:`IAgent.request` (synchronously raising :exc:`TypeError`) and captures them as a :class:`RequestRecord`, which can then be used to inspect the request or generate a response by firing the :attr:`~RequestRecord.deferred`. :returns: A two-tuple of: - An :class:`twisted.web.iweb.IAgent` - A list of calls made to the agent's :meth:`~twisted.web.iweb.IAgent.request()` method """ records = [] agent = _AgentSpy(records.append) return agent, records # ---- treq-22.2.0/src/treq/_version.py ---- """ Provides treq version information. """ # This file is auto-generated! Do not edit! # Use `python -m incremental.update treq` to change this file.
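The ``agent_spy()`` helper above is treq's hook for low-level agent testing. The sketch below mirrors that recording pattern with a plain-Python stand-in (the ``Spy`` class and its placeholder deferred are illustrative, not part of treq) so the flow is visible without Twisted or a running reactor; real tests would import ``agent_spy`` from ``treq._agentspy``:

```python
# Stand-in illustrating the recording-agent pattern of agent_spy().
from collections import namedtuple

RequestRecord = namedtuple(
    "RequestRecord", ["method", "uri", "headers", "bodyProducer", "deferred"]
)

def agent_spy():
    records = []

    class Spy:
        def request(self, method, uri, headers=None, bodyProducer=None):
            # In the real helper this is a twisted Deferred the test can
            # fire later to simulate a response.
            deferred = object()
            records.append(
                RequestRecord(method, uri, headers, bodyProducer, deferred)
            )
            return deferred

    return Spy(), records

agent, requests = agent_spy()
agent.request(b"GET", b"https://example.com/api")

# The call is captured synchronously as a RequestRecord.
record = requests[0]
assert record.method == b"GET"
assert record.uri == b"https://example.com/api"
```

The key design point carried over from the real helper: requests are recorded synchronously and never answered on their own, so the test fully controls when and how each response arrives.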
from incremental import Version __version__ = Version("treq", 22, 2, 0) __all__ = ["__version__"] # ---- treq-22.2.0/src/treq/api.py ---- from __future__ import absolute_import, division, print_function from twisted.web.client import Agent, HTTPConnectionPool from treq.client import HTTPClient def head(url, **kwargs): """ Make a ``HEAD`` request. See :py:func:`treq.request` """ return _client(kwargs).head(url, _stacklevel=4, **kwargs) def get(url, headers=None, **kwargs): """ Make a ``GET`` request. See :py:func:`treq.request` """ return _client(kwargs).get(url, headers=headers, _stacklevel=4, **kwargs) def post(url, data=None, **kwargs): """ Make a ``POST`` request. See :py:func:`treq.request` """ return _client(kwargs).post(url, data=data, _stacklevel=4, **kwargs) def put(url, data=None, **kwargs): """ Make a ``PUT`` request. See :py:func:`treq.request` """ return _client(kwargs).put(url, data=data, _stacklevel=4, **kwargs) def patch(url, data=None, **kwargs): """ Make a ``PATCH`` request. See :py:func:`treq.request` """ return _client(kwargs).patch(url, data=data, _stacklevel=4, **kwargs) def delete(url, **kwargs): """ Make a ``DELETE`` request. See :py:func:`treq.request` """ return _client(kwargs).delete(url, _stacklevel=4, **kwargs) def request(method, url, **kwargs): """ Make an HTTP request. :param str method: HTTP method. Example: ``'GET'``, ``'HEAD'``, ``'PUT'``, ``'POST'``. :param url: http or https URL, which may include query arguments. :type url: :class:`hyperlink.DecodedURL`, `str`, `bytes`, or :class:`hyperlink.EncodedURL` :param headers: Optional HTTP Headers to send with this request. :type headers: :class:`~twisted.web.http_headers.Headers` or None :param params: Optional parameters to be appended to the URL query string.
Any query string parameters in the *url* will be preserved. :type params: dict w/ str or list/tuple of str values, list of 2-tuples, or None. :param data: Arbitrary request body data. If *files* is also passed this must be a :class:`dict`, a :class:`tuple` or :class:`list` of field tuples as accepted by :class:`MultiPartProducer`. The request is assigned a Content-Type of ``multipart/form-data``. If a :class:`dict`, :class:`list`, or :class:`tuple` it is URL-encoded and the request assigned a Content-Type of ``application/x-www-form-urlencoded``. Otherwise, any non-``None`` value is passed to the client's *data_to_body_producer* callable (by default, :class:`IBodyProducer`), which accepts :class:`bytes` and binary files like those returned by ``open(..., "rb")``. :type data: `bytes`, `typing.BinaryIO`, `IBodyProducer`, or `None` :param files: Files to include in the request body, in any of several formats: - ``[("fieldname", binary_file)]`` - ``[("fieldname", "filename", binary_file)]`` - ``[("fieldname", "filename", "content-type", binary_file)]`` Or a mapping: - ``{"fieldname": binary_file}`` - ``{"fieldname": ("filename", binary_file)}`` - ``{"fieldname": ("filename", "content-type", binary_file)}`` Each ``binary_file`` is a file-like object open in binary mode (like those returned by ``open("filename", "rb")``). The filename is taken from the file's ``name`` attribute if not specified. The Content-Type is guessed based on the filename using :func:`mimetypes.guess_type()` if not specified, falling back to ``application/octet-stream``. While uploading, treq will measure the length of seekable files to populate the Content-Length header of the file part. If *files* is given the request is assigned a Content-Type of ``multipart/form-data``. Additional fields may be given in the *data* argument. :param json: Optional JSON-serializable content for the request body. Mutually exclusive with *data* and *files*.
:type json: `dict`, `list`, `tuple`, `int`, `str`, `bool`, or `None` :param auth: HTTP Basic Authentication information --- see :func:`treq.auth.add_auth`. :type auth: tuple of ``('username', 'password')`` :param cookies: Cookies to send with this request. The HTTP kind, not the tasty kind. :type cookies: ``dict`` or ``cookielib.CookieJar`` :param int timeout: Request timeout in seconds. If a response is not received within this timeframe, the connection is aborted with ``CancelledError``. :param bool allow_redirects: Follow HTTP redirects. Default: ``True`` :param bool browser_like_redirects: Follow redirects like a web browser: when a 301 or 302 redirect is received in response to a POST request, convert the method to GET. See :rfc:`7231 <7231#section-6.4.3>` and :class:`~twisted.web.client.BrowserLikeRedirectAgent`. Default: ``False`` :param bool unbuffered: Pass ``True`` to disable response buffering. By default treq buffers the entire response body in memory. :param reactor: Optional Twisted reactor. :param bool persistent: Use persistent HTTP connections. Default: ``True`` :param agent: Provide your own custom agent. Use this to override things like ``connectTimeout`` or ``BrowserLikePolicyForHTTPS``. By default, treq will create its own Agent with reasonable defaults. :type agent: twisted.web.iweb.IAgent :rtype: Deferred that fires with an :class:`IResponse` .. versionchanged:: treq 20.9.0 The *url* param now accepts :class:`hyperlink.DecodedURL` and :class:`hyperlink.EncodedURL` objects. """ return _client(kwargs).request(method, url, _stacklevel=3, **kwargs) # # Private API # def default_reactor(reactor): """ Return the specified reactor or the default.
""" if reactor is None: from twisted.internet import reactor return reactor _global_pool = [None] def get_global_pool(): return _global_pool[0] def set_global_pool(pool): _global_pool[0] = pool def default_pool(reactor, pool, persistent): """ Return the specified pool or a pool with the specified reactor and persistence. """ reactor = default_reactor(reactor) if pool is not None: return pool if persistent is False: return HTTPConnectionPool(reactor, persistent=persistent) if get_global_pool() is None: set_global_pool(HTTPConnectionPool(reactor, persistent=True)) return get_global_pool() def _client(kwargs): agent = kwargs.pop("agent", None) pool = kwargs.pop("pool", None) persistent = kwargs.pop("persistent", None) if agent is None: # "reactor" isn't removed from kwargs because it must also be passed # down for use in the timeout logic. reactor = default_reactor(kwargs.get("reactor")) pool = default_pool(reactor, pool, persistent) agent = Agent(reactor, pool=pool) return HTTPClient(agent) ././@PaxHeader0000000000000000000000000000002600000000000011453 xustar000000000000000022 mtime=1644041076.0 treq-22.2.0/src/treq/auth.py0000664000175000017500000000577600000000000015307 0ustar00twmtwm00000000000000# Copyright 2012-2020 The treq Authors. # See LICENSE for details. from __future__ import absolute_import, division, print_function import binascii from typing import Union from twisted.web.http_headers import Headers from twisted.web.iweb import IAgent from zope.interface import implementer class UnknownAuthConfig(Exception): """ The authentication config provided couldn't be interpreted. """ def __init__(self, config): super(Exception, self).__init__( '{0!r} not of a known type.'.format(config)) @implementer(IAgent) class _RequestHeaderSetterAgent: """ Wrap an agent to set request headers :ivar _agent: The wrapped agent. :ivar _request_headers: Headers to set on each request before forwarding it to the wrapped agent. 
""" def __init__(self, agent, headers): self._agent = agent self._headers = headers def request(self, method, uri, headers=None, bodyProducer=None): if headers is None: requestHeaders = self._headers else: requestHeaders = headers.copy() for header, values in self._headers.getAllRawHeaders(): requestHeaders.setRawHeaders(header, values) return self._agent.request( method, uri, headers=requestHeaders, bodyProducer=bodyProducer) def add_basic_auth(agent, username, password): # type: (IAgent, Union[str, bytes], Union[str, bytes]) -> IAgent """ Wrap an agent to add HTTP basic authentication The returned agent sets the *Authorization* request header according to the basic authentication scheme described in :rfc:`7617`. This header contains the given *username* and *password* in plaintext, and thus should only be used over an encrypted transport (HTTPS). Note that the colon (``:``) is used as a delimiter between the *username* and *password*, so if either parameter includes a colon the interpretation of the *Authorization* header is server-defined. :param agent: Agent to wrap. :param username: The username. :param password: The password. :returns: :class:`~twisted.web.iweb.IAgent` """ if not isinstance(username, bytes): username = username.encode('utf-8') if not isinstance(password, bytes): password = password.encode('utf-8') creds = binascii.b2a_base64(b'%s:%s' % (username, password)).rstrip(b'\n') return _RequestHeaderSetterAgent( agent, Headers({b'Authorization': [b'Basic ' + creds]}), ) def add_auth(agent, auth_config): """ Wrap an agent to perform authentication :param agent: Agent to wrap. :param auth_config: A ``('username', 'password')`` tuple --- see :func:`add_basic_auth`. :returns: :class:`~twisted.web.iweb.IAgent` :raises UnknownAuthConfig: When the format *auth_config* isn't supported. 
""" if isinstance(auth_config, tuple): return add_basic_auth(agent, auth_config[0], auth_config[1]) raise UnknownAuthConfig(auth_config) ././@PaxHeader0000000000000000000000000000002600000000000011453 xustar000000000000000022 mtime=1644389690.0 treq-22.2.0/src/treq/client.py0000664000175000017500000004137600000000000015620 0ustar00twmtwm00000000000000import io import mimetypes import uuid import warnings from collections.abc import Mapping from http.cookiejar import CookieJar, Cookie from urllib.parse import quote_plus, urlencode as _urlencode from twisted.internet.interfaces import IProtocol from twisted.internet.defer import Deferred from twisted.python.components import proxyForInterface from twisted.python.filepath import FilePath from hyperlink import DecodedURL, EncodedURL from twisted.web.http_headers import Headers from twisted.web.iweb import IBodyProducer, IResponse from twisted.web.client import ( FileBodyProducer, RedirectAgent, BrowserLikeRedirectAgent, ContentDecoderAgent, GzipDecoder, CookieAgent ) from twisted.python.components import registerAdapter from json import dumps as json_dumps from treq.auth import add_auth from treq import multipart from treq.response import _Response from requests.cookies import merge_cookies _NOTHING = object() def urlencode(query, doseq): s = _urlencode(query, doseq) if not isinstance(s, bytes): s = s.encode("ascii") return s def _scoped_cookiejar_from_dict(url_object, cookie_dict): """ Create a CookieJar from a dictionary whose cookies are all scoped to the given URL's origin. @note: This does not scope the cookies to any particular path, only the host, port, and scheme of the given URL. 
""" cookie_jar = CookieJar() if cookie_dict is None: return cookie_jar for k, v in cookie_dict.items(): secure = url_object.scheme == 'https' port_specified = not ( (url_object.scheme == "https" and url_object.port == 443) or (url_object.scheme == "http" and url_object.port == 80) ) port = str(url_object.port) if port_specified else None domain = url_object.host netscape_domain = domain if '.' in domain else domain + '.local' cookie_jar.set_cookie( Cookie( # Scoping domain=netscape_domain, port=port, secure=secure, port_specified=port_specified, # Contents name=k, value=v, # Constant/always-the-same stuff version=0, path="/", expires=None, discard=False, comment=None, comment_url=None, rfc2109=False, path_specified=False, domain_specified=False, domain_initial_dot=False, rest=[], ) ) return cookie_jar class _BodyBufferingProtocol(proxyForInterface(IProtocol)): def __init__(self, original, buffer, finished): self.original = original self.buffer = buffer self.finished = finished def dataReceived(self, data): self.buffer.append(data) self.original.dataReceived(data) def connectionLost(self, reason): self.original.connectionLost(reason) self.finished.errback(reason) class _BufferedResponse(proxyForInterface(IResponse)): def __init__(self, original): self.original = original self._buffer = [] self._waiters = [] self._waiting = None self._finished = False self._reason = None def _deliverWaiting(self, reason): self._reason = reason self._finished = True for waiter in self._waiters: for segment in self._buffer: waiter.dataReceived(segment) waiter.connectionLost(reason) def deliverBody(self, protocol): if self._waiting is None and not self._finished: self._waiting = Deferred() self._waiting.addBoth(self._deliverWaiting) self.original.deliverBody( _BodyBufferingProtocol( protocol, self._buffer, self._waiting ) ) elif self._finished: for segment in self._buffer: protocol.dataReceived(segment) protocol.connectionLost(self._reason) else: self._waiters.append(protocol) class 
HTTPClient: def __init__(self, agent, cookiejar=None, data_to_body_producer=IBodyProducer): self._agent = agent if cookiejar is None: cookiejar = CookieJar() self._cookiejar = cookiejar self._data_to_body_producer = data_to_body_producer def get(self, url, **kwargs): """ See :func:`treq.get()`. """ kwargs.setdefault('_stacklevel', 3) return self.request('GET', url, **kwargs) def put(self, url, data=None, **kwargs): """ See :func:`treq.put()`. """ kwargs.setdefault('_stacklevel', 3) return self.request('PUT', url, data=data, **kwargs) def patch(self, url, data=None, **kwargs): """ See :func:`treq.patch()`. """ kwargs.setdefault('_stacklevel', 3) return self.request('PATCH', url, data=data, **kwargs) def post(self, url, data=None, **kwargs): """ See :func:`treq.post()`. """ kwargs.setdefault('_stacklevel', 3) return self.request('POST', url, data=data, **kwargs) def head(self, url, **kwargs): """ See :func:`treq.head()`. """ kwargs.setdefault('_stacklevel', 3) return self.request('HEAD', url, **kwargs) def delete(self, url, **kwargs): """ See :func:`treq.delete()`. """ kwargs.setdefault('_stacklevel', 3) return self.request('DELETE', url, **kwargs) def request( self, method, url, *, params=None, headers=None, data=None, files=None, json=_NOTHING, auth=None, cookies=None, allow_redirects=True, browser_like_redirects=False, unbuffered=False, reactor=None, timeout=None, _stacklevel=2, ): """ See :func:`treq.request()`. """ method = method.encode('ascii').upper() if isinstance(url, DecodedURL): parsed_url = url.encoded_url elif isinstance(url, EncodedURL): parsed_url = url elif isinstance(url, str): # We use hyperlink in lazy mode so that users can pass arbitrary # bytes in the path and querystring. parsed_url = EncodedURL.from_text(url) else: parsed_url = EncodedURL.from_text(url.decode('ascii')) # Join parameters provided in the URL # and the ones passed as argument. 
if params: parsed_url = parsed_url.replace( query=parsed_url.query + tuple(_coerced_query_params(params)) ) url = parsed_url.to_uri().to_text().encode('ascii') headers = self._request_headers(headers, _stacklevel + 1) bodyProducer, contentType = self._request_body(data, files, json, stacklevel=_stacklevel + 1) if contentType is not None: headers.setRawHeaders(b'Content-Type', [contentType]) if not isinstance(cookies, CookieJar): cookies = _scoped_cookiejar_from_dict(parsed_url, cookies) cookies = merge_cookies(self._cookiejar, cookies) wrapped_agent = CookieAgent(self._agent, cookies) if allow_redirects: if browser_like_redirects: wrapped_agent = BrowserLikeRedirectAgent(wrapped_agent) else: wrapped_agent = RedirectAgent(wrapped_agent) wrapped_agent = ContentDecoderAgent(wrapped_agent, [(b'gzip', GzipDecoder)]) if auth: wrapped_agent = add_auth(wrapped_agent, auth) d = wrapped_agent.request( method, url, headers=headers, bodyProducer=bodyProducer) if reactor is None: from twisted.internet import reactor if timeout: delayedCall = reactor.callLater(timeout, d.cancel) def gotResult(result): if delayedCall.active(): delayedCall.cancel() return result d.addBoth(gotResult) if not unbuffered: d.addCallback(_BufferedResponse) return d.addCallback(_Response, cookies) def _request_headers(self, headers, stacklevel): """ Convert the *headers* argument to a :class:`Headers` instance :returns: :class:`twisted.web.http_headers.Headers` """ if isinstance(headers, dict): h = Headers({}) for k, v in headers.items(): if isinstance(v, (bytes, str)): h.addRawHeader(k, v) elif isinstance(v, list): h.setRawHeaders(k, v) else: warnings.warn( ( "The value of headers key {!r} has non-string type {}" " and will be dropped." " This will raise TypeError in the next treq release." 
).format(k, type(v)), DeprecationWarning, stacklevel=stacklevel, ) return h if isinstance(headers, Headers): return headers if headers is None: return Headers({}) warnings.warn( ( "headers must be a dict, twisted.web.http_headers.Headers, or None," " but found {}, which will be ignored." " This will raise TypeError in the next treq release." ).format(type(headers)), DeprecationWarning, stacklevel=stacklevel, ) return Headers({}) def _request_body(self, data, files, json, stacklevel): """ Choose the right body producer based on the parameters passed in. :param data: Arbitrary request body data. If *files* is also passed this must be a :class:`dict`, a :class:`tuple` or :class:`list` of field tuples as accepted by :class:`MultiPartProducer`. The request is assigned a Content-Type of ``multipart/form-data``. If a :class:`dict`, :class:`list`, or :class:`tuple` it is URL-encoded and the request assigned a Content-Type of ``application/x-www-form-urlencoded``. Otherwise, any non-``None`` value is passed to the client's *data_to_body_producer* callable (by default, :class:`IBodyProducer`), which accepts file-like objects. :param files: Files to include in the request body, in any of the several formats described in :func:`_convert_files()`. :param json: JSON-encodable data, or the sentinel `_NOTHING`. The sentinel is necessary because ``None`` is a valid JSON value. """ if json is not _NOTHING and (files or data): warnings.warn( ( "Argument 'json' will be ignored because '{}' was also passed." " This will raise TypeError in the next treq release." ).format("data" if data else "files"), DeprecationWarning, stacklevel=stacklevel, ) if files: # If the files keyword is present we will issue a # multipart/form-data request, as it is better suited to cases # with files and/or large objects.
files = list(_convert_files(files)) boundary = str(uuid.uuid4()).encode('ascii') if data: data = _convert_params(data) else: data = [] return ( multipart.MultiPartProducer(data + files, boundary=boundary), b'multipart/form-data; boundary=' + boundary, ) # Otherwise stick to x-www-form-urlencoded format # as it's generally faster for smaller requests. if isinstance(data, (dict, list, tuple)): return ( self._data_to_body_producer(urlencode(data, doseq=True)), b'application/x-www-form-urlencoded', ) elif data: return ( self._data_to_body_producer(data), None, ) if json is not _NOTHING: return ( self._data_to_body_producer( json_dumps(json, separators=(u',', u':')).encode('utf-8'), ), b'application/json; charset=UTF-8', ) return None, None def _convert_params(params): if hasattr(params, "iteritems"): return list(sorted(params.iteritems())) elif hasattr(params, "items"): return list(sorted(params.items())) elif isinstance(params, (tuple, list)): return list(params) else: raise ValueError("Unsupported format") def _convert_files(files): """ Files can be passed in a variety of formats: * {"fieldname": open("bla.f", "rb")} * {"fieldname": ("filename", open("bla.f", "rb"))} * {"fieldname": ("filename", "content-type", open("bla.f", "rb"))} * Anything that has an iteritems method, e.g. MultiDict: MultiDict([(name, open()), (name, open())]) Our goal is to standardize it to a unified form of: * [(param, (file name, content type, producer))] """ if hasattr(files, "iteritems"): files = files.iteritems() elif hasattr(files, "items"): files = files.items() for param, val in files: file_name, content_type, fobj = (None, None, None) if isinstance(val, tuple): if len(val) == 2: file_name, fobj = val elif len(val) == 3: file_name, content_type, fobj = val else: # NB: This is TypeError for backward compatibility. This case # used to fall through to `IBodyProducer`, below, which raised # TypeError about being unable to coerce None.
raise TypeError( ( "`files` argument must be a sequence of tuples of" " (file_name, file_obj) or" " (file_name, content_type, file_obj)," " but the {!r} tuple has length {}: {!r}" ).format(param, len(val), val), ) else: fobj = val if hasattr(fobj, "name"): file_name = FilePath(fobj.name).basename() if not content_type: content_type = _guess_content_type(file_name) # XXX: Shouldn't this call self._data_to_body_producer? yield (param, (file_name, content_type, IBodyProducer(fobj))) def _query_quote(v): # (Any) -> Text """ Percent-encode a querystring name or value. :param v: A value. :returns: The value, coerced to a string and percent-encoded as appropriate for a querystring (with space as ``+``). """ if not isinstance(v, (str, bytes)): v = str(v) if not isinstance(v, bytes): v = v.encode("utf-8") q = quote_plus(v) return q def _coerced_query_params(params): """ Carefully coerce *params* in the same way as `urllib.parse.urlencode()` Parameter names and values are coerced to unicode, which is encoded as UTF-8 and then percent-encoded. As a special case, `bytes` are directly percent-encoded. :param params: A mapping or sequence of (name, value) two-tuples. The value may be a list or tuple of multiple values. Names and values may be pretty much any type. :returns: A generator that yields two-tuples containing percent-encoded text strings. 
:rtype: Iterator[Tuple[Text, Text]] """ if isinstance(params, Mapping): items = params.items() else: items = params for key, values in items: key_quoted = _query_quote(key) if not isinstance(values, (list, tuple)): values = (values,) for value in values: yield key_quoted, _query_quote(value) def _from_bytes(orig_bytes): return FileBodyProducer(io.BytesIO(orig_bytes)) def _from_file(orig_file): return FileBodyProducer(orig_file) def _guess_content_type(filename): if filename: guessed = mimetypes.guess_type(filename)[0] else: guessed = None return guessed or 'application/octet-stream' registerAdapter(_from_bytes, bytes, IBodyProducer) registerAdapter(_from_file, io.BytesIO, IBodyProducer) # file()/open() equiv registerAdapter(_from_file, io.BufferedReader, IBodyProducer) # ---- treq-22.2.0/src/treq/content.py ---- import cgi import json from twisted.internet.defer import Deferred, succeed from twisted.internet.protocol import Protocol from twisted.web.client import ResponseDone from twisted.web.http import PotentialDataLoss def _encoding_from_headers(headers): content_types = headers.getRawHeaders(u'content-type') if content_types is None: return None # This seems to be the choice browsers make when encountering multiple # content-type headers.
content_type, params = cgi.parse_header(content_types[-1]) if 'charset' in params: return params.get('charset').strip("'\"") if content_type == 'application/json': return 'UTF-8' class _BodyCollector(Protocol): def __init__(self, finished, collector): self.finished = finished self.collector = collector def dataReceived(self, data): self.collector(data) def connectionLost(self, reason): if reason.check(ResponseDone): self.finished.callback(None) elif reason.check(PotentialDataLoss): # http://twistedmatrix.com/trac/ticket/4840 self.finished.callback(None) else: self.finished.errback(reason) def collect(response, collector): """ Incrementally collect the body of the response. This function may only be called **once** for a given response. :param IResponse response: The HTTP response to collect the body from. :param collector: A callable to be called each time data is available from the response body. :type collector: single argument callable :rtype: Deferred that fires with None when the entire body has been read. """ if response.length == 0: return succeed(None) d = Deferred() response.deliverBody(_BodyCollector(d, collector)) return d def content(response): """ Read the contents of an HTTP response. This function may be called multiple times for a response; treq buffers the response body, so each call receives the complete contents. :param IResponse response: The HTTP Response to get the contents of. :rtype: Deferred that fires with the content as ``bytes``. """ _content = [] d = collect(response, _content.append) d.addCallback(lambda _: b''.join(_content)) return d def json_content(response, **kwargs): """ Read the contents of an HTTP response and attempt to decode it as JSON. This function relies on :py:func:`content` and so may be called more than once for a given response. :param IResponse response: The HTTP Response to get the contents of. :param kwargs: Any keyword arguments accepted by :py:func:`json.loads` :rtype: Deferred that fires with the decoded JSON.
""" # RFC7159 (8.1): Default JSON character encoding is UTF-8 d = text_content(response, encoding='utf-8') d.addCallback(lambda text: json.loads(text, **kwargs)) return d def text_content(response, encoding='ISO-8859-1'): """ Read the contents of an HTTP response and decode it with an appropriate charset, which may be guessed from the ``Content-Type`` header. :param IResponse response: The HTTP Response to get the contents of. :param str encoding: A charset, such as ``UTF-8`` or ``ISO-8859-1``, used if the response does not specify an encoding. :rtype: Deferred that fires with a unicode string. """ def _decode_content(c): e = _encoding_from_headers(response.headers) if e is not None: return c.decode(e) return c.decode(encoding) d = content(response) d.addCallback(_decode_content) return d ././@PaxHeader0000000000000000000000000000002600000000000011453 xustar000000000000000022 mtime=1640750759.0 treq-22.2.0/src/treq/multipart.py0000664000175000017500000003015100000000000016350 0ustar00twmtwm00000000000000# Copyright (c) Twisted Matrix Laboratories. # See LICENSE for details. from uuid import uuid4 from io import BytesIO from contextlib import closing from twisted.internet import defer, task from twisted.web.iweb import UNKNOWN_LENGTH, IBodyProducer from zope.interface import implementer CRLF = b"\r\n" @implementer(IBodyProducer) class MultiPartProducer: """ :class:`MultiPartProducer` takes parameters for a HTTP request and produces bytes in multipart/form-data format defined in :rfc:`2388` and :rfc:`2046`. The encoded request is produced incrementally and the bytes are written to a consumer. 
Fields should have the form: ``[(parameter name, value), ...]`` Accepted values: * Unicode strings (in this case the parameter will be encoded with utf-8) * Tuples with (file name, content-type, :class:`~twisted.web.iweb.IBodyProducer` objects) Since :class:`MultiPartProducer` can accept objects like :class:`~twisted.web.iweb.IBodyProducer` which cannot be read from in an event-driven manner it uses a :class:`~twisted.internet.task.Cooperator` instance to schedule reads from the underlying producers. Reading is also paused and resumed based on notifications from the :class:`IConsumer` provider being written to. :ivar _fields: Sorted parameters, where all strings are enforced to be unicode and file objects stacked on bottom (to produce a human readable form-data request) :ivar _cooperate: A method like `Cooperator.cooperate` which is used to schedule all reads. :ivar boundary: The generated boundary used in form-data encoding :type boundary: `bytes` """ def __init__(self, fields, boundary=None, cooperator=task): self._fields = list(_sorted_by_type(_converted(fields))) self._currentProducer = None self._cooperate = cooperator.cooperate self.boundary = boundary or uuid4().hex if isinstance(self.boundary, str): self.boundary = self.boundary.encode('ascii') self.length = self._calculateLength() def startProducing(self, consumer): """ Start a cooperative task which will read bytes from the input file and write them to `consumer`. Return a `Deferred` which fires after all bytes have been written. :param consumer: Any `IConsumer` provider """ self._task = self._cooperate(self._writeLoop(consumer)) d = self._task.whenDone() def maybeStopped(reason): reason.trap(task.TaskStopped) return defer.Deferred() d.addCallbacks(lambda ignored: None, maybeStopped) return d def stopProducing(self): """ Permanently stop writing bytes from the file to the consumer by stopping the underlying `CooperativeTask`.
""" if self._currentProducer: self._currentProducer.stopProducing() self._task.stop() def pauseProducing(self): """ Temporarily suspend copying bytes from the input file to the consumer by pausing the `CooperativeTask` which drives that activity. """ if self._currentProducer: # Having a current producer means that we are in # the paused state because we've returned # the deferred of the current producer to the # the cooperator. So this request # for pausing us is actually a request to pause # our underlying current producer. self._currentProducer.pauseProducing() else: self._task.pause() def resumeProducing(self): """ Undo the effects of a previous `pauseProducing` and resume copying bytes to the consumer by resuming the `CooperativeTask` which drives the write activity. """ if self._currentProducer: self._currentProducer.resumeProducing() else: self._task.resume() def _calculateLength(self): """ Determine how many bytes the overall form post would consume. The easiest way is to calculate is to generate of `fObj` (assuming it is not modified from this point on). If the determination cannot be made, return `UNKNOWN_LENGTH`. """ consumer = _LengthConsumer() for i in list(self._writeLoop(consumer)): pass return consumer.length def _getBoundary(self, final=False): """ Returns a boundary line, either final (the one that ends the form data request or a regular, the one that separates the boundaries) --this-is-my-boundary """ f = b"--" if final else b"" return b"--" + self.boundary + f def _writeLoop(self, consumer): """ Return an iterator which generates the multipart/form-data request including the encoded objects and writes them to the consumer for each time it is iterated. 
""" for index, (name, value) in enumerate(self._fields): # We don't write the CRLF of the first boundary: # HTTP request headers are already separated with CRLF # from the request body, another newline is possible # and should be considered as an empty preamble per rfc2046, # but is generally confusing, so we omit it when generating # the request. We don't write Content-Type: multipart/form-data # header here as well as it's defined in the context of the HTTP # request headers, not the producer, so we gust generate # the body. # It's also important to note that the boundary in the message # is defined not only by "--boundary-value" but # but with CRLF characters before it and after the line. # This is very important. # proper boundary is "CRLF--boundary-valueCRLF" consumer.write( (CRLF if index != 0 else b"") + self._getBoundary() + CRLF) yield self._writeField(name, value, consumer) consumer.write(CRLF + self._getBoundary(final=True) + CRLF) def _writeField(self, name, value, consumer): if isinstance(value, str): self._writeString(name, value, consumer) elif isinstance(value, tuple): filename, content_type, producer = value return self._writeFile( name, filename, content_type, producer, consumer) def _writeString(self, name, value, consumer): cdisp = _Header(b"Content-Disposition", b"form-data") cdisp.add_param(b"name", name) consumer.write(bytes(cdisp) + CRLF + CRLF) encoded = value.encode("utf-8") consumer.write(encoded) self._currentProducer = None def _writeFile(self, name, filename, content_type, producer, consumer): cdisp = _Header(b"Content-Disposition", b"form-data") cdisp.add_param(b"name", name) if filename: cdisp.add_param(b"filename", filename) consumer.write(bytes(cdisp) + CRLF) consumer.write(bytes(_Header(b"Content-Type", content_type)) + CRLF) if producer.length != UNKNOWN_LENGTH: consumer.write( bytes(_Header(b"Content-Length", producer.length)) + CRLF) consumer.write(CRLF) if isinstance(consumer, _LengthConsumer): consumer.write(producer.length) 
else: self._currentProducer = producer def unset(val): self._currentProducer = None return val d = producer.startProducing(consumer) d.addCallback(unset) return d def _escape(value): """ This function prevents header values from corrupting the request: a newline in the filename parameter makes a form-data request unreadable for the majority of parsers. """ if not isinstance(value, (bytes, str)): value = str(value) if isinstance(value, bytes): value = value.decode('utf-8') return value.replace(u"\r", u"").replace(u"\n", u"").replace(u'"', u'\\"') def _enforce_unicode(value): """ This function enforces that the strings passed in are unicode, so we don't need to guess the encoding of binary strings. If you need to pass raw binary data, use BytesIO and wrap it with `FileBodyProducer`. """ if isinstance(value, str): return value elif isinstance(value, bytes): # We got a byte string and have no idea what its encoding is; # we can only assume it is UTF-8 (or its ASCII subset). try: return value.decode("utf-8") except UnicodeDecodeError: raise ValueError( "Supplied raw bytes that are not ascii/utf-8."
" When supplying raw string make sure it's ascii or utf-8" ", or work with unicode if you are not sure") else: raise ValueError( "Unsupported field type: %s" % (value.__class__.__name__,)) def _converted(fields): if hasattr(fields, "iteritems"): fields = fields.iteritems() elif hasattr(fields, "items"): fields = fields.items() for name, value in fields: name = _enforce_unicode(name) if isinstance(value, (tuple, list)): if len(value) != 3: raise ValueError( "Expected tuple: (filename, content type, producer)") filename, content_type, producer = value filename = _enforce_unicode(filename) if filename else None yield name, (filename, content_type, producer) elif isinstance(value, (bytes, str)): yield name, _enforce_unicode(value) else: raise ValueError( "Unsupported value, expected string, unicode " "or tuple (filename, content type, IBodyProducer)") class _LengthConsumer: """ `_LengthConsumer` is used to calculate the length of the multi-part request. The easiest way to do that is to consume all the fields, but instead writing them to the string just accumulate the request length. :ivar length: The length of the request. Can be `UNKNOWN_LENGTH` if consumer finds the field that has length that can not be calculated """ def __init__(self): self.length = 0 def write(self, value): # this means that we have encountered # unknown length producer # so we need to stop attempts calculating if self.length is UNKNOWN_LENGTH: return if value is UNKNOWN_LENGTH: self.length = value elif isinstance(value, int): self.length += value else: self.length += len(value) class _Header: """ `_Header` This class is a tiny wrapper that produces request headers. We can't use standard python header class because it encodes unicode fields using =? bla bla ?= encoding, which is correct, but no one in HTTP world expects that, everyone wants utf-8 raw bytes. 
""" def __init__(self, name, value, params=None): self.name = name self.value = value self.params = params or [] def add_param(self, name, value): self.params.append((name, value)) def __bytes__(self): with closing(BytesIO()) as h: h.write(self.name + b": " + _escape(self.value).encode("us-ascii")) if self.params: for (name, val) in self.params: h.write(b"; ") h.write(_escape(name).encode("us-ascii")) h.write(b"=") h.write(b'"' + _escape(val).encode('utf-8') + b'"') h.seek(0) return h.read() def __str__(self): return self.__bytes__() def _sorted_by_type(fields): """Sorts params so that strings are placed before files. That makes a request more readable, as generally files are bigger. It also provides deterministic order of fields what is easier for testing. """ def key(p): key, val = p if isinstance(val, (bytes, str)): return (0, key) else: return (1, key) return sorted(fields, key=key) ././@PaxHeader0000000000000000000000000000002600000000000011453 xustar000000000000000022 mtime=1611529172.0 treq-22.2.0/src/treq/response.py0000664000175000017500000000720000000000000016164 0ustar00twmtwm00000000000000from twisted.python.components import proxyForInterface from twisted.web.iweb import IResponse, UNKNOWN_LENGTH from twisted.python import reflect from requests.cookies import cookiejar_from_dict from treq.content import collect, content, json_content, text_content class _Response(proxyForInterface(IResponse)): """ A wrapper for :class:`twisted.web.iweb.IResponse` which manages cookies and adds a few convenience methods. """ def __init__(self, original, cookiejar): self.original = original self._cookiejar = cookiejar def __repr__(self): """ Generate a representation of the response which includes the HTTP status code, Content-Type header, and body size, if available. """ if self.original.length == UNKNOWN_LENGTH: size = 'unknown size' else: size = '{:,d} bytes'.format(self.original.length) # Display non-ascii bits of the content-type header as backslash # escapes. 
content_type_bytes = b', '.join( self.original.headers.getRawHeaders(b'content-type', ())) content_type = repr(content_type_bytes).lstrip('b')[1:-1] return "<{} {} '{:.40s}' {}>".format( reflect.qual(self.__class__), self.original.code, content_type, size, ) def collect(self, collector): """ Incrementally collect the body of the response, per :func:`treq.collect()`. :param collector: A single argument callable that will be called with chunks of body data as it is received. :returns: A `Deferred` that fires when the entire body has been received. """ return collect(self.original, collector) def content(self): """ Read the entire body all at once, per :func:`treq.content()`. :returns: A `Deferred` that fires with a `bytes` object when the entire body has been received. """ return content(self.original) def json(self, **kwargs): """ Collect the response body as JSON per :func:`treq.json_content()`. :param kwargs: Any keyword arguments accepted by :py:func:`json.loads` :rtype: Deferred that fires with the decoded JSON when the entire body has been read. """ return json_content(self.original, **kwargs) def text(self, encoding='ISO-8859-1'): """ Read the entire body all at once as text, per :func:`treq.text_content()`. :rtype: A `Deferred` that fires with a unicode string when the entire body has been received. """ return text_content(self.original, encoding) def history(self): """ Get a list of all responses (such as intermediate redirects) that ultimately ended in the current response. The responses are ordered chronologically. :returns: A `list` of :class:`~treq.response._Response` objects """ response = self history = [] while response.previousResponse is not None: history.append(_Response(response.previousResponse, self._cookiejar)) response = response.previousResponse history.reverse() return history def cookies(self): """ Get a copy of this response's cookies.
:rtype: :class:`requests.cookies.RequestsCookieJar` """ jar = cookiejar_from_dict({}) if self._cookiejar is not None: for cookie in self._cookiejar: jar.set_cookie(cookie) return jar treq-22.2.0/src/treq/test/__init__.py treq-22.2.0/src/treq/test/local_httpbin/__init__.py treq-22.2.0/src/treq/test/local_httpbin/child.py """ A local ``httpbin`` server to run integration tests against. This ensures tests do not depend on `httpbin `_.
""" from __future__ import print_function import argparse import datetime import sys import httpbin from twisted.internet.defer import Deferred, inlineCallbacks from twisted.internet.endpoints import TCP4ServerEndpoint, SSL4ServerEndpoint from twisted.internet.task import react from twisted.internet.ssl import (Certificate, CertificateOptions) from OpenSSL.crypto import PKey, X509 from twisted.python.threadpool import ThreadPool from twisted.web.server import Site from twisted.web.wsgi import WSGIResource from cryptography import x509 from cryptography.hazmat.backends import default_backend from cryptography.hazmat.primitives import hashes from cryptography.hazmat.primitives.asymmetric import rsa from cryptography.x509.oid import NameOID from cryptography.hazmat.primitives.serialization import Encoding from .shared import _HTTPBinDescription def _certificates_for_authority_and_server(service_identity, key_size=2048): """ Create a self-signed CA certificate and server certificate signed by the CA. :param service_identity: The identity (hostname) of the server. :type service_identity: :py:class:`unicode` :param key_size: (optional) The size of CA's and server's private RSA keys. Defaults to 2048 bits, which is the minimum allowed by OpenSSL Contexts at the default security level. :type key_size: :py:class:`int` :return: a 3-tuple of ``(certificate_authority_certificate, server_private_key, server_certificate)``. 
:rtype: :py:class:`tuple` of (:py:class:`sslverify.Certificate`, :py:class:`OpenSSL.crypto.PKey`, :py:class:`OpenSSL.crypto.X509`) """ common_name_for_ca = x509.Name( [x509.NameAttribute(NameOID.COMMON_NAME, u'Testing Example CA')] ) common_name_for_server = x509.Name( [x509.NameAttribute(NameOID.COMMON_NAME, u'Testing Example Server')] ) one_day = datetime.timedelta(1, 0, 0) private_key_for_ca = rsa.generate_private_key( public_exponent=65537, key_size=key_size, backend=default_backend() ) public_key_for_ca = private_key_for_ca.public_key() ca_certificate = ( x509.CertificateBuilder() .subject_name(common_name_for_ca) .issuer_name(common_name_for_ca) .not_valid_before(datetime.datetime.today() - one_day) .not_valid_after(datetime.datetime.today() + one_day) .serial_number(x509.random_serial_number()) .public_key(public_key_for_ca) .add_extension( x509.BasicConstraints(ca=True, path_length=9), critical=True, ) .sign( private_key=private_key_for_ca, algorithm=hashes.SHA256(), backend=default_backend() ) ) private_key_for_server = rsa.generate_private_key( public_exponent=65537, key_size=key_size, backend=default_backend() ) public_key_for_server = private_key_for_server.public_key() server_certificate = ( x509.CertificateBuilder() .subject_name(common_name_for_server) .issuer_name(common_name_for_ca) .not_valid_before(datetime.datetime.today() - one_day) .not_valid_after(datetime.datetime.today() + one_day) .serial_number(x509.random_serial_number()) .public_key(public_key_for_server) .add_extension( x509.BasicConstraints(ca=False, path_length=None), critical=True, ) .add_extension( x509.SubjectAlternativeName( [x509.DNSName(service_identity)] ), critical=True, ) .sign( private_key=private_key_for_ca, algorithm=hashes.SHA256(), backend=default_backend() ) ) ca_self_cert = Certificate.loadPEM( ca_certificate.public_bytes(Encoding.PEM) ) pkey = PKey.from_cryptography_key(private_key_for_server) x509_server_certificate = X509.from_cryptography(server_certificate) 
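# Optional sanity check (an editorial sketch, not part of the original module): before returning, verify that the server certificate above really was signed by the generated CA. This mirrors the verification done in test_child.py and assumes the same RSA / PKCS1v15 / SHA-256 parameters used above; `verify` raises InvalidSignature on failure. from cryptography.hazmat.primitives.asymmetric import padding as _padding ca_certificate.public_key().verify( server_certificate.signature, server_certificate.tbs_certificate_bytes, _padding.PKCS1v15(), server_certificate.signature_hash_algorithm, )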
return ca_self_cert, pkey, x509_server_certificate def _make_httpbin_site(reactor, threadpool_factory=ThreadPool): """ Return a :py:class:`Site` that hosts an ``httpbin`` WSGI application. :param reactor: The reactor. :param threadpool_factory: (optional) A callable that creates a :py:class:`ThreadPool`. :return: A :py:class:`Site` that hosts ``httpbin`` """ wsgi_threads = threadpool_factory() wsgi_threads.start() reactor.addSystemEventTrigger("before", "shutdown", wsgi_threads.stop) wsgi_resource = WSGIResource(reactor, wsgi_threads, httpbin.app) return Site(wsgi_resource) @inlineCallbacks def _serve_tls(reactor, host, port, site): """ Serve a site over TLS. :param reactor: The reactor. :param host: The host on which to listen. :type host: :py:class:`str` :param port: The port on which to listen. :type port: :py:class:`int` :param site: The :py:class:`Site` to serve. :return: A :py:class:`Deferred` that fires with a :py:class:`_HTTPBinDescription` """ ( ca_cert, private_key, certificate, ) = _certificates_for_authority_and_server(host) context_factory = CertificateOptions(privateKey=private_key, certificate=certificate) endpoint = SSL4ServerEndpoint(reactor, port, sslContextFactory=context_factory, interface=host) port = yield endpoint.listen(site) description = _HTTPBinDescription(host=host, port=port.getHost().port, cacert=ca_cert.dumpPEM().decode('ascii')) return description @inlineCallbacks def _serve_tcp(reactor, host, port, site): """ Serve a site over plain TCP. :param reactor: The reactor. :param host: The host on which to listen. :type host: :py:class:`str` :param port: The port on which to listen.
:type port: :py:class:`int` :return: A :py:class:`Deferred` that fires with a :py:class:`_HTTPBinDescription` """ endpoint = TCP4ServerEndpoint(reactor, port, interface=host) port = yield endpoint.listen(site) description = _HTTPBinDescription(host=host, port=port.getHost().port) return description def _output_process_description(description, stdout=sys.stdout): """ Write a process description to standard out. :param description: The process description. :type description: :py:class:`_HTTPBinDescription` :param stdout: (optional) Standard out. """ stdout.buffer.write(description.to_json_bytes() + b'\n') stdout.buffer.flush() def _forever_httpbin(reactor, argv, _make_httpbin_site=_make_httpbin_site, _serve_tcp=_serve_tcp, _serve_tls=_serve_tls, _output_process_description=_output_process_description): """ Run ``httpbin`` forever. :param reactor: The Twisted reactor. :param argv: The arguments with which the script was run. :type argv: :py:class:`list` of :py:class:`str` :return: a :py:class:`Deferred` that never fires. """ parser = argparse.ArgumentParser( description=""" Run httpbin forever. This writes a JSON object to standard out. The host and port properties contain the host and port on which httpbin listens. When run with HTTPS, the cacert property contains the PEM-encoded CA certificate that clients must trust.
""" ) parser.add_argument("--https", help="Serve HTTPS", action="store_const", dest='serve', const=_serve_tls, default=_serve_tcp) parser.add_argument("--host", help="The host on which the server will listen.", type=str, default="localhost") parser.add_argument("--port", help="The on which the server will listen.", type=int, default=0) arguments = parser.parse_args(argv) site = _make_httpbin_site(reactor) description_deferred = arguments.serve(reactor, arguments.host, arguments.port, site) description_deferred.addCallback(_output_process_description) description_deferred.addCallback(lambda _: Deferred()) return description_deferred if __name__ == '__main__': react(_forever_httpbin, (sys.argv[1:],)) ././@PaxHeader0000000000000000000000000000002600000000000011453 xustar000000000000000022 mtime=1611529172.0 treq-22.2.0/src/treq/test/local_httpbin/parent.py0000664000175000017500000001231100000000000021417 0ustar00twmtwm00000000000000""" Spawn and monitor an ``httpbin`` child process. """ import attr import signal import sys import os from twisted.protocols import basic, policies from twisted.internet import protocol, endpoints, error from twisted.internet.defer import Deferred, succeed from .shared import _HTTPBinDescription class _HTTPBinServerProcessProtocol(basic.LineOnlyReceiver): """ Manage the lifecycle of an ``httpbin`` process. """ delimiter = b'\n' def __init__(self, all_data_received, terminated): """ Manage the lifecycle of an ``httpbin`` process. :param all_data_received: A Deferred that will be called back with an :py:class:`_HTTPBinDescription` object :type all_data_received: :py:class:`Deferred` :param terminated: A Deferred that will be called back when the process has ended. 
:type terminated: :py:class:`Deferred` """ self._all_data_received = all_data_received self._received = False self._terminated = terminated def lineReceived(self, line): if self._received: raise RuntimeError("Unexpected line: {!r}".format(line)) description = _HTTPBinDescription.from_json_bytes(line) self._received = True # Remove readers and writers that leave the reactor in a dirty # state after a test. self.transport.closeStdin() self.transport.closeStdout() self.transport.closeStderr() self._all_data_received.callback(description) def connectionLost(self, reason): if not self._received: self._all_data_received.errback(reason) self._terminated.errback(reason) @attr.s class _HTTPBinProcess: """ Manage an ``httpbin`` server process. :ivar _all_data_received: See :py:attr:`_HTTPBinServerProcessProtocol.all_data_received` :ivar _terminated: See :py:attr:`_HTTPBinServerProcessProtocol.terminated` """ _https = attr.ib() _error_log_path = attr.ib(default='httpbin-server-error.log') _all_data_received = attr.ib(init=False, default=attr.Factory(Deferred)) _terminated = attr.ib(init=False, default=attr.Factory(Deferred)) _process = attr.ib(init=False, default=None) _process_description = attr.ib(init=False, default=None) _open = staticmethod(open) def _spawn_httpbin_process(self, reactor): """ Spawn an ``httpbin`` process, returning a :py:class:`Deferred` that fires with the process transport and result. """ server = _HTTPBinServerProcessProtocol( all_data_received=self._all_data_received, terminated=self._terminated ) argv = [ sys.executable, '-m', 'treq.test.local_httpbin.child', ] if self._https: argv.append('--https') with self._open(self._error_log_path, 'wb') as error_log: endpoint = endpoints.ProcessEndpoint( reactor, sys.executable, argv, env=os.environ, childFDs={ 1: 'r', 2: error_log.fileno(), }, ) # Processes are spawned synchronously. spawned = endpoint.connect( # ProtocolWrapper, WrappingFactory's protocol, has a # disconnecting attribute. 
See # https://twistedmatrix.com/trac/ticket/6606 policies.WrappingFactory( protocol.Factory.forProtocol(lambda: server), ), ) def wait_for_protocol(connected_protocol): process = connected_protocol.transport return self._all_data_received.addCallback( return_result_and_process, process, ) def return_result_and_process(description, process): return description, process return spawned.addCallback(wait_for_protocol) def server_description(self, reactor): """ Return a :py:class:`Deferred` that fires with the process' :py:class:`_HTTPBinDescription`, spawning the process if necessary. """ if self._process is None: ready = self._spawn_httpbin_process(reactor) def store_and_schedule_termination(description_and_process): description, process = description_and_process self._process = process self._process_description = description reactor.addSystemEventTrigger("before", "shutdown", self.kill) return self._process_description return ready.addCallback(store_and_schedule_termination) else: return succeed(self._process_description) def kill(self): """ Kill the ``httpbin`` process. """ if not self._process: return self._process.signalProcess("KILL") def suppress_process_terminated(exit_failure): exit_failure.trap(error.ProcessTerminated) if exit_failure.value.signal != signal.SIGKILL: return exit_failure return self._terminated.addErrback(suppress_process_terminated) treq-22.2.0/src/treq/test/local_httpbin/shared.py """ Things shared between the ``httpbin`` child and parent processes. """ import attr import json @attr.s class _HTTPBinDescription: """ Describe an ``httpbin`` process. :param host: The host on which the process listens. :type host: :py:class:`str` :param port: The port on which the process listens.
:type port: :py:class:`int` :param cacert: (optional) The PEM-encoded certificate authority's certificate. The calling process' treq must trust this when running HTTPS tests. :type cacert: :py:class:`str` or :py:class:`None` """ host = attr.ib() port = attr.ib() cacert = attr.ib(default=None) @classmethod def from_json_bytes(cls, json_data): """ Deserialize an instance from JSON bytes. """ return cls(**json.loads(json_data.decode('ascii'))) def to_json_bytes(self): """ Serialize an instance to JSON bytes. """ return json.dumps(attr.asdict(self), sort_keys=True).encode('ascii') treq-22.2.0/src/treq/test/local_httpbin/test/__init__.py treq-22.2.0/src/treq/test/local_httpbin/test/test_child.py """ Tests for :py:mod:`treq.test.local_httpbin.child` """ import attr from cryptography.hazmat.primitives.asymmetric import padding import functools import io from twisted.trial.unittest import SynchronousTestCase try: from twisted.internet.testing import MemoryReactor except ImportError: from twisted.test.proto_helpers import MemoryReactor from twisted.internet import defer from treq.test.util import skip_on_windows_because_of_199 from twisted.web.server import Site from twisted.web.resource import Resource from service_identity.cryptography import verify_certificate_hostname from ..
import child, shared skip = skip_on_windows_because_of_199() class CertificatesForAuthorityAndServerTests(SynchronousTestCase): """ Tests for :py:func:`child._certificates_for_authority_and_server` """ def setUp(self): self.hostname = u".example.org" ( self.ca_cert, self.server_private_key, self.server_x509_cert, ) = child._certificates_for_authority_and_server( self.hostname, ) def test_pkey_x509_paired(self): """ The returned private key corresponds to the X.509 certificate's public key. """ server_private_key = self.server_private_key.to_cryptography_key() server_x509_cert = self.server_x509_cert.to_cryptography() plaintext = b'plaintext' ciphertext = server_x509_cert.public_key().encrypt( plaintext, padding.PKCS1v15(), ) self.assertEqual( server_private_key.decrypt( ciphertext, padding.PKCS1v15(), ), plaintext, ) def test_ca_signed_x509(self): """ The returned X.509 certificate was signed by the returned certificate authority's certificate. """ ca_cert = self.ca_cert.original.to_cryptography() server_x509_cert = self.server_x509_cert.to_cryptography() # Raises an InvalidSignature exception on failure. ca_cert.public_key().verify( server_x509_cert.signature, server_x509_cert.tbs_certificate_bytes, padding.PKCS1v15(), server_x509_cert.signature_hash_algorithm ) def test_x509_matches_hostname(self): """ The returned X.509 certificate is valid for the hostname. """ verify_certificate_hostname( self.server_x509_cert.to_cryptography(), self.hostname, ) @attr.s class FakeThreadPoolState(object): """ State for :py:class:`FakeThreadPool`. 
""" init_call_count = attr.ib(default=0) start_call_count = attr.ib(default=0) @attr.s class FakeThreadPool(object): """ A fake :py:class:`twisted.python.threadpool.ThreadPool` """ _state = attr.ib() def init(self): self._state.init_call_count += 1 return self def start(self): """ See :py:meth:`twisted.python.threadpool.ThreadPool.start` """ self._state.start_call_count += 1 def stop(self): """ See :py:meth:`twisted.python.threadpool.ThreadPool.stop` """ class MakeHTTPBinSiteTests(SynchronousTestCase): """ Tests for :py:func:`_make_httpbin_site`. """ def setUp(self): self.fake_threadpool_state = FakeThreadPoolState() self.fake_threadpool = FakeThreadPool(self.fake_threadpool_state) self.reactor = MemoryReactor() def test_threadpool_management(self): """ A thread pool is created that will be shut down when the reactor shuts down. """ child._make_httpbin_site( self.reactor, threadpool_factory=self.fake_threadpool.init, ) self.assertEqual(self.fake_threadpool_state.init_call_count, 1) self.assertEqual(self.fake_threadpool_state.start_call_count, 1) self.assertEqual(len(self.reactor.triggers['before']['shutdown']), 1) [(stop, _, _)] = self.reactor.triggers['before']['shutdown'] self.assertEqual(stop, self.fake_threadpool.stop) class ServeTLSTests(SynchronousTestCase): """ Tests for :py:func:`_serve_tls` """ def setUp(self): self.reactor = MemoryReactor() self.site = Site(Resource()) def test_tls_listener_matches_description(self): """ An SSL listener is established on the requested host and port, and the host, port, and CA certificate are returned in its description. 
""" expected_host = 'host' expected_port = 123 description_deferred = child._serve_tls( self.reactor, host=expected_host, port=expected_port, site=self.site, ) self.assertEqual(len(self.reactor.sslServers), 1) [ (actual_port, actual_site, _, _, actual_host) ] = self.reactor.sslServers self.assertEqual(actual_host, expected_host) self.assertEqual(actual_port, expected_port) self.assertIs(actual_site, self.site) description = self.successResultOf(description_deferred) self.assertEqual(description.host, expected_host) self.assertEqual(description.port, expected_port) self.assertTrue(description.cacert) class ServeTCPTests(SynchronousTestCase): """ Tests for :py:func:`_serve_tcp` """ def setUp(self): self.reactor = MemoryReactor() self.site = Site(Resource) def test_tcp_listener_matches_description(self): """ A TCP listeneris established on the request host and port, and the host and port are returned in its description. """ expected_host = 'host' expected_port = 123 description_deferred = child._serve_tcp( self.reactor, host=expected_host, port=expected_port, site=self.site, ) self.assertEqual(len(self.reactor.tcpServers), 1) [ (actual_port, actual_site, _, actual_host) ] = self.reactor.tcpServers self.assertEqual(actual_host, expected_host) self.assertEqual(actual_port, expected_port) self.assertIs(actual_site, self.site) description = self.successResultOf(description_deferred) self.assertEqual(description.host, expected_host) self.assertEqual(description.port, expected_port) self.assertFalse(description.cacert) @attr.s class FlushableBytesIOState(object): """ State for :py:class:`FlushableBytesIO` """ bio = attr.ib(default=attr.Factory(io.BytesIO)) flush_count = attr.ib(default=0) @attr.s class FlushableBytesIO(object): """ A :py:class:`io.BytesIO` wrapper that records flushes. 
""" _state = attr.ib() def write(self, data): self._state.bio.write(data) def flush(self): self._state.flush_count += 1 @attr.s class BufferedStandardOut(object): """ A standard out that whose ``buffer`` is a :py:class:`FlushableBytesIO` instance. """ buffer = attr.ib() class OutputProcessDescriptionTests(SynchronousTestCase): """ Tests for :py:func:`_output_process_description` """ def setUp(self): self.stdout_state = FlushableBytesIOState() self.stdout = BufferedStandardOut(FlushableBytesIO(self.stdout_state)) def test_description_written(self): """ An :py:class:`shared._HTTPBinDescription` is written to standard out and the line flushed. """ description = shared._HTTPBinDescription(host="host", port=123, cacert="cacert") child._output_process_description(description, self.stdout) written = self.stdout_state.bio.getvalue() self.assertEqual( written, b'{"cacert": "cacert", "host": "host", "port": 123}' + b'\n', ) self.assertEqual(self.stdout_state.flush_count, 1) class ForeverHTTPBinTests(SynchronousTestCase): """ Tests for :py:func:`_forever_httpbin` """ def setUp(self): self.make_httpbin_site_returns = Site(Resource()) self.serve_tcp_calls = [] self.serve_tcp_returns = defer.Deferred() self.serve_tls_calls = [] self.serve_tls_returns = defer.Deferred() self.output_process_description_calls = [] self.output_process_description_returns = None self.reactor = MemoryReactor() self.forever_httpbin = functools.partial( child._forever_httpbin, _make_httpbin_site=self.make_httpbin_site, _serve_tcp=self.serve_tcp, _serve_tls=self.serve_tls, _output_process_description=self.output_process_description, ) def make_httpbin_site(self, reactor, *args, **kwargs): """ A fake :py:func:`child._make_httpbin_site`. """ return self.make_httpbin_site_returns def serve_tcp(self, reactor, host, port, site): """ A fake :py:func:`child._serve_tcp`. 
""" self.serve_tcp_calls.append((reactor, host, port, site)) return self.serve_tcp_returns def serve_tls(self, reactor, host, port, site): """ A fake :py:func:`child._serve_tls`. """ self.serve_tls_calls.append((reactor, host, port, site)) return self.serve_tls_returns def output_process_description(self, description, *args, **kwargs): """ A fake :py:func:`child._output_process_description` """ self.output_process_description_calls.append(description) return self.output_process_description_returns def assertDescriptionAndDeferred(self, description_deferred, forever_deferred): """ Assert that firing ``description_deferred`` outputs the description but that ``forever_deferred`` never fires. """ description_deferred.callback("description") self.assertEqual(self.output_process_description_calls, ["description"]) self.assertNoResult(forever_deferred) def test_default_arguments(self): """ The default command line arguments host ``httpbin`` on ``localhost`` and a randomly-assigned port, returning a :py:class:`Deferred` that never fires. """ deferred = self.forever_httpbin(self.reactor, []) self.assertEqual( self.serve_tcp_calls, [ (self.reactor, 'localhost', 0, self.make_httpbin_site_returns) ] ) self.assertDescriptionAndDeferred( description_deferred=self.serve_tcp_returns, forever_deferred=deferred, ) def test_https(self): """ The ``--https`` command line argument serves ``httpbin`` over HTTPS, returning a :py:class:`Deferred` that never fires. """ deferred = self.forever_httpbin(self.reactor, ['--https']) self.assertEqual( self.serve_tls_calls, [ (self.reactor, 'localhost', 0, self.make_httpbin_site_returns) ] ) self.assertDescriptionAndDeferred( description_deferred=self.serve_tls_returns, forever_deferred=deferred, ) def test_host(self): """ The ``--host`` command line argument serves ``httpbin`` on provided host, returning a :py:class:`Deferred` that never fires. 
""" deferred = self.forever_httpbin(self.reactor, ['--host', 'example.org']) self.assertEqual( self.serve_tcp_calls, [ ( self.reactor, 'example.org', 0, self.make_httpbin_site_returns, ) ] ) self.assertDescriptionAndDeferred( description_deferred=self.serve_tcp_returns, forever_deferred=deferred, ) def test_port(self): """ The ``--port`` command line argument serves ``httpbin`` on the provided port, returning a :py:class:`Deferred` that never fires. """ deferred = self.forever_httpbin(self.reactor, ['--port', '91']) self.assertEqual( self.serve_tcp_calls, [ (self.reactor, 'localhost', 91, self.make_httpbin_site_returns) ] ) self.assertDescriptionAndDeferred( description_deferred=self.serve_tcp_returns, forever_deferred=deferred, ) ././@PaxHeader0000000000000000000000000000002600000000000011453 xustar000000000000000022 mtime=1584214397.0 treq-22.2.0/src/treq/test/local_httpbin/test/test_parent.py0000644000175000017500000003675000000000000023450 0ustar00twmtwm00000000000000""" Tests for :py:mod:`treq.test.local_httpbin.parent` """ import attr import json import signal import sys from twisted.internet import defer from twisted.internet.interfaces import (IProcessTransport, IReactorCore, IReactorProcess) from twisted.python.failure import Failure from treq.test.util import skip_on_windows_because_of_199 from twisted.internet.error import ProcessTerminated, ConnectionDone try: from twisted.internet.testing import MemoryReactor, StringTransport except ImportError: from twisted.test.proto_helpers import MemoryReactor, StringTransport from twisted.trial.unittest import SynchronousTestCase from zope.interface import implementer, verify from .. import parent, shared skip = skip_on_windows_because_of_199() @attr.s class FakeProcessTransportState(object): """ State for :py:class:`FakeProcessTransport`. 
""" standard_in_closed = attr.ib(default=False) standard_out_closed = attr.ib(default=False) standard_error_closed = attr.ib(default=False) signals = attr.ib(default=attr.Factory(list)) @implementer(IProcessTransport) @attr.s class FakeProcessTransport(StringTransport, object): """ A fake process transport. """ pid = 1234 _state = attr.ib() def closeStdin(self): """ Close standard in. """ self._state.standard_in_closed = True def closeStdout(self): """ Close standard out. """ self._state.standard_out_closed = True def closeStderr(self): """ Close standard error. """ self._state.standard_error_closed = True def closeChildFD(self, descriptor): """ Close a child's file descriptor. :param descriptor: See :py:class:`IProcessProtocol.closeChildFD` """ def writeToChild(self, childFD, data): """ Write data to a child's file descriptor. :param childFD: See :py:class:`IProcessProtocol.writeToChild` :param data: See :py:class:`IProcessProtocol.writeToChild` """ def signalProcess(self, signalID): """ Send a signal. :param signalID: See :py:class:`IProcessProtocol.signalProcess` """ self._state.signals.append(signalID) class FakeProcessTransportTests(SynchronousTestCase): """ Tests for :py:class:`FakeProcessTransport`. """ def setUp(self): self.state = FakeProcessTransportState() self.transport = FakeProcessTransport(self.state) def test_provides_interface(self): """ Instances provide :py:class:`IProcessTransport`. """ verify.verifyObject(IProcessTransport, self.transport) def test_closeStdin(self): """ Closing standard in updates the state instance. """ self.assertFalse(self.state.standard_in_closed) self.transport.closeStdin() self.assertTrue(self.state.standard_in_closed) def test_closeStdout(self): """ Closing standard out updates the state instance. """ self.assertFalse(self.state.standard_out_closed) self.transport.closeStdout() self.assertTrue(self.state.standard_out_closed) def test_closeStderr(self): """ Closing standard error updates the state instance. 
""" self.assertFalse(self.state.standard_error_closed) self.transport.closeStderr() self.assertTrue(self.state.standard_error_closed) class HTTPServerProcessProtocolTests(SynchronousTestCase): """ Tests for :py:class:`parent._HTTPBinServerProcessProtocol` """ def setUp(self): self.transport_state = FakeProcessTransportState() self.transport = FakeProcessTransport(self.transport_state) self.all_data_received = defer.Deferred() self.terminated = defer.Deferred() self.protocol = parent._HTTPBinServerProcessProtocol( all_data_received=self.all_data_received, terminated=self.terminated, ) self.protocol.makeConnection(self.transport) def assertStandardInputAndOutputClosed(self): """ The transport's standard in, out, and error are closed. """ self.assertTrue(self.transport_state.standard_in_closed) self.assertTrue(self.transport_state.standard_out_closed) self.assertTrue(self.transport_state.standard_error_closed) def test_receive_http_description(self): """ Receiving a serialized :py:class:`_HTTPBinDescription` fires the ``all_data_received`` :py:class:`Deferred`. """ self.assertNoResult(self.all_data_received) description = shared._HTTPBinDescription("host", 1234, "cert") self.protocol.lineReceived( json.dumps(attr.asdict(description)).encode('ascii') ) self.assertStandardInputAndOutputClosed() self.assertEqual(self.successResultOf(self.all_data_received), description) def test_receive_unexpected_line(self): """ Receiving a line after the description synchronously raises in :py:class:`RuntimeError` """ self.test_receive_http_description() with self.assertRaises(RuntimeError): self.protocol.lineReceived(b"unexpected") def test_connection_lost_before_receiving_data(self): """ If the process terminates before its data is received, both ``all_data_received`` and ``terminated`` errback. 
""" self.assertNoResult(self.all_data_received) self.protocol.connectionLost(Failure(ConnectionDone("done"))) self.assertIsInstance( self.failureResultOf(self.all_data_received).value, ConnectionDone, ) self.assertIsInstance( self.failureResultOf(self.terminated).value, ConnectionDone, ) def test_connection_lost(self): """ ``terminated`` fires when the connection is lost. """ self.test_receive_http_description() self.protocol.connectionLost(Failure(ConnectionDone("done"))) self.assertIsInstance( self.failureResultOf(self.terminated).value, ConnectionDone, ) @attr.s class SpawnedProcess(object): """ A call to :py:class:`MemoryProcessReactor.spawnProcess`. """ process_protocol = attr.ib() executable = attr.ib() args = attr.ib() env = attr.ib() path = attr.ib() uid = attr.ib() gid = attr.ib() use_pty = attr.ib() child_fds = attr.ib() returned_process_transport = attr.ib() returned_process_transport_state = attr.ib() def send_stdout(self, data): """ Send data from the process' standard out. :param data: The standard out data. """ self.process_protocol.childDataReceived(1, data) def end_process(self, reason): """ End the process. :param reason: The reason. :type reason: :py:class:`Failure` """ self.process_protocol.processEnded(reason) @implementer(IReactorCore, IReactorProcess) class MemoryProcessReactor(MemoryReactor): """ A fake :py:class:`IReactorProcess` and :py:class:`IReactorCore` provider to be used in tests. """ def __init__(self): MemoryReactor.__init__(self) self.spawnedProcesses = [] def spawnProcess(self, processProtocol, executable, args=(), env={}, path=None, uid=None, gid=None, usePTY=0, childFDs=None): """ :ivar process_protocol: Stores the protocol passed to the reactor. :return: An L{IProcessTransport} provider. 
""" transport_state = FakeProcessTransportState() transport = FakeProcessTransport(transport_state) self.spawnedProcesses.append(SpawnedProcess( process_protocol=processProtocol, executable=executable, args=args, env=env, path=path, uid=uid, gid=gid, use_pty=usePTY, child_fds=childFDs, returned_process_transport=transport, returned_process_transport_state=transport_state, )) processProtocol.makeConnection(transport) return transport class MemoryProcessReactorTests(SynchronousTestCase): """ Tests for :py:class:`MemoryProcessReactor` """ def test_provides_interfaces(self): """ :py:class:`MemoryProcessReactor` instances provide :py:class:`IReactorCore` and :py:class:`IReactorProcess`. """ reactor = MemoryProcessReactor() verify.verifyObject(IReactorCore, reactor) verify.verifyObject(IReactorProcess, reactor) class HTTPBinProcessTests(SynchronousTestCase): """ Tests for :py:class:`_HTTPBinProcesss`. """ def setUp(self): self.reactor = MemoryProcessReactor() self.opened_file_descriptors = [] def fd_recording_open(self, *args, **kwargs): """ Record the file descriptors of files opened by :py:func:`open`. :return: A file object. """ fobj = open(*args, **kwargs) self.opened_file_descriptors.append(fobj.fileno()) return fobj def spawned_process(self): """ Assert that ``self.reactor`` has spawned only one process and return the :py:class:`SpawnedProcess` representing it. :return: The :py:class:`SpawnedProcess`. """ self.assertEqual(len(self.reactor.spawnedProcesses), 1) return self.reactor.spawnedProcesses[0] def assertSpawnAndDescription(self, process, args, description): """ Assert that spawning the given process invokes the command with the given args, that standard error is redirected, that it is killed at reactor shutdown, and that it returns a description that matches the provided one. :param process: :py:class:`_HTTPBinProcesss` instance. :param args: The arguments with which to execute the child process. 
:type args: :py:class:`tuple` of :py:class:`str` :param description: The expected :py:class:`_HTTPBinDescription`. :return: The returned :py:class:`_HTTPBinDescription` """ process._open = self.fd_recording_open description_deferred = process.server_description(self.reactor) spawned_process = self.spawned_process() self.assertEqual(spawned_process.args, args) self.assertEqual(len(self.opened_file_descriptors), 1) [error_log_fd] = self.opened_file_descriptors self.assertEqual(spawned_process.child_fds.get(2), error_log_fd) self.assertNoResult(description_deferred) spawned_process.send_stdout(description.to_json_bytes() + b'\n') before_shutdown = self.reactor.triggers["before"]["shutdown"] self.assertEqual(len(before_shutdown), 1) [(before_shutdown_function, _, _)] = before_shutdown self.assertEqual(before_shutdown_function, process.kill) self.assertEqual(self.successResultOf(description_deferred), description) def test_server_description_spawns_process(self): """ :py:class:`_HTTPBinProcess.server_description` spawns an ``httpbin`` child process that it monitors with :py:class:`_HTTPBinServerProcessProtocol`, and redirects its standard error to a log file. """ httpbin_process = parent._HTTPBinProcess(https=False) description = shared._HTTPBinDescription(host="host", port=1234) self.assertSpawnAndDescription( httpbin_process, [ sys.executable, '-m', 'treq.test.local_httpbin.child' ], description) def test_server_description_spawns_process_https(self): """ :py:class:`_HTTPBinProcess.server_description` spawns an ``httpbin`` child process that listens over HTTPS, that it monitors with :py:class:`_HTTPBinServerProcessProtocol`, and redirects the process' standard error to a log file. 
""" httpbin_process = parent._HTTPBinProcess(https=True) description = shared._HTTPBinDescription(host="host", port=1234, cacert="cert") self.assertSpawnAndDescription( httpbin_process, [ sys.executable, '-m', 'treq.test.local_httpbin.child', '--https', ], description) def test_server_description_caches_description(self): """ :py:class:`_HTTPBinProcess.server_description` spawns an ``httpbin`` child process only once, after which it returns a cached :py:class:`_HTTPBinDescription`. """ httpbin_process = parent._HTTPBinProcess(https=False) description_deferred = httpbin_process.server_description(self.reactor) self.spawned_process().send_stdout( shared._HTTPBinDescription(host="host", port=1234).to_json_bytes() + b'\n' ) description = self.successResultOf(description_deferred) cached_description_deferred = httpbin_process.server_description( self.reactor, ) cached_description = self.successResultOf(cached_description_deferred) self.assertIs(description, cached_description) def test_kill_before_spawn(self): """ Killing a process before it has been spawned has no effect. """ parent._HTTPBinProcess(https=False).kill() def test_kill(self): """ Kill terminates the process as quickly as the platform allows, and the termination failure is suppressed. 
""" httpbin_process = parent._HTTPBinProcess(https=False) httpbin_process.server_description(self.reactor) spawned_process = self.spawned_process() spawned_process.send_stdout( shared._HTTPBinDescription(host="host", port=1234).to_json_bytes() + b'\n' ) termination_deferred = httpbin_process.kill() self.assertEqual( spawned_process.returned_process_transport_state.signals, ['KILL'], ) spawned_process.end_process( Failure(ProcessTerminated(1, signal=signal.SIGKILL)), ) self.successResultOf(termination_deferred) def test_kill_unexpected_exit(self): """ The :py:class:`Deferred` returned by :py:meth:`_HTTPBinProcess.kill` errbacks with the failure when it is not :py:class:`ProcessTerminated`, or its signal does not match the expected signal. """ for error in [ProcessTerminated(1, signal=signal.SIGIO), ConnectionDone("Bye")]: httpbin_process = parent._HTTPBinProcess(https=False) httpbin_process.server_description(self.reactor) spawned_process = self.reactor.spawnedProcesses[-1] spawned_process.send_stdout( shared._HTTPBinDescription(host="host", port=1234).to_json_bytes() + b'\n' ) termination_deferred = httpbin_process.kill() spawned_process.end_process(Failure(error)) self.assertIs(self.failureResultOf(termination_deferred).value, error) ././@PaxHeader0000000000000000000000000000002600000000000011453 xustar000000000000000022 mtime=1584213863.0 treq-22.2.0/src/treq/test/local_httpbin/test/test_shared.py0000644000175000017500000000243100000000000023412 0ustar00twmtwm00000000000000""" Tests for :py:mod:`treq.test.local_httpbin.shared` """ from twisted.trial import unittest from .. 
import shared


class HTTPBinDescriptionTests(unittest.SynchronousTestCase):
    """
    Tests for :py:class:`shared._HTTPBinDescription`
    """

    def test_round_trip(self):
        """
        :py:class:`shared._HTTPBinDescription.from_json_bytes` can
        deserialize the output of
        :py:class:`shared._HTTPBinDescription.to_json_bytes`
        """
        original = shared._HTTPBinDescription(host="host", port=123)

        round_tripped = shared._HTTPBinDescription.from_json_bytes(
            original.to_json_bytes(),
        )

        self.assertEqual(original, round_tripped)

    def test_round_trip_cacert(self):
        """
        :py:class:`shared._HTTPBinDescription.from_json_bytes` can
        deserialize the output of
        :py:class:`shared._HTTPBinDescription.to_json_bytes` when
        ``cacert`` is set.
        """
        original = shared._HTTPBinDescription(host="host", port=123,
                                              cacert='cacert')

        round_tripped = shared._HTTPBinDescription.from_json_bytes(
            original.to_json_bytes(),
        )

        self.assertEqual(original, round_tripped)

treq-22.2.0/src/treq/test/test_agentspy.py

# Copyright (c) The treq Authors.
# See LICENSE for details.
from io import BytesIO

from twisted.trial.unittest import SynchronousTestCase
from twisted.web.client import FileBodyProducer
from twisted.web.http_headers import Headers
from twisted.web.iweb import IAgent

from treq._agentspy import RequestRecord, agent_spy


class APISpyTests(SynchronousTestCase):
    """
    The agent_spy API provides an agent that records each request made
    to it.
    """

    def test_provides_iagent(self):
        """
        The agent returned by agent_spy() provides the IAgent interface.
        """
        agent, _ = agent_spy()

        self.assertTrue(IAgent.providedBy(agent))

    def test_records(self):
        """
        Each request made with the agent is recorded.
""" agent, requests = agent_spy() body = FileBodyProducer(BytesIO(b"...")) d1 = agent.request(b"GET", b"https://foo") d2 = agent.request(b"POST", b"http://bar", Headers({})) d3 = agent.request(b"PUT", b"https://baz", None, bodyProducer=body) self.assertEqual( requests, [ RequestRecord(b"GET", b"https://foo", None, None, d1), RequestRecord(b"POST", b"http://bar", Headers({}), None, d2), RequestRecord(b"PUT", b"https://baz", None, body, d3), ], ) def test_record_attributes(self): """ Each parameter passed to `request` is available as an attribute of the RequestRecord. Additionally, the deferred returned by the call is available. """ agent, requests = agent_spy() headers = Headers() body = FileBodyProducer(BytesIO(b"...")) deferred = agent.request(b"method", b"uri", headers=headers, bodyProducer=body) [rr] = requests self.assertIs(rr.method, b"method") self.assertIs(rr.uri, b"uri") self.assertIs(rr.headers, headers) self.assertIs(rr.bodyProducer, body) self.assertIs(rr.deferred, deferred) def test_type_validation(self): """ The request method enforces correctness by raising TypeError when passed parameters of the wrong type. 
""" agent, _ = agent_spy() self.assertRaises(TypeError, agent.request, u"method not bytes", b"uri") self.assertRaises(TypeError, agent.request, b"method", u"uri not bytes") self.assertRaises( TypeError, agent.request, b"method", b"uri", {"not": "headers"} ) self.assertRaises( TypeError, agent.request, b"method", b"uri", None, b"not ibodyproducer" ) ././@PaxHeader0000000000000000000000000000002600000000000011453 xustar000000000000000022 mtime=1613279689.0 treq-22.2.0/src/treq/test/test_api.py0000664000175000017500000001677600000000000017137 0ustar00twmtwm00000000000000from __future__ import absolute_import, division from twisted.web.iweb import IAgent from twisted.web.client import HTTPConnectionPool from twisted.trial.unittest import TestCase from twisted.internet import defer from zope.interface import implementer import treq from treq.api import default_reactor, default_pool, set_global_pool, get_global_pool try: from twisted.internet.testing import MemoryReactorClock except ImportError: from twisted.test.proto_helpers import MemoryReactorClock class SyntacticAbominationHTTPConnectionPool: """ A HTTP connection pool that always fails to return a connection, but counts the number of requests made. """ requests = 0 def getConnection(self, key, endpoint): """ Count each request, then fail with `IndentationError`. """ self.requests += 1 return defer.fail(TabError()) class TreqAPITests(TestCase): def test_default_pool(self): """ The module-level API uses the global connection pool by default. """ pool = SyntacticAbominationHTTPConnectionPool() set_global_pool(pool) d = treq.get("http://test.com") self.assertEqual(pool.requests, 1) self.failureResultOf(d, TabError) def test_cached_pool(self): """ The first use of the module-level API populates the global connection pool, which is used for all subsequent requests. 
""" pool = SyntacticAbominationHTTPConnectionPool() self.patch(treq.api, "HTTPConnectionPool", lambda reactor, persistent: pool) self.failureResultOf(treq.head("http://test.com"), TabError) self.failureResultOf(treq.get("http://test.com"), TabError) self.failureResultOf(treq.post("http://test.com"), TabError) self.failureResultOf(treq.put("http://test.com"), TabError) self.failureResultOf(treq.delete("http://test.com"), TabError) self.failureResultOf(treq.request("OPTIONS", "http://test.com"), TabError) self.assertEqual(pool.requests, 6) def test_custom_pool(self): """ `treq.post()` accepts a *pool* argument to use for the request. The global pool is unaffected. """ pool = SyntacticAbominationHTTPConnectionPool() d = treq.post("http://foo", data=b"bar", pool=pool) self.assertEqual(pool.requests, 1) self.failureResultOf(d, TabError) self.assertIsNot(pool, get_global_pool()) def test_custom_agent(self): """ A custom Agent is used if specified. """ @implementer(IAgent) class CounterAgent: requests = 0 def request(self, method, uri, headers=None, bodyProducer=None): self.requests += 1 return defer.Deferred() custom_agent = CounterAgent() d = treq.get("https://www.example.org/", agent=custom_agent) self.assertNoResult(d) self.assertEqual(1, custom_agent.requests) def test_request_invalid_param(self): """ `treq.request()` warns that it ignores unknown keyword arguments, but this is deprecated. This test verifies that stacklevel is set appropriately when issuing the warning. """ with self.assertRaises(TypeError) as c: treq.request( "GET", "https://foo.bar", invalid=True, pool=SyntacticAbominationHTTPConnectionPool(), ) self.assertIn("invalid", str(c.exception)) def test_post_json_with_data(self): """ `treq.post()` warns that mixing *data* and *json* is deprecated. This test verifies that stacklevel is set appropriately when issuing the warning. 
""" self.failureResultOf( treq.post( "https://test.example/", data={"hello": "world"}, json={"goodnight": "moon"}, pool=SyntacticAbominationHTTPConnectionPool(), ) ) [w] = self.flushWarnings([self.test_post_json_with_data]) self.assertEqual(DeprecationWarning, w["category"]) self.assertEqual( ( "Argument 'json' will be ignored because 'data' was also passed." " This will raise TypeError in the next treq release." ), w["message"], ) class DefaultReactorTests(TestCase): """ Test `treq.api.default_reactor()` """ def test_passes_reactor(self): """ `default_reactor()` returns any reactor passed. """ reactor = MemoryReactorClock() self.assertIs(default_reactor(reactor), reactor) def test_uses_default_reactor(self): """ `default_reactor()` returns the global reactor when passed ``None``. """ from twisted.internet import reactor self.assertEqual(default_reactor(None), reactor) class DefaultPoolTests(TestCase): """ Test `treq.api.default_pool`. """ def setUp(self): set_global_pool(None) self.reactor = MemoryReactorClock() def test_persistent_false(self): """ When *persistent=False* is passed a non-persistent pool is created. """ pool = default_pool(self.reactor, None, False) self.assertTrue(isinstance(pool, HTTPConnectionPool)) self.assertFalse(pool.persistent) def test_persistent_false_not_stored(self): """ When *persistent=False* is passed the resulting pool is not stored as the global pool. """ pool = default_pool(self.reactor, None, persistent=False) self.assertIsNot(pool, get_global_pool()) def test_persistent_false_new(self): """ When *persistent=False* is passed a new pool is returned each time. """ pool1 = default_pool(self.reactor, None, persistent=False) pool2 = default_pool(self.reactor, None, persistent=False) self.assertIsNot(pool1, pool2) def test_pool_none_persistent_none(self): """ When *persistent=None* is passed a _persistent_ pool is created for backwards compatibility. 
""" pool = default_pool(self.reactor, None, None) self.assertTrue(pool.persistent) def test_pool_none_persistent_true(self): """ When *persistent=True* is passed a persistent pool is created and stored as the global pool. """ pool = default_pool(self.reactor, None, True) self.assertTrue(isinstance(pool, HTTPConnectionPool)) self.assertTrue(pool.persistent) def test_cached_global_pool(self): """ When *persistent=True* or *persistent=None* is passed the pool created is cached as the global pool. """ pool1 = default_pool(self.reactor, None, None) pool2 = default_pool(self.reactor, None, True) self.assertEqual(pool1, pool2) def test_specified_pool(self): """ When the user passes a pool it is returned directly. The *persistent* argument is ignored. It is not cached as the global pool. """ user_pool = HTTPConnectionPool(self.reactor, persistent=True) pool1 = default_pool(self.reactor, user_pool, None) pool2 = default_pool(self.reactor, user_pool, True) pool3 = default_pool(self.reactor, user_pool, False) self.assertIs(pool1, user_pool) self.assertIs(pool2, user_pool) self.assertIs(pool3, user_pool) self.assertIsNot(get_global_pool(), user_pool) ././@PaxHeader0000000000000000000000000000002600000000000011453 xustar000000000000000022 mtime=1609193998.0 treq-22.2.0/src/treq/test/test_auth.py0000664000175000017500000001053200000000000017307 0ustar00twmtwm00000000000000# Copyright (c) The treq Authors. # See LICENSE for details. 
from twisted.trial.unittest import SynchronousTestCase from twisted.web.http_headers import Headers from twisted.web.iweb import IAgent from treq._agentspy import agent_spy from treq.auth import _RequestHeaderSetterAgent, add_auth, UnknownAuthConfig class RequestHeaderSetterAgentTests(SynchronousTestCase): def setUp(self): self.agent, self.requests = agent_spy() def test_sets_headers(self): agent = _RequestHeaderSetterAgent( self.agent, Headers({b'X-Test-Header': [b'Test-Header-Value']}), ) agent.request(b'method', b'uri') self.assertEqual( self.requests[0].headers, Headers({b'X-Test-Header': [b'Test-Header-Value']}), ) def test_overrides_per_request_headers(self): agent = _RequestHeaderSetterAgent( self.agent, Headers({b'X-Test-Header': [b'Test-Header-Value']}) ) agent.request( b'method', b'uri', Headers({b'X-Test-Header': [b'Unwanted-Value']}) ) self.assertEqual( self.requests[0].headers, Headers({b'X-Test-Header': [b'Test-Header-Value']}), ) def test_no_mutation(self): """ The agent never mutates the headers passed to its request method. This reproduces https://github.com/twisted/treq/issues/314 """ requestHeaders = Headers({}) agent = _RequestHeaderSetterAgent( self.agent, Headers({b'Added': [b'1']}), ) agent.request(b'method', b'uri', headers=requestHeaders) self.assertEqual(requestHeaders, Headers({})) class AddAuthTests(SynchronousTestCase): def test_add_basic_auth(self): """ add_auth() wraps the given agent with one that adds an ``Authorization: Basic ...`` HTTP header that contains the given credentials. """ agent, requests = agent_spy() authAgent = add_auth(agent, ('username', 'password')) authAgent.request(b'method', b'uri') self.assertTrue(IAgent.providedBy(authAgent)) self.assertEqual( requests[0].headers, Headers({b'authorization': [b'Basic dXNlcm5hbWU6cGFzc3dvcmQ=']}) ) def test_add_basic_auth_huge(self): """ The Authorization header doesn't include linebreaks, even if the credentials are so long that Python's base64 implementation inserts them. 
""" agent, requests = agent_spy() pwd = ('verylongpasswordthatextendsbeyondthepointwheremultiplel' 'inesaregenerated') expectedAuth = ( b'Basic dXNlcm5hbWU6dmVyeWxvbmdwYXNzd29yZHRoYXRleHRlbmRzY' b'mV5b25kdGhlcG9pbnR3aGVyZW11bHRpcGxlbGluZXNhcmVnZW5lcmF0ZWQ=' ) authAgent = add_auth(agent, ('username', pwd)) authAgent.request(b'method', b'uri') self.assertEqual( requests[0].headers, Headers({b'authorization': [expectedAuth]}), ) def test_add_basic_auth_utf8(self): """ Basic auth username and passwords given as `str` are encoded as UTF-8. https://developer.mozilla.org/en-US/docs/Web/HTTP/Authentication#Character_encoding_of_HTTP_authentication """ agent, requests = agent_spy() auth = (u'\u16d7', u'\u16b9') authAgent = add_auth(agent, auth) authAgent.request(b'method', b'uri') self.assertEqual( requests[0].headers, Headers({b'Authorization': [b'Basic 4ZuXOuGauQ==']}), ) def test_add_basic_auth_bytes(self): """ Basic auth can be passed as `bytes`, allowing the user full control over the encoding. """ agent, requests = agent_spy() auth = (b'\x01\x0f\xff', b'\xff\xf0\x01') authAgent = add_auth(agent, auth) authAgent.request(b'method', b'uri') self.assertEqual( requests[0].headers, Headers({b'Authorization': [b'Basic AQ//Ov/wAQ==']}), ) def test_add_unknown_auth(self): """ add_auth() raises UnknownAuthConfig when given anything other than a tuple. 
""" agent, _ = agent_spy() invalidAuth = 1234 self.assertRaises(UnknownAuthConfig, add_auth, agent, invalidAuth) ././@PaxHeader0000000000000000000000000000002600000000000011453 xustar000000000000000022 mtime=1613279689.0 treq-22.2.0/src/treq/test/test_client.py0000664000175000017500000007502700000000000017636 0ustar00twmtwm00000000000000from collections import OrderedDict from io import BytesIO from unittest import mock from hyperlink import DecodedURL, EncodedURL from twisted.internet.defer import Deferred, succeed, CancelledError from twisted.internet.protocol import Protocol from twisted.python.failure import Failure from twisted.trial.unittest import TestCase from twisted.web.client import Agent, ResponseFailed from twisted.web.http_headers import Headers from treq.test.util import with_clock from treq.client import ( HTTPClient, _BodyBufferingProtocol, _BufferedResponse ) class HTTPClientTests(TestCase): def setUp(self): self.agent = mock.Mock(Agent) self.client = HTTPClient(self.agent) self.fbp_patcher = mock.patch('treq.client.FileBodyProducer') self.FileBodyProducer = self.fbp_patcher.start() self.addCleanup(self.fbp_patcher.stop) self.mbp_patcher = mock.patch('treq.multipart.MultiPartProducer') self.MultiPartProducer = self.mbp_patcher.start() self.addCleanup(self.mbp_patcher.stop) def assertBody(self, expected): body = self.FileBodyProducer.mock_calls[0][1][0] self.assertEqual(body.read(), expected) def test_post(self): self.client.post('http://example.com/') self.agent.request.assert_called_once_with( b'POST', b'http://example.com/', Headers({b'accept-encoding': [b'gzip']}), None) def test_request_uri_idn(self): self.client.request('GET', u'http://č.net') self.agent.request.assert_called_once_with( b'GET', b'http://xn--bea.net', Headers({b'accept-encoding': [b'gzip']}), None) def test_request_uri_decodedurl(self): """ A URL may be passed as a `hyperlink.DecodedURL` object. It is converted to bytes when passed to the underlying agent. 
""" url = DecodedURL.from_text(u"https://example.org/foo") self.client.request("GET", url) self.agent.request.assert_called_once_with( b"GET", b"https://example.org/foo", Headers({b"accept-encoding": [b"gzip"]}), None, ) def test_request_uri_encodedurl(self): """ A URL may be passed as a `hyperlink.EncodedURL` object. It is converted to bytes when passed to the underlying agent. """ url = EncodedURL.from_text(u"https://example.org/foo") self.client.request("GET", url) self.agent.request.assert_called_once_with( b"GET", b"https://example.org/foo", Headers({b"accept-encoding": [b"gzip"]}), None, ) def test_request_uri_bytes_pass(self): """ The URL parameter may contain path segments or querystring parameters that are not valid UTF-8. These pass through. """ # This URL is http://example.com/hello?who=you, but "hello", "who", and # "you" are encoded as UTF-16. The particulars of the encoding aren't # important; what matters is that those segments can't be decoded by # Hyperlink's UTF-8 default. self.client.request( "GET", ( "http://example.com/%FF%FEh%00e%00l%00l%00o%00" "?%FF%FEw%00h%00o%00=%FF%FEy%00o%00u%00" ), ) self.agent.request.assert_called_once_with( b'GET', ( b'http://example.com/%FF%FEh%00e%00l%00l%00o%00' b'?%FF%FEw%00h%00o%00=%FF%FEy%00o%00u%00' ), Headers({b'accept-encoding': [b'gzip']}), None, ) def test_request_uri_plus_pass(self): """ URL parameters may contain spaces encoded as ``+``. These remain as such and are not mangled. This reproduces `Klein #339 `_. """ self.client.request( "GET", "https://example.com/?foo+bar=baz+biff", ) self.agent.request.assert_called_once_with( b'GET', b"https://example.com/?foo+bar=baz+biff", Headers({b'accept-encoding': [b'gzip']}), None, ) def test_request_uri_idn_params(self): """ A URL that contains non-ASCII characters can be augmented with querystring parameters. This reproduces treq #264. 
""" self.client.request('GET', u'http://č.net', params={'foo': 'bar'}) self.agent.request.assert_called_once_with( b'GET', b'http://xn--bea.net/?foo=bar', Headers({b'accept-encoding': [b'gzip']}), None) def test_request_uri_hyperlink_params(self): """ The *params* argument augments an instance of `hyperlink.DecodedURL` passed as the *url* parameter, just as if it were a string. """ self.client.request( method="GET", url=DecodedURL.from_text(u"http://č.net"), params={"foo": "bar"}, ) self.agent.request.assert_called_once_with( b"GET", b"http://xn--bea.net/?foo=bar", Headers({b"accept-encoding": [b"gzip"]}), None, ) def test_request_case_insensitive_methods(self): self.client.request('gEt', 'http://example.com/') self.agent.request.assert_called_once_with( b'GET', b'http://example.com/', Headers({b'accept-encoding': [b'gzip']}), None) def test_request_query_params(self): self.client.request('GET', 'http://example.com/', params={'foo': ['bar']}) self.agent.request.assert_called_once_with( b'GET', b'http://example.com/?foo=bar', Headers({b'accept-encoding': [b'gzip']}), None) def test_request_tuple_query_values(self): self.client.request('GET', 'http://example.com/', params={'foo': ('bar',)}) self.agent.request.assert_called_once_with( b'GET', b'http://example.com/?foo=bar', Headers({b'accept-encoding': [b'gzip']}), None) def test_request_tuple_query_value_coercion(self): """ treq coerces non-string values passed to *params* like `urllib.urlencode()` """ self.client.request('GET', 'http://example.com/', params=[ ('text', u'A\u03a9'), ('text-seq', [u'A\u03a9']), ('bytes', [b'ascii']), ('bytes-seq', [b'ascii']), ('native', ['native']), ('native-seq', ['aa', 'bb']), ('int', 1), ('int-seq', (1, 2, 3)), ('none', None), ('none-seq', [None, None]), ]) self.agent.request.assert_called_once_with( b'GET', ( b'http://example.com/?' 
b'text=A%CE%A9&text-seq=A%CE%A9' b'&bytes=ascii&bytes-seq=ascii' b'&native=native&native-seq=aa&native-seq=bb' b'&int=1&int-seq=1&int-seq=2&int-seq=3' b'&none=None&none-seq=None&none-seq=None' ), Headers({b'accept-encoding': [b'gzip']}), None, ) def test_request_tuple_query_param_coercion(self): """ treq coerces non-string param names passed to *params* like `urllib.urlencode()` """ # A value used to test that it is never encoded or decoded. # It should be invalid UTF-8 or UTF-32 (at least). raw_bytes = b"\x00\xff\xfb" self.client.request('GET', 'http://example.com/', params=[ (u'text', u'A\u03a9'), (b'bytes', ['ascii', raw_bytes]), ('native', 'native'), (1, 'int'), (None, ['none']), ]) self.agent.request.assert_called_once_with( b'GET', ( b'http://example.com/' b'?text=A%CE%A9&bytes=ascii&bytes=%00%FF%FB' b'&native=native&1=int&None=none' ), Headers({b'accept-encoding': [b'gzip']}), None, ) def test_request_query_param_seps(self): """ When the characters ``&`` and ``#`` are passed to *params* as param names or values they are percent-escaped in the URL. 
This reproduces https://github.com/twisted/treq/issues/282 """ self.client.request('GET', 'http://example.com/', params=( ('ampersand', '&'), ('&', 'ampersand'), ('octothorpe', '#'), ('#', 'octothorpe'), )) self.agent.request.assert_called_once_with( b'GET', ( b'http://example.com/' b'?ampersand=%26' b'&%26=ampersand' b'&octothorpe=%23' b'&%23=octothorpe' ), Headers({b'accept-encoding': [b'gzip']}), None, ) def test_request_merge_query_params(self): self.client.request('GET', 'http://example.com/?baz=bax', params={'foo': ['bar', 'baz']}) self.agent.request.assert_called_once_with( b'GET', b'http://example.com/?baz=bax&foo=bar&foo=baz', Headers({b'accept-encoding': [b'gzip']}), None) def test_request_merge_tuple_query_params(self): self.client.request('GET', 'http://example.com/?baz=bax', params=[('foo', 'bar')]) self.agent.request.assert_called_once_with( b'GET', b'http://example.com/?baz=bax&foo=bar', Headers({b'accept-encoding': [b'gzip']}), None) def test_request_dict_single_value_query_params(self): self.client.request('GET', 'http://example.com/', params={'foo': 'bar'}) self.agent.request.assert_called_once_with( b'GET', b'http://example.com/?foo=bar', Headers({b'accept-encoding': [b'gzip']}), None) def test_request_data_dict(self): self.client.request('POST', 'http://example.com/', data={'foo': ['bar', 'baz']}) self.agent.request.assert_called_once_with( b'POST', b'http://example.com/', Headers({b'Content-Type': [b'application/x-www-form-urlencoded'], b'accept-encoding': [b'gzip']}), self.FileBodyProducer.return_value) self.assertBody(b'foo=bar&foo=baz') def test_request_data_single_dict(self): self.client.request('POST', 'http://example.com/', data={'foo': 'bar'}) self.agent.request.assert_called_once_with( b'POST', b'http://example.com/', Headers({b'Content-Type': [b'application/x-www-form-urlencoded'], b'accept-encoding': [b'gzip']}), self.FileBodyProducer.return_value) self.assertBody(b'foo=bar') def test_request_data_tuple(self): 
self.client.request('POST', 'http://example.com/', data=[('foo', 'bar')]) self.agent.request.assert_called_once_with( b'POST', b'http://example.com/', Headers({b'Content-Type': [b'application/x-www-form-urlencoded'], b'accept-encoding': [b'gzip']}), self.FileBodyProducer.return_value) self.assertBody(b'foo=bar') def test_request_data_file(self): temp_fn = self.mktemp() with open(temp_fn, "wb") as temp_file: temp_file.write(b'hello') self.client.request('POST', 'http://example.com/', data=open(temp_fn, 'rb')) self.agent.request.assert_called_once_with( b'POST', b'http://example.com/', Headers({b'accept-encoding': [b'gzip']}), self.FileBodyProducer.return_value) self.assertBody(b'hello') def test_request_json_dict(self): self.client.request('POST', 'http://example.com/', json={'foo': 'bar'}) self.agent.request.assert_called_once_with( b'POST', b'http://example.com/', Headers({b'Content-Type': [b'application/json; charset=UTF-8'], b'accept-encoding': [b'gzip']}), self.FileBodyProducer.return_value) self.assertBody(b'{"foo":"bar"}') def test_request_json_tuple(self): self.client.request('POST', 'http://example.com/', json=('foo', 1)) self.agent.request.assert_called_once_with( b'POST', b'http://example.com/', Headers({b'Content-Type': [b'application/json; charset=UTF-8'], b'accept-encoding': [b'gzip']}), self.FileBodyProducer.return_value) self.assertBody(b'["foo",1]') def test_request_json_number(self): self.client.request('POST', 'http://example.com/', json=1.) 
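# Editor's note: the JSON body tests in this region all assert byte-exact
# compact encodings such as b'{"foo":"bar"}' and b'["foo",1]', with no space
# after ':' or ','. The following standalone sketch (an illustration for this
# note, not treq's actual code path; `encode_json_body` is a hypothetical
# helper) shows how `json.dumps` with explicit separators reproduces exactly
# those bodies:

```python
import json


def encode_json_body(value):
    # Hypothetical helper for illustration only, not treq's implementation.
    # separators=(",", ":") suppresses the spaces json.dumps adds by default,
    # matching the compact bodies asserted in the surrounding tests.
    return json.dumps(value, separators=(",", ":")).encode("utf-8")


print(encode_json_body({"foo": "bar"}))  # b'{"foo":"bar"}'
print(encode_json_body(("foo", 1)))      # b'["foo",1]'
print(encode_json_body(1.0))             # b'1.0'
print(encode_json_body(True))            # b'true'
print(encode_json_body(None))            # b'null'
```

# Compact separators matter here because the assertions compare raw bytes:
# the default json.dumps output ('{"foo": "bar"}') would not match.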
self.agent.request.assert_called_once_with( b'POST', b'http://example.com/', Headers({b'Content-Type': [b'application/json; charset=UTF-8'], b'accept-encoding': [b'gzip']}), self.FileBodyProducer.return_value) self.assertBody(b'1.0') def test_request_json_string(self): self.client.request('POST', 'http://example.com/', json='hello') self.agent.request.assert_called_once_with( b'POST', b'http://example.com/', Headers({b'Content-Type': [b'application/json; charset=UTF-8'], b'accept-encoding': [b'gzip']}), self.FileBodyProducer.return_value) self.assertBody(b'"hello"') def test_request_json_bool(self): self.client.request('POST', 'http://example.com/', json=True) self.agent.request.assert_called_once_with( b'POST', b'http://example.com/', Headers({b'Content-Type': [b'application/json; charset=UTF-8'], b'accept-encoding': [b'gzip']}), self.FileBodyProducer.return_value) self.assertBody(b'true') def test_request_json_none(self): self.client.request('POST', 'http://example.com/', json=None) self.agent.request.assert_called_once_with( b'POST', b'http://example.com/', Headers({b'Content-Type': [b'application/json; charset=UTF-8'], b'accept-encoding': [b'gzip']}), self.FileBodyProducer.return_value) self.assertBody(b'null') @mock.patch('treq.client.uuid.uuid4', mock.Mock(return_value="heyDavid")) def test_request_no_name_attachment(self): self.client.request( 'POST', 'http://example.com/', files={"name": BytesIO(b"hello")}) self.agent.request.assert_called_once_with( b'POST', b'http://example.com/', Headers({ b'accept-encoding': [b'gzip'], b'Content-Type': [b'multipart/form-data; boundary=heyDavid']}), self.MultiPartProducer.return_value) FP = self.FileBodyProducer.return_value self.assertEqual( mock.call( [('name', (None, 'application/octet-stream', FP))], boundary=b'heyDavid'), self.MultiPartProducer.call_args) @mock.patch('treq.client.uuid.uuid4', mock.Mock(return_value="heyDavid")) def test_request_named_attachment(self): self.client.request( 'POST', 
'http://example.com/', files={ "name": ('image.jpg', BytesIO(b"hello"))}) self.agent.request.assert_called_once_with( b'POST', b'http://example.com/', Headers({ b'accept-encoding': [b'gzip'], b'Content-Type': [b'multipart/form-data; boundary=heyDavid']}), self.MultiPartProducer.return_value) FP = self.FileBodyProducer.return_value self.assertEqual( mock.call( [('name', ('image.jpg', 'image/jpeg', FP))], boundary=b'heyDavid'), self.MultiPartProducer.call_args) @mock.patch('treq.client.uuid.uuid4', mock.Mock(return_value="heyDavid")) def test_request_named_attachment_and_ctype(self): self.client.request( 'POST', 'http://example.com/', files={ "name": ('image.jpg', 'text/plain', BytesIO(b"hello"))}) self.agent.request.assert_called_once_with( b'POST', b'http://example.com/', Headers({ b'accept-encoding': [b'gzip'], b'Content-Type': [b'multipart/form-data; boundary=heyDavid']}), self.MultiPartProducer.return_value) FP = self.FileBodyProducer.return_value self.assertEqual( mock.call( [('name', ('image.jpg', 'text/plain', FP))], boundary=b'heyDavid'), self.MultiPartProducer.call_args) def test_request_files_tuple_too_short(self): """ The `HTTPClient.request()` *files* argument requires tuples of length 2 or 3. It raises `TypeError` when the tuple is too short. """ with self.assertRaises(TypeError) as c: self.client.request( "POST", b"http://example.com/", files=[("t1", ("foo.txt",))], ) self.assertIn("'t1' tuple has length 1", str(c.exception)) def test_request_files_tuple_too_long(self): """ The `HTTPClient.request()` *files* argument requires tuples of length 2 or 3. It raises `TypeError` when the tuple is too long. 
""" with self.assertRaises(TypeError) as c: self.client.request( "POST", b"http://example.com/", files=[ ("t4", ("foo.txt", "text/plain", BytesIO(b"...\n"), "extra!")), ], ) self.assertIn("'t4' tuple has length 4", str(c.exception)) @mock.patch('treq.client.uuid.uuid4', mock.Mock(return_value="heyDavid")) def test_request_mixed_params(self): class NamedFile(BytesIO): def __init__(self, val): BytesIO.__init__(self, val) self.name = "image.png" self.client.request( 'POST', 'http://example.com/', data=[("a", "b"), ("key", "val")], files=[ ("file1", ('image.jpg', BytesIO(b"hello"))), ("file2", NamedFile(b"yo"))]) self.agent.request.assert_called_once_with( b'POST', b'http://example.com/', Headers({ b'accept-encoding': [b'gzip'], b'Content-Type': [b'multipart/form-data; boundary=heyDavid']}), self.MultiPartProducer.return_value) FP = self.FileBodyProducer.return_value self.assertEqual( mock.call([ ('a', 'b'), ('key', 'val'), ('file1', ('image.jpg', 'image/jpeg', FP)), ('file2', ('image.png', 'image/png', FP))], boundary=b'heyDavid'), self.MultiPartProducer.call_args) @mock.patch('treq.client.uuid.uuid4', mock.Mock(return_value="heyDavid")) def test_request_mixed_params_dict(self): self.client.request( 'POST', 'http://example.com/', data={"key": "a", "key2": "b"}, files={"file1": BytesIO(b"hey")}) self.agent.request.assert_called_once_with( b'POST', b'http://example.com/', Headers({ b'accept-encoding': [b'gzip'], b'Content-Type': [b'multipart/form-data; boundary=heyDavid']}), self.MultiPartProducer.return_value) FP = self.FileBodyProducer.return_value self.assertEqual( mock.call([ ('key', 'a'), ('key2', 'b'), ('file1', (None, 'application/octet-stream', FP))], boundary=b'heyDavid'), self.MultiPartProducer.call_args) def test_request_unsupported_params_combination(self): self.assertRaises(ValueError, self.client.request, 'POST', 'http://example.com/', data=BytesIO(b"yo"), files={"file1": BytesIO(b"hey")}) def test_request_json_with_data(self): """ Passing 
`HTTPClient.request()` both *data* and *json* parameters is invalid because *json* is ignored. This behavior is deprecated. """ self.client.request( "POST", "http://example.com/", data=BytesIO(b"..."), json=None, # NB: None is a valid value. It encodes to b'null'. ) [w] = self.flushWarnings([self.test_request_json_with_data]) self.assertEqual(DeprecationWarning, w["category"]) self.assertEqual( ( "Argument 'json' will be ignored because 'data' was also passed." " This will raise TypeError in the next treq release." ), w['message'], ) def test_request_json_with_files(self): """ Passing `HTTPClient.request()` both *files* and *json* parameters is invalid because *json* is ignored. This behavior is deprecated. """ self.client.request( "POST", "http://example.com/", files={"f1": ("foo.txt", "text/plain", BytesIO(b"...\n"))}, json=["this is ignored"], ) [w] = self.flushWarnings([self.test_request_json_with_files]) self.assertEqual(DeprecationWarning, w["category"]) self.assertEqual( ( "Argument 'json' will be ignored because 'files' was also passed." " This will raise TypeError in the next treq release." ), w['message'], ) def test_request_dict_headers(self): self.client.request('GET', 'http://example.com/', headers={ 'User-Agent': 'treq/0.1dev', 'Accept': ['application/json', 'text/plain'] }) self.agent.request.assert_called_once_with( b'GET', b'http://example.com/', Headers({b'User-Agent': [b'treq/0.1dev'], b'accept-encoding': [b'gzip'], b'Accept': [b'application/json', b'text/plain']}), None) def test_request_headers_object(self): """ The *headers* parameter accepts a `twisted.web.http_headers.Headers` instance. 
""" self.client.request( "GET", "https://example.com", headers=Headers({"X-Foo": ["bar"]}), ) self.agent.request.assert_called_once_with( b"GET", b"https://example.com", Headers({ "X-Foo": ["bar"], "Accept-Encoding": ["gzip"], }), None, ) def test_request_headers_invalid_type(self): """ `HTTPClient.request()` warns that headers of an unexpected type are invalid and that this behavior is deprecated. """ self.client.request('GET', 'http://example.com', headers=[]) [w] = self.flushWarnings([self.test_request_headers_invalid_type]) self.assertEqual(DeprecationWarning, w['category']) self.assertIn( "headers must be a dict, twisted.web.http_headers.Headers, or None,", w['message'], ) def test_request_dict_headers_invalid_values(self): """ `HTTPClient.request()` warns that non-string header values are dropped and that this behavior is deprecated. """ self.client.request('GET', 'http://example.com', headers=OrderedDict([ ('none', None), ('one', 1), ('ok', 'string'), ])) [w1, w2] = self.flushWarnings([self.test_request_dict_headers_invalid_values]) self.assertEqual(DeprecationWarning, w1['category']) self.assertEqual(DeprecationWarning, w2['category']) self.assertIn( "The value of headers key 'none' has non-string type", w1['message'], ) self.assertIn( "The value of headers key 'one' has non-string type", w2['message'], ) def test_request_invalid_param(self): """ `HTTPClient.request()` rejects invalid keyword parameters with `TypeError`. """ self.assertRaises( TypeError, self.client.request, "GET", b"http://example.com", invalid=True, ) @with_clock def test_request_timeout_fired(self, clock): """ Verify the request is cancelled if a response is not received within specified timeout period. 
""" self.agent.request.return_value = d = Deferred() self.client.request('GET', 'http://example.com', timeout=2) # simulate we haven't gotten a response within timeout seconds clock.advance(3) # a deferred should have been cancelled self.failureResultOf(d, CancelledError) @with_clock def test_request_timeout_cancelled(self, clock): """ Verify timeout is cancelled if a response is received before timeout period elapses. """ self.agent.request.return_value = d = Deferred() self.client.request('GET', 'http://example.com', timeout=2) # simulate a response d.callback(mock.Mock(code=200, headers=Headers({}))) # now advance the clock but since we already got a result, # a cancellation timer should have been cancelled clock.advance(3) self.successResultOf(d) def test_response_is_buffered(self): response = mock.Mock(deliverBody=mock.Mock(), headers=Headers({})) self.agent.request.return_value = succeed(response) d = self.client.get('http://www.example.com') result = self.successResultOf(d) protocol = mock.Mock(Protocol) result.deliverBody(protocol) self.assertEqual(response.deliverBody.call_count, 1) result.deliverBody(protocol) self.assertEqual(response.deliverBody.call_count, 1) def test_response_buffering_is_disabled_with_unbufferred_arg(self): response = mock.Mock(headers=Headers({})) self.agent.request.return_value = succeed(response) d = self.client.get('http://www.example.com', unbuffered=True) # YOLO public attribute. 
self.assertEqual(self.successResultOf(d).original, response) def test_request_post_redirect_denied(self): response = mock.Mock(code=302, headers=Headers({'Location': ['/']})) self.agent.request.return_value = succeed(response) d = self.client.post('http://www.example.com') self.failureResultOf(d, ResponseFailed) def test_request_browser_like_redirects(self): response = mock.Mock(code=302, headers=Headers({'Location': ['/']})) self.agent.request.return_value = succeed(response) raw = mock.Mock(return_value=[]) final_resp = mock.Mock(code=200, headers=mock.Mock(getRawHeaders=raw)) with mock.patch('twisted.web.client.RedirectAgent._handleRedirect', return_value=final_resp): d = self.client.post('http://www.google.com', browser_like_redirects=True, unbuffered=True) self.assertEqual(self.successResultOf(d).original, final_resp) class BodyBufferingProtocolTests(TestCase): def test_buffers_data(self): buffer = [] protocol = _BodyBufferingProtocol( mock.Mock(Protocol), buffer, None ) protocol.dataReceived("foo") self.assertEqual(buffer, ["foo"]) protocol.dataReceived("bar") self.assertEqual(buffer, ["foo", "bar"]) def test_propagates_data_to_destination(self): destination = mock.Mock(Protocol) protocol = _BodyBufferingProtocol( destination, [], None ) protocol.dataReceived(b"foo") destination.dataReceived.assert_called_once_with(b"foo") protocol.dataReceived(b"bar") destination.dataReceived.assert_called_with(b"bar") def test_fires_finished_deferred(self): finished = Deferred() protocol = _BodyBufferingProtocol( mock.Mock(Protocol), [], finished ) class TestResponseDone(Exception): pass protocol.connectionLost(TestResponseDone()) self.failureResultOf(finished, TestResponseDone) def test_propogates_connectionLost_reason(self): destination = mock.Mock(Protocol) protocol = _BodyBufferingProtocol( destination, [], Deferred().addErrback(lambda ign: None) ) class TestResponseDone(Exception): pass reason = TestResponseDone() protocol.connectionLost(reason) 
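# Editor's note: the _BodyBufferingProtocol tests below exercise a "tee"
# pattern: every received chunk is appended to a shared buffer and also
# forwarded to a wrapped destination, so later receivers can replay the body.
# A minimal standalone sketch of that pattern follows (illustrative only;
# `TeeingProtocol` and `RecordingDestination` are hypothetical names, not
# treq's actual classes):

```python
class RecordingDestination:
    """A stand-in consumer that remembers every chunk it was given."""

    def __init__(self):
        self.received = []

    def dataReceived(self, data):
        self.received.append(data)


class TeeingProtocol:
    """Buffer chunks for later replay while passing them through."""

    def __init__(self, destination, buffer):
        self._destination = destination
        self.buffer = buffer

    def dataReceived(self, data):
        self.buffer.append(data)              # keep a copy for later receivers
        self._destination.dataReceived(data)  # forward to the live consumer


chunks = []
destination = RecordingDestination()
tee = TeeingProtocol(destination, chunks)
tee.dataReceived(b"foo")
tee.dataReceived(b"bar")
print(chunks)                # [b'foo', b'bar']
print(destination.received)  # [b'foo', b'bar']
```

# The same data ends up in both places, which is what lets a buffered
# response deliver its body to more than one protocol.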
        destination.connectionLost.assert_called_once_with(reason)


class BufferedResponseTests(TestCase):
    def test_wraps_protocol(self):
        wrappers = []
        wrapped = mock.Mock(Protocol)
        response = mock.Mock(deliverBody=mock.Mock(wraps=wrappers.append))

        br = _BufferedResponse(response)

        br.deliverBody(wrapped)
        response.deliverBody.assert_called_once_with(wrappers[0])
        self.assertNotEqual(wrapped, wrappers[0])

    def test_concurrent_receivers(self):
        wrappers = []
        wrapped = mock.Mock(Protocol)
        unwrapped = mock.Mock(Protocol)
        response = mock.Mock(deliverBody=mock.Mock(wraps=wrappers.append))

        br = _BufferedResponse(response)

        br.deliverBody(wrapped)
        br.deliverBody(unwrapped)
        response.deliverBody.assert_called_once_with(wrappers[0])

        wrappers[0].dataReceived(b"foo")
        wrapped.dataReceived.assert_called_once_with(b"foo")

        self.assertEqual(unwrapped.dataReceived.call_count, 0)

        class TestResponseDone(Exception):
            pass

        done = Failure(TestResponseDone())

        wrappers[0].connectionLost(done)
        wrapped.connectionLost.assert_called_once_with(done)
        unwrapped.dataReceived.assert_called_once_with(b"foo")
        unwrapped.connectionLost.assert_called_once_with(done)

    def test_receiver_after_finished(self):
        wrappers = []
        finished = mock.Mock(Protocol)
        response = mock.Mock(deliverBody=mock.Mock(wraps=wrappers.append))

        br = _BufferedResponse(response)
        br.deliverBody(mock.Mock(Protocol))

        wrappers[0].dataReceived(b"foo")

        class TestResponseDone(Exception):
            pass

        done = Failure(TestResponseDone())
        wrappers[0].connectionLost(done)

        br.deliverBody(finished)

        finished.dataReceived.assert_called_once_with(b"foo")
        finished.connectionLost.assert_called_once_with(done)


# treq-22.2.0/src/treq/test/test_content.py

from unittest import mock
from twisted.python.failure import Failure
from twisted.trial.unittest import TestCase
from twisted.web.http_headers
import Headers from twisted.web.client import ResponseDone, ResponseFailed from twisted.web.http import PotentialDataLoss from treq import collect, content, json_content, text_content from treq.client import _BufferedResponse class ContentTests(TestCase): def setUp(self): self.response = mock.Mock() self.protocol = None def deliverBody(protocol): self.protocol = protocol self.response.deliverBody.side_effect = deliverBody self.response = _BufferedResponse(self.response) def test_collect(self): data = [] d = collect(self.response, data.append) self.protocol.dataReceived(b'{') self.protocol.dataReceived(b'"msg": "hell') self.protocol.dataReceived(b'o"}') self.protocol.connectionLost(Failure(ResponseDone())) self.assertEqual(self.successResultOf(d), None) self.assertEqual(data, [b'{', b'"msg": "hell', b'o"}']) def test_collect_failure(self): data = [] d = collect(self.response, data.append) self.protocol.dataReceived(b'foo') self.protocol.connectionLost(Failure(ResponseFailed("test failure"))) self.failureResultOf(d, ResponseFailed) self.assertEqual(data, [b'foo']) def test_collect_failure_potential_data_loss(self): """ PotentialDataLoss failures are treated as success. 
""" data = [] d = collect(self.response, data.append) self.protocol.dataReceived(b'foo') self.protocol.connectionLost(Failure(PotentialDataLoss())) self.assertEqual(self.successResultOf(d), None) self.assertEqual(data, [b'foo']) def test_collect_0_length(self): self.response.length = 0 d = collect( self.response, lambda d: self.fail("Unexpectedly called with: {0}".format(d))) self.assertEqual(self.successResultOf(d), None) def test_content(self): d = content(self.response) self.protocol.dataReceived(b'foo') self.protocol.dataReceived(b'bar') self.protocol.connectionLost(Failure(ResponseDone())) self.assertEqual(self.successResultOf(d), b'foobar') def test_content_cached(self): d1 = content(self.response) self.protocol.dataReceived(b'foo') self.protocol.dataReceived(b'bar') self.protocol.connectionLost(Failure(ResponseDone())) self.assertEqual(self.successResultOf(d1), b'foobar') def _fail_deliverBody(protocol): self.fail("deliverBody unexpectedly called.") self.response.original.deliverBody.side_effect = _fail_deliverBody d3 = content(self.response) self.assertEqual(self.successResultOf(d3), b'foobar') self.assertNotIdentical(d1, d3) def test_content_multiple_waiters(self): d1 = content(self.response) d2 = content(self.response) self.protocol.dataReceived(b'foo') self.protocol.connectionLost(Failure(ResponseDone())) self.assertEqual(self.successResultOf(d1), b'foo') self.assertEqual(self.successResultOf(d2), b'foo') self.assertNotIdentical(d1, d2) def test_json_content(self): self.response.headers = Headers() d = json_content(self.response) self.protocol.dataReceived(b'{"msg":"hello!"}') self.protocol.connectionLost(Failure(ResponseDone())) self.assertEqual(self.successResultOf(d), {"msg": "hello!"}) def test_json_content_unicode(self): """ When Unicode JSON content is received, the JSON text should be correctly decoded. RFC7159 (8.1): "JSON text SHALL be encoded in UTF-8, UTF-16, or UTF-32. 
The default encoding is UTF-8" """ self.response.headers = Headers() d = json_content(self.response) self.protocol.dataReceived(u'{"msg":"hëlló!"}'.encode('utf-8')) self.protocol.connectionLost(Failure(ResponseDone())) self.assertEqual(self.successResultOf(d), {u'msg': u'hëlló!'}) def test_json_content_utf16(self): """ JSON received is decoded according to the charset given in the Content-Type header. """ self.response.headers = Headers({ b'Content-Type': [b"application/json; charset='UTF-16LE'"], }) d = json_content(self.response) self.protocol.dataReceived(u'{"msg":"hëlló!"}'.encode('UTF-16LE')) self.protocol.connectionLost(Failure(ResponseDone())) self.assertEqual(self.successResultOf(d), {u'msg': u'hëlló!'}) def test_text_content(self): self.response.headers = Headers( {b'Content-Type': [b'text/plain; charset=utf-8']}) d = text_content(self.response) self.protocol.dataReceived(b'\xe2\x98\x83') self.protocol.connectionLost(Failure(ResponseDone())) self.assertEqual(self.successResultOf(d), u'\u2603') def test_text_content_default_encoding_no_param(self): self.response.headers = Headers( {b'Content-Type': [b'text/plain']}) d = text_content(self.response) self.protocol.dataReceived(b'\xa1') self.protocol.connectionLost(Failure(ResponseDone())) self.assertEqual(self.successResultOf(d), u'\xa1') def test_text_content_default_encoding_no_header(self): self.response.headers = Headers() d = text_content(self.response) self.protocol.dataReceived(b'\xa1') self.protocol.connectionLost(Failure(ResponseDone())) self.assertEqual(self.successResultOf(d), u'\xa1') def test_content_application_json_default_encoding(self): self.response.headers = Headers( {b'Content-Type': [b'application/json']}) d = text_content(self.response) self.protocol.dataReceived(b'gr\xc3\xbcn') self.protocol.connectionLost(Failure(ResponseDone())) self.assertEqual(self.successResultOf(d), u'grün') def test_text_content_unicode_headers(self): """ Header parsing is robust against unicode header names and 
        values.
        """
        self.response.headers = Headers({
            b'Content-Type': [
                u'text/plain; charset="UTF-16BE"; u=ᛃ'.encode('utf-8')],
            u'Coördination'.encode('iso-8859-1'): [
                u'koʊˌɔrdɪˈneɪʃən'.encode('utf-8')],
        })

        d = text_content(self.response)

        self.protocol.dataReceived(u'ᚠᚡ'.encode('UTF-16BE'))
        self.protocol.connectionLost(Failure(ResponseDone()))

        self.assertEqual(self.successResultOf(d), u'ᚠᚡ')


# treq-22.2.0/src/treq/test/test_multipart.py

# Copyright (c) Twisted Matrix Laboratories.
# See LICENSE for details.

import cgi
import sys
from io import BytesIO

from twisted.trial import unittest
from zope.interface.verify import verifyObject
from twisted.internet import task
from twisted.web.client import FileBodyProducer
from twisted.web.iweb import UNKNOWN_LENGTH, IBodyProducer

from treq.multipart import MultiPartProducer, _LengthConsumer


class MultiPartProducerTestCase(unittest.TestCase):
    """
    Tests for the L{MultiPartProducer} which gets a dictionary-like object
    with post parameters, converts them to multipart/form-data format and
    feeds them to an L{IConsumer}.
    """
    def _termination(self):
        """
        This method can be used as the C{terminationPredicateFactory} for a
        L{Cooperator}.  It returns a predicate which immediately returns
        C{False}, indicating that no more work should be done this iteration.
        This has the result of only allowing one iteration of a cooperative
        task to be run per L{Cooperator} iteration.
        """
        return lambda: True

    def setUp(self):
        """
        Create a L{Cooperator} hooked up to an easily controlled,
        deterministic scheduler to use with L{MultiPartProducer}.
        """
        self._scheduled = []
        self.cooperator = task.Cooperator(
            self._termination, self._scheduled.append)

    def getOutput(self, producer, with_producer=False):
        """
        A convenience function to consume and return output.
""" consumer = output = BytesIO() producer.startProducing(consumer) while self._scheduled: self._scheduled.pop(0)() if with_producer: return (output.getvalue(), producer) else: return output.getvalue() def newLines(self, value): if isinstance(value, str): return value.replace(u"\n", u"\r\n") else: return value.replace(b"\n", b"\r\n") def test_interface(self): """ L{MultiPartProducer} instances provide L{IBodyProducer}. """ self.assertTrue( verifyObject( IBodyProducer, MultiPartProducer({}))) def test_unknownLength(self): """ If the L{MultiPartProducer} is constructed with a file-like object passed as a parameter without either a C{seek} or C{tell} method, its C{length} attribute is set to C{UNKNOWN_LENGTH}. """ class CantTell: def seek(self, offset, whence): """ A C{seek} method that is never called because there is no matching C{tell} method. """ class CantSeek: def tell(self): """ A C{tell} method that is never called because there is no matching C{seek} method. """ producer = MultiPartProducer( {"f": ("name", None, FileBodyProducer(CantTell()))}) self.assertEqual(UNKNOWN_LENGTH, producer.length) producer = MultiPartProducer( {"f": ("name", None, FileBodyProducer(CantSeek()))}) self.assertEqual(UNKNOWN_LENGTH, producer.length) def test_knownLengthOnFile(self): """ If the L{MultiPartProducer} is constructed with a file-like object with both C{seek} and C{tell} methods, its C{length} attribute is set to the size of the file as determined by those methods. 
""" inputBytes = b"here are some bytes" inputFile = BytesIO(inputBytes) inputFile.seek(5) producer = MultiPartProducer({ "field": ('file name', None, FileBodyProducer( inputFile, cooperator=self.cooperator))}) # Make sure we are generous enough not to alter seek position: self.assertEqual(inputFile.tell(), 5) # Total length is hard to calculate manually # as it contains a lot of headers parameters, newlines and boundaries # let's assert for now that it's no less than the input parameter self.assertTrue(producer.length > len(inputBytes)) # Calculating length should not touch producers self.assertTrue(producer._currentProducer is None) def test_defaultCooperator(self): """ If no L{Cooperator} instance is passed to L{MultiPartProducer}, the global cooperator is used. """ producer = MultiPartProducer({ "field": ('file name', None, FileBodyProducer( BytesIO(b"yo"), cooperator=self.cooperator)) }) self.assertEqual(task.cooperate, producer._cooperate) def test_startProducing(self): """ L{MultiPartProducer.startProducing} starts writing bytes from the input file to the given L{IConsumer} and returns a L{Deferred} which fires when they have all been written. """ consumer = output = BytesIO() producer = MultiPartProducer({ b"field": ('file name', "text/hello-world", FileBodyProducer( BytesIO(b"Hello, World"), cooperator=self.cooperator)) }, cooperator=self.cooperator, boundary=b"heyDavid") complete = producer.startProducing(consumer) iterations = 0 while self._scheduled: iterations += 1 self._scheduled.pop(0)() self.assertTrue(iterations > 1) self.assertEqual(self.newLines(b"""--heyDavid Content-Disposition: form-data; name="field"; filename="file name" Content-Type: text/hello-world Content-Length: 12 Hello, World --heyDavid-- """), output.getvalue()) self.assertEqual(None, self.successResultOf(complete)) def test_inputClosedAtEOF(self): """ When L{MultiPartProducer} reaches end-of-file on the input file given to it, the input file is closed. 
""" inputFile = BytesIO(b"hello, world!") consumer = BytesIO() producer = MultiPartProducer({ "field": ( "file name", "text/hello-world", FileBodyProducer( inputFile, cooperator=self.cooperator)) }, cooperator=self.cooperator, boundary=b"heyDavid") producer.startProducing(consumer) while self._scheduled: self._scheduled.pop(0)() self.assertTrue(inputFile.closed) def test_failedReadWhileProducing(self): """ If a read from the input file fails while producing bytes to the consumer, the L{Deferred} returned by L{MultiPartProducer.startProducing} fires with a L{Failure} wrapping that exception. """ class BrokenFile: def read(self, count): raise IOError("Simulated bad thing") producer = MultiPartProducer({ "field": ( "file name", "text/hello-world", FileBodyProducer( BrokenFile(), cooperator=self.cooperator)) }, cooperator=self.cooperator, boundary=b"heyDavid") complete = producer.startProducing(BytesIO()) while self._scheduled: self._scheduled.pop(0)() self.failureResultOf(complete).trap(IOError) def test_stopProducing(self): """ L{MultiPartProducer.stopProducing} stops the underlying L{IPullProducer} and the cooperative task responsible for calling C{resumeProducing} and closes the input file but does not cause the L{Deferred} returned by C{startProducing} to fire. """ inputFile = BytesIO(b"hello, world!") consumer = BytesIO() producer = MultiPartProducer({ "field": ( "file name", "text/hello-world", FileBodyProducer( inputFile, cooperator=self.cooperator)) }, cooperator=self.cooperator, boundary=b"heyDavid") complete = producer.startProducing(consumer) self._scheduled.pop(0)() producer.stopProducing() self.assertTrue(inputFile.closed) self._scheduled.pop(0)() self.assertNoResult(complete) def test_pauseProducing(self): """ L{MultiPartProducer.pauseProducing} temporarily suspends writing bytes from the input file to the given L{IConsumer}. 
""" inputFile = BytesIO(b"hello, world!") consumer = output = BytesIO() producer = MultiPartProducer({ "field": ( "file name", "text/hello-world", FileBodyProducer( inputFile, cooperator=self.cooperator)) }, cooperator=self.cooperator, boundary=b"heyDavid") complete = producer.startProducing(consumer) self._scheduled.pop(0)() currentValue = output.getvalue() self.assertTrue(currentValue) producer.pauseProducing() # Sort of depends on an implementation detail of Cooperator: even # though the only task is paused, there's still a scheduled call. If # this were to go away because Cooperator became smart enough to cancel # this call in this case, that would be fine. self._scheduled.pop(0)() # Since the producer is paused, no new data should be here. self.assertEqual(output.getvalue(), currentValue) self.assertNoResult(complete) def test_resumeProducing(self): """ L{MultoPartProducer.resumeProducing} re-commences writing bytes from the input file to the given L{IConsumer} after it was previously paused with L{MultiPartProducer.pauseProducing}. 
""" inputFile = BytesIO(b"hello, world!") consumer = output = BytesIO() producer = MultiPartProducer({ "field": ( "file name", "text/hello-world", FileBodyProducer( inputFile, cooperator=self.cooperator)) }, cooperator=self.cooperator, boundary=b"heyDavid") producer.startProducing(consumer) self._scheduled.pop(0)() currentValue = output.getvalue() self.assertTrue(currentValue) producer.pauseProducing() producer.resumeProducing() self._scheduled.pop(0)() # make sure we started producing new data after resume self.assertTrue(len(currentValue) < len(output.getvalue())) def test_unicodeString(self): """ Make sure unicode string is passed properly """ output, producer = self.getOutput( MultiPartProducer({ "afield": u"Это моя строчечка\r\n", }, cooperator=self.cooperator, boundary=b"heyDavid"), with_producer=True) expected = self.newLines(u"""--heyDavid Content-Disposition: form-data; name="afield" Это моя строчечка --heyDavid-- """.encode("utf-8")) self.assertEqual(producer.length, len(expected)) self.assertEqual(expected, output) def test_failOnByteStrings(self): """ If byte string is passed as a param and we don't know the encoding, fail early to prevent corrupted form posts """ self.assertRaises( ValueError, MultiPartProducer, { "afield": u"это моя строчечка".encode("utf-32"), }, cooperator=self.cooperator, boundary=b"heyDavid") def test_failOnUnknownParams(self): """ If byte string is passed as a param and we don't know the encoding, fail early to prevent corrupted form posts """ # unknown key self.assertRaises( ValueError, MultiPartProducer, { (1, 2): BytesIO(b"yo"), }, cooperator=self.cooperator, boundary=b"heyDavid") # tuple length self.assertRaises( ValueError, MultiPartProducer, { "a": (1,), }, cooperator=self.cooperator, boundary=b"heyDavid") # unknown value type self.assertRaises( ValueError, MultiPartProducer, { "a": {"a": "b"}, }, cooperator=self.cooperator, boundary=b"heyDavid") def test_twoFields(self): """ Make sure multiple fields are rendered properly. 
""" output = self.getOutput( MultiPartProducer({ "afield": "just a string\r\n", "bfield": "another string" }, cooperator=self.cooperator, boundary=b"heyDavid")) self.assertEqual(self.newLines(b"""--heyDavid Content-Disposition: form-data; name="afield" just a string --heyDavid Content-Disposition: form-data; name="bfield" another string --heyDavid-- """), output) def test_fieldsAndAttachment(self): """ Make sure multiple fields are rendered properly. """ output, producer = self.getOutput( MultiPartProducer({ "bfield": "just a string\r\n", "cfield": "another string", "afield": ( "file name", "text/hello-world", FileBodyProducer( inputFile=BytesIO(b"my lovely bytes"), cooperator=self.cooperator)) }, cooperator=self.cooperator, boundary=b"heyDavid"), with_producer=True) expected = self.newLines(b"""--heyDavid Content-Disposition: form-data; name="bfield" just a string --heyDavid Content-Disposition: form-data; name="cfield" another string --heyDavid Content-Disposition: form-data; name="afield"; filename="file name" Content-Type: text/hello-world Content-Length: 15 my lovely bytes --heyDavid-- """) self.assertEqual(producer.length, len(expected)) self.assertEqual(output, expected) def test_multipleFieldsAndAttachments(self): """ Make sure multiple fields, attachments etc are rendered properly. 
""" output, producer = self.getOutput( MultiPartProducer({ "cfield": "just a string\r\n", "bfield": "another string", "efield": ( "ef", "text/html", FileBodyProducer( inputFile=BytesIO(b"my lovely bytes2"), cooperator=self.cooperator)), "xfield": ( "xf", "text/json", FileBodyProducer( inputFile=BytesIO(b"my lovely bytes219"), cooperator=self.cooperator)), "afield": ( "af", "text/xml", FileBodyProducer( inputFile=BytesIO(b"my lovely bytes22"), cooperator=self.cooperator)) }, cooperator=self.cooperator, boundary=b"heyDavid"), with_producer=True) expected = self.newLines(b"""--heyDavid Content-Disposition: form-data; name="bfield" another string --heyDavid Content-Disposition: form-data; name="cfield" just a string --heyDavid Content-Disposition: form-data; name="afield"; filename="af" Content-Type: text/xml Content-Length: 17 my lovely bytes22 --heyDavid Content-Disposition: form-data; name="efield"; filename="ef" Content-Type: text/html Content-Length: 16 my lovely bytes2 --heyDavid Content-Disposition: form-data; name="xfield"; filename="xf" Content-Type: text/json Content-Length: 18 my lovely bytes219 --heyDavid-- """) self.assertEqual(producer.length, len(expected)) self.assertEqual(output, expected) def test_unicodeAttachmentName(self): """ Make sure unicode attachment names are supported. 
""" output, producer = self.getOutput( MultiPartProducer({ "field": ( u'Так себе имя.jpg', "image/jpeg", FileBodyProducer( inputFile=BytesIO(b"my lovely bytes"), cooperator=self.cooperator ) ) }, cooperator=self.cooperator, boundary=b"heyDavid"), with_producer=True) expected = self.newLines(u"""--heyDavid Content-Disposition: form-data; name="field"; filename="Так себе имя.jpg" Content-Type: image/jpeg Content-Length: 15 my lovely bytes --heyDavid-- """.encode("utf-8")) self.assertEqual(len(expected), producer.length) self.assertEqual(expected, output) def test_missingAttachmentName(self): """ Make sure attachments without names are supported """ output, producer = self.getOutput( MultiPartProducer({ "field": ( None, "image/jpeg", FileBodyProducer( inputFile=BytesIO(b"my lovely bytes"), cooperator=self.cooperator, ) ) }, cooperator=self.cooperator, boundary=b"heyDavid"), with_producer=True) expected = self.newLines(b"""--heyDavid Content-Disposition: form-data; name="field" Content-Type: image/jpeg Content-Length: 15 my lovely bytes --heyDavid-- """) self.assertEqual(len(expected), producer.length) self.assertEqual(expected, output) def test_newLinesInParams(self): """ Make sure we generate proper format even with newlines in attachments """ output = self.getOutput( MultiPartProducer({ "field": ( u'\r\noops.j\npg', "image/jp\reg\n", FileBodyProducer( inputFile=BytesIO(b"my lovely bytes"), cooperator=self.cooperator ) ) }, cooperator=self.cooperator, boundary=b"heyDavid" ) ) self.assertEqual(self.newLines(u"""--heyDavid Content-Disposition: form-data; name="field"; filename="oops.jpg" Content-Type: image/jpeg Content-Length: 15 my lovely bytes --heyDavid-- """.encode("utf-8")), output) def test_worksWithCgi(self): """ Make sure the stuff we generated actually parsed by python cgi """ output = self.getOutput( MultiPartProducer([ ("cfield", "just a string\r\n"), ("cfield", "another string"), ("efield", ('ef', "text/html", FileBodyProducer( inputFile=BytesIO(b"my 
lovely bytes2"), cooperator=self.cooperator, ))), ("xfield", ('xf', "text/json", FileBodyProducer( inputFile=BytesIO(b"my lovely bytes219"), cooperator=self.cooperator, ))), ("afield", ('af', "text/xml", FileBodyProducer( inputFile=BytesIO(b"my lovely bytes22"), cooperator=self.cooperator, ))) ], cooperator=self.cooperator, boundary=b"heyDavid" ) ) form = cgi.parse_multipart(BytesIO(output), { "boundary": b"heyDavid", "CONTENT-LENGTH": str(len(output)), }) # Since Python 3.7, the value for a non-file field is now a list # of strings, not bytes. if sys.version_info >= (3, 7): self.assertEqual(set(['just a string\r\n', 'another string']), set(form['cfield'])) else: self.assertEqual(set([b'just a string\r\n', b'another string']), set(form['cfield'])) self.assertEqual(set([b'my lovely bytes2']), set(form['efield'])) self.assertEqual(set([b'my lovely bytes219']), set(form['xfield'])) self.assertEqual(set([b'my lovely bytes22']), set(form['afield'])) class LengthConsumerTestCase(unittest.TestCase): """ Tests for the _LengthConsumer, an L{IConsumer} which is used to compute the length of a produced content. """ def test_scalarsUpdateCounter(self): """ When an int is written, _LengthConsumer updates its internal counter. 
""" consumer = _LengthConsumer() self.assertEqual(consumer.length, 0) consumer.write(1) self.assertEqual(consumer.length, 1) consumer.write(2147483647) self.assertEqual(consumer.length, 2147483648) def test_stringUpdatesCounter(self): """ Use the written string length to update the internal counter """ a = (b"Cantami, o Diva, del Pelide Achille\n l'ira funesta che " b"infiniti addusse\n lutti agli Achei") consumer = _LengthConsumer() self.assertEqual(consumer.length, 0) consumer.write(a) self.assertEqual(consumer.length, 89) ././@PaxHeader0000000000000000000000000000002600000000000011453 xustar000000000000000022 mtime=1611529172.0 treq-22.2.0/src/treq/test/test_response.py0000664000175000017500000001110000000000000020174 0ustar00twmtwm00000000000000from decimal import Decimal from twisted.trial.unittest import SynchronousTestCase from twisted.python.failure import Failure from twisted.web.client import ResponseDone from twisted.web.iweb import UNKNOWN_LENGTH from twisted.web.http_headers import Headers from treq.response import _Response class FakeResponse: def __init__(self, code, headers, body=()): self.code = code self.headers = headers self.previousResponse = None self._body = body self.length = sum(len(c) for c in body) def setPreviousResponse(self, response): self.previousResponse = response def deliverBody(self, protocol): for chunk in self._body: protocol.dataReceived(chunk) protocol.connectionLost(Failure(ResponseDone())) class ResponseTests(SynchronousTestCase): def test_repr_content_type(self): """ When the response has a Content-Type header its value is included in the response. """ headers = Headers({'Content-Type': ['text/html']}) original = FakeResponse(200, headers, body=[b'']) self.assertEqual( "", repr(_Response(original, None)), ) def test_repr_content_type_missing(self): """ A request with no Content-Type just displays an empty field. 
""" original = FakeResponse(204, Headers(), body=[b'']) self.assertEqual( "", repr(_Response(original, None)), ) def test_repr_content_type_hostile(self): """ Garbage in the Content-Type still produces a reasonable representation. """ headers = Headers({'Content-Type': [u'\u2e18', ' x/y']}) original = FakeResponse(418, headers, body=[b'']) self.assertEqual( r"", repr(_Response(original, None)), ) def test_repr_unknown_length(self): """ A HTTP 1.0 or chunked response displays an unknown length. """ headers = Headers({'Content-Type': ['text/event-stream']}) original = FakeResponse(200, headers) original.length = UNKNOWN_LENGTH self.assertEqual( "", repr(_Response(original, None)), ) def test_collect(self): original = FakeResponse(200, Headers(), body=[b'foo', b'bar', b'baz']) calls = [] _Response(original, None).collect(calls.append) self.assertEqual([b'foo', b'bar', b'baz'], calls) def test_content(self): original = FakeResponse(200, Headers(), body=[b'foo', b'bar', b'baz']) self.assertEqual( b'foobarbaz', self.successResultOf(_Response(original, None).content()), ) def test_json(self): original = FakeResponse(200, Headers(), body=[b'{"foo": ', b'"bar"}']) self.assertEqual( {'foo': 'bar'}, self.successResultOf(_Response(original, None).json()), ) def test_json_customized(self): original = FakeResponse(200, Headers(), body=[b'{"foo": ', b'1.0000000000000001}']) self.assertEqual( self.successResultOf(_Response(original, None).json( parse_float=Decimal) )["foo"], Decimal("1.0000000000000001") ) def test_text(self): headers = Headers({b'content-type': [b'text/plain;charset=utf-8']}) original = FakeResponse(200, headers, body=[b'\xe2\x98', b'\x83']) self.assertEqual( u'\u2603', self.successResultOf(_Response(original, None).text()), ) def test_history(self): redirect1 = FakeResponse( 301, Headers({'location': ['http://example.com/']}) ) redirect2 = FakeResponse( 302, Headers({'location': ['https://example.com/']}) ) redirect2.setPreviousResponse(redirect1) final = 
FakeResponse(200, Headers({})) final.setPreviousResponse(redirect2) wrapper = _Response(final, None) history = wrapper.history() self.assertEqual(wrapper.code, 200) self.assertEqual(history[0].code, 301) self.assertEqual(history[1].code, 302) def test_no_history(self): wrapper = _Response(FakeResponse(200, Headers({})), None) self.assertEqual(wrapper.history(), []) ././@PaxHeader0000000000000000000000000000002600000000000011453 xustar000000000000000022 mtime=1644389690.0 treq-22.2.0/src/treq/test/test_testing.py0000664000175000017500000005476600000000000020044 0ustar00twmtwm00000000000000""" In-memory treq returns stubbed responses. """ from functools import partial from inspect import getmembers, isfunction from json import dumps from unittest.mock import ANY from twisted.trial.unittest import TestCase from twisted.web.client import ResponseFailed from twisted.web.error import SchemeNotSupported from twisted.web.resource import Resource from twisted.web.server import NOT_DONE_YET import treq from treq.testing import ( HasHeaders, RequestSequence, StringStubbingResource, StubTreq ) class _StaticTestResource(Resource): """Resource that always returns 418 "I'm a teapot""" isLeaf = True def render(self, request): request.setResponseCode(418) request.setHeader(b"x-teapot", b"teapot!") return b"I'm a teapot" class _RedirectResource(Resource): """ Resource that redirects to a different domain. 
""" isLeaf = True def render(self, request): if b'redirected' not in request.uri: request.redirect(b'https://example.org/redirected') return dumps( { key.decode("charmap"): [ value.decode("charmap") for value in values ] for key, values in request.requestHeaders.getAllRawHeaders()} ).encode("utf-8") class _NonResponsiveTestResource(Resource): """Resource that returns NOT_DONE_YET and never finishes the request""" isLeaf = True def render(self, request): return NOT_DONE_YET class _EventuallyResponsiveTestResource(Resource): """ Resource that returns NOT_DONE_YET and stores the request so that something else can finish the response later. """ isLeaf = True def render(self, request): self.stored_request = request return NOT_DONE_YET class _SessionIdTestResource(Resource): """ Resource that returns the current session ID. """ isLeaf = True def __init__(self): super().__init__() # keep track of all sessions created, so we can manually expire them later self.sessions = [] def render(self, request): session = request.getSession() if session not in self.sessions: # new session, add to internal list self.sessions.append(session) uid = session.uid return uid def expire_sessions(self): """ Manually expire all sessions created by this resource. """ for session in self.sessions: session.expire() self.sessions = [] class StubbingTests(TestCase): """ Tests for :class:`StubTreq`. """ def test_stubtreq_provides_all_functions_in_treq_all(self): """ Every single function and attribute exposed by :obj:`treq.__all__` is provided by :obj:`StubTreq`. """ treq_things = [(name, obj) for name, obj in getmembers(treq) if name in treq.__all__] stub = StubTreq(_StaticTestResource()) api_things = [(name, obj) for name, obj in treq_things if obj.__module__ == "treq.api"] content_things = [(name, obj) for name, obj in treq_things if obj.__module__ == "treq.content"] # sanity checks - this test should fail if treq exposes a new API # without changes being made to StubTreq and this test. 
msg = ("At the time this test was written, StubTreq only knew about " "treq exposing functions from treq.api and treq.content. If " "this has changed, StubTreq will need to be updated, as will " "this test.") self.assertTrue(all(isfunction(obj) for name, obj in treq_things), msg) self.assertEqual(set(treq_things), set(api_things + content_things), msg) for name, obj in api_things: self.assertTrue( isfunction(getattr(stub, name, None)), "StubTreq.{0} should be a function.".format(name)) for name, obj in content_things: self.assertIs( getattr(stub, name, None), obj, "StubTreq.{0} should just expose treq.{0}".format(name)) def test_providing_resource_to_stub_treq(self): """ The resource provided to StubTreq responds to every request no matter what the URI or parameters or data. """ verbs = ('GET', 'PUT', 'HEAD', 'PATCH', 'DELETE', 'POST') urls = ( 'http://supports-http.com', 'https://supports-https.com', 'http://this/has/a/path/and/invalid/domain/name', 'https://supports-https.com:8080', 'http://supports-http.com:8080', ) params = (None, {}, {b'page': [1]}) headers = (None, {}, {b'x-random-header': [b'value', b'value2']}) data = (None, b"", b'some data', b'{"some": "json"}') stub = StubTreq(_StaticTestResource()) combos = ( (verb, {"url": url, "params": p, "headers": h, "data": d}) for verb in verbs for url in urls for p in params for h in headers for d in data ) for combo in combos: verb, kwargs = combo deferreds = (stub.request(verb, **kwargs), getattr(stub, verb.lower())(**kwargs)) for d in deferreds: resp = self.successResultOf(d) self.assertEqual(418, resp.code) self.assertEqual([b'teapot!'], resp.headers.getRawHeaders(b'x-teapot')) self.assertEqual(b"" if verb == "HEAD" else b"I'm a teapot", self.successResultOf(stub.content(resp))) def test_handles_invalid_schemes(self): """ Invalid URLs errback with a :obj:`SchemeNotSupported` failure, and does so even after a successful request. 
""" stub = StubTreq(_StaticTestResource()) self.failureResultOf(stub.get("x-unknown-1:"), SchemeNotSupported) self.successResultOf(stub.get("http://url.com")) self.failureResultOf(stub.get("x-unknown-2:"), SchemeNotSupported) def test_files_are_rejected(self): """ StubTreq does not handle files yet - it should reject requests which attempt to pass files. """ stub = StubTreq(_StaticTestResource()) self.assertRaises( AssertionError, stub.request, 'method', 'http://url', files=b'some file') def test_passing_in_strange_data_is_rejected(self): """ StubTreq rejects data that isn't list/dictionary/tuple/bytes/unicode. """ stub = StubTreq(_StaticTestResource()) self.assertRaises( AssertionError, stub.request, 'method', 'http://url', data=object()) self.successResultOf(stub.request('method', 'http://url', data={})) self.successResultOf(stub.request('method', 'http://url', data=[])) self.successResultOf(stub.request('method', 'http://url', data=())) self.successResultOf( stub.request('method', 'http://url', data=b"")) self.successResultOf( stub.request('method', 'http://url', data="")) def test_handles_failing_asynchronous_requests(self): """ Handle a resource returning NOT_DONE_YET and then canceling the request. """ stub = StubTreq(_NonResponsiveTestResource()) d = stub.request('method', 'http://url', data=b"1234") self.assertNoResult(d) d.cancel() self.failureResultOf(d, ResponseFailed) def test_handles_successful_asynchronous_requests(self): """ Handle a resource returning NOT_DONE_YET and then later finishing the response. """ rsrc = _EventuallyResponsiveTestResource() stub = StubTreq(rsrc) d = stub.request('method', 'http://example.com/', data=b"1234") self.assertNoResult(d) rsrc.stored_request.finish() stub.flush() resp = self.successResultOf(d) self.assertEqual(resp.code, 200) def test_handles_successful_asynchronous_requests_with_response_data(self): """ Handle a resource returning NOT_DONE_YET and then sending some data in the response. 
""" rsrc = _EventuallyResponsiveTestResource() stub = StubTreq(rsrc) d = stub.request('method', 'http://example.com/', data=b"1234") self.assertNoResult(d) chunks = [] rsrc.stored_request.write(b'spam ') rsrc.stored_request.write(b'eggs') stub.flush() resp = self.successResultOf(d) d = stub.collect(resp, chunks.append) self.assertNoResult(d) self.assertEqual(b''.join(chunks), b'spam eggs') rsrc.stored_request.finish() stub.flush() self.successResultOf(d) def test_handles_successful_asynchronous_requests_with_streaming(self): """ Handle a resource returning NOT_DONE_YET and then streaming data back gradually over time. """ rsrc = _EventuallyResponsiveTestResource() stub = StubTreq(rsrc) d = stub.request('method', 'http://example.com/', data="1234") self.assertNoResult(d) chunks = [] rsrc.stored_request.write(b'spam ') rsrc.stored_request.write(b'eggs') stub.flush() resp = self.successResultOf(d) d = stub.collect(resp, chunks.append) self.assertNoResult(d) self.assertEqual(b''.join(chunks), b'spam eggs') del chunks[:] rsrc.stored_request.write(b'eggs\r\nspam\r\n') stub.flush() self.assertNoResult(d) self.assertEqual(b''.join(chunks), b'eggs\r\nspam\r\n') rsrc.stored_request.finish() stub.flush() self.successResultOf(d) def test_session_persistence_between_requests(self): """ Calling request.getSession() in the wrapped resource will return a session with the same ID, until the sessions are cleaned; in other words, cookies are propagated between requests when the result of C{response.cookies()} is passed to the next request. 
""" rsrc = _SessionIdTestResource() stub = StubTreq(rsrc) # request 1, getting original session ID d = stub.request("method", "http://example.com/") resp = self.successResultOf(d) cookies = resp.cookies() sid_1 = self.successResultOf(resp.content()) # request 2, ensuring session ID stays the same d = stub.request("method", "http://example.com/", cookies=cookies) resp = self.successResultOf(d) sid_2 = self.successResultOf(resp.content()) self.assertEqual(sid_1, sid_2) # request 3, ensuring the session IDs are different after cleaning # or expiring the sessions # manually expire the sessions. rsrc.expire_sessions() d = stub.request("method", "http://example.com/") resp = self.successResultOf(d) cookies = resp.cookies() sid_3 = self.successResultOf(resp.content()) self.assertNotEqual(sid_1, sid_3) # request 4, ensuring that once again the session IDs are the same d = stub.request("method", "http://example.com/", cookies=cookies) resp = self.successResultOf(d) sid_4 = self.successResultOf(resp.content()) self.assertEqual(sid_3, sid_4) def test_cookies_not_sent_to_different_domains(self): """ Cookies manually specified as part of a dictionary are not relayed through redirects to different domains. (This is really more of a test for scoping of cookies within treq itself, rather than just for testing.) """ rsrc = _RedirectResource() stub = StubTreq(rsrc) d = stub.request( "GET", "http://example.com/", cookies={"not-across-redirect": "nope"} ) resp = self.successResultOf(d) received = self.successResultOf(resp.json()) self.assertNotIn('not-across-redirect', received.get('Cookie', [''])[0]) def test_cookies_sent_for_same_domain(self): """ Cookies manually specified as part of a dictionary are relayed through redirects to the same domain. (This is really more of a test for scoping of cookies within treq itself, rather than just for testing.) 
""" rsrc = _RedirectResource() stub = StubTreq(rsrc) d = stub.request( "GET", "https://example.org/", cookies={'sent-to-same-domain': 'yes'} ) resp = self.successResultOf(d) received = self.successResultOf(resp.json()) self.assertIn('sent-to-same-domain', received.get('Cookie', [''])[0]) def test_cookies_sent_with_explicit_port(self): """ Cookies will be sent for URLs that specify a non-default port for their scheme. (This is really more of a test for scoping of cookies within treq itself, rather than just for testing.) """ rsrc = _RedirectResource() stub = StubTreq(rsrc) d = stub.request( "GET", "http://example.org:8080/redirected", cookies={'sent-to-non-default-port': 'yes'} ) resp = self.successResultOf(d) received = self.successResultOf(resp.json()) self.assertIn('sent-to-non-default-port', received.get('Cookie', [''])[0]) d = stub.request( "GET", "https://example.org:8443/redirected", cookies={'sent-to-non-default-port': 'yes'} ) resp = self.successResultOf(d) received = self.successResultOf(resp.json()) self.assertIn('sent-to-non-default-port', received.get('Cookie', [''])[0]) class HasHeadersTests(TestCase): """ Tests for :obj:`HasHeaders`. """ def test_equality_and_strict_subsets_succeed(self): """ The :obj:`HasHeaders` returns True if both sets of headers are equivalent, or the first is a strict subset of the second. """ self.assertEqual(HasHeaders({'one': ['two', 'three']}), {'one': ['two', 'three']}, "Equivalent headers do not match.") self.assertEqual(HasHeaders({'one': ['two', 'three']}), {'one': ['two', 'three', 'four'], 'ten': ['six']}, "Strict subset headers do not match") def test_partial_or_zero_intersection_subsets_fail(self): """ The :obj:`HasHeaders` returns False if both sets of headers overlap but the first is not a strict subset of the second. It also returns False if there is no overlap. 
""" self.assertNotEqual(HasHeaders({'one': ['two', 'three']}), {'one': ['three', 'four']}, "Partial value overlap matches") self.assertNotEqual(HasHeaders({'one': ['two', 'three']}), {'one': ['two']}, "Missing value matches") self.assertNotEqual(HasHeaders({'one': ['two', 'three']}), {'ten': ['six']}, "Complete inequality matches") def test_case_insensitive_keys(self): """ The :obj:`HasHeaders` equality function ignores the case of the header keys. """ self.assertEqual(HasHeaders({b'A': [b'1'], b'b': [b'2']}), {b'a': [b'1'], b'B': [b'2']}) def test_case_sensitive_values(self): """ The :obj:`HasHeaders` equality function does care about the case of the header value. """ self.assertNotEqual(HasHeaders({b'a': [b'a']}), {b'a': [b'A']}) def test_bytes_encoded_forms(self): """ The :obj:`HasHeaders` equality function compares the bytes-encoded forms of both sets of headers. """ self.assertEqual(HasHeaders({b'a': [b'a']}), {u'a': [u'a']}) self.assertEqual(HasHeaders({u'b': [u'b']}), {b'b': [b'b']}) def test_repr(self): """ :obj:`HasHeaders` returns a nice string repr. """ self.assertEqual( "HasHeaders({b'a': [b'b']})", repr(HasHeaders({b"A": [b"b"]})), ) class StringStubbingTests(TestCase): """ Tests for :obj:`StringStubbingResource`. """ def _get_response_for(self, expected_args, response): """ Make a :obj:`IStringResponseStubs` that checks the expected args and returns the given response. """ method, url, params, headers, data = expected_args def get_response_for(_method, _url, _params, _headers, _data): self.assertEqual((method, url, params, data), (_method, _url, _params, _data)) self.assertEqual(HasHeaders(headers), _headers) return response return get_response_for def test_interacts_successfully_with_istub(self): """ The :obj:`IStringResponseStubs` is passed the correct parameters with which to evaluate the response, and the response is returned. 
""" resource = StringStubbingResource(self._get_response_for( (b'DELETE', 'http://what/a/thing', {b'page': [b'1']}, {b'x-header': [b'eh']}, b'datastr'), (418, {b'x-response': b'responseheader'}, b'response body'))) stub = StubTreq(resource) d = stub.delete('http://what/a/thing', headers={b'x-header': b'eh'}, params={b'page': b'1'}, data=b'datastr') resp = self.successResultOf(d) self.assertEqual(418, resp.code) self.assertEqual([b'responseheader'], resp.headers.getRawHeaders(b'x-response')) self.assertEqual(b'response body', self.successResultOf(stub.content(resp))) class RequestSequenceTests(TestCase): """ Tests for :obj:`RequestSequence`. """ def setUp(self): """ Set up a way to report failures asynchronously. """ self.async_failures = [] def test_mismatched_request_causes_failure(self): """ If a request is made that is not expected as the next request, causes a failure. """ sequence = RequestSequence( [((b'get', 'https://anything/', {b'1': [b'2']}, HasHeaders({b'1': [b'1']}), b'what'), (418, {}, b'body')), ((b'get', 'http://anything', {}, HasHeaders({b'2': [b'1']}), b'what'), (202, {}, b'deleted'))], async_failure_reporter=self.async_failures.append) stub = StubTreq(StringStubbingResource(sequence)) get = partial(stub.get, 'https://anything?1=2', data=b'what', headers={b'1': b'1'}) resp = self.successResultOf(get()) self.assertEqual(418, resp.code) self.assertEqual(b'body', self.successResultOf(stub.content(resp))) self.assertEqual([], self.async_failures) resp = self.successResultOf(get()) self.assertEqual(500, resp.code) self.assertEqual(1, len(self.async_failures)) self.assertIn("Expected the next request to be", self.async_failures[0]) self.assertFalse(sequence.consumed()) def test_unexpected_number_of_request_causes_failure(self): """ If there are no more expected requests, making a request causes a failure. 
""" sequence = RequestSequence( [], async_failure_reporter=self.async_failures.append) stub = StubTreq(StringStubbingResource(sequence)) d = stub.get('https://anything', data=b'what', headers={b'1': b'1'}) resp = self.successResultOf(d) self.assertEqual(500, resp.code) self.assertEqual(b'StubbingError', self.successResultOf(resp.content())) self.assertEqual(1, len(self.async_failures)) self.assertIn("No more requests expected, but request", self.async_failures[0]) # the expected requests have all been made self.assertTrue(sequence.consumed()) def test_works_with_mock_any(self): """ :obj:`mock.ANY` can be used with the request parameters. """ sequence = RequestSequence( [((ANY, ANY, ANY, ANY, ANY), (418, {}, b'body'))], async_failure_reporter=self.async_failures.append) stub = StubTreq(StringStubbingResource(sequence)) with sequence.consume(sync_failure_reporter=self.fail): d = stub.get('https://anything', data=b'what', headers={b'1': b'1'}) resp = self.successResultOf(d) self.assertEqual(418, resp.code) self.assertEqual(b'body', self.successResultOf(stub.content(resp))) self.assertEqual([], self.async_failures) # the expected requests have all been made self.assertTrue(sequence.consumed()) def test_consume_context_manager_fails_on_remaining_requests(self): """ If the `consume` context manager is used, if there are any remaining expecting requests, the test case will be failed. """ sequence = RequestSequence( [((ANY, ANY, ANY, ANY, ANY), (418, {}, b'body'))] * 2, async_failure_reporter=self.async_failures.append) stub = StubTreq(StringStubbingResource(sequence)) consume_failures = [] with sequence.consume(sync_failure_reporter=consume_failures.append): self.successResultOf(stub.get('https://anything', data=b'what', headers={b'1': b'1'})) self.assertEqual(1, len(consume_failures)) self.assertIn( "Not all expected requests were made. 
Still expecting:", consume_failures[0]) self.assertIn( "{0}(url={0}, params={0}, headers={0}, data={0})".format( repr(ANY)), consume_failures[0]) # no asynchronous failures (mismatches, etc.) self.assertEqual([], self.async_failures) def test_async_failures_logged(self): """ When no `async_failure_reporter` is passed async failures are logged by default. """ sequence = RequestSequence([]) stub = StubTreq(StringStubbingResource(sequence)) with sequence.consume(self.fail): self.successResultOf(stub.get('https://example.com')) [failure] = self.flushLoggedErrors() self.assertIsInstance(failure.value, AssertionError) ././@PaxHeader0000000000000000000000000000002600000000000011453 xustar000000000000000022 mtime=1644040085.0 treq-22.2.0/src/treq/test/test_treq_integration.py0000664000175000017500000002303000000000000021721 0ustar00twmtwm00000000000000from io import BytesIO from twisted.python.url import URL from twisted.trial.unittest import TestCase from twisted.internet.defer import CancelledError, inlineCallbacks from twisted.internet.task import deferLater from twisted.internet import reactor from twisted.internet.tcp import Client from twisted.internet.ssl import Certificate, trustRootFromCertificates from twisted.web.client import (Agent, BrowserLikePolicyForHTTPS, HTTPConnectionPool, ResponseFailed) from treq.test.util import DEBUG, skip_on_windows_because_of_199 from .local_httpbin.parent import _HTTPBinProcess import treq skip = skip_on_windows_because_of_199() @inlineCallbacks def print_response(response): if DEBUG: print() print('---') print(response.code) print(response.headers) print(response.request.headers) text = yield treq.text_content(response) print(text) print('---') def with_baseurl(method): def _request(self, url, *args, **kwargs): return method(self.baseurl + url, *args, agent=self.agent, pool=self.pool, **kwargs) return _request class TreqIntegrationTests(TestCase): get = with_baseurl(treq.get) head = with_baseurl(treq.head) post = 
with_baseurl(treq.post) put = with_baseurl(treq.put) patch = with_baseurl(treq.patch) delete = with_baseurl(treq.delete) _httpbin_process = _HTTPBinProcess(https=False) @inlineCallbacks def setUp(self): description = yield self._httpbin_process.server_description( reactor) self.baseurl = URL(scheme=u"http", host=description.host, port=description.port).asText() self.agent = Agent(reactor) self.pool = HTTPConnectionPool(reactor, False) def tearDown(self): def _check_fds(_): # This appears to only be necessary for HTTPS tests. # For the normal HTTP tests then closeCachedConnections is # sufficient. fds = set(reactor.getReaders() + reactor.getReaders()) if not [fd for fd in fds if isinstance(fd, Client)]: return return deferLater(reactor, 0, _check_fds, None) return self.pool.closeCachedConnections().addBoth(_check_fds) @inlineCallbacks def assert_data(self, response, expected_data): body = yield treq.json_content(response) self.assertIn('data', body) self.assertEqual(body['data'], expected_data) @inlineCallbacks def assert_sent_header(self, response, header, expected_value): body = yield treq.json_content(response) self.assertIn(header, body['headers']) self.assertEqual(body['headers'][header], expected_value) @inlineCallbacks def test_get(self): response = yield self.get('/get') self.assertEqual(response.code, 200) yield print_response(response) @inlineCallbacks def test_get_headers(self): response = yield self.get('/get', {b'X-Blah': [b'Foo', b'Bar']}) self.assertEqual(response.code, 200) yield self.assert_sent_header(response, 'X-Blah', 'Foo,Bar') yield print_response(response) @inlineCallbacks def test_get_headers_unicode(self): response = yield self.get('/get', {u'X-Blah': [u'Foo', b'Bar']}) self.assertEqual(response.code, 200) yield self.assert_sent_header(response, 'X-Blah', 'Foo,Bar') yield print_response(response) @inlineCallbacks def test_get_302_absolute_redirect(self): response = yield self.get( '/redirect-to?url={0}/get'.format(self.baseurl)) 
        self.assertEqual(response.code, 200)
        yield print_response(response)

    @inlineCallbacks
    def test_get_302_relative_redirect(self):
        response = yield self.get('/relative-redirect/1')
        self.assertEqual(response.code, 200)
        yield print_response(response)

    @inlineCallbacks
    def test_get_302_redirect_disallowed(self):
        response = yield self.get('/redirect/1', allow_redirects=False)
        self.assertEqual(response.code, 302)
        yield print_response(response)

    @inlineCallbacks
    def test_head(self):
        response = yield self.head('/get')
        body = yield treq.content(response)
        self.assertEqual(b'', body)
        yield print_response(response)

    @inlineCallbacks
    def test_head_302_absolute_redirect(self):
        response = yield self.head(
            '/redirect-to?url={0}/get'.format(self.baseurl))
        self.assertEqual(response.code, 200)
        yield print_response(response)

    @inlineCallbacks
    def test_head_302_relative_redirect(self):
        response = yield self.head('/relative-redirect/1')
        self.assertEqual(response.code, 200)
        yield print_response(response)

    @inlineCallbacks
    def test_head_302_redirect_disallowed(self):
        response = yield self.head('/redirect/1', allow_redirects=False)
        self.assertEqual(response.code, 302)
        yield print_response(response)

    @inlineCallbacks
    def test_post(self):
        response = yield self.post('/post', b'Hello!')
        self.assertEqual(response.code, 200)
        yield self.assert_data(response, 'Hello!')
        yield print_response(response)

    @inlineCallbacks
    def test_multipart_post(self):
        class FileLikeObject(BytesIO):
            def __init__(self, val):
                BytesIO.__init__(self, val)
                self.name = "david.png"

            def read(*args, **kwargs):
                return BytesIO.read(*args, **kwargs)

        response = yield self.post(
            '/post',
            data={"a": "b"},
            files={"file1": FileLikeObject(b"file")})
        self.assertEqual(response.code, 200)

        body = yield treq.json_content(response)
        self.assertEqual('b', body['form']['a'])
        self.assertEqual('file', body['files']['file1'])
        yield print_response(response)

    @inlineCallbacks
    def test_post_headers(self):
        response = yield self.post(
            '/post',
            b'{msg: "Hello!"}',
            headers={'Content-Type': ['application/json']}
        )
        self.assertEqual(response.code, 200)
        yield self.assert_sent_header(
            response, 'Content-Type', 'application/json')
        yield self.assert_data(response, '{msg: "Hello!"}')
        yield print_response(response)

    @inlineCallbacks
    def test_put(self):
        response = yield self.put('/put', data=b'Hello!')
        yield print_response(response)

    @inlineCallbacks
    def test_patch(self):
        response = yield self.patch('/patch', data=b'Hello!')
        self.assertEqual(response.code, 200)
        yield self.assert_data(response, 'Hello!')
        yield print_response(response)

    @inlineCallbacks
    def test_delete(self):
        response = yield self.delete('/delete')
        self.assertEqual(response.code, 200)
        yield print_response(response)

    @inlineCallbacks
    def test_gzip(self):
        response = yield self.get('/gzip')
        self.assertEqual(response.code, 200)
        yield print_response(response)
        json = yield treq.json_content(response)
        self.assertTrue(json['gzipped'])

    @inlineCallbacks
    def test_basic_auth(self):
        response = yield self.get('/basic-auth/treq/treq',
                                  auth=('treq', 'treq'))
        self.assertEqual(response.code, 200)
        yield print_response(response)
        json = yield treq.json_content(response)
        self.assertTrue(json['authenticated'])
        self.assertEqual(json['user'], 'treq')

    @inlineCallbacks
    def test_failed_basic_auth(self):
        response = yield self.get('/basic-auth/treq/treq',
                                  auth=('not-treq', 'not-treq'))
        self.assertEqual(response.code, 401)
        yield print_response(response)

    @inlineCallbacks
    def test_timeout(self):
        """
        Verify a timeout fires if a request takes too long.
""" yield self.assertFailure(self.get('/delay/2', timeout=1), CancelledError, ResponseFailed) @inlineCallbacks def test_cookie(self): response = yield self.get('/cookies', cookies={'hello': 'there'}) self.assertEqual(response.code, 200) yield print_response(response) json = yield treq.json_content(response) self.assertEqual(json['cookies']['hello'], 'there') @inlineCallbacks def test_set_cookie(self): response = yield self.get('/cookies/set', allow_redirects=False, params={'hello': 'there'}) # self.assertEqual(response.code, 200) yield print_response(response) self.assertEqual(response.cookies()['hello'], 'there') class HTTPSTreqIntegrationTests(TreqIntegrationTests): _httpbin_process = _HTTPBinProcess(https=True) @inlineCallbacks def setUp(self): description = yield self._httpbin_process.server_description( reactor) self.baseurl = URL(scheme=u"https", host=description.host, port=description.port).asText() root = trustRootFromCertificates( [Certificate.loadPEM(description.cacert)], ) self.agent = Agent( reactor, contextFactory=BrowserLikePolicyForHTTPS(root), ) self.pool = HTTPConnectionPool(reactor, False) ././@PaxHeader0000000000000000000000000000002600000000000011453 xustar000000000000000022 mtime=1611529172.0 treq-22.2.0/src/treq/test/util.py0000664000175000017500000000143300000000000016264 0ustar00twmtwm00000000000000import os import platform from unittest import mock from twisted.internet import reactor from twisted.internet.task import Clock DEBUG = os.getenv("TREQ_DEBUG", False) == "true" is_pypy = platform.python_implementation() == 'PyPy' def with_clock(fn): def wrapper(*args, **kwargs): clock = Clock() with mock.patch.object(reactor, 'callLater', clock.callLater): return fn(*(args + (clock,)), **kwargs) return wrapper def skip_on_windows_because_of_199(): """ Return a skip describing issue #199 under Windows. :return: A :py:class:`str` skip reason. """ if platform.system() == 'Windows': return ("HTTPBin process cannot run under Windows." 
" See https://github.com/twisted/treq/issues/199") return None ././@PaxHeader0000000000000000000000000000002600000000000011453 xustar000000000000000022 mtime=1621403682.0 treq-22.2.0/src/treq/testing.py0000664000175000017500000004715600000000000016021 0ustar00twmtwm00000000000000""" In-memory version of treq for testing. """ from contextlib import contextmanager from functools import wraps try: from twisted.internet.testing import MemoryReactorClock except ImportError: from twisted.test.proto_helpers import MemoryReactorClock from twisted.test import iosim from twisted.internet.address import IPv4Address from twisted.internet.defer import succeed from twisted.internet.interfaces import ISSLTransport from twisted.logger import Logger from twisted.python.failure import Failure from twisted.python.urlpath import URLPath from twisted.internet.endpoints import TCP4ClientEndpoint from twisted.web.client import Agent from twisted.web.error import SchemeNotSupported from twisted.web.iweb import IAgent, IAgentEndpointFactory, IBodyProducer from twisted.web.resource import Resource from twisted.web.server import Session, Site from zope.interface import directlyProvides, implementer import treq from treq.client import HTTPClient import attr @implementer(IAgentEndpointFactory) @attr.s class _EndpointFactory: """ An endpoint factory used by :class:`RequestTraversalAgent`. :ivar reactor: The agent's reactor. :type reactor: :class:`MemoryReactorClock` """ reactor = attr.ib() def endpointForURI(self, uri): """ Create an endpoint that represents an in-memory connection to a URI. Note: This always creates a :class:`~twisted.internet.endpoints.TCP4ClientEndpoint` on the assumption :class:`RequestTraversalAgent` ignores everything about the endpoint but its port. :param uri: The URI to connect to. :type uri: :class:`~twisted.web.client.URI` :return: The endpoint. :rtype: An :class:`~twisted.internet.interfaces.IStreamClientEndpoint` provider. 
""" if uri.scheme not in {b'http', b'https'}: raise SchemeNotSupported("Unsupported scheme: %r" % (uri.scheme,)) return TCP4ClientEndpoint(self.reactor, "127.0.0.1", uri.port) @implementer(IAgent) class RequestTraversalAgent: """ :obj:`~twisted.web.iweb.IAgent` implementation that issues an in-memory request rather than going out to a real network socket. """ def __init__(self, rootResource): """ :param rootResource: The Twisted `IResource` at the root of the resource tree. """ self._memoryReactor = MemoryReactorClock() self._realAgent = Agent.usingEndpointFactory( reactor=self._memoryReactor, endpointFactory=_EndpointFactory(self._memoryReactor)) self._rootResource = rootResource self._serverFactory = Site(self._rootResource, reactor=self._memoryReactor) self._serverFactory.sessionFactory = lambda site, uid: Session( site, uid, reactor=self._memoryReactor, ) self._pumps = set() def request(self, method, uri, headers=None, bodyProducer=None): """ Implement IAgent.request. """ # We want to use Agent to parse the HTTP response, so let's ask it to # make a request against our in-memory reactor. response = self._realAgent.request(method, uri, headers, bodyProducer) # If the request has already finished, just propagate the result. In # reality this would only happen in failure, but if the agent ever adds # a local cache this might be a success. already_called = [] def check_already_called(r): already_called.append(r) return r response.addBoth(check_already_called) if already_called: return response # That will try to establish an HTTP connection with the reactor's # connectTCP method, and MemoryReactor will place Agent's factory into # the tcpClients list. Alternately, it will try to establish an HTTPS # connection with the reactor's connectSSL method, and MemoryReactor # will place it into the sslClients list. We'll extract that. 
scheme = URLPath.fromBytes(uri).scheme host, port, factory, timeout, bindAddress = ( self._memoryReactor.tcpClients[-1]) serverAddress = IPv4Address('TCP', '127.0.0.1', port) clientAddress = IPv4Address('TCP', '127.0.0.1', 31337) # Create the protocol and fake transport for the client and server, # using the factory that was passed to the MemoryReactor for the # client, and a Site around our rootResource for the server. serverProtocol = self._serverFactory.buildProtocol(clientAddress) serverTransport = iosim.FakeTransport( serverProtocol, isServer=True, hostAddress=serverAddress, peerAddress=clientAddress) clientProtocol = factory.buildProtocol(None) clientTransport = iosim.FakeTransport( clientProtocol, isServer=False, hostAddress=clientAddress, peerAddress=serverAddress) if scheme == b"https": # Provide ISSLTransport on both transports, so everyone knows that # this is HTTPS. directlyProvides(serverTransport, ISSLTransport) directlyProvides(clientTransport, ISSLTransport) # Make a pump for wiring the client and server together. pump = iosim.connect( serverProtocol, serverTransport, clientProtocol, clientTransport) self._pumps.add(pump) return response def flush(self): """ Flush all data between pending client/server pairs. This is only necessary if a :obj:`Resource` under test returns :obj:`NOT_DONE_YET` from its ``render`` method, making a response asynchronous. In that case, after each write from the server, :meth:`flush()` must be called so the client can see it. """ old_pumps = self._pumps new_pumps = self._pumps = set() for p in old_pumps: p.flush() if p.clientIO.disconnected and p.serverIO.disconnected: continue new_pumps.add(p) @implementer(IBodyProducer) class _SynchronousProducer: """ A partial implementation of an :obj:`IBodyProducer` which produces its entire payload immediately. There is no way to access an instance of this object from :obj:`RequestTraversalAgent` or :obj:`StubTreq`, or even a :obj:`Resource` passed to :obj:`StubTreq`.
This does not implement the :func:`IBodyProducer.stopProducing` method, because that is very difficult to trigger. (The request from `RequestTraversalAgent` would have to be canceled while it is still in the transmitting state), and the intent is to use `RequestTraversalAgent` to make synchronous requests. """ def __init__(self, body): """ Create a synchronous producer with some bytes. """ self.body = body msg = ("StubTreq currently only supports url-encodable types, bytes, " "or unicode as data.") assert isinstance(body, (bytes, str)), msg if isinstance(body, str): self.body = body.encode('utf-8') self.length = len(body) def startProducing(self, consumer): """ Immediately produce all data. """ consumer.write(self.body) return succeed(None) def _reject_files(f): """ Decorator that rejects the 'files' keyword argument to the request functions, because that is not handled by this yet. """ @wraps(f) def wrapper(*args, **kwargs): if 'files' in kwargs: raise AssertionError("StubTreq cannot handle files.") return f(*args, **kwargs) return wrapper class StubTreq: """ A fake version of the treq module that can be used for testing that provides all the function calls exposed in :obj:`treq.__all__`. """ def __init__(self, resource): """ Construct a client, and pass through client methods and/or treq.content functions. :param resource: A :obj:`Resource` object that provides the fake responses """ self._agent = RequestTraversalAgent(resource) _client = HTTPClient(agent=self._agent, data_to_body_producer=_SynchronousProducer) for function_name in treq.__all__: function = getattr(_client, function_name, None) if function is None: function = getattr(treq, function_name) else: function = _reject_files(function) setattr(self, function_name, function) self.flush = self._agent.flush class StringStubbingResource(Resource): """ A resource that takes a callable with 5 parameters ``(method, url, params, headers, data)`` and returns ``(code, headers, body)``. 
The resource uses the callable to return a real response as a result of a request. The parameters for the callable are: - ``method``, the HTTP method as `bytes`. - ``url``, the full URL of the request as text. - ``params``, a dictionary of query parameters mapping query keys to lists of values (sorted alphabetically). - ``headers``, a dictionary of headers mapping header keys to a list of header values (sorted alphabetically). - ``data``, the request body as `bytes`. The callable must return a ``tuple`` of (code, headers, body) where ``code`` is the HTTP status code, ``headers`` is a dictionary of bytes (unlike the ``headers`` parameter, which is a dictionary of lists), and ``body`` is a string that will be returned as the response body. If there is a stubbing error, the return value is undefined (if an exception is raised, :obj:`~twisted.web.resource.Resource` will just eat it and return 500 in its place). The callable, or whoever creates the callable, should have a way to handle error reporting. """ isLeaf = True def __init__(self, get_response_for): """ See :class:`StringStubbingResource`. """ Resource.__init__(self) self._get_response_for = get_response_for def render(self, request): """ Produce a response according to the stubs provided. """ params = request.args headers = {} for k, v in request.requestHeaders.getAllRawHeaders(): headers[k] = v for dictionary in (params, headers): for k in dictionary: dictionary[k] = sorted(dictionary[k]) # The incoming request does not have the absoluteURI property, because # an incoming request is an IRequest, not an IClientRequest, so # the absolute URI needs to be synthesized. # But request.URLPath() only returns the scheme and hostname, because # that is the URL for this resource (because this resource handles # everything from the root on down).
# So we need to add the request.path (not request.uri, which includes # the query parameters) absoluteURI = str(request.URLPath().click(request.path)) status_code, headers, body = self._get_response_for( request.method, absoluteURI, params, headers, request.content.read()) request.setResponseCode(status_code) for k, v in headers.items(): request.setHeader(k, v) return body def _maybeEncode(someStr): """ Encode `someStr` to ASCII if required. """ if isinstance(someStr, str): return someStr.encode('ascii') return someStr def _maybeEncodeHeaders(headers): """ Convert a headers mapping to its bytes-encoded form. """ return {_maybeEncode(k).lower(): [_maybeEncode(v) for v in vs] for k, vs in headers.items()} class HasHeaders: """ Since Twisted adds headers to a request, such as the host and the content length, it's necessary to test whether request headers CONTAIN the expected headers (the ones that are not automatically added by Twisted). This wraps a set of headers, and can be used in an equality test against a superset of the provided headers. The header keys are lowercased, and keys and values are compared in their bytes-encoded forms. Headers should be provided as a mapping from strings or bytes to a list of strings or bytes. """ def __init__(self, headers): self._headers = _maybeEncodeHeaders(headers) def __repr__(self): return "HasHeaders({0})".format(repr(self._headers)) def __eq__(self, other_headers): compare_to = _maybeEncodeHeaders(other_headers) return (set(self._headers.keys()).issubset(set(compare_to.keys())) and all([set(v).issubset(set(compare_to[k])) for k, v in self._headers.items()])) def __ne__(self, other_headers): return not self.__eq__(other_headers) class RequestSequence: """ For an example usage, see :meth:`RequestSequence.consume`. Takes a sequence of:: [((method, url, params, headers, data), (code, headers, body)), ...] Expects the requests to arrive in sequence order.
If there are no more responses, or the request's parameters do not match the next item's expected request parameters, calls `sync_failure_reporter` or `async_failure_reporter`. For the expected request tuples: - ``method`` should be :class:`bytes` normalized to lowercase. - ``url`` should be a `str` normalized as per the `transformations that (usually) preserve semantics `_. A URL to `http://something-that-looks-like-a-directory` would be normalized to `http://something-that-looks-like-a-directory/` and a URL to `http://something-that-looks-like-a-page/page.html` remains unchanged. - ``params`` is a dictionary mapping :class:`bytes` to :class:`list` of :class:`bytes`. - ``headers`` is a dictionary mapping :class:`bytes` to :class:`list` of :class:`bytes` -- note that :class:`twisted.web.client.Agent` may add its own headers which are not guaranteed to be present (for instance, `user-agent` or `content-length`), so it's better to use some kind of matcher like :class:`HasHeaders`. - ``data`` is a :class:`bytes`. For the response tuples: - ``code`` is an integer representing the HTTP status code to return. - ``headers`` is a dictionary mapping :class:`bytes` to :class:`bytes` or :class:`str`. Note that the value is *not* a list. - ``body`` is a :class:`bytes`. :ivar list sequence: A sequence of (request tuple, response tuple) two-tuples, as described above. :ivar async_failure_reporter: An optional callable that takes a :class:`str` message indicating a failure. It's asynchronous because it cannot just raise an exception; if it does, :meth:`Resource.render ` will just convert that into a 500 response, and there will be no other failure reporting mechanism. When the `async_failure_reporter` parameter is not passed, async failures will be reported via a :class:`twisted.logger.Logger` instance, which Trial's test case classes (:class:`twisted.trial.unittest.TestCase` and :class:`~twisted.trial.unittest.SynchronousTestCase`) will translate into a test failure. ..
note:: Some versions of :class:`twisted.trial.unittest.SynchronousTestCase` report logged errors on the wrong test: see `Twisted #9267 `_. .. TODO Update the above note to say what version of SynchronousTestCase is fixed once Twisted >17.5.0 is released. When not subclassing Trial's classes you must pass `async_failure_reporter` and implement equivalent behavior or errors will pass silently. For example:: async_failures = [] sequence_stubs = RequestSequence([...], async_failures.append) stub_treq = StubTreq(StringStubbingResource(sequence_stubs)) with sequence_stubs.consume(self.fail): # self = unittest.TestCase stub_treq.get('http://fakeurl.com') self.assertEqual([], async_failures) """ _log = Logger() def __init__(self, sequence, async_failure_reporter=None): self._sequence = sequence self._async_reporter = async_failure_reporter or self._log_async_error def _log_async_error(self, message): """ The default async failure reporter; see `async_failure_reporter`. Logs a failure which wraps an :exc:`AssertionError`. :param str message: Failure message """ # Passing message twice may look redundant, but Trial only preserves # the Failure, not the log message. self._log.failure( "RequestSequence async error: {message}", message=message, failure=Failure(AssertionError(message)), ) def consumed(self): """ :return: `bool` representing whether the entire sequence has been consumed. This is useful in tests to assert that the expected requests have all been made. """ return len(self._sequence) == 0 @contextmanager def consume(self, sync_failure_reporter): """ Usage:: sequence_stubs = RequestSequence([...]) stub_treq = StubTreq(StringStubbingResource(sequence_stubs)) # self = twisted.trial.unittest.SynchronousTestCase with sequence_stubs.consume(self.fail): stub_treq.get('http://fakeurl.com') stub_treq.get('http://another-fake-url.com') If there are still remaining expected requests to be made in the sequence, fails the provided test case.
:param sync_failure_reporter: A callable that takes a single message reporting failures. This can just raise an exception - it does not need to be asynchronous, since the exception would not get raised within a Resource. :return: a context manager that can be used to ensure all expected requests have been made. """ yield if not self.consumed(): sync_failure_reporter("\n".join( ["Not all expected requests were made. Still expecting:"] + ["- {0}(url={1}, params={2}, headers={3}, data={4})".format( *expected) for expected, _ in self._sequence])) def __call__(self, method, url, params, headers, data): """ :return: the next response in the sequence, provided that the parameters match the next in the sequence. """ if len(self._sequence) == 0: self._async_reporter( "No more requests expected, but request {0!r} made.".format( (method, url, params, headers, data))) return (500, {}, b"StubbingError") expected, response = self._sequence[0] e_method, e_url, e_params, e_headers, e_data = expected checks = [ (e_method == method.lower(), "method"), (e_url == url, "url"), (e_params == params, 'parameters'), (e_headers == headers, "headers"), (e_data == data, "data") ] mismatches = [param for success, param in checks if not success] if mismatches: self._async_reporter( "\nExpected the next request to be: {0!r}" "\nGot request : {1!r}\n" "\nMismatches: {2!r}" .format(expected, (method, url, params, headers, data), mismatches)) return (500, {}, b"StubbingError") self._sequence = self._sequence[1:] return response treq-22.2.0/src/treq.egg-info/PKG-INFO Metadata-Version: 2.1 Name: treq
Version: 22.2.0 Summary: High-level Twisted HTTP Client API Home-page: https://github.com/twisted/treq Author: David Reid Author-email: dreid@dreid.org Maintainer: Tom Most Maintainer-email: twm@freecog.net License: MIT/X Platform: UNKNOWN Classifier: Development Status :: 5 - Production/Stable Classifier: Intended Audience :: Developers Classifier: License :: OSI Approved :: MIT License Classifier: Operating System :: OS Independent Classifier: Framework :: Twisted Classifier: Programming Language :: Python Classifier: Programming Language :: Python :: 3.6 Classifier: Programming Language :: Python :: 3.7 Classifier: Programming Language :: Python :: 3.8 Classifier: Programming Language :: Python :: 3.9 Classifier: Programming Language :: Python :: 3.10 Classifier: Programming Language :: Python :: Implementation :: CPython Classifier: Programming Language :: Python :: Implementation :: PyPy Requires-Python: >=3.6 Description-Content-Type: text/x-rst Provides-Extra: dev Provides-Extra: docs License-File: LICENSE treq: High-level Twisted HTTP Client API ======================================== |pypi|_ |calver|_ |coverage|_ |documentation|_ ``treq`` is an HTTP library inspired by `requests `_ but written on top of `Twisted `_'s `Agents `_. It provides a simple, higher level API for making HTTP requests when using Twisted. .. code-block:: python >>> import treq >>> def done(response): ... print(response.code) ... reactor.stop() >>> treq.get("https://github.com").addCallback(done) >>> from twisted.internet import reactor >>> reactor.run() 200 For more info `read the docs `_. Contributing ------------ ``treq`` development is hosted on `GitHub `_. We welcome contributions: feel free to fork and send contributions over. See `CONTRIBUTING.rst `_ for more info. Code of Conduct --------------- Refer to the `Twisted code of conduct `_. Copyright and License --------------------- ``treq`` is made available under the MIT license.
See `LICENSE <./LICENSE>`_ for legal details and copyright notices. .. _pypi: https://pypi.org/project/treq/ .. |pypi| image:: https://img.shields.io/pypi/v/treq.svg :alt: PyPI .. _calver: https://calver.org/ .. |calver| image:: https://img.shields.io/badge/calver-YY.MM.MICRO-22bfda.svg :alt: calver: YY.MM.MICRO .. _coverage: https://coveralls.io/github/twisted/treq .. |coverage| image:: https://coveralls.io/repos/github/twisted/treq/badge.svg :alt: Coverage .. _documentation: https://treq.readthedocs.org .. |documentation| image:: https://readthedocs.org/projects/treq/badge/ :alt: Documentation treq-22.2.0/src/treq.egg-info/SOURCES.txt .coveragerc CHANGELOG.rst CONTRIBUTING.rst LICENSE MANIFEST.in README.rst SECURITY.md pyproject.toml setup.py docs/Makefile docs/api.rst docs/changelog.rst docs/conf.py docs/howto.rst docs/index.rst docs/make.bat docs/testing.rst docs/_static/.keepme docs/examples/_utils.py docs/examples/basic_auth.py docs/examples/basic_get.py docs/examples/basic_post.py docs/examples/basic_url.py docs/examples/custom_agent.py docs/examples/disable_redirects.py docs/examples/download_file.py docs/examples/iresource.py docs/examples/json_post.py docs/examples/query_params.py docs/examples/redirects.py docs/examples/response_history.py docs/examples/testing_seq.py docs/examples/using_cookies.py src/treq/__init__.py src/treq/_agentspy.py src/treq/_version.py src/treq/api.py src/treq/auth.py src/treq/client.py src/treq/content.py src/treq/multipart.py src/treq/response.py src/treq/testing.py src/treq.egg-info/PKG-INFO src/treq.egg-info/SOURCES.txt src/treq.egg-info/dependency_links.txt src/treq.egg-info/requires.txt src/treq.egg-info/top_level.txt src/treq/test/__init__.py src/treq/test/test_agentspy.py src/treq/test/test_api.py src/treq/test/test_auth.py
src/treq/test/test_client.py src/treq/test/test_content.py src/treq/test/test_multipart.py src/treq/test/test_response.py src/treq/test/test_testing.py src/treq/test/test_treq_integration.py src/treq/test/util.py src/treq/test/local_httpbin/__init__.py src/treq/test/local_httpbin/child.py src/treq/test/local_httpbin/parent.py src/treq/test/local_httpbin/shared.py src/treq/test/local_httpbin/test/__init__.py src/treq/test/local_httpbin/test/test_child.py src/treq/test/local_httpbin/test/test_parent.py src/treq/test/local_httpbin/test/test_shared.py treq-22.2.0/src/treq.egg-info/dependency_links.txt treq-22.2.0/src/treq.egg-info/requires.txt incremental requests>=2.1.0 hyperlink>=21.0.0 Twisted[tls]>=18.7.0 attrs [dev] pep8 pyflakes httpbin==0.5.0 [docs] sphinx>=1.4.8 treq-22.2.0/src/treq.egg-info/top_level.txt treq
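As a footnote to ``src/treq/treq/testing.py`` above: the subset semantics behind ``HasHeaders`` (lowercase keys, bytes-encode keys and values, then check containment rather than equality) can be sketched with the standard library alone. This is an illustrative re-implementation, not treq's API; the helper names ``_maybe_encode`` and ``_normalize`` are invented for this sketch.

```python
def _maybe_encode(s):
    # Encode str to bytes so str and bytes header forms compare equal,
    # mirroring the _maybeEncode helper in treq.testing.
    return s.encode("ascii") if isinstance(s, str) else s


def _normalize(headers):
    # Lowercase every key; bytes-encode keys and each value in the list.
    return {
        _maybe_encode(k).lower(): [_maybe_encode(v) for v in vs]
        for k, vs in headers.items()
    }


class HasHeaders:
    """Equal to any header mapping that CONTAINS the expected headers."""

    def __init__(self, headers):
        self._headers = _normalize(headers)

    def __eq__(self, other):
        other = _normalize(other)
        # Every expected key must be present, and every expected value
        # must appear among that key's values in the candidate mapping.
        return all(
            k in other and set(vs) <= set(other[k])
            for k, vs in self._headers.items()
        )


# Matches even though Twisted-style extras (host, etc.) are present:
assert HasHeaders({"Accept": ["application/json"]}) == {
    b"accept": [b"application/json"],
    b"host": [b"example.com"],
}
```

The containment (rather than equality) check is what lets a test ignore headers the Agent adds on its own, such as ``host`` or ``content-length``.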