treq-18.6.0/0000755000076500000240000000000013315341523013027 5ustar glyphstaff00000000000000treq-18.6.0/PKG-INFO0000644000076500000240000000552013315341523014126 0ustar glyphstaff00000000000000Metadata-Version: 2.1 Name: treq Version: 18.6.0 Summary: A requests-like API built on top of twisted.web's Agent Home-page: https://github.com/twisted/treq Author: David Reid Author-email: dreid@dreid.org Maintainer: Amber Brown Maintainer-email: hawkowl@twistedmatrix.com License: MIT/X Description: treq ==== |pypi|_ |build|_ |coverage|_ ``treq`` is an HTTP library inspired by `requests `_ but written on top of `Twisted `_'s `Agents `_. It provides a simple, higher level API for making HTTP requests when using Twisted. .. code-block:: python >>> from treq import get >>> def done(response): ... print(response.code) ... reactor.stop() >>> get("http://www.github.com").addCallback(done) >>> from twisted.internet import reactor >>> reactor.run() 200 For more info `read the docs `_. Contribute ========== ``treq`` is hosted on `GitHub `_. Feel free to fork and send contributions over. Developing ========== Install dependencies: :: pip install treq[dev] Run Tests (unit & integration): :: trial treq Lint: :: pep8 treq pyflakes treq Build docs:: tox -e docs .. |build| image:: https://api.travis-ci.org/twisted/treq.svg?branch=master .. _build: https://travis-ci.org/twisted/treq .. |coverage| image:: https://codecov.io/github/twisted/treq/coverage.svg?branch=master .. _coverage: https://codecov.io/github/twisted/treq .. |pypi| image:: https://img.shields.io/pypi/v/treq.svg .. 
_pypi: https://pypi.python.org/pypi/treq Platform: UNKNOWN Classifier: Development Status :: 5 - Production/Stable Classifier: Intended Audience :: Developers Classifier: License :: OSI Approved :: MIT License Classifier: Operating System :: OS Independent Classifier: Framework :: Twisted Classifier: Programming Language :: Python Classifier: Programming Language :: Python :: 2.7 Classifier: Programming Language :: Python :: 3.4 Classifier: Programming Language :: Python :: 3.5 Classifier: Programming Language :: Python :: Implementation :: CPython Classifier: Programming Language :: Python :: Implementation :: PyPy Provides-Extra: dev treq-18.6.0/LICENSE0000644000076500000240000000207513061143557014045 0ustar glyphstaff00000000000000This is the MIT license. Copyright (c) 2012-2014 David Reid Permission is hereby granted, free of charge, to any person obtaining a copy of this software and associated documentation files (the "Software"), to deal in the Software without restriction, including without limitation the rights to use, copy, modify, merge, publish, distribute, sublicense, and/or sell copies of the Software, and to permit persons to whom the Software is furnished to do so, subject to the following conditions: The above copyright notice and this permission notice shall be included in all copies or substantial portions of the Software. THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE. 
treq-18.6.0/MANIFEST.in0000644000076500000240000000026513061143557014575 0ustar glyphstaff00000000000000include README.rst LICENSE tox.ini tox2travis.py .coveragerc recursive-include docs * prune docs/_build exclude .travis.yml exclude .readthedocs.yml global-exclude .DS_Store *.pyc treq-18.6.0/.coveragerc0000644000076500000240000000020613061143557015153 0ustar glyphstaff00000000000000[run] source = treq branch = True [paths] source = src/ .tox/*/lib/python*/site-packages/ .tox/pypy*/site-packages/ treq-18.6.0/docs/0000755000076500000240000000000013315341523013757 5ustar glyphstaff00000000000000treq-18.6.0/docs/index.rst0000644000076500000240000000777613106516575015652 0ustar glyphstaff00000000000000treq: High-level Twisted HTTP Client API ======================================== `treq `_ depends on a recent Twisted and functions on Python 2.7 and Python 3.3+ (including PyPy). Why? ---- `requests`_ by `Kenneth Reitz`_ is a wonderful library. I want the same ease of use when writing Twisted applications. treq is not of course a perfect clone of `requests`_. I have tried to stay true to the do-what-I-mean spirit of the `requests`_ API and also kept the API familiar to users of `Twisted`_ and :class:`twisted.web.client.Agent` on which treq is based. .. _requests: http://python-requests.org/ .. _Kenneth Reitz: https://www.gittip.com/kennethreitz/ .. _Twisted: http://twistedmatrix.com/ Quick Start ----------- Installation:: pip install treq GET +++ .. literalinclude:: examples/basic_get.py :linenos: :lines: 7-10 Full example: :download:`basic_get.py ` POST ++++ .. literalinclude:: examples/basic_post.py :linenos: :lines: 9-14 Full example: :download:`basic_post.py ` Why not 100% requests-alike? ---------------------------- Initially when I started off working on treq I thought the API should look exactly like `requests`_ except anything that would involve the network would return a :class:`~twisted.internet.defer.Deferred`. 
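A minimal sketch of that Deferred-returning style, written in the same ``react``-based form as the examples elsewhere in these docs (``httpbin.org`` stands in for any HTTP endpoint): .. code-block:: python from twisted.internet import defer, task import treq @defer.inlineCallbacks def main(reactor): # treq.get() returns a Deferred that fires with a response object. response = yield treq.get('https://httpbin.org/get') # Reading the body is also asynchronous and returns a Deferred. content = yield response.text() print(content) if __name__ == '__main__': task.react(main, []) Everything that touches the network yields a :class:`~twisted.internet.defer.Deferred` rather than blocking, which is the fundamental difference from `requests`_.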
Over time while attempting to mimic the `requests`_ API it became clear that not enough code could be shared between `requests`_ and treq for it to be worth the effort to translate many of the usage patterns from `requests`_. With the current version of treq I have tried to keep the API simple, yet remain familiar to users of Twisted and its lower-level HTTP libraries. Feature Parity with Requests ---------------------------- Even though mimicking the `requests`_ API is not a goal, supporting most of its features is. Here is a list of `requests`_ features and their status in treq. +----------------------------------+----------+----------+ | | requests | treq | +----------------------------------+----------+----------+ | International Domains and URLs | yes | yes | +----------------------------------+----------+----------+ | Keep-Alive & Connection Pooling | yes | yes | +----------------------------------+----------+----------+ | Sessions with Cookie Persistence | yes | yes | +----------------------------------+----------+----------+ | Browser-style SSL Verification | yes | yes | +----------------------------------+----------+----------+ | Basic Authentication | yes | yes | +----------------------------------+----------+----------+ | Digest Authentication | yes | no | +----------------------------------+----------+----------+ | Elegant Key/Value Cookies | yes | yes | +----------------------------------+----------+----------+ | Automatic Decompression | yes | yes | +----------------------------------+----------+----------+ | Unicode Response Bodies | yes | yes | +----------------------------------+----------+----------+ | Multipart File Uploads | yes | yes | +----------------------------------+----------+----------+ | Connection Timeouts | yes | yes | +----------------------------------+----------+----------+ | .netrc support | yes | no | +----------------------------------+----------+----------+ | Python 2.6 | yes | no | 
+----------------------------------+----------+----------+ | Python 2.7 | yes | yes | +----------------------------------+----------+----------+ | Python 3.x | yes | yes | +----------------------------------+----------+----------+ Table of Contents ----------------- .. toctree:: :maxdepth: 3 howto testing api Indices and tables ================== * :ref:`genindex` * :ref:`modindex` * :ref:`search` treq-18.6.0/docs/testing.rst0000644000076500000240000000635313221271655016201 0ustar glyphstaff00000000000000Testing Helpers =============== The :mod:`treq.testing` module provides some tools for testing both HTTP clients which use the treq API and implementations of the `Twisted Web resource model `_. Writing tests for HTTP clients ------------------------------ The :class:`~treq.testing.StubTreq` class implements the :mod:`treq` module interface (:func:`treq.get()`, :func:`treq.post()`, etc.) but runs all I/O via a :class:`~twisted.test.proto_helpers.MemoryReactor`. It wraps a :class:`twisted.web.resource.IResource` provider which handles each request. You can wrap a pre-existing `IResource` provider, or write your own. For example, the :class:`twisted.web.resource.ErrorPage` resource can produce an arbitrary HTTP status code. :class:`twisted.web.static.File` can serve files or directories. And you can easily achieve custom responses by writing trivial resources yourself: .. literalinclude:: examples/iresource.py :linenos: :pyobject: JsonResource However, those resources don't assert anything about the request. The :class:`~treq.testing.RequestSequence` and :class:`~treq.testing.StringStubbingResource` classes make it easy to construct a resource which encodes the expected request and response pairs. Do note that most parameters to these functions must be bytes—it's safest to use the ``b''`` string syntax, which works on both Python 2 and 3. For example: .. literalinclude:: examples/testing_seq.py :linenos: This may be run with ``trial testing_seq.py``. 
Download: :download:`testing_seq.py `. Loosely matching the request ~~~~~~~~~~~~~~~~~~~~~~~~~~~~ If you don't care about certain parts of the request, you can pass :data:`mock.ANY`, which compares equal to anything. This sequence matches a single GET request with any parameters or headers: .. code-block:: python RequestSequence([ ((b'get', mock.ANY, mock.ANY, b''), (200, {}, b'ok')) ]) If you care about headers, use :class:`~treq.testing.HasHeaders` to make assertions about the headers present in the request. It compares equal to a superset of the headers specified, which helps make your test robust to changes in treq or Agent. Right now treq adds the ``Accept-Encoding: gzip`` header, but as support for additional compression methods is added, this may change. Writing tests for Twisted Web resources --------------------------------------- Since :class:`~treq.testing.StubTreq` wraps any resource, you can use it to test your server-side code as well. This is superior to calling your resource's methods directly or passing mock objects, since it uses a real :class:`~twisted.web.client.Agent` to generate the request and a real :class:`~twisted.web.server.Site` to process the response. Thus, the ``request`` object your code interacts with is a *real* :class:`twisted.web.server.Request` and behaves the same as it would in production. Note that if your resource returns :data:`~twisted.web.server.NOT_DONE_YET` you must keep a reference to the :class:`~treq.testing.RequestTraversalAgent` and call its :meth:`~treq.testing.RequestTraversalAgent.flush()` method to spin the memory reactor once the server writes additional data before the client will receive it. treq-18.6.0/docs/Makefile0000644000076500000240000001266413061143557015435 0ustar glyphstaff00000000000000# Makefile for Sphinx documentation # # You can set these variables from the command line. SPHINXOPTS = SPHINXBUILD = sphinx-build PAPER = BUILDDIR = _build # Internal variables. 
PAPEROPT_a4 = -D latex_paper_size=a4 PAPEROPT_letter = -D latex_paper_size=letter ALLSPHINXOPTS = -d $(BUILDDIR)/doctrees $(PAPEROPT_$(PAPER)) $(SPHINXOPTS) . # the i18n builder cannot share the environment and doctrees with the others I18NSPHINXOPTS = $(PAPEROPT_$(PAPER)) $(SPHINXOPTS) . .PHONY: help clean html dirhtml singlehtml pickle json htmlhelp qthelp devhelp epub latex latexpdf text man changes linkcheck doctest gettext help: @echo "Please use \`make ' where is one of" @echo " html to make standalone HTML files" @echo " dirhtml to make HTML files named index.html in directories" @echo " singlehtml to make a single large HTML file" @echo " pickle to make pickle files" @echo " json to make JSON files" @echo " htmlhelp to make HTML files and a HTML help project" @echo " qthelp to make HTML files and a qthelp project" @echo " devhelp to make HTML files and a Devhelp project" @echo " epub to make an epub" @echo " latex to make LaTeX files, you can set PAPER=a4 or PAPER=letter" @echo " latexpdf to make LaTeX files and run them through pdflatex" @echo " text to make text files" @echo " man to make manual pages" @echo " texinfo to make Texinfo files" @echo " info to make Texinfo files and run them through makeinfo" @echo " gettext to make PO message catalogs" @echo " changes to make an overview of all changed/added/deprecated items" @echo " linkcheck to check all external links for integrity" @echo " doctest to run all doctests embedded in the documentation (if enabled)" clean: -rm -rf $(BUILDDIR)/* html: $(SPHINXBUILD) -b html $(ALLSPHINXOPTS) $(BUILDDIR)/html @echo @echo "Build finished. The HTML pages are in $(BUILDDIR)/html." dirhtml: $(SPHINXBUILD) -b dirhtml $(ALLSPHINXOPTS) $(BUILDDIR)/dirhtml @echo @echo "Build finished. The HTML pages are in $(BUILDDIR)/dirhtml." singlehtml: $(SPHINXBUILD) -b singlehtml $(ALLSPHINXOPTS) $(BUILDDIR)/singlehtml @echo @echo "Build finished. The HTML page is in $(BUILDDIR)/singlehtml." 
pickle: $(SPHINXBUILD) -b pickle $(ALLSPHINXOPTS) $(BUILDDIR)/pickle @echo @echo "Build finished; now you can process the pickle files." json: $(SPHINXBUILD) -b json $(ALLSPHINXOPTS) $(BUILDDIR)/json @echo @echo "Build finished; now you can process the JSON files." htmlhelp: $(SPHINXBUILD) -b htmlhelp $(ALLSPHINXOPTS) $(BUILDDIR)/htmlhelp @echo @echo "Build finished; now you can run HTML Help Workshop with the" \ ".hhp project file in $(BUILDDIR)/htmlhelp." qthelp: $(SPHINXBUILD) -b qthelp $(ALLSPHINXOPTS) $(BUILDDIR)/qthelp @echo @echo "Build finished; now you can run "qcollectiongenerator" with the" \ ".qhcp project file in $(BUILDDIR)/qthelp, like this:" @echo "# qcollectiongenerator $(BUILDDIR)/qthelp/treq.qhcp" @echo "To view the help file:" @echo "# assistant -collectionFile $(BUILDDIR)/qthelp/treq.qhc" devhelp: $(SPHINXBUILD) -b devhelp $(ALLSPHINXOPTS) $(BUILDDIR)/devhelp @echo @echo "Build finished." @echo "To view the help file:" @echo "# mkdir -p $$HOME/.local/share/devhelp/treq" @echo "# ln -s $(BUILDDIR)/devhelp $$HOME/.local/share/devhelp/treq" @echo "# devhelp" epub: $(SPHINXBUILD) -b epub $(ALLSPHINXOPTS) $(BUILDDIR)/epub @echo @echo "Build finished. The epub file is in $(BUILDDIR)/epub." latex: $(SPHINXBUILD) -b latex $(ALLSPHINXOPTS) $(BUILDDIR)/latex @echo @echo "Build finished; the LaTeX files are in $(BUILDDIR)/latex." @echo "Run \`make' in that directory to run these through (pdf)latex" \ "(use \`make latexpdf' here to do that automatically)." latexpdf: $(SPHINXBUILD) -b latex $(ALLSPHINXOPTS) $(BUILDDIR)/latex @echo "Running LaTeX files through pdflatex..." $(MAKE) -C $(BUILDDIR)/latex all-pdf @echo "pdflatex finished; the PDF files are in $(BUILDDIR)/latex." text: $(SPHINXBUILD) -b text $(ALLSPHINXOPTS) $(BUILDDIR)/text @echo @echo "Build finished. The text files are in $(BUILDDIR)/text." man: $(SPHINXBUILD) -b man $(ALLSPHINXOPTS) $(BUILDDIR)/man @echo @echo "Build finished. The manual pages are in $(BUILDDIR)/man." 
texinfo: $(SPHINXBUILD) -b texinfo $(ALLSPHINXOPTS) $(BUILDDIR)/texinfo @echo @echo "Build finished. The Texinfo files are in $(BUILDDIR)/texinfo." @echo "Run \`make' in that directory to run these through makeinfo" \ "(use \`make info' here to do that automatically)." info: $(SPHINXBUILD) -b texinfo $(ALLSPHINXOPTS) $(BUILDDIR)/texinfo @echo "Running Texinfo files through makeinfo..." make -C $(BUILDDIR)/texinfo info @echo "makeinfo finished; the Info files are in $(BUILDDIR)/texinfo." gettext: $(SPHINXBUILD) -b gettext $(I18NSPHINXOPTS) $(BUILDDIR)/locale @echo @echo "Build finished. The message catalogs are in $(BUILDDIR)/locale." changes: $(SPHINXBUILD) -b changes $(ALLSPHINXOPTS) $(BUILDDIR)/changes @echo @echo "The overview file is in $(BUILDDIR)/changes." linkcheck: $(SPHINXBUILD) -b linkcheck $(ALLSPHINXOPTS) $(BUILDDIR)/linkcheck @echo @echo "Link check complete; look for any errors in the above output " \ "or in $(BUILDDIR)/linkcheck/output.txt." doctest: $(SPHINXBUILD) -b doctest $(ALLSPHINXOPTS) $(BUILDDIR)/doctest @echo "Testing of doctests in the sources finished, look at the " \ "results in $(BUILDDIR)/doctest/output.txt." treq-18.6.0/docs/conf.py0000644000076500000240000001742413061143557015273 0ustar glyphstaff00000000000000# -*- coding: utf-8 -*- # # treq documentation build configuration file, created by # sphinx-quickstart on Mon Dec 10 22:32:11 2012. # # This file is execfile()d with the current directory set to its containing dir. # # Note that not all possible configuration values are present in this # autogenerated file. # # All configuration values have a default; values that are commented out # serve to show the default. import sys, os # If extensions (or modules to document with autodoc) are in another directory, # add these directories to sys.path here. If the directory is relative to the # documentation root, use os.path.abspath to make it absolute, like shown here. 
sys.path.insert(0, os.path.abspath('..')) # -- General configuration ----------------------------------------------------- # If your documentation needs a minimal Sphinx version, state it here. #needs_sphinx = '1.0' # Add any Sphinx extension module names here, as strings. They can be extensions # coming with Sphinx (named 'sphinx.ext.*') or your custom ones. extensions = ['sphinx.ext.viewcode', 'sphinx.ext.autodoc', 'sphinx.ext.intersphinx'] # Add any paths that contain templates here, relative to this directory. templates_path = ['_templates'] # The suffix of source filenames. source_suffix = '.rst' # The encoding of source files. #source_encoding = 'utf-8-sig' # The master toctree document. master_doc = 'index' # General information about the project. project = u'treq' copyright = u'2014, David Reid' # The version info for the project you're documenting, acts as replacement for # |version| and |release|, also used in various other places throughout the # built documents. # # The full version, including alpha/beta/rc tags. from treq import __version__ as release version = '.'.join(release.split('.')[:2]) # The language for content autogenerated by Sphinx. Refer to documentation # for a list of supported languages. #language = None # There are two options for replacing |today|: either, you set today to some # non-false value, then it is used: #today = '' # Else, today_fmt is used as the format for a strftime call. #today_fmt = '%B %d, %Y' # List of patterns, relative to source directory, that match files and # directories to ignore when looking for source files. exclude_patterns = ['_build'] # The reST default role (used for this markup: `text`) to use for all documents. #default_role = None # If true, '()' will be appended to :func: etc. cross-reference text. #add_function_parentheses = True # If true, the current module name will be prepended to all description # unit titles (such as .. function::). 
#add_module_names = True # If true, sectionauthor and moduleauthor directives will be shown in the # output. They are ignored by default. #show_authors = False # The name of the Pygments (syntax highlighting) style to use. pygments_style = 'sphinx' # A list of ignored prefixes for module index sorting. #modindex_common_prefix = [] # -- Options for HTML output --------------------------------------------------- # The theme to use for HTML and HTML Help pages. See the documentation for # a list of builtin themes. html_theme = 'default' # Theme options are theme-specific and customize the look and feel of a theme # further. For a list of options available for each theme, see the # documentation. #html_theme_options = {} # Add any paths that contain custom themes here, relative to this directory. #html_theme_path = [] # The name for this set of Sphinx documents. If None, it defaults to # " v documentation". #html_title = None # A shorter title for the navigation bar. Default is the same as html_title. #html_short_title = None # The name of an image file (relative to this directory) to place at the top # of the sidebar. #html_logo = None # The name of an image file (within the static path) to use as favicon of the # docs. This file should be a Windows icon file (.ico) being 16x16 or 32x32 # pixels large. #html_favicon = None # Add any paths that contain custom static files (such as style sheets) here, # relative to this directory. They are copied after the builtin static files, # so a file named "default.css" will overwrite the builtin "default.css". html_static_path = ['_static'] # If not '', a 'Last updated on:' timestamp is inserted at every page bottom, # using the given strftime format. #html_last_updated_fmt = '%b %d, %Y' # If true, SmartyPants will be used to convert quotes and dashes to # typographically correct entities. #html_use_smartypants = True # Custom sidebar templates, maps document names to template names. 
#html_sidebars = {} # Additional templates that should be rendered to pages, maps page names to # template names. #html_additional_pages = {} # If false, no module index is generated. #html_domain_indices = True # If false, no index is generated. #html_use_index = True # If true, the index is split into individual pages for each letter. #html_split_index = False # If true, links to the reST sources are added to the pages. #html_show_sourcelink = True # If true, "Created using Sphinx" is shown in the HTML footer. Default is True. #html_show_sphinx = True # If true, "(C) Copyright ..." is shown in the HTML footer. Default is True. #html_show_copyright = True # If true, an OpenSearch description file will be output, and all pages will # contain a tag referring to it. The value of this option must be the # base URL from which the finished HTML is served. #html_use_opensearch = '' # This is the file name suffix for HTML files (e.g. ".xhtml"). #html_file_suffix = None # Output file base name for HTML help builder. htmlhelp_basename = 'treqdoc' # -- Options for LaTeX output -------------------------------------------------- latex_elements = { # The paper size ('letterpaper' or 'a4paper'). #'papersize': 'letterpaper', # The font size ('10pt', '11pt' or '12pt'). #'pointsize': '10pt', # Additional stuff for the LaTeX preamble. #'preamble': '', } # Grouping the document tree into LaTeX files. List of tuples # (source start file, target name, title, author, documentclass [howto/manual]). latex_documents = [ ('index', 'treq.tex', u'treq Documentation', u'David Reid', 'manual'), ] # The name of an image file (relative to this directory) to place at the top of # the title page. #latex_logo = None # For "manual" documents, if this is true, then toplevel headings are parts, # not chapters. #latex_use_parts = False # If true, show page references after internal links. #latex_show_pagerefs = False # If true, show URL addresses after external links. 
#latex_show_urls = False # Documents to append as an appendix to all manuals. #latex_appendices = [] # If false, no module index is generated. #latex_domain_indices = True # -- Options for manual page output -------------------------------------------- # One entry per manual page. List of tuples # (source start file, name, description, authors, manual section). man_pages = [ ('index', 'treq', u'treq Documentation', [u'David Reid'], 1) ] # If true, show URL addresses after external links. #man_show_urls = False # -- Options for Texinfo output ------------------------------------------------ # Grouping the document tree into Texinfo files. List of tuples # (source start file, target name, title, author, # dir menu entry, description, category) texinfo_documents = [ ('index', 'treq', u'treq Documentation', u'David Reid', 'treq', 'One line description of project.', 'Miscellaneous'), ] # Documents to append as an appendix to all manuals. #texinfo_appendices = [] # If false, no module index is generated. #texinfo_domain_indices = True # How to display URL addresses: 'footnote', 'no', or 'inline'. #texinfo_show_urls = 'footnote' RTD_NEW_THEME = True intersphinx_mapping = { 'python': ('https://docs.python.org/3.5', None), 'twisted': ('https://twistedmatrix.com/documents/current/api/', None), } treq-18.6.0/docs/_static/0000755000076500000240000000000013315341523015405 5ustar glyphstaff00000000000000treq-18.6.0/docs/_static/.keepme0000644000076500000240000000000013061143557016647 0ustar glyphstaff00000000000000treq-18.6.0/docs/make.bat0000644000076500000240000001174413061143557015400 0ustar glyphstaff00000000000000@ECHO OFF REM Command file for Sphinx documentation if "%SPHINXBUILD%" == "" ( set SPHINXBUILD=sphinx-build ) set BUILDDIR=_build set ALLSPHINXOPTS=-d %BUILDDIR%/doctrees %SPHINXOPTS% . set I18NSPHINXOPTS=%SPHINXOPTS% . 
if NOT "%PAPER%" == "" ( set ALLSPHINXOPTS=-D latex_paper_size=%PAPER% %ALLSPHINXOPTS% set I18NSPHINXOPTS=-D latex_paper_size=%PAPER% %I18NSPHINXOPTS% ) if "%1" == "" goto help if "%1" == "help" ( :help echo.Please use `make ^` where ^ is one of echo. html to make standalone HTML files echo. dirhtml to make HTML files named index.html in directories echo. singlehtml to make a single large HTML file echo. pickle to make pickle files echo. json to make JSON files echo. htmlhelp to make HTML files and a HTML help project echo. qthelp to make HTML files and a qthelp project echo. devhelp to make HTML files and a Devhelp project echo. epub to make an epub echo. latex to make LaTeX files, you can set PAPER=a4 or PAPER=letter echo. text to make text files echo. man to make manual pages echo. texinfo to make Texinfo files echo. gettext to make PO message catalogs echo. changes to make an overview over all changed/added/deprecated items echo. linkcheck to check all external links for integrity echo. doctest to run all doctests embedded in the documentation if enabled goto end ) if "%1" == "clean" ( for /d %%i in (%BUILDDIR%\*) do rmdir /q /s %%i del /q /s %BUILDDIR%\* goto end ) if "%1" == "html" ( %SPHINXBUILD% -b html %ALLSPHINXOPTS% %BUILDDIR%/html if errorlevel 1 exit /b 1 echo. echo.Build finished. The HTML pages are in %BUILDDIR%/html. goto end ) if "%1" == "dirhtml" ( %SPHINXBUILD% -b dirhtml %ALLSPHINXOPTS% %BUILDDIR%/dirhtml if errorlevel 1 exit /b 1 echo. echo.Build finished. The HTML pages are in %BUILDDIR%/dirhtml. goto end ) if "%1" == "singlehtml" ( %SPHINXBUILD% -b singlehtml %ALLSPHINXOPTS% %BUILDDIR%/singlehtml if errorlevel 1 exit /b 1 echo. echo.Build finished. The HTML pages are in %BUILDDIR%/singlehtml. goto end ) if "%1" == "pickle" ( %SPHINXBUILD% -b pickle %ALLSPHINXOPTS% %BUILDDIR%/pickle if errorlevel 1 exit /b 1 echo. echo.Build finished; now you can process the pickle files. 
goto end ) if "%1" == "json" ( %SPHINXBUILD% -b json %ALLSPHINXOPTS% %BUILDDIR%/json if errorlevel 1 exit /b 1 echo. echo.Build finished; now you can process the JSON files. goto end ) if "%1" == "htmlhelp" ( %SPHINXBUILD% -b htmlhelp %ALLSPHINXOPTS% %BUILDDIR%/htmlhelp if errorlevel 1 exit /b 1 echo. echo.Build finished; now you can run HTML Help Workshop with the ^ .hhp project file in %BUILDDIR%/htmlhelp. goto end ) if "%1" == "qthelp" ( %SPHINXBUILD% -b qthelp %ALLSPHINXOPTS% %BUILDDIR%/qthelp if errorlevel 1 exit /b 1 echo. echo.Build finished; now you can run "qcollectiongenerator" with the ^ .qhcp project file in %BUILDDIR%/qthelp, like this: echo.^> qcollectiongenerator %BUILDDIR%\qthelp\treq.qhcp echo.To view the help file: echo.^> assistant -collectionFile %BUILDDIR%\qthelp\treq.ghc goto end ) if "%1" == "devhelp" ( %SPHINXBUILD% -b devhelp %ALLSPHINXOPTS% %BUILDDIR%/devhelp if errorlevel 1 exit /b 1 echo. echo.Build finished. goto end ) if "%1" == "epub" ( %SPHINXBUILD% -b epub %ALLSPHINXOPTS% %BUILDDIR%/epub if errorlevel 1 exit /b 1 echo. echo.Build finished. The epub file is in %BUILDDIR%/epub. goto end ) if "%1" == "latex" ( %SPHINXBUILD% -b latex %ALLSPHINXOPTS% %BUILDDIR%/latex if errorlevel 1 exit /b 1 echo. echo.Build finished; the LaTeX files are in %BUILDDIR%/latex. goto end ) if "%1" == "text" ( %SPHINXBUILD% -b text %ALLSPHINXOPTS% %BUILDDIR%/text if errorlevel 1 exit /b 1 echo. echo.Build finished. The text files are in %BUILDDIR%/text. goto end ) if "%1" == "man" ( %SPHINXBUILD% -b man %ALLSPHINXOPTS% %BUILDDIR%/man if errorlevel 1 exit /b 1 echo. echo.Build finished. The manual pages are in %BUILDDIR%/man. goto end ) if "%1" == "texinfo" ( %SPHINXBUILD% -b texinfo %ALLSPHINXOPTS% %BUILDDIR%/texinfo if errorlevel 1 exit /b 1 echo. echo.Build finished. The Texinfo files are in %BUILDDIR%/texinfo. goto end ) if "%1" == "gettext" ( %SPHINXBUILD% -b gettext %I18NSPHINXOPTS% %BUILDDIR%/locale if errorlevel 1 exit /b 1 echo. echo.Build finished. 
The message catalogs are in %BUILDDIR%/locale. goto end ) if "%1" == "changes" ( %SPHINXBUILD% -b changes %ALLSPHINXOPTS% %BUILDDIR%/changes if errorlevel 1 exit /b 1 echo. echo.The overview file is in %BUILDDIR%/changes. goto end ) if "%1" == "linkcheck" ( %SPHINXBUILD% -b linkcheck %ALLSPHINXOPTS% %BUILDDIR%/linkcheck if errorlevel 1 exit /b 1 echo. echo.Link check complete; look for any errors in the above output ^ or in %BUILDDIR%/linkcheck/output.txt. goto end ) if "%1" == "doctest" ( %SPHINXBUILD% -b doctest %ALLSPHINXOPTS% %BUILDDIR%/doctest if errorlevel 1 exit /b 1 echo. echo.Testing of doctests in the sources finished, look at the ^ results in %BUILDDIR%/doctest/output.txt. goto end ) :end treq-18.6.0/docs/examples/0000755000076500000240000000000013315341523015575 5ustar glyphstaff00000000000000treq-18.6.0/docs/examples/using_cookies.py0000644000076500000240000000073013061143557021015 0ustar glyphstaff00000000000000from twisted.internet.task import react from _utils import print_response import treq def main(reactor, *args): d = treq.get('http://httpbin.org/cookies/set?hello=world') def _get_jar(resp): jar = resp.cookies() print('The server set our hello cookie to: {}'.format(jar['hello'])) return treq.get('http://httpbin.org/cookies', cookies=jar) d.addCallback(_get_jar) d.addCallback(print_response) return d react(main, []) treq-18.6.0/docs/examples/iresource.py0000644000076500000240000000065013106516575020161 0ustar glyphstaff00000000000000import json from zope.interface import implementer from twisted.web.resource import IResource @implementer(IResource) class JsonResource(object): isLeaf = True # NB: means getChildWithDefault will not be called def __init__(self, data): self.data = data def render(self, request): request.setHeader(b'Content-Type', b'application/json') return json.dumps(self.data).encode('utf-8') treq-18.6.0/docs/examples/basic_post.py0000644000076500000240000000056213061143557020305 0ustar glyphstaff00000000000000import json from 
twisted.internet.task import react from _utils import print_response import treq def main(reactor, *args): d = treq.post('http://httpbin.org/post', json.dumps({"msg": "Hello!"}).encode('ascii'), headers={b'Content-Type': [b'application/json']}) d.addCallback(print_response) return d react(main, []) treq-18.6.0/docs/examples/basic_auth.py0000644000076500000240000000043413061143557020257 0ustar glyphstaff00000000000000from twisted.internet.task import react from _utils import print_response import treq def main(reactor, *args): d = treq.get( 'http://httpbin.org/basic-auth/treq/treq', auth=('treq', 'treq') ) d.addCallback(print_response) return d react(main, []) treq-18.6.0/docs/examples/redirects.py0000644000076500000240000000034513061143557020142 0ustar glyphstaff00000000000000from twisted.internet.task import react from _utils import print_response import treq def main(reactor, *args): d = treq.get('http://httpbin.org/redirect/1') d.addCallback(print_response) return d react(main, []) treq-18.6.0/docs/examples/query_params.py0000644000076500000240000000223413061143557020665 0ustar glyphstaff00000000000000from twisted.internet.task import react from twisted.internet.defer import inlineCallbacks import treq @inlineCallbacks def main(reactor): print('List of tuples') resp = yield treq.get('http://httpbin.org/get', params=[('foo', 'bar'), ('baz', 'bax')]) content = yield resp.text() print(content) print('Single value dictionary') resp = yield treq.get('http://httpbin.org/get', params={'foo': 'bar', 'baz': 'bax'}) content = yield resp.text() print(content) print('Multi value dictionary') resp = yield treq.get('http://httpbin.org/get', params={'foo': ['bar', 'baz', 'bax']}) content = yield resp.text() print(content) print('Mixed value dictionary') resp = yield treq.get('http://httpbin.org/get', params={'foo': ['bar', 'baz'], 'bax': 'quux'}) content = yield resp.text() print(content) print('Preserved query parameters') resp = yield 
treq.get('http://httpbin.org/get?foo=bar', params={'baz': 'bax'}) content = yield resp.text() print(content) react(main, []) treq-18.6.0/docs/examples/basic_get.py0000644000076500000240000000033613061143557020076 0ustar glyphstaff00000000000000from twisted.internet.task import react from _utils import print_response import treq def main(reactor, *args): d = treq.get('http://httpbin.org/get') d.addCallback(print_response) return d react(main, []) treq-18.6.0/docs/examples/testing_seq.py0000644000076500000240000000377413221271655020513 0ustar glyphstaff00000000000000from twisted.internet import defer from twisted.trial.unittest import SynchronousTestCase from twisted.web import http from treq.testing import StubTreq, HasHeaders from treq.testing import RequestSequence, StringStubbingResource @defer.inlineCallbacks def make_a_request(treq): """ Make a request using treq. """ response = yield treq.get('http://an.example/foo', params={'a': 'b'}, headers={b'Accept': b'application/json'}) if response.code == http.OK: result = yield response.json() else: message = yield response.text() raise Exception("Got an error from the server: {}".format(message)) defer.returnValue(result) class MakeARequestTests(SynchronousTestCase): """ Test :func:`make_a_request()` using :mod:`treq.testing.RequestSequence`. 
""" def test_200_ok(self): """On a 200 response, return the response's JSON.""" req_seq = RequestSequence([ ((b'get', 'http://an.example/foo', {b'a': [b'b']}, HasHeaders({'Accept': ['application/json']}), b''), (http.OK, {b'Content-Type': b'application/json'}, b'{"status": "ok"}')) ]) treq = StubTreq(StringStubbingResource(req_seq)) with req_seq.consume(self.fail): result = self.successResultOf(make_a_request(treq)) self.assertEqual({"status": "ok"}, result) def test_418_teapot(self): """On an unexpected response code, raise an exception""" req_seq = RequestSequence([ ((b'get', 'http://an.example/foo', {b'a': [b'b']}, HasHeaders({'Accept': ['application/json']}), b''), (418, {b'Content-Type': b'text/plain'}, b"I'm a teapot!")) ]) treq = StubTreq(StringStubbingResource(req_seq)) with req_seq.consume(self.fail): failure = self.failureResultOf(make_a_request(treq)) self.assertEqual(u"Got an error from the server: I'm a teapot!", failure.getErrorMessage()) treq-18.6.0/docs/examples/response_history.py0000644000076500000240000000053613061143557021577 0ustar glyphstaff00000000000000from twisted.internet.task import react from _utils import print_response import treq def main(reactor, *args): d = treq.get('http://httpbin.org/redirect/1') def cb(response): print('Response history:') print(response.history()) return print_response(response) d.addCallback(cb) return d react(main, []) treq-18.6.0/docs/examples/disable_redirects.py0000644000076500000240000000037413061143557021627 0ustar glyphstaff00000000000000from twisted.internet.task import react from _utils import print_response import treq def main(reactor, *args): d = treq.get('http://httpbin.org/redirect/1', allow_redirects=False) d.addCallback(print_response) return d react(main, []) treq-18.6.0/docs/examples/download_file.py0000644000076500000240000000054613061143557020767 0ustar glyphstaff00000000000000from twisted.internet.task import react import treq def download_file(reactor, url, destination_filename): 
destination = open(destination_filename, 'wb') d = treq.get(url) d.addCallback(treq.collect, destination.write) d.addBoth(lambda _: destination.close()) return d react(download_file, ['http://httpbin.org/get', 'download.txt']) treq-18.6.0/docs/examples/_utils.py0000644000076500000240000000032413061143557017452 0ustar glyphstaff00000000000000from __future__ import print_function import treq def print_response(response): print(response.code, response.phrase) print(response.headers) return treq.text_content(response).addCallback(print) treq-18.6.0/docs/api.rst0000644000076500000240000001001013315340321015245 0ustar glyphstaff00000000000000API Reference ============= This page lists all of the interfaces exposed by the `treq` package. Making Requests --------------- .. module:: treq .. autofunction:: request .. autofunction:: get .. autofunction:: head .. autofunction:: post .. autofunction:: put .. autofunction:: patch .. autofunction:: delete Accessing Content ----------------- .. autofunction:: collect .. autofunction:: content .. autofunction:: text_content .. autofunction:: json_content HTTPClient Objects ------------------ .. module:: treq.client The :class:`treq.client.HTTPClient` class provides the same interface as the :mod:`treq` module itself. .. autoclass:: HTTPClient .. automethod:: request .. automethod:: get .. automethod:: head .. automethod:: post .. automethod:: put .. automethod:: patch .. automethod:: delete Augmented Response Objects -------------------------- :func:`treq.request`, :func:`treq.get`, etc. return an object which provides :class:`twisted.web.iweb.IResponse`, plus a few additional convenience methods: .. module:: treq.response .. class:: _Response .. automethod:: collect .. automethod:: content .. automethod:: json .. automethod:: text .. automethod:: history .. 
automethod:: cookies Inherited from :class:`twisted.web.iweb.IResponse`: :ivar version: See :attr:`IResponse.version ` :ivar code: See :attr:`IResponse.code ` :ivar phrase: See :attr:`IResponse.phrase ` :ivar headers: See :attr:`IResponse.headers ` :ivar length: See :attr:`IResponse.length ` :ivar request: See :attr:`IResponse.request ` :ivar previousResponse: See :attr:`IResponse.previousResponse ` .. method:: deliverBody(protocol) See :meth:`IResponse.deliverBody() ` .. method:: setPreviousResponse(response) See :meth:`IResponse.setPreviousResponse() ` Test Helpers ------------ The :mod:`treq.testing` module contains tools for in-memory testing of HTTP clients and servers. StubTreq Objects ~~~~~~~~~~~~~~~~ .. class:: treq.testing.StubTreq(resource) :class:`StubTreq` implements the same interface as the :mod:`treq` module or the :class:`~treq.client.HTTPClient` class, with the limitation that it does not support the ``files`` argument. .. method:: flush() Flush all data between pending client/server pairs. This is only necessary if a :obj:`Resource` under test returns :obj:`NOT_DONE_YET` from its ``render`` method, making a response asynchronous. In that case, after each write from the server, :meth:`flush()` must be called so the client can see it. As the methods on :class:`treq.client.HTTPClient`: .. method:: request See :func:`treq.request()`. .. method:: get See :func:`treq.get()`. .. method:: head See :func:`treq.head()`. .. method:: post See :func:`treq.post()`. .. method:: put See :func:`treq.put()`. .. method:: patch See :func:`treq.patch()`. .. method:: delete See :func:`treq.delete()`. RequestTraversalAgent Objects ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ .. autoclass:: treq.testing.RequestTraversalAgent :members: RequestSequence Objects ~~~~~~~~~~~~~~~~~~~~~~~ .. autoclass:: treq.testing.RequestSequence :members: StringStubbingResource Objects ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ .. 
autoclass:: treq.testing.StringStubbingResource :members: HasHeaders Objects ~~~~~~~~~~~~~~~~~~ .. autoclass:: treq.testing.HasHeaders :members: MultiPartProducer Objects ------------------------- :class:`treq.multipart.MultiPartProducer` is used internally when making requests which involve files. .. automodule:: treq.multipart :members: treq-18.6.0/docs/howto.rst0000644000076500000240000000763513315340321015657 0ustar glyphstaff00000000000000Use Cases ========= Handling Streaming Responses ---------------------------- In addition to `receiving responses `_ with :meth:`IResponse.deliverBody`, treq provides a helper function :py:func:`treq.collect` which takes a ``response`` and a single argument function which will be called with all new data available from the response. Much like :meth:`IProtocol.dataReceived`, :py:func:`treq.collect` knows nothing about the framing of your data and will simply call your collector function with any data that is currently available. Here is an example which simply passes a file object's write method to :py:func:`treq.collect` to save the response body to a file. .. literalinclude:: examples/download_file.py :linenos: :lines: 6-11 Full example: :download:`download_file.py ` Query Parameters ---------------- :py:func:`treq.request` supports a ``params`` keyword argument which will be URL-encoded and added to the ``url`` argument in addition to any query parameters that may already exist. The ``params`` argument may be either a ``dict`` or a ``list`` of ``(key, value)`` tuples. If it is a ``dict`` then the values in the dict may either be a ``str`` value or a ``list`` of ``str`` values. .. literalinclude:: examples/query_params.py :linenos: :lines: 7-37 Full example: :download:`query_params.py ` Auth ---- HTTP Basic authentication as specified in :rfc:`2617` is easily supported by passing an ``auth`` keyword argument to any of the request functions. The ``auth`` argument should be a tuple of the form ``('username', 'password')``. ..
literalinclude:: examples/basic_auth.py :linenos: :lines: 7-13 Full example: :download:`basic_auth.py ` Redirects --------- treq handles redirects by default. The following will print a 200 OK response. .. literalinclude:: examples/redirects.py :linenos: :lines: 7-13 Full example: :download:`redirects.py ` You can easily disable redirects by simply passing `allow_redirects=False` to any of the request methods. .. literalinclude:: examples/disable_redirects.py :linenos: :lines: 7-13 Full example: :download:`disable_redirects.py ` You can even access the complete history of treq response objects by calling the :meth:`~treq.response._Response.history()` method on the response. .. literalinclude:: examples/response_history.py :linenos: :lines: 7-15 Full example: :download:`response_history.py ` Cookies ------- Cookies can be set by passing a ``dict`` or ``cookielib.CookieJar`` instance via the ``cookies`` keyword argument. Later cookies set by the server can be retrieved using the :py:meth:`~treq.response._Response.cookies()` method. The object returned by :py:meth:`~treq.response._Response.cookies()` supports the same key/value access as `requests cookies `_. .. literalinclude:: examples/using_cookies.py :linenos: :lines: 7-20 Full example: :download:`using_cookies.py ` Agent Customization ------------------- treq creates its own `twisted.web.client.Agent `_ with reasonable defaults, but you may want to provide your own custom agent. A custom agent can be passed to the various treq request methods using the ``agent`` keyword argument. .. code-block:: python custom_agent = Agent(reactor, connectTimeout=42) treq.get(url, agent=custom_agent) Additionally a custom client can be instantiated to use a custom agent using the ``agent`` keyword argument: .. 
code-block:: python custom_agent = Agent(reactor, connectTimeout=42) client = treq.client.HTTPClient(agent=custom_agent) client.get(url) treq-18.6.0/setup.py0000644000076500000240000000315513315340321014540 0ustar glyphstaff00000000000000from setuptools import find_packages, setup classifiers = [ "Development Status :: 5 - Production/Stable", "Intended Audience :: Developers", "License :: OSI Approved :: MIT License", "Operating System :: OS Independent", "Framework :: Twisted", "Programming Language :: Python", "Programming Language :: Python :: 2.7", "Programming Language :: Python :: 3.4", "Programming Language :: Python :: 3.5", "Programming Language :: Python :: Implementation :: CPython", "Programming Language :: Python :: Implementation :: PyPy", ] if __name__ == "__main__": with open('README.rst') as f: readme = f.read() setup( name="treq", packages=find_packages('src'), package_dir={"": "src"}, setup_requires=["incremental"], use_incremental=True, install_requires=[ "incremental", "requests >= 2.1.0", "six", "Twisted[tls] >= 16.4.0", "attrs", ], extras_require={ "dev": [ "mock", "pep8", "pyflakes", "sphinx", "httpbin==0.5.0", ], }, package_data={"treq": ["_version"]}, author="David Reid", author_email="dreid@dreid.org", maintainer="Amber Brown", maintainer_email="hawkowl@twistedmatrix.com", classifiers=classifiers, description="A requests-like API built on top of twisted.web's Agent", license="MIT/X", url="https://github.com/twisted/treq", long_description=readme ) treq-18.6.0/tox.ini0000644000076500000240000000210413315340321014332 0ustar glyphstaff00000000000000[tox] envlist = {pypy,py27,py34,py35,py36}-twisted_{lowest,latest}, {pypy,py27,py34,py35,py36}-twisted_trunk-pyopenssl_trunk, pypi-readme, check-manifest, flake8, docs [testenv] extras = dev deps = coverage mock twisted_lowest: Twisted==16.4.0 twisted_latest: Twisted twisted_trunk: https://github.com/twisted/twisted/archive/trunk.zip pyopenssl_trunk: 
https://github.com/pyca/pyopenssl/archive/master.zip docs: Sphinx>=1.4.8 setenv = # Avoid unnecessary network access when creating virtualenvs for speed. VIRTUALENV_NO_DOWNLOAD=1 PIP_DISABLE_PIP_VERSION_CHECK=1 passenv = TERM # ensure colors commands = pip list coverage run {envbindir}/trial {posargs:treq} coverage report -m [testenv:flake8] skip_install = True deps = flake8 commands = flake8 src/treq/ [testenv:pypi-readme] deps = readme_renderer commands = python setup.py check -r -s [testenv:check-manifest] deps = check-manifest commands = check-manifest [testenv:docs] changedir = docs commands = sphinx-build -b html . html treq-18.6.0/setup.cfg0000644000076500000240000000010313315341523014642 0ustar glyphstaff00000000000000[bdist_wheel] universal = 1 [egg_info] tag_build = tag_date = 0 treq-18.6.0/README.rst0000644000076500000240000000260613061143557014527 0ustar glyphstaff00000000000000treq ==== |pypi|_ |build|_ |coverage|_ ``treq`` is an HTTP library inspired by `requests `_ but written on top of `Twisted `_'s `Agents `_. It provides a simple, higher level API for making HTTP requests when using Twisted. .. code-block:: python >>> from treq import get >>> def done(response): ... print response.code ... reactor.stop() >>> get("http://www.github.com").addCallback(done) >>> from twisted.internet import reactor >>> reactor.run() 200 For more info `read the docs `_. Contribute ========== ``treq`` is hosted on `GitHub `_. Feel free to fork and send contributions over. Developing ========== Install dependencies: :: pip install treq[dev] Run Tests (unit & integration): :: trial treq Lint: :: pep8 treq pyflakes treq Build docs:: tox -e docs .. |build| image:: https://api.travis-ci.org/twisted/treq.svg?branch=master .. _build: https://travis-ci.org/twisted/treq .. |coverage| image:: https://codecov.io/github/twisted/treq/coverage.svg?branch=master .. _coverage: https://codecov.io/github/twisted/treq .. |pypi| image:: https://img.shields.io/pypi/v/treq.svg .. 
_pypi: https://pypi.python.org/pypi/treq treq-18.6.0/tox2travis.py0000755000076500000240000000450613315340321015531 0ustar glyphstaff00000000000000#!/usr/bin/env python """ Generate a Travis CI configuration based on Tox's configured environments. Usage: tox -l | ./tox2travis.py > .travis.yml """ from __future__ import absolute_import, print_function import re import sys travis_template = """\ # AUTO-GENERATED BY tox2travis.py -- DO NOT EDIT THIS FILE BY HAND! sudo: false language: python cache: pip matrix: include: {includes} # Don't fail on trunk versions. allow_failures: - env: TOXENV=pypy-twisted_trunk-pyopenssl_trunk - env: TOXENV=py27-twisted_trunk-pyopenssl_trunk - env: TOXENV=py34-twisted_trunk-pyopenssl_trunk - env: TOXENV=py35-twisted_trunk-pyopenssl_trunk - env: TOXENV=py36-twisted_trunk-pyopenssl_trunk before_install: - | if [[ "${{TOXENV::5}}" == "pypy-" ]]; then PYENV_ROOT="$HOME/.pyenv" git clone --depth 1 https://github.com/yyuu/pyenv.git "$PYENV_ROOT" PATH="$PYENV_ROOT/bin:$PATH" eval "$(pyenv init -)" pyenv install pypy-5.4.1 pyenv global pypy-5.4.1 fi - pip install --upgrade pip - pip install --upgrade setuptools install: - pip install tox codecov script: - tox after_success: - codecov after_failure: - | if [[ -f "_trial_temp/httpbin-server-error.log" ]] then echo "httpbin-server-error.log:" cat "_trial_temp/httpbin-server-error.log" fi notifications: email: false branches: only: - master # AUTO-GENERATED BY tox2travis.py -- DO NOT EDIT THIS FILE BY HAND!""" if __name__ == "__main__": line = sys.stdin.readline() tox_envs = [] while line: tox_envs.append(line.strip()) line = sys.stdin.readline() includes = [] for tox_env in tox_envs: # Parse the Python version from the tox environment name python_match = re.match(r'^py(?:(\d{2})|py)-', tox_env) if python_match is not None: version = python_match.group(1) if version is not None: python = "'{0}.{1}'".format(version[0], version[1]) else: python = 'pypy' else: python = "'2.7'" # Default to Python 2.7 
if a version isn't found includes.extend([ '- python: {0}'.format(python), ' env: TOXENV={0}'.format(tox_env) ]) print(travis_template.format(includes='\n '.join(includes))) treq-18.6.0/src/0000755000076500000240000000000013315341523013616 5ustar glyphstaff00000000000000treq-18.6.0/src/treq/0000755000076500000240000000000013315341523014571 5ustar glyphstaff00000000000000treq-18.6.0/src/treq/auth.py0000644000076500000240000000241413061143557016112 0ustar glyphstaff00000000000000from __future__ import absolute_import, division, print_function from twisted.web.http_headers import Headers import base64 class UnknownAuthConfig(Exception): def __init__(self, config): super(Exception, self).__init__( '{0!r} not of a known type.'.format(config)) class _RequestHeaderSettingAgent(object): def __init__(self, agent, request_headers): self._agent = agent self._request_headers = request_headers def request(self, method, uri, headers=None, bodyProducer=None): if headers is None: headers = self._request_headers else: for header, values in self._request_headers.getAllRawHeaders(): headers.setRawHeaders(header, values) return self._agent.request( method, uri, headers=headers, bodyProducer=bodyProducer) def add_basic_auth(agent, username, password): creds = base64.b64encode( '{0}:{1}'.format(username, password).encode('ascii')) return _RequestHeaderSettingAgent( agent, Headers({b'Authorization': [b'Basic ' + creds]})) def add_auth(agent, auth_config): if isinstance(auth_config, tuple): return add_basic_auth(agent, auth_config[0], auth_config[1]) raise UnknownAuthConfig(auth_config) treq-18.6.0/src/treq/multipart.py0000644000076500000240000003045713061143557017202 0ustar glyphstaff00000000000000# Copyright (c) Twisted Matrix Laboratories. # See LICENSE for details. 
from __future__ import absolute_import, division, print_function from uuid import uuid4 from io import BytesIO from contextlib import closing from twisted.internet import defer, task from twisted.python.compat import unicode, _PY3 from twisted.web.iweb import UNKNOWN_LENGTH, IBodyProducer from zope.interface import implementer if _PY3: long = int CRLF = b"\r\n" @implementer(IBodyProducer) class MultiPartProducer(object): """ :class:`MultiPartProducer` takes parameters for an HTTP request and produces bytes in multipart/form-data format defined in :rfc:`2388` and :rfc:`2046`. The encoded request is produced incrementally and the bytes are written to a consumer. Fields should have form: ``[(parameter name, value), ...]`` Accepted values: * Unicode strings (in this case parameter will be encoded with utf-8) * Tuples with (file name, content-type, :class:`~twisted.web.iweb.IBodyProducer` objects) Since :class:`MultiPartProducer` can accept objects like :class:`~twisted.web.iweb.IBodyProducer` which cannot be read from in an event-driven manner it uses a :class:`~twisted.internet.task.Cooperator` instance to schedule reads from the underlying producers. Reading is also paused and resumed based on notifications from the :class:`IConsumer` provider being written to. :ivar _fields: Sorted parameters, where all strings are enforced to be unicode and file objects stacked on bottom (to produce a human readable form-data request) :ivar _cooperate: A method like `Cooperator.cooperate` which is used to schedule all reads.
:ivar boundary: The generated boundary used in form-data encoding :type boundary: `bytes` """ def __init__(self, fields, boundary=None, cooperator=task): self._fields = list(_sorted_by_type(_converted(fields))) self._currentProducer = None self._cooperate = cooperator.cooperate self.boundary = boundary or uuid4().hex if isinstance(self.boundary, unicode): self.boundary = self.boundary.encode('ascii') self.length = self._calculateLength() def startProducing(self, consumer): """ Start a cooperative task which will read bytes from the input file and write them to `consumer`. Return a `Deferred` which fires after all bytes have been written. :param consumer: Any `IConsumer` provider """ self._task = self._cooperate(self._writeLoop(consumer)) d = self._task.whenDone() def maybeStopped(reason): reason.trap(task.TaskStopped) return defer.Deferred() d.addCallbacks(lambda ignored: None, maybeStopped) return d def stopProducing(self): """ Permanently stop writing bytes from the file to the consumer by stopping the underlying `CooperativeTask`. """ if self._currentProducer: self._currentProducer.stopProducing() self._task.stop() def pauseProducing(self): """ Temporarily suspend copying bytes from the input file to the consumer by pausing the `CooperativeTask` which drives that activity. """ if self._currentProducer: # Having a current producer means that we are in # the paused state because we've returned # the deferred of the current producer to # the cooperator. So this request # for pausing us is actually a request to pause # our underlying current producer. self._currentProducer.pauseProducing() else: self._task.pause() def resumeProducing(self): """ Undo the effects of a previous `pauseProducing` and resume copying bytes to the consumer by resuming the `CooperativeTask` which drives the write activity.
""" if self._currentProducer: self._currentProducer.resumeProducing() else: self._task.resume() def _calculateLength(self): """ Determine how many bytes the overall form post would consume. The easiest way is to calculate is to generate of `fObj` (assuming it is not modified from this point on). If the determination cannot be made, return `UNKNOWN_LENGTH`. """ consumer = _LengthConsumer() for i in list(self._writeLoop(consumer)): pass return consumer.length def _getBoundary(self, final=False): """ Returns a boundary line, either final (the one that ends the form data request or a regular, the one that separates the boundaries) --this-is-my-boundary """ f = b"--" if final else b"" return b"--" + self.boundary + f def _writeLoop(self, consumer): """ Return an iterator which generates the multipart/form-data request including the encoded objects and writes them to the consumer for each time it is iterated. """ for index, (name, value) in enumerate(self._fields): # We don't write the CRLF of the first boundary: # HTTP request headers are already separated with CRLF # from the request body, another newline is possible # and should be considered as an empty preamble per rfc2046, # but is generally confusing, so we omit it when generating # the request. We don't write Content-Type: multipart/form-data # header here as well as it's defined in the context of the HTTP # request headers, not the producer, so we gust generate # the body. # It's also important to note that the boundary in the message # is defined not only by "--boundary-value" but # but with CRLF characers before it and after the line. # This is very important. 
# proper boundary is "CRLF--boundary-valueCRLF" consumer.write( (CRLF if index != 0 else b"") + self._getBoundary() + CRLF) yield self._writeField(name, value, consumer) consumer.write(CRLF + self._getBoundary(final=True) + CRLF) def _writeField(self, name, value, consumer): if isinstance(value, unicode): self._writeString(name, value, consumer) elif isinstance(value, tuple): filename, content_type, producer = value return self._writeFile( name, filename, content_type, producer, consumer) def _writeString(self, name, value, consumer): cdisp = _Header(b"Content-Disposition", b"form-data") cdisp.add_param(b"name", name) consumer.write(bytes(cdisp) + CRLF + CRLF) encoded = value.encode("utf-8") consumer.write(encoded) self._currentProducer = None def _writeFile(self, name, filename, content_type, producer, consumer): cdisp = _Header(b"Content-Disposition", b"form-data") cdisp.add_param(b"name", name) if filename: cdisp.add_param(b"filename", filename) consumer.write(bytes(cdisp) + CRLF) consumer.write(bytes(_Header(b"Content-Type", content_type)) + CRLF) if producer.length != UNKNOWN_LENGTH: consumer.write( bytes(_Header(b"Content-Length", producer.length)) + CRLF) consumer.write(CRLF) if isinstance(consumer, _LengthConsumer): consumer.write(producer.length) else: self._currentProducer = producer def unset(val): self._currentProducer = None return val d = producer.startProducing(consumer) d.addCallback(unset) return d def _escape(value): """ This function prevents header values from corrupting the request: a newline in the file name parameter makes the form-data request unreadable for the majority of parsers. """ if not isinstance(value, (bytes, unicode)): value = unicode(value) if isinstance(value, bytes): value = value.decode('utf-8') return value.replace(u"\r", u"").replace(u"\n", u"").replace(u'"', u'\\"') def _enforce_unicode(value): """ This function enforces the strings passed to be unicode, so we won't need to guess the encoding of the binary strings passed in.
If someone needs to pass a binary string, use BytesIO and wrap it with `FileBodyProducer`. """ if isinstance(value, unicode): return value elif isinstance(value, bytes): # we got a byte string, and we have no idea what its encoding is, # so we can only assume that it is utf-8 try: return unicode(value, "utf-8") except UnicodeDecodeError: raise ValueError( "Supplied raw bytes that are not ascii/utf-8." " When supplying raw string make sure it's ascii or utf-8" ", or work with unicode if you are not sure") else: raise ValueError( "Unsupported field type: %s" % (value.__class__.__name__,)) def _converted(fields): if hasattr(fields, "iteritems"): fields = fields.iteritems() elif hasattr(fields, "items"): fields = fields.items() for name, value in fields: name = _enforce_unicode(name) if isinstance(value, (tuple, list)): if len(value) != 3: raise ValueError( "Expected tuple: (filename, content type, producer)") filename, content_type, producer = value filename = _enforce_unicode(filename) if filename else None yield name, (filename, content_type, producer) elif isinstance(value, (bytes, unicode)): yield name, _enforce_unicode(value) else: raise ValueError( "Unsupported value, expected string, unicode " "or tuple (filename, content type, IBodyProducer)") class _LengthConsumer(object): """ `_LengthConsumer` is used to calculate the length of the multi-part request. The easiest way to do that is to consume all the fields, but instead of writing them to a string, just accumulate the request length. :ivar length: The length of the request.
Can be `UNKNOWN_LENGTH` if the consumer finds a field whose length can not be calculated """ def __init__(self): self.length = 0 def write(self, value): # this means that we have encountered # an unknown-length producer, # so we need to stop attempting to calculate if self.length is UNKNOWN_LENGTH: return if value is UNKNOWN_LENGTH: self.length = value elif isinstance(value, (int, long)): self.length += value else: self.length += len(value) class _Header(object): """ `_Header` is a tiny wrapper class that produces request headers. We can't use the standard Python header class because it encodes unicode fields using =?...?= encoding, which is correct, but no one in the HTTP world expects that; everyone wants raw utf-8 bytes. """ def __init__(self, name, value, params=None): self.name = name self.value = value self.params = params or [] def add_param(self, name, value): self.params.append((name, value)) def __bytes__(self): with closing(BytesIO()) as h: h.write(self.name + b": " + _escape(self.value).encode("us-ascii")) if self.params: for (name, val) in self.params: h.write(b"; ") h.write(_escape(name).encode("us-ascii")) h.write(b"=") h.write(b'"' + _escape(val).encode('utf-8') + b'"') h.seek(0) return h.read() def __str__(self): return self.__bytes__() def _sorted_by_type(fields): """Sorts params so that strings are placed before files. That makes a request more readable, as generally files are bigger. It also provides a deterministic order of fields, which is easier for testing.
""" def key(p): key, val = p if isinstance(val, (bytes, unicode)): return (0, key) else: return (1, key) return sorted(fields, key=key) treq-18.6.0/src/treq/test/0000755000076500000240000000000013315341523015550 5ustar glyphstaff00000000000000treq-18.6.0/src/treq/test/test_utils.py0000644000076500000240000000405413106516575020335 0ustar glyphstaff00000000000000import mock from twisted.trial.unittest import TestCase from treq._utils import default_reactor, default_pool, set_global_pool class DefaultReactorTests(TestCase): def test_passes_reactor(self): mock_reactor = mock.Mock() self.assertEqual(default_reactor(mock_reactor), mock_reactor) def test_uses_default_reactor(self): from twisted.internet import reactor self.assertEqual(default_reactor(None), reactor) class DefaultPoolTests(TestCase): def setUp(self): set_global_pool(None) pool_patcher = mock.patch('treq._utils.HTTPConnectionPool') self.HTTPConnectionPool = pool_patcher.start() self.addCleanup(pool_patcher.stop) self.reactor = mock.Mock() def test_persistent_false(self): self.assertEqual( default_pool(self.reactor, None, False), self.HTTPConnectionPool.return_value ) self.HTTPConnectionPool.assert_called_once_with( self.reactor, persistent=False ) def test_pool_none_persistent_none(self): self.assertEqual( default_pool(self.reactor, None, None), self.HTTPConnectionPool.return_value ) self.HTTPConnectionPool.assert_called_once_with( self.reactor, persistent=True ) def test_pool_none_persistent_true(self): self.assertEqual( default_pool(self.reactor, None, True), self.HTTPConnectionPool.return_value ) self.HTTPConnectionPool.assert_called_once_with( self.reactor, persistent=True ) def test_cached_global_pool(self): pool1 = default_pool(self.reactor, None, None) self.HTTPConnectionPool.return_value = mock.Mock() pool2 = default_pool(self.reactor, None, True) self.assertEqual(pool1, pool2) def test_specified_pool(self): pool = mock.Mock() self.assertEqual( default_pool(self.reactor, pool, None), pool ) 
self.HTTPConnectionPool.assert_not_called() treq-18.6.0/src/treq/test/test_treq_integration.py0000644000076500000240000002276013221271655022552 0ustar glyphstaff00000000000000from io import BytesIO from twisted.python.url import URL from twisted.trial.unittest import TestCase from twisted.internet.defer import CancelledError, inlineCallbacks from twisted.internet.task import deferLater from twisted.internet import reactor from twisted.internet.tcp import Client from twisted.internet.ssl import Certificate, trustRootFromCertificates from twisted.web.client import (Agent, BrowserLikePolicyForHTTPS, HTTPConnectionPool, ResponseFailed) from treq.test.util import DEBUG, skip_on_windows_because_of_199 from .local_httpbin.parent import _HTTPBinProcess import treq skip = skip_on_windows_because_of_199() @inlineCallbacks def print_response(response): if DEBUG: print() print('---') print(response.code) print(response.headers) text = yield treq.text_content(response) print(text) print('---') def with_baseurl(method): def _request(self, url, *args, **kwargs): return method(self.baseurl + url, *args, agent=self.agent, pool=self.pool, **kwargs) return _request class TreqIntegrationTests(TestCase): get = with_baseurl(treq.get) head = with_baseurl(treq.head) post = with_baseurl(treq.post) put = with_baseurl(treq.put) patch = with_baseurl(treq.patch) delete = with_baseurl(treq.delete) _httpbin_process = _HTTPBinProcess(https=False) @inlineCallbacks def setUp(self): description = yield self._httpbin_process.server_description( reactor) self.baseurl = URL(scheme=u"http", host=description.host, port=description.port).asText() self.agent = Agent(reactor) self.pool = HTTPConnectionPool(reactor, False) def tearDown(self): def _check_fds(_): # This appears to only be necessary for HTTPS tests. # For the normal HTTP tests then closeCachedConnections is # sufficient. 
fds = set(reactor.getReaders() + reactor.getWriters()) if not [fd for fd in fds if isinstance(fd, Client)]: return return deferLater(reactor, 0, _check_fds, None) return self.pool.closeCachedConnections().addBoth(_check_fds) @inlineCallbacks def assert_data(self, response, expected_data): body = yield treq.json_content(response) self.assertIn('data', body) self.assertEqual(body['data'], expected_data) @inlineCallbacks def assert_sent_header(self, response, header, expected_value): body = yield treq.json_content(response) self.assertIn(header, body['headers']) self.assertEqual(body['headers'][header], expected_value) @inlineCallbacks def test_get(self): response = yield self.get('/get') self.assertEqual(response.code, 200) yield print_response(response) @inlineCallbacks def test_get_headers(self): response = yield self.get('/get', {b'X-Blah': [b'Foo', b'Bar']}) self.assertEqual(response.code, 200) yield self.assert_sent_header(response, 'X-Blah', 'Foo,Bar') yield print_response(response) @inlineCallbacks def test_get_headers_unicode(self): response = yield self.get('/get', {u'X-Blah': [u'Foo', b'Bar']}) self.assertEqual(response.code, 200) yield self.assert_sent_header(response, 'X-Blah', 'Foo,Bar') yield print_response(response) @inlineCallbacks def test_get_302_absolute_redirect(self): response = yield self.get( '/redirect-to?url={0}/get'.format(self.baseurl)) self.assertEqual(response.code, 200) yield print_response(response) @inlineCallbacks def test_get_302_relative_redirect(self): response = yield self.get('/relative-redirect/1') self.assertEqual(response.code, 200) yield print_response(response) @inlineCallbacks def test_get_302_redirect_disallowed(self): response = yield self.get('/redirect/1', allow_redirects=False) self.assertEqual(response.code, 302) yield print_response(response) @inlineCallbacks def test_head(self): response = yield self.head('/get') body = yield treq.content(response) self.assertEqual(b'', body) yield print_response(response)
@inlineCallbacks def test_head_302_absolute_redirect(self): response = yield self.head( '/redirect-to?url={0}/get'.format(self.baseurl)) self.assertEqual(response.code, 200) yield print_response(response) @inlineCallbacks def test_head_302_relative_redirect(self): response = yield self.head('/relative-redirect/1') self.assertEqual(response.code, 200) yield print_response(response) @inlineCallbacks def test_head_302_redirect_disallowed(self): response = yield self.head('/redirect/1', allow_redirects=False) self.assertEqual(response.code, 302) yield print_response(response) @inlineCallbacks def test_post(self): response = yield self.post('/post', b'Hello!') self.assertEqual(response.code, 200) yield self.assert_data(response, 'Hello!') yield print_response(response) @inlineCallbacks def test_multipart_post(self): class FileLikeObject(BytesIO): def __init__(self, val): BytesIO.__init__(self, val) self.name = "david.png" def read(*args, **kwargs): return BytesIO.read(*args, **kwargs) response = yield self.post( '/post', data={"a": "b"}, files={"file1": FileLikeObject(b"file")}) self.assertEqual(response.code, 200) body = yield treq.json_content(response) self.assertEqual('b', body['form']['a']) self.assertEqual('file', body['files']['file1']) yield print_response(response) @inlineCallbacks def test_post_headers(self): response = yield self.post( '/post', b'{msg: "Hello!"}', headers={'Content-Type': ['application/json']} ) self.assertEqual(response.code, 200) yield self.assert_sent_header( response, 'Content-Type', 'application/json') yield self.assert_data(response, '{msg: "Hello!"}') yield print_response(response) @inlineCallbacks def test_put(self): response = yield self.put('/put', data=b'Hello!') yield print_response(response) @inlineCallbacks def test_patch(self): response = yield self.patch('/patch', data=b'Hello!') self.assertEqual(response.code, 200) yield self.assert_data(response, 'Hello!') yield print_response(response) @inlineCallbacks def 
test_delete(self): response = yield self.delete('/delete') self.assertEqual(response.code, 200) yield print_response(response) @inlineCallbacks def test_gzip(self): response = yield self.get('/gzip') self.assertEqual(response.code, 200) yield print_response(response) json = yield treq.json_content(response) self.assertTrue(json['gzipped']) @inlineCallbacks def test_basic_auth(self): response = yield self.get('/basic-auth/treq/treq', auth=('treq', 'treq')) self.assertEqual(response.code, 200) yield print_response(response) json = yield treq.json_content(response) self.assertTrue(json['authenticated']) self.assertEqual(json['user'], 'treq') @inlineCallbacks def test_failed_basic_auth(self): response = yield self.get('/basic-auth/treq/treq', auth=('not-treq', 'not-treq')) self.assertEqual(response.code, 401) yield print_response(response) @inlineCallbacks def test_timeout(self): """ Verify a timeout fires if a request takes too long. """ yield self.assertFailure(self.get('/delay/2', timeout=1), CancelledError, ResponseFailed) @inlineCallbacks def test_cookie(self): response = yield self.get('/cookies', cookies={'hello': 'there'}) self.assertEqual(response.code, 200) yield print_response(response) json = yield treq.json_content(response) self.assertEqual(json['cookies']['hello'], 'there') @inlineCallbacks def test_set_cookie(self): response = yield self.get('/cookies/set', allow_redirects=False, params={'hello': 'there'}) # self.assertEqual(response.code, 200) yield print_response(response) self.assertEqual(response.cookies()['hello'], 'there') class HTTPSTreqIntegrationTests(TreqIntegrationTests): _httpbin_process = _HTTPBinProcess(https=True) @inlineCallbacks def setUp(self): description = yield self._httpbin_process.server_description( reactor) self.baseurl = URL(scheme=u"https", host=description.host, port=description.port).asText() root = trustRootFromCertificates( [Certificate.loadPEM(description.cacert)], ) self.agent = Agent( reactor, 
contextFactory=BrowserLikePolicyForHTTPS(root), ) self.pool = HTTPConnectionPool(reactor, False) treq-18.6.0/src/treq/test/test_response.py0000644000076500000240000001111013315340321021004 0ustar glyphstaff00000000000000from decimal import Decimal from twisted.trial.unittest import SynchronousTestCase from twisted.python.failure import Failure from twisted.web.client import ResponseDone from twisted.web.iweb import UNKNOWN_LENGTH from twisted.web.http_headers import Headers from treq.response import _Response class FakeResponse(object): def __init__(self, code, headers, body=()): self.code = code self.headers = headers self.previousResponse = None self._body = body self.length = sum(len(c) for c in body) def setPreviousResponse(self, response): self.previousResponse = response def deliverBody(self, protocol): for chunk in self._body: protocol.dataReceived(chunk) protocol.connectionLost(Failure(ResponseDone())) class ResponseTests(SynchronousTestCase): def test_repr_content_type(self): """ When the response has a Content-Type header its value is included in the response. """ headers = Headers({'Content-Type': ['text/html']}) original = FakeResponse(200, headers, body=[b'']) self.assertEqual( "", repr(_Response(original, None)), ) def test_repr_content_type_missing(self): """ A request with no Content-Type just displays an empty field. """ original = FakeResponse(204, Headers(), body=[b'']) self.assertEqual( "", repr(_Response(original, None)), ) def test_repr_content_type_hostile(self): """ Garbage in the Content-Type still produces a reasonable representation. """ headers = Headers({'Content-Type': [u'\u2e18', ' x/y']}) original = FakeResponse(418, headers, body=[b'']) self.assertEqual( r"", repr(_Response(original, None)), ) def test_repr_unknown_length(self): """ A HTTP 1.0 or chunked response displays an unknown length. 
""" headers = Headers({'Content-Type': ['text/event-stream']}) original = FakeResponse(200, headers) original.length = UNKNOWN_LENGTH self.assertEqual( "", repr(_Response(original, None)), ) def test_collect(self): original = FakeResponse(200, Headers(), body=[b'foo', b'bar', b'baz']) calls = [] _Response(original, None).collect(calls.append) self.assertEqual([b'foo', b'bar', b'baz'], calls) def test_content(self): original = FakeResponse(200, Headers(), body=[b'foo', b'bar', b'baz']) self.assertEqual( b'foobarbaz', self.successResultOf(_Response(original, None).content()), ) def test_json(self): original = FakeResponse(200, Headers(), body=[b'{"foo": ', b'"bar"}']) self.assertEqual( {'foo': 'bar'}, self.successResultOf(_Response(original, None).json()), ) def test_json_customized(self): original = FakeResponse(200, Headers(), body=[b'{"foo": ', b'1.0000000000000001}']) self.assertEqual( self.successResultOf(_Response(original, None).json( parse_float=Decimal) )["foo"], Decimal("1.0000000000000001") ) def test_text(self): headers = Headers({b'content-type': [b'text/plain;charset=utf-8']}) original = FakeResponse(200, headers, body=[b'\xe2\x98', b'\x83']) self.assertEqual( u'\u2603', self.successResultOf(_Response(original, None).text()), ) def test_history(self): redirect1 = FakeResponse( 301, Headers({'location': ['http://example.com/']}) ) redirect2 = FakeResponse( 302, Headers({'location': ['https://example.com/']}) ) redirect2.setPreviousResponse(redirect1) final = FakeResponse(200, Headers({})) final.setPreviousResponse(redirect2) wrapper = _Response(final, None) history = wrapper.history() self.assertEqual(wrapper.code, 200) self.assertEqual(history[0].code, 301) self.assertEqual(history[1].code, 302) def test_no_history(self): wrapper = _Response(FakeResponse(200, Headers({})), None) self.assertEqual(wrapper.history(), []) treq-18.6.0/src/treq/test/test_auth.py0000644000076500000240000000457213106516575020143 0ustar glyphstaff00000000000000import mock from 
twisted.trial.unittest import TestCase from twisted.web.client import Agent from twisted.web.http_headers import Headers from treq.auth import _RequestHeaderSettingAgent, add_auth, UnknownAuthConfig class RequestHeaderSettingAgentTests(TestCase): def setUp(self): self.agent = mock.Mock(Agent) def test_sets_headers(self): agent = _RequestHeaderSettingAgent( self.agent, Headers({b'X-Test-Header': [b'Test-Header-Value']})) agent.request('method', 'uri') self.agent.request.assert_called_once_with( 'method', 'uri', headers=Headers({b'X-Test-Header': [b'Test-Header-Value']}), bodyProducer=None ) def test_overrides_per_request_headers(self): agent = _RequestHeaderSettingAgent( self.agent, Headers({b'X-Test-Header': [b'Test-Header-Value']}) ) agent.request( 'method', 'uri', Headers({b'X-Test-Header': [b'Unwanted-Value']}) ) self.agent.request.assert_called_once_with( 'method', 'uri', headers=Headers({b'X-Test-Header': [b'Test-Header-Value']}), bodyProducer=None ) class AddAuthTests(TestCase): def setUp(self): self.rhsa_patcher = mock.patch('treq.auth._RequestHeaderSettingAgent') self._RequestHeaderSettingAgent = self.rhsa_patcher.start() self.addCleanup(self.rhsa_patcher.stop) def test_add_basic_auth(self): agent = mock.Mock() add_auth(agent, ('username', 'password')) self._RequestHeaderSettingAgent.assert_called_once_with( agent, Headers({b'authorization': [b'Basic dXNlcm5hbWU6cGFzc3dvcmQ=']}) ) def test_add_basic_auth_huge(self): agent = mock.Mock() pwd = ('verylongpasswordthatextendsbeyondthepointwheremultiplel' 'inesaregenerated') auth = (b'Basic dXNlcm5hbWU6dmVyeWxvbmdwYXNzd29yZHRoYXRleHRlbmRzY' b'mV5b25kdGhlcG9pbnR3aGVyZW11bHRpcGxlbGluZXNhcmVnZW5lcmF0ZWQ=') add_auth(agent, ('username', pwd)) self._RequestHeaderSettingAgent.assert_called_once_with( agent, Headers({b'authorization': [auth]})) def test_add_unknown_auth(self): agent = mock.Mock() self.assertRaises(UnknownAuthConfig, add_auth, agent, mock.Mock()) 
treq-18.6.0/src/treq/test/util.py0000644000076500000240000000141513221271655017104 0ustar glyphstaff00000000000000import os import platform import mock from twisted.internet import reactor from twisted.internet.task import Clock DEBUG = os.getenv("TREQ_DEBUG", False) == "true" is_pypy = platform.python_implementation() == 'PyPy' def with_clock(fn): def wrapper(*args, **kwargs): clock = Clock() with mock.patch.object(reactor, 'callLater', clock.callLater): return fn(*(args + (clock,)), **kwargs) return wrapper def skip_on_windows_because_of_199(): """ Return a skip describing issue #199 under Windows. :return: A :py:class:`str` skip reason. """ if platform.system() == 'Windows': return ("HTTPBin process cannot run under Windows." " See https://github.com/twisted/treq/issues/199") return None treq-18.6.0/src/treq/test/__init__.py0000644000076500000240000000000013061143557017654 0ustar glyphstaff00000000000000treq-18.6.0/src/treq/test/local_httpbin/0000755000076500000240000000000013315341523020372 5ustar glyphstaff00000000000000treq-18.6.0/src/treq/test/local_httpbin/test/0000755000076500000240000000000013315341523021351 5ustar glyphstaff00000000000000treq-18.6.0/src/treq/test/local_httpbin/test/test_shared.py0000644000076500000240000000243113221271655024234 0ustar glyphstaff00000000000000""" Tests for :py:mod:`treq.test.local_httpbin.shared` """ from twisted.trial import unittest from .. 
import shared class HTTPBinDescriptionTests(unittest.SynchronousTestCase): """ Tests for :py:class:`shared._HTTPBinDescription` """ def test_round_trip(self): """ :py:class:`shared._HTTPBinDescription.from_json_bytes` can deserialize the output of :py:class:`shared._HTTPBinDescription.to_json_bytes` """ original = shared._HTTPBinDescription(host="host", port=123) round_tripped = shared._HTTPBinDescription.from_json_bytes( original.to_json_bytes(), ) self.assertEqual(original, round_tripped) def test_round_trip_cacert(self): """ :py:class:`shared._HTTPBinDescription.from_json_bytes` can deserialize the output of :py:class:`shared._HTTPBinDescription.to_json_bytes` when ``cacert`` is set. """ original = shared._HTTPBinDescription(host="host", port=123, cacert='cacert') round_tripped = shared._HTTPBinDescription.from_json_bytes( original.to_json_bytes(), ) self.assertEqual(original, round_tripped) treq-18.6.0/src/treq/test/local_httpbin/test/test_child.py0000644000076500000240000003127213221271655024056 0ustar glyphstaff00000000000000""" Tests for :py:mod:`treq.test.local_httpbin.child` """ import attr from cryptography.hazmat.primitives.asymmetric import padding import functools import io from twisted.trial.unittest import SynchronousTestCase from twisted.test.proto_helpers import MemoryReactor from twisted.internet import defer from treq.test.util import skip_on_windows_because_of_199 from twisted.web.server import Site from twisted.web.resource import Resource from service_identity.cryptography import verify_certificate_hostname import six from .. 
import child, shared skip = skip_on_windows_because_of_199() class CertificatesForAuthorityAndServerTests(SynchronousTestCase): """ Tests for :py:func:`child._certificates_for_authority_and_server` """ def setUp(self): self.hostname = u".example.org" ( self.ca_cert, self.server_private_key, self.server_x509_cert, ) = child._certificates_for_authority_and_server( self.hostname, ) def test_pkey_x509_paired(self): """ The returned private key corresponds to the X.509 certificate's public key. """ server_private_key = self.server_private_key.to_cryptography_key() server_x509_cert = self.server_x509_cert.to_cryptography() plaintext = b'plaintext' ciphertext = server_x509_cert.public_key().encrypt( plaintext, padding.PKCS1v15(), ) self.assertEqual( server_private_key.decrypt( ciphertext, padding.PKCS1v15(), ), plaintext, ) def test_ca_signed_x509(self): """ The returned X.509 certificate was signed by the returned certificate authority's certificate. """ ca_cert = self.ca_cert.original.to_cryptography() server_x509_cert = self.server_x509_cert.to_cryptography() # Raises an InvalidSignature exception on failure. ca_cert.public_key().verify( server_x509_cert.signature, server_x509_cert.tbs_certificate_bytes, padding.PKCS1v15(), server_x509_cert.signature_hash_algorithm ) def test_x509_matches_hostname(self): """ The returned X.509 certificate is valid for the hostname. """ verify_certificate_hostname( self.server_x509_cert.to_cryptography(), self.hostname, ) @attr.s class FakeThreadPoolState(object): """ State for :py:class:`FakeThreadPool`. 
""" init_call_count = attr.ib(default=0) start_call_count = attr.ib(default=0) @attr.s class FakeThreadPool(object): """ A fake :py:class:`twisted.python.threadpool.ThreadPool` """ _state = attr.ib() def init(self): self._state.init_call_count += 1 return self def start(self): """ See :py:meth:`twisted.python.threadpool.ThreadPool.start` """ self._state.start_call_count += 1 def stop(self): """ See :py:meth:`twisted.python.threadpool.ThreadPool.stop` """ class MakeHTTPBinSiteTests(SynchronousTestCase): """ Tests for :py:func:`_make_httpbin_site`. """ def setUp(self): self.fake_threadpool_state = FakeThreadPoolState() self.fake_threadpool = FakeThreadPool(self.fake_threadpool_state) self.reactor = MemoryReactor() def test_threadpool_management(self): """ A thread pool is created that will be shut down when the reactor shuts down. """ child._make_httpbin_site( self.reactor, threadpool_factory=self.fake_threadpool.init, ) self.assertEqual(self.fake_threadpool_state.init_call_count, 1) self.assertEqual(self.fake_threadpool_state.start_call_count, 1) self.assertEqual(len(self.reactor.triggers['before']['shutdown']), 1) [(stop, _, _)] = self.reactor.triggers['before']['shutdown'] self.assertEqual(stop, self.fake_threadpool.stop) class ServeTLSTests(SynchronousTestCase): """ Tests for :py:func:`_serve_tls` """ def setUp(self): self.reactor = MemoryReactor() self.site = Site(Resource()) def test_tls_listener_matches_description(self): """ An SSL listener is established on the requested host and port, and the host, port, and CA certificate are returned in its description. 
expected_host = 'host' expected_port = 123 description_deferred = child._serve_tls( self.reactor, host=expected_host, port=expected_port, site=self.site, ) self.assertEqual(len(self.reactor.sslServers), 1) [ (actual_port, actual_site, _, _, actual_host) ] = self.reactor.sslServers self.assertEqual(actual_host, expected_host) self.assertEqual(actual_port, expected_port) self.assertIs(actual_site, self.site) description = self.successResultOf(description_deferred) self.assertEqual(description.host, expected_host) self.assertEqual(description.port, expected_port) self.assertTrue(description.cacert) class ServeTCPTests(SynchronousTestCase): """ Tests for :py:func:`_serve_tcp` """ def setUp(self): self.reactor = MemoryReactor() self.site = Site(Resource()) def test_tcp_listener_matches_description(self): """ A TCP listener is established on the requested host and port, and the host and port are returned in its description. """ expected_host = 'host' expected_port = 123 description_deferred = child._serve_tcp( self.reactor, host=expected_host, port=expected_port, site=self.site, ) self.assertEqual(len(self.reactor.tcpServers), 1) [ (actual_port, actual_site, _, actual_host) ] = self.reactor.tcpServers self.assertEqual(actual_host, expected_host) self.assertEqual(actual_port, expected_port) self.assertIs(actual_site, self.site) description = self.successResultOf(description_deferred) self.assertEqual(description.host, expected_host) self.assertEqual(description.port, expected_port) self.assertFalse(description.cacert) @attr.s class FlushableBytesIOState(object): """ State for :py:class:`FlushableBytesIO` """ bio = attr.ib(default=attr.Factory(io.BytesIO)) flush_count = attr.ib(default=0) @attr.s class FlushableBytesIO(object): """ A :py:class:`io.BytesIO` wrapper that records flushes.
_state = attr.ib() def write(self, data): self._state.bio.write(data) def flush(self): self._state.flush_count += 1 if not six.PY2: @attr.s class BufferedStandardOut(object): """ A standard out whose ``buffer`` is a :py:class:`FlushableBytesIO` instance. """ buffer = attr.ib() class OutputProcessDescriptionTests(SynchronousTestCase): """ Tests for :py:func:`_output_process_description` """ def setUp(self): self.stdout_state = FlushableBytesIOState() self.stdout = FlushableBytesIO(self.stdout_state) if not six.PY2: self.stdout = BufferedStandardOut(self.stdout) def test_description_written(self): """ An :py:class:`shared._HTTPBinDescription` is written to standard out and the line flushed. """ description = shared._HTTPBinDescription(host="host", port=123, cacert="cacert") child._output_process_description(description, self.stdout) written = self.stdout_state.bio.getvalue() self.assertEqual( written, b'{"cacert": "cacert", "host": "host", "port": 123}' + b'\n', ) self.assertEqual(self.stdout_state.flush_count, 1) class ForeverHTTPBinTests(SynchronousTestCase): """ Tests for :py:func:`_forever_httpbin` """ def setUp(self): self.make_httpbin_site_returns = Site(Resource()) self.serve_tcp_calls = [] self.serve_tcp_returns = defer.Deferred() self.serve_tls_calls = [] self.serve_tls_returns = defer.Deferred() self.output_process_description_calls = [] self.output_process_description_returns = None self.reactor = MemoryReactor() self.forever_httpbin = functools.partial( child._forever_httpbin, _make_httpbin_site=self.make_httpbin_site, _serve_tcp=self.serve_tcp, _serve_tls=self.serve_tls, _output_process_description=self.output_process_description, ) def make_httpbin_site(self, reactor, *args, **kwargs): """ A fake :py:func:`child._make_httpbin_site`. """ return self.make_httpbin_site_returns def serve_tcp(self, reactor, host, port, site): """ A fake :py:func:`child._serve_tcp`.
""" self.serve_tcp_calls.append((reactor, host, port, site)) return self.serve_tcp_returns def serve_tls(self, reactor, host, port, site): """ A fake :py:func:`child._serve_tls`. """ self.serve_tls_calls.append((reactor, host, port, site)) return self.serve_tls_returns def output_process_description(self, description, *args, **kwargs): """ A fake :py:func:`child._output_process_description` """ self.output_process_description_calls.append(description) return self.output_process_description_returns def assertDescriptionAndDeferred(self, description_deferred, forever_deferred): """ Assert that firing ``description_deferred`` outputs the description but that ``forever_deferred`` never fires. """ description_deferred.callback("description") self.assertEqual(self.output_process_description_calls, ["description"]) self.assertNoResult(forever_deferred) def test_default_arguments(self): """ The default command line arguments host ``httpbin`` on ``localhost`` and a randomly-assigned port, returning a :py:class:`Deferred` that never fires. """ deferred = self.forever_httpbin(self.reactor, []) self.assertEqual( self.serve_tcp_calls, [ (self.reactor, 'localhost', 0, self.make_httpbin_site_returns) ] ) self.assertDescriptionAndDeferred( description_deferred=self.serve_tcp_returns, forever_deferred=deferred, ) def test_https(self): """ The ``--https`` command line argument serves ``httpbin`` over HTTPS, returning a :py:class:`Deferred` that never fires. """ deferred = self.forever_httpbin(self.reactor, ['--https']) self.assertEqual( self.serve_tls_calls, [ (self.reactor, 'localhost', 0, self.make_httpbin_site_returns) ] ) self.assertDescriptionAndDeferred( description_deferred=self.serve_tls_returns, forever_deferred=deferred, ) def test_host(self): """ The ``--host`` command line argument serves ``httpbin`` on provided host, returning a :py:class:`Deferred` that never fires. 
""" deferred = self.forever_httpbin(self.reactor, ['--host', 'example.org']) self.assertEqual( self.serve_tcp_calls, [ ( self.reactor, 'example.org', 0, self.make_httpbin_site_returns, ) ] ) self.assertDescriptionAndDeferred( description_deferred=self.serve_tcp_returns, forever_deferred=deferred, ) def test_port(self): """ The ``--port`` command line argument serves ``httpbin`` on the provided port, returning a :py:class:`Deferred` that never fires. """ deferred = self.forever_httpbin(self.reactor, ['--port', '91']) self.assertEqual( self.serve_tcp_calls, [ (self.reactor, 'localhost', 91, self.make_httpbin_site_returns) ] ) self.assertDescriptionAndDeferred( description_deferred=self.serve_tcp_returns, forever_deferred=deferred, ) treq-18.6.0/src/treq/test/local_httpbin/test/__init__.py0000644000076500000240000000000013221271655023454 0ustar glyphstaff00000000000000treq-18.6.0/src/treq/test/local_httpbin/test/test_parent.py0000644000076500000240000003660213221271655024266 0ustar glyphstaff00000000000000""" Tests for :py:mod:`treq.test.local_httpbin.parent` """ import attr import json import signal import sys from twisted.internet import defer from twisted.internet.interfaces import (IProcessTransport, IReactorCore, IReactorProcess) from twisted.python.failure import Failure from treq.test.util import skip_on_windows_because_of_199 from twisted.internet.error import ProcessTerminated, ConnectionDone from twisted.test.proto_helpers import MemoryReactor, StringTransport from twisted.trial.unittest import SynchronousTestCase from zope.interface import implementer, verify from .. import parent, shared skip = skip_on_windows_because_of_199() @attr.s class FakeProcessTransportState(object): """ State for :py:class:`FakeProcessTransport`. 
standard_in_closed = attr.ib(default=False) standard_out_closed = attr.ib(default=False) standard_error_closed = attr.ib(default=False) signals = attr.ib(default=attr.Factory(list)) @implementer(IProcessTransport) @attr.s class FakeProcessTransport(StringTransport, object): """ A fake process transport. """ pid = 1234 _state = attr.ib() def closeStdin(self): """ Close standard in. """ self._state.standard_in_closed = True def closeStdout(self): """ Close standard out. """ self._state.standard_out_closed = True def closeStderr(self): """ Close standard error. """ self._state.standard_error_closed = True def closeChildFD(self, descriptor): """ Close a child's file descriptor. :param descriptor: See :py:meth:`IProcessTransport.closeChildFD` """ def writeToChild(self, childFD, data): """ Write data to a child's file descriptor. :param childFD: See :py:meth:`IProcessTransport.writeToChild` :param data: See :py:meth:`IProcessTransport.writeToChild` """ def signalProcess(self, signalID): """ Send a signal. :param signalID: See :py:meth:`IProcessTransport.signalProcess` """ self._state.signals.append(signalID) class FakeProcessTransportTests(SynchronousTestCase): """ Tests for :py:class:`FakeProcessTransport`. """ def setUp(self): self.state = FakeProcessTransportState() self.transport = FakeProcessTransport(self.state) def test_provides_interface(self): """ Instances provide :py:class:`IProcessTransport`. """ verify.verifyObject(IProcessTransport, self.transport) def test_closeStdin(self): """ Closing standard in updates the state instance. """ self.assertFalse(self.state.standard_in_closed) self.transport.closeStdin() self.assertTrue(self.state.standard_in_closed) def test_closeStdout(self): """ Closing standard out updates the state instance. """ self.assertFalse(self.state.standard_out_closed) self.transport.closeStdout() self.assertTrue(self.state.standard_out_closed) def test_closeStderr(self): """ Closing standard error updates the state instance.
""" self.assertFalse(self.state.standard_error_closed) self.transport.closeStderr() self.assertTrue(self.state.standard_error_closed) class HTTPServerProcessProtocolTests(SynchronousTestCase): """ Tests for :py:class:`parent._HTTPBinServerProcessProtocol` """ def setUp(self): self.transport_state = FakeProcessTransportState() self.transport = FakeProcessTransport(self.transport_state) self.all_data_received = defer.Deferred() self.terminated = defer.Deferred() self.protocol = parent._HTTPBinServerProcessProtocol( all_data_received=self.all_data_received, terminated=self.terminated, ) self.protocol.makeConnection(self.transport) def assertStandardInputAndOutputClosed(self): """ The transport's standard in, out, and error are closed. """ self.assertTrue(self.transport_state.standard_in_closed) self.assertTrue(self.transport_state.standard_out_closed) self.assertTrue(self.transport_state.standard_error_closed) def test_receive_http_description(self): """ Receiving a serialized :py:class:`_HTTPBinDescription` fires the ``all_data_received`` :py:class:`Deferred`. """ self.assertNoResult(self.all_data_received) description = shared._HTTPBinDescription("host", 1234, "cert") self.protocol.lineReceived( json.dumps(attr.asdict(description)).encode('ascii') ) self.assertStandardInputAndOutputClosed() self.assertEqual(self.successResultOf(self.all_data_received), description) def test_receive_unexpected_line(self): """ Receiving a line after the description synchronously raises in :py:class:`RuntimeError` """ self.test_receive_http_description() with self.assertRaises(RuntimeError): self.protocol.lineReceived(b"unexpected") def test_connection_lost_before_receiving_data(self): """ If the process terminates before its data is received, both ``all_data_received`` and ``terminated`` errback. 
""" self.assertNoResult(self.all_data_received) self.protocol.connectionLost(Failure(ConnectionDone("done"))) self.assertIsInstance( self.failureResultOf(self.all_data_received).value, ConnectionDone, ) self.assertIsInstance( self.failureResultOf(self.terminated).value, ConnectionDone, ) def test_connection_lost(self): """ ``terminated`` fires when the connection is lost. """ self.test_receive_http_description() self.protocol.connectionLost(Failure(ConnectionDone("done"))) self.assertIsInstance( self.failureResultOf(self.terminated).value, ConnectionDone, ) @attr.s class SpawnedProcess(object): """ A call to :py:class:`MemoryProcessReactor.spawnProcess`. """ process_protocol = attr.ib() executable = attr.ib() args = attr.ib() env = attr.ib() path = attr.ib() uid = attr.ib() gid = attr.ib() use_pty = attr.ib() child_fds = attr.ib() returned_process_transport = attr.ib() returned_process_transport_state = attr.ib() def send_stdout(self, data): """ Send data from the process' standard out. :param data: The standard out data. """ self.process_protocol.childDataReceived(1, data) def end_process(self, reason): """ End the process. :param reason: The reason. :type reason: :py:class:`Failure` """ self.process_protocol.processEnded(reason) @implementer(IReactorCore, IReactorProcess) class MemoryProcessReactor(MemoryReactor): """ A fake :py:class:`IReactorProcess` and :py:class:`IReactorCore` provider to be used in tests. """ def __init__(self): MemoryReactor.__init__(self) self.spawnedProcesses = [] def spawnProcess(self, processProtocol, executable, args=(), env={}, path=None, uid=None, gid=None, usePTY=0, childFDs=None): """ :ivar process_protocol: Stores the protocol passed to the reactor. :return: An L{IProcessTransport} provider. 
""" transport_state = FakeProcessTransportState() transport = FakeProcessTransport(transport_state) self.spawnedProcesses.append(SpawnedProcess( process_protocol=processProtocol, executable=executable, args=args, env=env, path=path, uid=uid, gid=gid, use_pty=usePTY, child_fds=childFDs, returned_process_transport=transport, returned_process_transport_state=transport_state, )) processProtocol.makeConnection(transport) return transport class MemoryProcessReactorTests(SynchronousTestCase): """ Tests for :py:class:`MemoryProcessReactor` """ def test_provides_interfaces(self): """ :py:class:`MemoryProcessReactor` instances provide :py:class:`IReactorCore` and :py:class:`IReactorProcess`. """ reactor = MemoryProcessReactor() verify.verifyObject(IReactorCore, reactor) verify.verifyObject(IReactorProcess, reactor) class HTTPBinProcessTests(SynchronousTestCase): """ Tests for :py:class:`_HTTPBinProcesss`. """ def setUp(self): self.reactor = MemoryProcessReactor() self.opened_file_descriptors = [] def fd_recording_open(self, *args, **kwargs): """ Record the file descriptors of files opened by :py:func:`open`. :return: A file object. """ fobj = open(*args, **kwargs) self.opened_file_descriptors.append(fobj.fileno()) return fobj def spawned_process(self): """ Assert that ``self.reactor`` has spawned only one process and return the :py:class:`SpawnedProcess` representing it. :return: The :py:class:`SpawnedProcess`. """ self.assertEqual(len(self.reactor.spawnedProcesses), 1) return self.reactor.spawnedProcesses[0] def assertSpawnAndDescription(self, process, args, description): """ Assert that spawning the given process invokes the command with the given args, that standard error is redirected, that it is killed at reactor shutdown, and that it returns a description that matches the provided one. :param process: :py:class:`_HTTPBinProcesss` instance. :param args: The arguments with which to execute the child process. 
:type args: :py:class:`tuple` of :py:class:`str` :param description: The expected :py:class:`_HTTPBinDescription`. :return: The returned :py:class:`_HTTPBinDescription` """ process._open = self.fd_recording_open description_deferred = process.server_description(self.reactor) spawned_process = self.spawned_process() self.assertEqual(spawned_process.args, args) self.assertEqual(len(self.opened_file_descriptors), 1) [error_log_fd] = self.opened_file_descriptors self.assertEqual(spawned_process.child_fds.get(2), error_log_fd) self.assertNoResult(description_deferred) spawned_process.send_stdout(description.to_json_bytes() + b'\n') before_shutdown = self.reactor.triggers["before"]["shutdown"] self.assertEqual(len(before_shutdown), 1) [(before_shutdown_function, _, _)] = before_shutdown self.assertEqual(before_shutdown_function, process.kill) self.assertEqual(self.successResultOf(description_deferred), description) def test_server_description_spawns_process(self): """ :py:class:`_HTTPBinProcess.server_description` spawns an ``httpbin`` child process that it monitors with :py:class:`_HTTPBinServerProcessProtocol`, and redirects its standard error to a log file. """ httpbin_process = parent._HTTPBinProcess(https=False) description = shared._HTTPBinDescription(host="host", port=1234) self.assertSpawnAndDescription( httpbin_process, [ sys.executable, '-m', 'treq.test.local_httpbin.child' ], description) def test_server_description_spawns_process_https(self): """ :py:class:`_HTTPBinProcess.server_description` spawns an ``httpbin`` child process that listens over HTTPS, that it monitors with :py:class:`_HTTPBinServerProcessProtocol`, and redirects the process' standard error to a log file. 
""" httpbin_process = parent._HTTPBinProcess(https=True) description = shared._HTTPBinDescription(host="host", port=1234, cacert="cert") self.assertSpawnAndDescription( httpbin_process, [ sys.executable, '-m', 'treq.test.local_httpbin.child', '--https', ], description) def test_server_description_caches_description(self): """ :py:class:`_HTTPBinProcess.server_description` spawns an ``httpbin`` child process only once, after which it returns a cached :py:class:`_HTTPBinDescription`. """ httpbin_process = parent._HTTPBinProcess(https=False) description_deferred = httpbin_process.server_description(self.reactor) self.spawned_process().send_stdout( shared._HTTPBinDescription(host="host", port=1234).to_json_bytes() + b'\n' ) description = self.successResultOf(description_deferred) cached_description_deferred = httpbin_process.server_description( self.reactor, ) cached_description = self.successResultOf(cached_description_deferred) self.assertIs(description, cached_description) def test_kill_before_spawn(self): """ Killing a process before it has been spawned has no effect. """ parent._HTTPBinProcess(https=False).kill() def test_kill(self): """ Kill terminates the process as quickly as the platform allows, and the termination failure is suppressed. 
""" httpbin_process = parent._HTTPBinProcess(https=False) httpbin_process.server_description(self.reactor) spawned_process = self.spawned_process() spawned_process.send_stdout( shared._HTTPBinDescription(host="host", port=1234).to_json_bytes() + b'\n' ) termination_deferred = httpbin_process.kill() self.assertEqual( spawned_process.returned_process_transport_state.signals, ['KILL'], ) spawned_process.end_process( Failure(ProcessTerminated(1, signal=signal.SIGKILL)), ) self.successResultOf(termination_deferred) def test_kill_unexpected_exit(self): """ The :py:class:`Deferred` returned by :py:meth:`_HTTPBinProcess.kill` errbacks with the failure when it is not :py:class:`ProcessTerminated`, or its signal does not match the expected signal. """ for error in [ProcessTerminated(1, signal=signal.SIGIO), ConnectionDone("Bye")]: httpbin_process = parent._HTTPBinProcess(https=False) httpbin_process.server_description(self.reactor) spawned_process = self.reactor.spawnedProcesses[-1] spawned_process.send_stdout( shared._HTTPBinDescription(host="host", port=1234).to_json_bytes() + b'\n' ) termination_deferred = httpbin_process.kill() spawned_process.end_process(Failure(error)) self.assertIs(self.failureResultOf(termination_deferred).value, error) treq-18.6.0/src/treq/test/local_httpbin/parent.py0000644000076500000240000001232113221271655022240 0ustar glyphstaff00000000000000""" Spawn and monitor an ``httpbin`` child process. """ import attr import signal import sys import os from twisted.protocols import basic, policies from twisted.internet import protocol, endpoints, error from twisted.internet.defer import Deferred, succeed from .shared import _HTTPBinDescription class _HTTPBinServerProcessProtocol(basic.LineOnlyReceiver): """ Manage the lifecycle of an ``httpbin`` process. """ delimiter = b'\n' def __init__(self, all_data_received, terminated): """ Manage the lifecycle of an ``httpbin`` process. 
:param all_data_received: A Deferred that will be called back with an :py:class:`_HTTPBinDescription` object :type all_data_received: :py:class:`Deferred` :param terminated: A Deferred that will be called back when the process has ended. :type terminated: :py:class:`Deferred` """ self._all_data_received = all_data_received self._received = False self._terminated = terminated def lineReceived(self, line): if self._received: raise RuntimeError("Unexpected line: {!r}".format(line)) description = _HTTPBinDescription.from_json_bytes(line) self._received = True # Remove readers and writers that leave the reactor in a dirty # state after a test. self.transport.closeStdin() self.transport.closeStdout() self.transport.closeStderr() self._all_data_received.callback(description) def connectionLost(self, reason): if not self._received: self._all_data_received.errback(reason) self._terminated.errback(reason) @attr.s class _HTTPBinProcess(object): """ Manage an ``httpbin`` server process. :ivar _all_data_received: See :py:attr:`_HTTPBinServerProcessProtocol.all_data_received` :ivar _terminated: See :py:attr:`_HTTPBinServerProcessProtocol.terminated` """ _https = attr.ib() _error_log_path = attr.ib(default='httpbin-server-error.log') _all_data_received = attr.ib(init=False, default=attr.Factory(Deferred)) _terminated = attr.ib(init=False, default=attr.Factory(Deferred)) _process = attr.ib(init=False, default=None) _process_description = attr.ib(init=False, default=None) _open = staticmethod(open) def _spawn_httpbin_process(self, reactor): """ Spawn an ``httpbin`` process, returning a :py:class:`Deferred` that fires with the process transport and result. 
""" server = _HTTPBinServerProcessProtocol( all_data_received=self._all_data_received, terminated=self._terminated ) argv = [ sys.executable, '-m', 'treq.test.local_httpbin.child', ] if self._https: argv.append('--https') with self._open(self._error_log_path, 'wb') as error_log: endpoint = endpoints.ProcessEndpoint( reactor, sys.executable, argv, env=os.environ, childFDs={ 1: 'r', 2: error_log.fileno(), }, ) # Processes are spawned synchronously. spawned = endpoint.connect( # ProtocolWrapper, WrappingFactory's protocol, has a # disconnecting attribute. See # https://twistedmatrix.com/trac/ticket/6606 policies.WrappingFactory( protocol.Factory.forProtocol(lambda: server), ), ) def wait_for_protocol(connected_protocol): process = connected_protocol.transport return self._all_data_received.addCallback( return_result_and_process, process, ) def return_result_and_process(description, process): return description, process return spawned.addCallback(wait_for_protocol) def server_description(self, reactor): """ Return a :py:class:`Deferred` that fires with the the process' :py:class:`_HTTPBinDescription`, spawning the process if necessary. """ if self._process is None: ready = self._spawn_httpbin_process(reactor) def store_and_schedule_termination(description_and_process): description, process = description_and_process self._process = process self._process_description = description reactor.addSystemEventTrigger("before", "shutdown", self.kill) return self._process_description return ready.addCallback(store_and_schedule_termination) else: return succeed(self._process_description) def kill(self): """ Kill the ``httpbin`` process. 
""" if not self._process: return self._process.signalProcess("KILL") def suppress_process_terminated(exit_failure): exit_failure.trap(error.ProcessTerminated) if exit_failure.value.signal != signal.SIGKILL: return exit_failure return self._terminated.addErrback(suppress_process_terminated) treq-18.6.0/src/treq/test/local_httpbin/__init__.py0000644000076500000240000000000013221271655022475 0ustar glyphstaff00000000000000treq-18.6.0/src/treq/test/local_httpbin/shared.py0000644000076500000240000000202513221271655022215 0ustar glyphstaff00000000000000""" Things shared between the ``httpbin`` child and parent processes """ import attr import json @attr.s class _HTTPBinDescription(object): """ Describe an ``httpbin`` process. :param host: The host on which the process listens. :type host: :py:class:`str` :param port: The port on which the process listens. :type port: :py:class:`int` :param cacert: (optional) The PEM-encoded certificate authority's certificate. The calling process' treq must trust this when running HTTPS tests. :type cacert: :py:class:`bytes` or :py:class:`None` """ host = attr.ib() port = attr.ib() cacert = attr.ib(default=None) @classmethod def from_json_bytes(cls, json_data): """ Deserialize an instance from JSON bytes. """ return cls(**json.loads(json_data.decode('ascii'))) def to_json_bytes(self): """ Serialize an instance from JSON bytes. """ return json.dumps(attr.asdict(self), sort_keys=True).encode('ascii') treq-18.6.0/src/treq/test/local_httpbin/child.py0000644000076500000240000002212413221271655022034 0ustar glyphstaff00000000000000""" A local ``httpbin`` server to run integration tests against. This ensures tests do not depend on `httpbin `_. 
""" from __future__ import print_function import argparse import datetime import sys import httpbin import six from twisted.internet.defer import Deferred, inlineCallbacks, returnValue from twisted.internet.endpoints import TCP4ServerEndpoint, SSL4ServerEndpoint from twisted.internet.task import react from twisted.internet.ssl import (Certificate, CertificateOptions) from OpenSSL.crypto import PKey, X509 from twisted.python.threadpool import ThreadPool from twisted.web.server import Site from twisted.web.wsgi import WSGIResource from cryptography import x509 from cryptography.hazmat.backends import default_backend from cryptography.hazmat.primitives import hashes from cryptography.hazmat.primitives.asymmetric import rsa from cryptography.x509.oid import NameOID from cryptography.hazmat.primitives.serialization import Encoding from .shared import _HTTPBinDescription def _certificates_for_authority_and_server(service_identity, key_size=1024): """ Create a self-signed CA certificate and server certificate signed by the CA. :param service_identity: The identity (hostname) of the server. :type service_identity: :py:class:`unicode` :param key_size: (optional) The size of CA's and server's private RSA keys. Defaults to 1024 bits, which is the minimum allowed by OpenSSL Contexts at the default security level as of 1.1. :type key_size: :py:class:`int` :return: a 3-tuple of ``(certificate_authority_certificate, server_private_key, server_certificate)``. 
:rtype: :py:class:`tuple` of (:py:class:`sslverify.Certificate`, :py:class:`OpenSSL.crypto.PKey`, :py:class:`OpenSSL.crypto.X509`) """ common_name_for_ca = x509.Name( [x509.NameAttribute(NameOID.COMMON_NAME, u'Testing Example CA')] ) common_name_for_server = x509.Name( [x509.NameAttribute(NameOID.COMMON_NAME, u'Testing Example Server')] ) one_day = datetime.timedelta(1, 0, 0) private_key_for_ca = rsa.generate_private_key( public_exponent=65537, key_size=key_size, backend=default_backend() ) public_key_for_ca = private_key_for_ca.public_key() ca_certificate = ( x509.CertificateBuilder() .subject_name(common_name_for_ca) .issuer_name(common_name_for_ca) .not_valid_before(datetime.datetime.today() - one_day) .not_valid_after(datetime.datetime.today() + one_day) .serial_number(x509.random_serial_number()) .public_key(public_key_for_ca) .add_extension( x509.BasicConstraints(ca=True, path_length=9), critical=True, ) .sign( private_key=private_key_for_ca, algorithm=hashes.SHA256(), backend=default_backend() ) ) private_key_for_server = rsa.generate_private_key( public_exponent=65537, key_size=key_size, backend=default_backend() ) public_key_for_server = private_key_for_server.public_key() server_certificate = ( x509.CertificateBuilder() .subject_name(common_name_for_server) .issuer_name(common_name_for_ca) .not_valid_before(datetime.datetime.today() - one_day) .not_valid_after(datetime.datetime.today() + one_day) .serial_number(x509.random_serial_number()) .public_key(public_key_for_server) .add_extension( x509.BasicConstraints(ca=False, path_length=None), critical=True, ) .add_extension( x509.SubjectAlternativeName( [x509.DNSName(service_identity)] ), critical=True, ) .sign( private_key=private_key_for_ca, algorithm=hashes.SHA256(), backend=default_backend() ) ) ca_self_cert = Certificate.loadPEM( ca_certificate.public_bytes(Encoding.PEM) ) pkey = PKey.from_cryptography_key(private_key_for_server) x509_server_certificate = X509.from_cryptography(server_certificate) 
return ca_self_cert, pkey, x509_server_certificate


def _make_httpbin_site(reactor, threadpool_factory=ThreadPool):
    """
    Return a :py:class:`Site` that hosts an ``httpbin`` WSGI application.

    :param reactor: The reactor.
    :param threadpool_factory: (optional) A callable that creates a
        :py:class:`ThreadPool`.

    :return: A :py:class:`Site` that hosts ``httpbin``
    """
    wsgi_threads = threadpool_factory()
    wsgi_threads.start()
    reactor.addSystemEventTrigger("before", "shutdown", wsgi_threads.stop)

    wsgi_resource = WSGIResource(reactor, wsgi_threads, httpbin.app)
    return Site(wsgi_resource)


@inlineCallbacks
def _serve_tls(reactor, host, port, site):
    """
    Serve a site over TLS.

    :param reactor: The reactor.

    :param host: The host on which to listen.
    :type host: :py:class:`str`

    :param port: The port on which to listen.
    :type port: :py:class:`int`

    :param site: The :py:class:`Site` to serve.

    :return: A :py:class:`Deferred` that fires with a
        :py:class:`_HTTPBinDescription`
    """
    cert_host = host.decode('ascii') if six.PY2 else host
    (
        ca_cert,
        private_key,
        certificate,
    ) = _certificates_for_authority_and_server(cert_host)

    context_factory = CertificateOptions(privateKey=private_key,
                                         certificate=certificate)

    endpoint = SSL4ServerEndpoint(reactor, port,
                                  sslContextFactory=context_factory,
                                  interface=host)

    port = yield endpoint.listen(site)

    description = _HTTPBinDescription(host=host,
                                      port=port.getHost().port,
                                      cacert=ca_cert.dumpPEM().decode('ascii'))

    returnValue(description)


@inlineCallbacks
def _serve_tcp(reactor, host, port, site):
    """
    Serve a site over plain TCP.

    :param reactor: The reactor.

    :param host: The host on which to listen.
    :type host: :py:class:`str`

    :param port: The port on which to listen.
:type port: :py:class:`int`

    :return: A :py:class:`Deferred` that fires with a
        :py:class:`_HTTPBinDescription`
    """
    endpoint = TCP4ServerEndpoint(reactor, port, interface=host)

    port = yield endpoint.listen(site)

    description = _HTTPBinDescription(host=host, port=port.getHost().port)

    returnValue(description)


def _output_process_description(description, stdout=sys.stdout):
    """
    Write a process description to standard out.

    :param description: The process description.
    :type description: :py:class:`_HTTPBinDescription`

    :param stdout: (optional) Standard out.
    """
    if six.PY2:
        write = stdout.write
        flush = stdout.flush
    else:
        write = stdout.buffer.write
        flush = stdout.buffer.flush

    write(description.to_json_bytes() + b'\n')
    flush()


def _forever_httpbin(reactor, argv,
                     _make_httpbin_site=_make_httpbin_site,
                     _serve_tcp=_serve_tcp,
                     _serve_tls=_serve_tls,
                     _output_process_description=_output_process_description):
    """
    Run ``httpbin`` forever.

    :param reactor: The Twisted reactor.

    :param argv: The arguments with which the script was run.
    :type argv: :py:class:`list` of :py:class:`str`

    :return: a :py:class:`Deferred` that never fires.
    """
    parser = argparse.ArgumentParser(
        description="""
        Run httpbin forever.  This writes a JSON object to standard
        out.  The host and port properties contain the host and port
        on which httpbin listens.  When run with HTTPS, the cacert
        property contains the PEM-encoded CA certificate that clients
        must trust.
""" ) parser.add_argument("--https", help="Serve HTTPS", action="store_const", dest='serve', const=_serve_tls, default=_serve_tcp) parser.add_argument("--host", help="The host on which the server will listen.", type=str, default="localhost") parser.add_argument("--port", help="The on which the server will listen.", type=int, default=0) arguments = parser.parse_args(argv) site = _make_httpbin_site(reactor) description_deferred = arguments.serve(reactor, arguments.host, arguments.port, site) description_deferred.addCallback(_output_process_description) description_deferred.addCallback(lambda _: Deferred()) return description_deferred if __name__ == '__main__': react(_forever_httpbin, (sys.argv[1:],)) treq-18.6.0/src/treq/test/test_content.py0000644000076500000240000001526013221271655020643 0ustar glyphstaff00000000000000# -*- coding: utf-8 -*- import mock from twisted.python.failure import Failure from twisted.trial.unittest import TestCase from twisted.web.http_headers import Headers from twisted.web.client import ResponseDone, ResponseFailed from twisted.web.http import PotentialDataLoss from treq import collect, content, json_content, text_content from treq.client import _BufferedResponse class ContentTests(TestCase): def setUp(self): self.response = mock.Mock() self.protocol = None def deliverBody(protocol): self.protocol = protocol self.response.deliverBody.side_effect = deliverBody self.response = _BufferedResponse(self.response) def test_collect(self): data = [] d = collect(self.response, data.append) self.protocol.dataReceived(b'{') self.protocol.dataReceived(b'"msg": "hell') self.protocol.dataReceived(b'o"}') self.protocol.connectionLost(Failure(ResponseDone())) self.assertEqual(self.successResultOf(d), None) self.assertEqual(data, [b'{', b'"msg": "hell', b'o"}']) def test_collect_failure(self): data = [] d = collect(self.response, data.append) self.protocol.dataReceived(b'foo') self.protocol.connectionLost(Failure(ResponseFailed("test failure"))) 
self.failureResultOf(d, ResponseFailed) self.assertEqual(data, [b'foo']) def test_collect_failure_potential_data_loss(self): """ PotentialDataLoss failures are treated as success. """ data = [] d = collect(self.response, data.append) self.protocol.dataReceived(b'foo') self.protocol.connectionLost(Failure(PotentialDataLoss())) self.assertEqual(self.successResultOf(d), None) self.assertEqual(data, [b'foo']) def test_collect_0_length(self): self.response.length = 0 d = collect( self.response, lambda d: self.fail("Unexpectedly called with: {0}".format(d))) self.assertEqual(self.successResultOf(d), None) def test_content(self): d = content(self.response) self.protocol.dataReceived(b'foo') self.protocol.dataReceived(b'bar') self.protocol.connectionLost(Failure(ResponseDone())) self.assertEqual(self.successResultOf(d), b'foobar') def test_content_cached(self): d1 = content(self.response) self.protocol.dataReceived(b'foo') self.protocol.dataReceived(b'bar') self.protocol.connectionLost(Failure(ResponseDone())) self.assertEqual(self.successResultOf(d1), b'foobar') def _fail_deliverBody(protocol): self.fail("deliverBody unexpectedly called.") self.response.original.deliverBody.side_effect = _fail_deliverBody d3 = content(self.response) self.assertEqual(self.successResultOf(d3), b'foobar') self.assertNotIdentical(d1, d3) def test_content_multiple_waiters(self): d1 = content(self.response) d2 = content(self.response) self.protocol.dataReceived(b'foo') self.protocol.connectionLost(Failure(ResponseDone())) self.assertEqual(self.successResultOf(d1), b'foo') self.assertEqual(self.successResultOf(d2), b'foo') self.assertNotIdentical(d1, d2) def test_json_content(self): self.response.headers = Headers() d = json_content(self.response) self.protocol.dataReceived(b'{"msg":"hello!"}') self.protocol.connectionLost(Failure(ResponseDone())) self.assertEqual(self.successResultOf(d), {"msg": "hello!"}) def test_json_content_unicode(self): """ When Unicode JSON content is received, the JSON 
text should be correctly decoded. RFC7159 (8.1): "JSON text SHALL be encoded in UTF-8, UTF-16, or UTF-32. The default encoding is UTF-8" """ self.response.headers = Headers() d = json_content(self.response) self.protocol.dataReceived(u'{"msg":"hëlló!"}'.encode('utf-8')) self.protocol.connectionLost(Failure(ResponseDone())) self.assertEqual(self.successResultOf(d), {u'msg': u'hëlló!'}) def test_json_content_utf16(self): """ JSON received is decoded according to the charset given in the Content-Type header. """ self.response.headers = Headers({ b'Content-Type': [b"application/json; charset='UTF-16LE'"], }) d = json_content(self.response) self.protocol.dataReceived(u'{"msg":"hëlló!"}'.encode('UTF-16LE')) self.protocol.connectionLost(Failure(ResponseDone())) self.assertEqual(self.successResultOf(d), {u'msg': u'hëlló!'}) def test_text_content(self): self.response.headers = Headers( {b'Content-Type': [b'text/plain; charset=utf-8']}) d = text_content(self.response) self.protocol.dataReceived(b'\xe2\x98\x83') self.protocol.connectionLost(Failure(ResponseDone())) self.assertEqual(self.successResultOf(d), u'\u2603') def test_text_content_default_encoding_no_param(self): self.response.headers = Headers( {b'Content-Type': [b'text/plain']}) d = text_content(self.response) self.protocol.dataReceived(b'\xa1') self.protocol.connectionLost(Failure(ResponseDone())) self.assertEqual(self.successResultOf(d), u'\xa1') def test_text_content_default_encoding_no_header(self): self.response.headers = Headers() d = text_content(self.response) self.protocol.dataReceived(b'\xa1') self.protocol.connectionLost(Failure(ResponseDone())) self.assertEqual(self.successResultOf(d), u'\xa1') def test_content_application_json_default_encoding(self): self.response.headers = Headers( {b'Content-Type': [b'application/json']}) d = text_content(self.response) self.protocol.dataReceived(b'gr\xc3\xbcn') self.protocol.connectionLost(Failure(ResponseDone())) self.assertEqual(self.successResultOf(d), u'grün') 
def test_text_content_unicode_headers(self):
        """
        Header parsing is robust against unicode header names and values.
        """
        self.response.headers = Headers({
            b'Content-Type': [
                u'text/plain; charset="UTF-16BE"; u=ᛃ'.encode('utf-8')],
            u'Coördination'.encode('iso-8859-1'): [
                u'koʊˌɔrdɪˈneɪʃən'.encode('utf-8')],
        })

        d = text_content(self.response)

        self.protocol.dataReceived(u'ᚠᚡ'.encode('UTF-16BE'))
        self.protocol.connectionLost(Failure(ResponseDone()))

        self.assertEqual(self.successResultOf(d), u'ᚠᚡ')
treq-18.6.0/src/treq/test/test_multipart.py0000644000076500000240000005100213106516575021211 0ustar glyphstaff00000000000000# coding: utf-8
# Copyright (c) Twisted Matrix Laboratories.
# See LICENSE for details.
import cgi
from io import BytesIO

from twisted.trial import unittest
from zope.interface.verify import verifyObject

from twisted.python import compat
from twisted.internet import task
from twisted.web.client import FileBodyProducer
from twisted.web.iweb import UNKNOWN_LENGTH, IBodyProducer

from treq.multipart import MultiPartProducer, _LengthConsumer

if compat._PY3:
    long = int
    unicode = compat.unicode


class MultiPartProducerTestCase(unittest.TestCase):
    """
    Tests for the L{MultiPartProducer} which gets a dictionary-like object
    with post parameters, converts them to multipart/form-data format and
    feeds them to an L{IConsumer}.
    """
    def _termination(self):
        """
        This method can be used as the C{terminationPredicateFactory} for a
        L{Cooperator}.  It returns a predicate which immediately returns
        C{False}, indicating that no more work should be done this iteration.
        This has the result of only allowing one iteration of a cooperative
        task to be run per L{Cooperator} iteration.
""" self._scheduled = [] self.cooperator = task.Cooperator( self._termination, self._scheduled.append) def getOutput(self, producer, with_producer=False): """ A convenience function to consume and return outpute. """ consumer = output = BytesIO() producer.startProducing(consumer) while self._scheduled: self._scheduled.pop(0)() if with_producer: return (output.getvalue(), producer) else: return output.getvalue() def newLines(self, value): if isinstance(value, unicode): return value.replace(u"\n", u"\r\n") else: return value.replace(b"\n", b"\r\n") def test_interface(self): """ L{MultiPartProducer} instances provide L{IBodyProducer}. """ self.assertTrue( verifyObject( IBodyProducer, MultiPartProducer({}))) def test_unknownLength(self): """ If the L{MultiPartProducer} is constructed with a file-like object passed as a parameter without either a C{seek} or C{tell} method, its C{length} attribute is set to C{UNKNOWN_LENGTH}. """ class HasSeek(object): def seek(self, offset, whence): pass class HasTell(object): def tell(self): pass producer = MultiPartProducer( {"f": ("name", None, FileBodyProducer(HasSeek()))}) self.assertEqual(UNKNOWN_LENGTH, producer.length) producer = MultiPartProducer( {"f": ("name", None, FileBodyProducer(HasTell()))}) self.assertEqual(UNKNOWN_LENGTH, producer.length) def test_knownLengthOnFile(self): """ If the L{MultiPartProducer} is constructed with a file-like object with both C{seek} and C{tell} methods, its C{length} attribute is set to the size of the file as determined by those methods. 
""" inputBytes = b"here are some bytes" inputFile = BytesIO(inputBytes) inputFile.seek(5) producer = MultiPartProducer({ "field": ('file name', None, FileBodyProducer( inputFile, cooperator=self.cooperator))}) # Make sure we are generous enough not to alter seek position: self.assertEqual(inputFile.tell(), 5) # Total length is hard to calculate manually # as it contains a lot of headers parameters, newlines and boundaries # let's assert for now that it's no less than the input parameter self.assertTrue(producer.length > len(inputBytes)) # Calculating length should not touch producers self.assertTrue(producer._currentProducer is None) def test_defaultCooperator(self): """ If no L{Cooperator} instance is passed to L{MultiPartProducer}, the global cooperator is used. """ producer = MultiPartProducer({ "field": ('file name', None, FileBodyProducer( BytesIO(b"yo"), cooperator=self.cooperator)) }) self.assertEqual(task.cooperate, producer._cooperate) def test_startProducing(self): """ L{MultiPartProducer.startProducing} starts writing bytes from the input file to the given L{IConsumer} and returns a L{Deferred} which fires when they have all been written. """ consumer = output = BytesIO() producer = MultiPartProducer({ b"field": ('file name', "text/hello-world", FileBodyProducer( BytesIO(b"Hello, World"), cooperator=self.cooperator)) }, cooperator=self.cooperator, boundary=b"heyDavid") complete = producer.startProducing(consumer) iterations = 0 while self._scheduled: iterations += 1 self._scheduled.pop(0)() self.assertTrue(iterations > 1) self.assertEqual(self.newLines(b"""--heyDavid Content-Disposition: form-data; name="field"; filename="file name" Content-Type: text/hello-world Content-Length: 12 Hello, World --heyDavid-- """), output.getvalue()) self.assertEqual(None, self.successResultOf(complete)) def test_inputClosedAtEOF(self): """ When L{MultiPartProducer} reaches end-of-file on the input file given to it, the input file is closed. 
""" inputFile = BytesIO(b"hello, world!") consumer = BytesIO() producer = MultiPartProducer({ "field": ( "file name", "text/hello-world", FileBodyProducer( inputFile, cooperator=self.cooperator)) }, cooperator=self.cooperator, boundary=b"heyDavid") producer.startProducing(consumer) while self._scheduled: self._scheduled.pop(0)() self.assertTrue(inputFile.closed) def test_failedReadWhileProducing(self): """ If a read from the input file fails while producing bytes to the consumer, the L{Deferred} returned by L{MultiPartProducer.startProducing} fires with a L{Failure} wrapping that exception. """ class BrokenFile(object): def read(self, count): raise IOError("Simulated bad thing") producer = MultiPartProducer({ "field": ( "file name", "text/hello-world", FileBodyProducer( BrokenFile(), cooperator=self.cooperator)) }, cooperator=self.cooperator, boundary=b"heyDavid") complete = producer.startProducing(BytesIO()) while self._scheduled: self._scheduled.pop(0)() self.failureResultOf(complete).trap(IOError) def test_stopProducing(self): """ L{MultiPartProducer.stopProducing} stops the underlying L{IPullProducer} and the cooperative task responsible for calling C{resumeProducing} and closes the input file but does not cause the L{Deferred} returned by C{startProducing} to fire. """ inputFile = BytesIO(b"hello, world!") consumer = BytesIO() producer = MultiPartProducer({ "field": ( "file name", "text/hello-world", FileBodyProducer( inputFile, cooperator=self.cooperator)) }, cooperator=self.cooperator, boundary=b"heyDavid") complete = producer.startProducing(consumer) self._scheduled.pop(0)() producer.stopProducing() self.assertTrue(inputFile.closed) self._scheduled.pop(0)() self.assertNoResult(complete) def test_pauseProducing(self): """ L{MultiPartProducer.pauseProducing} temporarily suspends writing bytes from the input file to the given L{IConsumer}. 
""" inputFile = BytesIO(b"hello, world!") consumer = output = BytesIO() producer = MultiPartProducer({ "field": ( "file name", "text/hello-world", FileBodyProducer( inputFile, cooperator=self.cooperator)) }, cooperator=self.cooperator, boundary=b"heyDavid") complete = producer.startProducing(consumer) self._scheduled.pop(0)() currentValue = output.getvalue() self.assertTrue(currentValue) producer.pauseProducing() # Sort of depends on an implementation detail of Cooperator: even # though the only task is paused, there's still a scheduled call. If # this were to go away because Cooperator became smart enough to cancel # this call in this case, that would be fine. self._scheduled.pop(0)() # Since the producer is paused, no new data should be here. self.assertEqual(output.getvalue(), currentValue) self.assertNoResult(complete) def test_resumeProducing(self): """ L{MultoPartProducer.resumeProducing} re-commences writing bytes from the input file to the given L{IConsumer} after it was previously paused with L{MultiPartProducer.pauseProducing}. 
""" inputFile = BytesIO(b"hello, world!") consumer = output = BytesIO() producer = MultiPartProducer({ "field": ( "file name", "text/hello-world", FileBodyProducer( inputFile, cooperator=self.cooperator)) }, cooperator=self.cooperator, boundary=b"heyDavid") producer.startProducing(consumer) self._scheduled.pop(0)() currentValue = output.getvalue() self.assertTrue(currentValue) producer.pauseProducing() producer.resumeProducing() self._scheduled.pop(0)() # make sure we started producing new data after resume self.assertTrue(len(currentValue) < len(output.getvalue())) def test_unicodeString(self): """ Make sure unicode string is passed properly """ output, producer = self.getOutput( MultiPartProducer({ "afield": u"Это моя строчечка\r\n", }, cooperator=self.cooperator, boundary=b"heyDavid"), with_producer=True) expected = self.newLines(u"""--heyDavid Content-Disposition: form-data; name="afield" Это моя строчечка --heyDavid-- """.encode("utf-8")) self.assertEqual(producer.length, len(expected)) self.assertEqual(expected, output) def test_failOnByteStrings(self): """ If byte string is passed as a param and we don't know the encoding, fail early to prevent corrupted form posts """ self.assertRaises( ValueError, MultiPartProducer, { "afield": u"это моя строчечка".encode("utf-32"), }, cooperator=self.cooperator, boundary=b"heyDavid") def test_failOnUnknownParams(self): """ If byte string is passed as a param and we don't know the encoding, fail early to prevent corrupted form posts """ # unknown key self.assertRaises( ValueError, MultiPartProducer, { (1, 2): BytesIO(b"yo"), }, cooperator=self.cooperator, boundary=b"heyDavid") # tuple length self.assertRaises( ValueError, MultiPartProducer, { "a": (1,), }, cooperator=self.cooperator, boundary=b"heyDavid") # unknown value type self.assertRaises( ValueError, MultiPartProducer, { "a": {"a": "b"}, }, cooperator=self.cooperator, boundary=b"heyDavid") def test_twoFields(self): """ Make sure multiple fields are rendered properly. 
""" output = self.getOutput( MultiPartProducer({ "afield": "just a string\r\n", "bfield": "another string" }, cooperator=self.cooperator, boundary=b"heyDavid")) self.assertEqual(self.newLines(b"""--heyDavid Content-Disposition: form-data; name="afield" just a string --heyDavid Content-Disposition: form-data; name="bfield" another string --heyDavid-- """), output) def test_fieldsAndAttachment(self): """ Make sure multiple fields are rendered properly. """ output, producer = self.getOutput( MultiPartProducer({ "bfield": "just a string\r\n", "cfield": "another string", "afield": ( "file name", "text/hello-world", FileBodyProducer( inputFile=BytesIO(b"my lovely bytes"), cooperator=self.cooperator)) }, cooperator=self.cooperator, boundary=b"heyDavid"), with_producer=True) expected = self.newLines(b"""--heyDavid Content-Disposition: form-data; name="bfield" just a string --heyDavid Content-Disposition: form-data; name="cfield" another string --heyDavid Content-Disposition: form-data; name="afield"; filename="file name" Content-Type: text/hello-world Content-Length: 15 my lovely bytes --heyDavid-- """) self.assertEqual(producer.length, len(expected)) self.assertEqual(output, expected) def test_multipleFieldsAndAttachments(self): """ Make sure multiple fields, attachments etc are rendered properly. 
""" output, producer = self.getOutput( MultiPartProducer({ "cfield": "just a string\r\n", "bfield": "another string", "efield": ( "ef", "text/html", FileBodyProducer( inputFile=BytesIO(b"my lovely bytes2"), cooperator=self.cooperator)), "xfield": ( "xf", "text/json", FileBodyProducer( inputFile=BytesIO(b"my lovely bytes219"), cooperator=self.cooperator)), "afield": ( "af", "text/xml", FileBodyProducer( inputFile=BytesIO(b"my lovely bytes22"), cooperator=self.cooperator)) }, cooperator=self.cooperator, boundary=b"heyDavid"), with_producer=True) expected = self.newLines(b"""--heyDavid Content-Disposition: form-data; name="bfield" another string --heyDavid Content-Disposition: form-data; name="cfield" just a string --heyDavid Content-Disposition: form-data; name="afield"; filename="af" Content-Type: text/xml Content-Length: 17 my lovely bytes22 --heyDavid Content-Disposition: form-data; name="efield"; filename="ef" Content-Type: text/html Content-Length: 16 my lovely bytes2 --heyDavid Content-Disposition: form-data; name="xfield"; filename="xf" Content-Type: text/json Content-Length: 18 my lovely bytes219 --heyDavid-- """) self.assertEqual(producer.length, len(expected)) self.assertEqual(output, expected) def test_unicodeAttachmentName(self): """ Make sure unicode attachment names are supported. 
""" output, producer = self.getOutput( MultiPartProducer({ "field": ( u'Так себе имя.jpg', "image/jpeg", FileBodyProducer( inputFile=BytesIO(b"my lovely bytes"), cooperator=self.cooperator ) ) }, cooperator=self.cooperator, boundary=b"heyDavid"), with_producer=True) expected = self.newLines(u"""--heyDavid Content-Disposition: form-data; name="field"; filename="Так себе имя.jpg" Content-Type: image/jpeg Content-Length: 15 my lovely bytes --heyDavid-- """.encode("utf-8")) self.assertEqual(len(expected), producer.length) self.assertEqual(expected, output) def test_missingAttachmentName(self): """ Make sure attachments without names are supported """ output, producer = self.getOutput( MultiPartProducer({ "field": ( None, "image/jpeg", FileBodyProducer( inputFile=BytesIO(b"my lovely bytes"), cooperator=self.cooperator, ) ) }, cooperator=self.cooperator, boundary=b"heyDavid"), with_producer=True) expected = self.newLines(b"""--heyDavid Content-Disposition: form-data; name="field" Content-Type: image/jpeg Content-Length: 15 my lovely bytes --heyDavid-- """) self.assertEqual(len(expected), producer.length) self.assertEqual(expected, output) def test_newLinesInParams(self): """ Make sure we generate proper format even with newlines in attachments """ output = self.getOutput( MultiPartProducer({ "field": ( u'\r\noops.j\npg', "image/jp\reg\n", FileBodyProducer( inputFile=BytesIO(b"my lovely bytes"), cooperator=self.cooperator ) ) }, cooperator=self.cooperator, boundary=b"heyDavid" ) ) self.assertEqual(self.newLines(u"""--heyDavid Content-Disposition: form-data; name="field"; filename="oops.jpg" Content-Type: image/jpeg Content-Length: 15 my lovely bytes --heyDavid-- """.encode("utf-8")), output) def test_worksWithCgi(self): """ Make sure the stuff we generated actually parsed by python cgi """ output = self.getOutput( MultiPartProducer([ ("cfield", "just a string\r\n"), ("cfield", "another string"), ("efield", ('ef', "text/html", FileBodyProducer( inputFile=BytesIO(b"my 
lovely bytes2"), cooperator=self.cooperator, ))), ("xfield", ('xf', "text/json", FileBodyProducer( inputFile=BytesIO(b"my lovely bytes219"), cooperator=self.cooperator, ))), ("afield", ('af', "text/xml", FileBodyProducer( inputFile=BytesIO(b"my lovely bytes22"), cooperator=self.cooperator, ))) ], cooperator=self.cooperator, boundary=b"heyDavid" ) ) form = cgi.parse_multipart(BytesIO(output), {"boundary": b"heyDavid"}) self.assertEqual(set([b'just a string\r\n', b'another string']), set(form['cfield'])) self.assertEqual(set([b'my lovely bytes2']), set(form['efield'])) self.assertEqual(set([b'my lovely bytes219']), set(form['xfield'])) self.assertEqual(set([b'my lovely bytes22']), set(form['afield'])) class LengthConsumerTestCase(unittest.TestCase): """ Tests for the _LengthConsumer, an L{IConsumer} which is used to compute the length of a produced content. """ def test_scalarsUpdateCounter(self): """ When a long or an int are written, _LengthConsumer updates its internal counter. """ consumer = _LengthConsumer() self.assertEqual(consumer.length, 0) consumer.write(long(1)) self.assertEqual(consumer.length, 1) consumer.write(2147483647) self.assertEqual(consumer.length, long(2147483648)) def test_stringUpdatesCounter(self): """ Use the written string length to update the internal counter """ a = (b"Cantami, o Diva, del Pelide Achille\n l'ira funesta che " b"infiniti addusse\n lutti agli Achei") consumer = _LengthConsumer() self.assertEqual(consumer.length, 0) consumer.write(a) self.assertEqual(consumer.length, 89) treq-18.6.0/src/treq/test/test_api.py0000644000076500000240000000304313106516575017743 0ustar glyphstaff00000000000000from __future__ import absolute_import, division import mock from twisted.trial.unittest import TestCase import treq from treq._utils import set_global_pool class TreqAPITests(TestCase): def setUp(self): set_global_pool(None) agent_patcher = mock.patch('treq.api.Agent') self.Agent = agent_patcher.start() self.addCleanup(agent_patcher.stop) 
        client_patcher = mock.patch('treq.api.HTTPClient')
        self.HTTPClient = client_patcher.start()
        self.addCleanup(client_patcher.stop)

        pool_patcher = mock.patch('treq._utils.HTTPConnectionPool')
        self.HTTPConnectionPool = pool_patcher.start()
        self.addCleanup(pool_patcher.stop)

        self.client = self.HTTPClient.return_value

    def test_default_pool(self):
        resp = treq.get('http://test.com')

        self.Agent.assert_called_once_with(
            mock.ANY,
            pool=self.HTTPConnectionPool.return_value
        )

        self.assertEqual(self.client.get.return_value, resp)

    def test_cached_pool(self):
        pool = self.HTTPConnectionPool.return_value

        treq.get('http://test.com')

        self.HTTPConnectionPool.return_value = mock.Mock()

        treq.get('http://test.com')

        self.Agent.assert_called_with(mock.ANY, pool=pool)

    def test_custom_agent(self):
        """
        A custom Agent is used if specified.
        """
        custom_agent = mock.Mock()
        treq.get('https://www.example.org/', agent=custom_agent)
        self.HTTPClient.assert_called_once_with(custom_agent)
treq-18.6.0/src/treq/test/test_client.py0000644000076500000240000004657113315340321020447 0ustar glyphstaff00000000000000# -*- encoding: utf-8 -*-
from io import BytesIO

import mock

from twisted.internet.defer import Deferred, succeed, CancelledError
from twisted.internet.protocol import Protocol
from twisted.python.failure import Failure
from twisted.trial.unittest import TestCase
from twisted.web.client import Agent, ResponseFailed
from twisted.web.http_headers import Headers

from treq.test.util import with_clock

from treq.client import (
    HTTPClient, _BodyBufferingProtocol, _BufferedResponse
)


class HTTPClientTests(TestCase):
    def setUp(self):
        self.agent = mock.Mock(Agent)
        self.client = HTTPClient(self.agent)

        self.fbp_patcher = mock.patch('treq.client.FileBodyProducer')
        self.FileBodyProducer = self.fbp_patcher.start()
        self.addCleanup(self.fbp_patcher.stop)

        self.mbp_patcher = mock.patch('treq.multipart.MultiPartProducer')
        self.MultiPartProducer = self.mbp_patcher.start()
        self.addCleanup(self.mbp_patcher.stop)

    def assertBody(self, expected):
        body = self.FileBodyProducer.mock_calls[0][1][0]
        self.assertEqual(body.read(), expected)

    def test_post(self):
        self.client.post('http://example.com/')
        self.agent.request.assert_called_once_with(
            b'POST', b'http://example.com/',
            Headers({b'accept-encoding': [b'gzip']}), None)

    def test_request_uri_idn(self):
        self.client.request('GET', u'http://č.net')
        self.agent.request.assert_called_once_with(
            b'GET', b'http://xn--bea.net',
            Headers({b'accept-encoding': [b'gzip']}), None)

    def test_request_case_insensitive_methods(self):
        self.client.request('gEt', 'http://example.com/')
        self.agent.request.assert_called_once_with(
            b'GET', b'http://example.com/',
            Headers({b'accept-encoding': [b'gzip']}), None)

    def test_request_query_params(self):
        self.client.request('GET', 'http://example.com/',
                            params={'foo': ['bar']})
        self.agent.request.assert_called_once_with(
            b'GET', b'http://example.com/?foo=bar',
            Headers({b'accept-encoding': [b'gzip']}), None)

    def test_request_tuple_query_values(self):
        self.client.request('GET', 'http://example.com/',
                            params={'foo': ('bar',)})
        self.agent.request.assert_called_once_with(
            b'GET', b'http://example.com/?foo=bar',
            Headers({b'accept-encoding': [b'gzip']}), None)

    def test_request_merge_query_params(self):
        self.client.request('GET', 'http://example.com/?baz=bax',
                            params={'foo': ['bar', 'baz']})
        self.agent.request.assert_called_once_with(
            b'GET', b'http://example.com/?baz=bax&foo=bar&foo=baz',
            Headers({b'accept-encoding': [b'gzip']}), None)

    def test_request_merge_tuple_query_params(self):
        self.client.request('GET', 'http://example.com/?baz=bax',
                            params=[('foo', 'bar')])
        self.agent.request.assert_called_once_with(
            b'GET', b'http://example.com/?baz=bax&foo=bar',
            Headers({b'accept-encoding': [b'gzip']}), None)

    def test_request_dict_single_value_query_params(self):
        self.client.request('GET', 'http://example.com/',
                            params={'foo': 'bar'})
        self.agent.request.assert_called_once_with(
            b'GET', b'http://example.com/?foo=bar',
            Headers({b'accept-encoding': [b'gzip']}), None)

    def test_request_data_dict(self):
        self.client.request('POST', 'http://example.com/',
                            data={'foo': ['bar', 'baz']})
        self.agent.request.assert_called_once_with(
            b'POST', b'http://example.com/',
            Headers({b'Content-Type': [b'application/x-www-form-urlencoded'],
                     b'accept-encoding': [b'gzip']}),
            self.FileBodyProducer.return_value)

        self.assertBody(b'foo=bar&foo=baz')

    def test_request_data_single_dict(self):
        self.client.request('POST', 'http://example.com/',
                            data={'foo': 'bar'})
        self.agent.request.assert_called_once_with(
            b'POST', b'http://example.com/',
            Headers({b'Content-Type': [b'application/x-www-form-urlencoded'],
                     b'accept-encoding': [b'gzip']}),
            self.FileBodyProducer.return_value)

        self.assertBody(b'foo=bar')

    def test_request_data_tuple(self):
        self.client.request('POST', 'http://example.com/',
                            data=[('foo', 'bar')])
        self.agent.request.assert_called_once_with(
            b'POST', b'http://example.com/',
            Headers({b'Content-Type': [b'application/x-www-form-urlencoded'],
                     b'accept-encoding': [b'gzip']}),
            self.FileBodyProducer.return_value)

        self.assertBody(b'foo=bar')

    def test_request_data_file(self):
        temp_fn = self.mktemp()

        with open(temp_fn, "wb") as temp_file:
            temp_file.write(b'hello')

        self.client.request('POST', 'http://example.com/',
                            data=open(temp_fn, 'rb'))
        self.agent.request.assert_called_once_with(
            b'POST', b'http://example.com/',
            Headers({b'accept-encoding': [b'gzip']}),
            self.FileBodyProducer.return_value)

        self.assertBody(b'hello')

    def test_request_json_dict(self):
        self.client.request('POST', 'http://example.com/',
                            json={'foo': 'bar'})
        self.agent.request.assert_called_once_with(
            b'POST', b'http://example.com/',
            Headers({b'Content-Type': [b'application/json; charset=UTF-8'],
                     b'accept-encoding': [b'gzip']}),
            self.FileBodyProducer.return_value)
        self.assertBody(b'{"foo":"bar"}')

    def test_request_json_tuple(self):
        self.client.request('POST', 'http://example.com/',
                            json=('foo', 1))
        self.agent.request.assert_called_once_with(
            b'POST',
            b'http://example.com/',
            Headers({b'Content-Type': [b'application/json; charset=UTF-8'],
                     b'accept-encoding': [b'gzip']}),
            self.FileBodyProducer.return_value)
        self.assertBody(b'["foo",1]')

    def test_request_json_number(self):
        self.client.request('POST', 'http://example.com/', json=1.)
        self.agent.request.assert_called_once_with(
            b'POST', b'http://example.com/',
            Headers({b'Content-Type': [b'application/json; charset=UTF-8'],
                     b'accept-encoding': [b'gzip']}),
            self.FileBodyProducer.return_value)
        self.assertBody(b'1.0')

    def test_request_json_string(self):
        self.client.request('POST', 'http://example.com/', json='hello')
        self.agent.request.assert_called_once_with(
            b'POST', b'http://example.com/',
            Headers({b'Content-Type': [b'application/json; charset=UTF-8'],
                     b'accept-encoding': [b'gzip']}),
            self.FileBodyProducer.return_value)
        self.assertBody(b'"hello"')

    def test_request_json_bool(self):
        self.client.request('POST', 'http://example.com/', json=True)
        self.agent.request.assert_called_once_with(
            b'POST', b'http://example.com/',
            Headers({b'Content-Type': [b'application/json; charset=UTF-8'],
                     b'accept-encoding': [b'gzip']}),
            self.FileBodyProducer.return_value)
        self.assertBody(b'true')

    def test_request_json_none(self):
        self.client.request('POST', 'http://example.com/', json=None)
        self.agent.request.assert_called_once_with(
            b'POST', b'http://example.com/',
            Headers({b'Content-Type': [b'application/json; charset=UTF-8'],
                     b'accept-encoding': [b'gzip']}),
            self.FileBodyProducer.return_value)
        self.assertBody(b'null')

    @mock.patch('treq.client.uuid.uuid4', mock.Mock(return_value="heyDavid"))
    def test_request_no_name_attachment(self):

        self.client.request(
            'POST', 'http://example.com/', files={"name": BytesIO(b"hello")})

        self.agent.request.assert_called_once_with(
            b'POST', b'http://example.com/',
            Headers({
                b'accept-encoding': [b'gzip'],
                b'Content-Type': [b'multipart/form-data; boundary=heyDavid']}),
            self.MultiPartProducer.return_value)

        FP = self.FileBodyProducer.return_value
        self.assertEqual(
            mock.call(
                [('name', (None, 'application/octet-stream', FP))],
                boundary=b'heyDavid'),
            self.MultiPartProducer.call_args)

    @mock.patch('treq.client.uuid.uuid4', mock.Mock(return_value="heyDavid"))
    def test_request_named_attachment(self):

        self.client.request(
            'POST', 'http://example.com/', files={
                "name": ('image.jpg', BytesIO(b"hello"))})

        self.agent.request.assert_called_once_with(
            b'POST', b'http://example.com/',
            Headers({
                b'accept-encoding': [b'gzip'],
                b'Content-Type': [b'multipart/form-data; boundary=heyDavid']}),
            self.MultiPartProducer.return_value)

        FP = self.FileBodyProducer.return_value
        self.assertEqual(
            mock.call(
                [('name', ('image.jpg', 'image/jpeg', FP))],
                boundary=b'heyDavid'),
            self.MultiPartProducer.call_args)

    @mock.patch('treq.client.uuid.uuid4', mock.Mock(return_value="heyDavid"))
    def test_request_named_attachment_and_ctype(self):

        self.client.request(
            'POST', 'http://example.com/', files={
                "name": ('image.jpg', 'text/plain', BytesIO(b"hello"))})

        self.agent.request.assert_called_once_with(
            b'POST', b'http://example.com/',
            Headers({
                b'accept-encoding': [b'gzip'],
                b'Content-Type': [b'multipart/form-data; boundary=heyDavid']}),
            self.MultiPartProducer.return_value)

        FP = self.FileBodyProducer.return_value
        self.assertEqual(
            mock.call(
                [('name', ('image.jpg', 'text/plain', FP))],
                boundary=b'heyDavid'),
            self.MultiPartProducer.call_args)

    @mock.patch('treq.client.uuid.uuid4', mock.Mock(return_value="heyDavid"))
    def test_request_mixed_params(self):

        class NamedFile(BytesIO):
            def __init__(self, val):
                BytesIO.__init__(self, val)
                self.name = "image.png"

        self.client.request(
            'POST', 'http://example.com/',
            data=[("a", "b"), ("key", "val")],
            files=[
                ("file1", ('image.jpg', BytesIO(b"hello"))),
                ("file2", NamedFile(b"yo"))])

        self.agent.request.assert_called_once_with(
            b'POST', b'http://example.com/',
            Headers({
                b'accept-encoding': [b'gzip'],
                b'Content-Type': [b'multipart/form-data; boundary=heyDavid']}),
            self.MultiPartProducer.return_value)

        FP = self.FileBodyProducer.return_value
        self.assertEqual(
            mock.call([
                ('a', 'b'),
                ('key', 'val'),
                ('file1', ('image.jpg', 'image/jpeg', FP)),
                ('file2', ('image.png', 'image/png', FP))],
                boundary=b'heyDavid'),
            self.MultiPartProducer.call_args)

    @mock.patch('treq.client.uuid.uuid4', mock.Mock(return_value="heyDavid"))
    def test_request_mixed_params_dict(self):

        self.client.request(
            'POST', 'http://example.com/',
            data={"key": "a", "key2": "b"},
            files={"file1": BytesIO(b"hey")})

        self.agent.request.assert_called_once_with(
            b'POST', b'http://example.com/',
            Headers({
                b'accept-encoding': [b'gzip'],
                b'Content-Type': [b'multipart/form-data; boundary=heyDavid']}),
            self.MultiPartProducer.return_value)

        FP = self.FileBodyProducer.return_value
        self.assertEqual(
            mock.call([
                ('key', 'a'),
                ('key2', 'b'),
                ('file1', (None, 'application/octet-stream', FP))],
                boundary=b'heyDavid'),
            self.MultiPartProducer.call_args)

    def test_request_unsupported_params_combination(self):
        self.assertRaises(ValueError,
                          self.client.request,
                          'POST', 'http://example.com/',
                          data=BytesIO(b"yo"),
                          files={"file1": BytesIO(b"hey")})

    def test_request_dict_headers(self):
        self.client.request('GET', 'http://example.com/', headers={
            'User-Agent': 'treq/0.1dev',
            'Accept': ['application/json', 'text/plain']
        })

        self.agent.request.assert_called_once_with(
            b'GET', b'http://example.com/',
            Headers({b'User-Agent': [b'treq/0.1dev'],
                     b'accept-encoding': [b'gzip'],
                     b'Accept': [b'application/json', b'text/plain']}),
            None)

    @with_clock
    def test_request_timeout_fired(self, clock):
        """
        Verify the request is cancelled if a response is not received
        within the specified timeout period.
""" self.agent.request.return_value = d = Deferred() self.client.request('GET', 'http://example.com', timeout=2) # simulate we haven't gotten a response within timeout seconds clock.advance(3) # a deferred should have been cancelled self.failureResultOf(d, CancelledError) @with_clock def test_request_timeout_cancelled(self, clock): """ Verify timeout is cancelled if a response is received before timeout period elapses. """ self.agent.request.return_value = d = Deferred() self.client.request('GET', 'http://example.com', timeout=2) # simulate a response d.callback(mock.Mock(code=200, headers=Headers({}))) # now advance the clock but since we already got a result, # a cancellation timer should have been cancelled clock.advance(3) self.successResultOf(d) def test_response_is_buffered(self): response = mock.Mock(deliverBody=mock.Mock(), headers=Headers({})) self.agent.request.return_value = succeed(response) d = self.client.get('http://www.example.com') result = self.successResultOf(d) protocol = mock.Mock(Protocol) result.deliverBody(protocol) self.assertEqual(response.deliverBody.call_count, 1) result.deliverBody(protocol) self.assertEqual(response.deliverBody.call_count, 1) def test_response_buffering_is_disabled_with_unbufferred_arg(self): response = mock.Mock(headers=Headers({})) self.agent.request.return_value = succeed(response) d = self.client.get('http://www.example.com', unbuffered=True) # YOLO public attribute. 
        self.assertEqual(self.successResultOf(d).original, response)

    def test_request_post_redirect_denied(self):
        response = mock.Mock(code=302, headers=Headers({'Location': ['/']}))
        self.agent.request.return_value = succeed(response)
        d = self.client.post('http://www.example.com')
        self.failureResultOf(d, ResponseFailed)

    def test_request_browser_like_redirects(self):
        response = mock.Mock(code=302, headers=Headers({'Location': ['/']}))

        self.agent.request.return_value = succeed(response)

        raw = mock.Mock(return_value=[])
        final_resp = mock.Mock(code=200, headers=mock.Mock(getRawHeaders=raw))

        with mock.patch('twisted.web.client.RedirectAgent._handleRedirect',
                        return_value=final_resp):
            d = self.client.post('http://www.google.com',
                                 browser_like_redirects=True,
                                 unbuffered=True)

        self.assertEqual(self.successResultOf(d).original, final_resp)


class BodyBufferingProtocolTests(TestCase):
    def test_buffers_data(self):
        buffer = []
        protocol = _BodyBufferingProtocol(
            mock.Mock(Protocol),
            buffer,
            None
        )

        protocol.dataReceived("foo")
        self.assertEqual(buffer, ["foo"])

        protocol.dataReceived("bar")
        self.assertEqual(buffer, ["foo", "bar"])

    def test_propagates_data_to_destination(self):
        destination = mock.Mock(Protocol)
        protocol = _BodyBufferingProtocol(
            destination,
            [],
            None
        )

        protocol.dataReceived(b"foo")
        destination.dataReceived.assert_called_once_with(b"foo")

        protocol.dataReceived(b"bar")
        destination.dataReceived.assert_called_with(b"bar")

    def test_fires_finished_deferred(self):
        finished = Deferred()
        protocol = _BodyBufferingProtocol(
            mock.Mock(Protocol),
            [],
            finished
        )

        class TestResponseDone(Exception):
            pass

        protocol.connectionLost(TestResponseDone())

        self.failureResultOf(finished, TestResponseDone)

    def test_propogates_connectionLost_reason(self):
        destination = mock.Mock(Protocol)
        protocol = _BodyBufferingProtocol(
            destination,
            [],
            Deferred().addErrback(lambda ign: None)
        )

        class TestResponseDone(Exception):
            pass

        reason = TestResponseDone()
        protocol.connectionLost(reason)
        destination.connectionLost.assert_called_once_with(reason)


class BufferedResponseTests(TestCase):
    def test_wraps_protocol(self):
        wrappers = []
        wrapped = mock.Mock(Protocol)
        response = mock.Mock(deliverBody=mock.Mock(wraps=wrappers.append))

        br = _BufferedResponse(response)

        br.deliverBody(wrapped)
        response.deliverBody.assert_called_once_with(wrappers[0])
        self.assertNotEqual(wrapped, wrappers[0])

    def test_concurrent_receivers(self):
        wrappers = []
        wrapped = mock.Mock(Protocol)
        unwrapped = mock.Mock(Protocol)
        response = mock.Mock(deliverBody=mock.Mock(wraps=wrappers.append))

        br = _BufferedResponse(response)

        br.deliverBody(wrapped)
        br.deliverBody(unwrapped)
        response.deliverBody.assert_called_once_with(wrappers[0])

        wrappers[0].dataReceived(b"foo")
        wrapped.dataReceived.assert_called_once_with(b"foo")

        self.assertEqual(unwrapped.dataReceived.call_count, 0)

        class TestResponseDone(Exception):
            pass

        done = Failure(TestResponseDone())

        wrappers[0].connectionLost(done)
        wrapped.connectionLost.assert_called_once_with(done)
        unwrapped.dataReceived.assert_called_once_with(b"foo")
        unwrapped.connectionLost.assert_called_once_with(done)

    def test_receiver_after_finished(self):
        wrappers = []
        finished = mock.Mock(Protocol)
        response = mock.Mock(deliverBody=mock.Mock(wraps=wrappers.append))

        br = _BufferedResponse(response)
        br.deliverBody(mock.Mock(Protocol))

        wrappers[0].dataReceived(b"foo")

        class TestResponseDone(Exception):
            pass

        done = Failure(TestResponseDone())

        wrappers[0].connectionLost(done)

        br.deliverBody(finished)
        finished.dataReceived.assert_called_once_with(b"foo")
        finished.connectionLost.assert_called_once_with(done)
treq-18.6.0/src/treq/test/test_testing.py0000644000076500000240000004267613221271655020655 0ustar glyphstaff00000000000000"""
In-memory treq returns stubbed responses.
""" from functools import partial from inspect import getmembers, isfunction from mock import ANY from six import text_type, binary_type from twisted.trial.unittest import TestCase from twisted.web.client import ResponseFailed from twisted.web.error import SchemeNotSupported from twisted.web.resource import Resource from twisted.web.server import NOT_DONE_YET from twisted.python.compat import _PY3 import treq from treq.testing import ( HasHeaders, RequestSequence, StringStubbingResource, StubTreq ) class _StaticTestResource(Resource): """Resource that always returns 418 "I'm a teapot""" isLeaf = True def render(self, request): request.setResponseCode(418) request.setHeader(b"x-teapot", b"teapot!") return b"I'm a teapot" class _NonResponsiveTestResource(Resource): """Resource that returns NOT_DONE_YET and never finishes the request""" isLeaf = True def render(self, request): return NOT_DONE_YET class _EventuallyResponsiveTestResource(Resource): """ Resource that returns NOT_DONE_YET and stores the request so that something else can finish the response later. """ isLeaf = True def render(self, request): self.stored_request = request return NOT_DONE_YET class StubbingTests(TestCase): """ Tests for :class:`StubTreq`. """ def test_stubtreq_provides_all_functions_in_treq_all(self): """ Every single function and attribute exposed by :obj:`treq.__all__` is provided by :obj:`StubTreq`. """ treq_things = [(name, obj) for name, obj in getmembers(treq) if name in treq.__all__] stub = StubTreq(_StaticTestResource()) api_things = [(name, obj) for name, obj in treq_things if obj.__module__ == "treq.api"] content_things = [(name, obj) for name, obj in treq_things if obj.__module__ == "treq.content"] # sanity checks - this test should fail if treq exposes a new API # without changes being made to StubTreq and this test. msg = ("At the time this test was written, StubTreq only knew about " "treq exposing functions from treq.api and treq.content. 
If " "this has changed, StubTreq will need to be updated, as will " "this test.") self.assertTrue(all(isfunction(obj) for name, obj in treq_things), msg) self.assertEqual(set(treq_things), set(api_things + content_things), msg) for name, obj in api_things: self.assertTrue( isfunction(getattr(stub, name, None)), "StubTreq.{0} should be a function.".format(name)) for name, obj in content_things: self.assertIs( getattr(stub, name, None), obj, "StubTreq.{0} should just expose treq.{0}".format(name)) def test_providing_resource_to_stub_treq(self): """ The resource provided to StubTreq responds to every request no matter what the URI or parameters or data. """ verbs = ('GET', 'PUT', 'HEAD', 'PATCH', 'DELETE', 'POST') urls = ( 'http://supports-http.com', 'https://supports-https.com', 'http://this/has/a/path/and/invalid/domain/name', 'https://supports-https.com:8080', 'http://supports-http.com:8080', ) params = (None, {}, {b'page': [1]}) headers = (None, {}, {b'x-random-header': [b'value', b'value2']}) data = (None, b"", b'some data', b'{"some": "json"}') stub = StubTreq(_StaticTestResource()) combos = ( (verb, {"url": url, "params": p, "headers": h, "data": d}) for verb in verbs for url in urls for p in params for h in headers for d in data ) for combo in combos: verb, kwargs = combo deferreds = (stub.request(verb, **kwargs), getattr(stub, verb.lower())(**kwargs)) for d in deferreds: resp = self.successResultOf(d) self.assertEqual(418, resp.code) self.assertEqual([b'teapot!'], resp.headers.getRawHeaders(b'x-teapot')) self.assertEqual(b"" if verb == "HEAD" else b"I'm a teapot", self.successResultOf(stub.content(resp))) def test_handles_invalid_schemes(self): """ Invalid URLs errback with a :obj:`SchemeNotSupported` failure, and does so even after a successful request. 
""" stub = StubTreq(_StaticTestResource()) self.failureResultOf(stub.get(""), SchemeNotSupported) self.successResultOf(stub.get("http://url.com")) self.failureResultOf(stub.get(""), SchemeNotSupported) def test_files_are_rejected(self): """ StubTreq does not handle files yet - it should reject requests which attempt to pass files. """ stub = StubTreq(_StaticTestResource()) self.assertRaises( AssertionError, stub.request, 'method', 'http://url', files=b'some file') def test_passing_in_strange_data_is_rejected(self): """ StubTreq rejects data that isn't list/dictionary/tuple/bytes/unicode. """ stub = StubTreq(_StaticTestResource()) self.assertRaises( AssertionError, stub.request, 'method', 'http://url', data=object()) self.successResultOf(stub.request('method', 'http://url', data={})) self.successResultOf(stub.request('method', 'http://url', data=[])) self.successResultOf(stub.request('method', 'http://url', data=())) self.successResultOf( stub.request('method', 'http://url', data=binary_type(b""))) self.successResultOf( stub.request('method', 'http://url', data=text_type(""))) def test_handles_failing_asynchronous_requests(self): """ Handle a resource returning NOT_DONE_YET and then canceling the request. """ stub = StubTreq(_NonResponsiveTestResource()) d = stub.request('method', 'http://url', data=b"1234") self.assertNoResult(d) d.cancel() self.failureResultOf(d, ResponseFailed) def test_handles_successful_asynchronous_requests(self): """ Handle a resource returning NOT_DONE_YET and then later finishing the response. """ rsrc = _EventuallyResponsiveTestResource() stub = StubTreq(rsrc) d = stub.request('method', 'http://example.com/', data=b"1234") self.assertNoResult(d) rsrc.stored_request.finish() stub.flush() resp = self.successResultOf(d) self.assertEqual(resp.code, 200) def test_handles_successful_asynchronous_requests_with_response_data(self): """ Handle a resource returning NOT_DONE_YET and then sending some data in the response. 
""" rsrc = _EventuallyResponsiveTestResource() stub = StubTreq(rsrc) d = stub.request('method', 'http://example.com/', data=b"1234") self.assertNoResult(d) chunks = [] rsrc.stored_request.write(b'spam ') rsrc.stored_request.write(b'eggs') stub.flush() resp = self.successResultOf(d) d = stub.collect(resp, chunks.append) self.assertNoResult(d) self.assertEqual(b''.join(chunks), b'spam eggs') rsrc.stored_request.finish() stub.flush() self.successResultOf(d) def test_handles_successful_asynchronous_requests_with_streaming(self): """ Handle a resource returning NOT_DONE_YET and then streaming data back gradually over time. """ rsrc = _EventuallyResponsiveTestResource() stub = StubTreq(rsrc) d = stub.request('method', 'http://example.com/', data="1234") self.assertNoResult(d) chunks = [] rsrc.stored_request.write(b'spam ') rsrc.stored_request.write(b'eggs') stub.flush() resp = self.successResultOf(d) d = stub.collect(resp, chunks.append) self.assertNoResult(d) self.assertEqual(b''.join(chunks), b'spam eggs') del chunks[:] rsrc.stored_request.write(b'eggs\r\nspam\r\n') stub.flush() self.assertNoResult(d) self.assertEqual(b''.join(chunks), b'eggs\r\nspam\r\n') rsrc.stored_request.finish() stub.flush() self.successResultOf(d) class HasHeadersTests(TestCase): """ Tests for :obj:`HasHeaders`. """ def test_equality_and_strict_subsets_succeed(self): """ The :obj:`HasHeaders` returns True if both sets of headers are equivalent, or the first is a strict subset of the second. """ self.assertEqual(HasHeaders({'one': ['two', 'three']}), {'one': ['two', 'three']}, "Equivalent headers do not match.") self.assertEqual(HasHeaders({'one': ['two', 'three']}), {'one': ['two', 'three', 'four'], 'ten': ['six']}, "Strict subset headers do not match") def test_partial_or_zero_intersection_subsets_fail(self): """ The :obj:`HasHeaders` returns False if both sets of headers overlap but the first is not a strict subset of the second. It also returns False if there is no overlap. 
""" self.assertNotEqual(HasHeaders({'one': ['two', 'three']}), {'one': ['three', 'four']}, "Partial value overlap matches") self.assertNotEqual(HasHeaders({'one': ['two', 'three']}), {'one': ['two']}, "Missing value matches") self.assertNotEqual(HasHeaders({'one': ['two', 'three']}), {'ten': ['six']}, "Complete inequality matches") def test_case_insensitive_keys(self): """ The :obj:`HasHeaders` equality function ignores the case of the header keys. """ self.assertEqual(HasHeaders({b'A': [b'1'], b'b': [b'2']}), {b'a': [b'1'], b'B': [b'2']}) def test_case_sensitive_values(self): """ The :obj:`HasHeaders` equality function does care about the case of the header value. """ self.assertNotEqual(HasHeaders({b'a': [b'a']}), {b'a': [b'A']}) def test_bytes_encoded_forms(self): """ The :obj:`HasHeaders` equality function compares the bytes-encoded forms of both sets of headers. """ self.assertEqual(HasHeaders({b'a': [b'a']}), {u'a': [u'a']}) self.assertEqual(HasHeaders({u'b': [u'b']}), {b'b': [b'b']}) def test_repr(self): """ :obj:`HasHeaders` returns a nice string repr. """ if _PY3: reprOutput = "HasHeaders({b'a': [b'b']})" else: reprOutput = "HasHeaders({'a': ['b']})" self.assertEqual(reprOutput, repr(HasHeaders({b'A': [b'b']}))) class StringStubbingTests(TestCase): """ Tests for :obj:`StringStubbingResource`. """ def _get_response_for(self, expected_args, response): """ Make a :obj:`IStringResponseStubs` that checks the expected args and returns the given response. """ method, url, params, headers, data = expected_args def get_response_for(_method, _url, _params, _headers, _data): self.assertEqual((method, url, params, data), (_method, _url, _params, _data)) self.assertEqual(HasHeaders(headers), _headers) return response return get_response_for def test_interacts_successfully_with_istub(self): """ The :obj:`IStringResponseStubs` is passed the correct parameters with which to evaluate the response, and the response is returned. 
""" resource = StringStubbingResource(self._get_response_for( (b'DELETE', 'http://what/a/thing', {b'page': [b'1']}, {b'x-header': [b'eh']}, b'datastr'), (418, {b'x-response': b'responseheader'}, b'response body'))) stub = StubTreq(resource) d = stub.delete('http://what/a/thing', headers={b'x-header': b'eh'}, params={b'page': b'1'}, data=b'datastr') resp = self.successResultOf(d) self.assertEqual(418, resp.code) self.assertEqual([b'responseheader'], resp.headers.getRawHeaders(b'x-response')) self.assertEqual(b'response body', self.successResultOf(stub.content(resp))) class RequestSequenceTests(TestCase): """ Tests for :obj:`RequestSequence`. """ def setUp(self): """ Set up a way to report failures asynchronously. """ self.async_failures = [] def test_mismatched_request_causes_failure(self): """ If a request is made that is not expected as the next request, causes a failure. """ sequence = RequestSequence( [((b'get', 'https://anything/', {b'1': [b'2']}, HasHeaders({b'1': [b'1']}), b'what'), (418, {}, b'body')), ((b'get', 'http://anything', {}, HasHeaders({b'2': [b'1']}), b'what'), (202, {}, b'deleted'))], async_failure_reporter=self.async_failures.append) stub = StubTreq(StringStubbingResource(sequence)) get = partial(stub.get, 'https://anything?1=2', data=b'what', headers={b'1': b'1'}) resp = self.successResultOf(get()) self.assertEqual(418, resp.code) self.assertEqual(b'body', self.successResultOf(stub.content(resp))) self.assertEqual([], self.async_failures) resp = self.successResultOf(get()) self.assertEqual(500, resp.code) self.assertEqual(1, len(self.async_failures)) self.assertIn("Expected the next request to be", self.async_failures[0]) self.assertFalse(sequence.consumed()) def test_unexpected_number_of_request_causes_failure(self): """ If there are no more expected requests, making a request causes a failure. 
""" sequence = RequestSequence( [], async_failure_reporter=self.async_failures.append) stub = StubTreq(StringStubbingResource(sequence)) d = stub.get('https://anything', data=b'what', headers={b'1': b'1'}) resp = self.successResultOf(d) self.assertEqual(500, resp.code) self.assertEqual(b'StubbingError', self.successResultOf(resp.content())) self.assertEqual(1, len(self.async_failures)) self.assertIn("No more requests expected, but request", self.async_failures[0]) # the expected requests have all been made self.assertTrue(sequence.consumed()) def test_works_with_mock_any(self): """ :obj:`mock.ANY` can be used with the request parameters. """ sequence = RequestSequence( [((ANY, ANY, ANY, ANY, ANY), (418, {}, b'body'))], async_failure_reporter=self.async_failures.append) stub = StubTreq(StringStubbingResource(sequence)) with sequence.consume(sync_failure_reporter=self.fail): d = stub.get('https://anything', data=b'what', headers={b'1': b'1'}) resp = self.successResultOf(d) self.assertEqual(418, resp.code) self.assertEqual(b'body', self.successResultOf(stub.content(resp))) self.assertEqual([], self.async_failures) # the expected requests have all been made self.assertTrue(sequence.consumed()) def test_consume_context_manager_fails_on_remaining_requests(self): """ If the `consume` context manager is used, if there are any remaining expecting requests, the test case will be failed. """ sequence = RequestSequence( [((ANY, ANY, ANY, ANY, ANY), (418, {}, b'body'))] * 2, async_failure_reporter=self.async_failures.append) stub = StubTreq(StringStubbingResource(sequence)) consume_failures = [] with sequence.consume(sync_failure_reporter=consume_failures.append): self.successResultOf(stub.get('https://anything', data=b'what', headers={b'1': b'1'})) self.assertEqual(1, len(consume_failures)) self.assertIn( "Not all expected requests were made. 
Still expecting:", consume_failures[0]) self.assertIn( "{0}(url={0}, params={0}, headers={0}, data={0})".format( repr(ANY)), consume_failures[0]) # no asynchronous failures (mismatches, etc.) self.assertEqual([], self.async_failures) def test_async_failures_logged(self): """ When no `async_failure_reporter` is passed async failures are logged by default. """ sequence = RequestSequence([]) stub = StubTreq(StringStubbingResource(sequence)) with sequence.consume(self.fail): self.successResultOf(stub.get('https://example.com')) [failure] = self.flushLoggedErrors() self.assertIsInstance(failure.value, AssertionError) treq-18.6.0/src/treq/_version.py0000644000076500000240000000037313315341106016767 0ustar glyphstaff00000000000000""" Provides treq version information. """ # This file is auto-generated! Do not edit! # Use `python -m incremental.update treq` to change this file. from incremental import Version __version__ = Version('treq', 18, 6, 0) __all__ = ["__version__"] treq-18.6.0/src/treq/client.py0000644000076500000240000002516213315340321016422 0ustar glyphstaff00000000000000from __future__ import absolute_import, division, print_function import mimetypes import uuid from io import BytesIO from twisted.internet.interfaces import IProtocol from twisted.internet.defer import Deferred from twisted.python.components import proxyForInterface from twisted.python.compat import _PY3, unicode from twisted.python.filepath import FilePath from twisted.python.url import URL from twisted.web.http import urlparse from twisted.web.http_headers import Headers from twisted.web.iweb import IBodyProducer, IResponse from twisted.web.client import ( FileBodyProducer, RedirectAgent, BrowserLikeRedirectAgent, ContentDecoderAgent, GzipDecoder, CookieAgent ) from twisted.python.components import registerAdapter from json import dumps as json_dumps from treq._utils import default_reactor from treq.auth import add_auth from treq import multipart from treq.response import _Response from 
requests.cookies import cookiejar_from_dict, merge_cookies if _PY3: from urllib.parse import urlunparse, urlencode as _urlencode def urlencode(query, doseq): return _urlencode(query, doseq).encode('ascii') from http.cookiejar import CookieJar else: from cookielib import CookieJar from urlparse import urlunparse from urllib import urlencode class _BodyBufferingProtocol(proxyForInterface(IProtocol)): def __init__(self, original, buffer, finished): self.original = original self.buffer = buffer self.finished = finished def dataReceived(self, data): self.buffer.append(data) self.original.dataReceived(data) def connectionLost(self, reason): self.original.connectionLost(reason) self.finished.errback(reason) class _BufferedResponse(proxyForInterface(IResponse)): def __init__(self, original): self.original = original self._buffer = [] self._waiters = [] self._waiting = None self._finished = False self._reason = None def _deliverWaiting(self, reason): self._reason = reason self._finished = True for waiter in self._waiters: for segment in self._buffer: waiter.dataReceived(segment) waiter.connectionLost(reason) def deliverBody(self, protocol): if self._waiting is None and not self._finished: self._waiting = Deferred() self._waiting.addBoth(self._deliverWaiting) self.original.deliverBody( _BodyBufferingProtocol( protocol, self._buffer, self._waiting ) ) elif self._finished: for segment in self._buffer: protocol.dataReceived(segment) protocol.connectionLost(self._reason) else: self._waiters.append(protocol) class HTTPClient(object): def __init__(self, agent, cookiejar=None, data_to_body_producer=IBodyProducer): self._agent = agent self._cookiejar = cookiejar or cookiejar_from_dict({}) self._data_to_body_producer = data_to_body_producer def get(self, url, **kwargs): """ See :func:`treq.get()`. """ return self.request('GET', url, **kwargs) def put(self, url, data=None, **kwargs): """ See :func:`treq.put()`. 
""" return self.request('PUT', url, data=data, **kwargs) def patch(self, url, data=None, **kwargs): """ See :func:`treq.patch()`. """ return self.request('PATCH', url, data=data, **kwargs) def post(self, url, data=None, **kwargs): """ See :func:`treq.post()`. """ return self.request('POST', url, data=data, **kwargs) def head(self, url, **kwargs): """ See :func:`treq.head()`. """ return self.request('HEAD', url, **kwargs) def delete(self, url, **kwargs): """ See :func:`treq.delete()`. """ return self.request('DELETE', url, **kwargs) def request(self, method, url, **kwargs): """ See :func:`treq.request()`. """ method = method.encode('ascii').upper() # Join parameters provided in the URL # and the ones passed as argument. params = kwargs.get('params') if params: url = _combine_query_params(url, params) if isinstance(url, unicode): url = URL.fromText(url).asURI().asText().encode('ascii') # Convert headers dictionary to # twisted raw headers format. headers = kwargs.get('headers') if headers: if isinstance(headers, dict): h = Headers({}) for k, v in headers.items(): if isinstance(v, (bytes, unicode)): h.addRawHeader(k, v) elif isinstance(v, list): h.setRawHeaders(k, v) headers = h else: headers = Headers({}) # Here we choose a right producer # based on the parameters passed in. bodyProducer = None data = kwargs.get('data') files = kwargs.get('files') # since json=None needs to be serialized as 'null', we need to # explicitly check kwargs for this key has_json = 'json' in kwargs if files: # If the files keyword is present we will issue a # multipart/form-data request as it suits better for cases # with files and/or large objects. 
files = list(_convert_files(files)) boundary = str(uuid.uuid4()).encode('ascii') headers.setRawHeaders( b'content-type', [ b'multipart/form-data; boundary=' + boundary]) if data: data = _convert_params(data) else: data = [] bodyProducer = multipart.MultiPartProducer( data + files, boundary=boundary) elif data: # Otherwise stick to x-www-form-urlencoded format # as it's generally faster for smaller requests. if isinstance(data, (dict, list, tuple)): headers.setRawHeaders( b'content-type', [b'application/x-www-form-urlencoded']) data = urlencode(data, doseq=True) bodyProducer = self._data_to_body_producer(data) elif has_json: # If data is sent as json, set Content-Type as 'application/json' headers.setRawHeaders( b'content-type', [b'application/json; charset=UTF-8']) content = kwargs['json'] json = json_dumps(content, separators=(u',', u':')).encode('utf-8') bodyProducer = self._data_to_body_producer(json) cookies = kwargs.get('cookies', {}) if not isinstance(cookies, CookieJar): cookies = cookiejar_from_dict(cookies) cookies = merge_cookies(self._cookiejar, cookies) wrapped_agent = CookieAgent(self._agent, cookies) if kwargs.get('allow_redirects', True): if kwargs.get('browser_like_redirects', False): wrapped_agent = BrowserLikeRedirectAgent(wrapped_agent) else: wrapped_agent = RedirectAgent(wrapped_agent) wrapped_agent = ContentDecoderAgent(wrapped_agent, [(b'gzip', GzipDecoder)]) auth = kwargs.get('auth') if auth: wrapped_agent = add_auth(wrapped_agent, auth) d = wrapped_agent.request( method, url, headers=headers, bodyProducer=bodyProducer) timeout = kwargs.get('timeout') if timeout: delayedCall = default_reactor(kwargs.get('reactor')).callLater( timeout, d.cancel) def gotResult(result): if delayedCall.active(): delayedCall.cancel() return result d.addBoth(gotResult) if not kwargs.get('unbuffered', False): d.addCallback(_BufferedResponse) return d.addCallback(_Response, cookies) def _convert_params(params): if hasattr(params, "iteritems"): return 
list(sorted(params.iteritems())) elif hasattr(params, "items"): return list(sorted(params.items())) elif isinstance(params, (tuple, list)): return list(params) else: raise ValueError("Unsupported format") def _convert_files(files): """Files can be passed in a variety of formats: * {'file': open("bla.f")} * {'file': (name, open("bla.f"))} * {'file': (name, content-type, open("bla.f"))} * Anything that has iteritems method, e.g. MultiDict: MultiDict([(name, open()), (name, open())] Our goal is to standardize it to unified form of: * [(param, (file name, content type, producer))] """ if hasattr(files, "iteritems"): files = files.iteritems() elif hasattr(files, "items"): files = files.items() for param, val in files: file_name, content_type, fobj = (None, None, None) if isinstance(val, tuple): if len(val) == 2: file_name, fobj = val elif len(val) == 3: file_name, content_type, fobj = val else: fobj = val if hasattr(fobj, "name"): file_name = FilePath(fobj.name).basename() if not content_type: content_type = _guess_content_type(file_name) yield (param, (file_name, content_type, IBodyProducer(fobj))) def _combine_query_params(url, params): parsed_url = urlparse(url.encode('ascii')) qs = [] if parsed_url.query: qs.extend([parsed_url.query, b'&']) qs.append(urlencode(params, doseq=True)) return urlunparse((parsed_url[0], parsed_url[1], parsed_url[2], parsed_url[3], b''.join(qs), parsed_url[5])) def _from_bytes(orig_bytes): return FileBodyProducer(BytesIO(orig_bytes)) def _from_file(orig_file): return FileBodyProducer(orig_file) def _guess_content_type(filename): if filename: guessed = mimetypes.guess_type(filename)[0] else: guessed = None return guessed or 'application/octet-stream' registerAdapter(_from_bytes, bytes, IBodyProducer) registerAdapter(_from_file, BytesIO, IBodyProducer) if not _PY3: from StringIO import StringIO registerAdapter(_from_file, StringIO, IBodyProducer) registerAdapter(_from_file, file, IBodyProducer) else: import io # file()/open() equiv on Py3 
registerAdapter(_from_file, io.BufferedReader, IBodyProducer) treq-18.6.0/src/treq/__init__.py0000644000076500000240000000063013061143557016706 0ustar glyphstaff00000000000000from __future__ import absolute_import, division, print_function from ._version import __version__ from treq.api import head, get, post, put, patch, delete, request from treq.content import collect, content, text_content, json_content __version__ = __version__.base() __all__ = ['head', 'get', 'post', 'put', 'patch', 'delete', 'request', 'collect', 'content', 'text_content', 'json_content'] treq-18.6.0/src/treq/response.py0000644000076500000240000000730213315340321016776 0ustar glyphstaff00000000000000from __future__ import absolute_import, division, print_function from twisted.python.components import proxyForInterface from twisted.web.iweb import IResponse, UNKNOWN_LENGTH from twisted.python import reflect from requests.cookies import cookiejar_from_dict from treq.content import collect, content, json_content, text_content class _Response(proxyForInterface(IResponse)): """ A wrapper for :class:`twisted.web.iweb.IResponse` which manages cookies and adds a few convenience methods. """ def __init__(self, original, cookiejar): self.original = original self._cookiejar = cookiejar def __repr__(self): """ Generate a representation of the response which includes the HTTP status code, Content-Type header, and body size, if available. """ if self.original.length == UNKNOWN_LENGTH: size = 'unknown size' else: size = '{:,d} bytes'.format(self.original.length) # Display non-ascii bits of the content-type header as backslash # escapes. content_type_bytes = b', '.join( self.original.headers.getRawHeaders(b'content-type', ())) content_type = repr(content_type_bytes).lstrip('b')[1:-1] return "<{} {} '{:.40s}' {}>".format( reflect.qual(self.__class__), self.original.code, content_type, size, ) def collect(self, collector): """ Incrementally collect the body of the response, per :func:`treq.collect()`. 
        :param collector: A single argument callable that will be called
            with chunks of body data as it is received.

        :returns: A `Deferred` that fires when the entire body has been
            received.
        """
        return collect(self.original, collector)

    def content(self):
        """
        Read the entire body all at once, per :func:`treq.content()`.

        :returns: A `Deferred` that fires with a `bytes` object when the
            entire body has been received.
        """
        return content(self.original)

    def json(self, **kwargs):
        """
        Collect the response body as JSON per :func:`treq.json_content()`.

        :param kwargs: Any keyword arguments accepted by :py:func:`json.loads`

        :rtype: Deferred that fires with the decoded JSON when the entire
            body has been read.
        """
        return json_content(self.original, **kwargs)

    def text(self, encoding='ISO-8859-1'):
        """
        Read the entire body all at once as text, per
        :func:`treq.text_content()`.

        :rtype: A `Deferred` that fires with a unicode string when the
            entire body has been received.
        """
        return text_content(self.original, encoding)

    def history(self):
        """
        Get a list of all responses (such as intermediate redirects) that
        ultimately led to this response. The responses are ordered
        chronologically.

        :returns: A `list` of :class:`~treq.response._Response` objects
        """
        response = self
        history = []

        while response.previousResponse is not None:
            history.append(_Response(response.previousResponse,
                                     self._cookiejar))
            response = response.previousResponse

        history.reverse()
        return history

    def cookies(self):
        """
        Get a copy of this response's cookies.
:rtype: :class:`requests.cookies.RequestsCookieJar` """ jar = cookiejar_from_dict({}) if self._cookiejar is not None: for cookie in self._cookiejar: jar.set_cookie(cookie) return jar treq-18.6.0/src/treq/api.py0000644000076500000240000000671513106516575015736 0ustar glyphstaff00000000000000from __future__ import absolute_import, division, print_function from twisted.web.client import Agent from treq.client import HTTPClient from treq._utils import default_pool, default_reactor def head(url, **kwargs): """ Make a ``HEAD`` request. See :py:func:`treq.request` """ return _client(**kwargs).head(url, **kwargs) def get(url, headers=None, **kwargs): """ Make a ``GET`` request. See :py:func:`treq.request` """ return _client(**kwargs).get(url, headers=headers, **kwargs) def post(url, data=None, **kwargs): """ Make a ``POST`` request. See :py:func:`treq.request` """ return _client(**kwargs).post(url, data=data, **kwargs) def put(url, data=None, **kwargs): """ Make a ``PUT`` request. See :py:func:`treq.request` """ return _client(**kwargs).put(url, data=data, **kwargs) def patch(url, data=None, **kwargs): """ Make a ``PATCH`` request. See :py:func:`treq.request` """ return _client(**kwargs).patch(url, data=data, **kwargs) def delete(url, **kwargs): """ Make a ``DELETE`` request. See :py:func:`treq.request` """ return _client(**kwargs).delete(url, **kwargs) def request(method, url, **kwargs): """ Make an HTTP request. :param str method: HTTP method. Example: ``'GET'``, ``'HEAD'``. ``'PUT'``, ``'POST'``. :param str url: http or https URL, which may include query arguments. :param headers: Optional HTTP Headers to send with this request. :type headers: Headers or None :param params: Optional parameters to be append as the query string to the URL, any query string parameters in the URL already will be preserved. :type params: dict w/ str or list/tuple of str values, list of 2-tuples, or None. :param data: Optional request body. 
    :type data: str, file-like, IBodyProducer, or None

    :param json: Optional JSON-serializable content to pass in body.
    :type json: dict, list/tuple, int, string/unicode, bool, or None

    :param reactor: Optional twisted reactor.

    :param bool persistent: Use persistent HTTP connections.
        Default: ``True``

    :param bool allow_redirects: Follow HTTP redirects. Default: ``True``

    :param auth: HTTP Basic Authentication information.
    :type auth: tuple of ``('username', 'password')``.

    :param cookies: Cookies to send with this request. The HTTP kind, not
        the tasty kind.
    :type cookies: ``dict`` or ``cookielib.CookieJar``

    :param int timeout: Request timeout in seconds. If a response is not
        received within this timeframe, the connection is aborted with
        ``CancelledError``.

    :param bool browser_like_redirects: Use browser-like redirects
        (i.e. ignore RFC 2616 section 10.3 and follow redirects from
        POST requests). Default: ``False``

    :param bool unbuffered: Pass ``True`` to disable response buffering.
        By default treq buffers the entire response body in memory.

    :rtype: Deferred that fires with an IResponse provider.
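The ``params`` behavior described above (any query string already in the URL is preserved, and the new parameters are appended) can be sketched with the standard library, mirroring the ``_combine_query_params`` helper in ``treq.client``. The function name ``combine_query_params`` is illustrative, not part of treq's public API:

```python
from urllib.parse import urlencode, urlparse, urlunparse

# Sketch of the query-merging behavior: any query string already present
# in the URL is kept and the new parameters are appended after it.
def combine_query_params(url, params):
    parts = urlparse(url)
    pieces = [parts.query] if parts.query else []
    pieces.append(urlencode(params, doseq=True))
    return urlunparse(parts._replace(query='&'.join(pieces)))

assert (combine_query_params('http://example.com/?a=1', {'b': ['2', '3']})
        == 'http://example.com/?a=1&b=2&b=3')
assert (combine_query_params('http://example.com/path', {'q': 'x'})
        == 'http://example.com/path?q=x')
```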
""" return _client(**kwargs).request(method, url, **kwargs) # # Private API # def _client(*args, **kwargs): agent = kwargs.get('agent') if agent is None: reactor = default_reactor(kwargs.get('reactor')) pool = default_pool(reactor, kwargs.get('pool'), kwargs.get('persistent')) agent = Agent(reactor, pool=pool) return HTTPClient(agent) treq-18.6.0/src/treq/content.py0000644000076500000240000000722613315340321016617 0ustar glyphstaff00000000000000from __future__ import absolute_import, division, print_function import cgi import json from twisted.internet.defer import Deferred, succeed from twisted.internet.protocol import Protocol from twisted.web.client import ResponseDone from twisted.web.http import PotentialDataLoss def _encoding_from_headers(headers): content_types = headers.getRawHeaders(u'content-type') if content_types is None: return None # This seems to be the choice browsers make when encountering multiple # content-type headers. content_type, params = cgi.parse_header(content_types[-1]) if 'charset' in params: return params.get('charset').strip("'\"") if content_type == 'application/json': return 'UTF-8' class _BodyCollector(Protocol): def __init__(self, finished, collector): self.finished = finished self.collector = collector def dataReceived(self, data): self.collector(data) def connectionLost(self, reason): if reason.check(ResponseDone): self.finished.callback(None) elif reason.check(PotentialDataLoss): # http://twistedmatrix.com/trac/ticket/4840 self.finished.callback(None) else: self.finished.errback(reason) def collect(response, collector): """ Incrementally collect the body of the response. This function may only be called **once** for a given response. :param IResponse response: The HTTP response to collect the body from. :param collector: A callable to be called each time data is available from the response body. :type collector: single argument callable :rtype: Deferred that fires with None when the entire body has been read. 
""" if response.length == 0: return succeed(None) d = Deferred() response.deliverBody(_BodyCollector(d, collector)) return d def content(response): """ Read the contents of an HTTP response. This function may be called multiple times for a response, it uses a ``WeakKeyDictionary`` to cache the contents of the response. :param IResponse response: The HTTP Response to get the contents of. :rtype: Deferred that fires with the content as a str. """ _content = [] d = collect(response, _content.append) d.addCallback(lambda _: b''.join(_content)) return d def json_content(response, **kwargs): """ Read the contents of an HTTP response and attempt to decode it as JSON. This function relies on :py:func:`content` and so may be called more than once for a given response. :param IResponse response: The HTTP Response to get the contents of. :param kwargs: Any keyword arguments accepted by :py:func:`json.loads` :rtype: Deferred that fires with the decoded JSON. """ # RFC7159 (8.1): Default JSON character encoding is UTF-8 d = text_content(response, encoding='utf-8') d.addCallback(lambda text: json.loads(text, **kwargs)) return d def text_content(response, encoding='ISO-8859-1'): """ Read the contents of an HTTP response and decode it with an appropriate charset, which may be guessed from the ``Content-Type`` header. :param IResponse response: The HTTP Response to get the contents of. :param str encoding: A charset, such as ``UTF-8`` or ``ISO-8859-1``, used if the response does not specify an encoding. :rtype: Deferred that fires with a unicode string. """ def _decode_content(c): e = _encoding_from_headers(response.headers) if e is not None: return c.decode(e) return c.decode(encoding) d = content(response) d.addCallback(_decode_content) return d treq-18.6.0/src/treq/testing.py0000644000076500000240000004676713315340321016637 0ustar glyphstaff00000000000000# -*- coding: utf-8 -*- """ In-memory version of treq for testing. 
""" from __future__ import absolute_import, division, print_function from six import text_type, PY3 from contextlib import contextmanager from functools import wraps from twisted.test.proto_helpers import MemoryReactor from twisted.test import iosim from twisted.internet.address import IPv4Address from twisted.internet.defer import succeed from twisted.internet.interfaces import ISSLTransport from twisted.logger import Logger from twisted.python.failure import Failure from twisted.python.urlpath import URLPath from twisted.internet.endpoints import TCP4ClientEndpoint from twisted.web.client import Agent from twisted.web.error import SchemeNotSupported from twisted.web.iweb import IAgent, IAgentEndpointFactory, IBodyProducer from twisted.web.resource import Resource from twisted.web.server import Site from zope.interface import directlyProvides, implementer import treq from treq.client import HTTPClient import attr @implementer(IAgentEndpointFactory) @attr.s class _EndpointFactory(object): """ An endpoint factory used by :class:`RequestTraversalAgent`. :ivar reactor: The agent's reactor. :type reactor: :class:`MemoryReactor` """ reactor = attr.ib() def endpointForURI(self, uri): """ Create an endpoint that represents an in-memory connection to a URI. Note: This always creates a :class:`~twisted.internet.endpoints.TCP4ClientEndpoint` on the assumption :class:`RequestTraversalAgent` ignores everything about the endpoint but its port. :param uri: The URI to connect to. :type uri: :class:`~twisted.web.client.URI` :return: The endpoint. :rtype: An :class:`~twisted.internet.interfaces.IStreamClientEndpoint` provider. 
""" if uri.scheme not in {b'http', b'https'}: raise SchemeNotSupported("Unsupported scheme: %r" % (uri.scheme,)) return TCP4ClientEndpoint(self.reactor, "127.0.0.1", uri.port) @implementer(IAgent) class RequestTraversalAgent(object): """ :obj:`~twisted.web.iweb.IAgent` implementation that issues an in-memory request rather than going out to a real network socket. """ def __init__(self, rootResource): """ :param rootResource: The Twisted `IResource` at the root of the resource tree. """ self._memoryReactor = MemoryReactor() self._realAgent = Agent.usingEndpointFactory( reactor=self._memoryReactor, endpointFactory=_EndpointFactory(self._memoryReactor)) self._rootResource = rootResource self._pumps = set() def request(self, method, uri, headers=None, bodyProducer=None): """ Implement IAgent.request. """ # We want to use Agent to parse the HTTP response, so let's ask it to # make a request against our in-memory reactor. response = self._realAgent.request(method, uri, headers, bodyProducer) # If the request has already finished, just propagate the result. In # reality this would only happen in failure, but if the agent ever adds # a local cache this might be a success. already_called = [] def check_already_called(r): already_called.append(r) return r response.addBoth(check_already_called) if already_called: return response # That will try to establish an HTTP connection with the reactor's # connectTCP method, and MemoryReactor will place Agent's factory into # the tcpClients list. Alternately, it will try to establish an HTTPS # connection with the reactor's connectSSL method, and MemoryReactor # will place it into the sslClients list. We'll extract that. 
if PY3: scheme = URLPath.fromBytes(uri).scheme else: scheme = URLPath.fromString(uri).scheme host, port, factory, timeout, bindAddress = ( self._memoryReactor.tcpClients[-1]) serverAddress = IPv4Address('TCP', '127.0.0.1', port) clientAddress = IPv4Address('TCP', '127.0.0.1', 31337) # Create the protocol and fake transport for the client and server, # using the factory that was passed to the MemoryReactor for the # client, and a Site around our rootResource for the server. serverProtocol = Site(self._rootResource).buildProtocol(None) serverTransport = iosim.FakeTransport( serverProtocol, isServer=True, hostAddress=serverAddress, peerAddress=clientAddress) clientProtocol = factory.buildProtocol(None) clientTransport = iosim.FakeTransport( clientProtocol, isServer=False, hostAddress=clientAddress, peerAddress=serverAddress) if scheme == b"https": # Provide ISSLTransport on both transports, so everyone knows that # this is HTTPS. directlyProvides(serverTransport, ISSLTransport) directlyProvides(clientTransport, ISSLTransport) # Make a pump for wiring the client and server together. pump = iosim.connect( serverProtocol, serverTransport, clientProtocol, clientTransport) self._pumps.add(pump) return response def flush(self): """ Flush all data between pending client/server pairs. This is only necessary if a :obj:`Resource` under test returns :obj:`NOT_DONE_YET` from its ``render`` method, making a response asynchronous. In that case, after each write from the server, :meth:`flush()` must be called so the client can see it. """ old_pumps = self._pumps new_pumps = self._pumps = set() for p in old_pumps: p.flush() if p.clientIO.disconnected and p.serverIO.disconnected: continue new_pumps.add(p) @implementer(IBodyProducer) class _SynchronousProducer(object): """ A partial implementation of an :obj:`IBodyProducer` which produces its entire payload immediately. 
    There is no way to access an instance of this object from
    :obj:`RequestTraversalAgent` or :obj:`StubTreq`, or even a
    :obj:`Resource` passed to :obj:`StubTreq`.

    This does not implement the :func:`IBodyProducer.stopProducing`
    method, because that is very difficult to trigger (the request from
    `RequestTraversalAgent` would have to be canceled while it is still
    in the transmitting state), and the intent is to use
    `RequestTraversalAgent` to make synchronous requests.
    """

    def __init__(self, body):
        """
        Create a synchronous producer with some bytes.
        """
        self.body = body
        msg = ("StubTreq currently only supports url-encodable types, "
               "bytes, or unicode as data.")
        assert isinstance(body, (bytes, text_type)), msg
        if isinstance(body, text_type):
            self.body = body.encode('utf-8')
        self.length = len(self.body)

    def startProducing(self, consumer):
        """
        Immediately produce all data.
        """
        consumer.write(self.body)
        return succeed(None)


def _reject_files(f):
    """
    Decorator that rejects the 'files' keyword argument to the request
    functions, because that is not yet handled by StubTreq.
    """
    @wraps(f)
    def wrapper(*args, **kwargs):
        if 'files' in kwargs:
            raise AssertionError("StubTreq cannot handle files.")
        return f(*args, **kwargs)
    return wrapper


class StubTreq(object):
    """
    A fake version of the treq module that can be used for testing that
    provides all the function calls exposed in :obj:`treq.__all__`.
    """
    def __init__(self, resource):
        """
        Construct a client, and pass through client methods and/or
        treq.content functions.
        :param resource: A :obj:`Resource` object that provides the fake
            responses
        """
        _agent = RequestTraversalAgent(resource)
        _client = HTTPClient(agent=_agent,
                             data_to_body_producer=_SynchronousProducer)

        for function_name in treq.__all__:
            function = getattr(_client, function_name, None)
            if function is None:
                function = getattr(treq, function_name)
            else:
                function = _reject_files(function)

            setattr(self, function_name, function)

        self.flush = _agent.flush


class StringStubbingResource(Resource):
    """
    A resource that takes a callable with 5 parameters
    ``(method, url, params, headers, data)`` and returns
    ``(code, headers, body)``.

    The resource uses the callable to return a real response as a result
    of a request.

    The parameters for the callable are:

    - ``method``, the HTTP method as `bytes`.
    - ``url``, the full URL of the request as text.
    - ``params``, a dictionary of query parameters mapping query keys to
      lists of values (sorted alphabetically).
    - ``headers``, a dictionary of headers mapping header keys to a list
      of header values (sorted alphabetically).
    - ``data``, the request body as `bytes`.

    The callable must return a ``tuple`` of (code, headers, body) where
    the code is the HTTP status code, the headers is a dictionary of
    bytes (unlike the ``headers`` parameter, which is a dictionary of
    lists), and body is a string that will be returned as the response
    body.

    If there is a stubbing error, the return value is undefined (if an
    exception is raised, :obj:`~twisted.web.resource.Resource` will just
    eat it and return 500 in its place).  The callable, or whoever
    creates the callable, should have a way to handle error reporting.
    """
    isLeaf = True

    def __init__(self, get_response_for):
        """
        See :class:`StringStubbingResource`.
        """
        Resource.__init__(self)
        self._get_response_for = get_response_for

    def render(self, request):
        """
        Produce a response according to the stubs provided.
""" params = request.args headers = {} for k, v in request.requestHeaders.getAllRawHeaders(): headers[k] = v for dictionary in (params, headers): for k in dictionary: dictionary[k] = sorted(dictionary[k]) # The incoming request does not have the absoluteURI property, because # an incoming request is a IRequest, not an IClientRequest, so it # the absolute URI needs to be synthesized. # But request.URLPath() only returns the scheme and hostname, because # that is the URL for this resource (because this resource handles # everything from the root on down). # So we need to add the request.path (not request.uri, which includes # the query parameters) absoluteURI = str(request.URLPath().click(request.path)) status_code, headers, body = self._get_response_for( request.method, absoluteURI, params, headers, request.content.read()) request.setResponseCode(status_code) for k, v in headers.items(): request.setHeader(k, v) return body def _maybeEncode(someStr): """ Encode `someStr` to ASCII if required. """ if isinstance(someStr, text_type): return someStr.encode('ascii') return someStr def _maybeEncodeHeaders(headers): """ Convert a headers mapping to its bytes-encoded form. """ return {_maybeEncode(k).lower(): [_maybeEncode(v) for v in vs] for k, vs in headers.items()} class HasHeaders(object): """ Since Twisted adds headers to a request, such as the host and the content length, it's necessary to test whether request headers CONTAIN the expected headers (the ones that are not automatically added by Twisted). This wraps a set of headers, and can be used in an equality test against a superset if the provided headers. The headers keys are lowercased, and keys and values are compared in their bytes-encoded forms. Headers should be provided as a mapping from strings or bytes to a list of strings or bytes. 
""" def __init__(self, headers): self._headers = _maybeEncodeHeaders(headers) def __repr__(self): return "HasHeaders({0})".format(repr(self._headers)) def __eq__(self, other_headers): compare_to = _maybeEncodeHeaders(other_headers) return (set(self._headers.keys()).issubset(set(compare_to.keys())) and all([set(v).issubset(set(compare_to[k])) for k, v in self._headers.items()])) def __ne__(self, other_headers): return not self.__eq__(other_headers) class RequestSequence(object): """ For an example usage, see :meth:`RequestSequence.consume`. Takes a sequence of:: [((method, url, params, headers, data), (code, headers, body)), ...] Expects the requests to arrive in sequence order. If there are no more responses, or the request's parameters do not match the next item's expected request parameters, calls `sync_failure_reporter` or `async_failure_reporter`. For the expected request tuples: - ``method`` should be :class:`bytes` normalized to lowercase. - ``url`` should be a `str` normalized as per the `transformations in that (usually) preserve semantics `_. A URL to `http://something-that-looks-like-a-directory` would be normalized to `http://something-that-looks-like-a-directory/` and a URL to `http://something-that-looks-like-a-page/page.html` remains unchanged. - ``params`` is a dictionary mapping :class:`bytes` to :class:`list` of :class:`bytes`. - ``headers`` is a dictionary mapping :class:`bytes` to :class:`list` of :class:`bytes` -- note that :class:`twisted.web.client.Agent` may add its own headers which are not guaranteed to be present (for instance, `user-agent` or `content-length`), so it's better to use some kind of matcher like :class:`HasHeaders`. - ``data`` is a :class:`bytes`. For the response tuples: - ``code`` is an integer representing the HTTP status code to return. - ``headers`` is a dictionary mapping :class:`bytes` to :class:`bytes` or :class:`list` of :class:`bytes`. - ``body`` is a :class:`bytes`. 
    :ivar list sequence: A sequence of (request tuple, response tuple)
        two-tuples, as described above.

    :ivar async_failure_reporter: An optional callable that takes a
        :class:`str` message indicating a failure.  It's asynchronous because
        it cannot just raise an exception -- if it does,
        :meth:`Resource.render <twisted.web.resource.Resource.render>` will
        just convert that into a 500 response, and there will be no other
        failure reporting mechanism.

        When the `async_failure_reporter` parameter is not passed, async
        failures will be reported via a :class:`twisted.logger.Logger`
        instance, which Trial's test case classes
        (:class:`twisted.trial.unittest.TestCase` and
        :class:`~twisted.trial.unittest.SynchronousTestCase`) will translate
        into a test failure.

        .. note::

            Some versions of
            :class:`twisted.trial.unittest.SynchronousTestCase` report logged
            errors on the wrong test: see Twisted #9267.

        .. TODO Update the above note to say what version of
           SynchronousTestCase is fixed once Twisted >17.5.0 is released.

        When not subclassing Trial's test case classes, you must pass
        `async_failure_reporter` and implement equivalent behavior, or errors
        will pass silently.  For example::

            async_failures = []
            sequence_stubs = RequestSequence([...], async_failures.append)
            stub_treq = StubTreq(StringStubbingResource(sequence_stubs))
            # self = unittest.TestCase
            with sequence_stubs.consume(self.fail):
                stub_treq.get('http://fakeurl.com')

            self.assertEqual([], async_failures)
    """
    _log = Logger()

    def __init__(self, sequence, async_failure_reporter=None):
        self._sequence = sequence
        self._async_reporter = async_failure_reporter or self._log_async_error

    def _log_async_error(self, message):
        """
        The default async failure reporter -- see `async_failure_reporter`.
        Logs a failure which wraps an :exc:`AssertionError`.

        :param str message: Failure message
        """
        # Passing the message twice may look redundant, but Trial only
        # preserves the Failure, not the log message.
        self._log.failure(
            "RequestSequence async error: {message}",
            message=message,
            failure=Failure(AssertionError(message)),
        )

    def consumed(self):
        """
        :return: `bool` representing whether the entire sequence has been
            consumed.  This is useful in tests to assert that the expected
            requests have all been made.
        """
        return len(self._sequence) == 0

    @contextmanager
    def consume(self, sync_failure_reporter):
        """
        Usage::

            sequence_stubs = RequestSequence([...])
            stub_treq = StubTreq(StringStubbingResource(sequence_stubs))

            # self = twisted.trial.unittest.SynchronousTestCase
            with sequence_stubs.consume(self.fail):
                stub_treq.get('http://fakeurl.com')
                stub_treq.get('http://another-fake-url.com')

        If there are still remaining expected requests to be made in the
        sequence, fails the provided test case.

        :param sync_failure_reporter: A callable that takes a single message
            reporting failures.  This can just raise an exception -- it does
            not need to be asynchronous, since the exception would not be
            raised within a Resource.

        :return: a context manager that can be used to ensure all expected
            requests have been made.
        """
        yield
        if not self.consumed():
            sync_failure_reporter("\n".join(
                ["Not all expected requests were made.  Still expecting:"] +
                ["- {0}(url={1}, params={2}, headers={3}, data={4})".format(
                    *expected) for expected, _ in self._sequence]))

    def __call__(self, method, url, params, headers, data):
        """
        :return: the next response in the sequence, provided that the
            parameters match the next in the sequence.
""" if len(self._sequence) == 0: self._async_reporter( "No more requests expected, but request {0!r} made.".format( (method, url, params, headers, data))) return (500, {}, b"StubbingError") expected, response = self._sequence[0] e_method, e_url, e_params, e_headers, e_data = expected checks = [ (e_method == method.lower(), "method"), (e_url == url, "url"), (e_params == params, 'parameters'), (e_headers == headers, "headers"), (e_data == data, "data") ] mismatches = [param for success, param in checks if not success] if mismatches: self._async_reporter( "\nExpected the next request to be: {0!r}" "\nGot request : {1!r}\n" "\nMismatches: {2!r}" .format(expected, (method, url, params, headers, data), mismatches)) return (500, {}, b"StubbingError") self._sequence = self._sequence[1:] return response treq-18.6.0/src/treq/_utils.py0000644000076500000240000000165413061143557016455 0ustar glyphstaff00000000000000""" Strictly internal utilities. """ from __future__ import absolute_import, division, print_function from twisted.web.client import HTTPConnectionPool def default_reactor(reactor): """ Return the specified reactor or the default. """ if reactor is None: from twisted.internet import reactor return reactor _global_pool = [None] def get_global_pool(): return _global_pool[0] def set_global_pool(pool): _global_pool[0] = pool def default_pool(reactor, pool, persistent): """ Return the specified pool or a a pool with the specified reactor and persistence. 
""" reactor = default_reactor(reactor) if pool is not None: return pool if persistent is False: return HTTPConnectionPool(reactor, persistent=persistent) if get_global_pool() is None: set_global_pool(HTTPConnectionPool(reactor, persistent=True)) return get_global_pool() treq-18.6.0/src/treq.egg-info/0000755000076500000240000000000013315341523016263 5ustar glyphstaff00000000000000treq-18.6.0/src/treq.egg-info/PKG-INFO0000644000076500000240000000552013315341523017362 0ustar glyphstaff00000000000000Metadata-Version: 2.1 Name: treq Version: 18.6.0 Summary: A requests-like API built on top of twisted.web's Agent Home-page: https://github.com/twisted/treq Author: David Reid Author-email: dreid@dreid.org Maintainer: Amber Brown Maintainer-email: hawkowl@twistedmatrix.com License: MIT/X Description: treq ==== |pypi|_ |build|_ |coverage|_ ``treq`` is an HTTP library inspired by `requests `_ but written on top of `Twisted `_'s `Agents `_. It provides a simple, higher level API for making HTTP requests when using Twisted. .. code-block:: python >>> from treq import get >>> def done(response): ... print response.code ... reactor.stop() >>> get("http://www.github.com").addCallback(done) >>> from twisted.internet import reactor >>> reactor.run() 200 For more info `read the docs `_. Contribute ========== ``treq`` is hosted on `GitHub `_. Feel free to fork and send contributions over. Developing ========== Install dependencies: :: pip install treq[dev] Run Tests (unit & integration): :: trial treq Lint: :: pep8 treq pyflakes treq Build docs:: tox -e docs .. |build| image:: https://api.travis-ci.org/twisted/treq.svg?branch=master .. _build: https://travis-ci.org/twisted/treq .. |coverage| image:: https://codecov.io/github/twisted/treq/coverage.svg?branch=master .. _coverage: https://codecov.io/github/twisted/treq .. |pypi| image:: https://img.shields.io/pypi/v/treq.svg .. 
_pypi: https://pypi.python.org/pypi/treq Platform: UNKNOWN Classifier: Development Status :: 5 - Production/Stable Classifier: Intended Audience :: Developers Classifier: License :: OSI Approved :: MIT License Classifier: Operating System :: OS Independent Classifier: Framework :: Twisted Classifier: Programming Language :: Python Classifier: Programming Language :: Python :: 2.7 Classifier: Programming Language :: Python :: 3.4 Classifier: Programming Language :: Python :: 3.5 Classifier: Programming Language :: Python :: Implementation :: CPython Classifier: Programming Language :: Python :: Implementation :: PyPy Provides-Extra: dev treq-18.6.0/src/treq.egg-info/SOURCES.txt0000644000076500000240000000304413315341523020150 0ustar glyphstaff00000000000000.coveragerc LICENSE MANIFEST.in README.rst setup.cfg setup.py tox.ini tox2travis.py docs/Makefile docs/api.rst docs/conf.py docs/howto.rst docs/index.rst docs/make.bat docs/testing.rst docs/_static/.keepme docs/examples/_utils.py docs/examples/basic_auth.py docs/examples/basic_get.py docs/examples/basic_post.py docs/examples/disable_redirects.py docs/examples/download_file.py docs/examples/iresource.py docs/examples/query_params.py docs/examples/redirects.py docs/examples/response_history.py docs/examples/testing_seq.py docs/examples/using_cookies.py src/treq/__init__.py src/treq/_utils.py src/treq/_version.py src/treq/api.py src/treq/auth.py src/treq/client.py src/treq/content.py src/treq/multipart.py src/treq/response.py src/treq/testing.py src/treq.egg-info/PKG-INFO src/treq.egg-info/SOURCES.txt src/treq.egg-info/dependency_links.txt src/treq.egg-info/requires.txt src/treq.egg-info/top_level.txt src/treq/test/__init__.py src/treq/test/test_api.py src/treq/test/test_auth.py src/treq/test/test_client.py src/treq/test/test_content.py src/treq/test/test_multipart.py src/treq/test/test_response.py src/treq/test/test_testing.py src/treq/test/test_treq_integration.py src/treq/test/test_utils.py src/treq/test/util.py 
src/treq/test/local_httpbin/__init__.py src/treq/test/local_httpbin/child.py src/treq/test/local_httpbin/parent.py src/treq/test/local_httpbin/shared.py src/treq/test/local_httpbin/test/__init__.py src/treq/test/local_httpbin/test/test_child.py src/treq/test/local_httpbin/test/test_parent.py src/treq/test/local_httpbin/test/test_shared.pytreq-18.6.0/src/treq.egg-info/requires.txt0000644000076500000240000000015313315341523020662 0ustar glyphstaff00000000000000incremental requests>=2.1.0 six Twisted[tls]>=16.4.0 attrs [dev] mock pep8 pyflakes sphinx httpbin==0.5.0 treq-18.6.0/src/treq.egg-info/top_level.txt0000644000076500000240000000000513315341523021010 0ustar glyphstaff00000000000000treq treq-18.6.0/src/treq.egg-info/dependency_links.txt0000644000076500000240000000000113315341523022331 0ustar glyphstaff00000000000000
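The request-matching loop in `RequestSequence.__call__` above can be illustrated as a small, dependency-free sketch. `RequestSequenceSketch` and its `failure_reporter` callback are names invented here for illustration; the real `treq.testing.RequestSequence` additionally reports failures asynchronously through `twisted.logger` and is driven by `StringStubbingResource` rather than called directly.

```python
class RequestSequenceSketch(object):
    """Standalone sketch of treq.testing.RequestSequence's matching logic."""

    def __init__(self, sequence, failure_reporter):
        # sequence: [((method, url, params, headers, data),
        #             (code, headers, body)), ...]
        self._sequence = list(sequence)
        self._report = failure_reporter

    def consumed(self):
        # True once every expected request has been matched.
        return not self._sequence

    def __call__(self, method, url, params, headers, data):
        if not self._sequence:
            self._report("No more requests expected, but request {0!r} "
                         "made.".format((method, url, params, headers, data)))
            return (500, {}, b"StubbingError")
        expected, response = self._sequence[0]
        e_method, e_url, e_params, e_headers, e_data = expected
        # Compare each component; collect the names of those that differ.
        checks = [
            (e_method == method.lower(), "method"),
            (e_url == url, "url"),
            (e_params == params, "parameters"),
            (e_headers == headers, "headers"),
            (e_data == data, "data"),
        ]
        mismatches = [name for ok, name in checks if not ok]
        if mismatches:
            self._report("Mismatches: {0!r}".format(mismatches))
            return (500, {}, b"StubbingError")
        # Matched: consume this expectation and return its stubbed response.
        self._sequence.pop(0)
        return response


# Usage: a matching request yields the stubbed response and consumes
# the expectation; any further request is reported as a failure.
failures = []
seq = RequestSequenceSketch(
    [((b"get", "http://example.invalid/", {}, {}, b""), (200, {}, b"ok"))],
    failures.append)
response = seq(b"GET", "http://example.invalid/", {}, {}, b"")
```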