aiohttp-3.6.2/0000755000175100001650000000000013547410140013461 5ustar vstsdocker00000000000000aiohttp-3.6.2/CHANGES.rst0000644000175100001650000004202213547410117015267 0ustar vstsdocker00000000000000========= Changelog ========= .. You should *NOT* be adding new change log entries to this file, this file is managed by towncrier. You *may* edit previous change logs to fix problems like typo corrections or such. To add a new change log entry, please see https://pip.pypa.io/en/latest/development/#adding-a-news-entry we named the news folder "changes". WARNING: Don't drop the next directive! .. towncrier release notes start 3.6.2 (2019-10-09) ================== Features -------- - Made exceptions pickleable. Also changed the repr of some exceptions. `#4077 `_ - Use ``Iterable`` type hint instead of ``Sequence`` for ``Application`` *middleware* parameter. `#4125 `_ Bugfixes -------- - Reset the ``sock_read`` timeout each time data is received for a ``aiohttp.ClientResponse``. `#3808 `_ - Fix handling of expired cookies so they are not stored in CookieJar. `#4063 `_ - Fix misleading message in the string representation of ``ClientConnectorError``; ``self.ssl == None`` means default SSL context, not SSL disabled `#4097 `_ - Don't clobber HTTP status when using FileResponse. `#4106 `_ Improved Documentation ---------------------- - Added minimal required logging configuration to logging documentation. `#2469 `_ - Update docs to reflect proxy support. `#4100 `_ - Fix typo in code example in testing docs. `#4108 `_ Misc ---- - `#4102 `_ ---- 3.6.1 (2019-09-19) ================== Features -------- - Compatibility with Python 3.8. `#4056 `_ Bugfixes -------- - correct some exception string format `#4068 `_ - Emit a warning when ``ssl.OP_NO_COMPRESSION`` is unavailable because the runtime is built against an outdated OpenSSL. `#4052 `_ - Update multidict requirement to >= 4.5 `#4057 `_ Improved Documentation ---------------------- - Provide pytest-aiohttp namespace for pytest fixtures in docs. `#3723 `_ ---- 3.6.0 (2019-09-06) ================== Features -------- - Add support for Named Pipes (Site and Connector) under Windows. This feature requires Proactor event loop to work. `#3629 `_ - Removed ``Transfer-Encoding: chunked`` header from websocket responses to be compatible with more http proxy servers. `#3798 `_ - Accept non-GET request for starting websocket handshake on server side. `#3980 `_ Bugfixes -------- - Raise a ClientResponseError instead of an AssertionError for a blank HTTP Reason Phrase. `#3532 `_ - Fix an issue where cookies would sometimes not be set during a redirect. `#3576 `_ - Change normalize_path_middleware to use 308 redirect instead of 301. This behavior should prevent clients from being unable to use PUT/POST methods on endpoints that are redirected because of a trailing slash. `#3579 `_ - Drop the processed task from ``all_tasks()`` list early. It prevents logging about a task with unhandled exception when the server is used in conjunction with ``asyncio.run()``. `#3587 `_ - ``Signal`` type annotation changed from ``Signal[Callable[['TraceConfig'], Awaitable[None]]]`` to ``Signal[Callable[ClientSession, SimpleNamespace, ...]``. `#3595 `_ - Use sanitized URL as Location header in redirects `#3614 `_ - Improve typing annotations for multipart.py along with changes required by mypy in files that references multipart.py. `#3621 `_ - Close session created inside ``aiohttp.request`` when unhandled exception occurs `#3628 `_ - Cleanup per-chunk data in generic data read. 
Memory leak fixed. `#3631 `_ - Use correct type for add_view and family `#3633 `_ - Fix _keepalive field in __slots__ of ``RequestHandler``. `#3644 `_ - Properly handle ConnectionResetError, to silence the "Cannot write to closing transport" exception when clients disconnect uncleanly. `#3648 `_ - Suppress pytest warnings due to ``test_utils`` classes `#3660 `_ - Fix overshadowing of overlapped sub-application prefixes. `#3701 `_ - Fixed return type annotation for WSMessage.json() `#3720 `_ - Properly expose TooManyRedirects publicly as documented. `#3818 `_ - Fix missing brackets for IPv6 in proxy CONNECT request `#3841 `_ - Make the signature of ``aiohttp.test_utils.TestClient.request`` match ``asyncio.ClientSession.request`` according to the docs `#3852 `_ - Use correct style for re-exported imports, makes mypy ``--strict`` mode happy. `#3868 `_ - Fixed type annotation for add_view method of UrlDispatcher to accept any subclass of View `#3880 `_ - Made cython HTTP parser set Reason-Phrase of the response to an empty string if it is missing. `#3906 `_ - Add URL to the string representation of ClientResponseError. `#3959 `_ - Accept ``istr`` keys in ``LooseHeaders`` type hints. `#3976 `_ - Fixed race conditions in _resolve_host caching and throttling when tracing is enabled. `#4013 `_ - For URLs like "unix://localhost/..." set Host HTTP header to "localhost" instead of "localhost:None". `#4039 `_ Improved Documentation ---------------------- - Modify documentation for Background Tasks to remove deprecated usage of event loop. `#3526 `_ - use ``if __name__ == '__main__':`` in server examples. `#3775 `_ - Update documentation reference to the default access logger. `#3783 `_ - Improve documentation for ``web.BaseRequest.path`` and ``web.BaseRequest.raw_path``. `#3791 `_ - Removed deprecation warning in tracing example docs `#3964 `_ ---- 3.5.4 (2019-01-12) ================== Bugfixes -------- - Fix stream ``.read()`` / ``.readany()`` / ``.iter_any()`` which used to return a partial content only in case of compressed content `#3525 `_ 3.5.3 (2019-01-10) ================== Bugfixes -------- - Fix type stubs for ``aiohttp.web.run_app(access_log=True)`` and fix edge case of ``access_log=True`` and the event loop being in debug mode. `#3504 `_ - Fix ``aiohttp.ClientTimeout`` type annotations to accept ``None`` for fields `#3511 `_ - Send custom per-request cookies even if session jar is empty `#3515 `_ - Restore Linux binary wheels publishing on PyPI ---- 3.5.2 (2019-01-08) ================== Features -------- - ``FileResponse`` from ``web_fileresponse.py`` uses a ``ThreadPoolExecutor`` to work with files asynchronously. I/O based payloads from ``payload.py`` uses a ``ThreadPoolExecutor`` to work with I/O objects asynchronously. `#3313 `_ - Internal Server Errors in plain text if the browser does not support HTML. `#3483 `_ Bugfixes -------- - Preserve MultipartWriter parts headers on write. Refactor the way how ``Payload.headers`` are handled. Payload instances now always have headers and Content-Type defined. Fix Payload Content-Disposition header reset after initial creation. `#3035 `_ - Log suppressed exceptions in ``GunicornWebWorker``. `#3464 `_ - Remove wildcard imports. `#3468 `_ - Use the same task for app initialization and web server handling in gunicorn workers. It allows to use Python3.7 context vars smoothly. 
`#3471 `_ - Fix handling of chunked+gzipped response when first chunk does not give uncompressed data `#3477 `_ - Replace ``collections.MutableMapping`` with ``collections.abc.MutableMapping`` to avoid a deprecation warning. `#3480 `_ - ``Payload.size`` type annotation changed from ``Optional[float]`` to ``Optional[int]``. `#3484 `_ - Ignore done tasks when cancels pending activities on ``web.run_app`` finalization. `#3497 `_ Improved Documentation ---------------------- - Add documentation for ``aiohttp.web.HTTPException``. `#3490 `_ Misc ---- - `#3487 `_ ---- 3.5.1 (2018-12-24) ==================== - Fix a regression about ``ClientSession._requote_redirect_url`` modification in debug mode. 3.5.0 (2018-12-22) ==================== Features -------- - The library type annotations are checked in strict mode now. - Add support for setting cookies for individual request (`#2387 `_) - Application.add_domain implementation (`#2809 `_) - The default ``app`` in the request returned by ``test_utils.make_mocked_request`` can now have objects assigned to it and retrieved using the ``[]`` operator. (`#3174 `_) - Make ``request.url`` accessible when transport is closed. (`#3177 `_) - Add ``zlib_executor_size`` argument to ``Response`` constructor to allow compression to run in a background executor to avoid blocking the main thread and potentially triggering health check failures. (`#3205 `_) - Enable users to set ``ClientTimeout`` in ``aiohttp.request`` (`#3213 `_) - Don't raise a warning if ``NETRC`` environment variable is not set and ``~/.netrc`` file doesn't exist. (`#3267 `_) - Add default logging handler to web.run_app If the ``Application.debug``` flag is set and the default logger ``aiohttp.access`` is used, access logs will now be output using a *stderr* ``StreamHandler`` if no handlers are attached. Furthermore, if the default logger has no log level set, the log level will be set to ``DEBUG``. (`#3324 `_) - Add method argument to ``session.ws_connect()``. Sometimes server API requires a different HTTP method for WebSocket connection establishment. For example, ``Docker exec`` needs POST. (`#3378 `_) - Create a task per request handling. (`#3406 `_) Bugfixes -------- - Enable passing ``access_log_class`` via ``handler_args`` (`#3158 `_) - Return empty bytes with end-of-chunk marker in empty stream reader. (`#3186 `_) - Accept ``CIMultiDictProxy`` instances for ``headers`` argument in ``web.Response`` constructor. (`#3207 `_) - Don't uppercase HTTP method in parser (`#3233 `_) - Make method match regexp RFC-7230 compliant (`#3235 `_) - Add ``app.pre_frozen`` state to properly handle startup signals in sub-applications. (`#3237 `_) - Enhanced parsing and validation of helpers.BasicAuth.decode. (`#3239 `_) - Change imports from collections module in preparation for 3.8. (`#3258 `_) - Ensure Host header is added first to ClientRequest to better replicate browser (`#3265 `_) - Fix forward compatibility with Python 3.8: importing ABCs directly from the collections module will not be supported anymore. (`#3273 `_) - Keep the query string by ``normalize_path_middleware``. (`#3278 `_) - Fix missing parameter ``raise_for_status`` for aiohttp.request() (`#3290 `_) - Bracket IPv6 addresses in the HOST header (`#3304 `_) - Fix default message for server ping and pong frames. (`#3308 `_) - Fix tests/test_connector.py typo and tests/autobahn/server.py duplicate loop def. 
(`#3337 `_) - Fix false-negative indicator end_of_HTTP_chunk in StreamReader.readchunk function (`#3361 `_) - Release HTTP response before raising status exception (`#3364 `_) - Fix task cancellation when ``sendfile()`` syscall is used by static file handling. (`#3383 `_) - Fix stack trace for ``asyncio.TimeoutError`` which was not logged, when it is caught in the handler. (`#3414 `_) Improved Documentation ---------------------- - Improve documentation of ``Application.make_handler`` parameters. (`#3152 `_) - Fix BaseRequest.raw_headers doc. (`#3215 `_) - Fix typo in TypeError exception reason in ``web.Application._handle`` (`#3229 `_) - Make server access log format placeholder %b documentation reflect behavior and docstring. (`#3307 `_) Deprecations and Removals ------------------------- - Deprecate modification of ``session.requote_redirect_url`` (`#2278 `_) - Deprecate ``stream.unread_data()`` (`#3260 `_) - Deprecated use of boolean in ``resp.enable_compression()`` (`#3318 `_) - Encourage creation of aiohttp public objects inside a coroutine (`#3331 `_) - Drop dead ``Connection.detach()`` and ``Connection.writer``. Both methods were broken for more than 2 years. (`#3358 `_) - Deprecate ``app.loop``, ``request.loop``, ``client.loop`` and ``connector.loop`` properties. (`#3374 `_) - Deprecate explicit debug argument. Use asyncio debug mode instead. (`#3381 `_) - Deprecate body parameter in HTTPException (and derived classes) constructor. (`#3385 `_) - Deprecate bare connector close, use ``async with connector:`` and ``await connector.close()`` instead. (`#3417 `_) - Deprecate obsolete ``read_timeout`` and ``conn_timeout`` in ``ClientSession`` constructor. (`#3438 `_) Misc ---- - #3341, #3351 aiohttp-3.6.2/CONTRIBUTORS.txt0000644000175100001650000000777213547410117016200 0ustar vstsdocker00000000000000- Contributors - ---------------- A. Jesse Jiryu Davis Adam Cooper Adam Mills Adrián Chaves Alan Tse Alec Hanefeld Alejandro Gómez Aleksandr Danshyn Aleksey Kutepov Alex Hayes Alex Key Alex Khomchenko Alex Kuzmenko Alex Lisovoy Alexander Bayandin Alexander Karpinsky Alexander Koshevoy Alexander Malev Alexander Mohr Alexander Shorin Alexander Travov Alexandru Mihai Alexey Firsov Alexey Popravka Alexey Stepanov Amin Etesamian Amit Tulshyan Amy Boyle Anders Melchiorsen Andrei Ursulenko Andrej Antonov Andrew Leech Andrew Lytvyn Andrew Svetlov Andrew Zhou Andrii Soldatenko Antoine Pietri Anton Kasyanov Anton Zhdan-Pushkin Arthur Darcet Ben Bader Ben Timby Benedikt Reinartz Boris Feld Boyi Chen Brett Cannon Brian C. Lane Brian Muller Bryan Kok Bryce Drennan Carl George Cecile Tonglet Chien-Wei Huang Chih-Yuan Chen Chris AtLee Chris Laws Chris Moore Christopher Schmitt Claudiu Popa Colin Dunklau Cong Xu Damien Nadé Dan Xu Daniel García Daniel Grossmann-Kavanagh Daniel Nelson Danny Song David Bibb David Michael Brown Denilson Amorim Denis Matiychuk Dennis Kliban Dima Veselov Dimitar Dimitrov Dmitriy Safonov Dmitry Doroshev Dmitry Lukashin Dmitry Marakasov Dmitry Shamov Dmitry Trofimov Dmytro Bohomiakov Dmytro Kuznetsov Dustin J. Mitchell Eduard Iskandarov Eli Ribble Elizabeth Leddy Enrique Saez Eric Sheng Erich Healy Eugene Chernyshov Eugene Naydenov Eugene Nikolaiev Eugene Tolmachev Evert Lammerts FichteFoll Florian Scheffler Frederik Gladhorn Frederik Peter Aalund Gabriel Tremblay Gennady Andreyev Georges Dubus Greg Holt Gregory Haynes Gus Goulart Gustavo Carneiro Günther Jena Hans Adema Harmon Y. 
Hu Bo Hugh Young Hugo Herter Hynek Schlawack Igor Alexandrov Igor Davydenko Igor Mozharovsky Igor Pavlov Ilya Chichak Ingmar Steen Jacob Champion Jaesung Lee Jake Davis Jakob Ackermann Jakub Wilk Jashandeep Sohi Jeongkyu Shin Jeroen van der Heijden Jesus Cea Jian Zeng Jinkyu Yi Joel Watts Jon Nabozny Jonas Obrist Joongi Kim Josep Cugat Joshu Coats Julia Tsemusheva Julien Duponchelle Jungkook Park Junjie Tao Justas Trimailovas Justin Foo Justin Turner Arthur Kay Zheng Kimmo Parviainen-Jalanko Kirill Klenov Kirill Malovitsa Kyrylo Perevozchikov Lars P. Søndergaard Louis-Philippe Huberdeau Loïc Lajeanne Lu Gong Lubomir Gelo Ludovic Gasc Luis Pedrosa Lukasz Marcin Dobrzanski Makc Belousow Manuel Miranda Marat Sharafutdinov Marco Paolini Mariano Anaya Martijn Pieters Martin Melka Martin Richard Mathias Fröjdman Mathieu Dugré Matthieu Hauglustaine Matthieu Rigal Michael Ihnatenko Mikhail Burshteyn Mikhail Kashkin Mikhail Lukyanchenko Mikhail Nacharov Misha Behersky Mitchell Ferree Morgan Delahaye-Prat Moss Collum Mun Gwan-gyeong Navid Sheikhol Nicolas Braem Nikolay Kim Nikolay Novik Oisin Aylward Olaf Conradi Pahaz Blinov Panagiotis Kolokotronis Pankaj Pandey Pau Freixes Paul Colomiets Paulius Šileikis Paulus Schoutsen Pavel Kamaev Pavel Polyakov Pawel Kowalski Pawel Miech Pepe Osca Philipp A. Pieter van Beek Rafael Viotti Raúl Cumplido Required Field Robert Lu Robert Nikolich Roman Podoliaka Samuel Colvin Sean Hunt Sebastian Acuna Sebastian Hanula Sebastian Hüther Sebastien Geffroy SeongSoo Cho Sergey Ninua Sergey Skripnick Serhii Kostel Simon Kennedy Sin-Woo Bang Stanislas Plum Stanislav Prokop Stefan Tjarks Stepan Pletnev Stephen Granade Steven Seguin Sunghyun Hwang Sviatoslav Bulbakha Sviatoslav Sydorenko Taha Jahangir Taras Voinarovskyi Terence Honles Thanos Lefteris Thijs Vermeir Thomas Forbes Thomas Grainger Tolga Tezel Tomasz Trebski Trinh Hoang Nhu Vadim Suharnikov Vaibhav Sagar Vamsi Krishna Avula Vasiliy Faronov Vasyl Baran Victor Kovtun Vikas Kawadia Viktor Danyliuk Ville Skyttä Vincent Maillol Vitalik Verhovodov Vitaly Haritonsky Vitaly Magerya Vladimir Kozlovski Vladimir Rutsky Vladimir Shulyak Vladimir Zakharov Vladyslav Bondar W. Trevor King Wei Lin Weiwei Wang Will McGugan Willem de Groot William Grzybowski Wilson Ong Yang Zhou Yannick Koechlin Yannick Péroux Ye Cao Yegor Roganov Yifei Kong Young-Ho Cha Yuriy Shatrov Yury Selivanov Yusuke Tsutsumi Марк Коренберг Семён Марьясин aiohttp-3.6.2/LICENSE.txt0000644000175100001650000002610413547410117015313 0ustar vstsdocker00000000000000Apache License Version 2.0, January 2004 http://www.apache.org/licenses/ TERMS AND CONDITIONS FOR USE, REPRODUCTION, AND DISTRIBUTION 1. Definitions. "License" shall mean the terms and conditions for use, reproduction, and distribution as defined by Sections 1 through 9 of this document. "Licensor" shall mean the copyright owner or entity authorized by the copyright owner that is granting the License. "Legal Entity" shall mean the union of the acting entity and all other entities that control, are controlled by, or are under common control with that entity. For the purposes of this definition, "control" means (i) the power, direct or indirect, to cause the direction or management of such entity, whether by contract or otherwise, or (ii) ownership of fifty percent (50%) or more of the outstanding shares, or (iii) beneficial ownership of such entity. "You" (or "Your") shall mean an individual or Legal Entity exercising permissions granted by this License. 
"Source" form shall mean the preferred form for making modifications, including but not limited to software source code, documentation source, and configuration files. "Object" form shall mean any form resulting from mechanical transformation or translation of a Source form, including but not limited to compiled object code, generated documentation, and conversions to other media types. "Work" shall mean the work of authorship, whether in Source or Object form, made available under the License, as indicated by a copyright notice that is included in or attached to the work (an example is provided in the Appendix below). "Derivative Works" shall mean any work, whether in Source or Object form, that is based on (or derived from) the Work and for which the editorial revisions, annotations, elaborations, or other modifications represent, as a whole, an original work of authorship. For the purposes of this License, Derivative Works shall not include works that remain separable from, or merely link (or bind by name) to the interfaces of, the Work and Derivative Works thereof. "Contribution" shall mean any work of authorship, including the original version of the Work and any modifications or additions to that Work or Derivative Works thereof, that is intentionally submitted to Licensor for inclusion in the Work by the copyright owner or by an individual or Legal Entity authorized to submit on behalf of the copyright owner. For the purposes of this definition, "submitted" means any form of electronic, verbal, or written communication sent to the Licensor or its representatives, including but not limited to communication on electronic mailing lists, source code control systems, and issue tracking systems that are managed by, or on behalf of, the Licensor for the purpose of discussing and improving the Work, but excluding communication that is conspicuously marked or otherwise designated in writing by the copyright owner as "Not a Contribution." "Contributor" shall mean Licensor and any individual or Legal Entity on behalf of whom a Contribution has been received by Licensor and subsequently incorporated within the Work. 2. Grant of Copyright License. Subject to the terms and conditions of this License, each Contributor hereby grants to You a perpetual, worldwide, non-exclusive, no-charge, royalty-free, irrevocable copyright license to reproduce, prepare Derivative Works of, publicly display, publicly perform, sublicense, and distribute the Work and such Derivative Works in Source or Object form. 3. Grant of Patent License. Subject to the terms and conditions of this License, each Contributor hereby grants to You a perpetual, worldwide, non-exclusive, no-charge, royalty-free, irrevocable (except as stated in this section) patent license to make, have made, use, offer to sell, sell, import, and otherwise transfer the Work, where such license applies only to those patent claims licensable by such Contributor that are necessarily infringed by their Contribution(s) alone or by combination of their Contribution(s) with the Work to which such Contribution(s) was submitted. If You institute patent litigation against any entity (including a cross-claim or counterclaim in a lawsuit) alleging that the Work or a Contribution incorporated within the Work constitutes direct or contributory patent infringement, then any patent licenses granted to You under this License for that Work shall terminate as of the date such litigation is filed. 4. Redistribution. 
You may reproduce and distribute copies of the Work or Derivative Works thereof in any medium, with or without modifications, and in Source or Object form, provided that You meet the following conditions: (a) You must give any other recipients of the Work or Derivative Works a copy of this License; and (b) You must cause any modified files to carry prominent notices stating that You changed the files; and (c) You must retain, in the Source form of any Derivative Works that You distribute, all copyright, patent, trademark, and attribution notices from the Source form of the Work, excluding those notices that do not pertain to any part of the Derivative Works; and (d) If the Work includes a "NOTICE" text file as part of its distribution, then any Derivative Works that You distribute must include a readable copy of the attribution notices contained within such NOTICE file, excluding those notices that do not pertain to any part of the Derivative Works, in at least one of the following places: within a NOTICE text file distributed as part of the Derivative Works; within the Source form or documentation, if provided along with the Derivative Works; or, within a display generated by the Derivative Works, if and wherever such third-party notices normally appear. The contents of the NOTICE file are for informational purposes only and do not modify the License. You may add Your own attribution notices within Derivative Works that You distribute, alongside or as an addendum to the NOTICE text from the Work, provided that such additional attribution notices cannot be construed as modifying the License. You may add Your own copyright statement to Your modifications and may provide additional or different license terms and conditions for use, reproduction, or distribution of Your modifications, or for any such Derivative Works as a whole, provided Your use, reproduction, and distribution of the Work otherwise complies with the conditions stated in this License. 5. Submission of Contributions. Unless You explicitly state otherwise, any Contribution intentionally submitted for inclusion in the Work by You to the Licensor shall be under the terms and conditions of this License, without any additional terms or conditions. Notwithstanding the above, nothing herein shall supersede or modify the terms of any separate license agreement you may have executed with Licensor regarding such Contributions. 6. Trademarks. This License does not grant permission to use the trade names, trademarks, service marks, or product names of the Licensor, except as required for reasonable and customary use in describing the origin of the Work and reproducing the content of the NOTICE file. 7. Disclaimer of Warranty. Unless required by applicable law or agreed to in writing, Licensor provides the Work (and each Contributor provides its Contributions) on an "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied, including, without limitation, any warranties or conditions of TITLE, NON-INFRINGEMENT, MERCHANTABILITY, or FITNESS FOR A PARTICULAR PURPOSE. You are solely responsible for determining the appropriateness of using or redistributing the Work and assume any risks associated with Your exercise of permissions under this License. 8. Limitation of Liability. 
In no event and under no legal theory, whether in tort (including negligence), contract, or otherwise, unless required by applicable law (such as deliberate and grossly negligent acts) or agreed to in writing, shall any Contributor be liable to You for damages, including any direct, indirect, special, incidental, or consequential damages of any character arising as a result of this License or out of the use or inability to use the Work (including but not limited to damages for loss of goodwill, work stoppage, computer failure or malfunction, or any and all other commercial damages or losses), even if such Contributor has been advised of the possibility of such damages. 9. Accepting Warranty or Additional Liability. While redistributing the Work or Derivative Works thereof, You may choose to offer, and charge a fee for, acceptance of support, warranty, indemnity, or other liability obligations and/or rights consistent with this License. However, in accepting such obligations, You may act only on Your own behalf and on Your sole responsibility, not on behalf of any other Contributor, and only if You agree to indemnify, defend, and hold each Contributor harmless for any liability incurred by, or claims asserted against, such Contributor by reason of your accepting any such warranty or additional liability. END OF TERMS AND CONDITIONS APPENDIX: How to apply the Apache License to your work. To apply the Apache License to your work, attach the following boilerplate notice, with the fields enclosed by brackets "{}" replaced with your own identifying information. (Don't include the brackets!) The text should be enclosed in the appropriate comment syntax for the file format. We also recommend that a file or class name and description of purpose be included on the same "printed page" as the copyright notice for easier identification within third-party archives. Copyright 2013-2019 Nikolay Kim and Andrew Svetlov Licensed under the Apache License, Version 2.0 (the "License"); you may not use this file except in compliance with the License. You may obtain a copy of the License at http://www.apache.org/licenses/LICENSE-2.0 Unless required by applicable law or agreed to in writing, software distributed under the License is distributed on an "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the License for the specific language governing permissions and limitations under the License. aiohttp-3.6.2/MANIFEST.in0000644000175100001650000000061213547410117015222 0ustar vstsdocker00000000000000include LICENSE.txt include CHANGES.rst include README.rst include CONTRIBUTORS.txt include Makefile graft aiohttp graft docs graft examples graft tests recursive-include vendor * global-include aiohttp *.pyi global-exclude *.pyc global-exclude *.pyd global-exclude *.so global-exclude *.lib global-exclude *.dll global-exclude *.a global-exclude *.obj exclude aiohttp/*.html prune docs/_build aiohttp-3.6.2/Makefile0000644000175100001650000000626613547410117015137 0ustar vstsdocker00000000000000# Some simple testing tasks (sorry, UNIX only). 
PYXS = $(wildcard aiohttp/*.pyx) SRC = aiohttp examples tests setup.py all: test .install-cython: pip install -r requirements/cython.txt touch .install-cython aiohttp/%.c: aiohttp/%.pyx cython -3 -o $@ $< -I aiohttp cythonize: .install-cython $(PYXS:.pyx=.c) .install-deps: cythonize $(shell find requirements -type f) pip install -r requirements/dev.txt @touch .install-deps isort: isort -rc $(SRC) flake: .flake .flake: .install-deps $(shell find aiohttp -type f) \ $(shell find tests -type f) \ $(shell find examples -type f) flake8 aiohttp examples tests python setup.py check -rms @if ! isort -c -rc aiohttp tests examples; then \ echo "Import sort errors, run 'make isort' to fix them!!!"; \ isort --diff -rc aiohttp tests examples; \ false; \ fi @if ! LC_ALL=C sort -c CONTRIBUTORS.txt; then \ echo "CONTRIBUTORS.txt sort error"; \ fi @touch .flake flake8: flake8 $(SRC) mypy: .flake mypy aiohttp isort-check: @if ! isort -rc --check-only $(SRC); then \ echo "Import sort errors, run 'make isort' to fix them!!!"; \ isort --diff -rc $(SRC); \ false; \ fi check_changes: ./tools/check_changes.py .develop: .install-deps $(shell find aiohttp -type f) .flake check_changes mypy # pip install -e . @touch .develop test: .develop @pytest -c pytest.ci.ini -q vtest: .develop @pytest -c pytest.ci.ini -s -v cov cover coverage: tox cov-dev: .develop @pytest -c pytest.ci.ini --cov-report=html @echo "open file://`pwd`/htmlcov/index.html" cov-ci-run: .develop @echo "Regular run" @pytest -c pytest.ci.ini --cov-report=html cov-dev-full: cov-ci-run @echo "open file://`pwd`/htmlcov/index.html" clean: @rm -rf `find . -name __pycache__` @rm -f `find . -type f -name '*.py[co]' ` @rm -f `find . -type f -name '*~' ` @rm -f `find . -type f -name '.*~' ` @rm -f `find . -type f -name '@*' ` @rm -f `find . -type f -name '#*#' ` @rm -f `find . -type f -name '*.orig' ` @rm -f `find . 
-type f -name '*.rej' ` @rm -f .coverage @rm -rf htmlcov @rm -rf build @rm -rf cover @make -C docs clean @python setup.py clean @rm -f aiohttp/_frozenlist.html @rm -f aiohttp/_frozenlist.c @rm -f aiohttp/_frozenlist.*.so @rm -f aiohttp/_frozenlist.*.pyd @rm -f aiohttp/_http_parser.html @rm -f aiohttp/_http_parser.c @rm -f aiohttp/_http_parser.*.so @rm -f aiohttp/_http_parser.*.pyd @rm -f aiohttp/_multidict.html @rm -f aiohttp/_multidict.c @rm -f aiohttp/_multidict.*.so @rm -f aiohttp/_multidict.*.pyd @rm -f aiohttp/_websocket.html @rm -f aiohttp/_websocket.c @rm -f aiohttp/_websocket.*.so @rm -f aiohttp/_websocket.*.pyd @rm -f aiohttp/_parser.html @rm -f aiohttp/_parser.c @rm -f aiohttp/_parser.*.so @rm -f aiohttp/_parser.*.pyd @rm -rf .tox @rm -f .develop @rm -f .flake @rm -f .install-deps @rm -rf aiohttp.egg-info doc: @make -C docs html SPHINXOPTS="-W -E" @echo "open file://`pwd`/docs/_build/html/index.html" doc-spelling: @make -C docs spelling SPHINXOPTS="-W -E" install: @pip install -U 'pip' @pip install -Ur requirements/dev.txt .PHONY: all build flake test vtest cov clean doc mypy aiohttp-3.6.2/PKG-INFO0000644000175100001650000007021313547410140014561 0ustar vstsdocker00000000000000Metadata-Version: 2.1 Name: aiohttp Version: 3.6.2 Summary: Async http client/server framework (asyncio) Home-page: https://github.com/aio-libs/aiohttp Author: Nikolay Kim Author-email: fafhrd91@gmail.com Maintainer: Nikolay Kim , Andrew Svetlov Maintainer-email: aio-libs@googlegroups.com License: Apache 2 Project-URL: Chat: Gitter, https://gitter.im/aio-libs/Lobby Project-URL: CI: AppVeyor, https://ci.appveyor.com/project/aio-libs/aiohttp Project-URL: CI: Circle, https://circleci.com/gh/aio-libs/aiohttp Project-URL: CI: Shippable, https://app.shippable.com/github/aio-libs/aiohttp Project-URL: CI: Travis, https://travis-ci.com/aio-libs/aiohttp Project-URL: Coverage: codecov, https://codecov.io/github/aio-libs/aiohttp Project-URL: Docs: RTD, https://docs.aiohttp.org Project-URL: GitHub: issues, https://github.com/aio-libs/aiohttp/issues Project-URL: GitHub: repo, https://github.com/aio-libs/aiohttp Description: ================================== Async http client/server framework ================================== .. image:: https://raw.githubusercontent.com/aio-libs/aiohttp/master/docs/_static/aiohttp-icon-128x128.png :height: 64px :width: 64px :alt: aiohttp logo | .. image:: https://travis-ci.com/aio-libs/aiohttp.svg?branch=master :target: https://travis-ci.com/aio-libs/aiohttp :align: right :alt: Travis status for master branch .. image:: https://ci.appveyor.com/api/projects/status/tnddy9k6pphl8w7k/branch/master?svg=true :target: https://ci.appveyor.com/project/aio-libs/aiohttp :align: right :alt: AppVeyor status for master branch .. image:: https://codecov.io/gh/aio-libs/aiohttp/branch/master/graph/badge.svg :target: https://codecov.io/gh/aio-libs/aiohttp :alt: codecov.io status for master branch .. image:: https://badge.fury.io/py/aiohttp.svg :target: https://pypi.org/project/aiohttp :alt: Latest PyPI package version .. image:: https://readthedocs.org/projects/aiohttp/badge/?version=latest :target: https://docs.aiohttp.org/ :alt: Latest Read The Docs .. image:: https://badges.gitter.im/Join%20Chat.svg :target: https://gitter.im/aio-libs/Lobby :alt: Chat on Gitter Key Features ============ - Supports both client and server side of HTTP protocol. - Supports both client and server Web-Sockets out-of-the-box and avoids Callback Hell. - Provides Web-server with middlewares and pluggable routing. 
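As a brief illustration of the middleware hook mentioned in the last feature above (a minimal sketch only — ``@web.middleware`` and ``Application(middlewares=...)`` are the public API, while the handler name and the ``X-Example`` header are purely illustrative):

.. code-block:: python

    from aiohttp import web

    @web.middleware
    async def example_middleware(request, handler):
        # code here runs before the matched handler is called
        response = await handler(request)
        # code here runs after the handler has produced a response
        response.headers['X-Example'] = 'middleware'
        return response

    # middlewares are applied in order to every request handled by the app
    app = web.Application(middlewares=[example_middleware])
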
Getting started =============== Client ------ To get something from the web: .. code-block:: python import aiohttp import asyncio async def fetch(session, url): async with session.get(url) as response: return await response.text() async def main(): async with aiohttp.ClientSession() as session: html = await fetch(session, 'http://python.org') print(html) if __name__ == '__main__': loop = asyncio.get_event_loop() loop.run_until_complete(main()) Server ------ An example using a simple server: .. code-block:: python # examples/server_simple.py from aiohttp import web async def handle(request): name = request.match_info.get('name', "Anonymous") text = "Hello, " + name return web.Response(text=text) async def wshandle(request): ws = web.WebSocketResponse() await ws.prepare(request) async for msg in ws: if msg.type == web.WSMsgType.text: await ws.send_str("Hello, {}".format(msg.data)) elif msg.type == web.WSMsgType.binary: await ws.send_bytes(msg.data) elif msg.type == web.WSMsgType.close: break return ws app = web.Application() app.add_routes([web.get('/', handle), web.get('/echo', wshandle), web.get('/{name}', handle)]) if __name__ == '__main__': web.run_app(app) Documentation ============= https://aiohttp.readthedocs.io/ Demos ===== https://github.com/aio-libs/aiohttp-demos External links ============== * `Third party libraries `_ * `Built with aiohttp `_ * `Powered by aiohttp `_ Feel free to make a Pull Request for adding your link to these pages! Communication channels ====================== *aio-libs* google group: https://groups.google.com/forum/#!forum/aio-libs Feel free to post your questions and ideas here. *gitter chat* https://gitter.im/aio-libs/Lobby We support `Stack Overflow `_. Please add *aiohttp* tag to your question there. Requirements ============ - Python >= 3.5.3 - async-timeout_ - attrs_ - chardet_ - multidict_ - yarl_ Optionally you may install the cChardet_ and aiodns_ libraries (highly recommended for sake of speed). .. _chardet: https://pypi.python.org/pypi/chardet .. _aiodns: https://pypi.python.org/pypi/aiodns .. _attrs: https://github.com/python-attrs/attrs .. _multidict: https://pypi.python.org/pypi/multidict .. _yarl: https://pypi.python.org/pypi/yarl .. _async-timeout: https://pypi.python.org/pypi/async_timeout .. _cChardet: https://pypi.python.org/pypi/cchardet License ======= ``aiohttp`` is offered under the Apache 2 license. Keepsafe ======== The aiohttp community would like to thank Keepsafe (https://www.getkeepsafe.com) for its support in the early days of the project. Source code =========== The latest developer version is available in a GitHub repository: https://github.com/aio-libs/aiohttp Benchmarks ========== If you are interested in efficiency, the AsyncIO community maintains a list of benchmarks on the official wiki: https://github.com/python/asyncio/wiki/Benchmarks ========= Changelog ========= .. You should *NOT* be adding new change log entries to this file, this file is managed by towncrier. You *may* edit previous change logs to fix problems like typo corrections or such. To add a new change log entry, please see https://pip.pypa.io/en/latest/development/#adding-a-news-entry we named the news folder "changes". WARNING: Don't drop the next directive! .. towncrier release notes start 3.6.2 (2019-10-09) ================== Features -------- - Made exceptions pickleable. Also changed the repr of some exceptions. `#4077 `_ - Use ``Iterable`` type hint instead of ``Sequence`` for ``Application`` *middleware* parameter. 
`#4125 `_ Bugfixes -------- - Reset the ``sock_read`` timeout each time data is received for a ``aiohttp.ClientResponse``. `#3808 `_ - Fix handling of expired cookies so they are not stored in CookieJar. `#4063 `_ - Fix misleading message in the string representation of ``ClientConnectorError``; ``self.ssl == None`` means default SSL context, not SSL disabled `#4097 `_ - Don't clobber HTTP status when using FileResponse. `#4106 `_ Improved Documentation ---------------------- - Added minimal required logging configuration to logging documentation. `#2469 `_ - Update docs to reflect proxy support. `#4100 `_ - Fix typo in code example in testing docs. `#4108 `_ Misc ---- - `#4102 `_ ---- 3.6.1 (2019-09-19) ================== Features -------- - Compatibility with Python 3.8. `#4056 `_ Bugfixes -------- - correct some exception string format `#4068 `_ - Emit a warning when ``ssl.OP_NO_COMPRESSION`` is unavailable because the runtime is built against an outdated OpenSSL. `#4052 `_ - Update multidict requirement to >= 4.5 `#4057 `_ Improved Documentation ---------------------- - Provide pytest-aiohttp namespace for pytest fixtures in docs. `#3723 `_ ---- 3.6.0 (2019-09-06) ================== Features -------- - Add support for Named Pipes (Site and Connector) under Windows. This feature requires Proactor event loop to work. `#3629 `_ - Removed ``Transfer-Encoding: chunked`` header from websocket responses to be compatible with more http proxy servers. `#3798 `_ - Accept non-GET request for starting websocket handshake on server side. `#3980 `_ Bugfixes -------- - Raise a ClientResponseError instead of an AssertionError for a blank HTTP Reason Phrase. `#3532 `_ - Fix an issue where cookies would sometimes not be set during a redirect. `#3576 `_ - Change normalize_path_middleware to use 308 redirect instead of 301. This behavior should prevent clients from being unable to use PUT/POST methods on endpoints that are redirected because of a trailing slash. `#3579 `_ - Drop the processed task from ``all_tasks()`` list early. It prevents logging about a task with unhandled exception when the server is used in conjunction with ``asyncio.run()``. `#3587 `_ - ``Signal`` type annotation changed from ``Signal[Callable[['TraceConfig'], Awaitable[None]]]`` to ``Signal[Callable[ClientSession, SimpleNamespace, ...]``. `#3595 `_ - Use sanitized URL as Location header in redirects `#3614 `_ - Improve typing annotations for multipart.py along with changes required by mypy in files that references multipart.py. `#3621 `_ - Close session created inside ``aiohttp.request`` when unhandled exception occurs `#3628 `_ - Cleanup per-chunk data in generic data read. Memory leak fixed. `#3631 `_ - Use correct type for add_view and family `#3633 `_ - Fix _keepalive field in __slots__ of ``RequestHandler``. `#3644 `_ - Properly handle ConnectionResetError, to silence the "Cannot write to closing transport" exception when clients disconnect uncleanly. `#3648 `_ - Suppress pytest warnings due to ``test_utils`` classes `#3660 `_ - Fix overshadowing of overlapped sub-application prefixes. `#3701 `_ - Fixed return type annotation for WSMessage.json() `#3720 `_ - Properly expose TooManyRedirects publicly as documented. `#3818 `_ - Fix missing brackets for IPv6 in proxy CONNECT request `#3841 `_ - Make the signature of ``aiohttp.test_utils.TestClient.request`` match ``asyncio.ClientSession.request`` according to the docs `#3852 `_ - Use correct style for re-exported imports, makes mypy ``--strict`` mode happy. 
`#3868 `_ - Fixed type annotation for add_view method of UrlDispatcher to accept any subclass of View `#3880 `_ - Made cython HTTP parser set Reason-Phrase of the response to an empty string if it is missing. `#3906 `_ - Add URL to the string representation of ClientResponseError. `#3959 `_ - Accept ``istr`` keys in ``LooseHeaders`` type hints. `#3976 `_ - Fixed race conditions in _resolve_host caching and throttling when tracing is enabled. `#4013 `_ - For URLs like "unix://localhost/..." set Host HTTP header to "localhost" instead of "localhost:None". `#4039 `_ Improved Documentation ---------------------- - Modify documentation for Background Tasks to remove deprecated usage of event loop. `#3526 `_ - use ``if __name__ == '__main__':`` in server examples. `#3775 `_ - Update documentation reference to the default access logger. `#3783 `_ - Improve documentation for ``web.BaseRequest.path`` and ``web.BaseRequest.raw_path``. `#3791 `_ - Removed deprecation warning in tracing example docs `#3964 `_ ---- 3.5.4 (2019-01-12) ================== Bugfixes -------- - Fix stream ``.read()`` / ``.readany()`` / ``.iter_any()`` which used to return a partial content only in case of compressed content `#3525 `_ 3.5.3 (2019-01-10) ================== Bugfixes -------- - Fix type stubs for ``aiohttp.web.run_app(access_log=True)`` and fix edge case of ``access_log=True`` and the event loop being in debug mode. `#3504 `_ - Fix ``aiohttp.ClientTimeout`` type annotations to accept ``None`` for fields `#3511 `_ - Send custom per-request cookies even if session jar is empty `#3515 `_ - Restore Linux binary wheels publishing on PyPI ---- 3.5.2 (2019-01-08) ================== Features -------- - ``FileResponse`` from ``web_fileresponse.py`` uses a ``ThreadPoolExecutor`` to work with files asynchronously. I/O based payloads from ``payload.py`` uses a ``ThreadPoolExecutor`` to work with I/O objects asynchronously. `#3313 `_ - Internal Server Errors in plain text if the browser does not support HTML. `#3483 `_ Bugfixes -------- - Preserve MultipartWriter parts headers on write. Refactor the way how ``Payload.headers`` are handled. Payload instances now always have headers and Content-Type defined. Fix Payload Content-Disposition header reset after initial creation. `#3035 `_ - Log suppressed exceptions in ``GunicornWebWorker``. `#3464 `_ - Remove wildcard imports. `#3468 `_ - Use the same task for app initialization and web server handling in gunicorn workers. It allows to use Python3.7 context vars smoothly. `#3471 `_ - Fix handling of chunked+gzipped response when first chunk does not give uncompressed data `#3477 `_ - Replace ``collections.MutableMapping`` with ``collections.abc.MutableMapping`` to avoid a deprecation warning. `#3480 `_ - ``Payload.size`` type annotation changed from ``Optional[float]`` to ``Optional[int]``. `#3484 `_ - Ignore done tasks when cancels pending activities on ``web.run_app`` finalization. `#3497 `_ Improved Documentation ---------------------- - Add documentation for ``aiohttp.web.HTTPException``. `#3490 `_ Misc ---- - `#3487 `_ ---- 3.5.1 (2018-12-24) ==================== - Fix a regression about ``ClientSession._requote_redirect_url`` modification in debug mode. 3.5.0 (2018-12-22) ==================== Features -------- - The library type annotations are checked in strict mode now. 
- Add support for setting cookies for individual request (`#2387 `_) - Application.add_domain implementation (`#2809 `_) - The default ``app`` in the request returned by ``test_utils.make_mocked_request`` can now have objects assigned to it and retrieved using the ``[]`` operator. (`#3174 `_) - Make ``request.url`` accessible when transport is closed. (`#3177 `_) - Add ``zlib_executor_size`` argument to ``Response`` constructor to allow compression to run in a background executor to avoid blocking the main thread and potentially triggering health check failures. (`#3205 `_) - Enable users to set ``ClientTimeout`` in ``aiohttp.request`` (`#3213 `_) - Don't raise a warning if ``NETRC`` environment variable is not set and ``~/.netrc`` file doesn't exist. (`#3267 `_) - Add default logging handler to web.run_app If the ``Application.debug``` flag is set and the default logger ``aiohttp.access`` is used, access logs will now be output using a *stderr* ``StreamHandler`` if no handlers are attached. Furthermore, if the default logger has no log level set, the log level will be set to ``DEBUG``. (`#3324 `_) - Add method argument to ``session.ws_connect()``. Sometimes server API requires a different HTTP method for WebSocket connection establishment. For example, ``Docker exec`` needs POST. (`#3378 `_) - Create a task per request handling. (`#3406 `_) Bugfixes -------- - Enable passing ``access_log_class`` via ``handler_args`` (`#3158 `_) - Return empty bytes with end-of-chunk marker in empty stream reader. (`#3186 `_) - Accept ``CIMultiDictProxy`` instances for ``headers`` argument in ``web.Response`` constructor. (`#3207 `_) - Don't uppercase HTTP method in parser (`#3233 `_) - Make method match regexp RFC-7230 compliant (`#3235 `_) - Add ``app.pre_frozen`` state to properly handle startup signals in sub-applications. (`#3237 `_) - Enhanced parsing and validation of helpers.BasicAuth.decode. (`#3239 `_) - Change imports from collections module in preparation for 3.8. (`#3258 `_) - Ensure Host header is added first to ClientRequest to better replicate browser (`#3265 `_) - Fix forward compatibility with Python 3.8: importing ABCs directly from the collections module will not be supported anymore. (`#3273 `_) - Keep the query string by ``normalize_path_middleware``. (`#3278 `_) - Fix missing parameter ``raise_for_status`` for aiohttp.request() (`#3290 `_) - Bracket IPv6 addresses in the HOST header (`#3304 `_) - Fix default message for server ping and pong frames. (`#3308 `_) - Fix tests/test_connector.py typo and tests/autobahn/server.py duplicate loop def. (`#3337 `_) - Fix false-negative indicator end_of_HTTP_chunk in StreamReader.readchunk function (`#3361 `_) - Release HTTP response before raising status exception (`#3364 `_) - Fix task cancellation when ``sendfile()`` syscall is used by static file handling. (`#3383 `_) - Fix stack trace for ``asyncio.TimeoutError`` which was not logged, when it is caught in the handler. (`#3414 `_) Improved Documentation ---------------------- - Improve documentation of ``Application.make_handler`` parameters. (`#3152 `_) - Fix BaseRequest.raw_headers doc. (`#3215 `_) - Fix typo in TypeError exception reason in ``web.Application._handle`` (`#3229 `_) - Make server access log format placeholder %b documentation reflect behavior and docstring. 
(`#3307 `_) Deprecations and Removals ------------------------- - Deprecate modification of ``session.requote_redirect_url`` (`#2278 `_) - Deprecate ``stream.unread_data()`` (`#3260 `_) - Deprecated use of boolean in ``resp.enable_compression()`` (`#3318 `_) - Encourage creation of aiohttp public objects inside a coroutine (`#3331 `_) - Drop dead ``Connection.detach()`` and ``Connection.writer``. Both methods were broken for more than 2 years. (`#3358 `_) - Deprecate ``app.loop``, ``request.loop``, ``client.loop`` and ``connector.loop`` properties. (`#3374 `_) - Deprecate explicit debug argument. Use asyncio debug mode instead. (`#3381 `_) - Deprecate body parameter in HTTPException (and derived classes) constructor. (`#3385 `_) - Deprecate bare connector close, use ``async with connector:`` and ``await connector.close()`` instead. (`#3417 `_) - Deprecate obsolete ``read_timeout`` and ``conn_timeout`` in ``ClientSession`` constructor. (`#3438 `_) Misc ---- - #3341, #3351 Platform: UNKNOWN Classifier: License :: OSI Approved :: Apache Software License Classifier: Intended Audience :: Developers Classifier: Programming Language :: Python Classifier: Programming Language :: Python :: 3 Classifier: Programming Language :: Python :: 3.5 Classifier: Programming Language :: Python :: 3.6 Classifier: Programming Language :: Python :: 3.7 Classifier: Development Status :: 5 - Production/Stable Classifier: Operating System :: POSIX Classifier: Operating System :: MacOS :: MacOS X Classifier: Operating System :: Microsoft :: Windows Classifier: Topic :: Internet :: WWW/HTTP Classifier: Framework :: AsyncIO Requires-Python: >=3.5.3 Provides-Extra: speedups aiohttp-3.6.2/README.rst0000644000175100001650000001143613547410117015161 0ustar vstsdocker00000000000000================================== Async http client/server framework ================================== .. image:: https://raw.githubusercontent.com/aio-libs/aiohttp/master/docs/_static/aiohttp-icon-128x128.png :height: 64px :width: 64px :alt: aiohttp logo | .. image:: https://travis-ci.com/aio-libs/aiohttp.svg?branch=master :target: https://travis-ci.com/aio-libs/aiohttp :align: right :alt: Travis status for master branch .. image:: https://ci.appveyor.com/api/projects/status/tnddy9k6pphl8w7k/branch/master?svg=true :target: https://ci.appveyor.com/project/aio-libs/aiohttp :align: right :alt: AppVeyor status for master branch .. image:: https://codecov.io/gh/aio-libs/aiohttp/branch/master/graph/badge.svg :target: https://codecov.io/gh/aio-libs/aiohttp :alt: codecov.io status for master branch .. image:: https://badge.fury.io/py/aiohttp.svg :target: https://pypi.org/project/aiohttp :alt: Latest PyPI package version .. image:: https://readthedocs.org/projects/aiohttp/badge/?version=latest :target: https://docs.aiohttp.org/ :alt: Latest Read The Docs .. image:: https://badges.gitter.im/Join%20Chat.svg :target: https://gitter.im/aio-libs/Lobby :alt: Chat on Gitter Key Features ============ - Supports both client and server side of HTTP protocol. - Supports both client and server Web-Sockets out-of-the-box and avoids Callback Hell. - Provides Web-server with middlewares and pluggable routing. Getting started =============== Client ------ To get something from the web: .. 
code-block:: python import aiohttp import asyncio async def fetch(session, url): async with session.get(url) as response: return await response.text() async def main(): async with aiohttp.ClientSession() as session: html = await fetch(session, 'http://python.org') print(html) if __name__ == '__main__': loop = asyncio.get_event_loop() loop.run_until_complete(main()) Server ------ An example using a simple server: .. code-block:: python # examples/server_simple.py from aiohttp import web async def handle(request): name = request.match_info.get('name', "Anonymous") text = "Hello, " + name return web.Response(text=text) async def wshandle(request): ws = web.WebSocketResponse() await ws.prepare(request) async for msg in ws: if msg.type == web.WSMsgType.text: await ws.send_str("Hello, {}".format(msg.data)) elif msg.type == web.WSMsgType.binary: await ws.send_bytes(msg.data) elif msg.type == web.WSMsgType.close: break return ws app = web.Application() app.add_routes([web.get('/', handle), web.get('/echo', wshandle), web.get('/{name}', handle)]) if __name__ == '__main__': web.run_app(app) Documentation ============= https://aiohttp.readthedocs.io/ Demos ===== https://github.com/aio-libs/aiohttp-demos External links ============== * `Third party libraries `_ * `Built with aiohttp `_ * `Powered by aiohttp `_ Feel free to make a Pull Request for adding your link to these pages! Communication channels ====================== *aio-libs* google group: https://groups.google.com/forum/#!forum/aio-libs Feel free to post your questions and ideas here. *gitter chat* https://gitter.im/aio-libs/Lobby We support `Stack Overflow `_. Please add *aiohttp* tag to your question there. Requirements ============ - Python >= 3.5.3 - async-timeout_ - attrs_ - chardet_ - multidict_ - yarl_ Optionally you may install the cChardet_ and aiodns_ libraries (highly recommended for sake of speed). .. _chardet: https://pypi.python.org/pypi/chardet .. _aiodns: https://pypi.python.org/pypi/aiodns .. _attrs: https://github.com/python-attrs/attrs .. _multidict: https://pypi.python.org/pypi/multidict .. _yarl: https://pypi.python.org/pypi/yarl .. _async-timeout: https://pypi.python.org/pypi/async_timeout .. _cChardet: https://pypi.python.org/pypi/cchardet License ======= ``aiohttp`` is offered under the Apache 2 license. Keepsafe ======== The aiohttp community would like to thank Keepsafe (https://www.getkeepsafe.com) for its support in the early days of the project. Source code =========== The latest developer version is available in a GitHub repository: https://github.com/aio-libs/aiohttp Benchmarks ========== If you are interested in efficiency, the AsyncIO community maintains a list of benchmarks on the official wiki: https://github.com/python/asyncio/wiki/Benchmarks aiohttp-3.6.2/aiohttp/0000755000175100001650000000000013547410140015131 5ustar vstsdocker00000000000000aiohttp-3.6.2/aiohttp/__init__.py0000644000175100001650000002001113547410117017240 0ustar vstsdocker00000000000000__version__ = '3.6.2' from typing import Tuple # noqa from . 
import hdrs as hdrs from .client import BaseConnector as BaseConnector from .client import ClientConnectionError as ClientConnectionError from .client import ( ClientConnectorCertificateError as ClientConnectorCertificateError, ) from .client import ClientConnectorError as ClientConnectorError from .client import ClientConnectorSSLError as ClientConnectorSSLError from .client import ClientError as ClientError from .client import ClientHttpProxyError as ClientHttpProxyError from .client import ClientOSError as ClientOSError from .client import ClientPayloadError as ClientPayloadError from .client import ClientProxyConnectionError as ClientProxyConnectionError from .client import ClientRequest as ClientRequest from .client import ClientResponse as ClientResponse from .client import ClientResponseError as ClientResponseError from .client import ClientSession as ClientSession from .client import ClientSSLError as ClientSSLError from .client import ClientTimeout as ClientTimeout from .client import ClientWebSocketResponse as ClientWebSocketResponse from .client import ContentTypeError as ContentTypeError from .client import Fingerprint as Fingerprint from .client import InvalidURL as InvalidURL from .client import NamedPipeConnector as NamedPipeConnector from .client import RequestInfo as RequestInfo from .client import ServerConnectionError as ServerConnectionError from .client import ServerDisconnectedError as ServerDisconnectedError from .client import ServerFingerprintMismatch as ServerFingerprintMismatch from .client import ServerTimeoutError as ServerTimeoutError from .client import TCPConnector as TCPConnector from .client import TooManyRedirects as TooManyRedirects from .client import UnixConnector as UnixConnector from .client import WSServerHandshakeError as WSServerHandshakeError from .client import request as request from .cookiejar import CookieJar as CookieJar from .cookiejar import DummyCookieJar as DummyCookieJar from .formdata import FormData as FormData from .helpers import BasicAuth as BasicAuth from .helpers import ChainMapProxy as ChainMapProxy from .http import HttpVersion as HttpVersion from .http import HttpVersion10 as HttpVersion10 from .http import HttpVersion11 as HttpVersion11 from .http import WebSocketError as WebSocketError from .http import WSCloseCode as WSCloseCode from .http import WSMessage as WSMessage from .http import WSMsgType as WSMsgType from .multipart import ( BadContentDispositionHeader as BadContentDispositionHeader, ) from .multipart import BadContentDispositionParam as BadContentDispositionParam from .multipart import BodyPartReader as BodyPartReader from .multipart import MultipartReader as MultipartReader from .multipart import MultipartWriter as MultipartWriter from .multipart import ( content_disposition_filename as content_disposition_filename, ) from .multipart import parse_content_disposition as parse_content_disposition from .payload import PAYLOAD_REGISTRY as PAYLOAD_REGISTRY from .payload import AsyncIterablePayload as AsyncIterablePayload from .payload import BufferedReaderPayload as BufferedReaderPayload from .payload import BytesIOPayload as BytesIOPayload from .payload import BytesPayload as BytesPayload from .payload import IOBasePayload as IOBasePayload from .payload import JsonPayload as JsonPayload from .payload import Payload as Payload from .payload import StringIOPayload as StringIOPayload from .payload import StringPayload as StringPayload from .payload import TextIOPayload as TextIOPayload from .payload import get_payload 
as get_payload from .payload import payload_type as payload_type from .payload_streamer import streamer as streamer from .resolver import AsyncResolver as AsyncResolver from .resolver import DefaultResolver as DefaultResolver from .resolver import ThreadedResolver as ThreadedResolver from .signals import Signal as Signal from .streams import EMPTY_PAYLOAD as EMPTY_PAYLOAD from .streams import DataQueue as DataQueue from .streams import EofStream as EofStream from .streams import FlowControlDataQueue as FlowControlDataQueue from .streams import StreamReader as StreamReader from .tracing import TraceConfig as TraceConfig from .tracing import ( TraceConnectionCreateEndParams as TraceConnectionCreateEndParams, ) from .tracing import ( TraceConnectionCreateStartParams as TraceConnectionCreateStartParams, ) from .tracing import ( TraceConnectionQueuedEndParams as TraceConnectionQueuedEndParams, ) from .tracing import ( TraceConnectionQueuedStartParams as TraceConnectionQueuedStartParams, ) from .tracing import ( TraceConnectionReuseconnParams as TraceConnectionReuseconnParams, ) from .tracing import TraceDnsCacheHitParams as TraceDnsCacheHitParams from .tracing import TraceDnsCacheMissParams as TraceDnsCacheMissParams from .tracing import ( TraceDnsResolveHostEndParams as TraceDnsResolveHostEndParams, ) from .tracing import ( TraceDnsResolveHostStartParams as TraceDnsResolveHostStartParams, ) from .tracing import TraceRequestChunkSentParams as TraceRequestChunkSentParams from .tracing import TraceRequestEndParams as TraceRequestEndParams from .tracing import TraceRequestExceptionParams as TraceRequestExceptionParams from .tracing import TraceRequestRedirectParams as TraceRequestRedirectParams from .tracing import TraceRequestStartParams as TraceRequestStartParams from .tracing import ( TraceResponseChunkReceivedParams as TraceResponseChunkReceivedParams, ) __all__ = ( 'hdrs', # client 'BaseConnector', 'ClientConnectionError', 'ClientConnectorCertificateError', 'ClientConnectorError', 'ClientConnectorSSLError', 'ClientError', 'ClientHttpProxyError', 'ClientOSError', 'ClientPayloadError', 'ClientProxyConnectionError', 'ClientResponse', 'ClientRequest', 'ClientResponseError', 'ClientSSLError', 'ClientSession', 'ClientTimeout', 'ClientWebSocketResponse', 'ContentTypeError', 'Fingerprint', 'InvalidURL', 'RequestInfo', 'ServerConnectionError', 'ServerDisconnectedError', 'ServerFingerprintMismatch', 'ServerTimeoutError', 'TCPConnector', 'TooManyRedirects', 'UnixConnector', 'NamedPipeConnector', 'WSServerHandshakeError', 'request', # cookiejar 'CookieJar', 'DummyCookieJar', # formdata 'FormData', # helpers 'BasicAuth', 'ChainMapProxy', # http 'HttpVersion', 'HttpVersion10', 'HttpVersion11', 'WSMsgType', 'WSCloseCode', 'WSMessage', 'WebSocketError', # multipart 'BadContentDispositionHeader', 'BadContentDispositionParam', 'BodyPartReader', 'MultipartReader', 'MultipartWriter', 'content_disposition_filename', 'parse_content_disposition', # payload 'AsyncIterablePayload', 'BufferedReaderPayload', 'BytesIOPayload', 'BytesPayload', 'IOBasePayload', 'JsonPayload', 'PAYLOAD_REGISTRY', 'Payload', 'StringIOPayload', 'StringPayload', 'TextIOPayload', 'get_payload', 'payload_type', # payload_streamer 'streamer', # resolver 'AsyncResolver', 'DefaultResolver', 'ThreadedResolver', # signals 'Signal', 'DataQueue', 'EMPTY_PAYLOAD', 'EofStream', 'FlowControlDataQueue', 'StreamReader', # tracing 'TraceConfig', 'TraceConnectionCreateEndParams', 'TraceConnectionCreateStartParams', 'TraceConnectionQueuedEndParams', 
    'TraceConnectionQueuedStartParams', 'TraceConnectionReuseconnParams',
    'TraceDnsCacheHitParams', 'TraceDnsCacheMissParams',
    'TraceDnsResolveHostEndParams', 'TraceDnsResolveHostStartParams',
    'TraceRequestChunkSentParams', 'TraceRequestEndParams',
    'TraceRequestExceptionParams', 'TraceRequestRedirectParams',
    'TraceRequestStartParams', 'TraceResponseChunkReceivedParams',
)  # type: Tuple[str, ...]

try:
    from .worker import GunicornWebWorker, GunicornUVLoopWebWorker  # noqa
    __all__ += ('GunicornWebWorker', 'GunicornUVLoopWebWorker')
except ImportError:  # pragma: no cover
    pass

aiohttp-3.6.2/aiohttp/_cparser.pxd

from libc.stdint cimport uint16_t, uint32_t, uint64_t


cdef extern from "../vendor/http-parser/http_parser.h":

    ctypedef int (*http_data_cb) (http_parser*,
                                  const char *at,
                                  size_t length) except -1

    ctypedef int (*http_cb) (http_parser*) except -1

    struct http_parser:
        unsigned int type
        unsigned int flags
        unsigned int state
        unsigned int header_state
        unsigned int index
        uint32_t nread
        uint64_t content_length
        unsigned short http_major
        unsigned short http_minor
        unsigned int status_code
        unsigned int method
        unsigned int http_errno
        unsigned int upgrade
        void *data

    struct http_parser_settings:
        http_cb on_message_begin
        http_data_cb on_url
        http_data_cb on_status
        http_data_cb on_header_field
        http_data_cb on_header_value
        http_cb on_headers_complete
        http_data_cb on_body
        http_cb on_message_complete
        http_cb on_chunk_header
        http_cb on_chunk_complete

    enum http_parser_type:
        HTTP_REQUEST, HTTP_RESPONSE, HTTP_BOTH

    enum http_errno:
        HPE_OK,
        HPE_CB_message_begin, HPE_CB_url, HPE_CB_header_field,
        HPE_CB_header_value, HPE_CB_headers_complete, HPE_CB_body,
        HPE_CB_message_complete, HPE_CB_status, HPE_CB_chunk_header,
        HPE_CB_chunk_complete,
        HPE_INVALID_EOF_STATE, HPE_HEADER_OVERFLOW, HPE_CLOSED_CONNECTION,
        HPE_INVALID_VERSION, HPE_INVALID_STATUS, HPE_INVALID_METHOD,
        HPE_INVALID_URL, HPE_INVALID_HOST, HPE_INVALID_PORT,
        HPE_INVALID_PATH, HPE_INVALID_QUERY_STRING, HPE_INVALID_FRAGMENT,
        HPE_LF_EXPECTED, HPE_INVALID_HEADER_TOKEN,
        HPE_INVALID_CONTENT_LENGTH, HPE_INVALID_CHUNK_SIZE,
        HPE_INVALID_CONSTANT, HPE_INVALID_INTERNAL_STATE,
        HPE_STRICT, HPE_PAUSED, HPE_UNKNOWN

    enum flags:
        F_CHUNKED, F_CONNECTION_KEEP_ALIVE, F_CONNECTION_CLOSE,
        F_CONNECTION_UPGRADE, F_TRAILING, F_UPGRADE, F_SKIPBODY,
        F_CONTENTLENGTH

    enum http_method:
        DELETE, GET, HEAD, POST, PUT, CONNECT, OPTIONS, TRACE, COPY,
        LOCK, MKCOL, MOVE, PROPFIND, PROPPATCH, SEARCH, UNLOCK, BIND,
        REBIND, UNBIND, ACL, REPORT, MKACTIVITY, CHECKOUT, MERGE,
        MSEARCH, NOTIFY, SUBSCRIBE, UNSUBSCRIBE, PATCH, PURGE,
        MKCALENDAR, LINK, UNLINK

    void http_parser_init(http_parser *parser, http_parser_type type)

    size_t http_parser_execute(http_parser *parser,
                               const http_parser_settings *settings,
                               const char *data,
                               size_t len)

    int http_should_keep_alive(const http_parser *parser)

    void http_parser_settings_init(http_parser_settings *settings)

    const char *http_errno_name(http_errno err)
    const char *http_errno_description(http_errno err)
    const char *http_method_str(http_method m)

    # URL Parser

    enum http_parser_url_fields:
        UF_SCHEMA = 0,
        UF_HOST = 1,
        UF_PORT = 2,
        UF_PATH = 3,
        UF_QUERY = 4,
        UF_FRAGMENT = 5,
        UF_USERINFO = 6,
        UF_MAX = 7

    struct http_parser_url_field_data:
        uint16_t off
        uint16_t len

    struct http_parser_url:
        uint16_t field_set
        uint16_t port
        http_parser_url_field_data[UF_MAX] field_data

    void http_parser_url_init(http_parser_url *u)

    int http_parser_parse_url(const char *buf, size_t buflen,
                              int is_connect, http_parser_url
*u) aiohttp-3.6.2/aiohttp/_find_header.c0000644000175100001650000056275413547410117017713 0ustar vstsdocker00000000000000/* The file is autogenerated from aiohttp/hdrs.py Run ./tools/gen.py to update it after the origin changing. */ #include "_find_header.h" #define NEXT_CHAR() \ { \ count++; \ if (count == size) { \ /* end of search */ \ return -1; \ } \ pchar++; \ ch = *pchar; \ last = (count == size -1); \ } while(0); int find_header(const char *str, int size) { char *pchar = str; int last; char ch; int count = -1; pchar--; INITIAL: NEXT_CHAR(); switch (ch) { case 'A': if (last) { return -1; } goto A; case 'a': if (last) { return -1; } goto A; case 'C': if (last) { return -1; } goto C; case 'c': if (last) { return -1; } goto C; case 'D': if (last) { return -1; } goto D; case 'd': if (last) { return -1; } goto D; case 'E': if (last) { return -1; } goto E; case 'e': if (last) { return -1; } goto E; case 'F': if (last) { return -1; } goto F; case 'f': if (last) { return -1; } goto F; case 'H': if (last) { return -1; } goto H; case 'h': if (last) { return -1; } goto H; case 'I': if (last) { return -1; } goto I; case 'i': if (last) { return -1; } goto I; case 'K': if (last) { return -1; } goto K; case 'k': if (last) { return -1; } goto K; case 'L': if (last) { return -1; } goto L; case 'l': if (last) { return -1; } goto L; case 'M': if (last) { return -1; } goto M; case 'm': if (last) { return -1; } goto M; case 'O': if (last) { return -1; } goto O; case 'o': if (last) { return -1; } goto O; case 'P': if (last) { return -1; } goto P; case 'p': if (last) { return -1; } goto P; case 'R': if (last) { return -1; } goto R; case 'r': if (last) { return -1; } goto R; case 'S': if (last) { return -1; } goto S; case 's': if (last) { return -1; } goto S; case 'T': if (last) { return -1; } goto T; case 't': if (last) { return -1; } goto T; case 'U': if (last) { return -1; } goto U; case 'u': if (last) { return -1; } goto U; case 'V': if (last) { return -1; } goto V; case 'v': if (last) { return -1; } goto V; case 'W': if (last) { return -1; } goto W; case 'w': if (last) { return -1; } goto W; case 'X': if (last) { return -1; } goto X; case 'x': if (last) { return -1; } goto X; default: return -1; } A: NEXT_CHAR(); switch (ch) { case 'C': if (last) { return -1; } goto AC; case 'c': if (last) { return -1; } goto AC; case 'G': if (last) { return -1; } goto AG; case 'g': if (last) { return -1; } goto AG; case 'L': if (last) { return -1; } goto AL; case 'l': if (last) { return -1; } goto AL; case 'U': if (last) { return -1; } goto AU; case 'u': if (last) { return -1; } goto AU; default: return -1; } AC: NEXT_CHAR(); switch (ch) { case 'C': if (last) { return -1; } goto ACC; case 'c': if (last) { return -1; } goto ACC; default: return -1; } ACC: NEXT_CHAR(); switch (ch) { case 'E': if (last) { return -1; } goto ACCE; case 'e': if (last) { return -1; } goto ACCE; default: return -1; } ACCE: NEXT_CHAR(); switch (ch) { case 'P': if (last) { return -1; } goto ACCEP; case 'p': if (last) { return -1; } goto ACCEP; case 'S': if (last) { return -1; } goto ACCES; case 's': if (last) { return -1; } goto ACCES; default: return -1; } ACCEP: NEXT_CHAR(); switch (ch) { case 'T': if (last) { return 0; } goto ACCEPT; case 't': if (last) { return 0; } goto ACCEPT; default: return -1; } ACCEPT: NEXT_CHAR(); switch (ch) { case '-': if (last) { return -1; } goto ACCEPT_; default: return -1; } ACCEPT_: NEXT_CHAR(); switch (ch) { case 'C': if (last) { return -1; } goto ACCEPT_C; case 'c': if (last) { return -1; } goto ACCEPT_C; 
case 'E': if (last) { return -1; } goto ACCEPT_E; case 'e': if (last) { return -1; } goto ACCEPT_E; case 'L': if (last) { return -1; } goto ACCEPT_L; case 'l': if (last) { return -1; } goto ACCEPT_L; case 'R': if (last) { return -1; } goto ACCEPT_R; case 'r': if (last) { return -1; } goto ACCEPT_R; default: return -1; } ACCEPT_C: NEXT_CHAR(); switch (ch) { case 'H': if (last) { return -1; } goto ACCEPT_CH; case 'h': if (last) { return -1; } goto ACCEPT_CH; default: return -1; } ACCEPT_CH: NEXT_CHAR(); switch (ch) { case 'A': if (last) { return -1; } goto ACCEPT_CHA; case 'a': if (last) { return -1; } goto ACCEPT_CHA; default: return -1; } ACCEPT_CHA: NEXT_CHAR(); switch (ch) { case 'R': if (last) { return -1; } goto ACCEPT_CHAR; case 'r': if (last) { return -1; } goto ACCEPT_CHAR; default: return -1; } ACCEPT_CHAR: NEXT_CHAR(); switch (ch) { case 'S': if (last) { return -1; } goto ACCEPT_CHARS; case 's': if (last) { return -1; } goto ACCEPT_CHARS; default: return -1; } ACCEPT_CHARS: NEXT_CHAR(); switch (ch) { case 'E': if (last) { return -1; } goto ACCEPT_CHARSE; case 'e': if (last) { return -1; } goto ACCEPT_CHARSE; default: return -1; } ACCEPT_CHARSE: NEXT_CHAR(); switch (ch) { case 'T': if (last) { return 1; } goto ACCEPT_CHARSET; case 't': if (last) { return 1; } goto ACCEPT_CHARSET; default: return -1; } ACCEPT_E: NEXT_CHAR(); switch (ch) { case 'N': if (last) { return -1; } goto ACCEPT_EN; case 'n': if (last) { return -1; } goto ACCEPT_EN; default: return -1; } ACCEPT_EN: NEXT_CHAR(); switch (ch) { case 'C': if (last) { return -1; } goto ACCEPT_ENC; case 'c': if (last) { return -1; } goto ACCEPT_ENC; default: return -1; } ACCEPT_ENC: NEXT_CHAR(); switch (ch) { case 'O': if (last) { return -1; } goto ACCEPT_ENCO; case 'o': if (last) { return -1; } goto ACCEPT_ENCO; default: return -1; } ACCEPT_ENCO: NEXT_CHAR(); switch (ch) { case 'D': if (last) { return -1; } goto ACCEPT_ENCOD; case 'd': if (last) { return -1; } goto ACCEPT_ENCOD; default: return -1; } ACCEPT_ENCOD: NEXT_CHAR(); switch (ch) { case 'I': if (last) { return -1; } goto ACCEPT_ENCODI; case 'i': if (last) { return -1; } goto ACCEPT_ENCODI; default: return -1; } ACCEPT_ENCODI: NEXT_CHAR(); switch (ch) { case 'N': if (last) { return -1; } goto ACCEPT_ENCODIN; case 'n': if (last) { return -1; } goto ACCEPT_ENCODIN; default: return -1; } ACCEPT_ENCODIN: NEXT_CHAR(); switch (ch) { case 'G': if (last) { return 2; } goto ACCEPT_ENCODING; case 'g': if (last) { return 2; } goto ACCEPT_ENCODING; default: return -1; } ACCEPT_L: NEXT_CHAR(); switch (ch) { case 'A': if (last) { return -1; } goto ACCEPT_LA; case 'a': if (last) { return -1; } goto ACCEPT_LA; default: return -1; } ACCEPT_LA: NEXT_CHAR(); switch (ch) { case 'N': if (last) { return -1; } goto ACCEPT_LAN; case 'n': if (last) { return -1; } goto ACCEPT_LAN; default: return -1; } ACCEPT_LAN: NEXT_CHAR(); switch (ch) { case 'G': if (last) { return -1; } goto ACCEPT_LANG; case 'g': if (last) { return -1; } goto ACCEPT_LANG; default: return -1; } ACCEPT_LANG: NEXT_CHAR(); switch (ch) { case 'U': if (last) { return -1; } goto ACCEPT_LANGU; case 'u': if (last) { return -1; } goto ACCEPT_LANGU; default: return -1; } ACCEPT_LANGU: NEXT_CHAR(); switch (ch) { case 'A': if (last) { return -1; } goto ACCEPT_LANGUA; case 'a': if (last) { return -1; } goto ACCEPT_LANGUA; default: return -1; } ACCEPT_LANGUA: NEXT_CHAR(); switch (ch) { case 'G': if (last) { return -1; } goto ACCEPT_LANGUAG; case 'g': if (last) { return -1; } goto ACCEPT_LANGUAG; default: return -1; } ACCEPT_LANGUAG: 
NEXT_CHAR(); switch (ch) { case 'E': if (last) { return 3; } goto ACCEPT_LANGUAGE; case 'e': if (last) { return 3; } goto ACCEPT_LANGUAGE; default: return -1; } ACCEPT_R: NEXT_CHAR(); switch (ch) { case 'A': if (last) { return -1; } goto ACCEPT_RA; case 'a': if (last) { return -1; } goto ACCEPT_RA; default: return -1; } ACCEPT_RA: NEXT_CHAR(); switch (ch) { case 'N': if (last) { return -1; } goto ACCEPT_RAN; case 'n': if (last) { return -1; } goto ACCEPT_RAN; default: return -1; } ACCEPT_RAN: NEXT_CHAR(); switch (ch) { case 'G': if (last) { return -1; } goto ACCEPT_RANG; case 'g': if (last) { return -1; } goto ACCEPT_RANG; default: return -1; } ACCEPT_RANG: NEXT_CHAR(); switch (ch) { case 'E': if (last) { return -1; } goto ACCEPT_RANGE; case 'e': if (last) { return -1; } goto ACCEPT_RANGE; default: return -1; } ACCEPT_RANGE: NEXT_CHAR(); switch (ch) { case 'S': if (last) { return 4; } goto ACCEPT_RANGES; case 's': if (last) { return 4; } goto ACCEPT_RANGES; default: return -1; } ACCES: NEXT_CHAR(); switch (ch) { case 'S': if (last) { return -1; } goto ACCESS; case 's': if (last) { return -1; } goto ACCESS; default: return -1; } ACCESS: NEXT_CHAR(); switch (ch) { case '-': if (last) { return -1; } goto ACCESS_; default: return -1; } ACCESS_: NEXT_CHAR(); switch (ch) { case 'C': if (last) { return -1; } goto ACCESS_C; case 'c': if (last) { return -1; } goto ACCESS_C; default: return -1; } ACCESS_C: NEXT_CHAR(); switch (ch) { case 'O': if (last) { return -1; } goto ACCESS_CO; case 'o': if (last) { return -1; } goto ACCESS_CO; default: return -1; } ACCESS_CO: NEXT_CHAR(); switch (ch) { case 'N': if (last) { return -1; } goto ACCESS_CON; case 'n': if (last) { return -1; } goto ACCESS_CON; default: return -1; } ACCESS_CON: NEXT_CHAR(); switch (ch) { case 'T': if (last) { return -1; } goto ACCESS_CONT; case 't': if (last) { return -1; } goto ACCESS_CONT; default: return -1; } ACCESS_CONT: NEXT_CHAR(); switch (ch) { case 'R': if (last) { return -1; } goto ACCESS_CONTR; case 'r': if (last) { return -1; } goto ACCESS_CONTR; default: return -1; } ACCESS_CONTR: NEXT_CHAR(); switch (ch) { case 'O': if (last) { return -1; } goto ACCESS_CONTRO; case 'o': if (last) { return -1; } goto ACCESS_CONTRO; default: return -1; } ACCESS_CONTRO: NEXT_CHAR(); switch (ch) { case 'L': if (last) { return -1; } goto ACCESS_CONTROL; case 'l': if (last) { return -1; } goto ACCESS_CONTROL; default: return -1; } ACCESS_CONTROL: NEXT_CHAR(); switch (ch) { case '-': if (last) { return -1; } goto ACCESS_CONTROL_; default: return -1; } ACCESS_CONTROL_: NEXT_CHAR(); switch (ch) { case 'A': if (last) { return -1; } goto ACCESS_CONTROL_A; case 'a': if (last) { return -1; } goto ACCESS_CONTROL_A; case 'E': if (last) { return -1; } goto ACCESS_CONTROL_E; case 'e': if (last) { return -1; } goto ACCESS_CONTROL_E; case 'M': if (last) { return -1; } goto ACCESS_CONTROL_M; case 'm': if (last) { return -1; } goto ACCESS_CONTROL_M; case 'R': if (last) { return -1; } goto ACCESS_CONTROL_R; case 'r': if (last) { return -1; } goto ACCESS_CONTROL_R; default: return -1; } ACCESS_CONTROL_A: NEXT_CHAR(); switch (ch) { case 'L': if (last) { return -1; } goto ACCESS_CONTROL_AL; case 'l': if (last) { return -1; } goto ACCESS_CONTROL_AL; default: return -1; } ACCESS_CONTROL_AL: NEXT_CHAR(); switch (ch) { case 'L': if (last) { return -1; } goto ACCESS_CONTROL_ALL; case 'l': if (last) { return -1; } goto ACCESS_CONTROL_ALL; default: return -1; } ACCESS_CONTROL_ALL: NEXT_CHAR(); switch (ch) { case 'O': if (last) { return -1; } goto ACCESS_CONTROL_ALLO; 
case 'o': if (last) { return -1; } goto ACCESS_CONTROL_ALLO; default: return -1; } ACCESS_CONTROL_ALLO: NEXT_CHAR(); switch (ch) { case 'W': if (last) { return -1; } goto ACCESS_CONTROL_ALLOW; case 'w': if (last) { return -1; } goto ACCESS_CONTROL_ALLOW; default: return -1; } ACCESS_CONTROL_ALLOW: NEXT_CHAR(); switch (ch) { case '-': if (last) { return -1; } goto ACCESS_CONTROL_ALLOW_; default: return -1; } ACCESS_CONTROL_ALLOW_: NEXT_CHAR(); switch (ch) { case 'C': if (last) { return -1; } goto ACCESS_CONTROL_ALLOW_C; case 'c': if (last) { return -1; } goto ACCESS_CONTROL_ALLOW_C; case 'H': if (last) { return -1; } goto ACCESS_CONTROL_ALLOW_H; case 'h': if (last) { return -1; } goto ACCESS_CONTROL_ALLOW_H; case 'M': if (last) { return -1; } goto ACCESS_CONTROL_ALLOW_M; case 'm': if (last) { return -1; } goto ACCESS_CONTROL_ALLOW_M; case 'O': if (last) { return -1; } goto ACCESS_CONTROL_ALLOW_O; case 'o': if (last) { return -1; } goto ACCESS_CONTROL_ALLOW_O; default: return -1; } ACCESS_CONTROL_ALLOW_C: NEXT_CHAR(); switch (ch) { case 'R': if (last) { return -1; } goto ACCESS_CONTROL_ALLOW_CR; case 'r': if (last) { return -1; } goto ACCESS_CONTROL_ALLOW_CR; default: return -1; } ACCESS_CONTROL_ALLOW_CR: NEXT_CHAR(); switch (ch) { case 'E': if (last) { return -1; } goto ACCESS_CONTROL_ALLOW_CRE; case 'e': if (last) { return -1; } goto ACCESS_CONTROL_ALLOW_CRE; default: return -1; } ACCESS_CONTROL_ALLOW_CRE: NEXT_CHAR(); switch (ch) { case 'D': if (last) { return -1; } goto ACCESS_CONTROL_ALLOW_CRED; case 'd': if (last) { return -1; } goto ACCESS_CONTROL_ALLOW_CRED; default: return -1; } ACCESS_CONTROL_ALLOW_CRED: NEXT_CHAR(); switch (ch) { case 'E': if (last) { return -1; } goto ACCESS_CONTROL_ALLOW_CREDE; case 'e': if (last) { return -1; } goto ACCESS_CONTROL_ALLOW_CREDE; default: return -1; } ACCESS_CONTROL_ALLOW_CREDE: NEXT_CHAR(); switch (ch) { case 'N': if (last) { return -1; } goto ACCESS_CONTROL_ALLOW_CREDEN; case 'n': if (last) { return -1; } goto ACCESS_CONTROL_ALLOW_CREDEN; default: return -1; } ACCESS_CONTROL_ALLOW_CREDEN: NEXT_CHAR(); switch (ch) { case 'T': if (last) { return -1; } goto ACCESS_CONTROL_ALLOW_CREDENT; case 't': if (last) { return -1; } goto ACCESS_CONTROL_ALLOW_CREDENT; default: return -1; } ACCESS_CONTROL_ALLOW_CREDENT: NEXT_CHAR(); switch (ch) { case 'I': if (last) { return -1; } goto ACCESS_CONTROL_ALLOW_CREDENTI; case 'i': if (last) { return -1; } goto ACCESS_CONTROL_ALLOW_CREDENTI; default: return -1; } ACCESS_CONTROL_ALLOW_CREDENTI: NEXT_CHAR(); switch (ch) { case 'A': if (last) { return -1; } goto ACCESS_CONTROL_ALLOW_CREDENTIA; case 'a': if (last) { return -1; } goto ACCESS_CONTROL_ALLOW_CREDENTIA; default: return -1; } ACCESS_CONTROL_ALLOW_CREDENTIA: NEXT_CHAR(); switch (ch) { case 'L': if (last) { return -1; } goto ACCESS_CONTROL_ALLOW_CREDENTIAL; case 'l': if (last) { return -1; } goto ACCESS_CONTROL_ALLOW_CREDENTIAL; default: return -1; } ACCESS_CONTROL_ALLOW_CREDENTIAL: NEXT_CHAR(); switch (ch) { case 'S': if (last) { return 5; } goto ACCESS_CONTROL_ALLOW_CREDENTIALS; case 's': if (last) { return 5; } goto ACCESS_CONTROL_ALLOW_CREDENTIALS; default: return -1; } ACCESS_CONTROL_ALLOW_H: NEXT_CHAR(); switch (ch) { case 'E': if (last) { return -1; } goto ACCESS_CONTROL_ALLOW_HE; case 'e': if (last) { return -1; } goto ACCESS_CONTROL_ALLOW_HE; default: return -1; } ACCESS_CONTROL_ALLOW_HE: NEXT_CHAR(); switch (ch) { case 'A': if (last) { return -1; } goto ACCESS_CONTROL_ALLOW_HEA; case 'a': if (last) { return -1; } goto ACCESS_CONTROL_ALLOW_HEA; default: 
return -1; } ACCESS_CONTROL_ALLOW_HEA: NEXT_CHAR(); switch (ch) { case 'D': if (last) { return -1; } goto ACCESS_CONTROL_ALLOW_HEAD; case 'd': if (last) { return -1; } goto ACCESS_CONTROL_ALLOW_HEAD; default: return -1; } ACCESS_CONTROL_ALLOW_HEAD: NEXT_CHAR(); switch (ch) { case 'E': if (last) { return -1; } goto ACCESS_CONTROL_ALLOW_HEADE; case 'e': if (last) { return -1; } goto ACCESS_CONTROL_ALLOW_HEADE; default: return -1; } ACCESS_CONTROL_ALLOW_HEADE: NEXT_CHAR(); switch (ch) { case 'R': if (last) { return -1; } goto ACCESS_CONTROL_ALLOW_HEADER; case 'r': if (last) { return -1; } goto ACCESS_CONTROL_ALLOW_HEADER; default: return -1; } ACCESS_CONTROL_ALLOW_HEADER: NEXT_CHAR(); switch (ch) { case 'S': if (last) { return 6; } goto ACCESS_CONTROL_ALLOW_HEADERS; case 's': if (last) { return 6; } goto ACCESS_CONTROL_ALLOW_HEADERS; default: return -1; } ACCESS_CONTROL_ALLOW_M: NEXT_CHAR(); switch (ch) { case 'E': if (last) { return -1; } goto ACCESS_CONTROL_ALLOW_ME; case 'e': if (last) { return -1; } goto ACCESS_CONTROL_ALLOW_ME; default: return -1; } ACCESS_CONTROL_ALLOW_ME: NEXT_CHAR(); switch (ch) { case 'T': if (last) { return -1; } goto ACCESS_CONTROL_ALLOW_MET; case 't': if (last) { return -1; } goto ACCESS_CONTROL_ALLOW_MET; default: return -1; } ACCESS_CONTROL_ALLOW_MET: NEXT_CHAR(); switch (ch) { case 'H': if (last) { return -1; } goto ACCESS_CONTROL_ALLOW_METH; case 'h': if (last) { return -1; } goto ACCESS_CONTROL_ALLOW_METH; default: return -1; } ACCESS_CONTROL_ALLOW_METH: NEXT_CHAR(); switch (ch) { case 'O': if (last) { return -1; } goto ACCESS_CONTROL_ALLOW_METHO; case 'o': if (last) { return -1; } goto ACCESS_CONTROL_ALLOW_METHO; default: return -1; } ACCESS_CONTROL_ALLOW_METHO: NEXT_CHAR(); switch (ch) { case 'D': if (last) { return -1; } goto ACCESS_CONTROL_ALLOW_METHOD; case 'd': if (last) { return -1; } goto ACCESS_CONTROL_ALLOW_METHOD; default: return -1; } ACCESS_CONTROL_ALLOW_METHOD: NEXT_CHAR(); switch (ch) { case 'S': if (last) { return 7; } goto ACCESS_CONTROL_ALLOW_METHODS; case 's': if (last) { return 7; } goto ACCESS_CONTROL_ALLOW_METHODS; default: return -1; } ACCESS_CONTROL_ALLOW_O: NEXT_CHAR(); switch (ch) { case 'R': if (last) { return -1; } goto ACCESS_CONTROL_ALLOW_OR; case 'r': if (last) { return -1; } goto ACCESS_CONTROL_ALLOW_OR; default: return -1; } ACCESS_CONTROL_ALLOW_OR: NEXT_CHAR(); switch (ch) { case 'I': if (last) { return -1; } goto ACCESS_CONTROL_ALLOW_ORI; case 'i': if (last) { return -1; } goto ACCESS_CONTROL_ALLOW_ORI; default: return -1; } ACCESS_CONTROL_ALLOW_ORI: NEXT_CHAR(); switch (ch) { case 'G': if (last) { return -1; } goto ACCESS_CONTROL_ALLOW_ORIG; case 'g': if (last) { return -1; } goto ACCESS_CONTROL_ALLOW_ORIG; default: return -1; } ACCESS_CONTROL_ALLOW_ORIG: NEXT_CHAR(); switch (ch) { case 'I': if (last) { return -1; } goto ACCESS_CONTROL_ALLOW_ORIGI; case 'i': if (last) { return -1; } goto ACCESS_CONTROL_ALLOW_ORIGI; default: return -1; } ACCESS_CONTROL_ALLOW_ORIGI: NEXT_CHAR(); switch (ch) { case 'N': if (last) { return 8; } goto ACCESS_CONTROL_ALLOW_ORIGIN; case 'n': if (last) { return 8; } goto ACCESS_CONTROL_ALLOW_ORIGIN; default: return -1; } ACCESS_CONTROL_E: NEXT_CHAR(); switch (ch) { case 'X': if (last) { return -1; } goto ACCESS_CONTROL_EX; case 'x': if (last) { return -1; } goto ACCESS_CONTROL_EX; default: return -1; } ACCESS_CONTROL_EX: NEXT_CHAR(); switch (ch) { case 'P': if (last) { return -1; } goto ACCESS_CONTROL_EXP; case 'p': if (last) { return -1; } goto ACCESS_CONTROL_EXP; default: return -1; } 
ACCESS_CONTROL_EXP: NEXT_CHAR(); switch (ch) { case 'O': if (last) { return -1; } goto ACCESS_CONTROL_EXPO; case 'o': if (last) { return -1; } goto ACCESS_CONTROL_EXPO; default: return -1; } ACCESS_CONTROL_EXPO: NEXT_CHAR(); switch (ch) { case 'S': if (last) { return -1; } goto ACCESS_CONTROL_EXPOS; case 's': if (last) { return -1; } goto ACCESS_CONTROL_EXPOS; default: return -1; } ACCESS_CONTROL_EXPOS: NEXT_CHAR(); switch (ch) { case 'E': if (last) { return -1; } goto ACCESS_CONTROL_EXPOSE; case 'e': if (last) { return -1; } goto ACCESS_CONTROL_EXPOSE; default: return -1; } ACCESS_CONTROL_EXPOSE: NEXT_CHAR(); switch (ch) { case '-': if (last) { return -1; } goto ACCESS_CONTROL_EXPOSE_; default: return -1; } ACCESS_CONTROL_EXPOSE_: NEXT_CHAR(); switch (ch) { case 'H': if (last) { return -1; } goto ACCESS_CONTROL_EXPOSE_H; case 'h': if (last) { return -1; } goto ACCESS_CONTROL_EXPOSE_H; default: return -1; } ACCESS_CONTROL_EXPOSE_H: NEXT_CHAR(); switch (ch) { case 'E': if (last) { return -1; } goto ACCESS_CONTROL_EXPOSE_HE; case 'e': if (last) { return -1; } goto ACCESS_CONTROL_EXPOSE_HE; default: return -1; } ACCESS_CONTROL_EXPOSE_HE: NEXT_CHAR(); switch (ch) { case 'A': if (last) { return -1; } goto ACCESS_CONTROL_EXPOSE_HEA; case 'a': if (last) { return -1; } goto ACCESS_CONTROL_EXPOSE_HEA; default: return -1; } ACCESS_CONTROL_EXPOSE_HEA: NEXT_CHAR(); switch (ch) { case 'D': if (last) { return -1; } goto ACCESS_CONTROL_EXPOSE_HEAD; case 'd': if (last) { return -1; } goto ACCESS_CONTROL_EXPOSE_HEAD; default: return -1; } ACCESS_CONTROL_EXPOSE_HEAD: NEXT_CHAR(); switch (ch) { case 'E': if (last) { return -1; } goto ACCESS_CONTROL_EXPOSE_HEADE; case 'e': if (last) { return -1; } goto ACCESS_CONTROL_EXPOSE_HEADE; default: return -1; } ACCESS_CONTROL_EXPOSE_HEADE: NEXT_CHAR(); switch (ch) { case 'R': if (last) { return -1; } goto ACCESS_CONTROL_EXPOSE_HEADER; case 'r': if (last) { return -1; } goto ACCESS_CONTROL_EXPOSE_HEADER; default: return -1; } ACCESS_CONTROL_EXPOSE_HEADER: NEXT_CHAR(); switch (ch) { case 'S': if (last) { return 9; } goto ACCESS_CONTROL_EXPOSE_HEADERS; case 's': if (last) { return 9; } goto ACCESS_CONTROL_EXPOSE_HEADERS; default: return -1; } ACCESS_CONTROL_M: NEXT_CHAR(); switch (ch) { case 'A': if (last) { return -1; } goto ACCESS_CONTROL_MA; case 'a': if (last) { return -1; } goto ACCESS_CONTROL_MA; default: return -1; } ACCESS_CONTROL_MA: NEXT_CHAR(); switch (ch) { case 'X': if (last) { return -1; } goto ACCESS_CONTROL_MAX; case 'x': if (last) { return -1; } goto ACCESS_CONTROL_MAX; default: return -1; } ACCESS_CONTROL_MAX: NEXT_CHAR(); switch (ch) { case '-': if (last) { return -1; } goto ACCESS_CONTROL_MAX_; default: return -1; } ACCESS_CONTROL_MAX_: NEXT_CHAR(); switch (ch) { case 'A': if (last) { return -1; } goto ACCESS_CONTROL_MAX_A; case 'a': if (last) { return -1; } goto ACCESS_CONTROL_MAX_A; default: return -1; } ACCESS_CONTROL_MAX_A: NEXT_CHAR(); switch (ch) { case 'G': if (last) { return -1; } goto ACCESS_CONTROL_MAX_AG; case 'g': if (last) { return -1; } goto ACCESS_CONTROL_MAX_AG; default: return -1; } ACCESS_CONTROL_MAX_AG: NEXT_CHAR(); switch (ch) { case 'E': if (last) { return 10; } goto ACCESS_CONTROL_MAX_AGE; case 'e': if (last) { return 10; } goto ACCESS_CONTROL_MAX_AGE; default: return -1; } ACCESS_CONTROL_R: NEXT_CHAR(); switch (ch) { case 'E': if (last) { return -1; } goto ACCESS_CONTROL_RE; case 'e': if (last) { return -1; } goto ACCESS_CONTROL_RE; default: return -1; } ACCESS_CONTROL_RE: NEXT_CHAR(); switch (ch) { case 'Q': if (last) { return 
-1; } goto ACCESS_CONTROL_REQ; case 'q': if (last) { return -1; } goto ACCESS_CONTROL_REQ; default: return -1; } ACCESS_CONTROL_REQ: NEXT_CHAR(); switch (ch) { case 'U': if (last) { return -1; } goto ACCESS_CONTROL_REQU; case 'u': if (last) { return -1; } goto ACCESS_CONTROL_REQU; default: return -1; } ACCESS_CONTROL_REQU: NEXT_CHAR(); switch (ch) { case 'E': if (last) { return -1; } goto ACCESS_CONTROL_REQUE; case 'e': if (last) { return -1; } goto ACCESS_CONTROL_REQUE; default: return -1; } ACCESS_CONTROL_REQUE: NEXT_CHAR(); switch (ch) { case 'S': if (last) { return -1; } goto ACCESS_CONTROL_REQUES; case 's': if (last) { return -1; } goto ACCESS_CONTROL_REQUES; default: return -1; } ACCESS_CONTROL_REQUES: NEXT_CHAR(); switch (ch) { case 'T': if (last) { return -1; } goto ACCESS_CONTROL_REQUEST; case 't': if (last) { return -1; } goto ACCESS_CONTROL_REQUEST; default: return -1; } ACCESS_CONTROL_REQUEST: NEXT_CHAR(); switch (ch) { case '-': if (last) { return -1; } goto ACCESS_CONTROL_REQUEST_; default: return -1; } ACCESS_CONTROL_REQUEST_: NEXT_CHAR(); switch (ch) { case 'H': if (last) { return -1; } goto ACCESS_CONTROL_REQUEST_H; case 'h': if (last) { return -1; } goto ACCESS_CONTROL_REQUEST_H; case 'M': if (last) { return -1; } goto ACCESS_CONTROL_REQUEST_M; case 'm': if (last) { return -1; } goto ACCESS_CONTROL_REQUEST_M; default: return -1; } ACCESS_CONTROL_REQUEST_H: NEXT_CHAR(); switch (ch) { case 'E': if (last) { return -1; } goto ACCESS_CONTROL_REQUEST_HE; case 'e': if (last) { return -1; } goto ACCESS_CONTROL_REQUEST_HE; default: return -1; } ACCESS_CONTROL_REQUEST_HE: NEXT_CHAR(); switch (ch) { case 'A': if (last) { return -1; } goto ACCESS_CONTROL_REQUEST_HEA; case 'a': if (last) { return -1; } goto ACCESS_CONTROL_REQUEST_HEA; default: return -1; } ACCESS_CONTROL_REQUEST_HEA: NEXT_CHAR(); switch (ch) { case 'D': if (last) { return -1; } goto ACCESS_CONTROL_REQUEST_HEAD; case 'd': if (last) { return -1; } goto ACCESS_CONTROL_REQUEST_HEAD; default: return -1; } ACCESS_CONTROL_REQUEST_HEAD: NEXT_CHAR(); switch (ch) { case 'E': if (last) { return -1; } goto ACCESS_CONTROL_REQUEST_HEADE; case 'e': if (last) { return -1; } goto ACCESS_CONTROL_REQUEST_HEADE; default: return -1; } ACCESS_CONTROL_REQUEST_HEADE: NEXT_CHAR(); switch (ch) { case 'R': if (last) { return -1; } goto ACCESS_CONTROL_REQUEST_HEADER; case 'r': if (last) { return -1; } goto ACCESS_CONTROL_REQUEST_HEADER; default: return -1; } ACCESS_CONTROL_REQUEST_HEADER: NEXT_CHAR(); switch (ch) { case 'S': if (last) { return 11; } goto ACCESS_CONTROL_REQUEST_HEADERS; case 's': if (last) { return 11; } goto ACCESS_CONTROL_REQUEST_HEADERS; default: return -1; } ACCESS_CONTROL_REQUEST_M: NEXT_CHAR(); switch (ch) { case 'E': if (last) { return -1; } goto ACCESS_CONTROL_REQUEST_ME; case 'e': if (last) { return -1; } goto ACCESS_CONTROL_REQUEST_ME; default: return -1; } ACCESS_CONTROL_REQUEST_ME: NEXT_CHAR(); switch (ch) { case 'T': if (last) { return -1; } goto ACCESS_CONTROL_REQUEST_MET; case 't': if (last) { return -1; } goto ACCESS_CONTROL_REQUEST_MET; default: return -1; } ACCESS_CONTROL_REQUEST_MET: NEXT_CHAR(); switch (ch) { case 'H': if (last) { return -1; } goto ACCESS_CONTROL_REQUEST_METH; case 'h': if (last) { return -1; } goto ACCESS_CONTROL_REQUEST_METH; default: return -1; } ACCESS_CONTROL_REQUEST_METH: NEXT_CHAR(); switch (ch) { case 'O': if (last) { return -1; } goto ACCESS_CONTROL_REQUEST_METHO; case 'o': if (last) { return -1; } goto ACCESS_CONTROL_REQUEST_METHO; default: return -1; } ACCESS_CONTROL_REQUEST_METHO: 
NEXT_CHAR(); switch (ch) { case 'D': if (last) { return 12; } goto ACCESS_CONTROL_REQUEST_METHOD; case 'd': if (last) { return 12; } goto ACCESS_CONTROL_REQUEST_METHOD; default: return -1; } AG: NEXT_CHAR(); switch (ch) { case 'E': if (last) { return 13; } goto AGE; case 'e': if (last) { return 13; } goto AGE; default: return -1; } AL: NEXT_CHAR(); switch (ch) { case 'L': if (last) { return -1; } goto ALL; case 'l': if (last) { return -1; } goto ALL; default: return -1; } ALL: NEXT_CHAR(); switch (ch) { case 'O': if (last) { return -1; } goto ALLO; case 'o': if (last) { return -1; } goto ALLO; default: return -1; } ALLO: NEXT_CHAR(); switch (ch) { case 'W': if (last) { return 14; } goto ALLOW; case 'w': if (last) { return 14; } goto ALLOW; default: return -1; } AU: NEXT_CHAR(); switch (ch) { case 'T': if (last) { return -1; } goto AUT; case 't': if (last) { return -1; } goto AUT; default: return -1; } AUT: NEXT_CHAR(); switch (ch) { case 'H': if (last) { return -1; } goto AUTH; case 'h': if (last) { return -1; } goto AUTH; default: return -1; } AUTH: NEXT_CHAR(); switch (ch) { case 'O': if (last) { return -1; } goto AUTHO; case 'o': if (last) { return -1; } goto AUTHO; default: return -1; } AUTHO: NEXT_CHAR(); switch (ch) { case 'R': if (last) { return -1; } goto AUTHOR; case 'r': if (last) { return -1; } goto AUTHOR; default: return -1; } AUTHOR: NEXT_CHAR(); switch (ch) { case 'I': if (last) { return -1; } goto AUTHORI; case 'i': if (last) { return -1; } goto AUTHORI; default: return -1; } AUTHORI: NEXT_CHAR(); switch (ch) { case 'Z': if (last) { return -1; } goto AUTHORIZ; case 'z': if (last) { return -1; } goto AUTHORIZ; default: return -1; } AUTHORIZ: NEXT_CHAR(); switch (ch) { case 'A': if (last) { return -1; } goto AUTHORIZA; case 'a': if (last) { return -1; } goto AUTHORIZA; default: return -1; } AUTHORIZA: NEXT_CHAR(); switch (ch) { case 'T': if (last) { return -1; } goto AUTHORIZAT; case 't': if (last) { return -1; } goto AUTHORIZAT; default: return -1; } AUTHORIZAT: NEXT_CHAR(); switch (ch) { case 'I': if (last) { return -1; } goto AUTHORIZATI; case 'i': if (last) { return -1; } goto AUTHORIZATI; default: return -1; } AUTHORIZATI: NEXT_CHAR(); switch (ch) { case 'O': if (last) { return -1; } goto AUTHORIZATIO; case 'o': if (last) { return -1; } goto AUTHORIZATIO; default: return -1; } AUTHORIZATIO: NEXT_CHAR(); switch (ch) { case 'N': if (last) { return 15; } goto AUTHORIZATION; case 'n': if (last) { return 15; } goto AUTHORIZATION; default: return -1; } C: NEXT_CHAR(); switch (ch) { case 'A': if (last) { return -1; } goto CA; case 'a': if (last) { return -1; } goto CA; case 'O': if (last) { return -1; } goto CO; case 'o': if (last) { return -1; } goto CO; default: return -1; } CA: NEXT_CHAR(); switch (ch) { case 'C': if (last) { return -1; } goto CAC; case 'c': if (last) { return -1; } goto CAC; default: return -1; } CAC: NEXT_CHAR(); switch (ch) { case 'H': if (last) { return -1; } goto CACH; case 'h': if (last) { return -1; } goto CACH; default: return -1; } CACH: NEXT_CHAR(); switch (ch) { case 'E': if (last) { return -1; } goto CACHE; case 'e': if (last) { return -1; } goto CACHE; default: return -1; } CACHE: NEXT_CHAR(); switch (ch) { case '-': if (last) { return -1; } goto CACHE_; default: return -1; } CACHE_: NEXT_CHAR(); switch (ch) { case 'C': if (last) { return -1; } goto CACHE_C; case 'c': if (last) { return -1; } goto CACHE_C; default: return -1; } CACHE_C: NEXT_CHAR(); switch (ch) { case 'O': if (last) { return -1; } goto CACHE_CO; case 'o': if (last) { return -1; 
} goto CACHE_CO; default: return -1; } CACHE_CO: NEXT_CHAR(); switch (ch) { case 'N': if (last) { return -1; } goto CACHE_CON; case 'n': if (last) { return -1; } goto CACHE_CON; default: return -1; } CACHE_CON: NEXT_CHAR(); switch (ch) { case 'T': if (last) { return -1; } goto CACHE_CONT; case 't': if (last) { return -1; } goto CACHE_CONT; default: return -1; } CACHE_CONT: NEXT_CHAR(); switch (ch) { case 'R': if (last) { return -1; } goto CACHE_CONTR; case 'r': if (last) { return -1; } goto CACHE_CONTR; default: return -1; } CACHE_CONTR: NEXT_CHAR(); switch (ch) { case 'O': if (last) { return -1; } goto CACHE_CONTRO; case 'o': if (last) { return -1; } goto CACHE_CONTRO; default: return -1; } CACHE_CONTRO: NEXT_CHAR(); switch (ch) { case 'L': if (last) { return 16; } goto CACHE_CONTROL; case 'l': if (last) { return 16; } goto CACHE_CONTROL; default: return -1; } CO: NEXT_CHAR(); switch (ch) { case 'N': if (last) { return -1; } goto CON; case 'n': if (last) { return -1; } goto CON; case 'O': if (last) { return -1; } goto COO; case 'o': if (last) { return -1; } goto COO; default: return -1; } CON: NEXT_CHAR(); switch (ch) { case 'N': if (last) { return -1; } goto CONN; case 'n': if (last) { return -1; } goto CONN; case 'T': if (last) { return -1; } goto CONT; case 't': if (last) { return -1; } goto CONT; default: return -1; } CONN: NEXT_CHAR(); switch (ch) { case 'E': if (last) { return -1; } goto CONNE; case 'e': if (last) { return -1; } goto CONNE; default: return -1; } CONNE: NEXT_CHAR(); switch (ch) { case 'C': if (last) { return -1; } goto CONNEC; case 'c': if (last) { return -1; } goto CONNEC; default: return -1; } CONNEC: NEXT_CHAR(); switch (ch) { case 'T': if (last) { return -1; } goto CONNECT; case 't': if (last) { return -1; } goto CONNECT; default: return -1; } CONNECT: NEXT_CHAR(); switch (ch) { case 'I': if (last) { return -1; } goto CONNECTI; case 'i': if (last) { return -1; } goto CONNECTI; default: return -1; } CONNECTI: NEXT_CHAR(); switch (ch) { case 'O': if (last) { return -1; } goto CONNECTIO; case 'o': if (last) { return -1; } goto CONNECTIO; default: return -1; } CONNECTIO: NEXT_CHAR(); switch (ch) { case 'N': if (last) { return 17; } goto CONNECTION; case 'n': if (last) { return 17; } goto CONNECTION; default: return -1; } CONT: NEXT_CHAR(); switch (ch) { case 'E': if (last) { return -1; } goto CONTE; case 'e': if (last) { return -1; } goto CONTE; default: return -1; } CONTE: NEXT_CHAR(); switch (ch) { case 'N': if (last) { return -1; } goto CONTEN; case 'n': if (last) { return -1; } goto CONTEN; default: return -1; } CONTEN: NEXT_CHAR(); switch (ch) { case 'T': if (last) { return -1; } goto CONTENT; case 't': if (last) { return -1; } goto CONTENT; default: return -1; } CONTENT: NEXT_CHAR(); switch (ch) { case '-': if (last) { return -1; } goto CONTENT_; default: return -1; } CONTENT_: NEXT_CHAR(); switch (ch) { case 'D': if (last) { return -1; } goto CONTENT_D; case 'd': if (last) { return -1; } goto CONTENT_D; case 'E': if (last) { return -1; } goto CONTENT_E; case 'e': if (last) { return -1; } goto CONTENT_E; case 'L': if (last) { return -1; } goto CONTENT_L; case 'l': if (last) { return -1; } goto CONTENT_L; case 'M': if (last) { return -1; } goto CONTENT_M; case 'm': if (last) { return -1; } goto CONTENT_M; case 'R': if (last) { return -1; } goto CONTENT_R; case 'r': if (last) { return -1; } goto CONTENT_R; case 'T': if (last) { return -1; } goto CONTENT_T; case 't': if (last) { return -1; } goto CONTENT_T; default: return -1; } CONTENT_D: NEXT_CHAR(); switch (ch) 
{ case 'I': if (last) { return -1; } goto CONTENT_DI; case 'i': if (last) { return -1; } goto CONTENT_DI; default: return -1; } CONTENT_DI: NEXT_CHAR(); switch (ch) { case 'S': if (last) { return -1; } goto CONTENT_DIS; case 's': if (last) { return -1; } goto CONTENT_DIS; default: return -1; } CONTENT_DIS: NEXT_CHAR(); switch (ch) { case 'P': if (last) { return -1; } goto CONTENT_DISP; case 'p': if (last) { return -1; } goto CONTENT_DISP; default: return -1; } CONTENT_DISP: NEXT_CHAR(); switch (ch) { case 'O': if (last) { return -1; } goto CONTENT_DISPO; case 'o': if (last) { return -1; } goto CONTENT_DISPO; default: return -1; } CONTENT_DISPO: NEXT_CHAR(); switch (ch) { case 'S': if (last) { return -1; } goto CONTENT_DISPOS; case 's': if (last) { return -1; } goto CONTENT_DISPOS; default: return -1; } CONTENT_DISPOS: NEXT_CHAR(); switch (ch) { case 'I': if (last) { return -1; } goto CONTENT_DISPOSI; case 'i': if (last) { return -1; } goto CONTENT_DISPOSI; default: return -1; } CONTENT_DISPOSI: NEXT_CHAR(); switch (ch) { case 'T': if (last) { return -1; } goto CONTENT_DISPOSIT; case 't': if (last) { return -1; } goto CONTENT_DISPOSIT; default: return -1; } CONTENT_DISPOSIT: NEXT_CHAR(); switch (ch) { case 'I': if (last) { return -1; } goto CONTENT_DISPOSITI; case 'i': if (last) { return -1; } goto CONTENT_DISPOSITI; default: return -1; } CONTENT_DISPOSITI: NEXT_CHAR(); switch (ch) { case 'O': if (last) { return -1; } goto CONTENT_DISPOSITIO; case 'o': if (last) { return -1; } goto CONTENT_DISPOSITIO; default: return -1; } CONTENT_DISPOSITIO: NEXT_CHAR(); switch (ch) { case 'N': if (last) { return 18; } goto CONTENT_DISPOSITION; case 'n': if (last) { return 18; } goto CONTENT_DISPOSITION; default: return -1; } CONTENT_E: NEXT_CHAR(); switch (ch) { case 'N': if (last) { return -1; } goto CONTENT_EN; case 'n': if (last) { return -1; } goto CONTENT_EN; default: return -1; } CONTENT_EN: NEXT_CHAR(); switch (ch) { case 'C': if (last) { return -1; } goto CONTENT_ENC; case 'c': if (last) { return -1; } goto CONTENT_ENC; default: return -1; } CONTENT_ENC: NEXT_CHAR(); switch (ch) { case 'O': if (last) { return -1; } goto CONTENT_ENCO; case 'o': if (last) { return -1; } goto CONTENT_ENCO; default: return -1; } CONTENT_ENCO: NEXT_CHAR(); switch (ch) { case 'D': if (last) { return -1; } goto CONTENT_ENCOD; case 'd': if (last) { return -1; } goto CONTENT_ENCOD; default: return -1; } CONTENT_ENCOD: NEXT_CHAR(); switch (ch) { case 'I': if (last) { return -1; } goto CONTENT_ENCODI; case 'i': if (last) { return -1; } goto CONTENT_ENCODI; default: return -1; } CONTENT_ENCODI: NEXT_CHAR(); switch (ch) { case 'N': if (last) { return -1; } goto CONTENT_ENCODIN; case 'n': if (last) { return -1; } goto CONTENT_ENCODIN; default: return -1; } CONTENT_ENCODIN: NEXT_CHAR(); switch (ch) { case 'G': if (last) { return 19; } goto CONTENT_ENCODING; case 'g': if (last) { return 19; } goto CONTENT_ENCODING; default: return -1; } CONTENT_L: NEXT_CHAR(); switch (ch) { case 'A': if (last) { return -1; } goto CONTENT_LA; case 'a': if (last) { return -1; } goto CONTENT_LA; case 'E': if (last) { return -1; } goto CONTENT_LE; case 'e': if (last) { return -1; } goto CONTENT_LE; case 'O': if (last) { return -1; } goto CONTENT_LO; case 'o': if (last) { return -1; } goto CONTENT_LO; default: return -1; } CONTENT_LA: NEXT_CHAR(); switch (ch) { case 'N': if (last) { return -1; } goto CONTENT_LAN; case 'n': if (last) { return -1; } goto CONTENT_LAN; default: return -1; } CONTENT_LAN: NEXT_CHAR(); switch (ch) { case 'G': if (last) { 
return -1; } goto CONTENT_LANG; case 'g': if (last) { return -1; } goto CONTENT_LANG; default: return -1; } CONTENT_LANG: NEXT_CHAR(); switch (ch) { case 'U': if (last) { return -1; } goto CONTENT_LANGU; case 'u': if (last) { return -1; } goto CONTENT_LANGU; default: return -1; } CONTENT_LANGU: NEXT_CHAR(); switch (ch) { case 'A': if (last) { return -1; } goto CONTENT_LANGUA; case 'a': if (last) { return -1; } goto CONTENT_LANGUA; default: return -1; } CONTENT_LANGUA: NEXT_CHAR(); switch (ch) { case 'G': if (last) { return -1; } goto CONTENT_LANGUAG; case 'g': if (last) { return -1; } goto CONTENT_LANGUAG; default: return -1; } CONTENT_LANGUAG: NEXT_CHAR(); switch (ch) { case 'E': if (last) { return 20; } goto CONTENT_LANGUAGE; case 'e': if (last) { return 20; } goto CONTENT_LANGUAGE; default: return -1; } CONTENT_LE: NEXT_CHAR(); switch (ch) { case 'N': if (last) { return -1; } goto CONTENT_LEN; case 'n': if (last) { return -1; } goto CONTENT_LEN; default: return -1; } CONTENT_LEN: NEXT_CHAR(); switch (ch) { case 'G': if (last) { return -1; } goto CONTENT_LENG; case 'g': if (last) { return -1; } goto CONTENT_LENG; default: return -1; } CONTENT_LENG: NEXT_CHAR(); switch (ch) { case 'T': if (last) { return -1; } goto CONTENT_LENGT; case 't': if (last) { return -1; } goto CONTENT_LENGT; default: return -1; } CONTENT_LENGT: NEXT_CHAR(); switch (ch) { case 'H': if (last) { return 21; } goto CONTENT_LENGTH; case 'h': if (last) { return 21; } goto CONTENT_LENGTH; default: return -1; } CONTENT_LO: NEXT_CHAR(); switch (ch) { case 'C': if (last) { return -1; } goto CONTENT_LOC; case 'c': if (last) { return -1; } goto CONTENT_LOC; default: return -1; } CONTENT_LOC: NEXT_CHAR(); switch (ch) { case 'A': if (last) { return -1; } goto CONTENT_LOCA; case 'a': if (last) { return -1; } goto CONTENT_LOCA; default: return -1; } CONTENT_LOCA: NEXT_CHAR(); switch (ch) { case 'T': if (last) { return -1; } goto CONTENT_LOCAT; case 't': if (last) { return -1; } goto CONTENT_LOCAT; default: return -1; } CONTENT_LOCAT: NEXT_CHAR(); switch (ch) { case 'I': if (last) { return -1; } goto CONTENT_LOCATI; case 'i': if (last) { return -1; } goto CONTENT_LOCATI; default: return -1; } CONTENT_LOCATI: NEXT_CHAR(); switch (ch) { case 'O': if (last) { return -1; } goto CONTENT_LOCATIO; case 'o': if (last) { return -1; } goto CONTENT_LOCATIO; default: return -1; } CONTENT_LOCATIO: NEXT_CHAR(); switch (ch) { case 'N': if (last) { return 22; } goto CONTENT_LOCATION; case 'n': if (last) { return 22; } goto CONTENT_LOCATION; default: return -1; } CONTENT_M: NEXT_CHAR(); switch (ch) { case 'D': if (last) { return -1; } goto CONTENT_MD; case 'd': if (last) { return -1; } goto CONTENT_MD; default: return -1; } CONTENT_MD: NEXT_CHAR(); switch (ch) { case '5': if (last) { return 23; } goto CONTENT_MD5; default: return -1; } CONTENT_R: NEXT_CHAR(); switch (ch) { case 'A': if (last) { return -1; } goto CONTENT_RA; case 'a': if (last) { return -1; } goto CONTENT_RA; default: return -1; } CONTENT_RA: NEXT_CHAR(); switch (ch) { case 'N': if (last) { return -1; } goto CONTENT_RAN; case 'n': if (last) { return -1; } goto CONTENT_RAN; default: return -1; } CONTENT_RAN: NEXT_CHAR(); switch (ch) { case 'G': if (last) { return -1; } goto CONTENT_RANG; case 'g': if (last) { return -1; } goto CONTENT_RANG; default: return -1; } CONTENT_RANG: NEXT_CHAR(); switch (ch) { case 'E': if (last) { return 24; } goto CONTENT_RANGE; case 'e': if (last) { return 24; } goto CONTENT_RANGE; default: return -1; } CONTENT_T: NEXT_CHAR(); switch (ch) { case 'R': if 
(last) { return -1; } goto CONTENT_TR; case 'r': if (last) { return -1; } goto CONTENT_TR; case 'Y': if (last) { return -1; } goto CONTENT_TY; case 'y': if (last) { return -1; } goto CONTENT_TY; default: return -1; } CONTENT_TR: NEXT_CHAR(); switch (ch) { case 'A': if (last) { return -1; } goto CONTENT_TRA; case 'a': if (last) { return -1; } goto CONTENT_TRA; default: return -1; } CONTENT_TRA: NEXT_CHAR(); switch (ch) { case 'N': if (last) { return -1; } goto CONTENT_TRAN; case 'n': if (last) { return -1; } goto CONTENT_TRAN; default: return -1; } CONTENT_TRAN: NEXT_CHAR(); switch (ch) { case 'S': if (last) { return -1; } goto CONTENT_TRANS; case 's': if (last) { return -1; } goto CONTENT_TRANS; default: return -1; } CONTENT_TRANS: NEXT_CHAR(); switch (ch) { case 'F': if (last) { return -1; } goto CONTENT_TRANSF; case 'f': if (last) { return -1; } goto CONTENT_TRANSF; default: return -1; } CONTENT_TRANSF: NEXT_CHAR(); switch (ch) { case 'E': if (last) { return -1; } goto CONTENT_TRANSFE; case 'e': if (last) { return -1; } goto CONTENT_TRANSFE; default: return -1; } CONTENT_TRANSFE: NEXT_CHAR(); switch (ch) { case 'R': if (last) { return -1; } goto CONTENT_TRANSFER; case 'r': if (last) { return -1; } goto CONTENT_TRANSFER; default: return -1; } CONTENT_TRANSFER: NEXT_CHAR(); switch (ch) { case '-': if (last) { return -1; } goto CONTENT_TRANSFER_; default: return -1; } CONTENT_TRANSFER_: NEXT_CHAR(); switch (ch) { case 'E': if (last) { return -1; } goto CONTENT_TRANSFER_E; case 'e': if (last) { return -1; } goto CONTENT_TRANSFER_E; default: return -1; } CONTENT_TRANSFER_E: NEXT_CHAR(); switch (ch) { case 'N': if (last) { return -1; } goto CONTENT_TRANSFER_EN; case 'n': if (last) { return -1; } goto CONTENT_TRANSFER_EN; default: return -1; } CONTENT_TRANSFER_EN: NEXT_CHAR(); switch (ch) { case 'C': if (last) { return -1; } goto CONTENT_TRANSFER_ENC; case 'c': if (last) { return -1; } goto CONTENT_TRANSFER_ENC; default: return -1; } CONTENT_TRANSFER_ENC: NEXT_CHAR(); switch (ch) { case 'O': if (last) { return -1; } goto CONTENT_TRANSFER_ENCO; case 'o': if (last) { return -1; } goto CONTENT_TRANSFER_ENCO; default: return -1; } CONTENT_TRANSFER_ENCO: NEXT_CHAR(); switch (ch) { case 'D': if (last) { return -1; } goto CONTENT_TRANSFER_ENCOD; case 'd': if (last) { return -1; } goto CONTENT_TRANSFER_ENCOD; default: return -1; } CONTENT_TRANSFER_ENCOD: NEXT_CHAR(); switch (ch) { case 'I': if (last) { return -1; } goto CONTENT_TRANSFER_ENCODI; case 'i': if (last) { return -1; } goto CONTENT_TRANSFER_ENCODI; default: return -1; } CONTENT_TRANSFER_ENCODI: NEXT_CHAR(); switch (ch) { case 'N': if (last) { return -1; } goto CONTENT_TRANSFER_ENCODIN; case 'n': if (last) { return -1; } goto CONTENT_TRANSFER_ENCODIN; default: return -1; } CONTENT_TRANSFER_ENCODIN: NEXT_CHAR(); switch (ch) { case 'G': if (last) { return 25; } goto CONTENT_TRANSFER_ENCODING; case 'g': if (last) { return 25; } goto CONTENT_TRANSFER_ENCODING; default: return -1; } CONTENT_TY: NEXT_CHAR(); switch (ch) { case 'P': if (last) { return -1; } goto CONTENT_TYP; case 'p': if (last) { return -1; } goto CONTENT_TYP; default: return -1; } CONTENT_TYP: NEXT_CHAR(); switch (ch) { case 'E': if (last) { return 26; } goto CONTENT_TYPE; case 'e': if (last) { return 26; } goto CONTENT_TYPE; default: return -1; } COO: NEXT_CHAR(); switch (ch) { case 'K': if (last) { return -1; } goto COOK; case 'k': if (last) { return -1; } goto COOK; default: return -1; } COOK: NEXT_CHAR(); switch (ch) { case 'I': if (last) { return -1; } goto COOKI; case 'i': if 
(last) { return -1; } goto COOKI; default: return -1; } COOKI: NEXT_CHAR(); switch (ch) { case 'E': if (last) { return 27; } goto COOKIE; case 'e': if (last) { return 27; } goto COOKIE; default: return -1; } D: NEXT_CHAR(); switch (ch) { case 'A': if (last) { return -1; } goto DA; case 'a': if (last) { return -1; } goto DA; case 'E': if (last) { return -1; } goto DE; case 'e': if (last) { return -1; } goto DE; case 'I': if (last) { return -1; } goto DI; case 'i': if (last) { return -1; } goto DI; default: return -1; } DA: NEXT_CHAR(); switch (ch) { case 'T': if (last) { return -1; } goto DAT; case 't': if (last) { return -1; } goto DAT; default: return -1; } DAT: NEXT_CHAR(); switch (ch) { case 'E': if (last) { return 28; } goto DATE; case 'e': if (last) { return 28; } goto DATE; default: return -1; } DE: NEXT_CHAR(); switch (ch) { case 'S': if (last) { return -1; } goto DES; case 's': if (last) { return -1; } goto DES; default: return -1; } DES: NEXT_CHAR(); switch (ch) { case 'T': if (last) { return -1; } goto DEST; case 't': if (last) { return -1; } goto DEST; default: return -1; } DEST: NEXT_CHAR(); switch (ch) { case 'I': if (last) { return -1; } goto DESTI; case 'i': if (last) { return -1; } goto DESTI; default: return -1; } DESTI: NEXT_CHAR(); switch (ch) { case 'N': if (last) { return -1; } goto DESTIN; case 'n': if (last) { return -1; } goto DESTIN; default: return -1; } DESTIN: NEXT_CHAR(); switch (ch) { case 'A': if (last) { return -1; } goto DESTINA; case 'a': if (last) { return -1; } goto DESTINA; default: return -1; } DESTINA: NEXT_CHAR(); switch (ch) { case 'T': if (last) { return -1; } goto DESTINAT; case 't': if (last) { return -1; } goto DESTINAT; default: return -1; } DESTINAT: NEXT_CHAR(); switch (ch) { case 'I': if (last) { return -1; } goto DESTINATI; case 'i': if (last) { return -1; } goto DESTINATI; default: return -1; } DESTINATI: NEXT_CHAR(); switch (ch) { case 'O': if (last) { return -1; } goto DESTINATIO; case 'o': if (last) { return -1; } goto DESTINATIO; default: return -1; } DESTINATIO: NEXT_CHAR(); switch (ch) { case 'N': if (last) { return 29; } goto DESTINATION; case 'n': if (last) { return 29; } goto DESTINATION; default: return -1; } DI: NEXT_CHAR(); switch (ch) { case 'G': if (last) { return -1; } goto DIG; case 'g': if (last) { return -1; } goto DIG; default: return -1; } DIG: NEXT_CHAR(); switch (ch) { case 'E': if (last) { return -1; } goto DIGE; case 'e': if (last) { return -1; } goto DIGE; default: return -1; } DIGE: NEXT_CHAR(); switch (ch) { case 'S': if (last) { return -1; } goto DIGES; case 's': if (last) { return -1; } goto DIGES; default: return -1; } DIGES: NEXT_CHAR(); switch (ch) { case 'T': if (last) { return 30; } goto DIGEST; case 't': if (last) { return 30; } goto DIGEST; default: return -1; } E: NEXT_CHAR(); switch (ch) { case 'T': if (last) { return -1; } goto ET; case 't': if (last) { return -1; } goto ET; case 'X': if (last) { return -1; } goto EX; case 'x': if (last) { return -1; } goto EX; default: return -1; } ET: NEXT_CHAR(); switch (ch) { case 'A': if (last) { return -1; } goto ETA; case 'a': if (last) { return -1; } goto ETA; default: return -1; } ETA: NEXT_CHAR(); switch (ch) { case 'G': if (last) { return 31; } goto ETAG; case 'g': if (last) { return 31; } goto ETAG; default: return -1; } EX: NEXT_CHAR(); switch (ch) { case 'P': if (last) { return -1; } goto EXP; case 'p': if (last) { return -1; } goto EXP; default: return -1; } EXP: NEXT_CHAR(); switch (ch) { case 'E': if (last) { return -1; } goto EXPE; case 'e': if 
(last) { return -1; } goto EXPE; case 'I': if (last) { return -1; } goto EXPI; case 'i': if (last) { return -1; } goto EXPI; default: return -1; } EXPE: NEXT_CHAR(); switch (ch) { case 'C': if (last) { return -1; } goto EXPEC; case 'c': if (last) { return -1; } goto EXPEC; default: return -1; } EXPEC: NEXT_CHAR(); switch (ch) { case 'T': if (last) { return 32; } goto EXPECT; case 't': if (last) { return 32; } goto EXPECT; default: return -1; } EXPI: NEXT_CHAR(); switch (ch) { case 'R': if (last) { return -1; } goto EXPIR; case 'r': if (last) { return -1; } goto EXPIR; default: return -1; } EXPIR: NEXT_CHAR(); switch (ch) { case 'E': if (last) { return -1; } goto EXPIRE; case 'e': if (last) { return -1; } goto EXPIRE; default: return -1; } EXPIRE: NEXT_CHAR(); switch (ch) { case 'S': if (last) { return 33; } goto EXPIRES; case 's': if (last) { return 33; } goto EXPIRES; default: return -1; } F: NEXT_CHAR(); switch (ch) { case 'O': if (last) { return -1; } goto FO; case 'o': if (last) { return -1; } goto FO; case 'R': if (last) { return -1; } goto FR; case 'r': if (last) { return -1; } goto FR; default: return -1; } FO: NEXT_CHAR(); switch (ch) { case 'R': if (last) { return -1; } goto FOR; case 'r': if (last) { return -1; } goto FOR; default: return -1; } FOR: NEXT_CHAR(); switch (ch) { case 'W': if (last) { return -1; } goto FORW; case 'w': if (last) { return -1; } goto FORW; default: return -1; } FORW: NEXT_CHAR(); switch (ch) { case 'A': if (last) { return -1; } goto FORWA; case 'a': if (last) { return -1; } goto FORWA; default: return -1; } FORWA: NEXT_CHAR(); switch (ch) { case 'R': if (last) { return -1; } goto FORWAR; case 'r': if (last) { return -1; } goto FORWAR; default: return -1; } FORWAR: NEXT_CHAR(); switch (ch) { case 'D': if (last) { return -1; } goto FORWARD; case 'd': if (last) { return -1; } goto FORWARD; default: return -1; } FORWARD: NEXT_CHAR(); switch (ch) { case 'E': if (last) { return -1; } goto FORWARDE; case 'e': if (last) { return -1; } goto FORWARDE; default: return -1; } FORWARDE: NEXT_CHAR(); switch (ch) { case 'D': if (last) { return 34; } goto FORWARDED; case 'd': if (last) { return 34; } goto FORWARDED; default: return -1; } FR: NEXT_CHAR(); switch (ch) { case 'O': if (last) { return -1; } goto FRO; case 'o': if (last) { return -1; } goto FRO; default: return -1; } FRO: NEXT_CHAR(); switch (ch) { case 'M': if (last) { return 35; } goto FROM; case 'm': if (last) { return 35; } goto FROM; default: return -1; } H: NEXT_CHAR(); switch (ch) { case 'O': if (last) { return -1; } goto HO; case 'o': if (last) { return -1; } goto HO; default: return -1; } HO: NEXT_CHAR(); switch (ch) { case 'S': if (last) { return -1; } goto HOS; case 's': if (last) { return -1; } goto HOS; default: return -1; } HOS: NEXT_CHAR(); switch (ch) { case 'T': if (last) { return 36; } goto HOST; case 't': if (last) { return 36; } goto HOST; default: return -1; } I: NEXT_CHAR(); switch (ch) { case 'F': if (last) { return -1; } goto IF; case 'f': if (last) { return -1; } goto IF; default: return -1; } IF: NEXT_CHAR(); switch (ch) { case '-': if (last) { return -1; } goto IF_; default: return -1; } IF_: NEXT_CHAR(); switch (ch) { case 'M': if (last) { return -1; } goto IF_M; case 'm': if (last) { return -1; } goto IF_M; case 'N': if (last) { return -1; } goto IF_N; case 'n': if (last) { return -1; } goto IF_N; case 'R': if (last) { return -1; } goto IF_R; case 'r': if (last) { return -1; } goto IF_R; case 'U': if (last) { return -1; } goto IF_U; case 'u': if (last) { return -1; } goto IF_U; 
default: return -1; } IF_M: NEXT_CHAR(); switch (ch) { case 'A': if (last) { return -1; } goto IF_MA; case 'a': if (last) { return -1; } goto IF_MA; case 'O': if (last) { return -1; } goto IF_MO; case 'o': if (last) { return -1; } goto IF_MO; default: return -1; } IF_MA: NEXT_CHAR(); switch (ch) { case 'T': if (last) { return -1; } goto IF_MAT; case 't': if (last) { return -1; } goto IF_MAT; default: return -1; } IF_MAT: NEXT_CHAR(); switch (ch) { case 'C': if (last) { return -1; } goto IF_MATC; case 'c': if (last) { return -1; } goto IF_MATC; default: return -1; } IF_MATC: NEXT_CHAR(); switch (ch) { case 'H': if (last) { return 37; } goto IF_MATCH; case 'h': if (last) { return 37; } goto IF_MATCH; default: return -1; } IF_MO: NEXT_CHAR(); switch (ch) { case 'D': if (last) { return -1; } goto IF_MOD; case 'd': if (last) { return -1; } goto IF_MOD; default: return -1; } IF_MOD: NEXT_CHAR(); switch (ch) { case 'I': if (last) { return -1; } goto IF_MODI; case 'i': if (last) { return -1; } goto IF_MODI; default: return -1; } IF_MODI: NEXT_CHAR(); switch (ch) { case 'F': if (last) { return -1; } goto IF_MODIF; case 'f': if (last) { return -1; } goto IF_MODIF; default: return -1; } IF_MODIF: NEXT_CHAR(); switch (ch) { case 'I': if (last) { return -1; } goto IF_MODIFI; case 'i': if (last) { return -1; } goto IF_MODIFI; default: return -1; } IF_MODIFI: NEXT_CHAR(); switch (ch) { case 'E': if (last) { return -1; } goto IF_MODIFIE; case 'e': if (last) { return -1; } goto IF_MODIFIE; default: return -1; } IF_MODIFIE: NEXT_CHAR(); switch (ch) { case 'D': if (last) { return -1; } goto IF_MODIFIED; case 'd': if (last) { return -1; } goto IF_MODIFIED; default: return -1; } IF_MODIFIED: NEXT_CHAR(); switch (ch) { case '-': if (last) { return -1; } goto IF_MODIFIED_; default: return -1; } IF_MODIFIED_: NEXT_CHAR(); switch (ch) { case 'S': if (last) { return -1; } goto IF_MODIFIED_S; case 's': if (last) { return -1; } goto IF_MODIFIED_S; default: return -1; } IF_MODIFIED_S: NEXT_CHAR(); switch (ch) { case 'I': if (last) { return -1; } goto IF_MODIFIED_SI; case 'i': if (last) { return -1; } goto IF_MODIFIED_SI; default: return -1; } IF_MODIFIED_SI: NEXT_CHAR(); switch (ch) { case 'N': if (last) { return -1; } goto IF_MODIFIED_SIN; case 'n': if (last) { return -1; } goto IF_MODIFIED_SIN; default: return -1; } IF_MODIFIED_SIN: NEXT_CHAR(); switch (ch) { case 'C': if (last) { return -1; } goto IF_MODIFIED_SINC; case 'c': if (last) { return -1; } goto IF_MODIFIED_SINC; default: return -1; } IF_MODIFIED_SINC: NEXT_CHAR(); switch (ch) { case 'E': if (last) { return 38; } goto IF_MODIFIED_SINCE; case 'e': if (last) { return 38; } goto IF_MODIFIED_SINCE; default: return -1; } IF_N: NEXT_CHAR(); switch (ch) { case 'O': if (last) { return -1; } goto IF_NO; case 'o': if (last) { return -1; } goto IF_NO; default: return -1; } IF_NO: NEXT_CHAR(); switch (ch) { case 'N': if (last) { return -1; } goto IF_NON; case 'n': if (last) { return -1; } goto IF_NON; default: return -1; } IF_NON: NEXT_CHAR(); switch (ch) { case 'E': if (last) { return -1; } goto IF_NONE; case 'e': if (last) { return -1; } goto IF_NONE; default: return -1; } IF_NONE: NEXT_CHAR(); switch (ch) { case '-': if (last) { return -1; } goto IF_NONE_; default: return -1; } IF_NONE_: NEXT_CHAR(); switch (ch) { case 'M': if (last) { return -1; } goto IF_NONE_M; case 'm': if (last) { return -1; } goto IF_NONE_M; default: return -1; } IF_NONE_M: NEXT_CHAR(); switch (ch) { case 'A': if (last) { return -1; } goto IF_NONE_MA; case 'a': if (last) { return -1; } goto 
IF_NONE_MA; default: return -1; } IF_NONE_MA: NEXT_CHAR(); switch (ch) { case 'T': if (last) { return -1; } goto IF_NONE_MAT; case 't': if (last) { return -1; } goto IF_NONE_MAT; default: return -1; } IF_NONE_MAT: NEXT_CHAR(); switch (ch) { case 'C': if (last) { return -1; } goto IF_NONE_MATC; case 'c': if (last) { return -1; } goto IF_NONE_MATC; default: return -1; } IF_NONE_MATC: NEXT_CHAR(); switch (ch) { case 'H': if (last) { return 39; } goto IF_NONE_MATCH; case 'h': if (last) { return 39; } goto IF_NONE_MATCH; default: return -1; } IF_R: NEXT_CHAR(); switch (ch) { case 'A': if (last) { return -1; } goto IF_RA; case 'a': if (last) { return -1; } goto IF_RA; default: return -1; } IF_RA: NEXT_CHAR(); switch (ch) { case 'N': if (last) { return -1; } goto IF_RAN; case 'n': if (last) { return -1; } goto IF_RAN; default: return -1; } IF_RAN: NEXT_CHAR(); switch (ch) { case 'G': if (last) { return -1; } goto IF_RANG; case 'g': if (last) { return -1; } goto IF_RANG; default: return -1; } IF_RANG: NEXT_CHAR(); switch (ch) { case 'E': if (last) { return 40; } goto IF_RANGE; case 'e': if (last) { return 40; } goto IF_RANGE; default: return -1; } IF_U: NEXT_CHAR(); switch (ch) { case 'N': if (last) { return -1; } goto IF_UN; case 'n': if (last) { return -1; } goto IF_UN; default: return -1; } IF_UN: NEXT_CHAR(); switch (ch) { case 'M': if (last) { return -1; } goto IF_UNM; case 'm': if (last) { return -1; } goto IF_UNM; default: return -1; } IF_UNM: NEXT_CHAR(); switch (ch) { case 'O': if (last) { return -1; } goto IF_UNMO; case 'o': if (last) { return -1; } goto IF_UNMO; default: return -1; } IF_UNMO: NEXT_CHAR(); switch (ch) { case 'D': if (last) { return -1; } goto IF_UNMOD; case 'd': if (last) { return -1; } goto IF_UNMOD; default: return -1; } IF_UNMOD: NEXT_CHAR(); switch (ch) { case 'I': if (last) { return -1; } goto IF_UNMODI; case 'i': if (last) { return -1; } goto IF_UNMODI; default: return -1; } IF_UNMODI: NEXT_CHAR(); switch (ch) { case 'F': if (last) { return -1; } goto IF_UNMODIF; case 'f': if (last) { return -1; } goto IF_UNMODIF; default: return -1; } IF_UNMODIF: NEXT_CHAR(); switch (ch) { case 'I': if (last) { return -1; } goto IF_UNMODIFI; case 'i': if (last) { return -1; } goto IF_UNMODIFI; default: return -1; } IF_UNMODIFI: NEXT_CHAR(); switch (ch) { case 'E': if (last) { return -1; } goto IF_UNMODIFIE; case 'e': if (last) { return -1; } goto IF_UNMODIFIE; default: return -1; } IF_UNMODIFIE: NEXT_CHAR(); switch (ch) { case 'D': if (last) { return -1; } goto IF_UNMODIFIED; case 'd': if (last) { return -1; } goto IF_UNMODIFIED; default: return -1; } IF_UNMODIFIED: NEXT_CHAR(); switch (ch) { case '-': if (last) { return -1; } goto IF_UNMODIFIED_; default: return -1; } IF_UNMODIFIED_: NEXT_CHAR(); switch (ch) { case 'S': if (last) { return -1; } goto IF_UNMODIFIED_S; case 's': if (last) { return -1; } goto IF_UNMODIFIED_S; default: return -1; } IF_UNMODIFIED_S: NEXT_CHAR(); switch (ch) { case 'I': if (last) { return -1; } goto IF_UNMODIFIED_SI; case 'i': if (last) { return -1; } goto IF_UNMODIFIED_SI; default: return -1; } IF_UNMODIFIED_SI: NEXT_CHAR(); switch (ch) { case 'N': if (last) { return -1; } goto IF_UNMODIFIED_SIN; case 'n': if (last) { return -1; } goto IF_UNMODIFIED_SIN; default: return -1; } IF_UNMODIFIED_SIN: NEXT_CHAR(); switch (ch) { case 'C': if (last) { return -1; } goto IF_UNMODIFIED_SINC; case 'c': if (last) { return -1; } goto IF_UNMODIFIED_SINC; default: return -1; } IF_UNMODIFIED_SINC: NEXT_CHAR(); switch (ch) { case 'E': if (last) { return 41; } goto 
IF_UNMODIFIED_SINCE; case 'e': if (last) { return 41; } goto IF_UNMODIFIED_SINCE; default: return -1; } K: NEXT_CHAR(); switch (ch) { case 'E': if (last) { return -1; } goto KE; case 'e': if (last) { return -1; } goto KE; default: return -1; } KE: NEXT_CHAR(); switch (ch) { case 'E': if (last) { return -1; } goto KEE; case 'e': if (last) { return -1; } goto KEE; default: return -1; } KEE: NEXT_CHAR(); switch (ch) { case 'P': if (last) { return -1; } goto KEEP; case 'p': if (last) { return -1; } goto KEEP; default: return -1; } KEEP: NEXT_CHAR(); switch (ch) { case '-': if (last) { return -1; } goto KEEP_; default: return -1; } KEEP_: NEXT_CHAR(); switch (ch) { case 'A': if (last) { return -1; } goto KEEP_A; case 'a': if (last) { return -1; } goto KEEP_A; default: return -1; } KEEP_A: NEXT_CHAR(); switch (ch) { case 'L': if (last) { return -1; } goto KEEP_AL; case 'l': if (last) { return -1; } goto KEEP_AL; default: return -1; } KEEP_AL: NEXT_CHAR(); switch (ch) { case 'I': if (last) { return -1; } goto KEEP_ALI; case 'i': if (last) { return -1; } goto KEEP_ALI; default: return -1; } KEEP_ALI: NEXT_CHAR(); switch (ch) { case 'V': if (last) { return -1; } goto KEEP_ALIV; case 'v': if (last) { return -1; } goto KEEP_ALIV; default: return -1; } KEEP_ALIV: NEXT_CHAR(); switch (ch) { case 'E': if (last) { return 42; } goto KEEP_ALIVE; case 'e': if (last) { return 42; } goto KEEP_ALIVE; default: return -1; } L: NEXT_CHAR(); switch (ch) { case 'A': if (last) { return -1; } goto LA; case 'a': if (last) { return -1; } goto LA; case 'I': if (last) { return -1; } goto LI; case 'i': if (last) { return -1; } goto LI; case 'O': if (last) { return -1; } goto LO; case 'o': if (last) { return -1; } goto LO; default: return -1; } LA: NEXT_CHAR(); switch (ch) { case 'S': if (last) { return -1; } goto LAS; case 's': if (last) { return -1; } goto LAS; default: return -1; } LAS: NEXT_CHAR(); switch (ch) { case 'T': if (last) { return -1; } goto LAST; case 't': if (last) { return -1; } goto LAST; default: return -1; } LAST: NEXT_CHAR(); switch (ch) { case '-': if (last) { return -1; } goto LAST_; default: return -1; } LAST_: NEXT_CHAR(); switch (ch) { case 'E': if (last) { return -1; } goto LAST_E; case 'e': if (last) { return -1; } goto LAST_E; case 'M': if (last) { return -1; } goto LAST_M; case 'm': if (last) { return -1; } goto LAST_M; default: return -1; } LAST_E: NEXT_CHAR(); switch (ch) { case 'V': if (last) { return -1; } goto LAST_EV; case 'v': if (last) { return -1; } goto LAST_EV; default: return -1; } LAST_EV: NEXT_CHAR(); switch (ch) { case 'E': if (last) { return -1; } goto LAST_EVE; case 'e': if (last) { return -1; } goto LAST_EVE; default: return -1; } LAST_EVE: NEXT_CHAR(); switch (ch) { case 'N': if (last) { return -1; } goto LAST_EVEN; case 'n': if (last) { return -1; } goto LAST_EVEN; default: return -1; } LAST_EVEN: NEXT_CHAR(); switch (ch) { case 'T': if (last) { return -1; } goto LAST_EVENT; case 't': if (last) { return -1; } goto LAST_EVENT; default: return -1; } LAST_EVENT: NEXT_CHAR(); switch (ch) { case '-': if (last) { return -1; } goto LAST_EVENT_; default: return -1; } LAST_EVENT_: NEXT_CHAR(); switch (ch) { case 'I': if (last) { return -1; } goto LAST_EVENT_I; case 'i': if (last) { return -1; } goto LAST_EVENT_I; default: return -1; } LAST_EVENT_I: NEXT_CHAR(); switch (ch) { case 'D': if (last) { return 43; } goto LAST_EVENT_ID; case 'd': if (last) { return 43; } goto LAST_EVENT_ID; default: return -1; } LAST_M: NEXT_CHAR(); switch (ch) { case 'O': if (last) { return -1; } goto 
LAST_MO; case 'o': if (last) { return -1; } goto LAST_MO; default: return -1; } LAST_MO: NEXT_CHAR(); switch (ch) { case 'D': if (last) { return -1; } goto LAST_MOD; case 'd': if (last) { return -1; } goto LAST_MOD; default: return -1; } LAST_MOD: NEXT_CHAR(); switch (ch) { case 'I': if (last) { return -1; } goto LAST_MODI; case 'i': if (last) { return -1; } goto LAST_MODI; default: return -1; } LAST_MODI: NEXT_CHAR(); switch (ch) { case 'F': if (last) { return -1; } goto LAST_MODIF; case 'f': if (last) { return -1; } goto LAST_MODIF; default: return -1; } LAST_MODIF: NEXT_CHAR(); switch (ch) { case 'I': if (last) { return -1; } goto LAST_MODIFI; case 'i': if (last) { return -1; } goto LAST_MODIFI; default: return -1; } LAST_MODIFI: NEXT_CHAR(); switch (ch) { case 'E': if (last) { return -1; } goto LAST_MODIFIE; case 'e': if (last) { return -1; } goto LAST_MODIFIE; default: return -1; } LAST_MODIFIE: NEXT_CHAR(); switch (ch) { case 'D': if (last) { return 44; } goto LAST_MODIFIED; case 'd': if (last) { return 44; } goto LAST_MODIFIED; default: return -1; } LI: NEXT_CHAR(); switch (ch) { case 'N': if (last) { return -1; } goto LIN; case 'n': if (last) { return -1; } goto LIN; default: return -1; } LIN: NEXT_CHAR(); switch (ch) { case 'K': if (last) { return 45; } goto LINK; case 'k': if (last) { return 45; } goto LINK; default: return -1; } LO: NEXT_CHAR(); switch (ch) { case 'C': if (last) { return -1; } goto LOC; case 'c': if (last) { return -1; } goto LOC; default: return -1; } LOC: NEXT_CHAR(); switch (ch) { case 'A': if (last) { return -1; } goto LOCA; case 'a': if (last) { return -1; } goto LOCA; default: return -1; } LOCA: NEXT_CHAR(); switch (ch) { case 'T': if (last) { return -1; } goto LOCAT; case 't': if (last) { return -1; } goto LOCAT; default: return -1; } LOCAT: NEXT_CHAR(); switch (ch) { case 'I': if (last) { return -1; } goto LOCATI; case 'i': if (last) { return -1; } goto LOCATI; default: return -1; } LOCATI: NEXT_CHAR(); switch (ch) { case 'O': if (last) { return -1; } goto LOCATIO; case 'o': if (last) { return -1; } goto LOCATIO; default: return -1; } LOCATIO: NEXT_CHAR(); switch (ch) { case 'N': if (last) { return 46; } goto LOCATION; case 'n': if (last) { return 46; } goto LOCATION; default: return -1; } M: NEXT_CHAR(); switch (ch) { case 'A': if (last) { return -1; } goto MA; case 'a': if (last) { return -1; } goto MA; default: return -1; } MA: NEXT_CHAR(); switch (ch) { case 'X': if (last) { return -1; } goto MAX; case 'x': if (last) { return -1; } goto MAX; default: return -1; } MAX: NEXT_CHAR(); switch (ch) { case '-': if (last) { return -1; } goto MAX_; default: return -1; } MAX_: NEXT_CHAR(); switch (ch) { case 'F': if (last) { return -1; } goto MAX_F; case 'f': if (last) { return -1; } goto MAX_F; default: return -1; } MAX_F: NEXT_CHAR(); switch (ch) { case 'O': if (last) { return -1; } goto MAX_FO; case 'o': if (last) { return -1; } goto MAX_FO; default: return -1; } MAX_FO: NEXT_CHAR(); switch (ch) { case 'R': if (last) { return -1; } goto MAX_FOR; case 'r': if (last) { return -1; } goto MAX_FOR; default: return -1; } MAX_FOR: NEXT_CHAR(); switch (ch) { case 'W': if (last) { return -1; } goto MAX_FORW; case 'w': if (last) { return -1; } goto MAX_FORW; default: return -1; } MAX_FORW: NEXT_CHAR(); switch (ch) { case 'A': if (last) { return -1; } goto MAX_FORWA; case 'a': if (last) { return -1; } goto MAX_FORWA; default: return -1; } MAX_FORWA: NEXT_CHAR(); switch (ch) { case 'R': if (last) { return -1; } goto MAX_FORWAR; case 'r': if (last) { return -1; } goto 
MAX_FORWAR; default: return -1; } MAX_FORWAR: NEXT_CHAR(); switch (ch) { case 'D': if (last) { return -1; } goto MAX_FORWARD; case 'd': if (last) { return -1; } goto MAX_FORWARD; default: return -1; } MAX_FORWARD: NEXT_CHAR(); switch (ch) { case 'S': if (last) { return 47; } goto MAX_FORWARDS; case 's': if (last) { return 47; } goto MAX_FORWARDS; default: return -1; } O: NEXT_CHAR(); switch (ch) { case 'R': if (last) { return -1; } goto OR; case 'r': if (last) { return -1; } goto OR; default: return -1; } OR: NEXT_CHAR(); switch (ch) { case 'I': if (last) { return -1; } goto ORI; case 'i': if (last) { return -1; } goto ORI; default: return -1; } ORI: NEXT_CHAR(); switch (ch) { case 'G': if (last) { return -1; } goto ORIG; case 'g': if (last) { return -1; } goto ORIG; default: return -1; } ORIG: NEXT_CHAR(); switch (ch) { case 'I': if (last) { return -1; } goto ORIGI; case 'i': if (last) { return -1; } goto ORIGI; default: return -1; } ORIGI: NEXT_CHAR(); switch (ch) { case 'N': if (last) { return 48; } goto ORIGIN; case 'n': if (last) { return 48; } goto ORIGIN; default: return -1; } P: NEXT_CHAR(); switch (ch) { case 'R': if (last) { return -1; } goto PR; case 'r': if (last) { return -1; } goto PR; default: return -1; } PR: NEXT_CHAR(); switch (ch) { case 'A': if (last) { return -1; } goto PRA; case 'a': if (last) { return -1; } goto PRA; case 'O': if (last) { return -1; } goto PRO; case 'o': if (last) { return -1; } goto PRO; default: return -1; } PRA: NEXT_CHAR(); switch (ch) { case 'G': if (last) { return -1; } goto PRAG; case 'g': if (last) { return -1; } goto PRAG; default: return -1; } PRAG: NEXT_CHAR(); switch (ch) { case 'M': if (last) { return -1; } goto PRAGM; case 'm': if (last) { return -1; } goto PRAGM; default: return -1; } PRAGM: NEXT_CHAR(); switch (ch) { case 'A': if (last) { return 49; } goto PRAGMA; case 'a': if (last) { return 49; } goto PRAGMA; default: return -1; } PRO: NEXT_CHAR(); switch (ch) { case 'X': if (last) { return -1; } goto PROX; case 'x': if (last) { return -1; } goto PROX; default: return -1; } PROX: NEXT_CHAR(); switch (ch) { case 'Y': if (last) { return -1; } goto PROXY; case 'y': if (last) { return -1; } goto PROXY; default: return -1; } PROXY: NEXT_CHAR(); switch (ch) { case '-': if (last) { return -1; } goto PROXY_; default: return -1; } PROXY_: NEXT_CHAR(); switch (ch) { case 'A': if (last) { return -1; } goto PROXY_A; case 'a': if (last) { return -1; } goto PROXY_A; default: return -1; } PROXY_A: NEXT_CHAR(); switch (ch) { case 'U': if (last) { return -1; } goto PROXY_AU; case 'u': if (last) { return -1; } goto PROXY_AU; default: return -1; } PROXY_AU: NEXT_CHAR(); switch (ch) { case 'T': if (last) { return -1; } goto PROXY_AUT; case 't': if (last) { return -1; } goto PROXY_AUT; default: return -1; } PROXY_AUT: NEXT_CHAR(); switch (ch) { case 'H': if (last) { return -1; } goto PROXY_AUTH; case 'h': if (last) { return -1; } goto PROXY_AUTH; default: return -1; } PROXY_AUTH: NEXT_CHAR(); switch (ch) { case 'E': if (last) { return -1; } goto PROXY_AUTHE; case 'e': if (last) { return -1; } goto PROXY_AUTHE; case 'O': if (last) { return -1; } goto PROXY_AUTHO; case 'o': if (last) { return -1; } goto PROXY_AUTHO; default: return -1; } PROXY_AUTHE: NEXT_CHAR(); switch (ch) { case 'N': if (last) { return -1; } goto PROXY_AUTHEN; case 'n': if (last) { return -1; } goto PROXY_AUTHEN; default: return -1; } PROXY_AUTHEN: NEXT_CHAR(); switch (ch) { case 'T': if (last) { return -1; } goto PROXY_AUTHENT; case 't': if (last) { return -1; } goto PROXY_AUTHENT; 
default: return -1; } PROXY_AUTHENT: NEXT_CHAR(); switch (ch) { case 'I': if (last) { return -1; } goto PROXY_AUTHENTI; case 'i': if (last) { return -1; } goto PROXY_AUTHENTI; default: return -1; } PROXY_AUTHENTI: NEXT_CHAR(); switch (ch) { case 'C': if (last) { return -1; } goto PROXY_AUTHENTIC; case 'c': if (last) { return -1; } goto PROXY_AUTHENTIC; default: return -1; } PROXY_AUTHENTIC: NEXT_CHAR(); switch (ch) { case 'A': if (last) { return -1; } goto PROXY_AUTHENTICA; case 'a': if (last) { return -1; } goto PROXY_AUTHENTICA; default: return -1; } PROXY_AUTHENTICA: NEXT_CHAR(); switch (ch) { case 'T': if (last) { return -1; } goto PROXY_AUTHENTICAT; case 't': if (last) { return -1; } goto PROXY_AUTHENTICAT; default: return -1; } PROXY_AUTHENTICAT: NEXT_CHAR(); switch (ch) { case 'E': if (last) { return 50; } goto PROXY_AUTHENTICATE; case 'e': if (last) { return 50; } goto PROXY_AUTHENTICATE; default: return -1; } PROXY_AUTHO: NEXT_CHAR(); switch (ch) { case 'R': if (last) { return -1; } goto PROXY_AUTHOR; case 'r': if (last) { return -1; } goto PROXY_AUTHOR; default: return -1; } PROXY_AUTHOR: NEXT_CHAR(); switch (ch) { case 'I': if (last) { return -1; } goto PROXY_AUTHORI; case 'i': if (last) { return -1; } goto PROXY_AUTHORI; default: return -1; } PROXY_AUTHORI: NEXT_CHAR(); switch (ch) { case 'Z': if (last) { return -1; } goto PROXY_AUTHORIZ; case 'z': if (last) { return -1; } goto PROXY_AUTHORIZ; default: return -1; } PROXY_AUTHORIZ: NEXT_CHAR(); switch (ch) { case 'A': if (last) { return -1; } goto PROXY_AUTHORIZA; case 'a': if (last) { return -1; } goto PROXY_AUTHORIZA; default: return -1; } PROXY_AUTHORIZA: NEXT_CHAR(); switch (ch) { case 'T': if (last) { return -1; } goto PROXY_AUTHORIZAT; case 't': if (last) { return -1; } goto PROXY_AUTHORIZAT; default: return -1; } PROXY_AUTHORIZAT: NEXT_CHAR(); switch (ch) { case 'I': if (last) { return -1; } goto PROXY_AUTHORIZATI; case 'i': if (last) { return -1; } goto PROXY_AUTHORIZATI; default: return -1; } PROXY_AUTHORIZATI: NEXT_CHAR(); switch (ch) { case 'O': if (last) { return -1; } goto PROXY_AUTHORIZATIO; case 'o': if (last) { return -1; } goto PROXY_AUTHORIZATIO; default: return -1; } PROXY_AUTHORIZATIO: NEXT_CHAR(); switch (ch) { case 'N': if (last) { return 51; } goto PROXY_AUTHORIZATION; case 'n': if (last) { return 51; } goto PROXY_AUTHORIZATION; default: return -1; } R: NEXT_CHAR(); switch (ch) { case 'A': if (last) { return -1; } goto RA; case 'a': if (last) { return -1; } goto RA; case 'E': if (last) { return -1; } goto RE; case 'e': if (last) { return -1; } goto RE; default: return -1; } RA: NEXT_CHAR(); switch (ch) { case 'N': if (last) { return -1; } goto RAN; case 'n': if (last) { return -1; } goto RAN; default: return -1; } RAN: NEXT_CHAR(); switch (ch) { case 'G': if (last) { return -1; } goto RANG; case 'g': if (last) { return -1; } goto RANG; default: return -1; } RANG: NEXT_CHAR(); switch (ch) { case 'E': if (last) { return 52; } goto RANGE; case 'e': if (last) { return 52; } goto RANGE; default: return -1; } RE: NEXT_CHAR(); switch (ch) { case 'F': if (last) { return -1; } goto REF; case 'f': if (last) { return -1; } goto REF; case 'T': if (last) { return -1; } goto RET; case 't': if (last) { return -1; } goto RET; default: return -1; } REF: NEXT_CHAR(); switch (ch) { case 'E': if (last) { return -1; } goto REFE; case 'e': if (last) { return -1; } goto REFE; default: return -1; } REFE: NEXT_CHAR(); switch (ch) { case 'R': if (last) { return -1; } goto REFER; case 'r': if (last) { return -1; } goto REFER; 
default: return -1; } REFER: NEXT_CHAR(); switch (ch) { case 'E': if (last) { return -1; } goto REFERE; case 'e': if (last) { return -1; } goto REFERE; default: return -1; } REFERE: NEXT_CHAR(); switch (ch) { case 'R': if (last) { return 53; } goto REFERER; case 'r': if (last) { return 53; } goto REFERER; default: return -1; } RET: NEXT_CHAR(); switch (ch) { case 'R': if (last) { return -1; } goto RETR; case 'r': if (last) { return -1; } goto RETR; default: return -1; } RETR: NEXT_CHAR(); switch (ch) { case 'Y': if (last) { return -1; } goto RETRY; case 'y': if (last) { return -1; } goto RETRY; default: return -1; } RETRY: NEXT_CHAR(); switch (ch) { case '-': if (last) { return -1; } goto RETRY_; default: return -1; } RETRY_: NEXT_CHAR(); switch (ch) { case 'A': if (last) { return -1; } goto RETRY_A; case 'a': if (last) { return -1; } goto RETRY_A; default: return -1; } RETRY_A: NEXT_CHAR(); switch (ch) { case 'F': if (last) { return -1; } goto RETRY_AF; case 'f': if (last) { return -1; } goto RETRY_AF; default: return -1; } RETRY_AF: NEXT_CHAR(); switch (ch) { case 'T': if (last) { return -1; } goto RETRY_AFT; case 't': if (last) { return -1; } goto RETRY_AFT; default: return -1; } RETRY_AFT: NEXT_CHAR(); switch (ch) { case 'E': if (last) { return -1; } goto RETRY_AFTE; case 'e': if (last) { return -1; } goto RETRY_AFTE; default: return -1; } RETRY_AFTE: NEXT_CHAR(); switch (ch) { case 'R': if (last) { return 54; } goto RETRY_AFTER; case 'r': if (last) { return 54; } goto RETRY_AFTER; default: return -1; } S: NEXT_CHAR(); switch (ch) { case 'E': if (last) { return -1; } goto SE; case 'e': if (last) { return -1; } goto SE; default: return -1; } SE: NEXT_CHAR(); switch (ch) { case 'C': if (last) { return -1; } goto SEC; case 'c': if (last) { return -1; } goto SEC; case 'R': if (last) { return -1; } goto SER; case 'r': if (last) { return -1; } goto SER; case 'T': if (last) { return -1; } goto SET; case 't': if (last) { return -1; } goto SET; default: return -1; } SEC: NEXT_CHAR(); switch (ch) { case '-': if (last) { return -1; } goto SEC_; default: return -1; } SEC_: NEXT_CHAR(); switch (ch) { case 'W': if (last) { return -1; } goto SEC_W; case 'w': if (last) { return -1; } goto SEC_W; default: return -1; } SEC_W: NEXT_CHAR(); switch (ch) { case 'E': if (last) { return -1; } goto SEC_WE; case 'e': if (last) { return -1; } goto SEC_WE; default: return -1; } SEC_WE: NEXT_CHAR(); switch (ch) { case 'B': if (last) { return -1; } goto SEC_WEB; case 'b': if (last) { return -1; } goto SEC_WEB; default: return -1; } SEC_WEB: NEXT_CHAR(); switch (ch) { case 'S': if (last) { return -1; } goto SEC_WEBS; case 's': if (last) { return -1; } goto SEC_WEBS; default: return -1; } SEC_WEBS: NEXT_CHAR(); switch (ch) { case 'O': if (last) { return -1; } goto SEC_WEBSO; case 'o': if (last) { return -1; } goto SEC_WEBSO; default: return -1; } SEC_WEBSO: NEXT_CHAR(); switch (ch) { case 'C': if (last) { return -1; } goto SEC_WEBSOC; case 'c': if (last) { return -1; } goto SEC_WEBSOC; default: return -1; } SEC_WEBSOC: NEXT_CHAR(); switch (ch) { case 'K': if (last) { return -1; } goto SEC_WEBSOCK; case 'k': if (last) { return -1; } goto SEC_WEBSOCK; default: return -1; } SEC_WEBSOCK: NEXT_CHAR(); switch (ch) { case 'E': if (last) { return -1; } goto SEC_WEBSOCKE; case 'e': if (last) { return -1; } goto SEC_WEBSOCKE; default: return -1; } SEC_WEBSOCKE: NEXT_CHAR(); switch (ch) { case 'T': if (last) { return -1; } goto SEC_WEBSOCKET; case 't': if (last) { return -1; } goto SEC_WEBSOCKET; default: return -1; } 
SEC_WEBSOCKET: NEXT_CHAR(); switch (ch) { case '-': if (last) { return -1; } goto SEC_WEBSOCKET_; default: return -1; } SEC_WEBSOCKET_: NEXT_CHAR(); switch (ch) { case 'A': if (last) { return -1; } goto SEC_WEBSOCKET_A; case 'a': if (last) { return -1; } goto SEC_WEBSOCKET_A; case 'E': if (last) { return -1; } goto SEC_WEBSOCKET_E; case 'e': if (last) { return -1; } goto SEC_WEBSOCKET_E; case 'K': if (last) { return -1; } goto SEC_WEBSOCKET_K; case 'k': if (last) { return -1; } goto SEC_WEBSOCKET_K; case 'P': if (last) { return -1; } goto SEC_WEBSOCKET_P; case 'p': if (last) { return -1; } goto SEC_WEBSOCKET_P; case 'V': if (last) { return -1; } goto SEC_WEBSOCKET_V; case 'v': if (last) { return -1; } goto SEC_WEBSOCKET_V; default: return -1; } SEC_WEBSOCKET_A: NEXT_CHAR(); switch (ch) { case 'C': if (last) { return -1; } goto SEC_WEBSOCKET_AC; case 'c': if (last) { return -1; } goto SEC_WEBSOCKET_AC; default: return -1; } SEC_WEBSOCKET_AC: NEXT_CHAR(); switch (ch) { case 'C': if (last) { return -1; } goto SEC_WEBSOCKET_ACC; case 'c': if (last) { return -1; } goto SEC_WEBSOCKET_ACC; default: return -1; } SEC_WEBSOCKET_ACC: NEXT_CHAR(); switch (ch) { case 'E': if (last) { return -1; } goto SEC_WEBSOCKET_ACCE; case 'e': if (last) { return -1; } goto SEC_WEBSOCKET_ACCE; default: return -1; } SEC_WEBSOCKET_ACCE: NEXT_CHAR(); switch (ch) { case 'P': if (last) { return -1; } goto SEC_WEBSOCKET_ACCEP; case 'p': if (last) { return -1; } goto SEC_WEBSOCKET_ACCEP; default: return -1; } SEC_WEBSOCKET_ACCEP: NEXT_CHAR(); switch (ch) { case 'T': if (last) { return 55; } goto SEC_WEBSOCKET_ACCEPT; case 't': if (last) { return 55; } goto SEC_WEBSOCKET_ACCEPT; default: return -1; } SEC_WEBSOCKET_E: NEXT_CHAR(); switch (ch) { case 'X': if (last) { return -1; } goto SEC_WEBSOCKET_EX; case 'x': if (last) { return -1; } goto SEC_WEBSOCKET_EX; default: return -1; } SEC_WEBSOCKET_EX: NEXT_CHAR(); switch (ch) { case 'T': if (last) { return -1; } goto SEC_WEBSOCKET_EXT; case 't': if (last) { return -1; } goto SEC_WEBSOCKET_EXT; default: return -1; } SEC_WEBSOCKET_EXT: NEXT_CHAR(); switch (ch) { case 'E': if (last) { return -1; } goto SEC_WEBSOCKET_EXTE; case 'e': if (last) { return -1; } goto SEC_WEBSOCKET_EXTE; default: return -1; } SEC_WEBSOCKET_EXTE: NEXT_CHAR(); switch (ch) { case 'N': if (last) { return -1; } goto SEC_WEBSOCKET_EXTEN; case 'n': if (last) { return -1; } goto SEC_WEBSOCKET_EXTEN; default: return -1; } SEC_WEBSOCKET_EXTEN: NEXT_CHAR(); switch (ch) { case 'S': if (last) { return -1; } goto SEC_WEBSOCKET_EXTENS; case 's': if (last) { return -1; } goto SEC_WEBSOCKET_EXTENS; default: return -1; } SEC_WEBSOCKET_EXTENS: NEXT_CHAR(); switch (ch) { case 'I': if (last) { return -1; } goto SEC_WEBSOCKET_EXTENSI; case 'i': if (last) { return -1; } goto SEC_WEBSOCKET_EXTENSI; default: return -1; } SEC_WEBSOCKET_EXTENSI: NEXT_CHAR(); switch (ch) { case 'O': if (last) { return -1; } goto SEC_WEBSOCKET_EXTENSIO; case 'o': if (last) { return -1; } goto SEC_WEBSOCKET_EXTENSIO; default: return -1; } SEC_WEBSOCKET_EXTENSIO: NEXT_CHAR(); switch (ch) { case 'N': if (last) { return -1; } goto SEC_WEBSOCKET_EXTENSION; case 'n': if (last) { return -1; } goto SEC_WEBSOCKET_EXTENSION; default: return -1; } SEC_WEBSOCKET_EXTENSION: NEXT_CHAR(); switch (ch) { case 'S': if (last) { return 56; } goto SEC_WEBSOCKET_EXTENSIONS; case 's': if (last) { return 56; } goto SEC_WEBSOCKET_EXTENSIONS; default: return -1; } SEC_WEBSOCKET_K: NEXT_CHAR(); switch (ch) { case 'E': if (last) { return -1; } goto SEC_WEBSOCKET_KE; case 'e': 
if (last) { return -1; } goto SEC_WEBSOCKET_KE; default: return -1; } SEC_WEBSOCKET_KE: NEXT_CHAR(); switch (ch) { case 'Y': if (last) { return 57; } goto SEC_WEBSOCKET_KEY; case 'y': if (last) { return 57; } goto SEC_WEBSOCKET_KEY; default: return -1; } SEC_WEBSOCKET_KEY: NEXT_CHAR(); switch (ch) { case '1': if (last) { return 58; } goto SEC_WEBSOCKET_KEY1; default: return -1; } SEC_WEBSOCKET_P: NEXT_CHAR(); switch (ch) { case 'R': if (last) { return -1; } goto SEC_WEBSOCKET_PR; case 'r': if (last) { return -1; } goto SEC_WEBSOCKET_PR; default: return -1; } SEC_WEBSOCKET_PR: NEXT_CHAR(); switch (ch) { case 'O': if (last) { return -1; } goto SEC_WEBSOCKET_PRO; case 'o': if (last) { return -1; } goto SEC_WEBSOCKET_PRO; default: return -1; } SEC_WEBSOCKET_PRO: NEXT_CHAR(); switch (ch) { case 'T': if (last) { return -1; } goto SEC_WEBSOCKET_PROT; case 't': if (last) { return -1; } goto SEC_WEBSOCKET_PROT; default: return -1; } SEC_WEBSOCKET_PROT: NEXT_CHAR(); switch (ch) { case 'O': if (last) { return -1; } goto SEC_WEBSOCKET_PROTO; case 'o': if (last) { return -1; } goto SEC_WEBSOCKET_PROTO; default: return -1; } SEC_WEBSOCKET_PROTO: NEXT_CHAR(); switch (ch) { case 'C': if (last) { return -1; } goto SEC_WEBSOCKET_PROTOC; case 'c': if (last) { return -1; } goto SEC_WEBSOCKET_PROTOC; default: return -1; } SEC_WEBSOCKET_PROTOC: NEXT_CHAR(); switch (ch) { case 'O': if (last) { return -1; } goto SEC_WEBSOCKET_PROTOCO; case 'o': if (last) { return -1; } goto SEC_WEBSOCKET_PROTOCO; default: return -1; } SEC_WEBSOCKET_PROTOCO: NEXT_CHAR(); switch (ch) { case 'L': if (last) { return 59; } goto SEC_WEBSOCKET_PROTOCOL; case 'l': if (last) { return 59; } goto SEC_WEBSOCKET_PROTOCOL; default: return -1; } SEC_WEBSOCKET_V: NEXT_CHAR(); switch (ch) { case 'E': if (last) { return -1; } goto SEC_WEBSOCKET_VE; case 'e': if (last) { return -1; } goto SEC_WEBSOCKET_VE; default: return -1; } SEC_WEBSOCKET_VE: NEXT_CHAR(); switch (ch) { case 'R': if (last) { return -1; } goto SEC_WEBSOCKET_VER; case 'r': if (last) { return -1; } goto SEC_WEBSOCKET_VER; default: return -1; } SEC_WEBSOCKET_VER: NEXT_CHAR(); switch (ch) { case 'S': if (last) { return -1; } goto SEC_WEBSOCKET_VERS; case 's': if (last) { return -1; } goto SEC_WEBSOCKET_VERS; default: return -1; } SEC_WEBSOCKET_VERS: NEXT_CHAR(); switch (ch) { case 'I': if (last) { return -1; } goto SEC_WEBSOCKET_VERSI; case 'i': if (last) { return -1; } goto SEC_WEBSOCKET_VERSI; default: return -1; } SEC_WEBSOCKET_VERSI: NEXT_CHAR(); switch (ch) { case 'O': if (last) { return -1; } goto SEC_WEBSOCKET_VERSIO; case 'o': if (last) { return -1; } goto SEC_WEBSOCKET_VERSIO; default: return -1; } SEC_WEBSOCKET_VERSIO: NEXT_CHAR(); switch (ch) { case 'N': if (last) { return 60; } goto SEC_WEBSOCKET_VERSION; case 'n': if (last) { return 60; } goto SEC_WEBSOCKET_VERSION; default: return -1; } SER: NEXT_CHAR(); switch (ch) { case 'V': if (last) { return -1; } goto SERV; case 'v': if (last) { return -1; } goto SERV; default: return -1; } SERV: NEXT_CHAR(); switch (ch) { case 'E': if (last) { return -1; } goto SERVE; case 'e': if (last) { return -1; } goto SERVE; default: return -1; } SERVE: NEXT_CHAR(); switch (ch) { case 'R': if (last) { return 61; } goto SERVER; case 'r': if (last) { return 61; } goto SERVER; default: return -1; } SET: NEXT_CHAR(); switch (ch) { case '-': if (last) { return -1; } goto SET_; default: return -1; } SET_: NEXT_CHAR(); switch (ch) { case 'C': if (last) { return -1; } goto SET_C; case 'c': if (last) { return -1; } goto SET_C; default: return -1; } 
SET_C: NEXT_CHAR(); switch (ch) { case 'O': if (last) { return -1; } goto SET_CO; case 'o': if (last) { return -1; } goto SET_CO; default: return -1; } SET_CO: NEXT_CHAR(); switch (ch) { case 'O': if (last) { return -1; } goto SET_COO; case 'o': if (last) { return -1; } goto SET_COO; default: return -1; } SET_COO: NEXT_CHAR(); switch (ch) { case 'K': if (last) { return -1; } goto SET_COOK; case 'k': if (last) { return -1; } goto SET_COOK; default: return -1; } SET_COOK: NEXT_CHAR(); switch (ch) { case 'I': if (last) { return -1; } goto SET_COOKI; case 'i': if (last) { return -1; } goto SET_COOKI; default: return -1; } SET_COOKI: NEXT_CHAR(); switch (ch) { case 'E': if (last) { return 62; } goto SET_COOKIE; case 'e': if (last) { return 62; } goto SET_COOKIE; default: return -1; } T: NEXT_CHAR(); switch (ch) { case 'E': if (last) { return 63; } goto TE; case 'e': if (last) { return 63; } goto TE; case 'R': if (last) { return -1; } goto TR; case 'r': if (last) { return -1; } goto TR; default: return -1; } TR: NEXT_CHAR(); switch (ch) { case 'A': if (last) { return -1; } goto TRA; case 'a': if (last) { return -1; } goto TRA; default: return -1; } TRA: NEXT_CHAR(); switch (ch) { case 'I': if (last) { return -1; } goto TRAI; case 'i': if (last) { return -1; } goto TRAI; case 'N': if (last) { return -1; } goto TRAN; case 'n': if (last) { return -1; } goto TRAN; default: return -1; } TRAI: NEXT_CHAR(); switch (ch) { case 'L': if (last) { return -1; } goto TRAIL; case 'l': if (last) { return -1; } goto TRAIL; default: return -1; } TRAIL: NEXT_CHAR(); switch (ch) { case 'E': if (last) { return -1; } goto TRAILE; case 'e': if (last) { return -1; } goto TRAILE; default: return -1; } TRAILE: NEXT_CHAR(); switch (ch) { case 'R': if (last) { return 64; } goto TRAILER; case 'r': if (last) { return 64; } goto TRAILER; default: return -1; } TRAN: NEXT_CHAR(); switch (ch) { case 'S': if (last) { return -1; } goto TRANS; case 's': if (last) { return -1; } goto TRANS; default: return -1; } TRANS: NEXT_CHAR(); switch (ch) { case 'F': if (last) { return -1; } goto TRANSF; case 'f': if (last) { return -1; } goto TRANSF; default: return -1; } TRANSF: NEXT_CHAR(); switch (ch) { case 'E': if (last) { return -1; } goto TRANSFE; case 'e': if (last) { return -1; } goto TRANSFE; default: return -1; } TRANSFE: NEXT_CHAR(); switch (ch) { case 'R': if (last) { return -1; } goto TRANSFER; case 'r': if (last) { return -1; } goto TRANSFER; default: return -1; } TRANSFER: NEXT_CHAR(); switch (ch) { case '-': if (last) { return -1; } goto TRANSFER_; default: return -1; } TRANSFER_: NEXT_CHAR(); switch (ch) { case 'E': if (last) { return -1; } goto TRANSFER_E; case 'e': if (last) { return -1; } goto TRANSFER_E; default: return -1; } TRANSFER_E: NEXT_CHAR(); switch (ch) { case 'N': if (last) { return -1; } goto TRANSFER_EN; case 'n': if (last) { return -1; } goto TRANSFER_EN; default: return -1; } TRANSFER_EN: NEXT_CHAR(); switch (ch) { case 'C': if (last) { return -1; } goto TRANSFER_ENC; case 'c': if (last) { return -1; } goto TRANSFER_ENC; default: return -1; } TRANSFER_ENC: NEXT_CHAR(); switch (ch) { case 'O': if (last) { return -1; } goto TRANSFER_ENCO; case 'o': if (last) { return -1; } goto TRANSFER_ENCO; default: return -1; } TRANSFER_ENCO: NEXT_CHAR(); switch (ch) { case 'D': if (last) { return -1; } goto TRANSFER_ENCOD; case 'd': if (last) { return -1; } goto TRANSFER_ENCOD; default: return -1; } TRANSFER_ENCOD: NEXT_CHAR(); switch (ch) { case 'I': if (last) { return -1; } goto TRANSFER_ENCODI; case 'i': if (last) { 
return -1; } goto TRANSFER_ENCODI; default: return -1; } TRANSFER_ENCODI: NEXT_CHAR(); switch (ch) { case 'N': if (last) { return -1; } goto TRANSFER_ENCODIN; case 'n': if (last) { return -1; } goto TRANSFER_ENCODIN; default: return -1; } TRANSFER_ENCODIN: NEXT_CHAR(); switch (ch) { case 'G': if (last) { return 65; } goto TRANSFER_ENCODING; case 'g': if (last) { return 65; } goto TRANSFER_ENCODING; default: return -1; } U: NEXT_CHAR(); switch (ch) { case 'P': if (last) { return -1; } goto UP; case 'p': if (last) { return -1; } goto UP; case 'R': if (last) { return -1; } goto UR; case 'r': if (last) { return -1; } goto UR; case 'S': if (last) { return -1; } goto US; case 's': if (last) { return -1; } goto US; default: return -1; } UP: NEXT_CHAR(); switch (ch) { case 'G': if (last) { return -1; } goto UPG; case 'g': if (last) { return -1; } goto UPG; default: return -1; } UPG: NEXT_CHAR(); switch (ch) { case 'R': if (last) { return -1; } goto UPGR; case 'r': if (last) { return -1; } goto UPGR; default: return -1; } UPGR: NEXT_CHAR(); switch (ch) { case 'A': if (last) { return -1; } goto UPGRA; case 'a': if (last) { return -1; } goto UPGRA; default: return -1; } UPGRA: NEXT_CHAR(); switch (ch) { case 'D': if (last) { return -1; } goto UPGRAD; case 'd': if (last) { return -1; } goto UPGRAD; default: return -1; } UPGRAD: NEXT_CHAR(); switch (ch) { case 'E': if (last) { return 66; } goto UPGRADE; case 'e': if (last) { return 66; } goto UPGRADE; default: return -1; } UR: NEXT_CHAR(); switch (ch) { case 'I': if (last) { return 67; } goto URI; case 'i': if (last) { return 67; } goto URI; default: return -1; } US: NEXT_CHAR(); switch (ch) { case 'E': if (last) { return -1; } goto USE; case 'e': if (last) { return -1; } goto USE; default: return -1; } USE: NEXT_CHAR(); switch (ch) { case 'R': if (last) { return -1; } goto USER; case 'r': if (last) { return -1; } goto USER; default: return -1; } USER: NEXT_CHAR(); switch (ch) { case '-': if (last) { return -1; } goto USER_; default: return -1; } USER_: NEXT_CHAR(); switch (ch) { case 'A': if (last) { return -1; } goto USER_A; case 'a': if (last) { return -1; } goto USER_A; default: return -1; } USER_A: NEXT_CHAR(); switch (ch) { case 'G': if (last) { return -1; } goto USER_AG; case 'g': if (last) { return -1; } goto USER_AG; default: return -1; } USER_AG: NEXT_CHAR(); switch (ch) { case 'E': if (last) { return -1; } goto USER_AGE; case 'e': if (last) { return -1; } goto USER_AGE; default: return -1; } USER_AGE: NEXT_CHAR(); switch (ch) { case 'N': if (last) { return -1; } goto USER_AGEN; case 'n': if (last) { return -1; } goto USER_AGEN; default: return -1; } USER_AGEN: NEXT_CHAR(); switch (ch) { case 'T': if (last) { return 68; } goto USER_AGENT; case 't': if (last) { return 68; } goto USER_AGENT; default: return -1; } V: NEXT_CHAR(); switch (ch) { case 'A': if (last) { return -1; } goto VA; case 'a': if (last) { return -1; } goto VA; case 'I': if (last) { return -1; } goto VI; case 'i': if (last) { return -1; } goto VI; default: return -1; } VA: NEXT_CHAR(); switch (ch) { case 'R': if (last) { return -1; } goto VAR; case 'r': if (last) { return -1; } goto VAR; default: return -1; } VAR: NEXT_CHAR(); switch (ch) { case 'Y': if (last) { return 69; } goto VARY; case 'y': if (last) { return 69; } goto VARY; default: return -1; } VI: NEXT_CHAR(); switch (ch) { case 'A': if (last) { return 70; } goto VIA; case 'a': if (last) { return 70; } goto VIA; default: return -1; } W: NEXT_CHAR(); switch (ch) { case 'A': if (last) { return -1; } goto WA; case 'a': 
if (last) { return -1; } goto WA; case 'E': if (last) { return -1; } goto WE; case 'e': if (last) { return -1; } goto WE; case 'W': if (last) { return -1; } goto WW; case 'w': if (last) { return -1; } goto WW; default: return -1; } WA: NEXT_CHAR(); switch (ch) { case 'N': if (last) { return -1; } goto WAN; case 'n': if (last) { return -1; } goto WAN; case 'R': if (last) { return -1; } goto WAR; case 'r': if (last) { return -1; } goto WAR; default: return -1; } WAN: NEXT_CHAR(); switch (ch) { case 'T': if (last) { return -1; } goto WANT; case 't': if (last) { return -1; } goto WANT; default: return -1; } WANT: NEXT_CHAR(); switch (ch) { case '-': if (last) { return -1; } goto WANT_; default: return -1; } WANT_: NEXT_CHAR(); switch (ch) { case 'D': if (last) { return -1; } goto WANT_D; case 'd': if (last) { return -1; } goto WANT_D; default: return -1; } WANT_D: NEXT_CHAR(); switch (ch) { case 'I': if (last) { return -1; } goto WANT_DI; case 'i': if (last) { return -1; } goto WANT_DI; default: return -1; } WANT_DI: NEXT_CHAR(); switch (ch) { case 'G': if (last) { return -1; } goto WANT_DIG; case 'g': if (last) { return -1; } goto WANT_DIG; default: return -1; } WANT_DIG: NEXT_CHAR(); switch (ch) { case 'E': if (last) { return -1; } goto WANT_DIGE; case 'e': if (last) { return -1; } goto WANT_DIGE; default: return -1; } WANT_DIGE: NEXT_CHAR(); switch (ch) { case 'S': if (last) { return -1; } goto WANT_DIGES; case 's': if (last) { return -1; } goto WANT_DIGES; default: return -1; } WANT_DIGES: NEXT_CHAR(); switch (ch) { case 'T': if (last) { return 71; } goto WANT_DIGEST; case 't': if (last) { return 71; } goto WANT_DIGEST; default: return -1; } WAR: NEXT_CHAR(); switch (ch) { case 'N': if (last) { return -1; } goto WARN; case 'n': if (last) { return -1; } goto WARN; default: return -1; } WARN: NEXT_CHAR(); switch (ch) { case 'I': if (last) { return -1; } goto WARNI; case 'i': if (last) { return -1; } goto WARNI; default: return -1; } WARNI: NEXT_CHAR(); switch (ch) { case 'N': if (last) { return -1; } goto WARNIN; case 'n': if (last) { return -1; } goto WARNIN; default: return -1; } WARNIN: NEXT_CHAR(); switch (ch) { case 'G': if (last) { return 72; } goto WARNING; case 'g': if (last) { return 72; } goto WARNING; default: return -1; } WE: NEXT_CHAR(); switch (ch) { case 'B': if (last) { return -1; } goto WEB; case 'b': if (last) { return -1; } goto WEB; default: return -1; } WEB: NEXT_CHAR(); switch (ch) { case 'S': if (last) { return -1; } goto WEBS; case 's': if (last) { return -1; } goto WEBS; default: return -1; } WEBS: NEXT_CHAR(); switch (ch) { case 'O': if (last) { return -1; } goto WEBSO; case 'o': if (last) { return -1; } goto WEBSO; default: return -1; } WEBSO: NEXT_CHAR(); switch (ch) { case 'C': if (last) { return -1; } goto WEBSOC; case 'c': if (last) { return -1; } goto WEBSOC; default: return -1; } WEBSOC: NEXT_CHAR(); switch (ch) { case 'K': if (last) { return -1; } goto WEBSOCK; case 'k': if (last) { return -1; } goto WEBSOCK; default: return -1; } WEBSOCK: NEXT_CHAR(); switch (ch) { case 'E': if (last) { return -1; } goto WEBSOCKE; case 'e': if (last) { return -1; } goto WEBSOCKE; default: return -1; } WEBSOCKE: NEXT_CHAR(); switch (ch) { case 'T': if (last) { return 73; } goto WEBSOCKET; case 't': if (last) { return 73; } goto WEBSOCKET; default: return -1; } WW: NEXT_CHAR(); switch (ch) { case 'W': if (last) { return -1; } goto WWW; case 'w': if (last) { return -1; } goto WWW; default: return -1; } WWW: NEXT_CHAR(); switch (ch) { case '-': if (last) { return -1; } goto 
WWW_; default: return -1; } WWW_: NEXT_CHAR(); switch (ch) { case 'A': if (last) { return -1; } goto WWW_A; case 'a': if (last) { return -1; } goto WWW_A; default: return -1; } WWW_A: NEXT_CHAR(); switch (ch) { case 'U': if (last) { return -1; } goto WWW_AU; case 'u': if (last) { return -1; } goto WWW_AU; default: return -1; } WWW_AU: NEXT_CHAR(); switch (ch) { case 'T': if (last) { return -1; } goto WWW_AUT; case 't': if (last) { return -1; } goto WWW_AUT; default: return -1; } WWW_AUT: NEXT_CHAR(); switch (ch) { case 'H': if (last) { return -1; } goto WWW_AUTH; case 'h': if (last) { return -1; } goto WWW_AUTH; default: return -1; } WWW_AUTH: NEXT_CHAR(); switch (ch) { case 'E': if (last) { return -1; } goto WWW_AUTHE; case 'e': if (last) { return -1; } goto WWW_AUTHE; default: return -1; } WWW_AUTHE: NEXT_CHAR(); switch (ch) { case 'N': if (last) { return -1; } goto WWW_AUTHEN; case 'n': if (last) { return -1; } goto WWW_AUTHEN; default: return -1; } WWW_AUTHEN: NEXT_CHAR(); switch (ch) { case 'T': if (last) { return -1; } goto WWW_AUTHENT; case 't': if (last) { return -1; } goto WWW_AUTHENT; default: return -1; } WWW_AUTHENT: NEXT_CHAR(); switch (ch) { case 'I': if (last) { return -1; } goto WWW_AUTHENTI; case 'i': if (last) { return -1; } goto WWW_AUTHENTI; default: return -1; } WWW_AUTHENTI: NEXT_CHAR(); switch (ch) { case 'C': if (last) { return -1; } goto WWW_AUTHENTIC; case 'c': if (last) { return -1; } goto WWW_AUTHENTIC; default: return -1; } WWW_AUTHENTIC: NEXT_CHAR(); switch (ch) { case 'A': if (last) { return -1; } goto WWW_AUTHENTICA; case 'a': if (last) { return -1; } goto WWW_AUTHENTICA; default: return -1; } WWW_AUTHENTICA: NEXT_CHAR(); switch (ch) { case 'T': if (last) { return -1; } goto WWW_AUTHENTICAT; case 't': if (last) { return -1; } goto WWW_AUTHENTICAT; default: return -1; } WWW_AUTHENTICAT: NEXT_CHAR(); switch (ch) { case 'E': if (last) { return 74; } goto WWW_AUTHENTICATE; case 'e': if (last) { return 74; } goto WWW_AUTHENTICATE; default: return -1; } X: NEXT_CHAR(); switch (ch) { case '-': if (last) { return -1; } goto X_; default: return -1; } X_: NEXT_CHAR(); switch (ch) { case 'F': if (last) { return -1; } goto X_F; case 'f': if (last) { return -1; } goto X_F; default: return -1; } X_F: NEXT_CHAR(); switch (ch) { case 'O': if (last) { return -1; } goto X_FO; case 'o': if (last) { return -1; } goto X_FO; default: return -1; } X_FO: NEXT_CHAR(); switch (ch) { case 'R': if (last) { return -1; } goto X_FOR; case 'r': if (last) { return -1; } goto X_FOR; default: return -1; } X_FOR: NEXT_CHAR(); switch (ch) { case 'W': if (last) { return -1; } goto X_FORW; case 'w': if (last) { return -1; } goto X_FORW; default: return -1; } X_FORW: NEXT_CHAR(); switch (ch) { case 'A': if (last) { return -1; } goto X_FORWA; case 'a': if (last) { return -1; } goto X_FORWA; default: return -1; } X_FORWA: NEXT_CHAR(); switch (ch) { case 'R': if (last) { return -1; } goto X_FORWAR; case 'r': if (last) { return -1; } goto X_FORWAR; default: return -1; } X_FORWAR: NEXT_CHAR(); switch (ch) { case 'D': if (last) { return -1; } goto X_FORWARD; case 'd': if (last) { return -1; } goto X_FORWARD; default: return -1; } X_FORWARD: NEXT_CHAR(); switch (ch) { case 'E': if (last) { return -1; } goto X_FORWARDE; case 'e': if (last) { return -1; } goto X_FORWARDE; default: return -1; } X_FORWARDE: NEXT_CHAR(); switch (ch) { case 'D': if (last) { return -1; } goto X_FORWARDED; case 'd': if (last) { return -1; } goto X_FORWARDED; default: return -1; } X_FORWARDED: NEXT_CHAR(); switch (ch) { case '-': 
if (last) { return -1; } goto X_FORWARDED_; default: return -1; } X_FORWARDED_: NEXT_CHAR(); switch (ch) { case 'F': if (last) { return -1; } goto X_FORWARDED_F; case 'f': if (last) { return -1; } goto X_FORWARDED_F; case 'H': if (last) { return -1; } goto X_FORWARDED_H; case 'h': if (last) { return -1; } goto X_FORWARDED_H; case 'P': if (last) { return -1; } goto X_FORWARDED_P; case 'p': if (last) { return -1; } goto X_FORWARDED_P; default: return -1; } X_FORWARDED_F: NEXT_CHAR(); switch (ch) { case 'O': if (last) { return -1; } goto X_FORWARDED_FO; case 'o': if (last) { return -1; } goto X_FORWARDED_FO; default: return -1; } X_FORWARDED_FO: NEXT_CHAR(); switch (ch) { case 'R': if (last) { return 75; } goto X_FORWARDED_FOR; case 'r': if (last) { return 75; } goto X_FORWARDED_FOR; default: return -1; } X_FORWARDED_H: NEXT_CHAR(); switch (ch) { case 'O': if (last) { return -1; } goto X_FORWARDED_HO; case 'o': if (last) { return -1; } goto X_FORWARDED_HO; default: return -1; } X_FORWARDED_HO: NEXT_CHAR(); switch (ch) { case 'S': if (last) { return -1; } goto X_FORWARDED_HOS; case 's': if (last) { return -1; } goto X_FORWARDED_HOS; default: return -1; } X_FORWARDED_HOS: NEXT_CHAR(); switch (ch) { case 'T': if (last) { return 76; } goto X_FORWARDED_HOST; case 't': if (last) { return 76; } goto X_FORWARDED_HOST; default: return -1; } X_FORWARDED_P: NEXT_CHAR(); switch (ch) { case 'R': if (last) { return -1; } goto X_FORWARDED_PR; case 'r': if (last) { return -1; } goto X_FORWARDED_PR; default: return -1; } X_FORWARDED_PR: NEXT_CHAR(); switch (ch) { case 'O': if (last) { return -1; } goto X_FORWARDED_PRO; case 'o': if (last) { return -1; } goto X_FORWARDED_PRO; default: return -1; } X_FORWARDED_PRO: NEXT_CHAR(); switch (ch) { case 'T': if (last) { return -1; } goto X_FORWARDED_PROT; case 't': if (last) { return -1; } goto X_FORWARDED_PROT; default: return -1; } X_FORWARDED_PROT: NEXT_CHAR(); switch (ch) { case 'O': if (last) { return 77; } goto X_FORWARDED_PROTO; case 'o': if (last) { return 77; } goto X_FORWARDED_PROTO; default: return -1; } ACCEPT_CHARSET: ACCEPT_ENCODING: ACCEPT_LANGUAGE: ACCEPT_RANGES: ACCESS_CONTROL_ALLOW_CREDENTIALS: ACCESS_CONTROL_ALLOW_HEADERS: ACCESS_CONTROL_ALLOW_METHODS: ACCESS_CONTROL_ALLOW_ORIGIN: ACCESS_CONTROL_EXPOSE_HEADERS: ACCESS_CONTROL_MAX_AGE: ACCESS_CONTROL_REQUEST_HEADERS: ACCESS_CONTROL_REQUEST_METHOD: AGE: ALLOW: AUTHORIZATION: CACHE_CONTROL: CONNECTION: CONTENT_DISPOSITION: CONTENT_ENCODING: CONTENT_LANGUAGE: CONTENT_LENGTH: CONTENT_LOCATION: CONTENT_MD5: CONTENT_RANGE: CONTENT_TRANSFER_ENCODING: CONTENT_TYPE: COOKIE: DATE: DESTINATION: DIGEST: ETAG: EXPECT: EXPIRES: FORWARDED: FROM: HOST: IF_MATCH: IF_MODIFIED_SINCE: IF_NONE_MATCH: IF_RANGE: IF_UNMODIFIED_SINCE: KEEP_ALIVE: LAST_EVENT_ID: LAST_MODIFIED: LINK: LOCATION: MAX_FORWARDS: ORIGIN: PRAGMA: PROXY_AUTHENTICATE: PROXY_AUTHORIZATION: RANGE: REFERER: RETRY_AFTER: SEC_WEBSOCKET_ACCEPT: SEC_WEBSOCKET_EXTENSIONS: SEC_WEBSOCKET_KEY1: SEC_WEBSOCKET_PROTOCOL: SEC_WEBSOCKET_VERSION: SERVER: SET_COOKIE: TE: TRAILER: TRANSFER_ENCODING: UPGRADE: URI: USER_AGENT: VARY: VIA: WANT_DIGEST: WARNING: WEBSOCKET: WWW_AUTHENTICATE: X_FORWARDED_FOR: X_FORWARDED_HOST: X_FORWARDED_PROTO: missing: /* nothing found */ return -1; } aiohttp-3.6.2/aiohttp/_find_header.h0000644000175100001650000000025213547410117017674 0ustar vstsdocker00000000000000#ifndef _FIND_HEADERS_H #define _FIND_HEADERS_H #ifdef __cplusplus extern "C" { #endif int find_header(const char *str, int size); #ifdef __cplusplus } #endif #endif 
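/*
 * Minimal usage sketch (added for illustration; NOT part of the generated
 * aiohttp sources).  It assumes the generated _find_header.c above is
 * compiled and linked together with this file.  find_header() walks the
 * header name character by character through the goto-based trie above and
 * returns that header's table index, matching case-insensitively; names that
 * fall off the trie reach the "missing" label and yield -1.  The expected
 * value 65 for Transfer-Encoding is read directly off the switch labels in
 * the generated code.
 */
#include <stdio.h>
#include <string.h>
#include "_find_header.h"

int main(void) {
    const char *known   = "transfer-encoding";  /* lookup is case-insensitive */
    const char *unknown = "X-Custom-Header";    /* not present in the trie    */

    printf("%d\n", find_header(known, (int)strlen(known)));     /* prints 65 */
    printf("%d\n", find_header(unknown, (int)strlen(unknown))); /* prints -1 */
    return 0;
}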
aiohttp-3.6.2/aiohttp/_find_header.pxd0000644000175100001650000000010413547410117020234 0ustar vstsdocker00000000000000cdef extern from "_find_header.h": int find_header(char *, int) aiohttp-3.6.2/aiohttp/_frozenlist.c0000644000175100001650000106425613547410136017656 0ustar vstsdocker00000000000000/* Generated by Cython 0.29.13 */ #define PY_SSIZE_T_CLEAN #include "Python.h" #ifndef Py_PYTHON_H #error Python headers needed to compile C extensions, please install development version of Python. #elif PY_VERSION_HEX < 0x02060000 || (0x03000000 <= PY_VERSION_HEX && PY_VERSION_HEX < 0x03030000) #error Cython requires Python 2.6+ or Python 3.3+. #else #define CYTHON_ABI "0_29_13" #define CYTHON_HEX_VERSION 0x001D0DF0 #define CYTHON_FUTURE_DIVISION 1 #include #ifndef offsetof #define offsetof(type, member) ( (size_t) & ((type*)0) -> member ) #endif #if !defined(WIN32) && !defined(MS_WINDOWS) #ifndef __stdcall #define __stdcall #endif #ifndef __cdecl #define __cdecl #endif #ifndef __fastcall #define __fastcall #endif #endif #ifndef DL_IMPORT #define DL_IMPORT(t) t #endif #ifndef DL_EXPORT #define DL_EXPORT(t) t #endif #define __PYX_COMMA , #ifndef HAVE_LONG_LONG #if PY_VERSION_HEX >= 0x02070000 #define HAVE_LONG_LONG #endif #endif #ifndef PY_LONG_LONG #define PY_LONG_LONG LONG_LONG #endif #ifndef Py_HUGE_VAL #define Py_HUGE_VAL HUGE_VAL #endif #ifdef PYPY_VERSION #define CYTHON_COMPILING_IN_PYPY 1 #define CYTHON_COMPILING_IN_PYSTON 0 #define CYTHON_COMPILING_IN_CPYTHON 0 #undef CYTHON_USE_TYPE_SLOTS #define CYTHON_USE_TYPE_SLOTS 0 #undef CYTHON_USE_PYTYPE_LOOKUP #define CYTHON_USE_PYTYPE_LOOKUP 0 #if PY_VERSION_HEX < 0x03050000 #undef CYTHON_USE_ASYNC_SLOTS #define CYTHON_USE_ASYNC_SLOTS 0 #elif !defined(CYTHON_USE_ASYNC_SLOTS) #define CYTHON_USE_ASYNC_SLOTS 1 #endif #undef CYTHON_USE_PYLIST_INTERNALS #define CYTHON_USE_PYLIST_INTERNALS 0 #undef CYTHON_USE_UNICODE_INTERNALS #define CYTHON_USE_UNICODE_INTERNALS 0 #undef CYTHON_USE_UNICODE_WRITER #define CYTHON_USE_UNICODE_WRITER 0 #undef CYTHON_USE_PYLONG_INTERNALS #define CYTHON_USE_PYLONG_INTERNALS 0 #undef CYTHON_AVOID_BORROWED_REFS #define CYTHON_AVOID_BORROWED_REFS 1 #undef CYTHON_ASSUME_SAFE_MACROS #define CYTHON_ASSUME_SAFE_MACROS 0 #undef CYTHON_UNPACK_METHODS #define CYTHON_UNPACK_METHODS 0 #undef CYTHON_FAST_THREAD_STATE #define CYTHON_FAST_THREAD_STATE 0 #undef CYTHON_FAST_PYCALL #define CYTHON_FAST_PYCALL 0 #undef CYTHON_PEP489_MULTI_PHASE_INIT #define CYTHON_PEP489_MULTI_PHASE_INIT 0 #undef CYTHON_USE_TP_FINALIZE #define CYTHON_USE_TP_FINALIZE 0 #undef CYTHON_USE_DICT_VERSIONS #define CYTHON_USE_DICT_VERSIONS 0 #undef CYTHON_USE_EXC_INFO_STACK #define CYTHON_USE_EXC_INFO_STACK 0 #elif defined(PYSTON_VERSION) #define CYTHON_COMPILING_IN_PYPY 0 #define CYTHON_COMPILING_IN_PYSTON 1 #define CYTHON_COMPILING_IN_CPYTHON 0 #ifndef CYTHON_USE_TYPE_SLOTS #define CYTHON_USE_TYPE_SLOTS 1 #endif #undef CYTHON_USE_PYTYPE_LOOKUP #define CYTHON_USE_PYTYPE_LOOKUP 0 #undef CYTHON_USE_ASYNC_SLOTS #define CYTHON_USE_ASYNC_SLOTS 0 #undef CYTHON_USE_PYLIST_INTERNALS #define CYTHON_USE_PYLIST_INTERNALS 0 #ifndef CYTHON_USE_UNICODE_INTERNALS #define CYTHON_USE_UNICODE_INTERNALS 1 #endif #undef CYTHON_USE_UNICODE_WRITER #define CYTHON_USE_UNICODE_WRITER 0 #undef CYTHON_USE_PYLONG_INTERNALS #define CYTHON_USE_PYLONG_INTERNALS 0 #ifndef CYTHON_AVOID_BORROWED_REFS #define CYTHON_AVOID_BORROWED_REFS 0 #endif #ifndef CYTHON_ASSUME_SAFE_MACROS #define CYTHON_ASSUME_SAFE_MACROS 1 #endif #ifndef CYTHON_UNPACK_METHODS #define CYTHON_UNPACK_METHODS 1 #endif #undef 
CYTHON_FAST_THREAD_STATE #define CYTHON_FAST_THREAD_STATE 0 #undef CYTHON_FAST_PYCALL #define CYTHON_FAST_PYCALL 0 #undef CYTHON_PEP489_MULTI_PHASE_INIT #define CYTHON_PEP489_MULTI_PHASE_INIT 0 #undef CYTHON_USE_TP_FINALIZE #define CYTHON_USE_TP_FINALIZE 0 #undef CYTHON_USE_DICT_VERSIONS #define CYTHON_USE_DICT_VERSIONS 0 #undef CYTHON_USE_EXC_INFO_STACK #define CYTHON_USE_EXC_INFO_STACK 0 #else #define CYTHON_COMPILING_IN_PYPY 0 #define CYTHON_COMPILING_IN_PYSTON 0 #define CYTHON_COMPILING_IN_CPYTHON 1 #ifndef CYTHON_USE_TYPE_SLOTS #define CYTHON_USE_TYPE_SLOTS 1 #endif #if PY_VERSION_HEX < 0x02070000 #undef CYTHON_USE_PYTYPE_LOOKUP #define CYTHON_USE_PYTYPE_LOOKUP 0 #elif !defined(CYTHON_USE_PYTYPE_LOOKUP) #define CYTHON_USE_PYTYPE_LOOKUP 1 #endif #if PY_MAJOR_VERSION < 3 #undef CYTHON_USE_ASYNC_SLOTS #define CYTHON_USE_ASYNC_SLOTS 0 #elif !defined(CYTHON_USE_ASYNC_SLOTS) #define CYTHON_USE_ASYNC_SLOTS 1 #endif #if PY_VERSION_HEX < 0x02070000 #undef CYTHON_USE_PYLONG_INTERNALS #define CYTHON_USE_PYLONG_INTERNALS 0 #elif !defined(CYTHON_USE_PYLONG_INTERNALS) #define CYTHON_USE_PYLONG_INTERNALS 1 #endif #ifndef CYTHON_USE_PYLIST_INTERNALS #define CYTHON_USE_PYLIST_INTERNALS 1 #endif #ifndef CYTHON_USE_UNICODE_INTERNALS #define CYTHON_USE_UNICODE_INTERNALS 1 #endif #if PY_VERSION_HEX < 0x030300F0 #undef CYTHON_USE_UNICODE_WRITER #define CYTHON_USE_UNICODE_WRITER 0 #elif !defined(CYTHON_USE_UNICODE_WRITER) #define CYTHON_USE_UNICODE_WRITER 1 #endif #ifndef CYTHON_AVOID_BORROWED_REFS #define CYTHON_AVOID_BORROWED_REFS 0 #endif #ifndef CYTHON_ASSUME_SAFE_MACROS #define CYTHON_ASSUME_SAFE_MACROS 1 #endif #ifndef CYTHON_UNPACK_METHODS #define CYTHON_UNPACK_METHODS 1 #endif #ifndef CYTHON_FAST_THREAD_STATE #define CYTHON_FAST_THREAD_STATE 1 #endif #ifndef CYTHON_FAST_PYCALL #define CYTHON_FAST_PYCALL 1 #endif #ifndef CYTHON_PEP489_MULTI_PHASE_INIT #define CYTHON_PEP489_MULTI_PHASE_INIT (PY_VERSION_HEX >= 0x03050000) #endif #ifndef CYTHON_USE_TP_FINALIZE #define CYTHON_USE_TP_FINALIZE (PY_VERSION_HEX >= 0x030400a1) #endif #ifndef CYTHON_USE_DICT_VERSIONS #define CYTHON_USE_DICT_VERSIONS (PY_VERSION_HEX >= 0x030600B1) #endif #ifndef CYTHON_USE_EXC_INFO_STACK #define CYTHON_USE_EXC_INFO_STACK (PY_VERSION_HEX >= 0x030700A3) #endif #endif #if !defined(CYTHON_FAST_PYCCALL) #define CYTHON_FAST_PYCCALL (CYTHON_FAST_PYCALL && PY_VERSION_HEX >= 0x030600B1) #endif #if CYTHON_USE_PYLONG_INTERNALS #include "longintrepr.h" #undef SHIFT #undef BASE #undef MASK #ifdef SIZEOF_VOID_P enum { __pyx_check_sizeof_voidp = 1 / (int)(SIZEOF_VOID_P == sizeof(void*)) }; #endif #endif #ifndef __has_attribute #define __has_attribute(x) 0 #endif #ifndef __has_cpp_attribute #define __has_cpp_attribute(x) 0 #endif #ifndef CYTHON_RESTRICT #if defined(__GNUC__) #define CYTHON_RESTRICT __restrict__ #elif defined(_MSC_VER) && _MSC_VER >= 1400 #define CYTHON_RESTRICT __restrict #elif defined (__STDC_VERSION__) && __STDC_VERSION__ >= 199901L #define CYTHON_RESTRICT restrict #else #define CYTHON_RESTRICT #endif #endif #ifndef CYTHON_UNUSED # if defined(__GNUC__) # if !(defined(__cplusplus)) || (__GNUC__ > 3 || (__GNUC__ == 3 && __GNUC_MINOR__ >= 4)) # define CYTHON_UNUSED __attribute__ ((__unused__)) # else # define CYTHON_UNUSED # endif # elif defined(__ICC) || (defined(__INTEL_COMPILER) && !defined(_MSC_VER)) # define CYTHON_UNUSED __attribute__ ((__unused__)) # else # define CYTHON_UNUSED # endif #endif #ifndef CYTHON_MAYBE_UNUSED_VAR # if defined(__cplusplus) template void CYTHON_MAYBE_UNUSED_VAR( const T& ) { } # else # define 
CYTHON_MAYBE_UNUSED_VAR(x) (void)(x) # endif #endif #ifndef CYTHON_NCP_UNUSED # if CYTHON_COMPILING_IN_CPYTHON # define CYTHON_NCP_UNUSED # else # define CYTHON_NCP_UNUSED CYTHON_UNUSED # endif #endif #define __Pyx_void_to_None(void_result) ((void)(void_result), Py_INCREF(Py_None), Py_None) #ifdef _MSC_VER #ifndef _MSC_STDINT_H_ #if _MSC_VER < 1300 typedef unsigned char uint8_t; typedef unsigned int uint32_t; #else typedef unsigned __int8 uint8_t; typedef unsigned __int32 uint32_t; #endif #endif #else #include #endif #ifndef CYTHON_FALLTHROUGH #if defined(__cplusplus) && __cplusplus >= 201103L #if __has_cpp_attribute(fallthrough) #define CYTHON_FALLTHROUGH [[fallthrough]] #elif __has_cpp_attribute(clang::fallthrough) #define CYTHON_FALLTHROUGH [[clang::fallthrough]] #elif __has_cpp_attribute(gnu::fallthrough) #define CYTHON_FALLTHROUGH [[gnu::fallthrough]] #endif #endif #ifndef CYTHON_FALLTHROUGH #if __has_attribute(fallthrough) #define CYTHON_FALLTHROUGH __attribute__((fallthrough)) #else #define CYTHON_FALLTHROUGH #endif #endif #if defined(__clang__ ) && defined(__apple_build_version__) #if __apple_build_version__ < 7000000 #undef CYTHON_FALLTHROUGH #define CYTHON_FALLTHROUGH #endif #endif #endif #ifndef CYTHON_INLINE #if defined(__clang__) #define CYTHON_INLINE __inline__ __attribute__ ((__unused__)) #elif defined(__GNUC__) #define CYTHON_INLINE __inline__ #elif defined(_MSC_VER) #define CYTHON_INLINE __inline #elif defined (__STDC_VERSION__) && __STDC_VERSION__ >= 199901L #define CYTHON_INLINE inline #else #define CYTHON_INLINE #endif #endif #if CYTHON_COMPILING_IN_PYPY && PY_VERSION_HEX < 0x02070600 && !defined(Py_OptimizeFlag) #define Py_OptimizeFlag 0 #endif #define __PYX_BUILD_PY_SSIZE_T "n" #define CYTHON_FORMAT_SSIZE_T "z" #if PY_MAJOR_VERSION < 3 #define __Pyx_BUILTIN_MODULE_NAME "__builtin__" #define __Pyx_PyCode_New(a, k, l, s, f, code, c, n, v, fv, cell, fn, name, fline, lnos)\ PyCode_New(a+k, l, s, f, code, c, n, v, fv, cell, fn, name, fline, lnos) #define __Pyx_DefaultClassType PyClass_Type #else #define __Pyx_BUILTIN_MODULE_NAME "builtins" #if PY_VERSION_HEX >= 0x030800A4 && PY_VERSION_HEX < 0x030800B2 #define __Pyx_PyCode_New(a, k, l, s, f, code, c, n, v, fv, cell, fn, name, fline, lnos)\ PyCode_New(a, 0, k, l, s, f, code, c, n, v, fv, cell, fn, name, fline, lnos) #else #define __Pyx_PyCode_New(a, k, l, s, f, code, c, n, v, fv, cell, fn, name, fline, lnos)\ PyCode_New(a, k, l, s, f, code, c, n, v, fv, cell, fn, name, fline, lnos) #endif #define __Pyx_DefaultClassType PyType_Type #endif #ifndef Py_TPFLAGS_CHECKTYPES #define Py_TPFLAGS_CHECKTYPES 0 #endif #ifndef Py_TPFLAGS_HAVE_INDEX #define Py_TPFLAGS_HAVE_INDEX 0 #endif #ifndef Py_TPFLAGS_HAVE_NEWBUFFER #define Py_TPFLAGS_HAVE_NEWBUFFER 0 #endif #ifndef Py_TPFLAGS_HAVE_FINALIZE #define Py_TPFLAGS_HAVE_FINALIZE 0 #endif #ifndef METH_STACKLESS #define METH_STACKLESS 0 #endif #if PY_VERSION_HEX <= 0x030700A3 || !defined(METH_FASTCALL) #ifndef METH_FASTCALL #define METH_FASTCALL 0x80 #endif typedef PyObject *(*__Pyx_PyCFunctionFast) (PyObject *self, PyObject *const *args, Py_ssize_t nargs); typedef PyObject *(*__Pyx_PyCFunctionFastWithKeywords) (PyObject *self, PyObject *const *args, Py_ssize_t nargs, PyObject *kwnames); #else #define __Pyx_PyCFunctionFast _PyCFunctionFast #define __Pyx_PyCFunctionFastWithKeywords _PyCFunctionFastWithKeywords #endif #if CYTHON_FAST_PYCCALL #define __Pyx_PyFastCFunction_Check(func)\ ((PyCFunction_Check(func) && (METH_FASTCALL == (PyCFunction_GET_FLAGS(func) & ~(METH_CLASS | METH_STATIC | 
METH_COEXIST | METH_KEYWORDS | METH_STACKLESS))))) #else #define __Pyx_PyFastCFunction_Check(func) 0 #endif #if CYTHON_COMPILING_IN_PYPY && !defined(PyObject_Malloc) #define PyObject_Malloc(s) PyMem_Malloc(s) #define PyObject_Free(p) PyMem_Free(p) #define PyObject_Realloc(p) PyMem_Realloc(p) #endif #if CYTHON_COMPILING_IN_CPYTHON && PY_VERSION_HEX < 0x030400A1 #define PyMem_RawMalloc(n) PyMem_Malloc(n) #define PyMem_RawRealloc(p, n) PyMem_Realloc(p, n) #define PyMem_RawFree(p) PyMem_Free(p) #endif #if CYTHON_COMPILING_IN_PYSTON #define __Pyx_PyCode_HasFreeVars(co) PyCode_HasFreeVars(co) #define __Pyx_PyFrame_SetLineNumber(frame, lineno) PyFrame_SetLineNumber(frame, lineno) #else #define __Pyx_PyCode_HasFreeVars(co) (PyCode_GetNumFree(co) > 0) #define __Pyx_PyFrame_SetLineNumber(frame, lineno) (frame)->f_lineno = (lineno) #endif #if !CYTHON_FAST_THREAD_STATE || PY_VERSION_HEX < 0x02070000 #define __Pyx_PyThreadState_Current PyThreadState_GET() #elif PY_VERSION_HEX >= 0x03060000 #define __Pyx_PyThreadState_Current _PyThreadState_UncheckedGet() #elif PY_VERSION_HEX >= 0x03000000 #define __Pyx_PyThreadState_Current PyThreadState_GET() #else #define __Pyx_PyThreadState_Current _PyThreadState_Current #endif #if PY_VERSION_HEX < 0x030700A2 && !defined(PyThread_tss_create) && !defined(Py_tss_NEEDS_INIT) #include "pythread.h" #define Py_tss_NEEDS_INIT 0 typedef int Py_tss_t; static CYTHON_INLINE int PyThread_tss_create(Py_tss_t *key) { *key = PyThread_create_key(); return 0; } static CYTHON_INLINE Py_tss_t * PyThread_tss_alloc(void) { Py_tss_t *key = (Py_tss_t *)PyObject_Malloc(sizeof(Py_tss_t)); *key = Py_tss_NEEDS_INIT; return key; } static CYTHON_INLINE void PyThread_tss_free(Py_tss_t *key) { PyObject_Free(key); } static CYTHON_INLINE int PyThread_tss_is_created(Py_tss_t *key) { return *key != Py_tss_NEEDS_INIT; } static CYTHON_INLINE void PyThread_tss_delete(Py_tss_t *key) { PyThread_delete_key(*key); *key = Py_tss_NEEDS_INIT; } static CYTHON_INLINE int PyThread_tss_set(Py_tss_t *key, void *value) { return PyThread_set_key_value(*key, value); } static CYTHON_INLINE void * PyThread_tss_get(Py_tss_t *key) { return PyThread_get_key_value(*key); } #endif #if CYTHON_COMPILING_IN_CPYTHON || defined(_PyDict_NewPresized) #define __Pyx_PyDict_NewPresized(n) ((n <= 8) ? 
PyDict_New() : _PyDict_NewPresized(n)) #else #define __Pyx_PyDict_NewPresized(n) PyDict_New() #endif #if PY_MAJOR_VERSION >= 3 || CYTHON_FUTURE_DIVISION #define __Pyx_PyNumber_Divide(x,y) PyNumber_TrueDivide(x,y) #define __Pyx_PyNumber_InPlaceDivide(x,y) PyNumber_InPlaceTrueDivide(x,y) #else #define __Pyx_PyNumber_Divide(x,y) PyNumber_Divide(x,y) #define __Pyx_PyNumber_InPlaceDivide(x,y) PyNumber_InPlaceDivide(x,y) #endif #if CYTHON_COMPILING_IN_CPYTHON && PY_VERSION_HEX >= 0x030500A1 && CYTHON_USE_UNICODE_INTERNALS #define __Pyx_PyDict_GetItemStr(dict, name) _PyDict_GetItem_KnownHash(dict, name, ((PyASCIIObject *) name)->hash) #else #define __Pyx_PyDict_GetItemStr(dict, name) PyDict_GetItem(dict, name) #endif #if PY_VERSION_HEX > 0x03030000 && defined(PyUnicode_KIND) #define CYTHON_PEP393_ENABLED 1 #define __Pyx_PyUnicode_READY(op) (likely(PyUnicode_IS_READY(op)) ?\ 0 : _PyUnicode_Ready((PyObject *)(op))) #define __Pyx_PyUnicode_GET_LENGTH(u) PyUnicode_GET_LENGTH(u) #define __Pyx_PyUnicode_READ_CHAR(u, i) PyUnicode_READ_CHAR(u, i) #define __Pyx_PyUnicode_MAX_CHAR_VALUE(u) PyUnicode_MAX_CHAR_VALUE(u) #define __Pyx_PyUnicode_KIND(u) PyUnicode_KIND(u) #define __Pyx_PyUnicode_DATA(u) PyUnicode_DATA(u) #define __Pyx_PyUnicode_READ(k, d, i) PyUnicode_READ(k, d, i) #define __Pyx_PyUnicode_WRITE(k, d, i, ch) PyUnicode_WRITE(k, d, i, ch) #define __Pyx_PyUnicode_IS_TRUE(u) (0 != (likely(PyUnicode_IS_READY(u)) ? PyUnicode_GET_LENGTH(u) : PyUnicode_GET_SIZE(u))) #else #define CYTHON_PEP393_ENABLED 0 #define PyUnicode_1BYTE_KIND 1 #define PyUnicode_2BYTE_KIND 2 #define PyUnicode_4BYTE_KIND 4 #define __Pyx_PyUnicode_READY(op) (0) #define __Pyx_PyUnicode_GET_LENGTH(u) PyUnicode_GET_SIZE(u) #define __Pyx_PyUnicode_READ_CHAR(u, i) ((Py_UCS4)(PyUnicode_AS_UNICODE(u)[i])) #define __Pyx_PyUnicode_MAX_CHAR_VALUE(u) ((sizeof(Py_UNICODE) == 2) ? 65535 : 1114111) #define __Pyx_PyUnicode_KIND(u) (sizeof(Py_UNICODE)) #define __Pyx_PyUnicode_DATA(u) ((void*)PyUnicode_AS_UNICODE(u)) #define __Pyx_PyUnicode_READ(k, d, i) ((void)(k), (Py_UCS4)(((Py_UNICODE*)d)[i])) #define __Pyx_PyUnicode_WRITE(k, d, i, ch) (((void)(k)), ((Py_UNICODE*)d)[i] = ch) #define __Pyx_PyUnicode_IS_TRUE(u) (0 != PyUnicode_GET_SIZE(u)) #endif #if CYTHON_COMPILING_IN_PYPY #define __Pyx_PyUnicode_Concat(a, b) PyNumber_Add(a, b) #define __Pyx_PyUnicode_ConcatSafe(a, b) PyNumber_Add(a, b) #else #define __Pyx_PyUnicode_Concat(a, b) PyUnicode_Concat(a, b) #define __Pyx_PyUnicode_ConcatSafe(a, b) ((unlikely((a) == Py_None) || unlikely((b) == Py_None)) ?\ PyNumber_Add(a, b) : __Pyx_PyUnicode_Concat(a, b)) #endif #if CYTHON_COMPILING_IN_PYPY && !defined(PyUnicode_Contains) #define PyUnicode_Contains(u, s) PySequence_Contains(u, s) #endif #if CYTHON_COMPILING_IN_PYPY && !defined(PyByteArray_Check) #define PyByteArray_Check(obj) PyObject_TypeCheck(obj, &PyByteArray_Type) #endif #if CYTHON_COMPILING_IN_PYPY && !defined(PyObject_Format) #define PyObject_Format(obj, fmt) PyObject_CallMethod(obj, "__format__", "O", fmt) #endif #define __Pyx_PyString_FormatSafe(a, b) ((unlikely((a) == Py_None || (PyString_Check(b) && !PyString_CheckExact(b)))) ? PyNumber_Remainder(a, b) : __Pyx_PyString_Format(a, b)) #define __Pyx_PyUnicode_FormatSafe(a, b) ((unlikely((a) == Py_None || (PyUnicode_Check(b) && !PyUnicode_CheckExact(b)))) ? 
PyNumber_Remainder(a, b) : PyUnicode_Format(a, b)) #if PY_MAJOR_VERSION >= 3 #define __Pyx_PyString_Format(a, b) PyUnicode_Format(a, b) #else #define __Pyx_PyString_Format(a, b) PyString_Format(a, b) #endif #if PY_MAJOR_VERSION < 3 && !defined(PyObject_ASCII) #define PyObject_ASCII(o) PyObject_Repr(o) #endif #if PY_MAJOR_VERSION >= 3 #define PyBaseString_Type PyUnicode_Type #define PyStringObject PyUnicodeObject #define PyString_Type PyUnicode_Type #define PyString_Check PyUnicode_Check #define PyString_CheckExact PyUnicode_CheckExact #define PyObject_Unicode PyObject_Str #endif #if PY_MAJOR_VERSION >= 3 #define __Pyx_PyBaseString_Check(obj) PyUnicode_Check(obj) #define __Pyx_PyBaseString_CheckExact(obj) PyUnicode_CheckExact(obj) #else #define __Pyx_PyBaseString_Check(obj) (PyString_Check(obj) || PyUnicode_Check(obj)) #define __Pyx_PyBaseString_CheckExact(obj) (PyString_CheckExact(obj) || PyUnicode_CheckExact(obj)) #endif #ifndef PySet_CheckExact #define PySet_CheckExact(obj) (Py_TYPE(obj) == &PySet_Type) #endif #if CYTHON_ASSUME_SAFE_MACROS #define __Pyx_PySequence_SIZE(seq) Py_SIZE(seq) #else #define __Pyx_PySequence_SIZE(seq) PySequence_Size(seq) #endif #if PY_MAJOR_VERSION >= 3 #define PyIntObject PyLongObject #define PyInt_Type PyLong_Type #define PyInt_Check(op) PyLong_Check(op) #define PyInt_CheckExact(op) PyLong_CheckExact(op) #define PyInt_FromString PyLong_FromString #define PyInt_FromUnicode PyLong_FromUnicode #define PyInt_FromLong PyLong_FromLong #define PyInt_FromSize_t PyLong_FromSize_t #define PyInt_FromSsize_t PyLong_FromSsize_t #define PyInt_AsLong PyLong_AsLong #define PyInt_AS_LONG PyLong_AS_LONG #define PyInt_AsSsize_t PyLong_AsSsize_t #define PyInt_AsUnsignedLongMask PyLong_AsUnsignedLongMask #define PyInt_AsUnsignedLongLongMask PyLong_AsUnsignedLongLongMask #define PyNumber_Int PyNumber_Long #endif #if PY_MAJOR_VERSION >= 3 #define PyBoolObject PyLongObject #endif #if PY_MAJOR_VERSION >= 3 && CYTHON_COMPILING_IN_PYPY #ifndef PyUnicode_InternFromString #define PyUnicode_InternFromString(s) PyUnicode_FromString(s) #endif #endif #if PY_VERSION_HEX < 0x030200A4 typedef long Py_hash_t; #define __Pyx_PyInt_FromHash_t PyInt_FromLong #define __Pyx_PyInt_AsHash_t PyInt_AsLong #else #define __Pyx_PyInt_FromHash_t PyInt_FromSsize_t #define __Pyx_PyInt_AsHash_t PyInt_AsSsize_t #endif #if PY_MAJOR_VERSION >= 3 #define __Pyx_PyMethod_New(func, self, klass) ((self) ? 
PyMethod_New(func, self) : (Py_INCREF(func), func)) #else #define __Pyx_PyMethod_New(func, self, klass) PyMethod_New(func, self, klass) #endif #if CYTHON_USE_ASYNC_SLOTS #if PY_VERSION_HEX >= 0x030500B1 #define __Pyx_PyAsyncMethodsStruct PyAsyncMethods #define __Pyx_PyType_AsAsync(obj) (Py_TYPE(obj)->tp_as_async) #else #define __Pyx_PyType_AsAsync(obj) ((__Pyx_PyAsyncMethodsStruct*) (Py_TYPE(obj)->tp_reserved)) #endif #else #define __Pyx_PyType_AsAsync(obj) NULL #endif #ifndef __Pyx_PyAsyncMethodsStruct typedef struct { unaryfunc am_await; unaryfunc am_aiter; unaryfunc am_anext; } __Pyx_PyAsyncMethodsStruct; #endif #if defined(WIN32) || defined(MS_WINDOWS) #define _USE_MATH_DEFINES #endif #include <math.h> #ifdef NAN #define __PYX_NAN() ((float) NAN) #else static CYTHON_INLINE float __PYX_NAN() { float value; memset(&value, 0xFF, sizeof(value)); return value; } #endif #if defined(__CYGWIN__) && defined(_LDBL_EQ_DBL) #define __Pyx_truncl trunc #else #define __Pyx_truncl truncl #endif #define __PYX_ERR(f_index, lineno, Ln_error) \ { \ __pyx_filename = __pyx_f[f_index]; __pyx_lineno = lineno; __pyx_clineno = __LINE__; goto Ln_error; \ } #ifndef __PYX_EXTERN_C #ifdef __cplusplus #define __PYX_EXTERN_C extern "C" #else #define __PYX_EXTERN_C extern #endif #endif #define __PYX_HAVE__aiohttp___frozenlist #define __PYX_HAVE_API__aiohttp___frozenlist /* Early includes */ #ifdef _OPENMP #include <omp.h> #endif /* _OPENMP */ #if defined(PYREX_WITHOUT_ASSERTIONS) && !defined(CYTHON_WITHOUT_ASSERTIONS) #define CYTHON_WITHOUT_ASSERTIONS #endif typedef struct {PyObject **p; const char *s; const Py_ssize_t n; const char* encoding; const char is_unicode; const char is_str; const char intern; } __Pyx_StringTabEntry; #define __PYX_DEFAULT_STRING_ENCODING_IS_ASCII 0 #define __PYX_DEFAULT_STRING_ENCODING_IS_UTF8 0 #define __PYX_DEFAULT_STRING_ENCODING_IS_DEFAULT (PY_MAJOR_VERSION >= 3 && __PYX_DEFAULT_STRING_ENCODING_IS_UTF8) #define __PYX_DEFAULT_STRING_ENCODING "" #define __Pyx_PyObject_FromString __Pyx_PyBytes_FromString #define __Pyx_PyObject_FromStringAndSize __Pyx_PyBytes_FromStringAndSize #define __Pyx_uchar_cast(c) ((unsigned char)c) #define __Pyx_long_cast(x) ((long)x) #define __Pyx_fits_Py_ssize_t(v, type, is_signed) (\ (sizeof(type) < sizeof(Py_ssize_t)) ||\ (sizeof(type) > sizeof(Py_ssize_t) &&\ likely(v < (type)PY_SSIZE_T_MAX ||\ v == (type)PY_SSIZE_T_MAX) &&\ (!is_signed || likely(v > (type)PY_SSIZE_T_MIN ||\ v == (type)PY_SSIZE_T_MIN))) ||\ (sizeof(type) == sizeof(Py_ssize_t) &&\ (is_signed || likely(v < (type)PY_SSIZE_T_MAX ||\ v == (type)PY_SSIZE_T_MAX))) ) static CYTHON_INLINE int __Pyx_is_valid_index(Py_ssize_t i, Py_ssize_t limit) { return (size_t) i < (size_t) limit; } #if defined (__cplusplus) && __cplusplus >= 201103L #include <cstdlib> #define __Pyx_sst_abs(value) std::abs(value) #elif SIZEOF_INT >= SIZEOF_SIZE_T #define __Pyx_sst_abs(value) abs(value) #elif SIZEOF_LONG >= SIZEOF_SIZE_T #define __Pyx_sst_abs(value) labs(value) #elif defined (_MSC_VER) #define __Pyx_sst_abs(value) ((Py_ssize_t)_abs64(value)) #elif defined (__STDC_VERSION__) && __STDC_VERSION__ >= 199901L #define __Pyx_sst_abs(value) llabs(value) #elif defined (__GNUC__) #define __Pyx_sst_abs(value) __builtin_llabs(value) #else #define __Pyx_sst_abs(value) ((value<0) ?
-value : value) #endif static CYTHON_INLINE const char* __Pyx_PyObject_AsString(PyObject*); static CYTHON_INLINE const char* __Pyx_PyObject_AsStringAndSize(PyObject*, Py_ssize_t* length); #define __Pyx_PyByteArray_FromString(s) PyByteArray_FromStringAndSize((const char*)s, strlen((const char*)s)) #define __Pyx_PyByteArray_FromStringAndSize(s, l) PyByteArray_FromStringAndSize((const char*)s, l) #define __Pyx_PyBytes_FromString PyBytes_FromString #define __Pyx_PyBytes_FromStringAndSize PyBytes_FromStringAndSize static CYTHON_INLINE PyObject* __Pyx_PyUnicode_FromString(const char*); #if PY_MAJOR_VERSION < 3 #define __Pyx_PyStr_FromString __Pyx_PyBytes_FromString #define __Pyx_PyStr_FromStringAndSize __Pyx_PyBytes_FromStringAndSize #else #define __Pyx_PyStr_FromString __Pyx_PyUnicode_FromString #define __Pyx_PyStr_FromStringAndSize __Pyx_PyUnicode_FromStringAndSize #endif #define __Pyx_PyBytes_AsWritableString(s) ((char*) PyBytes_AS_STRING(s)) #define __Pyx_PyBytes_AsWritableSString(s) ((signed char*) PyBytes_AS_STRING(s)) #define __Pyx_PyBytes_AsWritableUString(s) ((unsigned char*) PyBytes_AS_STRING(s)) #define __Pyx_PyBytes_AsString(s) ((const char*) PyBytes_AS_STRING(s)) #define __Pyx_PyBytes_AsSString(s) ((const signed char*) PyBytes_AS_STRING(s)) #define __Pyx_PyBytes_AsUString(s) ((const unsigned char*) PyBytes_AS_STRING(s)) #define __Pyx_PyObject_AsWritableString(s) ((char*) __Pyx_PyObject_AsString(s)) #define __Pyx_PyObject_AsWritableSString(s) ((signed char*) __Pyx_PyObject_AsString(s)) #define __Pyx_PyObject_AsWritableUString(s) ((unsigned char*) __Pyx_PyObject_AsString(s)) #define __Pyx_PyObject_AsSString(s) ((const signed char*) __Pyx_PyObject_AsString(s)) #define __Pyx_PyObject_AsUString(s) ((const unsigned char*) __Pyx_PyObject_AsString(s)) #define __Pyx_PyObject_FromCString(s) __Pyx_PyObject_FromString((const char*)s) #define __Pyx_PyBytes_FromCString(s) __Pyx_PyBytes_FromString((const char*)s) #define __Pyx_PyByteArray_FromCString(s) __Pyx_PyByteArray_FromString((const char*)s) #define __Pyx_PyStr_FromCString(s) __Pyx_PyStr_FromString((const char*)s) #define __Pyx_PyUnicode_FromCString(s) __Pyx_PyUnicode_FromString((const char*)s) static CYTHON_INLINE size_t __Pyx_Py_UNICODE_strlen(const Py_UNICODE *u) { const Py_UNICODE *u_end = u; while (*u_end++) ; return (size_t)(u_end - u - 1); } #define __Pyx_PyUnicode_FromUnicode(u) PyUnicode_FromUnicode(u, __Pyx_Py_UNICODE_strlen(u)) #define __Pyx_PyUnicode_FromUnicodeAndLength PyUnicode_FromUnicode #define __Pyx_PyUnicode_AsUnicode PyUnicode_AsUnicode #define __Pyx_NewRef(obj) (Py_INCREF(obj), obj) #define __Pyx_Owned_Py_None(b) __Pyx_NewRef(Py_None) static CYTHON_INLINE PyObject * __Pyx_PyBool_FromLong(long b); static CYTHON_INLINE int __Pyx_PyObject_IsTrue(PyObject*); static CYTHON_INLINE int __Pyx_PyObject_IsTrueAndDecref(PyObject*); static CYTHON_INLINE PyObject* __Pyx_PyNumber_IntOrLong(PyObject* x); #define __Pyx_PySequence_Tuple(obj)\ (likely(PyTuple_CheckExact(obj)) ? __Pyx_NewRef(obj) : PySequence_Tuple(obj)) static CYTHON_INLINE Py_ssize_t __Pyx_PyIndex_AsSsize_t(PyObject*); static CYTHON_INLINE PyObject * __Pyx_PyInt_FromSize_t(size_t); #if CYTHON_ASSUME_SAFE_MACROS #define __pyx_PyFloat_AsDouble(x) (PyFloat_CheckExact(x) ? PyFloat_AS_DOUBLE(x) : PyFloat_AsDouble(x)) #else #define __pyx_PyFloat_AsDouble(x) PyFloat_AsDouble(x) #endif #define __pyx_PyFloat_AsFloat(x) ((float) __pyx_PyFloat_AsDouble(x)) #if PY_MAJOR_VERSION >= 3 #define __Pyx_PyNumber_Int(x) (PyLong_CheckExact(x) ? 
__Pyx_NewRef(x) : PyNumber_Long(x)) #else #define __Pyx_PyNumber_Int(x) (PyInt_CheckExact(x) ? __Pyx_NewRef(x) : PyNumber_Int(x)) #endif #define __Pyx_PyNumber_Float(x) (PyFloat_CheckExact(x) ? __Pyx_NewRef(x) : PyNumber_Float(x)) #if PY_MAJOR_VERSION < 3 && __PYX_DEFAULT_STRING_ENCODING_IS_ASCII static int __Pyx_sys_getdefaultencoding_not_ascii; static int __Pyx_init_sys_getdefaultencoding_params(void) { PyObject* sys; PyObject* default_encoding = NULL; PyObject* ascii_chars_u = NULL; PyObject* ascii_chars_b = NULL; const char* default_encoding_c; sys = PyImport_ImportModule("sys"); if (!sys) goto bad; default_encoding = PyObject_CallMethod(sys, (char*) "getdefaultencoding", NULL); Py_DECREF(sys); if (!default_encoding) goto bad; default_encoding_c = PyBytes_AsString(default_encoding); if (!default_encoding_c) goto bad; if (strcmp(default_encoding_c, "ascii") == 0) { __Pyx_sys_getdefaultencoding_not_ascii = 0; } else { char ascii_chars[128]; int c; for (c = 0; c < 128; c++) { ascii_chars[c] = c; } __Pyx_sys_getdefaultencoding_not_ascii = 1; ascii_chars_u = PyUnicode_DecodeASCII(ascii_chars, 128, NULL); if (!ascii_chars_u) goto bad; ascii_chars_b = PyUnicode_AsEncodedString(ascii_chars_u, default_encoding_c, NULL); if (!ascii_chars_b || !PyBytes_Check(ascii_chars_b) || memcmp(ascii_chars, PyBytes_AS_STRING(ascii_chars_b), 128) != 0) { PyErr_Format( PyExc_ValueError, "This module compiled with c_string_encoding=ascii, but default encoding '%.200s' is not a superset of ascii.", default_encoding_c); goto bad; } Py_DECREF(ascii_chars_u); Py_DECREF(ascii_chars_b); } Py_DECREF(default_encoding); return 0; bad: Py_XDECREF(default_encoding); Py_XDECREF(ascii_chars_u); Py_XDECREF(ascii_chars_b); return -1; } #endif #if __PYX_DEFAULT_STRING_ENCODING_IS_DEFAULT && PY_MAJOR_VERSION >= 3 #define __Pyx_PyUnicode_FromStringAndSize(c_str, size) PyUnicode_DecodeUTF8(c_str, size, NULL) #else #define __Pyx_PyUnicode_FromStringAndSize(c_str, size) PyUnicode_Decode(c_str, size, __PYX_DEFAULT_STRING_ENCODING, NULL) #if __PYX_DEFAULT_STRING_ENCODING_IS_DEFAULT static char* __PYX_DEFAULT_STRING_ENCODING; static int __Pyx_init_sys_getdefaultencoding_params(void) { PyObject* sys; PyObject* default_encoding = NULL; char* default_encoding_c; sys = PyImport_ImportModule("sys"); if (!sys) goto bad; default_encoding = PyObject_CallMethod(sys, (char*) (const char*) "getdefaultencoding", NULL); Py_DECREF(sys); if (!default_encoding) goto bad; default_encoding_c = PyBytes_AsString(default_encoding); if (!default_encoding_c) goto bad; __PYX_DEFAULT_STRING_ENCODING = (char*) malloc(strlen(default_encoding_c) + 1); if (!__PYX_DEFAULT_STRING_ENCODING) goto bad; strcpy(__PYX_DEFAULT_STRING_ENCODING, default_encoding_c); Py_DECREF(default_encoding); return 0; bad: Py_XDECREF(default_encoding); return -1; } #endif #endif /* Test for GCC > 2.95 */ #if defined(__GNUC__) && (__GNUC__ > 2 || (__GNUC__ == 2 && (__GNUC_MINOR__ > 95))) #define likely(x) __builtin_expect(!!(x), 1) #define unlikely(x) __builtin_expect(!!(x), 0) #else /* !__GNUC__ or GCC < 2.95 */ #define likely(x) (x) #define unlikely(x) (x) #endif /* __GNUC__ */ static CYTHON_INLINE void __Pyx_pretend_to_initialize(void* ptr) { (void)ptr; } static PyObject *__pyx_m = NULL; static PyObject *__pyx_d; static PyObject *__pyx_b; static PyObject *__pyx_cython_runtime = NULL; static PyObject *__pyx_empty_tuple; static PyObject *__pyx_empty_bytes; static PyObject *__pyx_empty_unicode; static int __pyx_lineno; static int __pyx_clineno = 0; static const char * __pyx_cfilenm= 
__FILE__; static const char *__pyx_filename; static const char *__pyx_f[] = { "aiohttp/_frozenlist.pyx", "stringsource", }; /*--- Type declarations ---*/ struct __pyx_obj_7aiohttp_11_frozenlist_FrozenList; /* "aiohttp/_frozenlist.pyx":4 * * * cdef class FrozenList: # <<<<<<<<<<<<<< * * cdef readonly bint frozen */ struct __pyx_obj_7aiohttp_11_frozenlist_FrozenList { PyObject_HEAD struct __pyx_vtabstruct_7aiohttp_11_frozenlist_FrozenList *__pyx_vtab; int frozen; PyObject *_items; }; struct __pyx_vtabstruct_7aiohttp_11_frozenlist_FrozenList { PyObject *(*_check_frozen)(struct __pyx_obj_7aiohttp_11_frozenlist_FrozenList *); PyObject *(*_fast_len)(struct __pyx_obj_7aiohttp_11_frozenlist_FrozenList *); }; static struct __pyx_vtabstruct_7aiohttp_11_frozenlist_FrozenList *__pyx_vtabptr_7aiohttp_11_frozenlist_FrozenList; static CYTHON_INLINE PyObject *__pyx_f_7aiohttp_11_frozenlist_10FrozenList__fast_len(struct __pyx_obj_7aiohttp_11_frozenlist_FrozenList *); /* --- Runtime support code (head) --- */ /* Refnanny.proto */ #ifndef CYTHON_REFNANNY #define CYTHON_REFNANNY 0 #endif #if CYTHON_REFNANNY typedef struct { void (*INCREF)(void*, PyObject*, int); void (*DECREF)(void*, PyObject*, int); void (*GOTREF)(void*, PyObject*, int); void (*GIVEREF)(void*, PyObject*, int); void* (*SetupContext)(const char*, int, const char*); void (*FinishContext)(void**); } __Pyx_RefNannyAPIStruct; static __Pyx_RefNannyAPIStruct *__Pyx_RefNanny = NULL; static __Pyx_RefNannyAPIStruct *__Pyx_RefNannyImportAPI(const char *modname); #define __Pyx_RefNannyDeclarations void *__pyx_refnanny = NULL; #ifdef WITH_THREAD #define __Pyx_RefNannySetupContext(name, acquire_gil)\ if (acquire_gil) {\ PyGILState_STATE __pyx_gilstate_save = PyGILState_Ensure();\ __pyx_refnanny = __Pyx_RefNanny->SetupContext((name), __LINE__, __FILE__);\ PyGILState_Release(__pyx_gilstate_save);\ } else {\ __pyx_refnanny = __Pyx_RefNanny->SetupContext((name), __LINE__, __FILE__);\ } #else #define __Pyx_RefNannySetupContext(name, acquire_gil)\ __pyx_refnanny = __Pyx_RefNanny->SetupContext((name), __LINE__, __FILE__) #endif #define __Pyx_RefNannyFinishContext()\ __Pyx_RefNanny->FinishContext(&__pyx_refnanny) #define __Pyx_INCREF(r) __Pyx_RefNanny->INCREF(__pyx_refnanny, (PyObject *)(r), __LINE__) #define __Pyx_DECREF(r) __Pyx_RefNanny->DECREF(__pyx_refnanny, (PyObject *)(r), __LINE__) #define __Pyx_GOTREF(r) __Pyx_RefNanny->GOTREF(__pyx_refnanny, (PyObject *)(r), __LINE__) #define __Pyx_GIVEREF(r) __Pyx_RefNanny->GIVEREF(__pyx_refnanny, (PyObject *)(r), __LINE__) #define __Pyx_XINCREF(r) do { if((r) != NULL) {__Pyx_INCREF(r); }} while(0) #define __Pyx_XDECREF(r) do { if((r) != NULL) {__Pyx_DECREF(r); }} while(0) #define __Pyx_XGOTREF(r) do { if((r) != NULL) {__Pyx_GOTREF(r); }} while(0) #define __Pyx_XGIVEREF(r) do { if((r) != NULL) {__Pyx_GIVEREF(r);}} while(0) #else #define __Pyx_RefNannyDeclarations #define __Pyx_RefNannySetupContext(name, acquire_gil) #define __Pyx_RefNannyFinishContext() #define __Pyx_INCREF(r) Py_INCREF(r) #define __Pyx_DECREF(r) Py_DECREF(r) #define __Pyx_GOTREF(r) #define __Pyx_GIVEREF(r) #define __Pyx_XINCREF(r) Py_XINCREF(r) #define __Pyx_XDECREF(r) Py_XDECREF(r) #define __Pyx_XGOTREF(r) #define __Pyx_XGIVEREF(r) #endif #define __Pyx_XDECREF_SET(r, v) do {\ PyObject *tmp = (PyObject *) r;\ r = v; __Pyx_XDECREF(tmp);\ } while (0) #define __Pyx_DECREF_SET(r, v) do {\ PyObject *tmp = (PyObject *) r;\ r = v; __Pyx_DECREF(tmp);\ } while (0) #define __Pyx_CLEAR(r) do { PyObject* tmp = ((PyObject*)(r)); r = NULL; __Pyx_DECREF(tmp);} 
while(0) #define __Pyx_XCLEAR(r) do { if((r) != NULL) {PyObject* tmp = ((PyObject*)(r)); r = NULL; __Pyx_DECREF(tmp);}} while(0) /* PyObjectGetAttrStr.proto */ #if CYTHON_USE_TYPE_SLOTS static CYTHON_INLINE PyObject* __Pyx_PyObject_GetAttrStr(PyObject* obj, PyObject* attr_name); #else #define __Pyx_PyObject_GetAttrStr(o,n) PyObject_GetAttr(o,n) #endif /* GetBuiltinName.proto */ static PyObject *__Pyx_GetBuiltinName(PyObject *name); /* RaiseDoubleKeywords.proto */ static void __Pyx_RaiseDoubleKeywordsError(const char* func_name, PyObject* kw_name); /* ParseKeywords.proto */ static int __Pyx_ParseOptionalKeywords(PyObject *kwds, PyObject **argnames[],\ PyObject *kwds2, PyObject *values[], Py_ssize_t num_pos_args,\ const char* function_name); /* RaiseArgTupleInvalid.proto */ static void __Pyx_RaiseArgtupleInvalid(const char* func_name, int exact, Py_ssize_t num_min, Py_ssize_t num_max, Py_ssize_t num_found); /* PyObjectCall.proto */ #if CYTHON_COMPILING_IN_CPYTHON static CYTHON_INLINE PyObject* __Pyx_PyObject_Call(PyObject *func, PyObject *arg, PyObject *kw); #else #define __Pyx_PyObject_Call(func, arg, kw) PyObject_Call(func, arg, kw) #endif /* PyThreadStateGet.proto */ #if CYTHON_FAST_THREAD_STATE #define __Pyx_PyThreadState_declare PyThreadState *__pyx_tstate; #define __Pyx_PyThreadState_assign __pyx_tstate = __Pyx_PyThreadState_Current; #define __Pyx_PyErr_Occurred() __pyx_tstate->curexc_type #else #define __Pyx_PyThreadState_declare #define __Pyx_PyThreadState_assign #define __Pyx_PyErr_Occurred() PyErr_Occurred() #endif /* PyErrFetchRestore.proto */ #if CYTHON_FAST_THREAD_STATE #define __Pyx_PyErr_Clear() __Pyx_ErrRestore(NULL, NULL, NULL) #define __Pyx_ErrRestoreWithState(type, value, tb) __Pyx_ErrRestoreInState(PyThreadState_GET(), type, value, tb) #define __Pyx_ErrFetchWithState(type, value, tb) __Pyx_ErrFetchInState(PyThreadState_GET(), type, value, tb) #define __Pyx_ErrRestore(type, value, tb) __Pyx_ErrRestoreInState(__pyx_tstate, type, value, tb) #define __Pyx_ErrFetch(type, value, tb) __Pyx_ErrFetchInState(__pyx_tstate, type, value, tb) static CYTHON_INLINE void __Pyx_ErrRestoreInState(PyThreadState *tstate, PyObject *type, PyObject *value, PyObject *tb); static CYTHON_INLINE void __Pyx_ErrFetchInState(PyThreadState *tstate, PyObject **type, PyObject **value, PyObject **tb); #if CYTHON_COMPILING_IN_CPYTHON #define __Pyx_PyErr_SetNone(exc) (Py_INCREF(exc), __Pyx_ErrRestore((exc), NULL, NULL)) #else #define __Pyx_PyErr_SetNone(exc) PyErr_SetNone(exc) #endif #else #define __Pyx_PyErr_Clear() PyErr_Clear() #define __Pyx_PyErr_SetNone(exc) PyErr_SetNone(exc) #define __Pyx_ErrRestoreWithState(type, value, tb) PyErr_Restore(type, value, tb) #define __Pyx_ErrFetchWithState(type, value, tb) PyErr_Fetch(type, value, tb) #define __Pyx_ErrRestoreInState(tstate, type, value, tb) PyErr_Restore(type, value, tb) #define __Pyx_ErrFetchInState(tstate, type, value, tb) PyErr_Fetch(type, value, tb) #define __Pyx_ErrRestore(type, value, tb) PyErr_Restore(type, value, tb) #define __Pyx_ErrFetch(type, value, tb) PyErr_Fetch(type, value, tb) #endif /* RaiseException.proto */ static void __Pyx_Raise(PyObject *type, PyObject *value, PyObject *tb, PyObject *cause); /* GetItemInt.proto */ #define __Pyx_GetItemInt(o, i, type, is_signed, to_py_func, is_list, wraparound, boundscheck)\ (__Pyx_fits_Py_ssize_t(i, type, is_signed) ?\ __Pyx_GetItemInt_Fast(o, (Py_ssize_t)i, is_list, wraparound, boundscheck) :\ (is_list ? 
(PyErr_SetString(PyExc_IndexError, "list index out of range"), (PyObject*)NULL) :\ __Pyx_GetItemInt_Generic(o, to_py_func(i)))) #define __Pyx_GetItemInt_List(o, i, type, is_signed, to_py_func, is_list, wraparound, boundscheck)\ (__Pyx_fits_Py_ssize_t(i, type, is_signed) ?\ __Pyx_GetItemInt_List_Fast(o, (Py_ssize_t)i, wraparound, boundscheck) :\ (PyErr_SetString(PyExc_IndexError, "list index out of range"), (PyObject*)NULL)) static CYTHON_INLINE PyObject *__Pyx_GetItemInt_List_Fast(PyObject *o, Py_ssize_t i, int wraparound, int boundscheck); #define __Pyx_GetItemInt_Tuple(o, i, type, is_signed, to_py_func, is_list, wraparound, boundscheck)\ (__Pyx_fits_Py_ssize_t(i, type, is_signed) ?\ __Pyx_GetItemInt_Tuple_Fast(o, (Py_ssize_t)i, wraparound, boundscheck) :\ (PyErr_SetString(PyExc_IndexError, "tuple index out of range"), (PyObject*)NULL)) static CYTHON_INLINE PyObject *__Pyx_GetItemInt_Tuple_Fast(PyObject *o, Py_ssize_t i, int wraparound, int boundscheck); static PyObject *__Pyx_GetItemInt_Generic(PyObject *o, PyObject* j); static CYTHON_INLINE PyObject *__Pyx_GetItemInt_Fast(PyObject *o, Py_ssize_t i, int is_list, int wraparound, int boundscheck); /* ObjectGetItem.proto */ #if CYTHON_USE_TYPE_SLOTS static CYTHON_INLINE PyObject *__Pyx_PyObject_GetItem(PyObject *obj, PyObject* key); #else #define __Pyx_PyObject_GetItem(obj, key) PyObject_GetItem(obj, key) #endif /* PyFunctionFastCall.proto */ #if CYTHON_FAST_PYCALL #define __Pyx_PyFunction_FastCall(func, args, nargs)\ __Pyx_PyFunction_FastCallDict((func), (args), (nargs), NULL) #if 1 || PY_VERSION_HEX < 0x030600B1 static PyObject *__Pyx_PyFunction_FastCallDict(PyObject *func, PyObject **args, Py_ssize_t nargs, PyObject *kwargs); #else #define __Pyx_PyFunction_FastCallDict(func, args, nargs, kwargs) _PyFunction_FastCallDict(func, args, nargs, kwargs) #endif #define __Pyx_BUILD_ASSERT_EXPR(cond)\ (sizeof(char [1 - 2*!(cond)]) - 1) #ifndef Py_MEMBER_SIZE #define Py_MEMBER_SIZE(type, member) sizeof(((type *)0)->member) #endif static size_t __pyx_pyframe_localsplus_offset = 0; #include "frameobject.h" #define __Pxy_PyFrame_Initialize_Offsets()\ ((void)__Pyx_BUILD_ASSERT_EXPR(sizeof(PyFrameObject) == offsetof(PyFrameObject, f_localsplus) + Py_MEMBER_SIZE(PyFrameObject, f_localsplus)),\ (void)(__pyx_pyframe_localsplus_offset = ((size_t)PyFrame_Type.tp_basicsize) - Py_MEMBER_SIZE(PyFrameObject, f_localsplus))) #define __Pyx_PyFrame_GetLocalsplus(frame)\ (assert(__pyx_pyframe_localsplus_offset), (PyObject **)(((char *)(frame)) + __pyx_pyframe_localsplus_offset)) #endif /* PyObjectCallMethO.proto */ #if CYTHON_COMPILING_IN_CPYTHON static CYTHON_INLINE PyObject* __Pyx_PyObject_CallMethO(PyObject *func, PyObject *arg); #endif /* PyObjectCallNoArg.proto */ #if CYTHON_COMPILING_IN_CPYTHON static CYTHON_INLINE PyObject* __Pyx_PyObject_CallNoArg(PyObject *func); #else #define __Pyx_PyObject_CallNoArg(func) __Pyx_PyObject_Call(func, __pyx_empty_tuple, NULL) #endif /* PyCFunctionFastCall.proto */ #if CYTHON_FAST_PYCCALL static CYTHON_INLINE PyObject *__Pyx_PyCFunction_FastCall(PyObject *func, PyObject **args, Py_ssize_t nargs); #else #define __Pyx_PyCFunction_FastCall(func, args, nargs) (assert(0), NULL) #endif /* PyObjectCallOneArg.proto */ static CYTHON_INLINE PyObject* __Pyx_PyObject_CallOneArg(PyObject *func, PyObject *arg); /* PyIntCompare.proto */ static CYTHON_INLINE PyObject* __Pyx_PyInt_EqObjC(PyObject *op1, PyObject *op2, long intval, long inplace); /* PySequenceContains.proto */ static CYTHON_INLINE int __Pyx_PySequence_ContainsTF(PyObject* item, 
PyObject* seq, int eq) { int result = PySequence_Contains(seq, item); return unlikely(result < 0) ? result : (result == (eq == Py_EQ)); } /* PyObjectCall2Args.proto */ static CYTHON_UNUSED PyObject* __Pyx_PyObject_Call2Args(PyObject* function, PyObject* arg1, PyObject* arg2); /* PyObjectGetMethod.proto */ static int __Pyx_PyObject_GetMethod(PyObject *obj, PyObject *name, PyObject **method); /* PyObjectCallMethod1.proto */ static PyObject* __Pyx_PyObject_CallMethod1(PyObject* obj, PyObject* method_name, PyObject* arg); /* pop_index.proto */ static PyObject* __Pyx__PyObject_PopNewIndex(PyObject* L, PyObject* py_ix); static PyObject* __Pyx__PyObject_PopIndex(PyObject* L, PyObject* py_ix); #if CYTHON_USE_PYLIST_INTERNALS && CYTHON_ASSUME_SAFE_MACROS static PyObject* __Pyx__PyList_PopIndex(PyObject* L, PyObject* py_ix, Py_ssize_t ix); #define __Pyx_PyObject_PopIndex(L, py_ix, ix, is_signed, type, to_py_func) (\ (likely(PyList_CheckExact(L) && __Pyx_fits_Py_ssize_t(ix, type, is_signed))) ?\ __Pyx__PyList_PopIndex(L, py_ix, ix) : (\ (unlikely((py_ix) == Py_None)) ? __Pyx__PyObject_PopNewIndex(L, to_py_func(ix)) :\ __Pyx__PyObject_PopIndex(L, py_ix))) #define __Pyx_PyList_PopIndex(L, py_ix, ix, is_signed, type, to_py_func) (\ __Pyx_fits_Py_ssize_t(ix, type, is_signed) ?\ __Pyx__PyList_PopIndex(L, py_ix, ix) : (\ (unlikely((py_ix) == Py_None)) ? __Pyx__PyObject_PopNewIndex(L, to_py_func(ix)) :\ __Pyx__PyObject_PopIndex(L, py_ix))) #else #define __Pyx_PyList_PopIndex(L, py_ix, ix, is_signed, type, to_py_func)\ __Pyx_PyObject_PopIndex(L, py_ix, ix, is_signed, type, to_py_func) #define __Pyx_PyObject_PopIndex(L, py_ix, ix, is_signed, type, to_py_func) (\ (unlikely((py_ix) == Py_None)) ? __Pyx__PyObject_PopNewIndex(L, to_py_func(ix)) :\ __Pyx__PyObject_PopIndex(L, py_ix)) #endif /* ListAppend.proto */ #if CYTHON_USE_PYLIST_INTERNALS && CYTHON_ASSUME_SAFE_MACROS static CYTHON_INLINE int __Pyx_PyList_Append(PyObject* list, PyObject* x) { PyListObject* L = (PyListObject*) list; Py_ssize_t len = Py_SIZE(list); if (likely(L->allocated > len) & likely(len > (L->allocated >> 1))) { Py_INCREF(x); PyList_SET_ITEM(list, len, x); Py_SIZE(list) = len+1; return 0; } return PyList_Append(list, x); } #else #define __Pyx_PyList_Append(L,x) PyList_Append(L,x) #endif /* PyErrExceptionMatches.proto */ #if CYTHON_FAST_THREAD_STATE #define __Pyx_PyErr_ExceptionMatches(err) __Pyx_PyErr_ExceptionMatchesInState(__pyx_tstate, err) static CYTHON_INLINE int __Pyx_PyErr_ExceptionMatchesInState(PyThreadState* tstate, PyObject* err); #else #define __Pyx_PyErr_ExceptionMatches(err) PyErr_ExceptionMatches(err) #endif /* GetAttr.proto */ static CYTHON_INLINE PyObject *__Pyx_GetAttr(PyObject *, PyObject *); /* GetAttr3.proto */ static CYTHON_INLINE PyObject *__Pyx_GetAttr3(PyObject *, PyObject *, PyObject *); /* PyDictVersioning.proto */ #if CYTHON_USE_DICT_VERSIONS && CYTHON_USE_TYPE_SLOTS #define __PYX_DICT_VERSION_INIT ((PY_UINT64_T) -1) #define __PYX_GET_DICT_VERSION(dict) (((PyDictObject*)(dict))->ma_version_tag) #define __PYX_UPDATE_DICT_CACHE(dict, value, cache_var, version_var)\ (version_var) = __PYX_GET_DICT_VERSION(dict);\ (cache_var) = (value); #define __PYX_PY_DICT_LOOKUP_IF_MODIFIED(VAR, DICT, LOOKUP) {\ static PY_UINT64_T __pyx_dict_version = 0;\ static PyObject *__pyx_dict_cached_value = NULL;\ if (likely(__PYX_GET_DICT_VERSION(DICT) == __pyx_dict_version)) {\ (VAR) = __pyx_dict_cached_value;\ } else {\ (VAR) = __pyx_dict_cached_value = (LOOKUP);\ __pyx_dict_version = __PYX_GET_DICT_VERSION(DICT);\ }\ } static 
CYTHON_INLINE PY_UINT64_T __Pyx_get_tp_dict_version(PyObject *obj); static CYTHON_INLINE PY_UINT64_T __Pyx_get_object_dict_version(PyObject *obj); static CYTHON_INLINE int __Pyx_object_dict_version_matches(PyObject* obj, PY_UINT64_T tp_dict_version, PY_UINT64_T obj_dict_version); #else #define __PYX_GET_DICT_VERSION(dict) (0) #define __PYX_UPDATE_DICT_CACHE(dict, value, cache_var, version_var) #define __PYX_PY_DICT_LOOKUP_IF_MODIFIED(VAR, DICT, LOOKUP) (VAR) = (LOOKUP); #endif /* GetModuleGlobalName.proto */ #if CYTHON_USE_DICT_VERSIONS #define __Pyx_GetModuleGlobalName(var, name) {\ static PY_UINT64_T __pyx_dict_version = 0;\ static PyObject *__pyx_dict_cached_value = NULL;\ (var) = (likely(__pyx_dict_version == __PYX_GET_DICT_VERSION(__pyx_d))) ?\ (likely(__pyx_dict_cached_value) ? __Pyx_NewRef(__pyx_dict_cached_value) : __Pyx_GetBuiltinName(name)) :\ __Pyx__GetModuleGlobalName(name, &__pyx_dict_version, &__pyx_dict_cached_value);\ } #define __Pyx_GetModuleGlobalNameUncached(var, name) {\ PY_UINT64_T __pyx_dict_version;\ PyObject *__pyx_dict_cached_value;\ (var) = __Pyx__GetModuleGlobalName(name, &__pyx_dict_version, &__pyx_dict_cached_value);\ } static PyObject *__Pyx__GetModuleGlobalName(PyObject *name, PY_UINT64_T *dict_version, PyObject **dict_cached_value); #else #define __Pyx_GetModuleGlobalName(var, name) (var) = __Pyx__GetModuleGlobalName(name) #define __Pyx_GetModuleGlobalNameUncached(var, name) (var) = __Pyx__GetModuleGlobalName(name) static CYTHON_INLINE PyObject *__Pyx__GetModuleGlobalName(PyObject *name); #endif /* Import.proto */ static PyObject *__Pyx_Import(PyObject *name, PyObject *from_list, int level); /* ImportFrom.proto */ static PyObject* __Pyx_ImportFrom(PyObject* module, PyObject* name); /* HasAttr.proto */ static CYTHON_INLINE int __Pyx_HasAttr(PyObject *, PyObject *); /* PyObject_GenericGetAttrNoDict.proto */ #if CYTHON_USE_TYPE_SLOTS && CYTHON_USE_PYTYPE_LOOKUP && PY_VERSION_HEX < 0x03070000 static CYTHON_INLINE PyObject* __Pyx_PyObject_GenericGetAttrNoDict(PyObject* obj, PyObject* attr_name); #else #define __Pyx_PyObject_GenericGetAttrNoDict PyObject_GenericGetAttr #endif /* PyObject_GenericGetAttr.proto */ #if CYTHON_USE_TYPE_SLOTS && CYTHON_USE_PYTYPE_LOOKUP && PY_VERSION_HEX < 0x03070000 static PyObject* __Pyx_PyObject_GenericGetAttr(PyObject* obj, PyObject* attr_name); #else #define __Pyx_PyObject_GenericGetAttr PyObject_GenericGetAttr #endif /* SetVTable.proto */ static int __Pyx_SetVtable(PyObject *dict, void *vtable); /* SetupReduce.proto */ static int __Pyx_setup_reduce(PyObject* type_obj); /* CLineInTraceback.proto */ #ifdef CYTHON_CLINE_IN_TRACEBACK #define __Pyx_CLineForTraceback(tstate, c_line) (((CYTHON_CLINE_IN_TRACEBACK)) ? 
c_line : 0) #else static int __Pyx_CLineForTraceback(PyThreadState *tstate, int c_line); #endif /* CodeObjectCache.proto */ typedef struct { PyCodeObject* code_object; int code_line; } __Pyx_CodeObjectCacheEntry; struct __Pyx_CodeObjectCache { int count; int max_count; __Pyx_CodeObjectCacheEntry* entries; }; static struct __Pyx_CodeObjectCache __pyx_code_cache = {0,0,NULL}; static int __pyx_bisect_code_objects(__Pyx_CodeObjectCacheEntry* entries, int count, int code_line); static PyCodeObject *__pyx_find_code_object(int code_line); static void __pyx_insert_code_object(int code_line, PyCodeObject* code_object); /* AddTraceback.proto */ static void __Pyx_AddTraceback(const char *funcname, int c_line, int py_line, const char *filename); /* CIntToPy.proto */ static CYTHON_INLINE PyObject* __Pyx_PyInt_From_int(int value); /* CIntToPy.proto */ static CYTHON_INLINE PyObject* __Pyx_PyInt_From_long(long value); /* CIntFromPy.proto */ static CYTHON_INLINE long __Pyx_PyInt_As_long(PyObject *); /* CIntFromPy.proto */ static CYTHON_INLINE int __Pyx_PyInt_As_int(PyObject *); /* FastTypeChecks.proto */ #if CYTHON_COMPILING_IN_CPYTHON #define __Pyx_TypeCheck(obj, type) __Pyx_IsSubtype(Py_TYPE(obj), (PyTypeObject *)type) static CYTHON_INLINE int __Pyx_IsSubtype(PyTypeObject *a, PyTypeObject *b); static CYTHON_INLINE int __Pyx_PyErr_GivenExceptionMatches(PyObject *err, PyObject *type); static CYTHON_INLINE int __Pyx_PyErr_GivenExceptionMatches2(PyObject *err, PyObject *type1, PyObject *type2); #else #define __Pyx_TypeCheck(obj, type) PyObject_TypeCheck(obj, (PyTypeObject *)type) #define __Pyx_PyErr_GivenExceptionMatches(err, type) PyErr_GivenExceptionMatches(err, type) #define __Pyx_PyErr_GivenExceptionMatches2(err, type1, type2) (PyErr_GivenExceptionMatches(err, type1) || PyErr_GivenExceptionMatches(err, type2)) #endif #define __Pyx_PyException_Check(obj) __Pyx_TypeCheck(obj, PyExc_Exception) /* CheckBinaryVersion.proto */ static int __Pyx_check_binary_version(void); /* InitStrings.proto */ static int __Pyx_InitStrings(__Pyx_StringTabEntry *t); static PyObject *__pyx_f_7aiohttp_11_frozenlist_10FrozenList__check_frozen(struct __pyx_obj_7aiohttp_11_frozenlist_FrozenList *__pyx_v_self); /* proto*/ static CYTHON_INLINE PyObject *__pyx_f_7aiohttp_11_frozenlist_10FrozenList__fast_len(struct __pyx_obj_7aiohttp_11_frozenlist_FrozenList *__pyx_v_self); /* proto*/ /* Module declarations from 'aiohttp._frozenlist' */ static PyTypeObject *__pyx_ptype_7aiohttp_11_frozenlist_FrozenList = 0; static PyObject *__pyx_f_7aiohttp_11_frozenlist___pyx_unpickle_FrozenList__set_state(struct __pyx_obj_7aiohttp_11_frozenlist_FrozenList *, PyObject *); /*proto*/ #define __Pyx_MODULE_NAME "aiohttp._frozenlist" extern int __pyx_module_is_main_aiohttp___frozenlist; int __pyx_module_is_main_aiohttp___frozenlist = 0; /* Implementation of 'aiohttp._frozenlist' */ static PyObject *__pyx_builtin_RuntimeError; static const char __pyx_k_new[] = "__new__"; static const char __pyx_k_pop[] = "pop"; static const char __pyx_k_pos[] = "pos"; static const char __pyx_k_dict[] = "__dict__"; static const char __pyx_k_item[] = "item"; static const char __pyx_k_iter[] = "__iter__"; static const char __pyx_k_main[] = "__main__"; static const char __pyx_k_name[] = "__name__"; static const char __pyx_k_test[] = "__test__"; static const char __pyx_k_clear[] = "clear"; static const char __pyx_k_count[] = "count"; static const char __pyx_k_index[] = "index"; static const char __pyx_k_items[] = "items"; static const char __pyx_k_format[] = "format"; static 
const char __pyx_k_import[] = "__import__"; static const char __pyx_k_pickle[] = "pickle"; static const char __pyx_k_reduce[] = "__reduce__"; static const char __pyx_k_remove[] = "remove"; static const char __pyx_k_update[] = "update"; static const char __pyx_k_getstate[] = "__getstate__"; static const char __pyx_k_pyx_type[] = "__pyx_type"; static const char __pyx_k_register[] = "register"; static const char __pyx_k_reversed[] = "__reversed__"; static const char __pyx_k_setstate[] = "__setstate__"; static const char __pyx_k_pyx_state[] = "__pyx_state"; static const char __pyx_k_reduce_ex[] = "__reduce_ex__"; static const char __pyx_k_FrozenList[] = "FrozenList"; static const char __pyx_k_pyx_result[] = "__pyx_result"; static const char __pyx_k_pyx_vtable[] = "__pyx_vtable__"; static const char __pyx_k_PickleError[] = "PickleError"; static const char __pyx_k_RuntimeError[] = "RuntimeError"; static const char __pyx_k_pyx_checksum[] = "__pyx_checksum"; static const char __pyx_k_stringsource[] = "stringsource"; static const char __pyx_k_reduce_cython[] = "__reduce_cython__"; static const char __pyx_k_MutableSequence[] = "MutableSequence"; static const char __pyx_k_collections_abc[] = "collections.abc"; static const char __pyx_k_pyx_PickleError[] = "__pyx_PickleError"; static const char __pyx_k_setstate_cython[] = "__setstate_cython__"; static const char __pyx_k_cline_in_traceback[] = "cline_in_traceback"; static const char __pyx_k_FrozenList_frozen_r[] = "<FrozenList(frozen={}, {!r})>"; static const char __pyx_k_aiohttp__frozenlist[] = "aiohttp._frozenlist"; static const char __pyx_k_pyx_unpickle_FrozenList[] = "__pyx_unpickle_FrozenList"; static const char __pyx_k_Cannot_modify_frozen_list[] = "Cannot modify frozen list."; static const char __pyx_k_Incompatible_checksums_s_vs_0x94[] = "Incompatible checksums (%s vs 0x949a143 = (_items, frozen))"; static PyObject *__pyx_kp_u_Cannot_modify_frozen_list; static PyObject *__pyx_n_s_FrozenList; static PyObject *__pyx_kp_u_FrozenList_frozen_r; static PyObject *__pyx_kp_s_Incompatible_checksums_s_vs_0x94; static PyObject *__pyx_n_s_MutableSequence; static PyObject *__pyx_n_s_PickleError; static PyObject *__pyx_n_s_RuntimeError; static PyObject *__pyx_n_s_aiohttp__frozenlist; static PyObject *__pyx_n_s_clear; static PyObject *__pyx_n_s_cline_in_traceback; static PyObject *__pyx_n_s_collections_abc; static PyObject *__pyx_n_s_count; static PyObject *__pyx_n_s_dict; static PyObject *__pyx_n_s_format; static PyObject *__pyx_n_s_getstate; static PyObject *__pyx_n_s_import; static PyObject *__pyx_n_s_index; static PyObject *__pyx_n_s_item; static PyObject *__pyx_n_s_items; static PyObject *__pyx_n_s_iter; static PyObject *__pyx_n_s_main; static PyObject *__pyx_n_s_name; static PyObject *__pyx_n_s_new; static PyObject *__pyx_n_s_pickle; static PyObject *__pyx_n_s_pop; static PyObject *__pyx_n_s_pos; static PyObject *__pyx_n_s_pyx_PickleError; static PyObject *__pyx_n_s_pyx_checksum; static PyObject *__pyx_n_s_pyx_result; static PyObject *__pyx_n_s_pyx_state; static PyObject *__pyx_n_s_pyx_type; static PyObject *__pyx_n_s_pyx_unpickle_FrozenList; static PyObject *__pyx_n_s_pyx_vtable; static PyObject *__pyx_n_s_reduce; static PyObject *__pyx_n_s_reduce_cython; static PyObject *__pyx_n_s_reduce_ex; static PyObject *__pyx_n_s_register; static PyObject *__pyx_n_s_remove; static PyObject *__pyx_n_s_reversed; static PyObject *__pyx_n_s_setstate; static PyObject *__pyx_n_s_setstate_cython; static PyObject *__pyx_kp_s_stringsource; static PyObject *__pyx_n_s_test; static PyObject
*__pyx_n_s_update; static int __pyx_pf_7aiohttp_11_frozenlist_10FrozenList___init__(struct __pyx_obj_7aiohttp_11_frozenlist_FrozenList *__pyx_v_self, PyObject *__pyx_v_items); /* proto */ static PyObject *__pyx_pf_7aiohttp_11_frozenlist_10FrozenList_2freeze(struct __pyx_obj_7aiohttp_11_frozenlist_FrozenList *__pyx_v_self); /* proto */ static PyObject *__pyx_pf_7aiohttp_11_frozenlist_10FrozenList_4__getitem__(struct __pyx_obj_7aiohttp_11_frozenlist_FrozenList *__pyx_v_self, PyObject *__pyx_v_index); /* proto */ static int __pyx_pf_7aiohttp_11_frozenlist_10FrozenList_6__setitem__(struct __pyx_obj_7aiohttp_11_frozenlist_FrozenList *__pyx_v_self, PyObject *__pyx_v_index, PyObject *__pyx_v_value); /* proto */ static int __pyx_pf_7aiohttp_11_frozenlist_10FrozenList_8__delitem__(struct __pyx_obj_7aiohttp_11_frozenlist_FrozenList *__pyx_v_self, PyObject *__pyx_v_index); /* proto */ static Py_ssize_t __pyx_pf_7aiohttp_11_frozenlist_10FrozenList_10__len__(struct __pyx_obj_7aiohttp_11_frozenlist_FrozenList *__pyx_v_self); /* proto */ static PyObject *__pyx_pf_7aiohttp_11_frozenlist_10FrozenList_12__iter__(struct __pyx_obj_7aiohttp_11_frozenlist_FrozenList *__pyx_v_self); /* proto */ static PyObject *__pyx_pf_7aiohttp_11_frozenlist_10FrozenList_14__reversed__(struct __pyx_obj_7aiohttp_11_frozenlist_FrozenList *__pyx_v_self); /* proto */ static PyObject *__pyx_pf_7aiohttp_11_frozenlist_10FrozenList_16__richcmp__(struct __pyx_obj_7aiohttp_11_frozenlist_FrozenList *__pyx_v_self, PyObject *__pyx_v_other, PyObject *__pyx_v_op); /* proto */ static PyObject *__pyx_pf_7aiohttp_11_frozenlist_10FrozenList_18insert(struct __pyx_obj_7aiohttp_11_frozenlist_FrozenList *__pyx_v_self, PyObject *__pyx_v_pos, PyObject *__pyx_v_item); /* proto */ static int __pyx_pf_7aiohttp_11_frozenlist_10FrozenList_20__contains__(struct __pyx_obj_7aiohttp_11_frozenlist_FrozenList *__pyx_v_self, PyObject *__pyx_v_item); /* proto */ static PyObject *__pyx_pf_7aiohttp_11_frozenlist_10FrozenList_22__iadd__(struct __pyx_obj_7aiohttp_11_frozenlist_FrozenList *__pyx_v_self, PyObject *__pyx_v_items); /* proto */ static PyObject *__pyx_pf_7aiohttp_11_frozenlist_10FrozenList_24index(struct __pyx_obj_7aiohttp_11_frozenlist_FrozenList *__pyx_v_self, PyObject *__pyx_v_item); /* proto */ static PyObject *__pyx_pf_7aiohttp_11_frozenlist_10FrozenList_26remove(struct __pyx_obj_7aiohttp_11_frozenlist_FrozenList *__pyx_v_self, PyObject *__pyx_v_item); /* proto */ static PyObject *__pyx_pf_7aiohttp_11_frozenlist_10FrozenList_28clear(struct __pyx_obj_7aiohttp_11_frozenlist_FrozenList *__pyx_v_self); /* proto */ static PyObject *__pyx_pf_7aiohttp_11_frozenlist_10FrozenList_30extend(struct __pyx_obj_7aiohttp_11_frozenlist_FrozenList *__pyx_v_self, PyObject *__pyx_v_items); /* proto */ static PyObject *__pyx_pf_7aiohttp_11_frozenlist_10FrozenList_32reverse(struct __pyx_obj_7aiohttp_11_frozenlist_FrozenList *__pyx_v_self); /* proto */ static PyObject *__pyx_pf_7aiohttp_11_frozenlist_10FrozenList_34pop(struct __pyx_obj_7aiohttp_11_frozenlist_FrozenList *__pyx_v_self, PyObject *__pyx_v_index); /* proto */ static PyObject *__pyx_pf_7aiohttp_11_frozenlist_10FrozenList_36append(struct __pyx_obj_7aiohttp_11_frozenlist_FrozenList *__pyx_v_self, PyObject *__pyx_v_item); /* proto */ static PyObject *__pyx_pf_7aiohttp_11_frozenlist_10FrozenList_38count(struct __pyx_obj_7aiohttp_11_frozenlist_FrozenList *__pyx_v_self, PyObject *__pyx_v_item); /* proto */ static PyObject *__pyx_pf_7aiohttp_11_frozenlist_10FrozenList_40__repr__(struct 
__pyx_obj_7aiohttp_11_frozenlist_FrozenList *__pyx_v_self); /* proto */ static PyObject *__pyx_pf_7aiohttp_11_frozenlist_10FrozenList_6frozen___get__(struct __pyx_obj_7aiohttp_11_frozenlist_FrozenList *__pyx_v_self); /* proto */ static PyObject *__pyx_pf_7aiohttp_11_frozenlist_10FrozenList_42__reduce_cython__(struct __pyx_obj_7aiohttp_11_frozenlist_FrozenList *__pyx_v_self); /* proto */ static PyObject *__pyx_pf_7aiohttp_11_frozenlist_10FrozenList_44__setstate_cython__(struct __pyx_obj_7aiohttp_11_frozenlist_FrozenList *__pyx_v_self, PyObject *__pyx_v___pyx_state); /* proto */ static PyObject *__pyx_pf_7aiohttp_11_frozenlist___pyx_unpickle_FrozenList(CYTHON_UNUSED PyObject *__pyx_self, PyObject *__pyx_v___pyx_type, long __pyx_v___pyx_checksum, PyObject *__pyx_v___pyx_state); /* proto */ static PyObject *__pyx_tp_new_7aiohttp_11_frozenlist_FrozenList(PyTypeObject *t, PyObject *a, PyObject *k); /*proto*/ static PyObject *__pyx_int_0; static PyObject *__pyx_int_1; static PyObject *__pyx_int_2; static PyObject *__pyx_int_3; static PyObject *__pyx_int_4; static PyObject *__pyx_int_5; static PyObject *__pyx_int_155820355; static PyObject *__pyx_int_neg_1; static PyObject *__pyx_tuple_; static PyObject *__pyx_tuple__2; static PyObject *__pyx_codeobj__3; /* Late includes */ /* "aiohttp/_frozenlist.pyx":9 * cdef list _items * * def __init__(self, items=None): # <<<<<<<<<<<<<< * self.frozen = False * if items is not None: */ /* Python wrapper */ static int __pyx_pw_7aiohttp_11_frozenlist_10FrozenList_1__init__(PyObject *__pyx_v_self, PyObject *__pyx_args, PyObject *__pyx_kwds); /*proto*/ static int __pyx_pw_7aiohttp_11_frozenlist_10FrozenList_1__init__(PyObject *__pyx_v_self, PyObject *__pyx_args, PyObject *__pyx_kwds) { PyObject *__pyx_v_items = 0; int __pyx_r; __Pyx_RefNannyDeclarations __Pyx_RefNannySetupContext("__init__ (wrapper)", 0); { static PyObject **__pyx_pyargnames[] = {&__pyx_n_s_items,0}; PyObject* values[1] = {0}; values[0] = ((PyObject *)Py_None); if (unlikely(__pyx_kwds)) { Py_ssize_t kw_args; const Py_ssize_t pos_args = PyTuple_GET_SIZE(__pyx_args); switch (pos_args) { case 1: values[0] = PyTuple_GET_ITEM(__pyx_args, 0); CYTHON_FALLTHROUGH; case 0: break; default: goto __pyx_L5_argtuple_error; } kw_args = PyDict_Size(__pyx_kwds); switch (pos_args) { case 0: if (kw_args > 0) { PyObject* value = __Pyx_PyDict_GetItemStr(__pyx_kwds, __pyx_n_s_items); if (value) { values[0] = value; kw_args--; } } } if (unlikely(kw_args > 0)) { if (unlikely(__Pyx_ParseOptionalKeywords(__pyx_kwds, __pyx_pyargnames, 0, values, pos_args, "__init__") < 0)) __PYX_ERR(0, 9, __pyx_L3_error) } } else { switch (PyTuple_GET_SIZE(__pyx_args)) { case 1: values[0] = PyTuple_GET_ITEM(__pyx_args, 0); CYTHON_FALLTHROUGH; case 0: break; default: goto __pyx_L5_argtuple_error; } } __pyx_v_items = values[0]; } goto __pyx_L4_argument_unpacking_done; __pyx_L5_argtuple_error:; __Pyx_RaiseArgtupleInvalid("__init__", 0, 0, 1, PyTuple_GET_SIZE(__pyx_args)); __PYX_ERR(0, 9, __pyx_L3_error) __pyx_L3_error:; __Pyx_AddTraceback("aiohttp._frozenlist.FrozenList.__init__", __pyx_clineno, __pyx_lineno, __pyx_filename); __Pyx_RefNannyFinishContext(); return -1; __pyx_L4_argument_unpacking_done:; __pyx_r = __pyx_pf_7aiohttp_11_frozenlist_10FrozenList___init__(((struct __pyx_obj_7aiohttp_11_frozenlist_FrozenList *)__pyx_v_self), __pyx_v_items); /* function exit code */ __Pyx_RefNannyFinishContext(); return __pyx_r; } static int __pyx_pf_7aiohttp_11_frozenlist_10FrozenList___init__(struct __pyx_obj_7aiohttp_11_frozenlist_FrozenList 
*__pyx_v_self, PyObject *__pyx_v_items) { int __pyx_r; __Pyx_RefNannyDeclarations int __pyx_t_1; int __pyx_t_2; PyObject *__pyx_t_3 = NULL; __Pyx_RefNannySetupContext("__init__", 0); __Pyx_INCREF(__pyx_v_items); /* "aiohttp/_frozenlist.pyx":10 * * def __init__(self, items=None): * self.frozen = False # <<<<<<<<<<<<<< * if items is not None: * items = list(items) */ __pyx_v_self->frozen = 0; /* "aiohttp/_frozenlist.pyx":11 * def __init__(self, items=None): * self.frozen = False * if items is not None: # <<<<<<<<<<<<<< * items = list(items) * else: */ __pyx_t_1 = (__pyx_v_items != Py_None); __pyx_t_2 = (__pyx_t_1 != 0); if (__pyx_t_2) { /* "aiohttp/_frozenlist.pyx":12 * self.frozen = False * if items is not None: * items = list(items) # <<<<<<<<<<<<<< * else: * items = [] */ __pyx_t_3 = PySequence_List(__pyx_v_items); if (unlikely(!__pyx_t_3)) __PYX_ERR(0, 12, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_3); __Pyx_DECREF_SET(__pyx_v_items, __pyx_t_3); __pyx_t_3 = 0; /* "aiohttp/_frozenlist.pyx":11 * def __init__(self, items=None): * self.frozen = False * if items is not None: # <<<<<<<<<<<<<< * items = list(items) * else: */ goto __pyx_L3; } /* "aiohttp/_frozenlist.pyx":14 * items = list(items) * else: * items = [] # <<<<<<<<<<<<<< * self._items = items * */ /*else*/ { __pyx_t_3 = PyList_New(0); if (unlikely(!__pyx_t_3)) __PYX_ERR(0, 14, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_3); __Pyx_DECREF_SET(__pyx_v_items, __pyx_t_3); __pyx_t_3 = 0; } __pyx_L3:; /* "aiohttp/_frozenlist.pyx":15 * else: * items = [] * self._items = items # <<<<<<<<<<<<<< * * cdef object _check_frozen(self): */ if (!(likely(PyList_CheckExact(__pyx_v_items))||(PyErr_Format(PyExc_TypeError, "Expected %.16s, got %.200s", "list", Py_TYPE(__pyx_v_items)->tp_name), 0))) __PYX_ERR(0, 15, __pyx_L1_error) __pyx_t_3 = __pyx_v_items; __Pyx_INCREF(__pyx_t_3); __Pyx_GIVEREF(__pyx_t_3); __Pyx_GOTREF(__pyx_v_self->_items); __Pyx_DECREF(__pyx_v_self->_items); __pyx_v_self->_items = ((PyObject*)__pyx_t_3); __pyx_t_3 = 0; /* "aiohttp/_frozenlist.pyx":9 * cdef list _items * * def __init__(self, items=None): # <<<<<<<<<<<<<< * self.frozen = False * if items is not None: */ /* function exit code */ __pyx_r = 0; goto __pyx_L0; __pyx_L1_error:; __Pyx_XDECREF(__pyx_t_3); __Pyx_AddTraceback("aiohttp._frozenlist.FrozenList.__init__", __pyx_clineno, __pyx_lineno, __pyx_filename); __pyx_r = -1; __pyx_L0:; __Pyx_XDECREF(__pyx_v_items); __Pyx_RefNannyFinishContext(); return __pyx_r; } /* "aiohttp/_frozenlist.pyx":17 * self._items = items * * cdef object _check_frozen(self): # <<<<<<<<<<<<<< * if self.frozen: * raise RuntimeError("Cannot modify frozen list.") */ static PyObject *__pyx_f_7aiohttp_11_frozenlist_10FrozenList__check_frozen(struct __pyx_obj_7aiohttp_11_frozenlist_FrozenList *__pyx_v_self) { PyObject *__pyx_r = NULL; __Pyx_RefNannyDeclarations int __pyx_t_1; PyObject *__pyx_t_2 = NULL; __Pyx_RefNannySetupContext("_check_frozen", 0); /* "aiohttp/_frozenlist.pyx":18 * * cdef object _check_frozen(self): * if self.frozen: # <<<<<<<<<<<<<< * raise RuntimeError("Cannot modify frozen list.") * */ __pyx_t_1 = (__pyx_v_self->frozen != 0); if (unlikely(__pyx_t_1)) { /* "aiohttp/_frozenlist.pyx":19 * cdef object _check_frozen(self): * if self.frozen: * raise RuntimeError("Cannot modify frozen list.") # <<<<<<<<<<<<<< * * cdef inline object _fast_len(self): */ __pyx_t_2 = __Pyx_PyObject_Call(__pyx_builtin_RuntimeError, __pyx_tuple_, NULL); if (unlikely(!__pyx_t_2)) __PYX_ERR(0, 19, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_2); __Pyx_Raise(__pyx_t_2, 0, 0, 0); 
__Pyx_DECREF(__pyx_t_2); __pyx_t_2 = 0; __PYX_ERR(0, 19, __pyx_L1_error) /* "aiohttp/_frozenlist.pyx":18 * * cdef object _check_frozen(self): * if self.frozen: # <<<<<<<<<<<<<< * raise RuntimeError("Cannot modify frozen list.") * */ } /* "aiohttp/_frozenlist.pyx":17 * self._items = items * * cdef object _check_frozen(self): # <<<<<<<<<<<<<< * if self.frozen: * raise RuntimeError("Cannot modify frozen list.") */ /* function exit code */ __pyx_r = Py_None; __Pyx_INCREF(Py_None); goto __pyx_L0; __pyx_L1_error:; __Pyx_XDECREF(__pyx_t_2); __Pyx_AddTraceback("aiohttp._frozenlist.FrozenList._check_frozen", __pyx_clineno, __pyx_lineno, __pyx_filename); __pyx_r = 0; __pyx_L0:; __Pyx_XGIVEREF(__pyx_r); __Pyx_RefNannyFinishContext(); return __pyx_r; } /* "aiohttp/_frozenlist.pyx":21 * raise RuntimeError("Cannot modify frozen list.") * * cdef inline object _fast_len(self): # <<<<<<<<<<<<<< * return len(self._items) * */ static CYTHON_INLINE PyObject *__pyx_f_7aiohttp_11_frozenlist_10FrozenList__fast_len(struct __pyx_obj_7aiohttp_11_frozenlist_FrozenList *__pyx_v_self) { PyObject *__pyx_r = NULL; __Pyx_RefNannyDeclarations PyObject *__pyx_t_1 = NULL; Py_ssize_t __pyx_t_2; __Pyx_RefNannySetupContext("_fast_len", 0); /* "aiohttp/_frozenlist.pyx":22 * * cdef inline object _fast_len(self): * return len(self._items) # <<<<<<<<<<<<<< * * def freeze(self): */ __Pyx_XDECREF(__pyx_r); __pyx_t_1 = __pyx_v_self->_items; __Pyx_INCREF(__pyx_t_1); if (unlikely(__pyx_t_1 == Py_None)) { PyErr_SetString(PyExc_TypeError, "object of type 'NoneType' has no len()"); __PYX_ERR(0, 22, __pyx_L1_error) } __pyx_t_2 = PyList_GET_SIZE(__pyx_t_1); if (unlikely(__pyx_t_2 == ((Py_ssize_t)-1))) __PYX_ERR(0, 22, __pyx_L1_error) __Pyx_DECREF(__pyx_t_1); __pyx_t_1 = 0; __pyx_t_1 = PyInt_FromSsize_t(__pyx_t_2); if (unlikely(!__pyx_t_1)) __PYX_ERR(0, 22, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_1); __pyx_r = __pyx_t_1; __pyx_t_1 = 0; goto __pyx_L0; /* "aiohttp/_frozenlist.pyx":21 * raise RuntimeError("Cannot modify frozen list.") * * cdef inline object _fast_len(self): # <<<<<<<<<<<<<< * return len(self._items) * */ /* function exit code */ __pyx_L1_error:; __Pyx_XDECREF(__pyx_t_1); __Pyx_AddTraceback("aiohttp._frozenlist.FrozenList._fast_len", __pyx_clineno, __pyx_lineno, __pyx_filename); __pyx_r = 0; __pyx_L0:; __Pyx_XGIVEREF(__pyx_r); __Pyx_RefNannyFinishContext(); return __pyx_r; } /* "aiohttp/_frozenlist.pyx":24 * return len(self._items) * * def freeze(self): # <<<<<<<<<<<<<< * self.frozen = True * */ /* Python wrapper */ static PyObject *__pyx_pw_7aiohttp_11_frozenlist_10FrozenList_3freeze(PyObject *__pyx_v_self, CYTHON_UNUSED PyObject *unused); /*proto*/ static PyObject *__pyx_pw_7aiohttp_11_frozenlist_10FrozenList_3freeze(PyObject *__pyx_v_self, CYTHON_UNUSED PyObject *unused) { PyObject *__pyx_r = 0; __Pyx_RefNannyDeclarations __Pyx_RefNannySetupContext("freeze (wrapper)", 0); __pyx_r = __pyx_pf_7aiohttp_11_frozenlist_10FrozenList_2freeze(((struct __pyx_obj_7aiohttp_11_frozenlist_FrozenList *)__pyx_v_self)); /* function exit code */ __Pyx_RefNannyFinishContext(); return __pyx_r; } static PyObject *__pyx_pf_7aiohttp_11_frozenlist_10FrozenList_2freeze(struct __pyx_obj_7aiohttp_11_frozenlist_FrozenList *__pyx_v_self) { PyObject *__pyx_r = NULL; __Pyx_RefNannyDeclarations __Pyx_RefNannySetupContext("freeze", 0); /* "aiohttp/_frozenlist.pyx":25 * * def freeze(self): * self.frozen = True # <<<<<<<<<<<<<< * * def __getitem__(self, index): */ __pyx_v_self->frozen = 1; /* "aiohttp/_frozenlist.pyx":24 * return len(self._items) * * def 
freeze(self): # <<<<<<<<<<<<<< * self.frozen = True * */ /* function exit code */ __pyx_r = Py_None; __Pyx_INCREF(Py_None); __Pyx_XGIVEREF(__pyx_r); __Pyx_RefNannyFinishContext(); return __pyx_r; } /* "aiohttp/_frozenlist.pyx":27 * self.frozen = True * * def __getitem__(self, index): # <<<<<<<<<<<<<< * return self._items[index] * */ /* Python wrapper */ static PyObject *__pyx_pw_7aiohttp_11_frozenlist_10FrozenList_5__getitem__(PyObject *__pyx_v_self, PyObject *__pyx_v_index); /*proto*/ static PyObject *__pyx_pw_7aiohttp_11_frozenlist_10FrozenList_5__getitem__(PyObject *__pyx_v_self, PyObject *__pyx_v_index) { PyObject *__pyx_r = 0; __Pyx_RefNannyDeclarations __Pyx_RefNannySetupContext("__getitem__ (wrapper)", 0); __pyx_r = __pyx_pf_7aiohttp_11_frozenlist_10FrozenList_4__getitem__(((struct __pyx_obj_7aiohttp_11_frozenlist_FrozenList *)__pyx_v_self), ((PyObject *)__pyx_v_index)); /* function exit code */ __Pyx_RefNannyFinishContext(); return __pyx_r; } static PyObject *__pyx_pf_7aiohttp_11_frozenlist_10FrozenList_4__getitem__(struct __pyx_obj_7aiohttp_11_frozenlist_FrozenList *__pyx_v_self, PyObject *__pyx_v_index) { PyObject *__pyx_r = NULL; __Pyx_RefNannyDeclarations PyObject *__pyx_t_1 = NULL; __Pyx_RefNannySetupContext("__getitem__", 0); /* "aiohttp/_frozenlist.pyx":28 * * def __getitem__(self, index): * return self._items[index] # <<<<<<<<<<<<<< * * def __setitem__(self, index, value): */ __Pyx_XDECREF(__pyx_r); if (unlikely(__pyx_v_self->_items == Py_None)) { PyErr_SetString(PyExc_TypeError, "'NoneType' object is not subscriptable"); __PYX_ERR(0, 28, __pyx_L1_error) } __pyx_t_1 = __Pyx_PyObject_GetItem(__pyx_v_self->_items, __pyx_v_index); if (unlikely(!__pyx_t_1)) __PYX_ERR(0, 28, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_1); __pyx_r = __pyx_t_1; __pyx_t_1 = 0; goto __pyx_L0; /* "aiohttp/_frozenlist.pyx":27 * self.frozen = True * * def __getitem__(self, index): # <<<<<<<<<<<<<< * return self._items[index] * */ /* function exit code */ __pyx_L1_error:; __Pyx_XDECREF(__pyx_t_1); __Pyx_AddTraceback("aiohttp._frozenlist.FrozenList.__getitem__", __pyx_clineno, __pyx_lineno, __pyx_filename); __pyx_r = NULL; __pyx_L0:; __Pyx_XGIVEREF(__pyx_r); __Pyx_RefNannyFinishContext(); return __pyx_r; } /* "aiohttp/_frozenlist.pyx":30 * return self._items[index] * * def __setitem__(self, index, value): # <<<<<<<<<<<<<< * self._check_frozen() * self._items[index] = value */ /* Python wrapper */ static int __pyx_pw_7aiohttp_11_frozenlist_10FrozenList_7__setitem__(PyObject *__pyx_v_self, PyObject *__pyx_v_index, PyObject *__pyx_v_value); /*proto*/ static int __pyx_pw_7aiohttp_11_frozenlist_10FrozenList_7__setitem__(PyObject *__pyx_v_self, PyObject *__pyx_v_index, PyObject *__pyx_v_value) { int __pyx_r; __Pyx_RefNannyDeclarations __Pyx_RefNannySetupContext("__setitem__ (wrapper)", 0); __pyx_r = __pyx_pf_7aiohttp_11_frozenlist_10FrozenList_6__setitem__(((struct __pyx_obj_7aiohttp_11_frozenlist_FrozenList *)__pyx_v_self), ((PyObject *)__pyx_v_index), ((PyObject *)__pyx_v_value)); /* function exit code */ __Pyx_RefNannyFinishContext(); return __pyx_r; } static int __pyx_pf_7aiohttp_11_frozenlist_10FrozenList_6__setitem__(struct __pyx_obj_7aiohttp_11_frozenlist_FrozenList *__pyx_v_self, PyObject *__pyx_v_index, PyObject *__pyx_v_value) { int __pyx_r; __Pyx_RefNannyDeclarations PyObject *__pyx_t_1 = NULL; __Pyx_RefNannySetupContext("__setitem__", 0); /* "aiohttp/_frozenlist.pyx":31 * * def __setitem__(self, index, value): * self._check_frozen() # <<<<<<<<<<<<<< * self._items[index] = value * */ __pyx_t_1 = ((struct 
__pyx_vtabstruct_7aiohttp_11_frozenlist_FrozenList *)__pyx_v_self->__pyx_vtab)->_check_frozen(__pyx_v_self); if (unlikely(!__pyx_t_1)) __PYX_ERR(0, 31, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_1); __Pyx_DECREF(__pyx_t_1); __pyx_t_1 = 0; /* "aiohttp/_frozenlist.pyx":32 * def __setitem__(self, index, value): * self._check_frozen() * self._items[index] = value # <<<<<<<<<<<<<< * * def __delitem__(self, index): */ if (unlikely(__pyx_v_self->_items == Py_None)) { PyErr_SetString(PyExc_TypeError, "'NoneType' object is not subscriptable"); __PYX_ERR(0, 32, __pyx_L1_error) } if (unlikely(PyObject_SetItem(__pyx_v_self->_items, __pyx_v_index, __pyx_v_value) < 0)) __PYX_ERR(0, 32, __pyx_L1_error) /* "aiohttp/_frozenlist.pyx":30 * return self._items[index] * * def __setitem__(self, index, value): # <<<<<<<<<<<<<< * self._check_frozen() * self._items[index] = value */ /* function exit code */ __pyx_r = 0; goto __pyx_L0; __pyx_L1_error:; __Pyx_XDECREF(__pyx_t_1); __Pyx_AddTraceback("aiohttp._frozenlist.FrozenList.__setitem__", __pyx_clineno, __pyx_lineno, __pyx_filename); __pyx_r = -1; __pyx_L0:; __Pyx_RefNannyFinishContext(); return __pyx_r; } /* "aiohttp/_frozenlist.pyx":34 * self._items[index] = value * * def __delitem__(self, index): # <<<<<<<<<<<<<< * self._check_frozen() * del self._items[index] */ /* Python wrapper */ static int __pyx_pw_7aiohttp_11_frozenlist_10FrozenList_9__delitem__(PyObject *__pyx_v_self, PyObject *__pyx_v_index); /*proto*/ static int __pyx_pw_7aiohttp_11_frozenlist_10FrozenList_9__delitem__(PyObject *__pyx_v_self, PyObject *__pyx_v_index) { int __pyx_r; __Pyx_RefNannyDeclarations __Pyx_RefNannySetupContext("__delitem__ (wrapper)", 0); __pyx_r = __pyx_pf_7aiohttp_11_frozenlist_10FrozenList_8__delitem__(((struct __pyx_obj_7aiohttp_11_frozenlist_FrozenList *)__pyx_v_self), ((PyObject *)__pyx_v_index)); /* function exit code */ __Pyx_RefNannyFinishContext(); return __pyx_r; } static int __pyx_pf_7aiohttp_11_frozenlist_10FrozenList_8__delitem__(struct __pyx_obj_7aiohttp_11_frozenlist_FrozenList *__pyx_v_self, PyObject *__pyx_v_index) { int __pyx_r; __Pyx_RefNannyDeclarations PyObject *__pyx_t_1 = NULL; __Pyx_RefNannySetupContext("__delitem__", 0); /* "aiohttp/_frozenlist.pyx":35 * * def __delitem__(self, index): * self._check_frozen() # <<<<<<<<<<<<<< * del self._items[index] * */ __pyx_t_1 = ((struct __pyx_vtabstruct_7aiohttp_11_frozenlist_FrozenList *)__pyx_v_self->__pyx_vtab)->_check_frozen(__pyx_v_self); if (unlikely(!__pyx_t_1)) __PYX_ERR(0, 35, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_1); __Pyx_DECREF(__pyx_t_1); __pyx_t_1 = 0; /* "aiohttp/_frozenlist.pyx":36 * def __delitem__(self, index): * self._check_frozen() * del self._items[index] # <<<<<<<<<<<<<< * * def __len__(self): */ if (unlikely(__pyx_v_self->_items == Py_None)) { PyErr_SetString(PyExc_TypeError, "'NoneType' object is not subscriptable"); __PYX_ERR(0, 36, __pyx_L1_error) } if (unlikely(PyObject_DelItem(__pyx_v_self->_items, __pyx_v_index) < 0)) __PYX_ERR(0, 36, __pyx_L1_error) /* "aiohttp/_frozenlist.pyx":34 * self._items[index] = value * * def __delitem__(self, index): # <<<<<<<<<<<<<< * self._check_frozen() * del self._items[index] */ /* function exit code */ __pyx_r = 0; goto __pyx_L0; __pyx_L1_error:; __Pyx_XDECREF(__pyx_t_1); __Pyx_AddTraceback("aiohttp._frozenlist.FrozenList.__delitem__", __pyx_clineno, __pyx_lineno, __pyx_filename); __pyx_r = -1; __pyx_L0:; __Pyx_RefNannyFinishContext(); return __pyx_r; } /* "aiohttp/_frozenlist.pyx":38 * del self._items[index] * * def __len__(self): # <<<<<<<<<<<<<< * return 
self._fast_len() * */ /* Python wrapper */ static Py_ssize_t __pyx_pw_7aiohttp_11_frozenlist_10FrozenList_11__len__(PyObject *__pyx_v_self); /*proto*/ static Py_ssize_t __pyx_pw_7aiohttp_11_frozenlist_10FrozenList_11__len__(PyObject *__pyx_v_self) { Py_ssize_t __pyx_r; __Pyx_RefNannyDeclarations __Pyx_RefNannySetupContext("__len__ (wrapper)", 0); __pyx_r = __pyx_pf_7aiohttp_11_frozenlist_10FrozenList_10__len__(((struct __pyx_obj_7aiohttp_11_frozenlist_FrozenList *)__pyx_v_self)); /* function exit code */ __Pyx_RefNannyFinishContext(); return __pyx_r; } static Py_ssize_t __pyx_pf_7aiohttp_11_frozenlist_10FrozenList_10__len__(struct __pyx_obj_7aiohttp_11_frozenlist_FrozenList *__pyx_v_self) { Py_ssize_t __pyx_r; __Pyx_RefNannyDeclarations PyObject *__pyx_t_1 = NULL; Py_ssize_t __pyx_t_2; __Pyx_RefNannySetupContext("__len__", 0); /* "aiohttp/_frozenlist.pyx":39 * * def __len__(self): * return self._fast_len() # <<<<<<<<<<<<<< * * def __iter__(self): */ __pyx_t_1 = __pyx_f_7aiohttp_11_frozenlist_10FrozenList__fast_len(__pyx_v_self); if (unlikely(!__pyx_t_1)) __PYX_ERR(0, 39, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_1); __pyx_t_2 = __Pyx_PyIndex_AsSsize_t(__pyx_t_1); if (unlikely((__pyx_t_2 == (Py_ssize_t)-1) && PyErr_Occurred())) __PYX_ERR(0, 39, __pyx_L1_error) __Pyx_DECREF(__pyx_t_1); __pyx_t_1 = 0; __pyx_r = __pyx_t_2; goto __pyx_L0; /* "aiohttp/_frozenlist.pyx":38 * del self._items[index] * * def __len__(self): # <<<<<<<<<<<<<< * return self._fast_len() * */ /* function exit code */ __pyx_L1_error:; __Pyx_XDECREF(__pyx_t_1); __Pyx_AddTraceback("aiohttp._frozenlist.FrozenList.__len__", __pyx_clineno, __pyx_lineno, __pyx_filename); __pyx_r = -1; __pyx_L0:; __Pyx_RefNannyFinishContext(); return __pyx_r; } /* "aiohttp/_frozenlist.pyx":41 * return self._fast_len() * * def __iter__(self): # <<<<<<<<<<<<<< * return self._items.__iter__() * */ /* Python wrapper */ static PyObject *__pyx_pw_7aiohttp_11_frozenlist_10FrozenList_13__iter__(PyObject *__pyx_v_self); /*proto*/ static PyObject *__pyx_pw_7aiohttp_11_frozenlist_10FrozenList_13__iter__(PyObject *__pyx_v_self) { PyObject *__pyx_r = 0; __Pyx_RefNannyDeclarations __Pyx_RefNannySetupContext("__iter__ (wrapper)", 0); __pyx_r = __pyx_pf_7aiohttp_11_frozenlist_10FrozenList_12__iter__(((struct __pyx_obj_7aiohttp_11_frozenlist_FrozenList *)__pyx_v_self)); /* function exit code */ __Pyx_RefNannyFinishContext(); return __pyx_r; } static PyObject *__pyx_pf_7aiohttp_11_frozenlist_10FrozenList_12__iter__(struct __pyx_obj_7aiohttp_11_frozenlist_FrozenList *__pyx_v_self) { PyObject *__pyx_r = NULL; __Pyx_RefNannyDeclarations PyObject *__pyx_t_1 = NULL; PyObject *__pyx_t_2 = NULL; PyObject *__pyx_t_3 = NULL; __Pyx_RefNannySetupContext("__iter__", 0); /* "aiohttp/_frozenlist.pyx":42 * * def __iter__(self): * return self._items.__iter__() # <<<<<<<<<<<<<< * * def __reversed__(self): */ __Pyx_XDECREF(__pyx_r); __pyx_t_2 = __Pyx_PyObject_GetAttrStr(__pyx_v_self->_items, __pyx_n_s_iter); if (unlikely(!__pyx_t_2)) __PYX_ERR(0, 42, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_2); __pyx_t_3 = NULL; if (CYTHON_UNPACK_METHODS && likely(PyMethod_Check(__pyx_t_2))) { __pyx_t_3 = PyMethod_GET_SELF(__pyx_t_2); if (likely(__pyx_t_3)) { PyObject* function = PyMethod_GET_FUNCTION(__pyx_t_2); __Pyx_INCREF(__pyx_t_3); __Pyx_INCREF(function); __Pyx_DECREF_SET(__pyx_t_2, function); } } __pyx_t_1 = (__pyx_t_3) ? 
__Pyx_PyObject_CallOneArg(__pyx_t_2, __pyx_t_3) : __Pyx_PyObject_CallNoArg(__pyx_t_2); __Pyx_XDECREF(__pyx_t_3); __pyx_t_3 = 0; if (unlikely(!__pyx_t_1)) __PYX_ERR(0, 42, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_1); __Pyx_DECREF(__pyx_t_2); __pyx_t_2 = 0; __pyx_r = __pyx_t_1; __pyx_t_1 = 0; goto __pyx_L0; /* "aiohttp/_frozenlist.pyx":41 * return self._fast_len() * * def __iter__(self): # <<<<<<<<<<<<<< * return self._items.__iter__() * */ /* function exit code */ __pyx_L1_error:; __Pyx_XDECREF(__pyx_t_1); __Pyx_XDECREF(__pyx_t_2); __Pyx_XDECREF(__pyx_t_3); __Pyx_AddTraceback("aiohttp._frozenlist.FrozenList.__iter__", __pyx_clineno, __pyx_lineno, __pyx_filename); __pyx_r = NULL; __pyx_L0:; __Pyx_XGIVEREF(__pyx_r); __Pyx_RefNannyFinishContext(); return __pyx_r; } /* "aiohttp/_frozenlist.pyx":44 * return self._items.__iter__() * * def __reversed__(self): # <<<<<<<<<<<<<< * return self._items.__reversed__() * */ /* Python wrapper */ static PyObject *__pyx_pw_7aiohttp_11_frozenlist_10FrozenList_15__reversed__(PyObject *__pyx_v_self, CYTHON_UNUSED PyObject *unused); /*proto*/ static PyObject *__pyx_pw_7aiohttp_11_frozenlist_10FrozenList_15__reversed__(PyObject *__pyx_v_self, CYTHON_UNUSED PyObject *unused) { PyObject *__pyx_r = 0; __Pyx_RefNannyDeclarations __Pyx_RefNannySetupContext("__reversed__ (wrapper)", 0); __pyx_r = __pyx_pf_7aiohttp_11_frozenlist_10FrozenList_14__reversed__(((struct __pyx_obj_7aiohttp_11_frozenlist_FrozenList *)__pyx_v_self)); /* function exit code */ __Pyx_RefNannyFinishContext(); return __pyx_r; } static PyObject *__pyx_pf_7aiohttp_11_frozenlist_10FrozenList_14__reversed__(struct __pyx_obj_7aiohttp_11_frozenlist_FrozenList *__pyx_v_self) { PyObject *__pyx_r = NULL; __Pyx_RefNannyDeclarations PyObject *__pyx_t_1 = NULL; PyObject *__pyx_t_2 = NULL; PyObject *__pyx_t_3 = NULL; __Pyx_RefNannySetupContext("__reversed__", 0); /* "aiohttp/_frozenlist.pyx":45 * * def __reversed__(self): * return self._items.__reversed__() # <<<<<<<<<<<<<< * * def __richcmp__(self, other, op): */ __Pyx_XDECREF(__pyx_r); __pyx_t_2 = __Pyx_PyObject_GetAttrStr(__pyx_v_self->_items, __pyx_n_s_reversed); if (unlikely(!__pyx_t_2)) __PYX_ERR(0, 45, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_2); __pyx_t_3 = NULL; if (CYTHON_UNPACK_METHODS && likely(PyMethod_Check(__pyx_t_2))) { __pyx_t_3 = PyMethod_GET_SELF(__pyx_t_2); if (likely(__pyx_t_3)) { PyObject* function = PyMethod_GET_FUNCTION(__pyx_t_2); __Pyx_INCREF(__pyx_t_3); __Pyx_INCREF(function); __Pyx_DECREF_SET(__pyx_t_2, function); } } __pyx_t_1 = (__pyx_t_3) ? 
__Pyx_PyObject_CallOneArg(__pyx_t_2, __pyx_t_3) : __Pyx_PyObject_CallNoArg(__pyx_t_2); __Pyx_XDECREF(__pyx_t_3); __pyx_t_3 = 0; if (unlikely(!__pyx_t_1)) __PYX_ERR(0, 45, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_1); __Pyx_DECREF(__pyx_t_2); __pyx_t_2 = 0; __pyx_r = __pyx_t_1; __pyx_t_1 = 0; goto __pyx_L0; /* "aiohttp/_frozenlist.pyx":44 * return self._items.__iter__() * * def __reversed__(self): # <<<<<<<<<<<<<< * return self._items.__reversed__() * */ /* function exit code */ __pyx_L1_error:; __Pyx_XDECREF(__pyx_t_1); __Pyx_XDECREF(__pyx_t_2); __Pyx_XDECREF(__pyx_t_3); __Pyx_AddTraceback("aiohttp._frozenlist.FrozenList.__reversed__", __pyx_clineno, __pyx_lineno, __pyx_filename); __pyx_r = NULL; __pyx_L0:; __Pyx_XGIVEREF(__pyx_r); __Pyx_RefNannyFinishContext(); return __pyx_r; } /* "aiohttp/_frozenlist.pyx":47 * return self._items.__reversed__() * * def __richcmp__(self, other, op): # <<<<<<<<<<<<<< * if op == 0: # < * return list(self) < other */ /* Python wrapper */ static PyObject *__pyx_pw_7aiohttp_11_frozenlist_10FrozenList_17__richcmp__(PyObject *__pyx_v_self, PyObject *__pyx_v_other, int __pyx_arg_op); /*proto*/ static PyObject *__pyx_pw_7aiohttp_11_frozenlist_10FrozenList_17__richcmp__(PyObject *__pyx_v_self, PyObject *__pyx_v_other, int __pyx_arg_op) { PyObject *__pyx_v_op = 0; PyObject *__pyx_r = 0; __Pyx_RefNannyDeclarations __Pyx_RefNannySetupContext("__richcmp__ (wrapper)", 0); __pyx_v_op = __Pyx_PyInt_From_int(__pyx_arg_op); if (unlikely(!__pyx_v_op)) __PYX_ERR(0, 47, __pyx_L3_error) __Pyx_GOTREF(__pyx_v_op); goto __pyx_L4_argument_unpacking_done; __pyx_L3_error:; __Pyx_AddTraceback("aiohttp._frozenlist.FrozenList.__richcmp__", __pyx_clineno, __pyx_lineno, __pyx_filename); __Pyx_RefNannyFinishContext(); return NULL; __pyx_L4_argument_unpacking_done:; __pyx_r = __pyx_pf_7aiohttp_11_frozenlist_10FrozenList_16__richcmp__(((struct __pyx_obj_7aiohttp_11_frozenlist_FrozenList *)__pyx_v_self), ((PyObject *)__pyx_v_other), ((PyObject *)__pyx_v_op)); /* function exit code */ __Pyx_XDECREF(__pyx_v_op); __Pyx_RefNannyFinishContext(); return __pyx_r; } static PyObject *__pyx_pf_7aiohttp_11_frozenlist_10FrozenList_16__richcmp__(struct __pyx_obj_7aiohttp_11_frozenlist_FrozenList *__pyx_v_self, PyObject *__pyx_v_other, PyObject *__pyx_v_op) { PyObject *__pyx_r = NULL; __Pyx_RefNannyDeclarations PyObject *__pyx_t_1 = NULL; int __pyx_t_2; PyObject *__pyx_t_3 = NULL; __Pyx_RefNannySetupContext("__richcmp__", 0); /* "aiohttp/_frozenlist.pyx":48 * * def __richcmp__(self, other, op): * if op == 0: # < # <<<<<<<<<<<<<< * return list(self) < other * if op == 1: # <= */ __pyx_t_1 = __Pyx_PyInt_EqObjC(__pyx_v_op, __pyx_int_0, 0, 0); if (unlikely(!__pyx_t_1)) __PYX_ERR(0, 48, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_1); __pyx_t_2 = __Pyx_PyObject_IsTrue(__pyx_t_1); if (unlikely(__pyx_t_2 < 0)) __PYX_ERR(0, 48, __pyx_L1_error) __Pyx_DECREF(__pyx_t_1); __pyx_t_1 = 0; if (__pyx_t_2) { /* "aiohttp/_frozenlist.pyx":49 * def __richcmp__(self, other, op): * if op == 0: # < * return list(self) < other # <<<<<<<<<<<<<< * if op == 1: # <= * return list(self) <= other */ __Pyx_XDECREF(__pyx_r); __pyx_t_1 = PySequence_List(((PyObject *)__pyx_v_self)); if (unlikely(!__pyx_t_1)) __PYX_ERR(0, 49, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_1); __pyx_t_3 = PyObject_RichCompare(__pyx_t_1, __pyx_v_other, Py_LT); __Pyx_XGOTREF(__pyx_t_3); if (unlikely(!__pyx_t_3)) __PYX_ERR(0, 49, __pyx_L1_error) __Pyx_DECREF(__pyx_t_1); __pyx_t_1 = 0; __pyx_r = __pyx_t_3; __pyx_t_3 = 0; goto __pyx_L0; /* "aiohttp/_frozenlist.pyx":48 * * def 
__richcmp__(self, other, op): * if op == 0: # < # <<<<<<<<<<<<<< * return list(self) < other * if op == 1: # <= */ } /* "aiohttp/_frozenlist.pyx":50 * if op == 0: # < * return list(self) < other * if op == 1: # <= # <<<<<<<<<<<<<< * return list(self) <= other * if op == 2: # == */ __pyx_t_3 = __Pyx_PyInt_EqObjC(__pyx_v_op, __pyx_int_1, 1, 0); if (unlikely(!__pyx_t_3)) __PYX_ERR(0, 50, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_3); __pyx_t_2 = __Pyx_PyObject_IsTrue(__pyx_t_3); if (unlikely(__pyx_t_2 < 0)) __PYX_ERR(0, 50, __pyx_L1_error) __Pyx_DECREF(__pyx_t_3); __pyx_t_3 = 0; if (__pyx_t_2) { /* "aiohttp/_frozenlist.pyx":51 * return list(self) < other * if op == 1: # <= * return list(self) <= other # <<<<<<<<<<<<<< * if op == 2: # == * return list(self) == other */ __Pyx_XDECREF(__pyx_r); __pyx_t_3 = PySequence_List(((PyObject *)__pyx_v_self)); if (unlikely(!__pyx_t_3)) __PYX_ERR(0, 51, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_3); __pyx_t_1 = PyObject_RichCompare(__pyx_t_3, __pyx_v_other, Py_LE); __Pyx_XGOTREF(__pyx_t_1); if (unlikely(!__pyx_t_1)) __PYX_ERR(0, 51, __pyx_L1_error) __Pyx_DECREF(__pyx_t_3); __pyx_t_3 = 0; __pyx_r = __pyx_t_1; __pyx_t_1 = 0; goto __pyx_L0; /* "aiohttp/_frozenlist.pyx":50 * if op == 0: # < * return list(self) < other * if op == 1: # <= # <<<<<<<<<<<<<< * return list(self) <= other * if op == 2: # == */ } /* "aiohttp/_frozenlist.pyx":52 * if op == 1: # <= * return list(self) <= other * if op == 2: # == # <<<<<<<<<<<<<< * return list(self) == other * if op == 3: # != */ __pyx_t_1 = __Pyx_PyInt_EqObjC(__pyx_v_op, __pyx_int_2, 2, 0); if (unlikely(!__pyx_t_1)) __PYX_ERR(0, 52, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_1); __pyx_t_2 = __Pyx_PyObject_IsTrue(__pyx_t_1); if (unlikely(__pyx_t_2 < 0)) __PYX_ERR(0, 52, __pyx_L1_error) __Pyx_DECREF(__pyx_t_1); __pyx_t_1 = 0; if (__pyx_t_2) { /* "aiohttp/_frozenlist.pyx":53 * return list(self) <= other * if op == 2: # == * return list(self) == other # <<<<<<<<<<<<<< * if op == 3: # != * return list(self) != other */ __Pyx_XDECREF(__pyx_r); __pyx_t_1 = PySequence_List(((PyObject *)__pyx_v_self)); if (unlikely(!__pyx_t_1)) __PYX_ERR(0, 53, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_1); __pyx_t_3 = PyObject_RichCompare(__pyx_t_1, __pyx_v_other, Py_EQ); __Pyx_XGOTREF(__pyx_t_3); if (unlikely(!__pyx_t_3)) __PYX_ERR(0, 53, __pyx_L1_error) __Pyx_DECREF(__pyx_t_1); __pyx_t_1 = 0; __pyx_r = __pyx_t_3; __pyx_t_3 = 0; goto __pyx_L0; /* "aiohttp/_frozenlist.pyx":52 * if op == 1: # <= * return list(self) <= other * if op == 2: # == # <<<<<<<<<<<<<< * return list(self) == other * if op == 3: # != */ } /* "aiohttp/_frozenlist.pyx":54 * if op == 2: # == * return list(self) == other * if op == 3: # != # <<<<<<<<<<<<<< * return list(self) != other * if op == 4: # > */ __pyx_t_3 = __Pyx_PyInt_EqObjC(__pyx_v_op, __pyx_int_3, 3, 0); if (unlikely(!__pyx_t_3)) __PYX_ERR(0, 54, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_3); __pyx_t_2 = __Pyx_PyObject_IsTrue(__pyx_t_3); if (unlikely(__pyx_t_2 < 0)) __PYX_ERR(0, 54, __pyx_L1_error) __Pyx_DECREF(__pyx_t_3); __pyx_t_3 = 0; if (__pyx_t_2) { /* "aiohttp/_frozenlist.pyx":55 * return list(self) == other * if op == 3: # != * return list(self) != other # <<<<<<<<<<<<<< * if op == 4: # > * return list(self) > other */ __Pyx_XDECREF(__pyx_r); __pyx_t_3 = PySequence_List(((PyObject *)__pyx_v_self)); if (unlikely(!__pyx_t_3)) __PYX_ERR(0, 55, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_3); __pyx_t_1 = PyObject_RichCompare(__pyx_t_3, __pyx_v_other, Py_NE); __Pyx_XGOTREF(__pyx_t_1); if (unlikely(!__pyx_t_1)) __PYX_ERR(0, 55, __pyx_L1_error) 
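/* Illustrative sketch, not generated by Cython: the ``op`` argument tested in
 * the branches around this point carries CPython's rich-comparison opcodes,
 * i.e. Py_LT=0, Py_LE=1, Py_EQ=2, Py_NE=3, Py_GT=4, Py_GE=5.  The embedded
 * _frozenlist.pyx comments show that each branch simply compares list(self)
 * against ``other`` with the matching operator.  A hedged Python equivalent of
 * that dispatch (``richcmp`` is a hypothetical helper name, not part of the
 * module):
 *
 *     import operator
 *
 *     def richcmp(items, other, op):
 *         # op follows CPython's codes: 0 '<', 1 '<=', 2 '==', 3 '!=', 4 '>', 5 '>='
 *         table = {0: operator.lt, 1: operator.le, 2: operator.eq,
 *                  3: operator.ne, 4: operator.gt, 5: operator.ge}
 *         return table[op](list(items), other)
 */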
__Pyx_DECREF(__pyx_t_3); __pyx_t_3 = 0; __pyx_r = __pyx_t_1; __pyx_t_1 = 0; goto __pyx_L0; /* "aiohttp/_frozenlist.pyx":54 * if op == 2: # == * return list(self) == other * if op == 3: # != # <<<<<<<<<<<<<< * return list(self) != other * if op == 4: # > */ } /* "aiohttp/_frozenlist.pyx":56 * if op == 3: # != * return list(self) != other * if op == 4: # > # <<<<<<<<<<<<<< * return list(self) > other * if op == 5: # => */ __pyx_t_1 = __Pyx_PyInt_EqObjC(__pyx_v_op, __pyx_int_4, 4, 0); if (unlikely(!__pyx_t_1)) __PYX_ERR(0, 56, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_1); __pyx_t_2 = __Pyx_PyObject_IsTrue(__pyx_t_1); if (unlikely(__pyx_t_2 < 0)) __PYX_ERR(0, 56, __pyx_L1_error) __Pyx_DECREF(__pyx_t_1); __pyx_t_1 = 0; if (__pyx_t_2) { /* "aiohttp/_frozenlist.pyx":57 * return list(self) != other * if op == 4: # > * return list(self) > other # <<<<<<<<<<<<<< * if op == 5: # => * return list(self) >= other */ __Pyx_XDECREF(__pyx_r); __pyx_t_1 = PySequence_List(((PyObject *)__pyx_v_self)); if (unlikely(!__pyx_t_1)) __PYX_ERR(0, 57, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_1); __pyx_t_3 = PyObject_RichCompare(__pyx_t_1, __pyx_v_other, Py_GT); __Pyx_XGOTREF(__pyx_t_3); if (unlikely(!__pyx_t_3)) __PYX_ERR(0, 57, __pyx_L1_error) __Pyx_DECREF(__pyx_t_1); __pyx_t_1 = 0; __pyx_r = __pyx_t_3; __pyx_t_3 = 0; goto __pyx_L0; /* "aiohttp/_frozenlist.pyx":56 * if op == 3: # != * return list(self) != other * if op == 4: # > # <<<<<<<<<<<<<< * return list(self) > other * if op == 5: # => */ } /* "aiohttp/_frozenlist.pyx":58 * if op == 4: # > * return list(self) > other * if op == 5: # => # <<<<<<<<<<<<<< * return list(self) >= other * */ __pyx_t_3 = __Pyx_PyInt_EqObjC(__pyx_v_op, __pyx_int_5, 5, 0); if (unlikely(!__pyx_t_3)) __PYX_ERR(0, 58, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_3); __pyx_t_2 = __Pyx_PyObject_IsTrue(__pyx_t_3); if (unlikely(__pyx_t_2 < 0)) __PYX_ERR(0, 58, __pyx_L1_error) __Pyx_DECREF(__pyx_t_3); __pyx_t_3 = 0; if (__pyx_t_2) { /* "aiohttp/_frozenlist.pyx":59 * return list(self) > other * if op == 5: # => * return list(self) >= other # <<<<<<<<<<<<<< * * def insert(self, pos, item): */ __Pyx_XDECREF(__pyx_r); __pyx_t_3 = PySequence_List(((PyObject *)__pyx_v_self)); if (unlikely(!__pyx_t_3)) __PYX_ERR(0, 59, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_3); __pyx_t_1 = PyObject_RichCompare(__pyx_t_3, __pyx_v_other, Py_GE); __Pyx_XGOTREF(__pyx_t_1); if (unlikely(!__pyx_t_1)) __PYX_ERR(0, 59, __pyx_L1_error) __Pyx_DECREF(__pyx_t_3); __pyx_t_3 = 0; __pyx_r = __pyx_t_1; __pyx_t_1 = 0; goto __pyx_L0; /* "aiohttp/_frozenlist.pyx":58 * if op == 4: # > * return list(self) > other * if op == 5: # => # <<<<<<<<<<<<<< * return list(self) >= other * */ } /* "aiohttp/_frozenlist.pyx":47 * return self._items.__reversed__() * * def __richcmp__(self, other, op): # <<<<<<<<<<<<<< * if op == 0: # < * return list(self) < other */ /* function exit code */ __pyx_r = Py_None; __Pyx_INCREF(Py_None); goto __pyx_L0; __pyx_L1_error:; __Pyx_XDECREF(__pyx_t_1); __Pyx_XDECREF(__pyx_t_3); __Pyx_AddTraceback("aiohttp._frozenlist.FrozenList.__richcmp__", __pyx_clineno, __pyx_lineno, __pyx_filename); __pyx_r = NULL; __pyx_L0:; __Pyx_XGIVEREF(__pyx_r); __Pyx_RefNannyFinishContext(); return __pyx_r; } /* "aiohttp/_frozenlist.pyx":61 * return list(self) >= other * * def insert(self, pos, item): # <<<<<<<<<<<<<< * self._check_frozen() * self._items.insert(pos, item) */ /* Python wrapper */ static PyObject *__pyx_pw_7aiohttp_11_frozenlist_10FrozenList_19insert(PyObject *__pyx_v_self, PyObject *__pyx_args, PyObject *__pyx_kwds); /*proto*/ static 
PyObject *__pyx_pw_7aiohttp_11_frozenlist_10FrozenList_19insert(PyObject *__pyx_v_self, PyObject *__pyx_args, PyObject *__pyx_kwds) { PyObject *__pyx_v_pos = 0; PyObject *__pyx_v_item = 0; PyObject *__pyx_r = 0; __Pyx_RefNannyDeclarations __Pyx_RefNannySetupContext("insert (wrapper)", 0); { static PyObject **__pyx_pyargnames[] = {&__pyx_n_s_pos,&__pyx_n_s_item,0}; PyObject* values[2] = {0,0}; if (unlikely(__pyx_kwds)) { Py_ssize_t kw_args; const Py_ssize_t pos_args = PyTuple_GET_SIZE(__pyx_args); switch (pos_args) { case 2: values[1] = PyTuple_GET_ITEM(__pyx_args, 1); CYTHON_FALLTHROUGH; case 1: values[0] = PyTuple_GET_ITEM(__pyx_args, 0); CYTHON_FALLTHROUGH; case 0: break; default: goto __pyx_L5_argtuple_error; } kw_args = PyDict_Size(__pyx_kwds); switch (pos_args) { case 0: if (likely((values[0] = __Pyx_PyDict_GetItemStr(__pyx_kwds, __pyx_n_s_pos)) != 0)) kw_args--; else goto __pyx_L5_argtuple_error; CYTHON_FALLTHROUGH; case 1: if (likely((values[1] = __Pyx_PyDict_GetItemStr(__pyx_kwds, __pyx_n_s_item)) != 0)) kw_args--; else { __Pyx_RaiseArgtupleInvalid("insert", 1, 2, 2, 1); __PYX_ERR(0, 61, __pyx_L3_error) } } if (unlikely(kw_args > 0)) { if (unlikely(__Pyx_ParseOptionalKeywords(__pyx_kwds, __pyx_pyargnames, 0, values, pos_args, "insert") < 0)) __PYX_ERR(0, 61, __pyx_L3_error) } } else if (PyTuple_GET_SIZE(__pyx_args) != 2) { goto __pyx_L5_argtuple_error; } else { values[0] = PyTuple_GET_ITEM(__pyx_args, 0); values[1] = PyTuple_GET_ITEM(__pyx_args, 1); } __pyx_v_pos = values[0]; __pyx_v_item = values[1]; } goto __pyx_L4_argument_unpacking_done; __pyx_L5_argtuple_error:; __Pyx_RaiseArgtupleInvalid("insert", 1, 2, 2, PyTuple_GET_SIZE(__pyx_args)); __PYX_ERR(0, 61, __pyx_L3_error) __pyx_L3_error:; __Pyx_AddTraceback("aiohttp._frozenlist.FrozenList.insert", __pyx_clineno, __pyx_lineno, __pyx_filename); __Pyx_RefNannyFinishContext(); return NULL; __pyx_L4_argument_unpacking_done:; __pyx_r = __pyx_pf_7aiohttp_11_frozenlist_10FrozenList_18insert(((struct __pyx_obj_7aiohttp_11_frozenlist_FrozenList *)__pyx_v_self), __pyx_v_pos, __pyx_v_item); /* function exit code */ __Pyx_RefNannyFinishContext(); return __pyx_r; } static PyObject *__pyx_pf_7aiohttp_11_frozenlist_10FrozenList_18insert(struct __pyx_obj_7aiohttp_11_frozenlist_FrozenList *__pyx_v_self, PyObject *__pyx_v_pos, PyObject *__pyx_v_item) { PyObject *__pyx_r = NULL; __Pyx_RefNannyDeclarations PyObject *__pyx_t_1 = NULL; Py_ssize_t __pyx_t_2; int __pyx_t_3; __Pyx_RefNannySetupContext("insert", 0); /* "aiohttp/_frozenlist.pyx":62 * * def insert(self, pos, item): * self._check_frozen() # <<<<<<<<<<<<<< * self._items.insert(pos, item) * */ __pyx_t_1 = ((struct __pyx_vtabstruct_7aiohttp_11_frozenlist_FrozenList *)__pyx_v_self->__pyx_vtab)->_check_frozen(__pyx_v_self); if (unlikely(!__pyx_t_1)) __PYX_ERR(0, 62, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_1); __Pyx_DECREF(__pyx_t_1); __pyx_t_1 = 0; /* "aiohttp/_frozenlist.pyx":63 * def insert(self, pos, item): * self._check_frozen() * self._items.insert(pos, item) # <<<<<<<<<<<<<< * * def __contains__(self, item): */ if (unlikely(__pyx_v_self->_items == Py_None)) { PyErr_Format(PyExc_AttributeError, "'NoneType' object has no attribute '%.30s'", "insert"); __PYX_ERR(0, 63, __pyx_L1_error) } __pyx_t_2 = __Pyx_PyIndex_AsSsize_t(__pyx_v_pos); if (unlikely((__pyx_t_2 == (Py_ssize_t)-1) && PyErr_Occurred())) __PYX_ERR(0, 63, __pyx_L1_error) __pyx_t_3 = PyList_Insert(__pyx_v_self->_items, __pyx_t_2, __pyx_v_item); if (unlikely(__pyx_t_3 == ((int)-1))) __PYX_ERR(0, 63, __pyx_L1_error) /* 
"aiohttp/_frozenlist.pyx":61 * return list(self) >= other * * def insert(self, pos, item): # <<<<<<<<<<<<<< * self._check_frozen() * self._items.insert(pos, item) */ /* function exit code */ __pyx_r = Py_None; __Pyx_INCREF(Py_None); goto __pyx_L0; __pyx_L1_error:; __Pyx_XDECREF(__pyx_t_1); __Pyx_AddTraceback("aiohttp._frozenlist.FrozenList.insert", __pyx_clineno, __pyx_lineno, __pyx_filename); __pyx_r = NULL; __pyx_L0:; __Pyx_XGIVEREF(__pyx_r); __Pyx_RefNannyFinishContext(); return __pyx_r; } /* "aiohttp/_frozenlist.pyx":65 * self._items.insert(pos, item) * * def __contains__(self, item): # <<<<<<<<<<<<<< * return item in self._items * */ /* Python wrapper */ static int __pyx_pw_7aiohttp_11_frozenlist_10FrozenList_21__contains__(PyObject *__pyx_v_self, PyObject *__pyx_v_item); /*proto*/ static int __pyx_pw_7aiohttp_11_frozenlist_10FrozenList_21__contains__(PyObject *__pyx_v_self, PyObject *__pyx_v_item) { int __pyx_r; __Pyx_RefNannyDeclarations __Pyx_RefNannySetupContext("__contains__ (wrapper)", 0); __pyx_r = __pyx_pf_7aiohttp_11_frozenlist_10FrozenList_20__contains__(((struct __pyx_obj_7aiohttp_11_frozenlist_FrozenList *)__pyx_v_self), ((PyObject *)__pyx_v_item)); /* function exit code */ __Pyx_RefNannyFinishContext(); return __pyx_r; } static int __pyx_pf_7aiohttp_11_frozenlist_10FrozenList_20__contains__(struct __pyx_obj_7aiohttp_11_frozenlist_FrozenList *__pyx_v_self, PyObject *__pyx_v_item) { int __pyx_r; __Pyx_RefNannyDeclarations int __pyx_t_1; __Pyx_RefNannySetupContext("__contains__", 0); /* "aiohttp/_frozenlist.pyx":66 * * def __contains__(self, item): * return item in self._items # <<<<<<<<<<<<<< * * def __iadd__(self, items): */ __pyx_t_1 = (__Pyx_PySequence_ContainsTF(__pyx_v_item, __pyx_v_self->_items, Py_EQ)); if (unlikely(__pyx_t_1 < 0)) __PYX_ERR(0, 66, __pyx_L1_error) __pyx_r = __pyx_t_1; goto __pyx_L0; /* "aiohttp/_frozenlist.pyx":65 * self._items.insert(pos, item) * * def __contains__(self, item): # <<<<<<<<<<<<<< * return item in self._items * */ /* function exit code */ __pyx_L1_error:; __Pyx_AddTraceback("aiohttp._frozenlist.FrozenList.__contains__", __pyx_clineno, __pyx_lineno, __pyx_filename); __pyx_r = -1; __pyx_L0:; __Pyx_RefNannyFinishContext(); return __pyx_r; } /* "aiohttp/_frozenlist.pyx":68 * return item in self._items * * def __iadd__(self, items): # <<<<<<<<<<<<<< * self._check_frozen() * self._items += list(items) */ /* Python wrapper */ static PyObject *__pyx_pw_7aiohttp_11_frozenlist_10FrozenList_23__iadd__(PyObject *__pyx_v_self, PyObject *__pyx_v_items); /*proto*/ static PyObject *__pyx_pw_7aiohttp_11_frozenlist_10FrozenList_23__iadd__(PyObject *__pyx_v_self, PyObject *__pyx_v_items) { PyObject *__pyx_r = 0; __Pyx_RefNannyDeclarations __Pyx_RefNannySetupContext("__iadd__ (wrapper)", 0); __pyx_r = __pyx_pf_7aiohttp_11_frozenlist_10FrozenList_22__iadd__(((struct __pyx_obj_7aiohttp_11_frozenlist_FrozenList *)__pyx_v_self), ((PyObject *)__pyx_v_items)); /* function exit code */ __Pyx_RefNannyFinishContext(); return __pyx_r; } static PyObject *__pyx_pf_7aiohttp_11_frozenlist_10FrozenList_22__iadd__(struct __pyx_obj_7aiohttp_11_frozenlist_FrozenList *__pyx_v_self, PyObject *__pyx_v_items) { PyObject *__pyx_r = NULL; __Pyx_RefNannyDeclarations PyObject *__pyx_t_1 = NULL; PyObject *__pyx_t_2 = NULL; __Pyx_RefNannySetupContext("__iadd__", 0); /* "aiohttp/_frozenlist.pyx":69 * * def __iadd__(self, items): * self._check_frozen() # <<<<<<<<<<<<<< * self._items += list(items) * return self */ __pyx_t_1 = ((struct __pyx_vtabstruct_7aiohttp_11_frozenlist_FrozenList 
*)__pyx_v_self->__pyx_vtab)->_check_frozen(__pyx_v_self); if (unlikely(!__pyx_t_1)) __PYX_ERR(0, 69, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_1); __Pyx_DECREF(__pyx_t_1); __pyx_t_1 = 0; /* "aiohttp/_frozenlist.pyx":70 * def __iadd__(self, items): * self._check_frozen() * self._items += list(items) # <<<<<<<<<<<<<< * return self * */ __pyx_t_1 = PySequence_List(__pyx_v_items); if (unlikely(!__pyx_t_1)) __PYX_ERR(0, 70, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_1); __pyx_t_2 = PyNumber_InPlaceAdd(__pyx_v_self->_items, __pyx_t_1); if (unlikely(!__pyx_t_2)) __PYX_ERR(0, 70, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_2); __Pyx_DECREF(__pyx_t_1); __pyx_t_1 = 0; __Pyx_GIVEREF(__pyx_t_2); __Pyx_GOTREF(__pyx_v_self->_items); __Pyx_DECREF(__pyx_v_self->_items); __pyx_v_self->_items = ((PyObject*)__pyx_t_2); __pyx_t_2 = 0; /* "aiohttp/_frozenlist.pyx":71 * self._check_frozen() * self._items += list(items) * return self # <<<<<<<<<<<<<< * * def index(self, item): */ __Pyx_XDECREF(__pyx_r); __Pyx_INCREF(((PyObject *)__pyx_v_self)); __pyx_r = ((PyObject *)__pyx_v_self); goto __pyx_L0; /* "aiohttp/_frozenlist.pyx":68 * return item in self._items * * def __iadd__(self, items): # <<<<<<<<<<<<<< * self._check_frozen() * self._items += list(items) */ /* function exit code */ __pyx_L1_error:; __Pyx_XDECREF(__pyx_t_1); __Pyx_XDECREF(__pyx_t_2); __Pyx_AddTraceback("aiohttp._frozenlist.FrozenList.__iadd__", __pyx_clineno, __pyx_lineno, __pyx_filename); __pyx_r = NULL; __pyx_L0:; __Pyx_XGIVEREF(__pyx_r); __Pyx_RefNannyFinishContext(); return __pyx_r; } /* "aiohttp/_frozenlist.pyx":73 * return self * * def index(self, item): # <<<<<<<<<<<<<< * return self._items.index(item) * */ /* Python wrapper */ static PyObject *__pyx_pw_7aiohttp_11_frozenlist_10FrozenList_25index(PyObject *__pyx_v_self, PyObject *__pyx_v_item); /*proto*/ static PyObject *__pyx_pw_7aiohttp_11_frozenlist_10FrozenList_25index(PyObject *__pyx_v_self, PyObject *__pyx_v_item) { PyObject *__pyx_r = 0; __Pyx_RefNannyDeclarations __Pyx_RefNannySetupContext("index (wrapper)", 0); __pyx_r = __pyx_pf_7aiohttp_11_frozenlist_10FrozenList_24index(((struct __pyx_obj_7aiohttp_11_frozenlist_FrozenList *)__pyx_v_self), ((PyObject *)__pyx_v_item)); /* function exit code */ __Pyx_RefNannyFinishContext(); return __pyx_r; } static PyObject *__pyx_pf_7aiohttp_11_frozenlist_10FrozenList_24index(struct __pyx_obj_7aiohttp_11_frozenlist_FrozenList *__pyx_v_self, PyObject *__pyx_v_item) { PyObject *__pyx_r = NULL; __Pyx_RefNannyDeclarations PyObject *__pyx_t_1 = NULL; PyObject *__pyx_t_2 = NULL; PyObject *__pyx_t_3 = NULL; __Pyx_RefNannySetupContext("index", 0); /* "aiohttp/_frozenlist.pyx":74 * * def index(self, item): * return self._items.index(item) # <<<<<<<<<<<<<< * * def remove(self, item): */ __Pyx_XDECREF(__pyx_r); __pyx_t_2 = __Pyx_PyObject_GetAttrStr(__pyx_v_self->_items, __pyx_n_s_index); if (unlikely(!__pyx_t_2)) __PYX_ERR(0, 74, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_2); __pyx_t_3 = NULL; if (CYTHON_UNPACK_METHODS && likely(PyMethod_Check(__pyx_t_2))) { __pyx_t_3 = PyMethod_GET_SELF(__pyx_t_2); if (likely(__pyx_t_3)) { PyObject* function = PyMethod_GET_FUNCTION(__pyx_t_2); __Pyx_INCREF(__pyx_t_3); __Pyx_INCREF(function); __Pyx_DECREF_SET(__pyx_t_2, function); } } __pyx_t_1 = (__pyx_t_3) ? 
__Pyx_PyObject_Call2Args(__pyx_t_2, __pyx_t_3, __pyx_v_item) : __Pyx_PyObject_CallOneArg(__pyx_t_2, __pyx_v_item); __Pyx_XDECREF(__pyx_t_3); __pyx_t_3 = 0; if (unlikely(!__pyx_t_1)) __PYX_ERR(0, 74, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_1); __Pyx_DECREF(__pyx_t_2); __pyx_t_2 = 0; __pyx_r = __pyx_t_1; __pyx_t_1 = 0; goto __pyx_L0; /* "aiohttp/_frozenlist.pyx":73 * return self * * def index(self, item): # <<<<<<<<<<<<<< * return self._items.index(item) * */ /* function exit code */ __pyx_L1_error:; __Pyx_XDECREF(__pyx_t_1); __Pyx_XDECREF(__pyx_t_2); __Pyx_XDECREF(__pyx_t_3); __Pyx_AddTraceback("aiohttp._frozenlist.FrozenList.index", __pyx_clineno, __pyx_lineno, __pyx_filename); __pyx_r = NULL; __pyx_L0:; __Pyx_XGIVEREF(__pyx_r); __Pyx_RefNannyFinishContext(); return __pyx_r; } /* "aiohttp/_frozenlist.pyx":76 * return self._items.index(item) * * def remove(self, item): # <<<<<<<<<<<<<< * self._check_frozen() * self._items.remove(item) */ /* Python wrapper */ static PyObject *__pyx_pw_7aiohttp_11_frozenlist_10FrozenList_27remove(PyObject *__pyx_v_self, PyObject *__pyx_v_item); /*proto*/ static PyObject *__pyx_pw_7aiohttp_11_frozenlist_10FrozenList_27remove(PyObject *__pyx_v_self, PyObject *__pyx_v_item) { PyObject *__pyx_r = 0; __Pyx_RefNannyDeclarations __Pyx_RefNannySetupContext("remove (wrapper)", 0); __pyx_r = __pyx_pf_7aiohttp_11_frozenlist_10FrozenList_26remove(((struct __pyx_obj_7aiohttp_11_frozenlist_FrozenList *)__pyx_v_self), ((PyObject *)__pyx_v_item)); /* function exit code */ __Pyx_RefNannyFinishContext(); return __pyx_r; } static PyObject *__pyx_pf_7aiohttp_11_frozenlist_10FrozenList_26remove(struct __pyx_obj_7aiohttp_11_frozenlist_FrozenList *__pyx_v_self, PyObject *__pyx_v_item) { PyObject *__pyx_r = NULL; __Pyx_RefNannyDeclarations PyObject *__pyx_t_1 = NULL; PyObject *__pyx_t_2 = NULL; PyObject *__pyx_t_3 = NULL; __Pyx_RefNannySetupContext("remove", 0); /* "aiohttp/_frozenlist.pyx":77 * * def remove(self, item): * self._check_frozen() # <<<<<<<<<<<<<< * self._items.remove(item) * */ __pyx_t_1 = ((struct __pyx_vtabstruct_7aiohttp_11_frozenlist_FrozenList *)__pyx_v_self->__pyx_vtab)->_check_frozen(__pyx_v_self); if (unlikely(!__pyx_t_1)) __PYX_ERR(0, 77, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_1); __Pyx_DECREF(__pyx_t_1); __pyx_t_1 = 0; /* "aiohttp/_frozenlist.pyx":78 * def remove(self, item): * self._check_frozen() * self._items.remove(item) # <<<<<<<<<<<<<< * * def clear(self): */ __pyx_t_2 = __Pyx_PyObject_GetAttrStr(__pyx_v_self->_items, __pyx_n_s_remove); if (unlikely(!__pyx_t_2)) __PYX_ERR(0, 78, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_2); __pyx_t_3 = NULL; if (CYTHON_UNPACK_METHODS && likely(PyMethod_Check(__pyx_t_2))) { __pyx_t_3 = PyMethod_GET_SELF(__pyx_t_2); if (likely(__pyx_t_3)) { PyObject* function = PyMethod_GET_FUNCTION(__pyx_t_2); __Pyx_INCREF(__pyx_t_3); __Pyx_INCREF(function); __Pyx_DECREF_SET(__pyx_t_2, function); } } __pyx_t_1 = (__pyx_t_3) ? 
__Pyx_PyObject_Call2Args(__pyx_t_2, __pyx_t_3, __pyx_v_item) : __Pyx_PyObject_CallOneArg(__pyx_t_2, __pyx_v_item); __Pyx_XDECREF(__pyx_t_3); __pyx_t_3 = 0; if (unlikely(!__pyx_t_1)) __PYX_ERR(0, 78, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_1); __Pyx_DECREF(__pyx_t_2); __pyx_t_2 = 0; __Pyx_DECREF(__pyx_t_1); __pyx_t_1 = 0; /* "aiohttp/_frozenlist.pyx":76 * return self._items.index(item) * * def remove(self, item): # <<<<<<<<<<<<<< * self._check_frozen() * self._items.remove(item) */ /* function exit code */ __pyx_r = Py_None; __Pyx_INCREF(Py_None); goto __pyx_L0; __pyx_L1_error:; __Pyx_XDECREF(__pyx_t_1); __Pyx_XDECREF(__pyx_t_2); __Pyx_XDECREF(__pyx_t_3); __Pyx_AddTraceback("aiohttp._frozenlist.FrozenList.remove", __pyx_clineno, __pyx_lineno, __pyx_filename); __pyx_r = NULL; __pyx_L0:; __Pyx_XGIVEREF(__pyx_r); __Pyx_RefNannyFinishContext(); return __pyx_r; } /* "aiohttp/_frozenlist.pyx":80 * self._items.remove(item) * * def clear(self): # <<<<<<<<<<<<<< * self._check_frozen() * self._items.clear() */ /* Python wrapper */ static PyObject *__pyx_pw_7aiohttp_11_frozenlist_10FrozenList_29clear(PyObject *__pyx_v_self, CYTHON_UNUSED PyObject *unused); /*proto*/ static PyObject *__pyx_pw_7aiohttp_11_frozenlist_10FrozenList_29clear(PyObject *__pyx_v_self, CYTHON_UNUSED PyObject *unused) { PyObject *__pyx_r = 0; __Pyx_RefNannyDeclarations __Pyx_RefNannySetupContext("clear (wrapper)", 0); __pyx_r = __pyx_pf_7aiohttp_11_frozenlist_10FrozenList_28clear(((struct __pyx_obj_7aiohttp_11_frozenlist_FrozenList *)__pyx_v_self)); /* function exit code */ __Pyx_RefNannyFinishContext(); return __pyx_r; } static PyObject *__pyx_pf_7aiohttp_11_frozenlist_10FrozenList_28clear(struct __pyx_obj_7aiohttp_11_frozenlist_FrozenList *__pyx_v_self) { PyObject *__pyx_r = NULL; __Pyx_RefNannyDeclarations PyObject *__pyx_t_1 = NULL; PyObject *__pyx_t_2 = NULL; PyObject *__pyx_t_3 = NULL; __Pyx_RefNannySetupContext("clear", 0); /* "aiohttp/_frozenlist.pyx":81 * * def clear(self): * self._check_frozen() # <<<<<<<<<<<<<< * self._items.clear() * */ __pyx_t_1 = ((struct __pyx_vtabstruct_7aiohttp_11_frozenlist_FrozenList *)__pyx_v_self->__pyx_vtab)->_check_frozen(__pyx_v_self); if (unlikely(!__pyx_t_1)) __PYX_ERR(0, 81, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_1); __Pyx_DECREF(__pyx_t_1); __pyx_t_1 = 0; /* "aiohttp/_frozenlist.pyx":82 * def clear(self): * self._check_frozen() * self._items.clear() # <<<<<<<<<<<<<< * * def extend(self, items): */ __pyx_t_2 = __Pyx_PyObject_GetAttrStr(__pyx_v_self->_items, __pyx_n_s_clear); if (unlikely(!__pyx_t_2)) __PYX_ERR(0, 82, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_2); __pyx_t_3 = NULL; if (CYTHON_UNPACK_METHODS && likely(PyMethod_Check(__pyx_t_2))) { __pyx_t_3 = PyMethod_GET_SELF(__pyx_t_2); if (likely(__pyx_t_3)) { PyObject* function = PyMethod_GET_FUNCTION(__pyx_t_2); __Pyx_INCREF(__pyx_t_3); __Pyx_INCREF(function); __Pyx_DECREF_SET(__pyx_t_2, function); } } __pyx_t_1 = (__pyx_t_3) ? 
__Pyx_PyObject_CallOneArg(__pyx_t_2, __pyx_t_3) : __Pyx_PyObject_CallNoArg(__pyx_t_2); __Pyx_XDECREF(__pyx_t_3); __pyx_t_3 = 0; if (unlikely(!__pyx_t_1)) __PYX_ERR(0, 82, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_1); __Pyx_DECREF(__pyx_t_2); __pyx_t_2 = 0; __Pyx_DECREF(__pyx_t_1); __pyx_t_1 = 0; /* "aiohttp/_frozenlist.pyx":80 * self._items.remove(item) * * def clear(self): # <<<<<<<<<<<<<< * self._check_frozen() * self._items.clear() */ /* function exit code */ __pyx_r = Py_None; __Pyx_INCREF(Py_None); goto __pyx_L0; __pyx_L1_error:; __Pyx_XDECREF(__pyx_t_1); __Pyx_XDECREF(__pyx_t_2); __Pyx_XDECREF(__pyx_t_3); __Pyx_AddTraceback("aiohttp._frozenlist.FrozenList.clear", __pyx_clineno, __pyx_lineno, __pyx_filename); __pyx_r = NULL; __pyx_L0:; __Pyx_XGIVEREF(__pyx_r); __Pyx_RefNannyFinishContext(); return __pyx_r; } /* "aiohttp/_frozenlist.pyx":84 * self._items.clear() * * def extend(self, items): # <<<<<<<<<<<<<< * self._check_frozen() * self._items += list(items) */ /* Python wrapper */ static PyObject *__pyx_pw_7aiohttp_11_frozenlist_10FrozenList_31extend(PyObject *__pyx_v_self, PyObject *__pyx_v_items); /*proto*/ static PyObject *__pyx_pw_7aiohttp_11_frozenlist_10FrozenList_31extend(PyObject *__pyx_v_self, PyObject *__pyx_v_items) { PyObject *__pyx_r = 0; __Pyx_RefNannyDeclarations __Pyx_RefNannySetupContext("extend (wrapper)", 0); __pyx_r = __pyx_pf_7aiohttp_11_frozenlist_10FrozenList_30extend(((struct __pyx_obj_7aiohttp_11_frozenlist_FrozenList *)__pyx_v_self), ((PyObject *)__pyx_v_items)); /* function exit code */ __Pyx_RefNannyFinishContext(); return __pyx_r; } static PyObject *__pyx_pf_7aiohttp_11_frozenlist_10FrozenList_30extend(struct __pyx_obj_7aiohttp_11_frozenlist_FrozenList *__pyx_v_self, PyObject *__pyx_v_items) { PyObject *__pyx_r = NULL; __Pyx_RefNannyDeclarations PyObject *__pyx_t_1 = NULL; PyObject *__pyx_t_2 = NULL; __Pyx_RefNannySetupContext("extend", 0); /* "aiohttp/_frozenlist.pyx":85 * * def extend(self, items): * self._check_frozen() # <<<<<<<<<<<<<< * self._items += list(items) * */ __pyx_t_1 = ((struct __pyx_vtabstruct_7aiohttp_11_frozenlist_FrozenList *)__pyx_v_self->__pyx_vtab)->_check_frozen(__pyx_v_self); if (unlikely(!__pyx_t_1)) __PYX_ERR(0, 85, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_1); __Pyx_DECREF(__pyx_t_1); __pyx_t_1 = 0; /* "aiohttp/_frozenlist.pyx":86 * def extend(self, items): * self._check_frozen() * self._items += list(items) # <<<<<<<<<<<<<< * * def reverse(self): */ __pyx_t_1 = PySequence_List(__pyx_v_items); if (unlikely(!__pyx_t_1)) __PYX_ERR(0, 86, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_1); __pyx_t_2 = PyNumber_InPlaceAdd(__pyx_v_self->_items, __pyx_t_1); if (unlikely(!__pyx_t_2)) __PYX_ERR(0, 86, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_2); __Pyx_DECREF(__pyx_t_1); __pyx_t_1 = 0; __Pyx_GIVEREF(__pyx_t_2); __Pyx_GOTREF(__pyx_v_self->_items); __Pyx_DECREF(__pyx_v_self->_items); __pyx_v_self->_items = ((PyObject*)__pyx_t_2); __pyx_t_2 = 0; /* "aiohttp/_frozenlist.pyx":84 * self._items.clear() * * def extend(self, items): # <<<<<<<<<<<<<< * self._check_frozen() * self._items += list(items) */ /* function exit code */ __pyx_r = Py_None; __Pyx_INCREF(Py_None); goto __pyx_L0; __pyx_L1_error:; __Pyx_XDECREF(__pyx_t_1); __Pyx_XDECREF(__pyx_t_2); __Pyx_AddTraceback("aiohttp._frozenlist.FrozenList.extend", __pyx_clineno, __pyx_lineno, __pyx_filename); __pyx_r = NULL; __pyx_L0:; __Pyx_XGIVEREF(__pyx_r); __Pyx_RefNannyFinishContext(); return __pyx_r; } /* "aiohttp/_frozenlist.pyx":88 * self._items += list(items) * * def reverse(self): # <<<<<<<<<<<<<< * 
self._check_frozen() * self._items.reverse() */ /* Python wrapper */ static PyObject *__pyx_pw_7aiohttp_11_frozenlist_10FrozenList_33reverse(PyObject *__pyx_v_self, CYTHON_UNUSED PyObject *unused); /*proto*/ static PyObject *__pyx_pw_7aiohttp_11_frozenlist_10FrozenList_33reverse(PyObject *__pyx_v_self, CYTHON_UNUSED PyObject *unused) { PyObject *__pyx_r = 0; __Pyx_RefNannyDeclarations __Pyx_RefNannySetupContext("reverse (wrapper)", 0); __pyx_r = __pyx_pf_7aiohttp_11_frozenlist_10FrozenList_32reverse(((struct __pyx_obj_7aiohttp_11_frozenlist_FrozenList *)__pyx_v_self)); /* function exit code */ __Pyx_RefNannyFinishContext(); return __pyx_r; } static PyObject *__pyx_pf_7aiohttp_11_frozenlist_10FrozenList_32reverse(struct __pyx_obj_7aiohttp_11_frozenlist_FrozenList *__pyx_v_self) { PyObject *__pyx_r = NULL; __Pyx_RefNannyDeclarations PyObject *__pyx_t_1 = NULL; int __pyx_t_2; __Pyx_RefNannySetupContext("reverse", 0); /* "aiohttp/_frozenlist.pyx":89 * * def reverse(self): * self._check_frozen() # <<<<<<<<<<<<<< * self._items.reverse() * */ __pyx_t_1 = ((struct __pyx_vtabstruct_7aiohttp_11_frozenlist_FrozenList *)__pyx_v_self->__pyx_vtab)->_check_frozen(__pyx_v_self); if (unlikely(!__pyx_t_1)) __PYX_ERR(0, 89, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_1); __Pyx_DECREF(__pyx_t_1); __pyx_t_1 = 0; /* "aiohttp/_frozenlist.pyx":90 * def reverse(self): * self._check_frozen() * self._items.reverse() # <<<<<<<<<<<<<< * * def pop(self, index=-1): */ if (unlikely(__pyx_v_self->_items == Py_None)) { PyErr_Format(PyExc_AttributeError, "'NoneType' object has no attribute '%.30s'", "reverse"); __PYX_ERR(0, 90, __pyx_L1_error) } __pyx_t_2 = PyList_Reverse(__pyx_v_self->_items); if (unlikely(__pyx_t_2 == ((int)-1))) __PYX_ERR(0, 90, __pyx_L1_error) /* "aiohttp/_frozenlist.pyx":88 * self._items += list(items) * * def reverse(self): # <<<<<<<<<<<<<< * self._check_frozen() * self._items.reverse() */ /* function exit code */ __pyx_r = Py_None; __Pyx_INCREF(Py_None); goto __pyx_L0; __pyx_L1_error:; __Pyx_XDECREF(__pyx_t_1); __Pyx_AddTraceback("aiohttp._frozenlist.FrozenList.reverse", __pyx_clineno, __pyx_lineno, __pyx_filename); __pyx_r = NULL; __pyx_L0:; __Pyx_XGIVEREF(__pyx_r); __Pyx_RefNannyFinishContext(); return __pyx_r; } /* "aiohttp/_frozenlist.pyx":92 * self._items.reverse() * * def pop(self, index=-1): # <<<<<<<<<<<<<< * self._check_frozen() * return self._items.pop(index) */ /* Python wrapper */ static PyObject *__pyx_pw_7aiohttp_11_frozenlist_10FrozenList_35pop(PyObject *__pyx_v_self, PyObject *__pyx_args, PyObject *__pyx_kwds); /*proto*/ static PyObject *__pyx_pw_7aiohttp_11_frozenlist_10FrozenList_35pop(PyObject *__pyx_v_self, PyObject *__pyx_args, PyObject *__pyx_kwds) { PyObject *__pyx_v_index = 0; PyObject *__pyx_r = 0; __Pyx_RefNannyDeclarations __Pyx_RefNannySetupContext("pop (wrapper)", 0); { static PyObject **__pyx_pyargnames[] = {&__pyx_n_s_index,0}; PyObject* values[1] = {0}; values[0] = ((PyObject *)__pyx_int_neg_1); if (unlikely(__pyx_kwds)) { Py_ssize_t kw_args; const Py_ssize_t pos_args = PyTuple_GET_SIZE(__pyx_args); switch (pos_args) { case 1: values[0] = PyTuple_GET_ITEM(__pyx_args, 0); CYTHON_FALLTHROUGH; case 0: break; default: goto __pyx_L5_argtuple_error; } kw_args = PyDict_Size(__pyx_kwds); switch (pos_args) { case 0: if (kw_args > 0) { PyObject* value = __Pyx_PyDict_GetItemStr(__pyx_kwds, __pyx_n_s_index); if (value) { values[0] = value; kw_args--; } } } if (unlikely(kw_args > 0)) { if (unlikely(__Pyx_ParseOptionalKeywords(__pyx_kwds, __pyx_pyargnames, 0, values, pos_args, "pop") < 0)) 
__PYX_ERR(0, 92, __pyx_L3_error) } } else { switch (PyTuple_GET_SIZE(__pyx_args)) { case 1: values[0] = PyTuple_GET_ITEM(__pyx_args, 0); CYTHON_FALLTHROUGH; case 0: break; default: goto __pyx_L5_argtuple_error; } } __pyx_v_index = values[0]; } goto __pyx_L4_argument_unpacking_done; __pyx_L5_argtuple_error:; __Pyx_RaiseArgtupleInvalid("pop", 0, 0, 1, PyTuple_GET_SIZE(__pyx_args)); __PYX_ERR(0, 92, __pyx_L3_error) __pyx_L3_error:; __Pyx_AddTraceback("aiohttp._frozenlist.FrozenList.pop", __pyx_clineno, __pyx_lineno, __pyx_filename); __Pyx_RefNannyFinishContext(); return NULL; __pyx_L4_argument_unpacking_done:; __pyx_r = __pyx_pf_7aiohttp_11_frozenlist_10FrozenList_34pop(((struct __pyx_obj_7aiohttp_11_frozenlist_FrozenList *)__pyx_v_self), __pyx_v_index); /* function exit code */ __Pyx_RefNannyFinishContext(); return __pyx_r; } static PyObject *__pyx_pf_7aiohttp_11_frozenlist_10FrozenList_34pop(struct __pyx_obj_7aiohttp_11_frozenlist_FrozenList *__pyx_v_self, PyObject *__pyx_v_index) { PyObject *__pyx_r = NULL; __Pyx_RefNannyDeclarations PyObject *__pyx_t_1 = NULL; Py_ssize_t __pyx_t_2; __Pyx_RefNannySetupContext("pop", 0); /* "aiohttp/_frozenlist.pyx":93 * * def pop(self, index=-1): * self._check_frozen() # <<<<<<<<<<<<<< * return self._items.pop(index) * */ __pyx_t_1 = ((struct __pyx_vtabstruct_7aiohttp_11_frozenlist_FrozenList *)__pyx_v_self->__pyx_vtab)->_check_frozen(__pyx_v_self); if (unlikely(!__pyx_t_1)) __PYX_ERR(0, 93, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_1); __Pyx_DECREF(__pyx_t_1); __pyx_t_1 = 0; /* "aiohttp/_frozenlist.pyx":94 * def pop(self, index=-1): * self._check_frozen() * return self._items.pop(index) # <<<<<<<<<<<<<< * * def append(self, item): */ __Pyx_XDECREF(__pyx_r); if (unlikely(__pyx_v_self->_items == Py_None)) { PyErr_Format(PyExc_AttributeError, "'NoneType' object has no attribute '%.30s'", "pop"); __PYX_ERR(0, 94, __pyx_L1_error) } __pyx_t_2 = __Pyx_PyIndex_AsSsize_t(__pyx_v_index); if (unlikely((__pyx_t_2 == (Py_ssize_t)-1) && PyErr_Occurred())) __PYX_ERR(0, 94, __pyx_L1_error) __pyx_t_1 = __Pyx_PyList_PopIndex(__pyx_v_self->_items, __pyx_v_index, __pyx_t_2, 1, Py_ssize_t, PyInt_FromSsize_t); if (unlikely(!__pyx_t_1)) __PYX_ERR(0, 94, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_1); __pyx_r = __pyx_t_1; __pyx_t_1 = 0; goto __pyx_L0; /* "aiohttp/_frozenlist.pyx":92 * self._items.reverse() * * def pop(self, index=-1): # <<<<<<<<<<<<<< * self._check_frozen() * return self._items.pop(index) */ /* function exit code */ __pyx_L1_error:; __Pyx_XDECREF(__pyx_t_1); __Pyx_AddTraceback("aiohttp._frozenlist.FrozenList.pop", __pyx_clineno, __pyx_lineno, __pyx_filename); __pyx_r = NULL; __pyx_L0:; __Pyx_XGIVEREF(__pyx_r); __Pyx_RefNannyFinishContext(); return __pyx_r; } /* "aiohttp/_frozenlist.pyx":96 * return self._items.pop(index) * * def append(self, item): # <<<<<<<<<<<<<< * self._check_frozen() * return self._items.append(item) */ /* Python wrapper */ static PyObject *__pyx_pw_7aiohttp_11_frozenlist_10FrozenList_37append(PyObject *__pyx_v_self, PyObject *__pyx_v_item); /*proto*/ static PyObject *__pyx_pw_7aiohttp_11_frozenlist_10FrozenList_37append(PyObject *__pyx_v_self, PyObject *__pyx_v_item) { PyObject *__pyx_r = 0; __Pyx_RefNannyDeclarations __Pyx_RefNannySetupContext("append (wrapper)", 0); __pyx_r = __pyx_pf_7aiohttp_11_frozenlist_10FrozenList_36append(((struct __pyx_obj_7aiohttp_11_frozenlist_FrozenList *)__pyx_v_self), ((PyObject *)__pyx_v_item)); /* function exit code */ __Pyx_RefNannyFinishContext(); return __pyx_r; } static PyObject 
*__pyx_pf_7aiohttp_11_frozenlist_10FrozenList_36append(struct __pyx_obj_7aiohttp_11_frozenlist_FrozenList *__pyx_v_self, PyObject *__pyx_v_item) { PyObject *__pyx_r = NULL; __Pyx_RefNannyDeclarations PyObject *__pyx_t_1 = NULL; int __pyx_t_2; __Pyx_RefNannySetupContext("append", 0); /* "aiohttp/_frozenlist.pyx":97 * * def append(self, item): * self._check_frozen() # <<<<<<<<<<<<<< * return self._items.append(item) * */ __pyx_t_1 = ((struct __pyx_vtabstruct_7aiohttp_11_frozenlist_FrozenList *)__pyx_v_self->__pyx_vtab)->_check_frozen(__pyx_v_self); if (unlikely(!__pyx_t_1)) __PYX_ERR(0, 97, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_1); __Pyx_DECREF(__pyx_t_1); __pyx_t_1 = 0; /* "aiohttp/_frozenlist.pyx":98 * def append(self, item): * self._check_frozen() * return self._items.append(item) # <<<<<<<<<<<<<< * * def count(self, item): */ __Pyx_XDECREF(__pyx_r); if (unlikely(__pyx_v_self->_items == Py_None)) { PyErr_Format(PyExc_AttributeError, "'NoneType' object has no attribute '%.30s'", "append"); __PYX_ERR(0, 98, __pyx_L1_error) } __pyx_t_2 = __Pyx_PyList_Append(__pyx_v_self->_items, __pyx_v_item); if (unlikely(__pyx_t_2 == ((int)-1))) __PYX_ERR(0, 98, __pyx_L1_error) __pyx_t_1 = __Pyx_Owned_Py_None(__pyx_t_2); if (unlikely(!__pyx_t_1)) __PYX_ERR(0, 98, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_1); __pyx_r = __pyx_t_1; __pyx_t_1 = 0; goto __pyx_L0; /* "aiohttp/_frozenlist.pyx":96 * return self._items.pop(index) * * def append(self, item): # <<<<<<<<<<<<<< * self._check_frozen() * return self._items.append(item) */ /* function exit code */ __pyx_L1_error:; __Pyx_XDECREF(__pyx_t_1); __Pyx_AddTraceback("aiohttp._frozenlist.FrozenList.append", __pyx_clineno, __pyx_lineno, __pyx_filename); __pyx_r = NULL; __pyx_L0:; __Pyx_XGIVEREF(__pyx_r); __Pyx_RefNannyFinishContext(); return __pyx_r; } /* "aiohttp/_frozenlist.pyx":100 * return self._items.append(item) * * def count(self, item): # <<<<<<<<<<<<<< * return self._items.count(item) * */ /* Python wrapper */ static PyObject *__pyx_pw_7aiohttp_11_frozenlist_10FrozenList_39count(PyObject *__pyx_v_self, PyObject *__pyx_v_item); /*proto*/ static PyObject *__pyx_pw_7aiohttp_11_frozenlist_10FrozenList_39count(PyObject *__pyx_v_self, PyObject *__pyx_v_item) { PyObject *__pyx_r = 0; __Pyx_RefNannyDeclarations __Pyx_RefNannySetupContext("count (wrapper)", 0); __pyx_r = __pyx_pf_7aiohttp_11_frozenlist_10FrozenList_38count(((struct __pyx_obj_7aiohttp_11_frozenlist_FrozenList *)__pyx_v_self), ((PyObject *)__pyx_v_item)); /* function exit code */ __Pyx_RefNannyFinishContext(); return __pyx_r; } static PyObject *__pyx_pf_7aiohttp_11_frozenlist_10FrozenList_38count(struct __pyx_obj_7aiohttp_11_frozenlist_FrozenList *__pyx_v_self, PyObject *__pyx_v_item) { PyObject *__pyx_r = NULL; __Pyx_RefNannyDeclarations PyObject *__pyx_t_1 = NULL; PyObject *__pyx_t_2 = NULL; PyObject *__pyx_t_3 = NULL; __Pyx_RefNannySetupContext("count", 0); /* "aiohttp/_frozenlist.pyx":101 * * def count(self, item): * return self._items.count(item) # <<<<<<<<<<<<<< * * def __repr__(self): */ __Pyx_XDECREF(__pyx_r); __pyx_t_2 = __Pyx_PyObject_GetAttrStr(__pyx_v_self->_items, __pyx_n_s_count); if (unlikely(!__pyx_t_2)) __PYX_ERR(0, 101, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_2); __pyx_t_3 = NULL; if (CYTHON_UNPACK_METHODS && likely(PyMethod_Check(__pyx_t_2))) { __pyx_t_3 = PyMethod_GET_SELF(__pyx_t_2); if (likely(__pyx_t_3)) { PyObject* function = PyMethod_GET_FUNCTION(__pyx_t_2); __Pyx_INCREF(__pyx_t_3); __Pyx_INCREF(function); __Pyx_DECREF_SET(__pyx_t_2, function); } } __pyx_t_1 = (__pyx_t_3) ? 
__Pyx_PyObject_Call2Args(__pyx_t_2, __pyx_t_3, __pyx_v_item) : __Pyx_PyObject_CallOneArg(__pyx_t_2, __pyx_v_item); __Pyx_XDECREF(__pyx_t_3); __pyx_t_3 = 0; if (unlikely(!__pyx_t_1)) __PYX_ERR(0, 101, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_1); __Pyx_DECREF(__pyx_t_2); __pyx_t_2 = 0; __pyx_r = __pyx_t_1; __pyx_t_1 = 0; goto __pyx_L0; /* "aiohttp/_frozenlist.pyx":100 * return self._items.append(item) * * def count(self, item): # <<<<<<<<<<<<<< * return self._items.count(item) * */ /* function exit code */ __pyx_L1_error:; __Pyx_XDECREF(__pyx_t_1); __Pyx_XDECREF(__pyx_t_2); __Pyx_XDECREF(__pyx_t_3); __Pyx_AddTraceback("aiohttp._frozenlist.FrozenList.count", __pyx_clineno, __pyx_lineno, __pyx_filename); __pyx_r = NULL; __pyx_L0:; __Pyx_XGIVEREF(__pyx_r); __Pyx_RefNannyFinishContext(); return __pyx_r; } /* "aiohttp/_frozenlist.pyx":103 * return self._items.count(item) * * def __repr__(self): # <<<<<<<<<<<<<< * return ''.format(self.frozen, * self._items) */ /* Python wrapper */ static PyObject *__pyx_pw_7aiohttp_11_frozenlist_10FrozenList_41__repr__(PyObject *__pyx_v_self); /*proto*/ static PyObject *__pyx_pw_7aiohttp_11_frozenlist_10FrozenList_41__repr__(PyObject *__pyx_v_self) { PyObject *__pyx_r = 0; __Pyx_RefNannyDeclarations __Pyx_RefNannySetupContext("__repr__ (wrapper)", 0); __pyx_r = __pyx_pf_7aiohttp_11_frozenlist_10FrozenList_40__repr__(((struct __pyx_obj_7aiohttp_11_frozenlist_FrozenList *)__pyx_v_self)); /* function exit code */ __Pyx_RefNannyFinishContext(); return __pyx_r; } static PyObject *__pyx_pf_7aiohttp_11_frozenlist_10FrozenList_40__repr__(struct __pyx_obj_7aiohttp_11_frozenlist_FrozenList *__pyx_v_self) { PyObject *__pyx_r = NULL; __Pyx_RefNannyDeclarations PyObject *__pyx_t_1 = NULL; PyObject *__pyx_t_2 = NULL; PyObject *__pyx_t_3 = NULL; PyObject *__pyx_t_4 = NULL; int __pyx_t_5; PyObject *__pyx_t_6 = NULL; __Pyx_RefNannySetupContext("__repr__", 0); /* "aiohttp/_frozenlist.pyx":104 * * def __repr__(self): * return ''.format(self.frozen, # <<<<<<<<<<<<<< * self._items) * */ __Pyx_XDECREF(__pyx_r); __pyx_t_2 = __Pyx_PyObject_GetAttrStr(__pyx_kp_u_FrozenList_frozen_r, __pyx_n_s_format); if (unlikely(!__pyx_t_2)) __PYX_ERR(0, 104, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_2); __pyx_t_3 = __Pyx_PyBool_FromLong(__pyx_v_self->frozen); if (unlikely(!__pyx_t_3)) __PYX_ERR(0, 104, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_3); /* "aiohttp/_frozenlist.pyx":105 * def __repr__(self): * return ''.format(self.frozen, * self._items) # <<<<<<<<<<<<<< * * */ __pyx_t_4 = NULL; __pyx_t_5 = 0; if (CYTHON_UNPACK_METHODS && likely(PyMethod_Check(__pyx_t_2))) { __pyx_t_4 = PyMethod_GET_SELF(__pyx_t_2); if (likely(__pyx_t_4)) { PyObject* function = PyMethod_GET_FUNCTION(__pyx_t_2); __Pyx_INCREF(__pyx_t_4); __Pyx_INCREF(function); __Pyx_DECREF_SET(__pyx_t_2, function); __pyx_t_5 = 1; } } #if CYTHON_FAST_PYCALL if (PyFunction_Check(__pyx_t_2)) { PyObject *__pyx_temp[3] = {__pyx_t_4, __pyx_t_3, __pyx_v_self->_items}; __pyx_t_1 = __Pyx_PyFunction_FastCall(__pyx_t_2, __pyx_temp+1-__pyx_t_5, 2+__pyx_t_5); if (unlikely(!__pyx_t_1)) __PYX_ERR(0, 104, __pyx_L1_error) __Pyx_XDECREF(__pyx_t_4); __pyx_t_4 = 0; __Pyx_GOTREF(__pyx_t_1); __Pyx_DECREF(__pyx_t_3); __pyx_t_3 = 0; } else #endif #if CYTHON_FAST_PYCCALL if (__Pyx_PyFastCFunction_Check(__pyx_t_2)) { PyObject *__pyx_temp[3] = {__pyx_t_4, __pyx_t_3, __pyx_v_self->_items}; __pyx_t_1 = __Pyx_PyCFunction_FastCall(__pyx_t_2, __pyx_temp+1-__pyx_t_5, 2+__pyx_t_5); if (unlikely(!__pyx_t_1)) __PYX_ERR(0, 104, __pyx_L1_error) __Pyx_XDECREF(__pyx_t_4); __pyx_t_4 = 0; 
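/* Illustrative sketch, not generated by Cython: the two-argument ``.format()``
 * call being set up here formats the string constant
 * __pyx_kp_u_FrozenList_frozen_r with ``self.frozen`` and ``self._items``.
 * The literal itself does not appear in the embedded pyx comment above; the
 * constant's name suggests something along the lines of
 * '<FrozenList(frozen={}, {!r})>'.  A hedged Python equivalent
 * (``frozenlist_repr`` is a hypothetical helper, and the exact literal is an
 * assumption inferred from that constant name):
 *
 *     def frozenlist_repr(frozen, items):
 *         # first placeholder takes the frozen flag, second the repr of the backing list
 *         return '<FrozenList(frozen={}, {!r})>'.format(frozen, items)
 */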
__Pyx_GOTREF(__pyx_t_1); __Pyx_DECREF(__pyx_t_3); __pyx_t_3 = 0; } else #endif { __pyx_t_6 = PyTuple_New(2+__pyx_t_5); if (unlikely(!__pyx_t_6)) __PYX_ERR(0, 104, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_6); if (__pyx_t_4) { __Pyx_GIVEREF(__pyx_t_4); PyTuple_SET_ITEM(__pyx_t_6, 0, __pyx_t_4); __pyx_t_4 = NULL; } __Pyx_GIVEREF(__pyx_t_3); PyTuple_SET_ITEM(__pyx_t_6, 0+__pyx_t_5, __pyx_t_3); __Pyx_INCREF(__pyx_v_self->_items); __Pyx_GIVEREF(__pyx_v_self->_items); PyTuple_SET_ITEM(__pyx_t_6, 1+__pyx_t_5, __pyx_v_self->_items); __pyx_t_3 = 0; __pyx_t_1 = __Pyx_PyObject_Call(__pyx_t_2, __pyx_t_6, NULL); if (unlikely(!__pyx_t_1)) __PYX_ERR(0, 104, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_1); __Pyx_DECREF(__pyx_t_6); __pyx_t_6 = 0; } __Pyx_DECREF(__pyx_t_2); __pyx_t_2 = 0; __pyx_r = __pyx_t_1; __pyx_t_1 = 0; goto __pyx_L0; /* "aiohttp/_frozenlist.pyx":103 * return self._items.count(item) * * def __repr__(self): # <<<<<<<<<<<<<< * return ''.format(self.frozen, * self._items) */ /* function exit code */ __pyx_L1_error:; __Pyx_XDECREF(__pyx_t_1); __Pyx_XDECREF(__pyx_t_2); __Pyx_XDECREF(__pyx_t_3); __Pyx_XDECREF(__pyx_t_4); __Pyx_XDECREF(__pyx_t_6); __Pyx_AddTraceback("aiohttp._frozenlist.FrozenList.__repr__", __pyx_clineno, __pyx_lineno, __pyx_filename); __pyx_r = NULL; __pyx_L0:; __Pyx_XGIVEREF(__pyx_r); __Pyx_RefNannyFinishContext(); return __pyx_r; } /* "aiohttp/_frozenlist.pyx":6 * cdef class FrozenList: * * cdef readonly bint frozen # <<<<<<<<<<<<<< * cdef list _items * */ /* Python wrapper */ static PyObject *__pyx_pw_7aiohttp_11_frozenlist_10FrozenList_6frozen_1__get__(PyObject *__pyx_v_self); /*proto*/ static PyObject *__pyx_pw_7aiohttp_11_frozenlist_10FrozenList_6frozen_1__get__(PyObject *__pyx_v_self) { PyObject *__pyx_r = 0; __Pyx_RefNannyDeclarations __Pyx_RefNannySetupContext("__get__ (wrapper)", 0); __pyx_r = __pyx_pf_7aiohttp_11_frozenlist_10FrozenList_6frozen___get__(((struct __pyx_obj_7aiohttp_11_frozenlist_FrozenList *)__pyx_v_self)); /* function exit code */ __Pyx_RefNannyFinishContext(); return __pyx_r; } static PyObject *__pyx_pf_7aiohttp_11_frozenlist_10FrozenList_6frozen___get__(struct __pyx_obj_7aiohttp_11_frozenlist_FrozenList *__pyx_v_self) { PyObject *__pyx_r = NULL; __Pyx_RefNannyDeclarations PyObject *__pyx_t_1 = NULL; __Pyx_RefNannySetupContext("__get__", 0); __Pyx_XDECREF(__pyx_r); __pyx_t_1 = __Pyx_PyBool_FromLong(__pyx_v_self->frozen); if (unlikely(!__pyx_t_1)) __PYX_ERR(0, 6, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_1); __pyx_r = __pyx_t_1; __pyx_t_1 = 0; goto __pyx_L0; /* function exit code */ __pyx_L1_error:; __Pyx_XDECREF(__pyx_t_1); __Pyx_AddTraceback("aiohttp._frozenlist.FrozenList.frozen.__get__", __pyx_clineno, __pyx_lineno, __pyx_filename); __pyx_r = NULL; __pyx_L0:; __Pyx_XGIVEREF(__pyx_r); __Pyx_RefNannyFinishContext(); return __pyx_r; } /* "(tree fragment)":1 * def __reduce_cython__(self): # <<<<<<<<<<<<<< * cdef tuple state * cdef object _dict */ /* Python wrapper */ static PyObject *__pyx_pw_7aiohttp_11_frozenlist_10FrozenList_43__reduce_cython__(PyObject *__pyx_v_self, CYTHON_UNUSED PyObject *unused); /*proto*/ static PyObject *__pyx_pw_7aiohttp_11_frozenlist_10FrozenList_43__reduce_cython__(PyObject *__pyx_v_self, CYTHON_UNUSED PyObject *unused) { PyObject *__pyx_r = 0; __Pyx_RefNannyDeclarations __Pyx_RefNannySetupContext("__reduce_cython__ (wrapper)", 0); __pyx_r = __pyx_pf_7aiohttp_11_frozenlist_10FrozenList_42__reduce_cython__(((struct __pyx_obj_7aiohttp_11_frozenlist_FrozenList *)__pyx_v_self)); /* function exit code */ __Pyx_RefNannyFinishContext(); return 
__pyx_r; } static PyObject *__pyx_pf_7aiohttp_11_frozenlist_10FrozenList_42__reduce_cython__(struct __pyx_obj_7aiohttp_11_frozenlist_FrozenList *__pyx_v_self) { PyObject *__pyx_v_state = 0; PyObject *__pyx_v__dict = 0; int __pyx_v_use_setstate; PyObject *__pyx_r = NULL; __Pyx_RefNannyDeclarations PyObject *__pyx_t_1 = NULL; PyObject *__pyx_t_2 = NULL; int __pyx_t_3; int __pyx_t_4; PyObject *__pyx_t_5 = NULL; __Pyx_RefNannySetupContext("__reduce_cython__", 0); /* "(tree fragment)":5 * cdef object _dict * cdef bint use_setstate * state = (self._items, self.frozen) # <<<<<<<<<<<<<< * _dict = getattr(self, '__dict__', None) * if _dict is not None: */ __pyx_t_1 = __Pyx_PyBool_FromLong(__pyx_v_self->frozen); if (unlikely(!__pyx_t_1)) __PYX_ERR(1, 5, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_1); __pyx_t_2 = PyTuple_New(2); if (unlikely(!__pyx_t_2)) __PYX_ERR(1, 5, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_2); __Pyx_INCREF(__pyx_v_self->_items); __Pyx_GIVEREF(__pyx_v_self->_items); PyTuple_SET_ITEM(__pyx_t_2, 0, __pyx_v_self->_items); __Pyx_GIVEREF(__pyx_t_1); PyTuple_SET_ITEM(__pyx_t_2, 1, __pyx_t_1); __pyx_t_1 = 0; __pyx_v_state = ((PyObject*)__pyx_t_2); __pyx_t_2 = 0; /* "(tree fragment)":6 * cdef bint use_setstate * state = (self._items, self.frozen) * _dict = getattr(self, '__dict__', None) # <<<<<<<<<<<<<< * if _dict is not None: * state += (_dict,) */ __pyx_t_2 = __Pyx_GetAttr3(((PyObject *)__pyx_v_self), __pyx_n_s_dict, Py_None); if (unlikely(!__pyx_t_2)) __PYX_ERR(1, 6, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_2); __pyx_v__dict = __pyx_t_2; __pyx_t_2 = 0; /* "(tree fragment)":7 * state = (self._items, self.frozen) * _dict = getattr(self, '__dict__', None) * if _dict is not None: # <<<<<<<<<<<<<< * state += (_dict,) * use_setstate = True */ __pyx_t_3 = (__pyx_v__dict != Py_None); __pyx_t_4 = (__pyx_t_3 != 0); if (__pyx_t_4) { /* "(tree fragment)":8 * _dict = getattr(self, '__dict__', None) * if _dict is not None: * state += (_dict,) # <<<<<<<<<<<<<< * use_setstate = True * else: */ __pyx_t_2 = PyTuple_New(1); if (unlikely(!__pyx_t_2)) __PYX_ERR(1, 8, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_2); __Pyx_INCREF(__pyx_v__dict); __Pyx_GIVEREF(__pyx_v__dict); PyTuple_SET_ITEM(__pyx_t_2, 0, __pyx_v__dict); __pyx_t_1 = PyNumber_InPlaceAdd(__pyx_v_state, __pyx_t_2); if (unlikely(!__pyx_t_1)) __PYX_ERR(1, 8, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_1); __Pyx_DECREF(__pyx_t_2); __pyx_t_2 = 0; __Pyx_DECREF_SET(__pyx_v_state, ((PyObject*)__pyx_t_1)); __pyx_t_1 = 0; /* "(tree fragment)":9 * if _dict is not None: * state += (_dict,) * use_setstate = True # <<<<<<<<<<<<<< * else: * use_setstate = self._items is not None */ __pyx_v_use_setstate = 1; /* "(tree fragment)":7 * state = (self._items, self.frozen) * _dict = getattr(self, '__dict__', None) * if _dict is not None: # <<<<<<<<<<<<<< * state += (_dict,) * use_setstate = True */ goto __pyx_L3; } /* "(tree fragment)":11 * use_setstate = True * else: * use_setstate = self._items is not None # <<<<<<<<<<<<<< * if use_setstate: * return __pyx_unpickle_FrozenList, (type(self), 0x949a143, None), state */ /*else*/ { __pyx_t_4 = (__pyx_v_self->_items != ((PyObject*)Py_None)); __pyx_v_use_setstate = __pyx_t_4; } __pyx_L3:; /* "(tree fragment)":12 * else: * use_setstate = self._items is not None * if use_setstate: # <<<<<<<<<<<<<< * return __pyx_unpickle_FrozenList, (type(self), 0x949a143, None), state * else: */ __pyx_t_4 = (__pyx_v_use_setstate != 0); if (__pyx_t_4) { /* "(tree fragment)":13 * use_setstate = self._items is not None * if use_setstate: * return 
__pyx_unpickle_FrozenList, (type(self), 0x949a143, None), state # <<<<<<<<<<<<<< * else: * return __pyx_unpickle_FrozenList, (type(self), 0x949a143, state) */ __Pyx_XDECREF(__pyx_r); __Pyx_GetModuleGlobalName(__pyx_t_1, __pyx_n_s_pyx_unpickle_FrozenList); if (unlikely(!__pyx_t_1)) __PYX_ERR(1, 13, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_1); __pyx_t_2 = PyTuple_New(3); if (unlikely(!__pyx_t_2)) __PYX_ERR(1, 13, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_2); __Pyx_INCREF(((PyObject *)Py_TYPE(((PyObject *)__pyx_v_self)))); __Pyx_GIVEREF(((PyObject *)Py_TYPE(((PyObject *)__pyx_v_self)))); PyTuple_SET_ITEM(__pyx_t_2, 0, ((PyObject *)Py_TYPE(((PyObject *)__pyx_v_self)))); __Pyx_INCREF(__pyx_int_155820355); __Pyx_GIVEREF(__pyx_int_155820355); PyTuple_SET_ITEM(__pyx_t_2, 1, __pyx_int_155820355); __Pyx_INCREF(Py_None); __Pyx_GIVEREF(Py_None); PyTuple_SET_ITEM(__pyx_t_2, 2, Py_None); __pyx_t_5 = PyTuple_New(3); if (unlikely(!__pyx_t_5)) __PYX_ERR(1, 13, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_5); __Pyx_GIVEREF(__pyx_t_1); PyTuple_SET_ITEM(__pyx_t_5, 0, __pyx_t_1); __Pyx_GIVEREF(__pyx_t_2); PyTuple_SET_ITEM(__pyx_t_5, 1, __pyx_t_2); __Pyx_INCREF(__pyx_v_state); __Pyx_GIVEREF(__pyx_v_state); PyTuple_SET_ITEM(__pyx_t_5, 2, __pyx_v_state); __pyx_t_1 = 0; __pyx_t_2 = 0; __pyx_r = __pyx_t_5; __pyx_t_5 = 0; goto __pyx_L0; /* "(tree fragment)":12 * else: * use_setstate = self._items is not None * if use_setstate: # <<<<<<<<<<<<<< * return __pyx_unpickle_FrozenList, (type(self), 0x949a143, None), state * else: */ } /* "(tree fragment)":15 * return __pyx_unpickle_FrozenList, (type(self), 0x949a143, None), state * else: * return __pyx_unpickle_FrozenList, (type(self), 0x949a143, state) # <<<<<<<<<<<<<< * def __setstate_cython__(self, __pyx_state): * __pyx_unpickle_FrozenList__set_state(self, __pyx_state) */ /*else*/ { __Pyx_XDECREF(__pyx_r); __Pyx_GetModuleGlobalName(__pyx_t_5, __pyx_n_s_pyx_unpickle_FrozenList); if (unlikely(!__pyx_t_5)) __PYX_ERR(1, 15, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_5); __pyx_t_2 = PyTuple_New(3); if (unlikely(!__pyx_t_2)) __PYX_ERR(1, 15, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_2); __Pyx_INCREF(((PyObject *)Py_TYPE(((PyObject *)__pyx_v_self)))); __Pyx_GIVEREF(((PyObject *)Py_TYPE(((PyObject *)__pyx_v_self)))); PyTuple_SET_ITEM(__pyx_t_2, 0, ((PyObject *)Py_TYPE(((PyObject *)__pyx_v_self)))); __Pyx_INCREF(__pyx_int_155820355); __Pyx_GIVEREF(__pyx_int_155820355); PyTuple_SET_ITEM(__pyx_t_2, 1, __pyx_int_155820355); __Pyx_INCREF(__pyx_v_state); __Pyx_GIVEREF(__pyx_v_state); PyTuple_SET_ITEM(__pyx_t_2, 2, __pyx_v_state); __pyx_t_1 = PyTuple_New(2); if (unlikely(!__pyx_t_1)) __PYX_ERR(1, 15, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_1); __Pyx_GIVEREF(__pyx_t_5); PyTuple_SET_ITEM(__pyx_t_1, 0, __pyx_t_5); __Pyx_GIVEREF(__pyx_t_2); PyTuple_SET_ITEM(__pyx_t_1, 1, __pyx_t_2); __pyx_t_5 = 0; __pyx_t_2 = 0; __pyx_r = __pyx_t_1; __pyx_t_1 = 0; goto __pyx_L0; } /* "(tree fragment)":1 * def __reduce_cython__(self): # <<<<<<<<<<<<<< * cdef tuple state * cdef object _dict */ /* function exit code */ __pyx_L1_error:; __Pyx_XDECREF(__pyx_t_1); __Pyx_XDECREF(__pyx_t_2); __Pyx_XDECREF(__pyx_t_5); __Pyx_AddTraceback("aiohttp._frozenlist.FrozenList.__reduce_cython__", __pyx_clineno, __pyx_lineno, __pyx_filename); __pyx_r = NULL; __pyx_L0:; __Pyx_XDECREF(__pyx_v_state); __Pyx_XDECREF(__pyx_v__dict); __Pyx_XGIVEREF(__pyx_r); __Pyx_RefNannyFinishContext(); return __pyx_r; } /* "(tree fragment)":16 * else: * return __pyx_unpickle_FrozenList, (type(self), 0x949a143, state) * def __setstate_cython__(self, __pyx_state): # 
<<<<<<<<<<<<<< * __pyx_unpickle_FrozenList__set_state(self, __pyx_state) */ /* Python wrapper */ static PyObject *__pyx_pw_7aiohttp_11_frozenlist_10FrozenList_45__setstate_cython__(PyObject *__pyx_v_self, PyObject *__pyx_v___pyx_state); /*proto*/ static PyObject *__pyx_pw_7aiohttp_11_frozenlist_10FrozenList_45__setstate_cython__(PyObject *__pyx_v_self, PyObject *__pyx_v___pyx_state) { PyObject *__pyx_r = 0; __Pyx_RefNannyDeclarations __Pyx_RefNannySetupContext("__setstate_cython__ (wrapper)", 0); __pyx_r = __pyx_pf_7aiohttp_11_frozenlist_10FrozenList_44__setstate_cython__(((struct __pyx_obj_7aiohttp_11_frozenlist_FrozenList *)__pyx_v_self), ((PyObject *)__pyx_v___pyx_state)); /* function exit code */ __Pyx_RefNannyFinishContext(); return __pyx_r; } static PyObject *__pyx_pf_7aiohttp_11_frozenlist_10FrozenList_44__setstate_cython__(struct __pyx_obj_7aiohttp_11_frozenlist_FrozenList *__pyx_v_self, PyObject *__pyx_v___pyx_state) { PyObject *__pyx_r = NULL; __Pyx_RefNannyDeclarations PyObject *__pyx_t_1 = NULL; __Pyx_RefNannySetupContext("__setstate_cython__", 0); /* "(tree fragment)":17 * return __pyx_unpickle_FrozenList, (type(self), 0x949a143, state) * def __setstate_cython__(self, __pyx_state): * __pyx_unpickle_FrozenList__set_state(self, __pyx_state) # <<<<<<<<<<<<<< */ if (!(likely(PyTuple_CheckExact(__pyx_v___pyx_state))||((__pyx_v___pyx_state) == Py_None)||(PyErr_Format(PyExc_TypeError, "Expected %.16s, got %.200s", "tuple", Py_TYPE(__pyx_v___pyx_state)->tp_name), 0))) __PYX_ERR(1, 17, __pyx_L1_error) __pyx_t_1 = __pyx_f_7aiohttp_11_frozenlist___pyx_unpickle_FrozenList__set_state(__pyx_v_self, ((PyObject*)__pyx_v___pyx_state)); if (unlikely(!__pyx_t_1)) __PYX_ERR(1, 17, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_1); __Pyx_DECREF(__pyx_t_1); __pyx_t_1 = 0; /* "(tree fragment)":16 * else: * return __pyx_unpickle_FrozenList, (type(self), 0x949a143, state) * def __setstate_cython__(self, __pyx_state): # <<<<<<<<<<<<<< * __pyx_unpickle_FrozenList__set_state(self, __pyx_state) */ /* function exit code */ __pyx_r = Py_None; __Pyx_INCREF(Py_None); goto __pyx_L0; __pyx_L1_error:; __Pyx_XDECREF(__pyx_t_1); __Pyx_AddTraceback("aiohttp._frozenlist.FrozenList.__setstate_cython__", __pyx_clineno, __pyx_lineno, __pyx_filename); __pyx_r = NULL; __pyx_L0:; __Pyx_XGIVEREF(__pyx_r); __Pyx_RefNannyFinishContext(); return __pyx_r; } /* "(tree fragment)":1 * def __pyx_unpickle_FrozenList(__pyx_type, long __pyx_checksum, __pyx_state): # <<<<<<<<<<<<<< * cdef object __pyx_PickleError * cdef object __pyx_result */ /* Python wrapper */ static PyObject *__pyx_pw_7aiohttp_11_frozenlist_1__pyx_unpickle_FrozenList(PyObject *__pyx_self, PyObject *__pyx_args, PyObject *__pyx_kwds); /*proto*/ static PyMethodDef __pyx_mdef_7aiohttp_11_frozenlist_1__pyx_unpickle_FrozenList = {"__pyx_unpickle_FrozenList", (PyCFunction)(void*)(PyCFunctionWithKeywords)__pyx_pw_7aiohttp_11_frozenlist_1__pyx_unpickle_FrozenList, METH_VARARGS|METH_KEYWORDS, 0}; static PyObject *__pyx_pw_7aiohttp_11_frozenlist_1__pyx_unpickle_FrozenList(PyObject *__pyx_self, PyObject *__pyx_args, PyObject *__pyx_kwds) { PyObject *__pyx_v___pyx_type = 0; long __pyx_v___pyx_checksum; PyObject *__pyx_v___pyx_state = 0; PyObject *__pyx_r = 0; __Pyx_RefNannyDeclarations __Pyx_RefNannySetupContext("__pyx_unpickle_FrozenList (wrapper)", 0); { static PyObject **__pyx_pyargnames[] = {&__pyx_n_s_pyx_type,&__pyx_n_s_pyx_checksum,&__pyx_n_s_pyx_state,0}; PyObject* values[3] = {0,0,0}; if (unlikely(__pyx_kwds)) { Py_ssize_t kw_args; const Py_ssize_t pos_args = 
PyTuple_GET_SIZE(__pyx_args); switch (pos_args) { case 3: values[2] = PyTuple_GET_ITEM(__pyx_args, 2); CYTHON_FALLTHROUGH; case 2: values[1] = PyTuple_GET_ITEM(__pyx_args, 1); CYTHON_FALLTHROUGH; case 1: values[0] = PyTuple_GET_ITEM(__pyx_args, 0); CYTHON_FALLTHROUGH; case 0: break; default: goto __pyx_L5_argtuple_error; } kw_args = PyDict_Size(__pyx_kwds); switch (pos_args) { case 0: if (likely((values[0] = __Pyx_PyDict_GetItemStr(__pyx_kwds, __pyx_n_s_pyx_type)) != 0)) kw_args--; else goto __pyx_L5_argtuple_error; CYTHON_FALLTHROUGH; case 1: if (likely((values[1] = __Pyx_PyDict_GetItemStr(__pyx_kwds, __pyx_n_s_pyx_checksum)) != 0)) kw_args--; else { __Pyx_RaiseArgtupleInvalid("__pyx_unpickle_FrozenList", 1, 3, 3, 1); __PYX_ERR(1, 1, __pyx_L3_error) } CYTHON_FALLTHROUGH; case 2: if (likely((values[2] = __Pyx_PyDict_GetItemStr(__pyx_kwds, __pyx_n_s_pyx_state)) != 0)) kw_args--; else { __Pyx_RaiseArgtupleInvalid("__pyx_unpickle_FrozenList", 1, 3, 3, 2); __PYX_ERR(1, 1, __pyx_L3_error) } } if (unlikely(kw_args > 0)) { if (unlikely(__Pyx_ParseOptionalKeywords(__pyx_kwds, __pyx_pyargnames, 0, values, pos_args, "__pyx_unpickle_FrozenList") < 0)) __PYX_ERR(1, 1, __pyx_L3_error) } } else if (PyTuple_GET_SIZE(__pyx_args) != 3) { goto __pyx_L5_argtuple_error; } else { values[0] = PyTuple_GET_ITEM(__pyx_args, 0); values[1] = PyTuple_GET_ITEM(__pyx_args, 1); values[2] = PyTuple_GET_ITEM(__pyx_args, 2); } __pyx_v___pyx_type = values[0]; __pyx_v___pyx_checksum = __Pyx_PyInt_As_long(values[1]); if (unlikely((__pyx_v___pyx_checksum == (long)-1) && PyErr_Occurred())) __PYX_ERR(1, 1, __pyx_L3_error) __pyx_v___pyx_state = values[2]; } goto __pyx_L4_argument_unpacking_done; __pyx_L5_argtuple_error:; __Pyx_RaiseArgtupleInvalid("__pyx_unpickle_FrozenList", 1, 3, 3, PyTuple_GET_SIZE(__pyx_args)); __PYX_ERR(1, 1, __pyx_L3_error) __pyx_L3_error:; __Pyx_AddTraceback("aiohttp._frozenlist.__pyx_unpickle_FrozenList", __pyx_clineno, __pyx_lineno, __pyx_filename); __Pyx_RefNannyFinishContext(); return NULL; __pyx_L4_argument_unpacking_done:; __pyx_r = __pyx_pf_7aiohttp_11_frozenlist___pyx_unpickle_FrozenList(__pyx_self, __pyx_v___pyx_type, __pyx_v___pyx_checksum, __pyx_v___pyx_state); /* function exit code */ __Pyx_RefNannyFinishContext(); return __pyx_r; } static PyObject *__pyx_pf_7aiohttp_11_frozenlist___pyx_unpickle_FrozenList(CYTHON_UNUSED PyObject *__pyx_self, PyObject *__pyx_v___pyx_type, long __pyx_v___pyx_checksum, PyObject *__pyx_v___pyx_state) { PyObject *__pyx_v___pyx_PickleError = 0; PyObject *__pyx_v___pyx_result = 0; PyObject *__pyx_r = NULL; __Pyx_RefNannyDeclarations int __pyx_t_1; PyObject *__pyx_t_2 = NULL; PyObject *__pyx_t_3 = NULL; PyObject *__pyx_t_4 = NULL; PyObject *__pyx_t_5 = NULL; int __pyx_t_6; __Pyx_RefNannySetupContext("__pyx_unpickle_FrozenList", 0); /* "(tree fragment)":4 * cdef object __pyx_PickleError * cdef object __pyx_result * if __pyx_checksum != 0x949a143: # <<<<<<<<<<<<<< * from pickle import PickleError as __pyx_PickleError * raise __pyx_PickleError("Incompatible checksums (%s vs 0x949a143 = (_items, frozen))" % __pyx_checksum) */ __pyx_t_1 = ((__pyx_v___pyx_checksum != 0x949a143) != 0); if (__pyx_t_1) { /* "(tree fragment)":5 * cdef object __pyx_result * if __pyx_checksum != 0x949a143: * from pickle import PickleError as __pyx_PickleError # <<<<<<<<<<<<<< * raise __pyx_PickleError("Incompatible checksums (%s vs 0x949a143 = (_items, frozen))" % __pyx_checksum) * __pyx_result = FrozenList.__new__(__pyx_type) */ __pyx_t_2 = PyList_New(1); if (unlikely(!__pyx_t_2)) __PYX_ERR(1, 5, 
__pyx_L1_error) __Pyx_GOTREF(__pyx_t_2); __Pyx_INCREF(__pyx_n_s_PickleError); __Pyx_GIVEREF(__pyx_n_s_PickleError); PyList_SET_ITEM(__pyx_t_2, 0, __pyx_n_s_PickleError); __pyx_t_3 = __Pyx_Import(__pyx_n_s_pickle, __pyx_t_2, 0); if (unlikely(!__pyx_t_3)) __PYX_ERR(1, 5, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_3); __Pyx_DECREF(__pyx_t_2); __pyx_t_2 = 0; __pyx_t_2 = __Pyx_ImportFrom(__pyx_t_3, __pyx_n_s_PickleError); if (unlikely(!__pyx_t_2)) __PYX_ERR(1, 5, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_2); __Pyx_INCREF(__pyx_t_2); __pyx_v___pyx_PickleError = __pyx_t_2; __Pyx_DECREF(__pyx_t_2); __pyx_t_2 = 0; __Pyx_DECREF(__pyx_t_3); __pyx_t_3 = 0; /* "(tree fragment)":6 * if __pyx_checksum != 0x949a143: * from pickle import PickleError as __pyx_PickleError * raise __pyx_PickleError("Incompatible checksums (%s vs 0x949a143 = (_items, frozen))" % __pyx_checksum) # <<<<<<<<<<<<<< * __pyx_result = FrozenList.__new__(__pyx_type) * if __pyx_state is not None: */ __pyx_t_2 = __Pyx_PyInt_From_long(__pyx_v___pyx_checksum); if (unlikely(!__pyx_t_2)) __PYX_ERR(1, 6, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_2); __pyx_t_4 = __Pyx_PyString_Format(__pyx_kp_s_Incompatible_checksums_s_vs_0x94, __pyx_t_2); if (unlikely(!__pyx_t_4)) __PYX_ERR(1, 6, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_4); __Pyx_DECREF(__pyx_t_2); __pyx_t_2 = 0; __Pyx_INCREF(__pyx_v___pyx_PickleError); __pyx_t_2 = __pyx_v___pyx_PickleError; __pyx_t_5 = NULL; if (CYTHON_UNPACK_METHODS && unlikely(PyMethod_Check(__pyx_t_2))) { __pyx_t_5 = PyMethod_GET_SELF(__pyx_t_2); if (likely(__pyx_t_5)) { PyObject* function = PyMethod_GET_FUNCTION(__pyx_t_2); __Pyx_INCREF(__pyx_t_5); __Pyx_INCREF(function); __Pyx_DECREF_SET(__pyx_t_2, function); } } __pyx_t_3 = (__pyx_t_5) ? __Pyx_PyObject_Call2Args(__pyx_t_2, __pyx_t_5, __pyx_t_4) : __Pyx_PyObject_CallOneArg(__pyx_t_2, __pyx_t_4); __Pyx_XDECREF(__pyx_t_5); __pyx_t_5 = 0; __Pyx_DECREF(__pyx_t_4); __pyx_t_4 = 0; if (unlikely(!__pyx_t_3)) __PYX_ERR(1, 6, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_3); __Pyx_DECREF(__pyx_t_2); __pyx_t_2 = 0; __Pyx_Raise(__pyx_t_3, 0, 0, 0); __Pyx_DECREF(__pyx_t_3); __pyx_t_3 = 0; __PYX_ERR(1, 6, __pyx_L1_error) /* "(tree fragment)":4 * cdef object __pyx_PickleError * cdef object __pyx_result * if __pyx_checksum != 0x949a143: # <<<<<<<<<<<<<< * from pickle import PickleError as __pyx_PickleError * raise __pyx_PickleError("Incompatible checksums (%s vs 0x949a143 = (_items, frozen))" % __pyx_checksum) */ } /* "(tree fragment)":7 * from pickle import PickleError as __pyx_PickleError * raise __pyx_PickleError("Incompatible checksums (%s vs 0x949a143 = (_items, frozen))" % __pyx_checksum) * __pyx_result = FrozenList.__new__(__pyx_type) # <<<<<<<<<<<<<< * if __pyx_state is not None: * __pyx_unpickle_FrozenList__set_state( __pyx_result, __pyx_state) */ __pyx_t_2 = __Pyx_PyObject_GetAttrStr(((PyObject *)__pyx_ptype_7aiohttp_11_frozenlist_FrozenList), __pyx_n_s_new); if (unlikely(!__pyx_t_2)) __PYX_ERR(1, 7, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_2); __pyx_t_4 = NULL; if (CYTHON_UNPACK_METHODS && likely(PyMethod_Check(__pyx_t_2))) { __pyx_t_4 = PyMethod_GET_SELF(__pyx_t_2); if (likely(__pyx_t_4)) { PyObject* function = PyMethod_GET_FUNCTION(__pyx_t_2); __Pyx_INCREF(__pyx_t_4); __Pyx_INCREF(function); __Pyx_DECREF_SET(__pyx_t_2, function); } } __pyx_t_3 = (__pyx_t_4) ? 
__Pyx_PyObject_Call2Args(__pyx_t_2, __pyx_t_4, __pyx_v___pyx_type) : __Pyx_PyObject_CallOneArg(__pyx_t_2, __pyx_v___pyx_type); __Pyx_XDECREF(__pyx_t_4); __pyx_t_4 = 0; if (unlikely(!__pyx_t_3)) __PYX_ERR(1, 7, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_3); __Pyx_DECREF(__pyx_t_2); __pyx_t_2 = 0; __pyx_v___pyx_result = __pyx_t_3; __pyx_t_3 = 0; /* "(tree fragment)":8 * raise __pyx_PickleError("Incompatible checksums (%s vs 0x949a143 = (_items, frozen))" % __pyx_checksum) * __pyx_result = FrozenList.__new__(__pyx_type) * if __pyx_state is not None: # <<<<<<<<<<<<<< * __pyx_unpickle_FrozenList__set_state( __pyx_result, __pyx_state) * return __pyx_result */ __pyx_t_1 = (__pyx_v___pyx_state != Py_None); __pyx_t_6 = (__pyx_t_1 != 0); if (__pyx_t_6) { /* "(tree fragment)":9 * __pyx_result = FrozenList.__new__(__pyx_type) * if __pyx_state is not None: * __pyx_unpickle_FrozenList__set_state( __pyx_result, __pyx_state) # <<<<<<<<<<<<<< * return __pyx_result * cdef __pyx_unpickle_FrozenList__set_state(FrozenList __pyx_result, tuple __pyx_state): */ if (!(likely(PyTuple_CheckExact(__pyx_v___pyx_state))||((__pyx_v___pyx_state) == Py_None)||(PyErr_Format(PyExc_TypeError, "Expected %.16s, got %.200s", "tuple", Py_TYPE(__pyx_v___pyx_state)->tp_name), 0))) __PYX_ERR(1, 9, __pyx_L1_error) __pyx_t_3 = __pyx_f_7aiohttp_11_frozenlist___pyx_unpickle_FrozenList__set_state(((struct __pyx_obj_7aiohttp_11_frozenlist_FrozenList *)__pyx_v___pyx_result), ((PyObject*)__pyx_v___pyx_state)); if (unlikely(!__pyx_t_3)) __PYX_ERR(1, 9, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_3); __Pyx_DECREF(__pyx_t_3); __pyx_t_3 = 0; /* "(tree fragment)":8 * raise __pyx_PickleError("Incompatible checksums (%s vs 0x949a143 = (_items, frozen))" % __pyx_checksum) * __pyx_result = FrozenList.__new__(__pyx_type) * if __pyx_state is not None: # <<<<<<<<<<<<<< * __pyx_unpickle_FrozenList__set_state( __pyx_result, __pyx_state) * return __pyx_result */ } /* "(tree fragment)":10 * if __pyx_state is not None: * __pyx_unpickle_FrozenList__set_state( __pyx_result, __pyx_state) * return __pyx_result # <<<<<<<<<<<<<< * cdef __pyx_unpickle_FrozenList__set_state(FrozenList __pyx_result, tuple __pyx_state): * __pyx_result._items = __pyx_state[0]; __pyx_result.frozen = __pyx_state[1] */ __Pyx_XDECREF(__pyx_r); __Pyx_INCREF(__pyx_v___pyx_result); __pyx_r = __pyx_v___pyx_result; goto __pyx_L0; /* "(tree fragment)":1 * def __pyx_unpickle_FrozenList(__pyx_type, long __pyx_checksum, __pyx_state): # <<<<<<<<<<<<<< * cdef object __pyx_PickleError * cdef object __pyx_result */ /* function exit code */ __pyx_L1_error:; __Pyx_XDECREF(__pyx_t_2); __Pyx_XDECREF(__pyx_t_3); __Pyx_XDECREF(__pyx_t_4); __Pyx_XDECREF(__pyx_t_5); __Pyx_AddTraceback("aiohttp._frozenlist.__pyx_unpickle_FrozenList", __pyx_clineno, __pyx_lineno, __pyx_filename); __pyx_r = NULL; __pyx_L0:; __Pyx_XDECREF(__pyx_v___pyx_PickleError); __Pyx_XDECREF(__pyx_v___pyx_result); __Pyx_XGIVEREF(__pyx_r); __Pyx_RefNannyFinishContext(); return __pyx_r; } /* "(tree fragment)":11 * __pyx_unpickle_FrozenList__set_state( __pyx_result, __pyx_state) * return __pyx_result * cdef __pyx_unpickle_FrozenList__set_state(FrozenList __pyx_result, tuple __pyx_state): # <<<<<<<<<<<<<< * __pyx_result._items = __pyx_state[0]; __pyx_result.frozen = __pyx_state[1] * if len(__pyx_state) > 2 and hasattr(__pyx_result, '__dict__'): */ static PyObject *__pyx_f_7aiohttp_11_frozenlist___pyx_unpickle_FrozenList__set_state(struct __pyx_obj_7aiohttp_11_frozenlist_FrozenList *__pyx_v___pyx_result, PyObject *__pyx_v___pyx_state) { PyObject *__pyx_r = 
NULL; __Pyx_RefNannyDeclarations PyObject *__pyx_t_1 = NULL; int __pyx_t_2; Py_ssize_t __pyx_t_3; int __pyx_t_4; int __pyx_t_5; PyObject *__pyx_t_6 = NULL; PyObject *__pyx_t_7 = NULL; PyObject *__pyx_t_8 = NULL; __Pyx_RefNannySetupContext("__pyx_unpickle_FrozenList__set_state", 0); /* "(tree fragment)":12 * return __pyx_result * cdef __pyx_unpickle_FrozenList__set_state(FrozenList __pyx_result, tuple __pyx_state): * __pyx_result._items = __pyx_state[0]; __pyx_result.frozen = __pyx_state[1] # <<<<<<<<<<<<<< * if len(__pyx_state) > 2 and hasattr(__pyx_result, '__dict__'): * __pyx_result.__dict__.update(__pyx_state[2]) */ if (unlikely(__pyx_v___pyx_state == Py_None)) { PyErr_SetString(PyExc_TypeError, "'NoneType' object is not subscriptable"); __PYX_ERR(1, 12, __pyx_L1_error) } __pyx_t_1 = __Pyx_GetItemInt_Tuple(__pyx_v___pyx_state, 0, long, 1, __Pyx_PyInt_From_long, 0, 0, 1); if (unlikely(!__pyx_t_1)) __PYX_ERR(1, 12, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_1); if (!(likely(PyList_CheckExact(__pyx_t_1))||((__pyx_t_1) == Py_None)||(PyErr_Format(PyExc_TypeError, "Expected %.16s, got %.200s", "list", Py_TYPE(__pyx_t_1)->tp_name), 0))) __PYX_ERR(1, 12, __pyx_L1_error) __Pyx_GIVEREF(__pyx_t_1); __Pyx_GOTREF(__pyx_v___pyx_result->_items); __Pyx_DECREF(__pyx_v___pyx_result->_items); __pyx_v___pyx_result->_items = ((PyObject*)__pyx_t_1); __pyx_t_1 = 0; if (unlikely(__pyx_v___pyx_state == Py_None)) { PyErr_SetString(PyExc_TypeError, "'NoneType' object is not subscriptable"); __PYX_ERR(1, 12, __pyx_L1_error) } __pyx_t_1 = __Pyx_GetItemInt_Tuple(__pyx_v___pyx_state, 1, long, 1, __Pyx_PyInt_From_long, 0, 0, 1); if (unlikely(!__pyx_t_1)) __PYX_ERR(1, 12, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_1); __pyx_t_2 = __Pyx_PyObject_IsTrue(__pyx_t_1); if (unlikely((__pyx_t_2 == (int)-1) && PyErr_Occurred())) __PYX_ERR(1, 12, __pyx_L1_error) __Pyx_DECREF(__pyx_t_1); __pyx_t_1 = 0; __pyx_v___pyx_result->frozen = __pyx_t_2; /* "(tree fragment)":13 * cdef __pyx_unpickle_FrozenList__set_state(FrozenList __pyx_result, tuple __pyx_state): * __pyx_result._items = __pyx_state[0]; __pyx_result.frozen = __pyx_state[1] * if len(__pyx_state) > 2 and hasattr(__pyx_result, '__dict__'): # <<<<<<<<<<<<<< * __pyx_result.__dict__.update(__pyx_state[2]) */ if (unlikely(__pyx_v___pyx_state == Py_None)) { PyErr_SetString(PyExc_TypeError, "object of type 'NoneType' has no len()"); __PYX_ERR(1, 13, __pyx_L1_error) } __pyx_t_3 = PyTuple_GET_SIZE(__pyx_v___pyx_state); if (unlikely(__pyx_t_3 == ((Py_ssize_t)-1))) __PYX_ERR(1, 13, __pyx_L1_error) __pyx_t_4 = ((__pyx_t_3 > 2) != 0); if (__pyx_t_4) { } else { __pyx_t_2 = __pyx_t_4; goto __pyx_L4_bool_binop_done; } __pyx_t_4 = __Pyx_HasAttr(((PyObject *)__pyx_v___pyx_result), __pyx_n_s_dict); if (unlikely(__pyx_t_4 == ((int)-1))) __PYX_ERR(1, 13, __pyx_L1_error) __pyx_t_5 = (__pyx_t_4 != 0); __pyx_t_2 = __pyx_t_5; __pyx_L4_bool_binop_done:; if (__pyx_t_2) { /* "(tree fragment)":14 * __pyx_result._items = __pyx_state[0]; __pyx_result.frozen = __pyx_state[1] * if len(__pyx_state) > 2 and hasattr(__pyx_result, '__dict__'): * __pyx_result.__dict__.update(__pyx_state[2]) # <<<<<<<<<<<<<< */ __pyx_t_6 = __Pyx_PyObject_GetAttrStr(((PyObject *)__pyx_v___pyx_result), __pyx_n_s_dict); if (unlikely(!__pyx_t_6)) __PYX_ERR(1, 14, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_6); __pyx_t_7 = __Pyx_PyObject_GetAttrStr(__pyx_t_6, __pyx_n_s_update); if (unlikely(!__pyx_t_7)) __PYX_ERR(1, 14, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_7); __Pyx_DECREF(__pyx_t_6); __pyx_t_6 = 0; if (unlikely(__pyx_v___pyx_state == Py_None)) { 
PyErr_SetString(PyExc_TypeError, "'NoneType' object is not subscriptable"); __PYX_ERR(1, 14, __pyx_L1_error) } __pyx_t_6 = __Pyx_GetItemInt_Tuple(__pyx_v___pyx_state, 2, long, 1, __Pyx_PyInt_From_long, 0, 0, 1); if (unlikely(!__pyx_t_6)) __PYX_ERR(1, 14, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_6); __pyx_t_8 = NULL; if (CYTHON_UNPACK_METHODS && likely(PyMethod_Check(__pyx_t_7))) { __pyx_t_8 = PyMethod_GET_SELF(__pyx_t_7); if (likely(__pyx_t_8)) { PyObject* function = PyMethod_GET_FUNCTION(__pyx_t_7); __Pyx_INCREF(__pyx_t_8); __Pyx_INCREF(function); __Pyx_DECREF_SET(__pyx_t_7, function); } } __pyx_t_1 = (__pyx_t_8) ? __Pyx_PyObject_Call2Args(__pyx_t_7, __pyx_t_8, __pyx_t_6) : __Pyx_PyObject_CallOneArg(__pyx_t_7, __pyx_t_6); __Pyx_XDECREF(__pyx_t_8); __pyx_t_8 = 0; __Pyx_DECREF(__pyx_t_6); __pyx_t_6 = 0; if (unlikely(!__pyx_t_1)) __PYX_ERR(1, 14, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_1); __Pyx_DECREF(__pyx_t_7); __pyx_t_7 = 0; __Pyx_DECREF(__pyx_t_1); __pyx_t_1 = 0; /* "(tree fragment)":13 * cdef __pyx_unpickle_FrozenList__set_state(FrozenList __pyx_result, tuple __pyx_state): * __pyx_result._items = __pyx_state[0]; __pyx_result.frozen = __pyx_state[1] * if len(__pyx_state) > 2 and hasattr(__pyx_result, '__dict__'): # <<<<<<<<<<<<<< * __pyx_result.__dict__.update(__pyx_state[2]) */ } /* "(tree fragment)":11 * __pyx_unpickle_FrozenList__set_state( __pyx_result, __pyx_state) * return __pyx_result * cdef __pyx_unpickle_FrozenList__set_state(FrozenList __pyx_result, tuple __pyx_state): # <<<<<<<<<<<<<< * __pyx_result._items = __pyx_state[0]; __pyx_result.frozen = __pyx_state[1] * if len(__pyx_state) > 2 and hasattr(__pyx_result, '__dict__'): */ /* function exit code */ __pyx_r = Py_None; __Pyx_INCREF(Py_None); goto __pyx_L0; __pyx_L1_error:; __Pyx_XDECREF(__pyx_t_1); __Pyx_XDECREF(__pyx_t_6); __Pyx_XDECREF(__pyx_t_7); __Pyx_XDECREF(__pyx_t_8); __Pyx_AddTraceback("aiohttp._frozenlist.__pyx_unpickle_FrozenList__set_state", __pyx_clineno, __pyx_lineno, __pyx_filename); __pyx_r = 0; __pyx_L0:; __Pyx_XGIVEREF(__pyx_r); __Pyx_RefNannyFinishContext(); return __pyx_r; } static struct __pyx_vtabstruct_7aiohttp_11_frozenlist_FrozenList __pyx_vtable_7aiohttp_11_frozenlist_FrozenList; static PyObject *__pyx_tp_new_7aiohttp_11_frozenlist_FrozenList(PyTypeObject *t, CYTHON_UNUSED PyObject *a, CYTHON_UNUSED PyObject *k) { struct __pyx_obj_7aiohttp_11_frozenlist_FrozenList *p; PyObject *o; if (likely((t->tp_flags & Py_TPFLAGS_IS_ABSTRACT) == 0)) { o = (*t->tp_alloc)(t, 0); } else { o = (PyObject *) PyBaseObject_Type.tp_new(t, __pyx_empty_tuple, 0); } if (unlikely(!o)) return 0; p = ((struct __pyx_obj_7aiohttp_11_frozenlist_FrozenList *)o); p->__pyx_vtab = __pyx_vtabptr_7aiohttp_11_frozenlist_FrozenList; p->_items = ((PyObject*)Py_None); Py_INCREF(Py_None); return o; } static void __pyx_tp_dealloc_7aiohttp_11_frozenlist_FrozenList(PyObject *o) { struct __pyx_obj_7aiohttp_11_frozenlist_FrozenList *p = (struct __pyx_obj_7aiohttp_11_frozenlist_FrozenList *)o; #if CYTHON_USE_TP_FINALIZE if (unlikely(PyType_HasFeature(Py_TYPE(o), Py_TPFLAGS_HAVE_FINALIZE) && Py_TYPE(o)->tp_finalize) && !_PyGC_FINALIZED(o)) { if (PyObject_CallFinalizerFromDealloc(o)) return; } #endif PyObject_GC_UnTrack(o); Py_CLEAR(p->_items); (*Py_TYPE(o)->tp_free)(o); } static int __pyx_tp_traverse_7aiohttp_11_frozenlist_FrozenList(PyObject *o, visitproc v, void *a) { int e; struct __pyx_obj_7aiohttp_11_frozenlist_FrozenList *p = (struct __pyx_obj_7aiohttp_11_frozenlist_FrozenList *)o; if (p->_items) { e = (*v)(p->_items, a); if (e) return e; } return 
0; } static int __pyx_tp_clear_7aiohttp_11_frozenlist_FrozenList(PyObject *o) { PyObject* tmp; struct __pyx_obj_7aiohttp_11_frozenlist_FrozenList *p = (struct __pyx_obj_7aiohttp_11_frozenlist_FrozenList *)o; tmp = ((PyObject*)p->_items); p->_items = ((PyObject*)Py_None); Py_INCREF(Py_None); Py_XDECREF(tmp); return 0; } static PyObject *__pyx_sq_item_7aiohttp_11_frozenlist_FrozenList(PyObject *o, Py_ssize_t i) { PyObject *r; PyObject *x = PyInt_FromSsize_t(i); if(!x) return 0; r = Py_TYPE(o)->tp_as_mapping->mp_subscript(o, x); Py_DECREF(x); return r; } static int __pyx_mp_ass_subscript_7aiohttp_11_frozenlist_FrozenList(PyObject *o, PyObject *i, PyObject *v) { if (v) { return __pyx_pw_7aiohttp_11_frozenlist_10FrozenList_7__setitem__(o, i, v); } else { return __pyx_pw_7aiohttp_11_frozenlist_10FrozenList_9__delitem__(o, i); } } static PyObject *__pyx_getprop_7aiohttp_11_frozenlist_10FrozenList_frozen(PyObject *o, CYTHON_UNUSED void *x) { return __pyx_pw_7aiohttp_11_frozenlist_10FrozenList_6frozen_1__get__(o); } static PyMethodDef __pyx_methods_7aiohttp_11_frozenlist_FrozenList[] = { {"freeze", (PyCFunction)__pyx_pw_7aiohttp_11_frozenlist_10FrozenList_3freeze, METH_NOARGS, 0}, {"__reversed__", (PyCFunction)__pyx_pw_7aiohttp_11_frozenlist_10FrozenList_15__reversed__, METH_NOARGS, 0}, {"insert", (PyCFunction)(void*)(PyCFunctionWithKeywords)__pyx_pw_7aiohttp_11_frozenlist_10FrozenList_19insert, METH_VARARGS|METH_KEYWORDS, 0}, {"index", (PyCFunction)__pyx_pw_7aiohttp_11_frozenlist_10FrozenList_25index, METH_O, 0}, {"remove", (PyCFunction)__pyx_pw_7aiohttp_11_frozenlist_10FrozenList_27remove, METH_O, 0}, {"clear", (PyCFunction)__pyx_pw_7aiohttp_11_frozenlist_10FrozenList_29clear, METH_NOARGS, 0}, {"extend", (PyCFunction)__pyx_pw_7aiohttp_11_frozenlist_10FrozenList_31extend, METH_O, 0}, {"reverse", (PyCFunction)__pyx_pw_7aiohttp_11_frozenlist_10FrozenList_33reverse, METH_NOARGS, 0}, {"pop", (PyCFunction)(void*)(PyCFunctionWithKeywords)__pyx_pw_7aiohttp_11_frozenlist_10FrozenList_35pop, METH_VARARGS|METH_KEYWORDS, 0}, {"append", (PyCFunction)__pyx_pw_7aiohttp_11_frozenlist_10FrozenList_37append, METH_O, 0}, {"count", (PyCFunction)__pyx_pw_7aiohttp_11_frozenlist_10FrozenList_39count, METH_O, 0}, {"__reduce_cython__", (PyCFunction)__pyx_pw_7aiohttp_11_frozenlist_10FrozenList_43__reduce_cython__, METH_NOARGS, 0}, {"__setstate_cython__", (PyCFunction)__pyx_pw_7aiohttp_11_frozenlist_10FrozenList_45__setstate_cython__, METH_O, 0}, {0, 0, 0, 0} }; static struct PyGetSetDef __pyx_getsets_7aiohttp_11_frozenlist_FrozenList[] = { {(char *)"frozen", __pyx_getprop_7aiohttp_11_frozenlist_10FrozenList_frozen, 0, (char *)0, 0}, {0, 0, 0, 0, 0} }; static PyNumberMethods __pyx_tp_as_number_FrozenList = { 0, /*nb_add*/ 0, /*nb_subtract*/ 0, /*nb_multiply*/ #if PY_MAJOR_VERSION < 3 || (CYTHON_COMPILING_IN_PYPY && PY_VERSION_HEX < 0x03050000) 0, /*nb_divide*/ #endif 0, /*nb_remainder*/ 0, /*nb_divmod*/ 0, /*nb_power*/ 0, /*nb_negative*/ 0, /*nb_positive*/ 0, /*nb_absolute*/ 0, /*nb_nonzero*/ 0, /*nb_invert*/ 0, /*nb_lshift*/ 0, /*nb_rshift*/ 0, /*nb_and*/ 0, /*nb_xor*/ 0, /*nb_or*/ #if PY_MAJOR_VERSION < 3 || (CYTHON_COMPILING_IN_PYPY && PY_VERSION_HEX < 0x03050000) 0, /*nb_coerce*/ #endif 0, /*nb_int*/ #if PY_MAJOR_VERSION < 3 0, /*nb_long*/ #else 0, /*reserved*/ #endif 0, /*nb_float*/ #if PY_MAJOR_VERSION < 3 || (CYTHON_COMPILING_IN_PYPY && PY_VERSION_HEX < 0x03050000) 0, /*nb_oct*/ #endif #if PY_MAJOR_VERSION < 3 || (CYTHON_COMPILING_IN_PYPY && PY_VERSION_HEX < 0x03050000) 0, /*nb_hex*/ #endif 
__pyx_pw_7aiohttp_11_frozenlist_10FrozenList_23__iadd__, /*nb_inplace_add*/ 0, /*nb_inplace_subtract*/ 0, /*nb_inplace_multiply*/ #if PY_MAJOR_VERSION < 3 || (CYTHON_COMPILING_IN_PYPY && PY_VERSION_HEX < 0x03050000) 0, /*nb_inplace_divide*/ #endif 0, /*nb_inplace_remainder*/ 0, /*nb_inplace_power*/ 0, /*nb_inplace_lshift*/ 0, /*nb_inplace_rshift*/ 0, /*nb_inplace_and*/ 0, /*nb_inplace_xor*/ 0, /*nb_inplace_or*/ 0, /*nb_floor_divide*/ 0, /*nb_true_divide*/ 0, /*nb_inplace_floor_divide*/ 0, /*nb_inplace_true_divide*/ 0, /*nb_index*/ #if PY_VERSION_HEX >= 0x03050000 0, /*nb_matrix_multiply*/ #endif #if PY_VERSION_HEX >= 0x03050000 0, /*nb_inplace_matrix_multiply*/ #endif }; static PySequenceMethods __pyx_tp_as_sequence_FrozenList = { __pyx_pw_7aiohttp_11_frozenlist_10FrozenList_11__len__, /*sq_length*/ 0, /*sq_concat*/ 0, /*sq_repeat*/ __pyx_sq_item_7aiohttp_11_frozenlist_FrozenList, /*sq_item*/ 0, /*sq_slice*/ 0, /*sq_ass_item*/ 0, /*sq_ass_slice*/ __pyx_pw_7aiohttp_11_frozenlist_10FrozenList_21__contains__, /*sq_contains*/ 0, /*sq_inplace_concat*/ 0, /*sq_inplace_repeat*/ }; static PyMappingMethods __pyx_tp_as_mapping_FrozenList = { __pyx_pw_7aiohttp_11_frozenlist_10FrozenList_11__len__, /*mp_length*/ __pyx_pw_7aiohttp_11_frozenlist_10FrozenList_5__getitem__, /*mp_subscript*/ __pyx_mp_ass_subscript_7aiohttp_11_frozenlist_FrozenList, /*mp_ass_subscript*/ }; static PyTypeObject __pyx_type_7aiohttp_11_frozenlist_FrozenList = { PyVarObject_HEAD_INIT(0, 0) "aiohttp._frozenlist.FrozenList", /*tp_name*/ sizeof(struct __pyx_obj_7aiohttp_11_frozenlist_FrozenList), /*tp_basicsize*/ 0, /*tp_itemsize*/ __pyx_tp_dealloc_7aiohttp_11_frozenlist_FrozenList, /*tp_dealloc*/ 0, /*tp_print*/ 0, /*tp_getattr*/ 0, /*tp_setattr*/ #if PY_MAJOR_VERSION < 3 0, /*tp_compare*/ #endif #if PY_MAJOR_VERSION >= 3 0, /*tp_as_async*/ #endif __pyx_pw_7aiohttp_11_frozenlist_10FrozenList_41__repr__, /*tp_repr*/ &__pyx_tp_as_number_FrozenList, /*tp_as_number*/ &__pyx_tp_as_sequence_FrozenList, /*tp_as_sequence*/ &__pyx_tp_as_mapping_FrozenList, /*tp_as_mapping*/ 0, /*tp_hash*/ 0, /*tp_call*/ 0, /*tp_str*/ 0, /*tp_getattro*/ 0, /*tp_setattro*/ 0, /*tp_as_buffer*/ Py_TPFLAGS_DEFAULT|Py_TPFLAGS_HAVE_VERSION_TAG|Py_TPFLAGS_CHECKTYPES|Py_TPFLAGS_HAVE_NEWBUFFER|Py_TPFLAGS_BASETYPE|Py_TPFLAGS_HAVE_GC, /*tp_flags*/ 0, /*tp_doc*/ __pyx_tp_traverse_7aiohttp_11_frozenlist_FrozenList, /*tp_traverse*/ __pyx_tp_clear_7aiohttp_11_frozenlist_FrozenList, /*tp_clear*/ __pyx_pw_7aiohttp_11_frozenlist_10FrozenList_17__richcmp__, /*tp_richcompare*/ 0, /*tp_weaklistoffset*/ __pyx_pw_7aiohttp_11_frozenlist_10FrozenList_13__iter__, /*tp_iter*/ 0, /*tp_iternext*/ __pyx_methods_7aiohttp_11_frozenlist_FrozenList, /*tp_methods*/ 0, /*tp_members*/ __pyx_getsets_7aiohttp_11_frozenlist_FrozenList, /*tp_getset*/ 0, /*tp_base*/ 0, /*tp_dict*/ 0, /*tp_descr_get*/ 0, /*tp_descr_set*/ 0, /*tp_dictoffset*/ __pyx_pw_7aiohttp_11_frozenlist_10FrozenList_1__init__, /*tp_init*/ 0, /*tp_alloc*/ __pyx_tp_new_7aiohttp_11_frozenlist_FrozenList, /*tp_new*/ 0, /*tp_free*/ 0, /*tp_is_gc*/ 0, /*tp_bases*/ 0, /*tp_mro*/ 0, /*tp_cache*/ 0, /*tp_subclasses*/ 0, /*tp_weaklist*/ 0, /*tp_del*/ 0, /*tp_version_tag*/ #if PY_VERSION_HEX >= 0x030400a1 0, /*tp_finalize*/ #endif #if PY_VERSION_HEX >= 0x030800b1 0, /*tp_vectorcall*/ #endif }; static PyMethodDef __pyx_methods[] = { {0, 0, 0, 0} }; #if PY_MAJOR_VERSION >= 3 #if CYTHON_PEP489_MULTI_PHASE_INIT static PyObject* __pyx_pymod_create(PyObject *spec, PyModuleDef *def); /*proto*/ static int __pyx_pymod_exec__frozenlist(PyObject* 
module); /*proto*/ static PyModuleDef_Slot __pyx_moduledef_slots[] = { {Py_mod_create, (void*)__pyx_pymod_create}, {Py_mod_exec, (void*)__pyx_pymod_exec__frozenlist}, {0, NULL} }; #endif static struct PyModuleDef __pyx_moduledef = { PyModuleDef_HEAD_INIT, "_frozenlist", 0, /* m_doc */ #if CYTHON_PEP489_MULTI_PHASE_INIT 0, /* m_size */ #else -1, /* m_size */ #endif __pyx_methods /* m_methods */, #if CYTHON_PEP489_MULTI_PHASE_INIT __pyx_moduledef_slots, /* m_slots */ #else NULL, /* m_reload */ #endif NULL, /* m_traverse */ NULL, /* m_clear */ NULL /* m_free */ }; #endif #ifndef CYTHON_SMALL_CODE #if defined(__clang__) #define CYTHON_SMALL_CODE #elif defined(__GNUC__) && (__GNUC__ > 4 || (__GNUC__ == 4 && __GNUC_MINOR__ >= 3)) #define CYTHON_SMALL_CODE __attribute__((cold)) #else #define CYTHON_SMALL_CODE #endif #endif static __Pyx_StringTabEntry __pyx_string_tab[] = { {&__pyx_kp_u_Cannot_modify_frozen_list, __pyx_k_Cannot_modify_frozen_list, sizeof(__pyx_k_Cannot_modify_frozen_list), 0, 1, 0, 0}, {&__pyx_n_s_FrozenList, __pyx_k_FrozenList, sizeof(__pyx_k_FrozenList), 0, 0, 1, 1}, {&__pyx_kp_u_FrozenList_frozen_r, __pyx_k_FrozenList_frozen_r, sizeof(__pyx_k_FrozenList_frozen_r), 0, 1, 0, 0}, {&__pyx_kp_s_Incompatible_checksums_s_vs_0x94, __pyx_k_Incompatible_checksums_s_vs_0x94, sizeof(__pyx_k_Incompatible_checksums_s_vs_0x94), 0, 0, 1, 0}, {&__pyx_n_s_MutableSequence, __pyx_k_MutableSequence, sizeof(__pyx_k_MutableSequence), 0, 0, 1, 1}, {&__pyx_n_s_PickleError, __pyx_k_PickleError, sizeof(__pyx_k_PickleError), 0, 0, 1, 1}, {&__pyx_n_s_RuntimeError, __pyx_k_RuntimeError, sizeof(__pyx_k_RuntimeError), 0, 0, 1, 1}, {&__pyx_n_s_aiohttp__frozenlist, __pyx_k_aiohttp__frozenlist, sizeof(__pyx_k_aiohttp__frozenlist), 0, 0, 1, 1}, {&__pyx_n_s_clear, __pyx_k_clear, sizeof(__pyx_k_clear), 0, 0, 1, 1}, {&__pyx_n_s_cline_in_traceback, __pyx_k_cline_in_traceback, sizeof(__pyx_k_cline_in_traceback), 0, 0, 1, 1}, {&__pyx_n_s_collections_abc, __pyx_k_collections_abc, sizeof(__pyx_k_collections_abc), 0, 0, 1, 1}, {&__pyx_n_s_count, __pyx_k_count, sizeof(__pyx_k_count), 0, 0, 1, 1}, {&__pyx_n_s_dict, __pyx_k_dict, sizeof(__pyx_k_dict), 0, 0, 1, 1}, {&__pyx_n_s_format, __pyx_k_format, sizeof(__pyx_k_format), 0, 0, 1, 1}, {&__pyx_n_s_getstate, __pyx_k_getstate, sizeof(__pyx_k_getstate), 0, 0, 1, 1}, {&__pyx_n_s_import, __pyx_k_import, sizeof(__pyx_k_import), 0, 0, 1, 1}, {&__pyx_n_s_index, __pyx_k_index, sizeof(__pyx_k_index), 0, 0, 1, 1}, {&__pyx_n_s_item, __pyx_k_item, sizeof(__pyx_k_item), 0, 0, 1, 1}, {&__pyx_n_s_items, __pyx_k_items, sizeof(__pyx_k_items), 0, 0, 1, 1}, {&__pyx_n_s_iter, __pyx_k_iter, sizeof(__pyx_k_iter), 0, 0, 1, 1}, {&__pyx_n_s_main, __pyx_k_main, sizeof(__pyx_k_main), 0, 0, 1, 1}, {&__pyx_n_s_name, __pyx_k_name, sizeof(__pyx_k_name), 0, 0, 1, 1}, {&__pyx_n_s_new, __pyx_k_new, sizeof(__pyx_k_new), 0, 0, 1, 1}, {&__pyx_n_s_pickle, __pyx_k_pickle, sizeof(__pyx_k_pickle), 0, 0, 1, 1}, {&__pyx_n_s_pop, __pyx_k_pop, sizeof(__pyx_k_pop), 0, 0, 1, 1}, {&__pyx_n_s_pos, __pyx_k_pos, sizeof(__pyx_k_pos), 0, 0, 1, 1}, {&__pyx_n_s_pyx_PickleError, __pyx_k_pyx_PickleError, sizeof(__pyx_k_pyx_PickleError), 0, 0, 1, 1}, {&__pyx_n_s_pyx_checksum, __pyx_k_pyx_checksum, sizeof(__pyx_k_pyx_checksum), 0, 0, 1, 1}, {&__pyx_n_s_pyx_result, __pyx_k_pyx_result, sizeof(__pyx_k_pyx_result), 0, 0, 1, 1}, {&__pyx_n_s_pyx_state, __pyx_k_pyx_state, sizeof(__pyx_k_pyx_state), 0, 0, 1, 1}, {&__pyx_n_s_pyx_type, __pyx_k_pyx_type, sizeof(__pyx_k_pyx_type), 0, 0, 1, 1}, {&__pyx_n_s_pyx_unpickle_FrozenList, 
__pyx_k_pyx_unpickle_FrozenList, sizeof(__pyx_k_pyx_unpickle_FrozenList), 0, 0, 1, 1}, {&__pyx_n_s_pyx_vtable, __pyx_k_pyx_vtable, sizeof(__pyx_k_pyx_vtable), 0, 0, 1, 1}, {&__pyx_n_s_reduce, __pyx_k_reduce, sizeof(__pyx_k_reduce), 0, 0, 1, 1}, {&__pyx_n_s_reduce_cython, __pyx_k_reduce_cython, sizeof(__pyx_k_reduce_cython), 0, 0, 1, 1}, {&__pyx_n_s_reduce_ex, __pyx_k_reduce_ex, sizeof(__pyx_k_reduce_ex), 0, 0, 1, 1}, {&__pyx_n_s_register, __pyx_k_register, sizeof(__pyx_k_register), 0, 0, 1, 1}, {&__pyx_n_s_remove, __pyx_k_remove, sizeof(__pyx_k_remove), 0, 0, 1, 1}, {&__pyx_n_s_reversed, __pyx_k_reversed, sizeof(__pyx_k_reversed), 0, 0, 1, 1}, {&__pyx_n_s_setstate, __pyx_k_setstate, sizeof(__pyx_k_setstate), 0, 0, 1, 1}, {&__pyx_n_s_setstate_cython, __pyx_k_setstate_cython, sizeof(__pyx_k_setstate_cython), 0, 0, 1, 1}, {&__pyx_kp_s_stringsource, __pyx_k_stringsource, sizeof(__pyx_k_stringsource), 0, 0, 1, 0}, {&__pyx_n_s_test, __pyx_k_test, sizeof(__pyx_k_test), 0, 0, 1, 1}, {&__pyx_n_s_update, __pyx_k_update, sizeof(__pyx_k_update), 0, 0, 1, 1}, {0, 0, 0, 0, 0, 0, 0} }; static CYTHON_SMALL_CODE int __Pyx_InitCachedBuiltins(void) { __pyx_builtin_RuntimeError = __Pyx_GetBuiltinName(__pyx_n_s_RuntimeError); if (!__pyx_builtin_RuntimeError) __PYX_ERR(0, 19, __pyx_L1_error) return 0; __pyx_L1_error:; return -1; } static CYTHON_SMALL_CODE int __Pyx_InitCachedConstants(void) { __Pyx_RefNannyDeclarations __Pyx_RefNannySetupContext("__Pyx_InitCachedConstants", 0); /* "aiohttp/_frozenlist.pyx":19 * cdef object _check_frozen(self): * if self.frozen: * raise RuntimeError("Cannot modify frozen list.") # <<<<<<<<<<<<<< * * cdef inline object _fast_len(self): */ __pyx_tuple_ = PyTuple_Pack(1, __pyx_kp_u_Cannot_modify_frozen_list); if (unlikely(!__pyx_tuple_)) __PYX_ERR(0, 19, __pyx_L1_error) __Pyx_GOTREF(__pyx_tuple_); __Pyx_GIVEREF(__pyx_tuple_); /* "(tree fragment)":1 * def __pyx_unpickle_FrozenList(__pyx_type, long __pyx_checksum, __pyx_state): # <<<<<<<<<<<<<< * cdef object __pyx_PickleError * cdef object __pyx_result */ __pyx_tuple__2 = PyTuple_Pack(5, __pyx_n_s_pyx_type, __pyx_n_s_pyx_checksum, __pyx_n_s_pyx_state, __pyx_n_s_pyx_PickleError, __pyx_n_s_pyx_result); if (unlikely(!__pyx_tuple__2)) __PYX_ERR(1, 1, __pyx_L1_error) __Pyx_GOTREF(__pyx_tuple__2); __Pyx_GIVEREF(__pyx_tuple__2); __pyx_codeobj__3 = (PyObject*)__Pyx_PyCode_New(3, 0, 5, 0, CO_OPTIMIZED|CO_NEWLOCALS, __pyx_empty_bytes, __pyx_empty_tuple, __pyx_empty_tuple, __pyx_tuple__2, __pyx_empty_tuple, __pyx_empty_tuple, __pyx_kp_s_stringsource, __pyx_n_s_pyx_unpickle_FrozenList, 1, __pyx_empty_bytes); if (unlikely(!__pyx_codeobj__3)) __PYX_ERR(1, 1, __pyx_L1_error) __Pyx_RefNannyFinishContext(); return 0; __pyx_L1_error:; __Pyx_RefNannyFinishContext(); return -1; } static CYTHON_SMALL_CODE int __Pyx_InitGlobals(void) { if (__Pyx_InitStrings(__pyx_string_tab) < 0) __PYX_ERR(0, 1, __pyx_L1_error); __pyx_int_0 = PyInt_FromLong(0); if (unlikely(!__pyx_int_0)) __PYX_ERR(0, 1, __pyx_L1_error) __pyx_int_1 = PyInt_FromLong(1); if (unlikely(!__pyx_int_1)) __PYX_ERR(0, 1, __pyx_L1_error) __pyx_int_2 = PyInt_FromLong(2); if (unlikely(!__pyx_int_2)) __PYX_ERR(0, 1, __pyx_L1_error) __pyx_int_3 = PyInt_FromLong(3); if (unlikely(!__pyx_int_3)) __PYX_ERR(0, 1, __pyx_L1_error) __pyx_int_4 = PyInt_FromLong(4); if (unlikely(!__pyx_int_4)) __PYX_ERR(0, 1, __pyx_L1_error) __pyx_int_5 = PyInt_FromLong(5); if (unlikely(!__pyx_int_5)) __PYX_ERR(0, 1, __pyx_L1_error) __pyx_int_155820355 = PyInt_FromLong(155820355L); if (unlikely(!__pyx_int_155820355)) 
__PYX_ERR(0, 1, __pyx_L1_error) __pyx_int_neg_1 = PyInt_FromLong(-1); if (unlikely(!__pyx_int_neg_1)) __PYX_ERR(0, 1, __pyx_L1_error) return 0; __pyx_L1_error:; return -1; } static CYTHON_SMALL_CODE int __Pyx_modinit_global_init_code(void); /*proto*/ static CYTHON_SMALL_CODE int __Pyx_modinit_variable_export_code(void); /*proto*/ static CYTHON_SMALL_CODE int __Pyx_modinit_function_export_code(void); /*proto*/ static CYTHON_SMALL_CODE int __Pyx_modinit_type_init_code(void); /*proto*/ static CYTHON_SMALL_CODE int __Pyx_modinit_type_import_code(void); /*proto*/ static CYTHON_SMALL_CODE int __Pyx_modinit_variable_import_code(void); /*proto*/ static CYTHON_SMALL_CODE int __Pyx_modinit_function_import_code(void); /*proto*/ static int __Pyx_modinit_global_init_code(void) { __Pyx_RefNannyDeclarations __Pyx_RefNannySetupContext("__Pyx_modinit_global_init_code", 0); /*--- Global init code ---*/ __Pyx_RefNannyFinishContext(); return 0; } static int __Pyx_modinit_variable_export_code(void) { __Pyx_RefNannyDeclarations __Pyx_RefNannySetupContext("__Pyx_modinit_variable_export_code", 0); /*--- Variable export code ---*/ __Pyx_RefNannyFinishContext(); return 0; } static int __Pyx_modinit_function_export_code(void) { __Pyx_RefNannyDeclarations __Pyx_RefNannySetupContext("__Pyx_modinit_function_export_code", 0); /*--- Function export code ---*/ __Pyx_RefNannyFinishContext(); return 0; } static int __Pyx_modinit_type_init_code(void) { __Pyx_RefNannyDeclarations __Pyx_RefNannySetupContext("__Pyx_modinit_type_init_code", 0); /*--- Type init code ---*/ __pyx_vtabptr_7aiohttp_11_frozenlist_FrozenList = &__pyx_vtable_7aiohttp_11_frozenlist_FrozenList; __pyx_vtable_7aiohttp_11_frozenlist_FrozenList._check_frozen = (PyObject *(*)(struct __pyx_obj_7aiohttp_11_frozenlist_FrozenList *))__pyx_f_7aiohttp_11_frozenlist_10FrozenList__check_frozen; __pyx_vtable_7aiohttp_11_frozenlist_FrozenList._fast_len = (PyObject *(*)(struct __pyx_obj_7aiohttp_11_frozenlist_FrozenList *))__pyx_f_7aiohttp_11_frozenlist_10FrozenList__fast_len; if (PyType_Ready(&__pyx_type_7aiohttp_11_frozenlist_FrozenList) < 0) __PYX_ERR(0, 4, __pyx_L1_error) #if PY_VERSION_HEX < 0x030800B1 __pyx_type_7aiohttp_11_frozenlist_FrozenList.tp_print = 0; #endif if ((CYTHON_USE_TYPE_SLOTS && CYTHON_USE_PYTYPE_LOOKUP) && likely(!__pyx_type_7aiohttp_11_frozenlist_FrozenList.tp_dictoffset && __pyx_type_7aiohttp_11_frozenlist_FrozenList.tp_getattro == PyObject_GenericGetAttr)) { __pyx_type_7aiohttp_11_frozenlist_FrozenList.tp_getattro = __Pyx_PyObject_GenericGetAttr; } if (__Pyx_SetVtable(__pyx_type_7aiohttp_11_frozenlist_FrozenList.tp_dict, __pyx_vtabptr_7aiohttp_11_frozenlist_FrozenList) < 0) __PYX_ERR(0, 4, __pyx_L1_error) if (PyObject_SetAttr(__pyx_m, __pyx_n_s_FrozenList, (PyObject *)&__pyx_type_7aiohttp_11_frozenlist_FrozenList) < 0) __PYX_ERR(0, 4, __pyx_L1_error) if (__Pyx_setup_reduce((PyObject*)&__pyx_type_7aiohttp_11_frozenlist_FrozenList) < 0) __PYX_ERR(0, 4, __pyx_L1_error) __pyx_ptype_7aiohttp_11_frozenlist_FrozenList = &__pyx_type_7aiohttp_11_frozenlist_FrozenList; __Pyx_RefNannyFinishContext(); return 0; __pyx_L1_error:; __Pyx_RefNannyFinishContext(); return -1; } static int __Pyx_modinit_type_import_code(void) { __Pyx_RefNannyDeclarations __Pyx_RefNannySetupContext("__Pyx_modinit_type_import_code", 0); /*--- Type import code ---*/ __Pyx_RefNannyFinishContext(); return 0; } static int __Pyx_modinit_variable_import_code(void) { __Pyx_RefNannyDeclarations __Pyx_RefNannySetupContext("__Pyx_modinit_variable_import_code", 0); /*--- Variable import code 
---*/ __Pyx_RefNannyFinishContext(); return 0; } static int __Pyx_modinit_function_import_code(void) { __Pyx_RefNannyDeclarations __Pyx_RefNannySetupContext("__Pyx_modinit_function_import_code", 0); /*--- Function import code ---*/ __Pyx_RefNannyFinishContext(); return 0; } #if PY_MAJOR_VERSION < 3 #ifdef CYTHON_NO_PYINIT_EXPORT #define __Pyx_PyMODINIT_FUNC void #else #define __Pyx_PyMODINIT_FUNC PyMODINIT_FUNC #endif #else #ifdef CYTHON_NO_PYINIT_EXPORT #define __Pyx_PyMODINIT_FUNC PyObject * #else #define __Pyx_PyMODINIT_FUNC PyMODINIT_FUNC #endif #endif #if PY_MAJOR_VERSION < 3 __Pyx_PyMODINIT_FUNC init_frozenlist(void) CYTHON_SMALL_CODE; /*proto*/ __Pyx_PyMODINIT_FUNC init_frozenlist(void) #else __Pyx_PyMODINIT_FUNC PyInit__frozenlist(void) CYTHON_SMALL_CODE; /*proto*/ __Pyx_PyMODINIT_FUNC PyInit__frozenlist(void) #if CYTHON_PEP489_MULTI_PHASE_INIT { return PyModuleDef_Init(&__pyx_moduledef); } static CYTHON_SMALL_CODE int __Pyx_check_single_interpreter(void) { #if PY_VERSION_HEX >= 0x030700A1 static PY_INT64_T main_interpreter_id = -1; PY_INT64_T current_id = PyInterpreterState_GetID(PyThreadState_Get()->interp); if (main_interpreter_id == -1) { main_interpreter_id = current_id; return (unlikely(current_id == -1)) ? -1 : 0; } else if (unlikely(main_interpreter_id != current_id)) #else static PyInterpreterState *main_interpreter = NULL; PyInterpreterState *current_interpreter = PyThreadState_Get()->interp; if (!main_interpreter) { main_interpreter = current_interpreter; } else if (unlikely(main_interpreter != current_interpreter)) #endif { PyErr_SetString( PyExc_ImportError, "Interpreter change detected - this module can only be loaded into one interpreter per process."); return -1; } return 0; } static CYTHON_SMALL_CODE int __Pyx_copy_spec_to_module(PyObject *spec, PyObject *moddict, const char* from_name, const char* to_name, int allow_none) { PyObject *value = PyObject_GetAttrString(spec, from_name); int result = 0; if (likely(value)) { if (allow_none || value != Py_None) { result = PyDict_SetItemString(moddict, to_name, value); } Py_DECREF(value); } else if (PyErr_ExceptionMatches(PyExc_AttributeError)) { PyErr_Clear(); } else { result = -1; } return result; } static CYTHON_SMALL_CODE PyObject* __pyx_pymod_create(PyObject *spec, CYTHON_UNUSED PyModuleDef *def) { PyObject *module = NULL, *moddict, *modname; if (__Pyx_check_single_interpreter()) return NULL; if (__pyx_m) return __Pyx_NewRef(__pyx_m); modname = PyObject_GetAttrString(spec, "name"); if (unlikely(!modname)) goto bad; module = PyModule_NewObject(modname); Py_DECREF(modname); if (unlikely(!module)) goto bad; moddict = PyModule_GetDict(module); if (unlikely(!moddict)) goto bad; if (unlikely(__Pyx_copy_spec_to_module(spec, moddict, "loader", "__loader__", 1) < 0)) goto bad; if (unlikely(__Pyx_copy_spec_to_module(spec, moddict, "origin", "__file__", 1) < 0)) goto bad; if (unlikely(__Pyx_copy_spec_to_module(spec, moddict, "parent", "__package__", 1) < 0)) goto bad; if (unlikely(__Pyx_copy_spec_to_module(spec, moddict, "submodule_search_locations", "__path__", 0) < 0)) goto bad; return module; bad: Py_XDECREF(module); return NULL; } static CYTHON_SMALL_CODE int __pyx_pymod_exec__frozenlist(PyObject *__pyx_pyinit_module) #endif #endif { PyObject *__pyx_t_1 = NULL; PyObject *__pyx_t_2 = NULL; __Pyx_RefNannyDeclarations #if CYTHON_PEP489_MULTI_PHASE_INIT if (__pyx_m) { if (__pyx_m == __pyx_pyinit_module) return 0; PyErr_SetString(PyExc_RuntimeError, "Module '_frozenlist' has already been imported. 
Re-initialisation is not supported."); return -1; } #elif PY_MAJOR_VERSION >= 3 if (__pyx_m) return __Pyx_NewRef(__pyx_m); #endif #if CYTHON_REFNANNY __Pyx_RefNanny = __Pyx_RefNannyImportAPI("refnanny"); if (!__Pyx_RefNanny) { PyErr_Clear(); __Pyx_RefNanny = __Pyx_RefNannyImportAPI("Cython.Runtime.refnanny"); if (!__Pyx_RefNanny) Py_FatalError("failed to import 'refnanny' module"); } #endif __Pyx_RefNannySetupContext("__Pyx_PyMODINIT_FUNC PyInit__frozenlist(void)", 0); if (__Pyx_check_binary_version() < 0) __PYX_ERR(0, 1, __pyx_L1_error) #ifdef __Pxy_PyFrame_Initialize_Offsets __Pxy_PyFrame_Initialize_Offsets(); #endif __pyx_empty_tuple = PyTuple_New(0); if (unlikely(!__pyx_empty_tuple)) __PYX_ERR(0, 1, __pyx_L1_error) __pyx_empty_bytes = PyBytes_FromStringAndSize("", 0); if (unlikely(!__pyx_empty_bytes)) __PYX_ERR(0, 1, __pyx_L1_error) __pyx_empty_unicode = PyUnicode_FromStringAndSize("", 0); if (unlikely(!__pyx_empty_unicode)) __PYX_ERR(0, 1, __pyx_L1_error) #ifdef __Pyx_CyFunction_USED if (__pyx_CyFunction_init() < 0) __PYX_ERR(0, 1, __pyx_L1_error) #endif #ifdef __Pyx_FusedFunction_USED if (__pyx_FusedFunction_init() < 0) __PYX_ERR(0, 1, __pyx_L1_error) #endif #ifdef __Pyx_Coroutine_USED if (__pyx_Coroutine_init() < 0) __PYX_ERR(0, 1, __pyx_L1_error) #endif #ifdef __Pyx_Generator_USED if (__pyx_Generator_init() < 0) __PYX_ERR(0, 1, __pyx_L1_error) #endif #ifdef __Pyx_AsyncGen_USED if (__pyx_AsyncGen_init() < 0) __PYX_ERR(0, 1, __pyx_L1_error) #endif #ifdef __Pyx_StopAsyncIteration_USED if (__pyx_StopAsyncIteration_init() < 0) __PYX_ERR(0, 1, __pyx_L1_error) #endif /*--- Library function declarations ---*/ /*--- Threads initialization code ---*/ #if defined(__PYX_FORCE_INIT_THREADS) && __PYX_FORCE_INIT_THREADS #ifdef WITH_THREAD /* Python build with threading support? */ PyEval_InitThreads(); #endif #endif /*--- Module creation code ---*/ #if CYTHON_PEP489_MULTI_PHASE_INIT __pyx_m = __pyx_pyinit_module; Py_INCREF(__pyx_m); #else #if PY_MAJOR_VERSION < 3 __pyx_m = Py_InitModule4("_frozenlist", __pyx_methods, 0, 0, PYTHON_API_VERSION); Py_XINCREF(__pyx_m); #else __pyx_m = PyModule_Create(&__pyx_moduledef); #endif if (unlikely(!__pyx_m)) __PYX_ERR(0, 1, __pyx_L1_error) #endif __pyx_d = PyModule_GetDict(__pyx_m); if (unlikely(!__pyx_d)) __PYX_ERR(0, 1, __pyx_L1_error) Py_INCREF(__pyx_d); __pyx_b = PyImport_AddModule(__Pyx_BUILTIN_MODULE_NAME); if (unlikely(!__pyx_b)) __PYX_ERR(0, 1, __pyx_L1_error) Py_INCREF(__pyx_b); __pyx_cython_runtime = PyImport_AddModule((char *) "cython_runtime"); if (unlikely(!__pyx_cython_runtime)) __PYX_ERR(0, 1, __pyx_L1_error) Py_INCREF(__pyx_cython_runtime); if (PyObject_SetAttrString(__pyx_m, "__builtins__", __pyx_b) < 0) __PYX_ERR(0, 1, __pyx_L1_error); /*--- Initialize various global constants etc. 
---*/ if (__Pyx_InitGlobals() < 0) __PYX_ERR(0, 1, __pyx_L1_error) #if PY_MAJOR_VERSION < 3 && (__PYX_DEFAULT_STRING_ENCODING_IS_ASCII || __PYX_DEFAULT_STRING_ENCODING_IS_DEFAULT) if (__Pyx_init_sys_getdefaultencoding_params() < 0) __PYX_ERR(0, 1, __pyx_L1_error) #endif if (__pyx_module_is_main_aiohttp___frozenlist) { if (PyObject_SetAttr(__pyx_m, __pyx_n_s_name, __pyx_n_s_main) < 0) __PYX_ERR(0, 1, __pyx_L1_error) } #if PY_MAJOR_VERSION >= 3 { PyObject *modules = PyImport_GetModuleDict(); if (unlikely(!modules)) __PYX_ERR(0, 1, __pyx_L1_error) if (!PyDict_GetItemString(modules, "aiohttp._frozenlist")) { if (unlikely(PyDict_SetItemString(modules, "aiohttp._frozenlist", __pyx_m) < 0)) __PYX_ERR(0, 1, __pyx_L1_error) } } #endif /*--- Builtin init code ---*/ if (__Pyx_InitCachedBuiltins() < 0) goto __pyx_L1_error; /*--- Constants init code ---*/ if (__Pyx_InitCachedConstants() < 0) goto __pyx_L1_error; /*--- Global type/function init code ---*/ (void)__Pyx_modinit_global_init_code(); (void)__Pyx_modinit_variable_export_code(); (void)__Pyx_modinit_function_export_code(); if (unlikely(__Pyx_modinit_type_init_code() != 0)) goto __pyx_L1_error; (void)__Pyx_modinit_type_import_code(); (void)__Pyx_modinit_variable_import_code(); (void)__Pyx_modinit_function_import_code(); /*--- Execution code ---*/ #if defined(__Pyx_Generator_USED) || defined(__Pyx_Coroutine_USED) if (__Pyx_patch_abc() < 0) __PYX_ERR(0, 1, __pyx_L1_error) #endif /* "aiohttp/_frozenlist.pyx":1 * from collections.abc import MutableSequence # <<<<<<<<<<<<<< * * */ __pyx_t_1 = PyList_New(1); if (unlikely(!__pyx_t_1)) __PYX_ERR(0, 1, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_1); __Pyx_INCREF(__pyx_n_s_MutableSequence); __Pyx_GIVEREF(__pyx_n_s_MutableSequence); PyList_SET_ITEM(__pyx_t_1, 0, __pyx_n_s_MutableSequence); __pyx_t_2 = __Pyx_Import(__pyx_n_s_collections_abc, __pyx_t_1, 0); if (unlikely(!__pyx_t_2)) __PYX_ERR(0, 1, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_2); __Pyx_DECREF(__pyx_t_1); __pyx_t_1 = 0; __pyx_t_1 = __Pyx_ImportFrom(__pyx_t_2, __pyx_n_s_MutableSequence); if (unlikely(!__pyx_t_1)) __PYX_ERR(0, 1, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_1); if (PyDict_SetItem(__pyx_d, __pyx_n_s_MutableSequence, __pyx_t_1) < 0) __PYX_ERR(0, 1, __pyx_L1_error) __Pyx_DECREF(__pyx_t_1); __pyx_t_1 = 0; __Pyx_DECREF(__pyx_t_2); __pyx_t_2 = 0; /* "aiohttp/_frozenlist.pyx":108 * * * MutableSequence.register(FrozenList) # <<<<<<<<<<<<<< */ __Pyx_GetModuleGlobalName(__pyx_t_2, __pyx_n_s_MutableSequence); if (unlikely(!__pyx_t_2)) __PYX_ERR(0, 108, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_2); __pyx_t_1 = __Pyx_PyObject_GetAttrStr(__pyx_t_2, __pyx_n_s_register); if (unlikely(!__pyx_t_1)) __PYX_ERR(0, 108, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_1); __Pyx_DECREF(__pyx_t_2); __pyx_t_2 = 0; __pyx_t_2 = __Pyx_PyObject_CallOneArg(__pyx_t_1, ((PyObject *)__pyx_ptype_7aiohttp_11_frozenlist_FrozenList)); if (unlikely(!__pyx_t_2)) __PYX_ERR(0, 108, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_2); __Pyx_DECREF(__pyx_t_1); __pyx_t_1 = 0; __Pyx_DECREF(__pyx_t_2); __pyx_t_2 = 0; /* "(tree fragment)":1 * def __pyx_unpickle_FrozenList(__pyx_type, long __pyx_checksum, __pyx_state): # <<<<<<<<<<<<<< * cdef object __pyx_PickleError * cdef object __pyx_result */ __pyx_t_2 = PyCFunction_NewEx(&__pyx_mdef_7aiohttp_11_frozenlist_1__pyx_unpickle_FrozenList, NULL, __pyx_n_s_aiohttp__frozenlist); if (unlikely(!__pyx_t_2)) __PYX_ERR(1, 1, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_2); if (PyDict_SetItem(__pyx_d, __pyx_n_s_pyx_unpickle_FrozenList, __pyx_t_2) < 0) __PYX_ERR(1, 1, __pyx_L1_error) 
__Pyx_DECREF(__pyx_t_2); __pyx_t_2 = 0; /* "aiohttp/_frozenlist.pyx":1 * from collections.abc import MutableSequence # <<<<<<<<<<<<<< * * */ __pyx_t_2 = __Pyx_PyDict_NewPresized(0); if (unlikely(!__pyx_t_2)) __PYX_ERR(0, 1, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_2); if (PyDict_SetItem(__pyx_d, __pyx_n_s_test, __pyx_t_2) < 0) __PYX_ERR(0, 1, __pyx_L1_error) __Pyx_DECREF(__pyx_t_2); __pyx_t_2 = 0; /*--- Wrapped vars code ---*/ goto __pyx_L0; __pyx_L1_error:; __Pyx_XDECREF(__pyx_t_1); __Pyx_XDECREF(__pyx_t_2); if (__pyx_m) { if (__pyx_d) { __Pyx_AddTraceback("init aiohttp._frozenlist", __pyx_clineno, __pyx_lineno, __pyx_filename); } Py_CLEAR(__pyx_m); } else if (!PyErr_Occurred()) { PyErr_SetString(PyExc_ImportError, "init aiohttp._frozenlist"); } __pyx_L0:; __Pyx_RefNannyFinishContext(); #if CYTHON_PEP489_MULTI_PHASE_INIT return (__pyx_m != NULL) ? 0 : -1; #elif PY_MAJOR_VERSION >= 3 return __pyx_m; #else return; #endif } /* --- Runtime support code --- */ /* Refnanny */ #if CYTHON_REFNANNY static __Pyx_RefNannyAPIStruct *__Pyx_RefNannyImportAPI(const char *modname) { PyObject *m = NULL, *p = NULL; void *r = NULL; m = PyImport_ImportModule(modname); if (!m) goto end; p = PyObject_GetAttrString(m, "RefNannyAPI"); if (!p) goto end; r = PyLong_AsVoidPtr(p); end: Py_XDECREF(p); Py_XDECREF(m); return (__Pyx_RefNannyAPIStruct *)r; } #endif /* PyObjectGetAttrStr */ #if CYTHON_USE_TYPE_SLOTS static CYTHON_INLINE PyObject* __Pyx_PyObject_GetAttrStr(PyObject* obj, PyObject* attr_name) { PyTypeObject* tp = Py_TYPE(obj); if (likely(tp->tp_getattro)) return tp->tp_getattro(obj, attr_name); #if PY_MAJOR_VERSION < 3 if (likely(tp->tp_getattr)) return tp->tp_getattr(obj, PyString_AS_STRING(attr_name)); #endif return PyObject_GetAttr(obj, attr_name); } #endif /* GetBuiltinName */ static PyObject *__Pyx_GetBuiltinName(PyObject *name) { PyObject* result = __Pyx_PyObject_GetAttrStr(__pyx_b, name); if (unlikely(!result)) { PyErr_Format(PyExc_NameError, #if PY_MAJOR_VERSION >= 3 "name '%U' is not defined", name); #else "name '%.200s' is not defined", PyString_AS_STRING(name)); #endif } return result; } /* RaiseDoubleKeywords */ static void __Pyx_RaiseDoubleKeywordsError( const char* func_name, PyObject* kw_name) { PyErr_Format(PyExc_TypeError, #if PY_MAJOR_VERSION >= 3 "%s() got multiple values for keyword argument '%U'", func_name, kw_name); #else "%s() got multiple values for keyword argument '%s'", func_name, PyString_AsString(kw_name)); #endif } /* ParseKeywords */ static int __Pyx_ParseOptionalKeywords( PyObject *kwds, PyObject **argnames[], PyObject *kwds2, PyObject *values[], Py_ssize_t num_pos_args, const char* function_name) { PyObject *key = 0, *value = 0; Py_ssize_t pos = 0; PyObject*** name; PyObject*** first_kw_arg = argnames + num_pos_args; while (PyDict_Next(kwds, &pos, &key, &value)) { name = first_kw_arg; while (*name && (**name != key)) name++; if (*name) { values[name-argnames] = value; continue; } name = first_kw_arg; #if PY_MAJOR_VERSION < 3 if (likely(PyString_CheckExact(key)) || likely(PyString_Check(key))) { while (*name) { if ((CYTHON_COMPILING_IN_PYPY || PyString_GET_SIZE(**name) == PyString_GET_SIZE(key)) && _PyString_Eq(**name, key)) { values[name-argnames] = value; break; } name++; } if (*name) continue; else { PyObject*** argname = argnames; while (argname != first_kw_arg) { if ((**argname == key) || ( (CYTHON_COMPILING_IN_PYPY || PyString_GET_SIZE(**argname) == PyString_GET_SIZE(key)) && _PyString_Eq(**argname, key))) { goto arg_passed_twice; } argname++; } } } else #endif if 
(likely(PyUnicode_Check(key))) { while (*name) { int cmp = (**name == key) ? 0 : #if !CYTHON_COMPILING_IN_PYPY && PY_MAJOR_VERSION >= 3 (PyUnicode_GET_SIZE(**name) != PyUnicode_GET_SIZE(key)) ? 1 : #endif PyUnicode_Compare(**name, key); if (cmp < 0 && unlikely(PyErr_Occurred())) goto bad; if (cmp == 0) { values[name-argnames] = value; break; } name++; } if (*name) continue; else { PyObject*** argname = argnames; while (argname != first_kw_arg) { int cmp = (**argname == key) ? 0 : #if !CYTHON_COMPILING_IN_PYPY && PY_MAJOR_VERSION >= 3 (PyUnicode_GET_SIZE(**argname) != PyUnicode_GET_SIZE(key)) ? 1 : #endif PyUnicode_Compare(**argname, key); if (cmp < 0 && unlikely(PyErr_Occurred())) goto bad; if (cmp == 0) goto arg_passed_twice; argname++; } } } else goto invalid_keyword_type; if (kwds2) { if (unlikely(PyDict_SetItem(kwds2, key, value))) goto bad; } else { goto invalid_keyword; } } return 0; arg_passed_twice: __Pyx_RaiseDoubleKeywordsError(function_name, key); goto bad; invalid_keyword_type: PyErr_Format(PyExc_TypeError, "%.200s() keywords must be strings", function_name); goto bad; invalid_keyword: PyErr_Format(PyExc_TypeError, #if PY_MAJOR_VERSION < 3 "%.200s() got an unexpected keyword argument '%.200s'", function_name, PyString_AsString(key)); #else "%s() got an unexpected keyword argument '%U'", function_name, key); #endif bad: return -1; } /* RaiseArgTupleInvalid */ static void __Pyx_RaiseArgtupleInvalid( const char* func_name, int exact, Py_ssize_t num_min, Py_ssize_t num_max, Py_ssize_t num_found) { Py_ssize_t num_expected; const char *more_or_less; if (num_found < num_min) { num_expected = num_min; more_or_less = "at least"; } else { num_expected = num_max; more_or_less = "at most"; } if (exact) { more_or_less = "exactly"; } PyErr_Format(PyExc_TypeError, "%.200s() takes %.8s %" CYTHON_FORMAT_SSIZE_T "d positional argument%.1s (%" CYTHON_FORMAT_SSIZE_T "d given)", func_name, more_or_less, num_expected, (num_expected == 1) ? 
"" : "s", num_found); } /* PyObjectCall */ #if CYTHON_COMPILING_IN_CPYTHON static CYTHON_INLINE PyObject* __Pyx_PyObject_Call(PyObject *func, PyObject *arg, PyObject *kw) { PyObject *result; ternaryfunc call = func->ob_type->tp_call; if (unlikely(!call)) return PyObject_Call(func, arg, kw); if (unlikely(Py_EnterRecursiveCall((char*)" while calling a Python object"))) return NULL; result = (*call)(func, arg, kw); Py_LeaveRecursiveCall(); if (unlikely(!result) && unlikely(!PyErr_Occurred())) { PyErr_SetString( PyExc_SystemError, "NULL result without error in PyObject_Call"); } return result; } #endif /* PyErrFetchRestore */ #if CYTHON_FAST_THREAD_STATE static CYTHON_INLINE void __Pyx_ErrRestoreInState(PyThreadState *tstate, PyObject *type, PyObject *value, PyObject *tb) { PyObject *tmp_type, *tmp_value, *tmp_tb; tmp_type = tstate->curexc_type; tmp_value = tstate->curexc_value; tmp_tb = tstate->curexc_traceback; tstate->curexc_type = type; tstate->curexc_value = value; tstate->curexc_traceback = tb; Py_XDECREF(tmp_type); Py_XDECREF(tmp_value); Py_XDECREF(tmp_tb); } static CYTHON_INLINE void __Pyx_ErrFetchInState(PyThreadState *tstate, PyObject **type, PyObject **value, PyObject **tb) { *type = tstate->curexc_type; *value = tstate->curexc_value; *tb = tstate->curexc_traceback; tstate->curexc_type = 0; tstate->curexc_value = 0; tstate->curexc_traceback = 0; } #endif /* RaiseException */ #if PY_MAJOR_VERSION < 3 static void __Pyx_Raise(PyObject *type, PyObject *value, PyObject *tb, CYTHON_UNUSED PyObject *cause) { __Pyx_PyThreadState_declare Py_XINCREF(type); if (!value || value == Py_None) value = NULL; else Py_INCREF(value); if (!tb || tb == Py_None) tb = NULL; else { Py_INCREF(tb); if (!PyTraceBack_Check(tb)) { PyErr_SetString(PyExc_TypeError, "raise: arg 3 must be a traceback or None"); goto raise_error; } } if (PyType_Check(type)) { #if CYTHON_COMPILING_IN_PYPY if (!value) { Py_INCREF(Py_None); value = Py_None; } #endif PyErr_NormalizeException(&type, &value, &tb); } else { if (value) { PyErr_SetString(PyExc_TypeError, "instance exception may not have a separate value"); goto raise_error; } value = type; type = (PyObject*) Py_TYPE(type); Py_INCREF(type); if (!PyType_IsSubtype((PyTypeObject *)type, (PyTypeObject *)PyExc_BaseException)) { PyErr_SetString(PyExc_TypeError, "raise: exception class must be a subclass of BaseException"); goto raise_error; } } __Pyx_PyThreadState_assign __Pyx_ErrRestore(type, value, tb); return; raise_error: Py_XDECREF(value); Py_XDECREF(type); Py_XDECREF(tb); return; } #else static void __Pyx_Raise(PyObject *type, PyObject *value, PyObject *tb, PyObject *cause) { PyObject* owned_instance = NULL; if (tb == Py_None) { tb = 0; } else if (tb && !PyTraceBack_Check(tb)) { PyErr_SetString(PyExc_TypeError, "raise: arg 3 must be a traceback or None"); goto bad; } if (value == Py_None) value = 0; if (PyExceptionInstance_Check(type)) { if (value) { PyErr_SetString(PyExc_TypeError, "instance exception may not have a separate value"); goto bad; } value = type; type = (PyObject*) Py_TYPE(value); } else if (PyExceptionClass_Check(type)) { PyObject *instance_class = NULL; if (value && PyExceptionInstance_Check(value)) { instance_class = (PyObject*) Py_TYPE(value); if (instance_class != type) { int is_subclass = PyObject_IsSubclass(instance_class, type); if (!is_subclass) { instance_class = NULL; } else if (unlikely(is_subclass == -1)) { goto bad; } else { type = instance_class; } } } if (!instance_class) { PyObject *args; if (!value) args = PyTuple_New(0); else if 
(PyTuple_Check(value)) { Py_INCREF(value); args = value; } else args = PyTuple_Pack(1, value); if (!args) goto bad; owned_instance = PyObject_Call(type, args, NULL); Py_DECREF(args); if (!owned_instance) goto bad; value = owned_instance; if (!PyExceptionInstance_Check(value)) { PyErr_Format(PyExc_TypeError, "calling %R should have returned an instance of " "BaseException, not %R", type, Py_TYPE(value)); goto bad; } } } else { PyErr_SetString(PyExc_TypeError, "raise: exception class must be a subclass of BaseException"); goto bad; } if (cause) { PyObject *fixed_cause; if (cause == Py_None) { fixed_cause = NULL; } else if (PyExceptionClass_Check(cause)) { fixed_cause = PyObject_CallObject(cause, NULL); if (fixed_cause == NULL) goto bad; } else if (PyExceptionInstance_Check(cause)) { fixed_cause = cause; Py_INCREF(fixed_cause); } else { PyErr_SetString(PyExc_TypeError, "exception causes must derive from " "BaseException"); goto bad; } PyException_SetCause(value, fixed_cause); } PyErr_SetObject(type, value); if (tb) { #if CYTHON_COMPILING_IN_PYPY PyObject *tmp_type, *tmp_value, *tmp_tb; PyErr_Fetch(&tmp_type, &tmp_value, &tmp_tb); Py_INCREF(tb); PyErr_Restore(tmp_type, tmp_value, tb); Py_XDECREF(tmp_tb); #else PyThreadState *tstate = __Pyx_PyThreadState_Current; PyObject* tmp_tb = tstate->curexc_traceback; if (tb != tmp_tb) { Py_INCREF(tb); tstate->curexc_traceback = tb; Py_XDECREF(tmp_tb); } #endif } bad: Py_XDECREF(owned_instance); return; } #endif /* GetItemInt */ static PyObject *__Pyx_GetItemInt_Generic(PyObject *o, PyObject* j) { PyObject *r; if (!j) return NULL; r = PyObject_GetItem(o, j); Py_DECREF(j); return r; } static CYTHON_INLINE PyObject *__Pyx_GetItemInt_List_Fast(PyObject *o, Py_ssize_t i, CYTHON_NCP_UNUSED int wraparound, CYTHON_NCP_UNUSED int boundscheck) { #if CYTHON_ASSUME_SAFE_MACROS && !CYTHON_AVOID_BORROWED_REFS Py_ssize_t wrapped_i = i; if (wraparound & unlikely(i < 0)) { wrapped_i += PyList_GET_SIZE(o); } if ((!boundscheck) || likely(__Pyx_is_valid_index(wrapped_i, PyList_GET_SIZE(o)))) { PyObject *r = PyList_GET_ITEM(o, wrapped_i); Py_INCREF(r); return r; } return __Pyx_GetItemInt_Generic(o, PyInt_FromSsize_t(i)); #else return PySequence_GetItem(o, i); #endif } static CYTHON_INLINE PyObject *__Pyx_GetItemInt_Tuple_Fast(PyObject *o, Py_ssize_t i, CYTHON_NCP_UNUSED int wraparound, CYTHON_NCP_UNUSED int boundscheck) { #if CYTHON_ASSUME_SAFE_MACROS && !CYTHON_AVOID_BORROWED_REFS Py_ssize_t wrapped_i = i; if (wraparound & unlikely(i < 0)) { wrapped_i += PyTuple_GET_SIZE(o); } if ((!boundscheck) || likely(__Pyx_is_valid_index(wrapped_i, PyTuple_GET_SIZE(o)))) { PyObject *r = PyTuple_GET_ITEM(o, wrapped_i); Py_INCREF(r); return r; } return __Pyx_GetItemInt_Generic(o, PyInt_FromSsize_t(i)); #else return PySequence_GetItem(o, i); #endif } static CYTHON_INLINE PyObject *__Pyx_GetItemInt_Fast(PyObject *o, Py_ssize_t i, int is_list, CYTHON_NCP_UNUSED int wraparound, CYTHON_NCP_UNUSED int boundscheck) { #if CYTHON_ASSUME_SAFE_MACROS && !CYTHON_AVOID_BORROWED_REFS && CYTHON_USE_TYPE_SLOTS if (is_list || PyList_CheckExact(o)) { Py_ssize_t n = ((!wraparound) | likely(i >= 0)) ? i : i + PyList_GET_SIZE(o); if ((!boundscheck) || (likely(__Pyx_is_valid_index(n, PyList_GET_SIZE(o))))) { PyObject *r = PyList_GET_ITEM(o, n); Py_INCREF(r); return r; } } else if (PyTuple_CheckExact(o)) { Py_ssize_t n = ((!wraparound) | likely(i >= 0)) ? 
i : i + PyTuple_GET_SIZE(o); if ((!boundscheck) || likely(__Pyx_is_valid_index(n, PyTuple_GET_SIZE(o)))) { PyObject *r = PyTuple_GET_ITEM(o, n); Py_INCREF(r); return r; } } else { PySequenceMethods *m = Py_TYPE(o)->tp_as_sequence; if (likely(m && m->sq_item)) { if (wraparound && unlikely(i < 0) && likely(m->sq_length)) { Py_ssize_t l = m->sq_length(o); if (likely(l >= 0)) { i += l; } else { if (!PyErr_ExceptionMatches(PyExc_OverflowError)) return NULL; PyErr_Clear(); } } return m->sq_item(o, i); } } #else if (is_list || PySequence_Check(o)) { return PySequence_GetItem(o, i); } #endif return __Pyx_GetItemInt_Generic(o, PyInt_FromSsize_t(i)); } /* ObjectGetItem */ #if CYTHON_USE_TYPE_SLOTS static PyObject *__Pyx_PyObject_GetIndex(PyObject *obj, PyObject* index) { PyObject *runerr; Py_ssize_t key_value; PySequenceMethods *m = Py_TYPE(obj)->tp_as_sequence; if (unlikely(!(m && m->sq_item))) { PyErr_Format(PyExc_TypeError, "'%.200s' object is not subscriptable", Py_TYPE(obj)->tp_name); return NULL; } key_value = __Pyx_PyIndex_AsSsize_t(index); if (likely(key_value != -1 || !(runerr = PyErr_Occurred()))) { return __Pyx_GetItemInt_Fast(obj, key_value, 0, 1, 1); } if (PyErr_GivenExceptionMatches(runerr, PyExc_OverflowError)) { PyErr_Clear(); PyErr_Format(PyExc_IndexError, "cannot fit '%.200s' into an index-sized integer", Py_TYPE(index)->tp_name); } return NULL; } static PyObject *__Pyx_PyObject_GetItem(PyObject *obj, PyObject* key) { PyMappingMethods *m = Py_TYPE(obj)->tp_as_mapping; if (likely(m && m->mp_subscript)) { return m->mp_subscript(obj, key); } return __Pyx_PyObject_GetIndex(obj, key); } #endif /* PyFunctionFastCall */ #if CYTHON_FAST_PYCALL static PyObject* __Pyx_PyFunction_FastCallNoKw(PyCodeObject *co, PyObject **args, Py_ssize_t na, PyObject *globals) { PyFrameObject *f; PyThreadState *tstate = __Pyx_PyThreadState_Current; PyObject **fastlocals; Py_ssize_t i; PyObject *result; assert(globals != NULL); /* XXX Perhaps we should create a specialized PyFrame_New() that doesn't take locals, but does take builtins without sanity checking them. */ assert(tstate != NULL); f = PyFrame_New(tstate, co, globals, NULL); if (f == NULL) { return NULL; } fastlocals = __Pyx_PyFrame_GetLocalsplus(f); for (i = 0; i < na; i++) { Py_INCREF(*args); fastlocals[i] = *args++; } result = PyEval_EvalFrameEx(f,0); ++tstate->recursion_depth; Py_DECREF(f); --tstate->recursion_depth; return result; } #if 1 || PY_VERSION_HEX < 0x030600B1 static PyObject *__Pyx_PyFunction_FastCallDict(PyObject *func, PyObject **args, Py_ssize_t nargs, PyObject *kwargs) { PyCodeObject *co = (PyCodeObject *)PyFunction_GET_CODE(func); PyObject *globals = PyFunction_GET_GLOBALS(func); PyObject *argdefs = PyFunction_GET_DEFAULTS(func); PyObject *closure; #if PY_MAJOR_VERSION >= 3 PyObject *kwdefs; #endif PyObject *kwtuple, **k; PyObject **d; Py_ssize_t nd; Py_ssize_t nk; PyObject *result; assert(kwargs == NULL || PyDict_Check(kwargs)); nk = kwargs ? 
PyDict_Size(kwargs) : 0; if (Py_EnterRecursiveCall((char*)" while calling a Python object")) { return NULL; } if ( #if PY_MAJOR_VERSION >= 3 co->co_kwonlyargcount == 0 && #endif likely(kwargs == NULL || nk == 0) && co->co_flags == (CO_OPTIMIZED | CO_NEWLOCALS | CO_NOFREE)) { if (argdefs == NULL && co->co_argcount == nargs) { result = __Pyx_PyFunction_FastCallNoKw(co, args, nargs, globals); goto done; } else if (nargs == 0 && argdefs != NULL && co->co_argcount == Py_SIZE(argdefs)) { /* function called with no arguments, but all parameters have a default value: use default values as arguments .*/ args = &PyTuple_GET_ITEM(argdefs, 0); result =__Pyx_PyFunction_FastCallNoKw(co, args, Py_SIZE(argdefs), globals); goto done; } } if (kwargs != NULL) { Py_ssize_t pos, i; kwtuple = PyTuple_New(2 * nk); if (kwtuple == NULL) { result = NULL; goto done; } k = &PyTuple_GET_ITEM(kwtuple, 0); pos = i = 0; while (PyDict_Next(kwargs, &pos, &k[i], &k[i+1])) { Py_INCREF(k[i]); Py_INCREF(k[i+1]); i += 2; } nk = i / 2; } else { kwtuple = NULL; k = NULL; } closure = PyFunction_GET_CLOSURE(func); #if PY_MAJOR_VERSION >= 3 kwdefs = PyFunction_GET_KW_DEFAULTS(func); #endif if (argdefs != NULL) { d = &PyTuple_GET_ITEM(argdefs, 0); nd = Py_SIZE(argdefs); } else { d = NULL; nd = 0; } #if PY_MAJOR_VERSION >= 3 result = PyEval_EvalCodeEx((PyObject*)co, globals, (PyObject *)NULL, args, (int)nargs, k, (int)nk, d, (int)nd, kwdefs, closure); #else result = PyEval_EvalCodeEx(co, globals, (PyObject *)NULL, args, (int)nargs, k, (int)nk, d, (int)nd, closure); #endif Py_XDECREF(kwtuple); done: Py_LeaveRecursiveCall(); return result; } #endif #endif /* PyObjectCallMethO */ #if CYTHON_COMPILING_IN_CPYTHON static CYTHON_INLINE PyObject* __Pyx_PyObject_CallMethO(PyObject *func, PyObject *arg) { PyObject *self, *result; PyCFunction cfunc; cfunc = PyCFunction_GET_FUNCTION(func); self = PyCFunction_GET_SELF(func); if (unlikely(Py_EnterRecursiveCall((char*)" while calling a Python object"))) return NULL; result = cfunc(self, arg); Py_LeaveRecursiveCall(); if (unlikely(!result) && unlikely(!PyErr_Occurred())) { PyErr_SetString( PyExc_SystemError, "NULL result without error in PyObject_Call"); } return result; } #endif /* PyObjectCallNoArg */ #if CYTHON_COMPILING_IN_CPYTHON static CYTHON_INLINE PyObject* __Pyx_PyObject_CallNoArg(PyObject *func) { #if CYTHON_FAST_PYCALL if (PyFunction_Check(func)) { return __Pyx_PyFunction_FastCall(func, NULL, 0); } #endif #ifdef __Pyx_CyFunction_USED if (likely(PyCFunction_Check(func) || __Pyx_CyFunction_Check(func))) #else if (likely(PyCFunction_Check(func))) #endif { if (likely(PyCFunction_GET_FLAGS(func) & METH_NOARGS)) { return __Pyx_PyObject_CallMethO(func, NULL); } } return __Pyx_PyObject_Call(func, __pyx_empty_tuple, NULL); } #endif /* PyCFunctionFastCall */ #if CYTHON_FAST_PYCCALL static CYTHON_INLINE PyObject * __Pyx_PyCFunction_FastCall(PyObject *func_obj, PyObject **args, Py_ssize_t nargs) { PyCFunctionObject *func = (PyCFunctionObject*)func_obj; PyCFunction meth = PyCFunction_GET_FUNCTION(func); PyObject *self = PyCFunction_GET_SELF(func); int flags = PyCFunction_GET_FLAGS(func); assert(PyCFunction_Check(func)); assert(METH_FASTCALL == (flags & ~(METH_CLASS | METH_STATIC | METH_COEXIST | METH_KEYWORDS | METH_STACKLESS))); assert(nargs >= 0); assert(nargs == 0 || args != NULL); /* _PyCFunction_FastCallDict() must not be called with an exception set, because it may clear it (directly or indirectly) and so the caller loses its exception */ assert(!PyErr_Occurred()); if ((PY_VERSION_HEX < 
0x030700A0) || unlikely(flags & METH_KEYWORDS)) { return (*((__Pyx_PyCFunctionFastWithKeywords)(void*)meth)) (self, args, nargs, NULL); } else { return (*((__Pyx_PyCFunctionFast)(void*)meth)) (self, args, nargs); } } #endif /* PyObjectCallOneArg */ #if CYTHON_COMPILING_IN_CPYTHON static PyObject* __Pyx__PyObject_CallOneArg(PyObject *func, PyObject *arg) { PyObject *result; PyObject *args = PyTuple_New(1); if (unlikely(!args)) return NULL; Py_INCREF(arg); PyTuple_SET_ITEM(args, 0, arg); result = __Pyx_PyObject_Call(func, args, NULL); Py_DECREF(args); return result; } static CYTHON_INLINE PyObject* __Pyx_PyObject_CallOneArg(PyObject *func, PyObject *arg) { #if CYTHON_FAST_PYCALL if (PyFunction_Check(func)) { return __Pyx_PyFunction_FastCall(func, &arg, 1); } #endif if (likely(PyCFunction_Check(func))) { if (likely(PyCFunction_GET_FLAGS(func) & METH_O)) { return __Pyx_PyObject_CallMethO(func, arg); #if CYTHON_FAST_PYCCALL } else if (PyCFunction_GET_FLAGS(func) & METH_FASTCALL) { return __Pyx_PyCFunction_FastCall(func, &arg, 1); #endif } } return __Pyx__PyObject_CallOneArg(func, arg); } #else static CYTHON_INLINE PyObject* __Pyx_PyObject_CallOneArg(PyObject *func, PyObject *arg) { PyObject *result; PyObject *args = PyTuple_Pack(1, arg); if (unlikely(!args)) return NULL; result = __Pyx_PyObject_Call(func, args, NULL); Py_DECREF(args); return result; } #endif /* PyIntCompare */ static CYTHON_INLINE PyObject* __Pyx_PyInt_EqObjC(PyObject *op1, PyObject *op2, CYTHON_UNUSED long intval, CYTHON_UNUSED long inplace) { if (op1 == op2) { Py_RETURN_TRUE; } #if PY_MAJOR_VERSION < 3 if (likely(PyInt_CheckExact(op1))) { const long b = intval; long a = PyInt_AS_LONG(op1); if (a == b) Py_RETURN_TRUE; else Py_RETURN_FALSE; } #endif #if CYTHON_USE_PYLONG_INTERNALS if (likely(PyLong_CheckExact(op1))) { int unequal; unsigned long uintval; Py_ssize_t size = Py_SIZE(op1); const digit* digits = ((PyLongObject*)op1)->ob_digit; if (intval == 0) { if (size == 0) Py_RETURN_TRUE; else Py_RETURN_FALSE; } else if (intval < 0) { if (size >= 0) Py_RETURN_FALSE; intval = -intval; size = -size; } else { if (size <= 0) Py_RETURN_FALSE; } uintval = (unsigned long) intval; #if PyLong_SHIFT * 4 < SIZEOF_LONG*8 if (uintval >> (PyLong_SHIFT * 4)) { unequal = (size != 5) || (digits[0] != (uintval & (unsigned long) PyLong_MASK)) | (digits[1] != ((uintval >> (1 * PyLong_SHIFT)) & (unsigned long) PyLong_MASK)) | (digits[2] != ((uintval >> (2 * PyLong_SHIFT)) & (unsigned long) PyLong_MASK)) | (digits[3] != ((uintval >> (3 * PyLong_SHIFT)) & (unsigned long) PyLong_MASK)) | (digits[4] != ((uintval >> (4 * PyLong_SHIFT)) & (unsigned long) PyLong_MASK)); } else #endif #if PyLong_SHIFT * 3 < SIZEOF_LONG*8 if (uintval >> (PyLong_SHIFT * 3)) { unequal = (size != 4) || (digits[0] != (uintval & (unsigned long) PyLong_MASK)) | (digits[1] != ((uintval >> (1 * PyLong_SHIFT)) & (unsigned long) PyLong_MASK)) | (digits[2] != ((uintval >> (2 * PyLong_SHIFT)) & (unsigned long) PyLong_MASK)) | (digits[3] != ((uintval >> (3 * PyLong_SHIFT)) & (unsigned long) PyLong_MASK)); } else #endif #if PyLong_SHIFT * 2 < SIZEOF_LONG*8 if (uintval >> (PyLong_SHIFT * 2)) { unequal = (size != 3) || (digits[0] != (uintval & (unsigned long) PyLong_MASK)) | (digits[1] != ((uintval >> (1 * PyLong_SHIFT)) & (unsigned long) PyLong_MASK)) | (digits[2] != ((uintval >> (2 * PyLong_SHIFT)) & (unsigned long) PyLong_MASK)); } else #endif #if PyLong_SHIFT * 1 < SIZEOF_LONG*8 if (uintval >> (PyLong_SHIFT * 1)) { unequal = (size != 2) || (digits[0] != (uintval & (unsigned long) 
PyLong_MASK)) | (digits[1] != ((uintval >> (1 * PyLong_SHIFT)) & (unsigned long) PyLong_MASK)); } else #endif unequal = (size != 1) || (((unsigned long) digits[0]) != (uintval & (unsigned long) PyLong_MASK)); if (unequal == 0) Py_RETURN_TRUE; else Py_RETURN_FALSE; } #endif if (PyFloat_CheckExact(op1)) { const long b = intval; double a = PyFloat_AS_DOUBLE(op1); if ((double)a == (double)b) Py_RETURN_TRUE; else Py_RETURN_FALSE; } return ( PyObject_RichCompare(op1, op2, Py_EQ)); } /* PyObjectCall2Args */ static CYTHON_UNUSED PyObject* __Pyx_PyObject_Call2Args(PyObject* function, PyObject* arg1, PyObject* arg2) { PyObject *args, *result = NULL; #if CYTHON_FAST_PYCALL if (PyFunction_Check(function)) { PyObject *args[2] = {arg1, arg2}; return __Pyx_PyFunction_FastCall(function, args, 2); } #endif #if CYTHON_FAST_PYCCALL if (__Pyx_PyFastCFunction_Check(function)) { PyObject *args[2] = {arg1, arg2}; return __Pyx_PyCFunction_FastCall(function, args, 2); } #endif args = PyTuple_New(2); if (unlikely(!args)) goto done; Py_INCREF(arg1); PyTuple_SET_ITEM(args, 0, arg1); Py_INCREF(arg2); PyTuple_SET_ITEM(args, 1, arg2); Py_INCREF(function); result = __Pyx_PyObject_Call(function, args, NULL); Py_DECREF(args); Py_DECREF(function); done: return result; } /* PyObjectGetMethod */ static int __Pyx_PyObject_GetMethod(PyObject *obj, PyObject *name, PyObject **method) { PyObject *attr; #if CYTHON_UNPACK_METHODS && CYTHON_COMPILING_IN_CPYTHON && CYTHON_USE_PYTYPE_LOOKUP PyTypeObject *tp = Py_TYPE(obj); PyObject *descr; descrgetfunc f = NULL; PyObject **dictptr, *dict; int meth_found = 0; assert (*method == NULL); if (unlikely(tp->tp_getattro != PyObject_GenericGetAttr)) { attr = __Pyx_PyObject_GetAttrStr(obj, name); goto try_unpack; } if (unlikely(tp->tp_dict == NULL) && unlikely(PyType_Ready(tp) < 0)) { return 0; } descr = _PyType_Lookup(tp, name); if (likely(descr != NULL)) { Py_INCREF(descr); #if PY_MAJOR_VERSION >= 3 #ifdef __Pyx_CyFunction_USED if (likely(PyFunction_Check(descr) || (Py_TYPE(descr) == &PyMethodDescr_Type) || __Pyx_CyFunction_Check(descr))) #else if (likely(PyFunction_Check(descr) || (Py_TYPE(descr) == &PyMethodDescr_Type))) #endif #else #ifdef __Pyx_CyFunction_USED if (likely(PyFunction_Check(descr) || __Pyx_CyFunction_Check(descr))) #else if (likely(PyFunction_Check(descr))) #endif #endif { meth_found = 1; } else { f = Py_TYPE(descr)->tp_descr_get; if (f != NULL && PyDescr_IsData(descr)) { attr = f(descr, obj, (PyObject *)Py_TYPE(obj)); Py_DECREF(descr); goto try_unpack; } } } dictptr = _PyObject_GetDictPtr(obj); if (dictptr != NULL && (dict = *dictptr) != NULL) { Py_INCREF(dict); attr = __Pyx_PyDict_GetItemStr(dict, name); if (attr != NULL) { Py_INCREF(attr); Py_DECREF(dict); Py_XDECREF(descr); goto try_unpack; } Py_DECREF(dict); } if (meth_found) { *method = descr; return 1; } if (f != NULL) { attr = f(descr, obj, (PyObject *)Py_TYPE(obj)); Py_DECREF(descr); goto try_unpack; } if (descr != NULL) { *method = descr; return 0; } PyErr_Format(PyExc_AttributeError, #if PY_MAJOR_VERSION >= 3 "'%.50s' object has no attribute '%U'", tp->tp_name, name); #else "'%.50s' object has no attribute '%.400s'", tp->tp_name, PyString_AS_STRING(name)); #endif return 0; #else attr = __Pyx_PyObject_GetAttrStr(obj, name); goto try_unpack; #endif try_unpack: #if CYTHON_UNPACK_METHODS if (likely(attr) && PyMethod_Check(attr) && likely(PyMethod_GET_SELF(attr) == obj)) { PyObject *function = PyMethod_GET_FUNCTION(attr); Py_INCREF(function); Py_DECREF(attr); *method = function; return 1; } #endif *method = attr; return 
0; } /* PyObjectCallMethod1 */ static PyObject* __Pyx__PyObject_CallMethod1(PyObject* method, PyObject* arg) { PyObject *result = __Pyx_PyObject_CallOneArg(method, arg); Py_DECREF(method); return result; } static PyObject* __Pyx_PyObject_CallMethod1(PyObject* obj, PyObject* method_name, PyObject* arg) { PyObject *method = NULL, *result; int is_method = __Pyx_PyObject_GetMethod(obj, method_name, &method); if (likely(is_method)) { result = __Pyx_PyObject_Call2Args(method, obj, arg); Py_DECREF(method); return result; } if (unlikely(!method)) return NULL; return __Pyx__PyObject_CallMethod1(method, arg); } /* pop_index */ static PyObject* __Pyx__PyObject_PopNewIndex(PyObject* L, PyObject* py_ix) { PyObject *r; if (unlikely(!py_ix)) return NULL; r = __Pyx__PyObject_PopIndex(L, py_ix); Py_DECREF(py_ix); return r; } static PyObject* __Pyx__PyObject_PopIndex(PyObject* L, PyObject* py_ix) { return __Pyx_PyObject_CallMethod1(L, __pyx_n_s_pop, py_ix); } #if CYTHON_USE_PYLIST_INTERNALS && CYTHON_ASSUME_SAFE_MACROS static PyObject* __Pyx__PyList_PopIndex(PyObject* L, PyObject* py_ix, Py_ssize_t ix) { Py_ssize_t size = PyList_GET_SIZE(L); if (likely(size > (((PyListObject*)L)->allocated >> 1))) { Py_ssize_t cix = ix; if (cix < 0) { cix += size; } if (likely(__Pyx_is_valid_index(cix, size))) { PyObject* v = PyList_GET_ITEM(L, cix); Py_SIZE(L) -= 1; size -= 1; memmove(&PyList_GET_ITEM(L, cix), &PyList_GET_ITEM(L, cix+1), (size_t)(size-cix)*sizeof(PyObject*)); return v; } } if (py_ix == Py_None) { return __Pyx__PyObject_PopNewIndex(L, PyInt_FromSsize_t(ix)); } else { return __Pyx__PyObject_PopIndex(L, py_ix); } } #endif /* PyErrExceptionMatches */ #if CYTHON_FAST_THREAD_STATE static int __Pyx_PyErr_ExceptionMatchesTuple(PyObject *exc_type, PyObject *tuple) { Py_ssize_t i, n; n = PyTuple_GET_SIZE(tuple); #if PY_MAJOR_VERSION >= 3 for (i=0; icurexc_type; if (exc_type == err) return 1; if (unlikely(!exc_type)) return 0; if (unlikely(PyTuple_Check(err))) return __Pyx_PyErr_ExceptionMatchesTuple(exc_type, err); return __Pyx_PyErr_GivenExceptionMatches(exc_type, err); } #endif /* GetAttr */ static CYTHON_INLINE PyObject *__Pyx_GetAttr(PyObject *o, PyObject *n) { #if CYTHON_USE_TYPE_SLOTS #if PY_MAJOR_VERSION >= 3 if (likely(PyUnicode_Check(n))) #else if (likely(PyString_Check(n))) #endif return __Pyx_PyObject_GetAttrStr(o, n); #endif return PyObject_GetAttr(o, n); } /* GetAttr3 */ static PyObject *__Pyx_GetAttr3Default(PyObject *d) { __Pyx_PyThreadState_declare __Pyx_PyThreadState_assign if (unlikely(!__Pyx_PyErr_ExceptionMatches(PyExc_AttributeError))) return NULL; __Pyx_PyErr_Clear(); Py_INCREF(d); return d; } static CYTHON_INLINE PyObject *__Pyx_GetAttr3(PyObject *o, PyObject *n, PyObject *d) { PyObject *r = __Pyx_GetAttr(o, n); return (likely(r)) ? r : __Pyx_GetAttr3Default(d); } /* PyDictVersioning */ #if CYTHON_USE_DICT_VERSIONS && CYTHON_USE_TYPE_SLOTS static CYTHON_INLINE PY_UINT64_T __Pyx_get_tp_dict_version(PyObject *obj) { PyObject *dict = Py_TYPE(obj)->tp_dict; return likely(dict) ? __PYX_GET_DICT_VERSION(dict) : 0; } static CYTHON_INLINE PY_UINT64_T __Pyx_get_object_dict_version(PyObject *obj) { PyObject **dictptr = NULL; Py_ssize_t offset = Py_TYPE(obj)->tp_dictoffset; if (offset) { #if CYTHON_COMPILING_IN_CPYTHON dictptr = (likely(offset > 0)) ? (PyObject **) ((char *)obj + offset) : _PyObject_GetDictPtr(obj); #else dictptr = _PyObject_GetDictPtr(obj); #endif } return (dictptr && *dictptr) ? 
__PYX_GET_DICT_VERSION(*dictptr) : 0; } static CYTHON_INLINE int __Pyx_object_dict_version_matches(PyObject* obj, PY_UINT64_T tp_dict_version, PY_UINT64_T obj_dict_version) { PyObject *dict = Py_TYPE(obj)->tp_dict; if (unlikely(!dict) || unlikely(tp_dict_version != __PYX_GET_DICT_VERSION(dict))) return 0; return obj_dict_version == __Pyx_get_object_dict_version(obj); } #endif /* GetModuleGlobalName */ #if CYTHON_USE_DICT_VERSIONS static PyObject *__Pyx__GetModuleGlobalName(PyObject *name, PY_UINT64_T *dict_version, PyObject **dict_cached_value) #else static CYTHON_INLINE PyObject *__Pyx__GetModuleGlobalName(PyObject *name) #endif { PyObject *result; #if !CYTHON_AVOID_BORROWED_REFS #if CYTHON_COMPILING_IN_CPYTHON && PY_VERSION_HEX >= 0x030500A1 result = _PyDict_GetItem_KnownHash(__pyx_d, name, ((PyASCIIObject *) name)->hash); __PYX_UPDATE_DICT_CACHE(__pyx_d, result, *dict_cached_value, *dict_version) if (likely(result)) { return __Pyx_NewRef(result); } else if (unlikely(PyErr_Occurred())) { return NULL; } #else result = PyDict_GetItem(__pyx_d, name); __PYX_UPDATE_DICT_CACHE(__pyx_d, result, *dict_cached_value, *dict_version) if (likely(result)) { return __Pyx_NewRef(result); } #endif #else result = PyObject_GetItem(__pyx_d, name); __PYX_UPDATE_DICT_CACHE(__pyx_d, result, *dict_cached_value, *dict_version) if (likely(result)) { return __Pyx_NewRef(result); } PyErr_Clear(); #endif return __Pyx_GetBuiltinName(name); } /* Import */ static PyObject *__Pyx_Import(PyObject *name, PyObject *from_list, int level) { PyObject *empty_list = 0; PyObject *module = 0; PyObject *global_dict = 0; PyObject *empty_dict = 0; PyObject *list; #if PY_MAJOR_VERSION < 3 PyObject *py_import; py_import = __Pyx_PyObject_GetAttrStr(__pyx_b, __pyx_n_s_import); if (!py_import) goto bad; #endif if (from_list) list = from_list; else { empty_list = PyList_New(0); if (!empty_list) goto bad; list = empty_list; } global_dict = PyModule_GetDict(__pyx_m); if (!global_dict) goto bad; empty_dict = PyDict_New(); if (!empty_dict) goto bad; { #if PY_MAJOR_VERSION >= 3 if (level == -1) { if (strchr(__Pyx_MODULE_NAME, '.')) { module = PyImport_ImportModuleLevelObject( name, global_dict, empty_dict, list, 1); if (!module) { if (!PyErr_ExceptionMatches(PyExc_ImportError)) goto bad; PyErr_Clear(); } } level = 0; } #endif if (!module) { #if PY_MAJOR_VERSION < 3 PyObject *py_level = PyInt_FromLong(level); if (!py_level) goto bad; module = PyObject_CallFunctionObjArgs(py_import, name, global_dict, empty_dict, list, py_level, (PyObject *)NULL); Py_DECREF(py_level); #else module = PyImport_ImportModuleLevelObject( name, global_dict, empty_dict, list, level); #endif } } bad: #if PY_MAJOR_VERSION < 3 Py_XDECREF(py_import); #endif Py_XDECREF(empty_list); Py_XDECREF(empty_dict); return module; } /* ImportFrom */ static PyObject* __Pyx_ImportFrom(PyObject* module, PyObject* name) { PyObject* value = __Pyx_PyObject_GetAttrStr(module, name); if (unlikely(!value) && PyErr_ExceptionMatches(PyExc_AttributeError)) { PyErr_Format(PyExc_ImportError, #if PY_MAJOR_VERSION < 3 "cannot import name %.230s", PyString_AS_STRING(name)); #else "cannot import name %S", name); #endif } return value; } /* HasAttr */ static CYTHON_INLINE int __Pyx_HasAttr(PyObject *o, PyObject *n) { PyObject *r; if (unlikely(!__Pyx_PyBaseString_Check(n))) { PyErr_SetString(PyExc_TypeError, "hasattr(): attribute name must be string"); return -1; } r = __Pyx_GetAttr(o, n); if (unlikely(!r)) { PyErr_Clear(); return 0; } else { Py_DECREF(r); return 1; } } /* PyObject_GenericGetAttrNoDict 
*/ #if CYTHON_USE_TYPE_SLOTS && CYTHON_USE_PYTYPE_LOOKUP && PY_VERSION_HEX < 0x03070000 static PyObject *__Pyx_RaiseGenericGetAttributeError(PyTypeObject *tp, PyObject *attr_name) { PyErr_Format(PyExc_AttributeError, #if PY_MAJOR_VERSION >= 3 "'%.50s' object has no attribute '%U'", tp->tp_name, attr_name); #else "'%.50s' object has no attribute '%.400s'", tp->tp_name, PyString_AS_STRING(attr_name)); #endif return NULL; } static CYTHON_INLINE PyObject* __Pyx_PyObject_GenericGetAttrNoDict(PyObject* obj, PyObject* attr_name) { PyObject *descr; PyTypeObject *tp = Py_TYPE(obj); if (unlikely(!PyString_Check(attr_name))) { return PyObject_GenericGetAttr(obj, attr_name); } assert(!tp->tp_dictoffset); descr = _PyType_Lookup(tp, attr_name); if (unlikely(!descr)) { return __Pyx_RaiseGenericGetAttributeError(tp, attr_name); } Py_INCREF(descr); #if PY_MAJOR_VERSION < 3 if (likely(PyType_HasFeature(Py_TYPE(descr), Py_TPFLAGS_HAVE_CLASS))) #endif { descrgetfunc f = Py_TYPE(descr)->tp_descr_get; if (unlikely(f)) { PyObject *res = f(descr, obj, (PyObject *)tp); Py_DECREF(descr); return res; } } return descr; } #endif /* PyObject_GenericGetAttr */ #if CYTHON_USE_TYPE_SLOTS && CYTHON_USE_PYTYPE_LOOKUP && PY_VERSION_HEX < 0x03070000 static PyObject* __Pyx_PyObject_GenericGetAttr(PyObject* obj, PyObject* attr_name) { if (unlikely(Py_TYPE(obj)->tp_dictoffset)) { return PyObject_GenericGetAttr(obj, attr_name); } return __Pyx_PyObject_GenericGetAttrNoDict(obj, attr_name); } #endif /* SetVTable */ static int __Pyx_SetVtable(PyObject *dict, void *vtable) { #if PY_VERSION_HEX >= 0x02070000 PyObject *ob = PyCapsule_New(vtable, 0, 0); #else PyObject *ob = PyCObject_FromVoidPtr(vtable, 0); #endif if (!ob) goto bad; if (PyDict_SetItem(dict, __pyx_n_s_pyx_vtable, ob) < 0) goto bad; Py_DECREF(ob); return 0; bad: Py_XDECREF(ob); return -1; } /* SetupReduce */ static int __Pyx_setup_reduce_is_named(PyObject* meth, PyObject* name) { int ret; PyObject *name_attr; name_attr = __Pyx_PyObject_GetAttrStr(meth, __pyx_n_s_name); if (likely(name_attr)) { ret = PyObject_RichCompareBool(name_attr, name, Py_EQ); } else { ret = -1; } if (unlikely(ret < 0)) { PyErr_Clear(); ret = 0; } Py_XDECREF(name_attr); return ret; } static int __Pyx_setup_reduce(PyObject* type_obj) { int ret = 0; PyObject *object_reduce = NULL; PyObject *object_reduce_ex = NULL; PyObject *reduce = NULL; PyObject *reduce_ex = NULL; PyObject *reduce_cython = NULL; PyObject *setstate = NULL; PyObject *setstate_cython = NULL; #if CYTHON_USE_PYTYPE_LOOKUP if (_PyType_Lookup((PyTypeObject*)type_obj, __pyx_n_s_getstate)) goto GOOD; #else if (PyObject_HasAttr(type_obj, __pyx_n_s_getstate)) goto GOOD; #endif #if CYTHON_USE_PYTYPE_LOOKUP object_reduce_ex = _PyType_Lookup(&PyBaseObject_Type, __pyx_n_s_reduce_ex); if (!object_reduce_ex) goto BAD; #else object_reduce_ex = __Pyx_PyObject_GetAttrStr((PyObject*)&PyBaseObject_Type, __pyx_n_s_reduce_ex); if (!object_reduce_ex) goto BAD; #endif reduce_ex = __Pyx_PyObject_GetAttrStr(type_obj, __pyx_n_s_reduce_ex); if (unlikely(!reduce_ex)) goto BAD; if (reduce_ex == object_reduce_ex) { #if CYTHON_USE_PYTYPE_LOOKUP object_reduce = _PyType_Lookup(&PyBaseObject_Type, __pyx_n_s_reduce); if (!object_reduce) goto BAD; #else object_reduce = __Pyx_PyObject_GetAttrStr((PyObject*)&PyBaseObject_Type, __pyx_n_s_reduce); if (!object_reduce) goto BAD; #endif reduce = __Pyx_PyObject_GetAttrStr(type_obj, __pyx_n_s_reduce); if (unlikely(!reduce)) goto BAD; if (reduce == object_reduce || __Pyx_setup_reduce_is_named(reduce, __pyx_n_s_reduce_cython)) { 
reduce_cython = __Pyx_PyObject_GetAttrStr(type_obj, __pyx_n_s_reduce_cython); if (unlikely(!reduce_cython)) goto BAD; ret = PyDict_SetItem(((PyTypeObject*)type_obj)->tp_dict, __pyx_n_s_reduce, reduce_cython); if (unlikely(ret < 0)) goto BAD; ret = PyDict_DelItem(((PyTypeObject*)type_obj)->tp_dict, __pyx_n_s_reduce_cython); if (unlikely(ret < 0)) goto BAD; setstate = __Pyx_PyObject_GetAttrStr(type_obj, __pyx_n_s_setstate); if (!setstate) PyErr_Clear(); if (!setstate || __Pyx_setup_reduce_is_named(setstate, __pyx_n_s_setstate_cython)) { setstate_cython = __Pyx_PyObject_GetAttrStr(type_obj, __pyx_n_s_setstate_cython); if (unlikely(!setstate_cython)) goto BAD; ret = PyDict_SetItem(((PyTypeObject*)type_obj)->tp_dict, __pyx_n_s_setstate, setstate_cython); if (unlikely(ret < 0)) goto BAD; ret = PyDict_DelItem(((PyTypeObject*)type_obj)->tp_dict, __pyx_n_s_setstate_cython); if (unlikely(ret < 0)) goto BAD; } PyType_Modified((PyTypeObject*)type_obj); } } goto GOOD; BAD: if (!PyErr_Occurred()) PyErr_Format(PyExc_RuntimeError, "Unable to initialize pickling for %s", ((PyTypeObject*)type_obj)->tp_name); ret = -1; GOOD: #if !CYTHON_USE_PYTYPE_LOOKUP Py_XDECREF(object_reduce); Py_XDECREF(object_reduce_ex); #endif Py_XDECREF(reduce); Py_XDECREF(reduce_ex); Py_XDECREF(reduce_cython); Py_XDECREF(setstate); Py_XDECREF(setstate_cython); return ret; } /* CLineInTraceback */ #ifndef CYTHON_CLINE_IN_TRACEBACK static int __Pyx_CLineForTraceback(PyThreadState *tstate, int c_line) { PyObject *use_cline; PyObject *ptype, *pvalue, *ptraceback; #if CYTHON_COMPILING_IN_CPYTHON PyObject **cython_runtime_dict; #endif if (unlikely(!__pyx_cython_runtime)) { return c_line; } __Pyx_ErrFetchInState(tstate, &ptype, &pvalue, &ptraceback); #if CYTHON_COMPILING_IN_CPYTHON cython_runtime_dict = _PyObject_GetDictPtr(__pyx_cython_runtime); if (likely(cython_runtime_dict)) { __PYX_PY_DICT_LOOKUP_IF_MODIFIED( use_cline, *cython_runtime_dict, __Pyx_PyDict_GetItemStr(*cython_runtime_dict, __pyx_n_s_cline_in_traceback)) } else #endif { PyObject *use_cline_obj = __Pyx_PyObject_GetAttrStr(__pyx_cython_runtime, __pyx_n_s_cline_in_traceback); if (use_cline_obj) { use_cline = PyObject_Not(use_cline_obj) ? 
Py_False : Py_True; Py_DECREF(use_cline_obj); } else { PyErr_Clear(); use_cline = NULL; } } if (!use_cline) { c_line = 0; PyObject_SetAttr(__pyx_cython_runtime, __pyx_n_s_cline_in_traceback, Py_False); } else if (use_cline == Py_False || (use_cline != Py_True && PyObject_Not(use_cline) != 0)) { c_line = 0; } __Pyx_ErrRestoreInState(tstate, ptype, pvalue, ptraceback); return c_line; } #endif /* CodeObjectCache */ static int __pyx_bisect_code_objects(__Pyx_CodeObjectCacheEntry* entries, int count, int code_line) { int start = 0, mid = 0, end = count - 1; if (end >= 0 && code_line > entries[end].code_line) { return count; } while (start < end) { mid = start + (end - start) / 2; if (code_line < entries[mid].code_line) { end = mid; } else if (code_line > entries[mid].code_line) { start = mid + 1; } else { return mid; } } if (code_line <= entries[mid].code_line) { return mid; } else { return mid + 1; } } static PyCodeObject *__pyx_find_code_object(int code_line) { PyCodeObject* code_object; int pos; if (unlikely(!code_line) || unlikely(!__pyx_code_cache.entries)) { return NULL; } pos = __pyx_bisect_code_objects(__pyx_code_cache.entries, __pyx_code_cache.count, code_line); if (unlikely(pos >= __pyx_code_cache.count) || unlikely(__pyx_code_cache.entries[pos].code_line != code_line)) { return NULL; } code_object = __pyx_code_cache.entries[pos].code_object; Py_INCREF(code_object); return code_object; } static void __pyx_insert_code_object(int code_line, PyCodeObject* code_object) { int pos, i; __Pyx_CodeObjectCacheEntry* entries = __pyx_code_cache.entries; if (unlikely(!code_line)) { return; } if (unlikely(!entries)) { entries = (__Pyx_CodeObjectCacheEntry*)PyMem_Malloc(64*sizeof(__Pyx_CodeObjectCacheEntry)); if (likely(entries)) { __pyx_code_cache.entries = entries; __pyx_code_cache.max_count = 64; __pyx_code_cache.count = 1; entries[0].code_line = code_line; entries[0].code_object = code_object; Py_INCREF(code_object); } return; } pos = __pyx_bisect_code_objects(__pyx_code_cache.entries, __pyx_code_cache.count, code_line); if ((pos < __pyx_code_cache.count) && unlikely(__pyx_code_cache.entries[pos].code_line == code_line)) { PyCodeObject* tmp = entries[pos].code_object; entries[pos].code_object = code_object; Py_DECREF(tmp); return; } if (__pyx_code_cache.count == __pyx_code_cache.max_count) { int new_max = __pyx_code_cache.max_count + 64; entries = (__Pyx_CodeObjectCacheEntry*)PyMem_Realloc( __pyx_code_cache.entries, (size_t)new_max*sizeof(__Pyx_CodeObjectCacheEntry)); if (unlikely(!entries)) { return; } __pyx_code_cache.entries = entries; __pyx_code_cache.max_count = new_max; } for (i=__pyx_code_cache.count; i>pos; i--) { entries[i] = entries[i-1]; } entries[pos].code_line = code_line; entries[pos].code_object = code_object; __pyx_code_cache.count++; Py_INCREF(code_object); } /* AddTraceback */ #include "compile.h" #include "frameobject.h" #include "traceback.h" static PyCodeObject* __Pyx_CreateCodeObjectForTraceback( const char *funcname, int c_line, int py_line, const char *filename) { PyCodeObject *py_code = 0; PyObject *py_srcfile = 0; PyObject *py_funcname = 0; #if PY_MAJOR_VERSION < 3 py_srcfile = PyString_FromString(filename); #else py_srcfile = PyUnicode_FromString(filename); #endif if (!py_srcfile) goto bad; if (c_line) { #if PY_MAJOR_VERSION < 3 py_funcname = PyString_FromFormat( "%s (%s:%d)", funcname, __pyx_cfilenm, c_line); #else py_funcname = PyUnicode_FromFormat( "%s (%s:%d)", funcname, __pyx_cfilenm, c_line); #endif } else { #if PY_MAJOR_VERSION < 3 py_funcname = 
PyString_FromString(funcname); #else py_funcname = PyUnicode_FromString(funcname); #endif } if (!py_funcname) goto bad; py_code = __Pyx_PyCode_New( 0, 0, 0, 0, 0, __pyx_empty_bytes, /*PyObject *code,*/ __pyx_empty_tuple, /*PyObject *consts,*/ __pyx_empty_tuple, /*PyObject *names,*/ __pyx_empty_tuple, /*PyObject *varnames,*/ __pyx_empty_tuple, /*PyObject *freevars,*/ __pyx_empty_tuple, /*PyObject *cellvars,*/ py_srcfile, /*PyObject *filename,*/ py_funcname, /*PyObject *name,*/ py_line, __pyx_empty_bytes /*PyObject *lnotab*/ ); Py_DECREF(py_srcfile); Py_DECREF(py_funcname); return py_code; bad: Py_XDECREF(py_srcfile); Py_XDECREF(py_funcname); return NULL; } static void __Pyx_AddTraceback(const char *funcname, int c_line, int py_line, const char *filename) { PyCodeObject *py_code = 0; PyFrameObject *py_frame = 0; PyThreadState *tstate = __Pyx_PyThreadState_Current; if (c_line) { c_line = __Pyx_CLineForTraceback(tstate, c_line); } py_code = __pyx_find_code_object(c_line ? -c_line : py_line); if (!py_code) { py_code = __Pyx_CreateCodeObjectForTraceback( funcname, c_line, py_line, filename); if (!py_code) goto bad; __pyx_insert_code_object(c_line ? -c_line : py_line, py_code); } py_frame = PyFrame_New( tstate, /*PyThreadState *tstate,*/ py_code, /*PyCodeObject *code,*/ __pyx_d, /*PyObject *globals,*/ 0 /*PyObject *locals*/ ); if (!py_frame) goto bad; __Pyx_PyFrame_SetLineNumber(py_frame, py_line); PyTraceBack_Here(py_frame); bad: Py_XDECREF(py_code); Py_XDECREF(py_frame); } /* CIntToPy */ static CYTHON_INLINE PyObject* __Pyx_PyInt_From_int(int value) { const int neg_one = (int) ((int) 0 - (int) 1), const_zero = (int) 0; const int is_unsigned = neg_one > const_zero; if (is_unsigned) { if (sizeof(int) < sizeof(long)) { return PyInt_FromLong((long) value); } else if (sizeof(int) <= sizeof(unsigned long)) { return PyLong_FromUnsignedLong((unsigned long) value); #ifdef HAVE_LONG_LONG } else if (sizeof(int) <= sizeof(unsigned PY_LONG_LONG)) { return PyLong_FromUnsignedLongLong((unsigned PY_LONG_LONG) value); #endif } } else { if (sizeof(int) <= sizeof(long)) { return PyInt_FromLong((long) value); #ifdef HAVE_LONG_LONG } else if (sizeof(int) <= sizeof(PY_LONG_LONG)) { return PyLong_FromLongLong((PY_LONG_LONG) value); #endif } } { int one = 1; int little = (int)*(unsigned char *)&one; unsigned char *bytes = (unsigned char *)&value; return _PyLong_FromByteArray(bytes, sizeof(int), little, !is_unsigned); } } /* CIntFromPyVerify */ #define __PYX_VERIFY_RETURN_INT(target_type, func_type, func_value)\ __PYX__VERIFY_RETURN_INT(target_type, func_type, func_value, 0) #define __PYX_VERIFY_RETURN_INT_EXC(target_type, func_type, func_value)\ __PYX__VERIFY_RETURN_INT(target_type, func_type, func_value, 1) #define __PYX__VERIFY_RETURN_INT(target_type, func_type, func_value, exc)\ {\ func_type value = func_value;\ if (sizeof(target_type) < sizeof(func_type)) {\ if (unlikely(value != (func_type) (target_type) value)) {\ func_type zero = 0;\ if (exc && unlikely(value == (func_type)-1 && PyErr_Occurred()))\ return (target_type) -1;\ if (is_unsigned && unlikely(value < zero))\ goto raise_neg_overflow;\ else\ goto raise_overflow;\ }\ }\ return (target_type) value;\ } /* CIntToPy */ static CYTHON_INLINE PyObject* __Pyx_PyInt_From_long(long value) { const long neg_one = (long) ((long) 0 - (long) 1), const_zero = (long) 0; const int is_unsigned = neg_one > const_zero; if (is_unsigned) { if (sizeof(long) < sizeof(long)) { return PyInt_FromLong((long) value); } else if (sizeof(long) <= sizeof(unsigned long)) { return 
PyLong_FromUnsignedLong((unsigned long) value); #ifdef HAVE_LONG_LONG } else if (sizeof(long) <= sizeof(unsigned PY_LONG_LONG)) { return PyLong_FromUnsignedLongLong((unsigned PY_LONG_LONG) value); #endif } } else { if (sizeof(long) <= sizeof(long)) { return PyInt_FromLong((long) value); #ifdef HAVE_LONG_LONG } else if (sizeof(long) <= sizeof(PY_LONG_LONG)) { return PyLong_FromLongLong((PY_LONG_LONG) value); #endif } } { int one = 1; int little = (int)*(unsigned char *)&one; unsigned char *bytes = (unsigned char *)&value; return _PyLong_FromByteArray(bytes, sizeof(long), little, !is_unsigned); } } /* CIntFromPy */ static CYTHON_INLINE long __Pyx_PyInt_As_long(PyObject *x) { const long neg_one = (long) ((long) 0 - (long) 1), const_zero = (long) 0; const int is_unsigned = neg_one > const_zero; #if PY_MAJOR_VERSION < 3 if (likely(PyInt_Check(x))) { if (sizeof(long) < sizeof(long)) { __PYX_VERIFY_RETURN_INT(long, long, PyInt_AS_LONG(x)) } else { long val = PyInt_AS_LONG(x); if (is_unsigned && unlikely(val < 0)) { goto raise_neg_overflow; } return (long) val; } } else #endif if (likely(PyLong_Check(x))) { if (is_unsigned) { #if CYTHON_USE_PYLONG_INTERNALS const digit* digits = ((PyLongObject*)x)->ob_digit; switch (Py_SIZE(x)) { case 0: return (long) 0; case 1: __PYX_VERIFY_RETURN_INT(long, digit, digits[0]) case 2: if (8 * sizeof(long) > 1 * PyLong_SHIFT) { if (8 * sizeof(unsigned long) > 2 * PyLong_SHIFT) { __PYX_VERIFY_RETURN_INT(long, unsigned long, (((((unsigned long)digits[1]) << PyLong_SHIFT) | (unsigned long)digits[0]))) } else if (8 * sizeof(long) >= 2 * PyLong_SHIFT) { return (long) (((((long)digits[1]) << PyLong_SHIFT) | (long)digits[0])); } } break; case 3: if (8 * sizeof(long) > 2 * PyLong_SHIFT) { if (8 * sizeof(unsigned long) > 3 * PyLong_SHIFT) { __PYX_VERIFY_RETURN_INT(long, unsigned long, (((((((unsigned long)digits[2]) << PyLong_SHIFT) | (unsigned long)digits[1]) << PyLong_SHIFT) | (unsigned long)digits[0]))) } else if (8 * sizeof(long) >= 3 * PyLong_SHIFT) { return (long) (((((((long)digits[2]) << PyLong_SHIFT) | (long)digits[1]) << PyLong_SHIFT) | (long)digits[0])); } } break; case 4: if (8 * sizeof(long) > 3 * PyLong_SHIFT) { if (8 * sizeof(unsigned long) > 4 * PyLong_SHIFT) { __PYX_VERIFY_RETURN_INT(long, unsigned long, (((((((((unsigned long)digits[3]) << PyLong_SHIFT) | (unsigned long)digits[2]) << PyLong_SHIFT) | (unsigned long)digits[1]) << PyLong_SHIFT) | (unsigned long)digits[0]))) } else if (8 * sizeof(long) >= 4 * PyLong_SHIFT) { return (long) (((((((((long)digits[3]) << PyLong_SHIFT) | (long)digits[2]) << PyLong_SHIFT) | (long)digits[1]) << PyLong_SHIFT) | (long)digits[0])); } } break; } #endif #if CYTHON_COMPILING_IN_CPYTHON if (unlikely(Py_SIZE(x) < 0)) { goto raise_neg_overflow; } #else { int result = PyObject_RichCompareBool(x, Py_False, Py_LT); if (unlikely(result < 0)) return (long) -1; if (unlikely(result == 1)) goto raise_neg_overflow; } #endif if (sizeof(long) <= sizeof(unsigned long)) { __PYX_VERIFY_RETURN_INT_EXC(long, unsigned long, PyLong_AsUnsignedLong(x)) #ifdef HAVE_LONG_LONG } else if (sizeof(long) <= sizeof(unsigned PY_LONG_LONG)) { __PYX_VERIFY_RETURN_INT_EXC(long, unsigned PY_LONG_LONG, PyLong_AsUnsignedLongLong(x)) #endif } } else { #if CYTHON_USE_PYLONG_INTERNALS const digit* digits = ((PyLongObject*)x)->ob_digit; switch (Py_SIZE(x)) { case 0: return (long) 0; case -1: __PYX_VERIFY_RETURN_INT(long, sdigit, (sdigit) (-(sdigit)digits[0])) case 1: __PYX_VERIFY_RETURN_INT(long, digit, +digits[0]) case -2: if (8 * sizeof(long) - 1 > 1 * 
PyLong_SHIFT) { if (8 * sizeof(unsigned long) > 2 * PyLong_SHIFT) { __PYX_VERIFY_RETURN_INT(long, long, -(long) (((((unsigned long)digits[1]) << PyLong_SHIFT) | (unsigned long)digits[0]))) } else if (8 * sizeof(long) - 1 > 2 * PyLong_SHIFT) { return (long) (((long)-1)*(((((long)digits[1]) << PyLong_SHIFT) | (long)digits[0]))); } } break; case 2: if (8 * sizeof(long) > 1 * PyLong_SHIFT) { if (8 * sizeof(unsigned long) > 2 * PyLong_SHIFT) { __PYX_VERIFY_RETURN_INT(long, unsigned long, (((((unsigned long)digits[1]) << PyLong_SHIFT) | (unsigned long)digits[0]))) } else if (8 * sizeof(long) - 1 > 2 * PyLong_SHIFT) { return (long) ((((((long)digits[1]) << PyLong_SHIFT) | (long)digits[0]))); } } break; case -3: if (8 * sizeof(long) - 1 > 2 * PyLong_SHIFT) { if (8 * sizeof(unsigned long) > 3 * PyLong_SHIFT) { __PYX_VERIFY_RETURN_INT(long, long, -(long) (((((((unsigned long)digits[2]) << PyLong_SHIFT) | (unsigned long)digits[1]) << PyLong_SHIFT) | (unsigned long)digits[0]))) } else if (8 * sizeof(long) - 1 > 3 * PyLong_SHIFT) { return (long) (((long)-1)*(((((((long)digits[2]) << PyLong_SHIFT) | (long)digits[1]) << PyLong_SHIFT) | (long)digits[0]))); } } break; case 3: if (8 * sizeof(long) > 2 * PyLong_SHIFT) { if (8 * sizeof(unsigned long) > 3 * PyLong_SHIFT) { __PYX_VERIFY_RETURN_INT(long, unsigned long, (((((((unsigned long)digits[2]) << PyLong_SHIFT) | (unsigned long)digits[1]) << PyLong_SHIFT) | (unsigned long)digits[0]))) } else if (8 * sizeof(long) - 1 > 3 * PyLong_SHIFT) { return (long) ((((((((long)digits[2]) << PyLong_SHIFT) | (long)digits[1]) << PyLong_SHIFT) | (long)digits[0]))); } } break; case -4: if (8 * sizeof(long) - 1 > 3 * PyLong_SHIFT) { if (8 * sizeof(unsigned long) > 4 * PyLong_SHIFT) { __PYX_VERIFY_RETURN_INT(long, long, -(long) (((((((((unsigned long)digits[3]) << PyLong_SHIFT) | (unsigned long)digits[2]) << PyLong_SHIFT) | (unsigned long)digits[1]) << PyLong_SHIFT) | (unsigned long)digits[0]))) } else if (8 * sizeof(long) - 1 > 4 * PyLong_SHIFT) { return (long) (((long)-1)*(((((((((long)digits[3]) << PyLong_SHIFT) | (long)digits[2]) << PyLong_SHIFT) | (long)digits[1]) << PyLong_SHIFT) | (long)digits[0]))); } } break; case 4: if (8 * sizeof(long) > 3 * PyLong_SHIFT) { if (8 * sizeof(unsigned long) > 4 * PyLong_SHIFT) { __PYX_VERIFY_RETURN_INT(long, unsigned long, (((((((((unsigned long)digits[3]) << PyLong_SHIFT) | (unsigned long)digits[2]) << PyLong_SHIFT) | (unsigned long)digits[1]) << PyLong_SHIFT) | (unsigned long)digits[0]))) } else if (8 * sizeof(long) - 1 > 4 * PyLong_SHIFT) { return (long) ((((((((((long)digits[3]) << PyLong_SHIFT) | (long)digits[2]) << PyLong_SHIFT) | (long)digits[1]) << PyLong_SHIFT) | (long)digits[0]))); } } break; } #endif if (sizeof(long) <= sizeof(long)) { __PYX_VERIFY_RETURN_INT_EXC(long, long, PyLong_AsLong(x)) #ifdef HAVE_LONG_LONG } else if (sizeof(long) <= sizeof(PY_LONG_LONG)) { __PYX_VERIFY_RETURN_INT_EXC(long, PY_LONG_LONG, PyLong_AsLongLong(x)) #endif } } { #if CYTHON_COMPILING_IN_PYPY && !defined(_PyLong_AsByteArray) PyErr_SetString(PyExc_RuntimeError, "_PyLong_AsByteArray() not available in PyPy, cannot convert large numbers"); #else long val; PyObject *v = __Pyx_PyNumber_IntOrLong(x); #if PY_MAJOR_VERSION < 3 if (likely(v) && !PyLong_Check(v)) { PyObject *tmp = v; v = PyNumber_Long(tmp); Py_DECREF(tmp); } #endif if (likely(v)) { int one = 1; int is_little = (int)*(unsigned char *)&one; unsigned char *bytes = (unsigned char *)&val; int ret = _PyLong_AsByteArray((PyLongObject *)v, bytes, sizeof(val), is_little, !is_unsigned); 
Py_DECREF(v); if (likely(!ret)) return val; } #endif return (long) -1; } } else { long val; PyObject *tmp = __Pyx_PyNumber_IntOrLong(x); if (!tmp) return (long) -1; val = __Pyx_PyInt_As_long(tmp); Py_DECREF(tmp); return val; } raise_overflow: PyErr_SetString(PyExc_OverflowError, "value too large to convert to long"); return (long) -1; raise_neg_overflow: PyErr_SetString(PyExc_OverflowError, "can't convert negative value to long"); return (long) -1; } /* CIntFromPy */ static CYTHON_INLINE int __Pyx_PyInt_As_int(PyObject *x) { const int neg_one = (int) ((int) 0 - (int) 1), const_zero = (int) 0; const int is_unsigned = neg_one > const_zero; #if PY_MAJOR_VERSION < 3 if (likely(PyInt_Check(x))) { if (sizeof(int) < sizeof(long)) { __PYX_VERIFY_RETURN_INT(int, long, PyInt_AS_LONG(x)) } else { long val = PyInt_AS_LONG(x); if (is_unsigned && unlikely(val < 0)) { goto raise_neg_overflow; } return (int) val; } } else #endif if (likely(PyLong_Check(x))) { if (is_unsigned) { #if CYTHON_USE_PYLONG_INTERNALS const digit* digits = ((PyLongObject*)x)->ob_digit; switch (Py_SIZE(x)) { case 0: return (int) 0; case 1: __PYX_VERIFY_RETURN_INT(int, digit, digits[0]) case 2: if (8 * sizeof(int) > 1 * PyLong_SHIFT) { if (8 * sizeof(unsigned long) > 2 * PyLong_SHIFT) { __PYX_VERIFY_RETURN_INT(int, unsigned long, (((((unsigned long)digits[1]) << PyLong_SHIFT) | (unsigned long)digits[0]))) } else if (8 * sizeof(int) >= 2 * PyLong_SHIFT) { return (int) (((((int)digits[1]) << PyLong_SHIFT) | (int)digits[0])); } } break; case 3: if (8 * sizeof(int) > 2 * PyLong_SHIFT) { if (8 * sizeof(unsigned long) > 3 * PyLong_SHIFT) { __PYX_VERIFY_RETURN_INT(int, unsigned long, (((((((unsigned long)digits[2]) << PyLong_SHIFT) | (unsigned long)digits[1]) << PyLong_SHIFT) | (unsigned long)digits[0]))) } else if (8 * sizeof(int) >= 3 * PyLong_SHIFT) { return (int) (((((((int)digits[2]) << PyLong_SHIFT) | (int)digits[1]) << PyLong_SHIFT) | (int)digits[0])); } } break; case 4: if (8 * sizeof(int) > 3 * PyLong_SHIFT) { if (8 * sizeof(unsigned long) > 4 * PyLong_SHIFT) { __PYX_VERIFY_RETURN_INT(int, unsigned long, (((((((((unsigned long)digits[3]) << PyLong_SHIFT) | (unsigned long)digits[2]) << PyLong_SHIFT) | (unsigned long)digits[1]) << PyLong_SHIFT) | (unsigned long)digits[0]))) } else if (8 * sizeof(int) >= 4 * PyLong_SHIFT) { return (int) (((((((((int)digits[3]) << PyLong_SHIFT) | (int)digits[2]) << PyLong_SHIFT) | (int)digits[1]) << PyLong_SHIFT) | (int)digits[0])); } } break; } #endif #if CYTHON_COMPILING_IN_CPYTHON if (unlikely(Py_SIZE(x) < 0)) { goto raise_neg_overflow; } #else { int result = PyObject_RichCompareBool(x, Py_False, Py_LT); if (unlikely(result < 0)) return (int) -1; if (unlikely(result == 1)) goto raise_neg_overflow; } #endif if (sizeof(int) <= sizeof(unsigned long)) { __PYX_VERIFY_RETURN_INT_EXC(int, unsigned long, PyLong_AsUnsignedLong(x)) #ifdef HAVE_LONG_LONG } else if (sizeof(int) <= sizeof(unsigned PY_LONG_LONG)) { __PYX_VERIFY_RETURN_INT_EXC(int, unsigned PY_LONG_LONG, PyLong_AsUnsignedLongLong(x)) #endif } } else { #if CYTHON_USE_PYLONG_INTERNALS const digit* digits = ((PyLongObject*)x)->ob_digit; switch (Py_SIZE(x)) { case 0: return (int) 0; case -1: __PYX_VERIFY_RETURN_INT(int, sdigit, (sdigit) (-(sdigit)digits[0])) case 1: __PYX_VERIFY_RETURN_INT(int, digit, +digits[0]) case -2: if (8 * sizeof(int) - 1 > 1 * PyLong_SHIFT) { if (8 * sizeof(unsigned long) > 2 * PyLong_SHIFT) { __PYX_VERIFY_RETURN_INT(int, long, -(long) (((((unsigned long)digits[1]) << PyLong_SHIFT) | (unsigned long)digits[0]))) } else if (8 * 
sizeof(int) - 1 > 2 * PyLong_SHIFT) { return (int) (((int)-1)*(((((int)digits[1]) << PyLong_SHIFT) | (int)digits[0]))); } } break; case 2: if (8 * sizeof(int) > 1 * PyLong_SHIFT) { if (8 * sizeof(unsigned long) > 2 * PyLong_SHIFT) { __PYX_VERIFY_RETURN_INT(int, unsigned long, (((((unsigned long)digits[1]) << PyLong_SHIFT) | (unsigned long)digits[0]))) } else if (8 * sizeof(int) - 1 > 2 * PyLong_SHIFT) { return (int) ((((((int)digits[1]) << PyLong_SHIFT) | (int)digits[0]))); } } break; case -3: if (8 * sizeof(int) - 1 > 2 * PyLong_SHIFT) { if (8 * sizeof(unsigned long) > 3 * PyLong_SHIFT) { __PYX_VERIFY_RETURN_INT(int, long, -(long) (((((((unsigned long)digits[2]) << PyLong_SHIFT) | (unsigned long)digits[1]) << PyLong_SHIFT) | (unsigned long)digits[0]))) } else if (8 * sizeof(int) - 1 > 3 * PyLong_SHIFT) { return (int) (((int)-1)*(((((((int)digits[2]) << PyLong_SHIFT) | (int)digits[1]) << PyLong_SHIFT) | (int)digits[0]))); } } break; case 3: if (8 * sizeof(int) > 2 * PyLong_SHIFT) { if (8 * sizeof(unsigned long) > 3 * PyLong_SHIFT) { __PYX_VERIFY_RETURN_INT(int, unsigned long, (((((((unsigned long)digits[2]) << PyLong_SHIFT) | (unsigned long)digits[1]) << PyLong_SHIFT) | (unsigned long)digits[0]))) } else if (8 * sizeof(int) - 1 > 3 * PyLong_SHIFT) { return (int) ((((((((int)digits[2]) << PyLong_SHIFT) | (int)digits[1]) << PyLong_SHIFT) | (int)digits[0]))); } } break; case -4: if (8 * sizeof(int) - 1 > 3 * PyLong_SHIFT) { if (8 * sizeof(unsigned long) > 4 * PyLong_SHIFT) { __PYX_VERIFY_RETURN_INT(int, long, -(long) (((((((((unsigned long)digits[3]) << PyLong_SHIFT) | (unsigned long)digits[2]) << PyLong_SHIFT) | (unsigned long)digits[1]) << PyLong_SHIFT) | (unsigned long)digits[0]))) } else if (8 * sizeof(int) - 1 > 4 * PyLong_SHIFT) { return (int) (((int)-1)*(((((((((int)digits[3]) << PyLong_SHIFT) | (int)digits[2]) << PyLong_SHIFT) | (int)digits[1]) << PyLong_SHIFT) | (int)digits[0]))); } } break; case 4: if (8 * sizeof(int) > 3 * PyLong_SHIFT) { if (8 * sizeof(unsigned long) > 4 * PyLong_SHIFT) { __PYX_VERIFY_RETURN_INT(int, unsigned long, (((((((((unsigned long)digits[3]) << PyLong_SHIFT) | (unsigned long)digits[2]) << PyLong_SHIFT) | (unsigned long)digits[1]) << PyLong_SHIFT) | (unsigned long)digits[0]))) } else if (8 * sizeof(int) - 1 > 4 * PyLong_SHIFT) { return (int) ((((((((((int)digits[3]) << PyLong_SHIFT) | (int)digits[2]) << PyLong_SHIFT) | (int)digits[1]) << PyLong_SHIFT) | (int)digits[0]))); } } break; } #endif if (sizeof(int) <= sizeof(long)) { __PYX_VERIFY_RETURN_INT_EXC(int, long, PyLong_AsLong(x)) #ifdef HAVE_LONG_LONG } else if (sizeof(int) <= sizeof(PY_LONG_LONG)) { __PYX_VERIFY_RETURN_INT_EXC(int, PY_LONG_LONG, PyLong_AsLongLong(x)) #endif } } { #if CYTHON_COMPILING_IN_PYPY && !defined(_PyLong_AsByteArray) PyErr_SetString(PyExc_RuntimeError, "_PyLong_AsByteArray() not available in PyPy, cannot convert large numbers"); #else int val; PyObject *v = __Pyx_PyNumber_IntOrLong(x); #if PY_MAJOR_VERSION < 3 if (likely(v) && !PyLong_Check(v)) { PyObject *tmp = v; v = PyNumber_Long(tmp); Py_DECREF(tmp); } #endif if (likely(v)) { int one = 1; int is_little = (int)*(unsigned char *)&one; unsigned char *bytes = (unsigned char *)&val; int ret = _PyLong_AsByteArray((PyLongObject *)v, bytes, sizeof(val), is_little, !is_unsigned); Py_DECREF(v); if (likely(!ret)) return val; } #endif return (int) -1; } } else { int val; PyObject *tmp = __Pyx_PyNumber_IntOrLong(x); if (!tmp) return (int) -1; val = __Pyx_PyInt_As_int(tmp); Py_DECREF(tmp); return val; } raise_overflow: 
PyErr_SetString(PyExc_OverflowError, "value too large to convert to int"); return (int) -1; raise_neg_overflow: PyErr_SetString(PyExc_OverflowError, "can't convert negative value to int"); return (int) -1; } /* FastTypeChecks */ #if CYTHON_COMPILING_IN_CPYTHON static int __Pyx_InBases(PyTypeObject *a, PyTypeObject *b) { while (a) { a = a->tp_base; if (a == b) return 1; } return b == &PyBaseObject_Type; } static CYTHON_INLINE int __Pyx_IsSubtype(PyTypeObject *a, PyTypeObject *b) { PyObject *mro; if (a == b) return 1; mro = a->tp_mro; if (likely(mro)) { Py_ssize_t i, n; n = PyTuple_GET_SIZE(mro); for (i = 0; i < n; i++) { if (PyTuple_GET_ITEM(mro, i) == (PyObject *)b) return 1; } return 0; } return __Pyx_InBases(a, b); } #if PY_MAJOR_VERSION == 2 static int __Pyx_inner_PyErr_GivenExceptionMatches2(PyObject *err, PyObject* exc_type1, PyObject* exc_type2) { PyObject *exception, *value, *tb; int res; __Pyx_PyThreadState_declare __Pyx_PyThreadState_assign __Pyx_ErrFetch(&exception, &value, &tb); res = exc_type1 ? PyObject_IsSubclass(err, exc_type1) : 0; if (unlikely(res == -1)) { PyErr_WriteUnraisable(err); res = 0; } if (!res) { res = PyObject_IsSubclass(err, exc_type2); if (unlikely(res == -1)) { PyErr_WriteUnraisable(err); res = 0; } } __Pyx_ErrRestore(exception, value, tb); return res; } #else static CYTHON_INLINE int __Pyx_inner_PyErr_GivenExceptionMatches2(PyObject *err, PyObject* exc_type1, PyObject *exc_type2) { int res = exc_type1 ? __Pyx_IsSubtype((PyTypeObject*)err, (PyTypeObject*)exc_type1) : 0; if (!res) { res = __Pyx_IsSubtype((PyTypeObject*)err, (PyTypeObject*)exc_type2); } return res; } #endif static int __Pyx_PyErr_GivenExceptionMatchesTuple(PyObject *exc_type, PyObject *tuple) { Py_ssize_t i, n; assert(PyExceptionClass_Check(exc_type)); n = PyTuple_GET_SIZE(tuple); #if PY_MAJOR_VERSION >= 3 for (i=0; ip) { #if PY_MAJOR_VERSION < 3 if (t->is_unicode) { *t->p = PyUnicode_DecodeUTF8(t->s, t->n - 1, NULL); } else if (t->intern) { *t->p = PyString_InternFromString(t->s); } else { *t->p = PyString_FromStringAndSize(t->s, t->n - 1); } #else if (t->is_unicode | t->is_str) { if (t->intern) { *t->p = PyUnicode_InternFromString(t->s); } else if (t->encoding) { *t->p = PyUnicode_Decode(t->s, t->n - 1, t->encoding, NULL); } else { *t->p = PyUnicode_FromStringAndSize(t->s, t->n - 1); } } else { *t->p = PyBytes_FromStringAndSize(t->s, t->n - 1); } #endif if (!*t->p) return -1; if (PyObject_Hash(*t->p) == -1) return -1; ++t; } return 0; } static CYTHON_INLINE PyObject* __Pyx_PyUnicode_FromString(const char* c_str) { return __Pyx_PyUnicode_FromStringAndSize(c_str, (Py_ssize_t)strlen(c_str)); } static CYTHON_INLINE const char* __Pyx_PyObject_AsString(PyObject* o) { Py_ssize_t ignore; return __Pyx_PyObject_AsStringAndSize(o, &ignore); } #if __PYX_DEFAULT_STRING_ENCODING_IS_ASCII || __PYX_DEFAULT_STRING_ENCODING_IS_DEFAULT #if !CYTHON_PEP393_ENABLED static const char* __Pyx_PyUnicode_AsStringAndSize(PyObject* o, Py_ssize_t *length) { char* defenc_c; PyObject* defenc = _PyUnicode_AsDefaultEncodedString(o, NULL); if (!defenc) return NULL; defenc_c = PyBytes_AS_STRING(defenc); #if __PYX_DEFAULT_STRING_ENCODING_IS_ASCII { char* end = defenc_c + PyBytes_GET_SIZE(defenc); char* c; for (c = defenc_c; c < end; c++) { if ((unsigned char) (*c) >= 128) { PyUnicode_AsASCIIString(o); return NULL; } } } #endif *length = PyBytes_GET_SIZE(defenc); return defenc_c; } #else static CYTHON_INLINE const char* __Pyx_PyUnicode_AsStringAndSize(PyObject* o, Py_ssize_t *length) { if (unlikely(__Pyx_PyUnicode_READY(o) == 
-1)) return NULL; #if __PYX_DEFAULT_STRING_ENCODING_IS_ASCII if (likely(PyUnicode_IS_ASCII(o))) { *length = PyUnicode_GET_LENGTH(o); return PyUnicode_AsUTF8(o); } else { PyUnicode_AsASCIIString(o); return NULL; } #else return PyUnicode_AsUTF8AndSize(o, length); #endif } #endif #endif static CYTHON_INLINE const char* __Pyx_PyObject_AsStringAndSize(PyObject* o, Py_ssize_t *length) { #if __PYX_DEFAULT_STRING_ENCODING_IS_ASCII || __PYX_DEFAULT_STRING_ENCODING_IS_DEFAULT if ( #if PY_MAJOR_VERSION < 3 && __PYX_DEFAULT_STRING_ENCODING_IS_ASCII __Pyx_sys_getdefaultencoding_not_ascii && #endif PyUnicode_Check(o)) { return __Pyx_PyUnicode_AsStringAndSize(o, length); } else #endif #if (!CYTHON_COMPILING_IN_PYPY) || (defined(PyByteArray_AS_STRING) && defined(PyByteArray_GET_SIZE)) if (PyByteArray_Check(o)) { *length = PyByteArray_GET_SIZE(o); return PyByteArray_AS_STRING(o); } else #endif { char* result; int r = PyBytes_AsStringAndSize(o, &result, length); if (unlikely(r < 0)) { return NULL; } else { return result; } } } static CYTHON_INLINE int __Pyx_PyObject_IsTrue(PyObject* x) { int is_true = x == Py_True; if (is_true | (x == Py_False) | (x == Py_None)) return is_true; else return PyObject_IsTrue(x); } static CYTHON_INLINE int __Pyx_PyObject_IsTrueAndDecref(PyObject* x) { int retval; if (unlikely(!x)) return -1; retval = __Pyx_PyObject_IsTrue(x); Py_DECREF(x); return retval; } static PyObject* __Pyx_PyNumber_IntOrLongWrongResultType(PyObject* result, const char* type_name) { #if PY_MAJOR_VERSION >= 3 if (PyLong_Check(result)) { if (PyErr_WarnFormat(PyExc_DeprecationWarning, 1, "__int__ returned non-int (type %.200s). " "The ability to return an instance of a strict subclass of int " "is deprecated, and may be removed in a future version of Python.", Py_TYPE(result)->tp_name)) { Py_DECREF(result); return NULL; } return result; } #endif PyErr_Format(PyExc_TypeError, "__%.4s__ returned non-%.4s (type %.200s)", type_name, type_name, Py_TYPE(result)->tp_name); Py_DECREF(result); return NULL; } static CYTHON_INLINE PyObject* __Pyx_PyNumber_IntOrLong(PyObject* x) { #if CYTHON_USE_TYPE_SLOTS PyNumberMethods *m; #endif const char *name = NULL; PyObject *res = NULL; #if PY_MAJOR_VERSION < 3 if (likely(PyInt_Check(x) || PyLong_Check(x))) #else if (likely(PyLong_Check(x))) #endif return __Pyx_NewRef(x); #if CYTHON_USE_TYPE_SLOTS m = Py_TYPE(x)->tp_as_number; #if PY_MAJOR_VERSION < 3 if (m && m->nb_int) { name = "int"; res = m->nb_int(x); } else if (m && m->nb_long) { name = "long"; res = m->nb_long(x); } #else if (likely(m && m->nb_int)) { name = "int"; res = m->nb_int(x); } #endif #else if (!PyBytes_CheckExact(x) && !PyUnicode_CheckExact(x)) { res = PyNumber_Int(x); } #endif if (likely(res)) { #if PY_MAJOR_VERSION < 3 if (unlikely(!PyInt_Check(res) && !PyLong_Check(res))) { #else if (unlikely(!PyLong_CheckExact(res))) { #endif return __Pyx_PyNumber_IntOrLongWrongResultType(res, name); } } else if (!PyErr_Occurred()) { PyErr_SetString(PyExc_TypeError, "an integer is required"); } return res; } static CYTHON_INLINE Py_ssize_t __Pyx_PyIndex_AsSsize_t(PyObject* b) { Py_ssize_t ival; PyObject *x; #if PY_MAJOR_VERSION < 3 if (likely(PyInt_CheckExact(b))) { if (sizeof(Py_ssize_t) >= sizeof(long)) return PyInt_AS_LONG(b); else return PyInt_AsSsize_t(b); } #endif if (likely(PyLong_CheckExact(b))) { #if CYTHON_USE_PYLONG_INTERNALS const digit* digits = ((PyLongObject*)b)->ob_digit; const Py_ssize_t size = Py_SIZE(b); if (likely(__Pyx_sst_abs(size) <= 1)) { ival = likely(size) ? 
digits[0] : 0; if (size == -1) ival = -ival; return ival; } else { switch (size) { case 2: if (8 * sizeof(Py_ssize_t) > 2 * PyLong_SHIFT) { return (Py_ssize_t) (((((size_t)digits[1]) << PyLong_SHIFT) | (size_t)digits[0])); } break; case -2: if (8 * sizeof(Py_ssize_t) > 2 * PyLong_SHIFT) { return -(Py_ssize_t) (((((size_t)digits[1]) << PyLong_SHIFT) | (size_t)digits[0])); } break; case 3: if (8 * sizeof(Py_ssize_t) > 3 * PyLong_SHIFT) { return (Py_ssize_t) (((((((size_t)digits[2]) << PyLong_SHIFT) | (size_t)digits[1]) << PyLong_SHIFT) | (size_t)digits[0])); } break; case -3: if (8 * sizeof(Py_ssize_t) > 3 * PyLong_SHIFT) { return -(Py_ssize_t) (((((((size_t)digits[2]) << PyLong_SHIFT) | (size_t)digits[1]) << PyLong_SHIFT) | (size_t)digits[0])); } break; case 4: if (8 * sizeof(Py_ssize_t) > 4 * PyLong_SHIFT) { return (Py_ssize_t) (((((((((size_t)digits[3]) << PyLong_SHIFT) | (size_t)digits[2]) << PyLong_SHIFT) | (size_t)digits[1]) << PyLong_SHIFT) | (size_t)digits[0])); } break; case -4: if (8 * sizeof(Py_ssize_t) > 4 * PyLong_SHIFT) { return -(Py_ssize_t) (((((((((size_t)digits[3]) << PyLong_SHIFT) | (size_t)digits[2]) << PyLong_SHIFT) | (size_t)digits[1]) << PyLong_SHIFT) | (size_t)digits[0])); } break; } } #endif return PyLong_AsSsize_t(b); } x = PyNumber_Index(b); if (!x) return -1; ival = PyInt_AsSsize_t(x); Py_DECREF(x); return ival; } static CYTHON_INLINE PyObject * __Pyx_PyBool_FromLong(long b) { return b ? __Pyx_NewRef(Py_True) : __Pyx_NewRef(Py_False); } static CYTHON_INLINE PyObject * __Pyx_PyInt_FromSize_t(size_t ival) { return PyInt_FromSize_t(ival); } #endif /* Py_PYTHON_H */ aiohttp-3.6.2/aiohttp/_frozenlist.pyx0000644000175100001650000000505513547410117020242 0ustar vstsdocker00000000000000from collections.abc import MutableSequence cdef class FrozenList: cdef readonly bint frozen cdef list _items def __init__(self, items=None): self.frozen = False if items is not None: items = list(items) else: items = [] self._items = items cdef object _check_frozen(self): if self.frozen: raise RuntimeError("Cannot modify frozen list.") cdef inline object _fast_len(self): return len(self._items) def freeze(self): self.frozen = True def __getitem__(self, index): return self._items[index] def __setitem__(self, index, value): self._check_frozen() self._items[index] = value def __delitem__(self, index): self._check_frozen() del self._items[index] def __len__(self): return self._fast_len() def __iter__(self): return self._items.__iter__() def __reversed__(self): return self._items.__reversed__() def __richcmp__(self, other, op): if op == 0: # < return list(self) < other if op == 1: # <= return list(self) <= other if op == 2: # == return list(self) == other if op == 3: # != return list(self) != other if op == 4: # > return list(self) > other if op == 5: # => return list(self) >= other def insert(self, pos, item): self._check_frozen() self._items.insert(pos, item) def __contains__(self, item): return item in self._items def __iadd__(self, items): self._check_frozen() self._items += list(items) return self def index(self, item): return self._items.index(item) def remove(self, item): self._check_frozen() self._items.remove(item) def clear(self): self._check_frozen() self._items.clear() def extend(self, items): self._check_frozen() self._items += list(items) def reverse(self): self._check_frozen() self._items.reverse() def pop(self, index=-1): self._check_frozen() return self._items.pop(index) def append(self, item): self._check_frozen() return self._items.append(item) def count(self, item): return 
self._items.count(item) def __repr__(self): return '<FrozenList(frozen={}, {!r})>'.format(self.frozen, self._items) MutableSequence.register(FrozenList) aiohttp-3.6.2/aiohttp/_headers.pxi0000644000175100001650000000375313547410117017441 0ustar vstsdocker00000000000000# The file is autogenerated from aiohttp/hdrs.py # Run ./tools/gen.py to update it after the origin changing. from . import hdrs cdef tuple headers = ( hdrs.ACCEPT, hdrs.ACCEPT_CHARSET, hdrs.ACCEPT_ENCODING, hdrs.ACCEPT_LANGUAGE, hdrs.ACCEPT_RANGES, hdrs.ACCESS_CONTROL_ALLOW_CREDENTIALS, hdrs.ACCESS_CONTROL_ALLOW_HEADERS, hdrs.ACCESS_CONTROL_ALLOW_METHODS, hdrs.ACCESS_CONTROL_ALLOW_ORIGIN, hdrs.ACCESS_CONTROL_EXPOSE_HEADERS, hdrs.ACCESS_CONTROL_MAX_AGE, hdrs.ACCESS_CONTROL_REQUEST_HEADERS, hdrs.ACCESS_CONTROL_REQUEST_METHOD, hdrs.AGE, hdrs.ALLOW, hdrs.AUTHORIZATION, hdrs.CACHE_CONTROL, hdrs.CONNECTION, hdrs.CONTENT_DISPOSITION, hdrs.CONTENT_ENCODING, hdrs.CONTENT_LANGUAGE, hdrs.CONTENT_LENGTH, hdrs.CONTENT_LOCATION, hdrs.CONTENT_MD5, hdrs.CONTENT_RANGE, hdrs.CONTENT_TRANSFER_ENCODING, hdrs.CONTENT_TYPE, hdrs.COOKIE, hdrs.DATE, hdrs.DESTINATION, hdrs.DIGEST, hdrs.ETAG, hdrs.EXPECT, hdrs.EXPIRES, hdrs.FORWARDED, hdrs.FROM, hdrs.HOST, hdrs.IF_MATCH, hdrs.IF_MODIFIED_SINCE, hdrs.IF_NONE_MATCH, hdrs.IF_RANGE, hdrs.IF_UNMODIFIED_SINCE, hdrs.KEEP_ALIVE, hdrs.LAST_EVENT_ID, hdrs.LAST_MODIFIED, hdrs.LINK, hdrs.LOCATION, hdrs.MAX_FORWARDS, hdrs.ORIGIN, hdrs.PRAGMA, hdrs.PROXY_AUTHENTICATE, hdrs.PROXY_AUTHORIZATION, hdrs.RANGE, hdrs.REFERER, hdrs.RETRY_AFTER, hdrs.SEC_WEBSOCKET_ACCEPT, hdrs.SEC_WEBSOCKET_EXTENSIONS, hdrs.SEC_WEBSOCKET_KEY, hdrs.SEC_WEBSOCKET_KEY1, hdrs.SEC_WEBSOCKET_PROTOCOL, hdrs.SEC_WEBSOCKET_VERSION, hdrs.SERVER, hdrs.SET_COOKIE, hdrs.TE, hdrs.TRAILER, hdrs.TRANSFER_ENCODING, hdrs.UPGRADE, hdrs.URI, hdrs.USER_AGENT, hdrs.VARY, hdrs.VIA, hdrs.WANT_DIGEST, hdrs.WARNING, hdrs.WEBSOCKET, hdrs.WWW_AUTHENTICATE, hdrs.X_FORWARDED_FOR, hdrs.X_FORWARDED_HOST, hdrs.X_FORWARDED_PROTO, ) aiohttp-3.6.2/aiohttp/_helpers.c0000644000175100001650000062742013547410136017116 0ustar vstsdocker00000000000000/* Generated by Cython 0.29.13 */ #define PY_SSIZE_T_CLEAN #include "Python.h" #ifndef Py_PYTHON_H #error Python headers needed to compile C extensions, please install development version of Python. #elif PY_VERSION_HEX < 0x02060000 || (0x03000000 <= PY_VERSION_HEX && PY_VERSION_HEX < 0x03030000) #error Cython requires Python 2.6+ or Python 3.3+.
#else #define CYTHON_ABI "0_29_13" #define CYTHON_HEX_VERSION 0x001D0DF0 #define CYTHON_FUTURE_DIVISION 1 #include #ifndef offsetof #define offsetof(type, member) ( (size_t) & ((type*)0) -> member ) #endif #if !defined(WIN32) && !defined(MS_WINDOWS) #ifndef __stdcall #define __stdcall #endif #ifndef __cdecl #define __cdecl #endif #ifndef __fastcall #define __fastcall #endif #endif #ifndef DL_IMPORT #define DL_IMPORT(t) t #endif #ifndef DL_EXPORT #define DL_EXPORT(t) t #endif #define __PYX_COMMA , #ifndef HAVE_LONG_LONG #if PY_VERSION_HEX >= 0x02070000 #define HAVE_LONG_LONG #endif #endif #ifndef PY_LONG_LONG #define PY_LONG_LONG LONG_LONG #endif #ifndef Py_HUGE_VAL #define Py_HUGE_VAL HUGE_VAL #endif #ifdef PYPY_VERSION #define CYTHON_COMPILING_IN_PYPY 1 #define CYTHON_COMPILING_IN_PYSTON 0 #define CYTHON_COMPILING_IN_CPYTHON 0 #undef CYTHON_USE_TYPE_SLOTS #define CYTHON_USE_TYPE_SLOTS 0 #undef CYTHON_USE_PYTYPE_LOOKUP #define CYTHON_USE_PYTYPE_LOOKUP 0 #if PY_VERSION_HEX < 0x03050000 #undef CYTHON_USE_ASYNC_SLOTS #define CYTHON_USE_ASYNC_SLOTS 0 #elif !defined(CYTHON_USE_ASYNC_SLOTS) #define CYTHON_USE_ASYNC_SLOTS 1 #endif #undef CYTHON_USE_PYLIST_INTERNALS #define CYTHON_USE_PYLIST_INTERNALS 0 #undef CYTHON_USE_UNICODE_INTERNALS #define CYTHON_USE_UNICODE_INTERNALS 0 #undef CYTHON_USE_UNICODE_WRITER #define CYTHON_USE_UNICODE_WRITER 0 #undef CYTHON_USE_PYLONG_INTERNALS #define CYTHON_USE_PYLONG_INTERNALS 0 #undef CYTHON_AVOID_BORROWED_REFS #define CYTHON_AVOID_BORROWED_REFS 1 #undef CYTHON_ASSUME_SAFE_MACROS #define CYTHON_ASSUME_SAFE_MACROS 0 #undef CYTHON_UNPACK_METHODS #define CYTHON_UNPACK_METHODS 0 #undef CYTHON_FAST_THREAD_STATE #define CYTHON_FAST_THREAD_STATE 0 #undef CYTHON_FAST_PYCALL #define CYTHON_FAST_PYCALL 0 #undef CYTHON_PEP489_MULTI_PHASE_INIT #define CYTHON_PEP489_MULTI_PHASE_INIT 0 #undef CYTHON_USE_TP_FINALIZE #define CYTHON_USE_TP_FINALIZE 0 #undef CYTHON_USE_DICT_VERSIONS #define CYTHON_USE_DICT_VERSIONS 0 #undef CYTHON_USE_EXC_INFO_STACK #define CYTHON_USE_EXC_INFO_STACK 0 #elif defined(PYSTON_VERSION) #define CYTHON_COMPILING_IN_PYPY 0 #define CYTHON_COMPILING_IN_PYSTON 1 #define CYTHON_COMPILING_IN_CPYTHON 0 #ifndef CYTHON_USE_TYPE_SLOTS #define CYTHON_USE_TYPE_SLOTS 1 #endif #undef CYTHON_USE_PYTYPE_LOOKUP #define CYTHON_USE_PYTYPE_LOOKUP 0 #undef CYTHON_USE_ASYNC_SLOTS #define CYTHON_USE_ASYNC_SLOTS 0 #undef CYTHON_USE_PYLIST_INTERNALS #define CYTHON_USE_PYLIST_INTERNALS 0 #ifndef CYTHON_USE_UNICODE_INTERNALS #define CYTHON_USE_UNICODE_INTERNALS 1 #endif #undef CYTHON_USE_UNICODE_WRITER #define CYTHON_USE_UNICODE_WRITER 0 #undef CYTHON_USE_PYLONG_INTERNALS #define CYTHON_USE_PYLONG_INTERNALS 0 #ifndef CYTHON_AVOID_BORROWED_REFS #define CYTHON_AVOID_BORROWED_REFS 0 #endif #ifndef CYTHON_ASSUME_SAFE_MACROS #define CYTHON_ASSUME_SAFE_MACROS 1 #endif #ifndef CYTHON_UNPACK_METHODS #define CYTHON_UNPACK_METHODS 1 #endif #undef CYTHON_FAST_THREAD_STATE #define CYTHON_FAST_THREAD_STATE 0 #undef CYTHON_FAST_PYCALL #define CYTHON_FAST_PYCALL 0 #undef CYTHON_PEP489_MULTI_PHASE_INIT #define CYTHON_PEP489_MULTI_PHASE_INIT 0 #undef CYTHON_USE_TP_FINALIZE #define CYTHON_USE_TP_FINALIZE 0 #undef CYTHON_USE_DICT_VERSIONS #define CYTHON_USE_DICT_VERSIONS 0 #undef CYTHON_USE_EXC_INFO_STACK #define CYTHON_USE_EXC_INFO_STACK 0 #else #define CYTHON_COMPILING_IN_PYPY 0 #define CYTHON_COMPILING_IN_PYSTON 0 #define CYTHON_COMPILING_IN_CPYTHON 1 #ifndef CYTHON_USE_TYPE_SLOTS #define CYTHON_USE_TYPE_SLOTS 1 #endif #if PY_VERSION_HEX < 0x02070000 #undef CYTHON_USE_PYTYPE_LOOKUP #define 
CYTHON_USE_PYTYPE_LOOKUP 0 #elif !defined(CYTHON_USE_PYTYPE_LOOKUP) #define CYTHON_USE_PYTYPE_LOOKUP 1 #endif #if PY_MAJOR_VERSION < 3 #undef CYTHON_USE_ASYNC_SLOTS #define CYTHON_USE_ASYNC_SLOTS 0 #elif !defined(CYTHON_USE_ASYNC_SLOTS) #define CYTHON_USE_ASYNC_SLOTS 1 #endif #if PY_VERSION_HEX < 0x02070000 #undef CYTHON_USE_PYLONG_INTERNALS #define CYTHON_USE_PYLONG_INTERNALS 0 #elif !defined(CYTHON_USE_PYLONG_INTERNALS) #define CYTHON_USE_PYLONG_INTERNALS 1 #endif #ifndef CYTHON_USE_PYLIST_INTERNALS #define CYTHON_USE_PYLIST_INTERNALS 1 #endif #ifndef CYTHON_USE_UNICODE_INTERNALS #define CYTHON_USE_UNICODE_INTERNALS 1 #endif #if PY_VERSION_HEX < 0x030300F0 #undef CYTHON_USE_UNICODE_WRITER #define CYTHON_USE_UNICODE_WRITER 0 #elif !defined(CYTHON_USE_UNICODE_WRITER) #define CYTHON_USE_UNICODE_WRITER 1 #endif #ifndef CYTHON_AVOID_BORROWED_REFS #define CYTHON_AVOID_BORROWED_REFS 0 #endif #ifndef CYTHON_ASSUME_SAFE_MACROS #define CYTHON_ASSUME_SAFE_MACROS 1 #endif #ifndef CYTHON_UNPACK_METHODS #define CYTHON_UNPACK_METHODS 1 #endif #ifndef CYTHON_FAST_THREAD_STATE #define CYTHON_FAST_THREAD_STATE 1 #endif #ifndef CYTHON_FAST_PYCALL #define CYTHON_FAST_PYCALL 1 #endif #ifndef CYTHON_PEP489_MULTI_PHASE_INIT #define CYTHON_PEP489_MULTI_PHASE_INIT (PY_VERSION_HEX >= 0x03050000) #endif #ifndef CYTHON_USE_TP_FINALIZE #define CYTHON_USE_TP_FINALIZE (PY_VERSION_HEX >= 0x030400a1) #endif #ifndef CYTHON_USE_DICT_VERSIONS #define CYTHON_USE_DICT_VERSIONS (PY_VERSION_HEX >= 0x030600B1) #endif #ifndef CYTHON_USE_EXC_INFO_STACK #define CYTHON_USE_EXC_INFO_STACK (PY_VERSION_HEX >= 0x030700A3) #endif #endif #if !defined(CYTHON_FAST_PYCCALL) #define CYTHON_FAST_PYCCALL (CYTHON_FAST_PYCALL && PY_VERSION_HEX >= 0x030600B1) #endif #if CYTHON_USE_PYLONG_INTERNALS #include "longintrepr.h" #undef SHIFT #undef BASE #undef MASK #ifdef SIZEOF_VOID_P enum { __pyx_check_sizeof_voidp = 1 / (int)(SIZEOF_VOID_P == sizeof(void*)) }; #endif #endif #ifndef __has_attribute #define __has_attribute(x) 0 #endif #ifndef __has_cpp_attribute #define __has_cpp_attribute(x) 0 #endif #ifndef CYTHON_RESTRICT #if defined(__GNUC__) #define CYTHON_RESTRICT __restrict__ #elif defined(_MSC_VER) && _MSC_VER >= 1400 #define CYTHON_RESTRICT __restrict #elif defined (__STDC_VERSION__) && __STDC_VERSION__ >= 199901L #define CYTHON_RESTRICT restrict #else #define CYTHON_RESTRICT #endif #endif #ifndef CYTHON_UNUSED # if defined(__GNUC__) # if !(defined(__cplusplus)) || (__GNUC__ > 3 || (__GNUC__ == 3 && __GNUC_MINOR__ >= 4)) # define CYTHON_UNUSED __attribute__ ((__unused__)) # else # define CYTHON_UNUSED # endif # elif defined(__ICC) || (defined(__INTEL_COMPILER) && !defined(_MSC_VER)) # define CYTHON_UNUSED __attribute__ ((__unused__)) # else # define CYTHON_UNUSED # endif #endif #ifndef CYTHON_MAYBE_UNUSED_VAR # if defined(__cplusplus) template void CYTHON_MAYBE_UNUSED_VAR( const T& ) { } # else # define CYTHON_MAYBE_UNUSED_VAR(x) (void)(x) # endif #endif #ifndef CYTHON_NCP_UNUSED # if CYTHON_COMPILING_IN_CPYTHON # define CYTHON_NCP_UNUSED # else # define CYTHON_NCP_UNUSED CYTHON_UNUSED # endif #endif #define __Pyx_void_to_None(void_result) ((void)(void_result), Py_INCREF(Py_None), Py_None) #ifdef _MSC_VER #ifndef _MSC_STDINT_H_ #if _MSC_VER < 1300 typedef unsigned char uint8_t; typedef unsigned int uint32_t; #else typedef unsigned __int8 uint8_t; typedef unsigned __int32 uint32_t; #endif #endif #else #include #endif #ifndef CYTHON_FALLTHROUGH #if defined(__cplusplus) && __cplusplus >= 201103L #if __has_cpp_attribute(fallthrough) #define 
CYTHON_FALLTHROUGH [[fallthrough]] #elif __has_cpp_attribute(clang::fallthrough) #define CYTHON_FALLTHROUGH [[clang::fallthrough]] #elif __has_cpp_attribute(gnu::fallthrough) #define CYTHON_FALLTHROUGH [[gnu::fallthrough]] #endif #endif #ifndef CYTHON_FALLTHROUGH #if __has_attribute(fallthrough) #define CYTHON_FALLTHROUGH __attribute__((fallthrough)) #else #define CYTHON_FALLTHROUGH #endif #endif #if defined(__clang__ ) && defined(__apple_build_version__) #if __apple_build_version__ < 7000000 #undef CYTHON_FALLTHROUGH #define CYTHON_FALLTHROUGH #endif #endif #endif #ifndef CYTHON_INLINE #if defined(__clang__) #define CYTHON_INLINE __inline__ __attribute__ ((__unused__)) #elif defined(__GNUC__) #define CYTHON_INLINE __inline__ #elif defined(_MSC_VER) #define CYTHON_INLINE __inline #elif defined (__STDC_VERSION__) && __STDC_VERSION__ >= 199901L #define CYTHON_INLINE inline #else #define CYTHON_INLINE #endif #endif #if CYTHON_COMPILING_IN_PYPY && PY_VERSION_HEX < 0x02070600 && !defined(Py_OptimizeFlag) #define Py_OptimizeFlag 0 #endif #define __PYX_BUILD_PY_SSIZE_T "n" #define CYTHON_FORMAT_SSIZE_T "z" #if PY_MAJOR_VERSION < 3 #define __Pyx_BUILTIN_MODULE_NAME "__builtin__" #define __Pyx_PyCode_New(a, k, l, s, f, code, c, n, v, fv, cell, fn, name, fline, lnos)\ PyCode_New(a+k, l, s, f, code, c, n, v, fv, cell, fn, name, fline, lnos) #define __Pyx_DefaultClassType PyClass_Type #else #define __Pyx_BUILTIN_MODULE_NAME "builtins" #if PY_VERSION_HEX >= 0x030800A4 && PY_VERSION_HEX < 0x030800B2 #define __Pyx_PyCode_New(a, k, l, s, f, code, c, n, v, fv, cell, fn, name, fline, lnos)\ PyCode_New(a, 0, k, l, s, f, code, c, n, v, fv, cell, fn, name, fline, lnos) #else #define __Pyx_PyCode_New(a, k, l, s, f, code, c, n, v, fv, cell, fn, name, fline, lnos)\ PyCode_New(a, k, l, s, f, code, c, n, v, fv, cell, fn, name, fline, lnos) #endif #define __Pyx_DefaultClassType PyType_Type #endif #ifndef Py_TPFLAGS_CHECKTYPES #define Py_TPFLAGS_CHECKTYPES 0 #endif #ifndef Py_TPFLAGS_HAVE_INDEX #define Py_TPFLAGS_HAVE_INDEX 0 #endif #ifndef Py_TPFLAGS_HAVE_NEWBUFFER #define Py_TPFLAGS_HAVE_NEWBUFFER 0 #endif #ifndef Py_TPFLAGS_HAVE_FINALIZE #define Py_TPFLAGS_HAVE_FINALIZE 0 #endif #ifndef METH_STACKLESS #define METH_STACKLESS 0 #endif #if PY_VERSION_HEX <= 0x030700A3 || !defined(METH_FASTCALL) #ifndef METH_FASTCALL #define METH_FASTCALL 0x80 #endif typedef PyObject *(*__Pyx_PyCFunctionFast) (PyObject *self, PyObject *const *args, Py_ssize_t nargs); typedef PyObject *(*__Pyx_PyCFunctionFastWithKeywords) (PyObject *self, PyObject *const *args, Py_ssize_t nargs, PyObject *kwnames); #else #define __Pyx_PyCFunctionFast _PyCFunctionFast #define __Pyx_PyCFunctionFastWithKeywords _PyCFunctionFastWithKeywords #endif #if CYTHON_FAST_PYCCALL #define __Pyx_PyFastCFunction_Check(func)\ ((PyCFunction_Check(func) && (METH_FASTCALL == (PyCFunction_GET_FLAGS(func) & ~(METH_CLASS | METH_STATIC | METH_COEXIST | METH_KEYWORDS | METH_STACKLESS))))) #else #define __Pyx_PyFastCFunction_Check(func) 0 #endif #if CYTHON_COMPILING_IN_PYPY && !defined(PyObject_Malloc) #define PyObject_Malloc(s) PyMem_Malloc(s) #define PyObject_Free(p) PyMem_Free(p) #define PyObject_Realloc(p) PyMem_Realloc(p) #endif #if CYTHON_COMPILING_IN_CPYTHON && PY_VERSION_HEX < 0x030400A1 #define PyMem_RawMalloc(n) PyMem_Malloc(n) #define PyMem_RawRealloc(p, n) PyMem_Realloc(p, n) #define PyMem_RawFree(p) PyMem_Free(p) #endif #if CYTHON_COMPILING_IN_PYSTON #define __Pyx_PyCode_HasFreeVars(co) PyCode_HasFreeVars(co) #define __Pyx_PyFrame_SetLineNumber(frame, lineno) 
PyFrame_SetLineNumber(frame, lineno) #else #define __Pyx_PyCode_HasFreeVars(co) (PyCode_GetNumFree(co) > 0) #define __Pyx_PyFrame_SetLineNumber(frame, lineno) (frame)->f_lineno = (lineno) #endif #if !CYTHON_FAST_THREAD_STATE || PY_VERSION_HEX < 0x02070000 #define __Pyx_PyThreadState_Current PyThreadState_GET() #elif PY_VERSION_HEX >= 0x03060000 #define __Pyx_PyThreadState_Current _PyThreadState_UncheckedGet() #elif PY_VERSION_HEX >= 0x03000000 #define __Pyx_PyThreadState_Current PyThreadState_GET() #else #define __Pyx_PyThreadState_Current _PyThreadState_Current #endif #if PY_VERSION_HEX < 0x030700A2 && !defined(PyThread_tss_create) && !defined(Py_tss_NEEDS_INIT) #include "pythread.h" #define Py_tss_NEEDS_INIT 0 typedef int Py_tss_t; static CYTHON_INLINE int PyThread_tss_create(Py_tss_t *key) { *key = PyThread_create_key(); return 0; } static CYTHON_INLINE Py_tss_t * PyThread_tss_alloc(void) { Py_tss_t *key = (Py_tss_t *)PyObject_Malloc(sizeof(Py_tss_t)); *key = Py_tss_NEEDS_INIT; return key; } static CYTHON_INLINE void PyThread_tss_free(Py_tss_t *key) { PyObject_Free(key); } static CYTHON_INLINE int PyThread_tss_is_created(Py_tss_t *key) { return *key != Py_tss_NEEDS_INIT; } static CYTHON_INLINE void PyThread_tss_delete(Py_tss_t *key) { PyThread_delete_key(*key); *key = Py_tss_NEEDS_INIT; } static CYTHON_INLINE int PyThread_tss_set(Py_tss_t *key, void *value) { return PyThread_set_key_value(*key, value); } static CYTHON_INLINE void * PyThread_tss_get(Py_tss_t *key) { return PyThread_get_key_value(*key); } #endif #if CYTHON_COMPILING_IN_CPYTHON || defined(_PyDict_NewPresized) #define __Pyx_PyDict_NewPresized(n) ((n <= 8) ? PyDict_New() : _PyDict_NewPresized(n)) #else #define __Pyx_PyDict_NewPresized(n) PyDict_New() #endif #if PY_MAJOR_VERSION >= 3 || CYTHON_FUTURE_DIVISION #define __Pyx_PyNumber_Divide(x,y) PyNumber_TrueDivide(x,y) #define __Pyx_PyNumber_InPlaceDivide(x,y) PyNumber_InPlaceTrueDivide(x,y) #else #define __Pyx_PyNumber_Divide(x,y) PyNumber_Divide(x,y) #define __Pyx_PyNumber_InPlaceDivide(x,y) PyNumber_InPlaceDivide(x,y) #endif #if CYTHON_COMPILING_IN_CPYTHON && PY_VERSION_HEX >= 0x030500A1 && CYTHON_USE_UNICODE_INTERNALS #define __Pyx_PyDict_GetItemStr(dict, name) _PyDict_GetItem_KnownHash(dict, name, ((PyASCIIObject *) name)->hash) #else #define __Pyx_PyDict_GetItemStr(dict, name) PyDict_GetItem(dict, name) #endif #if PY_VERSION_HEX > 0x03030000 && defined(PyUnicode_KIND) #define CYTHON_PEP393_ENABLED 1 #define __Pyx_PyUnicode_READY(op) (likely(PyUnicode_IS_READY(op)) ?\ 0 : _PyUnicode_Ready((PyObject *)(op))) #define __Pyx_PyUnicode_GET_LENGTH(u) PyUnicode_GET_LENGTH(u) #define __Pyx_PyUnicode_READ_CHAR(u, i) PyUnicode_READ_CHAR(u, i) #define __Pyx_PyUnicode_MAX_CHAR_VALUE(u) PyUnicode_MAX_CHAR_VALUE(u) #define __Pyx_PyUnicode_KIND(u) PyUnicode_KIND(u) #define __Pyx_PyUnicode_DATA(u) PyUnicode_DATA(u) #define __Pyx_PyUnicode_READ(k, d, i) PyUnicode_READ(k, d, i) #define __Pyx_PyUnicode_WRITE(k, d, i, ch) PyUnicode_WRITE(k, d, i, ch) #define __Pyx_PyUnicode_IS_TRUE(u) (0 != (likely(PyUnicode_IS_READY(u)) ? PyUnicode_GET_LENGTH(u) : PyUnicode_GET_SIZE(u))) #else #define CYTHON_PEP393_ENABLED 0 #define PyUnicode_1BYTE_KIND 1 #define PyUnicode_2BYTE_KIND 2 #define PyUnicode_4BYTE_KIND 4 #define __Pyx_PyUnicode_READY(op) (0) #define __Pyx_PyUnicode_GET_LENGTH(u) PyUnicode_GET_SIZE(u) #define __Pyx_PyUnicode_READ_CHAR(u, i) ((Py_UCS4)(PyUnicode_AS_UNICODE(u)[i])) #define __Pyx_PyUnicode_MAX_CHAR_VALUE(u) ((sizeof(Py_UNICODE) == 2) ? 
65535 : 1114111) #define __Pyx_PyUnicode_KIND(u) (sizeof(Py_UNICODE)) #define __Pyx_PyUnicode_DATA(u) ((void*)PyUnicode_AS_UNICODE(u)) #define __Pyx_PyUnicode_READ(k, d, i) ((void)(k), (Py_UCS4)(((Py_UNICODE*)d)[i])) #define __Pyx_PyUnicode_WRITE(k, d, i, ch) (((void)(k)), ((Py_UNICODE*)d)[i] = ch) #define __Pyx_PyUnicode_IS_TRUE(u) (0 != PyUnicode_GET_SIZE(u)) #endif #if CYTHON_COMPILING_IN_PYPY #define __Pyx_PyUnicode_Concat(a, b) PyNumber_Add(a, b) #define __Pyx_PyUnicode_ConcatSafe(a, b) PyNumber_Add(a, b) #else #define __Pyx_PyUnicode_Concat(a, b) PyUnicode_Concat(a, b) #define __Pyx_PyUnicode_ConcatSafe(a, b) ((unlikely((a) == Py_None) || unlikely((b) == Py_None)) ?\ PyNumber_Add(a, b) : __Pyx_PyUnicode_Concat(a, b)) #endif #if CYTHON_COMPILING_IN_PYPY && !defined(PyUnicode_Contains) #define PyUnicode_Contains(u, s) PySequence_Contains(u, s) #endif #if CYTHON_COMPILING_IN_PYPY && !defined(PyByteArray_Check) #define PyByteArray_Check(obj) PyObject_TypeCheck(obj, &PyByteArray_Type) #endif #if CYTHON_COMPILING_IN_PYPY && !defined(PyObject_Format) #define PyObject_Format(obj, fmt) PyObject_CallMethod(obj, "__format__", "O", fmt) #endif #define __Pyx_PyString_FormatSafe(a, b) ((unlikely((a) == Py_None || (PyString_Check(b) && !PyString_CheckExact(b)))) ? PyNumber_Remainder(a, b) : __Pyx_PyString_Format(a, b)) #define __Pyx_PyUnicode_FormatSafe(a, b) ((unlikely((a) == Py_None || (PyUnicode_Check(b) && !PyUnicode_CheckExact(b)))) ? PyNumber_Remainder(a, b) : PyUnicode_Format(a, b)) #if PY_MAJOR_VERSION >= 3 #define __Pyx_PyString_Format(a, b) PyUnicode_Format(a, b) #else #define __Pyx_PyString_Format(a, b) PyString_Format(a, b) #endif #if PY_MAJOR_VERSION < 3 && !defined(PyObject_ASCII) #define PyObject_ASCII(o) PyObject_Repr(o) #endif #if PY_MAJOR_VERSION >= 3 #define PyBaseString_Type PyUnicode_Type #define PyStringObject PyUnicodeObject #define PyString_Type PyUnicode_Type #define PyString_Check PyUnicode_Check #define PyString_CheckExact PyUnicode_CheckExact #define PyObject_Unicode PyObject_Str #endif #if PY_MAJOR_VERSION >= 3 #define __Pyx_PyBaseString_Check(obj) PyUnicode_Check(obj) #define __Pyx_PyBaseString_CheckExact(obj) PyUnicode_CheckExact(obj) #else #define __Pyx_PyBaseString_Check(obj) (PyString_Check(obj) || PyUnicode_Check(obj)) #define __Pyx_PyBaseString_CheckExact(obj) (PyString_CheckExact(obj) || PyUnicode_CheckExact(obj)) #endif #ifndef PySet_CheckExact #define PySet_CheckExact(obj) (Py_TYPE(obj) == &PySet_Type) #endif #if CYTHON_ASSUME_SAFE_MACROS #define __Pyx_PySequence_SIZE(seq) Py_SIZE(seq) #else #define __Pyx_PySequence_SIZE(seq) PySequence_Size(seq) #endif #if PY_MAJOR_VERSION >= 3 #define PyIntObject PyLongObject #define PyInt_Type PyLong_Type #define PyInt_Check(op) PyLong_Check(op) #define PyInt_CheckExact(op) PyLong_CheckExact(op) #define PyInt_FromString PyLong_FromString #define PyInt_FromUnicode PyLong_FromUnicode #define PyInt_FromLong PyLong_FromLong #define PyInt_FromSize_t PyLong_FromSize_t #define PyInt_FromSsize_t PyLong_FromSsize_t #define PyInt_AsLong PyLong_AsLong #define PyInt_AS_LONG PyLong_AS_LONG #define PyInt_AsSsize_t PyLong_AsSsize_t #define PyInt_AsUnsignedLongMask PyLong_AsUnsignedLongMask #define PyInt_AsUnsignedLongLongMask PyLong_AsUnsignedLongLongMask #define PyNumber_Int PyNumber_Long #endif #if PY_MAJOR_VERSION >= 3 #define PyBoolObject PyLongObject #endif #if PY_MAJOR_VERSION >= 3 && CYTHON_COMPILING_IN_PYPY #ifndef PyUnicode_InternFromString #define PyUnicode_InternFromString(s) PyUnicode_FromString(s) #endif #endif #if 
PY_VERSION_HEX < 0x030200A4 typedef long Py_hash_t; #define __Pyx_PyInt_FromHash_t PyInt_FromLong #define __Pyx_PyInt_AsHash_t PyInt_AsLong #else #define __Pyx_PyInt_FromHash_t PyInt_FromSsize_t #define __Pyx_PyInt_AsHash_t PyInt_AsSsize_t #endif #if PY_MAJOR_VERSION >= 3 #define __Pyx_PyMethod_New(func, self, klass) ((self) ? PyMethod_New(func, self) : (Py_INCREF(func), func)) #else #define __Pyx_PyMethod_New(func, self, klass) PyMethod_New(func, self, klass) #endif #if CYTHON_USE_ASYNC_SLOTS #if PY_VERSION_HEX >= 0x030500B1 #define __Pyx_PyAsyncMethodsStruct PyAsyncMethods #define __Pyx_PyType_AsAsync(obj) (Py_TYPE(obj)->tp_as_async) #else #define __Pyx_PyType_AsAsync(obj) ((__Pyx_PyAsyncMethodsStruct*) (Py_TYPE(obj)->tp_reserved)) #endif #else #define __Pyx_PyType_AsAsync(obj) NULL #endif #ifndef __Pyx_PyAsyncMethodsStruct typedef struct { unaryfunc am_await; unaryfunc am_aiter; unaryfunc am_anext; } __Pyx_PyAsyncMethodsStruct; #endif #if defined(WIN32) || defined(MS_WINDOWS) #define _USE_MATH_DEFINES #endif #include #ifdef NAN #define __PYX_NAN() ((float) NAN) #else static CYTHON_INLINE float __PYX_NAN() { float value; memset(&value, 0xFF, sizeof(value)); return value; } #endif #if defined(__CYGWIN__) && defined(_LDBL_EQ_DBL) #define __Pyx_truncl trunc #else #define __Pyx_truncl truncl #endif #define __PYX_ERR(f_index, lineno, Ln_error) \ { \ __pyx_filename = __pyx_f[f_index]; __pyx_lineno = lineno; __pyx_clineno = __LINE__; goto Ln_error; \ } #ifndef __PYX_EXTERN_C #ifdef __cplusplus #define __PYX_EXTERN_C extern "C" #else #define __PYX_EXTERN_C extern #endif #endif #define __PYX_HAVE__aiohttp___helpers #define __PYX_HAVE_API__aiohttp___helpers /* Early includes */ #ifdef _OPENMP #include #endif /* _OPENMP */ #if defined(PYREX_WITHOUT_ASSERTIONS) && !defined(CYTHON_WITHOUT_ASSERTIONS) #define CYTHON_WITHOUT_ASSERTIONS #endif typedef struct {PyObject **p; const char *s; const Py_ssize_t n; const char* encoding; const char is_unicode; const char is_str; const char intern; } __Pyx_StringTabEntry; #define __PYX_DEFAULT_STRING_ENCODING_IS_ASCII 0 #define __PYX_DEFAULT_STRING_ENCODING_IS_UTF8 0 #define __PYX_DEFAULT_STRING_ENCODING_IS_DEFAULT (PY_MAJOR_VERSION >= 3 && __PYX_DEFAULT_STRING_ENCODING_IS_UTF8) #define __PYX_DEFAULT_STRING_ENCODING "" #define __Pyx_PyObject_FromString __Pyx_PyBytes_FromString #define __Pyx_PyObject_FromStringAndSize __Pyx_PyBytes_FromStringAndSize #define __Pyx_uchar_cast(c) ((unsigned char)c) #define __Pyx_long_cast(x) ((long)x) #define __Pyx_fits_Py_ssize_t(v, type, is_signed) (\ (sizeof(type) < sizeof(Py_ssize_t)) ||\ (sizeof(type) > sizeof(Py_ssize_t) &&\ likely(v < (type)PY_SSIZE_T_MAX ||\ v == (type)PY_SSIZE_T_MAX) &&\ (!is_signed || likely(v > (type)PY_SSIZE_T_MIN ||\ v == (type)PY_SSIZE_T_MIN))) ||\ (sizeof(type) == sizeof(Py_ssize_t) &&\ (is_signed || likely(v < (type)PY_SSIZE_T_MAX ||\ v == (type)PY_SSIZE_T_MAX))) ) static CYTHON_INLINE int __Pyx_is_valid_index(Py_ssize_t i, Py_ssize_t limit) { return (size_t) i < (size_t) limit; } #if defined (__cplusplus) && __cplusplus >= 201103L #include #define __Pyx_sst_abs(value) std::abs(value) #elif SIZEOF_INT >= SIZEOF_SIZE_T #define __Pyx_sst_abs(value) abs(value) #elif SIZEOF_LONG >= SIZEOF_SIZE_T #define __Pyx_sst_abs(value) labs(value) #elif defined (_MSC_VER) #define __Pyx_sst_abs(value) ((Py_ssize_t)_abs64(value)) #elif defined (__STDC_VERSION__) && __STDC_VERSION__ >= 199901L #define __Pyx_sst_abs(value) llabs(value) #elif defined (__GNUC__) #define __Pyx_sst_abs(value) __builtin_llabs(value) #else 
#define __Pyx_sst_abs(value) ((value<0) ? -value : value) #endif static CYTHON_INLINE const char* __Pyx_PyObject_AsString(PyObject*); static CYTHON_INLINE const char* __Pyx_PyObject_AsStringAndSize(PyObject*, Py_ssize_t* length); #define __Pyx_PyByteArray_FromString(s) PyByteArray_FromStringAndSize((const char*)s, strlen((const char*)s)) #define __Pyx_PyByteArray_FromStringAndSize(s, l) PyByteArray_FromStringAndSize((const char*)s, l) #define __Pyx_PyBytes_FromString PyBytes_FromString #define __Pyx_PyBytes_FromStringAndSize PyBytes_FromStringAndSize static CYTHON_INLINE PyObject* __Pyx_PyUnicode_FromString(const char*); #if PY_MAJOR_VERSION < 3 #define __Pyx_PyStr_FromString __Pyx_PyBytes_FromString #define __Pyx_PyStr_FromStringAndSize __Pyx_PyBytes_FromStringAndSize #else #define __Pyx_PyStr_FromString __Pyx_PyUnicode_FromString #define __Pyx_PyStr_FromStringAndSize __Pyx_PyUnicode_FromStringAndSize #endif #define __Pyx_PyBytes_AsWritableString(s) ((char*) PyBytes_AS_STRING(s)) #define __Pyx_PyBytes_AsWritableSString(s) ((signed char*) PyBytes_AS_STRING(s)) #define __Pyx_PyBytes_AsWritableUString(s) ((unsigned char*) PyBytes_AS_STRING(s)) #define __Pyx_PyBytes_AsString(s) ((const char*) PyBytes_AS_STRING(s)) #define __Pyx_PyBytes_AsSString(s) ((const signed char*) PyBytes_AS_STRING(s)) #define __Pyx_PyBytes_AsUString(s) ((const unsigned char*) PyBytes_AS_STRING(s)) #define __Pyx_PyObject_AsWritableString(s) ((char*) __Pyx_PyObject_AsString(s)) #define __Pyx_PyObject_AsWritableSString(s) ((signed char*) __Pyx_PyObject_AsString(s)) #define __Pyx_PyObject_AsWritableUString(s) ((unsigned char*) __Pyx_PyObject_AsString(s)) #define __Pyx_PyObject_AsSString(s) ((const signed char*) __Pyx_PyObject_AsString(s)) #define __Pyx_PyObject_AsUString(s) ((const unsigned char*) __Pyx_PyObject_AsString(s)) #define __Pyx_PyObject_FromCString(s) __Pyx_PyObject_FromString((const char*)s) #define __Pyx_PyBytes_FromCString(s) __Pyx_PyBytes_FromString((const char*)s) #define __Pyx_PyByteArray_FromCString(s) __Pyx_PyByteArray_FromString((const char*)s) #define __Pyx_PyStr_FromCString(s) __Pyx_PyStr_FromString((const char*)s) #define __Pyx_PyUnicode_FromCString(s) __Pyx_PyUnicode_FromString((const char*)s) static CYTHON_INLINE size_t __Pyx_Py_UNICODE_strlen(const Py_UNICODE *u) { const Py_UNICODE *u_end = u; while (*u_end++) ; return (size_t)(u_end - u - 1); } #define __Pyx_PyUnicode_FromUnicode(u) PyUnicode_FromUnicode(u, __Pyx_Py_UNICODE_strlen(u)) #define __Pyx_PyUnicode_FromUnicodeAndLength PyUnicode_FromUnicode #define __Pyx_PyUnicode_AsUnicode PyUnicode_AsUnicode #define __Pyx_NewRef(obj) (Py_INCREF(obj), obj) #define __Pyx_Owned_Py_None(b) __Pyx_NewRef(Py_None) static CYTHON_INLINE PyObject * __Pyx_PyBool_FromLong(long b); static CYTHON_INLINE int __Pyx_PyObject_IsTrue(PyObject*); static CYTHON_INLINE int __Pyx_PyObject_IsTrueAndDecref(PyObject*); static CYTHON_INLINE PyObject* __Pyx_PyNumber_IntOrLong(PyObject* x); #define __Pyx_PySequence_Tuple(obj)\ (likely(PyTuple_CheckExact(obj)) ? __Pyx_NewRef(obj) : PySequence_Tuple(obj)) static CYTHON_INLINE Py_ssize_t __Pyx_PyIndex_AsSsize_t(PyObject*); static CYTHON_INLINE PyObject * __Pyx_PyInt_FromSize_t(size_t); #if CYTHON_ASSUME_SAFE_MACROS #define __pyx_PyFloat_AsDouble(x) (PyFloat_CheckExact(x) ? PyFloat_AS_DOUBLE(x) : PyFloat_AsDouble(x)) #else #define __pyx_PyFloat_AsDouble(x) PyFloat_AsDouble(x) #endif #define __pyx_PyFloat_AsFloat(x) ((float) __pyx_PyFloat_AsDouble(x)) #if PY_MAJOR_VERSION >= 3 #define __Pyx_PyNumber_Int(x) (PyLong_CheckExact(x) ? 
__Pyx_NewRef(x) : PyNumber_Long(x)) #else #define __Pyx_PyNumber_Int(x) (PyInt_CheckExact(x) ? __Pyx_NewRef(x) : PyNumber_Int(x)) #endif #define __Pyx_PyNumber_Float(x) (PyFloat_CheckExact(x) ? __Pyx_NewRef(x) : PyNumber_Float(x)) #if PY_MAJOR_VERSION < 3 && __PYX_DEFAULT_STRING_ENCODING_IS_ASCII static int __Pyx_sys_getdefaultencoding_not_ascii; static int __Pyx_init_sys_getdefaultencoding_params(void) { PyObject* sys; PyObject* default_encoding = NULL; PyObject* ascii_chars_u = NULL; PyObject* ascii_chars_b = NULL; const char* default_encoding_c; sys = PyImport_ImportModule("sys"); if (!sys) goto bad; default_encoding = PyObject_CallMethod(sys, (char*) "getdefaultencoding", NULL); Py_DECREF(sys); if (!default_encoding) goto bad; default_encoding_c = PyBytes_AsString(default_encoding); if (!default_encoding_c) goto bad; if (strcmp(default_encoding_c, "ascii") == 0) { __Pyx_sys_getdefaultencoding_not_ascii = 0; } else { char ascii_chars[128]; int c; for (c = 0; c < 128; c++) { ascii_chars[c] = c; } __Pyx_sys_getdefaultencoding_not_ascii = 1; ascii_chars_u = PyUnicode_DecodeASCII(ascii_chars, 128, NULL); if (!ascii_chars_u) goto bad; ascii_chars_b = PyUnicode_AsEncodedString(ascii_chars_u, default_encoding_c, NULL); if (!ascii_chars_b || !PyBytes_Check(ascii_chars_b) || memcmp(ascii_chars, PyBytes_AS_STRING(ascii_chars_b), 128) != 0) { PyErr_Format( PyExc_ValueError, "This module compiled with c_string_encoding=ascii, but default encoding '%.200s' is not a superset of ascii.", default_encoding_c); goto bad; } Py_DECREF(ascii_chars_u); Py_DECREF(ascii_chars_b); } Py_DECREF(default_encoding); return 0; bad: Py_XDECREF(default_encoding); Py_XDECREF(ascii_chars_u); Py_XDECREF(ascii_chars_b); return -1; } #endif #if __PYX_DEFAULT_STRING_ENCODING_IS_DEFAULT && PY_MAJOR_VERSION >= 3 #define __Pyx_PyUnicode_FromStringAndSize(c_str, size) PyUnicode_DecodeUTF8(c_str, size, NULL) #else #define __Pyx_PyUnicode_FromStringAndSize(c_str, size) PyUnicode_Decode(c_str, size, __PYX_DEFAULT_STRING_ENCODING, NULL) #if __PYX_DEFAULT_STRING_ENCODING_IS_DEFAULT static char* __PYX_DEFAULT_STRING_ENCODING; static int __Pyx_init_sys_getdefaultencoding_params(void) { PyObject* sys; PyObject* default_encoding = NULL; char* default_encoding_c; sys = PyImport_ImportModule("sys"); if (!sys) goto bad; default_encoding = PyObject_CallMethod(sys, (char*) (const char*) "getdefaultencoding", NULL); Py_DECREF(sys); if (!default_encoding) goto bad; default_encoding_c = PyBytes_AsString(default_encoding); if (!default_encoding_c) goto bad; __PYX_DEFAULT_STRING_ENCODING = (char*) malloc(strlen(default_encoding_c) + 1); if (!__PYX_DEFAULT_STRING_ENCODING) goto bad; strcpy(__PYX_DEFAULT_STRING_ENCODING, default_encoding_c); Py_DECREF(default_encoding); return 0; bad: Py_XDECREF(default_encoding); return -1; } #endif #endif /* Test for GCC > 2.95 */ #if defined(__GNUC__) && (__GNUC__ > 2 || (__GNUC__ == 2 && (__GNUC_MINOR__ > 95))) #define likely(x) __builtin_expect(!!(x), 1) #define unlikely(x) __builtin_expect(!!(x), 0) #else /* !__GNUC__ or GCC < 2.95 */ #define likely(x) (x) #define unlikely(x) (x) #endif /* __GNUC__ */ static CYTHON_INLINE void __Pyx_pretend_to_initialize(void* ptr) { (void)ptr; } static PyObject *__pyx_m = NULL; static PyObject *__pyx_d; static PyObject *__pyx_b; static PyObject *__pyx_cython_runtime = NULL; static PyObject *__pyx_empty_tuple; static PyObject *__pyx_empty_bytes; static PyObject *__pyx_empty_unicode; static int __pyx_lineno; static int __pyx_clineno = 0; static const char * __pyx_cfilenm= 
__FILE__; static const char *__pyx_filename; static const char *__pyx_f[] = { "aiohttp/_helpers.pyx", "stringsource", }; /*--- Type declarations ---*/ struct __pyx_obj_7aiohttp_8_helpers_reify; /* "aiohttp/_helpers.pyx":1 * cdef class reify: # <<<<<<<<<<<<<< * """Use as a class method decorator. It operates almost exactly like * the Python `@property` decorator, but it puts the result of the */ struct __pyx_obj_7aiohttp_8_helpers_reify { PyObject_HEAD PyObject *wrapped; PyObject *name; }; /* --- Runtime support code (head) --- */ /* Refnanny.proto */ #ifndef CYTHON_REFNANNY #define CYTHON_REFNANNY 0 #endif #if CYTHON_REFNANNY typedef struct { void (*INCREF)(void*, PyObject*, int); void (*DECREF)(void*, PyObject*, int); void (*GOTREF)(void*, PyObject*, int); void (*GIVEREF)(void*, PyObject*, int); void* (*SetupContext)(const char*, int, const char*); void (*FinishContext)(void**); } __Pyx_RefNannyAPIStruct; static __Pyx_RefNannyAPIStruct *__Pyx_RefNanny = NULL; static __Pyx_RefNannyAPIStruct *__Pyx_RefNannyImportAPI(const char *modname); #define __Pyx_RefNannyDeclarations void *__pyx_refnanny = NULL; #ifdef WITH_THREAD #define __Pyx_RefNannySetupContext(name, acquire_gil)\ if (acquire_gil) {\ PyGILState_STATE __pyx_gilstate_save = PyGILState_Ensure();\ __pyx_refnanny = __Pyx_RefNanny->SetupContext((name), __LINE__, __FILE__);\ PyGILState_Release(__pyx_gilstate_save);\ } else {\ __pyx_refnanny = __Pyx_RefNanny->SetupContext((name), __LINE__, __FILE__);\ } #else #define __Pyx_RefNannySetupContext(name, acquire_gil)\ __pyx_refnanny = __Pyx_RefNanny->SetupContext((name), __LINE__, __FILE__) #endif #define __Pyx_RefNannyFinishContext()\ __Pyx_RefNanny->FinishContext(&__pyx_refnanny) #define __Pyx_INCREF(r) __Pyx_RefNanny->INCREF(__pyx_refnanny, (PyObject *)(r), __LINE__) #define __Pyx_DECREF(r) __Pyx_RefNanny->DECREF(__pyx_refnanny, (PyObject *)(r), __LINE__) #define __Pyx_GOTREF(r) __Pyx_RefNanny->GOTREF(__pyx_refnanny, (PyObject *)(r), __LINE__) #define __Pyx_GIVEREF(r) __Pyx_RefNanny->GIVEREF(__pyx_refnanny, (PyObject *)(r), __LINE__) #define __Pyx_XINCREF(r) do { if((r) != NULL) {__Pyx_INCREF(r); }} while(0) #define __Pyx_XDECREF(r) do { if((r) != NULL) {__Pyx_DECREF(r); }} while(0) #define __Pyx_XGOTREF(r) do { if((r) != NULL) {__Pyx_GOTREF(r); }} while(0) #define __Pyx_XGIVEREF(r) do { if((r) != NULL) {__Pyx_GIVEREF(r);}} while(0) #else #define __Pyx_RefNannyDeclarations #define __Pyx_RefNannySetupContext(name, acquire_gil) #define __Pyx_RefNannyFinishContext() #define __Pyx_INCREF(r) Py_INCREF(r) #define __Pyx_DECREF(r) Py_DECREF(r) #define __Pyx_GOTREF(r) #define __Pyx_GIVEREF(r) #define __Pyx_XINCREF(r) Py_XINCREF(r) #define __Pyx_XDECREF(r) Py_XDECREF(r) #define __Pyx_XGOTREF(r) #define __Pyx_XGIVEREF(r) #endif #define __Pyx_XDECREF_SET(r, v) do {\ PyObject *tmp = (PyObject *) r;\ r = v; __Pyx_XDECREF(tmp);\ } while (0) #define __Pyx_DECREF_SET(r, v) do {\ PyObject *tmp = (PyObject *) r;\ r = v; __Pyx_DECREF(tmp);\ } while (0) #define __Pyx_CLEAR(r) do { PyObject* tmp = ((PyObject*)(r)); r = NULL; __Pyx_DECREF(tmp);} while(0) #define __Pyx_XCLEAR(r) do { if((r) != NULL) {PyObject* tmp = ((PyObject*)(r)); r = NULL; __Pyx_DECREF(tmp);}} while(0) /* PyObjectGetAttrStr.proto */ #if CYTHON_USE_TYPE_SLOTS static CYTHON_INLINE PyObject* __Pyx_PyObject_GetAttrStr(PyObject* obj, PyObject* attr_name); #else #define __Pyx_PyObject_GetAttrStr(o,n) PyObject_GetAttr(o,n) #endif /* GetBuiltinName.proto */ static PyObject *__Pyx_GetBuiltinName(PyObject *name); /* RaiseDoubleKeywords.proto */ static 
void __Pyx_RaiseDoubleKeywordsError(const char* func_name, PyObject* kw_name); /* ParseKeywords.proto */ static int __Pyx_ParseOptionalKeywords(PyObject *kwds, PyObject **argnames[],\ PyObject *kwds2, PyObject *values[], Py_ssize_t num_pos_args,\ const char* function_name); /* RaiseArgTupleInvalid.proto */ static void __Pyx_RaiseArgtupleInvalid(const char* func_name, int exact, Py_ssize_t num_min, Py_ssize_t num_max, Py_ssize_t num_found); /* GetItemInt.proto */ #define __Pyx_GetItemInt(o, i, type, is_signed, to_py_func, is_list, wraparound, boundscheck)\ (__Pyx_fits_Py_ssize_t(i, type, is_signed) ?\ __Pyx_GetItemInt_Fast(o, (Py_ssize_t)i, is_list, wraparound, boundscheck) :\ (is_list ? (PyErr_SetString(PyExc_IndexError, "list index out of range"), (PyObject*)NULL) :\ __Pyx_GetItemInt_Generic(o, to_py_func(i)))) #define __Pyx_GetItemInt_List(o, i, type, is_signed, to_py_func, is_list, wraparound, boundscheck)\ (__Pyx_fits_Py_ssize_t(i, type, is_signed) ?\ __Pyx_GetItemInt_List_Fast(o, (Py_ssize_t)i, wraparound, boundscheck) :\ (PyErr_SetString(PyExc_IndexError, "list index out of range"), (PyObject*)NULL)) static CYTHON_INLINE PyObject *__Pyx_GetItemInt_List_Fast(PyObject *o, Py_ssize_t i, int wraparound, int boundscheck); #define __Pyx_GetItemInt_Tuple(o, i, type, is_signed, to_py_func, is_list, wraparound, boundscheck)\ (__Pyx_fits_Py_ssize_t(i, type, is_signed) ?\ __Pyx_GetItemInt_Tuple_Fast(o, (Py_ssize_t)i, wraparound, boundscheck) :\ (PyErr_SetString(PyExc_IndexError, "tuple index out of range"), (PyObject*)NULL)) static CYTHON_INLINE PyObject *__Pyx_GetItemInt_Tuple_Fast(PyObject *o, Py_ssize_t i, int wraparound, int boundscheck); static PyObject *__Pyx_GetItemInt_Generic(PyObject *o, PyObject* j); static CYTHON_INLINE PyObject *__Pyx_GetItemInt_Fast(PyObject *o, Py_ssize_t i, int is_list, int wraparound, int boundscheck); /* ObjectGetItem.proto */ #if CYTHON_USE_TYPE_SLOTS static CYTHON_INLINE PyObject *__Pyx_PyObject_GetItem(PyObject *obj, PyObject* key); #else #define __Pyx_PyObject_GetItem(obj, key) PyObject_GetItem(obj, key) #endif /* GetTopmostException.proto */ #if CYTHON_USE_EXC_INFO_STACK static _PyErr_StackItem * __Pyx_PyErr_GetTopmostException(PyThreadState *tstate); #endif /* PyThreadStateGet.proto */ #if CYTHON_FAST_THREAD_STATE #define __Pyx_PyThreadState_declare PyThreadState *__pyx_tstate; #define __Pyx_PyThreadState_assign __pyx_tstate = __Pyx_PyThreadState_Current; #define __Pyx_PyErr_Occurred() __pyx_tstate->curexc_type #else #define __Pyx_PyThreadState_declare #define __Pyx_PyThreadState_assign #define __Pyx_PyErr_Occurred() PyErr_Occurred() #endif /* SaveResetException.proto */ #if CYTHON_FAST_THREAD_STATE #define __Pyx_ExceptionSave(type, value, tb) __Pyx__ExceptionSave(__pyx_tstate, type, value, tb) static CYTHON_INLINE void __Pyx__ExceptionSave(PyThreadState *tstate, PyObject **type, PyObject **value, PyObject **tb); #define __Pyx_ExceptionReset(type, value, tb) __Pyx__ExceptionReset(__pyx_tstate, type, value, tb) static CYTHON_INLINE void __Pyx__ExceptionReset(PyThreadState *tstate, PyObject *type, PyObject *value, PyObject *tb); #else #define __Pyx_ExceptionSave(type, value, tb) PyErr_GetExcInfo(type, value, tb) #define __Pyx_ExceptionReset(type, value, tb) PyErr_SetExcInfo(type, value, tb) #endif /* PyErrExceptionMatches.proto */ #if CYTHON_FAST_THREAD_STATE #define __Pyx_PyErr_ExceptionMatches(err) __Pyx_PyErr_ExceptionMatchesInState(__pyx_tstate, err) static CYTHON_INLINE int __Pyx_PyErr_ExceptionMatchesInState(PyThreadState* tstate, PyObject* err); 
#else #define __Pyx_PyErr_ExceptionMatches(err) PyErr_ExceptionMatches(err) #endif /* GetException.proto */ #if CYTHON_FAST_THREAD_STATE #define __Pyx_GetException(type, value, tb) __Pyx__GetException(__pyx_tstate, type, value, tb) static int __Pyx__GetException(PyThreadState *tstate, PyObject **type, PyObject **value, PyObject **tb); #else static int __Pyx_GetException(PyObject **type, PyObject **value, PyObject **tb); #endif /* PyCFunctionFastCall.proto */ #if CYTHON_FAST_PYCCALL static CYTHON_INLINE PyObject *__Pyx_PyCFunction_FastCall(PyObject *func, PyObject **args, Py_ssize_t nargs); #else #define __Pyx_PyCFunction_FastCall(func, args, nargs) (assert(0), NULL) #endif /* PyFunctionFastCall.proto */ #if CYTHON_FAST_PYCALL #define __Pyx_PyFunction_FastCall(func, args, nargs)\ __Pyx_PyFunction_FastCallDict((func), (args), (nargs), NULL) #if 1 || PY_VERSION_HEX < 0x030600B1 static PyObject *__Pyx_PyFunction_FastCallDict(PyObject *func, PyObject **args, Py_ssize_t nargs, PyObject *kwargs); #else #define __Pyx_PyFunction_FastCallDict(func, args, nargs, kwargs) _PyFunction_FastCallDict(func, args, nargs, kwargs) #endif #define __Pyx_BUILD_ASSERT_EXPR(cond)\ (sizeof(char [1 - 2*!(cond)]) - 1) #ifndef Py_MEMBER_SIZE #define Py_MEMBER_SIZE(type, member) sizeof(((type *)0)->member) #endif static size_t __pyx_pyframe_localsplus_offset = 0; #include "frameobject.h" #define __Pxy_PyFrame_Initialize_Offsets()\ ((void)__Pyx_BUILD_ASSERT_EXPR(sizeof(PyFrameObject) == offsetof(PyFrameObject, f_localsplus) + Py_MEMBER_SIZE(PyFrameObject, f_localsplus)),\ (void)(__pyx_pyframe_localsplus_offset = ((size_t)PyFrame_Type.tp_basicsize) - Py_MEMBER_SIZE(PyFrameObject, f_localsplus))) #define __Pyx_PyFrame_GetLocalsplus(frame)\ (assert(__pyx_pyframe_localsplus_offset), (PyObject **)(((char *)(frame)) + __pyx_pyframe_localsplus_offset)) #endif /* PyObjectCall.proto */ #if CYTHON_COMPILING_IN_CPYTHON static CYTHON_INLINE PyObject* __Pyx_PyObject_Call(PyObject *func, PyObject *arg, PyObject *kw); #else #define __Pyx_PyObject_Call(func, arg, kw) PyObject_Call(func, arg, kw) #endif /* PyObjectCall2Args.proto */ static CYTHON_UNUSED PyObject* __Pyx_PyObject_Call2Args(PyObject* function, PyObject* arg1, PyObject* arg2); /* PyObjectCallMethO.proto */ #if CYTHON_COMPILING_IN_CPYTHON static CYTHON_INLINE PyObject* __Pyx_PyObject_CallMethO(PyObject *func, PyObject *arg); #endif /* PyObjectCallOneArg.proto */ static CYTHON_INLINE PyObject* __Pyx_PyObject_CallOneArg(PyObject *func, PyObject *arg); /* PyErrFetchRestore.proto */ #if CYTHON_FAST_THREAD_STATE #define __Pyx_PyErr_Clear() __Pyx_ErrRestore(NULL, NULL, NULL) #define __Pyx_ErrRestoreWithState(type, value, tb) __Pyx_ErrRestoreInState(PyThreadState_GET(), type, value, tb) #define __Pyx_ErrFetchWithState(type, value, tb) __Pyx_ErrFetchInState(PyThreadState_GET(), type, value, tb) #define __Pyx_ErrRestore(type, value, tb) __Pyx_ErrRestoreInState(__pyx_tstate, type, value, tb) #define __Pyx_ErrFetch(type, value, tb) __Pyx_ErrFetchInState(__pyx_tstate, type, value, tb) static CYTHON_INLINE void __Pyx_ErrRestoreInState(PyThreadState *tstate, PyObject *type, PyObject *value, PyObject *tb); static CYTHON_INLINE void __Pyx_ErrFetchInState(PyThreadState *tstate, PyObject **type, PyObject **value, PyObject **tb); #if CYTHON_COMPILING_IN_CPYTHON #define __Pyx_PyErr_SetNone(exc) (Py_INCREF(exc), __Pyx_ErrRestore((exc), NULL, NULL)) #else #define __Pyx_PyErr_SetNone(exc) PyErr_SetNone(exc) #endif #else #define __Pyx_PyErr_Clear() PyErr_Clear() #define __Pyx_PyErr_SetNone(exc) 
PyErr_SetNone(exc) #define __Pyx_ErrRestoreWithState(type, value, tb) PyErr_Restore(type, value, tb) #define __Pyx_ErrFetchWithState(type, value, tb) PyErr_Fetch(type, value, tb) #define __Pyx_ErrRestoreInState(tstate, type, value, tb) PyErr_Restore(type, value, tb) #define __Pyx_ErrFetchInState(tstate, type, value, tb) PyErr_Fetch(type, value, tb) #define __Pyx_ErrRestore(type, value, tb) PyErr_Restore(type, value, tb) #define __Pyx_ErrFetch(type, value, tb) PyErr_Fetch(type, value, tb) #endif /* RaiseException.proto */ static void __Pyx_Raise(PyObject *type, PyObject *value, PyObject *tb, PyObject *cause); /* GetAttr.proto */ static CYTHON_INLINE PyObject *__Pyx_GetAttr(PyObject *, PyObject *); /* GetAttr3.proto */ static CYTHON_INLINE PyObject *__Pyx_GetAttr3(PyObject *, PyObject *, PyObject *); /* PyDictVersioning.proto */ #if CYTHON_USE_DICT_VERSIONS && CYTHON_USE_TYPE_SLOTS #define __PYX_DICT_VERSION_INIT ((PY_UINT64_T) -1) #define __PYX_GET_DICT_VERSION(dict) (((PyDictObject*)(dict))->ma_version_tag) #define __PYX_UPDATE_DICT_CACHE(dict, value, cache_var, version_var)\ (version_var) = __PYX_GET_DICT_VERSION(dict);\ (cache_var) = (value); #define __PYX_PY_DICT_LOOKUP_IF_MODIFIED(VAR, DICT, LOOKUP) {\ static PY_UINT64_T __pyx_dict_version = 0;\ static PyObject *__pyx_dict_cached_value = NULL;\ if (likely(__PYX_GET_DICT_VERSION(DICT) == __pyx_dict_version)) {\ (VAR) = __pyx_dict_cached_value;\ } else {\ (VAR) = __pyx_dict_cached_value = (LOOKUP);\ __pyx_dict_version = __PYX_GET_DICT_VERSION(DICT);\ }\ } static CYTHON_INLINE PY_UINT64_T __Pyx_get_tp_dict_version(PyObject *obj); static CYTHON_INLINE PY_UINT64_T __Pyx_get_object_dict_version(PyObject *obj); static CYTHON_INLINE int __Pyx_object_dict_version_matches(PyObject* obj, PY_UINT64_T tp_dict_version, PY_UINT64_T obj_dict_version); #else #define __PYX_GET_DICT_VERSION(dict) (0) #define __PYX_UPDATE_DICT_CACHE(dict, value, cache_var, version_var) #define __PYX_PY_DICT_LOOKUP_IF_MODIFIED(VAR, DICT, LOOKUP) (VAR) = (LOOKUP); #endif /* GetModuleGlobalName.proto */ #if CYTHON_USE_DICT_VERSIONS #define __Pyx_GetModuleGlobalName(var, name) {\ static PY_UINT64_T __pyx_dict_version = 0;\ static PyObject *__pyx_dict_cached_value = NULL;\ (var) = (likely(__pyx_dict_version == __PYX_GET_DICT_VERSION(__pyx_d))) ?\ (likely(__pyx_dict_cached_value) ? 
__Pyx_NewRef(__pyx_dict_cached_value) : __Pyx_GetBuiltinName(name)) :\ __Pyx__GetModuleGlobalName(name, &__pyx_dict_version, &__pyx_dict_cached_value);\ } #define __Pyx_GetModuleGlobalNameUncached(var, name) {\ PY_UINT64_T __pyx_dict_version;\ PyObject *__pyx_dict_cached_value;\ (var) = __Pyx__GetModuleGlobalName(name, &__pyx_dict_version, &__pyx_dict_cached_value);\ } static PyObject *__Pyx__GetModuleGlobalName(PyObject *name, PY_UINT64_T *dict_version, PyObject **dict_cached_value); #else #define __Pyx_GetModuleGlobalName(var, name) (var) = __Pyx__GetModuleGlobalName(name) #define __Pyx_GetModuleGlobalNameUncached(var, name) (var) = __Pyx__GetModuleGlobalName(name) static CYTHON_INLINE PyObject *__Pyx__GetModuleGlobalName(PyObject *name); #endif /* Import.proto */ static PyObject *__Pyx_Import(PyObject *name, PyObject *from_list, int level); /* ImportFrom.proto */ static PyObject* __Pyx_ImportFrom(PyObject* module, PyObject* name); /* HasAttr.proto */ static CYTHON_INLINE int __Pyx_HasAttr(PyObject *, PyObject *); /* PyObject_GenericGetAttrNoDict.proto */ #if CYTHON_USE_TYPE_SLOTS && CYTHON_USE_PYTYPE_LOOKUP && PY_VERSION_HEX < 0x03070000 static CYTHON_INLINE PyObject* __Pyx_PyObject_GenericGetAttrNoDict(PyObject* obj, PyObject* attr_name); #else #define __Pyx_PyObject_GenericGetAttrNoDict PyObject_GenericGetAttr #endif /* PyObject_GenericGetAttr.proto */ #if CYTHON_USE_TYPE_SLOTS && CYTHON_USE_PYTYPE_LOOKUP && PY_VERSION_HEX < 0x03070000 static PyObject* __Pyx_PyObject_GenericGetAttr(PyObject* obj, PyObject* attr_name); #else #define __Pyx_PyObject_GenericGetAttr PyObject_GenericGetAttr #endif /* SetupReduce.proto */ static int __Pyx_setup_reduce(PyObject* type_obj); /* CLineInTraceback.proto */ #ifdef CYTHON_CLINE_IN_TRACEBACK #define __Pyx_CLineForTraceback(tstate, c_line) (((CYTHON_CLINE_IN_TRACEBACK)) ? 
c_line : 0) #else static int __Pyx_CLineForTraceback(PyThreadState *tstate, int c_line); #endif /* CodeObjectCache.proto */ typedef struct { PyCodeObject* code_object; int code_line; } __Pyx_CodeObjectCacheEntry; struct __Pyx_CodeObjectCache { int count; int max_count; __Pyx_CodeObjectCacheEntry* entries; }; static struct __Pyx_CodeObjectCache __pyx_code_cache = {0,0,NULL}; static int __pyx_bisect_code_objects(__Pyx_CodeObjectCacheEntry* entries, int count, int code_line); static PyCodeObject *__pyx_find_code_object(int code_line); static void __pyx_insert_code_object(int code_line, PyCodeObject* code_object); /* AddTraceback.proto */ static void __Pyx_AddTraceback(const char *funcname, int c_line, int py_line, const char *filename); /* CIntToPy.proto */ static CYTHON_INLINE PyObject* __Pyx_PyInt_From_long(long value); /* CIntFromPy.proto */ static CYTHON_INLINE long __Pyx_PyInt_As_long(PyObject *); /* CIntFromPy.proto */ static CYTHON_INLINE int __Pyx_PyInt_As_int(PyObject *); /* FastTypeChecks.proto */ #if CYTHON_COMPILING_IN_CPYTHON #define __Pyx_TypeCheck(obj, type) __Pyx_IsSubtype(Py_TYPE(obj), (PyTypeObject *)type) static CYTHON_INLINE int __Pyx_IsSubtype(PyTypeObject *a, PyTypeObject *b); static CYTHON_INLINE int __Pyx_PyErr_GivenExceptionMatches(PyObject *err, PyObject *type); static CYTHON_INLINE int __Pyx_PyErr_GivenExceptionMatches2(PyObject *err, PyObject *type1, PyObject *type2); #else #define __Pyx_TypeCheck(obj, type) PyObject_TypeCheck(obj, (PyTypeObject *)type) #define __Pyx_PyErr_GivenExceptionMatches(err, type) PyErr_GivenExceptionMatches(err, type) #define __Pyx_PyErr_GivenExceptionMatches2(err, type1, type2) (PyErr_GivenExceptionMatches(err, type1) || PyErr_GivenExceptionMatches(err, type2)) #endif #define __Pyx_PyException_Check(obj) __Pyx_TypeCheck(obj, PyExc_Exception) /* CheckBinaryVersion.proto */ static int __Pyx_check_binary_version(void); /* InitStrings.proto */ static int __Pyx_InitStrings(__Pyx_StringTabEntry *t); /* Module declarations from 'aiohttp._helpers' */ static PyTypeObject *__pyx_ptype_7aiohttp_8_helpers_reify = 0; static PyObject *__pyx_f_7aiohttp_8_helpers___pyx_unpickle_reify__set_state(struct __pyx_obj_7aiohttp_8_helpers_reify *, PyObject *); /*proto*/ #define __Pyx_MODULE_NAME "aiohttp._helpers" extern int __pyx_module_is_main_aiohttp___helpers; int __pyx_module_is_main_aiohttp___helpers = 0; /* Implementation of 'aiohttp._helpers' */ static PyObject *__pyx_builtin_KeyError; static PyObject *__pyx_builtin_AttributeError; static const char __pyx_k_doc[] = "__doc__"; static const char __pyx_k_new[] = "__new__"; static const char __pyx_k_dict[] = "__dict__"; static const char __pyx_k_main[] = "__main__"; static const char __pyx_k_name[] = "__name__"; static const char __pyx_k_test[] = "__test__"; static const char __pyx_k_cache[] = "_cache"; static const char __pyx_k_reify[] = "reify"; static const char __pyx_k_import[] = "__import__"; static const char __pyx_k_pickle[] = "pickle"; static const char __pyx_k_reduce[] = "__reduce__"; static const char __pyx_k_update[] = "update"; static const char __pyx_k_wrapped[] = "wrapped"; static const char __pyx_k_KeyError[] = "KeyError"; static const char __pyx_k_getstate[] = "__getstate__"; static const char __pyx_k_pyx_type[] = "__pyx_type"; static const char __pyx_k_setstate[] = "__setstate__"; static const char __pyx_k_pyx_state[] = "__pyx_state"; static const char __pyx_k_reduce_ex[] = "__reduce_ex__"; static const char __pyx_k_pyx_result[] = "__pyx_result"; static const char __pyx_k_PickleError[] = 
"PickleError"; static const char __pyx_k_pyx_checksum[] = "__pyx_checksum"; static const char __pyx_k_stringsource[] = "stringsource"; static const char __pyx_k_reduce_cython[] = "__reduce_cython__"; static const char __pyx_k_AttributeError[] = "AttributeError"; static const char __pyx_k_pyx_PickleError[] = "__pyx_PickleError"; static const char __pyx_k_setstate_cython[] = "__setstate_cython__"; static const char __pyx_k_aiohttp__helpers[] = "aiohttp._helpers"; static const char __pyx_k_cline_in_traceback[] = "cline_in_traceback"; static const char __pyx_k_pyx_unpickle_reify[] = "__pyx_unpickle_reify"; static const char __pyx_k_reified_property_is_read_only[] = "reified property is read-only"; static const char __pyx_k_Incompatible_checksums_s_vs_0x77[] = "Incompatible checksums (%s vs 0x770cb8f = (name, wrapped))"; static PyObject *__pyx_n_s_AttributeError; static PyObject *__pyx_kp_s_Incompatible_checksums_s_vs_0x77; static PyObject *__pyx_n_s_KeyError; static PyObject *__pyx_n_s_PickleError; static PyObject *__pyx_n_s_aiohttp__helpers; static PyObject *__pyx_n_s_cache; static PyObject *__pyx_n_s_cline_in_traceback; static PyObject *__pyx_n_s_dict; static PyObject *__pyx_n_s_doc; static PyObject *__pyx_n_s_getstate; static PyObject *__pyx_n_s_import; static PyObject *__pyx_n_s_main; static PyObject *__pyx_n_s_name; static PyObject *__pyx_n_s_new; static PyObject *__pyx_n_s_pickle; static PyObject *__pyx_n_s_pyx_PickleError; static PyObject *__pyx_n_s_pyx_checksum; static PyObject *__pyx_n_s_pyx_result; static PyObject *__pyx_n_s_pyx_state; static PyObject *__pyx_n_s_pyx_type; static PyObject *__pyx_n_s_pyx_unpickle_reify; static PyObject *__pyx_n_s_reduce; static PyObject *__pyx_n_s_reduce_cython; static PyObject *__pyx_n_s_reduce_ex; static PyObject *__pyx_kp_u_reified_property_is_read_only; static PyObject *__pyx_n_s_reify; static PyObject *__pyx_n_s_setstate; static PyObject *__pyx_n_s_setstate_cython; static PyObject *__pyx_kp_s_stringsource; static PyObject *__pyx_n_s_test; static PyObject *__pyx_n_s_update; static PyObject *__pyx_n_s_wrapped; static int __pyx_pf_7aiohttp_8_helpers_5reify___init__(struct __pyx_obj_7aiohttp_8_helpers_reify *__pyx_v_self, PyObject *__pyx_v_wrapped); /* proto */ static PyObject *__pyx_pf_7aiohttp_8_helpers_5reify_7__doc_____get__(struct __pyx_obj_7aiohttp_8_helpers_reify *__pyx_v_self); /* proto */ static PyObject *__pyx_pf_7aiohttp_8_helpers_5reify_2__get__(struct __pyx_obj_7aiohttp_8_helpers_reify *__pyx_v_self, PyObject *__pyx_v_inst, CYTHON_UNUSED PyObject *__pyx_v_owner); /* proto */ static int __pyx_pf_7aiohttp_8_helpers_5reify_4__set__(CYTHON_UNUSED struct __pyx_obj_7aiohttp_8_helpers_reify *__pyx_v_self, CYTHON_UNUSED PyObject *__pyx_v_inst, CYTHON_UNUSED PyObject *__pyx_v_value); /* proto */ static PyObject *__pyx_pf_7aiohttp_8_helpers_5reify_6__reduce_cython__(struct __pyx_obj_7aiohttp_8_helpers_reify *__pyx_v_self); /* proto */ static PyObject *__pyx_pf_7aiohttp_8_helpers_5reify_8__setstate_cython__(struct __pyx_obj_7aiohttp_8_helpers_reify *__pyx_v_self, PyObject *__pyx_v___pyx_state); /* proto */ static PyObject *__pyx_pf_7aiohttp_8_helpers___pyx_unpickle_reify(CYTHON_UNUSED PyObject *__pyx_self, PyObject *__pyx_v___pyx_type, long __pyx_v___pyx_checksum, PyObject *__pyx_v___pyx_state); /* proto */ static PyObject *__pyx_tp_new_7aiohttp_8_helpers_reify(PyTypeObject *t, PyObject *a, PyObject *k); /*proto*/ static PyObject *__pyx_int_124832655; static PyObject *__pyx_tuple_; static PyObject *__pyx_tuple__2; static PyObject *__pyx_codeobj__3; 
/* Late includes */ /* "aiohttp/_helpers.pyx":13 * cdef object name * * def __init__(self, wrapped): # <<<<<<<<<<<<<< * self.wrapped = wrapped * self.name = wrapped.__name__ */ /* Python wrapper */ static int __pyx_pw_7aiohttp_8_helpers_5reify_1__init__(PyObject *__pyx_v_self, PyObject *__pyx_args, PyObject *__pyx_kwds); /*proto*/ static int __pyx_pw_7aiohttp_8_helpers_5reify_1__init__(PyObject *__pyx_v_self, PyObject *__pyx_args, PyObject *__pyx_kwds) { PyObject *__pyx_v_wrapped = 0; int __pyx_r; __Pyx_RefNannyDeclarations __Pyx_RefNannySetupContext("__init__ (wrapper)", 0); { static PyObject **__pyx_pyargnames[] = {&__pyx_n_s_wrapped,0}; PyObject* values[1] = {0}; if (unlikely(__pyx_kwds)) { Py_ssize_t kw_args; const Py_ssize_t pos_args = PyTuple_GET_SIZE(__pyx_args); switch (pos_args) { case 1: values[0] = PyTuple_GET_ITEM(__pyx_args, 0); CYTHON_FALLTHROUGH; case 0: break; default: goto __pyx_L5_argtuple_error; } kw_args = PyDict_Size(__pyx_kwds); switch (pos_args) { case 0: if (likely((values[0] = __Pyx_PyDict_GetItemStr(__pyx_kwds, __pyx_n_s_wrapped)) != 0)) kw_args--; else goto __pyx_L5_argtuple_error; } if (unlikely(kw_args > 0)) { if (unlikely(__Pyx_ParseOptionalKeywords(__pyx_kwds, __pyx_pyargnames, 0, values, pos_args, "__init__") < 0)) __PYX_ERR(0, 13, __pyx_L3_error) } } else if (PyTuple_GET_SIZE(__pyx_args) != 1) { goto __pyx_L5_argtuple_error; } else { values[0] = PyTuple_GET_ITEM(__pyx_args, 0); } __pyx_v_wrapped = values[0]; } goto __pyx_L4_argument_unpacking_done; __pyx_L5_argtuple_error:; __Pyx_RaiseArgtupleInvalid("__init__", 1, 1, 1, PyTuple_GET_SIZE(__pyx_args)); __PYX_ERR(0, 13, __pyx_L3_error) __pyx_L3_error:; __Pyx_AddTraceback("aiohttp._helpers.reify.__init__", __pyx_clineno, __pyx_lineno, __pyx_filename); __Pyx_RefNannyFinishContext(); return -1; __pyx_L4_argument_unpacking_done:; __pyx_r = __pyx_pf_7aiohttp_8_helpers_5reify___init__(((struct __pyx_obj_7aiohttp_8_helpers_reify *)__pyx_v_self), __pyx_v_wrapped); /* function exit code */ __Pyx_RefNannyFinishContext(); return __pyx_r; } static int __pyx_pf_7aiohttp_8_helpers_5reify___init__(struct __pyx_obj_7aiohttp_8_helpers_reify *__pyx_v_self, PyObject *__pyx_v_wrapped) { int __pyx_r; __Pyx_RefNannyDeclarations PyObject *__pyx_t_1 = NULL; __Pyx_RefNannySetupContext("__init__", 0); /* "aiohttp/_helpers.pyx":14 * * def __init__(self, wrapped): * self.wrapped = wrapped # <<<<<<<<<<<<<< * self.name = wrapped.__name__ * */ __Pyx_INCREF(__pyx_v_wrapped); __Pyx_GIVEREF(__pyx_v_wrapped); __Pyx_GOTREF(__pyx_v_self->wrapped); __Pyx_DECREF(__pyx_v_self->wrapped); __pyx_v_self->wrapped = __pyx_v_wrapped; /* "aiohttp/_helpers.pyx":15 * def __init__(self, wrapped): * self.wrapped = wrapped * self.name = wrapped.__name__ # <<<<<<<<<<<<<< * * @property */ __pyx_t_1 = __Pyx_PyObject_GetAttrStr(__pyx_v_wrapped, __pyx_n_s_name); if (unlikely(!__pyx_t_1)) __PYX_ERR(0, 15, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_1); __Pyx_GIVEREF(__pyx_t_1); __Pyx_GOTREF(__pyx_v_self->name); __Pyx_DECREF(__pyx_v_self->name); __pyx_v_self->name = __pyx_t_1; __pyx_t_1 = 0; /* "aiohttp/_helpers.pyx":13 * cdef object name * * def __init__(self, wrapped): # <<<<<<<<<<<<<< * self.wrapped = wrapped * self.name = wrapped.__name__ */ /* function exit code */ __pyx_r = 0; goto __pyx_L0; __pyx_L1_error:; __Pyx_XDECREF(__pyx_t_1); __Pyx_AddTraceback("aiohttp._helpers.reify.__init__", __pyx_clineno, __pyx_lineno, __pyx_filename); __pyx_r = -1; __pyx_L0:; __Pyx_RefNannyFinishContext(); return __pyx_r; } /* "aiohttp/_helpers.pyx":18 * * @property * def __doc__(self): # 
<<<<<<<<<<<<<< * return self.wrapped.__doc__ * */ /* Python wrapper */ static PyObject *__pyx_pw_7aiohttp_8_helpers_5reify_7__doc___1__get__(PyObject *__pyx_v_self); /*proto*/ static PyObject *__pyx_pw_7aiohttp_8_helpers_5reify_7__doc___1__get__(PyObject *__pyx_v_self) { PyObject *__pyx_r = 0; __Pyx_RefNannyDeclarations __Pyx_RefNannySetupContext("__get__ (wrapper)", 0); __pyx_r = __pyx_pf_7aiohttp_8_helpers_5reify_7__doc_____get__(((struct __pyx_obj_7aiohttp_8_helpers_reify *)__pyx_v_self)); /* function exit code */ __Pyx_RefNannyFinishContext(); return __pyx_r; } static PyObject *__pyx_pf_7aiohttp_8_helpers_5reify_7__doc_____get__(struct __pyx_obj_7aiohttp_8_helpers_reify *__pyx_v_self) { PyObject *__pyx_r = NULL; __Pyx_RefNannyDeclarations PyObject *__pyx_t_1 = NULL; __Pyx_RefNannySetupContext("__get__", 0); /* "aiohttp/_helpers.pyx":19 * @property * def __doc__(self): * return self.wrapped.__doc__ # <<<<<<<<<<<<<< * * def __get__(self, inst, owner): */ __Pyx_XDECREF(__pyx_r); __pyx_t_1 = __Pyx_PyObject_GetAttrStr(__pyx_v_self->wrapped, __pyx_n_s_doc); if (unlikely(!__pyx_t_1)) __PYX_ERR(0, 19, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_1); __pyx_r = __pyx_t_1; __pyx_t_1 = 0; goto __pyx_L0; /* "aiohttp/_helpers.pyx":18 * * @property * def __doc__(self): # <<<<<<<<<<<<<< * return self.wrapped.__doc__ * */ /* function exit code */ __pyx_L1_error:; __Pyx_XDECREF(__pyx_t_1); __Pyx_AddTraceback("aiohttp._helpers.reify.__doc__.__get__", __pyx_clineno, __pyx_lineno, __pyx_filename); __pyx_r = NULL; __pyx_L0:; __Pyx_XGIVEREF(__pyx_r); __Pyx_RefNannyFinishContext(); return __pyx_r; } /* "aiohttp/_helpers.pyx":21 * return self.wrapped.__doc__ * * def __get__(self, inst, owner): # <<<<<<<<<<<<<< * try: * try: */ /* Python wrapper */ static PyObject *__pyx_pw_7aiohttp_8_helpers_5reify_3__get__(PyObject *__pyx_v_self, PyObject *__pyx_v_inst, PyObject *__pyx_v_owner); /*proto*/ static PyObject *__pyx_pw_7aiohttp_8_helpers_5reify_3__get__(PyObject *__pyx_v_self, PyObject *__pyx_v_inst, PyObject *__pyx_v_owner) { PyObject *__pyx_r = 0; __Pyx_RefNannyDeclarations __Pyx_RefNannySetupContext("__get__ (wrapper)", 0); __pyx_r = __pyx_pf_7aiohttp_8_helpers_5reify_2__get__(((struct __pyx_obj_7aiohttp_8_helpers_reify *)__pyx_v_self), ((PyObject *)__pyx_v_inst), ((PyObject *)__pyx_v_owner)); /* function exit code */ __Pyx_RefNannyFinishContext(); return __pyx_r; } static PyObject *__pyx_pf_7aiohttp_8_helpers_5reify_2__get__(struct __pyx_obj_7aiohttp_8_helpers_reify *__pyx_v_self, PyObject *__pyx_v_inst, CYTHON_UNUSED PyObject *__pyx_v_owner) { PyObject *__pyx_v_val = NULL; PyObject *__pyx_r = NULL; __Pyx_RefNannyDeclarations PyObject *__pyx_t_1 = NULL; PyObject *__pyx_t_2 = NULL; PyObject *__pyx_t_3 = NULL; PyObject *__pyx_t_4 = NULL; PyObject *__pyx_t_5 = NULL; PyObject *__pyx_t_6 = NULL; PyObject *__pyx_t_7 = NULL; PyObject *__pyx_t_8 = NULL; int __pyx_t_9; PyObject *__pyx_t_10 = NULL; PyObject *__pyx_t_11 = NULL; PyObject *__pyx_t_12 = NULL; PyObject *__pyx_t_13 = NULL; int __pyx_t_14; int __pyx_t_15; __Pyx_RefNannySetupContext("__get__", 0); /* "aiohttp/_helpers.pyx":22 * * def __get__(self, inst, owner): * try: # <<<<<<<<<<<<<< * try: * return inst._cache[self.name] */ { __Pyx_PyThreadState_declare __Pyx_PyThreadState_assign __Pyx_ExceptionSave(&__pyx_t_1, &__pyx_t_2, &__pyx_t_3); __Pyx_XGOTREF(__pyx_t_1); __Pyx_XGOTREF(__pyx_t_2); __Pyx_XGOTREF(__pyx_t_3); /*try:*/ { /* "aiohttp/_helpers.pyx":23 * def __get__(self, inst, owner): * try: * try: # <<<<<<<<<<<<<< * return inst._cache[self.name] * except KeyError: 
*/ { __Pyx_PyThreadState_declare __Pyx_PyThreadState_assign __Pyx_ExceptionSave(&__pyx_t_4, &__pyx_t_5, &__pyx_t_6); __Pyx_XGOTREF(__pyx_t_4); __Pyx_XGOTREF(__pyx_t_5); __Pyx_XGOTREF(__pyx_t_6); /*try:*/ { /* "aiohttp/_helpers.pyx":24 * try: * try: * return inst._cache[self.name] # <<<<<<<<<<<<<< * except KeyError: * val = self.wrapped(inst) */ __Pyx_XDECREF(__pyx_r); __pyx_t_7 = __Pyx_PyObject_GetAttrStr(__pyx_v_inst, __pyx_n_s_cache); if (unlikely(!__pyx_t_7)) __PYX_ERR(0, 24, __pyx_L9_error) __Pyx_GOTREF(__pyx_t_7); __pyx_t_8 = __Pyx_PyObject_GetItem(__pyx_t_7, __pyx_v_self->name); if (unlikely(!__pyx_t_8)) __PYX_ERR(0, 24, __pyx_L9_error) __Pyx_GOTREF(__pyx_t_8); __Pyx_DECREF(__pyx_t_7); __pyx_t_7 = 0; __pyx_r = __pyx_t_8; __pyx_t_8 = 0; goto __pyx_L13_try_return; /* "aiohttp/_helpers.pyx":23 * def __get__(self, inst, owner): * try: * try: # <<<<<<<<<<<<<< * return inst._cache[self.name] * except KeyError: */ } __pyx_L9_error:; __Pyx_XDECREF(__pyx_t_7); __pyx_t_7 = 0; __Pyx_XDECREF(__pyx_t_8); __pyx_t_8 = 0; /* "aiohttp/_helpers.pyx":25 * try: * return inst._cache[self.name] * except KeyError: # <<<<<<<<<<<<<< * val = self.wrapped(inst) * inst._cache[self.name] = val */ __pyx_t_9 = __Pyx_PyErr_ExceptionMatches(__pyx_builtin_KeyError); if (__pyx_t_9) { __Pyx_AddTraceback("aiohttp._helpers.reify.__get__", __pyx_clineno, __pyx_lineno, __pyx_filename); if (__Pyx_GetException(&__pyx_t_8, &__pyx_t_7, &__pyx_t_10) < 0) __PYX_ERR(0, 25, __pyx_L11_except_error) __Pyx_GOTREF(__pyx_t_8); __Pyx_GOTREF(__pyx_t_7); __Pyx_GOTREF(__pyx_t_10); /* "aiohttp/_helpers.pyx":26 * return inst._cache[self.name] * except KeyError: * val = self.wrapped(inst) # <<<<<<<<<<<<<< * inst._cache[self.name] = val * return val */ __Pyx_INCREF(__pyx_v_self->wrapped); __pyx_t_12 = __pyx_v_self->wrapped; __pyx_t_13 = NULL; if (CYTHON_UNPACK_METHODS && likely(PyMethod_Check(__pyx_t_12))) { __pyx_t_13 = PyMethod_GET_SELF(__pyx_t_12); if (likely(__pyx_t_13)) { PyObject* function = PyMethod_GET_FUNCTION(__pyx_t_12); __Pyx_INCREF(__pyx_t_13); __Pyx_INCREF(function); __Pyx_DECREF_SET(__pyx_t_12, function); } } __pyx_t_11 = (__pyx_t_13) ? 
__Pyx_PyObject_Call2Args(__pyx_t_12, __pyx_t_13, __pyx_v_inst) : __Pyx_PyObject_CallOneArg(__pyx_t_12, __pyx_v_inst); __Pyx_XDECREF(__pyx_t_13); __pyx_t_13 = 0; if (unlikely(!__pyx_t_11)) __PYX_ERR(0, 26, __pyx_L11_except_error) __Pyx_GOTREF(__pyx_t_11); __Pyx_DECREF(__pyx_t_12); __pyx_t_12 = 0; __pyx_v_val = __pyx_t_11; __pyx_t_11 = 0; /* "aiohttp/_helpers.pyx":27 * except KeyError: * val = self.wrapped(inst) * inst._cache[self.name] = val # <<<<<<<<<<<<<< * return val * except AttributeError: */ __pyx_t_11 = __Pyx_PyObject_GetAttrStr(__pyx_v_inst, __pyx_n_s_cache); if (unlikely(!__pyx_t_11)) __PYX_ERR(0, 27, __pyx_L11_except_error) __Pyx_GOTREF(__pyx_t_11); if (unlikely(PyObject_SetItem(__pyx_t_11, __pyx_v_self->name, __pyx_v_val) < 0)) __PYX_ERR(0, 27, __pyx_L11_except_error) __Pyx_DECREF(__pyx_t_11); __pyx_t_11 = 0; /* "aiohttp/_helpers.pyx":28 * val = self.wrapped(inst) * inst._cache[self.name] = val * return val # <<<<<<<<<<<<<< * except AttributeError: * if inst is None: */ __Pyx_XDECREF(__pyx_r); __Pyx_INCREF(__pyx_v_val); __pyx_r = __pyx_v_val; __Pyx_DECREF(__pyx_t_7); __pyx_t_7 = 0; __Pyx_DECREF(__pyx_t_8); __pyx_t_8 = 0; __Pyx_DECREF(__pyx_t_10); __pyx_t_10 = 0; goto __pyx_L12_except_return; } goto __pyx_L11_except_error; __pyx_L11_except_error:; /* "aiohttp/_helpers.pyx":23 * def __get__(self, inst, owner): * try: * try: # <<<<<<<<<<<<<< * return inst._cache[self.name] * except KeyError: */ __Pyx_XGIVEREF(__pyx_t_4); __Pyx_XGIVEREF(__pyx_t_5); __Pyx_XGIVEREF(__pyx_t_6); __Pyx_ExceptionReset(__pyx_t_4, __pyx_t_5, __pyx_t_6); goto __pyx_L3_error; __pyx_L13_try_return:; __Pyx_XGIVEREF(__pyx_t_4); __Pyx_XGIVEREF(__pyx_t_5); __Pyx_XGIVEREF(__pyx_t_6); __Pyx_ExceptionReset(__pyx_t_4, __pyx_t_5, __pyx_t_6); goto __pyx_L7_try_return; __pyx_L12_except_return:; __Pyx_XGIVEREF(__pyx_t_4); __Pyx_XGIVEREF(__pyx_t_5); __Pyx_XGIVEREF(__pyx_t_6); __Pyx_ExceptionReset(__pyx_t_4, __pyx_t_5, __pyx_t_6); goto __pyx_L7_try_return; } /* "aiohttp/_helpers.pyx":22 * * def __get__(self, inst, owner): * try: # <<<<<<<<<<<<<< * try: * return inst._cache[self.name] */ } __pyx_L3_error:; __Pyx_XDECREF(__pyx_t_10); __pyx_t_10 = 0; __Pyx_XDECREF(__pyx_t_11); __pyx_t_11 = 0; __Pyx_XDECREF(__pyx_t_12); __pyx_t_12 = 0; __Pyx_XDECREF(__pyx_t_13); __pyx_t_13 = 0; __Pyx_XDECREF(__pyx_t_7); __pyx_t_7 = 0; __Pyx_XDECREF(__pyx_t_8); __pyx_t_8 = 0; /* "aiohttp/_helpers.pyx":29 * inst._cache[self.name] = val * return val * except AttributeError: # <<<<<<<<<<<<<< * if inst is None: * return self */ __pyx_t_9 = __Pyx_PyErr_ExceptionMatches(__pyx_builtin_AttributeError); if (__pyx_t_9) { __Pyx_AddTraceback("aiohttp._helpers.reify.__get__", __pyx_clineno, __pyx_lineno, __pyx_filename); if (__Pyx_GetException(&__pyx_t_10, &__pyx_t_7, &__pyx_t_8) < 0) __PYX_ERR(0, 29, __pyx_L5_except_error) __Pyx_GOTREF(__pyx_t_10); __Pyx_GOTREF(__pyx_t_7); __Pyx_GOTREF(__pyx_t_8); /* "aiohttp/_helpers.pyx":30 * return val * except AttributeError: * if inst is None: # <<<<<<<<<<<<<< * return self * raise */ __pyx_t_14 = (__pyx_v_inst == Py_None); __pyx_t_15 = (__pyx_t_14 != 0); if (__pyx_t_15) { /* "aiohttp/_helpers.pyx":31 * except AttributeError: * if inst is None: * return self # <<<<<<<<<<<<<< * raise * */ __Pyx_XDECREF(__pyx_r); __Pyx_INCREF(((PyObject *)__pyx_v_self)); __pyx_r = ((PyObject *)__pyx_v_self); __Pyx_DECREF(__pyx_t_7); __pyx_t_7 = 0; __Pyx_DECREF(__pyx_t_8); __pyx_t_8 = 0; __Pyx_DECREF(__pyx_t_10); __pyx_t_10 = 0; goto __pyx_L6_except_return; /* "aiohttp/_helpers.pyx":30 * return val * except AttributeError: * if inst is 
None: # <<<<<<<<<<<<<< * return self * raise */ } /* "aiohttp/_helpers.pyx":32 * if inst is None: * return self * raise # <<<<<<<<<<<<<< * * def __set__(self, inst, value): */ __Pyx_GIVEREF(__pyx_t_10); __Pyx_GIVEREF(__pyx_t_7); __Pyx_XGIVEREF(__pyx_t_8); __Pyx_ErrRestoreWithState(__pyx_t_10, __pyx_t_7, __pyx_t_8); __pyx_t_10 = 0; __pyx_t_7 = 0; __pyx_t_8 = 0; __PYX_ERR(0, 32, __pyx_L5_except_error) } goto __pyx_L5_except_error; __pyx_L5_except_error:; /* "aiohttp/_helpers.pyx":22 * * def __get__(self, inst, owner): * try: # <<<<<<<<<<<<<< * try: * return inst._cache[self.name] */ __Pyx_XGIVEREF(__pyx_t_1); __Pyx_XGIVEREF(__pyx_t_2); __Pyx_XGIVEREF(__pyx_t_3); __Pyx_ExceptionReset(__pyx_t_1, __pyx_t_2, __pyx_t_3); goto __pyx_L1_error; __pyx_L7_try_return:; __Pyx_XGIVEREF(__pyx_t_1); __Pyx_XGIVEREF(__pyx_t_2); __Pyx_XGIVEREF(__pyx_t_3); __Pyx_ExceptionReset(__pyx_t_1, __pyx_t_2, __pyx_t_3); goto __pyx_L0; __pyx_L6_except_return:; __Pyx_XGIVEREF(__pyx_t_1); __Pyx_XGIVEREF(__pyx_t_2); __Pyx_XGIVEREF(__pyx_t_3); __Pyx_ExceptionReset(__pyx_t_1, __pyx_t_2, __pyx_t_3); goto __pyx_L0; } /* "aiohttp/_helpers.pyx":21 * return self.wrapped.__doc__ * * def __get__(self, inst, owner): # <<<<<<<<<<<<<< * try: * try: */ /* function exit code */ __pyx_L1_error:; __Pyx_XDECREF(__pyx_t_7); __Pyx_XDECREF(__pyx_t_8); __Pyx_XDECREF(__pyx_t_10); __Pyx_XDECREF(__pyx_t_11); __Pyx_XDECREF(__pyx_t_12); __Pyx_XDECREF(__pyx_t_13); __Pyx_AddTraceback("aiohttp._helpers.reify.__get__", __pyx_clineno, __pyx_lineno, __pyx_filename); __pyx_r = NULL; __pyx_L0:; __Pyx_XDECREF(__pyx_v_val); __Pyx_XGIVEREF(__pyx_r); __Pyx_RefNannyFinishContext(); return __pyx_r; } /* "aiohttp/_helpers.pyx":34 * raise * * def __set__(self, inst, value): # <<<<<<<<<<<<<< * raise AttributeError("reified property is read-only") */ /* Python wrapper */ static int __pyx_pw_7aiohttp_8_helpers_5reify_5__set__(PyObject *__pyx_v_self, PyObject *__pyx_v_inst, PyObject *__pyx_v_value); /*proto*/ static int __pyx_pw_7aiohttp_8_helpers_5reify_5__set__(PyObject *__pyx_v_self, PyObject *__pyx_v_inst, PyObject *__pyx_v_value) { int __pyx_r; __Pyx_RefNannyDeclarations __Pyx_RefNannySetupContext("__set__ (wrapper)", 0); __pyx_r = __pyx_pf_7aiohttp_8_helpers_5reify_4__set__(((struct __pyx_obj_7aiohttp_8_helpers_reify *)__pyx_v_self), ((PyObject *)__pyx_v_inst), ((PyObject *)__pyx_v_value)); /* function exit code */ __Pyx_RefNannyFinishContext(); return __pyx_r; } static int __pyx_pf_7aiohttp_8_helpers_5reify_4__set__(CYTHON_UNUSED struct __pyx_obj_7aiohttp_8_helpers_reify *__pyx_v_self, CYTHON_UNUSED PyObject *__pyx_v_inst, CYTHON_UNUSED PyObject *__pyx_v_value) { int __pyx_r; __Pyx_RefNannyDeclarations PyObject *__pyx_t_1 = NULL; __Pyx_RefNannySetupContext("__set__", 0); /* "aiohttp/_helpers.pyx":35 * * def __set__(self, inst, value): * raise AttributeError("reified property is read-only") # <<<<<<<<<<<<<< */ __pyx_t_1 = __Pyx_PyObject_Call(__pyx_builtin_AttributeError, __pyx_tuple_, NULL); if (unlikely(!__pyx_t_1)) __PYX_ERR(0, 35, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_1); __Pyx_Raise(__pyx_t_1, 0, 0, 0); __Pyx_DECREF(__pyx_t_1); __pyx_t_1 = 0; __PYX_ERR(0, 35, __pyx_L1_error) /* "aiohttp/_helpers.pyx":34 * raise * * def __set__(self, inst, value): # <<<<<<<<<<<<<< * raise AttributeError("reified property is read-only") */ /* function exit code */ __pyx_L1_error:; __Pyx_XDECREF(__pyx_t_1); __Pyx_AddTraceback("aiohttp._helpers.reify.__set__", __pyx_clineno, __pyx_lineno, __pyx_filename); __pyx_r = -1; __Pyx_RefNannyFinishContext(); return __pyx_r; } /* "(tree 
fragment)":1 * def __reduce_cython__(self): # <<<<<<<<<<<<<< * cdef tuple state * cdef object _dict */ /* Python wrapper */ static PyObject *__pyx_pw_7aiohttp_8_helpers_5reify_7__reduce_cython__(PyObject *__pyx_v_self, CYTHON_UNUSED PyObject *unused); /*proto*/ static PyObject *__pyx_pw_7aiohttp_8_helpers_5reify_7__reduce_cython__(PyObject *__pyx_v_self, CYTHON_UNUSED PyObject *unused) { PyObject *__pyx_r = 0; __Pyx_RefNannyDeclarations __Pyx_RefNannySetupContext("__reduce_cython__ (wrapper)", 0); __pyx_r = __pyx_pf_7aiohttp_8_helpers_5reify_6__reduce_cython__(((struct __pyx_obj_7aiohttp_8_helpers_reify *)__pyx_v_self)); /* function exit code */ __Pyx_RefNannyFinishContext(); return __pyx_r; } static PyObject *__pyx_pf_7aiohttp_8_helpers_5reify_6__reduce_cython__(struct __pyx_obj_7aiohttp_8_helpers_reify *__pyx_v_self) { PyObject *__pyx_v_state = 0; PyObject *__pyx_v__dict = 0; int __pyx_v_use_setstate; PyObject *__pyx_r = NULL; __Pyx_RefNannyDeclarations PyObject *__pyx_t_1 = NULL; int __pyx_t_2; int __pyx_t_3; PyObject *__pyx_t_4 = NULL; int __pyx_t_5; PyObject *__pyx_t_6 = NULL; __Pyx_RefNannySetupContext("__reduce_cython__", 0); /* "(tree fragment)":5 * cdef object _dict * cdef bint use_setstate * state = (self.name, self.wrapped) # <<<<<<<<<<<<<< * _dict = getattr(self, '__dict__', None) * if _dict is not None: */ __pyx_t_1 = PyTuple_New(2); if (unlikely(!__pyx_t_1)) __PYX_ERR(1, 5, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_1); __Pyx_INCREF(__pyx_v_self->name); __Pyx_GIVEREF(__pyx_v_self->name); PyTuple_SET_ITEM(__pyx_t_1, 0, __pyx_v_self->name); __Pyx_INCREF(__pyx_v_self->wrapped); __Pyx_GIVEREF(__pyx_v_self->wrapped); PyTuple_SET_ITEM(__pyx_t_1, 1, __pyx_v_self->wrapped); __pyx_v_state = ((PyObject*)__pyx_t_1); __pyx_t_1 = 0; /* "(tree fragment)":6 * cdef bint use_setstate * state = (self.name, self.wrapped) * _dict = getattr(self, '__dict__', None) # <<<<<<<<<<<<<< * if _dict is not None: * state += (_dict,) */ __pyx_t_1 = __Pyx_GetAttr3(((PyObject *)__pyx_v_self), __pyx_n_s_dict, Py_None); if (unlikely(!__pyx_t_1)) __PYX_ERR(1, 6, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_1); __pyx_v__dict = __pyx_t_1; __pyx_t_1 = 0; /* "(tree fragment)":7 * state = (self.name, self.wrapped) * _dict = getattr(self, '__dict__', None) * if _dict is not None: # <<<<<<<<<<<<<< * state += (_dict,) * use_setstate = True */ __pyx_t_2 = (__pyx_v__dict != Py_None); __pyx_t_3 = (__pyx_t_2 != 0); if (__pyx_t_3) { /* "(tree fragment)":8 * _dict = getattr(self, '__dict__', None) * if _dict is not None: * state += (_dict,) # <<<<<<<<<<<<<< * use_setstate = True * else: */ __pyx_t_1 = PyTuple_New(1); if (unlikely(!__pyx_t_1)) __PYX_ERR(1, 8, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_1); __Pyx_INCREF(__pyx_v__dict); __Pyx_GIVEREF(__pyx_v__dict); PyTuple_SET_ITEM(__pyx_t_1, 0, __pyx_v__dict); __pyx_t_4 = PyNumber_InPlaceAdd(__pyx_v_state, __pyx_t_1); if (unlikely(!__pyx_t_4)) __PYX_ERR(1, 8, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_4); __Pyx_DECREF(__pyx_t_1); __pyx_t_1 = 0; __Pyx_DECREF_SET(__pyx_v_state, ((PyObject*)__pyx_t_4)); __pyx_t_4 = 0; /* "(tree fragment)":9 * if _dict is not None: * state += (_dict,) * use_setstate = True # <<<<<<<<<<<<<< * else: * use_setstate = self.name is not None or self.wrapped is not None */ __pyx_v_use_setstate = 1; /* "(tree fragment)":7 * state = (self.name, self.wrapped) * _dict = getattr(self, '__dict__', None) * if _dict is not None: # <<<<<<<<<<<<<< * state += (_dict,) * use_setstate = True */ goto __pyx_L3; } /* "(tree fragment)":11 * use_setstate = True * else: * use_setstate = self.name 
is not None or self.wrapped is not None # <<<<<<<<<<<<<< * if use_setstate: * return __pyx_unpickle_reify, (type(self), 0x770cb8f, None), state */ /*else*/ { __pyx_t_2 = (__pyx_v_self->name != Py_None); __pyx_t_5 = (__pyx_t_2 != 0); if (!__pyx_t_5) { } else { __pyx_t_3 = __pyx_t_5; goto __pyx_L4_bool_binop_done; } __pyx_t_5 = (__pyx_v_self->wrapped != Py_None); __pyx_t_2 = (__pyx_t_5 != 0); __pyx_t_3 = __pyx_t_2; __pyx_L4_bool_binop_done:; __pyx_v_use_setstate = __pyx_t_3; } __pyx_L3:; /* "(tree fragment)":12 * else: * use_setstate = self.name is not None or self.wrapped is not None * if use_setstate: # <<<<<<<<<<<<<< * return __pyx_unpickle_reify, (type(self), 0x770cb8f, None), state * else: */ __pyx_t_3 = (__pyx_v_use_setstate != 0); if (__pyx_t_3) { /* "(tree fragment)":13 * use_setstate = self.name is not None or self.wrapped is not None * if use_setstate: * return __pyx_unpickle_reify, (type(self), 0x770cb8f, None), state # <<<<<<<<<<<<<< * else: * return __pyx_unpickle_reify, (type(self), 0x770cb8f, state) */ __Pyx_XDECREF(__pyx_r); __Pyx_GetModuleGlobalName(__pyx_t_4, __pyx_n_s_pyx_unpickle_reify); if (unlikely(!__pyx_t_4)) __PYX_ERR(1, 13, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_4); __pyx_t_1 = PyTuple_New(3); if (unlikely(!__pyx_t_1)) __PYX_ERR(1, 13, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_1); __Pyx_INCREF(((PyObject *)Py_TYPE(((PyObject *)__pyx_v_self)))); __Pyx_GIVEREF(((PyObject *)Py_TYPE(((PyObject *)__pyx_v_self)))); PyTuple_SET_ITEM(__pyx_t_1, 0, ((PyObject *)Py_TYPE(((PyObject *)__pyx_v_self)))); __Pyx_INCREF(__pyx_int_124832655); __Pyx_GIVEREF(__pyx_int_124832655); PyTuple_SET_ITEM(__pyx_t_1, 1, __pyx_int_124832655); __Pyx_INCREF(Py_None); __Pyx_GIVEREF(Py_None); PyTuple_SET_ITEM(__pyx_t_1, 2, Py_None); __pyx_t_6 = PyTuple_New(3); if (unlikely(!__pyx_t_6)) __PYX_ERR(1, 13, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_6); __Pyx_GIVEREF(__pyx_t_4); PyTuple_SET_ITEM(__pyx_t_6, 0, __pyx_t_4); __Pyx_GIVEREF(__pyx_t_1); PyTuple_SET_ITEM(__pyx_t_6, 1, __pyx_t_1); __Pyx_INCREF(__pyx_v_state); __Pyx_GIVEREF(__pyx_v_state); PyTuple_SET_ITEM(__pyx_t_6, 2, __pyx_v_state); __pyx_t_4 = 0; __pyx_t_1 = 0; __pyx_r = __pyx_t_6; __pyx_t_6 = 0; goto __pyx_L0; /* "(tree fragment)":12 * else: * use_setstate = self.name is not None or self.wrapped is not None * if use_setstate: # <<<<<<<<<<<<<< * return __pyx_unpickle_reify, (type(self), 0x770cb8f, None), state * else: */ } /* "(tree fragment)":15 * return __pyx_unpickle_reify, (type(self), 0x770cb8f, None), state * else: * return __pyx_unpickle_reify, (type(self), 0x770cb8f, state) # <<<<<<<<<<<<<< * def __setstate_cython__(self, __pyx_state): * __pyx_unpickle_reify__set_state(self, __pyx_state) */ /*else*/ { __Pyx_XDECREF(__pyx_r); __Pyx_GetModuleGlobalName(__pyx_t_6, __pyx_n_s_pyx_unpickle_reify); if (unlikely(!__pyx_t_6)) __PYX_ERR(1, 15, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_6); __pyx_t_1 = PyTuple_New(3); if (unlikely(!__pyx_t_1)) __PYX_ERR(1, 15, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_1); __Pyx_INCREF(((PyObject *)Py_TYPE(((PyObject *)__pyx_v_self)))); __Pyx_GIVEREF(((PyObject *)Py_TYPE(((PyObject *)__pyx_v_self)))); PyTuple_SET_ITEM(__pyx_t_1, 0, ((PyObject *)Py_TYPE(((PyObject *)__pyx_v_self)))); __Pyx_INCREF(__pyx_int_124832655); __Pyx_GIVEREF(__pyx_int_124832655); PyTuple_SET_ITEM(__pyx_t_1, 1, __pyx_int_124832655); __Pyx_INCREF(__pyx_v_state); __Pyx_GIVEREF(__pyx_v_state); PyTuple_SET_ITEM(__pyx_t_1, 2, __pyx_v_state); __pyx_t_4 = PyTuple_New(2); if (unlikely(!__pyx_t_4)) __PYX_ERR(1, 15, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_4); 
__Pyx_GIVEREF(__pyx_t_6); PyTuple_SET_ITEM(__pyx_t_4, 0, __pyx_t_6); __Pyx_GIVEREF(__pyx_t_1); PyTuple_SET_ITEM(__pyx_t_4, 1, __pyx_t_1); __pyx_t_6 = 0; __pyx_t_1 = 0; __pyx_r = __pyx_t_4; __pyx_t_4 = 0; goto __pyx_L0; } /* "(tree fragment)":1 * def __reduce_cython__(self): # <<<<<<<<<<<<<< * cdef tuple state * cdef object _dict */ /* function exit code */ __pyx_L1_error:; __Pyx_XDECREF(__pyx_t_1); __Pyx_XDECREF(__pyx_t_4); __Pyx_XDECREF(__pyx_t_6); __Pyx_AddTraceback("aiohttp._helpers.reify.__reduce_cython__", __pyx_clineno, __pyx_lineno, __pyx_filename); __pyx_r = NULL; __pyx_L0:; __Pyx_XDECREF(__pyx_v_state); __Pyx_XDECREF(__pyx_v__dict); __Pyx_XGIVEREF(__pyx_r); __Pyx_RefNannyFinishContext(); return __pyx_r; } /* "(tree fragment)":16 * else: * return __pyx_unpickle_reify, (type(self), 0x770cb8f, state) * def __setstate_cython__(self, __pyx_state): # <<<<<<<<<<<<<< * __pyx_unpickle_reify__set_state(self, __pyx_state) */ /* Python wrapper */ static PyObject *__pyx_pw_7aiohttp_8_helpers_5reify_9__setstate_cython__(PyObject *__pyx_v_self, PyObject *__pyx_v___pyx_state); /*proto*/ static PyObject *__pyx_pw_7aiohttp_8_helpers_5reify_9__setstate_cython__(PyObject *__pyx_v_self, PyObject *__pyx_v___pyx_state) { PyObject *__pyx_r = 0; __Pyx_RefNannyDeclarations __Pyx_RefNannySetupContext("__setstate_cython__ (wrapper)", 0); __pyx_r = __pyx_pf_7aiohttp_8_helpers_5reify_8__setstate_cython__(((struct __pyx_obj_7aiohttp_8_helpers_reify *)__pyx_v_self), ((PyObject *)__pyx_v___pyx_state)); /* function exit code */ __Pyx_RefNannyFinishContext(); return __pyx_r; } static PyObject *__pyx_pf_7aiohttp_8_helpers_5reify_8__setstate_cython__(struct __pyx_obj_7aiohttp_8_helpers_reify *__pyx_v_self, PyObject *__pyx_v___pyx_state) { PyObject *__pyx_r = NULL; __Pyx_RefNannyDeclarations PyObject *__pyx_t_1 = NULL; __Pyx_RefNannySetupContext("__setstate_cython__", 0); /* "(tree fragment)":17 * return __pyx_unpickle_reify, (type(self), 0x770cb8f, state) * def __setstate_cython__(self, __pyx_state): * __pyx_unpickle_reify__set_state(self, __pyx_state) # <<<<<<<<<<<<<< */ if (!(likely(PyTuple_CheckExact(__pyx_v___pyx_state))||((__pyx_v___pyx_state) == Py_None)||(PyErr_Format(PyExc_TypeError, "Expected %.16s, got %.200s", "tuple", Py_TYPE(__pyx_v___pyx_state)->tp_name), 0))) __PYX_ERR(1, 17, __pyx_L1_error) __pyx_t_1 = __pyx_f_7aiohttp_8_helpers___pyx_unpickle_reify__set_state(__pyx_v_self, ((PyObject*)__pyx_v___pyx_state)); if (unlikely(!__pyx_t_1)) __PYX_ERR(1, 17, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_1); __Pyx_DECREF(__pyx_t_1); __pyx_t_1 = 0; /* "(tree fragment)":16 * else: * return __pyx_unpickle_reify, (type(self), 0x770cb8f, state) * def __setstate_cython__(self, __pyx_state): # <<<<<<<<<<<<<< * __pyx_unpickle_reify__set_state(self, __pyx_state) */ /* function exit code */ __pyx_r = Py_None; __Pyx_INCREF(Py_None); goto __pyx_L0; __pyx_L1_error:; __Pyx_XDECREF(__pyx_t_1); __Pyx_AddTraceback("aiohttp._helpers.reify.__setstate_cython__", __pyx_clineno, __pyx_lineno, __pyx_filename); __pyx_r = NULL; __pyx_L0:; __Pyx_XGIVEREF(__pyx_r); __Pyx_RefNannyFinishContext(); return __pyx_r; } /* "(tree fragment)":1 * def __pyx_unpickle_reify(__pyx_type, long __pyx_checksum, __pyx_state): # <<<<<<<<<<<<<< * cdef object __pyx_PickleError * cdef object __pyx_result */ /* Python wrapper */ static PyObject *__pyx_pw_7aiohttp_8_helpers_1__pyx_unpickle_reify(PyObject *__pyx_self, PyObject *__pyx_args, PyObject *__pyx_kwds); /*proto*/ static PyMethodDef __pyx_mdef_7aiohttp_8_helpers_1__pyx_unpickle_reify = {"__pyx_unpickle_reify", 
(PyCFunction)(void*)(PyCFunctionWithKeywords)__pyx_pw_7aiohttp_8_helpers_1__pyx_unpickle_reify, METH_VARARGS|METH_KEYWORDS, 0}; static PyObject *__pyx_pw_7aiohttp_8_helpers_1__pyx_unpickle_reify(PyObject *__pyx_self, PyObject *__pyx_args, PyObject *__pyx_kwds) { PyObject *__pyx_v___pyx_type = 0; long __pyx_v___pyx_checksum; PyObject *__pyx_v___pyx_state = 0; PyObject *__pyx_r = 0; __Pyx_RefNannyDeclarations __Pyx_RefNannySetupContext("__pyx_unpickle_reify (wrapper)", 0); { static PyObject **__pyx_pyargnames[] = {&__pyx_n_s_pyx_type,&__pyx_n_s_pyx_checksum,&__pyx_n_s_pyx_state,0}; PyObject* values[3] = {0,0,0}; if (unlikely(__pyx_kwds)) { Py_ssize_t kw_args; const Py_ssize_t pos_args = PyTuple_GET_SIZE(__pyx_args); switch (pos_args) { case 3: values[2] = PyTuple_GET_ITEM(__pyx_args, 2); CYTHON_FALLTHROUGH; case 2: values[1] = PyTuple_GET_ITEM(__pyx_args, 1); CYTHON_FALLTHROUGH; case 1: values[0] = PyTuple_GET_ITEM(__pyx_args, 0); CYTHON_FALLTHROUGH; case 0: break; default: goto __pyx_L5_argtuple_error; } kw_args = PyDict_Size(__pyx_kwds); switch (pos_args) { case 0: if (likely((values[0] = __Pyx_PyDict_GetItemStr(__pyx_kwds, __pyx_n_s_pyx_type)) != 0)) kw_args--; else goto __pyx_L5_argtuple_error; CYTHON_FALLTHROUGH; case 1: if (likely((values[1] = __Pyx_PyDict_GetItemStr(__pyx_kwds, __pyx_n_s_pyx_checksum)) != 0)) kw_args--; else { __Pyx_RaiseArgtupleInvalid("__pyx_unpickle_reify", 1, 3, 3, 1); __PYX_ERR(1, 1, __pyx_L3_error) } CYTHON_FALLTHROUGH; case 2: if (likely((values[2] = __Pyx_PyDict_GetItemStr(__pyx_kwds, __pyx_n_s_pyx_state)) != 0)) kw_args--; else { __Pyx_RaiseArgtupleInvalid("__pyx_unpickle_reify", 1, 3, 3, 2); __PYX_ERR(1, 1, __pyx_L3_error) } } if (unlikely(kw_args > 0)) { if (unlikely(__Pyx_ParseOptionalKeywords(__pyx_kwds, __pyx_pyargnames, 0, values, pos_args, "__pyx_unpickle_reify") < 0)) __PYX_ERR(1, 1, __pyx_L3_error) } } else if (PyTuple_GET_SIZE(__pyx_args) != 3) { goto __pyx_L5_argtuple_error; } else { values[0] = PyTuple_GET_ITEM(__pyx_args, 0); values[1] = PyTuple_GET_ITEM(__pyx_args, 1); values[2] = PyTuple_GET_ITEM(__pyx_args, 2); } __pyx_v___pyx_type = values[0]; __pyx_v___pyx_checksum = __Pyx_PyInt_As_long(values[1]); if (unlikely((__pyx_v___pyx_checksum == (long)-1) && PyErr_Occurred())) __PYX_ERR(1, 1, __pyx_L3_error) __pyx_v___pyx_state = values[2]; } goto __pyx_L4_argument_unpacking_done; __pyx_L5_argtuple_error:; __Pyx_RaiseArgtupleInvalid("__pyx_unpickle_reify", 1, 3, 3, PyTuple_GET_SIZE(__pyx_args)); __PYX_ERR(1, 1, __pyx_L3_error) __pyx_L3_error:; __Pyx_AddTraceback("aiohttp._helpers.__pyx_unpickle_reify", __pyx_clineno, __pyx_lineno, __pyx_filename); __Pyx_RefNannyFinishContext(); return NULL; __pyx_L4_argument_unpacking_done:; __pyx_r = __pyx_pf_7aiohttp_8_helpers___pyx_unpickle_reify(__pyx_self, __pyx_v___pyx_type, __pyx_v___pyx_checksum, __pyx_v___pyx_state); /* function exit code */ __Pyx_RefNannyFinishContext(); return __pyx_r; } static PyObject *__pyx_pf_7aiohttp_8_helpers___pyx_unpickle_reify(CYTHON_UNUSED PyObject *__pyx_self, PyObject *__pyx_v___pyx_type, long __pyx_v___pyx_checksum, PyObject *__pyx_v___pyx_state) { PyObject *__pyx_v___pyx_PickleError = 0; PyObject *__pyx_v___pyx_result = 0; PyObject *__pyx_r = NULL; __Pyx_RefNannyDeclarations int __pyx_t_1; PyObject *__pyx_t_2 = NULL; PyObject *__pyx_t_3 = NULL; PyObject *__pyx_t_4 = NULL; PyObject *__pyx_t_5 = NULL; int __pyx_t_6; __Pyx_RefNannySetupContext("__pyx_unpickle_reify", 0); /* "(tree fragment)":4 * cdef object __pyx_PickleError * cdef object __pyx_result * if __pyx_checksum != 
0x770cb8f: # <<<<<<<<<<<<<< * from pickle import PickleError as __pyx_PickleError * raise __pyx_PickleError("Incompatible checksums (%s vs 0x770cb8f = (name, wrapped))" % __pyx_checksum) */ __pyx_t_1 = ((__pyx_v___pyx_checksum != 0x770cb8f) != 0); if (__pyx_t_1) { /* "(tree fragment)":5 * cdef object __pyx_result * if __pyx_checksum != 0x770cb8f: * from pickle import PickleError as __pyx_PickleError # <<<<<<<<<<<<<< * raise __pyx_PickleError("Incompatible checksums (%s vs 0x770cb8f = (name, wrapped))" % __pyx_checksum) * __pyx_result = reify.__new__(__pyx_type) */ __pyx_t_2 = PyList_New(1); if (unlikely(!__pyx_t_2)) __PYX_ERR(1, 5, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_2); __Pyx_INCREF(__pyx_n_s_PickleError); __Pyx_GIVEREF(__pyx_n_s_PickleError); PyList_SET_ITEM(__pyx_t_2, 0, __pyx_n_s_PickleError); __pyx_t_3 = __Pyx_Import(__pyx_n_s_pickle, __pyx_t_2, 0); if (unlikely(!__pyx_t_3)) __PYX_ERR(1, 5, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_3); __Pyx_DECREF(__pyx_t_2); __pyx_t_2 = 0; __pyx_t_2 = __Pyx_ImportFrom(__pyx_t_3, __pyx_n_s_PickleError); if (unlikely(!__pyx_t_2)) __PYX_ERR(1, 5, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_2); __Pyx_INCREF(__pyx_t_2); __pyx_v___pyx_PickleError = __pyx_t_2; __Pyx_DECREF(__pyx_t_2); __pyx_t_2 = 0; __Pyx_DECREF(__pyx_t_3); __pyx_t_3 = 0; /* "(tree fragment)":6 * if __pyx_checksum != 0x770cb8f: * from pickle import PickleError as __pyx_PickleError * raise __pyx_PickleError("Incompatible checksums (%s vs 0x770cb8f = (name, wrapped))" % __pyx_checksum) # <<<<<<<<<<<<<< * __pyx_result = reify.__new__(__pyx_type) * if __pyx_state is not None: */ __pyx_t_2 = __Pyx_PyInt_From_long(__pyx_v___pyx_checksum); if (unlikely(!__pyx_t_2)) __PYX_ERR(1, 6, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_2); __pyx_t_4 = __Pyx_PyString_Format(__pyx_kp_s_Incompatible_checksums_s_vs_0x77, __pyx_t_2); if (unlikely(!__pyx_t_4)) __PYX_ERR(1, 6, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_4); __Pyx_DECREF(__pyx_t_2); __pyx_t_2 = 0; __Pyx_INCREF(__pyx_v___pyx_PickleError); __pyx_t_2 = __pyx_v___pyx_PickleError; __pyx_t_5 = NULL; if (CYTHON_UNPACK_METHODS && unlikely(PyMethod_Check(__pyx_t_2))) { __pyx_t_5 = PyMethod_GET_SELF(__pyx_t_2); if (likely(__pyx_t_5)) { PyObject* function = PyMethod_GET_FUNCTION(__pyx_t_2); __Pyx_INCREF(__pyx_t_5); __Pyx_INCREF(function); __Pyx_DECREF_SET(__pyx_t_2, function); } } __pyx_t_3 = (__pyx_t_5) ? 
__Pyx_PyObject_Call2Args(__pyx_t_2, __pyx_t_5, __pyx_t_4) : __Pyx_PyObject_CallOneArg(__pyx_t_2, __pyx_t_4); __Pyx_XDECREF(__pyx_t_5); __pyx_t_5 = 0; __Pyx_DECREF(__pyx_t_4); __pyx_t_4 = 0; if (unlikely(!__pyx_t_3)) __PYX_ERR(1, 6, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_3); __Pyx_DECREF(__pyx_t_2); __pyx_t_2 = 0; __Pyx_Raise(__pyx_t_3, 0, 0, 0); __Pyx_DECREF(__pyx_t_3); __pyx_t_3 = 0; __PYX_ERR(1, 6, __pyx_L1_error) /* "(tree fragment)":4 * cdef object __pyx_PickleError * cdef object __pyx_result * if __pyx_checksum != 0x770cb8f: # <<<<<<<<<<<<<< * from pickle import PickleError as __pyx_PickleError * raise __pyx_PickleError("Incompatible checksums (%s vs 0x770cb8f = (name, wrapped))" % __pyx_checksum) */ } /* "(tree fragment)":7 * from pickle import PickleError as __pyx_PickleError * raise __pyx_PickleError("Incompatible checksums (%s vs 0x770cb8f = (name, wrapped))" % __pyx_checksum) * __pyx_result = reify.__new__(__pyx_type) # <<<<<<<<<<<<<< * if __pyx_state is not None: * __pyx_unpickle_reify__set_state( __pyx_result, __pyx_state) */ __pyx_t_2 = __Pyx_PyObject_GetAttrStr(((PyObject *)__pyx_ptype_7aiohttp_8_helpers_reify), __pyx_n_s_new); if (unlikely(!__pyx_t_2)) __PYX_ERR(1, 7, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_2); __pyx_t_4 = NULL; if (CYTHON_UNPACK_METHODS && likely(PyMethod_Check(__pyx_t_2))) { __pyx_t_4 = PyMethod_GET_SELF(__pyx_t_2); if (likely(__pyx_t_4)) { PyObject* function = PyMethod_GET_FUNCTION(__pyx_t_2); __Pyx_INCREF(__pyx_t_4); __Pyx_INCREF(function); __Pyx_DECREF_SET(__pyx_t_2, function); } } __pyx_t_3 = (__pyx_t_4) ? __Pyx_PyObject_Call2Args(__pyx_t_2, __pyx_t_4, __pyx_v___pyx_type) : __Pyx_PyObject_CallOneArg(__pyx_t_2, __pyx_v___pyx_type); __Pyx_XDECREF(__pyx_t_4); __pyx_t_4 = 0; if (unlikely(!__pyx_t_3)) __PYX_ERR(1, 7, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_3); __Pyx_DECREF(__pyx_t_2); __pyx_t_2 = 0; __pyx_v___pyx_result = __pyx_t_3; __pyx_t_3 = 0; /* "(tree fragment)":8 * raise __pyx_PickleError("Incompatible checksums (%s vs 0x770cb8f = (name, wrapped))" % __pyx_checksum) * __pyx_result = reify.__new__(__pyx_type) * if __pyx_state is not None: # <<<<<<<<<<<<<< * __pyx_unpickle_reify__set_state( __pyx_result, __pyx_state) * return __pyx_result */ __pyx_t_1 = (__pyx_v___pyx_state != Py_None); __pyx_t_6 = (__pyx_t_1 != 0); if (__pyx_t_6) { /* "(tree fragment)":9 * __pyx_result = reify.__new__(__pyx_type) * if __pyx_state is not None: * __pyx_unpickle_reify__set_state( __pyx_result, __pyx_state) # <<<<<<<<<<<<<< * return __pyx_result * cdef __pyx_unpickle_reify__set_state(reify __pyx_result, tuple __pyx_state): */ if (!(likely(PyTuple_CheckExact(__pyx_v___pyx_state))||((__pyx_v___pyx_state) == Py_None)||(PyErr_Format(PyExc_TypeError, "Expected %.16s, got %.200s", "tuple", Py_TYPE(__pyx_v___pyx_state)->tp_name), 0))) __PYX_ERR(1, 9, __pyx_L1_error) __pyx_t_3 = __pyx_f_7aiohttp_8_helpers___pyx_unpickle_reify__set_state(((struct __pyx_obj_7aiohttp_8_helpers_reify *)__pyx_v___pyx_result), ((PyObject*)__pyx_v___pyx_state)); if (unlikely(!__pyx_t_3)) __PYX_ERR(1, 9, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_3); __Pyx_DECREF(__pyx_t_3); __pyx_t_3 = 0; /* "(tree fragment)":8 * raise __pyx_PickleError("Incompatible checksums (%s vs 0x770cb8f = (name, wrapped))" % __pyx_checksum) * __pyx_result = reify.__new__(__pyx_type) * if __pyx_state is not None: # <<<<<<<<<<<<<< * __pyx_unpickle_reify__set_state( __pyx_result, __pyx_state) * return __pyx_result */ } /* "(tree fragment)":10 * if __pyx_state is not None: * __pyx_unpickle_reify__set_state( __pyx_result, __pyx_state) * return 
__pyx_result # <<<<<<<<<<<<<< * cdef __pyx_unpickle_reify__set_state(reify __pyx_result, tuple __pyx_state): * __pyx_result.name = __pyx_state[0]; __pyx_result.wrapped = __pyx_state[1] */ __Pyx_XDECREF(__pyx_r); __Pyx_INCREF(__pyx_v___pyx_result); __pyx_r = __pyx_v___pyx_result; goto __pyx_L0; /* "(tree fragment)":1 * def __pyx_unpickle_reify(__pyx_type, long __pyx_checksum, __pyx_state): # <<<<<<<<<<<<<< * cdef object __pyx_PickleError * cdef object __pyx_result */ /* function exit code */ __pyx_L1_error:; __Pyx_XDECREF(__pyx_t_2); __Pyx_XDECREF(__pyx_t_3); __Pyx_XDECREF(__pyx_t_4); __Pyx_XDECREF(__pyx_t_5); __Pyx_AddTraceback("aiohttp._helpers.__pyx_unpickle_reify", __pyx_clineno, __pyx_lineno, __pyx_filename); __pyx_r = NULL; __pyx_L0:; __Pyx_XDECREF(__pyx_v___pyx_PickleError); __Pyx_XDECREF(__pyx_v___pyx_result); __Pyx_XGIVEREF(__pyx_r); __Pyx_RefNannyFinishContext(); return __pyx_r; } /* "(tree fragment)":11 * __pyx_unpickle_reify__set_state( __pyx_result, __pyx_state) * return __pyx_result * cdef __pyx_unpickle_reify__set_state(reify __pyx_result, tuple __pyx_state): # <<<<<<<<<<<<<< * __pyx_result.name = __pyx_state[0]; __pyx_result.wrapped = __pyx_state[1] * if len(__pyx_state) > 2 and hasattr(__pyx_result, '__dict__'): */ static PyObject *__pyx_f_7aiohttp_8_helpers___pyx_unpickle_reify__set_state(struct __pyx_obj_7aiohttp_8_helpers_reify *__pyx_v___pyx_result, PyObject *__pyx_v___pyx_state) { PyObject *__pyx_r = NULL; __Pyx_RefNannyDeclarations PyObject *__pyx_t_1 = NULL; int __pyx_t_2; Py_ssize_t __pyx_t_3; int __pyx_t_4; int __pyx_t_5; PyObject *__pyx_t_6 = NULL; PyObject *__pyx_t_7 = NULL; PyObject *__pyx_t_8 = NULL; __Pyx_RefNannySetupContext("__pyx_unpickle_reify__set_state", 0); /* "(tree fragment)":12 * return __pyx_result * cdef __pyx_unpickle_reify__set_state(reify __pyx_result, tuple __pyx_state): * __pyx_result.name = __pyx_state[0]; __pyx_result.wrapped = __pyx_state[1] # <<<<<<<<<<<<<< * if len(__pyx_state) > 2 and hasattr(__pyx_result, '__dict__'): * __pyx_result.__dict__.update(__pyx_state[2]) */ if (unlikely(__pyx_v___pyx_state == Py_None)) { PyErr_SetString(PyExc_TypeError, "'NoneType' object is not subscriptable"); __PYX_ERR(1, 12, __pyx_L1_error) } __pyx_t_1 = __Pyx_GetItemInt_Tuple(__pyx_v___pyx_state, 0, long, 1, __Pyx_PyInt_From_long, 0, 0, 1); if (unlikely(!__pyx_t_1)) __PYX_ERR(1, 12, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_1); __Pyx_GIVEREF(__pyx_t_1); __Pyx_GOTREF(__pyx_v___pyx_result->name); __Pyx_DECREF(__pyx_v___pyx_result->name); __pyx_v___pyx_result->name = __pyx_t_1; __pyx_t_1 = 0; if (unlikely(__pyx_v___pyx_state == Py_None)) { PyErr_SetString(PyExc_TypeError, "'NoneType' object is not subscriptable"); __PYX_ERR(1, 12, __pyx_L1_error) } __pyx_t_1 = __Pyx_GetItemInt_Tuple(__pyx_v___pyx_state, 1, long, 1, __Pyx_PyInt_From_long, 0, 0, 1); if (unlikely(!__pyx_t_1)) __PYX_ERR(1, 12, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_1); __Pyx_GIVEREF(__pyx_t_1); __Pyx_GOTREF(__pyx_v___pyx_result->wrapped); __Pyx_DECREF(__pyx_v___pyx_result->wrapped); __pyx_v___pyx_result->wrapped = __pyx_t_1; __pyx_t_1 = 0; /* "(tree fragment)":13 * cdef __pyx_unpickle_reify__set_state(reify __pyx_result, tuple __pyx_state): * __pyx_result.name = __pyx_state[0]; __pyx_result.wrapped = __pyx_state[1] * if len(__pyx_state) > 2 and hasattr(__pyx_result, '__dict__'): # <<<<<<<<<<<<<< * __pyx_result.__dict__.update(__pyx_state[2]) */ if (unlikely(__pyx_v___pyx_state == Py_None)) { PyErr_SetString(PyExc_TypeError, "object of type 'NoneType' has no len()"); __PYX_ERR(1, 13, __pyx_L1_error) } 
__pyx_t_3 = PyTuple_GET_SIZE(__pyx_v___pyx_state); if (unlikely(__pyx_t_3 == ((Py_ssize_t)-1))) __PYX_ERR(1, 13, __pyx_L1_error) __pyx_t_4 = ((__pyx_t_3 > 2) != 0); if (__pyx_t_4) { } else { __pyx_t_2 = __pyx_t_4; goto __pyx_L4_bool_binop_done; } __pyx_t_4 = __Pyx_HasAttr(((PyObject *)__pyx_v___pyx_result), __pyx_n_s_dict); if (unlikely(__pyx_t_4 == ((int)-1))) __PYX_ERR(1, 13, __pyx_L1_error) __pyx_t_5 = (__pyx_t_4 != 0); __pyx_t_2 = __pyx_t_5; __pyx_L4_bool_binop_done:; if (__pyx_t_2) { /* "(tree fragment)":14 * __pyx_result.name = __pyx_state[0]; __pyx_result.wrapped = __pyx_state[1] * if len(__pyx_state) > 2 and hasattr(__pyx_result, '__dict__'): * __pyx_result.__dict__.update(__pyx_state[2]) # <<<<<<<<<<<<<< */ __pyx_t_6 = __Pyx_PyObject_GetAttrStr(((PyObject *)__pyx_v___pyx_result), __pyx_n_s_dict); if (unlikely(!__pyx_t_6)) __PYX_ERR(1, 14, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_6); __pyx_t_7 = __Pyx_PyObject_GetAttrStr(__pyx_t_6, __pyx_n_s_update); if (unlikely(!__pyx_t_7)) __PYX_ERR(1, 14, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_7); __Pyx_DECREF(__pyx_t_6); __pyx_t_6 = 0; if (unlikely(__pyx_v___pyx_state == Py_None)) { PyErr_SetString(PyExc_TypeError, "'NoneType' object is not subscriptable"); __PYX_ERR(1, 14, __pyx_L1_error) } __pyx_t_6 = __Pyx_GetItemInt_Tuple(__pyx_v___pyx_state, 2, long, 1, __Pyx_PyInt_From_long, 0, 0, 1); if (unlikely(!__pyx_t_6)) __PYX_ERR(1, 14, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_6); __pyx_t_8 = NULL; if (CYTHON_UNPACK_METHODS && likely(PyMethod_Check(__pyx_t_7))) { __pyx_t_8 = PyMethod_GET_SELF(__pyx_t_7); if (likely(__pyx_t_8)) { PyObject* function = PyMethod_GET_FUNCTION(__pyx_t_7); __Pyx_INCREF(__pyx_t_8); __Pyx_INCREF(function); __Pyx_DECREF_SET(__pyx_t_7, function); } } __pyx_t_1 = (__pyx_t_8) ? __Pyx_PyObject_Call2Args(__pyx_t_7, __pyx_t_8, __pyx_t_6) : __Pyx_PyObject_CallOneArg(__pyx_t_7, __pyx_t_6); __Pyx_XDECREF(__pyx_t_8); __pyx_t_8 = 0; __Pyx_DECREF(__pyx_t_6); __pyx_t_6 = 0; if (unlikely(!__pyx_t_1)) __PYX_ERR(1, 14, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_1); __Pyx_DECREF(__pyx_t_7); __pyx_t_7 = 0; __Pyx_DECREF(__pyx_t_1); __pyx_t_1 = 0; /* "(tree fragment)":13 * cdef __pyx_unpickle_reify__set_state(reify __pyx_result, tuple __pyx_state): * __pyx_result.name = __pyx_state[0]; __pyx_result.wrapped = __pyx_state[1] * if len(__pyx_state) > 2 and hasattr(__pyx_result, '__dict__'): # <<<<<<<<<<<<<< * __pyx_result.__dict__.update(__pyx_state[2]) */ } /* "(tree fragment)":11 * __pyx_unpickle_reify__set_state( __pyx_result, __pyx_state) * return __pyx_result * cdef __pyx_unpickle_reify__set_state(reify __pyx_result, tuple __pyx_state): # <<<<<<<<<<<<<< * __pyx_result.name = __pyx_state[0]; __pyx_result.wrapped = __pyx_state[1] * if len(__pyx_state) > 2 and hasattr(__pyx_result, '__dict__'): */ /* function exit code */ __pyx_r = Py_None; __Pyx_INCREF(Py_None); goto __pyx_L0; __pyx_L1_error:; __Pyx_XDECREF(__pyx_t_1); __Pyx_XDECREF(__pyx_t_6); __Pyx_XDECREF(__pyx_t_7); __Pyx_XDECREF(__pyx_t_8); __Pyx_AddTraceback("aiohttp._helpers.__pyx_unpickle_reify__set_state", __pyx_clineno, __pyx_lineno, __pyx_filename); __pyx_r = 0; __pyx_L0:; __Pyx_XGIVEREF(__pyx_r); __Pyx_RefNannyFinishContext(); return __pyx_r; } static PyObject *__pyx_tp_new_7aiohttp_8_helpers_reify(PyTypeObject *t, CYTHON_UNUSED PyObject *a, CYTHON_UNUSED PyObject *k) { struct __pyx_obj_7aiohttp_8_helpers_reify *p; PyObject *o; if (likely((t->tp_flags & Py_TPFLAGS_IS_ABSTRACT) == 0)) { o = (*t->tp_alloc)(t, 0); } else { o = (PyObject *) PyBaseObject_Type.tp_new(t, __pyx_empty_tuple, 0); } if 
(unlikely(!o)) return 0; p = ((struct __pyx_obj_7aiohttp_8_helpers_reify *)o); p->wrapped = Py_None; Py_INCREF(Py_None); p->name = Py_None; Py_INCREF(Py_None); return o; } static void __pyx_tp_dealloc_7aiohttp_8_helpers_reify(PyObject *o) { struct __pyx_obj_7aiohttp_8_helpers_reify *p = (struct __pyx_obj_7aiohttp_8_helpers_reify *)o; #if CYTHON_USE_TP_FINALIZE if (unlikely(PyType_HasFeature(Py_TYPE(o), Py_TPFLAGS_HAVE_FINALIZE) && Py_TYPE(o)->tp_finalize) && !_PyGC_FINALIZED(o)) { if (PyObject_CallFinalizerFromDealloc(o)) return; } #endif PyObject_GC_UnTrack(o); Py_CLEAR(p->wrapped); Py_CLEAR(p->name); (*Py_TYPE(o)->tp_free)(o); } static int __pyx_tp_traverse_7aiohttp_8_helpers_reify(PyObject *o, visitproc v, void *a) { int e; struct __pyx_obj_7aiohttp_8_helpers_reify *p = (struct __pyx_obj_7aiohttp_8_helpers_reify *)o; if (p->wrapped) { e = (*v)(p->wrapped, a); if (e) return e; } if (p->name) { e = (*v)(p->name, a); if (e) return e; } return 0; } static int __pyx_tp_clear_7aiohttp_8_helpers_reify(PyObject *o) { PyObject* tmp; struct __pyx_obj_7aiohttp_8_helpers_reify *p = (struct __pyx_obj_7aiohttp_8_helpers_reify *)o; tmp = ((PyObject*)p->wrapped); p->wrapped = Py_None; Py_INCREF(Py_None); Py_XDECREF(tmp); tmp = ((PyObject*)p->name); p->name = Py_None; Py_INCREF(Py_None); Py_XDECREF(tmp); return 0; } static PyObject *__pyx_tp_descr_get_7aiohttp_8_helpers_reify(PyObject *o, PyObject *i, PyObject *c) { PyObject *r = 0; if (!i) i = Py_None; if (!c) c = Py_None; r = __pyx_pw_7aiohttp_8_helpers_5reify_3__get__(o, i, c); return r; } static int __pyx_tp_descr_set_7aiohttp_8_helpers_reify(PyObject *o, PyObject *i, PyObject *v) { if (v) { return __pyx_pw_7aiohttp_8_helpers_5reify_5__set__(o, i, v); } else { PyErr_SetString(PyExc_NotImplementedError, "__delete__"); return -1; } } static PyObject *__pyx_getprop_7aiohttp_8_helpers_5reify___doc__(PyObject *o, CYTHON_UNUSED void *x) { return __pyx_pw_7aiohttp_8_helpers_5reify_7__doc___1__get__(o); } static PyMethodDef __pyx_methods_7aiohttp_8_helpers_reify[] = { {"__reduce_cython__", (PyCFunction)__pyx_pw_7aiohttp_8_helpers_5reify_7__reduce_cython__, METH_NOARGS, 0}, {"__setstate_cython__", (PyCFunction)__pyx_pw_7aiohttp_8_helpers_5reify_9__setstate_cython__, METH_O, 0}, {0, 0, 0, 0} }; static struct PyGetSetDef __pyx_getsets_7aiohttp_8_helpers_reify[] = { {(char *)"__doc__", __pyx_getprop_7aiohttp_8_helpers_5reify___doc__, 0, (char *)0, 0}, {0, 0, 0, 0, 0} }; static PyTypeObject __pyx_type_7aiohttp_8_helpers_reify = { PyVarObject_HEAD_INIT(0, 0) "aiohttp._helpers.reify", /*tp_name*/ sizeof(struct __pyx_obj_7aiohttp_8_helpers_reify), /*tp_basicsize*/ 0, /*tp_itemsize*/ __pyx_tp_dealloc_7aiohttp_8_helpers_reify, /*tp_dealloc*/ 0, /*tp_print*/ 0, /*tp_getattr*/ 0, /*tp_setattr*/ #if PY_MAJOR_VERSION < 3 0, /*tp_compare*/ #endif #if PY_MAJOR_VERSION >= 3 0, /*tp_as_async*/ #endif 0, /*tp_repr*/ 0, /*tp_as_number*/ 0, /*tp_as_sequence*/ 0, /*tp_as_mapping*/ 0, /*tp_hash*/ 0, /*tp_call*/ 0, /*tp_str*/ 0, /*tp_getattro*/ 0, /*tp_setattro*/ 0, /*tp_as_buffer*/ Py_TPFLAGS_DEFAULT|Py_TPFLAGS_HAVE_VERSION_TAG|Py_TPFLAGS_CHECKTYPES|Py_TPFLAGS_HAVE_NEWBUFFER|Py_TPFLAGS_BASETYPE|Py_TPFLAGS_HAVE_GC, /*tp_flags*/ "Use as a class method decorator. It operates almost exactly like\n the Python `@property` decorator, but it puts the result of the\n method it decorates into the instance dict after the first call,\n effectively replacing the function it decorates with an instance\n variable. 
It is, in Python parlance, a data descriptor.\n\n ", /*tp_doc*/ __pyx_tp_traverse_7aiohttp_8_helpers_reify, /*tp_traverse*/ __pyx_tp_clear_7aiohttp_8_helpers_reify, /*tp_clear*/ 0, /*tp_richcompare*/ 0, /*tp_weaklistoffset*/ 0, /*tp_iter*/ 0, /*tp_iternext*/ __pyx_methods_7aiohttp_8_helpers_reify, /*tp_methods*/ 0, /*tp_members*/ __pyx_getsets_7aiohttp_8_helpers_reify, /*tp_getset*/ 0, /*tp_base*/ 0, /*tp_dict*/ __pyx_tp_descr_get_7aiohttp_8_helpers_reify, /*tp_descr_get*/ __pyx_tp_descr_set_7aiohttp_8_helpers_reify, /*tp_descr_set*/ 0, /*tp_dictoffset*/ __pyx_pw_7aiohttp_8_helpers_5reify_1__init__, /*tp_init*/ 0, /*tp_alloc*/ __pyx_tp_new_7aiohttp_8_helpers_reify, /*tp_new*/ 0, /*tp_free*/ 0, /*tp_is_gc*/ 0, /*tp_bases*/ 0, /*tp_mro*/ 0, /*tp_cache*/ 0, /*tp_subclasses*/ 0, /*tp_weaklist*/ 0, /*tp_del*/ 0, /*tp_version_tag*/ #if PY_VERSION_HEX >= 0x030400a1 0, /*tp_finalize*/ #endif #if PY_VERSION_HEX >= 0x030800b1 0, /*tp_vectorcall*/ #endif }; static PyMethodDef __pyx_methods[] = { {0, 0, 0, 0} }; #if PY_MAJOR_VERSION >= 3 #if CYTHON_PEP489_MULTI_PHASE_INIT static PyObject* __pyx_pymod_create(PyObject *spec, PyModuleDef *def); /*proto*/ static int __pyx_pymod_exec__helpers(PyObject* module); /*proto*/ static PyModuleDef_Slot __pyx_moduledef_slots[] = { {Py_mod_create, (void*)__pyx_pymod_create}, {Py_mod_exec, (void*)__pyx_pymod_exec__helpers}, {0, NULL} }; #endif static struct PyModuleDef __pyx_moduledef = { PyModuleDef_HEAD_INIT, "_helpers", 0, /* m_doc */ #if CYTHON_PEP489_MULTI_PHASE_INIT 0, /* m_size */ #else -1, /* m_size */ #endif __pyx_methods /* m_methods */, #if CYTHON_PEP489_MULTI_PHASE_INIT __pyx_moduledef_slots, /* m_slots */ #else NULL, /* m_reload */ #endif NULL, /* m_traverse */ NULL, /* m_clear */ NULL /* m_free */ }; #endif #ifndef CYTHON_SMALL_CODE #if defined(__clang__) #define CYTHON_SMALL_CODE #elif defined(__GNUC__) && (__GNUC__ > 4 || (__GNUC__ == 4 && __GNUC_MINOR__ >= 3)) #define CYTHON_SMALL_CODE __attribute__((cold)) #else #define CYTHON_SMALL_CODE #endif #endif static __Pyx_StringTabEntry __pyx_string_tab[] = { {&__pyx_n_s_AttributeError, __pyx_k_AttributeError, sizeof(__pyx_k_AttributeError), 0, 0, 1, 1}, {&__pyx_kp_s_Incompatible_checksums_s_vs_0x77, __pyx_k_Incompatible_checksums_s_vs_0x77, sizeof(__pyx_k_Incompatible_checksums_s_vs_0x77), 0, 0, 1, 0}, {&__pyx_n_s_KeyError, __pyx_k_KeyError, sizeof(__pyx_k_KeyError), 0, 0, 1, 1}, {&__pyx_n_s_PickleError, __pyx_k_PickleError, sizeof(__pyx_k_PickleError), 0, 0, 1, 1}, {&__pyx_n_s_aiohttp__helpers, __pyx_k_aiohttp__helpers, sizeof(__pyx_k_aiohttp__helpers), 0, 0, 1, 1}, {&__pyx_n_s_cache, __pyx_k_cache, sizeof(__pyx_k_cache), 0, 0, 1, 1}, {&__pyx_n_s_cline_in_traceback, __pyx_k_cline_in_traceback, sizeof(__pyx_k_cline_in_traceback), 0, 0, 1, 1}, {&__pyx_n_s_dict, __pyx_k_dict, sizeof(__pyx_k_dict), 0, 0, 1, 1}, {&__pyx_n_s_doc, __pyx_k_doc, sizeof(__pyx_k_doc), 0, 0, 1, 1}, {&__pyx_n_s_getstate, __pyx_k_getstate, sizeof(__pyx_k_getstate), 0, 0, 1, 1}, {&__pyx_n_s_import, __pyx_k_import, sizeof(__pyx_k_import), 0, 0, 1, 1}, {&__pyx_n_s_main, __pyx_k_main, sizeof(__pyx_k_main), 0, 0, 1, 1}, {&__pyx_n_s_name, __pyx_k_name, sizeof(__pyx_k_name), 0, 0, 1, 1}, {&__pyx_n_s_new, __pyx_k_new, sizeof(__pyx_k_new), 0, 0, 1, 1}, {&__pyx_n_s_pickle, __pyx_k_pickle, sizeof(__pyx_k_pickle), 0, 0, 1, 1}, {&__pyx_n_s_pyx_PickleError, __pyx_k_pyx_PickleError, sizeof(__pyx_k_pyx_PickleError), 0, 0, 1, 1}, {&__pyx_n_s_pyx_checksum, __pyx_k_pyx_checksum, sizeof(__pyx_k_pyx_checksum), 0, 0, 1, 1}, {&__pyx_n_s_pyx_result, 
__pyx_k_pyx_result, sizeof(__pyx_k_pyx_result), 0, 0, 1, 1}, {&__pyx_n_s_pyx_state, __pyx_k_pyx_state, sizeof(__pyx_k_pyx_state), 0, 0, 1, 1}, {&__pyx_n_s_pyx_type, __pyx_k_pyx_type, sizeof(__pyx_k_pyx_type), 0, 0, 1, 1}, {&__pyx_n_s_pyx_unpickle_reify, __pyx_k_pyx_unpickle_reify, sizeof(__pyx_k_pyx_unpickle_reify), 0, 0, 1, 1}, {&__pyx_n_s_reduce, __pyx_k_reduce, sizeof(__pyx_k_reduce), 0, 0, 1, 1}, {&__pyx_n_s_reduce_cython, __pyx_k_reduce_cython, sizeof(__pyx_k_reduce_cython), 0, 0, 1, 1}, {&__pyx_n_s_reduce_ex, __pyx_k_reduce_ex, sizeof(__pyx_k_reduce_ex), 0, 0, 1, 1}, {&__pyx_kp_u_reified_property_is_read_only, __pyx_k_reified_property_is_read_only, sizeof(__pyx_k_reified_property_is_read_only), 0, 1, 0, 0}, {&__pyx_n_s_reify, __pyx_k_reify, sizeof(__pyx_k_reify), 0, 0, 1, 1}, {&__pyx_n_s_setstate, __pyx_k_setstate, sizeof(__pyx_k_setstate), 0, 0, 1, 1}, {&__pyx_n_s_setstate_cython, __pyx_k_setstate_cython, sizeof(__pyx_k_setstate_cython), 0, 0, 1, 1}, {&__pyx_kp_s_stringsource, __pyx_k_stringsource, sizeof(__pyx_k_stringsource), 0, 0, 1, 0}, {&__pyx_n_s_test, __pyx_k_test, sizeof(__pyx_k_test), 0, 0, 1, 1}, {&__pyx_n_s_update, __pyx_k_update, sizeof(__pyx_k_update), 0, 0, 1, 1}, {&__pyx_n_s_wrapped, __pyx_k_wrapped, sizeof(__pyx_k_wrapped), 0, 0, 1, 1}, {0, 0, 0, 0, 0, 0, 0} }; static CYTHON_SMALL_CODE int __Pyx_InitCachedBuiltins(void) { __pyx_builtin_KeyError = __Pyx_GetBuiltinName(__pyx_n_s_KeyError); if (!__pyx_builtin_KeyError) __PYX_ERR(0, 25, __pyx_L1_error) __pyx_builtin_AttributeError = __Pyx_GetBuiltinName(__pyx_n_s_AttributeError); if (!__pyx_builtin_AttributeError) __PYX_ERR(0, 29, __pyx_L1_error) return 0; __pyx_L1_error:; return -1; } static CYTHON_SMALL_CODE int __Pyx_InitCachedConstants(void) { __Pyx_RefNannyDeclarations __Pyx_RefNannySetupContext("__Pyx_InitCachedConstants", 0); /* "aiohttp/_helpers.pyx":35 * * def __set__(self, inst, value): * raise AttributeError("reified property is read-only") # <<<<<<<<<<<<<< */ __pyx_tuple_ = PyTuple_Pack(1, __pyx_kp_u_reified_property_is_read_only); if (unlikely(!__pyx_tuple_)) __PYX_ERR(0, 35, __pyx_L1_error) __Pyx_GOTREF(__pyx_tuple_); __Pyx_GIVEREF(__pyx_tuple_); /* "(tree fragment)":1 * def __pyx_unpickle_reify(__pyx_type, long __pyx_checksum, __pyx_state): # <<<<<<<<<<<<<< * cdef object __pyx_PickleError * cdef object __pyx_result */ __pyx_tuple__2 = PyTuple_Pack(5, __pyx_n_s_pyx_type, __pyx_n_s_pyx_checksum, __pyx_n_s_pyx_state, __pyx_n_s_pyx_PickleError, __pyx_n_s_pyx_result); if (unlikely(!__pyx_tuple__2)) __PYX_ERR(1, 1, __pyx_L1_error) __Pyx_GOTREF(__pyx_tuple__2); __Pyx_GIVEREF(__pyx_tuple__2); __pyx_codeobj__3 = (PyObject*)__Pyx_PyCode_New(3, 0, 5, 0, CO_OPTIMIZED|CO_NEWLOCALS, __pyx_empty_bytes, __pyx_empty_tuple, __pyx_empty_tuple, __pyx_tuple__2, __pyx_empty_tuple, __pyx_empty_tuple, __pyx_kp_s_stringsource, __pyx_n_s_pyx_unpickle_reify, 1, __pyx_empty_bytes); if (unlikely(!__pyx_codeobj__3)) __PYX_ERR(1, 1, __pyx_L1_error) __Pyx_RefNannyFinishContext(); return 0; __pyx_L1_error:; __Pyx_RefNannyFinishContext(); return -1; } static CYTHON_SMALL_CODE int __Pyx_InitGlobals(void) { if (__Pyx_InitStrings(__pyx_string_tab) < 0) __PYX_ERR(0, 1, __pyx_L1_error); __pyx_int_124832655 = PyInt_FromLong(124832655L); if (unlikely(!__pyx_int_124832655)) __PYX_ERR(0, 1, __pyx_L1_error) return 0; __pyx_L1_error:; return -1; } static CYTHON_SMALL_CODE int __Pyx_modinit_global_init_code(void); /*proto*/ static CYTHON_SMALL_CODE int __Pyx_modinit_variable_export_code(void); /*proto*/ static CYTHON_SMALL_CODE int 
__Pyx_modinit_function_export_code(void); /*proto*/ static CYTHON_SMALL_CODE int __Pyx_modinit_type_init_code(void); /*proto*/ static CYTHON_SMALL_CODE int __Pyx_modinit_type_import_code(void); /*proto*/ static CYTHON_SMALL_CODE int __Pyx_modinit_variable_import_code(void); /*proto*/ static CYTHON_SMALL_CODE int __Pyx_modinit_function_import_code(void); /*proto*/ static int __Pyx_modinit_global_init_code(void) { __Pyx_RefNannyDeclarations __Pyx_RefNannySetupContext("__Pyx_modinit_global_init_code", 0); /*--- Global init code ---*/ __Pyx_RefNannyFinishContext(); return 0; } static int __Pyx_modinit_variable_export_code(void) { __Pyx_RefNannyDeclarations __Pyx_RefNannySetupContext("__Pyx_modinit_variable_export_code", 0); /*--- Variable export code ---*/ __Pyx_RefNannyFinishContext(); return 0; } static int __Pyx_modinit_function_export_code(void) { __Pyx_RefNannyDeclarations __Pyx_RefNannySetupContext("__Pyx_modinit_function_export_code", 0); /*--- Function export code ---*/ __Pyx_RefNannyFinishContext(); return 0; } static int __Pyx_modinit_type_init_code(void) { __Pyx_RefNannyDeclarations __Pyx_RefNannySetupContext("__Pyx_modinit_type_init_code", 0); /*--- Type init code ---*/ if (PyType_Ready(&__pyx_type_7aiohttp_8_helpers_reify) < 0) __PYX_ERR(0, 1, __pyx_L1_error) #if PY_VERSION_HEX < 0x030800B1 __pyx_type_7aiohttp_8_helpers_reify.tp_print = 0; #endif if ((CYTHON_USE_TYPE_SLOTS && CYTHON_USE_PYTYPE_LOOKUP) && likely(!__pyx_type_7aiohttp_8_helpers_reify.tp_dictoffset && __pyx_type_7aiohttp_8_helpers_reify.tp_getattro == PyObject_GenericGetAttr)) { __pyx_type_7aiohttp_8_helpers_reify.tp_getattro = __Pyx_PyObject_GenericGetAttr; } if (PyObject_SetAttr(__pyx_m, __pyx_n_s_reify, (PyObject *)&__pyx_type_7aiohttp_8_helpers_reify) < 0) __PYX_ERR(0, 1, __pyx_L1_error) if (__Pyx_setup_reduce((PyObject*)&__pyx_type_7aiohttp_8_helpers_reify) < 0) __PYX_ERR(0, 1, __pyx_L1_error) __pyx_ptype_7aiohttp_8_helpers_reify = &__pyx_type_7aiohttp_8_helpers_reify; __Pyx_RefNannyFinishContext(); return 0; __pyx_L1_error:; __Pyx_RefNannyFinishContext(); return -1; } static int __Pyx_modinit_type_import_code(void) { __Pyx_RefNannyDeclarations __Pyx_RefNannySetupContext("__Pyx_modinit_type_import_code", 0); /*--- Type import code ---*/ __Pyx_RefNannyFinishContext(); return 0; } static int __Pyx_modinit_variable_import_code(void) { __Pyx_RefNannyDeclarations __Pyx_RefNannySetupContext("__Pyx_modinit_variable_import_code", 0); /*--- Variable import code ---*/ __Pyx_RefNannyFinishContext(); return 0; } static int __Pyx_modinit_function_import_code(void) { __Pyx_RefNannyDeclarations __Pyx_RefNannySetupContext("__Pyx_modinit_function_import_code", 0); /*--- Function import code ---*/ __Pyx_RefNannyFinishContext(); return 0; } #if PY_MAJOR_VERSION < 3 #ifdef CYTHON_NO_PYINIT_EXPORT #define __Pyx_PyMODINIT_FUNC void #else #define __Pyx_PyMODINIT_FUNC PyMODINIT_FUNC #endif #else #ifdef CYTHON_NO_PYINIT_EXPORT #define __Pyx_PyMODINIT_FUNC PyObject * #else #define __Pyx_PyMODINIT_FUNC PyMODINIT_FUNC #endif #endif #if PY_MAJOR_VERSION < 3 __Pyx_PyMODINIT_FUNC init_helpers(void) CYTHON_SMALL_CODE; /*proto*/ __Pyx_PyMODINIT_FUNC init_helpers(void) #else __Pyx_PyMODINIT_FUNC PyInit__helpers(void) CYTHON_SMALL_CODE; /*proto*/ __Pyx_PyMODINIT_FUNC PyInit__helpers(void) #if CYTHON_PEP489_MULTI_PHASE_INIT { return PyModuleDef_Init(&__pyx_moduledef); } static CYTHON_SMALL_CODE int __Pyx_check_single_interpreter(void) { #if PY_VERSION_HEX >= 0x030700A1 static PY_INT64_T main_interpreter_id = -1; PY_INT64_T current_id = 
PyInterpreterState_GetID(PyThreadState_Get()->interp); if (main_interpreter_id == -1) { main_interpreter_id = current_id; return (unlikely(current_id == -1)) ? -1 : 0; } else if (unlikely(main_interpreter_id != current_id)) #else static PyInterpreterState *main_interpreter = NULL; PyInterpreterState *current_interpreter = PyThreadState_Get()->interp; if (!main_interpreter) { main_interpreter = current_interpreter; } else if (unlikely(main_interpreter != current_interpreter)) #endif { PyErr_SetString( PyExc_ImportError, "Interpreter change detected - this module can only be loaded into one interpreter per process."); return -1; } return 0; } static CYTHON_SMALL_CODE int __Pyx_copy_spec_to_module(PyObject *spec, PyObject *moddict, const char* from_name, const char* to_name, int allow_none) { PyObject *value = PyObject_GetAttrString(spec, from_name); int result = 0; if (likely(value)) { if (allow_none || value != Py_None) { result = PyDict_SetItemString(moddict, to_name, value); } Py_DECREF(value); } else if (PyErr_ExceptionMatches(PyExc_AttributeError)) { PyErr_Clear(); } else { result = -1; } return result; } static CYTHON_SMALL_CODE PyObject* __pyx_pymod_create(PyObject *spec, CYTHON_UNUSED PyModuleDef *def) { PyObject *module = NULL, *moddict, *modname; if (__Pyx_check_single_interpreter()) return NULL; if (__pyx_m) return __Pyx_NewRef(__pyx_m); modname = PyObject_GetAttrString(spec, "name"); if (unlikely(!modname)) goto bad; module = PyModule_NewObject(modname); Py_DECREF(modname); if (unlikely(!module)) goto bad; moddict = PyModule_GetDict(module); if (unlikely(!moddict)) goto bad; if (unlikely(__Pyx_copy_spec_to_module(spec, moddict, "loader", "__loader__", 1) < 0)) goto bad; if (unlikely(__Pyx_copy_spec_to_module(spec, moddict, "origin", "__file__", 1) < 0)) goto bad; if (unlikely(__Pyx_copy_spec_to_module(spec, moddict, "parent", "__package__", 1) < 0)) goto bad; if (unlikely(__Pyx_copy_spec_to_module(spec, moddict, "submodule_search_locations", "__path__", 0) < 0)) goto bad; return module; bad: Py_XDECREF(module); return NULL; } static CYTHON_SMALL_CODE int __pyx_pymod_exec__helpers(PyObject *__pyx_pyinit_module) #endif #endif { PyObject *__pyx_t_1 = NULL; __Pyx_RefNannyDeclarations #if CYTHON_PEP489_MULTI_PHASE_INIT if (__pyx_m) { if (__pyx_m == __pyx_pyinit_module) return 0; PyErr_SetString(PyExc_RuntimeError, "Module '_helpers' has already been imported. 
Re-initialisation is not supported."); return -1; } #elif PY_MAJOR_VERSION >= 3 if (__pyx_m) return __Pyx_NewRef(__pyx_m); #endif #if CYTHON_REFNANNY __Pyx_RefNanny = __Pyx_RefNannyImportAPI("refnanny"); if (!__Pyx_RefNanny) { PyErr_Clear(); __Pyx_RefNanny = __Pyx_RefNannyImportAPI("Cython.Runtime.refnanny"); if (!__Pyx_RefNanny) Py_FatalError("failed to import 'refnanny' module"); } #endif __Pyx_RefNannySetupContext("__Pyx_PyMODINIT_FUNC PyInit__helpers(void)", 0); if (__Pyx_check_binary_version() < 0) __PYX_ERR(0, 1, __pyx_L1_error) #ifdef __Pxy_PyFrame_Initialize_Offsets __Pxy_PyFrame_Initialize_Offsets(); #endif __pyx_empty_tuple = PyTuple_New(0); if (unlikely(!__pyx_empty_tuple)) __PYX_ERR(0, 1, __pyx_L1_error) __pyx_empty_bytes = PyBytes_FromStringAndSize("", 0); if (unlikely(!__pyx_empty_bytes)) __PYX_ERR(0, 1, __pyx_L1_error) __pyx_empty_unicode = PyUnicode_FromStringAndSize("", 0); if (unlikely(!__pyx_empty_unicode)) __PYX_ERR(0, 1, __pyx_L1_error) #ifdef __Pyx_CyFunction_USED if (__pyx_CyFunction_init() < 0) __PYX_ERR(0, 1, __pyx_L1_error) #endif #ifdef __Pyx_FusedFunction_USED if (__pyx_FusedFunction_init() < 0) __PYX_ERR(0, 1, __pyx_L1_error) #endif #ifdef __Pyx_Coroutine_USED if (__pyx_Coroutine_init() < 0) __PYX_ERR(0, 1, __pyx_L1_error) #endif #ifdef __Pyx_Generator_USED if (__pyx_Generator_init() < 0) __PYX_ERR(0, 1, __pyx_L1_error) #endif #ifdef __Pyx_AsyncGen_USED if (__pyx_AsyncGen_init() < 0) __PYX_ERR(0, 1, __pyx_L1_error) #endif #ifdef __Pyx_StopAsyncIteration_USED if (__pyx_StopAsyncIteration_init() < 0) __PYX_ERR(0, 1, __pyx_L1_error) #endif /*--- Library function declarations ---*/ /*--- Threads initialization code ---*/ #if defined(__PYX_FORCE_INIT_THREADS) && __PYX_FORCE_INIT_THREADS #ifdef WITH_THREAD /* Python build with threading support? */ PyEval_InitThreads(); #endif #endif /*--- Module creation code ---*/ #if CYTHON_PEP489_MULTI_PHASE_INIT __pyx_m = __pyx_pyinit_module; Py_INCREF(__pyx_m); #else #if PY_MAJOR_VERSION < 3 __pyx_m = Py_InitModule4("_helpers", __pyx_methods, 0, 0, PYTHON_API_VERSION); Py_XINCREF(__pyx_m); #else __pyx_m = PyModule_Create(&__pyx_moduledef); #endif if (unlikely(!__pyx_m)) __PYX_ERR(0, 1, __pyx_L1_error) #endif __pyx_d = PyModule_GetDict(__pyx_m); if (unlikely(!__pyx_d)) __PYX_ERR(0, 1, __pyx_L1_error) Py_INCREF(__pyx_d); __pyx_b = PyImport_AddModule(__Pyx_BUILTIN_MODULE_NAME); if (unlikely(!__pyx_b)) __PYX_ERR(0, 1, __pyx_L1_error) Py_INCREF(__pyx_b); __pyx_cython_runtime = PyImport_AddModule((char *) "cython_runtime"); if (unlikely(!__pyx_cython_runtime)) __PYX_ERR(0, 1, __pyx_L1_error) Py_INCREF(__pyx_cython_runtime); if (PyObject_SetAttrString(__pyx_m, "__builtins__", __pyx_b) < 0) __PYX_ERR(0, 1, __pyx_L1_error); /*--- Initialize various global constants etc. 
---*/ if (__Pyx_InitGlobals() < 0) __PYX_ERR(0, 1, __pyx_L1_error) #if PY_MAJOR_VERSION < 3 && (__PYX_DEFAULT_STRING_ENCODING_IS_ASCII || __PYX_DEFAULT_STRING_ENCODING_IS_DEFAULT) if (__Pyx_init_sys_getdefaultencoding_params() < 0) __PYX_ERR(0, 1, __pyx_L1_error) #endif if (__pyx_module_is_main_aiohttp___helpers) { if (PyObject_SetAttr(__pyx_m, __pyx_n_s_name, __pyx_n_s_main) < 0) __PYX_ERR(0, 1, __pyx_L1_error) } #if PY_MAJOR_VERSION >= 3 { PyObject *modules = PyImport_GetModuleDict(); if (unlikely(!modules)) __PYX_ERR(0, 1, __pyx_L1_error) if (!PyDict_GetItemString(modules, "aiohttp._helpers")) { if (unlikely(PyDict_SetItemString(modules, "aiohttp._helpers", __pyx_m) < 0)) __PYX_ERR(0, 1, __pyx_L1_error) } } #endif /*--- Builtin init code ---*/ if (__Pyx_InitCachedBuiltins() < 0) goto __pyx_L1_error; /*--- Constants init code ---*/ if (__Pyx_InitCachedConstants() < 0) goto __pyx_L1_error; /*--- Global type/function init code ---*/ (void)__Pyx_modinit_global_init_code(); (void)__Pyx_modinit_variable_export_code(); (void)__Pyx_modinit_function_export_code(); if (unlikely(__Pyx_modinit_type_init_code() != 0)) goto __pyx_L1_error; (void)__Pyx_modinit_type_import_code(); (void)__Pyx_modinit_variable_import_code(); (void)__Pyx_modinit_function_import_code(); /*--- Execution code ---*/ #if defined(__Pyx_Generator_USED) || defined(__Pyx_Coroutine_USED) if (__Pyx_patch_abc() < 0) __PYX_ERR(0, 1, __pyx_L1_error) #endif /* "(tree fragment)":1 * def __pyx_unpickle_reify(__pyx_type, long __pyx_checksum, __pyx_state): # <<<<<<<<<<<<<< * cdef object __pyx_PickleError * cdef object __pyx_result */ __pyx_t_1 = PyCFunction_NewEx(&__pyx_mdef_7aiohttp_8_helpers_1__pyx_unpickle_reify, NULL, __pyx_n_s_aiohttp__helpers); if (unlikely(!__pyx_t_1)) __PYX_ERR(1, 1, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_1); if (PyDict_SetItem(__pyx_d, __pyx_n_s_pyx_unpickle_reify, __pyx_t_1) < 0) __PYX_ERR(1, 1, __pyx_L1_error) __Pyx_DECREF(__pyx_t_1); __pyx_t_1 = 0; /* "aiohttp/_helpers.pyx":1 * cdef class reify: # <<<<<<<<<<<<<< * """Use as a class method decorator. It operates almost exactly like * the Python `@property` decorator, but it puts the result of the */ __pyx_t_1 = __Pyx_PyDict_NewPresized(0); if (unlikely(!__pyx_t_1)) __PYX_ERR(0, 1, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_1); if (PyDict_SetItem(__pyx_d, __pyx_n_s_test, __pyx_t_1) < 0) __PYX_ERR(0, 1, __pyx_L1_error) __Pyx_DECREF(__pyx_t_1); __pyx_t_1 = 0; /*--- Wrapped vars code ---*/ goto __pyx_L0; __pyx_L1_error:; __Pyx_XDECREF(__pyx_t_1); if (__pyx_m) { if (__pyx_d) { __Pyx_AddTraceback("init aiohttp._helpers", __pyx_clineno, __pyx_lineno, __pyx_filename); } Py_CLEAR(__pyx_m); } else if (!PyErr_Occurred()) { PyErr_SetString(PyExc_ImportError, "init aiohttp._helpers"); } __pyx_L0:; __Pyx_RefNannyFinishContext(); #if CYTHON_PEP489_MULTI_PHASE_INIT return (__pyx_m != NULL) ? 
0 : -1; #elif PY_MAJOR_VERSION >= 3 return __pyx_m; #else return; #endif } /* --- Runtime support code --- */ /* Refnanny */ #if CYTHON_REFNANNY static __Pyx_RefNannyAPIStruct *__Pyx_RefNannyImportAPI(const char *modname) { PyObject *m = NULL, *p = NULL; void *r = NULL; m = PyImport_ImportModule(modname); if (!m) goto end; p = PyObject_GetAttrString(m, "RefNannyAPI"); if (!p) goto end; r = PyLong_AsVoidPtr(p); end: Py_XDECREF(p); Py_XDECREF(m); return (__Pyx_RefNannyAPIStruct *)r; } #endif /* PyObjectGetAttrStr */ #if CYTHON_USE_TYPE_SLOTS static CYTHON_INLINE PyObject* __Pyx_PyObject_GetAttrStr(PyObject* obj, PyObject* attr_name) { PyTypeObject* tp = Py_TYPE(obj); if (likely(tp->tp_getattro)) return tp->tp_getattro(obj, attr_name); #if PY_MAJOR_VERSION < 3 if (likely(tp->tp_getattr)) return tp->tp_getattr(obj, PyString_AS_STRING(attr_name)); #endif return PyObject_GetAttr(obj, attr_name); } #endif /* GetBuiltinName */ static PyObject *__Pyx_GetBuiltinName(PyObject *name) { PyObject* result = __Pyx_PyObject_GetAttrStr(__pyx_b, name); if (unlikely(!result)) { PyErr_Format(PyExc_NameError, #if PY_MAJOR_VERSION >= 3 "name '%U' is not defined", name); #else "name '%.200s' is not defined", PyString_AS_STRING(name)); #endif } return result; } /* RaiseDoubleKeywords */ static void __Pyx_RaiseDoubleKeywordsError( const char* func_name, PyObject* kw_name) { PyErr_Format(PyExc_TypeError, #if PY_MAJOR_VERSION >= 3 "%s() got multiple values for keyword argument '%U'", func_name, kw_name); #else "%s() got multiple values for keyword argument '%s'", func_name, PyString_AsString(kw_name)); #endif } /* ParseKeywords */ static int __Pyx_ParseOptionalKeywords( PyObject *kwds, PyObject **argnames[], PyObject *kwds2, PyObject *values[], Py_ssize_t num_pos_args, const char* function_name) { PyObject *key = 0, *value = 0; Py_ssize_t pos = 0; PyObject*** name; PyObject*** first_kw_arg = argnames + num_pos_args; while (PyDict_Next(kwds, &pos, &key, &value)) { name = first_kw_arg; while (*name && (**name != key)) name++; if (*name) { values[name-argnames] = value; continue; } name = first_kw_arg; #if PY_MAJOR_VERSION < 3 if (likely(PyString_CheckExact(key)) || likely(PyString_Check(key))) { while (*name) { if ((CYTHON_COMPILING_IN_PYPY || PyString_GET_SIZE(**name) == PyString_GET_SIZE(key)) && _PyString_Eq(**name, key)) { values[name-argnames] = value; break; } name++; } if (*name) continue; else { PyObject*** argname = argnames; while (argname != first_kw_arg) { if ((**argname == key) || ( (CYTHON_COMPILING_IN_PYPY || PyString_GET_SIZE(**argname) == PyString_GET_SIZE(key)) && _PyString_Eq(**argname, key))) { goto arg_passed_twice; } argname++; } } } else #endif if (likely(PyUnicode_Check(key))) { while (*name) { int cmp = (**name == key) ? 0 : #if !CYTHON_COMPILING_IN_PYPY && PY_MAJOR_VERSION >= 3 (PyUnicode_GET_SIZE(**name) != PyUnicode_GET_SIZE(key)) ? 1 : #endif PyUnicode_Compare(**name, key); if (cmp < 0 && unlikely(PyErr_Occurred())) goto bad; if (cmp == 0) { values[name-argnames] = value; break; } name++; } if (*name) continue; else { PyObject*** argname = argnames; while (argname != first_kw_arg) { int cmp = (**argname == key) ? 0 : #if !CYTHON_COMPILING_IN_PYPY && PY_MAJOR_VERSION >= 3 (PyUnicode_GET_SIZE(**argname) != PyUnicode_GET_SIZE(key)) ? 
1 : #endif PyUnicode_Compare(**argname, key); if (cmp < 0 && unlikely(PyErr_Occurred())) goto bad; if (cmp == 0) goto arg_passed_twice; argname++; } } } else goto invalid_keyword_type; if (kwds2) { if (unlikely(PyDict_SetItem(kwds2, key, value))) goto bad; } else { goto invalid_keyword; } } return 0; arg_passed_twice: __Pyx_RaiseDoubleKeywordsError(function_name, key); goto bad; invalid_keyword_type: PyErr_Format(PyExc_TypeError, "%.200s() keywords must be strings", function_name); goto bad; invalid_keyword: PyErr_Format(PyExc_TypeError, #if PY_MAJOR_VERSION < 3 "%.200s() got an unexpected keyword argument '%.200s'", function_name, PyString_AsString(key)); #else "%s() got an unexpected keyword argument '%U'", function_name, key); #endif bad: return -1; } /* RaiseArgTupleInvalid */ static void __Pyx_RaiseArgtupleInvalid( const char* func_name, int exact, Py_ssize_t num_min, Py_ssize_t num_max, Py_ssize_t num_found) { Py_ssize_t num_expected; const char *more_or_less; if (num_found < num_min) { num_expected = num_min; more_or_less = "at least"; } else { num_expected = num_max; more_or_less = "at most"; } if (exact) { more_or_less = "exactly"; } PyErr_Format(PyExc_TypeError, "%.200s() takes %.8s %" CYTHON_FORMAT_SSIZE_T "d positional argument%.1s (%" CYTHON_FORMAT_SSIZE_T "d given)", func_name, more_or_less, num_expected, (num_expected == 1) ? "" : "s", num_found); } /* GetItemInt */ static PyObject *__Pyx_GetItemInt_Generic(PyObject *o, PyObject* j) { PyObject *r; if (!j) return NULL; r = PyObject_GetItem(o, j); Py_DECREF(j); return r; } static CYTHON_INLINE PyObject *__Pyx_GetItemInt_List_Fast(PyObject *o, Py_ssize_t i, CYTHON_NCP_UNUSED int wraparound, CYTHON_NCP_UNUSED int boundscheck) { #if CYTHON_ASSUME_SAFE_MACROS && !CYTHON_AVOID_BORROWED_REFS Py_ssize_t wrapped_i = i; if (wraparound & unlikely(i < 0)) { wrapped_i += PyList_GET_SIZE(o); } if ((!boundscheck) || likely(__Pyx_is_valid_index(wrapped_i, PyList_GET_SIZE(o)))) { PyObject *r = PyList_GET_ITEM(o, wrapped_i); Py_INCREF(r); return r; } return __Pyx_GetItemInt_Generic(o, PyInt_FromSsize_t(i)); #else return PySequence_GetItem(o, i); #endif } static CYTHON_INLINE PyObject *__Pyx_GetItemInt_Tuple_Fast(PyObject *o, Py_ssize_t i, CYTHON_NCP_UNUSED int wraparound, CYTHON_NCP_UNUSED int boundscheck) { #if CYTHON_ASSUME_SAFE_MACROS && !CYTHON_AVOID_BORROWED_REFS Py_ssize_t wrapped_i = i; if (wraparound & unlikely(i < 0)) { wrapped_i += PyTuple_GET_SIZE(o); } if ((!boundscheck) || likely(__Pyx_is_valid_index(wrapped_i, PyTuple_GET_SIZE(o)))) { PyObject *r = PyTuple_GET_ITEM(o, wrapped_i); Py_INCREF(r); return r; } return __Pyx_GetItemInt_Generic(o, PyInt_FromSsize_t(i)); #else return PySequence_GetItem(o, i); #endif } static CYTHON_INLINE PyObject *__Pyx_GetItemInt_Fast(PyObject *o, Py_ssize_t i, int is_list, CYTHON_NCP_UNUSED int wraparound, CYTHON_NCP_UNUSED int boundscheck) { #if CYTHON_ASSUME_SAFE_MACROS && !CYTHON_AVOID_BORROWED_REFS && CYTHON_USE_TYPE_SLOTS if (is_list || PyList_CheckExact(o)) { Py_ssize_t n = ((!wraparound) | likely(i >= 0)) ? i : i + PyList_GET_SIZE(o); if ((!boundscheck) || (likely(__Pyx_is_valid_index(n, PyList_GET_SIZE(o))))) { PyObject *r = PyList_GET_ITEM(o, n); Py_INCREF(r); return r; } } else if (PyTuple_CheckExact(o)) { Py_ssize_t n = ((!wraparound) | likely(i >= 0)) ? 
i : i + PyTuple_GET_SIZE(o); if ((!boundscheck) || likely(__Pyx_is_valid_index(n, PyTuple_GET_SIZE(o)))) { PyObject *r = PyTuple_GET_ITEM(o, n); Py_INCREF(r); return r; } } else { PySequenceMethods *m = Py_TYPE(o)->tp_as_sequence; if (likely(m && m->sq_item)) { if (wraparound && unlikely(i < 0) && likely(m->sq_length)) { Py_ssize_t l = m->sq_length(o); if (likely(l >= 0)) { i += l; } else { if (!PyErr_ExceptionMatches(PyExc_OverflowError)) return NULL; PyErr_Clear(); } } return m->sq_item(o, i); } } #else if (is_list || PySequence_Check(o)) { return PySequence_GetItem(o, i); } #endif return __Pyx_GetItemInt_Generic(o, PyInt_FromSsize_t(i)); } /* ObjectGetItem */ #if CYTHON_USE_TYPE_SLOTS static PyObject *__Pyx_PyObject_GetIndex(PyObject *obj, PyObject* index) { PyObject *runerr; Py_ssize_t key_value; PySequenceMethods *m = Py_TYPE(obj)->tp_as_sequence; if (unlikely(!(m && m->sq_item))) { PyErr_Format(PyExc_TypeError, "'%.200s' object is not subscriptable", Py_TYPE(obj)->tp_name); return NULL; } key_value = __Pyx_PyIndex_AsSsize_t(index); if (likely(key_value != -1 || !(runerr = PyErr_Occurred()))) { return __Pyx_GetItemInt_Fast(obj, key_value, 0, 1, 1); } if (PyErr_GivenExceptionMatches(runerr, PyExc_OverflowError)) { PyErr_Clear(); PyErr_Format(PyExc_IndexError, "cannot fit '%.200s' into an index-sized integer", Py_TYPE(index)->tp_name); } return NULL; } static PyObject *__Pyx_PyObject_GetItem(PyObject *obj, PyObject* key) { PyMappingMethods *m = Py_TYPE(obj)->tp_as_mapping; if (likely(m && m->mp_subscript)) { return m->mp_subscript(obj, key); } return __Pyx_PyObject_GetIndex(obj, key); } #endif /* GetTopmostException */ #if CYTHON_USE_EXC_INFO_STACK static _PyErr_StackItem * __Pyx_PyErr_GetTopmostException(PyThreadState *tstate) { _PyErr_StackItem *exc_info = tstate->exc_info; while ((exc_info->exc_type == NULL || exc_info->exc_type == Py_None) && exc_info->previous_item != NULL) { exc_info = exc_info->previous_item; } return exc_info; } #endif /* SaveResetException */ #if CYTHON_FAST_THREAD_STATE static CYTHON_INLINE void __Pyx__ExceptionSave(PyThreadState *tstate, PyObject **type, PyObject **value, PyObject **tb) { #if CYTHON_USE_EXC_INFO_STACK _PyErr_StackItem *exc_info = __Pyx_PyErr_GetTopmostException(tstate); *type = exc_info->exc_type; *value = exc_info->exc_value; *tb = exc_info->exc_traceback; #else *type = tstate->exc_type; *value = tstate->exc_value; *tb = tstate->exc_traceback; #endif Py_XINCREF(*type); Py_XINCREF(*value); Py_XINCREF(*tb); } static CYTHON_INLINE void __Pyx__ExceptionReset(PyThreadState *tstate, PyObject *type, PyObject *value, PyObject *tb) { PyObject *tmp_type, *tmp_value, *tmp_tb; #if CYTHON_USE_EXC_INFO_STACK _PyErr_StackItem *exc_info = tstate->exc_info; tmp_type = exc_info->exc_type; tmp_value = exc_info->exc_value; tmp_tb = exc_info->exc_traceback; exc_info->exc_type = type; exc_info->exc_value = value; exc_info->exc_traceback = tb; #else tmp_type = tstate->exc_type; tmp_value = tstate->exc_value; tmp_tb = tstate->exc_traceback; tstate->exc_type = type; tstate->exc_value = value; tstate->exc_traceback = tb; #endif Py_XDECREF(tmp_type); Py_XDECREF(tmp_value); Py_XDECREF(tmp_tb); } #endif /* PyErrExceptionMatches */ #if CYTHON_FAST_THREAD_STATE static int __Pyx_PyErr_ExceptionMatchesTuple(PyObject *exc_type, PyObject *tuple) { Py_ssize_t i, n; n = PyTuple_GET_SIZE(tuple); #if PY_MAJOR_VERSION >= 3 for (i=0; icurexc_type; if (exc_type == err) return 1; if (unlikely(!exc_type)) return 0; if (unlikely(PyTuple_Check(err))) return 
__Pyx_PyErr_ExceptionMatchesTuple(exc_type, err); return __Pyx_PyErr_GivenExceptionMatches(exc_type, err); } #endif /* GetException */ #if CYTHON_FAST_THREAD_STATE static int __Pyx__GetException(PyThreadState *tstate, PyObject **type, PyObject **value, PyObject **tb) #else static int __Pyx_GetException(PyObject **type, PyObject **value, PyObject **tb) #endif { PyObject *local_type, *local_value, *local_tb; #if CYTHON_FAST_THREAD_STATE PyObject *tmp_type, *tmp_value, *tmp_tb; local_type = tstate->curexc_type; local_value = tstate->curexc_value; local_tb = tstate->curexc_traceback; tstate->curexc_type = 0; tstate->curexc_value = 0; tstate->curexc_traceback = 0; #else PyErr_Fetch(&local_type, &local_value, &local_tb); #endif PyErr_NormalizeException(&local_type, &local_value, &local_tb); #if CYTHON_FAST_THREAD_STATE if (unlikely(tstate->curexc_type)) #else if (unlikely(PyErr_Occurred())) #endif goto bad; #if PY_MAJOR_VERSION >= 3 if (local_tb) { if (unlikely(PyException_SetTraceback(local_value, local_tb) < 0)) goto bad; } #endif Py_XINCREF(local_tb); Py_XINCREF(local_type); Py_XINCREF(local_value); *type = local_type; *value = local_value; *tb = local_tb; #if CYTHON_FAST_THREAD_STATE #if CYTHON_USE_EXC_INFO_STACK { _PyErr_StackItem *exc_info = tstate->exc_info; tmp_type = exc_info->exc_type; tmp_value = exc_info->exc_value; tmp_tb = exc_info->exc_traceback; exc_info->exc_type = local_type; exc_info->exc_value = local_value; exc_info->exc_traceback = local_tb; } #else tmp_type = tstate->exc_type; tmp_value = tstate->exc_value; tmp_tb = tstate->exc_traceback; tstate->exc_type = local_type; tstate->exc_value = local_value; tstate->exc_traceback = local_tb; #endif Py_XDECREF(tmp_type); Py_XDECREF(tmp_value); Py_XDECREF(tmp_tb); #else PyErr_SetExcInfo(local_type, local_value, local_tb); #endif return 0; bad: *type = 0; *value = 0; *tb = 0; Py_XDECREF(local_type); Py_XDECREF(local_value); Py_XDECREF(local_tb); return -1; } /* PyCFunctionFastCall */ #if CYTHON_FAST_PYCCALL static CYTHON_INLINE PyObject * __Pyx_PyCFunction_FastCall(PyObject *func_obj, PyObject **args, Py_ssize_t nargs) { PyCFunctionObject *func = (PyCFunctionObject*)func_obj; PyCFunction meth = PyCFunction_GET_FUNCTION(func); PyObject *self = PyCFunction_GET_SELF(func); int flags = PyCFunction_GET_FLAGS(func); assert(PyCFunction_Check(func)); assert(METH_FASTCALL == (flags & ~(METH_CLASS | METH_STATIC | METH_COEXIST | METH_KEYWORDS | METH_STACKLESS))); assert(nargs >= 0); assert(nargs == 0 || args != NULL); /* _PyCFunction_FastCallDict() must not be called with an exception set, because it may clear it (directly or indirectly) and so the caller loses its exception */ assert(!PyErr_Occurred()); if ((PY_VERSION_HEX < 0x030700A0) || unlikely(flags & METH_KEYWORDS)) { return (*((__Pyx_PyCFunctionFastWithKeywords)(void*)meth)) (self, args, nargs, NULL); } else { return (*((__Pyx_PyCFunctionFast)(void*)meth)) (self, args, nargs); } } #endif /* PyFunctionFastCall */ #if CYTHON_FAST_PYCALL static PyObject* __Pyx_PyFunction_FastCallNoKw(PyCodeObject *co, PyObject **args, Py_ssize_t na, PyObject *globals) { PyFrameObject *f; PyThreadState *tstate = __Pyx_PyThreadState_Current; PyObject **fastlocals; Py_ssize_t i; PyObject *result; assert(globals != NULL); /* XXX Perhaps we should create a specialized PyFrame_New() that doesn't take locals, but does take builtins without sanity checking them. 
*/ assert(tstate != NULL); f = PyFrame_New(tstate, co, globals, NULL); if (f == NULL) { return NULL; } fastlocals = __Pyx_PyFrame_GetLocalsplus(f); for (i = 0; i < na; i++) { Py_INCREF(*args); fastlocals[i] = *args++; } result = PyEval_EvalFrameEx(f,0); ++tstate->recursion_depth; Py_DECREF(f); --tstate->recursion_depth; return result; } #if 1 || PY_VERSION_HEX < 0x030600B1 static PyObject *__Pyx_PyFunction_FastCallDict(PyObject *func, PyObject **args, Py_ssize_t nargs, PyObject *kwargs) { PyCodeObject *co = (PyCodeObject *)PyFunction_GET_CODE(func); PyObject *globals = PyFunction_GET_GLOBALS(func); PyObject *argdefs = PyFunction_GET_DEFAULTS(func); PyObject *closure; #if PY_MAJOR_VERSION >= 3 PyObject *kwdefs; #endif PyObject *kwtuple, **k; PyObject **d; Py_ssize_t nd; Py_ssize_t nk; PyObject *result; assert(kwargs == NULL || PyDict_Check(kwargs)); nk = kwargs ? PyDict_Size(kwargs) : 0; if (Py_EnterRecursiveCall((char*)" while calling a Python object")) { return NULL; } if ( #if PY_MAJOR_VERSION >= 3 co->co_kwonlyargcount == 0 && #endif likely(kwargs == NULL || nk == 0) && co->co_flags == (CO_OPTIMIZED | CO_NEWLOCALS | CO_NOFREE)) { if (argdefs == NULL && co->co_argcount == nargs) { result = __Pyx_PyFunction_FastCallNoKw(co, args, nargs, globals); goto done; } else if (nargs == 0 && argdefs != NULL && co->co_argcount == Py_SIZE(argdefs)) { /* function called with no arguments, but all parameters have a default value: use default values as arguments .*/ args = &PyTuple_GET_ITEM(argdefs, 0); result =__Pyx_PyFunction_FastCallNoKw(co, args, Py_SIZE(argdefs), globals); goto done; } } if (kwargs != NULL) { Py_ssize_t pos, i; kwtuple = PyTuple_New(2 * nk); if (kwtuple == NULL) { result = NULL; goto done; } k = &PyTuple_GET_ITEM(kwtuple, 0); pos = i = 0; while (PyDict_Next(kwargs, &pos, &k[i], &k[i+1])) { Py_INCREF(k[i]); Py_INCREF(k[i+1]); i += 2; } nk = i / 2; } else { kwtuple = NULL; k = NULL; } closure = PyFunction_GET_CLOSURE(func); #if PY_MAJOR_VERSION >= 3 kwdefs = PyFunction_GET_KW_DEFAULTS(func); #endif if (argdefs != NULL) { d = &PyTuple_GET_ITEM(argdefs, 0); nd = Py_SIZE(argdefs); } else { d = NULL; nd = 0; } #if PY_MAJOR_VERSION >= 3 result = PyEval_EvalCodeEx((PyObject*)co, globals, (PyObject *)NULL, args, (int)nargs, k, (int)nk, d, (int)nd, kwdefs, closure); #else result = PyEval_EvalCodeEx(co, globals, (PyObject *)NULL, args, (int)nargs, k, (int)nk, d, (int)nd, closure); #endif Py_XDECREF(kwtuple); done: Py_LeaveRecursiveCall(); return result; } #endif #endif /* PyObjectCall */ #if CYTHON_COMPILING_IN_CPYTHON static CYTHON_INLINE PyObject* __Pyx_PyObject_Call(PyObject *func, PyObject *arg, PyObject *kw) { PyObject *result; ternaryfunc call = func->ob_type->tp_call; if (unlikely(!call)) return PyObject_Call(func, arg, kw); if (unlikely(Py_EnterRecursiveCall((char*)" while calling a Python object"))) return NULL; result = (*call)(func, arg, kw); Py_LeaveRecursiveCall(); if (unlikely(!result) && unlikely(!PyErr_Occurred())) { PyErr_SetString( PyExc_SystemError, "NULL result without error in PyObject_Call"); } return result; } #endif /* PyObjectCall2Args */ static CYTHON_UNUSED PyObject* __Pyx_PyObject_Call2Args(PyObject* function, PyObject* arg1, PyObject* arg2) { PyObject *args, *result = NULL; #if CYTHON_FAST_PYCALL if (PyFunction_Check(function)) { PyObject *args[2] = {arg1, arg2}; return __Pyx_PyFunction_FastCall(function, args, 2); } #endif #if CYTHON_FAST_PYCCALL if (__Pyx_PyFastCFunction_Check(function)) { PyObject *args[2] = {arg1, arg2}; return 
__Pyx_PyCFunction_FastCall(function, args, 2); } #endif args = PyTuple_New(2); if (unlikely(!args)) goto done; Py_INCREF(arg1); PyTuple_SET_ITEM(args, 0, arg1); Py_INCREF(arg2); PyTuple_SET_ITEM(args, 1, arg2); Py_INCREF(function); result = __Pyx_PyObject_Call(function, args, NULL); Py_DECREF(args); Py_DECREF(function); done: return result; } /* PyObjectCallMethO */ #if CYTHON_COMPILING_IN_CPYTHON static CYTHON_INLINE PyObject* __Pyx_PyObject_CallMethO(PyObject *func, PyObject *arg) { PyObject *self, *result; PyCFunction cfunc; cfunc = PyCFunction_GET_FUNCTION(func); self = PyCFunction_GET_SELF(func); if (unlikely(Py_EnterRecursiveCall((char*)" while calling a Python object"))) return NULL; result = cfunc(self, arg); Py_LeaveRecursiveCall(); if (unlikely(!result) && unlikely(!PyErr_Occurred())) { PyErr_SetString( PyExc_SystemError, "NULL result without error in PyObject_Call"); } return result; } #endif /* PyObjectCallOneArg */ #if CYTHON_COMPILING_IN_CPYTHON static PyObject* __Pyx__PyObject_CallOneArg(PyObject *func, PyObject *arg) { PyObject *result; PyObject *args = PyTuple_New(1); if (unlikely(!args)) return NULL; Py_INCREF(arg); PyTuple_SET_ITEM(args, 0, arg); result = __Pyx_PyObject_Call(func, args, NULL); Py_DECREF(args); return result; } static CYTHON_INLINE PyObject* __Pyx_PyObject_CallOneArg(PyObject *func, PyObject *arg) { #if CYTHON_FAST_PYCALL if (PyFunction_Check(func)) { return __Pyx_PyFunction_FastCall(func, &arg, 1); } #endif if (likely(PyCFunction_Check(func))) { if (likely(PyCFunction_GET_FLAGS(func) & METH_O)) { return __Pyx_PyObject_CallMethO(func, arg); #if CYTHON_FAST_PYCCALL } else if (PyCFunction_GET_FLAGS(func) & METH_FASTCALL) { return __Pyx_PyCFunction_FastCall(func, &arg, 1); #endif } } return __Pyx__PyObject_CallOneArg(func, arg); } #else static CYTHON_INLINE PyObject* __Pyx_PyObject_CallOneArg(PyObject *func, PyObject *arg) { PyObject *result; PyObject *args = PyTuple_Pack(1, arg); if (unlikely(!args)) return NULL; result = __Pyx_PyObject_Call(func, args, NULL); Py_DECREF(args); return result; } #endif /* PyErrFetchRestore */ #if CYTHON_FAST_THREAD_STATE static CYTHON_INLINE void __Pyx_ErrRestoreInState(PyThreadState *tstate, PyObject *type, PyObject *value, PyObject *tb) { PyObject *tmp_type, *tmp_value, *tmp_tb; tmp_type = tstate->curexc_type; tmp_value = tstate->curexc_value; tmp_tb = tstate->curexc_traceback; tstate->curexc_type = type; tstate->curexc_value = value; tstate->curexc_traceback = tb; Py_XDECREF(tmp_type); Py_XDECREF(tmp_value); Py_XDECREF(tmp_tb); } static CYTHON_INLINE void __Pyx_ErrFetchInState(PyThreadState *tstate, PyObject **type, PyObject **value, PyObject **tb) { *type = tstate->curexc_type; *value = tstate->curexc_value; *tb = tstate->curexc_traceback; tstate->curexc_type = 0; tstate->curexc_value = 0; tstate->curexc_traceback = 0; } #endif /* RaiseException */ #if PY_MAJOR_VERSION < 3 static void __Pyx_Raise(PyObject *type, PyObject *value, PyObject *tb, CYTHON_UNUSED PyObject *cause) { __Pyx_PyThreadState_declare Py_XINCREF(type); if (!value || value == Py_None) value = NULL; else Py_INCREF(value); if (!tb || tb == Py_None) tb = NULL; else { Py_INCREF(tb); if (!PyTraceBack_Check(tb)) { PyErr_SetString(PyExc_TypeError, "raise: arg 3 must be a traceback or None"); goto raise_error; } } if (PyType_Check(type)) { #if CYTHON_COMPILING_IN_PYPY if (!value) { Py_INCREF(Py_None); value = Py_None; } #endif PyErr_NormalizeException(&type, &value, &tb); } else { if (value) { PyErr_SetString(PyExc_TypeError, "instance exception may not have a 
separate value"); goto raise_error; } value = type; type = (PyObject*) Py_TYPE(type); Py_INCREF(type); if (!PyType_IsSubtype((PyTypeObject *)type, (PyTypeObject *)PyExc_BaseException)) { PyErr_SetString(PyExc_TypeError, "raise: exception class must be a subclass of BaseException"); goto raise_error; } } __Pyx_PyThreadState_assign __Pyx_ErrRestore(type, value, tb); return; raise_error: Py_XDECREF(value); Py_XDECREF(type); Py_XDECREF(tb); return; } #else static void __Pyx_Raise(PyObject *type, PyObject *value, PyObject *tb, PyObject *cause) { PyObject* owned_instance = NULL; if (tb == Py_None) { tb = 0; } else if (tb && !PyTraceBack_Check(tb)) { PyErr_SetString(PyExc_TypeError, "raise: arg 3 must be a traceback or None"); goto bad; } if (value == Py_None) value = 0; if (PyExceptionInstance_Check(type)) { if (value) { PyErr_SetString(PyExc_TypeError, "instance exception may not have a separate value"); goto bad; } value = type; type = (PyObject*) Py_TYPE(value); } else if (PyExceptionClass_Check(type)) { PyObject *instance_class = NULL; if (value && PyExceptionInstance_Check(value)) { instance_class = (PyObject*) Py_TYPE(value); if (instance_class != type) { int is_subclass = PyObject_IsSubclass(instance_class, type); if (!is_subclass) { instance_class = NULL; } else if (unlikely(is_subclass == -1)) { goto bad; } else { type = instance_class; } } } if (!instance_class) { PyObject *args; if (!value) args = PyTuple_New(0); else if (PyTuple_Check(value)) { Py_INCREF(value); args = value; } else args = PyTuple_Pack(1, value); if (!args) goto bad; owned_instance = PyObject_Call(type, args, NULL); Py_DECREF(args); if (!owned_instance) goto bad; value = owned_instance; if (!PyExceptionInstance_Check(value)) { PyErr_Format(PyExc_TypeError, "calling %R should have returned an instance of " "BaseException, not %R", type, Py_TYPE(value)); goto bad; } } } else { PyErr_SetString(PyExc_TypeError, "raise: exception class must be a subclass of BaseException"); goto bad; } if (cause) { PyObject *fixed_cause; if (cause == Py_None) { fixed_cause = NULL; } else if (PyExceptionClass_Check(cause)) { fixed_cause = PyObject_CallObject(cause, NULL); if (fixed_cause == NULL) goto bad; } else if (PyExceptionInstance_Check(cause)) { fixed_cause = cause; Py_INCREF(fixed_cause); } else { PyErr_SetString(PyExc_TypeError, "exception causes must derive from " "BaseException"); goto bad; } PyException_SetCause(value, fixed_cause); } PyErr_SetObject(type, value); if (tb) { #if CYTHON_COMPILING_IN_PYPY PyObject *tmp_type, *tmp_value, *tmp_tb; PyErr_Fetch(&tmp_type, &tmp_value, &tmp_tb); Py_INCREF(tb); PyErr_Restore(tmp_type, tmp_value, tb); Py_XDECREF(tmp_tb); #else PyThreadState *tstate = __Pyx_PyThreadState_Current; PyObject* tmp_tb = tstate->curexc_traceback; if (tb != tmp_tb) { Py_INCREF(tb); tstate->curexc_traceback = tb; Py_XDECREF(tmp_tb); } #endif } bad: Py_XDECREF(owned_instance); return; } #endif /* GetAttr */ static CYTHON_INLINE PyObject *__Pyx_GetAttr(PyObject *o, PyObject *n) { #if CYTHON_USE_TYPE_SLOTS #if PY_MAJOR_VERSION >= 3 if (likely(PyUnicode_Check(n))) #else if (likely(PyString_Check(n))) #endif return __Pyx_PyObject_GetAttrStr(o, n); #endif return PyObject_GetAttr(o, n); } /* GetAttr3 */ static PyObject *__Pyx_GetAttr3Default(PyObject *d) { __Pyx_PyThreadState_declare __Pyx_PyThreadState_assign if (unlikely(!__Pyx_PyErr_ExceptionMatches(PyExc_AttributeError))) return NULL; __Pyx_PyErr_Clear(); Py_INCREF(d); return d; } static CYTHON_INLINE PyObject *__Pyx_GetAttr3(PyObject *o, PyObject *n, PyObject *d) { 
PyObject *r = __Pyx_GetAttr(o, n); return (likely(r)) ? r : __Pyx_GetAttr3Default(d); } /* PyDictVersioning */ #if CYTHON_USE_DICT_VERSIONS && CYTHON_USE_TYPE_SLOTS static CYTHON_INLINE PY_UINT64_T __Pyx_get_tp_dict_version(PyObject *obj) { PyObject *dict = Py_TYPE(obj)->tp_dict; return likely(dict) ? __PYX_GET_DICT_VERSION(dict) : 0; } static CYTHON_INLINE PY_UINT64_T __Pyx_get_object_dict_version(PyObject *obj) { PyObject **dictptr = NULL; Py_ssize_t offset = Py_TYPE(obj)->tp_dictoffset; if (offset) { #if CYTHON_COMPILING_IN_CPYTHON dictptr = (likely(offset > 0)) ? (PyObject **) ((char *)obj + offset) : _PyObject_GetDictPtr(obj); #else dictptr = _PyObject_GetDictPtr(obj); #endif } return (dictptr && *dictptr) ? __PYX_GET_DICT_VERSION(*dictptr) : 0; } static CYTHON_INLINE int __Pyx_object_dict_version_matches(PyObject* obj, PY_UINT64_T tp_dict_version, PY_UINT64_T obj_dict_version) { PyObject *dict = Py_TYPE(obj)->tp_dict; if (unlikely(!dict) || unlikely(tp_dict_version != __PYX_GET_DICT_VERSION(dict))) return 0; return obj_dict_version == __Pyx_get_object_dict_version(obj); } #endif /* GetModuleGlobalName */ #if CYTHON_USE_DICT_VERSIONS static PyObject *__Pyx__GetModuleGlobalName(PyObject *name, PY_UINT64_T *dict_version, PyObject **dict_cached_value) #else static CYTHON_INLINE PyObject *__Pyx__GetModuleGlobalName(PyObject *name) #endif { PyObject *result; #if !CYTHON_AVOID_BORROWED_REFS #if CYTHON_COMPILING_IN_CPYTHON && PY_VERSION_HEX >= 0x030500A1 result = _PyDict_GetItem_KnownHash(__pyx_d, name, ((PyASCIIObject *) name)->hash); __PYX_UPDATE_DICT_CACHE(__pyx_d, result, *dict_cached_value, *dict_version) if (likely(result)) { return __Pyx_NewRef(result); } else if (unlikely(PyErr_Occurred())) { return NULL; } #else result = PyDict_GetItem(__pyx_d, name); __PYX_UPDATE_DICT_CACHE(__pyx_d, result, *dict_cached_value, *dict_version) if (likely(result)) { return __Pyx_NewRef(result); } #endif #else result = PyObject_GetItem(__pyx_d, name); __PYX_UPDATE_DICT_CACHE(__pyx_d, result, *dict_cached_value, *dict_version) if (likely(result)) { return __Pyx_NewRef(result); } PyErr_Clear(); #endif return __Pyx_GetBuiltinName(name); } /* Import */ static PyObject *__Pyx_Import(PyObject *name, PyObject *from_list, int level) { PyObject *empty_list = 0; PyObject *module = 0; PyObject *global_dict = 0; PyObject *empty_dict = 0; PyObject *list; #if PY_MAJOR_VERSION < 3 PyObject *py_import; py_import = __Pyx_PyObject_GetAttrStr(__pyx_b, __pyx_n_s_import); if (!py_import) goto bad; #endif if (from_list) list = from_list; else { empty_list = PyList_New(0); if (!empty_list) goto bad; list = empty_list; } global_dict = PyModule_GetDict(__pyx_m); if (!global_dict) goto bad; empty_dict = PyDict_New(); if (!empty_dict) goto bad; { #if PY_MAJOR_VERSION >= 3 if (level == -1) { if (strchr(__Pyx_MODULE_NAME, '.')) { module = PyImport_ImportModuleLevelObject( name, global_dict, empty_dict, list, 1); if (!module) { if (!PyErr_ExceptionMatches(PyExc_ImportError)) goto bad; PyErr_Clear(); } } level = 0; } #endif if (!module) { #if PY_MAJOR_VERSION < 3 PyObject *py_level = PyInt_FromLong(level); if (!py_level) goto bad; module = PyObject_CallFunctionObjArgs(py_import, name, global_dict, empty_dict, list, py_level, (PyObject *)NULL); Py_DECREF(py_level); #else module = PyImport_ImportModuleLevelObject( name, global_dict, empty_dict, list, level); #endif } } bad: #if PY_MAJOR_VERSION < 3 Py_XDECREF(py_import); #endif Py_XDECREF(empty_list); Py_XDECREF(empty_dict); return module; } /* ImportFrom */ static PyObject* 
__Pyx_ImportFrom(PyObject* module, PyObject* name) { PyObject* value = __Pyx_PyObject_GetAttrStr(module, name); if (unlikely(!value) && PyErr_ExceptionMatches(PyExc_AttributeError)) { PyErr_Format(PyExc_ImportError, #if PY_MAJOR_VERSION < 3 "cannot import name %.230s", PyString_AS_STRING(name)); #else "cannot import name %S", name); #endif } return value; } /* HasAttr */ static CYTHON_INLINE int __Pyx_HasAttr(PyObject *o, PyObject *n) { PyObject *r; if (unlikely(!__Pyx_PyBaseString_Check(n))) { PyErr_SetString(PyExc_TypeError, "hasattr(): attribute name must be string"); return -1; } r = __Pyx_GetAttr(o, n); if (unlikely(!r)) { PyErr_Clear(); return 0; } else { Py_DECREF(r); return 1; } } /* PyObject_GenericGetAttrNoDict */ #if CYTHON_USE_TYPE_SLOTS && CYTHON_USE_PYTYPE_LOOKUP && PY_VERSION_HEX < 0x03070000 static PyObject *__Pyx_RaiseGenericGetAttributeError(PyTypeObject *tp, PyObject *attr_name) { PyErr_Format(PyExc_AttributeError, #if PY_MAJOR_VERSION >= 3 "'%.50s' object has no attribute '%U'", tp->tp_name, attr_name); #else "'%.50s' object has no attribute '%.400s'", tp->tp_name, PyString_AS_STRING(attr_name)); #endif return NULL; } static CYTHON_INLINE PyObject* __Pyx_PyObject_GenericGetAttrNoDict(PyObject* obj, PyObject* attr_name) { PyObject *descr; PyTypeObject *tp = Py_TYPE(obj); if (unlikely(!PyString_Check(attr_name))) { return PyObject_GenericGetAttr(obj, attr_name); } assert(!tp->tp_dictoffset); descr = _PyType_Lookup(tp, attr_name); if (unlikely(!descr)) { return __Pyx_RaiseGenericGetAttributeError(tp, attr_name); } Py_INCREF(descr); #if PY_MAJOR_VERSION < 3 if (likely(PyType_HasFeature(Py_TYPE(descr), Py_TPFLAGS_HAVE_CLASS))) #endif { descrgetfunc f = Py_TYPE(descr)->tp_descr_get; if (unlikely(f)) { PyObject *res = f(descr, obj, (PyObject *)tp); Py_DECREF(descr); return res; } } return descr; } #endif /* PyObject_GenericGetAttr */ #if CYTHON_USE_TYPE_SLOTS && CYTHON_USE_PYTYPE_LOOKUP && PY_VERSION_HEX < 0x03070000 static PyObject* __Pyx_PyObject_GenericGetAttr(PyObject* obj, PyObject* attr_name) { if (unlikely(Py_TYPE(obj)->tp_dictoffset)) { return PyObject_GenericGetAttr(obj, attr_name); } return __Pyx_PyObject_GenericGetAttrNoDict(obj, attr_name); } #endif /* SetupReduce */ static int __Pyx_setup_reduce_is_named(PyObject* meth, PyObject* name) { int ret; PyObject *name_attr; name_attr = __Pyx_PyObject_GetAttrStr(meth, __pyx_n_s_name); if (likely(name_attr)) { ret = PyObject_RichCompareBool(name_attr, name, Py_EQ); } else { ret = -1; } if (unlikely(ret < 0)) { PyErr_Clear(); ret = 0; } Py_XDECREF(name_attr); return ret; } static int __Pyx_setup_reduce(PyObject* type_obj) { int ret = 0; PyObject *object_reduce = NULL; PyObject *object_reduce_ex = NULL; PyObject *reduce = NULL; PyObject *reduce_ex = NULL; PyObject *reduce_cython = NULL; PyObject *setstate = NULL; PyObject *setstate_cython = NULL; #if CYTHON_USE_PYTYPE_LOOKUP if (_PyType_Lookup((PyTypeObject*)type_obj, __pyx_n_s_getstate)) goto GOOD; #else if (PyObject_HasAttr(type_obj, __pyx_n_s_getstate)) goto GOOD; #endif #if CYTHON_USE_PYTYPE_LOOKUP object_reduce_ex = _PyType_Lookup(&PyBaseObject_Type, __pyx_n_s_reduce_ex); if (!object_reduce_ex) goto BAD; #else object_reduce_ex = __Pyx_PyObject_GetAttrStr((PyObject*)&PyBaseObject_Type, __pyx_n_s_reduce_ex); if (!object_reduce_ex) goto BAD; #endif reduce_ex = __Pyx_PyObject_GetAttrStr(type_obj, __pyx_n_s_reduce_ex); if (unlikely(!reduce_ex)) goto BAD; if (reduce_ex == object_reduce_ex) { #if CYTHON_USE_PYTYPE_LOOKUP object_reduce = _PyType_Lookup(&PyBaseObject_Type, 
__pyx_n_s_reduce); if (!object_reduce) goto BAD; #else object_reduce = __Pyx_PyObject_GetAttrStr((PyObject*)&PyBaseObject_Type, __pyx_n_s_reduce); if (!object_reduce) goto BAD; #endif reduce = __Pyx_PyObject_GetAttrStr(type_obj, __pyx_n_s_reduce); if (unlikely(!reduce)) goto BAD; if (reduce == object_reduce || __Pyx_setup_reduce_is_named(reduce, __pyx_n_s_reduce_cython)) { reduce_cython = __Pyx_PyObject_GetAttrStr(type_obj, __pyx_n_s_reduce_cython); if (unlikely(!reduce_cython)) goto BAD; ret = PyDict_SetItem(((PyTypeObject*)type_obj)->tp_dict, __pyx_n_s_reduce, reduce_cython); if (unlikely(ret < 0)) goto BAD; ret = PyDict_DelItem(((PyTypeObject*)type_obj)->tp_dict, __pyx_n_s_reduce_cython); if (unlikely(ret < 0)) goto BAD; setstate = __Pyx_PyObject_GetAttrStr(type_obj, __pyx_n_s_setstate); if (!setstate) PyErr_Clear(); if (!setstate || __Pyx_setup_reduce_is_named(setstate, __pyx_n_s_setstate_cython)) { setstate_cython = __Pyx_PyObject_GetAttrStr(type_obj, __pyx_n_s_setstate_cython); if (unlikely(!setstate_cython)) goto BAD; ret = PyDict_SetItem(((PyTypeObject*)type_obj)->tp_dict, __pyx_n_s_setstate, setstate_cython); if (unlikely(ret < 0)) goto BAD; ret = PyDict_DelItem(((PyTypeObject*)type_obj)->tp_dict, __pyx_n_s_setstate_cython); if (unlikely(ret < 0)) goto BAD; } PyType_Modified((PyTypeObject*)type_obj); } } goto GOOD; BAD: if (!PyErr_Occurred()) PyErr_Format(PyExc_RuntimeError, "Unable to initialize pickling for %s", ((PyTypeObject*)type_obj)->tp_name); ret = -1; GOOD: #if !CYTHON_USE_PYTYPE_LOOKUP Py_XDECREF(object_reduce); Py_XDECREF(object_reduce_ex); #endif Py_XDECREF(reduce); Py_XDECREF(reduce_ex); Py_XDECREF(reduce_cython); Py_XDECREF(setstate); Py_XDECREF(setstate_cython); return ret; } /* CLineInTraceback */ #ifndef CYTHON_CLINE_IN_TRACEBACK static int __Pyx_CLineForTraceback(PyThreadState *tstate, int c_line) { PyObject *use_cline; PyObject *ptype, *pvalue, *ptraceback; #if CYTHON_COMPILING_IN_CPYTHON PyObject **cython_runtime_dict; #endif if (unlikely(!__pyx_cython_runtime)) { return c_line; } __Pyx_ErrFetchInState(tstate, &ptype, &pvalue, &ptraceback); #if CYTHON_COMPILING_IN_CPYTHON cython_runtime_dict = _PyObject_GetDictPtr(__pyx_cython_runtime); if (likely(cython_runtime_dict)) { __PYX_PY_DICT_LOOKUP_IF_MODIFIED( use_cline, *cython_runtime_dict, __Pyx_PyDict_GetItemStr(*cython_runtime_dict, __pyx_n_s_cline_in_traceback)) } else #endif { PyObject *use_cline_obj = __Pyx_PyObject_GetAttrStr(__pyx_cython_runtime, __pyx_n_s_cline_in_traceback); if (use_cline_obj) { use_cline = PyObject_Not(use_cline_obj) ? 
Py_False : Py_True; Py_DECREF(use_cline_obj); } else { PyErr_Clear(); use_cline = NULL; } } if (!use_cline) { c_line = 0; PyObject_SetAttr(__pyx_cython_runtime, __pyx_n_s_cline_in_traceback, Py_False); } else if (use_cline == Py_False || (use_cline != Py_True && PyObject_Not(use_cline) != 0)) { c_line = 0; } __Pyx_ErrRestoreInState(tstate, ptype, pvalue, ptraceback); return c_line; } #endif /* CodeObjectCache */ static int __pyx_bisect_code_objects(__Pyx_CodeObjectCacheEntry* entries, int count, int code_line) { int start = 0, mid = 0, end = count - 1; if (end >= 0 && code_line > entries[end].code_line) { return count; } while (start < end) { mid = start + (end - start) / 2; if (code_line < entries[mid].code_line) { end = mid; } else if (code_line > entries[mid].code_line) { start = mid + 1; } else { return mid; } } if (code_line <= entries[mid].code_line) { return mid; } else { return mid + 1; } } static PyCodeObject *__pyx_find_code_object(int code_line) { PyCodeObject* code_object; int pos; if (unlikely(!code_line) || unlikely(!__pyx_code_cache.entries)) { return NULL; } pos = __pyx_bisect_code_objects(__pyx_code_cache.entries, __pyx_code_cache.count, code_line); if (unlikely(pos >= __pyx_code_cache.count) || unlikely(__pyx_code_cache.entries[pos].code_line != code_line)) { return NULL; } code_object = __pyx_code_cache.entries[pos].code_object; Py_INCREF(code_object); return code_object; } static void __pyx_insert_code_object(int code_line, PyCodeObject* code_object) { int pos, i; __Pyx_CodeObjectCacheEntry* entries = __pyx_code_cache.entries; if (unlikely(!code_line)) { return; } if (unlikely(!entries)) { entries = (__Pyx_CodeObjectCacheEntry*)PyMem_Malloc(64*sizeof(__Pyx_CodeObjectCacheEntry)); if (likely(entries)) { __pyx_code_cache.entries = entries; __pyx_code_cache.max_count = 64; __pyx_code_cache.count = 1; entries[0].code_line = code_line; entries[0].code_object = code_object; Py_INCREF(code_object); } return; } pos = __pyx_bisect_code_objects(__pyx_code_cache.entries, __pyx_code_cache.count, code_line); if ((pos < __pyx_code_cache.count) && unlikely(__pyx_code_cache.entries[pos].code_line == code_line)) { PyCodeObject* tmp = entries[pos].code_object; entries[pos].code_object = code_object; Py_DECREF(tmp); return; } if (__pyx_code_cache.count == __pyx_code_cache.max_count) { int new_max = __pyx_code_cache.max_count + 64; entries = (__Pyx_CodeObjectCacheEntry*)PyMem_Realloc( __pyx_code_cache.entries, (size_t)new_max*sizeof(__Pyx_CodeObjectCacheEntry)); if (unlikely(!entries)) { return; } __pyx_code_cache.entries = entries; __pyx_code_cache.max_count = new_max; } for (i=__pyx_code_cache.count; i>pos; i--) { entries[i] = entries[i-1]; } entries[pos].code_line = code_line; entries[pos].code_object = code_object; __pyx_code_cache.count++; Py_INCREF(code_object); } /* AddTraceback */ #include "compile.h" #include "frameobject.h" #include "traceback.h" static PyCodeObject* __Pyx_CreateCodeObjectForTraceback( const char *funcname, int c_line, int py_line, const char *filename) { PyCodeObject *py_code = 0; PyObject *py_srcfile = 0; PyObject *py_funcname = 0; #if PY_MAJOR_VERSION < 3 py_srcfile = PyString_FromString(filename); #else py_srcfile = PyUnicode_FromString(filename); #endif if (!py_srcfile) goto bad; if (c_line) { #if PY_MAJOR_VERSION < 3 py_funcname = PyString_FromFormat( "%s (%s:%d)", funcname, __pyx_cfilenm, c_line); #else py_funcname = PyUnicode_FromFormat( "%s (%s:%d)", funcname, __pyx_cfilenm, c_line); #endif } else { #if PY_MAJOR_VERSION < 3 py_funcname = 
PyString_FromString(funcname); #else py_funcname = PyUnicode_FromString(funcname); #endif } if (!py_funcname) goto bad; py_code = __Pyx_PyCode_New( 0, 0, 0, 0, 0, __pyx_empty_bytes, /*PyObject *code,*/ __pyx_empty_tuple, /*PyObject *consts,*/ __pyx_empty_tuple, /*PyObject *names,*/ __pyx_empty_tuple, /*PyObject *varnames,*/ __pyx_empty_tuple, /*PyObject *freevars,*/ __pyx_empty_tuple, /*PyObject *cellvars,*/ py_srcfile, /*PyObject *filename,*/ py_funcname, /*PyObject *name,*/ py_line, __pyx_empty_bytes /*PyObject *lnotab*/ ); Py_DECREF(py_srcfile); Py_DECREF(py_funcname); return py_code; bad: Py_XDECREF(py_srcfile); Py_XDECREF(py_funcname); return NULL; } static void __Pyx_AddTraceback(const char *funcname, int c_line, int py_line, const char *filename) { PyCodeObject *py_code = 0; PyFrameObject *py_frame = 0; PyThreadState *tstate = __Pyx_PyThreadState_Current; if (c_line) { c_line = __Pyx_CLineForTraceback(tstate, c_line); } py_code = __pyx_find_code_object(c_line ? -c_line : py_line); if (!py_code) { py_code = __Pyx_CreateCodeObjectForTraceback( funcname, c_line, py_line, filename); if (!py_code) goto bad; __pyx_insert_code_object(c_line ? -c_line : py_line, py_code); } py_frame = PyFrame_New( tstate, /*PyThreadState *tstate,*/ py_code, /*PyCodeObject *code,*/ __pyx_d, /*PyObject *globals,*/ 0 /*PyObject *locals*/ ); if (!py_frame) goto bad; __Pyx_PyFrame_SetLineNumber(py_frame, py_line); PyTraceBack_Here(py_frame); bad: Py_XDECREF(py_code); Py_XDECREF(py_frame); } /* CIntFromPyVerify */ #define __PYX_VERIFY_RETURN_INT(target_type, func_type, func_value)\ __PYX__VERIFY_RETURN_INT(target_type, func_type, func_value, 0) #define __PYX_VERIFY_RETURN_INT_EXC(target_type, func_type, func_value)\ __PYX__VERIFY_RETURN_INT(target_type, func_type, func_value, 1) #define __PYX__VERIFY_RETURN_INT(target_type, func_type, func_value, exc)\ {\ func_type value = func_value;\ if (sizeof(target_type) < sizeof(func_type)) {\ if (unlikely(value != (func_type) (target_type) value)) {\ func_type zero = 0;\ if (exc && unlikely(value == (func_type)-1 && PyErr_Occurred()))\ return (target_type) -1;\ if (is_unsigned && unlikely(value < zero))\ goto raise_neg_overflow;\ else\ goto raise_overflow;\ }\ }\ return (target_type) value;\ } /* CIntToPy */ static CYTHON_INLINE PyObject* __Pyx_PyInt_From_long(long value) { const long neg_one = (long) ((long) 0 - (long) 1), const_zero = (long) 0; const int is_unsigned = neg_one > const_zero; if (is_unsigned) { if (sizeof(long) < sizeof(long)) { return PyInt_FromLong((long) value); } else if (sizeof(long) <= sizeof(unsigned long)) { return PyLong_FromUnsignedLong((unsigned long) value); #ifdef HAVE_LONG_LONG } else if (sizeof(long) <= sizeof(unsigned PY_LONG_LONG)) { return PyLong_FromUnsignedLongLong((unsigned PY_LONG_LONG) value); #endif } } else { if (sizeof(long) <= sizeof(long)) { return PyInt_FromLong((long) value); #ifdef HAVE_LONG_LONG } else if (sizeof(long) <= sizeof(PY_LONG_LONG)) { return PyLong_FromLongLong((PY_LONG_LONG) value); #endif } } { int one = 1; int little = (int)*(unsigned char *)&one; unsigned char *bytes = (unsigned char *)&value; return _PyLong_FromByteArray(bytes, sizeof(long), little, !is_unsigned); } } /* CIntFromPy */ static CYTHON_INLINE long __Pyx_PyInt_As_long(PyObject *x) { const long neg_one = (long) ((long) 0 - (long) 1), const_zero = (long) 0; const int is_unsigned = neg_one > const_zero; #if PY_MAJOR_VERSION < 3 if (likely(PyInt_Check(x))) { if (sizeof(long) < sizeof(long)) { __PYX_VERIFY_RETURN_INT(long, long, PyInt_AS_LONG(x)) } else 
{ long val = PyInt_AS_LONG(x); if (is_unsigned && unlikely(val < 0)) { goto raise_neg_overflow; } return (long) val; } } else #endif if (likely(PyLong_Check(x))) { if (is_unsigned) { #if CYTHON_USE_PYLONG_INTERNALS const digit* digits = ((PyLongObject*)x)->ob_digit; switch (Py_SIZE(x)) { case 0: return (long) 0; case 1: __PYX_VERIFY_RETURN_INT(long, digit, digits[0]) case 2: if (8 * sizeof(long) > 1 * PyLong_SHIFT) { if (8 * sizeof(unsigned long) > 2 * PyLong_SHIFT) { __PYX_VERIFY_RETURN_INT(long, unsigned long, (((((unsigned long)digits[1]) << PyLong_SHIFT) | (unsigned long)digits[0]))) } else if (8 * sizeof(long) >= 2 * PyLong_SHIFT) { return (long) (((((long)digits[1]) << PyLong_SHIFT) | (long)digits[0])); } } break; case 3: if (8 * sizeof(long) > 2 * PyLong_SHIFT) { if (8 * sizeof(unsigned long) > 3 * PyLong_SHIFT) { __PYX_VERIFY_RETURN_INT(long, unsigned long, (((((((unsigned long)digits[2]) << PyLong_SHIFT) | (unsigned long)digits[1]) << PyLong_SHIFT) | (unsigned long)digits[0]))) } else if (8 * sizeof(long) >= 3 * PyLong_SHIFT) { return (long) (((((((long)digits[2]) << PyLong_SHIFT) | (long)digits[1]) << PyLong_SHIFT) | (long)digits[0])); } } break; case 4: if (8 * sizeof(long) > 3 * PyLong_SHIFT) { if (8 * sizeof(unsigned long) > 4 * PyLong_SHIFT) { __PYX_VERIFY_RETURN_INT(long, unsigned long, (((((((((unsigned long)digits[3]) << PyLong_SHIFT) | (unsigned long)digits[2]) << PyLong_SHIFT) | (unsigned long)digits[1]) << PyLong_SHIFT) | (unsigned long)digits[0]))) } else if (8 * sizeof(long) >= 4 * PyLong_SHIFT) { return (long) (((((((((long)digits[3]) << PyLong_SHIFT) | (long)digits[2]) << PyLong_SHIFT) | (long)digits[1]) << PyLong_SHIFT) | (long)digits[0])); } } break; } #endif #if CYTHON_COMPILING_IN_CPYTHON if (unlikely(Py_SIZE(x) < 0)) { goto raise_neg_overflow; } #else { int result = PyObject_RichCompareBool(x, Py_False, Py_LT); if (unlikely(result < 0)) return (long) -1; if (unlikely(result == 1)) goto raise_neg_overflow; } #endif if (sizeof(long) <= sizeof(unsigned long)) { __PYX_VERIFY_RETURN_INT_EXC(long, unsigned long, PyLong_AsUnsignedLong(x)) #ifdef HAVE_LONG_LONG } else if (sizeof(long) <= sizeof(unsigned PY_LONG_LONG)) { __PYX_VERIFY_RETURN_INT_EXC(long, unsigned PY_LONG_LONG, PyLong_AsUnsignedLongLong(x)) #endif } } else { #if CYTHON_USE_PYLONG_INTERNALS const digit* digits = ((PyLongObject*)x)->ob_digit; switch (Py_SIZE(x)) { case 0: return (long) 0; case -1: __PYX_VERIFY_RETURN_INT(long, sdigit, (sdigit) (-(sdigit)digits[0])) case 1: __PYX_VERIFY_RETURN_INT(long, digit, +digits[0]) case -2: if (8 * sizeof(long) - 1 > 1 * PyLong_SHIFT) { if (8 * sizeof(unsigned long) > 2 * PyLong_SHIFT) { __PYX_VERIFY_RETURN_INT(long, long, -(long) (((((unsigned long)digits[1]) << PyLong_SHIFT) | (unsigned long)digits[0]))) } else if (8 * sizeof(long) - 1 > 2 * PyLong_SHIFT) { return (long) (((long)-1)*(((((long)digits[1]) << PyLong_SHIFT) | (long)digits[0]))); } } break; case 2: if (8 * sizeof(long) > 1 * PyLong_SHIFT) { if (8 * sizeof(unsigned long) > 2 * PyLong_SHIFT) { __PYX_VERIFY_RETURN_INT(long, unsigned long, (((((unsigned long)digits[1]) << PyLong_SHIFT) | (unsigned long)digits[0]))) } else if (8 * sizeof(long) - 1 > 2 * PyLong_SHIFT) { return (long) ((((((long)digits[1]) << PyLong_SHIFT) | (long)digits[0]))); } } break; case -3: if (8 * sizeof(long) - 1 > 2 * PyLong_SHIFT) { if (8 * sizeof(unsigned long) > 3 * PyLong_SHIFT) { __PYX_VERIFY_RETURN_INT(long, long, -(long) (((((((unsigned long)digits[2]) << PyLong_SHIFT) | (unsigned long)digits[1]) << PyLong_SHIFT) | (unsigned 
long)digits[0]))) } else if (8 * sizeof(long) - 1 > 3 * PyLong_SHIFT) { return (long) (((long)-1)*(((((((long)digits[2]) << PyLong_SHIFT) | (long)digits[1]) << PyLong_SHIFT) | (long)digits[0]))); } } break; case 3: if (8 * sizeof(long) > 2 * PyLong_SHIFT) { if (8 * sizeof(unsigned long) > 3 * PyLong_SHIFT) { __PYX_VERIFY_RETURN_INT(long, unsigned long, (((((((unsigned long)digits[2]) << PyLong_SHIFT) | (unsigned long)digits[1]) << PyLong_SHIFT) | (unsigned long)digits[0]))) } else if (8 * sizeof(long) - 1 > 3 * PyLong_SHIFT) { return (long) ((((((((long)digits[2]) << PyLong_SHIFT) | (long)digits[1]) << PyLong_SHIFT) | (long)digits[0]))); } } break; case -4: if (8 * sizeof(long) - 1 > 3 * PyLong_SHIFT) { if (8 * sizeof(unsigned long) > 4 * PyLong_SHIFT) { __PYX_VERIFY_RETURN_INT(long, long, -(long) (((((((((unsigned long)digits[3]) << PyLong_SHIFT) | (unsigned long)digits[2]) << PyLong_SHIFT) | (unsigned long)digits[1]) << PyLong_SHIFT) | (unsigned long)digits[0]))) } else if (8 * sizeof(long) - 1 > 4 * PyLong_SHIFT) { return (long) (((long)-1)*(((((((((long)digits[3]) << PyLong_SHIFT) | (long)digits[2]) << PyLong_SHIFT) | (long)digits[1]) << PyLong_SHIFT) | (long)digits[0]))); } } break; case 4: if (8 * sizeof(long) > 3 * PyLong_SHIFT) { if (8 * sizeof(unsigned long) > 4 * PyLong_SHIFT) { __PYX_VERIFY_RETURN_INT(long, unsigned long, (((((((((unsigned long)digits[3]) << PyLong_SHIFT) | (unsigned long)digits[2]) << PyLong_SHIFT) | (unsigned long)digits[1]) << PyLong_SHIFT) | (unsigned long)digits[0]))) } else if (8 * sizeof(long) - 1 > 4 * PyLong_SHIFT) { return (long) ((((((((((long)digits[3]) << PyLong_SHIFT) | (long)digits[2]) << PyLong_SHIFT) | (long)digits[1]) << PyLong_SHIFT) | (long)digits[0]))); } } break; } #endif if (sizeof(long) <= sizeof(long)) { __PYX_VERIFY_RETURN_INT_EXC(long, long, PyLong_AsLong(x)) #ifdef HAVE_LONG_LONG } else if (sizeof(long) <= sizeof(PY_LONG_LONG)) { __PYX_VERIFY_RETURN_INT_EXC(long, PY_LONG_LONG, PyLong_AsLongLong(x)) #endif } } { #if CYTHON_COMPILING_IN_PYPY && !defined(_PyLong_AsByteArray) PyErr_SetString(PyExc_RuntimeError, "_PyLong_AsByteArray() not available in PyPy, cannot convert large numbers"); #else long val; PyObject *v = __Pyx_PyNumber_IntOrLong(x); #if PY_MAJOR_VERSION < 3 if (likely(v) && !PyLong_Check(v)) { PyObject *tmp = v; v = PyNumber_Long(tmp); Py_DECREF(tmp); } #endif if (likely(v)) { int one = 1; int is_little = (int)*(unsigned char *)&one; unsigned char *bytes = (unsigned char *)&val; int ret = _PyLong_AsByteArray((PyLongObject *)v, bytes, sizeof(val), is_little, !is_unsigned); Py_DECREF(v); if (likely(!ret)) return val; } #endif return (long) -1; } } else { long val; PyObject *tmp = __Pyx_PyNumber_IntOrLong(x); if (!tmp) return (long) -1; val = __Pyx_PyInt_As_long(tmp); Py_DECREF(tmp); return val; } raise_overflow: PyErr_SetString(PyExc_OverflowError, "value too large to convert to long"); return (long) -1; raise_neg_overflow: PyErr_SetString(PyExc_OverflowError, "can't convert negative value to long"); return (long) -1; } /* CIntFromPy */ static CYTHON_INLINE int __Pyx_PyInt_As_int(PyObject *x) { const int neg_one = (int) ((int) 0 - (int) 1), const_zero = (int) 0; const int is_unsigned = neg_one > const_zero; #if PY_MAJOR_VERSION < 3 if (likely(PyInt_Check(x))) { if (sizeof(int) < sizeof(long)) { __PYX_VERIFY_RETURN_INT(int, long, PyInt_AS_LONG(x)) } else { long val = PyInt_AS_LONG(x); if (is_unsigned && unlikely(val < 0)) { goto raise_neg_overflow; } return (int) val; } } else #endif if (likely(PyLong_Check(x))) { if 
(is_unsigned) { #if CYTHON_USE_PYLONG_INTERNALS const digit* digits = ((PyLongObject*)x)->ob_digit; switch (Py_SIZE(x)) { case 0: return (int) 0; case 1: __PYX_VERIFY_RETURN_INT(int, digit, digits[0]) case 2: if (8 * sizeof(int) > 1 * PyLong_SHIFT) { if (8 * sizeof(unsigned long) > 2 * PyLong_SHIFT) { __PYX_VERIFY_RETURN_INT(int, unsigned long, (((((unsigned long)digits[1]) << PyLong_SHIFT) | (unsigned long)digits[0]))) } else if (8 * sizeof(int) >= 2 * PyLong_SHIFT) { return (int) (((((int)digits[1]) << PyLong_SHIFT) | (int)digits[0])); } } break; case 3: if (8 * sizeof(int) > 2 * PyLong_SHIFT) { if (8 * sizeof(unsigned long) > 3 * PyLong_SHIFT) { __PYX_VERIFY_RETURN_INT(int, unsigned long, (((((((unsigned long)digits[2]) << PyLong_SHIFT) | (unsigned long)digits[1]) << PyLong_SHIFT) | (unsigned long)digits[0]))) } else if (8 * sizeof(int) >= 3 * PyLong_SHIFT) { return (int) (((((((int)digits[2]) << PyLong_SHIFT) | (int)digits[1]) << PyLong_SHIFT) | (int)digits[0])); } } break; case 4: if (8 * sizeof(int) > 3 * PyLong_SHIFT) { if (8 * sizeof(unsigned long) > 4 * PyLong_SHIFT) { __PYX_VERIFY_RETURN_INT(int, unsigned long, (((((((((unsigned long)digits[3]) << PyLong_SHIFT) | (unsigned long)digits[2]) << PyLong_SHIFT) | (unsigned long)digits[1]) << PyLong_SHIFT) | (unsigned long)digits[0]))) } else if (8 * sizeof(int) >= 4 * PyLong_SHIFT) { return (int) (((((((((int)digits[3]) << PyLong_SHIFT) | (int)digits[2]) << PyLong_SHIFT) | (int)digits[1]) << PyLong_SHIFT) | (int)digits[0])); } } break; } #endif #if CYTHON_COMPILING_IN_CPYTHON if (unlikely(Py_SIZE(x) < 0)) { goto raise_neg_overflow; } #else { int result = PyObject_RichCompareBool(x, Py_False, Py_LT); if (unlikely(result < 0)) return (int) -1; if (unlikely(result == 1)) goto raise_neg_overflow; } #endif if (sizeof(int) <= sizeof(unsigned long)) { __PYX_VERIFY_RETURN_INT_EXC(int, unsigned long, PyLong_AsUnsignedLong(x)) #ifdef HAVE_LONG_LONG } else if (sizeof(int) <= sizeof(unsigned PY_LONG_LONG)) { __PYX_VERIFY_RETURN_INT_EXC(int, unsigned PY_LONG_LONG, PyLong_AsUnsignedLongLong(x)) #endif } } else { #if CYTHON_USE_PYLONG_INTERNALS const digit* digits = ((PyLongObject*)x)->ob_digit; switch (Py_SIZE(x)) { case 0: return (int) 0; case -1: __PYX_VERIFY_RETURN_INT(int, sdigit, (sdigit) (-(sdigit)digits[0])) case 1: __PYX_VERIFY_RETURN_INT(int, digit, +digits[0]) case -2: if (8 * sizeof(int) - 1 > 1 * PyLong_SHIFT) { if (8 * sizeof(unsigned long) > 2 * PyLong_SHIFT) { __PYX_VERIFY_RETURN_INT(int, long, -(long) (((((unsigned long)digits[1]) << PyLong_SHIFT) | (unsigned long)digits[0]))) } else if (8 * sizeof(int) - 1 > 2 * PyLong_SHIFT) { return (int) (((int)-1)*(((((int)digits[1]) << PyLong_SHIFT) | (int)digits[0]))); } } break; case 2: if (8 * sizeof(int) > 1 * PyLong_SHIFT) { if (8 * sizeof(unsigned long) > 2 * PyLong_SHIFT) { __PYX_VERIFY_RETURN_INT(int, unsigned long, (((((unsigned long)digits[1]) << PyLong_SHIFT) | (unsigned long)digits[0]))) } else if (8 * sizeof(int) - 1 > 2 * PyLong_SHIFT) { return (int) ((((((int)digits[1]) << PyLong_SHIFT) | (int)digits[0]))); } } break; case -3: if (8 * sizeof(int) - 1 > 2 * PyLong_SHIFT) { if (8 * sizeof(unsigned long) > 3 * PyLong_SHIFT) { __PYX_VERIFY_RETURN_INT(int, long, -(long) (((((((unsigned long)digits[2]) << PyLong_SHIFT) | (unsigned long)digits[1]) << PyLong_SHIFT) | (unsigned long)digits[0]))) } else if (8 * sizeof(int) - 1 > 3 * PyLong_SHIFT) { return (int) (((int)-1)*(((((((int)digits[2]) << PyLong_SHIFT) | (int)digits[1]) << PyLong_SHIFT) | (int)digits[0]))); } } break; case 3: if (8 
* sizeof(int) > 2 * PyLong_SHIFT) { if (8 * sizeof(unsigned long) > 3 * PyLong_SHIFT) { __PYX_VERIFY_RETURN_INT(int, unsigned long, (((((((unsigned long)digits[2]) << PyLong_SHIFT) | (unsigned long)digits[1]) << PyLong_SHIFT) | (unsigned long)digits[0]))) } else if (8 * sizeof(int) - 1 > 3 * PyLong_SHIFT) { return (int) ((((((((int)digits[2]) << PyLong_SHIFT) | (int)digits[1]) << PyLong_SHIFT) | (int)digits[0]))); } } break; case -4: if (8 * sizeof(int) - 1 > 3 * PyLong_SHIFT) { if (8 * sizeof(unsigned long) > 4 * PyLong_SHIFT) { __PYX_VERIFY_RETURN_INT(int, long, -(long) (((((((((unsigned long)digits[3]) << PyLong_SHIFT) | (unsigned long)digits[2]) << PyLong_SHIFT) | (unsigned long)digits[1]) << PyLong_SHIFT) | (unsigned long)digits[0]))) } else if (8 * sizeof(int) - 1 > 4 * PyLong_SHIFT) { return (int) (((int)-1)*(((((((((int)digits[3]) << PyLong_SHIFT) | (int)digits[2]) << PyLong_SHIFT) | (int)digits[1]) << PyLong_SHIFT) | (int)digits[0]))); } } break; case 4: if (8 * sizeof(int) > 3 * PyLong_SHIFT) { if (8 * sizeof(unsigned long) > 4 * PyLong_SHIFT) { __PYX_VERIFY_RETURN_INT(int, unsigned long, (((((((((unsigned long)digits[3]) << PyLong_SHIFT) | (unsigned long)digits[2]) << PyLong_SHIFT) | (unsigned long)digits[1]) << PyLong_SHIFT) | (unsigned long)digits[0]))) } else if (8 * sizeof(int) - 1 > 4 * PyLong_SHIFT) { return (int) ((((((((((int)digits[3]) << PyLong_SHIFT) | (int)digits[2]) << PyLong_SHIFT) | (int)digits[1]) << PyLong_SHIFT) | (int)digits[0]))); } } break; } #endif if (sizeof(int) <= sizeof(long)) { __PYX_VERIFY_RETURN_INT_EXC(int, long, PyLong_AsLong(x)) #ifdef HAVE_LONG_LONG } else if (sizeof(int) <= sizeof(PY_LONG_LONG)) { __PYX_VERIFY_RETURN_INT_EXC(int, PY_LONG_LONG, PyLong_AsLongLong(x)) #endif } } { #if CYTHON_COMPILING_IN_PYPY && !defined(_PyLong_AsByteArray) PyErr_SetString(PyExc_RuntimeError, "_PyLong_AsByteArray() not available in PyPy, cannot convert large numbers"); #else int val; PyObject *v = __Pyx_PyNumber_IntOrLong(x); #if PY_MAJOR_VERSION < 3 if (likely(v) && !PyLong_Check(v)) { PyObject *tmp = v; v = PyNumber_Long(tmp); Py_DECREF(tmp); } #endif if (likely(v)) { int one = 1; int is_little = (int)*(unsigned char *)&one; unsigned char *bytes = (unsigned char *)&val; int ret = _PyLong_AsByteArray((PyLongObject *)v, bytes, sizeof(val), is_little, !is_unsigned); Py_DECREF(v); if (likely(!ret)) return val; } #endif return (int) -1; } } else { int val; PyObject *tmp = __Pyx_PyNumber_IntOrLong(x); if (!tmp) return (int) -1; val = __Pyx_PyInt_As_int(tmp); Py_DECREF(tmp); return val; } raise_overflow: PyErr_SetString(PyExc_OverflowError, "value too large to convert to int"); return (int) -1; raise_neg_overflow: PyErr_SetString(PyExc_OverflowError, "can't convert negative value to int"); return (int) -1; } /* FastTypeChecks */ #if CYTHON_COMPILING_IN_CPYTHON static int __Pyx_InBases(PyTypeObject *a, PyTypeObject *b) { while (a) { a = a->tp_base; if (a == b) return 1; } return b == &PyBaseObject_Type; } static CYTHON_INLINE int __Pyx_IsSubtype(PyTypeObject *a, PyTypeObject *b) { PyObject *mro; if (a == b) return 1; mro = a->tp_mro; if (likely(mro)) { Py_ssize_t i, n; n = PyTuple_GET_SIZE(mro); for (i = 0; i < n; i++) { if (PyTuple_GET_ITEM(mro, i) == (PyObject *)b) return 1; } return 0; } return __Pyx_InBases(a, b); } #if PY_MAJOR_VERSION == 2 static int __Pyx_inner_PyErr_GivenExceptionMatches2(PyObject *err, PyObject* exc_type1, PyObject* exc_type2) { PyObject *exception, *value, *tb; int res; __Pyx_PyThreadState_declare __Pyx_PyThreadState_assign 
__Pyx_ErrFetch(&exception, &value, &tb); res = exc_type1 ? PyObject_IsSubclass(err, exc_type1) : 0; if (unlikely(res == -1)) { PyErr_WriteUnraisable(err); res = 0; } if (!res) { res = PyObject_IsSubclass(err, exc_type2); if (unlikely(res == -1)) { PyErr_WriteUnraisable(err); res = 0; } } __Pyx_ErrRestore(exception, value, tb); return res; } #else static CYTHON_INLINE int __Pyx_inner_PyErr_GivenExceptionMatches2(PyObject *err, PyObject* exc_type1, PyObject *exc_type2) { int res = exc_type1 ? __Pyx_IsSubtype((PyTypeObject*)err, (PyTypeObject*)exc_type1) : 0; if (!res) { res = __Pyx_IsSubtype((PyTypeObject*)err, (PyTypeObject*)exc_type2); } return res; } #endif static int __Pyx_PyErr_GivenExceptionMatchesTuple(PyObject *exc_type, PyObject *tuple) { Py_ssize_t i, n; assert(PyExceptionClass_Check(exc_type)); n = PyTuple_GET_SIZE(tuple); #if PY_MAJOR_VERSION >= 3 for (i=0; ip) { #if PY_MAJOR_VERSION < 3 if (t->is_unicode) { *t->p = PyUnicode_DecodeUTF8(t->s, t->n - 1, NULL); } else if (t->intern) { *t->p = PyString_InternFromString(t->s); } else { *t->p = PyString_FromStringAndSize(t->s, t->n - 1); } #else if (t->is_unicode | t->is_str) { if (t->intern) { *t->p = PyUnicode_InternFromString(t->s); } else if (t->encoding) { *t->p = PyUnicode_Decode(t->s, t->n - 1, t->encoding, NULL); } else { *t->p = PyUnicode_FromStringAndSize(t->s, t->n - 1); } } else { *t->p = PyBytes_FromStringAndSize(t->s, t->n - 1); } #endif if (!*t->p) return -1; if (PyObject_Hash(*t->p) == -1) return -1; ++t; } return 0; } static CYTHON_INLINE PyObject* __Pyx_PyUnicode_FromString(const char* c_str) { return __Pyx_PyUnicode_FromStringAndSize(c_str, (Py_ssize_t)strlen(c_str)); } static CYTHON_INLINE const char* __Pyx_PyObject_AsString(PyObject* o) { Py_ssize_t ignore; return __Pyx_PyObject_AsStringAndSize(o, &ignore); } #if __PYX_DEFAULT_STRING_ENCODING_IS_ASCII || __PYX_DEFAULT_STRING_ENCODING_IS_DEFAULT #if !CYTHON_PEP393_ENABLED static const char* __Pyx_PyUnicode_AsStringAndSize(PyObject* o, Py_ssize_t *length) { char* defenc_c; PyObject* defenc = _PyUnicode_AsDefaultEncodedString(o, NULL); if (!defenc) return NULL; defenc_c = PyBytes_AS_STRING(defenc); #if __PYX_DEFAULT_STRING_ENCODING_IS_ASCII { char* end = defenc_c + PyBytes_GET_SIZE(defenc); char* c; for (c = defenc_c; c < end; c++) { if ((unsigned char) (*c) >= 128) { PyUnicode_AsASCIIString(o); return NULL; } } } #endif *length = PyBytes_GET_SIZE(defenc); return defenc_c; } #else static CYTHON_INLINE const char* __Pyx_PyUnicode_AsStringAndSize(PyObject* o, Py_ssize_t *length) { if (unlikely(__Pyx_PyUnicode_READY(o) == -1)) return NULL; #if __PYX_DEFAULT_STRING_ENCODING_IS_ASCII if (likely(PyUnicode_IS_ASCII(o))) { *length = PyUnicode_GET_LENGTH(o); return PyUnicode_AsUTF8(o); } else { PyUnicode_AsASCIIString(o); return NULL; } #else return PyUnicode_AsUTF8AndSize(o, length); #endif } #endif #endif static CYTHON_INLINE const char* __Pyx_PyObject_AsStringAndSize(PyObject* o, Py_ssize_t *length) { #if __PYX_DEFAULT_STRING_ENCODING_IS_ASCII || __PYX_DEFAULT_STRING_ENCODING_IS_DEFAULT if ( #if PY_MAJOR_VERSION < 3 && __PYX_DEFAULT_STRING_ENCODING_IS_ASCII __Pyx_sys_getdefaultencoding_not_ascii && #endif PyUnicode_Check(o)) { return __Pyx_PyUnicode_AsStringAndSize(o, length); } else #endif #if (!CYTHON_COMPILING_IN_PYPY) || (defined(PyByteArray_AS_STRING) && defined(PyByteArray_GET_SIZE)) if (PyByteArray_Check(o)) { *length = PyByteArray_GET_SIZE(o); return PyByteArray_AS_STRING(o); } else #endif { char* result; int r = PyBytes_AsStringAndSize(o, &result, length); if 
(unlikely(r < 0)) { return NULL; } else { return result; } } } static CYTHON_INLINE int __Pyx_PyObject_IsTrue(PyObject* x) { int is_true = x == Py_True; if (is_true | (x == Py_False) | (x == Py_None)) return is_true; else return PyObject_IsTrue(x); } static CYTHON_INLINE int __Pyx_PyObject_IsTrueAndDecref(PyObject* x) { int retval; if (unlikely(!x)) return -1; retval = __Pyx_PyObject_IsTrue(x); Py_DECREF(x); return retval; } static PyObject* __Pyx_PyNumber_IntOrLongWrongResultType(PyObject* result, const char* type_name) { #if PY_MAJOR_VERSION >= 3 if (PyLong_Check(result)) { if (PyErr_WarnFormat(PyExc_DeprecationWarning, 1, "__int__ returned non-int (type %.200s). " "The ability to return an instance of a strict subclass of int " "is deprecated, and may be removed in a future version of Python.", Py_TYPE(result)->tp_name)) { Py_DECREF(result); return NULL; } return result; } #endif PyErr_Format(PyExc_TypeError, "__%.4s__ returned non-%.4s (type %.200s)", type_name, type_name, Py_TYPE(result)->tp_name); Py_DECREF(result); return NULL; } static CYTHON_INLINE PyObject* __Pyx_PyNumber_IntOrLong(PyObject* x) { #if CYTHON_USE_TYPE_SLOTS PyNumberMethods *m; #endif const char *name = NULL; PyObject *res = NULL; #if PY_MAJOR_VERSION < 3 if (likely(PyInt_Check(x) || PyLong_Check(x))) #else if (likely(PyLong_Check(x))) #endif return __Pyx_NewRef(x); #if CYTHON_USE_TYPE_SLOTS m = Py_TYPE(x)->tp_as_number; #if PY_MAJOR_VERSION < 3 if (m && m->nb_int) { name = "int"; res = m->nb_int(x); } else if (m && m->nb_long) { name = "long"; res = m->nb_long(x); } #else if (likely(m && m->nb_int)) { name = "int"; res = m->nb_int(x); } #endif #else if (!PyBytes_CheckExact(x) && !PyUnicode_CheckExact(x)) { res = PyNumber_Int(x); } #endif if (likely(res)) { #if PY_MAJOR_VERSION < 3 if (unlikely(!PyInt_Check(res) && !PyLong_Check(res))) { #else if (unlikely(!PyLong_CheckExact(res))) { #endif return __Pyx_PyNumber_IntOrLongWrongResultType(res, name); } } else if (!PyErr_Occurred()) { PyErr_SetString(PyExc_TypeError, "an integer is required"); } return res; } static CYTHON_INLINE Py_ssize_t __Pyx_PyIndex_AsSsize_t(PyObject* b) { Py_ssize_t ival; PyObject *x; #if PY_MAJOR_VERSION < 3 if (likely(PyInt_CheckExact(b))) { if (sizeof(Py_ssize_t) >= sizeof(long)) return PyInt_AS_LONG(b); else return PyInt_AsSsize_t(b); } #endif if (likely(PyLong_CheckExact(b))) { #if CYTHON_USE_PYLONG_INTERNALS const digit* digits = ((PyLongObject*)b)->ob_digit; const Py_ssize_t size = Py_SIZE(b); if (likely(__Pyx_sst_abs(size) <= 1)) { ival = likely(size) ? 
digits[0] : 0; if (size == -1) ival = -ival; return ival; } else { switch (size) { case 2: if (8 * sizeof(Py_ssize_t) > 2 * PyLong_SHIFT) { return (Py_ssize_t) (((((size_t)digits[1]) << PyLong_SHIFT) | (size_t)digits[0])); } break; case -2: if (8 * sizeof(Py_ssize_t) > 2 * PyLong_SHIFT) { return -(Py_ssize_t) (((((size_t)digits[1]) << PyLong_SHIFT) | (size_t)digits[0])); } break; case 3: if (8 * sizeof(Py_ssize_t) > 3 * PyLong_SHIFT) { return (Py_ssize_t) (((((((size_t)digits[2]) << PyLong_SHIFT) | (size_t)digits[1]) << PyLong_SHIFT) | (size_t)digits[0])); } break; case -3: if (8 * sizeof(Py_ssize_t) > 3 * PyLong_SHIFT) { return -(Py_ssize_t) (((((((size_t)digits[2]) << PyLong_SHIFT) | (size_t)digits[1]) << PyLong_SHIFT) | (size_t)digits[0])); } break; case 4: if (8 * sizeof(Py_ssize_t) > 4 * PyLong_SHIFT) { return (Py_ssize_t) (((((((((size_t)digits[3]) << PyLong_SHIFT) | (size_t)digits[2]) << PyLong_SHIFT) | (size_t)digits[1]) << PyLong_SHIFT) | (size_t)digits[0])); } break; case -4: if (8 * sizeof(Py_ssize_t) > 4 * PyLong_SHIFT) { return -(Py_ssize_t) (((((((((size_t)digits[3]) << PyLong_SHIFT) | (size_t)digits[2]) << PyLong_SHIFT) | (size_t)digits[1]) << PyLong_SHIFT) | (size_t)digits[0])); } break; } } #endif return PyLong_AsSsize_t(b); } x = PyNumber_Index(b); if (!x) return -1; ival = PyInt_AsSsize_t(x); Py_DECREF(x); return ival; } static CYTHON_INLINE PyObject * __Pyx_PyBool_FromLong(long b) { return b ? __Pyx_NewRef(Py_True) : __Pyx_NewRef(Py_False); } static CYTHON_INLINE PyObject * __Pyx_PyInt_FromSize_t(size_t ival) { return PyInt_FromSize_t(ival); } #endif /* Py_PYTHON_H */ aiohttp-3.6.2/aiohttp/_helpers.pyi0000644000175100001650000000031413547410117017457 0ustar vstsdocker00000000000000from typing import Any class reify: def __init__(self, wrapped: Any) -> None: ... def __get__(self, inst: Any, owner: Any) -> Any: ... def __set__(self, inst: Any, value: Any) -> None: ... aiohttp-3.6.2/aiohttp/_helpers.pyx0000644000175100001650000000203113547410117017474 0ustar vstsdocker00000000000000cdef class reify: """Use as a class method decorator. It operates almost exactly like the Python `@property` decorator, but it puts the result of the method it decorates into the instance dict after the first call, effectively replacing the function it decorates with an instance variable. It is, in Python parlance, a data descriptor. """ cdef object wrapped cdef object name def __init__(self, wrapped): self.wrapped = wrapped self.name = wrapped.__name__ @property def __doc__(self): return self.wrapped.__doc__ def __get__(self, inst, owner): try: try: return inst._cache[self.name] except KeyError: val = self.wrapped(inst) inst._cache[self.name] = val return val except AttributeError: if inst is None: return self raise def __set__(self, inst, value): raise AttributeError("reified property is read-only") aiohttp-3.6.2/aiohttp/_http_parser.c0000644000175100001650000363416313547410137020015 0ustar vstsdocker00000000000000/* Generated by Cython 0.29.13 */ #define PY_SSIZE_T_CLEAN #include "Python.h" #ifndef Py_PYTHON_H #error Python headers needed to compile C extensions, please install development version of Python. #elif PY_VERSION_HEX < 0x02060000 || (0x03000000 <= PY_VERSION_HEX && PY_VERSION_HEX < 0x03030000) #error Cython requires Python 2.6+ or Python 3.3+. 
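The ``reify`` descriptor defined in ``aiohttp/_helpers.pyx`` above caches the wrapped method's return value in the instance's ``_cache`` mapping on first access, so later lookups skip the computation; assignment raises ``AttributeError`` because ``__set__`` forbids it. A minimal usage sketch follows, not part of the archive: it assumes the compiled ``aiohttp._helpers.reify`` shown above, and the ``Request`` class, ``_cache`` dict, and ``path`` property here are hypothetical illustration only.

# Illustrative sketch, not part of the aiohttp-3.6.2 source tree.
from aiohttp._helpers import reify  # Cython class defined in _helpers.pyx above

class Request:
    def __init__(self, raw_path):
        # reify.__get__ looks results up in (and stores them into) inst._cache
        self._cache = {}
        self._raw_path = raw_path

    @reify
    def path(self):
        # Runs only on the first attribute access; result is cached afterwards.
        return self._raw_path.split("?")[0]

req = Request("/index?x=1")
assert req.path == "/index"   # first access: method runs, value stored in _cache
assert req.path == "/index"   # second access: served from _cache, no recomputation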
#else #define CYTHON_ABI "0_29_13" #define CYTHON_HEX_VERSION 0x001D0DF0 #define CYTHON_FUTURE_DIVISION 1 #include #ifndef offsetof #define offsetof(type, member) ( (size_t) & ((type*)0) -> member ) #endif #if !defined(WIN32) && !defined(MS_WINDOWS) #ifndef __stdcall #define __stdcall #endif #ifndef __cdecl #define __cdecl #endif #ifndef __fastcall #define __fastcall #endif #endif #ifndef DL_IMPORT #define DL_IMPORT(t) t #endif #ifndef DL_EXPORT #define DL_EXPORT(t) t #endif #define __PYX_COMMA , #ifndef HAVE_LONG_LONG #if PY_VERSION_HEX >= 0x02070000 #define HAVE_LONG_LONG #endif #endif #ifndef PY_LONG_LONG #define PY_LONG_LONG LONG_LONG #endif #ifndef Py_HUGE_VAL #define Py_HUGE_VAL HUGE_VAL #endif #ifdef PYPY_VERSION #define CYTHON_COMPILING_IN_PYPY 1 #define CYTHON_COMPILING_IN_PYSTON 0 #define CYTHON_COMPILING_IN_CPYTHON 0 #undef CYTHON_USE_TYPE_SLOTS #define CYTHON_USE_TYPE_SLOTS 0 #undef CYTHON_USE_PYTYPE_LOOKUP #define CYTHON_USE_PYTYPE_LOOKUP 0 #if PY_VERSION_HEX < 0x03050000 #undef CYTHON_USE_ASYNC_SLOTS #define CYTHON_USE_ASYNC_SLOTS 0 #elif !defined(CYTHON_USE_ASYNC_SLOTS) #define CYTHON_USE_ASYNC_SLOTS 1 #endif #undef CYTHON_USE_PYLIST_INTERNALS #define CYTHON_USE_PYLIST_INTERNALS 0 #undef CYTHON_USE_UNICODE_INTERNALS #define CYTHON_USE_UNICODE_INTERNALS 0 #undef CYTHON_USE_UNICODE_WRITER #define CYTHON_USE_UNICODE_WRITER 0 #undef CYTHON_USE_PYLONG_INTERNALS #define CYTHON_USE_PYLONG_INTERNALS 0 #undef CYTHON_AVOID_BORROWED_REFS #define CYTHON_AVOID_BORROWED_REFS 1 #undef CYTHON_ASSUME_SAFE_MACROS #define CYTHON_ASSUME_SAFE_MACROS 0 #undef CYTHON_UNPACK_METHODS #define CYTHON_UNPACK_METHODS 0 #undef CYTHON_FAST_THREAD_STATE #define CYTHON_FAST_THREAD_STATE 0 #undef CYTHON_FAST_PYCALL #define CYTHON_FAST_PYCALL 0 #undef CYTHON_PEP489_MULTI_PHASE_INIT #define CYTHON_PEP489_MULTI_PHASE_INIT 0 #undef CYTHON_USE_TP_FINALIZE #define CYTHON_USE_TP_FINALIZE 0 #undef CYTHON_USE_DICT_VERSIONS #define CYTHON_USE_DICT_VERSIONS 0 #undef CYTHON_USE_EXC_INFO_STACK #define CYTHON_USE_EXC_INFO_STACK 0 #elif defined(PYSTON_VERSION) #define CYTHON_COMPILING_IN_PYPY 0 #define CYTHON_COMPILING_IN_PYSTON 1 #define CYTHON_COMPILING_IN_CPYTHON 0 #ifndef CYTHON_USE_TYPE_SLOTS #define CYTHON_USE_TYPE_SLOTS 1 #endif #undef CYTHON_USE_PYTYPE_LOOKUP #define CYTHON_USE_PYTYPE_LOOKUP 0 #undef CYTHON_USE_ASYNC_SLOTS #define CYTHON_USE_ASYNC_SLOTS 0 #undef CYTHON_USE_PYLIST_INTERNALS #define CYTHON_USE_PYLIST_INTERNALS 0 #ifndef CYTHON_USE_UNICODE_INTERNALS #define CYTHON_USE_UNICODE_INTERNALS 1 #endif #undef CYTHON_USE_UNICODE_WRITER #define CYTHON_USE_UNICODE_WRITER 0 #undef CYTHON_USE_PYLONG_INTERNALS #define CYTHON_USE_PYLONG_INTERNALS 0 #ifndef CYTHON_AVOID_BORROWED_REFS #define CYTHON_AVOID_BORROWED_REFS 0 #endif #ifndef CYTHON_ASSUME_SAFE_MACROS #define CYTHON_ASSUME_SAFE_MACROS 1 #endif #ifndef CYTHON_UNPACK_METHODS #define CYTHON_UNPACK_METHODS 1 #endif #undef CYTHON_FAST_THREAD_STATE #define CYTHON_FAST_THREAD_STATE 0 #undef CYTHON_FAST_PYCALL #define CYTHON_FAST_PYCALL 0 #undef CYTHON_PEP489_MULTI_PHASE_INIT #define CYTHON_PEP489_MULTI_PHASE_INIT 0 #undef CYTHON_USE_TP_FINALIZE #define CYTHON_USE_TP_FINALIZE 0 #undef CYTHON_USE_DICT_VERSIONS #define CYTHON_USE_DICT_VERSIONS 0 #undef CYTHON_USE_EXC_INFO_STACK #define CYTHON_USE_EXC_INFO_STACK 0 #else #define CYTHON_COMPILING_IN_PYPY 0 #define CYTHON_COMPILING_IN_PYSTON 0 #define CYTHON_COMPILING_IN_CPYTHON 1 #ifndef CYTHON_USE_TYPE_SLOTS #define CYTHON_USE_TYPE_SLOTS 1 #endif #if PY_VERSION_HEX < 0x02070000 #undef CYTHON_USE_PYTYPE_LOOKUP #define 
CYTHON_USE_PYTYPE_LOOKUP 0 #elif !defined(CYTHON_USE_PYTYPE_LOOKUP) #define CYTHON_USE_PYTYPE_LOOKUP 1 #endif #if PY_MAJOR_VERSION < 3 #undef CYTHON_USE_ASYNC_SLOTS #define CYTHON_USE_ASYNC_SLOTS 0 #elif !defined(CYTHON_USE_ASYNC_SLOTS) #define CYTHON_USE_ASYNC_SLOTS 1 #endif #if PY_VERSION_HEX < 0x02070000 #undef CYTHON_USE_PYLONG_INTERNALS #define CYTHON_USE_PYLONG_INTERNALS 0 #elif !defined(CYTHON_USE_PYLONG_INTERNALS) #define CYTHON_USE_PYLONG_INTERNALS 1 #endif #ifndef CYTHON_USE_PYLIST_INTERNALS #define CYTHON_USE_PYLIST_INTERNALS 1 #endif #ifndef CYTHON_USE_UNICODE_INTERNALS #define CYTHON_USE_UNICODE_INTERNALS 1 #endif #if PY_VERSION_HEX < 0x030300F0 #undef CYTHON_USE_UNICODE_WRITER #define CYTHON_USE_UNICODE_WRITER 0 #elif !defined(CYTHON_USE_UNICODE_WRITER) #define CYTHON_USE_UNICODE_WRITER 1 #endif #ifndef CYTHON_AVOID_BORROWED_REFS #define CYTHON_AVOID_BORROWED_REFS 0 #endif #ifndef CYTHON_ASSUME_SAFE_MACROS #define CYTHON_ASSUME_SAFE_MACROS 1 #endif #ifndef CYTHON_UNPACK_METHODS #define CYTHON_UNPACK_METHODS 1 #endif #ifndef CYTHON_FAST_THREAD_STATE #define CYTHON_FAST_THREAD_STATE 1 #endif #ifndef CYTHON_FAST_PYCALL #define CYTHON_FAST_PYCALL 1 #endif #ifndef CYTHON_PEP489_MULTI_PHASE_INIT #define CYTHON_PEP489_MULTI_PHASE_INIT (PY_VERSION_HEX >= 0x03050000) #endif #ifndef CYTHON_USE_TP_FINALIZE #define CYTHON_USE_TP_FINALIZE (PY_VERSION_HEX >= 0x030400a1) #endif #ifndef CYTHON_USE_DICT_VERSIONS #define CYTHON_USE_DICT_VERSIONS (PY_VERSION_HEX >= 0x030600B1) #endif #ifndef CYTHON_USE_EXC_INFO_STACK #define CYTHON_USE_EXC_INFO_STACK (PY_VERSION_HEX >= 0x030700A3) #endif #endif #if !defined(CYTHON_FAST_PYCCALL) #define CYTHON_FAST_PYCCALL (CYTHON_FAST_PYCALL && PY_VERSION_HEX >= 0x030600B1) #endif #if CYTHON_USE_PYLONG_INTERNALS #include "longintrepr.h" #undef SHIFT #undef BASE #undef MASK #ifdef SIZEOF_VOID_P enum { __pyx_check_sizeof_voidp = 1 / (int)(SIZEOF_VOID_P == sizeof(void*)) }; #endif #endif #ifndef __has_attribute #define __has_attribute(x) 0 #endif #ifndef __has_cpp_attribute #define __has_cpp_attribute(x) 0 #endif #ifndef CYTHON_RESTRICT #if defined(__GNUC__) #define CYTHON_RESTRICT __restrict__ #elif defined(_MSC_VER) && _MSC_VER >= 1400 #define CYTHON_RESTRICT __restrict #elif defined (__STDC_VERSION__) && __STDC_VERSION__ >= 199901L #define CYTHON_RESTRICT restrict #else #define CYTHON_RESTRICT #endif #endif #ifndef CYTHON_UNUSED # if defined(__GNUC__) # if !(defined(__cplusplus)) || (__GNUC__ > 3 || (__GNUC__ == 3 && __GNUC_MINOR__ >= 4)) # define CYTHON_UNUSED __attribute__ ((__unused__)) # else # define CYTHON_UNUSED # endif # elif defined(__ICC) || (defined(__INTEL_COMPILER) && !defined(_MSC_VER)) # define CYTHON_UNUSED __attribute__ ((__unused__)) # else # define CYTHON_UNUSED # endif #endif #ifndef CYTHON_MAYBE_UNUSED_VAR # if defined(__cplusplus) template void CYTHON_MAYBE_UNUSED_VAR( const T& ) { } # else # define CYTHON_MAYBE_UNUSED_VAR(x) (void)(x) # endif #endif #ifndef CYTHON_NCP_UNUSED # if CYTHON_COMPILING_IN_CPYTHON # define CYTHON_NCP_UNUSED # else # define CYTHON_NCP_UNUSED CYTHON_UNUSED # endif #endif #define __Pyx_void_to_None(void_result) ((void)(void_result), Py_INCREF(Py_None), Py_None) #ifdef _MSC_VER #ifndef _MSC_STDINT_H_ #if _MSC_VER < 1300 typedef unsigned char uint8_t; typedef unsigned int uint32_t; #else typedef unsigned __int8 uint8_t; typedef unsigned __int32 uint32_t; #endif #endif #else #include #endif #ifndef CYTHON_FALLTHROUGH #if defined(__cplusplus) && __cplusplus >= 201103L #if __has_cpp_attribute(fallthrough) #define 
CYTHON_FALLTHROUGH [[fallthrough]] #elif __has_cpp_attribute(clang::fallthrough) #define CYTHON_FALLTHROUGH [[clang::fallthrough]] #elif __has_cpp_attribute(gnu::fallthrough) #define CYTHON_FALLTHROUGH [[gnu::fallthrough]] #endif #endif #ifndef CYTHON_FALLTHROUGH #if __has_attribute(fallthrough) #define CYTHON_FALLTHROUGH __attribute__((fallthrough)) #else #define CYTHON_FALLTHROUGH #endif #endif #if defined(__clang__ ) && defined(__apple_build_version__) #if __apple_build_version__ < 7000000 #undef CYTHON_FALLTHROUGH #define CYTHON_FALLTHROUGH #endif #endif #endif #ifndef CYTHON_INLINE #if defined(__clang__) #define CYTHON_INLINE __inline__ __attribute__ ((__unused__)) #elif defined(__GNUC__) #define CYTHON_INLINE __inline__ #elif defined(_MSC_VER) #define CYTHON_INLINE __inline #elif defined (__STDC_VERSION__) && __STDC_VERSION__ >= 199901L #define CYTHON_INLINE inline #else #define CYTHON_INLINE #endif #endif #if CYTHON_COMPILING_IN_PYPY && PY_VERSION_HEX < 0x02070600 && !defined(Py_OptimizeFlag) #define Py_OptimizeFlag 0 #endif #define __PYX_BUILD_PY_SSIZE_T "n" #define CYTHON_FORMAT_SSIZE_T "z" #if PY_MAJOR_VERSION < 3 #define __Pyx_BUILTIN_MODULE_NAME "__builtin__" #define __Pyx_PyCode_New(a, k, l, s, f, code, c, n, v, fv, cell, fn, name, fline, lnos)\ PyCode_New(a+k, l, s, f, code, c, n, v, fv, cell, fn, name, fline, lnos) #define __Pyx_DefaultClassType PyClass_Type #else #define __Pyx_BUILTIN_MODULE_NAME "builtins" #if PY_VERSION_HEX >= 0x030800A4 && PY_VERSION_HEX < 0x030800B2 #define __Pyx_PyCode_New(a, k, l, s, f, code, c, n, v, fv, cell, fn, name, fline, lnos)\ PyCode_New(a, 0, k, l, s, f, code, c, n, v, fv, cell, fn, name, fline, lnos) #else #define __Pyx_PyCode_New(a, k, l, s, f, code, c, n, v, fv, cell, fn, name, fline, lnos)\ PyCode_New(a, k, l, s, f, code, c, n, v, fv, cell, fn, name, fline, lnos) #endif #define __Pyx_DefaultClassType PyType_Type #endif #ifndef Py_TPFLAGS_CHECKTYPES #define Py_TPFLAGS_CHECKTYPES 0 #endif #ifndef Py_TPFLAGS_HAVE_INDEX #define Py_TPFLAGS_HAVE_INDEX 0 #endif #ifndef Py_TPFLAGS_HAVE_NEWBUFFER #define Py_TPFLAGS_HAVE_NEWBUFFER 0 #endif #ifndef Py_TPFLAGS_HAVE_FINALIZE #define Py_TPFLAGS_HAVE_FINALIZE 0 #endif #ifndef METH_STACKLESS #define METH_STACKLESS 0 #endif #if PY_VERSION_HEX <= 0x030700A3 || !defined(METH_FASTCALL) #ifndef METH_FASTCALL #define METH_FASTCALL 0x80 #endif typedef PyObject *(*__Pyx_PyCFunctionFast) (PyObject *self, PyObject *const *args, Py_ssize_t nargs); typedef PyObject *(*__Pyx_PyCFunctionFastWithKeywords) (PyObject *self, PyObject *const *args, Py_ssize_t nargs, PyObject *kwnames); #else #define __Pyx_PyCFunctionFast _PyCFunctionFast #define __Pyx_PyCFunctionFastWithKeywords _PyCFunctionFastWithKeywords #endif #if CYTHON_FAST_PYCCALL #define __Pyx_PyFastCFunction_Check(func)\ ((PyCFunction_Check(func) && (METH_FASTCALL == (PyCFunction_GET_FLAGS(func) & ~(METH_CLASS | METH_STATIC | METH_COEXIST | METH_KEYWORDS | METH_STACKLESS))))) #else #define __Pyx_PyFastCFunction_Check(func) 0 #endif #if CYTHON_COMPILING_IN_PYPY && !defined(PyObject_Malloc) #define PyObject_Malloc(s) PyMem_Malloc(s) #define PyObject_Free(p) PyMem_Free(p) #define PyObject_Realloc(p) PyMem_Realloc(p) #endif #if CYTHON_COMPILING_IN_CPYTHON && PY_VERSION_HEX < 0x030400A1 #define PyMem_RawMalloc(n) PyMem_Malloc(n) #define PyMem_RawRealloc(p, n) PyMem_Realloc(p, n) #define PyMem_RawFree(p) PyMem_Free(p) #endif #if CYTHON_COMPILING_IN_PYSTON #define __Pyx_PyCode_HasFreeVars(co) PyCode_HasFreeVars(co) #define __Pyx_PyFrame_SetLineNumber(frame, lineno) 
PyFrame_SetLineNumber(frame, lineno) #else #define __Pyx_PyCode_HasFreeVars(co) (PyCode_GetNumFree(co) > 0) #define __Pyx_PyFrame_SetLineNumber(frame, lineno) (frame)->f_lineno = (lineno) #endif #if !CYTHON_FAST_THREAD_STATE || PY_VERSION_HEX < 0x02070000 #define __Pyx_PyThreadState_Current PyThreadState_GET() #elif PY_VERSION_HEX >= 0x03060000 #define __Pyx_PyThreadState_Current _PyThreadState_UncheckedGet() #elif PY_VERSION_HEX >= 0x03000000 #define __Pyx_PyThreadState_Current PyThreadState_GET() #else #define __Pyx_PyThreadState_Current _PyThreadState_Current #endif #if PY_VERSION_HEX < 0x030700A2 && !defined(PyThread_tss_create) && !defined(Py_tss_NEEDS_INIT) #include "pythread.h" #define Py_tss_NEEDS_INIT 0 typedef int Py_tss_t; static CYTHON_INLINE int PyThread_tss_create(Py_tss_t *key) { *key = PyThread_create_key(); return 0; } static CYTHON_INLINE Py_tss_t * PyThread_tss_alloc(void) { Py_tss_t *key = (Py_tss_t *)PyObject_Malloc(sizeof(Py_tss_t)); *key = Py_tss_NEEDS_INIT; return key; } static CYTHON_INLINE void PyThread_tss_free(Py_tss_t *key) { PyObject_Free(key); } static CYTHON_INLINE int PyThread_tss_is_created(Py_tss_t *key) { return *key != Py_tss_NEEDS_INIT; } static CYTHON_INLINE void PyThread_tss_delete(Py_tss_t *key) { PyThread_delete_key(*key); *key = Py_tss_NEEDS_INIT; } static CYTHON_INLINE int PyThread_tss_set(Py_tss_t *key, void *value) { return PyThread_set_key_value(*key, value); } static CYTHON_INLINE void * PyThread_tss_get(Py_tss_t *key) { return PyThread_get_key_value(*key); } #endif #if CYTHON_COMPILING_IN_CPYTHON || defined(_PyDict_NewPresized) #define __Pyx_PyDict_NewPresized(n) ((n <= 8) ? PyDict_New() : _PyDict_NewPresized(n)) #else #define __Pyx_PyDict_NewPresized(n) PyDict_New() #endif #if PY_MAJOR_VERSION >= 3 || CYTHON_FUTURE_DIVISION #define __Pyx_PyNumber_Divide(x,y) PyNumber_TrueDivide(x,y) #define __Pyx_PyNumber_InPlaceDivide(x,y) PyNumber_InPlaceTrueDivide(x,y) #else #define __Pyx_PyNumber_Divide(x,y) PyNumber_Divide(x,y) #define __Pyx_PyNumber_InPlaceDivide(x,y) PyNumber_InPlaceDivide(x,y) #endif #if CYTHON_COMPILING_IN_CPYTHON && PY_VERSION_HEX >= 0x030500A1 && CYTHON_USE_UNICODE_INTERNALS #define __Pyx_PyDict_GetItemStr(dict, name) _PyDict_GetItem_KnownHash(dict, name, ((PyASCIIObject *) name)->hash) #else #define __Pyx_PyDict_GetItemStr(dict, name) PyDict_GetItem(dict, name) #endif #if PY_VERSION_HEX > 0x03030000 && defined(PyUnicode_KIND) #define CYTHON_PEP393_ENABLED 1 #define __Pyx_PyUnicode_READY(op) (likely(PyUnicode_IS_READY(op)) ?\ 0 : _PyUnicode_Ready((PyObject *)(op))) #define __Pyx_PyUnicode_GET_LENGTH(u) PyUnicode_GET_LENGTH(u) #define __Pyx_PyUnicode_READ_CHAR(u, i) PyUnicode_READ_CHAR(u, i) #define __Pyx_PyUnicode_MAX_CHAR_VALUE(u) PyUnicode_MAX_CHAR_VALUE(u) #define __Pyx_PyUnicode_KIND(u) PyUnicode_KIND(u) #define __Pyx_PyUnicode_DATA(u) PyUnicode_DATA(u) #define __Pyx_PyUnicode_READ(k, d, i) PyUnicode_READ(k, d, i) #define __Pyx_PyUnicode_WRITE(k, d, i, ch) PyUnicode_WRITE(k, d, i, ch) #define __Pyx_PyUnicode_IS_TRUE(u) (0 != (likely(PyUnicode_IS_READY(u)) ? PyUnicode_GET_LENGTH(u) : PyUnicode_GET_SIZE(u))) #else #define CYTHON_PEP393_ENABLED 0 #define PyUnicode_1BYTE_KIND 1 #define PyUnicode_2BYTE_KIND 2 #define PyUnicode_4BYTE_KIND 4 #define __Pyx_PyUnicode_READY(op) (0) #define __Pyx_PyUnicode_GET_LENGTH(u) PyUnicode_GET_SIZE(u) #define __Pyx_PyUnicode_READ_CHAR(u, i) ((Py_UCS4)(PyUnicode_AS_UNICODE(u)[i])) #define __Pyx_PyUnicode_MAX_CHAR_VALUE(u) ((sizeof(Py_UNICODE) == 2) ? 
65535 : 1114111) #define __Pyx_PyUnicode_KIND(u) (sizeof(Py_UNICODE)) #define __Pyx_PyUnicode_DATA(u) ((void*)PyUnicode_AS_UNICODE(u)) #define __Pyx_PyUnicode_READ(k, d, i) ((void)(k), (Py_UCS4)(((Py_UNICODE*)d)[i])) #define __Pyx_PyUnicode_WRITE(k, d, i, ch) (((void)(k)), ((Py_UNICODE*)d)[i] = ch) #define __Pyx_PyUnicode_IS_TRUE(u) (0 != PyUnicode_GET_SIZE(u)) #endif #if CYTHON_COMPILING_IN_PYPY #define __Pyx_PyUnicode_Concat(a, b) PyNumber_Add(a, b) #define __Pyx_PyUnicode_ConcatSafe(a, b) PyNumber_Add(a, b) #else #define __Pyx_PyUnicode_Concat(a, b) PyUnicode_Concat(a, b) #define __Pyx_PyUnicode_ConcatSafe(a, b) ((unlikely((a) == Py_None) || unlikely((b) == Py_None)) ?\ PyNumber_Add(a, b) : __Pyx_PyUnicode_Concat(a, b)) #endif #if CYTHON_COMPILING_IN_PYPY && !defined(PyUnicode_Contains) #define PyUnicode_Contains(u, s) PySequence_Contains(u, s) #endif #if CYTHON_COMPILING_IN_PYPY && !defined(PyByteArray_Check) #define PyByteArray_Check(obj) PyObject_TypeCheck(obj, &PyByteArray_Type) #endif #if CYTHON_COMPILING_IN_PYPY && !defined(PyObject_Format) #define PyObject_Format(obj, fmt) PyObject_CallMethod(obj, "__format__", "O", fmt) #endif #define __Pyx_PyString_FormatSafe(a, b) ((unlikely((a) == Py_None || (PyString_Check(b) && !PyString_CheckExact(b)))) ? PyNumber_Remainder(a, b) : __Pyx_PyString_Format(a, b)) #define __Pyx_PyUnicode_FormatSafe(a, b) ((unlikely((a) == Py_None || (PyUnicode_Check(b) && !PyUnicode_CheckExact(b)))) ? PyNumber_Remainder(a, b) : PyUnicode_Format(a, b)) #if PY_MAJOR_VERSION >= 3 #define __Pyx_PyString_Format(a, b) PyUnicode_Format(a, b) #else #define __Pyx_PyString_Format(a, b) PyString_Format(a, b) #endif #if PY_MAJOR_VERSION < 3 && !defined(PyObject_ASCII) #define PyObject_ASCII(o) PyObject_Repr(o) #endif #if PY_MAJOR_VERSION >= 3 #define PyBaseString_Type PyUnicode_Type #define PyStringObject PyUnicodeObject #define PyString_Type PyUnicode_Type #define PyString_Check PyUnicode_Check #define PyString_CheckExact PyUnicode_CheckExact #define PyObject_Unicode PyObject_Str #endif #if PY_MAJOR_VERSION >= 3 #define __Pyx_PyBaseString_Check(obj) PyUnicode_Check(obj) #define __Pyx_PyBaseString_CheckExact(obj) PyUnicode_CheckExact(obj) #else #define __Pyx_PyBaseString_Check(obj) (PyString_Check(obj) || PyUnicode_Check(obj)) #define __Pyx_PyBaseString_CheckExact(obj) (PyString_CheckExact(obj) || PyUnicode_CheckExact(obj)) #endif #ifndef PySet_CheckExact #define PySet_CheckExact(obj) (Py_TYPE(obj) == &PySet_Type) #endif #if CYTHON_ASSUME_SAFE_MACROS #define __Pyx_PySequence_SIZE(seq) Py_SIZE(seq) #else #define __Pyx_PySequence_SIZE(seq) PySequence_Size(seq) #endif #if PY_MAJOR_VERSION >= 3 #define PyIntObject PyLongObject #define PyInt_Type PyLong_Type #define PyInt_Check(op) PyLong_Check(op) #define PyInt_CheckExact(op) PyLong_CheckExact(op) #define PyInt_FromString PyLong_FromString #define PyInt_FromUnicode PyLong_FromUnicode #define PyInt_FromLong PyLong_FromLong #define PyInt_FromSize_t PyLong_FromSize_t #define PyInt_FromSsize_t PyLong_FromSsize_t #define PyInt_AsLong PyLong_AsLong #define PyInt_AS_LONG PyLong_AS_LONG #define PyInt_AsSsize_t PyLong_AsSsize_t #define PyInt_AsUnsignedLongMask PyLong_AsUnsignedLongMask #define PyInt_AsUnsignedLongLongMask PyLong_AsUnsignedLongLongMask #define PyNumber_Int PyNumber_Long #endif #if PY_MAJOR_VERSION >= 3 #define PyBoolObject PyLongObject #endif #if PY_MAJOR_VERSION >= 3 && CYTHON_COMPILING_IN_PYPY #ifndef PyUnicode_InternFromString #define PyUnicode_InternFromString(s) PyUnicode_FromString(s) #endif #endif #if 
PY_VERSION_HEX < 0x030200A4 typedef long Py_hash_t; #define __Pyx_PyInt_FromHash_t PyInt_FromLong #define __Pyx_PyInt_AsHash_t PyInt_AsLong #else #define __Pyx_PyInt_FromHash_t PyInt_FromSsize_t #define __Pyx_PyInt_AsHash_t PyInt_AsSsize_t #endif #if PY_MAJOR_VERSION >= 3 #define __Pyx_PyMethod_New(func, self, klass) ((self) ? PyMethod_New(func, self) : (Py_INCREF(func), func)) #else #define __Pyx_PyMethod_New(func, self, klass) PyMethod_New(func, self, klass) #endif #if CYTHON_USE_ASYNC_SLOTS #if PY_VERSION_HEX >= 0x030500B1 #define __Pyx_PyAsyncMethodsStruct PyAsyncMethods #define __Pyx_PyType_AsAsync(obj) (Py_TYPE(obj)->tp_as_async) #else #define __Pyx_PyType_AsAsync(obj) ((__Pyx_PyAsyncMethodsStruct*) (Py_TYPE(obj)->tp_reserved)) #endif #else #define __Pyx_PyType_AsAsync(obj) NULL #endif #ifndef __Pyx_PyAsyncMethodsStruct typedef struct { unaryfunc am_await; unaryfunc am_aiter; unaryfunc am_anext; } __Pyx_PyAsyncMethodsStruct; #endif #if defined(WIN32) || defined(MS_WINDOWS) #define _USE_MATH_DEFINES #endif #include #ifdef NAN #define __PYX_NAN() ((float) NAN) #else static CYTHON_INLINE float __PYX_NAN() { float value; memset(&value, 0xFF, sizeof(value)); return value; } #endif #if defined(__CYGWIN__) && defined(_LDBL_EQ_DBL) #define __Pyx_truncl trunc #else #define __Pyx_truncl truncl #endif #define __PYX_ERR(f_index, lineno, Ln_error) \ { \ __pyx_filename = __pyx_f[f_index]; __pyx_lineno = lineno; __pyx_clineno = __LINE__; goto Ln_error; \ } #ifndef __PYX_EXTERN_C #ifdef __cplusplus #define __PYX_EXTERN_C extern "C" #else #define __PYX_EXTERN_C extern #endif #endif #define __PYX_HAVE__aiohttp___http_parser #define __PYX_HAVE_API__aiohttp___http_parser /* Early includes */ #include #include #include "pythread.h" #include #include "../vendor/http-parser/http_parser.h" #include "_find_header.h" #ifdef _OPENMP #include #endif /* _OPENMP */ #if defined(PYREX_WITHOUT_ASSERTIONS) && !defined(CYTHON_WITHOUT_ASSERTIONS) #define CYTHON_WITHOUT_ASSERTIONS #endif typedef struct {PyObject **p; const char *s; const Py_ssize_t n; const char* encoding; const char is_unicode; const char is_str; const char intern; } __Pyx_StringTabEntry; #define __PYX_DEFAULT_STRING_ENCODING_IS_ASCII 0 #define __PYX_DEFAULT_STRING_ENCODING_IS_UTF8 0 #define __PYX_DEFAULT_STRING_ENCODING_IS_DEFAULT (PY_MAJOR_VERSION >= 3 && __PYX_DEFAULT_STRING_ENCODING_IS_UTF8) #define __PYX_DEFAULT_STRING_ENCODING "" #define __Pyx_PyObject_FromString __Pyx_PyBytes_FromString #define __Pyx_PyObject_FromStringAndSize __Pyx_PyBytes_FromStringAndSize #define __Pyx_uchar_cast(c) ((unsigned char)c) #define __Pyx_long_cast(x) ((long)x) #define __Pyx_fits_Py_ssize_t(v, type, is_signed) (\ (sizeof(type) < sizeof(Py_ssize_t)) ||\ (sizeof(type) > sizeof(Py_ssize_t) &&\ likely(v < (type)PY_SSIZE_T_MAX ||\ v == (type)PY_SSIZE_T_MAX) &&\ (!is_signed || likely(v > (type)PY_SSIZE_T_MIN ||\ v == (type)PY_SSIZE_T_MIN))) ||\ (sizeof(type) == sizeof(Py_ssize_t) &&\ (is_signed || likely(v < (type)PY_SSIZE_T_MAX ||\ v == (type)PY_SSIZE_T_MAX))) ) static CYTHON_INLINE int __Pyx_is_valid_index(Py_ssize_t i, Py_ssize_t limit) { return (size_t) i < (size_t) limit; } #if defined (__cplusplus) && __cplusplus >= 201103L #include #define __Pyx_sst_abs(value) std::abs(value) #elif SIZEOF_INT >= SIZEOF_SIZE_T #define __Pyx_sst_abs(value) abs(value) #elif SIZEOF_LONG >= SIZEOF_SIZE_T #define __Pyx_sst_abs(value) labs(value) #elif defined (_MSC_VER) #define __Pyx_sst_abs(value) ((Py_ssize_t)_abs64(value)) #elif defined (__STDC_VERSION__) && __STDC_VERSION__ >= 
199901L #define __Pyx_sst_abs(value) llabs(value) #elif defined (__GNUC__) #define __Pyx_sst_abs(value) __builtin_llabs(value) #else #define __Pyx_sst_abs(value) ((value<0) ? -value : value) #endif static CYTHON_INLINE const char* __Pyx_PyObject_AsString(PyObject*); static CYTHON_INLINE const char* __Pyx_PyObject_AsStringAndSize(PyObject*, Py_ssize_t* length); #define __Pyx_PyByteArray_FromString(s) PyByteArray_FromStringAndSize((const char*)s, strlen((const char*)s)) #define __Pyx_PyByteArray_FromStringAndSize(s, l) PyByteArray_FromStringAndSize((const char*)s, l) #define __Pyx_PyBytes_FromString PyBytes_FromString #define __Pyx_PyBytes_FromStringAndSize PyBytes_FromStringAndSize static CYTHON_INLINE PyObject* __Pyx_PyUnicode_FromString(const char*); #if PY_MAJOR_VERSION < 3 #define __Pyx_PyStr_FromString __Pyx_PyBytes_FromString #define __Pyx_PyStr_FromStringAndSize __Pyx_PyBytes_FromStringAndSize #else #define __Pyx_PyStr_FromString __Pyx_PyUnicode_FromString #define __Pyx_PyStr_FromStringAndSize __Pyx_PyUnicode_FromStringAndSize #endif #define __Pyx_PyBytes_AsWritableString(s) ((char*) PyBytes_AS_STRING(s)) #define __Pyx_PyBytes_AsWritableSString(s) ((signed char*) PyBytes_AS_STRING(s)) #define __Pyx_PyBytes_AsWritableUString(s) ((unsigned char*) PyBytes_AS_STRING(s)) #define __Pyx_PyBytes_AsString(s) ((const char*) PyBytes_AS_STRING(s)) #define __Pyx_PyBytes_AsSString(s) ((const signed char*) PyBytes_AS_STRING(s)) #define __Pyx_PyBytes_AsUString(s) ((const unsigned char*) PyBytes_AS_STRING(s)) #define __Pyx_PyObject_AsWritableString(s) ((char*) __Pyx_PyObject_AsString(s)) #define __Pyx_PyObject_AsWritableSString(s) ((signed char*) __Pyx_PyObject_AsString(s)) #define __Pyx_PyObject_AsWritableUString(s) ((unsigned char*) __Pyx_PyObject_AsString(s)) #define __Pyx_PyObject_AsSString(s) ((const signed char*) __Pyx_PyObject_AsString(s)) #define __Pyx_PyObject_AsUString(s) ((const unsigned char*) __Pyx_PyObject_AsString(s)) #define __Pyx_PyObject_FromCString(s) __Pyx_PyObject_FromString((const char*)s) #define __Pyx_PyBytes_FromCString(s) __Pyx_PyBytes_FromString((const char*)s) #define __Pyx_PyByteArray_FromCString(s) __Pyx_PyByteArray_FromString((const char*)s) #define __Pyx_PyStr_FromCString(s) __Pyx_PyStr_FromString((const char*)s) #define __Pyx_PyUnicode_FromCString(s) __Pyx_PyUnicode_FromString((const char*)s) static CYTHON_INLINE size_t __Pyx_Py_UNICODE_strlen(const Py_UNICODE *u) { const Py_UNICODE *u_end = u; while (*u_end++) ; return (size_t)(u_end - u - 1); } #define __Pyx_PyUnicode_FromUnicode(u) PyUnicode_FromUnicode(u, __Pyx_Py_UNICODE_strlen(u)) #define __Pyx_PyUnicode_FromUnicodeAndLength PyUnicode_FromUnicode #define __Pyx_PyUnicode_AsUnicode PyUnicode_AsUnicode #define __Pyx_NewRef(obj) (Py_INCREF(obj), obj) #define __Pyx_Owned_Py_None(b) __Pyx_NewRef(Py_None) static CYTHON_INLINE PyObject * __Pyx_PyBool_FromLong(long b); static CYTHON_INLINE int __Pyx_PyObject_IsTrue(PyObject*); static CYTHON_INLINE int __Pyx_PyObject_IsTrueAndDecref(PyObject*); static CYTHON_INLINE PyObject* __Pyx_PyNumber_IntOrLong(PyObject* x); #define __Pyx_PySequence_Tuple(obj)\ (likely(PyTuple_CheckExact(obj)) ? __Pyx_NewRef(obj) : PySequence_Tuple(obj)) static CYTHON_INLINE Py_ssize_t __Pyx_PyIndex_AsSsize_t(PyObject*); static CYTHON_INLINE PyObject * __Pyx_PyInt_FromSize_t(size_t); #if CYTHON_ASSUME_SAFE_MACROS #define __pyx_PyFloat_AsDouble(x) (PyFloat_CheckExact(x) ? 
PyFloat_AS_DOUBLE(x) : PyFloat_AsDouble(x)) #else #define __pyx_PyFloat_AsDouble(x) PyFloat_AsDouble(x) #endif #define __pyx_PyFloat_AsFloat(x) ((float) __pyx_PyFloat_AsDouble(x)) #if PY_MAJOR_VERSION >= 3 #define __Pyx_PyNumber_Int(x) (PyLong_CheckExact(x) ? __Pyx_NewRef(x) : PyNumber_Long(x)) #else #define __Pyx_PyNumber_Int(x) (PyInt_CheckExact(x) ? __Pyx_NewRef(x) : PyNumber_Int(x)) #endif #define __Pyx_PyNumber_Float(x) (PyFloat_CheckExact(x) ? __Pyx_NewRef(x) : PyNumber_Float(x)) #if PY_MAJOR_VERSION < 3 && __PYX_DEFAULT_STRING_ENCODING_IS_ASCII static int __Pyx_sys_getdefaultencoding_not_ascii; static int __Pyx_init_sys_getdefaultencoding_params(void) { PyObject* sys; PyObject* default_encoding = NULL; PyObject* ascii_chars_u = NULL; PyObject* ascii_chars_b = NULL; const char* default_encoding_c; sys = PyImport_ImportModule("sys"); if (!sys) goto bad; default_encoding = PyObject_CallMethod(sys, (char*) "getdefaultencoding", NULL); Py_DECREF(sys); if (!default_encoding) goto bad; default_encoding_c = PyBytes_AsString(default_encoding); if (!default_encoding_c) goto bad; if (strcmp(default_encoding_c, "ascii") == 0) { __Pyx_sys_getdefaultencoding_not_ascii = 0; } else { char ascii_chars[128]; int c; for (c = 0; c < 128; c++) { ascii_chars[c] = c; } __Pyx_sys_getdefaultencoding_not_ascii = 1; ascii_chars_u = PyUnicode_DecodeASCII(ascii_chars, 128, NULL); if (!ascii_chars_u) goto bad; ascii_chars_b = PyUnicode_AsEncodedString(ascii_chars_u, default_encoding_c, NULL); if (!ascii_chars_b || !PyBytes_Check(ascii_chars_b) || memcmp(ascii_chars, PyBytes_AS_STRING(ascii_chars_b), 128) != 0) { PyErr_Format( PyExc_ValueError, "This module compiled with c_string_encoding=ascii, but default encoding '%.200s' is not a superset of ascii.", default_encoding_c); goto bad; } Py_DECREF(ascii_chars_u); Py_DECREF(ascii_chars_b); } Py_DECREF(default_encoding); return 0; bad: Py_XDECREF(default_encoding); Py_XDECREF(ascii_chars_u); Py_XDECREF(ascii_chars_b); return -1; } #endif #if __PYX_DEFAULT_STRING_ENCODING_IS_DEFAULT && PY_MAJOR_VERSION >= 3 #define __Pyx_PyUnicode_FromStringAndSize(c_str, size) PyUnicode_DecodeUTF8(c_str, size, NULL) #else #define __Pyx_PyUnicode_FromStringAndSize(c_str, size) PyUnicode_Decode(c_str, size, __PYX_DEFAULT_STRING_ENCODING, NULL) #if __PYX_DEFAULT_STRING_ENCODING_IS_DEFAULT static char* __PYX_DEFAULT_STRING_ENCODING; static int __Pyx_init_sys_getdefaultencoding_params(void) { PyObject* sys; PyObject* default_encoding = NULL; char* default_encoding_c; sys = PyImport_ImportModule("sys"); if (!sys) goto bad; default_encoding = PyObject_CallMethod(sys, (char*) (const char*) "getdefaultencoding", NULL); Py_DECREF(sys); if (!default_encoding) goto bad; default_encoding_c = PyBytes_AsString(default_encoding); if (!default_encoding_c) goto bad; __PYX_DEFAULT_STRING_ENCODING = (char*) malloc(strlen(default_encoding_c) + 1); if (!__PYX_DEFAULT_STRING_ENCODING) goto bad; strcpy(__PYX_DEFAULT_STRING_ENCODING, default_encoding_c); Py_DECREF(default_encoding); return 0; bad: Py_XDECREF(default_encoding); return -1; } #endif #endif /* Test for GCC > 2.95 */ #if defined(__GNUC__) && (__GNUC__ > 2 || (__GNUC__ == 2 && (__GNUC_MINOR__ > 95))) #define likely(x) __builtin_expect(!!(x), 1) #define unlikely(x) __builtin_expect(!!(x), 0) #else /* !__GNUC__ or GCC < 2.95 */ #define likely(x) (x) #define unlikely(x) (x) #endif /* __GNUC__ */ static CYTHON_INLINE void __Pyx_pretend_to_initialize(void* ptr) { (void)ptr; } static PyObject *__pyx_m = NULL; static PyObject *__pyx_d; static PyObject 
*__pyx_b; static PyObject *__pyx_cython_runtime = NULL; static PyObject *__pyx_empty_tuple; static PyObject *__pyx_empty_bytes; static PyObject *__pyx_empty_unicode; static int __pyx_lineno; static int __pyx_clineno = 0; static const char * __pyx_cfilenm= __FILE__; static const char *__pyx_filename; static const char *__pyx_f[] = { "aiohttp/_http_parser.pyx", "stringsource", "type.pxd", "bool.pxd", "complex.pxd", "aiohttp/_headers.pxi", }; /*--- Type declarations ---*/ struct __pyx_obj_7aiohttp_12_http_parser_RawRequestMessage; struct __pyx_obj_7aiohttp_12_http_parser_RawResponseMessage; struct __pyx_obj_7aiohttp_12_http_parser_HttpParser; struct __pyx_obj_7aiohttp_12_http_parser_HttpRequestParser; struct __pyx_obj_7aiohttp_12_http_parser_HttpResponseParser; struct __pyx_obj_7aiohttp_12_http_parser___pyx_scope_struct____repr__; struct __pyx_obj_7aiohttp_12_http_parser___pyx_scope_struct_1_genexpr; struct __pyx_obj_7aiohttp_12_http_parser___pyx_scope_struct_2___repr__; struct __pyx_obj_7aiohttp_12_http_parser___pyx_scope_struct_3_genexpr; struct __pyx_opt_args_7aiohttp_12_http_parser_10HttpParser__init; /* "aiohttp/_http_parser.pyx":308 * PyMem_Free(self._csettings) * * cdef _init(self, cparser.http_parser_type mode, # <<<<<<<<<<<<<< * object protocol, object loop, object timer=None, * size_t max_line_size=8190, size_t max_headers=32768, */ struct __pyx_opt_args_7aiohttp_12_http_parser_10HttpParser__init { int __pyx_n; PyObject *timer; size_t max_line_size; size_t max_headers; size_t max_field_size; PyObject *payload_exception; int response_with_body; int auto_decompress; }; /* "aiohttp/_http_parser.pyx":93 * * @cython.freelist(DEFAULT_FREELIST_SIZE) * cdef class RawRequestMessage: # <<<<<<<<<<<<<< * cdef readonly str method * cdef readonly str path */ struct __pyx_obj_7aiohttp_12_http_parser_RawRequestMessage { PyObject_HEAD PyObject *method; PyObject *path; PyObject *version; PyObject *headers; PyObject *raw_headers; PyObject *should_close; PyObject *compression; PyObject *upgrade; PyObject *chunked; PyObject *url; }; /* "aiohttp/_http_parser.pyx":193 * * @cython.freelist(DEFAULT_FREELIST_SIZE) * cdef class RawResponseMessage: # <<<<<<<<<<<<<< * cdef readonly object version # HttpVersion * cdef readonly int code */ struct __pyx_obj_7aiohttp_12_http_parser_RawResponseMessage { PyObject_HEAD PyObject *version; int code; PyObject *reason; PyObject *headers; PyObject *raw_headers; PyObject *should_close; PyObject *compression; PyObject *upgrade; PyObject *chunked; }; /* "aiohttp/_http_parser.pyx":255 * * @cython.internal * cdef class HttpParser: # <<<<<<<<<<<<<< * * cdef: */ struct __pyx_obj_7aiohttp_12_http_parser_HttpParser { PyObject_HEAD struct __pyx_vtabstruct_7aiohttp_12_http_parser_HttpParser *__pyx_vtab; struct http_parser *_cparser; struct http_parser_settings *_csettings; PyObject *_raw_name; PyObject *_raw_value; int _has_value; PyObject *_protocol; PyObject *_loop; PyObject *_timer; size_t _max_line_size; size_t _max_field_size; size_t _max_headers; int _response_with_body; int _started; PyObject *_url; PyObject *_buf; PyObject *_path; PyObject *_reason; PyObject *_headers; PyObject *_raw_headers; int _upgraded; PyObject *_messages; PyObject *_payload; int _payload_error; PyObject *_payload_exception; PyObject *_last_error; int _auto_decompress; PyObject *_content_encoding; Py_buffer py_buf; }; /* "aiohttp/_http_parser.pyx":537 * * * cdef class HttpRequestParser(HttpParser): # <<<<<<<<<<<<<< * * def __init__(self, protocol, loop, timer=None, */ struct 
__pyx_obj_7aiohttp_12_http_parser_HttpRequestParser { struct __pyx_obj_7aiohttp_12_http_parser_HttpParser __pyx_base; }; /* "aiohttp/_http_parser.pyx":564 * * * cdef class HttpResponseParser(HttpParser): # <<<<<<<<<<<<<< * * def __init__(self, protocol, loop, timer=None, */ struct __pyx_obj_7aiohttp_12_http_parser_HttpResponseParser { struct __pyx_obj_7aiohttp_12_http_parser_HttpParser __pyx_base; }; /* "aiohttp/_http_parser.pyx":118 * self.url = url * * def __repr__(self): # <<<<<<<<<<<<<< * info = [] * info.append(("method", self.method)) */ struct __pyx_obj_7aiohttp_12_http_parser___pyx_scope_struct____repr__ { PyObject_HEAD PyObject *__pyx_v_info; }; /* "aiohttp/_http_parser.pyx":130 * info.append(("chunked", self.chunked)) * info.append(("url", self.url)) * sinfo = ', '.join(name + '=' + repr(val) for name, val in info) # <<<<<<<<<<<<<< * return '' * */ struct __pyx_obj_7aiohttp_12_http_parser___pyx_scope_struct_1_genexpr { PyObject_HEAD struct __pyx_obj_7aiohttp_12_http_parser___pyx_scope_struct____repr__ *__pyx_outer_scope; PyObject *__pyx_v_name; PyObject *__pyx_v_val; }; /* "aiohttp/_http_parser.pyx":216 * self.chunked = chunked * * def __repr__(self): # <<<<<<<<<<<<<< * info = [] * info.append(("version", self.version)) */ struct __pyx_obj_7aiohttp_12_http_parser___pyx_scope_struct_2___repr__ { PyObject_HEAD PyObject *__pyx_v_info; }; /* "aiohttp/_http_parser.pyx":227 * info.append(("upgrade", self.upgrade)) * info.append(("chunked", self.chunked)) * sinfo = ', '.join(name + '=' + repr(val) for name, val in info) # <<<<<<<<<<<<<< * return '' * */ struct __pyx_obj_7aiohttp_12_http_parser___pyx_scope_struct_3_genexpr { PyObject_HEAD struct __pyx_obj_7aiohttp_12_http_parser___pyx_scope_struct_2___repr__ *__pyx_outer_scope; PyObject *__pyx_v_name; PyObject *__pyx_v_val; }; /* "aiohttp/_http_parser.pyx":255 * * @cython.internal * cdef class HttpParser: # <<<<<<<<<<<<<< * * cdef: */ struct __pyx_vtabstruct_7aiohttp_12_http_parser_HttpParser { PyObject *(*_init)(struct __pyx_obj_7aiohttp_12_http_parser_HttpParser *, enum http_parser_type, PyObject *, PyObject *, struct __pyx_opt_args_7aiohttp_12_http_parser_10HttpParser__init *__pyx_optional_args); PyObject *(*_process_header)(struct __pyx_obj_7aiohttp_12_http_parser_HttpParser *); PyObject *(*_on_header_field)(struct __pyx_obj_7aiohttp_12_http_parser_HttpParser *, char *, size_t); PyObject *(*_on_header_value)(struct __pyx_obj_7aiohttp_12_http_parser_HttpParser *, char *, size_t); PyObject *(*_on_headers_complete)(struct __pyx_obj_7aiohttp_12_http_parser_HttpParser *); PyObject *(*_on_message_complete)(struct __pyx_obj_7aiohttp_12_http_parser_HttpParser *); PyObject *(*_on_chunk_header)(struct __pyx_obj_7aiohttp_12_http_parser_HttpParser *); PyObject *(*_on_chunk_complete)(struct __pyx_obj_7aiohttp_12_http_parser_HttpParser *); PyObject *(*_on_status_complete)(struct __pyx_obj_7aiohttp_12_http_parser_HttpParser *); PyObject *(*http_version)(struct __pyx_obj_7aiohttp_12_http_parser_HttpParser *); }; static struct __pyx_vtabstruct_7aiohttp_12_http_parser_HttpParser *__pyx_vtabptr_7aiohttp_12_http_parser_HttpParser; static CYTHON_INLINE PyObject *__pyx_f_7aiohttp_12_http_parser_10HttpParser_http_version(struct __pyx_obj_7aiohttp_12_http_parser_HttpParser *); /* "aiohttp/_http_parser.pyx":537 * * * cdef class HttpRequestParser(HttpParser): # <<<<<<<<<<<<<< * * def __init__(self, protocol, loop, timer=None, */ struct __pyx_vtabstruct_7aiohttp_12_http_parser_HttpRequestParser { struct __pyx_vtabstruct_7aiohttp_12_http_parser_HttpParser 
__pyx_base; }; static struct __pyx_vtabstruct_7aiohttp_12_http_parser_HttpRequestParser *__pyx_vtabptr_7aiohttp_12_http_parser_HttpRequestParser; /* "aiohttp/_http_parser.pyx":564 * * * cdef class HttpResponseParser(HttpParser): # <<<<<<<<<<<<<< * * def __init__(self, protocol, loop, timer=None, */ struct __pyx_vtabstruct_7aiohttp_12_http_parser_HttpResponseParser { struct __pyx_vtabstruct_7aiohttp_12_http_parser_HttpParser __pyx_base; }; static struct __pyx_vtabstruct_7aiohttp_12_http_parser_HttpResponseParser *__pyx_vtabptr_7aiohttp_12_http_parser_HttpResponseParser; /* --- Runtime support code (head) --- */ /* Refnanny.proto */ #ifndef CYTHON_REFNANNY #define CYTHON_REFNANNY 0 #endif #if CYTHON_REFNANNY typedef struct { void (*INCREF)(void*, PyObject*, int); void (*DECREF)(void*, PyObject*, int); void (*GOTREF)(void*, PyObject*, int); void (*GIVEREF)(void*, PyObject*, int); void* (*SetupContext)(const char*, int, const char*); void (*FinishContext)(void**); } __Pyx_RefNannyAPIStruct; static __Pyx_RefNannyAPIStruct *__Pyx_RefNanny = NULL; static __Pyx_RefNannyAPIStruct *__Pyx_RefNannyImportAPI(const char *modname); #define __Pyx_RefNannyDeclarations void *__pyx_refnanny = NULL; #ifdef WITH_THREAD #define __Pyx_RefNannySetupContext(name, acquire_gil)\ if (acquire_gil) {\ PyGILState_STATE __pyx_gilstate_save = PyGILState_Ensure();\ __pyx_refnanny = __Pyx_RefNanny->SetupContext((name), __LINE__, __FILE__);\ PyGILState_Release(__pyx_gilstate_save);\ } else {\ __pyx_refnanny = __Pyx_RefNanny->SetupContext((name), __LINE__, __FILE__);\ } #else #define __Pyx_RefNannySetupContext(name, acquire_gil)\ __pyx_refnanny = __Pyx_RefNanny->SetupContext((name), __LINE__, __FILE__) #endif #define __Pyx_RefNannyFinishContext()\ __Pyx_RefNanny->FinishContext(&__pyx_refnanny) #define __Pyx_INCREF(r) __Pyx_RefNanny->INCREF(__pyx_refnanny, (PyObject *)(r), __LINE__) #define __Pyx_DECREF(r) __Pyx_RefNanny->DECREF(__pyx_refnanny, (PyObject *)(r), __LINE__) #define __Pyx_GOTREF(r) __Pyx_RefNanny->GOTREF(__pyx_refnanny, (PyObject *)(r), __LINE__) #define __Pyx_GIVEREF(r) __Pyx_RefNanny->GIVEREF(__pyx_refnanny, (PyObject *)(r), __LINE__) #define __Pyx_XINCREF(r) do { if((r) != NULL) {__Pyx_INCREF(r); }} while(0) #define __Pyx_XDECREF(r) do { if((r) != NULL) {__Pyx_DECREF(r); }} while(0) #define __Pyx_XGOTREF(r) do { if((r) != NULL) {__Pyx_GOTREF(r); }} while(0) #define __Pyx_XGIVEREF(r) do { if((r) != NULL) {__Pyx_GIVEREF(r);}} while(0) #else #define __Pyx_RefNannyDeclarations #define __Pyx_RefNannySetupContext(name, acquire_gil) #define __Pyx_RefNannyFinishContext() #define __Pyx_INCREF(r) Py_INCREF(r) #define __Pyx_DECREF(r) Py_DECREF(r) #define __Pyx_GOTREF(r) #define __Pyx_GIVEREF(r) #define __Pyx_XINCREF(r) Py_XINCREF(r) #define __Pyx_XDECREF(r) Py_XDECREF(r) #define __Pyx_XGOTREF(r) #define __Pyx_XGIVEREF(r) #endif #define __Pyx_XDECREF_SET(r, v) do {\ PyObject *tmp = (PyObject *) r;\ r = v; __Pyx_XDECREF(tmp);\ } while (0) #define __Pyx_DECREF_SET(r, v) do {\ PyObject *tmp = (PyObject *) r;\ r = v; __Pyx_DECREF(tmp);\ } while (0) #define __Pyx_CLEAR(r) do { PyObject* tmp = ((PyObject*)(r)); r = NULL; __Pyx_DECREF(tmp);} while(0) #define __Pyx_XCLEAR(r) do { if((r) != NULL) {PyObject* tmp = ((PyObject*)(r)); r = NULL; __Pyx_DECREF(tmp);}} while(0) /* PyObjectGetAttrStr.proto */ #if CYTHON_USE_TYPE_SLOTS static CYTHON_INLINE PyObject* __Pyx_PyObject_GetAttrStr(PyObject* obj, PyObject* attr_name); #else #define __Pyx_PyObject_GetAttrStr(o,n) PyObject_GetAttr(o,n) #endif /* GetBuiltinName.proto */ static 
PyObject *__Pyx_GetBuiltinName(PyObject *name); /* GetItemInt.proto */ #define __Pyx_GetItemInt(o, i, type, is_signed, to_py_func, is_list, wraparound, boundscheck)\ (__Pyx_fits_Py_ssize_t(i, type, is_signed) ?\ __Pyx_GetItemInt_Fast(o, (Py_ssize_t)i, is_list, wraparound, boundscheck) :\ (is_list ? (PyErr_SetString(PyExc_IndexError, "list index out of range"), (PyObject*)NULL) :\ __Pyx_GetItemInt_Generic(o, to_py_func(i)))) #define __Pyx_GetItemInt_List(o, i, type, is_signed, to_py_func, is_list, wraparound, boundscheck)\ (__Pyx_fits_Py_ssize_t(i, type, is_signed) ?\ __Pyx_GetItemInt_List_Fast(o, (Py_ssize_t)i, wraparound, boundscheck) :\ (PyErr_SetString(PyExc_IndexError, "list index out of range"), (PyObject*)NULL)) static CYTHON_INLINE PyObject *__Pyx_GetItemInt_List_Fast(PyObject *o, Py_ssize_t i, int wraparound, int boundscheck); #define __Pyx_GetItemInt_Tuple(o, i, type, is_signed, to_py_func, is_list, wraparound, boundscheck)\ (__Pyx_fits_Py_ssize_t(i, type, is_signed) ?\ __Pyx_GetItemInt_Tuple_Fast(o, (Py_ssize_t)i, wraparound, boundscheck) :\ (PyErr_SetString(PyExc_IndexError, "tuple index out of range"), (PyObject*)NULL)) static CYTHON_INLINE PyObject *__Pyx_GetItemInt_Tuple_Fast(PyObject *o, Py_ssize_t i, int wraparound, int boundscheck); static PyObject *__Pyx_GetItemInt_Generic(PyObject *o, PyObject* j); static CYTHON_INLINE PyObject *__Pyx_GetItemInt_Fast(PyObject *o, Py_ssize_t i, int is_list, int wraparound, int boundscheck); /* decode_c_string_utf16.proto */ static CYTHON_INLINE PyObject *__Pyx_PyUnicode_DecodeUTF16(const char *s, Py_ssize_t size, const char *errors) { int byteorder = 0; return PyUnicode_DecodeUTF16(s, size, errors, &byteorder); } static CYTHON_INLINE PyObject *__Pyx_PyUnicode_DecodeUTF16LE(const char *s, Py_ssize_t size, const char *errors) { int byteorder = -1; return PyUnicode_DecodeUTF16(s, size, errors, &byteorder); } static CYTHON_INLINE PyObject *__Pyx_PyUnicode_DecodeUTF16BE(const char *s, Py_ssize_t size, const char *errors) { int byteorder = 1; return PyUnicode_DecodeUTF16(s, size, errors, &byteorder); } /* decode_c_bytes.proto */ static CYTHON_INLINE PyObject* __Pyx_decode_c_bytes( const char* cstring, Py_ssize_t length, Py_ssize_t start, Py_ssize_t stop, const char* encoding, const char* errors, PyObject* (*decode_func)(const char *s, Py_ssize_t size, const char *errors)); /* decode_bytes.proto */ static CYTHON_INLINE PyObject* __Pyx_decode_bytes( PyObject* string, Py_ssize_t start, Py_ssize_t stop, const char* encoding, const char* errors, PyObject* (*decode_func)(const char *s, Py_ssize_t size, const char *errors)) { return __Pyx_decode_c_bytes( PyBytes_AS_STRING(string), PyBytes_GET_SIZE(string), start, stop, encoding, errors, decode_func); } /* RaiseArgTupleInvalid.proto */ static void __Pyx_RaiseArgtupleInvalid(const char* func_name, int exact, Py_ssize_t num_min, Py_ssize_t num_max, Py_ssize_t num_found); /* RaiseDoubleKeywords.proto */ static void __Pyx_RaiseDoubleKeywordsError(const char* func_name, PyObject* kw_name); /* ParseKeywords.proto */ static int __Pyx_ParseOptionalKeywords(PyObject *kwds, PyObject **argnames[],\ PyObject *kwds2, PyObject *values[], Py_ssize_t num_pos_args,\ const char* function_name); /* None.proto */ static CYTHON_INLINE void __Pyx_RaiseClosureNameError(const char *varname); /* RaiseTooManyValuesToUnpack.proto */ static CYTHON_INLINE void __Pyx_RaiseTooManyValuesError(Py_ssize_t expected); /* RaiseNeedMoreValuesToUnpack.proto */ static CYTHON_INLINE void __Pyx_RaiseNeedMoreValuesError(Py_ssize_t index); /* 
IterFinish.proto */ static CYTHON_INLINE int __Pyx_IterFinish(void); /* UnpackItemEndCheck.proto */ static int __Pyx_IternextUnpackEndCheck(PyObject *retval, Py_ssize_t expected); /* ListCompAppend.proto */ #if CYTHON_USE_PYLIST_INTERNALS && CYTHON_ASSUME_SAFE_MACROS static CYTHON_INLINE int __Pyx_ListComp_Append(PyObject* list, PyObject* x) { PyListObject* L = (PyListObject*) list; Py_ssize_t len = Py_SIZE(list); if (likely(L->allocated > len)) { Py_INCREF(x); PyList_SET_ITEM(list, len, x); Py_SIZE(list) = len+1; return 0; } return PyList_Append(list, x); } #else #define __Pyx_ListComp_Append(L,x) PyList_Append(L,x) #endif /* ListAppend.proto */ #if CYTHON_USE_PYLIST_INTERNALS && CYTHON_ASSUME_SAFE_MACROS static CYTHON_INLINE int __Pyx_PyList_Append(PyObject* list, PyObject* x) { PyListObject* L = (PyListObject*) list; Py_ssize_t len = Py_SIZE(list); if (likely(L->allocated > len) & likely(len > (L->allocated >> 1))) { Py_INCREF(x); PyList_SET_ITEM(list, len, x); Py_SIZE(list) = len+1; return 0; } return PyList_Append(list, x); } #else #define __Pyx_PyList_Append(L,x) PyList_Append(L,x) #endif /* KeywordStringCheck.proto */ static int __Pyx_CheckKeywordStrings(PyObject *kwdict, const char* function_name, int kw_allowed); /* ExtTypeTest.proto */ static CYTHON_INLINE int __Pyx_TypeTest(PyObject *obj, PyTypeObject *type); /* PyDictContains.proto */ static CYTHON_INLINE int __Pyx_PyDict_ContainsTF(PyObject* item, PyObject* dict, int eq) { int result = PyDict_Contains(dict, item); return unlikely(result < 0) ? result : (result == (eq == Py_EQ)); } /* DictGetItem.proto */ #if PY_MAJOR_VERSION >= 3 && !CYTHON_COMPILING_IN_PYPY static PyObject *__Pyx_PyDict_GetItem(PyObject *d, PyObject* key); #define __Pyx_PyObject_Dict_GetItem(obj, name)\ (likely(PyDict_CheckExact(obj)) ?\ __Pyx_PyDict_GetItem(obj, name) : PyObject_GetItem(obj, name)) #else #define __Pyx_PyDict_GetItem(d, key) PyObject_GetItem(d, key) #define __Pyx_PyObject_Dict_GetItem(obj, name) PyObject_GetItem(obj, name) #endif /* PyErrExceptionMatches.proto */ #if CYTHON_FAST_THREAD_STATE #define __Pyx_PyErr_ExceptionMatches(err) __Pyx_PyErr_ExceptionMatchesInState(__pyx_tstate, err) static CYTHON_INLINE int __Pyx_PyErr_ExceptionMatchesInState(PyThreadState* tstate, PyObject* err); #else #define __Pyx_PyErr_ExceptionMatches(err) PyErr_ExceptionMatches(err) #endif /* PyThreadStateGet.proto */ #if CYTHON_FAST_THREAD_STATE #define __Pyx_PyThreadState_declare PyThreadState *__pyx_tstate; #define __Pyx_PyThreadState_assign __pyx_tstate = __Pyx_PyThreadState_Current; #define __Pyx_PyErr_Occurred() __pyx_tstate->curexc_type #else #define __Pyx_PyThreadState_declare #define __Pyx_PyThreadState_assign #define __Pyx_PyErr_Occurred() PyErr_Occurred() #endif /* PyErrFetchRestore.proto */ #if CYTHON_FAST_THREAD_STATE #define __Pyx_PyErr_Clear() __Pyx_ErrRestore(NULL, NULL, NULL) #define __Pyx_ErrRestoreWithState(type, value, tb) __Pyx_ErrRestoreInState(PyThreadState_GET(), type, value, tb) #define __Pyx_ErrFetchWithState(type, value, tb) __Pyx_ErrFetchInState(PyThreadState_GET(), type, value, tb) #define __Pyx_ErrRestore(type, value, tb) __Pyx_ErrRestoreInState(__pyx_tstate, type, value, tb) #define __Pyx_ErrFetch(type, value, tb) __Pyx_ErrFetchInState(__pyx_tstate, type, value, tb) static CYTHON_INLINE void __Pyx_ErrRestoreInState(PyThreadState *tstate, PyObject *type, PyObject *value, PyObject *tb); static CYTHON_INLINE void __Pyx_ErrFetchInState(PyThreadState *tstate, PyObject **type, PyObject **value, PyObject **tb); #if CYTHON_COMPILING_IN_CPYTHON 
#define __Pyx_PyErr_SetNone(exc) (Py_INCREF(exc), __Pyx_ErrRestore((exc), NULL, NULL)) #else #define __Pyx_PyErr_SetNone(exc) PyErr_SetNone(exc) #endif #else #define __Pyx_PyErr_Clear() PyErr_Clear() #define __Pyx_PyErr_SetNone(exc) PyErr_SetNone(exc) #define __Pyx_ErrRestoreWithState(type, value, tb) PyErr_Restore(type, value, tb) #define __Pyx_ErrFetchWithState(type, value, tb) PyErr_Fetch(type, value, tb) #define __Pyx_ErrRestoreInState(tstate, type, value, tb) PyErr_Restore(type, value, tb) #define __Pyx_ErrFetchInState(tstate, type, value, tb) PyErr_Fetch(type, value, tb) #define __Pyx_ErrRestore(type, value, tb) PyErr_Restore(type, value, tb) #define __Pyx_ErrFetch(type, value, tb) PyErr_Fetch(type, value, tb) #endif /* GetAttr.proto */ static CYTHON_INLINE PyObject *__Pyx_GetAttr(PyObject *, PyObject *); /* GetAttr3.proto */ static CYTHON_INLINE PyObject *__Pyx_GetAttr3(PyObject *, PyObject *, PyObject *); /* PyDictVersioning.proto */ #if CYTHON_USE_DICT_VERSIONS && CYTHON_USE_TYPE_SLOTS #define __PYX_DICT_VERSION_INIT ((PY_UINT64_T) -1) #define __PYX_GET_DICT_VERSION(dict) (((PyDictObject*)(dict))->ma_version_tag) #define __PYX_UPDATE_DICT_CACHE(dict, value, cache_var, version_var)\ (version_var) = __PYX_GET_DICT_VERSION(dict);\ (cache_var) = (value); #define __PYX_PY_DICT_LOOKUP_IF_MODIFIED(VAR, DICT, LOOKUP) {\ static PY_UINT64_T __pyx_dict_version = 0;\ static PyObject *__pyx_dict_cached_value = NULL;\ if (likely(__PYX_GET_DICT_VERSION(DICT) == __pyx_dict_version)) {\ (VAR) = __pyx_dict_cached_value;\ } else {\ (VAR) = __pyx_dict_cached_value = (LOOKUP);\ __pyx_dict_version = __PYX_GET_DICT_VERSION(DICT);\ }\ } static CYTHON_INLINE PY_UINT64_T __Pyx_get_tp_dict_version(PyObject *obj); static CYTHON_INLINE PY_UINT64_T __Pyx_get_object_dict_version(PyObject *obj); static CYTHON_INLINE int __Pyx_object_dict_version_matches(PyObject* obj, PY_UINT64_T tp_dict_version, PY_UINT64_T obj_dict_version); #else #define __PYX_GET_DICT_VERSION(dict) (0) #define __PYX_UPDATE_DICT_CACHE(dict, value, cache_var, version_var) #define __PYX_PY_DICT_LOOKUP_IF_MODIFIED(VAR, DICT, LOOKUP) (VAR) = (LOOKUP); #endif /* GetModuleGlobalName.proto */ #if CYTHON_USE_DICT_VERSIONS #define __Pyx_GetModuleGlobalName(var, name) {\ static PY_UINT64_T __pyx_dict_version = 0;\ static PyObject *__pyx_dict_cached_value = NULL;\ (var) = (likely(__pyx_dict_version == __PYX_GET_DICT_VERSION(__pyx_d))) ?\ (likely(__pyx_dict_cached_value) ? 
__Pyx_NewRef(__pyx_dict_cached_value) : __Pyx_GetBuiltinName(name)) :\ __Pyx__GetModuleGlobalName(name, &__pyx_dict_version, &__pyx_dict_cached_value);\ } #define __Pyx_GetModuleGlobalNameUncached(var, name) {\ PY_UINT64_T __pyx_dict_version;\ PyObject *__pyx_dict_cached_value;\ (var) = __Pyx__GetModuleGlobalName(name, &__pyx_dict_version, &__pyx_dict_cached_value);\ } static PyObject *__Pyx__GetModuleGlobalName(PyObject *name, PY_UINT64_T *dict_version, PyObject **dict_cached_value); #else #define __Pyx_GetModuleGlobalName(var, name) (var) = __Pyx__GetModuleGlobalName(name) #define __Pyx_GetModuleGlobalNameUncached(var, name) (var) = __Pyx__GetModuleGlobalName(name) static CYTHON_INLINE PyObject *__Pyx__GetModuleGlobalName(PyObject *name); #endif /* PyFunctionFastCall.proto */ #if CYTHON_FAST_PYCALL #define __Pyx_PyFunction_FastCall(func, args, nargs)\ __Pyx_PyFunction_FastCallDict((func), (args), (nargs), NULL) #if 1 || PY_VERSION_HEX < 0x030600B1 static PyObject *__Pyx_PyFunction_FastCallDict(PyObject *func, PyObject **args, Py_ssize_t nargs, PyObject *kwargs); #else #define __Pyx_PyFunction_FastCallDict(func, args, nargs, kwargs) _PyFunction_FastCallDict(func, args, nargs, kwargs) #endif #define __Pyx_BUILD_ASSERT_EXPR(cond)\ (sizeof(char [1 - 2*!(cond)]) - 1) #ifndef Py_MEMBER_SIZE #define Py_MEMBER_SIZE(type, member) sizeof(((type *)0)->member) #endif static size_t __pyx_pyframe_localsplus_offset = 0; #include "frameobject.h" #define __Pxy_PyFrame_Initialize_Offsets()\ ((void)__Pyx_BUILD_ASSERT_EXPR(sizeof(PyFrameObject) == offsetof(PyFrameObject, f_localsplus) + Py_MEMBER_SIZE(PyFrameObject, f_localsplus)),\ (void)(__pyx_pyframe_localsplus_offset = ((size_t)PyFrame_Type.tp_basicsize) - Py_MEMBER_SIZE(PyFrameObject, f_localsplus))) #define __Pyx_PyFrame_GetLocalsplus(frame)\ (assert(__pyx_pyframe_localsplus_offset), (PyObject **)(((char *)(frame)) + __pyx_pyframe_localsplus_offset)) #endif /* PyObjectCall.proto */ #if CYTHON_COMPILING_IN_CPYTHON static CYTHON_INLINE PyObject* __Pyx_PyObject_Call(PyObject *func, PyObject *arg, PyObject *kw); #else #define __Pyx_PyObject_Call(func, arg, kw) PyObject_Call(func, arg, kw) #endif /* PyObjectCallMethO.proto */ #if CYTHON_COMPILING_IN_CPYTHON static CYTHON_INLINE PyObject* __Pyx_PyObject_CallMethO(PyObject *func, PyObject *arg); #endif /* PyObjectCallNoArg.proto */ #if CYTHON_COMPILING_IN_CPYTHON static CYTHON_INLINE PyObject* __Pyx_PyObject_CallNoArg(PyObject *func); #else #define __Pyx_PyObject_CallNoArg(func) __Pyx_PyObject_Call(func, __pyx_empty_tuple, NULL) #endif /* PyCFunctionFastCall.proto */ #if CYTHON_FAST_PYCCALL static CYTHON_INLINE PyObject *__Pyx_PyCFunction_FastCall(PyObject *func, PyObject **args, Py_ssize_t nargs); #else #define __Pyx_PyCFunction_FastCall(func, args, nargs) (assert(0), NULL) #endif /* PyObjectCallOneArg.proto */ static CYTHON_INLINE PyObject* __Pyx_PyObject_CallOneArg(PyObject *func, PyObject *arg); /* PyObjectCall2Args.proto */ static CYTHON_UNUSED PyObject* __Pyx_PyObject_Call2Args(PyObject* function, PyObject* arg1, PyObject* arg2); /* PySequenceContains.proto */ static CYTHON_INLINE int __Pyx_PySequence_ContainsTF(PyObject* item, PyObject* seq, int eq) { int result = PySequence_Contains(seq, item); return unlikely(result < 0) ? 
result : (result == (eq == Py_EQ)); } /* RaiseException.proto */ static void __Pyx_Raise(PyObject *type, PyObject *value, PyObject *tb, PyObject *cause); /* IncludeStringH.proto */ #include <string.h> /* BytesEquals.proto */ static CYTHON_INLINE int __Pyx_PyBytes_Equals(PyObject* s1, PyObject* s2, int equals); /* UnicodeEquals.proto */ static CYTHON_INLINE int __Pyx_PyUnicode_Equals(PyObject* s1, PyObject* s2, int equals); /* SliceObject.proto */ static CYTHON_INLINE PyObject* __Pyx_PyObject_GetSlice( PyObject* obj, Py_ssize_t cstart, Py_ssize_t cstop, PyObject** py_start, PyObject** py_stop, PyObject** py_slice, int has_cstart, int has_cstop, int wraparound); /* decode_bytearray.proto */ static CYTHON_INLINE PyObject* __Pyx_decode_bytearray( PyObject* string, Py_ssize_t start, Py_ssize_t stop, const char* encoding, const char* errors, PyObject* (*decode_func)(const char *s, Py_ssize_t size, const char *errors)) { return __Pyx_decode_c_bytes( PyByteArray_AS_STRING(string), PyByteArray_GET_SIZE(string), start, stop, encoding, errors, decode_func); } /* GetException.proto */ #if CYTHON_FAST_THREAD_STATE #define __Pyx_GetException(type, value, tb) __Pyx__GetException(__pyx_tstate, type, value, tb) static int __Pyx__GetException(PyThreadState *tstate, PyObject **type, PyObject **value, PyObject **tb); #else static int __Pyx_GetException(PyObject **type, PyObject **value, PyObject **tb); #endif /* SwapException.proto */ #if CYTHON_FAST_THREAD_STATE #define __Pyx_ExceptionSwap(type, value, tb) __Pyx__ExceptionSwap(__pyx_tstate, type, value, tb) static CYTHON_INLINE void __Pyx__ExceptionSwap(PyThreadState *tstate, PyObject **type, PyObject **value, PyObject **tb); #else static CYTHON_INLINE void __Pyx_ExceptionSwap(PyObject **type, PyObject **value, PyObject **tb); #endif /* GetTopmostException.proto */ #if CYTHON_USE_EXC_INFO_STACK static _PyErr_StackItem * __Pyx_PyErr_GetTopmostException(PyThreadState *tstate); #endif /* SaveResetException.proto */ #if CYTHON_FAST_THREAD_STATE #define __Pyx_ExceptionSave(type, value, tb) __Pyx__ExceptionSave(__pyx_tstate, type, value, tb) static CYTHON_INLINE void __Pyx__ExceptionSave(PyThreadState *tstate, PyObject **type, PyObject **value, PyObject **tb); #define __Pyx_ExceptionReset(type, value, tb) __Pyx__ExceptionReset(__pyx_tstate, type, value, tb) static CYTHON_INLINE void __Pyx__ExceptionReset(PyThreadState *tstate, PyObject *type, PyObject *value, PyObject *tb); #else #define __Pyx_ExceptionSave(type, value, tb) PyErr_GetExcInfo(type, value, tb) #define __Pyx_ExceptionReset(type, value, tb) PyErr_SetExcInfo(type, value, tb) #endif /* decode_c_string.proto */ static CYTHON_INLINE PyObject* __Pyx_decode_c_string( const char* cstring, Py_ssize_t start, Py_ssize_t stop, const char* encoding, const char* errors, PyObject* (*decode_func)(const char *s, Py_ssize_t size, const char *errors)); /* UnpackUnboundCMethod.proto */ typedef struct { PyObject *type; PyObject **method_name; PyCFunction func; PyObject *method; int flag; } __Pyx_CachedCFunction; /* CallUnboundCMethod1.proto */ static PyObject* __Pyx__CallUnboundCMethod1(__Pyx_CachedCFunction* cfunc, PyObject* self, PyObject* arg); #if CYTHON_COMPILING_IN_CPYTHON static CYTHON_INLINE PyObject* __Pyx_CallUnboundCMethod1(__Pyx_CachedCFunction* cfunc, PyObject* self, PyObject* arg); #else #define __Pyx_CallUnboundCMethod1(cfunc, self, arg) __Pyx__CallUnboundCMethod1(cfunc, self, arg) #endif /* Import.proto */ static PyObject *__Pyx_Import(PyObject *name, PyObject *from_list, int level); /* ImportFrom.proto */ static 
PyObject* __Pyx_ImportFrom(PyObject* module, PyObject* name); /* HasAttr.proto */ static CYTHON_INLINE int __Pyx_HasAttr(PyObject *, PyObject *); /* PyObject_GenericGetAttrNoDict.proto */ #if CYTHON_USE_TYPE_SLOTS && CYTHON_USE_PYTYPE_LOOKUP && PY_VERSION_HEX < 0x03070000 static CYTHON_INLINE PyObject* __Pyx_PyObject_GenericGetAttrNoDict(PyObject* obj, PyObject* attr_name); #else #define __Pyx_PyObject_GenericGetAttrNoDict PyObject_GenericGetAttr #endif /* PyObject_GenericGetAttr.proto */ #if CYTHON_USE_TYPE_SLOTS && CYTHON_USE_PYTYPE_LOOKUP && PY_VERSION_HEX < 0x03070000 static PyObject* __Pyx_PyObject_GenericGetAttr(PyObject* obj, PyObject* attr_name); #else #define __Pyx_PyObject_GenericGetAttr PyObject_GenericGetAttr #endif /* SetupReduce.proto */ static int __Pyx_setup_reduce(PyObject* type_obj); /* SetVTable.proto */ static int __Pyx_SetVtable(PyObject *dict, void *vtable); /* TypeImport.proto */ #ifndef __PYX_HAVE_RT_ImportType_proto #define __PYX_HAVE_RT_ImportType_proto enum __Pyx_ImportType_CheckSize { __Pyx_ImportType_CheckSize_Error = 0, __Pyx_ImportType_CheckSize_Warn = 1, __Pyx_ImportType_CheckSize_Ignore = 2 }; static PyTypeObject *__Pyx_ImportType(PyObject* module, const char *module_name, const char *class_name, size_t size, enum __Pyx_ImportType_CheckSize check_size); #endif /* CLineInTraceback.proto */ #ifdef CYTHON_CLINE_IN_TRACEBACK #define __Pyx_CLineForTraceback(tstate, c_line) (((CYTHON_CLINE_IN_TRACEBACK)) ? c_line : 0) #else static int __Pyx_CLineForTraceback(PyThreadState *tstate, int c_line); #endif /* CodeObjectCache.proto */ typedef struct { PyCodeObject* code_object; int code_line; } __Pyx_CodeObjectCacheEntry; struct __Pyx_CodeObjectCache { int count; int max_count; __Pyx_CodeObjectCacheEntry* entries; }; static struct __Pyx_CodeObjectCache __pyx_code_cache = {0,0,NULL}; static int __pyx_bisect_code_objects(__Pyx_CodeObjectCacheEntry* entries, int count, int code_line); static PyCodeObject *__pyx_find_code_object(int code_line); static void __pyx_insert_code_object(int code_line, PyCodeObject* code_object); /* AddTraceback.proto */ static void __Pyx_AddTraceback(const char *funcname, int c_line, int py_line, const char *filename); /* CIntToPy.proto */ static CYTHON_INLINE PyObject* __Pyx_PyInt_From_int(int value); /* CIntToPy.proto */ static CYTHON_INLINE PyObject* __Pyx_PyInt_From_unsigned_int(unsigned int value); /* CIntToPy.proto */ static CYTHON_INLINE PyObject* __Pyx_PyInt_From_unsigned_short(unsigned short value); /* CIntToPy.proto */ static CYTHON_INLINE PyObject* __Pyx_PyInt_From_long(long value); /* CIntToPy.proto */ static CYTHON_INLINE PyObject* __Pyx_PyInt_From_uint16_t(uint16_t value); /* CIntFromPy.proto */ static CYTHON_INLINE int __Pyx_PyInt_As_int(PyObject *); /* CIntFromPy.proto */ static CYTHON_INLINE enum http_method __Pyx_PyInt_As_enum__http_method(PyObject *); /* CIntFromPy.proto */ static CYTHON_INLINE size_t __Pyx_PyInt_As_size_t(PyObject *); /* CIntFromPy.proto */ static CYTHON_INLINE long __Pyx_PyInt_As_long(PyObject *); /* FastTypeChecks.proto */ #if CYTHON_COMPILING_IN_CPYTHON #define __Pyx_TypeCheck(obj, type) __Pyx_IsSubtype(Py_TYPE(obj), (PyTypeObject *)type) static CYTHON_INLINE int __Pyx_IsSubtype(PyTypeObject *a, PyTypeObject *b); static CYTHON_INLINE int __Pyx_PyErr_GivenExceptionMatches(PyObject *err, PyObject *type); static CYTHON_INLINE int __Pyx_PyErr_GivenExceptionMatches2(PyObject *err, PyObject *type1, PyObject *type2); #else #define __Pyx_TypeCheck(obj, type) PyObject_TypeCheck(obj, (PyTypeObject *)type) #define 
__Pyx_PyErr_GivenExceptionMatches(err, type) PyErr_GivenExceptionMatches(err, type) #define __Pyx_PyErr_GivenExceptionMatches2(err, type1, type2) (PyErr_GivenExceptionMatches(err, type1) || PyErr_GivenExceptionMatches(err, type2)) #endif #define __Pyx_PyException_Check(obj) __Pyx_TypeCheck(obj, PyExc_Exception) /* FetchCommonType.proto */ static PyTypeObject* __Pyx_FetchCommonType(PyTypeObject* type); /* PyObjectGetMethod.proto */ static int __Pyx_PyObject_GetMethod(PyObject *obj, PyObject *name, PyObject **method); /* PyObjectCallMethod1.proto */ static PyObject* __Pyx_PyObject_CallMethod1(PyObject* obj, PyObject* method_name, PyObject* arg); /* CoroutineBase.proto */ typedef PyObject *(*__pyx_coroutine_body_t)(PyObject *, PyThreadState *, PyObject *); #if CYTHON_USE_EXC_INFO_STACK #define __Pyx_ExcInfoStruct _PyErr_StackItem #else typedef struct { PyObject *exc_type; PyObject *exc_value; PyObject *exc_traceback; } __Pyx_ExcInfoStruct; #endif typedef struct { PyObject_HEAD __pyx_coroutine_body_t body; PyObject *closure; __Pyx_ExcInfoStruct gi_exc_state; PyObject *gi_weakreflist; PyObject *classobj; PyObject *yieldfrom; PyObject *gi_name; PyObject *gi_qualname; PyObject *gi_modulename; PyObject *gi_code; int resume_label; char is_running; } __pyx_CoroutineObject; static __pyx_CoroutineObject *__Pyx__Coroutine_New( PyTypeObject *type, __pyx_coroutine_body_t body, PyObject *code, PyObject *closure, PyObject *name, PyObject *qualname, PyObject *module_name); static __pyx_CoroutineObject *__Pyx__Coroutine_NewInit( __pyx_CoroutineObject *gen, __pyx_coroutine_body_t body, PyObject *code, PyObject *closure, PyObject *name, PyObject *qualname, PyObject *module_name); static CYTHON_INLINE void __Pyx_Coroutine_ExceptionClear(__Pyx_ExcInfoStruct *self); static int __Pyx_Coroutine_clear(PyObject *self); static PyObject *__Pyx_Coroutine_Send(PyObject *self, PyObject *value); static PyObject *__Pyx_Coroutine_Close(PyObject *self); static PyObject *__Pyx_Coroutine_Throw(PyObject *gen, PyObject *args); #if CYTHON_USE_EXC_INFO_STACK #define __Pyx_Coroutine_SwapException(self) #define __Pyx_Coroutine_ResetAndClearException(self) __Pyx_Coroutine_ExceptionClear(&(self)->gi_exc_state) #else #define __Pyx_Coroutine_SwapException(self) {\ __Pyx_ExceptionSwap(&(self)->gi_exc_state.exc_type, &(self)->gi_exc_state.exc_value, &(self)->gi_exc_state.exc_traceback);\ __Pyx_Coroutine_ResetFrameBackpointer(&(self)->gi_exc_state);\ } #define __Pyx_Coroutine_ResetAndClearException(self) {\ __Pyx_ExceptionReset((self)->gi_exc_state.exc_type, (self)->gi_exc_state.exc_value, (self)->gi_exc_state.exc_traceback);\ (self)->gi_exc_state.exc_type = (self)->gi_exc_state.exc_value = (self)->gi_exc_state.exc_traceback = NULL;\ } #endif #if CYTHON_FAST_THREAD_STATE #define __Pyx_PyGen_FetchStopIterationValue(pvalue)\ __Pyx_PyGen__FetchStopIterationValue(__pyx_tstate, pvalue) #else #define __Pyx_PyGen_FetchStopIterationValue(pvalue)\ __Pyx_PyGen__FetchStopIterationValue(__Pyx_PyThreadState_Current, pvalue) #endif static int __Pyx_PyGen__FetchStopIterationValue(PyThreadState *tstate, PyObject **pvalue); static CYTHON_INLINE void __Pyx_Coroutine_ResetFrameBackpointer(__Pyx_ExcInfoStruct *exc_state); /* PatchModuleWithCoroutine.proto */ static PyObject* __Pyx_Coroutine_patch_module(PyObject* module, const char* py_code); /* PatchGeneratorABC.proto */ static int __Pyx_patch_abc(void); /* Generator.proto */ #define __Pyx_Generator_USED static PyTypeObject *__pyx_GeneratorType = 0; #define __Pyx_Generator_CheckExact(obj) (Py_TYPE(obj) == 
__pyx_GeneratorType) #define __Pyx_Generator_New(body, code, closure, name, qualname, module_name)\ __Pyx__Coroutine_New(__pyx_GeneratorType, body, code, closure, name, qualname, module_name) static PyObject *__Pyx_Generator_Next(PyObject *self); static int __pyx_Generator_init(void); /* CheckBinaryVersion.proto */ static int __Pyx_check_binary_version(void); /* InitStrings.proto */ static int __Pyx_InitStrings(__Pyx_StringTabEntry *t); static PyObject *__pyx_f_7aiohttp_12_http_parser_10HttpParser__init(struct __pyx_obj_7aiohttp_12_http_parser_HttpParser *__pyx_v_self, enum http_parser_type __pyx_v_mode, PyObject *__pyx_v_protocol, PyObject *__pyx_v_loop, struct __pyx_opt_args_7aiohttp_12_http_parser_10HttpParser__init *__pyx_optional_args); /* proto*/ static PyObject *__pyx_f_7aiohttp_12_http_parser_10HttpParser__process_header(struct __pyx_obj_7aiohttp_12_http_parser_HttpParser *__pyx_v_self); /* proto*/ static PyObject *__pyx_f_7aiohttp_12_http_parser_10HttpParser__on_header_field(struct __pyx_obj_7aiohttp_12_http_parser_HttpParser *__pyx_v_self, char *__pyx_v_at, size_t __pyx_v_length); /* proto*/ static PyObject *__pyx_f_7aiohttp_12_http_parser_10HttpParser__on_header_value(struct __pyx_obj_7aiohttp_12_http_parser_HttpParser *__pyx_v_self, char *__pyx_v_at, size_t __pyx_v_length); /* proto*/ static PyObject *__pyx_f_7aiohttp_12_http_parser_10HttpParser__on_headers_complete(struct __pyx_obj_7aiohttp_12_http_parser_HttpParser *__pyx_v_self); /* proto*/ static PyObject *__pyx_f_7aiohttp_12_http_parser_10HttpParser__on_message_complete(struct __pyx_obj_7aiohttp_12_http_parser_HttpParser *__pyx_v_self); /* proto*/ static PyObject *__pyx_f_7aiohttp_12_http_parser_10HttpParser__on_chunk_header(struct __pyx_obj_7aiohttp_12_http_parser_HttpParser *__pyx_v_self); /* proto*/ static PyObject *__pyx_f_7aiohttp_12_http_parser_10HttpParser__on_chunk_complete(struct __pyx_obj_7aiohttp_12_http_parser_HttpParser *__pyx_v_self); /* proto*/ static PyObject *__pyx_f_7aiohttp_12_http_parser_10HttpParser__on_status_complete(CYTHON_UNUSED struct __pyx_obj_7aiohttp_12_http_parser_HttpParser *__pyx_v_self); /* proto*/ static CYTHON_INLINE PyObject *__pyx_f_7aiohttp_12_http_parser_10HttpParser_http_version(struct __pyx_obj_7aiohttp_12_http_parser_HttpParser *__pyx_v_self); /* proto*/ static PyObject *__pyx_f_7aiohttp_12_http_parser_17HttpRequestParser__on_status_complete(struct __pyx_obj_7aiohttp_12_http_parser_HttpRequestParser *__pyx_v_self); /* proto*/ static PyObject *__pyx_f_7aiohttp_12_http_parser_18HttpResponseParser__on_status_complete(struct __pyx_obj_7aiohttp_12_http_parser_HttpResponseParser *__pyx_v_self); /* proto*/ /* Module declarations from 'cpython.mem' */ /* Module declarations from 'libc.string' */ /* Module declarations from 'cpython.version' */ /* Module declarations from '__builtin__' */ /* Module declarations from 'cpython.type' */ static PyTypeObject *__pyx_ptype_7cpython_4type_type = 0; /* Module declarations from 'libc.stdio' */ /* Module declarations from 'cpython.object' */ /* Module declarations from 'cpython.ref' */ /* Module declarations from 'cpython.exc' */ /* Module declarations from 'cpython.module' */ /* Module declarations from 'cpython.tuple' */ /* Module declarations from 'cpython.list' */ /* Module declarations from 'cpython.sequence' */ /* Module declarations from 'cpython.mapping' */ /* Module declarations from 'cpython.iterator' */ /* Module declarations from 'cpython.number' */ /* Module declarations from 'cpython.int' */ /* Module declarations from '__builtin__' */ 
/* Module declarations from 'cpython.bool' */ static PyTypeObject *__pyx_ptype_7cpython_4bool_bool = 0; /* Module declarations from 'cpython.long' */ /* Module declarations from 'cpython.float' */ /* Module declarations from '__builtin__' */ /* Module declarations from 'cpython.complex' */ static PyTypeObject *__pyx_ptype_7cpython_7complex_complex = 0; /* Module declarations from 'cpython.string' */ /* Module declarations from 'cpython.unicode' */ /* Module declarations from 'cpython.dict' */ /* Module declarations from 'cpython.instance' */ /* Module declarations from 'cpython.function' */ /* Module declarations from 'cpython.method' */ /* Module declarations from 'cpython.weakref' */ /* Module declarations from 'cpython.getargs' */ /* Module declarations from 'cpython.pythread' */ /* Module declarations from 'cpython.pystate' */ /* Module declarations from 'cpython.cobject' */ /* Module declarations from 'cpython.oldbuffer' */ /* Module declarations from 'cpython.set' */ /* Module declarations from 'cpython.buffer' */ /* Module declarations from 'cpython.bytes' */ /* Module declarations from 'cpython.pycapsule' */ /* Module declarations from 'cpython' */ /* Module declarations from 'cython' */ /* Module declarations from 'aiohttp' */ /* Module declarations from 'libc.stdint' */ /* Module declarations from 'aiohttp._cparser' */ /* Module declarations from 'aiohttp._find_header' */ /* Module declarations from 'aiohttp._http_parser' */ static PyTypeObject *__pyx_ptype_7aiohttp_12_http_parser_RawRequestMessage = 0; static PyTypeObject *__pyx_ptype_7aiohttp_12_http_parser_RawResponseMessage = 0; static PyTypeObject *__pyx_ptype_7aiohttp_12_http_parser_HttpParser = 0; static PyTypeObject *__pyx_ptype_7aiohttp_12_http_parser_HttpRequestParser = 0; static PyTypeObject *__pyx_ptype_7aiohttp_12_http_parser_HttpResponseParser = 0; static PyTypeObject *__pyx_ptype_7aiohttp_12_http_parser___pyx_scope_struct____repr__ = 0; static PyTypeObject *__pyx_ptype_7aiohttp_12_http_parser___pyx_scope_struct_1_genexpr = 0; static PyTypeObject *__pyx_ptype_7aiohttp_12_http_parser___pyx_scope_struct_2___repr__ = 0; static PyTypeObject *__pyx_ptype_7aiohttp_12_http_parser___pyx_scope_struct_3_genexpr = 0; static PyObject *__pyx_v_7aiohttp_12_http_parser_headers = 0; static PyObject *__pyx_v_7aiohttp_12_http_parser_URL = 0; static PyObject *__pyx_v_7aiohttp_12_http_parser_URL_build = 0; static PyObject *__pyx_v_7aiohttp_12_http_parser_CIMultiDict = 0; static PyObject *__pyx_v_7aiohttp_12_http_parser_CIMultiDictProxy = 0; static PyObject *__pyx_v_7aiohttp_12_http_parser_HttpVersion = 0; static PyObject *__pyx_v_7aiohttp_12_http_parser_HttpVersion10 = 0; static PyObject *__pyx_v_7aiohttp_12_http_parser_HttpVersion11 = 0; static PyObject *__pyx_v_7aiohttp_12_http_parser_SEC_WEBSOCKET_KEY1 = 0; static PyObject *__pyx_v_7aiohttp_12_http_parser_CONTENT_ENCODING = 0; static PyObject *__pyx_v_7aiohttp_12_http_parser_EMPTY_PAYLOAD = 0; static PyObject *__pyx_v_7aiohttp_12_http_parser_StreamReader = 0; static PyObject *__pyx_v_7aiohttp_12_http_parser_DeflateBuffer = 0; static PyObject *__pyx_v_7aiohttp_12_http_parser__http_method = 0; static CYTHON_INLINE PyObject *__pyx_f_7aiohttp_12_http_parser_extend(PyObject *, char const *, size_t); /*proto*/ static CYTHON_INLINE PyObject *__pyx_f_7aiohttp_12_http_parser_http_method_str(int); /*proto*/ static CYTHON_INLINE PyObject *__pyx_f_7aiohttp_12_http_parser_find_header(PyObject *); /*proto*/ static PyObject *__pyx_f_7aiohttp_12_http_parser__new_request_message(PyObject *, PyObject 
*, PyObject *, PyObject *, PyObject *, int, PyObject *, int, int, PyObject *); /*proto*/ static PyObject *__pyx_f_7aiohttp_12_http_parser__new_response_message(PyObject *, int, PyObject *, PyObject *, PyObject *, int, PyObject *, int, int); /*proto*/ static int __pyx_f_7aiohttp_12_http_parser_cb_on_message_begin(struct http_parser *); /*proto*/ static int __pyx_f_7aiohttp_12_http_parser_cb_on_url(struct http_parser *, char const *, size_t); /*proto*/ static int __pyx_f_7aiohttp_12_http_parser_cb_on_status(struct http_parser *, char const *, size_t); /*proto*/ static int __pyx_f_7aiohttp_12_http_parser_cb_on_header_field(struct http_parser *, char const *, size_t); /*proto*/ static int __pyx_f_7aiohttp_12_http_parser_cb_on_header_value(struct http_parser *, char const *, size_t); /*proto*/ static int __pyx_f_7aiohttp_12_http_parser_cb_on_headers_complete(struct http_parser *); /*proto*/ static int __pyx_f_7aiohttp_12_http_parser_cb_on_body(struct http_parser *, char const *, size_t); /*proto*/ static int __pyx_f_7aiohttp_12_http_parser_cb_on_message_complete(struct http_parser *); /*proto*/ static int __pyx_f_7aiohttp_12_http_parser_cb_on_chunk_header(struct http_parser *); /*proto*/ static int __pyx_f_7aiohttp_12_http_parser_cb_on_chunk_complete(struct http_parser *); /*proto*/ static PyObject *__pyx_f_7aiohttp_12_http_parser_parser_error_from_errno(enum http_errno); /*proto*/ static PyObject *__pyx_f_7aiohttp_12_http_parser__parse_url(char *, size_t); /*proto*/ static PyObject *__pyx_f_7aiohttp_12_http_parser___pyx_unpickle_RawRequestMessage__set_state(struct __pyx_obj_7aiohttp_12_http_parser_RawRequestMessage *, PyObject *); /*proto*/ static PyObject *__pyx_f_7aiohttp_12_http_parser___pyx_unpickle_RawResponseMessage__set_state(struct __pyx_obj_7aiohttp_12_http_parser_RawResponseMessage *, PyObject *); /*proto*/ #define __Pyx_MODULE_NAME "aiohttp._http_parser" extern int __pyx_module_is_main_aiohttp___http_parser; int __pyx_module_is_main_aiohttp___http_parser = 0; /* Implementation of 'aiohttp._http_parser' */ static PyObject *__pyx_builtin_range; static PyObject *__pyx_builtin_MemoryError; static PyObject *__pyx_builtin_TypeError; static PyObject *__pyx_builtin_BaseException; static const char __pyx_k_[] = "="; static const char __pyx_k_i[] = "i"; static const char __pyx_k_TE[] = "TE"; static const char __pyx_k__2[] = ", "; static const char __pyx_k__3[] = ")>"; static const char __pyx_k__4[] = ""; static const char __pyx_k_br[] = "br"; static const char __pyx_k_AGE[] = "AGE"; static const char __pyx_k_URI[] = "URI"; static const char __pyx_k_URL[] = "URL"; static const char __pyx_k_VIA[] = "VIA"; static const char __pyx_k__11[] = ":"; static const char __pyx_k_add[] = "add"; static const char __pyx_k_all[] = "__all__"; static const char __pyx_k_new[] = "__new__"; static const char __pyx_k_url[] = "url"; static const char __pyx_k_DATE[] = "DATE"; static const char __pyx_k_ETAG[] = "ETAG"; static const char __pyx_k_FROM[] = "FROM"; static const char __pyx_k_HOST[] = "HOST"; static const char __pyx_k_LINK[] = "LINK"; static const char __pyx_k_VARY[] = "VARY"; static const char __pyx_k_args[] = "args"; static const char __pyx_k_code[] = "code"; static const char __pyx_k_dict[] = "__dict__"; static const char __pyx_k_gzip[] = "gzip"; static const char __pyx_k_hdrs[] = "hdrs"; static const char __pyx_k_host[] = "host"; static const char __pyx_k_loop[] = "loop"; static const char __pyx_k_main[] = "__main__"; static const char __pyx_k_name[] = "__name__"; static const char __pyx_k_path[] = 
"path"; static const char __pyx_k_port[] = "port"; static const char __pyx_k_send[] = "send"; static const char __pyx_k_test[] = "__test__"; static const char __pyx_k_user[] = "user"; static const char __pyx_k_yarl[] = "yarl"; static const char __pyx_k_ALLOW[] = "ALLOW"; static const char __pyx_k_RANGE[] = "RANGE"; static const char __pyx_k_URL_2[] = "_URL"; static const char __pyx_k_build[] = "build"; static const char __pyx_k_close[] = "close"; static const char __pyx_k_lower[] = "lower"; static const char __pyx_k_query[] = "query"; static const char __pyx_k_range[] = "range"; static const char __pyx_k_throw[] = "throw"; static const char __pyx_k_timer[] = "timer"; static const char __pyx_k_ACCEPT[] = "ACCEPT"; static const char __pyx_k_COOKIE[] = "COOKIE"; static const char __pyx_k_DIGEST[] = "DIGEST"; static const char __pyx_k_EXPECT[] = "EXPECT"; static const char __pyx_k_ORIGIN[] = "ORIGIN"; static const char __pyx_k_PRAGMA[] = "PRAGMA"; static const char __pyx_k_SERVER[] = "SERVER"; static const char __pyx_k_format[] = "format"; static const char __pyx_k_import[] = "__import__"; static const char __pyx_k_method[] = "method"; static const char __pyx_k_pickle[] = "pickle"; static const char __pyx_k_py_buf[] = "py_buf"; static const char __pyx_k_reason[] = "reason"; static const char __pyx_k_reduce[] = "__reduce__"; static const char __pyx_k_scheme[] = "scheme"; static const char __pyx_k_update[] = "update"; static const char __pyx_k_EXPIRES[] = "EXPIRES"; static const char __pyx_k_REFERER[] = "REFERER"; static const char __pyx_k_TRAILER[] = "TRAILER"; static const char __pyx_k_UPGRADE[] = "UPGRADE"; static const char __pyx_k_WARNING[] = "WARNING"; static const char __pyx_k_aiohttp[] = "aiohttp"; static const char __pyx_k_chunked[] = "chunked"; static const char __pyx_k_deflate[] = "deflate"; static const char __pyx_k_genexpr[] = "genexpr"; static const char __pyx_k_headers[] = "headers"; static const char __pyx_k_streams[] = "streams"; static const char __pyx_k_unknown[] = ""; static const char __pyx_k_upgrade[] = "upgrade"; static const char __pyx_k_version[] = "version"; static const char __pyx_k_IF_MATCH[] = "IF_MATCH"; static const char __pyx_k_IF_RANGE[] = "IF_RANGE"; static const char __pyx_k_LOCATION[] = "LOCATION"; static const char __pyx_k_buf_data[] = "buf_data"; static const char __pyx_k_feed_eof[] = "feed_eof"; static const char __pyx_k_fragment[] = "fragment"; static const char __pyx_k_getstate[] = "__getstate__"; static const char __pyx_k_password[] = "password"; static const char __pyx_k_protocol[] = "protocol"; static const char __pyx_k_pyx_type[] = "__pyx_type"; static const char __pyx_k_setstate[] = "__setstate__"; static const char __pyx_k_FORWARDED[] = "FORWARDED"; static const char __pyx_k_TypeError[] = "TypeError"; static const char __pyx_k_WEBSOCKET[] = "WEBSOCKET"; static const char __pyx_k_feed_data[] = "feed_data"; static const char __pyx_k_multidict[] = "multidict"; static const char __pyx_k_parse_url[] = "parse_url"; static const char __pyx_k_partition[] = "partition"; static const char __pyx_k_pyx_state[] = "__pyx_state"; static const char __pyx_k_reduce_ex[] = "__reduce_ex__"; static const char __pyx_k_CONNECTION[] = "CONNECTION"; static const char __pyx_k_KEEP_ALIVE[] = "KEEP_ALIVE"; static const char __pyx_k_SET_COOKIE[] = "SET_COOKIE"; static const char __pyx_k_USER_AGENT[] = "USER_AGENT"; static const char __pyx_k_pyx_result[] = "__pyx_result"; static const char __pyx_k_pyx_vtable[] = "__pyx_vtable__"; static const char __pyx_k_CIMultiDict[] = 
"CIMultiDict"; static const char __pyx_k_CONTENT_MD5[] = "CONTENT_MD5"; static const char __pyx_k_DESTINATION[] = "DESTINATION"; static const char __pyx_k_HttpVersion[] = "HttpVersion"; static const char __pyx_k_LineTooLong[] = "LineTooLong"; static const char __pyx_k_MemoryError[] = "MemoryError"; static const char __pyx_k_PickleError[] = "PickleError"; static const char __pyx_k_RETRY_AFTER[] = "RETRY_AFTER"; static const char __pyx_k_WANT_DIGEST[] = "WANT_DIGEST"; static const char __pyx_k_compression[] = "compression"; static const char __pyx_k_http_parser[] = "http_parser"; static const char __pyx_k_http_writer[] = "http_writer"; static const char __pyx_k_max_headers[] = "max_headers"; static const char __pyx_k_raw_headers[] = "raw_headers"; static const char __pyx_k_CONTENT_TYPE[] = "CONTENT_TYPE"; static const char __pyx_k_MAX_FORWARDS[] = "MAX_FORWARDS"; static const char __pyx_k_StreamReader[] = "StreamReader"; static const char __pyx_k_pyx_checksum[] = "__pyx_checksum"; static const char __pyx_k_should_close[] = "should_close"; static const char __pyx_k_stringsource[] = "stringsource"; static const char __pyx_k_ACCEPT_RANGES[] = "ACCEPT_RANGES"; static const char __pyx_k_AUTHORIZATION[] = "AUTHORIZATION"; static const char __pyx_k_BadStatusLine[] = "BadStatusLine"; static const char __pyx_k_BaseException[] = "BaseException"; static const char __pyx_k_CACHE_CONTROL[] = "CACHE_CONTROL"; static const char __pyx_k_CIMultiDict_2[] = "_CIMultiDict"; static const char __pyx_k_CONTENT_RANGE[] = "CONTENT_RANGE"; static const char __pyx_k_DeflateBuffer[] = "DeflateBuffer"; static const char __pyx_k_EMPTY_PAYLOAD[] = "EMPTY_PAYLOAD"; static const char __pyx_k_HttpVersion10[] = "HttpVersion10"; static const char __pyx_k_HttpVersion11[] = "HttpVersion11"; static const char __pyx_k_HttpVersion_2[] = "_HttpVersion"; static const char __pyx_k_IF_NONE_MATCH[] = "IF_NONE_MATCH"; static const char __pyx_k_InvalidHeader[] = "InvalidHeader"; static const char __pyx_k_LAST_EVENT_ID[] = "LAST_EVENT_ID"; static const char __pyx_k_LAST_MODIFIED[] = "LAST_MODIFIED"; static const char __pyx_k_invalid_url_r[] = "invalid url {!r}"; static const char __pyx_k_max_line_size[] = "max_line_size"; static const char __pyx_k_reduce_cython[] = "__reduce_cython__"; static const char __pyx_k_set_exception[] = "set_exception"; static const char __pyx_k_ACCEPT_CHARSET[] = "ACCEPT_CHARSET"; static const char __pyx_k_BadHttpMessage[] = "BadHttpMessage"; static const char __pyx_k_CONTENT_LENGTH[] = "CONTENT_LENGTH"; static const char __pyx_k_StreamReader_2[] = "_StreamReader"; static const char __pyx_k_max_field_size[] = "max_field_size"; static const char __pyx_k_read_until_eof[] = "read_until_eof"; static const char __pyx_k_ACCEPT_ENCODING[] = "ACCEPT_ENCODING"; static const char __pyx_k_ACCEPT_LANGUAGE[] = "ACCEPT_LANGUAGE"; static const char __pyx_k_DeflateBuffer_2[] = "_DeflateBuffer"; static const char __pyx_k_EMPTY_PAYLOAD_2[] = "_EMPTY_PAYLOAD"; static const char __pyx_k_HttpVersion10_2[] = "_HttpVersion10"; static const char __pyx_k_HttpVersion11_2[] = "_HttpVersion11"; static const char __pyx_k_InvalidURLError[] = "InvalidURLError"; static const char __pyx_k_X_FORWARDED_FOR[] = "X_FORWARDED_FOR"; static const char __pyx_k_auto_decompress[] = "auto_decompress"; static const char __pyx_k_http_exceptions[] = "http_exceptions"; static const char __pyx_k_pyx_PickleError[] = "__pyx_PickleError"; static const char __pyx_k_setstate_cython[] = "__setstate_cython__"; static const char __pyx_k_CIMultiDictProxy[] = 
"CIMultiDictProxy"; static const char __pyx_k_CONTENT_ENCODING[] = "CONTENT_ENCODING"; static const char __pyx_k_CONTENT_LANGUAGE[] = "CONTENT_LANGUAGE"; static const char __pyx_k_CONTENT_LOCATION[] = "CONTENT_LOCATION"; static const char __pyx_k_WWW_AUTHENTICATE[] = "WWW_AUTHENTICATE"; static const char __pyx_k_X_FORWARDED_HOST[] = "X_FORWARDED_HOST"; static const char __pyx_k_HttpRequestParser[] = "HttpRequestParser"; static const char __pyx_k_IF_MODIFIED_SINCE[] = "IF_MODIFIED_SINCE"; static const char __pyx_k_RawRequestMessage[] = "_http_method[i] */ static CYTHON_INLINE PyObject *__pyx_f_7aiohttp_12_http_parser_http_method_str(int __pyx_v_i) { PyObject *__pyx_r = NULL; __Pyx_RefNannyDeclarations int __pyx_t_1; PyObject *__pyx_t_2 = NULL; __Pyx_RefNannySetupContext("http_method_str", 0); /* "aiohttp/_http_parser.pyx":76 * * cdef inline str http_method_str(int i): * if i < METHODS_COUNT: # <<<<<<<<<<<<<< * return _http_method[i] * else: */ __pyx_t_1 = ((__pyx_v_i < 34) != 0); if (__pyx_t_1) { /* "aiohttp/_http_parser.pyx":77 * cdef inline str http_method_str(int i): * if i < METHODS_COUNT: * return _http_method[i] # <<<<<<<<<<<<<< * else: * return "" */ __Pyx_XDECREF(__pyx_r); if (unlikely(__pyx_v_7aiohttp_12_http_parser__http_method == Py_None)) { PyErr_SetString(PyExc_TypeError, "'NoneType' object is not subscriptable"); __PYX_ERR(0, 77, __pyx_L1_error) } __pyx_t_2 = __Pyx_GetItemInt_List(__pyx_v_7aiohttp_12_http_parser__http_method, __pyx_v_i, int, 1, __Pyx_PyInt_From_int, 1, 1, 1); if (unlikely(!__pyx_t_2)) __PYX_ERR(0, 77, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_2); __Pyx_INCREF(((PyObject*)__pyx_t_2)); __pyx_r = ((PyObject*)__pyx_t_2); __Pyx_DECREF(__pyx_t_2); __pyx_t_2 = 0; goto __pyx_L0; /* "aiohttp/_http_parser.pyx":76 * * cdef inline str http_method_str(int i): * if i < METHODS_COUNT: # <<<<<<<<<<<<<< * return _http_method[i] * else: */ } /* "aiohttp/_http_parser.pyx":79 * return _http_method[i] * else: * return "" # <<<<<<<<<<<<<< * * cdef inline object find_header(bytes raw_header): */ /*else*/ { __Pyx_XDECREF(__pyx_r); __Pyx_INCREF(__pyx_kp_u_unknown); __pyx_r = __pyx_kp_u_unknown; goto __pyx_L0; } /* "aiohttp/_http_parser.pyx":75 * * * cdef inline str http_method_str(int i): # <<<<<<<<<<<<<< * if i < METHODS_COUNT: * return _http_method[i] */ /* function exit code */ __pyx_L1_error:; __Pyx_XDECREF(__pyx_t_2); __Pyx_AddTraceback("aiohttp._http_parser.http_method_str", __pyx_clineno, __pyx_lineno, __pyx_filename); __pyx_r = 0; __pyx_L0:; __Pyx_XGIVEREF(__pyx_r); __Pyx_RefNannyFinishContext(); return __pyx_r; } /* "aiohttp/_http_parser.pyx":81 * return "" * * cdef inline object find_header(bytes raw_header): # <<<<<<<<<<<<<< * cdef Py_ssize_t size * cdef char *buf */ static CYTHON_INLINE PyObject *__pyx_f_7aiohttp_12_http_parser_find_header(PyObject *__pyx_v_raw_header) { Py_ssize_t __pyx_v_size; char *__pyx_v_buf; int __pyx_v_idx; PyObject *__pyx_r = NULL; __Pyx_RefNannyDeclarations int __pyx_t_1; int __pyx_t_2; PyObject *__pyx_t_3 = NULL; __Pyx_RefNannySetupContext("find_header", 0); /* "aiohttp/_http_parser.pyx":85 * cdef char *buf * cdef int idx * PyBytes_AsStringAndSize(raw_header, &buf, &size) # <<<<<<<<<<<<<< * idx = _find_header.find_header(buf, size) * if idx == -1: */ __pyx_t_1 = PyBytes_AsStringAndSize(__pyx_v_raw_header, (&__pyx_v_buf), (&__pyx_v_size)); if (unlikely(__pyx_t_1 == ((int)-1))) __PYX_ERR(0, 85, __pyx_L1_error) /* "aiohttp/_http_parser.pyx":86 * cdef int idx * PyBytes_AsStringAndSize(raw_header, &buf, &size) * idx = _find_header.find_header(buf, size) # 
<<<<<<<<<<<<<< * if idx == -1: * return raw_header.decode('utf-8', 'surrogateescape') */ __pyx_v_idx = find_header(__pyx_v_buf, __pyx_v_size); /* "aiohttp/_http_parser.pyx":87 * PyBytes_AsStringAndSize(raw_header, &buf, &size) * idx = _find_header.find_header(buf, size) * if idx == -1: # <<<<<<<<<<<<<< * return raw_header.decode('utf-8', 'surrogateescape') * return headers[idx] */ __pyx_t_2 = ((__pyx_v_idx == -1L) != 0); if (__pyx_t_2) { /* "aiohttp/_http_parser.pyx":88 * idx = _find_header.find_header(buf, size) * if idx == -1: * return raw_header.decode('utf-8', 'surrogateescape') # <<<<<<<<<<<<<< * return headers[idx] * */ __Pyx_XDECREF(__pyx_r); if (unlikely(__pyx_v_raw_header == Py_None)) { PyErr_Format(PyExc_AttributeError, "'NoneType' object has no attribute '%.30s'", "decode"); __PYX_ERR(0, 88, __pyx_L1_error) } __pyx_t_3 = __Pyx_decode_bytes(__pyx_v_raw_header, 0, PY_SSIZE_T_MAX, NULL, ((char const *)"surrogateescape"), PyUnicode_DecodeUTF8); if (unlikely(!__pyx_t_3)) __PYX_ERR(0, 88, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_3); __pyx_r = __pyx_t_3; __pyx_t_3 = 0; goto __pyx_L0; /* "aiohttp/_http_parser.pyx":87 * PyBytes_AsStringAndSize(raw_header, &buf, &size) * idx = _find_header.find_header(buf, size) * if idx == -1: # <<<<<<<<<<<<<< * return raw_header.decode('utf-8', 'surrogateescape') * return headers[idx] */ } /* "aiohttp/_http_parser.pyx":89 * if idx == -1: * return raw_header.decode('utf-8', 'surrogateescape') * return headers[idx] # <<<<<<<<<<<<<< * * */ __Pyx_XDECREF(__pyx_r); if (unlikely(__pyx_v_7aiohttp_12_http_parser_headers == Py_None)) { PyErr_SetString(PyExc_TypeError, "'NoneType' object is not subscriptable"); __PYX_ERR(0, 89, __pyx_L1_error) } __pyx_t_3 = __Pyx_GetItemInt_Tuple(__pyx_v_7aiohttp_12_http_parser_headers, __pyx_v_idx, int, 1, __Pyx_PyInt_From_int, 0, 1, 1); if (unlikely(!__pyx_t_3)) __PYX_ERR(0, 89, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_3); __pyx_r = __pyx_t_3; __pyx_t_3 = 0; goto __pyx_L0; /* "aiohttp/_http_parser.pyx":81 * return "" * * cdef inline object find_header(bytes raw_header): # <<<<<<<<<<<<<< * cdef Py_ssize_t size * cdef char *buf */ /* function exit code */ __pyx_L1_error:; __Pyx_XDECREF(__pyx_t_3); __Pyx_AddTraceback("aiohttp._http_parser.find_header", __pyx_clineno, __pyx_lineno, __pyx_filename); __pyx_r = 0; __pyx_L0:; __Pyx_XGIVEREF(__pyx_r); __Pyx_RefNannyFinishContext(); return __pyx_r; } /* "aiohttp/_http_parser.pyx":105 * cdef readonly object url # yarl.URL * * def __init__(self, method, path, version, headers, raw_headers, # <<<<<<<<<<<<<< * should_close, compression, upgrade, chunked, url): * self.method = method */ /* Python wrapper */ static int __pyx_pw_7aiohttp_12_http_parser_17RawRequestMessage_1__init__(PyObject *__pyx_v_self, PyObject *__pyx_args, PyObject *__pyx_kwds); /*proto*/ static int __pyx_pw_7aiohttp_12_http_parser_17RawRequestMessage_1__init__(PyObject *__pyx_v_self, PyObject *__pyx_args, PyObject *__pyx_kwds) { PyObject *__pyx_v_method = 0; PyObject *__pyx_v_path = 0; PyObject *__pyx_v_version = 0; PyObject *__pyx_v_headers = 0; PyObject *__pyx_v_raw_headers = 0; PyObject *__pyx_v_should_close = 0; PyObject *__pyx_v_compression = 0; PyObject *__pyx_v_upgrade = 0; PyObject *__pyx_v_chunked = 0; PyObject *__pyx_v_url = 0; int __pyx_r; __Pyx_RefNannyDeclarations __Pyx_RefNannySetupContext("__init__ (wrapper)", 0); { static PyObject **__pyx_pyargnames[] = 
{&__pyx_n_s_method,&__pyx_n_s_path,&__pyx_n_s_version,&__pyx_n_s_headers,&__pyx_n_s_raw_headers,&__pyx_n_s_should_close,&__pyx_n_s_compression,&__pyx_n_s_upgrade,&__pyx_n_s_chunked,&__pyx_n_s_url,0}; PyObject* values[10] = {0,0,0,0,0,0,0,0,0,0}; if (unlikely(__pyx_kwds)) { Py_ssize_t kw_args; const Py_ssize_t pos_args = PyTuple_GET_SIZE(__pyx_args); switch (pos_args) { case 10: values[9] = PyTuple_GET_ITEM(__pyx_args, 9); CYTHON_FALLTHROUGH; case 9: values[8] = PyTuple_GET_ITEM(__pyx_args, 8); CYTHON_FALLTHROUGH; case 8: values[7] = PyTuple_GET_ITEM(__pyx_args, 7); CYTHON_FALLTHROUGH; case 7: values[6] = PyTuple_GET_ITEM(__pyx_args, 6); CYTHON_FALLTHROUGH; case 6: values[5] = PyTuple_GET_ITEM(__pyx_args, 5); CYTHON_FALLTHROUGH; case 5: values[4] = PyTuple_GET_ITEM(__pyx_args, 4); CYTHON_FALLTHROUGH; case 4: values[3] = PyTuple_GET_ITEM(__pyx_args, 3); CYTHON_FALLTHROUGH; case 3: values[2] = PyTuple_GET_ITEM(__pyx_args, 2); CYTHON_FALLTHROUGH; case 2: values[1] = PyTuple_GET_ITEM(__pyx_args, 1); CYTHON_FALLTHROUGH; case 1: values[0] = PyTuple_GET_ITEM(__pyx_args, 0); CYTHON_FALLTHROUGH; case 0: break; default: goto __pyx_L5_argtuple_error; } kw_args = PyDict_Size(__pyx_kwds); switch (pos_args) { case 0: if (likely((values[0] = __Pyx_PyDict_GetItemStr(__pyx_kwds, __pyx_n_s_method)) != 0)) kw_args--; else goto __pyx_L5_argtuple_error; CYTHON_FALLTHROUGH; case 1: if (likely((values[1] = __Pyx_PyDict_GetItemStr(__pyx_kwds, __pyx_n_s_path)) != 0)) kw_args--; else { __Pyx_RaiseArgtupleInvalid("__init__", 1, 10, 10, 1); __PYX_ERR(0, 105, __pyx_L3_error) } CYTHON_FALLTHROUGH; case 2: if (likely((values[2] = __Pyx_PyDict_GetItemStr(__pyx_kwds, __pyx_n_s_version)) != 0)) kw_args--; else { __Pyx_RaiseArgtupleInvalid("__init__", 1, 10, 10, 2); __PYX_ERR(0, 105, __pyx_L3_error) } CYTHON_FALLTHROUGH; case 3: if (likely((values[3] = __Pyx_PyDict_GetItemStr(__pyx_kwds, __pyx_n_s_headers)) != 0)) kw_args--; else { __Pyx_RaiseArgtupleInvalid("__init__", 1, 10, 10, 3); __PYX_ERR(0, 105, __pyx_L3_error) } CYTHON_FALLTHROUGH; case 4: if (likely((values[4] = __Pyx_PyDict_GetItemStr(__pyx_kwds, __pyx_n_s_raw_headers)) != 0)) kw_args--; else { __Pyx_RaiseArgtupleInvalid("__init__", 1, 10, 10, 4); __PYX_ERR(0, 105, __pyx_L3_error) } CYTHON_FALLTHROUGH; case 5: if (likely((values[5] = __Pyx_PyDict_GetItemStr(__pyx_kwds, __pyx_n_s_should_close)) != 0)) kw_args--; else { __Pyx_RaiseArgtupleInvalid("__init__", 1, 10, 10, 5); __PYX_ERR(0, 105, __pyx_L3_error) } CYTHON_FALLTHROUGH; case 6: if (likely((values[6] = __Pyx_PyDict_GetItemStr(__pyx_kwds, __pyx_n_s_compression)) != 0)) kw_args--; else { __Pyx_RaiseArgtupleInvalid("__init__", 1, 10, 10, 6); __PYX_ERR(0, 105, __pyx_L3_error) } CYTHON_FALLTHROUGH; case 7: if (likely((values[7] = __Pyx_PyDict_GetItemStr(__pyx_kwds, __pyx_n_s_upgrade)) != 0)) kw_args--; else { __Pyx_RaiseArgtupleInvalid("__init__", 1, 10, 10, 7); __PYX_ERR(0, 105, __pyx_L3_error) } CYTHON_FALLTHROUGH; case 8: if (likely((values[8] = __Pyx_PyDict_GetItemStr(__pyx_kwds, __pyx_n_s_chunked)) != 0)) kw_args--; else { __Pyx_RaiseArgtupleInvalid("__init__", 1, 10, 10, 8); __PYX_ERR(0, 105, __pyx_L3_error) } CYTHON_FALLTHROUGH; case 9: if (likely((values[9] = __Pyx_PyDict_GetItemStr(__pyx_kwds, __pyx_n_s_url)) != 0)) kw_args--; else { __Pyx_RaiseArgtupleInvalid("__init__", 1, 10, 10, 9); __PYX_ERR(0, 105, __pyx_L3_error) } } if (unlikely(kw_args > 0)) { if (unlikely(__Pyx_ParseOptionalKeywords(__pyx_kwds, __pyx_pyargnames, 0, values, pos_args, "__init__") < 0)) __PYX_ERR(0, 105, __pyx_L3_error) } } else if 
(PyTuple_GET_SIZE(__pyx_args) != 10) { goto __pyx_L5_argtuple_error; } else { values[0] = PyTuple_GET_ITEM(__pyx_args, 0); values[1] = PyTuple_GET_ITEM(__pyx_args, 1); values[2] = PyTuple_GET_ITEM(__pyx_args, 2); values[3] = PyTuple_GET_ITEM(__pyx_args, 3); values[4] = PyTuple_GET_ITEM(__pyx_args, 4); values[5] = PyTuple_GET_ITEM(__pyx_args, 5); values[6] = PyTuple_GET_ITEM(__pyx_args, 6); values[7] = PyTuple_GET_ITEM(__pyx_args, 7); values[8] = PyTuple_GET_ITEM(__pyx_args, 8); values[9] = PyTuple_GET_ITEM(__pyx_args, 9); } __pyx_v_method = values[0]; __pyx_v_path = values[1]; __pyx_v_version = values[2]; __pyx_v_headers = values[3]; __pyx_v_raw_headers = values[4]; __pyx_v_should_close = values[5]; __pyx_v_compression = values[6]; __pyx_v_upgrade = values[7]; __pyx_v_chunked = values[8]; __pyx_v_url = values[9]; } goto __pyx_L4_argument_unpacking_done; __pyx_L5_argtuple_error:; __Pyx_RaiseArgtupleInvalid("__init__", 1, 10, 10, PyTuple_GET_SIZE(__pyx_args)); __PYX_ERR(0, 105, __pyx_L3_error) __pyx_L3_error:; __Pyx_AddTraceback("aiohttp._http_parser.RawRequestMessage.__init__", __pyx_clineno, __pyx_lineno, __pyx_filename); __Pyx_RefNannyFinishContext(); return -1; __pyx_L4_argument_unpacking_done:; __pyx_r = __pyx_pf_7aiohttp_12_http_parser_17RawRequestMessage___init__(((struct __pyx_obj_7aiohttp_12_http_parser_RawRequestMessage *)__pyx_v_self), __pyx_v_method, __pyx_v_path, __pyx_v_version, __pyx_v_headers, __pyx_v_raw_headers, __pyx_v_should_close, __pyx_v_compression, __pyx_v_upgrade, __pyx_v_chunked, __pyx_v_url); /* function exit code */ __Pyx_RefNannyFinishContext(); return __pyx_r; } static int __pyx_pf_7aiohttp_12_http_parser_17RawRequestMessage___init__(struct __pyx_obj_7aiohttp_12_http_parser_RawRequestMessage *__pyx_v_self, PyObject *__pyx_v_method, PyObject *__pyx_v_path, PyObject *__pyx_v_version, PyObject *__pyx_v_headers, PyObject *__pyx_v_raw_headers, PyObject *__pyx_v_should_close, PyObject *__pyx_v_compression, PyObject *__pyx_v_upgrade, PyObject *__pyx_v_chunked, PyObject *__pyx_v_url) { int __pyx_r; __Pyx_RefNannyDeclarations PyObject *__pyx_t_1 = NULL; __Pyx_RefNannySetupContext("__init__", 0); /* "aiohttp/_http_parser.pyx":107 * def __init__(self, method, path, version, headers, raw_headers, * should_close, compression, upgrade, chunked, url): * self.method = method # <<<<<<<<<<<<<< * self.path = path * self.version = version */ if (!(likely(PyUnicode_CheckExact(__pyx_v_method))||((__pyx_v_method) == Py_None)||(PyErr_Format(PyExc_TypeError, "Expected %.16s, got %.200s", "unicode", Py_TYPE(__pyx_v_method)->tp_name), 0))) __PYX_ERR(0, 107, __pyx_L1_error) __pyx_t_1 = __pyx_v_method; __Pyx_INCREF(__pyx_t_1); __Pyx_GIVEREF(__pyx_t_1); __Pyx_GOTREF(__pyx_v_self->method); __Pyx_DECREF(__pyx_v_self->method); __pyx_v_self->method = ((PyObject*)__pyx_t_1); __pyx_t_1 = 0; /* "aiohttp/_http_parser.pyx":108 * should_close, compression, upgrade, chunked, url): * self.method = method * self.path = path # <<<<<<<<<<<<<< * self.version = version * self.headers = headers */ if (!(likely(PyUnicode_CheckExact(__pyx_v_path))||((__pyx_v_path) == Py_None)||(PyErr_Format(PyExc_TypeError, "Expected %.16s, got %.200s", "unicode", Py_TYPE(__pyx_v_path)->tp_name), 0))) __PYX_ERR(0, 108, __pyx_L1_error) __pyx_t_1 = __pyx_v_path; __Pyx_INCREF(__pyx_t_1); __Pyx_GIVEREF(__pyx_t_1); __Pyx_GOTREF(__pyx_v_self->path); __Pyx_DECREF(__pyx_v_self->path); __pyx_v_self->path = ((PyObject*)__pyx_t_1); __pyx_t_1 = 0; /* "aiohttp/_http_parser.pyx":109 * self.method = method * self.path = path * self.version = 
version # <<<<<<<<<<<<<< * self.headers = headers * self.raw_headers = raw_headers */ __Pyx_INCREF(__pyx_v_version); __Pyx_GIVEREF(__pyx_v_version); __Pyx_GOTREF(__pyx_v_self->version); __Pyx_DECREF(__pyx_v_self->version); __pyx_v_self->version = __pyx_v_version; /* "aiohttp/_http_parser.pyx":110 * self.path = path * self.version = version * self.headers = headers # <<<<<<<<<<<<<< * self.raw_headers = raw_headers * self.should_close = should_close */ __Pyx_INCREF(__pyx_v_headers); __Pyx_GIVEREF(__pyx_v_headers); __Pyx_GOTREF(__pyx_v_self->headers); __Pyx_DECREF(__pyx_v_self->headers); __pyx_v_self->headers = __pyx_v_headers; /* "aiohttp/_http_parser.pyx":111 * self.version = version * self.headers = headers * self.raw_headers = raw_headers # <<<<<<<<<<<<<< * self.should_close = should_close * self.compression = compression */ __Pyx_INCREF(__pyx_v_raw_headers); __Pyx_GIVEREF(__pyx_v_raw_headers); __Pyx_GOTREF(__pyx_v_self->raw_headers); __Pyx_DECREF(__pyx_v_self->raw_headers); __pyx_v_self->raw_headers = __pyx_v_raw_headers; /* "aiohttp/_http_parser.pyx":112 * self.headers = headers * self.raw_headers = raw_headers * self.should_close = should_close # <<<<<<<<<<<<<< * self.compression = compression * self.upgrade = upgrade */ __Pyx_INCREF(__pyx_v_should_close); __Pyx_GIVEREF(__pyx_v_should_close); __Pyx_GOTREF(__pyx_v_self->should_close); __Pyx_DECREF(__pyx_v_self->should_close); __pyx_v_self->should_close = __pyx_v_should_close; /* "aiohttp/_http_parser.pyx":113 * self.raw_headers = raw_headers * self.should_close = should_close * self.compression = compression # <<<<<<<<<<<<<< * self.upgrade = upgrade * self.chunked = chunked */ __Pyx_INCREF(__pyx_v_compression); __Pyx_GIVEREF(__pyx_v_compression); __Pyx_GOTREF(__pyx_v_self->compression); __Pyx_DECREF(__pyx_v_self->compression); __pyx_v_self->compression = __pyx_v_compression; /* "aiohttp/_http_parser.pyx":114 * self.should_close = should_close * self.compression = compression * self.upgrade = upgrade # <<<<<<<<<<<<<< * self.chunked = chunked * self.url = url */ __Pyx_INCREF(__pyx_v_upgrade); __Pyx_GIVEREF(__pyx_v_upgrade); __Pyx_GOTREF(__pyx_v_self->upgrade); __Pyx_DECREF(__pyx_v_self->upgrade); __pyx_v_self->upgrade = __pyx_v_upgrade; /* "aiohttp/_http_parser.pyx":115 * self.compression = compression * self.upgrade = upgrade * self.chunked = chunked # <<<<<<<<<<<<<< * self.url = url * */ __Pyx_INCREF(__pyx_v_chunked); __Pyx_GIVEREF(__pyx_v_chunked); __Pyx_GOTREF(__pyx_v_self->chunked); __Pyx_DECREF(__pyx_v_self->chunked); __pyx_v_self->chunked = __pyx_v_chunked; /* "aiohttp/_http_parser.pyx":116 * self.upgrade = upgrade * self.chunked = chunked * self.url = url # <<<<<<<<<<<<<< * * def __repr__(self): */ __Pyx_INCREF(__pyx_v_url); __Pyx_GIVEREF(__pyx_v_url); __Pyx_GOTREF(__pyx_v_self->url); __Pyx_DECREF(__pyx_v_self->url); __pyx_v_self->url = __pyx_v_url; /* "aiohttp/_http_parser.pyx":105 * cdef readonly object url # yarl.URL * * def __init__(self, method, path, version, headers, raw_headers, # <<<<<<<<<<<<<< * should_close, compression, upgrade, chunked, url): * self.method = method */ /* function exit code */ __pyx_r = 0; goto __pyx_L0; __pyx_L1_error:; __Pyx_XDECREF(__pyx_t_1); __Pyx_AddTraceback("aiohttp._http_parser.RawRequestMessage.__init__", __pyx_clineno, __pyx_lineno, __pyx_filename); __pyx_r = -1; __pyx_L0:; __Pyx_RefNannyFinishContext(); return __pyx_r; } /* "aiohttp/_http_parser.pyx":118 * self.url = url * * def __repr__(self): # <<<<<<<<<<<<<< * info = [] * info.append(("method", self.method)) */ /* Python wrapper */ 
static PyObject *__pyx_pw_7aiohttp_12_http_parser_17RawRequestMessage_3__repr__(PyObject *__pyx_v_self); /*proto*/ static PyObject *__pyx_pw_7aiohttp_12_http_parser_17RawRequestMessage_3__repr__(PyObject *__pyx_v_self) { PyObject *__pyx_r = 0; __Pyx_RefNannyDeclarations __Pyx_RefNannySetupContext("__repr__ (wrapper)", 0); __pyx_r = __pyx_pf_7aiohttp_12_http_parser_17RawRequestMessage_2__repr__(((struct __pyx_obj_7aiohttp_12_http_parser_RawRequestMessage *)__pyx_v_self)); /* function exit code */ __Pyx_RefNannyFinishContext(); return __pyx_r; } static PyObject *__pyx_gb_7aiohttp_12_http_parser_17RawRequestMessage_8__repr___2generator(__pyx_CoroutineObject *__pyx_generator, CYTHON_UNUSED PyThreadState *__pyx_tstate, PyObject *__pyx_sent_value); /* proto */ /* "aiohttp/_http_parser.pyx":130 * info.append(("chunked", self.chunked)) * info.append(("url", self.url)) * sinfo = ', '.join(name + '=' + repr(val) for name, val in info) # <<<<<<<<<<<<<< * return '' * */ static PyObject *__pyx_pf_7aiohttp_12_http_parser_17RawRequestMessage_8__repr___genexpr(PyObject *__pyx_self) { struct __pyx_obj_7aiohttp_12_http_parser___pyx_scope_struct_1_genexpr *__pyx_cur_scope; PyObject *__pyx_r = NULL; __Pyx_RefNannyDeclarations __Pyx_RefNannySetupContext("genexpr", 0); __pyx_cur_scope = (struct __pyx_obj_7aiohttp_12_http_parser___pyx_scope_struct_1_genexpr *)__pyx_tp_new_7aiohttp_12_http_parser___pyx_scope_struct_1_genexpr(__pyx_ptype_7aiohttp_12_http_parser___pyx_scope_struct_1_genexpr, __pyx_empty_tuple, NULL); if (unlikely(!__pyx_cur_scope)) { __pyx_cur_scope = ((struct __pyx_obj_7aiohttp_12_http_parser___pyx_scope_struct_1_genexpr *)Py_None); __Pyx_INCREF(Py_None); __PYX_ERR(0, 130, __pyx_L1_error) } else { __Pyx_GOTREF(__pyx_cur_scope); } __pyx_cur_scope->__pyx_outer_scope = (struct __pyx_obj_7aiohttp_12_http_parser___pyx_scope_struct____repr__ *) __pyx_self; __Pyx_INCREF(((PyObject *)__pyx_cur_scope->__pyx_outer_scope)); __Pyx_GIVEREF(__pyx_cur_scope->__pyx_outer_scope); { __pyx_CoroutineObject *gen = __Pyx_Generator_New((__pyx_coroutine_body_t) __pyx_gb_7aiohttp_12_http_parser_17RawRequestMessage_8__repr___2generator, NULL, (PyObject *) __pyx_cur_scope, __pyx_n_s_genexpr, __pyx_n_s_repr___locals_genexpr, __pyx_n_s_aiohttp__http_parser); if (unlikely(!gen)) __PYX_ERR(0, 130, __pyx_L1_error) __Pyx_DECREF(__pyx_cur_scope); __Pyx_RefNannyFinishContext(); return (PyObject *) gen; } /* function exit code */ __pyx_L1_error:; __Pyx_AddTraceback("aiohttp._http_parser.RawRequestMessage.__repr__.genexpr", __pyx_clineno, __pyx_lineno, __pyx_filename); __pyx_r = NULL; __Pyx_DECREF(((PyObject *)__pyx_cur_scope)); __Pyx_XGIVEREF(__pyx_r); __Pyx_RefNannyFinishContext(); return __pyx_r; } static PyObject *__pyx_gb_7aiohttp_12_http_parser_17RawRequestMessage_8__repr___2generator(__pyx_CoroutineObject *__pyx_generator, CYTHON_UNUSED PyThreadState *__pyx_tstate, PyObject *__pyx_sent_value) /* generator body */ { struct __pyx_obj_7aiohttp_12_http_parser___pyx_scope_struct_1_genexpr *__pyx_cur_scope = ((struct __pyx_obj_7aiohttp_12_http_parser___pyx_scope_struct_1_genexpr *)__pyx_generator->closure); PyObject *__pyx_r = NULL; PyObject *__pyx_t_1 = NULL; Py_ssize_t __pyx_t_2; PyObject *__pyx_t_3 = NULL; PyObject *__pyx_t_4 = NULL; PyObject *__pyx_t_5 = NULL; PyObject *__pyx_t_6 = NULL; PyObject *(*__pyx_t_7)(PyObject *); __Pyx_RefNannyDeclarations __Pyx_RefNannySetupContext("genexpr", 0); switch (__pyx_generator->resume_label) { case 0: goto __pyx_L3_first_run; default: /* CPython raises the right error here */ 
__Pyx_RefNannyFinishContext(); return NULL; } __pyx_L3_first_run:; if (unlikely(!__pyx_sent_value)) __PYX_ERR(0, 130, __pyx_L1_error) __pyx_r = PyList_New(0); if (unlikely(!__pyx_r)) __PYX_ERR(0, 130, __pyx_L1_error) __Pyx_GOTREF(__pyx_r); if (unlikely(!__pyx_cur_scope->__pyx_outer_scope->__pyx_v_info)) { __Pyx_RaiseClosureNameError("info"); __PYX_ERR(0, 130, __pyx_L1_error) } if (unlikely(__pyx_cur_scope->__pyx_outer_scope->__pyx_v_info == Py_None)) { PyErr_SetString(PyExc_TypeError, "'NoneType' object is not iterable"); __PYX_ERR(0, 130, __pyx_L1_error) } __pyx_t_1 = __pyx_cur_scope->__pyx_outer_scope->__pyx_v_info; __Pyx_INCREF(__pyx_t_1); __pyx_t_2 = 0; for (;;) { if (__pyx_t_2 >= PyList_GET_SIZE(__pyx_t_1)) break; #if CYTHON_ASSUME_SAFE_MACROS && !CYTHON_AVOID_BORROWED_REFS __pyx_t_3 = PyList_GET_ITEM(__pyx_t_1, __pyx_t_2); __Pyx_INCREF(__pyx_t_3); __pyx_t_2++; if (unlikely(0 < 0)) __PYX_ERR(0, 130, __pyx_L1_error) #else __pyx_t_3 = PySequence_ITEM(__pyx_t_1, __pyx_t_2); __pyx_t_2++; if (unlikely(!__pyx_t_3)) __PYX_ERR(0, 130, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_3); #endif if ((likely(PyTuple_CheckExact(__pyx_t_3))) || (PyList_CheckExact(__pyx_t_3))) { PyObject* sequence = __pyx_t_3; Py_ssize_t size = __Pyx_PySequence_SIZE(sequence); if (unlikely(size != 2)) { if (size > 2) __Pyx_RaiseTooManyValuesError(2); else if (size >= 0) __Pyx_RaiseNeedMoreValuesError(size); __PYX_ERR(0, 130, __pyx_L1_error) } #if CYTHON_ASSUME_SAFE_MACROS && !CYTHON_AVOID_BORROWED_REFS if (likely(PyTuple_CheckExact(sequence))) { __pyx_t_4 = PyTuple_GET_ITEM(sequence, 0); __pyx_t_5 = PyTuple_GET_ITEM(sequence, 1); } else { __pyx_t_4 = PyList_GET_ITEM(sequence, 0); __pyx_t_5 = PyList_GET_ITEM(sequence, 1); } __Pyx_INCREF(__pyx_t_4); __Pyx_INCREF(__pyx_t_5); #else __pyx_t_4 = PySequence_ITEM(sequence, 0); if (unlikely(!__pyx_t_4)) __PYX_ERR(0, 130, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_4); __pyx_t_5 = PySequence_ITEM(sequence, 1); if (unlikely(!__pyx_t_5)) __PYX_ERR(0, 130, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_5); #endif __Pyx_DECREF(__pyx_t_3); __pyx_t_3 = 0; } else { Py_ssize_t index = -1; __pyx_t_6 = PyObject_GetIter(__pyx_t_3); if (unlikely(!__pyx_t_6)) __PYX_ERR(0, 130, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_6); __Pyx_DECREF(__pyx_t_3); __pyx_t_3 = 0; __pyx_t_7 = Py_TYPE(__pyx_t_6)->tp_iternext; index = 0; __pyx_t_4 = __pyx_t_7(__pyx_t_6); if (unlikely(!__pyx_t_4)) goto __pyx_L6_unpacking_failed; __Pyx_GOTREF(__pyx_t_4); index = 1; __pyx_t_5 = __pyx_t_7(__pyx_t_6); if (unlikely(!__pyx_t_5)) goto __pyx_L6_unpacking_failed; __Pyx_GOTREF(__pyx_t_5); if (__Pyx_IternextUnpackEndCheck(__pyx_t_7(__pyx_t_6), 2) < 0) __PYX_ERR(0, 130, __pyx_L1_error) __pyx_t_7 = NULL; __Pyx_DECREF(__pyx_t_6); __pyx_t_6 = 0; goto __pyx_L7_unpacking_done; __pyx_L6_unpacking_failed:; __Pyx_DECREF(__pyx_t_6); __pyx_t_6 = 0; __pyx_t_7 = NULL; if (__Pyx_IterFinish() == 0) __Pyx_RaiseNeedMoreValuesError(index); __PYX_ERR(0, 130, __pyx_L1_error) __pyx_L7_unpacking_done:; } __Pyx_XGOTREF(__pyx_cur_scope->__pyx_v_name); __Pyx_XDECREF_SET(__pyx_cur_scope->__pyx_v_name, __pyx_t_4); __Pyx_GIVEREF(__pyx_t_4); __pyx_t_4 = 0; __Pyx_XGOTREF(__pyx_cur_scope->__pyx_v_val); __Pyx_XDECREF_SET(__pyx_cur_scope->__pyx_v_val, __pyx_t_5); __Pyx_GIVEREF(__pyx_t_5); __pyx_t_5 = 0; __pyx_t_3 = PyNumber_Add(__pyx_cur_scope->__pyx_v_name, __pyx_kp_u_); if (unlikely(!__pyx_t_3)) __PYX_ERR(0, 130, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_3); __pyx_t_5 = PyObject_Repr(__pyx_cur_scope->__pyx_v_val); if (unlikely(!__pyx_t_5)) __PYX_ERR(0, 130, __pyx_L1_error) 
__Pyx_GOTREF(__pyx_t_5); __pyx_t_4 = PyNumber_Add(__pyx_t_3, __pyx_t_5); if (unlikely(!__pyx_t_4)) __PYX_ERR(0, 130, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_4); __Pyx_DECREF(__pyx_t_3); __pyx_t_3 = 0; __Pyx_DECREF(__pyx_t_5); __pyx_t_5 = 0; if (unlikely(__Pyx_ListComp_Append(__pyx_r, (PyObject*)__pyx_t_4))) __PYX_ERR(0, 130, __pyx_L1_error) __Pyx_DECREF(__pyx_t_4); __pyx_t_4 = 0; } __Pyx_DECREF(__pyx_t_1); __pyx_t_1 = 0; CYTHON_MAYBE_UNUSED_VAR(__pyx_cur_scope); /* function exit code */ goto __pyx_L0; __pyx_L1_error:; __Pyx_XDECREF(__pyx_r); __pyx_r = 0; __Pyx_XDECREF(__pyx_t_1); __Pyx_XDECREF(__pyx_t_3); __Pyx_XDECREF(__pyx_t_4); __Pyx_XDECREF(__pyx_t_5); __Pyx_XDECREF(__pyx_t_6); __Pyx_AddTraceback("genexpr", __pyx_clineno, __pyx_lineno, __pyx_filename); __pyx_L0:; __Pyx_XGIVEREF(__pyx_r); #if !CYTHON_USE_EXC_INFO_STACK __Pyx_Coroutine_ResetAndClearException(__pyx_generator); #endif __pyx_generator->resume_label = -1; __Pyx_Coroutine_clear((PyObject*)__pyx_generator); __Pyx_RefNannyFinishContext(); return __pyx_r; } /* "aiohttp/_http_parser.pyx":118 * self.url = url * * def __repr__(self): # <<<<<<<<<<<<<< * info = [] * info.append(("method", self.method)) */ static PyObject *__pyx_pf_7aiohttp_12_http_parser_17RawRequestMessage_2__repr__(struct __pyx_obj_7aiohttp_12_http_parser_RawRequestMessage *__pyx_v_self) { struct __pyx_obj_7aiohttp_12_http_parser___pyx_scope_struct____repr__ *__pyx_cur_scope; PyObject *__pyx_v_sinfo = NULL; PyObject *__pyx_r = NULL; __Pyx_RefNannyDeclarations PyObject *__pyx_t_1 = NULL; int __pyx_t_2; PyObject *__pyx_t_3 = NULL; __Pyx_RefNannySetupContext("__repr__", 0); __pyx_cur_scope = (struct __pyx_obj_7aiohttp_12_http_parser___pyx_scope_struct____repr__ *)__pyx_tp_new_7aiohttp_12_http_parser___pyx_scope_struct____repr__(__pyx_ptype_7aiohttp_12_http_parser___pyx_scope_struct____repr__, __pyx_empty_tuple, NULL); if (unlikely(!__pyx_cur_scope)) { __pyx_cur_scope = ((struct __pyx_obj_7aiohttp_12_http_parser___pyx_scope_struct____repr__ *)Py_None); __Pyx_INCREF(Py_None); __PYX_ERR(0, 118, __pyx_L1_error) } else { __Pyx_GOTREF(__pyx_cur_scope); } /* "aiohttp/_http_parser.pyx":119 * * def __repr__(self): * info = [] # <<<<<<<<<<<<<< * info.append(("method", self.method)) * info.append(("path", self.path)) */ __pyx_t_1 = PyList_New(0); if (unlikely(!__pyx_t_1)) __PYX_ERR(0, 119, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_1); __Pyx_GIVEREF(__pyx_t_1); __pyx_cur_scope->__pyx_v_info = ((PyObject*)__pyx_t_1); __pyx_t_1 = 0; /* "aiohttp/_http_parser.pyx":120 * def __repr__(self): * info = [] * info.append(("method", self.method)) # <<<<<<<<<<<<<< * info.append(("path", self.path)) * info.append(("version", self.version)) */ __pyx_t_1 = PyTuple_New(2); if (unlikely(!__pyx_t_1)) __PYX_ERR(0, 120, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_1); __Pyx_INCREF(__pyx_n_u_method); __Pyx_GIVEREF(__pyx_n_u_method); PyTuple_SET_ITEM(__pyx_t_1, 0, __pyx_n_u_method); __Pyx_INCREF(__pyx_v_self->method); __Pyx_GIVEREF(__pyx_v_self->method); PyTuple_SET_ITEM(__pyx_t_1, 1, __pyx_v_self->method); __pyx_t_2 = __Pyx_PyList_Append(__pyx_cur_scope->__pyx_v_info, __pyx_t_1); if (unlikely(__pyx_t_2 == ((int)-1))) __PYX_ERR(0, 120, __pyx_L1_error) __Pyx_DECREF(__pyx_t_1); __pyx_t_1 = 0; /* "aiohttp/_http_parser.pyx":121 * info = [] * info.append(("method", self.method)) * info.append(("path", self.path)) # <<<<<<<<<<<<<< * info.append(("version", self.version)) * info.append(("headers", self.headers)) */ __pyx_t_1 = PyTuple_New(2); if (unlikely(!__pyx_t_1)) __PYX_ERR(0, 121, __pyx_L1_error) 
__Pyx_GOTREF(__pyx_t_1); __Pyx_INCREF(__pyx_n_u_path); __Pyx_GIVEREF(__pyx_n_u_path); PyTuple_SET_ITEM(__pyx_t_1, 0, __pyx_n_u_path); __Pyx_INCREF(__pyx_v_self->path); __Pyx_GIVEREF(__pyx_v_self->path); PyTuple_SET_ITEM(__pyx_t_1, 1, __pyx_v_self->path); __pyx_t_2 = __Pyx_PyList_Append(__pyx_cur_scope->__pyx_v_info, __pyx_t_1); if (unlikely(__pyx_t_2 == ((int)-1))) __PYX_ERR(0, 121, __pyx_L1_error) __Pyx_DECREF(__pyx_t_1); __pyx_t_1 = 0; /* "aiohttp/_http_parser.pyx":122 * info.append(("method", self.method)) * info.append(("path", self.path)) * info.append(("version", self.version)) # <<<<<<<<<<<<<< * info.append(("headers", self.headers)) * info.append(("raw_headers", self.raw_headers)) */ __pyx_t_1 = PyTuple_New(2); if (unlikely(!__pyx_t_1)) __PYX_ERR(0, 122, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_1); __Pyx_INCREF(__pyx_n_u_version); __Pyx_GIVEREF(__pyx_n_u_version); PyTuple_SET_ITEM(__pyx_t_1, 0, __pyx_n_u_version); __Pyx_INCREF(__pyx_v_self->version); __Pyx_GIVEREF(__pyx_v_self->version); PyTuple_SET_ITEM(__pyx_t_1, 1, __pyx_v_self->version); __pyx_t_2 = __Pyx_PyList_Append(__pyx_cur_scope->__pyx_v_info, __pyx_t_1); if (unlikely(__pyx_t_2 == ((int)-1))) __PYX_ERR(0, 122, __pyx_L1_error) __Pyx_DECREF(__pyx_t_1); __pyx_t_1 = 0; /* "aiohttp/_http_parser.pyx":123 * info.append(("path", self.path)) * info.append(("version", self.version)) * info.append(("headers", self.headers)) # <<<<<<<<<<<<<< * info.append(("raw_headers", self.raw_headers)) * info.append(("should_close", self.should_close)) */ __pyx_t_1 = PyTuple_New(2); if (unlikely(!__pyx_t_1)) __PYX_ERR(0, 123, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_1); __Pyx_INCREF(__pyx_n_u_headers); __Pyx_GIVEREF(__pyx_n_u_headers); PyTuple_SET_ITEM(__pyx_t_1, 0, __pyx_n_u_headers); __Pyx_INCREF(__pyx_v_self->headers); __Pyx_GIVEREF(__pyx_v_self->headers); PyTuple_SET_ITEM(__pyx_t_1, 1, __pyx_v_self->headers); __pyx_t_2 = __Pyx_PyList_Append(__pyx_cur_scope->__pyx_v_info, __pyx_t_1); if (unlikely(__pyx_t_2 == ((int)-1))) __PYX_ERR(0, 123, __pyx_L1_error) __Pyx_DECREF(__pyx_t_1); __pyx_t_1 = 0; /* "aiohttp/_http_parser.pyx":124 * info.append(("version", self.version)) * info.append(("headers", self.headers)) * info.append(("raw_headers", self.raw_headers)) # <<<<<<<<<<<<<< * info.append(("should_close", self.should_close)) * info.append(("compression", self.compression)) */ __pyx_t_1 = PyTuple_New(2); if (unlikely(!__pyx_t_1)) __PYX_ERR(0, 124, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_1); __Pyx_INCREF(__pyx_n_u_raw_headers); __Pyx_GIVEREF(__pyx_n_u_raw_headers); PyTuple_SET_ITEM(__pyx_t_1, 0, __pyx_n_u_raw_headers); __Pyx_INCREF(__pyx_v_self->raw_headers); __Pyx_GIVEREF(__pyx_v_self->raw_headers); PyTuple_SET_ITEM(__pyx_t_1, 1, __pyx_v_self->raw_headers); __pyx_t_2 = __Pyx_PyList_Append(__pyx_cur_scope->__pyx_v_info, __pyx_t_1); if (unlikely(__pyx_t_2 == ((int)-1))) __PYX_ERR(0, 124, __pyx_L1_error) __Pyx_DECREF(__pyx_t_1); __pyx_t_1 = 0; /* "aiohttp/_http_parser.pyx":125 * info.append(("headers", self.headers)) * info.append(("raw_headers", self.raw_headers)) * info.append(("should_close", self.should_close)) # <<<<<<<<<<<<<< * info.append(("compression", self.compression)) * info.append(("upgrade", self.upgrade)) */ __pyx_t_1 = PyTuple_New(2); if (unlikely(!__pyx_t_1)) __PYX_ERR(0, 125, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_1); __Pyx_INCREF(__pyx_n_u_should_close); __Pyx_GIVEREF(__pyx_n_u_should_close); PyTuple_SET_ITEM(__pyx_t_1, 0, __pyx_n_u_should_close); __Pyx_INCREF(__pyx_v_self->should_close); __Pyx_GIVEREF(__pyx_v_self->should_close); 
PyTuple_SET_ITEM(__pyx_t_1, 1, __pyx_v_self->should_close); __pyx_t_2 = __Pyx_PyList_Append(__pyx_cur_scope->__pyx_v_info, __pyx_t_1); if (unlikely(__pyx_t_2 == ((int)-1))) __PYX_ERR(0, 125, __pyx_L1_error) __Pyx_DECREF(__pyx_t_1); __pyx_t_1 = 0; /* "aiohttp/_http_parser.pyx":126 * info.append(("raw_headers", self.raw_headers)) * info.append(("should_close", self.should_close)) * info.append(("compression", self.compression)) # <<<<<<<<<<<<<< * info.append(("upgrade", self.upgrade)) * info.append(("chunked", self.chunked)) */ __pyx_t_1 = PyTuple_New(2); if (unlikely(!__pyx_t_1)) __PYX_ERR(0, 126, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_1); __Pyx_INCREF(__pyx_n_u_compression); __Pyx_GIVEREF(__pyx_n_u_compression); PyTuple_SET_ITEM(__pyx_t_1, 0, __pyx_n_u_compression); __Pyx_INCREF(__pyx_v_self->compression); __Pyx_GIVEREF(__pyx_v_self->compression); PyTuple_SET_ITEM(__pyx_t_1, 1, __pyx_v_self->compression); __pyx_t_2 = __Pyx_PyList_Append(__pyx_cur_scope->__pyx_v_info, __pyx_t_1); if (unlikely(__pyx_t_2 == ((int)-1))) __PYX_ERR(0, 126, __pyx_L1_error) __Pyx_DECREF(__pyx_t_1); __pyx_t_1 = 0; /* "aiohttp/_http_parser.pyx":127 * info.append(("should_close", self.should_close)) * info.append(("compression", self.compression)) * info.append(("upgrade", self.upgrade)) # <<<<<<<<<<<<<< * info.append(("chunked", self.chunked)) * info.append(("url", self.url)) */ __pyx_t_1 = PyTuple_New(2); if (unlikely(!__pyx_t_1)) __PYX_ERR(0, 127, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_1); __Pyx_INCREF(__pyx_n_u_upgrade); __Pyx_GIVEREF(__pyx_n_u_upgrade); PyTuple_SET_ITEM(__pyx_t_1, 0, __pyx_n_u_upgrade); __Pyx_INCREF(__pyx_v_self->upgrade); __Pyx_GIVEREF(__pyx_v_self->upgrade); PyTuple_SET_ITEM(__pyx_t_1, 1, __pyx_v_self->upgrade); __pyx_t_2 = __Pyx_PyList_Append(__pyx_cur_scope->__pyx_v_info, __pyx_t_1); if (unlikely(__pyx_t_2 == ((int)-1))) __PYX_ERR(0, 127, __pyx_L1_error) __Pyx_DECREF(__pyx_t_1); __pyx_t_1 = 0; /* "aiohttp/_http_parser.pyx":128 * info.append(("compression", self.compression)) * info.append(("upgrade", self.upgrade)) * info.append(("chunked", self.chunked)) # <<<<<<<<<<<<<< * info.append(("url", self.url)) * sinfo = ', '.join(name + '=' + repr(val) for name, val in info) */ __pyx_t_1 = PyTuple_New(2); if (unlikely(!__pyx_t_1)) __PYX_ERR(0, 128, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_1); __Pyx_INCREF(__pyx_n_u_chunked); __Pyx_GIVEREF(__pyx_n_u_chunked); PyTuple_SET_ITEM(__pyx_t_1, 0, __pyx_n_u_chunked); __Pyx_INCREF(__pyx_v_self->chunked); __Pyx_GIVEREF(__pyx_v_self->chunked); PyTuple_SET_ITEM(__pyx_t_1, 1, __pyx_v_self->chunked); __pyx_t_2 = __Pyx_PyList_Append(__pyx_cur_scope->__pyx_v_info, __pyx_t_1); if (unlikely(__pyx_t_2 == ((int)-1))) __PYX_ERR(0, 128, __pyx_L1_error) __Pyx_DECREF(__pyx_t_1); __pyx_t_1 = 0; /* "aiohttp/_http_parser.pyx":129 * info.append(("upgrade", self.upgrade)) * info.append(("chunked", self.chunked)) * info.append(("url", self.url)) # <<<<<<<<<<<<<< * sinfo = ', '.join(name + '=' + repr(val) for name, val in info) * return '' */ __pyx_t_1 = PyTuple_New(2); if (unlikely(!__pyx_t_1)) __PYX_ERR(0, 129, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_1); __Pyx_INCREF(__pyx_n_u_url); __Pyx_GIVEREF(__pyx_n_u_url); PyTuple_SET_ITEM(__pyx_t_1, 0, __pyx_n_u_url); __Pyx_INCREF(__pyx_v_self->url); __Pyx_GIVEREF(__pyx_v_self->url); PyTuple_SET_ITEM(__pyx_t_1, 1, __pyx_v_self->url); __pyx_t_2 = __Pyx_PyList_Append(__pyx_cur_scope->__pyx_v_info, __pyx_t_1); if (unlikely(__pyx_t_2 == ((int)-1))) __PYX_ERR(0, 129, __pyx_L1_error) __Pyx_DECREF(__pyx_t_1); __pyx_t_1 = 0; /* 
"aiohttp/_http_parser.pyx":130 * info.append(("chunked", self.chunked)) * info.append(("url", self.url)) * sinfo = ', '.join(name + '=' + repr(val) for name, val in info) # <<<<<<<<<<<<<< * return '' * */ __pyx_t_1 = __pyx_pf_7aiohttp_12_http_parser_17RawRequestMessage_8__repr___genexpr(((PyObject*)__pyx_cur_scope)); if (unlikely(!__pyx_t_1)) __PYX_ERR(0, 130, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_1); __pyx_t_3 = __Pyx_Generator_Next(__pyx_t_1); if (unlikely(!__pyx_t_3)) __PYX_ERR(0, 130, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_3); __Pyx_DECREF(__pyx_t_1); __pyx_t_1 = 0; __pyx_t_1 = PyUnicode_Join(__pyx_kp_u__2, __pyx_t_3); if (unlikely(!__pyx_t_1)) __PYX_ERR(0, 130, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_1); __Pyx_DECREF(__pyx_t_3); __pyx_t_3 = 0; __pyx_v_sinfo = ((PyObject*)__pyx_t_1); __pyx_t_1 = 0; /* "aiohttp/_http_parser.pyx":131 * info.append(("url", self.url)) * sinfo = ', '.join(name + '=' + repr(val) for name, val in info) * return '' # <<<<<<<<<<<<<< * * def _replace(self, **dct): */ __Pyx_XDECREF(__pyx_r); __pyx_t_1 = __Pyx_PyUnicode_ConcatSafe(__pyx_kp_u_RawRequestMessage, __pyx_v_sinfo); if (unlikely(!__pyx_t_1)) __PYX_ERR(0, 131, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_1); __pyx_t_3 = __Pyx_PyUnicode_Concat(__pyx_t_1, __pyx_kp_u__3); if (unlikely(!__pyx_t_3)) __PYX_ERR(0, 131, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_3); __Pyx_DECREF(__pyx_t_1); __pyx_t_1 = 0; __pyx_r = __pyx_t_3; __pyx_t_3 = 0; goto __pyx_L0; /* "aiohttp/_http_parser.pyx":118 * self.url = url * * def __repr__(self): # <<<<<<<<<<<<<< * info = [] * info.append(("method", self.method)) */ /* function exit code */ __pyx_L1_error:; __Pyx_XDECREF(__pyx_t_1); __Pyx_XDECREF(__pyx_t_3); __Pyx_AddTraceback("aiohttp._http_parser.RawRequestMessage.__repr__", __pyx_clineno, __pyx_lineno, __pyx_filename); __pyx_r = NULL; __pyx_L0:; __Pyx_XDECREF(__pyx_v_sinfo); __Pyx_DECREF(((PyObject *)__pyx_cur_scope)); __Pyx_XGIVEREF(__pyx_r); __Pyx_RefNannyFinishContext(); return __pyx_r; } /* "aiohttp/_http_parser.pyx":133 * return '' * * def _replace(self, **dct): # <<<<<<<<<<<<<< * cdef RawRequestMessage ret * ret = _new_request_message(self.method, */ /* Python wrapper */ static PyObject *__pyx_pw_7aiohttp_12_http_parser_17RawRequestMessage_5_replace(PyObject *__pyx_v_self, PyObject *__pyx_args, PyObject *__pyx_kwds); /*proto*/ static PyObject *__pyx_pw_7aiohttp_12_http_parser_17RawRequestMessage_5_replace(PyObject *__pyx_v_self, PyObject *__pyx_args, PyObject *__pyx_kwds) { PyObject *__pyx_v_dct = 0; PyObject *__pyx_r = 0; __Pyx_RefNannyDeclarations __Pyx_RefNannySetupContext("_replace (wrapper)", 0); if (unlikely(PyTuple_GET_SIZE(__pyx_args) > 0)) { __Pyx_RaiseArgtupleInvalid("_replace", 1, 0, 0, PyTuple_GET_SIZE(__pyx_args)); return NULL;} if (__pyx_kwds && unlikely(!__Pyx_CheckKeywordStrings(__pyx_kwds, "_replace", 1))) return NULL; __pyx_v_dct = (__pyx_kwds) ? 
PyDict_Copy(__pyx_kwds) : PyDict_New(); if (unlikely(!__pyx_v_dct)) return NULL; __Pyx_GOTREF(__pyx_v_dct); __pyx_r = __pyx_pf_7aiohttp_12_http_parser_17RawRequestMessage_4_replace(((struct __pyx_obj_7aiohttp_12_http_parser_RawRequestMessage *)__pyx_v_self), __pyx_v_dct); /* function exit code */ __Pyx_XDECREF(__pyx_v_dct); __Pyx_RefNannyFinishContext(); return __pyx_r; } static PyObject *__pyx_pf_7aiohttp_12_http_parser_17RawRequestMessage_4_replace(struct __pyx_obj_7aiohttp_12_http_parser_RawRequestMessage *__pyx_v_self, PyObject *__pyx_v_dct) { struct __pyx_obj_7aiohttp_12_http_parser_RawRequestMessage *__pyx_v_ret = 0; PyObject *__pyx_r = NULL; __Pyx_RefNannyDeclarations PyObject *__pyx_t_1 = NULL; PyObject *__pyx_t_2 = NULL; PyObject *__pyx_t_3 = NULL; PyObject *__pyx_t_4 = NULL; PyObject *__pyx_t_5 = NULL; int __pyx_t_6; PyObject *__pyx_t_7 = NULL; int __pyx_t_8; int __pyx_t_9; PyObject *__pyx_t_10 = NULL; PyObject *__pyx_t_11 = NULL; __Pyx_RefNannySetupContext("_replace", 0); /* "aiohttp/_http_parser.pyx":135 * def _replace(self, **dct): * cdef RawRequestMessage ret * ret = _new_request_message(self.method, # <<<<<<<<<<<<<< * self.path, * self.version, */ __pyx_t_1 = __pyx_v_self->method; __Pyx_INCREF(__pyx_t_1); /* "aiohttp/_http_parser.pyx":136 * cdef RawRequestMessage ret * ret = _new_request_message(self.method, * self.path, # <<<<<<<<<<<<<< * self.version, * self.headers, */ __pyx_t_2 = __pyx_v_self->path; __Pyx_INCREF(__pyx_t_2); /* "aiohttp/_http_parser.pyx":137 * ret = _new_request_message(self.method, * self.path, * self.version, # <<<<<<<<<<<<<< * self.headers, * self.raw_headers, */ __pyx_t_3 = __pyx_v_self->version; __Pyx_INCREF(__pyx_t_3); /* "aiohttp/_http_parser.pyx":138 * self.path, * self.version, * self.headers, # <<<<<<<<<<<<<< * self.raw_headers, * self.should_close, */ __pyx_t_4 = __pyx_v_self->headers; __Pyx_INCREF(__pyx_t_4); /* "aiohttp/_http_parser.pyx":139 * self.version, * self.headers, * self.raw_headers, # <<<<<<<<<<<<<< * self.should_close, * self.compression, */ __pyx_t_5 = __pyx_v_self->raw_headers; __Pyx_INCREF(__pyx_t_5); /* "aiohttp/_http_parser.pyx":140 * self.headers, * self.raw_headers, * self.should_close, # <<<<<<<<<<<<<< * self.compression, * self.upgrade, */ __pyx_t_6 = __Pyx_PyObject_IsTrue(__pyx_v_self->should_close); if (unlikely((__pyx_t_6 == (int)-1) && PyErr_Occurred())) __PYX_ERR(0, 140, __pyx_L1_error) /* "aiohttp/_http_parser.pyx":141 * self.raw_headers, * self.should_close, * self.compression, # <<<<<<<<<<<<<< * self.upgrade, * self.chunked, */ __pyx_t_7 = __pyx_v_self->compression; __Pyx_INCREF(__pyx_t_7); /* "aiohttp/_http_parser.pyx":142 * self.should_close, * self.compression, * self.upgrade, # <<<<<<<<<<<<<< * self.chunked, * self.url) */ __pyx_t_8 = __Pyx_PyObject_IsTrue(__pyx_v_self->upgrade); if (unlikely((__pyx_t_8 == (int)-1) && PyErr_Occurred())) __PYX_ERR(0, 142, __pyx_L1_error) /* "aiohttp/_http_parser.pyx":143 * self.compression, * self.upgrade, * self.chunked, # <<<<<<<<<<<<<< * self.url) * if "method" in dct: */ __pyx_t_9 = __Pyx_PyObject_IsTrue(__pyx_v_self->chunked); if (unlikely((__pyx_t_9 == (int)-1) && PyErr_Occurred())) __PYX_ERR(0, 143, __pyx_L1_error) /* "aiohttp/_http_parser.pyx":144 * self.upgrade, * self.chunked, * self.url) # <<<<<<<<<<<<<< * if "method" in dct: * ret.method = dct["method"] */ __pyx_t_10 = __pyx_v_self->url; __Pyx_INCREF(__pyx_t_10); /* "aiohttp/_http_parser.pyx":135 * def _replace(self, **dct): * cdef RawRequestMessage ret * ret = _new_request_message(self.method, # <<<<<<<<<<<<<< * 
self.path, * self.version, */ __pyx_t_11 = __pyx_f_7aiohttp_12_http_parser__new_request_message(((PyObject*)__pyx_t_1), ((PyObject*)__pyx_t_2), __pyx_t_3, __pyx_t_4, __pyx_t_5, __pyx_t_6, __pyx_t_7, __pyx_t_8, __pyx_t_9, __pyx_t_10); if (unlikely(!__pyx_t_11)) __PYX_ERR(0, 135, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_11); __Pyx_DECREF(__pyx_t_1); __pyx_t_1 = 0; __Pyx_DECREF(__pyx_t_2); __pyx_t_2 = 0; __Pyx_DECREF(__pyx_t_3); __pyx_t_3 = 0; __Pyx_DECREF(__pyx_t_4); __pyx_t_4 = 0; __Pyx_DECREF(__pyx_t_5); __pyx_t_5 = 0; __Pyx_DECREF(__pyx_t_7); __pyx_t_7 = 0; __Pyx_DECREF(__pyx_t_10); __pyx_t_10 = 0; if (!(likely(((__pyx_t_11) == Py_None) || likely(__Pyx_TypeTest(__pyx_t_11, __pyx_ptype_7aiohttp_12_http_parser_RawRequestMessage))))) __PYX_ERR(0, 135, __pyx_L1_error) __pyx_v_ret = ((struct __pyx_obj_7aiohttp_12_http_parser_RawRequestMessage *)__pyx_t_11); __pyx_t_11 = 0; /* "aiohttp/_http_parser.pyx":145 * self.chunked, * self.url) * if "method" in dct: # <<<<<<<<<<<<<< * ret.method = dct["method"] * if "path" in dct: */ __pyx_t_9 = (__Pyx_PyDict_ContainsTF(__pyx_n_u_method, __pyx_v_dct, Py_EQ)); if (unlikely(__pyx_t_9 < 0)) __PYX_ERR(0, 145, __pyx_L1_error) __pyx_t_8 = (__pyx_t_9 != 0); if (__pyx_t_8) { /* "aiohttp/_http_parser.pyx":146 * self.url) * if "method" in dct: * ret.method = dct["method"] # <<<<<<<<<<<<<< * if "path" in dct: * ret.path = dct["path"] */ __pyx_t_11 = __Pyx_PyDict_GetItem(__pyx_v_dct, __pyx_n_u_method); if (unlikely(!__pyx_t_11)) __PYX_ERR(0, 146, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_11); if (!(likely(PyUnicode_CheckExact(__pyx_t_11))||((__pyx_t_11) == Py_None)||(PyErr_Format(PyExc_TypeError, "Expected %.16s, got %.200s", "unicode", Py_TYPE(__pyx_t_11)->tp_name), 0))) __PYX_ERR(0, 146, __pyx_L1_error) __Pyx_GIVEREF(__pyx_t_11); __Pyx_GOTREF(__pyx_v_ret->method); __Pyx_DECREF(__pyx_v_ret->method); __pyx_v_ret->method = ((PyObject*)__pyx_t_11); __pyx_t_11 = 0; /* "aiohttp/_http_parser.pyx":145 * self.chunked, * self.url) * if "method" in dct: # <<<<<<<<<<<<<< * ret.method = dct["method"] * if "path" in dct: */ } /* "aiohttp/_http_parser.pyx":147 * if "method" in dct: * ret.method = dct["method"] * if "path" in dct: # <<<<<<<<<<<<<< * ret.path = dct["path"] * if "version" in dct: */ __pyx_t_8 = (__Pyx_PyDict_ContainsTF(__pyx_n_u_path, __pyx_v_dct, Py_EQ)); if (unlikely(__pyx_t_8 < 0)) __PYX_ERR(0, 147, __pyx_L1_error) __pyx_t_9 = (__pyx_t_8 != 0); if (__pyx_t_9) { /* "aiohttp/_http_parser.pyx":148 * ret.method = dct["method"] * if "path" in dct: * ret.path = dct["path"] # <<<<<<<<<<<<<< * if "version" in dct: * ret.version = dct["version"] */ __pyx_t_11 = __Pyx_PyDict_GetItem(__pyx_v_dct, __pyx_n_u_path); if (unlikely(!__pyx_t_11)) __PYX_ERR(0, 148, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_11); if (!(likely(PyUnicode_CheckExact(__pyx_t_11))||((__pyx_t_11) == Py_None)||(PyErr_Format(PyExc_TypeError, "Expected %.16s, got %.200s", "unicode", Py_TYPE(__pyx_t_11)->tp_name), 0))) __PYX_ERR(0, 148, __pyx_L1_error) __Pyx_GIVEREF(__pyx_t_11); __Pyx_GOTREF(__pyx_v_ret->path); __Pyx_DECREF(__pyx_v_ret->path); __pyx_v_ret->path = ((PyObject*)__pyx_t_11); __pyx_t_11 = 0; /* "aiohttp/_http_parser.pyx":147 * if "method" in dct: * ret.method = dct["method"] * if "path" in dct: # <<<<<<<<<<<<<< * ret.path = dct["path"] * if "version" in dct: */ } /* "aiohttp/_http_parser.pyx":149 * if "path" in dct: * ret.path = dct["path"] * if "version" in dct: # <<<<<<<<<<<<<< * ret.version = dct["version"] * if "headers" in dct: */ __pyx_t_9 = (__Pyx_PyDict_ContainsTF(__pyx_n_u_version, __pyx_v_dct, 
Py_EQ)); if (unlikely(__pyx_t_9 < 0)) __PYX_ERR(0, 149, __pyx_L1_error) __pyx_t_8 = (__pyx_t_9 != 0); if (__pyx_t_8) { /* "aiohttp/_http_parser.pyx":150 * ret.path = dct["path"] * if "version" in dct: * ret.version = dct["version"] # <<<<<<<<<<<<<< * if "headers" in dct: * ret.headers = dct["headers"] */ __pyx_t_11 = __Pyx_PyDict_GetItem(__pyx_v_dct, __pyx_n_u_version); if (unlikely(!__pyx_t_11)) __PYX_ERR(0, 150, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_11); __Pyx_GIVEREF(__pyx_t_11); __Pyx_GOTREF(__pyx_v_ret->version); __Pyx_DECREF(__pyx_v_ret->version); __pyx_v_ret->version = __pyx_t_11; __pyx_t_11 = 0; /* "aiohttp/_http_parser.pyx":149 * if "path" in dct: * ret.path = dct["path"] * if "version" in dct: # <<<<<<<<<<<<<< * ret.version = dct["version"] * if "headers" in dct: */ } /* "aiohttp/_http_parser.pyx":151 * if "version" in dct: * ret.version = dct["version"] * if "headers" in dct: # <<<<<<<<<<<<<< * ret.headers = dct["headers"] * if "raw_headers" in dct: */ __pyx_t_8 = (__Pyx_PyDict_ContainsTF(__pyx_n_u_headers, __pyx_v_dct, Py_EQ)); if (unlikely(__pyx_t_8 < 0)) __PYX_ERR(0, 151, __pyx_L1_error) __pyx_t_9 = (__pyx_t_8 != 0); if (__pyx_t_9) { /* "aiohttp/_http_parser.pyx":152 * ret.version = dct["version"] * if "headers" in dct: * ret.headers = dct["headers"] # <<<<<<<<<<<<<< * if "raw_headers" in dct: * ret.raw_headers = dct["raw_headers"] */ __pyx_t_11 = __Pyx_PyDict_GetItem(__pyx_v_dct, __pyx_n_u_headers); if (unlikely(!__pyx_t_11)) __PYX_ERR(0, 152, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_11); __Pyx_GIVEREF(__pyx_t_11); __Pyx_GOTREF(__pyx_v_ret->headers); __Pyx_DECREF(__pyx_v_ret->headers); __pyx_v_ret->headers = __pyx_t_11; __pyx_t_11 = 0; /* "aiohttp/_http_parser.pyx":151 * if "version" in dct: * ret.version = dct["version"] * if "headers" in dct: # <<<<<<<<<<<<<< * ret.headers = dct["headers"] * if "raw_headers" in dct: */ } /* "aiohttp/_http_parser.pyx":153 * if "headers" in dct: * ret.headers = dct["headers"] * if "raw_headers" in dct: # <<<<<<<<<<<<<< * ret.raw_headers = dct["raw_headers"] * if "should_close" in dct: */ __pyx_t_9 = (__Pyx_PyDict_ContainsTF(__pyx_n_u_raw_headers, __pyx_v_dct, Py_EQ)); if (unlikely(__pyx_t_9 < 0)) __PYX_ERR(0, 153, __pyx_L1_error) __pyx_t_8 = (__pyx_t_9 != 0); if (__pyx_t_8) { /* "aiohttp/_http_parser.pyx":154 * ret.headers = dct["headers"] * if "raw_headers" in dct: * ret.raw_headers = dct["raw_headers"] # <<<<<<<<<<<<<< * if "should_close" in dct: * ret.should_close = dct["should_close"] */ __pyx_t_11 = __Pyx_PyDict_GetItem(__pyx_v_dct, __pyx_n_u_raw_headers); if (unlikely(!__pyx_t_11)) __PYX_ERR(0, 154, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_11); __Pyx_GIVEREF(__pyx_t_11); __Pyx_GOTREF(__pyx_v_ret->raw_headers); __Pyx_DECREF(__pyx_v_ret->raw_headers); __pyx_v_ret->raw_headers = __pyx_t_11; __pyx_t_11 = 0; /* "aiohttp/_http_parser.pyx":153 * if "headers" in dct: * ret.headers = dct["headers"] * if "raw_headers" in dct: # <<<<<<<<<<<<<< * ret.raw_headers = dct["raw_headers"] * if "should_close" in dct: */ } /* "aiohttp/_http_parser.pyx":155 * if "raw_headers" in dct: * ret.raw_headers = dct["raw_headers"] * if "should_close" in dct: # <<<<<<<<<<<<<< * ret.should_close = dct["should_close"] * if "compression" in dct: */ __pyx_t_8 = (__Pyx_PyDict_ContainsTF(__pyx_n_u_should_close, __pyx_v_dct, Py_EQ)); if (unlikely(__pyx_t_8 < 0)) __PYX_ERR(0, 155, __pyx_L1_error) __pyx_t_9 = (__pyx_t_8 != 0); if (__pyx_t_9) { /* "aiohttp/_http_parser.pyx":156 * ret.raw_headers = dct["raw_headers"] * if "should_close" in dct: * ret.should_close = dct["should_close"] 
# <<<<<<<<<<<<<< * if "compression" in dct: * ret.compression = dct["compression"] */ __pyx_t_11 = __Pyx_PyDict_GetItem(__pyx_v_dct, __pyx_n_u_should_close); if (unlikely(!__pyx_t_11)) __PYX_ERR(0, 156, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_11); __Pyx_GIVEREF(__pyx_t_11); __Pyx_GOTREF(__pyx_v_ret->should_close); __Pyx_DECREF(__pyx_v_ret->should_close); __pyx_v_ret->should_close = __pyx_t_11; __pyx_t_11 = 0; /* "aiohttp/_http_parser.pyx":155 * if "raw_headers" in dct: * ret.raw_headers = dct["raw_headers"] * if "should_close" in dct: # <<<<<<<<<<<<<< * ret.should_close = dct["should_close"] * if "compression" in dct: */ } /* "aiohttp/_http_parser.pyx":157 * if "should_close" in dct: * ret.should_close = dct["should_close"] * if "compression" in dct: # <<<<<<<<<<<<<< * ret.compression = dct["compression"] * if "upgrade" in dct: */ __pyx_t_9 = (__Pyx_PyDict_ContainsTF(__pyx_n_u_compression, __pyx_v_dct, Py_EQ)); if (unlikely(__pyx_t_9 < 0)) __PYX_ERR(0, 157, __pyx_L1_error) __pyx_t_8 = (__pyx_t_9 != 0); if (__pyx_t_8) { /* "aiohttp/_http_parser.pyx":158 * ret.should_close = dct["should_close"] * if "compression" in dct: * ret.compression = dct["compression"] # <<<<<<<<<<<<<< * if "upgrade" in dct: * ret.upgrade = dct["upgrade"] */ __pyx_t_11 = __Pyx_PyDict_GetItem(__pyx_v_dct, __pyx_n_u_compression); if (unlikely(!__pyx_t_11)) __PYX_ERR(0, 158, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_11); __Pyx_GIVEREF(__pyx_t_11); __Pyx_GOTREF(__pyx_v_ret->compression); __Pyx_DECREF(__pyx_v_ret->compression); __pyx_v_ret->compression = __pyx_t_11; __pyx_t_11 = 0; /* "aiohttp/_http_parser.pyx":157 * if "should_close" in dct: * ret.should_close = dct["should_close"] * if "compression" in dct: # <<<<<<<<<<<<<< * ret.compression = dct["compression"] * if "upgrade" in dct: */ } /* "aiohttp/_http_parser.pyx":159 * if "compression" in dct: * ret.compression = dct["compression"] * if "upgrade" in dct: # <<<<<<<<<<<<<< * ret.upgrade = dct["upgrade"] * if "chunked" in dct: */ __pyx_t_8 = (__Pyx_PyDict_ContainsTF(__pyx_n_u_upgrade, __pyx_v_dct, Py_EQ)); if (unlikely(__pyx_t_8 < 0)) __PYX_ERR(0, 159, __pyx_L1_error) __pyx_t_9 = (__pyx_t_8 != 0); if (__pyx_t_9) { /* "aiohttp/_http_parser.pyx":160 * ret.compression = dct["compression"] * if "upgrade" in dct: * ret.upgrade = dct["upgrade"] # <<<<<<<<<<<<<< * if "chunked" in dct: * ret.chunked = dct["chunked"] */ __pyx_t_11 = __Pyx_PyDict_GetItem(__pyx_v_dct, __pyx_n_u_upgrade); if (unlikely(!__pyx_t_11)) __PYX_ERR(0, 160, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_11); __Pyx_GIVEREF(__pyx_t_11); __Pyx_GOTREF(__pyx_v_ret->upgrade); __Pyx_DECREF(__pyx_v_ret->upgrade); __pyx_v_ret->upgrade = __pyx_t_11; __pyx_t_11 = 0; /* "aiohttp/_http_parser.pyx":159 * if "compression" in dct: * ret.compression = dct["compression"] * if "upgrade" in dct: # <<<<<<<<<<<<<< * ret.upgrade = dct["upgrade"] * if "chunked" in dct: */ } /* "aiohttp/_http_parser.pyx":161 * if "upgrade" in dct: * ret.upgrade = dct["upgrade"] * if "chunked" in dct: # <<<<<<<<<<<<<< * ret.chunked = dct["chunked"] * if "url" in dct: */ __pyx_t_9 = (__Pyx_PyDict_ContainsTF(__pyx_n_u_chunked, __pyx_v_dct, Py_EQ)); if (unlikely(__pyx_t_9 < 0)) __PYX_ERR(0, 161, __pyx_L1_error) __pyx_t_8 = (__pyx_t_9 != 0); if (__pyx_t_8) { /* "aiohttp/_http_parser.pyx":162 * ret.upgrade = dct["upgrade"] * if "chunked" in dct: * ret.chunked = dct["chunked"] # <<<<<<<<<<<<<< * if "url" in dct: * ret.url = dct["url"] */ __pyx_t_11 = __Pyx_PyDict_GetItem(__pyx_v_dct, __pyx_n_u_chunked); if (unlikely(!__pyx_t_11)) __PYX_ERR(0, 162, __pyx_L1_error) 
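/*
 * Cython-level view of RawRequestMessage._replace, assembled from the .pyx
 * fragments quoted throughout this function (the trailing "url" branch and the
 * return statement follow just below). A sketch, not authoritative:
 *
 *     def _replace(self, **dct):
 *         cdef RawRequestMessage ret
 *         ret = _new_request_message(self.method, self.path, self.version,
 *                                    self.headers, self.raw_headers,
 *                                    self.should_close, self.compression,
 *                                    self.upgrade, self.chunked, self.url)
 *         if "method" in dct:
 *             ret.method = dct["method"]
 *         if "path" in dct:
 *             ret.path = dct["path"]
 *         if "version" in dct:
 *             ret.version = dct["version"]
 *         if "headers" in dct:
 *             ret.headers = dct["headers"]
 *         if "raw_headers" in dct:
 *             ret.raw_headers = dct["raw_headers"]
 *         if "should_close" in dct:
 *             ret.should_close = dct["should_close"]
 *         if "compression" in dct:
 *             ret.compression = dct["compression"]
 *         if "upgrade" in dct:
 *             ret.upgrade = dct["upgrade"]
 *         if "chunked" in dct:
 *             ret.chunked = dct["chunked"]
 *         if "url" in dct:
 *             ret.url = dct["url"]
 *         return ret
 */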
__Pyx_GOTREF(__pyx_t_11); __Pyx_GIVEREF(__pyx_t_11); __Pyx_GOTREF(__pyx_v_ret->chunked); __Pyx_DECREF(__pyx_v_ret->chunked); __pyx_v_ret->chunked = __pyx_t_11; __pyx_t_11 = 0; /* "aiohttp/_http_parser.pyx":161 * if "upgrade" in dct: * ret.upgrade = dct["upgrade"] * if "chunked" in dct: # <<<<<<<<<<<<<< * ret.chunked = dct["chunked"] * if "url" in dct: */ } /* "aiohttp/_http_parser.pyx":163 * if "chunked" in dct: * ret.chunked = dct["chunked"] * if "url" in dct: # <<<<<<<<<<<<<< * ret.url = dct["url"] * return ret */ __pyx_t_8 = (__Pyx_PyDict_ContainsTF(__pyx_n_u_url, __pyx_v_dct, Py_EQ)); if (unlikely(__pyx_t_8 < 0)) __PYX_ERR(0, 163, __pyx_L1_error) __pyx_t_9 = (__pyx_t_8 != 0); if (__pyx_t_9) { /* "aiohttp/_http_parser.pyx":164 * ret.chunked = dct["chunked"] * if "url" in dct: * ret.url = dct["url"] # <<<<<<<<<<<<<< * return ret * */ __pyx_t_11 = __Pyx_PyDict_GetItem(__pyx_v_dct, __pyx_n_u_url); if (unlikely(!__pyx_t_11)) __PYX_ERR(0, 164, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_11); __Pyx_GIVEREF(__pyx_t_11); __Pyx_GOTREF(__pyx_v_ret->url); __Pyx_DECREF(__pyx_v_ret->url); __pyx_v_ret->url = __pyx_t_11; __pyx_t_11 = 0; /* "aiohttp/_http_parser.pyx":163 * if "chunked" in dct: * ret.chunked = dct["chunked"] * if "url" in dct: # <<<<<<<<<<<<<< * ret.url = dct["url"] * return ret */ } /* "aiohttp/_http_parser.pyx":165 * if "url" in dct: * ret.url = dct["url"] * return ret # <<<<<<<<<<<<<< * * cdef _new_request_message(str method, */ __Pyx_XDECREF(__pyx_r); __Pyx_INCREF(((PyObject *)__pyx_v_ret)); __pyx_r = ((PyObject *)__pyx_v_ret); goto __pyx_L0; /* "aiohttp/_http_parser.pyx":133 * return '' * * def _replace(self, **dct): # <<<<<<<<<<<<<< * cdef RawRequestMessage ret * ret = _new_request_message(self.method, */ /* function exit code */ __pyx_L1_error:; __Pyx_XDECREF(__pyx_t_1); __Pyx_XDECREF(__pyx_t_2); __Pyx_XDECREF(__pyx_t_3); __Pyx_XDECREF(__pyx_t_4); __Pyx_XDECREF(__pyx_t_5); __Pyx_XDECREF(__pyx_t_7); __Pyx_XDECREF(__pyx_t_10); __Pyx_XDECREF(__pyx_t_11); __Pyx_AddTraceback("aiohttp._http_parser.RawRequestMessage._replace", __pyx_clineno, __pyx_lineno, __pyx_filename); __pyx_r = NULL; __pyx_L0:; __Pyx_XDECREF((PyObject *)__pyx_v_ret); __Pyx_XGIVEREF(__pyx_r); __Pyx_RefNannyFinishContext(); return __pyx_r; } /* "aiohttp/_http_parser.pyx":94 * @cython.freelist(DEFAULT_FREELIST_SIZE) * cdef class RawRequestMessage: * cdef readonly str method # <<<<<<<<<<<<<< * cdef readonly str path * cdef readonly object version # HttpVersion */ /* Python wrapper */ static PyObject *__pyx_pw_7aiohttp_12_http_parser_17RawRequestMessage_6method_1__get__(PyObject *__pyx_v_self); /*proto*/ static PyObject *__pyx_pw_7aiohttp_12_http_parser_17RawRequestMessage_6method_1__get__(PyObject *__pyx_v_self) { PyObject *__pyx_r = 0; __Pyx_RefNannyDeclarations __Pyx_RefNannySetupContext("__get__ (wrapper)", 0); __pyx_r = __pyx_pf_7aiohttp_12_http_parser_17RawRequestMessage_6method___get__(((struct __pyx_obj_7aiohttp_12_http_parser_RawRequestMessage *)__pyx_v_self)); /* function exit code */ __Pyx_RefNannyFinishContext(); return __pyx_r; } static PyObject *__pyx_pf_7aiohttp_12_http_parser_17RawRequestMessage_6method___get__(struct __pyx_obj_7aiohttp_12_http_parser_RawRequestMessage *__pyx_v_self) { PyObject *__pyx_r = NULL; __Pyx_RefNannyDeclarations __Pyx_RefNannySetupContext("__get__", 0); __Pyx_XDECREF(__pyx_r); __Pyx_INCREF(__pyx_v_self->method); __pyx_r = __pyx_v_self->method; goto __pyx_L0; /* function exit code */ __pyx_L0:; __Pyx_XGIVEREF(__pyx_r); __Pyx_RefNannyFinishContext(); return __pyx_r; } /* 
"aiohttp/_http_parser.pyx":95 * cdef class RawRequestMessage: * cdef readonly str method * cdef readonly str path # <<<<<<<<<<<<<< * cdef readonly object version # HttpVersion * cdef readonly object headers # CIMultiDict */ /* Python wrapper */ static PyObject *__pyx_pw_7aiohttp_12_http_parser_17RawRequestMessage_4path_1__get__(PyObject *__pyx_v_self); /*proto*/ static PyObject *__pyx_pw_7aiohttp_12_http_parser_17RawRequestMessage_4path_1__get__(PyObject *__pyx_v_self) { PyObject *__pyx_r = 0; __Pyx_RefNannyDeclarations __Pyx_RefNannySetupContext("__get__ (wrapper)", 0); __pyx_r = __pyx_pf_7aiohttp_12_http_parser_17RawRequestMessage_4path___get__(((struct __pyx_obj_7aiohttp_12_http_parser_RawRequestMessage *)__pyx_v_self)); /* function exit code */ __Pyx_RefNannyFinishContext(); return __pyx_r; } static PyObject *__pyx_pf_7aiohttp_12_http_parser_17RawRequestMessage_4path___get__(struct __pyx_obj_7aiohttp_12_http_parser_RawRequestMessage *__pyx_v_self) { PyObject *__pyx_r = NULL; __Pyx_RefNannyDeclarations __Pyx_RefNannySetupContext("__get__", 0); __Pyx_XDECREF(__pyx_r); __Pyx_INCREF(__pyx_v_self->path); __pyx_r = __pyx_v_self->path; goto __pyx_L0; /* function exit code */ __pyx_L0:; __Pyx_XGIVEREF(__pyx_r); __Pyx_RefNannyFinishContext(); return __pyx_r; } /* "aiohttp/_http_parser.pyx":96 * cdef readonly str method * cdef readonly str path * cdef readonly object version # HttpVersion # <<<<<<<<<<<<<< * cdef readonly object headers # CIMultiDict * cdef readonly object raw_headers # tuple */ /* Python wrapper */ static PyObject *__pyx_pw_7aiohttp_12_http_parser_17RawRequestMessage_7version_1__get__(PyObject *__pyx_v_self); /*proto*/ static PyObject *__pyx_pw_7aiohttp_12_http_parser_17RawRequestMessage_7version_1__get__(PyObject *__pyx_v_self) { PyObject *__pyx_r = 0; __Pyx_RefNannyDeclarations __Pyx_RefNannySetupContext("__get__ (wrapper)", 0); __pyx_r = __pyx_pf_7aiohttp_12_http_parser_17RawRequestMessage_7version___get__(((struct __pyx_obj_7aiohttp_12_http_parser_RawRequestMessage *)__pyx_v_self)); /* function exit code */ __Pyx_RefNannyFinishContext(); return __pyx_r; } static PyObject *__pyx_pf_7aiohttp_12_http_parser_17RawRequestMessage_7version___get__(struct __pyx_obj_7aiohttp_12_http_parser_RawRequestMessage *__pyx_v_self) { PyObject *__pyx_r = NULL; __Pyx_RefNannyDeclarations __Pyx_RefNannySetupContext("__get__", 0); __Pyx_XDECREF(__pyx_r); __Pyx_INCREF(__pyx_v_self->version); __pyx_r = __pyx_v_self->version; goto __pyx_L0; /* function exit code */ __pyx_L0:; __Pyx_XGIVEREF(__pyx_r); __Pyx_RefNannyFinishContext(); return __pyx_r; } /* "aiohttp/_http_parser.pyx":97 * cdef readonly str path * cdef readonly object version # HttpVersion * cdef readonly object headers # CIMultiDict # <<<<<<<<<<<<<< * cdef readonly object raw_headers # tuple * cdef readonly object should_close */ /* Python wrapper */ static PyObject *__pyx_pw_7aiohttp_12_http_parser_17RawRequestMessage_7headers_1__get__(PyObject *__pyx_v_self); /*proto*/ static PyObject *__pyx_pw_7aiohttp_12_http_parser_17RawRequestMessage_7headers_1__get__(PyObject *__pyx_v_self) { PyObject *__pyx_r = 0; __Pyx_RefNannyDeclarations __Pyx_RefNannySetupContext("__get__ (wrapper)", 0); __pyx_r = __pyx_pf_7aiohttp_12_http_parser_17RawRequestMessage_7headers___get__(((struct __pyx_obj_7aiohttp_12_http_parser_RawRequestMessage *)__pyx_v_self)); /* function exit code */ __Pyx_RefNannyFinishContext(); return __pyx_r; } static PyObject *__pyx_pf_7aiohttp_12_http_parser_17RawRequestMessage_7headers___get__(struct 
__pyx_obj_7aiohttp_12_http_parser_RawRequestMessage *__pyx_v_self) { PyObject *__pyx_r = NULL; __Pyx_RefNannyDeclarations __Pyx_RefNannySetupContext("__get__", 0); __Pyx_XDECREF(__pyx_r); __Pyx_INCREF(__pyx_v_self->headers); __pyx_r = __pyx_v_self->headers; goto __pyx_L0; /* function exit code */ __pyx_L0:; __Pyx_XGIVEREF(__pyx_r); __Pyx_RefNannyFinishContext(); return __pyx_r; } /* "aiohttp/_http_parser.pyx":98 * cdef readonly object version # HttpVersion * cdef readonly object headers # CIMultiDict * cdef readonly object raw_headers # tuple # <<<<<<<<<<<<<< * cdef readonly object should_close * cdef readonly object compression */ /* Python wrapper */ static PyObject *__pyx_pw_7aiohttp_12_http_parser_17RawRequestMessage_11raw_headers_1__get__(PyObject *__pyx_v_self); /*proto*/ static PyObject *__pyx_pw_7aiohttp_12_http_parser_17RawRequestMessage_11raw_headers_1__get__(PyObject *__pyx_v_self) { PyObject *__pyx_r = 0; __Pyx_RefNannyDeclarations __Pyx_RefNannySetupContext("__get__ (wrapper)", 0); __pyx_r = __pyx_pf_7aiohttp_12_http_parser_17RawRequestMessage_11raw_headers___get__(((struct __pyx_obj_7aiohttp_12_http_parser_RawRequestMessage *)__pyx_v_self)); /* function exit code */ __Pyx_RefNannyFinishContext(); return __pyx_r; } static PyObject *__pyx_pf_7aiohttp_12_http_parser_17RawRequestMessage_11raw_headers___get__(struct __pyx_obj_7aiohttp_12_http_parser_RawRequestMessage *__pyx_v_self) { PyObject *__pyx_r = NULL; __Pyx_RefNannyDeclarations __Pyx_RefNannySetupContext("__get__", 0); __Pyx_XDECREF(__pyx_r); __Pyx_INCREF(__pyx_v_self->raw_headers); __pyx_r = __pyx_v_self->raw_headers; goto __pyx_L0; /* function exit code */ __pyx_L0:; __Pyx_XGIVEREF(__pyx_r); __Pyx_RefNannyFinishContext(); return __pyx_r; } /* "aiohttp/_http_parser.pyx":99 * cdef readonly object headers # CIMultiDict * cdef readonly object raw_headers # tuple * cdef readonly object should_close # <<<<<<<<<<<<<< * cdef readonly object compression * cdef readonly object upgrade */ /* Python wrapper */ static PyObject *__pyx_pw_7aiohttp_12_http_parser_17RawRequestMessage_12should_close_1__get__(PyObject *__pyx_v_self); /*proto*/ static PyObject *__pyx_pw_7aiohttp_12_http_parser_17RawRequestMessage_12should_close_1__get__(PyObject *__pyx_v_self) { PyObject *__pyx_r = 0; __Pyx_RefNannyDeclarations __Pyx_RefNannySetupContext("__get__ (wrapper)", 0); __pyx_r = __pyx_pf_7aiohttp_12_http_parser_17RawRequestMessage_12should_close___get__(((struct __pyx_obj_7aiohttp_12_http_parser_RawRequestMessage *)__pyx_v_self)); /* function exit code */ __Pyx_RefNannyFinishContext(); return __pyx_r; } static PyObject *__pyx_pf_7aiohttp_12_http_parser_17RawRequestMessage_12should_close___get__(struct __pyx_obj_7aiohttp_12_http_parser_RawRequestMessage *__pyx_v_self) { PyObject *__pyx_r = NULL; __Pyx_RefNannyDeclarations __Pyx_RefNannySetupContext("__get__", 0); __Pyx_XDECREF(__pyx_r); __Pyx_INCREF(__pyx_v_self->should_close); __pyx_r = __pyx_v_self->should_close; goto __pyx_L0; /* function exit code */ __pyx_L0:; __Pyx_XGIVEREF(__pyx_r); __Pyx_RefNannyFinishContext(); return __pyx_r; } /* "aiohttp/_http_parser.pyx":100 * cdef readonly object raw_headers # tuple * cdef readonly object should_close * cdef readonly object compression # <<<<<<<<<<<<<< * cdef readonly object upgrade * cdef readonly object chunked */ /* Python wrapper */ static PyObject *__pyx_pw_7aiohttp_12_http_parser_17RawRequestMessage_11compression_1__get__(PyObject *__pyx_v_self); /*proto*/ static PyObject 
*__pyx_pw_7aiohttp_12_http_parser_17RawRequestMessage_11compression_1__get__(PyObject *__pyx_v_self) { PyObject *__pyx_r = 0; __Pyx_RefNannyDeclarations __Pyx_RefNannySetupContext("__get__ (wrapper)", 0); __pyx_r = __pyx_pf_7aiohttp_12_http_parser_17RawRequestMessage_11compression___get__(((struct __pyx_obj_7aiohttp_12_http_parser_RawRequestMessage *)__pyx_v_self)); /* function exit code */ __Pyx_RefNannyFinishContext(); return __pyx_r; } static PyObject *__pyx_pf_7aiohttp_12_http_parser_17RawRequestMessage_11compression___get__(struct __pyx_obj_7aiohttp_12_http_parser_RawRequestMessage *__pyx_v_self) { PyObject *__pyx_r = NULL; __Pyx_RefNannyDeclarations __Pyx_RefNannySetupContext("__get__", 0); __Pyx_XDECREF(__pyx_r); __Pyx_INCREF(__pyx_v_self->compression); __pyx_r = __pyx_v_self->compression; goto __pyx_L0; /* function exit code */ __pyx_L0:; __Pyx_XGIVEREF(__pyx_r); __Pyx_RefNannyFinishContext(); return __pyx_r; } /* "aiohttp/_http_parser.pyx":101 * cdef readonly object should_close * cdef readonly object compression * cdef readonly object upgrade # <<<<<<<<<<<<<< * cdef readonly object chunked * cdef readonly object url # yarl.URL */ /* Python wrapper */ static PyObject *__pyx_pw_7aiohttp_12_http_parser_17RawRequestMessage_7upgrade_1__get__(PyObject *__pyx_v_self); /*proto*/ static PyObject *__pyx_pw_7aiohttp_12_http_parser_17RawRequestMessage_7upgrade_1__get__(PyObject *__pyx_v_self) { PyObject *__pyx_r = 0; __Pyx_RefNannyDeclarations __Pyx_RefNannySetupContext("__get__ (wrapper)", 0); __pyx_r = __pyx_pf_7aiohttp_12_http_parser_17RawRequestMessage_7upgrade___get__(((struct __pyx_obj_7aiohttp_12_http_parser_RawRequestMessage *)__pyx_v_self)); /* function exit code */ __Pyx_RefNannyFinishContext(); return __pyx_r; } static PyObject *__pyx_pf_7aiohttp_12_http_parser_17RawRequestMessage_7upgrade___get__(struct __pyx_obj_7aiohttp_12_http_parser_RawRequestMessage *__pyx_v_self) { PyObject *__pyx_r = NULL; __Pyx_RefNannyDeclarations __Pyx_RefNannySetupContext("__get__", 0); __Pyx_XDECREF(__pyx_r); __Pyx_INCREF(__pyx_v_self->upgrade); __pyx_r = __pyx_v_self->upgrade; goto __pyx_L0; /* function exit code */ __pyx_L0:; __Pyx_XGIVEREF(__pyx_r); __Pyx_RefNannyFinishContext(); return __pyx_r; } /* "aiohttp/_http_parser.pyx":102 * cdef readonly object compression * cdef readonly object upgrade * cdef readonly object chunked # <<<<<<<<<<<<<< * cdef readonly object url # yarl.URL * */ /* Python wrapper */ static PyObject *__pyx_pw_7aiohttp_12_http_parser_17RawRequestMessage_7chunked_1__get__(PyObject *__pyx_v_self); /*proto*/ static PyObject *__pyx_pw_7aiohttp_12_http_parser_17RawRequestMessage_7chunked_1__get__(PyObject *__pyx_v_self) { PyObject *__pyx_r = 0; __Pyx_RefNannyDeclarations __Pyx_RefNannySetupContext("__get__ (wrapper)", 0); __pyx_r = __pyx_pf_7aiohttp_12_http_parser_17RawRequestMessage_7chunked___get__(((struct __pyx_obj_7aiohttp_12_http_parser_RawRequestMessage *)__pyx_v_self)); /* function exit code */ __Pyx_RefNannyFinishContext(); return __pyx_r; } static PyObject *__pyx_pf_7aiohttp_12_http_parser_17RawRequestMessage_7chunked___get__(struct __pyx_obj_7aiohttp_12_http_parser_RawRequestMessage *__pyx_v_self) { PyObject *__pyx_r = NULL; __Pyx_RefNannyDeclarations __Pyx_RefNannySetupContext("__get__", 0); __Pyx_XDECREF(__pyx_r); __Pyx_INCREF(__pyx_v_self->chunked); __pyx_r = __pyx_v_self->chunked; goto __pyx_L0; /* function exit code */ __pyx_L0:; __Pyx_XGIVEREF(__pyx_r); __Pyx_RefNannyFinishContext(); return __pyx_r; } /* "aiohttp/_http_parser.pyx":103 * cdef readonly object upgrade 
* cdef readonly object chunked * cdef readonly object url # yarl.URL # <<<<<<<<<<<<<< * * def __init__(self, method, path, version, headers, raw_headers, */ /* Python wrapper */ static PyObject *__pyx_pw_7aiohttp_12_http_parser_17RawRequestMessage_3url_1__get__(PyObject *__pyx_v_self); /*proto*/ static PyObject *__pyx_pw_7aiohttp_12_http_parser_17RawRequestMessage_3url_1__get__(PyObject *__pyx_v_self) { PyObject *__pyx_r = 0; __Pyx_RefNannyDeclarations __Pyx_RefNannySetupContext("__get__ (wrapper)", 0); __pyx_r = __pyx_pf_7aiohttp_12_http_parser_17RawRequestMessage_3url___get__(((struct __pyx_obj_7aiohttp_12_http_parser_RawRequestMessage *)__pyx_v_self)); /* function exit code */ __Pyx_RefNannyFinishContext(); return __pyx_r; } static PyObject *__pyx_pf_7aiohttp_12_http_parser_17RawRequestMessage_3url___get__(struct __pyx_obj_7aiohttp_12_http_parser_RawRequestMessage *__pyx_v_self) { PyObject *__pyx_r = NULL; __Pyx_RefNannyDeclarations __Pyx_RefNannySetupContext("__get__", 0); __Pyx_XDECREF(__pyx_r); __Pyx_INCREF(__pyx_v_self->url); __pyx_r = __pyx_v_self->url; goto __pyx_L0; /* function exit code */ __pyx_L0:; __Pyx_XGIVEREF(__pyx_r); __Pyx_RefNannyFinishContext(); return __pyx_r; } /* "(tree fragment)":1 * def __reduce_cython__(self): # <<<<<<<<<<<<<< * cdef tuple state * cdef object _dict */ /* Python wrapper */ static PyObject *__pyx_pw_7aiohttp_12_http_parser_17RawRequestMessage_7__reduce_cython__(PyObject *__pyx_v_self, CYTHON_UNUSED PyObject *unused); /*proto*/ static PyObject *__pyx_pw_7aiohttp_12_http_parser_17RawRequestMessage_7__reduce_cython__(PyObject *__pyx_v_self, CYTHON_UNUSED PyObject *unused) { PyObject *__pyx_r = 0; __Pyx_RefNannyDeclarations __Pyx_RefNannySetupContext("__reduce_cython__ (wrapper)", 0); __pyx_r = __pyx_pf_7aiohttp_12_http_parser_17RawRequestMessage_6__reduce_cython__(((struct __pyx_obj_7aiohttp_12_http_parser_RawRequestMessage *)__pyx_v_self)); /* function exit code */ __Pyx_RefNannyFinishContext(); return __pyx_r; } static PyObject *__pyx_pf_7aiohttp_12_http_parser_17RawRequestMessage_6__reduce_cython__(struct __pyx_obj_7aiohttp_12_http_parser_RawRequestMessage *__pyx_v_self) { PyObject *__pyx_v_state = 0; PyObject *__pyx_v__dict = 0; int __pyx_v_use_setstate; PyObject *__pyx_r = NULL; __Pyx_RefNannyDeclarations PyObject *__pyx_t_1 = NULL; int __pyx_t_2; int __pyx_t_3; PyObject *__pyx_t_4 = NULL; int __pyx_t_5; PyObject *__pyx_t_6 = NULL; __Pyx_RefNannySetupContext("__reduce_cython__", 0); /* "(tree fragment)":5 * cdef object _dict * cdef bint use_setstate * state = (self.chunked, self.compression, self.headers, self.method, self.path, self.raw_headers, self.should_close, self.upgrade, self.url, self.version) # <<<<<<<<<<<<<< * _dict = getattr(self, '__dict__', None) * if _dict is not None: */ __pyx_t_1 = PyTuple_New(10); if (unlikely(!__pyx_t_1)) __PYX_ERR(1, 5, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_1); __Pyx_INCREF(__pyx_v_self->chunked); __Pyx_GIVEREF(__pyx_v_self->chunked); PyTuple_SET_ITEM(__pyx_t_1, 0, __pyx_v_self->chunked); __Pyx_INCREF(__pyx_v_self->compression); __Pyx_GIVEREF(__pyx_v_self->compression); PyTuple_SET_ITEM(__pyx_t_1, 1, __pyx_v_self->compression); __Pyx_INCREF(__pyx_v_self->headers); __Pyx_GIVEREF(__pyx_v_self->headers); PyTuple_SET_ITEM(__pyx_t_1, 2, __pyx_v_self->headers); __Pyx_INCREF(__pyx_v_self->method); __Pyx_GIVEREF(__pyx_v_self->method); PyTuple_SET_ITEM(__pyx_t_1, 3, __pyx_v_self->method); __Pyx_INCREF(__pyx_v_self->path); __Pyx_GIVEREF(__pyx_v_self->path); PyTuple_SET_ITEM(__pyx_t_1, 4, __pyx_v_self->path); 
__Pyx_INCREF(__pyx_v_self->raw_headers); __Pyx_GIVEREF(__pyx_v_self->raw_headers); PyTuple_SET_ITEM(__pyx_t_1, 5, __pyx_v_self->raw_headers); __Pyx_INCREF(__pyx_v_self->should_close); __Pyx_GIVEREF(__pyx_v_self->should_close); PyTuple_SET_ITEM(__pyx_t_1, 6, __pyx_v_self->should_close); __Pyx_INCREF(__pyx_v_self->upgrade); __Pyx_GIVEREF(__pyx_v_self->upgrade); PyTuple_SET_ITEM(__pyx_t_1, 7, __pyx_v_self->upgrade); __Pyx_INCREF(__pyx_v_self->url); __Pyx_GIVEREF(__pyx_v_self->url); PyTuple_SET_ITEM(__pyx_t_1, 8, __pyx_v_self->url); __Pyx_INCREF(__pyx_v_self->version); __Pyx_GIVEREF(__pyx_v_self->version); PyTuple_SET_ITEM(__pyx_t_1, 9, __pyx_v_self->version); __pyx_v_state = ((PyObject*)__pyx_t_1); __pyx_t_1 = 0; /* "(tree fragment)":6 * cdef bint use_setstate * state = (self.chunked, self.compression, self.headers, self.method, self.path, self.raw_headers, self.should_close, self.upgrade, self.url, self.version) * _dict = getattr(self, '__dict__', None) # <<<<<<<<<<<<<< * if _dict is not None: * state += (_dict,) */ __pyx_t_1 = __Pyx_GetAttr3(((PyObject *)__pyx_v_self), __pyx_n_s_dict, Py_None); if (unlikely(!__pyx_t_1)) __PYX_ERR(1, 6, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_1); __pyx_v__dict = __pyx_t_1; __pyx_t_1 = 0; /* "(tree fragment)":7 * state = (self.chunked, self.compression, self.headers, self.method, self.path, self.raw_headers, self.should_close, self.upgrade, self.url, self.version) * _dict = getattr(self, '__dict__', None) * if _dict is not None: # <<<<<<<<<<<<<< * state += (_dict,) * use_setstate = True */ __pyx_t_2 = (__pyx_v__dict != Py_None); __pyx_t_3 = (__pyx_t_2 != 0); if (__pyx_t_3) { /* "(tree fragment)":8 * _dict = getattr(self, '__dict__', None) * if _dict is not None: * state += (_dict,) # <<<<<<<<<<<<<< * use_setstate = True * else: */ __pyx_t_1 = PyTuple_New(1); if (unlikely(!__pyx_t_1)) __PYX_ERR(1, 8, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_1); __Pyx_INCREF(__pyx_v__dict); __Pyx_GIVEREF(__pyx_v__dict); PyTuple_SET_ITEM(__pyx_t_1, 0, __pyx_v__dict); __pyx_t_4 = PyNumber_InPlaceAdd(__pyx_v_state, __pyx_t_1); if (unlikely(!__pyx_t_4)) __PYX_ERR(1, 8, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_4); __Pyx_DECREF(__pyx_t_1); __pyx_t_1 = 0; __Pyx_DECREF_SET(__pyx_v_state, ((PyObject*)__pyx_t_4)); __pyx_t_4 = 0; /* "(tree fragment)":9 * if _dict is not None: * state += (_dict,) * use_setstate = True # <<<<<<<<<<<<<< * else: * use_setstate = self.chunked is not None or self.compression is not None or self.headers is not None or self.method is not None or self.path is not None or self.raw_headers is not None or self.should_close is not None or self.upgrade is not None or self.url is not None or self.version is not None */ __pyx_v_use_setstate = 1; /* "(tree fragment)":7 * state = (self.chunked, self.compression, self.headers, self.method, self.path, self.raw_headers, self.should_close, self.upgrade, self.url, self.version) * _dict = getattr(self, '__dict__', None) * if _dict is not None: # <<<<<<<<<<<<<< * state += (_dict,) * use_setstate = True */ goto __pyx_L3; } /* "(tree fragment)":11 * use_setstate = True * else: * use_setstate = self.chunked is not None or self.compression is not None or self.headers is not None or self.method is not None or self.path is not None or self.raw_headers is not None or self.should_close is not None or self.upgrade is not None or self.url is not None or self.version is not None # <<<<<<<<<<<<<< * if use_setstate: * return __pyx_unpickle_RawRequestMessage, (type(self), 0x1408252, None), state */ /*else*/ { __pyx_t_2 = (__pyx_v_self->chunked != 
Py_None); __pyx_t_5 = (__pyx_t_2 != 0); if (!__pyx_t_5) { } else { __pyx_t_3 = __pyx_t_5; goto __pyx_L4_bool_binop_done; } __pyx_t_5 = (__pyx_v_self->compression != Py_None); __pyx_t_2 = (__pyx_t_5 != 0); if (!__pyx_t_2) { } else { __pyx_t_3 = __pyx_t_2; goto __pyx_L4_bool_binop_done; } __pyx_t_2 = (__pyx_v_self->headers != Py_None); __pyx_t_5 = (__pyx_t_2 != 0); if (!__pyx_t_5) { } else { __pyx_t_3 = __pyx_t_5; goto __pyx_L4_bool_binop_done; } __pyx_t_5 = (__pyx_v_self->method != ((PyObject*)Py_None)); __pyx_t_2 = (__pyx_t_5 != 0); if (!__pyx_t_2) { } else { __pyx_t_3 = __pyx_t_2; goto __pyx_L4_bool_binop_done; } __pyx_t_2 = (__pyx_v_self->path != ((PyObject*)Py_None)); __pyx_t_5 = (__pyx_t_2 != 0); if (!__pyx_t_5) { } else { __pyx_t_3 = __pyx_t_5; goto __pyx_L4_bool_binop_done; } __pyx_t_5 = (__pyx_v_self->raw_headers != Py_None); __pyx_t_2 = (__pyx_t_5 != 0); if (!__pyx_t_2) { } else { __pyx_t_3 = __pyx_t_2; goto __pyx_L4_bool_binop_done; } __pyx_t_2 = (__pyx_v_self->should_close != Py_None); __pyx_t_5 = (__pyx_t_2 != 0); if (!__pyx_t_5) { } else { __pyx_t_3 = __pyx_t_5; goto __pyx_L4_bool_binop_done; } __pyx_t_5 = (__pyx_v_self->upgrade != Py_None); __pyx_t_2 = (__pyx_t_5 != 0); if (!__pyx_t_2) { } else { __pyx_t_3 = __pyx_t_2; goto __pyx_L4_bool_binop_done; } __pyx_t_2 = (__pyx_v_self->url != Py_None); __pyx_t_5 = (__pyx_t_2 != 0); if (!__pyx_t_5) { } else { __pyx_t_3 = __pyx_t_5; goto __pyx_L4_bool_binop_done; } __pyx_t_5 = (__pyx_v_self->version != Py_None); __pyx_t_2 = (__pyx_t_5 != 0); __pyx_t_3 = __pyx_t_2; __pyx_L4_bool_binop_done:; __pyx_v_use_setstate = __pyx_t_3; } __pyx_L3:; /* "(tree fragment)":12 * else: * use_setstate = self.chunked is not None or self.compression is not None or self.headers is not None or self.method is not None or self.path is not None or self.raw_headers is not None or self.should_close is not None or self.upgrade is not None or self.url is not None or self.version is not None * if use_setstate: # <<<<<<<<<<<<<< * return __pyx_unpickle_RawRequestMessage, (type(self), 0x1408252, None), state * else: */ __pyx_t_3 = (__pyx_v_use_setstate != 0); if (__pyx_t_3) { /* "(tree fragment)":13 * use_setstate = self.chunked is not None or self.compression is not None or self.headers is not None or self.method is not None or self.path is not None or self.raw_headers is not None or self.should_close is not None or self.upgrade is not None or self.url is not None or self.version is not None * if use_setstate: * return __pyx_unpickle_RawRequestMessage, (type(self), 0x1408252, None), state # <<<<<<<<<<<<<< * else: * return __pyx_unpickle_RawRequestMessage, (type(self), 0x1408252, state) */ __Pyx_XDECREF(__pyx_r); __Pyx_GetModuleGlobalName(__pyx_t_4, __pyx_n_s_pyx_unpickle_RawRequestMessage); if (unlikely(!__pyx_t_4)) __PYX_ERR(1, 13, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_4); __pyx_t_1 = PyTuple_New(3); if (unlikely(!__pyx_t_1)) __PYX_ERR(1, 13, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_1); __Pyx_INCREF(((PyObject *)Py_TYPE(((PyObject *)__pyx_v_self)))); __Pyx_GIVEREF(((PyObject *)Py_TYPE(((PyObject *)__pyx_v_self)))); PyTuple_SET_ITEM(__pyx_t_1, 0, ((PyObject *)Py_TYPE(((PyObject *)__pyx_v_self)))); __Pyx_INCREF(__pyx_int_21004882); __Pyx_GIVEREF(__pyx_int_21004882); PyTuple_SET_ITEM(__pyx_t_1, 1, __pyx_int_21004882); __Pyx_INCREF(Py_None); __Pyx_GIVEREF(Py_None); PyTuple_SET_ITEM(__pyx_t_1, 2, Py_None); __pyx_t_6 = PyTuple_New(3); if (unlikely(!__pyx_t_6)) __PYX_ERR(1, 13, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_6); __Pyx_GIVEREF(__pyx_t_4); PyTuple_SET_ITEM(__pyx_t_6, 0, 
__pyx_t_4); __Pyx_GIVEREF(__pyx_t_1); PyTuple_SET_ITEM(__pyx_t_6, 1, __pyx_t_1); __Pyx_INCREF(__pyx_v_state); __Pyx_GIVEREF(__pyx_v_state); PyTuple_SET_ITEM(__pyx_t_6, 2, __pyx_v_state); __pyx_t_4 = 0; __pyx_t_1 = 0; __pyx_r = __pyx_t_6; __pyx_t_6 = 0; goto __pyx_L0; /* "(tree fragment)":12 * else: * use_setstate = self.chunked is not None or self.compression is not None or self.headers is not None or self.method is not None or self.path is not None or self.raw_headers is not None or self.should_close is not None or self.upgrade is not None or self.url is not None or self.version is not None * if use_setstate: # <<<<<<<<<<<<<< * return __pyx_unpickle_RawRequestMessage, (type(self), 0x1408252, None), state * else: */ } /* "(tree fragment)":15 * return __pyx_unpickle_RawRequestMessage, (type(self), 0x1408252, None), state * else: * return __pyx_unpickle_RawRequestMessage, (type(self), 0x1408252, state) # <<<<<<<<<<<<<< * def __setstate_cython__(self, __pyx_state): * __pyx_unpickle_RawRequestMessage__set_state(self, __pyx_state) */ /*else*/ { __Pyx_XDECREF(__pyx_r); __Pyx_GetModuleGlobalName(__pyx_t_6, __pyx_n_s_pyx_unpickle_RawRequestMessage); if (unlikely(!__pyx_t_6)) __PYX_ERR(1, 15, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_6); __pyx_t_1 = PyTuple_New(3); if (unlikely(!__pyx_t_1)) __PYX_ERR(1, 15, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_1); __Pyx_INCREF(((PyObject *)Py_TYPE(((PyObject *)__pyx_v_self)))); __Pyx_GIVEREF(((PyObject *)Py_TYPE(((PyObject *)__pyx_v_self)))); PyTuple_SET_ITEM(__pyx_t_1, 0, ((PyObject *)Py_TYPE(((PyObject *)__pyx_v_self)))); __Pyx_INCREF(__pyx_int_21004882); __Pyx_GIVEREF(__pyx_int_21004882); PyTuple_SET_ITEM(__pyx_t_1, 1, __pyx_int_21004882); __Pyx_INCREF(__pyx_v_state); __Pyx_GIVEREF(__pyx_v_state); PyTuple_SET_ITEM(__pyx_t_1, 2, __pyx_v_state); __pyx_t_4 = PyTuple_New(2); if (unlikely(!__pyx_t_4)) __PYX_ERR(1, 15, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_4); __Pyx_GIVEREF(__pyx_t_6); PyTuple_SET_ITEM(__pyx_t_4, 0, __pyx_t_6); __Pyx_GIVEREF(__pyx_t_1); PyTuple_SET_ITEM(__pyx_t_4, 1, __pyx_t_1); __pyx_t_6 = 0; __pyx_t_1 = 0; __pyx_r = __pyx_t_4; __pyx_t_4 = 0; goto __pyx_L0; } /* "(tree fragment)":1 * def __reduce_cython__(self): # <<<<<<<<<<<<<< * cdef tuple state * cdef object _dict */ /* function exit code */ __pyx_L1_error:; __Pyx_XDECREF(__pyx_t_1); __Pyx_XDECREF(__pyx_t_4); __Pyx_XDECREF(__pyx_t_6); __Pyx_AddTraceback("aiohttp._http_parser.RawRequestMessage.__reduce_cython__", __pyx_clineno, __pyx_lineno, __pyx_filename); __pyx_r = NULL; __pyx_L0:; __Pyx_XDECREF(__pyx_v_state); __Pyx_XDECREF(__pyx_v__dict); __Pyx_XGIVEREF(__pyx_r); __Pyx_RefNannyFinishContext(); return __pyx_r; } /* "(tree fragment)":16 * else: * return __pyx_unpickle_RawRequestMessage, (type(self), 0x1408252, state) * def __setstate_cython__(self, __pyx_state): # <<<<<<<<<<<<<< * __pyx_unpickle_RawRequestMessage__set_state(self, __pyx_state) */ /* Python wrapper */ static PyObject *__pyx_pw_7aiohttp_12_http_parser_17RawRequestMessage_9__setstate_cython__(PyObject *__pyx_v_self, PyObject *__pyx_v___pyx_state); /*proto*/ static PyObject *__pyx_pw_7aiohttp_12_http_parser_17RawRequestMessage_9__setstate_cython__(PyObject *__pyx_v_self, PyObject *__pyx_v___pyx_state) { PyObject *__pyx_r = 0; __Pyx_RefNannyDeclarations __Pyx_RefNannySetupContext("__setstate_cython__ (wrapper)", 0); __pyx_r = __pyx_pf_7aiohttp_12_http_parser_17RawRequestMessage_8__setstate_cython__(((struct __pyx_obj_7aiohttp_12_http_parser_RawRequestMessage *)__pyx_v_self), ((PyObject *)__pyx_v___pyx_state)); /* function exit code */ 
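/*
 * Pickle support: __reduce_cython__ above and __setstate_cython__ below are
 * Cython's generated "(tree fragment)" methods. Cython-level view (a sketch
 * taken from the quoted fragments; 0x1408252 is the field-layout checksum
 * that __pyx_unpickle_RawRequestMessage validates on unpickling; the else
 * branch below condenses the generated chain of ten `is not None` checks):
 *
 *     def __reduce_cython__(self):
 *         state = (self.chunked, self.compression, self.headers, self.method,
 *                  self.path, self.raw_headers, self.should_close,
 *                  self.upgrade, self.url, self.version)
 *         _dict = getattr(self, '__dict__', None)
 *         if _dict is not None:
 *             state += (_dict,)
 *             use_setstate = True
 *         else:
 *             use_setstate = any(field is not None for field in state)
 *         if use_setstate:
 *             return __pyx_unpickle_RawRequestMessage, (type(self), 0x1408252, None), state
 *         else:
 *             return __pyx_unpickle_RawRequestMessage, (type(self), 0x1408252, state)
 *
 *     def __setstate_cython__(self, __pyx_state):
 *         __pyx_unpickle_RawRequestMessage__set_state(self, __pyx_state)
 */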
__Pyx_RefNannyFinishContext(); return __pyx_r; } static PyObject *__pyx_pf_7aiohttp_12_http_parser_17RawRequestMessage_8__setstate_cython__(struct __pyx_obj_7aiohttp_12_http_parser_RawRequestMessage *__pyx_v_self, PyObject *__pyx_v___pyx_state) { PyObject *__pyx_r = NULL; __Pyx_RefNannyDeclarations PyObject *__pyx_t_1 = NULL; __Pyx_RefNannySetupContext("__setstate_cython__", 0); /* "(tree fragment)":17 * return __pyx_unpickle_RawRequestMessage, (type(self), 0x1408252, state) * def __setstate_cython__(self, __pyx_state): * __pyx_unpickle_RawRequestMessage__set_state(self, __pyx_state) # <<<<<<<<<<<<<< */ if (!(likely(PyTuple_CheckExact(__pyx_v___pyx_state))||((__pyx_v___pyx_state) == Py_None)||(PyErr_Format(PyExc_TypeError, "Expected %.16s, got %.200s", "tuple", Py_TYPE(__pyx_v___pyx_state)->tp_name), 0))) __PYX_ERR(1, 17, __pyx_L1_error) __pyx_t_1 = __pyx_f_7aiohttp_12_http_parser___pyx_unpickle_RawRequestMessage__set_state(__pyx_v_self, ((PyObject*)__pyx_v___pyx_state)); if (unlikely(!__pyx_t_1)) __PYX_ERR(1, 17, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_1); __Pyx_DECREF(__pyx_t_1); __pyx_t_1 = 0; /* "(tree fragment)":16 * else: * return __pyx_unpickle_RawRequestMessage, (type(self), 0x1408252, state) * def __setstate_cython__(self, __pyx_state): # <<<<<<<<<<<<<< * __pyx_unpickle_RawRequestMessage__set_state(self, __pyx_state) */ /* function exit code */ __pyx_r = Py_None; __Pyx_INCREF(Py_None); goto __pyx_L0; __pyx_L1_error:; __Pyx_XDECREF(__pyx_t_1); __Pyx_AddTraceback("aiohttp._http_parser.RawRequestMessage.__setstate_cython__", __pyx_clineno, __pyx_lineno, __pyx_filename); __pyx_r = NULL; __pyx_L0:; __Pyx_XGIVEREF(__pyx_r); __Pyx_RefNannyFinishContext(); return __pyx_r; } /* "aiohttp/_http_parser.pyx":167 * return ret * * cdef _new_request_message(str method, # <<<<<<<<<<<<<< * str path, * object version, */ static PyObject *__pyx_f_7aiohttp_12_http_parser__new_request_message(PyObject *__pyx_v_method, PyObject *__pyx_v_path, PyObject *__pyx_v_version, PyObject *__pyx_v_headers, PyObject *__pyx_v_raw_headers, int __pyx_v_should_close, PyObject *__pyx_v_compression, int __pyx_v_upgrade, int __pyx_v_chunked, PyObject *__pyx_v_url) { struct __pyx_obj_7aiohttp_12_http_parser_RawRequestMessage *__pyx_v_ret = 0; PyObject *__pyx_r = NULL; __Pyx_RefNannyDeclarations PyObject *__pyx_t_1 = NULL; __Pyx_RefNannySetupContext("_new_request_message", 0); /* "aiohttp/_http_parser.pyx":178 * object url): * cdef RawRequestMessage ret * ret = RawRequestMessage.__new__(RawRequestMessage) # <<<<<<<<<<<<<< * ret.method = method * ret.path = path */ __pyx_t_1 = ((PyObject *)__pyx_tp_new_7aiohttp_12_http_parser_RawRequestMessage(((PyTypeObject *)__pyx_ptype_7aiohttp_12_http_parser_RawRequestMessage), __pyx_empty_tuple, NULL)); if (unlikely(!__pyx_t_1)) __PYX_ERR(0, 178, __pyx_L1_error) __Pyx_GOTREF(((PyObject *)__pyx_t_1)); __pyx_v_ret = ((struct __pyx_obj_7aiohttp_12_http_parser_RawRequestMessage *)__pyx_t_1); __pyx_t_1 = 0; /* "aiohttp/_http_parser.pyx":179 * cdef RawRequestMessage ret * ret = RawRequestMessage.__new__(RawRequestMessage) * ret.method = method # <<<<<<<<<<<<<< * ret.path = path * ret.version = version */ __Pyx_INCREF(__pyx_v_method); __Pyx_GIVEREF(__pyx_v_method); __Pyx_GOTREF(__pyx_v_ret->method); __Pyx_DECREF(__pyx_v_ret->method); __pyx_v_ret->method = __pyx_v_method; /* "aiohttp/_http_parser.pyx":180 * ret = RawRequestMessage.__new__(RawRequestMessage) * ret.method = method * ret.path = path # <<<<<<<<<<<<<< * ret.version = version * ret.headers = headers */ __Pyx_INCREF(__pyx_v_path); 
__Pyx_GIVEREF(__pyx_v_path); __Pyx_GOTREF(__pyx_v_ret->path); __Pyx_DECREF(__pyx_v_ret->path); __pyx_v_ret->path = __pyx_v_path; /* "aiohttp/_http_parser.pyx":181 * ret.method = method * ret.path = path * ret.version = version # <<<<<<<<<<<<<< * ret.headers = headers * ret.raw_headers = raw_headers */ __Pyx_INCREF(__pyx_v_version); __Pyx_GIVEREF(__pyx_v_version); __Pyx_GOTREF(__pyx_v_ret->version); __Pyx_DECREF(__pyx_v_ret->version); __pyx_v_ret->version = __pyx_v_version; /* "aiohttp/_http_parser.pyx":182 * ret.path = path * ret.version = version * ret.headers = headers # <<<<<<<<<<<<<< * ret.raw_headers = raw_headers * ret.should_close = should_close */ __Pyx_INCREF(__pyx_v_headers); __Pyx_GIVEREF(__pyx_v_headers); __Pyx_GOTREF(__pyx_v_ret->headers); __Pyx_DECREF(__pyx_v_ret->headers); __pyx_v_ret->headers = __pyx_v_headers; /* "aiohttp/_http_parser.pyx":183 * ret.version = version * ret.headers = headers * ret.raw_headers = raw_headers # <<<<<<<<<<<<<< * ret.should_close = should_close * ret.compression = compression */ __Pyx_INCREF(__pyx_v_raw_headers); __Pyx_GIVEREF(__pyx_v_raw_headers); __Pyx_GOTREF(__pyx_v_ret->raw_headers); __Pyx_DECREF(__pyx_v_ret->raw_headers); __pyx_v_ret->raw_headers = __pyx_v_raw_headers; /* "aiohttp/_http_parser.pyx":184 * ret.headers = headers * ret.raw_headers = raw_headers * ret.should_close = should_close # <<<<<<<<<<<<<< * ret.compression = compression * ret.upgrade = upgrade */ __pyx_t_1 = __Pyx_PyBool_FromLong(__pyx_v_should_close); if (unlikely(!__pyx_t_1)) __PYX_ERR(0, 184, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_1); __Pyx_GIVEREF(__pyx_t_1); __Pyx_GOTREF(__pyx_v_ret->should_close); __Pyx_DECREF(__pyx_v_ret->should_close); __pyx_v_ret->should_close = __pyx_t_1; __pyx_t_1 = 0; /* "aiohttp/_http_parser.pyx":185 * ret.raw_headers = raw_headers * ret.should_close = should_close * ret.compression = compression # <<<<<<<<<<<<<< * ret.upgrade = upgrade * ret.chunked = chunked */ __Pyx_INCREF(__pyx_v_compression); __Pyx_GIVEREF(__pyx_v_compression); __Pyx_GOTREF(__pyx_v_ret->compression); __Pyx_DECREF(__pyx_v_ret->compression); __pyx_v_ret->compression = __pyx_v_compression; /* "aiohttp/_http_parser.pyx":186 * ret.should_close = should_close * ret.compression = compression * ret.upgrade = upgrade # <<<<<<<<<<<<<< * ret.chunked = chunked * ret.url = url */ __pyx_t_1 = __Pyx_PyBool_FromLong(__pyx_v_upgrade); if (unlikely(!__pyx_t_1)) __PYX_ERR(0, 186, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_1); __Pyx_GIVEREF(__pyx_t_1); __Pyx_GOTREF(__pyx_v_ret->upgrade); __Pyx_DECREF(__pyx_v_ret->upgrade); __pyx_v_ret->upgrade = __pyx_t_1; __pyx_t_1 = 0; /* "aiohttp/_http_parser.pyx":187 * ret.compression = compression * ret.upgrade = upgrade * ret.chunked = chunked # <<<<<<<<<<<<<< * ret.url = url * return ret */ __pyx_t_1 = __Pyx_PyBool_FromLong(__pyx_v_chunked); if (unlikely(!__pyx_t_1)) __PYX_ERR(0, 187, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_1); __Pyx_GIVEREF(__pyx_t_1); __Pyx_GOTREF(__pyx_v_ret->chunked); __Pyx_DECREF(__pyx_v_ret->chunked); __pyx_v_ret->chunked = __pyx_t_1; __pyx_t_1 = 0; /* "aiohttp/_http_parser.pyx":188 * ret.upgrade = upgrade * ret.chunked = chunked * ret.url = url # <<<<<<<<<<<<<< * return ret * */ __Pyx_INCREF(__pyx_v_url); __Pyx_GIVEREF(__pyx_v_url); __Pyx_GOTREF(__pyx_v_ret->url); __Pyx_DECREF(__pyx_v_ret->url); __pyx_v_ret->url = __pyx_v_url; /* "aiohttp/_http_parser.pyx":189 * ret.chunked = chunked * ret.url = url * return ret # <<<<<<<<<<<<<< * * */ __Pyx_XDECREF(__pyx_r); __Pyx_INCREF(((PyObject *)__pyx_v_ret)); __pyx_r = ((PyObject *)__pyx_v_ret); 
goto __pyx_L0; /* "aiohttp/_http_parser.pyx":167 * return ret * * cdef _new_request_message(str method, # <<<<<<<<<<<<<< * str path, * object version, */ /* function exit code */ __pyx_L1_error:; __Pyx_XDECREF(__pyx_t_1); __Pyx_AddTraceback("aiohttp._http_parser._new_request_message", __pyx_clineno, __pyx_lineno, __pyx_filename); __pyx_r = 0; __pyx_L0:; __Pyx_XDECREF((PyObject *)__pyx_v_ret); __Pyx_XGIVEREF(__pyx_r); __Pyx_RefNannyFinishContext(); return __pyx_r; } /* "aiohttp/_http_parser.pyx":204 * cdef readonly object chunked * * def __init__(self, version, code, reason, headers, raw_headers, # <<<<<<<<<<<<<< * should_close, compression, upgrade, chunked): * self.version = version */ /* Python wrapper */ static int __pyx_pw_7aiohttp_12_http_parser_18RawResponseMessage_1__init__(PyObject *__pyx_v_self, PyObject *__pyx_args, PyObject *__pyx_kwds); /*proto*/ static int __pyx_pw_7aiohttp_12_http_parser_18RawResponseMessage_1__init__(PyObject *__pyx_v_self, PyObject *__pyx_args, PyObject *__pyx_kwds) { PyObject *__pyx_v_version = 0; PyObject *__pyx_v_code = 0; PyObject *__pyx_v_reason = 0; PyObject *__pyx_v_headers = 0; PyObject *__pyx_v_raw_headers = 0; PyObject *__pyx_v_should_close = 0; PyObject *__pyx_v_compression = 0; PyObject *__pyx_v_upgrade = 0; PyObject *__pyx_v_chunked = 0; int __pyx_r; __Pyx_RefNannyDeclarations __Pyx_RefNannySetupContext("__init__ (wrapper)", 0); { static PyObject **__pyx_pyargnames[] = {&__pyx_n_s_version,&__pyx_n_s_code,&__pyx_n_s_reason,&__pyx_n_s_headers,&__pyx_n_s_raw_headers,&__pyx_n_s_should_close,&__pyx_n_s_compression,&__pyx_n_s_upgrade,&__pyx_n_s_chunked,0}; PyObject* values[9] = {0,0,0,0,0,0,0,0,0}; if (unlikely(__pyx_kwds)) { Py_ssize_t kw_args; const Py_ssize_t pos_args = PyTuple_GET_SIZE(__pyx_args); switch (pos_args) { case 9: values[8] = PyTuple_GET_ITEM(__pyx_args, 8); CYTHON_FALLTHROUGH; case 8: values[7] = PyTuple_GET_ITEM(__pyx_args, 7); CYTHON_FALLTHROUGH; case 7: values[6] = PyTuple_GET_ITEM(__pyx_args, 6); CYTHON_FALLTHROUGH; case 6: values[5] = PyTuple_GET_ITEM(__pyx_args, 5); CYTHON_FALLTHROUGH; case 5: values[4] = PyTuple_GET_ITEM(__pyx_args, 4); CYTHON_FALLTHROUGH; case 4: values[3] = PyTuple_GET_ITEM(__pyx_args, 3); CYTHON_FALLTHROUGH; case 3: values[2] = PyTuple_GET_ITEM(__pyx_args, 2); CYTHON_FALLTHROUGH; case 2: values[1] = PyTuple_GET_ITEM(__pyx_args, 1); CYTHON_FALLTHROUGH; case 1: values[0] = PyTuple_GET_ITEM(__pyx_args, 0); CYTHON_FALLTHROUGH; case 0: break; default: goto __pyx_L5_argtuple_error; } kw_args = PyDict_Size(__pyx_kwds); switch (pos_args) { case 0: if (likely((values[0] = __Pyx_PyDict_GetItemStr(__pyx_kwds, __pyx_n_s_version)) != 0)) kw_args--; else goto __pyx_L5_argtuple_error; CYTHON_FALLTHROUGH; case 1: if (likely((values[1] = __Pyx_PyDict_GetItemStr(__pyx_kwds, __pyx_n_s_code)) != 0)) kw_args--; else { __Pyx_RaiseArgtupleInvalid("__init__", 1, 9, 9, 1); __PYX_ERR(0, 204, __pyx_L3_error) } CYTHON_FALLTHROUGH; case 2: if (likely((values[2] = __Pyx_PyDict_GetItemStr(__pyx_kwds, __pyx_n_s_reason)) != 0)) kw_args--; else { __Pyx_RaiseArgtupleInvalid("__init__", 1, 9, 9, 2); __PYX_ERR(0, 204, __pyx_L3_error) } CYTHON_FALLTHROUGH; case 3: if (likely((values[3] = __Pyx_PyDict_GetItemStr(__pyx_kwds, __pyx_n_s_headers)) != 0)) kw_args--; else { __Pyx_RaiseArgtupleInvalid("__init__", 1, 9, 9, 3); __PYX_ERR(0, 204, __pyx_L3_error) } CYTHON_FALLTHROUGH; case 4: if (likely((values[4] = __Pyx_PyDict_GetItemStr(__pyx_kwds, __pyx_n_s_raw_headers)) != 0)) kw_args--; else { __Pyx_RaiseArgtupleInvalid("__init__", 1, 9, 9, 4); 
__PYX_ERR(0, 204, __pyx_L3_error) } CYTHON_FALLTHROUGH; case 5: if (likely((values[5] = __Pyx_PyDict_GetItemStr(__pyx_kwds, __pyx_n_s_should_close)) != 0)) kw_args--; else { __Pyx_RaiseArgtupleInvalid("__init__", 1, 9, 9, 5); __PYX_ERR(0, 204, __pyx_L3_error) } CYTHON_FALLTHROUGH; case 6: if (likely((values[6] = __Pyx_PyDict_GetItemStr(__pyx_kwds, __pyx_n_s_compression)) != 0)) kw_args--; else { __Pyx_RaiseArgtupleInvalid("__init__", 1, 9, 9, 6); __PYX_ERR(0, 204, __pyx_L3_error) } CYTHON_FALLTHROUGH; case 7: if (likely((values[7] = __Pyx_PyDict_GetItemStr(__pyx_kwds, __pyx_n_s_upgrade)) != 0)) kw_args--; else { __Pyx_RaiseArgtupleInvalid("__init__", 1, 9, 9, 7); __PYX_ERR(0, 204, __pyx_L3_error) } CYTHON_FALLTHROUGH; case 8: if (likely((values[8] = __Pyx_PyDict_GetItemStr(__pyx_kwds, __pyx_n_s_chunked)) != 0)) kw_args--; else { __Pyx_RaiseArgtupleInvalid("__init__", 1, 9, 9, 8); __PYX_ERR(0, 204, __pyx_L3_error) } } if (unlikely(kw_args > 0)) { if (unlikely(__Pyx_ParseOptionalKeywords(__pyx_kwds, __pyx_pyargnames, 0, values, pos_args, "__init__") < 0)) __PYX_ERR(0, 204, __pyx_L3_error) } } else if (PyTuple_GET_SIZE(__pyx_args) != 9) { goto __pyx_L5_argtuple_error; } else { values[0] = PyTuple_GET_ITEM(__pyx_args, 0); values[1] = PyTuple_GET_ITEM(__pyx_args, 1); values[2] = PyTuple_GET_ITEM(__pyx_args, 2); values[3] = PyTuple_GET_ITEM(__pyx_args, 3); values[4] = PyTuple_GET_ITEM(__pyx_args, 4); values[5] = PyTuple_GET_ITEM(__pyx_args, 5); values[6] = PyTuple_GET_ITEM(__pyx_args, 6); values[7] = PyTuple_GET_ITEM(__pyx_args, 7); values[8] = PyTuple_GET_ITEM(__pyx_args, 8); } __pyx_v_version = values[0]; __pyx_v_code = values[1]; __pyx_v_reason = values[2]; __pyx_v_headers = values[3]; __pyx_v_raw_headers = values[4]; __pyx_v_should_close = values[5]; __pyx_v_compression = values[6]; __pyx_v_upgrade = values[7]; __pyx_v_chunked = values[8]; } goto __pyx_L4_argument_unpacking_done; __pyx_L5_argtuple_error:; __Pyx_RaiseArgtupleInvalid("__init__", 1, 9, 9, PyTuple_GET_SIZE(__pyx_args)); __PYX_ERR(0, 204, __pyx_L3_error) __pyx_L3_error:; __Pyx_AddTraceback("aiohttp._http_parser.RawResponseMessage.__init__", __pyx_clineno, __pyx_lineno, __pyx_filename); __Pyx_RefNannyFinishContext(); return -1; __pyx_L4_argument_unpacking_done:; __pyx_r = __pyx_pf_7aiohttp_12_http_parser_18RawResponseMessage___init__(((struct __pyx_obj_7aiohttp_12_http_parser_RawResponseMessage *)__pyx_v_self), __pyx_v_version, __pyx_v_code, __pyx_v_reason, __pyx_v_headers, __pyx_v_raw_headers, __pyx_v_should_close, __pyx_v_compression, __pyx_v_upgrade, __pyx_v_chunked); /* function exit code */ __Pyx_RefNannyFinishContext(); return __pyx_r; } static int __pyx_pf_7aiohttp_12_http_parser_18RawResponseMessage___init__(struct __pyx_obj_7aiohttp_12_http_parser_RawResponseMessage *__pyx_v_self, PyObject *__pyx_v_version, PyObject *__pyx_v_code, PyObject *__pyx_v_reason, PyObject *__pyx_v_headers, PyObject *__pyx_v_raw_headers, PyObject *__pyx_v_should_close, PyObject *__pyx_v_compression, PyObject *__pyx_v_upgrade, PyObject *__pyx_v_chunked) { int __pyx_r; __Pyx_RefNannyDeclarations int __pyx_t_1; PyObject *__pyx_t_2 = NULL; __Pyx_RefNannySetupContext("__init__", 0); /* "aiohttp/_http_parser.pyx":206 * def __init__(self, version, code, reason, headers, raw_headers, * should_close, compression, upgrade, chunked): * self.version = version # <<<<<<<<<<<<<< * self.code = code * self.reason = reason */ __Pyx_INCREF(__pyx_v_version); __Pyx_GIVEREF(__pyx_v_version); __Pyx_GOTREF(__pyx_v_self->version); __Pyx_DECREF(__pyx_v_self->version); 
__pyx_v_self->version = __pyx_v_version; /* "aiohttp/_http_parser.pyx":207 * should_close, compression, upgrade, chunked): * self.version = version * self.code = code # <<<<<<<<<<<<<< * self.reason = reason * self.headers = headers */ __pyx_t_1 = __Pyx_PyInt_As_int(__pyx_v_code); if (unlikely((__pyx_t_1 == (int)-1) && PyErr_Occurred())) __PYX_ERR(0, 207, __pyx_L1_error) __pyx_v_self->code = __pyx_t_1; /* "aiohttp/_http_parser.pyx":208 * self.version = version * self.code = code * self.reason = reason # <<<<<<<<<<<<<< * self.headers = headers * self.raw_headers = raw_headers */ if (!(likely(PyUnicode_CheckExact(__pyx_v_reason))||((__pyx_v_reason) == Py_None)||(PyErr_Format(PyExc_TypeError, "Expected %.16s, got %.200s", "unicode", Py_TYPE(__pyx_v_reason)->tp_name), 0))) __PYX_ERR(0, 208, __pyx_L1_error) __pyx_t_2 = __pyx_v_reason; __Pyx_INCREF(__pyx_t_2); __Pyx_GIVEREF(__pyx_t_2); __Pyx_GOTREF(__pyx_v_self->reason); __Pyx_DECREF(__pyx_v_self->reason); __pyx_v_self->reason = ((PyObject*)__pyx_t_2); __pyx_t_2 = 0; /* "aiohttp/_http_parser.pyx":209 * self.code = code * self.reason = reason * self.headers = headers # <<<<<<<<<<<<<< * self.raw_headers = raw_headers * self.should_close = should_close */ __Pyx_INCREF(__pyx_v_headers); __Pyx_GIVEREF(__pyx_v_headers); __Pyx_GOTREF(__pyx_v_self->headers); __Pyx_DECREF(__pyx_v_self->headers); __pyx_v_self->headers = __pyx_v_headers; /* "aiohttp/_http_parser.pyx":210 * self.reason = reason * self.headers = headers * self.raw_headers = raw_headers # <<<<<<<<<<<<<< * self.should_close = should_close * self.compression = compression */ __Pyx_INCREF(__pyx_v_raw_headers); __Pyx_GIVEREF(__pyx_v_raw_headers); __Pyx_GOTREF(__pyx_v_self->raw_headers); __Pyx_DECREF(__pyx_v_self->raw_headers); __pyx_v_self->raw_headers = __pyx_v_raw_headers; /* "aiohttp/_http_parser.pyx":211 * self.headers = headers * self.raw_headers = raw_headers * self.should_close = should_close # <<<<<<<<<<<<<< * self.compression = compression * self.upgrade = upgrade */ __Pyx_INCREF(__pyx_v_should_close); __Pyx_GIVEREF(__pyx_v_should_close); __Pyx_GOTREF(__pyx_v_self->should_close); __Pyx_DECREF(__pyx_v_self->should_close); __pyx_v_self->should_close = __pyx_v_should_close; /* "aiohttp/_http_parser.pyx":212 * self.raw_headers = raw_headers * self.should_close = should_close * self.compression = compression # <<<<<<<<<<<<<< * self.upgrade = upgrade * self.chunked = chunked */ __Pyx_INCREF(__pyx_v_compression); __Pyx_GIVEREF(__pyx_v_compression); __Pyx_GOTREF(__pyx_v_self->compression); __Pyx_DECREF(__pyx_v_self->compression); __pyx_v_self->compression = __pyx_v_compression; /* "aiohttp/_http_parser.pyx":213 * self.should_close = should_close * self.compression = compression * self.upgrade = upgrade # <<<<<<<<<<<<<< * self.chunked = chunked * */ __Pyx_INCREF(__pyx_v_upgrade); __Pyx_GIVEREF(__pyx_v_upgrade); __Pyx_GOTREF(__pyx_v_self->upgrade); __Pyx_DECREF(__pyx_v_self->upgrade); __pyx_v_self->upgrade = __pyx_v_upgrade; /* "aiohttp/_http_parser.pyx":214 * self.compression = compression * self.upgrade = upgrade * self.chunked = chunked # <<<<<<<<<<<<<< * * def __repr__(self): */ __Pyx_INCREF(__pyx_v_chunked); __Pyx_GIVEREF(__pyx_v_chunked); __Pyx_GOTREF(__pyx_v_self->chunked); __Pyx_DECREF(__pyx_v_self->chunked); __pyx_v_self->chunked = __pyx_v_chunked; /* "aiohttp/_http_parser.pyx":204 * cdef readonly object chunked * * def __init__(self, version, code, reason, headers, raw_headers, # <<<<<<<<<<<<<< * should_close, compression, upgrade, chunked): * self.version = version */ /* function exit 
code */ __pyx_r = 0; goto __pyx_L0; __pyx_L1_error:; __Pyx_XDECREF(__pyx_t_2); __Pyx_AddTraceback("aiohttp._http_parser.RawResponseMessage.__init__", __pyx_clineno, __pyx_lineno, __pyx_filename); __pyx_r = -1; __pyx_L0:; __Pyx_RefNannyFinishContext(); return __pyx_r; } /* "aiohttp/_http_parser.pyx":216 * self.chunked = chunked * * def __repr__(self): # <<<<<<<<<<<<<< * info = [] * info.append(("version", self.version)) */ /* Python wrapper */ static PyObject *__pyx_pw_7aiohttp_12_http_parser_18RawResponseMessage_3__repr__(PyObject *__pyx_v_self); /*proto*/ static PyObject *__pyx_pw_7aiohttp_12_http_parser_18RawResponseMessage_3__repr__(PyObject *__pyx_v_self) { PyObject *__pyx_r = 0; __Pyx_RefNannyDeclarations __Pyx_RefNannySetupContext("__repr__ (wrapper)", 0); __pyx_r = __pyx_pf_7aiohttp_12_http_parser_18RawResponseMessage_2__repr__(((struct __pyx_obj_7aiohttp_12_http_parser_RawResponseMessage *)__pyx_v_self)); /* function exit code */ __Pyx_RefNannyFinishContext(); return __pyx_r; } static PyObject *__pyx_gb_7aiohttp_12_http_parser_18RawResponseMessage_8__repr___2generator1(__pyx_CoroutineObject *__pyx_generator, CYTHON_UNUSED PyThreadState *__pyx_tstate, PyObject *__pyx_sent_value); /* proto */ /* "aiohttp/_http_parser.pyx":227 * info.append(("upgrade", self.upgrade)) * info.append(("chunked", self.chunked)) * sinfo = ', '.join(name + '=' + repr(val) for name, val in info) # <<<<<<<<<<<<<< * return '' * */ static PyObject *__pyx_pf_7aiohttp_12_http_parser_18RawResponseMessage_8__repr___genexpr(PyObject *__pyx_self) { struct __pyx_obj_7aiohttp_12_http_parser___pyx_scope_struct_3_genexpr *__pyx_cur_scope; PyObject *__pyx_r = NULL; __Pyx_RefNannyDeclarations __Pyx_RefNannySetupContext("genexpr", 0); __pyx_cur_scope = (struct __pyx_obj_7aiohttp_12_http_parser___pyx_scope_struct_3_genexpr *)__pyx_tp_new_7aiohttp_12_http_parser___pyx_scope_struct_3_genexpr(__pyx_ptype_7aiohttp_12_http_parser___pyx_scope_struct_3_genexpr, __pyx_empty_tuple, NULL); if (unlikely(!__pyx_cur_scope)) { __pyx_cur_scope = ((struct __pyx_obj_7aiohttp_12_http_parser___pyx_scope_struct_3_genexpr *)Py_None); __Pyx_INCREF(Py_None); __PYX_ERR(0, 227, __pyx_L1_error) } else { __Pyx_GOTREF(__pyx_cur_scope); } __pyx_cur_scope->__pyx_outer_scope = (struct __pyx_obj_7aiohttp_12_http_parser___pyx_scope_struct_2___repr__ *) __pyx_self; __Pyx_INCREF(((PyObject *)__pyx_cur_scope->__pyx_outer_scope)); __Pyx_GIVEREF(__pyx_cur_scope->__pyx_outer_scope); { __pyx_CoroutineObject *gen = __Pyx_Generator_New((__pyx_coroutine_body_t) __pyx_gb_7aiohttp_12_http_parser_18RawResponseMessage_8__repr___2generator1, NULL, (PyObject *) __pyx_cur_scope, __pyx_n_s_genexpr, __pyx_n_s_repr___locals_genexpr, __pyx_n_s_aiohttp__http_parser); if (unlikely(!gen)) __PYX_ERR(0, 227, __pyx_L1_error) __Pyx_DECREF(__pyx_cur_scope); __Pyx_RefNannyFinishContext(); return (PyObject *) gen; } /* function exit code */ __pyx_L1_error:; __Pyx_AddTraceback("aiohttp._http_parser.RawResponseMessage.__repr__.genexpr", __pyx_clineno, __pyx_lineno, __pyx_filename); __pyx_r = NULL; __Pyx_DECREF(((PyObject *)__pyx_cur_scope)); __Pyx_XGIVEREF(__pyx_r); __Pyx_RefNannyFinishContext(); return __pyx_r; } static PyObject *__pyx_gb_7aiohttp_12_http_parser_18RawResponseMessage_8__repr___2generator1(__pyx_CoroutineObject *__pyx_generator, CYTHON_UNUSED PyThreadState *__pyx_tstate, PyObject *__pyx_sent_value) /* generator body */ { struct __pyx_obj_7aiohttp_12_http_parser___pyx_scope_struct_3_genexpr *__pyx_cur_scope = ((struct 
__pyx_obj_7aiohttp_12_http_parser___pyx_scope_struct_3_genexpr *)__pyx_generator->closure); PyObject *__pyx_r = NULL; PyObject *__pyx_t_1 = NULL; Py_ssize_t __pyx_t_2; PyObject *__pyx_t_3 = NULL; PyObject *__pyx_t_4 = NULL; PyObject *__pyx_t_5 = NULL; PyObject *__pyx_t_6 = NULL; PyObject *(*__pyx_t_7)(PyObject *); __Pyx_RefNannyDeclarations __Pyx_RefNannySetupContext("genexpr", 0); switch (__pyx_generator->resume_label) { case 0: goto __pyx_L3_first_run; default: /* CPython raises the right error here */ __Pyx_RefNannyFinishContext(); return NULL; } __pyx_L3_first_run:; if (unlikely(!__pyx_sent_value)) __PYX_ERR(0, 227, __pyx_L1_error) __pyx_r = PyList_New(0); if (unlikely(!__pyx_r)) __PYX_ERR(0, 227, __pyx_L1_error) __Pyx_GOTREF(__pyx_r); if (unlikely(!__pyx_cur_scope->__pyx_outer_scope->__pyx_v_info)) { __Pyx_RaiseClosureNameError("info"); __PYX_ERR(0, 227, __pyx_L1_error) } if (unlikely(__pyx_cur_scope->__pyx_outer_scope->__pyx_v_info == Py_None)) { PyErr_SetString(PyExc_TypeError, "'NoneType' object is not iterable"); __PYX_ERR(0, 227, __pyx_L1_error) } __pyx_t_1 = __pyx_cur_scope->__pyx_outer_scope->__pyx_v_info; __Pyx_INCREF(__pyx_t_1); __pyx_t_2 = 0; for (;;) { if (__pyx_t_2 >= PyList_GET_SIZE(__pyx_t_1)) break; #if CYTHON_ASSUME_SAFE_MACROS && !CYTHON_AVOID_BORROWED_REFS __pyx_t_3 = PyList_GET_ITEM(__pyx_t_1, __pyx_t_2); __Pyx_INCREF(__pyx_t_3); __pyx_t_2++; if (unlikely(0 < 0)) __PYX_ERR(0, 227, __pyx_L1_error) #else __pyx_t_3 = PySequence_ITEM(__pyx_t_1, __pyx_t_2); __pyx_t_2++; if (unlikely(!__pyx_t_3)) __PYX_ERR(0, 227, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_3); #endif if ((likely(PyTuple_CheckExact(__pyx_t_3))) || (PyList_CheckExact(__pyx_t_3))) { PyObject* sequence = __pyx_t_3; Py_ssize_t size = __Pyx_PySequence_SIZE(sequence); if (unlikely(size != 2)) { if (size > 2) __Pyx_RaiseTooManyValuesError(2); else if (size >= 0) __Pyx_RaiseNeedMoreValuesError(size); __PYX_ERR(0, 227, __pyx_L1_error) } #if CYTHON_ASSUME_SAFE_MACROS && !CYTHON_AVOID_BORROWED_REFS if (likely(PyTuple_CheckExact(sequence))) { __pyx_t_4 = PyTuple_GET_ITEM(sequence, 0); __pyx_t_5 = PyTuple_GET_ITEM(sequence, 1); } else { __pyx_t_4 = PyList_GET_ITEM(sequence, 0); __pyx_t_5 = PyList_GET_ITEM(sequence, 1); } __Pyx_INCREF(__pyx_t_4); __Pyx_INCREF(__pyx_t_5); #else __pyx_t_4 = PySequence_ITEM(sequence, 0); if (unlikely(!__pyx_t_4)) __PYX_ERR(0, 227, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_4); __pyx_t_5 = PySequence_ITEM(sequence, 1); if (unlikely(!__pyx_t_5)) __PYX_ERR(0, 227, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_5); #endif __Pyx_DECREF(__pyx_t_3); __pyx_t_3 = 0; } else { Py_ssize_t index = -1; __pyx_t_6 = PyObject_GetIter(__pyx_t_3); if (unlikely(!__pyx_t_6)) __PYX_ERR(0, 227, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_6); __Pyx_DECREF(__pyx_t_3); __pyx_t_3 = 0; __pyx_t_7 = Py_TYPE(__pyx_t_6)->tp_iternext; index = 0; __pyx_t_4 = __pyx_t_7(__pyx_t_6); if (unlikely(!__pyx_t_4)) goto __pyx_L6_unpacking_failed; __Pyx_GOTREF(__pyx_t_4); index = 1; __pyx_t_5 = __pyx_t_7(__pyx_t_6); if (unlikely(!__pyx_t_5)) goto __pyx_L6_unpacking_failed; __Pyx_GOTREF(__pyx_t_5); if (__Pyx_IternextUnpackEndCheck(__pyx_t_7(__pyx_t_6), 2) < 0) __PYX_ERR(0, 227, __pyx_L1_error) __pyx_t_7 = NULL; __Pyx_DECREF(__pyx_t_6); __pyx_t_6 = 0; goto __pyx_L7_unpacking_done; __pyx_L6_unpacking_failed:; __Pyx_DECREF(__pyx_t_6); __pyx_t_6 = 0; __pyx_t_7 = NULL; if (__Pyx_IterFinish() == 0) __Pyx_RaiseNeedMoreValuesError(index); __PYX_ERR(0, 227, __pyx_L1_error) __pyx_L7_unpacking_done:; } __Pyx_XGOTREF(__pyx_cur_scope->__pyx_v_name); 
__Pyx_XDECREF_SET(__pyx_cur_scope->__pyx_v_name, __pyx_t_4); __Pyx_GIVEREF(__pyx_t_4); __pyx_t_4 = 0; __Pyx_XGOTREF(__pyx_cur_scope->__pyx_v_val); __Pyx_XDECREF_SET(__pyx_cur_scope->__pyx_v_val, __pyx_t_5); __Pyx_GIVEREF(__pyx_t_5); __pyx_t_5 = 0; __pyx_t_3 = PyNumber_Add(__pyx_cur_scope->__pyx_v_name, __pyx_kp_u_); if (unlikely(!__pyx_t_3)) __PYX_ERR(0, 227, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_3); __pyx_t_5 = PyObject_Repr(__pyx_cur_scope->__pyx_v_val); if (unlikely(!__pyx_t_5)) __PYX_ERR(0, 227, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_5); __pyx_t_4 = PyNumber_Add(__pyx_t_3, __pyx_t_5); if (unlikely(!__pyx_t_4)) __PYX_ERR(0, 227, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_4); __Pyx_DECREF(__pyx_t_3); __pyx_t_3 = 0; __Pyx_DECREF(__pyx_t_5); __pyx_t_5 = 0; if (unlikely(__Pyx_ListComp_Append(__pyx_r, (PyObject*)__pyx_t_4))) __PYX_ERR(0, 227, __pyx_L1_error) __Pyx_DECREF(__pyx_t_4); __pyx_t_4 = 0; } __Pyx_DECREF(__pyx_t_1); __pyx_t_1 = 0; CYTHON_MAYBE_UNUSED_VAR(__pyx_cur_scope); /* function exit code */ goto __pyx_L0; __pyx_L1_error:; __Pyx_XDECREF(__pyx_r); __pyx_r = 0; __Pyx_XDECREF(__pyx_t_1); __Pyx_XDECREF(__pyx_t_3); __Pyx_XDECREF(__pyx_t_4); __Pyx_XDECREF(__pyx_t_5); __Pyx_XDECREF(__pyx_t_6); __Pyx_AddTraceback("genexpr", __pyx_clineno, __pyx_lineno, __pyx_filename); __pyx_L0:; __Pyx_XGIVEREF(__pyx_r); #if !CYTHON_USE_EXC_INFO_STACK __Pyx_Coroutine_ResetAndClearException(__pyx_generator); #endif __pyx_generator->resume_label = -1; __Pyx_Coroutine_clear((PyObject*)__pyx_generator); __Pyx_RefNannyFinishContext(); return __pyx_r; } /* "aiohttp/_http_parser.pyx":216 * self.chunked = chunked * * def __repr__(self): # <<<<<<<<<<<<<< * info = [] * info.append(("version", self.version)) */ static PyObject *__pyx_pf_7aiohttp_12_http_parser_18RawResponseMessage_2__repr__(struct __pyx_obj_7aiohttp_12_http_parser_RawResponseMessage *__pyx_v_self) { struct __pyx_obj_7aiohttp_12_http_parser___pyx_scope_struct_2___repr__ *__pyx_cur_scope; PyObject *__pyx_v_sinfo = NULL; PyObject *__pyx_r = NULL; __Pyx_RefNannyDeclarations PyObject *__pyx_t_1 = NULL; int __pyx_t_2; PyObject *__pyx_t_3 = NULL; __Pyx_RefNannySetupContext("__repr__", 0); __pyx_cur_scope = (struct __pyx_obj_7aiohttp_12_http_parser___pyx_scope_struct_2___repr__ *)__pyx_tp_new_7aiohttp_12_http_parser___pyx_scope_struct_2___repr__(__pyx_ptype_7aiohttp_12_http_parser___pyx_scope_struct_2___repr__, __pyx_empty_tuple, NULL); if (unlikely(!__pyx_cur_scope)) { __pyx_cur_scope = ((struct __pyx_obj_7aiohttp_12_http_parser___pyx_scope_struct_2___repr__ *)Py_None); __Pyx_INCREF(Py_None); __PYX_ERR(0, 216, __pyx_L1_error) } else { __Pyx_GOTREF(__pyx_cur_scope); } /* "aiohttp/_http_parser.pyx":217 * * def __repr__(self): * info = [] # <<<<<<<<<<<<<< * info.append(("version", self.version)) * info.append(("code", self.code)) */ __pyx_t_1 = PyList_New(0); if (unlikely(!__pyx_t_1)) __PYX_ERR(0, 217, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_1); __Pyx_GIVEREF(__pyx_t_1); __pyx_cur_scope->__pyx_v_info = ((PyObject*)__pyx_t_1); __pyx_t_1 = 0; /* "aiohttp/_http_parser.pyx":218 * def __repr__(self): * info = [] * info.append(("version", self.version)) # <<<<<<<<<<<<<< * info.append(("code", self.code)) * info.append(("reason", self.reason)) */ __pyx_t_1 = PyTuple_New(2); if (unlikely(!__pyx_t_1)) __PYX_ERR(0, 218, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_1); __Pyx_INCREF(__pyx_n_u_version); __Pyx_GIVEREF(__pyx_n_u_version); PyTuple_SET_ITEM(__pyx_t_1, 0, __pyx_n_u_version); __Pyx_INCREF(__pyx_v_self->version); __Pyx_GIVEREF(__pyx_v_self->version); PyTuple_SET_ITEM(__pyx_t_1, 
1, __pyx_v_self->version); __pyx_t_2 = __Pyx_PyList_Append(__pyx_cur_scope->__pyx_v_info, __pyx_t_1); if (unlikely(__pyx_t_2 == ((int)-1))) __PYX_ERR(0, 218, __pyx_L1_error) __Pyx_DECREF(__pyx_t_1); __pyx_t_1 = 0; /* "aiohttp/_http_parser.pyx":219 * info = [] * info.append(("version", self.version)) * info.append(("code", self.code)) # <<<<<<<<<<<<<< * info.append(("reason", self.reason)) * info.append(("headers", self.headers)) */ __pyx_t_1 = __Pyx_PyInt_From_int(__pyx_v_self->code); if (unlikely(!__pyx_t_1)) __PYX_ERR(0, 219, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_1); __pyx_t_3 = PyTuple_New(2); if (unlikely(!__pyx_t_3)) __PYX_ERR(0, 219, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_3); __Pyx_INCREF(__pyx_n_u_code); __Pyx_GIVEREF(__pyx_n_u_code); PyTuple_SET_ITEM(__pyx_t_3, 0, __pyx_n_u_code); __Pyx_GIVEREF(__pyx_t_1); PyTuple_SET_ITEM(__pyx_t_3, 1, __pyx_t_1); __pyx_t_1 = 0; __pyx_t_2 = __Pyx_PyList_Append(__pyx_cur_scope->__pyx_v_info, __pyx_t_3); if (unlikely(__pyx_t_2 == ((int)-1))) __PYX_ERR(0, 219, __pyx_L1_error) __Pyx_DECREF(__pyx_t_3); __pyx_t_3 = 0; /* "aiohttp/_http_parser.pyx":220 * info.append(("version", self.version)) * info.append(("code", self.code)) * info.append(("reason", self.reason)) # <<<<<<<<<<<<<< * info.append(("headers", self.headers)) * info.append(("raw_headers", self.raw_headers)) */ __pyx_t_3 = PyTuple_New(2); if (unlikely(!__pyx_t_3)) __PYX_ERR(0, 220, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_3); __Pyx_INCREF(__pyx_n_u_reason); __Pyx_GIVEREF(__pyx_n_u_reason); PyTuple_SET_ITEM(__pyx_t_3, 0, __pyx_n_u_reason); __Pyx_INCREF(__pyx_v_self->reason); __Pyx_GIVEREF(__pyx_v_self->reason); PyTuple_SET_ITEM(__pyx_t_3, 1, __pyx_v_self->reason); __pyx_t_2 = __Pyx_PyList_Append(__pyx_cur_scope->__pyx_v_info, __pyx_t_3); if (unlikely(__pyx_t_2 == ((int)-1))) __PYX_ERR(0, 220, __pyx_L1_error) __Pyx_DECREF(__pyx_t_3); __pyx_t_3 = 0; /* "aiohttp/_http_parser.pyx":221 * info.append(("code", self.code)) * info.append(("reason", self.reason)) * info.append(("headers", self.headers)) # <<<<<<<<<<<<<< * info.append(("raw_headers", self.raw_headers)) * info.append(("should_close", self.should_close)) */ __pyx_t_3 = PyTuple_New(2); if (unlikely(!__pyx_t_3)) __PYX_ERR(0, 221, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_3); __Pyx_INCREF(__pyx_n_u_headers); __Pyx_GIVEREF(__pyx_n_u_headers); PyTuple_SET_ITEM(__pyx_t_3, 0, __pyx_n_u_headers); __Pyx_INCREF(__pyx_v_self->headers); __Pyx_GIVEREF(__pyx_v_self->headers); PyTuple_SET_ITEM(__pyx_t_3, 1, __pyx_v_self->headers); __pyx_t_2 = __Pyx_PyList_Append(__pyx_cur_scope->__pyx_v_info, __pyx_t_3); if (unlikely(__pyx_t_2 == ((int)-1))) __PYX_ERR(0, 221, __pyx_L1_error) __Pyx_DECREF(__pyx_t_3); __pyx_t_3 = 0; /* "aiohttp/_http_parser.pyx":222 * info.append(("reason", self.reason)) * info.append(("headers", self.headers)) * info.append(("raw_headers", self.raw_headers)) # <<<<<<<<<<<<<< * info.append(("should_close", self.should_close)) * info.append(("compression", self.compression)) */ __pyx_t_3 = PyTuple_New(2); if (unlikely(!__pyx_t_3)) __PYX_ERR(0, 222, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_3); __Pyx_INCREF(__pyx_n_u_raw_headers); __Pyx_GIVEREF(__pyx_n_u_raw_headers); PyTuple_SET_ITEM(__pyx_t_3, 0, __pyx_n_u_raw_headers); __Pyx_INCREF(__pyx_v_self->raw_headers); __Pyx_GIVEREF(__pyx_v_self->raw_headers); PyTuple_SET_ITEM(__pyx_t_3, 1, __pyx_v_self->raw_headers); __pyx_t_2 = __Pyx_PyList_Append(__pyx_cur_scope->__pyx_v_info, __pyx_t_3); if (unlikely(__pyx_t_2 == ((int)-1))) __PYX_ERR(0, 222, __pyx_L1_error) __Pyx_DECREF(__pyx_t_3); __pyx_t_3 = 0; /* 
"aiohttp/_http_parser.pyx":223 * info.append(("headers", self.headers)) * info.append(("raw_headers", self.raw_headers)) * info.append(("should_close", self.should_close)) # <<<<<<<<<<<<<< * info.append(("compression", self.compression)) * info.append(("upgrade", self.upgrade)) */ __pyx_t_3 = PyTuple_New(2); if (unlikely(!__pyx_t_3)) __PYX_ERR(0, 223, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_3); __Pyx_INCREF(__pyx_n_u_should_close); __Pyx_GIVEREF(__pyx_n_u_should_close); PyTuple_SET_ITEM(__pyx_t_3, 0, __pyx_n_u_should_close); __Pyx_INCREF(__pyx_v_self->should_close); __Pyx_GIVEREF(__pyx_v_self->should_close); PyTuple_SET_ITEM(__pyx_t_3, 1, __pyx_v_self->should_close); __pyx_t_2 = __Pyx_PyList_Append(__pyx_cur_scope->__pyx_v_info, __pyx_t_3); if (unlikely(__pyx_t_2 == ((int)-1))) __PYX_ERR(0, 223, __pyx_L1_error) __Pyx_DECREF(__pyx_t_3); __pyx_t_3 = 0; /* "aiohttp/_http_parser.pyx":224 * info.append(("raw_headers", self.raw_headers)) * info.append(("should_close", self.should_close)) * info.append(("compression", self.compression)) # <<<<<<<<<<<<<< * info.append(("upgrade", self.upgrade)) * info.append(("chunked", self.chunked)) */ __pyx_t_3 = PyTuple_New(2); if (unlikely(!__pyx_t_3)) __PYX_ERR(0, 224, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_3); __Pyx_INCREF(__pyx_n_u_compression); __Pyx_GIVEREF(__pyx_n_u_compression); PyTuple_SET_ITEM(__pyx_t_3, 0, __pyx_n_u_compression); __Pyx_INCREF(__pyx_v_self->compression); __Pyx_GIVEREF(__pyx_v_self->compression); PyTuple_SET_ITEM(__pyx_t_3, 1, __pyx_v_self->compression); __pyx_t_2 = __Pyx_PyList_Append(__pyx_cur_scope->__pyx_v_info, __pyx_t_3); if (unlikely(__pyx_t_2 == ((int)-1))) __PYX_ERR(0, 224, __pyx_L1_error) __Pyx_DECREF(__pyx_t_3); __pyx_t_3 = 0; /* "aiohttp/_http_parser.pyx":225 * info.append(("should_close", self.should_close)) * info.append(("compression", self.compression)) * info.append(("upgrade", self.upgrade)) # <<<<<<<<<<<<<< * info.append(("chunked", self.chunked)) * sinfo = ', '.join(name + '=' + repr(val) for name, val in info) */ __pyx_t_3 = PyTuple_New(2); if (unlikely(!__pyx_t_3)) __PYX_ERR(0, 225, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_3); __Pyx_INCREF(__pyx_n_u_upgrade); __Pyx_GIVEREF(__pyx_n_u_upgrade); PyTuple_SET_ITEM(__pyx_t_3, 0, __pyx_n_u_upgrade); __Pyx_INCREF(__pyx_v_self->upgrade); __Pyx_GIVEREF(__pyx_v_self->upgrade); PyTuple_SET_ITEM(__pyx_t_3, 1, __pyx_v_self->upgrade); __pyx_t_2 = __Pyx_PyList_Append(__pyx_cur_scope->__pyx_v_info, __pyx_t_3); if (unlikely(__pyx_t_2 == ((int)-1))) __PYX_ERR(0, 225, __pyx_L1_error) __Pyx_DECREF(__pyx_t_3); __pyx_t_3 = 0; /* "aiohttp/_http_parser.pyx":226 * info.append(("compression", self.compression)) * info.append(("upgrade", self.upgrade)) * info.append(("chunked", self.chunked)) # <<<<<<<<<<<<<< * sinfo = ', '.join(name + '=' + repr(val) for name, val in info) * return '' */ __pyx_t_3 = PyTuple_New(2); if (unlikely(!__pyx_t_3)) __PYX_ERR(0, 226, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_3); __Pyx_INCREF(__pyx_n_u_chunked); __Pyx_GIVEREF(__pyx_n_u_chunked); PyTuple_SET_ITEM(__pyx_t_3, 0, __pyx_n_u_chunked); __Pyx_INCREF(__pyx_v_self->chunked); __Pyx_GIVEREF(__pyx_v_self->chunked); PyTuple_SET_ITEM(__pyx_t_3, 1, __pyx_v_self->chunked); __pyx_t_2 = __Pyx_PyList_Append(__pyx_cur_scope->__pyx_v_info, __pyx_t_3); if (unlikely(__pyx_t_2 == ((int)-1))) __PYX_ERR(0, 226, __pyx_L1_error) __Pyx_DECREF(__pyx_t_3); __pyx_t_3 = 0; /* "aiohttp/_http_parser.pyx":227 * info.append(("upgrade", self.upgrade)) * info.append(("chunked", self.chunked)) * sinfo = ', '.join(name + '=' + repr(val) for name, val in 
info) # <<<<<<<<<<<<<< * return '' * */ __pyx_t_3 = __pyx_pf_7aiohttp_12_http_parser_18RawResponseMessage_8__repr___genexpr(((PyObject*)__pyx_cur_scope)); if (unlikely(!__pyx_t_3)) __PYX_ERR(0, 227, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_3); __pyx_t_1 = __Pyx_Generator_Next(__pyx_t_3); if (unlikely(!__pyx_t_1)) __PYX_ERR(0, 227, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_1); __Pyx_DECREF(__pyx_t_3); __pyx_t_3 = 0; __pyx_t_3 = PyUnicode_Join(__pyx_kp_u__2, __pyx_t_1); if (unlikely(!__pyx_t_3)) __PYX_ERR(0, 227, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_3); __Pyx_DECREF(__pyx_t_1); __pyx_t_1 = 0; __pyx_v_sinfo = ((PyObject*)__pyx_t_3); __pyx_t_3 = 0; /* "aiohttp/_http_parser.pyx":228 * info.append(("chunked", self.chunked)) * sinfo = ', '.join(name + '=' + repr(val) for name, val in info) * return '' # <<<<<<<<<<<<<< * * */ __Pyx_XDECREF(__pyx_r); __pyx_t_3 = __Pyx_PyUnicode_ConcatSafe(__pyx_kp_u_RawResponseMessage, __pyx_v_sinfo); if (unlikely(!__pyx_t_3)) __PYX_ERR(0, 228, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_3); __pyx_t_1 = __Pyx_PyUnicode_Concat(__pyx_t_3, __pyx_kp_u__3); if (unlikely(!__pyx_t_1)) __PYX_ERR(0, 228, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_1); __Pyx_DECREF(__pyx_t_3); __pyx_t_3 = 0; __pyx_r = __pyx_t_1; __pyx_t_1 = 0; goto __pyx_L0; /* "aiohttp/_http_parser.pyx":216 * self.chunked = chunked * * def __repr__(self): # <<<<<<<<<<<<<< * info = [] * info.append(("version", self.version)) */ /* function exit code */ __pyx_L1_error:; __Pyx_XDECREF(__pyx_t_1); __Pyx_XDECREF(__pyx_t_3); __Pyx_AddTraceback("aiohttp._http_parser.RawResponseMessage.__repr__", __pyx_clineno, __pyx_lineno, __pyx_filename); __pyx_r = NULL; __pyx_L0:; __Pyx_XDECREF(__pyx_v_sinfo); __Pyx_DECREF(((PyObject *)__pyx_cur_scope)); __Pyx_XGIVEREF(__pyx_r); __Pyx_RefNannyFinishContext(); return __pyx_r; } /* "aiohttp/_http_parser.pyx":194 * @cython.freelist(DEFAULT_FREELIST_SIZE) * cdef class RawResponseMessage: * cdef readonly object version # HttpVersion # <<<<<<<<<<<<<< * cdef readonly int code * cdef readonly str reason */ /* Python wrapper */ static PyObject *__pyx_pw_7aiohttp_12_http_parser_18RawResponseMessage_7version_1__get__(PyObject *__pyx_v_self); /*proto*/ static PyObject *__pyx_pw_7aiohttp_12_http_parser_18RawResponseMessage_7version_1__get__(PyObject *__pyx_v_self) { PyObject *__pyx_r = 0; __Pyx_RefNannyDeclarations __Pyx_RefNannySetupContext("__get__ (wrapper)", 0); __pyx_r = __pyx_pf_7aiohttp_12_http_parser_18RawResponseMessage_7version___get__(((struct __pyx_obj_7aiohttp_12_http_parser_RawResponseMessage *)__pyx_v_self)); /* function exit code */ __Pyx_RefNannyFinishContext(); return __pyx_r; } static PyObject *__pyx_pf_7aiohttp_12_http_parser_18RawResponseMessage_7version___get__(struct __pyx_obj_7aiohttp_12_http_parser_RawResponseMessage *__pyx_v_self) { PyObject *__pyx_r = NULL; __Pyx_RefNannyDeclarations __Pyx_RefNannySetupContext("__get__", 0); __Pyx_XDECREF(__pyx_r); __Pyx_INCREF(__pyx_v_self->version); __pyx_r = __pyx_v_self->version; goto __pyx_L0; /* function exit code */ __pyx_L0:; __Pyx_XGIVEREF(__pyx_r); __Pyx_RefNannyFinishContext(); return __pyx_r; } /* "aiohttp/_http_parser.pyx":195 * cdef class RawResponseMessage: * cdef readonly object version # HttpVersion * cdef readonly int code # <<<<<<<<<<<<<< * cdef readonly str reason * cdef readonly object headers # CIMultiDict */ /* Python wrapper */ static PyObject *__pyx_pw_7aiohttp_12_http_parser_18RawResponseMessage_4code_1__get__(PyObject *__pyx_v_self); /*proto*/ static PyObject 
*__pyx_pw_7aiohttp_12_http_parser_18RawResponseMessage_4code_1__get__(PyObject *__pyx_v_self) { PyObject *__pyx_r = 0; __Pyx_RefNannyDeclarations __Pyx_RefNannySetupContext("__get__ (wrapper)", 0); __pyx_r = __pyx_pf_7aiohttp_12_http_parser_18RawResponseMessage_4code___get__(((struct __pyx_obj_7aiohttp_12_http_parser_RawResponseMessage *)__pyx_v_self)); /* function exit code */ __Pyx_RefNannyFinishContext(); return __pyx_r; } static PyObject *__pyx_pf_7aiohttp_12_http_parser_18RawResponseMessage_4code___get__(struct __pyx_obj_7aiohttp_12_http_parser_RawResponseMessage *__pyx_v_self) { PyObject *__pyx_r = NULL; __Pyx_RefNannyDeclarations PyObject *__pyx_t_1 = NULL; __Pyx_RefNannySetupContext("__get__", 0); __Pyx_XDECREF(__pyx_r); __pyx_t_1 = __Pyx_PyInt_From_int(__pyx_v_self->code); if (unlikely(!__pyx_t_1)) __PYX_ERR(0, 195, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_1); __pyx_r = __pyx_t_1; __pyx_t_1 = 0; goto __pyx_L0; /* function exit code */ __pyx_L1_error:; __Pyx_XDECREF(__pyx_t_1); __Pyx_AddTraceback("aiohttp._http_parser.RawResponseMessage.code.__get__", __pyx_clineno, __pyx_lineno, __pyx_filename); __pyx_r = NULL; __pyx_L0:; __Pyx_XGIVEREF(__pyx_r); __Pyx_RefNannyFinishContext(); return __pyx_r; } /* "aiohttp/_http_parser.pyx":196 * cdef readonly object version # HttpVersion * cdef readonly int code * cdef readonly str reason # <<<<<<<<<<<<<< * cdef readonly object headers # CIMultiDict * cdef readonly object raw_headers # tuple */ /* Python wrapper */ static PyObject *__pyx_pw_7aiohttp_12_http_parser_18RawResponseMessage_6reason_1__get__(PyObject *__pyx_v_self); /*proto*/ static PyObject *__pyx_pw_7aiohttp_12_http_parser_18RawResponseMessage_6reason_1__get__(PyObject *__pyx_v_self) { PyObject *__pyx_r = 0; __Pyx_RefNannyDeclarations __Pyx_RefNannySetupContext("__get__ (wrapper)", 0); __pyx_r = __pyx_pf_7aiohttp_12_http_parser_18RawResponseMessage_6reason___get__(((struct __pyx_obj_7aiohttp_12_http_parser_RawResponseMessage *)__pyx_v_self)); /* function exit code */ __Pyx_RefNannyFinishContext(); return __pyx_r; } static PyObject *__pyx_pf_7aiohttp_12_http_parser_18RawResponseMessage_6reason___get__(struct __pyx_obj_7aiohttp_12_http_parser_RawResponseMessage *__pyx_v_self) { PyObject *__pyx_r = NULL; __Pyx_RefNannyDeclarations __Pyx_RefNannySetupContext("__get__", 0); __Pyx_XDECREF(__pyx_r); __Pyx_INCREF(__pyx_v_self->reason); __pyx_r = __pyx_v_self->reason; goto __pyx_L0; /* function exit code */ __pyx_L0:; __Pyx_XGIVEREF(__pyx_r); __Pyx_RefNannyFinishContext(); return __pyx_r; } /* "aiohttp/_http_parser.pyx":197 * cdef readonly int code * cdef readonly str reason * cdef readonly object headers # CIMultiDict # <<<<<<<<<<<<<< * cdef readonly object raw_headers # tuple * cdef readonly object should_close */ /* Python wrapper */ static PyObject *__pyx_pw_7aiohttp_12_http_parser_18RawResponseMessage_7headers_1__get__(PyObject *__pyx_v_self); /*proto*/ static PyObject *__pyx_pw_7aiohttp_12_http_parser_18RawResponseMessage_7headers_1__get__(PyObject *__pyx_v_self) { PyObject *__pyx_r = 0; __Pyx_RefNannyDeclarations __Pyx_RefNannySetupContext("__get__ (wrapper)", 0); __pyx_r = __pyx_pf_7aiohttp_12_http_parser_18RawResponseMessage_7headers___get__(((struct __pyx_obj_7aiohttp_12_http_parser_RawResponseMessage *)__pyx_v_self)); /* function exit code */ __Pyx_RefNannyFinishContext(); return __pyx_r; } static PyObject *__pyx_pf_7aiohttp_12_http_parser_18RawResponseMessage_7headers___get__(struct __pyx_obj_7aiohttp_12_http_parser_RawResponseMessage *__pyx_v_self) { PyObject *__pyx_r = NULL; 
__Pyx_RefNannyDeclarations __Pyx_RefNannySetupContext("__get__", 0); __Pyx_XDECREF(__pyx_r); __Pyx_INCREF(__pyx_v_self->headers); __pyx_r = __pyx_v_self->headers; goto __pyx_L0; /* function exit code */ __pyx_L0:; __Pyx_XGIVEREF(__pyx_r); __Pyx_RefNannyFinishContext(); return __pyx_r; } /* "aiohttp/_http_parser.pyx":198 * cdef readonly str reason * cdef readonly object headers # CIMultiDict * cdef readonly object raw_headers # tuple # <<<<<<<<<<<<<< * cdef readonly object should_close * cdef readonly object compression */ /* Python wrapper */ static PyObject *__pyx_pw_7aiohttp_12_http_parser_18RawResponseMessage_11raw_headers_1__get__(PyObject *__pyx_v_self); /*proto*/ static PyObject *__pyx_pw_7aiohttp_12_http_parser_18RawResponseMessage_11raw_headers_1__get__(PyObject *__pyx_v_self) { PyObject *__pyx_r = 0; __Pyx_RefNannyDeclarations __Pyx_RefNannySetupContext("__get__ (wrapper)", 0); __pyx_r = __pyx_pf_7aiohttp_12_http_parser_18RawResponseMessage_11raw_headers___get__(((struct __pyx_obj_7aiohttp_12_http_parser_RawResponseMessage *)__pyx_v_self)); /* function exit code */ __Pyx_RefNannyFinishContext(); return __pyx_r; } static PyObject *__pyx_pf_7aiohttp_12_http_parser_18RawResponseMessage_11raw_headers___get__(struct __pyx_obj_7aiohttp_12_http_parser_RawResponseMessage *__pyx_v_self) { PyObject *__pyx_r = NULL; __Pyx_RefNannyDeclarations __Pyx_RefNannySetupContext("__get__", 0); __Pyx_XDECREF(__pyx_r); __Pyx_INCREF(__pyx_v_self->raw_headers); __pyx_r = __pyx_v_self->raw_headers; goto __pyx_L0; /* function exit code */ __pyx_L0:; __Pyx_XGIVEREF(__pyx_r); __Pyx_RefNannyFinishContext(); return __pyx_r; } /* "aiohttp/_http_parser.pyx":199 * cdef readonly object headers # CIMultiDict * cdef readonly object raw_headers # tuple * cdef readonly object should_close # <<<<<<<<<<<<<< * cdef readonly object compression * cdef readonly object upgrade */ /* Python wrapper */ static PyObject *__pyx_pw_7aiohttp_12_http_parser_18RawResponseMessage_12should_close_1__get__(PyObject *__pyx_v_self); /*proto*/ static PyObject *__pyx_pw_7aiohttp_12_http_parser_18RawResponseMessage_12should_close_1__get__(PyObject *__pyx_v_self) { PyObject *__pyx_r = 0; __Pyx_RefNannyDeclarations __Pyx_RefNannySetupContext("__get__ (wrapper)", 0); __pyx_r = __pyx_pf_7aiohttp_12_http_parser_18RawResponseMessage_12should_close___get__(((struct __pyx_obj_7aiohttp_12_http_parser_RawResponseMessage *)__pyx_v_self)); /* function exit code */ __Pyx_RefNannyFinishContext(); return __pyx_r; } static PyObject *__pyx_pf_7aiohttp_12_http_parser_18RawResponseMessage_12should_close___get__(struct __pyx_obj_7aiohttp_12_http_parser_RawResponseMessage *__pyx_v_self) { PyObject *__pyx_r = NULL; __Pyx_RefNannyDeclarations __Pyx_RefNannySetupContext("__get__", 0); __Pyx_XDECREF(__pyx_r); __Pyx_INCREF(__pyx_v_self->should_close); __pyx_r = __pyx_v_self->should_close; goto __pyx_L0; /* function exit code */ __pyx_L0:; __Pyx_XGIVEREF(__pyx_r); __Pyx_RefNannyFinishContext(); return __pyx_r; } /* "aiohttp/_http_parser.pyx":200 * cdef readonly object raw_headers # tuple * cdef readonly object should_close * cdef readonly object compression # <<<<<<<<<<<<<< * cdef readonly object upgrade * cdef readonly object chunked */ /* Python wrapper */ static PyObject *__pyx_pw_7aiohttp_12_http_parser_18RawResponseMessage_11compression_1__get__(PyObject *__pyx_v_self); /*proto*/ static PyObject *__pyx_pw_7aiohttp_12_http_parser_18RawResponseMessage_11compression_1__get__(PyObject *__pyx_v_self) { PyObject *__pyx_r = 0; __Pyx_RefNannyDeclarations 
__Pyx_RefNannySetupContext("__get__ (wrapper)", 0); __pyx_r = __pyx_pf_7aiohttp_12_http_parser_18RawResponseMessage_11compression___get__(((struct __pyx_obj_7aiohttp_12_http_parser_RawResponseMessage *)__pyx_v_self)); /* function exit code */ __Pyx_RefNannyFinishContext(); return __pyx_r; } static PyObject *__pyx_pf_7aiohttp_12_http_parser_18RawResponseMessage_11compression___get__(struct __pyx_obj_7aiohttp_12_http_parser_RawResponseMessage *__pyx_v_self) { PyObject *__pyx_r = NULL; __Pyx_RefNannyDeclarations __Pyx_RefNannySetupContext("__get__", 0); __Pyx_XDECREF(__pyx_r); __Pyx_INCREF(__pyx_v_self->compression); __pyx_r = __pyx_v_self->compression; goto __pyx_L0; /* function exit code */ __pyx_L0:; __Pyx_XGIVEREF(__pyx_r); __Pyx_RefNannyFinishContext(); return __pyx_r; } /* "aiohttp/_http_parser.pyx":201 * cdef readonly object should_close * cdef readonly object compression * cdef readonly object upgrade # <<<<<<<<<<<<<< * cdef readonly object chunked * */ /* Python wrapper */ static PyObject *__pyx_pw_7aiohttp_12_http_parser_18RawResponseMessage_7upgrade_1__get__(PyObject *__pyx_v_self); /*proto*/ static PyObject *__pyx_pw_7aiohttp_12_http_parser_18RawResponseMessage_7upgrade_1__get__(PyObject *__pyx_v_self) { PyObject *__pyx_r = 0; __Pyx_RefNannyDeclarations __Pyx_RefNannySetupContext("__get__ (wrapper)", 0); __pyx_r = __pyx_pf_7aiohttp_12_http_parser_18RawResponseMessage_7upgrade___get__(((struct __pyx_obj_7aiohttp_12_http_parser_RawResponseMessage *)__pyx_v_self)); /* function exit code */ __Pyx_RefNannyFinishContext(); return __pyx_r; } static PyObject *__pyx_pf_7aiohttp_12_http_parser_18RawResponseMessage_7upgrade___get__(struct __pyx_obj_7aiohttp_12_http_parser_RawResponseMessage *__pyx_v_self) { PyObject *__pyx_r = NULL; __Pyx_RefNannyDeclarations __Pyx_RefNannySetupContext("__get__", 0); __Pyx_XDECREF(__pyx_r); __Pyx_INCREF(__pyx_v_self->upgrade); __pyx_r = __pyx_v_self->upgrade; goto __pyx_L0; /* function exit code */ __pyx_L0:; __Pyx_XGIVEREF(__pyx_r); __Pyx_RefNannyFinishContext(); return __pyx_r; } /* "aiohttp/_http_parser.pyx":202 * cdef readonly object compression * cdef readonly object upgrade * cdef readonly object chunked # <<<<<<<<<<<<<< * * def __init__(self, version, code, reason, headers, raw_headers, */ /* Python wrapper */ static PyObject *__pyx_pw_7aiohttp_12_http_parser_18RawResponseMessage_7chunked_1__get__(PyObject *__pyx_v_self); /*proto*/ static PyObject *__pyx_pw_7aiohttp_12_http_parser_18RawResponseMessage_7chunked_1__get__(PyObject *__pyx_v_self) { PyObject *__pyx_r = 0; __Pyx_RefNannyDeclarations __Pyx_RefNannySetupContext("__get__ (wrapper)", 0); __pyx_r = __pyx_pf_7aiohttp_12_http_parser_18RawResponseMessage_7chunked___get__(((struct __pyx_obj_7aiohttp_12_http_parser_RawResponseMessage *)__pyx_v_self)); /* function exit code */ __Pyx_RefNannyFinishContext(); return __pyx_r; } static PyObject *__pyx_pf_7aiohttp_12_http_parser_18RawResponseMessage_7chunked___get__(struct __pyx_obj_7aiohttp_12_http_parser_RawResponseMessage *__pyx_v_self) { PyObject *__pyx_r = NULL; __Pyx_RefNannyDeclarations __Pyx_RefNannySetupContext("__get__", 0); __Pyx_XDECREF(__pyx_r); __Pyx_INCREF(__pyx_v_self->chunked); __pyx_r = __pyx_v_self->chunked; goto __pyx_L0; /* function exit code */ __pyx_L0:; __Pyx_XGIVEREF(__pyx_r); __Pyx_RefNannyFinishContext(); return __pyx_r; } /* "(tree fragment)":1 * def __reduce_cython__(self): # <<<<<<<<<<<<<< * cdef tuple state * cdef object _dict */ /* Python wrapper */ static PyObject 
*__pyx_pw_7aiohttp_12_http_parser_18RawResponseMessage_5__reduce_cython__(PyObject *__pyx_v_self, CYTHON_UNUSED PyObject *unused); /*proto*/ static PyObject *__pyx_pw_7aiohttp_12_http_parser_18RawResponseMessage_5__reduce_cython__(PyObject *__pyx_v_self, CYTHON_UNUSED PyObject *unused) { PyObject *__pyx_r = 0; __Pyx_RefNannyDeclarations __Pyx_RefNannySetupContext("__reduce_cython__ (wrapper)", 0); __pyx_r = __pyx_pf_7aiohttp_12_http_parser_18RawResponseMessage_4__reduce_cython__(((struct __pyx_obj_7aiohttp_12_http_parser_RawResponseMessage *)__pyx_v_self)); /* function exit code */ __Pyx_RefNannyFinishContext(); return __pyx_r; } static PyObject *__pyx_pf_7aiohttp_12_http_parser_18RawResponseMessage_4__reduce_cython__(struct __pyx_obj_7aiohttp_12_http_parser_RawResponseMessage *__pyx_v_self) { PyObject *__pyx_v_state = 0; PyObject *__pyx_v__dict = 0; int __pyx_v_use_setstate; PyObject *__pyx_r = NULL; __Pyx_RefNannyDeclarations PyObject *__pyx_t_1 = NULL; PyObject *__pyx_t_2 = NULL; int __pyx_t_3; int __pyx_t_4; int __pyx_t_5; PyObject *__pyx_t_6 = NULL; __Pyx_RefNannySetupContext("__reduce_cython__", 0); /* "(tree fragment)":5 * cdef object _dict * cdef bint use_setstate * state = (self.chunked, self.code, self.compression, self.headers, self.raw_headers, self.reason, self.should_close, self.upgrade, self.version) # <<<<<<<<<<<<<< * _dict = getattr(self, '__dict__', None) * if _dict is not None: */ __pyx_t_1 = __Pyx_PyInt_From_int(__pyx_v_self->code); if (unlikely(!__pyx_t_1)) __PYX_ERR(1, 5, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_1); __pyx_t_2 = PyTuple_New(9); if (unlikely(!__pyx_t_2)) __PYX_ERR(1, 5, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_2); __Pyx_INCREF(__pyx_v_self->chunked); __Pyx_GIVEREF(__pyx_v_self->chunked); PyTuple_SET_ITEM(__pyx_t_2, 0, __pyx_v_self->chunked); __Pyx_GIVEREF(__pyx_t_1); PyTuple_SET_ITEM(__pyx_t_2, 1, __pyx_t_1); __Pyx_INCREF(__pyx_v_self->compression); __Pyx_GIVEREF(__pyx_v_self->compression); PyTuple_SET_ITEM(__pyx_t_2, 2, __pyx_v_self->compression); __Pyx_INCREF(__pyx_v_self->headers); __Pyx_GIVEREF(__pyx_v_self->headers); PyTuple_SET_ITEM(__pyx_t_2, 3, __pyx_v_self->headers); __Pyx_INCREF(__pyx_v_self->raw_headers); __Pyx_GIVEREF(__pyx_v_self->raw_headers); PyTuple_SET_ITEM(__pyx_t_2, 4, __pyx_v_self->raw_headers); __Pyx_INCREF(__pyx_v_self->reason); __Pyx_GIVEREF(__pyx_v_self->reason); PyTuple_SET_ITEM(__pyx_t_2, 5, __pyx_v_self->reason); __Pyx_INCREF(__pyx_v_self->should_close); __Pyx_GIVEREF(__pyx_v_self->should_close); PyTuple_SET_ITEM(__pyx_t_2, 6, __pyx_v_self->should_close); __Pyx_INCREF(__pyx_v_self->upgrade); __Pyx_GIVEREF(__pyx_v_self->upgrade); PyTuple_SET_ITEM(__pyx_t_2, 7, __pyx_v_self->upgrade); __Pyx_INCREF(__pyx_v_self->version); __Pyx_GIVEREF(__pyx_v_self->version); PyTuple_SET_ITEM(__pyx_t_2, 8, __pyx_v_self->version); __pyx_t_1 = 0; __pyx_v_state = ((PyObject*)__pyx_t_2); __pyx_t_2 = 0; /* "(tree fragment)":6 * cdef bint use_setstate * state = (self.chunked, self.code, self.compression, self.headers, self.raw_headers, self.reason, self.should_close, self.upgrade, self.version) * _dict = getattr(self, '__dict__', None) # <<<<<<<<<<<<<< * if _dict is not None: * state += (_dict,) */ __pyx_t_2 = __Pyx_GetAttr3(((PyObject *)__pyx_v_self), __pyx_n_s_dict, Py_None); if (unlikely(!__pyx_t_2)) __PYX_ERR(1, 6, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_2); __pyx_v__dict = __pyx_t_2; __pyx_t_2 = 0; /* "(tree fragment)":7 * state = (self.chunked, self.code, self.compression, self.headers, self.raw_headers, self.reason, self.should_close, self.upgrade, 
self.version) * _dict = getattr(self, '__dict__', None) * if _dict is not None: # <<<<<<<<<<<<<< * state += (_dict,) * use_setstate = True */ __pyx_t_3 = (__pyx_v__dict != Py_None); __pyx_t_4 = (__pyx_t_3 != 0); if (__pyx_t_4) { /* "(tree fragment)":8 * _dict = getattr(self, '__dict__', None) * if _dict is not None: * state += (_dict,) # <<<<<<<<<<<<<< * use_setstate = True * else: */ __pyx_t_2 = PyTuple_New(1); if (unlikely(!__pyx_t_2)) __PYX_ERR(1, 8, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_2); __Pyx_INCREF(__pyx_v__dict); __Pyx_GIVEREF(__pyx_v__dict); PyTuple_SET_ITEM(__pyx_t_2, 0, __pyx_v__dict); __pyx_t_1 = PyNumber_InPlaceAdd(__pyx_v_state, __pyx_t_2); if (unlikely(!__pyx_t_1)) __PYX_ERR(1, 8, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_1); __Pyx_DECREF(__pyx_t_2); __pyx_t_2 = 0; __Pyx_DECREF_SET(__pyx_v_state, ((PyObject*)__pyx_t_1)); __pyx_t_1 = 0; /* "(tree fragment)":9 * if _dict is not None: * state += (_dict,) * use_setstate = True # <<<<<<<<<<<<<< * else: * use_setstate = self.chunked is not None or self.compression is not None or self.headers is not None or self.raw_headers is not None or self.reason is not None or self.should_close is not None or self.upgrade is not None or self.version is not None */ __pyx_v_use_setstate = 1; /* "(tree fragment)":7 * state = (self.chunked, self.code, self.compression, self.headers, self.raw_headers, self.reason, self.should_close, self.upgrade, self.version) * _dict = getattr(self, '__dict__', None) * if _dict is not None: # <<<<<<<<<<<<<< * state += (_dict,) * use_setstate = True */ goto __pyx_L3; } /* "(tree fragment)":11 * use_setstate = True * else: * use_setstate = self.chunked is not None or self.compression is not None or self.headers is not None or self.raw_headers is not None or self.reason is not None or self.should_close is not None or self.upgrade is not None or self.version is not None # <<<<<<<<<<<<<< * if use_setstate: * return __pyx_unpickle_RawResponseMessage, (type(self), 0xc7706dc, None), state */ /*else*/ { __pyx_t_3 = (__pyx_v_self->chunked != Py_None); __pyx_t_5 = (__pyx_t_3 != 0); if (!__pyx_t_5) { } else { __pyx_t_4 = __pyx_t_5; goto __pyx_L4_bool_binop_done; } __pyx_t_5 = (__pyx_v_self->compression != Py_None); __pyx_t_3 = (__pyx_t_5 != 0); if (!__pyx_t_3) { } else { __pyx_t_4 = __pyx_t_3; goto __pyx_L4_bool_binop_done; } __pyx_t_3 = (__pyx_v_self->headers != Py_None); __pyx_t_5 = (__pyx_t_3 != 0); if (!__pyx_t_5) { } else { __pyx_t_4 = __pyx_t_5; goto __pyx_L4_bool_binop_done; } __pyx_t_5 = (__pyx_v_self->raw_headers != Py_None); __pyx_t_3 = (__pyx_t_5 != 0); if (!__pyx_t_3) { } else { __pyx_t_4 = __pyx_t_3; goto __pyx_L4_bool_binop_done; } __pyx_t_3 = (__pyx_v_self->reason != ((PyObject*)Py_None)); __pyx_t_5 = (__pyx_t_3 != 0); if (!__pyx_t_5) { } else { __pyx_t_4 = __pyx_t_5; goto __pyx_L4_bool_binop_done; } __pyx_t_5 = (__pyx_v_self->should_close != Py_None); __pyx_t_3 = (__pyx_t_5 != 0); if (!__pyx_t_3) { } else { __pyx_t_4 = __pyx_t_3; goto __pyx_L4_bool_binop_done; } __pyx_t_3 = (__pyx_v_self->upgrade != Py_None); __pyx_t_5 = (__pyx_t_3 != 0); if (!__pyx_t_5) { } else { __pyx_t_4 = __pyx_t_5; goto __pyx_L4_bool_binop_done; } __pyx_t_5 = (__pyx_v_self->version != Py_None); __pyx_t_3 = (__pyx_t_5 != 0); __pyx_t_4 = __pyx_t_3; __pyx_L4_bool_binop_done:; __pyx_v_use_setstate = __pyx_t_4; } __pyx_L3:; /* "(tree fragment)":12 * else: * use_setstate = self.chunked is not None or self.compression is not None or self.headers is not None or self.raw_headers is not None or self.reason is not None or self.should_close is not None 
or self.upgrade is not None or self.version is not None * if use_setstate: # <<<<<<<<<<<<<< * return __pyx_unpickle_RawResponseMessage, (type(self), 0xc7706dc, None), state * else: */ __pyx_t_4 = (__pyx_v_use_setstate != 0); if (__pyx_t_4) { /* "(tree fragment)":13 * use_setstate = self.chunked is not None or self.compression is not None or self.headers is not None or self.raw_headers is not None or self.reason is not None or self.should_close is not None or self.upgrade is not None or self.version is not None * if use_setstate: * return __pyx_unpickle_RawResponseMessage, (type(self), 0xc7706dc, None), state # <<<<<<<<<<<<<< * else: * return __pyx_unpickle_RawResponseMessage, (type(self), 0xc7706dc, state) */ __Pyx_XDECREF(__pyx_r); __Pyx_GetModuleGlobalName(__pyx_t_1, __pyx_n_s_pyx_unpickle_RawResponseMessag); if (unlikely(!__pyx_t_1)) __PYX_ERR(1, 13, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_1); __pyx_t_2 = PyTuple_New(3); if (unlikely(!__pyx_t_2)) __PYX_ERR(1, 13, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_2); __Pyx_INCREF(((PyObject *)Py_TYPE(((PyObject *)__pyx_v_self)))); __Pyx_GIVEREF(((PyObject *)Py_TYPE(((PyObject *)__pyx_v_self)))); PyTuple_SET_ITEM(__pyx_t_2, 0, ((PyObject *)Py_TYPE(((PyObject *)__pyx_v_self)))); __Pyx_INCREF(__pyx_int_209127132); __Pyx_GIVEREF(__pyx_int_209127132); PyTuple_SET_ITEM(__pyx_t_2, 1, __pyx_int_209127132); __Pyx_INCREF(Py_None); __Pyx_GIVEREF(Py_None); PyTuple_SET_ITEM(__pyx_t_2, 2, Py_None); __pyx_t_6 = PyTuple_New(3); if (unlikely(!__pyx_t_6)) __PYX_ERR(1, 13, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_6); __Pyx_GIVEREF(__pyx_t_1); PyTuple_SET_ITEM(__pyx_t_6, 0, __pyx_t_1); __Pyx_GIVEREF(__pyx_t_2); PyTuple_SET_ITEM(__pyx_t_6, 1, __pyx_t_2); __Pyx_INCREF(__pyx_v_state); __Pyx_GIVEREF(__pyx_v_state); PyTuple_SET_ITEM(__pyx_t_6, 2, __pyx_v_state); __pyx_t_1 = 0; __pyx_t_2 = 0; __pyx_r = __pyx_t_6; __pyx_t_6 = 0; goto __pyx_L0; /* "(tree fragment)":12 * else: * use_setstate = self.chunked is not None or self.compression is not None or self.headers is not None or self.raw_headers is not None or self.reason is not None or self.should_close is not None or self.upgrade is not None or self.version is not None * if use_setstate: # <<<<<<<<<<<<<< * return __pyx_unpickle_RawResponseMessage, (type(self), 0xc7706dc, None), state * else: */ } /* "(tree fragment)":15 * return __pyx_unpickle_RawResponseMessage, (type(self), 0xc7706dc, None), state * else: * return __pyx_unpickle_RawResponseMessage, (type(self), 0xc7706dc, state) # <<<<<<<<<<<<<< * def __setstate_cython__(self, __pyx_state): * __pyx_unpickle_RawResponseMessage__set_state(self, __pyx_state) */ /*else*/ { __Pyx_XDECREF(__pyx_r); __Pyx_GetModuleGlobalName(__pyx_t_6, __pyx_n_s_pyx_unpickle_RawResponseMessag); if (unlikely(!__pyx_t_6)) __PYX_ERR(1, 15, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_6); __pyx_t_2 = PyTuple_New(3); if (unlikely(!__pyx_t_2)) __PYX_ERR(1, 15, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_2); __Pyx_INCREF(((PyObject *)Py_TYPE(((PyObject *)__pyx_v_self)))); __Pyx_GIVEREF(((PyObject *)Py_TYPE(((PyObject *)__pyx_v_self)))); PyTuple_SET_ITEM(__pyx_t_2, 0, ((PyObject *)Py_TYPE(((PyObject *)__pyx_v_self)))); __Pyx_INCREF(__pyx_int_209127132); __Pyx_GIVEREF(__pyx_int_209127132); PyTuple_SET_ITEM(__pyx_t_2, 1, __pyx_int_209127132); __Pyx_INCREF(__pyx_v_state); __Pyx_GIVEREF(__pyx_v_state); PyTuple_SET_ITEM(__pyx_t_2, 2, __pyx_v_state); __pyx_t_1 = PyTuple_New(2); if (unlikely(!__pyx_t_1)) __PYX_ERR(1, 15, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_1); __Pyx_GIVEREF(__pyx_t_6); PyTuple_SET_ITEM(__pyx_t_1, 0, __pyx_t_6); 
__Pyx_GIVEREF(__pyx_t_2); PyTuple_SET_ITEM(__pyx_t_1, 1, __pyx_t_2); __pyx_t_6 = 0; __pyx_t_2 = 0; __pyx_r = __pyx_t_1; __pyx_t_1 = 0; goto __pyx_L0; } /* "(tree fragment)":1 * def __reduce_cython__(self): # <<<<<<<<<<<<<< * cdef tuple state * cdef object _dict */ /* function exit code */ __pyx_L1_error:; __Pyx_XDECREF(__pyx_t_1); __Pyx_XDECREF(__pyx_t_2); __Pyx_XDECREF(__pyx_t_6); __Pyx_AddTraceback("aiohttp._http_parser.RawResponseMessage.__reduce_cython__", __pyx_clineno, __pyx_lineno, __pyx_filename); __pyx_r = NULL; __pyx_L0:; __Pyx_XDECREF(__pyx_v_state); __Pyx_XDECREF(__pyx_v__dict); __Pyx_XGIVEREF(__pyx_r); __Pyx_RefNannyFinishContext(); return __pyx_r; } /* "(tree fragment)":16 * else: * return __pyx_unpickle_RawResponseMessage, (type(self), 0xc7706dc, state) * def __setstate_cython__(self, __pyx_state): # <<<<<<<<<<<<<< * __pyx_unpickle_RawResponseMessage__set_state(self, __pyx_state) */ /* Python wrapper */ static PyObject *__pyx_pw_7aiohttp_12_http_parser_18RawResponseMessage_7__setstate_cython__(PyObject *__pyx_v_self, PyObject *__pyx_v___pyx_state); /*proto*/ static PyObject *__pyx_pw_7aiohttp_12_http_parser_18RawResponseMessage_7__setstate_cython__(PyObject *__pyx_v_self, PyObject *__pyx_v___pyx_state) { PyObject *__pyx_r = 0; __Pyx_RefNannyDeclarations __Pyx_RefNannySetupContext("__setstate_cython__ (wrapper)", 0); __pyx_r = __pyx_pf_7aiohttp_12_http_parser_18RawResponseMessage_6__setstate_cython__(((struct __pyx_obj_7aiohttp_12_http_parser_RawResponseMessage *)__pyx_v_self), ((PyObject *)__pyx_v___pyx_state)); /* function exit code */ __Pyx_RefNannyFinishContext(); return __pyx_r; } static PyObject *__pyx_pf_7aiohttp_12_http_parser_18RawResponseMessage_6__setstate_cython__(struct __pyx_obj_7aiohttp_12_http_parser_RawResponseMessage *__pyx_v_self, PyObject *__pyx_v___pyx_state) { PyObject *__pyx_r = NULL; __Pyx_RefNannyDeclarations PyObject *__pyx_t_1 = NULL; __Pyx_RefNannySetupContext("__setstate_cython__", 0); /* "(tree fragment)":17 * return __pyx_unpickle_RawResponseMessage, (type(self), 0xc7706dc, state) * def __setstate_cython__(self, __pyx_state): * __pyx_unpickle_RawResponseMessage__set_state(self, __pyx_state) # <<<<<<<<<<<<<< */ if (!(likely(PyTuple_CheckExact(__pyx_v___pyx_state))||((__pyx_v___pyx_state) == Py_None)||(PyErr_Format(PyExc_TypeError, "Expected %.16s, got %.200s", "tuple", Py_TYPE(__pyx_v___pyx_state)->tp_name), 0))) __PYX_ERR(1, 17, __pyx_L1_error) __pyx_t_1 = __pyx_f_7aiohttp_12_http_parser___pyx_unpickle_RawResponseMessage__set_state(__pyx_v_self, ((PyObject*)__pyx_v___pyx_state)); if (unlikely(!__pyx_t_1)) __PYX_ERR(1, 17, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_1); __Pyx_DECREF(__pyx_t_1); __pyx_t_1 = 0; /* "(tree fragment)":16 * else: * return __pyx_unpickle_RawResponseMessage, (type(self), 0xc7706dc, state) * def __setstate_cython__(self, __pyx_state): # <<<<<<<<<<<<<< * __pyx_unpickle_RawResponseMessage__set_state(self, __pyx_state) */ /* function exit code */ __pyx_r = Py_None; __Pyx_INCREF(Py_None); goto __pyx_L0; __pyx_L1_error:; __Pyx_XDECREF(__pyx_t_1); __Pyx_AddTraceback("aiohttp._http_parser.RawResponseMessage.__setstate_cython__", __pyx_clineno, __pyx_lineno, __pyx_filename); __pyx_r = NULL; __pyx_L0:; __Pyx_XGIVEREF(__pyx_r); __Pyx_RefNannyFinishContext(); return __pyx_r; } /* "aiohttp/_http_parser.pyx":231 * * * cdef _new_response_message(object version, # <<<<<<<<<<<<<< * int code, * str reason, */ static PyObject *__pyx_f_7aiohttp_12_http_parser__new_response_message(PyObject *__pyx_v_version, int __pyx_v_code, PyObject 
*__pyx_v_reason, PyObject *__pyx_v_headers, PyObject *__pyx_v_raw_headers, int __pyx_v_should_close, PyObject *__pyx_v_compression, int __pyx_v_upgrade, int __pyx_v_chunked) { struct __pyx_obj_7aiohttp_12_http_parser_RawResponseMessage *__pyx_v_ret = 0; PyObject *__pyx_r = NULL; __Pyx_RefNannyDeclarations PyObject *__pyx_t_1 = NULL; __Pyx_RefNannySetupContext("_new_response_message", 0); /* "aiohttp/_http_parser.pyx":241 * bint chunked): * cdef RawResponseMessage ret * ret = RawResponseMessage.__new__(RawResponseMessage) # <<<<<<<<<<<<<< * ret.version = version * ret.code = code */ __pyx_t_1 = ((PyObject *)__pyx_tp_new_7aiohttp_12_http_parser_RawResponseMessage(((PyTypeObject *)__pyx_ptype_7aiohttp_12_http_parser_RawResponseMessage), __pyx_empty_tuple, NULL)); if (unlikely(!__pyx_t_1)) __PYX_ERR(0, 241, __pyx_L1_error) __Pyx_GOTREF(((PyObject *)__pyx_t_1)); __pyx_v_ret = ((struct __pyx_obj_7aiohttp_12_http_parser_RawResponseMessage *)__pyx_t_1); __pyx_t_1 = 0; /* "aiohttp/_http_parser.pyx":242 * cdef RawResponseMessage ret * ret = RawResponseMessage.__new__(RawResponseMessage) * ret.version = version # <<<<<<<<<<<<<< * ret.code = code * ret.reason = reason */ __Pyx_INCREF(__pyx_v_version); __Pyx_GIVEREF(__pyx_v_version); __Pyx_GOTREF(__pyx_v_ret->version); __Pyx_DECREF(__pyx_v_ret->version); __pyx_v_ret->version = __pyx_v_version; /* "aiohttp/_http_parser.pyx":243 * ret = RawResponseMessage.__new__(RawResponseMessage) * ret.version = version * ret.code = code # <<<<<<<<<<<<<< * ret.reason = reason * ret.headers = headers */ __pyx_v_ret->code = __pyx_v_code; /* "aiohttp/_http_parser.pyx":244 * ret.version = version * ret.code = code * ret.reason = reason # <<<<<<<<<<<<<< * ret.headers = headers * ret.raw_headers = raw_headers */ __Pyx_INCREF(__pyx_v_reason); __Pyx_GIVEREF(__pyx_v_reason); __Pyx_GOTREF(__pyx_v_ret->reason); __Pyx_DECREF(__pyx_v_ret->reason); __pyx_v_ret->reason = __pyx_v_reason; /* "aiohttp/_http_parser.pyx":245 * ret.code = code * ret.reason = reason * ret.headers = headers # <<<<<<<<<<<<<< * ret.raw_headers = raw_headers * ret.should_close = should_close */ __Pyx_INCREF(__pyx_v_headers); __Pyx_GIVEREF(__pyx_v_headers); __Pyx_GOTREF(__pyx_v_ret->headers); __Pyx_DECREF(__pyx_v_ret->headers); __pyx_v_ret->headers = __pyx_v_headers; /* "aiohttp/_http_parser.pyx":246 * ret.reason = reason * ret.headers = headers * ret.raw_headers = raw_headers # <<<<<<<<<<<<<< * ret.should_close = should_close * ret.compression = compression */ __Pyx_INCREF(__pyx_v_raw_headers); __Pyx_GIVEREF(__pyx_v_raw_headers); __Pyx_GOTREF(__pyx_v_ret->raw_headers); __Pyx_DECREF(__pyx_v_ret->raw_headers); __pyx_v_ret->raw_headers = __pyx_v_raw_headers; /* "aiohttp/_http_parser.pyx":247 * ret.headers = headers * ret.raw_headers = raw_headers * ret.should_close = should_close # <<<<<<<<<<<<<< * ret.compression = compression * ret.upgrade = upgrade */ __pyx_t_1 = __Pyx_PyBool_FromLong(__pyx_v_should_close); if (unlikely(!__pyx_t_1)) __PYX_ERR(0, 247, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_1); __Pyx_GIVEREF(__pyx_t_1); __Pyx_GOTREF(__pyx_v_ret->should_close); __Pyx_DECREF(__pyx_v_ret->should_close); __pyx_v_ret->should_close = __pyx_t_1; __pyx_t_1 = 0; /* "aiohttp/_http_parser.pyx":248 * ret.raw_headers = raw_headers * ret.should_close = should_close * ret.compression = compression # <<<<<<<<<<<<<< * ret.upgrade = upgrade * ret.chunked = chunked */ __Pyx_INCREF(__pyx_v_compression); __Pyx_GIVEREF(__pyx_v_compression); __Pyx_GOTREF(__pyx_v_ret->compression); __Pyx_DECREF(__pyx_v_ret->compression); 
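/* _new_response_message is a fast internal constructor: it allocates the extension type
 * with tp_new (bypassing __init__) and then fills each field directly, which is what the
 * repetitive INCREF/GIVEREF/GOTREF/DECREF blocks around this point implement. Condensed
 * from the .pyx fragments quoted in this function, the compiled source is essentially:
 *
 *     cdef _new_response_message(object version, int code, str reason,
 *                                object headers, object raw_headers,
 *                                bint should_close, object compression,
 *                                bint upgrade, bint chunked):
 *         cdef RawResponseMessage ret
 *         ret = RawResponseMessage.__new__(RawResponseMessage)
 *         ret.version = version
 *         ret.code = code
 *         ret.reason = reason
 *         ret.headers = headers
 *         ret.raw_headers = raw_headers
 *         ret.should_close = should_close
 *         ret.compression = compression
 *         ret.upgrade = upgrade
 *         ret.chunked = chunked
 *         return ret
 */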
__pyx_v_ret->compression = __pyx_v_compression; /* "aiohttp/_http_parser.pyx":249 * ret.should_close = should_close * ret.compression = compression * ret.upgrade = upgrade # <<<<<<<<<<<<<< * ret.chunked = chunked * return ret */ __pyx_t_1 = __Pyx_PyBool_FromLong(__pyx_v_upgrade); if (unlikely(!__pyx_t_1)) __PYX_ERR(0, 249, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_1); __Pyx_GIVEREF(__pyx_t_1); __Pyx_GOTREF(__pyx_v_ret->upgrade); __Pyx_DECREF(__pyx_v_ret->upgrade); __pyx_v_ret->upgrade = __pyx_t_1; __pyx_t_1 = 0; /* "aiohttp/_http_parser.pyx":250 * ret.compression = compression * ret.upgrade = upgrade * ret.chunked = chunked # <<<<<<<<<<<<<< * return ret * */ __pyx_t_1 = __Pyx_PyBool_FromLong(__pyx_v_chunked); if (unlikely(!__pyx_t_1)) __PYX_ERR(0, 250, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_1); __Pyx_GIVEREF(__pyx_t_1); __Pyx_GOTREF(__pyx_v_ret->chunked); __Pyx_DECREF(__pyx_v_ret->chunked); __pyx_v_ret->chunked = __pyx_t_1; __pyx_t_1 = 0; /* "aiohttp/_http_parser.pyx":251 * ret.upgrade = upgrade * ret.chunked = chunked * return ret # <<<<<<<<<<<<<< * * */ __Pyx_XDECREF(__pyx_r); __Pyx_INCREF(((PyObject *)__pyx_v_ret)); __pyx_r = ((PyObject *)__pyx_v_ret); goto __pyx_L0; /* "aiohttp/_http_parser.pyx":231 * * * cdef _new_response_message(object version, # <<<<<<<<<<<<<< * int code, * str reason, */ /* function exit code */ __pyx_L1_error:; __Pyx_XDECREF(__pyx_t_1); __Pyx_AddTraceback("aiohttp._http_parser._new_response_message", __pyx_clineno, __pyx_lineno, __pyx_filename); __pyx_r = 0; __pyx_L0:; __Pyx_XDECREF((PyObject *)__pyx_v_ret); __Pyx_XGIVEREF(__pyx_r); __Pyx_RefNannyFinishContext(); return __pyx_r; } /* "aiohttp/_http_parser.pyx":293 * Py_buffer py_buf * * def __cinit__(self): # <<<<<<<<<<<<<< * self._cparser = \ * PyMem_Malloc(sizeof(cparser.http_parser)) */ /* Python wrapper */ static int __pyx_pw_7aiohttp_12_http_parser_10HttpParser_1__cinit__(PyObject *__pyx_v_self, PyObject *__pyx_args, PyObject *__pyx_kwds); /*proto*/ static int __pyx_pw_7aiohttp_12_http_parser_10HttpParser_1__cinit__(PyObject *__pyx_v_self, PyObject *__pyx_args, PyObject *__pyx_kwds) { int __pyx_r; __Pyx_RefNannyDeclarations __Pyx_RefNannySetupContext("__cinit__ (wrapper)", 0); if (unlikely(PyTuple_GET_SIZE(__pyx_args) > 0)) { __Pyx_RaiseArgtupleInvalid("__cinit__", 1, 0, 0, PyTuple_GET_SIZE(__pyx_args)); return -1;} if (unlikely(__pyx_kwds) && unlikely(PyDict_Size(__pyx_kwds) > 0) && unlikely(!__Pyx_CheckKeywordStrings(__pyx_kwds, "__cinit__", 0))) return -1; __pyx_r = __pyx_pf_7aiohttp_12_http_parser_10HttpParser___cinit__(((struct __pyx_obj_7aiohttp_12_http_parser_HttpParser *)__pyx_v_self)); /* function exit code */ __Pyx_RefNannyFinishContext(); return __pyx_r; } static int __pyx_pf_7aiohttp_12_http_parser_10HttpParser___cinit__(struct __pyx_obj_7aiohttp_12_http_parser_HttpParser *__pyx_v_self) { int __pyx_r; __Pyx_RefNannyDeclarations int __pyx_t_1; __Pyx_RefNannySetupContext("__cinit__", 0); /* "aiohttp/_http_parser.pyx":294 * * def __cinit__(self): * self._cparser = \ # <<<<<<<<<<<<<< * PyMem_Malloc(sizeof(cparser.http_parser)) * if self._cparser is NULL: */ __pyx_v_self->_cparser = ((struct http_parser *)PyMem_Malloc((sizeof(struct http_parser)))); /* "aiohttp/_http_parser.pyx":296 * self._cparser = \ * PyMem_Malloc(sizeof(cparser.http_parser)) * if self._cparser is NULL: # <<<<<<<<<<<<<< * raise MemoryError() * */ __pyx_t_1 = ((__pyx_v_self->_cparser == NULL) != 0); if (unlikely(__pyx_t_1)) { /* "aiohttp/_http_parser.pyx":297 * PyMem_Malloc(sizeof(cparser.http_parser)) * if self._cparser is NULL: * raise 
MemoryError() # <<<<<<<<<<<<<< * * self._csettings = \ */ PyErr_NoMemory(); __PYX_ERR(0, 297, __pyx_L1_error) /* "aiohttp/_http_parser.pyx":296 * self._cparser = \ * PyMem_Malloc(sizeof(cparser.http_parser)) * if self._cparser is NULL: # <<<<<<<<<<<<<< * raise MemoryError() * */ } /* "aiohttp/_http_parser.pyx":299 * raise MemoryError() * * self._csettings = \ # <<<<<<<<<<<<<< * PyMem_Malloc(sizeof(cparser.http_parser_settings)) * if self._csettings is NULL: */ __pyx_v_self->_csettings = ((struct http_parser_settings *)PyMem_Malloc((sizeof(struct http_parser_settings)))); /* "aiohttp/_http_parser.pyx":301 * self._csettings = \ * PyMem_Malloc(sizeof(cparser.http_parser_settings)) * if self._csettings is NULL: # <<<<<<<<<<<<<< * raise MemoryError() * */ __pyx_t_1 = ((__pyx_v_self->_csettings == NULL) != 0); if (unlikely(__pyx_t_1)) { /* "aiohttp/_http_parser.pyx":302 * PyMem_Malloc(sizeof(cparser.http_parser_settings)) * if self._csettings is NULL: * raise MemoryError() # <<<<<<<<<<<<<< * * def __dealloc__(self): */ PyErr_NoMemory(); __PYX_ERR(0, 302, __pyx_L1_error) /* "aiohttp/_http_parser.pyx":301 * self._csettings = \ * PyMem_Malloc(sizeof(cparser.http_parser_settings)) * if self._csettings is NULL: # <<<<<<<<<<<<<< * raise MemoryError() * */ } /* "aiohttp/_http_parser.pyx":293 * Py_buffer py_buf * * def __cinit__(self): # <<<<<<<<<<<<<< * self._cparser = \ * PyMem_Malloc(sizeof(cparser.http_parser)) */ /* function exit code */ __pyx_r = 0; goto __pyx_L0; __pyx_L1_error:; __Pyx_AddTraceback("aiohttp._http_parser.HttpParser.__cinit__", __pyx_clineno, __pyx_lineno, __pyx_filename); __pyx_r = -1; __pyx_L0:; __Pyx_RefNannyFinishContext(); return __pyx_r; } /* "aiohttp/_http_parser.pyx":304 * raise MemoryError() * * def __dealloc__(self): # <<<<<<<<<<<<<< * PyMem_Free(self._cparser) * PyMem_Free(self._csettings) */ /* Python wrapper */ static void __pyx_pw_7aiohttp_12_http_parser_10HttpParser_3__dealloc__(PyObject *__pyx_v_self); /*proto*/ static void __pyx_pw_7aiohttp_12_http_parser_10HttpParser_3__dealloc__(PyObject *__pyx_v_self) { __Pyx_RefNannyDeclarations __Pyx_RefNannySetupContext("__dealloc__ (wrapper)", 0); __pyx_pf_7aiohttp_12_http_parser_10HttpParser_2__dealloc__(((struct __pyx_obj_7aiohttp_12_http_parser_HttpParser *)__pyx_v_self)); /* function exit code */ __Pyx_RefNannyFinishContext(); } static void __pyx_pf_7aiohttp_12_http_parser_10HttpParser_2__dealloc__(struct __pyx_obj_7aiohttp_12_http_parser_HttpParser *__pyx_v_self) { __Pyx_RefNannyDeclarations __Pyx_RefNannySetupContext("__dealloc__", 0); /* "aiohttp/_http_parser.pyx":305 * * def __dealloc__(self): * PyMem_Free(self._cparser) # <<<<<<<<<<<<<< * PyMem_Free(self._csettings) * */ PyMem_Free(__pyx_v_self->_cparser); /* "aiohttp/_http_parser.pyx":306 * def __dealloc__(self): * PyMem_Free(self._cparser) * PyMem_Free(self._csettings) # <<<<<<<<<<<<<< * * cdef _init(self, cparser.http_parser_type mode, */ PyMem_Free(__pyx_v_self->_csettings); /* "aiohttp/_http_parser.pyx":304 * raise MemoryError() * * def __dealloc__(self): # <<<<<<<<<<<<<< * PyMem_Free(self._cparser) * PyMem_Free(self._csettings) */ /* function exit code */ __Pyx_RefNannyFinishContext(); } /* "aiohttp/_http_parser.pyx":308 * PyMem_Free(self._csettings) * * cdef _init(self, cparser.http_parser_type mode, # <<<<<<<<<<<<<< * object protocol, object loop, object timer=None, * size_t max_line_size=8190, size_t max_headers=32768, */ static PyObject *__pyx_f_7aiohttp_12_http_parser_10HttpParser__init(struct __pyx_obj_7aiohttp_12_http_parser_HttpParser *__pyx_v_self, 
enum http_parser_type __pyx_v_mode, PyObject *__pyx_v_protocol, PyObject *__pyx_v_loop, struct __pyx_opt_args_7aiohttp_12_http_parser_10HttpParser__init *__pyx_optional_args) { /* "aiohttp/_http_parser.pyx":309 * * cdef _init(self, cparser.http_parser_type mode, * object protocol, object loop, object timer=None, # <<<<<<<<<<<<<< * size_t max_line_size=8190, size_t max_headers=32768, * size_t max_field_size=8190, payload_exception=None, */ PyObject *__pyx_v_timer = ((PyObject *)Py_None); size_t __pyx_v_max_line_size = ((size_t)0x1FFE); size_t __pyx_v_max_headers = ((size_t)0x8000); size_t __pyx_v_max_field_size = ((size_t)0x1FFE); /* "aiohttp/_http_parser.pyx":311 * object protocol, object loop, object timer=None, * size_t max_line_size=8190, size_t max_headers=32768, * size_t max_field_size=8190, payload_exception=None, # <<<<<<<<<<<<<< * bint response_with_body=True, bint auto_decompress=True): * cparser.http_parser_init(self._cparser, mode) */ PyObject *__pyx_v_payload_exception = ((PyObject *)Py_None); /* "aiohttp/_http_parser.pyx":312 * size_t max_line_size=8190, size_t max_headers=32768, * size_t max_field_size=8190, payload_exception=None, * bint response_with_body=True, bint auto_decompress=True): # <<<<<<<<<<<<<< * cparser.http_parser_init(self._cparser, mode) * self._cparser.data = self */ int __pyx_v_response_with_body = ((int)1); int __pyx_v_auto_decompress = ((int)1); PyObject *__pyx_r = NULL; __Pyx_RefNannyDeclarations PyObject *__pyx_t_1 = NULL; __Pyx_RefNannySetupContext("_init", 0); if (__pyx_optional_args) { if (__pyx_optional_args->__pyx_n > 0) { __pyx_v_timer = __pyx_optional_args->timer; if (__pyx_optional_args->__pyx_n > 1) { __pyx_v_max_line_size = __pyx_optional_args->max_line_size; if (__pyx_optional_args->__pyx_n > 2) { __pyx_v_max_headers = __pyx_optional_args->max_headers; if (__pyx_optional_args->__pyx_n > 3) { __pyx_v_max_field_size = __pyx_optional_args->max_field_size; if (__pyx_optional_args->__pyx_n > 4) { __pyx_v_payload_exception = __pyx_optional_args->payload_exception; if (__pyx_optional_args->__pyx_n > 5) { __pyx_v_response_with_body = __pyx_optional_args->response_with_body; if (__pyx_optional_args->__pyx_n > 6) { __pyx_v_auto_decompress = __pyx_optional_args->auto_decompress; } } } } } } } } /* "aiohttp/_http_parser.pyx":313 * size_t max_field_size=8190, payload_exception=None, * bint response_with_body=True, bint auto_decompress=True): * cparser.http_parser_init(self._cparser, mode) # <<<<<<<<<<<<<< * self._cparser.data = self * self._cparser.content_length = 0 */ http_parser_init(__pyx_v_self->_cparser, __pyx_v_mode); /* "aiohttp/_http_parser.pyx":314 * bint response_with_body=True, bint auto_decompress=True): * cparser.http_parser_init(self._cparser, mode) * self._cparser.data = self # <<<<<<<<<<<<<< * self._cparser.content_length = 0 * */ __pyx_v_self->_cparser->data = ((void *)__pyx_v_self); /* "aiohttp/_http_parser.pyx":315 * cparser.http_parser_init(self._cparser, mode) * self._cparser.data = self * self._cparser.content_length = 0 # <<<<<<<<<<<<<< * * cparser.http_parser_settings_init(self._csettings) */ __pyx_v_self->_cparser->content_length = 0; /* "aiohttp/_http_parser.pyx":317 * self._cparser.content_length = 0 * * cparser.http_parser_settings_init(self._csettings) # <<<<<<<<<<<<<< * * self._protocol = protocol */ http_parser_settings_init(__pyx_v_self->_csettings); /* "aiohttp/_http_parser.pyx":319 * cparser.http_parser_settings_init(self._csettings) * * self._protocol = protocol # <<<<<<<<<<<<<< * self._loop = loop * self._timer = timer 
*/ __Pyx_INCREF(__pyx_v_protocol); __Pyx_GIVEREF(__pyx_v_protocol); __Pyx_GOTREF(__pyx_v_self->_protocol); __Pyx_DECREF(__pyx_v_self->_protocol); __pyx_v_self->_protocol = __pyx_v_protocol; /* "aiohttp/_http_parser.pyx":320 * * self._protocol = protocol * self._loop = loop # <<<<<<<<<<<<<< * self._timer = timer * */ __Pyx_INCREF(__pyx_v_loop); __Pyx_GIVEREF(__pyx_v_loop); __Pyx_GOTREF(__pyx_v_self->_loop); __Pyx_DECREF(__pyx_v_self->_loop); __pyx_v_self->_loop = __pyx_v_loop; /* "aiohttp/_http_parser.pyx":321 * self._protocol = protocol * self._loop = loop * self._timer = timer # <<<<<<<<<<<<<< * * self._buf = bytearray() */ __Pyx_INCREF(__pyx_v_timer); __Pyx_GIVEREF(__pyx_v_timer); __Pyx_GOTREF(__pyx_v_self->_timer); __Pyx_DECREF(__pyx_v_self->_timer); __pyx_v_self->_timer = __pyx_v_timer; /* "aiohttp/_http_parser.pyx":323 * self._timer = timer * * self._buf = bytearray() # <<<<<<<<<<<<<< * self._payload = None * self._payload_error = 0 */ __pyx_t_1 = __Pyx_PyObject_CallNoArg(((PyObject *)(&PyByteArray_Type))); if (unlikely(!__pyx_t_1)) __PYX_ERR(0, 323, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_1); __Pyx_GIVEREF(__pyx_t_1); __Pyx_GOTREF(__pyx_v_self->_buf); __Pyx_DECREF(__pyx_v_self->_buf); __pyx_v_self->_buf = ((PyObject*)__pyx_t_1); __pyx_t_1 = 0; /* "aiohttp/_http_parser.pyx":324 * * self._buf = bytearray() * self._payload = None # <<<<<<<<<<<<<< * self._payload_error = 0 * self._payload_exception = payload_exception */ __Pyx_INCREF(Py_None); __Pyx_GIVEREF(Py_None); __Pyx_GOTREF(__pyx_v_self->_payload); __Pyx_DECREF(__pyx_v_self->_payload); __pyx_v_self->_payload = Py_None; /* "aiohttp/_http_parser.pyx":325 * self._buf = bytearray() * self._payload = None * self._payload_error = 0 # <<<<<<<<<<<<<< * self._payload_exception = payload_exception * self._messages = [] */ __pyx_v_self->_payload_error = 0; /* "aiohttp/_http_parser.pyx":326 * self._payload = None * self._payload_error = 0 * self._payload_exception = payload_exception # <<<<<<<<<<<<<< * self._messages = [] * */ __Pyx_INCREF(__pyx_v_payload_exception); __Pyx_GIVEREF(__pyx_v_payload_exception); __Pyx_GOTREF(__pyx_v_self->_payload_exception); __Pyx_DECREF(__pyx_v_self->_payload_exception); __pyx_v_self->_payload_exception = __pyx_v_payload_exception; /* "aiohttp/_http_parser.pyx":327 * self._payload_error = 0 * self._payload_exception = payload_exception * self._messages = [] # <<<<<<<<<<<<<< * * self._raw_name = bytearray() */ __pyx_t_1 = PyList_New(0); if (unlikely(!__pyx_t_1)) __PYX_ERR(0, 327, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_1); __Pyx_GIVEREF(__pyx_t_1); __Pyx_GOTREF(__pyx_v_self->_messages); __Pyx_DECREF(__pyx_v_self->_messages); __pyx_v_self->_messages = ((PyObject*)__pyx_t_1); __pyx_t_1 = 0; /* "aiohttp/_http_parser.pyx":329 * self._messages = [] * * self._raw_name = bytearray() # <<<<<<<<<<<<<< * self._raw_value = bytearray() * self._has_value = False */ __pyx_t_1 = __Pyx_PyObject_CallNoArg(((PyObject *)(&PyByteArray_Type))); if (unlikely(!__pyx_t_1)) __PYX_ERR(0, 329, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_1); __Pyx_GIVEREF(__pyx_t_1); __Pyx_GOTREF(__pyx_v_self->_raw_name); __Pyx_DECREF(__pyx_v_self->_raw_name); __pyx_v_self->_raw_name = ((PyObject*)__pyx_t_1); __pyx_t_1 = 0; /* "aiohttp/_http_parser.pyx":330 * * self._raw_name = bytearray() * self._raw_value = bytearray() # <<<<<<<<<<<<<< * self._has_value = False * */ __pyx_t_1 = __Pyx_PyObject_CallNoArg(((PyObject *)(&PyByteArray_Type))); if (unlikely(!__pyx_t_1)) __PYX_ERR(0, 330, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_1); __Pyx_GIVEREF(__pyx_t_1); 
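/* HttpParser._init (source lines 308-352, quoted throughout this function) wires the
 * parser together: __cinit__ has already PyMem_Malloc'ed the http_parser and
 * http_parser_settings structs, and _init points cparser.data back at self, stores the
 * protocol/loop/timer references, creates bytearray accumulators for the current header
 * name and value, records the limits (defaults max_line_size=8190, max_headers=32768,
 * max_field_size=8190) and installs the C callbacks. A condensed sketch, with names taken
 * from the quoted fragments:
 *
 *     cparser.http_parser_init(self._cparser, mode)
 *     self._cparser.data = self            # stored as a void* back-reference
 *     self._cparser.content_length = 0
 *     cparser.http_parser_settings_init(self._csettings)
 *     self._protocol, self._loop, self._timer = protocol, loop, timer
 *     self._buf = bytearray()
 *     self._raw_name = bytearray()
 *     self._raw_value = bytearray()
 *     self._has_value = False
 *     self._csettings.on_url = cb_on_url
 *     self._csettings.on_status = cb_on_status
 *     self._csettings.on_header_field = cb_on_header_field
 *     self._csettings.on_header_value = cb_on_header_value
 *     self._csettings.on_headers_complete = cb_on_headers_complete
 *     self._csettings.on_body = cb_on_body
 *     self._csettings.on_message_begin = cb_on_message_begin
 *     self._csettings.on_message_complete = cb_on_message_complete
 *     self._csettings.on_chunk_header = cb_on_chunk_header
 *     self._csettings.on_chunk_complete = cb_on_chunk_complete
 *     self._last_error = None
 */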
__Pyx_GOTREF(__pyx_v_self->_raw_value); __Pyx_DECREF(__pyx_v_self->_raw_value); __pyx_v_self->_raw_value = ((PyObject*)__pyx_t_1); __pyx_t_1 = 0; /* "aiohttp/_http_parser.pyx":331 * self._raw_name = bytearray() * self._raw_value = bytearray() * self._has_value = False # <<<<<<<<<<<<<< * * self._max_line_size = max_line_size */ __pyx_v_self->_has_value = 0; /* "aiohttp/_http_parser.pyx":333 * self._has_value = False * * self._max_line_size = max_line_size # <<<<<<<<<<<<<< * self._max_headers = max_headers * self._max_field_size = max_field_size */ __pyx_v_self->_max_line_size = __pyx_v_max_line_size; /* "aiohttp/_http_parser.pyx":334 * * self._max_line_size = max_line_size * self._max_headers = max_headers # <<<<<<<<<<<<<< * self._max_field_size = max_field_size * self._response_with_body = response_with_body */ __pyx_v_self->_max_headers = __pyx_v_max_headers; /* "aiohttp/_http_parser.pyx":335 * self._max_line_size = max_line_size * self._max_headers = max_headers * self._max_field_size = max_field_size # <<<<<<<<<<<<<< * self._response_with_body = response_with_body * self._upgraded = False */ __pyx_v_self->_max_field_size = __pyx_v_max_field_size; /* "aiohttp/_http_parser.pyx":336 * self._max_headers = max_headers * self._max_field_size = max_field_size * self._response_with_body = response_with_body # <<<<<<<<<<<<<< * self._upgraded = False * self._auto_decompress = auto_decompress */ __pyx_v_self->_response_with_body = __pyx_v_response_with_body; /* "aiohttp/_http_parser.pyx":337 * self._max_field_size = max_field_size * self._response_with_body = response_with_body * self._upgraded = False # <<<<<<<<<<<<<< * self._auto_decompress = auto_decompress * self._content_encoding = None */ __pyx_v_self->_upgraded = 0; /* "aiohttp/_http_parser.pyx":338 * self._response_with_body = response_with_body * self._upgraded = False * self._auto_decompress = auto_decompress # <<<<<<<<<<<<<< * self._content_encoding = None * */ __pyx_v_self->_auto_decompress = __pyx_v_auto_decompress; /* "aiohttp/_http_parser.pyx":339 * self._upgraded = False * self._auto_decompress = auto_decompress * self._content_encoding = None # <<<<<<<<<<<<<< * * self._csettings.on_url = cb_on_url */ __Pyx_INCREF(Py_None); __Pyx_GIVEREF(Py_None); __Pyx_GOTREF(__pyx_v_self->_content_encoding); __Pyx_DECREF(__pyx_v_self->_content_encoding); __pyx_v_self->_content_encoding = ((PyObject*)Py_None); /* "aiohttp/_http_parser.pyx":341 * self._content_encoding = None * * self._csettings.on_url = cb_on_url # <<<<<<<<<<<<<< * self._csettings.on_status = cb_on_status * self._csettings.on_header_field = cb_on_header_field */ __pyx_v_self->_csettings->on_url = __pyx_f_7aiohttp_12_http_parser_cb_on_url; /* "aiohttp/_http_parser.pyx":342 * * self._csettings.on_url = cb_on_url * self._csettings.on_status = cb_on_status # <<<<<<<<<<<<<< * self._csettings.on_header_field = cb_on_header_field * self._csettings.on_header_value = cb_on_header_value */ __pyx_v_self->_csettings->on_status = __pyx_f_7aiohttp_12_http_parser_cb_on_status; /* "aiohttp/_http_parser.pyx":343 * self._csettings.on_url = cb_on_url * self._csettings.on_status = cb_on_status * self._csettings.on_header_field = cb_on_header_field # <<<<<<<<<<<<<< * self._csettings.on_header_value = cb_on_header_value * self._csettings.on_headers_complete = cb_on_headers_complete */ __pyx_v_self->_csettings->on_header_field = __pyx_f_7aiohttp_12_http_parser_cb_on_header_field; /* "aiohttp/_http_parser.pyx":344 * self._csettings.on_status = cb_on_status * self._csettings.on_header_field = 
cb_on_header_field * self._csettings.on_header_value = cb_on_header_value # <<<<<<<<<<<<<< * self._csettings.on_headers_complete = cb_on_headers_complete * self._csettings.on_body = cb_on_body */ __pyx_v_self->_csettings->on_header_value = __pyx_f_7aiohttp_12_http_parser_cb_on_header_value; /* "aiohttp/_http_parser.pyx":345 * self._csettings.on_header_field = cb_on_header_field * self._csettings.on_header_value = cb_on_header_value * self._csettings.on_headers_complete = cb_on_headers_complete # <<<<<<<<<<<<<< * self._csettings.on_body = cb_on_body * self._csettings.on_message_begin = cb_on_message_begin */ __pyx_v_self->_csettings->on_headers_complete = __pyx_f_7aiohttp_12_http_parser_cb_on_headers_complete; /* "aiohttp/_http_parser.pyx":346 * self._csettings.on_header_value = cb_on_header_value * self._csettings.on_headers_complete = cb_on_headers_complete * self._csettings.on_body = cb_on_body # <<<<<<<<<<<<<< * self._csettings.on_message_begin = cb_on_message_begin * self._csettings.on_message_complete = cb_on_message_complete */ __pyx_v_self->_csettings->on_body = __pyx_f_7aiohttp_12_http_parser_cb_on_body; /* "aiohttp/_http_parser.pyx":347 * self._csettings.on_headers_complete = cb_on_headers_complete * self._csettings.on_body = cb_on_body * self._csettings.on_message_begin = cb_on_message_begin # <<<<<<<<<<<<<< * self._csettings.on_message_complete = cb_on_message_complete * self._csettings.on_chunk_header = cb_on_chunk_header */ __pyx_v_self->_csettings->on_message_begin = __pyx_f_7aiohttp_12_http_parser_cb_on_message_begin; /* "aiohttp/_http_parser.pyx":348 * self._csettings.on_body = cb_on_body * self._csettings.on_message_begin = cb_on_message_begin * self._csettings.on_message_complete = cb_on_message_complete # <<<<<<<<<<<<<< * self._csettings.on_chunk_header = cb_on_chunk_header * self._csettings.on_chunk_complete = cb_on_chunk_complete */ __pyx_v_self->_csettings->on_message_complete = __pyx_f_7aiohttp_12_http_parser_cb_on_message_complete; /* "aiohttp/_http_parser.pyx":349 * self._csettings.on_message_begin = cb_on_message_begin * self._csettings.on_message_complete = cb_on_message_complete * self._csettings.on_chunk_header = cb_on_chunk_header # <<<<<<<<<<<<<< * self._csettings.on_chunk_complete = cb_on_chunk_complete * */ __pyx_v_self->_csettings->on_chunk_header = __pyx_f_7aiohttp_12_http_parser_cb_on_chunk_header; /* "aiohttp/_http_parser.pyx":350 * self._csettings.on_message_complete = cb_on_message_complete * self._csettings.on_chunk_header = cb_on_chunk_header * self._csettings.on_chunk_complete = cb_on_chunk_complete # <<<<<<<<<<<<<< * * self._last_error = None */ __pyx_v_self->_csettings->on_chunk_complete = __pyx_f_7aiohttp_12_http_parser_cb_on_chunk_complete; /* "aiohttp/_http_parser.pyx":352 * self._csettings.on_chunk_complete = cb_on_chunk_complete * * self._last_error = None # <<<<<<<<<<<<<< * * cdef _process_header(self): */ __Pyx_INCREF(Py_None); __Pyx_GIVEREF(Py_None); __Pyx_GOTREF(__pyx_v_self->_last_error); __Pyx_DECREF(__pyx_v_self->_last_error); __pyx_v_self->_last_error = Py_None; /* "aiohttp/_http_parser.pyx":308 * PyMem_Free(self._csettings) * * cdef _init(self, cparser.http_parser_type mode, # <<<<<<<<<<<<<< * object protocol, object loop, object timer=None, * size_t max_line_size=8190, size_t max_headers=32768, */ /* function exit code */ __pyx_r = Py_None; __Pyx_INCREF(Py_None); goto __pyx_L0; __pyx_L1_error:; __Pyx_XDECREF(__pyx_t_1); __Pyx_AddTraceback("aiohttp._http_parser.HttpParser._init", __pyx_clineno, __pyx_lineno, __pyx_filename); __pyx_r 
= 0; __pyx_L0:; __Pyx_XGIVEREF(__pyx_r); __Pyx_RefNannyFinishContext(); return __pyx_r; } /* "aiohttp/_http_parser.pyx":354 * self._last_error = None * * cdef _process_header(self): # <<<<<<<<<<<<<< * if self._raw_name: * raw_name = bytes(self._raw_name) */ static PyObject *__pyx_f_7aiohttp_12_http_parser_10HttpParser__process_header(struct __pyx_obj_7aiohttp_12_http_parser_HttpParser *__pyx_v_self) { PyObject *__pyx_v_raw_name = NULL; PyObject *__pyx_v_raw_value = NULL; PyObject *__pyx_v_name = NULL; PyObject *__pyx_v_value = NULL; PyObject *__pyx_r = NULL; __Pyx_RefNannyDeclarations int __pyx_t_1; PyObject *__pyx_t_2 = NULL; PyObject *__pyx_t_3 = NULL; PyObject *__pyx_t_4 = NULL; int __pyx_t_5; PyObject *__pyx_t_6 = NULL; int __pyx_t_7; int __pyx_t_8; __Pyx_RefNannySetupContext("_process_header", 0); /* "aiohttp/_http_parser.pyx":355 * * cdef _process_header(self): * if self._raw_name: # <<<<<<<<<<<<<< * raw_name = bytes(self._raw_name) * raw_value = bytes(self._raw_value) */ __pyx_t_1 = (__pyx_v_self->_raw_name != Py_None)&&(PyByteArray_GET_SIZE(__pyx_v_self->_raw_name) != 0); if (__pyx_t_1) { /* "aiohttp/_http_parser.pyx":356 * cdef _process_header(self): * if self._raw_name: * raw_name = bytes(self._raw_name) # <<<<<<<<<<<<<< * raw_value = bytes(self._raw_value) * */ __pyx_t_2 = __Pyx_PyObject_CallOneArg(((PyObject *)(&PyBytes_Type)), __pyx_v_self->_raw_name); if (unlikely(!__pyx_t_2)) __PYX_ERR(0, 356, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_2); __pyx_v_raw_name = ((PyObject*)__pyx_t_2); __pyx_t_2 = 0; /* "aiohttp/_http_parser.pyx":357 * if self._raw_name: * raw_name = bytes(self._raw_name) * raw_value = bytes(self._raw_value) # <<<<<<<<<<<<<< * * name = find_header(raw_name) */ __pyx_t_2 = __Pyx_PyObject_CallOneArg(((PyObject *)(&PyBytes_Type)), __pyx_v_self->_raw_value); if (unlikely(!__pyx_t_2)) __PYX_ERR(0, 357, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_2); __pyx_v_raw_value = ((PyObject*)__pyx_t_2); __pyx_t_2 = 0; /* "aiohttp/_http_parser.pyx":359 * raw_value = bytes(self._raw_value) * * name = find_header(raw_name) # <<<<<<<<<<<<<< * value = raw_value.decode('utf-8', 'surrogateescape') * */ __pyx_t_2 = __pyx_f_7aiohttp_12_http_parser_find_header(__pyx_v_raw_name); if (unlikely(!__pyx_t_2)) __PYX_ERR(0, 359, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_2); __pyx_v_name = __pyx_t_2; __pyx_t_2 = 0; /* "aiohttp/_http_parser.pyx":360 * * name = find_header(raw_name) * value = raw_value.decode('utf-8', 'surrogateescape') # <<<<<<<<<<<<<< * * self._headers.add(name, value) */ __pyx_t_2 = __Pyx_decode_bytes(__pyx_v_raw_value, 0, PY_SSIZE_T_MAX, NULL, ((char const *)"surrogateescape"), PyUnicode_DecodeUTF8); if (unlikely(!__pyx_t_2)) __PYX_ERR(0, 360, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_2); __pyx_v_value = __pyx_t_2; __pyx_t_2 = 0; /* "aiohttp/_http_parser.pyx":362 * value = raw_value.decode('utf-8', 'surrogateescape') * * self._headers.add(name, value) # <<<<<<<<<<<<<< * * if name is CONTENT_ENCODING: */ __pyx_t_3 = __Pyx_PyObject_GetAttrStr(__pyx_v_self->_headers, __pyx_n_s_add); if (unlikely(!__pyx_t_3)) __PYX_ERR(0, 362, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_3); __pyx_t_4 = NULL; __pyx_t_5 = 0; if (CYTHON_UNPACK_METHODS && likely(PyMethod_Check(__pyx_t_3))) { __pyx_t_4 = PyMethod_GET_SELF(__pyx_t_3); if (likely(__pyx_t_4)) { PyObject* function = PyMethod_GET_FUNCTION(__pyx_t_3); __Pyx_INCREF(__pyx_t_4); __Pyx_INCREF(function); __Pyx_DECREF_SET(__pyx_t_3, function); __pyx_t_5 = 1; } } #if CYTHON_FAST_PYCALL if (PyFunction_Check(__pyx_t_3)) { PyObject *__pyx_temp[3] = {__pyx_t_4, __pyx_v_name, 
__pyx_v_value}; __pyx_t_2 = __Pyx_PyFunction_FastCall(__pyx_t_3, __pyx_temp+1-__pyx_t_5, 2+__pyx_t_5); if (unlikely(!__pyx_t_2)) __PYX_ERR(0, 362, __pyx_L1_error) __Pyx_XDECREF(__pyx_t_4); __pyx_t_4 = 0; __Pyx_GOTREF(__pyx_t_2); } else #endif #if CYTHON_FAST_PYCCALL if (__Pyx_PyFastCFunction_Check(__pyx_t_3)) { PyObject *__pyx_temp[3] = {__pyx_t_4, __pyx_v_name, __pyx_v_value}; __pyx_t_2 = __Pyx_PyCFunction_FastCall(__pyx_t_3, __pyx_temp+1-__pyx_t_5, 2+__pyx_t_5); if (unlikely(!__pyx_t_2)) __PYX_ERR(0, 362, __pyx_L1_error) __Pyx_XDECREF(__pyx_t_4); __pyx_t_4 = 0; __Pyx_GOTREF(__pyx_t_2); } else #endif { __pyx_t_6 = PyTuple_New(2+__pyx_t_5); if (unlikely(!__pyx_t_6)) __PYX_ERR(0, 362, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_6); if (__pyx_t_4) { __Pyx_GIVEREF(__pyx_t_4); PyTuple_SET_ITEM(__pyx_t_6, 0, __pyx_t_4); __pyx_t_4 = NULL; } __Pyx_INCREF(__pyx_v_name); __Pyx_GIVEREF(__pyx_v_name); PyTuple_SET_ITEM(__pyx_t_6, 0+__pyx_t_5, __pyx_v_name); __Pyx_INCREF(__pyx_v_value); __Pyx_GIVEREF(__pyx_v_value); PyTuple_SET_ITEM(__pyx_t_6, 1+__pyx_t_5, __pyx_v_value); __pyx_t_2 = __Pyx_PyObject_Call(__pyx_t_3, __pyx_t_6, NULL); if (unlikely(!__pyx_t_2)) __PYX_ERR(0, 362, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_2); __Pyx_DECREF(__pyx_t_6); __pyx_t_6 = 0; } __Pyx_DECREF(__pyx_t_3); __pyx_t_3 = 0; __Pyx_DECREF(__pyx_t_2); __pyx_t_2 = 0; /* "aiohttp/_http_parser.pyx":364 * self._headers.add(name, value) * * if name is CONTENT_ENCODING: # <<<<<<<<<<<<<< * self._content_encoding = value * */ __pyx_t_1 = (__pyx_v_name == __pyx_v_7aiohttp_12_http_parser_CONTENT_ENCODING); __pyx_t_7 = (__pyx_t_1 != 0); if (__pyx_t_7) { /* "aiohttp/_http_parser.pyx":365 * * if name is CONTENT_ENCODING: * self._content_encoding = value # <<<<<<<<<<<<<< * * PyByteArray_Resize(self._raw_name, 0) */ if (!(likely(PyUnicode_CheckExact(__pyx_v_value))||((__pyx_v_value) == Py_None)||(PyErr_Format(PyExc_TypeError, "Expected %.16s, got %.200s", "unicode", Py_TYPE(__pyx_v_value)->tp_name), 0))) __PYX_ERR(0, 365, __pyx_L1_error) __pyx_t_2 = __pyx_v_value; __Pyx_INCREF(__pyx_t_2); __Pyx_GIVEREF(__pyx_t_2); __Pyx_GOTREF(__pyx_v_self->_content_encoding); __Pyx_DECREF(__pyx_v_self->_content_encoding); __pyx_v_self->_content_encoding = ((PyObject*)__pyx_t_2); __pyx_t_2 = 0; /* "aiohttp/_http_parser.pyx":364 * self._headers.add(name, value) * * if name is CONTENT_ENCODING: # <<<<<<<<<<<<<< * self._content_encoding = value * */ } /* "aiohttp/_http_parser.pyx":367 * self._content_encoding = value * * PyByteArray_Resize(self._raw_name, 0) # <<<<<<<<<<<<<< * PyByteArray_Resize(self._raw_value, 0) * self._has_value = False */ __pyx_t_2 = __pyx_v_self->_raw_name; __Pyx_INCREF(__pyx_t_2); __pyx_t_5 = PyByteArray_Resize(__pyx_t_2, 0); if (unlikely(__pyx_t_5 == ((int)-1))) __PYX_ERR(0, 367, __pyx_L1_error) __Pyx_DECREF(__pyx_t_2); __pyx_t_2 = 0; /* "aiohttp/_http_parser.pyx":368 * * PyByteArray_Resize(self._raw_name, 0) * PyByteArray_Resize(self._raw_value, 0) # <<<<<<<<<<<<<< * self._has_value = False * self._raw_headers.append((raw_name, raw_value)) */ __pyx_t_2 = __pyx_v_self->_raw_value; __Pyx_INCREF(__pyx_t_2); __pyx_t_5 = PyByteArray_Resize(__pyx_t_2, 0); if (unlikely(__pyx_t_5 == ((int)-1))) __PYX_ERR(0, 368, __pyx_L1_error) __Pyx_DECREF(__pyx_t_2); __pyx_t_2 = 0; /* "aiohttp/_http_parser.pyx":369 * PyByteArray_Resize(self._raw_name, 0) * PyByteArray_Resize(self._raw_value, 0) * self._has_value = False # <<<<<<<<<<<<<< * self._raw_headers.append((raw_name, raw_value)) * */ __pyx_v_self->_has_value = 0; /* "aiohttp/_http_parser.pyx":370 * 
PyByteArray_Resize(self._raw_value, 0) * self._has_value = False * self._raw_headers.append((raw_name, raw_value)) # <<<<<<<<<<<<<< * * cdef _on_header_field(self, char* at, size_t length): */ if (unlikely(__pyx_v_self->_raw_headers == Py_None)) { PyErr_Format(PyExc_AttributeError, "'NoneType' object has no attribute '%.30s'", "append"); __PYX_ERR(0, 370, __pyx_L1_error) } __pyx_t_2 = PyTuple_New(2); if (unlikely(!__pyx_t_2)) __PYX_ERR(0, 370, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_2); __Pyx_INCREF(__pyx_v_raw_name); __Pyx_GIVEREF(__pyx_v_raw_name); PyTuple_SET_ITEM(__pyx_t_2, 0, __pyx_v_raw_name); __Pyx_INCREF(__pyx_v_raw_value); __Pyx_GIVEREF(__pyx_v_raw_value); PyTuple_SET_ITEM(__pyx_t_2, 1, __pyx_v_raw_value); __pyx_t_8 = __Pyx_PyList_Append(__pyx_v_self->_raw_headers, __pyx_t_2); if (unlikely(__pyx_t_8 == ((int)-1))) __PYX_ERR(0, 370, __pyx_L1_error) __Pyx_DECREF(__pyx_t_2); __pyx_t_2 = 0; /* "aiohttp/_http_parser.pyx":355 * * cdef _process_header(self): * if self._raw_name: # <<<<<<<<<<<<<< * raw_name = bytes(self._raw_name) * raw_value = bytes(self._raw_value) */ } /* "aiohttp/_http_parser.pyx":354 * self._last_error = None * * cdef _process_header(self): # <<<<<<<<<<<<<< * if self._raw_name: * raw_name = bytes(self._raw_name) */ /* function exit code */ __pyx_r = Py_None; __Pyx_INCREF(Py_None); goto __pyx_L0; __pyx_L1_error:; __Pyx_XDECREF(__pyx_t_2); __Pyx_XDECREF(__pyx_t_3); __Pyx_XDECREF(__pyx_t_4); __Pyx_XDECREF(__pyx_t_6); __Pyx_AddTraceback("aiohttp._http_parser.HttpParser._process_header", __pyx_clineno, __pyx_lineno, __pyx_filename); __pyx_r = 0; __pyx_L0:; __Pyx_XDECREF(__pyx_v_raw_name); __Pyx_XDECREF(__pyx_v_raw_value); __Pyx_XDECREF(__pyx_v_name); __Pyx_XDECREF(__pyx_v_value); __Pyx_XGIVEREF(__pyx_r); __Pyx_RefNannyFinishContext(); return __pyx_r; } /* "aiohttp/_http_parser.pyx":372 * self._raw_headers.append((raw_name, raw_value)) * * cdef _on_header_field(self, char* at, size_t length): # <<<<<<<<<<<<<< * cdef Py_ssize_t size * cdef char *buf */ static PyObject *__pyx_f_7aiohttp_12_http_parser_10HttpParser__on_header_field(struct __pyx_obj_7aiohttp_12_http_parser_HttpParser *__pyx_v_self, char *__pyx_v_at, size_t __pyx_v_length) { Py_ssize_t __pyx_v_size; char *__pyx_v_buf; PyObject *__pyx_r = NULL; __Pyx_RefNannyDeclarations int __pyx_t_1; PyObject *__pyx_t_2 = NULL; Py_ssize_t __pyx_t_3; int __pyx_t_4; __Pyx_RefNannySetupContext("_on_header_field", 0); /* "aiohttp/_http_parser.pyx":375 * cdef Py_ssize_t size * cdef char *buf * if self._has_value: # <<<<<<<<<<<<<< * self._process_header() * */ __pyx_t_1 = (__pyx_v_self->_has_value != 0); if (__pyx_t_1) { /* "aiohttp/_http_parser.pyx":376 * cdef char *buf * if self._has_value: * self._process_header() # <<<<<<<<<<<<<< * * size = PyByteArray_Size(self._raw_name) */ __pyx_t_2 = ((struct __pyx_vtabstruct_7aiohttp_12_http_parser_HttpParser *)__pyx_v_self->__pyx_vtab)->_process_header(__pyx_v_self); if (unlikely(!__pyx_t_2)) __PYX_ERR(0, 376, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_2); __Pyx_DECREF(__pyx_t_2); __pyx_t_2 = 0; /* "aiohttp/_http_parser.pyx":375 * cdef Py_ssize_t size * cdef char *buf * if self._has_value: # <<<<<<<<<<<<<< * self._process_header() * */ } /* "aiohttp/_http_parser.pyx":378 * self._process_header() * * size = PyByteArray_Size(self._raw_name) # <<<<<<<<<<<<<< * PyByteArray_Resize(self._raw_name, size + length) * buf = PyByteArray_AsString(self._raw_name) */ __pyx_t_2 = __pyx_v_self->_raw_name; __Pyx_INCREF(__pyx_t_2); __pyx_t_3 = PyByteArray_Size(__pyx_t_2); if (unlikely(__pyx_t_3 == 
((Py_ssize_t)-1L))) __PYX_ERR(0, 378, __pyx_L1_error) __Pyx_DECREF(__pyx_t_2); __pyx_t_2 = 0; __pyx_v_size = __pyx_t_3; /* "aiohttp/_http_parser.pyx":379 * * size = PyByteArray_Size(self._raw_name) * PyByteArray_Resize(self._raw_name, size + length) # <<<<<<<<<<<<<< * buf = PyByteArray_AsString(self._raw_name) * memcpy(buf + size, at, length) */ __pyx_t_2 = __pyx_v_self->_raw_name; __Pyx_INCREF(__pyx_t_2); __pyx_t_4 = PyByteArray_Resize(__pyx_t_2, (__pyx_v_size + __pyx_v_length)); if (unlikely(__pyx_t_4 == ((int)-1))) __PYX_ERR(0, 379, __pyx_L1_error) __Pyx_DECREF(__pyx_t_2); __pyx_t_2 = 0; /* "aiohttp/_http_parser.pyx":380 * size = PyByteArray_Size(self._raw_name) * PyByteArray_Resize(self._raw_name, size + length) * buf = PyByteArray_AsString(self._raw_name) # <<<<<<<<<<<<<< * memcpy(buf + size, at, length) * */ __pyx_t_2 = __pyx_v_self->_raw_name; __Pyx_INCREF(__pyx_t_2); __pyx_v_buf = PyByteArray_AsString(__pyx_t_2); __Pyx_DECREF(__pyx_t_2); __pyx_t_2 = 0; /* "aiohttp/_http_parser.pyx":381 * PyByteArray_Resize(self._raw_name, size + length) * buf = PyByteArray_AsString(self._raw_name) * memcpy(buf + size, at, length) # <<<<<<<<<<<<<< * * cdef _on_header_value(self, char* at, size_t length): */ (void)(memcpy((__pyx_v_buf + __pyx_v_size), __pyx_v_at, __pyx_v_length)); /* "aiohttp/_http_parser.pyx":372 * self._raw_headers.append((raw_name, raw_value)) * * cdef _on_header_field(self, char* at, size_t length): # <<<<<<<<<<<<<< * cdef Py_ssize_t size * cdef char *buf */ /* function exit code */ __pyx_r = Py_None; __Pyx_INCREF(Py_None); goto __pyx_L0; __pyx_L1_error:; __Pyx_XDECREF(__pyx_t_2); __Pyx_AddTraceback("aiohttp._http_parser.HttpParser._on_header_field", __pyx_clineno, __pyx_lineno, __pyx_filename); __pyx_r = 0; __pyx_L0:; __Pyx_XGIVEREF(__pyx_r); __Pyx_RefNannyFinishContext(); return __pyx_r; } /* "aiohttp/_http_parser.pyx":383 * memcpy(buf + size, at, length) * * cdef _on_header_value(self, char* at, size_t length): # <<<<<<<<<<<<<< * cdef Py_ssize_t size * cdef char *buf */ static PyObject *__pyx_f_7aiohttp_12_http_parser_10HttpParser__on_header_value(struct __pyx_obj_7aiohttp_12_http_parser_HttpParser *__pyx_v_self, char *__pyx_v_at, size_t __pyx_v_length) { Py_ssize_t __pyx_v_size; char *__pyx_v_buf; PyObject *__pyx_r = NULL; __Pyx_RefNannyDeclarations PyObject *__pyx_t_1 = NULL; Py_ssize_t __pyx_t_2; int __pyx_t_3; __Pyx_RefNannySetupContext("_on_header_value", 0); /* "aiohttp/_http_parser.pyx":387 * cdef char *buf * * size = PyByteArray_Size(self._raw_value) # <<<<<<<<<<<<<< * PyByteArray_Resize(self._raw_value, size + length) * buf = PyByteArray_AsString(self._raw_value) */ __pyx_t_1 = __pyx_v_self->_raw_value; __Pyx_INCREF(__pyx_t_1); __pyx_t_2 = PyByteArray_Size(__pyx_t_1); if (unlikely(__pyx_t_2 == ((Py_ssize_t)-1L))) __PYX_ERR(0, 387, __pyx_L1_error) __Pyx_DECREF(__pyx_t_1); __pyx_t_1 = 0; __pyx_v_size = __pyx_t_2; /* "aiohttp/_http_parser.pyx":388 * * size = PyByteArray_Size(self._raw_value) * PyByteArray_Resize(self._raw_value, size + length) # <<<<<<<<<<<<<< * buf = PyByteArray_AsString(self._raw_value) * memcpy(buf + size, at, length) */ __pyx_t_1 = __pyx_v_self->_raw_value; __Pyx_INCREF(__pyx_t_1); __pyx_t_3 = PyByteArray_Resize(__pyx_t_1, (__pyx_v_size + __pyx_v_length)); if (unlikely(__pyx_t_3 == ((int)-1))) __PYX_ERR(0, 388, __pyx_L1_error) __Pyx_DECREF(__pyx_t_1); __pyx_t_1 = 0; /* "aiohttp/_http_parser.pyx":389 * size = PyByteArray_Size(self._raw_value) * PyByteArray_Resize(self._raw_value, size + length) * buf = PyByteArray_AsString(self._raw_value) # 
<<<<<<<<<<<<<< * memcpy(buf + size, at, length) * self._has_value = True */ __pyx_t_1 = __pyx_v_self->_raw_value; __Pyx_INCREF(__pyx_t_1); __pyx_v_buf = PyByteArray_AsString(__pyx_t_1); __Pyx_DECREF(__pyx_t_1); __pyx_t_1 = 0; /* "aiohttp/_http_parser.pyx":390 * PyByteArray_Resize(self._raw_value, size + length) * buf = PyByteArray_AsString(self._raw_value) * memcpy(buf + size, at, length) # <<<<<<<<<<<<<< * self._has_value = True * */ (void)(memcpy((__pyx_v_buf + __pyx_v_size), __pyx_v_at, __pyx_v_length)); /* "aiohttp/_http_parser.pyx":391 * buf = PyByteArray_AsString(self._raw_value) * memcpy(buf + size, at, length) * self._has_value = True # <<<<<<<<<<<<<< * * cdef _on_headers_complete(self): */ __pyx_v_self->_has_value = 1; /* "aiohttp/_http_parser.pyx":383 * memcpy(buf + size, at, length) * * cdef _on_header_value(self, char* at, size_t length): # <<<<<<<<<<<<<< * cdef Py_ssize_t size * cdef char *buf */ /* function exit code */ __pyx_r = Py_None; __Pyx_INCREF(Py_None); goto __pyx_L0; __pyx_L1_error:; __Pyx_XDECREF(__pyx_t_1); __Pyx_AddTraceback("aiohttp._http_parser.HttpParser._on_header_value", __pyx_clineno, __pyx_lineno, __pyx_filename); __pyx_r = 0; __pyx_L0:; __Pyx_XGIVEREF(__pyx_r); __Pyx_RefNannyFinishContext(); return __pyx_r; } /* "aiohttp/_http_parser.pyx":393 * self._has_value = True * * cdef _on_headers_complete(self): # <<<<<<<<<<<<<< * self._process_header() * */ static PyObject *__pyx_f_7aiohttp_12_http_parser_10HttpParser__on_headers_complete(struct __pyx_obj_7aiohttp_12_http_parser_HttpParser *__pyx_v_self) { PyObject *__pyx_v_method = NULL; int __pyx_v_should_close; unsigned int __pyx_v_upgrade; unsigned int __pyx_v_chunked; PyObject *__pyx_v_raw_headers = NULL; PyObject *__pyx_v_headers = NULL; PyObject *__pyx_v_encoding = NULL; PyObject *__pyx_v_enc = NULL; PyObject *__pyx_v_msg = NULL; PyObject *__pyx_v_payload = NULL; PyObject *__pyx_r = NULL; __Pyx_RefNannyDeclarations PyObject *__pyx_t_1 = NULL; unsigned int __pyx_t_2; PyObject *__pyx_t_3 = NULL; PyObject *__pyx_t_4 = NULL; int __pyx_t_5; int __pyx_t_6; PyObject *__pyx_t_7 = NULL; int __pyx_t_8; int __pyx_t_9; int __pyx_t_10; __Pyx_RefNannySetupContext("_on_headers_complete", 0); /* "aiohttp/_http_parser.pyx":394 * * cdef _on_headers_complete(self): * self._process_header() # <<<<<<<<<<<<<< * * method = http_method_str(self._cparser.method) */ __pyx_t_1 = ((struct __pyx_vtabstruct_7aiohttp_12_http_parser_HttpParser *)__pyx_v_self->__pyx_vtab)->_process_header(__pyx_v_self); if (unlikely(!__pyx_t_1)) __PYX_ERR(0, 394, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_1); __Pyx_DECREF(__pyx_t_1); __pyx_t_1 = 0; /* "aiohttp/_http_parser.pyx":396 * self._process_header() * * method = http_method_str(self._cparser.method) # <<<<<<<<<<<<<< * should_close = not cparser.http_should_keep_alive(self._cparser) * upgrade = self._cparser.upgrade */ __pyx_t_1 = __pyx_f_7aiohttp_12_http_parser_http_method_str(__pyx_v_self->_cparser->method); if (unlikely(!__pyx_t_1)) __PYX_ERR(0, 396, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_1); __pyx_v_method = ((PyObject*)__pyx_t_1); __pyx_t_1 = 0; /* "aiohttp/_http_parser.pyx":397 * * method = http_method_str(self._cparser.method) * should_close = not cparser.http_should_keep_alive(self._cparser) # <<<<<<<<<<<<<< * upgrade = self._cparser.upgrade * chunked = self._cparser.flags & cparser.F_CHUNKED */ __pyx_v_should_close = (!(http_should_keep_alive(__pyx_v_self->_cparser) != 0)); /* "aiohttp/_http_parser.pyx":398 * method = http_method_str(self._cparser.method) * should_close = not 
cparser.http_should_keep_alive(self._cparser) * upgrade = self._cparser.upgrade # <<<<<<<<<<<<<< * chunked = self._cparser.flags & cparser.F_CHUNKED * */ __pyx_t_2 = __pyx_v_self->_cparser->upgrade; __pyx_v_upgrade = __pyx_t_2; /* "aiohttp/_http_parser.pyx":399 * should_close = not cparser.http_should_keep_alive(self._cparser) * upgrade = self._cparser.upgrade * chunked = self._cparser.flags & cparser.F_CHUNKED # <<<<<<<<<<<<<< * * raw_headers = tuple(self._raw_headers) */ __pyx_v_chunked = (__pyx_v_self->_cparser->flags & F_CHUNKED); /* "aiohttp/_http_parser.pyx":401 * chunked = self._cparser.flags & cparser.F_CHUNKED * * raw_headers = tuple(self._raw_headers) # <<<<<<<<<<<<<< * headers = CIMultiDictProxy(self._headers) * */ if (unlikely(__pyx_v_self->_raw_headers == Py_None)) { PyErr_SetString(PyExc_TypeError, "'NoneType' object is not iterable"); __PYX_ERR(0, 401, __pyx_L1_error) } __pyx_t_1 = PyList_AsTuple(__pyx_v_self->_raw_headers); if (unlikely(!__pyx_t_1)) __PYX_ERR(0, 401, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_1); __pyx_v_raw_headers = ((PyObject*)__pyx_t_1); __pyx_t_1 = 0; /* "aiohttp/_http_parser.pyx":402 * * raw_headers = tuple(self._raw_headers) * headers = CIMultiDictProxy(self._headers) # <<<<<<<<<<<<<< * * if upgrade or self._cparser.method == 5: # cparser.CONNECT: */ __Pyx_INCREF(__pyx_v_7aiohttp_12_http_parser_CIMultiDictProxy); __pyx_t_3 = __pyx_v_7aiohttp_12_http_parser_CIMultiDictProxy; __pyx_t_4 = NULL; if (CYTHON_UNPACK_METHODS && unlikely(PyMethod_Check(__pyx_t_3))) { __pyx_t_4 = PyMethod_GET_SELF(__pyx_t_3); if (likely(__pyx_t_4)) { PyObject* function = PyMethod_GET_FUNCTION(__pyx_t_3); __Pyx_INCREF(__pyx_t_4); __Pyx_INCREF(function); __Pyx_DECREF_SET(__pyx_t_3, function); } } __pyx_t_1 = (__pyx_t_4) ? __Pyx_PyObject_Call2Args(__pyx_t_3, __pyx_t_4, __pyx_v_self->_headers) : __Pyx_PyObject_CallOneArg(__pyx_t_3, __pyx_v_self->_headers); __Pyx_XDECREF(__pyx_t_4); __pyx_t_4 = 0; if (unlikely(!__pyx_t_1)) __PYX_ERR(0, 402, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_1); __Pyx_DECREF(__pyx_t_3); __pyx_t_3 = 0; __pyx_v_headers = __pyx_t_1; __pyx_t_1 = 0; /* "aiohttp/_http_parser.pyx":404 * headers = CIMultiDictProxy(self._headers) * * if upgrade or self._cparser.method == 5: # cparser.CONNECT: # <<<<<<<<<<<<<< * self._upgraded = True * */ __pyx_t_6 = (__pyx_v_upgrade != 0); if (!__pyx_t_6) { } else { __pyx_t_5 = __pyx_t_6; goto __pyx_L4_bool_binop_done; } __pyx_t_6 = ((__pyx_v_self->_cparser->method == 5) != 0); __pyx_t_5 = __pyx_t_6; __pyx_L4_bool_binop_done:; if (__pyx_t_5) { /* "aiohttp/_http_parser.pyx":405 * * if upgrade or self._cparser.method == 5: # cparser.CONNECT: * self._upgraded = True # <<<<<<<<<<<<<< * * # do not support old websocket spec */ __pyx_v_self->_upgraded = 1; /* "aiohttp/_http_parser.pyx":404 * headers = CIMultiDictProxy(self._headers) * * if upgrade or self._cparser.method == 5: # cparser.CONNECT: # <<<<<<<<<<<<<< * self._upgraded = True * */ } /* "aiohttp/_http_parser.pyx":408 * * # do not support old websocket spec * if SEC_WEBSOCKET_KEY1 in headers: # <<<<<<<<<<<<<< * raise InvalidHeader(SEC_WEBSOCKET_KEY1) * */ __pyx_t_5 = (__Pyx_PySequence_ContainsTF(__pyx_v_7aiohttp_12_http_parser_SEC_WEBSOCKET_KEY1, __pyx_v_headers, Py_EQ)); if (unlikely(__pyx_t_5 < 0)) __PYX_ERR(0, 408, __pyx_L1_error) __pyx_t_6 = (__pyx_t_5 != 0); if (unlikely(__pyx_t_6)) { /* "aiohttp/_http_parser.pyx":409 * # do not support old websocket spec * if SEC_WEBSOCKET_KEY1 in headers: * raise InvalidHeader(SEC_WEBSOCKET_KEY1) # <<<<<<<<<<<<<< * * encoding = None */ 
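/* _on_headers_complete (source lines 393-444) is where one parsed message is assembled
 * from the C parser state. Condensed, the Cython logic compiled in this function is
 * roughly the following; every name appears in the quoted fragments:
 *
 *     self._process_header()                     # flush any pending header name/value pair
 *     method = http_method_str(self._cparser.method)
 *     should_close = not cparser.http_should_keep_alive(self._cparser)
 *     upgrade = self._cparser.upgrade
 *     chunked = self._cparser.flags & cparser.F_CHUNKED
 *     raw_headers = tuple(self._raw_headers)
 *     headers = CIMultiDictProxy(self._headers)
 *     if upgrade or self._cparser.method == 5:   # CONNECT
 *         self._upgraded = True
 *     if SEC_WEBSOCKET_KEY1 in headers:          # old websocket spec is not supported
 *         raise InvalidHeader(SEC_WEBSOCKET_KEY1)
 *     encoding = None
 *     enc = self._content_encoding               # captured earlier by _process_header()
 *     if enc is not None:
 *         self._content_encoding = None
 *         enc = enc.lower()
 *         if enc in ('gzip', 'deflate', 'br'):
 *             encoding = enc
 *     if self._cparser.type == cparser.HTTP_REQUEST:
 *         msg = _new_request_message(method, self._path, self.http_version(), headers,
 *                                    raw_headers, should_close, encoding, upgrade,
 *                                    chunked, self._url)
 *     else:
 *         msg = _new_response_message(self.http_version(), self._cparser.status_code,
 *                                     self._reason, headers, raw_headers, should_close,
 *                                     encoding, upgrade, chunked)
 *     if self._cparser.content_length > 0 or chunked or self._cparser.method == 5:
 *         payload = StreamReader(self._protocol, timer=self._timer, loop=self._loop)
 *     else:
 *         payload = EMPTY_PAYLOAD
 *     self._payload = payload
 *     if encoding is not None and self._auto_decompress:
 *         self._payload = DeflateBuffer(payload, encoding)
 *     if not self._response_with_body:
 *         payload = EMPTY_PAYLOAD
 *     self._messages.append((msg, payload))
 */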
__Pyx_GetModuleGlobalName(__pyx_t_3, __pyx_n_s_InvalidHeader); if (unlikely(!__pyx_t_3)) __PYX_ERR(0, 409, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_3); __pyx_t_4 = NULL; if (CYTHON_UNPACK_METHODS && unlikely(PyMethod_Check(__pyx_t_3))) { __pyx_t_4 = PyMethod_GET_SELF(__pyx_t_3); if (likely(__pyx_t_4)) { PyObject* function = PyMethod_GET_FUNCTION(__pyx_t_3); __Pyx_INCREF(__pyx_t_4); __Pyx_INCREF(function); __Pyx_DECREF_SET(__pyx_t_3, function); } } __pyx_t_1 = (__pyx_t_4) ? __Pyx_PyObject_Call2Args(__pyx_t_3, __pyx_t_4, __pyx_v_7aiohttp_12_http_parser_SEC_WEBSOCKET_KEY1) : __Pyx_PyObject_CallOneArg(__pyx_t_3, __pyx_v_7aiohttp_12_http_parser_SEC_WEBSOCKET_KEY1); __Pyx_XDECREF(__pyx_t_4); __pyx_t_4 = 0; if (unlikely(!__pyx_t_1)) __PYX_ERR(0, 409, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_1); __Pyx_DECREF(__pyx_t_3); __pyx_t_3 = 0; __Pyx_Raise(__pyx_t_1, 0, 0, 0); __Pyx_DECREF(__pyx_t_1); __pyx_t_1 = 0; __PYX_ERR(0, 409, __pyx_L1_error) /* "aiohttp/_http_parser.pyx":408 * * # do not support old websocket spec * if SEC_WEBSOCKET_KEY1 in headers: # <<<<<<<<<<<<<< * raise InvalidHeader(SEC_WEBSOCKET_KEY1) * */ } /* "aiohttp/_http_parser.pyx":411 * raise InvalidHeader(SEC_WEBSOCKET_KEY1) * * encoding = None # <<<<<<<<<<<<<< * enc = self._content_encoding * if enc is not None: */ __Pyx_INCREF(Py_None); __pyx_v_encoding = Py_None; /* "aiohttp/_http_parser.pyx":412 * * encoding = None * enc = self._content_encoding # <<<<<<<<<<<<<< * if enc is not None: * self._content_encoding = None */ __pyx_t_1 = __pyx_v_self->_content_encoding; __Pyx_INCREF(__pyx_t_1); __pyx_v_enc = __pyx_t_1; __pyx_t_1 = 0; /* "aiohttp/_http_parser.pyx":413 * encoding = None * enc = self._content_encoding * if enc is not None: # <<<<<<<<<<<<<< * self._content_encoding = None * enc = enc.lower() */ __pyx_t_6 = (__pyx_v_enc != Py_None); __pyx_t_5 = (__pyx_t_6 != 0); if (__pyx_t_5) { /* "aiohttp/_http_parser.pyx":414 * enc = self._content_encoding * if enc is not None: * self._content_encoding = None # <<<<<<<<<<<<<< * enc = enc.lower() * if enc in ('gzip', 'deflate', 'br'): */ __Pyx_INCREF(Py_None); __Pyx_GIVEREF(Py_None); __Pyx_GOTREF(__pyx_v_self->_content_encoding); __Pyx_DECREF(__pyx_v_self->_content_encoding); __pyx_v_self->_content_encoding = ((PyObject*)Py_None); /* "aiohttp/_http_parser.pyx":415 * if enc is not None: * self._content_encoding = None * enc = enc.lower() # <<<<<<<<<<<<<< * if enc in ('gzip', 'deflate', 'br'): * encoding = enc */ __pyx_t_3 = __Pyx_PyObject_GetAttrStr(__pyx_v_enc, __pyx_n_s_lower); if (unlikely(!__pyx_t_3)) __PYX_ERR(0, 415, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_3); __pyx_t_4 = NULL; if (CYTHON_UNPACK_METHODS && likely(PyMethod_Check(__pyx_t_3))) { __pyx_t_4 = PyMethod_GET_SELF(__pyx_t_3); if (likely(__pyx_t_4)) { PyObject* function = PyMethod_GET_FUNCTION(__pyx_t_3); __Pyx_INCREF(__pyx_t_4); __Pyx_INCREF(function); __Pyx_DECREF_SET(__pyx_t_3, function); } } __pyx_t_1 = (__pyx_t_4) ? 
__Pyx_PyObject_CallOneArg(__pyx_t_3, __pyx_t_4) : __Pyx_PyObject_CallNoArg(__pyx_t_3); __Pyx_XDECREF(__pyx_t_4); __pyx_t_4 = 0; if (unlikely(!__pyx_t_1)) __PYX_ERR(0, 415, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_1); __Pyx_DECREF(__pyx_t_3); __pyx_t_3 = 0; __Pyx_DECREF_SET(__pyx_v_enc, __pyx_t_1); __pyx_t_1 = 0; /* "aiohttp/_http_parser.pyx":416 * self._content_encoding = None * enc = enc.lower() * if enc in ('gzip', 'deflate', 'br'): # <<<<<<<<<<<<<< * encoding = enc * */ __Pyx_INCREF(__pyx_v_enc); __pyx_t_1 = __pyx_v_enc; __pyx_t_6 = (__Pyx_PyUnicode_Equals(__pyx_t_1, __pyx_n_u_gzip, Py_EQ)); if (unlikely(__pyx_t_6 < 0)) __PYX_ERR(0, 416, __pyx_L1_error) if (!__pyx_t_6) { } else { __pyx_t_5 = __pyx_t_6; goto __pyx_L9_bool_binop_done; } __pyx_t_6 = (__Pyx_PyUnicode_Equals(__pyx_t_1, __pyx_n_u_deflate, Py_EQ)); if (unlikely(__pyx_t_6 < 0)) __PYX_ERR(0, 416, __pyx_L1_error) if (!__pyx_t_6) { } else { __pyx_t_5 = __pyx_t_6; goto __pyx_L9_bool_binop_done; } __pyx_t_6 = (__Pyx_PyUnicode_Equals(__pyx_t_1, __pyx_n_u_br, Py_EQ)); if (unlikely(__pyx_t_6 < 0)) __PYX_ERR(0, 416, __pyx_L1_error) __pyx_t_5 = __pyx_t_6; __pyx_L9_bool_binop_done:; __Pyx_DECREF(__pyx_t_1); __pyx_t_1 = 0; __pyx_t_6 = (__pyx_t_5 != 0); if (__pyx_t_6) { /* "aiohttp/_http_parser.pyx":417 * enc = enc.lower() * if enc in ('gzip', 'deflate', 'br'): * encoding = enc # <<<<<<<<<<<<<< * * if self._cparser.type == cparser.HTTP_REQUEST: */ __Pyx_INCREF(__pyx_v_enc); __Pyx_DECREF_SET(__pyx_v_encoding, __pyx_v_enc); /* "aiohttp/_http_parser.pyx":416 * self._content_encoding = None * enc = enc.lower() * if enc in ('gzip', 'deflate', 'br'): # <<<<<<<<<<<<<< * encoding = enc * */ } /* "aiohttp/_http_parser.pyx":413 * encoding = None * enc = self._content_encoding * if enc is not None: # <<<<<<<<<<<<<< * self._content_encoding = None * enc = enc.lower() */ } /* "aiohttp/_http_parser.pyx":419 * encoding = enc * * if self._cparser.type == cparser.HTTP_REQUEST: # <<<<<<<<<<<<<< * msg = _new_request_message( * method, self._path, */ __pyx_t_6 = ((__pyx_v_self->_cparser->type == HTTP_REQUEST) != 0); if (__pyx_t_6) { /* "aiohttp/_http_parser.pyx":421 * if self._cparser.type == cparser.HTTP_REQUEST: * msg = _new_request_message( * method, self._path, # <<<<<<<<<<<<<< * self.http_version(), headers, raw_headers, * should_close, encoding, upgrade, chunked, self._url) */ __pyx_t_1 = __pyx_v_self->_path; __Pyx_INCREF(__pyx_t_1); /* "aiohttp/_http_parser.pyx":422 * msg = _new_request_message( * method, self._path, * self.http_version(), headers, raw_headers, # <<<<<<<<<<<<<< * should_close, encoding, upgrade, chunked, self._url) * else: */ __pyx_t_3 = __pyx_f_7aiohttp_12_http_parser_10HttpParser_http_version(__pyx_v_self); if (unlikely(!__pyx_t_3)) __PYX_ERR(0, 422, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_3); /* "aiohttp/_http_parser.pyx":423 * method, self._path, * self.http_version(), headers, raw_headers, * should_close, encoding, upgrade, chunked, self._url) # <<<<<<<<<<<<<< * else: * msg = _new_response_message( */ __pyx_t_4 = __pyx_v_self->_url; __Pyx_INCREF(__pyx_t_4); /* "aiohttp/_http_parser.pyx":420 * * if self._cparser.type == cparser.HTTP_REQUEST: * msg = _new_request_message( # <<<<<<<<<<<<<< * method, self._path, * self.http_version(), headers, raw_headers, */ __pyx_t_7 = __pyx_f_7aiohttp_12_http_parser__new_request_message(__pyx_v_method, ((PyObject*)__pyx_t_1), __pyx_t_3, __pyx_v_headers, __pyx_v_raw_headers, __pyx_v_should_close, __pyx_v_encoding, __pyx_v_upgrade, __pyx_v_chunked, __pyx_t_4); if (unlikely(!__pyx_t_7)) __PYX_ERR(0, 420, 
__pyx_L1_error) __Pyx_GOTREF(__pyx_t_7); __Pyx_DECREF(__pyx_t_1); __pyx_t_1 = 0; __Pyx_DECREF(__pyx_t_3); __pyx_t_3 = 0; __Pyx_DECREF(__pyx_t_4); __pyx_t_4 = 0; __pyx_v_msg = __pyx_t_7; __pyx_t_7 = 0; /* "aiohttp/_http_parser.pyx":419 * encoding = enc * * if self._cparser.type == cparser.HTTP_REQUEST: # <<<<<<<<<<<<<< * msg = _new_request_message( * method, self._path, */ goto __pyx_L12; } /* "aiohttp/_http_parser.pyx":425 * should_close, encoding, upgrade, chunked, self._url) * else: * msg = _new_response_message( # <<<<<<<<<<<<<< * self.http_version(), self._cparser.status_code, self._reason, * headers, raw_headers, should_close, encoding, */ /*else*/ { /* "aiohttp/_http_parser.pyx":426 * else: * msg = _new_response_message( * self.http_version(), self._cparser.status_code, self._reason, # <<<<<<<<<<<<<< * headers, raw_headers, should_close, encoding, * upgrade, chunked) */ __pyx_t_7 = __pyx_f_7aiohttp_12_http_parser_10HttpParser_http_version(__pyx_v_self); if (unlikely(!__pyx_t_7)) __PYX_ERR(0, 426, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_7); __pyx_t_4 = __pyx_v_self->_reason; __Pyx_INCREF(__pyx_t_4); /* "aiohttp/_http_parser.pyx":425 * should_close, encoding, upgrade, chunked, self._url) * else: * msg = _new_response_message( # <<<<<<<<<<<<<< * self.http_version(), self._cparser.status_code, self._reason, * headers, raw_headers, should_close, encoding, */ __pyx_t_3 = __pyx_f_7aiohttp_12_http_parser__new_response_message(__pyx_t_7, __pyx_v_self->_cparser->status_code, ((PyObject*)__pyx_t_4), __pyx_v_headers, __pyx_v_raw_headers, __pyx_v_should_close, __pyx_v_encoding, __pyx_v_upgrade, __pyx_v_chunked); if (unlikely(!__pyx_t_3)) __PYX_ERR(0, 425, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_3); __Pyx_DECREF(__pyx_t_7); __pyx_t_7 = 0; __Pyx_DECREF(__pyx_t_4); __pyx_t_4 = 0; __pyx_v_msg = __pyx_t_3; __pyx_t_3 = 0; } __pyx_L12:; /* "aiohttp/_http_parser.pyx":430 * upgrade, chunked) * * if (self._cparser.content_length > 0 or chunked or # <<<<<<<<<<<<<< * self._cparser.method == 5): # CONNECT: 5 * payload = StreamReader( */ __pyx_t_5 = ((__pyx_v_self->_cparser->content_length > 0) != 0); if (!__pyx_t_5) { } else { __pyx_t_6 = __pyx_t_5; goto __pyx_L14_bool_binop_done; } __pyx_t_5 = (__pyx_v_chunked != 0); if (!__pyx_t_5) { } else { __pyx_t_6 = __pyx_t_5; goto __pyx_L14_bool_binop_done; } /* "aiohttp/_http_parser.pyx":431 * * if (self._cparser.content_length > 0 or chunked or * self._cparser.method == 5): # CONNECT: 5 # <<<<<<<<<<<<<< * payload = StreamReader( * self._protocol, timer=self._timer, loop=self._loop) */ __pyx_t_5 = ((__pyx_v_self->_cparser->method == 5) != 0); __pyx_t_6 = __pyx_t_5; __pyx_L14_bool_binop_done:; /* "aiohttp/_http_parser.pyx":430 * upgrade, chunked) * * if (self._cparser.content_length > 0 or chunked or # <<<<<<<<<<<<<< * self._cparser.method == 5): # CONNECT: 5 * payload = StreamReader( */ if (__pyx_t_6) { /* "aiohttp/_http_parser.pyx":432 * if (self._cparser.content_length > 0 or chunked or * self._cparser.method == 5): # CONNECT: 5 * payload = StreamReader( # <<<<<<<<<<<<<< * self._protocol, timer=self._timer, loop=self._loop) * else: */ __pyx_t_3 = PyTuple_New(1); if (unlikely(!__pyx_t_3)) __PYX_ERR(0, 432, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_3); __Pyx_INCREF(__pyx_v_self->_protocol); __Pyx_GIVEREF(__pyx_v_self->_protocol); PyTuple_SET_ITEM(__pyx_t_3, 0, __pyx_v_self->_protocol); /* "aiohttp/_http_parser.pyx":433 * self._cparser.method == 5): # CONNECT: 5 * payload = StreamReader( * self._protocol, timer=self._timer, loop=self._loop) # <<<<<<<<<<<<<< * else: * payload = 
EMPTY_PAYLOAD */ __pyx_t_4 = __Pyx_PyDict_NewPresized(2); if (unlikely(!__pyx_t_4)) __PYX_ERR(0, 433, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_4); if (PyDict_SetItem(__pyx_t_4, __pyx_n_s_timer, __pyx_v_self->_timer) < 0) __PYX_ERR(0, 433, __pyx_L1_error) if (PyDict_SetItem(__pyx_t_4, __pyx_n_s_loop, __pyx_v_self->_loop) < 0) __PYX_ERR(0, 433, __pyx_L1_error) /* "aiohttp/_http_parser.pyx":432 * if (self._cparser.content_length > 0 or chunked or * self._cparser.method == 5): # CONNECT: 5 * payload = StreamReader( # <<<<<<<<<<<<<< * self._protocol, timer=self._timer, loop=self._loop) * else: */ __pyx_t_7 = __Pyx_PyObject_Call(__pyx_v_7aiohttp_12_http_parser_StreamReader, __pyx_t_3, __pyx_t_4); if (unlikely(!__pyx_t_7)) __PYX_ERR(0, 432, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_7); __Pyx_DECREF(__pyx_t_3); __pyx_t_3 = 0; __Pyx_DECREF(__pyx_t_4); __pyx_t_4 = 0; __pyx_v_payload = __pyx_t_7; __pyx_t_7 = 0; /* "aiohttp/_http_parser.pyx":430 * upgrade, chunked) * * if (self._cparser.content_length > 0 or chunked or # <<<<<<<<<<<<<< * self._cparser.method == 5): # CONNECT: 5 * payload = StreamReader( */ goto __pyx_L13; } /* "aiohttp/_http_parser.pyx":435 * self._protocol, timer=self._timer, loop=self._loop) * else: * payload = EMPTY_PAYLOAD # <<<<<<<<<<<<<< * * self._payload = payload */ /*else*/ { __Pyx_INCREF(__pyx_v_7aiohttp_12_http_parser_EMPTY_PAYLOAD); __pyx_v_payload = __pyx_v_7aiohttp_12_http_parser_EMPTY_PAYLOAD; } __pyx_L13:; /* "aiohttp/_http_parser.pyx":437 * payload = EMPTY_PAYLOAD * * self._payload = payload # <<<<<<<<<<<<<< * if encoding is not None and self._auto_decompress: * self._payload = DeflateBuffer(payload, encoding) */ __Pyx_INCREF(__pyx_v_payload); __Pyx_GIVEREF(__pyx_v_payload); __Pyx_GOTREF(__pyx_v_self->_payload); __Pyx_DECREF(__pyx_v_self->_payload); __pyx_v_self->_payload = __pyx_v_payload; /* "aiohttp/_http_parser.pyx":438 * * self._payload = payload * if encoding is not None and self._auto_decompress: # <<<<<<<<<<<<<< * self._payload = DeflateBuffer(payload, encoding) * */ __pyx_t_5 = (__pyx_v_encoding != Py_None); __pyx_t_8 = (__pyx_t_5 != 0); if (__pyx_t_8) { } else { __pyx_t_6 = __pyx_t_8; goto __pyx_L18_bool_binop_done; } __pyx_t_8 = (__pyx_v_self->_auto_decompress != 0); __pyx_t_6 = __pyx_t_8; __pyx_L18_bool_binop_done:; if (__pyx_t_6) { /* "aiohttp/_http_parser.pyx":439 * self._payload = payload * if encoding is not None and self._auto_decompress: * self._payload = DeflateBuffer(payload, encoding) # <<<<<<<<<<<<<< * * if not self._response_with_body: */ __Pyx_INCREF(__pyx_v_7aiohttp_12_http_parser_DeflateBuffer); __pyx_t_4 = __pyx_v_7aiohttp_12_http_parser_DeflateBuffer; __pyx_t_3 = NULL; __pyx_t_9 = 0; if (CYTHON_UNPACK_METHODS && unlikely(PyMethod_Check(__pyx_t_4))) { __pyx_t_3 = PyMethod_GET_SELF(__pyx_t_4); if (likely(__pyx_t_3)) { PyObject* function = PyMethod_GET_FUNCTION(__pyx_t_4); __Pyx_INCREF(__pyx_t_3); __Pyx_INCREF(function); __Pyx_DECREF_SET(__pyx_t_4, function); __pyx_t_9 = 1; } } #if CYTHON_FAST_PYCALL if (PyFunction_Check(__pyx_t_4)) { PyObject *__pyx_temp[3] = {__pyx_t_3, __pyx_v_payload, __pyx_v_encoding}; __pyx_t_7 = __Pyx_PyFunction_FastCall(__pyx_t_4, __pyx_temp+1-__pyx_t_9, 2+__pyx_t_9); if (unlikely(!__pyx_t_7)) __PYX_ERR(0, 439, __pyx_L1_error) __Pyx_XDECREF(__pyx_t_3); __pyx_t_3 = 0; __Pyx_GOTREF(__pyx_t_7); } else #endif #if CYTHON_FAST_PYCCALL if (__Pyx_PyFastCFunction_Check(__pyx_t_4)) { PyObject *__pyx_temp[3] = {__pyx_t_3, __pyx_v_payload, __pyx_v_encoding}; __pyx_t_7 = __Pyx_PyCFunction_FastCall(__pyx_t_4, __pyx_temp+1-__pyx_t_9, 
2+__pyx_t_9); if (unlikely(!__pyx_t_7)) __PYX_ERR(0, 439, __pyx_L1_error) __Pyx_XDECREF(__pyx_t_3); __pyx_t_3 = 0; __Pyx_GOTREF(__pyx_t_7); } else #endif { __pyx_t_1 = PyTuple_New(2+__pyx_t_9); if (unlikely(!__pyx_t_1)) __PYX_ERR(0, 439, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_1); if (__pyx_t_3) { __Pyx_GIVEREF(__pyx_t_3); PyTuple_SET_ITEM(__pyx_t_1, 0, __pyx_t_3); __pyx_t_3 = NULL; } __Pyx_INCREF(__pyx_v_payload); __Pyx_GIVEREF(__pyx_v_payload); PyTuple_SET_ITEM(__pyx_t_1, 0+__pyx_t_9, __pyx_v_payload); __Pyx_INCREF(__pyx_v_encoding); __Pyx_GIVEREF(__pyx_v_encoding); PyTuple_SET_ITEM(__pyx_t_1, 1+__pyx_t_9, __pyx_v_encoding); __pyx_t_7 = __Pyx_PyObject_Call(__pyx_t_4, __pyx_t_1, NULL); if (unlikely(!__pyx_t_7)) __PYX_ERR(0, 439, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_7); __Pyx_DECREF(__pyx_t_1); __pyx_t_1 = 0; } __Pyx_DECREF(__pyx_t_4); __pyx_t_4 = 0; __Pyx_GIVEREF(__pyx_t_7); __Pyx_GOTREF(__pyx_v_self->_payload); __Pyx_DECREF(__pyx_v_self->_payload); __pyx_v_self->_payload = __pyx_t_7; __pyx_t_7 = 0; /* "aiohttp/_http_parser.pyx":438 * * self._payload = payload * if encoding is not None and self._auto_decompress: # <<<<<<<<<<<<<< * self._payload = DeflateBuffer(payload, encoding) * */ } /* "aiohttp/_http_parser.pyx":441 * self._payload = DeflateBuffer(payload, encoding) * * if not self._response_with_body: # <<<<<<<<<<<<<< * payload = EMPTY_PAYLOAD * */ __pyx_t_6 = ((!(__pyx_v_self->_response_with_body != 0)) != 0); if (__pyx_t_6) { /* "aiohttp/_http_parser.pyx":442 * * if not self._response_with_body: * payload = EMPTY_PAYLOAD # <<<<<<<<<<<<<< * * self._messages.append((msg, payload)) */ __Pyx_INCREF(__pyx_v_7aiohttp_12_http_parser_EMPTY_PAYLOAD); __Pyx_DECREF_SET(__pyx_v_payload, __pyx_v_7aiohttp_12_http_parser_EMPTY_PAYLOAD); /* "aiohttp/_http_parser.pyx":441 * self._payload = DeflateBuffer(payload, encoding) * * if not self._response_with_body: # <<<<<<<<<<<<<< * payload = EMPTY_PAYLOAD * */ } /* "aiohttp/_http_parser.pyx":444 * payload = EMPTY_PAYLOAD * * self._messages.append((msg, payload)) # <<<<<<<<<<<<<< * * cdef _on_message_complete(self): */ if (unlikely(__pyx_v_self->_messages == Py_None)) { PyErr_Format(PyExc_AttributeError, "'NoneType' object has no attribute '%.30s'", "append"); __PYX_ERR(0, 444, __pyx_L1_error) } __pyx_t_7 = PyTuple_New(2); if (unlikely(!__pyx_t_7)) __PYX_ERR(0, 444, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_7); __Pyx_INCREF(__pyx_v_msg); __Pyx_GIVEREF(__pyx_v_msg); PyTuple_SET_ITEM(__pyx_t_7, 0, __pyx_v_msg); __Pyx_INCREF(__pyx_v_payload); __Pyx_GIVEREF(__pyx_v_payload); PyTuple_SET_ITEM(__pyx_t_7, 1, __pyx_v_payload); __pyx_t_10 = __Pyx_PyList_Append(__pyx_v_self->_messages, __pyx_t_7); if (unlikely(__pyx_t_10 == ((int)-1))) __PYX_ERR(0, 444, __pyx_L1_error) __Pyx_DECREF(__pyx_t_7); __pyx_t_7 = 0; /* "aiohttp/_http_parser.pyx":393 * self._has_value = True * * cdef _on_headers_complete(self): # <<<<<<<<<<<<<< * self._process_header() * */ /* function exit code */ __pyx_r = Py_None; __Pyx_INCREF(Py_None); goto __pyx_L0; __pyx_L1_error:; __Pyx_XDECREF(__pyx_t_1); __Pyx_XDECREF(__pyx_t_3); __Pyx_XDECREF(__pyx_t_4); __Pyx_XDECREF(__pyx_t_7); __Pyx_AddTraceback("aiohttp._http_parser.HttpParser._on_headers_complete", __pyx_clineno, __pyx_lineno, __pyx_filename); __pyx_r = 0; __pyx_L0:; __Pyx_XDECREF(__pyx_v_method); __Pyx_XDECREF(__pyx_v_raw_headers); __Pyx_XDECREF(__pyx_v_headers); __Pyx_XDECREF(__pyx_v_encoding); __Pyx_XDECREF(__pyx_v_enc); __Pyx_XDECREF(__pyx_v_msg); __Pyx_XDECREF(__pyx_v_payload); __Pyx_XGIVEREF(__pyx_r); __Pyx_RefNannyFinishContext(); return 
__pyx_r; } /* "aiohttp/_http_parser.pyx":446 * self._messages.append((msg, payload)) * * cdef _on_message_complete(self): # <<<<<<<<<<<<<< * self._payload.feed_eof() * self._payload = None */ static PyObject *__pyx_f_7aiohttp_12_http_parser_10HttpParser__on_message_complete(struct __pyx_obj_7aiohttp_12_http_parser_HttpParser *__pyx_v_self) { PyObject *__pyx_r = NULL; __Pyx_RefNannyDeclarations PyObject *__pyx_t_1 = NULL; PyObject *__pyx_t_2 = NULL; PyObject *__pyx_t_3 = NULL; __Pyx_RefNannySetupContext("_on_message_complete", 0); /* "aiohttp/_http_parser.pyx":447 * * cdef _on_message_complete(self): * self._payload.feed_eof() # <<<<<<<<<<<<<< * self._payload = None * */ __pyx_t_2 = __Pyx_PyObject_GetAttrStr(__pyx_v_self->_payload, __pyx_n_s_feed_eof); if (unlikely(!__pyx_t_2)) __PYX_ERR(0, 447, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_2); __pyx_t_3 = NULL; if (CYTHON_UNPACK_METHODS && likely(PyMethod_Check(__pyx_t_2))) { __pyx_t_3 = PyMethod_GET_SELF(__pyx_t_2); if (likely(__pyx_t_3)) { PyObject* function = PyMethod_GET_FUNCTION(__pyx_t_2); __Pyx_INCREF(__pyx_t_3); __Pyx_INCREF(function); __Pyx_DECREF_SET(__pyx_t_2, function); } } __pyx_t_1 = (__pyx_t_3) ? __Pyx_PyObject_CallOneArg(__pyx_t_2, __pyx_t_3) : __Pyx_PyObject_CallNoArg(__pyx_t_2); __Pyx_XDECREF(__pyx_t_3); __pyx_t_3 = 0; if (unlikely(!__pyx_t_1)) __PYX_ERR(0, 447, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_1); __Pyx_DECREF(__pyx_t_2); __pyx_t_2 = 0; __Pyx_DECREF(__pyx_t_1); __pyx_t_1 = 0; /* "aiohttp/_http_parser.pyx":448 * cdef _on_message_complete(self): * self._payload.feed_eof() * self._payload = None # <<<<<<<<<<<<<< * * cdef _on_chunk_header(self): */ __Pyx_INCREF(Py_None); __Pyx_GIVEREF(Py_None); __Pyx_GOTREF(__pyx_v_self->_payload); __Pyx_DECREF(__pyx_v_self->_payload); __pyx_v_self->_payload = Py_None; /* "aiohttp/_http_parser.pyx":446 * self._messages.append((msg, payload)) * * cdef _on_message_complete(self): # <<<<<<<<<<<<<< * self._payload.feed_eof() * self._payload = None */ /* function exit code */ __pyx_r = Py_None; __Pyx_INCREF(Py_None); goto __pyx_L0; __pyx_L1_error:; __Pyx_XDECREF(__pyx_t_1); __Pyx_XDECREF(__pyx_t_2); __Pyx_XDECREF(__pyx_t_3); __Pyx_AddTraceback("aiohttp._http_parser.HttpParser._on_message_complete", __pyx_clineno, __pyx_lineno, __pyx_filename); __pyx_r = 0; __pyx_L0:; __Pyx_XGIVEREF(__pyx_r); __Pyx_RefNannyFinishContext(); return __pyx_r; } /* "aiohttp/_http_parser.pyx":450 * self._payload = None * * cdef _on_chunk_header(self): # <<<<<<<<<<<<<< * self._payload.begin_http_chunk_receiving() * */ static PyObject *__pyx_f_7aiohttp_12_http_parser_10HttpParser__on_chunk_header(struct __pyx_obj_7aiohttp_12_http_parser_HttpParser *__pyx_v_self) { PyObject *__pyx_r = NULL; __Pyx_RefNannyDeclarations PyObject *__pyx_t_1 = NULL; PyObject *__pyx_t_2 = NULL; PyObject *__pyx_t_3 = NULL; __Pyx_RefNannySetupContext("_on_chunk_header", 0); /* "aiohttp/_http_parser.pyx":451 * * cdef _on_chunk_header(self): * self._payload.begin_http_chunk_receiving() # <<<<<<<<<<<<<< * * cdef _on_chunk_complete(self): */ __pyx_t_2 = __Pyx_PyObject_GetAttrStr(__pyx_v_self->_payload, __pyx_n_s_begin_http_chunk_receiving); if (unlikely(!__pyx_t_2)) __PYX_ERR(0, 451, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_2); __pyx_t_3 = NULL; if (CYTHON_UNPACK_METHODS && likely(PyMethod_Check(__pyx_t_2))) { __pyx_t_3 = PyMethod_GET_SELF(__pyx_t_2); if (likely(__pyx_t_3)) { PyObject* function = PyMethod_GET_FUNCTION(__pyx_t_2); __Pyx_INCREF(__pyx_t_3); __Pyx_INCREF(function); __Pyx_DECREF_SET(__pyx_t_2, function); } } __pyx_t_1 = (__pyx_t_3) ? 
__Pyx_PyObject_CallOneArg(__pyx_t_2, __pyx_t_3) : __Pyx_PyObject_CallNoArg(__pyx_t_2); __Pyx_XDECREF(__pyx_t_3); __pyx_t_3 = 0; if (unlikely(!__pyx_t_1)) __PYX_ERR(0, 451, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_1); __Pyx_DECREF(__pyx_t_2); __pyx_t_2 = 0; __Pyx_DECREF(__pyx_t_1); __pyx_t_1 = 0; /* "aiohttp/_http_parser.pyx":450 * self._payload = None * * cdef _on_chunk_header(self): # <<<<<<<<<<<<<< * self._payload.begin_http_chunk_receiving() * */ /* function exit code */ __pyx_r = Py_None; __Pyx_INCREF(Py_None); goto __pyx_L0; __pyx_L1_error:; __Pyx_XDECREF(__pyx_t_1); __Pyx_XDECREF(__pyx_t_2); __Pyx_XDECREF(__pyx_t_3); __Pyx_AddTraceback("aiohttp._http_parser.HttpParser._on_chunk_header", __pyx_clineno, __pyx_lineno, __pyx_filename); __pyx_r = 0; __pyx_L0:; __Pyx_XGIVEREF(__pyx_r); __Pyx_RefNannyFinishContext(); return __pyx_r; } /* "aiohttp/_http_parser.pyx":453 * self._payload.begin_http_chunk_receiving() * * cdef _on_chunk_complete(self): # <<<<<<<<<<<<<< * self._payload.end_http_chunk_receiving() * */ static PyObject *__pyx_f_7aiohttp_12_http_parser_10HttpParser__on_chunk_complete(struct __pyx_obj_7aiohttp_12_http_parser_HttpParser *__pyx_v_self) { PyObject *__pyx_r = NULL; __Pyx_RefNannyDeclarations PyObject *__pyx_t_1 = NULL; PyObject *__pyx_t_2 = NULL; PyObject *__pyx_t_3 = NULL; __Pyx_RefNannySetupContext("_on_chunk_complete", 0); /* "aiohttp/_http_parser.pyx":454 * * cdef _on_chunk_complete(self): * self._payload.end_http_chunk_receiving() # <<<<<<<<<<<<<< * * cdef object _on_status_complete(self): */ __pyx_t_2 = __Pyx_PyObject_GetAttrStr(__pyx_v_self->_payload, __pyx_n_s_end_http_chunk_receiving); if (unlikely(!__pyx_t_2)) __PYX_ERR(0, 454, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_2); __pyx_t_3 = NULL; if (CYTHON_UNPACK_METHODS && likely(PyMethod_Check(__pyx_t_2))) { __pyx_t_3 = PyMethod_GET_SELF(__pyx_t_2); if (likely(__pyx_t_3)) { PyObject* function = PyMethod_GET_FUNCTION(__pyx_t_2); __Pyx_INCREF(__pyx_t_3); __Pyx_INCREF(function); __Pyx_DECREF_SET(__pyx_t_2, function); } } __pyx_t_1 = (__pyx_t_3) ? 
__Pyx_PyObject_CallOneArg(__pyx_t_2, __pyx_t_3) : __Pyx_PyObject_CallNoArg(__pyx_t_2); __Pyx_XDECREF(__pyx_t_3); __pyx_t_3 = 0; if (unlikely(!__pyx_t_1)) __PYX_ERR(0, 454, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_1); __Pyx_DECREF(__pyx_t_2); __pyx_t_2 = 0; __Pyx_DECREF(__pyx_t_1); __pyx_t_1 = 0; /* "aiohttp/_http_parser.pyx":453 * self._payload.begin_http_chunk_receiving() * * cdef _on_chunk_complete(self): # <<<<<<<<<<<<<< * self._payload.end_http_chunk_receiving() * */ /* function exit code */ __pyx_r = Py_None; __Pyx_INCREF(Py_None); goto __pyx_L0; __pyx_L1_error:; __Pyx_XDECREF(__pyx_t_1); __Pyx_XDECREF(__pyx_t_2); __Pyx_XDECREF(__pyx_t_3); __Pyx_AddTraceback("aiohttp._http_parser.HttpParser._on_chunk_complete", __pyx_clineno, __pyx_lineno, __pyx_filename); __pyx_r = 0; __pyx_L0:; __Pyx_XGIVEREF(__pyx_r); __Pyx_RefNannyFinishContext(); return __pyx_r; } /* "aiohttp/_http_parser.pyx":456 * self._payload.end_http_chunk_receiving() * * cdef object _on_status_complete(self): # <<<<<<<<<<<<<< * pass * */ static PyObject *__pyx_f_7aiohttp_12_http_parser_10HttpParser__on_status_complete(CYTHON_UNUSED struct __pyx_obj_7aiohttp_12_http_parser_HttpParser *__pyx_v_self) { PyObject *__pyx_r = NULL; __Pyx_RefNannyDeclarations __Pyx_RefNannySetupContext("_on_status_complete", 0); /* function exit code */ __pyx_r = Py_None; __Pyx_INCREF(Py_None); __Pyx_XGIVEREF(__pyx_r); __Pyx_RefNannyFinishContext(); return __pyx_r; } /* "aiohttp/_http_parser.pyx":459 * pass * * cdef inline http_version(self): # <<<<<<<<<<<<<< * cdef cparser.http_parser* parser = self._cparser * */ static CYTHON_INLINE PyObject *__pyx_f_7aiohttp_12_http_parser_10HttpParser_http_version(struct __pyx_obj_7aiohttp_12_http_parser_HttpParser *__pyx_v_self) { struct http_parser *__pyx_v_parser; PyObject *__pyx_r = NULL; __Pyx_RefNannyDeclarations struct http_parser *__pyx_t_1; int __pyx_t_2; PyObject *__pyx_t_3 = NULL; PyObject *__pyx_t_4 = NULL; PyObject *__pyx_t_5 = NULL; PyObject *__pyx_t_6 = NULL; PyObject *__pyx_t_7 = NULL; int __pyx_t_8; PyObject *__pyx_t_9 = NULL; __Pyx_RefNannySetupContext("http_version", 0); /* "aiohttp/_http_parser.pyx":460 * * cdef inline http_version(self): * cdef cparser.http_parser* parser = self._cparser # <<<<<<<<<<<<<< * * if parser.http_major == 1: */ __pyx_t_1 = __pyx_v_self->_cparser; __pyx_v_parser = __pyx_t_1; /* "aiohttp/_http_parser.pyx":462 * cdef cparser.http_parser* parser = self._cparser * * if parser.http_major == 1: # <<<<<<<<<<<<<< * if parser.http_minor == 0: * return HttpVersion10 */ __pyx_t_2 = ((__pyx_v_parser->http_major == 1) != 0); if (__pyx_t_2) { /* "aiohttp/_http_parser.pyx":463 * * if parser.http_major == 1: * if parser.http_minor == 0: # <<<<<<<<<<<<<< * return HttpVersion10 * elif parser.http_minor == 1: */ switch (__pyx_v_parser->http_minor) { case 0: /* "aiohttp/_http_parser.pyx":464 * if parser.http_major == 1: * if parser.http_minor == 0: * return HttpVersion10 # <<<<<<<<<<<<<< * elif parser.http_minor == 1: * return HttpVersion11 */ __Pyx_XDECREF(__pyx_r); __Pyx_INCREF(__pyx_v_7aiohttp_12_http_parser_HttpVersion10); __pyx_r = __pyx_v_7aiohttp_12_http_parser_HttpVersion10; goto __pyx_L0; /* "aiohttp/_http_parser.pyx":463 * * if parser.http_major == 1: * if parser.http_minor == 0: # <<<<<<<<<<<<<< * return HttpVersion10 * elif parser.http_minor == 1: */ break; case 1: /* "aiohttp/_http_parser.pyx":466 * return HttpVersion10 * elif parser.http_minor == 1: * return HttpVersion11 # <<<<<<<<<<<<<< * * return HttpVersion(parser.http_major, parser.http_minor) */ __Pyx_XDECREF(__pyx_r); 
__Pyx_INCREF(__pyx_v_7aiohttp_12_http_parser_HttpVersion11); __pyx_r = __pyx_v_7aiohttp_12_http_parser_HttpVersion11; goto __pyx_L0; /* "aiohttp/_http_parser.pyx":465 * if parser.http_minor == 0: * return HttpVersion10 * elif parser.http_minor == 1: # <<<<<<<<<<<<<< * return HttpVersion11 * */ break; default: break; } /* "aiohttp/_http_parser.pyx":462 * cdef cparser.http_parser* parser = self._cparser * * if parser.http_major == 1: # <<<<<<<<<<<<<< * if parser.http_minor == 0: * return HttpVersion10 */ } /* "aiohttp/_http_parser.pyx":468 * return HttpVersion11 * * return HttpVersion(parser.http_major, parser.http_minor) # <<<<<<<<<<<<<< * * ### Public API ### */ __Pyx_XDECREF(__pyx_r); __pyx_t_4 = __Pyx_PyInt_From_unsigned_short(__pyx_v_parser->http_major); if (unlikely(!__pyx_t_4)) __PYX_ERR(0, 468, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_4); __pyx_t_5 = __Pyx_PyInt_From_unsigned_short(__pyx_v_parser->http_minor); if (unlikely(!__pyx_t_5)) __PYX_ERR(0, 468, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_5); __Pyx_INCREF(__pyx_v_7aiohttp_12_http_parser_HttpVersion); __pyx_t_6 = __pyx_v_7aiohttp_12_http_parser_HttpVersion; __pyx_t_7 = NULL; __pyx_t_8 = 0; if (CYTHON_UNPACK_METHODS && unlikely(PyMethod_Check(__pyx_t_6))) { __pyx_t_7 = PyMethod_GET_SELF(__pyx_t_6); if (likely(__pyx_t_7)) { PyObject* function = PyMethod_GET_FUNCTION(__pyx_t_6); __Pyx_INCREF(__pyx_t_7); __Pyx_INCREF(function); __Pyx_DECREF_SET(__pyx_t_6, function); __pyx_t_8 = 1; } } #if CYTHON_FAST_PYCALL if (PyFunction_Check(__pyx_t_6)) { PyObject *__pyx_temp[3] = {__pyx_t_7, __pyx_t_4, __pyx_t_5}; __pyx_t_3 = __Pyx_PyFunction_FastCall(__pyx_t_6, __pyx_temp+1-__pyx_t_8, 2+__pyx_t_8); if (unlikely(!__pyx_t_3)) __PYX_ERR(0, 468, __pyx_L1_error) __Pyx_XDECREF(__pyx_t_7); __pyx_t_7 = 0; __Pyx_GOTREF(__pyx_t_3); __Pyx_DECREF(__pyx_t_4); __pyx_t_4 = 0; __Pyx_DECREF(__pyx_t_5); __pyx_t_5 = 0; } else #endif #if CYTHON_FAST_PYCCALL if (__Pyx_PyFastCFunction_Check(__pyx_t_6)) { PyObject *__pyx_temp[3] = {__pyx_t_7, __pyx_t_4, __pyx_t_5}; __pyx_t_3 = __Pyx_PyCFunction_FastCall(__pyx_t_6, __pyx_temp+1-__pyx_t_8, 2+__pyx_t_8); if (unlikely(!__pyx_t_3)) __PYX_ERR(0, 468, __pyx_L1_error) __Pyx_XDECREF(__pyx_t_7); __pyx_t_7 = 0; __Pyx_GOTREF(__pyx_t_3); __Pyx_DECREF(__pyx_t_4); __pyx_t_4 = 0; __Pyx_DECREF(__pyx_t_5); __pyx_t_5 = 0; } else #endif { __pyx_t_9 = PyTuple_New(2+__pyx_t_8); if (unlikely(!__pyx_t_9)) __PYX_ERR(0, 468, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_9); if (__pyx_t_7) { __Pyx_GIVEREF(__pyx_t_7); PyTuple_SET_ITEM(__pyx_t_9, 0, __pyx_t_7); __pyx_t_7 = NULL; } __Pyx_GIVEREF(__pyx_t_4); PyTuple_SET_ITEM(__pyx_t_9, 0+__pyx_t_8, __pyx_t_4); __Pyx_GIVEREF(__pyx_t_5); PyTuple_SET_ITEM(__pyx_t_9, 1+__pyx_t_8, __pyx_t_5); __pyx_t_4 = 0; __pyx_t_5 = 0; __pyx_t_3 = __Pyx_PyObject_Call(__pyx_t_6, __pyx_t_9, NULL); if (unlikely(!__pyx_t_3)) __PYX_ERR(0, 468, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_3); __Pyx_DECREF(__pyx_t_9); __pyx_t_9 = 0; } __Pyx_DECREF(__pyx_t_6); __pyx_t_6 = 0; __pyx_r = __pyx_t_3; __pyx_t_3 = 0; goto __pyx_L0; /* "aiohttp/_http_parser.pyx":459 * pass * * cdef inline http_version(self): # <<<<<<<<<<<<<< * cdef cparser.http_parser* parser = self._cparser * */ /* function exit code */ __pyx_L1_error:; __Pyx_XDECREF(__pyx_t_3); __Pyx_XDECREF(__pyx_t_4); __Pyx_XDECREF(__pyx_t_5); __Pyx_XDECREF(__pyx_t_6); __Pyx_XDECREF(__pyx_t_7); __Pyx_XDECREF(__pyx_t_9); __Pyx_AddTraceback("aiohttp._http_parser.HttpParser.http_version", __pyx_clineno, __pyx_lineno, __pyx_filename); __pyx_r = 0; __pyx_L0:; __Pyx_XGIVEREF(__pyx_r); 
__Pyx_RefNannyFinishContext(); return __pyx_r; } /* "aiohttp/_http_parser.pyx":472 * ### Public API ### * * def feed_eof(self): # <<<<<<<<<<<<<< * cdef bytes desc * */ /* Python wrapper */ static PyObject *__pyx_pw_7aiohttp_12_http_parser_10HttpParser_5feed_eof(PyObject *__pyx_v_self, CYTHON_UNUSED PyObject *unused); /*proto*/ static PyObject *__pyx_pw_7aiohttp_12_http_parser_10HttpParser_5feed_eof(PyObject *__pyx_v_self, CYTHON_UNUSED PyObject *unused) { PyObject *__pyx_r = 0; __Pyx_RefNannyDeclarations __Pyx_RefNannySetupContext("feed_eof (wrapper)", 0); __pyx_r = __pyx_pf_7aiohttp_12_http_parser_10HttpParser_4feed_eof(((struct __pyx_obj_7aiohttp_12_http_parser_HttpParser *)__pyx_v_self)); /* function exit code */ __Pyx_RefNannyFinishContext(); return __pyx_r; } static PyObject *__pyx_pf_7aiohttp_12_http_parser_10HttpParser_4feed_eof(struct __pyx_obj_7aiohttp_12_http_parser_HttpParser *__pyx_v_self) { PyObject *__pyx_v_desc = 0; PyObject *__pyx_r = NULL; __Pyx_RefNannyDeclarations int __pyx_t_1; int __pyx_t_2; PyObject *__pyx_t_3 = NULL; PyObject *__pyx_t_4 = NULL; PyObject *__pyx_t_5 = NULL; PyObject *__pyx_t_6 = NULL; __Pyx_RefNannySetupContext("feed_eof", 0); /* "aiohttp/_http_parser.pyx":475 * cdef bytes desc * * if self._payload is not None: # <<<<<<<<<<<<<< * if self._cparser.flags & cparser.F_CHUNKED: * raise TransferEncodingError( */ __pyx_t_1 = (__pyx_v_self->_payload != Py_None); __pyx_t_2 = (__pyx_t_1 != 0); if (__pyx_t_2) { /* "aiohttp/_http_parser.pyx":476 * * if self._payload is not None: * if self._cparser.flags & cparser.F_CHUNKED: # <<<<<<<<<<<<<< * raise TransferEncodingError( * "Not enough data for satisfy transfer length header.") */ __pyx_t_2 = ((__pyx_v_self->_cparser->flags & F_CHUNKED) != 0); if (unlikely(__pyx_t_2)) { /* "aiohttp/_http_parser.pyx":477 * if self._payload is not None: * if self._cparser.flags & cparser.F_CHUNKED: * raise TransferEncodingError( # <<<<<<<<<<<<<< * "Not enough data for satisfy transfer length header.") * elif self._cparser.flags & cparser.F_CONTENTLENGTH: */ __Pyx_GetModuleGlobalName(__pyx_t_4, __pyx_n_s_TransferEncodingError); if (unlikely(!__pyx_t_4)) __PYX_ERR(0, 477, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_4); __pyx_t_5 = NULL; if (CYTHON_UNPACK_METHODS && unlikely(PyMethod_Check(__pyx_t_4))) { __pyx_t_5 = PyMethod_GET_SELF(__pyx_t_4); if (likely(__pyx_t_5)) { PyObject* function = PyMethod_GET_FUNCTION(__pyx_t_4); __Pyx_INCREF(__pyx_t_5); __Pyx_INCREF(function); __Pyx_DECREF_SET(__pyx_t_4, function); } } __pyx_t_3 = (__pyx_t_5) ? 
__Pyx_PyObject_Call2Args(__pyx_t_4, __pyx_t_5, __pyx_kp_u_Not_enough_data_for_satisfy_tran) : __Pyx_PyObject_CallOneArg(__pyx_t_4, __pyx_kp_u_Not_enough_data_for_satisfy_tran); __Pyx_XDECREF(__pyx_t_5); __pyx_t_5 = 0; if (unlikely(!__pyx_t_3)) __PYX_ERR(0, 477, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_3); __Pyx_DECREF(__pyx_t_4); __pyx_t_4 = 0; __Pyx_Raise(__pyx_t_3, 0, 0, 0); __Pyx_DECREF(__pyx_t_3); __pyx_t_3 = 0; __PYX_ERR(0, 477, __pyx_L1_error) /* "aiohttp/_http_parser.pyx":476 * * if self._payload is not None: * if self._cparser.flags & cparser.F_CHUNKED: # <<<<<<<<<<<<<< * raise TransferEncodingError( * "Not enough data for satisfy transfer length header.") */ } /* "aiohttp/_http_parser.pyx":479 * raise TransferEncodingError( * "Not enough data for satisfy transfer length header.") * elif self._cparser.flags & cparser.F_CONTENTLENGTH: # <<<<<<<<<<<<<< * raise ContentLengthError( * "Not enough data for satisfy content length header.") */ __pyx_t_2 = ((__pyx_v_self->_cparser->flags & F_CONTENTLENGTH) != 0); if (unlikely(__pyx_t_2)) { /* "aiohttp/_http_parser.pyx":480 * "Not enough data for satisfy transfer length header.") * elif self._cparser.flags & cparser.F_CONTENTLENGTH: * raise ContentLengthError( # <<<<<<<<<<<<<< * "Not enough data for satisfy content length header.") * elif self._cparser.http_errno != cparser.HPE_OK: */ __Pyx_GetModuleGlobalName(__pyx_t_4, __pyx_n_s_ContentLengthError); if (unlikely(!__pyx_t_4)) __PYX_ERR(0, 480, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_4); __pyx_t_5 = NULL; if (CYTHON_UNPACK_METHODS && unlikely(PyMethod_Check(__pyx_t_4))) { __pyx_t_5 = PyMethod_GET_SELF(__pyx_t_4); if (likely(__pyx_t_5)) { PyObject* function = PyMethod_GET_FUNCTION(__pyx_t_4); __Pyx_INCREF(__pyx_t_5); __Pyx_INCREF(function); __Pyx_DECREF_SET(__pyx_t_4, function); } } __pyx_t_3 = (__pyx_t_5) ? 
__Pyx_PyObject_Call2Args(__pyx_t_4, __pyx_t_5, __pyx_kp_u_Not_enough_data_for_satisfy_cont) : __Pyx_PyObject_CallOneArg(__pyx_t_4, __pyx_kp_u_Not_enough_data_for_satisfy_cont); __Pyx_XDECREF(__pyx_t_5); __pyx_t_5 = 0; if (unlikely(!__pyx_t_3)) __PYX_ERR(0, 480, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_3); __Pyx_DECREF(__pyx_t_4); __pyx_t_4 = 0; __Pyx_Raise(__pyx_t_3, 0, 0, 0); __Pyx_DECREF(__pyx_t_3); __pyx_t_3 = 0; __PYX_ERR(0, 480, __pyx_L1_error) /* "aiohttp/_http_parser.pyx":479 * raise TransferEncodingError( * "Not enough data for satisfy transfer length header.") * elif self._cparser.flags & cparser.F_CONTENTLENGTH: # <<<<<<<<<<<<<< * raise ContentLengthError( * "Not enough data for satisfy content length header.") */ } /* "aiohttp/_http_parser.pyx":482 * raise ContentLengthError( * "Not enough data for satisfy content length header.") * elif self._cparser.http_errno != cparser.HPE_OK: # <<<<<<<<<<<<<< * desc = cparser.http_errno_description( * self._cparser.http_errno) */ __pyx_t_2 = ((__pyx_v_self->_cparser->http_errno != HPE_OK) != 0); if (unlikely(__pyx_t_2)) { /* "aiohttp/_http_parser.pyx":483 * "Not enough data for satisfy content length header.") * elif self._cparser.http_errno != cparser.HPE_OK: * desc = cparser.http_errno_description( # <<<<<<<<<<<<<< * self._cparser.http_errno) * raise PayloadEncodingError(desc.decode('latin-1')) */ __pyx_t_3 = __Pyx_PyBytes_FromString(http_errno_description(((enum http_errno)__pyx_v_self->_cparser->http_errno))); if (unlikely(!__pyx_t_3)) __PYX_ERR(0, 483, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_3); __pyx_v_desc = ((PyObject*)__pyx_t_3); __pyx_t_3 = 0; /* "aiohttp/_http_parser.pyx":485 * desc = cparser.http_errno_description( * self._cparser.http_errno) * raise PayloadEncodingError(desc.decode('latin-1')) # <<<<<<<<<<<<<< * else: * self._payload.feed_eof() */ __Pyx_GetModuleGlobalName(__pyx_t_4, __pyx_n_s_PayloadEncodingError); if (unlikely(!__pyx_t_4)) __PYX_ERR(0, 485, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_4); __pyx_t_5 = __Pyx_decode_bytes(__pyx_v_desc, 0, PY_SSIZE_T_MAX, NULL, NULL, PyUnicode_DecodeLatin1); if (unlikely(!__pyx_t_5)) __PYX_ERR(0, 485, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_5); __pyx_t_6 = NULL; if (CYTHON_UNPACK_METHODS && unlikely(PyMethod_Check(__pyx_t_4))) { __pyx_t_6 = PyMethod_GET_SELF(__pyx_t_4); if (likely(__pyx_t_6)) { PyObject* function = PyMethod_GET_FUNCTION(__pyx_t_4); __Pyx_INCREF(__pyx_t_6); __Pyx_INCREF(function); __Pyx_DECREF_SET(__pyx_t_4, function); } } __pyx_t_3 = (__pyx_t_6) ? 
__Pyx_PyObject_Call2Args(__pyx_t_4, __pyx_t_6, __pyx_t_5) : __Pyx_PyObject_CallOneArg(__pyx_t_4, __pyx_t_5); __Pyx_XDECREF(__pyx_t_6); __pyx_t_6 = 0; __Pyx_DECREF(__pyx_t_5); __pyx_t_5 = 0; if (unlikely(!__pyx_t_3)) __PYX_ERR(0, 485, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_3); __Pyx_DECREF(__pyx_t_4); __pyx_t_4 = 0; __Pyx_Raise(__pyx_t_3, 0, 0, 0); __Pyx_DECREF(__pyx_t_3); __pyx_t_3 = 0; __PYX_ERR(0, 485, __pyx_L1_error) /* "aiohttp/_http_parser.pyx":482 * raise ContentLengthError( * "Not enough data for satisfy content length header.") * elif self._cparser.http_errno != cparser.HPE_OK: # <<<<<<<<<<<<<< * desc = cparser.http_errno_description( * self._cparser.http_errno) */ } /* "aiohttp/_http_parser.pyx":487 * raise PayloadEncodingError(desc.decode('latin-1')) * else: * self._payload.feed_eof() # <<<<<<<<<<<<<< * elif self._started: * self._on_headers_complete() */ /*else*/ { __pyx_t_4 = __Pyx_PyObject_GetAttrStr(__pyx_v_self->_payload, __pyx_n_s_feed_eof); if (unlikely(!__pyx_t_4)) __PYX_ERR(0, 487, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_4); __pyx_t_5 = NULL; if (CYTHON_UNPACK_METHODS && likely(PyMethod_Check(__pyx_t_4))) { __pyx_t_5 = PyMethod_GET_SELF(__pyx_t_4); if (likely(__pyx_t_5)) { PyObject* function = PyMethod_GET_FUNCTION(__pyx_t_4); __Pyx_INCREF(__pyx_t_5); __Pyx_INCREF(function); __Pyx_DECREF_SET(__pyx_t_4, function); } } __pyx_t_3 = (__pyx_t_5) ? __Pyx_PyObject_CallOneArg(__pyx_t_4, __pyx_t_5) : __Pyx_PyObject_CallNoArg(__pyx_t_4); __Pyx_XDECREF(__pyx_t_5); __pyx_t_5 = 0; if (unlikely(!__pyx_t_3)) __PYX_ERR(0, 487, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_3); __Pyx_DECREF(__pyx_t_4); __pyx_t_4 = 0; __Pyx_DECREF(__pyx_t_3); __pyx_t_3 = 0; } /* "aiohttp/_http_parser.pyx":475 * cdef bytes desc * * if self._payload is not None: # <<<<<<<<<<<<<< * if self._cparser.flags & cparser.F_CHUNKED: * raise TransferEncodingError( */ goto __pyx_L3; } /* "aiohttp/_http_parser.pyx":488 * else: * self._payload.feed_eof() * elif self._started: # <<<<<<<<<<<<<< * self._on_headers_complete() * if self._messages: */ __pyx_t_2 = (__pyx_v_self->_started != 0); if (__pyx_t_2) { /* "aiohttp/_http_parser.pyx":489 * self._payload.feed_eof() * elif self._started: * self._on_headers_complete() # <<<<<<<<<<<<<< * if self._messages: * return self._messages[-1][0] */ __pyx_t_3 = ((struct __pyx_vtabstruct_7aiohttp_12_http_parser_HttpParser *)__pyx_v_self->__pyx_vtab)->_on_headers_complete(__pyx_v_self); if (unlikely(!__pyx_t_3)) __PYX_ERR(0, 489, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_3); __Pyx_DECREF(__pyx_t_3); __pyx_t_3 = 0; /* "aiohttp/_http_parser.pyx":490 * elif self._started: * self._on_headers_complete() * if self._messages: # <<<<<<<<<<<<<< * return self._messages[-1][0] * */ __pyx_t_2 = (__pyx_v_self->_messages != Py_None)&&(PyList_GET_SIZE(__pyx_v_self->_messages) != 0); if (__pyx_t_2) { /* "aiohttp/_http_parser.pyx":491 * self._on_headers_complete() * if self._messages: * return self._messages[-1][0] # <<<<<<<<<<<<<< * * def feed_data(self, data): */ __Pyx_XDECREF(__pyx_r); if (unlikely(__pyx_v_self->_messages == Py_None)) { PyErr_SetString(PyExc_TypeError, "'NoneType' object is not subscriptable"); __PYX_ERR(0, 491, __pyx_L1_error) } __pyx_t_3 = __Pyx_GetItemInt_List(__pyx_v_self->_messages, -1L, long, 1, __Pyx_PyInt_From_long, 1, 1, 1); if (unlikely(!__pyx_t_3)) __PYX_ERR(0, 491, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_3); __pyx_t_4 = __Pyx_GetItemInt(__pyx_t_3, 0, long, 1, __Pyx_PyInt_From_long, 0, 0, 1); if (unlikely(!__pyx_t_4)) __PYX_ERR(0, 491, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_4); 
__Pyx_DECREF(__pyx_t_3); __pyx_t_3 = 0; __pyx_r = __pyx_t_4; __pyx_t_4 = 0; goto __pyx_L0; /* "aiohttp/_http_parser.pyx":490 * elif self._started: * self._on_headers_complete() * if self._messages: # <<<<<<<<<<<<<< * return self._messages[-1][0] * */ } /* "aiohttp/_http_parser.pyx":488 * else: * self._payload.feed_eof() * elif self._started: # <<<<<<<<<<<<<< * self._on_headers_complete() * if self._messages: */ } __pyx_L3:; /* "aiohttp/_http_parser.pyx":472 * ### Public API ### * * def feed_eof(self): # <<<<<<<<<<<<<< * cdef bytes desc * */ /* function exit code */ __pyx_r = Py_None; __Pyx_INCREF(Py_None); goto __pyx_L0; __pyx_L1_error:; __Pyx_XDECREF(__pyx_t_3); __Pyx_XDECREF(__pyx_t_4); __Pyx_XDECREF(__pyx_t_5); __Pyx_XDECREF(__pyx_t_6); __Pyx_AddTraceback("aiohttp._http_parser.HttpParser.feed_eof", __pyx_clineno, __pyx_lineno, __pyx_filename); __pyx_r = NULL; __pyx_L0:; __Pyx_XDECREF(__pyx_v_desc); __Pyx_XGIVEREF(__pyx_r); __Pyx_RefNannyFinishContext(); return __pyx_r; } /* "aiohttp/_http_parser.pyx":493 * return self._messages[-1][0] * * def feed_data(self, data): # <<<<<<<<<<<<<< * cdef: * size_t data_len */ /* Python wrapper */ static PyObject *__pyx_pw_7aiohttp_12_http_parser_10HttpParser_7feed_data(PyObject *__pyx_v_self, PyObject *__pyx_v_data); /*proto*/ static PyObject *__pyx_pw_7aiohttp_12_http_parser_10HttpParser_7feed_data(PyObject *__pyx_v_self, PyObject *__pyx_v_data) { PyObject *__pyx_r = 0; __Pyx_RefNannyDeclarations __Pyx_RefNannySetupContext("feed_data (wrapper)", 0); __pyx_r = __pyx_pf_7aiohttp_12_http_parser_10HttpParser_6feed_data(((struct __pyx_obj_7aiohttp_12_http_parser_HttpParser *)__pyx_v_self), ((PyObject *)__pyx_v_data)); /* function exit code */ __Pyx_RefNannyFinishContext(); return __pyx_r; } static PyObject *__pyx_pf_7aiohttp_12_http_parser_10HttpParser_6feed_data(struct __pyx_obj_7aiohttp_12_http_parser_HttpParser *__pyx_v_self, PyObject *__pyx_v_data) { size_t __pyx_v_data_len; size_t __pyx_v_nb; PyObject *__pyx_v_ex = NULL; PyObject *__pyx_v_messages = NULL; PyObject *__pyx_r = NULL; __Pyx_RefNannyDeclarations int __pyx_t_1; int __pyx_t_2; int __pyx_t_3; PyObject *__pyx_t_4 = NULL; PyObject *__pyx_t_5 = NULL; __Pyx_RefNannySetupContext("feed_data", 0); /* "aiohttp/_http_parser.pyx":498 * size_t nb * * PyObject_GetBuffer(data, &self.py_buf, PyBUF_SIMPLE) # <<<<<<<<<<<<<< * data_len = self.py_buf.len * */ __pyx_t_1 = PyObject_GetBuffer(__pyx_v_data, (&__pyx_v_self->py_buf), PyBUF_SIMPLE); if (unlikely(__pyx_t_1 == ((int)-1))) __PYX_ERR(0, 498, __pyx_L1_error) /* "aiohttp/_http_parser.pyx":499 * * PyObject_GetBuffer(data, &self.py_buf, PyBUF_SIMPLE) * data_len = self.py_buf.len # <<<<<<<<<<<<<< * * nb = cparser.http_parser_execute( */ __pyx_v_data_len = ((size_t)__pyx_v_self->py_buf.len); /* "aiohttp/_http_parser.pyx":501 * data_len = self.py_buf.len * * nb = cparser.http_parser_execute( # <<<<<<<<<<<<<< * self._cparser, * self._csettings, */ __pyx_v_nb = http_parser_execute(__pyx_v_self->_cparser, __pyx_v_self->_csettings, ((char *)__pyx_v_self->py_buf.buf), __pyx_v_data_len); /* "aiohttp/_http_parser.pyx":507 * data_len) * * PyBuffer_Release(&self.py_buf) # <<<<<<<<<<<<<< * * # i am not sure about cparser.HPE_INVALID_METHOD, */ PyBuffer_Release((&__pyx_v_self->py_buf)); /* "aiohttp/_http_parser.pyx":512 * # seems get err for valid request * # test_client_functional.py::test_post_data_with_bytesio_file * if (self._cparser.http_errno != cparser.HPE_OK and # <<<<<<<<<<<<<< * (self._cparser.http_errno != cparser.HPE_INVALID_METHOD or * self._cparser.method == 
0)): */ __pyx_t_3 = ((__pyx_v_self->_cparser->http_errno != HPE_OK) != 0); if (__pyx_t_3) { } else { __pyx_t_2 = __pyx_t_3; goto __pyx_L4_bool_binop_done; } /* "aiohttp/_http_parser.pyx":513 * # test_client_functional.py::test_post_data_with_bytesio_file * if (self._cparser.http_errno != cparser.HPE_OK and * (self._cparser.http_errno != cparser.HPE_INVALID_METHOD or # <<<<<<<<<<<<<< * self._cparser.method == 0)): * if self._payload_error == 0: */ __pyx_t_3 = ((__pyx_v_self->_cparser->http_errno != HPE_INVALID_METHOD) != 0); if (!__pyx_t_3) { } else { __pyx_t_2 = __pyx_t_3; goto __pyx_L4_bool_binop_done; } /* "aiohttp/_http_parser.pyx":514 * if (self._cparser.http_errno != cparser.HPE_OK and * (self._cparser.http_errno != cparser.HPE_INVALID_METHOD or * self._cparser.method == 0)): # <<<<<<<<<<<<<< * if self._payload_error == 0: * if self._last_error is not None: */ __pyx_t_3 = ((__pyx_v_self->_cparser->method == 0) != 0); __pyx_t_2 = __pyx_t_3; __pyx_L4_bool_binop_done:; /* "aiohttp/_http_parser.pyx":512 * # seems get err for valid request * # test_client_functional.py::test_post_data_with_bytesio_file * if (self._cparser.http_errno != cparser.HPE_OK and # <<<<<<<<<<<<<< * (self._cparser.http_errno != cparser.HPE_INVALID_METHOD or * self._cparser.method == 0)): */ if (__pyx_t_2) { /* "aiohttp/_http_parser.pyx":515 * (self._cparser.http_errno != cparser.HPE_INVALID_METHOD or * self._cparser.method == 0)): * if self._payload_error == 0: # <<<<<<<<<<<<<< * if self._last_error is not None: * ex = self._last_error */ __pyx_t_2 = ((__pyx_v_self->_payload_error == 0) != 0); if (__pyx_t_2) { /* "aiohttp/_http_parser.pyx":516 * self._cparser.method == 0)): * if self._payload_error == 0: * if self._last_error is not None: # <<<<<<<<<<<<<< * ex = self._last_error * self._last_error = None */ __pyx_t_2 = (__pyx_v_self->_last_error != Py_None); __pyx_t_3 = (__pyx_t_2 != 0); if (__pyx_t_3) { /* "aiohttp/_http_parser.pyx":517 * if self._payload_error == 0: * if self._last_error is not None: * ex = self._last_error # <<<<<<<<<<<<<< * self._last_error = None * else: */ __pyx_t_4 = __pyx_v_self->_last_error; __Pyx_INCREF(__pyx_t_4); __pyx_v_ex = __pyx_t_4; __pyx_t_4 = 0; /* "aiohttp/_http_parser.pyx":518 * if self._last_error is not None: * ex = self._last_error * self._last_error = None # <<<<<<<<<<<<<< * else: * ex = parser_error_from_errno( */ __Pyx_INCREF(Py_None); __Pyx_GIVEREF(Py_None); __Pyx_GOTREF(__pyx_v_self->_last_error); __Pyx_DECREF(__pyx_v_self->_last_error); __pyx_v_self->_last_error = Py_None; /* "aiohttp/_http_parser.pyx":516 * self._cparser.method == 0)): * if self._payload_error == 0: * if self._last_error is not None: # <<<<<<<<<<<<<< * ex = self._last_error * self._last_error = None */ goto __pyx_L8; } /* "aiohttp/_http_parser.pyx":520 * self._last_error = None * else: * ex = parser_error_from_errno( # <<<<<<<<<<<<<< * self._cparser.http_errno) * self._payload = None */ /*else*/ { /* "aiohttp/_http_parser.pyx":521 * else: * ex = parser_error_from_errno( * self._cparser.http_errno) # <<<<<<<<<<<<<< * self._payload = None * raise ex */ __pyx_t_4 = __pyx_f_7aiohttp_12_http_parser_parser_error_from_errno(((enum http_errno)__pyx_v_self->_cparser->http_errno)); if (unlikely(!__pyx_t_4)) __PYX_ERR(0, 520, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_4); __pyx_v_ex = __pyx_t_4; __pyx_t_4 = 0; } __pyx_L8:; /* "aiohttp/_http_parser.pyx":522 * ex = parser_error_from_errno( * self._cparser.http_errno) * self._payload = None # <<<<<<<<<<<<<< * raise ex * */ __Pyx_INCREF(Py_None); __Pyx_GIVEREF(Py_None); 
__Pyx_GOTREF(__pyx_v_self->_payload); __Pyx_DECREF(__pyx_v_self->_payload); __pyx_v_self->_payload = Py_None; /* "aiohttp/_http_parser.pyx":523 * self._cparser.http_errno) * self._payload = None * raise ex # <<<<<<<<<<<<<< * * if self._messages: */ __Pyx_Raise(__pyx_v_ex, 0, 0, 0); __PYX_ERR(0, 523, __pyx_L1_error) /* "aiohttp/_http_parser.pyx":515 * (self._cparser.http_errno != cparser.HPE_INVALID_METHOD or * self._cparser.method == 0)): * if self._payload_error == 0: # <<<<<<<<<<<<<< * if self._last_error is not None: * ex = self._last_error */ } /* "aiohttp/_http_parser.pyx":512 * # seems get err for valid request * # test_client_functional.py::test_post_data_with_bytesio_file * if (self._cparser.http_errno != cparser.HPE_OK and # <<<<<<<<<<<<<< * (self._cparser.http_errno != cparser.HPE_INVALID_METHOD or * self._cparser.method == 0)): */ } /* "aiohttp/_http_parser.pyx":525 * raise ex * * if self._messages: # <<<<<<<<<<<<<< * messages = self._messages * self._messages = [] */ __pyx_t_3 = (__pyx_v_self->_messages != Py_None)&&(PyList_GET_SIZE(__pyx_v_self->_messages) != 0); if (__pyx_t_3) { /* "aiohttp/_http_parser.pyx":526 * * if self._messages: * messages = self._messages # <<<<<<<<<<<<<< * self._messages = [] * else: */ __pyx_t_4 = __pyx_v_self->_messages; __Pyx_INCREF(__pyx_t_4); __pyx_v_messages = __pyx_t_4; __pyx_t_4 = 0; /* "aiohttp/_http_parser.pyx":527 * if self._messages: * messages = self._messages * self._messages = [] # <<<<<<<<<<<<<< * else: * messages = () */ __pyx_t_4 = PyList_New(0); if (unlikely(!__pyx_t_4)) __PYX_ERR(0, 527, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_4); __Pyx_GIVEREF(__pyx_t_4); __Pyx_GOTREF(__pyx_v_self->_messages); __Pyx_DECREF(__pyx_v_self->_messages); __pyx_v_self->_messages = ((PyObject*)__pyx_t_4); __pyx_t_4 = 0; /* "aiohttp/_http_parser.pyx":525 * raise ex * * if self._messages: # <<<<<<<<<<<<<< * messages = self._messages * self._messages = [] */ goto __pyx_L9; } /* "aiohttp/_http_parser.pyx":529 * self._messages = [] * else: * messages = () # <<<<<<<<<<<<<< * * if self._upgraded: */ /*else*/ { __Pyx_INCREF(__pyx_empty_tuple); __pyx_v_messages = __pyx_empty_tuple; } __pyx_L9:; /* "aiohttp/_http_parser.pyx":531 * messages = () * * if self._upgraded: # <<<<<<<<<<<<<< * return messages, True, data[nb:] * else: */ __pyx_t_3 = (__pyx_v_self->_upgraded != 0); if (__pyx_t_3) { /* "aiohttp/_http_parser.pyx":532 * * if self._upgraded: * return messages, True, data[nb:] # <<<<<<<<<<<<<< * else: * return messages, False, b'' */ __Pyx_XDECREF(__pyx_r); __pyx_t_4 = __Pyx_PyObject_GetSlice(__pyx_v_data, __pyx_v_nb, 0, NULL, NULL, NULL, 1, 0, 1); if (unlikely(!__pyx_t_4)) __PYX_ERR(0, 532, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_4); __pyx_t_5 = PyTuple_New(3); if (unlikely(!__pyx_t_5)) __PYX_ERR(0, 532, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_5); __Pyx_INCREF(__pyx_v_messages); __Pyx_GIVEREF(__pyx_v_messages); PyTuple_SET_ITEM(__pyx_t_5, 0, __pyx_v_messages); __Pyx_INCREF(Py_True); __Pyx_GIVEREF(Py_True); PyTuple_SET_ITEM(__pyx_t_5, 1, Py_True); __Pyx_GIVEREF(__pyx_t_4); PyTuple_SET_ITEM(__pyx_t_5, 2, __pyx_t_4); __pyx_t_4 = 0; __pyx_r = __pyx_t_5; __pyx_t_5 = 0; goto __pyx_L0; /* "aiohttp/_http_parser.pyx":531 * messages = () * * if self._upgraded: # <<<<<<<<<<<<<< * return messages, True, data[nb:] * else: */ } /* "aiohttp/_http_parser.pyx":534 * return messages, True, data[nb:] * else: * return messages, False, b'' # <<<<<<<<<<<<<< * * */ /*else*/ { __Pyx_XDECREF(__pyx_r); __pyx_t_5 = PyTuple_New(3); if (unlikely(!__pyx_t_5)) __PYX_ERR(0, 534, __pyx_L1_error) 
__Pyx_GOTREF(__pyx_t_5); __Pyx_INCREF(__pyx_v_messages); __Pyx_GIVEREF(__pyx_v_messages); PyTuple_SET_ITEM(__pyx_t_5, 0, __pyx_v_messages); __Pyx_INCREF(Py_False); __Pyx_GIVEREF(Py_False); PyTuple_SET_ITEM(__pyx_t_5, 1, Py_False); __Pyx_INCREF(__pyx_kp_b__4); __Pyx_GIVEREF(__pyx_kp_b__4); PyTuple_SET_ITEM(__pyx_t_5, 2, __pyx_kp_b__4); __pyx_r = __pyx_t_5; __pyx_t_5 = 0; goto __pyx_L0; } /* "aiohttp/_http_parser.pyx":493 * return self._messages[-1][0] * * def feed_data(self, data): # <<<<<<<<<<<<<< * cdef: * size_t data_len */ /* function exit code */ __pyx_L1_error:; __Pyx_XDECREF(__pyx_t_4); __Pyx_XDECREF(__pyx_t_5); __Pyx_AddTraceback("aiohttp._http_parser.HttpParser.feed_data", __pyx_clineno, __pyx_lineno, __pyx_filename); __pyx_r = NULL; __pyx_L0:; __Pyx_XDECREF(__pyx_v_ex); __Pyx_XDECREF(__pyx_v_messages); __Pyx_XGIVEREF(__pyx_r); __Pyx_RefNannyFinishContext(); return __pyx_r; } /* "(tree fragment)":1 * def __reduce_cython__(self): # <<<<<<<<<<<<<< * raise TypeError("no default __reduce__ due to non-trivial __cinit__") * def __setstate_cython__(self, __pyx_state): */ /* Python wrapper */ static PyObject *__pyx_pw_7aiohttp_12_http_parser_10HttpParser_9__reduce_cython__(PyObject *__pyx_v_self, CYTHON_UNUSED PyObject *unused); /*proto*/ static PyObject *__pyx_pw_7aiohttp_12_http_parser_10HttpParser_9__reduce_cython__(PyObject *__pyx_v_self, CYTHON_UNUSED PyObject *unused) { PyObject *__pyx_r = 0; __Pyx_RefNannyDeclarations __Pyx_RefNannySetupContext("__reduce_cython__ (wrapper)", 0); __pyx_r = __pyx_pf_7aiohttp_12_http_parser_10HttpParser_8__reduce_cython__(((struct __pyx_obj_7aiohttp_12_http_parser_HttpParser *)__pyx_v_self)); /* function exit code */ __Pyx_RefNannyFinishContext(); return __pyx_r; } static PyObject *__pyx_pf_7aiohttp_12_http_parser_10HttpParser_8__reduce_cython__(CYTHON_UNUSED struct __pyx_obj_7aiohttp_12_http_parser_HttpParser *__pyx_v_self) { PyObject *__pyx_r = NULL; __Pyx_RefNannyDeclarations PyObject *__pyx_t_1 = NULL; __Pyx_RefNannySetupContext("__reduce_cython__", 0); /* "(tree fragment)":2 * def __reduce_cython__(self): * raise TypeError("no default __reduce__ due to non-trivial __cinit__") # <<<<<<<<<<<<<< * def __setstate_cython__(self, __pyx_state): * raise TypeError("no default __reduce__ due to non-trivial __cinit__") */ __pyx_t_1 = __Pyx_PyObject_Call(__pyx_builtin_TypeError, __pyx_tuple__5, NULL); if (unlikely(!__pyx_t_1)) __PYX_ERR(1, 2, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_1); __Pyx_Raise(__pyx_t_1, 0, 0, 0); __Pyx_DECREF(__pyx_t_1); __pyx_t_1 = 0; __PYX_ERR(1, 2, __pyx_L1_error) /* "(tree fragment)":1 * def __reduce_cython__(self): # <<<<<<<<<<<<<< * raise TypeError("no default __reduce__ due to non-trivial __cinit__") * def __setstate_cython__(self, __pyx_state): */ /* function exit code */ __pyx_L1_error:; __Pyx_XDECREF(__pyx_t_1); __Pyx_AddTraceback("aiohttp._http_parser.HttpParser.__reduce_cython__", __pyx_clineno, __pyx_lineno, __pyx_filename); __pyx_r = NULL; __Pyx_XGIVEREF(__pyx_r); __Pyx_RefNannyFinishContext(); return __pyx_r; } /* "(tree fragment)":3 * def __reduce_cython__(self): * raise TypeError("no default __reduce__ due to non-trivial __cinit__") * def __setstate_cython__(self, __pyx_state): # <<<<<<<<<<<<<< * raise TypeError("no default __reduce__ due to non-trivial __cinit__") */ /* Python wrapper */ static PyObject *__pyx_pw_7aiohttp_12_http_parser_10HttpParser_11__setstate_cython__(PyObject *__pyx_v_self, PyObject *__pyx_v___pyx_state); /*proto*/ static PyObject 
*__pyx_pw_7aiohttp_12_http_parser_10HttpParser_11__setstate_cython__(PyObject *__pyx_v_self, PyObject *__pyx_v___pyx_state) { PyObject *__pyx_r = 0; __Pyx_RefNannyDeclarations __Pyx_RefNannySetupContext("__setstate_cython__ (wrapper)", 0); __pyx_r = __pyx_pf_7aiohttp_12_http_parser_10HttpParser_10__setstate_cython__(((struct __pyx_obj_7aiohttp_12_http_parser_HttpParser *)__pyx_v_self), ((PyObject *)__pyx_v___pyx_state)); /* function exit code */ __Pyx_RefNannyFinishContext(); return __pyx_r; } static PyObject *__pyx_pf_7aiohttp_12_http_parser_10HttpParser_10__setstate_cython__(CYTHON_UNUSED struct __pyx_obj_7aiohttp_12_http_parser_HttpParser *__pyx_v_self, CYTHON_UNUSED PyObject *__pyx_v___pyx_state) { PyObject *__pyx_r = NULL; __Pyx_RefNannyDeclarations PyObject *__pyx_t_1 = NULL; __Pyx_RefNannySetupContext("__setstate_cython__", 0); /* "(tree fragment)":4 * raise TypeError("no default __reduce__ due to non-trivial __cinit__") * def __setstate_cython__(self, __pyx_state): * raise TypeError("no default __reduce__ due to non-trivial __cinit__") # <<<<<<<<<<<<<< */ __pyx_t_1 = __Pyx_PyObject_Call(__pyx_builtin_TypeError, __pyx_tuple__6, NULL); if (unlikely(!__pyx_t_1)) __PYX_ERR(1, 4, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_1); __Pyx_Raise(__pyx_t_1, 0, 0, 0); __Pyx_DECREF(__pyx_t_1); __pyx_t_1 = 0; __PYX_ERR(1, 4, __pyx_L1_error) /* "(tree fragment)":3 * def __reduce_cython__(self): * raise TypeError("no default __reduce__ due to non-trivial __cinit__") * def __setstate_cython__(self, __pyx_state): # <<<<<<<<<<<<<< * raise TypeError("no default __reduce__ due to non-trivial __cinit__") */ /* function exit code */ __pyx_L1_error:; __Pyx_XDECREF(__pyx_t_1); __Pyx_AddTraceback("aiohttp._http_parser.HttpParser.__setstate_cython__", __pyx_clineno, __pyx_lineno, __pyx_filename); __pyx_r = NULL; __Pyx_XGIVEREF(__pyx_r); __Pyx_RefNannyFinishContext(); return __pyx_r; } /* "aiohttp/_http_parser.pyx":539 * cdef class HttpRequestParser(HttpParser): * * def __init__(self, protocol, loop, timer=None, # <<<<<<<<<<<<<< * size_t max_line_size=8190, size_t max_headers=32768, * size_t max_field_size=8190, payload_exception=None, */ /* Python wrapper */ static int __pyx_pw_7aiohttp_12_http_parser_17HttpRequestParser_1__init__(PyObject *__pyx_v_self, PyObject *__pyx_args, PyObject *__pyx_kwds); /*proto*/ static int __pyx_pw_7aiohttp_12_http_parser_17HttpRequestParser_1__init__(PyObject *__pyx_v_self, PyObject *__pyx_args, PyObject *__pyx_kwds) { PyObject *__pyx_v_protocol = 0; PyObject *__pyx_v_loop = 0; PyObject *__pyx_v_timer = 0; size_t __pyx_v_max_line_size; size_t __pyx_v_max_headers; size_t __pyx_v_max_field_size; PyObject *__pyx_v_payload_exception = 0; int __pyx_v_response_with_body; CYTHON_UNUSED int __pyx_v_read_until_eof; int __pyx_r; __Pyx_RefNannyDeclarations __Pyx_RefNannySetupContext("__init__ (wrapper)", 0); { static PyObject **__pyx_pyargnames[] = {&__pyx_n_s_protocol,&__pyx_n_s_loop,&__pyx_n_s_timer,&__pyx_n_s_max_line_size,&__pyx_n_s_max_headers,&__pyx_n_s_max_field_size,&__pyx_n_s_payload_exception,&__pyx_n_s_response_with_body,&__pyx_n_s_read_until_eof,0}; PyObject* values[9] = {0,0,0,0,0,0,0,0,0}; values[2] = ((PyObject *)Py_None); /* "aiohttp/_http_parser.pyx":541 * def __init__(self, protocol, loop, timer=None, * size_t max_line_size=8190, size_t max_headers=32768, * size_t max_field_size=8190, payload_exception=None, # <<<<<<<<<<<<<< * bint response_with_body=True, bint read_until_eof=False): * self._init(cparser.HTTP_REQUEST, protocol, loop, timer, */ values[6] = ((PyObject *)Py_None); 
if (unlikely(__pyx_kwds)) { Py_ssize_t kw_args; const Py_ssize_t pos_args = PyTuple_GET_SIZE(__pyx_args); switch (pos_args) { case 9: values[8] = PyTuple_GET_ITEM(__pyx_args, 8); CYTHON_FALLTHROUGH; case 8: values[7] = PyTuple_GET_ITEM(__pyx_args, 7); CYTHON_FALLTHROUGH; case 7: values[6] = PyTuple_GET_ITEM(__pyx_args, 6); CYTHON_FALLTHROUGH; case 6: values[5] = PyTuple_GET_ITEM(__pyx_args, 5); CYTHON_FALLTHROUGH; case 5: values[4] = PyTuple_GET_ITEM(__pyx_args, 4); CYTHON_FALLTHROUGH; case 4: values[3] = PyTuple_GET_ITEM(__pyx_args, 3); CYTHON_FALLTHROUGH; case 3: values[2] = PyTuple_GET_ITEM(__pyx_args, 2); CYTHON_FALLTHROUGH; case 2: values[1] = PyTuple_GET_ITEM(__pyx_args, 1); CYTHON_FALLTHROUGH; case 1: values[0] = PyTuple_GET_ITEM(__pyx_args, 0); CYTHON_FALLTHROUGH; case 0: break; default: goto __pyx_L5_argtuple_error; } kw_args = PyDict_Size(__pyx_kwds); switch (pos_args) { case 0: if (likely((values[0] = __Pyx_PyDict_GetItemStr(__pyx_kwds, __pyx_n_s_protocol)) != 0)) kw_args--; else goto __pyx_L5_argtuple_error; CYTHON_FALLTHROUGH; case 1: if (likely((values[1] = __Pyx_PyDict_GetItemStr(__pyx_kwds, __pyx_n_s_loop)) != 0)) kw_args--; else { __Pyx_RaiseArgtupleInvalid("__init__", 0, 2, 9, 1); __PYX_ERR(0, 539, __pyx_L3_error) } CYTHON_FALLTHROUGH; case 2: if (kw_args > 0) { PyObject* value = __Pyx_PyDict_GetItemStr(__pyx_kwds, __pyx_n_s_timer); if (value) { values[2] = value; kw_args--; } } CYTHON_FALLTHROUGH; case 3: if (kw_args > 0) { PyObject* value = __Pyx_PyDict_GetItemStr(__pyx_kwds, __pyx_n_s_max_line_size); if (value) { values[3] = value; kw_args--; } } CYTHON_FALLTHROUGH; case 4: if (kw_args > 0) { PyObject* value = __Pyx_PyDict_GetItemStr(__pyx_kwds, __pyx_n_s_max_headers); if (value) { values[4] = value; kw_args--; } } CYTHON_FALLTHROUGH; case 5: if (kw_args > 0) { PyObject* value = __Pyx_PyDict_GetItemStr(__pyx_kwds, __pyx_n_s_max_field_size); if (value) { values[5] = value; kw_args--; } } CYTHON_FALLTHROUGH; case 6: if (kw_args > 0) { PyObject* value = __Pyx_PyDict_GetItemStr(__pyx_kwds, __pyx_n_s_payload_exception); if (value) { values[6] = value; kw_args--; } } CYTHON_FALLTHROUGH; case 7: if (kw_args > 0) { PyObject* value = __Pyx_PyDict_GetItemStr(__pyx_kwds, __pyx_n_s_response_with_body); if (value) { values[7] = value; kw_args--; } } CYTHON_FALLTHROUGH; case 8: if (kw_args > 0) { PyObject* value = __Pyx_PyDict_GetItemStr(__pyx_kwds, __pyx_n_s_read_until_eof); if (value) { values[8] = value; kw_args--; } } } if (unlikely(kw_args > 0)) { if (unlikely(__Pyx_ParseOptionalKeywords(__pyx_kwds, __pyx_pyargnames, 0, values, pos_args, "__init__") < 0)) __PYX_ERR(0, 539, __pyx_L3_error) } } else { switch (PyTuple_GET_SIZE(__pyx_args)) { case 9: values[8] = PyTuple_GET_ITEM(__pyx_args, 8); CYTHON_FALLTHROUGH; case 8: values[7] = PyTuple_GET_ITEM(__pyx_args, 7); CYTHON_FALLTHROUGH; case 7: values[6] = PyTuple_GET_ITEM(__pyx_args, 6); CYTHON_FALLTHROUGH; case 6: values[5] = PyTuple_GET_ITEM(__pyx_args, 5); CYTHON_FALLTHROUGH; case 5: values[4] = PyTuple_GET_ITEM(__pyx_args, 4); CYTHON_FALLTHROUGH; case 4: values[3] = PyTuple_GET_ITEM(__pyx_args, 3); CYTHON_FALLTHROUGH; case 3: values[2] = PyTuple_GET_ITEM(__pyx_args, 2); CYTHON_FALLTHROUGH; case 2: values[1] = PyTuple_GET_ITEM(__pyx_args, 1); values[0] = PyTuple_GET_ITEM(__pyx_args, 0); break; default: goto __pyx_L5_argtuple_error; } } __pyx_v_protocol = values[0]; __pyx_v_loop = values[1]; __pyx_v_timer = values[2]; if (values[3]) { __pyx_v_max_line_size = __Pyx_PyInt_As_size_t(values[3]); if (unlikely((__pyx_v_max_line_size == 
(size_t)-1) && PyErr_Occurred())) __PYX_ERR(0, 540, __pyx_L3_error) } else { __pyx_v_max_line_size = ((size_t)0x1FFE); } if (values[4]) { __pyx_v_max_headers = __Pyx_PyInt_As_size_t(values[4]); if (unlikely((__pyx_v_max_headers == (size_t)-1) && PyErr_Occurred())) __PYX_ERR(0, 540, __pyx_L3_error) } else { __pyx_v_max_headers = ((size_t)0x8000); } if (values[5]) { __pyx_v_max_field_size = __Pyx_PyInt_As_size_t(values[5]); if (unlikely((__pyx_v_max_field_size == (size_t)-1) && PyErr_Occurred())) __PYX_ERR(0, 541, __pyx_L3_error) } else { __pyx_v_max_field_size = ((size_t)0x1FFE); } __pyx_v_payload_exception = values[6]; if (values[7]) { __pyx_v_response_with_body = __Pyx_PyObject_IsTrue(values[7]); if (unlikely((__pyx_v_response_with_body == (int)-1) && PyErr_Occurred())) __PYX_ERR(0, 542, __pyx_L3_error) } else { /* "aiohttp/_http_parser.pyx":542 * size_t max_line_size=8190, size_t max_headers=32768, * size_t max_field_size=8190, payload_exception=None, * bint response_with_body=True, bint read_until_eof=False): # <<<<<<<<<<<<<< * self._init(cparser.HTTP_REQUEST, protocol, loop, timer, * max_line_size, max_headers, max_field_size, */ __pyx_v_response_with_body = ((int)1); } if (values[8]) { __pyx_v_read_until_eof = __Pyx_PyObject_IsTrue(values[8]); if (unlikely((__pyx_v_read_until_eof == (int)-1) && PyErr_Occurred())) __PYX_ERR(0, 542, __pyx_L3_error) } else { __pyx_v_read_until_eof = ((int)0); } } goto __pyx_L4_argument_unpacking_done; __pyx_L5_argtuple_error:; __Pyx_RaiseArgtupleInvalid("__init__", 0, 2, 9, PyTuple_GET_SIZE(__pyx_args)); __PYX_ERR(0, 539, __pyx_L3_error) __pyx_L3_error:; __Pyx_AddTraceback("aiohttp._http_parser.HttpRequestParser.__init__", __pyx_clineno, __pyx_lineno, __pyx_filename); __Pyx_RefNannyFinishContext(); return -1; __pyx_L4_argument_unpacking_done:; __pyx_r = __pyx_pf_7aiohttp_12_http_parser_17HttpRequestParser___init__(((struct __pyx_obj_7aiohttp_12_http_parser_HttpRequestParser *)__pyx_v_self), __pyx_v_protocol, __pyx_v_loop, __pyx_v_timer, __pyx_v_max_line_size, __pyx_v_max_headers, __pyx_v_max_field_size, __pyx_v_payload_exception, __pyx_v_response_with_body, __pyx_v_read_until_eof); /* "aiohttp/_http_parser.pyx":539 * cdef class HttpRequestParser(HttpParser): * * def __init__(self, protocol, loop, timer=None, # <<<<<<<<<<<<<< * size_t max_line_size=8190, size_t max_headers=32768, * size_t max_field_size=8190, payload_exception=None, */ /* function exit code */ __Pyx_RefNannyFinishContext(); return __pyx_r; } static int __pyx_pf_7aiohttp_12_http_parser_17HttpRequestParser___init__(struct __pyx_obj_7aiohttp_12_http_parser_HttpRequestParser *__pyx_v_self, PyObject *__pyx_v_protocol, PyObject *__pyx_v_loop, PyObject *__pyx_v_timer, size_t __pyx_v_max_line_size, size_t __pyx_v_max_headers, size_t __pyx_v_max_field_size, PyObject *__pyx_v_payload_exception, int __pyx_v_response_with_body, CYTHON_UNUSED int __pyx_v_read_until_eof) { int __pyx_r; __Pyx_RefNannyDeclarations PyObject *__pyx_t_1 = NULL; struct __pyx_opt_args_7aiohttp_12_http_parser_10HttpParser__init __pyx_t_2; __Pyx_RefNannySetupContext("__init__", 0); /* "aiohttp/_http_parser.pyx":543 * size_t max_field_size=8190, payload_exception=None, * bint response_with_body=True, bint read_until_eof=False): * self._init(cparser.HTTP_REQUEST, protocol, loop, timer, # <<<<<<<<<<<<<< * max_line_size, max_headers, max_field_size, * payload_exception, response_with_body) */ __pyx_t_2.__pyx_n = 6; __pyx_t_2.timer = __pyx_v_timer; __pyx_t_2.max_line_size = __pyx_v_max_line_size; __pyx_t_2.max_headers = 
__pyx_v_max_headers; __pyx_t_2.max_field_size = __pyx_v_max_field_size; __pyx_t_2.payload_exception = __pyx_v_payload_exception; __pyx_t_2.response_with_body = __pyx_v_response_with_body; __pyx_t_1 = ((struct __pyx_vtabstruct_7aiohttp_12_http_parser_HttpRequestParser *)__pyx_v_self->__pyx_base.__pyx_vtab)->__pyx_base._init(((struct __pyx_obj_7aiohttp_12_http_parser_HttpParser *)__pyx_v_self), HTTP_REQUEST, __pyx_v_protocol, __pyx_v_loop, &__pyx_t_2); if (unlikely(!__pyx_t_1)) __PYX_ERR(0, 543, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_1); __Pyx_DECREF(__pyx_t_1); __pyx_t_1 = 0; /* "aiohttp/_http_parser.pyx":539 * cdef class HttpRequestParser(HttpParser): * * def __init__(self, protocol, loop, timer=None, # <<<<<<<<<<<<<< * size_t max_line_size=8190, size_t max_headers=32768, * size_t max_field_size=8190, payload_exception=None, */ /* function exit code */ __pyx_r = 0; goto __pyx_L0; __pyx_L1_error:; __Pyx_XDECREF(__pyx_t_1); __Pyx_AddTraceback("aiohttp._http_parser.HttpRequestParser.__init__", __pyx_clineno, __pyx_lineno, __pyx_filename); __pyx_r = -1; __pyx_L0:; __Pyx_RefNannyFinishContext(); return __pyx_r; } /* "aiohttp/_http_parser.pyx":547 * payload_exception, response_with_body) * * cdef object _on_status_complete(self): # <<<<<<<<<<<<<< * cdef Py_buffer py_buf * if not self._buf: */ static PyObject *__pyx_f_7aiohttp_12_http_parser_17HttpRequestParser__on_status_complete(struct __pyx_obj_7aiohttp_12_http_parser_HttpRequestParser *__pyx_v_self) { Py_buffer __pyx_v_py_buf; PyObject *__pyx_r = NULL; __Pyx_RefNannyDeclarations int __pyx_t_1; int __pyx_t_2; PyObject *__pyx_t_3 = NULL; PyObject *__pyx_t_4 = NULL; PyObject *__pyx_t_5 = NULL; int __pyx_t_6; int __pyx_t_7; char const *__pyx_t_8; PyObject *__pyx_t_9 = NULL; PyObject *__pyx_t_10 = NULL; PyObject *__pyx_t_11 = NULL; PyObject *__pyx_t_12 = NULL; PyObject *__pyx_t_13 = NULL; PyObject *__pyx_t_14 = NULL; __Pyx_RefNannySetupContext("_on_status_complete", 0); /* "aiohttp/_http_parser.pyx":549 * cdef object _on_status_complete(self): * cdef Py_buffer py_buf * if not self._buf: # <<<<<<<<<<<<<< * return * self._path = self._buf.decode('utf-8', 'surrogateescape') */ __pyx_t_1 = (__pyx_v_self->__pyx_base._buf != Py_None)&&(PyByteArray_GET_SIZE(__pyx_v_self->__pyx_base._buf) != 0); __pyx_t_2 = ((!__pyx_t_1) != 0); if (__pyx_t_2) { /* "aiohttp/_http_parser.pyx":550 * cdef Py_buffer py_buf * if not self._buf: * return # <<<<<<<<<<<<<< * self._path = self._buf.decode('utf-8', 'surrogateescape') * if self._cparser.method == 5: # CONNECT */ __Pyx_XDECREF(__pyx_r); __pyx_r = Py_None; __Pyx_INCREF(Py_None); goto __pyx_L0; /* "aiohttp/_http_parser.pyx":549 * cdef object _on_status_complete(self): * cdef Py_buffer py_buf * if not self._buf: # <<<<<<<<<<<<<< * return * self._path = self._buf.decode('utf-8', 'surrogateescape') */ } /* "aiohttp/_http_parser.pyx":551 * if not self._buf: * return * self._path = self._buf.decode('utf-8', 'surrogateescape') # <<<<<<<<<<<<<< * if self._cparser.method == 5: # CONNECT * self._url = URL(self._path) */ if (unlikely(__pyx_v_self->__pyx_base._buf == Py_None)) { PyErr_Format(PyExc_AttributeError, "'NoneType' object has no attribute '%.30s'", "decode"); __PYX_ERR(0, 551, __pyx_L1_error) } __pyx_t_3 = __Pyx_decode_bytearray(__pyx_v_self->__pyx_base._buf, 0, PY_SSIZE_T_MAX, NULL, ((char const *)"surrogateescape"), PyUnicode_DecodeUTF8); if (unlikely(!__pyx_t_3)) __PYX_ERR(0, 551, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_3); __Pyx_GIVEREF(__pyx_t_3); __Pyx_GOTREF(__pyx_v_self->__pyx_base._path); 
__Pyx_DECREF(__pyx_v_self->__pyx_base._path); __pyx_v_self->__pyx_base._path = ((PyObject*)__pyx_t_3); __pyx_t_3 = 0; /* "aiohttp/_http_parser.pyx":552 * return * self._path = self._buf.decode('utf-8', 'surrogateescape') * if self._cparser.method == 5: # CONNECT # <<<<<<<<<<<<<< * self._url = URL(self._path) * else: */ __pyx_t_2 = ((__pyx_v_self->__pyx_base._cparser->method == 5) != 0); if (__pyx_t_2) { /* "aiohttp/_http_parser.pyx":553 * self._path = self._buf.decode('utf-8', 'surrogateescape') * if self._cparser.method == 5: # CONNECT * self._url = URL(self._path) # <<<<<<<<<<<<<< * else: * PyObject_GetBuffer(self._buf, &py_buf, PyBUF_SIMPLE) */ __Pyx_INCREF(__pyx_v_7aiohttp_12_http_parser_URL); __pyx_t_4 = __pyx_v_7aiohttp_12_http_parser_URL; __pyx_t_5 = NULL; if (CYTHON_UNPACK_METHODS && unlikely(PyMethod_Check(__pyx_t_4))) { __pyx_t_5 = PyMethod_GET_SELF(__pyx_t_4); if (likely(__pyx_t_5)) { PyObject* function = PyMethod_GET_FUNCTION(__pyx_t_4); __Pyx_INCREF(__pyx_t_5); __Pyx_INCREF(function); __Pyx_DECREF_SET(__pyx_t_4, function); } } __pyx_t_3 = (__pyx_t_5) ? __Pyx_PyObject_Call2Args(__pyx_t_4, __pyx_t_5, __pyx_v_self->__pyx_base._path) : __Pyx_PyObject_CallOneArg(__pyx_t_4, __pyx_v_self->__pyx_base._path); __Pyx_XDECREF(__pyx_t_5); __pyx_t_5 = 0; if (unlikely(!__pyx_t_3)) __PYX_ERR(0, 553, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_3); __Pyx_DECREF(__pyx_t_4); __pyx_t_4 = 0; __Pyx_GIVEREF(__pyx_t_3); __Pyx_GOTREF(__pyx_v_self->__pyx_base._url); __Pyx_DECREF(__pyx_v_self->__pyx_base._url); __pyx_v_self->__pyx_base._url = __pyx_t_3; __pyx_t_3 = 0; /* "aiohttp/_http_parser.pyx":552 * return * self._path = self._buf.decode('utf-8', 'surrogateescape') * if self._cparser.method == 5: # CONNECT # <<<<<<<<<<<<<< * self._url = URL(self._path) * else: */ goto __pyx_L4; } /* "aiohttp/_http_parser.pyx":555 * self._url = URL(self._path) * else: * PyObject_GetBuffer(self._buf, &py_buf, PyBUF_SIMPLE) # <<<<<<<<<<<<<< * try: * self._url = _parse_url(py_buf.buf, */ /*else*/ { __pyx_t_3 = __pyx_v_self->__pyx_base._buf; __Pyx_INCREF(__pyx_t_3); __pyx_t_6 = PyObject_GetBuffer(__pyx_t_3, (&__pyx_v_py_buf), PyBUF_SIMPLE); if (unlikely(__pyx_t_6 == ((int)-1))) __PYX_ERR(0, 555, __pyx_L1_error) __Pyx_DECREF(__pyx_t_3); __pyx_t_3 = 0; /* "aiohttp/_http_parser.pyx":556 * else: * PyObject_GetBuffer(self._buf, &py_buf, PyBUF_SIMPLE) * try: # <<<<<<<<<<<<<< * self._url = _parse_url(py_buf.buf, * py_buf.len) */ /*try:*/ { /* "aiohttp/_http_parser.pyx":557 * PyObject_GetBuffer(self._buf, &py_buf, PyBUF_SIMPLE) * try: * self._url = _parse_url(py_buf.buf, # <<<<<<<<<<<<<< * py_buf.len) * finally: */ __pyx_t_3 = __pyx_f_7aiohttp_12_http_parser__parse_url(((char *)__pyx_v_py_buf.buf), __pyx_v_py_buf.len); if (unlikely(!__pyx_t_3)) __PYX_ERR(0, 557, __pyx_L6_error) __Pyx_GOTREF(__pyx_t_3); __Pyx_GIVEREF(__pyx_t_3); __Pyx_GOTREF(__pyx_v_self->__pyx_base._url); __Pyx_DECREF(__pyx_v_self->__pyx_base._url); __pyx_v_self->__pyx_base._url = __pyx_t_3; __pyx_t_3 = 0; } /* "aiohttp/_http_parser.pyx":560 * py_buf.len) * finally: * PyBuffer_Release(&py_buf) # <<<<<<<<<<<<<< * PyByteArray_Resize(self._buf, 0) * */ /*finally:*/ { /*normal exit:*/{ PyBuffer_Release((&__pyx_v_py_buf)); goto __pyx_L7; } __pyx_L6_error:; /*exception exit:*/{ __Pyx_PyThreadState_declare __Pyx_PyThreadState_assign __pyx_t_9 = 0; __pyx_t_10 = 0; __pyx_t_11 = 0; __pyx_t_12 = 0; __pyx_t_13 = 0; __pyx_t_14 = 0; __Pyx_XDECREF(__pyx_t_3); __pyx_t_3 = 0; __Pyx_XDECREF(__pyx_t_4); __pyx_t_4 = 0; __Pyx_XDECREF(__pyx_t_5); __pyx_t_5 = 0; if (PY_MAJOR_VERSION >= 3) 
__Pyx_ExceptionSwap(&__pyx_t_12, &__pyx_t_13, &__pyx_t_14); if ((PY_MAJOR_VERSION < 3) || unlikely(__Pyx_GetException(&__pyx_t_9, &__pyx_t_10, &__pyx_t_11) < 0)) __Pyx_ErrFetch(&__pyx_t_9, &__pyx_t_10, &__pyx_t_11); __Pyx_XGOTREF(__pyx_t_9); __Pyx_XGOTREF(__pyx_t_10); __Pyx_XGOTREF(__pyx_t_11); __Pyx_XGOTREF(__pyx_t_12); __Pyx_XGOTREF(__pyx_t_13); __Pyx_XGOTREF(__pyx_t_14); __pyx_t_6 = __pyx_lineno; __pyx_t_7 = __pyx_clineno; __pyx_t_8 = __pyx_filename; { PyBuffer_Release((&__pyx_v_py_buf)); } if (PY_MAJOR_VERSION >= 3) { __Pyx_XGIVEREF(__pyx_t_12); __Pyx_XGIVEREF(__pyx_t_13); __Pyx_XGIVEREF(__pyx_t_14); __Pyx_ExceptionReset(__pyx_t_12, __pyx_t_13, __pyx_t_14); } __Pyx_XGIVEREF(__pyx_t_9); __Pyx_XGIVEREF(__pyx_t_10); __Pyx_XGIVEREF(__pyx_t_11); __Pyx_ErrRestore(__pyx_t_9, __pyx_t_10, __pyx_t_11); __pyx_t_9 = 0; __pyx_t_10 = 0; __pyx_t_11 = 0; __pyx_t_12 = 0; __pyx_t_13 = 0; __pyx_t_14 = 0; __pyx_lineno = __pyx_t_6; __pyx_clineno = __pyx_t_7; __pyx_filename = __pyx_t_8; goto __pyx_L1_error; } __pyx_L7:; } } __pyx_L4:; /* "aiohttp/_http_parser.pyx":561 * finally: * PyBuffer_Release(&py_buf) * PyByteArray_Resize(self._buf, 0) # <<<<<<<<<<<<<< * * */ __pyx_t_3 = __pyx_v_self->__pyx_base._buf; __Pyx_INCREF(__pyx_t_3); __pyx_t_7 = PyByteArray_Resize(__pyx_t_3, 0); if (unlikely(__pyx_t_7 == ((int)-1))) __PYX_ERR(0, 561, __pyx_L1_error) __Pyx_DECREF(__pyx_t_3); __pyx_t_3 = 0; /* "aiohttp/_http_parser.pyx":547 * payload_exception, response_with_body) * * cdef object _on_status_complete(self): # <<<<<<<<<<<<<< * cdef Py_buffer py_buf * if not self._buf: */ /* function exit code */ __pyx_r = Py_None; __Pyx_INCREF(Py_None); goto __pyx_L0; __pyx_L1_error:; __Pyx_XDECREF(__pyx_t_3); __Pyx_XDECREF(__pyx_t_4); __Pyx_XDECREF(__pyx_t_5); __Pyx_AddTraceback("aiohttp._http_parser.HttpRequestParser._on_status_complete", __pyx_clineno, __pyx_lineno, __pyx_filename); __pyx_r = 0; __pyx_L0:; __Pyx_XGIVEREF(__pyx_r); __Pyx_RefNannyFinishContext(); return __pyx_r; } /* "(tree fragment)":1 * def __reduce_cython__(self): # <<<<<<<<<<<<<< * raise TypeError("no default __reduce__ due to non-trivial __cinit__") * def __setstate_cython__(self, __pyx_state): */ /* Python wrapper */ static PyObject *__pyx_pw_7aiohttp_12_http_parser_17HttpRequestParser_3__reduce_cython__(PyObject *__pyx_v_self, CYTHON_UNUSED PyObject *unused); /*proto*/ static PyObject *__pyx_pw_7aiohttp_12_http_parser_17HttpRequestParser_3__reduce_cython__(PyObject *__pyx_v_self, CYTHON_UNUSED PyObject *unused) { PyObject *__pyx_r = 0; __Pyx_RefNannyDeclarations __Pyx_RefNannySetupContext("__reduce_cython__ (wrapper)", 0); __pyx_r = __pyx_pf_7aiohttp_12_http_parser_17HttpRequestParser_2__reduce_cython__(((struct __pyx_obj_7aiohttp_12_http_parser_HttpRequestParser *)__pyx_v_self)); /* function exit code */ __Pyx_RefNannyFinishContext(); return __pyx_r; } static PyObject *__pyx_pf_7aiohttp_12_http_parser_17HttpRequestParser_2__reduce_cython__(CYTHON_UNUSED struct __pyx_obj_7aiohttp_12_http_parser_HttpRequestParser *__pyx_v_self) { PyObject *__pyx_r = NULL; __Pyx_RefNannyDeclarations PyObject *__pyx_t_1 = NULL; __Pyx_RefNannySetupContext("__reduce_cython__", 0); /* "(tree fragment)":2 * def __reduce_cython__(self): * raise TypeError("no default __reduce__ due to non-trivial __cinit__") # <<<<<<<<<<<<<< * def __setstate_cython__(self, __pyx_state): * raise TypeError("no default __reduce__ due to non-trivial __cinit__") */ __pyx_t_1 = __Pyx_PyObject_Call(__pyx_builtin_TypeError, __pyx_tuple__7, NULL); if (unlikely(!__pyx_t_1)) __PYX_ERR(1, 2, __pyx_L1_error) 
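/* The __reduce_cython__ and __setstate_cython__ helpers generated from the
 * "(tree fragment)" echoes nearby both raise
 * TypeError("no default __reduce__ due to non-trivial __cinit__"),
 * so HttpRequestParser instances do not support the default pickle
 * protocol. */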
__Pyx_GOTREF(__pyx_t_1); __Pyx_Raise(__pyx_t_1, 0, 0, 0); __Pyx_DECREF(__pyx_t_1); __pyx_t_1 = 0; __PYX_ERR(1, 2, __pyx_L1_error) /* "(tree fragment)":1 * def __reduce_cython__(self): # <<<<<<<<<<<<<< * raise TypeError("no default __reduce__ due to non-trivial __cinit__") * def __setstate_cython__(self, __pyx_state): */ /* function exit code */ __pyx_L1_error:; __Pyx_XDECREF(__pyx_t_1); __Pyx_AddTraceback("aiohttp._http_parser.HttpRequestParser.__reduce_cython__", __pyx_clineno, __pyx_lineno, __pyx_filename); __pyx_r = NULL; __Pyx_XGIVEREF(__pyx_r); __Pyx_RefNannyFinishContext(); return __pyx_r; } /* "(tree fragment)":3 * def __reduce_cython__(self): * raise TypeError("no default __reduce__ due to non-trivial __cinit__") * def __setstate_cython__(self, __pyx_state): # <<<<<<<<<<<<<< * raise TypeError("no default __reduce__ due to non-trivial __cinit__") */ /* Python wrapper */ static PyObject *__pyx_pw_7aiohttp_12_http_parser_17HttpRequestParser_5__setstate_cython__(PyObject *__pyx_v_self, PyObject *__pyx_v___pyx_state); /*proto*/ static PyObject *__pyx_pw_7aiohttp_12_http_parser_17HttpRequestParser_5__setstate_cython__(PyObject *__pyx_v_self, PyObject *__pyx_v___pyx_state) { PyObject *__pyx_r = 0; __Pyx_RefNannyDeclarations __Pyx_RefNannySetupContext("__setstate_cython__ (wrapper)", 0); __pyx_r = __pyx_pf_7aiohttp_12_http_parser_17HttpRequestParser_4__setstate_cython__(((struct __pyx_obj_7aiohttp_12_http_parser_HttpRequestParser *)__pyx_v_self), ((PyObject *)__pyx_v___pyx_state)); /* function exit code */ __Pyx_RefNannyFinishContext(); return __pyx_r; } static PyObject *__pyx_pf_7aiohttp_12_http_parser_17HttpRequestParser_4__setstate_cython__(CYTHON_UNUSED struct __pyx_obj_7aiohttp_12_http_parser_HttpRequestParser *__pyx_v_self, CYTHON_UNUSED PyObject *__pyx_v___pyx_state) { PyObject *__pyx_r = NULL; __Pyx_RefNannyDeclarations PyObject *__pyx_t_1 = NULL; __Pyx_RefNannySetupContext("__setstate_cython__", 0); /* "(tree fragment)":4 * raise TypeError("no default __reduce__ due to non-trivial __cinit__") * def __setstate_cython__(self, __pyx_state): * raise TypeError("no default __reduce__ due to non-trivial __cinit__") # <<<<<<<<<<<<<< */ __pyx_t_1 = __Pyx_PyObject_Call(__pyx_builtin_TypeError, __pyx_tuple__8, NULL); if (unlikely(!__pyx_t_1)) __PYX_ERR(1, 4, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_1); __Pyx_Raise(__pyx_t_1, 0, 0, 0); __Pyx_DECREF(__pyx_t_1); __pyx_t_1 = 0; __PYX_ERR(1, 4, __pyx_L1_error) /* "(tree fragment)":3 * def __reduce_cython__(self): * raise TypeError("no default __reduce__ due to non-trivial __cinit__") * def __setstate_cython__(self, __pyx_state): # <<<<<<<<<<<<<< * raise TypeError("no default __reduce__ due to non-trivial __cinit__") */ /* function exit code */ __pyx_L1_error:; __Pyx_XDECREF(__pyx_t_1); __Pyx_AddTraceback("aiohttp._http_parser.HttpRequestParser.__setstate_cython__", __pyx_clineno, __pyx_lineno, __pyx_filename); __pyx_r = NULL; __Pyx_XGIVEREF(__pyx_r); __Pyx_RefNannyFinishContext(); return __pyx_r; } /* "aiohttp/_http_parser.pyx":566 * cdef class HttpResponseParser(HttpParser): * * def __init__(self, protocol, loop, timer=None, # <<<<<<<<<<<<<< * size_t max_line_size=8190, size_t max_headers=32768, * size_t max_field_size=8190, payload_exception=None, */ /* Python wrapper */ static int __pyx_pw_7aiohttp_12_http_parser_18HttpResponseParser_1__init__(PyObject *__pyx_v_self, PyObject *__pyx_args, PyObject *__pyx_kwds); /*proto*/ static int __pyx_pw_7aiohttp_12_http_parser_18HttpResponseParser_1__init__(PyObject *__pyx_v_self, PyObject *__pyx_args, 
PyObject *__pyx_kwds) { PyObject *__pyx_v_protocol = 0; PyObject *__pyx_v_loop = 0; PyObject *__pyx_v_timer = 0; size_t __pyx_v_max_line_size; size_t __pyx_v_max_headers; size_t __pyx_v_max_field_size; PyObject *__pyx_v_payload_exception = 0; int __pyx_v_response_with_body; CYTHON_UNUSED int __pyx_v_read_until_eof; int __pyx_v_auto_decompress; int __pyx_r; __Pyx_RefNannyDeclarations __Pyx_RefNannySetupContext("__init__ (wrapper)", 0); { static PyObject **__pyx_pyargnames[] = {&__pyx_n_s_protocol,&__pyx_n_s_loop,&__pyx_n_s_timer,&__pyx_n_s_max_line_size,&__pyx_n_s_max_headers,&__pyx_n_s_max_field_size,&__pyx_n_s_payload_exception,&__pyx_n_s_response_with_body,&__pyx_n_s_read_until_eof,&__pyx_n_s_auto_decompress,0}; PyObject* values[10] = {0,0,0,0,0,0,0,0,0,0}; values[2] = ((PyObject *)Py_None); /* "aiohttp/_http_parser.pyx":568 * def __init__(self, protocol, loop, timer=None, * size_t max_line_size=8190, size_t max_headers=32768, * size_t max_field_size=8190, payload_exception=None, # <<<<<<<<<<<<<< * bint response_with_body=True, bint read_until_eof=False, * bint auto_decompress=True): */ values[6] = ((PyObject *)Py_None); if (unlikely(__pyx_kwds)) { Py_ssize_t kw_args; const Py_ssize_t pos_args = PyTuple_GET_SIZE(__pyx_args); switch (pos_args) { case 10: values[9] = PyTuple_GET_ITEM(__pyx_args, 9); CYTHON_FALLTHROUGH; case 9: values[8] = PyTuple_GET_ITEM(__pyx_args, 8); CYTHON_FALLTHROUGH; case 8: values[7] = PyTuple_GET_ITEM(__pyx_args, 7); CYTHON_FALLTHROUGH; case 7: values[6] = PyTuple_GET_ITEM(__pyx_args, 6); CYTHON_FALLTHROUGH; case 6: values[5] = PyTuple_GET_ITEM(__pyx_args, 5); CYTHON_FALLTHROUGH; case 5: values[4] = PyTuple_GET_ITEM(__pyx_args, 4); CYTHON_FALLTHROUGH; case 4: values[3] = PyTuple_GET_ITEM(__pyx_args, 3); CYTHON_FALLTHROUGH; case 3: values[2] = PyTuple_GET_ITEM(__pyx_args, 2); CYTHON_FALLTHROUGH; case 2: values[1] = PyTuple_GET_ITEM(__pyx_args, 1); CYTHON_FALLTHROUGH; case 1: values[0] = PyTuple_GET_ITEM(__pyx_args, 0); CYTHON_FALLTHROUGH; case 0: break; default: goto __pyx_L5_argtuple_error; } kw_args = PyDict_Size(__pyx_kwds); switch (pos_args) { case 0: if (likely((values[0] = __Pyx_PyDict_GetItemStr(__pyx_kwds, __pyx_n_s_protocol)) != 0)) kw_args--; else goto __pyx_L5_argtuple_error; CYTHON_FALLTHROUGH; case 1: if (likely((values[1] = __Pyx_PyDict_GetItemStr(__pyx_kwds, __pyx_n_s_loop)) != 0)) kw_args--; else { __Pyx_RaiseArgtupleInvalid("__init__", 0, 2, 10, 1); __PYX_ERR(0, 566, __pyx_L3_error) } CYTHON_FALLTHROUGH; case 2: if (kw_args > 0) { PyObject* value = __Pyx_PyDict_GetItemStr(__pyx_kwds, __pyx_n_s_timer); if (value) { values[2] = value; kw_args--; } } CYTHON_FALLTHROUGH; case 3: if (kw_args > 0) { PyObject* value = __Pyx_PyDict_GetItemStr(__pyx_kwds, __pyx_n_s_max_line_size); if (value) { values[3] = value; kw_args--; } } CYTHON_FALLTHROUGH; case 4: if (kw_args > 0) { PyObject* value = __Pyx_PyDict_GetItemStr(__pyx_kwds, __pyx_n_s_max_headers); if (value) { values[4] = value; kw_args--; } } CYTHON_FALLTHROUGH; case 5: if (kw_args > 0) { PyObject* value = __Pyx_PyDict_GetItemStr(__pyx_kwds, __pyx_n_s_max_field_size); if (value) { values[5] = value; kw_args--; } } CYTHON_FALLTHROUGH; case 6: if (kw_args > 0) { PyObject* value = __Pyx_PyDict_GetItemStr(__pyx_kwds, __pyx_n_s_payload_exception); if (value) { values[6] = value; kw_args--; } } CYTHON_FALLTHROUGH; case 7: if (kw_args > 0) { PyObject* value = __Pyx_PyDict_GetItemStr(__pyx_kwds, __pyx_n_s_response_with_body); if (value) { values[7] = value; kw_args--; } } CYTHON_FALLTHROUGH; case 8: if (kw_args 
> 0) { PyObject* value = __Pyx_PyDict_GetItemStr(__pyx_kwds, __pyx_n_s_read_until_eof); if (value) { values[8] = value; kw_args--; } } CYTHON_FALLTHROUGH; case 9: if (kw_args > 0) { PyObject* value = __Pyx_PyDict_GetItemStr(__pyx_kwds, __pyx_n_s_auto_decompress); if (value) { values[9] = value; kw_args--; } } } if (unlikely(kw_args > 0)) { if (unlikely(__Pyx_ParseOptionalKeywords(__pyx_kwds, __pyx_pyargnames, 0, values, pos_args, "__init__") < 0)) __PYX_ERR(0, 566, __pyx_L3_error) } } else { switch (PyTuple_GET_SIZE(__pyx_args)) { case 10: values[9] = PyTuple_GET_ITEM(__pyx_args, 9); CYTHON_FALLTHROUGH; case 9: values[8] = PyTuple_GET_ITEM(__pyx_args, 8); CYTHON_FALLTHROUGH; case 8: values[7] = PyTuple_GET_ITEM(__pyx_args, 7); CYTHON_FALLTHROUGH; case 7: values[6] = PyTuple_GET_ITEM(__pyx_args, 6); CYTHON_FALLTHROUGH; case 6: values[5] = PyTuple_GET_ITEM(__pyx_args, 5); CYTHON_FALLTHROUGH; case 5: values[4] = PyTuple_GET_ITEM(__pyx_args, 4); CYTHON_FALLTHROUGH; case 4: values[3] = PyTuple_GET_ITEM(__pyx_args, 3); CYTHON_FALLTHROUGH; case 3: values[2] = PyTuple_GET_ITEM(__pyx_args, 2); CYTHON_FALLTHROUGH; case 2: values[1] = PyTuple_GET_ITEM(__pyx_args, 1); values[0] = PyTuple_GET_ITEM(__pyx_args, 0); break; default: goto __pyx_L5_argtuple_error; } } __pyx_v_protocol = values[0]; __pyx_v_loop = values[1]; __pyx_v_timer = values[2]; if (values[3]) { __pyx_v_max_line_size = __Pyx_PyInt_As_size_t(values[3]); if (unlikely((__pyx_v_max_line_size == (size_t)-1) && PyErr_Occurred())) __PYX_ERR(0, 567, __pyx_L3_error) } else { __pyx_v_max_line_size = ((size_t)0x1FFE); } if (values[4]) { __pyx_v_max_headers = __Pyx_PyInt_As_size_t(values[4]); if (unlikely((__pyx_v_max_headers == (size_t)-1) && PyErr_Occurred())) __PYX_ERR(0, 567, __pyx_L3_error) } else { __pyx_v_max_headers = ((size_t)0x8000); } if (values[5]) { __pyx_v_max_field_size = __Pyx_PyInt_As_size_t(values[5]); if (unlikely((__pyx_v_max_field_size == (size_t)-1) && PyErr_Occurred())) __PYX_ERR(0, 568, __pyx_L3_error) } else { __pyx_v_max_field_size = ((size_t)0x1FFE); } __pyx_v_payload_exception = values[6]; if (values[7]) { __pyx_v_response_with_body = __Pyx_PyObject_IsTrue(values[7]); if (unlikely((__pyx_v_response_with_body == (int)-1) && PyErr_Occurred())) __PYX_ERR(0, 569, __pyx_L3_error) } else { /* "aiohttp/_http_parser.pyx":569 * size_t max_line_size=8190, size_t max_headers=32768, * size_t max_field_size=8190, payload_exception=None, * bint response_with_body=True, bint read_until_eof=False, # <<<<<<<<<<<<<< * bint auto_decompress=True): * self._init(cparser.HTTP_RESPONSE, protocol, loop, timer, */ __pyx_v_response_with_body = ((int)1); } if (values[8]) { __pyx_v_read_until_eof = __Pyx_PyObject_IsTrue(values[8]); if (unlikely((__pyx_v_read_until_eof == (int)-1) && PyErr_Occurred())) __PYX_ERR(0, 569, __pyx_L3_error) } else { __pyx_v_read_until_eof = ((int)0); } if (values[9]) { __pyx_v_auto_decompress = __Pyx_PyObject_IsTrue(values[9]); if (unlikely((__pyx_v_auto_decompress == (int)-1) && PyErr_Occurred())) __PYX_ERR(0, 570, __pyx_L3_error) } else { /* "aiohttp/_http_parser.pyx":570 * size_t max_field_size=8190, payload_exception=None, * bint response_with_body=True, bint read_until_eof=False, * bint auto_decompress=True): # <<<<<<<<<<<<<< * self._init(cparser.HTTP_RESPONSE, protocol, loop, timer, * max_line_size, max_headers, max_field_size, */ __pyx_v_auto_decompress = ((int)1); } } goto __pyx_L4_argument_unpacking_done; __pyx_L5_argtuple_error:; __Pyx_RaiseArgtupleInvalid("__init__", 0, 2, 10, PyTuple_GET_SIZE(__pyx_args)); 
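/* The argument-unpacking wrapper above mirrors the Python signature of
 * HttpResponseParser.__init__ from "_http_parser.pyx":566-570: protocol and
 * loop are required, while timer=None, max_line_size=8190 (0x1FFE),
 * max_headers=32768 (0x8000), max_field_size=8190 (0x1FFE),
 * payload_exception=None, response_with_body=True, read_until_eof=False and
 * auto_decompress=True are filled in whenever the corresponding positional
 * or keyword argument is absent; any call with fewer than 2 or more than 10
 * positional arguments lands here and raises TypeError via
 * __Pyx_RaiseArgtupleInvalid. */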
__PYX_ERR(0, 566, __pyx_L3_error) __pyx_L3_error:; __Pyx_AddTraceback("aiohttp._http_parser.HttpResponseParser.__init__", __pyx_clineno, __pyx_lineno, __pyx_filename); __Pyx_RefNannyFinishContext(); return -1; __pyx_L4_argument_unpacking_done:; __pyx_r = __pyx_pf_7aiohttp_12_http_parser_18HttpResponseParser___init__(((struct __pyx_obj_7aiohttp_12_http_parser_HttpResponseParser *)__pyx_v_self), __pyx_v_protocol, __pyx_v_loop, __pyx_v_timer, __pyx_v_max_line_size, __pyx_v_max_headers, __pyx_v_max_field_size, __pyx_v_payload_exception, __pyx_v_response_with_body, __pyx_v_read_until_eof, __pyx_v_auto_decompress); /* "aiohttp/_http_parser.pyx":566 * cdef class HttpResponseParser(HttpParser): * * def __init__(self, protocol, loop, timer=None, # <<<<<<<<<<<<<< * size_t max_line_size=8190, size_t max_headers=32768, * size_t max_field_size=8190, payload_exception=None, */ /* function exit code */ __Pyx_RefNannyFinishContext(); return __pyx_r; } static int __pyx_pf_7aiohttp_12_http_parser_18HttpResponseParser___init__(struct __pyx_obj_7aiohttp_12_http_parser_HttpResponseParser *__pyx_v_self, PyObject *__pyx_v_protocol, PyObject *__pyx_v_loop, PyObject *__pyx_v_timer, size_t __pyx_v_max_line_size, size_t __pyx_v_max_headers, size_t __pyx_v_max_field_size, PyObject *__pyx_v_payload_exception, int __pyx_v_response_with_body, CYTHON_UNUSED int __pyx_v_read_until_eof, int __pyx_v_auto_decompress) { int __pyx_r; __Pyx_RefNannyDeclarations PyObject *__pyx_t_1 = NULL; struct __pyx_opt_args_7aiohttp_12_http_parser_10HttpParser__init __pyx_t_2; __Pyx_RefNannySetupContext("__init__", 0); /* "aiohttp/_http_parser.pyx":571 * bint response_with_body=True, bint read_until_eof=False, * bint auto_decompress=True): * self._init(cparser.HTTP_RESPONSE, protocol, loop, timer, # <<<<<<<<<<<<<< * max_line_size, max_headers, max_field_size, * payload_exception, response_with_body, auto_decompress) */ __pyx_t_2.__pyx_n = 7; __pyx_t_2.timer = __pyx_v_timer; __pyx_t_2.max_line_size = __pyx_v_max_line_size; __pyx_t_2.max_headers = __pyx_v_max_headers; __pyx_t_2.max_field_size = __pyx_v_max_field_size; __pyx_t_2.payload_exception = __pyx_v_payload_exception; __pyx_t_2.response_with_body = __pyx_v_response_with_body; __pyx_t_2.auto_decompress = __pyx_v_auto_decompress; __pyx_t_1 = ((struct __pyx_vtabstruct_7aiohttp_12_http_parser_HttpResponseParser *)__pyx_v_self->__pyx_base.__pyx_vtab)->__pyx_base._init(((struct __pyx_obj_7aiohttp_12_http_parser_HttpParser *)__pyx_v_self), HTTP_RESPONSE, __pyx_v_protocol, __pyx_v_loop, &__pyx_t_2); if (unlikely(!__pyx_t_1)) __PYX_ERR(0, 571, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_1); __Pyx_DECREF(__pyx_t_1); __pyx_t_1 = 0; /* "aiohttp/_http_parser.pyx":566 * cdef class HttpResponseParser(HttpParser): * * def __init__(self, protocol, loop, timer=None, # <<<<<<<<<<<<<< * size_t max_line_size=8190, size_t max_headers=32768, * size_t max_field_size=8190, payload_exception=None, */ /* function exit code */ __pyx_r = 0; goto __pyx_L0; __pyx_L1_error:; __Pyx_XDECREF(__pyx_t_1); __Pyx_AddTraceback("aiohttp._http_parser.HttpResponseParser.__init__", __pyx_clineno, __pyx_lineno, __pyx_filename); __pyx_r = -1; __pyx_L0:; __Pyx_RefNannyFinishContext(); return __pyx_r; } /* "aiohttp/_http_parser.pyx":575 * payload_exception, response_with_body, auto_decompress) * * cdef object _on_status_complete(self): # <<<<<<<<<<<<<< * if self._buf: * self._reason = self._buf.decode('utf-8', 'surrogateescape') */ static PyObject *__pyx_f_7aiohttp_12_http_parser_18HttpResponseParser__on_status_complete(struct 
__pyx_obj_7aiohttp_12_http_parser_HttpResponseParser *__pyx_v_self) { PyObject *__pyx_r = NULL; __Pyx_RefNannyDeclarations int __pyx_t_1; PyObject *__pyx_t_2 = NULL; int __pyx_t_3; __Pyx_RefNannySetupContext("_on_status_complete", 0); /* "aiohttp/_http_parser.pyx":576 * * cdef object _on_status_complete(self): * if self._buf: # <<<<<<<<<<<<<< * self._reason = self._buf.decode('utf-8', 'surrogateescape') * PyByteArray_Resize(self._buf, 0) */ __pyx_t_1 = (__pyx_v_self->__pyx_base._buf != Py_None)&&(PyByteArray_GET_SIZE(__pyx_v_self->__pyx_base._buf) != 0); if (__pyx_t_1) { /* "aiohttp/_http_parser.pyx":577 * cdef object _on_status_complete(self): * if self._buf: * self._reason = self._buf.decode('utf-8', 'surrogateescape') # <<<<<<<<<<<<<< * PyByteArray_Resize(self._buf, 0) * else: */ if (unlikely(__pyx_v_self->__pyx_base._buf == Py_None)) { PyErr_Format(PyExc_AttributeError, "'NoneType' object has no attribute '%.30s'", "decode"); __PYX_ERR(0, 577, __pyx_L1_error) } __pyx_t_2 = __Pyx_decode_bytearray(__pyx_v_self->__pyx_base._buf, 0, PY_SSIZE_T_MAX, NULL, ((char const *)"surrogateescape"), PyUnicode_DecodeUTF8); if (unlikely(!__pyx_t_2)) __PYX_ERR(0, 577, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_2); __Pyx_GIVEREF(__pyx_t_2); __Pyx_GOTREF(__pyx_v_self->__pyx_base._reason); __Pyx_DECREF(__pyx_v_self->__pyx_base._reason); __pyx_v_self->__pyx_base._reason = ((PyObject*)__pyx_t_2); __pyx_t_2 = 0; /* "aiohttp/_http_parser.pyx":578 * if self._buf: * self._reason = self._buf.decode('utf-8', 'surrogateescape') * PyByteArray_Resize(self._buf, 0) # <<<<<<<<<<<<<< * else: * self._reason = self._reason or '' */ __pyx_t_2 = __pyx_v_self->__pyx_base._buf; __Pyx_INCREF(__pyx_t_2); __pyx_t_3 = PyByteArray_Resize(__pyx_t_2, 0); if (unlikely(__pyx_t_3 == ((int)-1))) __PYX_ERR(0, 578, __pyx_L1_error) __Pyx_DECREF(__pyx_t_2); __pyx_t_2 = 0; /* "aiohttp/_http_parser.pyx":576 * * cdef object _on_status_complete(self): * if self._buf: # <<<<<<<<<<<<<< * self._reason = self._buf.decode('utf-8', 'surrogateescape') * PyByteArray_Resize(self._buf, 0) */ goto __pyx_L3; } /* "aiohttp/_http_parser.pyx":580 * PyByteArray_Resize(self._buf, 0) * else: * self._reason = self._reason or '' # <<<<<<<<<<<<<< * * cdef int cb_on_message_begin(cparser.http_parser* parser) except -1: */ /*else*/ { __pyx_t_1 = __Pyx_PyObject_IsTrue(__pyx_v_self->__pyx_base._reason); if (unlikely(__pyx_t_1 < 0)) __PYX_ERR(0, 580, __pyx_L1_error) if (!__pyx_t_1) { } else { __Pyx_INCREF(__pyx_v_self->__pyx_base._reason); __pyx_t_2 = __pyx_v_self->__pyx_base._reason; goto __pyx_L4_bool_binop_done; } __Pyx_INCREF(__pyx_kp_u__4); __pyx_t_2 = __pyx_kp_u__4; __pyx_L4_bool_binop_done:; __Pyx_GIVEREF(__pyx_t_2); __Pyx_GOTREF(__pyx_v_self->__pyx_base._reason); __Pyx_DECREF(__pyx_v_self->__pyx_base._reason); __pyx_v_self->__pyx_base._reason = ((PyObject*)__pyx_t_2); __pyx_t_2 = 0; } __pyx_L3:; /* "aiohttp/_http_parser.pyx":575 * payload_exception, response_with_body, auto_decompress) * * cdef object _on_status_complete(self): # <<<<<<<<<<<<<< * if self._buf: * self._reason = self._buf.decode('utf-8', 'surrogateescape') */ /* function exit code */ __pyx_r = Py_None; __Pyx_INCREF(Py_None); goto __pyx_L0; __pyx_L1_error:; __Pyx_XDECREF(__pyx_t_2); __Pyx_AddTraceback("aiohttp._http_parser.HttpResponseParser._on_status_complete", __pyx_clineno, __pyx_lineno, __pyx_filename); __pyx_r = 0; __pyx_L0:; __Pyx_XGIVEREF(__pyx_r); __Pyx_RefNannyFinishContext(); return __pyx_r; } /* "(tree fragment)":1 * def __reduce_cython__(self): # <<<<<<<<<<<<<< * raise TypeError("no default 
__reduce__ due to non-trivial __cinit__") * def __setstate_cython__(self, __pyx_state): */ /* Python wrapper */ static PyObject *__pyx_pw_7aiohttp_12_http_parser_18HttpResponseParser_3__reduce_cython__(PyObject *__pyx_v_self, CYTHON_UNUSED PyObject *unused); /*proto*/ static PyObject *__pyx_pw_7aiohttp_12_http_parser_18HttpResponseParser_3__reduce_cython__(PyObject *__pyx_v_self, CYTHON_UNUSED PyObject *unused) { PyObject *__pyx_r = 0; __Pyx_RefNannyDeclarations __Pyx_RefNannySetupContext("__reduce_cython__ (wrapper)", 0); __pyx_r = __pyx_pf_7aiohttp_12_http_parser_18HttpResponseParser_2__reduce_cython__(((struct __pyx_obj_7aiohttp_12_http_parser_HttpResponseParser *)__pyx_v_self)); /* function exit code */ __Pyx_RefNannyFinishContext(); return __pyx_r; } static PyObject *__pyx_pf_7aiohttp_12_http_parser_18HttpResponseParser_2__reduce_cython__(CYTHON_UNUSED struct __pyx_obj_7aiohttp_12_http_parser_HttpResponseParser *__pyx_v_self) { PyObject *__pyx_r = NULL; __Pyx_RefNannyDeclarations PyObject *__pyx_t_1 = NULL; __Pyx_RefNannySetupContext("__reduce_cython__", 0); /* "(tree fragment)":2 * def __reduce_cython__(self): * raise TypeError("no default __reduce__ due to non-trivial __cinit__") # <<<<<<<<<<<<<< * def __setstate_cython__(self, __pyx_state): * raise TypeError("no default __reduce__ due to non-trivial __cinit__") */ __pyx_t_1 = __Pyx_PyObject_Call(__pyx_builtin_TypeError, __pyx_tuple__9, NULL); if (unlikely(!__pyx_t_1)) __PYX_ERR(1, 2, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_1); __Pyx_Raise(__pyx_t_1, 0, 0, 0); __Pyx_DECREF(__pyx_t_1); __pyx_t_1 = 0; __PYX_ERR(1, 2, __pyx_L1_error) /* "(tree fragment)":1 * def __reduce_cython__(self): # <<<<<<<<<<<<<< * raise TypeError("no default __reduce__ due to non-trivial __cinit__") * def __setstate_cython__(self, __pyx_state): */ /* function exit code */ __pyx_L1_error:; __Pyx_XDECREF(__pyx_t_1); __Pyx_AddTraceback("aiohttp._http_parser.HttpResponseParser.__reduce_cython__", __pyx_clineno, __pyx_lineno, __pyx_filename); __pyx_r = NULL; __Pyx_XGIVEREF(__pyx_r); __Pyx_RefNannyFinishContext(); return __pyx_r; } /* "(tree fragment)":3 * def __reduce_cython__(self): * raise TypeError("no default __reduce__ due to non-trivial __cinit__") * def __setstate_cython__(self, __pyx_state): # <<<<<<<<<<<<<< * raise TypeError("no default __reduce__ due to non-trivial __cinit__") */ /* Python wrapper */ static PyObject *__pyx_pw_7aiohttp_12_http_parser_18HttpResponseParser_5__setstate_cython__(PyObject *__pyx_v_self, PyObject *__pyx_v___pyx_state); /*proto*/ static PyObject *__pyx_pw_7aiohttp_12_http_parser_18HttpResponseParser_5__setstate_cython__(PyObject *__pyx_v_self, PyObject *__pyx_v___pyx_state) { PyObject *__pyx_r = 0; __Pyx_RefNannyDeclarations __Pyx_RefNannySetupContext("__setstate_cython__ (wrapper)", 0); __pyx_r = __pyx_pf_7aiohttp_12_http_parser_18HttpResponseParser_4__setstate_cython__(((struct __pyx_obj_7aiohttp_12_http_parser_HttpResponseParser *)__pyx_v_self), ((PyObject *)__pyx_v___pyx_state)); /* function exit code */ __Pyx_RefNannyFinishContext(); return __pyx_r; } static PyObject *__pyx_pf_7aiohttp_12_http_parser_18HttpResponseParser_4__setstate_cython__(CYTHON_UNUSED struct __pyx_obj_7aiohttp_12_http_parser_HttpResponseParser *__pyx_v_self, CYTHON_UNUSED PyObject *__pyx_v___pyx_state) { PyObject *__pyx_r = NULL; __Pyx_RefNannyDeclarations PyObject *__pyx_t_1 = NULL; __Pyx_RefNannySetupContext("__setstate_cython__", 0); /* "(tree fragment)":4 * raise TypeError("no default __reduce__ due to non-trivial __cinit__") * def 
__setstate_cython__(self, __pyx_state): * raise TypeError("no default __reduce__ due to non-trivial __cinit__") # <<<<<<<<<<<<<< */ __pyx_t_1 = __Pyx_PyObject_Call(__pyx_builtin_TypeError, __pyx_tuple__10, NULL); if (unlikely(!__pyx_t_1)) __PYX_ERR(1, 4, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_1); __Pyx_Raise(__pyx_t_1, 0, 0, 0); __Pyx_DECREF(__pyx_t_1); __pyx_t_1 = 0; __PYX_ERR(1, 4, __pyx_L1_error) /* "(tree fragment)":3 * def __reduce_cython__(self): * raise TypeError("no default __reduce__ due to non-trivial __cinit__") * def __setstate_cython__(self, __pyx_state): # <<<<<<<<<<<<<< * raise TypeError("no default __reduce__ due to non-trivial __cinit__") */ /* function exit code */ __pyx_L1_error:; __Pyx_XDECREF(__pyx_t_1); __Pyx_AddTraceback("aiohttp._http_parser.HttpResponseParser.__setstate_cython__", __pyx_clineno, __pyx_lineno, __pyx_filename); __pyx_r = NULL; __Pyx_XGIVEREF(__pyx_r); __Pyx_RefNannyFinishContext(); return __pyx_r; } /* "aiohttp/_http_parser.pyx":582 * self._reason = self._reason or '' * * cdef int cb_on_message_begin(cparser.http_parser* parser) except -1: # <<<<<<<<<<<<<< * cdef HttpParser pyparser = parser.data * */ static int __pyx_f_7aiohttp_12_http_parser_cb_on_message_begin(struct http_parser *__pyx_v_parser) { struct __pyx_obj_7aiohttp_12_http_parser_HttpParser *__pyx_v_pyparser = 0; int __pyx_r; __Pyx_RefNannyDeclarations PyObject *__pyx_t_1 = NULL; PyObject *__pyx_t_2 = NULL; PyObject *__pyx_t_3 = NULL; int __pyx_t_4; __Pyx_RefNannySetupContext("cb_on_message_begin", 0); /* "aiohttp/_http_parser.pyx":583 * * cdef int cb_on_message_begin(cparser.http_parser* parser) except -1: * cdef HttpParser pyparser = parser.data # <<<<<<<<<<<<<< * * pyparser._started = True */ __pyx_t_1 = ((PyObject *)__pyx_v_parser->data); __Pyx_INCREF(__pyx_t_1); __pyx_v_pyparser = ((struct __pyx_obj_7aiohttp_12_http_parser_HttpParser *)__pyx_t_1); __pyx_t_1 = 0; /* "aiohttp/_http_parser.pyx":585 * cdef HttpParser pyparser = parser.data * * pyparser._started = True # <<<<<<<<<<<<<< * pyparser._headers = CIMultiDict() * pyparser._raw_headers = [] */ __pyx_v_pyparser->_started = 1; /* "aiohttp/_http_parser.pyx":586 * * pyparser._started = True * pyparser._headers = CIMultiDict() # <<<<<<<<<<<<<< * pyparser._raw_headers = [] * PyByteArray_Resize(pyparser._buf, 0) */ __Pyx_INCREF(__pyx_v_7aiohttp_12_http_parser_CIMultiDict); __pyx_t_2 = __pyx_v_7aiohttp_12_http_parser_CIMultiDict; __pyx_t_3 = NULL; if (CYTHON_UNPACK_METHODS && unlikely(PyMethod_Check(__pyx_t_2))) { __pyx_t_3 = PyMethod_GET_SELF(__pyx_t_2); if (likely(__pyx_t_3)) { PyObject* function = PyMethod_GET_FUNCTION(__pyx_t_2); __Pyx_INCREF(__pyx_t_3); __Pyx_INCREF(function); __Pyx_DECREF_SET(__pyx_t_2, function); } } __pyx_t_1 = (__pyx_t_3) ? 
__Pyx_PyObject_CallOneArg(__pyx_t_2, __pyx_t_3) : __Pyx_PyObject_CallNoArg(__pyx_t_2); __Pyx_XDECREF(__pyx_t_3); __pyx_t_3 = 0; if (unlikely(!__pyx_t_1)) __PYX_ERR(0, 586, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_1); __Pyx_DECREF(__pyx_t_2); __pyx_t_2 = 0; __Pyx_GIVEREF(__pyx_t_1); __Pyx_GOTREF(__pyx_v_pyparser->_headers); __Pyx_DECREF(__pyx_v_pyparser->_headers); __pyx_v_pyparser->_headers = __pyx_t_1; __pyx_t_1 = 0; /* "aiohttp/_http_parser.pyx":587 * pyparser._started = True * pyparser._headers = CIMultiDict() * pyparser._raw_headers = [] # <<<<<<<<<<<<<< * PyByteArray_Resize(pyparser._buf, 0) * pyparser._path = None */ __pyx_t_1 = PyList_New(0); if (unlikely(!__pyx_t_1)) __PYX_ERR(0, 587, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_1); __Pyx_GIVEREF(__pyx_t_1); __Pyx_GOTREF(__pyx_v_pyparser->_raw_headers); __Pyx_DECREF(__pyx_v_pyparser->_raw_headers); __pyx_v_pyparser->_raw_headers = ((PyObject*)__pyx_t_1); __pyx_t_1 = 0; /* "aiohttp/_http_parser.pyx":588 * pyparser._headers = CIMultiDict() * pyparser._raw_headers = [] * PyByteArray_Resize(pyparser._buf, 0) # <<<<<<<<<<<<<< * pyparser._path = None * pyparser._reason = None */ __pyx_t_1 = __pyx_v_pyparser->_buf; __Pyx_INCREF(__pyx_t_1); __pyx_t_4 = PyByteArray_Resize(__pyx_t_1, 0); if (unlikely(__pyx_t_4 == ((int)-1))) __PYX_ERR(0, 588, __pyx_L1_error) __Pyx_DECREF(__pyx_t_1); __pyx_t_1 = 0; /* "aiohttp/_http_parser.pyx":589 * pyparser._raw_headers = [] * PyByteArray_Resize(pyparser._buf, 0) * pyparser._path = None # <<<<<<<<<<<<<< * pyparser._reason = None * return 0 */ __Pyx_INCREF(Py_None); __Pyx_GIVEREF(Py_None); __Pyx_GOTREF(__pyx_v_pyparser->_path); __Pyx_DECREF(__pyx_v_pyparser->_path); __pyx_v_pyparser->_path = ((PyObject*)Py_None); /* "aiohttp/_http_parser.pyx":590 * PyByteArray_Resize(pyparser._buf, 0) * pyparser._path = None * pyparser._reason = None # <<<<<<<<<<<<<< * return 0 * */ __Pyx_INCREF(Py_None); __Pyx_GIVEREF(Py_None); __Pyx_GOTREF(__pyx_v_pyparser->_reason); __Pyx_DECREF(__pyx_v_pyparser->_reason); __pyx_v_pyparser->_reason = ((PyObject*)Py_None); /* "aiohttp/_http_parser.pyx":591 * pyparser._path = None * pyparser._reason = None * return 0 # <<<<<<<<<<<<<< * * */ __pyx_r = 0; goto __pyx_L0; /* "aiohttp/_http_parser.pyx":582 * self._reason = self._reason or '' * * cdef int cb_on_message_begin(cparser.http_parser* parser) except -1: # <<<<<<<<<<<<<< * cdef HttpParser pyparser = parser.data * */ /* function exit code */ __pyx_L1_error:; __Pyx_XDECREF(__pyx_t_1); __Pyx_XDECREF(__pyx_t_2); __Pyx_XDECREF(__pyx_t_3); __Pyx_AddTraceback("aiohttp._http_parser.cb_on_message_begin", __pyx_clineno, __pyx_lineno, __pyx_filename); __pyx_r = -1; __pyx_L0:; __Pyx_XDECREF((PyObject *)__pyx_v_pyparser); __Pyx_RefNannyFinishContext(); return __pyx_r; } /* "aiohttp/_http_parser.pyx":594 * * * cdef int cb_on_url(cparser.http_parser* parser, # <<<<<<<<<<<<<< * const char *at, size_t length) except -1: * cdef HttpParser pyparser = parser.data */ static int __pyx_f_7aiohttp_12_http_parser_cb_on_url(struct http_parser *__pyx_v_parser, char const *__pyx_v_at, size_t __pyx_v_length) { struct __pyx_obj_7aiohttp_12_http_parser_HttpParser *__pyx_v_pyparser = 0; PyObject *__pyx_v_ex = NULL; int __pyx_r; __Pyx_RefNannyDeclarations PyObject *__pyx_t_1 = NULL; PyObject *__pyx_t_2 = NULL; PyObject *__pyx_t_3 = NULL; PyObject *__pyx_t_4 = NULL; int __pyx_t_5; PyObject *__pyx_t_6 = NULL; PyObject *__pyx_t_7 = NULL; PyObject *__pyx_t_8 = NULL; PyObject *__pyx_t_9 = NULL; int __pyx_t_10; PyObject *__pyx_t_11 = NULL; __Pyx_RefNannySetupContext("cb_on_url", 0); /* 
"aiohttp/_http_parser.pyx":596 * cdef int cb_on_url(cparser.http_parser* parser, * const char *at, size_t length) except -1: * cdef HttpParser pyparser = parser.data # <<<<<<<<<<<<<< * try: * if length > pyparser._max_line_size: */ __pyx_t_1 = ((PyObject *)__pyx_v_parser->data); __Pyx_INCREF(__pyx_t_1); __pyx_v_pyparser = ((struct __pyx_obj_7aiohttp_12_http_parser_HttpParser *)__pyx_t_1); __pyx_t_1 = 0; /* "aiohttp/_http_parser.pyx":597 * const char *at, size_t length) except -1: * cdef HttpParser pyparser = parser.data * try: # <<<<<<<<<<<<<< * if length > pyparser._max_line_size: * raise LineTooLong( */ { __Pyx_PyThreadState_declare __Pyx_PyThreadState_assign __Pyx_ExceptionSave(&__pyx_t_2, &__pyx_t_3, &__pyx_t_4); __Pyx_XGOTREF(__pyx_t_2); __Pyx_XGOTREF(__pyx_t_3); __Pyx_XGOTREF(__pyx_t_4); /*try:*/ { /* "aiohttp/_http_parser.pyx":598 * cdef HttpParser pyparser = parser.data * try: * if length > pyparser._max_line_size: # <<<<<<<<<<<<<< * raise LineTooLong( * 'Status line is too long', pyparser._max_line_size, length) */ __pyx_t_5 = ((__pyx_v_length > __pyx_v_pyparser->_max_line_size) != 0); if (unlikely(__pyx_t_5)) { /* "aiohttp/_http_parser.pyx":599 * try: * if length > pyparser._max_line_size: * raise LineTooLong( # <<<<<<<<<<<<<< * 'Status line is too long', pyparser._max_line_size, length) * extend(pyparser._buf, at, length) */ __Pyx_GetModuleGlobalName(__pyx_t_6, __pyx_n_s_LineTooLong); if (unlikely(!__pyx_t_6)) __PYX_ERR(0, 599, __pyx_L3_error) __Pyx_GOTREF(__pyx_t_6); /* "aiohttp/_http_parser.pyx":600 * if length > pyparser._max_line_size: * raise LineTooLong( * 'Status line is too long', pyparser._max_line_size, length) # <<<<<<<<<<<<<< * extend(pyparser._buf, at, length) * except BaseException as ex: */ __pyx_t_7 = __Pyx_PyInt_FromSize_t(__pyx_v_pyparser->_max_line_size); if (unlikely(!__pyx_t_7)) __PYX_ERR(0, 600, __pyx_L3_error) __Pyx_GOTREF(__pyx_t_7); __pyx_t_8 = __Pyx_PyInt_FromSize_t(__pyx_v_length); if (unlikely(!__pyx_t_8)) __PYX_ERR(0, 600, __pyx_L3_error) __Pyx_GOTREF(__pyx_t_8); __pyx_t_9 = NULL; __pyx_t_10 = 0; if (CYTHON_UNPACK_METHODS && unlikely(PyMethod_Check(__pyx_t_6))) { __pyx_t_9 = PyMethod_GET_SELF(__pyx_t_6); if (likely(__pyx_t_9)) { PyObject* function = PyMethod_GET_FUNCTION(__pyx_t_6); __Pyx_INCREF(__pyx_t_9); __Pyx_INCREF(function); __Pyx_DECREF_SET(__pyx_t_6, function); __pyx_t_10 = 1; } } #if CYTHON_FAST_PYCALL if (PyFunction_Check(__pyx_t_6)) { PyObject *__pyx_temp[4] = {__pyx_t_9, __pyx_kp_u_Status_line_is_too_long, __pyx_t_7, __pyx_t_8}; __pyx_t_1 = __Pyx_PyFunction_FastCall(__pyx_t_6, __pyx_temp+1-__pyx_t_10, 3+__pyx_t_10); if (unlikely(!__pyx_t_1)) __PYX_ERR(0, 599, __pyx_L3_error) __Pyx_XDECREF(__pyx_t_9); __pyx_t_9 = 0; __Pyx_GOTREF(__pyx_t_1); __Pyx_DECREF(__pyx_t_7); __pyx_t_7 = 0; __Pyx_DECREF(__pyx_t_8); __pyx_t_8 = 0; } else #endif #if CYTHON_FAST_PYCCALL if (__Pyx_PyFastCFunction_Check(__pyx_t_6)) { PyObject *__pyx_temp[4] = {__pyx_t_9, __pyx_kp_u_Status_line_is_too_long, __pyx_t_7, __pyx_t_8}; __pyx_t_1 = __Pyx_PyCFunction_FastCall(__pyx_t_6, __pyx_temp+1-__pyx_t_10, 3+__pyx_t_10); if (unlikely(!__pyx_t_1)) __PYX_ERR(0, 599, __pyx_L3_error) __Pyx_XDECREF(__pyx_t_9); __pyx_t_9 = 0; __Pyx_GOTREF(__pyx_t_1); __Pyx_DECREF(__pyx_t_7); __pyx_t_7 = 0; __Pyx_DECREF(__pyx_t_8); __pyx_t_8 = 0; } else #endif { __pyx_t_11 = PyTuple_New(3+__pyx_t_10); if (unlikely(!__pyx_t_11)) __PYX_ERR(0, 599, __pyx_L3_error) __Pyx_GOTREF(__pyx_t_11); if (__pyx_t_9) { __Pyx_GIVEREF(__pyx_t_9); PyTuple_SET_ITEM(__pyx_t_11, 0, __pyx_t_9); __pyx_t_9 = NULL; } 
__Pyx_INCREF(__pyx_kp_u_Status_line_is_too_long); __Pyx_GIVEREF(__pyx_kp_u_Status_line_is_too_long); PyTuple_SET_ITEM(__pyx_t_11, 0+__pyx_t_10, __pyx_kp_u_Status_line_is_too_long); __Pyx_GIVEREF(__pyx_t_7); PyTuple_SET_ITEM(__pyx_t_11, 1+__pyx_t_10, __pyx_t_7); __Pyx_GIVEREF(__pyx_t_8); PyTuple_SET_ITEM(__pyx_t_11, 2+__pyx_t_10, __pyx_t_8); __pyx_t_7 = 0; __pyx_t_8 = 0; __pyx_t_1 = __Pyx_PyObject_Call(__pyx_t_6, __pyx_t_11, NULL); if (unlikely(!__pyx_t_1)) __PYX_ERR(0, 599, __pyx_L3_error) __Pyx_GOTREF(__pyx_t_1); __Pyx_DECREF(__pyx_t_11); __pyx_t_11 = 0; } __Pyx_DECREF(__pyx_t_6); __pyx_t_6 = 0; __Pyx_Raise(__pyx_t_1, 0, 0, 0); __Pyx_DECREF(__pyx_t_1); __pyx_t_1 = 0; __PYX_ERR(0, 599, __pyx_L3_error) /* "aiohttp/_http_parser.pyx":598 * cdef HttpParser pyparser = parser.data * try: * if length > pyparser._max_line_size: # <<<<<<<<<<<<<< * raise LineTooLong( * 'Status line is too long', pyparser._max_line_size, length) */ } /* "aiohttp/_http_parser.pyx":601 * raise LineTooLong( * 'Status line is too long', pyparser._max_line_size, length) * extend(pyparser._buf, at, length) # <<<<<<<<<<<<<< * except BaseException as ex: * pyparser._last_error = ex */ __pyx_t_1 = __pyx_v_pyparser->_buf; __Pyx_INCREF(__pyx_t_1); __pyx_t_6 = __pyx_f_7aiohttp_12_http_parser_extend(__pyx_t_1, __pyx_v_at, __pyx_v_length); if (unlikely(!__pyx_t_6)) __PYX_ERR(0, 601, __pyx_L3_error) __Pyx_GOTREF(__pyx_t_6); __Pyx_DECREF(__pyx_t_1); __pyx_t_1 = 0; __Pyx_DECREF(__pyx_t_6); __pyx_t_6 = 0; /* "aiohttp/_http_parser.pyx":597 * const char *at, size_t length) except -1: * cdef HttpParser pyparser = parser.data * try: # <<<<<<<<<<<<<< * if length > pyparser._max_line_size: * raise LineTooLong( */ } /* "aiohttp/_http_parser.pyx":606 * return -1 * else: * return 0 # <<<<<<<<<<<<<< * * */ /*else:*/ { __pyx_r = 0; goto __pyx_L6_except_return; } __pyx_L3_error:; __Pyx_XDECREF(__pyx_t_1); __pyx_t_1 = 0; __Pyx_XDECREF(__pyx_t_11); __pyx_t_11 = 0; __Pyx_XDECREF(__pyx_t_6); __pyx_t_6 = 0; __Pyx_XDECREF(__pyx_t_7); __pyx_t_7 = 0; __Pyx_XDECREF(__pyx_t_8); __pyx_t_8 = 0; __Pyx_XDECREF(__pyx_t_9); __pyx_t_9 = 0; /* "aiohttp/_http_parser.pyx":602 * 'Status line is too long', pyparser._max_line_size, length) * extend(pyparser._buf, at, length) * except BaseException as ex: # <<<<<<<<<<<<<< * pyparser._last_error = ex * return -1 */ __pyx_t_10 = __Pyx_PyErr_ExceptionMatches(__pyx_builtin_BaseException); if (__pyx_t_10) { __Pyx_AddTraceback("aiohttp._http_parser.cb_on_url", __pyx_clineno, __pyx_lineno, __pyx_filename); if (__Pyx_GetException(&__pyx_t_6, &__pyx_t_1, &__pyx_t_11) < 0) __PYX_ERR(0, 602, __pyx_L5_except_error) __Pyx_GOTREF(__pyx_t_6); __Pyx_GOTREF(__pyx_t_1); __Pyx_GOTREF(__pyx_t_11); __Pyx_INCREF(__pyx_t_1); __pyx_v_ex = __pyx_t_1; /*try:*/ { /* "aiohttp/_http_parser.pyx":603 * extend(pyparser._buf, at, length) * except BaseException as ex: * pyparser._last_error = ex # <<<<<<<<<<<<<< * return -1 * else: */ __Pyx_INCREF(__pyx_v_ex); __Pyx_GIVEREF(__pyx_v_ex); __Pyx_GOTREF(__pyx_v_pyparser->_last_error); __Pyx_DECREF(__pyx_v_pyparser->_last_error); __pyx_v_pyparser->_last_error = __pyx_v_ex; /* "aiohttp/_http_parser.pyx":604 * except BaseException as ex: * pyparser._last_error = ex * return -1 # <<<<<<<<<<<<<< * else: * return 0 */ __pyx_r = -1; __Pyx_DECREF(__pyx_t_1); __pyx_t_1 = 0; __Pyx_DECREF(__pyx_t_6); __pyx_t_6 = 0; __Pyx_DECREF(__pyx_t_11); __pyx_t_11 = 0; goto __pyx_L14_return; } /* "aiohttp/_http_parser.pyx":602 * 'Status line is too long', pyparser._max_line_size, length) * extend(pyparser._buf, at, length) * 
except BaseException as ex: # <<<<<<<<<<<<<< * pyparser._last_error = ex * return -1 */ /*finally:*/ { __pyx_L14_return: { __pyx_t_10 = __pyx_r; __Pyx_DECREF(__pyx_v_ex); __pyx_v_ex = NULL; __pyx_r = __pyx_t_10; goto __pyx_L6_except_return; } } } goto __pyx_L5_except_error; __pyx_L5_except_error:; /* "aiohttp/_http_parser.pyx":597 * const char *at, size_t length) except -1: * cdef HttpParser pyparser = parser.data * try: # <<<<<<<<<<<<<< * if length > pyparser._max_line_size: * raise LineTooLong( */ __Pyx_XGIVEREF(__pyx_t_2); __Pyx_XGIVEREF(__pyx_t_3); __Pyx_XGIVEREF(__pyx_t_4); __Pyx_ExceptionReset(__pyx_t_2, __pyx_t_3, __pyx_t_4); goto __pyx_L1_error; __pyx_L6_except_return:; __Pyx_XGIVEREF(__pyx_t_2); __Pyx_XGIVEREF(__pyx_t_3); __Pyx_XGIVEREF(__pyx_t_4); __Pyx_ExceptionReset(__pyx_t_2, __pyx_t_3, __pyx_t_4); goto __pyx_L0; } /* "aiohttp/_http_parser.pyx":594 * * * cdef int cb_on_url(cparser.http_parser* parser, # <<<<<<<<<<<<<< * const char *at, size_t length) except -1: * cdef HttpParser pyparser = parser.data */ /* function exit code */ __pyx_L1_error:; __Pyx_XDECREF(__pyx_t_1); __Pyx_XDECREF(__pyx_t_6); __Pyx_XDECREF(__pyx_t_7); __Pyx_XDECREF(__pyx_t_8); __Pyx_XDECREF(__pyx_t_9); __Pyx_XDECREF(__pyx_t_11); __Pyx_AddTraceback("aiohttp._http_parser.cb_on_url", __pyx_clineno, __pyx_lineno, __pyx_filename); __pyx_r = -1; __pyx_L0:; __Pyx_XDECREF((PyObject *)__pyx_v_pyparser); __Pyx_XDECREF(__pyx_v_ex); __Pyx_RefNannyFinishContext(); return __pyx_r; } /* "aiohttp/_http_parser.pyx":609 * * * cdef int cb_on_status(cparser.http_parser* parser, # <<<<<<<<<<<<<< * const char *at, size_t length) except -1: * cdef HttpParser pyparser = parser.data */ static int __pyx_f_7aiohttp_12_http_parser_cb_on_status(struct http_parser *__pyx_v_parser, char const *__pyx_v_at, size_t __pyx_v_length) { struct __pyx_obj_7aiohttp_12_http_parser_HttpParser *__pyx_v_pyparser = 0; PyObject *__pyx_v_ex = NULL; int __pyx_r; __Pyx_RefNannyDeclarations PyObject *__pyx_t_1 = NULL; PyObject *__pyx_t_2 = NULL; PyObject *__pyx_t_3 = NULL; PyObject *__pyx_t_4 = NULL; int __pyx_t_5; PyObject *__pyx_t_6 = NULL; PyObject *__pyx_t_7 = NULL; PyObject *__pyx_t_8 = NULL; PyObject *__pyx_t_9 = NULL; int __pyx_t_10; PyObject *__pyx_t_11 = NULL; __Pyx_RefNannySetupContext("cb_on_status", 0); /* "aiohttp/_http_parser.pyx":611 * cdef int cb_on_status(cparser.http_parser* parser, * const char *at, size_t length) except -1: * cdef HttpParser pyparser = parser.data # <<<<<<<<<<<<<< * cdef str reason * try: */ __pyx_t_1 = ((PyObject *)__pyx_v_parser->data); __Pyx_INCREF(__pyx_t_1); __pyx_v_pyparser = ((struct __pyx_obj_7aiohttp_12_http_parser_HttpParser *)__pyx_t_1); __pyx_t_1 = 0; /* "aiohttp/_http_parser.pyx":613 * cdef HttpParser pyparser = parser.data * cdef str reason * try: # <<<<<<<<<<<<<< * if length > pyparser._max_line_size: * raise LineTooLong( */ { __Pyx_PyThreadState_declare __Pyx_PyThreadState_assign __Pyx_ExceptionSave(&__pyx_t_2, &__pyx_t_3, &__pyx_t_4); __Pyx_XGOTREF(__pyx_t_2); __Pyx_XGOTREF(__pyx_t_3); __Pyx_XGOTREF(__pyx_t_4); /*try:*/ { /* "aiohttp/_http_parser.pyx":614 * cdef str reason * try: * if length > pyparser._max_line_size: # <<<<<<<<<<<<<< * raise LineTooLong( * 'Status line is too long', pyparser._max_line_size, length) */ __pyx_t_5 = ((__pyx_v_length > __pyx_v_pyparser->_max_line_size) != 0); if (unlikely(__pyx_t_5)) { /* "aiohttp/_http_parser.pyx":615 * try: * if length > pyparser._max_line_size: * raise LineTooLong( # <<<<<<<<<<<<<< * 'Status line is too long', pyparser._max_line_size, length) * 
extend(pyparser._buf, at, length) */ __Pyx_GetModuleGlobalName(__pyx_t_6, __pyx_n_s_LineTooLong); if (unlikely(!__pyx_t_6)) __PYX_ERR(0, 615, __pyx_L3_error) __Pyx_GOTREF(__pyx_t_6); /* "aiohttp/_http_parser.pyx":616 * if length > pyparser._max_line_size: * raise LineTooLong( * 'Status line is too long', pyparser._max_line_size, length) # <<<<<<<<<<<<<< * extend(pyparser._buf, at, length) * except BaseException as ex: */ __pyx_t_7 = __Pyx_PyInt_FromSize_t(__pyx_v_pyparser->_max_line_size); if (unlikely(!__pyx_t_7)) __PYX_ERR(0, 616, __pyx_L3_error) __Pyx_GOTREF(__pyx_t_7); __pyx_t_8 = __Pyx_PyInt_FromSize_t(__pyx_v_length); if (unlikely(!__pyx_t_8)) __PYX_ERR(0, 616, __pyx_L3_error) __Pyx_GOTREF(__pyx_t_8); __pyx_t_9 = NULL; __pyx_t_10 = 0; if (CYTHON_UNPACK_METHODS && unlikely(PyMethod_Check(__pyx_t_6))) { __pyx_t_9 = PyMethod_GET_SELF(__pyx_t_6); if (likely(__pyx_t_9)) { PyObject* function = PyMethod_GET_FUNCTION(__pyx_t_6); __Pyx_INCREF(__pyx_t_9); __Pyx_INCREF(function); __Pyx_DECREF_SET(__pyx_t_6, function); __pyx_t_10 = 1; } } #if CYTHON_FAST_PYCALL if (PyFunction_Check(__pyx_t_6)) { PyObject *__pyx_temp[4] = {__pyx_t_9, __pyx_kp_u_Status_line_is_too_long, __pyx_t_7, __pyx_t_8}; __pyx_t_1 = __Pyx_PyFunction_FastCall(__pyx_t_6, __pyx_temp+1-__pyx_t_10, 3+__pyx_t_10); if (unlikely(!__pyx_t_1)) __PYX_ERR(0, 615, __pyx_L3_error) __Pyx_XDECREF(__pyx_t_9); __pyx_t_9 = 0; __Pyx_GOTREF(__pyx_t_1); __Pyx_DECREF(__pyx_t_7); __pyx_t_7 = 0; __Pyx_DECREF(__pyx_t_8); __pyx_t_8 = 0; } else #endif #if CYTHON_FAST_PYCCALL if (__Pyx_PyFastCFunction_Check(__pyx_t_6)) { PyObject *__pyx_temp[4] = {__pyx_t_9, __pyx_kp_u_Status_line_is_too_long, __pyx_t_7, __pyx_t_8}; __pyx_t_1 = __Pyx_PyCFunction_FastCall(__pyx_t_6, __pyx_temp+1-__pyx_t_10, 3+__pyx_t_10); if (unlikely(!__pyx_t_1)) __PYX_ERR(0, 615, __pyx_L3_error) __Pyx_XDECREF(__pyx_t_9); __pyx_t_9 = 0; __Pyx_GOTREF(__pyx_t_1); __Pyx_DECREF(__pyx_t_7); __pyx_t_7 = 0; __Pyx_DECREF(__pyx_t_8); __pyx_t_8 = 0; } else #endif { __pyx_t_11 = PyTuple_New(3+__pyx_t_10); if (unlikely(!__pyx_t_11)) __PYX_ERR(0, 615, __pyx_L3_error) __Pyx_GOTREF(__pyx_t_11); if (__pyx_t_9) { __Pyx_GIVEREF(__pyx_t_9); PyTuple_SET_ITEM(__pyx_t_11, 0, __pyx_t_9); __pyx_t_9 = NULL; } __Pyx_INCREF(__pyx_kp_u_Status_line_is_too_long); __Pyx_GIVEREF(__pyx_kp_u_Status_line_is_too_long); PyTuple_SET_ITEM(__pyx_t_11, 0+__pyx_t_10, __pyx_kp_u_Status_line_is_too_long); __Pyx_GIVEREF(__pyx_t_7); PyTuple_SET_ITEM(__pyx_t_11, 1+__pyx_t_10, __pyx_t_7); __Pyx_GIVEREF(__pyx_t_8); PyTuple_SET_ITEM(__pyx_t_11, 2+__pyx_t_10, __pyx_t_8); __pyx_t_7 = 0; __pyx_t_8 = 0; __pyx_t_1 = __Pyx_PyObject_Call(__pyx_t_6, __pyx_t_11, NULL); if (unlikely(!__pyx_t_1)) __PYX_ERR(0, 615, __pyx_L3_error) __Pyx_GOTREF(__pyx_t_1); __Pyx_DECREF(__pyx_t_11); __pyx_t_11 = 0; } __Pyx_DECREF(__pyx_t_6); __pyx_t_6 = 0; __Pyx_Raise(__pyx_t_1, 0, 0, 0); __Pyx_DECREF(__pyx_t_1); __pyx_t_1 = 0; __PYX_ERR(0, 615, __pyx_L3_error) /* "aiohttp/_http_parser.pyx":614 * cdef str reason * try: * if length > pyparser._max_line_size: # <<<<<<<<<<<<<< * raise LineTooLong( * 'Status line is too long', pyparser._max_line_size, length) */ } /* "aiohttp/_http_parser.pyx":617 * raise LineTooLong( * 'Status line is too long', pyparser._max_line_size, length) * extend(pyparser._buf, at, length) # <<<<<<<<<<<<<< * except BaseException as ex: * pyparser._last_error = ex */ __pyx_t_1 = __pyx_v_pyparser->_buf; __Pyx_INCREF(__pyx_t_1); __pyx_t_6 = __pyx_f_7aiohttp_12_http_parser_extend(__pyx_t_1, __pyx_v_at, __pyx_v_length); if (unlikely(!__pyx_t_6)) 
__PYX_ERR(0, 617, __pyx_L3_error) __Pyx_GOTREF(__pyx_t_6); __Pyx_DECREF(__pyx_t_1); __pyx_t_1 = 0; __Pyx_DECREF(__pyx_t_6); __pyx_t_6 = 0; /* "aiohttp/_http_parser.pyx":613 * cdef HttpParser pyparser = parser.data * cdef str reason * try: # <<<<<<<<<<<<<< * if length > pyparser._max_line_size: * raise LineTooLong( */ } /* "aiohttp/_http_parser.pyx":622 * return -1 * else: * return 0 # <<<<<<<<<<<<<< * * */ /*else:*/ { __pyx_r = 0; goto __pyx_L6_except_return; } __pyx_L3_error:; __Pyx_XDECREF(__pyx_t_1); __pyx_t_1 = 0; __Pyx_XDECREF(__pyx_t_11); __pyx_t_11 = 0; __Pyx_XDECREF(__pyx_t_6); __pyx_t_6 = 0; __Pyx_XDECREF(__pyx_t_7); __pyx_t_7 = 0; __Pyx_XDECREF(__pyx_t_8); __pyx_t_8 = 0; __Pyx_XDECREF(__pyx_t_9); __pyx_t_9 = 0; /* "aiohttp/_http_parser.pyx":618 * 'Status line is too long', pyparser._max_line_size, length) * extend(pyparser._buf, at, length) * except BaseException as ex: # <<<<<<<<<<<<<< * pyparser._last_error = ex * return -1 */ __pyx_t_10 = __Pyx_PyErr_ExceptionMatches(__pyx_builtin_BaseException); if (__pyx_t_10) { __Pyx_AddTraceback("aiohttp._http_parser.cb_on_status", __pyx_clineno, __pyx_lineno, __pyx_filename); if (__Pyx_GetException(&__pyx_t_6, &__pyx_t_1, &__pyx_t_11) < 0) __PYX_ERR(0, 618, __pyx_L5_except_error) __Pyx_GOTREF(__pyx_t_6); __Pyx_GOTREF(__pyx_t_1); __Pyx_GOTREF(__pyx_t_11); __Pyx_INCREF(__pyx_t_1); __pyx_v_ex = __pyx_t_1; /*try:*/ { /* "aiohttp/_http_parser.pyx":619 * extend(pyparser._buf, at, length) * except BaseException as ex: * pyparser._last_error = ex # <<<<<<<<<<<<<< * return -1 * else: */ __Pyx_INCREF(__pyx_v_ex); __Pyx_GIVEREF(__pyx_v_ex); __Pyx_GOTREF(__pyx_v_pyparser->_last_error); __Pyx_DECREF(__pyx_v_pyparser->_last_error); __pyx_v_pyparser->_last_error = __pyx_v_ex; /* "aiohttp/_http_parser.pyx":620 * except BaseException as ex: * pyparser._last_error = ex * return -1 # <<<<<<<<<<<<<< * else: * return 0 */ __pyx_r = -1; __Pyx_DECREF(__pyx_t_1); __pyx_t_1 = 0; __Pyx_DECREF(__pyx_t_6); __pyx_t_6 = 0; __Pyx_DECREF(__pyx_t_11); __pyx_t_11 = 0; goto __pyx_L14_return; } /* "aiohttp/_http_parser.pyx":618 * 'Status line is too long', pyparser._max_line_size, length) * extend(pyparser._buf, at, length) * except BaseException as ex: # <<<<<<<<<<<<<< * pyparser._last_error = ex * return -1 */ /*finally:*/ { __pyx_L14_return: { __pyx_t_10 = __pyx_r; __Pyx_DECREF(__pyx_v_ex); __pyx_v_ex = NULL; __pyx_r = __pyx_t_10; goto __pyx_L6_except_return; } } } goto __pyx_L5_except_error; __pyx_L5_except_error:; /* "aiohttp/_http_parser.pyx":613 * cdef HttpParser pyparser = parser.data * cdef str reason * try: # <<<<<<<<<<<<<< * if length > pyparser._max_line_size: * raise LineTooLong( */ __Pyx_XGIVEREF(__pyx_t_2); __Pyx_XGIVEREF(__pyx_t_3); __Pyx_XGIVEREF(__pyx_t_4); __Pyx_ExceptionReset(__pyx_t_2, __pyx_t_3, __pyx_t_4); goto __pyx_L1_error; __pyx_L6_except_return:; __Pyx_XGIVEREF(__pyx_t_2); __Pyx_XGIVEREF(__pyx_t_3); __Pyx_XGIVEREF(__pyx_t_4); __Pyx_ExceptionReset(__pyx_t_2, __pyx_t_3, __pyx_t_4); goto __pyx_L0; } /* "aiohttp/_http_parser.pyx":609 * * * cdef int cb_on_status(cparser.http_parser* parser, # <<<<<<<<<<<<<< * const char *at, size_t length) except -1: * cdef HttpParser pyparser = parser.data */ /* function exit code */ __pyx_L1_error:; __Pyx_XDECREF(__pyx_t_1); __Pyx_XDECREF(__pyx_t_6); __Pyx_XDECREF(__pyx_t_7); __Pyx_XDECREF(__pyx_t_8); __Pyx_XDECREF(__pyx_t_9); __Pyx_XDECREF(__pyx_t_11); __Pyx_AddTraceback("aiohttp._http_parser.cb_on_status", __pyx_clineno, __pyx_lineno, __pyx_filename); __pyx_r = -1; __pyx_L0:; __Pyx_XDECREF((PyObject 
*)__pyx_v_pyparser); __Pyx_XDECREF(__pyx_v_ex); __Pyx_RefNannyFinishContext(); return __pyx_r; } /* "aiohttp/_http_parser.pyx":625 * * * cdef int cb_on_header_field(cparser.http_parser* parser, # <<<<<<<<<<<<<< * const char *at, size_t length) except -1: * cdef HttpParser pyparser = parser.data */ static int __pyx_f_7aiohttp_12_http_parser_cb_on_header_field(struct http_parser *__pyx_v_parser, char const *__pyx_v_at, size_t __pyx_v_length) { struct __pyx_obj_7aiohttp_12_http_parser_HttpParser *__pyx_v_pyparser = 0; Py_ssize_t __pyx_v_size; PyObject *__pyx_v_ex = NULL; int __pyx_r; __Pyx_RefNannyDeclarations PyObject *__pyx_t_1 = NULL; PyObject *__pyx_t_2 = NULL; PyObject *__pyx_t_3 = NULL; PyObject *__pyx_t_4 = NULL; Py_ssize_t __pyx_t_5; int __pyx_t_6; PyObject *__pyx_t_7 = NULL; PyObject *__pyx_t_8 = NULL; PyObject *__pyx_t_9 = NULL; PyObject *__pyx_t_10 = NULL; int __pyx_t_11; PyObject *__pyx_t_12 = NULL; __Pyx_RefNannySetupContext("cb_on_header_field", 0); /* "aiohttp/_http_parser.pyx":627 * cdef int cb_on_header_field(cparser.http_parser* parser, * const char *at, size_t length) except -1: * cdef HttpParser pyparser = parser.data # <<<<<<<<<<<<<< * cdef Py_ssize_t size * try: */ __pyx_t_1 = ((PyObject *)__pyx_v_parser->data); __Pyx_INCREF(__pyx_t_1); __pyx_v_pyparser = ((struct __pyx_obj_7aiohttp_12_http_parser_HttpParser *)__pyx_t_1); __pyx_t_1 = 0; /* "aiohttp/_http_parser.pyx":629 * cdef HttpParser pyparser = parser.data * cdef Py_ssize_t size * try: # <<<<<<<<<<<<<< * pyparser._on_status_complete() * size = len(pyparser._raw_name) + length */ { __Pyx_PyThreadState_declare __Pyx_PyThreadState_assign __Pyx_ExceptionSave(&__pyx_t_2, &__pyx_t_3, &__pyx_t_4); __Pyx_XGOTREF(__pyx_t_2); __Pyx_XGOTREF(__pyx_t_3); __Pyx_XGOTREF(__pyx_t_4); /*try:*/ { /* "aiohttp/_http_parser.pyx":630 * cdef Py_ssize_t size * try: * pyparser._on_status_complete() # <<<<<<<<<<<<<< * size = len(pyparser._raw_name) + length * if size > pyparser._max_field_size: */ __pyx_t_1 = ((struct __pyx_vtabstruct_7aiohttp_12_http_parser_HttpParser *)__pyx_v_pyparser->__pyx_vtab)->_on_status_complete(__pyx_v_pyparser); if (unlikely(!__pyx_t_1)) __PYX_ERR(0, 630, __pyx_L3_error) __Pyx_GOTREF(__pyx_t_1); __Pyx_DECREF(__pyx_t_1); __pyx_t_1 = 0; /* "aiohttp/_http_parser.pyx":631 * try: * pyparser._on_status_complete() * size = len(pyparser._raw_name) + length # <<<<<<<<<<<<<< * if size > pyparser._max_field_size: * raise LineTooLong( */ __pyx_t_1 = __pyx_v_pyparser->_raw_name; __Pyx_INCREF(__pyx_t_1); if (unlikely(__pyx_t_1 == Py_None)) { PyErr_SetString(PyExc_TypeError, "object of type 'NoneType' has no len()"); __PYX_ERR(0, 631, __pyx_L3_error) } __pyx_t_5 = PyByteArray_GET_SIZE(__pyx_t_1); if (unlikely(__pyx_t_5 == ((Py_ssize_t)-1))) __PYX_ERR(0, 631, __pyx_L3_error) __Pyx_DECREF(__pyx_t_1); __pyx_t_1 = 0; __pyx_v_size = (__pyx_t_5 + __pyx_v_length); /* "aiohttp/_http_parser.pyx":632 * pyparser._on_status_complete() * size = len(pyparser._raw_name) + length * if size > pyparser._max_field_size: # <<<<<<<<<<<<<< * raise LineTooLong( * 'Header name is too long', pyparser._max_field_size, size) */ __pyx_t_6 = ((__pyx_v_size > __pyx_v_pyparser->_max_field_size) != 0); if (unlikely(__pyx_t_6)) { /* "aiohttp/_http_parser.pyx":633 * size = len(pyparser._raw_name) + length * if size > pyparser._max_field_size: * raise LineTooLong( # <<<<<<<<<<<<<< * 'Header name is too long', pyparser._max_field_size, size) * pyparser._on_header_field(at, length) */ __Pyx_GetModuleGlobalName(__pyx_t_7, __pyx_n_s_LineTooLong); if 
(unlikely(!__pyx_t_7)) __PYX_ERR(0, 633, __pyx_L3_error) __Pyx_GOTREF(__pyx_t_7); /* "aiohttp/_http_parser.pyx":634 * if size > pyparser._max_field_size: * raise LineTooLong( * 'Header name is too long', pyparser._max_field_size, size) # <<<<<<<<<<<<<< * pyparser._on_header_field(at, length) * except BaseException as ex: */ __pyx_t_8 = __Pyx_PyInt_FromSize_t(__pyx_v_pyparser->_max_field_size); if (unlikely(!__pyx_t_8)) __PYX_ERR(0, 634, __pyx_L3_error) __Pyx_GOTREF(__pyx_t_8); __pyx_t_9 = PyInt_FromSsize_t(__pyx_v_size); if (unlikely(!__pyx_t_9)) __PYX_ERR(0, 634, __pyx_L3_error) __Pyx_GOTREF(__pyx_t_9); __pyx_t_10 = NULL; __pyx_t_11 = 0; if (CYTHON_UNPACK_METHODS && unlikely(PyMethod_Check(__pyx_t_7))) { __pyx_t_10 = PyMethod_GET_SELF(__pyx_t_7); if (likely(__pyx_t_10)) { PyObject* function = PyMethod_GET_FUNCTION(__pyx_t_7); __Pyx_INCREF(__pyx_t_10); __Pyx_INCREF(function); __Pyx_DECREF_SET(__pyx_t_7, function); __pyx_t_11 = 1; } } #if CYTHON_FAST_PYCALL if (PyFunction_Check(__pyx_t_7)) { PyObject *__pyx_temp[4] = {__pyx_t_10, __pyx_kp_u_Header_name_is_too_long, __pyx_t_8, __pyx_t_9}; __pyx_t_1 = __Pyx_PyFunction_FastCall(__pyx_t_7, __pyx_temp+1-__pyx_t_11, 3+__pyx_t_11); if (unlikely(!__pyx_t_1)) __PYX_ERR(0, 633, __pyx_L3_error) __Pyx_XDECREF(__pyx_t_10); __pyx_t_10 = 0; __Pyx_GOTREF(__pyx_t_1); __Pyx_DECREF(__pyx_t_8); __pyx_t_8 = 0; __Pyx_DECREF(__pyx_t_9); __pyx_t_9 = 0; } else #endif #if CYTHON_FAST_PYCCALL if (__Pyx_PyFastCFunction_Check(__pyx_t_7)) { PyObject *__pyx_temp[4] = {__pyx_t_10, __pyx_kp_u_Header_name_is_too_long, __pyx_t_8, __pyx_t_9}; __pyx_t_1 = __Pyx_PyCFunction_FastCall(__pyx_t_7, __pyx_temp+1-__pyx_t_11, 3+__pyx_t_11); if (unlikely(!__pyx_t_1)) __PYX_ERR(0, 633, __pyx_L3_error) __Pyx_XDECREF(__pyx_t_10); __pyx_t_10 = 0; __Pyx_GOTREF(__pyx_t_1); __Pyx_DECREF(__pyx_t_8); __pyx_t_8 = 0; __Pyx_DECREF(__pyx_t_9); __pyx_t_9 = 0; } else #endif { __pyx_t_12 = PyTuple_New(3+__pyx_t_11); if (unlikely(!__pyx_t_12)) __PYX_ERR(0, 633, __pyx_L3_error) __Pyx_GOTREF(__pyx_t_12); if (__pyx_t_10) { __Pyx_GIVEREF(__pyx_t_10); PyTuple_SET_ITEM(__pyx_t_12, 0, __pyx_t_10); __pyx_t_10 = NULL; } __Pyx_INCREF(__pyx_kp_u_Header_name_is_too_long); __Pyx_GIVEREF(__pyx_kp_u_Header_name_is_too_long); PyTuple_SET_ITEM(__pyx_t_12, 0+__pyx_t_11, __pyx_kp_u_Header_name_is_too_long); __Pyx_GIVEREF(__pyx_t_8); PyTuple_SET_ITEM(__pyx_t_12, 1+__pyx_t_11, __pyx_t_8); __Pyx_GIVEREF(__pyx_t_9); PyTuple_SET_ITEM(__pyx_t_12, 2+__pyx_t_11, __pyx_t_9); __pyx_t_8 = 0; __pyx_t_9 = 0; __pyx_t_1 = __Pyx_PyObject_Call(__pyx_t_7, __pyx_t_12, NULL); if (unlikely(!__pyx_t_1)) __PYX_ERR(0, 633, __pyx_L3_error) __Pyx_GOTREF(__pyx_t_1); __Pyx_DECREF(__pyx_t_12); __pyx_t_12 = 0; } __Pyx_DECREF(__pyx_t_7); __pyx_t_7 = 0; __Pyx_Raise(__pyx_t_1, 0, 0, 0); __Pyx_DECREF(__pyx_t_1); __pyx_t_1 = 0; __PYX_ERR(0, 633, __pyx_L3_error) /* "aiohttp/_http_parser.pyx":632 * pyparser._on_status_complete() * size = len(pyparser._raw_name) + length * if size > pyparser._max_field_size: # <<<<<<<<<<<<<< * raise LineTooLong( * 'Header name is too long', pyparser._max_field_size, size) */ } /* "aiohttp/_http_parser.pyx":635 * raise LineTooLong( * 'Header name is too long', pyparser._max_field_size, size) * pyparser._on_header_field(at, length) # <<<<<<<<<<<<<< * except BaseException as ex: * pyparser._last_error = ex */ __pyx_t_1 = ((struct __pyx_vtabstruct_7aiohttp_12_http_parser_HttpParser *)__pyx_v_pyparser->__pyx_vtab)->_on_header_field(__pyx_v_pyparser, __pyx_v_at, __pyx_v_length); if (unlikely(!__pyx_t_1)) __PYX_ERR(0, 635, 
__pyx_L3_error) __Pyx_GOTREF(__pyx_t_1); __Pyx_DECREF(__pyx_t_1); __pyx_t_1 = 0; /* "aiohttp/_http_parser.pyx":629 * cdef HttpParser pyparser = parser.data * cdef Py_ssize_t size * try: # <<<<<<<<<<<<<< * pyparser._on_status_complete() * size = len(pyparser._raw_name) + length */ } /* "aiohttp/_http_parser.pyx":640 * return -1 * else: * return 0 # <<<<<<<<<<<<<< * * */ /*else:*/ { __pyx_r = 0; goto __pyx_L6_except_return; } __pyx_L3_error:; __Pyx_XDECREF(__pyx_t_1); __pyx_t_1 = 0; __Pyx_XDECREF(__pyx_t_10); __pyx_t_10 = 0; __Pyx_XDECREF(__pyx_t_12); __pyx_t_12 = 0; __Pyx_XDECREF(__pyx_t_7); __pyx_t_7 = 0; __Pyx_XDECREF(__pyx_t_8); __pyx_t_8 = 0; __Pyx_XDECREF(__pyx_t_9); __pyx_t_9 = 0; /* "aiohttp/_http_parser.pyx":636 * 'Header name is too long', pyparser._max_field_size, size) * pyparser._on_header_field(at, length) * except BaseException as ex: # <<<<<<<<<<<<<< * pyparser._last_error = ex * return -1 */ __pyx_t_11 = __Pyx_PyErr_ExceptionMatches(__pyx_builtin_BaseException); if (__pyx_t_11) { __Pyx_AddTraceback("aiohttp._http_parser.cb_on_header_field", __pyx_clineno, __pyx_lineno, __pyx_filename); if (__Pyx_GetException(&__pyx_t_1, &__pyx_t_7, &__pyx_t_12) < 0) __PYX_ERR(0, 636, __pyx_L5_except_error) __Pyx_GOTREF(__pyx_t_1); __Pyx_GOTREF(__pyx_t_7); __Pyx_GOTREF(__pyx_t_12); __Pyx_INCREF(__pyx_t_7); __pyx_v_ex = __pyx_t_7; /*try:*/ { /* "aiohttp/_http_parser.pyx":637 * pyparser._on_header_field(at, length) * except BaseException as ex: * pyparser._last_error = ex # <<<<<<<<<<<<<< * return -1 * else: */ __Pyx_INCREF(__pyx_v_ex); __Pyx_GIVEREF(__pyx_v_ex); __Pyx_GOTREF(__pyx_v_pyparser->_last_error); __Pyx_DECREF(__pyx_v_pyparser->_last_error); __pyx_v_pyparser->_last_error = __pyx_v_ex; /* "aiohttp/_http_parser.pyx":638 * except BaseException as ex: * pyparser._last_error = ex * return -1 # <<<<<<<<<<<<<< * else: * return 0 */ __pyx_r = -1; __Pyx_DECREF(__pyx_t_1); __pyx_t_1 = 0; __Pyx_DECREF(__pyx_t_7); __pyx_t_7 = 0; __Pyx_DECREF(__pyx_t_12); __pyx_t_12 = 0; goto __pyx_L14_return; } /* "aiohttp/_http_parser.pyx":636 * 'Header name is too long', pyparser._max_field_size, size) * pyparser._on_header_field(at, length) * except BaseException as ex: # <<<<<<<<<<<<<< * pyparser._last_error = ex * return -1 */ /*finally:*/ { __pyx_L14_return: { __pyx_t_11 = __pyx_r; __Pyx_DECREF(__pyx_v_ex); __pyx_v_ex = NULL; __pyx_r = __pyx_t_11; goto __pyx_L6_except_return; } } } goto __pyx_L5_except_error; __pyx_L5_except_error:; /* "aiohttp/_http_parser.pyx":629 * cdef HttpParser pyparser = parser.data * cdef Py_ssize_t size * try: # <<<<<<<<<<<<<< * pyparser._on_status_complete() * size = len(pyparser._raw_name) + length */ __Pyx_XGIVEREF(__pyx_t_2); __Pyx_XGIVEREF(__pyx_t_3); __Pyx_XGIVEREF(__pyx_t_4); __Pyx_ExceptionReset(__pyx_t_2, __pyx_t_3, __pyx_t_4); goto __pyx_L1_error; __pyx_L6_except_return:; __Pyx_XGIVEREF(__pyx_t_2); __Pyx_XGIVEREF(__pyx_t_3); __Pyx_XGIVEREF(__pyx_t_4); __Pyx_ExceptionReset(__pyx_t_2, __pyx_t_3, __pyx_t_4); goto __pyx_L0; } /* "aiohttp/_http_parser.pyx":625 * * * cdef int cb_on_header_field(cparser.http_parser* parser, # <<<<<<<<<<<<<< * const char *at, size_t length) except -1: * cdef HttpParser pyparser = parser.data */ /* function exit code */ __pyx_L1_error:; __Pyx_XDECREF(__pyx_t_1); __Pyx_XDECREF(__pyx_t_7); __Pyx_XDECREF(__pyx_t_8); __Pyx_XDECREF(__pyx_t_9); __Pyx_XDECREF(__pyx_t_10); __Pyx_XDECREF(__pyx_t_12); __Pyx_AddTraceback("aiohttp._http_parser.cb_on_header_field", __pyx_clineno, __pyx_lineno, __pyx_filename); __pyx_r = -1; __pyx_L0:; __Pyx_XDECREF((PyObject 
*)__pyx_v_pyparser); __Pyx_XDECREF(__pyx_v_ex); __Pyx_RefNannyFinishContext(); return __pyx_r; } /* "aiohttp/_http_parser.pyx":643 * * * cdef int cb_on_header_value(cparser.http_parser* parser, # <<<<<<<<<<<<<< * const char *at, size_t length) except -1: * cdef HttpParser pyparser = parser.data */ static int __pyx_f_7aiohttp_12_http_parser_cb_on_header_value(struct http_parser *__pyx_v_parser, char const *__pyx_v_at, size_t __pyx_v_length) { struct __pyx_obj_7aiohttp_12_http_parser_HttpParser *__pyx_v_pyparser = 0; Py_ssize_t __pyx_v_size; PyObject *__pyx_v_ex = NULL; int __pyx_r; __Pyx_RefNannyDeclarations PyObject *__pyx_t_1 = NULL; PyObject *__pyx_t_2 = NULL; PyObject *__pyx_t_3 = NULL; PyObject *__pyx_t_4 = NULL; Py_ssize_t __pyx_t_5; int __pyx_t_6; PyObject *__pyx_t_7 = NULL; PyObject *__pyx_t_8 = NULL; PyObject *__pyx_t_9 = NULL; PyObject *__pyx_t_10 = NULL; int __pyx_t_11; PyObject *__pyx_t_12 = NULL; __Pyx_RefNannySetupContext("cb_on_header_value", 0); /* "aiohttp/_http_parser.pyx":645 * cdef int cb_on_header_value(cparser.http_parser* parser, * const char *at, size_t length) except -1: * cdef HttpParser pyparser = parser.data # <<<<<<<<<<<<<< * cdef Py_ssize_t size * try: */ __pyx_t_1 = ((PyObject *)__pyx_v_parser->data); __Pyx_INCREF(__pyx_t_1); __pyx_v_pyparser = ((struct __pyx_obj_7aiohttp_12_http_parser_HttpParser *)__pyx_t_1); __pyx_t_1 = 0; /* "aiohttp/_http_parser.pyx":647 * cdef HttpParser pyparser = parser.data * cdef Py_ssize_t size * try: # <<<<<<<<<<<<<< * size = len(pyparser._raw_value) + length * if size > pyparser._max_field_size: */ { __Pyx_PyThreadState_declare __Pyx_PyThreadState_assign __Pyx_ExceptionSave(&__pyx_t_2, &__pyx_t_3, &__pyx_t_4); __Pyx_XGOTREF(__pyx_t_2); __Pyx_XGOTREF(__pyx_t_3); __Pyx_XGOTREF(__pyx_t_4); /*try:*/ { /* "aiohttp/_http_parser.pyx":648 * cdef Py_ssize_t size * try: * size = len(pyparser._raw_value) + length # <<<<<<<<<<<<<< * if size > pyparser._max_field_size: * raise LineTooLong( */ __pyx_t_1 = __pyx_v_pyparser->_raw_value; __Pyx_INCREF(__pyx_t_1); if (unlikely(__pyx_t_1 == Py_None)) { PyErr_SetString(PyExc_TypeError, "object of type 'NoneType' has no len()"); __PYX_ERR(0, 648, __pyx_L3_error) } __pyx_t_5 = PyByteArray_GET_SIZE(__pyx_t_1); if (unlikely(__pyx_t_5 == ((Py_ssize_t)-1))) __PYX_ERR(0, 648, __pyx_L3_error) __Pyx_DECREF(__pyx_t_1); __pyx_t_1 = 0; __pyx_v_size = (__pyx_t_5 + __pyx_v_length); /* "aiohttp/_http_parser.pyx":649 * try: * size = len(pyparser._raw_value) + length * if size > pyparser._max_field_size: # <<<<<<<<<<<<<< * raise LineTooLong( * 'Header value is too long', pyparser._max_field_size, size) */ __pyx_t_6 = ((__pyx_v_size > __pyx_v_pyparser->_max_field_size) != 0); if (unlikely(__pyx_t_6)) { /* "aiohttp/_http_parser.pyx":650 * size = len(pyparser._raw_value) + length * if size > pyparser._max_field_size: * raise LineTooLong( # <<<<<<<<<<<<<< * 'Header value is too long', pyparser._max_field_size, size) * pyparser._on_header_value(at, length) */ __Pyx_GetModuleGlobalName(__pyx_t_7, __pyx_n_s_LineTooLong); if (unlikely(!__pyx_t_7)) __PYX_ERR(0, 650, __pyx_L3_error) __Pyx_GOTREF(__pyx_t_7); /* "aiohttp/_http_parser.pyx":651 * if size > pyparser._max_field_size: * raise LineTooLong( * 'Header value is too long', pyparser._max_field_size, size) # <<<<<<<<<<<<<< * pyparser._on_header_value(at, length) * except BaseException as ex: */ __pyx_t_8 = __Pyx_PyInt_FromSize_t(__pyx_v_pyparser->_max_field_size); if (unlikely(!__pyx_t_8)) __PYX_ERR(0, 651, __pyx_L3_error) __Pyx_GOTREF(__pyx_t_8); __pyx_t_9 = 
PyInt_FromSsize_t(__pyx_v_size); if (unlikely(!__pyx_t_9)) __PYX_ERR(0, 651, __pyx_L3_error) __Pyx_GOTREF(__pyx_t_9); __pyx_t_10 = NULL; __pyx_t_11 = 0; if (CYTHON_UNPACK_METHODS && unlikely(PyMethod_Check(__pyx_t_7))) { __pyx_t_10 = PyMethod_GET_SELF(__pyx_t_7); if (likely(__pyx_t_10)) { PyObject* function = PyMethod_GET_FUNCTION(__pyx_t_7); __Pyx_INCREF(__pyx_t_10); __Pyx_INCREF(function); __Pyx_DECREF_SET(__pyx_t_7, function); __pyx_t_11 = 1; } } #if CYTHON_FAST_PYCALL if (PyFunction_Check(__pyx_t_7)) { PyObject *__pyx_temp[4] = {__pyx_t_10, __pyx_kp_u_Header_value_is_too_long, __pyx_t_8, __pyx_t_9}; __pyx_t_1 = __Pyx_PyFunction_FastCall(__pyx_t_7, __pyx_temp+1-__pyx_t_11, 3+__pyx_t_11); if (unlikely(!__pyx_t_1)) __PYX_ERR(0, 650, __pyx_L3_error) __Pyx_XDECREF(__pyx_t_10); __pyx_t_10 = 0; __Pyx_GOTREF(__pyx_t_1); __Pyx_DECREF(__pyx_t_8); __pyx_t_8 = 0; __Pyx_DECREF(__pyx_t_9); __pyx_t_9 = 0; } else #endif #if CYTHON_FAST_PYCCALL if (__Pyx_PyFastCFunction_Check(__pyx_t_7)) { PyObject *__pyx_temp[4] = {__pyx_t_10, __pyx_kp_u_Header_value_is_too_long, __pyx_t_8, __pyx_t_9}; __pyx_t_1 = __Pyx_PyCFunction_FastCall(__pyx_t_7, __pyx_temp+1-__pyx_t_11, 3+__pyx_t_11); if (unlikely(!__pyx_t_1)) __PYX_ERR(0, 650, __pyx_L3_error) __Pyx_XDECREF(__pyx_t_10); __pyx_t_10 = 0; __Pyx_GOTREF(__pyx_t_1); __Pyx_DECREF(__pyx_t_8); __pyx_t_8 = 0; __Pyx_DECREF(__pyx_t_9); __pyx_t_9 = 0; } else #endif { __pyx_t_12 = PyTuple_New(3+__pyx_t_11); if (unlikely(!__pyx_t_12)) __PYX_ERR(0, 650, __pyx_L3_error) __Pyx_GOTREF(__pyx_t_12); if (__pyx_t_10) { __Pyx_GIVEREF(__pyx_t_10); PyTuple_SET_ITEM(__pyx_t_12, 0, __pyx_t_10); __pyx_t_10 = NULL; } __Pyx_INCREF(__pyx_kp_u_Header_value_is_too_long); __Pyx_GIVEREF(__pyx_kp_u_Header_value_is_too_long); PyTuple_SET_ITEM(__pyx_t_12, 0+__pyx_t_11, __pyx_kp_u_Header_value_is_too_long); __Pyx_GIVEREF(__pyx_t_8); PyTuple_SET_ITEM(__pyx_t_12, 1+__pyx_t_11, __pyx_t_8); __Pyx_GIVEREF(__pyx_t_9); PyTuple_SET_ITEM(__pyx_t_12, 2+__pyx_t_11, __pyx_t_9); __pyx_t_8 = 0; __pyx_t_9 = 0; __pyx_t_1 = __Pyx_PyObject_Call(__pyx_t_7, __pyx_t_12, NULL); if (unlikely(!__pyx_t_1)) __PYX_ERR(0, 650, __pyx_L3_error) __Pyx_GOTREF(__pyx_t_1); __Pyx_DECREF(__pyx_t_12); __pyx_t_12 = 0; } __Pyx_DECREF(__pyx_t_7); __pyx_t_7 = 0; __Pyx_Raise(__pyx_t_1, 0, 0, 0); __Pyx_DECREF(__pyx_t_1); __pyx_t_1 = 0; __PYX_ERR(0, 650, __pyx_L3_error) /* "aiohttp/_http_parser.pyx":649 * try: * size = len(pyparser._raw_value) + length * if size > pyparser._max_field_size: # <<<<<<<<<<<<<< * raise LineTooLong( * 'Header value is too long', pyparser._max_field_size, size) */ } /* "aiohttp/_http_parser.pyx":652 * raise LineTooLong( * 'Header value is too long', pyparser._max_field_size, size) * pyparser._on_header_value(at, length) # <<<<<<<<<<<<<< * except BaseException as ex: * pyparser._last_error = ex */ __pyx_t_1 = ((struct __pyx_vtabstruct_7aiohttp_12_http_parser_HttpParser *)__pyx_v_pyparser->__pyx_vtab)->_on_header_value(__pyx_v_pyparser, __pyx_v_at, __pyx_v_length); if (unlikely(!__pyx_t_1)) __PYX_ERR(0, 652, __pyx_L3_error) __Pyx_GOTREF(__pyx_t_1); __Pyx_DECREF(__pyx_t_1); __pyx_t_1 = 0; /* "aiohttp/_http_parser.pyx":647 * cdef HttpParser pyparser = parser.data * cdef Py_ssize_t size * try: # <<<<<<<<<<<<<< * size = len(pyparser._raw_value) + length * if size > pyparser._max_field_size: */ } /* "aiohttp/_http_parser.pyx":657 * return -1 * else: * return 0 # <<<<<<<<<<<<<< * * */ /*else:*/ { __pyx_r = 0; goto __pyx_L6_except_return; } __pyx_L3_error:; __Pyx_XDECREF(__pyx_t_1); __pyx_t_1 = 0; __Pyx_XDECREF(__pyx_t_10); 
__pyx_t_10 = 0; __Pyx_XDECREF(__pyx_t_12); __pyx_t_12 = 0; __Pyx_XDECREF(__pyx_t_7); __pyx_t_7 = 0; __Pyx_XDECREF(__pyx_t_8); __pyx_t_8 = 0; __Pyx_XDECREF(__pyx_t_9); __pyx_t_9 = 0; /* "aiohttp/_http_parser.pyx":653 * 'Header value is too long', pyparser._max_field_size, size) * pyparser._on_header_value(at, length) * except BaseException as ex: # <<<<<<<<<<<<<< * pyparser._last_error = ex * return -1 */ __pyx_t_11 = __Pyx_PyErr_ExceptionMatches(__pyx_builtin_BaseException); if (__pyx_t_11) { __Pyx_AddTraceback("aiohttp._http_parser.cb_on_header_value", __pyx_clineno, __pyx_lineno, __pyx_filename); if (__Pyx_GetException(&__pyx_t_1, &__pyx_t_7, &__pyx_t_12) < 0) __PYX_ERR(0, 653, __pyx_L5_except_error) __Pyx_GOTREF(__pyx_t_1); __Pyx_GOTREF(__pyx_t_7); __Pyx_GOTREF(__pyx_t_12); __Pyx_INCREF(__pyx_t_7); __pyx_v_ex = __pyx_t_7; /*try:*/ { /* "aiohttp/_http_parser.pyx":654 * pyparser._on_header_value(at, length) * except BaseException as ex: * pyparser._last_error = ex # <<<<<<<<<<<<<< * return -1 * else: */ __Pyx_INCREF(__pyx_v_ex); __Pyx_GIVEREF(__pyx_v_ex); __Pyx_GOTREF(__pyx_v_pyparser->_last_error); __Pyx_DECREF(__pyx_v_pyparser->_last_error); __pyx_v_pyparser->_last_error = __pyx_v_ex; /* "aiohttp/_http_parser.pyx":655 * except BaseException as ex: * pyparser._last_error = ex * return -1 # <<<<<<<<<<<<<< * else: * return 0 */ __pyx_r = -1; __Pyx_DECREF(__pyx_t_1); __pyx_t_1 = 0; __Pyx_DECREF(__pyx_t_7); __pyx_t_7 = 0; __Pyx_DECREF(__pyx_t_12); __pyx_t_12 = 0; goto __pyx_L14_return; } /* "aiohttp/_http_parser.pyx":653 * 'Header value is too long', pyparser._max_field_size, size) * pyparser._on_header_value(at, length) * except BaseException as ex: # <<<<<<<<<<<<<< * pyparser._last_error = ex * return -1 */ /*finally:*/ { __pyx_L14_return: { __pyx_t_11 = __pyx_r; __Pyx_DECREF(__pyx_v_ex); __pyx_v_ex = NULL; __pyx_r = __pyx_t_11; goto __pyx_L6_except_return; } } } goto __pyx_L5_except_error; __pyx_L5_except_error:; /* "aiohttp/_http_parser.pyx":647 * cdef HttpParser pyparser = parser.data * cdef Py_ssize_t size * try: # <<<<<<<<<<<<<< * size = len(pyparser._raw_value) + length * if size > pyparser._max_field_size: */ __Pyx_XGIVEREF(__pyx_t_2); __Pyx_XGIVEREF(__pyx_t_3); __Pyx_XGIVEREF(__pyx_t_4); __Pyx_ExceptionReset(__pyx_t_2, __pyx_t_3, __pyx_t_4); goto __pyx_L1_error; __pyx_L6_except_return:; __Pyx_XGIVEREF(__pyx_t_2); __Pyx_XGIVEREF(__pyx_t_3); __Pyx_XGIVEREF(__pyx_t_4); __Pyx_ExceptionReset(__pyx_t_2, __pyx_t_3, __pyx_t_4); goto __pyx_L0; } /* "aiohttp/_http_parser.pyx":643 * * * cdef int cb_on_header_value(cparser.http_parser* parser, # <<<<<<<<<<<<<< * const char *at, size_t length) except -1: * cdef HttpParser pyparser = parser.data */ /* function exit code */ __pyx_L1_error:; __Pyx_XDECREF(__pyx_t_1); __Pyx_XDECREF(__pyx_t_7); __Pyx_XDECREF(__pyx_t_8); __Pyx_XDECREF(__pyx_t_9); __Pyx_XDECREF(__pyx_t_10); __Pyx_XDECREF(__pyx_t_12); __Pyx_AddTraceback("aiohttp._http_parser.cb_on_header_value", __pyx_clineno, __pyx_lineno, __pyx_filename); __pyx_r = -1; __pyx_L0:; __Pyx_XDECREF((PyObject *)__pyx_v_pyparser); __Pyx_XDECREF(__pyx_v_ex); __Pyx_RefNannyFinishContext(); return __pyx_r; } /* "aiohttp/_http_parser.pyx":660 * * * cdef int cb_on_headers_complete(cparser.http_parser* parser) except -1: # <<<<<<<<<<<<<< * cdef HttpParser pyparser = parser.data * try: */ static int __pyx_f_7aiohttp_12_http_parser_cb_on_headers_complete(struct http_parser *__pyx_v_parser) { struct __pyx_obj_7aiohttp_12_http_parser_HttpParser *__pyx_v_pyparser = 0; PyObject *__pyx_v_exc = NULL; int __pyx_r; 
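/* The two header callbacks above (cb_on_header_field / cb_on_header_value) follow one
   pattern: grow the pending raw header name or value, enforce pyparser._max_field_size
   before appending, and translate any Python exception into the C-level error protocol
   (store it in pyparser._last_error and return -1).  A consolidated view of the Cython
   source, rebuilt from the ".pyx" banner comments above; the <HttpParser> cast on
   parser.data does not survive the banners and is restored here as an assumption:

       cdef int cb_on_header_field(cparser.http_parser* parser,
                                   const char *at, size_t length) except -1:
           cdef HttpParser pyparser = <HttpParser>parser.data
           cdef Py_ssize_t size
           try:
               pyparser._on_status_complete()
               size = len(pyparser._raw_name) + length
               if size > pyparser._max_field_size:
                   raise LineTooLong(
                       'Header name is too long', pyparser._max_field_size, size)
               pyparser._on_header_field(at, length)
           except BaseException as ex:
               pyparser._last_error = ex
               return -1
           else:
               return 0

   cb_on_header_value differs only in accumulating pyparser._raw_value, reporting
   'Header value is too long' and calling pyparser._on_header_value(at, length). */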
__Pyx_RefNannyDeclarations PyObject *__pyx_t_1 = NULL; PyObject *__pyx_t_2 = NULL; PyObject *__pyx_t_3 = NULL; PyObject *__pyx_t_4 = NULL; int __pyx_t_5; int __pyx_t_6; int __pyx_t_7; PyObject *__pyx_t_8 = NULL; PyObject *__pyx_t_9 = NULL; __Pyx_RefNannySetupContext("cb_on_headers_complete", 0); /* "aiohttp/_http_parser.pyx":661 * * cdef int cb_on_headers_complete(cparser.http_parser* parser) except -1: * cdef HttpParser pyparser = parser.data # <<<<<<<<<<<<<< * try: * pyparser._on_status_complete() */ __pyx_t_1 = ((PyObject *)__pyx_v_parser->data); __Pyx_INCREF(__pyx_t_1); __pyx_v_pyparser = ((struct __pyx_obj_7aiohttp_12_http_parser_HttpParser *)__pyx_t_1); __pyx_t_1 = 0; /* "aiohttp/_http_parser.pyx":662 * cdef int cb_on_headers_complete(cparser.http_parser* parser) except -1: * cdef HttpParser pyparser = parser.data * try: # <<<<<<<<<<<<<< * pyparser._on_status_complete() * pyparser._on_headers_complete() */ { __Pyx_PyThreadState_declare __Pyx_PyThreadState_assign __Pyx_ExceptionSave(&__pyx_t_2, &__pyx_t_3, &__pyx_t_4); __Pyx_XGOTREF(__pyx_t_2); __Pyx_XGOTREF(__pyx_t_3); __Pyx_XGOTREF(__pyx_t_4); /*try:*/ { /* "aiohttp/_http_parser.pyx":663 * cdef HttpParser pyparser = parser.data * try: * pyparser._on_status_complete() # <<<<<<<<<<<<<< * pyparser._on_headers_complete() * except BaseException as exc: */ __pyx_t_1 = ((struct __pyx_vtabstruct_7aiohttp_12_http_parser_HttpParser *)__pyx_v_pyparser->__pyx_vtab)->_on_status_complete(__pyx_v_pyparser); if (unlikely(!__pyx_t_1)) __PYX_ERR(0, 663, __pyx_L3_error) __Pyx_GOTREF(__pyx_t_1); __Pyx_DECREF(__pyx_t_1); __pyx_t_1 = 0; /* "aiohttp/_http_parser.pyx":664 * try: * pyparser._on_status_complete() * pyparser._on_headers_complete() # <<<<<<<<<<<<<< * except BaseException as exc: * pyparser._last_error = exc */ __pyx_t_1 = ((struct __pyx_vtabstruct_7aiohttp_12_http_parser_HttpParser *)__pyx_v_pyparser->__pyx_vtab)->_on_headers_complete(__pyx_v_pyparser); if (unlikely(!__pyx_t_1)) __PYX_ERR(0, 664, __pyx_L3_error) __Pyx_GOTREF(__pyx_t_1); __Pyx_DECREF(__pyx_t_1); __pyx_t_1 = 0; /* "aiohttp/_http_parser.pyx":662 * cdef int cb_on_headers_complete(cparser.http_parser* parser) except -1: * cdef HttpParser pyparser = parser.data * try: # <<<<<<<<<<<<<< * pyparser._on_status_complete() * pyparser._on_headers_complete() */ } /* "aiohttp/_http_parser.pyx":669 * return -1 * else: * if pyparser._cparser.upgrade or pyparser._cparser.method == 5: # CONNECT # <<<<<<<<<<<<<< * return 2 * else: */ /*else:*/ { __pyx_t_6 = (__pyx_v_pyparser->_cparser->upgrade != 0); if (!__pyx_t_6) { } else { __pyx_t_5 = __pyx_t_6; goto __pyx_L10_bool_binop_done; } __pyx_t_6 = ((__pyx_v_pyparser->_cparser->method == 5) != 0); __pyx_t_5 = __pyx_t_6; __pyx_L10_bool_binop_done:; if (__pyx_t_5) { /* "aiohttp/_http_parser.pyx":670 * else: * if pyparser._cparser.upgrade or pyparser._cparser.method == 5: # CONNECT * return 2 # <<<<<<<<<<<<<< * else: * return 0 */ __pyx_r = 2; goto __pyx_L6_except_return; /* "aiohttp/_http_parser.pyx":669 * return -1 * else: * if pyparser._cparser.upgrade or pyparser._cparser.method == 5: # CONNECT # <<<<<<<<<<<<<< * return 2 * else: */ } /* "aiohttp/_http_parser.pyx":672 * return 2 * else: * return 0 # <<<<<<<<<<<<<< * * */ /*else*/ { __pyx_r = 0; goto __pyx_L6_except_return; } } __pyx_L3_error:; __Pyx_XDECREF(__pyx_t_1); __pyx_t_1 = 0; /* "aiohttp/_http_parser.pyx":665 * pyparser._on_status_complete() * pyparser._on_headers_complete() * except BaseException as exc: # <<<<<<<<<<<<<< * pyparser._last_error = exc * return -1 */ __pyx_t_7 = 
__Pyx_PyErr_ExceptionMatches(__pyx_builtin_BaseException); if (__pyx_t_7) { __Pyx_AddTraceback("aiohttp._http_parser.cb_on_headers_complete", __pyx_clineno, __pyx_lineno, __pyx_filename); if (__Pyx_GetException(&__pyx_t_1, &__pyx_t_8, &__pyx_t_9) < 0) __PYX_ERR(0, 665, __pyx_L5_except_error) __Pyx_GOTREF(__pyx_t_1); __Pyx_GOTREF(__pyx_t_8); __Pyx_GOTREF(__pyx_t_9); __Pyx_INCREF(__pyx_t_8); __pyx_v_exc = __pyx_t_8; /*try:*/ { /* "aiohttp/_http_parser.pyx":666 * pyparser._on_headers_complete() * except BaseException as exc: * pyparser._last_error = exc # <<<<<<<<<<<<<< * return -1 * else: */ __Pyx_INCREF(__pyx_v_exc); __Pyx_GIVEREF(__pyx_v_exc); __Pyx_GOTREF(__pyx_v_pyparser->_last_error); __Pyx_DECREF(__pyx_v_pyparser->_last_error); __pyx_v_pyparser->_last_error = __pyx_v_exc; /* "aiohttp/_http_parser.pyx":667 * except BaseException as exc: * pyparser._last_error = exc * return -1 # <<<<<<<<<<<<<< * else: * if pyparser._cparser.upgrade or pyparser._cparser.method == 5: # CONNECT */ __pyx_r = -1; __Pyx_DECREF(__pyx_t_1); __pyx_t_1 = 0; __Pyx_DECREF(__pyx_t_8); __pyx_t_8 = 0; __Pyx_DECREF(__pyx_t_9); __pyx_t_9 = 0; goto __pyx_L16_return; } /* "aiohttp/_http_parser.pyx":665 * pyparser._on_status_complete() * pyparser._on_headers_complete() * except BaseException as exc: # <<<<<<<<<<<<<< * pyparser._last_error = exc * return -1 */ /*finally:*/ { __pyx_L16_return: { __pyx_t_7 = __pyx_r; __Pyx_DECREF(__pyx_v_exc); __pyx_v_exc = NULL; __pyx_r = __pyx_t_7; goto __pyx_L6_except_return; } } } goto __pyx_L5_except_error; __pyx_L5_except_error:; /* "aiohttp/_http_parser.pyx":662 * cdef int cb_on_headers_complete(cparser.http_parser* parser) except -1: * cdef HttpParser pyparser = parser.data * try: # <<<<<<<<<<<<<< * pyparser._on_status_complete() * pyparser._on_headers_complete() */ __Pyx_XGIVEREF(__pyx_t_2); __Pyx_XGIVEREF(__pyx_t_3); __Pyx_XGIVEREF(__pyx_t_4); __Pyx_ExceptionReset(__pyx_t_2, __pyx_t_3, __pyx_t_4); goto __pyx_L1_error; __pyx_L6_except_return:; __Pyx_XGIVEREF(__pyx_t_2); __Pyx_XGIVEREF(__pyx_t_3); __Pyx_XGIVEREF(__pyx_t_4); __Pyx_ExceptionReset(__pyx_t_2, __pyx_t_3, __pyx_t_4); goto __pyx_L0; } /* "aiohttp/_http_parser.pyx":660 * * * cdef int cb_on_headers_complete(cparser.http_parser* parser) except -1: # <<<<<<<<<<<<<< * cdef HttpParser pyparser = parser.data * try: */ /* function exit code */ __pyx_L1_error:; __Pyx_XDECREF(__pyx_t_1); __Pyx_XDECREF(__pyx_t_8); __Pyx_XDECREF(__pyx_t_9); __Pyx_AddTraceback("aiohttp._http_parser.cb_on_headers_complete", __pyx_clineno, __pyx_lineno, __pyx_filename); __pyx_r = -1; __pyx_L0:; __Pyx_XDECREF((PyObject *)__pyx_v_pyparser); __Pyx_XDECREF(__pyx_v_exc); __Pyx_RefNannyFinishContext(); return __pyx_r; } /* "aiohttp/_http_parser.pyx":675 * * * cdef int cb_on_body(cparser.http_parser* parser, # <<<<<<<<<<<<<< * const char *at, size_t length) except -1: * cdef HttpParser pyparser = parser.data */ static int __pyx_f_7aiohttp_12_http_parser_cb_on_body(struct http_parser *__pyx_v_parser, char const *__pyx_v_at, size_t __pyx_v_length) { struct __pyx_obj_7aiohttp_12_http_parser_HttpParser *__pyx_v_pyparser = 0; PyObject *__pyx_v_body = 0; PyObject *__pyx_v_exc = NULL; int __pyx_r; __Pyx_RefNannyDeclarations PyObject *__pyx_t_1 = NULL; PyObject *__pyx_t_2 = NULL; PyObject *__pyx_t_3 = NULL; PyObject *__pyx_t_4 = NULL; PyObject *__pyx_t_5 = NULL; PyObject *__pyx_t_6 = NULL; PyObject *__pyx_t_7 = NULL; int __pyx_t_8; PyObject *__pyx_t_9 = NULL; int __pyx_t_10; int __pyx_t_11; PyObject *__pyx_t_12 = NULL; PyObject *__pyx_t_13 = NULL; PyObject *__pyx_t_14 = 
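/* cb_on_headers_complete (above) extends the usual 0 / -1 protocol with a third return
   value: once _on_status_complete() and _on_headers_complete() succeed it returns 2 when
   the message is an upgrade or a CONNECT request (pyparser._cparser.method == 5), which
   the C parser takes as a signal not to parse the remaining bytes as a body; otherwise it
   returns 0.  Exceptions are recorded in pyparser._last_error with -1, as in the other
   callbacks.  From the banner comments:

       else:
           if pyparser._cparser.upgrade or pyparser._cparser.method == 5:  # CONNECT
               return 2
           else:
               return 0
 */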
NULL; PyObject *__pyx_t_15 = NULL; int __pyx_t_16; char const *__pyx_t_17; PyObject *__pyx_t_18 = NULL; PyObject *__pyx_t_19 = NULL; PyObject *__pyx_t_20 = NULL; PyObject *__pyx_t_21 = NULL; PyObject *__pyx_t_22 = NULL; PyObject *__pyx_t_23 = NULL; __Pyx_RefNannySetupContext("cb_on_body", 0); /* "aiohttp/_http_parser.pyx":677 * cdef int cb_on_body(cparser.http_parser* parser, * const char *at, size_t length) except -1: * cdef HttpParser pyparser = parser.data # <<<<<<<<<<<<<< * cdef bytes body = at[:length] * try: */ __pyx_t_1 = ((PyObject *)__pyx_v_parser->data); __Pyx_INCREF(__pyx_t_1); __pyx_v_pyparser = ((struct __pyx_obj_7aiohttp_12_http_parser_HttpParser *)__pyx_t_1); __pyx_t_1 = 0; /* "aiohttp/_http_parser.pyx":678 * const char *at, size_t length) except -1: * cdef HttpParser pyparser = parser.data * cdef bytes body = at[:length] # <<<<<<<<<<<<<< * try: * pyparser._payload.feed_data(body, length) */ __pyx_t_1 = __Pyx_PyBytes_FromStringAndSize(__pyx_v_at + 0, __pyx_v_length - 0); if (unlikely(!__pyx_t_1)) __PYX_ERR(0, 678, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_1); __pyx_v_body = ((PyObject*)__pyx_t_1); __pyx_t_1 = 0; /* "aiohttp/_http_parser.pyx":679 * cdef HttpParser pyparser = parser.data * cdef bytes body = at[:length] * try: # <<<<<<<<<<<<<< * pyparser._payload.feed_data(body, length) * except BaseException as exc: */ { __Pyx_PyThreadState_declare __Pyx_PyThreadState_assign __Pyx_ExceptionSave(&__pyx_t_2, &__pyx_t_3, &__pyx_t_4); __Pyx_XGOTREF(__pyx_t_2); __Pyx_XGOTREF(__pyx_t_3); __Pyx_XGOTREF(__pyx_t_4); /*try:*/ { /* "aiohttp/_http_parser.pyx":680 * cdef bytes body = at[:length] * try: * pyparser._payload.feed_data(body, length) # <<<<<<<<<<<<<< * except BaseException as exc: * if pyparser._payload_exception is not None: */ __pyx_t_5 = __Pyx_PyObject_GetAttrStr(__pyx_v_pyparser->_payload, __pyx_n_s_feed_data); if (unlikely(!__pyx_t_5)) __PYX_ERR(0, 680, __pyx_L3_error) __Pyx_GOTREF(__pyx_t_5); __pyx_t_6 = __Pyx_PyInt_FromSize_t(__pyx_v_length); if (unlikely(!__pyx_t_6)) __PYX_ERR(0, 680, __pyx_L3_error) __Pyx_GOTREF(__pyx_t_6); __pyx_t_7 = NULL; __pyx_t_8 = 0; if (CYTHON_UNPACK_METHODS && likely(PyMethod_Check(__pyx_t_5))) { __pyx_t_7 = PyMethod_GET_SELF(__pyx_t_5); if (likely(__pyx_t_7)) { PyObject* function = PyMethod_GET_FUNCTION(__pyx_t_5); __Pyx_INCREF(__pyx_t_7); __Pyx_INCREF(function); __Pyx_DECREF_SET(__pyx_t_5, function); __pyx_t_8 = 1; } } #if CYTHON_FAST_PYCALL if (PyFunction_Check(__pyx_t_5)) { PyObject *__pyx_temp[3] = {__pyx_t_7, __pyx_v_body, __pyx_t_6}; __pyx_t_1 = __Pyx_PyFunction_FastCall(__pyx_t_5, __pyx_temp+1-__pyx_t_8, 2+__pyx_t_8); if (unlikely(!__pyx_t_1)) __PYX_ERR(0, 680, __pyx_L3_error) __Pyx_XDECREF(__pyx_t_7); __pyx_t_7 = 0; __Pyx_GOTREF(__pyx_t_1); __Pyx_DECREF(__pyx_t_6); __pyx_t_6 = 0; } else #endif #if CYTHON_FAST_PYCCALL if (__Pyx_PyFastCFunction_Check(__pyx_t_5)) { PyObject *__pyx_temp[3] = {__pyx_t_7, __pyx_v_body, __pyx_t_6}; __pyx_t_1 = __Pyx_PyCFunction_FastCall(__pyx_t_5, __pyx_temp+1-__pyx_t_8, 2+__pyx_t_8); if (unlikely(!__pyx_t_1)) __PYX_ERR(0, 680, __pyx_L3_error) __Pyx_XDECREF(__pyx_t_7); __pyx_t_7 = 0; __Pyx_GOTREF(__pyx_t_1); __Pyx_DECREF(__pyx_t_6); __pyx_t_6 = 0; } else #endif { __pyx_t_9 = PyTuple_New(2+__pyx_t_8); if (unlikely(!__pyx_t_9)) __PYX_ERR(0, 680, __pyx_L3_error) __Pyx_GOTREF(__pyx_t_9); if (__pyx_t_7) { __Pyx_GIVEREF(__pyx_t_7); PyTuple_SET_ITEM(__pyx_t_9, 0, __pyx_t_7); __pyx_t_7 = NULL; } __Pyx_INCREF(__pyx_v_body); __Pyx_GIVEREF(__pyx_v_body); PyTuple_SET_ITEM(__pyx_t_9, 0+__pyx_t_8, __pyx_v_body); 
__Pyx_GIVEREF(__pyx_t_6); PyTuple_SET_ITEM(__pyx_t_9, 1+__pyx_t_8, __pyx_t_6); __pyx_t_6 = 0; __pyx_t_1 = __Pyx_PyObject_Call(__pyx_t_5, __pyx_t_9, NULL); if (unlikely(!__pyx_t_1)) __PYX_ERR(0, 680, __pyx_L3_error) __Pyx_GOTREF(__pyx_t_1); __Pyx_DECREF(__pyx_t_9); __pyx_t_9 = 0; } __Pyx_DECREF(__pyx_t_5); __pyx_t_5 = 0; __Pyx_DECREF(__pyx_t_1); __pyx_t_1 = 0; /* "aiohttp/_http_parser.pyx":679 * cdef HttpParser pyparser = parser.data * cdef bytes body = at[:length] * try: # <<<<<<<<<<<<<< * pyparser._payload.feed_data(body, length) * except BaseException as exc: */ } /* "aiohttp/_http_parser.pyx":689 * return -1 * else: * return 0 # <<<<<<<<<<<<<< * * */ /*else:*/ { __pyx_r = 0; goto __pyx_L6_except_return; } __pyx_L3_error:; __Pyx_XDECREF(__pyx_t_1); __pyx_t_1 = 0; __Pyx_XDECREF(__pyx_t_5); __pyx_t_5 = 0; __Pyx_XDECREF(__pyx_t_6); __pyx_t_6 = 0; __Pyx_XDECREF(__pyx_t_7); __pyx_t_7 = 0; __Pyx_XDECREF(__pyx_t_9); __pyx_t_9 = 0; /* "aiohttp/_http_parser.pyx":681 * try: * pyparser._payload.feed_data(body, length) * except BaseException as exc: # <<<<<<<<<<<<<< * if pyparser._payload_exception is not None: * pyparser._payload.set_exception(pyparser._payload_exception(str(exc))) */ __pyx_t_8 = __Pyx_PyErr_ExceptionMatches(__pyx_builtin_BaseException); if (__pyx_t_8) { __Pyx_AddTraceback("aiohttp._http_parser.cb_on_body", __pyx_clineno, __pyx_lineno, __pyx_filename); if (__Pyx_GetException(&__pyx_t_1, &__pyx_t_5, &__pyx_t_9) < 0) __PYX_ERR(0, 681, __pyx_L5_except_error) __Pyx_GOTREF(__pyx_t_1); __Pyx_GOTREF(__pyx_t_5); __Pyx_GOTREF(__pyx_t_9); __Pyx_INCREF(__pyx_t_5); __pyx_v_exc = __pyx_t_5; /*try:*/ { /* "aiohttp/_http_parser.pyx":682 * pyparser._payload.feed_data(body, length) * except BaseException as exc: * if pyparser._payload_exception is not None: # <<<<<<<<<<<<<< * pyparser._payload.set_exception(pyparser._payload_exception(str(exc))) * else: */ __pyx_t_10 = (__pyx_v_pyparser->_payload_exception != Py_None); __pyx_t_11 = (__pyx_t_10 != 0); if (__pyx_t_11) { /* "aiohttp/_http_parser.pyx":683 * except BaseException as exc: * if pyparser._payload_exception is not None: * pyparser._payload.set_exception(pyparser._payload_exception(str(exc))) # <<<<<<<<<<<<<< * else: * pyparser._payload.set_exception(exc) */ __pyx_t_7 = __Pyx_PyObject_GetAttrStr(__pyx_v_pyparser->_payload, __pyx_n_s_set_exception); if (unlikely(!__pyx_t_7)) __PYX_ERR(0, 683, __pyx_L14_error) __Pyx_GOTREF(__pyx_t_7); __pyx_t_13 = __Pyx_PyObject_CallOneArg(((PyObject *)(&PyUnicode_Type)), __pyx_v_exc); if (unlikely(!__pyx_t_13)) __PYX_ERR(0, 683, __pyx_L14_error) __Pyx_GOTREF(__pyx_t_13); __Pyx_INCREF(__pyx_v_pyparser->_payload_exception); __pyx_t_14 = __pyx_v_pyparser->_payload_exception; __pyx_t_15 = NULL; if (CYTHON_UNPACK_METHODS && likely(PyMethod_Check(__pyx_t_14))) { __pyx_t_15 = PyMethod_GET_SELF(__pyx_t_14); if (likely(__pyx_t_15)) { PyObject* function = PyMethod_GET_FUNCTION(__pyx_t_14); __Pyx_INCREF(__pyx_t_15); __Pyx_INCREF(function); __Pyx_DECREF_SET(__pyx_t_14, function); } } __pyx_t_12 = (__pyx_t_15) ? 
__Pyx_PyObject_Call2Args(__pyx_t_14, __pyx_t_15, __pyx_t_13) : __Pyx_PyObject_CallOneArg(__pyx_t_14, __pyx_t_13); __Pyx_XDECREF(__pyx_t_15); __pyx_t_15 = 0; __Pyx_DECREF(__pyx_t_13); __pyx_t_13 = 0; if (unlikely(!__pyx_t_12)) __PYX_ERR(0, 683, __pyx_L14_error) __Pyx_GOTREF(__pyx_t_12); __Pyx_DECREF(__pyx_t_14); __pyx_t_14 = 0; __pyx_t_14 = NULL; if (CYTHON_UNPACK_METHODS && likely(PyMethod_Check(__pyx_t_7))) { __pyx_t_14 = PyMethod_GET_SELF(__pyx_t_7); if (likely(__pyx_t_14)) { PyObject* function = PyMethod_GET_FUNCTION(__pyx_t_7); __Pyx_INCREF(__pyx_t_14); __Pyx_INCREF(function); __Pyx_DECREF_SET(__pyx_t_7, function); } } __pyx_t_6 = (__pyx_t_14) ? __Pyx_PyObject_Call2Args(__pyx_t_7, __pyx_t_14, __pyx_t_12) : __Pyx_PyObject_CallOneArg(__pyx_t_7, __pyx_t_12); __Pyx_XDECREF(__pyx_t_14); __pyx_t_14 = 0; __Pyx_DECREF(__pyx_t_12); __pyx_t_12 = 0; if (unlikely(!__pyx_t_6)) __PYX_ERR(0, 683, __pyx_L14_error) __Pyx_GOTREF(__pyx_t_6); __Pyx_DECREF(__pyx_t_7); __pyx_t_7 = 0; __Pyx_DECREF(__pyx_t_6); __pyx_t_6 = 0; /* "aiohttp/_http_parser.pyx":682 * pyparser._payload.feed_data(body, length) * except BaseException as exc: * if pyparser._payload_exception is not None: # <<<<<<<<<<<<<< * pyparser._payload.set_exception(pyparser._payload_exception(str(exc))) * else: */ goto __pyx_L16; } /* "aiohttp/_http_parser.pyx":685 * pyparser._payload.set_exception(pyparser._payload_exception(str(exc))) * else: * pyparser._payload.set_exception(exc) # <<<<<<<<<<<<<< * pyparser._payload_error = 1 * return -1 */ /*else*/ { __pyx_t_7 = __Pyx_PyObject_GetAttrStr(__pyx_v_pyparser->_payload, __pyx_n_s_set_exception); if (unlikely(!__pyx_t_7)) __PYX_ERR(0, 685, __pyx_L14_error) __Pyx_GOTREF(__pyx_t_7); __pyx_t_12 = NULL; if (CYTHON_UNPACK_METHODS && likely(PyMethod_Check(__pyx_t_7))) { __pyx_t_12 = PyMethod_GET_SELF(__pyx_t_7); if (likely(__pyx_t_12)) { PyObject* function = PyMethod_GET_FUNCTION(__pyx_t_7); __Pyx_INCREF(__pyx_t_12); __Pyx_INCREF(function); __Pyx_DECREF_SET(__pyx_t_7, function); } } __pyx_t_6 = (__pyx_t_12) ? 
__Pyx_PyObject_Call2Args(__pyx_t_7, __pyx_t_12, __pyx_v_exc) : __Pyx_PyObject_CallOneArg(__pyx_t_7, __pyx_v_exc); __Pyx_XDECREF(__pyx_t_12); __pyx_t_12 = 0; if (unlikely(!__pyx_t_6)) __PYX_ERR(0, 685, __pyx_L14_error) __Pyx_GOTREF(__pyx_t_6); __Pyx_DECREF(__pyx_t_7); __pyx_t_7 = 0; __Pyx_DECREF(__pyx_t_6); __pyx_t_6 = 0; } __pyx_L16:; /* "aiohttp/_http_parser.pyx":686 * else: * pyparser._payload.set_exception(exc) * pyparser._payload_error = 1 # <<<<<<<<<<<<<< * return -1 * else: */ __pyx_v_pyparser->_payload_error = 1; /* "aiohttp/_http_parser.pyx":687 * pyparser._payload.set_exception(exc) * pyparser._payload_error = 1 * return -1 # <<<<<<<<<<<<<< * else: * return 0 */ __pyx_r = -1; __Pyx_DECREF(__pyx_t_1); __pyx_t_1 = 0; __Pyx_DECREF(__pyx_t_5); __pyx_t_5 = 0; __Pyx_DECREF(__pyx_t_9); __pyx_t_9 = 0; goto __pyx_L13_return; } /* "aiohttp/_http_parser.pyx":681 * try: * pyparser._payload.feed_data(body, length) * except BaseException as exc: # <<<<<<<<<<<<<< * if pyparser._payload_exception is not None: * pyparser._payload.set_exception(pyparser._payload_exception(str(exc))) */ /*finally:*/ { __pyx_L14_error:; /*exception exit:*/{ __Pyx_PyThreadState_declare __Pyx_PyThreadState_assign __pyx_t_18 = 0; __pyx_t_19 = 0; __pyx_t_20 = 0; __pyx_t_21 = 0; __pyx_t_22 = 0; __pyx_t_23 = 0; __Pyx_XDECREF(__pyx_t_12); __pyx_t_12 = 0; __Pyx_XDECREF(__pyx_t_13); __pyx_t_13 = 0; __Pyx_XDECREF(__pyx_t_14); __pyx_t_14 = 0; __Pyx_XDECREF(__pyx_t_15); __pyx_t_15 = 0; __Pyx_XDECREF(__pyx_t_6); __pyx_t_6 = 0; __Pyx_XDECREF(__pyx_t_7); __pyx_t_7 = 0; if (PY_MAJOR_VERSION >= 3) __Pyx_ExceptionSwap(&__pyx_t_21, &__pyx_t_22, &__pyx_t_23); if ((PY_MAJOR_VERSION < 3) || unlikely(__Pyx_GetException(&__pyx_t_18, &__pyx_t_19, &__pyx_t_20) < 0)) __Pyx_ErrFetch(&__pyx_t_18, &__pyx_t_19, &__pyx_t_20); __Pyx_XGOTREF(__pyx_t_18); __Pyx_XGOTREF(__pyx_t_19); __Pyx_XGOTREF(__pyx_t_20); __Pyx_XGOTREF(__pyx_t_21); __Pyx_XGOTREF(__pyx_t_22); __Pyx_XGOTREF(__pyx_t_23); __pyx_t_8 = __pyx_lineno; __pyx_t_16 = __pyx_clineno; __pyx_t_17 = __pyx_filename; { __Pyx_DECREF(__pyx_v_exc); __pyx_v_exc = NULL; } if (PY_MAJOR_VERSION >= 3) { __Pyx_XGIVEREF(__pyx_t_21); __Pyx_XGIVEREF(__pyx_t_22); __Pyx_XGIVEREF(__pyx_t_23); __Pyx_ExceptionReset(__pyx_t_21, __pyx_t_22, __pyx_t_23); } __Pyx_XGIVEREF(__pyx_t_18); __Pyx_XGIVEREF(__pyx_t_19); __Pyx_XGIVEREF(__pyx_t_20); __Pyx_ErrRestore(__pyx_t_18, __pyx_t_19, __pyx_t_20); __pyx_t_18 = 0; __pyx_t_19 = 0; __pyx_t_20 = 0; __pyx_t_21 = 0; __pyx_t_22 = 0; __pyx_t_23 = 0; __pyx_lineno = __pyx_t_8; __pyx_clineno = __pyx_t_16; __pyx_filename = __pyx_t_17; goto __pyx_L5_except_error; } __pyx_L13_return: { __pyx_t_16 = __pyx_r; __Pyx_DECREF(__pyx_v_exc); __pyx_v_exc = NULL; __pyx_r = __pyx_t_16; goto __pyx_L6_except_return; } } } goto __pyx_L5_except_error; __pyx_L5_except_error:; /* "aiohttp/_http_parser.pyx":679 * cdef HttpParser pyparser = parser.data * cdef bytes body = at[:length] * try: # <<<<<<<<<<<<<< * pyparser._payload.feed_data(body, length) * except BaseException as exc: */ __Pyx_XGIVEREF(__pyx_t_2); __Pyx_XGIVEREF(__pyx_t_3); __Pyx_XGIVEREF(__pyx_t_4); __Pyx_ExceptionReset(__pyx_t_2, __pyx_t_3, __pyx_t_4); goto __pyx_L1_error; __pyx_L6_except_return:; __Pyx_XGIVEREF(__pyx_t_2); __Pyx_XGIVEREF(__pyx_t_3); __Pyx_XGIVEREF(__pyx_t_4); __Pyx_ExceptionReset(__pyx_t_2, __pyx_t_3, __pyx_t_4); goto __pyx_L0; } /* "aiohttp/_http_parser.pyx":675 * * * cdef int cb_on_body(cparser.http_parser* parser, # <<<<<<<<<<<<<< * const char *at, size_t length) except -1: * cdef HttpParser pyparser = parser.data */ /* 
function exit code */ __pyx_L1_error:; __Pyx_XDECREF(__pyx_t_1); __Pyx_XDECREF(__pyx_t_5); __Pyx_XDECREF(__pyx_t_6); __Pyx_XDECREF(__pyx_t_7); __Pyx_XDECREF(__pyx_t_9); __Pyx_XDECREF(__pyx_t_12); __Pyx_XDECREF(__pyx_t_13); __Pyx_XDECREF(__pyx_t_14); __Pyx_XDECREF(__pyx_t_15); __Pyx_AddTraceback("aiohttp._http_parser.cb_on_body", __pyx_clineno, __pyx_lineno, __pyx_filename); __pyx_r = -1; __pyx_L0:; __Pyx_XDECREF((PyObject *)__pyx_v_pyparser); __Pyx_XDECREF(__pyx_v_body); __Pyx_XDECREF(__pyx_v_exc); __Pyx_RefNannyFinishContext(); return __pyx_r; } /* "aiohttp/_http_parser.pyx":692 * * * cdef int cb_on_message_complete(cparser.http_parser* parser) except -1: # <<<<<<<<<<<<<< * cdef HttpParser pyparser = parser.data * try: */ static int __pyx_f_7aiohttp_12_http_parser_cb_on_message_complete(struct http_parser *__pyx_v_parser) { struct __pyx_obj_7aiohttp_12_http_parser_HttpParser *__pyx_v_pyparser = 0; PyObject *__pyx_v_exc = NULL; int __pyx_r; __Pyx_RefNannyDeclarations PyObject *__pyx_t_1 = NULL; PyObject *__pyx_t_2 = NULL; PyObject *__pyx_t_3 = NULL; PyObject *__pyx_t_4 = NULL; int __pyx_t_5; PyObject *__pyx_t_6 = NULL; PyObject *__pyx_t_7 = NULL; __Pyx_RefNannySetupContext("cb_on_message_complete", 0); /* "aiohttp/_http_parser.pyx":693 * * cdef int cb_on_message_complete(cparser.http_parser* parser) except -1: * cdef HttpParser pyparser = parser.data # <<<<<<<<<<<<<< * try: * pyparser._started = False */ __pyx_t_1 = ((PyObject *)__pyx_v_parser->data); __Pyx_INCREF(__pyx_t_1); __pyx_v_pyparser = ((struct __pyx_obj_7aiohttp_12_http_parser_HttpParser *)__pyx_t_1); __pyx_t_1 = 0; /* "aiohttp/_http_parser.pyx":694 * cdef int cb_on_message_complete(cparser.http_parser* parser) except -1: * cdef HttpParser pyparser = parser.data * try: # <<<<<<<<<<<<<< * pyparser._started = False * pyparser._on_message_complete() */ { __Pyx_PyThreadState_declare __Pyx_PyThreadState_assign __Pyx_ExceptionSave(&__pyx_t_2, &__pyx_t_3, &__pyx_t_4); __Pyx_XGOTREF(__pyx_t_2); __Pyx_XGOTREF(__pyx_t_3); __Pyx_XGOTREF(__pyx_t_4); /*try:*/ { /* "aiohttp/_http_parser.pyx":695 * cdef HttpParser pyparser = parser.data * try: * pyparser._started = False # <<<<<<<<<<<<<< * pyparser._on_message_complete() * except BaseException as exc: */ __pyx_v_pyparser->_started = 0; /* "aiohttp/_http_parser.pyx":696 * try: * pyparser._started = False * pyparser._on_message_complete() # <<<<<<<<<<<<<< * except BaseException as exc: * pyparser._last_error = exc */ __pyx_t_1 = ((struct __pyx_vtabstruct_7aiohttp_12_http_parser_HttpParser *)__pyx_v_pyparser->__pyx_vtab)->_on_message_complete(__pyx_v_pyparser); if (unlikely(!__pyx_t_1)) __PYX_ERR(0, 696, __pyx_L3_error) __Pyx_GOTREF(__pyx_t_1); __Pyx_DECREF(__pyx_t_1); __pyx_t_1 = 0; /* "aiohttp/_http_parser.pyx":694 * cdef int cb_on_message_complete(cparser.http_parser* parser) except -1: * cdef HttpParser pyparser = parser.data * try: # <<<<<<<<<<<<<< * pyparser._started = False * pyparser._on_message_complete() */ } /* "aiohttp/_http_parser.pyx":701 * return -1 * else: * return 0 # <<<<<<<<<<<<<< * * */ /*else:*/ { __pyx_r = 0; goto __pyx_L6_except_return; } __pyx_L3_error:; __Pyx_XDECREF(__pyx_t_1); __pyx_t_1 = 0; /* "aiohttp/_http_parser.pyx":697 * pyparser._started = False * pyparser._on_message_complete() * except BaseException as exc: # <<<<<<<<<<<<<< * pyparser._last_error = exc * return -1 */ __pyx_t_5 = __Pyx_PyErr_ExceptionMatches(__pyx_builtin_BaseException); if (__pyx_t_5) { __Pyx_AddTraceback("aiohttp._http_parser.cb_on_message_complete", __pyx_clineno, __pyx_lineno, 
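/* cb_on_body (completed above) copies the parsed chunk into a Python bytes object
   (cdef bytes body = at[:length]) and feeds it to pyparser._payload.feed_data(body, length).
   On failure it either wraps the error via pyparser._payload_exception(str(exc)) or sets the
   exception directly on the payload, flags pyparser._payload_error = 1 and returns -1.
   The remaining callbacks (cb_on_message_complete here, then cb_on_chunk_header and
   cb_on_chunk_complete below) carry no extra logic: each calls the matching pyparser._on_*()
   method inside the same try/except that stores the exception in pyparser._last_error and
   returns -1, else returns 0; cb_on_message_complete additionally resets
   pyparser._started = False before invoking _on_message_complete(). */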
__pyx_filename); if (__Pyx_GetException(&__pyx_t_1, &__pyx_t_6, &__pyx_t_7) < 0) __PYX_ERR(0, 697, __pyx_L5_except_error) __Pyx_GOTREF(__pyx_t_1); __Pyx_GOTREF(__pyx_t_6); __Pyx_GOTREF(__pyx_t_7); __Pyx_INCREF(__pyx_t_6); __pyx_v_exc = __pyx_t_6; /*try:*/ { /* "aiohttp/_http_parser.pyx":698 * pyparser._on_message_complete() * except BaseException as exc: * pyparser._last_error = exc # <<<<<<<<<<<<<< * return -1 * else: */ __Pyx_INCREF(__pyx_v_exc); __Pyx_GIVEREF(__pyx_v_exc); __Pyx_GOTREF(__pyx_v_pyparser->_last_error); __Pyx_DECREF(__pyx_v_pyparser->_last_error); __pyx_v_pyparser->_last_error = __pyx_v_exc; /* "aiohttp/_http_parser.pyx":699 * except BaseException as exc: * pyparser._last_error = exc * return -1 # <<<<<<<<<<<<<< * else: * return 0 */ __pyx_r = -1; __Pyx_DECREF(__pyx_t_1); __pyx_t_1 = 0; __Pyx_DECREF(__pyx_t_6); __pyx_t_6 = 0; __Pyx_DECREF(__pyx_t_7); __pyx_t_7 = 0; goto __pyx_L13_return; } /* "aiohttp/_http_parser.pyx":697 * pyparser._started = False * pyparser._on_message_complete() * except BaseException as exc: # <<<<<<<<<<<<<< * pyparser._last_error = exc * return -1 */ /*finally:*/ { __pyx_L13_return: { __pyx_t_5 = __pyx_r; __Pyx_DECREF(__pyx_v_exc); __pyx_v_exc = NULL; __pyx_r = __pyx_t_5; goto __pyx_L6_except_return; } } } goto __pyx_L5_except_error; __pyx_L5_except_error:; /* "aiohttp/_http_parser.pyx":694 * cdef int cb_on_message_complete(cparser.http_parser* parser) except -1: * cdef HttpParser pyparser = parser.data * try: # <<<<<<<<<<<<<< * pyparser._started = False * pyparser._on_message_complete() */ __Pyx_XGIVEREF(__pyx_t_2); __Pyx_XGIVEREF(__pyx_t_3); __Pyx_XGIVEREF(__pyx_t_4); __Pyx_ExceptionReset(__pyx_t_2, __pyx_t_3, __pyx_t_4); goto __pyx_L1_error; __pyx_L6_except_return:; __Pyx_XGIVEREF(__pyx_t_2); __Pyx_XGIVEREF(__pyx_t_3); __Pyx_XGIVEREF(__pyx_t_4); __Pyx_ExceptionReset(__pyx_t_2, __pyx_t_3, __pyx_t_4); goto __pyx_L0; } /* "aiohttp/_http_parser.pyx":692 * * * cdef int cb_on_message_complete(cparser.http_parser* parser) except -1: # <<<<<<<<<<<<<< * cdef HttpParser pyparser = parser.data * try: */ /* function exit code */ __pyx_L1_error:; __Pyx_XDECREF(__pyx_t_1); __Pyx_XDECREF(__pyx_t_6); __Pyx_XDECREF(__pyx_t_7); __Pyx_AddTraceback("aiohttp._http_parser.cb_on_message_complete", __pyx_clineno, __pyx_lineno, __pyx_filename); __pyx_r = -1; __pyx_L0:; __Pyx_XDECREF((PyObject *)__pyx_v_pyparser); __Pyx_XDECREF(__pyx_v_exc); __Pyx_RefNannyFinishContext(); return __pyx_r; } /* "aiohttp/_http_parser.pyx":704 * * * cdef int cb_on_chunk_header(cparser.http_parser* parser) except -1: # <<<<<<<<<<<<<< * cdef HttpParser pyparser = parser.data * try: */ static int __pyx_f_7aiohttp_12_http_parser_cb_on_chunk_header(struct http_parser *__pyx_v_parser) { struct __pyx_obj_7aiohttp_12_http_parser_HttpParser *__pyx_v_pyparser = 0; PyObject *__pyx_v_exc = NULL; int __pyx_r; __Pyx_RefNannyDeclarations PyObject *__pyx_t_1 = NULL; PyObject *__pyx_t_2 = NULL; PyObject *__pyx_t_3 = NULL; PyObject *__pyx_t_4 = NULL; int __pyx_t_5; PyObject *__pyx_t_6 = NULL; PyObject *__pyx_t_7 = NULL; __Pyx_RefNannySetupContext("cb_on_chunk_header", 0); /* "aiohttp/_http_parser.pyx":705 * * cdef int cb_on_chunk_header(cparser.http_parser* parser) except -1: * cdef HttpParser pyparser = parser.data # <<<<<<<<<<<<<< * try: * pyparser._on_chunk_header() */ __pyx_t_1 = ((PyObject *)__pyx_v_parser->data); __Pyx_INCREF(__pyx_t_1); __pyx_v_pyparser = ((struct __pyx_obj_7aiohttp_12_http_parser_HttpParser *)__pyx_t_1); __pyx_t_1 = 0; /* "aiohttp/_http_parser.pyx":706 * cdef int 
cb_on_chunk_header(cparser.http_parser* parser) except -1: * cdef HttpParser pyparser = parser.data * try: # <<<<<<<<<<<<<< * pyparser._on_chunk_header() * except BaseException as exc: */ { __Pyx_PyThreadState_declare __Pyx_PyThreadState_assign __Pyx_ExceptionSave(&__pyx_t_2, &__pyx_t_3, &__pyx_t_4); __Pyx_XGOTREF(__pyx_t_2); __Pyx_XGOTREF(__pyx_t_3); __Pyx_XGOTREF(__pyx_t_4); /*try:*/ { /* "aiohttp/_http_parser.pyx":707 * cdef HttpParser pyparser = parser.data * try: * pyparser._on_chunk_header() # <<<<<<<<<<<<<< * except BaseException as exc: * pyparser._last_error = exc */ __pyx_t_1 = ((struct __pyx_vtabstruct_7aiohttp_12_http_parser_HttpParser *)__pyx_v_pyparser->__pyx_vtab)->_on_chunk_header(__pyx_v_pyparser); if (unlikely(!__pyx_t_1)) __PYX_ERR(0, 707, __pyx_L3_error) __Pyx_GOTREF(__pyx_t_1); __Pyx_DECREF(__pyx_t_1); __pyx_t_1 = 0; /* "aiohttp/_http_parser.pyx":706 * cdef int cb_on_chunk_header(cparser.http_parser* parser) except -1: * cdef HttpParser pyparser = parser.data * try: # <<<<<<<<<<<<<< * pyparser._on_chunk_header() * except BaseException as exc: */ } /* "aiohttp/_http_parser.pyx":712 * return -1 * else: * return 0 # <<<<<<<<<<<<<< * * */ /*else:*/ { __pyx_r = 0; goto __pyx_L6_except_return; } __pyx_L3_error:; __Pyx_XDECREF(__pyx_t_1); __pyx_t_1 = 0; /* "aiohttp/_http_parser.pyx":708 * try: * pyparser._on_chunk_header() * except BaseException as exc: # <<<<<<<<<<<<<< * pyparser._last_error = exc * return -1 */ __pyx_t_5 = __Pyx_PyErr_ExceptionMatches(__pyx_builtin_BaseException); if (__pyx_t_5) { __Pyx_AddTraceback("aiohttp._http_parser.cb_on_chunk_header", __pyx_clineno, __pyx_lineno, __pyx_filename); if (__Pyx_GetException(&__pyx_t_1, &__pyx_t_6, &__pyx_t_7) < 0) __PYX_ERR(0, 708, __pyx_L5_except_error) __Pyx_GOTREF(__pyx_t_1); __Pyx_GOTREF(__pyx_t_6); __Pyx_GOTREF(__pyx_t_7); __Pyx_INCREF(__pyx_t_6); __pyx_v_exc = __pyx_t_6; /*try:*/ { /* "aiohttp/_http_parser.pyx":709 * pyparser._on_chunk_header() * except BaseException as exc: * pyparser._last_error = exc # <<<<<<<<<<<<<< * return -1 * else: */ __Pyx_INCREF(__pyx_v_exc); __Pyx_GIVEREF(__pyx_v_exc); __Pyx_GOTREF(__pyx_v_pyparser->_last_error); __Pyx_DECREF(__pyx_v_pyparser->_last_error); __pyx_v_pyparser->_last_error = __pyx_v_exc; /* "aiohttp/_http_parser.pyx":710 * except BaseException as exc: * pyparser._last_error = exc * return -1 # <<<<<<<<<<<<<< * else: * return 0 */ __pyx_r = -1; __Pyx_DECREF(__pyx_t_1); __pyx_t_1 = 0; __Pyx_DECREF(__pyx_t_6); __pyx_t_6 = 0; __Pyx_DECREF(__pyx_t_7); __pyx_t_7 = 0; goto __pyx_L13_return; } /* "aiohttp/_http_parser.pyx":708 * try: * pyparser._on_chunk_header() * except BaseException as exc: # <<<<<<<<<<<<<< * pyparser._last_error = exc * return -1 */ /*finally:*/ { __pyx_L13_return: { __pyx_t_5 = __pyx_r; __Pyx_DECREF(__pyx_v_exc); __pyx_v_exc = NULL; __pyx_r = __pyx_t_5; goto __pyx_L6_except_return; } } } goto __pyx_L5_except_error; __pyx_L5_except_error:; /* "aiohttp/_http_parser.pyx":706 * cdef int cb_on_chunk_header(cparser.http_parser* parser) except -1: * cdef HttpParser pyparser = parser.data * try: # <<<<<<<<<<<<<< * pyparser._on_chunk_header() * except BaseException as exc: */ __Pyx_XGIVEREF(__pyx_t_2); __Pyx_XGIVEREF(__pyx_t_3); __Pyx_XGIVEREF(__pyx_t_4); __Pyx_ExceptionReset(__pyx_t_2, __pyx_t_3, __pyx_t_4); goto __pyx_L1_error; __pyx_L6_except_return:; __Pyx_XGIVEREF(__pyx_t_2); __Pyx_XGIVEREF(__pyx_t_3); __Pyx_XGIVEREF(__pyx_t_4); __Pyx_ExceptionReset(__pyx_t_2, __pyx_t_3, __pyx_t_4); goto __pyx_L0; } /* "aiohttp/_http_parser.pyx":704 * * * cdef int 
cb_on_chunk_header(cparser.http_parser* parser) except -1: # <<<<<<<<<<<<<< * cdef HttpParser pyparser = parser.data * try: */ /* function exit code */ __pyx_L1_error:; __Pyx_XDECREF(__pyx_t_1); __Pyx_XDECREF(__pyx_t_6); __Pyx_XDECREF(__pyx_t_7); __Pyx_AddTraceback("aiohttp._http_parser.cb_on_chunk_header", __pyx_clineno, __pyx_lineno, __pyx_filename); __pyx_r = -1; __pyx_L0:; __Pyx_XDECREF((PyObject *)__pyx_v_pyparser); __Pyx_XDECREF(__pyx_v_exc); __Pyx_RefNannyFinishContext(); return __pyx_r; } /* "aiohttp/_http_parser.pyx":715 * * * cdef int cb_on_chunk_complete(cparser.http_parser* parser) except -1: # <<<<<<<<<<<<<< * cdef HttpParser pyparser = parser.data * try: */ static int __pyx_f_7aiohttp_12_http_parser_cb_on_chunk_complete(struct http_parser *__pyx_v_parser) { struct __pyx_obj_7aiohttp_12_http_parser_HttpParser *__pyx_v_pyparser = 0; PyObject *__pyx_v_exc = NULL; int __pyx_r; __Pyx_RefNannyDeclarations PyObject *__pyx_t_1 = NULL; PyObject *__pyx_t_2 = NULL; PyObject *__pyx_t_3 = NULL; PyObject *__pyx_t_4 = NULL; int __pyx_t_5; PyObject *__pyx_t_6 = NULL; PyObject *__pyx_t_7 = NULL; __Pyx_RefNannySetupContext("cb_on_chunk_complete", 0); /* "aiohttp/_http_parser.pyx":716 * * cdef int cb_on_chunk_complete(cparser.http_parser* parser) except -1: * cdef HttpParser pyparser = parser.data # <<<<<<<<<<<<<< * try: * pyparser._on_chunk_complete() */ __pyx_t_1 = ((PyObject *)__pyx_v_parser->data); __Pyx_INCREF(__pyx_t_1); __pyx_v_pyparser = ((struct __pyx_obj_7aiohttp_12_http_parser_HttpParser *)__pyx_t_1); __pyx_t_1 = 0; /* "aiohttp/_http_parser.pyx":717 * cdef int cb_on_chunk_complete(cparser.http_parser* parser) except -1: * cdef HttpParser pyparser = parser.data * try: # <<<<<<<<<<<<<< * pyparser._on_chunk_complete() * except BaseException as exc: */ { __Pyx_PyThreadState_declare __Pyx_PyThreadState_assign __Pyx_ExceptionSave(&__pyx_t_2, &__pyx_t_3, &__pyx_t_4); __Pyx_XGOTREF(__pyx_t_2); __Pyx_XGOTREF(__pyx_t_3); __Pyx_XGOTREF(__pyx_t_4); /*try:*/ { /* "aiohttp/_http_parser.pyx":718 * cdef HttpParser pyparser = parser.data * try: * pyparser._on_chunk_complete() # <<<<<<<<<<<<<< * except BaseException as exc: * pyparser._last_error = exc */ __pyx_t_1 = ((struct __pyx_vtabstruct_7aiohttp_12_http_parser_HttpParser *)__pyx_v_pyparser->__pyx_vtab)->_on_chunk_complete(__pyx_v_pyparser); if (unlikely(!__pyx_t_1)) __PYX_ERR(0, 718, __pyx_L3_error) __Pyx_GOTREF(__pyx_t_1); __Pyx_DECREF(__pyx_t_1); __pyx_t_1 = 0; /* "aiohttp/_http_parser.pyx":717 * cdef int cb_on_chunk_complete(cparser.http_parser* parser) except -1: * cdef HttpParser pyparser = parser.data * try: # <<<<<<<<<<<<<< * pyparser._on_chunk_complete() * except BaseException as exc: */ } /* "aiohttp/_http_parser.pyx":723 * return -1 * else: * return 0 # <<<<<<<<<<<<<< * * */ /*else:*/ { __pyx_r = 0; goto __pyx_L6_except_return; } __pyx_L3_error:; __Pyx_XDECREF(__pyx_t_1); __pyx_t_1 = 0; /* "aiohttp/_http_parser.pyx":719 * try: * pyparser._on_chunk_complete() * except BaseException as exc: # <<<<<<<<<<<<<< * pyparser._last_error = exc * return -1 */ __pyx_t_5 = __Pyx_PyErr_ExceptionMatches(__pyx_builtin_BaseException); if (__pyx_t_5) { __Pyx_AddTraceback("aiohttp._http_parser.cb_on_chunk_complete", __pyx_clineno, __pyx_lineno, __pyx_filename); if (__Pyx_GetException(&__pyx_t_1, &__pyx_t_6, &__pyx_t_7) < 0) __PYX_ERR(0, 719, __pyx_L5_except_error) __Pyx_GOTREF(__pyx_t_1); __Pyx_GOTREF(__pyx_t_6); __Pyx_GOTREF(__pyx_t_7); __Pyx_INCREF(__pyx_t_6); __pyx_v_exc = __pyx_t_6; /*try:*/ { /* "aiohttp/_http_parser.pyx":720 * 
pyparser._on_chunk_complete() * except BaseException as exc: * pyparser._last_error = exc # <<<<<<<<<<<<<< * return -1 * else: */ __Pyx_INCREF(__pyx_v_exc); __Pyx_GIVEREF(__pyx_v_exc); __Pyx_GOTREF(__pyx_v_pyparser->_last_error); __Pyx_DECREF(__pyx_v_pyparser->_last_error); __pyx_v_pyparser->_last_error = __pyx_v_exc; /* "aiohttp/_http_parser.pyx":721 * except BaseException as exc: * pyparser._last_error = exc * return -1 # <<<<<<<<<<<<<< * else: * return 0 */ __pyx_r = -1; __Pyx_DECREF(__pyx_t_1); __pyx_t_1 = 0; __Pyx_DECREF(__pyx_t_6); __pyx_t_6 = 0; __Pyx_DECREF(__pyx_t_7); __pyx_t_7 = 0; goto __pyx_L13_return; } /* "aiohttp/_http_parser.pyx":719 * try: * pyparser._on_chunk_complete() * except BaseException as exc: # <<<<<<<<<<<<<< * pyparser._last_error = exc * return -1 */ /*finally:*/ { __pyx_L13_return: { __pyx_t_5 = __pyx_r; __Pyx_DECREF(__pyx_v_exc); __pyx_v_exc = NULL; __pyx_r = __pyx_t_5; goto __pyx_L6_except_return; } } } goto __pyx_L5_except_error; __pyx_L5_except_error:; /* "aiohttp/_http_parser.pyx":717 * cdef int cb_on_chunk_complete(cparser.http_parser* parser) except -1: * cdef HttpParser pyparser = parser.data * try: # <<<<<<<<<<<<<< * pyparser._on_chunk_complete() * except BaseException as exc: */ __Pyx_XGIVEREF(__pyx_t_2); __Pyx_XGIVEREF(__pyx_t_3); __Pyx_XGIVEREF(__pyx_t_4); __Pyx_ExceptionReset(__pyx_t_2, __pyx_t_3, __pyx_t_4); goto __pyx_L1_error; __pyx_L6_except_return:; __Pyx_XGIVEREF(__pyx_t_2); __Pyx_XGIVEREF(__pyx_t_3); __Pyx_XGIVEREF(__pyx_t_4); __Pyx_ExceptionReset(__pyx_t_2, __pyx_t_3, __pyx_t_4); goto __pyx_L0; } /* "aiohttp/_http_parser.pyx":715 * * * cdef int cb_on_chunk_complete(cparser.http_parser* parser) except -1: # <<<<<<<<<<<<<< * cdef HttpParser pyparser = parser.data * try: */ /* function exit code */ __pyx_L1_error:; __Pyx_XDECREF(__pyx_t_1); __Pyx_XDECREF(__pyx_t_6); __Pyx_XDECREF(__pyx_t_7); __Pyx_AddTraceback("aiohttp._http_parser.cb_on_chunk_complete", __pyx_clineno, __pyx_lineno, __pyx_filename); __pyx_r = -1; __pyx_L0:; __Pyx_XDECREF((PyObject *)__pyx_v_pyparser); __Pyx_XDECREF(__pyx_v_exc); __Pyx_RefNannyFinishContext(); return __pyx_r; } /* "aiohttp/_http_parser.pyx":726 * * * cdef parser_error_from_errno(cparser.http_errno errno): # <<<<<<<<<<<<<< * cdef bytes desc = cparser.http_errno_description(errno) * */ static PyObject *__pyx_f_7aiohttp_12_http_parser_parser_error_from_errno(enum http_errno __pyx_v_errno) { PyObject *__pyx_v_desc = 0; PyObject *__pyx_v_cls = NULL; PyObject *__pyx_r = NULL; __Pyx_RefNannyDeclarations PyObject *__pyx_t_1 = NULL; PyObject *__pyx_t_2 = NULL; PyObject *__pyx_t_3 = NULL; PyObject *__pyx_t_4 = NULL; __Pyx_RefNannySetupContext("parser_error_from_errno", 0); /* "aiohttp/_http_parser.pyx":727 * * cdef parser_error_from_errno(cparser.http_errno errno): * cdef bytes desc = cparser.http_errno_description(errno) # <<<<<<<<<<<<<< * * if errno in (cparser.HPE_CB_message_begin, */ __pyx_t_1 = __Pyx_PyBytes_FromString(http_errno_description(__pyx_v_errno)); if (unlikely(!__pyx_t_1)) __PYX_ERR(0, 727, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_1); __pyx_v_desc = ((PyObject*)__pyx_t_1); __pyx_t_1 = 0; /* "aiohttp/_http_parser.pyx":729 * cdef bytes desc = cparser.http_errno_description(errno) * * if errno in (cparser.HPE_CB_message_begin, # <<<<<<<<<<<<<< * cparser.HPE_CB_url, * cparser.HPE_CB_header_field, */ switch (__pyx_v_errno) { case HPE_CB_message_begin: case HPE_CB_url: /* "aiohttp/_http_parser.pyx":730 * * if errno in (cparser.HPE_CB_message_begin, * cparser.HPE_CB_url, # <<<<<<<<<<<<<< * 
cparser.HPE_CB_header_field, * cparser.HPE_CB_header_value, */ case HPE_CB_header_field: /* "aiohttp/_http_parser.pyx":731 * if errno in (cparser.HPE_CB_message_begin, * cparser.HPE_CB_url, * cparser.HPE_CB_header_field, # <<<<<<<<<<<<<< * cparser.HPE_CB_header_value, * cparser.HPE_CB_headers_complete, */ case HPE_CB_header_value: /* "aiohttp/_http_parser.pyx":732 * cparser.HPE_CB_url, * cparser.HPE_CB_header_field, * cparser.HPE_CB_header_value, # <<<<<<<<<<<<<< * cparser.HPE_CB_headers_complete, * cparser.HPE_CB_body, */ case HPE_CB_headers_complete: /* "aiohttp/_http_parser.pyx":733 * cparser.HPE_CB_header_field, * cparser.HPE_CB_header_value, * cparser.HPE_CB_headers_complete, # <<<<<<<<<<<<<< * cparser.HPE_CB_body, * cparser.HPE_CB_message_complete, */ case HPE_CB_body: /* "aiohttp/_http_parser.pyx":734 * cparser.HPE_CB_header_value, * cparser.HPE_CB_headers_complete, * cparser.HPE_CB_body, # <<<<<<<<<<<<<< * cparser.HPE_CB_message_complete, * cparser.HPE_CB_status, */ case HPE_CB_message_complete: /* "aiohttp/_http_parser.pyx":735 * cparser.HPE_CB_headers_complete, * cparser.HPE_CB_body, * cparser.HPE_CB_message_complete, # <<<<<<<<<<<<<< * cparser.HPE_CB_status, * cparser.HPE_CB_chunk_header, */ case HPE_CB_status: /* "aiohttp/_http_parser.pyx":736 * cparser.HPE_CB_body, * cparser.HPE_CB_message_complete, * cparser.HPE_CB_status, # <<<<<<<<<<<<<< * cparser.HPE_CB_chunk_header, * cparser.HPE_CB_chunk_complete): */ case HPE_CB_chunk_header: /* "aiohttp/_http_parser.pyx":737 * cparser.HPE_CB_message_complete, * cparser.HPE_CB_status, * cparser.HPE_CB_chunk_header, # <<<<<<<<<<<<<< * cparser.HPE_CB_chunk_complete): * cls = BadHttpMessage */ case HPE_CB_chunk_complete: /* "aiohttp/_http_parser.pyx":739 * cparser.HPE_CB_chunk_header, * cparser.HPE_CB_chunk_complete): * cls = BadHttpMessage # <<<<<<<<<<<<<< * * elif errno == cparser.HPE_INVALID_STATUS: */ __Pyx_GetModuleGlobalName(__pyx_t_1, __pyx_n_s_BadHttpMessage); if (unlikely(!__pyx_t_1)) __PYX_ERR(0, 739, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_1); __pyx_v_cls = __pyx_t_1; __pyx_t_1 = 0; /* "aiohttp/_http_parser.pyx":729 * cdef bytes desc = cparser.http_errno_description(errno) * * if errno in (cparser.HPE_CB_message_begin, # <<<<<<<<<<<<<< * cparser.HPE_CB_url, * cparser.HPE_CB_header_field, */ break; case HPE_INVALID_STATUS: /* "aiohttp/_http_parser.pyx":742 * * elif errno == cparser.HPE_INVALID_STATUS: * cls = BadStatusLine # <<<<<<<<<<<<<< * * elif errno == cparser.HPE_INVALID_METHOD: */ __Pyx_GetModuleGlobalName(__pyx_t_1, __pyx_n_s_BadStatusLine); if (unlikely(!__pyx_t_1)) __PYX_ERR(0, 742, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_1); __pyx_v_cls = __pyx_t_1; __pyx_t_1 = 0; /* "aiohttp/_http_parser.pyx":741 * cls = BadHttpMessage * * elif errno == cparser.HPE_INVALID_STATUS: # <<<<<<<<<<<<<< * cls = BadStatusLine * */ break; case HPE_INVALID_METHOD: /* "aiohttp/_http_parser.pyx":745 * * elif errno == cparser.HPE_INVALID_METHOD: * cls = BadStatusLine # <<<<<<<<<<<<<< * * elif errno == cparser.HPE_INVALID_URL: */ __Pyx_GetModuleGlobalName(__pyx_t_1, __pyx_n_s_BadStatusLine); if (unlikely(!__pyx_t_1)) __PYX_ERR(0, 745, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_1); __pyx_v_cls = __pyx_t_1; __pyx_t_1 = 0; /* "aiohttp/_http_parser.pyx":744 * cls = BadStatusLine * * elif errno == cparser.HPE_INVALID_METHOD: # <<<<<<<<<<<<<< * cls = BadStatusLine * */ break; case HPE_INVALID_URL: /* "aiohttp/_http_parser.pyx":748 * * elif errno == cparser.HPE_INVALID_URL: * cls = InvalidURLError # <<<<<<<<<<<<<< * * else: */ __Pyx_GetModuleGlobalName(__pyx_t_1, 
__pyx_n_s_InvalidURLError); if (unlikely(!__pyx_t_1)) __PYX_ERR(0, 748, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_1); __pyx_v_cls = __pyx_t_1; __pyx_t_1 = 0; /* "aiohttp/_http_parser.pyx":747 * cls = BadStatusLine * * elif errno == cparser.HPE_INVALID_URL: # <<<<<<<<<<<<<< * cls = InvalidURLError * */ break; default: /* "aiohttp/_http_parser.pyx":751 * * else: * cls = BadHttpMessage # <<<<<<<<<<<<<< * * return cls(desc.decode('latin-1')) */ __Pyx_GetModuleGlobalName(__pyx_t_1, __pyx_n_s_BadHttpMessage); if (unlikely(!__pyx_t_1)) __PYX_ERR(0, 751, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_1); __pyx_v_cls = __pyx_t_1; __pyx_t_1 = 0; break; } /* "aiohttp/_http_parser.pyx":753 * cls = BadHttpMessage * * return cls(desc.decode('latin-1')) # <<<<<<<<<<<<<< * * */ __Pyx_XDECREF(__pyx_r); __pyx_t_2 = __Pyx_decode_bytes(__pyx_v_desc, 0, PY_SSIZE_T_MAX, NULL, NULL, PyUnicode_DecodeLatin1); if (unlikely(!__pyx_t_2)) __PYX_ERR(0, 753, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_2); __Pyx_INCREF(__pyx_v_cls); __pyx_t_3 = __pyx_v_cls; __pyx_t_4 = NULL; if (CYTHON_UNPACK_METHODS && unlikely(PyMethod_Check(__pyx_t_3))) { __pyx_t_4 = PyMethod_GET_SELF(__pyx_t_3); if (likely(__pyx_t_4)) { PyObject* function = PyMethod_GET_FUNCTION(__pyx_t_3); __Pyx_INCREF(__pyx_t_4); __Pyx_INCREF(function); __Pyx_DECREF_SET(__pyx_t_3, function); } } __pyx_t_1 = (__pyx_t_4) ? __Pyx_PyObject_Call2Args(__pyx_t_3, __pyx_t_4, __pyx_t_2) : __Pyx_PyObject_CallOneArg(__pyx_t_3, __pyx_t_2); __Pyx_XDECREF(__pyx_t_4); __pyx_t_4 = 0; __Pyx_DECREF(__pyx_t_2); __pyx_t_2 = 0; if (unlikely(!__pyx_t_1)) __PYX_ERR(0, 753, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_1); __Pyx_DECREF(__pyx_t_3); __pyx_t_3 = 0; __pyx_r = __pyx_t_1; __pyx_t_1 = 0; goto __pyx_L0; /* "aiohttp/_http_parser.pyx":726 * * * cdef parser_error_from_errno(cparser.http_errno errno): # <<<<<<<<<<<<<< * cdef bytes desc = cparser.http_errno_description(errno) * */ /* function exit code */ __pyx_L1_error:; __Pyx_XDECREF(__pyx_t_1); __Pyx_XDECREF(__pyx_t_2); __Pyx_XDECREF(__pyx_t_3); __Pyx_XDECREF(__pyx_t_4); __Pyx_AddTraceback("aiohttp._http_parser.parser_error_from_errno", __pyx_clineno, __pyx_lineno, __pyx_filename); __pyx_r = 0; __pyx_L0:; __Pyx_XDECREF(__pyx_v_desc); __Pyx_XDECREF(__pyx_v_cls); __Pyx_XGIVEREF(__pyx_r); __Pyx_RefNannyFinishContext(); return __pyx_r; } /* "aiohttp/_http_parser.pyx":756 * * * def parse_url(url): # <<<<<<<<<<<<<< * cdef: * Py_buffer py_buf */ /* Python wrapper */ static PyObject *__pyx_pw_7aiohttp_12_http_parser_1parse_url(PyObject *__pyx_self, PyObject *__pyx_v_url); /*proto*/ static PyMethodDef __pyx_mdef_7aiohttp_12_http_parser_1parse_url = {"parse_url", (PyCFunction)__pyx_pw_7aiohttp_12_http_parser_1parse_url, METH_O, 0}; static PyObject *__pyx_pw_7aiohttp_12_http_parser_1parse_url(PyObject *__pyx_self, PyObject *__pyx_v_url) { PyObject *__pyx_r = 0; __Pyx_RefNannyDeclarations __Pyx_RefNannySetupContext("parse_url (wrapper)", 0); __pyx_r = __pyx_pf_7aiohttp_12_http_parser_parse_url(__pyx_self, ((PyObject *)__pyx_v_url)); /* function exit code */ __Pyx_RefNannyFinishContext(); return __pyx_r; } static PyObject *__pyx_pf_7aiohttp_12_http_parser_parse_url(CYTHON_UNUSED PyObject *__pyx_self, PyObject *__pyx_v_url) { Py_buffer __pyx_v_py_buf; char *__pyx_v_buf_data; PyObject *__pyx_r = NULL; __Pyx_RefNannyDeclarations int __pyx_t_1; PyObject *__pyx_t_2 = NULL; int __pyx_t_3; char const *__pyx_t_4; PyObject *__pyx_t_5 = NULL; PyObject *__pyx_t_6 = NULL; PyObject *__pyx_t_7 = NULL; PyObject *__pyx_t_8 = NULL; PyObject *__pyx_t_9 = NULL; PyObject *__pyx_t_10 = NULL; 
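/* parser_error_from_errno (above) maps the C parser's http_errno onto aiohttp's exception
   hierarchy: every HPE_CB_* callback error becomes BadHttpMessage, HPE_INVALID_STATUS and
   HPE_INVALID_METHOD become BadStatusLine, HPE_INVALID_URL becomes InvalidURLError, and
   anything else falls back to BadHttpMessage; the message text is
   http_errno_description(errno) decoded as latin-1.  Rebuilt from the banner comments above
   (blank lines between branches dropped):

       cdef parser_error_from_errno(cparser.http_errno errno):
           cdef bytes desc = cparser.http_errno_description(errno)
           if errno in (cparser.HPE_CB_message_begin,
                        cparser.HPE_CB_url,
                        cparser.HPE_CB_header_field,
                        cparser.HPE_CB_header_value,
                        cparser.HPE_CB_headers_complete,
                        cparser.HPE_CB_body,
                        cparser.HPE_CB_message_complete,
                        cparser.HPE_CB_status,
                        cparser.HPE_CB_chunk_header,
                        cparser.HPE_CB_chunk_complete):
               cls = BadHttpMessage
           elif errno == cparser.HPE_INVALID_STATUS:
               cls = BadStatusLine
           elif errno == cparser.HPE_INVALID_METHOD:
               cls = BadStatusLine
           elif errno == cparser.HPE_INVALID_URL:
               cls = InvalidURLError
           else:
               cls = BadHttpMessage
           return cls(desc.decode('latin-1'))
 */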
__Pyx_RefNannySetupContext("parse_url", 0); /* "aiohttp/_http_parser.pyx":761 * char* buf_data * * PyObject_GetBuffer(url, &py_buf, PyBUF_SIMPLE) # <<<<<<<<<<<<<< * try: * buf_data = py_buf.buf */ __pyx_t_1 = PyObject_GetBuffer(__pyx_v_url, (&__pyx_v_py_buf), PyBUF_SIMPLE); if (unlikely(__pyx_t_1 == ((int)-1))) __PYX_ERR(0, 761, __pyx_L1_error) /* "aiohttp/_http_parser.pyx":762 * * PyObject_GetBuffer(url, &py_buf, PyBUF_SIMPLE) * try: # <<<<<<<<<<<<<< * buf_data = py_buf.buf * return _parse_url(buf_data, py_buf.len) */ /*try:*/ { /* "aiohttp/_http_parser.pyx":763 * PyObject_GetBuffer(url, &py_buf, PyBUF_SIMPLE) * try: * buf_data = py_buf.buf # <<<<<<<<<<<<<< * return _parse_url(buf_data, py_buf.len) * finally: */ __pyx_v_buf_data = ((char *)__pyx_v_py_buf.buf); /* "aiohttp/_http_parser.pyx":764 * try: * buf_data = py_buf.buf * return _parse_url(buf_data, py_buf.len) # <<<<<<<<<<<<<< * finally: * PyBuffer_Release(&py_buf) */ __Pyx_XDECREF(__pyx_r); __pyx_t_2 = __pyx_f_7aiohttp_12_http_parser__parse_url(__pyx_v_buf_data, __pyx_v_py_buf.len); if (unlikely(!__pyx_t_2)) __PYX_ERR(0, 764, __pyx_L4_error) __Pyx_GOTREF(__pyx_t_2); __pyx_r = __pyx_t_2; __pyx_t_2 = 0; goto __pyx_L3_return; } /* "aiohttp/_http_parser.pyx":766 * return _parse_url(buf_data, py_buf.len) * finally: * PyBuffer_Release(&py_buf) # <<<<<<<<<<<<<< * * */ /*finally:*/ { __pyx_L4_error:; /*exception exit:*/{ __Pyx_PyThreadState_declare __Pyx_PyThreadState_assign __pyx_t_5 = 0; __pyx_t_6 = 0; __pyx_t_7 = 0; __pyx_t_8 = 0; __pyx_t_9 = 0; __pyx_t_10 = 0; __Pyx_XDECREF(__pyx_t_2); __pyx_t_2 = 0; if (PY_MAJOR_VERSION >= 3) __Pyx_ExceptionSwap(&__pyx_t_8, &__pyx_t_9, &__pyx_t_10); if ((PY_MAJOR_VERSION < 3) || unlikely(__Pyx_GetException(&__pyx_t_5, &__pyx_t_6, &__pyx_t_7) < 0)) __Pyx_ErrFetch(&__pyx_t_5, &__pyx_t_6, &__pyx_t_7); __Pyx_XGOTREF(__pyx_t_5); __Pyx_XGOTREF(__pyx_t_6); __Pyx_XGOTREF(__pyx_t_7); __Pyx_XGOTREF(__pyx_t_8); __Pyx_XGOTREF(__pyx_t_9); __Pyx_XGOTREF(__pyx_t_10); __pyx_t_1 = __pyx_lineno; __pyx_t_3 = __pyx_clineno; __pyx_t_4 = __pyx_filename; { PyBuffer_Release((&__pyx_v_py_buf)); } if (PY_MAJOR_VERSION >= 3) { __Pyx_XGIVEREF(__pyx_t_8); __Pyx_XGIVEREF(__pyx_t_9); __Pyx_XGIVEREF(__pyx_t_10); __Pyx_ExceptionReset(__pyx_t_8, __pyx_t_9, __pyx_t_10); } __Pyx_XGIVEREF(__pyx_t_5); __Pyx_XGIVEREF(__pyx_t_6); __Pyx_XGIVEREF(__pyx_t_7); __Pyx_ErrRestore(__pyx_t_5, __pyx_t_6, __pyx_t_7); __pyx_t_5 = 0; __pyx_t_6 = 0; __pyx_t_7 = 0; __pyx_t_8 = 0; __pyx_t_9 = 0; __pyx_t_10 = 0; __pyx_lineno = __pyx_t_1; __pyx_clineno = __pyx_t_3; __pyx_filename = __pyx_t_4; goto __pyx_L1_error; } __pyx_L3_return: { __pyx_t_10 = __pyx_r; __pyx_r = 0; PyBuffer_Release((&__pyx_v_py_buf)); __pyx_r = __pyx_t_10; __pyx_t_10 = 0; goto __pyx_L0; } } /* "aiohttp/_http_parser.pyx":756 * * * def parse_url(url): # <<<<<<<<<<<<<< * cdef: * Py_buffer py_buf */ /* function exit code */ __pyx_L1_error:; __Pyx_XDECREF(__pyx_t_2); __Pyx_AddTraceback("aiohttp._http_parser.parse_url", __pyx_clineno, __pyx_lineno, __pyx_filename); __pyx_r = NULL; __pyx_L0:; __Pyx_XGIVEREF(__pyx_r); __Pyx_RefNannyFinishContext(); return __pyx_r; } /* "aiohttp/_http_parser.pyx":769 * * * cdef _parse_url(char* buf_data, size_t length): # <<<<<<<<<<<<<< * cdef: * cparser.http_parser_url* parsed */ static PyObject *__pyx_f_7aiohttp_12_http_parser__parse_url(char *__pyx_v_buf_data, size_t __pyx_v_length) { struct http_parser_url *__pyx_v_parsed; int __pyx_v_res; PyObject *__pyx_v_schema = 0; PyObject *__pyx_v_host = 0; PyObject *__pyx_v_port = 0; PyObject *__pyx_v_path = 0; PyObject 
*__pyx_v_query = 0; PyObject *__pyx_v_fragment = 0; PyObject *__pyx_v_user = 0; PyObject *__pyx_v_password = 0; PyObject *__pyx_v_userinfo = 0; CYTHON_UNUSED PyObject *__pyx_v_result = 0; int __pyx_v_off; int __pyx_v_ln; CYTHON_UNUSED PyObject *__pyx_v_sep = NULL; PyObject *__pyx_r = NULL; __Pyx_RefNannyDeclarations int __pyx_t_1; uint16_t __pyx_t_2; PyObject *__pyx_t_3 = NULL; PyObject *__pyx_t_4 = NULL; PyObject *__pyx_t_5 = NULL; PyObject *__pyx_t_6 = NULL; PyObject *__pyx_t_7 = NULL; PyObject *(*__pyx_t_8)(PyObject *); PyObject *__pyx_t_9 = NULL; int __pyx_t_10; int __pyx_t_11; char const *__pyx_t_12; PyObject *__pyx_t_13 = NULL; PyObject *__pyx_t_14 = NULL; PyObject *__pyx_t_15 = NULL; PyObject *__pyx_t_16 = NULL; PyObject *__pyx_t_17 = NULL; PyObject *__pyx_t_18 = NULL; __Pyx_RefNannySetupContext("_parse_url", 0); /* "aiohttp/_http_parser.pyx":773 * cparser.http_parser_url* parsed * int res * str schema = None # <<<<<<<<<<<<<< * str host = None * object port = None */ __Pyx_INCREF(Py_None); __pyx_v_schema = ((PyObject*)Py_None); /* "aiohttp/_http_parser.pyx":774 * int res * str schema = None * str host = None # <<<<<<<<<<<<<< * object port = None * str path = None */ __Pyx_INCREF(Py_None); __pyx_v_host = ((PyObject*)Py_None); /* "aiohttp/_http_parser.pyx":775 * str schema = None * str host = None * object port = None # <<<<<<<<<<<<<< * str path = None * str query = None */ __Pyx_INCREF(Py_None); __pyx_v_port = Py_None; /* "aiohttp/_http_parser.pyx":776 * str host = None * object port = None * str path = None # <<<<<<<<<<<<<< * str query = None * str fragment = None */ __Pyx_INCREF(Py_None); __pyx_v_path = ((PyObject*)Py_None); /* "aiohttp/_http_parser.pyx":777 * object port = None * str path = None * str query = None # <<<<<<<<<<<<<< * str fragment = None * str user = None */ __Pyx_INCREF(Py_None); __pyx_v_query = ((PyObject*)Py_None); /* "aiohttp/_http_parser.pyx":778 * str path = None * str query = None * str fragment = None # <<<<<<<<<<<<<< * str user = None * str password = None */ __Pyx_INCREF(Py_None); __pyx_v_fragment = ((PyObject*)Py_None); /* "aiohttp/_http_parser.pyx":779 * str query = None * str fragment = None * str user = None # <<<<<<<<<<<<<< * str password = None * str userinfo = None */ __Pyx_INCREF(Py_None); __pyx_v_user = ((PyObject*)Py_None); /* "aiohttp/_http_parser.pyx":780 * str fragment = None * str user = None * str password = None # <<<<<<<<<<<<<< * str userinfo = None * object result = None */ __Pyx_INCREF(Py_None); __pyx_v_password = ((PyObject*)Py_None); /* "aiohttp/_http_parser.pyx":781 * str user = None * str password = None * str userinfo = None # <<<<<<<<<<<<<< * object result = None * int off */ __Pyx_INCREF(Py_None); __pyx_v_userinfo = ((PyObject*)Py_None); /* "aiohttp/_http_parser.pyx":782 * str password = None * str userinfo = None * object result = None # <<<<<<<<<<<<<< * int off * int ln */ __Pyx_INCREF(Py_None); __pyx_v_result = Py_None; /* "aiohttp/_http_parser.pyx":786 * int ln * * parsed = \ # <<<<<<<<<<<<<< * PyMem_Malloc(sizeof(cparser.http_parser_url)) * if parsed is NULL: */ __pyx_v_parsed = ((struct http_parser_url *)PyMem_Malloc((sizeof(struct http_parser_url)))); /* "aiohttp/_http_parser.pyx":788 * parsed = \ * PyMem_Malloc(sizeof(cparser.http_parser_url)) * if parsed is NULL: # <<<<<<<<<<<<<< * raise MemoryError() * cparser.http_parser_url_init(parsed) */ __pyx_t_1 = ((__pyx_v_parsed == NULL) != 0); if (unlikely(__pyx_t_1)) { /* "aiohttp/_http_parser.pyx":789 * PyMem_Malloc(sizeof(cparser.http_parser_url)) * if parsed is NULL: * raise 
MemoryError() # <<<<<<<<<<<<<< * cparser.http_parser_url_init(parsed) * try: */ PyErr_NoMemory(); __PYX_ERR(0, 789, __pyx_L1_error) /* "aiohttp/_http_parser.pyx":788 * parsed = \ * PyMem_Malloc(sizeof(cparser.http_parser_url)) * if parsed is NULL: # <<<<<<<<<<<<<< * raise MemoryError() * cparser.http_parser_url_init(parsed) */ } /* "aiohttp/_http_parser.pyx":790 * if parsed is NULL: * raise MemoryError() * cparser.http_parser_url_init(parsed) # <<<<<<<<<<<<<< * try: * res = cparser.http_parser_parse_url(buf_data, length, 0, parsed) */ http_parser_url_init(__pyx_v_parsed); /* "aiohttp/_http_parser.pyx":791 * raise MemoryError() * cparser.http_parser_url_init(parsed) * try: # <<<<<<<<<<<<<< * res = cparser.http_parser_parse_url(buf_data, length, 0, parsed) * */ /*try:*/ { /* "aiohttp/_http_parser.pyx":792 * cparser.http_parser_url_init(parsed) * try: * res = cparser.http_parser_parse_url(buf_data, length, 0, parsed) # <<<<<<<<<<<<<< * * if res == 0: */ __pyx_v_res = http_parser_parse_url(__pyx_v_buf_data, __pyx_v_length, 0, __pyx_v_parsed); /* "aiohttp/_http_parser.pyx":794 * res = cparser.http_parser_parse_url(buf_data, length, 0, parsed) * * if res == 0: # <<<<<<<<<<<<<< * if parsed.field_set & (1 << cparser.UF_SCHEMA): * off = parsed.field_data[cparser.UF_SCHEMA].off */ __pyx_t_1 = ((__pyx_v_res == 0) != 0); if (likely(__pyx_t_1)) { /* "aiohttp/_http_parser.pyx":795 * * if res == 0: * if parsed.field_set & (1 << cparser.UF_SCHEMA): # <<<<<<<<<<<<<< * off = parsed.field_data[cparser.UF_SCHEMA].off * ln = parsed.field_data[cparser.UF_SCHEMA].len */ __pyx_t_1 = ((__pyx_v_parsed->field_set & (1 << UF_SCHEMA)) != 0); if (__pyx_t_1) { /* "aiohttp/_http_parser.pyx":796 * if res == 0: * if parsed.field_set & (1 << cparser.UF_SCHEMA): * off = parsed.field_data[cparser.UF_SCHEMA].off # <<<<<<<<<<<<<< * ln = parsed.field_data[cparser.UF_SCHEMA].len * schema = buf_data[off:off+ln].decode('utf-8', 'surrogateescape') */ __pyx_t_2 = (__pyx_v_parsed->field_data[((int)UF_SCHEMA)]).off; __pyx_v_off = __pyx_t_2; /* "aiohttp/_http_parser.pyx":797 * if parsed.field_set & (1 << cparser.UF_SCHEMA): * off = parsed.field_data[cparser.UF_SCHEMA].off * ln = parsed.field_data[cparser.UF_SCHEMA].len # <<<<<<<<<<<<<< * schema = buf_data[off:off+ln].decode('utf-8', 'surrogateescape') * else: */ __pyx_t_2 = (__pyx_v_parsed->field_data[((int)UF_SCHEMA)]).len; __pyx_v_ln = __pyx_t_2; /* "aiohttp/_http_parser.pyx":798 * off = parsed.field_data[cparser.UF_SCHEMA].off * ln = parsed.field_data[cparser.UF_SCHEMA].len * schema = buf_data[off:off+ln].decode('utf-8', 'surrogateescape') # <<<<<<<<<<<<<< * else: * schema = '' */ __pyx_t_3 = __Pyx_decode_c_string(__pyx_v_buf_data, __pyx_v_off, (__pyx_v_off + __pyx_v_ln), NULL, ((char const *)"surrogateescape"), PyUnicode_DecodeUTF8); if (unlikely(!__pyx_t_3)) __PYX_ERR(0, 798, __pyx_L5_error) __Pyx_GOTREF(__pyx_t_3); __Pyx_DECREF_SET(__pyx_v_schema, ((PyObject*)__pyx_t_3)); __pyx_t_3 = 0; /* "aiohttp/_http_parser.pyx":795 * * if res == 0: * if parsed.field_set & (1 << cparser.UF_SCHEMA): # <<<<<<<<<<<<<< * off = parsed.field_data[cparser.UF_SCHEMA].off * ln = parsed.field_data[cparser.UF_SCHEMA].len */ goto __pyx_L8; } /* "aiohttp/_http_parser.pyx":800 * schema = buf_data[off:off+ln].decode('utf-8', 'surrogateescape') * else: * schema = '' # <<<<<<<<<<<<<< * * if parsed.field_set & (1 << cparser.UF_HOST): */ /*else*/ { __Pyx_INCREF(__pyx_kp_u__4); __Pyx_DECREF_SET(__pyx_v_schema, __pyx_kp_u__4); } __pyx_L8:; /* "aiohttp/_http_parser.pyx":802 * schema = '' * * if parsed.field_set & 
(1 << cparser.UF_HOST): # <<<<<<<<<<<<<< * off = parsed.field_data[cparser.UF_HOST].off * ln = parsed.field_data[cparser.UF_HOST].len */ __pyx_t_1 = ((__pyx_v_parsed->field_set & (1 << UF_HOST)) != 0); if (__pyx_t_1) { /* "aiohttp/_http_parser.pyx":803 * * if parsed.field_set & (1 << cparser.UF_HOST): * off = parsed.field_data[cparser.UF_HOST].off # <<<<<<<<<<<<<< * ln = parsed.field_data[cparser.UF_HOST].len * host = buf_data[off:off+ln].decode('utf-8', 'surrogateescape') */ __pyx_t_2 = (__pyx_v_parsed->field_data[((int)UF_HOST)]).off; __pyx_v_off = __pyx_t_2; /* "aiohttp/_http_parser.pyx":804 * if parsed.field_set & (1 << cparser.UF_HOST): * off = parsed.field_data[cparser.UF_HOST].off * ln = parsed.field_data[cparser.UF_HOST].len # <<<<<<<<<<<<<< * host = buf_data[off:off+ln].decode('utf-8', 'surrogateescape') * else: */ __pyx_t_2 = (__pyx_v_parsed->field_data[((int)UF_HOST)]).len; __pyx_v_ln = __pyx_t_2; /* "aiohttp/_http_parser.pyx":805 * off = parsed.field_data[cparser.UF_HOST].off * ln = parsed.field_data[cparser.UF_HOST].len * host = buf_data[off:off+ln].decode('utf-8', 'surrogateescape') # <<<<<<<<<<<<<< * else: * host = '' */ __pyx_t_3 = __Pyx_decode_c_string(__pyx_v_buf_data, __pyx_v_off, (__pyx_v_off + __pyx_v_ln), NULL, ((char const *)"surrogateescape"), PyUnicode_DecodeUTF8); if (unlikely(!__pyx_t_3)) __PYX_ERR(0, 805, __pyx_L5_error) __Pyx_GOTREF(__pyx_t_3); __Pyx_DECREF_SET(__pyx_v_host, ((PyObject*)__pyx_t_3)); __pyx_t_3 = 0; /* "aiohttp/_http_parser.pyx":802 * schema = '' * * if parsed.field_set & (1 << cparser.UF_HOST): # <<<<<<<<<<<<<< * off = parsed.field_data[cparser.UF_HOST].off * ln = parsed.field_data[cparser.UF_HOST].len */ goto __pyx_L9; } /* "aiohttp/_http_parser.pyx":807 * host = buf_data[off:off+ln].decode('utf-8', 'surrogateescape') * else: * host = '' # <<<<<<<<<<<<<< * * if parsed.field_set & (1 << cparser.UF_PORT): */ /*else*/ { __Pyx_INCREF(__pyx_kp_u__4); __Pyx_DECREF_SET(__pyx_v_host, __pyx_kp_u__4); } __pyx_L9:; /* "aiohttp/_http_parser.pyx":809 * host = '' * * if parsed.field_set & (1 << cparser.UF_PORT): # <<<<<<<<<<<<<< * port = parsed.port * */ __pyx_t_1 = ((__pyx_v_parsed->field_set & (1 << UF_PORT)) != 0); if (__pyx_t_1) { /* "aiohttp/_http_parser.pyx":810 * * if parsed.field_set & (1 << cparser.UF_PORT): * port = parsed.port # <<<<<<<<<<<<<< * * if parsed.field_set & (1 << cparser.UF_PATH): */ __pyx_t_3 = __Pyx_PyInt_From_uint16_t(__pyx_v_parsed->port); if (unlikely(!__pyx_t_3)) __PYX_ERR(0, 810, __pyx_L5_error) __Pyx_GOTREF(__pyx_t_3); __Pyx_DECREF_SET(__pyx_v_port, __pyx_t_3); __pyx_t_3 = 0; /* "aiohttp/_http_parser.pyx":809 * host = '' * * if parsed.field_set & (1 << cparser.UF_PORT): # <<<<<<<<<<<<<< * port = parsed.port * */ } /* "aiohttp/_http_parser.pyx":812 * port = parsed.port * * if parsed.field_set & (1 << cparser.UF_PATH): # <<<<<<<<<<<<<< * off = parsed.field_data[cparser.UF_PATH].off * ln = parsed.field_data[cparser.UF_PATH].len */ __pyx_t_1 = ((__pyx_v_parsed->field_set & (1 << UF_PATH)) != 0); if (__pyx_t_1) { /* "aiohttp/_http_parser.pyx":813 * * if parsed.field_set & (1 << cparser.UF_PATH): * off = parsed.field_data[cparser.UF_PATH].off # <<<<<<<<<<<<<< * ln = parsed.field_data[cparser.UF_PATH].len * path = buf_data[off:off+ln].decode('utf-8', 'surrogateescape') */ __pyx_t_2 = (__pyx_v_parsed->field_data[((int)UF_PATH)]).off; __pyx_v_off = __pyx_t_2; /* "aiohttp/_http_parser.pyx":814 * if parsed.field_set & (1 << cparser.UF_PATH): * off = parsed.field_data[cparser.UF_PATH].off * ln = parsed.field_data[cparser.UF_PATH].len # 
<<<<<<<<<<<<<< * path = buf_data[off:off+ln].decode('utf-8', 'surrogateescape') * else: */ __pyx_t_2 = (__pyx_v_parsed->field_data[((int)UF_PATH)]).len; __pyx_v_ln = __pyx_t_2; /* "aiohttp/_http_parser.pyx":815 * off = parsed.field_data[cparser.UF_PATH].off * ln = parsed.field_data[cparser.UF_PATH].len * path = buf_data[off:off+ln].decode('utf-8', 'surrogateescape') # <<<<<<<<<<<<<< * else: * path = '' */ __pyx_t_3 = __Pyx_decode_c_string(__pyx_v_buf_data, __pyx_v_off, (__pyx_v_off + __pyx_v_ln), NULL, ((char const *)"surrogateescape"), PyUnicode_DecodeUTF8); if (unlikely(!__pyx_t_3)) __PYX_ERR(0, 815, __pyx_L5_error) __Pyx_GOTREF(__pyx_t_3); __Pyx_DECREF_SET(__pyx_v_path, ((PyObject*)__pyx_t_3)); __pyx_t_3 = 0; /* "aiohttp/_http_parser.pyx":812 * port = parsed.port * * if parsed.field_set & (1 << cparser.UF_PATH): # <<<<<<<<<<<<<< * off = parsed.field_data[cparser.UF_PATH].off * ln = parsed.field_data[cparser.UF_PATH].len */ goto __pyx_L11; } /* "aiohttp/_http_parser.pyx":817 * path = buf_data[off:off+ln].decode('utf-8', 'surrogateescape') * else: * path = '' # <<<<<<<<<<<<<< * * if parsed.field_set & (1 << cparser.UF_QUERY): */ /*else*/ { __Pyx_INCREF(__pyx_kp_u__4); __Pyx_DECREF_SET(__pyx_v_path, __pyx_kp_u__4); } __pyx_L11:; /* "aiohttp/_http_parser.pyx":819 * path = '' * * if parsed.field_set & (1 << cparser.UF_QUERY): # <<<<<<<<<<<<<< * off = parsed.field_data[cparser.UF_QUERY].off * ln = parsed.field_data[cparser.UF_QUERY].len */ __pyx_t_1 = ((__pyx_v_parsed->field_set & (1 << UF_QUERY)) != 0); if (__pyx_t_1) { /* "aiohttp/_http_parser.pyx":820 * * if parsed.field_set & (1 << cparser.UF_QUERY): * off = parsed.field_data[cparser.UF_QUERY].off # <<<<<<<<<<<<<< * ln = parsed.field_data[cparser.UF_QUERY].len * query = buf_data[off:off+ln].decode('utf-8', 'surrogateescape') */ __pyx_t_2 = (__pyx_v_parsed->field_data[((int)UF_QUERY)]).off; __pyx_v_off = __pyx_t_2; /* "aiohttp/_http_parser.pyx":821 * if parsed.field_set & (1 << cparser.UF_QUERY): * off = parsed.field_data[cparser.UF_QUERY].off * ln = parsed.field_data[cparser.UF_QUERY].len # <<<<<<<<<<<<<< * query = buf_data[off:off+ln].decode('utf-8', 'surrogateescape') * else: */ __pyx_t_2 = (__pyx_v_parsed->field_data[((int)UF_QUERY)]).len; __pyx_v_ln = __pyx_t_2; /* "aiohttp/_http_parser.pyx":822 * off = parsed.field_data[cparser.UF_QUERY].off * ln = parsed.field_data[cparser.UF_QUERY].len * query = buf_data[off:off+ln].decode('utf-8', 'surrogateescape') # <<<<<<<<<<<<<< * else: * query = '' */ __pyx_t_3 = __Pyx_decode_c_string(__pyx_v_buf_data, __pyx_v_off, (__pyx_v_off + __pyx_v_ln), NULL, ((char const *)"surrogateescape"), PyUnicode_DecodeUTF8); if (unlikely(!__pyx_t_3)) __PYX_ERR(0, 822, __pyx_L5_error) __Pyx_GOTREF(__pyx_t_3); __Pyx_DECREF_SET(__pyx_v_query, ((PyObject*)__pyx_t_3)); __pyx_t_3 = 0; /* "aiohttp/_http_parser.pyx":819 * path = '' * * if parsed.field_set & (1 << cparser.UF_QUERY): # <<<<<<<<<<<<<< * off = parsed.field_data[cparser.UF_QUERY].off * ln = parsed.field_data[cparser.UF_QUERY].len */ goto __pyx_L12; } /* "aiohttp/_http_parser.pyx":824 * query = buf_data[off:off+ln].decode('utf-8', 'surrogateescape') * else: * query = '' # <<<<<<<<<<<<<< * * if parsed.field_set & (1 << cparser.UF_FRAGMENT): */ /*else*/ { __Pyx_INCREF(__pyx_kp_u__4); __Pyx_DECREF_SET(__pyx_v_query, __pyx_kp_u__4); } __pyx_L12:; /* "aiohttp/_http_parser.pyx":826 * query = '' * * if parsed.field_set & (1 << cparser.UF_FRAGMENT): # <<<<<<<<<<<<<< * off = parsed.field_data[cparser.UF_FRAGMENT].off * ln = parsed.field_data[cparser.UF_FRAGMENT].len 
*/ __pyx_t_1 = ((__pyx_v_parsed->field_set & (1 << UF_FRAGMENT)) != 0); if (__pyx_t_1) { /* "aiohttp/_http_parser.pyx":827 * * if parsed.field_set & (1 << cparser.UF_FRAGMENT): * off = parsed.field_data[cparser.UF_FRAGMENT].off # <<<<<<<<<<<<<< * ln = parsed.field_data[cparser.UF_FRAGMENT].len * fragment = buf_data[off:off+ln].decode('utf-8', 'surrogateescape') */ __pyx_t_2 = (__pyx_v_parsed->field_data[((int)UF_FRAGMENT)]).off; __pyx_v_off = __pyx_t_2; /* "aiohttp/_http_parser.pyx":828 * if parsed.field_set & (1 << cparser.UF_FRAGMENT): * off = parsed.field_data[cparser.UF_FRAGMENT].off * ln = parsed.field_data[cparser.UF_FRAGMENT].len # <<<<<<<<<<<<<< * fragment = buf_data[off:off+ln].decode('utf-8', 'surrogateescape') * else: */ __pyx_t_2 = (__pyx_v_parsed->field_data[((int)UF_FRAGMENT)]).len; __pyx_v_ln = __pyx_t_2; /* "aiohttp/_http_parser.pyx":829 * off = parsed.field_data[cparser.UF_FRAGMENT].off * ln = parsed.field_data[cparser.UF_FRAGMENT].len * fragment = buf_data[off:off+ln].decode('utf-8', 'surrogateescape') # <<<<<<<<<<<<<< * else: * fragment = '' */ __pyx_t_3 = __Pyx_decode_c_string(__pyx_v_buf_data, __pyx_v_off, (__pyx_v_off + __pyx_v_ln), NULL, ((char const *)"surrogateescape"), PyUnicode_DecodeUTF8); if (unlikely(!__pyx_t_3)) __PYX_ERR(0, 829, __pyx_L5_error) __Pyx_GOTREF(__pyx_t_3); __Pyx_DECREF_SET(__pyx_v_fragment, ((PyObject*)__pyx_t_3)); __pyx_t_3 = 0; /* "aiohttp/_http_parser.pyx":826 * query = '' * * if parsed.field_set & (1 << cparser.UF_FRAGMENT): # <<<<<<<<<<<<<< * off = parsed.field_data[cparser.UF_FRAGMENT].off * ln = parsed.field_data[cparser.UF_FRAGMENT].len */ goto __pyx_L13; } /* "aiohttp/_http_parser.pyx":831 * fragment = buf_data[off:off+ln].decode('utf-8', 'surrogateescape') * else: * fragment = '' # <<<<<<<<<<<<<< * * if parsed.field_set & (1 << cparser.UF_USERINFO): */ /*else*/ { __Pyx_INCREF(__pyx_kp_u__4); __Pyx_DECREF_SET(__pyx_v_fragment, __pyx_kp_u__4); } __pyx_L13:; /* "aiohttp/_http_parser.pyx":833 * fragment = '' * * if parsed.field_set & (1 << cparser.UF_USERINFO): # <<<<<<<<<<<<<< * off = parsed.field_data[cparser.UF_USERINFO].off * ln = parsed.field_data[cparser.UF_USERINFO].len */ __pyx_t_1 = ((__pyx_v_parsed->field_set & (1 << UF_USERINFO)) != 0); if (__pyx_t_1) { /* "aiohttp/_http_parser.pyx":834 * * if parsed.field_set & (1 << cparser.UF_USERINFO): * off = parsed.field_data[cparser.UF_USERINFO].off # <<<<<<<<<<<<<< * ln = parsed.field_data[cparser.UF_USERINFO].len * userinfo = buf_data[off:off+ln].decode('utf-8', 'surrogateescape') */ __pyx_t_2 = (__pyx_v_parsed->field_data[((int)UF_USERINFO)]).off; __pyx_v_off = __pyx_t_2; /* "aiohttp/_http_parser.pyx":835 * if parsed.field_set & (1 << cparser.UF_USERINFO): * off = parsed.field_data[cparser.UF_USERINFO].off * ln = parsed.field_data[cparser.UF_USERINFO].len # <<<<<<<<<<<<<< * userinfo = buf_data[off:off+ln].decode('utf-8', 'surrogateescape') * */ __pyx_t_2 = (__pyx_v_parsed->field_data[((int)UF_USERINFO)]).len; __pyx_v_ln = __pyx_t_2; /* "aiohttp/_http_parser.pyx":836 * off = parsed.field_data[cparser.UF_USERINFO].off * ln = parsed.field_data[cparser.UF_USERINFO].len * userinfo = buf_data[off:off+ln].decode('utf-8', 'surrogateescape') # <<<<<<<<<<<<<< * * user, sep, password = userinfo.partition(':') */ __pyx_t_3 = __Pyx_decode_c_string(__pyx_v_buf_data, __pyx_v_off, (__pyx_v_off + __pyx_v_ln), NULL, ((char const *)"surrogateescape"), PyUnicode_DecodeUTF8); if (unlikely(!__pyx_t_3)) __PYX_ERR(0, 836, __pyx_L5_error) __Pyx_GOTREF(__pyx_t_3); __Pyx_DECREF_SET(__pyx_v_userinfo, 
((PyObject*)__pyx_t_3)); __pyx_t_3 = 0; /* "aiohttp/_http_parser.pyx":838 * userinfo = buf_data[off:off+ln].decode('utf-8', 'surrogateescape') * * user, sep, password = userinfo.partition(':') # <<<<<<<<<<<<<< * * return URL_build(scheme=schema, */ __pyx_t_3 = __Pyx_CallUnboundCMethod1(&__pyx_umethod_PyUnicode_Type_partition, __pyx_v_userinfo, __pyx_kp_u__11); if (unlikely(!__pyx_t_3)) __PYX_ERR(0, 838, __pyx_L5_error) __Pyx_GOTREF(__pyx_t_3); if ((likely(PyTuple_CheckExact(__pyx_t_3))) || (PyList_CheckExact(__pyx_t_3))) { PyObject* sequence = __pyx_t_3; Py_ssize_t size = __Pyx_PySequence_SIZE(sequence); if (unlikely(size != 3)) { if (size > 3) __Pyx_RaiseTooManyValuesError(3); else if (size >= 0) __Pyx_RaiseNeedMoreValuesError(size); __PYX_ERR(0, 838, __pyx_L5_error) } #if CYTHON_ASSUME_SAFE_MACROS && !CYTHON_AVOID_BORROWED_REFS if (likely(PyTuple_CheckExact(sequence))) { __pyx_t_4 = PyTuple_GET_ITEM(sequence, 0); __pyx_t_5 = PyTuple_GET_ITEM(sequence, 1); __pyx_t_6 = PyTuple_GET_ITEM(sequence, 2); } else { __pyx_t_4 = PyList_GET_ITEM(sequence, 0); __pyx_t_5 = PyList_GET_ITEM(sequence, 1); __pyx_t_6 = PyList_GET_ITEM(sequence, 2); } __Pyx_INCREF(__pyx_t_4); __Pyx_INCREF(__pyx_t_5); __Pyx_INCREF(__pyx_t_6); #else __pyx_t_4 = PySequence_ITEM(sequence, 0); if (unlikely(!__pyx_t_4)) __PYX_ERR(0, 838, __pyx_L5_error) __Pyx_GOTREF(__pyx_t_4); __pyx_t_5 = PySequence_ITEM(sequence, 1); if (unlikely(!__pyx_t_5)) __PYX_ERR(0, 838, __pyx_L5_error) __Pyx_GOTREF(__pyx_t_5); __pyx_t_6 = PySequence_ITEM(sequence, 2); if (unlikely(!__pyx_t_6)) __PYX_ERR(0, 838, __pyx_L5_error) __Pyx_GOTREF(__pyx_t_6); #endif __Pyx_DECREF(__pyx_t_3); __pyx_t_3 = 0; } else { Py_ssize_t index = -1; __pyx_t_7 = PyObject_GetIter(__pyx_t_3); if (unlikely(!__pyx_t_7)) __PYX_ERR(0, 838, __pyx_L5_error) __Pyx_GOTREF(__pyx_t_7); __Pyx_DECREF(__pyx_t_3); __pyx_t_3 = 0; __pyx_t_8 = Py_TYPE(__pyx_t_7)->tp_iternext; index = 0; __pyx_t_4 = __pyx_t_8(__pyx_t_7); if (unlikely(!__pyx_t_4)) goto __pyx_L15_unpacking_failed; __Pyx_GOTREF(__pyx_t_4); index = 1; __pyx_t_5 = __pyx_t_8(__pyx_t_7); if (unlikely(!__pyx_t_5)) goto __pyx_L15_unpacking_failed; __Pyx_GOTREF(__pyx_t_5); index = 2; __pyx_t_6 = __pyx_t_8(__pyx_t_7); if (unlikely(!__pyx_t_6)) goto __pyx_L15_unpacking_failed; __Pyx_GOTREF(__pyx_t_6); if (__Pyx_IternextUnpackEndCheck(__pyx_t_8(__pyx_t_7), 3) < 0) __PYX_ERR(0, 838, __pyx_L5_error) __pyx_t_8 = NULL; __Pyx_DECREF(__pyx_t_7); __pyx_t_7 = 0; goto __pyx_L16_unpacking_done; __pyx_L15_unpacking_failed:; __Pyx_DECREF(__pyx_t_7); __pyx_t_7 = 0; __pyx_t_8 = NULL; if (__Pyx_IterFinish() == 0) __Pyx_RaiseNeedMoreValuesError(index); __PYX_ERR(0, 838, __pyx_L5_error) __pyx_L16_unpacking_done:; } if (!(likely(PyUnicode_CheckExact(__pyx_t_4))||((__pyx_t_4) == Py_None)||(PyErr_Format(PyExc_TypeError, "Expected %.16s, got %.200s", "unicode", Py_TYPE(__pyx_t_4)->tp_name), 0))) __PYX_ERR(0, 838, __pyx_L5_error) if (!(likely(PyUnicode_CheckExact(__pyx_t_6))||((__pyx_t_6) == Py_None)||(PyErr_Format(PyExc_TypeError, "Expected %.16s, got %.200s", "unicode", Py_TYPE(__pyx_t_6)->tp_name), 0))) __PYX_ERR(0, 838, __pyx_L5_error) __Pyx_DECREF_SET(__pyx_v_user, ((PyObject*)__pyx_t_4)); __pyx_t_4 = 0; __pyx_v_sep = __pyx_t_5; __pyx_t_5 = 0; __Pyx_DECREF_SET(__pyx_v_password, ((PyObject*)__pyx_t_6)); __pyx_t_6 = 0; /* "aiohttp/_http_parser.pyx":833 * fragment = '' * * if parsed.field_set & (1 << cparser.UF_USERINFO): # <<<<<<<<<<<<<< * off = parsed.field_data[cparser.UF_USERINFO].off * ln = parsed.field_data[cparser.UF_USERINFO].len */ } /* 
"aiohttp/_http_parser.pyx":840 * user, sep, password = userinfo.partition(':') * * return URL_build(scheme=schema, # <<<<<<<<<<<<<< * user=user, password=password, host=host, port=port, * path=path, query=query, fragment=fragment) */ __Pyx_XDECREF(__pyx_r); __pyx_t_3 = __Pyx_PyDict_NewPresized(8); if (unlikely(!__pyx_t_3)) __PYX_ERR(0, 840, __pyx_L5_error) __Pyx_GOTREF(__pyx_t_3); if (PyDict_SetItem(__pyx_t_3, __pyx_n_s_scheme, __pyx_v_schema) < 0) __PYX_ERR(0, 840, __pyx_L5_error) /* "aiohttp/_http_parser.pyx":841 * * return URL_build(scheme=schema, * user=user, password=password, host=host, port=port, # <<<<<<<<<<<<<< * path=path, query=query, fragment=fragment) * else: */ if (PyDict_SetItem(__pyx_t_3, __pyx_n_s_user, __pyx_v_user) < 0) __PYX_ERR(0, 840, __pyx_L5_error) if (PyDict_SetItem(__pyx_t_3, __pyx_n_s_password, __pyx_v_password) < 0) __PYX_ERR(0, 840, __pyx_L5_error) if (PyDict_SetItem(__pyx_t_3, __pyx_n_s_host, __pyx_v_host) < 0) __PYX_ERR(0, 840, __pyx_L5_error) if (PyDict_SetItem(__pyx_t_3, __pyx_n_s_port, __pyx_v_port) < 0) __PYX_ERR(0, 840, __pyx_L5_error) /* "aiohttp/_http_parser.pyx":842 * return URL_build(scheme=schema, * user=user, password=password, host=host, port=port, * path=path, query=query, fragment=fragment) # <<<<<<<<<<<<<< * else: * raise InvalidURLError("invalid url {!r}".format(buf_data)) */ if (PyDict_SetItem(__pyx_t_3, __pyx_n_s_path, __pyx_v_path) < 0) __PYX_ERR(0, 840, __pyx_L5_error) if (PyDict_SetItem(__pyx_t_3, __pyx_n_s_query, __pyx_v_query) < 0) __PYX_ERR(0, 840, __pyx_L5_error) if (PyDict_SetItem(__pyx_t_3, __pyx_n_s_fragment, __pyx_v_fragment) < 0) __PYX_ERR(0, 840, __pyx_L5_error) /* "aiohttp/_http_parser.pyx":840 * user, sep, password = userinfo.partition(':') * * return URL_build(scheme=schema, # <<<<<<<<<<<<<< * user=user, password=password, host=host, port=port, * path=path, query=query, fragment=fragment) */ __pyx_t_6 = __Pyx_PyObject_Call(__pyx_v_7aiohttp_12_http_parser_URL_build, __pyx_empty_tuple, __pyx_t_3); if (unlikely(!__pyx_t_6)) __PYX_ERR(0, 840, __pyx_L5_error) __Pyx_GOTREF(__pyx_t_6); __Pyx_DECREF(__pyx_t_3); __pyx_t_3 = 0; __pyx_r = __pyx_t_6; __pyx_t_6 = 0; goto __pyx_L4_return; /* "aiohttp/_http_parser.pyx":794 * res = cparser.http_parser_parse_url(buf_data, length, 0, parsed) * * if res == 0: # <<<<<<<<<<<<<< * if parsed.field_set & (1 << cparser.UF_SCHEMA): * off = parsed.field_data[cparser.UF_SCHEMA].off */ } /* "aiohttp/_http_parser.pyx":844 * path=path, query=query, fragment=fragment) * else: * raise InvalidURLError("invalid url {!r}".format(buf_data)) # <<<<<<<<<<<<<< * finally: * PyMem_Free(parsed) */ /*else*/ { __Pyx_GetModuleGlobalName(__pyx_t_3, __pyx_n_s_InvalidURLError); if (unlikely(!__pyx_t_3)) __PYX_ERR(0, 844, __pyx_L5_error) __Pyx_GOTREF(__pyx_t_3); __pyx_t_4 = __Pyx_PyObject_GetAttrStr(__pyx_kp_u_invalid_url_r, __pyx_n_s_format); if (unlikely(!__pyx_t_4)) __PYX_ERR(0, 844, __pyx_L5_error) __Pyx_GOTREF(__pyx_t_4); __pyx_t_7 = __Pyx_PyBytes_FromString(__pyx_v_buf_data); if (unlikely(!__pyx_t_7)) __PYX_ERR(0, 844, __pyx_L5_error) __Pyx_GOTREF(__pyx_t_7); __pyx_t_9 = NULL; if (CYTHON_UNPACK_METHODS && likely(PyMethod_Check(__pyx_t_4))) { __pyx_t_9 = PyMethod_GET_SELF(__pyx_t_4); if (likely(__pyx_t_9)) { PyObject* function = PyMethod_GET_FUNCTION(__pyx_t_4); __Pyx_INCREF(__pyx_t_9); __Pyx_INCREF(function); __Pyx_DECREF_SET(__pyx_t_4, function); } } __pyx_t_5 = (__pyx_t_9) ? 
__Pyx_PyObject_Call2Args(__pyx_t_4, __pyx_t_9, __pyx_t_7) : __Pyx_PyObject_CallOneArg(__pyx_t_4, __pyx_t_7); __Pyx_XDECREF(__pyx_t_9); __pyx_t_9 = 0; __Pyx_DECREF(__pyx_t_7); __pyx_t_7 = 0; if (unlikely(!__pyx_t_5)) __PYX_ERR(0, 844, __pyx_L5_error) __Pyx_GOTREF(__pyx_t_5); __Pyx_DECREF(__pyx_t_4); __pyx_t_4 = 0; __pyx_t_4 = NULL; if (CYTHON_UNPACK_METHODS && unlikely(PyMethod_Check(__pyx_t_3))) { __pyx_t_4 = PyMethod_GET_SELF(__pyx_t_3); if (likely(__pyx_t_4)) { PyObject* function = PyMethod_GET_FUNCTION(__pyx_t_3); __Pyx_INCREF(__pyx_t_4); __Pyx_INCREF(function); __Pyx_DECREF_SET(__pyx_t_3, function); } } __pyx_t_6 = (__pyx_t_4) ? __Pyx_PyObject_Call2Args(__pyx_t_3, __pyx_t_4, __pyx_t_5) : __Pyx_PyObject_CallOneArg(__pyx_t_3, __pyx_t_5); __Pyx_XDECREF(__pyx_t_4); __pyx_t_4 = 0; __Pyx_DECREF(__pyx_t_5); __pyx_t_5 = 0; if (unlikely(!__pyx_t_6)) __PYX_ERR(0, 844, __pyx_L5_error) __Pyx_GOTREF(__pyx_t_6); __Pyx_DECREF(__pyx_t_3); __pyx_t_3 = 0; __Pyx_Raise(__pyx_t_6, 0, 0, 0); __Pyx_DECREF(__pyx_t_6); __pyx_t_6 = 0; __PYX_ERR(0, 844, __pyx_L5_error) } } /* "aiohttp/_http_parser.pyx":846 * raise InvalidURLError("invalid url {!r}".format(buf_data)) * finally: * PyMem_Free(parsed) # <<<<<<<<<<<<<< */ /*finally:*/ { __pyx_L5_error:; /*exception exit:*/{ __Pyx_PyThreadState_declare __Pyx_PyThreadState_assign __pyx_t_13 = 0; __pyx_t_14 = 0; __pyx_t_15 = 0; __pyx_t_16 = 0; __pyx_t_17 = 0; __pyx_t_18 = 0; __Pyx_XDECREF(__pyx_t_3); __pyx_t_3 = 0; __Pyx_XDECREF(__pyx_t_4); __pyx_t_4 = 0; __Pyx_XDECREF(__pyx_t_5); __pyx_t_5 = 0; __Pyx_XDECREF(__pyx_t_6); __pyx_t_6 = 0; __Pyx_XDECREF(__pyx_t_7); __pyx_t_7 = 0; __Pyx_XDECREF(__pyx_t_9); __pyx_t_9 = 0; if (PY_MAJOR_VERSION >= 3) __Pyx_ExceptionSwap(&__pyx_t_16, &__pyx_t_17, &__pyx_t_18); if ((PY_MAJOR_VERSION < 3) || unlikely(__Pyx_GetException(&__pyx_t_13, &__pyx_t_14, &__pyx_t_15) < 0)) __Pyx_ErrFetch(&__pyx_t_13, &__pyx_t_14, &__pyx_t_15); __Pyx_XGOTREF(__pyx_t_13); __Pyx_XGOTREF(__pyx_t_14); __Pyx_XGOTREF(__pyx_t_15); __Pyx_XGOTREF(__pyx_t_16); __Pyx_XGOTREF(__pyx_t_17); __Pyx_XGOTREF(__pyx_t_18); __pyx_t_10 = __pyx_lineno; __pyx_t_11 = __pyx_clineno; __pyx_t_12 = __pyx_filename; { PyMem_Free(__pyx_v_parsed); } if (PY_MAJOR_VERSION >= 3) { __Pyx_XGIVEREF(__pyx_t_16); __Pyx_XGIVEREF(__pyx_t_17); __Pyx_XGIVEREF(__pyx_t_18); __Pyx_ExceptionReset(__pyx_t_16, __pyx_t_17, __pyx_t_18); } __Pyx_XGIVEREF(__pyx_t_13); __Pyx_XGIVEREF(__pyx_t_14); __Pyx_XGIVEREF(__pyx_t_15); __Pyx_ErrRestore(__pyx_t_13, __pyx_t_14, __pyx_t_15); __pyx_t_13 = 0; __pyx_t_14 = 0; __pyx_t_15 = 0; __pyx_t_16 = 0; __pyx_t_17 = 0; __pyx_t_18 = 0; __pyx_lineno = __pyx_t_10; __pyx_clineno = __pyx_t_11; __pyx_filename = __pyx_t_12; goto __pyx_L1_error; } __pyx_L4_return: { __pyx_t_18 = __pyx_r; __pyx_r = 0; PyMem_Free(__pyx_v_parsed); __pyx_r = __pyx_t_18; __pyx_t_18 = 0; goto __pyx_L0; } } /* "aiohttp/_http_parser.pyx":769 * * * cdef _parse_url(char* buf_data, size_t length): # <<<<<<<<<<<<<< * cdef: * cparser.http_parser_url* parsed */ /* function exit code */ __pyx_L1_error:; __Pyx_XDECREF(__pyx_t_3); __Pyx_XDECREF(__pyx_t_4); __Pyx_XDECREF(__pyx_t_5); __Pyx_XDECREF(__pyx_t_6); __Pyx_XDECREF(__pyx_t_7); __Pyx_XDECREF(__pyx_t_9); __Pyx_AddTraceback("aiohttp._http_parser._parse_url", __pyx_clineno, __pyx_lineno, __pyx_filename); __pyx_r = 0; __pyx_L0:; __Pyx_XDECREF(__pyx_v_schema); __Pyx_XDECREF(__pyx_v_host); __Pyx_XDECREF(__pyx_v_port); __Pyx_XDECREF(__pyx_v_path); __Pyx_XDECREF(__pyx_v_query); __Pyx_XDECREF(__pyx_v_fragment); __Pyx_XDECREF(__pyx_v_user); __Pyx_XDECREF(__pyx_v_password); 
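/* Epilogue of _parse_url(): the surrounding __Pyx_XDECREF calls release the
   per-component locals (schema, host, port, path, query, fragment, user,
   password, userinfo, result, sep) before the URL object (or NULL on error)
   is handed back to the caller. */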
__Pyx_XDECREF(__pyx_v_userinfo); __Pyx_XDECREF(__pyx_v_result); __Pyx_XDECREF(__pyx_v_sep); __Pyx_XGIVEREF(__pyx_r); __Pyx_RefNannyFinishContext(); return __pyx_r; } /* "(tree fragment)":1 * def __pyx_unpickle_RawRequestMessage(__pyx_type, long __pyx_checksum, __pyx_state): # <<<<<<<<<<<<<< * cdef object __pyx_PickleError * cdef object __pyx_result */ /* Python wrapper */ static PyObject *__pyx_pw_7aiohttp_12_http_parser_3__pyx_unpickle_RawRequestMessage(PyObject *__pyx_self, PyObject *__pyx_args, PyObject *__pyx_kwds); /*proto*/ static PyMethodDef __pyx_mdef_7aiohttp_12_http_parser_3__pyx_unpickle_RawRequestMessage = {"__pyx_unpickle_RawRequestMessage", (PyCFunction)(void*)(PyCFunctionWithKeywords)__pyx_pw_7aiohttp_12_http_parser_3__pyx_unpickle_RawRequestMessage, METH_VARARGS|METH_KEYWORDS, 0}; static PyObject *__pyx_pw_7aiohttp_12_http_parser_3__pyx_unpickle_RawRequestMessage(PyObject *__pyx_self, PyObject *__pyx_args, PyObject *__pyx_kwds) { PyObject *__pyx_v___pyx_type = 0; long __pyx_v___pyx_checksum; PyObject *__pyx_v___pyx_state = 0; PyObject *__pyx_r = 0; __Pyx_RefNannyDeclarations __Pyx_RefNannySetupContext("__pyx_unpickle_RawRequestMessage (wrapper)", 0); { static PyObject **__pyx_pyargnames[] = {&__pyx_n_s_pyx_type,&__pyx_n_s_pyx_checksum,&__pyx_n_s_pyx_state,0}; PyObject* values[3] = {0,0,0}; if (unlikely(__pyx_kwds)) { Py_ssize_t kw_args; const Py_ssize_t pos_args = PyTuple_GET_SIZE(__pyx_args); switch (pos_args) { case 3: values[2] = PyTuple_GET_ITEM(__pyx_args, 2); CYTHON_FALLTHROUGH; case 2: values[1] = PyTuple_GET_ITEM(__pyx_args, 1); CYTHON_FALLTHROUGH; case 1: values[0] = PyTuple_GET_ITEM(__pyx_args, 0); CYTHON_FALLTHROUGH; case 0: break; default: goto __pyx_L5_argtuple_error; } kw_args = PyDict_Size(__pyx_kwds); switch (pos_args) { case 0: if (likely((values[0] = __Pyx_PyDict_GetItemStr(__pyx_kwds, __pyx_n_s_pyx_type)) != 0)) kw_args--; else goto __pyx_L5_argtuple_error; CYTHON_FALLTHROUGH; case 1: if (likely((values[1] = __Pyx_PyDict_GetItemStr(__pyx_kwds, __pyx_n_s_pyx_checksum)) != 0)) kw_args--; else { __Pyx_RaiseArgtupleInvalid("__pyx_unpickle_RawRequestMessage", 1, 3, 3, 1); __PYX_ERR(1, 1, __pyx_L3_error) } CYTHON_FALLTHROUGH; case 2: if (likely((values[2] = __Pyx_PyDict_GetItemStr(__pyx_kwds, __pyx_n_s_pyx_state)) != 0)) kw_args--; else { __Pyx_RaiseArgtupleInvalid("__pyx_unpickle_RawRequestMessage", 1, 3, 3, 2); __PYX_ERR(1, 1, __pyx_L3_error) } } if (unlikely(kw_args > 0)) { if (unlikely(__Pyx_ParseOptionalKeywords(__pyx_kwds, __pyx_pyargnames, 0, values, pos_args, "__pyx_unpickle_RawRequestMessage") < 0)) __PYX_ERR(1, 1, __pyx_L3_error) } } else if (PyTuple_GET_SIZE(__pyx_args) != 3) { goto __pyx_L5_argtuple_error; } else { values[0] = PyTuple_GET_ITEM(__pyx_args, 0); values[1] = PyTuple_GET_ITEM(__pyx_args, 1); values[2] = PyTuple_GET_ITEM(__pyx_args, 2); } __pyx_v___pyx_type = values[0]; __pyx_v___pyx_checksum = __Pyx_PyInt_As_long(values[1]); if (unlikely((__pyx_v___pyx_checksum == (long)-1) && PyErr_Occurred())) __PYX_ERR(1, 1, __pyx_L3_error) __pyx_v___pyx_state = values[2]; } goto __pyx_L4_argument_unpacking_done; __pyx_L5_argtuple_error:; __Pyx_RaiseArgtupleInvalid("__pyx_unpickle_RawRequestMessage", 1, 3, 3, PyTuple_GET_SIZE(__pyx_args)); __PYX_ERR(1, 1, __pyx_L3_error) __pyx_L3_error:; __Pyx_AddTraceback("aiohttp._http_parser.__pyx_unpickle_RawRequestMessage", __pyx_clineno, __pyx_lineno, __pyx_filename); __Pyx_RefNannyFinishContext(); return NULL; __pyx_L4_argument_unpacking_done:; __pyx_r = 
__pyx_pf_7aiohttp_12_http_parser_2__pyx_unpickle_RawRequestMessage(__pyx_self, __pyx_v___pyx_type, __pyx_v___pyx_checksum, __pyx_v___pyx_state); /* function exit code */ __Pyx_RefNannyFinishContext(); return __pyx_r; } static PyObject *__pyx_pf_7aiohttp_12_http_parser_2__pyx_unpickle_RawRequestMessage(CYTHON_UNUSED PyObject *__pyx_self, PyObject *__pyx_v___pyx_type, long __pyx_v___pyx_checksum, PyObject *__pyx_v___pyx_state) { PyObject *__pyx_v___pyx_PickleError = 0; PyObject *__pyx_v___pyx_result = 0; PyObject *__pyx_r = NULL; __Pyx_RefNannyDeclarations int __pyx_t_1; PyObject *__pyx_t_2 = NULL; PyObject *__pyx_t_3 = NULL; PyObject *__pyx_t_4 = NULL; PyObject *__pyx_t_5 = NULL; int __pyx_t_6; __Pyx_RefNannySetupContext("__pyx_unpickle_RawRequestMessage", 0); /* "(tree fragment)":4 * cdef object __pyx_PickleError * cdef object __pyx_result * if __pyx_checksum != 0x1408252: # <<<<<<<<<<<<<< * from pickle import PickleError as __pyx_PickleError * raise __pyx_PickleError("Incompatible checksums (%s vs 0x1408252 = (chunked, compression, headers, method, path, raw_headers, should_close, upgrade, url, version))" % __pyx_checksum) */ __pyx_t_1 = ((__pyx_v___pyx_checksum != 0x1408252) != 0); if (__pyx_t_1) { /* "(tree fragment)":5 * cdef object __pyx_result * if __pyx_checksum != 0x1408252: * from pickle import PickleError as __pyx_PickleError # <<<<<<<<<<<<<< * raise __pyx_PickleError("Incompatible checksums (%s vs 0x1408252 = (chunked, compression, headers, method, path, raw_headers, should_close, upgrade, url, version))" % __pyx_checksum) * __pyx_result = RawRequestMessage.__new__(__pyx_type) */ __pyx_t_2 = PyList_New(1); if (unlikely(!__pyx_t_2)) __PYX_ERR(1, 5, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_2); __Pyx_INCREF(__pyx_n_s_PickleError); __Pyx_GIVEREF(__pyx_n_s_PickleError); PyList_SET_ITEM(__pyx_t_2, 0, __pyx_n_s_PickleError); __pyx_t_3 = __Pyx_Import(__pyx_n_s_pickle, __pyx_t_2, 0); if (unlikely(!__pyx_t_3)) __PYX_ERR(1, 5, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_3); __Pyx_DECREF(__pyx_t_2); __pyx_t_2 = 0; __pyx_t_2 = __Pyx_ImportFrom(__pyx_t_3, __pyx_n_s_PickleError); if (unlikely(!__pyx_t_2)) __PYX_ERR(1, 5, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_2); __Pyx_INCREF(__pyx_t_2); __pyx_v___pyx_PickleError = __pyx_t_2; __Pyx_DECREF(__pyx_t_2); __pyx_t_2 = 0; __Pyx_DECREF(__pyx_t_3); __pyx_t_3 = 0; /* "(tree fragment)":6 * if __pyx_checksum != 0x1408252: * from pickle import PickleError as __pyx_PickleError * raise __pyx_PickleError("Incompatible checksums (%s vs 0x1408252 = (chunked, compression, headers, method, path, raw_headers, should_close, upgrade, url, version))" % __pyx_checksum) # <<<<<<<<<<<<<< * __pyx_result = RawRequestMessage.__new__(__pyx_type) * if __pyx_state is not None: */ __pyx_t_2 = __Pyx_PyInt_From_long(__pyx_v___pyx_checksum); if (unlikely(!__pyx_t_2)) __PYX_ERR(1, 6, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_2); __pyx_t_4 = __Pyx_PyString_Format(__pyx_kp_s_Incompatible_checksums_s_vs_0x14, __pyx_t_2); if (unlikely(!__pyx_t_4)) __PYX_ERR(1, 6, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_4); __Pyx_DECREF(__pyx_t_2); __pyx_t_2 = 0; __Pyx_INCREF(__pyx_v___pyx_PickleError); __pyx_t_2 = __pyx_v___pyx_PickleError; __pyx_t_5 = NULL; if (CYTHON_UNPACK_METHODS && unlikely(PyMethod_Check(__pyx_t_2))) { __pyx_t_5 = PyMethod_GET_SELF(__pyx_t_2); if (likely(__pyx_t_5)) { PyObject* function = PyMethod_GET_FUNCTION(__pyx_t_2); __Pyx_INCREF(__pyx_t_5); __Pyx_INCREF(function); __Pyx_DECREF_SET(__pyx_t_2, function); } } __pyx_t_3 = (__pyx_t_5) ? 
__Pyx_PyObject_Call2Args(__pyx_t_2, __pyx_t_5, __pyx_t_4) : __Pyx_PyObject_CallOneArg(__pyx_t_2, __pyx_t_4); __Pyx_XDECREF(__pyx_t_5); __pyx_t_5 = 0; __Pyx_DECREF(__pyx_t_4); __pyx_t_4 = 0; if (unlikely(!__pyx_t_3)) __PYX_ERR(1, 6, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_3); __Pyx_DECREF(__pyx_t_2); __pyx_t_2 = 0; __Pyx_Raise(__pyx_t_3, 0, 0, 0); __Pyx_DECREF(__pyx_t_3); __pyx_t_3 = 0; __PYX_ERR(1, 6, __pyx_L1_error) /* "(tree fragment)":4 * cdef object __pyx_PickleError * cdef object __pyx_result * if __pyx_checksum != 0x1408252: # <<<<<<<<<<<<<< * from pickle import PickleError as __pyx_PickleError * raise __pyx_PickleError("Incompatible checksums (%s vs 0x1408252 = (chunked, compression, headers, method, path, raw_headers, should_close, upgrade, url, version))" % __pyx_checksum) */ } /* "(tree fragment)":7 * from pickle import PickleError as __pyx_PickleError * raise __pyx_PickleError("Incompatible checksums (%s vs 0x1408252 = (chunked, compression, headers, method, path, raw_headers, should_close, upgrade, url, version))" % __pyx_checksum) * __pyx_result = RawRequestMessage.__new__(__pyx_type) # <<<<<<<<<<<<<< * if __pyx_state is not None: * __pyx_unpickle_RawRequestMessage__set_state( __pyx_result, __pyx_state) */ __pyx_t_2 = __Pyx_PyObject_GetAttrStr(((PyObject *)__pyx_ptype_7aiohttp_12_http_parser_RawRequestMessage), __pyx_n_s_new); if (unlikely(!__pyx_t_2)) __PYX_ERR(1, 7, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_2); __pyx_t_4 = NULL; if (CYTHON_UNPACK_METHODS && likely(PyMethod_Check(__pyx_t_2))) { __pyx_t_4 = PyMethod_GET_SELF(__pyx_t_2); if (likely(__pyx_t_4)) { PyObject* function = PyMethod_GET_FUNCTION(__pyx_t_2); __Pyx_INCREF(__pyx_t_4); __Pyx_INCREF(function); __Pyx_DECREF_SET(__pyx_t_2, function); } } __pyx_t_3 = (__pyx_t_4) ? __Pyx_PyObject_Call2Args(__pyx_t_2, __pyx_t_4, __pyx_v___pyx_type) : __Pyx_PyObject_CallOneArg(__pyx_t_2, __pyx_v___pyx_type); __Pyx_XDECREF(__pyx_t_4); __pyx_t_4 = 0; if (unlikely(!__pyx_t_3)) __PYX_ERR(1, 7, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_3); __Pyx_DECREF(__pyx_t_2); __pyx_t_2 = 0; __pyx_v___pyx_result = __pyx_t_3; __pyx_t_3 = 0; /* "(tree fragment)":8 * raise __pyx_PickleError("Incompatible checksums (%s vs 0x1408252 = (chunked, compression, headers, method, path, raw_headers, should_close, upgrade, url, version))" % __pyx_checksum) * __pyx_result = RawRequestMessage.__new__(__pyx_type) * if __pyx_state is not None: # <<<<<<<<<<<<<< * __pyx_unpickle_RawRequestMessage__set_state( __pyx_result, __pyx_state) * return __pyx_result */ __pyx_t_1 = (__pyx_v___pyx_state != Py_None); __pyx_t_6 = (__pyx_t_1 != 0); if (__pyx_t_6) { /* "(tree fragment)":9 * __pyx_result = RawRequestMessage.__new__(__pyx_type) * if __pyx_state is not None: * __pyx_unpickle_RawRequestMessage__set_state( __pyx_result, __pyx_state) # <<<<<<<<<<<<<< * return __pyx_result * cdef __pyx_unpickle_RawRequestMessage__set_state(RawRequestMessage __pyx_result, tuple __pyx_state): */ if (!(likely(PyTuple_CheckExact(__pyx_v___pyx_state))||((__pyx_v___pyx_state) == Py_None)||(PyErr_Format(PyExc_TypeError, "Expected %.16s, got %.200s", "tuple", Py_TYPE(__pyx_v___pyx_state)->tp_name), 0))) __PYX_ERR(1, 9, __pyx_L1_error) __pyx_t_3 = __pyx_f_7aiohttp_12_http_parser___pyx_unpickle_RawRequestMessage__set_state(((struct __pyx_obj_7aiohttp_12_http_parser_RawRequestMessage *)__pyx_v___pyx_result), ((PyObject*)__pyx_v___pyx_state)); if (unlikely(!__pyx_t_3)) __PYX_ERR(1, 9, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_3); __Pyx_DECREF(__pyx_t_3); __pyx_t_3 = 0; /* "(tree fragment)":8 * raise 
__pyx_PickleError("Incompatible checksums (%s vs 0x1408252 = (chunked, compression, headers, method, path, raw_headers, should_close, upgrade, url, version))" % __pyx_checksum) * __pyx_result = RawRequestMessage.__new__(__pyx_type) * if __pyx_state is not None: # <<<<<<<<<<<<<< * __pyx_unpickle_RawRequestMessage__set_state( __pyx_result, __pyx_state) * return __pyx_result */ } /* "(tree fragment)":10 * if __pyx_state is not None: * __pyx_unpickle_RawRequestMessage__set_state( __pyx_result, __pyx_state) * return __pyx_result # <<<<<<<<<<<<<< * cdef __pyx_unpickle_RawRequestMessage__set_state(RawRequestMessage __pyx_result, tuple __pyx_state): * __pyx_result.chunked = __pyx_state[0]; __pyx_result.compression = __pyx_state[1]; __pyx_result.headers = __pyx_state[2]; __pyx_result.method = __pyx_state[3]; __pyx_result.path = __pyx_state[4]; __pyx_result.raw_headers = __pyx_state[5]; __pyx_result.should_close = __pyx_state[6]; __pyx_result.upgrade = __pyx_state[7]; __pyx_result.url = __pyx_state[8]; __pyx_result.version = __pyx_state[9] */ __Pyx_XDECREF(__pyx_r); __Pyx_INCREF(__pyx_v___pyx_result); __pyx_r = __pyx_v___pyx_result; goto __pyx_L0; /* "(tree fragment)":1 * def __pyx_unpickle_RawRequestMessage(__pyx_type, long __pyx_checksum, __pyx_state): # <<<<<<<<<<<<<< * cdef object __pyx_PickleError * cdef object __pyx_result */ /* function exit code */ __pyx_L1_error:; __Pyx_XDECREF(__pyx_t_2); __Pyx_XDECREF(__pyx_t_3); __Pyx_XDECREF(__pyx_t_4); __Pyx_XDECREF(__pyx_t_5); __Pyx_AddTraceback("aiohttp._http_parser.__pyx_unpickle_RawRequestMessage", __pyx_clineno, __pyx_lineno, __pyx_filename); __pyx_r = NULL; __pyx_L0:; __Pyx_XDECREF(__pyx_v___pyx_PickleError); __Pyx_XDECREF(__pyx_v___pyx_result); __Pyx_XGIVEREF(__pyx_r); __Pyx_RefNannyFinishContext(); return __pyx_r; } /* "(tree fragment)":11 * __pyx_unpickle_RawRequestMessage__set_state( __pyx_result, __pyx_state) * return __pyx_result * cdef __pyx_unpickle_RawRequestMessage__set_state(RawRequestMessage __pyx_result, tuple __pyx_state): # <<<<<<<<<<<<<< * __pyx_result.chunked = __pyx_state[0]; __pyx_result.compression = __pyx_state[1]; __pyx_result.headers = __pyx_state[2]; __pyx_result.method = __pyx_state[3]; __pyx_result.path = __pyx_state[4]; __pyx_result.raw_headers = __pyx_state[5]; __pyx_result.should_close = __pyx_state[6]; __pyx_result.upgrade = __pyx_state[7]; __pyx_result.url = __pyx_state[8]; __pyx_result.version = __pyx_state[9] * if len(__pyx_state) > 10 and hasattr(__pyx_result, '__dict__'): */ static PyObject *__pyx_f_7aiohttp_12_http_parser___pyx_unpickle_RawRequestMessage__set_state(struct __pyx_obj_7aiohttp_12_http_parser_RawRequestMessage *__pyx_v___pyx_result, PyObject *__pyx_v___pyx_state) { PyObject *__pyx_r = NULL; __Pyx_RefNannyDeclarations PyObject *__pyx_t_1 = NULL; int __pyx_t_2; Py_ssize_t __pyx_t_3; int __pyx_t_4; int __pyx_t_5; PyObject *__pyx_t_6 = NULL; PyObject *__pyx_t_7 = NULL; PyObject *__pyx_t_8 = NULL; __Pyx_RefNannySetupContext("__pyx_unpickle_RawRequestMessage__set_state", 0); /* "(tree fragment)":12 * return __pyx_result * cdef __pyx_unpickle_RawRequestMessage__set_state(RawRequestMessage __pyx_result, tuple __pyx_state): * __pyx_result.chunked = __pyx_state[0]; __pyx_result.compression = __pyx_state[1]; __pyx_result.headers = __pyx_state[2]; __pyx_result.method = __pyx_state[3]; __pyx_result.path = __pyx_state[4]; __pyx_result.raw_headers = __pyx_state[5]; __pyx_result.should_close = __pyx_state[6]; __pyx_result.upgrade = __pyx_state[7]; __pyx_result.url = __pyx_state[8]; __pyx_result.version = 
__pyx_state[9] # <<<<<<<<<<<<<< * if len(__pyx_state) > 10 and hasattr(__pyx_result, '__dict__'): * __pyx_result.__dict__.update(__pyx_state[10]) */ if (unlikely(__pyx_v___pyx_state == Py_None)) { PyErr_SetString(PyExc_TypeError, "'NoneType' object is not subscriptable"); __PYX_ERR(1, 12, __pyx_L1_error) } __pyx_t_1 = __Pyx_GetItemInt_Tuple(__pyx_v___pyx_state, 0, long, 1, __Pyx_PyInt_From_long, 0, 0, 1); if (unlikely(!__pyx_t_1)) __PYX_ERR(1, 12, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_1); __Pyx_GIVEREF(__pyx_t_1); __Pyx_GOTREF(__pyx_v___pyx_result->chunked); __Pyx_DECREF(__pyx_v___pyx_result->chunked); __pyx_v___pyx_result->chunked = __pyx_t_1; __pyx_t_1 = 0; if (unlikely(__pyx_v___pyx_state == Py_None)) { PyErr_SetString(PyExc_TypeError, "'NoneType' object is not subscriptable"); __PYX_ERR(1, 12, __pyx_L1_error) } __pyx_t_1 = __Pyx_GetItemInt_Tuple(__pyx_v___pyx_state, 1, long, 1, __Pyx_PyInt_From_long, 0, 0, 1); if (unlikely(!__pyx_t_1)) __PYX_ERR(1, 12, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_1); __Pyx_GIVEREF(__pyx_t_1); __Pyx_GOTREF(__pyx_v___pyx_result->compression); __Pyx_DECREF(__pyx_v___pyx_result->compression); __pyx_v___pyx_result->compression = __pyx_t_1; __pyx_t_1 = 0; if (unlikely(__pyx_v___pyx_state == Py_None)) { PyErr_SetString(PyExc_TypeError, "'NoneType' object is not subscriptable"); __PYX_ERR(1, 12, __pyx_L1_error) } __pyx_t_1 = __Pyx_GetItemInt_Tuple(__pyx_v___pyx_state, 2, long, 1, __Pyx_PyInt_From_long, 0, 0, 1); if (unlikely(!__pyx_t_1)) __PYX_ERR(1, 12, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_1); __Pyx_GIVEREF(__pyx_t_1); __Pyx_GOTREF(__pyx_v___pyx_result->headers); __Pyx_DECREF(__pyx_v___pyx_result->headers); __pyx_v___pyx_result->headers = __pyx_t_1; __pyx_t_1 = 0; if (unlikely(__pyx_v___pyx_state == Py_None)) { PyErr_SetString(PyExc_TypeError, "'NoneType' object is not subscriptable"); __PYX_ERR(1, 12, __pyx_L1_error) } __pyx_t_1 = __Pyx_GetItemInt_Tuple(__pyx_v___pyx_state, 3, long, 1, __Pyx_PyInt_From_long, 0, 0, 1); if (unlikely(!__pyx_t_1)) __PYX_ERR(1, 12, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_1); if (!(likely(PyUnicode_CheckExact(__pyx_t_1))||((__pyx_t_1) == Py_None)||(PyErr_Format(PyExc_TypeError, "Expected %.16s, got %.200s", "unicode", Py_TYPE(__pyx_t_1)->tp_name), 0))) __PYX_ERR(1, 12, __pyx_L1_error) __Pyx_GIVEREF(__pyx_t_1); __Pyx_GOTREF(__pyx_v___pyx_result->method); __Pyx_DECREF(__pyx_v___pyx_result->method); __pyx_v___pyx_result->method = ((PyObject*)__pyx_t_1); __pyx_t_1 = 0; if (unlikely(__pyx_v___pyx_state == Py_None)) { PyErr_SetString(PyExc_TypeError, "'NoneType' object is not subscriptable"); __PYX_ERR(1, 12, __pyx_L1_error) } __pyx_t_1 = __Pyx_GetItemInt_Tuple(__pyx_v___pyx_state, 4, long, 1, __Pyx_PyInt_From_long, 0, 0, 1); if (unlikely(!__pyx_t_1)) __PYX_ERR(1, 12, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_1); if (!(likely(PyUnicode_CheckExact(__pyx_t_1))||((__pyx_t_1) == Py_None)||(PyErr_Format(PyExc_TypeError, "Expected %.16s, got %.200s", "unicode", Py_TYPE(__pyx_t_1)->tp_name), 0))) __PYX_ERR(1, 12, __pyx_L1_error) __Pyx_GIVEREF(__pyx_t_1); __Pyx_GOTREF(__pyx_v___pyx_result->path); __Pyx_DECREF(__pyx_v___pyx_result->path); __pyx_v___pyx_result->path = ((PyObject*)__pyx_t_1); __pyx_t_1 = 0; if (unlikely(__pyx_v___pyx_state == Py_None)) { PyErr_SetString(PyExc_TypeError, "'NoneType' object is not subscriptable"); __PYX_ERR(1, 12, __pyx_L1_error) } __pyx_t_1 = __Pyx_GetItemInt_Tuple(__pyx_v___pyx_state, 5, long, 1, __Pyx_PyInt_From_long, 0, 0, 1); if (unlikely(!__pyx_t_1)) __PYX_ERR(1, 12, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_1); 
__Pyx_GIVEREF(__pyx_t_1); __Pyx_GOTREF(__pyx_v___pyx_result->raw_headers); __Pyx_DECREF(__pyx_v___pyx_result->raw_headers); __pyx_v___pyx_result->raw_headers = __pyx_t_1; __pyx_t_1 = 0; if (unlikely(__pyx_v___pyx_state == Py_None)) { PyErr_SetString(PyExc_TypeError, "'NoneType' object is not subscriptable"); __PYX_ERR(1, 12, __pyx_L1_error) } __pyx_t_1 = __Pyx_GetItemInt_Tuple(__pyx_v___pyx_state, 6, long, 1, __Pyx_PyInt_From_long, 0, 0, 1); if (unlikely(!__pyx_t_1)) __PYX_ERR(1, 12, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_1); __Pyx_GIVEREF(__pyx_t_1); __Pyx_GOTREF(__pyx_v___pyx_result->should_close); __Pyx_DECREF(__pyx_v___pyx_result->should_close); __pyx_v___pyx_result->should_close = __pyx_t_1; __pyx_t_1 = 0; if (unlikely(__pyx_v___pyx_state == Py_None)) { PyErr_SetString(PyExc_TypeError, "'NoneType' object is not subscriptable"); __PYX_ERR(1, 12, __pyx_L1_error) } __pyx_t_1 = __Pyx_GetItemInt_Tuple(__pyx_v___pyx_state, 7, long, 1, __Pyx_PyInt_From_long, 0, 0, 1); if (unlikely(!__pyx_t_1)) __PYX_ERR(1, 12, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_1); __Pyx_GIVEREF(__pyx_t_1); __Pyx_GOTREF(__pyx_v___pyx_result->upgrade); __Pyx_DECREF(__pyx_v___pyx_result->upgrade); __pyx_v___pyx_result->upgrade = __pyx_t_1; __pyx_t_1 = 0; if (unlikely(__pyx_v___pyx_state == Py_None)) { PyErr_SetString(PyExc_TypeError, "'NoneType' object is not subscriptable"); __PYX_ERR(1, 12, __pyx_L1_error) } __pyx_t_1 = __Pyx_GetItemInt_Tuple(__pyx_v___pyx_state, 8, long, 1, __Pyx_PyInt_From_long, 0, 0, 1); if (unlikely(!__pyx_t_1)) __PYX_ERR(1, 12, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_1); __Pyx_GIVEREF(__pyx_t_1); __Pyx_GOTREF(__pyx_v___pyx_result->url); __Pyx_DECREF(__pyx_v___pyx_result->url); __pyx_v___pyx_result->url = __pyx_t_1; __pyx_t_1 = 0; if (unlikely(__pyx_v___pyx_state == Py_None)) { PyErr_SetString(PyExc_TypeError, "'NoneType' object is not subscriptable"); __PYX_ERR(1, 12, __pyx_L1_error) } __pyx_t_1 = __Pyx_GetItemInt_Tuple(__pyx_v___pyx_state, 9, long, 1, __Pyx_PyInt_From_long, 0, 0, 1); if (unlikely(!__pyx_t_1)) __PYX_ERR(1, 12, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_1); __Pyx_GIVEREF(__pyx_t_1); __Pyx_GOTREF(__pyx_v___pyx_result->version); __Pyx_DECREF(__pyx_v___pyx_result->version); __pyx_v___pyx_result->version = __pyx_t_1; __pyx_t_1 = 0; /* "(tree fragment)":13 * cdef __pyx_unpickle_RawRequestMessage__set_state(RawRequestMessage __pyx_result, tuple __pyx_state): * __pyx_result.chunked = __pyx_state[0]; __pyx_result.compression = __pyx_state[1]; __pyx_result.headers = __pyx_state[2]; __pyx_result.method = __pyx_state[3]; __pyx_result.path = __pyx_state[4]; __pyx_result.raw_headers = __pyx_state[5]; __pyx_result.should_close = __pyx_state[6]; __pyx_result.upgrade = __pyx_state[7]; __pyx_result.url = __pyx_state[8]; __pyx_result.version = __pyx_state[9] * if len(__pyx_state) > 10 and hasattr(__pyx_result, '__dict__'): # <<<<<<<<<<<<<< * __pyx_result.__dict__.update(__pyx_state[10]) */ if (unlikely(__pyx_v___pyx_state == Py_None)) { PyErr_SetString(PyExc_TypeError, "object of type 'NoneType' has no len()"); __PYX_ERR(1, 13, __pyx_L1_error) } __pyx_t_3 = PyTuple_GET_SIZE(__pyx_v___pyx_state); if (unlikely(__pyx_t_3 == ((Py_ssize_t)-1))) __PYX_ERR(1, 13, __pyx_L1_error) __pyx_t_4 = ((__pyx_t_3 > 10) != 0); if (__pyx_t_4) { } else { __pyx_t_2 = __pyx_t_4; goto __pyx_L4_bool_binop_done; } __pyx_t_4 = __Pyx_HasAttr(((PyObject *)__pyx_v___pyx_result), __pyx_n_s_dict); if (unlikely(__pyx_t_4 == ((int)-1))) __PYX_ERR(1, 13, __pyx_L1_error) __pyx_t_5 = (__pyx_t_4 != 0); __pyx_t_2 = __pyx_t_5; __pyx_L4_bool_binop_done:; 
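/* __pyx_t_2 now holds the short-circuited result of
   `len(__pyx_state) > 10 and hasattr(__pyx_result, '__dict__')`; the optional
   trailing state item is applied to the instance __dict__ only when it is true. */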
if (__pyx_t_2) { /* "(tree fragment)":14 * __pyx_result.chunked = __pyx_state[0]; __pyx_result.compression = __pyx_state[1]; __pyx_result.headers = __pyx_state[2]; __pyx_result.method = __pyx_state[3]; __pyx_result.path = __pyx_state[4]; __pyx_result.raw_headers = __pyx_state[5]; __pyx_result.should_close = __pyx_state[6]; __pyx_result.upgrade = __pyx_state[7]; __pyx_result.url = __pyx_state[8]; __pyx_result.version = __pyx_state[9] * if len(__pyx_state) > 10 and hasattr(__pyx_result, '__dict__'): * __pyx_result.__dict__.update(__pyx_state[10]) # <<<<<<<<<<<<<< */ __pyx_t_6 = __Pyx_PyObject_GetAttrStr(((PyObject *)__pyx_v___pyx_result), __pyx_n_s_dict); if (unlikely(!__pyx_t_6)) __PYX_ERR(1, 14, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_6); __pyx_t_7 = __Pyx_PyObject_GetAttrStr(__pyx_t_6, __pyx_n_s_update); if (unlikely(!__pyx_t_7)) __PYX_ERR(1, 14, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_7); __Pyx_DECREF(__pyx_t_6); __pyx_t_6 = 0; if (unlikely(__pyx_v___pyx_state == Py_None)) { PyErr_SetString(PyExc_TypeError, "'NoneType' object is not subscriptable"); __PYX_ERR(1, 14, __pyx_L1_error) } __pyx_t_6 = __Pyx_GetItemInt_Tuple(__pyx_v___pyx_state, 10, long, 1, __Pyx_PyInt_From_long, 0, 0, 1); if (unlikely(!__pyx_t_6)) __PYX_ERR(1, 14, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_6); __pyx_t_8 = NULL; if (CYTHON_UNPACK_METHODS && likely(PyMethod_Check(__pyx_t_7))) { __pyx_t_8 = PyMethod_GET_SELF(__pyx_t_7); if (likely(__pyx_t_8)) { PyObject* function = PyMethod_GET_FUNCTION(__pyx_t_7); __Pyx_INCREF(__pyx_t_8); __Pyx_INCREF(function); __Pyx_DECREF_SET(__pyx_t_7, function); } } __pyx_t_1 = (__pyx_t_8) ? __Pyx_PyObject_Call2Args(__pyx_t_7, __pyx_t_8, __pyx_t_6) : __Pyx_PyObject_CallOneArg(__pyx_t_7, __pyx_t_6); __Pyx_XDECREF(__pyx_t_8); __pyx_t_8 = 0; __Pyx_DECREF(__pyx_t_6); __pyx_t_6 = 0; if (unlikely(!__pyx_t_1)) __PYX_ERR(1, 14, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_1); __Pyx_DECREF(__pyx_t_7); __pyx_t_7 = 0; __Pyx_DECREF(__pyx_t_1); __pyx_t_1 = 0; /* "(tree fragment)":13 * cdef __pyx_unpickle_RawRequestMessage__set_state(RawRequestMessage __pyx_result, tuple __pyx_state): * __pyx_result.chunked = __pyx_state[0]; __pyx_result.compression = __pyx_state[1]; __pyx_result.headers = __pyx_state[2]; __pyx_result.method = __pyx_state[3]; __pyx_result.path = __pyx_state[4]; __pyx_result.raw_headers = __pyx_state[5]; __pyx_result.should_close = __pyx_state[6]; __pyx_result.upgrade = __pyx_state[7]; __pyx_result.url = __pyx_state[8]; __pyx_result.version = __pyx_state[9] * if len(__pyx_state) > 10 and hasattr(__pyx_result, '__dict__'): # <<<<<<<<<<<<<< * __pyx_result.__dict__.update(__pyx_state[10]) */ } /* "(tree fragment)":11 * __pyx_unpickle_RawRequestMessage__set_state( __pyx_result, __pyx_state) * return __pyx_result * cdef __pyx_unpickle_RawRequestMessage__set_state(RawRequestMessage __pyx_result, tuple __pyx_state): # <<<<<<<<<<<<<< * __pyx_result.chunked = __pyx_state[0]; __pyx_result.compression = __pyx_state[1]; __pyx_result.headers = __pyx_state[2]; __pyx_result.method = __pyx_state[3]; __pyx_result.path = __pyx_state[4]; __pyx_result.raw_headers = __pyx_state[5]; __pyx_result.should_close = __pyx_state[6]; __pyx_result.upgrade = __pyx_state[7]; __pyx_result.url = __pyx_state[8]; __pyx_result.version = __pyx_state[9] * if len(__pyx_state) > 10 and hasattr(__pyx_result, '__dict__'): */ /* function exit code */ __pyx_r = Py_None; __Pyx_INCREF(Py_None); goto __pyx_L0; __pyx_L1_error:; __Pyx_XDECREF(__pyx_t_1); __Pyx_XDECREF(__pyx_t_6); __Pyx_XDECREF(__pyx_t_7); __Pyx_XDECREF(__pyx_t_8); 
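/* Error path of __pyx_unpickle_RawRequestMessage__set_state: temporaries are
   released above, a traceback frame is recorded below, and NULL is returned. */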
__Pyx_AddTraceback("aiohttp._http_parser.__pyx_unpickle_RawRequestMessage__set_state", __pyx_clineno, __pyx_lineno, __pyx_filename); __pyx_r = 0; __pyx_L0:; __Pyx_XGIVEREF(__pyx_r); __Pyx_RefNannyFinishContext(); return __pyx_r; } /* "(tree fragment)":1 * def __pyx_unpickle_RawResponseMessage(__pyx_type, long __pyx_checksum, __pyx_state): # <<<<<<<<<<<<<< * cdef object __pyx_PickleError * cdef object __pyx_result */ /* Python wrapper */ static PyObject *__pyx_pw_7aiohttp_12_http_parser_5__pyx_unpickle_RawResponseMessage(PyObject *__pyx_self, PyObject *__pyx_args, PyObject *__pyx_kwds); /*proto*/ static PyMethodDef __pyx_mdef_7aiohttp_12_http_parser_5__pyx_unpickle_RawResponseMessage = {"__pyx_unpickle_RawResponseMessage", (PyCFunction)(void*)(PyCFunctionWithKeywords)__pyx_pw_7aiohttp_12_http_parser_5__pyx_unpickle_RawResponseMessage, METH_VARARGS|METH_KEYWORDS, 0}; static PyObject *__pyx_pw_7aiohttp_12_http_parser_5__pyx_unpickle_RawResponseMessage(PyObject *__pyx_self, PyObject *__pyx_args, PyObject *__pyx_kwds) { PyObject *__pyx_v___pyx_type = 0; long __pyx_v___pyx_checksum; PyObject *__pyx_v___pyx_state = 0; PyObject *__pyx_r = 0; __Pyx_RefNannyDeclarations __Pyx_RefNannySetupContext("__pyx_unpickle_RawResponseMessage (wrapper)", 0); { static PyObject **__pyx_pyargnames[] = {&__pyx_n_s_pyx_type,&__pyx_n_s_pyx_checksum,&__pyx_n_s_pyx_state,0}; PyObject* values[3] = {0,0,0}; if (unlikely(__pyx_kwds)) { Py_ssize_t kw_args; const Py_ssize_t pos_args = PyTuple_GET_SIZE(__pyx_args); switch (pos_args) { case 3: values[2] = PyTuple_GET_ITEM(__pyx_args, 2); CYTHON_FALLTHROUGH; case 2: values[1] = PyTuple_GET_ITEM(__pyx_args, 1); CYTHON_FALLTHROUGH; case 1: values[0] = PyTuple_GET_ITEM(__pyx_args, 0); CYTHON_FALLTHROUGH; case 0: break; default: goto __pyx_L5_argtuple_error; } kw_args = PyDict_Size(__pyx_kwds); switch (pos_args) { case 0: if (likely((values[0] = __Pyx_PyDict_GetItemStr(__pyx_kwds, __pyx_n_s_pyx_type)) != 0)) kw_args--; else goto __pyx_L5_argtuple_error; CYTHON_FALLTHROUGH; case 1: if (likely((values[1] = __Pyx_PyDict_GetItemStr(__pyx_kwds, __pyx_n_s_pyx_checksum)) != 0)) kw_args--; else { __Pyx_RaiseArgtupleInvalid("__pyx_unpickle_RawResponseMessage", 1, 3, 3, 1); __PYX_ERR(1, 1, __pyx_L3_error) } CYTHON_FALLTHROUGH; case 2: if (likely((values[2] = __Pyx_PyDict_GetItemStr(__pyx_kwds, __pyx_n_s_pyx_state)) != 0)) kw_args--; else { __Pyx_RaiseArgtupleInvalid("__pyx_unpickle_RawResponseMessage", 1, 3, 3, 2); __PYX_ERR(1, 1, __pyx_L3_error) } } if (unlikely(kw_args > 0)) { if (unlikely(__Pyx_ParseOptionalKeywords(__pyx_kwds, __pyx_pyargnames, 0, values, pos_args, "__pyx_unpickle_RawResponseMessage") < 0)) __PYX_ERR(1, 1, __pyx_L3_error) } } else if (PyTuple_GET_SIZE(__pyx_args) != 3) { goto __pyx_L5_argtuple_error; } else { values[0] = PyTuple_GET_ITEM(__pyx_args, 0); values[1] = PyTuple_GET_ITEM(__pyx_args, 1); values[2] = PyTuple_GET_ITEM(__pyx_args, 2); } __pyx_v___pyx_type = values[0]; __pyx_v___pyx_checksum = __Pyx_PyInt_As_long(values[1]); if (unlikely((__pyx_v___pyx_checksum == (long)-1) && PyErr_Occurred())) __PYX_ERR(1, 1, __pyx_L3_error) __pyx_v___pyx_state = values[2]; } goto __pyx_L4_argument_unpacking_done; __pyx_L5_argtuple_error:; __Pyx_RaiseArgtupleInvalid("__pyx_unpickle_RawResponseMessage", 1, 3, 3, PyTuple_GET_SIZE(__pyx_args)); __PYX_ERR(1, 1, __pyx_L3_error) __pyx_L3_error:; __Pyx_AddTraceback("aiohttp._http_parser.__pyx_unpickle_RawResponseMessage", __pyx_clineno, __pyx_lineno, __pyx_filename); __Pyx_RefNannyFinishContext(); return NULL; 
__pyx_L4_argument_unpacking_done:; __pyx_r = __pyx_pf_7aiohttp_12_http_parser_4__pyx_unpickle_RawResponseMessage(__pyx_self, __pyx_v___pyx_type, __pyx_v___pyx_checksum, __pyx_v___pyx_state); /* function exit code */ __Pyx_RefNannyFinishContext(); return __pyx_r; } static PyObject *__pyx_pf_7aiohttp_12_http_parser_4__pyx_unpickle_RawResponseMessage(CYTHON_UNUSED PyObject *__pyx_self, PyObject *__pyx_v___pyx_type, long __pyx_v___pyx_checksum, PyObject *__pyx_v___pyx_state) { PyObject *__pyx_v___pyx_PickleError = 0; PyObject *__pyx_v___pyx_result = 0; PyObject *__pyx_r = NULL; __Pyx_RefNannyDeclarations int __pyx_t_1; PyObject *__pyx_t_2 = NULL; PyObject *__pyx_t_3 = NULL; PyObject *__pyx_t_4 = NULL; PyObject *__pyx_t_5 = NULL; int __pyx_t_6; __Pyx_RefNannySetupContext("__pyx_unpickle_RawResponseMessage", 0); /* "(tree fragment)":4 * cdef object __pyx_PickleError * cdef object __pyx_result * if __pyx_checksum != 0xc7706dc: # <<<<<<<<<<<<<< * from pickle import PickleError as __pyx_PickleError * raise __pyx_PickleError("Incompatible checksums (%s vs 0xc7706dc = (chunked, code, compression, headers, raw_headers, reason, should_close, upgrade, version))" % __pyx_checksum) */ __pyx_t_1 = ((__pyx_v___pyx_checksum != 0xc7706dc) != 0); if (__pyx_t_1) { /* "(tree fragment)":5 * cdef object __pyx_result * if __pyx_checksum != 0xc7706dc: * from pickle import PickleError as __pyx_PickleError # <<<<<<<<<<<<<< * raise __pyx_PickleError("Incompatible checksums (%s vs 0xc7706dc = (chunked, code, compression, headers, raw_headers, reason, should_close, upgrade, version))" % __pyx_checksum) * __pyx_result = RawResponseMessage.__new__(__pyx_type) */ __pyx_t_2 = PyList_New(1); if (unlikely(!__pyx_t_2)) __PYX_ERR(1, 5, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_2); __Pyx_INCREF(__pyx_n_s_PickleError); __Pyx_GIVEREF(__pyx_n_s_PickleError); PyList_SET_ITEM(__pyx_t_2, 0, __pyx_n_s_PickleError); __pyx_t_3 = __Pyx_Import(__pyx_n_s_pickle, __pyx_t_2, 0); if (unlikely(!__pyx_t_3)) __PYX_ERR(1, 5, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_3); __Pyx_DECREF(__pyx_t_2); __pyx_t_2 = 0; __pyx_t_2 = __Pyx_ImportFrom(__pyx_t_3, __pyx_n_s_PickleError); if (unlikely(!__pyx_t_2)) __PYX_ERR(1, 5, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_2); __Pyx_INCREF(__pyx_t_2); __pyx_v___pyx_PickleError = __pyx_t_2; __Pyx_DECREF(__pyx_t_2); __pyx_t_2 = 0; __Pyx_DECREF(__pyx_t_3); __pyx_t_3 = 0; /* "(tree fragment)":6 * if __pyx_checksum != 0xc7706dc: * from pickle import PickleError as __pyx_PickleError * raise __pyx_PickleError("Incompatible checksums (%s vs 0xc7706dc = (chunked, code, compression, headers, raw_headers, reason, should_close, upgrade, version))" % __pyx_checksum) # <<<<<<<<<<<<<< * __pyx_result = RawResponseMessage.__new__(__pyx_type) * if __pyx_state is not None: */ __pyx_t_2 = __Pyx_PyInt_From_long(__pyx_v___pyx_checksum); if (unlikely(!__pyx_t_2)) __PYX_ERR(1, 6, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_2); __pyx_t_4 = __Pyx_PyString_Format(__pyx_kp_s_Incompatible_checksums_s_vs_0xc7, __pyx_t_2); if (unlikely(!__pyx_t_4)) __PYX_ERR(1, 6, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_4); __Pyx_DECREF(__pyx_t_2); __pyx_t_2 = 0; __Pyx_INCREF(__pyx_v___pyx_PickleError); __pyx_t_2 = __pyx_v___pyx_PickleError; __pyx_t_5 = NULL; if (CYTHON_UNPACK_METHODS && unlikely(PyMethod_Check(__pyx_t_2))) { __pyx_t_5 = PyMethod_GET_SELF(__pyx_t_2); if (likely(__pyx_t_5)) { PyObject* function = PyMethod_GET_FUNCTION(__pyx_t_2); __Pyx_INCREF(__pyx_t_5); __Pyx_INCREF(function); __Pyx_DECREF_SET(__pyx_t_2, function); } } __pyx_t_3 = (__pyx_t_5) ? 
__Pyx_PyObject_Call2Args(__pyx_t_2, __pyx_t_5, __pyx_t_4) : __Pyx_PyObject_CallOneArg(__pyx_t_2, __pyx_t_4); __Pyx_XDECREF(__pyx_t_5); __pyx_t_5 = 0; __Pyx_DECREF(__pyx_t_4); __pyx_t_4 = 0; if (unlikely(!__pyx_t_3)) __PYX_ERR(1, 6, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_3); __Pyx_DECREF(__pyx_t_2); __pyx_t_2 = 0; __Pyx_Raise(__pyx_t_3, 0, 0, 0); __Pyx_DECREF(__pyx_t_3); __pyx_t_3 = 0; __PYX_ERR(1, 6, __pyx_L1_error) /* "(tree fragment)":4 * cdef object __pyx_PickleError * cdef object __pyx_result * if __pyx_checksum != 0xc7706dc: # <<<<<<<<<<<<<< * from pickle import PickleError as __pyx_PickleError * raise __pyx_PickleError("Incompatible checksums (%s vs 0xc7706dc = (chunked, code, compression, headers, raw_headers, reason, should_close, upgrade, version))" % __pyx_checksum) */ } /* "(tree fragment)":7 * from pickle import PickleError as __pyx_PickleError * raise __pyx_PickleError("Incompatible checksums (%s vs 0xc7706dc = (chunked, code, compression, headers, raw_headers, reason, should_close, upgrade, version))" % __pyx_checksum) * __pyx_result = RawResponseMessage.__new__(__pyx_type) # <<<<<<<<<<<<<< * if __pyx_state is not None: * __pyx_unpickle_RawResponseMessage__set_state( __pyx_result, __pyx_state) */ __pyx_t_2 = __Pyx_PyObject_GetAttrStr(((PyObject *)__pyx_ptype_7aiohttp_12_http_parser_RawResponseMessage), __pyx_n_s_new); if (unlikely(!__pyx_t_2)) __PYX_ERR(1, 7, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_2); __pyx_t_4 = NULL; if (CYTHON_UNPACK_METHODS && likely(PyMethod_Check(__pyx_t_2))) { __pyx_t_4 = PyMethod_GET_SELF(__pyx_t_2); if (likely(__pyx_t_4)) { PyObject* function = PyMethod_GET_FUNCTION(__pyx_t_2); __Pyx_INCREF(__pyx_t_4); __Pyx_INCREF(function); __Pyx_DECREF_SET(__pyx_t_2, function); } } __pyx_t_3 = (__pyx_t_4) ? __Pyx_PyObject_Call2Args(__pyx_t_2, __pyx_t_4, __pyx_v___pyx_type) : __Pyx_PyObject_CallOneArg(__pyx_t_2, __pyx_v___pyx_type); __Pyx_XDECREF(__pyx_t_4); __pyx_t_4 = 0; if (unlikely(!__pyx_t_3)) __PYX_ERR(1, 7, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_3); __Pyx_DECREF(__pyx_t_2); __pyx_t_2 = 0; __pyx_v___pyx_result = __pyx_t_3; __pyx_t_3 = 0; /* "(tree fragment)":8 * raise __pyx_PickleError("Incompatible checksums (%s vs 0xc7706dc = (chunked, code, compression, headers, raw_headers, reason, should_close, upgrade, version))" % __pyx_checksum) * __pyx_result = RawResponseMessage.__new__(__pyx_type) * if __pyx_state is not None: # <<<<<<<<<<<<<< * __pyx_unpickle_RawResponseMessage__set_state( __pyx_result, __pyx_state) * return __pyx_result */ __pyx_t_1 = (__pyx_v___pyx_state != Py_None); __pyx_t_6 = (__pyx_t_1 != 0); if (__pyx_t_6) { /* "(tree fragment)":9 * __pyx_result = RawResponseMessage.__new__(__pyx_type) * if __pyx_state is not None: * __pyx_unpickle_RawResponseMessage__set_state( __pyx_result, __pyx_state) # <<<<<<<<<<<<<< * return __pyx_result * cdef __pyx_unpickle_RawResponseMessage__set_state(RawResponseMessage __pyx_result, tuple __pyx_state): */ if (!(likely(PyTuple_CheckExact(__pyx_v___pyx_state))||((__pyx_v___pyx_state) == Py_None)||(PyErr_Format(PyExc_TypeError, "Expected %.16s, got %.200s", "tuple", Py_TYPE(__pyx_v___pyx_state)->tp_name), 0))) __PYX_ERR(1, 9, __pyx_L1_error) __pyx_t_3 = __pyx_f_7aiohttp_12_http_parser___pyx_unpickle_RawResponseMessage__set_state(((struct __pyx_obj_7aiohttp_12_http_parser_RawResponseMessage *)__pyx_v___pyx_result), ((PyObject*)__pyx_v___pyx_state)); if (unlikely(!__pyx_t_3)) __PYX_ERR(1, 9, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_3); __Pyx_DECREF(__pyx_t_3); __pyx_t_3 = 0; /* "(tree fragment)":8 * raise 
__pyx_PickleError("Incompatible checksums (%s vs 0xc7706dc = (chunked, code, compression, headers, raw_headers, reason, should_close, upgrade, version))" % __pyx_checksum) * __pyx_result = RawResponseMessage.__new__(__pyx_type) * if __pyx_state is not None: # <<<<<<<<<<<<<< * __pyx_unpickle_RawResponseMessage__set_state( __pyx_result, __pyx_state) * return __pyx_result */ } /* "(tree fragment)":10 * if __pyx_state is not None: * __pyx_unpickle_RawResponseMessage__set_state( __pyx_result, __pyx_state) * return __pyx_result # <<<<<<<<<<<<<< * cdef __pyx_unpickle_RawResponseMessage__set_state(RawResponseMessage __pyx_result, tuple __pyx_state): * __pyx_result.chunked = __pyx_state[0]; __pyx_result.code = __pyx_state[1]; __pyx_result.compression = __pyx_state[2]; __pyx_result.headers = __pyx_state[3]; __pyx_result.raw_headers = __pyx_state[4]; __pyx_result.reason = __pyx_state[5]; __pyx_result.should_close = __pyx_state[6]; __pyx_result.upgrade = __pyx_state[7]; __pyx_result.version = __pyx_state[8] */ __Pyx_XDECREF(__pyx_r); __Pyx_INCREF(__pyx_v___pyx_result); __pyx_r = __pyx_v___pyx_result; goto __pyx_L0; /* "(tree fragment)":1 * def __pyx_unpickle_RawResponseMessage(__pyx_type, long __pyx_checksum, __pyx_state): # <<<<<<<<<<<<<< * cdef object __pyx_PickleError * cdef object __pyx_result */ /* function exit code */ __pyx_L1_error:; __Pyx_XDECREF(__pyx_t_2); __Pyx_XDECREF(__pyx_t_3); __Pyx_XDECREF(__pyx_t_4); __Pyx_XDECREF(__pyx_t_5); __Pyx_AddTraceback("aiohttp._http_parser.__pyx_unpickle_RawResponseMessage", __pyx_clineno, __pyx_lineno, __pyx_filename); __pyx_r = NULL; __pyx_L0:; __Pyx_XDECREF(__pyx_v___pyx_PickleError); __Pyx_XDECREF(__pyx_v___pyx_result); __Pyx_XGIVEREF(__pyx_r); __Pyx_RefNannyFinishContext(); return __pyx_r; } /* "(tree fragment)":11 * __pyx_unpickle_RawResponseMessage__set_state( __pyx_result, __pyx_state) * return __pyx_result * cdef __pyx_unpickle_RawResponseMessage__set_state(RawResponseMessage __pyx_result, tuple __pyx_state): # <<<<<<<<<<<<<< * __pyx_result.chunked = __pyx_state[0]; __pyx_result.code = __pyx_state[1]; __pyx_result.compression = __pyx_state[2]; __pyx_result.headers = __pyx_state[3]; __pyx_result.raw_headers = __pyx_state[4]; __pyx_result.reason = __pyx_state[5]; __pyx_result.should_close = __pyx_state[6]; __pyx_result.upgrade = __pyx_state[7]; __pyx_result.version = __pyx_state[8] * if len(__pyx_state) > 9 and hasattr(__pyx_result, '__dict__'): */ static PyObject *__pyx_f_7aiohttp_12_http_parser___pyx_unpickle_RawResponseMessage__set_state(struct __pyx_obj_7aiohttp_12_http_parser_RawResponseMessage *__pyx_v___pyx_result, PyObject *__pyx_v___pyx_state) { PyObject *__pyx_r = NULL; __Pyx_RefNannyDeclarations PyObject *__pyx_t_1 = NULL; int __pyx_t_2; int __pyx_t_3; Py_ssize_t __pyx_t_4; int __pyx_t_5; int __pyx_t_6; PyObject *__pyx_t_7 = NULL; PyObject *__pyx_t_8 = NULL; PyObject *__pyx_t_9 = NULL; __Pyx_RefNannySetupContext("__pyx_unpickle_RawResponseMessage__set_state", 0); /* "(tree fragment)":12 * return __pyx_result * cdef __pyx_unpickle_RawResponseMessage__set_state(RawResponseMessage __pyx_result, tuple __pyx_state): * __pyx_result.chunked = __pyx_state[0]; __pyx_result.code = __pyx_state[1]; __pyx_result.compression = __pyx_state[2]; __pyx_result.headers = __pyx_state[3]; __pyx_result.raw_headers = __pyx_state[4]; __pyx_result.reason = __pyx_state[5]; __pyx_result.should_close = __pyx_state[6]; __pyx_result.upgrade = __pyx_state[7]; __pyx_result.version = __pyx_state[8] # <<<<<<<<<<<<<< * if len(__pyx_state) > 9 and hasattr(__pyx_result, 
'__dict__'): * __pyx_result.__dict__.update(__pyx_state[9]) */ if (unlikely(__pyx_v___pyx_state == Py_None)) { PyErr_SetString(PyExc_TypeError, "'NoneType' object is not subscriptable"); __PYX_ERR(1, 12, __pyx_L1_error) } __pyx_t_1 = __Pyx_GetItemInt_Tuple(__pyx_v___pyx_state, 0, long, 1, __Pyx_PyInt_From_long, 0, 0, 1); if (unlikely(!__pyx_t_1)) __PYX_ERR(1, 12, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_1); __Pyx_GIVEREF(__pyx_t_1); __Pyx_GOTREF(__pyx_v___pyx_result->chunked); __Pyx_DECREF(__pyx_v___pyx_result->chunked); __pyx_v___pyx_result->chunked = __pyx_t_1; __pyx_t_1 = 0; if (unlikely(__pyx_v___pyx_state == Py_None)) { PyErr_SetString(PyExc_TypeError, "'NoneType' object is not subscriptable"); __PYX_ERR(1, 12, __pyx_L1_error) } __pyx_t_1 = __Pyx_GetItemInt_Tuple(__pyx_v___pyx_state, 1, long, 1, __Pyx_PyInt_From_long, 0, 0, 1); if (unlikely(!__pyx_t_1)) __PYX_ERR(1, 12, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_1); __pyx_t_2 = __Pyx_PyInt_As_int(__pyx_t_1); if (unlikely((__pyx_t_2 == (int)-1) && PyErr_Occurred())) __PYX_ERR(1, 12, __pyx_L1_error) __Pyx_DECREF(__pyx_t_1); __pyx_t_1 = 0; __pyx_v___pyx_result->code = __pyx_t_2; if (unlikely(__pyx_v___pyx_state == Py_None)) { PyErr_SetString(PyExc_TypeError, "'NoneType' object is not subscriptable"); __PYX_ERR(1, 12, __pyx_L1_error) } __pyx_t_1 = __Pyx_GetItemInt_Tuple(__pyx_v___pyx_state, 2, long, 1, __Pyx_PyInt_From_long, 0, 0, 1); if (unlikely(!__pyx_t_1)) __PYX_ERR(1, 12, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_1); __Pyx_GIVEREF(__pyx_t_1); __Pyx_GOTREF(__pyx_v___pyx_result->compression); __Pyx_DECREF(__pyx_v___pyx_result->compression); __pyx_v___pyx_result->compression = __pyx_t_1; __pyx_t_1 = 0; if (unlikely(__pyx_v___pyx_state == Py_None)) { PyErr_SetString(PyExc_TypeError, "'NoneType' object is not subscriptable"); __PYX_ERR(1, 12, __pyx_L1_error) } __pyx_t_1 = __Pyx_GetItemInt_Tuple(__pyx_v___pyx_state, 3, long, 1, __Pyx_PyInt_From_long, 0, 0, 1); if (unlikely(!__pyx_t_1)) __PYX_ERR(1, 12, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_1); __Pyx_GIVEREF(__pyx_t_1); __Pyx_GOTREF(__pyx_v___pyx_result->headers); __Pyx_DECREF(__pyx_v___pyx_result->headers); __pyx_v___pyx_result->headers = __pyx_t_1; __pyx_t_1 = 0; if (unlikely(__pyx_v___pyx_state == Py_None)) { PyErr_SetString(PyExc_TypeError, "'NoneType' object is not subscriptable"); __PYX_ERR(1, 12, __pyx_L1_error) } __pyx_t_1 = __Pyx_GetItemInt_Tuple(__pyx_v___pyx_state, 4, long, 1, __Pyx_PyInt_From_long, 0, 0, 1); if (unlikely(!__pyx_t_1)) __PYX_ERR(1, 12, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_1); __Pyx_GIVEREF(__pyx_t_1); __Pyx_GOTREF(__pyx_v___pyx_result->raw_headers); __Pyx_DECREF(__pyx_v___pyx_result->raw_headers); __pyx_v___pyx_result->raw_headers = __pyx_t_1; __pyx_t_1 = 0; if (unlikely(__pyx_v___pyx_state == Py_None)) { PyErr_SetString(PyExc_TypeError, "'NoneType' object is not subscriptable"); __PYX_ERR(1, 12, __pyx_L1_error) } __pyx_t_1 = __Pyx_GetItemInt_Tuple(__pyx_v___pyx_state, 5, long, 1, __Pyx_PyInt_From_long, 0, 0, 1); if (unlikely(!__pyx_t_1)) __PYX_ERR(1, 12, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_1); if (!(likely(PyUnicode_CheckExact(__pyx_t_1))||((__pyx_t_1) == Py_None)||(PyErr_Format(PyExc_TypeError, "Expected %.16s, got %.200s", "unicode", Py_TYPE(__pyx_t_1)->tp_name), 0))) __PYX_ERR(1, 12, __pyx_L1_error) __Pyx_GIVEREF(__pyx_t_1); __Pyx_GOTREF(__pyx_v___pyx_result->reason); __Pyx_DECREF(__pyx_v___pyx_result->reason); __pyx_v___pyx_result->reason = ((PyObject*)__pyx_t_1); __pyx_t_1 = 0; if (unlikely(__pyx_v___pyx_state == Py_None)) { PyErr_SetString(PyExc_TypeError, "'NoneType' 
object is not subscriptable"); __PYX_ERR(1, 12, __pyx_L1_error) } __pyx_t_1 = __Pyx_GetItemInt_Tuple(__pyx_v___pyx_state, 6, long, 1, __Pyx_PyInt_From_long, 0, 0, 1); if (unlikely(!__pyx_t_1)) __PYX_ERR(1, 12, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_1); __Pyx_GIVEREF(__pyx_t_1); __Pyx_GOTREF(__pyx_v___pyx_result->should_close); __Pyx_DECREF(__pyx_v___pyx_result->should_close); __pyx_v___pyx_result->should_close = __pyx_t_1; __pyx_t_1 = 0; if (unlikely(__pyx_v___pyx_state == Py_None)) { PyErr_SetString(PyExc_TypeError, "'NoneType' object is not subscriptable"); __PYX_ERR(1, 12, __pyx_L1_error) } __pyx_t_1 = __Pyx_GetItemInt_Tuple(__pyx_v___pyx_state, 7, long, 1, __Pyx_PyInt_From_long, 0, 0, 1); if (unlikely(!__pyx_t_1)) __PYX_ERR(1, 12, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_1); __Pyx_GIVEREF(__pyx_t_1); __Pyx_GOTREF(__pyx_v___pyx_result->upgrade); __Pyx_DECREF(__pyx_v___pyx_result->upgrade); __pyx_v___pyx_result->upgrade = __pyx_t_1; __pyx_t_1 = 0; if (unlikely(__pyx_v___pyx_state == Py_None)) { PyErr_SetString(PyExc_TypeError, "'NoneType' object is not subscriptable"); __PYX_ERR(1, 12, __pyx_L1_error) } __pyx_t_1 = __Pyx_GetItemInt_Tuple(__pyx_v___pyx_state, 8, long, 1, __Pyx_PyInt_From_long, 0, 0, 1); if (unlikely(!__pyx_t_1)) __PYX_ERR(1, 12, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_1); __Pyx_GIVEREF(__pyx_t_1); __Pyx_GOTREF(__pyx_v___pyx_result->version); __Pyx_DECREF(__pyx_v___pyx_result->version); __pyx_v___pyx_result->version = __pyx_t_1; __pyx_t_1 = 0; /* "(tree fragment)":13 * cdef __pyx_unpickle_RawResponseMessage__set_state(RawResponseMessage __pyx_result, tuple __pyx_state): * __pyx_result.chunked = __pyx_state[0]; __pyx_result.code = __pyx_state[1]; __pyx_result.compression = __pyx_state[2]; __pyx_result.headers = __pyx_state[3]; __pyx_result.raw_headers = __pyx_state[4]; __pyx_result.reason = __pyx_state[5]; __pyx_result.should_close = __pyx_state[6]; __pyx_result.upgrade = __pyx_state[7]; __pyx_result.version = __pyx_state[8] * if len(__pyx_state) > 9 and hasattr(__pyx_result, '__dict__'): # <<<<<<<<<<<<<< * __pyx_result.__dict__.update(__pyx_state[9]) */ if (unlikely(__pyx_v___pyx_state == Py_None)) { PyErr_SetString(PyExc_TypeError, "object of type 'NoneType' has no len()"); __PYX_ERR(1, 13, __pyx_L1_error) } __pyx_t_4 = PyTuple_GET_SIZE(__pyx_v___pyx_state); if (unlikely(__pyx_t_4 == ((Py_ssize_t)-1))) __PYX_ERR(1, 13, __pyx_L1_error) __pyx_t_5 = ((__pyx_t_4 > 9) != 0); if (__pyx_t_5) { } else { __pyx_t_3 = __pyx_t_5; goto __pyx_L4_bool_binop_done; } __pyx_t_5 = __Pyx_HasAttr(((PyObject *)__pyx_v___pyx_result), __pyx_n_s_dict); if (unlikely(__pyx_t_5 == ((int)-1))) __PYX_ERR(1, 13, __pyx_L1_error) __pyx_t_6 = (__pyx_t_5 != 0); __pyx_t_3 = __pyx_t_6; __pyx_L4_bool_binop_done:; if (__pyx_t_3) { /* "(tree fragment)":14 * __pyx_result.chunked = __pyx_state[0]; __pyx_result.code = __pyx_state[1]; __pyx_result.compression = __pyx_state[2]; __pyx_result.headers = __pyx_state[3]; __pyx_result.raw_headers = __pyx_state[4]; __pyx_result.reason = __pyx_state[5]; __pyx_result.should_close = __pyx_state[6]; __pyx_result.upgrade = __pyx_state[7]; __pyx_result.version = __pyx_state[8] * if len(__pyx_state) > 9 and hasattr(__pyx_result, '__dict__'): * __pyx_result.__dict__.update(__pyx_state[9]) # <<<<<<<<<<<<<< */ __pyx_t_7 = __Pyx_PyObject_GetAttrStr(((PyObject *)__pyx_v___pyx_result), __pyx_n_s_dict); if (unlikely(!__pyx_t_7)) __PYX_ERR(1, 14, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_7); __pyx_t_8 = __Pyx_PyObject_GetAttrStr(__pyx_t_7, __pyx_n_s_update); if (unlikely(!__pyx_t_8)) __PYX_ERR(1, 14, 
__pyx_L1_error) __Pyx_GOTREF(__pyx_t_8); __Pyx_DECREF(__pyx_t_7); __pyx_t_7 = 0; if (unlikely(__pyx_v___pyx_state == Py_None)) { PyErr_SetString(PyExc_TypeError, "'NoneType' object is not subscriptable"); __PYX_ERR(1, 14, __pyx_L1_error) } __pyx_t_7 = __Pyx_GetItemInt_Tuple(__pyx_v___pyx_state, 9, long, 1, __Pyx_PyInt_From_long, 0, 0, 1); if (unlikely(!__pyx_t_7)) __PYX_ERR(1, 14, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_7); __pyx_t_9 = NULL; if (CYTHON_UNPACK_METHODS && likely(PyMethod_Check(__pyx_t_8))) { __pyx_t_9 = PyMethod_GET_SELF(__pyx_t_8); if (likely(__pyx_t_9)) { PyObject* function = PyMethod_GET_FUNCTION(__pyx_t_8); __Pyx_INCREF(__pyx_t_9); __Pyx_INCREF(function); __Pyx_DECREF_SET(__pyx_t_8, function); } } __pyx_t_1 = (__pyx_t_9) ? __Pyx_PyObject_Call2Args(__pyx_t_8, __pyx_t_9, __pyx_t_7) : __Pyx_PyObject_CallOneArg(__pyx_t_8, __pyx_t_7); __Pyx_XDECREF(__pyx_t_9); __pyx_t_9 = 0; __Pyx_DECREF(__pyx_t_7); __pyx_t_7 = 0; if (unlikely(!__pyx_t_1)) __PYX_ERR(1, 14, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_1); __Pyx_DECREF(__pyx_t_8); __pyx_t_8 = 0; __Pyx_DECREF(__pyx_t_1); __pyx_t_1 = 0; /* "(tree fragment)":13 * cdef __pyx_unpickle_RawResponseMessage__set_state(RawResponseMessage __pyx_result, tuple __pyx_state): * __pyx_result.chunked = __pyx_state[0]; __pyx_result.code = __pyx_state[1]; __pyx_result.compression = __pyx_state[2]; __pyx_result.headers = __pyx_state[3]; __pyx_result.raw_headers = __pyx_state[4]; __pyx_result.reason = __pyx_state[5]; __pyx_result.should_close = __pyx_state[6]; __pyx_result.upgrade = __pyx_state[7]; __pyx_result.version = __pyx_state[8] * if len(__pyx_state) > 9 and hasattr(__pyx_result, '__dict__'): # <<<<<<<<<<<<<< * __pyx_result.__dict__.update(__pyx_state[9]) */ } /* "(tree fragment)":11 * __pyx_unpickle_RawResponseMessage__set_state( __pyx_result, __pyx_state) * return __pyx_result * cdef __pyx_unpickle_RawResponseMessage__set_state(RawResponseMessage __pyx_result, tuple __pyx_state): # <<<<<<<<<<<<<< * __pyx_result.chunked = __pyx_state[0]; __pyx_result.code = __pyx_state[1]; __pyx_result.compression = __pyx_state[2]; __pyx_result.headers = __pyx_state[3]; __pyx_result.raw_headers = __pyx_state[4]; __pyx_result.reason = __pyx_state[5]; __pyx_result.should_close = __pyx_state[6]; __pyx_result.upgrade = __pyx_state[7]; __pyx_result.version = __pyx_state[8] * if len(__pyx_state) > 9 and hasattr(__pyx_result, '__dict__'): */ /* function exit code */ __pyx_r = Py_None; __Pyx_INCREF(Py_None); goto __pyx_L0; __pyx_L1_error:; __Pyx_XDECREF(__pyx_t_1); __Pyx_XDECREF(__pyx_t_7); __Pyx_XDECREF(__pyx_t_8); __Pyx_XDECREF(__pyx_t_9); __Pyx_AddTraceback("aiohttp._http_parser.__pyx_unpickle_RawResponseMessage__set_state", __pyx_clineno, __pyx_lineno, __pyx_filename); __pyx_r = 0; __pyx_L0:; __Pyx_XGIVEREF(__pyx_r); __Pyx_RefNannyFinishContext(); return __pyx_r; } static struct __pyx_obj_7aiohttp_12_http_parser_RawRequestMessage *__pyx_freelist_7aiohttp_12_http_parser_RawRequestMessage[250]; static int __pyx_freecount_7aiohttp_12_http_parser_RawRequestMessage = 0; static PyObject *__pyx_tp_new_7aiohttp_12_http_parser_RawRequestMessage(PyTypeObject *t, CYTHON_UNUSED PyObject *a, CYTHON_UNUSED PyObject *k) { struct __pyx_obj_7aiohttp_12_http_parser_RawRequestMessage *p; PyObject *o; if (CYTHON_COMPILING_IN_CPYTHON && likely((__pyx_freecount_7aiohttp_12_http_parser_RawRequestMessage > 0) & (t->tp_basicsize == sizeof(struct __pyx_obj_7aiohttp_12_http_parser_RawRequestMessage)) & ((t->tp_flags & (Py_TPFLAGS_IS_ABSTRACT | Py_TPFLAGS_HEAPTYPE)) == 0))) { o = 
(PyObject*)__pyx_freelist_7aiohttp_12_http_parser_RawRequestMessage[--__pyx_freecount_7aiohttp_12_http_parser_RawRequestMessage]; memset(o, 0, sizeof(struct __pyx_obj_7aiohttp_12_http_parser_RawRequestMessage)); (void) PyObject_INIT(o, t); PyObject_GC_Track(o); } else { if (likely((t->tp_flags & Py_TPFLAGS_IS_ABSTRACT) == 0)) { o = (*t->tp_alloc)(t, 0); } else { o = (PyObject *) PyBaseObject_Type.tp_new(t, __pyx_empty_tuple, 0); } if (unlikely(!o)) return 0; } p = ((struct __pyx_obj_7aiohttp_12_http_parser_RawRequestMessage *)o); p->method = ((PyObject*)Py_None); Py_INCREF(Py_None); p->path = ((PyObject*)Py_None); Py_INCREF(Py_None); p->version = Py_None; Py_INCREF(Py_None); p->headers = Py_None; Py_INCREF(Py_None); p->raw_headers = Py_None; Py_INCREF(Py_None); p->should_close = Py_None; Py_INCREF(Py_None); p->compression = Py_None; Py_INCREF(Py_None); p->upgrade = Py_None; Py_INCREF(Py_None); p->chunked = Py_None; Py_INCREF(Py_None); p->url = Py_None; Py_INCREF(Py_None); return o; } static void __pyx_tp_dealloc_7aiohttp_12_http_parser_RawRequestMessage(PyObject *o) { struct __pyx_obj_7aiohttp_12_http_parser_RawRequestMessage *p = (struct __pyx_obj_7aiohttp_12_http_parser_RawRequestMessage *)o; #if CYTHON_USE_TP_FINALIZE if (unlikely(PyType_HasFeature(Py_TYPE(o), Py_TPFLAGS_HAVE_FINALIZE) && Py_TYPE(o)->tp_finalize) && !_PyGC_FINALIZED(o)) { if (PyObject_CallFinalizerFromDealloc(o)) return; } #endif PyObject_GC_UnTrack(o); Py_CLEAR(p->method); Py_CLEAR(p->path); Py_CLEAR(p->version); Py_CLEAR(p->headers); Py_CLEAR(p->raw_headers); Py_CLEAR(p->should_close); Py_CLEAR(p->compression); Py_CLEAR(p->upgrade); Py_CLEAR(p->chunked); Py_CLEAR(p->url); if (CYTHON_COMPILING_IN_CPYTHON && ((__pyx_freecount_7aiohttp_12_http_parser_RawRequestMessage < 250) & (Py_TYPE(o)->tp_basicsize == sizeof(struct __pyx_obj_7aiohttp_12_http_parser_RawRequestMessage)) & ((Py_TYPE(o)->tp_flags & (Py_TPFLAGS_IS_ABSTRACT | Py_TPFLAGS_HEAPTYPE)) == 0))) { __pyx_freelist_7aiohttp_12_http_parser_RawRequestMessage[__pyx_freecount_7aiohttp_12_http_parser_RawRequestMessage++] = ((struct __pyx_obj_7aiohttp_12_http_parser_RawRequestMessage *)o); } else { (*Py_TYPE(o)->tp_free)(o); } } static int __pyx_tp_traverse_7aiohttp_12_http_parser_RawRequestMessage(PyObject *o, visitproc v, void *a) { int e; struct __pyx_obj_7aiohttp_12_http_parser_RawRequestMessage *p = (struct __pyx_obj_7aiohttp_12_http_parser_RawRequestMessage *)o; if (p->version) { e = (*v)(p->version, a); if (e) return e; } if (p->headers) { e = (*v)(p->headers, a); if (e) return e; } if (p->raw_headers) { e = (*v)(p->raw_headers, a); if (e) return e; } if (p->should_close) { e = (*v)(p->should_close, a); if (e) return e; } if (p->compression) { e = (*v)(p->compression, a); if (e) return e; } if (p->upgrade) { e = (*v)(p->upgrade, a); if (e) return e; } if (p->chunked) { e = (*v)(p->chunked, a); if (e) return e; } if (p->url) { e = (*v)(p->url, a); if (e) return e; } return 0; } static int __pyx_tp_clear_7aiohttp_12_http_parser_RawRequestMessage(PyObject *o) { PyObject* tmp; struct __pyx_obj_7aiohttp_12_http_parser_RawRequestMessage *p = (struct __pyx_obj_7aiohttp_12_http_parser_RawRequestMessage *)o; tmp = ((PyObject*)p->version); p->version = Py_None; Py_INCREF(Py_None); Py_XDECREF(tmp); tmp = ((PyObject*)p->headers); p->headers = Py_None; Py_INCREF(Py_None); Py_XDECREF(tmp); tmp = ((PyObject*)p->raw_headers); p->raw_headers = Py_None; Py_INCREF(Py_None); Py_XDECREF(tmp); tmp = ((PyObject*)p->should_close); p->should_close = Py_None; Py_INCREF(Py_None); 
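/* tp_clear for RawRequestMessage continues below: each remaining attribute is
 * swapped to Py_None before the old reference is released, keeping the object
 * consistent while the cycle collector breaks reference cycles. */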
Py_XDECREF(tmp); tmp = ((PyObject*)p->compression); p->compression = Py_None; Py_INCREF(Py_None); Py_XDECREF(tmp); tmp = ((PyObject*)p->upgrade); p->upgrade = Py_None; Py_INCREF(Py_None); Py_XDECREF(tmp); tmp = ((PyObject*)p->chunked); p->chunked = Py_None; Py_INCREF(Py_None); Py_XDECREF(tmp); tmp = ((PyObject*)p->url); p->url = Py_None; Py_INCREF(Py_None); Py_XDECREF(tmp); return 0; } static PyObject *__pyx_getprop_7aiohttp_12_http_parser_17RawRequestMessage_method(PyObject *o, CYTHON_UNUSED void *x) { return __pyx_pw_7aiohttp_12_http_parser_17RawRequestMessage_6method_1__get__(o); } static PyObject *__pyx_getprop_7aiohttp_12_http_parser_17RawRequestMessage_path(PyObject *o, CYTHON_UNUSED void *x) { return __pyx_pw_7aiohttp_12_http_parser_17RawRequestMessage_4path_1__get__(o); } static PyObject *__pyx_getprop_7aiohttp_12_http_parser_17RawRequestMessage_version(PyObject *o, CYTHON_UNUSED void *x) { return __pyx_pw_7aiohttp_12_http_parser_17RawRequestMessage_7version_1__get__(o); } static PyObject *__pyx_getprop_7aiohttp_12_http_parser_17RawRequestMessage_headers(PyObject *o, CYTHON_UNUSED void *x) { return __pyx_pw_7aiohttp_12_http_parser_17RawRequestMessage_7headers_1__get__(o); } static PyObject *__pyx_getprop_7aiohttp_12_http_parser_17RawRequestMessage_raw_headers(PyObject *o, CYTHON_UNUSED void *x) { return __pyx_pw_7aiohttp_12_http_parser_17RawRequestMessage_11raw_headers_1__get__(o); } static PyObject *__pyx_getprop_7aiohttp_12_http_parser_17RawRequestMessage_should_close(PyObject *o, CYTHON_UNUSED void *x) { return __pyx_pw_7aiohttp_12_http_parser_17RawRequestMessage_12should_close_1__get__(o); } static PyObject *__pyx_getprop_7aiohttp_12_http_parser_17RawRequestMessage_compression(PyObject *o, CYTHON_UNUSED void *x) { return __pyx_pw_7aiohttp_12_http_parser_17RawRequestMessage_11compression_1__get__(o); } static PyObject *__pyx_getprop_7aiohttp_12_http_parser_17RawRequestMessage_upgrade(PyObject *o, CYTHON_UNUSED void *x) { return __pyx_pw_7aiohttp_12_http_parser_17RawRequestMessage_7upgrade_1__get__(o); } static PyObject *__pyx_getprop_7aiohttp_12_http_parser_17RawRequestMessage_chunked(PyObject *o, CYTHON_UNUSED void *x) { return __pyx_pw_7aiohttp_12_http_parser_17RawRequestMessage_7chunked_1__get__(o); } static PyObject *__pyx_getprop_7aiohttp_12_http_parser_17RawRequestMessage_url(PyObject *o, CYTHON_UNUSED void *x) { return __pyx_pw_7aiohttp_12_http_parser_17RawRequestMessage_3url_1__get__(o); } static PyMethodDef __pyx_methods_7aiohttp_12_http_parser_RawRequestMessage[] = { {"_replace", (PyCFunction)(void*)(PyCFunctionWithKeywords)__pyx_pw_7aiohttp_12_http_parser_17RawRequestMessage_5_replace, METH_VARARGS|METH_KEYWORDS, 0}, {"__reduce_cython__", (PyCFunction)__pyx_pw_7aiohttp_12_http_parser_17RawRequestMessage_7__reduce_cython__, METH_NOARGS, 0}, {"__setstate_cython__", (PyCFunction)__pyx_pw_7aiohttp_12_http_parser_17RawRequestMessage_9__setstate_cython__, METH_O, 0}, {0, 0, 0, 0} }; static struct PyGetSetDef __pyx_getsets_7aiohttp_12_http_parser_RawRequestMessage[] = { {(char *)"method", __pyx_getprop_7aiohttp_12_http_parser_17RawRequestMessage_method, 0, (char *)0, 0}, {(char *)"path", __pyx_getprop_7aiohttp_12_http_parser_17RawRequestMessage_path, 0, (char *)0, 0}, {(char *)"version", __pyx_getprop_7aiohttp_12_http_parser_17RawRequestMessage_version, 0, (char *)0, 0}, {(char *)"headers", __pyx_getprop_7aiohttp_12_http_parser_17RawRequestMessage_headers, 0, (char *)0, 0}, {(char *)"raw_headers", __pyx_getprop_7aiohttp_12_http_parser_17RawRequestMessage_raw_headers, 0, 
(char *)0, 0}, {(char *)"should_close", __pyx_getprop_7aiohttp_12_http_parser_17RawRequestMessage_should_close, 0, (char *)0, 0}, {(char *)"compression", __pyx_getprop_7aiohttp_12_http_parser_17RawRequestMessage_compression, 0, (char *)0, 0}, {(char *)"upgrade", __pyx_getprop_7aiohttp_12_http_parser_17RawRequestMessage_upgrade, 0, (char *)0, 0}, {(char *)"chunked", __pyx_getprop_7aiohttp_12_http_parser_17RawRequestMessage_chunked, 0, (char *)0, 0}, {(char *)"url", __pyx_getprop_7aiohttp_12_http_parser_17RawRequestMessage_url, 0, (char *)0, 0}, {0, 0, 0, 0, 0} }; static PyTypeObject __pyx_type_7aiohttp_12_http_parser_RawRequestMessage = { PyVarObject_HEAD_INIT(0, 0) "aiohttp._http_parser.RawRequestMessage", /*tp_name*/ sizeof(struct __pyx_obj_7aiohttp_12_http_parser_RawRequestMessage), /*tp_basicsize*/ 0, /*tp_itemsize*/ __pyx_tp_dealloc_7aiohttp_12_http_parser_RawRequestMessage, /*tp_dealloc*/ 0, /*tp_print*/ 0, /*tp_getattr*/ 0, /*tp_setattr*/ #if PY_MAJOR_VERSION < 3 0, /*tp_compare*/ #endif #if PY_MAJOR_VERSION >= 3 0, /*tp_as_async*/ #endif __pyx_pw_7aiohttp_12_http_parser_17RawRequestMessage_3__repr__, /*tp_repr*/ 0, /*tp_as_number*/ 0, /*tp_as_sequence*/ 0, /*tp_as_mapping*/ 0, /*tp_hash*/ 0, /*tp_call*/ 0, /*tp_str*/ 0, /*tp_getattro*/ 0, /*tp_setattro*/ 0, /*tp_as_buffer*/ Py_TPFLAGS_DEFAULT|Py_TPFLAGS_HAVE_VERSION_TAG|Py_TPFLAGS_CHECKTYPES|Py_TPFLAGS_HAVE_NEWBUFFER|Py_TPFLAGS_BASETYPE|Py_TPFLAGS_HAVE_GC, /*tp_flags*/ 0, /*tp_doc*/ __pyx_tp_traverse_7aiohttp_12_http_parser_RawRequestMessage, /*tp_traverse*/ __pyx_tp_clear_7aiohttp_12_http_parser_RawRequestMessage, /*tp_clear*/ 0, /*tp_richcompare*/ 0, /*tp_weaklistoffset*/ 0, /*tp_iter*/ 0, /*tp_iternext*/ __pyx_methods_7aiohttp_12_http_parser_RawRequestMessage, /*tp_methods*/ 0, /*tp_members*/ __pyx_getsets_7aiohttp_12_http_parser_RawRequestMessage, /*tp_getset*/ 0, /*tp_base*/ 0, /*tp_dict*/ 0, /*tp_descr_get*/ 0, /*tp_descr_set*/ 0, /*tp_dictoffset*/ __pyx_pw_7aiohttp_12_http_parser_17RawRequestMessage_1__init__, /*tp_init*/ 0, /*tp_alloc*/ __pyx_tp_new_7aiohttp_12_http_parser_RawRequestMessage, /*tp_new*/ 0, /*tp_free*/ 0, /*tp_is_gc*/ 0, /*tp_bases*/ 0, /*tp_mro*/ 0, /*tp_cache*/ 0, /*tp_subclasses*/ 0, /*tp_weaklist*/ 0, /*tp_del*/ 0, /*tp_version_tag*/ #if PY_VERSION_HEX >= 0x030400a1 0, /*tp_finalize*/ #endif #if PY_VERSION_HEX >= 0x030800b1 0, /*tp_vectorcall*/ #endif }; static struct __pyx_obj_7aiohttp_12_http_parser_RawResponseMessage *__pyx_freelist_7aiohttp_12_http_parser_RawResponseMessage[250]; static int __pyx_freecount_7aiohttp_12_http_parser_RawResponseMessage = 0; static PyObject *__pyx_tp_new_7aiohttp_12_http_parser_RawResponseMessage(PyTypeObject *t, CYTHON_UNUSED PyObject *a, CYTHON_UNUSED PyObject *k) { struct __pyx_obj_7aiohttp_12_http_parser_RawResponseMessage *p; PyObject *o; if (CYTHON_COMPILING_IN_CPYTHON && likely((__pyx_freecount_7aiohttp_12_http_parser_RawResponseMessage > 0) & (t->tp_basicsize == sizeof(struct __pyx_obj_7aiohttp_12_http_parser_RawResponseMessage)) & ((t->tp_flags & (Py_TPFLAGS_IS_ABSTRACT | Py_TPFLAGS_HEAPTYPE)) == 0))) { o = (PyObject*)__pyx_freelist_7aiohttp_12_http_parser_RawResponseMessage[--__pyx_freecount_7aiohttp_12_http_parser_RawResponseMessage]; memset(o, 0, sizeof(struct __pyx_obj_7aiohttp_12_http_parser_RawResponseMessage)); (void) PyObject_INIT(o, t); PyObject_GC_Track(o); } else { if (likely((t->tp_flags & Py_TPFLAGS_IS_ABSTRACT) == 0)) { o = (*t->tp_alloc)(t, 0); } else { o = (PyObject *) PyBaseObject_Type.tp_new(t, __pyx_empty_tuple, 0); } if (unlikely(!o)) return 0; 
} p = ((struct __pyx_obj_7aiohttp_12_http_parser_RawResponseMessage *)o); p->version = Py_None; Py_INCREF(Py_None); p->reason = ((PyObject*)Py_None); Py_INCREF(Py_None); p->headers = Py_None; Py_INCREF(Py_None); p->raw_headers = Py_None; Py_INCREF(Py_None); p->should_close = Py_None; Py_INCREF(Py_None); p->compression = Py_None; Py_INCREF(Py_None); p->upgrade = Py_None; Py_INCREF(Py_None); p->chunked = Py_None; Py_INCREF(Py_None); return o; } static void __pyx_tp_dealloc_7aiohttp_12_http_parser_RawResponseMessage(PyObject *o) { struct __pyx_obj_7aiohttp_12_http_parser_RawResponseMessage *p = (struct __pyx_obj_7aiohttp_12_http_parser_RawResponseMessage *)o; #if CYTHON_USE_TP_FINALIZE if (unlikely(PyType_HasFeature(Py_TYPE(o), Py_TPFLAGS_HAVE_FINALIZE) && Py_TYPE(o)->tp_finalize) && !_PyGC_FINALIZED(o)) { if (PyObject_CallFinalizerFromDealloc(o)) return; } #endif PyObject_GC_UnTrack(o); Py_CLEAR(p->version); Py_CLEAR(p->reason); Py_CLEAR(p->headers); Py_CLEAR(p->raw_headers); Py_CLEAR(p->should_close); Py_CLEAR(p->compression); Py_CLEAR(p->upgrade); Py_CLEAR(p->chunked); if (CYTHON_COMPILING_IN_CPYTHON && ((__pyx_freecount_7aiohttp_12_http_parser_RawResponseMessage < 250) & (Py_TYPE(o)->tp_basicsize == sizeof(struct __pyx_obj_7aiohttp_12_http_parser_RawResponseMessage)) & ((Py_TYPE(o)->tp_flags & (Py_TPFLAGS_IS_ABSTRACT | Py_TPFLAGS_HEAPTYPE)) == 0))) { __pyx_freelist_7aiohttp_12_http_parser_RawResponseMessage[__pyx_freecount_7aiohttp_12_http_parser_RawResponseMessage++] = ((struct __pyx_obj_7aiohttp_12_http_parser_RawResponseMessage *)o); } else { (*Py_TYPE(o)->tp_free)(o); } } static int __pyx_tp_traverse_7aiohttp_12_http_parser_RawResponseMessage(PyObject *o, visitproc v, void *a) { int e; struct __pyx_obj_7aiohttp_12_http_parser_RawResponseMessage *p = (struct __pyx_obj_7aiohttp_12_http_parser_RawResponseMessage *)o; if (p->version) { e = (*v)(p->version, a); if (e) return e; } if (p->headers) { e = (*v)(p->headers, a); if (e) return e; } if (p->raw_headers) { e = (*v)(p->raw_headers, a); if (e) return e; } if (p->should_close) { e = (*v)(p->should_close, a); if (e) return e; } if (p->compression) { e = (*v)(p->compression, a); if (e) return e; } if (p->upgrade) { e = (*v)(p->upgrade, a); if (e) return e; } if (p->chunked) { e = (*v)(p->chunked, a); if (e) return e; } return 0; } static int __pyx_tp_clear_7aiohttp_12_http_parser_RawResponseMessage(PyObject *o) { PyObject* tmp; struct __pyx_obj_7aiohttp_12_http_parser_RawResponseMessage *p = (struct __pyx_obj_7aiohttp_12_http_parser_RawResponseMessage *)o; tmp = ((PyObject*)p->version); p->version = Py_None; Py_INCREF(Py_None); Py_XDECREF(tmp); tmp = ((PyObject*)p->headers); p->headers = Py_None; Py_INCREF(Py_None); Py_XDECREF(tmp); tmp = ((PyObject*)p->raw_headers); p->raw_headers = Py_None; Py_INCREF(Py_None); Py_XDECREF(tmp); tmp = ((PyObject*)p->should_close); p->should_close = Py_None; Py_INCREF(Py_None); Py_XDECREF(tmp); tmp = ((PyObject*)p->compression); p->compression = Py_None; Py_INCREF(Py_None); Py_XDECREF(tmp); tmp = ((PyObject*)p->upgrade); p->upgrade = Py_None; Py_INCREF(Py_None); Py_XDECREF(tmp); tmp = ((PyObject*)p->chunked); p->chunked = Py_None; Py_INCREF(Py_None); Py_XDECREF(tmp); return 0; } static PyObject *__pyx_getprop_7aiohttp_12_http_parser_18RawResponseMessage_version(PyObject *o, CYTHON_UNUSED void *x) { return __pyx_pw_7aiohttp_12_http_parser_18RawResponseMessage_7version_1__get__(o); } static PyObject *__pyx_getprop_7aiohttp_12_http_parser_18RawResponseMessage_code(PyObject *o, CYTHON_UNUSED void *x) { return 
__pyx_pw_7aiohttp_12_http_parser_18RawResponseMessage_4code_1__get__(o); } static PyObject *__pyx_getprop_7aiohttp_12_http_parser_18RawResponseMessage_reason(PyObject *o, CYTHON_UNUSED void *x) { return __pyx_pw_7aiohttp_12_http_parser_18RawResponseMessage_6reason_1__get__(o); } static PyObject *__pyx_getprop_7aiohttp_12_http_parser_18RawResponseMessage_headers(PyObject *o, CYTHON_UNUSED void *x) { return __pyx_pw_7aiohttp_12_http_parser_18RawResponseMessage_7headers_1__get__(o); } static PyObject *__pyx_getprop_7aiohttp_12_http_parser_18RawResponseMessage_raw_headers(PyObject *o, CYTHON_UNUSED void *x) { return __pyx_pw_7aiohttp_12_http_parser_18RawResponseMessage_11raw_headers_1__get__(o); } static PyObject *__pyx_getprop_7aiohttp_12_http_parser_18RawResponseMessage_should_close(PyObject *o, CYTHON_UNUSED void *x) { return __pyx_pw_7aiohttp_12_http_parser_18RawResponseMessage_12should_close_1__get__(o); } static PyObject *__pyx_getprop_7aiohttp_12_http_parser_18RawResponseMessage_compression(PyObject *o, CYTHON_UNUSED void *x) { return __pyx_pw_7aiohttp_12_http_parser_18RawResponseMessage_11compression_1__get__(o); } static PyObject *__pyx_getprop_7aiohttp_12_http_parser_18RawResponseMessage_upgrade(PyObject *o, CYTHON_UNUSED void *x) { return __pyx_pw_7aiohttp_12_http_parser_18RawResponseMessage_7upgrade_1__get__(o); } static PyObject *__pyx_getprop_7aiohttp_12_http_parser_18RawResponseMessage_chunked(PyObject *o, CYTHON_UNUSED void *x) { return __pyx_pw_7aiohttp_12_http_parser_18RawResponseMessage_7chunked_1__get__(o); } static PyMethodDef __pyx_methods_7aiohttp_12_http_parser_RawResponseMessage[] = { {"__reduce_cython__", (PyCFunction)__pyx_pw_7aiohttp_12_http_parser_18RawResponseMessage_5__reduce_cython__, METH_NOARGS, 0}, {"__setstate_cython__", (PyCFunction)__pyx_pw_7aiohttp_12_http_parser_18RawResponseMessage_7__setstate_cython__, METH_O, 0}, {0, 0, 0, 0} }; static struct PyGetSetDef __pyx_getsets_7aiohttp_12_http_parser_RawResponseMessage[] = { {(char *)"version", __pyx_getprop_7aiohttp_12_http_parser_18RawResponseMessage_version, 0, (char *)0, 0}, {(char *)"code", __pyx_getprop_7aiohttp_12_http_parser_18RawResponseMessage_code, 0, (char *)0, 0}, {(char *)"reason", __pyx_getprop_7aiohttp_12_http_parser_18RawResponseMessage_reason, 0, (char *)0, 0}, {(char *)"headers", __pyx_getprop_7aiohttp_12_http_parser_18RawResponseMessage_headers, 0, (char *)0, 0}, {(char *)"raw_headers", __pyx_getprop_7aiohttp_12_http_parser_18RawResponseMessage_raw_headers, 0, (char *)0, 0}, {(char *)"should_close", __pyx_getprop_7aiohttp_12_http_parser_18RawResponseMessage_should_close, 0, (char *)0, 0}, {(char *)"compression", __pyx_getprop_7aiohttp_12_http_parser_18RawResponseMessage_compression, 0, (char *)0, 0}, {(char *)"upgrade", __pyx_getprop_7aiohttp_12_http_parser_18RawResponseMessage_upgrade, 0, (char *)0, 0}, {(char *)"chunked", __pyx_getprop_7aiohttp_12_http_parser_18RawResponseMessage_chunked, 0, (char *)0, 0}, {0, 0, 0, 0, 0} }; static PyTypeObject __pyx_type_7aiohttp_12_http_parser_RawResponseMessage = { PyVarObject_HEAD_INIT(0, 0) "aiohttp._http_parser.RawResponseMessage", /*tp_name*/ sizeof(struct __pyx_obj_7aiohttp_12_http_parser_RawResponseMessage), /*tp_basicsize*/ 0, /*tp_itemsize*/ __pyx_tp_dealloc_7aiohttp_12_http_parser_RawResponseMessage, /*tp_dealloc*/ 0, /*tp_print*/ 0, /*tp_getattr*/ 0, /*tp_setattr*/ #if PY_MAJOR_VERSION < 3 0, /*tp_compare*/ #endif #if PY_MAJOR_VERSION >= 3 0, /*tp_as_async*/ #endif __pyx_pw_7aiohttp_12_http_parser_18RawResponseMessage_3__repr__, /*tp_repr*/ 
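/* Only the dealloc, repr, flags, GC traverse/clear, method/getset tables,
 * __init__ and tp_new slots of this RawResponseMessage type table are
 * populated; every other slot is left at zero so CPython supplies defaults. */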
0, /*tp_as_number*/ 0, /*tp_as_sequence*/ 0, /*tp_as_mapping*/ 0, /*tp_hash*/ 0, /*tp_call*/ 0, /*tp_str*/ 0, /*tp_getattro*/ 0, /*tp_setattro*/ 0, /*tp_as_buffer*/ Py_TPFLAGS_DEFAULT|Py_TPFLAGS_HAVE_VERSION_TAG|Py_TPFLAGS_CHECKTYPES|Py_TPFLAGS_HAVE_NEWBUFFER|Py_TPFLAGS_BASETYPE|Py_TPFLAGS_HAVE_GC, /*tp_flags*/ 0, /*tp_doc*/ __pyx_tp_traverse_7aiohttp_12_http_parser_RawResponseMessage, /*tp_traverse*/ __pyx_tp_clear_7aiohttp_12_http_parser_RawResponseMessage, /*tp_clear*/ 0, /*tp_richcompare*/ 0, /*tp_weaklistoffset*/ 0, /*tp_iter*/ 0, /*tp_iternext*/ __pyx_methods_7aiohttp_12_http_parser_RawResponseMessage, /*tp_methods*/ 0, /*tp_members*/ __pyx_getsets_7aiohttp_12_http_parser_RawResponseMessage, /*tp_getset*/ 0, /*tp_base*/ 0, /*tp_dict*/ 0, /*tp_descr_get*/ 0, /*tp_descr_set*/ 0, /*tp_dictoffset*/ __pyx_pw_7aiohttp_12_http_parser_18RawResponseMessage_1__init__, /*tp_init*/ 0, /*tp_alloc*/ __pyx_tp_new_7aiohttp_12_http_parser_RawResponseMessage, /*tp_new*/ 0, /*tp_free*/ 0, /*tp_is_gc*/ 0, /*tp_bases*/ 0, /*tp_mro*/ 0, /*tp_cache*/ 0, /*tp_subclasses*/ 0, /*tp_weaklist*/ 0, /*tp_del*/ 0, /*tp_version_tag*/ #if PY_VERSION_HEX >= 0x030400a1 0, /*tp_finalize*/ #endif #if PY_VERSION_HEX >= 0x030800b1 0, /*tp_vectorcall*/ #endif }; static struct __pyx_vtabstruct_7aiohttp_12_http_parser_HttpParser __pyx_vtable_7aiohttp_12_http_parser_HttpParser; static PyObject *__pyx_tp_new_7aiohttp_12_http_parser_HttpParser(PyTypeObject *t, CYTHON_UNUSED PyObject *a, CYTHON_UNUSED PyObject *k) { struct __pyx_obj_7aiohttp_12_http_parser_HttpParser *p; PyObject *o; if (likely((t->tp_flags & Py_TPFLAGS_IS_ABSTRACT) == 0)) { o = (*t->tp_alloc)(t, 0); } else { o = (PyObject *) PyBaseObject_Type.tp_new(t, __pyx_empty_tuple, 0); } if (unlikely(!o)) return 0; p = ((struct __pyx_obj_7aiohttp_12_http_parser_HttpParser *)o); p->__pyx_vtab = __pyx_vtabptr_7aiohttp_12_http_parser_HttpParser; p->_raw_name = ((PyObject*)Py_None); Py_INCREF(Py_None); p->_raw_value = ((PyObject*)Py_None); Py_INCREF(Py_None); p->_protocol = Py_None; Py_INCREF(Py_None); p->_loop = Py_None; Py_INCREF(Py_None); p->_timer = Py_None; Py_INCREF(Py_None); p->_url = Py_None; Py_INCREF(Py_None); p->_buf = ((PyObject*)Py_None); Py_INCREF(Py_None); p->_path = ((PyObject*)Py_None); Py_INCREF(Py_None); p->_reason = ((PyObject*)Py_None); Py_INCREF(Py_None); p->_headers = Py_None; Py_INCREF(Py_None); p->_raw_headers = ((PyObject*)Py_None); Py_INCREF(Py_None); p->_messages = ((PyObject*)Py_None); Py_INCREF(Py_None); p->_payload = Py_None; Py_INCREF(Py_None); p->_payload_exception = Py_None; Py_INCREF(Py_None); p->_last_error = Py_None; Py_INCREF(Py_None); p->_content_encoding = ((PyObject*)Py_None); Py_INCREF(Py_None); p->py_buf.obj = NULL; if (unlikely(__pyx_pw_7aiohttp_12_http_parser_10HttpParser_1__cinit__(o, __pyx_empty_tuple, NULL) < 0)) goto bad; return o; bad: Py_DECREF(o); o = 0; return NULL; } static void __pyx_tp_dealloc_7aiohttp_12_http_parser_HttpParser(PyObject *o) { struct __pyx_obj_7aiohttp_12_http_parser_HttpParser *p = (struct __pyx_obj_7aiohttp_12_http_parser_HttpParser *)o; #if CYTHON_USE_TP_FINALIZE if (unlikely(PyType_HasFeature(Py_TYPE(o), Py_TPFLAGS_HAVE_FINALIZE) && Py_TYPE(o)->tp_finalize) && !_PyGC_FINALIZED(o)) { if (PyObject_CallFinalizerFromDealloc(o)) return; } #endif PyObject_GC_UnTrack(o); { PyObject *etype, *eval, *etb; PyErr_Fetch(&etype, &eval, &etb); ++Py_REFCNT(o); __pyx_pw_7aiohttp_12_http_parser_10HttpParser_3__dealloc__(o); --Py_REFCNT(o); PyErr_Restore(etype, eval, etb); } Py_CLEAR(p->_raw_name); 
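/* The temporary refcount bump around the __dealloc__ call above keeps the
 * parser alive while user-level cleanup runs; the Py_CLEAR calls that follow
 * then drop every attribute still held by the HttpParser instance. */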
Py_CLEAR(p->_raw_value); Py_CLEAR(p->_protocol); Py_CLEAR(p->_loop); Py_CLEAR(p->_timer); Py_CLEAR(p->_url); Py_CLEAR(p->_buf); Py_CLEAR(p->_path); Py_CLEAR(p->_reason); Py_CLEAR(p->_headers); Py_CLEAR(p->_raw_headers); Py_CLEAR(p->_messages); Py_CLEAR(p->_payload); Py_CLEAR(p->_payload_exception); Py_CLEAR(p->_last_error); Py_CLEAR(p->_content_encoding); (*Py_TYPE(o)->tp_free)(o); } static int __pyx_tp_traverse_7aiohttp_12_http_parser_HttpParser(PyObject *o, visitproc v, void *a) { int e; struct __pyx_obj_7aiohttp_12_http_parser_HttpParser *p = (struct __pyx_obj_7aiohttp_12_http_parser_HttpParser *)o; if (p->_protocol) { e = (*v)(p->_protocol, a); if (e) return e; } if (p->_loop) { e = (*v)(p->_loop, a); if (e) return e; } if (p->_timer) { e = (*v)(p->_timer, a); if (e) return e; } if (p->_url) { e = (*v)(p->_url, a); if (e) return e; } if (p->_headers) { e = (*v)(p->_headers, a); if (e) return e; } if (p->_raw_headers) { e = (*v)(p->_raw_headers, a); if (e) return e; } if (p->_messages) { e = (*v)(p->_messages, a); if (e) return e; } if (p->_payload) { e = (*v)(p->_payload, a); if (e) return e; } if (p->_payload_exception) { e = (*v)(p->_payload_exception, a); if (e) return e; } if (p->_last_error) { e = (*v)(p->_last_error, a); if (e) return e; } if (p->py_buf.obj) { e = (*v)(p->py_buf.obj, a); if (e) return e; } return 0; } static int __pyx_tp_clear_7aiohttp_12_http_parser_HttpParser(PyObject *o) { PyObject* tmp; struct __pyx_obj_7aiohttp_12_http_parser_HttpParser *p = (struct __pyx_obj_7aiohttp_12_http_parser_HttpParser *)o; tmp = ((PyObject*)p->_protocol); p->_protocol = Py_None; Py_INCREF(Py_None); Py_XDECREF(tmp); tmp = ((PyObject*)p->_loop); p->_loop = Py_None; Py_INCREF(Py_None); Py_XDECREF(tmp); tmp = ((PyObject*)p->_timer); p->_timer = Py_None; Py_INCREF(Py_None); Py_XDECREF(tmp); tmp = ((PyObject*)p->_url); p->_url = Py_None; Py_INCREF(Py_None); Py_XDECREF(tmp); tmp = ((PyObject*)p->_headers); p->_headers = Py_None; Py_INCREF(Py_None); Py_XDECREF(tmp); tmp = ((PyObject*)p->_raw_headers); p->_raw_headers = ((PyObject*)Py_None); Py_INCREF(Py_None); Py_XDECREF(tmp); tmp = ((PyObject*)p->_messages); p->_messages = ((PyObject*)Py_None); Py_INCREF(Py_None); Py_XDECREF(tmp); tmp = ((PyObject*)p->_payload); p->_payload = Py_None; Py_INCREF(Py_None); Py_XDECREF(tmp); tmp = ((PyObject*)p->_payload_exception); p->_payload_exception = Py_None; Py_INCREF(Py_None); Py_XDECREF(tmp); tmp = ((PyObject*)p->_last_error); p->_last_error = Py_None; Py_INCREF(Py_None); Py_XDECREF(tmp); Py_CLEAR(p->py_buf.obj); return 0; } static PyMethodDef __pyx_methods_7aiohttp_12_http_parser_HttpParser[] = { {"feed_eof", (PyCFunction)__pyx_pw_7aiohttp_12_http_parser_10HttpParser_5feed_eof, METH_NOARGS, 0}, {"feed_data", (PyCFunction)__pyx_pw_7aiohttp_12_http_parser_10HttpParser_7feed_data, METH_O, 0}, {"__reduce_cython__", (PyCFunction)__pyx_pw_7aiohttp_12_http_parser_10HttpParser_9__reduce_cython__, METH_NOARGS, 0}, {"__setstate_cython__", (PyCFunction)__pyx_pw_7aiohttp_12_http_parser_10HttpParser_11__setstate_cython__, METH_O, 0}, {0, 0, 0, 0} }; static PyTypeObject __pyx_type_7aiohttp_12_http_parser_HttpParser = { PyVarObject_HEAD_INIT(0, 0) "aiohttp._http_parser.HttpParser", /*tp_name*/ sizeof(struct __pyx_obj_7aiohttp_12_http_parser_HttpParser), /*tp_basicsize*/ 0, /*tp_itemsize*/ __pyx_tp_dealloc_7aiohttp_12_http_parser_HttpParser, /*tp_dealloc*/ 0, /*tp_print*/ 0, /*tp_getattr*/ 0, /*tp_setattr*/ #if PY_MAJOR_VERSION < 3 0, /*tp_compare*/ #endif #if PY_MAJOR_VERSION >= 3 0, /*tp_as_async*/ #endif 0, 
/*tp_repr*/ 0, /*tp_as_number*/ 0, /*tp_as_sequence*/ 0, /*tp_as_mapping*/ 0, /*tp_hash*/ 0, /*tp_call*/ 0, /*tp_str*/ 0, /*tp_getattro*/ 0, /*tp_setattro*/ 0, /*tp_as_buffer*/ Py_TPFLAGS_DEFAULT|Py_TPFLAGS_HAVE_VERSION_TAG|Py_TPFLAGS_CHECKTYPES|Py_TPFLAGS_HAVE_NEWBUFFER|Py_TPFLAGS_BASETYPE|Py_TPFLAGS_HAVE_GC, /*tp_flags*/ 0, /*tp_doc*/ __pyx_tp_traverse_7aiohttp_12_http_parser_HttpParser, /*tp_traverse*/ __pyx_tp_clear_7aiohttp_12_http_parser_HttpParser, /*tp_clear*/ 0, /*tp_richcompare*/ 0, /*tp_weaklistoffset*/ 0, /*tp_iter*/ 0, /*tp_iternext*/ __pyx_methods_7aiohttp_12_http_parser_HttpParser, /*tp_methods*/ 0, /*tp_members*/ 0, /*tp_getset*/ 0, /*tp_base*/ 0, /*tp_dict*/ 0, /*tp_descr_get*/ 0, /*tp_descr_set*/ 0, /*tp_dictoffset*/ 0, /*tp_init*/ 0, /*tp_alloc*/ __pyx_tp_new_7aiohttp_12_http_parser_HttpParser, /*tp_new*/ 0, /*tp_free*/ 0, /*tp_is_gc*/ 0, /*tp_bases*/ 0, /*tp_mro*/ 0, /*tp_cache*/ 0, /*tp_subclasses*/ 0, /*tp_weaklist*/ 0, /*tp_del*/ 0, /*tp_version_tag*/ #if PY_VERSION_HEX >= 0x030400a1 0, /*tp_finalize*/ #endif #if PY_VERSION_HEX >= 0x030800b1 0, /*tp_vectorcall*/ #endif }; static struct __pyx_vtabstruct_7aiohttp_12_http_parser_HttpRequestParser __pyx_vtable_7aiohttp_12_http_parser_HttpRequestParser; static PyObject *__pyx_tp_new_7aiohttp_12_http_parser_HttpRequestParser(PyTypeObject *t, PyObject *a, PyObject *k) { struct __pyx_obj_7aiohttp_12_http_parser_HttpRequestParser *p; PyObject *o = __pyx_tp_new_7aiohttp_12_http_parser_HttpParser(t, a, k); if (unlikely(!o)) return 0; p = ((struct __pyx_obj_7aiohttp_12_http_parser_HttpRequestParser *)o); p->__pyx_base.__pyx_vtab = (struct __pyx_vtabstruct_7aiohttp_12_http_parser_HttpParser*)__pyx_vtabptr_7aiohttp_12_http_parser_HttpRequestParser; return o; } static PyMethodDef __pyx_methods_7aiohttp_12_http_parser_HttpRequestParser[] = { {"__reduce_cython__", (PyCFunction)__pyx_pw_7aiohttp_12_http_parser_17HttpRequestParser_3__reduce_cython__, METH_NOARGS, 0}, {"__setstate_cython__", (PyCFunction)__pyx_pw_7aiohttp_12_http_parser_17HttpRequestParser_5__setstate_cython__, METH_O, 0}, {0, 0, 0, 0} }; static PyTypeObject __pyx_type_7aiohttp_12_http_parser_HttpRequestParser = { PyVarObject_HEAD_INIT(0, 0) "aiohttp._http_parser.HttpRequestParser", /*tp_name*/ sizeof(struct __pyx_obj_7aiohttp_12_http_parser_HttpRequestParser), /*tp_basicsize*/ 0, /*tp_itemsize*/ __pyx_tp_dealloc_7aiohttp_12_http_parser_HttpParser, /*tp_dealloc*/ 0, /*tp_print*/ 0, /*tp_getattr*/ 0, /*tp_setattr*/ #if PY_MAJOR_VERSION < 3 0, /*tp_compare*/ #endif #if PY_MAJOR_VERSION >= 3 0, /*tp_as_async*/ #endif 0, /*tp_repr*/ 0, /*tp_as_number*/ 0, /*tp_as_sequence*/ 0, /*tp_as_mapping*/ 0, /*tp_hash*/ 0, /*tp_call*/ 0, /*tp_str*/ 0, /*tp_getattro*/ 0, /*tp_setattro*/ 0, /*tp_as_buffer*/ Py_TPFLAGS_DEFAULT|Py_TPFLAGS_HAVE_VERSION_TAG|Py_TPFLAGS_CHECKTYPES|Py_TPFLAGS_HAVE_NEWBUFFER|Py_TPFLAGS_BASETYPE|Py_TPFLAGS_HAVE_GC, /*tp_flags*/ 0, /*tp_doc*/ __pyx_tp_traverse_7aiohttp_12_http_parser_HttpParser, /*tp_traverse*/ __pyx_tp_clear_7aiohttp_12_http_parser_HttpParser, /*tp_clear*/ 0, /*tp_richcompare*/ 0, /*tp_weaklistoffset*/ 0, /*tp_iter*/ 0, /*tp_iternext*/ __pyx_methods_7aiohttp_12_http_parser_HttpRequestParser, /*tp_methods*/ 0, /*tp_members*/ 0, /*tp_getset*/ 0, /*tp_base*/ 0, /*tp_dict*/ 0, /*tp_descr_get*/ 0, /*tp_descr_set*/ 0, /*tp_dictoffset*/ __pyx_pw_7aiohttp_12_http_parser_17HttpRequestParser_1__init__, /*tp_init*/ 0, /*tp_alloc*/ __pyx_tp_new_7aiohttp_12_http_parser_HttpRequestParser, /*tp_new*/ 0, /*tp_free*/ 0, /*tp_is_gc*/ 0, /*tp_bases*/ 0, /*tp_mro*/ 
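/* HttpRequestParser reuses the HttpParser dealloc, traverse and clear slots;
 * its tp_new wraps the base HttpParser allocation and installs the
 * request-specific vtable, and tp_init points at the request parser __init__. */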
0, /*tp_cache*/ 0, /*tp_subclasses*/ 0, /*tp_weaklist*/ 0, /*tp_del*/ 0, /*tp_version_tag*/ #if PY_VERSION_HEX >= 0x030400a1 0, /*tp_finalize*/ #endif #if PY_VERSION_HEX >= 0x030800b1 0, /*tp_vectorcall*/ #endif }; static struct __pyx_vtabstruct_7aiohttp_12_http_parser_HttpResponseParser __pyx_vtable_7aiohttp_12_http_parser_HttpResponseParser; static PyObject *__pyx_tp_new_7aiohttp_12_http_parser_HttpResponseParser(PyTypeObject *t, PyObject *a, PyObject *k) { struct __pyx_obj_7aiohttp_12_http_parser_HttpResponseParser *p; PyObject *o = __pyx_tp_new_7aiohttp_12_http_parser_HttpParser(t, a, k); if (unlikely(!o)) return 0; p = ((struct __pyx_obj_7aiohttp_12_http_parser_HttpResponseParser *)o); p->__pyx_base.__pyx_vtab = (struct __pyx_vtabstruct_7aiohttp_12_http_parser_HttpParser*)__pyx_vtabptr_7aiohttp_12_http_parser_HttpResponseParser; return o; } static PyMethodDef __pyx_methods_7aiohttp_12_http_parser_HttpResponseParser[] = { {"__reduce_cython__", (PyCFunction)__pyx_pw_7aiohttp_12_http_parser_18HttpResponseParser_3__reduce_cython__, METH_NOARGS, 0}, {"__setstate_cython__", (PyCFunction)__pyx_pw_7aiohttp_12_http_parser_18HttpResponseParser_5__setstate_cython__, METH_O, 0}, {0, 0, 0, 0} }; static PyTypeObject __pyx_type_7aiohttp_12_http_parser_HttpResponseParser = { PyVarObject_HEAD_INIT(0, 0) "aiohttp._http_parser.HttpResponseParser", /*tp_name*/ sizeof(struct __pyx_obj_7aiohttp_12_http_parser_HttpResponseParser), /*tp_basicsize*/ 0, /*tp_itemsize*/ __pyx_tp_dealloc_7aiohttp_12_http_parser_HttpParser, /*tp_dealloc*/ 0, /*tp_print*/ 0, /*tp_getattr*/ 0, /*tp_setattr*/ #if PY_MAJOR_VERSION < 3 0, /*tp_compare*/ #endif #if PY_MAJOR_VERSION >= 3 0, /*tp_as_async*/ #endif 0, /*tp_repr*/ 0, /*tp_as_number*/ 0, /*tp_as_sequence*/ 0, /*tp_as_mapping*/ 0, /*tp_hash*/ 0, /*tp_call*/ 0, /*tp_str*/ 0, /*tp_getattro*/ 0, /*tp_setattro*/ 0, /*tp_as_buffer*/ Py_TPFLAGS_DEFAULT|Py_TPFLAGS_HAVE_VERSION_TAG|Py_TPFLAGS_CHECKTYPES|Py_TPFLAGS_HAVE_NEWBUFFER|Py_TPFLAGS_BASETYPE|Py_TPFLAGS_HAVE_GC, /*tp_flags*/ 0, /*tp_doc*/ __pyx_tp_traverse_7aiohttp_12_http_parser_HttpParser, /*tp_traverse*/ __pyx_tp_clear_7aiohttp_12_http_parser_HttpParser, /*tp_clear*/ 0, /*tp_richcompare*/ 0, /*tp_weaklistoffset*/ 0, /*tp_iter*/ 0, /*tp_iternext*/ __pyx_methods_7aiohttp_12_http_parser_HttpResponseParser, /*tp_methods*/ 0, /*tp_members*/ 0, /*tp_getset*/ 0, /*tp_base*/ 0, /*tp_dict*/ 0, /*tp_descr_get*/ 0, /*tp_descr_set*/ 0, /*tp_dictoffset*/ __pyx_pw_7aiohttp_12_http_parser_18HttpResponseParser_1__init__, /*tp_init*/ 0, /*tp_alloc*/ __pyx_tp_new_7aiohttp_12_http_parser_HttpResponseParser, /*tp_new*/ 0, /*tp_free*/ 0, /*tp_is_gc*/ 0, /*tp_bases*/ 0, /*tp_mro*/ 0, /*tp_cache*/ 0, /*tp_subclasses*/ 0, /*tp_weaklist*/ 0, /*tp_del*/ 0, /*tp_version_tag*/ #if PY_VERSION_HEX >= 0x030400a1 0, /*tp_finalize*/ #endif #if PY_VERSION_HEX >= 0x030800b1 0, /*tp_vectorcall*/ #endif }; static struct __pyx_obj_7aiohttp_12_http_parser___pyx_scope_struct____repr__ *__pyx_freelist_7aiohttp_12_http_parser___pyx_scope_struct____repr__[8]; static int __pyx_freecount_7aiohttp_12_http_parser___pyx_scope_struct____repr__ = 0; static PyObject *__pyx_tp_new_7aiohttp_12_http_parser___pyx_scope_struct____repr__(PyTypeObject *t, CYTHON_UNUSED PyObject *a, CYTHON_UNUSED PyObject *k) { PyObject *o; if (CYTHON_COMPILING_IN_CPYTHON && likely((__pyx_freecount_7aiohttp_12_http_parser___pyx_scope_struct____repr__ > 0) & (t->tp_basicsize == sizeof(struct __pyx_obj_7aiohttp_12_http_parser___pyx_scope_struct____repr__)))) { o = 
(PyObject*)__pyx_freelist_7aiohttp_12_http_parser___pyx_scope_struct____repr__[--__pyx_freecount_7aiohttp_12_http_parser___pyx_scope_struct____repr__]; memset(o, 0, sizeof(struct __pyx_obj_7aiohttp_12_http_parser___pyx_scope_struct____repr__)); (void) PyObject_INIT(o, t); PyObject_GC_Track(o); } else { o = (*t->tp_alloc)(t, 0); if (unlikely(!o)) return 0; } return o; } static void __pyx_tp_dealloc_7aiohttp_12_http_parser___pyx_scope_struct____repr__(PyObject *o) { struct __pyx_obj_7aiohttp_12_http_parser___pyx_scope_struct____repr__ *p = (struct __pyx_obj_7aiohttp_12_http_parser___pyx_scope_struct____repr__ *)o; PyObject_GC_UnTrack(o); Py_CLEAR(p->__pyx_v_info); if (CYTHON_COMPILING_IN_CPYTHON && ((__pyx_freecount_7aiohttp_12_http_parser___pyx_scope_struct____repr__ < 8) & (Py_TYPE(o)->tp_basicsize == sizeof(struct __pyx_obj_7aiohttp_12_http_parser___pyx_scope_struct____repr__)))) { __pyx_freelist_7aiohttp_12_http_parser___pyx_scope_struct____repr__[__pyx_freecount_7aiohttp_12_http_parser___pyx_scope_struct____repr__++] = ((struct __pyx_obj_7aiohttp_12_http_parser___pyx_scope_struct____repr__ *)o); } else { (*Py_TYPE(o)->tp_free)(o); } } static int __pyx_tp_traverse_7aiohttp_12_http_parser___pyx_scope_struct____repr__(PyObject *o, visitproc v, void *a) { int e; struct __pyx_obj_7aiohttp_12_http_parser___pyx_scope_struct____repr__ *p = (struct __pyx_obj_7aiohttp_12_http_parser___pyx_scope_struct____repr__ *)o; if (p->__pyx_v_info) { e = (*v)(p->__pyx_v_info, a); if (e) return e; } return 0; } static int __pyx_tp_clear_7aiohttp_12_http_parser___pyx_scope_struct____repr__(PyObject *o) { PyObject* tmp; struct __pyx_obj_7aiohttp_12_http_parser___pyx_scope_struct____repr__ *p = (struct __pyx_obj_7aiohttp_12_http_parser___pyx_scope_struct____repr__ *)o; tmp = ((PyObject*)p->__pyx_v_info); p->__pyx_v_info = ((PyObject*)Py_None); Py_INCREF(Py_None); Py_XDECREF(tmp); return 0; } static PyTypeObject __pyx_type_7aiohttp_12_http_parser___pyx_scope_struct____repr__ = { PyVarObject_HEAD_INIT(0, 0) "aiohttp._http_parser.__pyx_scope_struct____repr__", /*tp_name*/ sizeof(struct __pyx_obj_7aiohttp_12_http_parser___pyx_scope_struct____repr__), /*tp_basicsize*/ 0, /*tp_itemsize*/ __pyx_tp_dealloc_7aiohttp_12_http_parser___pyx_scope_struct____repr__, /*tp_dealloc*/ 0, /*tp_print*/ 0, /*tp_getattr*/ 0, /*tp_setattr*/ #if PY_MAJOR_VERSION < 3 0, /*tp_compare*/ #endif #if PY_MAJOR_VERSION >= 3 0, /*tp_as_async*/ #endif 0, /*tp_repr*/ 0, /*tp_as_number*/ 0, /*tp_as_sequence*/ 0, /*tp_as_mapping*/ 0, /*tp_hash*/ 0, /*tp_call*/ 0, /*tp_str*/ 0, /*tp_getattro*/ 0, /*tp_setattro*/ 0, /*tp_as_buffer*/ Py_TPFLAGS_DEFAULT|Py_TPFLAGS_HAVE_VERSION_TAG|Py_TPFLAGS_CHECKTYPES|Py_TPFLAGS_HAVE_NEWBUFFER|Py_TPFLAGS_HAVE_GC, /*tp_flags*/ 0, /*tp_doc*/ __pyx_tp_traverse_7aiohttp_12_http_parser___pyx_scope_struct____repr__, /*tp_traverse*/ __pyx_tp_clear_7aiohttp_12_http_parser___pyx_scope_struct____repr__, /*tp_clear*/ 0, /*tp_richcompare*/ 0, /*tp_weaklistoffset*/ 0, /*tp_iter*/ 0, /*tp_iternext*/ 0, /*tp_methods*/ 0, /*tp_members*/ 0, /*tp_getset*/ 0, /*tp_base*/ 0, /*tp_dict*/ 0, /*tp_descr_get*/ 0, /*tp_descr_set*/ 0, /*tp_dictoffset*/ 0, /*tp_init*/ 0, /*tp_alloc*/ __pyx_tp_new_7aiohttp_12_http_parser___pyx_scope_struct____repr__, /*tp_new*/ 0, /*tp_free*/ 0, /*tp_is_gc*/ 0, /*tp_bases*/ 0, /*tp_mro*/ 0, /*tp_cache*/ 0, /*tp_subclasses*/ 0, /*tp_weaklist*/ 0, /*tp_del*/ 0, /*tp_version_tag*/ #if PY_VERSION_HEX >= 0x030400a1 0, /*tp_finalize*/ #endif #if PY_VERSION_HEX >= 0x030800b1 0, /*tp_vectorcall*/ #endif }; static struct 
__pyx_obj_7aiohttp_12_http_parser___pyx_scope_struct_1_genexpr *__pyx_freelist_7aiohttp_12_http_parser___pyx_scope_struct_1_genexpr[8]; static int __pyx_freecount_7aiohttp_12_http_parser___pyx_scope_struct_1_genexpr = 0; static PyObject *__pyx_tp_new_7aiohttp_12_http_parser___pyx_scope_struct_1_genexpr(PyTypeObject *t, CYTHON_UNUSED PyObject *a, CYTHON_UNUSED PyObject *k) { PyObject *o; if (CYTHON_COMPILING_IN_CPYTHON && likely((__pyx_freecount_7aiohttp_12_http_parser___pyx_scope_struct_1_genexpr > 0) & (t->tp_basicsize == sizeof(struct __pyx_obj_7aiohttp_12_http_parser___pyx_scope_struct_1_genexpr)))) { o = (PyObject*)__pyx_freelist_7aiohttp_12_http_parser___pyx_scope_struct_1_genexpr[--__pyx_freecount_7aiohttp_12_http_parser___pyx_scope_struct_1_genexpr]; memset(o, 0, sizeof(struct __pyx_obj_7aiohttp_12_http_parser___pyx_scope_struct_1_genexpr)); (void) PyObject_INIT(o, t); PyObject_GC_Track(o); } else { o = (*t->tp_alloc)(t, 0); if (unlikely(!o)) return 0; } return o; } static void __pyx_tp_dealloc_7aiohttp_12_http_parser___pyx_scope_struct_1_genexpr(PyObject *o) { struct __pyx_obj_7aiohttp_12_http_parser___pyx_scope_struct_1_genexpr *p = (struct __pyx_obj_7aiohttp_12_http_parser___pyx_scope_struct_1_genexpr *)o; PyObject_GC_UnTrack(o); Py_CLEAR(p->__pyx_outer_scope); Py_CLEAR(p->__pyx_v_name); Py_CLEAR(p->__pyx_v_val); if (CYTHON_COMPILING_IN_CPYTHON && ((__pyx_freecount_7aiohttp_12_http_parser___pyx_scope_struct_1_genexpr < 8) & (Py_TYPE(o)->tp_basicsize == sizeof(struct __pyx_obj_7aiohttp_12_http_parser___pyx_scope_struct_1_genexpr)))) { __pyx_freelist_7aiohttp_12_http_parser___pyx_scope_struct_1_genexpr[__pyx_freecount_7aiohttp_12_http_parser___pyx_scope_struct_1_genexpr++] = ((struct __pyx_obj_7aiohttp_12_http_parser___pyx_scope_struct_1_genexpr *)o); } else { (*Py_TYPE(o)->tp_free)(o); } } static int __pyx_tp_traverse_7aiohttp_12_http_parser___pyx_scope_struct_1_genexpr(PyObject *o, visitproc v, void *a) { int e; struct __pyx_obj_7aiohttp_12_http_parser___pyx_scope_struct_1_genexpr *p = (struct __pyx_obj_7aiohttp_12_http_parser___pyx_scope_struct_1_genexpr *)o; if (p->__pyx_outer_scope) { e = (*v)(((PyObject *)p->__pyx_outer_scope), a); if (e) return e; } if (p->__pyx_v_name) { e = (*v)(p->__pyx_v_name, a); if (e) return e; } if (p->__pyx_v_val) { e = (*v)(p->__pyx_v_val, a); if (e) return e; } return 0; } static PyTypeObject __pyx_type_7aiohttp_12_http_parser___pyx_scope_struct_1_genexpr = { PyVarObject_HEAD_INIT(0, 0) "aiohttp._http_parser.__pyx_scope_struct_1_genexpr", /*tp_name*/ sizeof(struct __pyx_obj_7aiohttp_12_http_parser___pyx_scope_struct_1_genexpr), /*tp_basicsize*/ 0, /*tp_itemsize*/ __pyx_tp_dealloc_7aiohttp_12_http_parser___pyx_scope_struct_1_genexpr, /*tp_dealloc*/ 0, /*tp_print*/ 0, /*tp_getattr*/ 0, /*tp_setattr*/ #if PY_MAJOR_VERSION < 3 0, /*tp_compare*/ #endif #if PY_MAJOR_VERSION >= 3 0, /*tp_as_async*/ #endif 0, /*tp_repr*/ 0, /*tp_as_number*/ 0, /*tp_as_sequence*/ 0, /*tp_as_mapping*/ 0, /*tp_hash*/ 0, /*tp_call*/ 0, /*tp_str*/ 0, /*tp_getattro*/ 0, /*tp_setattro*/ 0, /*tp_as_buffer*/ Py_TPFLAGS_DEFAULT|Py_TPFLAGS_HAVE_VERSION_TAG|Py_TPFLAGS_CHECKTYPES|Py_TPFLAGS_HAVE_NEWBUFFER|Py_TPFLAGS_HAVE_GC, /*tp_flags*/ 0, /*tp_doc*/ __pyx_tp_traverse_7aiohttp_12_http_parser___pyx_scope_struct_1_genexpr, /*tp_traverse*/ 0, /*tp_clear*/ 0, /*tp_richcompare*/ 0, /*tp_weaklistoffset*/ 0, /*tp_iter*/ 0, /*tp_iternext*/ 0, /*tp_methods*/ 0, /*tp_members*/ 0, /*tp_getset*/ 0, /*tp_base*/ 0, /*tp_dict*/ 0, /*tp_descr_get*/ 0, /*tp_descr_set*/ 0, /*tp_dictoffset*/ 0, 
/*tp_init*/ 0, /*tp_alloc*/ __pyx_tp_new_7aiohttp_12_http_parser___pyx_scope_struct_1_genexpr, /*tp_new*/ 0, /*tp_free*/ 0, /*tp_is_gc*/ 0, /*tp_bases*/ 0, /*tp_mro*/ 0, /*tp_cache*/ 0, /*tp_subclasses*/ 0, /*tp_weaklist*/ 0, /*tp_del*/ 0, /*tp_version_tag*/ #if PY_VERSION_HEX >= 0x030400a1 0, /*tp_finalize*/ #endif #if PY_VERSION_HEX >= 0x030800b1 0, /*tp_vectorcall*/ #endif }; static struct __pyx_obj_7aiohttp_12_http_parser___pyx_scope_struct_2___repr__ *__pyx_freelist_7aiohttp_12_http_parser___pyx_scope_struct_2___repr__[8]; static int __pyx_freecount_7aiohttp_12_http_parser___pyx_scope_struct_2___repr__ = 0; static PyObject *__pyx_tp_new_7aiohttp_12_http_parser___pyx_scope_struct_2___repr__(PyTypeObject *t, CYTHON_UNUSED PyObject *a, CYTHON_UNUSED PyObject *k) { PyObject *o; if (CYTHON_COMPILING_IN_CPYTHON && likely((__pyx_freecount_7aiohttp_12_http_parser___pyx_scope_struct_2___repr__ > 0) & (t->tp_basicsize == sizeof(struct __pyx_obj_7aiohttp_12_http_parser___pyx_scope_struct_2___repr__)))) { o = (PyObject*)__pyx_freelist_7aiohttp_12_http_parser___pyx_scope_struct_2___repr__[--__pyx_freecount_7aiohttp_12_http_parser___pyx_scope_struct_2___repr__]; memset(o, 0, sizeof(struct __pyx_obj_7aiohttp_12_http_parser___pyx_scope_struct_2___repr__)); (void) PyObject_INIT(o, t); PyObject_GC_Track(o); } else { o = (*t->tp_alloc)(t, 0); if (unlikely(!o)) return 0; } return o; } static void __pyx_tp_dealloc_7aiohttp_12_http_parser___pyx_scope_struct_2___repr__(PyObject *o) { struct __pyx_obj_7aiohttp_12_http_parser___pyx_scope_struct_2___repr__ *p = (struct __pyx_obj_7aiohttp_12_http_parser___pyx_scope_struct_2___repr__ *)o; PyObject_GC_UnTrack(o); Py_CLEAR(p->__pyx_v_info); if (CYTHON_COMPILING_IN_CPYTHON && ((__pyx_freecount_7aiohttp_12_http_parser___pyx_scope_struct_2___repr__ < 8) & (Py_TYPE(o)->tp_basicsize == sizeof(struct __pyx_obj_7aiohttp_12_http_parser___pyx_scope_struct_2___repr__)))) { __pyx_freelist_7aiohttp_12_http_parser___pyx_scope_struct_2___repr__[__pyx_freecount_7aiohttp_12_http_parser___pyx_scope_struct_2___repr__++] = ((struct __pyx_obj_7aiohttp_12_http_parser___pyx_scope_struct_2___repr__ *)o); } else { (*Py_TYPE(o)->tp_free)(o); } } static int __pyx_tp_traverse_7aiohttp_12_http_parser___pyx_scope_struct_2___repr__(PyObject *o, visitproc v, void *a) { int e; struct __pyx_obj_7aiohttp_12_http_parser___pyx_scope_struct_2___repr__ *p = (struct __pyx_obj_7aiohttp_12_http_parser___pyx_scope_struct_2___repr__ *)o; if (p->__pyx_v_info) { e = (*v)(p->__pyx_v_info, a); if (e) return e; } return 0; } static int __pyx_tp_clear_7aiohttp_12_http_parser___pyx_scope_struct_2___repr__(PyObject *o) { PyObject* tmp; struct __pyx_obj_7aiohttp_12_http_parser___pyx_scope_struct_2___repr__ *p = (struct __pyx_obj_7aiohttp_12_http_parser___pyx_scope_struct_2___repr__ *)o; tmp = ((PyObject*)p->__pyx_v_info); p->__pyx_v_info = ((PyObject*)Py_None); Py_INCREF(Py_None); Py_XDECREF(tmp); return 0; } static PyTypeObject __pyx_type_7aiohttp_12_http_parser___pyx_scope_struct_2___repr__ = { PyVarObject_HEAD_INIT(0, 0) "aiohttp._http_parser.__pyx_scope_struct_2___repr__", /*tp_name*/ sizeof(struct __pyx_obj_7aiohttp_12_http_parser___pyx_scope_struct_2___repr__), /*tp_basicsize*/ 0, /*tp_itemsize*/ __pyx_tp_dealloc_7aiohttp_12_http_parser___pyx_scope_struct_2___repr__, /*tp_dealloc*/ 0, /*tp_print*/ 0, /*tp_getattr*/ 0, /*tp_setattr*/ #if PY_MAJOR_VERSION < 3 0, /*tp_compare*/ #endif #if PY_MAJOR_VERSION >= 3 0, /*tp_as_async*/ #endif 0, /*tp_repr*/ 0, /*tp_as_number*/ 0, /*tp_as_sequence*/ 0, /*tp_as_mapping*/ 
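/* The __pyx_scope_struct_* types are internal closure scopes generated for
 * __repr__ and its genexprs; they exist only to hold the captured variables
 * (__pyx_v_info and the name/val loop targets) so the GC can track them. */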
0, /*tp_hash*/ 0, /*tp_call*/ 0, /*tp_str*/ 0, /*tp_getattro*/ 0, /*tp_setattro*/ 0, /*tp_as_buffer*/ Py_TPFLAGS_DEFAULT|Py_TPFLAGS_HAVE_VERSION_TAG|Py_TPFLAGS_CHECKTYPES|Py_TPFLAGS_HAVE_NEWBUFFER|Py_TPFLAGS_HAVE_GC, /*tp_flags*/ 0, /*tp_doc*/ __pyx_tp_traverse_7aiohttp_12_http_parser___pyx_scope_struct_2___repr__, /*tp_traverse*/ __pyx_tp_clear_7aiohttp_12_http_parser___pyx_scope_struct_2___repr__, /*tp_clear*/ 0, /*tp_richcompare*/ 0, /*tp_weaklistoffset*/ 0, /*tp_iter*/ 0, /*tp_iternext*/ 0, /*tp_methods*/ 0, /*tp_members*/ 0, /*tp_getset*/ 0, /*tp_base*/ 0, /*tp_dict*/ 0, /*tp_descr_get*/ 0, /*tp_descr_set*/ 0, /*tp_dictoffset*/ 0, /*tp_init*/ 0, /*tp_alloc*/ __pyx_tp_new_7aiohttp_12_http_parser___pyx_scope_struct_2___repr__, /*tp_new*/ 0, /*tp_free*/ 0, /*tp_is_gc*/ 0, /*tp_bases*/ 0, /*tp_mro*/ 0, /*tp_cache*/ 0, /*tp_subclasses*/ 0, /*tp_weaklist*/ 0, /*tp_del*/ 0, /*tp_version_tag*/ #if PY_VERSION_HEX >= 0x030400a1 0, /*tp_finalize*/ #endif #if PY_VERSION_HEX >= 0x030800b1 0, /*tp_vectorcall*/ #endif }; static struct __pyx_obj_7aiohttp_12_http_parser___pyx_scope_struct_3_genexpr *__pyx_freelist_7aiohttp_12_http_parser___pyx_scope_struct_3_genexpr[8]; static int __pyx_freecount_7aiohttp_12_http_parser___pyx_scope_struct_3_genexpr = 0; static PyObject *__pyx_tp_new_7aiohttp_12_http_parser___pyx_scope_struct_3_genexpr(PyTypeObject *t, CYTHON_UNUSED PyObject *a, CYTHON_UNUSED PyObject *k) { PyObject *o; if (CYTHON_COMPILING_IN_CPYTHON && likely((__pyx_freecount_7aiohttp_12_http_parser___pyx_scope_struct_3_genexpr > 0) & (t->tp_basicsize == sizeof(struct __pyx_obj_7aiohttp_12_http_parser___pyx_scope_struct_3_genexpr)))) { o = (PyObject*)__pyx_freelist_7aiohttp_12_http_parser___pyx_scope_struct_3_genexpr[--__pyx_freecount_7aiohttp_12_http_parser___pyx_scope_struct_3_genexpr]; memset(o, 0, sizeof(struct __pyx_obj_7aiohttp_12_http_parser___pyx_scope_struct_3_genexpr)); (void) PyObject_INIT(o, t); PyObject_GC_Track(o); } else { o = (*t->tp_alloc)(t, 0); if (unlikely(!o)) return 0; } return o; } static void __pyx_tp_dealloc_7aiohttp_12_http_parser___pyx_scope_struct_3_genexpr(PyObject *o) { struct __pyx_obj_7aiohttp_12_http_parser___pyx_scope_struct_3_genexpr *p = (struct __pyx_obj_7aiohttp_12_http_parser___pyx_scope_struct_3_genexpr *)o; PyObject_GC_UnTrack(o); Py_CLEAR(p->__pyx_outer_scope); Py_CLEAR(p->__pyx_v_name); Py_CLEAR(p->__pyx_v_val); if (CYTHON_COMPILING_IN_CPYTHON && ((__pyx_freecount_7aiohttp_12_http_parser___pyx_scope_struct_3_genexpr < 8) & (Py_TYPE(o)->tp_basicsize == sizeof(struct __pyx_obj_7aiohttp_12_http_parser___pyx_scope_struct_3_genexpr)))) { __pyx_freelist_7aiohttp_12_http_parser___pyx_scope_struct_3_genexpr[__pyx_freecount_7aiohttp_12_http_parser___pyx_scope_struct_3_genexpr++] = ((struct __pyx_obj_7aiohttp_12_http_parser___pyx_scope_struct_3_genexpr *)o); } else { (*Py_TYPE(o)->tp_free)(o); } } static int __pyx_tp_traverse_7aiohttp_12_http_parser___pyx_scope_struct_3_genexpr(PyObject *o, visitproc v, void *a) { int e; struct __pyx_obj_7aiohttp_12_http_parser___pyx_scope_struct_3_genexpr *p = (struct __pyx_obj_7aiohttp_12_http_parser___pyx_scope_struct_3_genexpr *)o; if (p->__pyx_outer_scope) { e = (*v)(((PyObject *)p->__pyx_outer_scope), a); if (e) return e; } if (p->__pyx_v_name) { e = (*v)(p->__pyx_v_name, a); if (e) return e; } if (p->__pyx_v_val) { e = (*v)(p->__pyx_v_val, a); if (e) return e; } return 0; } static PyTypeObject __pyx_type_7aiohttp_12_http_parser___pyx_scope_struct_3_genexpr = { PyVarObject_HEAD_INIT(0, 0) 
"aiohttp._http_parser.__pyx_scope_struct_3_genexpr", /*tp_name*/ sizeof(struct __pyx_obj_7aiohttp_12_http_parser___pyx_scope_struct_3_genexpr), /*tp_basicsize*/ 0, /*tp_itemsize*/ __pyx_tp_dealloc_7aiohttp_12_http_parser___pyx_scope_struct_3_genexpr, /*tp_dealloc*/ 0, /*tp_print*/ 0, /*tp_getattr*/ 0, /*tp_setattr*/ #if PY_MAJOR_VERSION < 3 0, /*tp_compare*/ #endif #if PY_MAJOR_VERSION >= 3 0, /*tp_as_async*/ #endif 0, /*tp_repr*/ 0, /*tp_as_number*/ 0, /*tp_as_sequence*/ 0, /*tp_as_mapping*/ 0, /*tp_hash*/ 0, /*tp_call*/ 0, /*tp_str*/ 0, /*tp_getattro*/ 0, /*tp_setattro*/ 0, /*tp_as_buffer*/ Py_TPFLAGS_DEFAULT|Py_TPFLAGS_HAVE_VERSION_TAG|Py_TPFLAGS_CHECKTYPES|Py_TPFLAGS_HAVE_NEWBUFFER|Py_TPFLAGS_HAVE_GC, /*tp_flags*/ 0, /*tp_doc*/ __pyx_tp_traverse_7aiohttp_12_http_parser___pyx_scope_struct_3_genexpr, /*tp_traverse*/ 0, /*tp_clear*/ 0, /*tp_richcompare*/ 0, /*tp_weaklistoffset*/ 0, /*tp_iter*/ 0, /*tp_iternext*/ 0, /*tp_methods*/ 0, /*tp_members*/ 0, /*tp_getset*/ 0, /*tp_base*/ 0, /*tp_dict*/ 0, /*tp_descr_get*/ 0, /*tp_descr_set*/ 0, /*tp_dictoffset*/ 0, /*tp_init*/ 0, /*tp_alloc*/ __pyx_tp_new_7aiohttp_12_http_parser___pyx_scope_struct_3_genexpr, /*tp_new*/ 0, /*tp_free*/ 0, /*tp_is_gc*/ 0, /*tp_bases*/ 0, /*tp_mro*/ 0, /*tp_cache*/ 0, /*tp_subclasses*/ 0, /*tp_weaklist*/ 0, /*tp_del*/ 0, /*tp_version_tag*/ #if PY_VERSION_HEX >= 0x030400a1 0, /*tp_finalize*/ #endif #if PY_VERSION_HEX >= 0x030800b1 0, /*tp_vectorcall*/ #endif }; static PyMethodDef __pyx_methods[] = { {0, 0, 0, 0} }; #if PY_MAJOR_VERSION >= 3 #if CYTHON_PEP489_MULTI_PHASE_INIT static PyObject* __pyx_pymod_create(PyObject *spec, PyModuleDef *def); /*proto*/ static int __pyx_pymod_exec__http_parser(PyObject* module); /*proto*/ static PyModuleDef_Slot __pyx_moduledef_slots[] = { {Py_mod_create, (void*)__pyx_pymod_create}, {Py_mod_exec, (void*)__pyx_pymod_exec__http_parser}, {0, NULL} }; #endif static struct PyModuleDef __pyx_moduledef = { PyModuleDef_HEAD_INIT, "_http_parser", 0, /* m_doc */ #if CYTHON_PEP489_MULTI_PHASE_INIT 0, /* m_size */ #else -1, /* m_size */ #endif __pyx_methods /* m_methods */, #if CYTHON_PEP489_MULTI_PHASE_INIT __pyx_moduledef_slots, /* m_slots */ #else NULL, /* m_reload */ #endif NULL, /* m_traverse */ NULL, /* m_clear */ NULL /* m_free */ }; #endif #ifndef CYTHON_SMALL_CODE #if defined(__clang__) #define CYTHON_SMALL_CODE #elif defined(__GNUC__) && (__GNUC__ > 4 || (__GNUC__ == 4 && __GNUC_MINOR__ >= 3)) #define CYTHON_SMALL_CODE __attribute__((cold)) #else #define CYTHON_SMALL_CODE #endif #endif static __Pyx_StringTabEntry __pyx_string_tab[] = { {&__pyx_kp_u_, __pyx_k_, sizeof(__pyx_k_), 0, 1, 0, 0}, {&__pyx_n_s_ACCEPT, __pyx_k_ACCEPT, sizeof(__pyx_k_ACCEPT), 0, 0, 1, 1}, {&__pyx_n_s_ACCEPT_CHARSET, __pyx_k_ACCEPT_CHARSET, sizeof(__pyx_k_ACCEPT_CHARSET), 0, 0, 1, 1}, {&__pyx_n_s_ACCEPT_ENCODING, __pyx_k_ACCEPT_ENCODING, sizeof(__pyx_k_ACCEPT_ENCODING), 0, 0, 1, 1}, {&__pyx_n_s_ACCEPT_LANGUAGE, __pyx_k_ACCEPT_LANGUAGE, sizeof(__pyx_k_ACCEPT_LANGUAGE), 0, 0, 1, 1}, {&__pyx_n_s_ACCEPT_RANGES, __pyx_k_ACCEPT_RANGES, sizeof(__pyx_k_ACCEPT_RANGES), 0, 0, 1, 1}, {&__pyx_n_s_ACCESS_CONTROL_ALLOW_CREDENTIALS, __pyx_k_ACCESS_CONTROL_ALLOW_CREDENTIALS, sizeof(__pyx_k_ACCESS_CONTROL_ALLOW_CREDENTIALS), 0, 0, 1, 1}, {&__pyx_n_s_ACCESS_CONTROL_ALLOW_HEADERS, __pyx_k_ACCESS_CONTROL_ALLOW_HEADERS, sizeof(__pyx_k_ACCESS_CONTROL_ALLOW_HEADERS), 0, 0, 1, 1}, {&__pyx_n_s_ACCESS_CONTROL_ALLOW_METHODS, __pyx_k_ACCESS_CONTROL_ALLOW_METHODS, sizeof(__pyx_k_ACCESS_CONTROL_ALLOW_METHODS), 0, 0, 1, 1}, 
{&__pyx_n_s_ACCESS_CONTROL_ALLOW_ORIGIN, __pyx_k_ACCESS_CONTROL_ALLOW_ORIGIN, sizeof(__pyx_k_ACCESS_CONTROL_ALLOW_ORIGIN), 0, 0, 1, 1}, {&__pyx_n_s_ACCESS_CONTROL_EXPOSE_HEADERS, __pyx_k_ACCESS_CONTROL_EXPOSE_HEADERS, sizeof(__pyx_k_ACCESS_CONTROL_EXPOSE_HEADERS), 0, 0, 1, 1}, {&__pyx_n_s_ACCESS_CONTROL_MAX_AGE, __pyx_k_ACCESS_CONTROL_MAX_AGE, sizeof(__pyx_k_ACCESS_CONTROL_MAX_AGE), 0, 0, 1, 1}, {&__pyx_n_s_ACCESS_CONTROL_REQUEST_HEADERS, __pyx_k_ACCESS_CONTROL_REQUEST_HEADERS, sizeof(__pyx_k_ACCESS_CONTROL_REQUEST_HEADERS), 0, 0, 1, 1}, {&__pyx_n_s_ACCESS_CONTROL_REQUEST_METHOD, __pyx_k_ACCESS_CONTROL_REQUEST_METHOD, sizeof(__pyx_k_ACCESS_CONTROL_REQUEST_METHOD), 0, 0, 1, 1}, {&__pyx_n_s_AGE, __pyx_k_AGE, sizeof(__pyx_k_AGE), 0, 0, 1, 1}, {&__pyx_n_s_ALLOW, __pyx_k_ALLOW, sizeof(__pyx_k_ALLOW), 0, 0, 1, 1}, {&__pyx_n_s_AUTHORIZATION, __pyx_k_AUTHORIZATION, sizeof(__pyx_k_AUTHORIZATION), 0, 0, 1, 1}, {&__pyx_n_s_BadHttpMessage, __pyx_k_BadHttpMessage, sizeof(__pyx_k_BadHttpMessage), 0, 0, 1, 1}, {&__pyx_n_s_BadStatusLine, __pyx_k_BadStatusLine, sizeof(__pyx_k_BadStatusLine), 0, 0, 1, 1}, {&__pyx_n_s_BaseException, __pyx_k_BaseException, sizeof(__pyx_k_BaseException), 0, 0, 1, 1}, {&__pyx_n_s_CACHE_CONTROL, __pyx_k_CACHE_CONTROL, sizeof(__pyx_k_CACHE_CONTROL), 0, 0, 1, 1}, {&__pyx_n_s_CIMultiDict, __pyx_k_CIMultiDict, sizeof(__pyx_k_CIMultiDict), 0, 0, 1, 1}, {&__pyx_n_s_CIMultiDictProxy, __pyx_k_CIMultiDictProxy, sizeof(__pyx_k_CIMultiDictProxy), 0, 0, 1, 1}, {&__pyx_n_s_CIMultiDictProxy_2, __pyx_k_CIMultiDictProxy_2, sizeof(__pyx_k_CIMultiDictProxy_2), 0, 0, 1, 1}, {&__pyx_n_s_CIMultiDict_2, __pyx_k_CIMultiDict_2, sizeof(__pyx_k_CIMultiDict_2), 0, 0, 1, 1}, {&__pyx_n_s_CONNECTION, __pyx_k_CONNECTION, sizeof(__pyx_k_CONNECTION), 0, 0, 1, 1}, {&__pyx_n_s_CONTENT_DISPOSITION, __pyx_k_CONTENT_DISPOSITION, sizeof(__pyx_k_CONTENT_DISPOSITION), 0, 0, 1, 1}, {&__pyx_n_s_CONTENT_ENCODING, __pyx_k_CONTENT_ENCODING, sizeof(__pyx_k_CONTENT_ENCODING), 0, 0, 1, 1}, {&__pyx_n_s_CONTENT_LANGUAGE, __pyx_k_CONTENT_LANGUAGE, sizeof(__pyx_k_CONTENT_LANGUAGE), 0, 0, 1, 1}, {&__pyx_n_s_CONTENT_LENGTH, __pyx_k_CONTENT_LENGTH, sizeof(__pyx_k_CONTENT_LENGTH), 0, 0, 1, 1}, {&__pyx_n_s_CONTENT_LOCATION, __pyx_k_CONTENT_LOCATION, sizeof(__pyx_k_CONTENT_LOCATION), 0, 0, 1, 1}, {&__pyx_n_s_CONTENT_MD5, __pyx_k_CONTENT_MD5, sizeof(__pyx_k_CONTENT_MD5), 0, 0, 1, 1}, {&__pyx_n_s_CONTENT_RANGE, __pyx_k_CONTENT_RANGE, sizeof(__pyx_k_CONTENT_RANGE), 0, 0, 1, 1}, {&__pyx_n_s_CONTENT_TRANSFER_ENCODING, __pyx_k_CONTENT_TRANSFER_ENCODING, sizeof(__pyx_k_CONTENT_TRANSFER_ENCODING), 0, 0, 1, 1}, {&__pyx_n_s_CONTENT_TYPE, __pyx_k_CONTENT_TYPE, sizeof(__pyx_k_CONTENT_TYPE), 0, 0, 1, 1}, {&__pyx_n_s_COOKIE, __pyx_k_COOKIE, sizeof(__pyx_k_COOKIE), 0, 0, 1, 1}, {&__pyx_n_s_ContentLengthError, __pyx_k_ContentLengthError, sizeof(__pyx_k_ContentLengthError), 0, 0, 1, 1}, {&__pyx_n_s_DATE, __pyx_k_DATE, sizeof(__pyx_k_DATE), 0, 0, 1, 1}, {&__pyx_n_s_DESTINATION, __pyx_k_DESTINATION, sizeof(__pyx_k_DESTINATION), 0, 0, 1, 1}, {&__pyx_n_s_DIGEST, __pyx_k_DIGEST, sizeof(__pyx_k_DIGEST), 0, 0, 1, 1}, {&__pyx_n_s_DeflateBuffer, __pyx_k_DeflateBuffer, sizeof(__pyx_k_DeflateBuffer), 0, 0, 1, 1}, {&__pyx_n_s_DeflateBuffer_2, __pyx_k_DeflateBuffer_2, sizeof(__pyx_k_DeflateBuffer_2), 0, 0, 1, 1}, {&__pyx_n_s_EMPTY_PAYLOAD, __pyx_k_EMPTY_PAYLOAD, sizeof(__pyx_k_EMPTY_PAYLOAD), 0, 0, 1, 1}, {&__pyx_n_s_EMPTY_PAYLOAD_2, __pyx_k_EMPTY_PAYLOAD_2, sizeof(__pyx_k_EMPTY_PAYLOAD_2), 0, 0, 1, 1}, {&__pyx_n_s_ETAG, __pyx_k_ETAG, sizeof(__pyx_k_ETAG), 0, 0, 1, 
1}, {&__pyx_n_s_EXPECT, __pyx_k_EXPECT, sizeof(__pyx_k_EXPECT), 0, 0, 1, 1}, {&__pyx_n_s_EXPIRES, __pyx_k_EXPIRES, sizeof(__pyx_k_EXPIRES), 0, 0, 1, 1}, {&__pyx_n_s_FORWARDED, __pyx_k_FORWARDED, sizeof(__pyx_k_FORWARDED), 0, 0, 1, 1}, {&__pyx_n_s_FROM, __pyx_k_FROM, sizeof(__pyx_k_FROM), 0, 0, 1, 1}, {&__pyx_n_s_HOST, __pyx_k_HOST, sizeof(__pyx_k_HOST), 0, 0, 1, 1}, {&__pyx_kp_u_Header_name_is_too_long, __pyx_k_Header_name_is_too_long, sizeof(__pyx_k_Header_name_is_too_long), 0, 1, 0, 0}, {&__pyx_kp_u_Header_value_is_too_long, __pyx_k_Header_value_is_too_long, sizeof(__pyx_k_Header_value_is_too_long), 0, 1, 0, 0}, {&__pyx_n_s_HttpRequestParser, __pyx_k_HttpRequestParser, sizeof(__pyx_k_HttpRequestParser), 0, 0, 1, 1}, {&__pyx_n_u_HttpRequestParser, __pyx_k_HttpRequestParser, sizeof(__pyx_k_HttpRequestParser), 0, 1, 0, 1}, {&__pyx_n_s_HttpResponseParser, __pyx_k_HttpResponseParser, sizeof(__pyx_k_HttpResponseParser), 0, 0, 1, 1}, {&__pyx_n_u_HttpResponseParser, __pyx_k_HttpResponseParser, sizeof(__pyx_k_HttpResponseParser), 0, 1, 0, 1}, {&__pyx_n_s_HttpVersion, __pyx_k_HttpVersion, sizeof(__pyx_k_HttpVersion), 0, 0, 1, 1}, {&__pyx_n_s_HttpVersion10, __pyx_k_HttpVersion10, sizeof(__pyx_k_HttpVersion10), 0, 0, 1, 1}, {&__pyx_n_s_HttpVersion10_2, __pyx_k_HttpVersion10_2, sizeof(__pyx_k_HttpVersion10_2), 0, 0, 1, 1}, {&__pyx_n_s_HttpVersion11, __pyx_k_HttpVersion11, sizeof(__pyx_k_HttpVersion11), 0, 0, 1, 1}, {&__pyx_n_s_HttpVersion11_2, __pyx_k_HttpVersion11_2, sizeof(__pyx_k_HttpVersion11_2), 0, 0, 1, 1}, {&__pyx_n_s_HttpVersion_2, __pyx_k_HttpVersion_2, sizeof(__pyx_k_HttpVersion_2), 0, 0, 1, 1}, {&__pyx_n_s_IF_MATCH, __pyx_k_IF_MATCH, sizeof(__pyx_k_IF_MATCH), 0, 0, 1, 1}, {&__pyx_n_s_IF_MODIFIED_SINCE, __pyx_k_IF_MODIFIED_SINCE, sizeof(__pyx_k_IF_MODIFIED_SINCE), 0, 0, 1, 1}, {&__pyx_n_s_IF_NONE_MATCH, __pyx_k_IF_NONE_MATCH, sizeof(__pyx_k_IF_NONE_MATCH), 0, 0, 1, 1}, {&__pyx_n_s_IF_RANGE, __pyx_k_IF_RANGE, sizeof(__pyx_k_IF_RANGE), 0, 0, 1, 1}, {&__pyx_n_s_IF_UNMODIFIED_SINCE, __pyx_k_IF_UNMODIFIED_SINCE, sizeof(__pyx_k_IF_UNMODIFIED_SINCE), 0, 0, 1, 1}, {&__pyx_kp_s_Incompatible_checksums_s_vs_0x14, __pyx_k_Incompatible_checksums_s_vs_0x14, sizeof(__pyx_k_Incompatible_checksums_s_vs_0x14), 0, 0, 1, 0}, {&__pyx_kp_s_Incompatible_checksums_s_vs_0xc7, __pyx_k_Incompatible_checksums_s_vs_0xc7, sizeof(__pyx_k_Incompatible_checksums_s_vs_0xc7), 0, 0, 1, 0}, {&__pyx_n_s_InvalidHeader, __pyx_k_InvalidHeader, sizeof(__pyx_k_InvalidHeader), 0, 0, 1, 1}, {&__pyx_n_s_InvalidURLError, __pyx_k_InvalidURLError, sizeof(__pyx_k_InvalidURLError), 0, 0, 1, 1}, {&__pyx_n_s_KEEP_ALIVE, __pyx_k_KEEP_ALIVE, sizeof(__pyx_k_KEEP_ALIVE), 0, 0, 1, 1}, {&__pyx_n_s_LAST_EVENT_ID, __pyx_k_LAST_EVENT_ID, sizeof(__pyx_k_LAST_EVENT_ID), 0, 0, 1, 1}, {&__pyx_n_s_LAST_MODIFIED, __pyx_k_LAST_MODIFIED, sizeof(__pyx_k_LAST_MODIFIED), 0, 0, 1, 1}, {&__pyx_n_s_LINK, __pyx_k_LINK, sizeof(__pyx_k_LINK), 0, 0, 1, 1}, {&__pyx_n_s_LOCATION, __pyx_k_LOCATION, sizeof(__pyx_k_LOCATION), 0, 0, 1, 1}, {&__pyx_n_s_LineTooLong, __pyx_k_LineTooLong, sizeof(__pyx_k_LineTooLong), 0, 0, 1, 1}, {&__pyx_n_s_MAX_FORWARDS, __pyx_k_MAX_FORWARDS, sizeof(__pyx_k_MAX_FORWARDS), 0, 0, 1, 1}, {&__pyx_n_s_MemoryError, __pyx_k_MemoryError, sizeof(__pyx_k_MemoryError), 0, 0, 1, 1}, {&__pyx_kp_u_Not_enough_data_for_satisfy_cont, __pyx_k_Not_enough_data_for_satisfy_cont, sizeof(__pyx_k_Not_enough_data_for_satisfy_cont), 0, 1, 0, 0}, {&__pyx_kp_u_Not_enough_data_for_satisfy_tran, __pyx_k_Not_enough_data_for_satisfy_tran, 
sizeof(__pyx_k_Not_enough_data_for_satisfy_tran), 0, 1, 0, 0}, {&__pyx_n_s_ORIGIN, __pyx_k_ORIGIN, sizeof(__pyx_k_ORIGIN), 0, 0, 1, 1}, {&__pyx_n_s_PRAGMA, __pyx_k_PRAGMA, sizeof(__pyx_k_PRAGMA), 0, 0, 1, 1}, {&__pyx_n_s_PROXY_AUTHENTICATE, __pyx_k_PROXY_AUTHENTICATE, sizeof(__pyx_k_PROXY_AUTHENTICATE), 0, 0, 1, 1}, {&__pyx_n_s_PROXY_AUTHORIZATION, __pyx_k_PROXY_AUTHORIZATION, sizeof(__pyx_k_PROXY_AUTHORIZATION), 0, 0, 1, 1}, {&__pyx_n_s_PayloadEncodingError, __pyx_k_PayloadEncodingError, sizeof(__pyx_k_PayloadEncodingError), 0, 0, 1, 1}, {&__pyx_n_s_PickleError, __pyx_k_PickleError, sizeof(__pyx_k_PickleError), 0, 0, 1, 1}, {&__pyx_n_s_RANGE, __pyx_k_RANGE, sizeof(__pyx_k_RANGE), 0, 0, 1, 1}, {&__pyx_n_s_REFERER, __pyx_k_REFERER, sizeof(__pyx_k_REFERER), 0, 0, 1, 1}, {&__pyx_n_s_RETRY_AFTER, __pyx_k_RETRY_AFTER, sizeof(__pyx_k_RETRY_AFTER), 0, 0, 1, 1}, {&__pyx_kp_u_RawRequestMessage, __pyx_k_RawRequestMessage, sizeof(__pyx_k_RawRequestMessage), 0, 1, 0, 0}, {&__pyx_n_s_RawRequestMessage_2, __pyx_k_RawRequestMessage_2, sizeof(__pyx_k_RawRequestMessage_2), 0, 0, 1, 1}, {&__pyx_n_u_RawRequestMessage_2, __pyx_k_RawRequestMessage_2, sizeof(__pyx_k_RawRequestMessage_2), 0, 1, 0, 1}, {&__pyx_kp_u_RawResponseMessage, __pyx_k_RawResponseMessage, sizeof(__pyx_k_RawResponseMessage), 0, 1, 0, 0}, {&__pyx_n_s_RawResponseMessage_2, __pyx_k_RawResponseMessage_2, sizeof(__pyx_k_RawResponseMessage_2), 0, 0, 1, 1}, {&__pyx_n_u_RawResponseMessage_2, __pyx_k_RawResponseMessage_2, sizeof(__pyx_k_RawResponseMessage_2), 0, 1, 0, 1}, {&__pyx_n_s_SEC_WEBSOCKET_ACCEPT, __pyx_k_SEC_WEBSOCKET_ACCEPT, sizeof(__pyx_k_SEC_WEBSOCKET_ACCEPT), 0, 0, 1, 1}, {&__pyx_n_s_SEC_WEBSOCKET_EXTENSIONS, __pyx_k_SEC_WEBSOCKET_EXTENSIONS, sizeof(__pyx_k_SEC_WEBSOCKET_EXTENSIONS), 0, 0, 1, 1}, {&__pyx_n_s_SEC_WEBSOCKET_KEY, __pyx_k_SEC_WEBSOCKET_KEY, sizeof(__pyx_k_SEC_WEBSOCKET_KEY), 0, 0, 1, 1}, {&__pyx_n_s_SEC_WEBSOCKET_KEY1, __pyx_k_SEC_WEBSOCKET_KEY1, sizeof(__pyx_k_SEC_WEBSOCKET_KEY1), 0, 0, 1, 1}, {&__pyx_n_s_SEC_WEBSOCKET_PROTOCOL, __pyx_k_SEC_WEBSOCKET_PROTOCOL, sizeof(__pyx_k_SEC_WEBSOCKET_PROTOCOL), 0, 0, 1, 1}, {&__pyx_n_s_SEC_WEBSOCKET_VERSION, __pyx_k_SEC_WEBSOCKET_VERSION, sizeof(__pyx_k_SEC_WEBSOCKET_VERSION), 0, 0, 1, 1}, {&__pyx_n_s_SERVER, __pyx_k_SERVER, sizeof(__pyx_k_SERVER), 0, 0, 1, 1}, {&__pyx_n_s_SET_COOKIE, __pyx_k_SET_COOKIE, sizeof(__pyx_k_SET_COOKIE), 0, 0, 1, 1}, {&__pyx_kp_u_Status_line_is_too_long, __pyx_k_Status_line_is_too_long, sizeof(__pyx_k_Status_line_is_too_long), 0, 1, 0, 0}, {&__pyx_n_s_StreamReader, __pyx_k_StreamReader, sizeof(__pyx_k_StreamReader), 0, 0, 1, 1}, {&__pyx_n_s_StreamReader_2, __pyx_k_StreamReader_2, sizeof(__pyx_k_StreamReader_2), 0, 0, 1, 1}, {&__pyx_n_s_TE, __pyx_k_TE, sizeof(__pyx_k_TE), 0, 0, 1, 1}, {&__pyx_n_s_TRAILER, __pyx_k_TRAILER, sizeof(__pyx_k_TRAILER), 0, 0, 1, 1}, {&__pyx_n_s_TRANSFER_ENCODING, __pyx_k_TRANSFER_ENCODING, sizeof(__pyx_k_TRANSFER_ENCODING), 0, 0, 1, 1}, {&__pyx_n_s_TransferEncodingError, __pyx_k_TransferEncodingError, sizeof(__pyx_k_TransferEncodingError), 0, 0, 1, 1}, {&__pyx_n_s_TypeError, __pyx_k_TypeError, sizeof(__pyx_k_TypeError), 0, 0, 1, 1}, {&__pyx_n_s_UPGRADE, __pyx_k_UPGRADE, sizeof(__pyx_k_UPGRADE), 0, 0, 1, 1}, {&__pyx_n_s_URI, __pyx_k_URI, sizeof(__pyx_k_URI), 0, 0, 1, 1}, {&__pyx_n_s_URL, __pyx_k_URL, sizeof(__pyx_k_URL), 0, 0, 1, 1}, {&__pyx_n_s_URL_2, __pyx_k_URL_2, sizeof(__pyx_k_URL_2), 0, 0, 1, 1}, {&__pyx_n_s_USER_AGENT, __pyx_k_USER_AGENT, sizeof(__pyx_k_USER_AGENT), 0, 0, 1, 1}, {&__pyx_n_s_VARY, __pyx_k_VARY, 
sizeof(__pyx_k_VARY), 0, 0, 1, 1}, {&__pyx_n_s_VIA, __pyx_k_VIA, sizeof(__pyx_k_VIA), 0, 0, 1, 1}, {&__pyx_n_s_WANT_DIGEST, __pyx_k_WANT_DIGEST, sizeof(__pyx_k_WANT_DIGEST), 0, 0, 1, 1}, {&__pyx_n_s_WARNING, __pyx_k_WARNING, sizeof(__pyx_k_WARNING), 0, 0, 1, 1}, {&__pyx_n_s_WEBSOCKET, __pyx_k_WEBSOCKET, sizeof(__pyx_k_WEBSOCKET), 0, 0, 1, 1}, {&__pyx_n_s_WWW_AUTHENTICATE, __pyx_k_WWW_AUTHENTICATE, sizeof(__pyx_k_WWW_AUTHENTICATE), 0, 0, 1, 1}, {&__pyx_n_s_X_FORWARDED_FOR, __pyx_k_X_FORWARDED_FOR, sizeof(__pyx_k_X_FORWARDED_FOR), 0, 0, 1, 1}, {&__pyx_n_s_X_FORWARDED_HOST, __pyx_k_X_FORWARDED_HOST, sizeof(__pyx_k_X_FORWARDED_HOST), 0, 0, 1, 1}, {&__pyx_n_s_X_FORWARDED_PROTO, __pyx_k_X_FORWARDED_PROTO, sizeof(__pyx_k_X_FORWARDED_PROTO), 0, 0, 1, 1}, {&__pyx_kp_u__11, __pyx_k__11, sizeof(__pyx_k__11), 0, 1, 0, 0}, {&__pyx_kp_u__2, __pyx_k__2, sizeof(__pyx_k__2), 0, 1, 0, 0}, {&__pyx_kp_u__3, __pyx_k__3, sizeof(__pyx_k__3), 0, 1, 0, 0}, {&__pyx_n_s__4, __pyx_k__4, sizeof(__pyx_k__4), 0, 0, 1, 1}, {&__pyx_kp_b__4, __pyx_k__4, sizeof(__pyx_k__4), 0, 0, 0, 0}, {&__pyx_kp_u__4, __pyx_k__4, sizeof(__pyx_k__4), 0, 1, 0, 0}, {&__pyx_n_s_add, __pyx_k_add, sizeof(__pyx_k_add), 0, 0, 1, 1}, {&__pyx_n_s_aiohttp, __pyx_k_aiohttp, sizeof(__pyx_k_aiohttp), 0, 0, 1, 1}, {&__pyx_n_s_aiohttp__http_parser, __pyx_k_aiohttp__http_parser, sizeof(__pyx_k_aiohttp__http_parser), 0, 0, 1, 1}, {&__pyx_kp_s_aiohttp__http_parser_pyx, __pyx_k_aiohttp__http_parser_pyx, sizeof(__pyx_k_aiohttp__http_parser_pyx), 0, 0, 1, 0}, {&__pyx_n_s_all, __pyx_k_all, sizeof(__pyx_k_all), 0, 0, 1, 1}, {&__pyx_n_s_args, __pyx_k_args, sizeof(__pyx_k_args), 0, 0, 1, 1}, {&__pyx_n_s_auto_decompress, __pyx_k_auto_decompress, sizeof(__pyx_k_auto_decompress), 0, 0, 1, 1}, {&__pyx_n_s_begin_http_chunk_receiving, __pyx_k_begin_http_chunk_receiving, sizeof(__pyx_k_begin_http_chunk_receiving), 0, 0, 1, 1}, {&__pyx_n_u_br, __pyx_k_br, sizeof(__pyx_k_br), 0, 1, 0, 1}, {&__pyx_n_s_buf_data, __pyx_k_buf_data, sizeof(__pyx_k_buf_data), 0, 0, 1, 1}, {&__pyx_n_s_build, __pyx_k_build, sizeof(__pyx_k_build), 0, 0, 1, 1}, {&__pyx_n_s_chunked, __pyx_k_chunked, sizeof(__pyx_k_chunked), 0, 0, 1, 1}, {&__pyx_n_u_chunked, __pyx_k_chunked, sizeof(__pyx_k_chunked), 0, 1, 0, 1}, {&__pyx_n_s_cline_in_traceback, __pyx_k_cline_in_traceback, sizeof(__pyx_k_cline_in_traceback), 0, 0, 1, 1}, {&__pyx_n_s_close, __pyx_k_close, sizeof(__pyx_k_close), 0, 0, 1, 1}, {&__pyx_n_s_code, __pyx_k_code, sizeof(__pyx_k_code), 0, 0, 1, 1}, {&__pyx_n_u_code, __pyx_k_code, sizeof(__pyx_k_code), 0, 1, 0, 1}, {&__pyx_n_s_compression, __pyx_k_compression, sizeof(__pyx_k_compression), 0, 0, 1, 1}, {&__pyx_n_u_compression, __pyx_k_compression, sizeof(__pyx_k_compression), 0, 1, 0, 1}, {&__pyx_n_u_deflate, __pyx_k_deflate, sizeof(__pyx_k_deflate), 0, 1, 0, 1}, {&__pyx_n_s_dict, __pyx_k_dict, sizeof(__pyx_k_dict), 0, 0, 1, 1}, {&__pyx_n_s_end_http_chunk_receiving, __pyx_k_end_http_chunk_receiving, sizeof(__pyx_k_end_http_chunk_receiving), 0, 0, 1, 1}, {&__pyx_n_s_feed_data, __pyx_k_feed_data, sizeof(__pyx_k_feed_data), 0, 0, 1, 1}, {&__pyx_n_s_feed_eof, __pyx_k_feed_eof, sizeof(__pyx_k_feed_eof), 0, 0, 1, 1}, {&__pyx_n_s_format, __pyx_k_format, sizeof(__pyx_k_format), 0, 0, 1, 1}, {&__pyx_n_s_fragment, __pyx_k_fragment, sizeof(__pyx_k_fragment), 0, 0, 1, 1}, {&__pyx_n_s_genexpr, __pyx_k_genexpr, sizeof(__pyx_k_genexpr), 0, 0, 1, 1}, {&__pyx_n_s_getstate, __pyx_k_getstate, sizeof(__pyx_k_getstate), 0, 0, 1, 1}, {&__pyx_n_u_gzip, __pyx_k_gzip, sizeof(__pyx_k_gzip), 0, 1, 0, 1}, {&__pyx_n_s_hdrs, 
__pyx_k_hdrs, sizeof(__pyx_k_hdrs), 0, 0, 1, 1}, {&__pyx_n_s_headers, __pyx_k_headers, sizeof(__pyx_k_headers), 0, 0, 1, 1}, {&__pyx_n_u_headers, __pyx_k_headers, sizeof(__pyx_k_headers), 0, 1, 0, 1}, {&__pyx_n_s_host, __pyx_k_host, sizeof(__pyx_k_host), 0, 0, 1, 1}, {&__pyx_n_s_http_exceptions, __pyx_k_http_exceptions, sizeof(__pyx_k_http_exceptions), 0, 0, 1, 1}, {&__pyx_n_s_http_parser, __pyx_k_http_parser, sizeof(__pyx_k_http_parser), 0, 0, 1, 1}, {&__pyx_n_s_http_writer, __pyx_k_http_writer, sizeof(__pyx_k_http_writer), 0, 0, 1, 1}, {&__pyx_n_s_i, __pyx_k_i, sizeof(__pyx_k_i), 0, 0, 1, 1}, {&__pyx_n_s_import, __pyx_k_import, sizeof(__pyx_k_import), 0, 0, 1, 1}, {&__pyx_kp_u_invalid_url_r, __pyx_k_invalid_url_r, sizeof(__pyx_k_invalid_url_r), 0, 1, 0, 0}, {&__pyx_n_s_loop, __pyx_k_loop, sizeof(__pyx_k_loop), 0, 0, 1, 1}, {&__pyx_n_s_lower, __pyx_k_lower, sizeof(__pyx_k_lower), 0, 0, 1, 1}, {&__pyx_n_s_main, __pyx_k_main, sizeof(__pyx_k_main), 0, 0, 1, 1}, {&__pyx_n_s_max_field_size, __pyx_k_max_field_size, sizeof(__pyx_k_max_field_size), 0, 0, 1, 1}, {&__pyx_n_s_max_headers, __pyx_k_max_headers, sizeof(__pyx_k_max_headers), 0, 0, 1, 1}, {&__pyx_n_s_max_line_size, __pyx_k_max_line_size, sizeof(__pyx_k_max_line_size), 0, 0, 1, 1}, {&__pyx_n_s_method, __pyx_k_method, sizeof(__pyx_k_method), 0, 0, 1, 1}, {&__pyx_n_u_method, __pyx_k_method, sizeof(__pyx_k_method), 0, 1, 0, 1}, {&__pyx_n_s_multidict, __pyx_k_multidict, sizeof(__pyx_k_multidict), 0, 0, 1, 1}, {&__pyx_n_s_name, __pyx_k_name, sizeof(__pyx_k_name), 0, 0, 1, 1}, {&__pyx_n_s_new, __pyx_k_new, sizeof(__pyx_k_new), 0, 0, 1, 1}, {&__pyx_kp_s_no_default___reduce___due_to_non, __pyx_k_no_default___reduce___due_to_non, sizeof(__pyx_k_no_default___reduce___due_to_non), 0, 0, 1, 0}, {&__pyx_n_s_parse_url, __pyx_k_parse_url, sizeof(__pyx_k_parse_url), 0, 0, 1, 1}, {&__pyx_n_s_partition, __pyx_k_partition, sizeof(__pyx_k_partition), 0, 0, 1, 1}, {&__pyx_n_s_password, __pyx_k_password, sizeof(__pyx_k_password), 0, 0, 1, 1}, {&__pyx_n_s_path, __pyx_k_path, sizeof(__pyx_k_path), 0, 0, 1, 1}, {&__pyx_n_u_path, __pyx_k_path, sizeof(__pyx_k_path), 0, 1, 0, 1}, {&__pyx_n_s_payload_exception, __pyx_k_payload_exception, sizeof(__pyx_k_payload_exception), 0, 0, 1, 1}, {&__pyx_n_s_pickle, __pyx_k_pickle, sizeof(__pyx_k_pickle), 0, 0, 1, 1}, {&__pyx_n_s_port, __pyx_k_port, sizeof(__pyx_k_port), 0, 0, 1, 1}, {&__pyx_n_s_protocol, __pyx_k_protocol, sizeof(__pyx_k_protocol), 0, 0, 1, 1}, {&__pyx_n_s_py_buf, __pyx_k_py_buf, sizeof(__pyx_k_py_buf), 0, 0, 1, 1}, {&__pyx_n_s_pyx_PickleError, __pyx_k_pyx_PickleError, sizeof(__pyx_k_pyx_PickleError), 0, 0, 1, 1}, {&__pyx_n_s_pyx_checksum, __pyx_k_pyx_checksum, sizeof(__pyx_k_pyx_checksum), 0, 0, 1, 1}, {&__pyx_n_s_pyx_result, __pyx_k_pyx_result, sizeof(__pyx_k_pyx_result), 0, 0, 1, 1}, {&__pyx_n_s_pyx_state, __pyx_k_pyx_state, sizeof(__pyx_k_pyx_state), 0, 0, 1, 1}, {&__pyx_n_s_pyx_type, __pyx_k_pyx_type, sizeof(__pyx_k_pyx_type), 0, 0, 1, 1}, {&__pyx_n_s_pyx_unpickle_RawRequestMessage, __pyx_k_pyx_unpickle_RawRequestMessage, sizeof(__pyx_k_pyx_unpickle_RawRequestMessage), 0, 0, 1, 1}, {&__pyx_n_s_pyx_unpickle_RawResponseMessag, __pyx_k_pyx_unpickle_RawResponseMessag, sizeof(__pyx_k_pyx_unpickle_RawResponseMessag), 0, 0, 1, 1}, {&__pyx_n_s_pyx_vtable, __pyx_k_pyx_vtable, sizeof(__pyx_k_pyx_vtable), 0, 0, 1, 1}, {&__pyx_n_s_query, __pyx_k_query, sizeof(__pyx_k_query), 0, 0, 1, 1}, {&__pyx_n_s_range, __pyx_k_range, sizeof(__pyx_k_range), 0, 0, 1, 1}, {&__pyx_n_s_raw_headers, __pyx_k_raw_headers, 
sizeof(__pyx_k_raw_headers), 0, 0, 1, 1}, {&__pyx_n_u_raw_headers, __pyx_k_raw_headers, sizeof(__pyx_k_raw_headers), 0, 1, 0, 1}, {&__pyx_n_s_read_until_eof, __pyx_k_read_until_eof, sizeof(__pyx_k_read_until_eof), 0, 0, 1, 1}, {&__pyx_n_s_reason, __pyx_k_reason, sizeof(__pyx_k_reason), 0, 0, 1, 1}, {&__pyx_n_u_reason, __pyx_k_reason, sizeof(__pyx_k_reason), 0, 1, 0, 1}, {&__pyx_n_s_reduce, __pyx_k_reduce, sizeof(__pyx_k_reduce), 0, 0, 1, 1}, {&__pyx_n_s_reduce_cython, __pyx_k_reduce_cython, sizeof(__pyx_k_reduce_cython), 0, 0, 1, 1}, {&__pyx_n_s_reduce_ex, __pyx_k_reduce_ex, sizeof(__pyx_k_reduce_ex), 0, 0, 1, 1}, {&__pyx_n_s_repr___locals_genexpr, __pyx_k_repr___locals_genexpr, sizeof(__pyx_k_repr___locals_genexpr), 0, 0, 1, 1}, {&__pyx_n_s_response_with_body, __pyx_k_response_with_body, sizeof(__pyx_k_response_with_body), 0, 0, 1, 1}, {&__pyx_n_s_scheme, __pyx_k_scheme, sizeof(__pyx_k_scheme), 0, 0, 1, 1}, {&__pyx_n_s_send, __pyx_k_send, sizeof(__pyx_k_send), 0, 0, 1, 1}, {&__pyx_n_s_set_exception, __pyx_k_set_exception, sizeof(__pyx_k_set_exception), 0, 0, 1, 1}, {&__pyx_n_s_setstate, __pyx_k_setstate, sizeof(__pyx_k_setstate), 0, 0, 1, 1}, {&__pyx_n_s_setstate_cython, __pyx_k_setstate_cython, sizeof(__pyx_k_setstate_cython), 0, 0, 1, 1}, {&__pyx_n_s_should_close, __pyx_k_should_close, sizeof(__pyx_k_should_close), 0, 0, 1, 1}, {&__pyx_n_u_should_close, __pyx_k_should_close, sizeof(__pyx_k_should_close), 0, 1, 0, 1}, {&__pyx_n_s_streams, __pyx_k_streams, sizeof(__pyx_k_streams), 0, 0, 1, 1}, {&__pyx_kp_s_stringsource, __pyx_k_stringsource, sizeof(__pyx_k_stringsource), 0, 0, 1, 0}, {&__pyx_n_s_test, __pyx_k_test, sizeof(__pyx_k_test), 0, 0, 1, 1}, {&__pyx_n_s_throw, __pyx_k_throw, sizeof(__pyx_k_throw), 0, 0, 1, 1}, {&__pyx_n_s_timer, __pyx_k_timer, sizeof(__pyx_k_timer), 0, 0, 1, 1}, {&__pyx_kp_u_unknown, __pyx_k_unknown, sizeof(__pyx_k_unknown), 0, 1, 0, 0}, {&__pyx_n_s_update, __pyx_k_update, sizeof(__pyx_k_update), 0, 0, 1, 1}, {&__pyx_n_s_upgrade, __pyx_k_upgrade, sizeof(__pyx_k_upgrade), 0, 0, 1, 1}, {&__pyx_n_u_upgrade, __pyx_k_upgrade, sizeof(__pyx_k_upgrade), 0, 1, 0, 1}, {&__pyx_n_s_url, __pyx_k_url, sizeof(__pyx_k_url), 0, 0, 1, 1}, {&__pyx_n_u_url, __pyx_k_url, sizeof(__pyx_k_url), 0, 1, 0, 1}, {&__pyx_n_s_user, __pyx_k_user, sizeof(__pyx_k_user), 0, 0, 1, 1}, {&__pyx_n_s_version, __pyx_k_version, sizeof(__pyx_k_version), 0, 0, 1, 1}, {&__pyx_n_u_version, __pyx_k_version, sizeof(__pyx_k_version), 0, 1, 0, 1}, {&__pyx_n_s_yarl, __pyx_k_yarl, sizeof(__pyx_k_yarl), 0, 0, 1, 1}, {0, 0, 0, 0, 0, 0, 0} }; static CYTHON_SMALL_CODE int __Pyx_InitCachedBuiltins(void) { __pyx_builtin_range = __Pyx_GetBuiltinName(__pyx_n_s_range); if (!__pyx_builtin_range) __PYX_ERR(0, 70, __pyx_L1_error) __pyx_builtin_MemoryError = __Pyx_GetBuiltinName(__pyx_n_s_MemoryError); if (!__pyx_builtin_MemoryError) __PYX_ERR(0, 297, __pyx_L1_error) __pyx_builtin_TypeError = __Pyx_GetBuiltinName(__pyx_n_s_TypeError); if (!__pyx_builtin_TypeError) __PYX_ERR(1, 2, __pyx_L1_error) __pyx_builtin_BaseException = __Pyx_GetBuiltinName(__pyx_n_s_BaseException); if (!__pyx_builtin_BaseException) __PYX_ERR(0, 602, __pyx_L1_error) return 0; __pyx_L1_error:; return -1; } static CYTHON_SMALL_CODE int __Pyx_InitCachedConstants(void) { __Pyx_RefNannyDeclarations __Pyx_RefNannySetupContext("__Pyx_InitCachedConstants", 0); /* "(tree fragment)":2 * def __reduce_cython__(self): * raise TypeError("no default __reduce__ due to non-trivial __cinit__") # <<<<<<<<<<<<<< * def __setstate_cython__(self, __pyx_state): * raise 
TypeError("no default __reduce__ due to non-trivial __cinit__") */ __pyx_tuple__5 = PyTuple_Pack(1, __pyx_kp_s_no_default___reduce___due_to_non); if (unlikely(!__pyx_tuple__5)) __PYX_ERR(1, 2, __pyx_L1_error) __Pyx_GOTREF(__pyx_tuple__5); __Pyx_GIVEREF(__pyx_tuple__5); /* "(tree fragment)":4 * raise TypeError("no default __reduce__ due to non-trivial __cinit__") * def __setstate_cython__(self, __pyx_state): * raise TypeError("no default __reduce__ due to non-trivial __cinit__") # <<<<<<<<<<<<<< */ __pyx_tuple__6 = PyTuple_Pack(1, __pyx_kp_s_no_default___reduce___due_to_non); if (unlikely(!__pyx_tuple__6)) __PYX_ERR(1, 4, __pyx_L1_error) __Pyx_GOTREF(__pyx_tuple__6); __Pyx_GIVEREF(__pyx_tuple__6); /* "(tree fragment)":2 * def __reduce_cython__(self): * raise TypeError("no default __reduce__ due to non-trivial __cinit__") # <<<<<<<<<<<<<< * def __setstate_cython__(self, __pyx_state): * raise TypeError("no default __reduce__ due to non-trivial __cinit__") */ __pyx_tuple__7 = PyTuple_Pack(1, __pyx_kp_s_no_default___reduce___due_to_non); if (unlikely(!__pyx_tuple__7)) __PYX_ERR(1, 2, __pyx_L1_error) __Pyx_GOTREF(__pyx_tuple__7); __Pyx_GIVEREF(__pyx_tuple__7); /* "(tree fragment)":4 * raise TypeError("no default __reduce__ due to non-trivial __cinit__") * def __setstate_cython__(self, __pyx_state): * raise TypeError("no default __reduce__ due to non-trivial __cinit__") # <<<<<<<<<<<<<< */ __pyx_tuple__8 = PyTuple_Pack(1, __pyx_kp_s_no_default___reduce___due_to_non); if (unlikely(!__pyx_tuple__8)) __PYX_ERR(1, 4, __pyx_L1_error) __Pyx_GOTREF(__pyx_tuple__8); __Pyx_GIVEREF(__pyx_tuple__8); /* "(tree fragment)":2 * def __reduce_cython__(self): * raise TypeError("no default __reduce__ due to non-trivial __cinit__") # <<<<<<<<<<<<<< * def __setstate_cython__(self, __pyx_state): * raise TypeError("no default __reduce__ due to non-trivial __cinit__") */ __pyx_tuple__9 = PyTuple_Pack(1, __pyx_kp_s_no_default___reduce___due_to_non); if (unlikely(!__pyx_tuple__9)) __PYX_ERR(1, 2, __pyx_L1_error) __Pyx_GOTREF(__pyx_tuple__9); __Pyx_GIVEREF(__pyx_tuple__9); /* "(tree fragment)":4 * raise TypeError("no default __reduce__ due to non-trivial __cinit__") * def __setstate_cython__(self, __pyx_state): * raise TypeError("no default __reduce__ due to non-trivial __cinit__") # <<<<<<<<<<<<<< */ __pyx_tuple__10 = PyTuple_Pack(1, __pyx_kp_s_no_default___reduce___due_to_non); if (unlikely(!__pyx_tuple__10)) __PYX_ERR(1, 4, __pyx_L1_error) __Pyx_GOTREF(__pyx_tuple__10); __Pyx_GIVEREF(__pyx_tuple__10); /* "aiohttp/_http_parser.pyx":40 * char* PyByteArray_AsString(object) * * __all__ = ('HttpRequestParser', 'HttpResponseParser', # <<<<<<<<<<<<<< * 'RawRequestMessage', 'RawResponseMessage') * */ __pyx_tuple__12 = PyTuple_Pack(4, __pyx_n_u_HttpRequestParser, __pyx_n_u_HttpResponseParser, __pyx_n_u_RawRequestMessage_2, __pyx_n_u_RawResponseMessage_2); if (unlikely(!__pyx_tuple__12)) __PYX_ERR(0, 40, __pyx_L1_error) __Pyx_GOTREF(__pyx_tuple__12); __Pyx_GIVEREF(__pyx_tuple__12); /* "aiohttp/_http_parser.pyx":756 * * * def parse_url(url): # <<<<<<<<<<<<<< * cdef: * Py_buffer py_buf */ __pyx_tuple__13 = PyTuple_Pack(3, __pyx_n_s_url, __pyx_n_s_py_buf, __pyx_n_s_buf_data); if (unlikely(!__pyx_tuple__13)) __PYX_ERR(0, 756, __pyx_L1_error) __Pyx_GOTREF(__pyx_tuple__13); __Pyx_GIVEREF(__pyx_tuple__13); __pyx_codeobj__14 = (PyObject*)__Pyx_PyCode_New(1, 0, 3, 0, CO_OPTIMIZED|CO_NEWLOCALS, __pyx_empty_bytes, __pyx_empty_tuple, __pyx_empty_tuple, __pyx_tuple__13, __pyx_empty_tuple, __pyx_empty_tuple, 
__pyx_kp_s_aiohttp__http_parser_pyx, __pyx_n_s_parse_url, 756, __pyx_empty_bytes); if (unlikely(!__pyx_codeobj__14)) __PYX_ERR(0, 756, __pyx_L1_error) /* "(tree fragment)":1 * def __pyx_unpickle_RawRequestMessage(__pyx_type, long __pyx_checksum, __pyx_state): # <<<<<<<<<<<<<< * cdef object __pyx_PickleError * cdef object __pyx_result */ __pyx_tuple__15 = PyTuple_Pack(5, __pyx_n_s_pyx_type, __pyx_n_s_pyx_checksum, __pyx_n_s_pyx_state, __pyx_n_s_pyx_PickleError, __pyx_n_s_pyx_result); if (unlikely(!__pyx_tuple__15)) __PYX_ERR(1, 1, __pyx_L1_error) __Pyx_GOTREF(__pyx_tuple__15); __Pyx_GIVEREF(__pyx_tuple__15); __pyx_codeobj__16 = (PyObject*)__Pyx_PyCode_New(3, 0, 5, 0, CO_OPTIMIZED|CO_NEWLOCALS, __pyx_empty_bytes, __pyx_empty_tuple, __pyx_empty_tuple, __pyx_tuple__15, __pyx_empty_tuple, __pyx_empty_tuple, __pyx_kp_s_stringsource, __pyx_n_s_pyx_unpickle_RawRequestMessage, 1, __pyx_empty_bytes); if (unlikely(!__pyx_codeobj__16)) __PYX_ERR(1, 1, __pyx_L1_error) __pyx_tuple__17 = PyTuple_Pack(5, __pyx_n_s_pyx_type, __pyx_n_s_pyx_checksum, __pyx_n_s_pyx_state, __pyx_n_s_pyx_PickleError, __pyx_n_s_pyx_result); if (unlikely(!__pyx_tuple__17)) __PYX_ERR(1, 1, __pyx_L1_error) __Pyx_GOTREF(__pyx_tuple__17); __Pyx_GIVEREF(__pyx_tuple__17); __pyx_codeobj__18 = (PyObject*)__Pyx_PyCode_New(3, 0, 5, 0, CO_OPTIMIZED|CO_NEWLOCALS, __pyx_empty_bytes, __pyx_empty_tuple, __pyx_empty_tuple, __pyx_tuple__17, __pyx_empty_tuple, __pyx_empty_tuple, __pyx_kp_s_stringsource, __pyx_n_s_pyx_unpickle_RawResponseMessag, 1, __pyx_empty_bytes); if (unlikely(!__pyx_codeobj__18)) __PYX_ERR(1, 1, __pyx_L1_error) __Pyx_RefNannyFinishContext(); return 0; __pyx_L1_error:; __Pyx_RefNannyFinishContext(); return -1; } static CYTHON_SMALL_CODE int __Pyx_InitGlobals(void) { __pyx_umethod_PyUnicode_Type_partition.type = (PyObject*)&PyUnicode_Type; if (__Pyx_InitStrings(__pyx_string_tab) < 0) __PYX_ERR(0, 1, __pyx_L1_error); __pyx_int_21004882 = PyInt_FromLong(21004882L); if (unlikely(!__pyx_int_21004882)) __PYX_ERR(0, 1, __pyx_L1_error) __pyx_int_209127132 = PyInt_FromLong(209127132L); if (unlikely(!__pyx_int_209127132)) __PYX_ERR(0, 1, __pyx_L1_error) return 0; __pyx_L1_error:; return -1; } static CYTHON_SMALL_CODE int __Pyx_modinit_global_init_code(void); /*proto*/ static CYTHON_SMALL_CODE int __Pyx_modinit_variable_export_code(void); /*proto*/ static CYTHON_SMALL_CODE int __Pyx_modinit_function_export_code(void); /*proto*/ static CYTHON_SMALL_CODE int __Pyx_modinit_type_init_code(void); /*proto*/ static CYTHON_SMALL_CODE int __Pyx_modinit_type_import_code(void); /*proto*/ static CYTHON_SMALL_CODE int __Pyx_modinit_variable_import_code(void); /*proto*/ static CYTHON_SMALL_CODE int __Pyx_modinit_function_import_code(void); /*proto*/ static int __Pyx_modinit_global_init_code(void) { __Pyx_RefNannyDeclarations __Pyx_RefNannySetupContext("__Pyx_modinit_global_init_code", 0); /*--- Global init code ---*/ __pyx_v_7aiohttp_12_http_parser_headers = ((PyObject*)Py_None); Py_INCREF(Py_None); __pyx_v_7aiohttp_12_http_parser_URL = Py_None; Py_INCREF(Py_None); __pyx_v_7aiohttp_12_http_parser_URL_build = Py_None; Py_INCREF(Py_None); __pyx_v_7aiohttp_12_http_parser_CIMultiDict = Py_None; Py_INCREF(Py_None); __pyx_v_7aiohttp_12_http_parser_CIMultiDictProxy = Py_None; Py_INCREF(Py_None); __pyx_v_7aiohttp_12_http_parser_HttpVersion = Py_None; Py_INCREF(Py_None); __pyx_v_7aiohttp_12_http_parser_HttpVersion10 = Py_None; Py_INCREF(Py_None); __pyx_v_7aiohttp_12_http_parser_HttpVersion11 = Py_None; Py_INCREF(Py_None); 
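/* Descriptive note (editorial comment, behaviour unchanged): each module-level C global
 * used by the parser (headers, URL, URL_build, CIMultiDict, CIMultiDictProxy, HttpVersion*,
 * StreamReader, DeflateBuffer, ...) is seeded in __Pyx_modinit_global_init_code() with an
 * owned reference to Py_None; the module execution code further down in this file rebinds
 * these globals to the real objects imported from multidict, yarl and aiohttp at import time. */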
__pyx_v_7aiohttp_12_http_parser_SEC_WEBSOCKET_KEY1 = Py_None; Py_INCREF(Py_None); __pyx_v_7aiohttp_12_http_parser_CONTENT_ENCODING = Py_None; Py_INCREF(Py_None); __pyx_v_7aiohttp_12_http_parser_EMPTY_PAYLOAD = Py_None; Py_INCREF(Py_None); __pyx_v_7aiohttp_12_http_parser_StreamReader = Py_None; Py_INCREF(Py_None); __pyx_v_7aiohttp_12_http_parser_DeflateBuffer = Py_None; Py_INCREF(Py_None); __pyx_v_7aiohttp_12_http_parser__http_method = ((PyObject*)Py_None); Py_INCREF(Py_None); __Pyx_RefNannyFinishContext(); return 0; } static int __Pyx_modinit_variable_export_code(void) { __Pyx_RefNannyDeclarations __Pyx_RefNannySetupContext("__Pyx_modinit_variable_export_code", 0); /*--- Variable export code ---*/ __Pyx_RefNannyFinishContext(); return 0; } static int __Pyx_modinit_function_export_code(void) { __Pyx_RefNannyDeclarations __Pyx_RefNannySetupContext("__Pyx_modinit_function_export_code", 0); /*--- Function export code ---*/ __Pyx_RefNannyFinishContext(); return 0; } static int __Pyx_modinit_type_init_code(void) { __Pyx_RefNannyDeclarations __Pyx_RefNannySetupContext("__Pyx_modinit_type_init_code", 0); /*--- Type init code ---*/ if (PyType_Ready(&__pyx_type_7aiohttp_12_http_parser_RawRequestMessage) < 0) __PYX_ERR(0, 93, __pyx_L1_error) #if PY_VERSION_HEX < 0x030800B1 __pyx_type_7aiohttp_12_http_parser_RawRequestMessage.tp_print = 0; #endif if ((CYTHON_USE_TYPE_SLOTS && CYTHON_USE_PYTYPE_LOOKUP) && likely(!__pyx_type_7aiohttp_12_http_parser_RawRequestMessage.tp_dictoffset && __pyx_type_7aiohttp_12_http_parser_RawRequestMessage.tp_getattro == PyObject_GenericGetAttr)) { __pyx_type_7aiohttp_12_http_parser_RawRequestMessage.tp_getattro = __Pyx_PyObject_GenericGetAttr; } if (PyObject_SetAttr(__pyx_m, __pyx_n_s_RawRequestMessage_2, (PyObject *)&__pyx_type_7aiohttp_12_http_parser_RawRequestMessage) < 0) __PYX_ERR(0, 93, __pyx_L1_error) if (__Pyx_setup_reduce((PyObject*)&__pyx_type_7aiohttp_12_http_parser_RawRequestMessage) < 0) __PYX_ERR(0, 93, __pyx_L1_error) __pyx_ptype_7aiohttp_12_http_parser_RawRequestMessage = &__pyx_type_7aiohttp_12_http_parser_RawRequestMessage; if (PyType_Ready(&__pyx_type_7aiohttp_12_http_parser_RawResponseMessage) < 0) __PYX_ERR(0, 193, __pyx_L1_error) #if PY_VERSION_HEX < 0x030800B1 __pyx_type_7aiohttp_12_http_parser_RawResponseMessage.tp_print = 0; #endif if ((CYTHON_USE_TYPE_SLOTS && CYTHON_USE_PYTYPE_LOOKUP) && likely(!__pyx_type_7aiohttp_12_http_parser_RawResponseMessage.tp_dictoffset && __pyx_type_7aiohttp_12_http_parser_RawResponseMessage.tp_getattro == PyObject_GenericGetAttr)) { __pyx_type_7aiohttp_12_http_parser_RawResponseMessage.tp_getattro = __Pyx_PyObject_GenericGetAttr; } if (PyObject_SetAttr(__pyx_m, __pyx_n_s_RawResponseMessage_2, (PyObject *)&__pyx_type_7aiohttp_12_http_parser_RawResponseMessage) < 0) __PYX_ERR(0, 193, __pyx_L1_error) if (__Pyx_setup_reduce((PyObject*)&__pyx_type_7aiohttp_12_http_parser_RawResponseMessage) < 0) __PYX_ERR(0, 193, __pyx_L1_error) __pyx_ptype_7aiohttp_12_http_parser_RawResponseMessage = &__pyx_type_7aiohttp_12_http_parser_RawResponseMessage; __pyx_vtabptr_7aiohttp_12_http_parser_HttpParser = &__pyx_vtable_7aiohttp_12_http_parser_HttpParser; __pyx_vtable_7aiohttp_12_http_parser_HttpParser._init = (PyObject *(*)(struct __pyx_obj_7aiohttp_12_http_parser_HttpParser *, enum http_parser_type, PyObject *, PyObject *, struct __pyx_opt_args_7aiohttp_12_http_parser_10HttpParser__init *__pyx_optional_args))__pyx_f_7aiohttp_12_http_parser_10HttpParser__init; __pyx_vtable_7aiohttp_12_http_parser_HttpParser._process_header = (PyObject 
*(*)(struct __pyx_obj_7aiohttp_12_http_parser_HttpParser *))__pyx_f_7aiohttp_12_http_parser_10HttpParser__process_header; __pyx_vtable_7aiohttp_12_http_parser_HttpParser._on_header_field = (PyObject *(*)(struct __pyx_obj_7aiohttp_12_http_parser_HttpParser *, char *, size_t))__pyx_f_7aiohttp_12_http_parser_10HttpParser__on_header_field; __pyx_vtable_7aiohttp_12_http_parser_HttpParser._on_header_value = (PyObject *(*)(struct __pyx_obj_7aiohttp_12_http_parser_HttpParser *, char *, size_t))__pyx_f_7aiohttp_12_http_parser_10HttpParser__on_header_value; __pyx_vtable_7aiohttp_12_http_parser_HttpParser._on_headers_complete = (PyObject *(*)(struct __pyx_obj_7aiohttp_12_http_parser_HttpParser *))__pyx_f_7aiohttp_12_http_parser_10HttpParser__on_headers_complete; __pyx_vtable_7aiohttp_12_http_parser_HttpParser._on_message_complete = (PyObject *(*)(struct __pyx_obj_7aiohttp_12_http_parser_HttpParser *))__pyx_f_7aiohttp_12_http_parser_10HttpParser__on_message_complete; __pyx_vtable_7aiohttp_12_http_parser_HttpParser._on_chunk_header = (PyObject *(*)(struct __pyx_obj_7aiohttp_12_http_parser_HttpParser *))__pyx_f_7aiohttp_12_http_parser_10HttpParser__on_chunk_header; __pyx_vtable_7aiohttp_12_http_parser_HttpParser._on_chunk_complete = (PyObject *(*)(struct __pyx_obj_7aiohttp_12_http_parser_HttpParser *))__pyx_f_7aiohttp_12_http_parser_10HttpParser__on_chunk_complete; __pyx_vtable_7aiohttp_12_http_parser_HttpParser._on_status_complete = (PyObject *(*)(struct __pyx_obj_7aiohttp_12_http_parser_HttpParser *))__pyx_f_7aiohttp_12_http_parser_10HttpParser__on_status_complete; __pyx_vtable_7aiohttp_12_http_parser_HttpParser.http_version = (PyObject *(*)(struct __pyx_obj_7aiohttp_12_http_parser_HttpParser *))__pyx_f_7aiohttp_12_http_parser_10HttpParser_http_version; if (PyType_Ready(&__pyx_type_7aiohttp_12_http_parser_HttpParser) < 0) __PYX_ERR(0, 255, __pyx_L1_error) #if PY_VERSION_HEX < 0x030800B1 __pyx_type_7aiohttp_12_http_parser_HttpParser.tp_print = 0; #endif if ((CYTHON_USE_TYPE_SLOTS && CYTHON_USE_PYTYPE_LOOKUP) && likely(!__pyx_type_7aiohttp_12_http_parser_HttpParser.tp_dictoffset && __pyx_type_7aiohttp_12_http_parser_HttpParser.tp_getattro == PyObject_GenericGetAttr)) { __pyx_type_7aiohttp_12_http_parser_HttpParser.tp_getattro = __Pyx_PyObject_GenericGetAttr; } if (__Pyx_SetVtable(__pyx_type_7aiohttp_12_http_parser_HttpParser.tp_dict, __pyx_vtabptr_7aiohttp_12_http_parser_HttpParser) < 0) __PYX_ERR(0, 255, __pyx_L1_error) if (__Pyx_setup_reduce((PyObject*)&__pyx_type_7aiohttp_12_http_parser_HttpParser) < 0) __PYX_ERR(0, 255, __pyx_L1_error) __pyx_ptype_7aiohttp_12_http_parser_HttpParser = &__pyx_type_7aiohttp_12_http_parser_HttpParser; __pyx_vtabptr_7aiohttp_12_http_parser_HttpRequestParser = &__pyx_vtable_7aiohttp_12_http_parser_HttpRequestParser; __pyx_vtable_7aiohttp_12_http_parser_HttpRequestParser.__pyx_base = *__pyx_vtabptr_7aiohttp_12_http_parser_HttpParser; __pyx_vtable_7aiohttp_12_http_parser_HttpRequestParser.__pyx_base._on_status_complete = (PyObject *(*)(struct __pyx_obj_7aiohttp_12_http_parser_HttpParser *))__pyx_f_7aiohttp_12_http_parser_17HttpRequestParser__on_status_complete; __pyx_type_7aiohttp_12_http_parser_HttpRequestParser.tp_base = __pyx_ptype_7aiohttp_12_http_parser_HttpParser; if (PyType_Ready(&__pyx_type_7aiohttp_12_http_parser_HttpRequestParser) < 0) __PYX_ERR(0, 537, __pyx_L1_error) #if PY_VERSION_HEX < 0x030800B1 __pyx_type_7aiohttp_12_http_parser_HttpRequestParser.tp_print = 0; #endif if ((CYTHON_USE_TYPE_SLOTS && CYTHON_USE_PYTYPE_LOOKUP) && 
likely(!__pyx_type_7aiohttp_12_http_parser_HttpRequestParser.tp_dictoffset && __pyx_type_7aiohttp_12_http_parser_HttpRequestParser.tp_getattro == PyObject_GenericGetAttr)) { __pyx_type_7aiohttp_12_http_parser_HttpRequestParser.tp_getattro = __Pyx_PyObject_GenericGetAttr; } if (__Pyx_SetVtable(__pyx_type_7aiohttp_12_http_parser_HttpRequestParser.tp_dict, __pyx_vtabptr_7aiohttp_12_http_parser_HttpRequestParser) < 0) __PYX_ERR(0, 537, __pyx_L1_error) if (PyObject_SetAttr(__pyx_m, __pyx_n_s_HttpRequestParser, (PyObject *)&__pyx_type_7aiohttp_12_http_parser_HttpRequestParser) < 0) __PYX_ERR(0, 537, __pyx_L1_error) if (__Pyx_setup_reduce((PyObject*)&__pyx_type_7aiohttp_12_http_parser_HttpRequestParser) < 0) __PYX_ERR(0, 537, __pyx_L1_error) __pyx_ptype_7aiohttp_12_http_parser_HttpRequestParser = &__pyx_type_7aiohttp_12_http_parser_HttpRequestParser; __pyx_vtabptr_7aiohttp_12_http_parser_HttpResponseParser = &__pyx_vtable_7aiohttp_12_http_parser_HttpResponseParser; __pyx_vtable_7aiohttp_12_http_parser_HttpResponseParser.__pyx_base = *__pyx_vtabptr_7aiohttp_12_http_parser_HttpParser; __pyx_vtable_7aiohttp_12_http_parser_HttpResponseParser.__pyx_base._on_status_complete = (PyObject *(*)(struct __pyx_obj_7aiohttp_12_http_parser_HttpParser *))__pyx_f_7aiohttp_12_http_parser_18HttpResponseParser__on_status_complete; __pyx_type_7aiohttp_12_http_parser_HttpResponseParser.tp_base = __pyx_ptype_7aiohttp_12_http_parser_HttpParser; if (PyType_Ready(&__pyx_type_7aiohttp_12_http_parser_HttpResponseParser) < 0) __PYX_ERR(0, 564, __pyx_L1_error) #if PY_VERSION_HEX < 0x030800B1 __pyx_type_7aiohttp_12_http_parser_HttpResponseParser.tp_print = 0; #endif if ((CYTHON_USE_TYPE_SLOTS && CYTHON_USE_PYTYPE_LOOKUP) && likely(!__pyx_type_7aiohttp_12_http_parser_HttpResponseParser.tp_dictoffset && __pyx_type_7aiohttp_12_http_parser_HttpResponseParser.tp_getattro == PyObject_GenericGetAttr)) { __pyx_type_7aiohttp_12_http_parser_HttpResponseParser.tp_getattro = __Pyx_PyObject_GenericGetAttr; } if (__Pyx_SetVtable(__pyx_type_7aiohttp_12_http_parser_HttpResponseParser.tp_dict, __pyx_vtabptr_7aiohttp_12_http_parser_HttpResponseParser) < 0) __PYX_ERR(0, 564, __pyx_L1_error) if (PyObject_SetAttr(__pyx_m, __pyx_n_s_HttpResponseParser, (PyObject *)&__pyx_type_7aiohttp_12_http_parser_HttpResponseParser) < 0) __PYX_ERR(0, 564, __pyx_L1_error) if (__Pyx_setup_reduce((PyObject*)&__pyx_type_7aiohttp_12_http_parser_HttpResponseParser) < 0) __PYX_ERR(0, 564, __pyx_L1_error) __pyx_ptype_7aiohttp_12_http_parser_HttpResponseParser = &__pyx_type_7aiohttp_12_http_parser_HttpResponseParser; if (PyType_Ready(&__pyx_type_7aiohttp_12_http_parser___pyx_scope_struct____repr__) < 0) __PYX_ERR(0, 118, __pyx_L1_error) #if PY_VERSION_HEX < 0x030800B1 __pyx_type_7aiohttp_12_http_parser___pyx_scope_struct____repr__.tp_print = 0; #endif if ((CYTHON_USE_TYPE_SLOTS && CYTHON_USE_PYTYPE_LOOKUP) && likely(!__pyx_type_7aiohttp_12_http_parser___pyx_scope_struct____repr__.tp_dictoffset && __pyx_type_7aiohttp_12_http_parser___pyx_scope_struct____repr__.tp_getattro == PyObject_GenericGetAttr)) { __pyx_type_7aiohttp_12_http_parser___pyx_scope_struct____repr__.tp_getattro = __Pyx_PyObject_GenericGetAttrNoDict; } __pyx_ptype_7aiohttp_12_http_parser___pyx_scope_struct____repr__ = &__pyx_type_7aiohttp_12_http_parser___pyx_scope_struct____repr__; if (PyType_Ready(&__pyx_type_7aiohttp_12_http_parser___pyx_scope_struct_1_genexpr) < 0) __PYX_ERR(0, 130, __pyx_L1_error) #if PY_VERSION_HEX < 0x030800B1 __pyx_type_7aiohttp_12_http_parser___pyx_scope_struct_1_genexpr.tp_print = 
0; #endif if ((CYTHON_USE_TYPE_SLOTS && CYTHON_USE_PYTYPE_LOOKUP) && likely(!__pyx_type_7aiohttp_12_http_parser___pyx_scope_struct_1_genexpr.tp_dictoffset && __pyx_type_7aiohttp_12_http_parser___pyx_scope_struct_1_genexpr.tp_getattro == PyObject_GenericGetAttr)) { __pyx_type_7aiohttp_12_http_parser___pyx_scope_struct_1_genexpr.tp_getattro = __Pyx_PyObject_GenericGetAttrNoDict; } __pyx_ptype_7aiohttp_12_http_parser___pyx_scope_struct_1_genexpr = &__pyx_type_7aiohttp_12_http_parser___pyx_scope_struct_1_genexpr; if (PyType_Ready(&__pyx_type_7aiohttp_12_http_parser___pyx_scope_struct_2___repr__) < 0) __PYX_ERR(0, 216, __pyx_L1_error) #if PY_VERSION_HEX < 0x030800B1 __pyx_type_7aiohttp_12_http_parser___pyx_scope_struct_2___repr__.tp_print = 0; #endif if ((CYTHON_USE_TYPE_SLOTS && CYTHON_USE_PYTYPE_LOOKUP) && likely(!__pyx_type_7aiohttp_12_http_parser___pyx_scope_struct_2___repr__.tp_dictoffset && __pyx_type_7aiohttp_12_http_parser___pyx_scope_struct_2___repr__.tp_getattro == PyObject_GenericGetAttr)) { __pyx_type_7aiohttp_12_http_parser___pyx_scope_struct_2___repr__.tp_getattro = __Pyx_PyObject_GenericGetAttrNoDict; } __pyx_ptype_7aiohttp_12_http_parser___pyx_scope_struct_2___repr__ = &__pyx_type_7aiohttp_12_http_parser___pyx_scope_struct_2___repr__; if (PyType_Ready(&__pyx_type_7aiohttp_12_http_parser___pyx_scope_struct_3_genexpr) < 0) __PYX_ERR(0, 227, __pyx_L1_error) #if PY_VERSION_HEX < 0x030800B1 __pyx_type_7aiohttp_12_http_parser___pyx_scope_struct_3_genexpr.tp_print = 0; #endif if ((CYTHON_USE_TYPE_SLOTS && CYTHON_USE_PYTYPE_LOOKUP) && likely(!__pyx_type_7aiohttp_12_http_parser___pyx_scope_struct_3_genexpr.tp_dictoffset && __pyx_type_7aiohttp_12_http_parser___pyx_scope_struct_3_genexpr.tp_getattro == PyObject_GenericGetAttr)) { __pyx_type_7aiohttp_12_http_parser___pyx_scope_struct_3_genexpr.tp_getattro = __Pyx_PyObject_GenericGetAttrNoDict; } __pyx_ptype_7aiohttp_12_http_parser___pyx_scope_struct_3_genexpr = &__pyx_type_7aiohttp_12_http_parser___pyx_scope_struct_3_genexpr; __Pyx_RefNannyFinishContext(); return 0; __pyx_L1_error:; __Pyx_RefNannyFinishContext(); return -1; } static int __Pyx_modinit_type_import_code(void) { __Pyx_RefNannyDeclarations PyObject *__pyx_t_1 = NULL; __Pyx_RefNannySetupContext("__Pyx_modinit_type_import_code", 0); /*--- Type import code ---*/ __pyx_t_1 = PyImport_ImportModule(__Pyx_BUILTIN_MODULE_NAME); if (unlikely(!__pyx_t_1)) __PYX_ERR(2, 9, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_1); __pyx_ptype_7cpython_4type_type = __Pyx_ImportType(__pyx_t_1, __Pyx_BUILTIN_MODULE_NAME, "type", #if defined(PYPY_VERSION_NUM) && PYPY_VERSION_NUM < 0x050B0000 sizeof(PyTypeObject), #else sizeof(PyHeapTypeObject), #endif __Pyx_ImportType_CheckSize_Warn); if (!__pyx_ptype_7cpython_4type_type) __PYX_ERR(2, 9, __pyx_L1_error) __Pyx_DECREF(__pyx_t_1); __pyx_t_1 = 0; __pyx_t_1 = PyImport_ImportModule(__Pyx_BUILTIN_MODULE_NAME); if (unlikely(!__pyx_t_1)) __PYX_ERR(3, 8, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_1); __pyx_ptype_7cpython_4bool_bool = __Pyx_ImportType(__pyx_t_1, __Pyx_BUILTIN_MODULE_NAME, "bool", sizeof(PyBoolObject), __Pyx_ImportType_CheckSize_Warn); if (!__pyx_ptype_7cpython_4bool_bool) __PYX_ERR(3, 8, __pyx_L1_error) __Pyx_DECREF(__pyx_t_1); __pyx_t_1 = 0; __pyx_t_1 = PyImport_ImportModule(__Pyx_BUILTIN_MODULE_NAME); if (unlikely(!__pyx_t_1)) __PYX_ERR(4, 15, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_1); __pyx_ptype_7cpython_7complex_complex = __Pyx_ImportType(__pyx_t_1, __Pyx_BUILTIN_MODULE_NAME, "complex", sizeof(PyComplexObject), __Pyx_ImportType_CheckSize_Warn); if 
(!__pyx_ptype_7cpython_7complex_complex) __PYX_ERR(4, 15, __pyx_L1_error) __Pyx_DECREF(__pyx_t_1); __pyx_t_1 = 0; __Pyx_RefNannyFinishContext(); return 0; __pyx_L1_error:; __Pyx_XDECREF(__pyx_t_1); __Pyx_RefNannyFinishContext(); return -1; } static int __Pyx_modinit_variable_import_code(void) { __Pyx_RefNannyDeclarations __Pyx_RefNannySetupContext("__Pyx_modinit_variable_import_code", 0); /*--- Variable import code ---*/ __Pyx_RefNannyFinishContext(); return 0; } static int __Pyx_modinit_function_import_code(void) { __Pyx_RefNannyDeclarations __Pyx_RefNannySetupContext("__Pyx_modinit_function_import_code", 0); /*--- Function import code ---*/ __Pyx_RefNannyFinishContext(); return 0; } #if PY_MAJOR_VERSION < 3 #ifdef CYTHON_NO_PYINIT_EXPORT #define __Pyx_PyMODINIT_FUNC void #else #define __Pyx_PyMODINIT_FUNC PyMODINIT_FUNC #endif #else #ifdef CYTHON_NO_PYINIT_EXPORT #define __Pyx_PyMODINIT_FUNC PyObject * #else #define __Pyx_PyMODINIT_FUNC PyMODINIT_FUNC #endif #endif #if PY_MAJOR_VERSION < 3 __Pyx_PyMODINIT_FUNC init_http_parser(void) CYTHON_SMALL_CODE; /*proto*/ __Pyx_PyMODINIT_FUNC init_http_parser(void) #else __Pyx_PyMODINIT_FUNC PyInit__http_parser(void) CYTHON_SMALL_CODE; /*proto*/ __Pyx_PyMODINIT_FUNC PyInit__http_parser(void) #if CYTHON_PEP489_MULTI_PHASE_INIT { return PyModuleDef_Init(&__pyx_moduledef); } static CYTHON_SMALL_CODE int __Pyx_check_single_interpreter(void) { #if PY_VERSION_HEX >= 0x030700A1 static PY_INT64_T main_interpreter_id = -1; PY_INT64_T current_id = PyInterpreterState_GetID(PyThreadState_Get()->interp); if (main_interpreter_id == -1) { main_interpreter_id = current_id; return (unlikely(current_id == -1)) ? -1 : 0; } else if (unlikely(main_interpreter_id != current_id)) #else static PyInterpreterState *main_interpreter = NULL; PyInterpreterState *current_interpreter = PyThreadState_Get()->interp; if (!main_interpreter) { main_interpreter = current_interpreter; } else if (unlikely(main_interpreter != current_interpreter)) #endif { PyErr_SetString( PyExc_ImportError, "Interpreter change detected - this module can only be loaded into one interpreter per process."); return -1; } return 0; } static CYTHON_SMALL_CODE int __Pyx_copy_spec_to_module(PyObject *spec, PyObject *moddict, const char* from_name, const char* to_name, int allow_none) { PyObject *value = PyObject_GetAttrString(spec, from_name); int result = 0; if (likely(value)) { if (allow_none || value != Py_None) { result = PyDict_SetItemString(moddict, to_name, value); } Py_DECREF(value); } else if (PyErr_ExceptionMatches(PyExc_AttributeError)) { PyErr_Clear(); } else { result = -1; } return result; } static CYTHON_SMALL_CODE PyObject* __pyx_pymod_create(PyObject *spec, CYTHON_UNUSED PyModuleDef *def) { PyObject *module = NULL, *moddict, *modname; if (__Pyx_check_single_interpreter()) return NULL; if (__pyx_m) return __Pyx_NewRef(__pyx_m); modname = PyObject_GetAttrString(spec, "name"); if (unlikely(!modname)) goto bad; module = PyModule_NewObject(modname); Py_DECREF(modname); if (unlikely(!module)) goto bad; moddict = PyModule_GetDict(module); if (unlikely(!moddict)) goto bad; if (unlikely(__Pyx_copy_spec_to_module(spec, moddict, "loader", "__loader__", 1) < 0)) goto bad; if (unlikely(__Pyx_copy_spec_to_module(spec, moddict, "origin", "__file__", 1) < 0)) goto bad; if (unlikely(__Pyx_copy_spec_to_module(spec, moddict, "parent", "__package__", 1) < 0)) goto bad; if (unlikely(__Pyx_copy_spec_to_module(spec, moddict, "submodule_search_locations", "__path__", 0) < 0)) goto bad; return module; bad: 
Py_XDECREF(module); return NULL; } static CYTHON_SMALL_CODE int __pyx_pymod_exec__http_parser(PyObject *__pyx_pyinit_module) #endif #endif { PyObject *__pyx_t_1 = NULL; PyObject *__pyx_t_2 = NULL; PyObject *__pyx_t_3 = NULL; PyObject *__pyx_t_4 = NULL; PyObject *__pyx_t_5 = NULL; PyObject *__pyx_t_6 = NULL; PyObject *__pyx_t_7 = NULL; PyObject *__pyx_t_8 = NULL; PyObject *__pyx_t_9 = NULL; PyObject *__pyx_t_10 = NULL; PyObject *__pyx_t_11 = NULL; PyObject *__pyx_t_12 = NULL; PyObject *__pyx_t_13 = NULL; PyObject *__pyx_t_14 = NULL; PyObject *__pyx_t_15 = NULL; PyObject *__pyx_t_16 = NULL; PyObject *__pyx_t_17 = NULL; PyObject *__pyx_t_18 = NULL; PyObject *__pyx_t_19 = NULL; PyObject *__pyx_t_20 = NULL; PyObject *__pyx_t_21 = NULL; PyObject *__pyx_t_22 = NULL; PyObject *__pyx_t_23 = NULL; PyObject *__pyx_t_24 = NULL; PyObject *__pyx_t_25 = NULL; PyObject *__pyx_t_26 = NULL; PyObject *__pyx_t_27 = NULL; PyObject *__pyx_t_28 = NULL; PyObject *__pyx_t_29 = NULL; PyObject *__pyx_t_30 = NULL; PyObject *__pyx_t_31 = NULL; PyObject *__pyx_t_32 = NULL; PyObject *__pyx_t_33 = NULL; PyObject *__pyx_t_34 = NULL; PyObject *__pyx_t_35 = NULL; PyObject *__pyx_t_36 = NULL; PyObject *__pyx_t_37 = NULL; PyObject *__pyx_t_38 = NULL; PyObject *__pyx_t_39 = NULL; PyObject *__pyx_t_40 = NULL; PyObject *__pyx_t_41 = NULL; PyObject *__pyx_t_42 = NULL; PyObject *__pyx_t_43 = NULL; PyObject *__pyx_t_44 = NULL; PyObject *__pyx_t_45 = NULL; PyObject *__pyx_t_46 = NULL; PyObject *__pyx_t_47 = NULL; PyObject *__pyx_t_48 = NULL; PyObject *__pyx_t_49 = NULL; PyObject *__pyx_t_50 = NULL; PyObject *__pyx_t_51 = NULL; PyObject *__pyx_t_52 = NULL; PyObject *__pyx_t_53 = NULL; PyObject *__pyx_t_54 = NULL; PyObject *__pyx_t_55 = NULL; PyObject *__pyx_t_56 = NULL; PyObject *__pyx_t_57 = NULL; PyObject *__pyx_t_58 = NULL; PyObject *__pyx_t_59 = NULL; PyObject *__pyx_t_60 = NULL; PyObject *__pyx_t_61 = NULL; PyObject *__pyx_t_62 = NULL; PyObject *__pyx_t_63 = NULL; PyObject *__pyx_t_64 = NULL; PyObject *__pyx_t_65 = NULL; PyObject *__pyx_t_66 = NULL; PyObject *__pyx_t_67 = NULL; PyObject *__pyx_t_68 = NULL; PyObject *__pyx_t_69 = NULL; PyObject *__pyx_t_70 = NULL; PyObject *__pyx_t_71 = NULL; PyObject *__pyx_t_72 = NULL; PyObject *__pyx_t_73 = NULL; PyObject *__pyx_t_74 = NULL; PyObject *__pyx_t_75 = NULL; PyObject *__pyx_t_76 = NULL; PyObject *__pyx_t_77 = NULL; PyObject *__pyx_t_78 = NULL; PyObject *__pyx_t_79 = NULL; long __pyx_t_80; enum http_method __pyx_t_81; char const *__pyx_t_82; int __pyx_t_83; __Pyx_RefNannyDeclarations #if CYTHON_PEP489_MULTI_PHASE_INIT if (__pyx_m) { if (__pyx_m == __pyx_pyinit_module) return 0; PyErr_SetString(PyExc_RuntimeError, "Module '_http_parser' has already been imported. 
Re-initialisation is not supported."); return -1; } #elif PY_MAJOR_VERSION >= 3 if (__pyx_m) return __Pyx_NewRef(__pyx_m); #endif #if CYTHON_REFNANNY __Pyx_RefNanny = __Pyx_RefNannyImportAPI("refnanny"); if (!__Pyx_RefNanny) { PyErr_Clear(); __Pyx_RefNanny = __Pyx_RefNannyImportAPI("Cython.Runtime.refnanny"); if (!__Pyx_RefNanny) Py_FatalError("failed to import 'refnanny' module"); } #endif __Pyx_RefNannySetupContext("__Pyx_PyMODINIT_FUNC PyInit__http_parser(void)", 0); if (__Pyx_check_binary_version() < 0) __PYX_ERR(0, 1, __pyx_L1_error) #ifdef __Pxy_PyFrame_Initialize_Offsets __Pxy_PyFrame_Initialize_Offsets(); #endif __pyx_empty_tuple = PyTuple_New(0); if (unlikely(!__pyx_empty_tuple)) __PYX_ERR(0, 1, __pyx_L1_error) __pyx_empty_bytes = PyBytes_FromStringAndSize("", 0); if (unlikely(!__pyx_empty_bytes)) __PYX_ERR(0, 1, __pyx_L1_error) __pyx_empty_unicode = PyUnicode_FromStringAndSize("", 0); if (unlikely(!__pyx_empty_unicode)) __PYX_ERR(0, 1, __pyx_L1_error) #ifdef __Pyx_CyFunction_USED if (__pyx_CyFunction_init() < 0) __PYX_ERR(0, 1, __pyx_L1_error) #endif #ifdef __Pyx_FusedFunction_USED if (__pyx_FusedFunction_init() < 0) __PYX_ERR(0, 1, __pyx_L1_error) #endif #ifdef __Pyx_Coroutine_USED if (__pyx_Coroutine_init() < 0) __PYX_ERR(0, 1, __pyx_L1_error) #endif #ifdef __Pyx_Generator_USED if (__pyx_Generator_init() < 0) __PYX_ERR(0, 1, __pyx_L1_error) #endif #ifdef __Pyx_AsyncGen_USED if (__pyx_AsyncGen_init() < 0) __PYX_ERR(0, 1, __pyx_L1_error) #endif #ifdef __Pyx_StopAsyncIteration_USED if (__pyx_StopAsyncIteration_init() < 0) __PYX_ERR(0, 1, __pyx_L1_error) #endif /*--- Library function declarations ---*/ /*--- Threads initialization code ---*/ #if defined(__PYX_FORCE_INIT_THREADS) && __PYX_FORCE_INIT_THREADS #ifdef WITH_THREAD /* Python build with threading support? */ PyEval_InitThreads(); #endif #endif /*--- Module creation code ---*/ #if CYTHON_PEP489_MULTI_PHASE_INIT __pyx_m = __pyx_pyinit_module; Py_INCREF(__pyx_m); #else #if PY_MAJOR_VERSION < 3 __pyx_m = Py_InitModule4("_http_parser", __pyx_methods, 0, 0, PYTHON_API_VERSION); Py_XINCREF(__pyx_m); #else __pyx_m = PyModule_Create(&__pyx_moduledef); #endif if (unlikely(!__pyx_m)) __PYX_ERR(0, 1, __pyx_L1_error) #endif __pyx_d = PyModule_GetDict(__pyx_m); if (unlikely(!__pyx_d)) __PYX_ERR(0, 1, __pyx_L1_error) Py_INCREF(__pyx_d); __pyx_b = PyImport_AddModule(__Pyx_BUILTIN_MODULE_NAME); if (unlikely(!__pyx_b)) __PYX_ERR(0, 1, __pyx_L1_error) Py_INCREF(__pyx_b); __pyx_cython_runtime = PyImport_AddModule((char *) "cython_runtime"); if (unlikely(!__pyx_cython_runtime)) __PYX_ERR(0, 1, __pyx_L1_error) Py_INCREF(__pyx_cython_runtime); if (PyObject_SetAttrString(__pyx_m, "__builtins__", __pyx_b) < 0) __PYX_ERR(0, 1, __pyx_L1_error); /*--- Initialize various global constants etc. 
---*/ if (__Pyx_InitGlobals() < 0) __PYX_ERR(0, 1, __pyx_L1_error) #if PY_MAJOR_VERSION < 3 && (__PYX_DEFAULT_STRING_ENCODING_IS_ASCII || __PYX_DEFAULT_STRING_ENCODING_IS_DEFAULT) if (__Pyx_init_sys_getdefaultencoding_params() < 0) __PYX_ERR(0, 1, __pyx_L1_error) #endif if (__pyx_module_is_main_aiohttp___http_parser) { if (PyObject_SetAttr(__pyx_m, __pyx_n_s_name, __pyx_n_s_main) < 0) __PYX_ERR(0, 1, __pyx_L1_error) } #if PY_MAJOR_VERSION >= 3 { PyObject *modules = PyImport_GetModuleDict(); if (unlikely(!modules)) __PYX_ERR(0, 1, __pyx_L1_error) if (!PyDict_GetItemString(modules, "aiohttp._http_parser")) { if (unlikely(PyDict_SetItemString(modules, "aiohttp._http_parser", __pyx_m) < 0)) __PYX_ERR(0, 1, __pyx_L1_error) } } #endif /*--- Builtin init code ---*/ if (__Pyx_InitCachedBuiltins() < 0) goto __pyx_L1_error; /*--- Constants init code ---*/ if (__Pyx_InitCachedConstants() < 0) goto __pyx_L1_error; /*--- Global type/function init code ---*/ (void)__Pyx_modinit_global_init_code(); (void)__Pyx_modinit_variable_export_code(); (void)__Pyx_modinit_function_export_code(); if (unlikely(__Pyx_modinit_type_init_code() != 0)) goto __pyx_L1_error; if (unlikely(__Pyx_modinit_type_import_code() != 0)) goto __pyx_L1_error; (void)__Pyx_modinit_variable_import_code(); (void)__Pyx_modinit_function_import_code(); /*--- Execution code ---*/ #if defined(__Pyx_Generator_USED) || defined(__Pyx_Coroutine_USED) if (__Pyx_patch_abc() < 0) __PYX_ERR(0, 1, __pyx_L1_error) #endif /* "aiohttp/_http_parser.pyx":11 * Py_buffer, PyBytes_AsString, PyBytes_AsStringAndSize) * * from multidict import (CIMultiDict as _CIMultiDict, # <<<<<<<<<<<<<< * CIMultiDictProxy as _CIMultiDictProxy) * from yarl import URL as _URL */ __pyx_t_1 = PyList_New(2); if (unlikely(!__pyx_t_1)) __PYX_ERR(0, 11, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_1); __Pyx_INCREF(__pyx_n_s_CIMultiDict); __Pyx_GIVEREF(__pyx_n_s_CIMultiDict); PyList_SET_ITEM(__pyx_t_1, 0, __pyx_n_s_CIMultiDict); __Pyx_INCREF(__pyx_n_s_CIMultiDictProxy); __Pyx_GIVEREF(__pyx_n_s_CIMultiDictProxy); PyList_SET_ITEM(__pyx_t_1, 1, __pyx_n_s_CIMultiDictProxy); __pyx_t_2 = __Pyx_Import(__pyx_n_s_multidict, __pyx_t_1, 0); if (unlikely(!__pyx_t_2)) __PYX_ERR(0, 11, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_2); __Pyx_DECREF(__pyx_t_1); __pyx_t_1 = 0; __pyx_t_1 = __Pyx_ImportFrom(__pyx_t_2, __pyx_n_s_CIMultiDict); if (unlikely(!__pyx_t_1)) __PYX_ERR(0, 11, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_1); if (PyDict_SetItem(__pyx_d, __pyx_n_s_CIMultiDict_2, __pyx_t_1) < 0) __PYX_ERR(0, 11, __pyx_L1_error) __Pyx_DECREF(__pyx_t_1); __pyx_t_1 = 0; __pyx_t_1 = __Pyx_ImportFrom(__pyx_t_2, __pyx_n_s_CIMultiDictProxy); if (unlikely(!__pyx_t_1)) __PYX_ERR(0, 11, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_1); if (PyDict_SetItem(__pyx_d, __pyx_n_s_CIMultiDictProxy_2, __pyx_t_1) < 0) __PYX_ERR(0, 12, __pyx_L1_error) __Pyx_DECREF(__pyx_t_1); __pyx_t_1 = 0; __Pyx_DECREF(__pyx_t_2); __pyx_t_2 = 0; /* "aiohttp/_http_parser.pyx":13 * from multidict import (CIMultiDict as _CIMultiDict, * CIMultiDictProxy as _CIMultiDictProxy) * from yarl import URL as _URL # <<<<<<<<<<<<<< * * from aiohttp import hdrs */ __pyx_t_2 = PyList_New(1); if (unlikely(!__pyx_t_2)) __PYX_ERR(0, 13, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_2); __Pyx_INCREF(__pyx_n_s_URL); __Pyx_GIVEREF(__pyx_n_s_URL); PyList_SET_ITEM(__pyx_t_2, 0, __pyx_n_s_URL); __pyx_t_1 = __Pyx_Import(__pyx_n_s_yarl, __pyx_t_2, 0); if (unlikely(!__pyx_t_1)) __PYX_ERR(0, 13, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_1); __Pyx_DECREF(__pyx_t_2); __pyx_t_2 = 0; __pyx_t_2 = 
__Pyx_ImportFrom(__pyx_t_1, __pyx_n_s_URL); if (unlikely(!__pyx_t_2)) __PYX_ERR(0, 13, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_2); if (PyDict_SetItem(__pyx_d, __pyx_n_s_URL_2, __pyx_t_2) < 0) __PYX_ERR(0, 13, __pyx_L1_error) __Pyx_DECREF(__pyx_t_2); __pyx_t_2 = 0; __Pyx_DECREF(__pyx_t_1); __pyx_t_1 = 0; /* "aiohttp/_http_parser.pyx":15 * from yarl import URL as _URL * * from aiohttp import hdrs # <<<<<<<<<<<<<< * from .http_exceptions import ( * BadHttpMessage, BadStatusLine, InvalidHeader, LineTooLong, InvalidURLError, */ __pyx_t_1 = PyList_New(1); if (unlikely(!__pyx_t_1)) __PYX_ERR(0, 15, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_1); __Pyx_INCREF(__pyx_n_s_hdrs); __Pyx_GIVEREF(__pyx_n_s_hdrs); PyList_SET_ITEM(__pyx_t_1, 0, __pyx_n_s_hdrs); __pyx_t_2 = __Pyx_Import(__pyx_n_s_aiohttp, __pyx_t_1, 0); if (unlikely(!__pyx_t_2)) __PYX_ERR(0, 15, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_2); __Pyx_DECREF(__pyx_t_1); __pyx_t_1 = 0; __pyx_t_1 = __Pyx_ImportFrom(__pyx_t_2, __pyx_n_s_hdrs); if (unlikely(!__pyx_t_1)) __PYX_ERR(0, 15, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_1); if (PyDict_SetItem(__pyx_d, __pyx_n_s_hdrs, __pyx_t_1) < 0) __PYX_ERR(0, 15, __pyx_L1_error) __Pyx_DECREF(__pyx_t_1); __pyx_t_1 = 0; __Pyx_DECREF(__pyx_t_2); __pyx_t_2 = 0; /* "aiohttp/_http_parser.pyx":17 * from aiohttp import hdrs * from .http_exceptions import ( * BadHttpMessage, BadStatusLine, InvalidHeader, LineTooLong, InvalidURLError, # <<<<<<<<<<<<<< * PayloadEncodingError, ContentLengthError, TransferEncodingError) * from .http_writer import (HttpVersion as _HttpVersion, */ __pyx_t_2 = PyList_New(8); if (unlikely(!__pyx_t_2)) __PYX_ERR(0, 17, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_2); __Pyx_INCREF(__pyx_n_s_BadHttpMessage); __Pyx_GIVEREF(__pyx_n_s_BadHttpMessage); PyList_SET_ITEM(__pyx_t_2, 0, __pyx_n_s_BadHttpMessage); __Pyx_INCREF(__pyx_n_s_BadStatusLine); __Pyx_GIVEREF(__pyx_n_s_BadStatusLine); PyList_SET_ITEM(__pyx_t_2, 1, __pyx_n_s_BadStatusLine); __Pyx_INCREF(__pyx_n_s_InvalidHeader); __Pyx_GIVEREF(__pyx_n_s_InvalidHeader); PyList_SET_ITEM(__pyx_t_2, 2, __pyx_n_s_InvalidHeader); __Pyx_INCREF(__pyx_n_s_LineTooLong); __Pyx_GIVEREF(__pyx_n_s_LineTooLong); PyList_SET_ITEM(__pyx_t_2, 3, __pyx_n_s_LineTooLong); __Pyx_INCREF(__pyx_n_s_InvalidURLError); __Pyx_GIVEREF(__pyx_n_s_InvalidURLError); PyList_SET_ITEM(__pyx_t_2, 4, __pyx_n_s_InvalidURLError); __Pyx_INCREF(__pyx_n_s_PayloadEncodingError); __Pyx_GIVEREF(__pyx_n_s_PayloadEncodingError); PyList_SET_ITEM(__pyx_t_2, 5, __pyx_n_s_PayloadEncodingError); __Pyx_INCREF(__pyx_n_s_ContentLengthError); __Pyx_GIVEREF(__pyx_n_s_ContentLengthError); PyList_SET_ITEM(__pyx_t_2, 6, __pyx_n_s_ContentLengthError); __Pyx_INCREF(__pyx_n_s_TransferEncodingError); __Pyx_GIVEREF(__pyx_n_s_TransferEncodingError); PyList_SET_ITEM(__pyx_t_2, 7, __pyx_n_s_TransferEncodingError); /* "aiohttp/_http_parser.pyx":16 * * from aiohttp import hdrs * from .http_exceptions import ( # <<<<<<<<<<<<<< * BadHttpMessage, BadStatusLine, InvalidHeader, LineTooLong, InvalidURLError, * PayloadEncodingError, ContentLengthError, TransferEncodingError) */ __pyx_t_1 = __Pyx_Import(__pyx_n_s_http_exceptions, __pyx_t_2, 1); if (unlikely(!__pyx_t_1)) __PYX_ERR(0, 16, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_1); __Pyx_DECREF(__pyx_t_2); __pyx_t_2 = 0; __pyx_t_2 = __Pyx_ImportFrom(__pyx_t_1, __pyx_n_s_BadHttpMessage); if (unlikely(!__pyx_t_2)) __PYX_ERR(0, 16, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_2); if (PyDict_SetItem(__pyx_d, __pyx_n_s_BadHttpMessage, __pyx_t_2) < 0) __PYX_ERR(0, 17, __pyx_L1_error) __Pyx_DECREF(__pyx_t_2); __pyx_t_2 = 
0; __pyx_t_2 = __Pyx_ImportFrom(__pyx_t_1, __pyx_n_s_BadStatusLine); if (unlikely(!__pyx_t_2)) __PYX_ERR(0, 16, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_2); if (PyDict_SetItem(__pyx_d, __pyx_n_s_BadStatusLine, __pyx_t_2) < 0) __PYX_ERR(0, 17, __pyx_L1_error) __Pyx_DECREF(__pyx_t_2); __pyx_t_2 = 0; __pyx_t_2 = __Pyx_ImportFrom(__pyx_t_1, __pyx_n_s_InvalidHeader); if (unlikely(!__pyx_t_2)) __PYX_ERR(0, 16, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_2); if (PyDict_SetItem(__pyx_d, __pyx_n_s_InvalidHeader, __pyx_t_2) < 0) __PYX_ERR(0, 17, __pyx_L1_error) __Pyx_DECREF(__pyx_t_2); __pyx_t_2 = 0; __pyx_t_2 = __Pyx_ImportFrom(__pyx_t_1, __pyx_n_s_LineTooLong); if (unlikely(!__pyx_t_2)) __PYX_ERR(0, 16, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_2); if (PyDict_SetItem(__pyx_d, __pyx_n_s_LineTooLong, __pyx_t_2) < 0) __PYX_ERR(0, 17, __pyx_L1_error) __Pyx_DECREF(__pyx_t_2); __pyx_t_2 = 0; __pyx_t_2 = __Pyx_ImportFrom(__pyx_t_1, __pyx_n_s_InvalidURLError); if (unlikely(!__pyx_t_2)) __PYX_ERR(0, 16, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_2); if (PyDict_SetItem(__pyx_d, __pyx_n_s_InvalidURLError, __pyx_t_2) < 0) __PYX_ERR(0, 17, __pyx_L1_error) __Pyx_DECREF(__pyx_t_2); __pyx_t_2 = 0; __pyx_t_2 = __Pyx_ImportFrom(__pyx_t_1, __pyx_n_s_PayloadEncodingError); if (unlikely(!__pyx_t_2)) __PYX_ERR(0, 16, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_2); if (PyDict_SetItem(__pyx_d, __pyx_n_s_PayloadEncodingError, __pyx_t_2) < 0) __PYX_ERR(0, 18, __pyx_L1_error) __Pyx_DECREF(__pyx_t_2); __pyx_t_2 = 0; __pyx_t_2 = __Pyx_ImportFrom(__pyx_t_1, __pyx_n_s_ContentLengthError); if (unlikely(!__pyx_t_2)) __PYX_ERR(0, 16, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_2); if (PyDict_SetItem(__pyx_d, __pyx_n_s_ContentLengthError, __pyx_t_2) < 0) __PYX_ERR(0, 18, __pyx_L1_error) __Pyx_DECREF(__pyx_t_2); __pyx_t_2 = 0; __pyx_t_2 = __Pyx_ImportFrom(__pyx_t_1, __pyx_n_s_TransferEncodingError); if (unlikely(!__pyx_t_2)) __PYX_ERR(0, 16, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_2); if (PyDict_SetItem(__pyx_d, __pyx_n_s_TransferEncodingError, __pyx_t_2) < 0) __PYX_ERR(0, 18, __pyx_L1_error) __Pyx_DECREF(__pyx_t_2); __pyx_t_2 = 0; __Pyx_DECREF(__pyx_t_1); __pyx_t_1 = 0; /* "aiohttp/_http_parser.pyx":19 * BadHttpMessage, BadStatusLine, InvalidHeader, LineTooLong, InvalidURLError, * PayloadEncodingError, ContentLengthError, TransferEncodingError) * from .http_writer import (HttpVersion as _HttpVersion, # <<<<<<<<<<<<<< * HttpVersion10 as _HttpVersion10, * HttpVersion11 as _HttpVersion11) */ __pyx_t_1 = PyList_New(3); if (unlikely(!__pyx_t_1)) __PYX_ERR(0, 19, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_1); __Pyx_INCREF(__pyx_n_s_HttpVersion); __Pyx_GIVEREF(__pyx_n_s_HttpVersion); PyList_SET_ITEM(__pyx_t_1, 0, __pyx_n_s_HttpVersion); __Pyx_INCREF(__pyx_n_s_HttpVersion10); __Pyx_GIVEREF(__pyx_n_s_HttpVersion10); PyList_SET_ITEM(__pyx_t_1, 1, __pyx_n_s_HttpVersion10); __Pyx_INCREF(__pyx_n_s_HttpVersion11); __Pyx_GIVEREF(__pyx_n_s_HttpVersion11); PyList_SET_ITEM(__pyx_t_1, 2, __pyx_n_s_HttpVersion11); __pyx_t_2 = __Pyx_Import(__pyx_n_s_http_writer, __pyx_t_1, 1); if (unlikely(!__pyx_t_2)) __PYX_ERR(0, 19, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_2); __Pyx_DECREF(__pyx_t_1); __pyx_t_1 = 0; __pyx_t_1 = __Pyx_ImportFrom(__pyx_t_2, __pyx_n_s_HttpVersion); if (unlikely(!__pyx_t_1)) __PYX_ERR(0, 19, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_1); if (PyDict_SetItem(__pyx_d, __pyx_n_s_HttpVersion_2, __pyx_t_1) < 0) __PYX_ERR(0, 19, __pyx_L1_error) __Pyx_DECREF(__pyx_t_1); __pyx_t_1 = 0; __pyx_t_1 = __Pyx_ImportFrom(__pyx_t_2, __pyx_n_s_HttpVersion10); if (unlikely(!__pyx_t_1)) __PYX_ERR(0, 19, 
__pyx_L1_error) __Pyx_GOTREF(__pyx_t_1); if (PyDict_SetItem(__pyx_d, __pyx_n_s_HttpVersion10_2, __pyx_t_1) < 0) __PYX_ERR(0, 20, __pyx_L1_error) __Pyx_DECREF(__pyx_t_1); __pyx_t_1 = 0; __pyx_t_1 = __Pyx_ImportFrom(__pyx_t_2, __pyx_n_s_HttpVersion11); if (unlikely(!__pyx_t_1)) __PYX_ERR(0, 19, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_1); if (PyDict_SetItem(__pyx_d, __pyx_n_s_HttpVersion11_2, __pyx_t_1) < 0) __PYX_ERR(0, 21, __pyx_L1_error) __Pyx_DECREF(__pyx_t_1); __pyx_t_1 = 0; __Pyx_DECREF(__pyx_t_2); __pyx_t_2 = 0; /* "aiohttp/_http_parser.pyx":22 * HttpVersion10 as _HttpVersion10, * HttpVersion11 as _HttpVersion11) * from .http_parser import DeflateBuffer as _DeflateBuffer # <<<<<<<<<<<<<< * from .streams import (EMPTY_PAYLOAD as _EMPTY_PAYLOAD, * StreamReader as _StreamReader) */ __pyx_t_2 = PyList_New(1); if (unlikely(!__pyx_t_2)) __PYX_ERR(0, 22, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_2); __Pyx_INCREF(__pyx_n_s_DeflateBuffer); __Pyx_GIVEREF(__pyx_n_s_DeflateBuffer); PyList_SET_ITEM(__pyx_t_2, 0, __pyx_n_s_DeflateBuffer); __pyx_t_1 = __Pyx_Import(__pyx_n_s_http_parser, __pyx_t_2, 1); if (unlikely(!__pyx_t_1)) __PYX_ERR(0, 22, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_1); __Pyx_DECREF(__pyx_t_2); __pyx_t_2 = 0; __pyx_t_2 = __Pyx_ImportFrom(__pyx_t_1, __pyx_n_s_DeflateBuffer); if (unlikely(!__pyx_t_2)) __PYX_ERR(0, 22, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_2); if (PyDict_SetItem(__pyx_d, __pyx_n_s_DeflateBuffer_2, __pyx_t_2) < 0) __PYX_ERR(0, 22, __pyx_L1_error) __Pyx_DECREF(__pyx_t_2); __pyx_t_2 = 0; __Pyx_DECREF(__pyx_t_1); __pyx_t_1 = 0; /* "aiohttp/_http_parser.pyx":23 * HttpVersion11 as _HttpVersion11) * from .http_parser import DeflateBuffer as _DeflateBuffer * from .streams import (EMPTY_PAYLOAD as _EMPTY_PAYLOAD, # <<<<<<<<<<<<<< * StreamReader as _StreamReader) * */ __pyx_t_1 = PyList_New(2); if (unlikely(!__pyx_t_1)) __PYX_ERR(0, 23, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_1); __Pyx_INCREF(__pyx_n_s_EMPTY_PAYLOAD); __Pyx_GIVEREF(__pyx_n_s_EMPTY_PAYLOAD); PyList_SET_ITEM(__pyx_t_1, 0, __pyx_n_s_EMPTY_PAYLOAD); __Pyx_INCREF(__pyx_n_s_StreamReader); __Pyx_GIVEREF(__pyx_n_s_StreamReader); PyList_SET_ITEM(__pyx_t_1, 1, __pyx_n_s_StreamReader); __pyx_t_2 = __Pyx_Import(__pyx_n_s_streams, __pyx_t_1, 1); if (unlikely(!__pyx_t_2)) __PYX_ERR(0, 23, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_2); __Pyx_DECREF(__pyx_t_1); __pyx_t_1 = 0; __pyx_t_1 = __Pyx_ImportFrom(__pyx_t_2, __pyx_n_s_EMPTY_PAYLOAD); if (unlikely(!__pyx_t_1)) __PYX_ERR(0, 23, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_1); if (PyDict_SetItem(__pyx_d, __pyx_n_s_EMPTY_PAYLOAD_2, __pyx_t_1) < 0) __PYX_ERR(0, 23, __pyx_L1_error) __Pyx_DECREF(__pyx_t_1); __pyx_t_1 = 0; __pyx_t_1 = __Pyx_ImportFrom(__pyx_t_2, __pyx_n_s_StreamReader); if (unlikely(!__pyx_t_1)) __PYX_ERR(0, 23, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_1); if (PyDict_SetItem(__pyx_d, __pyx_n_s_StreamReader_2, __pyx_t_1) < 0) __PYX_ERR(0, 24, __pyx_L1_error) __Pyx_DECREF(__pyx_t_1); __pyx_t_1 = 0; __Pyx_DECREF(__pyx_t_2); __pyx_t_2 = 0; /* "aiohttp/_headers.pxi":4 * # Run ./tools/gen.py to update it after the origin changing. * * from . 
import hdrs # <<<<<<<<<<<<<< * cdef tuple headers = ( * hdrs.ACCEPT, */ __pyx_t_2 = PyList_New(1); if (unlikely(!__pyx_t_2)) __PYX_ERR(5, 4, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_2); __Pyx_INCREF(__pyx_n_s_hdrs); __Pyx_GIVEREF(__pyx_n_s_hdrs); PyList_SET_ITEM(__pyx_t_2, 0, __pyx_n_s_hdrs); __pyx_t_1 = __Pyx_Import(__pyx_n_s__4, __pyx_t_2, 1); if (unlikely(!__pyx_t_1)) __PYX_ERR(5, 4, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_1); __Pyx_DECREF(__pyx_t_2); __pyx_t_2 = 0; __pyx_t_2 = __Pyx_ImportFrom(__pyx_t_1, __pyx_n_s_hdrs); if (unlikely(!__pyx_t_2)) __PYX_ERR(5, 4, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_2); if (PyDict_SetItem(__pyx_d, __pyx_n_s_hdrs, __pyx_t_2) < 0) __PYX_ERR(5, 4, __pyx_L1_error) __Pyx_DECREF(__pyx_t_2); __pyx_t_2 = 0; __Pyx_DECREF(__pyx_t_1); __pyx_t_1 = 0; /* "aiohttp/_headers.pxi":6 * from . import hdrs * cdef tuple headers = ( * hdrs.ACCEPT, # <<<<<<<<<<<<<< * hdrs.ACCEPT_CHARSET, * hdrs.ACCEPT_ENCODING, */ __Pyx_GetModuleGlobalName(__pyx_t_1, __pyx_n_s_hdrs); if (unlikely(!__pyx_t_1)) __PYX_ERR(5, 6, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_1); __pyx_t_2 = __Pyx_PyObject_GetAttrStr(__pyx_t_1, __pyx_n_s_ACCEPT); if (unlikely(!__pyx_t_2)) __PYX_ERR(5, 6, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_2); __Pyx_DECREF(__pyx_t_1); __pyx_t_1 = 0; /* "aiohttp/_headers.pxi":7 * cdef tuple headers = ( * hdrs.ACCEPT, * hdrs.ACCEPT_CHARSET, # <<<<<<<<<<<<<< * hdrs.ACCEPT_ENCODING, * hdrs.ACCEPT_LANGUAGE, */ __Pyx_GetModuleGlobalName(__pyx_t_1, __pyx_n_s_hdrs); if (unlikely(!__pyx_t_1)) __PYX_ERR(5, 7, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_1); __pyx_t_3 = __Pyx_PyObject_GetAttrStr(__pyx_t_1, __pyx_n_s_ACCEPT_CHARSET); if (unlikely(!__pyx_t_3)) __PYX_ERR(5, 7, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_3); __Pyx_DECREF(__pyx_t_1); __pyx_t_1 = 0; /* "aiohttp/_headers.pxi":8 * hdrs.ACCEPT, * hdrs.ACCEPT_CHARSET, * hdrs.ACCEPT_ENCODING, # <<<<<<<<<<<<<< * hdrs.ACCEPT_LANGUAGE, * hdrs.ACCEPT_RANGES, */ __Pyx_GetModuleGlobalName(__pyx_t_1, __pyx_n_s_hdrs); if (unlikely(!__pyx_t_1)) __PYX_ERR(5, 8, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_1); __pyx_t_4 = __Pyx_PyObject_GetAttrStr(__pyx_t_1, __pyx_n_s_ACCEPT_ENCODING); if (unlikely(!__pyx_t_4)) __PYX_ERR(5, 8, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_4); __Pyx_DECREF(__pyx_t_1); __pyx_t_1 = 0; /* "aiohttp/_headers.pxi":9 * hdrs.ACCEPT_CHARSET, * hdrs.ACCEPT_ENCODING, * hdrs.ACCEPT_LANGUAGE, # <<<<<<<<<<<<<< * hdrs.ACCEPT_RANGES, * hdrs.ACCESS_CONTROL_ALLOW_CREDENTIALS, */ __Pyx_GetModuleGlobalName(__pyx_t_1, __pyx_n_s_hdrs); if (unlikely(!__pyx_t_1)) __PYX_ERR(5, 9, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_1); __pyx_t_5 = __Pyx_PyObject_GetAttrStr(__pyx_t_1, __pyx_n_s_ACCEPT_LANGUAGE); if (unlikely(!__pyx_t_5)) __PYX_ERR(5, 9, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_5); __Pyx_DECREF(__pyx_t_1); __pyx_t_1 = 0; /* "aiohttp/_headers.pxi":10 * hdrs.ACCEPT_ENCODING, * hdrs.ACCEPT_LANGUAGE, * hdrs.ACCEPT_RANGES, # <<<<<<<<<<<<<< * hdrs.ACCESS_CONTROL_ALLOW_CREDENTIALS, * hdrs.ACCESS_CONTROL_ALLOW_HEADERS, */ __Pyx_GetModuleGlobalName(__pyx_t_1, __pyx_n_s_hdrs); if (unlikely(!__pyx_t_1)) __PYX_ERR(5, 10, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_1); __pyx_t_6 = __Pyx_PyObject_GetAttrStr(__pyx_t_1, __pyx_n_s_ACCEPT_RANGES); if (unlikely(!__pyx_t_6)) __PYX_ERR(5, 10, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_6); __Pyx_DECREF(__pyx_t_1); __pyx_t_1 = 0; /* "aiohttp/_headers.pxi":11 * hdrs.ACCEPT_LANGUAGE, * hdrs.ACCEPT_RANGES, * hdrs.ACCESS_CONTROL_ALLOW_CREDENTIALS, # <<<<<<<<<<<<<< * hdrs.ACCESS_CONTROL_ALLOW_HEADERS, * hdrs.ACCESS_CONTROL_ALLOW_METHODS, */ __Pyx_GetModuleGlobalName(__pyx_t_1, 
__pyx_n_s_hdrs); if (unlikely(!__pyx_t_1)) __PYX_ERR(5, 11, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_1); __pyx_t_7 = __Pyx_PyObject_GetAttrStr(__pyx_t_1, __pyx_n_s_ACCESS_CONTROL_ALLOW_CREDENTIALS); if (unlikely(!__pyx_t_7)) __PYX_ERR(5, 11, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_7); __Pyx_DECREF(__pyx_t_1); __pyx_t_1 = 0; /* "aiohttp/_headers.pxi":12 * hdrs.ACCEPT_RANGES, * hdrs.ACCESS_CONTROL_ALLOW_CREDENTIALS, * hdrs.ACCESS_CONTROL_ALLOW_HEADERS, # <<<<<<<<<<<<<< * hdrs.ACCESS_CONTROL_ALLOW_METHODS, * hdrs.ACCESS_CONTROL_ALLOW_ORIGIN, */ __Pyx_GetModuleGlobalName(__pyx_t_1, __pyx_n_s_hdrs); if (unlikely(!__pyx_t_1)) __PYX_ERR(5, 12, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_1); __pyx_t_8 = __Pyx_PyObject_GetAttrStr(__pyx_t_1, __pyx_n_s_ACCESS_CONTROL_ALLOW_HEADERS); if (unlikely(!__pyx_t_8)) __PYX_ERR(5, 12, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_8); __Pyx_DECREF(__pyx_t_1); __pyx_t_1 = 0; /* "aiohttp/_headers.pxi":13 * hdrs.ACCESS_CONTROL_ALLOW_CREDENTIALS, * hdrs.ACCESS_CONTROL_ALLOW_HEADERS, * hdrs.ACCESS_CONTROL_ALLOW_METHODS, # <<<<<<<<<<<<<< * hdrs.ACCESS_CONTROL_ALLOW_ORIGIN, * hdrs.ACCESS_CONTROL_EXPOSE_HEADERS, */ __Pyx_GetModuleGlobalName(__pyx_t_1, __pyx_n_s_hdrs); if (unlikely(!__pyx_t_1)) __PYX_ERR(5, 13, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_1); __pyx_t_9 = __Pyx_PyObject_GetAttrStr(__pyx_t_1, __pyx_n_s_ACCESS_CONTROL_ALLOW_METHODS); if (unlikely(!__pyx_t_9)) __PYX_ERR(5, 13, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_9); __Pyx_DECREF(__pyx_t_1); __pyx_t_1 = 0; /* "aiohttp/_headers.pxi":14 * hdrs.ACCESS_CONTROL_ALLOW_HEADERS, * hdrs.ACCESS_CONTROL_ALLOW_METHODS, * hdrs.ACCESS_CONTROL_ALLOW_ORIGIN, # <<<<<<<<<<<<<< * hdrs.ACCESS_CONTROL_EXPOSE_HEADERS, * hdrs.ACCESS_CONTROL_MAX_AGE, */ __Pyx_GetModuleGlobalName(__pyx_t_1, __pyx_n_s_hdrs); if (unlikely(!__pyx_t_1)) __PYX_ERR(5, 14, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_1); __pyx_t_10 = __Pyx_PyObject_GetAttrStr(__pyx_t_1, __pyx_n_s_ACCESS_CONTROL_ALLOW_ORIGIN); if (unlikely(!__pyx_t_10)) __PYX_ERR(5, 14, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_10); __Pyx_DECREF(__pyx_t_1); __pyx_t_1 = 0; /* "aiohttp/_headers.pxi":15 * hdrs.ACCESS_CONTROL_ALLOW_METHODS, * hdrs.ACCESS_CONTROL_ALLOW_ORIGIN, * hdrs.ACCESS_CONTROL_EXPOSE_HEADERS, # <<<<<<<<<<<<<< * hdrs.ACCESS_CONTROL_MAX_AGE, * hdrs.ACCESS_CONTROL_REQUEST_HEADERS, */ __Pyx_GetModuleGlobalName(__pyx_t_1, __pyx_n_s_hdrs); if (unlikely(!__pyx_t_1)) __PYX_ERR(5, 15, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_1); __pyx_t_11 = __Pyx_PyObject_GetAttrStr(__pyx_t_1, __pyx_n_s_ACCESS_CONTROL_EXPOSE_HEADERS); if (unlikely(!__pyx_t_11)) __PYX_ERR(5, 15, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_11); __Pyx_DECREF(__pyx_t_1); __pyx_t_1 = 0; /* "aiohttp/_headers.pxi":16 * hdrs.ACCESS_CONTROL_ALLOW_ORIGIN, * hdrs.ACCESS_CONTROL_EXPOSE_HEADERS, * hdrs.ACCESS_CONTROL_MAX_AGE, # <<<<<<<<<<<<<< * hdrs.ACCESS_CONTROL_REQUEST_HEADERS, * hdrs.ACCESS_CONTROL_REQUEST_METHOD, */ __Pyx_GetModuleGlobalName(__pyx_t_1, __pyx_n_s_hdrs); if (unlikely(!__pyx_t_1)) __PYX_ERR(5, 16, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_1); __pyx_t_12 = __Pyx_PyObject_GetAttrStr(__pyx_t_1, __pyx_n_s_ACCESS_CONTROL_MAX_AGE); if (unlikely(!__pyx_t_12)) __PYX_ERR(5, 16, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_12); __Pyx_DECREF(__pyx_t_1); __pyx_t_1 = 0; /* "aiohttp/_headers.pxi":17 * hdrs.ACCESS_CONTROL_EXPOSE_HEADERS, * hdrs.ACCESS_CONTROL_MAX_AGE, * hdrs.ACCESS_CONTROL_REQUEST_HEADERS, # <<<<<<<<<<<<<< * hdrs.ACCESS_CONTROL_REQUEST_METHOD, * hdrs.AGE, */ __Pyx_GetModuleGlobalName(__pyx_t_1, __pyx_n_s_hdrs); if (unlikely(!__pyx_t_1)) __PYX_ERR(5, 17, __pyx_L1_error) 
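/* NOTE on the recurring pattern above: every hdrs.NAME entry from _headers.pxi appears to be
   compiled into the same two-step sequence, __Pyx_GetModuleGlobalName() resolves the name "hdrs"
   in the module namespace and __Pyx_PyObject_GetAttrStr() then fetches the header constant as an
   attribute, storing it in its own temporary (__pyx_t_N). The sketch below shows roughly the same
   lookup written against only the public CPython C API; the helper name fetch_hdr and its
   parameters are illustrative assumptions, not part of this module.

       static PyObject *fetch_hdr(PyObject *module_dict, const char *attr)
       {
           PyObject *hdrs = PyDict_GetItemString(module_dict, "hdrs");   (borrowed reference)
           if (hdrs == NULL) {
               PyErr_SetString(PyExc_NameError, "name 'hdrs' is not defined");
               return NULL;
           }
           return PyObject_GetAttrString(hdrs, attr);   (new reference, or NULL with an exception set)
       }
*/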
__Pyx_GOTREF(__pyx_t_1); __pyx_t_13 = __Pyx_PyObject_GetAttrStr(__pyx_t_1, __pyx_n_s_ACCESS_CONTROL_REQUEST_HEADERS); if (unlikely(!__pyx_t_13)) __PYX_ERR(5, 17, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_13); __Pyx_DECREF(__pyx_t_1); __pyx_t_1 = 0; /* "aiohttp/_headers.pxi":18 * hdrs.ACCESS_CONTROL_MAX_AGE, * hdrs.ACCESS_CONTROL_REQUEST_HEADERS, * hdrs.ACCESS_CONTROL_REQUEST_METHOD, # <<<<<<<<<<<<<< * hdrs.AGE, * hdrs.ALLOW, */ __Pyx_GetModuleGlobalName(__pyx_t_1, __pyx_n_s_hdrs); if (unlikely(!__pyx_t_1)) __PYX_ERR(5, 18, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_1); __pyx_t_14 = __Pyx_PyObject_GetAttrStr(__pyx_t_1, __pyx_n_s_ACCESS_CONTROL_REQUEST_METHOD); if (unlikely(!__pyx_t_14)) __PYX_ERR(5, 18, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_14); __Pyx_DECREF(__pyx_t_1); __pyx_t_1 = 0; /* "aiohttp/_headers.pxi":19 * hdrs.ACCESS_CONTROL_REQUEST_HEADERS, * hdrs.ACCESS_CONTROL_REQUEST_METHOD, * hdrs.AGE, # <<<<<<<<<<<<<< * hdrs.ALLOW, * hdrs.AUTHORIZATION, */ __Pyx_GetModuleGlobalName(__pyx_t_1, __pyx_n_s_hdrs); if (unlikely(!__pyx_t_1)) __PYX_ERR(5, 19, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_1); __pyx_t_15 = __Pyx_PyObject_GetAttrStr(__pyx_t_1, __pyx_n_s_AGE); if (unlikely(!__pyx_t_15)) __PYX_ERR(5, 19, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_15); __Pyx_DECREF(__pyx_t_1); __pyx_t_1 = 0; /* "aiohttp/_headers.pxi":20 * hdrs.ACCESS_CONTROL_REQUEST_METHOD, * hdrs.AGE, * hdrs.ALLOW, # <<<<<<<<<<<<<< * hdrs.AUTHORIZATION, * hdrs.CACHE_CONTROL, */ __Pyx_GetModuleGlobalName(__pyx_t_1, __pyx_n_s_hdrs); if (unlikely(!__pyx_t_1)) __PYX_ERR(5, 20, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_1); __pyx_t_16 = __Pyx_PyObject_GetAttrStr(__pyx_t_1, __pyx_n_s_ALLOW); if (unlikely(!__pyx_t_16)) __PYX_ERR(5, 20, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_16); __Pyx_DECREF(__pyx_t_1); __pyx_t_1 = 0; /* "aiohttp/_headers.pxi":21 * hdrs.AGE, * hdrs.ALLOW, * hdrs.AUTHORIZATION, # <<<<<<<<<<<<<< * hdrs.CACHE_CONTROL, * hdrs.CONNECTION, */ __Pyx_GetModuleGlobalName(__pyx_t_1, __pyx_n_s_hdrs); if (unlikely(!__pyx_t_1)) __PYX_ERR(5, 21, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_1); __pyx_t_17 = __Pyx_PyObject_GetAttrStr(__pyx_t_1, __pyx_n_s_AUTHORIZATION); if (unlikely(!__pyx_t_17)) __PYX_ERR(5, 21, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_17); __Pyx_DECREF(__pyx_t_1); __pyx_t_1 = 0; /* "aiohttp/_headers.pxi":22 * hdrs.ALLOW, * hdrs.AUTHORIZATION, * hdrs.CACHE_CONTROL, # <<<<<<<<<<<<<< * hdrs.CONNECTION, * hdrs.CONTENT_DISPOSITION, */ __Pyx_GetModuleGlobalName(__pyx_t_1, __pyx_n_s_hdrs); if (unlikely(!__pyx_t_1)) __PYX_ERR(5, 22, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_1); __pyx_t_18 = __Pyx_PyObject_GetAttrStr(__pyx_t_1, __pyx_n_s_CACHE_CONTROL); if (unlikely(!__pyx_t_18)) __PYX_ERR(5, 22, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_18); __Pyx_DECREF(__pyx_t_1); __pyx_t_1 = 0; /* "aiohttp/_headers.pxi":23 * hdrs.AUTHORIZATION, * hdrs.CACHE_CONTROL, * hdrs.CONNECTION, # <<<<<<<<<<<<<< * hdrs.CONTENT_DISPOSITION, * hdrs.CONTENT_ENCODING, */ __Pyx_GetModuleGlobalName(__pyx_t_1, __pyx_n_s_hdrs); if (unlikely(!__pyx_t_1)) __PYX_ERR(5, 23, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_1); __pyx_t_19 = __Pyx_PyObject_GetAttrStr(__pyx_t_1, __pyx_n_s_CONNECTION); if (unlikely(!__pyx_t_19)) __PYX_ERR(5, 23, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_19); __Pyx_DECREF(__pyx_t_1); __pyx_t_1 = 0; /* "aiohttp/_headers.pxi":24 * hdrs.CACHE_CONTROL, * hdrs.CONNECTION, * hdrs.CONTENT_DISPOSITION, # <<<<<<<<<<<<<< * hdrs.CONTENT_ENCODING, * hdrs.CONTENT_LANGUAGE, */ __Pyx_GetModuleGlobalName(__pyx_t_1, __pyx_n_s_hdrs); if (unlikely(!__pyx_t_1)) __PYX_ERR(5, 24, __pyx_L1_error) 
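/* NOTE on error handling: each fallible call above is followed by
   if (unlikely(!__pyx_t_N)) __PYX_ERR(5, <line>, __pyx_L1_error). In this generated code
   __PYX_ERR seems to record the original source position (file index 5 evidently maps to
   "aiohttp/_headers.pxi", the second argument being the .pxi line) and then jump to the shared
   cleanup label, so tracebacks point at the Cython source rather than this C file. Below is a
   minimal hand-written equivalent of the same goto-cleanup idiom using only public CPython calls;
   the function name load_two and the chosen attributes are made up for illustration.

       static PyObject *load_two(PyObject *hdrs)
       {
           PyObject *a = NULL, *b = NULL, *pair = NULL;
           a = PyObject_GetAttrString(hdrs, "ACCEPT");
           if (a == NULL) goto error;
           b = PyObject_GetAttrString(hdrs, "ALLOW");
           if (b == NULL) goto error;
           pair = PyTuple_Pack(2, a, b);        (PyTuple_Pack does not steal references)
           if (pair == NULL) goto error;
           Py_DECREF(a);
           Py_DECREF(b);
           return pair;
       error:
           Py_XDECREF(a);
           Py_XDECREF(b);
           return NULL;                         (the failing call already set the exception)
       }
*/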
__Pyx_GOTREF(__pyx_t_1); __pyx_t_20 = __Pyx_PyObject_GetAttrStr(__pyx_t_1, __pyx_n_s_CONTENT_DISPOSITION); if (unlikely(!__pyx_t_20)) __PYX_ERR(5, 24, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_20); __Pyx_DECREF(__pyx_t_1); __pyx_t_1 = 0; /* "aiohttp/_headers.pxi":25 * hdrs.CONNECTION, * hdrs.CONTENT_DISPOSITION, * hdrs.CONTENT_ENCODING, # <<<<<<<<<<<<<< * hdrs.CONTENT_LANGUAGE, * hdrs.CONTENT_LENGTH, */ __Pyx_GetModuleGlobalName(__pyx_t_1, __pyx_n_s_hdrs); if (unlikely(!__pyx_t_1)) __PYX_ERR(5, 25, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_1); __pyx_t_21 = __Pyx_PyObject_GetAttrStr(__pyx_t_1, __pyx_n_s_CONTENT_ENCODING); if (unlikely(!__pyx_t_21)) __PYX_ERR(5, 25, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_21); __Pyx_DECREF(__pyx_t_1); __pyx_t_1 = 0; /* "aiohttp/_headers.pxi":26 * hdrs.CONTENT_DISPOSITION, * hdrs.CONTENT_ENCODING, * hdrs.CONTENT_LANGUAGE, # <<<<<<<<<<<<<< * hdrs.CONTENT_LENGTH, * hdrs.CONTENT_LOCATION, */ __Pyx_GetModuleGlobalName(__pyx_t_1, __pyx_n_s_hdrs); if (unlikely(!__pyx_t_1)) __PYX_ERR(5, 26, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_1); __pyx_t_22 = __Pyx_PyObject_GetAttrStr(__pyx_t_1, __pyx_n_s_CONTENT_LANGUAGE); if (unlikely(!__pyx_t_22)) __PYX_ERR(5, 26, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_22); __Pyx_DECREF(__pyx_t_1); __pyx_t_1 = 0; /* "aiohttp/_headers.pxi":27 * hdrs.CONTENT_ENCODING, * hdrs.CONTENT_LANGUAGE, * hdrs.CONTENT_LENGTH, # <<<<<<<<<<<<<< * hdrs.CONTENT_LOCATION, * hdrs.CONTENT_MD5, */ __Pyx_GetModuleGlobalName(__pyx_t_1, __pyx_n_s_hdrs); if (unlikely(!__pyx_t_1)) __PYX_ERR(5, 27, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_1); __pyx_t_23 = __Pyx_PyObject_GetAttrStr(__pyx_t_1, __pyx_n_s_CONTENT_LENGTH); if (unlikely(!__pyx_t_23)) __PYX_ERR(5, 27, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_23); __Pyx_DECREF(__pyx_t_1); __pyx_t_1 = 0; /* "aiohttp/_headers.pxi":28 * hdrs.CONTENT_LANGUAGE, * hdrs.CONTENT_LENGTH, * hdrs.CONTENT_LOCATION, # <<<<<<<<<<<<<< * hdrs.CONTENT_MD5, * hdrs.CONTENT_RANGE, */ __Pyx_GetModuleGlobalName(__pyx_t_1, __pyx_n_s_hdrs); if (unlikely(!__pyx_t_1)) __PYX_ERR(5, 28, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_1); __pyx_t_24 = __Pyx_PyObject_GetAttrStr(__pyx_t_1, __pyx_n_s_CONTENT_LOCATION); if (unlikely(!__pyx_t_24)) __PYX_ERR(5, 28, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_24); __Pyx_DECREF(__pyx_t_1); __pyx_t_1 = 0; /* "aiohttp/_headers.pxi":29 * hdrs.CONTENT_LENGTH, * hdrs.CONTENT_LOCATION, * hdrs.CONTENT_MD5, # <<<<<<<<<<<<<< * hdrs.CONTENT_RANGE, * hdrs.CONTENT_TRANSFER_ENCODING, */ __Pyx_GetModuleGlobalName(__pyx_t_1, __pyx_n_s_hdrs); if (unlikely(!__pyx_t_1)) __PYX_ERR(5, 29, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_1); __pyx_t_25 = __Pyx_PyObject_GetAttrStr(__pyx_t_1, __pyx_n_s_CONTENT_MD5); if (unlikely(!__pyx_t_25)) __PYX_ERR(5, 29, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_25); __Pyx_DECREF(__pyx_t_1); __pyx_t_1 = 0; /* "aiohttp/_headers.pxi":30 * hdrs.CONTENT_LOCATION, * hdrs.CONTENT_MD5, * hdrs.CONTENT_RANGE, # <<<<<<<<<<<<<< * hdrs.CONTENT_TRANSFER_ENCODING, * hdrs.CONTENT_TYPE, */ __Pyx_GetModuleGlobalName(__pyx_t_1, __pyx_n_s_hdrs); if (unlikely(!__pyx_t_1)) __PYX_ERR(5, 30, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_1); __pyx_t_26 = __Pyx_PyObject_GetAttrStr(__pyx_t_1, __pyx_n_s_CONTENT_RANGE); if (unlikely(!__pyx_t_26)) __PYX_ERR(5, 30, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_26); __Pyx_DECREF(__pyx_t_1); __pyx_t_1 = 0; /* "aiohttp/_headers.pxi":31 * hdrs.CONTENT_MD5, * hdrs.CONTENT_RANGE, * hdrs.CONTENT_TRANSFER_ENCODING, # <<<<<<<<<<<<<< * hdrs.CONTENT_TYPE, * hdrs.COOKIE, */ __Pyx_GetModuleGlobalName(__pyx_t_1, __pyx_n_s_hdrs); if (unlikely(!__pyx_t_1)) __PYX_ERR(5, 
31, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_1); __pyx_t_27 = __Pyx_PyObject_GetAttrStr(__pyx_t_1, __pyx_n_s_CONTENT_TRANSFER_ENCODING); if (unlikely(!__pyx_t_27)) __PYX_ERR(5, 31, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_27); __Pyx_DECREF(__pyx_t_1); __pyx_t_1 = 0; /* "aiohttp/_headers.pxi":32 * hdrs.CONTENT_RANGE, * hdrs.CONTENT_TRANSFER_ENCODING, * hdrs.CONTENT_TYPE, # <<<<<<<<<<<<<< * hdrs.COOKIE, * hdrs.DATE, */ __Pyx_GetModuleGlobalName(__pyx_t_1, __pyx_n_s_hdrs); if (unlikely(!__pyx_t_1)) __PYX_ERR(5, 32, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_1); __pyx_t_28 = __Pyx_PyObject_GetAttrStr(__pyx_t_1, __pyx_n_s_CONTENT_TYPE); if (unlikely(!__pyx_t_28)) __PYX_ERR(5, 32, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_28); __Pyx_DECREF(__pyx_t_1); __pyx_t_1 = 0; /* "aiohttp/_headers.pxi":33 * hdrs.CONTENT_TRANSFER_ENCODING, * hdrs.CONTENT_TYPE, * hdrs.COOKIE, # <<<<<<<<<<<<<< * hdrs.DATE, * hdrs.DESTINATION, */ __Pyx_GetModuleGlobalName(__pyx_t_1, __pyx_n_s_hdrs); if (unlikely(!__pyx_t_1)) __PYX_ERR(5, 33, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_1); __pyx_t_29 = __Pyx_PyObject_GetAttrStr(__pyx_t_1, __pyx_n_s_COOKIE); if (unlikely(!__pyx_t_29)) __PYX_ERR(5, 33, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_29); __Pyx_DECREF(__pyx_t_1); __pyx_t_1 = 0; /* "aiohttp/_headers.pxi":34 * hdrs.CONTENT_TYPE, * hdrs.COOKIE, * hdrs.DATE, # <<<<<<<<<<<<<< * hdrs.DESTINATION, * hdrs.DIGEST, */ __Pyx_GetModuleGlobalName(__pyx_t_1, __pyx_n_s_hdrs); if (unlikely(!__pyx_t_1)) __PYX_ERR(5, 34, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_1); __pyx_t_30 = __Pyx_PyObject_GetAttrStr(__pyx_t_1, __pyx_n_s_DATE); if (unlikely(!__pyx_t_30)) __PYX_ERR(5, 34, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_30); __Pyx_DECREF(__pyx_t_1); __pyx_t_1 = 0; /* "aiohttp/_headers.pxi":35 * hdrs.COOKIE, * hdrs.DATE, * hdrs.DESTINATION, # <<<<<<<<<<<<<< * hdrs.DIGEST, * hdrs.ETAG, */ __Pyx_GetModuleGlobalName(__pyx_t_1, __pyx_n_s_hdrs); if (unlikely(!__pyx_t_1)) __PYX_ERR(5, 35, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_1); __pyx_t_31 = __Pyx_PyObject_GetAttrStr(__pyx_t_1, __pyx_n_s_DESTINATION); if (unlikely(!__pyx_t_31)) __PYX_ERR(5, 35, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_31); __Pyx_DECREF(__pyx_t_1); __pyx_t_1 = 0; /* "aiohttp/_headers.pxi":36 * hdrs.DATE, * hdrs.DESTINATION, * hdrs.DIGEST, # <<<<<<<<<<<<<< * hdrs.ETAG, * hdrs.EXPECT, */ __Pyx_GetModuleGlobalName(__pyx_t_1, __pyx_n_s_hdrs); if (unlikely(!__pyx_t_1)) __PYX_ERR(5, 36, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_1); __pyx_t_32 = __Pyx_PyObject_GetAttrStr(__pyx_t_1, __pyx_n_s_DIGEST); if (unlikely(!__pyx_t_32)) __PYX_ERR(5, 36, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_32); __Pyx_DECREF(__pyx_t_1); __pyx_t_1 = 0; /* "aiohttp/_headers.pxi":37 * hdrs.DESTINATION, * hdrs.DIGEST, * hdrs.ETAG, # <<<<<<<<<<<<<< * hdrs.EXPECT, * hdrs.EXPIRES, */ __Pyx_GetModuleGlobalName(__pyx_t_1, __pyx_n_s_hdrs); if (unlikely(!__pyx_t_1)) __PYX_ERR(5, 37, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_1); __pyx_t_33 = __Pyx_PyObject_GetAttrStr(__pyx_t_1, __pyx_n_s_ETAG); if (unlikely(!__pyx_t_33)) __PYX_ERR(5, 37, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_33); __Pyx_DECREF(__pyx_t_1); __pyx_t_1 = 0; /* "aiohttp/_headers.pxi":38 * hdrs.DIGEST, * hdrs.ETAG, * hdrs.EXPECT, # <<<<<<<<<<<<<< * hdrs.EXPIRES, * hdrs.FORWARDED, */ __Pyx_GetModuleGlobalName(__pyx_t_1, __pyx_n_s_hdrs); if (unlikely(!__pyx_t_1)) __PYX_ERR(5, 38, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_1); __pyx_t_34 = __Pyx_PyObject_GetAttrStr(__pyx_t_1, __pyx_n_s_EXPECT); if (unlikely(!__pyx_t_34)) __PYX_ERR(5, 38, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_34); __Pyx_DECREF(__pyx_t_1); __pyx_t_1 = 0; /* 
"aiohttp/_headers.pxi":39 * hdrs.ETAG, * hdrs.EXPECT, * hdrs.EXPIRES, # <<<<<<<<<<<<<< * hdrs.FORWARDED, * hdrs.FROM, */ __Pyx_GetModuleGlobalName(__pyx_t_1, __pyx_n_s_hdrs); if (unlikely(!__pyx_t_1)) __PYX_ERR(5, 39, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_1); __pyx_t_35 = __Pyx_PyObject_GetAttrStr(__pyx_t_1, __pyx_n_s_EXPIRES); if (unlikely(!__pyx_t_35)) __PYX_ERR(5, 39, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_35); __Pyx_DECREF(__pyx_t_1); __pyx_t_1 = 0; /* "aiohttp/_headers.pxi":40 * hdrs.EXPECT, * hdrs.EXPIRES, * hdrs.FORWARDED, # <<<<<<<<<<<<<< * hdrs.FROM, * hdrs.HOST, */ __Pyx_GetModuleGlobalName(__pyx_t_1, __pyx_n_s_hdrs); if (unlikely(!__pyx_t_1)) __PYX_ERR(5, 40, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_1); __pyx_t_36 = __Pyx_PyObject_GetAttrStr(__pyx_t_1, __pyx_n_s_FORWARDED); if (unlikely(!__pyx_t_36)) __PYX_ERR(5, 40, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_36); __Pyx_DECREF(__pyx_t_1); __pyx_t_1 = 0; /* "aiohttp/_headers.pxi":41 * hdrs.EXPIRES, * hdrs.FORWARDED, * hdrs.FROM, # <<<<<<<<<<<<<< * hdrs.HOST, * hdrs.IF_MATCH, */ __Pyx_GetModuleGlobalName(__pyx_t_1, __pyx_n_s_hdrs); if (unlikely(!__pyx_t_1)) __PYX_ERR(5, 41, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_1); __pyx_t_37 = __Pyx_PyObject_GetAttrStr(__pyx_t_1, __pyx_n_s_FROM); if (unlikely(!__pyx_t_37)) __PYX_ERR(5, 41, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_37); __Pyx_DECREF(__pyx_t_1); __pyx_t_1 = 0; /* "aiohttp/_headers.pxi":42 * hdrs.FORWARDED, * hdrs.FROM, * hdrs.HOST, # <<<<<<<<<<<<<< * hdrs.IF_MATCH, * hdrs.IF_MODIFIED_SINCE, */ __Pyx_GetModuleGlobalName(__pyx_t_1, __pyx_n_s_hdrs); if (unlikely(!__pyx_t_1)) __PYX_ERR(5, 42, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_1); __pyx_t_38 = __Pyx_PyObject_GetAttrStr(__pyx_t_1, __pyx_n_s_HOST); if (unlikely(!__pyx_t_38)) __PYX_ERR(5, 42, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_38); __Pyx_DECREF(__pyx_t_1); __pyx_t_1 = 0; /* "aiohttp/_headers.pxi":43 * hdrs.FROM, * hdrs.HOST, * hdrs.IF_MATCH, # <<<<<<<<<<<<<< * hdrs.IF_MODIFIED_SINCE, * hdrs.IF_NONE_MATCH, */ __Pyx_GetModuleGlobalName(__pyx_t_1, __pyx_n_s_hdrs); if (unlikely(!__pyx_t_1)) __PYX_ERR(5, 43, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_1); __pyx_t_39 = __Pyx_PyObject_GetAttrStr(__pyx_t_1, __pyx_n_s_IF_MATCH); if (unlikely(!__pyx_t_39)) __PYX_ERR(5, 43, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_39); __Pyx_DECREF(__pyx_t_1); __pyx_t_1 = 0; /* "aiohttp/_headers.pxi":44 * hdrs.HOST, * hdrs.IF_MATCH, * hdrs.IF_MODIFIED_SINCE, # <<<<<<<<<<<<<< * hdrs.IF_NONE_MATCH, * hdrs.IF_RANGE, */ __Pyx_GetModuleGlobalName(__pyx_t_1, __pyx_n_s_hdrs); if (unlikely(!__pyx_t_1)) __PYX_ERR(5, 44, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_1); __pyx_t_40 = __Pyx_PyObject_GetAttrStr(__pyx_t_1, __pyx_n_s_IF_MODIFIED_SINCE); if (unlikely(!__pyx_t_40)) __PYX_ERR(5, 44, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_40); __Pyx_DECREF(__pyx_t_1); __pyx_t_1 = 0; /* "aiohttp/_headers.pxi":45 * hdrs.IF_MATCH, * hdrs.IF_MODIFIED_SINCE, * hdrs.IF_NONE_MATCH, # <<<<<<<<<<<<<< * hdrs.IF_RANGE, * hdrs.IF_UNMODIFIED_SINCE, */ __Pyx_GetModuleGlobalName(__pyx_t_1, __pyx_n_s_hdrs); if (unlikely(!__pyx_t_1)) __PYX_ERR(5, 45, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_1); __pyx_t_41 = __Pyx_PyObject_GetAttrStr(__pyx_t_1, __pyx_n_s_IF_NONE_MATCH); if (unlikely(!__pyx_t_41)) __PYX_ERR(5, 45, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_41); __Pyx_DECREF(__pyx_t_1); __pyx_t_1 = 0; /* "aiohttp/_headers.pxi":46 * hdrs.IF_MODIFIED_SINCE, * hdrs.IF_NONE_MATCH, * hdrs.IF_RANGE, # <<<<<<<<<<<<<< * hdrs.IF_UNMODIFIED_SINCE, * hdrs.KEEP_ALIVE, */ __Pyx_GetModuleGlobalName(__pyx_t_1, __pyx_n_s_hdrs); if (unlikely(!__pyx_t_1)) __PYX_ERR(5, 
46, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_1); __pyx_t_42 = __Pyx_PyObject_GetAttrStr(__pyx_t_1, __pyx_n_s_IF_RANGE); if (unlikely(!__pyx_t_42)) __PYX_ERR(5, 46, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_42); __Pyx_DECREF(__pyx_t_1); __pyx_t_1 = 0; /* "aiohttp/_headers.pxi":47 * hdrs.IF_NONE_MATCH, * hdrs.IF_RANGE, * hdrs.IF_UNMODIFIED_SINCE, # <<<<<<<<<<<<<< * hdrs.KEEP_ALIVE, * hdrs.LAST_EVENT_ID, */ __Pyx_GetModuleGlobalName(__pyx_t_1, __pyx_n_s_hdrs); if (unlikely(!__pyx_t_1)) __PYX_ERR(5, 47, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_1); __pyx_t_43 = __Pyx_PyObject_GetAttrStr(__pyx_t_1, __pyx_n_s_IF_UNMODIFIED_SINCE); if (unlikely(!__pyx_t_43)) __PYX_ERR(5, 47, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_43); __Pyx_DECREF(__pyx_t_1); __pyx_t_1 = 0; /* "aiohttp/_headers.pxi":48 * hdrs.IF_RANGE, * hdrs.IF_UNMODIFIED_SINCE, * hdrs.KEEP_ALIVE, # <<<<<<<<<<<<<< * hdrs.LAST_EVENT_ID, * hdrs.LAST_MODIFIED, */ __Pyx_GetModuleGlobalName(__pyx_t_1, __pyx_n_s_hdrs); if (unlikely(!__pyx_t_1)) __PYX_ERR(5, 48, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_1); __pyx_t_44 = __Pyx_PyObject_GetAttrStr(__pyx_t_1, __pyx_n_s_KEEP_ALIVE); if (unlikely(!__pyx_t_44)) __PYX_ERR(5, 48, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_44); __Pyx_DECREF(__pyx_t_1); __pyx_t_1 = 0; /* "aiohttp/_headers.pxi":49 * hdrs.IF_UNMODIFIED_SINCE, * hdrs.KEEP_ALIVE, * hdrs.LAST_EVENT_ID, # <<<<<<<<<<<<<< * hdrs.LAST_MODIFIED, * hdrs.LINK, */ __Pyx_GetModuleGlobalName(__pyx_t_1, __pyx_n_s_hdrs); if (unlikely(!__pyx_t_1)) __PYX_ERR(5, 49, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_1); __pyx_t_45 = __Pyx_PyObject_GetAttrStr(__pyx_t_1, __pyx_n_s_LAST_EVENT_ID); if (unlikely(!__pyx_t_45)) __PYX_ERR(5, 49, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_45); __Pyx_DECREF(__pyx_t_1); __pyx_t_1 = 0; /* "aiohttp/_headers.pxi":50 * hdrs.KEEP_ALIVE, * hdrs.LAST_EVENT_ID, * hdrs.LAST_MODIFIED, # <<<<<<<<<<<<<< * hdrs.LINK, * hdrs.LOCATION, */ __Pyx_GetModuleGlobalName(__pyx_t_1, __pyx_n_s_hdrs); if (unlikely(!__pyx_t_1)) __PYX_ERR(5, 50, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_1); __pyx_t_46 = __Pyx_PyObject_GetAttrStr(__pyx_t_1, __pyx_n_s_LAST_MODIFIED); if (unlikely(!__pyx_t_46)) __PYX_ERR(5, 50, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_46); __Pyx_DECREF(__pyx_t_1); __pyx_t_1 = 0; /* "aiohttp/_headers.pxi":51 * hdrs.LAST_EVENT_ID, * hdrs.LAST_MODIFIED, * hdrs.LINK, # <<<<<<<<<<<<<< * hdrs.LOCATION, * hdrs.MAX_FORWARDS, */ __Pyx_GetModuleGlobalName(__pyx_t_1, __pyx_n_s_hdrs); if (unlikely(!__pyx_t_1)) __PYX_ERR(5, 51, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_1); __pyx_t_47 = __Pyx_PyObject_GetAttrStr(__pyx_t_1, __pyx_n_s_LINK); if (unlikely(!__pyx_t_47)) __PYX_ERR(5, 51, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_47); __Pyx_DECREF(__pyx_t_1); __pyx_t_1 = 0; /* "aiohttp/_headers.pxi":52 * hdrs.LAST_MODIFIED, * hdrs.LINK, * hdrs.LOCATION, # <<<<<<<<<<<<<< * hdrs.MAX_FORWARDS, * hdrs.ORIGIN, */ __Pyx_GetModuleGlobalName(__pyx_t_1, __pyx_n_s_hdrs); if (unlikely(!__pyx_t_1)) __PYX_ERR(5, 52, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_1); __pyx_t_48 = __Pyx_PyObject_GetAttrStr(__pyx_t_1, __pyx_n_s_LOCATION); if (unlikely(!__pyx_t_48)) __PYX_ERR(5, 52, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_48); __Pyx_DECREF(__pyx_t_1); __pyx_t_1 = 0; /* "aiohttp/_headers.pxi":53 * hdrs.LINK, * hdrs.LOCATION, * hdrs.MAX_FORWARDS, # <<<<<<<<<<<<<< * hdrs.ORIGIN, * hdrs.PRAGMA, */ __Pyx_GetModuleGlobalName(__pyx_t_1, __pyx_n_s_hdrs); if (unlikely(!__pyx_t_1)) __PYX_ERR(5, 53, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_1); __pyx_t_49 = __Pyx_PyObject_GetAttrStr(__pyx_t_1, __pyx_n_s_MAX_FORWARDS); if (unlikely(!__pyx_t_49)) __PYX_ERR(5, 53, 
__pyx_L1_error) __Pyx_GOTREF(__pyx_t_49); __Pyx_DECREF(__pyx_t_1); __pyx_t_1 = 0; /* "aiohttp/_headers.pxi":54 * hdrs.LOCATION, * hdrs.MAX_FORWARDS, * hdrs.ORIGIN, # <<<<<<<<<<<<<< * hdrs.PRAGMA, * hdrs.PROXY_AUTHENTICATE, */ __Pyx_GetModuleGlobalName(__pyx_t_1, __pyx_n_s_hdrs); if (unlikely(!__pyx_t_1)) __PYX_ERR(5, 54, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_1); __pyx_t_50 = __Pyx_PyObject_GetAttrStr(__pyx_t_1, __pyx_n_s_ORIGIN); if (unlikely(!__pyx_t_50)) __PYX_ERR(5, 54, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_50); __Pyx_DECREF(__pyx_t_1); __pyx_t_1 = 0; /* "aiohttp/_headers.pxi":55 * hdrs.MAX_FORWARDS, * hdrs.ORIGIN, * hdrs.PRAGMA, # <<<<<<<<<<<<<< * hdrs.PROXY_AUTHENTICATE, * hdrs.PROXY_AUTHORIZATION, */ __Pyx_GetModuleGlobalName(__pyx_t_1, __pyx_n_s_hdrs); if (unlikely(!__pyx_t_1)) __PYX_ERR(5, 55, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_1); __pyx_t_51 = __Pyx_PyObject_GetAttrStr(__pyx_t_1, __pyx_n_s_PRAGMA); if (unlikely(!__pyx_t_51)) __PYX_ERR(5, 55, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_51); __Pyx_DECREF(__pyx_t_1); __pyx_t_1 = 0; /* "aiohttp/_headers.pxi":56 * hdrs.ORIGIN, * hdrs.PRAGMA, * hdrs.PROXY_AUTHENTICATE, # <<<<<<<<<<<<<< * hdrs.PROXY_AUTHORIZATION, * hdrs.RANGE, */ __Pyx_GetModuleGlobalName(__pyx_t_1, __pyx_n_s_hdrs); if (unlikely(!__pyx_t_1)) __PYX_ERR(5, 56, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_1); __pyx_t_52 = __Pyx_PyObject_GetAttrStr(__pyx_t_1, __pyx_n_s_PROXY_AUTHENTICATE); if (unlikely(!__pyx_t_52)) __PYX_ERR(5, 56, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_52); __Pyx_DECREF(__pyx_t_1); __pyx_t_1 = 0; /* "aiohttp/_headers.pxi":57 * hdrs.PRAGMA, * hdrs.PROXY_AUTHENTICATE, * hdrs.PROXY_AUTHORIZATION, # <<<<<<<<<<<<<< * hdrs.RANGE, * hdrs.REFERER, */ __Pyx_GetModuleGlobalName(__pyx_t_1, __pyx_n_s_hdrs); if (unlikely(!__pyx_t_1)) __PYX_ERR(5, 57, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_1); __pyx_t_53 = __Pyx_PyObject_GetAttrStr(__pyx_t_1, __pyx_n_s_PROXY_AUTHORIZATION); if (unlikely(!__pyx_t_53)) __PYX_ERR(5, 57, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_53); __Pyx_DECREF(__pyx_t_1); __pyx_t_1 = 0; /* "aiohttp/_headers.pxi":58 * hdrs.PROXY_AUTHENTICATE, * hdrs.PROXY_AUTHORIZATION, * hdrs.RANGE, # <<<<<<<<<<<<<< * hdrs.REFERER, * hdrs.RETRY_AFTER, */ __Pyx_GetModuleGlobalName(__pyx_t_1, __pyx_n_s_hdrs); if (unlikely(!__pyx_t_1)) __PYX_ERR(5, 58, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_1); __pyx_t_54 = __Pyx_PyObject_GetAttrStr(__pyx_t_1, __pyx_n_s_RANGE); if (unlikely(!__pyx_t_54)) __PYX_ERR(5, 58, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_54); __Pyx_DECREF(__pyx_t_1); __pyx_t_1 = 0; /* "aiohttp/_headers.pxi":59 * hdrs.PROXY_AUTHORIZATION, * hdrs.RANGE, * hdrs.REFERER, # <<<<<<<<<<<<<< * hdrs.RETRY_AFTER, * hdrs.SEC_WEBSOCKET_ACCEPT, */ __Pyx_GetModuleGlobalName(__pyx_t_1, __pyx_n_s_hdrs); if (unlikely(!__pyx_t_1)) __PYX_ERR(5, 59, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_1); __pyx_t_55 = __Pyx_PyObject_GetAttrStr(__pyx_t_1, __pyx_n_s_REFERER); if (unlikely(!__pyx_t_55)) __PYX_ERR(5, 59, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_55); __Pyx_DECREF(__pyx_t_1); __pyx_t_1 = 0; /* "aiohttp/_headers.pxi":60 * hdrs.RANGE, * hdrs.REFERER, * hdrs.RETRY_AFTER, # <<<<<<<<<<<<<< * hdrs.SEC_WEBSOCKET_ACCEPT, * hdrs.SEC_WEBSOCKET_EXTENSIONS, */ __Pyx_GetModuleGlobalName(__pyx_t_1, __pyx_n_s_hdrs); if (unlikely(!__pyx_t_1)) __PYX_ERR(5, 60, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_1); __pyx_t_56 = __Pyx_PyObject_GetAttrStr(__pyx_t_1, __pyx_n_s_RETRY_AFTER); if (unlikely(!__pyx_t_56)) __PYX_ERR(5, 60, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_56); __Pyx_DECREF(__pyx_t_1); __pyx_t_1 = 0; /* "aiohttp/_headers.pxi":61 * hdrs.REFERER, 
* hdrs.RETRY_AFTER, * hdrs.SEC_WEBSOCKET_ACCEPT, # <<<<<<<<<<<<<< * hdrs.SEC_WEBSOCKET_EXTENSIONS, * hdrs.SEC_WEBSOCKET_KEY, */ __Pyx_GetModuleGlobalName(__pyx_t_1, __pyx_n_s_hdrs); if (unlikely(!__pyx_t_1)) __PYX_ERR(5, 61, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_1); __pyx_t_57 = __Pyx_PyObject_GetAttrStr(__pyx_t_1, __pyx_n_s_SEC_WEBSOCKET_ACCEPT); if (unlikely(!__pyx_t_57)) __PYX_ERR(5, 61, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_57); __Pyx_DECREF(__pyx_t_1); __pyx_t_1 = 0; /* "aiohttp/_headers.pxi":62 * hdrs.RETRY_AFTER, * hdrs.SEC_WEBSOCKET_ACCEPT, * hdrs.SEC_WEBSOCKET_EXTENSIONS, # <<<<<<<<<<<<<< * hdrs.SEC_WEBSOCKET_KEY, * hdrs.SEC_WEBSOCKET_KEY1, */ __Pyx_GetModuleGlobalName(__pyx_t_1, __pyx_n_s_hdrs); if (unlikely(!__pyx_t_1)) __PYX_ERR(5, 62, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_1); __pyx_t_58 = __Pyx_PyObject_GetAttrStr(__pyx_t_1, __pyx_n_s_SEC_WEBSOCKET_EXTENSIONS); if (unlikely(!__pyx_t_58)) __PYX_ERR(5, 62, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_58); __Pyx_DECREF(__pyx_t_1); __pyx_t_1 = 0; /* "aiohttp/_headers.pxi":63 * hdrs.SEC_WEBSOCKET_ACCEPT, * hdrs.SEC_WEBSOCKET_EXTENSIONS, * hdrs.SEC_WEBSOCKET_KEY, # <<<<<<<<<<<<<< * hdrs.SEC_WEBSOCKET_KEY1, * hdrs.SEC_WEBSOCKET_PROTOCOL, */ __Pyx_GetModuleGlobalName(__pyx_t_1, __pyx_n_s_hdrs); if (unlikely(!__pyx_t_1)) __PYX_ERR(5, 63, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_1); __pyx_t_59 = __Pyx_PyObject_GetAttrStr(__pyx_t_1, __pyx_n_s_SEC_WEBSOCKET_KEY); if (unlikely(!__pyx_t_59)) __PYX_ERR(5, 63, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_59); __Pyx_DECREF(__pyx_t_1); __pyx_t_1 = 0; /* "aiohttp/_headers.pxi":64 * hdrs.SEC_WEBSOCKET_EXTENSIONS, * hdrs.SEC_WEBSOCKET_KEY, * hdrs.SEC_WEBSOCKET_KEY1, # <<<<<<<<<<<<<< * hdrs.SEC_WEBSOCKET_PROTOCOL, * hdrs.SEC_WEBSOCKET_VERSION, */ __Pyx_GetModuleGlobalName(__pyx_t_1, __pyx_n_s_hdrs); if (unlikely(!__pyx_t_1)) __PYX_ERR(5, 64, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_1); __pyx_t_60 = __Pyx_PyObject_GetAttrStr(__pyx_t_1, __pyx_n_s_SEC_WEBSOCKET_KEY1); if (unlikely(!__pyx_t_60)) __PYX_ERR(5, 64, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_60); __Pyx_DECREF(__pyx_t_1); __pyx_t_1 = 0; /* "aiohttp/_headers.pxi":65 * hdrs.SEC_WEBSOCKET_KEY, * hdrs.SEC_WEBSOCKET_KEY1, * hdrs.SEC_WEBSOCKET_PROTOCOL, # <<<<<<<<<<<<<< * hdrs.SEC_WEBSOCKET_VERSION, * hdrs.SERVER, */ __Pyx_GetModuleGlobalName(__pyx_t_1, __pyx_n_s_hdrs); if (unlikely(!__pyx_t_1)) __PYX_ERR(5, 65, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_1); __pyx_t_61 = __Pyx_PyObject_GetAttrStr(__pyx_t_1, __pyx_n_s_SEC_WEBSOCKET_PROTOCOL); if (unlikely(!__pyx_t_61)) __PYX_ERR(5, 65, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_61); __Pyx_DECREF(__pyx_t_1); __pyx_t_1 = 0; /* "aiohttp/_headers.pxi":66 * hdrs.SEC_WEBSOCKET_KEY1, * hdrs.SEC_WEBSOCKET_PROTOCOL, * hdrs.SEC_WEBSOCKET_VERSION, # <<<<<<<<<<<<<< * hdrs.SERVER, * hdrs.SET_COOKIE, */ __Pyx_GetModuleGlobalName(__pyx_t_1, __pyx_n_s_hdrs); if (unlikely(!__pyx_t_1)) __PYX_ERR(5, 66, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_1); __pyx_t_62 = __Pyx_PyObject_GetAttrStr(__pyx_t_1, __pyx_n_s_SEC_WEBSOCKET_VERSION); if (unlikely(!__pyx_t_62)) __PYX_ERR(5, 66, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_62); __Pyx_DECREF(__pyx_t_1); __pyx_t_1 = 0; /* "aiohttp/_headers.pxi":67 * hdrs.SEC_WEBSOCKET_PROTOCOL, * hdrs.SEC_WEBSOCKET_VERSION, * hdrs.SERVER, # <<<<<<<<<<<<<< * hdrs.SET_COOKIE, * hdrs.TE, */ __Pyx_GetModuleGlobalName(__pyx_t_1, __pyx_n_s_hdrs); if (unlikely(!__pyx_t_1)) __PYX_ERR(5, 67, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_1); __pyx_t_63 = __Pyx_PyObject_GetAttrStr(__pyx_t_1, __pyx_n_s_SERVER); if (unlikely(!__pyx_t_63)) __PYX_ERR(5, 67, 
__pyx_L1_error) __Pyx_GOTREF(__pyx_t_63); __Pyx_DECREF(__pyx_t_1); __pyx_t_1 = 0; /* "aiohttp/_headers.pxi":68 * hdrs.SEC_WEBSOCKET_VERSION, * hdrs.SERVER, * hdrs.SET_COOKIE, # <<<<<<<<<<<<<< * hdrs.TE, * hdrs.TRAILER, */ __Pyx_GetModuleGlobalName(__pyx_t_1, __pyx_n_s_hdrs); if (unlikely(!__pyx_t_1)) __PYX_ERR(5, 68, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_1); __pyx_t_64 = __Pyx_PyObject_GetAttrStr(__pyx_t_1, __pyx_n_s_SET_COOKIE); if (unlikely(!__pyx_t_64)) __PYX_ERR(5, 68, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_64); __Pyx_DECREF(__pyx_t_1); __pyx_t_1 = 0; /* "aiohttp/_headers.pxi":69 * hdrs.SERVER, * hdrs.SET_COOKIE, * hdrs.TE, # <<<<<<<<<<<<<< * hdrs.TRAILER, * hdrs.TRANSFER_ENCODING, */ __Pyx_GetModuleGlobalName(__pyx_t_1, __pyx_n_s_hdrs); if (unlikely(!__pyx_t_1)) __PYX_ERR(5, 69, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_1); __pyx_t_65 = __Pyx_PyObject_GetAttrStr(__pyx_t_1, __pyx_n_s_TE); if (unlikely(!__pyx_t_65)) __PYX_ERR(5, 69, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_65); __Pyx_DECREF(__pyx_t_1); __pyx_t_1 = 0; /* "aiohttp/_headers.pxi":70 * hdrs.SET_COOKIE, * hdrs.TE, * hdrs.TRAILER, # <<<<<<<<<<<<<< * hdrs.TRANSFER_ENCODING, * hdrs.UPGRADE, */ __Pyx_GetModuleGlobalName(__pyx_t_1, __pyx_n_s_hdrs); if (unlikely(!__pyx_t_1)) __PYX_ERR(5, 70, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_1); __pyx_t_66 = __Pyx_PyObject_GetAttrStr(__pyx_t_1, __pyx_n_s_TRAILER); if (unlikely(!__pyx_t_66)) __PYX_ERR(5, 70, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_66); __Pyx_DECREF(__pyx_t_1); __pyx_t_1 = 0; /* "aiohttp/_headers.pxi":71 * hdrs.TE, * hdrs.TRAILER, * hdrs.TRANSFER_ENCODING, # <<<<<<<<<<<<<< * hdrs.UPGRADE, * hdrs.URI, */ __Pyx_GetModuleGlobalName(__pyx_t_1, __pyx_n_s_hdrs); if (unlikely(!__pyx_t_1)) __PYX_ERR(5, 71, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_1); __pyx_t_67 = __Pyx_PyObject_GetAttrStr(__pyx_t_1, __pyx_n_s_TRANSFER_ENCODING); if (unlikely(!__pyx_t_67)) __PYX_ERR(5, 71, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_67); __Pyx_DECREF(__pyx_t_1); __pyx_t_1 = 0; /* "aiohttp/_headers.pxi":72 * hdrs.TRAILER, * hdrs.TRANSFER_ENCODING, * hdrs.UPGRADE, # <<<<<<<<<<<<<< * hdrs.URI, * hdrs.USER_AGENT, */ __Pyx_GetModuleGlobalName(__pyx_t_1, __pyx_n_s_hdrs); if (unlikely(!__pyx_t_1)) __PYX_ERR(5, 72, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_1); __pyx_t_68 = __Pyx_PyObject_GetAttrStr(__pyx_t_1, __pyx_n_s_UPGRADE); if (unlikely(!__pyx_t_68)) __PYX_ERR(5, 72, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_68); __Pyx_DECREF(__pyx_t_1); __pyx_t_1 = 0; /* "aiohttp/_headers.pxi":73 * hdrs.TRANSFER_ENCODING, * hdrs.UPGRADE, * hdrs.URI, # <<<<<<<<<<<<<< * hdrs.USER_AGENT, * hdrs.VARY, */ __Pyx_GetModuleGlobalName(__pyx_t_1, __pyx_n_s_hdrs); if (unlikely(!__pyx_t_1)) __PYX_ERR(5, 73, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_1); __pyx_t_69 = __Pyx_PyObject_GetAttrStr(__pyx_t_1, __pyx_n_s_URI); if (unlikely(!__pyx_t_69)) __PYX_ERR(5, 73, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_69); __Pyx_DECREF(__pyx_t_1); __pyx_t_1 = 0; /* "aiohttp/_headers.pxi":74 * hdrs.UPGRADE, * hdrs.URI, * hdrs.USER_AGENT, # <<<<<<<<<<<<<< * hdrs.VARY, * hdrs.VIA, */ __Pyx_GetModuleGlobalName(__pyx_t_1, __pyx_n_s_hdrs); if (unlikely(!__pyx_t_1)) __PYX_ERR(5, 74, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_1); __pyx_t_70 = __Pyx_PyObject_GetAttrStr(__pyx_t_1, __pyx_n_s_USER_AGENT); if (unlikely(!__pyx_t_70)) __PYX_ERR(5, 74, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_70); __Pyx_DECREF(__pyx_t_1); __pyx_t_1 = 0; /* "aiohttp/_headers.pxi":75 * hdrs.URI, * hdrs.USER_AGENT, * hdrs.VARY, # <<<<<<<<<<<<<< * hdrs.VIA, * hdrs.WANT_DIGEST, */ __Pyx_GetModuleGlobalName(__pyx_t_1, __pyx_n_s_hdrs); if 
(unlikely(!__pyx_t_1)) __PYX_ERR(5, 75, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_1); __pyx_t_71 = __Pyx_PyObject_GetAttrStr(__pyx_t_1, __pyx_n_s_VARY); if (unlikely(!__pyx_t_71)) __PYX_ERR(5, 75, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_71); __Pyx_DECREF(__pyx_t_1); __pyx_t_1 = 0; /* "aiohttp/_headers.pxi":76 * hdrs.USER_AGENT, * hdrs.VARY, * hdrs.VIA, # <<<<<<<<<<<<<< * hdrs.WANT_DIGEST, * hdrs.WARNING, */ __Pyx_GetModuleGlobalName(__pyx_t_1, __pyx_n_s_hdrs); if (unlikely(!__pyx_t_1)) __PYX_ERR(5, 76, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_1); __pyx_t_72 = __Pyx_PyObject_GetAttrStr(__pyx_t_1, __pyx_n_s_VIA); if (unlikely(!__pyx_t_72)) __PYX_ERR(5, 76, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_72); __Pyx_DECREF(__pyx_t_1); __pyx_t_1 = 0; /* "aiohttp/_headers.pxi":77 * hdrs.VARY, * hdrs.VIA, * hdrs.WANT_DIGEST, # <<<<<<<<<<<<<< * hdrs.WARNING, * hdrs.WEBSOCKET, */ __Pyx_GetModuleGlobalName(__pyx_t_1, __pyx_n_s_hdrs); if (unlikely(!__pyx_t_1)) __PYX_ERR(5, 77, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_1); __pyx_t_73 = __Pyx_PyObject_GetAttrStr(__pyx_t_1, __pyx_n_s_WANT_DIGEST); if (unlikely(!__pyx_t_73)) __PYX_ERR(5, 77, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_73); __Pyx_DECREF(__pyx_t_1); __pyx_t_1 = 0; /* "aiohttp/_headers.pxi":78 * hdrs.VIA, * hdrs.WANT_DIGEST, * hdrs.WARNING, # <<<<<<<<<<<<<< * hdrs.WEBSOCKET, * hdrs.WWW_AUTHENTICATE, */ __Pyx_GetModuleGlobalName(__pyx_t_1, __pyx_n_s_hdrs); if (unlikely(!__pyx_t_1)) __PYX_ERR(5, 78, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_1); __pyx_t_74 = __Pyx_PyObject_GetAttrStr(__pyx_t_1, __pyx_n_s_WARNING); if (unlikely(!__pyx_t_74)) __PYX_ERR(5, 78, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_74); __Pyx_DECREF(__pyx_t_1); __pyx_t_1 = 0; /* "aiohttp/_headers.pxi":79 * hdrs.WANT_DIGEST, * hdrs.WARNING, * hdrs.WEBSOCKET, # <<<<<<<<<<<<<< * hdrs.WWW_AUTHENTICATE, * hdrs.X_FORWARDED_FOR, */ __Pyx_GetModuleGlobalName(__pyx_t_1, __pyx_n_s_hdrs); if (unlikely(!__pyx_t_1)) __PYX_ERR(5, 79, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_1); __pyx_t_75 = __Pyx_PyObject_GetAttrStr(__pyx_t_1, __pyx_n_s_WEBSOCKET); if (unlikely(!__pyx_t_75)) __PYX_ERR(5, 79, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_75); __Pyx_DECREF(__pyx_t_1); __pyx_t_1 = 0; /* "aiohttp/_headers.pxi":80 * hdrs.WARNING, * hdrs.WEBSOCKET, * hdrs.WWW_AUTHENTICATE, # <<<<<<<<<<<<<< * hdrs.X_FORWARDED_FOR, * hdrs.X_FORWARDED_HOST, */ __Pyx_GetModuleGlobalName(__pyx_t_1, __pyx_n_s_hdrs); if (unlikely(!__pyx_t_1)) __PYX_ERR(5, 80, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_1); __pyx_t_76 = __Pyx_PyObject_GetAttrStr(__pyx_t_1, __pyx_n_s_WWW_AUTHENTICATE); if (unlikely(!__pyx_t_76)) __PYX_ERR(5, 80, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_76); __Pyx_DECREF(__pyx_t_1); __pyx_t_1 = 0; /* "aiohttp/_headers.pxi":81 * hdrs.WEBSOCKET, * hdrs.WWW_AUTHENTICATE, * hdrs.X_FORWARDED_FOR, # <<<<<<<<<<<<<< * hdrs.X_FORWARDED_HOST, * hdrs.X_FORWARDED_PROTO, */ __Pyx_GetModuleGlobalName(__pyx_t_1, __pyx_n_s_hdrs); if (unlikely(!__pyx_t_1)) __PYX_ERR(5, 81, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_1); __pyx_t_77 = __Pyx_PyObject_GetAttrStr(__pyx_t_1, __pyx_n_s_X_FORWARDED_FOR); if (unlikely(!__pyx_t_77)) __PYX_ERR(5, 81, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_77); __Pyx_DECREF(__pyx_t_1); __pyx_t_1 = 0; /* "aiohttp/_headers.pxi":82 * hdrs.WWW_AUTHENTICATE, * hdrs.X_FORWARDED_FOR, * hdrs.X_FORWARDED_HOST, # <<<<<<<<<<<<<< * hdrs.X_FORWARDED_PROTO, * ) */ __Pyx_GetModuleGlobalName(__pyx_t_1, __pyx_n_s_hdrs); if (unlikely(!__pyx_t_1)) __PYX_ERR(5, 82, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_1); __pyx_t_78 = __Pyx_PyObject_GetAttrStr(__pyx_t_1, __pyx_n_s_X_FORWARDED_HOST); if 
(unlikely(!__pyx_t_78)) __PYX_ERR(5, 82, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_78); __Pyx_DECREF(__pyx_t_1); __pyx_t_1 = 0; /* "aiohttp/_headers.pxi":83 * hdrs.X_FORWARDED_FOR, * hdrs.X_FORWARDED_HOST, * hdrs.X_FORWARDED_PROTO, # <<<<<<<<<<<<<< * ) */ __Pyx_GetModuleGlobalName(__pyx_t_1, __pyx_n_s_hdrs); if (unlikely(!__pyx_t_1)) __PYX_ERR(5, 83, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_1); __pyx_t_79 = __Pyx_PyObject_GetAttrStr(__pyx_t_1, __pyx_n_s_X_FORWARDED_PROTO); if (unlikely(!__pyx_t_79)) __PYX_ERR(5, 83, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_79); __Pyx_DECREF(__pyx_t_1); __pyx_t_1 = 0; /* "aiohttp/_headers.pxi":6 * from . import hdrs * cdef tuple headers = ( * hdrs.ACCEPT, # <<<<<<<<<<<<<< * hdrs.ACCEPT_CHARSET, * hdrs.ACCEPT_ENCODING, */ __pyx_t_1 = PyTuple_New(78); if (unlikely(!__pyx_t_1)) __PYX_ERR(5, 6, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_1); __Pyx_GIVEREF(__pyx_t_2); PyTuple_SET_ITEM(__pyx_t_1, 0, __pyx_t_2); __Pyx_GIVEREF(__pyx_t_3); PyTuple_SET_ITEM(__pyx_t_1, 1, __pyx_t_3); __Pyx_GIVEREF(__pyx_t_4); PyTuple_SET_ITEM(__pyx_t_1, 2, __pyx_t_4); __Pyx_GIVEREF(__pyx_t_5); PyTuple_SET_ITEM(__pyx_t_1, 3, __pyx_t_5); __Pyx_GIVEREF(__pyx_t_6); PyTuple_SET_ITEM(__pyx_t_1, 4, __pyx_t_6); __Pyx_GIVEREF(__pyx_t_7); PyTuple_SET_ITEM(__pyx_t_1, 5, __pyx_t_7); __Pyx_GIVEREF(__pyx_t_8); PyTuple_SET_ITEM(__pyx_t_1, 6, __pyx_t_8); __Pyx_GIVEREF(__pyx_t_9); PyTuple_SET_ITEM(__pyx_t_1, 7, __pyx_t_9); __Pyx_GIVEREF(__pyx_t_10); PyTuple_SET_ITEM(__pyx_t_1, 8, __pyx_t_10); __Pyx_GIVEREF(__pyx_t_11); PyTuple_SET_ITEM(__pyx_t_1, 9, __pyx_t_11); __Pyx_GIVEREF(__pyx_t_12); PyTuple_SET_ITEM(__pyx_t_1, 10, __pyx_t_12); __Pyx_GIVEREF(__pyx_t_13); PyTuple_SET_ITEM(__pyx_t_1, 11, __pyx_t_13); __Pyx_GIVEREF(__pyx_t_14); PyTuple_SET_ITEM(__pyx_t_1, 12, __pyx_t_14); __Pyx_GIVEREF(__pyx_t_15); PyTuple_SET_ITEM(__pyx_t_1, 13, __pyx_t_15); __Pyx_GIVEREF(__pyx_t_16); PyTuple_SET_ITEM(__pyx_t_1, 14, __pyx_t_16); __Pyx_GIVEREF(__pyx_t_17); PyTuple_SET_ITEM(__pyx_t_1, 15, __pyx_t_17); __Pyx_GIVEREF(__pyx_t_18); PyTuple_SET_ITEM(__pyx_t_1, 16, __pyx_t_18); __Pyx_GIVEREF(__pyx_t_19); PyTuple_SET_ITEM(__pyx_t_1, 17, __pyx_t_19); __Pyx_GIVEREF(__pyx_t_20); PyTuple_SET_ITEM(__pyx_t_1, 18, __pyx_t_20); __Pyx_GIVEREF(__pyx_t_21); PyTuple_SET_ITEM(__pyx_t_1, 19, __pyx_t_21); __Pyx_GIVEREF(__pyx_t_22); PyTuple_SET_ITEM(__pyx_t_1, 20, __pyx_t_22); __Pyx_GIVEREF(__pyx_t_23); PyTuple_SET_ITEM(__pyx_t_1, 21, __pyx_t_23); __Pyx_GIVEREF(__pyx_t_24); PyTuple_SET_ITEM(__pyx_t_1, 22, __pyx_t_24); __Pyx_GIVEREF(__pyx_t_25); PyTuple_SET_ITEM(__pyx_t_1, 23, __pyx_t_25); __Pyx_GIVEREF(__pyx_t_26); PyTuple_SET_ITEM(__pyx_t_1, 24, __pyx_t_26); __Pyx_GIVEREF(__pyx_t_27); PyTuple_SET_ITEM(__pyx_t_1, 25, __pyx_t_27); __Pyx_GIVEREF(__pyx_t_28); PyTuple_SET_ITEM(__pyx_t_1, 26, __pyx_t_28); __Pyx_GIVEREF(__pyx_t_29); PyTuple_SET_ITEM(__pyx_t_1, 27, __pyx_t_29); __Pyx_GIVEREF(__pyx_t_30); PyTuple_SET_ITEM(__pyx_t_1, 28, __pyx_t_30); __Pyx_GIVEREF(__pyx_t_31); PyTuple_SET_ITEM(__pyx_t_1, 29, __pyx_t_31); __Pyx_GIVEREF(__pyx_t_32); PyTuple_SET_ITEM(__pyx_t_1, 30, __pyx_t_32); __Pyx_GIVEREF(__pyx_t_33); PyTuple_SET_ITEM(__pyx_t_1, 31, __pyx_t_33); __Pyx_GIVEREF(__pyx_t_34); PyTuple_SET_ITEM(__pyx_t_1, 32, __pyx_t_34); __Pyx_GIVEREF(__pyx_t_35); PyTuple_SET_ITEM(__pyx_t_1, 33, __pyx_t_35); __Pyx_GIVEREF(__pyx_t_36); PyTuple_SET_ITEM(__pyx_t_1, 34, __pyx_t_36); __Pyx_GIVEREF(__pyx_t_37); PyTuple_SET_ITEM(__pyx_t_1, 35, __pyx_t_37); __Pyx_GIVEREF(__pyx_t_38); PyTuple_SET_ITEM(__pyx_t_1, 36, __pyx_t_38); __Pyx_GIVEREF(__pyx_t_39); 
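/* NOTE on the tuple assembly around this point: the 78 temporaries collected above are moved into
   the cdef headers tuple via PyTuple_New(78) followed by one PyTuple_SET_ITEM per slot.
   PyTuple_SET_ITEM steals a reference, which is presumably why each temporary is handed over with
   __Pyx_GIVEREF (reference-count bookkeeping that appears to be a no-op outside refnanny builds)
   and then reset to 0 afterwards, so the __pyx_L1_error cleanup path cannot decref the same object
   twice. A small sketch of the same fill-a-tuple pattern with the public API follows; build_pair
   and the attribute names are illustrative assumptions only.

       static PyObject *build_pair(PyObject *hdrs)
       {
           PyObject *tup, *item;
           tup = PyTuple_New(2);                       (slots start out as NULL)
           if (tup == NULL) return NULL;
           item = PyObject_GetAttrString(hdrs, "ACCEPT");
           if (item == NULL) { Py_DECREF(tup); return NULL; }
           PyTuple_SET_ITEM(tup, 0, item);             (steals the reference to item)
           item = PyObject_GetAttrString(hdrs, "ALLOW");
           if (item == NULL) { Py_DECREF(tup); return NULL; }   (a partially filled tuple is safe to decref)
           PyTuple_SET_ITEM(tup, 1, item);
           return tup;
       }
*/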
PyTuple_SET_ITEM(__pyx_t_1, 37, __pyx_t_39); __Pyx_GIVEREF(__pyx_t_40); PyTuple_SET_ITEM(__pyx_t_1, 38, __pyx_t_40); __Pyx_GIVEREF(__pyx_t_41); PyTuple_SET_ITEM(__pyx_t_1, 39, __pyx_t_41); __Pyx_GIVEREF(__pyx_t_42); PyTuple_SET_ITEM(__pyx_t_1, 40, __pyx_t_42); __Pyx_GIVEREF(__pyx_t_43); PyTuple_SET_ITEM(__pyx_t_1, 41, __pyx_t_43); __Pyx_GIVEREF(__pyx_t_44); PyTuple_SET_ITEM(__pyx_t_1, 42, __pyx_t_44); __Pyx_GIVEREF(__pyx_t_45); PyTuple_SET_ITEM(__pyx_t_1, 43, __pyx_t_45); __Pyx_GIVEREF(__pyx_t_46); PyTuple_SET_ITEM(__pyx_t_1, 44, __pyx_t_46); __Pyx_GIVEREF(__pyx_t_47); PyTuple_SET_ITEM(__pyx_t_1, 45, __pyx_t_47); __Pyx_GIVEREF(__pyx_t_48); PyTuple_SET_ITEM(__pyx_t_1, 46, __pyx_t_48); __Pyx_GIVEREF(__pyx_t_49); PyTuple_SET_ITEM(__pyx_t_1, 47, __pyx_t_49); __Pyx_GIVEREF(__pyx_t_50); PyTuple_SET_ITEM(__pyx_t_1, 48, __pyx_t_50); __Pyx_GIVEREF(__pyx_t_51); PyTuple_SET_ITEM(__pyx_t_1, 49, __pyx_t_51); __Pyx_GIVEREF(__pyx_t_52); PyTuple_SET_ITEM(__pyx_t_1, 50, __pyx_t_52); __Pyx_GIVEREF(__pyx_t_53); PyTuple_SET_ITEM(__pyx_t_1, 51, __pyx_t_53); __Pyx_GIVEREF(__pyx_t_54); PyTuple_SET_ITEM(__pyx_t_1, 52, __pyx_t_54); __Pyx_GIVEREF(__pyx_t_55); PyTuple_SET_ITEM(__pyx_t_1, 53, __pyx_t_55); __Pyx_GIVEREF(__pyx_t_56); PyTuple_SET_ITEM(__pyx_t_1, 54, __pyx_t_56); __Pyx_GIVEREF(__pyx_t_57); PyTuple_SET_ITEM(__pyx_t_1, 55, __pyx_t_57); __Pyx_GIVEREF(__pyx_t_58); PyTuple_SET_ITEM(__pyx_t_1, 56, __pyx_t_58); __Pyx_GIVEREF(__pyx_t_59); PyTuple_SET_ITEM(__pyx_t_1, 57, __pyx_t_59); __Pyx_GIVEREF(__pyx_t_60); PyTuple_SET_ITEM(__pyx_t_1, 58, __pyx_t_60); __Pyx_GIVEREF(__pyx_t_61); PyTuple_SET_ITEM(__pyx_t_1, 59, __pyx_t_61); __Pyx_GIVEREF(__pyx_t_62); PyTuple_SET_ITEM(__pyx_t_1, 60, __pyx_t_62); __Pyx_GIVEREF(__pyx_t_63); PyTuple_SET_ITEM(__pyx_t_1, 61, __pyx_t_63); __Pyx_GIVEREF(__pyx_t_64); PyTuple_SET_ITEM(__pyx_t_1, 62, __pyx_t_64); __Pyx_GIVEREF(__pyx_t_65); PyTuple_SET_ITEM(__pyx_t_1, 63, __pyx_t_65); __Pyx_GIVEREF(__pyx_t_66); PyTuple_SET_ITEM(__pyx_t_1, 64, __pyx_t_66); __Pyx_GIVEREF(__pyx_t_67); PyTuple_SET_ITEM(__pyx_t_1, 65, __pyx_t_67); __Pyx_GIVEREF(__pyx_t_68); PyTuple_SET_ITEM(__pyx_t_1, 66, __pyx_t_68); __Pyx_GIVEREF(__pyx_t_69); PyTuple_SET_ITEM(__pyx_t_1, 67, __pyx_t_69); __Pyx_GIVEREF(__pyx_t_70); PyTuple_SET_ITEM(__pyx_t_1, 68, __pyx_t_70); __Pyx_GIVEREF(__pyx_t_71); PyTuple_SET_ITEM(__pyx_t_1, 69, __pyx_t_71); __Pyx_GIVEREF(__pyx_t_72); PyTuple_SET_ITEM(__pyx_t_1, 70, __pyx_t_72); __Pyx_GIVEREF(__pyx_t_73); PyTuple_SET_ITEM(__pyx_t_1, 71, __pyx_t_73); __Pyx_GIVEREF(__pyx_t_74); PyTuple_SET_ITEM(__pyx_t_1, 72, __pyx_t_74); __Pyx_GIVEREF(__pyx_t_75); PyTuple_SET_ITEM(__pyx_t_1, 73, __pyx_t_75); __Pyx_GIVEREF(__pyx_t_76); PyTuple_SET_ITEM(__pyx_t_1, 74, __pyx_t_76); __Pyx_GIVEREF(__pyx_t_77); PyTuple_SET_ITEM(__pyx_t_1, 75, __pyx_t_77); __Pyx_GIVEREF(__pyx_t_78); PyTuple_SET_ITEM(__pyx_t_1, 76, __pyx_t_78); __Pyx_GIVEREF(__pyx_t_79); PyTuple_SET_ITEM(__pyx_t_1, 77, __pyx_t_79); __pyx_t_2 = 0; __pyx_t_3 = 0; __pyx_t_4 = 0; __pyx_t_5 = 0; __pyx_t_6 = 0; __pyx_t_7 = 0; __pyx_t_8 = 0; __pyx_t_9 = 0; __pyx_t_10 = 0; __pyx_t_11 = 0; __pyx_t_12 = 0; __pyx_t_13 = 0; __pyx_t_14 = 0; __pyx_t_15 = 0; __pyx_t_16 = 0; __pyx_t_17 = 0; __pyx_t_18 = 0; __pyx_t_19 = 0; __pyx_t_20 = 0; __pyx_t_21 = 0; __pyx_t_22 = 0; __pyx_t_23 = 0; __pyx_t_24 = 0; __pyx_t_25 = 0; __pyx_t_26 = 0; __pyx_t_27 = 0; __pyx_t_28 = 0; __pyx_t_29 = 0; __pyx_t_30 = 0; __pyx_t_31 = 0; __pyx_t_32 = 0; __pyx_t_33 = 0; __pyx_t_34 = 0; __pyx_t_35 = 0; __pyx_t_36 = 0; __pyx_t_37 = 0; __pyx_t_38 = 0; __pyx_t_39 = 0; __pyx_t_40 = 0; __pyx_t_41 = 
0; __pyx_t_42 = 0; __pyx_t_43 = 0; __pyx_t_44 = 0; __pyx_t_45 = 0; __pyx_t_46 = 0; __pyx_t_47 = 0; __pyx_t_48 = 0; __pyx_t_49 = 0; __pyx_t_50 = 0; __pyx_t_51 = 0; __pyx_t_52 = 0; __pyx_t_53 = 0; __pyx_t_54 = 0; __pyx_t_55 = 0; __pyx_t_56 = 0; __pyx_t_57 = 0; __pyx_t_58 = 0; __pyx_t_59 = 0; __pyx_t_60 = 0; __pyx_t_61 = 0; __pyx_t_62 = 0; __pyx_t_63 = 0; __pyx_t_64 = 0; __pyx_t_65 = 0; __pyx_t_66 = 0; __pyx_t_67 = 0; __pyx_t_68 = 0; __pyx_t_69 = 0; __pyx_t_70 = 0; __pyx_t_71 = 0; __pyx_t_72 = 0; __pyx_t_73 = 0; __pyx_t_74 = 0; __pyx_t_75 = 0; __pyx_t_76 = 0; __pyx_t_77 = 0; __pyx_t_78 = 0; __pyx_t_79 = 0; __Pyx_XGOTREF(__pyx_v_7aiohttp_12_http_parser_headers); __Pyx_DECREF_SET(__pyx_v_7aiohttp_12_http_parser_headers, ((PyObject*)__pyx_t_1)); __Pyx_GIVEREF(__pyx_t_1); __pyx_t_1 = 0; /* "aiohttp/_http_parser.pyx":40 * char* PyByteArray_AsString(object) * * __all__ = ('HttpRequestParser', 'HttpResponseParser', # <<<<<<<<<<<<<< * 'RawRequestMessage', 'RawResponseMessage') * */ if (PyDict_SetItem(__pyx_d, __pyx_n_s_all, __pyx_tuple__12) < 0) __PYX_ERR(0, 40, __pyx_L1_error) /* "aiohttp/_http_parser.pyx":43 * 'RawRequestMessage', 'RawResponseMessage') * * cdef object URL = _URL # <<<<<<<<<<<<<< * cdef object URL_build = URL.build * cdef object CIMultiDict = _CIMultiDict */ __Pyx_GetModuleGlobalName(__pyx_t_1, __pyx_n_s_URL_2); if (unlikely(!__pyx_t_1)) __PYX_ERR(0, 43, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_1); __Pyx_XGOTREF(__pyx_v_7aiohttp_12_http_parser_URL); __Pyx_DECREF_SET(__pyx_v_7aiohttp_12_http_parser_URL, __pyx_t_1); __Pyx_GIVEREF(__pyx_t_1); __pyx_t_1 = 0; /* "aiohttp/_http_parser.pyx":44 * * cdef object URL = _URL * cdef object URL_build = URL.build # <<<<<<<<<<<<<< * cdef object CIMultiDict = _CIMultiDict * cdef object CIMultiDictProxy = _CIMultiDictProxy */ __pyx_t_1 = __Pyx_PyObject_GetAttrStr(__pyx_v_7aiohttp_12_http_parser_URL, __pyx_n_s_build); if (unlikely(!__pyx_t_1)) __PYX_ERR(0, 44, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_1); __Pyx_XGOTREF(__pyx_v_7aiohttp_12_http_parser_URL_build); __Pyx_DECREF_SET(__pyx_v_7aiohttp_12_http_parser_URL_build, __pyx_t_1); __Pyx_GIVEREF(__pyx_t_1); __pyx_t_1 = 0; /* "aiohttp/_http_parser.pyx":45 * cdef object URL = _URL * cdef object URL_build = URL.build * cdef object CIMultiDict = _CIMultiDict # <<<<<<<<<<<<<< * cdef object CIMultiDictProxy = _CIMultiDictProxy * cdef object HttpVersion = _HttpVersion */ __Pyx_GetModuleGlobalName(__pyx_t_1, __pyx_n_s_CIMultiDict_2); if (unlikely(!__pyx_t_1)) __PYX_ERR(0, 45, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_1); __Pyx_XGOTREF(__pyx_v_7aiohttp_12_http_parser_CIMultiDict); __Pyx_DECREF_SET(__pyx_v_7aiohttp_12_http_parser_CIMultiDict, __pyx_t_1); __Pyx_GIVEREF(__pyx_t_1); __pyx_t_1 = 0; /* "aiohttp/_http_parser.pyx":46 * cdef object URL_build = URL.build * cdef object CIMultiDict = _CIMultiDict * cdef object CIMultiDictProxy = _CIMultiDictProxy # <<<<<<<<<<<<<< * cdef object HttpVersion = _HttpVersion * cdef object HttpVersion10 = _HttpVersion10 */ __Pyx_GetModuleGlobalName(__pyx_t_1, __pyx_n_s_CIMultiDictProxy_2); if (unlikely(!__pyx_t_1)) __PYX_ERR(0, 46, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_1); __Pyx_XGOTREF(__pyx_v_7aiohttp_12_http_parser_CIMultiDictProxy); __Pyx_DECREF_SET(__pyx_v_7aiohttp_12_http_parser_CIMultiDictProxy, __pyx_t_1); __Pyx_GIVEREF(__pyx_t_1); __pyx_t_1 = 0; /* "aiohttp/_http_parser.pyx":47 * cdef object CIMultiDict = _CIMultiDict * cdef object CIMultiDictProxy = _CIMultiDictProxy * cdef object HttpVersion = _HttpVersion # <<<<<<<<<<<<<< * cdef object HttpVersion10 = _HttpVersion10 * cdef object 
HttpVersion11 = _HttpVersion11 */ __Pyx_GetModuleGlobalName(__pyx_t_1, __pyx_n_s_HttpVersion_2); if (unlikely(!__pyx_t_1)) __PYX_ERR(0, 47, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_1); __Pyx_XGOTREF(__pyx_v_7aiohttp_12_http_parser_HttpVersion); __Pyx_DECREF_SET(__pyx_v_7aiohttp_12_http_parser_HttpVersion, __pyx_t_1); __Pyx_GIVEREF(__pyx_t_1); __pyx_t_1 = 0; /* "aiohttp/_http_parser.pyx":48 * cdef object CIMultiDictProxy = _CIMultiDictProxy * cdef object HttpVersion = _HttpVersion * cdef object HttpVersion10 = _HttpVersion10 # <<<<<<<<<<<<<< * cdef object HttpVersion11 = _HttpVersion11 * cdef object SEC_WEBSOCKET_KEY1 = hdrs.SEC_WEBSOCKET_KEY1 */ __Pyx_GetModuleGlobalName(__pyx_t_1, __pyx_n_s_HttpVersion10_2); if (unlikely(!__pyx_t_1)) __PYX_ERR(0, 48, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_1); __Pyx_XGOTREF(__pyx_v_7aiohttp_12_http_parser_HttpVersion10); __Pyx_DECREF_SET(__pyx_v_7aiohttp_12_http_parser_HttpVersion10, __pyx_t_1); __Pyx_GIVEREF(__pyx_t_1); __pyx_t_1 = 0; /* "aiohttp/_http_parser.pyx":49 * cdef object HttpVersion = _HttpVersion * cdef object HttpVersion10 = _HttpVersion10 * cdef object HttpVersion11 = _HttpVersion11 # <<<<<<<<<<<<<< * cdef object SEC_WEBSOCKET_KEY1 = hdrs.SEC_WEBSOCKET_KEY1 * cdef object CONTENT_ENCODING = hdrs.CONTENT_ENCODING */ __Pyx_GetModuleGlobalName(__pyx_t_1, __pyx_n_s_HttpVersion11_2); if (unlikely(!__pyx_t_1)) __PYX_ERR(0, 49, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_1); __Pyx_XGOTREF(__pyx_v_7aiohttp_12_http_parser_HttpVersion11); __Pyx_DECREF_SET(__pyx_v_7aiohttp_12_http_parser_HttpVersion11, __pyx_t_1); __Pyx_GIVEREF(__pyx_t_1); __pyx_t_1 = 0; /* "aiohttp/_http_parser.pyx":50 * cdef object HttpVersion10 = _HttpVersion10 * cdef object HttpVersion11 = _HttpVersion11 * cdef object SEC_WEBSOCKET_KEY1 = hdrs.SEC_WEBSOCKET_KEY1 # <<<<<<<<<<<<<< * cdef object CONTENT_ENCODING = hdrs.CONTENT_ENCODING * cdef object EMPTY_PAYLOAD = _EMPTY_PAYLOAD */ __Pyx_GetModuleGlobalName(__pyx_t_1, __pyx_n_s_hdrs); if (unlikely(!__pyx_t_1)) __PYX_ERR(0, 50, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_1); __pyx_t_79 = __Pyx_PyObject_GetAttrStr(__pyx_t_1, __pyx_n_s_SEC_WEBSOCKET_KEY1); if (unlikely(!__pyx_t_79)) __PYX_ERR(0, 50, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_79); __Pyx_DECREF(__pyx_t_1); __pyx_t_1 = 0; __Pyx_XGOTREF(__pyx_v_7aiohttp_12_http_parser_SEC_WEBSOCKET_KEY1); __Pyx_DECREF_SET(__pyx_v_7aiohttp_12_http_parser_SEC_WEBSOCKET_KEY1, __pyx_t_79); __Pyx_GIVEREF(__pyx_t_79); __pyx_t_79 = 0; /* "aiohttp/_http_parser.pyx":51 * cdef object HttpVersion11 = _HttpVersion11 * cdef object SEC_WEBSOCKET_KEY1 = hdrs.SEC_WEBSOCKET_KEY1 * cdef object CONTENT_ENCODING = hdrs.CONTENT_ENCODING # <<<<<<<<<<<<<< * cdef object EMPTY_PAYLOAD = _EMPTY_PAYLOAD * cdef object StreamReader = _StreamReader */ __Pyx_GetModuleGlobalName(__pyx_t_79, __pyx_n_s_hdrs); if (unlikely(!__pyx_t_79)) __PYX_ERR(0, 51, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_79); __pyx_t_1 = __Pyx_PyObject_GetAttrStr(__pyx_t_79, __pyx_n_s_CONTENT_ENCODING); if (unlikely(!__pyx_t_1)) __PYX_ERR(0, 51, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_1); __Pyx_DECREF(__pyx_t_79); __pyx_t_79 = 0; __Pyx_XGOTREF(__pyx_v_7aiohttp_12_http_parser_CONTENT_ENCODING); __Pyx_DECREF_SET(__pyx_v_7aiohttp_12_http_parser_CONTENT_ENCODING, __pyx_t_1); __Pyx_GIVEREF(__pyx_t_1); __pyx_t_1 = 0; /* "aiohttp/_http_parser.pyx":52 * cdef object SEC_WEBSOCKET_KEY1 = hdrs.SEC_WEBSOCKET_KEY1 * cdef object CONTENT_ENCODING = hdrs.CONTENT_ENCODING * cdef object EMPTY_PAYLOAD = _EMPTY_PAYLOAD # <<<<<<<<<<<<<< * cdef object StreamReader = _StreamReader * cdef object DeflateBuffer 
= _DeflateBuffer */ __Pyx_GetModuleGlobalName(__pyx_t_1, __pyx_n_s_EMPTY_PAYLOAD_2); if (unlikely(!__pyx_t_1)) __PYX_ERR(0, 52, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_1); __Pyx_XGOTREF(__pyx_v_7aiohttp_12_http_parser_EMPTY_PAYLOAD); __Pyx_DECREF_SET(__pyx_v_7aiohttp_12_http_parser_EMPTY_PAYLOAD, __pyx_t_1); __Pyx_GIVEREF(__pyx_t_1); __pyx_t_1 = 0; /* "aiohttp/_http_parser.pyx":53 * cdef object CONTENT_ENCODING = hdrs.CONTENT_ENCODING * cdef object EMPTY_PAYLOAD = _EMPTY_PAYLOAD * cdef object StreamReader = _StreamReader # <<<<<<<<<<<<<< * cdef object DeflateBuffer = _DeflateBuffer * */ __Pyx_GetModuleGlobalName(__pyx_t_1, __pyx_n_s_StreamReader_2); if (unlikely(!__pyx_t_1)) __PYX_ERR(0, 53, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_1); __Pyx_XGOTREF(__pyx_v_7aiohttp_12_http_parser_StreamReader); __Pyx_DECREF_SET(__pyx_v_7aiohttp_12_http_parser_StreamReader, __pyx_t_1); __Pyx_GIVEREF(__pyx_t_1); __pyx_t_1 = 0; /* "aiohttp/_http_parser.pyx":54 * cdef object EMPTY_PAYLOAD = _EMPTY_PAYLOAD * cdef object StreamReader = _StreamReader * cdef object DeflateBuffer = _DeflateBuffer # <<<<<<<<<<<<<< * * */ __Pyx_GetModuleGlobalName(__pyx_t_1, __pyx_n_s_DeflateBuffer_2); if (unlikely(!__pyx_t_1)) __PYX_ERR(0, 54, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_1); __Pyx_XGOTREF(__pyx_v_7aiohttp_12_http_parser_DeflateBuffer); __Pyx_DECREF_SET(__pyx_v_7aiohttp_12_http_parser_DeflateBuffer, __pyx_t_1); __Pyx_GIVEREF(__pyx_t_1); __pyx_t_1 = 0; /* "aiohttp/_http_parser.pyx":68 * DEF METHODS_COUNT = 34; * * cdef list _http_method = [] # <<<<<<<<<<<<<< * * for i in range(METHODS_COUNT): */ __pyx_t_1 = PyList_New(0); if (unlikely(!__pyx_t_1)) __PYX_ERR(0, 68, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_1); __Pyx_XGOTREF(__pyx_v_7aiohttp_12_http_parser__http_method); __Pyx_DECREF_SET(__pyx_v_7aiohttp_12_http_parser__http_method, ((PyObject*)__pyx_t_1)); __Pyx_GIVEREF(__pyx_t_1); __pyx_t_1 = 0; /* "aiohttp/_http_parser.pyx":70 * cdef list _http_method = [] * * for i in range(METHODS_COUNT): # <<<<<<<<<<<<<< * _http_method.append( * cparser.http_method_str( i).decode('ascii')) */ for (__pyx_t_80 = 0; __pyx_t_80 < 34; __pyx_t_80+=1) { __pyx_t_1 = __Pyx_PyInt_From_long(__pyx_t_80); if (unlikely(!__pyx_t_1)) __PYX_ERR(0, 70, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_1); if (PyDict_SetItem(__pyx_d, __pyx_n_s_i, __pyx_t_1) < 0) __PYX_ERR(0, 70, __pyx_L1_error) __Pyx_DECREF(__pyx_t_1); __pyx_t_1 = 0; /* "aiohttp/_http_parser.pyx":71 * * for i in range(METHODS_COUNT): * _http_method.append( # <<<<<<<<<<<<<< * cparser.http_method_str( i).decode('ascii')) * */ if (unlikely(__pyx_v_7aiohttp_12_http_parser__http_method == Py_None)) { PyErr_Format(PyExc_AttributeError, "'NoneType' object has no attribute '%.30s'", "append"); __PYX_ERR(0, 71, __pyx_L1_error) } /* "aiohttp/_http_parser.pyx":72 * for i in range(METHODS_COUNT): * _http_method.append( * cparser.http_method_str( i).decode('ascii')) # <<<<<<<<<<<<<< * * */ __Pyx_GetModuleGlobalName(__pyx_t_1, __pyx_n_s_i); if (unlikely(!__pyx_t_1)) __PYX_ERR(0, 72, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_1); __pyx_t_81 = ((enum http_method)__Pyx_PyInt_As_enum__http_method(__pyx_t_1)); if (unlikely(PyErr_Occurred())) __PYX_ERR(0, 72, __pyx_L1_error) __Pyx_DECREF(__pyx_t_1); __pyx_t_1 = 0; __pyx_t_82 = http_method_str(((enum http_method)__pyx_t_81)); __pyx_t_1 = __Pyx_decode_c_string(__pyx_t_82, 0, strlen(__pyx_t_82), NULL, NULL, PyUnicode_DecodeASCII); if (unlikely(!__pyx_t_1)) __PYX_ERR(0, 72, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_1); /* "aiohttp/_http_parser.pyx":71 * * for i in range(METHODS_COUNT): * 
_http_method.append( # <<<<<<<<<<<<<< * cparser.http_method_str( i).decode('ascii')) * */ __pyx_t_83 = __Pyx_PyList_Append(__pyx_v_7aiohttp_12_http_parser__http_method, __pyx_t_1); if (unlikely(__pyx_t_83 == ((int)-1))) __PYX_ERR(0, 71, __pyx_L1_error) __Pyx_DECREF(__pyx_t_1); __pyx_t_1 = 0; } /* "aiohttp/_http_parser.pyx":756 * * * def parse_url(url): # <<<<<<<<<<<<<< * cdef: * Py_buffer py_buf */ __pyx_t_1 = PyCFunction_NewEx(&__pyx_mdef_7aiohttp_12_http_parser_1parse_url, NULL, __pyx_n_s_aiohttp__http_parser); if (unlikely(!__pyx_t_1)) __PYX_ERR(0, 756, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_1); if (PyDict_SetItem(__pyx_d, __pyx_n_s_parse_url, __pyx_t_1) < 0) __PYX_ERR(0, 756, __pyx_L1_error) __Pyx_DECREF(__pyx_t_1); __pyx_t_1 = 0; /* "(tree fragment)":1 * def __pyx_unpickle_RawRequestMessage(__pyx_type, long __pyx_checksum, __pyx_state): # <<<<<<<<<<<<<< * cdef object __pyx_PickleError * cdef object __pyx_result */ __pyx_t_1 = PyCFunction_NewEx(&__pyx_mdef_7aiohttp_12_http_parser_3__pyx_unpickle_RawRequestMessage, NULL, __pyx_n_s_aiohttp__http_parser); if (unlikely(!__pyx_t_1)) __PYX_ERR(1, 1, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_1); if (PyDict_SetItem(__pyx_d, __pyx_n_s_pyx_unpickle_RawRequestMessage, __pyx_t_1) < 0) __PYX_ERR(1, 1, __pyx_L1_error) __Pyx_DECREF(__pyx_t_1); __pyx_t_1 = 0; /* "(tree fragment)":11 * __pyx_unpickle_RawRequestMessage__set_state( __pyx_result, __pyx_state) * return __pyx_result * cdef __pyx_unpickle_RawRequestMessage__set_state(RawRequestMessage __pyx_result, tuple __pyx_state): # <<<<<<<<<<<<<< * __pyx_result.chunked = __pyx_state[0]; __pyx_result.compression = __pyx_state[1]; __pyx_result.headers = __pyx_state[2]; __pyx_result.method = __pyx_state[3]; __pyx_result.path = __pyx_state[4]; __pyx_result.raw_headers = __pyx_state[5]; __pyx_result.should_close = __pyx_state[6]; __pyx_result.upgrade = __pyx_state[7]; __pyx_result.url = __pyx_state[8]; __pyx_result.version = __pyx_state[9] * if len(__pyx_state) > 10 and hasattr(__pyx_result, '__dict__'): */ __pyx_t_1 = PyCFunction_NewEx(&__pyx_mdef_7aiohttp_12_http_parser_5__pyx_unpickle_RawResponseMessage, NULL, __pyx_n_s_aiohttp__http_parser); if (unlikely(!__pyx_t_1)) __PYX_ERR(1, 1, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_1); if (PyDict_SetItem(__pyx_d, __pyx_n_s_pyx_unpickle_RawResponseMessag, __pyx_t_1) < 0) __PYX_ERR(1, 1, __pyx_L1_error) __Pyx_DECREF(__pyx_t_1); __pyx_t_1 = 0; /* "aiohttp/_http_parser.pyx":1 * #cython: language_level=3 # <<<<<<<<<<<<<< * # * # Based on https://github.com/MagicStack/httptools */ __pyx_t_1 = __Pyx_PyDict_NewPresized(0); if (unlikely(!__pyx_t_1)) __PYX_ERR(0, 1, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_1); if (PyDict_SetItem(__pyx_d, __pyx_n_s_test, __pyx_t_1) < 0) __PYX_ERR(0, 1, __pyx_L1_error) __Pyx_DECREF(__pyx_t_1); __pyx_t_1 = 0; /*--- Wrapped vars code ---*/ goto __pyx_L0; __pyx_L1_error:; __Pyx_XDECREF(__pyx_t_1); __Pyx_XDECREF(__pyx_t_2); __Pyx_XDECREF(__pyx_t_3); __Pyx_XDECREF(__pyx_t_4); __Pyx_XDECREF(__pyx_t_5); __Pyx_XDECREF(__pyx_t_6); __Pyx_XDECREF(__pyx_t_7); __Pyx_XDECREF(__pyx_t_8); __Pyx_XDECREF(__pyx_t_9); __Pyx_XDECREF(__pyx_t_10); __Pyx_XDECREF(__pyx_t_11); __Pyx_XDECREF(__pyx_t_12); __Pyx_XDECREF(__pyx_t_13); __Pyx_XDECREF(__pyx_t_14); __Pyx_XDECREF(__pyx_t_15); __Pyx_XDECREF(__pyx_t_16); __Pyx_XDECREF(__pyx_t_17); __Pyx_XDECREF(__pyx_t_18); __Pyx_XDECREF(__pyx_t_19); __Pyx_XDECREF(__pyx_t_20); __Pyx_XDECREF(__pyx_t_21); __Pyx_XDECREF(__pyx_t_22); __Pyx_XDECREF(__pyx_t_23); __Pyx_XDECREF(__pyx_t_24); __Pyx_XDECREF(__pyx_t_25); __Pyx_XDECREF(__pyx_t_26); 
__Pyx_XDECREF(__pyx_t_27); __Pyx_XDECREF(__pyx_t_28); __Pyx_XDECREF(__pyx_t_29); __Pyx_XDECREF(__pyx_t_30); __Pyx_XDECREF(__pyx_t_31); __Pyx_XDECREF(__pyx_t_32); __Pyx_XDECREF(__pyx_t_33); __Pyx_XDECREF(__pyx_t_34); __Pyx_XDECREF(__pyx_t_35); __Pyx_XDECREF(__pyx_t_36); __Pyx_XDECREF(__pyx_t_37); __Pyx_XDECREF(__pyx_t_38); __Pyx_XDECREF(__pyx_t_39); __Pyx_XDECREF(__pyx_t_40); __Pyx_XDECREF(__pyx_t_41); __Pyx_XDECREF(__pyx_t_42); __Pyx_XDECREF(__pyx_t_43); __Pyx_XDECREF(__pyx_t_44); __Pyx_XDECREF(__pyx_t_45); __Pyx_XDECREF(__pyx_t_46); __Pyx_XDECREF(__pyx_t_47); __Pyx_XDECREF(__pyx_t_48); __Pyx_XDECREF(__pyx_t_49); __Pyx_XDECREF(__pyx_t_50); __Pyx_XDECREF(__pyx_t_51); __Pyx_XDECREF(__pyx_t_52); __Pyx_XDECREF(__pyx_t_53); __Pyx_XDECREF(__pyx_t_54); __Pyx_XDECREF(__pyx_t_55); __Pyx_XDECREF(__pyx_t_56); __Pyx_XDECREF(__pyx_t_57); __Pyx_XDECREF(__pyx_t_58); __Pyx_XDECREF(__pyx_t_59); __Pyx_XDECREF(__pyx_t_60); __Pyx_XDECREF(__pyx_t_61); __Pyx_XDECREF(__pyx_t_62); __Pyx_XDECREF(__pyx_t_63); __Pyx_XDECREF(__pyx_t_64); __Pyx_XDECREF(__pyx_t_65); __Pyx_XDECREF(__pyx_t_66); __Pyx_XDECREF(__pyx_t_67); __Pyx_XDECREF(__pyx_t_68); __Pyx_XDECREF(__pyx_t_69); __Pyx_XDECREF(__pyx_t_70); __Pyx_XDECREF(__pyx_t_71); __Pyx_XDECREF(__pyx_t_72); __Pyx_XDECREF(__pyx_t_73); __Pyx_XDECREF(__pyx_t_74); __Pyx_XDECREF(__pyx_t_75); __Pyx_XDECREF(__pyx_t_76); __Pyx_XDECREF(__pyx_t_77); __Pyx_XDECREF(__pyx_t_78); __Pyx_XDECREF(__pyx_t_79); if (__pyx_m) { if (__pyx_d) { __Pyx_AddTraceback("init aiohttp._http_parser", __pyx_clineno, __pyx_lineno, __pyx_filename); } Py_CLEAR(__pyx_m); } else if (!PyErr_Occurred()) { PyErr_SetString(PyExc_ImportError, "init aiohttp._http_parser"); } __pyx_L0:; __Pyx_RefNannyFinishContext(); #if CYTHON_PEP489_MULTI_PHASE_INIT return (__pyx_m != NULL) ? 
0 : -1; #elif PY_MAJOR_VERSION >= 3 return __pyx_m; #else return; #endif } /* --- Runtime support code --- */ /* Refnanny */ #if CYTHON_REFNANNY static __Pyx_RefNannyAPIStruct *__Pyx_RefNannyImportAPI(const char *modname) { PyObject *m = NULL, *p = NULL; void *r = NULL; m = PyImport_ImportModule(modname); if (!m) goto end; p = PyObject_GetAttrString(m, "RefNannyAPI"); if (!p) goto end; r = PyLong_AsVoidPtr(p); end: Py_XDECREF(p); Py_XDECREF(m); return (__Pyx_RefNannyAPIStruct *)r; } #endif /* PyObjectGetAttrStr */ #if CYTHON_USE_TYPE_SLOTS static CYTHON_INLINE PyObject* __Pyx_PyObject_GetAttrStr(PyObject* obj, PyObject* attr_name) { PyTypeObject* tp = Py_TYPE(obj); if (likely(tp->tp_getattro)) return tp->tp_getattro(obj, attr_name); #if PY_MAJOR_VERSION < 3 if (likely(tp->tp_getattr)) return tp->tp_getattr(obj, PyString_AS_STRING(attr_name)); #endif return PyObject_GetAttr(obj, attr_name); } #endif /* GetBuiltinName */ static PyObject *__Pyx_GetBuiltinName(PyObject *name) { PyObject* result = __Pyx_PyObject_GetAttrStr(__pyx_b, name); if (unlikely(!result)) { PyErr_Format(PyExc_NameError, #if PY_MAJOR_VERSION >= 3 "name '%U' is not defined", name); #else "name '%.200s' is not defined", PyString_AS_STRING(name)); #endif } return result; } /* GetItemInt */ static PyObject *__Pyx_GetItemInt_Generic(PyObject *o, PyObject* j) { PyObject *r; if (!j) return NULL; r = PyObject_GetItem(o, j); Py_DECREF(j); return r; } static CYTHON_INLINE PyObject *__Pyx_GetItemInt_List_Fast(PyObject *o, Py_ssize_t i, CYTHON_NCP_UNUSED int wraparound, CYTHON_NCP_UNUSED int boundscheck) { #if CYTHON_ASSUME_SAFE_MACROS && !CYTHON_AVOID_BORROWED_REFS Py_ssize_t wrapped_i = i; if (wraparound & unlikely(i < 0)) { wrapped_i += PyList_GET_SIZE(o); } if ((!boundscheck) || likely(__Pyx_is_valid_index(wrapped_i, PyList_GET_SIZE(o)))) { PyObject *r = PyList_GET_ITEM(o, wrapped_i); Py_INCREF(r); return r; } return __Pyx_GetItemInt_Generic(o, PyInt_FromSsize_t(i)); #else return PySequence_GetItem(o, i); #endif } static CYTHON_INLINE PyObject *__Pyx_GetItemInt_Tuple_Fast(PyObject *o, Py_ssize_t i, CYTHON_NCP_UNUSED int wraparound, CYTHON_NCP_UNUSED int boundscheck) { #if CYTHON_ASSUME_SAFE_MACROS && !CYTHON_AVOID_BORROWED_REFS Py_ssize_t wrapped_i = i; if (wraparound & unlikely(i < 0)) { wrapped_i += PyTuple_GET_SIZE(o); } if ((!boundscheck) || likely(__Pyx_is_valid_index(wrapped_i, PyTuple_GET_SIZE(o)))) { PyObject *r = PyTuple_GET_ITEM(o, wrapped_i); Py_INCREF(r); return r; } return __Pyx_GetItemInt_Generic(o, PyInt_FromSsize_t(i)); #else return PySequence_GetItem(o, i); #endif } static CYTHON_INLINE PyObject *__Pyx_GetItemInt_Fast(PyObject *o, Py_ssize_t i, int is_list, CYTHON_NCP_UNUSED int wraparound, CYTHON_NCP_UNUSED int boundscheck) { #if CYTHON_ASSUME_SAFE_MACROS && !CYTHON_AVOID_BORROWED_REFS && CYTHON_USE_TYPE_SLOTS if (is_list || PyList_CheckExact(o)) { Py_ssize_t n = ((!wraparound) | likely(i >= 0)) ? i : i + PyList_GET_SIZE(o); if ((!boundscheck) || (likely(__Pyx_is_valid_index(n, PyList_GET_SIZE(o))))) { PyObject *r = PyList_GET_ITEM(o, n); Py_INCREF(r); return r; } } else if (PyTuple_CheckExact(o)) { Py_ssize_t n = ((!wraparound) | likely(i >= 0)) ? 
i : i + PyTuple_GET_SIZE(o); if ((!boundscheck) || likely(__Pyx_is_valid_index(n, PyTuple_GET_SIZE(o)))) { PyObject *r = PyTuple_GET_ITEM(o, n); Py_INCREF(r); return r; } } else { PySequenceMethods *m = Py_TYPE(o)->tp_as_sequence; if (likely(m && m->sq_item)) { if (wraparound && unlikely(i < 0) && likely(m->sq_length)) { Py_ssize_t l = m->sq_length(o); if (likely(l >= 0)) { i += l; } else { if (!PyErr_ExceptionMatches(PyExc_OverflowError)) return NULL; PyErr_Clear(); } } return m->sq_item(o, i); } } #else if (is_list || PySequence_Check(o)) { return PySequence_GetItem(o, i); } #endif return __Pyx_GetItemInt_Generic(o, PyInt_FromSsize_t(i)); } /* decode_c_bytes */ static CYTHON_INLINE PyObject* __Pyx_decode_c_bytes( const char* cstring, Py_ssize_t length, Py_ssize_t start, Py_ssize_t stop, const char* encoding, const char* errors, PyObject* (*decode_func)(const char *s, Py_ssize_t size, const char *errors)) { if (unlikely((start < 0) | (stop < 0))) { if (start < 0) { start += length; if (start < 0) start = 0; } if (stop < 0) stop += length; } if (stop > length) stop = length; length = stop - start; if (unlikely(length <= 0)) return PyUnicode_FromUnicode(NULL, 0); cstring += start; if (decode_func) { return decode_func(cstring, length, errors); } else { return PyUnicode_Decode(cstring, length, encoding, errors); } } /* RaiseArgTupleInvalid */ static void __Pyx_RaiseArgtupleInvalid( const char* func_name, int exact, Py_ssize_t num_min, Py_ssize_t num_max, Py_ssize_t num_found) { Py_ssize_t num_expected; const char *more_or_less; if (num_found < num_min) { num_expected = num_min; more_or_less = "at least"; } else { num_expected = num_max; more_or_less = "at most"; } if (exact) { more_or_less = "exactly"; } PyErr_Format(PyExc_TypeError, "%.200s() takes %.8s %" CYTHON_FORMAT_SSIZE_T "d positional argument%.1s (%" CYTHON_FORMAT_SSIZE_T "d given)", func_name, more_or_less, num_expected, (num_expected == 1) ? "" : "s", num_found); } /* RaiseDoubleKeywords */ static void __Pyx_RaiseDoubleKeywordsError( const char* func_name, PyObject* kw_name) { PyErr_Format(PyExc_TypeError, #if PY_MAJOR_VERSION >= 3 "%s() got multiple values for keyword argument '%U'", func_name, kw_name); #else "%s() got multiple values for keyword argument '%s'", func_name, PyString_AsString(kw_name)); #endif } /* ParseKeywords */ static int __Pyx_ParseOptionalKeywords( PyObject *kwds, PyObject **argnames[], PyObject *kwds2, PyObject *values[], Py_ssize_t num_pos_args, const char* function_name) { PyObject *key = 0, *value = 0; Py_ssize_t pos = 0; PyObject*** name; PyObject*** first_kw_arg = argnames + num_pos_args; while (PyDict_Next(kwds, &pos, &key, &value)) { name = first_kw_arg; while (*name && (**name != key)) name++; if (*name) { values[name-argnames] = value; continue; } name = first_kw_arg; #if PY_MAJOR_VERSION < 3 if (likely(PyString_CheckExact(key)) || likely(PyString_Check(key))) { while (*name) { if ((CYTHON_COMPILING_IN_PYPY || PyString_GET_SIZE(**name) == PyString_GET_SIZE(key)) && _PyString_Eq(**name, key)) { values[name-argnames] = value; break; } name++; } if (*name) continue; else { PyObject*** argname = argnames; while (argname != first_kw_arg) { if ((**argname == key) || ( (CYTHON_COMPILING_IN_PYPY || PyString_GET_SIZE(**argname) == PyString_GET_SIZE(key)) && _PyString_Eq(**argname, key))) { goto arg_passed_twice; } argname++; } } } else #endif if (likely(PyUnicode_Check(key))) { while (*name) { int cmp = (**name == key) ? 
0 : #if !CYTHON_COMPILING_IN_PYPY && PY_MAJOR_VERSION >= 3 (PyUnicode_GET_SIZE(**name) != PyUnicode_GET_SIZE(key)) ? 1 : #endif PyUnicode_Compare(**name, key); if (cmp < 0 && unlikely(PyErr_Occurred())) goto bad; if (cmp == 0) { values[name-argnames] = value; break; } name++; } if (*name) continue; else { PyObject*** argname = argnames; while (argname != first_kw_arg) { int cmp = (**argname == key) ? 0 : #if !CYTHON_COMPILING_IN_PYPY && PY_MAJOR_VERSION >= 3 (PyUnicode_GET_SIZE(**argname) != PyUnicode_GET_SIZE(key)) ? 1 : #endif PyUnicode_Compare(**argname, key); if (cmp < 0 && unlikely(PyErr_Occurred())) goto bad; if (cmp == 0) goto arg_passed_twice; argname++; } } } else goto invalid_keyword_type; if (kwds2) { if (unlikely(PyDict_SetItem(kwds2, key, value))) goto bad; } else { goto invalid_keyword; } } return 0; arg_passed_twice: __Pyx_RaiseDoubleKeywordsError(function_name, key); goto bad; invalid_keyword_type: PyErr_Format(PyExc_TypeError, "%.200s() keywords must be strings", function_name); goto bad; invalid_keyword: PyErr_Format(PyExc_TypeError, #if PY_MAJOR_VERSION < 3 "%.200s() got an unexpected keyword argument '%.200s'", function_name, PyString_AsString(key)); #else "%s() got an unexpected keyword argument '%U'", function_name, key); #endif bad: return -1; } /* None */ static CYTHON_INLINE void __Pyx_RaiseClosureNameError(const char *varname) { PyErr_Format(PyExc_NameError, "free variable '%s' referenced before assignment in enclosing scope", varname); } /* RaiseTooManyValuesToUnpack */ static CYTHON_INLINE void __Pyx_RaiseTooManyValuesError(Py_ssize_t expected) { PyErr_Format(PyExc_ValueError, "too many values to unpack (expected %" CYTHON_FORMAT_SSIZE_T "d)", expected); } /* RaiseNeedMoreValuesToUnpack */ static CYTHON_INLINE void __Pyx_RaiseNeedMoreValuesError(Py_ssize_t index) { PyErr_Format(PyExc_ValueError, "need more than %" CYTHON_FORMAT_SSIZE_T "d value%.1s to unpack", index, (index == 1) ? 
"" : "s"); } /* IterFinish */ static CYTHON_INLINE int __Pyx_IterFinish(void) { #if CYTHON_FAST_THREAD_STATE PyThreadState *tstate = __Pyx_PyThreadState_Current; PyObject* exc_type = tstate->curexc_type; if (unlikely(exc_type)) { if (likely(__Pyx_PyErr_GivenExceptionMatches(exc_type, PyExc_StopIteration))) { PyObject *exc_value, *exc_tb; exc_value = tstate->curexc_value; exc_tb = tstate->curexc_traceback; tstate->curexc_type = 0; tstate->curexc_value = 0; tstate->curexc_traceback = 0; Py_DECREF(exc_type); Py_XDECREF(exc_value); Py_XDECREF(exc_tb); return 0; } else { return -1; } } return 0; #else if (unlikely(PyErr_Occurred())) { if (likely(PyErr_ExceptionMatches(PyExc_StopIteration))) { PyErr_Clear(); return 0; } else { return -1; } } return 0; #endif } /* UnpackItemEndCheck */ static int __Pyx_IternextUnpackEndCheck(PyObject *retval, Py_ssize_t expected) { if (unlikely(retval)) { Py_DECREF(retval); __Pyx_RaiseTooManyValuesError(expected); return -1; } else { return __Pyx_IterFinish(); } return 0; } /* KeywordStringCheck */ static int __Pyx_CheckKeywordStrings( PyObject *kwdict, const char* function_name, int kw_allowed) { PyObject* key = 0; Py_ssize_t pos = 0; #if CYTHON_COMPILING_IN_PYPY if (!kw_allowed && PyDict_Next(kwdict, &pos, &key, 0)) goto invalid_keyword; return 1; #else while (PyDict_Next(kwdict, &pos, &key, 0)) { #if PY_MAJOR_VERSION < 3 if (unlikely(!PyString_Check(key))) #endif if (unlikely(!PyUnicode_Check(key))) goto invalid_keyword_type; } if ((!kw_allowed) && unlikely(key)) goto invalid_keyword; return 1; invalid_keyword_type: PyErr_Format(PyExc_TypeError, "%.200s() keywords must be strings", function_name); return 0; #endif invalid_keyword: PyErr_Format(PyExc_TypeError, #if PY_MAJOR_VERSION < 3 "%.200s() got an unexpected keyword argument '%.200s'", function_name, PyString_AsString(key)); #else "%s() got an unexpected keyword argument '%U'", function_name, key); #endif return 0; } /* ExtTypeTest */ static CYTHON_INLINE int __Pyx_TypeTest(PyObject *obj, PyTypeObject *type) { if (unlikely(!type)) { PyErr_SetString(PyExc_SystemError, "Missing type object"); return 0; } if (likely(__Pyx_TypeCheck(obj, type))) return 1; PyErr_Format(PyExc_TypeError, "Cannot convert %.200s to %.200s", Py_TYPE(obj)->tp_name, type->tp_name); return 0; } /* DictGetItem */ #if PY_MAJOR_VERSION >= 3 && !CYTHON_COMPILING_IN_PYPY static PyObject *__Pyx_PyDict_GetItem(PyObject *d, PyObject* key) { PyObject *value; value = PyDict_GetItemWithError(d, key); if (unlikely(!value)) { if (!PyErr_Occurred()) { if (unlikely(PyTuple_Check(key))) { PyObject* args = PyTuple_Pack(1, key); if (likely(args)) { PyErr_SetObject(PyExc_KeyError, args); Py_DECREF(args); } } else { PyErr_SetObject(PyExc_KeyError, key); } } return NULL; } Py_INCREF(value); return value; } #endif /* PyErrExceptionMatches */ #if CYTHON_FAST_THREAD_STATE static int __Pyx_PyErr_ExceptionMatchesTuple(PyObject *exc_type, PyObject *tuple) { Py_ssize_t i, n; n = PyTuple_GET_SIZE(tuple); #if PY_MAJOR_VERSION >= 3 for (i=0; icurexc_type; if (exc_type == err) return 1; if (unlikely(!exc_type)) return 0; if (unlikely(PyTuple_Check(err))) return __Pyx_PyErr_ExceptionMatchesTuple(exc_type, err); return __Pyx_PyErr_GivenExceptionMatches(exc_type, err); } #endif /* PyErrFetchRestore */ #if CYTHON_FAST_THREAD_STATE static CYTHON_INLINE void __Pyx_ErrRestoreInState(PyThreadState *tstate, PyObject *type, PyObject *value, PyObject *tb) { PyObject *tmp_type, *tmp_value, *tmp_tb; tmp_type = tstate->curexc_type; tmp_value = tstate->curexc_value; tmp_tb = 
tstate->curexc_traceback; tstate->curexc_type = type; tstate->curexc_value = value; tstate->curexc_traceback = tb; Py_XDECREF(tmp_type); Py_XDECREF(tmp_value); Py_XDECREF(tmp_tb); } static CYTHON_INLINE void __Pyx_ErrFetchInState(PyThreadState *tstate, PyObject **type, PyObject **value, PyObject **tb) { *type = tstate->curexc_type; *value = tstate->curexc_value; *tb = tstate->curexc_traceback; tstate->curexc_type = 0; tstate->curexc_value = 0; tstate->curexc_traceback = 0; } #endif /* GetAttr */ static CYTHON_INLINE PyObject *__Pyx_GetAttr(PyObject *o, PyObject *n) { #if CYTHON_USE_TYPE_SLOTS #if PY_MAJOR_VERSION >= 3 if (likely(PyUnicode_Check(n))) #else if (likely(PyString_Check(n))) #endif return __Pyx_PyObject_GetAttrStr(o, n); #endif return PyObject_GetAttr(o, n); } /* GetAttr3 */ static PyObject *__Pyx_GetAttr3Default(PyObject *d) { __Pyx_PyThreadState_declare __Pyx_PyThreadState_assign if (unlikely(!__Pyx_PyErr_ExceptionMatches(PyExc_AttributeError))) return NULL; __Pyx_PyErr_Clear(); Py_INCREF(d); return d; } static CYTHON_INLINE PyObject *__Pyx_GetAttr3(PyObject *o, PyObject *n, PyObject *d) { PyObject *r = __Pyx_GetAttr(o, n); return (likely(r)) ? r : __Pyx_GetAttr3Default(d); } /* PyDictVersioning */ #if CYTHON_USE_DICT_VERSIONS && CYTHON_USE_TYPE_SLOTS static CYTHON_INLINE PY_UINT64_T __Pyx_get_tp_dict_version(PyObject *obj) { PyObject *dict = Py_TYPE(obj)->tp_dict; return likely(dict) ? __PYX_GET_DICT_VERSION(dict) : 0; } static CYTHON_INLINE PY_UINT64_T __Pyx_get_object_dict_version(PyObject *obj) { PyObject **dictptr = NULL; Py_ssize_t offset = Py_TYPE(obj)->tp_dictoffset; if (offset) { #if CYTHON_COMPILING_IN_CPYTHON dictptr = (likely(offset > 0)) ? (PyObject **) ((char *)obj + offset) : _PyObject_GetDictPtr(obj); #else dictptr = _PyObject_GetDictPtr(obj); #endif } return (dictptr && *dictptr) ? 
__PYX_GET_DICT_VERSION(*dictptr) : 0; } static CYTHON_INLINE int __Pyx_object_dict_version_matches(PyObject* obj, PY_UINT64_T tp_dict_version, PY_UINT64_T obj_dict_version) { PyObject *dict = Py_TYPE(obj)->tp_dict; if (unlikely(!dict) || unlikely(tp_dict_version != __PYX_GET_DICT_VERSION(dict))) return 0; return obj_dict_version == __Pyx_get_object_dict_version(obj); } #endif /* GetModuleGlobalName */ #if CYTHON_USE_DICT_VERSIONS static PyObject *__Pyx__GetModuleGlobalName(PyObject *name, PY_UINT64_T *dict_version, PyObject **dict_cached_value) #else static CYTHON_INLINE PyObject *__Pyx__GetModuleGlobalName(PyObject *name) #endif { PyObject *result; #if !CYTHON_AVOID_BORROWED_REFS #if CYTHON_COMPILING_IN_CPYTHON && PY_VERSION_HEX >= 0x030500A1 result = _PyDict_GetItem_KnownHash(__pyx_d, name, ((PyASCIIObject *) name)->hash); __PYX_UPDATE_DICT_CACHE(__pyx_d, result, *dict_cached_value, *dict_version) if (likely(result)) { return __Pyx_NewRef(result); } else if (unlikely(PyErr_Occurred())) { return NULL; } #else result = PyDict_GetItem(__pyx_d, name); __PYX_UPDATE_DICT_CACHE(__pyx_d, result, *dict_cached_value, *dict_version) if (likely(result)) { return __Pyx_NewRef(result); } #endif #else result = PyObject_GetItem(__pyx_d, name); __PYX_UPDATE_DICT_CACHE(__pyx_d, result, *dict_cached_value, *dict_version) if (likely(result)) { return __Pyx_NewRef(result); } PyErr_Clear(); #endif return __Pyx_GetBuiltinName(name); } /* PyFunctionFastCall */ #if CYTHON_FAST_PYCALL static PyObject* __Pyx_PyFunction_FastCallNoKw(PyCodeObject *co, PyObject **args, Py_ssize_t na, PyObject *globals) { PyFrameObject *f; PyThreadState *tstate = __Pyx_PyThreadState_Current; PyObject **fastlocals; Py_ssize_t i; PyObject *result; assert(globals != NULL); /* XXX Perhaps we should create a specialized PyFrame_New() that doesn't take locals, but does take builtins without sanity checking them. */ assert(tstate != NULL); f = PyFrame_New(tstate, co, globals, NULL); if (f == NULL) { return NULL; } fastlocals = __Pyx_PyFrame_GetLocalsplus(f); for (i = 0; i < na; i++) { Py_INCREF(*args); fastlocals[i] = *args++; } result = PyEval_EvalFrameEx(f,0); ++tstate->recursion_depth; Py_DECREF(f); --tstate->recursion_depth; return result; } #if 1 || PY_VERSION_HEX < 0x030600B1 static PyObject *__Pyx_PyFunction_FastCallDict(PyObject *func, PyObject **args, Py_ssize_t nargs, PyObject *kwargs) { PyCodeObject *co = (PyCodeObject *)PyFunction_GET_CODE(func); PyObject *globals = PyFunction_GET_GLOBALS(func); PyObject *argdefs = PyFunction_GET_DEFAULTS(func); PyObject *closure; #if PY_MAJOR_VERSION >= 3 PyObject *kwdefs; #endif PyObject *kwtuple, **k; PyObject **d; Py_ssize_t nd; Py_ssize_t nk; PyObject *result; assert(kwargs == NULL || PyDict_Check(kwargs)); nk = kwargs ? 
PyDict_Size(kwargs) : 0; if (Py_EnterRecursiveCall((char*)" while calling a Python object")) { return NULL; } if ( #if PY_MAJOR_VERSION >= 3 co->co_kwonlyargcount == 0 && #endif likely(kwargs == NULL || nk == 0) && co->co_flags == (CO_OPTIMIZED | CO_NEWLOCALS | CO_NOFREE)) { if (argdefs == NULL && co->co_argcount == nargs) { result = __Pyx_PyFunction_FastCallNoKw(co, args, nargs, globals); goto done; } else if (nargs == 0 && argdefs != NULL && co->co_argcount == Py_SIZE(argdefs)) { /* function called with no arguments, but all parameters have a default value: use default values as arguments .*/ args = &PyTuple_GET_ITEM(argdefs, 0); result =__Pyx_PyFunction_FastCallNoKw(co, args, Py_SIZE(argdefs), globals); goto done; } } if (kwargs != NULL) { Py_ssize_t pos, i; kwtuple = PyTuple_New(2 * nk); if (kwtuple == NULL) { result = NULL; goto done; } k = &PyTuple_GET_ITEM(kwtuple, 0); pos = i = 0; while (PyDict_Next(kwargs, &pos, &k[i], &k[i+1])) { Py_INCREF(k[i]); Py_INCREF(k[i+1]); i += 2; } nk = i / 2; } else { kwtuple = NULL; k = NULL; } closure = PyFunction_GET_CLOSURE(func); #if PY_MAJOR_VERSION >= 3 kwdefs = PyFunction_GET_KW_DEFAULTS(func); #endif if (argdefs != NULL) { d = &PyTuple_GET_ITEM(argdefs, 0); nd = Py_SIZE(argdefs); } else { d = NULL; nd = 0; } #if PY_MAJOR_VERSION >= 3 result = PyEval_EvalCodeEx((PyObject*)co, globals, (PyObject *)NULL, args, (int)nargs, k, (int)nk, d, (int)nd, kwdefs, closure); #else result = PyEval_EvalCodeEx(co, globals, (PyObject *)NULL, args, (int)nargs, k, (int)nk, d, (int)nd, closure); #endif Py_XDECREF(kwtuple); done: Py_LeaveRecursiveCall(); return result; } #endif #endif /* PyObjectCall */ #if CYTHON_COMPILING_IN_CPYTHON static CYTHON_INLINE PyObject* __Pyx_PyObject_Call(PyObject *func, PyObject *arg, PyObject *kw) { PyObject *result; ternaryfunc call = func->ob_type->tp_call; if (unlikely(!call)) return PyObject_Call(func, arg, kw); if (unlikely(Py_EnterRecursiveCall((char*)" while calling a Python object"))) return NULL; result = (*call)(func, arg, kw); Py_LeaveRecursiveCall(); if (unlikely(!result) && unlikely(!PyErr_Occurred())) { PyErr_SetString( PyExc_SystemError, "NULL result without error in PyObject_Call"); } return result; } #endif /* PyObjectCallMethO */ #if CYTHON_COMPILING_IN_CPYTHON static CYTHON_INLINE PyObject* __Pyx_PyObject_CallMethO(PyObject *func, PyObject *arg) { PyObject *self, *result; PyCFunction cfunc; cfunc = PyCFunction_GET_FUNCTION(func); self = PyCFunction_GET_SELF(func); if (unlikely(Py_EnterRecursiveCall((char*)" while calling a Python object"))) return NULL; result = cfunc(self, arg); Py_LeaveRecursiveCall(); if (unlikely(!result) && unlikely(!PyErr_Occurred())) { PyErr_SetString( PyExc_SystemError, "NULL result without error in PyObject_Call"); } return result; } #endif /* PyObjectCallNoArg */ #if CYTHON_COMPILING_IN_CPYTHON static CYTHON_INLINE PyObject* __Pyx_PyObject_CallNoArg(PyObject *func) { #if CYTHON_FAST_PYCALL if (PyFunction_Check(func)) { return __Pyx_PyFunction_FastCall(func, NULL, 0); } #endif #ifdef __Pyx_CyFunction_USED if (likely(PyCFunction_Check(func) || __Pyx_CyFunction_Check(func))) #else if (likely(PyCFunction_Check(func))) #endif { if (likely(PyCFunction_GET_FLAGS(func) & METH_NOARGS)) { return __Pyx_PyObject_CallMethO(func, NULL); } } return __Pyx_PyObject_Call(func, __pyx_empty_tuple, NULL); } #endif /* PyCFunctionFastCall */ #if CYTHON_FAST_PYCCALL static CYTHON_INLINE PyObject * __Pyx_PyCFunction_FastCall(PyObject *func_obj, PyObject **args, Py_ssize_t nargs) { PyCFunctionObject *func = 
(PyCFunctionObject*)func_obj; PyCFunction meth = PyCFunction_GET_FUNCTION(func); PyObject *self = PyCFunction_GET_SELF(func); int flags = PyCFunction_GET_FLAGS(func); assert(PyCFunction_Check(func)); assert(METH_FASTCALL == (flags & ~(METH_CLASS | METH_STATIC | METH_COEXIST | METH_KEYWORDS | METH_STACKLESS))); assert(nargs >= 0); assert(nargs == 0 || args != NULL); /* _PyCFunction_FastCallDict() must not be called with an exception set, because it may clear it (directly or indirectly) and so the caller loses its exception */ assert(!PyErr_Occurred()); if ((PY_VERSION_HEX < 0x030700A0) || unlikely(flags & METH_KEYWORDS)) { return (*((__Pyx_PyCFunctionFastWithKeywords)(void*)meth)) (self, args, nargs, NULL); } else { return (*((__Pyx_PyCFunctionFast)(void*)meth)) (self, args, nargs); } } #endif /* PyObjectCallOneArg */ #if CYTHON_COMPILING_IN_CPYTHON static PyObject* __Pyx__PyObject_CallOneArg(PyObject *func, PyObject *arg) { PyObject *result; PyObject *args = PyTuple_New(1); if (unlikely(!args)) return NULL; Py_INCREF(arg); PyTuple_SET_ITEM(args, 0, arg); result = __Pyx_PyObject_Call(func, args, NULL); Py_DECREF(args); return result; } static CYTHON_INLINE PyObject* __Pyx_PyObject_CallOneArg(PyObject *func, PyObject *arg) { #if CYTHON_FAST_PYCALL if (PyFunction_Check(func)) { return __Pyx_PyFunction_FastCall(func, &arg, 1); } #endif if (likely(PyCFunction_Check(func))) { if (likely(PyCFunction_GET_FLAGS(func) & METH_O)) { return __Pyx_PyObject_CallMethO(func, arg); #if CYTHON_FAST_PYCCALL } else if (PyCFunction_GET_FLAGS(func) & METH_FASTCALL) { return __Pyx_PyCFunction_FastCall(func, &arg, 1); #endif } } return __Pyx__PyObject_CallOneArg(func, arg); } #else static CYTHON_INLINE PyObject* __Pyx_PyObject_CallOneArg(PyObject *func, PyObject *arg) { PyObject *result; PyObject *args = PyTuple_Pack(1, arg); if (unlikely(!args)) return NULL; result = __Pyx_PyObject_Call(func, args, NULL); Py_DECREF(args); return result; } #endif /* PyObjectCall2Args */ static CYTHON_UNUSED PyObject* __Pyx_PyObject_Call2Args(PyObject* function, PyObject* arg1, PyObject* arg2) { PyObject *args, *result = NULL; #if CYTHON_FAST_PYCALL if (PyFunction_Check(function)) { PyObject *args[2] = {arg1, arg2}; return __Pyx_PyFunction_FastCall(function, args, 2); } #endif #if CYTHON_FAST_PYCCALL if (__Pyx_PyFastCFunction_Check(function)) { PyObject *args[2] = {arg1, arg2}; return __Pyx_PyCFunction_FastCall(function, args, 2); } #endif args = PyTuple_New(2); if (unlikely(!args)) goto done; Py_INCREF(arg1); PyTuple_SET_ITEM(args, 0, arg1); Py_INCREF(arg2); PyTuple_SET_ITEM(args, 1, arg2); Py_INCREF(function); result = __Pyx_PyObject_Call(function, args, NULL); Py_DECREF(args); Py_DECREF(function); done: return result; } /* RaiseException */ #if PY_MAJOR_VERSION < 3 static void __Pyx_Raise(PyObject *type, PyObject *value, PyObject *tb, CYTHON_UNUSED PyObject *cause) { __Pyx_PyThreadState_declare Py_XINCREF(type); if (!value || value == Py_None) value = NULL; else Py_INCREF(value); if (!tb || tb == Py_None) tb = NULL; else { Py_INCREF(tb); if (!PyTraceBack_Check(tb)) { PyErr_SetString(PyExc_TypeError, "raise: arg 3 must be a traceback or None"); goto raise_error; } } if (PyType_Check(type)) { #if CYTHON_COMPILING_IN_PYPY if (!value) { Py_INCREF(Py_None); value = Py_None; } #endif PyErr_NormalizeException(&type, &value, &tb); } else { if (value) { PyErr_SetString(PyExc_TypeError, "instance exception may not have a separate value"); goto raise_error; } value = type; type = (PyObject*) Py_TYPE(type); Py_INCREF(type); if 
(!PyType_IsSubtype((PyTypeObject *)type, (PyTypeObject *)PyExc_BaseException)) { PyErr_SetString(PyExc_TypeError, "raise: exception class must be a subclass of BaseException"); goto raise_error; } } __Pyx_PyThreadState_assign __Pyx_ErrRestore(type, value, tb); return; raise_error: Py_XDECREF(value); Py_XDECREF(type); Py_XDECREF(tb); return; } #else static void __Pyx_Raise(PyObject *type, PyObject *value, PyObject *tb, PyObject *cause) { PyObject* owned_instance = NULL; if (tb == Py_None) { tb = 0; } else if (tb && !PyTraceBack_Check(tb)) { PyErr_SetString(PyExc_TypeError, "raise: arg 3 must be a traceback or None"); goto bad; } if (value == Py_None) value = 0; if (PyExceptionInstance_Check(type)) { if (value) { PyErr_SetString(PyExc_TypeError, "instance exception may not have a separate value"); goto bad; } value = type; type = (PyObject*) Py_TYPE(value); } else if (PyExceptionClass_Check(type)) { PyObject *instance_class = NULL; if (value && PyExceptionInstance_Check(value)) { instance_class = (PyObject*) Py_TYPE(value); if (instance_class != type) { int is_subclass = PyObject_IsSubclass(instance_class, type); if (!is_subclass) { instance_class = NULL; } else if (unlikely(is_subclass == -1)) { goto bad; } else { type = instance_class; } } } if (!instance_class) { PyObject *args; if (!value) args = PyTuple_New(0); else if (PyTuple_Check(value)) { Py_INCREF(value); args = value; } else args = PyTuple_Pack(1, value); if (!args) goto bad; owned_instance = PyObject_Call(type, args, NULL); Py_DECREF(args); if (!owned_instance) goto bad; value = owned_instance; if (!PyExceptionInstance_Check(value)) { PyErr_Format(PyExc_TypeError, "calling %R should have returned an instance of " "BaseException, not %R", type, Py_TYPE(value)); goto bad; } } } else { PyErr_SetString(PyExc_TypeError, "raise: exception class must be a subclass of BaseException"); goto bad; } if (cause) { PyObject *fixed_cause; if (cause == Py_None) { fixed_cause = NULL; } else if (PyExceptionClass_Check(cause)) { fixed_cause = PyObject_CallObject(cause, NULL); if (fixed_cause == NULL) goto bad; } else if (PyExceptionInstance_Check(cause)) { fixed_cause = cause; Py_INCREF(fixed_cause); } else { PyErr_SetString(PyExc_TypeError, "exception causes must derive from " "BaseException"); goto bad; } PyException_SetCause(value, fixed_cause); } PyErr_SetObject(type, value); if (tb) { #if CYTHON_COMPILING_IN_PYPY PyObject *tmp_type, *tmp_value, *tmp_tb; PyErr_Fetch(&tmp_type, &tmp_value, &tmp_tb); Py_INCREF(tb); PyErr_Restore(tmp_type, tmp_value, tb); Py_XDECREF(tmp_tb); #else PyThreadState *tstate = __Pyx_PyThreadState_Current; PyObject* tmp_tb = tstate->curexc_traceback; if (tb != tmp_tb) { Py_INCREF(tb); tstate->curexc_traceback = tb; Py_XDECREF(tmp_tb); } #endif } bad: Py_XDECREF(owned_instance); return; } #endif /* BytesEquals */ static CYTHON_INLINE int __Pyx_PyBytes_Equals(PyObject* s1, PyObject* s2, int equals) { #if CYTHON_COMPILING_IN_PYPY return PyObject_RichCompareBool(s1, s2, equals); #else if (s1 == s2) { return (equals == Py_EQ); } else if (PyBytes_CheckExact(s1) & PyBytes_CheckExact(s2)) { const char *ps1, *ps2; Py_ssize_t length = PyBytes_GET_SIZE(s1); if (length != PyBytes_GET_SIZE(s2)) return (equals == Py_NE); ps1 = PyBytes_AS_STRING(s1); ps2 = PyBytes_AS_STRING(s2); if (ps1[0] != ps2[0]) { return (equals == Py_NE); } else if (length == 1) { return (equals == Py_EQ); } else { int result; #if CYTHON_USE_UNICODE_INTERNALS Py_hash_t hash1, hash2; hash1 = ((PyBytesObject*)s1)->ob_shash; hash2 = ((PyBytesObject*)s2)->ob_shash; 
if (hash1 != hash2 && hash1 != -1 && hash2 != -1) { return (equals == Py_NE); } #endif result = memcmp(ps1, ps2, (size_t)length); return (equals == Py_EQ) ? (result == 0) : (result != 0); } } else if ((s1 == Py_None) & PyBytes_CheckExact(s2)) { return (equals == Py_NE); } else if ((s2 == Py_None) & PyBytes_CheckExact(s1)) { return (equals == Py_NE); } else { int result; PyObject* py_result = PyObject_RichCompare(s1, s2, equals); if (!py_result) return -1; result = __Pyx_PyObject_IsTrue(py_result); Py_DECREF(py_result); return result; } #endif } /* UnicodeEquals */ static CYTHON_INLINE int __Pyx_PyUnicode_Equals(PyObject* s1, PyObject* s2, int equals) { #if CYTHON_COMPILING_IN_PYPY return PyObject_RichCompareBool(s1, s2, equals); #else #if PY_MAJOR_VERSION < 3 PyObject* owned_ref = NULL; #endif int s1_is_unicode, s2_is_unicode; if (s1 == s2) { goto return_eq; } s1_is_unicode = PyUnicode_CheckExact(s1); s2_is_unicode = PyUnicode_CheckExact(s2); #if PY_MAJOR_VERSION < 3 if ((s1_is_unicode & (!s2_is_unicode)) && PyString_CheckExact(s2)) { owned_ref = PyUnicode_FromObject(s2); if (unlikely(!owned_ref)) return -1; s2 = owned_ref; s2_is_unicode = 1; } else if ((s2_is_unicode & (!s1_is_unicode)) && PyString_CheckExact(s1)) { owned_ref = PyUnicode_FromObject(s1); if (unlikely(!owned_ref)) return -1; s1 = owned_ref; s1_is_unicode = 1; } else if (((!s2_is_unicode) & (!s1_is_unicode))) { return __Pyx_PyBytes_Equals(s1, s2, equals); } #endif if (s1_is_unicode & s2_is_unicode) { Py_ssize_t length; int kind; void *data1, *data2; if (unlikely(__Pyx_PyUnicode_READY(s1) < 0) || unlikely(__Pyx_PyUnicode_READY(s2) < 0)) return -1; length = __Pyx_PyUnicode_GET_LENGTH(s1); if (length != __Pyx_PyUnicode_GET_LENGTH(s2)) { goto return_ne; } #if CYTHON_USE_UNICODE_INTERNALS { Py_hash_t hash1, hash2; #if CYTHON_PEP393_ENABLED hash1 = ((PyASCIIObject*)s1)->hash; hash2 = ((PyASCIIObject*)s2)->hash; #else hash1 = ((PyUnicodeObject*)s1)->hash; hash2 = ((PyUnicodeObject*)s2)->hash; #endif if (hash1 != hash2 && hash1 != -1 && hash2 != -1) { goto return_ne; } } #endif kind = __Pyx_PyUnicode_KIND(s1); if (kind != __Pyx_PyUnicode_KIND(s2)) { goto return_ne; } data1 = __Pyx_PyUnicode_DATA(s1); data2 = __Pyx_PyUnicode_DATA(s2); if (__Pyx_PyUnicode_READ(kind, data1, 0) != __Pyx_PyUnicode_READ(kind, data2, 0)) { goto return_ne; } else if (length == 1) { goto return_eq; } else { int result = memcmp(data1, data2, (size_t)(length * kind)); #if PY_MAJOR_VERSION < 3 Py_XDECREF(owned_ref); #endif return (equals == Py_EQ) ? 
(result == 0) : (result != 0); } } else if ((s1 == Py_None) & s2_is_unicode) { goto return_ne; } else if ((s2 == Py_None) & s1_is_unicode) { goto return_ne; } else { int result; PyObject* py_result = PyObject_RichCompare(s1, s2, equals); #if PY_MAJOR_VERSION < 3 Py_XDECREF(owned_ref); #endif if (!py_result) return -1; result = __Pyx_PyObject_IsTrue(py_result); Py_DECREF(py_result); return result; } return_eq: #if PY_MAJOR_VERSION < 3 Py_XDECREF(owned_ref); #endif return (equals == Py_EQ); return_ne: #if PY_MAJOR_VERSION < 3 Py_XDECREF(owned_ref); #endif return (equals == Py_NE); #endif } /* SliceObject */ static CYTHON_INLINE PyObject* __Pyx_PyObject_GetSlice(PyObject* obj, Py_ssize_t cstart, Py_ssize_t cstop, PyObject** _py_start, PyObject** _py_stop, PyObject** _py_slice, int has_cstart, int has_cstop, CYTHON_UNUSED int wraparound) { #if CYTHON_USE_TYPE_SLOTS PyMappingMethods* mp; #if PY_MAJOR_VERSION < 3 PySequenceMethods* ms = Py_TYPE(obj)->tp_as_sequence; if (likely(ms && ms->sq_slice)) { if (!has_cstart) { if (_py_start && (*_py_start != Py_None)) { cstart = __Pyx_PyIndex_AsSsize_t(*_py_start); if ((cstart == (Py_ssize_t)-1) && PyErr_Occurred()) goto bad; } else cstart = 0; } if (!has_cstop) { if (_py_stop && (*_py_stop != Py_None)) { cstop = __Pyx_PyIndex_AsSsize_t(*_py_stop); if ((cstop == (Py_ssize_t)-1) && PyErr_Occurred()) goto bad; } else cstop = PY_SSIZE_T_MAX; } if (wraparound && unlikely((cstart < 0) | (cstop < 0)) && likely(ms->sq_length)) { Py_ssize_t l = ms->sq_length(obj); if (likely(l >= 0)) { if (cstop < 0) { cstop += l; if (cstop < 0) cstop = 0; } if (cstart < 0) { cstart += l; if (cstart < 0) cstart = 0; } } else { if (!PyErr_ExceptionMatches(PyExc_OverflowError)) goto bad; PyErr_Clear(); } } return ms->sq_slice(obj, cstart, cstop); } #endif mp = Py_TYPE(obj)->tp_as_mapping; if (likely(mp && mp->mp_subscript)) #endif { PyObject* result; PyObject *py_slice, *py_start, *py_stop; if (_py_slice) { py_slice = *_py_slice; } else { PyObject* owned_start = NULL; PyObject* owned_stop = NULL; if (_py_start) { py_start = *_py_start; } else { if (has_cstart) { owned_start = py_start = PyInt_FromSsize_t(cstart); if (unlikely(!py_start)) goto bad; } else py_start = Py_None; } if (_py_stop) { py_stop = *_py_stop; } else { if (has_cstop) { owned_stop = py_stop = PyInt_FromSsize_t(cstop); if (unlikely(!py_stop)) { Py_XDECREF(owned_start); goto bad; } } else py_stop = Py_None; } py_slice = PySlice_New(py_start, py_stop, Py_None); Py_XDECREF(owned_start); Py_XDECREF(owned_stop); if (unlikely(!py_slice)) goto bad; } #if CYTHON_USE_TYPE_SLOTS result = mp->mp_subscript(obj, py_slice); #else result = PyObject_GetItem(obj, py_slice); #endif if (!_py_slice) { Py_DECREF(py_slice); } return result; } PyErr_Format(PyExc_TypeError, "'%.200s' object is unsliceable", Py_TYPE(obj)->tp_name); bad: return NULL; } /* GetException */ #if CYTHON_FAST_THREAD_STATE static int __Pyx__GetException(PyThreadState *tstate, PyObject **type, PyObject **value, PyObject **tb) #else static int __Pyx_GetException(PyObject **type, PyObject **value, PyObject **tb) #endif { PyObject *local_type, *local_value, *local_tb; #if CYTHON_FAST_THREAD_STATE PyObject *tmp_type, *tmp_value, *tmp_tb; local_type = tstate->curexc_type; local_value = tstate->curexc_value; local_tb = tstate->curexc_traceback; tstate->curexc_type = 0; tstate->curexc_value = 0; tstate->curexc_traceback = 0; #else PyErr_Fetch(&local_type, &local_value, &local_tb); #endif PyErr_NormalizeException(&local_type, &local_value, &local_tb); #if 
CYTHON_FAST_THREAD_STATE if (unlikely(tstate->curexc_type)) #else if (unlikely(PyErr_Occurred())) #endif goto bad; #if PY_MAJOR_VERSION >= 3 if (local_tb) { if (unlikely(PyException_SetTraceback(local_value, local_tb) < 0)) goto bad; } #endif Py_XINCREF(local_tb); Py_XINCREF(local_type); Py_XINCREF(local_value); *type = local_type; *value = local_value; *tb = local_tb; #if CYTHON_FAST_THREAD_STATE #if CYTHON_USE_EXC_INFO_STACK { _PyErr_StackItem *exc_info = tstate->exc_info; tmp_type = exc_info->exc_type; tmp_value = exc_info->exc_value; tmp_tb = exc_info->exc_traceback; exc_info->exc_type = local_type; exc_info->exc_value = local_value; exc_info->exc_traceback = local_tb; } #else tmp_type = tstate->exc_type; tmp_value = tstate->exc_value; tmp_tb = tstate->exc_traceback; tstate->exc_type = local_type; tstate->exc_value = local_value; tstate->exc_traceback = local_tb; #endif Py_XDECREF(tmp_type); Py_XDECREF(tmp_value); Py_XDECREF(tmp_tb); #else PyErr_SetExcInfo(local_type, local_value, local_tb); #endif return 0; bad: *type = 0; *value = 0; *tb = 0; Py_XDECREF(local_type); Py_XDECREF(local_value); Py_XDECREF(local_tb); return -1; } /* SwapException */ #if CYTHON_FAST_THREAD_STATE static CYTHON_INLINE void __Pyx__ExceptionSwap(PyThreadState *tstate, PyObject **type, PyObject **value, PyObject **tb) { PyObject *tmp_type, *tmp_value, *tmp_tb; #if CYTHON_USE_EXC_INFO_STACK _PyErr_StackItem *exc_info = tstate->exc_info; tmp_type = exc_info->exc_type; tmp_value = exc_info->exc_value; tmp_tb = exc_info->exc_traceback; exc_info->exc_type = *type; exc_info->exc_value = *value; exc_info->exc_traceback = *tb; #else tmp_type = tstate->exc_type; tmp_value = tstate->exc_value; tmp_tb = tstate->exc_traceback; tstate->exc_type = *type; tstate->exc_value = *value; tstate->exc_traceback = *tb; #endif *type = tmp_type; *value = tmp_value; *tb = tmp_tb; } #else static CYTHON_INLINE void __Pyx_ExceptionSwap(PyObject **type, PyObject **value, PyObject **tb) { PyObject *tmp_type, *tmp_value, *tmp_tb; PyErr_GetExcInfo(&tmp_type, &tmp_value, &tmp_tb); PyErr_SetExcInfo(*type, *value, *tb); *type = tmp_type; *value = tmp_value; *tb = tmp_tb; } #endif /* GetTopmostException */ #if CYTHON_USE_EXC_INFO_STACK static _PyErr_StackItem * __Pyx_PyErr_GetTopmostException(PyThreadState *tstate) { _PyErr_StackItem *exc_info = tstate->exc_info; while ((exc_info->exc_type == NULL || exc_info->exc_type == Py_None) && exc_info->previous_item != NULL) { exc_info = exc_info->previous_item; } return exc_info; } #endif /* SaveResetException */ #if CYTHON_FAST_THREAD_STATE static CYTHON_INLINE void __Pyx__ExceptionSave(PyThreadState *tstate, PyObject **type, PyObject **value, PyObject **tb) { #if CYTHON_USE_EXC_INFO_STACK _PyErr_StackItem *exc_info = __Pyx_PyErr_GetTopmostException(tstate); *type = exc_info->exc_type; *value = exc_info->exc_value; *tb = exc_info->exc_traceback; #else *type = tstate->exc_type; *value = tstate->exc_value; *tb = tstate->exc_traceback; #endif Py_XINCREF(*type); Py_XINCREF(*value); Py_XINCREF(*tb); } static CYTHON_INLINE void __Pyx__ExceptionReset(PyThreadState *tstate, PyObject *type, PyObject *value, PyObject *tb) { PyObject *tmp_type, *tmp_value, *tmp_tb; #if CYTHON_USE_EXC_INFO_STACK _PyErr_StackItem *exc_info = tstate->exc_info; tmp_type = exc_info->exc_type; tmp_value = exc_info->exc_value; tmp_tb = exc_info->exc_traceback; exc_info->exc_type = type; exc_info->exc_value = value; exc_info->exc_traceback = tb; #else tmp_type = tstate->exc_type; tmp_value = tstate->exc_value; tmp_tb = 
tstate->exc_traceback; tstate->exc_type = type; tstate->exc_value = value; tstate->exc_traceback = tb; #endif Py_XDECREF(tmp_type); Py_XDECREF(tmp_value); Py_XDECREF(tmp_tb); } #endif /* decode_c_string */ static CYTHON_INLINE PyObject* __Pyx_decode_c_string( const char* cstring, Py_ssize_t start, Py_ssize_t stop, const char* encoding, const char* errors, PyObject* (*decode_func)(const char *s, Py_ssize_t size, const char *errors)) { Py_ssize_t length; if (unlikely((start < 0) | (stop < 0))) { size_t slen = strlen(cstring); if (unlikely(slen > (size_t) PY_SSIZE_T_MAX)) { PyErr_SetString(PyExc_OverflowError, "c-string too long to convert to Python"); return NULL; } length = (Py_ssize_t) slen; if (start < 0) { start += length; if (start < 0) start = 0; } if (stop < 0) stop += length; } length = stop - start; if (unlikely(length <= 0)) return PyUnicode_FromUnicode(NULL, 0); cstring += start; if (decode_func) { return decode_func(cstring, length, errors); } else { return PyUnicode_Decode(cstring, length, encoding, errors); } } /* UnpackUnboundCMethod */ static int __Pyx_TryUnpackUnboundCMethod(__Pyx_CachedCFunction* target) { PyObject *method; method = __Pyx_PyObject_GetAttrStr(target->type, *target->method_name); if (unlikely(!method)) return -1; target->method = method; #if CYTHON_COMPILING_IN_CPYTHON #if PY_MAJOR_VERSION >= 3 if (likely(__Pyx_TypeCheck(method, &PyMethodDescr_Type))) #endif { PyMethodDescrObject *descr = (PyMethodDescrObject*) method; target->func = descr->d_method->ml_meth; target->flag = descr->d_method->ml_flags & ~(METH_CLASS | METH_STATIC | METH_COEXIST | METH_STACKLESS); } #endif return 0; } /* CallUnboundCMethod1 */ #if CYTHON_COMPILING_IN_CPYTHON static CYTHON_INLINE PyObject* __Pyx_CallUnboundCMethod1(__Pyx_CachedCFunction* cfunc, PyObject* self, PyObject* arg) { if (likely(cfunc->func)) { int flag = cfunc->flag; if (flag == METH_O) { return (*(cfunc->func))(self, arg); } else if (PY_VERSION_HEX >= 0x030600B1 && flag == METH_FASTCALL) { if (PY_VERSION_HEX >= 0x030700A0) { return (*(__Pyx_PyCFunctionFast)(void*)(PyCFunction)cfunc->func)(self, &arg, 1); } else { return (*(__Pyx_PyCFunctionFastWithKeywords)(void*)(PyCFunction)cfunc->func)(self, &arg, 1, NULL); } } else if (PY_VERSION_HEX >= 0x030700A0 && flag == (METH_FASTCALL | METH_KEYWORDS)) { return (*(__Pyx_PyCFunctionFastWithKeywords)(void*)(PyCFunction)cfunc->func)(self, &arg, 1, NULL); } } return __Pyx__CallUnboundCMethod1(cfunc, self, arg); } #endif static PyObject* __Pyx__CallUnboundCMethod1(__Pyx_CachedCFunction* cfunc, PyObject* self, PyObject* arg){ PyObject *args, *result = NULL; if (unlikely(!cfunc->func && !cfunc->method) && unlikely(__Pyx_TryUnpackUnboundCMethod(cfunc) < 0)) return NULL; #if CYTHON_COMPILING_IN_CPYTHON if (cfunc->func && (cfunc->flag & METH_VARARGS)) { args = PyTuple_New(1); if (unlikely(!args)) goto bad; Py_INCREF(arg); PyTuple_SET_ITEM(args, 0, arg); if (cfunc->flag & METH_KEYWORDS) result = (*(PyCFunctionWithKeywords)(void*)(PyCFunction)cfunc->func)(self, args, NULL); else result = (*cfunc->func)(self, args); } else { args = PyTuple_New(2); if (unlikely(!args)) goto bad; Py_INCREF(self); PyTuple_SET_ITEM(args, 0, self); Py_INCREF(arg); PyTuple_SET_ITEM(args, 1, arg); result = __Pyx_PyObject_Call(cfunc->method, args, NULL); } #else args = PyTuple_Pack(2, self, arg); if (unlikely(!args)) goto bad; result = __Pyx_PyObject_Call(cfunc->method, args, NULL); #endif bad: Py_XDECREF(args); return result; } /* Import */ static PyObject *__Pyx_Import(PyObject *name, PyObject *from_list, int 
level) { PyObject *empty_list = 0; PyObject *module = 0; PyObject *global_dict = 0; PyObject *empty_dict = 0; PyObject *list; #if PY_MAJOR_VERSION < 3 PyObject *py_import; py_import = __Pyx_PyObject_GetAttrStr(__pyx_b, __pyx_n_s_import); if (!py_import) goto bad; #endif if (from_list) list = from_list; else { empty_list = PyList_New(0); if (!empty_list) goto bad; list = empty_list; } global_dict = PyModule_GetDict(__pyx_m); if (!global_dict) goto bad; empty_dict = PyDict_New(); if (!empty_dict) goto bad; { #if PY_MAJOR_VERSION >= 3 if (level == -1) { if (strchr(__Pyx_MODULE_NAME, '.')) { module = PyImport_ImportModuleLevelObject( name, global_dict, empty_dict, list, 1); if (!module) { if (!PyErr_ExceptionMatches(PyExc_ImportError)) goto bad; PyErr_Clear(); } } level = 0; } #endif if (!module) { #if PY_MAJOR_VERSION < 3 PyObject *py_level = PyInt_FromLong(level); if (!py_level) goto bad; module = PyObject_CallFunctionObjArgs(py_import, name, global_dict, empty_dict, list, py_level, (PyObject *)NULL); Py_DECREF(py_level); #else module = PyImport_ImportModuleLevelObject( name, global_dict, empty_dict, list, level); #endif } } bad: #if PY_MAJOR_VERSION < 3 Py_XDECREF(py_import); #endif Py_XDECREF(empty_list); Py_XDECREF(empty_dict); return module; } /* ImportFrom */ static PyObject* __Pyx_ImportFrom(PyObject* module, PyObject* name) { PyObject* value = __Pyx_PyObject_GetAttrStr(module, name); if (unlikely(!value) && PyErr_ExceptionMatches(PyExc_AttributeError)) { PyErr_Format(PyExc_ImportError, #if PY_MAJOR_VERSION < 3 "cannot import name %.230s", PyString_AS_STRING(name)); #else "cannot import name %S", name); #endif } return value; } /* HasAttr */ static CYTHON_INLINE int __Pyx_HasAttr(PyObject *o, PyObject *n) { PyObject *r; if (unlikely(!__Pyx_PyBaseString_Check(n))) { PyErr_SetString(PyExc_TypeError, "hasattr(): attribute name must be string"); return -1; } r = __Pyx_GetAttr(o, n); if (unlikely(!r)) { PyErr_Clear(); return 0; } else { Py_DECREF(r); return 1; } } /* PyObject_GenericGetAttrNoDict */ #if CYTHON_USE_TYPE_SLOTS && CYTHON_USE_PYTYPE_LOOKUP && PY_VERSION_HEX < 0x03070000 static PyObject *__Pyx_RaiseGenericGetAttributeError(PyTypeObject *tp, PyObject *attr_name) { PyErr_Format(PyExc_AttributeError, #if PY_MAJOR_VERSION >= 3 "'%.50s' object has no attribute '%U'", tp->tp_name, attr_name); #else "'%.50s' object has no attribute '%.400s'", tp->tp_name, PyString_AS_STRING(attr_name)); #endif return NULL; } static CYTHON_INLINE PyObject* __Pyx_PyObject_GenericGetAttrNoDict(PyObject* obj, PyObject* attr_name) { PyObject *descr; PyTypeObject *tp = Py_TYPE(obj); if (unlikely(!PyString_Check(attr_name))) { return PyObject_GenericGetAttr(obj, attr_name); } assert(!tp->tp_dictoffset); descr = _PyType_Lookup(tp, attr_name); if (unlikely(!descr)) { return __Pyx_RaiseGenericGetAttributeError(tp, attr_name); } Py_INCREF(descr); #if PY_MAJOR_VERSION < 3 if (likely(PyType_HasFeature(Py_TYPE(descr), Py_TPFLAGS_HAVE_CLASS))) #endif { descrgetfunc f = Py_TYPE(descr)->tp_descr_get; if (unlikely(f)) { PyObject *res = f(descr, obj, (PyObject *)tp); Py_DECREF(descr); return res; } } return descr; } #endif /* PyObject_GenericGetAttr */ #if CYTHON_USE_TYPE_SLOTS && CYTHON_USE_PYTYPE_LOOKUP && PY_VERSION_HEX < 0x03070000 static PyObject* __Pyx_PyObject_GenericGetAttr(PyObject* obj, PyObject* attr_name) { if (unlikely(Py_TYPE(obj)->tp_dictoffset)) { return PyObject_GenericGetAttr(obj, attr_name); } return __Pyx_PyObject_GenericGetAttrNoDict(obj, attr_name); } #endif /* SetupReduce */ static int 
__Pyx_setup_reduce_is_named(PyObject* meth, PyObject* name) { int ret; PyObject *name_attr; name_attr = __Pyx_PyObject_GetAttrStr(meth, __pyx_n_s_name); if (likely(name_attr)) { ret = PyObject_RichCompareBool(name_attr, name, Py_EQ); } else { ret = -1; } if (unlikely(ret < 0)) { PyErr_Clear(); ret = 0; } Py_XDECREF(name_attr); return ret; } static int __Pyx_setup_reduce(PyObject* type_obj) { int ret = 0; PyObject *object_reduce = NULL; PyObject *object_reduce_ex = NULL; PyObject *reduce = NULL; PyObject *reduce_ex = NULL; PyObject *reduce_cython = NULL; PyObject *setstate = NULL; PyObject *setstate_cython = NULL; #if CYTHON_USE_PYTYPE_LOOKUP if (_PyType_Lookup((PyTypeObject*)type_obj, __pyx_n_s_getstate)) goto GOOD; #else if (PyObject_HasAttr(type_obj, __pyx_n_s_getstate)) goto GOOD; #endif #if CYTHON_USE_PYTYPE_LOOKUP object_reduce_ex = _PyType_Lookup(&PyBaseObject_Type, __pyx_n_s_reduce_ex); if (!object_reduce_ex) goto BAD; #else object_reduce_ex = __Pyx_PyObject_GetAttrStr((PyObject*)&PyBaseObject_Type, __pyx_n_s_reduce_ex); if (!object_reduce_ex) goto BAD; #endif reduce_ex = __Pyx_PyObject_GetAttrStr(type_obj, __pyx_n_s_reduce_ex); if (unlikely(!reduce_ex)) goto BAD; if (reduce_ex == object_reduce_ex) { #if CYTHON_USE_PYTYPE_LOOKUP object_reduce = _PyType_Lookup(&PyBaseObject_Type, __pyx_n_s_reduce); if (!object_reduce) goto BAD; #else object_reduce = __Pyx_PyObject_GetAttrStr((PyObject*)&PyBaseObject_Type, __pyx_n_s_reduce); if (!object_reduce) goto BAD; #endif reduce = __Pyx_PyObject_GetAttrStr(type_obj, __pyx_n_s_reduce); if (unlikely(!reduce)) goto BAD; if (reduce == object_reduce || __Pyx_setup_reduce_is_named(reduce, __pyx_n_s_reduce_cython)) { reduce_cython = __Pyx_PyObject_GetAttrStr(type_obj, __pyx_n_s_reduce_cython); if (unlikely(!reduce_cython)) goto BAD; ret = PyDict_SetItem(((PyTypeObject*)type_obj)->tp_dict, __pyx_n_s_reduce, reduce_cython); if (unlikely(ret < 0)) goto BAD; ret = PyDict_DelItem(((PyTypeObject*)type_obj)->tp_dict, __pyx_n_s_reduce_cython); if (unlikely(ret < 0)) goto BAD; setstate = __Pyx_PyObject_GetAttrStr(type_obj, __pyx_n_s_setstate); if (!setstate) PyErr_Clear(); if (!setstate || __Pyx_setup_reduce_is_named(setstate, __pyx_n_s_setstate_cython)) { setstate_cython = __Pyx_PyObject_GetAttrStr(type_obj, __pyx_n_s_setstate_cython); if (unlikely(!setstate_cython)) goto BAD; ret = PyDict_SetItem(((PyTypeObject*)type_obj)->tp_dict, __pyx_n_s_setstate, setstate_cython); if (unlikely(ret < 0)) goto BAD; ret = PyDict_DelItem(((PyTypeObject*)type_obj)->tp_dict, __pyx_n_s_setstate_cython); if (unlikely(ret < 0)) goto BAD; } PyType_Modified((PyTypeObject*)type_obj); } } goto GOOD; BAD: if (!PyErr_Occurred()) PyErr_Format(PyExc_RuntimeError, "Unable to initialize pickling for %s", ((PyTypeObject*)type_obj)->tp_name); ret = -1; GOOD: #if !CYTHON_USE_PYTYPE_LOOKUP Py_XDECREF(object_reduce); Py_XDECREF(object_reduce_ex); #endif Py_XDECREF(reduce); Py_XDECREF(reduce_ex); Py_XDECREF(reduce_cython); Py_XDECREF(setstate); Py_XDECREF(setstate_cython); return ret; } /* SetVTable */ static int __Pyx_SetVtable(PyObject *dict, void *vtable) { #if PY_VERSION_HEX >= 0x02070000 PyObject *ob = PyCapsule_New(vtable, 0, 0); #else PyObject *ob = PyCObject_FromVoidPtr(vtable, 0); #endif if (!ob) goto bad; if (PyDict_SetItem(dict, __pyx_n_s_pyx_vtable, ob) < 0) goto bad; Py_DECREF(ob); return 0; bad: Py_XDECREF(ob); return -1; } /* TypeImport */ #ifndef __PYX_HAVE_RT_ImportType #define __PYX_HAVE_RT_ImportType static PyTypeObject *__Pyx_ImportType(PyObject *module, const char 
*module_name, const char *class_name, size_t size, enum __Pyx_ImportType_CheckSize check_size) { PyObject *result = 0; char warning[200]; Py_ssize_t basicsize; #ifdef Py_LIMITED_API PyObject *py_basicsize; #endif result = PyObject_GetAttrString(module, class_name); if (!result) goto bad; if (!PyType_Check(result)) { PyErr_Format(PyExc_TypeError, "%.200s.%.200s is not a type object", module_name, class_name); goto bad; } #ifndef Py_LIMITED_API basicsize = ((PyTypeObject *)result)->tp_basicsize; #else py_basicsize = PyObject_GetAttrString(result, "__basicsize__"); if (!py_basicsize) goto bad; basicsize = PyLong_AsSsize_t(py_basicsize); Py_DECREF(py_basicsize); py_basicsize = 0; if (basicsize == (Py_ssize_t)-1 && PyErr_Occurred()) goto bad; #endif if ((size_t)basicsize < size) { PyErr_Format(PyExc_ValueError, "%.200s.%.200s size changed, may indicate binary incompatibility. " "Expected %zd from C header, got %zd from PyObject", module_name, class_name, size, basicsize); goto bad; } if (check_size == __Pyx_ImportType_CheckSize_Error && (size_t)basicsize != size) { PyErr_Format(PyExc_ValueError, "%.200s.%.200s size changed, may indicate binary incompatibility. " "Expected %zd from C header, got %zd from PyObject", module_name, class_name, size, basicsize); goto bad; } else if (check_size == __Pyx_ImportType_CheckSize_Warn && (size_t)basicsize > size) { PyOS_snprintf(warning, sizeof(warning), "%s.%s size changed, may indicate binary incompatibility. " "Expected %zd from C header, got %zd from PyObject", module_name, class_name, size, basicsize); if (PyErr_WarnEx(NULL, warning, 0) < 0) goto bad; } return (PyTypeObject *)result; bad: Py_XDECREF(result); return NULL; } #endif /* CLineInTraceback */ #ifndef CYTHON_CLINE_IN_TRACEBACK static int __Pyx_CLineForTraceback(PyThreadState *tstate, int c_line) { PyObject *use_cline; PyObject *ptype, *pvalue, *ptraceback; #if CYTHON_COMPILING_IN_CPYTHON PyObject **cython_runtime_dict; #endif if (unlikely(!__pyx_cython_runtime)) { return c_line; } __Pyx_ErrFetchInState(tstate, &ptype, &pvalue, &ptraceback); #if CYTHON_COMPILING_IN_CPYTHON cython_runtime_dict = _PyObject_GetDictPtr(__pyx_cython_runtime); if (likely(cython_runtime_dict)) { __PYX_PY_DICT_LOOKUP_IF_MODIFIED( use_cline, *cython_runtime_dict, __Pyx_PyDict_GetItemStr(*cython_runtime_dict, __pyx_n_s_cline_in_traceback)) } else #endif { PyObject *use_cline_obj = __Pyx_PyObject_GetAttrStr(__pyx_cython_runtime, __pyx_n_s_cline_in_traceback); if (use_cline_obj) { use_cline = PyObject_Not(use_cline_obj) ? 
Py_False : Py_True; Py_DECREF(use_cline_obj); } else { PyErr_Clear(); use_cline = NULL; } } if (!use_cline) { c_line = 0; PyObject_SetAttr(__pyx_cython_runtime, __pyx_n_s_cline_in_traceback, Py_False); } else if (use_cline == Py_False || (use_cline != Py_True && PyObject_Not(use_cline) != 0)) { c_line = 0; } __Pyx_ErrRestoreInState(tstate, ptype, pvalue, ptraceback); return c_line; } #endif /* CodeObjectCache */ static int __pyx_bisect_code_objects(__Pyx_CodeObjectCacheEntry* entries, int count, int code_line) { int start = 0, mid = 0, end = count - 1; if (end >= 0 && code_line > entries[end].code_line) { return count; } while (start < end) { mid = start + (end - start) / 2; if (code_line < entries[mid].code_line) { end = mid; } else if (code_line > entries[mid].code_line) { start = mid + 1; } else { return mid; } } if (code_line <= entries[mid].code_line) { return mid; } else { return mid + 1; } } static PyCodeObject *__pyx_find_code_object(int code_line) { PyCodeObject* code_object; int pos; if (unlikely(!code_line) || unlikely(!__pyx_code_cache.entries)) { return NULL; } pos = __pyx_bisect_code_objects(__pyx_code_cache.entries, __pyx_code_cache.count, code_line); if (unlikely(pos >= __pyx_code_cache.count) || unlikely(__pyx_code_cache.entries[pos].code_line != code_line)) { return NULL; } code_object = __pyx_code_cache.entries[pos].code_object; Py_INCREF(code_object); return code_object; } static void __pyx_insert_code_object(int code_line, PyCodeObject* code_object) { int pos, i; __Pyx_CodeObjectCacheEntry* entries = __pyx_code_cache.entries; if (unlikely(!code_line)) { return; } if (unlikely(!entries)) { entries = (__Pyx_CodeObjectCacheEntry*)PyMem_Malloc(64*sizeof(__Pyx_CodeObjectCacheEntry)); if (likely(entries)) { __pyx_code_cache.entries = entries; __pyx_code_cache.max_count = 64; __pyx_code_cache.count = 1; entries[0].code_line = code_line; entries[0].code_object = code_object; Py_INCREF(code_object); } return; } pos = __pyx_bisect_code_objects(__pyx_code_cache.entries, __pyx_code_cache.count, code_line); if ((pos < __pyx_code_cache.count) && unlikely(__pyx_code_cache.entries[pos].code_line == code_line)) { PyCodeObject* tmp = entries[pos].code_object; entries[pos].code_object = code_object; Py_DECREF(tmp); return; } if (__pyx_code_cache.count == __pyx_code_cache.max_count) { int new_max = __pyx_code_cache.max_count + 64; entries = (__Pyx_CodeObjectCacheEntry*)PyMem_Realloc( __pyx_code_cache.entries, (size_t)new_max*sizeof(__Pyx_CodeObjectCacheEntry)); if (unlikely(!entries)) { return; } __pyx_code_cache.entries = entries; __pyx_code_cache.max_count = new_max; } for (i=__pyx_code_cache.count; i>pos; i--) { entries[i] = entries[i-1]; } entries[pos].code_line = code_line; entries[pos].code_object = code_object; __pyx_code_cache.count++; Py_INCREF(code_object); } /* AddTraceback */ #include "compile.h" #include "frameobject.h" #include "traceback.h" static PyCodeObject* __Pyx_CreateCodeObjectForTraceback( const char *funcname, int c_line, int py_line, const char *filename) { PyCodeObject *py_code = 0; PyObject *py_srcfile = 0; PyObject *py_funcname = 0; #if PY_MAJOR_VERSION < 3 py_srcfile = PyString_FromString(filename); #else py_srcfile = PyUnicode_FromString(filename); #endif if (!py_srcfile) goto bad; if (c_line) { #if PY_MAJOR_VERSION < 3 py_funcname = PyString_FromFormat( "%s (%s:%d)", funcname, __pyx_cfilenm, c_line); #else py_funcname = PyUnicode_FromFormat( "%s (%s:%d)", funcname, __pyx_cfilenm, c_line); #endif } else { #if PY_MAJOR_VERSION < 3 py_funcname = 
PyString_FromString(funcname); #else py_funcname = PyUnicode_FromString(funcname); #endif } if (!py_funcname) goto bad; py_code = __Pyx_PyCode_New( 0, 0, 0, 0, 0, __pyx_empty_bytes, /*PyObject *code,*/ __pyx_empty_tuple, /*PyObject *consts,*/ __pyx_empty_tuple, /*PyObject *names,*/ __pyx_empty_tuple, /*PyObject *varnames,*/ __pyx_empty_tuple, /*PyObject *freevars,*/ __pyx_empty_tuple, /*PyObject *cellvars,*/ py_srcfile, /*PyObject *filename,*/ py_funcname, /*PyObject *name,*/ py_line, __pyx_empty_bytes /*PyObject *lnotab*/ ); Py_DECREF(py_srcfile); Py_DECREF(py_funcname); return py_code; bad: Py_XDECREF(py_srcfile); Py_XDECREF(py_funcname); return NULL; } static void __Pyx_AddTraceback(const char *funcname, int c_line, int py_line, const char *filename) { PyCodeObject *py_code = 0; PyFrameObject *py_frame = 0; PyThreadState *tstate = __Pyx_PyThreadState_Current; if (c_line) { c_line = __Pyx_CLineForTraceback(tstate, c_line); } py_code = __pyx_find_code_object(c_line ? -c_line : py_line); if (!py_code) { py_code = __Pyx_CreateCodeObjectForTraceback( funcname, c_line, py_line, filename); if (!py_code) goto bad; __pyx_insert_code_object(c_line ? -c_line : py_line, py_code); } py_frame = PyFrame_New( tstate, /*PyThreadState *tstate,*/ py_code, /*PyCodeObject *code,*/ __pyx_d, /*PyObject *globals,*/ 0 /*PyObject *locals*/ ); if (!py_frame) goto bad; __Pyx_PyFrame_SetLineNumber(py_frame, py_line); PyTraceBack_Here(py_frame); bad: Py_XDECREF(py_code); Py_XDECREF(py_frame); } /* CIntToPy */ static CYTHON_INLINE PyObject* __Pyx_PyInt_From_int(int value) { const int neg_one = (int) ((int) 0 - (int) 1), const_zero = (int) 0; const int is_unsigned = neg_one > const_zero; if (is_unsigned) { if (sizeof(int) < sizeof(long)) { return PyInt_FromLong((long) value); } else if (sizeof(int) <= sizeof(unsigned long)) { return PyLong_FromUnsignedLong((unsigned long) value); #ifdef HAVE_LONG_LONG } else if (sizeof(int) <= sizeof(unsigned PY_LONG_LONG)) { return PyLong_FromUnsignedLongLong((unsigned PY_LONG_LONG) value); #endif } } else { if (sizeof(int) <= sizeof(long)) { return PyInt_FromLong((long) value); #ifdef HAVE_LONG_LONG } else if (sizeof(int) <= sizeof(PY_LONG_LONG)) { return PyLong_FromLongLong((PY_LONG_LONG) value); #endif } } { int one = 1; int little = (int)*(unsigned char *)&one; unsigned char *bytes = (unsigned char *)&value; return _PyLong_FromByteArray(bytes, sizeof(int), little, !is_unsigned); } } /* CIntFromPyVerify */ #define __PYX_VERIFY_RETURN_INT(target_type, func_type, func_value)\ __PYX__VERIFY_RETURN_INT(target_type, func_type, func_value, 0) #define __PYX_VERIFY_RETURN_INT_EXC(target_type, func_type, func_value)\ __PYX__VERIFY_RETURN_INT(target_type, func_type, func_value, 1) #define __PYX__VERIFY_RETURN_INT(target_type, func_type, func_value, exc)\ {\ func_type value = func_value;\ if (sizeof(target_type) < sizeof(func_type)) {\ if (unlikely(value != (func_type) (target_type) value)) {\ func_type zero = 0;\ if (exc && unlikely(value == (func_type)-1 && PyErr_Occurred()))\ return (target_type) -1;\ if (is_unsigned && unlikely(value < zero))\ goto raise_neg_overflow;\ else\ goto raise_overflow;\ }\ }\ return (target_type) value;\ } /* CIntToPy */ static CYTHON_INLINE PyObject* __Pyx_PyInt_From_unsigned_int(unsigned int value) { const unsigned int neg_one = (unsigned int) ((unsigned int) 0 - (unsigned int) 1), const_zero = (unsigned int) 0; const int is_unsigned = neg_one > const_zero; if (is_unsigned) { if (sizeof(unsigned int) < sizeof(long)) { return PyInt_FromLong((long) value); } 
else if (sizeof(unsigned int) <= sizeof(unsigned long)) { return PyLong_FromUnsignedLong((unsigned long) value); #ifdef HAVE_LONG_LONG } else if (sizeof(unsigned int) <= sizeof(unsigned PY_LONG_LONG)) { return PyLong_FromUnsignedLongLong((unsigned PY_LONG_LONG) value); #endif } } else { if (sizeof(unsigned int) <= sizeof(long)) { return PyInt_FromLong((long) value); #ifdef HAVE_LONG_LONG } else if (sizeof(unsigned int) <= sizeof(PY_LONG_LONG)) { return PyLong_FromLongLong((PY_LONG_LONG) value); #endif } } { int one = 1; int little = (int)*(unsigned char *)&one; unsigned char *bytes = (unsigned char *)&value; return _PyLong_FromByteArray(bytes, sizeof(unsigned int), little, !is_unsigned); } } /* CIntToPy */ static CYTHON_INLINE PyObject* __Pyx_PyInt_From_unsigned_short(unsigned short value) { const unsigned short neg_one = (unsigned short) ((unsigned short) 0 - (unsigned short) 1), const_zero = (unsigned short) 0; const int is_unsigned = neg_one > const_zero; if (is_unsigned) { if (sizeof(unsigned short) < sizeof(long)) { return PyInt_FromLong((long) value); } else if (sizeof(unsigned short) <= sizeof(unsigned long)) { return PyLong_FromUnsignedLong((unsigned long) value); #ifdef HAVE_LONG_LONG } else if (sizeof(unsigned short) <= sizeof(unsigned PY_LONG_LONG)) { return PyLong_FromUnsignedLongLong((unsigned PY_LONG_LONG) value); #endif } } else { if (sizeof(unsigned short) <= sizeof(long)) { return PyInt_FromLong((long) value); #ifdef HAVE_LONG_LONG } else if (sizeof(unsigned short) <= sizeof(PY_LONG_LONG)) { return PyLong_FromLongLong((PY_LONG_LONG) value); #endif } } { int one = 1; int little = (int)*(unsigned char *)&one; unsigned char *bytes = (unsigned char *)&value; return _PyLong_FromByteArray(bytes, sizeof(unsigned short), little, !is_unsigned); } } /* CIntToPy */ static CYTHON_INLINE PyObject* __Pyx_PyInt_From_long(long value) { const long neg_one = (long) ((long) 0 - (long) 1), const_zero = (long) 0; const int is_unsigned = neg_one > const_zero; if (is_unsigned) { if (sizeof(long) < sizeof(long)) { return PyInt_FromLong((long) value); } else if (sizeof(long) <= sizeof(unsigned long)) { return PyLong_FromUnsignedLong((unsigned long) value); #ifdef HAVE_LONG_LONG } else if (sizeof(long) <= sizeof(unsigned PY_LONG_LONG)) { return PyLong_FromUnsignedLongLong((unsigned PY_LONG_LONG) value); #endif } } else { if (sizeof(long) <= sizeof(long)) { return PyInt_FromLong((long) value); #ifdef HAVE_LONG_LONG } else if (sizeof(long) <= sizeof(PY_LONG_LONG)) { return PyLong_FromLongLong((PY_LONG_LONG) value); #endif } } { int one = 1; int little = (int)*(unsigned char *)&one; unsigned char *bytes = (unsigned char *)&value; return _PyLong_FromByteArray(bytes, sizeof(long), little, !is_unsigned); } } /* CIntToPy */ static CYTHON_INLINE PyObject* __Pyx_PyInt_From_uint16_t(uint16_t value) { const uint16_t neg_one = (uint16_t) ((uint16_t) 0 - (uint16_t) 1), const_zero = (uint16_t) 0; const int is_unsigned = neg_one > const_zero; if (is_unsigned) { if (sizeof(uint16_t) < sizeof(long)) { return PyInt_FromLong((long) value); } else if (sizeof(uint16_t) <= sizeof(unsigned long)) { return PyLong_FromUnsignedLong((unsigned long) value); #ifdef HAVE_LONG_LONG } else if (sizeof(uint16_t) <= sizeof(unsigned PY_LONG_LONG)) { return PyLong_FromUnsignedLongLong((unsigned PY_LONG_LONG) value); #endif } } else { if (sizeof(uint16_t) <= sizeof(long)) { return PyInt_FromLong((long) value); #ifdef HAVE_LONG_LONG } else if (sizeof(uint16_t) <= sizeof(PY_LONG_LONG)) { return PyLong_FromLongLong((PY_LONG_LONG) value); 
#endif } } { int one = 1; int little = (int)*(unsigned char *)&one; unsigned char *bytes = (unsigned char *)&value; return _PyLong_FromByteArray(bytes, sizeof(uint16_t), little, !is_unsigned); } } /* CIntFromPy */ static CYTHON_INLINE int __Pyx_PyInt_As_int(PyObject *x) { const int neg_one = (int) ((int) 0 - (int) 1), const_zero = (int) 0; const int is_unsigned = neg_one > const_zero; #if PY_MAJOR_VERSION < 3 if (likely(PyInt_Check(x))) { if (sizeof(int) < sizeof(long)) { __PYX_VERIFY_RETURN_INT(int, long, PyInt_AS_LONG(x)) } else { long val = PyInt_AS_LONG(x); if (is_unsigned && unlikely(val < 0)) { goto raise_neg_overflow; } return (int) val; } } else #endif if (likely(PyLong_Check(x))) { if (is_unsigned) { #if CYTHON_USE_PYLONG_INTERNALS const digit* digits = ((PyLongObject*)x)->ob_digit; switch (Py_SIZE(x)) { case 0: return (int) 0; case 1: __PYX_VERIFY_RETURN_INT(int, digit, digits[0]) case 2: if (8 * sizeof(int) > 1 * PyLong_SHIFT) { if (8 * sizeof(unsigned long) > 2 * PyLong_SHIFT) { __PYX_VERIFY_RETURN_INT(int, unsigned long, (((((unsigned long)digits[1]) << PyLong_SHIFT) | (unsigned long)digits[0]))) } else if (8 * sizeof(int) >= 2 * PyLong_SHIFT) { return (int) (((((int)digits[1]) << PyLong_SHIFT) | (int)digits[0])); } } break; case 3: if (8 * sizeof(int) > 2 * PyLong_SHIFT) { if (8 * sizeof(unsigned long) > 3 * PyLong_SHIFT) { __PYX_VERIFY_RETURN_INT(int, unsigned long, (((((((unsigned long)digits[2]) << PyLong_SHIFT) | (unsigned long)digits[1]) << PyLong_SHIFT) | (unsigned long)digits[0]))) } else if (8 * sizeof(int) >= 3 * PyLong_SHIFT) { return (int) (((((((int)digits[2]) << PyLong_SHIFT) | (int)digits[1]) << PyLong_SHIFT) | (int)digits[0])); } } break; case 4: if (8 * sizeof(int) > 3 * PyLong_SHIFT) { if (8 * sizeof(unsigned long) > 4 * PyLong_SHIFT) { __PYX_VERIFY_RETURN_INT(int, unsigned long, (((((((((unsigned long)digits[3]) << PyLong_SHIFT) | (unsigned long)digits[2]) << PyLong_SHIFT) | (unsigned long)digits[1]) << PyLong_SHIFT) | (unsigned long)digits[0]))) } else if (8 * sizeof(int) >= 4 * PyLong_SHIFT) { return (int) (((((((((int)digits[3]) << PyLong_SHIFT) | (int)digits[2]) << PyLong_SHIFT) | (int)digits[1]) << PyLong_SHIFT) | (int)digits[0])); } } break; } #endif #if CYTHON_COMPILING_IN_CPYTHON if (unlikely(Py_SIZE(x) < 0)) { goto raise_neg_overflow; } #else { int result = PyObject_RichCompareBool(x, Py_False, Py_LT); if (unlikely(result < 0)) return (int) -1; if (unlikely(result == 1)) goto raise_neg_overflow; } #endif if (sizeof(int) <= sizeof(unsigned long)) { __PYX_VERIFY_RETURN_INT_EXC(int, unsigned long, PyLong_AsUnsignedLong(x)) #ifdef HAVE_LONG_LONG } else if (sizeof(int) <= sizeof(unsigned PY_LONG_LONG)) { __PYX_VERIFY_RETURN_INT_EXC(int, unsigned PY_LONG_LONG, PyLong_AsUnsignedLongLong(x)) #endif } } else { #if CYTHON_USE_PYLONG_INTERNALS const digit* digits = ((PyLongObject*)x)->ob_digit; switch (Py_SIZE(x)) { case 0: return (int) 0; case -1: __PYX_VERIFY_RETURN_INT(int, sdigit, (sdigit) (-(sdigit)digits[0])) case 1: __PYX_VERIFY_RETURN_INT(int, digit, +digits[0]) case -2: if (8 * sizeof(int) - 1 > 1 * PyLong_SHIFT) { if (8 * sizeof(unsigned long) > 2 * PyLong_SHIFT) { __PYX_VERIFY_RETURN_INT(int, long, -(long) (((((unsigned long)digits[1]) << PyLong_SHIFT) | (unsigned long)digits[0]))) } else if (8 * sizeof(int) - 1 > 2 * PyLong_SHIFT) { return (int) (((int)-1)*(((((int)digits[1]) << PyLong_SHIFT) | (int)digits[0]))); } } break; case 2: if (8 * sizeof(int) > 1 * PyLong_SHIFT) { if (8 * sizeof(unsigned long) > 2 * PyLong_SHIFT) { 
__PYX_VERIFY_RETURN_INT(int, unsigned long, (((((unsigned long)digits[1]) << PyLong_SHIFT) | (unsigned long)digits[0]))) } else if (8 * sizeof(int) - 1 > 2 * PyLong_SHIFT) { return (int) ((((((int)digits[1]) << PyLong_SHIFT) | (int)digits[0]))); } } break; case -3: if (8 * sizeof(int) - 1 > 2 * PyLong_SHIFT) { if (8 * sizeof(unsigned long) > 3 * PyLong_SHIFT) { __PYX_VERIFY_RETURN_INT(int, long, -(long) (((((((unsigned long)digits[2]) << PyLong_SHIFT) | (unsigned long)digits[1]) << PyLong_SHIFT) | (unsigned long)digits[0]))) } else if (8 * sizeof(int) - 1 > 3 * PyLong_SHIFT) { return (int) (((int)-1)*(((((((int)digits[2]) << PyLong_SHIFT) | (int)digits[1]) << PyLong_SHIFT) | (int)digits[0]))); } } break; case 3: if (8 * sizeof(int) > 2 * PyLong_SHIFT) { if (8 * sizeof(unsigned long) > 3 * PyLong_SHIFT) { __PYX_VERIFY_RETURN_INT(int, unsigned long, (((((((unsigned long)digits[2]) << PyLong_SHIFT) | (unsigned long)digits[1]) << PyLong_SHIFT) | (unsigned long)digits[0]))) } else if (8 * sizeof(int) - 1 > 3 * PyLong_SHIFT) { return (int) ((((((((int)digits[2]) << PyLong_SHIFT) | (int)digits[1]) << PyLong_SHIFT) | (int)digits[0]))); } } break; case -4: if (8 * sizeof(int) - 1 > 3 * PyLong_SHIFT) { if (8 * sizeof(unsigned long) > 4 * PyLong_SHIFT) { __PYX_VERIFY_RETURN_INT(int, long, -(long) (((((((((unsigned long)digits[3]) << PyLong_SHIFT) | (unsigned long)digits[2]) << PyLong_SHIFT) | (unsigned long)digits[1]) << PyLong_SHIFT) | (unsigned long)digits[0]))) } else if (8 * sizeof(int) - 1 > 4 * PyLong_SHIFT) { return (int) (((int)-1)*(((((((((int)digits[3]) << PyLong_SHIFT) | (int)digits[2]) << PyLong_SHIFT) | (int)digits[1]) << PyLong_SHIFT) | (int)digits[0]))); } } break; case 4: if (8 * sizeof(int) > 3 * PyLong_SHIFT) { if (8 * sizeof(unsigned long) > 4 * PyLong_SHIFT) { __PYX_VERIFY_RETURN_INT(int, unsigned long, (((((((((unsigned long)digits[3]) << PyLong_SHIFT) | (unsigned long)digits[2]) << PyLong_SHIFT) | (unsigned long)digits[1]) << PyLong_SHIFT) | (unsigned long)digits[0]))) } else if (8 * sizeof(int) - 1 > 4 * PyLong_SHIFT) { return (int) ((((((((((int)digits[3]) << PyLong_SHIFT) | (int)digits[2]) << PyLong_SHIFT) | (int)digits[1]) << PyLong_SHIFT) | (int)digits[0]))); } } break; } #endif if (sizeof(int) <= sizeof(long)) { __PYX_VERIFY_RETURN_INT_EXC(int, long, PyLong_AsLong(x)) #ifdef HAVE_LONG_LONG } else if (sizeof(int) <= sizeof(PY_LONG_LONG)) { __PYX_VERIFY_RETURN_INT_EXC(int, PY_LONG_LONG, PyLong_AsLongLong(x)) #endif } } { #if CYTHON_COMPILING_IN_PYPY && !defined(_PyLong_AsByteArray) PyErr_SetString(PyExc_RuntimeError, "_PyLong_AsByteArray() not available in PyPy, cannot convert large numbers"); #else int val; PyObject *v = __Pyx_PyNumber_IntOrLong(x); #if PY_MAJOR_VERSION < 3 if (likely(v) && !PyLong_Check(v)) { PyObject *tmp = v; v = PyNumber_Long(tmp); Py_DECREF(tmp); } #endif if (likely(v)) { int one = 1; int is_little = (int)*(unsigned char *)&one; unsigned char *bytes = (unsigned char *)&val; int ret = _PyLong_AsByteArray((PyLongObject *)v, bytes, sizeof(val), is_little, !is_unsigned); Py_DECREF(v); if (likely(!ret)) return val; } #endif return (int) -1; } } else { int val; PyObject *tmp = __Pyx_PyNumber_IntOrLong(x); if (!tmp) return (int) -1; val = __Pyx_PyInt_As_int(tmp); Py_DECREF(tmp); return val; } raise_overflow: PyErr_SetString(PyExc_OverflowError, "value too large to convert to int"); return (int) -1; raise_neg_overflow: PyErr_SetString(PyExc_OverflowError, "can't convert negative value to int"); return (int) -1; } /* CIntFromPy */ static CYTHON_INLINE enum 
http_method __Pyx_PyInt_As_enum__http_method(PyObject *x) { const enum http_method neg_one = (enum http_method) ((enum http_method) 0 - (enum http_method) 1), const_zero = (enum http_method) 0; const int is_unsigned = neg_one > const_zero; #if PY_MAJOR_VERSION < 3 if (likely(PyInt_Check(x))) { if (sizeof(enum http_method) < sizeof(long)) { __PYX_VERIFY_RETURN_INT(enum http_method, long, PyInt_AS_LONG(x)) } else { long val = PyInt_AS_LONG(x); if (is_unsigned && unlikely(val < 0)) { goto raise_neg_overflow; } return (enum http_method) val; } } else #endif if (likely(PyLong_Check(x))) { if (is_unsigned) { #if CYTHON_USE_PYLONG_INTERNALS const digit* digits = ((PyLongObject*)x)->ob_digit; switch (Py_SIZE(x)) { case 0: return (enum http_method) 0; case 1: __PYX_VERIFY_RETURN_INT(enum http_method, digit, digits[0]) case 2: if (8 * sizeof(enum http_method) > 1 * PyLong_SHIFT) { if (8 * sizeof(unsigned long) > 2 * PyLong_SHIFT) { __PYX_VERIFY_RETURN_INT(enum http_method, unsigned long, (((((unsigned long)digits[1]) << PyLong_SHIFT) | (unsigned long)digits[0]))) } else if (8 * sizeof(enum http_method) >= 2 * PyLong_SHIFT) { return (enum http_method) (((((enum http_method)digits[1]) << PyLong_SHIFT) | (enum http_method)digits[0])); } } break; case 3: if (8 * sizeof(enum http_method) > 2 * PyLong_SHIFT) { if (8 * sizeof(unsigned long) > 3 * PyLong_SHIFT) { __PYX_VERIFY_RETURN_INT(enum http_method, unsigned long, (((((((unsigned long)digits[2]) << PyLong_SHIFT) | (unsigned long)digits[1]) << PyLong_SHIFT) | (unsigned long)digits[0]))) } else if (8 * sizeof(enum http_method) >= 3 * PyLong_SHIFT) { return (enum http_method) (((((((enum http_method)digits[2]) << PyLong_SHIFT) | (enum http_method)digits[1]) << PyLong_SHIFT) | (enum http_method)digits[0])); } } break; case 4: if (8 * sizeof(enum http_method) > 3 * PyLong_SHIFT) { if (8 * sizeof(unsigned long) > 4 * PyLong_SHIFT) { __PYX_VERIFY_RETURN_INT(enum http_method, unsigned long, (((((((((unsigned long)digits[3]) << PyLong_SHIFT) | (unsigned long)digits[2]) << PyLong_SHIFT) | (unsigned long)digits[1]) << PyLong_SHIFT) | (unsigned long)digits[0]))) } else if (8 * sizeof(enum http_method) >= 4 * PyLong_SHIFT) { return (enum http_method) (((((((((enum http_method)digits[3]) << PyLong_SHIFT) | (enum http_method)digits[2]) << PyLong_SHIFT) | (enum http_method)digits[1]) << PyLong_SHIFT) | (enum http_method)digits[0])); } } break; } #endif #if CYTHON_COMPILING_IN_CPYTHON if (unlikely(Py_SIZE(x) < 0)) { goto raise_neg_overflow; } #else { int result = PyObject_RichCompareBool(x, Py_False, Py_LT); if (unlikely(result < 0)) return (enum http_method) -1; if (unlikely(result == 1)) goto raise_neg_overflow; } #endif if (sizeof(enum http_method) <= sizeof(unsigned long)) { __PYX_VERIFY_RETURN_INT_EXC(enum http_method, unsigned long, PyLong_AsUnsignedLong(x)) #ifdef HAVE_LONG_LONG } else if (sizeof(enum http_method) <= sizeof(unsigned PY_LONG_LONG)) { __PYX_VERIFY_RETURN_INT_EXC(enum http_method, unsigned PY_LONG_LONG, PyLong_AsUnsignedLongLong(x)) #endif } } else { #if CYTHON_USE_PYLONG_INTERNALS const digit* digits = ((PyLongObject*)x)->ob_digit; switch (Py_SIZE(x)) { case 0: return (enum http_method) 0; case -1: __PYX_VERIFY_RETURN_INT(enum http_method, sdigit, (sdigit) (-(sdigit)digits[0])) case 1: __PYX_VERIFY_RETURN_INT(enum http_method, digit, +digits[0]) case -2: if (8 * sizeof(enum http_method) - 1 > 1 * PyLong_SHIFT) { if (8 * sizeof(unsigned long) > 2 * PyLong_SHIFT) { __PYX_VERIFY_RETURN_INT(enum http_method, long, -(long) (((((unsigned long)digits[1]) 
<< PyLong_SHIFT) | (unsigned long)digits[0]))) } else if (8 * sizeof(enum http_method) - 1 > 2 * PyLong_SHIFT) { return (enum http_method) (((enum http_method)-1)*(((((enum http_method)digits[1]) << PyLong_SHIFT) | (enum http_method)digits[0]))); } } break; case 2: if (8 * sizeof(enum http_method) > 1 * PyLong_SHIFT) { if (8 * sizeof(unsigned long) > 2 * PyLong_SHIFT) { __PYX_VERIFY_RETURN_INT(enum http_method, unsigned long, (((((unsigned long)digits[1]) << PyLong_SHIFT) | (unsigned long)digits[0]))) } else if (8 * sizeof(enum http_method) - 1 > 2 * PyLong_SHIFT) { return (enum http_method) ((((((enum http_method)digits[1]) << PyLong_SHIFT) | (enum http_method)digits[0]))); } } break; case -3: if (8 * sizeof(enum http_method) - 1 > 2 * PyLong_SHIFT) { if (8 * sizeof(unsigned long) > 3 * PyLong_SHIFT) { __PYX_VERIFY_RETURN_INT(enum http_method, long, -(long) (((((((unsigned long)digits[2]) << PyLong_SHIFT) | (unsigned long)digits[1]) << PyLong_SHIFT) | (unsigned long)digits[0]))) } else if (8 * sizeof(enum http_method) - 1 > 3 * PyLong_SHIFT) { return (enum http_method) (((enum http_method)-1)*(((((((enum http_method)digits[2]) << PyLong_SHIFT) | (enum http_method)digits[1]) << PyLong_SHIFT) | (enum http_method)digits[0]))); } } break; case 3: if (8 * sizeof(enum http_method) > 2 * PyLong_SHIFT) { if (8 * sizeof(unsigned long) > 3 * PyLong_SHIFT) { __PYX_VERIFY_RETURN_INT(enum http_method, unsigned long, (((((((unsigned long)digits[2]) << PyLong_SHIFT) | (unsigned long)digits[1]) << PyLong_SHIFT) | (unsigned long)digits[0]))) } else if (8 * sizeof(enum http_method) - 1 > 3 * PyLong_SHIFT) { return (enum http_method) ((((((((enum http_method)digits[2]) << PyLong_SHIFT) | (enum http_method)digits[1]) << PyLong_SHIFT) | (enum http_method)digits[0]))); } } break; case -4: if (8 * sizeof(enum http_method) - 1 > 3 * PyLong_SHIFT) { if (8 * sizeof(unsigned long) > 4 * PyLong_SHIFT) { __PYX_VERIFY_RETURN_INT(enum http_method, long, -(long) (((((((((unsigned long)digits[3]) << PyLong_SHIFT) | (unsigned long)digits[2]) << PyLong_SHIFT) | (unsigned long)digits[1]) << PyLong_SHIFT) | (unsigned long)digits[0]))) } else if (8 * sizeof(enum http_method) - 1 > 4 * PyLong_SHIFT) { return (enum http_method) (((enum http_method)-1)*(((((((((enum http_method)digits[3]) << PyLong_SHIFT) | (enum http_method)digits[2]) << PyLong_SHIFT) | (enum http_method)digits[1]) << PyLong_SHIFT) | (enum http_method)digits[0]))); } } break; case 4: if (8 * sizeof(enum http_method) > 3 * PyLong_SHIFT) { if (8 * sizeof(unsigned long) > 4 * PyLong_SHIFT) { __PYX_VERIFY_RETURN_INT(enum http_method, unsigned long, (((((((((unsigned long)digits[3]) << PyLong_SHIFT) | (unsigned long)digits[2]) << PyLong_SHIFT) | (unsigned long)digits[1]) << PyLong_SHIFT) | (unsigned long)digits[0]))) } else if (8 * sizeof(enum http_method) - 1 > 4 * PyLong_SHIFT) { return (enum http_method) ((((((((((enum http_method)digits[3]) << PyLong_SHIFT) | (enum http_method)digits[2]) << PyLong_SHIFT) | (enum http_method)digits[1]) << PyLong_SHIFT) | (enum http_method)digits[0]))); } } break; } #endif if (sizeof(enum http_method) <= sizeof(long)) { __PYX_VERIFY_RETURN_INT_EXC(enum http_method, long, PyLong_AsLong(x)) #ifdef HAVE_LONG_LONG } else if (sizeof(enum http_method) <= sizeof(PY_LONG_LONG)) { __PYX_VERIFY_RETURN_INT_EXC(enum http_method, PY_LONG_LONG, PyLong_AsLongLong(x)) #endif } } { #if CYTHON_COMPILING_IN_PYPY && !defined(_PyLong_AsByteArray) PyErr_SetString(PyExc_RuntimeError, "_PyLong_AsByteArray() not available in PyPy, cannot convert large 
numbers"); #else enum http_method val; PyObject *v = __Pyx_PyNumber_IntOrLong(x); #if PY_MAJOR_VERSION < 3 if (likely(v) && !PyLong_Check(v)) { PyObject *tmp = v; v = PyNumber_Long(tmp); Py_DECREF(tmp); } #endif if (likely(v)) { int one = 1; int is_little = (int)*(unsigned char *)&one; unsigned char *bytes = (unsigned char *)&val; int ret = _PyLong_AsByteArray((PyLongObject *)v, bytes, sizeof(val), is_little, !is_unsigned); Py_DECREF(v); if (likely(!ret)) return val; } #endif return (enum http_method) -1; } } else { enum http_method val; PyObject *tmp = __Pyx_PyNumber_IntOrLong(x); if (!tmp) return (enum http_method) -1; val = __Pyx_PyInt_As_enum__http_method(tmp); Py_DECREF(tmp); return val; } raise_overflow: PyErr_SetString(PyExc_OverflowError, "value too large to convert to enum http_method"); return (enum http_method) -1; raise_neg_overflow: PyErr_SetString(PyExc_OverflowError, "can't convert negative value to enum http_method"); return (enum http_method) -1; } /* CIntFromPy */ static CYTHON_INLINE size_t __Pyx_PyInt_As_size_t(PyObject *x) { const size_t neg_one = (size_t) ((size_t) 0 - (size_t) 1), const_zero = (size_t) 0; const int is_unsigned = neg_one > const_zero; #if PY_MAJOR_VERSION < 3 if (likely(PyInt_Check(x))) { if (sizeof(size_t) < sizeof(long)) { __PYX_VERIFY_RETURN_INT(size_t, long, PyInt_AS_LONG(x)) } else { long val = PyInt_AS_LONG(x); if (is_unsigned && unlikely(val < 0)) { goto raise_neg_overflow; } return (size_t) val; } } else #endif if (likely(PyLong_Check(x))) { if (is_unsigned) { #if CYTHON_USE_PYLONG_INTERNALS const digit* digits = ((PyLongObject*)x)->ob_digit; switch (Py_SIZE(x)) { case 0: return (size_t) 0; case 1: __PYX_VERIFY_RETURN_INT(size_t, digit, digits[0]) case 2: if (8 * sizeof(size_t) > 1 * PyLong_SHIFT) { if (8 * sizeof(unsigned long) > 2 * PyLong_SHIFT) { __PYX_VERIFY_RETURN_INT(size_t, unsigned long, (((((unsigned long)digits[1]) << PyLong_SHIFT) | (unsigned long)digits[0]))) } else if (8 * sizeof(size_t) >= 2 * PyLong_SHIFT) { return (size_t) (((((size_t)digits[1]) << PyLong_SHIFT) | (size_t)digits[0])); } } break; case 3: if (8 * sizeof(size_t) > 2 * PyLong_SHIFT) { if (8 * sizeof(unsigned long) > 3 * PyLong_SHIFT) { __PYX_VERIFY_RETURN_INT(size_t, unsigned long, (((((((unsigned long)digits[2]) << PyLong_SHIFT) | (unsigned long)digits[1]) << PyLong_SHIFT) | (unsigned long)digits[0]))) } else if (8 * sizeof(size_t) >= 3 * PyLong_SHIFT) { return (size_t) (((((((size_t)digits[2]) << PyLong_SHIFT) | (size_t)digits[1]) << PyLong_SHIFT) | (size_t)digits[0])); } } break; case 4: if (8 * sizeof(size_t) > 3 * PyLong_SHIFT) { if (8 * sizeof(unsigned long) > 4 * PyLong_SHIFT) { __PYX_VERIFY_RETURN_INT(size_t, unsigned long, (((((((((unsigned long)digits[3]) << PyLong_SHIFT) | (unsigned long)digits[2]) << PyLong_SHIFT) | (unsigned long)digits[1]) << PyLong_SHIFT) | (unsigned long)digits[0]))) } else if (8 * sizeof(size_t) >= 4 * PyLong_SHIFT) { return (size_t) (((((((((size_t)digits[3]) << PyLong_SHIFT) | (size_t)digits[2]) << PyLong_SHIFT) | (size_t)digits[1]) << PyLong_SHIFT) | (size_t)digits[0])); } } break; } #endif #if CYTHON_COMPILING_IN_CPYTHON if (unlikely(Py_SIZE(x) < 0)) { goto raise_neg_overflow; } #else { int result = PyObject_RichCompareBool(x, Py_False, Py_LT); if (unlikely(result < 0)) return (size_t) -1; if (unlikely(result == 1)) goto raise_neg_overflow; } #endif if (sizeof(size_t) <= sizeof(unsigned long)) { __PYX_VERIFY_RETURN_INT_EXC(size_t, unsigned long, PyLong_AsUnsignedLong(x)) #ifdef HAVE_LONG_LONG } else if (sizeof(size_t) <= 
sizeof(unsigned PY_LONG_LONG)) { __PYX_VERIFY_RETURN_INT_EXC(size_t, unsigned PY_LONG_LONG, PyLong_AsUnsignedLongLong(x)) #endif } } else { #if CYTHON_USE_PYLONG_INTERNALS const digit* digits = ((PyLongObject*)x)->ob_digit; switch (Py_SIZE(x)) { case 0: return (size_t) 0; case -1: __PYX_VERIFY_RETURN_INT(size_t, sdigit, (sdigit) (-(sdigit)digits[0])) case 1: __PYX_VERIFY_RETURN_INT(size_t, digit, +digits[0]) case -2: if (8 * sizeof(size_t) - 1 > 1 * PyLong_SHIFT) { if (8 * sizeof(unsigned long) > 2 * PyLong_SHIFT) { __PYX_VERIFY_RETURN_INT(size_t, long, -(long) (((((unsigned long)digits[1]) << PyLong_SHIFT) | (unsigned long)digits[0]))) } else if (8 * sizeof(size_t) - 1 > 2 * PyLong_SHIFT) { return (size_t) (((size_t)-1)*(((((size_t)digits[1]) << PyLong_SHIFT) | (size_t)digits[0]))); } } break; case 2: if (8 * sizeof(size_t) > 1 * PyLong_SHIFT) { if (8 * sizeof(unsigned long) > 2 * PyLong_SHIFT) { __PYX_VERIFY_RETURN_INT(size_t, unsigned long, (((((unsigned long)digits[1]) << PyLong_SHIFT) | (unsigned long)digits[0]))) } else if (8 * sizeof(size_t) - 1 > 2 * PyLong_SHIFT) { return (size_t) ((((((size_t)digits[1]) << PyLong_SHIFT) | (size_t)digits[0]))); } } break; case -3: if (8 * sizeof(size_t) - 1 > 2 * PyLong_SHIFT) { if (8 * sizeof(unsigned long) > 3 * PyLong_SHIFT) { __PYX_VERIFY_RETURN_INT(size_t, long, -(long) (((((((unsigned long)digits[2]) << PyLong_SHIFT) | (unsigned long)digits[1]) << PyLong_SHIFT) | (unsigned long)digits[0]))) } else if (8 * sizeof(size_t) - 1 > 3 * PyLong_SHIFT) { return (size_t) (((size_t)-1)*(((((((size_t)digits[2]) << PyLong_SHIFT) | (size_t)digits[1]) << PyLong_SHIFT) | (size_t)digits[0]))); } } break; case 3: if (8 * sizeof(size_t) > 2 * PyLong_SHIFT) { if (8 * sizeof(unsigned long) > 3 * PyLong_SHIFT) { __PYX_VERIFY_RETURN_INT(size_t, unsigned long, (((((((unsigned long)digits[2]) << PyLong_SHIFT) | (unsigned long)digits[1]) << PyLong_SHIFT) | (unsigned long)digits[0]))) } else if (8 * sizeof(size_t) - 1 > 3 * PyLong_SHIFT) { return (size_t) ((((((((size_t)digits[2]) << PyLong_SHIFT) | (size_t)digits[1]) << PyLong_SHIFT) | (size_t)digits[0]))); } } break; case -4: if (8 * sizeof(size_t) - 1 > 3 * PyLong_SHIFT) { if (8 * sizeof(unsigned long) > 4 * PyLong_SHIFT) { __PYX_VERIFY_RETURN_INT(size_t, long, -(long) (((((((((unsigned long)digits[3]) << PyLong_SHIFT) | (unsigned long)digits[2]) << PyLong_SHIFT) | (unsigned long)digits[1]) << PyLong_SHIFT) | (unsigned long)digits[0]))) } else if (8 * sizeof(size_t) - 1 > 4 * PyLong_SHIFT) { return (size_t) (((size_t)-1)*(((((((((size_t)digits[3]) << PyLong_SHIFT) | (size_t)digits[2]) << PyLong_SHIFT) | (size_t)digits[1]) << PyLong_SHIFT) | (size_t)digits[0]))); } } break; case 4: if (8 * sizeof(size_t) > 3 * PyLong_SHIFT) { if (8 * sizeof(unsigned long) > 4 * PyLong_SHIFT) { __PYX_VERIFY_RETURN_INT(size_t, unsigned long, (((((((((unsigned long)digits[3]) << PyLong_SHIFT) | (unsigned long)digits[2]) << PyLong_SHIFT) | (unsigned long)digits[1]) << PyLong_SHIFT) | (unsigned long)digits[0]))) } else if (8 * sizeof(size_t) - 1 > 4 * PyLong_SHIFT) { return (size_t) ((((((((((size_t)digits[3]) << PyLong_SHIFT) | (size_t)digits[2]) << PyLong_SHIFT) | (size_t)digits[1]) << PyLong_SHIFT) | (size_t)digits[0]))); } } break; } #endif if (sizeof(size_t) <= sizeof(long)) { __PYX_VERIFY_RETURN_INT_EXC(size_t, long, PyLong_AsLong(x)) #ifdef HAVE_LONG_LONG } else if (sizeof(size_t) <= sizeof(PY_LONG_LONG)) { __PYX_VERIFY_RETURN_INT_EXC(size_t, PY_LONG_LONG, PyLong_AsLongLong(x)) #endif } } { #if CYTHON_COMPILING_IN_PYPY && 
!defined(_PyLong_AsByteArray) PyErr_SetString(PyExc_RuntimeError, "_PyLong_AsByteArray() not available in PyPy, cannot convert large numbers"); #else size_t val; PyObject *v = __Pyx_PyNumber_IntOrLong(x); #if PY_MAJOR_VERSION < 3 if (likely(v) && !PyLong_Check(v)) { PyObject *tmp = v; v = PyNumber_Long(tmp); Py_DECREF(tmp); } #endif if (likely(v)) { int one = 1; int is_little = (int)*(unsigned char *)&one; unsigned char *bytes = (unsigned char *)&val; int ret = _PyLong_AsByteArray((PyLongObject *)v, bytes, sizeof(val), is_little, !is_unsigned); Py_DECREF(v); if (likely(!ret)) return val; } #endif return (size_t) -1; } } else { size_t val; PyObject *tmp = __Pyx_PyNumber_IntOrLong(x); if (!tmp) return (size_t) -1; val = __Pyx_PyInt_As_size_t(tmp); Py_DECREF(tmp); return val; } raise_overflow: PyErr_SetString(PyExc_OverflowError, "value too large to convert to size_t"); return (size_t) -1; raise_neg_overflow: PyErr_SetString(PyExc_OverflowError, "can't convert negative value to size_t"); return (size_t) -1; } /* CIntFromPy */ static CYTHON_INLINE long __Pyx_PyInt_As_long(PyObject *x) { const long neg_one = (long) ((long) 0 - (long) 1), const_zero = (long) 0; const int is_unsigned = neg_one > const_zero; #if PY_MAJOR_VERSION < 3 if (likely(PyInt_Check(x))) { if (sizeof(long) < sizeof(long)) { __PYX_VERIFY_RETURN_INT(long, long, PyInt_AS_LONG(x)) } else { long val = PyInt_AS_LONG(x); if (is_unsigned && unlikely(val < 0)) { goto raise_neg_overflow; } return (long) val; } } else #endif if (likely(PyLong_Check(x))) { if (is_unsigned) { #if CYTHON_USE_PYLONG_INTERNALS const digit* digits = ((PyLongObject*)x)->ob_digit; switch (Py_SIZE(x)) { case 0: return (long) 0; case 1: __PYX_VERIFY_RETURN_INT(long, digit, digits[0]) case 2: if (8 * sizeof(long) > 1 * PyLong_SHIFT) { if (8 * sizeof(unsigned long) > 2 * PyLong_SHIFT) { __PYX_VERIFY_RETURN_INT(long, unsigned long, (((((unsigned long)digits[1]) << PyLong_SHIFT) | (unsigned long)digits[0]))) } else if (8 * sizeof(long) >= 2 * PyLong_SHIFT) { return (long) (((((long)digits[1]) << PyLong_SHIFT) | (long)digits[0])); } } break; case 3: if (8 * sizeof(long) > 2 * PyLong_SHIFT) { if (8 * sizeof(unsigned long) > 3 * PyLong_SHIFT) { __PYX_VERIFY_RETURN_INT(long, unsigned long, (((((((unsigned long)digits[2]) << PyLong_SHIFT) | (unsigned long)digits[1]) << PyLong_SHIFT) | (unsigned long)digits[0]))) } else if (8 * sizeof(long) >= 3 * PyLong_SHIFT) { return (long) (((((((long)digits[2]) << PyLong_SHIFT) | (long)digits[1]) << PyLong_SHIFT) | (long)digits[0])); } } break; case 4: if (8 * sizeof(long) > 3 * PyLong_SHIFT) { if (8 * sizeof(unsigned long) > 4 * PyLong_SHIFT) { __PYX_VERIFY_RETURN_INT(long, unsigned long, (((((((((unsigned long)digits[3]) << PyLong_SHIFT) | (unsigned long)digits[2]) << PyLong_SHIFT) | (unsigned long)digits[1]) << PyLong_SHIFT) | (unsigned long)digits[0]))) } else if (8 * sizeof(long) >= 4 * PyLong_SHIFT) { return (long) (((((((((long)digits[3]) << PyLong_SHIFT) | (long)digits[2]) << PyLong_SHIFT) | (long)digits[1]) << PyLong_SHIFT) | (long)digits[0])); } } break; } #endif #if CYTHON_COMPILING_IN_CPYTHON if (unlikely(Py_SIZE(x) < 0)) { goto raise_neg_overflow; } #else { int result = PyObject_RichCompareBool(x, Py_False, Py_LT); if (unlikely(result < 0)) return (long) -1; if (unlikely(result == 1)) goto raise_neg_overflow; } #endif if (sizeof(long) <= sizeof(unsigned long)) { __PYX_VERIFY_RETURN_INT_EXC(long, unsigned long, PyLong_AsUnsignedLong(x)) #ifdef HAVE_LONG_LONG } else if (sizeof(long) <= sizeof(unsigned PY_LONG_LONG)) { 
__PYX_VERIFY_RETURN_INT_EXC(long, unsigned PY_LONG_LONG, PyLong_AsUnsignedLongLong(x)) #endif } } else { #if CYTHON_USE_PYLONG_INTERNALS const digit* digits = ((PyLongObject*)x)->ob_digit; switch (Py_SIZE(x)) { case 0: return (long) 0; case -1: __PYX_VERIFY_RETURN_INT(long, sdigit, (sdigit) (-(sdigit)digits[0])) case 1: __PYX_VERIFY_RETURN_INT(long, digit, +digits[0]) case -2: if (8 * sizeof(long) - 1 > 1 * PyLong_SHIFT) { if (8 * sizeof(unsigned long) > 2 * PyLong_SHIFT) { __PYX_VERIFY_RETURN_INT(long, long, -(long) (((((unsigned long)digits[1]) << PyLong_SHIFT) | (unsigned long)digits[0]))) } else if (8 * sizeof(long) - 1 > 2 * PyLong_SHIFT) { return (long) (((long)-1)*(((((long)digits[1]) << PyLong_SHIFT) | (long)digits[0]))); } } break; case 2: if (8 * sizeof(long) > 1 * PyLong_SHIFT) { if (8 * sizeof(unsigned long) > 2 * PyLong_SHIFT) { __PYX_VERIFY_RETURN_INT(long, unsigned long, (((((unsigned long)digits[1]) << PyLong_SHIFT) | (unsigned long)digits[0]))) } else if (8 * sizeof(long) - 1 > 2 * PyLong_SHIFT) { return (long) ((((((long)digits[1]) << PyLong_SHIFT) | (long)digits[0]))); } } break; case -3: if (8 * sizeof(long) - 1 > 2 * PyLong_SHIFT) { if (8 * sizeof(unsigned long) > 3 * PyLong_SHIFT) { __PYX_VERIFY_RETURN_INT(long, long, -(long) (((((((unsigned long)digits[2]) << PyLong_SHIFT) | (unsigned long)digits[1]) << PyLong_SHIFT) | (unsigned long)digits[0]))) } else if (8 * sizeof(long) - 1 > 3 * PyLong_SHIFT) { return (long) (((long)-1)*(((((((long)digits[2]) << PyLong_SHIFT) | (long)digits[1]) << PyLong_SHIFT) | (long)digits[0]))); } } break; case 3: if (8 * sizeof(long) > 2 * PyLong_SHIFT) { if (8 * sizeof(unsigned long) > 3 * PyLong_SHIFT) { __PYX_VERIFY_RETURN_INT(long, unsigned long, (((((((unsigned long)digits[2]) << PyLong_SHIFT) | (unsigned long)digits[1]) << PyLong_SHIFT) | (unsigned long)digits[0]))) } else if (8 * sizeof(long) - 1 > 3 * PyLong_SHIFT) { return (long) ((((((((long)digits[2]) << PyLong_SHIFT) | (long)digits[1]) << PyLong_SHIFT) | (long)digits[0]))); } } break; case -4: if (8 * sizeof(long) - 1 > 3 * PyLong_SHIFT) { if (8 * sizeof(unsigned long) > 4 * PyLong_SHIFT) { __PYX_VERIFY_RETURN_INT(long, long, -(long) (((((((((unsigned long)digits[3]) << PyLong_SHIFT) | (unsigned long)digits[2]) << PyLong_SHIFT) | (unsigned long)digits[1]) << PyLong_SHIFT) | (unsigned long)digits[0]))) } else if (8 * sizeof(long) - 1 > 4 * PyLong_SHIFT) { return (long) (((long)-1)*(((((((((long)digits[3]) << PyLong_SHIFT) | (long)digits[2]) << PyLong_SHIFT) | (long)digits[1]) << PyLong_SHIFT) | (long)digits[0]))); } } break; case 4: if (8 * sizeof(long) > 3 * PyLong_SHIFT) { if (8 * sizeof(unsigned long) > 4 * PyLong_SHIFT) { __PYX_VERIFY_RETURN_INT(long, unsigned long, (((((((((unsigned long)digits[3]) << PyLong_SHIFT) | (unsigned long)digits[2]) << PyLong_SHIFT) | (unsigned long)digits[1]) << PyLong_SHIFT) | (unsigned long)digits[0]))) } else if (8 * sizeof(long) - 1 > 4 * PyLong_SHIFT) { return (long) ((((((((((long)digits[3]) << PyLong_SHIFT) | (long)digits[2]) << PyLong_SHIFT) | (long)digits[1]) << PyLong_SHIFT) | (long)digits[0]))); } } break; } #endif if (sizeof(long) <= sizeof(long)) { __PYX_VERIFY_RETURN_INT_EXC(long, long, PyLong_AsLong(x)) #ifdef HAVE_LONG_LONG } else if (sizeof(long) <= sizeof(PY_LONG_LONG)) { __PYX_VERIFY_RETURN_INT_EXC(long, PY_LONG_LONG, PyLong_AsLongLong(x)) #endif } } { #if CYTHON_COMPILING_IN_PYPY && !defined(_PyLong_AsByteArray) PyErr_SetString(PyExc_RuntimeError, "_PyLong_AsByteArray() not available in PyPy, cannot convert large numbers"); 
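/* Big-integer fallback for __Pyx_PyInt_As_long: when none of the digit fast
   paths or the PyLong_AsLong/PyLong_AsLongLong conversions apply, the CPython
   branch below normalises the object with __Pyx_PyNumber_IntOrLong and copies
   its value byte-by-byte into the C variable via _PyLong_AsByteArray, using
   the detected host byte order; PyPy builds without that private API raise
   the RuntimeError above instead. */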
#else long val; PyObject *v = __Pyx_PyNumber_IntOrLong(x); #if PY_MAJOR_VERSION < 3 if (likely(v) && !PyLong_Check(v)) { PyObject *tmp = v; v = PyNumber_Long(tmp); Py_DECREF(tmp); } #endif if (likely(v)) { int one = 1; int is_little = (int)*(unsigned char *)&one; unsigned char *bytes = (unsigned char *)&val; int ret = _PyLong_AsByteArray((PyLongObject *)v, bytes, sizeof(val), is_little, !is_unsigned); Py_DECREF(v); if (likely(!ret)) return val; } #endif return (long) -1; } } else { long val; PyObject *tmp = __Pyx_PyNumber_IntOrLong(x); if (!tmp) return (long) -1; val = __Pyx_PyInt_As_long(tmp); Py_DECREF(tmp); return val; } raise_overflow: PyErr_SetString(PyExc_OverflowError, "value too large to convert to long"); return (long) -1; raise_neg_overflow: PyErr_SetString(PyExc_OverflowError, "can't convert negative value to long"); return (long) -1; } /* FastTypeChecks */ #if CYTHON_COMPILING_IN_CPYTHON static int __Pyx_InBases(PyTypeObject *a, PyTypeObject *b) { while (a) { a = a->tp_base; if (a == b) return 1; } return b == &PyBaseObject_Type; } static CYTHON_INLINE int __Pyx_IsSubtype(PyTypeObject *a, PyTypeObject *b) { PyObject *mro; if (a == b) return 1; mro = a->tp_mro; if (likely(mro)) { Py_ssize_t i, n; n = PyTuple_GET_SIZE(mro); for (i = 0; i < n; i++) { if (PyTuple_GET_ITEM(mro, i) == (PyObject *)b) return 1; } return 0; } return __Pyx_InBases(a, b); } #if PY_MAJOR_VERSION == 2 static int __Pyx_inner_PyErr_GivenExceptionMatches2(PyObject *err, PyObject* exc_type1, PyObject* exc_type2) { PyObject *exception, *value, *tb; int res; __Pyx_PyThreadState_declare __Pyx_PyThreadState_assign __Pyx_ErrFetch(&exception, &value, &tb); res = exc_type1 ? PyObject_IsSubclass(err, exc_type1) : 0; if (unlikely(res == -1)) { PyErr_WriteUnraisable(err); res = 0; } if (!res) { res = PyObject_IsSubclass(err, exc_type2); if (unlikely(res == -1)) { PyErr_WriteUnraisable(err); res = 0; } } __Pyx_ErrRestore(exception, value, tb); return res; } #else static CYTHON_INLINE int __Pyx_inner_PyErr_GivenExceptionMatches2(PyObject *err, PyObject* exc_type1, PyObject *exc_type2) { int res = exc_type1 ? 
__Pyx_IsSubtype((PyTypeObject*)err, (PyTypeObject*)exc_type1) : 0; if (!res) { res = __Pyx_IsSubtype((PyTypeObject*)err, (PyTypeObject*)exc_type2); } return res; } #endif static int __Pyx_PyErr_GivenExceptionMatchesTuple(PyObject *exc_type, PyObject *tuple) { Py_ssize_t i, n; assert(PyExceptionClass_Check(exc_type)); n = PyTuple_GET_SIZE(tuple); #if PY_MAJOR_VERSION >= 3 for (i=0; itp_name); if (cached_type) { if (!PyType_Check((PyObject*)cached_type)) { PyErr_Format(PyExc_TypeError, "Shared Cython type %.200s is not a type object", type->tp_name); goto bad; } if (cached_type->tp_basicsize != type->tp_basicsize) { PyErr_Format(PyExc_TypeError, "Shared Cython type %.200s has the wrong size, try recompiling", type->tp_name); goto bad; } } else { if (!PyErr_ExceptionMatches(PyExc_AttributeError)) goto bad; PyErr_Clear(); if (PyType_Ready(type) < 0) goto bad; if (PyObject_SetAttrString(fake_module, type->tp_name, (PyObject*) type) < 0) goto bad; Py_INCREF(type); cached_type = type; } done: Py_DECREF(fake_module); return cached_type; bad: Py_XDECREF(cached_type); cached_type = NULL; goto done; } /* PyObjectGetMethod */ static int __Pyx_PyObject_GetMethod(PyObject *obj, PyObject *name, PyObject **method) { PyObject *attr; #if CYTHON_UNPACK_METHODS && CYTHON_COMPILING_IN_CPYTHON && CYTHON_USE_PYTYPE_LOOKUP PyTypeObject *tp = Py_TYPE(obj); PyObject *descr; descrgetfunc f = NULL; PyObject **dictptr, *dict; int meth_found = 0; assert (*method == NULL); if (unlikely(tp->tp_getattro != PyObject_GenericGetAttr)) { attr = __Pyx_PyObject_GetAttrStr(obj, name); goto try_unpack; } if (unlikely(tp->tp_dict == NULL) && unlikely(PyType_Ready(tp) < 0)) { return 0; } descr = _PyType_Lookup(tp, name); if (likely(descr != NULL)) { Py_INCREF(descr); #if PY_MAJOR_VERSION >= 3 #ifdef __Pyx_CyFunction_USED if (likely(PyFunction_Check(descr) || (Py_TYPE(descr) == &PyMethodDescr_Type) || __Pyx_CyFunction_Check(descr))) #else if (likely(PyFunction_Check(descr) || (Py_TYPE(descr) == &PyMethodDescr_Type))) #endif #else #ifdef __Pyx_CyFunction_USED if (likely(PyFunction_Check(descr) || __Pyx_CyFunction_Check(descr))) #else if (likely(PyFunction_Check(descr))) #endif #endif { meth_found = 1; } else { f = Py_TYPE(descr)->tp_descr_get; if (f != NULL && PyDescr_IsData(descr)) { attr = f(descr, obj, (PyObject *)Py_TYPE(obj)); Py_DECREF(descr); goto try_unpack; } } } dictptr = _PyObject_GetDictPtr(obj); if (dictptr != NULL && (dict = *dictptr) != NULL) { Py_INCREF(dict); attr = __Pyx_PyDict_GetItemStr(dict, name); if (attr != NULL) { Py_INCREF(attr); Py_DECREF(dict); Py_XDECREF(descr); goto try_unpack; } Py_DECREF(dict); } if (meth_found) { *method = descr; return 1; } if (f != NULL) { attr = f(descr, obj, (PyObject *)Py_TYPE(obj)); Py_DECREF(descr); goto try_unpack; } if (descr != NULL) { *method = descr; return 0; } PyErr_Format(PyExc_AttributeError, #if PY_MAJOR_VERSION >= 3 "'%.50s' object has no attribute '%U'", tp->tp_name, name); #else "'%.50s' object has no attribute '%.400s'", tp->tp_name, PyString_AS_STRING(name)); #endif return 0; #else attr = __Pyx_PyObject_GetAttrStr(obj, name); goto try_unpack; #endif try_unpack: #if CYTHON_UNPACK_METHODS if (likely(attr) && PyMethod_Check(attr) && likely(PyMethod_GET_SELF(attr) == obj)) { PyObject *function = PyMethod_GET_FUNCTION(attr); Py_INCREF(function); Py_DECREF(attr); *method = function; return 1; } #endif *method = attr; return 0; } /* PyObjectCallMethod1 */ static PyObject* __Pyx__PyObject_CallMethod1(PyObject* method, PyObject* arg) { PyObject *result = 
__Pyx_PyObject_CallOneArg(method, arg); Py_DECREF(method); return result; } static PyObject* __Pyx_PyObject_CallMethod1(PyObject* obj, PyObject* method_name, PyObject* arg) { PyObject *method = NULL, *result; int is_method = __Pyx_PyObject_GetMethod(obj, method_name, &method); if (likely(is_method)) { result = __Pyx_PyObject_Call2Args(method, obj, arg); Py_DECREF(method); return result; } if (unlikely(!method)) return NULL; return __Pyx__PyObject_CallMethod1(method, arg); } /* CoroutineBase */ #include #include #define __Pyx_Coroutine_Undelegate(gen) Py_CLEAR((gen)->yieldfrom) static int __Pyx_PyGen__FetchStopIterationValue(CYTHON_UNUSED PyThreadState *__pyx_tstate, PyObject **pvalue) { PyObject *et, *ev, *tb; PyObject *value = NULL; __Pyx_ErrFetch(&et, &ev, &tb); if (!et) { Py_XDECREF(tb); Py_XDECREF(ev); Py_INCREF(Py_None); *pvalue = Py_None; return 0; } if (likely(et == PyExc_StopIteration)) { if (!ev) { Py_INCREF(Py_None); value = Py_None; } #if PY_VERSION_HEX >= 0x030300A0 else if (Py_TYPE(ev) == (PyTypeObject*)PyExc_StopIteration) { value = ((PyStopIterationObject *)ev)->value; Py_INCREF(value); Py_DECREF(ev); } #endif else if (unlikely(PyTuple_Check(ev))) { if (PyTuple_GET_SIZE(ev) >= 1) { #if CYTHON_ASSUME_SAFE_MACROS && !CYTHON_AVOID_BORROWED_REFS value = PyTuple_GET_ITEM(ev, 0); Py_INCREF(value); #else value = PySequence_ITEM(ev, 0); #endif } else { Py_INCREF(Py_None); value = Py_None; } Py_DECREF(ev); } else if (!__Pyx_TypeCheck(ev, (PyTypeObject*)PyExc_StopIteration)) { value = ev; } if (likely(value)) { Py_XDECREF(tb); Py_DECREF(et); *pvalue = value; return 0; } } else if (!__Pyx_PyErr_GivenExceptionMatches(et, PyExc_StopIteration)) { __Pyx_ErrRestore(et, ev, tb); return -1; } PyErr_NormalizeException(&et, &ev, &tb); if (unlikely(!PyObject_TypeCheck(ev, (PyTypeObject*)PyExc_StopIteration))) { __Pyx_ErrRestore(et, ev, tb); return -1; } Py_XDECREF(tb); Py_DECREF(et); #if PY_VERSION_HEX >= 0x030300A0 value = ((PyStopIterationObject *)ev)->value; Py_INCREF(value); Py_DECREF(ev); #else { PyObject* args = __Pyx_PyObject_GetAttrStr(ev, __pyx_n_s_args); Py_DECREF(ev); if (likely(args)) { value = PySequence_GetItem(args, 0); Py_DECREF(args); } if (unlikely(!value)) { __Pyx_ErrRestore(NULL, NULL, NULL); Py_INCREF(Py_None); value = Py_None; } } #endif *pvalue = value; return 0; } static CYTHON_INLINE void __Pyx_Coroutine_ExceptionClear(__Pyx_ExcInfoStruct *exc_state) { PyObject *t, *v, *tb; t = exc_state->exc_type; v = exc_state->exc_value; tb = exc_state->exc_traceback; exc_state->exc_type = NULL; exc_state->exc_value = NULL; exc_state->exc_traceback = NULL; Py_XDECREF(t); Py_XDECREF(v); Py_XDECREF(tb); } #define __Pyx_Coroutine_AlreadyRunningError(gen) (__Pyx__Coroutine_AlreadyRunningError(gen), (PyObject*)NULL) static void __Pyx__Coroutine_AlreadyRunningError(CYTHON_UNUSED __pyx_CoroutineObject *gen) { const char *msg; if ((0)) { #ifdef __Pyx_Coroutine_USED } else if (__Pyx_Coroutine_Check((PyObject*)gen)) { msg = "coroutine already executing"; #endif #ifdef __Pyx_AsyncGen_USED } else if (__Pyx_AsyncGen_CheckExact((PyObject*)gen)) { msg = "async generator already executing"; #endif } else { msg = "generator already executing"; } PyErr_SetString(PyExc_ValueError, msg); } #define __Pyx_Coroutine_NotStartedError(gen) (__Pyx__Coroutine_NotStartedError(gen), (PyObject*)NULL) static void __Pyx__Coroutine_NotStartedError(CYTHON_UNUSED PyObject *gen) { const char *msg; if ((0)) { #ifdef __Pyx_Coroutine_USED } else if (__Pyx_Coroutine_Check(gen)) { msg = "can't send non-None value to a 
just-started coroutine"; #endif #ifdef __Pyx_AsyncGen_USED } else if (__Pyx_AsyncGen_CheckExact(gen)) { msg = "can't send non-None value to a just-started async generator"; #endif } else { msg = "can't send non-None value to a just-started generator"; } PyErr_SetString(PyExc_TypeError, msg); } #define __Pyx_Coroutine_AlreadyTerminatedError(gen, value, closing) (__Pyx__Coroutine_AlreadyTerminatedError(gen, value, closing), (PyObject*)NULL) static void __Pyx__Coroutine_AlreadyTerminatedError(CYTHON_UNUSED PyObject *gen, PyObject *value, CYTHON_UNUSED int closing) { #ifdef __Pyx_Coroutine_USED if (!closing && __Pyx_Coroutine_Check(gen)) { PyErr_SetString(PyExc_RuntimeError, "cannot reuse already awaited coroutine"); } else #endif if (value) { #ifdef __Pyx_AsyncGen_USED if (__Pyx_AsyncGen_CheckExact(gen)) PyErr_SetNone(__Pyx_PyExc_StopAsyncIteration); else #endif PyErr_SetNone(PyExc_StopIteration); } } static PyObject *__Pyx_Coroutine_SendEx(__pyx_CoroutineObject *self, PyObject *value, int closing) { __Pyx_PyThreadState_declare PyThreadState *tstate; __Pyx_ExcInfoStruct *exc_state; PyObject *retval; assert(!self->is_running); if (unlikely(self->resume_label == 0)) { if (unlikely(value && value != Py_None)) { return __Pyx_Coroutine_NotStartedError((PyObject*)self); } } if (unlikely(self->resume_label == -1)) { return __Pyx_Coroutine_AlreadyTerminatedError((PyObject*)self, value, closing); } #if CYTHON_FAST_THREAD_STATE __Pyx_PyThreadState_assign tstate = __pyx_tstate; #else tstate = __Pyx_PyThreadState_Current; #endif exc_state = &self->gi_exc_state; if (exc_state->exc_type) { #if CYTHON_COMPILING_IN_PYPY || CYTHON_COMPILING_IN_PYSTON #else if (exc_state->exc_traceback) { PyTracebackObject *tb = (PyTracebackObject *) exc_state->exc_traceback; PyFrameObject *f = tb->tb_frame; Py_XINCREF(tstate->frame); assert(f->f_back == NULL); f->f_back = tstate->frame; } #endif } #if CYTHON_USE_EXC_INFO_STACK exc_state->previous_item = tstate->exc_info; tstate->exc_info = exc_state; #else if (exc_state->exc_type) { __Pyx_ExceptionSwap(&exc_state->exc_type, &exc_state->exc_value, &exc_state->exc_traceback); } else { __Pyx_Coroutine_ExceptionClear(exc_state); __Pyx_ExceptionSave(&exc_state->exc_type, &exc_state->exc_value, &exc_state->exc_traceback); } #endif self->is_running = 1; retval = self->body((PyObject *) self, tstate, value); self->is_running = 0; #if CYTHON_USE_EXC_INFO_STACK exc_state = &self->gi_exc_state; tstate->exc_info = exc_state->previous_item; exc_state->previous_item = NULL; __Pyx_Coroutine_ResetFrameBackpointer(exc_state); #endif return retval; } static CYTHON_INLINE void __Pyx_Coroutine_ResetFrameBackpointer(__Pyx_ExcInfoStruct *exc_state) { PyObject *exc_tb = exc_state->exc_traceback; if (likely(exc_tb)) { #if CYTHON_COMPILING_IN_PYPY || CYTHON_COMPILING_IN_PYSTON #else PyTracebackObject *tb = (PyTracebackObject *) exc_tb; PyFrameObject *f = tb->tb_frame; Py_CLEAR(f->f_back); #endif } } static CYTHON_INLINE PyObject *__Pyx_Coroutine_MethodReturn(CYTHON_UNUSED PyObject* gen, PyObject *retval) { if (unlikely(!retval)) { __Pyx_PyThreadState_declare __Pyx_PyThreadState_assign if (!__Pyx_PyErr_Occurred()) { PyObject *exc = PyExc_StopIteration; #ifdef __Pyx_AsyncGen_USED if (__Pyx_AsyncGen_CheckExact(gen)) exc = __Pyx_PyExc_StopAsyncIteration; #endif __Pyx_PyErr_SetNone(exc); } } return retval; } static CYTHON_INLINE PyObject *__Pyx_Coroutine_FinishDelegation(__pyx_CoroutineObject *gen) { PyObject *ret; PyObject *val = NULL; __Pyx_Coroutine_Undelegate(gen); 
__Pyx_PyGen__FetchStopIterationValue(__Pyx_PyThreadState_Current, &val); ret = __Pyx_Coroutine_SendEx(gen, val, 0); Py_XDECREF(val); return ret; } static PyObject *__Pyx_Coroutine_Send(PyObject *self, PyObject *value) { PyObject *retval; __pyx_CoroutineObject *gen = (__pyx_CoroutineObject*) self; PyObject *yf = gen->yieldfrom; if (unlikely(gen->is_running)) return __Pyx_Coroutine_AlreadyRunningError(gen); if (yf) { PyObject *ret; gen->is_running = 1; #ifdef __Pyx_Generator_USED if (__Pyx_Generator_CheckExact(yf)) { ret = __Pyx_Coroutine_Send(yf, value); } else #endif #ifdef __Pyx_Coroutine_USED if (__Pyx_Coroutine_Check(yf)) { ret = __Pyx_Coroutine_Send(yf, value); } else #endif #ifdef __Pyx_AsyncGen_USED if (__pyx_PyAsyncGenASend_CheckExact(yf)) { ret = __Pyx_async_gen_asend_send(yf, value); } else #endif #if CYTHON_COMPILING_IN_CPYTHON && PY_VERSION_HEX >= 0x03030000 && (defined(__linux__) || PY_VERSION_HEX >= 0x030600B3) if (PyGen_CheckExact(yf)) { ret = _PyGen_Send((PyGenObject*)yf, value == Py_None ? NULL : value); } else #endif #if CYTHON_COMPILING_IN_CPYTHON && PY_VERSION_HEX >= 0x03050000 && defined(PyCoro_CheckExact) && (defined(__linux__) || PY_VERSION_HEX >= 0x030600B3) if (PyCoro_CheckExact(yf)) { ret = _PyGen_Send((PyGenObject*)yf, value == Py_None ? NULL : value); } else #endif { if (value == Py_None) ret = Py_TYPE(yf)->tp_iternext(yf); else ret = __Pyx_PyObject_CallMethod1(yf, __pyx_n_s_send, value); } gen->is_running = 0; if (likely(ret)) { return ret; } retval = __Pyx_Coroutine_FinishDelegation(gen); } else { retval = __Pyx_Coroutine_SendEx(gen, value, 0); } return __Pyx_Coroutine_MethodReturn(self, retval); } static int __Pyx_Coroutine_CloseIter(__pyx_CoroutineObject *gen, PyObject *yf) { PyObject *retval = NULL; int err = 0; #ifdef __Pyx_Generator_USED if (__Pyx_Generator_CheckExact(yf)) { retval = __Pyx_Coroutine_Close(yf); if (!retval) return -1; } else #endif #ifdef __Pyx_Coroutine_USED if (__Pyx_Coroutine_Check(yf)) { retval = __Pyx_Coroutine_Close(yf); if (!retval) return -1; } else if (__Pyx_CoroutineAwait_CheckExact(yf)) { retval = __Pyx_CoroutineAwait_Close((__pyx_CoroutineAwaitObject*)yf, NULL); if (!retval) return -1; } else #endif #ifdef __Pyx_AsyncGen_USED if (__pyx_PyAsyncGenASend_CheckExact(yf)) { retval = __Pyx_async_gen_asend_close(yf, NULL); } else if (__pyx_PyAsyncGenAThrow_CheckExact(yf)) { retval = __Pyx_async_gen_athrow_close(yf, NULL); } else #endif { PyObject *meth; gen->is_running = 1; meth = __Pyx_PyObject_GetAttrStr(yf, __pyx_n_s_close); if (unlikely(!meth)) { if (!PyErr_ExceptionMatches(PyExc_AttributeError)) { PyErr_WriteUnraisable(yf); } PyErr_Clear(); } else { retval = PyObject_CallFunction(meth, NULL); Py_DECREF(meth); if (!retval) err = -1; } gen->is_running = 0; } Py_XDECREF(retval); return err; } static PyObject *__Pyx_Generator_Next(PyObject *self) { __pyx_CoroutineObject *gen = (__pyx_CoroutineObject*) self; PyObject *yf = gen->yieldfrom; if (unlikely(gen->is_running)) return __Pyx_Coroutine_AlreadyRunningError(gen); if (yf) { PyObject *ret; gen->is_running = 1; #ifdef __Pyx_Generator_USED if (__Pyx_Generator_CheckExact(yf)) { ret = __Pyx_Generator_Next(yf); } else #endif #if CYTHON_COMPILING_IN_CPYTHON && PY_VERSION_HEX >= 0x03030000 && (defined(__linux__) || PY_VERSION_HEX >= 0x030600B3) if (PyGen_CheckExact(yf)) { ret = _PyGen_Send((PyGenObject*)yf, NULL); } else #endif #ifdef __Pyx_Coroutine_USED if (__Pyx_Coroutine_Check(yf)) { ret = __Pyx_Coroutine_Send(yf, Py_None); } else #endif ret = Py_TYPE(yf)->tp_iternext(yf); 
gen->is_running = 0; if (likely(ret)) { return ret; } return __Pyx_Coroutine_FinishDelegation(gen); } return __Pyx_Coroutine_SendEx(gen, Py_None, 0); } static PyObject *__Pyx_Coroutine_Close_Method(PyObject *self, CYTHON_UNUSED PyObject *arg) { return __Pyx_Coroutine_Close(self); } static PyObject *__Pyx_Coroutine_Close(PyObject *self) { __pyx_CoroutineObject *gen = (__pyx_CoroutineObject *) self; PyObject *retval, *raised_exception; PyObject *yf = gen->yieldfrom; int err = 0; if (unlikely(gen->is_running)) return __Pyx_Coroutine_AlreadyRunningError(gen); if (yf) { Py_INCREF(yf); err = __Pyx_Coroutine_CloseIter(gen, yf); __Pyx_Coroutine_Undelegate(gen); Py_DECREF(yf); } if (err == 0) PyErr_SetNone(PyExc_GeneratorExit); retval = __Pyx_Coroutine_SendEx(gen, NULL, 1); if (unlikely(retval)) { const char *msg; Py_DECREF(retval); if ((0)) { #ifdef __Pyx_Coroutine_USED } else if (__Pyx_Coroutine_Check(self)) { msg = "coroutine ignored GeneratorExit"; #endif #ifdef __Pyx_AsyncGen_USED } else if (__Pyx_AsyncGen_CheckExact(self)) { #if PY_VERSION_HEX < 0x03060000 msg = "async generator ignored GeneratorExit - might require Python 3.6+ finalisation (PEP 525)"; #else msg = "async generator ignored GeneratorExit"; #endif #endif } else { msg = "generator ignored GeneratorExit"; } PyErr_SetString(PyExc_RuntimeError, msg); return NULL; } raised_exception = PyErr_Occurred(); if (likely(!raised_exception || __Pyx_PyErr_GivenExceptionMatches2(raised_exception, PyExc_GeneratorExit, PyExc_StopIteration))) { if (raised_exception) PyErr_Clear(); Py_INCREF(Py_None); return Py_None; } return NULL; } static PyObject *__Pyx__Coroutine_Throw(PyObject *self, PyObject *typ, PyObject *val, PyObject *tb, PyObject *args, int close_on_genexit) { __pyx_CoroutineObject *gen = (__pyx_CoroutineObject *) self; PyObject *yf = gen->yieldfrom; if (unlikely(gen->is_running)) return __Pyx_Coroutine_AlreadyRunningError(gen); if (yf) { PyObject *ret; Py_INCREF(yf); if (__Pyx_PyErr_GivenExceptionMatches(typ, PyExc_GeneratorExit) && close_on_genexit) { int err = __Pyx_Coroutine_CloseIter(gen, yf); Py_DECREF(yf); __Pyx_Coroutine_Undelegate(gen); if (err < 0) return __Pyx_Coroutine_MethodReturn(self, __Pyx_Coroutine_SendEx(gen, NULL, 0)); goto throw_here; } gen->is_running = 1; if (0 #ifdef __Pyx_Generator_USED || __Pyx_Generator_CheckExact(yf) #endif #ifdef __Pyx_Coroutine_USED || __Pyx_Coroutine_Check(yf) #endif ) { ret = __Pyx__Coroutine_Throw(yf, typ, val, tb, args, close_on_genexit); #ifdef __Pyx_Coroutine_USED } else if (__Pyx_CoroutineAwait_CheckExact(yf)) { ret = __Pyx__Coroutine_Throw(((__pyx_CoroutineAwaitObject*)yf)->coroutine, typ, val, tb, args, close_on_genexit); #endif } else { PyObject *meth = __Pyx_PyObject_GetAttrStr(yf, __pyx_n_s_throw); if (unlikely(!meth)) { Py_DECREF(yf); if (!PyErr_ExceptionMatches(PyExc_AttributeError)) { gen->is_running = 0; return NULL; } PyErr_Clear(); __Pyx_Coroutine_Undelegate(gen); gen->is_running = 0; goto throw_here; } if (likely(args)) { ret = PyObject_CallObject(meth, args); } else { ret = PyObject_CallFunctionObjArgs(meth, typ, val, tb, NULL); } Py_DECREF(meth); } gen->is_running = 0; Py_DECREF(yf); if (!ret) { ret = __Pyx_Coroutine_FinishDelegation(gen); } return __Pyx_Coroutine_MethodReturn(self, ret); } throw_here: __Pyx_Raise(typ, val, tb, NULL); return __Pyx_Coroutine_MethodReturn(self, __Pyx_Coroutine_SendEx(gen, NULL, 0)); } static PyObject *__Pyx_Coroutine_Throw(PyObject *self, PyObject *args) { PyObject *typ; PyObject *val = NULL; PyObject *tb = NULL; if 
(!PyArg_UnpackTuple(args, (char *)"throw", 1, 3, &typ, &val, &tb)) return NULL; return __Pyx__Coroutine_Throw(self, typ, val, tb, args, 1); } static CYTHON_INLINE int __Pyx_Coroutine_traverse_excstate(__Pyx_ExcInfoStruct *exc_state, visitproc visit, void *arg) { Py_VISIT(exc_state->exc_type); Py_VISIT(exc_state->exc_value); Py_VISIT(exc_state->exc_traceback); return 0; } static int __Pyx_Coroutine_traverse(__pyx_CoroutineObject *gen, visitproc visit, void *arg) { Py_VISIT(gen->closure); Py_VISIT(gen->classobj); Py_VISIT(gen->yieldfrom); return __Pyx_Coroutine_traverse_excstate(&gen->gi_exc_state, visit, arg); } static int __Pyx_Coroutine_clear(PyObject *self) { __pyx_CoroutineObject *gen = (__pyx_CoroutineObject *) self; Py_CLEAR(gen->closure); Py_CLEAR(gen->classobj); Py_CLEAR(gen->yieldfrom); __Pyx_Coroutine_ExceptionClear(&gen->gi_exc_state); #ifdef __Pyx_AsyncGen_USED if (__Pyx_AsyncGen_CheckExact(self)) { Py_CLEAR(((__pyx_PyAsyncGenObject*)gen)->ag_finalizer); } #endif Py_CLEAR(gen->gi_code); Py_CLEAR(gen->gi_name); Py_CLEAR(gen->gi_qualname); Py_CLEAR(gen->gi_modulename); return 0; } static void __Pyx_Coroutine_dealloc(PyObject *self) { __pyx_CoroutineObject *gen = (__pyx_CoroutineObject *) self; PyObject_GC_UnTrack(gen); if (gen->gi_weakreflist != NULL) PyObject_ClearWeakRefs(self); if (gen->resume_label >= 0) { PyObject_GC_Track(self); #if PY_VERSION_HEX >= 0x030400a1 && CYTHON_USE_TP_FINALIZE if (PyObject_CallFinalizerFromDealloc(self)) #else Py_TYPE(gen)->tp_del(self); if (self->ob_refcnt > 0) #endif { return; } PyObject_GC_UnTrack(self); } #ifdef __Pyx_AsyncGen_USED if (__Pyx_AsyncGen_CheckExact(self)) { /* We have to handle this case for asynchronous generators right here, because this code has to be between UNTRACK and GC_Del. */ Py_CLEAR(((__pyx_PyAsyncGenObject*)self)->ag_finalizer); } #endif __Pyx_Coroutine_clear(self); PyObject_GC_Del(gen); } static void __Pyx_Coroutine_del(PyObject *self) { PyObject *error_type, *error_value, *error_traceback; __pyx_CoroutineObject *gen = (__pyx_CoroutineObject *) self; __Pyx_PyThreadState_declare if (gen->resume_label < 0) { return; } #if !CYTHON_USE_TP_FINALIZE assert(self->ob_refcnt == 0); self->ob_refcnt = 1; #endif __Pyx_PyThreadState_assign __Pyx_ErrFetch(&error_type, &error_value, &error_traceback); #ifdef __Pyx_AsyncGen_USED if (__Pyx_AsyncGen_CheckExact(self)) { __pyx_PyAsyncGenObject *agen = (__pyx_PyAsyncGenObject*)self; PyObject *finalizer = agen->ag_finalizer; if (finalizer && !agen->ag_closed) { PyObject *res = __Pyx_PyObject_CallOneArg(finalizer, self); if (unlikely(!res)) { PyErr_WriteUnraisable(self); } else { Py_DECREF(res); } __Pyx_ErrRestore(error_type, error_value, error_traceback); return; } } #endif if (unlikely(gen->resume_label == 0 && !error_value)) { #ifdef __Pyx_Coroutine_USED #ifdef __Pyx_Generator_USED if (!__Pyx_Generator_CheckExact(self)) #endif { PyObject_GC_UnTrack(self); #if PY_MAJOR_VERSION >= 3 || defined(PyErr_WarnFormat) if (unlikely(PyErr_WarnFormat(PyExc_RuntimeWarning, 1, "coroutine '%.50S' was never awaited", gen->gi_qualname) < 0)) PyErr_WriteUnraisable(self); #else {PyObject *msg; char *cmsg; #if CYTHON_COMPILING_IN_PYPY msg = NULL; cmsg = (char*) "coroutine was never awaited"; #else char *cname; PyObject *qualname; qualname = gen->gi_qualname; cname = PyString_AS_STRING(qualname); msg = PyString_FromFormat("coroutine '%.50s' was never awaited", cname); if (unlikely(!msg)) { PyErr_Clear(); cmsg = (char*) "coroutine was never awaited"; } else { cmsg = PyString_AS_STRING(msg); } #endif if 
(unlikely(PyErr_WarnEx(PyExc_RuntimeWarning, cmsg, 1) < 0)) PyErr_WriteUnraisable(self); Py_XDECREF(msg);} #endif PyObject_GC_Track(self); } #endif } else { PyObject *res = __Pyx_Coroutine_Close(self); if (unlikely(!res)) { if (PyErr_Occurred()) PyErr_WriteUnraisable(self); } else { Py_DECREF(res); } } __Pyx_ErrRestore(error_type, error_value, error_traceback); #if !CYTHON_USE_TP_FINALIZE assert(self->ob_refcnt > 0); if (--self->ob_refcnt == 0) { return; } { Py_ssize_t refcnt = self->ob_refcnt; _Py_NewReference(self); self->ob_refcnt = refcnt; } #if CYTHON_COMPILING_IN_CPYTHON assert(PyType_IS_GC(self->ob_type) && _Py_AS_GC(self)->gc.gc_refs != _PyGC_REFS_UNTRACKED); _Py_DEC_REFTOTAL; #endif #ifdef COUNT_ALLOCS --Py_TYPE(self)->tp_frees; --Py_TYPE(self)->tp_allocs; #endif #endif } static PyObject * __Pyx_Coroutine_get_name(__pyx_CoroutineObject *self, CYTHON_UNUSED void *context) { PyObject *name = self->gi_name; if (unlikely(!name)) name = Py_None; Py_INCREF(name); return name; } static int __Pyx_Coroutine_set_name(__pyx_CoroutineObject *self, PyObject *value, CYTHON_UNUSED void *context) { PyObject *tmp; #if PY_MAJOR_VERSION >= 3 if (unlikely(value == NULL || !PyUnicode_Check(value))) #else if (unlikely(value == NULL || !PyString_Check(value))) #endif { PyErr_SetString(PyExc_TypeError, "__name__ must be set to a string object"); return -1; } tmp = self->gi_name; Py_INCREF(value); self->gi_name = value; Py_XDECREF(tmp); return 0; } static PyObject * __Pyx_Coroutine_get_qualname(__pyx_CoroutineObject *self, CYTHON_UNUSED void *context) { PyObject *name = self->gi_qualname; if (unlikely(!name)) name = Py_None; Py_INCREF(name); return name; } static int __Pyx_Coroutine_set_qualname(__pyx_CoroutineObject *self, PyObject *value, CYTHON_UNUSED void *context) { PyObject *tmp; #if PY_MAJOR_VERSION >= 3 if (unlikely(value == NULL || !PyUnicode_Check(value))) #else if (unlikely(value == NULL || !PyString_Check(value))) #endif { PyErr_SetString(PyExc_TypeError, "__qualname__ must be set to a string object"); return -1; } tmp = self->gi_qualname; Py_INCREF(value); self->gi_qualname = value; Py_XDECREF(tmp); return 0; } static __pyx_CoroutineObject *__Pyx__Coroutine_New( PyTypeObject* type, __pyx_coroutine_body_t body, PyObject *code, PyObject *closure, PyObject *name, PyObject *qualname, PyObject *module_name) { __pyx_CoroutineObject *gen = PyObject_GC_New(__pyx_CoroutineObject, type); if (unlikely(!gen)) return NULL; return __Pyx__Coroutine_NewInit(gen, body, code, closure, name, qualname, module_name); } static __pyx_CoroutineObject *__Pyx__Coroutine_NewInit( __pyx_CoroutineObject *gen, __pyx_coroutine_body_t body, PyObject *code, PyObject *closure, PyObject *name, PyObject *qualname, PyObject *module_name) { gen->body = body; gen->closure = closure; Py_XINCREF(closure); gen->is_running = 0; gen->resume_label = 0; gen->classobj = NULL; gen->yieldfrom = NULL; gen->gi_exc_state.exc_type = NULL; gen->gi_exc_state.exc_value = NULL; gen->gi_exc_state.exc_traceback = NULL; #if CYTHON_USE_EXC_INFO_STACK gen->gi_exc_state.previous_item = NULL; #endif gen->gi_weakreflist = NULL; Py_XINCREF(qualname); gen->gi_qualname = qualname; Py_XINCREF(name); gen->gi_name = name; Py_XINCREF(module_name); gen->gi_modulename = module_name; Py_XINCREF(code); gen->gi_code = code; PyObject_GC_Track(gen); return gen; } /* PatchModuleWithCoroutine */ static PyObject* __Pyx_Coroutine_patch_module(PyObject* module, const char* py_code) { #if defined(__Pyx_Generator_USED) || defined(__Pyx_Coroutine_USED) int result; PyObject 
*globals, *result_obj; globals = PyDict_New(); if (unlikely(!globals)) goto ignore; result = PyDict_SetItemString(globals, "_cython_coroutine_type", #ifdef __Pyx_Coroutine_USED (PyObject*)__pyx_CoroutineType); #else Py_None); #endif if (unlikely(result < 0)) goto ignore; result = PyDict_SetItemString(globals, "_cython_generator_type", #ifdef __Pyx_Generator_USED (PyObject*)__pyx_GeneratorType); #else Py_None); #endif if (unlikely(result < 0)) goto ignore; if (unlikely(PyDict_SetItemString(globals, "_module", module) < 0)) goto ignore; if (unlikely(PyDict_SetItemString(globals, "__builtins__", __pyx_b) < 0)) goto ignore; result_obj = PyRun_String(py_code, Py_file_input, globals, globals); if (unlikely(!result_obj)) goto ignore; Py_DECREF(result_obj); Py_DECREF(globals); return module; ignore: Py_XDECREF(globals); PyErr_WriteUnraisable(module); if (unlikely(PyErr_WarnEx(PyExc_RuntimeWarning, "Cython module failed to patch module with custom type", 1) < 0)) { Py_DECREF(module); module = NULL; } #else py_code++; #endif return module; } /* PatchGeneratorABC */ #ifndef CYTHON_REGISTER_ABCS #define CYTHON_REGISTER_ABCS 1 #endif #if defined(__Pyx_Generator_USED) || defined(__Pyx_Coroutine_USED) static PyObject* __Pyx_patch_abc_module(PyObject *module); static PyObject* __Pyx_patch_abc_module(PyObject *module) { module = __Pyx_Coroutine_patch_module( module, "" "if _cython_generator_type is not None:\n" " try: Generator = _module.Generator\n" " except AttributeError: pass\n" " else: Generator.register(_cython_generator_type)\n" "if _cython_coroutine_type is not None:\n" " try: Coroutine = _module.Coroutine\n" " except AttributeError: pass\n" " else: Coroutine.register(_cython_coroutine_type)\n" ); return module; } #endif static int __Pyx_patch_abc(void) { #if defined(__Pyx_Generator_USED) || defined(__Pyx_Coroutine_USED) static int abc_patched = 0; if (CYTHON_REGISTER_ABCS && !abc_patched) { PyObject *module; module = PyImport_ImportModule((PY_MAJOR_VERSION >= 3) ? "collections.abc" : "collections"); if (!module) { PyErr_WriteUnraisable(NULL); if (unlikely(PyErr_WarnEx(PyExc_RuntimeWarning, ((PY_MAJOR_VERSION >= 3) ? 
"Cython module failed to register with collections.abc module" : "Cython module failed to register with collections module"), 1) < 0)) { return -1; } } else { module = __Pyx_patch_abc_module(module); abc_patched = 1; if (unlikely(!module)) return -1; Py_DECREF(module); } module = PyImport_ImportModule("backports_abc"); if (module) { module = __Pyx_patch_abc_module(module); Py_XDECREF(module); } if (!module) { PyErr_Clear(); } } #else if ((0)) __Pyx_Coroutine_patch_module(NULL, NULL); #endif return 0; } /* Generator */ static PyMethodDef __pyx_Generator_methods[] = { {"send", (PyCFunction) __Pyx_Coroutine_Send, METH_O, (char*) PyDoc_STR("send(arg) -> send 'arg' into generator,\nreturn next yielded value or raise StopIteration.")}, {"throw", (PyCFunction) __Pyx_Coroutine_Throw, METH_VARARGS, (char*) PyDoc_STR("throw(typ[,val[,tb]]) -> raise exception in generator,\nreturn next yielded value or raise StopIteration.")}, {"close", (PyCFunction) __Pyx_Coroutine_Close_Method, METH_NOARGS, (char*) PyDoc_STR("close() -> raise GeneratorExit inside generator.")}, {0, 0, 0, 0} }; static PyMemberDef __pyx_Generator_memberlist[] = { {(char *) "gi_running", T_BOOL, offsetof(__pyx_CoroutineObject, is_running), READONLY, NULL}, {(char*) "gi_yieldfrom", T_OBJECT, offsetof(__pyx_CoroutineObject, yieldfrom), READONLY, (char*) PyDoc_STR("object being iterated by 'yield from', or None")}, {(char*) "gi_code", T_OBJECT, offsetof(__pyx_CoroutineObject, gi_code), READONLY, NULL}, {0, 0, 0, 0, 0} }; static PyGetSetDef __pyx_Generator_getsets[] = { {(char *) "__name__", (getter)__Pyx_Coroutine_get_name, (setter)__Pyx_Coroutine_set_name, (char*) PyDoc_STR("name of the generator"), 0}, {(char *) "__qualname__", (getter)__Pyx_Coroutine_get_qualname, (setter)__Pyx_Coroutine_set_qualname, (char*) PyDoc_STR("qualified name of the generator"), 0}, {0, 0, 0, 0, 0} }; static PyTypeObject __pyx_GeneratorType_type = { PyVarObject_HEAD_INIT(0, 0) "generator", sizeof(__pyx_CoroutineObject), 0, (destructor) __Pyx_Coroutine_dealloc, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, Py_TPFLAGS_DEFAULT | Py_TPFLAGS_HAVE_GC | Py_TPFLAGS_HAVE_FINALIZE, 0, (traverseproc) __Pyx_Coroutine_traverse, 0, 0, offsetof(__pyx_CoroutineObject, gi_weakreflist), 0, (iternextfunc) __Pyx_Generator_Next, __pyx_Generator_methods, __pyx_Generator_memberlist, __pyx_Generator_getsets, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, #if CYTHON_USE_TP_FINALIZE 0, #else __Pyx_Coroutine_del, #endif 0, #if CYTHON_USE_TP_FINALIZE __Pyx_Coroutine_del, #elif PY_VERSION_HEX >= 0x030400a1 0, #endif #if PY_VERSION_HEX >= 0x030800b1 0, #endif }; static int __pyx_Generator_init(void) { __pyx_GeneratorType_type.tp_getattro = __Pyx_PyObject_GenericGetAttrNoDict; __pyx_GeneratorType_type.tp_iter = PyObject_SelfIter; __pyx_GeneratorType = __Pyx_FetchCommonType(&__pyx_GeneratorType_type); if (unlikely(!__pyx_GeneratorType)) { return -1; } return 0; } /* CheckBinaryVersion */ static int __Pyx_check_binary_version(void) { char ctversion[4], rtversion[4]; PyOS_snprintf(ctversion, 4, "%d.%d", PY_MAJOR_VERSION, PY_MINOR_VERSION); PyOS_snprintf(rtversion, 4, "%s", Py_GetVersion()); if (ctversion[0] != rtversion[0] || ctversion[2] != rtversion[2]) { char message[200]; PyOS_snprintf(message, sizeof(message), "compiletime version %s of module '%.100s' " "does not match runtime version %s", ctversion, __Pyx_MODULE_NAME, rtversion); return PyErr_WarnEx(NULL, message, 1); } return 0; } /* InitStrings */ static int __Pyx_InitStrings(__Pyx_StringTabEntry *t) { while (t->p) { #if PY_MAJOR_VERSION < 
3 if (t->is_unicode) { *t->p = PyUnicode_DecodeUTF8(t->s, t->n - 1, NULL); } else if (t->intern) { *t->p = PyString_InternFromString(t->s); } else { *t->p = PyString_FromStringAndSize(t->s, t->n - 1); } #else if (t->is_unicode | t->is_str) { if (t->intern) { *t->p = PyUnicode_InternFromString(t->s); } else if (t->encoding) { *t->p = PyUnicode_Decode(t->s, t->n - 1, t->encoding, NULL); } else { *t->p = PyUnicode_FromStringAndSize(t->s, t->n - 1); } } else { *t->p = PyBytes_FromStringAndSize(t->s, t->n - 1); } #endif if (!*t->p) return -1; if (PyObject_Hash(*t->p) == -1) return -1; ++t; } return 0; } static CYTHON_INLINE PyObject* __Pyx_PyUnicode_FromString(const char* c_str) { return __Pyx_PyUnicode_FromStringAndSize(c_str, (Py_ssize_t)strlen(c_str)); } static CYTHON_INLINE const char* __Pyx_PyObject_AsString(PyObject* o) { Py_ssize_t ignore; return __Pyx_PyObject_AsStringAndSize(o, &ignore); } #if __PYX_DEFAULT_STRING_ENCODING_IS_ASCII || __PYX_DEFAULT_STRING_ENCODING_IS_DEFAULT #if !CYTHON_PEP393_ENABLED static const char* __Pyx_PyUnicode_AsStringAndSize(PyObject* o, Py_ssize_t *length) { char* defenc_c; PyObject* defenc = _PyUnicode_AsDefaultEncodedString(o, NULL); if (!defenc) return NULL; defenc_c = PyBytes_AS_STRING(defenc); #if __PYX_DEFAULT_STRING_ENCODING_IS_ASCII { char* end = defenc_c + PyBytes_GET_SIZE(defenc); char* c; for (c = defenc_c; c < end; c++) { if ((unsigned char) (*c) >= 128) { PyUnicode_AsASCIIString(o); return NULL; } } } #endif *length = PyBytes_GET_SIZE(defenc); return defenc_c; } #else static CYTHON_INLINE const char* __Pyx_PyUnicode_AsStringAndSize(PyObject* o, Py_ssize_t *length) { if (unlikely(__Pyx_PyUnicode_READY(o) == -1)) return NULL; #if __PYX_DEFAULT_STRING_ENCODING_IS_ASCII if (likely(PyUnicode_IS_ASCII(o))) { *length = PyUnicode_GET_LENGTH(o); return PyUnicode_AsUTF8(o); } else { PyUnicode_AsASCIIString(o); return NULL; } #else return PyUnicode_AsUTF8AndSize(o, length); #endif } #endif #endif static CYTHON_INLINE const char* __Pyx_PyObject_AsStringAndSize(PyObject* o, Py_ssize_t *length) { #if __PYX_DEFAULT_STRING_ENCODING_IS_ASCII || __PYX_DEFAULT_STRING_ENCODING_IS_DEFAULT if ( #if PY_MAJOR_VERSION < 3 && __PYX_DEFAULT_STRING_ENCODING_IS_ASCII __Pyx_sys_getdefaultencoding_not_ascii && #endif PyUnicode_Check(o)) { return __Pyx_PyUnicode_AsStringAndSize(o, length); } else #endif #if (!CYTHON_COMPILING_IN_PYPY) || (defined(PyByteArray_AS_STRING) && defined(PyByteArray_GET_SIZE)) if (PyByteArray_Check(o)) { *length = PyByteArray_GET_SIZE(o); return PyByteArray_AS_STRING(o); } else #endif { char* result; int r = PyBytes_AsStringAndSize(o, &result, length); if (unlikely(r < 0)) { return NULL; } else { return result; } } } static CYTHON_INLINE int __Pyx_PyObject_IsTrue(PyObject* x) { int is_true = x == Py_True; if (is_true | (x == Py_False) | (x == Py_None)) return is_true; else return PyObject_IsTrue(x); } static CYTHON_INLINE int __Pyx_PyObject_IsTrueAndDecref(PyObject* x) { int retval; if (unlikely(!x)) return -1; retval = __Pyx_PyObject_IsTrue(x); Py_DECREF(x); return retval; } static PyObject* __Pyx_PyNumber_IntOrLongWrongResultType(PyObject* result, const char* type_name) { #if PY_MAJOR_VERSION >= 3 if (PyLong_Check(result)) { if (PyErr_WarnFormat(PyExc_DeprecationWarning, 1, "__int__ returned non-int (type %.200s). 
" "The ability to return an instance of a strict subclass of int " "is deprecated, and may be removed in a future version of Python.", Py_TYPE(result)->tp_name)) { Py_DECREF(result); return NULL; } return result; } #endif PyErr_Format(PyExc_TypeError, "__%.4s__ returned non-%.4s (type %.200s)", type_name, type_name, Py_TYPE(result)->tp_name); Py_DECREF(result); return NULL; } static CYTHON_INLINE PyObject* __Pyx_PyNumber_IntOrLong(PyObject* x) { #if CYTHON_USE_TYPE_SLOTS PyNumberMethods *m; #endif const char *name = NULL; PyObject *res = NULL; #if PY_MAJOR_VERSION < 3 if (likely(PyInt_Check(x) || PyLong_Check(x))) #else if (likely(PyLong_Check(x))) #endif return __Pyx_NewRef(x); #if CYTHON_USE_TYPE_SLOTS m = Py_TYPE(x)->tp_as_number; #if PY_MAJOR_VERSION < 3 if (m && m->nb_int) { name = "int"; res = m->nb_int(x); } else if (m && m->nb_long) { name = "long"; res = m->nb_long(x); } #else if (likely(m && m->nb_int)) { name = "int"; res = m->nb_int(x); } #endif #else if (!PyBytes_CheckExact(x) && !PyUnicode_CheckExact(x)) { res = PyNumber_Int(x); } #endif if (likely(res)) { #if PY_MAJOR_VERSION < 3 if (unlikely(!PyInt_Check(res) && !PyLong_Check(res))) { #else if (unlikely(!PyLong_CheckExact(res))) { #endif return __Pyx_PyNumber_IntOrLongWrongResultType(res, name); } } else if (!PyErr_Occurred()) { PyErr_SetString(PyExc_TypeError, "an integer is required"); } return res; } static CYTHON_INLINE Py_ssize_t __Pyx_PyIndex_AsSsize_t(PyObject* b) { Py_ssize_t ival; PyObject *x; #if PY_MAJOR_VERSION < 3 if (likely(PyInt_CheckExact(b))) { if (sizeof(Py_ssize_t) >= sizeof(long)) return PyInt_AS_LONG(b); else return PyInt_AsSsize_t(b); } #endif if (likely(PyLong_CheckExact(b))) { #if CYTHON_USE_PYLONG_INTERNALS const digit* digits = ((PyLongObject*)b)->ob_digit; const Py_ssize_t size = Py_SIZE(b); if (likely(__Pyx_sst_abs(size) <= 1)) { ival = likely(size) ? digits[0] : 0; if (size == -1) ival = -ival; return ival; } else { switch (size) { case 2: if (8 * sizeof(Py_ssize_t) > 2 * PyLong_SHIFT) { return (Py_ssize_t) (((((size_t)digits[1]) << PyLong_SHIFT) | (size_t)digits[0])); } break; case -2: if (8 * sizeof(Py_ssize_t) > 2 * PyLong_SHIFT) { return -(Py_ssize_t) (((((size_t)digits[1]) << PyLong_SHIFT) | (size_t)digits[0])); } break; case 3: if (8 * sizeof(Py_ssize_t) > 3 * PyLong_SHIFT) { return (Py_ssize_t) (((((((size_t)digits[2]) << PyLong_SHIFT) | (size_t)digits[1]) << PyLong_SHIFT) | (size_t)digits[0])); } break; case -3: if (8 * sizeof(Py_ssize_t) > 3 * PyLong_SHIFT) { return -(Py_ssize_t) (((((((size_t)digits[2]) << PyLong_SHIFT) | (size_t)digits[1]) << PyLong_SHIFT) | (size_t)digits[0])); } break; case 4: if (8 * sizeof(Py_ssize_t) > 4 * PyLong_SHIFT) { return (Py_ssize_t) (((((((((size_t)digits[3]) << PyLong_SHIFT) | (size_t)digits[2]) << PyLong_SHIFT) | (size_t)digits[1]) << PyLong_SHIFT) | (size_t)digits[0])); } break; case -4: if (8 * sizeof(Py_ssize_t) > 4 * PyLong_SHIFT) { return -(Py_ssize_t) (((((((((size_t)digits[3]) << PyLong_SHIFT) | (size_t)digits[2]) << PyLong_SHIFT) | (size_t)digits[1]) << PyLong_SHIFT) | (size_t)digits[0])); } break; } } #endif return PyLong_AsSsize_t(b); } x = PyNumber_Index(b); if (!x) return -1; ival = PyInt_AsSsize_t(x); Py_DECREF(x); return ival; } static CYTHON_INLINE PyObject * __Pyx_PyBool_FromLong(long b) { return b ? 
__Pyx_NewRef(Py_True) : __Pyx_NewRef(Py_False); } static CYTHON_INLINE PyObject * __Pyx_PyInt_FromSize_t(size_t ival) { return PyInt_FromSize_t(ival); } #endif /* Py_PYTHON_H */ aiohttp-3.6.2/aiohttp/_http_parser.pyx0000644000175100001650000007007313547410117020400 0ustar vstsdocker00000000000000#cython: language_level=3 # # Based on https://github.com/MagicStack/httptools # from __future__ import absolute_import, print_function from cpython.mem cimport PyMem_Malloc, PyMem_Free from libc.string cimport memcpy from cpython cimport (PyObject_GetBuffer, PyBuffer_Release, PyBUF_SIMPLE, Py_buffer, PyBytes_AsString, PyBytes_AsStringAndSize) from multidict import (CIMultiDict as _CIMultiDict, CIMultiDictProxy as _CIMultiDictProxy) from yarl import URL as _URL from aiohttp import hdrs from .http_exceptions import ( BadHttpMessage, BadStatusLine, InvalidHeader, LineTooLong, InvalidURLError, PayloadEncodingError, ContentLengthError, TransferEncodingError) from .http_writer import (HttpVersion as _HttpVersion, HttpVersion10 as _HttpVersion10, HttpVersion11 as _HttpVersion11) from .http_parser import DeflateBuffer as _DeflateBuffer from .streams import (EMPTY_PAYLOAD as _EMPTY_PAYLOAD, StreamReader as _StreamReader) cimport cython from aiohttp cimport _cparser as cparser include "_headers.pxi" from aiohttp cimport _find_header DEF DEFAULT_FREELIST_SIZE = 250 cdef extern from "Python.h": int PyByteArray_Resize(object, Py_ssize_t) except -1 Py_ssize_t PyByteArray_Size(object) except -1 char* PyByteArray_AsString(object) __all__ = ('HttpRequestParser', 'HttpResponseParser', 'RawRequestMessage', 'RawResponseMessage') cdef object URL = _URL cdef object URL_build = URL.build cdef object CIMultiDict = _CIMultiDict cdef object CIMultiDictProxy = _CIMultiDictProxy cdef object HttpVersion = _HttpVersion cdef object HttpVersion10 = _HttpVersion10 cdef object HttpVersion11 = _HttpVersion11 cdef object SEC_WEBSOCKET_KEY1 = hdrs.SEC_WEBSOCKET_KEY1 cdef object CONTENT_ENCODING = hdrs.CONTENT_ENCODING cdef object EMPTY_PAYLOAD = _EMPTY_PAYLOAD cdef object StreamReader = _StreamReader cdef object DeflateBuffer = _DeflateBuffer cdef inline object extend(object buf, const char* at, size_t length): cdef Py_ssize_t s cdef char* ptr s = PyByteArray_Size(buf) PyByteArray_Resize(buf, s + length) ptr = PyByteArray_AsString(buf) memcpy(ptr + s, at, length) DEF METHODS_COUNT = 34; cdef list _http_method = [] for i in range(METHODS_COUNT): _http_method.append( cparser.http_method_str( i).decode('ascii')) cdef inline str http_method_str(int i): if i < METHODS_COUNT: return _http_method[i] else: return "" cdef inline object find_header(bytes raw_header): cdef Py_ssize_t size cdef char *buf cdef int idx PyBytes_AsStringAndSize(raw_header, &buf, &size) idx = _find_header.find_header(buf, size) if idx == -1: return raw_header.decode('utf-8', 'surrogateescape') return headers[idx] @cython.freelist(DEFAULT_FREELIST_SIZE) cdef class RawRequestMessage: cdef readonly str method cdef readonly str path cdef readonly object version # HttpVersion cdef readonly object headers # CIMultiDict cdef readonly object raw_headers # tuple cdef readonly object should_close cdef readonly object compression cdef readonly object upgrade cdef readonly object chunked cdef readonly object url # yarl.URL def __init__(self, method, path, version, headers, raw_headers, should_close, compression, upgrade, chunked, url): self.method = method self.path = path self.version = version self.headers = headers self.raw_headers = raw_headers self.should_close = 
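# Illustrative sketch (not part of the original module): ``find_header`` above resolves a raw
# header name against the pre-generated table included from ``_headers.pxi`` and only falls back
# to decoding the bytes when the name is unknown. A minimal pure-Python approximation of that
# "canonicalise via lookup, else decode" idea follows; the tiny table and helper name are
# hypothetical stand-ins, not the generated table aiohttp actually uses.

_SKETCH_KNOWN_HEADERS = {b"content-type": "Content-Type", b"content-encoding": "Content-Encoding"}

def _sketch_find_header(raw_header: bytes) -> str:
    # Return the canonical spelling when the header is known, otherwise decode leniently,
    # mirroring the ``decode('utf-8', 'surrogateescape')`` fallback in ``find_header``.
    canonical = _SKETCH_KNOWN_HEADERS.get(raw_header.lower())
    if canonical is not None:
        return canonical
    return raw_header.decode("utf-8", "surrogateescape")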
should_close self.compression = compression self.upgrade = upgrade self.chunked = chunked self.url = url def __repr__(self): info = [] info.append(("method", self.method)) info.append(("path", self.path)) info.append(("version", self.version)) info.append(("headers", self.headers)) info.append(("raw_headers", self.raw_headers)) info.append(("should_close", self.should_close)) info.append(("compression", self.compression)) info.append(("upgrade", self.upgrade)) info.append(("chunked", self.chunked)) info.append(("url", self.url)) sinfo = ', '.join(name + '=' + repr(val) for name, val in info) return '' def _replace(self, **dct): cdef RawRequestMessage ret ret = _new_request_message(self.method, self.path, self.version, self.headers, self.raw_headers, self.should_close, self.compression, self.upgrade, self.chunked, self.url) if "method" in dct: ret.method = dct["method"] if "path" in dct: ret.path = dct["path"] if "version" in dct: ret.version = dct["version"] if "headers" in dct: ret.headers = dct["headers"] if "raw_headers" in dct: ret.raw_headers = dct["raw_headers"] if "should_close" in dct: ret.should_close = dct["should_close"] if "compression" in dct: ret.compression = dct["compression"] if "upgrade" in dct: ret.upgrade = dct["upgrade"] if "chunked" in dct: ret.chunked = dct["chunked"] if "url" in dct: ret.url = dct["url"] return ret cdef _new_request_message(str method, str path, object version, object headers, object raw_headers, bint should_close, object compression, bint upgrade, bint chunked, object url): cdef RawRequestMessage ret ret = RawRequestMessage.__new__(RawRequestMessage) ret.method = method ret.path = path ret.version = version ret.headers = headers ret.raw_headers = raw_headers ret.should_close = should_close ret.compression = compression ret.upgrade = upgrade ret.chunked = chunked ret.url = url return ret @cython.freelist(DEFAULT_FREELIST_SIZE) cdef class RawResponseMessage: cdef readonly object version # HttpVersion cdef readonly int code cdef readonly str reason cdef readonly object headers # CIMultiDict cdef readonly object raw_headers # tuple cdef readonly object should_close cdef readonly object compression cdef readonly object upgrade cdef readonly object chunked def __init__(self, version, code, reason, headers, raw_headers, should_close, compression, upgrade, chunked): self.version = version self.code = code self.reason = reason self.headers = headers self.raw_headers = raw_headers self.should_close = should_close self.compression = compression self.upgrade = upgrade self.chunked = chunked def __repr__(self): info = [] info.append(("version", self.version)) info.append(("code", self.code)) info.append(("reason", self.reason)) info.append(("headers", self.headers)) info.append(("raw_headers", self.raw_headers)) info.append(("should_close", self.should_close)) info.append(("compression", self.compression)) info.append(("upgrade", self.upgrade)) info.append(("chunked", self.chunked)) sinfo = ', '.join(name + '=' + repr(val) for name, val in info) return '' cdef _new_response_message(object version, int code, str reason, object headers, object raw_headers, bint should_close, object compression, bint upgrade, bint chunked): cdef RawResponseMessage ret ret = RawResponseMessage.__new__(RawResponseMessage) ret.version = version ret.code = code ret.reason = reason ret.headers = headers ret.raw_headers = raw_headers ret.should_close = should_close ret.compression = compression ret.upgrade = upgrade ret.chunked = chunked return ret @cython.internal cdef class 
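# Illustrative sketch (not part of the original module): ``RawRequestMessage._replace`` above
# copies the message and then overrides only the requested fields, much like
# ``collections.namedtuple``'s ``_replace``. A self-contained pure-Python sketch of that
# copy-and-override contract, using a frozen dataclass with a reduced, hypothetical field set:

import dataclasses

@dataclasses.dataclass(frozen=True)
class _SketchRequestMessage:
    method: str
    path: str
    should_close: bool

    def _replace(self, **overrides):
        # Build a new instance with the given fields swapped in; the original stays untouched.
        return dataclasses.replace(self, **overrides)

# Example: _SketchRequestMessage("GET", "/", False)._replace(path="/api") keeps method "GET".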
HttpParser: cdef: cparser.http_parser* _cparser cparser.http_parser_settings* _csettings bytearray _raw_name bytearray _raw_value bint _has_value object _protocol object _loop object _timer size_t _max_line_size size_t _max_field_size size_t _max_headers bint _response_with_body bint _started object _url bytearray _buf str _path str _reason object _headers list _raw_headers bint _upgraded list _messages object _payload bint _payload_error object _payload_exception object _last_error bint _auto_decompress str _content_encoding Py_buffer py_buf def __cinit__(self): self._cparser = \ PyMem_Malloc(sizeof(cparser.http_parser)) if self._cparser is NULL: raise MemoryError() self._csettings = \ PyMem_Malloc(sizeof(cparser.http_parser_settings)) if self._csettings is NULL: raise MemoryError() def __dealloc__(self): PyMem_Free(self._cparser) PyMem_Free(self._csettings) cdef _init(self, cparser.http_parser_type mode, object protocol, object loop, object timer=None, size_t max_line_size=8190, size_t max_headers=32768, size_t max_field_size=8190, payload_exception=None, bint response_with_body=True, bint auto_decompress=True): cparser.http_parser_init(self._cparser, mode) self._cparser.data = self self._cparser.content_length = 0 cparser.http_parser_settings_init(self._csettings) self._protocol = protocol self._loop = loop self._timer = timer self._buf = bytearray() self._payload = None self._payload_error = 0 self._payload_exception = payload_exception self._messages = [] self._raw_name = bytearray() self._raw_value = bytearray() self._has_value = False self._max_line_size = max_line_size self._max_headers = max_headers self._max_field_size = max_field_size self._response_with_body = response_with_body self._upgraded = False self._auto_decompress = auto_decompress self._content_encoding = None self._csettings.on_url = cb_on_url self._csettings.on_status = cb_on_status self._csettings.on_header_field = cb_on_header_field self._csettings.on_header_value = cb_on_header_value self._csettings.on_headers_complete = cb_on_headers_complete self._csettings.on_body = cb_on_body self._csettings.on_message_begin = cb_on_message_begin self._csettings.on_message_complete = cb_on_message_complete self._csettings.on_chunk_header = cb_on_chunk_header self._csettings.on_chunk_complete = cb_on_chunk_complete self._last_error = None cdef _process_header(self): if self._raw_name: raw_name = bytes(self._raw_name) raw_value = bytes(self._raw_value) name = find_header(raw_name) value = raw_value.decode('utf-8', 'surrogateescape') self._headers.add(name, value) if name is CONTENT_ENCODING: self._content_encoding = value PyByteArray_Resize(self._raw_name, 0) PyByteArray_Resize(self._raw_value, 0) self._has_value = False self._raw_headers.append((raw_name, raw_value)) cdef _on_header_field(self, char* at, size_t length): cdef Py_ssize_t size cdef char *buf if self._has_value: self._process_header() size = PyByteArray_Size(self._raw_name) PyByteArray_Resize(self._raw_name, size + length) buf = PyByteArray_AsString(self._raw_name) memcpy(buf + size, at, length) cdef _on_header_value(self, char* at, size_t length): cdef Py_ssize_t size cdef char *buf size = PyByteArray_Size(self._raw_value) PyByteArray_Resize(self._raw_value, size + length) buf = PyByteArray_AsString(self._raw_value) memcpy(buf + size, at, length) self._has_value = True cdef _on_headers_complete(self): self._process_header() method = http_method_str(self._cparser.method) should_close = not cparser.http_should_keep_alive(self._cparser) upgrade = 
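# Illustrative sketch (not part of the original module): ``_on_header_field`` /
# ``_on_header_value`` / ``_process_header`` above deal with http_parser delivering header names
# and values in arbitrarily small chunks. Bytes are appended to ``_raw_name`` / ``_raw_value``
# until a new name chunk arrives after a value, at which point the completed pair is flushed.
# A minimal pure-Python model of that state machine (class and method names are hypothetical):

class _SketchHeaderAccumulator:
    def __init__(self) -> None:
        self._raw_name = bytearray()
        self._raw_value = bytearray()
        self._has_value = False
        self.headers = []

    def on_header_field(self, chunk: bytes) -> None:
        if self._has_value:
            # A name chunk arriving after a value means the previous header is complete.
            self._flush()
        self._raw_name += chunk

    def on_header_value(self, chunk: bytes) -> None:
        self._raw_value += chunk
        self._has_value = True

    def _flush(self) -> None:
        self.headers.append((bytes(self._raw_name), bytes(self._raw_value)))
        self._raw_name.clear()
        self._raw_value.clear()
        self._has_value = False

# The real parser performs one final flush from ``_on_headers_complete`` for the last header.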
self._cparser.upgrade chunked = self._cparser.flags & cparser.F_CHUNKED raw_headers = tuple(self._raw_headers) headers = CIMultiDictProxy(self._headers) if upgrade or self._cparser.method == 5: # cparser.CONNECT: self._upgraded = True # do not support old websocket spec if SEC_WEBSOCKET_KEY1 in headers: raise InvalidHeader(SEC_WEBSOCKET_KEY1) encoding = None enc = self._content_encoding if enc is not None: self._content_encoding = None enc = enc.lower() if enc in ('gzip', 'deflate', 'br'): encoding = enc if self._cparser.type == cparser.HTTP_REQUEST: msg = _new_request_message( method, self._path, self.http_version(), headers, raw_headers, should_close, encoding, upgrade, chunked, self._url) else: msg = _new_response_message( self.http_version(), self._cparser.status_code, self._reason, headers, raw_headers, should_close, encoding, upgrade, chunked) if (self._cparser.content_length > 0 or chunked or self._cparser.method == 5): # CONNECT: 5 payload = StreamReader( self._protocol, timer=self._timer, loop=self._loop) else: payload = EMPTY_PAYLOAD self._payload = payload if encoding is not None and self._auto_decompress: self._payload = DeflateBuffer(payload, encoding) if not self._response_with_body: payload = EMPTY_PAYLOAD self._messages.append((msg, payload)) cdef _on_message_complete(self): self._payload.feed_eof() self._payload = None cdef _on_chunk_header(self): self._payload.begin_http_chunk_receiving() cdef _on_chunk_complete(self): self._payload.end_http_chunk_receiving() cdef object _on_status_complete(self): pass cdef inline http_version(self): cdef cparser.http_parser* parser = self._cparser if parser.http_major == 1: if parser.http_minor == 0: return HttpVersion10 elif parser.http_minor == 1: return HttpVersion11 return HttpVersion(parser.http_major, parser.http_minor) ### Public API ### def feed_eof(self): cdef bytes desc if self._payload is not None: if self._cparser.flags & cparser.F_CHUNKED: raise TransferEncodingError( "Not enough data for satisfy transfer length header.") elif self._cparser.flags & cparser.F_CONTENTLENGTH: raise ContentLengthError( "Not enough data for satisfy content length header.") elif self._cparser.http_errno != cparser.HPE_OK: desc = cparser.http_errno_description( self._cparser.http_errno) raise PayloadEncodingError(desc.decode('latin-1')) else: self._payload.feed_eof() elif self._started: self._on_headers_complete() if self._messages: return self._messages[-1][0] def feed_data(self, data): cdef: size_t data_len size_t nb PyObject_GetBuffer(data, &self.py_buf, PyBUF_SIMPLE) data_len = self.py_buf.len nb = cparser.http_parser_execute( self._cparser, self._csettings, self.py_buf.buf, data_len) PyBuffer_Release(&self.py_buf) # i am not sure about cparser.HPE_INVALID_METHOD, # seems get err for valid request # test_client_functional.py::test_post_data_with_bytesio_file if (self._cparser.http_errno != cparser.HPE_OK and (self._cparser.http_errno != cparser.HPE_INVALID_METHOD or self._cparser.method == 0)): if self._payload_error == 0: if self._last_error is not None: ex = self._last_error self._last_error = None else: ex = parser_error_from_errno( self._cparser.http_errno) self._payload = None raise ex if self._messages: messages = self._messages self._messages = [] else: messages = () if self._upgraded: return messages, True, data[nb:] else: return messages, False, b'' cdef class HttpRequestParser(HttpParser): def __init__(self, protocol, loop, timer=None, size_t max_line_size=8190, size_t max_headers=32768, size_t max_field_size=8190, 
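# Illustrative usage sketch (not part of the original module): ``feed_data`` above returns a
# three-tuple ``(messages, upgraded, tail)`` -- the parsed ``(message, payload)`` pairs, whether
# the connection switched protocols (upgrade or CONNECT), and any bytes left unparsed after the
# upgrade point. A hedged example of driving it is kept as comments because it needs a protocol
# and event loop compatible with aiohttp's streams; the names below are placeholders:
#
#     parser = HttpRequestParser(protocol, loop)
#     data = b"GET /index HTTP/1.1\r\nHost: example.com\r\n\r\n"
#     messages, upgraded, tail = parser.feed_data(data)
#     for msg, payload in messages:
#         print(msg.method, msg.path, dict(msg.headers))
#     if upgraded:
#         hand_off_remaining_bytes(tail)   # hypothetical helper for tunnelled data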
payload_exception=None, bint response_with_body=True, bint read_until_eof=False): self._init(cparser.HTTP_REQUEST, protocol, loop, timer, max_line_size, max_headers, max_field_size, payload_exception, response_with_body) cdef object _on_status_complete(self): cdef Py_buffer py_buf if not self._buf: return self._path = self._buf.decode('utf-8', 'surrogateescape') if self._cparser.method == 5: # CONNECT self._url = URL(self._path) else: PyObject_GetBuffer(self._buf, &py_buf, PyBUF_SIMPLE) try: self._url = _parse_url(py_buf.buf, py_buf.len) finally: PyBuffer_Release(&py_buf) PyByteArray_Resize(self._buf, 0) cdef class HttpResponseParser(HttpParser): def __init__(self, protocol, loop, timer=None, size_t max_line_size=8190, size_t max_headers=32768, size_t max_field_size=8190, payload_exception=None, bint response_with_body=True, bint read_until_eof=False, bint auto_decompress=True): self._init(cparser.HTTP_RESPONSE, protocol, loop, timer, max_line_size, max_headers, max_field_size, payload_exception, response_with_body, auto_decompress) cdef object _on_status_complete(self): if self._buf: self._reason = self._buf.decode('utf-8', 'surrogateescape') PyByteArray_Resize(self._buf, 0) else: self._reason = self._reason or '' cdef int cb_on_message_begin(cparser.http_parser* parser) except -1: cdef HttpParser pyparser = parser.data pyparser._started = True pyparser._headers = CIMultiDict() pyparser._raw_headers = [] PyByteArray_Resize(pyparser._buf, 0) pyparser._path = None pyparser._reason = None return 0 cdef int cb_on_url(cparser.http_parser* parser, const char *at, size_t length) except -1: cdef HttpParser pyparser = parser.data try: if length > pyparser._max_line_size: raise LineTooLong( 'Status line is too long', pyparser._max_line_size, length) extend(pyparser._buf, at, length) except BaseException as ex: pyparser._last_error = ex return -1 else: return 0 cdef int cb_on_status(cparser.http_parser* parser, const char *at, size_t length) except -1: cdef HttpParser pyparser = parser.data cdef str reason try: if length > pyparser._max_line_size: raise LineTooLong( 'Status line is too long', pyparser._max_line_size, length) extend(pyparser._buf, at, length) except BaseException as ex: pyparser._last_error = ex return -1 else: return 0 cdef int cb_on_header_field(cparser.http_parser* parser, const char *at, size_t length) except -1: cdef HttpParser pyparser = parser.data cdef Py_ssize_t size try: pyparser._on_status_complete() size = len(pyparser._raw_name) + length if size > pyparser._max_field_size: raise LineTooLong( 'Header name is too long', pyparser._max_field_size, size) pyparser._on_header_field(at, length) except BaseException as ex: pyparser._last_error = ex return -1 else: return 0 cdef int cb_on_header_value(cparser.http_parser* parser, const char *at, size_t length) except -1: cdef HttpParser pyparser = parser.data cdef Py_ssize_t size try: size = len(pyparser._raw_value) + length if size > pyparser._max_field_size: raise LineTooLong( 'Header value is too long', pyparser._max_field_size, size) pyparser._on_header_value(at, length) except BaseException as ex: pyparser._last_error = ex return -1 else: return 0 cdef int cb_on_headers_complete(cparser.http_parser* parser) except -1: cdef HttpParser pyparser = parser.data try: pyparser._on_status_complete() pyparser._on_headers_complete() except BaseException as exc: pyparser._last_error = exc return -1 else: if pyparser._cparser.upgrade or pyparser._cparser.method == 5: # CONNECT return 2 else: return 0 cdef int 
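# Illustrative sketch (not part of the original module): each ``cb_on_*`` callback above follows
# the same discipline -- run the Python-level handler, and on any exception store it on the
# parser and return -1 so the C http_parser aborts; ``feed_data`` later re-raises the stored
# exception instead of a generic errno-derived one. A minimal pure-Python model of that
# "stash, abort, re-raise" pattern (names are hypothetical):

class _SketchCallbackDriver:
    def __init__(self) -> None:
        self._last_error = None

    def _run_callback(self, handler, *args) -> int:
        # Mirrors the cb_on_* wrappers: 0 on success, -1 after recording the failure.
        try:
            handler(*args)
        except BaseException as exc:
            self._last_error = exc
            return -1
        return 0

    def feed(self, handler, *args) -> None:
        # Mirrors feed_data's error handling: surface the original exception to the caller.
        if self._run_callback(handler, *args) != 0:
            exc, self._last_error = self._last_error, None
            raise exc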
cb_on_body(cparser.http_parser* parser, const char *at, size_t length) except -1: cdef HttpParser pyparser = parser.data cdef bytes body = at[:length] try: pyparser._payload.feed_data(body, length) except BaseException as exc: if pyparser._payload_exception is not None: pyparser._payload.set_exception(pyparser._payload_exception(str(exc))) else: pyparser._payload.set_exception(exc) pyparser._payload_error = 1 return -1 else: return 0 cdef int cb_on_message_complete(cparser.http_parser* parser) except -1: cdef HttpParser pyparser = parser.data try: pyparser._started = False pyparser._on_message_complete() except BaseException as exc: pyparser._last_error = exc return -1 else: return 0 cdef int cb_on_chunk_header(cparser.http_parser* parser) except -1: cdef HttpParser pyparser = parser.data try: pyparser._on_chunk_header() except BaseException as exc: pyparser._last_error = exc return -1 else: return 0 cdef int cb_on_chunk_complete(cparser.http_parser* parser) except -1: cdef HttpParser pyparser = parser.data try: pyparser._on_chunk_complete() except BaseException as exc: pyparser._last_error = exc return -1 else: return 0 cdef parser_error_from_errno(cparser.http_errno errno): cdef bytes desc = cparser.http_errno_description(errno) if errno in (cparser.HPE_CB_message_begin, cparser.HPE_CB_url, cparser.HPE_CB_header_field, cparser.HPE_CB_header_value, cparser.HPE_CB_headers_complete, cparser.HPE_CB_body, cparser.HPE_CB_message_complete, cparser.HPE_CB_status, cparser.HPE_CB_chunk_header, cparser.HPE_CB_chunk_complete): cls = BadHttpMessage elif errno == cparser.HPE_INVALID_STATUS: cls = BadStatusLine elif errno == cparser.HPE_INVALID_METHOD: cls = BadStatusLine elif errno == cparser.HPE_INVALID_URL: cls = InvalidURLError else: cls = BadHttpMessage return cls(desc.decode('latin-1')) def parse_url(url): cdef: Py_buffer py_buf char* buf_data PyObject_GetBuffer(url, &py_buf, PyBUF_SIMPLE) try: buf_data = py_buf.buf return _parse_url(buf_data, py_buf.len) finally: PyBuffer_Release(&py_buf) cdef _parse_url(char* buf_data, size_t length): cdef: cparser.http_parser_url* parsed int res str schema = None str host = None object port = None str path = None str query = None str fragment = None str user = None str password = None str userinfo = None object result = None int off int ln parsed = \ PyMem_Malloc(sizeof(cparser.http_parser_url)) if parsed is NULL: raise MemoryError() cparser.http_parser_url_init(parsed) try: res = cparser.http_parser_parse_url(buf_data, length, 0, parsed) if res == 0: if parsed.field_set & (1 << cparser.UF_SCHEMA): off = parsed.field_data[cparser.UF_SCHEMA].off ln = parsed.field_data[cparser.UF_SCHEMA].len schema = buf_data[off:off+ln].decode('utf-8', 'surrogateescape') else: schema = '' if parsed.field_set & (1 << cparser.UF_HOST): off = parsed.field_data[cparser.UF_HOST].off ln = parsed.field_data[cparser.UF_HOST].len host = buf_data[off:off+ln].decode('utf-8', 'surrogateescape') else: host = '' if parsed.field_set & (1 << cparser.UF_PORT): port = parsed.port if parsed.field_set & (1 << cparser.UF_PATH): off = parsed.field_data[cparser.UF_PATH].off ln = parsed.field_data[cparser.UF_PATH].len path = buf_data[off:off+ln].decode('utf-8', 'surrogateescape') else: path = '' if parsed.field_set & (1 << cparser.UF_QUERY): off = parsed.field_data[cparser.UF_QUERY].off ln = parsed.field_data[cparser.UF_QUERY].len query = buf_data[off:off+ln].decode('utf-8', 'surrogateescape') else: query = '' if parsed.field_set & (1 << cparser.UF_FRAGMENT): off = 
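# Illustrative sketch (not part of the original module): ``_parse_url`` above slices scheme,
# userinfo, host, port, path, query and fragment out of http_parser's offset table and hands them
# to ``yarl.URL.build``. A rough stdlib-only approximation of the same decomposition follows; it
# differs from the C parser in edge cases (IPv6 literals, surrogate-escaped bytes, validation),
# so treat it purely as an illustration. The function name is hypothetical.

from urllib.parse import urlsplit

def _sketch_split_url(raw: bytes) -> dict:
    parts = urlsplit(raw.decode("utf-8", "surrogateescape"))
    return {
        "scheme": parts.scheme,
        "user": parts.username or "",
        "password": parts.password or "",
        "host": parts.hostname or "",
        "port": parts.port,        # None when absent, like UF_PORT never being set
        "path": parts.path,
        "query": parts.query,
        "fragment": parts.fragment,
    }

# Example: _sketch_split_url(b"http://user:pw@example.com:8080/p?q=1#frag")["host"] == "example.com"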
parsed.field_data[cparser.UF_FRAGMENT].off ln = parsed.field_data[cparser.UF_FRAGMENT].len fragment = buf_data[off:off+ln].decode('utf-8', 'surrogateescape') else: fragment = '' if parsed.field_set & (1 << cparser.UF_USERINFO): off = parsed.field_data[cparser.UF_USERINFO].off ln = parsed.field_data[cparser.UF_USERINFO].len userinfo = buf_data[off:off+ln].decode('utf-8', 'surrogateescape') user, sep, password = userinfo.partition(':') return URL_build(scheme=schema, user=user, password=password, host=host, port=port, path=path, query=query, fragment=fragment) else: raise InvalidURLError("invalid url {!r}".format(buf_data)) finally: PyMem_Free(parsed) aiohttp-3.6.2/aiohttp/_http_writer.c0000644000175100001650000063524213547410135020027 0ustar vstsdocker00000000000000/* Generated by Cython 0.29.13 */ #define PY_SSIZE_T_CLEAN #include "Python.h" #ifndef Py_PYTHON_H #error Python headers needed to compile C extensions, please install development version of Python. #elif PY_VERSION_HEX < 0x02060000 || (0x03000000 <= PY_VERSION_HEX && PY_VERSION_HEX < 0x03030000) #error Cython requires Python 2.6+ or Python 3.3+. #else #define CYTHON_ABI "0_29_13" #define CYTHON_HEX_VERSION 0x001D0DF0 #define CYTHON_FUTURE_DIVISION 1 #include #ifndef offsetof #define offsetof(type, member) ( (size_t) & ((type*)0) -> member ) #endif #if !defined(WIN32) && !defined(MS_WINDOWS) #ifndef __stdcall #define __stdcall #endif #ifndef __cdecl #define __cdecl #endif #ifndef __fastcall #define __fastcall #endif #endif #ifndef DL_IMPORT #define DL_IMPORT(t) t #endif #ifndef DL_EXPORT #define DL_EXPORT(t) t #endif #define __PYX_COMMA , #ifndef HAVE_LONG_LONG #if PY_VERSION_HEX >= 0x02070000 #define HAVE_LONG_LONG #endif #endif #ifndef PY_LONG_LONG #define PY_LONG_LONG LONG_LONG #endif #ifndef Py_HUGE_VAL #define Py_HUGE_VAL HUGE_VAL #endif #ifdef PYPY_VERSION #define CYTHON_COMPILING_IN_PYPY 1 #define CYTHON_COMPILING_IN_PYSTON 0 #define CYTHON_COMPILING_IN_CPYTHON 0 #undef CYTHON_USE_TYPE_SLOTS #define CYTHON_USE_TYPE_SLOTS 0 #undef CYTHON_USE_PYTYPE_LOOKUP #define CYTHON_USE_PYTYPE_LOOKUP 0 #if PY_VERSION_HEX < 0x03050000 #undef CYTHON_USE_ASYNC_SLOTS #define CYTHON_USE_ASYNC_SLOTS 0 #elif !defined(CYTHON_USE_ASYNC_SLOTS) #define CYTHON_USE_ASYNC_SLOTS 1 #endif #undef CYTHON_USE_PYLIST_INTERNALS #define CYTHON_USE_PYLIST_INTERNALS 0 #undef CYTHON_USE_UNICODE_INTERNALS #define CYTHON_USE_UNICODE_INTERNALS 0 #undef CYTHON_USE_UNICODE_WRITER #define CYTHON_USE_UNICODE_WRITER 0 #undef CYTHON_USE_PYLONG_INTERNALS #define CYTHON_USE_PYLONG_INTERNALS 0 #undef CYTHON_AVOID_BORROWED_REFS #define CYTHON_AVOID_BORROWED_REFS 1 #undef CYTHON_ASSUME_SAFE_MACROS #define CYTHON_ASSUME_SAFE_MACROS 0 #undef CYTHON_UNPACK_METHODS #define CYTHON_UNPACK_METHODS 0 #undef CYTHON_FAST_THREAD_STATE #define CYTHON_FAST_THREAD_STATE 0 #undef CYTHON_FAST_PYCALL #define CYTHON_FAST_PYCALL 0 #undef CYTHON_PEP489_MULTI_PHASE_INIT #define CYTHON_PEP489_MULTI_PHASE_INIT 0 #undef CYTHON_USE_TP_FINALIZE #define CYTHON_USE_TP_FINALIZE 0 #undef CYTHON_USE_DICT_VERSIONS #define CYTHON_USE_DICT_VERSIONS 0 #undef CYTHON_USE_EXC_INFO_STACK #define CYTHON_USE_EXC_INFO_STACK 0 #elif defined(PYSTON_VERSION) #define CYTHON_COMPILING_IN_PYPY 0 #define CYTHON_COMPILING_IN_PYSTON 1 #define CYTHON_COMPILING_IN_CPYTHON 0 #ifndef CYTHON_USE_TYPE_SLOTS #define CYTHON_USE_TYPE_SLOTS 1 #endif #undef CYTHON_USE_PYTYPE_LOOKUP #define CYTHON_USE_PYTYPE_LOOKUP 0 #undef CYTHON_USE_ASYNC_SLOTS #define CYTHON_USE_ASYNC_SLOTS 0 #undef CYTHON_USE_PYLIST_INTERNALS #define 
CYTHON_USE_PYLIST_INTERNALS 0 #ifndef CYTHON_USE_UNICODE_INTERNALS #define CYTHON_USE_UNICODE_INTERNALS 1 #endif #undef CYTHON_USE_UNICODE_WRITER #define CYTHON_USE_UNICODE_WRITER 0 #undef CYTHON_USE_PYLONG_INTERNALS #define CYTHON_USE_PYLONG_INTERNALS 0 #ifndef CYTHON_AVOID_BORROWED_REFS #define CYTHON_AVOID_BORROWED_REFS 0 #endif #ifndef CYTHON_ASSUME_SAFE_MACROS #define CYTHON_ASSUME_SAFE_MACROS 1 #endif #ifndef CYTHON_UNPACK_METHODS #define CYTHON_UNPACK_METHODS 1 #endif #undef CYTHON_FAST_THREAD_STATE #define CYTHON_FAST_THREAD_STATE 0 #undef CYTHON_FAST_PYCALL #define CYTHON_FAST_PYCALL 0 #undef CYTHON_PEP489_MULTI_PHASE_INIT #define CYTHON_PEP489_MULTI_PHASE_INIT 0 #undef CYTHON_USE_TP_FINALIZE #define CYTHON_USE_TP_FINALIZE 0 #undef CYTHON_USE_DICT_VERSIONS #define CYTHON_USE_DICT_VERSIONS 0 #undef CYTHON_USE_EXC_INFO_STACK #define CYTHON_USE_EXC_INFO_STACK 0 #else #define CYTHON_COMPILING_IN_PYPY 0 #define CYTHON_COMPILING_IN_PYSTON 0 #define CYTHON_COMPILING_IN_CPYTHON 1 #ifndef CYTHON_USE_TYPE_SLOTS #define CYTHON_USE_TYPE_SLOTS 1 #endif #if PY_VERSION_HEX < 0x02070000 #undef CYTHON_USE_PYTYPE_LOOKUP #define CYTHON_USE_PYTYPE_LOOKUP 0 #elif !defined(CYTHON_USE_PYTYPE_LOOKUP) #define CYTHON_USE_PYTYPE_LOOKUP 1 #endif #if PY_MAJOR_VERSION < 3 #undef CYTHON_USE_ASYNC_SLOTS #define CYTHON_USE_ASYNC_SLOTS 0 #elif !defined(CYTHON_USE_ASYNC_SLOTS) #define CYTHON_USE_ASYNC_SLOTS 1 #endif #if PY_VERSION_HEX < 0x02070000 #undef CYTHON_USE_PYLONG_INTERNALS #define CYTHON_USE_PYLONG_INTERNALS 0 #elif !defined(CYTHON_USE_PYLONG_INTERNALS) #define CYTHON_USE_PYLONG_INTERNALS 1 #endif #ifndef CYTHON_USE_PYLIST_INTERNALS #define CYTHON_USE_PYLIST_INTERNALS 1 #endif #ifndef CYTHON_USE_UNICODE_INTERNALS #define CYTHON_USE_UNICODE_INTERNALS 1 #endif #if PY_VERSION_HEX < 0x030300F0 #undef CYTHON_USE_UNICODE_WRITER #define CYTHON_USE_UNICODE_WRITER 0 #elif !defined(CYTHON_USE_UNICODE_WRITER) #define CYTHON_USE_UNICODE_WRITER 1 #endif #ifndef CYTHON_AVOID_BORROWED_REFS #define CYTHON_AVOID_BORROWED_REFS 0 #endif #ifndef CYTHON_ASSUME_SAFE_MACROS #define CYTHON_ASSUME_SAFE_MACROS 1 #endif #ifndef CYTHON_UNPACK_METHODS #define CYTHON_UNPACK_METHODS 1 #endif #ifndef CYTHON_FAST_THREAD_STATE #define CYTHON_FAST_THREAD_STATE 1 #endif #ifndef CYTHON_FAST_PYCALL #define CYTHON_FAST_PYCALL 1 #endif #ifndef CYTHON_PEP489_MULTI_PHASE_INIT #define CYTHON_PEP489_MULTI_PHASE_INIT (PY_VERSION_HEX >= 0x03050000) #endif #ifndef CYTHON_USE_TP_FINALIZE #define CYTHON_USE_TP_FINALIZE (PY_VERSION_HEX >= 0x030400a1) #endif #ifndef CYTHON_USE_DICT_VERSIONS #define CYTHON_USE_DICT_VERSIONS (PY_VERSION_HEX >= 0x030600B1) #endif #ifndef CYTHON_USE_EXC_INFO_STACK #define CYTHON_USE_EXC_INFO_STACK (PY_VERSION_HEX >= 0x030700A3) #endif #endif #if !defined(CYTHON_FAST_PYCCALL) #define CYTHON_FAST_PYCCALL (CYTHON_FAST_PYCALL && PY_VERSION_HEX >= 0x030600B1) #endif #if CYTHON_USE_PYLONG_INTERNALS #include "longintrepr.h" #undef SHIFT #undef BASE #undef MASK #ifdef SIZEOF_VOID_P enum { __pyx_check_sizeof_voidp = 1 / (int)(SIZEOF_VOID_P == sizeof(void*)) }; #endif #endif #ifndef __has_attribute #define __has_attribute(x) 0 #endif #ifndef __has_cpp_attribute #define __has_cpp_attribute(x) 0 #endif #ifndef CYTHON_RESTRICT #if defined(__GNUC__) #define CYTHON_RESTRICT __restrict__ #elif defined(_MSC_VER) && _MSC_VER >= 1400 #define CYTHON_RESTRICT __restrict #elif defined (__STDC_VERSION__) && __STDC_VERSION__ >= 199901L #define CYTHON_RESTRICT restrict #else #define CYTHON_RESTRICT #endif #endif #ifndef CYTHON_UNUSED # if 
defined(__GNUC__) # if !(defined(__cplusplus)) || (__GNUC__ > 3 || (__GNUC__ == 3 && __GNUC_MINOR__ >= 4)) # define CYTHON_UNUSED __attribute__ ((__unused__)) # else # define CYTHON_UNUSED # endif # elif defined(__ICC) || (defined(__INTEL_COMPILER) && !defined(_MSC_VER)) # define CYTHON_UNUSED __attribute__ ((__unused__)) # else # define CYTHON_UNUSED # endif #endif #ifndef CYTHON_MAYBE_UNUSED_VAR # if defined(__cplusplus) template void CYTHON_MAYBE_UNUSED_VAR( const T& ) { } # else # define CYTHON_MAYBE_UNUSED_VAR(x) (void)(x) # endif #endif #ifndef CYTHON_NCP_UNUSED # if CYTHON_COMPILING_IN_CPYTHON # define CYTHON_NCP_UNUSED # else # define CYTHON_NCP_UNUSED CYTHON_UNUSED # endif #endif #define __Pyx_void_to_None(void_result) ((void)(void_result), Py_INCREF(Py_None), Py_None) #ifdef _MSC_VER #ifndef _MSC_STDINT_H_ #if _MSC_VER < 1300 typedef unsigned char uint8_t; typedef unsigned int uint32_t; #else typedef unsigned __int8 uint8_t; typedef unsigned __int32 uint32_t; #endif #endif #else #include #endif #ifndef CYTHON_FALLTHROUGH #if defined(__cplusplus) && __cplusplus >= 201103L #if __has_cpp_attribute(fallthrough) #define CYTHON_FALLTHROUGH [[fallthrough]] #elif __has_cpp_attribute(clang::fallthrough) #define CYTHON_FALLTHROUGH [[clang::fallthrough]] #elif __has_cpp_attribute(gnu::fallthrough) #define CYTHON_FALLTHROUGH [[gnu::fallthrough]] #endif #endif #ifndef CYTHON_FALLTHROUGH #if __has_attribute(fallthrough) #define CYTHON_FALLTHROUGH __attribute__((fallthrough)) #else #define CYTHON_FALLTHROUGH #endif #endif #if defined(__clang__ ) && defined(__apple_build_version__) #if __apple_build_version__ < 7000000 #undef CYTHON_FALLTHROUGH #define CYTHON_FALLTHROUGH #endif #endif #endif #ifndef CYTHON_INLINE #if defined(__clang__) #define CYTHON_INLINE __inline__ __attribute__ ((__unused__)) #elif defined(__GNUC__) #define CYTHON_INLINE __inline__ #elif defined(_MSC_VER) #define CYTHON_INLINE __inline #elif defined (__STDC_VERSION__) && __STDC_VERSION__ >= 199901L #define CYTHON_INLINE inline #else #define CYTHON_INLINE #endif #endif #if CYTHON_COMPILING_IN_PYPY && PY_VERSION_HEX < 0x02070600 && !defined(Py_OptimizeFlag) #define Py_OptimizeFlag 0 #endif #define __PYX_BUILD_PY_SSIZE_T "n" #define CYTHON_FORMAT_SSIZE_T "z" #if PY_MAJOR_VERSION < 3 #define __Pyx_BUILTIN_MODULE_NAME "__builtin__" #define __Pyx_PyCode_New(a, k, l, s, f, code, c, n, v, fv, cell, fn, name, fline, lnos)\ PyCode_New(a+k, l, s, f, code, c, n, v, fv, cell, fn, name, fline, lnos) #define __Pyx_DefaultClassType PyClass_Type #else #define __Pyx_BUILTIN_MODULE_NAME "builtins" #if PY_VERSION_HEX >= 0x030800A4 && PY_VERSION_HEX < 0x030800B2 #define __Pyx_PyCode_New(a, k, l, s, f, code, c, n, v, fv, cell, fn, name, fline, lnos)\ PyCode_New(a, 0, k, l, s, f, code, c, n, v, fv, cell, fn, name, fline, lnos) #else #define __Pyx_PyCode_New(a, k, l, s, f, code, c, n, v, fv, cell, fn, name, fline, lnos)\ PyCode_New(a, k, l, s, f, code, c, n, v, fv, cell, fn, name, fline, lnos) #endif #define __Pyx_DefaultClassType PyType_Type #endif #ifndef Py_TPFLAGS_CHECKTYPES #define Py_TPFLAGS_CHECKTYPES 0 #endif #ifndef Py_TPFLAGS_HAVE_INDEX #define Py_TPFLAGS_HAVE_INDEX 0 #endif #ifndef Py_TPFLAGS_HAVE_NEWBUFFER #define Py_TPFLAGS_HAVE_NEWBUFFER 0 #endif #ifndef Py_TPFLAGS_HAVE_FINALIZE #define Py_TPFLAGS_HAVE_FINALIZE 0 #endif #ifndef METH_STACKLESS #define METH_STACKLESS 0 #endif #if PY_VERSION_HEX <= 0x030700A3 || !defined(METH_FASTCALL) #ifndef METH_FASTCALL #define METH_FASTCALL 0x80 #endif typedef PyObject *(*__Pyx_PyCFunctionFast) 
(PyObject *self, PyObject *const *args, Py_ssize_t nargs); typedef PyObject *(*__Pyx_PyCFunctionFastWithKeywords) (PyObject *self, PyObject *const *args, Py_ssize_t nargs, PyObject *kwnames); #else #define __Pyx_PyCFunctionFast _PyCFunctionFast #define __Pyx_PyCFunctionFastWithKeywords _PyCFunctionFastWithKeywords #endif #if CYTHON_FAST_PYCCALL #define __Pyx_PyFastCFunction_Check(func)\ ((PyCFunction_Check(func) && (METH_FASTCALL == (PyCFunction_GET_FLAGS(func) & ~(METH_CLASS | METH_STATIC | METH_COEXIST | METH_KEYWORDS | METH_STACKLESS))))) #else #define __Pyx_PyFastCFunction_Check(func) 0 #endif #if CYTHON_COMPILING_IN_PYPY && !defined(PyObject_Malloc) #define PyObject_Malloc(s) PyMem_Malloc(s) #define PyObject_Free(p) PyMem_Free(p) #define PyObject_Realloc(p) PyMem_Realloc(p) #endif #if CYTHON_COMPILING_IN_CPYTHON && PY_VERSION_HEX < 0x030400A1 #define PyMem_RawMalloc(n) PyMem_Malloc(n) #define PyMem_RawRealloc(p, n) PyMem_Realloc(p, n) #define PyMem_RawFree(p) PyMem_Free(p) #endif #if CYTHON_COMPILING_IN_PYSTON #define __Pyx_PyCode_HasFreeVars(co) PyCode_HasFreeVars(co) #define __Pyx_PyFrame_SetLineNumber(frame, lineno) PyFrame_SetLineNumber(frame, lineno) #else #define __Pyx_PyCode_HasFreeVars(co) (PyCode_GetNumFree(co) > 0) #define __Pyx_PyFrame_SetLineNumber(frame, lineno) (frame)->f_lineno = (lineno) #endif #if !CYTHON_FAST_THREAD_STATE || PY_VERSION_HEX < 0x02070000 #define __Pyx_PyThreadState_Current PyThreadState_GET() #elif PY_VERSION_HEX >= 0x03060000 #define __Pyx_PyThreadState_Current _PyThreadState_UncheckedGet() #elif PY_VERSION_HEX >= 0x03000000 #define __Pyx_PyThreadState_Current PyThreadState_GET() #else #define __Pyx_PyThreadState_Current _PyThreadState_Current #endif #if PY_VERSION_HEX < 0x030700A2 && !defined(PyThread_tss_create) && !defined(Py_tss_NEEDS_INIT) #include "pythread.h" #define Py_tss_NEEDS_INIT 0 typedef int Py_tss_t; static CYTHON_INLINE int PyThread_tss_create(Py_tss_t *key) { *key = PyThread_create_key(); return 0; } static CYTHON_INLINE Py_tss_t * PyThread_tss_alloc(void) { Py_tss_t *key = (Py_tss_t *)PyObject_Malloc(sizeof(Py_tss_t)); *key = Py_tss_NEEDS_INIT; return key; } static CYTHON_INLINE void PyThread_tss_free(Py_tss_t *key) { PyObject_Free(key); } static CYTHON_INLINE int PyThread_tss_is_created(Py_tss_t *key) { return *key != Py_tss_NEEDS_INIT; } static CYTHON_INLINE void PyThread_tss_delete(Py_tss_t *key) { PyThread_delete_key(*key); *key = Py_tss_NEEDS_INIT; } static CYTHON_INLINE int PyThread_tss_set(Py_tss_t *key, void *value) { return PyThread_set_key_value(*key, value); } static CYTHON_INLINE void * PyThread_tss_get(Py_tss_t *key) { return PyThread_get_key_value(*key); } #endif #if CYTHON_COMPILING_IN_CPYTHON || defined(_PyDict_NewPresized) #define __Pyx_PyDict_NewPresized(n) ((n <= 8) ? 
PyDict_New() : _PyDict_NewPresized(n)) #else #define __Pyx_PyDict_NewPresized(n) PyDict_New() #endif #if PY_MAJOR_VERSION >= 3 || CYTHON_FUTURE_DIVISION #define __Pyx_PyNumber_Divide(x,y) PyNumber_TrueDivide(x,y) #define __Pyx_PyNumber_InPlaceDivide(x,y) PyNumber_InPlaceTrueDivide(x,y) #else #define __Pyx_PyNumber_Divide(x,y) PyNumber_Divide(x,y) #define __Pyx_PyNumber_InPlaceDivide(x,y) PyNumber_InPlaceDivide(x,y) #endif #if CYTHON_COMPILING_IN_CPYTHON && PY_VERSION_HEX >= 0x030500A1 && CYTHON_USE_UNICODE_INTERNALS #define __Pyx_PyDict_GetItemStr(dict, name) _PyDict_GetItem_KnownHash(dict, name, ((PyASCIIObject *) name)->hash) #else #define __Pyx_PyDict_GetItemStr(dict, name) PyDict_GetItem(dict, name) #endif #if PY_VERSION_HEX > 0x03030000 && defined(PyUnicode_KIND) #define CYTHON_PEP393_ENABLED 1 #define __Pyx_PyUnicode_READY(op) (likely(PyUnicode_IS_READY(op)) ?\ 0 : _PyUnicode_Ready((PyObject *)(op))) #define __Pyx_PyUnicode_GET_LENGTH(u) PyUnicode_GET_LENGTH(u) #define __Pyx_PyUnicode_READ_CHAR(u, i) PyUnicode_READ_CHAR(u, i) #define __Pyx_PyUnicode_MAX_CHAR_VALUE(u) PyUnicode_MAX_CHAR_VALUE(u) #define __Pyx_PyUnicode_KIND(u) PyUnicode_KIND(u) #define __Pyx_PyUnicode_DATA(u) PyUnicode_DATA(u) #define __Pyx_PyUnicode_READ(k, d, i) PyUnicode_READ(k, d, i) #define __Pyx_PyUnicode_WRITE(k, d, i, ch) PyUnicode_WRITE(k, d, i, ch) #define __Pyx_PyUnicode_IS_TRUE(u) (0 != (likely(PyUnicode_IS_READY(u)) ? PyUnicode_GET_LENGTH(u) : PyUnicode_GET_SIZE(u))) #else #define CYTHON_PEP393_ENABLED 0 #define PyUnicode_1BYTE_KIND 1 #define PyUnicode_2BYTE_KIND 2 #define PyUnicode_4BYTE_KIND 4 #define __Pyx_PyUnicode_READY(op) (0) #define __Pyx_PyUnicode_GET_LENGTH(u) PyUnicode_GET_SIZE(u) #define __Pyx_PyUnicode_READ_CHAR(u, i) ((Py_UCS4)(PyUnicode_AS_UNICODE(u)[i])) #define __Pyx_PyUnicode_MAX_CHAR_VALUE(u) ((sizeof(Py_UNICODE) == 2) ? 65535 : 1114111) #define __Pyx_PyUnicode_KIND(u) (sizeof(Py_UNICODE)) #define __Pyx_PyUnicode_DATA(u) ((void*)PyUnicode_AS_UNICODE(u)) #define __Pyx_PyUnicode_READ(k, d, i) ((void)(k), (Py_UCS4)(((Py_UNICODE*)d)[i])) #define __Pyx_PyUnicode_WRITE(k, d, i, ch) (((void)(k)), ((Py_UNICODE*)d)[i] = ch) #define __Pyx_PyUnicode_IS_TRUE(u) (0 != PyUnicode_GET_SIZE(u)) #endif #if CYTHON_COMPILING_IN_PYPY #define __Pyx_PyUnicode_Concat(a, b) PyNumber_Add(a, b) #define __Pyx_PyUnicode_ConcatSafe(a, b) PyNumber_Add(a, b) #else #define __Pyx_PyUnicode_Concat(a, b) PyUnicode_Concat(a, b) #define __Pyx_PyUnicode_ConcatSafe(a, b) ((unlikely((a) == Py_None) || unlikely((b) == Py_None)) ?\ PyNumber_Add(a, b) : __Pyx_PyUnicode_Concat(a, b)) #endif #if CYTHON_COMPILING_IN_PYPY && !defined(PyUnicode_Contains) #define PyUnicode_Contains(u, s) PySequence_Contains(u, s) #endif #if CYTHON_COMPILING_IN_PYPY && !defined(PyByteArray_Check) #define PyByteArray_Check(obj) PyObject_TypeCheck(obj, &PyByteArray_Type) #endif #if CYTHON_COMPILING_IN_PYPY && !defined(PyObject_Format) #define PyObject_Format(obj, fmt) PyObject_CallMethod(obj, "__format__", "O", fmt) #endif #define __Pyx_PyString_FormatSafe(a, b) ((unlikely((a) == Py_None || (PyString_Check(b) && !PyString_CheckExact(b)))) ? PyNumber_Remainder(a, b) : __Pyx_PyString_Format(a, b)) #define __Pyx_PyUnicode_FormatSafe(a, b) ((unlikely((a) == Py_None || (PyUnicode_Check(b) && !PyUnicode_CheckExact(b)))) ? 
PyNumber_Remainder(a, b) : PyUnicode_Format(a, b)) #if PY_MAJOR_VERSION >= 3 #define __Pyx_PyString_Format(a, b) PyUnicode_Format(a, b) #else #define __Pyx_PyString_Format(a, b) PyString_Format(a, b) #endif #if PY_MAJOR_VERSION < 3 && !defined(PyObject_ASCII) #define PyObject_ASCII(o) PyObject_Repr(o) #endif #if PY_MAJOR_VERSION >= 3 #define PyBaseString_Type PyUnicode_Type #define PyStringObject PyUnicodeObject #define PyString_Type PyUnicode_Type #define PyString_Check PyUnicode_Check #define PyString_CheckExact PyUnicode_CheckExact #define PyObject_Unicode PyObject_Str #endif #if PY_MAJOR_VERSION >= 3 #define __Pyx_PyBaseString_Check(obj) PyUnicode_Check(obj) #define __Pyx_PyBaseString_CheckExact(obj) PyUnicode_CheckExact(obj) #else #define __Pyx_PyBaseString_Check(obj) (PyString_Check(obj) || PyUnicode_Check(obj)) #define __Pyx_PyBaseString_CheckExact(obj) (PyString_CheckExact(obj) || PyUnicode_CheckExact(obj)) #endif #ifndef PySet_CheckExact #define PySet_CheckExact(obj) (Py_TYPE(obj) == &PySet_Type) #endif #if CYTHON_ASSUME_SAFE_MACROS #define __Pyx_PySequence_SIZE(seq) Py_SIZE(seq) #else #define __Pyx_PySequence_SIZE(seq) PySequence_Size(seq) #endif #if PY_MAJOR_VERSION >= 3 #define PyIntObject PyLongObject #define PyInt_Type PyLong_Type #define PyInt_Check(op) PyLong_Check(op) #define PyInt_CheckExact(op) PyLong_CheckExact(op) #define PyInt_FromString PyLong_FromString #define PyInt_FromUnicode PyLong_FromUnicode #define PyInt_FromLong PyLong_FromLong #define PyInt_FromSize_t PyLong_FromSize_t #define PyInt_FromSsize_t PyLong_FromSsize_t #define PyInt_AsLong PyLong_AsLong #define PyInt_AS_LONG PyLong_AS_LONG #define PyInt_AsSsize_t PyLong_AsSsize_t #define PyInt_AsUnsignedLongMask PyLong_AsUnsignedLongMask #define PyInt_AsUnsignedLongLongMask PyLong_AsUnsignedLongLongMask #define PyNumber_Int PyNumber_Long #endif #if PY_MAJOR_VERSION >= 3 #define PyBoolObject PyLongObject #endif #if PY_MAJOR_VERSION >= 3 && CYTHON_COMPILING_IN_PYPY #ifndef PyUnicode_InternFromString #define PyUnicode_InternFromString(s) PyUnicode_FromString(s) #endif #endif #if PY_VERSION_HEX < 0x030200A4 typedef long Py_hash_t; #define __Pyx_PyInt_FromHash_t PyInt_FromLong #define __Pyx_PyInt_AsHash_t PyInt_AsLong #else #define __Pyx_PyInt_FromHash_t PyInt_FromSsize_t #define __Pyx_PyInt_AsHash_t PyInt_AsSsize_t #endif #if PY_MAJOR_VERSION >= 3 #define __Pyx_PyMethod_New(func, self, klass) ((self) ? 
PyMethod_New(func, self) : (Py_INCREF(func), func)) #else #define __Pyx_PyMethod_New(func, self, klass) PyMethod_New(func, self, klass) #endif #if CYTHON_USE_ASYNC_SLOTS #if PY_VERSION_HEX >= 0x030500B1 #define __Pyx_PyAsyncMethodsStruct PyAsyncMethods #define __Pyx_PyType_AsAsync(obj) (Py_TYPE(obj)->tp_as_async) #else #define __Pyx_PyType_AsAsync(obj) ((__Pyx_PyAsyncMethodsStruct*) (Py_TYPE(obj)->tp_reserved)) #endif #else #define __Pyx_PyType_AsAsync(obj) NULL #endif #ifndef __Pyx_PyAsyncMethodsStruct typedef struct { unaryfunc am_await; unaryfunc am_aiter; unaryfunc am_anext; } __Pyx_PyAsyncMethodsStruct; #endif #if defined(WIN32) || defined(MS_WINDOWS) #define _USE_MATH_DEFINES #endif #include #ifdef NAN #define __PYX_NAN() ((float) NAN) #else static CYTHON_INLINE float __PYX_NAN() { float value; memset(&value, 0xFF, sizeof(value)); return value; } #endif #if defined(__CYGWIN__) && defined(_LDBL_EQ_DBL) #define __Pyx_truncl trunc #else #define __Pyx_truncl truncl #endif #define __PYX_ERR(f_index, lineno, Ln_error) \ { \ __pyx_filename = __pyx_f[f_index]; __pyx_lineno = lineno; __pyx_clineno = __LINE__; goto Ln_error; \ } #ifndef __PYX_EXTERN_C #ifdef __cplusplus #define __PYX_EXTERN_C extern "C" #else #define __PYX_EXTERN_C extern #endif #endif #define __PYX_HAVE__aiohttp___http_writer #define __PYX_HAVE_API__aiohttp___http_writer /* Early includes */ #include #include #include #ifdef _OPENMP #include #endif /* _OPENMP */ #if defined(PYREX_WITHOUT_ASSERTIONS) && !defined(CYTHON_WITHOUT_ASSERTIONS) #define CYTHON_WITHOUT_ASSERTIONS #endif typedef struct {PyObject **p; const char *s; const Py_ssize_t n; const char* encoding; const char is_unicode; const char is_str; const char intern; } __Pyx_StringTabEntry; #define __PYX_DEFAULT_STRING_ENCODING_IS_ASCII 0 #define __PYX_DEFAULT_STRING_ENCODING_IS_UTF8 0 #define __PYX_DEFAULT_STRING_ENCODING_IS_DEFAULT (PY_MAJOR_VERSION >= 3 && __PYX_DEFAULT_STRING_ENCODING_IS_UTF8) #define __PYX_DEFAULT_STRING_ENCODING "" #define __Pyx_PyObject_FromString __Pyx_PyBytes_FromString #define __Pyx_PyObject_FromStringAndSize __Pyx_PyBytes_FromStringAndSize #define __Pyx_uchar_cast(c) ((unsigned char)c) #define __Pyx_long_cast(x) ((long)x) #define __Pyx_fits_Py_ssize_t(v, type, is_signed) (\ (sizeof(type) < sizeof(Py_ssize_t)) ||\ (sizeof(type) > sizeof(Py_ssize_t) &&\ likely(v < (type)PY_SSIZE_T_MAX ||\ v == (type)PY_SSIZE_T_MAX) &&\ (!is_signed || likely(v > (type)PY_SSIZE_T_MIN ||\ v == (type)PY_SSIZE_T_MIN))) ||\ (sizeof(type) == sizeof(Py_ssize_t) &&\ (is_signed || likely(v < (type)PY_SSIZE_T_MAX ||\ v == (type)PY_SSIZE_T_MAX))) ) static CYTHON_INLINE int __Pyx_is_valid_index(Py_ssize_t i, Py_ssize_t limit) { return (size_t) i < (size_t) limit; } #if defined (__cplusplus) && __cplusplus >= 201103L #include #define __Pyx_sst_abs(value) std::abs(value) #elif SIZEOF_INT >= SIZEOF_SIZE_T #define __Pyx_sst_abs(value) abs(value) #elif SIZEOF_LONG >= SIZEOF_SIZE_T #define __Pyx_sst_abs(value) labs(value) #elif defined (_MSC_VER) #define __Pyx_sst_abs(value) ((Py_ssize_t)_abs64(value)) #elif defined (__STDC_VERSION__) && __STDC_VERSION__ >= 199901L #define __Pyx_sst_abs(value) llabs(value) #elif defined (__GNUC__) #define __Pyx_sst_abs(value) __builtin_llabs(value) #else #define __Pyx_sst_abs(value) ((value<0) ? 
-value : value) #endif static CYTHON_INLINE const char* __Pyx_PyObject_AsString(PyObject*); static CYTHON_INLINE const char* __Pyx_PyObject_AsStringAndSize(PyObject*, Py_ssize_t* length); #define __Pyx_PyByteArray_FromString(s) PyByteArray_FromStringAndSize((const char*)s, strlen((const char*)s)) #define __Pyx_PyByteArray_FromStringAndSize(s, l) PyByteArray_FromStringAndSize((const char*)s, l) #define __Pyx_PyBytes_FromString PyBytes_FromString #define __Pyx_PyBytes_FromStringAndSize PyBytes_FromStringAndSize static CYTHON_INLINE PyObject* __Pyx_PyUnicode_FromString(const char*); #if PY_MAJOR_VERSION < 3 #define __Pyx_PyStr_FromString __Pyx_PyBytes_FromString #define __Pyx_PyStr_FromStringAndSize __Pyx_PyBytes_FromStringAndSize #else #define __Pyx_PyStr_FromString __Pyx_PyUnicode_FromString #define __Pyx_PyStr_FromStringAndSize __Pyx_PyUnicode_FromStringAndSize #endif #define __Pyx_PyBytes_AsWritableString(s) ((char*) PyBytes_AS_STRING(s)) #define __Pyx_PyBytes_AsWritableSString(s) ((signed char*) PyBytes_AS_STRING(s)) #define __Pyx_PyBytes_AsWritableUString(s) ((unsigned char*) PyBytes_AS_STRING(s)) #define __Pyx_PyBytes_AsString(s) ((const char*) PyBytes_AS_STRING(s)) #define __Pyx_PyBytes_AsSString(s) ((const signed char*) PyBytes_AS_STRING(s)) #define __Pyx_PyBytes_AsUString(s) ((const unsigned char*) PyBytes_AS_STRING(s)) #define __Pyx_PyObject_AsWritableString(s) ((char*) __Pyx_PyObject_AsString(s)) #define __Pyx_PyObject_AsWritableSString(s) ((signed char*) __Pyx_PyObject_AsString(s)) #define __Pyx_PyObject_AsWritableUString(s) ((unsigned char*) __Pyx_PyObject_AsString(s)) #define __Pyx_PyObject_AsSString(s) ((const signed char*) __Pyx_PyObject_AsString(s)) #define __Pyx_PyObject_AsUString(s) ((const unsigned char*) __Pyx_PyObject_AsString(s)) #define __Pyx_PyObject_FromCString(s) __Pyx_PyObject_FromString((const char*)s) #define __Pyx_PyBytes_FromCString(s) __Pyx_PyBytes_FromString((const char*)s) #define __Pyx_PyByteArray_FromCString(s) __Pyx_PyByteArray_FromString((const char*)s) #define __Pyx_PyStr_FromCString(s) __Pyx_PyStr_FromString((const char*)s) #define __Pyx_PyUnicode_FromCString(s) __Pyx_PyUnicode_FromString((const char*)s) static CYTHON_INLINE size_t __Pyx_Py_UNICODE_strlen(const Py_UNICODE *u) { const Py_UNICODE *u_end = u; while (*u_end++) ; return (size_t)(u_end - u - 1); } #define __Pyx_PyUnicode_FromUnicode(u) PyUnicode_FromUnicode(u, __Pyx_Py_UNICODE_strlen(u)) #define __Pyx_PyUnicode_FromUnicodeAndLength PyUnicode_FromUnicode #define __Pyx_PyUnicode_AsUnicode PyUnicode_AsUnicode #define __Pyx_NewRef(obj) (Py_INCREF(obj), obj) #define __Pyx_Owned_Py_None(b) __Pyx_NewRef(Py_None) static CYTHON_INLINE PyObject * __Pyx_PyBool_FromLong(long b); static CYTHON_INLINE int __Pyx_PyObject_IsTrue(PyObject*); static CYTHON_INLINE int __Pyx_PyObject_IsTrueAndDecref(PyObject*); static CYTHON_INLINE PyObject* __Pyx_PyNumber_IntOrLong(PyObject* x); #define __Pyx_PySequence_Tuple(obj)\ (likely(PyTuple_CheckExact(obj)) ? __Pyx_NewRef(obj) : PySequence_Tuple(obj)) static CYTHON_INLINE Py_ssize_t __Pyx_PyIndex_AsSsize_t(PyObject*); static CYTHON_INLINE PyObject * __Pyx_PyInt_FromSize_t(size_t); #if CYTHON_ASSUME_SAFE_MACROS #define __pyx_PyFloat_AsDouble(x) (PyFloat_CheckExact(x) ? PyFloat_AS_DOUBLE(x) : PyFloat_AsDouble(x)) #else #define __pyx_PyFloat_AsDouble(x) PyFloat_AsDouble(x) #endif #define __pyx_PyFloat_AsFloat(x) ((float) __pyx_PyFloat_AsDouble(x)) #if PY_MAJOR_VERSION >= 3 #define __Pyx_PyNumber_Int(x) (PyLong_CheckExact(x) ? 
__Pyx_NewRef(x) : PyNumber_Long(x)) #else #define __Pyx_PyNumber_Int(x) (PyInt_CheckExact(x) ? __Pyx_NewRef(x) : PyNumber_Int(x)) #endif #define __Pyx_PyNumber_Float(x) (PyFloat_CheckExact(x) ? __Pyx_NewRef(x) : PyNumber_Float(x)) #if PY_MAJOR_VERSION < 3 && __PYX_DEFAULT_STRING_ENCODING_IS_ASCII static int __Pyx_sys_getdefaultencoding_not_ascii; static int __Pyx_init_sys_getdefaultencoding_params(void) { PyObject* sys; PyObject* default_encoding = NULL; PyObject* ascii_chars_u = NULL; PyObject* ascii_chars_b = NULL; const char* default_encoding_c; sys = PyImport_ImportModule("sys"); if (!sys) goto bad; default_encoding = PyObject_CallMethod(sys, (char*) "getdefaultencoding", NULL); Py_DECREF(sys); if (!default_encoding) goto bad; default_encoding_c = PyBytes_AsString(default_encoding); if (!default_encoding_c) goto bad; if (strcmp(default_encoding_c, "ascii") == 0) { __Pyx_sys_getdefaultencoding_not_ascii = 0; } else { char ascii_chars[128]; int c; for (c = 0; c < 128; c++) { ascii_chars[c] = c; } __Pyx_sys_getdefaultencoding_not_ascii = 1; ascii_chars_u = PyUnicode_DecodeASCII(ascii_chars, 128, NULL); if (!ascii_chars_u) goto bad; ascii_chars_b = PyUnicode_AsEncodedString(ascii_chars_u, default_encoding_c, NULL); if (!ascii_chars_b || !PyBytes_Check(ascii_chars_b) || memcmp(ascii_chars, PyBytes_AS_STRING(ascii_chars_b), 128) != 0) { PyErr_Format( PyExc_ValueError, "This module compiled with c_string_encoding=ascii, but default encoding '%.200s' is not a superset of ascii.", default_encoding_c); goto bad; } Py_DECREF(ascii_chars_u); Py_DECREF(ascii_chars_b); } Py_DECREF(default_encoding); return 0; bad: Py_XDECREF(default_encoding); Py_XDECREF(ascii_chars_u); Py_XDECREF(ascii_chars_b); return -1; } #endif #if __PYX_DEFAULT_STRING_ENCODING_IS_DEFAULT && PY_MAJOR_VERSION >= 3 #define __Pyx_PyUnicode_FromStringAndSize(c_str, size) PyUnicode_DecodeUTF8(c_str, size, NULL) #else #define __Pyx_PyUnicode_FromStringAndSize(c_str, size) PyUnicode_Decode(c_str, size, __PYX_DEFAULT_STRING_ENCODING, NULL) #if __PYX_DEFAULT_STRING_ENCODING_IS_DEFAULT static char* __PYX_DEFAULT_STRING_ENCODING; static int __Pyx_init_sys_getdefaultencoding_params(void) { PyObject* sys; PyObject* default_encoding = NULL; char* default_encoding_c; sys = PyImport_ImportModule("sys"); if (!sys) goto bad; default_encoding = PyObject_CallMethod(sys, (char*) (const char*) "getdefaultencoding", NULL); Py_DECREF(sys); if (!default_encoding) goto bad; default_encoding_c = PyBytes_AsString(default_encoding); if (!default_encoding_c) goto bad; __PYX_DEFAULT_STRING_ENCODING = (char*) malloc(strlen(default_encoding_c) + 1); if (!__PYX_DEFAULT_STRING_ENCODING) goto bad; strcpy(__PYX_DEFAULT_STRING_ENCODING, default_encoding_c); Py_DECREF(default_encoding); return 0; bad: Py_XDECREF(default_encoding); return -1; } #endif #endif /* Test for GCC > 2.95 */ #if defined(__GNUC__) && (__GNUC__ > 2 || (__GNUC__ == 2 && (__GNUC_MINOR__ > 95))) #define likely(x) __builtin_expect(!!(x), 1) #define unlikely(x) __builtin_expect(!!(x), 0) #else /* !__GNUC__ or GCC < 2.95 */ #define likely(x) (x) #define unlikely(x) (x) #endif /* __GNUC__ */ static CYTHON_INLINE void __Pyx_pretend_to_initialize(void* ptr) { (void)ptr; } static PyObject *__pyx_m = NULL; static PyObject *__pyx_d; static PyObject *__pyx_b; static PyObject *__pyx_cython_runtime = NULL; static PyObject *__pyx_empty_tuple; static PyObject *__pyx_empty_bytes; static PyObject *__pyx_empty_unicode; static int __pyx_lineno; static int __pyx_clineno = 0; static const char * __pyx_cfilenm= 
__FILE__; static const char *__pyx_filename; static const char *__pyx_f[] = { "aiohttp/_http_writer.pyx", "type.pxd", }; /*--- Type declarations ---*/ struct __pyx_t_7aiohttp_12_http_writer_Writer; /* "aiohttp/_http_writer.pyx":19 * # ----------------- writer --------------------------- * * cdef struct Writer: # <<<<<<<<<<<<<< * char *buf * Py_ssize_t size */ struct __pyx_t_7aiohttp_12_http_writer_Writer { char *buf; Py_ssize_t size; Py_ssize_t pos; }; /* --- Runtime support code (head) --- */ /* Refnanny.proto */ #ifndef CYTHON_REFNANNY #define CYTHON_REFNANNY 0 #endif #if CYTHON_REFNANNY typedef struct { void (*INCREF)(void*, PyObject*, int); void (*DECREF)(void*, PyObject*, int); void (*GOTREF)(void*, PyObject*, int); void (*GIVEREF)(void*, PyObject*, int); void* (*SetupContext)(const char*, int, const char*); void (*FinishContext)(void**); } __Pyx_RefNannyAPIStruct; static __Pyx_RefNannyAPIStruct *__Pyx_RefNanny = NULL; static __Pyx_RefNannyAPIStruct *__Pyx_RefNannyImportAPI(const char *modname); #define __Pyx_RefNannyDeclarations void *__pyx_refnanny = NULL; #ifdef WITH_THREAD #define __Pyx_RefNannySetupContext(name, acquire_gil)\ if (acquire_gil) {\ PyGILState_STATE __pyx_gilstate_save = PyGILState_Ensure();\ __pyx_refnanny = __Pyx_RefNanny->SetupContext((name), __LINE__, __FILE__);\ PyGILState_Release(__pyx_gilstate_save);\ } else {\ __pyx_refnanny = __Pyx_RefNanny->SetupContext((name), __LINE__, __FILE__);\ } #else #define __Pyx_RefNannySetupContext(name, acquire_gil)\ __pyx_refnanny = __Pyx_RefNanny->SetupContext((name), __LINE__, __FILE__) #endif #define __Pyx_RefNannyFinishContext()\ __Pyx_RefNanny->FinishContext(&__pyx_refnanny) #define __Pyx_INCREF(r) __Pyx_RefNanny->INCREF(__pyx_refnanny, (PyObject *)(r), __LINE__) #define __Pyx_DECREF(r) __Pyx_RefNanny->DECREF(__pyx_refnanny, (PyObject *)(r), __LINE__) #define __Pyx_GOTREF(r) __Pyx_RefNanny->GOTREF(__pyx_refnanny, (PyObject *)(r), __LINE__) #define __Pyx_GIVEREF(r) __Pyx_RefNanny->GIVEREF(__pyx_refnanny, (PyObject *)(r), __LINE__) #define __Pyx_XINCREF(r) do { if((r) != NULL) {__Pyx_INCREF(r); }} while(0) #define __Pyx_XDECREF(r) do { if((r) != NULL) {__Pyx_DECREF(r); }} while(0) #define __Pyx_XGOTREF(r) do { if((r) != NULL) {__Pyx_GOTREF(r); }} while(0) #define __Pyx_XGIVEREF(r) do { if((r) != NULL) {__Pyx_GIVEREF(r);}} while(0) #else #define __Pyx_RefNannyDeclarations #define __Pyx_RefNannySetupContext(name, acquire_gil) #define __Pyx_RefNannyFinishContext() #define __Pyx_INCREF(r) Py_INCREF(r) #define __Pyx_DECREF(r) Py_DECREF(r) #define __Pyx_GOTREF(r) #define __Pyx_GIVEREF(r) #define __Pyx_XINCREF(r) Py_XINCREF(r) #define __Pyx_XDECREF(r) Py_XDECREF(r) #define __Pyx_XGOTREF(r) #define __Pyx_XGIVEREF(r) #endif #define __Pyx_XDECREF_SET(r, v) do {\ PyObject *tmp = (PyObject *) r;\ r = v; __Pyx_XDECREF(tmp);\ } while (0) #define __Pyx_DECREF_SET(r, v) do {\ PyObject *tmp = (PyObject *) r;\ r = v; __Pyx_DECREF(tmp);\ } while (0) #define __Pyx_CLEAR(r) do { PyObject* tmp = ((PyObject*)(r)); r = NULL; __Pyx_DECREF(tmp);} while(0) #define __Pyx_XCLEAR(r) do { if((r) != NULL) {PyObject* tmp = ((PyObject*)(r)); r = NULL; __Pyx_DECREF(tmp);}} while(0) /* PyObjectGetAttrStr.proto */ #if CYTHON_USE_TYPE_SLOTS static CYTHON_INLINE PyObject* __Pyx_PyObject_GetAttrStr(PyObject* obj, PyObject* attr_name); #else #define __Pyx_PyObject_GetAttrStr(o,n) PyObject_GetAttr(o,n) #endif /* GetBuiltinName.proto */ static PyObject *__Pyx_GetBuiltinName(PyObject *name); /* PyThreadStateGet.proto */ #if CYTHON_FAST_THREAD_STATE #define 
__Pyx_PyThreadState_declare PyThreadState *__pyx_tstate; #define __Pyx_PyThreadState_assign __pyx_tstate = __Pyx_PyThreadState_Current; #define __Pyx_PyErr_Occurred() __pyx_tstate->curexc_type #else #define __Pyx_PyThreadState_declare #define __Pyx_PyThreadState_assign #define __Pyx_PyErr_Occurred() PyErr_Occurred() #endif /* PyErrFetchRestore.proto */ #if CYTHON_FAST_THREAD_STATE #define __Pyx_PyErr_Clear() __Pyx_ErrRestore(NULL, NULL, NULL) #define __Pyx_ErrRestoreWithState(type, value, tb) __Pyx_ErrRestoreInState(PyThreadState_GET(), type, value, tb) #define __Pyx_ErrFetchWithState(type, value, tb) __Pyx_ErrFetchInState(PyThreadState_GET(), type, value, tb) #define __Pyx_ErrRestore(type, value, tb) __Pyx_ErrRestoreInState(__pyx_tstate, type, value, tb) #define __Pyx_ErrFetch(type, value, tb) __Pyx_ErrFetchInState(__pyx_tstate, type, value, tb) static CYTHON_INLINE void __Pyx_ErrRestoreInState(PyThreadState *tstate, PyObject *type, PyObject *value, PyObject *tb); static CYTHON_INLINE void __Pyx_ErrFetchInState(PyThreadState *tstate, PyObject **type, PyObject **value, PyObject **tb); #if CYTHON_COMPILING_IN_CPYTHON #define __Pyx_PyErr_SetNone(exc) (Py_INCREF(exc), __Pyx_ErrRestore((exc), NULL, NULL)) #else #define __Pyx_PyErr_SetNone(exc) PyErr_SetNone(exc) #endif #else #define __Pyx_PyErr_Clear() PyErr_Clear() #define __Pyx_PyErr_SetNone(exc) PyErr_SetNone(exc) #define __Pyx_ErrRestoreWithState(type, value, tb) PyErr_Restore(type, value, tb) #define __Pyx_ErrFetchWithState(type, value, tb) PyErr_Fetch(type, value, tb) #define __Pyx_ErrRestoreInState(tstate, type, value, tb) PyErr_Restore(type, value, tb) #define __Pyx_ErrFetchInState(tstate, type, value, tb) PyErr_Fetch(type, value, tb) #define __Pyx_ErrRestore(type, value, tb) PyErr_Restore(type, value, tb) #define __Pyx_ErrFetch(type, value, tb) PyErr_Fetch(type, value, tb) #endif /* WriteUnraisableException.proto */ static void __Pyx_WriteUnraisable(const char *name, int clineno, int lineno, const char *filename, int full_traceback, int nogil); /* unicode_iter.proto */ static CYTHON_INLINE int __Pyx_init_unicode_iteration( PyObject* ustring, Py_ssize_t *length, void** data, int *kind); /* PyCFunctionFastCall.proto */ #if CYTHON_FAST_PYCCALL static CYTHON_INLINE PyObject *__Pyx_PyCFunction_FastCall(PyObject *func, PyObject **args, Py_ssize_t nargs); #else #define __Pyx_PyCFunction_FastCall(func, args, nargs) (assert(0), NULL) #endif /* PyFunctionFastCall.proto */ #if CYTHON_FAST_PYCALL #define __Pyx_PyFunction_FastCall(func, args, nargs)\ __Pyx_PyFunction_FastCallDict((func), (args), (nargs), NULL) #if 1 || PY_VERSION_HEX < 0x030600B1 static PyObject *__Pyx_PyFunction_FastCallDict(PyObject *func, PyObject **args, Py_ssize_t nargs, PyObject *kwargs); #else #define __Pyx_PyFunction_FastCallDict(func, args, nargs, kwargs) _PyFunction_FastCallDict(func, args, nargs, kwargs) #endif #define __Pyx_BUILD_ASSERT_EXPR(cond)\ (sizeof(char [1 - 2*!(cond)]) - 1) #ifndef Py_MEMBER_SIZE #define Py_MEMBER_SIZE(type, member) sizeof(((type *)0)->member) #endif static size_t __pyx_pyframe_localsplus_offset = 0; #include "frameobject.h" #define __Pxy_PyFrame_Initialize_Offsets()\ ((void)__Pyx_BUILD_ASSERT_EXPR(sizeof(PyFrameObject) == offsetof(PyFrameObject, f_localsplus) + Py_MEMBER_SIZE(PyFrameObject, f_localsplus)),\ (void)(__pyx_pyframe_localsplus_offset = ((size_t)PyFrame_Type.tp_basicsize) - Py_MEMBER_SIZE(PyFrameObject, f_localsplus))) #define __Pyx_PyFrame_GetLocalsplus(frame)\ (assert(__pyx_pyframe_localsplus_offset), (PyObject **)(((char 
*)(frame)) + __pyx_pyframe_localsplus_offset)) #endif /* PyObjectCall.proto */ #if CYTHON_COMPILING_IN_CPYTHON static CYTHON_INLINE PyObject* __Pyx_PyObject_Call(PyObject *func, PyObject *arg, PyObject *kw); #else #define __Pyx_PyObject_Call(func, arg, kw) PyObject_Call(func, arg, kw) #endif /* PyObjectCall2Args.proto */ static CYTHON_UNUSED PyObject* __Pyx_PyObject_Call2Args(PyObject* function, PyObject* arg1, PyObject* arg2); /* PyObjectCallMethO.proto */ #if CYTHON_COMPILING_IN_CPYTHON static CYTHON_INLINE PyObject* __Pyx_PyObject_CallMethO(PyObject *func, PyObject *arg); #endif /* PyObjectCallOneArg.proto */ static CYTHON_INLINE PyObject* __Pyx_PyObject_CallOneArg(PyObject *func, PyObject *arg); /* RaiseException.proto */ static void __Pyx_Raise(PyObject *type, PyObject *value, PyObject *tb, PyObject *cause); /* RaiseArgTupleInvalid.proto */ static void __Pyx_RaiseArgtupleInvalid(const char* func_name, int exact, Py_ssize_t num_min, Py_ssize_t num_max, Py_ssize_t num_found); /* RaiseDoubleKeywords.proto */ static void __Pyx_RaiseDoubleKeywordsError(const char* func_name, PyObject* kw_name); /* ParseKeywords.proto */ static int __Pyx_ParseOptionalKeywords(PyObject *kwds, PyObject **argnames[],\ PyObject *kwds2, PyObject *values[], Py_ssize_t num_pos_args,\ const char* function_name); /* ArgTypeTest.proto */ #define __Pyx_ArgTypeTest(obj, type, none_allowed, name, exact)\ ((likely((Py_TYPE(obj) == type) | (none_allowed && (obj == Py_None)))) ? 1 :\ __Pyx__ArgTypeTest(obj, type, name, exact)) static int __Pyx__ArgTypeTest(PyObject *obj, PyTypeObject *type, const char *name, int exact); /* GetTopmostException.proto */ #if CYTHON_USE_EXC_INFO_STACK static _PyErr_StackItem * __Pyx_PyErr_GetTopmostException(PyThreadState *tstate); #endif /* ReRaiseException.proto */ static CYTHON_INLINE void __Pyx_ReraiseException(void); /* IterFinish.proto */ static CYTHON_INLINE int __Pyx_IterFinish(void); /* PyObjectCallNoArg.proto */ #if CYTHON_COMPILING_IN_CPYTHON static CYTHON_INLINE PyObject* __Pyx_PyObject_CallNoArg(PyObject *func); #else #define __Pyx_PyObject_CallNoArg(func) __Pyx_PyObject_Call(func, __pyx_empty_tuple, NULL) #endif /* PyObjectGetMethod.proto */ static int __Pyx_PyObject_GetMethod(PyObject *obj, PyObject *name, PyObject **method); /* PyObjectCallMethod0.proto */ static PyObject* __Pyx_PyObject_CallMethod0(PyObject* obj, PyObject* method_name); /* RaiseNeedMoreValuesToUnpack.proto */ static CYTHON_INLINE void __Pyx_RaiseNeedMoreValuesError(Py_ssize_t index); /* RaiseTooManyValuesToUnpack.proto */ static CYTHON_INLINE void __Pyx_RaiseTooManyValuesError(Py_ssize_t expected); /* UnpackItemEndCheck.proto */ static int __Pyx_IternextUnpackEndCheck(PyObject *retval, Py_ssize_t expected); /* RaiseNoneIterError.proto */ static CYTHON_INLINE void __Pyx_RaiseNoneNotIterableError(void); /* UnpackTupleError.proto */ static void __Pyx_UnpackTupleError(PyObject *, Py_ssize_t index); /* UnpackTuple2.proto */ #define __Pyx_unpack_tuple2(tuple, value1, value2, is_tuple, has_known_size, decref_tuple)\ (likely(is_tuple || PyTuple_Check(tuple)) ?\ (likely(has_known_size || PyTuple_GET_SIZE(tuple) == 2) ?\ __Pyx_unpack_tuple2_exact(tuple, value1, value2, decref_tuple) :\ (__Pyx_UnpackTupleError(tuple, 2), -1)) :\ __Pyx_unpack_tuple2_generic(tuple, value1, value2, has_known_size, decref_tuple)) static CYTHON_INLINE int __Pyx_unpack_tuple2_exact( PyObject* tuple, PyObject** value1, PyObject** value2, int decref_tuple); static int __Pyx_unpack_tuple2_generic( PyObject* tuple, PyObject** value1, PyObject** 
value2, int has_known_size, int decref_tuple); /* dict_iter.proto */ static CYTHON_INLINE PyObject* __Pyx_dict_iterator(PyObject* dict, int is_dict, PyObject* method_name, Py_ssize_t* p_orig_length, int* p_is_dict); static CYTHON_INLINE int __Pyx_dict_iter_next(PyObject* dict_or_iter, Py_ssize_t orig_length, Py_ssize_t* ppos, PyObject** pkey, PyObject** pvalue, PyObject** pitem, int is_dict); /* GetException.proto */ #if CYTHON_FAST_THREAD_STATE #define __Pyx_GetException(type, value, tb) __Pyx__GetException(__pyx_tstate, type, value, tb) static int __Pyx__GetException(PyThreadState *tstate, PyObject **type, PyObject **value, PyObject **tb); #else static int __Pyx_GetException(PyObject **type, PyObject **value, PyObject **tb); #endif /* SwapException.proto */ #if CYTHON_FAST_THREAD_STATE #define __Pyx_ExceptionSwap(type, value, tb) __Pyx__ExceptionSwap(__pyx_tstate, type, value, tb) static CYTHON_INLINE void __Pyx__ExceptionSwap(PyThreadState *tstate, PyObject **type, PyObject **value, PyObject **tb); #else static CYTHON_INLINE void __Pyx_ExceptionSwap(PyObject **type, PyObject **value, PyObject **tb); #endif /* SaveResetException.proto */ #if CYTHON_FAST_THREAD_STATE #define __Pyx_ExceptionSave(type, value, tb) __Pyx__ExceptionSave(__pyx_tstate, type, value, tb) static CYTHON_INLINE void __Pyx__ExceptionSave(PyThreadState *tstate, PyObject **type, PyObject **value, PyObject **tb); #define __Pyx_ExceptionReset(type, value, tb) __Pyx__ExceptionReset(__pyx_tstate, type, value, tb) static CYTHON_INLINE void __Pyx__ExceptionReset(PyThreadState *tstate, PyObject *type, PyObject *value, PyObject *tb); #else #define __Pyx_ExceptionSave(type, value, tb) PyErr_GetExcInfo(type, value, tb) #define __Pyx_ExceptionReset(type, value, tb) PyErr_SetExcInfo(type, value, tb) #endif /* TypeImport.proto */ #ifndef __PYX_HAVE_RT_ImportType_proto #define __PYX_HAVE_RT_ImportType_proto enum __Pyx_ImportType_CheckSize { __Pyx_ImportType_CheckSize_Error = 0, __Pyx_ImportType_CheckSize_Warn = 1, __Pyx_ImportType_CheckSize_Ignore = 2 }; static PyTypeObject *__Pyx_ImportType(PyObject* module, const char *module_name, const char *class_name, size_t size, enum __Pyx_ImportType_CheckSize check_size); #endif /* Import.proto */ static PyObject *__Pyx_Import(PyObject *name, PyObject *from_list, int level); /* ImportFrom.proto */ static PyObject* __Pyx_ImportFrom(PyObject* module, PyObject* name); /* PyDictVersioning.proto */ #if CYTHON_USE_DICT_VERSIONS && CYTHON_USE_TYPE_SLOTS #define __PYX_DICT_VERSION_INIT ((PY_UINT64_T) -1) #define __PYX_GET_DICT_VERSION(dict) (((PyDictObject*)(dict))->ma_version_tag) #define __PYX_UPDATE_DICT_CACHE(dict, value, cache_var, version_var)\ (version_var) = __PYX_GET_DICT_VERSION(dict);\ (cache_var) = (value); #define __PYX_PY_DICT_LOOKUP_IF_MODIFIED(VAR, DICT, LOOKUP) {\ static PY_UINT64_T __pyx_dict_version = 0;\ static PyObject *__pyx_dict_cached_value = NULL;\ if (likely(__PYX_GET_DICT_VERSION(DICT) == __pyx_dict_version)) {\ (VAR) = __pyx_dict_cached_value;\ } else {\ (VAR) = __pyx_dict_cached_value = (LOOKUP);\ __pyx_dict_version = __PYX_GET_DICT_VERSION(DICT);\ }\ } static CYTHON_INLINE PY_UINT64_T __Pyx_get_tp_dict_version(PyObject *obj); static CYTHON_INLINE PY_UINT64_T __Pyx_get_object_dict_version(PyObject *obj); static CYTHON_INLINE int __Pyx_object_dict_version_matches(PyObject* obj, PY_UINT64_T tp_dict_version, PY_UINT64_T obj_dict_version); #else #define __PYX_GET_DICT_VERSION(dict) (0) #define __PYX_UPDATE_DICT_CACHE(dict, value, cache_var, version_var) #define 
__PYX_PY_DICT_LOOKUP_IF_MODIFIED(VAR, DICT, LOOKUP) (VAR) = (LOOKUP); #endif /* GetModuleGlobalName.proto */ #if CYTHON_USE_DICT_VERSIONS #define __Pyx_GetModuleGlobalName(var, name) {\ static PY_UINT64_T __pyx_dict_version = 0;\ static PyObject *__pyx_dict_cached_value = NULL;\ (var) = (likely(__pyx_dict_version == __PYX_GET_DICT_VERSION(__pyx_d))) ?\ (likely(__pyx_dict_cached_value) ? __Pyx_NewRef(__pyx_dict_cached_value) : __Pyx_GetBuiltinName(name)) :\ __Pyx__GetModuleGlobalName(name, &__pyx_dict_version, &__pyx_dict_cached_value);\ } #define __Pyx_GetModuleGlobalNameUncached(var, name) {\ PY_UINT64_T __pyx_dict_version;\ PyObject *__pyx_dict_cached_value;\ (var) = __Pyx__GetModuleGlobalName(name, &__pyx_dict_version, &__pyx_dict_cached_value);\ } static PyObject *__Pyx__GetModuleGlobalName(PyObject *name, PY_UINT64_T *dict_version, PyObject **dict_cached_value); #else #define __Pyx_GetModuleGlobalName(var, name) (var) = __Pyx__GetModuleGlobalName(name) #define __Pyx_GetModuleGlobalNameUncached(var, name) (var) = __Pyx__GetModuleGlobalName(name) static CYTHON_INLINE PyObject *__Pyx__GetModuleGlobalName(PyObject *name); #endif /* CLineInTraceback.proto */ #ifdef CYTHON_CLINE_IN_TRACEBACK #define __Pyx_CLineForTraceback(tstate, c_line) (((CYTHON_CLINE_IN_TRACEBACK)) ? c_line : 0) #else static int __Pyx_CLineForTraceback(PyThreadState *tstate, int c_line); #endif /* CodeObjectCache.proto */ typedef struct { PyCodeObject* code_object; int code_line; } __Pyx_CodeObjectCacheEntry; struct __Pyx_CodeObjectCache { int count; int max_count; __Pyx_CodeObjectCacheEntry* entries; }; static struct __Pyx_CodeObjectCache __pyx_code_cache = {0,0,NULL}; static int __pyx_bisect_code_objects(__Pyx_CodeObjectCacheEntry* entries, int count, int code_line); static PyCodeObject *__pyx_find_code_object(int code_line); static void __pyx_insert_code_object(int code_line, PyCodeObject* code_object); /* AddTraceback.proto */ static void __Pyx_AddTraceback(const char *funcname, int c_line, int py_line, const char *filename); /* CIntToPy.proto */ static CYTHON_INLINE PyObject* __Pyx_PyInt_From_long(long value); /* CIntFromPy.proto */ static CYTHON_INLINE long __Pyx_PyInt_As_long(PyObject *); /* CIntFromPy.proto */ static CYTHON_INLINE int __Pyx_PyInt_As_int(PyObject *); /* FastTypeChecks.proto */ #if CYTHON_COMPILING_IN_CPYTHON #define __Pyx_TypeCheck(obj, type) __Pyx_IsSubtype(Py_TYPE(obj), (PyTypeObject *)type) static CYTHON_INLINE int __Pyx_IsSubtype(PyTypeObject *a, PyTypeObject *b); static CYTHON_INLINE int __Pyx_PyErr_GivenExceptionMatches(PyObject *err, PyObject *type); static CYTHON_INLINE int __Pyx_PyErr_GivenExceptionMatches2(PyObject *err, PyObject *type1, PyObject *type2); #else #define __Pyx_TypeCheck(obj, type) PyObject_TypeCheck(obj, (PyTypeObject *)type) #define __Pyx_PyErr_GivenExceptionMatches(err, type) PyErr_GivenExceptionMatches(err, type) #define __Pyx_PyErr_GivenExceptionMatches2(err, type1, type2) (PyErr_GivenExceptionMatches(err, type1) || PyErr_GivenExceptionMatches(err, type2)) #endif #define __Pyx_PyException_Check(obj) __Pyx_TypeCheck(obj, PyExc_Exception) /* CheckBinaryVersion.proto */ static int __Pyx_check_binary_version(void); /* InitStrings.proto */ static int __Pyx_InitStrings(__Pyx_StringTabEntry *t); /* Module declarations from 'libc.stdint' */ /* Module declarations from 'libc.string' */ /* Module declarations from 'libc.stdio' */ /* Module declarations from '__builtin__' */ /* Module declarations from 'cpython.type' */ static PyTypeObject *__pyx_ptype_7cpython_4type_type = 0; 
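/*
 * Overview sketch of the aiohttp._http_writer helpers declared below: a
 * Writer appends into the static 0x4000-byte BUFFER, grows its buffer in
 * BUF_SIZE steps via PyMem_Malloc/PyMem_Realloc once it is full, and
 * serializes str data by emitting 1-4 UTF-8 bytes per Py_UCS4 code point.
 * The unused helper below is only a hypothetical illustration of that
 * byte-count rule; it is not part of the generated module.
 */
static CYTHON_UNUSED int __pyx_sketch_utf8_octets(uint64_t cp) {
    if (cp < 0x80) return 1;                    /* ASCII: single byte       */
    if (cp < 0x800) return 2;                   /* two-byte sequence        */
    if (cp >= 0xD800 && cp <= 0xDFFF) return 0; /* surrogate range: skipped */
    if (cp < 0x10000) return 3;                 /* three-byte sequence      */
    if (cp > 0x10FFFF) return 0;                /* out of range: skipped    */
    return 4;                                   /* four-byte sequence       */
}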
/* Module declarations from 'cpython' */ /* Module declarations from 'cpython.object' */ /* Module declarations from 'cpython.exc' */ /* Module declarations from 'cpython.mem' */ /* Module declarations from 'cpython.bytes' */ /* Module declarations from 'aiohttp._http_writer' */ static char __pyx_v_7aiohttp_12_http_writer_BUFFER[0x4000]; static PyObject *__pyx_v_7aiohttp_12_http_writer__istr = 0; static CYTHON_INLINE void __pyx_f_7aiohttp_12_http_writer__init_writer(struct __pyx_t_7aiohttp_12_http_writer_Writer *); /*proto*/ static CYTHON_INLINE void __pyx_f_7aiohttp_12_http_writer__release_writer(struct __pyx_t_7aiohttp_12_http_writer_Writer *); /*proto*/ static CYTHON_INLINE int __pyx_f_7aiohttp_12_http_writer__write_byte(struct __pyx_t_7aiohttp_12_http_writer_Writer *, uint8_t); /*proto*/ static CYTHON_INLINE int __pyx_f_7aiohttp_12_http_writer__write_utf8(struct __pyx_t_7aiohttp_12_http_writer_Writer *, Py_UCS4); /*proto*/ static CYTHON_INLINE int __pyx_f_7aiohttp_12_http_writer__write_str(struct __pyx_t_7aiohttp_12_http_writer_Writer *, PyObject *); /*proto*/ static PyObject *__pyx_f_7aiohttp_12_http_writer_to_str(PyObject *); /*proto*/ #define __Pyx_MODULE_NAME "aiohttp._http_writer" extern int __pyx_module_is_main_aiohttp___http_writer; int __pyx_module_is_main_aiohttp___http_writer = 0; /* Implementation of 'aiohttp._http_writer' */ static PyObject *__pyx_builtin_TypeError; static const char __pyx_k_key[] = "key"; static const char __pyx_k_ret[] = "ret"; static const char __pyx_k_val[] = "val"; static const char __pyx_k_istr[] = "istr"; static const char __pyx_k_main[] = "__main__"; static const char __pyx_k_name[] = "__name__"; static const char __pyx_k_test[] = "__test__"; static const char __pyx_k_items[] = "items"; static const char __pyx_k_format[] = "format"; static const char __pyx_k_import[] = "__import__"; static const char __pyx_k_writer[] = "writer"; static const char __pyx_k_headers[] = "headers"; static const char __pyx_k_TypeError[] = "TypeError"; static const char __pyx_k_multidict[] = "multidict"; static const char __pyx_k_status_line[] = "status_line"; static const char __pyx_k_serialize_headers[] = "_serialize_headers"; static const char __pyx_k_cline_in_traceback[] = "cline_in_traceback"; static const char __pyx_k_aiohttp__http_writer[] = "aiohttp._http_writer"; static const char __pyx_k_aiohttp__http_writer_pyx[] = "aiohttp/_http_writer.pyx"; static const char __pyx_k_Cannot_serialize_non_str_key_r[] = "Cannot serialize non-str key {!r}"; static PyObject *__pyx_kp_u_Cannot_serialize_non_str_key_r; static PyObject *__pyx_n_s_TypeError; static PyObject *__pyx_n_s_aiohttp__http_writer; static PyObject *__pyx_kp_s_aiohttp__http_writer_pyx; static PyObject *__pyx_n_s_cline_in_traceback; static PyObject *__pyx_n_s_format; static PyObject *__pyx_n_s_headers; static PyObject *__pyx_n_s_import; static PyObject *__pyx_n_s_istr; static PyObject *__pyx_n_s_items; static PyObject *__pyx_n_s_key; static PyObject *__pyx_n_s_main; static PyObject *__pyx_n_s_multidict; static PyObject *__pyx_n_s_name; static PyObject *__pyx_n_s_ret; static PyObject *__pyx_n_s_serialize_headers; static PyObject *__pyx_n_s_status_line; static PyObject *__pyx_n_s_test; static PyObject *__pyx_n_s_val; static PyObject *__pyx_n_s_writer; static PyObject *__pyx_pf_7aiohttp_12_http_writer__serialize_headers(CYTHON_UNUSED PyObject *__pyx_self, PyObject *__pyx_v_status_line, PyObject *__pyx_v_headers); /* proto */ static PyObject *__pyx_tuple_; static PyObject *__pyx_codeobj__2; /* Late includes */ /* 
"aiohttp/_http_writer.pyx":25 * * * cdef inline void _init_writer(Writer* writer): # <<<<<<<<<<<<<< * writer.buf = &BUFFER[0] * writer.size = BUF_SIZE */ static CYTHON_INLINE void __pyx_f_7aiohttp_12_http_writer__init_writer(struct __pyx_t_7aiohttp_12_http_writer_Writer *__pyx_v_writer) { __Pyx_RefNannyDeclarations __Pyx_RefNannySetupContext("_init_writer", 0); /* "aiohttp/_http_writer.pyx":26 * * cdef inline void _init_writer(Writer* writer): * writer.buf = &BUFFER[0] # <<<<<<<<<<<<<< * writer.size = BUF_SIZE * writer.pos = 0 */ __pyx_v_writer->buf = (&(__pyx_v_7aiohttp_12_http_writer_BUFFER[0])); /* "aiohttp/_http_writer.pyx":27 * cdef inline void _init_writer(Writer* writer): * writer.buf = &BUFFER[0] * writer.size = BUF_SIZE # <<<<<<<<<<<<<< * writer.pos = 0 * */ __pyx_v_writer->size = 0x4000; /* "aiohttp/_http_writer.pyx":28 * writer.buf = &BUFFER[0] * writer.size = BUF_SIZE * writer.pos = 0 # <<<<<<<<<<<<<< * * */ __pyx_v_writer->pos = 0; /* "aiohttp/_http_writer.pyx":25 * * * cdef inline void _init_writer(Writer* writer): # <<<<<<<<<<<<<< * writer.buf = &BUFFER[0] * writer.size = BUF_SIZE */ /* function exit code */ __Pyx_RefNannyFinishContext(); } /* "aiohttp/_http_writer.pyx":31 * * * cdef inline void _release_writer(Writer* writer): # <<<<<<<<<<<<<< * if writer.buf != BUFFER: * PyMem_Free(writer.buf) */ static CYTHON_INLINE void __pyx_f_7aiohttp_12_http_writer__release_writer(struct __pyx_t_7aiohttp_12_http_writer_Writer *__pyx_v_writer) { __Pyx_RefNannyDeclarations int __pyx_t_1; __Pyx_RefNannySetupContext("_release_writer", 0); /* "aiohttp/_http_writer.pyx":32 * * cdef inline void _release_writer(Writer* writer): * if writer.buf != BUFFER: # <<<<<<<<<<<<<< * PyMem_Free(writer.buf) * */ __pyx_t_1 = ((__pyx_v_writer->buf != __pyx_v_7aiohttp_12_http_writer_BUFFER) != 0); if (__pyx_t_1) { /* "aiohttp/_http_writer.pyx":33 * cdef inline void _release_writer(Writer* writer): * if writer.buf != BUFFER: * PyMem_Free(writer.buf) # <<<<<<<<<<<<<< * * */ PyMem_Free(__pyx_v_writer->buf); /* "aiohttp/_http_writer.pyx":32 * * cdef inline void _release_writer(Writer* writer): * if writer.buf != BUFFER: # <<<<<<<<<<<<<< * PyMem_Free(writer.buf) * */ } /* "aiohttp/_http_writer.pyx":31 * * * cdef inline void _release_writer(Writer* writer): # <<<<<<<<<<<<<< * if writer.buf != BUFFER: * PyMem_Free(writer.buf) */ /* function exit code */ __Pyx_RefNannyFinishContext(); } /* "aiohttp/_http_writer.pyx":36 * * * cdef inline int _write_byte(Writer* writer, uint8_t ch): # <<<<<<<<<<<<<< * cdef char * buf * cdef Py_ssize_t size */ static CYTHON_INLINE int __pyx_f_7aiohttp_12_http_writer__write_byte(struct __pyx_t_7aiohttp_12_http_writer_Writer *__pyx_v_writer, uint8_t __pyx_v_ch) { char *__pyx_v_buf; Py_ssize_t __pyx_v_size; int __pyx_r; __Pyx_RefNannyDeclarations int __pyx_t_1; PyObject *__pyx_t_2; __Pyx_RefNannySetupContext("_write_byte", 0); /* "aiohttp/_http_writer.pyx":40 * cdef Py_ssize_t size * * if writer.pos == writer.size: # <<<<<<<<<<<<<< * # reallocate * size = writer.size + BUF_SIZE */ __pyx_t_1 = ((__pyx_v_writer->pos == __pyx_v_writer->size) != 0); if (__pyx_t_1) { /* "aiohttp/_http_writer.pyx":42 * if writer.pos == writer.size: * # reallocate * size = writer.size + BUF_SIZE # <<<<<<<<<<<<<< * if writer.buf == BUFFER: * buf = PyMem_Malloc(size) */ __pyx_v_size = (__pyx_v_writer->size + 0x4000); /* "aiohttp/_http_writer.pyx":43 * # reallocate * size = writer.size + BUF_SIZE * if writer.buf == BUFFER: # <<<<<<<<<<<<<< * buf = PyMem_Malloc(size) * if buf == NULL: */ __pyx_t_1 = 
((__pyx_v_writer->buf == __pyx_v_7aiohttp_12_http_writer_BUFFER) != 0); if (__pyx_t_1) { /* "aiohttp/_http_writer.pyx":44 * size = writer.size + BUF_SIZE * if writer.buf == BUFFER: * buf = PyMem_Malloc(size) # <<<<<<<<<<<<<< * if buf == NULL: * PyErr_NoMemory() */ __pyx_v_buf = ((char *)PyMem_Malloc(__pyx_v_size)); /* "aiohttp/_http_writer.pyx":45 * if writer.buf == BUFFER: * buf = PyMem_Malloc(size) * if buf == NULL: # <<<<<<<<<<<<<< * PyErr_NoMemory() * return -1 */ __pyx_t_1 = ((__pyx_v_buf == NULL) != 0); if (__pyx_t_1) { /* "aiohttp/_http_writer.pyx":46 * buf = PyMem_Malloc(size) * if buf == NULL: * PyErr_NoMemory() # <<<<<<<<<<<<<< * return -1 * memcpy(buf, writer.buf, writer.size) */ __pyx_t_2 = PyErr_NoMemory(); if (unlikely(__pyx_t_2 == ((PyObject *)NULL))) __PYX_ERR(0, 46, __pyx_L1_error) /* "aiohttp/_http_writer.pyx":47 * if buf == NULL: * PyErr_NoMemory() * return -1 # <<<<<<<<<<<<<< * memcpy(buf, writer.buf, writer.size) * else: */ __pyx_r = -1; goto __pyx_L0; /* "aiohttp/_http_writer.pyx":45 * if writer.buf == BUFFER: * buf = PyMem_Malloc(size) * if buf == NULL: # <<<<<<<<<<<<<< * PyErr_NoMemory() * return -1 */ } /* "aiohttp/_http_writer.pyx":48 * PyErr_NoMemory() * return -1 * memcpy(buf, writer.buf, writer.size) # <<<<<<<<<<<<<< * else: * buf = PyMem_Realloc(writer.buf, size) */ (void)(memcpy(__pyx_v_buf, __pyx_v_writer->buf, __pyx_v_writer->size)); /* "aiohttp/_http_writer.pyx":43 * # reallocate * size = writer.size + BUF_SIZE * if writer.buf == BUFFER: # <<<<<<<<<<<<<< * buf = PyMem_Malloc(size) * if buf == NULL: */ goto __pyx_L4; } /* "aiohttp/_http_writer.pyx":50 * memcpy(buf, writer.buf, writer.size) * else: * buf = PyMem_Realloc(writer.buf, size) # <<<<<<<<<<<<<< * if buf == NULL: * PyErr_NoMemory() */ /*else*/ { __pyx_v_buf = ((char *)PyMem_Realloc(__pyx_v_writer->buf, __pyx_v_size)); /* "aiohttp/_http_writer.pyx":51 * else: * buf = PyMem_Realloc(writer.buf, size) * if buf == NULL: # <<<<<<<<<<<<<< * PyErr_NoMemory() * return -1 */ __pyx_t_1 = ((__pyx_v_buf == NULL) != 0); if (__pyx_t_1) { /* "aiohttp/_http_writer.pyx":52 * buf = PyMem_Realloc(writer.buf, size) * if buf == NULL: * PyErr_NoMemory() # <<<<<<<<<<<<<< * return -1 * writer.buf = buf */ __pyx_t_2 = PyErr_NoMemory(); if (unlikely(__pyx_t_2 == ((PyObject *)NULL))) __PYX_ERR(0, 52, __pyx_L1_error) /* "aiohttp/_http_writer.pyx":53 * if buf == NULL: * PyErr_NoMemory() * return -1 # <<<<<<<<<<<<<< * writer.buf = buf * writer.size = size */ __pyx_r = -1; goto __pyx_L0; /* "aiohttp/_http_writer.pyx":51 * else: * buf = PyMem_Realloc(writer.buf, size) * if buf == NULL: # <<<<<<<<<<<<<< * PyErr_NoMemory() * return -1 */ } } __pyx_L4:; /* "aiohttp/_http_writer.pyx":54 * PyErr_NoMemory() * return -1 * writer.buf = buf # <<<<<<<<<<<<<< * writer.size = size * writer.buf[writer.pos] = ch */ __pyx_v_writer->buf = __pyx_v_buf; /* "aiohttp/_http_writer.pyx":55 * return -1 * writer.buf = buf * writer.size = size # <<<<<<<<<<<<<< * writer.buf[writer.pos] = ch * writer.pos += 1 */ __pyx_v_writer->size = __pyx_v_size; /* "aiohttp/_http_writer.pyx":40 * cdef Py_ssize_t size * * if writer.pos == writer.size: # <<<<<<<<<<<<<< * # reallocate * size = writer.size + BUF_SIZE */ } /* "aiohttp/_http_writer.pyx":56 * writer.buf = buf * writer.size = size * writer.buf[writer.pos] = ch # <<<<<<<<<<<<<< * writer.pos += 1 * return 0 */ (__pyx_v_writer->buf[__pyx_v_writer->pos]) = ((char)__pyx_v_ch); /* "aiohttp/_http_writer.pyx":57 * writer.size = size * writer.buf[writer.pos] = ch * writer.pos += 1 # <<<<<<<<<<<<<< * return 0 * */ 
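/* Advance the write position; the slow path above grew the buffer by
   BUF_SIZE (0x4000) bytes, copying out of the static BUFFER on the first
   overflow and using PyMem_Realloc on later ones. */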
__pyx_v_writer->pos = (__pyx_v_writer->pos + 1); /* "aiohttp/_http_writer.pyx":58 * writer.buf[writer.pos] = ch * writer.pos += 1 * return 0 # <<<<<<<<<<<<<< * * */ __pyx_r = 0; goto __pyx_L0; /* "aiohttp/_http_writer.pyx":36 * * * cdef inline int _write_byte(Writer* writer, uint8_t ch): # <<<<<<<<<<<<<< * cdef char * buf * cdef Py_ssize_t size */ /* function exit code */ __pyx_L1_error:; __Pyx_WriteUnraisable("aiohttp._http_writer._write_byte", __pyx_clineno, __pyx_lineno, __pyx_filename, 1, 0); __pyx_r = 0; __pyx_L0:; __Pyx_RefNannyFinishContext(); return __pyx_r; } /* "aiohttp/_http_writer.pyx":61 * * * cdef inline int _write_utf8(Writer* writer, Py_UCS4 symbol): # <<<<<<<<<<<<<< * cdef uint64_t utf = symbol * */ static CYTHON_INLINE int __pyx_f_7aiohttp_12_http_writer__write_utf8(struct __pyx_t_7aiohttp_12_http_writer_Writer *__pyx_v_writer, Py_UCS4 __pyx_v_symbol) { uint64_t __pyx_v_utf; int __pyx_r; __Pyx_RefNannyDeclarations int __pyx_t_1; int __pyx_t_2; __Pyx_RefNannySetupContext("_write_utf8", 0); /* "aiohttp/_http_writer.pyx":62 * * cdef inline int _write_utf8(Writer* writer, Py_UCS4 symbol): * cdef uint64_t utf = symbol # <<<<<<<<<<<<<< * * if utf < 0x80: */ __pyx_v_utf = ((uint64_t)__pyx_v_symbol); /* "aiohttp/_http_writer.pyx":64 * cdef uint64_t utf = symbol * * if utf < 0x80: # <<<<<<<<<<<<<< * return _write_byte(writer, utf) * elif utf < 0x800: */ __pyx_t_1 = ((__pyx_v_utf < 0x80) != 0); if (__pyx_t_1) { /* "aiohttp/_http_writer.pyx":65 * * if utf < 0x80: * return _write_byte(writer, utf) # <<<<<<<<<<<<<< * elif utf < 0x800: * if _write_byte(writer, (0xc0 | (utf >> 6))) < 0: */ __pyx_r = __pyx_f_7aiohttp_12_http_writer__write_byte(__pyx_v_writer, ((uint8_t)__pyx_v_utf)); goto __pyx_L0; /* "aiohttp/_http_writer.pyx":64 * cdef uint64_t utf = symbol * * if utf < 0x80: # <<<<<<<<<<<<<< * return _write_byte(writer, utf) * elif utf < 0x800: */ } /* "aiohttp/_http_writer.pyx":66 * if utf < 0x80: * return _write_byte(writer, utf) * elif utf < 0x800: # <<<<<<<<<<<<<< * if _write_byte(writer, (0xc0 | (utf >> 6))) < 0: * return -1 */ __pyx_t_1 = ((__pyx_v_utf < 0x800) != 0); if (__pyx_t_1) { /* "aiohttp/_http_writer.pyx":67 * return _write_byte(writer, utf) * elif utf < 0x800: * if _write_byte(writer, (0xc0 | (utf >> 6))) < 0: # <<<<<<<<<<<<<< * return -1 * return _write_byte(writer, (0x80 | (utf & 0x3f))) */ __pyx_t_1 = ((__pyx_f_7aiohttp_12_http_writer__write_byte(__pyx_v_writer, ((uint8_t)(0xc0 | (__pyx_v_utf >> 6)))) < 0) != 0); if (__pyx_t_1) { /* "aiohttp/_http_writer.pyx":68 * elif utf < 0x800: * if _write_byte(writer, (0xc0 | (utf >> 6))) < 0: * return -1 # <<<<<<<<<<<<<< * return _write_byte(writer, (0x80 | (utf & 0x3f))) * elif 0xD800 <= utf <= 0xDFFF: */ __pyx_r = -1; goto __pyx_L0; /* "aiohttp/_http_writer.pyx":67 * return _write_byte(writer, utf) * elif utf < 0x800: * if _write_byte(writer, (0xc0 | (utf >> 6))) < 0: # <<<<<<<<<<<<<< * return -1 * return _write_byte(writer, (0x80 | (utf & 0x3f))) */ } /* "aiohttp/_http_writer.pyx":69 * if _write_byte(writer, (0xc0 | (utf >> 6))) < 0: * return -1 * return _write_byte(writer, (0x80 | (utf & 0x3f))) # <<<<<<<<<<<<<< * elif 0xD800 <= utf <= 0xDFFF: * # surogate pair, ignored */ __pyx_r = __pyx_f_7aiohttp_12_http_writer__write_byte(__pyx_v_writer, ((uint8_t)(0x80 | (__pyx_v_utf & 0x3f)))); goto __pyx_L0; /* "aiohttp/_http_writer.pyx":66 * if utf < 0x80: * return _write_byte(writer, utf) * elif utf < 0x800: # <<<<<<<<<<<<<< * if _write_byte(writer, (0xc0 | (utf >> 6))) < 0: * return -1 */ } /* "aiohttp/_http_writer.pyx":70 * 
return -1 * return _write_byte(writer, (0x80 | (utf & 0x3f))) * elif 0xD800 <= utf <= 0xDFFF: # <<<<<<<<<<<<<< * # surogate pair, ignored * return 0 */ __pyx_t_1 = (0xD800 <= __pyx_v_utf); if (__pyx_t_1) { __pyx_t_1 = (__pyx_v_utf <= 0xDFFF); } __pyx_t_2 = (__pyx_t_1 != 0); if (__pyx_t_2) { /* "aiohttp/_http_writer.pyx":72 * elif 0xD800 <= utf <= 0xDFFF: * # surogate pair, ignored * return 0 # <<<<<<<<<<<<<< * elif utf < 0x10000: * if _write_byte(writer, (0xe0 | (utf >> 12))) < 0: */ __pyx_r = 0; goto __pyx_L0; /* "aiohttp/_http_writer.pyx":70 * return -1 * return _write_byte(writer, (0x80 | (utf & 0x3f))) * elif 0xD800 <= utf <= 0xDFFF: # <<<<<<<<<<<<<< * # surogate pair, ignored * return 0 */ } /* "aiohttp/_http_writer.pyx":73 * # surogate pair, ignored * return 0 * elif utf < 0x10000: # <<<<<<<<<<<<<< * if _write_byte(writer, (0xe0 | (utf >> 12))) < 0: * return -1 */ __pyx_t_2 = ((__pyx_v_utf < 0x10000) != 0); if (__pyx_t_2) { /* "aiohttp/_http_writer.pyx":74 * return 0 * elif utf < 0x10000: * if _write_byte(writer, (0xe0 | (utf >> 12))) < 0: # <<<<<<<<<<<<<< * return -1 * if _write_byte(writer, (0x80 | ((utf >> 6) & 0x3f))) < 0: */ __pyx_t_2 = ((__pyx_f_7aiohttp_12_http_writer__write_byte(__pyx_v_writer, ((uint8_t)(0xe0 | (__pyx_v_utf >> 12)))) < 0) != 0); if (__pyx_t_2) { /* "aiohttp/_http_writer.pyx":75 * elif utf < 0x10000: * if _write_byte(writer, (0xe0 | (utf >> 12))) < 0: * return -1 # <<<<<<<<<<<<<< * if _write_byte(writer, (0x80 | ((utf >> 6) & 0x3f))) < 0: * return -1 */ __pyx_r = -1; goto __pyx_L0; /* "aiohttp/_http_writer.pyx":74 * return 0 * elif utf < 0x10000: * if _write_byte(writer, (0xe0 | (utf >> 12))) < 0: # <<<<<<<<<<<<<< * return -1 * if _write_byte(writer, (0x80 | ((utf >> 6) & 0x3f))) < 0: */ } /* "aiohttp/_http_writer.pyx":76 * if _write_byte(writer, (0xe0 | (utf >> 12))) < 0: * return -1 * if _write_byte(writer, (0x80 | ((utf >> 6) & 0x3f))) < 0: # <<<<<<<<<<<<<< * return -1 * return _write_byte(writer, (0x80 | (utf & 0x3f))) */ __pyx_t_2 = ((__pyx_f_7aiohttp_12_http_writer__write_byte(__pyx_v_writer, ((uint8_t)(0x80 | ((__pyx_v_utf >> 6) & 0x3f)))) < 0) != 0); if (__pyx_t_2) { /* "aiohttp/_http_writer.pyx":77 * return -1 * if _write_byte(writer, (0x80 | ((utf >> 6) & 0x3f))) < 0: * return -1 # <<<<<<<<<<<<<< * return _write_byte(writer, (0x80 | (utf & 0x3f))) * elif utf > 0x10FFFF: */ __pyx_r = -1; goto __pyx_L0; /* "aiohttp/_http_writer.pyx":76 * if _write_byte(writer, (0xe0 | (utf >> 12))) < 0: * return -1 * if _write_byte(writer, (0x80 | ((utf >> 6) & 0x3f))) < 0: # <<<<<<<<<<<<<< * return -1 * return _write_byte(writer, (0x80 | (utf & 0x3f))) */ } /* "aiohttp/_http_writer.pyx":78 * if _write_byte(writer, (0x80 | ((utf >> 6) & 0x3f))) < 0: * return -1 * return _write_byte(writer, (0x80 | (utf & 0x3f))) # <<<<<<<<<<<<<< * elif utf > 0x10FFFF: * # symbol is too large */ __pyx_r = __pyx_f_7aiohttp_12_http_writer__write_byte(__pyx_v_writer, ((uint8_t)(0x80 | (__pyx_v_utf & 0x3f)))); goto __pyx_L0; /* "aiohttp/_http_writer.pyx":73 * # surogate pair, ignored * return 0 * elif utf < 0x10000: # <<<<<<<<<<<<<< * if _write_byte(writer, (0xe0 | (utf >> 12))) < 0: * return -1 */ } /* "aiohttp/_http_writer.pyx":79 * return -1 * return _write_byte(writer, (0x80 | (utf & 0x3f))) * elif utf > 0x10FFFF: # <<<<<<<<<<<<<< * # symbol is too large * return 0 */ __pyx_t_2 = ((__pyx_v_utf > 0x10FFFF) != 0); if (__pyx_t_2) { /* "aiohttp/_http_writer.pyx":81 * elif utf > 0x10FFFF: * # symbol is too large * return 0 # <<<<<<<<<<<<<< * else: * if _write_byte(writer, (0xf0 | (utf >> 
18))) < 0: */ __pyx_r = 0; goto __pyx_L0; /* "aiohttp/_http_writer.pyx":79 * return -1 * return _write_byte(writer, (0x80 | (utf & 0x3f))) * elif utf > 0x10FFFF: # <<<<<<<<<<<<<< * # symbol is too large * return 0 */ } /* "aiohttp/_http_writer.pyx":83 * return 0 * else: * if _write_byte(writer, (0xf0 | (utf >> 18))) < 0: # <<<<<<<<<<<<<< * return -1 * if _write_byte(writer, */ /*else*/ { __pyx_t_2 = ((__pyx_f_7aiohttp_12_http_writer__write_byte(__pyx_v_writer, ((uint8_t)(0xf0 | (__pyx_v_utf >> 18)))) < 0) != 0); if (__pyx_t_2) { /* "aiohttp/_http_writer.pyx":84 * else: * if _write_byte(writer, (0xf0 | (utf >> 18))) < 0: * return -1 # <<<<<<<<<<<<<< * if _write_byte(writer, * (0x80 | ((utf >> 12) & 0x3f))) < 0: */ __pyx_r = -1; goto __pyx_L0; /* "aiohttp/_http_writer.pyx":83 * return 0 * else: * if _write_byte(writer, (0xf0 | (utf >> 18))) < 0: # <<<<<<<<<<<<<< * return -1 * if _write_byte(writer, */ } /* "aiohttp/_http_writer.pyx":86 * return -1 * if _write_byte(writer, * (0x80 | ((utf >> 12) & 0x3f))) < 0: # <<<<<<<<<<<<<< * return -1 * if _write_byte(writer, */ __pyx_t_2 = ((__pyx_f_7aiohttp_12_http_writer__write_byte(__pyx_v_writer, ((uint8_t)(0x80 | ((__pyx_v_utf >> 12) & 0x3f)))) < 0) != 0); /* "aiohttp/_http_writer.pyx":85 * if _write_byte(writer, (0xf0 | (utf >> 18))) < 0: * return -1 * if _write_byte(writer, # <<<<<<<<<<<<<< * (0x80 | ((utf >> 12) & 0x3f))) < 0: * return -1 */ if (__pyx_t_2) { /* "aiohttp/_http_writer.pyx":87 * if _write_byte(writer, * (0x80 | ((utf >> 12) & 0x3f))) < 0: * return -1 # <<<<<<<<<<<<<< * if _write_byte(writer, * (0x80 | ((utf >> 6) & 0x3f))) < 0: */ __pyx_r = -1; goto __pyx_L0; /* "aiohttp/_http_writer.pyx":85 * if _write_byte(writer, (0xf0 | (utf >> 18))) < 0: * return -1 * if _write_byte(writer, # <<<<<<<<<<<<<< * (0x80 | ((utf >> 12) & 0x3f))) < 0: * return -1 */ } /* "aiohttp/_http_writer.pyx":89 * return -1 * if _write_byte(writer, * (0x80 | ((utf >> 6) & 0x3f))) < 0: # <<<<<<<<<<<<<< * return -1 * return _write_byte(writer, (0x80 | (utf & 0x3f))) */ __pyx_t_2 = ((__pyx_f_7aiohttp_12_http_writer__write_byte(__pyx_v_writer, ((uint8_t)(0x80 | ((__pyx_v_utf >> 6) & 0x3f)))) < 0) != 0); /* "aiohttp/_http_writer.pyx":88 * (0x80 | ((utf >> 12) & 0x3f))) < 0: * return -1 * if _write_byte(writer, # <<<<<<<<<<<<<< * (0x80 | ((utf >> 6) & 0x3f))) < 0: * return -1 */ if (__pyx_t_2) { /* "aiohttp/_http_writer.pyx":90 * if _write_byte(writer, * (0x80 | ((utf >> 6) & 0x3f))) < 0: * return -1 # <<<<<<<<<<<<<< * return _write_byte(writer, (0x80 | (utf & 0x3f))) * */ __pyx_r = -1; goto __pyx_L0; /* "aiohttp/_http_writer.pyx":88 * (0x80 | ((utf >> 12) & 0x3f))) < 0: * return -1 * if _write_byte(writer, # <<<<<<<<<<<<<< * (0x80 | ((utf >> 6) & 0x3f))) < 0: * return -1 */ } /* "aiohttp/_http_writer.pyx":91 * (0x80 | ((utf >> 6) & 0x3f))) < 0: * return -1 * return _write_byte(writer, (0x80 | (utf & 0x3f))) # <<<<<<<<<<<<<< * * */ __pyx_r = __pyx_f_7aiohttp_12_http_writer__write_byte(__pyx_v_writer, ((uint8_t)(0x80 | (__pyx_v_utf & 0x3f)))); goto __pyx_L0; } /* "aiohttp/_http_writer.pyx":61 * * * cdef inline int _write_utf8(Writer* writer, Py_UCS4 symbol): # <<<<<<<<<<<<<< * cdef uint64_t utf = symbol * */ /* function exit code */ __pyx_L0:; __Pyx_RefNannyFinishContext(); return __pyx_r; } /* "aiohttp/_http_writer.pyx":94 * * * cdef inline int _write_str(Writer* writer, str s): # <<<<<<<<<<<<<< * cdef Py_UCS4 ch * for ch in s: */ static CYTHON_INLINE int __pyx_f_7aiohttp_12_http_writer__write_str(struct __pyx_t_7aiohttp_12_http_writer_Writer *__pyx_v_writer, PyObject 
*__pyx_v_s) { Py_UCS4 __pyx_v_ch; int __pyx_r; __Pyx_RefNannyDeclarations PyObject *__pyx_t_1 = NULL; Py_ssize_t __pyx_t_2; Py_ssize_t __pyx_t_3; void *__pyx_t_4; int __pyx_t_5; int __pyx_t_6; Py_ssize_t __pyx_t_7; int __pyx_t_8; __Pyx_RefNannySetupContext("_write_str", 0); /* "aiohttp/_http_writer.pyx":96 * cdef inline int _write_str(Writer* writer, str s): * cdef Py_UCS4 ch * for ch in s: # <<<<<<<<<<<<<< * if _write_utf8(writer, ch) < 0: * return -1 */ if (unlikely(__pyx_v_s == Py_None)) { PyErr_SetString(PyExc_TypeError, "'NoneType' is not iterable"); __PYX_ERR(0, 96, __pyx_L1_error) } __Pyx_INCREF(__pyx_v_s); __pyx_t_1 = __pyx_v_s; __pyx_t_6 = __Pyx_init_unicode_iteration(__pyx_t_1, (&__pyx_t_3), (&__pyx_t_4), (&__pyx_t_5)); if (unlikely(__pyx_t_6 == ((int)-1))) __PYX_ERR(0, 96, __pyx_L1_error) for (__pyx_t_7 = 0; __pyx_t_7 < __pyx_t_3; __pyx_t_7++) { __pyx_t_2 = __pyx_t_7; __pyx_v_ch = __Pyx_PyUnicode_READ(__pyx_t_5, __pyx_t_4, __pyx_t_2); /* "aiohttp/_http_writer.pyx":97 * cdef Py_UCS4 ch * for ch in s: * if _write_utf8(writer, ch) < 0: # <<<<<<<<<<<<<< * return -1 * */ __pyx_t_8 = ((__pyx_f_7aiohttp_12_http_writer__write_utf8(__pyx_v_writer, __pyx_v_ch) < 0) != 0); if (__pyx_t_8) { /* "aiohttp/_http_writer.pyx":98 * for ch in s: * if _write_utf8(writer, ch) < 0: * return -1 # <<<<<<<<<<<<<< * * */ __pyx_r = -1; __Pyx_DECREF(__pyx_t_1); __pyx_t_1 = 0; goto __pyx_L0; /* "aiohttp/_http_writer.pyx":97 * cdef Py_UCS4 ch * for ch in s: * if _write_utf8(writer, ch) < 0: # <<<<<<<<<<<<<< * return -1 * */ } } __Pyx_DECREF(__pyx_t_1); __pyx_t_1 = 0; /* "aiohttp/_http_writer.pyx":94 * * * cdef inline int _write_str(Writer* writer, str s): # <<<<<<<<<<<<<< * cdef Py_UCS4 ch * for ch in s: */ /* function exit code */ __pyx_r = 0; goto __pyx_L0; __pyx_L1_error:; __Pyx_XDECREF(__pyx_t_1); __Pyx_WriteUnraisable("aiohttp._http_writer._write_str", __pyx_clineno, __pyx_lineno, __pyx_filename, 1, 0); __pyx_r = 0; __pyx_L0:; __Pyx_RefNannyFinishContext(); return __pyx_r; } /* "aiohttp/_http_writer.pyx":103 * # --------------- _serialize_headers ---------------------- * * cdef str to_str(object s): # <<<<<<<<<<<<<< * typ = type(s) * if typ is str: */ static PyObject *__pyx_f_7aiohttp_12_http_writer_to_str(PyObject *__pyx_v_s) { PyTypeObject *__pyx_v_typ = NULL; PyObject *__pyx_r = NULL; __Pyx_RefNannyDeclarations int __pyx_t_1; int __pyx_t_2; PyObject *__pyx_t_3 = NULL; PyObject *__pyx_t_4 = NULL; PyObject *__pyx_t_5 = NULL; __Pyx_RefNannySetupContext("to_str", 0); /* "aiohttp/_http_writer.pyx":104 * * cdef str to_str(object s): * typ = type(s) # <<<<<<<<<<<<<< * if typ is str: * return s */ __Pyx_INCREF(((PyObject *)Py_TYPE(__pyx_v_s))); __pyx_v_typ = ((PyTypeObject*)((PyObject *)Py_TYPE(__pyx_v_s))); /* "aiohttp/_http_writer.pyx":105 * cdef str to_str(object s): * typ = type(s) * if typ is str: # <<<<<<<<<<<<<< * return s * elif typ is _istr: */ __pyx_t_1 = (__pyx_v_typ == (&PyUnicode_Type)); __pyx_t_2 = (__pyx_t_1 != 0); if (__pyx_t_2) { /* "aiohttp/_http_writer.pyx":106 * typ = type(s) * if typ is str: * return s # <<<<<<<<<<<<<< * elif typ is _istr: * return PyObject_Str(s) */ __Pyx_XDECREF(__pyx_r); __Pyx_INCREF(((PyObject*)__pyx_v_s)); __pyx_r = ((PyObject*)__pyx_v_s); goto __pyx_L0; /* "aiohttp/_http_writer.pyx":105 * cdef str to_str(object s): * typ = type(s) * if typ is str: # <<<<<<<<<<<<<< * return s * elif typ is _istr: */ } /* "aiohttp/_http_writer.pyx":107 * if typ is str: * return s * elif typ is _istr: # <<<<<<<<<<<<<< * return PyObject_Str(s) * elif not isinstance(s, str): */ __pyx_t_2 
= (__pyx_v_typ == ((PyTypeObject*)__pyx_v_7aiohttp_12_http_writer__istr)); __pyx_t_1 = (__pyx_t_2 != 0); if (__pyx_t_1) { /* "aiohttp/_http_writer.pyx":108 * return s * elif typ is _istr: * return PyObject_Str(s) # <<<<<<<<<<<<<< * elif not isinstance(s, str): * raise TypeError("Cannot serialize non-str key {!r}".format(s)) */ __Pyx_XDECREF(__pyx_r); __pyx_t_3 = PyObject_Str(__pyx_v_s); if (unlikely(!__pyx_t_3)) __PYX_ERR(0, 108, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_3); if (!(likely(PyUnicode_CheckExact(__pyx_t_3))||((__pyx_t_3) == Py_None)||(PyErr_Format(PyExc_TypeError, "Expected %.16s, got %.200s", "unicode", Py_TYPE(__pyx_t_3)->tp_name), 0))) __PYX_ERR(0, 108, __pyx_L1_error) __pyx_r = ((PyObject*)__pyx_t_3); __pyx_t_3 = 0; goto __pyx_L0; /* "aiohttp/_http_writer.pyx":107 * if typ is str: * return s * elif typ is _istr: # <<<<<<<<<<<<<< * return PyObject_Str(s) * elif not isinstance(s, str): */ } /* "aiohttp/_http_writer.pyx":109 * elif typ is _istr: * return PyObject_Str(s) * elif not isinstance(s, str): # <<<<<<<<<<<<<< * raise TypeError("Cannot serialize non-str key {!r}".format(s)) * else: */ __pyx_t_1 = PyUnicode_Check(__pyx_v_s); __pyx_t_2 = ((!(__pyx_t_1 != 0)) != 0); if (unlikely(__pyx_t_2)) { /* "aiohttp/_http_writer.pyx":110 * return PyObject_Str(s) * elif not isinstance(s, str): * raise TypeError("Cannot serialize non-str key {!r}".format(s)) # <<<<<<<<<<<<<< * else: * return str(s) */ __pyx_t_4 = __Pyx_PyObject_GetAttrStr(__pyx_kp_u_Cannot_serialize_non_str_key_r, __pyx_n_s_format); if (unlikely(!__pyx_t_4)) __PYX_ERR(0, 110, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_4); __pyx_t_5 = NULL; if (CYTHON_UNPACK_METHODS && likely(PyMethod_Check(__pyx_t_4))) { __pyx_t_5 = PyMethod_GET_SELF(__pyx_t_4); if (likely(__pyx_t_5)) { PyObject* function = PyMethod_GET_FUNCTION(__pyx_t_4); __Pyx_INCREF(__pyx_t_5); __Pyx_INCREF(function); __Pyx_DECREF_SET(__pyx_t_4, function); } } __pyx_t_3 = (__pyx_t_5) ? 
__Pyx_PyObject_Call2Args(__pyx_t_4, __pyx_t_5, __pyx_v_s) : __Pyx_PyObject_CallOneArg(__pyx_t_4, __pyx_v_s); __Pyx_XDECREF(__pyx_t_5); __pyx_t_5 = 0; if (unlikely(!__pyx_t_3)) __PYX_ERR(0, 110, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_3); __Pyx_DECREF(__pyx_t_4); __pyx_t_4 = 0; __pyx_t_4 = __Pyx_PyObject_CallOneArg(__pyx_builtin_TypeError, __pyx_t_3); if (unlikely(!__pyx_t_4)) __PYX_ERR(0, 110, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_4); __Pyx_DECREF(__pyx_t_3); __pyx_t_3 = 0; __Pyx_Raise(__pyx_t_4, 0, 0, 0); __Pyx_DECREF(__pyx_t_4); __pyx_t_4 = 0; __PYX_ERR(0, 110, __pyx_L1_error) /* "aiohttp/_http_writer.pyx":109 * elif typ is _istr: * return PyObject_Str(s) * elif not isinstance(s, str): # <<<<<<<<<<<<<< * raise TypeError("Cannot serialize non-str key {!r}".format(s)) * else: */ } /* "aiohttp/_http_writer.pyx":112 * raise TypeError("Cannot serialize non-str key {!r}".format(s)) * else: * return str(s) # <<<<<<<<<<<<<< * * */ /*else*/ { __Pyx_XDECREF(__pyx_r); __pyx_t_4 = __Pyx_PyObject_CallOneArg(((PyObject *)(&PyUnicode_Type)), __pyx_v_s); if (unlikely(!__pyx_t_4)) __PYX_ERR(0, 112, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_4); __pyx_r = ((PyObject*)__pyx_t_4); __pyx_t_4 = 0; goto __pyx_L0; } /* "aiohttp/_http_writer.pyx":103 * # --------------- _serialize_headers ---------------------- * * cdef str to_str(object s): # <<<<<<<<<<<<<< * typ = type(s) * if typ is str: */ /* function exit code */ __pyx_L1_error:; __Pyx_XDECREF(__pyx_t_3); __Pyx_XDECREF(__pyx_t_4); __Pyx_XDECREF(__pyx_t_5); __Pyx_AddTraceback("aiohttp._http_writer.to_str", __pyx_clineno, __pyx_lineno, __pyx_filename); __pyx_r = 0; __pyx_L0:; __Pyx_XDECREF(__pyx_v_typ); __Pyx_XGIVEREF(__pyx_r); __Pyx_RefNannyFinishContext(); return __pyx_r; } /* "aiohttp/_http_writer.pyx":115 * * * def _serialize_headers(str status_line, headers): # <<<<<<<<<<<<<< * cdef Writer writer * cdef object key */ /* Python wrapper */ static PyObject *__pyx_pw_7aiohttp_12_http_writer_1_serialize_headers(PyObject *__pyx_self, PyObject *__pyx_args, PyObject *__pyx_kwds); /*proto*/ static PyMethodDef __pyx_mdef_7aiohttp_12_http_writer_1_serialize_headers = {"_serialize_headers", (PyCFunction)(void*)(PyCFunctionWithKeywords)__pyx_pw_7aiohttp_12_http_writer_1_serialize_headers, METH_VARARGS|METH_KEYWORDS, 0}; static PyObject *__pyx_pw_7aiohttp_12_http_writer_1_serialize_headers(PyObject *__pyx_self, PyObject *__pyx_args, PyObject *__pyx_kwds) { PyObject *__pyx_v_status_line = 0; PyObject *__pyx_v_headers = 0; PyObject *__pyx_r = 0; __Pyx_RefNannyDeclarations __Pyx_RefNannySetupContext("_serialize_headers (wrapper)", 0); { static PyObject **__pyx_pyargnames[] = {&__pyx_n_s_status_line,&__pyx_n_s_headers,0}; PyObject* values[2] = {0,0}; if (unlikely(__pyx_kwds)) { Py_ssize_t kw_args; const Py_ssize_t pos_args = PyTuple_GET_SIZE(__pyx_args); switch (pos_args) { case 2: values[1] = PyTuple_GET_ITEM(__pyx_args, 1); CYTHON_FALLTHROUGH; case 1: values[0] = PyTuple_GET_ITEM(__pyx_args, 0); CYTHON_FALLTHROUGH; case 0: break; default: goto __pyx_L5_argtuple_error; } kw_args = PyDict_Size(__pyx_kwds); switch (pos_args) { case 0: if (likely((values[0] = __Pyx_PyDict_GetItemStr(__pyx_kwds, __pyx_n_s_status_line)) != 0)) kw_args--; else goto __pyx_L5_argtuple_error; CYTHON_FALLTHROUGH; case 1: if (likely((values[1] = __Pyx_PyDict_GetItemStr(__pyx_kwds, __pyx_n_s_headers)) != 0)) kw_args--; else { __Pyx_RaiseArgtupleInvalid("_serialize_headers", 1, 2, 2, 1); __PYX_ERR(0, 115, __pyx_L3_error) } } if (unlikely(kw_args > 0)) { if (unlikely(__Pyx_ParseOptionalKeywords(__pyx_kwds, 
__pyx_pyargnames, 0, values, pos_args, "_serialize_headers") < 0)) __PYX_ERR(0, 115, __pyx_L3_error) } } else if (PyTuple_GET_SIZE(__pyx_args) != 2) { goto __pyx_L5_argtuple_error; } else { values[0] = PyTuple_GET_ITEM(__pyx_args, 0); values[1] = PyTuple_GET_ITEM(__pyx_args, 1); } __pyx_v_status_line = ((PyObject*)values[0]); __pyx_v_headers = values[1]; } goto __pyx_L4_argument_unpacking_done; __pyx_L5_argtuple_error:; __Pyx_RaiseArgtupleInvalid("_serialize_headers", 1, 2, 2, PyTuple_GET_SIZE(__pyx_args)); __PYX_ERR(0, 115, __pyx_L3_error) __pyx_L3_error:; __Pyx_AddTraceback("aiohttp._http_writer._serialize_headers", __pyx_clineno, __pyx_lineno, __pyx_filename); __Pyx_RefNannyFinishContext(); return NULL; __pyx_L4_argument_unpacking_done:; if (unlikely(!__Pyx_ArgTypeTest(((PyObject *)__pyx_v_status_line), (&PyUnicode_Type), 1, "status_line", 1))) __PYX_ERR(0, 115, __pyx_L1_error) __pyx_r = __pyx_pf_7aiohttp_12_http_writer__serialize_headers(__pyx_self, __pyx_v_status_line, __pyx_v_headers); /* function exit code */ goto __pyx_L0; __pyx_L1_error:; __pyx_r = NULL; __pyx_L0:; __Pyx_RefNannyFinishContext(); return __pyx_r; } static PyObject *__pyx_pf_7aiohttp_12_http_writer__serialize_headers(CYTHON_UNUSED PyObject *__pyx_self, PyObject *__pyx_v_status_line, PyObject *__pyx_v_headers) { struct __pyx_t_7aiohttp_12_http_writer_Writer __pyx_v_writer; PyObject *__pyx_v_key = 0; PyObject *__pyx_v_val = 0; PyObject *__pyx_r = NULL; __Pyx_RefNannyDeclarations int __pyx_t_1; PyObject *__pyx_t_2 = NULL; Py_ssize_t __pyx_t_3; Py_ssize_t __pyx_t_4; int __pyx_t_5; PyObject *__pyx_t_6 = NULL; PyObject *__pyx_t_7 = NULL; int __pyx_t_8; char const *__pyx_t_9; PyObject *__pyx_t_10 = NULL; PyObject *__pyx_t_11 = NULL; PyObject *__pyx_t_12 = NULL; PyObject *__pyx_t_13 = NULL; PyObject *__pyx_t_14 = NULL; PyObject *__pyx_t_15 = NULL; __Pyx_RefNannySetupContext("_serialize_headers", 0); /* "aiohttp/_http_writer.pyx":121 * cdef bytes ret * * _init_writer(&writer) # <<<<<<<<<<<<<< * * try: */ __pyx_f_7aiohttp_12_http_writer__init_writer((&__pyx_v_writer)); /* "aiohttp/_http_writer.pyx":123 * _init_writer(&writer) * * try: # <<<<<<<<<<<<<< * if _write_str(&writer, status_line) < 0: * raise */ /*try:*/ { /* "aiohttp/_http_writer.pyx":124 * * try: * if _write_str(&writer, status_line) < 0: # <<<<<<<<<<<<<< * raise * if _write_byte(&writer, b'\r') < 0: */ __pyx_t_1 = ((__pyx_f_7aiohttp_12_http_writer__write_str((&__pyx_v_writer), __pyx_v_status_line) < 0) != 0); if (unlikely(__pyx_t_1)) { /* "aiohttp/_http_writer.pyx":125 * try: * if _write_str(&writer, status_line) < 0: * raise # <<<<<<<<<<<<<< * if _write_byte(&writer, b'\r') < 0: * raise */ __Pyx_ReraiseException(); __PYX_ERR(0, 125, __pyx_L4_error) /* "aiohttp/_http_writer.pyx":124 * * try: * if _write_str(&writer, status_line) < 0: # <<<<<<<<<<<<<< * raise * if _write_byte(&writer, b'\r') < 0: */ } /* "aiohttp/_http_writer.pyx":126 * if _write_str(&writer, status_line) < 0: * raise * if _write_byte(&writer, b'\r') < 0: # <<<<<<<<<<<<<< * raise * if _write_byte(&writer, b'\n') < 0: */ __pyx_t_1 = ((__pyx_f_7aiohttp_12_http_writer__write_byte((&__pyx_v_writer), '\r') < 0) != 0); if (unlikely(__pyx_t_1)) { /* "aiohttp/_http_writer.pyx":127 * raise * if _write_byte(&writer, b'\r') < 0: * raise # <<<<<<<<<<<<<< * if _write_byte(&writer, b'\n') < 0: * raise */ __Pyx_ReraiseException(); __PYX_ERR(0, 127, __pyx_L4_error) /* "aiohttp/_http_writer.pyx":126 * if _write_str(&writer, status_line) < 0: * raise * if _write_byte(&writer, b'\r') < 0: # <<<<<<<<<<<<<< * raise * if 
_write_byte(&writer, b'\n') < 0: */ } /* "aiohttp/_http_writer.pyx":128 * if _write_byte(&writer, b'\r') < 0: * raise * if _write_byte(&writer, b'\n') < 0: # <<<<<<<<<<<<<< * raise * */ __pyx_t_1 = ((__pyx_f_7aiohttp_12_http_writer__write_byte((&__pyx_v_writer), '\n') < 0) != 0); if (unlikely(__pyx_t_1)) { /* "aiohttp/_http_writer.pyx":129 * raise * if _write_byte(&writer, b'\n') < 0: * raise # <<<<<<<<<<<<<< * * for key, val in headers.items(): */ __Pyx_ReraiseException(); __PYX_ERR(0, 129, __pyx_L4_error) /* "aiohttp/_http_writer.pyx":128 * if _write_byte(&writer, b'\r') < 0: * raise * if _write_byte(&writer, b'\n') < 0: # <<<<<<<<<<<<<< * raise * */ } /* "aiohttp/_http_writer.pyx":131 * raise * * for key, val in headers.items(): # <<<<<<<<<<<<<< * if _write_str(&writer, to_str(key)) < 0: * raise */ __pyx_t_3 = 0; if (unlikely(__pyx_v_headers == Py_None)) { PyErr_Format(PyExc_AttributeError, "'NoneType' object has no attribute '%.30s'", "items"); __PYX_ERR(0, 131, __pyx_L4_error) } __pyx_t_6 = __Pyx_dict_iterator(__pyx_v_headers, 0, __pyx_n_s_items, (&__pyx_t_4), (&__pyx_t_5)); if (unlikely(!__pyx_t_6)) __PYX_ERR(0, 131, __pyx_L4_error) __Pyx_GOTREF(__pyx_t_6); __Pyx_XDECREF(__pyx_t_2); __pyx_t_2 = __pyx_t_6; __pyx_t_6 = 0; while (1) { __pyx_t_8 = __Pyx_dict_iter_next(__pyx_t_2, __pyx_t_4, &__pyx_t_3, &__pyx_t_6, &__pyx_t_7, NULL, __pyx_t_5); if (unlikely(__pyx_t_8 == 0)) break; if (unlikely(__pyx_t_8 == -1)) __PYX_ERR(0, 131, __pyx_L4_error) __Pyx_GOTREF(__pyx_t_6); __Pyx_GOTREF(__pyx_t_7); __Pyx_XDECREF_SET(__pyx_v_key, __pyx_t_6); __pyx_t_6 = 0; __Pyx_XDECREF_SET(__pyx_v_val, __pyx_t_7); __pyx_t_7 = 0; /* "aiohttp/_http_writer.pyx":132 * * for key, val in headers.items(): * if _write_str(&writer, to_str(key)) < 0: # <<<<<<<<<<<<<< * raise * if _write_byte(&writer, b':') < 0: */ __pyx_t_7 = __pyx_f_7aiohttp_12_http_writer_to_str(__pyx_v_key); if (unlikely(!__pyx_t_7)) __PYX_ERR(0, 132, __pyx_L4_error) __Pyx_GOTREF(__pyx_t_7); __pyx_t_1 = ((__pyx_f_7aiohttp_12_http_writer__write_str((&__pyx_v_writer), ((PyObject*)__pyx_t_7)) < 0) != 0); __Pyx_DECREF(__pyx_t_7); __pyx_t_7 = 0; if (unlikely(__pyx_t_1)) { /* "aiohttp/_http_writer.pyx":133 * for key, val in headers.items(): * if _write_str(&writer, to_str(key)) < 0: * raise # <<<<<<<<<<<<<< * if _write_byte(&writer, b':') < 0: * raise */ __Pyx_ReraiseException(); __PYX_ERR(0, 133, __pyx_L4_error) /* "aiohttp/_http_writer.pyx":132 * * for key, val in headers.items(): * if _write_str(&writer, to_str(key)) < 0: # <<<<<<<<<<<<<< * raise * if _write_byte(&writer, b':') < 0: */ } /* "aiohttp/_http_writer.pyx":134 * if _write_str(&writer, to_str(key)) < 0: * raise * if _write_byte(&writer, b':') < 0: # <<<<<<<<<<<<<< * raise * if _write_byte(&writer, b' ') < 0: */ __pyx_t_1 = ((__pyx_f_7aiohttp_12_http_writer__write_byte((&__pyx_v_writer), ':') < 0) != 0); if (unlikely(__pyx_t_1)) { /* "aiohttp/_http_writer.pyx":135 * raise * if _write_byte(&writer, b':') < 0: * raise # <<<<<<<<<<<<<< * if _write_byte(&writer, b' ') < 0: * raise */ __Pyx_ReraiseException(); __PYX_ERR(0, 135, __pyx_L4_error) /* "aiohttp/_http_writer.pyx":134 * if _write_str(&writer, to_str(key)) < 0: * raise * if _write_byte(&writer, b':') < 0: # <<<<<<<<<<<<<< * raise * if _write_byte(&writer, b' ') < 0: */ } /* "aiohttp/_http_writer.pyx":136 * if _write_byte(&writer, b':') < 0: * raise * if _write_byte(&writer, b' ') < 0: # <<<<<<<<<<<<<< * raise * if _write_str(&writer, to_str(val)) < 0: */ __pyx_t_1 = ((__pyx_f_7aiohttp_12_http_writer__write_byte((&__pyx_v_writer), ' ') < 0) != 
0); if (unlikely(__pyx_t_1)) { /* "aiohttp/_http_writer.pyx":137 * raise * if _write_byte(&writer, b' ') < 0: * raise # <<<<<<<<<<<<<< * if _write_str(&writer, to_str(val)) < 0: * raise */ __Pyx_ReraiseException(); __PYX_ERR(0, 137, __pyx_L4_error) /* "aiohttp/_http_writer.pyx":136 * if _write_byte(&writer, b':') < 0: * raise * if _write_byte(&writer, b' ') < 0: # <<<<<<<<<<<<<< * raise * if _write_str(&writer, to_str(val)) < 0: */ } /* "aiohttp/_http_writer.pyx":138 * if _write_byte(&writer, b' ') < 0: * raise * if _write_str(&writer, to_str(val)) < 0: # <<<<<<<<<<<<<< * raise * if _write_byte(&writer, b'\r') < 0: */ __pyx_t_7 = __pyx_f_7aiohttp_12_http_writer_to_str(__pyx_v_val); if (unlikely(!__pyx_t_7)) __PYX_ERR(0, 138, __pyx_L4_error) __Pyx_GOTREF(__pyx_t_7); __pyx_t_1 = ((__pyx_f_7aiohttp_12_http_writer__write_str((&__pyx_v_writer), ((PyObject*)__pyx_t_7)) < 0) != 0); __Pyx_DECREF(__pyx_t_7); __pyx_t_7 = 0; if (unlikely(__pyx_t_1)) { /* "aiohttp/_http_writer.pyx":139 * raise * if _write_str(&writer, to_str(val)) < 0: * raise # <<<<<<<<<<<<<< * if _write_byte(&writer, b'\r') < 0: * raise */ __Pyx_ReraiseException(); __PYX_ERR(0, 139, __pyx_L4_error) /* "aiohttp/_http_writer.pyx":138 * if _write_byte(&writer, b' ') < 0: * raise * if _write_str(&writer, to_str(val)) < 0: # <<<<<<<<<<<<<< * raise * if _write_byte(&writer, b'\r') < 0: */ } /* "aiohttp/_http_writer.pyx":140 * if _write_str(&writer, to_str(val)) < 0: * raise * if _write_byte(&writer, b'\r') < 0: # <<<<<<<<<<<<<< * raise * if _write_byte(&writer, b'\n') < 0: */ __pyx_t_1 = ((__pyx_f_7aiohttp_12_http_writer__write_byte((&__pyx_v_writer), '\r') < 0) != 0); if (unlikely(__pyx_t_1)) { /* "aiohttp/_http_writer.pyx":141 * raise * if _write_byte(&writer, b'\r') < 0: * raise # <<<<<<<<<<<<<< * if _write_byte(&writer, b'\n') < 0: * raise */ __Pyx_ReraiseException(); __PYX_ERR(0, 141, __pyx_L4_error) /* "aiohttp/_http_writer.pyx":140 * if _write_str(&writer, to_str(val)) < 0: * raise * if _write_byte(&writer, b'\r') < 0: # <<<<<<<<<<<<<< * raise * if _write_byte(&writer, b'\n') < 0: */ } /* "aiohttp/_http_writer.pyx":142 * if _write_byte(&writer, b'\r') < 0: * raise * if _write_byte(&writer, b'\n') < 0: # <<<<<<<<<<<<<< * raise * */ __pyx_t_1 = ((__pyx_f_7aiohttp_12_http_writer__write_byte((&__pyx_v_writer), '\n') < 0) != 0); if (unlikely(__pyx_t_1)) { /* "aiohttp/_http_writer.pyx":143 * raise * if _write_byte(&writer, b'\n') < 0: * raise # <<<<<<<<<<<<<< * * if _write_byte(&writer, b'\r') < 0: */ __Pyx_ReraiseException(); __PYX_ERR(0, 143, __pyx_L4_error) /* "aiohttp/_http_writer.pyx":142 * if _write_byte(&writer, b'\r') < 0: * raise * if _write_byte(&writer, b'\n') < 0: # <<<<<<<<<<<<<< * raise * */ } } __Pyx_DECREF(__pyx_t_2); __pyx_t_2 = 0; /* "aiohttp/_http_writer.pyx":145 * raise * * if _write_byte(&writer, b'\r') < 0: # <<<<<<<<<<<<<< * raise * if _write_byte(&writer, b'\n') < 0: */ __pyx_t_1 = ((__pyx_f_7aiohttp_12_http_writer__write_byte((&__pyx_v_writer), '\r') < 0) != 0); if (unlikely(__pyx_t_1)) { /* "aiohttp/_http_writer.pyx":146 * * if _write_byte(&writer, b'\r') < 0: * raise # <<<<<<<<<<<<<< * if _write_byte(&writer, b'\n') < 0: * raise */ __Pyx_ReraiseException(); __PYX_ERR(0, 146, __pyx_L4_error) /* "aiohttp/_http_writer.pyx":145 * raise * * if _write_byte(&writer, b'\r') < 0: # <<<<<<<<<<<<<< * raise * if _write_byte(&writer, b'\n') < 0: */ } /* "aiohttp/_http_writer.pyx":147 * if _write_byte(&writer, b'\r') < 0: * raise * if _write_byte(&writer, b'\n') < 0: # <<<<<<<<<<<<<< * raise * */ __pyx_t_1 = 
((__pyx_f_7aiohttp_12_http_writer__write_byte((&__pyx_v_writer), '\n') < 0) != 0); if (unlikely(__pyx_t_1)) { /* "aiohttp/_http_writer.pyx":148 * raise * if _write_byte(&writer, b'\n') < 0: * raise # <<<<<<<<<<<<<< * * return PyBytes_FromStringAndSize(writer.buf, writer.pos) */ __Pyx_ReraiseException(); __PYX_ERR(0, 148, __pyx_L4_error) /* "aiohttp/_http_writer.pyx":147 * if _write_byte(&writer, b'\r') < 0: * raise * if _write_byte(&writer, b'\n') < 0: # <<<<<<<<<<<<<< * raise * */ } /* "aiohttp/_http_writer.pyx":150 * raise * * return PyBytes_FromStringAndSize(writer.buf, writer.pos) # <<<<<<<<<<<<<< * finally: * _release_writer(&writer) */ __Pyx_XDECREF(__pyx_r); __pyx_t_2 = PyBytes_FromStringAndSize(__pyx_v_writer.buf, __pyx_v_writer.pos); if (unlikely(!__pyx_t_2)) __PYX_ERR(0, 150, __pyx_L4_error) __Pyx_GOTREF(__pyx_t_2); __pyx_r = __pyx_t_2; __pyx_t_2 = 0; goto __pyx_L3_return; } /* "aiohttp/_http_writer.pyx":152 * return PyBytes_FromStringAndSize(writer.buf, writer.pos) * finally: * _release_writer(&writer) # <<<<<<<<<<<<<< */ /*finally:*/ { __pyx_L4_error:; /*exception exit:*/{ __Pyx_PyThreadState_declare __Pyx_PyThreadState_assign __pyx_t_10 = 0; __pyx_t_11 = 0; __pyx_t_12 = 0; __pyx_t_13 = 0; __pyx_t_14 = 0; __pyx_t_15 = 0; __Pyx_XDECREF(__pyx_t_2); __pyx_t_2 = 0; __Pyx_XDECREF(__pyx_t_6); __pyx_t_6 = 0; __Pyx_XDECREF(__pyx_t_7); __pyx_t_7 = 0; if (PY_MAJOR_VERSION >= 3) __Pyx_ExceptionSwap(&__pyx_t_13, &__pyx_t_14, &__pyx_t_15); if ((PY_MAJOR_VERSION < 3) || unlikely(__Pyx_GetException(&__pyx_t_10, &__pyx_t_11, &__pyx_t_12) < 0)) __Pyx_ErrFetch(&__pyx_t_10, &__pyx_t_11, &__pyx_t_12); __Pyx_XGOTREF(__pyx_t_10); __Pyx_XGOTREF(__pyx_t_11); __Pyx_XGOTREF(__pyx_t_12); __Pyx_XGOTREF(__pyx_t_13); __Pyx_XGOTREF(__pyx_t_14); __Pyx_XGOTREF(__pyx_t_15); __pyx_t_5 = __pyx_lineno; __pyx_t_8 = __pyx_clineno; __pyx_t_9 = __pyx_filename; { __pyx_f_7aiohttp_12_http_writer__release_writer((&__pyx_v_writer)); } if (PY_MAJOR_VERSION >= 3) { __Pyx_XGIVEREF(__pyx_t_13); __Pyx_XGIVEREF(__pyx_t_14); __Pyx_XGIVEREF(__pyx_t_15); __Pyx_ExceptionReset(__pyx_t_13, __pyx_t_14, __pyx_t_15); } __Pyx_XGIVEREF(__pyx_t_10); __Pyx_XGIVEREF(__pyx_t_11); __Pyx_XGIVEREF(__pyx_t_12); __Pyx_ErrRestore(__pyx_t_10, __pyx_t_11, __pyx_t_12); __pyx_t_10 = 0; __pyx_t_11 = 0; __pyx_t_12 = 0; __pyx_t_13 = 0; __pyx_t_14 = 0; __pyx_t_15 = 0; __pyx_lineno = __pyx_t_5; __pyx_clineno = __pyx_t_8; __pyx_filename = __pyx_t_9; goto __pyx_L1_error; } __pyx_L3_return: { __pyx_t_15 = __pyx_r; __pyx_r = 0; __pyx_f_7aiohttp_12_http_writer__release_writer((&__pyx_v_writer)); __pyx_r = __pyx_t_15; __pyx_t_15 = 0; goto __pyx_L0; } } /* "aiohttp/_http_writer.pyx":115 * * * def _serialize_headers(str status_line, headers): # <<<<<<<<<<<<<< * cdef Writer writer * cdef object key */ /* function exit code */ __pyx_L1_error:; __Pyx_XDECREF(__pyx_t_2); __Pyx_XDECREF(__pyx_t_6); __Pyx_XDECREF(__pyx_t_7); __Pyx_AddTraceback("aiohttp._http_writer._serialize_headers", __pyx_clineno, __pyx_lineno, __pyx_filename); __pyx_r = NULL; __pyx_L0:; __Pyx_XDECREF(__pyx_v_key); __Pyx_XDECREF(__pyx_v_val); __Pyx_XGIVEREF(__pyx_r); __Pyx_RefNannyFinishContext(); return __pyx_r; } static PyMethodDef __pyx_methods[] = { {0, 0, 0, 0} }; #if PY_MAJOR_VERSION >= 3 #if CYTHON_PEP489_MULTI_PHASE_INIT static PyObject* __pyx_pymod_create(PyObject *spec, PyModuleDef *def); /*proto*/ static int __pyx_pymod_exec__http_writer(PyObject* module); /*proto*/ static PyModuleDef_Slot __pyx_moduledef_slots[] = { {Py_mod_create, (void*)__pyx_pymod_create}, {Py_mod_exec, 
(void*)__pyx_pymod_exec__http_writer}, {0, NULL} }; #endif static struct PyModuleDef __pyx_moduledef = { PyModuleDef_HEAD_INIT, "_http_writer", 0, /* m_doc */ #if CYTHON_PEP489_MULTI_PHASE_INIT 0, /* m_size */ #else -1, /* m_size */ #endif __pyx_methods /* m_methods */, #if CYTHON_PEP489_MULTI_PHASE_INIT __pyx_moduledef_slots, /* m_slots */ #else NULL, /* m_reload */ #endif NULL, /* m_traverse */ NULL, /* m_clear */ NULL /* m_free */ }; #endif #ifndef CYTHON_SMALL_CODE #if defined(__clang__) #define CYTHON_SMALL_CODE #elif defined(__GNUC__) && (__GNUC__ > 4 || (__GNUC__ == 4 && __GNUC_MINOR__ >= 3)) #define CYTHON_SMALL_CODE __attribute__((cold)) #else #define CYTHON_SMALL_CODE #endif #endif static __Pyx_StringTabEntry __pyx_string_tab[] = { {&__pyx_kp_u_Cannot_serialize_non_str_key_r, __pyx_k_Cannot_serialize_non_str_key_r, sizeof(__pyx_k_Cannot_serialize_non_str_key_r), 0, 1, 0, 0}, {&__pyx_n_s_TypeError, __pyx_k_TypeError, sizeof(__pyx_k_TypeError), 0, 0, 1, 1}, {&__pyx_n_s_aiohttp__http_writer, __pyx_k_aiohttp__http_writer, sizeof(__pyx_k_aiohttp__http_writer), 0, 0, 1, 1}, {&__pyx_kp_s_aiohttp__http_writer_pyx, __pyx_k_aiohttp__http_writer_pyx, sizeof(__pyx_k_aiohttp__http_writer_pyx), 0, 0, 1, 0}, {&__pyx_n_s_cline_in_traceback, __pyx_k_cline_in_traceback, sizeof(__pyx_k_cline_in_traceback), 0, 0, 1, 1}, {&__pyx_n_s_format, __pyx_k_format, sizeof(__pyx_k_format), 0, 0, 1, 1}, {&__pyx_n_s_headers, __pyx_k_headers, sizeof(__pyx_k_headers), 0, 0, 1, 1}, {&__pyx_n_s_import, __pyx_k_import, sizeof(__pyx_k_import), 0, 0, 1, 1}, {&__pyx_n_s_istr, __pyx_k_istr, sizeof(__pyx_k_istr), 0, 0, 1, 1}, {&__pyx_n_s_items, __pyx_k_items, sizeof(__pyx_k_items), 0, 0, 1, 1}, {&__pyx_n_s_key, __pyx_k_key, sizeof(__pyx_k_key), 0, 0, 1, 1}, {&__pyx_n_s_main, __pyx_k_main, sizeof(__pyx_k_main), 0, 0, 1, 1}, {&__pyx_n_s_multidict, __pyx_k_multidict, sizeof(__pyx_k_multidict), 0, 0, 1, 1}, {&__pyx_n_s_name, __pyx_k_name, sizeof(__pyx_k_name), 0, 0, 1, 1}, {&__pyx_n_s_ret, __pyx_k_ret, sizeof(__pyx_k_ret), 0, 0, 1, 1}, {&__pyx_n_s_serialize_headers, __pyx_k_serialize_headers, sizeof(__pyx_k_serialize_headers), 0, 0, 1, 1}, {&__pyx_n_s_status_line, __pyx_k_status_line, sizeof(__pyx_k_status_line), 0, 0, 1, 1}, {&__pyx_n_s_test, __pyx_k_test, sizeof(__pyx_k_test), 0, 0, 1, 1}, {&__pyx_n_s_val, __pyx_k_val, sizeof(__pyx_k_val), 0, 0, 1, 1}, {&__pyx_n_s_writer, __pyx_k_writer, sizeof(__pyx_k_writer), 0, 0, 1, 1}, {0, 0, 0, 0, 0, 0, 0} }; static CYTHON_SMALL_CODE int __Pyx_InitCachedBuiltins(void) { __pyx_builtin_TypeError = __Pyx_GetBuiltinName(__pyx_n_s_TypeError); if (!__pyx_builtin_TypeError) __PYX_ERR(0, 110, __pyx_L1_error) return 0; __pyx_L1_error:; return -1; } static CYTHON_SMALL_CODE int __Pyx_InitCachedConstants(void) { __Pyx_RefNannyDeclarations __Pyx_RefNannySetupContext("__Pyx_InitCachedConstants", 0); /* "aiohttp/_http_writer.pyx":115 * * * def _serialize_headers(str status_line, headers): # <<<<<<<<<<<<<< * cdef Writer writer * cdef object key */ __pyx_tuple_ = PyTuple_Pack(6, __pyx_n_s_status_line, __pyx_n_s_headers, __pyx_n_s_writer, __pyx_n_s_key, __pyx_n_s_val, __pyx_n_s_ret); if (unlikely(!__pyx_tuple_)) __PYX_ERR(0, 115, __pyx_L1_error) __Pyx_GOTREF(__pyx_tuple_); __Pyx_GIVEREF(__pyx_tuple_); __pyx_codeobj__2 = (PyObject*)__Pyx_PyCode_New(2, 0, 6, 0, CO_OPTIMIZED|CO_NEWLOCALS, __pyx_empty_bytes, __pyx_empty_tuple, __pyx_empty_tuple, __pyx_tuple_, __pyx_empty_tuple, __pyx_empty_tuple, __pyx_kp_s_aiohttp__http_writer_pyx, __pyx_n_s_serialize_headers, 115, __pyx_empty_bytes); if 
(unlikely(!__pyx_codeobj__2)) __PYX_ERR(0, 115, __pyx_L1_error) __Pyx_RefNannyFinishContext(); return 0; __pyx_L1_error:; __Pyx_RefNannyFinishContext(); return -1; } static CYTHON_SMALL_CODE int __Pyx_InitGlobals(void) { if (__Pyx_InitStrings(__pyx_string_tab) < 0) __PYX_ERR(0, 1, __pyx_L1_error); return 0; __pyx_L1_error:; return -1; } static CYTHON_SMALL_CODE int __Pyx_modinit_global_init_code(void); /*proto*/ static CYTHON_SMALL_CODE int __Pyx_modinit_variable_export_code(void); /*proto*/ static CYTHON_SMALL_CODE int __Pyx_modinit_function_export_code(void); /*proto*/ static CYTHON_SMALL_CODE int __Pyx_modinit_type_init_code(void); /*proto*/ static CYTHON_SMALL_CODE int __Pyx_modinit_type_import_code(void); /*proto*/ static CYTHON_SMALL_CODE int __Pyx_modinit_variable_import_code(void); /*proto*/ static CYTHON_SMALL_CODE int __Pyx_modinit_function_import_code(void); /*proto*/ static int __Pyx_modinit_global_init_code(void) { __Pyx_RefNannyDeclarations __Pyx_RefNannySetupContext("__Pyx_modinit_global_init_code", 0); /*--- Global init code ---*/ __pyx_v_7aiohttp_12_http_writer__istr = Py_None; Py_INCREF(Py_None); __Pyx_RefNannyFinishContext(); return 0; } static int __Pyx_modinit_variable_export_code(void) { __Pyx_RefNannyDeclarations __Pyx_RefNannySetupContext("__Pyx_modinit_variable_export_code", 0); /*--- Variable export code ---*/ __Pyx_RefNannyFinishContext(); return 0; } static int __Pyx_modinit_function_export_code(void) { __Pyx_RefNannyDeclarations __Pyx_RefNannySetupContext("__Pyx_modinit_function_export_code", 0); /*--- Function export code ---*/ __Pyx_RefNannyFinishContext(); return 0; } static int __Pyx_modinit_type_init_code(void) { __Pyx_RefNannyDeclarations __Pyx_RefNannySetupContext("__Pyx_modinit_type_init_code", 0); /*--- Type init code ---*/ __Pyx_RefNannyFinishContext(); return 0; } static int __Pyx_modinit_type_import_code(void) { __Pyx_RefNannyDeclarations PyObject *__pyx_t_1 = NULL; __Pyx_RefNannySetupContext("__Pyx_modinit_type_import_code", 0); /*--- Type import code ---*/ __pyx_t_1 = PyImport_ImportModule(__Pyx_BUILTIN_MODULE_NAME); if (unlikely(!__pyx_t_1)) __PYX_ERR(1, 9, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_1); __pyx_ptype_7cpython_4type_type = __Pyx_ImportType(__pyx_t_1, __Pyx_BUILTIN_MODULE_NAME, "type", #if defined(PYPY_VERSION_NUM) && PYPY_VERSION_NUM < 0x050B0000 sizeof(PyTypeObject), #else sizeof(PyHeapTypeObject), #endif __Pyx_ImportType_CheckSize_Warn); if (!__pyx_ptype_7cpython_4type_type) __PYX_ERR(1, 9, __pyx_L1_error) __Pyx_DECREF(__pyx_t_1); __pyx_t_1 = 0; __Pyx_RefNannyFinishContext(); return 0; __pyx_L1_error:; __Pyx_XDECREF(__pyx_t_1); __Pyx_RefNannyFinishContext(); return -1; } static int __Pyx_modinit_variable_import_code(void) { __Pyx_RefNannyDeclarations __Pyx_RefNannySetupContext("__Pyx_modinit_variable_import_code", 0); /*--- Variable import code ---*/ __Pyx_RefNannyFinishContext(); return 0; } static int __Pyx_modinit_function_import_code(void) { __Pyx_RefNannyDeclarations __Pyx_RefNannySetupContext("__Pyx_modinit_function_import_code", 0); /*--- Function import code ---*/ __Pyx_RefNannyFinishContext(); return 0; } #if PY_MAJOR_VERSION < 3 #ifdef CYTHON_NO_PYINIT_EXPORT #define __Pyx_PyMODINIT_FUNC void #else #define __Pyx_PyMODINIT_FUNC PyMODINIT_FUNC #endif #else #ifdef CYTHON_NO_PYINIT_EXPORT #define __Pyx_PyMODINIT_FUNC PyObject * #else #define __Pyx_PyMODINIT_FUNC PyMODINIT_FUNC #endif #endif #if PY_MAJOR_VERSION < 3 __Pyx_PyMODINIT_FUNC init_http_writer(void) CYTHON_SMALL_CODE; /*proto*/ __Pyx_PyMODINIT_FUNC init_http_writer(void) 
#else __Pyx_PyMODINIT_FUNC PyInit__http_writer(void) CYTHON_SMALL_CODE; /*proto*/ __Pyx_PyMODINIT_FUNC PyInit__http_writer(void) #if CYTHON_PEP489_MULTI_PHASE_INIT { return PyModuleDef_Init(&__pyx_moduledef); } static CYTHON_SMALL_CODE int __Pyx_check_single_interpreter(void) { #if PY_VERSION_HEX >= 0x030700A1 static PY_INT64_T main_interpreter_id = -1; PY_INT64_T current_id = PyInterpreterState_GetID(PyThreadState_Get()->interp); if (main_interpreter_id == -1) { main_interpreter_id = current_id; return (unlikely(current_id == -1)) ? -1 : 0; } else if (unlikely(main_interpreter_id != current_id)) #else static PyInterpreterState *main_interpreter = NULL; PyInterpreterState *current_interpreter = PyThreadState_Get()->interp; if (!main_interpreter) { main_interpreter = current_interpreter; } else if (unlikely(main_interpreter != current_interpreter)) #endif { PyErr_SetString( PyExc_ImportError, "Interpreter change detected - this module can only be loaded into one interpreter per process."); return -1; } return 0; } static CYTHON_SMALL_CODE int __Pyx_copy_spec_to_module(PyObject *spec, PyObject *moddict, const char* from_name, const char* to_name, int allow_none) { PyObject *value = PyObject_GetAttrString(spec, from_name); int result = 0; if (likely(value)) { if (allow_none || value != Py_None) { result = PyDict_SetItemString(moddict, to_name, value); } Py_DECREF(value); } else if (PyErr_ExceptionMatches(PyExc_AttributeError)) { PyErr_Clear(); } else { result = -1; } return result; } static CYTHON_SMALL_CODE PyObject* __pyx_pymod_create(PyObject *spec, CYTHON_UNUSED PyModuleDef *def) { PyObject *module = NULL, *moddict, *modname; if (__Pyx_check_single_interpreter()) return NULL; if (__pyx_m) return __Pyx_NewRef(__pyx_m); modname = PyObject_GetAttrString(spec, "name"); if (unlikely(!modname)) goto bad; module = PyModule_NewObject(modname); Py_DECREF(modname); if (unlikely(!module)) goto bad; moddict = PyModule_GetDict(module); if (unlikely(!moddict)) goto bad; if (unlikely(__Pyx_copy_spec_to_module(spec, moddict, "loader", "__loader__", 1) < 0)) goto bad; if (unlikely(__Pyx_copy_spec_to_module(spec, moddict, "origin", "__file__", 1) < 0)) goto bad; if (unlikely(__Pyx_copy_spec_to_module(spec, moddict, "parent", "__package__", 1) < 0)) goto bad; if (unlikely(__Pyx_copy_spec_to_module(spec, moddict, "submodule_search_locations", "__path__", 0) < 0)) goto bad; return module; bad: Py_XDECREF(module); return NULL; } static CYTHON_SMALL_CODE int __pyx_pymod_exec__http_writer(PyObject *__pyx_pyinit_module) #endif #endif { PyObject *__pyx_t_1 = NULL; PyObject *__pyx_t_2 = NULL; __Pyx_RefNannyDeclarations #if CYTHON_PEP489_MULTI_PHASE_INIT if (__pyx_m) { if (__pyx_m == __pyx_pyinit_module) return 0; PyErr_SetString(PyExc_RuntimeError, "Module '_http_writer' has already been imported. 
Re-initialisation is not supported."); return -1; } #elif PY_MAJOR_VERSION >= 3 if (__pyx_m) return __Pyx_NewRef(__pyx_m); #endif #if CYTHON_REFNANNY __Pyx_RefNanny = __Pyx_RefNannyImportAPI("refnanny"); if (!__Pyx_RefNanny) { PyErr_Clear(); __Pyx_RefNanny = __Pyx_RefNannyImportAPI("Cython.Runtime.refnanny"); if (!__Pyx_RefNanny) Py_FatalError("failed to import 'refnanny' module"); } #endif __Pyx_RefNannySetupContext("__Pyx_PyMODINIT_FUNC PyInit__http_writer(void)", 0); if (__Pyx_check_binary_version() < 0) __PYX_ERR(0, 1, __pyx_L1_error) #ifdef __Pxy_PyFrame_Initialize_Offsets __Pxy_PyFrame_Initialize_Offsets(); #endif __pyx_empty_tuple = PyTuple_New(0); if (unlikely(!__pyx_empty_tuple)) __PYX_ERR(0, 1, __pyx_L1_error) __pyx_empty_bytes = PyBytes_FromStringAndSize("", 0); if (unlikely(!__pyx_empty_bytes)) __PYX_ERR(0, 1, __pyx_L1_error) __pyx_empty_unicode = PyUnicode_FromStringAndSize("", 0); if (unlikely(!__pyx_empty_unicode)) __PYX_ERR(0, 1, __pyx_L1_error) #ifdef __Pyx_CyFunction_USED if (__pyx_CyFunction_init() < 0) __PYX_ERR(0, 1, __pyx_L1_error) #endif #ifdef __Pyx_FusedFunction_USED if (__pyx_FusedFunction_init() < 0) __PYX_ERR(0, 1, __pyx_L1_error) #endif #ifdef __Pyx_Coroutine_USED if (__pyx_Coroutine_init() < 0) __PYX_ERR(0, 1, __pyx_L1_error) #endif #ifdef __Pyx_Generator_USED if (__pyx_Generator_init() < 0) __PYX_ERR(0, 1, __pyx_L1_error) #endif #ifdef __Pyx_AsyncGen_USED if (__pyx_AsyncGen_init() < 0) __PYX_ERR(0, 1, __pyx_L1_error) #endif #ifdef __Pyx_StopAsyncIteration_USED if (__pyx_StopAsyncIteration_init() < 0) __PYX_ERR(0, 1, __pyx_L1_error) #endif /*--- Library function declarations ---*/ /*--- Threads initialization code ---*/ #if defined(__PYX_FORCE_INIT_THREADS) && __PYX_FORCE_INIT_THREADS #ifdef WITH_THREAD /* Python build with threading support? */ PyEval_InitThreads(); #endif #endif /*--- Module creation code ---*/ #if CYTHON_PEP489_MULTI_PHASE_INIT __pyx_m = __pyx_pyinit_module; Py_INCREF(__pyx_m); #else #if PY_MAJOR_VERSION < 3 __pyx_m = Py_InitModule4("_http_writer", __pyx_methods, 0, 0, PYTHON_API_VERSION); Py_XINCREF(__pyx_m); #else __pyx_m = PyModule_Create(&__pyx_moduledef); #endif if (unlikely(!__pyx_m)) __PYX_ERR(0, 1, __pyx_L1_error) #endif __pyx_d = PyModule_GetDict(__pyx_m); if (unlikely(!__pyx_d)) __PYX_ERR(0, 1, __pyx_L1_error) Py_INCREF(__pyx_d); __pyx_b = PyImport_AddModule(__Pyx_BUILTIN_MODULE_NAME); if (unlikely(!__pyx_b)) __PYX_ERR(0, 1, __pyx_L1_error) Py_INCREF(__pyx_b); __pyx_cython_runtime = PyImport_AddModule((char *) "cython_runtime"); if (unlikely(!__pyx_cython_runtime)) __PYX_ERR(0, 1, __pyx_L1_error) Py_INCREF(__pyx_cython_runtime); if (PyObject_SetAttrString(__pyx_m, "__builtins__", __pyx_b) < 0) __PYX_ERR(0, 1, __pyx_L1_error); /*--- Initialize various global constants etc. 
---*/ if (__Pyx_InitGlobals() < 0) __PYX_ERR(0, 1, __pyx_L1_error) #if PY_MAJOR_VERSION < 3 && (__PYX_DEFAULT_STRING_ENCODING_IS_ASCII || __PYX_DEFAULT_STRING_ENCODING_IS_DEFAULT) if (__Pyx_init_sys_getdefaultencoding_params() < 0) __PYX_ERR(0, 1, __pyx_L1_error) #endif if (__pyx_module_is_main_aiohttp___http_writer) { if (PyObject_SetAttr(__pyx_m, __pyx_n_s_name, __pyx_n_s_main) < 0) __PYX_ERR(0, 1, __pyx_L1_error) } #if PY_MAJOR_VERSION >= 3 { PyObject *modules = PyImport_GetModuleDict(); if (unlikely(!modules)) __PYX_ERR(0, 1, __pyx_L1_error) if (!PyDict_GetItemString(modules, "aiohttp._http_writer")) { if (unlikely(PyDict_SetItemString(modules, "aiohttp._http_writer", __pyx_m) < 0)) __PYX_ERR(0, 1, __pyx_L1_error) } } #endif /*--- Builtin init code ---*/ if (__Pyx_InitCachedBuiltins() < 0) goto __pyx_L1_error; /*--- Constants init code ---*/ if (__Pyx_InitCachedConstants() < 0) goto __pyx_L1_error; /*--- Global type/function init code ---*/ (void)__Pyx_modinit_global_init_code(); (void)__Pyx_modinit_variable_export_code(); (void)__Pyx_modinit_function_export_code(); (void)__Pyx_modinit_type_init_code(); if (unlikely(__Pyx_modinit_type_import_code() != 0)) goto __pyx_L1_error; (void)__Pyx_modinit_variable_import_code(); (void)__Pyx_modinit_function_import_code(); /*--- Execution code ---*/ #if defined(__Pyx_Generator_USED) || defined(__Pyx_Coroutine_USED) if (__Pyx_patch_abc() < 0) __PYX_ERR(0, 1, __pyx_L1_error) #endif /* "aiohttp/_http_writer.pyx":9 * from cpython.object cimport PyObject_Str * * from multidict import istr # <<<<<<<<<<<<<< * * DEF BUF_SIZE = 16 * 1024 # 16KiB */ __pyx_t_1 = PyList_New(1); if (unlikely(!__pyx_t_1)) __PYX_ERR(0, 9, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_1); __Pyx_INCREF(__pyx_n_s_istr); __Pyx_GIVEREF(__pyx_n_s_istr); PyList_SET_ITEM(__pyx_t_1, 0, __pyx_n_s_istr); __pyx_t_2 = __Pyx_Import(__pyx_n_s_multidict, __pyx_t_1, 0); if (unlikely(!__pyx_t_2)) __PYX_ERR(0, 9, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_2); __Pyx_DECREF(__pyx_t_1); __pyx_t_1 = 0; __pyx_t_1 = __Pyx_ImportFrom(__pyx_t_2, __pyx_n_s_istr); if (unlikely(!__pyx_t_1)) __PYX_ERR(0, 9, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_1); if (PyDict_SetItem(__pyx_d, __pyx_n_s_istr, __pyx_t_1) < 0) __PYX_ERR(0, 9, __pyx_L1_error) __Pyx_DECREF(__pyx_t_1); __pyx_t_1 = 0; __Pyx_DECREF(__pyx_t_2); __pyx_t_2 = 0; /* "aiohttp/_http_writer.pyx":14 * cdef char BUFFER[BUF_SIZE] * * cdef object _istr = istr # <<<<<<<<<<<<<< * * */ __Pyx_GetModuleGlobalName(__pyx_t_2, __pyx_n_s_istr); if (unlikely(!__pyx_t_2)) __PYX_ERR(0, 14, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_2); __Pyx_XGOTREF(__pyx_v_7aiohttp_12_http_writer__istr); __Pyx_DECREF_SET(__pyx_v_7aiohttp_12_http_writer__istr, __pyx_t_2); __Pyx_GIVEREF(__pyx_t_2); __pyx_t_2 = 0; /* "aiohttp/_http_writer.pyx":115 * * * def _serialize_headers(str status_line, headers): # <<<<<<<<<<<<<< * cdef Writer writer * cdef object key */ __pyx_t_2 = PyCFunction_NewEx(&__pyx_mdef_7aiohttp_12_http_writer_1_serialize_headers, NULL, __pyx_n_s_aiohttp__http_writer); if (unlikely(!__pyx_t_2)) __PYX_ERR(0, 115, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_2); if (PyDict_SetItem(__pyx_d, __pyx_n_s_serialize_headers, __pyx_t_2) < 0) __PYX_ERR(0, 115, __pyx_L1_error) __Pyx_DECREF(__pyx_t_2); __pyx_t_2 = 0; /* "aiohttp/_http_writer.pyx":1 * from libc.stdint cimport uint8_t, uint64_t # <<<<<<<<<<<<<< * from libc.string cimport memcpy * from cpython.exc cimport PyErr_NoMemory */ __pyx_t_2 = __Pyx_PyDict_NewPresized(0); if (unlikely(!__pyx_t_2)) __PYX_ERR(0, 1, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_2); if 
(PyDict_SetItem(__pyx_d, __pyx_n_s_test, __pyx_t_2) < 0) __PYX_ERR(0, 1, __pyx_L1_error) __Pyx_DECREF(__pyx_t_2); __pyx_t_2 = 0; /*--- Wrapped vars code ---*/ goto __pyx_L0; __pyx_L1_error:; __Pyx_XDECREF(__pyx_t_1); __Pyx_XDECREF(__pyx_t_2); if (__pyx_m) { if (__pyx_d) { __Pyx_AddTraceback("init aiohttp._http_writer", __pyx_clineno, __pyx_lineno, __pyx_filename); } Py_CLEAR(__pyx_m); } else if (!PyErr_Occurred()) { PyErr_SetString(PyExc_ImportError, "init aiohttp._http_writer"); } __pyx_L0:; __Pyx_RefNannyFinishContext(); #if CYTHON_PEP489_MULTI_PHASE_INIT return (__pyx_m != NULL) ? 0 : -1; #elif PY_MAJOR_VERSION >= 3 return __pyx_m; #else return; #endif } /* --- Runtime support code --- */ /* Refnanny */ #if CYTHON_REFNANNY static __Pyx_RefNannyAPIStruct *__Pyx_RefNannyImportAPI(const char *modname) { PyObject *m = NULL, *p = NULL; void *r = NULL; m = PyImport_ImportModule(modname); if (!m) goto end; p = PyObject_GetAttrString(m, "RefNannyAPI"); if (!p) goto end; r = PyLong_AsVoidPtr(p); end: Py_XDECREF(p); Py_XDECREF(m); return (__Pyx_RefNannyAPIStruct *)r; } #endif /* PyObjectGetAttrStr */ #if CYTHON_USE_TYPE_SLOTS static CYTHON_INLINE PyObject* __Pyx_PyObject_GetAttrStr(PyObject* obj, PyObject* attr_name) { PyTypeObject* tp = Py_TYPE(obj); if (likely(tp->tp_getattro)) return tp->tp_getattro(obj, attr_name); #if PY_MAJOR_VERSION < 3 if (likely(tp->tp_getattr)) return tp->tp_getattr(obj, PyString_AS_STRING(attr_name)); #endif return PyObject_GetAttr(obj, attr_name); } #endif /* GetBuiltinName */ static PyObject *__Pyx_GetBuiltinName(PyObject *name) { PyObject* result = __Pyx_PyObject_GetAttrStr(__pyx_b, name); if (unlikely(!result)) { PyErr_Format(PyExc_NameError, #if PY_MAJOR_VERSION >= 3 "name '%U' is not defined", name); #else "name '%.200s' is not defined", PyString_AS_STRING(name)); #endif } return result; } /* PyErrFetchRestore */ #if CYTHON_FAST_THREAD_STATE static CYTHON_INLINE void __Pyx_ErrRestoreInState(PyThreadState *tstate, PyObject *type, PyObject *value, PyObject *tb) { PyObject *tmp_type, *tmp_value, *tmp_tb; tmp_type = tstate->curexc_type; tmp_value = tstate->curexc_value; tmp_tb = tstate->curexc_traceback; tstate->curexc_type = type; tstate->curexc_value = value; tstate->curexc_traceback = tb; Py_XDECREF(tmp_type); Py_XDECREF(tmp_value); Py_XDECREF(tmp_tb); } static CYTHON_INLINE void __Pyx_ErrFetchInState(PyThreadState *tstate, PyObject **type, PyObject **value, PyObject **tb) { *type = tstate->curexc_type; *value = tstate->curexc_value; *tb = tstate->curexc_traceback; tstate->curexc_type = 0; tstate->curexc_value = 0; tstate->curexc_traceback = 0; } #endif /* WriteUnraisableException */ static void __Pyx_WriteUnraisable(const char *name, CYTHON_UNUSED int clineno, CYTHON_UNUSED int lineno, CYTHON_UNUSED const char *filename, int full_traceback, CYTHON_UNUSED int nogil) { PyObject *old_exc, *old_val, *old_tb; PyObject *ctx; __Pyx_PyThreadState_declare #ifdef WITH_THREAD PyGILState_STATE state; if (nogil) state = PyGILState_Ensure(); #ifdef _MSC_VER else state = (PyGILState_STATE)-1; #endif #endif __Pyx_PyThreadState_assign __Pyx_ErrFetch(&old_exc, &old_val, &old_tb); if (full_traceback) { Py_XINCREF(old_exc); Py_XINCREF(old_val); Py_XINCREF(old_tb); __Pyx_ErrRestore(old_exc, old_val, old_tb); PyErr_PrintEx(1); } #if PY_MAJOR_VERSION < 3 ctx = PyString_FromString(name); #else ctx = PyUnicode_FromString(name); #endif __Pyx_ErrRestore(old_exc, old_val, old_tb); if (!ctx) { PyErr_WriteUnraisable(Py_None); } else { PyErr_WriteUnraisable(ctx); Py_DECREF(ctx); } #ifdef 
WITH_THREAD if (nogil) PyGILState_Release(state); #endif } /* unicode_iter */ static CYTHON_INLINE int __Pyx_init_unicode_iteration( PyObject* ustring, Py_ssize_t *length, void** data, int *kind) { #if CYTHON_PEP393_ENABLED if (unlikely(__Pyx_PyUnicode_READY(ustring) < 0)) return -1; *kind = PyUnicode_KIND(ustring); *length = PyUnicode_GET_LENGTH(ustring); *data = PyUnicode_DATA(ustring); #else *kind = 0; *length = PyUnicode_GET_SIZE(ustring); *data = (void*)PyUnicode_AS_UNICODE(ustring); #endif return 0; } /* PyCFunctionFastCall */ #if CYTHON_FAST_PYCCALL static CYTHON_INLINE PyObject * __Pyx_PyCFunction_FastCall(PyObject *func_obj, PyObject **args, Py_ssize_t nargs) { PyCFunctionObject *func = (PyCFunctionObject*)func_obj; PyCFunction meth = PyCFunction_GET_FUNCTION(func); PyObject *self = PyCFunction_GET_SELF(func); int flags = PyCFunction_GET_FLAGS(func); assert(PyCFunction_Check(func)); assert(METH_FASTCALL == (flags & ~(METH_CLASS | METH_STATIC | METH_COEXIST | METH_KEYWORDS | METH_STACKLESS))); assert(nargs >= 0); assert(nargs == 0 || args != NULL); /* _PyCFunction_FastCallDict() must not be called with an exception set, because it may clear it (directly or indirectly) and so the caller loses its exception */ assert(!PyErr_Occurred()); if ((PY_VERSION_HEX < 0x030700A0) || unlikely(flags & METH_KEYWORDS)) { return (*((__Pyx_PyCFunctionFastWithKeywords)(void*)meth)) (self, args, nargs, NULL); } else { return (*((__Pyx_PyCFunctionFast)(void*)meth)) (self, args, nargs); } } #endif /* PyFunctionFastCall */ #if CYTHON_FAST_PYCALL static PyObject* __Pyx_PyFunction_FastCallNoKw(PyCodeObject *co, PyObject **args, Py_ssize_t na, PyObject *globals) { PyFrameObject *f; PyThreadState *tstate = __Pyx_PyThreadState_Current; PyObject **fastlocals; Py_ssize_t i; PyObject *result; assert(globals != NULL); /* XXX Perhaps we should create a specialized PyFrame_New() that doesn't take locals, but does take builtins without sanity checking them. */ assert(tstate != NULL); f = PyFrame_New(tstate, co, globals, NULL); if (f == NULL) { return NULL; } fastlocals = __Pyx_PyFrame_GetLocalsplus(f); for (i = 0; i < na; i++) { Py_INCREF(*args); fastlocals[i] = *args++; } result = PyEval_EvalFrameEx(f,0); ++tstate->recursion_depth; Py_DECREF(f); --tstate->recursion_depth; return result; } #if 1 || PY_VERSION_HEX < 0x030600B1 static PyObject *__Pyx_PyFunction_FastCallDict(PyObject *func, PyObject **args, Py_ssize_t nargs, PyObject *kwargs) { PyCodeObject *co = (PyCodeObject *)PyFunction_GET_CODE(func); PyObject *globals = PyFunction_GET_GLOBALS(func); PyObject *argdefs = PyFunction_GET_DEFAULTS(func); PyObject *closure; #if PY_MAJOR_VERSION >= 3 PyObject *kwdefs; #endif PyObject *kwtuple, **k; PyObject **d; Py_ssize_t nd; Py_ssize_t nk; PyObject *result; assert(kwargs == NULL || PyDict_Check(kwargs)); nk = kwargs ? 
PyDict_Size(kwargs) : 0; if (Py_EnterRecursiveCall((char*)" while calling a Python object")) { return NULL; } if ( #if PY_MAJOR_VERSION >= 3 co->co_kwonlyargcount == 0 && #endif likely(kwargs == NULL || nk == 0) && co->co_flags == (CO_OPTIMIZED | CO_NEWLOCALS | CO_NOFREE)) { if (argdefs == NULL && co->co_argcount == nargs) { result = __Pyx_PyFunction_FastCallNoKw(co, args, nargs, globals); goto done; } else if (nargs == 0 && argdefs != NULL && co->co_argcount == Py_SIZE(argdefs)) { /* function called with no arguments, but all parameters have a default value: use default values as arguments .*/ args = &PyTuple_GET_ITEM(argdefs, 0); result =__Pyx_PyFunction_FastCallNoKw(co, args, Py_SIZE(argdefs), globals); goto done; } } if (kwargs != NULL) { Py_ssize_t pos, i; kwtuple = PyTuple_New(2 * nk); if (kwtuple == NULL) { result = NULL; goto done; } k = &PyTuple_GET_ITEM(kwtuple, 0); pos = i = 0; while (PyDict_Next(kwargs, &pos, &k[i], &k[i+1])) { Py_INCREF(k[i]); Py_INCREF(k[i+1]); i += 2; } nk = i / 2; } else { kwtuple = NULL; k = NULL; } closure = PyFunction_GET_CLOSURE(func); #if PY_MAJOR_VERSION >= 3 kwdefs = PyFunction_GET_KW_DEFAULTS(func); #endif if (argdefs != NULL) { d = &PyTuple_GET_ITEM(argdefs, 0); nd = Py_SIZE(argdefs); } else { d = NULL; nd = 0; } #if PY_MAJOR_VERSION >= 3 result = PyEval_EvalCodeEx((PyObject*)co, globals, (PyObject *)NULL, args, (int)nargs, k, (int)nk, d, (int)nd, kwdefs, closure); #else result = PyEval_EvalCodeEx(co, globals, (PyObject *)NULL, args, (int)nargs, k, (int)nk, d, (int)nd, closure); #endif Py_XDECREF(kwtuple); done: Py_LeaveRecursiveCall(); return result; } #endif #endif /* PyObjectCall */ #if CYTHON_COMPILING_IN_CPYTHON static CYTHON_INLINE PyObject* __Pyx_PyObject_Call(PyObject *func, PyObject *arg, PyObject *kw) { PyObject *result; ternaryfunc call = func->ob_type->tp_call; if (unlikely(!call)) return PyObject_Call(func, arg, kw); if (unlikely(Py_EnterRecursiveCall((char*)" while calling a Python object"))) return NULL; result = (*call)(func, arg, kw); Py_LeaveRecursiveCall(); if (unlikely(!result) && unlikely(!PyErr_Occurred())) { PyErr_SetString( PyExc_SystemError, "NULL result without error in PyObject_Call"); } return result; } #endif /* PyObjectCall2Args */ static CYTHON_UNUSED PyObject* __Pyx_PyObject_Call2Args(PyObject* function, PyObject* arg1, PyObject* arg2) { PyObject *args, *result = NULL; #if CYTHON_FAST_PYCALL if (PyFunction_Check(function)) { PyObject *args[2] = {arg1, arg2}; return __Pyx_PyFunction_FastCall(function, args, 2); } #endif #if CYTHON_FAST_PYCCALL if (__Pyx_PyFastCFunction_Check(function)) { PyObject *args[2] = {arg1, arg2}; return __Pyx_PyCFunction_FastCall(function, args, 2); } #endif args = PyTuple_New(2); if (unlikely(!args)) goto done; Py_INCREF(arg1); PyTuple_SET_ITEM(args, 0, arg1); Py_INCREF(arg2); PyTuple_SET_ITEM(args, 1, arg2); Py_INCREF(function); result = __Pyx_PyObject_Call(function, args, NULL); Py_DECREF(args); Py_DECREF(function); done: return result; } /* PyObjectCallMethO */ #if CYTHON_COMPILING_IN_CPYTHON static CYTHON_INLINE PyObject* __Pyx_PyObject_CallMethO(PyObject *func, PyObject *arg) { PyObject *self, *result; PyCFunction cfunc; cfunc = PyCFunction_GET_FUNCTION(func); self = PyCFunction_GET_SELF(func); if (unlikely(Py_EnterRecursiveCall((char*)" while calling a Python object"))) return NULL; result = cfunc(self, arg); Py_LeaveRecursiveCall(); if (unlikely(!result) && unlikely(!PyErr_Occurred())) { PyErr_SetString( PyExc_SystemError, "NULL result without error in PyObject_Call"); } return result; 
} #endif /* PyObjectCallOneArg */ #if CYTHON_COMPILING_IN_CPYTHON static PyObject* __Pyx__PyObject_CallOneArg(PyObject *func, PyObject *arg) { PyObject *result; PyObject *args = PyTuple_New(1); if (unlikely(!args)) return NULL; Py_INCREF(arg); PyTuple_SET_ITEM(args, 0, arg); result = __Pyx_PyObject_Call(func, args, NULL); Py_DECREF(args); return result; } static CYTHON_INLINE PyObject* __Pyx_PyObject_CallOneArg(PyObject *func, PyObject *arg) { #if CYTHON_FAST_PYCALL if (PyFunction_Check(func)) { return __Pyx_PyFunction_FastCall(func, &arg, 1); } #endif if (likely(PyCFunction_Check(func))) { if (likely(PyCFunction_GET_FLAGS(func) & METH_O)) { return __Pyx_PyObject_CallMethO(func, arg); #if CYTHON_FAST_PYCCALL } else if (PyCFunction_GET_FLAGS(func) & METH_FASTCALL) { return __Pyx_PyCFunction_FastCall(func, &arg, 1); #endif } } return __Pyx__PyObject_CallOneArg(func, arg); } #else static CYTHON_INLINE PyObject* __Pyx_PyObject_CallOneArg(PyObject *func, PyObject *arg) { PyObject *result; PyObject *args = PyTuple_Pack(1, arg); if (unlikely(!args)) return NULL; result = __Pyx_PyObject_Call(func, args, NULL); Py_DECREF(args); return result; } #endif /* RaiseException */ #if PY_MAJOR_VERSION < 3 static void __Pyx_Raise(PyObject *type, PyObject *value, PyObject *tb, CYTHON_UNUSED PyObject *cause) { __Pyx_PyThreadState_declare Py_XINCREF(type); if (!value || value == Py_None) value = NULL; else Py_INCREF(value); if (!tb || tb == Py_None) tb = NULL; else { Py_INCREF(tb); if (!PyTraceBack_Check(tb)) { PyErr_SetString(PyExc_TypeError, "raise: arg 3 must be a traceback or None"); goto raise_error; } } if (PyType_Check(type)) { #if CYTHON_COMPILING_IN_PYPY if (!value) { Py_INCREF(Py_None); value = Py_None; } #endif PyErr_NormalizeException(&type, &value, &tb); } else { if (value) { PyErr_SetString(PyExc_TypeError, "instance exception may not have a separate value"); goto raise_error; } value = type; type = (PyObject*) Py_TYPE(type); Py_INCREF(type); if (!PyType_IsSubtype((PyTypeObject *)type, (PyTypeObject *)PyExc_BaseException)) { PyErr_SetString(PyExc_TypeError, "raise: exception class must be a subclass of BaseException"); goto raise_error; } } __Pyx_PyThreadState_assign __Pyx_ErrRestore(type, value, tb); return; raise_error: Py_XDECREF(value); Py_XDECREF(type); Py_XDECREF(tb); return; } #else static void __Pyx_Raise(PyObject *type, PyObject *value, PyObject *tb, PyObject *cause) { PyObject* owned_instance = NULL; if (tb == Py_None) { tb = 0; } else if (tb && !PyTraceBack_Check(tb)) { PyErr_SetString(PyExc_TypeError, "raise: arg 3 must be a traceback or None"); goto bad; } if (value == Py_None) value = 0; if (PyExceptionInstance_Check(type)) { if (value) { PyErr_SetString(PyExc_TypeError, "instance exception may not have a separate value"); goto bad; } value = type; type = (PyObject*) Py_TYPE(value); } else if (PyExceptionClass_Check(type)) { PyObject *instance_class = NULL; if (value && PyExceptionInstance_Check(value)) { instance_class = (PyObject*) Py_TYPE(value); if (instance_class != type) { int is_subclass = PyObject_IsSubclass(instance_class, type); if (!is_subclass) { instance_class = NULL; } else if (unlikely(is_subclass == -1)) { goto bad; } else { type = instance_class; } } } if (!instance_class) { PyObject *args; if (!value) args = PyTuple_New(0); else if (PyTuple_Check(value)) { Py_INCREF(value); args = value; } else args = PyTuple_Pack(1, value); if (!args) goto bad; owned_instance = PyObject_Call(type, args, NULL); Py_DECREF(args); if (!owned_instance) goto bad; value = owned_instance; 
if (!PyExceptionInstance_Check(value)) { PyErr_Format(PyExc_TypeError, "calling %R should have returned an instance of " "BaseException, not %R", type, Py_TYPE(value)); goto bad; } } } else { PyErr_SetString(PyExc_TypeError, "raise: exception class must be a subclass of BaseException"); goto bad; } if (cause) { PyObject *fixed_cause; if (cause == Py_None) { fixed_cause = NULL; } else if (PyExceptionClass_Check(cause)) { fixed_cause = PyObject_CallObject(cause, NULL); if (fixed_cause == NULL) goto bad; } else if (PyExceptionInstance_Check(cause)) { fixed_cause = cause; Py_INCREF(fixed_cause); } else { PyErr_SetString(PyExc_TypeError, "exception causes must derive from " "BaseException"); goto bad; } PyException_SetCause(value, fixed_cause); } PyErr_SetObject(type, value); if (tb) { #if CYTHON_COMPILING_IN_PYPY PyObject *tmp_type, *tmp_value, *tmp_tb; PyErr_Fetch(&tmp_type, &tmp_value, &tmp_tb); Py_INCREF(tb); PyErr_Restore(tmp_type, tmp_value, tb); Py_XDECREF(tmp_tb); #else PyThreadState *tstate = __Pyx_PyThreadState_Current; PyObject* tmp_tb = tstate->curexc_traceback; if (tb != tmp_tb) { Py_INCREF(tb); tstate->curexc_traceback = tb; Py_XDECREF(tmp_tb); } #endif } bad: Py_XDECREF(owned_instance); return; } #endif /* RaiseArgTupleInvalid */ static void __Pyx_RaiseArgtupleInvalid( const char* func_name, int exact, Py_ssize_t num_min, Py_ssize_t num_max, Py_ssize_t num_found) { Py_ssize_t num_expected; const char *more_or_less; if (num_found < num_min) { num_expected = num_min; more_or_less = "at least"; } else { num_expected = num_max; more_or_less = "at most"; } if (exact) { more_or_less = "exactly"; } PyErr_Format(PyExc_TypeError, "%.200s() takes %.8s %" CYTHON_FORMAT_SSIZE_T "d positional argument%.1s (%" CYTHON_FORMAT_SSIZE_T "d given)", func_name, more_or_less, num_expected, (num_expected == 1) ? "" : "s", num_found); } /* RaiseDoubleKeywords */ static void __Pyx_RaiseDoubleKeywordsError( const char* func_name, PyObject* kw_name) { PyErr_Format(PyExc_TypeError, #if PY_MAJOR_VERSION >= 3 "%s() got multiple values for keyword argument '%U'", func_name, kw_name); #else "%s() got multiple values for keyword argument '%s'", func_name, PyString_AsString(kw_name)); #endif } /* ParseKeywords */ static int __Pyx_ParseOptionalKeywords( PyObject *kwds, PyObject **argnames[], PyObject *kwds2, PyObject *values[], Py_ssize_t num_pos_args, const char* function_name) { PyObject *key = 0, *value = 0; Py_ssize_t pos = 0; PyObject*** name; PyObject*** first_kw_arg = argnames + num_pos_args; while (PyDict_Next(kwds, &pos, &key, &value)) { name = first_kw_arg; while (*name && (**name != key)) name++; if (*name) { values[name-argnames] = value; continue; } name = first_kw_arg; #if PY_MAJOR_VERSION < 3 if (likely(PyString_CheckExact(key)) || likely(PyString_Check(key))) { while (*name) { if ((CYTHON_COMPILING_IN_PYPY || PyString_GET_SIZE(**name) == PyString_GET_SIZE(key)) && _PyString_Eq(**name, key)) { values[name-argnames] = value; break; } name++; } if (*name) continue; else { PyObject*** argname = argnames; while (argname != first_kw_arg) { if ((**argname == key) || ( (CYTHON_COMPILING_IN_PYPY || PyString_GET_SIZE(**argname) == PyString_GET_SIZE(key)) && _PyString_Eq(**argname, key))) { goto arg_passed_twice; } argname++; } } } else #endif if (likely(PyUnicode_Check(key))) { while (*name) { int cmp = (**name == key) ? 0 : #if !CYTHON_COMPILING_IN_PYPY && PY_MAJOR_VERSION >= 3 (PyUnicode_GET_SIZE(**name) != PyUnicode_GET_SIZE(key)) ? 
1 : #endif PyUnicode_Compare(**name, key); if (cmp < 0 && unlikely(PyErr_Occurred())) goto bad; if (cmp == 0) { values[name-argnames] = value; break; } name++; } if (*name) continue; else { PyObject*** argname = argnames; while (argname != first_kw_arg) { int cmp = (**argname == key) ? 0 : #if !CYTHON_COMPILING_IN_PYPY && PY_MAJOR_VERSION >= 3 (PyUnicode_GET_SIZE(**argname) != PyUnicode_GET_SIZE(key)) ? 1 : #endif PyUnicode_Compare(**argname, key); if (cmp < 0 && unlikely(PyErr_Occurred())) goto bad; if (cmp == 0) goto arg_passed_twice; argname++; } } } else goto invalid_keyword_type; if (kwds2) { if (unlikely(PyDict_SetItem(kwds2, key, value))) goto bad; } else { goto invalid_keyword; } } return 0; arg_passed_twice: __Pyx_RaiseDoubleKeywordsError(function_name, key); goto bad; invalid_keyword_type: PyErr_Format(PyExc_TypeError, "%.200s() keywords must be strings", function_name); goto bad; invalid_keyword: PyErr_Format(PyExc_TypeError, #if PY_MAJOR_VERSION < 3 "%.200s() got an unexpected keyword argument '%.200s'", function_name, PyString_AsString(key)); #else "%s() got an unexpected keyword argument '%U'", function_name, key); #endif bad: return -1; } /* ArgTypeTest */ static int __Pyx__ArgTypeTest(PyObject *obj, PyTypeObject *type, const char *name, int exact) { if (unlikely(!type)) { PyErr_SetString(PyExc_SystemError, "Missing type object"); return 0; } else if (exact) { #if PY_MAJOR_VERSION == 2 if ((type == &PyBaseString_Type) && likely(__Pyx_PyBaseString_CheckExact(obj))) return 1; #endif } else { if (likely(__Pyx_TypeCheck(obj, type))) return 1; } PyErr_Format(PyExc_TypeError, "Argument '%.200s' has incorrect type (expected %.200s, got %.200s)", name, type->tp_name, Py_TYPE(obj)->tp_name); return 0; } /* GetTopmostException */ #if CYTHON_USE_EXC_INFO_STACK static _PyErr_StackItem * __Pyx_PyErr_GetTopmostException(PyThreadState *tstate) { _PyErr_StackItem *exc_info = tstate->exc_info; while ((exc_info->exc_type == NULL || exc_info->exc_type == Py_None) && exc_info->previous_item != NULL) { exc_info = exc_info->previous_item; } return exc_info; } #endif /* ReRaiseException */ static CYTHON_INLINE void __Pyx_ReraiseException(void) { PyObject *type = NULL, *value = NULL, *tb = NULL; #if CYTHON_FAST_THREAD_STATE PyThreadState *tstate = PyThreadState_GET(); #if CYTHON_USE_EXC_INFO_STACK _PyErr_StackItem *exc_info = __Pyx_PyErr_GetTopmostException(tstate); type = exc_info->exc_type; value = exc_info->exc_value; tb = exc_info->exc_traceback; #else type = tstate->exc_type; value = tstate->exc_value; tb = tstate->exc_traceback; #endif #else PyErr_GetExcInfo(&type, &value, &tb); #endif if (!type || type == Py_None) { #if !CYTHON_FAST_THREAD_STATE Py_XDECREF(type); Py_XDECREF(value); Py_XDECREF(tb); #endif PyErr_SetString(PyExc_RuntimeError, "No active exception to reraise"); } else { #if CYTHON_FAST_THREAD_STATE Py_INCREF(type); Py_XINCREF(value); Py_XINCREF(tb); #endif PyErr_Restore(type, value, tb); } } /* IterFinish */ static CYTHON_INLINE int __Pyx_IterFinish(void) { #if CYTHON_FAST_THREAD_STATE PyThreadState *tstate = __Pyx_PyThreadState_Current; PyObject* exc_type = tstate->curexc_type; if (unlikely(exc_type)) { if (likely(__Pyx_PyErr_GivenExceptionMatches(exc_type, PyExc_StopIteration))) { PyObject *exc_value, *exc_tb; exc_value = tstate->curexc_value; exc_tb = tstate->curexc_traceback; tstate->curexc_type = 0; tstate->curexc_value = 0; tstate->curexc_traceback = 0; Py_DECREF(exc_type); Py_XDECREF(exc_value); Py_XDECREF(exc_tb); return 0; } else { return -1; } } return 0; #else if 
(unlikely(PyErr_Occurred())) { if (likely(PyErr_ExceptionMatches(PyExc_StopIteration))) { PyErr_Clear(); return 0; } else { return -1; } } return 0; #endif } /* PyObjectCallNoArg */ #if CYTHON_COMPILING_IN_CPYTHON static CYTHON_INLINE PyObject* __Pyx_PyObject_CallNoArg(PyObject *func) { #if CYTHON_FAST_PYCALL if (PyFunction_Check(func)) { return __Pyx_PyFunction_FastCall(func, NULL, 0); } #endif #ifdef __Pyx_CyFunction_USED if (likely(PyCFunction_Check(func) || __Pyx_CyFunction_Check(func))) #else if (likely(PyCFunction_Check(func))) #endif { if (likely(PyCFunction_GET_FLAGS(func) & METH_NOARGS)) { return __Pyx_PyObject_CallMethO(func, NULL); } } return __Pyx_PyObject_Call(func, __pyx_empty_tuple, NULL); } #endif /* PyObjectGetMethod */ static int __Pyx_PyObject_GetMethod(PyObject *obj, PyObject *name, PyObject **method) { PyObject *attr; #if CYTHON_UNPACK_METHODS && CYTHON_COMPILING_IN_CPYTHON && CYTHON_USE_PYTYPE_LOOKUP PyTypeObject *tp = Py_TYPE(obj); PyObject *descr; descrgetfunc f = NULL; PyObject **dictptr, *dict; int meth_found = 0; assert (*method == NULL); if (unlikely(tp->tp_getattro != PyObject_GenericGetAttr)) { attr = __Pyx_PyObject_GetAttrStr(obj, name); goto try_unpack; } if (unlikely(tp->tp_dict == NULL) && unlikely(PyType_Ready(tp) < 0)) { return 0; } descr = _PyType_Lookup(tp, name); if (likely(descr != NULL)) { Py_INCREF(descr); #if PY_MAJOR_VERSION >= 3 #ifdef __Pyx_CyFunction_USED if (likely(PyFunction_Check(descr) || (Py_TYPE(descr) == &PyMethodDescr_Type) || __Pyx_CyFunction_Check(descr))) #else if (likely(PyFunction_Check(descr) || (Py_TYPE(descr) == &PyMethodDescr_Type))) #endif #else #ifdef __Pyx_CyFunction_USED if (likely(PyFunction_Check(descr) || __Pyx_CyFunction_Check(descr))) #else if (likely(PyFunction_Check(descr))) #endif #endif { meth_found = 1; } else { f = Py_TYPE(descr)->tp_descr_get; if (f != NULL && PyDescr_IsData(descr)) { attr = f(descr, obj, (PyObject *)Py_TYPE(obj)); Py_DECREF(descr); goto try_unpack; } } } dictptr = _PyObject_GetDictPtr(obj); if (dictptr != NULL && (dict = *dictptr) != NULL) { Py_INCREF(dict); attr = __Pyx_PyDict_GetItemStr(dict, name); if (attr != NULL) { Py_INCREF(attr); Py_DECREF(dict); Py_XDECREF(descr); goto try_unpack; } Py_DECREF(dict); } if (meth_found) { *method = descr; return 1; } if (f != NULL) { attr = f(descr, obj, (PyObject *)Py_TYPE(obj)); Py_DECREF(descr); goto try_unpack; } if (descr != NULL) { *method = descr; return 0; } PyErr_Format(PyExc_AttributeError, #if PY_MAJOR_VERSION >= 3 "'%.50s' object has no attribute '%U'", tp->tp_name, name); #else "'%.50s' object has no attribute '%.400s'", tp->tp_name, PyString_AS_STRING(name)); #endif return 0; #else attr = __Pyx_PyObject_GetAttrStr(obj, name); goto try_unpack; #endif try_unpack: #if CYTHON_UNPACK_METHODS if (likely(attr) && PyMethod_Check(attr) && likely(PyMethod_GET_SELF(attr) == obj)) { PyObject *function = PyMethod_GET_FUNCTION(attr); Py_INCREF(function); Py_DECREF(attr); *method = function; return 1; } #endif *method = attr; return 0; } /* PyObjectCallMethod0 */ static PyObject* __Pyx_PyObject_CallMethod0(PyObject* obj, PyObject* method_name) { PyObject *method = NULL, *result = NULL; int is_method = __Pyx_PyObject_GetMethod(obj, method_name, &method); if (likely(is_method)) { result = __Pyx_PyObject_CallOneArg(method, obj); Py_DECREF(method); return result; } if (unlikely(!method)) goto bad; result = __Pyx_PyObject_CallNoArg(method); Py_DECREF(method); bad: return result; } /* RaiseNeedMoreValuesToUnpack */ static CYTHON_INLINE void 
__Pyx_RaiseNeedMoreValuesError(Py_ssize_t index) { PyErr_Format(PyExc_ValueError, "need more than %" CYTHON_FORMAT_SSIZE_T "d value%.1s to unpack", index, (index == 1) ? "" : "s"); } /* RaiseTooManyValuesToUnpack */ static CYTHON_INLINE void __Pyx_RaiseTooManyValuesError(Py_ssize_t expected) { PyErr_Format(PyExc_ValueError, "too many values to unpack (expected %" CYTHON_FORMAT_SSIZE_T "d)", expected); } /* UnpackItemEndCheck */ static int __Pyx_IternextUnpackEndCheck(PyObject *retval, Py_ssize_t expected) { if (unlikely(retval)) { Py_DECREF(retval); __Pyx_RaiseTooManyValuesError(expected); return -1; } else { return __Pyx_IterFinish(); } return 0; } /* RaiseNoneIterError */ static CYTHON_INLINE void __Pyx_RaiseNoneNotIterableError(void) { PyErr_SetString(PyExc_TypeError, "'NoneType' object is not iterable"); } /* UnpackTupleError */ static void __Pyx_UnpackTupleError(PyObject *t, Py_ssize_t index) { if (t == Py_None) { __Pyx_RaiseNoneNotIterableError(); } else if (PyTuple_GET_SIZE(t) < index) { __Pyx_RaiseNeedMoreValuesError(PyTuple_GET_SIZE(t)); } else { __Pyx_RaiseTooManyValuesError(index); } } /* UnpackTuple2 */ static CYTHON_INLINE int __Pyx_unpack_tuple2_exact( PyObject* tuple, PyObject** pvalue1, PyObject** pvalue2, int decref_tuple) { PyObject *value1 = NULL, *value2 = NULL; #if CYTHON_COMPILING_IN_PYPY value1 = PySequence_ITEM(tuple, 0); if (unlikely(!value1)) goto bad; value2 = PySequence_ITEM(tuple, 1); if (unlikely(!value2)) goto bad; #else value1 = PyTuple_GET_ITEM(tuple, 0); Py_INCREF(value1); value2 = PyTuple_GET_ITEM(tuple, 1); Py_INCREF(value2); #endif if (decref_tuple) { Py_DECREF(tuple); } *pvalue1 = value1; *pvalue2 = value2; return 0; #if CYTHON_COMPILING_IN_PYPY bad: Py_XDECREF(value1); Py_XDECREF(value2); if (decref_tuple) { Py_XDECREF(tuple); } return -1; #endif } static int __Pyx_unpack_tuple2_generic(PyObject* tuple, PyObject** pvalue1, PyObject** pvalue2, int has_known_size, int decref_tuple) { Py_ssize_t index; PyObject *value1 = NULL, *value2 = NULL, *iter = NULL; iternextfunc iternext; iter = PyObject_GetIter(tuple); if (unlikely(!iter)) goto bad; if (decref_tuple) { Py_DECREF(tuple); tuple = NULL; } iternext = Py_TYPE(iter)->tp_iternext; value1 = iternext(iter); if (unlikely(!value1)) { index = 0; goto unpacking_failed; } value2 = iternext(iter); if (unlikely(!value2)) { index = 1; goto unpacking_failed; } if (!has_known_size && unlikely(__Pyx_IternextUnpackEndCheck(iternext(iter), 2))) goto bad; Py_DECREF(iter); *pvalue1 = value1; *pvalue2 = value2; return 0; unpacking_failed: if (!has_known_size && __Pyx_IterFinish() == 0) __Pyx_RaiseNeedMoreValuesError(index); bad: Py_XDECREF(iter); Py_XDECREF(value1); Py_XDECREF(value2); if (decref_tuple) { Py_XDECREF(tuple); } return -1; } /* dict_iter */ static CYTHON_INLINE PyObject* __Pyx_dict_iterator(PyObject* iterable, int is_dict, PyObject* method_name, Py_ssize_t* p_orig_length, int* p_source_is_dict) { is_dict = is_dict || likely(PyDict_CheckExact(iterable)); *p_source_is_dict = is_dict; if (is_dict) { #if !CYTHON_COMPILING_IN_PYPY *p_orig_length = PyDict_Size(iterable); Py_INCREF(iterable); return iterable; #elif PY_MAJOR_VERSION >= 3 static PyObject *py_items = NULL, *py_keys = NULL, *py_values = NULL; PyObject **pp = NULL; if (method_name) { const char *name = PyUnicode_AsUTF8(method_name); if (strcmp(name, "iteritems") == 0) pp = &py_items; else if (strcmp(name, "iterkeys") == 0) pp = &py_keys; else if (strcmp(name, "itervalues") == 0) pp = &py_values; if (pp) { if (!*pp) { *pp = PyUnicode_FromString(name + 
4); if (!*pp) return NULL; } method_name = *pp; } } #endif } *p_orig_length = 0; if (method_name) { PyObject* iter; iterable = __Pyx_PyObject_CallMethod0(iterable, method_name); if (!iterable) return NULL; #if !CYTHON_COMPILING_IN_PYPY if (PyTuple_CheckExact(iterable) || PyList_CheckExact(iterable)) return iterable; #endif iter = PyObject_GetIter(iterable); Py_DECREF(iterable); return iter; } return PyObject_GetIter(iterable); } static CYTHON_INLINE int __Pyx_dict_iter_next( PyObject* iter_obj, CYTHON_NCP_UNUSED Py_ssize_t orig_length, CYTHON_NCP_UNUSED Py_ssize_t* ppos, PyObject** pkey, PyObject** pvalue, PyObject** pitem, int source_is_dict) { PyObject* next_item; #if !CYTHON_COMPILING_IN_PYPY if (source_is_dict) { PyObject *key, *value; if (unlikely(orig_length != PyDict_Size(iter_obj))) { PyErr_SetString(PyExc_RuntimeError, "dictionary changed size during iteration"); return -1; } if (unlikely(!PyDict_Next(iter_obj, ppos, &key, &value))) { return 0; } if (pitem) { PyObject* tuple = PyTuple_New(2); if (unlikely(!tuple)) { return -1; } Py_INCREF(key); Py_INCREF(value); PyTuple_SET_ITEM(tuple, 0, key); PyTuple_SET_ITEM(tuple, 1, value); *pitem = tuple; } else { if (pkey) { Py_INCREF(key); *pkey = key; } if (pvalue) { Py_INCREF(value); *pvalue = value; } } return 1; } else if (PyTuple_CheckExact(iter_obj)) { Py_ssize_t pos = *ppos; if (unlikely(pos >= PyTuple_GET_SIZE(iter_obj))) return 0; *ppos = pos + 1; next_item = PyTuple_GET_ITEM(iter_obj, pos); Py_INCREF(next_item); } else if (PyList_CheckExact(iter_obj)) { Py_ssize_t pos = *ppos; if (unlikely(pos >= PyList_GET_SIZE(iter_obj))) return 0; *ppos = pos + 1; next_item = PyList_GET_ITEM(iter_obj, pos); Py_INCREF(next_item); } else #endif { next_item = PyIter_Next(iter_obj); if (unlikely(!next_item)) { return __Pyx_IterFinish(); } } if (pitem) { *pitem = next_item; } else if (pkey && pvalue) { if (__Pyx_unpack_tuple2(next_item, pkey, pvalue, source_is_dict, source_is_dict, 1)) return -1; } else if (pkey) { *pkey = next_item; } else { *pvalue = next_item; } return 1; } /* GetException */ #if CYTHON_FAST_THREAD_STATE static int __Pyx__GetException(PyThreadState *tstate, PyObject **type, PyObject **value, PyObject **tb) #else static int __Pyx_GetException(PyObject **type, PyObject **value, PyObject **tb) #endif { PyObject *local_type, *local_value, *local_tb; #if CYTHON_FAST_THREAD_STATE PyObject *tmp_type, *tmp_value, *tmp_tb; local_type = tstate->curexc_type; local_value = tstate->curexc_value; local_tb = tstate->curexc_traceback; tstate->curexc_type = 0; tstate->curexc_value = 0; tstate->curexc_traceback = 0; #else PyErr_Fetch(&local_type, &local_value, &local_tb); #endif PyErr_NormalizeException(&local_type, &local_value, &local_tb); #if CYTHON_FAST_THREAD_STATE if (unlikely(tstate->curexc_type)) #else if (unlikely(PyErr_Occurred())) #endif goto bad; #if PY_MAJOR_VERSION >= 3 if (local_tb) { if (unlikely(PyException_SetTraceback(local_value, local_tb) < 0)) goto bad; } #endif Py_XINCREF(local_tb); Py_XINCREF(local_type); Py_XINCREF(local_value); *type = local_type; *value = local_value; *tb = local_tb; #if CYTHON_FAST_THREAD_STATE #if CYTHON_USE_EXC_INFO_STACK { _PyErr_StackItem *exc_info = tstate->exc_info; tmp_type = exc_info->exc_type; tmp_value = exc_info->exc_value; tmp_tb = exc_info->exc_traceback; exc_info->exc_type = local_type; exc_info->exc_value = local_value; exc_info->exc_traceback = local_tb; } #else tmp_type = tstate->exc_type; tmp_value = tstate->exc_value; tmp_tb = tstate->exc_traceback; tstate->exc_type = local_type; 
tstate->exc_value = local_value; tstate->exc_traceback = local_tb; #endif Py_XDECREF(tmp_type); Py_XDECREF(tmp_value); Py_XDECREF(tmp_tb); #else PyErr_SetExcInfo(local_type, local_value, local_tb); #endif return 0; bad: *type = 0; *value = 0; *tb = 0; Py_XDECREF(local_type); Py_XDECREF(local_value); Py_XDECREF(local_tb); return -1; } /* SwapException */ #if CYTHON_FAST_THREAD_STATE static CYTHON_INLINE void __Pyx__ExceptionSwap(PyThreadState *tstate, PyObject **type, PyObject **value, PyObject **tb) { PyObject *tmp_type, *tmp_value, *tmp_tb; #if CYTHON_USE_EXC_INFO_STACK _PyErr_StackItem *exc_info = tstate->exc_info; tmp_type = exc_info->exc_type; tmp_value = exc_info->exc_value; tmp_tb = exc_info->exc_traceback; exc_info->exc_type = *type; exc_info->exc_value = *value; exc_info->exc_traceback = *tb; #else tmp_type = tstate->exc_type; tmp_value = tstate->exc_value; tmp_tb = tstate->exc_traceback; tstate->exc_type = *type; tstate->exc_value = *value; tstate->exc_traceback = *tb; #endif *type = tmp_type; *value = tmp_value; *tb = tmp_tb; } #else static CYTHON_INLINE void __Pyx_ExceptionSwap(PyObject **type, PyObject **value, PyObject **tb) { PyObject *tmp_type, *tmp_value, *tmp_tb; PyErr_GetExcInfo(&tmp_type, &tmp_value, &tmp_tb); PyErr_SetExcInfo(*type, *value, *tb); *type = tmp_type; *value = tmp_value; *tb = tmp_tb; } #endif /* SaveResetException */ #if CYTHON_FAST_THREAD_STATE static CYTHON_INLINE void __Pyx__ExceptionSave(PyThreadState *tstate, PyObject **type, PyObject **value, PyObject **tb) { #if CYTHON_USE_EXC_INFO_STACK _PyErr_StackItem *exc_info = __Pyx_PyErr_GetTopmostException(tstate); *type = exc_info->exc_type; *value = exc_info->exc_value; *tb = exc_info->exc_traceback; #else *type = tstate->exc_type; *value = tstate->exc_value; *tb = tstate->exc_traceback; #endif Py_XINCREF(*type); Py_XINCREF(*value); Py_XINCREF(*tb); } static CYTHON_INLINE void __Pyx__ExceptionReset(PyThreadState *tstate, PyObject *type, PyObject *value, PyObject *tb) { PyObject *tmp_type, *tmp_value, *tmp_tb; #if CYTHON_USE_EXC_INFO_STACK _PyErr_StackItem *exc_info = tstate->exc_info; tmp_type = exc_info->exc_type; tmp_value = exc_info->exc_value; tmp_tb = exc_info->exc_traceback; exc_info->exc_type = type; exc_info->exc_value = value; exc_info->exc_traceback = tb; #else tmp_type = tstate->exc_type; tmp_value = tstate->exc_value; tmp_tb = tstate->exc_traceback; tstate->exc_type = type; tstate->exc_value = value; tstate->exc_traceback = tb; #endif Py_XDECREF(tmp_type); Py_XDECREF(tmp_value); Py_XDECREF(tmp_tb); } #endif /* TypeImport */ #ifndef __PYX_HAVE_RT_ImportType #define __PYX_HAVE_RT_ImportType static PyTypeObject *__Pyx_ImportType(PyObject *module, const char *module_name, const char *class_name, size_t size, enum __Pyx_ImportType_CheckSize check_size) { PyObject *result = 0; char warning[200]; Py_ssize_t basicsize; #ifdef Py_LIMITED_API PyObject *py_basicsize; #endif result = PyObject_GetAttrString(module, class_name); if (!result) goto bad; if (!PyType_Check(result)) { PyErr_Format(PyExc_TypeError, "%.200s.%.200s is not a type object", module_name, class_name); goto bad; } #ifndef Py_LIMITED_API basicsize = ((PyTypeObject *)result)->tp_basicsize; #else py_basicsize = PyObject_GetAttrString(result, "__basicsize__"); if (!py_basicsize) goto bad; basicsize = PyLong_AsSsize_t(py_basicsize); Py_DECREF(py_basicsize); py_basicsize = 0; if (basicsize == (Py_ssize_t)-1 && PyErr_Occurred()) goto bad; #endif if ((size_t)basicsize < size) { PyErr_Format(PyExc_ValueError, "%.200s.%.200s size changed, may 
indicate binary incompatibility. " "Expected %zd from C header, got %zd from PyObject", module_name, class_name, size, basicsize); goto bad; } if (check_size == __Pyx_ImportType_CheckSize_Error && (size_t)basicsize != size) { PyErr_Format(PyExc_ValueError, "%.200s.%.200s size changed, may indicate binary incompatibility. " "Expected %zd from C header, got %zd from PyObject", module_name, class_name, size, basicsize); goto bad; } else if (check_size == __Pyx_ImportType_CheckSize_Warn && (size_t)basicsize > size) { PyOS_snprintf(warning, sizeof(warning), "%s.%s size changed, may indicate binary incompatibility. " "Expected %zd from C header, got %zd from PyObject", module_name, class_name, size, basicsize); if (PyErr_WarnEx(NULL, warning, 0) < 0) goto bad; } return (PyTypeObject *)result; bad: Py_XDECREF(result); return NULL; } #endif /* Import */ static PyObject *__Pyx_Import(PyObject *name, PyObject *from_list, int level) { PyObject *empty_list = 0; PyObject *module = 0; PyObject *global_dict = 0; PyObject *empty_dict = 0; PyObject *list; #if PY_MAJOR_VERSION < 3 PyObject *py_import; py_import = __Pyx_PyObject_GetAttrStr(__pyx_b, __pyx_n_s_import); if (!py_import) goto bad; #endif if (from_list) list = from_list; else { empty_list = PyList_New(0); if (!empty_list) goto bad; list = empty_list; } global_dict = PyModule_GetDict(__pyx_m); if (!global_dict) goto bad; empty_dict = PyDict_New(); if (!empty_dict) goto bad; { #if PY_MAJOR_VERSION >= 3 if (level == -1) { if (strchr(__Pyx_MODULE_NAME, '.')) { module = PyImport_ImportModuleLevelObject( name, global_dict, empty_dict, list, 1); if (!module) { if (!PyErr_ExceptionMatches(PyExc_ImportError)) goto bad; PyErr_Clear(); } } level = 0; } #endif if (!module) { #if PY_MAJOR_VERSION < 3 PyObject *py_level = PyInt_FromLong(level); if (!py_level) goto bad; module = PyObject_CallFunctionObjArgs(py_import, name, global_dict, empty_dict, list, py_level, (PyObject *)NULL); Py_DECREF(py_level); #else module = PyImport_ImportModuleLevelObject( name, global_dict, empty_dict, list, level); #endif } } bad: #if PY_MAJOR_VERSION < 3 Py_XDECREF(py_import); #endif Py_XDECREF(empty_list); Py_XDECREF(empty_dict); return module; } /* ImportFrom */ static PyObject* __Pyx_ImportFrom(PyObject* module, PyObject* name) { PyObject* value = __Pyx_PyObject_GetAttrStr(module, name); if (unlikely(!value) && PyErr_ExceptionMatches(PyExc_AttributeError)) { PyErr_Format(PyExc_ImportError, #if PY_MAJOR_VERSION < 3 "cannot import name %.230s", PyString_AS_STRING(name)); #else "cannot import name %S", name); #endif } return value; } /* PyDictVersioning */ #if CYTHON_USE_DICT_VERSIONS && CYTHON_USE_TYPE_SLOTS static CYTHON_INLINE PY_UINT64_T __Pyx_get_tp_dict_version(PyObject *obj) { PyObject *dict = Py_TYPE(obj)->tp_dict; return likely(dict) ? __PYX_GET_DICT_VERSION(dict) : 0; } static CYTHON_INLINE PY_UINT64_T __Pyx_get_object_dict_version(PyObject *obj) { PyObject **dictptr = NULL; Py_ssize_t offset = Py_TYPE(obj)->tp_dictoffset; if (offset) { #if CYTHON_COMPILING_IN_CPYTHON dictptr = (likely(offset > 0)) ? (PyObject **) ((char *)obj + offset) : _PyObject_GetDictPtr(obj); #else dictptr = _PyObject_GetDictPtr(obj); #endif } return (dictptr && *dictptr) ? 
__PYX_GET_DICT_VERSION(*dictptr) : 0; } static CYTHON_INLINE int __Pyx_object_dict_version_matches(PyObject* obj, PY_UINT64_T tp_dict_version, PY_UINT64_T obj_dict_version) { PyObject *dict = Py_TYPE(obj)->tp_dict; if (unlikely(!dict) || unlikely(tp_dict_version != __PYX_GET_DICT_VERSION(dict))) return 0; return obj_dict_version == __Pyx_get_object_dict_version(obj); } #endif /* GetModuleGlobalName */ #if CYTHON_USE_DICT_VERSIONS static PyObject *__Pyx__GetModuleGlobalName(PyObject *name, PY_UINT64_T *dict_version, PyObject **dict_cached_value) #else static CYTHON_INLINE PyObject *__Pyx__GetModuleGlobalName(PyObject *name) #endif { PyObject *result; #if !CYTHON_AVOID_BORROWED_REFS #if CYTHON_COMPILING_IN_CPYTHON && PY_VERSION_HEX >= 0x030500A1 result = _PyDict_GetItem_KnownHash(__pyx_d, name, ((PyASCIIObject *) name)->hash); __PYX_UPDATE_DICT_CACHE(__pyx_d, result, *dict_cached_value, *dict_version) if (likely(result)) { return __Pyx_NewRef(result); } else if (unlikely(PyErr_Occurred())) { return NULL; } #else result = PyDict_GetItem(__pyx_d, name); __PYX_UPDATE_DICT_CACHE(__pyx_d, result, *dict_cached_value, *dict_version) if (likely(result)) { return __Pyx_NewRef(result); } #endif #else result = PyObject_GetItem(__pyx_d, name); __PYX_UPDATE_DICT_CACHE(__pyx_d, result, *dict_cached_value, *dict_version) if (likely(result)) { return __Pyx_NewRef(result); } PyErr_Clear(); #endif return __Pyx_GetBuiltinName(name); } /* CLineInTraceback */ #ifndef CYTHON_CLINE_IN_TRACEBACK static int __Pyx_CLineForTraceback(PyThreadState *tstate, int c_line) { PyObject *use_cline; PyObject *ptype, *pvalue, *ptraceback; #if CYTHON_COMPILING_IN_CPYTHON PyObject **cython_runtime_dict; #endif if (unlikely(!__pyx_cython_runtime)) { return c_line; } __Pyx_ErrFetchInState(tstate, &ptype, &pvalue, &ptraceback); #if CYTHON_COMPILING_IN_CPYTHON cython_runtime_dict = _PyObject_GetDictPtr(__pyx_cython_runtime); if (likely(cython_runtime_dict)) { __PYX_PY_DICT_LOOKUP_IF_MODIFIED( use_cline, *cython_runtime_dict, __Pyx_PyDict_GetItemStr(*cython_runtime_dict, __pyx_n_s_cline_in_traceback)) } else #endif { PyObject *use_cline_obj = __Pyx_PyObject_GetAttrStr(__pyx_cython_runtime, __pyx_n_s_cline_in_traceback); if (use_cline_obj) { use_cline = PyObject_Not(use_cline_obj) ? 
Py_False : Py_True; Py_DECREF(use_cline_obj); } else { PyErr_Clear(); use_cline = NULL; } } if (!use_cline) { c_line = 0; PyObject_SetAttr(__pyx_cython_runtime, __pyx_n_s_cline_in_traceback, Py_False); } else if (use_cline == Py_False || (use_cline != Py_True && PyObject_Not(use_cline) != 0)) { c_line = 0; } __Pyx_ErrRestoreInState(tstate, ptype, pvalue, ptraceback); return c_line; } #endif /* CodeObjectCache */ static int __pyx_bisect_code_objects(__Pyx_CodeObjectCacheEntry* entries, int count, int code_line) { int start = 0, mid = 0, end = count - 1; if (end >= 0 && code_line > entries[end].code_line) { return count; } while (start < end) { mid = start + (end - start) / 2; if (code_line < entries[mid].code_line) { end = mid; } else if (code_line > entries[mid].code_line) { start = mid + 1; } else { return mid; } } if (code_line <= entries[mid].code_line) { return mid; } else { return mid + 1; } } static PyCodeObject *__pyx_find_code_object(int code_line) { PyCodeObject* code_object; int pos; if (unlikely(!code_line) || unlikely(!__pyx_code_cache.entries)) { return NULL; } pos = __pyx_bisect_code_objects(__pyx_code_cache.entries, __pyx_code_cache.count, code_line); if (unlikely(pos >= __pyx_code_cache.count) || unlikely(__pyx_code_cache.entries[pos].code_line != code_line)) { return NULL; } code_object = __pyx_code_cache.entries[pos].code_object; Py_INCREF(code_object); return code_object; } static void __pyx_insert_code_object(int code_line, PyCodeObject* code_object) { int pos, i; __Pyx_CodeObjectCacheEntry* entries = __pyx_code_cache.entries; if (unlikely(!code_line)) { return; } if (unlikely(!entries)) { entries = (__Pyx_CodeObjectCacheEntry*)PyMem_Malloc(64*sizeof(__Pyx_CodeObjectCacheEntry)); if (likely(entries)) { __pyx_code_cache.entries = entries; __pyx_code_cache.max_count = 64; __pyx_code_cache.count = 1; entries[0].code_line = code_line; entries[0].code_object = code_object; Py_INCREF(code_object); } return; } pos = __pyx_bisect_code_objects(__pyx_code_cache.entries, __pyx_code_cache.count, code_line); if ((pos < __pyx_code_cache.count) && unlikely(__pyx_code_cache.entries[pos].code_line == code_line)) { PyCodeObject* tmp = entries[pos].code_object; entries[pos].code_object = code_object; Py_DECREF(tmp); return; } if (__pyx_code_cache.count == __pyx_code_cache.max_count) { int new_max = __pyx_code_cache.max_count + 64; entries = (__Pyx_CodeObjectCacheEntry*)PyMem_Realloc( __pyx_code_cache.entries, (size_t)new_max*sizeof(__Pyx_CodeObjectCacheEntry)); if (unlikely(!entries)) { return; } __pyx_code_cache.entries = entries; __pyx_code_cache.max_count = new_max; } for (i=__pyx_code_cache.count; i>pos; i--) { entries[i] = entries[i-1]; } entries[pos].code_line = code_line; entries[pos].code_object = code_object; __pyx_code_cache.count++; Py_INCREF(code_object); } /* AddTraceback */ #include "compile.h" #include "frameobject.h" #include "traceback.h" static PyCodeObject* __Pyx_CreateCodeObjectForTraceback( const char *funcname, int c_line, int py_line, const char *filename) { PyCodeObject *py_code = 0; PyObject *py_srcfile = 0; PyObject *py_funcname = 0; #if PY_MAJOR_VERSION < 3 py_srcfile = PyString_FromString(filename); #else py_srcfile = PyUnicode_FromString(filename); #endif if (!py_srcfile) goto bad; if (c_line) { #if PY_MAJOR_VERSION < 3 py_funcname = PyString_FromFormat( "%s (%s:%d)", funcname, __pyx_cfilenm, c_line); #else py_funcname = PyUnicode_FromFormat( "%s (%s:%d)", funcname, __pyx_cfilenm, c_line); #endif } else { #if PY_MAJOR_VERSION < 3 py_funcname = 
PyString_FromString(funcname); #else py_funcname = PyUnicode_FromString(funcname); #endif } if (!py_funcname) goto bad; py_code = __Pyx_PyCode_New( 0, 0, 0, 0, 0, __pyx_empty_bytes, /*PyObject *code,*/ __pyx_empty_tuple, /*PyObject *consts,*/ __pyx_empty_tuple, /*PyObject *names,*/ __pyx_empty_tuple, /*PyObject *varnames,*/ __pyx_empty_tuple, /*PyObject *freevars,*/ __pyx_empty_tuple, /*PyObject *cellvars,*/ py_srcfile, /*PyObject *filename,*/ py_funcname, /*PyObject *name,*/ py_line, __pyx_empty_bytes /*PyObject *lnotab*/ ); Py_DECREF(py_srcfile); Py_DECREF(py_funcname); return py_code; bad: Py_XDECREF(py_srcfile); Py_XDECREF(py_funcname); return NULL; } static void __Pyx_AddTraceback(const char *funcname, int c_line, int py_line, const char *filename) { PyCodeObject *py_code = 0; PyFrameObject *py_frame = 0; PyThreadState *tstate = __Pyx_PyThreadState_Current; if (c_line) { c_line = __Pyx_CLineForTraceback(tstate, c_line); } py_code = __pyx_find_code_object(c_line ? -c_line : py_line); if (!py_code) { py_code = __Pyx_CreateCodeObjectForTraceback( funcname, c_line, py_line, filename); if (!py_code) goto bad; __pyx_insert_code_object(c_line ? -c_line : py_line, py_code); } py_frame = PyFrame_New( tstate, /*PyThreadState *tstate,*/ py_code, /*PyCodeObject *code,*/ __pyx_d, /*PyObject *globals,*/ 0 /*PyObject *locals*/ ); if (!py_frame) goto bad; __Pyx_PyFrame_SetLineNumber(py_frame, py_line); PyTraceBack_Here(py_frame); bad: Py_XDECREF(py_code); Py_XDECREF(py_frame); } /* CIntToPy */ static CYTHON_INLINE PyObject* __Pyx_PyInt_From_long(long value) { const long neg_one = (long) ((long) 0 - (long) 1), const_zero = (long) 0; const int is_unsigned = neg_one > const_zero; if (is_unsigned) { if (sizeof(long) < sizeof(long)) { return PyInt_FromLong((long) value); } else if (sizeof(long) <= sizeof(unsigned long)) { return PyLong_FromUnsignedLong((unsigned long) value); #ifdef HAVE_LONG_LONG } else if (sizeof(long) <= sizeof(unsigned PY_LONG_LONG)) { return PyLong_FromUnsignedLongLong((unsigned PY_LONG_LONG) value); #endif } } else { if (sizeof(long) <= sizeof(long)) { return PyInt_FromLong((long) value); #ifdef HAVE_LONG_LONG } else if (sizeof(long) <= sizeof(PY_LONG_LONG)) { return PyLong_FromLongLong((PY_LONG_LONG) value); #endif } } { int one = 1; int little = (int)*(unsigned char *)&one; unsigned char *bytes = (unsigned char *)&value; return _PyLong_FromByteArray(bytes, sizeof(long), little, !is_unsigned); } } /* CIntFromPyVerify */ #define __PYX_VERIFY_RETURN_INT(target_type, func_type, func_value)\ __PYX__VERIFY_RETURN_INT(target_type, func_type, func_value, 0) #define __PYX_VERIFY_RETURN_INT_EXC(target_type, func_type, func_value)\ __PYX__VERIFY_RETURN_INT(target_type, func_type, func_value, 1) #define __PYX__VERIFY_RETURN_INT(target_type, func_type, func_value, exc)\ {\ func_type value = func_value;\ if (sizeof(target_type) < sizeof(func_type)) {\ if (unlikely(value != (func_type) (target_type) value)) {\ func_type zero = 0;\ if (exc && unlikely(value == (func_type)-1 && PyErr_Occurred()))\ return (target_type) -1;\ if (is_unsigned && unlikely(value < zero))\ goto raise_neg_overflow;\ else\ goto raise_overflow;\ }\ }\ return (target_type) value;\ } /* CIntFromPy */ static CYTHON_INLINE long __Pyx_PyInt_As_long(PyObject *x) { const long neg_one = (long) ((long) 0 - (long) 1), const_zero = (long) 0; const int is_unsigned = neg_one > const_zero; #if PY_MAJOR_VERSION < 3 if (likely(PyInt_Check(x))) { if (sizeof(long) < sizeof(long)) { __PYX_VERIFY_RETURN_INT(long, long, PyInt_AS_LONG(x)) } else 
{ long val = PyInt_AS_LONG(x); if (is_unsigned && unlikely(val < 0)) { goto raise_neg_overflow; } return (long) val; } } else #endif if (likely(PyLong_Check(x))) { if (is_unsigned) { #if CYTHON_USE_PYLONG_INTERNALS const digit* digits = ((PyLongObject*)x)->ob_digit; switch (Py_SIZE(x)) { case 0: return (long) 0; case 1: __PYX_VERIFY_RETURN_INT(long, digit, digits[0]) case 2: if (8 * sizeof(long) > 1 * PyLong_SHIFT) { if (8 * sizeof(unsigned long) > 2 * PyLong_SHIFT) { __PYX_VERIFY_RETURN_INT(long, unsigned long, (((((unsigned long)digits[1]) << PyLong_SHIFT) | (unsigned long)digits[0]))) } else if (8 * sizeof(long) >= 2 * PyLong_SHIFT) { return (long) (((((long)digits[1]) << PyLong_SHIFT) | (long)digits[0])); } } break; case 3: if (8 * sizeof(long) > 2 * PyLong_SHIFT) { if (8 * sizeof(unsigned long) > 3 * PyLong_SHIFT) { __PYX_VERIFY_RETURN_INT(long, unsigned long, (((((((unsigned long)digits[2]) << PyLong_SHIFT) | (unsigned long)digits[1]) << PyLong_SHIFT) | (unsigned long)digits[0]))) } else if (8 * sizeof(long) >= 3 * PyLong_SHIFT) { return (long) (((((((long)digits[2]) << PyLong_SHIFT) | (long)digits[1]) << PyLong_SHIFT) | (long)digits[0])); } } break; case 4: if (8 * sizeof(long) > 3 * PyLong_SHIFT) { if (8 * sizeof(unsigned long) > 4 * PyLong_SHIFT) { __PYX_VERIFY_RETURN_INT(long, unsigned long, (((((((((unsigned long)digits[3]) << PyLong_SHIFT) | (unsigned long)digits[2]) << PyLong_SHIFT) | (unsigned long)digits[1]) << PyLong_SHIFT) | (unsigned long)digits[0]))) } else if (8 * sizeof(long) >= 4 * PyLong_SHIFT) { return (long) (((((((((long)digits[3]) << PyLong_SHIFT) | (long)digits[2]) << PyLong_SHIFT) | (long)digits[1]) << PyLong_SHIFT) | (long)digits[0])); } } break; } #endif #if CYTHON_COMPILING_IN_CPYTHON if (unlikely(Py_SIZE(x) < 0)) { goto raise_neg_overflow; } #else { int result = PyObject_RichCompareBool(x, Py_False, Py_LT); if (unlikely(result < 0)) return (long) -1; if (unlikely(result == 1)) goto raise_neg_overflow; } #endif if (sizeof(long) <= sizeof(unsigned long)) { __PYX_VERIFY_RETURN_INT_EXC(long, unsigned long, PyLong_AsUnsignedLong(x)) #ifdef HAVE_LONG_LONG } else if (sizeof(long) <= sizeof(unsigned PY_LONG_LONG)) { __PYX_VERIFY_RETURN_INT_EXC(long, unsigned PY_LONG_LONG, PyLong_AsUnsignedLongLong(x)) #endif } } else { #if CYTHON_USE_PYLONG_INTERNALS const digit* digits = ((PyLongObject*)x)->ob_digit; switch (Py_SIZE(x)) { case 0: return (long) 0; case -1: __PYX_VERIFY_RETURN_INT(long, sdigit, (sdigit) (-(sdigit)digits[0])) case 1: __PYX_VERIFY_RETURN_INT(long, digit, +digits[0]) case -2: if (8 * sizeof(long) - 1 > 1 * PyLong_SHIFT) { if (8 * sizeof(unsigned long) > 2 * PyLong_SHIFT) { __PYX_VERIFY_RETURN_INT(long, long, -(long) (((((unsigned long)digits[1]) << PyLong_SHIFT) | (unsigned long)digits[0]))) } else if (8 * sizeof(long) - 1 > 2 * PyLong_SHIFT) { return (long) (((long)-1)*(((((long)digits[1]) << PyLong_SHIFT) | (long)digits[0]))); } } break; case 2: if (8 * sizeof(long) > 1 * PyLong_SHIFT) { if (8 * sizeof(unsigned long) > 2 * PyLong_SHIFT) { __PYX_VERIFY_RETURN_INT(long, unsigned long, (((((unsigned long)digits[1]) << PyLong_SHIFT) | (unsigned long)digits[0]))) } else if (8 * sizeof(long) - 1 > 2 * PyLong_SHIFT) { return (long) ((((((long)digits[1]) << PyLong_SHIFT) | (long)digits[0]))); } } break; case -3: if (8 * sizeof(long) - 1 > 2 * PyLong_SHIFT) { if (8 * sizeof(unsigned long) > 3 * PyLong_SHIFT) { __PYX_VERIFY_RETURN_INT(long, long, -(long) (((((((unsigned long)digits[2]) << PyLong_SHIFT) | (unsigned long)digits[1]) << PyLong_SHIFT) | (unsigned 
long)digits[0]))) } else if (8 * sizeof(long) - 1 > 3 * PyLong_SHIFT) { return (long) (((long)-1)*(((((((long)digits[2]) << PyLong_SHIFT) | (long)digits[1]) << PyLong_SHIFT) | (long)digits[0]))); } } break; case 3: if (8 * sizeof(long) > 2 * PyLong_SHIFT) { if (8 * sizeof(unsigned long) > 3 * PyLong_SHIFT) { __PYX_VERIFY_RETURN_INT(long, unsigned long, (((((((unsigned long)digits[2]) << PyLong_SHIFT) | (unsigned long)digits[1]) << PyLong_SHIFT) | (unsigned long)digits[0]))) } else if (8 * sizeof(long) - 1 > 3 * PyLong_SHIFT) { return (long) ((((((((long)digits[2]) << PyLong_SHIFT) | (long)digits[1]) << PyLong_SHIFT) | (long)digits[0]))); } } break; case -4: if (8 * sizeof(long) - 1 > 3 * PyLong_SHIFT) { if (8 * sizeof(unsigned long) > 4 * PyLong_SHIFT) { __PYX_VERIFY_RETURN_INT(long, long, -(long) (((((((((unsigned long)digits[3]) << PyLong_SHIFT) | (unsigned long)digits[2]) << PyLong_SHIFT) | (unsigned long)digits[1]) << PyLong_SHIFT) | (unsigned long)digits[0]))) } else if (8 * sizeof(long) - 1 > 4 * PyLong_SHIFT) { return (long) (((long)-1)*(((((((((long)digits[3]) << PyLong_SHIFT) | (long)digits[2]) << PyLong_SHIFT) | (long)digits[1]) << PyLong_SHIFT) | (long)digits[0]))); } } break; case 4: if (8 * sizeof(long) > 3 * PyLong_SHIFT) { if (8 * sizeof(unsigned long) > 4 * PyLong_SHIFT) { __PYX_VERIFY_RETURN_INT(long, unsigned long, (((((((((unsigned long)digits[3]) << PyLong_SHIFT) | (unsigned long)digits[2]) << PyLong_SHIFT) | (unsigned long)digits[1]) << PyLong_SHIFT) | (unsigned long)digits[0]))) } else if (8 * sizeof(long) - 1 > 4 * PyLong_SHIFT) { return (long) ((((((((((long)digits[3]) << PyLong_SHIFT) | (long)digits[2]) << PyLong_SHIFT) | (long)digits[1]) << PyLong_SHIFT) | (long)digits[0]))); } } break; } #endif if (sizeof(long) <= sizeof(long)) { __PYX_VERIFY_RETURN_INT_EXC(long, long, PyLong_AsLong(x)) #ifdef HAVE_LONG_LONG } else if (sizeof(long) <= sizeof(PY_LONG_LONG)) { __PYX_VERIFY_RETURN_INT_EXC(long, PY_LONG_LONG, PyLong_AsLongLong(x)) #endif } } { #if CYTHON_COMPILING_IN_PYPY && !defined(_PyLong_AsByteArray) PyErr_SetString(PyExc_RuntimeError, "_PyLong_AsByteArray() not available in PyPy, cannot convert large numbers"); #else long val; PyObject *v = __Pyx_PyNumber_IntOrLong(x); #if PY_MAJOR_VERSION < 3 if (likely(v) && !PyLong_Check(v)) { PyObject *tmp = v; v = PyNumber_Long(tmp); Py_DECREF(tmp); } #endif if (likely(v)) { int one = 1; int is_little = (int)*(unsigned char *)&one; unsigned char *bytes = (unsigned char *)&val; int ret = _PyLong_AsByteArray((PyLongObject *)v, bytes, sizeof(val), is_little, !is_unsigned); Py_DECREF(v); if (likely(!ret)) return val; } #endif return (long) -1; } } else { long val; PyObject *tmp = __Pyx_PyNumber_IntOrLong(x); if (!tmp) return (long) -1; val = __Pyx_PyInt_As_long(tmp); Py_DECREF(tmp); return val; } raise_overflow: PyErr_SetString(PyExc_OverflowError, "value too large to convert to long"); return (long) -1; raise_neg_overflow: PyErr_SetString(PyExc_OverflowError, "can't convert negative value to long"); return (long) -1; } /* CIntFromPy */ static CYTHON_INLINE int __Pyx_PyInt_As_int(PyObject *x) { const int neg_one = (int) ((int) 0 - (int) 1), const_zero = (int) 0; const int is_unsigned = neg_one > const_zero; #if PY_MAJOR_VERSION < 3 if (likely(PyInt_Check(x))) { if (sizeof(int) < sizeof(long)) { __PYX_VERIFY_RETURN_INT(int, long, PyInt_AS_LONG(x)) } else { long val = PyInt_AS_LONG(x); if (is_unsigned && unlikely(val < 0)) { goto raise_neg_overflow; } return (int) val; } } else #endif if (likely(PyLong_Check(x))) { if 
(is_unsigned) { #if CYTHON_USE_PYLONG_INTERNALS const digit* digits = ((PyLongObject*)x)->ob_digit; switch (Py_SIZE(x)) { case 0: return (int) 0; case 1: __PYX_VERIFY_RETURN_INT(int, digit, digits[0]) case 2: if (8 * sizeof(int) > 1 * PyLong_SHIFT) { if (8 * sizeof(unsigned long) > 2 * PyLong_SHIFT) { __PYX_VERIFY_RETURN_INT(int, unsigned long, (((((unsigned long)digits[1]) << PyLong_SHIFT) | (unsigned long)digits[0]))) } else if (8 * sizeof(int) >= 2 * PyLong_SHIFT) { return (int) (((((int)digits[1]) << PyLong_SHIFT) | (int)digits[0])); } } break; case 3: if (8 * sizeof(int) > 2 * PyLong_SHIFT) { if (8 * sizeof(unsigned long) > 3 * PyLong_SHIFT) { __PYX_VERIFY_RETURN_INT(int, unsigned long, (((((((unsigned long)digits[2]) << PyLong_SHIFT) | (unsigned long)digits[1]) << PyLong_SHIFT) | (unsigned long)digits[0]))) } else if (8 * sizeof(int) >= 3 * PyLong_SHIFT) { return (int) (((((((int)digits[2]) << PyLong_SHIFT) | (int)digits[1]) << PyLong_SHIFT) | (int)digits[0])); } } break; case 4: if (8 * sizeof(int) > 3 * PyLong_SHIFT) { if (8 * sizeof(unsigned long) > 4 * PyLong_SHIFT) { __PYX_VERIFY_RETURN_INT(int, unsigned long, (((((((((unsigned long)digits[3]) << PyLong_SHIFT) | (unsigned long)digits[2]) << PyLong_SHIFT) | (unsigned long)digits[1]) << PyLong_SHIFT) | (unsigned long)digits[0]))) } else if (8 * sizeof(int) >= 4 * PyLong_SHIFT) { return (int) (((((((((int)digits[3]) << PyLong_SHIFT) | (int)digits[2]) << PyLong_SHIFT) | (int)digits[1]) << PyLong_SHIFT) | (int)digits[0])); } } break; } #endif #if CYTHON_COMPILING_IN_CPYTHON if (unlikely(Py_SIZE(x) < 0)) { goto raise_neg_overflow; } #else { int result = PyObject_RichCompareBool(x, Py_False, Py_LT); if (unlikely(result < 0)) return (int) -1; if (unlikely(result == 1)) goto raise_neg_overflow; } #endif if (sizeof(int) <= sizeof(unsigned long)) { __PYX_VERIFY_RETURN_INT_EXC(int, unsigned long, PyLong_AsUnsignedLong(x)) #ifdef HAVE_LONG_LONG } else if (sizeof(int) <= sizeof(unsigned PY_LONG_LONG)) { __PYX_VERIFY_RETURN_INT_EXC(int, unsigned PY_LONG_LONG, PyLong_AsUnsignedLongLong(x)) #endif } } else { #if CYTHON_USE_PYLONG_INTERNALS const digit* digits = ((PyLongObject*)x)->ob_digit; switch (Py_SIZE(x)) { case 0: return (int) 0; case -1: __PYX_VERIFY_RETURN_INT(int, sdigit, (sdigit) (-(sdigit)digits[0])) case 1: __PYX_VERIFY_RETURN_INT(int, digit, +digits[0]) case -2: if (8 * sizeof(int) - 1 > 1 * PyLong_SHIFT) { if (8 * sizeof(unsigned long) > 2 * PyLong_SHIFT) { __PYX_VERIFY_RETURN_INT(int, long, -(long) (((((unsigned long)digits[1]) << PyLong_SHIFT) | (unsigned long)digits[0]))) } else if (8 * sizeof(int) - 1 > 2 * PyLong_SHIFT) { return (int) (((int)-1)*(((((int)digits[1]) << PyLong_SHIFT) | (int)digits[0]))); } } break; case 2: if (8 * sizeof(int) > 1 * PyLong_SHIFT) { if (8 * sizeof(unsigned long) > 2 * PyLong_SHIFT) { __PYX_VERIFY_RETURN_INT(int, unsigned long, (((((unsigned long)digits[1]) << PyLong_SHIFT) | (unsigned long)digits[0]))) } else if (8 * sizeof(int) - 1 > 2 * PyLong_SHIFT) { return (int) ((((((int)digits[1]) << PyLong_SHIFT) | (int)digits[0]))); } } break; case -3: if (8 * sizeof(int) - 1 > 2 * PyLong_SHIFT) { if (8 * sizeof(unsigned long) > 3 * PyLong_SHIFT) { __PYX_VERIFY_RETURN_INT(int, long, -(long) (((((((unsigned long)digits[2]) << PyLong_SHIFT) | (unsigned long)digits[1]) << PyLong_SHIFT) | (unsigned long)digits[0]))) } else if (8 * sizeof(int) - 1 > 3 * PyLong_SHIFT) { return (int) (((int)-1)*(((((((int)digits[2]) << PyLong_SHIFT) | (int)digits[1]) << PyLong_SHIFT) | (int)digits[0]))); } } break; case 3: if (8 
* sizeof(int) > 2 * PyLong_SHIFT) { if (8 * sizeof(unsigned long) > 3 * PyLong_SHIFT) { __PYX_VERIFY_RETURN_INT(int, unsigned long, (((((((unsigned long)digits[2]) << PyLong_SHIFT) | (unsigned long)digits[1]) << PyLong_SHIFT) | (unsigned long)digits[0]))) } else if (8 * sizeof(int) - 1 > 3 * PyLong_SHIFT) { return (int) ((((((((int)digits[2]) << PyLong_SHIFT) | (int)digits[1]) << PyLong_SHIFT) | (int)digits[0]))); } } break; case -4: if (8 * sizeof(int) - 1 > 3 * PyLong_SHIFT) { if (8 * sizeof(unsigned long) > 4 * PyLong_SHIFT) { __PYX_VERIFY_RETURN_INT(int, long, -(long) (((((((((unsigned long)digits[3]) << PyLong_SHIFT) | (unsigned long)digits[2]) << PyLong_SHIFT) | (unsigned long)digits[1]) << PyLong_SHIFT) | (unsigned long)digits[0]))) } else if (8 * sizeof(int) - 1 > 4 * PyLong_SHIFT) { return (int) (((int)-1)*(((((((((int)digits[3]) << PyLong_SHIFT) | (int)digits[2]) << PyLong_SHIFT) | (int)digits[1]) << PyLong_SHIFT) | (int)digits[0]))); } } break; case 4: if (8 * sizeof(int) > 3 * PyLong_SHIFT) { if (8 * sizeof(unsigned long) > 4 * PyLong_SHIFT) { __PYX_VERIFY_RETURN_INT(int, unsigned long, (((((((((unsigned long)digits[3]) << PyLong_SHIFT) | (unsigned long)digits[2]) << PyLong_SHIFT) | (unsigned long)digits[1]) << PyLong_SHIFT) | (unsigned long)digits[0]))) } else if (8 * sizeof(int) - 1 > 4 * PyLong_SHIFT) { return (int) ((((((((((int)digits[3]) << PyLong_SHIFT) | (int)digits[2]) << PyLong_SHIFT) | (int)digits[1]) << PyLong_SHIFT) | (int)digits[0]))); } } break; } #endif if (sizeof(int) <= sizeof(long)) { __PYX_VERIFY_RETURN_INT_EXC(int, long, PyLong_AsLong(x)) #ifdef HAVE_LONG_LONG } else if (sizeof(int) <= sizeof(PY_LONG_LONG)) { __PYX_VERIFY_RETURN_INT_EXC(int, PY_LONG_LONG, PyLong_AsLongLong(x)) #endif } } { #if CYTHON_COMPILING_IN_PYPY && !defined(_PyLong_AsByteArray) PyErr_SetString(PyExc_RuntimeError, "_PyLong_AsByteArray() not available in PyPy, cannot convert large numbers"); #else int val; PyObject *v = __Pyx_PyNumber_IntOrLong(x); #if PY_MAJOR_VERSION < 3 if (likely(v) && !PyLong_Check(v)) { PyObject *tmp = v; v = PyNumber_Long(tmp); Py_DECREF(tmp); } #endif if (likely(v)) { int one = 1; int is_little = (int)*(unsigned char *)&one; unsigned char *bytes = (unsigned char *)&val; int ret = _PyLong_AsByteArray((PyLongObject *)v, bytes, sizeof(val), is_little, !is_unsigned); Py_DECREF(v); if (likely(!ret)) return val; } #endif return (int) -1; } } else { int val; PyObject *tmp = __Pyx_PyNumber_IntOrLong(x); if (!tmp) return (int) -1; val = __Pyx_PyInt_As_int(tmp); Py_DECREF(tmp); return val; } raise_overflow: PyErr_SetString(PyExc_OverflowError, "value too large to convert to int"); return (int) -1; raise_neg_overflow: PyErr_SetString(PyExc_OverflowError, "can't convert negative value to int"); return (int) -1; } /* FastTypeChecks */ #if CYTHON_COMPILING_IN_CPYTHON static int __Pyx_InBases(PyTypeObject *a, PyTypeObject *b) { while (a) { a = a->tp_base; if (a == b) return 1; } return b == &PyBaseObject_Type; } static CYTHON_INLINE int __Pyx_IsSubtype(PyTypeObject *a, PyTypeObject *b) { PyObject *mro; if (a == b) return 1; mro = a->tp_mro; if (likely(mro)) { Py_ssize_t i, n; n = PyTuple_GET_SIZE(mro); for (i = 0; i < n; i++) { if (PyTuple_GET_ITEM(mro, i) == (PyObject *)b) return 1; } return 0; } return __Pyx_InBases(a, b); } #if PY_MAJOR_VERSION == 2 static int __Pyx_inner_PyErr_GivenExceptionMatches2(PyObject *err, PyObject* exc_type1, PyObject* exc_type2) { PyObject *exception, *value, *tb; int res; __Pyx_PyThreadState_declare __Pyx_PyThreadState_assign 
__Pyx_ErrFetch(&exception, &value, &tb); res = exc_type1 ? PyObject_IsSubclass(err, exc_type1) : 0; if (unlikely(res == -1)) { PyErr_WriteUnraisable(err); res = 0; } if (!res) { res = PyObject_IsSubclass(err, exc_type2); if (unlikely(res == -1)) { PyErr_WriteUnraisable(err); res = 0; } } __Pyx_ErrRestore(exception, value, tb); return res; } #else static CYTHON_INLINE int __Pyx_inner_PyErr_GivenExceptionMatches2(PyObject *err, PyObject* exc_type1, PyObject *exc_type2) { int res = exc_type1 ? __Pyx_IsSubtype((PyTypeObject*)err, (PyTypeObject*)exc_type1) : 0; if (!res) { res = __Pyx_IsSubtype((PyTypeObject*)err, (PyTypeObject*)exc_type2); } return res; } #endif static int __Pyx_PyErr_GivenExceptionMatchesTuple(PyObject *exc_type, PyObject *tuple) { Py_ssize_t i, n; assert(PyExceptionClass_Check(exc_type)); n = PyTuple_GET_SIZE(tuple); #if PY_MAJOR_VERSION >= 3 for (i=0; ip) { #if PY_MAJOR_VERSION < 3 if (t->is_unicode) { *t->p = PyUnicode_DecodeUTF8(t->s, t->n - 1, NULL); } else if (t->intern) { *t->p = PyString_InternFromString(t->s); } else { *t->p = PyString_FromStringAndSize(t->s, t->n - 1); } #else if (t->is_unicode | t->is_str) { if (t->intern) { *t->p = PyUnicode_InternFromString(t->s); } else if (t->encoding) { *t->p = PyUnicode_Decode(t->s, t->n - 1, t->encoding, NULL); } else { *t->p = PyUnicode_FromStringAndSize(t->s, t->n - 1); } } else { *t->p = PyBytes_FromStringAndSize(t->s, t->n - 1); } #endif if (!*t->p) return -1; if (PyObject_Hash(*t->p) == -1) return -1; ++t; } return 0; } static CYTHON_INLINE PyObject* __Pyx_PyUnicode_FromString(const char* c_str) { return __Pyx_PyUnicode_FromStringAndSize(c_str, (Py_ssize_t)strlen(c_str)); } static CYTHON_INLINE const char* __Pyx_PyObject_AsString(PyObject* o) { Py_ssize_t ignore; return __Pyx_PyObject_AsStringAndSize(o, &ignore); } #if __PYX_DEFAULT_STRING_ENCODING_IS_ASCII || __PYX_DEFAULT_STRING_ENCODING_IS_DEFAULT #if !CYTHON_PEP393_ENABLED static const char* __Pyx_PyUnicode_AsStringAndSize(PyObject* o, Py_ssize_t *length) { char* defenc_c; PyObject* defenc = _PyUnicode_AsDefaultEncodedString(o, NULL); if (!defenc) return NULL; defenc_c = PyBytes_AS_STRING(defenc); #if __PYX_DEFAULT_STRING_ENCODING_IS_ASCII { char* end = defenc_c + PyBytes_GET_SIZE(defenc); char* c; for (c = defenc_c; c < end; c++) { if ((unsigned char) (*c) >= 128) { PyUnicode_AsASCIIString(o); return NULL; } } } #endif *length = PyBytes_GET_SIZE(defenc); return defenc_c; } #else static CYTHON_INLINE const char* __Pyx_PyUnicode_AsStringAndSize(PyObject* o, Py_ssize_t *length) { if (unlikely(__Pyx_PyUnicode_READY(o) == -1)) return NULL; #if __PYX_DEFAULT_STRING_ENCODING_IS_ASCII if (likely(PyUnicode_IS_ASCII(o))) { *length = PyUnicode_GET_LENGTH(o); return PyUnicode_AsUTF8(o); } else { PyUnicode_AsASCIIString(o); return NULL; } #else return PyUnicode_AsUTF8AndSize(o, length); #endif } #endif #endif static CYTHON_INLINE const char* __Pyx_PyObject_AsStringAndSize(PyObject* o, Py_ssize_t *length) { #if __PYX_DEFAULT_STRING_ENCODING_IS_ASCII || __PYX_DEFAULT_STRING_ENCODING_IS_DEFAULT if ( #if PY_MAJOR_VERSION < 3 && __PYX_DEFAULT_STRING_ENCODING_IS_ASCII __Pyx_sys_getdefaultencoding_not_ascii && #endif PyUnicode_Check(o)) { return __Pyx_PyUnicode_AsStringAndSize(o, length); } else #endif #if (!CYTHON_COMPILING_IN_PYPY) || (defined(PyByteArray_AS_STRING) && defined(PyByteArray_GET_SIZE)) if (PyByteArray_Check(o)) { *length = PyByteArray_GET_SIZE(o); return PyByteArray_AS_STRING(o); } else #endif { char* result; int r = PyBytes_AsStringAndSize(o, &result, length); if 
(unlikely(r < 0)) { return NULL; } else { return result; } } } static CYTHON_INLINE int __Pyx_PyObject_IsTrue(PyObject* x) { int is_true = x == Py_True; if (is_true | (x == Py_False) | (x == Py_None)) return is_true; else return PyObject_IsTrue(x); } static CYTHON_INLINE int __Pyx_PyObject_IsTrueAndDecref(PyObject* x) { int retval; if (unlikely(!x)) return -1; retval = __Pyx_PyObject_IsTrue(x); Py_DECREF(x); return retval; } static PyObject* __Pyx_PyNumber_IntOrLongWrongResultType(PyObject* result, const char* type_name) { #if PY_MAJOR_VERSION >= 3 if (PyLong_Check(result)) { if (PyErr_WarnFormat(PyExc_DeprecationWarning, 1, "__int__ returned non-int (type %.200s). " "The ability to return an instance of a strict subclass of int " "is deprecated, and may be removed in a future version of Python.", Py_TYPE(result)->tp_name)) { Py_DECREF(result); return NULL; } return result; } #endif PyErr_Format(PyExc_TypeError, "__%.4s__ returned non-%.4s (type %.200s)", type_name, type_name, Py_TYPE(result)->tp_name); Py_DECREF(result); return NULL; } static CYTHON_INLINE PyObject* __Pyx_PyNumber_IntOrLong(PyObject* x) { #if CYTHON_USE_TYPE_SLOTS PyNumberMethods *m; #endif const char *name = NULL; PyObject *res = NULL; #if PY_MAJOR_VERSION < 3 if (likely(PyInt_Check(x) || PyLong_Check(x))) #else if (likely(PyLong_Check(x))) #endif return __Pyx_NewRef(x); #if CYTHON_USE_TYPE_SLOTS m = Py_TYPE(x)->tp_as_number; #if PY_MAJOR_VERSION < 3 if (m && m->nb_int) { name = "int"; res = m->nb_int(x); } else if (m && m->nb_long) { name = "long"; res = m->nb_long(x); } #else if (likely(m && m->nb_int)) { name = "int"; res = m->nb_int(x); } #endif #else if (!PyBytes_CheckExact(x) && !PyUnicode_CheckExact(x)) { res = PyNumber_Int(x); } #endif if (likely(res)) { #if PY_MAJOR_VERSION < 3 if (unlikely(!PyInt_Check(res) && !PyLong_Check(res))) { #else if (unlikely(!PyLong_CheckExact(res))) { #endif return __Pyx_PyNumber_IntOrLongWrongResultType(res, name); } } else if (!PyErr_Occurred()) { PyErr_SetString(PyExc_TypeError, "an integer is required"); } return res; } static CYTHON_INLINE Py_ssize_t __Pyx_PyIndex_AsSsize_t(PyObject* b) { Py_ssize_t ival; PyObject *x; #if PY_MAJOR_VERSION < 3 if (likely(PyInt_CheckExact(b))) { if (sizeof(Py_ssize_t) >= sizeof(long)) return PyInt_AS_LONG(b); else return PyInt_AsSsize_t(b); } #endif if (likely(PyLong_CheckExact(b))) { #if CYTHON_USE_PYLONG_INTERNALS const digit* digits = ((PyLongObject*)b)->ob_digit; const Py_ssize_t size = Py_SIZE(b); if (likely(__Pyx_sst_abs(size) <= 1)) { ival = likely(size) ? 
digits[0] : 0; if (size == -1) ival = -ival; return ival; } else { switch (size) { case 2: if (8 * sizeof(Py_ssize_t) > 2 * PyLong_SHIFT) { return (Py_ssize_t) (((((size_t)digits[1]) << PyLong_SHIFT) | (size_t)digits[0])); } break; case -2: if (8 * sizeof(Py_ssize_t) > 2 * PyLong_SHIFT) { return -(Py_ssize_t) (((((size_t)digits[1]) << PyLong_SHIFT) | (size_t)digits[0])); } break; case 3: if (8 * sizeof(Py_ssize_t) > 3 * PyLong_SHIFT) { return (Py_ssize_t) (((((((size_t)digits[2]) << PyLong_SHIFT) | (size_t)digits[1]) << PyLong_SHIFT) | (size_t)digits[0])); } break; case -3: if (8 * sizeof(Py_ssize_t) > 3 * PyLong_SHIFT) { return -(Py_ssize_t) (((((((size_t)digits[2]) << PyLong_SHIFT) | (size_t)digits[1]) << PyLong_SHIFT) | (size_t)digits[0])); } break; case 4: if (8 * sizeof(Py_ssize_t) > 4 * PyLong_SHIFT) { return (Py_ssize_t) (((((((((size_t)digits[3]) << PyLong_SHIFT) | (size_t)digits[2]) << PyLong_SHIFT) | (size_t)digits[1]) << PyLong_SHIFT) | (size_t)digits[0])); } break; case -4: if (8 * sizeof(Py_ssize_t) > 4 * PyLong_SHIFT) { return -(Py_ssize_t) (((((((((size_t)digits[3]) << PyLong_SHIFT) | (size_t)digits[2]) << PyLong_SHIFT) | (size_t)digits[1]) << PyLong_SHIFT) | (size_t)digits[0])); } break; } } #endif return PyLong_AsSsize_t(b); } x = PyNumber_Index(b); if (!x) return -1; ival = PyInt_AsSsize_t(x); Py_DECREF(x); return ival; } static CYTHON_INLINE PyObject * __Pyx_PyBool_FromLong(long b) { return b ? __Pyx_NewRef(Py_True) : __Pyx_NewRef(Py_False); } static CYTHON_INLINE PyObject * __Pyx_PyInt_FromSize_t(size_t ival) { return PyInt_FromSize_t(ival); } #endif /* Py_PYTHON_H */
aiohttp-3.6.2/aiohttp/_http_writer.pyx0000644000175100001650000001015113547410117020407 0ustar vstsdocker00000000000000
from libc.stdint cimport uint8_t, uint64_t
from libc.string cimport memcpy
from cpython.exc cimport PyErr_NoMemory
from cpython.mem cimport PyMem_Malloc, PyMem_Realloc, PyMem_Free
from cpython.bytes cimport PyBytes_FromStringAndSize
from cpython.object cimport PyObject_Str

from multidict import istr

DEF BUF_SIZE = 16 * 1024  # 16KiB
cdef char BUFFER[BUF_SIZE]

cdef object _istr = istr


# ----------------- writer ---------------------------

cdef struct Writer:
    char *buf
    Py_ssize_t size
    Py_ssize_t pos


cdef inline void _init_writer(Writer* writer):
    writer.buf = &BUFFER[0]
    writer.size = BUF_SIZE
    writer.pos = 0


cdef inline void _release_writer(Writer* writer):
    if writer.buf != BUFFER:
        PyMem_Free(writer.buf)


cdef inline int _write_byte(Writer* writer, uint8_t ch):
    cdef char * buf
    cdef Py_ssize_t size

    if writer.pos == writer.size:
        # reallocate
        size = writer.size + BUF_SIZE
        if writer.buf == BUFFER:
            buf = <char*> PyMem_Malloc(size)
            if buf == NULL:
                PyErr_NoMemory()
                return -1
            memcpy(buf, writer.buf, writer.size)
        else:
            buf = <char*> PyMem_Realloc(writer.buf, size)
            if buf == NULL:
                PyErr_NoMemory()
                return -1
        writer.buf = buf
        writer.size = size

    writer.buf[writer.pos] = ch
    writer.pos += 1

    return 0


cdef inline int _write_utf8(Writer* writer, Py_UCS4 symbol):
    cdef uint64_t utf = <uint64_t> symbol

    if utf < 0x80:
        return _write_byte(writer, <uint8_t>utf)
    elif utf < 0x800:
        if _write_byte(writer, <uint8_t>(0xc0 | (utf >> 6))) < 0:
            return -1
        return _write_byte(writer, <uint8_t>(0x80 | (utf & 0x3f)))
    elif 0xD800 <= utf <= 0xDFFF:
        # surrogate pair, ignored
        return 0
    elif utf < 0x10000:
        if _write_byte(writer, <uint8_t>(0xe0 | (utf >> 12))) < 0:
            return -1
        if _write_byte(writer, <uint8_t>(0x80 | ((utf >> 6) & 0x3f))) < 0:
            return -1
        return _write_byte(writer, <uint8_t>(0x80 | (utf & 0x3f)))
    elif utf > 0x10FFFF:
        # symbol is too large
        return 0
    else:
        if _write_byte(writer, <uint8_t>(0xf0 | (utf >> 18))) < 0:
            return -1
        if _write_byte(writer, <uint8_t>(0x80 | ((utf >> 12) & 0x3f))) < 0:
            return -1
        if _write_byte(writer, <uint8_t>(0x80 | ((utf >> 6) & 0x3f))) < 0:
            return -1
        return _write_byte(writer, <uint8_t>(0x80 | (utf & 0x3f)))


cdef inline int _write_str(Writer* writer, str s):
    cdef Py_UCS4 ch
    for ch in s:
        if _write_utf8(writer, ch) < 0:
            return -1


# --------------- _serialize_headers ----------------------

cdef str to_str(object s):
    typ = type(s)
    if typ is str:
        return <str>s
    elif typ is _istr:
        return PyObject_Str(s)
    elif not isinstance(s, str):
        raise TypeError("Cannot serialize non-str key {!r}".format(s))
    else:
        return str(s)


def _serialize_headers(str status_line, headers):
    cdef Writer writer
    cdef object key
    cdef object val
    cdef bytes ret

    _init_writer(&writer)

    try:
        if _write_str(&writer, status_line) < 0:
            raise
        if _write_byte(&writer, b'\r') < 0:
            raise
        if _write_byte(&writer, b'\n') < 0:
            raise

        for key, val in headers.items():
            if _write_str(&writer, to_str(key)) < 0:
                raise
            if _write_byte(&writer, b':') < 0:
                raise
            if _write_byte(&writer, b' ') < 0:
                raise
            if _write_str(&writer, to_str(val)) < 0:
                raise
            if _write_byte(&writer, b'\r') < 0:
                raise
            if _write_byte(&writer, b'\n') < 0:
                raise

        if _write_byte(&writer, b'\r') < 0:
            raise
        if _write_byte(&writer, b'\n') < 0:
            raise

        return PyBytes_FromStringAndSize(writer.buf, writer.pos)
    finally:
        _release_writer(&writer)
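# Illustrative usage sketch: a minimal example of how the _serialize_headers
# helper above can be driven from Python once this extension is compiled.
# The CIMultiDict container, the helper name and the sample header values are
# assumptions chosen only for the example; they are not part of aiohttp's API.
def _example_serialize_headers():
    from multidict import CIMultiDict

    headers = CIMultiDict()
    headers["Host"] = "example.com"
    headers["Connection"] = "keep-alive"
    # Returns the complete header block as bytes, e.g.
    # b"HTTP/1.1 200 OK\r\nHost: example.com\r\nConnection: keep-alive\r\n\r\n"
    return _serialize_headers("HTTP/1.1 200 OK", headers)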
aiohttp-3.6.2/aiohttp/_websocket.c0000644000175100001650000041263413547410135017440 0ustar vstsdocker00000000000000
/* Generated by Cython 0.29.13 */ #define PY_SSIZE_T_CLEAN #include "Python.h" #ifndef Py_PYTHON_H #error Python headers needed to compile C extensions, please install development version of Python. #elif PY_VERSION_HEX < 0x02060000 || (0x03000000 <= PY_VERSION_HEX && PY_VERSION_HEX < 0x03030000) #error Cython requires Python 2.6+ or Python 3.3+.
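/* The block that follows is Cython's generated compatibility layer: it detects
   the target runtime (PyPy, Pyston or CPython) and defines the CYTHON_* feature
   macros, compiler-attribute shims and CPython-version fallbacks that the rest
   of this generated file relies on. */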
#else #define CYTHON_ABI "0_29_13" #define CYTHON_HEX_VERSION 0x001D0DF0 #define CYTHON_FUTURE_DIVISION 1 #include <stddef.h> #ifndef offsetof #define offsetof(type, member) ( (size_t) & ((type*)0) -> member ) #endif #if !defined(WIN32) && !defined(MS_WINDOWS) #ifndef __stdcall #define __stdcall #endif #ifndef __cdecl #define __cdecl #endif #ifndef __fastcall #define __fastcall #endif #endif #ifndef DL_IMPORT #define DL_IMPORT(t) t #endif #ifndef DL_EXPORT #define DL_EXPORT(t) t #endif #define __PYX_COMMA , #ifndef HAVE_LONG_LONG #if PY_VERSION_HEX >= 0x02070000 #define HAVE_LONG_LONG #endif #endif #ifndef PY_LONG_LONG #define PY_LONG_LONG LONG_LONG #endif #ifndef Py_HUGE_VAL #define Py_HUGE_VAL HUGE_VAL #endif #ifdef PYPY_VERSION #define CYTHON_COMPILING_IN_PYPY 1 #define CYTHON_COMPILING_IN_PYSTON 0 #define CYTHON_COMPILING_IN_CPYTHON 0 #undef CYTHON_USE_TYPE_SLOTS #define CYTHON_USE_TYPE_SLOTS 0 #undef CYTHON_USE_PYTYPE_LOOKUP #define CYTHON_USE_PYTYPE_LOOKUP 0 #if PY_VERSION_HEX < 0x03050000 #undef CYTHON_USE_ASYNC_SLOTS #define CYTHON_USE_ASYNC_SLOTS 0 #elif !defined(CYTHON_USE_ASYNC_SLOTS) #define CYTHON_USE_ASYNC_SLOTS 1 #endif #undef CYTHON_USE_PYLIST_INTERNALS #define CYTHON_USE_PYLIST_INTERNALS 0 #undef CYTHON_USE_UNICODE_INTERNALS #define CYTHON_USE_UNICODE_INTERNALS 0 #undef CYTHON_USE_UNICODE_WRITER #define CYTHON_USE_UNICODE_WRITER 0 #undef CYTHON_USE_PYLONG_INTERNALS #define CYTHON_USE_PYLONG_INTERNALS 0 #undef CYTHON_AVOID_BORROWED_REFS #define CYTHON_AVOID_BORROWED_REFS 1 #undef CYTHON_ASSUME_SAFE_MACROS #define CYTHON_ASSUME_SAFE_MACROS 0 #undef CYTHON_UNPACK_METHODS #define CYTHON_UNPACK_METHODS 0 #undef CYTHON_FAST_THREAD_STATE #define CYTHON_FAST_THREAD_STATE 0 #undef CYTHON_FAST_PYCALL #define CYTHON_FAST_PYCALL 0 #undef CYTHON_PEP489_MULTI_PHASE_INIT #define CYTHON_PEP489_MULTI_PHASE_INIT 0 #undef CYTHON_USE_TP_FINALIZE #define CYTHON_USE_TP_FINALIZE 0 #undef CYTHON_USE_DICT_VERSIONS #define CYTHON_USE_DICT_VERSIONS 0 #undef CYTHON_USE_EXC_INFO_STACK #define CYTHON_USE_EXC_INFO_STACK 0 #elif defined(PYSTON_VERSION) #define CYTHON_COMPILING_IN_PYPY 0 #define CYTHON_COMPILING_IN_PYSTON 1 #define CYTHON_COMPILING_IN_CPYTHON 0 #ifndef CYTHON_USE_TYPE_SLOTS #define CYTHON_USE_TYPE_SLOTS 1 #endif #undef CYTHON_USE_PYTYPE_LOOKUP #define CYTHON_USE_PYTYPE_LOOKUP 0 #undef CYTHON_USE_ASYNC_SLOTS #define CYTHON_USE_ASYNC_SLOTS 0 #undef CYTHON_USE_PYLIST_INTERNALS #define CYTHON_USE_PYLIST_INTERNALS 0 #ifndef CYTHON_USE_UNICODE_INTERNALS #define CYTHON_USE_UNICODE_INTERNALS 1 #endif #undef CYTHON_USE_UNICODE_WRITER #define CYTHON_USE_UNICODE_WRITER 0 #undef CYTHON_USE_PYLONG_INTERNALS #define CYTHON_USE_PYLONG_INTERNALS 0 #ifndef CYTHON_AVOID_BORROWED_REFS #define CYTHON_AVOID_BORROWED_REFS 0 #endif #ifndef CYTHON_ASSUME_SAFE_MACROS #define CYTHON_ASSUME_SAFE_MACROS 1 #endif #ifndef CYTHON_UNPACK_METHODS #define CYTHON_UNPACK_METHODS 1 #endif #undef CYTHON_FAST_THREAD_STATE #define CYTHON_FAST_THREAD_STATE 0 #undef CYTHON_FAST_PYCALL #define CYTHON_FAST_PYCALL 0 #undef CYTHON_PEP489_MULTI_PHASE_INIT #define CYTHON_PEP489_MULTI_PHASE_INIT 0 #undef CYTHON_USE_TP_FINALIZE #define CYTHON_USE_TP_FINALIZE 0 #undef CYTHON_USE_DICT_VERSIONS #define CYTHON_USE_DICT_VERSIONS 0 #undef CYTHON_USE_EXC_INFO_STACK #define CYTHON_USE_EXC_INFO_STACK 0 #else #define CYTHON_COMPILING_IN_PYPY 0 #define CYTHON_COMPILING_IN_PYSTON 0 #define CYTHON_COMPILING_IN_CPYTHON 1 #ifndef CYTHON_USE_TYPE_SLOTS #define CYTHON_USE_TYPE_SLOTS 1 #endif #if PY_VERSION_HEX < 0x02070000 #undef CYTHON_USE_PYTYPE_LOOKUP #define
CYTHON_USE_PYTYPE_LOOKUP 0 #elif !defined(CYTHON_USE_PYTYPE_LOOKUP) #define CYTHON_USE_PYTYPE_LOOKUP 1 #endif #if PY_MAJOR_VERSION < 3 #undef CYTHON_USE_ASYNC_SLOTS #define CYTHON_USE_ASYNC_SLOTS 0 #elif !defined(CYTHON_USE_ASYNC_SLOTS) #define CYTHON_USE_ASYNC_SLOTS 1 #endif #if PY_VERSION_HEX < 0x02070000 #undef CYTHON_USE_PYLONG_INTERNALS #define CYTHON_USE_PYLONG_INTERNALS 0 #elif !defined(CYTHON_USE_PYLONG_INTERNALS) #define CYTHON_USE_PYLONG_INTERNALS 1 #endif #ifndef CYTHON_USE_PYLIST_INTERNALS #define CYTHON_USE_PYLIST_INTERNALS 1 #endif #ifndef CYTHON_USE_UNICODE_INTERNALS #define CYTHON_USE_UNICODE_INTERNALS 1 #endif #if PY_VERSION_HEX < 0x030300F0 #undef CYTHON_USE_UNICODE_WRITER #define CYTHON_USE_UNICODE_WRITER 0 #elif !defined(CYTHON_USE_UNICODE_WRITER) #define CYTHON_USE_UNICODE_WRITER 1 #endif #ifndef CYTHON_AVOID_BORROWED_REFS #define CYTHON_AVOID_BORROWED_REFS 0 #endif #ifndef CYTHON_ASSUME_SAFE_MACROS #define CYTHON_ASSUME_SAFE_MACROS 1 #endif #ifndef CYTHON_UNPACK_METHODS #define CYTHON_UNPACK_METHODS 1 #endif #ifndef CYTHON_FAST_THREAD_STATE #define CYTHON_FAST_THREAD_STATE 1 #endif #ifndef CYTHON_FAST_PYCALL #define CYTHON_FAST_PYCALL 1 #endif #ifndef CYTHON_PEP489_MULTI_PHASE_INIT #define CYTHON_PEP489_MULTI_PHASE_INIT (PY_VERSION_HEX >= 0x03050000) #endif #ifndef CYTHON_USE_TP_FINALIZE #define CYTHON_USE_TP_FINALIZE (PY_VERSION_HEX >= 0x030400a1) #endif #ifndef CYTHON_USE_DICT_VERSIONS #define CYTHON_USE_DICT_VERSIONS (PY_VERSION_HEX >= 0x030600B1) #endif #ifndef CYTHON_USE_EXC_INFO_STACK #define CYTHON_USE_EXC_INFO_STACK (PY_VERSION_HEX >= 0x030700A3) #endif #endif #if !defined(CYTHON_FAST_PYCCALL) #define CYTHON_FAST_PYCCALL (CYTHON_FAST_PYCALL && PY_VERSION_HEX >= 0x030600B1) #endif #if CYTHON_USE_PYLONG_INTERNALS #include "longintrepr.h" #undef SHIFT #undef BASE #undef MASK #ifdef SIZEOF_VOID_P enum { __pyx_check_sizeof_voidp = 1 / (int)(SIZEOF_VOID_P == sizeof(void*)) }; #endif #endif #ifndef __has_attribute #define __has_attribute(x) 0 #endif #ifndef __has_cpp_attribute #define __has_cpp_attribute(x) 0 #endif #ifndef CYTHON_RESTRICT #if defined(__GNUC__) #define CYTHON_RESTRICT __restrict__ #elif defined(_MSC_VER) && _MSC_VER >= 1400 #define CYTHON_RESTRICT __restrict #elif defined (__STDC_VERSION__) && __STDC_VERSION__ >= 199901L #define CYTHON_RESTRICT restrict #else #define CYTHON_RESTRICT #endif #endif #ifndef CYTHON_UNUSED # if defined(__GNUC__) # if !(defined(__cplusplus)) || (__GNUC__ > 3 || (__GNUC__ == 3 && __GNUC_MINOR__ >= 4)) # define CYTHON_UNUSED __attribute__ ((__unused__)) # else # define CYTHON_UNUSED # endif # elif defined(__ICC) || (defined(__INTEL_COMPILER) && !defined(_MSC_VER)) # define CYTHON_UNUSED __attribute__ ((__unused__)) # else # define CYTHON_UNUSED # endif #endif #ifndef CYTHON_MAYBE_UNUSED_VAR # if defined(__cplusplus) template<class T> void CYTHON_MAYBE_UNUSED_VAR( const T& ) { } # else # define CYTHON_MAYBE_UNUSED_VAR(x) (void)(x) # endif #endif #ifndef CYTHON_NCP_UNUSED # if CYTHON_COMPILING_IN_CPYTHON # define CYTHON_NCP_UNUSED # else # define CYTHON_NCP_UNUSED CYTHON_UNUSED # endif #endif #define __Pyx_void_to_None(void_result) ((void)(void_result), Py_INCREF(Py_None), Py_None) #ifdef _MSC_VER #ifndef _MSC_STDINT_H_ #if _MSC_VER < 1300 typedef unsigned char uint8_t; typedef unsigned int uint32_t; #else typedef unsigned __int8 uint8_t; typedef unsigned __int32 uint32_t; #endif #endif #else #include <stdint.h> #endif #ifndef CYTHON_FALLTHROUGH #if defined(__cplusplus) && __cplusplus >= 201103L #if __has_cpp_attribute(fallthrough) #define
CYTHON_FALLTHROUGH [[fallthrough]] #elif __has_cpp_attribute(clang::fallthrough) #define CYTHON_FALLTHROUGH [[clang::fallthrough]] #elif __has_cpp_attribute(gnu::fallthrough) #define CYTHON_FALLTHROUGH [[gnu::fallthrough]] #endif #endif #ifndef CYTHON_FALLTHROUGH #if __has_attribute(fallthrough) #define CYTHON_FALLTHROUGH __attribute__((fallthrough)) #else #define CYTHON_FALLTHROUGH #endif #endif #if defined(__clang__ ) && defined(__apple_build_version__) #if __apple_build_version__ < 7000000 #undef CYTHON_FALLTHROUGH #define CYTHON_FALLTHROUGH #endif #endif #endif #ifndef CYTHON_INLINE #if defined(__clang__) #define CYTHON_INLINE __inline__ __attribute__ ((__unused__)) #elif defined(__GNUC__) #define CYTHON_INLINE __inline__ #elif defined(_MSC_VER) #define CYTHON_INLINE __inline #elif defined (__STDC_VERSION__) && __STDC_VERSION__ >= 199901L #define CYTHON_INLINE inline #else #define CYTHON_INLINE #endif #endif #if CYTHON_COMPILING_IN_PYPY && PY_VERSION_HEX < 0x02070600 && !defined(Py_OptimizeFlag) #define Py_OptimizeFlag 0 #endif #define __PYX_BUILD_PY_SSIZE_T "n" #define CYTHON_FORMAT_SSIZE_T "z" #if PY_MAJOR_VERSION < 3 #define __Pyx_BUILTIN_MODULE_NAME "__builtin__" #define __Pyx_PyCode_New(a, k, l, s, f, code, c, n, v, fv, cell, fn, name, fline, lnos)\ PyCode_New(a+k, l, s, f, code, c, n, v, fv, cell, fn, name, fline, lnos) #define __Pyx_DefaultClassType PyClass_Type #else #define __Pyx_BUILTIN_MODULE_NAME "builtins" #if PY_VERSION_HEX >= 0x030800A4 && PY_VERSION_HEX < 0x030800B2 #define __Pyx_PyCode_New(a, k, l, s, f, code, c, n, v, fv, cell, fn, name, fline, lnos)\ PyCode_New(a, 0, k, l, s, f, code, c, n, v, fv, cell, fn, name, fline, lnos) #else #define __Pyx_PyCode_New(a, k, l, s, f, code, c, n, v, fv, cell, fn, name, fline, lnos)\ PyCode_New(a, k, l, s, f, code, c, n, v, fv, cell, fn, name, fline, lnos) #endif #define __Pyx_DefaultClassType PyType_Type #endif #ifndef Py_TPFLAGS_CHECKTYPES #define Py_TPFLAGS_CHECKTYPES 0 #endif #ifndef Py_TPFLAGS_HAVE_INDEX #define Py_TPFLAGS_HAVE_INDEX 0 #endif #ifndef Py_TPFLAGS_HAVE_NEWBUFFER #define Py_TPFLAGS_HAVE_NEWBUFFER 0 #endif #ifndef Py_TPFLAGS_HAVE_FINALIZE #define Py_TPFLAGS_HAVE_FINALIZE 0 #endif #ifndef METH_STACKLESS #define METH_STACKLESS 0 #endif #if PY_VERSION_HEX <= 0x030700A3 || !defined(METH_FASTCALL) #ifndef METH_FASTCALL #define METH_FASTCALL 0x80 #endif typedef PyObject *(*__Pyx_PyCFunctionFast) (PyObject *self, PyObject *const *args, Py_ssize_t nargs); typedef PyObject *(*__Pyx_PyCFunctionFastWithKeywords) (PyObject *self, PyObject *const *args, Py_ssize_t nargs, PyObject *kwnames); #else #define __Pyx_PyCFunctionFast _PyCFunctionFast #define __Pyx_PyCFunctionFastWithKeywords _PyCFunctionFastWithKeywords #endif #if CYTHON_FAST_PYCCALL #define __Pyx_PyFastCFunction_Check(func)\ ((PyCFunction_Check(func) && (METH_FASTCALL == (PyCFunction_GET_FLAGS(func) & ~(METH_CLASS | METH_STATIC | METH_COEXIST | METH_KEYWORDS | METH_STACKLESS))))) #else #define __Pyx_PyFastCFunction_Check(func) 0 #endif #if CYTHON_COMPILING_IN_PYPY && !defined(PyObject_Malloc) #define PyObject_Malloc(s) PyMem_Malloc(s) #define PyObject_Free(p) PyMem_Free(p) #define PyObject_Realloc(p) PyMem_Realloc(p) #endif #if CYTHON_COMPILING_IN_CPYTHON && PY_VERSION_HEX < 0x030400A1 #define PyMem_RawMalloc(n) PyMem_Malloc(n) #define PyMem_RawRealloc(p, n) PyMem_Realloc(p, n) #define PyMem_RawFree(p) PyMem_Free(p) #endif #if CYTHON_COMPILING_IN_PYSTON #define __Pyx_PyCode_HasFreeVars(co) PyCode_HasFreeVars(co) #define __Pyx_PyFrame_SetLineNumber(frame, lineno) 
PyFrame_SetLineNumber(frame, lineno) #else #define __Pyx_PyCode_HasFreeVars(co) (PyCode_GetNumFree(co) > 0) #define __Pyx_PyFrame_SetLineNumber(frame, lineno) (frame)->f_lineno = (lineno) #endif #if !CYTHON_FAST_THREAD_STATE || PY_VERSION_HEX < 0x02070000 #define __Pyx_PyThreadState_Current PyThreadState_GET() #elif PY_VERSION_HEX >= 0x03060000 #define __Pyx_PyThreadState_Current _PyThreadState_UncheckedGet() #elif PY_VERSION_HEX >= 0x03000000 #define __Pyx_PyThreadState_Current PyThreadState_GET() #else #define __Pyx_PyThreadState_Current _PyThreadState_Current #endif #if PY_VERSION_HEX < 0x030700A2 && !defined(PyThread_tss_create) && !defined(Py_tss_NEEDS_INIT) #include "pythread.h" #define Py_tss_NEEDS_INIT 0 typedef int Py_tss_t; static CYTHON_INLINE int PyThread_tss_create(Py_tss_t *key) { *key = PyThread_create_key(); return 0; } static CYTHON_INLINE Py_tss_t * PyThread_tss_alloc(void) { Py_tss_t *key = (Py_tss_t *)PyObject_Malloc(sizeof(Py_tss_t)); *key = Py_tss_NEEDS_INIT; return key; } static CYTHON_INLINE void PyThread_tss_free(Py_tss_t *key) { PyObject_Free(key); } static CYTHON_INLINE int PyThread_tss_is_created(Py_tss_t *key) { return *key != Py_tss_NEEDS_INIT; } static CYTHON_INLINE void PyThread_tss_delete(Py_tss_t *key) { PyThread_delete_key(*key); *key = Py_tss_NEEDS_INIT; } static CYTHON_INLINE int PyThread_tss_set(Py_tss_t *key, void *value) { return PyThread_set_key_value(*key, value); } static CYTHON_INLINE void * PyThread_tss_get(Py_tss_t *key) { return PyThread_get_key_value(*key); } #endif #if CYTHON_COMPILING_IN_CPYTHON || defined(_PyDict_NewPresized) #define __Pyx_PyDict_NewPresized(n) ((n <= 8) ? PyDict_New() : _PyDict_NewPresized(n)) #else #define __Pyx_PyDict_NewPresized(n) PyDict_New() #endif #if PY_MAJOR_VERSION >= 3 || CYTHON_FUTURE_DIVISION #define __Pyx_PyNumber_Divide(x,y) PyNumber_TrueDivide(x,y) #define __Pyx_PyNumber_InPlaceDivide(x,y) PyNumber_InPlaceTrueDivide(x,y) #else #define __Pyx_PyNumber_Divide(x,y) PyNumber_Divide(x,y) #define __Pyx_PyNumber_InPlaceDivide(x,y) PyNumber_InPlaceDivide(x,y) #endif #if CYTHON_COMPILING_IN_CPYTHON && PY_VERSION_HEX >= 0x030500A1 && CYTHON_USE_UNICODE_INTERNALS #define __Pyx_PyDict_GetItemStr(dict, name) _PyDict_GetItem_KnownHash(dict, name, ((PyASCIIObject *) name)->hash) #else #define __Pyx_PyDict_GetItemStr(dict, name) PyDict_GetItem(dict, name) #endif #if PY_VERSION_HEX > 0x03030000 && defined(PyUnicode_KIND) #define CYTHON_PEP393_ENABLED 1 #define __Pyx_PyUnicode_READY(op) (likely(PyUnicode_IS_READY(op)) ?\ 0 : _PyUnicode_Ready((PyObject *)(op))) #define __Pyx_PyUnicode_GET_LENGTH(u) PyUnicode_GET_LENGTH(u) #define __Pyx_PyUnicode_READ_CHAR(u, i) PyUnicode_READ_CHAR(u, i) #define __Pyx_PyUnicode_MAX_CHAR_VALUE(u) PyUnicode_MAX_CHAR_VALUE(u) #define __Pyx_PyUnicode_KIND(u) PyUnicode_KIND(u) #define __Pyx_PyUnicode_DATA(u) PyUnicode_DATA(u) #define __Pyx_PyUnicode_READ(k, d, i) PyUnicode_READ(k, d, i) #define __Pyx_PyUnicode_WRITE(k, d, i, ch) PyUnicode_WRITE(k, d, i, ch) #define __Pyx_PyUnicode_IS_TRUE(u) (0 != (likely(PyUnicode_IS_READY(u)) ? PyUnicode_GET_LENGTH(u) : PyUnicode_GET_SIZE(u))) #else #define CYTHON_PEP393_ENABLED 0 #define PyUnicode_1BYTE_KIND 1 #define PyUnicode_2BYTE_KIND 2 #define PyUnicode_4BYTE_KIND 4 #define __Pyx_PyUnicode_READY(op) (0) #define __Pyx_PyUnicode_GET_LENGTH(u) PyUnicode_GET_SIZE(u) #define __Pyx_PyUnicode_READ_CHAR(u, i) ((Py_UCS4)(PyUnicode_AS_UNICODE(u)[i])) #define __Pyx_PyUnicode_MAX_CHAR_VALUE(u) ((sizeof(Py_UNICODE) == 2) ? 
65535 : 1114111) #define __Pyx_PyUnicode_KIND(u) (sizeof(Py_UNICODE)) #define __Pyx_PyUnicode_DATA(u) ((void*)PyUnicode_AS_UNICODE(u)) #define __Pyx_PyUnicode_READ(k, d, i) ((void)(k), (Py_UCS4)(((Py_UNICODE*)d)[i])) #define __Pyx_PyUnicode_WRITE(k, d, i, ch) (((void)(k)), ((Py_UNICODE*)d)[i] = ch) #define __Pyx_PyUnicode_IS_TRUE(u) (0 != PyUnicode_GET_SIZE(u)) #endif #if CYTHON_COMPILING_IN_PYPY #define __Pyx_PyUnicode_Concat(a, b) PyNumber_Add(a, b) #define __Pyx_PyUnicode_ConcatSafe(a, b) PyNumber_Add(a, b) #else #define __Pyx_PyUnicode_Concat(a, b) PyUnicode_Concat(a, b) #define __Pyx_PyUnicode_ConcatSafe(a, b) ((unlikely((a) == Py_None) || unlikely((b) == Py_None)) ?\ PyNumber_Add(a, b) : __Pyx_PyUnicode_Concat(a, b)) #endif #if CYTHON_COMPILING_IN_PYPY && !defined(PyUnicode_Contains) #define PyUnicode_Contains(u, s) PySequence_Contains(u, s) #endif #if CYTHON_COMPILING_IN_PYPY && !defined(PyByteArray_Check) #define PyByteArray_Check(obj) PyObject_TypeCheck(obj, &PyByteArray_Type) #endif #if CYTHON_COMPILING_IN_PYPY && !defined(PyObject_Format) #define PyObject_Format(obj, fmt) PyObject_CallMethod(obj, "__format__", "O", fmt) #endif #define __Pyx_PyString_FormatSafe(a, b) ((unlikely((a) == Py_None || (PyString_Check(b) && !PyString_CheckExact(b)))) ? PyNumber_Remainder(a, b) : __Pyx_PyString_Format(a, b)) #define __Pyx_PyUnicode_FormatSafe(a, b) ((unlikely((a) == Py_None || (PyUnicode_Check(b) && !PyUnicode_CheckExact(b)))) ? PyNumber_Remainder(a, b) : PyUnicode_Format(a, b)) #if PY_MAJOR_VERSION >= 3 #define __Pyx_PyString_Format(a, b) PyUnicode_Format(a, b) #else #define __Pyx_PyString_Format(a, b) PyString_Format(a, b) #endif #if PY_MAJOR_VERSION < 3 && !defined(PyObject_ASCII) #define PyObject_ASCII(o) PyObject_Repr(o) #endif #if PY_MAJOR_VERSION >= 3 #define PyBaseString_Type PyUnicode_Type #define PyStringObject PyUnicodeObject #define PyString_Type PyUnicode_Type #define PyString_Check PyUnicode_Check #define PyString_CheckExact PyUnicode_CheckExact #define PyObject_Unicode PyObject_Str #endif #if PY_MAJOR_VERSION >= 3 #define __Pyx_PyBaseString_Check(obj) PyUnicode_Check(obj) #define __Pyx_PyBaseString_CheckExact(obj) PyUnicode_CheckExact(obj) #else #define __Pyx_PyBaseString_Check(obj) (PyString_Check(obj) || PyUnicode_Check(obj)) #define __Pyx_PyBaseString_CheckExact(obj) (PyString_CheckExact(obj) || PyUnicode_CheckExact(obj)) #endif #ifndef PySet_CheckExact #define PySet_CheckExact(obj) (Py_TYPE(obj) == &PySet_Type) #endif #if CYTHON_ASSUME_SAFE_MACROS #define __Pyx_PySequence_SIZE(seq) Py_SIZE(seq) #else #define __Pyx_PySequence_SIZE(seq) PySequence_Size(seq) #endif #if PY_MAJOR_VERSION >= 3 #define PyIntObject PyLongObject #define PyInt_Type PyLong_Type #define PyInt_Check(op) PyLong_Check(op) #define PyInt_CheckExact(op) PyLong_CheckExact(op) #define PyInt_FromString PyLong_FromString #define PyInt_FromUnicode PyLong_FromUnicode #define PyInt_FromLong PyLong_FromLong #define PyInt_FromSize_t PyLong_FromSize_t #define PyInt_FromSsize_t PyLong_FromSsize_t #define PyInt_AsLong PyLong_AsLong #define PyInt_AS_LONG PyLong_AS_LONG #define PyInt_AsSsize_t PyLong_AsSsize_t #define PyInt_AsUnsignedLongMask PyLong_AsUnsignedLongMask #define PyInt_AsUnsignedLongLongMask PyLong_AsUnsignedLongLongMask #define PyNumber_Int PyNumber_Long #endif #if PY_MAJOR_VERSION >= 3 #define PyBoolObject PyLongObject #endif #if PY_MAJOR_VERSION >= 3 && CYTHON_COMPILING_IN_PYPY #ifndef PyUnicode_InternFromString #define PyUnicode_InternFromString(s) PyUnicode_FromString(s) #endif #endif #if 
PY_VERSION_HEX < 0x030200A4 typedef long Py_hash_t; #define __Pyx_PyInt_FromHash_t PyInt_FromLong #define __Pyx_PyInt_AsHash_t PyInt_AsLong #else #define __Pyx_PyInt_FromHash_t PyInt_FromSsize_t #define __Pyx_PyInt_AsHash_t PyInt_AsSsize_t #endif #if PY_MAJOR_VERSION >= 3 #define __Pyx_PyMethod_New(func, self, klass) ((self) ? PyMethod_New(func, self) : (Py_INCREF(func), func)) #else #define __Pyx_PyMethod_New(func, self, klass) PyMethod_New(func, self, klass) #endif #if CYTHON_USE_ASYNC_SLOTS #if PY_VERSION_HEX >= 0x030500B1 #define __Pyx_PyAsyncMethodsStruct PyAsyncMethods #define __Pyx_PyType_AsAsync(obj) (Py_TYPE(obj)->tp_as_async) #else #define __Pyx_PyType_AsAsync(obj) ((__Pyx_PyAsyncMethodsStruct*) (Py_TYPE(obj)->tp_reserved)) #endif #else #define __Pyx_PyType_AsAsync(obj) NULL #endif #ifndef __Pyx_PyAsyncMethodsStruct typedef struct { unaryfunc am_await; unaryfunc am_aiter; unaryfunc am_anext; } __Pyx_PyAsyncMethodsStruct; #endif #if defined(WIN32) || defined(MS_WINDOWS) #define _USE_MATH_DEFINES #endif #include #ifdef NAN #define __PYX_NAN() ((float) NAN) #else static CYTHON_INLINE float __PYX_NAN() { float value; memset(&value, 0xFF, sizeof(value)); return value; } #endif #if defined(__CYGWIN__) && defined(_LDBL_EQ_DBL) #define __Pyx_truncl trunc #else #define __Pyx_truncl truncl #endif #define __PYX_ERR(f_index, lineno, Ln_error) \ { \ __pyx_filename = __pyx_f[f_index]; __pyx_lineno = lineno; __pyx_clineno = __LINE__; goto Ln_error; \ } #ifndef __PYX_EXTERN_C #ifdef __cplusplus #define __PYX_EXTERN_C extern "C" #else #define __PYX_EXTERN_C extern #endif #endif #define __PYX_HAVE__aiohttp___websocket #define __PYX_HAVE_API__aiohttp___websocket /* Early includes */ #include #include #include "pythread.h" #include #ifdef _OPENMP #include #endif /* _OPENMP */ #if defined(PYREX_WITHOUT_ASSERTIONS) && !defined(CYTHON_WITHOUT_ASSERTIONS) #define CYTHON_WITHOUT_ASSERTIONS #endif typedef struct {PyObject **p; const char *s; const Py_ssize_t n; const char* encoding; const char is_unicode; const char is_str; const char intern; } __Pyx_StringTabEntry; #define __PYX_DEFAULT_STRING_ENCODING_IS_ASCII 0 #define __PYX_DEFAULT_STRING_ENCODING_IS_UTF8 0 #define __PYX_DEFAULT_STRING_ENCODING_IS_DEFAULT (PY_MAJOR_VERSION >= 3 && __PYX_DEFAULT_STRING_ENCODING_IS_UTF8) #define __PYX_DEFAULT_STRING_ENCODING "" #define __Pyx_PyObject_FromString __Pyx_PyBytes_FromString #define __Pyx_PyObject_FromStringAndSize __Pyx_PyBytes_FromStringAndSize #define __Pyx_uchar_cast(c) ((unsigned char)c) #define __Pyx_long_cast(x) ((long)x) #define __Pyx_fits_Py_ssize_t(v, type, is_signed) (\ (sizeof(type) < sizeof(Py_ssize_t)) ||\ (sizeof(type) > sizeof(Py_ssize_t) &&\ likely(v < (type)PY_SSIZE_T_MAX ||\ v == (type)PY_SSIZE_T_MAX) &&\ (!is_signed || likely(v > (type)PY_SSIZE_T_MIN ||\ v == (type)PY_SSIZE_T_MIN))) ||\ (sizeof(type) == sizeof(Py_ssize_t) &&\ (is_signed || likely(v < (type)PY_SSIZE_T_MAX ||\ v == (type)PY_SSIZE_T_MAX))) ) static CYTHON_INLINE int __Pyx_is_valid_index(Py_ssize_t i, Py_ssize_t limit) { return (size_t) i < (size_t) limit; } #if defined (__cplusplus) && __cplusplus >= 201103L #include #define __Pyx_sst_abs(value) std::abs(value) #elif SIZEOF_INT >= SIZEOF_SIZE_T #define __Pyx_sst_abs(value) abs(value) #elif SIZEOF_LONG >= SIZEOF_SIZE_T #define __Pyx_sst_abs(value) labs(value) #elif defined (_MSC_VER) #define __Pyx_sst_abs(value) ((Py_ssize_t)_abs64(value)) #elif defined (__STDC_VERSION__) && __STDC_VERSION__ >= 199901L #define __Pyx_sst_abs(value) llabs(value) #elif defined (__GNUC__) #define 
__Pyx_sst_abs(value) __builtin_llabs(value) #else #define __Pyx_sst_abs(value) ((value<0) ? -value : value) #endif static CYTHON_INLINE const char* __Pyx_PyObject_AsString(PyObject*); static CYTHON_INLINE const char* __Pyx_PyObject_AsStringAndSize(PyObject*, Py_ssize_t* length); #define __Pyx_PyByteArray_FromString(s) PyByteArray_FromStringAndSize((const char*)s, strlen((const char*)s)) #define __Pyx_PyByteArray_FromStringAndSize(s, l) PyByteArray_FromStringAndSize((const char*)s, l) #define __Pyx_PyBytes_FromString PyBytes_FromString #define __Pyx_PyBytes_FromStringAndSize PyBytes_FromStringAndSize static CYTHON_INLINE PyObject* __Pyx_PyUnicode_FromString(const char*); #if PY_MAJOR_VERSION < 3 #define __Pyx_PyStr_FromString __Pyx_PyBytes_FromString #define __Pyx_PyStr_FromStringAndSize __Pyx_PyBytes_FromStringAndSize #else #define __Pyx_PyStr_FromString __Pyx_PyUnicode_FromString #define __Pyx_PyStr_FromStringAndSize __Pyx_PyUnicode_FromStringAndSize #endif #define __Pyx_PyBytes_AsWritableString(s) ((char*) PyBytes_AS_STRING(s)) #define __Pyx_PyBytes_AsWritableSString(s) ((signed char*) PyBytes_AS_STRING(s)) #define __Pyx_PyBytes_AsWritableUString(s) ((unsigned char*) PyBytes_AS_STRING(s)) #define __Pyx_PyBytes_AsString(s) ((const char*) PyBytes_AS_STRING(s)) #define __Pyx_PyBytes_AsSString(s) ((const signed char*) PyBytes_AS_STRING(s)) #define __Pyx_PyBytes_AsUString(s) ((const unsigned char*) PyBytes_AS_STRING(s)) #define __Pyx_PyObject_AsWritableString(s) ((char*) __Pyx_PyObject_AsString(s)) #define __Pyx_PyObject_AsWritableSString(s) ((signed char*) __Pyx_PyObject_AsString(s)) #define __Pyx_PyObject_AsWritableUString(s) ((unsigned char*) __Pyx_PyObject_AsString(s)) #define __Pyx_PyObject_AsSString(s) ((const signed char*) __Pyx_PyObject_AsString(s)) #define __Pyx_PyObject_AsUString(s) ((const unsigned char*) __Pyx_PyObject_AsString(s)) #define __Pyx_PyObject_FromCString(s) __Pyx_PyObject_FromString((const char*)s) #define __Pyx_PyBytes_FromCString(s) __Pyx_PyBytes_FromString((const char*)s) #define __Pyx_PyByteArray_FromCString(s) __Pyx_PyByteArray_FromString((const char*)s) #define __Pyx_PyStr_FromCString(s) __Pyx_PyStr_FromString((const char*)s) #define __Pyx_PyUnicode_FromCString(s) __Pyx_PyUnicode_FromString((const char*)s) static CYTHON_INLINE size_t __Pyx_Py_UNICODE_strlen(const Py_UNICODE *u) { const Py_UNICODE *u_end = u; while (*u_end++) ; return (size_t)(u_end - u - 1); } #define __Pyx_PyUnicode_FromUnicode(u) PyUnicode_FromUnicode(u, __Pyx_Py_UNICODE_strlen(u)) #define __Pyx_PyUnicode_FromUnicodeAndLength PyUnicode_FromUnicode #define __Pyx_PyUnicode_AsUnicode PyUnicode_AsUnicode #define __Pyx_NewRef(obj) (Py_INCREF(obj), obj) #define __Pyx_Owned_Py_None(b) __Pyx_NewRef(Py_None) static CYTHON_INLINE PyObject * __Pyx_PyBool_FromLong(long b); static CYTHON_INLINE int __Pyx_PyObject_IsTrue(PyObject*); static CYTHON_INLINE int __Pyx_PyObject_IsTrueAndDecref(PyObject*); static CYTHON_INLINE PyObject* __Pyx_PyNumber_IntOrLong(PyObject* x); #define __Pyx_PySequence_Tuple(obj)\ (likely(PyTuple_CheckExact(obj)) ? __Pyx_NewRef(obj) : PySequence_Tuple(obj)) static CYTHON_INLINE Py_ssize_t __Pyx_PyIndex_AsSsize_t(PyObject*); static CYTHON_INLINE PyObject * __Pyx_PyInt_FromSize_t(size_t); #if CYTHON_ASSUME_SAFE_MACROS #define __pyx_PyFloat_AsDouble(x) (PyFloat_CheckExact(x) ? 
PyFloat_AS_DOUBLE(x) : PyFloat_AsDouble(x)) #else #define __pyx_PyFloat_AsDouble(x) PyFloat_AsDouble(x) #endif #define __pyx_PyFloat_AsFloat(x) ((float) __pyx_PyFloat_AsDouble(x)) #if PY_MAJOR_VERSION >= 3 #define __Pyx_PyNumber_Int(x) (PyLong_CheckExact(x) ? __Pyx_NewRef(x) : PyNumber_Long(x)) #else #define __Pyx_PyNumber_Int(x) (PyInt_CheckExact(x) ? __Pyx_NewRef(x) : PyNumber_Int(x)) #endif #define __Pyx_PyNumber_Float(x) (PyFloat_CheckExact(x) ? __Pyx_NewRef(x) : PyNumber_Float(x)) #if PY_MAJOR_VERSION < 3 && __PYX_DEFAULT_STRING_ENCODING_IS_ASCII static int __Pyx_sys_getdefaultencoding_not_ascii; static int __Pyx_init_sys_getdefaultencoding_params(void) { PyObject* sys; PyObject* default_encoding = NULL; PyObject* ascii_chars_u = NULL; PyObject* ascii_chars_b = NULL; const char* default_encoding_c; sys = PyImport_ImportModule("sys"); if (!sys) goto bad; default_encoding = PyObject_CallMethod(sys, (char*) "getdefaultencoding", NULL); Py_DECREF(sys); if (!default_encoding) goto bad; default_encoding_c = PyBytes_AsString(default_encoding); if (!default_encoding_c) goto bad; if (strcmp(default_encoding_c, "ascii") == 0) { __Pyx_sys_getdefaultencoding_not_ascii = 0; } else { char ascii_chars[128]; int c; for (c = 0; c < 128; c++) { ascii_chars[c] = c; } __Pyx_sys_getdefaultencoding_not_ascii = 1; ascii_chars_u = PyUnicode_DecodeASCII(ascii_chars, 128, NULL); if (!ascii_chars_u) goto bad; ascii_chars_b = PyUnicode_AsEncodedString(ascii_chars_u, default_encoding_c, NULL); if (!ascii_chars_b || !PyBytes_Check(ascii_chars_b) || memcmp(ascii_chars, PyBytes_AS_STRING(ascii_chars_b), 128) != 0) { PyErr_Format( PyExc_ValueError, "This module compiled with c_string_encoding=ascii, but default encoding '%.200s' is not a superset of ascii.", default_encoding_c); goto bad; } Py_DECREF(ascii_chars_u); Py_DECREF(ascii_chars_b); } Py_DECREF(default_encoding); return 0; bad: Py_XDECREF(default_encoding); Py_XDECREF(ascii_chars_u); Py_XDECREF(ascii_chars_b); return -1; } #endif #if __PYX_DEFAULT_STRING_ENCODING_IS_DEFAULT && PY_MAJOR_VERSION >= 3 #define __Pyx_PyUnicode_FromStringAndSize(c_str, size) PyUnicode_DecodeUTF8(c_str, size, NULL) #else #define __Pyx_PyUnicode_FromStringAndSize(c_str, size) PyUnicode_Decode(c_str, size, __PYX_DEFAULT_STRING_ENCODING, NULL) #if __PYX_DEFAULT_STRING_ENCODING_IS_DEFAULT static char* __PYX_DEFAULT_STRING_ENCODING; static int __Pyx_init_sys_getdefaultencoding_params(void) { PyObject* sys; PyObject* default_encoding = NULL; char* default_encoding_c; sys = PyImport_ImportModule("sys"); if (!sys) goto bad; default_encoding = PyObject_CallMethod(sys, (char*) (const char*) "getdefaultencoding", NULL); Py_DECREF(sys); if (!default_encoding) goto bad; default_encoding_c = PyBytes_AsString(default_encoding); if (!default_encoding_c) goto bad; __PYX_DEFAULT_STRING_ENCODING = (char*) malloc(strlen(default_encoding_c) + 1); if (!__PYX_DEFAULT_STRING_ENCODING) goto bad; strcpy(__PYX_DEFAULT_STRING_ENCODING, default_encoding_c); Py_DECREF(default_encoding); return 0; bad: Py_XDECREF(default_encoding); return -1; } #endif #endif /* Test for GCC > 2.95 */ #if defined(__GNUC__) && (__GNUC__ > 2 || (__GNUC__ == 2 && (__GNUC_MINOR__ > 95))) #define likely(x) __builtin_expect(!!(x), 1) #define unlikely(x) __builtin_expect(!!(x), 0) #else /* !__GNUC__ or GCC < 2.95 */ #define likely(x) (x) #define unlikely(x) (x) #endif /* __GNUC__ */ static CYTHON_INLINE void __Pyx_pretend_to_initialize(void* ptr) { (void)ptr; } static PyObject *__pyx_m = NULL; static PyObject *__pyx_d; static PyObject 
*__pyx_b; static PyObject *__pyx_cython_runtime = NULL; static PyObject *__pyx_empty_tuple; static PyObject *__pyx_empty_bytes; static PyObject *__pyx_empty_unicode; static int __pyx_lineno; static int __pyx_clineno = 0; static const char * __pyx_cfilenm= __FILE__; static const char *__pyx_filename; static const char *__pyx_f[] = { "aiohttp/_websocket.pyx", "type.pxd", "bool.pxd", "complex.pxd", }; /*--- Type declarations ---*/ /* --- Runtime support code (head) --- */ /* Refnanny.proto */ #ifndef CYTHON_REFNANNY #define CYTHON_REFNANNY 0 #endif #if CYTHON_REFNANNY typedef struct { void (*INCREF)(void*, PyObject*, int); void (*DECREF)(void*, PyObject*, int); void (*GOTREF)(void*, PyObject*, int); void (*GIVEREF)(void*, PyObject*, int); void* (*SetupContext)(const char*, int, const char*); void (*FinishContext)(void**); } __Pyx_RefNannyAPIStruct; static __Pyx_RefNannyAPIStruct *__Pyx_RefNanny = NULL; static __Pyx_RefNannyAPIStruct *__Pyx_RefNannyImportAPI(const char *modname); #define __Pyx_RefNannyDeclarations void *__pyx_refnanny = NULL; #ifdef WITH_THREAD #define __Pyx_RefNannySetupContext(name, acquire_gil)\ if (acquire_gil) {\ PyGILState_STATE __pyx_gilstate_save = PyGILState_Ensure();\ __pyx_refnanny = __Pyx_RefNanny->SetupContext((name), __LINE__, __FILE__);\ PyGILState_Release(__pyx_gilstate_save);\ } else {\ __pyx_refnanny = __Pyx_RefNanny->SetupContext((name), __LINE__, __FILE__);\ } #else #define __Pyx_RefNannySetupContext(name, acquire_gil)\ __pyx_refnanny = __Pyx_RefNanny->SetupContext((name), __LINE__, __FILE__) #endif #define __Pyx_RefNannyFinishContext()\ __Pyx_RefNanny->FinishContext(&__pyx_refnanny) #define __Pyx_INCREF(r) __Pyx_RefNanny->INCREF(__pyx_refnanny, (PyObject *)(r), __LINE__) #define __Pyx_DECREF(r) __Pyx_RefNanny->DECREF(__pyx_refnanny, (PyObject *)(r), __LINE__) #define __Pyx_GOTREF(r) __Pyx_RefNanny->GOTREF(__pyx_refnanny, (PyObject *)(r), __LINE__) #define __Pyx_GIVEREF(r) __Pyx_RefNanny->GIVEREF(__pyx_refnanny, (PyObject *)(r), __LINE__) #define __Pyx_XINCREF(r) do { if((r) != NULL) {__Pyx_INCREF(r); }} while(0) #define __Pyx_XDECREF(r) do { if((r) != NULL) {__Pyx_DECREF(r); }} while(0) #define __Pyx_XGOTREF(r) do { if((r) != NULL) {__Pyx_GOTREF(r); }} while(0) #define __Pyx_XGIVEREF(r) do { if((r) != NULL) {__Pyx_GIVEREF(r);}} while(0) #else #define __Pyx_RefNannyDeclarations #define __Pyx_RefNannySetupContext(name, acquire_gil) #define __Pyx_RefNannyFinishContext() #define __Pyx_INCREF(r) Py_INCREF(r) #define __Pyx_DECREF(r) Py_DECREF(r) #define __Pyx_GOTREF(r) #define __Pyx_GIVEREF(r) #define __Pyx_XINCREF(r) Py_XINCREF(r) #define __Pyx_XDECREF(r) Py_XDECREF(r) #define __Pyx_XGOTREF(r) #define __Pyx_XGIVEREF(r) #endif #define __Pyx_XDECREF_SET(r, v) do {\ PyObject *tmp = (PyObject *) r;\ r = v; __Pyx_XDECREF(tmp);\ } while (0) #define __Pyx_DECREF_SET(r, v) do {\ PyObject *tmp = (PyObject *) r;\ r = v; __Pyx_DECREF(tmp);\ } while (0) #define __Pyx_CLEAR(r) do { PyObject* tmp = ((PyObject*)(r)); r = NULL; __Pyx_DECREF(tmp);} while(0) #define __Pyx_XCLEAR(r) do { if((r) != NULL) {PyObject* tmp = ((PyObject*)(r)); r = NULL; __Pyx_DECREF(tmp);}} while(0) /* PyObjectGetAttrStr.proto */ #if CYTHON_USE_TYPE_SLOTS static CYTHON_INLINE PyObject* __Pyx_PyObject_GetAttrStr(PyObject* obj, PyObject* attr_name); #else #define __Pyx_PyObject_GetAttrStr(o,n) PyObject_GetAttr(o,n) #endif /* GetBuiltinName.proto */ static PyObject *__Pyx_GetBuiltinName(PyObject *name); /* RaiseArgTupleInvalid.proto */ static void __Pyx_RaiseArgtupleInvalid(const char* func_name, int 
exact, Py_ssize_t num_min, Py_ssize_t num_max, Py_ssize_t num_found); /* RaiseDoubleKeywords.proto */ static void __Pyx_RaiseDoubleKeywordsError(const char* func_name, PyObject* kw_name); /* ParseKeywords.proto */ static int __Pyx_ParseOptionalKeywords(PyObject *kwds, PyObject **argnames[],\ PyObject *kwds2, PyObject *values[], Py_ssize_t num_pos_args,\ const char* function_name); /* PyCFunctionFastCall.proto */ #if CYTHON_FAST_PYCCALL static CYTHON_INLINE PyObject *__Pyx_PyCFunction_FastCall(PyObject *func, PyObject **args, Py_ssize_t nargs); #else #define __Pyx_PyCFunction_FastCall(func, args, nargs) (assert(0), NULL) #endif /* PyFunctionFastCall.proto */ #if CYTHON_FAST_PYCALL #define __Pyx_PyFunction_FastCall(func, args, nargs)\ __Pyx_PyFunction_FastCallDict((func), (args), (nargs), NULL) #if 1 || PY_VERSION_HEX < 0x030600B1 static PyObject *__Pyx_PyFunction_FastCallDict(PyObject *func, PyObject **args, Py_ssize_t nargs, PyObject *kwargs); #else #define __Pyx_PyFunction_FastCallDict(func, args, nargs, kwargs) _PyFunction_FastCallDict(func, args, nargs, kwargs) #endif #define __Pyx_BUILD_ASSERT_EXPR(cond)\ (sizeof(char [1 - 2*!(cond)]) - 1) #ifndef Py_MEMBER_SIZE #define Py_MEMBER_SIZE(type, member) sizeof(((type *)0)->member) #endif static size_t __pyx_pyframe_localsplus_offset = 0; #include "frameobject.h" #define __Pxy_PyFrame_Initialize_Offsets()\ ((void)__Pyx_BUILD_ASSERT_EXPR(sizeof(PyFrameObject) == offsetof(PyFrameObject, f_localsplus) + Py_MEMBER_SIZE(PyFrameObject, f_localsplus)),\ (void)(__pyx_pyframe_localsplus_offset = ((size_t)PyFrame_Type.tp_basicsize) - Py_MEMBER_SIZE(PyFrameObject, f_localsplus))) #define __Pyx_PyFrame_GetLocalsplus(frame)\ (assert(__pyx_pyframe_localsplus_offset), (PyObject **)(((char *)(frame)) + __pyx_pyframe_localsplus_offset)) #endif /* PyObjectCall.proto */ #if CYTHON_COMPILING_IN_CPYTHON static CYTHON_INLINE PyObject* __Pyx_PyObject_Call(PyObject *func, PyObject *arg, PyObject *kw); #else #define __Pyx_PyObject_Call(func, arg, kw) PyObject_Call(func, arg, kw) #endif /* PyObjectCallMethO.proto */ #if CYTHON_COMPILING_IN_CPYTHON static CYTHON_INLINE PyObject* __Pyx_PyObject_CallMethO(PyObject *func, PyObject *arg); #endif /* PyObjectCallOneArg.proto */ static CYTHON_INLINE PyObject* __Pyx_PyObject_CallOneArg(PyObject *func, PyObject *arg); /* TypeImport.proto */ #ifndef __PYX_HAVE_RT_ImportType_proto #define __PYX_HAVE_RT_ImportType_proto enum __Pyx_ImportType_CheckSize { __Pyx_ImportType_CheckSize_Error = 0, __Pyx_ImportType_CheckSize_Warn = 1, __Pyx_ImportType_CheckSize_Ignore = 2 }; static PyTypeObject *__Pyx_ImportType(PyObject* module, const char *module_name, const char *class_name, size_t size, enum __Pyx_ImportType_CheckSize check_size); #endif /* PyDictVersioning.proto */ #if CYTHON_USE_DICT_VERSIONS && CYTHON_USE_TYPE_SLOTS #define __PYX_DICT_VERSION_INIT ((PY_UINT64_T) -1) #define __PYX_GET_DICT_VERSION(dict) (((PyDictObject*)(dict))->ma_version_tag) #define __PYX_UPDATE_DICT_CACHE(dict, value, cache_var, version_var)\ (version_var) = __PYX_GET_DICT_VERSION(dict);\ (cache_var) = (value); #define __PYX_PY_DICT_LOOKUP_IF_MODIFIED(VAR, DICT, LOOKUP) {\ static PY_UINT64_T __pyx_dict_version = 0;\ static PyObject *__pyx_dict_cached_value = NULL;\ if (likely(__PYX_GET_DICT_VERSION(DICT) == __pyx_dict_version)) {\ (VAR) = __pyx_dict_cached_value;\ } else {\ (VAR) = __pyx_dict_cached_value = (LOOKUP);\ __pyx_dict_version = __PYX_GET_DICT_VERSION(DICT);\ }\ } static CYTHON_INLINE PY_UINT64_T __Pyx_get_tp_dict_version(PyObject *obj); static 
CYTHON_INLINE PY_UINT64_T __Pyx_get_object_dict_version(PyObject *obj); static CYTHON_INLINE int __Pyx_object_dict_version_matches(PyObject* obj, PY_UINT64_T tp_dict_version, PY_UINT64_T obj_dict_version); #else #define __PYX_GET_DICT_VERSION(dict) (0) #define __PYX_UPDATE_DICT_CACHE(dict, value, cache_var, version_var) #define __PYX_PY_DICT_LOOKUP_IF_MODIFIED(VAR, DICT, LOOKUP) (VAR) = (LOOKUP); #endif /* PyThreadStateGet.proto */ #if CYTHON_FAST_THREAD_STATE #define __Pyx_PyThreadState_declare PyThreadState *__pyx_tstate; #define __Pyx_PyThreadState_assign __pyx_tstate = __Pyx_PyThreadState_Current; #define __Pyx_PyErr_Occurred() __pyx_tstate->curexc_type #else #define __Pyx_PyThreadState_declare #define __Pyx_PyThreadState_assign #define __Pyx_PyErr_Occurred() PyErr_Occurred() #endif /* PyErrFetchRestore.proto */ #if CYTHON_FAST_THREAD_STATE #define __Pyx_PyErr_Clear() __Pyx_ErrRestore(NULL, NULL, NULL) #define __Pyx_ErrRestoreWithState(type, value, tb) __Pyx_ErrRestoreInState(PyThreadState_GET(), type, value, tb) #define __Pyx_ErrFetchWithState(type, value, tb) __Pyx_ErrFetchInState(PyThreadState_GET(), type, value, tb) #define __Pyx_ErrRestore(type, value, tb) __Pyx_ErrRestoreInState(__pyx_tstate, type, value, tb) #define __Pyx_ErrFetch(type, value, tb) __Pyx_ErrFetchInState(__pyx_tstate, type, value, tb) static CYTHON_INLINE void __Pyx_ErrRestoreInState(PyThreadState *tstate, PyObject *type, PyObject *value, PyObject *tb); static CYTHON_INLINE void __Pyx_ErrFetchInState(PyThreadState *tstate, PyObject **type, PyObject **value, PyObject **tb); #if CYTHON_COMPILING_IN_CPYTHON #define __Pyx_PyErr_SetNone(exc) (Py_INCREF(exc), __Pyx_ErrRestore((exc), NULL, NULL)) #else #define __Pyx_PyErr_SetNone(exc) PyErr_SetNone(exc) #endif #else #define __Pyx_PyErr_Clear() PyErr_Clear() #define __Pyx_PyErr_SetNone(exc) PyErr_SetNone(exc) #define __Pyx_ErrRestoreWithState(type, value, tb) PyErr_Restore(type, value, tb) #define __Pyx_ErrFetchWithState(type, value, tb) PyErr_Fetch(type, value, tb) #define __Pyx_ErrRestoreInState(tstate, type, value, tb) PyErr_Restore(type, value, tb) #define __Pyx_ErrFetchInState(tstate, type, value, tb) PyErr_Fetch(type, value, tb) #define __Pyx_ErrRestore(type, value, tb) PyErr_Restore(type, value, tb) #define __Pyx_ErrFetch(type, value, tb) PyErr_Fetch(type, value, tb) #endif /* CLineInTraceback.proto */ #ifdef CYTHON_CLINE_IN_TRACEBACK #define __Pyx_CLineForTraceback(tstate, c_line) (((CYTHON_CLINE_IN_TRACEBACK)) ? 
c_line : 0) #else static int __Pyx_CLineForTraceback(PyThreadState *tstate, int c_line); #endif /* CodeObjectCache.proto */ typedef struct { PyCodeObject* code_object; int code_line; } __Pyx_CodeObjectCacheEntry; struct __Pyx_CodeObjectCache { int count; int max_count; __Pyx_CodeObjectCacheEntry* entries; }; static struct __Pyx_CodeObjectCache __pyx_code_cache = {0,0,NULL}; static int __pyx_bisect_code_objects(__Pyx_CodeObjectCacheEntry* entries, int count, int code_line); static PyCodeObject *__pyx_find_code_object(int code_line); static void __pyx_insert_code_object(int code_line, PyCodeObject* code_object); /* AddTraceback.proto */ static void __Pyx_AddTraceback(const char *funcname, int c_line, int py_line, const char *filename); /* CIntToPy.proto */ static CYTHON_INLINE PyObject* __Pyx_PyInt_From_long(long value); /* CIntFromPy.proto */ static CYTHON_INLINE long __Pyx_PyInt_As_long(PyObject *); /* CIntFromPy.proto */ static CYTHON_INLINE int __Pyx_PyInt_As_int(PyObject *); /* FastTypeChecks.proto */ #if CYTHON_COMPILING_IN_CPYTHON #define __Pyx_TypeCheck(obj, type) __Pyx_IsSubtype(Py_TYPE(obj), (PyTypeObject *)type) static CYTHON_INLINE int __Pyx_IsSubtype(PyTypeObject *a, PyTypeObject *b); static CYTHON_INLINE int __Pyx_PyErr_GivenExceptionMatches(PyObject *err, PyObject *type); static CYTHON_INLINE int __Pyx_PyErr_GivenExceptionMatches2(PyObject *err, PyObject *type1, PyObject *type2); #else #define __Pyx_TypeCheck(obj, type) PyObject_TypeCheck(obj, (PyTypeObject *)type) #define __Pyx_PyErr_GivenExceptionMatches(err, type) PyErr_GivenExceptionMatches(err, type) #define __Pyx_PyErr_GivenExceptionMatches2(err, type1, type2) (PyErr_GivenExceptionMatches(err, type1) || PyErr_GivenExceptionMatches(err, type2)) #endif #define __Pyx_PyException_Check(obj) __Pyx_TypeCheck(obj, PyExc_Exception) /* CheckBinaryVersion.proto */ static int __Pyx_check_binary_version(void); /* InitStrings.proto */ static int __Pyx_InitStrings(__Pyx_StringTabEntry *t); /* Module declarations from 'cpython.version' */ /* Module declarations from '__builtin__' */ /* Module declarations from 'cpython.type' */ static PyTypeObject *__pyx_ptype_7cpython_4type_type = 0; /* Module declarations from 'libc.string' */ /* Module declarations from 'libc.stdio' */ /* Module declarations from 'cpython.object' */ /* Module declarations from 'cpython.ref' */ /* Module declarations from 'cpython.exc' */ /* Module declarations from 'cpython.module' */ /* Module declarations from 'cpython.mem' */ /* Module declarations from 'cpython.tuple' */ /* Module declarations from 'cpython.list' */ /* Module declarations from 'cpython.sequence' */ /* Module declarations from 'cpython.mapping' */ /* Module declarations from 'cpython.iterator' */ /* Module declarations from 'cpython.number' */ /* Module declarations from 'cpython.int' */ /* Module declarations from '__builtin__' */ /* Module declarations from 'cpython.bool' */ static PyTypeObject *__pyx_ptype_7cpython_4bool_bool = 0; /* Module declarations from 'cpython.long' */ /* Module declarations from 'cpython.float' */ /* Module declarations from '__builtin__' */ /* Module declarations from 'cpython.complex' */ static PyTypeObject *__pyx_ptype_7cpython_7complex_complex = 0; /* Module declarations from 'cpython.string' */ /* Module declarations from 'cpython.unicode' */ /* Module declarations from 'cpython.dict' */ /* Module declarations from 'cpython.instance' */ /* Module declarations from 'cpython.function' */ /* Module declarations from 'cpython.method' */ /* Module declarations from 
'cpython.weakref' */ /* Module declarations from 'cpython.getargs' */ /* Module declarations from 'cpython.pythread' */ /* Module declarations from 'cpython.pystate' */ /* Module declarations from 'cpython.cobject' */ /* Module declarations from 'cpython.oldbuffer' */ /* Module declarations from 'cpython.set' */ /* Module declarations from 'cpython.buffer' */ /* Module declarations from 'cpython.bytes' */ /* Module declarations from 'cpython.pycapsule' */ /* Module declarations from 'cpython' */ /* Module declarations from 'libc.stdint' */ /* Module declarations from 'aiohttp._websocket' */ #define __Pyx_MODULE_NAME "aiohttp._websocket" extern int __pyx_module_is_main_aiohttp___websocket; int __pyx_module_is_main_aiohttp___websocket = 0; /* Implementation of 'aiohttp._websocket' */ static PyObject *__pyx_builtin_range; static const char __pyx_k_i[] = "i"; static const char __pyx_k_data[] = "data"; static const char __pyx_k_main[] = "__main__"; static const char __pyx_k_mask[] = "mask"; static const char __pyx_k_name[] = "__name__"; static const char __pyx_k_test[] = "__test__"; static const char __pyx_k_range[] = "range"; static const char __pyx_k_in_buf[] = "in_buf"; static const char __pyx_k_data_len[] = "data_len"; static const char __pyx_k_mask_buf[] = "mask_buf"; static const char __pyx_k_uint32_msk[] = "uint32_msk"; static const char __pyx_k_uint64_msk[] = "uint64_msk"; static const char __pyx_k_aiohttp__websocket[] = "aiohttp._websocket"; static const char __pyx_k_cline_in_traceback[] = "cline_in_traceback"; static const char __pyx_k_websocket_mask_cython[] = "_websocket_mask_cython"; static const char __pyx_k_aiohttp__websocket_pyx[] = "aiohttp/_websocket.pyx"; static PyObject *__pyx_n_s_aiohttp__websocket; static PyObject *__pyx_kp_s_aiohttp__websocket_pyx; static PyObject *__pyx_n_s_cline_in_traceback; static PyObject *__pyx_n_s_data; static PyObject *__pyx_n_s_data_len; static PyObject *__pyx_n_s_i; static PyObject *__pyx_n_s_in_buf; static PyObject *__pyx_n_s_main; static PyObject *__pyx_n_s_mask; static PyObject *__pyx_n_s_mask_buf; static PyObject *__pyx_n_s_name; static PyObject *__pyx_n_s_range; static PyObject *__pyx_n_s_test; static PyObject *__pyx_n_s_uint32_msk; static PyObject *__pyx_n_s_uint64_msk; static PyObject *__pyx_n_s_websocket_mask_cython; static PyObject *__pyx_pf_7aiohttp_10_websocket__websocket_mask_cython(CYTHON_UNUSED PyObject *__pyx_self, PyObject *__pyx_v_mask, PyObject *__pyx_v_data); /* proto */ static PyObject *__pyx_tuple_; static PyObject *__pyx_codeobj__2; /* Late includes */ /* "aiohttp/_websocket.pyx":9 * from libc.stdint cimport uint32_t, uint64_t, uintmax_t * * def _websocket_mask_cython(object mask, object data): # <<<<<<<<<<<<<< * """Note, this function mutates its `data` argument * """ */ /* Python wrapper */ static PyObject *__pyx_pw_7aiohttp_10_websocket_1_websocket_mask_cython(PyObject *__pyx_self, PyObject *__pyx_args, PyObject *__pyx_kwds); /*proto*/ static char __pyx_doc_7aiohttp_10_websocket__websocket_mask_cython[] = "Note, this function mutates its `data` argument\n "; static PyMethodDef __pyx_mdef_7aiohttp_10_websocket_1_websocket_mask_cython = {"_websocket_mask_cython", (PyCFunction)(void*)(PyCFunctionWithKeywords)__pyx_pw_7aiohttp_10_websocket_1_websocket_mask_cython, METH_VARARGS|METH_KEYWORDS, __pyx_doc_7aiohttp_10_websocket__websocket_mask_cython}; static PyObject *__pyx_pw_7aiohttp_10_websocket_1_websocket_mask_cython(PyObject *__pyx_self, PyObject *__pyx_args, PyObject *__pyx_kwds) { PyObject *__pyx_v_mask = 0; PyObject 
*__pyx_v_data = 0; PyObject *__pyx_r = 0; __Pyx_RefNannyDeclarations __Pyx_RefNannySetupContext("_websocket_mask_cython (wrapper)", 0); { static PyObject **__pyx_pyargnames[] = {&__pyx_n_s_mask,&__pyx_n_s_data,0}; PyObject* values[2] = {0,0}; if (unlikely(__pyx_kwds)) { Py_ssize_t kw_args; const Py_ssize_t pos_args = PyTuple_GET_SIZE(__pyx_args); switch (pos_args) { case 2: values[1] = PyTuple_GET_ITEM(__pyx_args, 1); CYTHON_FALLTHROUGH; case 1: values[0] = PyTuple_GET_ITEM(__pyx_args, 0); CYTHON_FALLTHROUGH; case 0: break; default: goto __pyx_L5_argtuple_error; } kw_args = PyDict_Size(__pyx_kwds); switch (pos_args) { case 0: if (likely((values[0] = __Pyx_PyDict_GetItemStr(__pyx_kwds, __pyx_n_s_mask)) != 0)) kw_args--; else goto __pyx_L5_argtuple_error; CYTHON_FALLTHROUGH; case 1: if (likely((values[1] = __Pyx_PyDict_GetItemStr(__pyx_kwds, __pyx_n_s_data)) != 0)) kw_args--; else { __Pyx_RaiseArgtupleInvalid("_websocket_mask_cython", 1, 2, 2, 1); __PYX_ERR(0, 9, __pyx_L3_error) } } if (unlikely(kw_args > 0)) { if (unlikely(__Pyx_ParseOptionalKeywords(__pyx_kwds, __pyx_pyargnames, 0, values, pos_args, "_websocket_mask_cython") < 0)) __PYX_ERR(0, 9, __pyx_L3_error) } } else if (PyTuple_GET_SIZE(__pyx_args) != 2) { goto __pyx_L5_argtuple_error; } else { values[0] = PyTuple_GET_ITEM(__pyx_args, 0); values[1] = PyTuple_GET_ITEM(__pyx_args, 1); } __pyx_v_mask = values[0]; __pyx_v_data = values[1]; } goto __pyx_L4_argument_unpacking_done; __pyx_L5_argtuple_error:; __Pyx_RaiseArgtupleInvalid("_websocket_mask_cython", 1, 2, 2, PyTuple_GET_SIZE(__pyx_args)); __PYX_ERR(0, 9, __pyx_L3_error) __pyx_L3_error:; __Pyx_AddTraceback("aiohttp._websocket._websocket_mask_cython", __pyx_clineno, __pyx_lineno, __pyx_filename); __Pyx_RefNannyFinishContext(); return NULL; __pyx_L4_argument_unpacking_done:; __pyx_r = __pyx_pf_7aiohttp_10_websocket__websocket_mask_cython(__pyx_self, __pyx_v_mask, __pyx_v_data); /* function exit code */ __Pyx_RefNannyFinishContext(); return __pyx_r; } static PyObject *__pyx_pf_7aiohttp_10_websocket__websocket_mask_cython(CYTHON_UNUSED PyObject *__pyx_self, PyObject *__pyx_v_mask, PyObject *__pyx_v_data) { Py_ssize_t __pyx_v_data_len; Py_ssize_t __pyx_v_i; unsigned char *__pyx_v_in_buf; unsigned char const *__pyx_v_mask_buf; uint32_t __pyx_v_uint32_msk; uint64_t __pyx_v_uint64_msk; PyObject *__pyx_r = NULL; __Pyx_RefNannyDeclarations Py_ssize_t __pyx_t_1; int __pyx_t_2; int __pyx_t_3; PyObject *__pyx_t_4 = NULL; char *__pyx_t_5; uint64_t *__pyx_t_6; long __pyx_t_7; uint32_t *__pyx_t_8; Py_ssize_t __pyx_t_9; Py_ssize_t __pyx_t_10; Py_ssize_t __pyx_t_11; __Pyx_RefNannySetupContext("_websocket_mask_cython", 0); __Pyx_INCREF(__pyx_v_mask); __Pyx_INCREF(__pyx_v_data); /* "aiohttp/_websocket.pyx":20 * uint64_t uint64_msk * * assert len(mask) == 4 # <<<<<<<<<<<<<< * * if not isinstance(mask, bytes): */ #ifndef CYTHON_WITHOUT_ASSERTIONS if (unlikely(!Py_OptimizeFlag)) { __pyx_t_1 = PyObject_Length(__pyx_v_mask); if (unlikely(__pyx_t_1 == ((Py_ssize_t)-1))) __PYX_ERR(0, 20, __pyx_L1_error) if (unlikely(!((__pyx_t_1 == 4) != 0))) { PyErr_SetNone(PyExc_AssertionError); __PYX_ERR(0, 20, __pyx_L1_error) } } #endif /* "aiohttp/_websocket.pyx":22 * assert len(mask) == 4 * * if not isinstance(mask, bytes): # <<<<<<<<<<<<<< * mask = bytes(mask) * */ __pyx_t_2 = PyBytes_Check(__pyx_v_mask); __pyx_t_3 = ((!(__pyx_t_2 != 0)) != 0); if (__pyx_t_3) { /* "aiohttp/_websocket.pyx":23 * * if not isinstance(mask, bytes): * mask = bytes(mask) # <<<<<<<<<<<<<< * * if isinstance(data, bytearray): */ __pyx_t_4 = 
__Pyx_PyObject_CallOneArg(((PyObject *)(&PyBytes_Type)), __pyx_v_mask); if (unlikely(!__pyx_t_4)) __PYX_ERR(0, 23, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_4); __Pyx_DECREF_SET(__pyx_v_mask, __pyx_t_4); __pyx_t_4 = 0; /* "aiohttp/_websocket.pyx":22 * assert len(mask) == 4 * * if not isinstance(mask, bytes): # <<<<<<<<<<<<<< * mask = bytes(mask) * */ } /* "aiohttp/_websocket.pyx":25 * mask = bytes(mask) * * if isinstance(data, bytearray): # <<<<<<<<<<<<<< * data = data * else: */ __pyx_t_3 = PyByteArray_Check(__pyx_v_data); __pyx_t_2 = (__pyx_t_3 != 0); if (__pyx_t_2) { /* "aiohttp/_websocket.pyx":26 * * if isinstance(data, bytearray): * data = data # <<<<<<<<<<<<<< * else: * data = bytearray(data) */ __pyx_t_4 = __pyx_v_data; __Pyx_INCREF(__pyx_t_4); __Pyx_DECREF_SET(__pyx_v_data, __pyx_t_4); __pyx_t_4 = 0; /* "aiohttp/_websocket.pyx":25 * mask = bytes(mask) * * if isinstance(data, bytearray): # <<<<<<<<<<<<<< * data = data * else: */ goto __pyx_L4; } /* "aiohttp/_websocket.pyx":28 * data = data * else: * data = bytearray(data) # <<<<<<<<<<<<<< * * data_len = len(data) */ /*else*/ { __pyx_t_4 = __Pyx_PyObject_CallOneArg(((PyObject *)(&PyByteArray_Type)), __pyx_v_data); if (unlikely(!__pyx_t_4)) __PYX_ERR(0, 28, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_4); __Pyx_DECREF_SET(__pyx_v_data, __pyx_t_4); __pyx_t_4 = 0; } __pyx_L4:; /* "aiohttp/_websocket.pyx":30 * data = bytearray(data) * * data_len = len(data) # <<<<<<<<<<<<<< * in_buf = PyByteArray_AsString(data) * mask_buf = PyBytes_AsString(mask) */ __pyx_t_1 = PyObject_Length(__pyx_v_data); if (unlikely(__pyx_t_1 == ((Py_ssize_t)-1))) __PYX_ERR(0, 30, __pyx_L1_error) __pyx_v_data_len = __pyx_t_1; /* "aiohttp/_websocket.pyx":31 * * data_len = len(data) * in_buf = PyByteArray_AsString(data) # <<<<<<<<<<<<<< * mask_buf = PyBytes_AsString(mask) * uint32_msk = (mask_buf)[0] */ if (!(likely(PyByteArray_CheckExact(__pyx_v_data))||((__pyx_v_data) == Py_None)||(PyErr_Format(PyExc_TypeError, "Expected %.16s, got %.200s", "bytearray", Py_TYPE(__pyx_v_data)->tp_name), 0))) __PYX_ERR(0, 31, __pyx_L1_error) __pyx_t_5 = PyByteArray_AsString(((PyObject*)__pyx_v_data)); if (unlikely(__pyx_t_5 == ((char *)NULL))) __PYX_ERR(0, 31, __pyx_L1_error) __pyx_v_in_buf = ((unsigned char *)__pyx_t_5); /* "aiohttp/_websocket.pyx":32 * data_len = len(data) * in_buf = PyByteArray_AsString(data) * mask_buf = PyBytes_AsString(mask) # <<<<<<<<<<<<<< * uint32_msk = (mask_buf)[0] * */ __pyx_t_5 = PyBytes_AsString(__pyx_v_mask); if (unlikely(__pyx_t_5 == ((char *)NULL))) __PYX_ERR(0, 32, __pyx_L1_error) __pyx_v_mask_buf = ((unsigned char const *)__pyx_t_5); /* "aiohttp/_websocket.pyx":33 * in_buf = PyByteArray_AsString(data) * mask_buf = PyBytes_AsString(mask) * uint32_msk = (mask_buf)[0] # <<<<<<<<<<<<<< * * # TODO: align in_data ptr to achieve even faster speeds */ __pyx_v_uint32_msk = (((uint32_t *)__pyx_v_mask_buf)[0]); /* "aiohttp/_websocket.pyx":38 * # does it need in python ?! 
malloc() always aligns to sizeof(long) bytes * * if sizeof(size_t) >= 8: # <<<<<<<<<<<<<< * uint64_msk = uint32_msk * uint64_msk = (uint64_msk << 32) | uint32_msk */ __pyx_t_2 = (((sizeof(size_t)) >= 8) != 0); if (__pyx_t_2) { /* "aiohttp/_websocket.pyx":39 * * if sizeof(size_t) >= 8: * uint64_msk = uint32_msk # <<<<<<<<<<<<<< * uint64_msk = (uint64_msk << 32) | uint32_msk * */ __pyx_v_uint64_msk = __pyx_v_uint32_msk; /* "aiohttp/_websocket.pyx":40 * if sizeof(size_t) >= 8: * uint64_msk = uint32_msk * uint64_msk = (uint64_msk << 32) | uint32_msk # <<<<<<<<<<<<<< * * while data_len >= 8: */ __pyx_v_uint64_msk = ((__pyx_v_uint64_msk << 32) | __pyx_v_uint32_msk); /* "aiohttp/_websocket.pyx":42 * uint64_msk = (uint64_msk << 32) | uint32_msk * * while data_len >= 8: # <<<<<<<<<<<<<< * (in_buf)[0] ^= uint64_msk * in_buf += 8 */ while (1) { __pyx_t_2 = ((__pyx_v_data_len >= 8) != 0); if (!__pyx_t_2) break; /* "aiohttp/_websocket.pyx":43 * * while data_len >= 8: * (in_buf)[0] ^= uint64_msk # <<<<<<<<<<<<<< * in_buf += 8 * data_len -= 8 */ __pyx_t_6 = ((uint64_t *)__pyx_v_in_buf); __pyx_t_7 = 0; (__pyx_t_6[__pyx_t_7]) = ((__pyx_t_6[__pyx_t_7]) ^ __pyx_v_uint64_msk); /* "aiohttp/_websocket.pyx":44 * while data_len >= 8: * (in_buf)[0] ^= uint64_msk * in_buf += 8 # <<<<<<<<<<<<<< * data_len -= 8 * */ __pyx_v_in_buf = (__pyx_v_in_buf + 8); /* "aiohttp/_websocket.pyx":45 * (in_buf)[0] ^= uint64_msk * in_buf += 8 * data_len -= 8 # <<<<<<<<<<<<<< * * */ __pyx_v_data_len = (__pyx_v_data_len - 8); } /* "aiohttp/_websocket.pyx":38 * # does it need in python ?! malloc() always aligns to sizeof(long) bytes * * if sizeof(size_t) >= 8: # <<<<<<<<<<<<<< * uint64_msk = uint32_msk * uint64_msk = (uint64_msk << 32) | uint32_msk */ } /* "aiohttp/_websocket.pyx":48 * * * while data_len >= 4: # <<<<<<<<<<<<<< * (in_buf)[0] ^= uint32_msk * in_buf += 4 */ while (1) { __pyx_t_2 = ((__pyx_v_data_len >= 4) != 0); if (!__pyx_t_2) break; /* "aiohttp/_websocket.pyx":49 * * while data_len >= 4: * (in_buf)[0] ^= uint32_msk # <<<<<<<<<<<<<< * in_buf += 4 * data_len -= 4 */ __pyx_t_8 = ((uint32_t *)__pyx_v_in_buf); __pyx_t_7 = 0; (__pyx_t_8[__pyx_t_7]) = ((__pyx_t_8[__pyx_t_7]) ^ __pyx_v_uint32_msk); /* "aiohttp/_websocket.pyx":50 * while data_len >= 4: * (in_buf)[0] ^= uint32_msk * in_buf += 4 # <<<<<<<<<<<<<< * data_len -= 4 * */ __pyx_v_in_buf = (__pyx_v_in_buf + 4); /* "aiohttp/_websocket.pyx":51 * (in_buf)[0] ^= uint32_msk * in_buf += 4 * data_len -= 4 # <<<<<<<<<<<<<< * * for i in range(0, data_len): */ __pyx_v_data_len = (__pyx_v_data_len - 4); } /* "aiohttp/_websocket.pyx":53 * data_len -= 4 * * for i in range(0, data_len): # <<<<<<<<<<<<<< * in_buf[i] ^= mask_buf[i] */ __pyx_t_1 = __pyx_v_data_len; __pyx_t_9 = __pyx_t_1; for (__pyx_t_10 = 0; __pyx_t_10 < __pyx_t_9; __pyx_t_10+=1) { __pyx_v_i = __pyx_t_10; /* "aiohttp/_websocket.pyx":54 * * for i in range(0, data_len): * in_buf[i] ^= mask_buf[i] # <<<<<<<<<<<<<< */ __pyx_t_11 = __pyx_v_i; (__pyx_v_in_buf[__pyx_t_11]) = ((__pyx_v_in_buf[__pyx_t_11]) ^ (__pyx_v_mask_buf[__pyx_v_i])); } /* "aiohttp/_websocket.pyx":9 * from libc.stdint cimport uint32_t, uint64_t, uintmax_t * * def _websocket_mask_cython(object mask, object data): # <<<<<<<<<<<<<< * """Note, this function mutates its `data` argument * """ */ /* function exit code */ __pyx_r = Py_None; __Pyx_INCREF(Py_None); goto __pyx_L0; __pyx_L1_error:; __Pyx_XDECREF(__pyx_t_4); __Pyx_AddTraceback("aiohttp._websocket._websocket_mask_cython", __pyx_clineno, __pyx_lineno, __pyx_filename); __pyx_r = NULL; __pyx_L0:; 
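/* Illustrative note (not part of the Cython-generated output): the wrapper
   above implements the RFC 6455 masking step. It XORs every byte of `data`
   in place with the repeating 4-byte `mask`: first in 8-byte strides when
   sizeof(size_t) >= 8 (using the mask doubled into a 64-bit word), then in
   4-byte strides, and finally byte by byte for the 1-3 trailing bytes. At the
   Python level, e.g. _websocket_mask_cython(b"\x01\x02\x03\x04",
   bytearray(b"payload")) mutates the bytearray argument and returns None.
   A minimal plain-C sketch of the same idea, with hypothetical names and
   excluded from compilation, is shown below; it uses memcpy instead of the
   pointer casts above to sidestep the unaligned-access assumption. */
#if 0
static void websocket_mask_sketch(unsigned char *buf, Py_ssize_t len,
                                  const unsigned char mask[4])
{
    Py_ssize_t i;
    uint32_t m32;
    memcpy(&m32, mask, 4);          /* mask as a 32-bit word, in memory order */
    while (len >= 4) {              /* XOR whole 4-byte words */
        uint32_t word;
        memcpy(&word, buf, 4);
        word ^= m32;
        memcpy(buf, &word, 4);
        buf += 4;
        len -= 4;
    }
    for (i = 0; i < len; i++)       /* XOR the trailing 1-3 bytes */
        buf[i] ^= mask[i];
}
#endif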
__Pyx_XDECREF(__pyx_v_mask); __Pyx_XDECREF(__pyx_v_data); __Pyx_XGIVEREF(__pyx_r); __Pyx_RefNannyFinishContext(); return __pyx_r; } static PyMethodDef __pyx_methods[] = { {0, 0, 0, 0} }; #if PY_MAJOR_VERSION >= 3 #if CYTHON_PEP489_MULTI_PHASE_INIT static PyObject* __pyx_pymod_create(PyObject *spec, PyModuleDef *def); /*proto*/ static int __pyx_pymod_exec__websocket(PyObject* module); /*proto*/ static PyModuleDef_Slot __pyx_moduledef_slots[] = { {Py_mod_create, (void*)__pyx_pymod_create}, {Py_mod_exec, (void*)__pyx_pymod_exec__websocket}, {0, NULL} }; #endif static struct PyModuleDef __pyx_moduledef = { PyModuleDef_HEAD_INIT, "_websocket", 0, /* m_doc */ #if CYTHON_PEP489_MULTI_PHASE_INIT 0, /* m_size */ #else -1, /* m_size */ #endif __pyx_methods /* m_methods */, #if CYTHON_PEP489_MULTI_PHASE_INIT __pyx_moduledef_slots, /* m_slots */ #else NULL, /* m_reload */ #endif NULL, /* m_traverse */ NULL, /* m_clear */ NULL /* m_free */ }; #endif #ifndef CYTHON_SMALL_CODE #if defined(__clang__) #define CYTHON_SMALL_CODE #elif defined(__GNUC__) && (__GNUC__ > 4 || (__GNUC__ == 4 && __GNUC_MINOR__ >= 3)) #define CYTHON_SMALL_CODE __attribute__((cold)) #else #define CYTHON_SMALL_CODE #endif #endif static __Pyx_StringTabEntry __pyx_string_tab[] = { {&__pyx_n_s_aiohttp__websocket, __pyx_k_aiohttp__websocket, sizeof(__pyx_k_aiohttp__websocket), 0, 0, 1, 1}, {&__pyx_kp_s_aiohttp__websocket_pyx, __pyx_k_aiohttp__websocket_pyx, sizeof(__pyx_k_aiohttp__websocket_pyx), 0, 0, 1, 0}, {&__pyx_n_s_cline_in_traceback, __pyx_k_cline_in_traceback, sizeof(__pyx_k_cline_in_traceback), 0, 0, 1, 1}, {&__pyx_n_s_data, __pyx_k_data, sizeof(__pyx_k_data), 0, 0, 1, 1}, {&__pyx_n_s_data_len, __pyx_k_data_len, sizeof(__pyx_k_data_len), 0, 0, 1, 1}, {&__pyx_n_s_i, __pyx_k_i, sizeof(__pyx_k_i), 0, 0, 1, 1}, {&__pyx_n_s_in_buf, __pyx_k_in_buf, sizeof(__pyx_k_in_buf), 0, 0, 1, 1}, {&__pyx_n_s_main, __pyx_k_main, sizeof(__pyx_k_main), 0, 0, 1, 1}, {&__pyx_n_s_mask, __pyx_k_mask, sizeof(__pyx_k_mask), 0, 0, 1, 1}, {&__pyx_n_s_mask_buf, __pyx_k_mask_buf, sizeof(__pyx_k_mask_buf), 0, 0, 1, 1}, {&__pyx_n_s_name, __pyx_k_name, sizeof(__pyx_k_name), 0, 0, 1, 1}, {&__pyx_n_s_range, __pyx_k_range, sizeof(__pyx_k_range), 0, 0, 1, 1}, {&__pyx_n_s_test, __pyx_k_test, sizeof(__pyx_k_test), 0, 0, 1, 1}, {&__pyx_n_s_uint32_msk, __pyx_k_uint32_msk, sizeof(__pyx_k_uint32_msk), 0, 0, 1, 1}, {&__pyx_n_s_uint64_msk, __pyx_k_uint64_msk, sizeof(__pyx_k_uint64_msk), 0, 0, 1, 1}, {&__pyx_n_s_websocket_mask_cython, __pyx_k_websocket_mask_cython, sizeof(__pyx_k_websocket_mask_cython), 0, 0, 1, 1}, {0, 0, 0, 0, 0, 0, 0} }; static CYTHON_SMALL_CODE int __Pyx_InitCachedBuiltins(void) { __pyx_builtin_range = __Pyx_GetBuiltinName(__pyx_n_s_range); if (!__pyx_builtin_range) __PYX_ERR(0, 53, __pyx_L1_error) return 0; __pyx_L1_error:; return -1; } static CYTHON_SMALL_CODE int __Pyx_InitCachedConstants(void) { __Pyx_RefNannyDeclarations __Pyx_RefNannySetupContext("__Pyx_InitCachedConstants", 0); /* "aiohttp/_websocket.pyx":9 * from libc.stdint cimport uint32_t, uint64_t, uintmax_t * * def _websocket_mask_cython(object mask, object data): # <<<<<<<<<<<<<< * """Note, this function mutates its `data` argument * """ */ __pyx_tuple_ = PyTuple_Pack(8, __pyx_n_s_mask, __pyx_n_s_data, __pyx_n_s_data_len, __pyx_n_s_i, __pyx_n_s_in_buf, __pyx_n_s_mask_buf, __pyx_n_s_uint32_msk, __pyx_n_s_uint64_msk); if (unlikely(!__pyx_tuple_)) __PYX_ERR(0, 9, __pyx_L1_error) __Pyx_GOTREF(__pyx_tuple_); __Pyx_GIVEREF(__pyx_tuple_); __pyx_codeobj__2 = (PyObject*)__Pyx_PyCode_New(2, 0, 8, 0, 
CO_OPTIMIZED|CO_NEWLOCALS, __pyx_empty_bytes, __pyx_empty_tuple, __pyx_empty_tuple, __pyx_tuple_, __pyx_empty_tuple, __pyx_empty_tuple, __pyx_kp_s_aiohttp__websocket_pyx, __pyx_n_s_websocket_mask_cython, 9, __pyx_empty_bytes); if (unlikely(!__pyx_codeobj__2)) __PYX_ERR(0, 9, __pyx_L1_error) __Pyx_RefNannyFinishContext(); return 0; __pyx_L1_error:; __Pyx_RefNannyFinishContext(); return -1; } static CYTHON_SMALL_CODE int __Pyx_InitGlobals(void) { if (__Pyx_InitStrings(__pyx_string_tab) < 0) __PYX_ERR(0, 1, __pyx_L1_error); return 0; __pyx_L1_error:; return -1; } static CYTHON_SMALL_CODE int __Pyx_modinit_global_init_code(void); /*proto*/ static CYTHON_SMALL_CODE int __Pyx_modinit_variable_export_code(void); /*proto*/ static CYTHON_SMALL_CODE int __Pyx_modinit_function_export_code(void); /*proto*/ static CYTHON_SMALL_CODE int __Pyx_modinit_type_init_code(void); /*proto*/ static CYTHON_SMALL_CODE int __Pyx_modinit_type_import_code(void); /*proto*/ static CYTHON_SMALL_CODE int __Pyx_modinit_variable_import_code(void); /*proto*/ static CYTHON_SMALL_CODE int __Pyx_modinit_function_import_code(void); /*proto*/ static int __Pyx_modinit_global_init_code(void) { __Pyx_RefNannyDeclarations __Pyx_RefNannySetupContext("__Pyx_modinit_global_init_code", 0); /*--- Global init code ---*/ __Pyx_RefNannyFinishContext(); return 0; } static int __Pyx_modinit_variable_export_code(void) { __Pyx_RefNannyDeclarations __Pyx_RefNannySetupContext("__Pyx_modinit_variable_export_code", 0); /*--- Variable export code ---*/ __Pyx_RefNannyFinishContext(); return 0; } static int __Pyx_modinit_function_export_code(void) { __Pyx_RefNannyDeclarations __Pyx_RefNannySetupContext("__Pyx_modinit_function_export_code", 0); /*--- Function export code ---*/ __Pyx_RefNannyFinishContext(); return 0; } static int __Pyx_modinit_type_init_code(void) { __Pyx_RefNannyDeclarations __Pyx_RefNannySetupContext("__Pyx_modinit_type_init_code", 0); /*--- Type init code ---*/ __Pyx_RefNannyFinishContext(); return 0; } static int __Pyx_modinit_type_import_code(void) { __Pyx_RefNannyDeclarations PyObject *__pyx_t_1 = NULL; __Pyx_RefNannySetupContext("__Pyx_modinit_type_import_code", 0); /*--- Type import code ---*/ __pyx_t_1 = PyImport_ImportModule(__Pyx_BUILTIN_MODULE_NAME); if (unlikely(!__pyx_t_1)) __PYX_ERR(1, 9, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_1); __pyx_ptype_7cpython_4type_type = __Pyx_ImportType(__pyx_t_1, __Pyx_BUILTIN_MODULE_NAME, "type", #if defined(PYPY_VERSION_NUM) && PYPY_VERSION_NUM < 0x050B0000 sizeof(PyTypeObject), #else sizeof(PyHeapTypeObject), #endif __Pyx_ImportType_CheckSize_Warn); if (!__pyx_ptype_7cpython_4type_type) __PYX_ERR(1, 9, __pyx_L1_error) __Pyx_DECREF(__pyx_t_1); __pyx_t_1 = 0; __pyx_t_1 = PyImport_ImportModule(__Pyx_BUILTIN_MODULE_NAME); if (unlikely(!__pyx_t_1)) __PYX_ERR(2, 8, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_1); __pyx_ptype_7cpython_4bool_bool = __Pyx_ImportType(__pyx_t_1, __Pyx_BUILTIN_MODULE_NAME, "bool", sizeof(PyBoolObject), __Pyx_ImportType_CheckSize_Warn); if (!__pyx_ptype_7cpython_4bool_bool) __PYX_ERR(2, 8, __pyx_L1_error) __Pyx_DECREF(__pyx_t_1); __pyx_t_1 = 0; __pyx_t_1 = PyImport_ImportModule(__Pyx_BUILTIN_MODULE_NAME); if (unlikely(!__pyx_t_1)) __PYX_ERR(3, 15, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_1); __pyx_ptype_7cpython_7complex_complex = __Pyx_ImportType(__pyx_t_1, __Pyx_BUILTIN_MODULE_NAME, "complex", sizeof(PyComplexObject), __Pyx_ImportType_CheckSize_Warn); if (!__pyx_ptype_7cpython_7complex_complex) __PYX_ERR(3, 15, __pyx_L1_error) __Pyx_DECREF(__pyx_t_1); __pyx_t_1 = 0; 
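/* Illustrative note (not part of the Cython-generated output): the three
   __Pyx_ImportType() calls above re-import the builtin `type`, `bool` and
   `complex` type objects declared by the cimported cpython.* pxd files, and
   __Pyx_ImportType_CheckSize_Warn makes a benign mismatch between the
   compile-time struct size and the runtime type's size a warning rather than
   a hard import failure. */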
__Pyx_RefNannyFinishContext(); return 0; __pyx_L1_error:; __Pyx_XDECREF(__pyx_t_1); __Pyx_RefNannyFinishContext(); return -1; } static int __Pyx_modinit_variable_import_code(void) { __Pyx_RefNannyDeclarations __Pyx_RefNannySetupContext("__Pyx_modinit_variable_import_code", 0); /*--- Variable import code ---*/ __Pyx_RefNannyFinishContext(); return 0; } static int __Pyx_modinit_function_import_code(void) { __Pyx_RefNannyDeclarations __Pyx_RefNannySetupContext("__Pyx_modinit_function_import_code", 0); /*--- Function import code ---*/ __Pyx_RefNannyFinishContext(); return 0; } #if PY_MAJOR_VERSION < 3 #ifdef CYTHON_NO_PYINIT_EXPORT #define __Pyx_PyMODINIT_FUNC void #else #define __Pyx_PyMODINIT_FUNC PyMODINIT_FUNC #endif #else #ifdef CYTHON_NO_PYINIT_EXPORT #define __Pyx_PyMODINIT_FUNC PyObject * #else #define __Pyx_PyMODINIT_FUNC PyMODINIT_FUNC #endif #endif #if PY_MAJOR_VERSION < 3 __Pyx_PyMODINIT_FUNC init_websocket(void) CYTHON_SMALL_CODE; /*proto*/ __Pyx_PyMODINIT_FUNC init_websocket(void) #else __Pyx_PyMODINIT_FUNC PyInit__websocket(void) CYTHON_SMALL_CODE; /*proto*/ __Pyx_PyMODINIT_FUNC PyInit__websocket(void) #if CYTHON_PEP489_MULTI_PHASE_INIT { return PyModuleDef_Init(&__pyx_moduledef); } static CYTHON_SMALL_CODE int __Pyx_check_single_interpreter(void) { #if PY_VERSION_HEX >= 0x030700A1 static PY_INT64_T main_interpreter_id = -1; PY_INT64_T current_id = PyInterpreterState_GetID(PyThreadState_Get()->interp); if (main_interpreter_id == -1) { main_interpreter_id = current_id; return (unlikely(current_id == -1)) ? -1 : 0; } else if (unlikely(main_interpreter_id != current_id)) #else static PyInterpreterState *main_interpreter = NULL; PyInterpreterState *current_interpreter = PyThreadState_Get()->interp; if (!main_interpreter) { main_interpreter = current_interpreter; } else if (unlikely(main_interpreter != current_interpreter)) #endif { PyErr_SetString( PyExc_ImportError, "Interpreter change detected - this module can only be loaded into one interpreter per process."); return -1; } return 0; } static CYTHON_SMALL_CODE int __Pyx_copy_spec_to_module(PyObject *spec, PyObject *moddict, const char* from_name, const char* to_name, int allow_none) { PyObject *value = PyObject_GetAttrString(spec, from_name); int result = 0; if (likely(value)) { if (allow_none || value != Py_None) { result = PyDict_SetItemString(moddict, to_name, value); } Py_DECREF(value); } else if (PyErr_ExceptionMatches(PyExc_AttributeError)) { PyErr_Clear(); } else { result = -1; } return result; } static CYTHON_SMALL_CODE PyObject* __pyx_pymod_create(PyObject *spec, CYTHON_UNUSED PyModuleDef *def) { PyObject *module = NULL, *moddict, *modname; if (__Pyx_check_single_interpreter()) return NULL; if (__pyx_m) return __Pyx_NewRef(__pyx_m); modname = PyObject_GetAttrString(spec, "name"); if (unlikely(!modname)) goto bad; module = PyModule_NewObject(modname); Py_DECREF(modname); if (unlikely(!module)) goto bad; moddict = PyModule_GetDict(module); if (unlikely(!moddict)) goto bad; if (unlikely(__Pyx_copy_spec_to_module(spec, moddict, "loader", "__loader__", 1) < 0)) goto bad; if (unlikely(__Pyx_copy_spec_to_module(spec, moddict, "origin", "__file__", 1) < 0)) goto bad; if (unlikely(__Pyx_copy_spec_to_module(spec, moddict, "parent", "__package__", 1) < 0)) goto bad; if (unlikely(__Pyx_copy_spec_to_module(spec, moddict, "submodule_search_locations", "__path__", 0) < 0)) goto bad; return module; bad: Py_XDECREF(module); return NULL; } static CYTHON_SMALL_CODE int __pyx_pymod_exec__websocket(PyObject *__pyx_pyinit_module) #endif 
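/* Illustrative note (not part of the Cython-generated output): depending on
   CYTHON_PEP489_MULTI_PHASE_INIT, the entry point declared above is either
   the classic single-phase init function (init_websocket on Python 2,
   PyInit__websocket returning the finished module on Python 3) or, under
   PEP 489 multi-phase init, PyInit__websocket only returns the PyModuleDef;
   __pyx_pymod_create then builds the module from its spec and the body that
   follows runs later as the Py_mod_exec slot, __pyx_pymod_exec__websocket. */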
#endif { PyObject *__pyx_t_1 = NULL; __Pyx_RefNannyDeclarations #if CYTHON_PEP489_MULTI_PHASE_INIT if (__pyx_m) { if (__pyx_m == __pyx_pyinit_module) return 0; PyErr_SetString(PyExc_RuntimeError, "Module '_websocket' has already been imported. Re-initialisation is not supported."); return -1; } #elif PY_MAJOR_VERSION >= 3 if (__pyx_m) return __Pyx_NewRef(__pyx_m); #endif #if CYTHON_REFNANNY __Pyx_RefNanny = __Pyx_RefNannyImportAPI("refnanny"); if (!__Pyx_RefNanny) { PyErr_Clear(); __Pyx_RefNanny = __Pyx_RefNannyImportAPI("Cython.Runtime.refnanny"); if (!__Pyx_RefNanny) Py_FatalError("failed to import 'refnanny' module"); } #endif __Pyx_RefNannySetupContext("__Pyx_PyMODINIT_FUNC PyInit__websocket(void)", 0); if (__Pyx_check_binary_version() < 0) __PYX_ERR(0, 1, __pyx_L1_error) #ifdef __Pxy_PyFrame_Initialize_Offsets __Pxy_PyFrame_Initialize_Offsets(); #endif __pyx_empty_tuple = PyTuple_New(0); if (unlikely(!__pyx_empty_tuple)) __PYX_ERR(0, 1, __pyx_L1_error) __pyx_empty_bytes = PyBytes_FromStringAndSize("", 0); if (unlikely(!__pyx_empty_bytes)) __PYX_ERR(0, 1, __pyx_L1_error) __pyx_empty_unicode = PyUnicode_FromStringAndSize("", 0); if (unlikely(!__pyx_empty_unicode)) __PYX_ERR(0, 1, __pyx_L1_error) #ifdef __Pyx_CyFunction_USED if (__pyx_CyFunction_init() < 0) __PYX_ERR(0, 1, __pyx_L1_error) #endif #ifdef __Pyx_FusedFunction_USED if (__pyx_FusedFunction_init() < 0) __PYX_ERR(0, 1, __pyx_L1_error) #endif #ifdef __Pyx_Coroutine_USED if (__pyx_Coroutine_init() < 0) __PYX_ERR(0, 1, __pyx_L1_error) #endif #ifdef __Pyx_Generator_USED if (__pyx_Generator_init() < 0) __PYX_ERR(0, 1, __pyx_L1_error) #endif #ifdef __Pyx_AsyncGen_USED if (__pyx_AsyncGen_init() < 0) __PYX_ERR(0, 1, __pyx_L1_error) #endif #ifdef __Pyx_StopAsyncIteration_USED if (__pyx_StopAsyncIteration_init() < 0) __PYX_ERR(0, 1, __pyx_L1_error) #endif /*--- Library function declarations ---*/ /*--- Threads initialization code ---*/ #if defined(__PYX_FORCE_INIT_THREADS) && __PYX_FORCE_INIT_THREADS #ifdef WITH_THREAD /* Python build with threading support? */ PyEval_InitThreads(); #endif #endif /*--- Module creation code ---*/ #if CYTHON_PEP489_MULTI_PHASE_INIT __pyx_m = __pyx_pyinit_module; Py_INCREF(__pyx_m); #else #if PY_MAJOR_VERSION < 3 __pyx_m = Py_InitModule4("_websocket", __pyx_methods, 0, 0, PYTHON_API_VERSION); Py_XINCREF(__pyx_m); #else __pyx_m = PyModule_Create(&__pyx_moduledef); #endif if (unlikely(!__pyx_m)) __PYX_ERR(0, 1, __pyx_L1_error) #endif __pyx_d = PyModule_GetDict(__pyx_m); if (unlikely(!__pyx_d)) __PYX_ERR(0, 1, __pyx_L1_error) Py_INCREF(__pyx_d); __pyx_b = PyImport_AddModule(__Pyx_BUILTIN_MODULE_NAME); if (unlikely(!__pyx_b)) __PYX_ERR(0, 1, __pyx_L1_error) Py_INCREF(__pyx_b); __pyx_cython_runtime = PyImport_AddModule((char *) "cython_runtime"); if (unlikely(!__pyx_cython_runtime)) __PYX_ERR(0, 1, __pyx_L1_error) Py_INCREF(__pyx_cython_runtime); if (PyObject_SetAttrString(__pyx_m, "__builtins__", __pyx_b) < 0) __PYX_ERR(0, 1, __pyx_L1_error); /*--- Initialize various global constants etc. 
---*/ if (__Pyx_InitGlobals() < 0) __PYX_ERR(0, 1, __pyx_L1_error) #if PY_MAJOR_VERSION < 3 && (__PYX_DEFAULT_STRING_ENCODING_IS_ASCII || __PYX_DEFAULT_STRING_ENCODING_IS_DEFAULT) if (__Pyx_init_sys_getdefaultencoding_params() < 0) __PYX_ERR(0, 1, __pyx_L1_error) #endif if (__pyx_module_is_main_aiohttp___websocket) { if (PyObject_SetAttr(__pyx_m, __pyx_n_s_name, __pyx_n_s_main) < 0) __PYX_ERR(0, 1, __pyx_L1_error) } #if PY_MAJOR_VERSION >= 3 { PyObject *modules = PyImport_GetModuleDict(); if (unlikely(!modules)) __PYX_ERR(0, 1, __pyx_L1_error) if (!PyDict_GetItemString(modules, "aiohttp._websocket")) { if (unlikely(PyDict_SetItemString(modules, "aiohttp._websocket", __pyx_m) < 0)) __PYX_ERR(0, 1, __pyx_L1_error) } } #endif /*--- Builtin init code ---*/ if (__Pyx_InitCachedBuiltins() < 0) goto __pyx_L1_error; /*--- Constants init code ---*/ if (__Pyx_InitCachedConstants() < 0) goto __pyx_L1_error; /*--- Global type/function init code ---*/ (void)__Pyx_modinit_global_init_code(); (void)__Pyx_modinit_variable_export_code(); (void)__Pyx_modinit_function_export_code(); (void)__Pyx_modinit_type_init_code(); if (unlikely(__Pyx_modinit_type_import_code() != 0)) goto __pyx_L1_error; (void)__Pyx_modinit_variable_import_code(); (void)__Pyx_modinit_function_import_code(); /*--- Execution code ---*/ #if defined(__Pyx_Generator_USED) || defined(__Pyx_Coroutine_USED) if (__Pyx_patch_abc() < 0) __PYX_ERR(0, 1, __pyx_L1_error) #endif /* "aiohttp/_websocket.pyx":9 * from libc.stdint cimport uint32_t, uint64_t, uintmax_t * * def _websocket_mask_cython(object mask, object data): # <<<<<<<<<<<<<< * """Note, this function mutates its `data` argument * """ */ __pyx_t_1 = PyCFunction_NewEx(&__pyx_mdef_7aiohttp_10_websocket_1_websocket_mask_cython, NULL, __pyx_n_s_aiohttp__websocket); if (unlikely(!__pyx_t_1)) __PYX_ERR(0, 9, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_1); if (PyDict_SetItem(__pyx_d, __pyx_n_s_websocket_mask_cython, __pyx_t_1) < 0) __PYX_ERR(0, 9, __pyx_L1_error) __Pyx_DECREF(__pyx_t_1); __pyx_t_1 = 0; /* "aiohttp/_websocket.pyx":1 * from cpython cimport PyBytes_AsString # <<<<<<<<<<<<<< * * #from cpython cimport PyByteArray_AsString # cython still not exports that */ __pyx_t_1 = __Pyx_PyDict_NewPresized(0); if (unlikely(!__pyx_t_1)) __PYX_ERR(0, 1, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_1); if (PyDict_SetItem(__pyx_d, __pyx_n_s_test, __pyx_t_1) < 0) __PYX_ERR(0, 1, __pyx_L1_error) __Pyx_DECREF(__pyx_t_1); __pyx_t_1 = 0; /*--- Wrapped vars code ---*/ goto __pyx_L0; __pyx_L1_error:; __Pyx_XDECREF(__pyx_t_1); if (__pyx_m) { if (__pyx_d) { __Pyx_AddTraceback("init aiohttp._websocket", __pyx_clineno, __pyx_lineno, __pyx_filename); } Py_CLEAR(__pyx_m); } else if (!PyErr_Occurred()) { PyErr_SetString(PyExc_ImportError, "init aiohttp._websocket"); } __pyx_L0:; __Pyx_RefNannyFinishContext(); #if CYTHON_PEP489_MULTI_PHASE_INIT return (__pyx_m != NULL) ? 
0 : -1; #elif PY_MAJOR_VERSION >= 3 return __pyx_m; #else return; #endif } /* --- Runtime support code --- */ /* Refnanny */ #if CYTHON_REFNANNY static __Pyx_RefNannyAPIStruct *__Pyx_RefNannyImportAPI(const char *modname) { PyObject *m = NULL, *p = NULL; void *r = NULL; m = PyImport_ImportModule(modname); if (!m) goto end; p = PyObject_GetAttrString(m, "RefNannyAPI"); if (!p) goto end; r = PyLong_AsVoidPtr(p); end: Py_XDECREF(p); Py_XDECREF(m); return (__Pyx_RefNannyAPIStruct *)r; } #endif /* PyObjectGetAttrStr */ #if CYTHON_USE_TYPE_SLOTS static CYTHON_INLINE PyObject* __Pyx_PyObject_GetAttrStr(PyObject* obj, PyObject* attr_name) { PyTypeObject* tp = Py_TYPE(obj); if (likely(tp->tp_getattro)) return tp->tp_getattro(obj, attr_name); #if PY_MAJOR_VERSION < 3 if (likely(tp->tp_getattr)) return tp->tp_getattr(obj, PyString_AS_STRING(attr_name)); #endif return PyObject_GetAttr(obj, attr_name); } #endif /* GetBuiltinName */ static PyObject *__Pyx_GetBuiltinName(PyObject *name) { PyObject* result = __Pyx_PyObject_GetAttrStr(__pyx_b, name); if (unlikely(!result)) { PyErr_Format(PyExc_NameError, #if PY_MAJOR_VERSION >= 3 "name '%U' is not defined", name); #else "name '%.200s' is not defined", PyString_AS_STRING(name)); #endif } return result; } /* RaiseArgTupleInvalid */ static void __Pyx_RaiseArgtupleInvalid( const char* func_name, int exact, Py_ssize_t num_min, Py_ssize_t num_max, Py_ssize_t num_found) { Py_ssize_t num_expected; const char *more_or_less; if (num_found < num_min) { num_expected = num_min; more_or_less = "at least"; } else { num_expected = num_max; more_or_less = "at most"; } if (exact) { more_or_less = "exactly"; } PyErr_Format(PyExc_TypeError, "%.200s() takes %.8s %" CYTHON_FORMAT_SSIZE_T "d positional argument%.1s (%" CYTHON_FORMAT_SSIZE_T "d given)", func_name, more_or_less, num_expected, (num_expected == 1) ? "" : "s", num_found); } /* RaiseDoubleKeywords */ static void __Pyx_RaiseDoubleKeywordsError( const char* func_name, PyObject* kw_name) { PyErr_Format(PyExc_TypeError, #if PY_MAJOR_VERSION >= 3 "%s() got multiple values for keyword argument '%U'", func_name, kw_name); #else "%s() got multiple values for keyword argument '%s'", func_name, PyString_AsString(kw_name)); #endif } /* ParseKeywords */ static int __Pyx_ParseOptionalKeywords( PyObject *kwds, PyObject **argnames[], PyObject *kwds2, PyObject *values[], Py_ssize_t num_pos_args, const char* function_name) { PyObject *key = 0, *value = 0; Py_ssize_t pos = 0; PyObject*** name; PyObject*** first_kw_arg = argnames + num_pos_args; while (PyDict_Next(kwds, &pos, &key, &value)) { name = first_kw_arg; while (*name && (**name != key)) name++; if (*name) { values[name-argnames] = value; continue; } name = first_kw_arg; #if PY_MAJOR_VERSION < 3 if (likely(PyString_CheckExact(key)) || likely(PyString_Check(key))) { while (*name) { if ((CYTHON_COMPILING_IN_PYPY || PyString_GET_SIZE(**name) == PyString_GET_SIZE(key)) && _PyString_Eq(**name, key)) { values[name-argnames] = value; break; } name++; } if (*name) continue; else { PyObject*** argname = argnames; while (argname != first_kw_arg) { if ((**argname == key) || ( (CYTHON_COMPILING_IN_PYPY || PyString_GET_SIZE(**argname) == PyString_GET_SIZE(key)) && _PyString_Eq(**argname, key))) { goto arg_passed_twice; } argname++; } } } else #endif if (likely(PyUnicode_Check(key))) { while (*name) { int cmp = (**name == key) ? 0 : #if !CYTHON_COMPILING_IN_PYPY && PY_MAJOR_VERSION >= 3 (PyUnicode_GET_SIZE(**name) != PyUnicode_GET_SIZE(key)) ? 
1 : #endif PyUnicode_Compare(**name, key); if (cmp < 0 && unlikely(PyErr_Occurred())) goto bad; if (cmp == 0) { values[name-argnames] = value; break; } name++; } if (*name) continue; else { PyObject*** argname = argnames; while (argname != first_kw_arg) { int cmp = (**argname == key) ? 0 : #if !CYTHON_COMPILING_IN_PYPY && PY_MAJOR_VERSION >= 3 (PyUnicode_GET_SIZE(**argname) != PyUnicode_GET_SIZE(key)) ? 1 : #endif PyUnicode_Compare(**argname, key); if (cmp < 0 && unlikely(PyErr_Occurred())) goto bad; if (cmp == 0) goto arg_passed_twice; argname++; } } } else goto invalid_keyword_type; if (kwds2) { if (unlikely(PyDict_SetItem(kwds2, key, value))) goto bad; } else { goto invalid_keyword; } } return 0; arg_passed_twice: __Pyx_RaiseDoubleKeywordsError(function_name, key); goto bad; invalid_keyword_type: PyErr_Format(PyExc_TypeError, "%.200s() keywords must be strings", function_name); goto bad; invalid_keyword: PyErr_Format(PyExc_TypeError, #if PY_MAJOR_VERSION < 3 "%.200s() got an unexpected keyword argument '%.200s'", function_name, PyString_AsString(key)); #else "%s() got an unexpected keyword argument '%U'", function_name, key); #endif bad: return -1; } /* PyCFunctionFastCall */ #if CYTHON_FAST_PYCCALL static CYTHON_INLINE PyObject * __Pyx_PyCFunction_FastCall(PyObject *func_obj, PyObject **args, Py_ssize_t nargs) { PyCFunctionObject *func = (PyCFunctionObject*)func_obj; PyCFunction meth = PyCFunction_GET_FUNCTION(func); PyObject *self = PyCFunction_GET_SELF(func); int flags = PyCFunction_GET_FLAGS(func); assert(PyCFunction_Check(func)); assert(METH_FASTCALL == (flags & ~(METH_CLASS | METH_STATIC | METH_COEXIST | METH_KEYWORDS | METH_STACKLESS))); assert(nargs >= 0); assert(nargs == 0 || args != NULL); /* _PyCFunction_FastCallDict() must not be called with an exception set, because it may clear it (directly or indirectly) and so the caller loses its exception */ assert(!PyErr_Occurred()); if ((PY_VERSION_HEX < 0x030700A0) || unlikely(flags & METH_KEYWORDS)) { return (*((__Pyx_PyCFunctionFastWithKeywords)(void*)meth)) (self, args, nargs, NULL); } else { return (*((__Pyx_PyCFunctionFast)(void*)meth)) (self, args, nargs); } } #endif /* PyFunctionFastCall */ #if CYTHON_FAST_PYCALL static PyObject* __Pyx_PyFunction_FastCallNoKw(PyCodeObject *co, PyObject **args, Py_ssize_t na, PyObject *globals) { PyFrameObject *f; PyThreadState *tstate = __Pyx_PyThreadState_Current; PyObject **fastlocals; Py_ssize_t i; PyObject *result; assert(globals != NULL); /* XXX Perhaps we should create a specialized PyFrame_New() that doesn't take locals, but does take builtins without sanity checking them. */ assert(tstate != NULL); f = PyFrame_New(tstate, co, globals, NULL); if (f == NULL) { return NULL; } fastlocals = __Pyx_PyFrame_GetLocalsplus(f); for (i = 0; i < na; i++) { Py_INCREF(*args); fastlocals[i] = *args++; } result = PyEval_EvalFrameEx(f,0); ++tstate->recursion_depth; Py_DECREF(f); --tstate->recursion_depth; return result; } #if 1 || PY_VERSION_HEX < 0x030600B1 static PyObject *__Pyx_PyFunction_FastCallDict(PyObject *func, PyObject **args, Py_ssize_t nargs, PyObject *kwargs) { PyCodeObject *co = (PyCodeObject *)PyFunction_GET_CODE(func); PyObject *globals = PyFunction_GET_GLOBALS(func); PyObject *argdefs = PyFunction_GET_DEFAULTS(func); PyObject *closure; #if PY_MAJOR_VERSION >= 3 PyObject *kwdefs; #endif PyObject *kwtuple, **k; PyObject **d; Py_ssize_t nd; Py_ssize_t nk; PyObject *result; assert(kwargs == NULL || PyDict_Check(kwargs)); nk = kwargs ? 
PyDict_Size(kwargs) : 0; if (Py_EnterRecursiveCall((char*)" while calling a Python object")) { return NULL; } if ( #if PY_MAJOR_VERSION >= 3 co->co_kwonlyargcount == 0 && #endif likely(kwargs == NULL || nk == 0) && co->co_flags == (CO_OPTIMIZED | CO_NEWLOCALS | CO_NOFREE)) { if (argdefs == NULL && co->co_argcount == nargs) { result = __Pyx_PyFunction_FastCallNoKw(co, args, nargs, globals); goto done; } else if (nargs == 0 && argdefs != NULL && co->co_argcount == Py_SIZE(argdefs)) { /* function called with no arguments, but all parameters have a default value: use default values as arguments .*/ args = &PyTuple_GET_ITEM(argdefs, 0); result =__Pyx_PyFunction_FastCallNoKw(co, args, Py_SIZE(argdefs), globals); goto done; } } if (kwargs != NULL) { Py_ssize_t pos, i; kwtuple = PyTuple_New(2 * nk); if (kwtuple == NULL) { result = NULL; goto done; } k = &PyTuple_GET_ITEM(kwtuple, 0); pos = i = 0; while (PyDict_Next(kwargs, &pos, &k[i], &k[i+1])) { Py_INCREF(k[i]); Py_INCREF(k[i+1]); i += 2; } nk = i / 2; } else { kwtuple = NULL; k = NULL; } closure = PyFunction_GET_CLOSURE(func); #if PY_MAJOR_VERSION >= 3 kwdefs = PyFunction_GET_KW_DEFAULTS(func); #endif if (argdefs != NULL) { d = &PyTuple_GET_ITEM(argdefs, 0); nd = Py_SIZE(argdefs); } else { d = NULL; nd = 0; } #if PY_MAJOR_VERSION >= 3 result = PyEval_EvalCodeEx((PyObject*)co, globals, (PyObject *)NULL, args, (int)nargs, k, (int)nk, d, (int)nd, kwdefs, closure); #else result = PyEval_EvalCodeEx(co, globals, (PyObject *)NULL, args, (int)nargs, k, (int)nk, d, (int)nd, closure); #endif Py_XDECREF(kwtuple); done: Py_LeaveRecursiveCall(); return result; } #endif #endif /* PyObjectCall */ #if CYTHON_COMPILING_IN_CPYTHON static CYTHON_INLINE PyObject* __Pyx_PyObject_Call(PyObject *func, PyObject *arg, PyObject *kw) { PyObject *result; ternaryfunc call = func->ob_type->tp_call; if (unlikely(!call)) return PyObject_Call(func, arg, kw); if (unlikely(Py_EnterRecursiveCall((char*)" while calling a Python object"))) return NULL; result = (*call)(func, arg, kw); Py_LeaveRecursiveCall(); if (unlikely(!result) && unlikely(!PyErr_Occurred())) { PyErr_SetString( PyExc_SystemError, "NULL result without error in PyObject_Call"); } return result; } #endif /* PyObjectCallMethO */ #if CYTHON_COMPILING_IN_CPYTHON static CYTHON_INLINE PyObject* __Pyx_PyObject_CallMethO(PyObject *func, PyObject *arg) { PyObject *self, *result; PyCFunction cfunc; cfunc = PyCFunction_GET_FUNCTION(func); self = PyCFunction_GET_SELF(func); if (unlikely(Py_EnterRecursiveCall((char*)" while calling a Python object"))) return NULL; result = cfunc(self, arg); Py_LeaveRecursiveCall(); if (unlikely(!result) && unlikely(!PyErr_Occurred())) { PyErr_SetString( PyExc_SystemError, "NULL result without error in PyObject_Call"); } return result; } #endif /* PyObjectCallOneArg */ #if CYTHON_COMPILING_IN_CPYTHON static PyObject* __Pyx__PyObject_CallOneArg(PyObject *func, PyObject *arg) { PyObject *result; PyObject *args = PyTuple_New(1); if (unlikely(!args)) return NULL; Py_INCREF(arg); PyTuple_SET_ITEM(args, 0, arg); result = __Pyx_PyObject_Call(func, args, NULL); Py_DECREF(args); return result; } static CYTHON_INLINE PyObject* __Pyx_PyObject_CallOneArg(PyObject *func, PyObject *arg) { #if CYTHON_FAST_PYCALL if (PyFunction_Check(func)) { return __Pyx_PyFunction_FastCall(func, &arg, 1); } #endif if (likely(PyCFunction_Check(func))) { if (likely(PyCFunction_GET_FLAGS(func) & METH_O)) { return __Pyx_PyObject_CallMethO(func, arg); #if CYTHON_FAST_PYCCALL } else if (PyCFunction_GET_FLAGS(func) & METH_FASTCALL) 
{ return __Pyx_PyCFunction_FastCall(func, &arg, 1); #endif } } return __Pyx__PyObject_CallOneArg(func, arg); } #else static CYTHON_INLINE PyObject* __Pyx_PyObject_CallOneArg(PyObject *func, PyObject *arg) { PyObject *result; PyObject *args = PyTuple_Pack(1, arg); if (unlikely(!args)) return NULL; result = __Pyx_PyObject_Call(func, args, NULL); Py_DECREF(args); return result; } #endif /* TypeImport */ #ifndef __PYX_HAVE_RT_ImportType #define __PYX_HAVE_RT_ImportType static PyTypeObject *__Pyx_ImportType(PyObject *module, const char *module_name, const char *class_name, size_t size, enum __Pyx_ImportType_CheckSize check_size) { PyObject *result = 0; char warning[200]; Py_ssize_t basicsize; #ifdef Py_LIMITED_API PyObject *py_basicsize; #endif result = PyObject_GetAttrString(module, class_name); if (!result) goto bad; if (!PyType_Check(result)) { PyErr_Format(PyExc_TypeError, "%.200s.%.200s is not a type object", module_name, class_name); goto bad; } #ifndef Py_LIMITED_API basicsize = ((PyTypeObject *)result)->tp_basicsize; #else py_basicsize = PyObject_GetAttrString(result, "__basicsize__"); if (!py_basicsize) goto bad; basicsize = PyLong_AsSsize_t(py_basicsize); Py_DECREF(py_basicsize); py_basicsize = 0; if (basicsize == (Py_ssize_t)-1 && PyErr_Occurred()) goto bad; #endif if ((size_t)basicsize < size) { PyErr_Format(PyExc_ValueError, "%.200s.%.200s size changed, may indicate binary incompatibility. " "Expected %zd from C header, got %zd from PyObject", module_name, class_name, size, basicsize); goto bad; } if (check_size == __Pyx_ImportType_CheckSize_Error && (size_t)basicsize != size) { PyErr_Format(PyExc_ValueError, "%.200s.%.200s size changed, may indicate binary incompatibility. " "Expected %zd from C header, got %zd from PyObject", module_name, class_name, size, basicsize); goto bad; } else if (check_size == __Pyx_ImportType_CheckSize_Warn && (size_t)basicsize > size) { PyOS_snprintf(warning, sizeof(warning), "%s.%s size changed, may indicate binary incompatibility. " "Expected %zd from C header, got %zd from PyObject", module_name, class_name, size, basicsize); if (PyErr_WarnEx(NULL, warning, 0) < 0) goto bad; } return (PyTypeObject *)result; bad: Py_XDECREF(result); return NULL; } #endif /* PyDictVersioning */ #if CYTHON_USE_DICT_VERSIONS && CYTHON_USE_TYPE_SLOTS static CYTHON_INLINE PY_UINT64_T __Pyx_get_tp_dict_version(PyObject *obj) { PyObject *dict = Py_TYPE(obj)->tp_dict; return likely(dict) ? __PYX_GET_DICT_VERSION(dict) : 0; } static CYTHON_INLINE PY_UINT64_T __Pyx_get_object_dict_version(PyObject *obj) { PyObject **dictptr = NULL; Py_ssize_t offset = Py_TYPE(obj)->tp_dictoffset; if (offset) { #if CYTHON_COMPILING_IN_CPYTHON dictptr = (likely(offset > 0)) ? (PyObject **) ((char *)obj + offset) : _PyObject_GetDictPtr(obj); #else dictptr = _PyObject_GetDictPtr(obj); #endif } return (dictptr && *dictptr) ? 
__PYX_GET_DICT_VERSION(*dictptr) : 0; } static CYTHON_INLINE int __Pyx_object_dict_version_matches(PyObject* obj, PY_UINT64_T tp_dict_version, PY_UINT64_T obj_dict_version) { PyObject *dict = Py_TYPE(obj)->tp_dict; if (unlikely(!dict) || unlikely(tp_dict_version != __PYX_GET_DICT_VERSION(dict))) return 0; return obj_dict_version == __Pyx_get_object_dict_version(obj); } #endif /* PyErrFetchRestore */ #if CYTHON_FAST_THREAD_STATE static CYTHON_INLINE void __Pyx_ErrRestoreInState(PyThreadState *tstate, PyObject *type, PyObject *value, PyObject *tb) { PyObject *tmp_type, *tmp_value, *tmp_tb; tmp_type = tstate->curexc_type; tmp_value = tstate->curexc_value; tmp_tb = tstate->curexc_traceback; tstate->curexc_type = type; tstate->curexc_value = value; tstate->curexc_traceback = tb; Py_XDECREF(tmp_type); Py_XDECREF(tmp_value); Py_XDECREF(tmp_tb); } static CYTHON_INLINE void __Pyx_ErrFetchInState(PyThreadState *tstate, PyObject **type, PyObject **value, PyObject **tb) { *type = tstate->curexc_type; *value = tstate->curexc_value; *tb = tstate->curexc_traceback; tstate->curexc_type = 0; tstate->curexc_value = 0; tstate->curexc_traceback = 0; } #endif /* CLineInTraceback */ #ifndef CYTHON_CLINE_IN_TRACEBACK static int __Pyx_CLineForTraceback(PyThreadState *tstate, int c_line) { PyObject *use_cline; PyObject *ptype, *pvalue, *ptraceback; #if CYTHON_COMPILING_IN_CPYTHON PyObject **cython_runtime_dict; #endif if (unlikely(!__pyx_cython_runtime)) { return c_line; } __Pyx_ErrFetchInState(tstate, &ptype, &pvalue, &ptraceback); #if CYTHON_COMPILING_IN_CPYTHON cython_runtime_dict = _PyObject_GetDictPtr(__pyx_cython_runtime); if (likely(cython_runtime_dict)) { __PYX_PY_DICT_LOOKUP_IF_MODIFIED( use_cline, *cython_runtime_dict, __Pyx_PyDict_GetItemStr(*cython_runtime_dict, __pyx_n_s_cline_in_traceback)) } else #endif { PyObject *use_cline_obj = __Pyx_PyObject_GetAttrStr(__pyx_cython_runtime, __pyx_n_s_cline_in_traceback); if (use_cline_obj) { use_cline = PyObject_Not(use_cline_obj) ? 
Py_False : Py_True; Py_DECREF(use_cline_obj); } else { PyErr_Clear(); use_cline = NULL; } } if (!use_cline) { c_line = 0; PyObject_SetAttr(__pyx_cython_runtime, __pyx_n_s_cline_in_traceback, Py_False); } else if (use_cline == Py_False || (use_cline != Py_True && PyObject_Not(use_cline) != 0)) { c_line = 0; } __Pyx_ErrRestoreInState(tstate, ptype, pvalue, ptraceback); return c_line; } #endif /* CodeObjectCache */ static int __pyx_bisect_code_objects(__Pyx_CodeObjectCacheEntry* entries, int count, int code_line) { int start = 0, mid = 0, end = count - 1; if (end >= 0 && code_line > entries[end].code_line) { return count; } while (start < end) { mid = start + (end - start) / 2; if (code_line < entries[mid].code_line) { end = mid; } else if (code_line > entries[mid].code_line) { start = mid + 1; } else { return mid; } } if (code_line <= entries[mid].code_line) { return mid; } else { return mid + 1; } } static PyCodeObject *__pyx_find_code_object(int code_line) { PyCodeObject* code_object; int pos; if (unlikely(!code_line) || unlikely(!__pyx_code_cache.entries)) { return NULL; } pos = __pyx_bisect_code_objects(__pyx_code_cache.entries, __pyx_code_cache.count, code_line); if (unlikely(pos >= __pyx_code_cache.count) || unlikely(__pyx_code_cache.entries[pos].code_line != code_line)) { return NULL; } code_object = __pyx_code_cache.entries[pos].code_object; Py_INCREF(code_object); return code_object; } static void __pyx_insert_code_object(int code_line, PyCodeObject* code_object) { int pos, i; __Pyx_CodeObjectCacheEntry* entries = __pyx_code_cache.entries; if (unlikely(!code_line)) { return; } if (unlikely(!entries)) { entries = (__Pyx_CodeObjectCacheEntry*)PyMem_Malloc(64*sizeof(__Pyx_CodeObjectCacheEntry)); if (likely(entries)) { __pyx_code_cache.entries = entries; __pyx_code_cache.max_count = 64; __pyx_code_cache.count = 1; entries[0].code_line = code_line; entries[0].code_object = code_object; Py_INCREF(code_object); } return; } pos = __pyx_bisect_code_objects(__pyx_code_cache.entries, __pyx_code_cache.count, code_line); if ((pos < __pyx_code_cache.count) && unlikely(__pyx_code_cache.entries[pos].code_line == code_line)) { PyCodeObject* tmp = entries[pos].code_object; entries[pos].code_object = code_object; Py_DECREF(tmp); return; } if (__pyx_code_cache.count == __pyx_code_cache.max_count) { int new_max = __pyx_code_cache.max_count + 64; entries = (__Pyx_CodeObjectCacheEntry*)PyMem_Realloc( __pyx_code_cache.entries, (size_t)new_max*sizeof(__Pyx_CodeObjectCacheEntry)); if (unlikely(!entries)) { return; } __pyx_code_cache.entries = entries; __pyx_code_cache.max_count = new_max; } for (i=__pyx_code_cache.count; i>pos; i--) { entries[i] = entries[i-1]; } entries[pos].code_line = code_line; entries[pos].code_object = code_object; __pyx_code_cache.count++; Py_INCREF(code_object); } /* AddTraceback */ #include "compile.h" #include "frameobject.h" #include "traceback.h" static PyCodeObject* __Pyx_CreateCodeObjectForTraceback( const char *funcname, int c_line, int py_line, const char *filename) { PyCodeObject *py_code = 0; PyObject *py_srcfile = 0; PyObject *py_funcname = 0; #if PY_MAJOR_VERSION < 3 py_srcfile = PyString_FromString(filename); #else py_srcfile = PyUnicode_FromString(filename); #endif if (!py_srcfile) goto bad; if (c_line) { #if PY_MAJOR_VERSION < 3 py_funcname = PyString_FromFormat( "%s (%s:%d)", funcname, __pyx_cfilenm, c_line); #else py_funcname = PyUnicode_FromFormat( "%s (%s:%d)", funcname, __pyx_cfilenm, c_line); #endif } else { #if PY_MAJOR_VERSION < 3 py_funcname = 
PyString_FromString(funcname); #else py_funcname = PyUnicode_FromString(funcname); #endif } if (!py_funcname) goto bad; py_code = __Pyx_PyCode_New( 0, 0, 0, 0, 0, __pyx_empty_bytes, /*PyObject *code,*/ __pyx_empty_tuple, /*PyObject *consts,*/ __pyx_empty_tuple, /*PyObject *names,*/ __pyx_empty_tuple, /*PyObject *varnames,*/ __pyx_empty_tuple, /*PyObject *freevars,*/ __pyx_empty_tuple, /*PyObject *cellvars,*/ py_srcfile, /*PyObject *filename,*/ py_funcname, /*PyObject *name,*/ py_line, __pyx_empty_bytes /*PyObject *lnotab*/ ); Py_DECREF(py_srcfile); Py_DECREF(py_funcname); return py_code; bad: Py_XDECREF(py_srcfile); Py_XDECREF(py_funcname); return NULL; } static void __Pyx_AddTraceback(const char *funcname, int c_line, int py_line, const char *filename) { PyCodeObject *py_code = 0; PyFrameObject *py_frame = 0; PyThreadState *tstate = __Pyx_PyThreadState_Current; if (c_line) { c_line = __Pyx_CLineForTraceback(tstate, c_line); } py_code = __pyx_find_code_object(c_line ? -c_line : py_line); if (!py_code) { py_code = __Pyx_CreateCodeObjectForTraceback( funcname, c_line, py_line, filename); if (!py_code) goto bad; __pyx_insert_code_object(c_line ? -c_line : py_line, py_code); } py_frame = PyFrame_New( tstate, /*PyThreadState *tstate,*/ py_code, /*PyCodeObject *code,*/ __pyx_d, /*PyObject *globals,*/ 0 /*PyObject *locals*/ ); if (!py_frame) goto bad; __Pyx_PyFrame_SetLineNumber(py_frame, py_line); PyTraceBack_Here(py_frame); bad: Py_XDECREF(py_code); Py_XDECREF(py_frame); } /* CIntToPy */ static CYTHON_INLINE PyObject* __Pyx_PyInt_From_long(long value) { const long neg_one = (long) ((long) 0 - (long) 1), const_zero = (long) 0; const int is_unsigned = neg_one > const_zero; if (is_unsigned) { if (sizeof(long) < sizeof(long)) { return PyInt_FromLong((long) value); } else if (sizeof(long) <= sizeof(unsigned long)) { return PyLong_FromUnsignedLong((unsigned long) value); #ifdef HAVE_LONG_LONG } else if (sizeof(long) <= sizeof(unsigned PY_LONG_LONG)) { return PyLong_FromUnsignedLongLong((unsigned PY_LONG_LONG) value); #endif } } else { if (sizeof(long) <= sizeof(long)) { return PyInt_FromLong((long) value); #ifdef HAVE_LONG_LONG } else if (sizeof(long) <= sizeof(PY_LONG_LONG)) { return PyLong_FromLongLong((PY_LONG_LONG) value); #endif } } { int one = 1; int little = (int)*(unsigned char *)&one; unsigned char *bytes = (unsigned char *)&value; return _PyLong_FromByteArray(bytes, sizeof(long), little, !is_unsigned); } } /* CIntFromPyVerify */ #define __PYX_VERIFY_RETURN_INT(target_type, func_type, func_value)\ __PYX__VERIFY_RETURN_INT(target_type, func_type, func_value, 0) #define __PYX_VERIFY_RETURN_INT_EXC(target_type, func_type, func_value)\ __PYX__VERIFY_RETURN_INT(target_type, func_type, func_value, 1) #define __PYX__VERIFY_RETURN_INT(target_type, func_type, func_value, exc)\ {\ func_type value = func_value;\ if (sizeof(target_type) < sizeof(func_type)) {\ if (unlikely(value != (func_type) (target_type) value)) {\ func_type zero = 0;\ if (exc && unlikely(value == (func_type)-1 && PyErr_Occurred()))\ return (target_type) -1;\ if (is_unsigned && unlikely(value < zero))\ goto raise_neg_overflow;\ else\ goto raise_overflow;\ }\ }\ return (target_type) value;\ } /* CIntFromPy */ static CYTHON_INLINE long __Pyx_PyInt_As_long(PyObject *x) { const long neg_one = (long) ((long) 0 - (long) 1), const_zero = (long) 0; const int is_unsigned = neg_one > const_zero; #if PY_MAJOR_VERSION < 3 if (likely(PyInt_Check(x))) { if (sizeof(long) < sizeof(long)) { __PYX_VERIFY_RETURN_INT(long, long, PyInt_AS_LONG(x)) } else 
{ long val = PyInt_AS_LONG(x); if (is_unsigned && unlikely(val < 0)) { goto raise_neg_overflow; } return (long) val; } } else #endif if (likely(PyLong_Check(x))) { if (is_unsigned) { #if CYTHON_USE_PYLONG_INTERNALS const digit* digits = ((PyLongObject*)x)->ob_digit; switch (Py_SIZE(x)) { case 0: return (long) 0; case 1: __PYX_VERIFY_RETURN_INT(long, digit, digits[0]) case 2: if (8 * sizeof(long) > 1 * PyLong_SHIFT) { if (8 * sizeof(unsigned long) > 2 * PyLong_SHIFT) { __PYX_VERIFY_RETURN_INT(long, unsigned long, (((((unsigned long)digits[1]) << PyLong_SHIFT) | (unsigned long)digits[0]))) } else if (8 * sizeof(long) >= 2 * PyLong_SHIFT) { return (long) (((((long)digits[1]) << PyLong_SHIFT) | (long)digits[0])); } } break; case 3: if (8 * sizeof(long) > 2 * PyLong_SHIFT) { if (8 * sizeof(unsigned long) > 3 * PyLong_SHIFT) { __PYX_VERIFY_RETURN_INT(long, unsigned long, (((((((unsigned long)digits[2]) << PyLong_SHIFT) | (unsigned long)digits[1]) << PyLong_SHIFT) | (unsigned long)digits[0]))) } else if (8 * sizeof(long) >= 3 * PyLong_SHIFT) { return (long) (((((((long)digits[2]) << PyLong_SHIFT) | (long)digits[1]) << PyLong_SHIFT) | (long)digits[0])); } } break; case 4: if (8 * sizeof(long) > 3 * PyLong_SHIFT) { if (8 * sizeof(unsigned long) > 4 * PyLong_SHIFT) { __PYX_VERIFY_RETURN_INT(long, unsigned long, (((((((((unsigned long)digits[3]) << PyLong_SHIFT) | (unsigned long)digits[2]) << PyLong_SHIFT) | (unsigned long)digits[1]) << PyLong_SHIFT) | (unsigned long)digits[0]))) } else if (8 * sizeof(long) >= 4 * PyLong_SHIFT) { return (long) (((((((((long)digits[3]) << PyLong_SHIFT) | (long)digits[2]) << PyLong_SHIFT) | (long)digits[1]) << PyLong_SHIFT) | (long)digits[0])); } } break; } #endif #if CYTHON_COMPILING_IN_CPYTHON if (unlikely(Py_SIZE(x) < 0)) { goto raise_neg_overflow; } #else { int result = PyObject_RichCompareBool(x, Py_False, Py_LT); if (unlikely(result < 0)) return (long) -1; if (unlikely(result == 1)) goto raise_neg_overflow; } #endif if (sizeof(long) <= sizeof(unsigned long)) { __PYX_VERIFY_RETURN_INT_EXC(long, unsigned long, PyLong_AsUnsignedLong(x)) #ifdef HAVE_LONG_LONG } else if (sizeof(long) <= sizeof(unsigned PY_LONG_LONG)) { __PYX_VERIFY_RETURN_INT_EXC(long, unsigned PY_LONG_LONG, PyLong_AsUnsignedLongLong(x)) #endif } } else { #if CYTHON_USE_PYLONG_INTERNALS const digit* digits = ((PyLongObject*)x)->ob_digit; switch (Py_SIZE(x)) { case 0: return (long) 0; case -1: __PYX_VERIFY_RETURN_INT(long, sdigit, (sdigit) (-(sdigit)digits[0])) case 1: __PYX_VERIFY_RETURN_INT(long, digit, +digits[0]) case -2: if (8 * sizeof(long) - 1 > 1 * PyLong_SHIFT) { if (8 * sizeof(unsigned long) > 2 * PyLong_SHIFT) { __PYX_VERIFY_RETURN_INT(long, long, -(long) (((((unsigned long)digits[1]) << PyLong_SHIFT) | (unsigned long)digits[0]))) } else if (8 * sizeof(long) - 1 > 2 * PyLong_SHIFT) { return (long) (((long)-1)*(((((long)digits[1]) << PyLong_SHIFT) | (long)digits[0]))); } } break; case 2: if (8 * sizeof(long) > 1 * PyLong_SHIFT) { if (8 * sizeof(unsigned long) > 2 * PyLong_SHIFT) { __PYX_VERIFY_RETURN_INT(long, unsigned long, (((((unsigned long)digits[1]) << PyLong_SHIFT) | (unsigned long)digits[0]))) } else if (8 * sizeof(long) - 1 > 2 * PyLong_SHIFT) { return (long) ((((((long)digits[1]) << PyLong_SHIFT) | (long)digits[0]))); } } break; case -3: if (8 * sizeof(long) - 1 > 2 * PyLong_SHIFT) { if (8 * sizeof(unsigned long) > 3 * PyLong_SHIFT) { __PYX_VERIFY_RETURN_INT(long, long, -(long) (((((((unsigned long)digits[2]) << PyLong_SHIFT) | (unsigned long)digits[1]) << PyLong_SHIFT) | (unsigned 
long)digits[0]))) } else if (8 * sizeof(long) - 1 > 3 * PyLong_SHIFT) { return (long) (((long)-1)*(((((((long)digits[2]) << PyLong_SHIFT) | (long)digits[1]) << PyLong_SHIFT) | (long)digits[0]))); } } break; case 3: if (8 * sizeof(long) > 2 * PyLong_SHIFT) { if (8 * sizeof(unsigned long) > 3 * PyLong_SHIFT) { __PYX_VERIFY_RETURN_INT(long, unsigned long, (((((((unsigned long)digits[2]) << PyLong_SHIFT) | (unsigned long)digits[1]) << PyLong_SHIFT) | (unsigned long)digits[0]))) } else if (8 * sizeof(long) - 1 > 3 * PyLong_SHIFT) { return (long) ((((((((long)digits[2]) << PyLong_SHIFT) | (long)digits[1]) << PyLong_SHIFT) | (long)digits[0]))); } } break; case -4: if (8 * sizeof(long) - 1 > 3 * PyLong_SHIFT) { if (8 * sizeof(unsigned long) > 4 * PyLong_SHIFT) { __PYX_VERIFY_RETURN_INT(long, long, -(long) (((((((((unsigned long)digits[3]) << PyLong_SHIFT) | (unsigned long)digits[2]) << PyLong_SHIFT) | (unsigned long)digits[1]) << PyLong_SHIFT) | (unsigned long)digits[0]))) } else if (8 * sizeof(long) - 1 > 4 * PyLong_SHIFT) { return (long) (((long)-1)*(((((((((long)digits[3]) << PyLong_SHIFT) | (long)digits[2]) << PyLong_SHIFT) | (long)digits[1]) << PyLong_SHIFT) | (long)digits[0]))); } } break; case 4: if (8 * sizeof(long) > 3 * PyLong_SHIFT) { if (8 * sizeof(unsigned long) > 4 * PyLong_SHIFT) { __PYX_VERIFY_RETURN_INT(long, unsigned long, (((((((((unsigned long)digits[3]) << PyLong_SHIFT) | (unsigned long)digits[2]) << PyLong_SHIFT) | (unsigned long)digits[1]) << PyLong_SHIFT) | (unsigned long)digits[0]))) } else if (8 * sizeof(long) - 1 > 4 * PyLong_SHIFT) { return (long) ((((((((((long)digits[3]) << PyLong_SHIFT) | (long)digits[2]) << PyLong_SHIFT) | (long)digits[1]) << PyLong_SHIFT) | (long)digits[0]))); } } break; } #endif if (sizeof(long) <= sizeof(long)) { __PYX_VERIFY_RETURN_INT_EXC(long, long, PyLong_AsLong(x)) #ifdef HAVE_LONG_LONG } else if (sizeof(long) <= sizeof(PY_LONG_LONG)) { __PYX_VERIFY_RETURN_INT_EXC(long, PY_LONG_LONG, PyLong_AsLongLong(x)) #endif } } { #if CYTHON_COMPILING_IN_PYPY && !defined(_PyLong_AsByteArray) PyErr_SetString(PyExc_RuntimeError, "_PyLong_AsByteArray() not available in PyPy, cannot convert large numbers"); #else long val; PyObject *v = __Pyx_PyNumber_IntOrLong(x); #if PY_MAJOR_VERSION < 3 if (likely(v) && !PyLong_Check(v)) { PyObject *tmp = v; v = PyNumber_Long(tmp); Py_DECREF(tmp); } #endif if (likely(v)) { int one = 1; int is_little = (int)*(unsigned char *)&one; unsigned char *bytes = (unsigned char *)&val; int ret = _PyLong_AsByteArray((PyLongObject *)v, bytes, sizeof(val), is_little, !is_unsigned); Py_DECREF(v); if (likely(!ret)) return val; } #endif return (long) -1; } } else { long val; PyObject *tmp = __Pyx_PyNumber_IntOrLong(x); if (!tmp) return (long) -1; val = __Pyx_PyInt_As_long(tmp); Py_DECREF(tmp); return val; } raise_overflow: PyErr_SetString(PyExc_OverflowError, "value too large to convert to long"); return (long) -1; raise_neg_overflow: PyErr_SetString(PyExc_OverflowError, "can't convert negative value to long"); return (long) -1; } /* CIntFromPy */ static CYTHON_INLINE int __Pyx_PyInt_As_int(PyObject *x) { const int neg_one = (int) ((int) 0 - (int) 1), const_zero = (int) 0; const int is_unsigned = neg_one > const_zero; #if PY_MAJOR_VERSION < 3 if (likely(PyInt_Check(x))) { if (sizeof(int) < sizeof(long)) { __PYX_VERIFY_RETURN_INT(int, long, PyInt_AS_LONG(x)) } else { long val = PyInt_AS_LONG(x); if (is_unsigned && unlikely(val < 0)) { goto raise_neg_overflow; } return (int) val; } } else #endif if (likely(PyLong_Check(x))) { if 
(is_unsigned) { #if CYTHON_USE_PYLONG_INTERNALS const digit* digits = ((PyLongObject*)x)->ob_digit; switch (Py_SIZE(x)) { case 0: return (int) 0; case 1: __PYX_VERIFY_RETURN_INT(int, digit, digits[0]) case 2: if (8 * sizeof(int) > 1 * PyLong_SHIFT) { if (8 * sizeof(unsigned long) > 2 * PyLong_SHIFT) { __PYX_VERIFY_RETURN_INT(int, unsigned long, (((((unsigned long)digits[1]) << PyLong_SHIFT) | (unsigned long)digits[0]))) } else if (8 * sizeof(int) >= 2 * PyLong_SHIFT) { return (int) (((((int)digits[1]) << PyLong_SHIFT) | (int)digits[0])); } } break; case 3: if (8 * sizeof(int) > 2 * PyLong_SHIFT) { if (8 * sizeof(unsigned long) > 3 * PyLong_SHIFT) { __PYX_VERIFY_RETURN_INT(int, unsigned long, (((((((unsigned long)digits[2]) << PyLong_SHIFT) | (unsigned long)digits[1]) << PyLong_SHIFT) | (unsigned long)digits[0]))) } else if (8 * sizeof(int) >= 3 * PyLong_SHIFT) { return (int) (((((((int)digits[2]) << PyLong_SHIFT) | (int)digits[1]) << PyLong_SHIFT) | (int)digits[0])); } } break; case 4: if (8 * sizeof(int) > 3 * PyLong_SHIFT) { if (8 * sizeof(unsigned long) > 4 * PyLong_SHIFT) { __PYX_VERIFY_RETURN_INT(int, unsigned long, (((((((((unsigned long)digits[3]) << PyLong_SHIFT) | (unsigned long)digits[2]) << PyLong_SHIFT) | (unsigned long)digits[1]) << PyLong_SHIFT) | (unsigned long)digits[0]))) } else if (8 * sizeof(int) >= 4 * PyLong_SHIFT) { return (int) (((((((((int)digits[3]) << PyLong_SHIFT) | (int)digits[2]) << PyLong_SHIFT) | (int)digits[1]) << PyLong_SHIFT) | (int)digits[0])); } } break; } #endif #if CYTHON_COMPILING_IN_CPYTHON if (unlikely(Py_SIZE(x) < 0)) { goto raise_neg_overflow; } #else { int result = PyObject_RichCompareBool(x, Py_False, Py_LT); if (unlikely(result < 0)) return (int) -1; if (unlikely(result == 1)) goto raise_neg_overflow; } #endif if (sizeof(int) <= sizeof(unsigned long)) { __PYX_VERIFY_RETURN_INT_EXC(int, unsigned long, PyLong_AsUnsignedLong(x)) #ifdef HAVE_LONG_LONG } else if (sizeof(int) <= sizeof(unsigned PY_LONG_LONG)) { __PYX_VERIFY_RETURN_INT_EXC(int, unsigned PY_LONG_LONG, PyLong_AsUnsignedLongLong(x)) #endif } } else { #if CYTHON_USE_PYLONG_INTERNALS const digit* digits = ((PyLongObject*)x)->ob_digit; switch (Py_SIZE(x)) { case 0: return (int) 0; case -1: __PYX_VERIFY_RETURN_INT(int, sdigit, (sdigit) (-(sdigit)digits[0])) case 1: __PYX_VERIFY_RETURN_INT(int, digit, +digits[0]) case -2: if (8 * sizeof(int) - 1 > 1 * PyLong_SHIFT) { if (8 * sizeof(unsigned long) > 2 * PyLong_SHIFT) { __PYX_VERIFY_RETURN_INT(int, long, -(long) (((((unsigned long)digits[1]) << PyLong_SHIFT) | (unsigned long)digits[0]))) } else if (8 * sizeof(int) - 1 > 2 * PyLong_SHIFT) { return (int) (((int)-1)*(((((int)digits[1]) << PyLong_SHIFT) | (int)digits[0]))); } } break; case 2: if (8 * sizeof(int) > 1 * PyLong_SHIFT) { if (8 * sizeof(unsigned long) > 2 * PyLong_SHIFT) { __PYX_VERIFY_RETURN_INT(int, unsigned long, (((((unsigned long)digits[1]) << PyLong_SHIFT) | (unsigned long)digits[0]))) } else if (8 * sizeof(int) - 1 > 2 * PyLong_SHIFT) { return (int) ((((((int)digits[1]) << PyLong_SHIFT) | (int)digits[0]))); } } break; case -3: if (8 * sizeof(int) - 1 > 2 * PyLong_SHIFT) { if (8 * sizeof(unsigned long) > 3 * PyLong_SHIFT) { __PYX_VERIFY_RETURN_INT(int, long, -(long) (((((((unsigned long)digits[2]) << PyLong_SHIFT) | (unsigned long)digits[1]) << PyLong_SHIFT) | (unsigned long)digits[0]))) } else if (8 * sizeof(int) - 1 > 3 * PyLong_SHIFT) { return (int) (((int)-1)*(((((((int)digits[2]) << PyLong_SHIFT) | (int)digits[1]) << PyLong_SHIFT) | (int)digits[0]))); } } break; case 3: if (8 
* sizeof(int) > 2 * PyLong_SHIFT) { if (8 * sizeof(unsigned long) > 3 * PyLong_SHIFT) { __PYX_VERIFY_RETURN_INT(int, unsigned long, (((((((unsigned long)digits[2]) << PyLong_SHIFT) | (unsigned long)digits[1]) << PyLong_SHIFT) | (unsigned long)digits[0]))) } else if (8 * sizeof(int) - 1 > 3 * PyLong_SHIFT) { return (int) ((((((((int)digits[2]) << PyLong_SHIFT) | (int)digits[1]) << PyLong_SHIFT) | (int)digits[0]))); } } break; case -4: if (8 * sizeof(int) - 1 > 3 * PyLong_SHIFT) { if (8 * sizeof(unsigned long) > 4 * PyLong_SHIFT) { __PYX_VERIFY_RETURN_INT(int, long, -(long) (((((((((unsigned long)digits[3]) << PyLong_SHIFT) | (unsigned long)digits[2]) << PyLong_SHIFT) | (unsigned long)digits[1]) << PyLong_SHIFT) | (unsigned long)digits[0]))) } else if (8 * sizeof(int) - 1 > 4 * PyLong_SHIFT) { return (int) (((int)-1)*(((((((((int)digits[3]) << PyLong_SHIFT) | (int)digits[2]) << PyLong_SHIFT) | (int)digits[1]) << PyLong_SHIFT) | (int)digits[0]))); } } break; case 4: if (8 * sizeof(int) > 3 * PyLong_SHIFT) { if (8 * sizeof(unsigned long) > 4 * PyLong_SHIFT) { __PYX_VERIFY_RETURN_INT(int, unsigned long, (((((((((unsigned long)digits[3]) << PyLong_SHIFT) | (unsigned long)digits[2]) << PyLong_SHIFT) | (unsigned long)digits[1]) << PyLong_SHIFT) | (unsigned long)digits[0]))) } else if (8 * sizeof(int) - 1 > 4 * PyLong_SHIFT) { return (int) ((((((((((int)digits[3]) << PyLong_SHIFT) | (int)digits[2]) << PyLong_SHIFT) | (int)digits[1]) << PyLong_SHIFT) | (int)digits[0]))); } } break; } #endif if (sizeof(int) <= sizeof(long)) { __PYX_VERIFY_RETURN_INT_EXC(int, long, PyLong_AsLong(x)) #ifdef HAVE_LONG_LONG } else if (sizeof(int) <= sizeof(PY_LONG_LONG)) { __PYX_VERIFY_RETURN_INT_EXC(int, PY_LONG_LONG, PyLong_AsLongLong(x)) #endif } } { #if CYTHON_COMPILING_IN_PYPY && !defined(_PyLong_AsByteArray) PyErr_SetString(PyExc_RuntimeError, "_PyLong_AsByteArray() not available in PyPy, cannot convert large numbers"); #else int val; PyObject *v = __Pyx_PyNumber_IntOrLong(x); #if PY_MAJOR_VERSION < 3 if (likely(v) && !PyLong_Check(v)) { PyObject *tmp = v; v = PyNumber_Long(tmp); Py_DECREF(tmp); } #endif if (likely(v)) { int one = 1; int is_little = (int)*(unsigned char *)&one; unsigned char *bytes = (unsigned char *)&val; int ret = _PyLong_AsByteArray((PyLongObject *)v, bytes, sizeof(val), is_little, !is_unsigned); Py_DECREF(v); if (likely(!ret)) return val; } #endif return (int) -1; } } else { int val; PyObject *tmp = __Pyx_PyNumber_IntOrLong(x); if (!tmp) return (int) -1; val = __Pyx_PyInt_As_int(tmp); Py_DECREF(tmp); return val; } raise_overflow: PyErr_SetString(PyExc_OverflowError, "value too large to convert to int"); return (int) -1; raise_neg_overflow: PyErr_SetString(PyExc_OverflowError, "can't convert negative value to int"); return (int) -1; } /* FastTypeChecks */ #if CYTHON_COMPILING_IN_CPYTHON static int __Pyx_InBases(PyTypeObject *a, PyTypeObject *b) { while (a) { a = a->tp_base; if (a == b) return 1; } return b == &PyBaseObject_Type; } static CYTHON_INLINE int __Pyx_IsSubtype(PyTypeObject *a, PyTypeObject *b) { PyObject *mro; if (a == b) return 1; mro = a->tp_mro; if (likely(mro)) { Py_ssize_t i, n; n = PyTuple_GET_SIZE(mro); for (i = 0; i < n; i++) { if (PyTuple_GET_ITEM(mro, i) == (PyObject *)b) return 1; } return 0; } return __Pyx_InBases(a, b); } #if PY_MAJOR_VERSION == 2 static int __Pyx_inner_PyErr_GivenExceptionMatches2(PyObject *err, PyObject* exc_type1, PyObject* exc_type2) { PyObject *exception, *value, *tb; int res; __Pyx_PyThreadState_declare __Pyx_PyThreadState_assign 
__Pyx_ErrFetch(&exception, &value, &tb); res = exc_type1 ? PyObject_IsSubclass(err, exc_type1) : 0; if (unlikely(res == -1)) { PyErr_WriteUnraisable(err); res = 0; } if (!res) { res = PyObject_IsSubclass(err, exc_type2); if (unlikely(res == -1)) { PyErr_WriteUnraisable(err); res = 0; } } __Pyx_ErrRestore(exception, value, tb); return res; } #else static CYTHON_INLINE int __Pyx_inner_PyErr_GivenExceptionMatches2(PyObject *err, PyObject* exc_type1, PyObject *exc_type2) { int res = exc_type1 ? __Pyx_IsSubtype((PyTypeObject*)err, (PyTypeObject*)exc_type1) : 0; if (!res) { res = __Pyx_IsSubtype((PyTypeObject*)err, (PyTypeObject*)exc_type2); } return res; } #endif static int __Pyx_PyErr_GivenExceptionMatchesTuple(PyObject *exc_type, PyObject *tuple) { Py_ssize_t i, n; assert(PyExceptionClass_Check(exc_type)); n = PyTuple_GET_SIZE(tuple); #if PY_MAJOR_VERSION >= 3 for (i=0; ip) { #if PY_MAJOR_VERSION < 3 if (t->is_unicode) { *t->p = PyUnicode_DecodeUTF8(t->s, t->n - 1, NULL); } else if (t->intern) { *t->p = PyString_InternFromString(t->s); } else { *t->p = PyString_FromStringAndSize(t->s, t->n - 1); } #else if (t->is_unicode | t->is_str) { if (t->intern) { *t->p = PyUnicode_InternFromString(t->s); } else if (t->encoding) { *t->p = PyUnicode_Decode(t->s, t->n - 1, t->encoding, NULL); } else { *t->p = PyUnicode_FromStringAndSize(t->s, t->n - 1); } } else { *t->p = PyBytes_FromStringAndSize(t->s, t->n - 1); } #endif if (!*t->p) return -1; if (PyObject_Hash(*t->p) == -1) return -1; ++t; } return 0; } static CYTHON_INLINE PyObject* __Pyx_PyUnicode_FromString(const char* c_str) { return __Pyx_PyUnicode_FromStringAndSize(c_str, (Py_ssize_t)strlen(c_str)); } static CYTHON_INLINE const char* __Pyx_PyObject_AsString(PyObject* o) { Py_ssize_t ignore; return __Pyx_PyObject_AsStringAndSize(o, &ignore); } #if __PYX_DEFAULT_STRING_ENCODING_IS_ASCII || __PYX_DEFAULT_STRING_ENCODING_IS_DEFAULT #if !CYTHON_PEP393_ENABLED static const char* __Pyx_PyUnicode_AsStringAndSize(PyObject* o, Py_ssize_t *length) { char* defenc_c; PyObject* defenc = _PyUnicode_AsDefaultEncodedString(o, NULL); if (!defenc) return NULL; defenc_c = PyBytes_AS_STRING(defenc); #if __PYX_DEFAULT_STRING_ENCODING_IS_ASCII { char* end = defenc_c + PyBytes_GET_SIZE(defenc); char* c; for (c = defenc_c; c < end; c++) { if ((unsigned char) (*c) >= 128) { PyUnicode_AsASCIIString(o); return NULL; } } } #endif *length = PyBytes_GET_SIZE(defenc); return defenc_c; } #else static CYTHON_INLINE const char* __Pyx_PyUnicode_AsStringAndSize(PyObject* o, Py_ssize_t *length) { if (unlikely(__Pyx_PyUnicode_READY(o) == -1)) return NULL; #if __PYX_DEFAULT_STRING_ENCODING_IS_ASCII if (likely(PyUnicode_IS_ASCII(o))) { *length = PyUnicode_GET_LENGTH(o); return PyUnicode_AsUTF8(o); } else { PyUnicode_AsASCIIString(o); return NULL; } #else return PyUnicode_AsUTF8AndSize(o, length); #endif } #endif #endif static CYTHON_INLINE const char* __Pyx_PyObject_AsStringAndSize(PyObject* o, Py_ssize_t *length) { #if __PYX_DEFAULT_STRING_ENCODING_IS_ASCII || __PYX_DEFAULT_STRING_ENCODING_IS_DEFAULT if ( #if PY_MAJOR_VERSION < 3 && __PYX_DEFAULT_STRING_ENCODING_IS_ASCII __Pyx_sys_getdefaultencoding_not_ascii && #endif PyUnicode_Check(o)) { return __Pyx_PyUnicode_AsStringAndSize(o, length); } else #endif #if (!CYTHON_COMPILING_IN_PYPY) || (defined(PyByteArray_AS_STRING) && defined(PyByteArray_GET_SIZE)) if (PyByteArray_Check(o)) { *length = PyByteArray_GET_SIZE(o); return PyByteArray_AS_STRING(o); } else #endif { char* result; int r = PyBytes_AsStringAndSize(o, &result, length); if 
(unlikely(r < 0)) { return NULL; } else { return result; } } } static CYTHON_INLINE int __Pyx_PyObject_IsTrue(PyObject* x) { int is_true = x == Py_True; if (is_true | (x == Py_False) | (x == Py_None)) return is_true; else return PyObject_IsTrue(x); } static CYTHON_INLINE int __Pyx_PyObject_IsTrueAndDecref(PyObject* x) { int retval; if (unlikely(!x)) return -1; retval = __Pyx_PyObject_IsTrue(x); Py_DECREF(x); return retval; } static PyObject* __Pyx_PyNumber_IntOrLongWrongResultType(PyObject* result, const char* type_name) { #if PY_MAJOR_VERSION >= 3 if (PyLong_Check(result)) { if (PyErr_WarnFormat(PyExc_DeprecationWarning, 1, "__int__ returned non-int (type %.200s). " "The ability to return an instance of a strict subclass of int " "is deprecated, and may be removed in a future version of Python.", Py_TYPE(result)->tp_name)) { Py_DECREF(result); return NULL; } return result; } #endif PyErr_Format(PyExc_TypeError, "__%.4s__ returned non-%.4s (type %.200s)", type_name, type_name, Py_TYPE(result)->tp_name); Py_DECREF(result); return NULL; } static CYTHON_INLINE PyObject* __Pyx_PyNumber_IntOrLong(PyObject* x) { #if CYTHON_USE_TYPE_SLOTS PyNumberMethods *m; #endif const char *name = NULL; PyObject *res = NULL; #if PY_MAJOR_VERSION < 3 if (likely(PyInt_Check(x) || PyLong_Check(x))) #else if (likely(PyLong_Check(x))) #endif return __Pyx_NewRef(x); #if CYTHON_USE_TYPE_SLOTS m = Py_TYPE(x)->tp_as_number; #if PY_MAJOR_VERSION < 3 if (m && m->nb_int) { name = "int"; res = m->nb_int(x); } else if (m && m->nb_long) { name = "long"; res = m->nb_long(x); } #else if (likely(m && m->nb_int)) { name = "int"; res = m->nb_int(x); } #endif #else if (!PyBytes_CheckExact(x) && !PyUnicode_CheckExact(x)) { res = PyNumber_Int(x); } #endif if (likely(res)) { #if PY_MAJOR_VERSION < 3 if (unlikely(!PyInt_Check(res) && !PyLong_Check(res))) { #else if (unlikely(!PyLong_CheckExact(res))) { #endif return __Pyx_PyNumber_IntOrLongWrongResultType(res, name); } } else if (!PyErr_Occurred()) { PyErr_SetString(PyExc_TypeError, "an integer is required"); } return res; } static CYTHON_INLINE Py_ssize_t __Pyx_PyIndex_AsSsize_t(PyObject* b) { Py_ssize_t ival; PyObject *x; #if PY_MAJOR_VERSION < 3 if (likely(PyInt_CheckExact(b))) { if (sizeof(Py_ssize_t) >= sizeof(long)) return PyInt_AS_LONG(b); else return PyInt_AsSsize_t(b); } #endif if (likely(PyLong_CheckExact(b))) { #if CYTHON_USE_PYLONG_INTERNALS const digit* digits = ((PyLongObject*)b)->ob_digit; const Py_ssize_t size = Py_SIZE(b); if (likely(__Pyx_sst_abs(size) <= 1)) { ival = likely(size) ? 
digits[0] : 0; if (size == -1) ival = -ival; return ival; } else { switch (size) { case 2: if (8 * sizeof(Py_ssize_t) > 2 * PyLong_SHIFT) { return (Py_ssize_t) (((((size_t)digits[1]) << PyLong_SHIFT) | (size_t)digits[0])); } break; case -2: if (8 * sizeof(Py_ssize_t) > 2 * PyLong_SHIFT) { return -(Py_ssize_t) (((((size_t)digits[1]) << PyLong_SHIFT) | (size_t)digits[0])); } break; case 3: if (8 * sizeof(Py_ssize_t) > 3 * PyLong_SHIFT) { return (Py_ssize_t) (((((((size_t)digits[2]) << PyLong_SHIFT) | (size_t)digits[1]) << PyLong_SHIFT) | (size_t)digits[0])); } break; case -3: if (8 * sizeof(Py_ssize_t) > 3 * PyLong_SHIFT) { return -(Py_ssize_t) (((((((size_t)digits[2]) << PyLong_SHIFT) | (size_t)digits[1]) << PyLong_SHIFT) | (size_t)digits[0])); } break; case 4: if (8 * sizeof(Py_ssize_t) > 4 * PyLong_SHIFT) { return (Py_ssize_t) (((((((((size_t)digits[3]) << PyLong_SHIFT) | (size_t)digits[2]) << PyLong_SHIFT) | (size_t)digits[1]) << PyLong_SHIFT) | (size_t)digits[0])); } break; case -4: if (8 * sizeof(Py_ssize_t) > 4 * PyLong_SHIFT) { return -(Py_ssize_t) (((((((((size_t)digits[3]) << PyLong_SHIFT) | (size_t)digits[2]) << PyLong_SHIFT) | (size_t)digits[1]) << PyLong_SHIFT) | (size_t)digits[0])); } break; } } #endif return PyLong_AsSsize_t(b); } x = PyNumber_Index(b); if (!x) return -1; ival = PyInt_AsSsize_t(x); Py_DECREF(x); return ival; } static CYTHON_INLINE PyObject * __Pyx_PyBool_FromLong(long b) { return b ? __Pyx_NewRef(Py_True) : __Pyx_NewRef(Py_False); } static CYTHON_INLINE PyObject * __Pyx_PyInt_FromSize_t(size_t ival) { return PyInt_FromSize_t(ival); } #endif /* Py_PYTHON_H */ aiohttp-3.6.2/aiohttp/_websocket.pyx0000644000175100001650000000302713547410117020026 0ustar vstsdocker00000000000000from cpython cimport PyBytes_AsString #from cpython cimport PyByteArray_AsString # cython still not exports that cdef extern from "Python.h": char* PyByteArray_AsString(bytearray ba) except NULL from libc.stdint cimport uint32_t, uint64_t, uintmax_t def _websocket_mask_cython(object mask, object data): """Note, this function mutates its `data` argument """ cdef: Py_ssize_t data_len, i # bit operations on signed integers are implementation-specific unsigned char * in_buf const unsigned char * mask_buf uint32_t uint32_msk uint64_t uint64_msk assert len(mask) == 4 if not isinstance(mask, bytes): mask = bytes(mask) if isinstance(data, bytearray): data = data else: data = bytearray(data) data_len = len(data) in_buf = PyByteArray_AsString(data) mask_buf = PyBytes_AsString(mask) uint32_msk = (mask_buf)[0] # TODO: align in_data ptr to achieve even faster speeds # does it need in python ?! 
malloc() always aligns to sizeof(long) bytes if sizeof(size_t) >= 8: uint64_msk = uint32_msk uint64_msk = (uint64_msk << 32) | uint32_msk while data_len >= 8: (in_buf)[0] ^= uint64_msk in_buf += 8 data_len -= 8 while data_len >= 4: (in_buf)[0] ^= uint32_msk in_buf += 4 data_len -= 4 for i in range(0, data_len): in_buf[i] ^= mask_buf[i] aiohttp-3.6.2/aiohttp/abc.py0000644000175100001650000001242013547410117016233 0ustar vstsdocker00000000000000import asyncio import logging from abc import ABC, abstractmethod from collections.abc import Sized from http.cookies import BaseCookie, Morsel # noqa from typing import ( TYPE_CHECKING, Any, Awaitable, Callable, Dict, Generator, Iterable, List, Optional, Tuple, ) from multidict import CIMultiDict # noqa from yarl import URL from .helpers import get_running_loop from .typedefs import LooseCookies if TYPE_CHECKING: # pragma: no cover from .web_request import BaseRequest, Request from .web_response import StreamResponse from .web_app import Application from .web_exceptions import HTTPException else: BaseRequest = Request = Application = StreamResponse = None HTTPException = None class AbstractRouter(ABC): def __init__(self) -> None: self._frozen = False def post_init(self, app: Application) -> None: """Post init stage. Not an abstract method for sake of backward compatibility, but if the router wants to be aware of the application it can override this. """ @property def frozen(self) -> bool: return self._frozen def freeze(self) -> None: """Freeze router.""" self._frozen = True @abstractmethod async def resolve(self, request: Request) -> 'AbstractMatchInfo': """Return MATCH_INFO for given request""" class AbstractMatchInfo(ABC): @property # pragma: no branch @abstractmethod def handler(self) -> Callable[[Request], Awaitable[StreamResponse]]: """Execute matched request handler""" @property @abstractmethod def expect_handler(self) -> Callable[[Request], Awaitable[None]]: """Expect handler for 100-continue processing""" @property # pragma: no branch @abstractmethod def http_exception(self) -> Optional[HTTPException]: """HTTPException instance raised on router's resolving, or None""" @abstractmethod # pragma: no branch def get_info(self) -> Dict[str, Any]: """Return a dict with additional info useful for introspection""" @property # pragma: no branch @abstractmethod def apps(self) -> Tuple[Application, ...]: """Stack of nested applications. Top level application is left-most element. """ @abstractmethod def add_app(self, app: Application) -> None: """Add application to the nested apps stack.""" @abstractmethod def freeze(self) -> None: """Freeze the match info. The method is called after route resolution. After the call .add_app() is forbidden. 
""" class AbstractView(ABC): """Abstract class based view.""" def __init__(self, request: Request) -> None: self._request = request @property def request(self) -> Request: """Request instance.""" return self._request @abstractmethod def __await__(self) -> Generator[Any, None, StreamResponse]: """Execute the view handler.""" class AbstractResolver(ABC): """Abstract DNS resolver.""" @abstractmethod async def resolve(self, host: str, port: int, family: int) -> List[Dict[str, Any]]: """Return IP address for given hostname""" @abstractmethod async def close(self) -> None: """Release resolver""" if TYPE_CHECKING: # pragma: no cover IterableBase = Iterable[Morsel[str]] else: IterableBase = Iterable class AbstractCookieJar(Sized, IterableBase): """Abstract Cookie Jar.""" def __init__(self, *, loop: Optional[asyncio.AbstractEventLoop]=None) -> None: self._loop = get_running_loop(loop) @abstractmethod def clear(self) -> None: """Clear all cookies.""" @abstractmethod def update_cookies(self, cookies: LooseCookies, response_url: URL=URL()) -> None: """Update cookies.""" @abstractmethod def filter_cookies(self, request_url: URL) -> 'BaseCookie[str]': """Return the jar's cookies filtered by their attributes.""" class AbstractStreamWriter(ABC): """Abstract stream writer.""" buffer_size = 0 output_size = 0 length = 0 # type: Optional[int] @abstractmethod async def write(self, chunk: bytes) -> None: """Write chunk into stream.""" @abstractmethod async def write_eof(self, chunk: bytes=b'') -> None: """Write last chunk.""" @abstractmethod async def drain(self) -> None: """Flush the write buffer.""" @abstractmethod def enable_compression(self, encoding: str='deflate') -> None: """Enable HTTP body compression""" @abstractmethod def enable_chunking(self) -> None: """Enable HTTP chunked mode""" @abstractmethod async def write_headers(self, status_line: str, headers: 'CIMultiDict[str]') -> None: """Write HTTP headers""" class AbstractAccessLogger(ABC): """Abstract writer to access log.""" def __init__(self, logger: logging.Logger, log_format: str) -> None: self.logger = logger self.log_format = log_format @abstractmethod def log(self, request: BaseRequest, response: StreamResponse, time: float) -> None: """Emit log to logger.""" aiohttp-3.6.2/aiohttp/base_protocol.py0000644000175100001650000000514713547410117020351 0ustar vstsdocker00000000000000import asyncio from typing import Optional, cast from .tcp_helpers import tcp_nodelay class BaseProtocol(asyncio.Protocol): __slots__ = ('_loop', '_paused', '_drain_waiter', '_connection_lost', '_reading_paused', 'transport') def __init__(self, loop: asyncio.AbstractEventLoop) -> None: self._loop = loop # type: asyncio.AbstractEventLoop self._paused = False self._drain_waiter = None # type: Optional[asyncio.Future[None]] self._connection_lost = False self._reading_paused = False self.transport = None # type: Optional[asyncio.Transport] def pause_writing(self) -> None: assert not self._paused self._paused = True def resume_writing(self) -> None: assert self._paused self._paused = False waiter = self._drain_waiter if waiter is not None: self._drain_waiter = None if not waiter.done(): waiter.set_result(None) def pause_reading(self) -> None: if not self._reading_paused and self.transport is not None: try: self.transport.pause_reading() except (AttributeError, NotImplementedError, RuntimeError): pass self._reading_paused = True def resume_reading(self) -> None: if self._reading_paused and self.transport is not None: try: self.transport.resume_reading() except (AttributeError, 
NotImplementedError, RuntimeError): pass self._reading_paused = False def connection_made(self, transport: asyncio.BaseTransport) -> None: tr = cast(asyncio.Transport, transport) tcp_nodelay(tr, True) self.transport = tr def connection_lost(self, exc: Optional[BaseException]) -> None: self._connection_lost = True # Wake up the writer if currently paused. self.transport = None if not self._paused: return waiter = self._drain_waiter if waiter is None: return self._drain_waiter = None if waiter.done(): return if exc is None: waiter.set_result(None) else: waiter.set_exception(exc) async def _drain_helper(self) -> None: if self._connection_lost: raise ConnectionResetError('Connection lost') if not self._paused: return waiter = self._drain_waiter assert waiter is None or waiter.cancelled() waiter = self._loop.create_future() self._drain_waiter = waiter await waiter aiohttp-3.6.2/aiohttp/client.py0000644000175100001650000012571313547410117016776 0ustar vstsdocker00000000000000"""HTTP Client for asyncio.""" import asyncio import base64 import hashlib import json import os import sys import traceback import warnings from types import SimpleNamespace, TracebackType from typing import ( # noqa Any, Coroutine, Generator, Generic, Iterable, List, Mapping, Optional, Set, Tuple, Type, TypeVar, Union, ) import attr from multidict import CIMultiDict, MultiDict, MultiDictProxy, istr from yarl import URL from . import hdrs, http, payload from .abc import AbstractCookieJar from .client_exceptions import ClientConnectionError as ClientConnectionError from .client_exceptions import ( ClientConnectorCertificateError as ClientConnectorCertificateError, ) from .client_exceptions import ClientConnectorError as ClientConnectorError from .client_exceptions import ( ClientConnectorSSLError as ClientConnectorSSLError, ) from .client_exceptions import ClientError as ClientError from .client_exceptions import ClientHttpProxyError as ClientHttpProxyError from .client_exceptions import ClientOSError as ClientOSError from .client_exceptions import ClientPayloadError as ClientPayloadError from .client_exceptions import ( ClientProxyConnectionError as ClientProxyConnectionError, ) from .client_exceptions import ClientResponseError as ClientResponseError from .client_exceptions import ClientSSLError as ClientSSLError from .client_exceptions import ContentTypeError as ContentTypeError from .client_exceptions import InvalidURL as InvalidURL from .client_exceptions import ServerConnectionError as ServerConnectionError from .client_exceptions import ( ServerDisconnectedError as ServerDisconnectedError, ) from .client_exceptions import ( ServerFingerprintMismatch as ServerFingerprintMismatch, ) from .client_exceptions import ServerTimeoutError as ServerTimeoutError from .client_exceptions import TooManyRedirects as TooManyRedirects from .client_exceptions import WSServerHandshakeError as WSServerHandshakeError from .client_reqrep import ClientRequest as ClientRequest from .client_reqrep import ClientResponse as ClientResponse from .client_reqrep import Fingerprint as Fingerprint from .client_reqrep import RequestInfo as RequestInfo from .client_reqrep import _merge_ssl_params from .client_ws import ClientWebSocketResponse as ClientWebSocketResponse from .connector import BaseConnector as BaseConnector from .connector import NamedPipeConnector as NamedPipeConnector from .connector import TCPConnector as TCPConnector from .connector import UnixConnector as UnixConnector from .cookiejar import CookieJar from .helpers import ( DEBUG, 
PY_36, BasicAuth, CeilTimeout, TimeoutHandle, get_running_loop, proxies_from_env, sentinel, strip_auth_from_url, ) from .http import WS_KEY, HttpVersion, WebSocketReader, WebSocketWriter from .http_websocket import ( # noqa WSHandshakeError, WSMessage, ws_ext_gen, ws_ext_parse, ) from .streams import FlowControlDataQueue from .tracing import Trace, TraceConfig from .typedefs import JSONEncoder, LooseCookies, LooseHeaders, StrOrURL __all__ = ( # client_exceptions 'ClientConnectionError', 'ClientConnectorCertificateError', 'ClientConnectorError', 'ClientConnectorSSLError', 'ClientError', 'ClientHttpProxyError', 'ClientOSError', 'ClientPayloadError', 'ClientProxyConnectionError', 'ClientResponseError', 'ClientSSLError', 'ContentTypeError', 'InvalidURL', 'ServerConnectionError', 'ServerDisconnectedError', 'ServerFingerprintMismatch', 'ServerTimeoutError', 'TooManyRedirects', 'WSServerHandshakeError', # client_reqrep 'ClientRequest', 'ClientResponse', 'Fingerprint', 'RequestInfo', # connector 'BaseConnector', 'TCPConnector', 'UnixConnector', 'NamedPipeConnector', # client_ws 'ClientWebSocketResponse', # client 'ClientSession', 'ClientTimeout', 'request') try: from ssl import SSLContext except ImportError: # pragma: no cover SSLContext = object # type: ignore @attr.s(frozen=True, slots=True) class ClientTimeout: total = attr.ib(type=Optional[float], default=None) connect = attr.ib(type=Optional[float], default=None) sock_read = attr.ib(type=Optional[float], default=None) sock_connect = attr.ib(type=Optional[float], default=None) # pool_queue_timeout = attr.ib(type=float, default=None) # dns_resolution_timeout = attr.ib(type=float, default=None) # socket_connect_timeout = attr.ib(type=float, default=None) # connection_acquiring_timeout = attr.ib(type=float, default=None) # new_connection_timeout = attr.ib(type=float, default=None) # http_header_timeout = attr.ib(type=float, default=None) # response_body_timeout = attr.ib(type=float, default=None) # to create a timeout specific for a single request, either # - create a completely new one to overwrite the default # - or use http://www.attrs.org/en/stable/api.html#attr.evolve # to overwrite the defaults # 5 Minute default read timeout DEFAULT_TIMEOUT = ClientTimeout(total=5*60) _RetType = TypeVar('_RetType') class ClientSession: """First-class interface for making HTTP requests.""" ATTRS = frozenset([ '_source_traceback', '_connector', 'requote_redirect_url', '_loop', '_cookie_jar', '_connector_owner', '_default_auth', '_version', '_json_serialize', '_requote_redirect_url', '_timeout', '_raise_for_status', '_auto_decompress', '_trust_env', '_default_headers', '_skip_auto_headers', '_request_class', '_response_class', '_ws_response_class', '_trace_configs']) _source_traceback = None def __init__(self, *, connector: Optional[BaseConnector]=None, loop: Optional[asyncio.AbstractEventLoop]=None, cookies: Optional[LooseCookies]=None, headers: Optional[LooseHeaders]=None, skip_auto_headers: Optional[Iterable[str]]=None, auth: Optional[BasicAuth]=None, json_serialize: JSONEncoder=json.dumps, request_class: Type[ClientRequest]=ClientRequest, response_class: Type[ClientResponse]=ClientResponse, ws_response_class: Type[ClientWebSocketResponse]=ClientWebSocketResponse, # noqa version: HttpVersion=http.HttpVersion11, cookie_jar: Optional[AbstractCookieJar]=None, connector_owner: bool=True, raise_for_status: bool=False, read_timeout: Union[float, object]=sentinel, conn_timeout: Optional[float]=None, timeout: Union[object, ClientTimeout]=sentinel, auto_decompress: 
bool=True, trust_env: bool=False, requote_redirect_url: bool=True, trace_configs: Optional[List[TraceConfig]]=None) -> None: if loop is None: if connector is not None: loop = connector._loop loop = get_running_loop(loop) if connector is None: connector = TCPConnector(loop=loop) if connector._loop is not loop: raise RuntimeError( "Session and connector has to use same event loop") self._loop = loop if loop.get_debug(): self._source_traceback = traceback.extract_stack(sys._getframe(1)) if cookie_jar is None: cookie_jar = CookieJar(loop=loop) self._cookie_jar = cookie_jar if cookies is not None: self._cookie_jar.update_cookies(cookies) self._connector = connector # type: Optional[BaseConnector] self._connector_owner = connector_owner self._default_auth = auth self._version = version self._json_serialize = json_serialize if timeout is sentinel: self._timeout = DEFAULT_TIMEOUT if read_timeout is not sentinel: warnings.warn("read_timeout is deprecated, " "use timeout argument instead", DeprecationWarning, stacklevel=2) self._timeout = attr.evolve(self._timeout, total=read_timeout) if conn_timeout is not None: self._timeout = attr.evolve(self._timeout, connect=conn_timeout) warnings.warn("conn_timeout is deprecated, " "use timeout argument instead", DeprecationWarning, stacklevel=2) else: self._timeout = timeout # type: ignore if read_timeout is not sentinel: raise ValueError("read_timeout and timeout parameters " "conflict, please setup " "timeout.read") if conn_timeout is not None: raise ValueError("conn_timeout and timeout parameters " "conflict, please setup " "timeout.connect") self._raise_for_status = raise_for_status self._auto_decompress = auto_decompress self._trust_env = trust_env self._requote_redirect_url = requote_redirect_url # Convert to list of tuples if headers: real_headers = CIMultiDict(headers) # type: CIMultiDict[str] else: real_headers = CIMultiDict() self._default_headers = real_headers # type: CIMultiDict[str] if skip_auto_headers is not None: self._skip_auto_headers = frozenset([istr(i) for i in skip_auto_headers]) else: self._skip_auto_headers = frozenset() self._request_class = request_class self._response_class = response_class self._ws_response_class = ws_response_class self._trace_configs = trace_configs or [] for trace_config in self._trace_configs: trace_config.freeze() def __init_subclass__(cls: Type['ClientSession']) -> None: warnings.warn("Inheritance class {} from ClientSession " "is discouraged".format(cls.__name__), DeprecationWarning, stacklevel=2) if DEBUG: def __setattr__(self, name: str, val: Any) -> None: if name not in self.ATTRS: warnings.warn("Setting custom ClientSession.{} attribute " "is discouraged".format(name), DeprecationWarning, stacklevel=2) super().__setattr__(name, val) def __del__(self, _warnings: Any=warnings) -> None: if not self.closed: if PY_36: kwargs = {'source': self} else: kwargs = {} _warnings.warn("Unclosed client session {!r}".format(self), ResourceWarning, **kwargs) context = {'client_session': self, 'message': 'Unclosed client session'} if self._source_traceback is not None: context['source_traceback'] = self._source_traceback self._loop.call_exception_handler(context) def request(self, method: str, url: StrOrURL, **kwargs: Any) -> '_RequestContextManager': """Perform HTTP request.""" return _RequestContextManager(self._request(method, url, **kwargs)) async def _request( self, method: str, str_or_url: StrOrURL, *, params: Optional[Mapping[str, str]]=None, data: Any=None, json: Any=None, cookies: Optional[LooseCookies]=None, 
headers: LooseHeaders=None, skip_auto_headers: Optional[Iterable[str]]=None, auth: Optional[BasicAuth]=None, allow_redirects: bool=True, max_redirects: int=10, compress: Optional[str]=None, chunked: Optional[bool]=None, expect100: bool=False, raise_for_status: Optional[bool]=None, read_until_eof: bool=True, proxy: Optional[StrOrURL]=None, proxy_auth: Optional[BasicAuth]=None, timeout: Union[ClientTimeout, object]=sentinel, verify_ssl: Optional[bool]=None, fingerprint: Optional[bytes]=None, ssl_context: Optional[SSLContext]=None, ssl: Optional[Union[SSLContext, bool, Fingerprint]]=None, proxy_headers: Optional[LooseHeaders]=None, trace_request_ctx: Optional[SimpleNamespace]=None ) -> ClientResponse: # NOTE: timeout clamps existing connect and read timeouts. We cannot # set the default to None because we need to detect if the user wants # to use the existing timeouts by setting timeout to None. if self.closed: raise RuntimeError('Session is closed') ssl = _merge_ssl_params(ssl, verify_ssl, ssl_context, fingerprint) if data is not None and json is not None: raise ValueError( 'data and json parameters can not be used at the same time') elif json is not None: data = payload.JsonPayload(json, dumps=self._json_serialize) if not isinstance(chunked, bool) and chunked is not None: warnings.warn( 'Chunk size is deprecated #1615', DeprecationWarning) redirects = 0 history = [] version = self._version # Merge with default headers and transform to CIMultiDict headers = self._prepare_headers(headers) proxy_headers = self._prepare_headers(proxy_headers) try: url = URL(str_or_url) except ValueError: raise InvalidURL(str_or_url) skip_headers = set(self._skip_auto_headers) if skip_auto_headers is not None: for i in skip_auto_headers: skip_headers.add(istr(i)) if proxy is not None: try: proxy = URL(proxy) except ValueError: raise InvalidURL(proxy) if timeout is sentinel: real_timeout = self._timeout # type: ClientTimeout else: if not isinstance(timeout, ClientTimeout): real_timeout = ClientTimeout(total=timeout) # type: ignore else: real_timeout = timeout # timeout is cumulative for all request operations # (request, redirects, responses, data consuming) tm = TimeoutHandle(self._loop, real_timeout.total) handle = tm.start() traces = [ Trace( self, trace_config, trace_config.trace_config_ctx( trace_request_ctx=trace_request_ctx) ) for trace_config in self._trace_configs ] for trace in traces: await trace.send_request_start( method, url, headers ) timer = tm.timer() try: with timer: while True: url, auth_from_url = strip_auth_from_url(url) if auth and auth_from_url: raise ValueError("Cannot combine AUTH argument with " "credentials encoded in URL") if auth is None: auth = auth_from_url if auth is None: auth = self._default_auth # It would be confusing if we support explicit # Authorization header with auth argument if (headers is not None and auth is not None and hdrs.AUTHORIZATION in headers): raise ValueError("Cannot combine AUTHORIZATION header " "with AUTH argument or credentials " "encoded in URL") all_cookies = self._cookie_jar.filter_cookies(url) if cookies is not None: tmp_cookie_jar = CookieJar() tmp_cookie_jar.update_cookies(cookies) req_cookies = tmp_cookie_jar.filter_cookies(url) if req_cookies: all_cookies.load(req_cookies) if proxy is not None: proxy = URL(proxy) elif self._trust_env: for scheme, proxy_info in proxies_from_env().items(): if scheme == url.scheme: proxy = proxy_info.proxy proxy_auth = proxy_info.proxy_auth break req = self._request_class( method, url, params=params, headers=headers, 
skip_auto_headers=skip_headers, data=data, cookies=all_cookies, auth=auth, version=version, compress=compress, chunked=chunked, expect100=expect100, loop=self._loop, response_class=self._response_class, proxy=proxy, proxy_auth=proxy_auth, timer=timer, session=self, ssl=ssl, proxy_headers=proxy_headers, traces=traces) # connection timeout try: with CeilTimeout(real_timeout.connect, loop=self._loop): assert self._connector is not None conn = await self._connector.connect( req, traces=traces, timeout=real_timeout ) except asyncio.TimeoutError as exc: raise ServerTimeoutError( 'Connection timeout ' 'to host {0}'.format(url)) from exc assert conn.transport is not None assert conn.protocol is not None conn.protocol.set_response_params( timer=timer, skip_payload=method.upper() == 'HEAD', read_until_eof=read_until_eof, auto_decompress=self._auto_decompress, read_timeout=real_timeout.sock_read) try: try: resp = await req.send(conn) try: await resp.start(conn) except BaseException: resp.close() raise except BaseException: conn.close() raise except ClientError: raise except OSError as exc: raise ClientOSError(*exc.args) from exc self._cookie_jar.update_cookies(resp.cookies, resp.url) # redirects if resp.status in ( 301, 302, 303, 307, 308) and allow_redirects: for trace in traces: await trace.send_request_redirect( method, url, headers, resp ) redirects += 1 history.append(resp) if max_redirects and redirects >= max_redirects: resp.close() raise TooManyRedirects( history[0].request_info, tuple(history)) # For 301 and 302, mimic IE, now changed in RFC # https://github.com/kennethreitz/requests/pull/269 if (resp.status == 303 and resp.method != hdrs.METH_HEAD) \ or (resp.status in (301, 302) and resp.method == hdrs.METH_POST): method = hdrs.METH_GET data = None if headers.get(hdrs.CONTENT_LENGTH): headers.pop(hdrs.CONTENT_LENGTH) r_url = (resp.headers.get(hdrs.LOCATION) or resp.headers.get(hdrs.URI)) if r_url is None: # see github.com/aio-libs/aiohttp/issues/2022 break else: # reading from correct redirection # response is forbidden resp.release() try: r_url = URL( r_url, encoded=not self._requote_redirect_url) except ValueError: raise InvalidURL(r_url) scheme = r_url.scheme if scheme not in ('http', 'https', ''): resp.close() raise ValueError( 'Can redirect only to http or https') elif not scheme: r_url = url.join(r_url) if url.origin() != r_url.origin(): auth = None headers.pop(hdrs.AUTHORIZATION, None) url = r_url params = None resp.release() continue break # check response status if raise_for_status is None: raise_for_status = self._raise_for_status if raise_for_status: resp.raise_for_status() # register connection if handle is not None: if resp.connection is not None: resp.connection.add_callback(handle.cancel) else: handle.cancel() resp._history = tuple(history) for trace in traces: await trace.send_request_end( method, url, headers, resp ) return resp except BaseException as e: # cleanup timer tm.close() if handle: handle.cancel() handle = None for trace in traces: await trace.send_request_exception( method, url, headers, e ) raise def ws_connect( self, url: StrOrURL, *, method: str=hdrs.METH_GET, protocols: Iterable[str]=(), timeout: float=10.0, receive_timeout: Optional[float]=None, autoclose: bool=True, autoping: bool=True, heartbeat: Optional[float]=None, auth: Optional[BasicAuth]=None, origin: Optional[str]=None, headers: Optional[LooseHeaders]=None, proxy: Optional[StrOrURL]=None, proxy_auth: Optional[BasicAuth]=None, ssl: Union[SSLContext, bool, None, Fingerprint]=None, verify_ssl: 
Optional[bool]=None, fingerprint: Optional[bytes]=None, ssl_context: Optional[SSLContext]=None, proxy_headers: Optional[LooseHeaders]=None, compress: int=0, max_msg_size: int=4*1024*1024) -> '_WSRequestContextManager': """Initiate websocket connection.""" return _WSRequestContextManager( self._ws_connect(url, method=method, protocols=protocols, timeout=timeout, receive_timeout=receive_timeout, autoclose=autoclose, autoping=autoping, heartbeat=heartbeat, auth=auth, origin=origin, headers=headers, proxy=proxy, proxy_auth=proxy_auth, ssl=ssl, verify_ssl=verify_ssl, fingerprint=fingerprint, ssl_context=ssl_context, proxy_headers=proxy_headers, compress=compress, max_msg_size=max_msg_size)) async def _ws_connect( self, url: StrOrURL, *, method: str=hdrs.METH_GET, protocols: Iterable[str]=(), timeout: float=10.0, receive_timeout: Optional[float]=None, autoclose: bool=True, autoping: bool=True, heartbeat: Optional[float]=None, auth: Optional[BasicAuth]=None, origin: Optional[str]=None, headers: Optional[LooseHeaders]=None, proxy: Optional[StrOrURL]=None, proxy_auth: Optional[BasicAuth]=None, ssl: Union[SSLContext, bool, None, Fingerprint]=None, verify_ssl: Optional[bool]=None, fingerprint: Optional[bytes]=None, ssl_context: Optional[SSLContext]=None, proxy_headers: Optional[LooseHeaders]=None, compress: int=0, max_msg_size: int=4*1024*1024 ) -> ClientWebSocketResponse: if headers is None: real_headers = CIMultiDict() # type: CIMultiDict[str] else: real_headers = CIMultiDict(headers) default_headers = { hdrs.UPGRADE: hdrs.WEBSOCKET, hdrs.CONNECTION: hdrs.UPGRADE, hdrs.SEC_WEBSOCKET_VERSION: '13', } for key, value in default_headers.items(): real_headers.setdefault(key, value) sec_key = base64.b64encode(os.urandom(16)) real_headers[hdrs.SEC_WEBSOCKET_KEY] = sec_key.decode() if protocols: real_headers[hdrs.SEC_WEBSOCKET_PROTOCOL] = ','.join(protocols) if origin is not None: real_headers[hdrs.ORIGIN] = origin if compress: extstr = ws_ext_gen(compress=compress) real_headers[hdrs.SEC_WEBSOCKET_EXTENSIONS] = extstr ssl = _merge_ssl_params(ssl, verify_ssl, ssl_context, fingerprint) # send request resp = await self.request(method, url, headers=real_headers, read_until_eof=False, auth=auth, proxy=proxy, proxy_auth=proxy_auth, ssl=ssl, proxy_headers=proxy_headers) try: # check handshake if resp.status != 101: raise WSServerHandshakeError( resp.request_info, resp.history, message='Invalid response status', status=resp.status, headers=resp.headers) if resp.headers.get(hdrs.UPGRADE, '').lower() != 'websocket': raise WSServerHandshakeError( resp.request_info, resp.history, message='Invalid upgrade header', status=resp.status, headers=resp.headers) if resp.headers.get(hdrs.CONNECTION, '').lower() != 'upgrade': raise WSServerHandshakeError( resp.request_info, resp.history, message='Invalid connection header', status=resp.status, headers=resp.headers) # key calculation key = resp.headers.get(hdrs.SEC_WEBSOCKET_ACCEPT, '') match = base64.b64encode( hashlib.sha1(sec_key + WS_KEY).digest()).decode() if key != match: raise WSServerHandshakeError( resp.request_info, resp.history, message='Invalid challenge response', status=resp.status, headers=resp.headers) # websocket protocol protocol = None if protocols and hdrs.SEC_WEBSOCKET_PROTOCOL in resp.headers: resp_protocols = [ proto.strip() for proto in resp.headers[hdrs.SEC_WEBSOCKET_PROTOCOL].split(',')] for proto in resp_protocols: if proto in protocols: protocol = proto break # websocket compress notakeover = False if compress: compress_hdrs = 
resp.headers.get(hdrs.SEC_WEBSOCKET_EXTENSIONS) if compress_hdrs: try: compress, notakeover = ws_ext_parse(compress_hdrs) except WSHandshakeError as exc: raise WSServerHandshakeError( resp.request_info, resp.history, message=exc.args[0], status=resp.status, headers=resp.headers) else: compress = 0 notakeover = False conn = resp.connection assert conn is not None proto = conn.protocol assert proto is not None transport = conn.transport assert transport is not None reader = FlowControlDataQueue( proto, limit=2 ** 16, loop=self._loop) # type: FlowControlDataQueue[WSMessage] # noqa proto.set_parser(WebSocketReader(reader, max_msg_size), reader) writer = WebSocketWriter( proto, transport, use_mask=True, compress=compress, notakeover=notakeover) except BaseException: resp.close() raise else: return self._ws_response_class(reader, writer, protocol, resp, timeout, autoclose, autoping, self._loop, receive_timeout=receive_timeout, heartbeat=heartbeat, compress=compress, client_notakeover=notakeover) def _prepare_headers( self, headers: Optional[LooseHeaders]) -> 'CIMultiDict[str]': """ Add default headers and transform it to CIMultiDict """ # Convert headers to MultiDict result = CIMultiDict(self._default_headers) if headers: if not isinstance(headers, (MultiDictProxy, MultiDict)): headers = CIMultiDict(headers) added_names = set() # type: Set[str] for key, value in headers.items(): if key in added_names: result.add(key, value) else: result[key] = value added_names.add(key) return result def get(self, url: StrOrURL, *, allow_redirects: bool=True, **kwargs: Any) -> '_RequestContextManager': """Perform HTTP GET request.""" return _RequestContextManager( self._request(hdrs.METH_GET, url, allow_redirects=allow_redirects, **kwargs)) def options(self, url: StrOrURL, *, allow_redirects: bool=True, **kwargs: Any) -> '_RequestContextManager': """Perform HTTP OPTIONS request.""" return _RequestContextManager( self._request(hdrs.METH_OPTIONS, url, allow_redirects=allow_redirects, **kwargs)) def head(self, url: StrOrURL, *, allow_redirects: bool=False, **kwargs: Any) -> '_RequestContextManager': """Perform HTTP HEAD request.""" return _RequestContextManager( self._request(hdrs.METH_HEAD, url, allow_redirects=allow_redirects, **kwargs)) def post(self, url: StrOrURL, *, data: Any=None, **kwargs: Any) -> '_RequestContextManager': """Perform HTTP POST request.""" return _RequestContextManager( self._request(hdrs.METH_POST, url, data=data, **kwargs)) def put(self, url: StrOrURL, *, data: Any=None, **kwargs: Any) -> '_RequestContextManager': """Perform HTTP PUT request.""" return _RequestContextManager( self._request(hdrs.METH_PUT, url, data=data, **kwargs)) def patch(self, url: StrOrURL, *, data: Any=None, **kwargs: Any) -> '_RequestContextManager': """Perform HTTP PATCH request.""" return _RequestContextManager( self._request(hdrs.METH_PATCH, url, data=data, **kwargs)) def delete(self, url: StrOrURL, **kwargs: Any) -> '_RequestContextManager': """Perform HTTP DELETE request.""" return _RequestContextManager( self._request(hdrs.METH_DELETE, url, **kwargs)) async def close(self) -> None: """Close underlying connector. Release all acquired resources. """ if not self.closed: if self._connector is not None and self._connector_owner: await self._connector.close() self._connector = None @property def closed(self) -> bool: """Is client session closed. A readonly property. 
""" return self._connector is None or self._connector.closed @property def connector(self) -> Optional[BaseConnector]: """Connector instance used for the session.""" return self._connector @property def cookie_jar(self) -> AbstractCookieJar: """The session cookies.""" return self._cookie_jar @property def version(self) -> Tuple[int, int]: """The session HTTP protocol version.""" return self._version @property def requote_redirect_url(self) -> bool: """Do URL requoting on redirection handling.""" return self._requote_redirect_url @requote_redirect_url.setter def requote_redirect_url(self, val: bool) -> None: """Do URL requoting on redirection handling.""" warnings.warn("session.requote_redirect_url modification " "is deprecated #2778", DeprecationWarning, stacklevel=2) self._requote_redirect_url = val @property def loop(self) -> asyncio.AbstractEventLoop: """Session's loop.""" warnings.warn("client.loop property is deprecated", DeprecationWarning, stacklevel=2) return self._loop def detach(self) -> None: """Detach connector from session without closing the former. Session is switched to closed state anyway. """ self._connector = None def __enter__(self) -> None: raise TypeError("Use async with instead") def __exit__(self, exc_type: Optional[Type[BaseException]], exc_val: Optional[BaseException], exc_tb: Optional[TracebackType]) -> None: # __exit__ should exist in pair with __enter__ but never executed pass # pragma: no cover async def __aenter__(self) -> 'ClientSession': return self async def __aexit__(self, exc_type: Optional[Type[BaseException]], exc_val: Optional[BaseException], exc_tb: Optional[TracebackType]) -> None: await self.close() class _BaseRequestContextManager(Coroutine[Any, Any, _RetType], Generic[_RetType]): __slots__ = ('_coro', '_resp') def __init__( self, coro: Coroutine['asyncio.Future[Any]', None, _RetType] ) -> None: self._coro = coro def send(self, arg: None) -> 'asyncio.Future[Any]': return self._coro.send(arg) def throw(self, arg: BaseException) -> None: # type: ignore self._coro.throw(arg) # type: ignore def close(self) -> None: return self._coro.close() def __await__(self) -> Generator[Any, None, _RetType]: ret = self._coro.__await__() return ret def __iter__(self) -> Generator[Any, None, _RetType]: return self.__await__() async def __aenter__(self) -> _RetType: self._resp = await self._coro return self._resp class _RequestContextManager(_BaseRequestContextManager[ClientResponse]): async def __aexit__(self, exc_type: Optional[Type[BaseException]], exc: Optional[BaseException], tb: Optional[TracebackType]) -> None: # We're basing behavior on the exception as it can be caused by # user code unrelated to the status of the connection. If you # would like to close a connection you must do that # explicitly. Otherwise connection error handling should kick in # and close/recycle the connection as required. 
self._resp.release() class _WSRequestContextManager(_BaseRequestContextManager[ ClientWebSocketResponse]): async def __aexit__(self, exc_type: Optional[Type[BaseException]], exc: Optional[BaseException], tb: Optional[TracebackType]) -> None: await self._resp.close() class _SessionRequestContextManager: __slots__ = ('_coro', '_resp', '_session') def __init__(self, coro: Coroutine['asyncio.Future[Any]', None, ClientResponse], session: ClientSession) -> None: self._coro = coro self._resp = None # type: Optional[ClientResponse] self._session = session async def __aenter__(self) -> ClientResponse: try: self._resp = await self._coro except BaseException: await self._session.close() raise else: return self._resp async def __aexit__(self, exc_type: Optional[Type[BaseException]], exc: Optional[BaseException], tb: Optional[TracebackType]) -> None: assert self._resp is not None self._resp.close() await self._session.close() def request( method: str, url: StrOrURL, *, params: Optional[Mapping[str, str]]=None, data: Any=None, json: Any=None, headers: LooseHeaders=None, skip_auto_headers: Optional[Iterable[str]]=None, auth: Optional[BasicAuth]=None, allow_redirects: bool=True, max_redirects: int=10, compress: Optional[str]=None, chunked: Optional[bool]=None, expect100: bool=False, raise_for_status: Optional[bool]=None, read_until_eof: bool=True, proxy: Optional[StrOrURL]=None, proxy_auth: Optional[BasicAuth]=None, timeout: Union[ClientTimeout, object]=sentinel, cookies: Optional[LooseCookies]=None, version: HttpVersion=http.HttpVersion11, connector: Optional[BaseConnector]=None, loop: Optional[asyncio.AbstractEventLoop]=None ) -> _SessionRequestContextManager: """Constructs and sends a request. Returns response object. method - HTTP method url - request url params - (optional) Dictionary or bytes to be sent in the query string of the new request data - (optional) Dictionary, bytes, or file-like object to send in the body of the request json - (optional) Any json compatible python object headers - (optional) Dictionary of HTTP Headers to send with the request cookies - (optional) Dict object to send with the request auth - (optional) BasicAuth named tuple represent HTTP Basic Auth auth - aiohttp.helpers.BasicAuth allow_redirects - (optional) If set to False, do not follow redirects version - Request HTTP version. compress - Set to True if request has to be compressed with deflate encoding. chunked - Set to chunk size for chunked transfer encoding. expect100 - Expect 100-continue response from server. connector - BaseConnector sub-class instance to support connection pooling. read_until_eof - Read response until eof if response does not have Content-Length header. loop - Optional event loop. timeout - Optional ClientTimeout settings structure, 5min total timeout by default. 
Usage:: >>> import aiohttp >>> resp = await aiohttp.request('GET', 'http://python.org/') >>> resp >>> data = await resp.read() """ connector_owner = False if connector is None: connector_owner = True connector = TCPConnector(loop=loop, force_close=True) session = ClientSession( loop=loop, cookies=cookies, version=version, timeout=timeout, connector=connector, connector_owner=connector_owner) return _SessionRequestContextManager( session._request(method, url, params=params, data=data, json=json, headers=headers, skip_auto_headers=skip_auto_headers, auth=auth, allow_redirects=allow_redirects, max_redirects=max_redirects, compress=compress, chunked=chunked, expect100=expect100, raise_for_status=raise_for_status, read_until_eof=read_until_eof, proxy=proxy, proxy_auth=proxy_auth,), session) aiohttp-3.6.2/aiohttp/client_exceptions.py0000644000175100001650000002045613547410117021235 0ustar vstsdocker00000000000000"""HTTP related errors.""" import asyncio import warnings from typing import TYPE_CHECKING, Any, Optional, Tuple, Union from .typedefs import _CIMultiDict try: import ssl SSLContext = ssl.SSLContext except ImportError: # pragma: no cover ssl = SSLContext = None # type: ignore if TYPE_CHECKING: # pragma: no cover from .client_reqrep import (RequestInfo, ClientResponse, ConnectionKey, # noqa Fingerprint) else: RequestInfo = ClientResponse = ConnectionKey = None __all__ = ( 'ClientError', 'ClientConnectionError', 'ClientOSError', 'ClientConnectorError', 'ClientProxyConnectionError', 'ClientSSLError', 'ClientConnectorSSLError', 'ClientConnectorCertificateError', 'ServerConnectionError', 'ServerTimeoutError', 'ServerDisconnectedError', 'ServerFingerprintMismatch', 'ClientResponseError', 'ClientHttpProxyError', 'WSServerHandshakeError', 'ContentTypeError', 'ClientPayloadError', 'InvalidURL') class ClientError(Exception): """Base class for client connection errors.""" class ClientResponseError(ClientError): """Connection error during reading response. 
request_info: instance of RequestInfo """ def __init__(self, request_info: RequestInfo, history: Tuple[ClientResponse, ...], *, code: Optional[int]=None, status: Optional[int]=None, message: str='', headers: Optional[_CIMultiDict]=None) -> None: self.request_info = request_info if code is not None: if status is not None: raise ValueError( "Both code and status arguments are provided; " "code is deprecated, use status instead") warnings.warn("code argument is deprecated, use status instead", DeprecationWarning, stacklevel=2) if status is not None: self.status = status elif code is not None: self.status = code else: self.status = 0 self.message = message self.headers = headers self.history = history self.args = (request_info, history) def __str__(self) -> str: return ("%s, message=%r, url=%r" % (self.status, self.message, self.request_info.real_url)) def __repr__(self) -> str: args = "%r, %r" % (self.request_info, self.history) if self.status != 0: args += ", status=%r" % (self.status,) if self.message != '': args += ", message=%r" % (self.message,) if self.headers is not None: args += ", headers=%r" % (self.headers,) return "%s(%s)" % (type(self).__name__, args) @property def code(self) -> int: warnings.warn("code property is deprecated, use status instead", DeprecationWarning, stacklevel=2) return self.status @code.setter def code(self, value: int) -> None: warnings.warn("code property is deprecated, use status instead", DeprecationWarning, stacklevel=2) self.status = value class ContentTypeError(ClientResponseError): """ContentType found is not valid.""" class WSServerHandshakeError(ClientResponseError): """websocket server handshake error.""" class ClientHttpProxyError(ClientResponseError): """HTTP proxy error. Raised in :class:`aiohttp.connector.TCPConnector` if proxy responds with status other than ``200 OK`` on ``CONNECT`` request. """ class TooManyRedirects(ClientResponseError): """Client was redirected too many times.""" class ClientConnectionError(ClientError): """Base class for client socket errors.""" class ClientOSError(ClientConnectionError, OSError): """OSError error.""" class ClientConnectorError(ClientOSError): """Client connector error. Raised in :class:`aiohttp.connector.TCPConnector` if connection to proxy can not be established. """ def __init__(self, connection_key: ConnectionKey, os_error: OSError) -> None: self._conn_key = connection_key self._os_error = os_error super().__init__(os_error.errno, os_error.strerror) self.args = (connection_key, os_error) @property def os_error(self) -> OSError: return self._os_error @property def host(self) -> str: return self._conn_key.host @property def port(self) -> Optional[int]: return self._conn_key.port @property def ssl(self) -> Union[SSLContext, None, bool, 'Fingerprint']: return self._conn_key.ssl def __str__(self) -> str: return ('Cannot connect to host {0.host}:{0.port} ssl:{1} [{2}]' .format(self, self.ssl if self.ssl is not None else 'default', self.strerror)) # OSError.__reduce__ does too much black magick __reduce__ = BaseException.__reduce__ class ClientProxyConnectionError(ClientConnectorError): """Proxy connection error. Raised in :class:`aiohttp.connector.TCPConnector` if connection to proxy can not be established. 
""" class ServerConnectionError(ClientConnectionError): """Server connection errors.""" class ServerDisconnectedError(ServerConnectionError): """Server disconnected.""" def __init__(self, message: Optional[str]=None) -> None: self.message = message if message is None: self.args = () else: self.args = (message,) class ServerTimeoutError(ServerConnectionError, asyncio.TimeoutError): """Server timeout error.""" class ServerFingerprintMismatch(ServerConnectionError): """SSL certificate does not match expected fingerprint.""" def __init__(self, expected: bytes, got: bytes, host: str, port: int) -> None: self.expected = expected self.got = got self.host = host self.port = port self.args = (expected, got, host, port) def __repr__(self) -> str: return '<{} expected={!r} got={!r} host={!r} port={!r}>'.format( self.__class__.__name__, self.expected, self.got, self.host, self.port) class ClientPayloadError(ClientError): """Response payload error.""" class InvalidURL(ClientError, ValueError): """Invalid URL. URL used for fetching is malformed, e.g. it doesn't contains host part.""" # Derive from ValueError for backward compatibility def __init__(self, url: Any) -> None: # The type of url is not yarl.URL because the exception can be raised # on URL(url) call super().__init__(url) @property def url(self) -> Any: return self.args[0] def __repr__(self) -> str: return '<{} {}>'.format(self.__class__.__name__, self.url) class ClientSSLError(ClientConnectorError): """Base error for ssl.*Errors.""" if ssl is not None: cert_errors = (ssl.CertificateError,) cert_errors_bases = (ClientSSLError, ssl.CertificateError,) ssl_errors = (ssl.SSLError,) ssl_error_bases = (ClientSSLError, ssl.SSLError) else: # pragma: no cover cert_errors = tuple() cert_errors_bases = (ClientSSLError, ValueError,) ssl_errors = tuple() ssl_error_bases = (ClientSSLError,) class ClientConnectorSSLError(*ssl_error_bases): # type: ignore """Response ssl error.""" class ClientConnectorCertificateError(*cert_errors_bases): # type: ignore """Response certificate error.""" def __init__(self, connection_key: ConnectionKey, certificate_error: Exception) -> None: self._conn_key = connection_key self._certificate_error = certificate_error self.args = (connection_key, certificate_error) @property def certificate_error(self) -> Exception: return self._certificate_error @property def host(self) -> str: return self._conn_key.host @property def port(self) -> Optional[int]: return self._conn_key.port @property def ssl(self) -> bool: return self._conn_key.is_ssl def __str__(self) -> str: return ('Cannot connect to host {0.host}:{0.port} ssl:{0.ssl} ' '[{0.certificate_error.__class__.__name__}: ' '{0.certificate_error.args}]'.format(self)) aiohttp-3.6.2/aiohttp/client_proto.py0000644000175100001650000001754513547410117020224 0ustar vstsdocker00000000000000import asyncio from contextlib import suppress from typing import Any, Optional, Tuple from .base_protocol import BaseProtocol from .client_exceptions import ( ClientOSError, ClientPayloadError, ServerDisconnectedError, ServerTimeoutError, ) from .helpers import BaseTimerContext from .http import HttpResponseParser, RawResponseMessage from .streams import EMPTY_PAYLOAD, DataQueue, StreamReader class ResponseHandler(BaseProtocol, DataQueue[Tuple[RawResponseMessage, StreamReader]]): """Helper class to adapt between Protocol and StreamReader.""" def __init__(self, loop: asyncio.AbstractEventLoop) -> None: BaseProtocol.__init__(self, loop=loop) DataQueue.__init__(self, loop) self._should_close = False 
self._payload = None self._skip_payload = False self._payload_parser = None self._timer = None self._tail = b'' self._upgraded = False self._parser = None # type: Optional[HttpResponseParser] self._read_timeout = None # type: Optional[float] self._read_timeout_handle = None # type: Optional[asyncio.TimerHandle] @property def upgraded(self) -> bool: return self._upgraded @property def should_close(self) -> bool: if (self._payload is not None and not self._payload.is_eof() or self._upgraded): return True return (self._should_close or self._upgraded or self.exception() is not None or self._payload_parser is not None or len(self) > 0 or bool(self._tail)) def force_close(self) -> None: self._should_close = True def close(self) -> None: transport = self.transport if transport is not None: transport.close() self.transport = None self._payload = None self._drop_timeout() def is_connected(self) -> bool: return self.transport is not None def connection_lost(self, exc: Optional[BaseException]) -> None: self._drop_timeout() if self._payload_parser is not None: with suppress(Exception): self._payload_parser.feed_eof() uncompleted = None if self._parser is not None: try: uncompleted = self._parser.feed_eof() except Exception: if self._payload is not None: self._payload.set_exception( ClientPayloadError( 'Response payload is not completed')) if not self.is_eof(): if isinstance(exc, OSError): exc = ClientOSError(*exc.args) if exc is None: exc = ServerDisconnectedError(uncompleted) # assigns self._should_close to True as side effect, # we do it anyway below self.set_exception(exc) self._should_close = True self._parser = None self._payload = None self._payload_parser = None self._reading_paused = False super().connection_lost(exc) def eof_received(self) -> None: # should call parser.feed_eof() most likely self._drop_timeout() def pause_reading(self) -> None: super().pause_reading() self._drop_timeout() def resume_reading(self) -> None: super().resume_reading() self._reschedule_timeout() def set_exception(self, exc: BaseException) -> None: self._should_close = True self._drop_timeout() super().set_exception(exc) def set_parser(self, parser: Any, payload: Any) -> None: # TODO: actual types are: # parser: WebSocketReader # payload: FlowControlDataQueue # but they are not generi enough # Need an ABC for both types self._payload = payload self._payload_parser = parser self._drop_timeout() if self._tail: data, self._tail = self._tail, b'' self.data_received(data) def set_response_params(self, *, timer: BaseTimerContext=None, skip_payload: bool=False, read_until_eof: bool=False, auto_decompress: bool=True, read_timeout: Optional[float]=None) -> None: self._skip_payload = skip_payload self._read_timeout = read_timeout self._reschedule_timeout() self._parser = HttpResponseParser( self, self._loop, timer=timer, payload_exception=ClientPayloadError, read_until_eof=read_until_eof, auto_decompress=auto_decompress) if self._tail: data, self._tail = self._tail, b'' self.data_received(data) def _drop_timeout(self) -> None: if self._read_timeout_handle is not None: self._read_timeout_handle.cancel() self._read_timeout_handle = None def _reschedule_timeout(self) -> None: timeout = self._read_timeout if self._read_timeout_handle is not None: self._read_timeout_handle.cancel() if timeout: self._read_timeout_handle = self._loop.call_later( timeout, self._on_read_timeout) else: self._read_timeout_handle = None def _on_read_timeout(self) -> None: exc = ServerTimeoutError("Timeout on reading data from socket") 
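        # Descriptive note, not in the original source: the timeout error is
        # propagated twice -- to the protocol itself (which also marks the
        # connection as non-reusable via ``_should_close``) and to the
        # in-flight payload stream, so a pending response.read() fails too.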
self.set_exception(exc) if self._payload is not None: self._payload.set_exception(exc) def data_received(self, data: bytes) -> None: self._reschedule_timeout() if not data: return # custom payload parser if self._payload_parser is not None: eof, tail = self._payload_parser.feed_data(data) if eof: self._payload = None self._payload_parser = None if tail: self.data_received(tail) return else: if self._upgraded or self._parser is None: # i.e. websocket connection, websocket parser is not set yet self._tail += data else: # parse http messages try: messages, upgraded, tail = self._parser.feed_data(data) except BaseException as exc: if self.transport is not None: # connection.release() could be called BEFORE # data_received(), the transport is already # closed in this case self.transport.close() # should_close is True after the call self.set_exception(exc) return self._upgraded = upgraded payload = None for message, payload in messages: if message.should_close: self._should_close = True self._payload = payload if self._skip_payload or message.code in (204, 304): self.feed_data((message, EMPTY_PAYLOAD), 0) # type: ignore # noqa else: self.feed_data((message, payload), 0) if payload is not None: # new message(s) was processed # register timeout handler unsubscribing # either on end-of-stream or immediately for # EMPTY_PAYLOAD if payload is not EMPTY_PAYLOAD: payload.on_eof(self._drop_timeout) else: self._drop_timeout() if tail: if upgraded: self.data_received(tail) else: self._tail = tail aiohttp-3.6.2/aiohttp/client_reqrep.py0000644000175100001650000010625413547410117020353 0ustar vstsdocker00000000000000import asyncio import codecs import io import re import sys import traceback import warnings from hashlib import md5, sha1, sha256 from http.cookies import CookieError, Morsel, SimpleCookie from types import MappingProxyType, TracebackType from typing import ( # noqa TYPE_CHECKING, Any, Dict, Iterable, List, Mapping, Optional, Tuple, Type, Union, cast, ) import attr from multidict import CIMultiDict, CIMultiDictProxy, MultiDict, MultiDictProxy from yarl import URL from . 
import hdrs, helpers, http, multipart, payload from .abc import AbstractStreamWriter from .client_exceptions import ( ClientConnectionError, ClientOSError, ClientResponseError, ContentTypeError, InvalidURL, ServerFingerprintMismatch, ) from .formdata import FormData from .helpers import ( # noqa PY_36, BaseTimerContext, BasicAuth, HeadersMixin, TimerNoop, noop, reify, set_result, ) from .http import SERVER_SOFTWARE, HttpVersion10, HttpVersion11, StreamWriter from .log import client_logger from .streams import StreamReader # noqa from .typedefs import ( DEFAULT_JSON_DECODER, JSONDecoder, LooseCookies, LooseHeaders, RawHeaders, ) try: import ssl from ssl import SSLContext except ImportError: # pragma: no cover ssl = None # type: ignore SSLContext = object # type: ignore try: import cchardet as chardet except ImportError: # pragma: no cover import chardet __all__ = ('ClientRequest', 'ClientResponse', 'RequestInfo', 'Fingerprint') if TYPE_CHECKING: # pragma: no cover from .client import ClientSession # noqa from .connector import Connection # noqa from .tracing import Trace # noqa json_re = re.compile(r'^application/(?:[\w.+-]+?\+)?json') @attr.s(frozen=True, slots=True) class ContentDisposition: type = attr.ib(type=str) # type: Optional[str] parameters = attr.ib(type=MappingProxyType) # type: MappingProxyType[str, str] # noqa filename = attr.ib(type=str) # type: Optional[str] @attr.s(frozen=True, slots=True) class RequestInfo: url = attr.ib(type=URL) method = attr.ib(type=str) headers = attr.ib(type=CIMultiDictProxy) # type: CIMultiDictProxy[str] real_url = attr.ib(type=URL) @real_url.default def real_url_default(self) -> URL: return self.url class Fingerprint: HASHFUNC_BY_DIGESTLEN = { 16: md5, 20: sha1, 32: sha256, } def __init__(self, fingerprint: bytes) -> None: digestlen = len(fingerprint) hashfunc = self.HASHFUNC_BY_DIGESTLEN.get(digestlen) if not hashfunc: raise ValueError('fingerprint has invalid length') elif hashfunc is md5 or hashfunc is sha1: raise ValueError('md5 and sha1 are insecure and ' 'not supported. 
Use sha256.') self._hashfunc = hashfunc self._fingerprint = fingerprint @property def fingerprint(self) -> bytes: return self._fingerprint def check(self, transport: asyncio.Transport) -> None: if not transport.get_extra_info('sslcontext'): return sslobj = transport.get_extra_info('ssl_object') cert = sslobj.getpeercert(binary_form=True) got = self._hashfunc(cert).digest() if got != self._fingerprint: host, port, *_ = transport.get_extra_info('peername') raise ServerFingerprintMismatch(self._fingerprint, got, host, port) if ssl is not None: SSL_ALLOWED_TYPES = (ssl.SSLContext, bool, Fingerprint, type(None)) else: # pragma: no cover SSL_ALLOWED_TYPES = type(None) def _merge_ssl_params( ssl: Union['SSLContext', bool, Fingerprint, None], verify_ssl: Optional[bool], ssl_context: Optional['SSLContext'], fingerprint: Optional[bytes] ) -> Union['SSLContext', bool, Fingerprint, None]: if verify_ssl is not None and not verify_ssl: warnings.warn("verify_ssl is deprecated, use ssl=False instead", DeprecationWarning, stacklevel=3) if ssl is not None: raise ValueError("verify_ssl, ssl_context, fingerprint and ssl " "parameters are mutually exclusive") else: ssl = False if ssl_context is not None: warnings.warn("ssl_context is deprecated, use ssl=context instead", DeprecationWarning, stacklevel=3) if ssl is not None: raise ValueError("verify_ssl, ssl_context, fingerprint and ssl " "parameters are mutually exclusive") else: ssl = ssl_context if fingerprint is not None: warnings.warn("fingerprint is deprecated, " "use ssl=Fingerprint(fingerprint) instead", DeprecationWarning, stacklevel=3) if ssl is not None: raise ValueError("verify_ssl, ssl_context, fingerprint and ssl " "parameters are mutually exclusive") else: ssl = Fingerprint(fingerprint) if not isinstance(ssl, SSL_ALLOWED_TYPES): raise TypeError("ssl should be SSLContext, bool, Fingerprint or None, " "got {!r} instead.".format(ssl)) return ssl @attr.s(slots=True, frozen=True) class ConnectionKey: # the key should contain an information about used proxy / TLS # to prevent reusing wrong connections from a pool host = attr.ib(type=str) port = attr.ib(type=int) # type: Optional[int] is_ssl = attr.ib(type=bool) ssl = attr.ib() # type: Union[SSLContext, None, bool, Fingerprint] proxy = attr.ib() # type: Optional[URL] proxy_auth = attr.ib() # type: Optional[BasicAuth] proxy_headers_hash = attr.ib(type=int) # type: Optional[int] # noqa # hash(CIMultiDict) def _is_expected_content_type(response_content_type: str, expected_content_type: str) -> bool: if expected_content_type == 'application/json': return json_re.match(response_content_type) is not None return expected_content_type in response_content_type class ClientRequest: GET_METHODS = { hdrs.METH_GET, hdrs.METH_HEAD, hdrs.METH_OPTIONS, hdrs.METH_TRACE, } POST_METHODS = {hdrs.METH_PATCH, hdrs.METH_POST, hdrs.METH_PUT} ALL_METHODS = GET_METHODS.union(POST_METHODS).union({hdrs.METH_DELETE}) DEFAULT_HEADERS = { hdrs.ACCEPT: '*/*', hdrs.ACCEPT_ENCODING: 'gzip, deflate', } body = b'' auth = None response = None _writer = None # async task for streaming data _continue = None # waiter future for '100 Continue' response # N.B. # Adding __del__ method with self._writer closing doesn't make sense # because _writer is instance method, thus it keeps a reference to self. # Until writer has finished finalizer will not be called. 
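    # Illustrative sketch, not part of the library source: the Fingerprint
    # helper defined above pins a server certificate by its SHA-256 digest
    # (md5/sha1 digests are rejected as insecure).  Assuming ``der_cert``
    # holds the expected certificate in DER form and ``url`` is the target:
    #
    #     import hashlib, aiohttp
    #     digest = hashlib.sha256(der_cert).digest()   # 32 bytes
    #     async with aiohttp.ClientSession() as session:
    #         async with session.get(url, ssl=aiohttp.Fingerprint(digest)) as resp:
    #             ...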
def __init__(self, method: str, url: URL, *, params: Optional[Mapping[str, str]]=None, headers: Optional[LooseHeaders]=None, skip_auto_headers: Iterable[str]=frozenset(), data: Any=None, cookies: Optional[LooseCookies]=None, auth: Optional[BasicAuth]=None, version: http.HttpVersion=http.HttpVersion11, compress: Optional[str]=None, chunked: Optional[bool]=None, expect100: bool=False, loop: Optional[asyncio.AbstractEventLoop]=None, response_class: Optional[Type['ClientResponse']]=None, proxy: Optional[URL]=None, proxy_auth: Optional[BasicAuth]=None, timer: Optional[BaseTimerContext]=None, session: Optional['ClientSession']=None, ssl: Union[SSLContext, bool, Fingerprint, None]=None, proxy_headers: Optional[LooseHeaders]=None, traces: Optional[List['Trace']]=None): if loop is None: loop = asyncio.get_event_loop() assert isinstance(url, URL), url assert isinstance(proxy, (URL, type(None))), proxy # FIXME: session is None in tests only, need to fix tests # assert session is not None self._session = cast('ClientSession', session) if params: q = MultiDict(url.query) url2 = url.with_query(params) q.extend(url2.query) url = url.with_query(q) self.original_url = url self.url = url.with_fragment(None) self.method = method.upper() self.chunked = chunked self.compress = compress self.loop = loop self.length = None if response_class is None: real_response_class = ClientResponse else: real_response_class = response_class self.response_class = real_response_class # type: Type[ClientResponse] self._timer = timer if timer is not None else TimerNoop() self._ssl = ssl if loop.get_debug(): self._source_traceback = traceback.extract_stack(sys._getframe(1)) self.update_version(version) self.update_host(url) self.update_headers(headers) self.update_auto_headers(skip_auto_headers) self.update_cookies(cookies) self.update_content_encoding(data) self.update_auth(auth) self.update_proxy(proxy, proxy_auth, proxy_headers) self.update_body_from_data(data) if data or self.method not in self.GET_METHODS: self.update_transfer_encoding() self.update_expect_continue(expect100) if traces is None: traces = [] self._traces = traces def is_ssl(self) -> bool: return self.url.scheme in ('https', 'wss') @property def ssl(self) -> Union['SSLContext', None, bool, Fingerprint]: return self._ssl @property def connection_key(self) -> ConnectionKey: proxy_headers = self.proxy_headers if proxy_headers: h = hash(tuple((k, v) for k, v in proxy_headers.items())) # type: Optional[int] # noqa else: h = None return ConnectionKey(self.host, self.port, self.is_ssl(), self.ssl, self.proxy, self.proxy_auth, h) @property def host(self) -> str: ret = self.url.host assert ret is not None return ret @property def port(self) -> Optional[int]: return self.url.port @property def request_info(self) -> RequestInfo: headers = CIMultiDictProxy(self.headers) # type: CIMultiDictProxy[str] return RequestInfo(self.url, self.method, headers, self.original_url) def update_host(self, url: URL) -> None: """Update destination host, port and connection type (ssl).""" # get host/port if not url.host: raise InvalidURL(url) # basic auth info username, password = url.user, url.password if username: self.auth = helpers.BasicAuth(username, password or '') def update_version(self, version: Union[http.HttpVersion, str]) -> None: """Convert request version to two elements tuple. 
parser HTTP version '1.1' => (1, 1) """ if isinstance(version, str): v = [l.strip() for l in version.split('.', 1)] try: version = http.HttpVersion(int(v[0]), int(v[1])) except ValueError: raise ValueError( 'Can not parse http version number: {}' .format(version)) from None self.version = version def update_headers(self, headers: Optional[LooseHeaders]) -> None: """Update request headers.""" self.headers = CIMultiDict() # type: CIMultiDict[str] # add host netloc = cast(str, self.url.raw_host) if helpers.is_ipv6_address(netloc): netloc = '[{}]'.format(netloc) if self.url.port is not None and not self.url.is_default_port(): netloc += ':' + str(self.url.port) self.headers[hdrs.HOST] = netloc if headers: if isinstance(headers, (dict, MultiDictProxy, MultiDict)): headers = headers.items() # type: ignore for key, value in headers: # A special case for Host header if key.lower() == 'host': self.headers[key] = value else: self.headers.add(key, value) def update_auto_headers(self, skip_auto_headers: Iterable[str]) -> None: self.skip_auto_headers = CIMultiDict( (hdr, None) for hdr in sorted(skip_auto_headers)) used_headers = self.headers.copy() used_headers.extend(self.skip_auto_headers) # type: ignore for hdr, val in self.DEFAULT_HEADERS.items(): if hdr not in used_headers: self.headers.add(hdr, val) if hdrs.USER_AGENT not in used_headers: self.headers[hdrs.USER_AGENT] = SERVER_SOFTWARE def update_cookies(self, cookies: Optional[LooseCookies]) -> None: """Update request cookies header.""" if not cookies: return c = SimpleCookie() if hdrs.COOKIE in self.headers: c.load(self.headers.get(hdrs.COOKIE, '')) del self.headers[hdrs.COOKIE] if isinstance(cookies, Mapping): iter_cookies = cookies.items() else: iter_cookies = cookies # type: ignore for name, value in iter_cookies: if isinstance(value, Morsel): # Preserve coded_value mrsl_val = value.get(value.key, Morsel()) mrsl_val.set(value.key, value.value, value.coded_value) c[name] = mrsl_val else: c[name] = value # type: ignore self.headers[hdrs.COOKIE] = c.output(header='', sep=';').strip() def update_content_encoding(self, data: Any) -> None: """Set request content encoding.""" if not data: return enc = self.headers.get(hdrs.CONTENT_ENCODING, '').lower() if enc: if self.compress: raise ValueError( 'compress can not be set ' 'if Content-Encoding header is set') elif self.compress: if not isinstance(self.compress, str): self.compress = 'deflate' self.headers[hdrs.CONTENT_ENCODING] = self.compress self.chunked = True # enable chunked, no need to deal with length def update_transfer_encoding(self) -> None: """Analyze transfer-encoding header.""" te = self.headers.get(hdrs.TRANSFER_ENCODING, '').lower() if 'chunked' in te: if self.chunked: raise ValueError( 'chunked can not be set ' 'if "Transfer-Encoding: chunked" header is set') elif self.chunked: if hdrs.CONTENT_LENGTH in self.headers: raise ValueError( 'chunked can not be set ' 'if Content-Length header is set') self.headers[hdrs.TRANSFER_ENCODING] = 'chunked' else: if hdrs.CONTENT_LENGTH not in self.headers: self.headers[hdrs.CONTENT_LENGTH] = str(len(self.body)) def update_auth(self, auth: Optional[BasicAuth]) -> None: """Set basic auth.""" if auth is None: auth = self.auth if auth is None: return if not isinstance(auth, helpers.BasicAuth): raise TypeError('BasicAuth() tuple is required instead') self.headers[hdrs.AUTHORIZATION] = auth.encode() def update_body_from_data(self, body: Any) -> None: if not body: return # FormData if isinstance(body, FormData): body = body() try: body = 
payload.PAYLOAD_REGISTRY.get(body, disposition=None) except payload.LookupError: body = FormData(body)() self.body = body # enable chunked encoding if needed if not self.chunked: if hdrs.CONTENT_LENGTH not in self.headers: size = body.size if size is None: self.chunked = True else: if hdrs.CONTENT_LENGTH not in self.headers: self.headers[hdrs.CONTENT_LENGTH] = str(size) # copy payload headers assert body.headers for (key, value) in body.headers.items(): if key in self.headers: continue if key in self.skip_auto_headers: continue self.headers[key] = value def update_expect_continue(self, expect: bool=False) -> None: if expect: self.headers[hdrs.EXPECT] = '100-continue' elif self.headers.get(hdrs.EXPECT, '').lower() == '100-continue': expect = True if expect: self._continue = self.loop.create_future() def update_proxy(self, proxy: Optional[URL], proxy_auth: Optional[BasicAuth], proxy_headers: Optional[LooseHeaders]) -> None: if proxy and not proxy.scheme == 'http': raise ValueError("Only http proxies are supported") if proxy_auth and not isinstance(proxy_auth, helpers.BasicAuth): raise ValueError("proxy_auth must be None or BasicAuth() tuple") self.proxy = proxy self.proxy_auth = proxy_auth self.proxy_headers = proxy_headers def keep_alive(self) -> bool: if self.version < HttpVersion10: # keep alive not supported at all return False if self.version == HttpVersion10: if self.headers.get(hdrs.CONNECTION) == 'keep-alive': return True else: # no headers means we close for Http 1.0 return False elif self.headers.get(hdrs.CONNECTION) == 'close': return False return True async def write_bytes(self, writer: AbstractStreamWriter, conn: 'Connection') -> None: """Support coroutines that yields bytes objects.""" # 100 response if self._continue is not None: await writer.drain() await self._continue protocol = conn.protocol assert protocol is not None try: if isinstance(self.body, payload.Payload): await self.body.write(writer) else: if isinstance(self.body, (bytes, bytearray)): self.body = (self.body,) # type: ignore for chunk in self.body: await writer.write(chunk) # type: ignore await writer.write_eof() except OSError as exc: new_exc = ClientOSError( exc.errno, 'Can not write request body for %s' % self.url) new_exc.__context__ = exc new_exc.__cause__ = exc protocol.set_exception(new_exc) except asyncio.CancelledError as exc: if not conn.closed: protocol.set_exception(exc) except Exception as exc: protocol.set_exception(exc) finally: self._writer = None async def send(self, conn: 'Connection') -> 'ClientResponse': # Specify request target: # - CONNECT request must send authority form URI # - not CONNECT proxy must send absolute form URI # - most common is origin form URI if self.method == hdrs.METH_CONNECT: connect_host = self.url.raw_host assert connect_host is not None if helpers.is_ipv6_address(connect_host): connect_host = '[{}]'.format(connect_host) path = '{}:{}'.format(connect_host, self.url.port) elif self.proxy and not self.is_ssl(): path = str(self.url) else: path = self.url.raw_path if self.url.raw_query_string: path += '?' 
+ self.url.raw_query_string protocol = conn.protocol assert protocol is not None writer = StreamWriter( protocol, self.loop, on_chunk_sent=self._on_chunk_request_sent ) if self.compress: writer.enable_compression(self.compress) if self.chunked is not None: writer.enable_chunking() # set default content-type if (self.method in self.POST_METHODS and hdrs.CONTENT_TYPE not in self.skip_auto_headers and hdrs.CONTENT_TYPE not in self.headers): self.headers[hdrs.CONTENT_TYPE] = 'application/octet-stream' # set the connection header connection = self.headers.get(hdrs.CONNECTION) if not connection: if self.keep_alive(): if self.version == HttpVersion10: connection = 'keep-alive' else: if self.version == HttpVersion11: connection = 'close' if connection is not None: self.headers[hdrs.CONNECTION] = connection # status + headers status_line = '{0} {1} HTTP/{2[0]}.{2[1]}'.format( self.method, path, self.version) await writer.write_headers(status_line, self.headers) self._writer = self.loop.create_task(self.write_bytes(writer, conn)) response_class = self.response_class assert response_class is not None self.response = response_class( self.method, self.original_url, writer=self._writer, continue100=self._continue, timer=self._timer, request_info=self.request_info, traces=self._traces, loop=self.loop, session=self._session ) return self.response async def close(self) -> None: if self._writer is not None: try: await self._writer finally: self._writer = None def terminate(self) -> None: if self._writer is not None: if not self.loop.is_closed(): self._writer.cancel() self._writer = None async def _on_chunk_request_sent(self, chunk: bytes) -> None: for trace in self._traces: await trace.send_request_chunk_sent(chunk) class ClientResponse(HeadersMixin): # from the Status-Line of the response version = None # HTTP-Version status = None # type: int # Status-Code reason = None # Reason-Phrase content = None # type: StreamReader # Payload stream _headers = None # type: CIMultiDictProxy[str] # Response headers _raw_headers = None # type: RawHeaders # Response raw headers _connection = None # current connection _source_traceback = None # setted up by ClientRequest after ClientResponse object creation # post-init stage allows to not change ctor signature _closed = True # to allow __del__ for non-initialized properly response _released = False def __init__(self, method: str, url: URL, *, writer: 'asyncio.Task[None]', continue100: Optional['asyncio.Future[bool]'], timer: BaseTimerContext, request_info: RequestInfo, traces: List['Trace'], loop: asyncio.AbstractEventLoop, session: 'ClientSession') -> None: assert isinstance(url, URL) self.method = method self.cookies = SimpleCookie() self._real_url = url self._url = url.with_fragment(None) self._body = None # type: Any self._writer = writer # type: Optional[asyncio.Task[None]] self._continue = continue100 # None by default self._closed = True self._history = () # type: Tuple[ClientResponse, ...] 
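# ---------------------------------------------------------------------------
# Editor's note (added usage sketch, not part of the aiohttp 3.6.2 sources):
# ClientRequest/ClientResponse above are normally driven through the public
# ClientSession API; the keyword arguments below (params, headers, auth) feed
# the update_* methods shown in ClientRequest.__init__.  The URL and the
# credentials are placeholders.
import asyncio
import aiohttp

async def fetch_text(url: str) -> str:
    async with aiohttp.ClientSession() as session:
        async with session.get(url,
                               params={'q': 'aiohttp'},
                               headers={'Accept': 'text/html'},
                               auth=aiohttp.BasicAuth('user', 'secret')) as resp:
            return await resp.text()

# asyncio.run(fetch_text('http://example.com/'))   # Python 3.7+
# ---------------------------------------------------------------------------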
self._request_info = request_info self._timer = timer if timer is not None else TimerNoop() self._cache = {} # type: Dict[str, Any] self._traces = traces self._loop = loop # store a reference to session #1985 self._session = session # type: Optional[ClientSession] if loop.get_debug(): self._source_traceback = traceback.extract_stack(sys._getframe(1)) @reify def url(self) -> URL: return self._url @reify def url_obj(self) -> URL: warnings.warn( "Deprecated, use .url #1654", DeprecationWarning, stacklevel=2) return self._url @reify def real_url(self) -> URL: return self._real_url @reify def host(self) -> str: assert self._url.host is not None return self._url.host @reify def headers(self) -> 'CIMultiDictProxy[str]': return self._headers @reify def raw_headers(self) -> RawHeaders: return self._raw_headers @reify def request_info(self) -> RequestInfo: return self._request_info @reify def content_disposition(self) -> Optional[ContentDisposition]: raw = self._headers.get(hdrs.CONTENT_DISPOSITION) if raw is None: return None disposition_type, params_dct = multipart.parse_content_disposition(raw) params = MappingProxyType(params_dct) filename = multipart.content_disposition_filename(params) return ContentDisposition(disposition_type, params, filename) def __del__(self, _warnings: Any=warnings) -> None: if self._closed: return if self._connection is not None: self._connection.release() self._cleanup_writer() if self._loop.get_debug(): if PY_36: kwargs = {'source': self} else: kwargs = {} _warnings.warn("Unclosed response {!r}".format(self), ResourceWarning, **kwargs) context = {'client_response': self, 'message': 'Unclosed response'} if self._source_traceback: context['source_traceback'] = self._source_traceback self._loop.call_exception_handler(context) def __repr__(self) -> str: out = io.StringIO() ascii_encodable_url = str(self.url) if self.reason: ascii_encodable_reason = self.reason.encode('ascii', 'backslashreplace') \ .decode('ascii') else: ascii_encodable_reason = self.reason print(''.format( ascii_encodable_url, self.status, ascii_encodable_reason), file=out) print(self.headers, file=out) return out.getvalue() @property def connection(self) -> Optional['Connection']: return self._connection @reify def history(self) -> Tuple['ClientResponse', ...]: """A sequence of of responses, if redirects occurred.""" return self._history @reify def links(self) -> 'MultiDictProxy[MultiDictProxy[Union[str, URL]]]': links_str = ", ".join(self.headers.getall("link", [])) if not links_str: return MultiDictProxy(MultiDict()) links = MultiDict() # type: MultiDict[MultiDictProxy[Union[str, URL]]] for val in re.split(r",(?=\s*<)", links_str): match = re.match(r"\s*<(.*)>(.*)", val) if match is None: # pragma: no cover # the check exists to suppress mypy error continue url, params_str = match.groups() params = params_str.split(";")[1:] link = MultiDict() # type: MultiDict[Union[str, URL]] for param in params: match = re.match( r"^\s*(\S*)\s*=\s*(['\"]?)(.*?)(\2)\s*$", param, re.M ) if match is None: # pragma: no cover # the check exists to suppress mypy error continue key, _, value, _ = match.groups() link.add(key, value) key = link.get("rel", url) # type: ignore link.add("url", self.url.join(URL(url))) links.add(key, MultiDictProxy(link)) return MultiDictProxy(links) async def start(self, connection: 'Connection') -> 'ClientResponse': """Start response processing.""" self._closed = False self._protocol = connection.protocol self._connection = connection with self._timer: while True: # read response try: message, 
payload = await self._protocol.read() # type: ignore # noqa except http.HttpProcessingError as exc: raise ClientResponseError( self.request_info, self.history, status=exc.code, message=exc.message, headers=exc.headers) from exc if (message.code < 100 or message.code > 199 or message.code == 101): break if self._continue is not None: set_result(self._continue, True) self._continue = None # payload eof handler payload.on_eof(self._response_eof) # response status self.version = message.version self.status = message.code self.reason = message.reason # headers self._headers = message.headers # type is CIMultiDictProxy self._raw_headers = message.raw_headers # type is Tuple[bytes, bytes] # payload self.content = payload # cookies for hdr in self.headers.getall(hdrs.SET_COOKIE, ()): try: self.cookies.load(hdr) except CookieError as exc: client_logger.warning( 'Can not load response cookies: %s', exc) return self def _response_eof(self) -> None: if self._closed: return if self._connection is not None: # websocket, protocol could be None because # connection could be detached if (self._connection.protocol is not None and self._connection.protocol.upgraded): return self._connection.release() self._connection = None self._closed = True self._cleanup_writer() @property def closed(self) -> bool: return self._closed def close(self) -> None: if not self._released: self._notify_content() if self._closed: return self._closed = True if self._loop is None or self._loop.is_closed(): return if self._connection is not None: self._connection.close() self._connection = None self._cleanup_writer() def release(self) -> Any: if not self._released: self._notify_content() if self._closed: return noop() self._closed = True if self._connection is not None: self._connection.release() self._connection = None self._cleanup_writer() return noop() def raise_for_status(self) -> None: if 400 <= self.status: # reason should always be not None for a started response assert self.reason is not None self.release() raise ClientResponseError( self.request_info, self.history, status=self.status, message=self.reason, headers=self.headers) def _cleanup_writer(self) -> None: if self._writer is not None: self._writer.cancel() self._writer = None self._session = None def _notify_content(self) -> None: content = self.content if content and content.exception() is None: content.set_exception( ClientConnectionError('Connection closed')) self._released = True async def wait_for_close(self) -> None: if self._writer is not None: try: await self._writer finally: self._writer = None self.release() async def read(self) -> bytes: """Read response payload.""" if self._body is None: try: self._body = await self.content.read() for trace in self._traces: await trace.send_response_chunk_received(self._body) except BaseException: self.close() raise elif self._released: raise ClientConnectionError('Connection closed') return self._body def get_encoding(self) -> str: ctype = self.headers.get(hdrs.CONTENT_TYPE, '').lower() mimetype = helpers.parse_mimetype(ctype) encoding = mimetype.parameters.get('charset') if encoding: try: codecs.lookup(encoding) except LookupError: encoding = None if not encoding: if mimetype.type == 'application' and mimetype.subtype == 'json': # RFC 7159 states that the default encoding is UTF-8. 
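# ---------------------------------------------------------------------------
# Editor's note (added usage sketch, not part of the aiohttp 3.6.2 sources):
# the read()/raise_for_status()/get_encoding() flow above, seen from the
# caller's side.  The URL is a placeholder.
import aiohttp

async def fetch_bytes(url: str) -> bytes:
    async with aiohttp.ClientSession() as session:
        async with session.get(url) as resp:
            resp.raise_for_status()      # ClientResponseError for 4xx/5xx
            body = await resp.read()     # raw payload bytes
            resp.get_encoding()          # charset later used by resp.text()/resp.json()
            return body
# ---------------------------------------------------------------------------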
encoding = 'utf-8' else: encoding = chardet.detect(self._body)['encoding'] if not encoding: encoding = 'utf-8' return encoding async def text(self, encoding: Optional[str]=None, errors: str='strict') -> str: """Read response payload and decode.""" if self._body is None: await self.read() if encoding is None: encoding = self.get_encoding() return self._body.decode(encoding, errors=errors) # type: ignore async def json(self, *, encoding: str=None, loads: JSONDecoder=DEFAULT_JSON_DECODER, content_type: Optional[str]='application/json') -> Any: """Read and decodes JSON response.""" if self._body is None: await self.read() if content_type: ctype = self.headers.get(hdrs.CONTENT_TYPE, '').lower() if not _is_expected_content_type(ctype, content_type): raise ContentTypeError( self.request_info, self.history, message=('Attempt to decode JSON with ' 'unexpected mimetype: %s' % ctype), headers=self.headers) stripped = self._body.strip() # type: ignore if not stripped: return None if encoding is None: encoding = self.get_encoding() return loads(stripped.decode(encoding)) async def __aenter__(self) -> 'ClientResponse': return self async def __aexit__(self, exc_type: Optional[Type[BaseException]], exc_val: Optional[BaseException], exc_tb: Optional[TracebackType]) -> None: # similar to _RequestContextManager, we do not need to check # for exceptions, response object can closes connection # is state is broken self.release() aiohttp-3.6.2/aiohttp/client_ws.py0000644000175100001650000002470613547410117017507 0ustar vstsdocker00000000000000"""WebSocket client for asyncio.""" import asyncio from typing import Any, Optional import async_timeout from .client_exceptions import ClientError from .client_reqrep import ClientResponse from .helpers import call_later, set_result from .http import ( WS_CLOSED_MESSAGE, WS_CLOSING_MESSAGE, WebSocketError, WSMessage, WSMsgType, ) from .http_websocket import WebSocketWriter # WSMessage from .streams import EofStream, FlowControlDataQueue # noqa from .typedefs import ( DEFAULT_JSON_DECODER, DEFAULT_JSON_ENCODER, JSONDecoder, JSONEncoder, ) class ClientWebSocketResponse: def __init__(self, reader: 'FlowControlDataQueue[WSMessage]', writer: WebSocketWriter, protocol: Optional[str], response: ClientResponse, timeout: float, autoclose: bool, autoping: bool, loop: asyncio.AbstractEventLoop, *, receive_timeout: Optional[float]=None, heartbeat: Optional[float]=None, compress: int=0, client_notakeover: bool=False) -> None: self._response = response self._conn = response.connection self._writer = writer self._reader = reader self._protocol = protocol self._closed = False self._closing = False self._close_code = None # type: Optional[int] self._timeout = timeout self._receive_timeout = receive_timeout self._autoclose = autoclose self._autoping = autoping self._heartbeat = heartbeat self._heartbeat_cb = None if heartbeat is not None: self._pong_heartbeat = heartbeat / 2.0 self._pong_response_cb = None self._loop = loop self._waiting = None # type: Optional[asyncio.Future[bool]] self._exception = None # type: Optional[BaseException] self._compress = compress self._client_notakeover = client_notakeover self._reset_heartbeat() def _cancel_heartbeat(self) -> None: if self._pong_response_cb is not None: self._pong_response_cb.cancel() self._pong_response_cb = None if self._heartbeat_cb is not None: self._heartbeat_cb.cancel() self._heartbeat_cb = None def _reset_heartbeat(self) -> None: self._cancel_heartbeat() if self._heartbeat is not None: self._heartbeat_cb = call_later( 
self._send_heartbeat, self._heartbeat, self._loop) def _send_heartbeat(self) -> None: if self._heartbeat is not None and not self._closed: # fire-and-forget a task is not perfect but maybe ok for # sending ping. Otherwise we need a long-living heartbeat # task in the class. self._loop.create_task(self._writer.ping()) if self._pong_response_cb is not None: self._pong_response_cb.cancel() self._pong_response_cb = call_later( self._pong_not_received, self._pong_heartbeat, self._loop) def _pong_not_received(self) -> None: if not self._closed: self._closed = True self._close_code = 1006 self._exception = asyncio.TimeoutError() self._response.close() @property def closed(self) -> bool: return self._closed @property def close_code(self) -> Optional[int]: return self._close_code @property def protocol(self) -> Optional[str]: return self._protocol @property def compress(self) -> int: return self._compress @property def client_notakeover(self) -> bool: return self._client_notakeover def get_extra_info(self, name: str, default: Any=None) -> Any: """extra info from connection transport""" conn = self._response.connection if conn is None: return default transport = conn.transport if transport is None: return default return transport.get_extra_info(name, default) def exception(self) -> Optional[BaseException]: return self._exception async def ping(self, message: bytes=b'') -> None: await self._writer.ping(message) async def pong(self, message: bytes=b'') -> None: await self._writer.pong(message) async def send_str(self, data: str, compress: Optional[int]=None) -> None: if not isinstance(data, str): raise TypeError('data argument must be str (%r)' % type(data)) await self._writer.send(data, binary=False, compress=compress) async def send_bytes(self, data: bytes, compress: Optional[int]=None) -> None: if not isinstance(data, (bytes, bytearray, memoryview)): raise TypeError('data argument must be byte-ish (%r)' % type(data)) await self._writer.send(data, binary=True, compress=compress) async def send_json(self, data: Any, compress: Optional[int]=None, *, dumps: JSONEncoder=DEFAULT_JSON_ENCODER) -> None: await self.send_str(dumps(data), compress=compress) async def close(self, *, code: int=1000, message: bytes=b'') -> bool: # we need to break `receive()` cycle first, # `close()` may be called from different task if self._waiting is not None and not self._closed: self._reader.feed_data(WS_CLOSING_MESSAGE, 0) await self._waiting if not self._closed: self._cancel_heartbeat() self._closed = True try: await self._writer.close(code, message) except asyncio.CancelledError: self._close_code = 1006 self._response.close() raise except Exception as exc: self._close_code = 1006 self._exception = exc self._response.close() return True if self._closing: self._response.close() return True while True: try: with async_timeout.timeout(self._timeout, loop=self._loop): msg = await self._reader.read() except asyncio.CancelledError: self._close_code = 1006 self._response.close() raise except Exception as exc: self._close_code = 1006 self._exception = exc self._response.close() return True if msg.type == WSMsgType.CLOSE: self._close_code = msg.data self._response.close() return True else: return False async def receive(self, timeout: Optional[float]=None) -> WSMessage: while True: if self._waiting is not None: raise RuntimeError( 'Concurrent call to receive() is not allowed') if self._closed: return WS_CLOSED_MESSAGE elif self._closing: await self.close() return WS_CLOSED_MESSAGE try: self._waiting = self._loop.create_future() 
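# ---------------------------------------------------------------------------
# Editor's note (added usage sketch, not part of the aiohttp 3.6.2 sources):
# ClientWebSocketResponse is obtained via ClientSession.ws_connect(); the
# heartbeat argument drives the ping/pong timers implemented above.  The URL
# is a placeholder.
import aiohttp

async def echo_once(url: str) -> str:
    async with aiohttp.ClientSession() as session:
        async with session.ws_connect(url, heartbeat=30.0) as ws:
            await ws.send_str('hello')
            msg = await ws.receive()               # WSMessage
            if msg.type == aiohttp.WSMsgType.TEXT:
                return msg.data
            return ''
# ---------------------------------------------------------------------------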
try: with async_timeout.timeout( timeout or self._receive_timeout, loop=self._loop): msg = await self._reader.read() self._reset_heartbeat() finally: waiter = self._waiting self._waiting = None set_result(waiter, True) except (asyncio.CancelledError, asyncio.TimeoutError): self._close_code = 1006 raise except EofStream: self._close_code = 1000 await self.close() return WSMessage(WSMsgType.CLOSED, None, None) except ClientError: self._closed = True self._close_code = 1006 return WS_CLOSED_MESSAGE except WebSocketError as exc: self._close_code = exc.code await self.close(code=exc.code) return WSMessage(WSMsgType.ERROR, exc, None) except Exception as exc: self._exception = exc self._closing = True self._close_code = 1006 await self.close() return WSMessage(WSMsgType.ERROR, exc, None) if msg.type == WSMsgType.CLOSE: self._closing = True self._close_code = msg.data if not self._closed and self._autoclose: await self.close() elif msg.type == WSMsgType.CLOSING: self._closing = True elif msg.type == WSMsgType.PING and self._autoping: await self.pong(msg.data) continue elif msg.type == WSMsgType.PONG and self._autoping: continue return msg async def receive_str(self, *, timeout: Optional[float]=None) -> str: msg = await self.receive(timeout) if msg.type != WSMsgType.TEXT: raise TypeError( "Received message {}:{!r} is not str".format(msg.type, msg.data)) return msg.data async def receive_bytes(self, *, timeout: Optional[float]=None) -> bytes: msg = await self.receive(timeout) if msg.type != WSMsgType.BINARY: raise TypeError( "Received message {}:{!r} is not bytes".format(msg.type, msg.data)) return msg.data async def receive_json(self, *, loads: JSONDecoder=DEFAULT_JSON_DECODER, timeout: Optional[float]=None) -> Any: data = await self.receive_str(timeout=timeout) return loads(data) def __aiter__(self) -> 'ClientWebSocketResponse': return self async def __anext__(self) -> WSMessage: msg = await self.receive() if msg.type in (WSMsgType.CLOSE, WSMsgType.CLOSING, WSMsgType.CLOSED): raise StopAsyncIteration # NOQA return msg aiohttp-3.6.2/aiohttp/connector.py0000644000175100001650000012275713547410117017517 0ustar vstsdocker00000000000000import asyncio import functools import random import sys import traceback import warnings from collections import defaultdict, deque from contextlib import suppress from http.cookies import SimpleCookie from itertools import cycle, islice from time import monotonic from types import TracebackType from typing import ( # noqa TYPE_CHECKING, Any, Awaitable, Callable, DefaultDict, Dict, Iterator, List, Optional, Set, Tuple, Type, Union, cast, ) import attr from . 
import hdrs, helpers from .abc import AbstractResolver from .client_exceptions import ( ClientConnectionError, ClientConnectorCertificateError, ClientConnectorError, ClientConnectorSSLError, ClientHttpProxyError, ClientProxyConnectionError, ServerFingerprintMismatch, cert_errors, ssl_errors, ) from .client_proto import ResponseHandler from .client_reqrep import ClientRequest, Fingerprint, _merge_ssl_params from .helpers import ( PY_36, CeilTimeout, get_running_loop, is_ip_address, noop2, sentinel, ) from .http import RESPONSES from .locks import EventResultOrError from .resolver import DefaultResolver try: import ssl SSLContext = ssl.SSLContext except ImportError: # pragma: no cover ssl = None # type: ignore SSLContext = object # type: ignore __all__ = ('BaseConnector', 'TCPConnector', 'UnixConnector', 'NamedPipeConnector') if TYPE_CHECKING: # pragma: no cover from .client import ClientTimeout # noqa from .client_reqrep import ConnectionKey # noqa from .tracing import Trace # noqa class _DeprecationWaiter: __slots__ = ('_awaitable', '_awaited') def __init__(self, awaitable: Awaitable[Any]) -> None: self._awaitable = awaitable self._awaited = False def __await__(self) -> Any: self._awaited = True return self._awaitable.__await__() def __del__(self) -> None: if not self._awaited: warnings.warn("Connector.close() is a coroutine, " "please use await connector.close()", DeprecationWarning) class Connection: _source_traceback = None _transport = None def __init__(self, connector: 'BaseConnector', key: 'ConnectionKey', protocol: ResponseHandler, loop: asyncio.AbstractEventLoop) -> None: self._key = key self._connector = connector self._loop = loop self._protocol = protocol # type: Optional[ResponseHandler] self._callbacks = [] # type: List[Callable[[], None]] if loop.get_debug(): self._source_traceback = traceback.extract_stack(sys._getframe(1)) def __repr__(self) -> str: return 'Connection<{}>'.format(self._key) def __del__(self, _warnings: Any=warnings) -> None: if self._protocol is not None: if PY_36: kwargs = {'source': self} else: kwargs = {} _warnings.warn('Unclosed connection {!r}'.format(self), ResourceWarning, **kwargs) if self._loop.is_closed(): return self._connector._release( self._key, self._protocol, should_close=True) context = {'client_connection': self, 'message': 'Unclosed connection'} if self._source_traceback is not None: context['source_traceback'] = self._source_traceback self._loop.call_exception_handler(context) @property def loop(self) -> asyncio.AbstractEventLoop: warnings.warn("connector.loop property is deprecated", DeprecationWarning, stacklevel=2) return self._loop @property def transport(self) -> Optional[asyncio.Transport]: if self._protocol is None: return None return self._protocol.transport @property def protocol(self) -> Optional[ResponseHandler]: return self._protocol def add_callback(self, callback: Callable[[], None]) -> None: if callback is not None: self._callbacks.append(callback) def _notify_release(self) -> None: callbacks, self._callbacks = self._callbacks[:], [] for cb in callbacks: with suppress(Exception): cb() def close(self) -> None: self._notify_release() if self._protocol is not None: self._connector._release( self._key, self._protocol, should_close=True) self._protocol = None def release(self) -> None: self._notify_release() if self._protocol is not None: self._connector._release( self._key, self._protocol, should_close=self._protocol.should_close) self._protocol = None @property def closed(self) -> bool: return self._protocol is None or not 
self._protocol.is_connected() class _TransportPlaceholder: """ placeholder for BaseConnector.connect function """ def close(self) -> None: pass class BaseConnector: """Base connector class. keepalive_timeout - (optional) Keep-alive timeout. force_close - Set to True to force close and do reconnect after each request (and between redirects). limit - The total number of simultaneous connections. limit_per_host - Number of simultaneous connections to one host. enable_cleanup_closed - Enables clean-up closed ssl transports. Disabled by default. loop - Optional event loop. """ _closed = True # prevent AttributeError in __del__ if ctor was failed _source_traceback = None # abort transport after 2 seconds (cleanup broken connections) _cleanup_closed_period = 2.0 def __init__(self, *, keepalive_timeout: Union[object, None, float]=sentinel, force_close: bool=False, limit: int=100, limit_per_host: int=0, enable_cleanup_closed: bool=False, loop: Optional[asyncio.AbstractEventLoop]=None) -> None: if force_close: if keepalive_timeout is not None and \ keepalive_timeout is not sentinel: raise ValueError('keepalive_timeout cannot ' 'be set if force_close is True') else: if keepalive_timeout is sentinel: keepalive_timeout = 15.0 loop = get_running_loop(loop) self._closed = False if loop.get_debug(): self._source_traceback = traceback.extract_stack(sys._getframe(1)) self._conns = {} # type: Dict[ConnectionKey, List[Tuple[ResponseHandler, float]]] # noqa self._limit = limit self._limit_per_host = limit_per_host self._acquired = set() # type: Set[ResponseHandler] self._acquired_per_host = defaultdict(set) # type: DefaultDict[ConnectionKey, Set[ResponseHandler]] # noqa self._keepalive_timeout = cast(float, keepalive_timeout) self._force_close = force_close # {host_key: FIFO list of waiters} self._waiters = defaultdict(deque) # type: ignore self._loop = loop self._factory = functools.partial(ResponseHandler, loop=loop) self.cookies = SimpleCookie() # start keep-alive connection cleanup task self._cleanup_handle = None # start cleanup closed transports task self._cleanup_closed_handle = None self._cleanup_closed_disabled = not enable_cleanup_closed self._cleanup_closed_transports = [] # type: List[Optional[asyncio.Transport]] # noqa self._cleanup_closed() def __del__(self, _warnings: Any=warnings) -> None: if self._closed: return if not self._conns: return conns = [repr(c) for c in self._conns.values()] self._close() if PY_36: kwargs = {'source': self} else: kwargs = {} _warnings.warn("Unclosed connector {!r}".format(self), ResourceWarning, **kwargs) context = {'connector': self, 'connections': conns, 'message': 'Unclosed connector'} if self._source_traceback is not None: context['source_traceback'] = self._source_traceback self._loop.call_exception_handler(context) def __enter__(self) -> 'BaseConnector': warnings.warn('"witn Connector():" is deprecated, ' 'use "async with Connector():" instead', DeprecationWarning) return self def __exit__(self, *exc: Any) -> None: self.close() async def __aenter__(self) -> 'BaseConnector': return self async def __aexit__(self, exc_type: Optional[Type[BaseException]]=None, exc_value: Optional[BaseException]=None, exc_traceback: Optional[TracebackType]=None ) -> None: await self.close() @property def force_close(self) -> bool: """Ultimately close connection on releasing if True.""" return self._force_close @property def limit(self) -> int: """The total number for simultaneous connections. If limit is 0 the connector has no limit. The default limit size is 100. 
""" return self._limit @property def limit_per_host(self) -> int: """The limit_per_host for simultaneous connections to the same endpoint. Endpoints are the same if they are have equal (host, port, is_ssl) triple. """ return self._limit_per_host def _cleanup(self) -> None: """Cleanup unused transports.""" if self._cleanup_handle: self._cleanup_handle.cancel() now = self._loop.time() timeout = self._keepalive_timeout if self._conns: connections = {} deadline = now - timeout for key, conns in self._conns.items(): alive = [] for proto, use_time in conns: if proto.is_connected(): if use_time - deadline < 0: transport = proto.transport proto.close() if (key.is_ssl and not self._cleanup_closed_disabled): self._cleanup_closed_transports.append( transport) else: alive.append((proto, use_time)) if alive: connections[key] = alive self._conns = connections if self._conns: self._cleanup_handle = helpers.weakref_handle( self, '_cleanup', timeout, self._loop) def _drop_acquired_per_host(self, key: 'ConnectionKey', val: ResponseHandler) -> None: acquired_per_host = self._acquired_per_host if key not in acquired_per_host: return conns = acquired_per_host[key] conns.remove(val) if not conns: del self._acquired_per_host[key] def _cleanup_closed(self) -> None: """Double confirmation for transport close. Some broken ssl servers may leave socket open without proper close. """ if self._cleanup_closed_handle: self._cleanup_closed_handle.cancel() for transport in self._cleanup_closed_transports: if transport is not None: transport.abort() self._cleanup_closed_transports = [] if not self._cleanup_closed_disabled: self._cleanup_closed_handle = helpers.weakref_handle( self, '_cleanup_closed', self._cleanup_closed_period, self._loop) def close(self) -> Awaitable[None]: """Close all opened transports.""" self._close() return _DeprecationWaiter(noop2()) def _close(self) -> None: if self._closed: return self._closed = True try: if self._loop.is_closed(): return # cancel cleanup task if self._cleanup_handle: self._cleanup_handle.cancel() # cancel cleanup close task if self._cleanup_closed_handle: self._cleanup_closed_handle.cancel() for data in self._conns.values(): for proto, t0 in data: proto.close() for proto in self._acquired: proto.close() for transport in self._cleanup_closed_transports: if transport is not None: transport.abort() finally: self._conns.clear() self._acquired.clear() self._waiters.clear() self._cleanup_handle = None self._cleanup_closed_transports.clear() self._cleanup_closed_handle = None @property def closed(self) -> bool: """Is connector closed. A readonly property. """ return self._closed def _available_connections(self, key: 'ConnectionKey') -> int: """ Return number of available connections taking into account the limit, limit_per_host and the connection key. If it returns less than 1 means that there is no connections availables. 
""" if self._limit: # total calc available connections available = self._limit - len(self._acquired) # check limit per host if (self._limit_per_host and available > 0 and key in self._acquired_per_host): acquired = self._acquired_per_host.get(key) assert acquired is not None available = self._limit_per_host - len(acquired) elif self._limit_per_host and key in self._acquired_per_host: # check limit per host acquired = self._acquired_per_host.get(key) assert acquired is not None available = self._limit_per_host - len(acquired) else: available = 1 return available async def connect(self, req: 'ClientRequest', traces: List['Trace'], timeout: 'ClientTimeout') -> Connection: """Get from pool or create new connection.""" key = req.connection_key available = self._available_connections(key) # Wait if there are no available connections. if available <= 0: fut = self._loop.create_future() # This connection will now count towards the limit. waiters = self._waiters[key] waiters.append(fut) if traces: for trace in traces: await trace.send_connection_queued_start() try: await fut except BaseException as e: # remove a waiter even if it was cancelled, normally it's # removed when it's notified try: waiters.remove(fut) except ValueError: # fut may no longer be in list pass raise e finally: if not waiters: try: del self._waiters[key] except KeyError: # the key was evicted before. pass if traces: for trace in traces: await trace.send_connection_queued_end() proto = self._get(key) if proto is None: placeholder = cast(ResponseHandler, _TransportPlaceholder()) self._acquired.add(placeholder) self._acquired_per_host[key].add(placeholder) if traces: for trace in traces: await trace.send_connection_create_start() try: proto = await self._create_connection(req, traces, timeout) if self._closed: proto.close() raise ClientConnectionError("Connector is closed.") except BaseException: if not self._closed: self._acquired.remove(placeholder) self._drop_acquired_per_host(key, placeholder) self._release_waiter() raise else: if not self._closed: self._acquired.remove(placeholder) self._drop_acquired_per_host(key, placeholder) if traces: for trace in traces: await trace.send_connection_create_end() else: if traces: for trace in traces: await trace.send_connection_reuseconn() self._acquired.add(proto) self._acquired_per_host[key].add(proto) return Connection(self, key, proto, self._loop) def _get(self, key: 'ConnectionKey') -> Optional[ResponseHandler]: try: conns = self._conns[key] except KeyError: return None t1 = self._loop.time() while conns: proto, t0 = conns.pop() if proto.is_connected(): if t1 - t0 > self._keepalive_timeout: transport = proto.transport proto.close() # only for SSL transports if key.is_ssl and not self._cleanup_closed_disabled: self._cleanup_closed_transports.append(transport) else: if not conns: # The very last connection was reclaimed: drop the key del self._conns[key] return proto # No more connections: drop the key del self._conns[key] return None def _release_waiter(self) -> None: """ Iterates over all waiters till found one that is not finsihed and belongs to a host that has available connections. """ if not self._waiters: return # Having the dict keys ordered this avoids to iterate # at the same order at each call. 
queues = list(self._waiters.keys()) random.shuffle(queues) for key in queues: if self._available_connections(key) < 1: continue waiters = self._waiters[key] while waiters: waiter = waiters.popleft() if not waiter.done(): waiter.set_result(None) return def _release_acquired(self, key: 'ConnectionKey', proto: ResponseHandler) -> None: if self._closed: # acquired connection is already released on connector closing return try: self._acquired.remove(proto) self._drop_acquired_per_host(key, proto) except KeyError: # pragma: no cover # this may be result of undetermenistic order of objects # finalization due garbage collection. pass else: self._release_waiter() def _release(self, key: 'ConnectionKey', protocol: ResponseHandler, *, should_close: bool=False) -> None: if self._closed: # acquired connection is already released on connector closing return self._release_acquired(key, protocol) if self._force_close: should_close = True if should_close or protocol.should_close: transport = protocol.transport protocol.close() if key.is_ssl and not self._cleanup_closed_disabled: self._cleanup_closed_transports.append(transport) else: conns = self._conns.get(key) if conns is None: conns = self._conns[key] = [] conns.append((protocol, self._loop.time())) if self._cleanup_handle is None: self._cleanup_handle = helpers.weakref_handle( self, '_cleanup', self._keepalive_timeout, self._loop) async def _create_connection(self, req: 'ClientRequest', traces: List['Trace'], timeout: 'ClientTimeout') -> ResponseHandler: raise NotImplementedError() class _DNSCacheTable: def __init__(self, ttl: Optional[float]=None) -> None: self._addrs_rr = {} # type: Dict[Tuple[str, int], Tuple[Iterator[Dict[str, Any]], int]] # noqa self._timestamps = {} # type: Dict[Tuple[str, int], float] self._ttl = ttl def __contains__(self, host: object) -> bool: return host in self._addrs_rr def add(self, key: Tuple[str, int], addrs: List[Dict[str, Any]]) -> None: self._addrs_rr[key] = (cycle(addrs), len(addrs)) if self._ttl: self._timestamps[key] = monotonic() def remove(self, key: Tuple[str, int]) -> None: self._addrs_rr.pop(key, None) if self._ttl: self._timestamps.pop(key, None) def clear(self) -> None: self._addrs_rr.clear() self._timestamps.clear() def next_addrs(self, key: Tuple[str, int]) -> List[Dict[str, Any]]: loop, length = self._addrs_rr[key] addrs = list(islice(loop, length)) # Consume one more element to shift internal state of `cycle` next(loop) return addrs def expired(self, key: Tuple[str, int]) -> bool: if self._ttl is None: return False return self._timestamps[key] + self._ttl < monotonic() class TCPConnector(BaseConnector): """TCP connector. verify_ssl - Set to True to check ssl certifications. fingerprint - Pass the binary sha256 digest of the expected certificate in DER format to verify that the certificate the server presents matches. See also https://en.wikipedia.org/wiki/Transport_Layer_Security#Certificate_pinning resolver - Enable DNS lookups and use this resolver use_dns_cache - Use memory cache for DNS lookups. ttl_dns_cache - Max seconds having cached a DNS entry, None forever. family - socket address family local_addr - local tuple of (host, port) to bind socket to keepalive_timeout - (optional) Keep-alive timeout. force_close - Set to True to force close and do reconnect after each request (and between redirects). limit - The total number of simultaneous connections. limit_per_host - Number of simultaneous connections to one host. enable_cleanup_closed - Enables clean-up closed ssl transports. 
Disabled by default. loop - Optional event loop. """ def __init__(self, *, verify_ssl: bool=True, fingerprint: Optional[bytes]=None, use_dns_cache: bool=True, ttl_dns_cache: int=10, family: int=0, ssl_context: Optional[SSLContext]=None, ssl: Union[None, bool, Fingerprint, SSLContext]=None, local_addr: Optional[Tuple[str, int]]=None, resolver: Optional[AbstractResolver]=None, keepalive_timeout: Union[None, float, object]=sentinel, force_close: bool=False, limit: int=100, limit_per_host: int=0, enable_cleanup_closed: bool=False, loop: Optional[asyncio.AbstractEventLoop]=None): super().__init__(keepalive_timeout=keepalive_timeout, force_close=force_close, limit=limit, limit_per_host=limit_per_host, enable_cleanup_closed=enable_cleanup_closed, loop=loop) self._ssl = _merge_ssl_params(ssl, verify_ssl, ssl_context, fingerprint) if resolver is None: resolver = DefaultResolver(loop=self._loop) self._resolver = resolver self._use_dns_cache = use_dns_cache self._cached_hosts = _DNSCacheTable(ttl=ttl_dns_cache) self._throttle_dns_events = {} # type: Dict[Tuple[str, int], EventResultOrError] # noqa self._family = family self._local_addr = local_addr def close(self) -> Awaitable[None]: """Close all ongoing DNS calls.""" for ev in self._throttle_dns_events.values(): ev.cancel() return super().close() @property def family(self) -> int: """Socket family like AF_INET.""" return self._family @property def use_dns_cache(self) -> bool: """True if local DNS caching is enabled.""" return self._use_dns_cache def clear_dns_cache(self, host: Optional[str]=None, port: Optional[int]=None) -> None: """Remove specified host/port or clear all dns local cache.""" if host is not None and port is not None: self._cached_hosts.remove((host, port)) elif host is not None or port is not None: raise ValueError("either both host and port " "or none of them are allowed") else: self._cached_hosts.clear() async def _resolve_host(self, host: str, port: int, traces: Optional[List['Trace']]=None ) -> List[Dict[str, Any]]: if is_ip_address(host): return [{'hostname': host, 'host': host, 'port': port, 'family': self._family, 'proto': 0, 'flags': 0}] if not self._use_dns_cache: if traces: for trace in traces: await trace.send_dns_resolvehost_start(host) res = (await self._resolver.resolve( host, port, family=self._family)) if traces: for trace in traces: await trace.send_dns_resolvehost_end(host) return res key = (host, port) if (key in self._cached_hosts) and \ (not self._cached_hosts.expired(key)): # get result early, before any await (#4014) result = self._cached_hosts.next_addrs(key) if traces: for trace in traces: await trace.send_dns_cache_hit(host) return result if key in self._throttle_dns_events: # get event early, before any await (#4014) event = self._throttle_dns_events[key] if traces: for trace in traces: await trace.send_dns_cache_hit(host) await event.wait() else: # update dict early, before any await (#4014) self._throttle_dns_events[key] = \ EventResultOrError(self._loop) if traces: for trace in traces: await trace.send_dns_cache_miss(host) try: if traces: for trace in traces: await trace.send_dns_resolvehost_start(host) addrs = await \ self._resolver.resolve(host, port, family=self._family) if traces: for trace in traces: await trace.send_dns_resolvehost_end(host) self._cached_hosts.add(key, addrs) self._throttle_dns_events[key].set() except BaseException as e: # any DNS exception, independently of the implementation # is set for the waiters to raise the same exception. 
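# ---------------------------------------------------------------------------
# Editor's note (added usage sketch, not part of the aiohttp 3.6.2 sources):
# the DNS caching and throttling implemented in _resolve_host above, driven
# by the TCPConnector arguments.  Host, port and URL values are placeholders.
import aiohttp

async def main() -> None:
    connector = aiohttp.TCPConnector(use_dns_cache=True, ttl_dns_cache=300)
    async with aiohttp.ClientSession(connector=connector) as session:
        async with session.get('http://example.com/') as resp:
            await resp.read()
        # Drop a single cached entry, or everything:
        connector.clear_dns_cache('example.com', 80)
        connector.clear_dns_cache()
# ---------------------------------------------------------------------------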
self._throttle_dns_events[key].set(exc=e) raise finally: self._throttle_dns_events.pop(key) return self._cached_hosts.next_addrs(key) async def _create_connection(self, req: 'ClientRequest', traces: List['Trace'], timeout: 'ClientTimeout') -> ResponseHandler: """Create connection. Has same keyword arguments as BaseEventLoop.create_connection. """ if req.proxy: _, proto = await self._create_proxy_connection( req, traces, timeout) else: _, proto = await self._create_direct_connection( req, traces, timeout) return proto @staticmethod @functools.lru_cache(None) def _make_ssl_context(verified: bool) -> SSLContext: if verified: return ssl.create_default_context() else: sslcontext = ssl.SSLContext(ssl.PROTOCOL_SSLv23) sslcontext.options |= ssl.OP_NO_SSLv2 sslcontext.options |= ssl.OP_NO_SSLv3 try: sslcontext.options |= ssl.OP_NO_COMPRESSION except AttributeError as attr_err: warnings.warn( '{!s}: The Python interpreter is compiled ' 'against OpenSSL < 1.0.0. Ref: ' 'https://docs.python.org/3/library/ssl.html' '#ssl.OP_NO_COMPRESSION'. format(attr_err), ) sslcontext.set_default_verify_paths() return sslcontext def _get_ssl_context(self, req: 'ClientRequest') -> Optional[SSLContext]: """Logic to get the correct SSL context 0. if req.ssl is false, return None 1. if ssl_context is specified in req, use it 2. if _ssl_context is specified in self, use it 3. otherwise: 1. if verify_ssl is not specified in req, use self.ssl_context (will generate a default context according to self.verify_ssl) 2. if verify_ssl is True in req, generate a default SSL context 3. if verify_ssl is False in req, generate a SSL context that won't verify """ if req.is_ssl(): if ssl is None: # pragma: no cover raise RuntimeError('SSL is not supported.') sslcontext = req.ssl if isinstance(sslcontext, ssl.SSLContext): return sslcontext if sslcontext is not None: # not verified or fingerprinted return self._make_ssl_context(False) sslcontext = self._ssl if isinstance(sslcontext, ssl.SSLContext): return sslcontext if sslcontext is not None: # not verified or fingerprinted return self._make_ssl_context(False) return self._make_ssl_context(True) else: return None def _get_fingerprint(self, req: 'ClientRequest') -> Optional['Fingerprint']: ret = req.ssl if isinstance(ret, Fingerprint): return ret ret = self._ssl if isinstance(ret, Fingerprint): return ret return None async def _wrap_create_connection( self, *args: Any, req: 'ClientRequest', timeout: 'ClientTimeout', client_error: Type[Exception]=ClientConnectorError, **kwargs: Any) -> Tuple[asyncio.Transport, ResponseHandler]: try: with CeilTimeout(timeout.sock_connect): return await self._loop.create_connection(*args, **kwargs) # type: ignore # noqa except cert_errors as exc: raise ClientConnectorCertificateError( req.connection_key, exc) from exc except ssl_errors as exc: raise ClientConnectorSSLError(req.connection_key, exc) from exc except OSError as exc: raise client_error(req.connection_key, exc) from exc async def _create_direct_connection( self, req: 'ClientRequest', traces: List['Trace'], timeout: 'ClientTimeout', *, client_error: Type[Exception]=ClientConnectorError ) -> Tuple[asyncio.Transport, ResponseHandler]: sslcontext = self._get_ssl_context(req) fingerprint = self._get_fingerprint(req) try: # Cancelling this lookup should not cancel the underlying lookup # or else the cancel event will get broadcast to all the waiters # across all connections. 
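# ---------------------------------------------------------------------------
# Editor's note (added usage sketch, not part of the aiohttp 3.6.2 sources):
# the ssl argument resolved by _get_ssl_context above may be None (default
# verified context), False (no verification), an ssl.SSLContext, or a
# Fingerprint for certificate pinning.  The URL and CA file are placeholders.
import ssl
import aiohttp

async def main() -> None:
    ctx = ssl.create_default_context()   # e.g. ssl.create_default_context(cafile='ca.pem')
    async with aiohttp.ClientSession() as session:
        async with session.get('https://example.com/', ssl=ctx) as resp:
            await resp.read()
        # ssl=False would skip certificate verification for a single request.
# ---------------------------------------------------------------------------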
host = req.url.raw_host assert host is not None port = req.port assert port is not None hosts = await asyncio.shield(self._resolve_host( host, port, traces=traces), loop=self._loop) except OSError as exc: # in case of proxy it is not ClientProxyConnectionError # it is problem of resolving proxy ip itself raise ClientConnectorError(req.connection_key, exc) from exc last_exc = None # type: Optional[Exception] for hinfo in hosts: host = hinfo['host'] port = hinfo['port'] try: transp, proto = await self._wrap_create_connection( self._factory, host, port, timeout=timeout, ssl=sslcontext, family=hinfo['family'], proto=hinfo['proto'], flags=hinfo['flags'], server_hostname=hinfo['hostname'] if sslcontext else None, local_addr=self._local_addr, req=req, client_error=client_error) except ClientConnectorError as exc: last_exc = exc continue if req.is_ssl() and fingerprint: try: fingerprint.check(transp) except ServerFingerprintMismatch as exc: transp.close() if not self._cleanup_closed_disabled: self._cleanup_closed_transports.append(transp) last_exc = exc continue return transp, proto else: assert last_exc is not None raise last_exc async def _create_proxy_connection( self, req: 'ClientRequest', traces: List['Trace'], timeout: 'ClientTimeout' ) -> Tuple[asyncio.Transport, ResponseHandler]: headers = {} # type: Dict[str, str] if req.proxy_headers is not None: headers = req.proxy_headers # type: ignore headers[hdrs.HOST] = req.headers[hdrs.HOST] url = req.proxy assert url is not None proxy_req = ClientRequest( hdrs.METH_GET, url, headers=headers, auth=req.proxy_auth, loop=self._loop, ssl=req.ssl) # create connection to proxy server transport, proto = await self._create_direct_connection( proxy_req, [], timeout, client_error=ClientProxyConnectionError) # Many HTTP proxies has buggy keepalive support. Let's not # reuse connection but close it after processing every # response. 
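# ---------------------------------------------------------------------------
# Editor's note (added usage sketch, not part of the aiohttp 3.6.2 sources):
# the proxy path above (plain forwarding for http targets, a CONNECT tunnel
# for https targets) as seen from user code.  The proxy address and the
# credentials are placeholders; only http:// proxies are accepted, per
# ClientRequest.update_proxy().
import aiohttp

async def main() -> None:
    async with aiohttp.ClientSession() as session:
        async with session.get('https://example.com/',
                               proxy='http://proxy.local:8080',
                               proxy_auth=aiohttp.BasicAuth('user', 'secret')) as resp:
            await resp.read()
# ---------------------------------------------------------------------------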
proto.force_close() auth = proxy_req.headers.pop(hdrs.AUTHORIZATION, None) if auth is not None: if not req.is_ssl(): req.headers[hdrs.PROXY_AUTHORIZATION] = auth else: proxy_req.headers[hdrs.PROXY_AUTHORIZATION] = auth if req.is_ssl(): sslcontext = self._get_ssl_context(req) # For HTTPS requests over HTTP proxy # we must notify proxy to tunnel connection # so we send CONNECT command: # CONNECT www.python.org:443 HTTP/1.1 # Host: www.python.org # # next we must do TLS handshake and so on # to do this we must wrap raw socket into secure one # asyncio handles this perfectly proxy_req.method = hdrs.METH_CONNECT proxy_req.url = req.url key = attr.evolve(req.connection_key, proxy=None, proxy_auth=None, proxy_headers_hash=None) conn = Connection(self, key, proto, self._loop) proxy_resp = await proxy_req.send(conn) try: protocol = conn._protocol assert protocol is not None protocol.set_response_params() resp = await proxy_resp.start(conn) except BaseException: proxy_resp.close() conn.close() raise else: conn._protocol = None conn._transport = None try: if resp.status != 200: message = resp.reason if message is None: message = RESPONSES[resp.status][0] raise ClientHttpProxyError( proxy_resp.request_info, resp.history, status=resp.status, message=message, headers=resp.headers) rawsock = transport.get_extra_info('socket', default=None) if rawsock is None: raise RuntimeError( "Transport does not expose socket instance") # Duplicate the socket, so now we can close proxy transport rawsock = rawsock.dup() finally: transport.close() transport, proto = await self._wrap_create_connection( self._factory, timeout=timeout, ssl=sslcontext, sock=rawsock, server_hostname=req.host, req=req) finally: proxy_resp.close() return transport, proto class UnixConnector(BaseConnector): """Unix socket connector. path - Unix socket path. keepalive_timeout - (optional) Keep-alive timeout. force_close - Set to True to force close and do reconnect after each request (and between redirects). limit - The total number of simultaneous connections. limit_per_host - Number of simultaneous connections to one host. loop - Optional event loop. """ def __init__(self, path: str, force_close: bool=False, keepalive_timeout: Union[object, float, None]=sentinel, limit: int=100, limit_per_host: int=0, loop: Optional[asyncio.AbstractEventLoop]=None) -> None: super().__init__(force_close=force_close, keepalive_timeout=keepalive_timeout, limit=limit, limit_per_host=limit_per_host, loop=loop) self._path = path @property def path(self) -> str: """Path to unix socket.""" return self._path async def _create_connection(self, req: 'ClientRequest', traces: List['Trace'], timeout: 'ClientTimeout') -> ResponseHandler: try: with CeilTimeout(timeout.sock_connect): _, proto = await self._loop.create_unix_connection( self._factory, self._path) except OSError as exc: raise ClientConnectorError(req.connection_key, exc) from exc return cast(ResponseHandler, proto) class NamedPipeConnector(BaseConnector): """Named pipe connector. Only supported by the proactor event loop. See also: https://docs.python.org/3.7/library/asyncio-eventloop.html path - Windows named pipe path. keepalive_timeout - (optional) Keep-alive timeout. force_close - Set to True to force close and do reconnect after each request (and between redirects). limit - The total number of simultaneous connections. limit_per_host - Number of simultaneous connections to one host. loop - Optional event loop. 
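# ---------------------------------------------------------------------------
# Editor's note (added usage sketch, not part of the aiohttp 3.6.2 sources):
# UnixConnector above routes requests to a local Unix socket; the URL host is
# only used for the Host header.  The socket path and URL are placeholders.
import aiohttp

async def main() -> None:
    connector = aiohttp.UnixConnector(path='/tmp/app.sock')
    async with aiohttp.ClientSession(connector=connector) as session:
        async with session.get('http://localhost/ping') as resp:
            await resp.read()
# ---------------------------------------------------------------------------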
""" def __init__(self, path: str, force_close: bool=False, keepalive_timeout: Union[object, float, None]=sentinel, limit: int=100, limit_per_host: int=0, loop: Optional[asyncio.AbstractEventLoop]=None) -> None: super().__init__(force_close=force_close, keepalive_timeout=keepalive_timeout, limit=limit, limit_per_host=limit_per_host, loop=loop) if not isinstance(self._loop, asyncio.ProactorEventLoop): # type: ignore # noqa raise RuntimeError("Named Pipes only available in proactor " "loop under windows") self._path = path @property def path(self) -> str: """Path to the named pipe.""" return self._path async def _create_connection(self, req: 'ClientRequest', traces: List['Trace'], timeout: 'ClientTimeout') -> ResponseHandler: try: with CeilTimeout(timeout.sock_connect): _, proto = await self._loop.create_pipe_connection( # type: ignore # noqa self._factory, self._path ) # the drain is required so that the connection_made is called # and transport is set otherwise it is not set before the # `assert conn.transport is not None` # in client.py's _request method await asyncio.sleep(0) # other option is to manually set transport like # `proto.transport = trans` except OSError as exc: raise ClientConnectorError(req.connection_key, exc) from exc return cast(ResponseHandler, proto) aiohttp-3.6.2/aiohttp/cookiejar.py0000644000175100001650000002715113547410117017463 0ustar vstsdocker00000000000000import asyncio import datetime import os # noqa import pathlib import pickle import re from collections import defaultdict from http.cookies import BaseCookie, Morsel, SimpleCookie # noqa from typing import ( # noqa DefaultDict, Dict, Iterable, Iterator, Mapping, Optional, Set, Tuple, Union, cast, ) from yarl import URL from .abc import AbstractCookieJar from .helpers import is_ip_address, next_whole_second from .typedefs import LooseCookies, PathLike __all__ = ('CookieJar', 'DummyCookieJar') CookieItem = Union[str, 'Morsel[str]'] class CookieJar(AbstractCookieJar): """Implements cookie storage adhering to RFC 6265.""" DATE_TOKENS_RE = re.compile( r"[\x09\x20-\x2F\x3B-\x40\x5B-\x60\x7B-\x7E]*" r"(?P[\x00-\x08\x0A-\x1F\d:a-zA-Z\x7F-\xFF]+)") DATE_HMS_TIME_RE = re.compile(r"(\d{1,2}):(\d{1,2}):(\d{1,2})") DATE_DAY_OF_MONTH_RE = re.compile(r"(\d{1,2})") DATE_MONTH_RE = re.compile("(jan)|(feb)|(mar)|(apr)|(may)|(jun)|(jul)|" "(aug)|(sep)|(oct)|(nov)|(dec)", re.I) DATE_YEAR_RE = re.compile(r"(\d{2,4})") MAX_TIME = datetime.datetime.max.replace( tzinfo=datetime.timezone.utc) def __init__(self, *, unsafe: bool=False, loop: Optional[asyncio.AbstractEventLoop]=None) -> None: super().__init__(loop=loop) self._cookies = defaultdict(SimpleCookie) #type: DefaultDict[str, SimpleCookie] # noqa self._host_only_cookies = set() # type: Set[Tuple[str, str]] self._unsafe = unsafe self._next_expiration = next_whole_second() self._expirations = {} # type: Dict[Tuple[str, str], datetime.datetime] # noqa: E501 def save(self, file_path: PathLike) -> None: file_path = pathlib.Path(file_path) with file_path.open(mode='wb') as f: pickle.dump(self._cookies, f, pickle.HIGHEST_PROTOCOL) def load(self, file_path: PathLike) -> None: file_path = pathlib.Path(file_path) with file_path.open(mode='rb') as f: self._cookies = pickle.load(f) def clear(self) -> None: self._cookies.clear() self._host_only_cookies.clear() self._next_expiration = next_whole_second() self._expirations.clear() def __iter__(self) -> 'Iterator[Morsel[str]]': self._do_expiration() for val in self._cookies.values(): yield from val.values() def __len__(self) -> int: return 
sum(1 for i in self) def _do_expiration(self) -> None: now = datetime.datetime.now(datetime.timezone.utc) if self._next_expiration > now: return if not self._expirations: return next_expiration = self.MAX_TIME to_del = [] cookies = self._cookies expirations = self._expirations for (domain, name), when in expirations.items(): if when <= now: cookies[domain].pop(name, None) to_del.append((domain, name)) self._host_only_cookies.discard((domain, name)) else: next_expiration = min(next_expiration, when) for key in to_del: del expirations[key] try: self._next_expiration = (next_expiration.replace(microsecond=0) + datetime.timedelta(seconds=1)) except OverflowError: self._next_expiration = self.MAX_TIME def _expire_cookie(self, when: datetime.datetime, domain: str, name: str ) -> None: self._next_expiration = min(self._next_expiration, when) self._expirations[(domain, name)] = when def update_cookies(self, cookies: LooseCookies, response_url: URL=URL()) -> None: """Update cookies.""" hostname = response_url.raw_host if not self._unsafe and is_ip_address(hostname): # Don't accept cookies from IPs return if isinstance(cookies, Mapping): cookies = cookies.items() # type: ignore for name, cookie in cookies: if not isinstance(cookie, Morsel): tmp = SimpleCookie() tmp[name] = cookie # type: ignore cookie = tmp[name] domain = cookie["domain"] # ignore domains with trailing dots if domain.endswith('.'): domain = "" del cookie["domain"] if not domain and hostname is not None: # Set the cookie's domain to the response hostname # and set its host-only-flag self._host_only_cookies.add((hostname, name)) domain = cookie["domain"] = hostname if domain.startswith("."): # Remove leading dot domain = domain[1:] cookie["domain"] = domain if hostname and not self._is_domain_match(domain, hostname): # Setting cookies for different domains is not allowed continue path = cookie["path"] if not path or not path.startswith("/"): # Set the cookie's path to the response path path = response_url.path if not path.startswith("/"): path = "/" else: # Cut everything from the last slash to the end path = "/" + path[1:path.rfind("/")] cookie["path"] = path max_age = cookie["max-age"] if max_age: try: delta_seconds = int(max_age) try: max_age_expiration = ( datetime.datetime.now(datetime.timezone.utc) + datetime.timedelta(seconds=delta_seconds)) except OverflowError: max_age_expiration = self.MAX_TIME self._expire_cookie(max_age_expiration, domain, name) except ValueError: cookie["max-age"] = "" else: expires = cookie["expires"] if expires: expire_time = self._parse_date(expires) if expire_time: self._expire_cookie(expire_time, domain, name) else: cookie["expires"] = "" self._cookies[domain][name] = cookie self._do_expiration() def filter_cookies(self, request_url: URL=URL()) -> 'BaseCookie[str]': """Returns this jar's cookies filtered by their attributes.""" self._do_expiration() request_url = URL(request_url) filtered = SimpleCookie() hostname = request_url.raw_host or "" is_not_secure = request_url.scheme not in ("https", "wss") for cookie in self: name = cookie.key domain = cookie["domain"] # Send shared cookies if not domain: filtered[name] = cookie.value continue if not self._unsafe and is_ip_address(hostname): continue if (domain, name) in self._host_only_cookies: if domain != hostname: continue elif not self._is_domain_match(domain, hostname): continue if not self._is_path_match(request_url.path, cookie["path"]): continue if is_not_secure and cookie["secure"]: continue # It's critical we use the Morsel so the coded_value # 
(based on cookie version) is preserved mrsl_val = cast('Morsel[str]', cookie.get(cookie.key, Morsel())) mrsl_val.set(cookie.key, cookie.value, cookie.coded_value) filtered[name] = mrsl_val return filtered @staticmethod def _is_domain_match(domain: str, hostname: str) -> bool: """Implements domain matching adhering to RFC 6265.""" if hostname == domain: return True if not hostname.endswith(domain): return False non_matching = hostname[:-len(domain)] if not non_matching.endswith("."): return False return not is_ip_address(hostname) @staticmethod def _is_path_match(req_path: str, cookie_path: str) -> bool: """Implements path matching adhering to RFC 6265.""" if not req_path.startswith("/"): req_path = "/" if req_path == cookie_path: return True if not req_path.startswith(cookie_path): return False if cookie_path.endswith("/"): return True non_matching = req_path[len(cookie_path):] return non_matching.startswith("/") @classmethod def _parse_date(cls, date_str: str) -> Optional[datetime.datetime]: """Implements date string parsing adhering to RFC 6265.""" if not date_str: return None found_time = False found_day = False found_month = False found_year = False hour = minute = second = 0 day = 0 month = 0 year = 0 for token_match in cls.DATE_TOKENS_RE.finditer(date_str): token = token_match.group("token") if not found_time: time_match = cls.DATE_HMS_TIME_RE.match(token) if time_match: found_time = True hour, minute, second = [ int(s) for s in time_match.groups()] continue if not found_day: day_match = cls.DATE_DAY_OF_MONTH_RE.match(token) if day_match: found_day = True day = int(day_match.group()) continue if not found_month: month_match = cls.DATE_MONTH_RE.match(token) if month_match: found_month = True assert month_match.lastindex is not None month = month_match.lastindex continue if not found_year: year_match = cls.DATE_YEAR_RE.match(token) if year_match: found_year = True year = int(year_match.group()) if 70 <= year <= 99: year += 1900 elif 0 <= year <= 69: year += 2000 if False in (found_day, found_month, found_year, found_time): return None if not 1 <= day <= 31: return None if year < 1601 or hour > 23 or minute > 59 or second > 59: return None return datetime.datetime(year, month, day, hour, minute, second, tzinfo=datetime.timezone.utc) class DummyCookieJar(AbstractCookieJar): """Implements a dummy cookie storage. It can be used with the ClientSession when no cookie processing is needed. """ def __init__(self, *, loop: Optional[asyncio.AbstractEventLoop]=None) -> None: super().__init__(loop=loop) def __iter__(self) -> 'Iterator[Morsel[str]]': while False: yield None def __len__(self) -> int: return 0 def clear(self) -> None: pass def update_cookies(self, cookies: LooseCookies, response_url: URL=URL()) -> None: pass def filter_cookies(self, request_url: URL) -> 'BaseCookie[str]': return SimpleCookie() aiohttp-3.6.2/aiohttp/formdata.py0000644000175100001650000001325713547410117017314 0ustar vstsdocker00000000000000import io from typing import Any, Iterable, List, Optional # noqa from urllib.parse import urlencode from multidict import MultiDict, MultiDictProxy from . 
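# ---------------------------------------------------------------------------
# [Editor's note: illustrative sketch, not part of the original archive.]
# The RFC 6265 matching rules used by CookieJar.filter_cookies() live in the
# cookiejar.py module shown above.  The snippet below calls the private
# _is_domain_match/_is_path_match helpers directly, purely for illustration;
# the host and path values are arbitrary examples.
from aiohttp.cookiejar import CookieJar

assert CookieJar._is_domain_match('example.com', 'www.example.com')     # subdomain matches
assert not CookieJar._is_domain_match('example.com', 'badexample.com')  # a plain suffix does not
assert CookieJar._is_path_match('/foo/bar', '/foo')                     # '/foo' covers '/foo/bar'
assert not CookieJar._is_path_match('/foobar', '/foo')                  # but not '/foobar'
# ---------------------------------------------------------------------------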
import hdrs, multipart, payload from .helpers import guess_filename from .payload import Payload __all__ = ('FormData',) class FormData: """Helper class for multipart/form-data and application/x-www-form-urlencoded body generation.""" def __init__(self, fields: Iterable[Any]=(), quote_fields: bool=True, charset: Optional[str]=None) -> None: self._writer = multipart.MultipartWriter('form-data') self._fields = [] # type: List[Any] self._is_multipart = False self._quote_fields = quote_fields self._charset = charset if isinstance(fields, dict): fields = list(fields.items()) elif not isinstance(fields, (list, tuple)): fields = (fields,) self.add_fields(*fields) @property def is_multipart(self) -> bool: return self._is_multipart def add_field(self, name: str, value: Any, *, content_type: Optional[str]=None, filename: Optional[str]=None, content_transfer_encoding: Optional[str]=None) -> None: if isinstance(value, io.IOBase): self._is_multipart = True elif isinstance(value, (bytes, bytearray, memoryview)): if filename is None and content_transfer_encoding is None: filename = name type_options = MultiDict({'name': name}) if filename is not None and not isinstance(filename, str): raise TypeError('filename must be an instance of str. ' 'Got: %s' % filename) if filename is None and isinstance(value, io.IOBase): filename = guess_filename(value, name) if filename is not None: type_options['filename'] = filename self._is_multipart = True headers = {} if content_type is not None: if not isinstance(content_type, str): raise TypeError('content_type must be an instance of str. ' 'Got: %s' % content_type) headers[hdrs.CONTENT_TYPE] = content_type self._is_multipart = True if content_transfer_encoding is not None: if not isinstance(content_transfer_encoding, str): raise TypeError('content_transfer_encoding must be an instance' ' of str. 
Got: %s' % content_transfer_encoding) headers[hdrs.CONTENT_TRANSFER_ENCODING] = content_transfer_encoding self._is_multipart = True self._fields.append((type_options, headers, value)) def add_fields(self, *fields: Any) -> None: to_add = list(fields) while to_add: rec = to_add.pop(0) if isinstance(rec, io.IOBase): k = guess_filename(rec, 'unknown') self.add_field(k, rec) # type: ignore elif isinstance(rec, (MultiDictProxy, MultiDict)): to_add.extend(rec.items()) elif isinstance(rec, (list, tuple)) and len(rec) == 2: k, fp = rec self.add_field(k, fp) # type: ignore else: raise TypeError('Only io.IOBase, multidict and (name, file) ' 'pairs allowed, use .add_field() for passing ' 'more complex parameters, got {!r}' .format(rec)) def _gen_form_urlencoded(self) -> payload.BytesPayload: # form data (x-www-form-urlencoded) data = [] for type_options, _, value in self._fields: data.append((type_options['name'], value)) charset = self._charset if self._charset is not None else 'utf-8' if charset == 'utf-8': content_type = 'application/x-www-form-urlencoded' else: content_type = ('application/x-www-form-urlencoded; ' 'charset=%s' % charset) return payload.BytesPayload( urlencode(data, doseq=True, encoding=charset).encode(), content_type=content_type) def _gen_form_data(self) -> multipart.MultipartWriter: """Encode a list of fields using the multipart/form-data MIME format""" for dispparams, headers, value in self._fields: try: if hdrs.CONTENT_TYPE in headers: part = payload.get_payload( value, content_type=headers[hdrs.CONTENT_TYPE], headers=headers, encoding=self._charset) else: part = payload.get_payload( value, headers=headers, encoding=self._charset) except Exception as exc: raise TypeError( 'Can not serialize value type: %r\n ' 'headers: %r\n value: %r' % ( type(value), headers, value)) from exc if dispparams: part.set_content_disposition( 'form-data', quote_fields=self._quote_fields, **dispparams ) # FIXME cgi.FieldStorage doesn't likes body parts with # Content-Length which were sent via chunked transfer encoding assert part.headers is not None part.headers.popall(hdrs.CONTENT_LENGTH, None) self._writer.append_payload(part) return self._writer def __call__(self) -> Payload: if self._is_multipart: return self._gen_form_data() else: return self._gen_form_urlencoded() aiohttp-3.6.2/aiohttp/frozenlist.py0000644000175100001650000000336513547410117017715 0ustar vstsdocker00000000000000from collections.abc import MutableSequence from functools import total_ordering from .helpers import NO_EXTENSIONS @total_ordering class FrozenList(MutableSequence): __slots__ = ('_frozen', '_items') def __init__(self, items=None): self._frozen = False if items is not None: items = list(items) else: items = [] self._items = items @property def frozen(self): return self._frozen def freeze(self): self._frozen = True def __getitem__(self, index): return self._items[index] def __setitem__(self, index, value): if self._frozen: raise RuntimeError("Cannot modify frozen list.") self._items[index] = value def __delitem__(self, index): if self._frozen: raise RuntimeError("Cannot modify frozen list.") del self._items[index] def __len__(self): return self._items.__len__() def __iter__(self): return self._items.__iter__() def __reversed__(self): return self._items.__reversed__() def __eq__(self, other): return list(self) == other def __le__(self, other): return list(self) <= other def insert(self, pos, item): if self._frozen: raise RuntimeError("Cannot modify frozen list.") self._items.insert(pos, item) def __repr__(self): return 
'<FrozenList(frozen={}, {!r})>'.format(self._frozen, self._items) PyFrozenList = FrozenList try: from aiohttp._frozenlist import FrozenList as CFrozenList # type: ignore if not NO_EXTENSIONS: FrozenList = CFrozenList # type: ignore except ImportError: # pragma: no cover pass aiohttp-3.6.2/aiohttp/frozenlist.pyi0000644000175100001650000000265113547410117020063 0ustar vstsdocker00000000000000from typing import ( Generic, Iterable, Iterator, List, MutableSequence, Optional, TypeVar, Union, overload, ) _T = TypeVar('_T') _Arg = Union[List[_T], Iterable[_T]] class FrozenList(MutableSequence[_T], Generic[_T]): def __init__(self, items: Optional[_Arg[_T]]=...) -> None: ... @property def frozen(self) -> bool: ... def freeze(self) -> None: ... @overload def __getitem__(self, i: int) -> _T: ... @overload def __getitem__(self, s: slice) -> FrozenList[_T]: ... @overload def __setitem__(self, i: int, o: _T) -> None: ... @overload def __setitem__(self, s: slice, o: Iterable[_T]) -> None: ... @overload def __delitem__(self, i: int) -> None: ... @overload def __delitem__(self, i: slice) -> None: ... def __len__(self) -> int: ... def __iter__(self) -> Iterator[_T]: ... def __reversed__(self) -> Iterator[_T]: ... def __eq__(self, other: object) -> bool: ... def __le__(self, other: FrozenList[_T]) -> bool: ... def __ne__(self, other: object) -> bool: ... def __lt__(self, other: FrozenList[_T]) -> bool: ... def __ge__(self, other: FrozenList[_T]) -> bool: ... def __gt__(self, other: FrozenList[_T]) -> bool: ... def insert(self, pos: int, item: _T) -> None: ... def __repr__(self) -> str: ... # types for C accelerators are the same CFrozenList = PyFrozenList = FrozenList aiohttp-3.6.2/aiohttp/hdrs.py0000644000175100001650000000657113547410117016460 0ustar vstsdocker00000000000000"""HTTP Headers constants.""" # After changing the file content call ./tools/gen.py # to regenerate the headers parser from multidict import istr METH_ANY = '*' METH_CONNECT = 'CONNECT' METH_HEAD = 'HEAD' METH_GET = 'GET' METH_DELETE = 'DELETE' METH_OPTIONS = 'OPTIONS' METH_PATCH = 'PATCH' METH_POST = 'POST' METH_PUT = 'PUT' METH_TRACE = 'TRACE' METH_ALL = {METH_CONNECT, METH_HEAD, METH_GET, METH_DELETE, METH_OPTIONS, METH_PATCH, METH_POST, METH_PUT, METH_TRACE} ACCEPT = istr('Accept') ACCEPT_CHARSET = istr('Accept-Charset') ACCEPT_ENCODING = istr('Accept-Encoding') ACCEPT_LANGUAGE = istr('Accept-Language') ACCEPT_RANGES = istr('Accept-Ranges') ACCESS_CONTROL_MAX_AGE = istr('Access-Control-Max-Age') ACCESS_CONTROL_ALLOW_CREDENTIALS = istr('Access-Control-Allow-Credentials') ACCESS_CONTROL_ALLOW_HEADERS = istr('Access-Control-Allow-Headers') ACCESS_CONTROL_ALLOW_METHODS = istr('Access-Control-Allow-Methods') ACCESS_CONTROL_ALLOW_ORIGIN = istr('Access-Control-Allow-Origin') ACCESS_CONTROL_EXPOSE_HEADERS = istr('Access-Control-Expose-Headers') ACCESS_CONTROL_REQUEST_HEADERS = istr('Access-Control-Request-Headers') ACCESS_CONTROL_REQUEST_METHOD = istr('Access-Control-Request-Method') AGE = istr('Age') ALLOW = istr('Allow') AUTHORIZATION = istr('Authorization') CACHE_CONTROL = istr('Cache-Control') CONNECTION = istr('Connection') CONTENT_DISPOSITION = istr('Content-Disposition') CONTENT_ENCODING = istr('Content-Encoding') CONTENT_LANGUAGE = istr('Content-Language') CONTENT_LENGTH = istr('Content-Length') CONTENT_LOCATION = istr('Content-Location') CONTENT_MD5 = istr('Content-MD5') CONTENT_RANGE = istr('Content-Range') CONTENT_TRANSFER_ENCODING = istr('Content-Transfer-Encoding') CONTENT_TYPE = istr('Content-Type') COOKIE = istr('Cookie') DATE = istr('Date') DESTINATION
= istr('Destination') DIGEST = istr('Digest') ETAG = istr('Etag') EXPECT = istr('Expect') EXPIRES = istr('Expires') FORWARDED = istr('Forwarded') FROM = istr('From') HOST = istr('Host') IF_MATCH = istr('If-Match') IF_MODIFIED_SINCE = istr('If-Modified-Since') IF_NONE_MATCH = istr('If-None-Match') IF_RANGE = istr('If-Range') IF_UNMODIFIED_SINCE = istr('If-Unmodified-Since') KEEP_ALIVE = istr('Keep-Alive') LAST_EVENT_ID = istr('Last-Event-ID') LAST_MODIFIED = istr('Last-Modified') LINK = istr('Link') LOCATION = istr('Location') MAX_FORWARDS = istr('Max-Forwards') ORIGIN = istr('Origin') PRAGMA = istr('Pragma') PROXY_AUTHENTICATE = istr('Proxy-Authenticate') PROXY_AUTHORIZATION = istr('Proxy-Authorization') RANGE = istr('Range') REFERER = istr('Referer') RETRY_AFTER = istr('Retry-After') SEC_WEBSOCKET_ACCEPT = istr('Sec-WebSocket-Accept') SEC_WEBSOCKET_VERSION = istr('Sec-WebSocket-Version') SEC_WEBSOCKET_PROTOCOL = istr('Sec-WebSocket-Protocol') SEC_WEBSOCKET_EXTENSIONS = istr('Sec-WebSocket-Extensions') SEC_WEBSOCKET_KEY = istr('Sec-WebSocket-Key') SEC_WEBSOCKET_KEY1 = istr('Sec-WebSocket-Key1') SERVER = istr('Server') SET_COOKIE = istr('Set-Cookie') TE = istr('TE') TRAILER = istr('Trailer') TRANSFER_ENCODING = istr('Transfer-Encoding') UPGRADE = istr('Upgrade') WEBSOCKET = istr('WebSocket') URI = istr('URI') USER_AGENT = istr('User-Agent') VARY = istr('Vary') VIA = istr('Via') WANT_DIGEST = istr('Want-Digest') WARNING = istr('Warning') WWW_AUTHENTICATE = istr('WWW-Authenticate') X_FORWARDED_FOR = istr('X-Forwarded-For') X_FORWARDED_HOST = istr('X-Forwarded-Host') X_FORWARDED_PROTO = istr('X-Forwarded-Proto') aiohttp-3.6.2/aiohttp/helpers.py0000644000175100001650000005456213547410117017165 0ustar vstsdocker00000000000000"""Various helper functions""" import asyncio import base64 import binascii import cgi import datetime import functools import inspect import netrc import os import platform import re import sys import time import warnings import weakref from collections import namedtuple from contextlib import suppress from math import ceil from pathlib import Path from types import TracebackType from typing import ( # noqa Any, Callable, Dict, Iterable, Iterator, List, Mapping, Optional, Pattern, Set, Tuple, Type, TypeVar, Union, cast, ) from urllib.parse import quote from urllib.request import getproxies import async_timeout import attr from multidict import MultiDict, MultiDictProxy from yarl import URL from . import hdrs from .log import client_logger, internal_logger from .typedefs import PathLike # noqa __all__ = ('BasicAuth', 'ChainMapProxy') PY_36 = sys.version_info >= (3, 6) PY_37 = sys.version_info >= (3, 7) PY_38 = sys.version_info >= (3, 8) if not PY_37: import idna_ssl idna_ssl.patch_match_hostname() try: from typing import ContextManager except ImportError: from typing_extensions import ContextManager def all_tasks( loop: Optional[asyncio.AbstractEventLoop] = None ) -> Set['asyncio.Task[Any]']: tasks = list(asyncio.Task.all_tasks(loop)) return {t for t in tasks if not t.done()} if PY_37: all_tasks = getattr(asyncio, 'all_tasks') # noqa _T = TypeVar('_T') sentinel = object() # type: Any NO_EXTENSIONS = bool(os.environ.get('AIOHTTP_NO_EXTENSIONS')) # type: bool # N.B. 
sys.flags.dev_mode is available on Python 3.7+, use getattr # for compatibility with older versions DEBUG = (getattr(sys.flags, 'dev_mode', False) or (not sys.flags.ignore_environment and bool(os.environ.get('PYTHONASYNCIODEBUG')))) # type: bool CHAR = set(chr(i) for i in range(0, 128)) CTL = set(chr(i) for i in range(0, 32)) | {chr(127), } SEPARATORS = {'(', ')', '<', '>', '@', ',', ';', ':', '\\', '"', '/', '[', ']', '?', '=', '{', '}', ' ', chr(9)} TOKEN = CHAR ^ CTL ^ SEPARATORS coroutines = asyncio.coroutines old_debug = coroutines._DEBUG # type: ignore # prevent "coroutine noop was never awaited" warning. coroutines._DEBUG = False # type: ignore @asyncio.coroutine def noop(*args, **kwargs): # type: ignore return # type: ignore async def noop2(*args: Any, **kwargs: Any) -> None: return coroutines._DEBUG = old_debug # type: ignore class BasicAuth(namedtuple('BasicAuth', ['login', 'password', 'encoding'])): """Http basic authentication helper.""" def __new__(cls, login: str, password: str='', encoding: str='latin1') -> 'BasicAuth': if login is None: raise ValueError('None is not allowed as login value') if password is None: raise ValueError('None is not allowed as password value') if ':' in login: raise ValueError( 'A ":" is not allowed in login (RFC 1945#section-11.1)') return super().__new__(cls, login, password, encoding) @classmethod def decode(cls, auth_header: str, encoding: str='latin1') -> 'BasicAuth': """Create a BasicAuth object from an Authorization HTTP header.""" try: auth_type, encoded_credentials = auth_header.split(' ', 1) except ValueError: raise ValueError('Could not parse authorization header.') if auth_type.lower() != 'basic': raise ValueError('Unknown authorization method %s' % auth_type) try: decoded = base64.b64decode( encoded_credentials.encode('ascii'), validate=True ).decode(encoding) except binascii.Error: raise ValueError('Invalid base64 encoding.') try: # RFC 2617 HTTP Authentication # https://www.ietf.org/rfc/rfc2617.txt # the colon must be present, but the username and password may be # otherwise blank. username, password = decoded.split(':', 1) except ValueError: raise ValueError('Invalid credentials.') return cls(username, password, encoding=encoding) @classmethod def from_url(cls, url: URL, *, encoding: str='latin1') -> Optional['BasicAuth']: """Create BasicAuth from url.""" if not isinstance(url, URL): raise TypeError("url should be yarl.URL instance") if url.user is None: return None return cls(url.user, url.password or '', encoding=encoding) def encode(self) -> str: """Encode credentials.""" creds = ('%s:%s' % (self.login, self.password)).encode(self.encoding) return 'Basic %s' % base64.b64encode(creds).decode(self.encoding) def strip_auth_from_url(url: URL) -> Tuple[URL, Optional[BasicAuth]]: auth = BasicAuth.from_url(url) if auth is None: return url, None else: return url.with_user(None), auth def netrc_from_env() -> Optional[netrc.netrc]: """Attempt to load the netrc file from the path specified by the env-var NETRC or in the default location in the user's home directory. Returns None if it couldn't be found or fails to parse. 
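# ---------------------------------------------------------------------------
# [Editor's note: illustrative sketch, not part of the original archive.]
# Round trip for the BasicAuth helper defined earlier in helpers.py:
# encode() builds the Authorization header value and decode() parses it back.
# The credentials are arbitrary example values.
from aiohttp.helpers import BasicAuth

auth = BasicAuth('user', 'secret')
header = auth.encode()                   # 'Basic dXNlcjpzZWNyZXQ='
assert BasicAuth.decode(header) == auth
# ---------------------------------------------------------------------------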
""" netrc_env = os.environ.get('NETRC') if netrc_env is not None: netrc_path = Path(netrc_env) else: try: home_dir = Path.home() except RuntimeError as e: # pragma: no cover # if pathlib can't resolve home, it may raise a RuntimeError client_logger.debug('Could not resolve home directory when ' 'trying to look for .netrc file: %s', e) return None netrc_path = home_dir / ( '_netrc' if platform.system() == 'Windows' else '.netrc') try: return netrc.netrc(str(netrc_path)) except netrc.NetrcParseError as e: client_logger.warning('Could not parse .netrc file: %s', e) except OSError as e: # we couldn't read the file (doesn't exist, permissions, etc.) if netrc_env or netrc_path.is_file(): # only warn if the environment wanted us to load it, # or it appears like the default file does actually exist client_logger.warning('Could not read .netrc file: %s', e) return None @attr.s(frozen=True, slots=True) class ProxyInfo: proxy = attr.ib(type=URL) proxy_auth = attr.ib(type=Optional[BasicAuth]) def proxies_from_env() -> Dict[str, ProxyInfo]: proxy_urls = {k: URL(v) for k, v in getproxies().items() if k in ('http', 'https')} netrc_obj = netrc_from_env() stripped = {k: strip_auth_from_url(v) for k, v in proxy_urls.items()} ret = {} for proto, val in stripped.items(): proxy, auth = val if proxy.scheme == 'https': client_logger.warning( "HTTPS proxies %s are not supported, ignoring", proxy) continue if netrc_obj and auth is None: auth_from_netrc = None if proxy.host is not None: auth_from_netrc = netrc_obj.authenticators(proxy.host) if auth_from_netrc is not None: # auth_from_netrc is a (`user`, `account`, `password`) tuple, # `user` and `account` both can be username, # if `user` is None, use `account` *logins, password = auth_from_netrc login = logins[0] if logins[0] else logins[-1] auth = BasicAuth(cast(str, login), cast(str, password)) ret[proto] = ProxyInfo(proxy, auth) return ret def current_task(loop: Optional[asyncio.AbstractEventLoop]=None) -> asyncio.Task: # type: ignore # noqa # Return type is intentionally Generic here if PY_37: return asyncio.current_task(loop=loop) # type: ignore else: return asyncio.Task.current_task(loop=loop) def get_running_loop( loop: Optional[asyncio.AbstractEventLoop]=None ) -> asyncio.AbstractEventLoop: if loop is None: loop = asyncio.get_event_loop() if not loop.is_running(): warnings.warn("The object should be created from async function", DeprecationWarning, stacklevel=3) if loop.get_debug(): internal_logger.warning( "The object should be created from async function", stack_info=True) return loop def isasyncgenfunction(obj: Any) -> bool: func = getattr(inspect, 'isasyncgenfunction', None) if func is not None: return func(obj) else: return False @attr.s(frozen=True, slots=True) class MimeType: type = attr.ib(type=str) subtype = attr.ib(type=str) suffix = attr.ib(type=str) parameters = attr.ib(type=MultiDictProxy) # type: MultiDictProxy[str] @functools.lru_cache(maxsize=56) def parse_mimetype(mimetype: str) -> MimeType: """Parses a MIME type into its components. mimetype is a MIME type string. Returns a MimeType object. 
Example: >>> parse_mimetype('text/html; charset=utf-8') MimeType(type='text', subtype='html', suffix='', parameters={'charset': 'utf-8'}) """ if not mimetype: return MimeType(type='', subtype='', suffix='', parameters=MultiDictProxy(MultiDict())) parts = mimetype.split(';') params = MultiDict() # type: MultiDict[str] for item in parts[1:]: if not item: continue key, value = cast(Tuple[str, str], item.split('=', 1) if '=' in item else (item, '')) params.add(key.lower().strip(), value.strip(' "')) fulltype = parts[0].strip().lower() if fulltype == '*': fulltype = '*/*' mtype, stype = (cast(Tuple[str, str], fulltype.split('/', 1)) if '/' in fulltype else (fulltype, '')) stype, suffix = (cast(Tuple[str, str], stype.split('+', 1)) if '+' in stype else (stype, '')) return MimeType(type=mtype, subtype=stype, suffix=suffix, parameters=MultiDictProxy(params)) def guess_filename(obj: Any, default: Optional[str]=None) -> Optional[str]: name = getattr(obj, 'name', None) if name and isinstance(name, str) and name[0] != '<' and name[-1] != '>': return Path(name).name return default def content_disposition_header(disptype: str, quote_fields: bool=True, **params: str) -> str: """Sets ``Content-Disposition`` header. disptype is a disposition type: inline, attachment, form-data. Should be valid extension token (see RFC 2183) params is a dict with disposition params. """ if not disptype or not (TOKEN > set(disptype)): raise ValueError('bad content disposition type {!r}' ''.format(disptype)) value = disptype if params: lparams = [] for key, val in params.items(): if not key or not (TOKEN > set(key)): raise ValueError('bad content disposition parameter' ' {!r}={!r}'.format(key, val)) qval = quote(val, '') if quote_fields else val lparams.append((key, '"%s"' % qval)) if key == 'filename': lparams.append(('filename*', "utf-8''" + qval)) sparams = '; '.join('='.join(pair) for pair in lparams) value = '; '.join((value, sparams)) return value class reify: """Use as a class method decorator. It operates almost exactly like the Python `@property` decorator, but it puts the result of the method it decorates into the instance dict after the first call, effectively replacing the function it decorates with an instance variable. It is, in Python parlance, a data descriptor. 
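# ---------------------------------------------------------------------------
# [Editor's note: illustrative sketch, not part of the original docstring or
# archive.]  Usage of the reify descriptor described above.  This
# implementation caches the computed value in the instance's ``_cache`` dict,
# so the host class must provide one (as aiohttp's request objects do);
# ``Demo`` below is a hypothetical class used only for illustration.
from aiohttp.helpers import reify

class Demo:
    def __init__(self) -> None:
        self._cache = {}    # required by reify.__get__
        self.calls = 0

    @reify
    def answer(self) -> int:
        self.calls += 1
        return 42

d = Demo()
assert d.answer == 42 and d.answer == 42
assert d.calls == 1         # wrapped method ran once; later reads hit the cache
# ---------------------------------------------------------------------------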
""" def __init__(self, wrapped: Callable[..., Any]) -> None: self.wrapped = wrapped self.__doc__ = wrapped.__doc__ self.name = wrapped.__name__ def __get__(self, inst: Any, owner: Any) -> Any: try: try: return inst._cache[self.name] except KeyError: val = self.wrapped(inst) inst._cache[self.name] = val return val except AttributeError: if inst is None: return self raise def __set__(self, inst: Any, value: Any) -> None: raise AttributeError("reified property is read-only") reify_py = reify try: from ._helpers import reify as reify_c if not NO_EXTENSIONS: reify = reify_c # type: ignore except ImportError: pass _ipv4_pattern = (r'^(?:(?:25[0-5]|2[0-4][0-9]|[01]?[0-9][0-9]?)\.){3}' r'(?:25[0-5]|2[0-4][0-9]|[01]?[0-9][0-9]?)$') _ipv6_pattern = ( r'^(?:(?:(?:[A-F0-9]{1,4}:){6}|(?=(?:[A-F0-9]{0,4}:){0,6}' r'(?:[0-9]{1,3}\.){3}[0-9]{1,3}$)(([0-9A-F]{1,4}:){0,5}|:)' r'((:[0-9A-F]{1,4}){1,5}:|:)|::(?:[A-F0-9]{1,4}:){5})' r'(?:(?:25[0-5]|2[0-4][0-9]|1[0-9][0-9]|[1-9]?[0-9])\.){3}' r'(?:25[0-5]|2[0-4][0-9]|1[0-9][0-9]|[1-9]?[0-9])|(?:[A-F0-9]{1,4}:){7}' r'[A-F0-9]{1,4}|(?=(?:[A-F0-9]{0,4}:){0,7}[A-F0-9]{0,4}$)' r'(([0-9A-F]{1,4}:){1,7}|:)((:[0-9A-F]{1,4}){1,7}|:)|(?:[A-F0-9]{1,4}:){7}' r':|:(:[A-F0-9]{1,4}){7})$') _ipv4_regex = re.compile(_ipv4_pattern) _ipv6_regex = re.compile(_ipv6_pattern, flags=re.IGNORECASE) _ipv4_regexb = re.compile(_ipv4_pattern.encode('ascii')) _ipv6_regexb = re.compile(_ipv6_pattern.encode('ascii'), flags=re.IGNORECASE) def _is_ip_address( regex: Pattern[str], regexb: Pattern[bytes], host: Optional[Union[str, bytes]]) -> bool: if host is None: return False if isinstance(host, str): return bool(regex.match(host)) elif isinstance(host, (bytes, bytearray, memoryview)): return bool(regexb.match(host)) else: raise TypeError("{} [{}] is not a str or bytes" .format(host, type(host))) is_ipv4_address = functools.partial(_is_ip_address, _ipv4_regex, _ipv4_regexb) is_ipv6_address = functools.partial(_is_ip_address, _ipv6_regex, _ipv6_regexb) def is_ip_address( host: Optional[Union[str, bytes, bytearray, memoryview]]) -> bool: return is_ipv4_address(host) or is_ipv6_address(host) def next_whole_second() -> datetime.datetime: """Return current time rounded up to the next whole second.""" return ( datetime.datetime.now( datetime.timezone.utc).replace(microsecond=0) + datetime.timedelta(seconds=0) ) _cached_current_datetime = None # type: Optional[int] _cached_formatted_datetime = "" def rfc822_formatted_time() -> str: global _cached_current_datetime global _cached_formatted_datetime now = int(time.time()) if now != _cached_current_datetime: # Weekday and month names for HTTP date/time formatting; # always English! # Tuples are constants stored in codeobject! 
_weekdayname = ("Mon", "Tue", "Wed", "Thu", "Fri", "Sat", "Sun") _monthname = ("", # Dummy so we can use 1-based month numbers "Jan", "Feb", "Mar", "Apr", "May", "Jun", "Jul", "Aug", "Sep", "Oct", "Nov", "Dec") year, month, day, hh, mm, ss, wd, *tail = time.gmtime(now) _cached_formatted_datetime = "%s, %02d %3s %4d %02d:%02d:%02d GMT" % ( _weekdayname[wd], day, _monthname[month], year, hh, mm, ss ) _cached_current_datetime = now return _cached_formatted_datetime def _weakref_handle(info): # type: ignore ref, name = info ob = ref() if ob is not None: with suppress(Exception): getattr(ob, name)() def weakref_handle(ob, name, timeout, loop, ceil_timeout=True): # type: ignore if timeout is not None and timeout > 0: when = loop.time() + timeout if ceil_timeout: when = ceil(when) return loop.call_at(when, _weakref_handle, (weakref.ref(ob), name)) def call_later(cb, timeout, loop): # type: ignore if timeout is not None and timeout > 0: when = ceil(loop.time() + timeout) return loop.call_at(when, cb) class TimeoutHandle: """ Timeout handle """ def __init__(self, loop: asyncio.AbstractEventLoop, timeout: Optional[float]) -> None: self._timeout = timeout self._loop = loop self._callbacks = [] # type: List[Tuple[Callable[..., None], Tuple[Any, ...], Dict[str, Any]]] # noqa def register(self, callback: Callable[..., None], *args: Any, **kwargs: Any) -> None: self._callbacks.append((callback, args, kwargs)) def close(self) -> None: self._callbacks.clear() def start(self) -> Optional[asyncio.Handle]: if self._timeout is not None and self._timeout > 0: at = ceil(self._loop.time() + self._timeout) return self._loop.call_at(at, self.__call__) else: return None def timer(self) -> 'BaseTimerContext': if self._timeout is not None and self._timeout > 0: timer = TimerContext(self._loop) self.register(timer.timeout) return timer else: return TimerNoop() def __call__(self) -> None: for cb, args, kwargs in self._callbacks: with suppress(Exception): cb(*args, **kwargs) self._callbacks.clear() class BaseTimerContext(ContextManager['BaseTimerContext']): pass class TimerNoop(BaseTimerContext): def __enter__(self) -> BaseTimerContext: return self def __exit__(self, exc_type: Optional[Type[BaseException]], exc_val: Optional[BaseException], exc_tb: Optional[TracebackType]) -> Optional[bool]: return False class TimerContext(BaseTimerContext): """ Low resolution timeout context manager """ def __init__(self, loop: asyncio.AbstractEventLoop) -> None: self._loop = loop self._tasks = [] # type: List[asyncio.Task[Any]] self._cancelled = False def __enter__(self) -> BaseTimerContext: task = current_task(loop=self._loop) if task is None: raise RuntimeError('Timeout context manager should be used ' 'inside a task') if self._cancelled: task.cancel() raise asyncio.TimeoutError from None self._tasks.append(task) return self def __exit__(self, exc_type: Optional[Type[BaseException]], exc_val: Optional[BaseException], exc_tb: Optional[TracebackType]) -> Optional[bool]: if self._tasks: self._tasks.pop() if exc_type is asyncio.CancelledError and self._cancelled: raise asyncio.TimeoutError from None return None def timeout(self) -> None: if not self._cancelled: for task in set(self._tasks): task.cancel() self._cancelled = True class CeilTimeout(async_timeout.timeout): def __enter__(self) -> async_timeout.timeout: if self._timeout is not None: self._task = current_task(loop=self._loop) if self._task is None: raise RuntimeError( 'Timeout context manager should be used inside a task') self._cancel_handler = self._loop.call_at( 
ceil(self._loop.time() + self._timeout), self._cancel_task) return self class HeadersMixin: ATTRS = frozenset([ '_content_type', '_content_dict', '_stored_content_type']) _content_type = None # type: Optional[str] _content_dict = None # type: Optional[Dict[str, str]] _stored_content_type = sentinel def _parse_content_type(self, raw: str) -> None: self._stored_content_type = raw if raw is None: # default value according to RFC 2616 self._content_type = 'application/octet-stream' self._content_dict = {} else: self._content_type, self._content_dict = cgi.parse_header(raw) @property def content_type(self) -> str: """The value of content part for Content-Type HTTP header.""" raw = self._headers.get(hdrs.CONTENT_TYPE) # type: ignore if self._stored_content_type != raw: self._parse_content_type(raw) return self._content_type # type: ignore @property def charset(self) -> Optional[str]: """The value of charset part for Content-Type HTTP header.""" raw = self._headers.get(hdrs.CONTENT_TYPE) # type: ignore if self._stored_content_type != raw: self._parse_content_type(raw) return self._content_dict.get('charset') # type: ignore @property def content_length(self) -> Optional[int]: """The value of Content-Length HTTP header.""" content_length = self._headers.get(hdrs.CONTENT_LENGTH) # type: ignore if content_length is not None: return int(content_length) else: return None def set_result(fut: 'asyncio.Future[_T]', result: _T) -> None: if not fut.done(): fut.set_result(result) def set_exception(fut: 'asyncio.Future[_T]', exc: BaseException) -> None: if not fut.done(): fut.set_exception(exc) class ChainMapProxy(Mapping[str, Any]): __slots__ = ('_maps',) def __init__(self, maps: Iterable[Mapping[str, Any]]) -> None: self._maps = tuple(maps) def __init_subclass__(cls) -> None: raise TypeError("Inheritance class {} from ChainMapProxy " "is forbidden".format(cls.__name__)) def __getitem__(self, key: str) -> Any: for mapping in self._maps: try: return mapping[key] except KeyError: pass raise KeyError(key) def get(self, key: str, default: Any=None) -> Any: return self[key] if key in self else default def __len__(self) -> int: # reuses stored hash values if possible return len(set().union(*self._maps)) # type: ignore def __iter__(self) -> Iterator[str]: d = {} # type: Dict[str, Any] for mapping in reversed(self._maps): # reuses stored hash values if possible d.update(mapping) return iter(d) def __contains__(self, key: object) -> bool: return any(key in m for m in self._maps) def __bool__(self) -> bool: return any(self._maps) def __repr__(self) -> str: content = ", ".join(map(repr, self._maps)) return 'ChainMapProxy({})'.format(content) aiohttp-3.6.2/aiohttp/http.py0000644000175100001650000000412513547410117016470 0ustar vstsdocker00000000000000import http.server import sys from typing import Mapping, Tuple # noqa from . 
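# ---------------------------------------------------------------------------
# [Editor's note: illustrative sketch, not part of the original archive.]
# ChainMapProxy, defined at the end of helpers.py above, is a read-only view
# over several mappings in which the earliest mapping wins on key clashes.
# The dictionaries below are arbitrary example data.
from aiohttp.helpers import ChainMapProxy

cm = ChainMapProxy([{'a': 1}, {'a': 2, 'b': 3}])
assert cm['a'] == 1          # the first mapping shadows later ones
assert cm['b'] == 3
assert 'c' not in cm and cm.get('c', 'default') == 'default'
# ---------------------------------------------------------------------------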
import __version__ from .http_exceptions import HttpProcessingError as HttpProcessingError from .http_parser import HeadersParser as HeadersParser from .http_parser import HttpParser as HttpParser from .http_parser import HttpRequestParser as HttpRequestParser from .http_parser import HttpResponseParser as HttpResponseParser from .http_parser import RawRequestMessage as RawRequestMessage from .http_parser import RawResponseMessage as RawResponseMessage from .http_websocket import WS_CLOSED_MESSAGE as WS_CLOSED_MESSAGE from .http_websocket import WS_CLOSING_MESSAGE as WS_CLOSING_MESSAGE from .http_websocket import WS_KEY as WS_KEY from .http_websocket import WebSocketError as WebSocketError from .http_websocket import WebSocketReader as WebSocketReader from .http_websocket import WebSocketWriter as WebSocketWriter from .http_websocket import WSCloseCode as WSCloseCode from .http_websocket import WSMessage as WSMessage from .http_websocket import WSMsgType as WSMsgType from .http_websocket import ws_ext_gen as ws_ext_gen from .http_websocket import ws_ext_parse as ws_ext_parse from .http_writer import HttpVersion as HttpVersion from .http_writer import HttpVersion10 as HttpVersion10 from .http_writer import HttpVersion11 as HttpVersion11 from .http_writer import StreamWriter as StreamWriter __all__ = ( 'HttpProcessingError', 'RESPONSES', 'SERVER_SOFTWARE', # .http_writer 'StreamWriter', 'HttpVersion', 'HttpVersion10', 'HttpVersion11', # .http_parser 'HeadersParser', 'HttpParser', 'HttpRequestParser', 'HttpResponseParser', 'RawRequestMessage', 'RawResponseMessage', # .http_websocket 'WS_CLOSED_MESSAGE', 'WS_CLOSING_MESSAGE', 'WS_KEY', 'WebSocketReader', 'WebSocketWriter', 'ws_ext_gen', 'ws_ext_parse', 'WSMessage', 'WebSocketError', 'WSMsgType', 'WSCloseCode', ) SERVER_SOFTWARE = 'Python/{0[0]}.{0[1]} aiohttp/{1}'.format( sys.version_info, __version__) # type: str RESPONSES = http.server.BaseHTTPRequestHandler.responses # type: Mapping[int, Tuple[str, str]] # noqa aiohttp-3.6.2/aiohttp/http_exceptions.py0000644000175100001650000000514713547410117020736 0ustar vstsdocker00000000000000"""Low-level http related exceptions.""" from typing import Optional, Union from .typedefs import _CIMultiDict __all__ = ('HttpProcessingError',) class HttpProcessingError(Exception): """HTTP error. Shortcut for raising HTTP errors with custom code, message and headers. code: HTTP Error code. message: (optional) Error message. 
headers: (optional) Headers to be sent in response, a list of pairs """ code = 0 message = '' headers = None def __init__(self, *, code: Optional[int]=None, message: str='', headers: Optional[_CIMultiDict]=None) -> None: if code is not None: self.code = code self.headers = headers self.message = message def __str__(self) -> str: return "%s, message=%r" % (self.code, self.message) def __repr__(self) -> str: return "<%s: %s>" % (self.__class__.__name__, self) class BadHttpMessage(HttpProcessingError): code = 400 message = 'Bad Request' def __init__(self, message: str, *, headers: Optional[_CIMultiDict]=None) -> None: super().__init__(message=message, headers=headers) self.args = (message,) class HttpBadRequest(BadHttpMessage): code = 400 message = 'Bad Request' class PayloadEncodingError(BadHttpMessage): """Base class for payload errors""" class ContentEncodingError(PayloadEncodingError): """Content encoding error.""" class TransferEncodingError(PayloadEncodingError): """transfer encoding error.""" class ContentLengthError(PayloadEncodingError): """Not enough data for satisfy content length header.""" class LineTooLong(BadHttpMessage): def __init__(self, line: str, limit: str='Unknown', actual_size: str='Unknown') -> None: super().__init__( "Got more than %s bytes (%s) when reading %s." % ( limit, actual_size, line)) self.args = (line, limit, actual_size) class InvalidHeader(BadHttpMessage): def __init__(self, hdr: Union[bytes, str]) -> None: if isinstance(hdr, bytes): hdr = hdr.decode('utf-8', 'surrogateescape') super().__init__('Invalid HTTP Header: {}'.format(hdr)) self.hdr = hdr self.args = (hdr,) class BadStatusLine(BadHttpMessage): def __init__(self, line: str='') -> None: if not isinstance(line, str): line = repr(line) self.args = (line,) self.line = line __str__ = Exception.__str__ __repr__ = Exception.__repr__ class InvalidURLError(BadHttpMessage): pass aiohttp-3.6.2/aiohttp/http_parser.py0000644000175100001650000006641013547410117020051 0ustar vstsdocker00000000000000import abc import asyncio import collections import re import string import zlib from enum import IntEnum from typing import Any, List, Optional, Tuple, Type, Union # noqa from multidict import CIMultiDict, CIMultiDictProxy, istr from yarl import URL from . import hdrs from .base_protocol import BaseProtocol from .helpers import NO_EXTENSIONS, BaseTimerContext from .http_exceptions import ( BadStatusLine, ContentEncodingError, ContentLengthError, InvalidHeader, LineTooLong, TransferEncodingError, ) from .http_writer import HttpVersion, HttpVersion10 from .log import internal_logger from .streams import EMPTY_PAYLOAD, StreamReader from .typedefs import RawHeaders try: import brotli HAS_BROTLI = True except ImportError: # pragma: no cover HAS_BROTLI = False __all__ = ( 'HeadersParser', 'HttpParser', 'HttpRequestParser', 'HttpResponseParser', 'RawRequestMessage', 'RawResponseMessage') ASCIISET = set(string.printable) # See https://tools.ietf.org/html/rfc7230#section-3.1.1 # and https://tools.ietf.org/html/rfc7230#appendix-B # # method = token # tchar = "!" / "#" / "$" / "%" / "&" / "'" / "*" / "+" / "-" / "." 
/ # "^" / "_" / "`" / "|" / "~" / DIGIT / ALPHA # token = 1*tchar METHRE = re.compile(r"[!#$%&'*+\-.^_`|~0-9A-Za-z]+") VERSRE = re.compile(r'HTTP/(\d+).(\d+)') HDRRE = re.compile(rb'[\x00-\x1F\x7F()<>@,;:\[\]={} \t\\\\\"]') RawRequestMessage = collections.namedtuple( 'RawRequestMessage', ['method', 'path', 'version', 'headers', 'raw_headers', 'should_close', 'compression', 'upgrade', 'chunked', 'url']) RawResponseMessage = collections.namedtuple( 'RawResponseMessage', ['version', 'code', 'reason', 'headers', 'raw_headers', 'should_close', 'compression', 'upgrade', 'chunked']) class ParseState(IntEnum): PARSE_NONE = 0 PARSE_LENGTH = 1 PARSE_CHUNKED = 2 PARSE_UNTIL_EOF = 3 class ChunkState(IntEnum): PARSE_CHUNKED_SIZE = 0 PARSE_CHUNKED_CHUNK = 1 PARSE_CHUNKED_CHUNK_EOF = 2 PARSE_MAYBE_TRAILERS = 3 PARSE_TRAILERS = 4 class HeadersParser: def __init__(self, max_line_size: int=8190, max_headers: int=32768, max_field_size: int=8190) -> None: self.max_line_size = max_line_size self.max_headers = max_headers self.max_field_size = max_field_size def parse_headers( self, lines: List[bytes] ) -> Tuple['CIMultiDictProxy[str]', RawHeaders]: headers = CIMultiDict() # type: CIMultiDict[str] raw_headers = [] lines_idx = 1 line = lines[1] line_count = len(lines) while line: # Parse initial header name : value pair. try: bname, bvalue = line.split(b':', 1) except ValueError: raise InvalidHeader(line) from None bname = bname.strip(b' \t') bvalue = bvalue.lstrip() if HDRRE.search(bname): raise InvalidHeader(bname) if len(bname) > self.max_field_size: raise LineTooLong( "request header name {}".format( bname.decode("utf8", "xmlcharrefreplace")), str(self.max_field_size), str(len(bname))) header_length = len(bvalue) # next line lines_idx += 1 line = lines[lines_idx] # consume continuation lines continuation = line and line[0] in (32, 9) # (' ', '\t') if continuation: bvalue_lst = [bvalue] while continuation: header_length += len(line) if header_length > self.max_field_size: raise LineTooLong( 'request header field {}'.format( bname.decode("utf8", "xmlcharrefreplace")), str(self.max_field_size), str(header_length)) bvalue_lst.append(line) # next line lines_idx += 1 if lines_idx < line_count: line = lines[lines_idx] if line: continuation = line[0] in (32, 9) # (' ', '\t') else: line = b'' break bvalue = b''.join(bvalue_lst) else: if header_length > self.max_field_size: raise LineTooLong( 'request header field {}'.format( bname.decode("utf8", "xmlcharrefreplace")), str(self.max_field_size), str(header_length)) bvalue = bvalue.strip() name = bname.decode('utf-8', 'surrogateescape') value = bvalue.decode('utf-8', 'surrogateescape') headers.add(name, value) raw_headers.append((bname, bvalue)) return (CIMultiDictProxy(headers), tuple(raw_headers)) class HttpParser(abc.ABC): def __init__(self, protocol: Optional[BaseProtocol]=None, loop: Optional[asyncio.AbstractEventLoop]=None, max_line_size: int=8190, max_headers: int=32768, max_field_size: int=8190, timer: Optional[BaseTimerContext]=None, code: Optional[int]=None, method: Optional[str]=None, readall: bool=False, payload_exception: Optional[Type[BaseException]]=None, response_with_body: bool=True, read_until_eof: bool=False, auto_decompress: bool=True) -> None: self.protocol = protocol self.loop = loop self.max_line_size = max_line_size self.max_headers = max_headers self.max_field_size = max_field_size self.timer = timer self.code = code self.method = method self.readall = readall self.payload_exception = payload_exception self.response_with_body = 
response_with_body self.read_until_eof = read_until_eof self._lines = [] # type: List[bytes] self._tail = b'' self._upgraded = False self._payload = None self._payload_parser = None # type: Optional[HttpPayloadParser] self._auto_decompress = auto_decompress self._headers_parser = HeadersParser(max_line_size, max_headers, max_field_size) @abc.abstractmethod def parse_message(self, lines: List[bytes]) -> Any: pass def feed_eof(self) -> Any: if self._payload_parser is not None: self._payload_parser.feed_eof() self._payload_parser = None else: # try to extract partial message if self._tail: self._lines.append(self._tail) if self._lines: if self._lines[-1] != '\r\n': self._lines.append(b'') try: return self.parse_message(self._lines) except Exception: return None def feed_data( self, data: bytes, SEP: bytes=b'\r\n', EMPTY: bytes=b'', CONTENT_LENGTH: istr=hdrs.CONTENT_LENGTH, METH_CONNECT: str=hdrs.METH_CONNECT, SEC_WEBSOCKET_KEY1: istr=hdrs.SEC_WEBSOCKET_KEY1 ) -> Tuple[List[Any], bool, bytes]: messages = [] if self._tail: data, self._tail = self._tail + data, b'' data_len = len(data) start_pos = 0 loop = self.loop while start_pos < data_len: # read HTTP message (request/response line + headers), \r\n\r\n # and split by lines if self._payload_parser is None and not self._upgraded: pos = data.find(SEP, start_pos) # consume \r\n if pos == start_pos and not self._lines: start_pos = pos + 2 continue if pos >= start_pos: # line found self._lines.append(data[start_pos:pos]) start_pos = pos + 2 # \r\n\r\n found if self._lines[-1] == EMPTY: try: msg = self.parse_message(self._lines) finally: self._lines.clear() # payload length length = msg.headers.get(CONTENT_LENGTH) if length is not None: try: length = int(length) except ValueError: raise InvalidHeader(CONTENT_LENGTH) if length < 0: raise InvalidHeader(CONTENT_LENGTH) # do not support old websocket spec if SEC_WEBSOCKET_KEY1 in msg.headers: raise InvalidHeader(SEC_WEBSOCKET_KEY1) self._upgraded = msg.upgrade method = getattr(msg, 'method', self.method) assert self.protocol is not None # calculate payload if ((length is not None and length > 0) or msg.chunked and not msg.upgrade): payload = StreamReader( self.protocol, timer=self.timer, loop=loop) payload_parser = HttpPayloadParser( payload, length=length, chunked=msg.chunked, method=method, compression=msg.compression, code=self.code, readall=self.readall, response_with_body=self.response_with_body, auto_decompress=self._auto_decompress) if not payload_parser.done: self._payload_parser = payload_parser elif method == METH_CONNECT: payload = StreamReader( self.protocol, timer=self.timer, loop=loop) self._upgraded = True self._payload_parser = HttpPayloadParser( payload, method=msg.method, compression=msg.compression, readall=True, auto_decompress=self._auto_decompress) else: if (getattr(msg, 'code', 100) >= 199 and length is None and self.read_until_eof): payload = StreamReader( self.protocol, timer=self.timer, loop=loop) payload_parser = HttpPayloadParser( payload, length=length, chunked=msg.chunked, method=method, compression=msg.compression, code=self.code, readall=True, response_with_body=self.response_with_body, auto_decompress=self._auto_decompress) if not payload_parser.done: self._payload_parser = payload_parser else: payload = EMPTY_PAYLOAD # type: ignore messages.append((msg, payload)) else: self._tail = data[start_pos:] data = EMPTY break # no parser, just store elif self._payload_parser is None and self._upgraded: assert not self._lines break # feed payload elif data and start_pos < 
data_len: assert not self._lines assert self._payload_parser is not None try: eof, data = self._payload_parser.feed_data( data[start_pos:]) except BaseException as exc: if self.payload_exception is not None: self._payload_parser.payload.set_exception( self.payload_exception(str(exc))) else: self._payload_parser.payload.set_exception(exc) eof = True data = b'' if eof: start_pos = 0 data_len = len(data) self._payload_parser = None continue else: break if data and start_pos < data_len: data = data[start_pos:] else: data = EMPTY return messages, self._upgraded, data def parse_headers( self, lines: List[bytes] ) -> Tuple['CIMultiDictProxy[str]', RawHeaders, Optional[bool], Optional[str], bool, bool]: """Parses RFC 5322 headers from a stream. Line continuations are supported. Returns list of header name and value pairs. Header name is in upper case. """ headers, raw_headers = self._headers_parser.parse_headers(lines) close_conn = None encoding = None upgrade = False chunked = False # keep-alive conn = headers.get(hdrs.CONNECTION) if conn: v = conn.lower() if v == 'close': close_conn = True elif v == 'keep-alive': close_conn = False elif v == 'upgrade': upgrade = True # encoding enc = headers.get(hdrs.CONTENT_ENCODING) if enc: enc = enc.lower() if enc in ('gzip', 'deflate', 'br'): encoding = enc # chunking te = headers.get(hdrs.TRANSFER_ENCODING) if te and 'chunked' in te.lower(): chunked = True return (headers, raw_headers, close_conn, encoding, upgrade, chunked) class HttpRequestParser(HttpParser): """Read request status line. Exception .http_exceptions.BadStatusLine could be raised in case of any errors in status line. Returns RawRequestMessage. """ def parse_message(self, lines: List[bytes]) -> Any: # request line line = lines[0].decode('utf-8', 'surrogateescape') try: method, path, version = line.split(None, 2) except ValueError: raise BadStatusLine(line) from None if len(path) > self.max_line_size: raise LineTooLong( 'Status line is too long', str(self.max_line_size), str(len(path))) # method if not METHRE.match(method): raise BadStatusLine(method) # version try: if version.startswith('HTTP/'): n1, n2 = version[5:].split('.', 1) version_o = HttpVersion(int(n1), int(n2)) else: raise BadStatusLine(version) except Exception: raise BadStatusLine(version) # read headers (headers, raw_headers, close, compression, upgrade, chunked) = self.parse_headers(lines) if close is None: # then the headers weren't set in the request if version_o <= HttpVersion10: # HTTP 1.0 must asks to not close close = True else: # HTTP 1.1 must ask to close. close = False return RawRequestMessage( method, path, version_o, headers, raw_headers, close, compression, upgrade, chunked, URL(path)) class HttpResponseParser(HttpParser): """Read response status line and headers. BadStatusLine could be raised in case of any errors in status line. 
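# ---------------------------------------------------------------------------
# [Editor's note: illustrative sketch, not part of the original archive.]
# Standalone use of the HeadersParser defined earlier in http_parser.py.
# parse_headers() skips lines[0] (the already-consumed start line) and stops
# at the empty b'' terminator; the header values are arbitrary examples.
from aiohttp.http_parser import HeadersParser

headers, raw_headers = HeadersParser().parse_headers(
    [b'GET / HTTP/1.1', b'Host: example.com', b'Accept: */*', b''])
assert headers['host'] == 'example.com'   # CIMultiDictProxy: case-insensitive lookup
assert raw_headers == ((b'Host', b'example.com'), (b'Accept', b'*/*'))
# ---------------------------------------------------------------------------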
Returns RawResponseMessage""" def parse_message(self, lines: List[bytes]) -> Any: line = lines[0].decode('utf-8', 'surrogateescape') try: version, status = line.split(None, 1) except ValueError: raise BadStatusLine(line) from None try: status, reason = status.split(None, 1) except ValueError: reason = '' if len(reason) > self.max_line_size: raise LineTooLong( 'Status line is too long', str(self.max_line_size), str(len(reason))) # version match = VERSRE.match(version) if match is None: raise BadStatusLine(line) version_o = HttpVersion(int(match.group(1)), int(match.group(2))) # The status code is a three-digit number try: status_i = int(status) except ValueError: raise BadStatusLine(line) from None if status_i > 999: raise BadStatusLine(line) # read headers (headers, raw_headers, close, compression, upgrade, chunked) = self.parse_headers(lines) if close is None: close = version_o <= HttpVersion10 return RawResponseMessage( version_o, status_i, reason.strip(), headers, raw_headers, close, compression, upgrade, chunked) class HttpPayloadParser: def __init__(self, payload: StreamReader, length: Optional[int]=None, chunked: bool=False, compression: Optional[str]=None, code: Optional[int]=None, method: Optional[str]=None, readall: bool=False, response_with_body: bool=True, auto_decompress: bool=True) -> None: self._length = 0 self._type = ParseState.PARSE_NONE self._chunk = ChunkState.PARSE_CHUNKED_SIZE self._chunk_size = 0 self._chunk_tail = b'' self._auto_decompress = auto_decompress self.done = False # payload decompression wrapper if response_with_body and compression and self._auto_decompress: real_payload = DeflateBuffer(payload, compression) # type: Union[StreamReader, DeflateBuffer] # noqa else: real_payload = payload # payload parser if not response_with_body: # don't parse payload if it's not expected to be received self._type = ParseState.PARSE_NONE real_payload.feed_eof() self.done = True elif chunked: self._type = ParseState.PARSE_CHUNKED elif length is not None: self._type = ParseState.PARSE_LENGTH self._length = length if self._length == 0: real_payload.feed_eof() self.done = True else: if readall and code != 204: self._type = ParseState.PARSE_UNTIL_EOF elif method in ('PUT', 'POST'): internal_logger.warning( # pragma: no cover 'Content-Length or Transfer-Encoding header is required') self._type = ParseState.PARSE_NONE real_payload.feed_eof() self.done = True self.payload = real_payload def feed_eof(self) -> None: if self._type == ParseState.PARSE_UNTIL_EOF: self.payload.feed_eof() elif self._type == ParseState.PARSE_LENGTH: raise ContentLengthError( "Not enough data for satisfy content length header.") elif self._type == ParseState.PARSE_CHUNKED: raise TransferEncodingError( "Not enough data for satisfy transfer length header.") def feed_data(self, chunk: bytes, SEP: bytes=b'\r\n', CHUNK_EXT: bytes=b';') -> Tuple[bool, bytes]: # Read specified amount of bytes if self._type == ParseState.PARSE_LENGTH: required = self._length chunk_len = len(chunk) if required >= chunk_len: self._length = required - chunk_len self.payload.feed_data(chunk, chunk_len) if self._length == 0: self.payload.feed_eof() return True, b'' else: self._length = 0 self.payload.feed_data(chunk[:required], required) self.payload.feed_eof() return True, chunk[required:] # Chunked transfer encoding parser elif self._type == ParseState.PARSE_CHUNKED: if self._chunk_tail: chunk = self._chunk_tail + chunk self._chunk_tail = b'' while chunk: # read next chunk size if self._chunk == ChunkState.PARSE_CHUNKED_SIZE: pos = 
chunk.find(SEP) if pos >= 0: i = chunk.find(CHUNK_EXT, 0, pos) if i >= 0: size_b = chunk[:i] # strip chunk-extensions else: size_b = chunk[:pos] try: size = int(bytes(size_b), 16) except ValueError: exc = TransferEncodingError( chunk[:pos].decode('ascii', 'surrogateescape')) self.payload.set_exception(exc) raise exc from None chunk = chunk[pos+2:] if size == 0: # eof marker self._chunk = ChunkState.PARSE_MAYBE_TRAILERS else: self._chunk = ChunkState.PARSE_CHUNKED_CHUNK self._chunk_size = size self.payload.begin_http_chunk_receiving() else: self._chunk_tail = chunk return False, b'' # read chunk and feed buffer if self._chunk == ChunkState.PARSE_CHUNKED_CHUNK: required = self._chunk_size chunk_len = len(chunk) if required > chunk_len: self._chunk_size = required - chunk_len self.payload.feed_data(chunk, chunk_len) return False, b'' else: self._chunk_size = 0 self.payload.feed_data(chunk[:required], required) chunk = chunk[required:] self._chunk = ChunkState.PARSE_CHUNKED_CHUNK_EOF self.payload.end_http_chunk_receiving() # toss the CRLF at the end of the chunk if self._chunk == ChunkState.PARSE_CHUNKED_CHUNK_EOF: if chunk[:2] == SEP: chunk = chunk[2:] self._chunk = ChunkState.PARSE_CHUNKED_SIZE else: self._chunk_tail = chunk return False, b'' # if stream does not contain trailer, after 0\r\n # we should get another \r\n otherwise # trailers needs to be skiped until \r\n\r\n if self._chunk == ChunkState.PARSE_MAYBE_TRAILERS: if chunk[:2] == SEP: # end of stream self.payload.feed_eof() return True, chunk[2:] else: self._chunk = ChunkState.PARSE_TRAILERS # read and discard trailer up to the CRLF terminator if self._chunk == ChunkState.PARSE_TRAILERS: pos = chunk.find(SEP) if pos >= 0: chunk = chunk[pos+2:] self._chunk = ChunkState.PARSE_MAYBE_TRAILERS else: self._chunk_tail = chunk return False, b'' # Read all bytes until eof elif self._type == ParseState.PARSE_UNTIL_EOF: self.payload.feed_data(chunk, len(chunk)) return False, b'' class DeflateBuffer: """DeflateStream decompress stream and feed data into specified stream.""" def __init__(self, out: StreamReader, encoding: Optional[str]) -> None: self.out = out self.size = 0 self.encoding = encoding self._started_decoding = False if encoding == 'br': if not HAS_BROTLI: # pragma: no cover raise ContentEncodingError( 'Can not decode content-encoding: brotli (br). 
' 'Please install `brotlipy`') self.decompressor = brotli.Decompressor() else: zlib_mode = (16 + zlib.MAX_WBITS if encoding == 'gzip' else -zlib.MAX_WBITS) self.decompressor = zlib.decompressobj(wbits=zlib_mode) def set_exception(self, exc: BaseException) -> None: self.out.set_exception(exc) def feed_data(self, chunk: bytes, size: int) -> None: self.size += size try: chunk = self.decompressor.decompress(chunk) except Exception: if not self._started_decoding and self.encoding == 'deflate': self.decompressor = zlib.decompressobj() try: chunk = self.decompressor.decompress(chunk) except Exception: raise ContentEncodingError( 'Can not decode content-encoding: %s' % self.encoding) else: raise ContentEncodingError( 'Can not decode content-encoding: %s' % self.encoding) if chunk: self._started_decoding = True self.out.feed_data(chunk, len(chunk)) def feed_eof(self) -> None: chunk = self.decompressor.flush() if chunk or self.size > 0: self.out.feed_data(chunk, len(chunk)) if self.encoding == 'deflate' and not self.decompressor.eof: raise ContentEncodingError('deflate') self.out.feed_eof() def begin_http_chunk_receiving(self) -> None: self.out.begin_http_chunk_receiving() def end_http_chunk_receiving(self) -> None: self.out.end_http_chunk_receiving() HttpRequestParserPy = HttpRequestParser HttpResponseParserPy = HttpResponseParser RawRequestMessagePy = RawRequestMessage RawResponseMessagePy = RawResponseMessage try: if not NO_EXTENSIONS: from ._http_parser import (HttpRequestParser, # type: ignore # noqa HttpResponseParser, RawRequestMessage, RawResponseMessage) HttpRequestParserC = HttpRequestParser HttpResponseParserC = HttpResponseParser RawRequestMessageC = RawRequestMessage RawResponseMessageC = RawResponseMessage except ImportError: # pragma: no cover pass aiohttp-3.6.2/aiohttp/http_websocket.py0000644000175100001650000006024513547410117020543 0ustar vstsdocker00000000000000"""WebSocket protocol versions 13 and 8.""" import asyncio import collections import json import random import re import sys import zlib from enum import IntEnum from struct import Struct from typing import Any, Callable, List, Optional, Tuple, Union from .base_protocol import BaseProtocol from .helpers import NO_EXTENSIONS from .log import ws_logger from .streams import DataQueue __all__ = ('WS_CLOSED_MESSAGE', 'WS_CLOSING_MESSAGE', 'WS_KEY', 'WebSocketReader', 'WebSocketWriter', 'WSMessage', 'WebSocketError', 'WSMsgType', 'WSCloseCode') class WSCloseCode(IntEnum): OK = 1000 GOING_AWAY = 1001 PROTOCOL_ERROR = 1002 UNSUPPORTED_DATA = 1003 INVALID_TEXT = 1007 POLICY_VIOLATION = 1008 MESSAGE_TOO_BIG = 1009 MANDATORY_EXTENSION = 1010 INTERNAL_ERROR = 1011 SERVICE_RESTART = 1012 TRY_AGAIN_LATER = 1013 ALLOWED_CLOSE_CODES = {int(i) for i in WSCloseCode} class WSMsgType(IntEnum): # websocket spec types CONTINUATION = 0x0 TEXT = 0x1 BINARY = 0x2 PING = 0x9 PONG = 0xa CLOSE = 0x8 # aiohttp specific types CLOSING = 0x100 CLOSED = 0x101 ERROR = 0x102 text = TEXT binary = BINARY ping = PING pong = PONG close = CLOSE closing = CLOSING closed = CLOSED error = ERROR WS_KEY = b'258EAFA5-E914-47DA-95CA-C5AB0DC85B11' UNPACK_LEN2 = Struct('!H').unpack_from UNPACK_LEN3 = Struct('!Q').unpack_from UNPACK_CLOSE_CODE = Struct('!H').unpack PACK_LEN1 = Struct('!BB').pack PACK_LEN2 = Struct('!BBH').pack PACK_LEN3 = Struct('!BBQ').pack PACK_CLOSE_CODE = Struct('!H').pack MSG_SIZE = 2 ** 14 DEFAULT_LIMIT = 2 ** 16 _WSMessageBase = collections.namedtuple('_WSMessageBase', ['type', 'data', 'extra']) class WSMessage(_WSMessageBase): def json(self, *, 
loads: Callable[[Any], Any]=json.loads) -> Any: """Return parsed JSON data. .. versionadded:: 0.22 """ return loads(self.data) WS_CLOSED_MESSAGE = WSMessage(WSMsgType.CLOSED, None, None) WS_CLOSING_MESSAGE = WSMessage(WSMsgType.CLOSING, None, None) class WebSocketError(Exception): """WebSocket protocol parser error.""" def __init__(self, code: int, message: str) -> None: self.code = code super().__init__(code, message) def __str__(self) -> str: return self.args[1] class WSHandshakeError(Exception): """WebSocket protocol handshake error.""" native_byteorder = sys.byteorder # Used by _websocket_mask_python _XOR_TABLE = [bytes(a ^ b for a in range(256)) for b in range(256)] def _websocket_mask_python(mask: bytes, data: bytearray) -> None: """Websocket masking function. `mask` is a `bytes` object of length 4; `data` is a `bytearray` object of any length. The contents of `data` are masked with `mask`, as specified in section 5.3 of RFC 6455. Note that this function mutates the `data` argument. This pure-python implementation may be replaced by an optimized version when available. """ assert isinstance(data, bytearray), data assert len(mask) == 4, mask if data: a, b, c, d = (_XOR_TABLE[n] for n in mask) data[::4] = data[::4].translate(a) data[1::4] = data[1::4].translate(b) data[2::4] = data[2::4].translate(c) data[3::4] = data[3::4].translate(d) if NO_EXTENSIONS: # pragma: no cover _websocket_mask = _websocket_mask_python else: try: from ._websocket import _websocket_mask_cython # type: ignore _websocket_mask = _websocket_mask_cython except ImportError: # pragma: no cover _websocket_mask = _websocket_mask_python _WS_DEFLATE_TRAILING = bytes([0x00, 0x00, 0xff, 0xff]) _WS_EXT_RE = re.compile(r'^(?:;\s*(?:' r'(server_no_context_takeover)|' r'(client_no_context_takeover)|' r'(server_max_window_bits(?:=(\d+))?)|' r'(client_max_window_bits(?:=(\d+))?)))*$') _WS_EXT_RE_SPLIT = re.compile(r'permessage-deflate([^,]+)?') def ws_ext_parse(extstr: str, isserver: bool=False) -> Tuple[int, bool]: if not extstr: return 0, False compress = 0 notakeover = False for ext in _WS_EXT_RE_SPLIT.finditer(extstr): defext = ext.group(1) # Return compress = 15 when get `permessage-deflate` if not defext: compress = 15 break match = _WS_EXT_RE.match(defext) if match: compress = 15 if isserver: # Server never fail to detect compress handshake. 
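# --- Illustrative sketch (not part of aiohttp itself): WebSocket masking, as
# implemented by _websocket_mask_python() above, is a plain byte-wise XOR with
# a 4-byte key (RFC 6455, section 5.3), so applying it twice restores the
# original payload. Standalone equivalent:

def _demo_mask(key: bytes, payload: bytearray) -> None:
    # mirrors _websocket_mask_python(); mutates `payload` in place
    for i in range(len(payload)):
        payload[i] ^= key[i % 4]

_buf = bytearray(b'Hello, websocket!')
_demo_mask(b'\x01\x02\x03\x04', _buf)   # mask
_demo_mask(b'\x01\x02\x03\x04', _buf)   # unmask (XOR is its own inverse)
assert bytes(_buf) == b'Hello, websocket!'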
# Server does not need to send max wbit to client if match.group(4): compress = int(match.group(4)) # Group3 must match if group4 matches # Compress wbit 8 does not support in zlib # If compress level not support, # CONTINUE to next extension if compress > 15 or compress < 9: compress = 0 continue if match.group(1): notakeover = True # Ignore regex group 5 & 6 for client_max_window_bits break else: if match.group(6): compress = int(match.group(6)) # Group5 must match if group6 matches # Compress wbit 8 does not support in zlib # If compress level not support, # FAIL the parse progress if compress > 15 or compress < 9: raise WSHandshakeError('Invalid window size') if match.group(2): notakeover = True # Ignore regex group 5 & 6 for client_max_window_bits break # Return Fail if client side and not match elif not isserver: raise WSHandshakeError('Extension for deflate not supported' + ext.group(1)) return compress, notakeover def ws_ext_gen(compress: int=15, isserver: bool=False, server_notakeover: bool=False) -> str: # client_notakeover=False not used for server # compress wbit 8 does not support in zlib if compress < 9 or compress > 15: raise ValueError('Compress wbits must between 9 and 15, ' 'zlib does not support wbits=8') enabledext = ['permessage-deflate'] if not isserver: enabledext.append('client_max_window_bits') if compress < 15: enabledext.append('server_max_window_bits=' + str(compress)) if server_notakeover: enabledext.append('server_no_context_takeover') # if client_notakeover: # enabledext.append('client_no_context_takeover') return '; '.join(enabledext) class WSParserState(IntEnum): READ_HEADER = 1 READ_PAYLOAD_LENGTH = 2 READ_PAYLOAD_MASK = 3 READ_PAYLOAD = 4 class WebSocketReader: def __init__(self, queue: DataQueue[WSMessage], max_msg_size: int, compress: bool=True) -> None: self.queue = queue self._max_msg_size = max_msg_size self._exc = None # type: Optional[BaseException] self._partial = bytearray() self._state = WSParserState.READ_HEADER self._opcode = None # type: Optional[int] self._frame_fin = False self._frame_opcode = None # type: Optional[int] self._frame_payload = bytearray() self._tail = b'' self._has_mask = False self._frame_mask = None # type: Optional[bytes] self._payload_length = 0 self._payload_length_flag = 0 self._compressed = None # type: Optional[bool] self._decompressobj = None # type: Any # zlib.decompressobj actually self._compress = compress def feed_eof(self) -> None: self.queue.feed_eof() def feed_data(self, data: bytes) -> Tuple[bool, bytes]: if self._exc: return True, data try: return self._feed_data(data) except Exception as exc: self._exc = exc self.queue.set_exception(exc) return True, b'' def _feed_data(self, data: bytes) -> Tuple[bool, bytes]: for fin, opcode, payload, compressed in self.parse_frame(data): if compressed and not self._decompressobj: self._decompressobj = zlib.decompressobj(wbits=-zlib.MAX_WBITS) if opcode == WSMsgType.CLOSE: if len(payload) >= 2: close_code = UNPACK_CLOSE_CODE(payload[:2])[0] if (close_code < 3000 and close_code not in ALLOWED_CLOSE_CODES): raise WebSocketError( WSCloseCode.PROTOCOL_ERROR, 'Invalid close code: {}'.format(close_code)) try: close_message = payload[2:].decode('utf-8') except UnicodeDecodeError as exc: raise WebSocketError( WSCloseCode.INVALID_TEXT, 'Invalid UTF-8 text message') from exc msg = WSMessage(WSMsgType.CLOSE, close_code, close_message) elif payload: raise WebSocketError( WSCloseCode.PROTOCOL_ERROR, 'Invalid close frame: {} {} {!r}'.format( fin, opcode, payload)) else: msg = 
WSMessage(WSMsgType.CLOSE, 0, '') self.queue.feed_data(msg, 0) elif opcode == WSMsgType.PING: self.queue.feed_data( WSMessage(WSMsgType.PING, payload, ''), len(payload)) elif opcode == WSMsgType.PONG: self.queue.feed_data( WSMessage(WSMsgType.PONG, payload, ''), len(payload)) elif opcode not in ( WSMsgType.TEXT, WSMsgType.BINARY) and self._opcode is None: raise WebSocketError( WSCloseCode.PROTOCOL_ERROR, "Unexpected opcode={!r}".format(opcode)) else: # load text/binary if not fin: # got partial frame payload if opcode != WSMsgType.CONTINUATION: self._opcode = opcode self._partial.extend(payload) if (self._max_msg_size and len(self._partial) >= self._max_msg_size): raise WebSocketError( WSCloseCode.MESSAGE_TOO_BIG, "Message size {} exceeds limit {}".format( len(self._partial), self._max_msg_size)) else: # previous frame was non finished # we should get continuation opcode if self._partial: if opcode != WSMsgType.CONTINUATION: raise WebSocketError( WSCloseCode.PROTOCOL_ERROR, 'The opcode in non-fin frame is expected ' 'to be zero, got {!r}'.format(opcode)) if opcode == WSMsgType.CONTINUATION: assert self._opcode is not None opcode = self._opcode self._opcode = None self._partial.extend(payload) if (self._max_msg_size and len(self._partial) >= self._max_msg_size): raise WebSocketError( WSCloseCode.MESSAGE_TOO_BIG, "Message size {} exceeds limit {}".format( len(self._partial), self._max_msg_size)) # Decompress process must to be done after all packets # received. if compressed: self._partial.extend(_WS_DEFLATE_TRAILING) payload_merged = self._decompressobj.decompress( self._partial, self._max_msg_size) if self._decompressobj.unconsumed_tail: left = len(self._decompressobj.unconsumed_tail) raise WebSocketError( WSCloseCode.MESSAGE_TOO_BIG, "Decompressed message size {} exceeds limit {}" .format( self._max_msg_size + left, self._max_msg_size ) ) else: payload_merged = bytes(self._partial) self._partial.clear() if opcode == WSMsgType.TEXT: try: text = payload_merged.decode('utf-8') self.queue.feed_data( WSMessage(WSMsgType.TEXT, text, ''), len(text)) except UnicodeDecodeError as exc: raise WebSocketError( WSCloseCode.INVALID_TEXT, 'Invalid UTF-8 text message') from exc else: self.queue.feed_data( WSMessage(WSMsgType.BINARY, payload_merged, ''), len(payload_merged)) return False, b'' def parse_frame(self, buf: bytes) -> List[Tuple[bool, Optional[int], bytearray, Optional[bool]]]: """Return the next frame from the socket.""" frames = [] if self._tail: buf, self._tail = self._tail + buf, b'' start_pos = 0 buf_length = len(buf) while True: # read header if self._state == WSParserState.READ_HEADER: if buf_length - start_pos >= 2: data = buf[start_pos:start_pos+2] start_pos += 2 first_byte, second_byte = data fin = (first_byte >> 7) & 1 rsv1 = (first_byte >> 6) & 1 rsv2 = (first_byte >> 5) & 1 rsv3 = (first_byte >> 4) & 1 opcode = first_byte & 0xf # frame-fin = %x0 ; more frames of this message follow # / %x1 ; final frame of this message # frame-rsv1 = %x0 ; # 1 bit, MUST be 0 unless negotiated otherwise # frame-rsv2 = %x0 ; # 1 bit, MUST be 0 unless negotiated otherwise # frame-rsv3 = %x0 ; # 1 bit, MUST be 0 unless negotiated otherwise # # Remove rsv1 from this test for deflate development if rsv2 or rsv3 or (rsv1 and not self._compress): raise WebSocketError( WSCloseCode.PROTOCOL_ERROR, 'Received frame with non-zero reserved bits') if opcode > 0x7 and fin == 0: raise WebSocketError( WSCloseCode.PROTOCOL_ERROR, 'Received fragmented control frame') has_mask = (second_byte >> 7) & 1 length = 
second_byte & 0x7f # Control frames MUST have a payload # length of 125 bytes or less if opcode > 0x7 and length > 125: raise WebSocketError( WSCloseCode.PROTOCOL_ERROR, 'Control frame payload cannot be ' 'larger than 125 bytes') # Set compress status if last package is FIN # OR set compress status if this is first fragment # Raise error if not first fragment with rsv1 = 0x1 if self._frame_fin or self._compressed is None: self._compressed = True if rsv1 else False elif rsv1: raise WebSocketError( WSCloseCode.PROTOCOL_ERROR, 'Received frame with non-zero reserved bits') self._frame_fin = bool(fin) self._frame_opcode = opcode self._has_mask = bool(has_mask) self._payload_length_flag = length self._state = WSParserState.READ_PAYLOAD_LENGTH else: break # read payload length if self._state == WSParserState.READ_PAYLOAD_LENGTH: length = self._payload_length_flag if length == 126: if buf_length - start_pos >= 2: data = buf[start_pos:start_pos+2] start_pos += 2 length = UNPACK_LEN2(data)[0] self._payload_length = length self._state = ( WSParserState.READ_PAYLOAD_MASK if self._has_mask else WSParserState.READ_PAYLOAD) else: break elif length > 126: if buf_length - start_pos >= 8: data = buf[start_pos:start_pos+8] start_pos += 8 length = UNPACK_LEN3(data)[0] self._payload_length = length self._state = ( WSParserState.READ_PAYLOAD_MASK if self._has_mask else WSParserState.READ_PAYLOAD) else: break else: self._payload_length = length self._state = ( WSParserState.READ_PAYLOAD_MASK if self._has_mask else WSParserState.READ_PAYLOAD) # read payload mask if self._state == WSParserState.READ_PAYLOAD_MASK: if buf_length - start_pos >= 4: self._frame_mask = buf[start_pos:start_pos+4] start_pos += 4 self._state = WSParserState.READ_PAYLOAD else: break if self._state == WSParserState.READ_PAYLOAD: length = self._payload_length payload = self._frame_payload chunk_len = buf_length - start_pos if length >= chunk_len: self._payload_length = length - chunk_len payload.extend(buf[start_pos:]) start_pos = buf_length else: self._payload_length = 0 payload.extend(buf[start_pos:start_pos+length]) start_pos = start_pos + length if self._payload_length == 0: if self._has_mask: assert self._frame_mask is not None _websocket_mask(self._frame_mask, payload) frames.append(( self._frame_fin, self._frame_opcode, payload, self._compressed)) self._frame_payload = bytearray() self._state = WSParserState.READ_HEADER else: break self._tail = buf[start_pos:] return frames class WebSocketWriter: def __init__(self, protocol: BaseProtocol, transport: asyncio.Transport, *, use_mask: bool=False, limit: int=DEFAULT_LIMIT, random: Any=random.Random(), compress: int=0, notakeover: bool=False) -> None: self.protocol = protocol self.transport = transport self.use_mask = use_mask self.randrange = random.randrange self.compress = compress self.notakeover = notakeover self._closing = False self._limit = limit self._output_size = 0 self._compressobj = None # type: Any # actually compressobj async def _send_frame(self, message: bytes, opcode: int, compress: Optional[int]=None) -> None: """Send a frame over the websocket with message as its payload.""" if self._closing: ws_logger.warning('websocket connection is closing.') rsv = 0 # Only compress larger packets (disabled) # Does small packet needs to be compressed? 
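# --- Illustrative sketch (not part of aiohttp itself): how a server-side
# (unmasked) frame header is laid out for the three payload-length encodings
# that _send_frame() below chooses between; mirrors the PACK_LEN1 / PACK_LEN2 /
# PACK_LEN3 structs defined at the top of this module.

import struct

def _demo_frame_header(opcode: int, length: int, rsv: int = 0) -> bytes:
    first = 0x80 | rsv | opcode              # FIN bit set, single frame
    if length < 126:
        return struct.pack('!BB', first, length)
    if length < (1 << 16):
        return struct.pack('!BBH', first, 126, length)
    return struct.pack('!BBQ', first, 127, length)

assert _demo_frame_header(0x1, 5) == b'\x81\x05'            # short TEXT frame
assert _demo_frame_header(0x2, 300) == b'\x82\x7e\x01\x2c'  # 16-bit length
assert _demo_frame_header(0x2, 1 << 20)[:2] == b'\x82\x7f'  # 64-bit length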
# if self.compress and opcode < 8 and len(message) > 124: if (compress or self.compress) and opcode < 8: if compress: # Do not set self._compress if compressing is for this frame compressobj = zlib.compressobj(wbits=-compress) else: # self.compress if not self._compressobj: self._compressobj = zlib.compressobj(wbits=-self.compress) compressobj = self._compressobj message = compressobj.compress(message) message = message + compressobj.flush( zlib.Z_FULL_FLUSH if self.notakeover else zlib.Z_SYNC_FLUSH) if message.endswith(_WS_DEFLATE_TRAILING): message = message[:-4] rsv = rsv | 0x40 msg_length = len(message) use_mask = self.use_mask if use_mask: mask_bit = 0x80 else: mask_bit = 0 if msg_length < 126: header = PACK_LEN1(0x80 | rsv | opcode, msg_length | mask_bit) elif msg_length < (1 << 16): header = PACK_LEN2(0x80 | rsv | opcode, 126 | mask_bit, msg_length) else: header = PACK_LEN3(0x80 | rsv | opcode, 127 | mask_bit, msg_length) if use_mask: mask = self.randrange(0, 0xffffffff) mask = mask.to_bytes(4, 'big') message = bytearray(message) _websocket_mask(mask, message) self.transport.write(header + mask + message) self._output_size += len(header) + len(mask) + len(message) else: if len(message) > MSG_SIZE: self.transport.write(header) self.transport.write(message) else: self.transport.write(header + message) self._output_size += len(header) + len(message) if self._output_size > self._limit: self._output_size = 0 await self.protocol._drain_helper() async def pong(self, message: bytes=b'') -> None: """Send pong message.""" if isinstance(message, str): message = message.encode('utf-8') await self._send_frame(message, WSMsgType.PONG) async def ping(self, message: bytes=b'') -> None: """Send ping message.""" if isinstance(message, str): message = message.encode('utf-8') await self._send_frame(message, WSMsgType.PING) async def send(self, message: Union[str, bytes], binary: bool=False, compress: Optional[int]=None) -> None: """Send a frame over the websocket with message as its payload.""" if isinstance(message, str): message = message.encode('utf-8') if binary: await self._send_frame(message, WSMsgType.BINARY, compress) else: await self._send_frame(message, WSMsgType.TEXT, compress) async def close(self, code: int=1000, message: bytes=b'') -> None: """Close the websocket, sending the specified code and message.""" if isinstance(message, str): message = message.encode('utf-8') try: await self._send_frame( PACK_CLOSE_CODE(code) + message, opcode=WSMsgType.CLOSE) finally: self._closing = True aiohttp-3.6.2/aiohttp/http_writer.py0000644000175100001650000001216713547410117020071 0ustar vstsdocker00000000000000"""Http related parsers and protocol.""" import asyncio import collections import zlib from typing import Any, Awaitable, Callable, Optional, Union # noqa from multidict import CIMultiDict # noqa from .abc import AbstractStreamWriter from .base_protocol import BaseProtocol from .helpers import NO_EXTENSIONS __all__ = ('StreamWriter', 'HttpVersion', 'HttpVersion10', 'HttpVersion11') HttpVersion = collections.namedtuple('HttpVersion', ['major', 'minor']) HttpVersion10 = HttpVersion(1, 0) HttpVersion11 = HttpVersion(1, 1) _T_OnChunkSent = Optional[Callable[[bytes], Awaitable[None]]] class StreamWriter(AbstractStreamWriter): def __init__(self, protocol: BaseProtocol, loop: asyncio.AbstractEventLoop, on_chunk_sent: _T_OnChunkSent = None) -> None: self._protocol = protocol self._transport = protocol.transport self.loop = loop self.length = None self.chunked = False self.buffer_size = 0 self.output_size 
= 0 self._eof = False self._compress = None # type: Any self._drain_waiter = None self._on_chunk_sent = on_chunk_sent # type: _T_OnChunkSent @property def transport(self) -> Optional[asyncio.Transport]: return self._transport @property def protocol(self) -> BaseProtocol: return self._protocol def enable_chunking(self) -> None: self.chunked = True def enable_compression(self, encoding: str='deflate') -> None: zlib_mode = (16 + zlib.MAX_WBITS if encoding == 'gzip' else -zlib.MAX_WBITS) self._compress = zlib.compressobj(wbits=zlib_mode) def _write(self, chunk: bytes) -> None: size = len(chunk) self.buffer_size += size self.output_size += size if self._transport is None or self._transport.is_closing(): raise ConnectionResetError('Cannot write to closing transport') self._transport.write(chunk) async def write(self, chunk: bytes, *, drain: bool=True, LIMIT: int=0x10000) -> None: """Writes chunk of data to a stream. write_eof() indicates end of stream. writer can't be used after write_eof() method being called. write() return drain future. """ if self._on_chunk_sent is not None: await self._on_chunk_sent(chunk) if self._compress is not None: chunk = self._compress.compress(chunk) if not chunk: return if self.length is not None: chunk_len = len(chunk) if self.length >= chunk_len: self.length = self.length - chunk_len else: chunk = chunk[:self.length] self.length = 0 if not chunk: return if chunk: if self.chunked: chunk_len_pre = ('%x\r\n' % len(chunk)).encode('ascii') chunk = chunk_len_pre + chunk + b'\r\n' self._write(chunk) if self.buffer_size > LIMIT and drain: self.buffer_size = 0 await self.drain() async def write_headers(self, status_line: str, headers: 'CIMultiDict[str]') -> None: """Write request/response status and headers.""" # status + headers buf = _serialize_headers(status_line, headers) self._write(buf) async def write_eof(self, chunk: bytes=b'') -> None: if self._eof: return if chunk and self._on_chunk_sent is not None: await self._on_chunk_sent(chunk) if self._compress: if chunk: chunk = self._compress.compress(chunk) chunk = chunk + self._compress.flush() if chunk and self.chunked: chunk_len = ('%x\r\n' % len(chunk)).encode('ascii') chunk = chunk_len + chunk + b'\r\n0\r\n\r\n' else: if self.chunked: if chunk: chunk_len = ('%x\r\n' % len(chunk)).encode('ascii') chunk = chunk_len + chunk + b'\r\n0\r\n\r\n' else: chunk = b'0\r\n\r\n' if chunk: self._write(chunk) await self.drain() self._eof = True self._transport = None async def drain(self) -> None: """Flush the write buffer. 
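# --- Illustrative sketch (not part of aiohttp itself): two wire-level details
# used by StreamWriter above. enable_compression() selects zlib wbits so that
# 16 + MAX_WBITS produces a gzip wrapper while -MAX_WBITS produces a raw
# deflate stream (matching DeflateBuffer on the parsing side), and with
# enable_chunking() each write() is framed as hex length, CRLF, data, CRLF,
# with write_eof() terminating the body with a zero-length chunk.

import zlib

_data = b'hello world' * 10
_gz = zlib.compressobj(wbits=16 + zlib.MAX_WBITS)
_gz_body = _gz.compress(_data) + _gz.flush()
assert zlib.decompressobj(wbits=16 + zlib.MAX_WBITS).decompress(_gz_body) == _data

def _demo_chunked(*chunks: bytes) -> bytes:
    out = b''
    for c in chunks:
        out += ('%x\r\n' % len(c)).encode('ascii') + c + b'\r\n'
    return out + b'0\r\n\r\n'

assert _demo_chunked(b'hello', b' world') == b'5\r\nhello\r\n6\r\n world\r\n0\r\n\r\n'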
The intended use is to write await w.write(data) await w.drain() """ if self._protocol.transport is not None: await self._protocol._drain_helper() def _py_serialize_headers(status_line: str, headers: 'CIMultiDict[str]') -> bytes: line = status_line + '\r\n' + ''.join( [k + ': ' + v + '\r\n' for k, v in headers.items()]) return line.encode('utf-8') + b'\r\n' _serialize_headers = _py_serialize_headers try: import aiohttp._http_writer as _http_writer # type: ignore _c_serialize_headers = _http_writer._serialize_headers if not NO_EXTENSIONS: _serialize_headers = _c_serialize_headers except ImportError: pass aiohttp-3.6.2/aiohttp/locks.py0000644000175100001650000000232213547410117016621 0ustar vstsdocker00000000000000import asyncio import collections from typing import Any, Optional try: from typing import Deque except ImportError: from typing_extensions import Deque # noqa class EventResultOrError: """ This class wrappers the Event asyncio lock allowing either awake the locked Tasks without any error or raising an exception. thanks to @vorpalsmith for the simple design. """ def __init__(self, loop: asyncio.AbstractEventLoop) -> None: self._loop = loop self._exc = None # type: Optional[BaseException] self._event = asyncio.Event(loop=loop) self._waiters = collections.deque() # type: Deque[asyncio.Future[Any]] def set(self, exc: Optional[BaseException]=None) -> None: self._exc = exc self._event.set() async def wait(self) -> Any: waiter = self._loop.create_task(self._event.wait()) self._waiters.append(waiter) try: val = await waiter finally: self._waiters.remove(waiter) if self._exc is not None: raise self._exc return val def cancel(self) -> None: """ Cancel all waiters """ for waiter in self._waiters: waiter.cancel() aiohttp-3.6.2/aiohttp/log.py0000644000175100001650000000050513547410117016270 0ustar vstsdocker00000000000000import logging access_logger = logging.getLogger('aiohttp.access') client_logger = logging.getLogger('aiohttp.client') internal_logger = logging.getLogger('aiohttp.internal') server_logger = logging.getLogger('aiohttp.server') web_logger = logging.getLogger('aiohttp.web') ws_logger = logging.getLogger('aiohttp.websocket') aiohttp-3.6.2/aiohttp/multipart.py0000644000175100001650000010001513547410117017525 0ustar vstsdocker00000000000000import base64 import binascii import json import re import uuid import warnings import zlib from collections import deque from types import TracebackType from typing import ( # noqa TYPE_CHECKING, Any, Dict, Iterator, List, Mapping, Optional, Sequence, Tuple, Type, Union, ) from urllib.parse import parse_qsl, unquote, urlencode from multidict import CIMultiDict, CIMultiDictProxy, MultiMapping # noqa from .hdrs import ( CONTENT_DISPOSITION, CONTENT_ENCODING, CONTENT_LENGTH, CONTENT_TRANSFER_ENCODING, CONTENT_TYPE, ) from .helpers import CHAR, TOKEN, parse_mimetype, reify from .http import HeadersParser from .payload import ( JsonPayload, LookupError, Order, Payload, StringPayload, get_payload, payload_type, ) from .streams import StreamReader __all__ = ('MultipartReader', 'MultipartWriter', 'BodyPartReader', 'BadContentDispositionHeader', 'BadContentDispositionParam', 'parse_content_disposition', 'content_disposition_filename') if TYPE_CHECKING: # pragma: no cover from .client_reqrep import ClientResponse # noqa class BadContentDispositionHeader(RuntimeWarning): pass class BadContentDispositionParam(RuntimeWarning): pass def parse_content_disposition(header: Optional[str]) -> Tuple[Optional[str], Dict[str, str]]: def is_token(string: str) -> bool: 
return bool(string) and TOKEN >= set(string) def is_quoted(string: str) -> bool: return string[0] == string[-1] == '"' def is_rfc5987(string: str) -> bool: return is_token(string) and string.count("'") == 2 def is_extended_param(string: str) -> bool: return string.endswith('*') def is_continuous_param(string: str) -> bool: pos = string.find('*') + 1 if not pos: return False substring = string[pos:-1] if string.endswith('*') else string[pos:] return substring.isdigit() def unescape(text: str, *, chars: str=''.join(map(re.escape, CHAR))) -> str: return re.sub('\\\\([{}])'.format(chars), '\\1', text) if not header: return None, {} disptype, *parts = header.split(';') if not is_token(disptype): warnings.warn(BadContentDispositionHeader(header)) return None, {} params = {} # type: Dict[str, str] while parts: item = parts.pop(0) if '=' not in item: warnings.warn(BadContentDispositionHeader(header)) return None, {} key, value = item.split('=', 1) key = key.lower().strip() value = value.lstrip() if key in params: warnings.warn(BadContentDispositionHeader(header)) return None, {} if not is_token(key): warnings.warn(BadContentDispositionParam(item)) continue elif is_continuous_param(key): if is_quoted(value): value = unescape(value[1:-1]) elif not is_token(value): warnings.warn(BadContentDispositionParam(item)) continue elif is_extended_param(key): if is_rfc5987(value): encoding, _, value = value.split("'", 2) encoding = encoding or 'utf-8' else: warnings.warn(BadContentDispositionParam(item)) continue try: value = unquote(value, encoding, 'strict') except UnicodeDecodeError: # pragma: nocover warnings.warn(BadContentDispositionParam(item)) continue else: failed = True if is_quoted(value): failed = False value = unescape(value[1:-1].lstrip('\\/')) elif is_token(value): failed = False elif parts: # maybe just ; in filename, in any case this is just # one case fix, for proper fix we need to redesign parser _value = '%s;%s' % (value, parts[0]) if is_quoted(_value): parts.pop(0) value = unescape(_value[1:-1].lstrip('\\/')) failed = False if failed: warnings.warn(BadContentDispositionHeader(header)) return None, {} params[key] = value return disptype.lower(), params def content_disposition_filename(params: Mapping[str, str], name: str='filename') -> Optional[str]: name_suf = '%s*' % name if not params: return None elif name_suf in params: return params[name_suf] elif name in params: return params[name] else: parts = [] fnparams = sorted((key, value) for key, value in params.items() if key.startswith(name_suf)) for num, (key, value) in enumerate(fnparams): _, tail = key.split('*', 1) if tail.endswith('*'): tail = tail[:-1] if tail == str(num): parts.append(value) else: break if not parts: return None value = ''.join(parts) if "'" in value: encoding, _, value = value.split("'", 2) encoding = encoding or 'utf-8' return unquote(value, encoding, 'strict') return value class MultipartResponseWrapper: """Wrapper around the MultipartReader. It takes care about underlying connection and close it when it needs in. 
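# --- Illustrative sketch (not part of aiohttp itself): how the reader classes
# below are typically consumed on the client side. from_response() wraps the
# MultipartReader in the MultipartResponseWrapper defined here; the example
# assumes `resp` is an aiohttp.ClientResponse with a multipart body and that
# no part is itself a nested multipart.

async def _demo_read_parts(resp):
    from aiohttp import MultipartReader
    reader = MultipartReader.from_response(resp)
    async for part in reader:             # BodyPartReader instances
        if part.filename:                 # file upload
            yield part.filename, await part.read(decode=True)
        else:                             # plain form field
            yield part.name, await part.text()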
""" def __init__( self, resp: 'ClientResponse', stream: 'MultipartReader', ) -> None: self.resp = resp self.stream = stream def __aiter__(self) -> 'MultipartResponseWrapper': return self async def __anext__( self, ) -> Union['MultipartReader', 'BodyPartReader']: part = await self.next() if part is None: raise StopAsyncIteration # NOQA return part def at_eof(self) -> bool: """Returns True when all response data had been read.""" return self.resp.content.at_eof() async def next( self, ) -> Optional[Union['MultipartReader', 'BodyPartReader']]: """Emits next multipart reader object.""" item = await self.stream.next() if self.stream.at_eof(): await self.release() return item async def release(self) -> None: """Releases the connection gracefully, reading all the content to the void.""" await self.resp.release() class BodyPartReader: """Multipart reader for single body part.""" chunk_size = 8192 def __init__(self, boundary: bytes, headers: 'CIMultiDictProxy[str]', content: StreamReader) -> None: self.headers = headers self._boundary = boundary self._content = content self._at_eof = False length = self.headers.get(CONTENT_LENGTH, None) self._length = int(length) if length is not None else None self._read_bytes = 0 # TODO: typeing.Deque is not supported by Python 3.5 self._unread = deque() # type: Any self._prev_chunk = None # type: Optional[bytes] self._content_eof = 0 self._cache = {} # type: Dict[str, Any] def __aiter__(self) -> 'BodyPartReader': return self async def __anext__(self) -> bytes: part = await self.next() if part is None: raise StopAsyncIteration # NOQA return part async def next(self) -> Optional[bytes]: item = await self.read() if not item: return None return item async def read(self, *, decode: bool=False) -> bytes: """Reads body part data. decode: Decodes data following by encoding method from Content-Encoding header. If it missed data remains untouched """ if self._at_eof: return b'' data = bytearray() while not self._at_eof: data.extend((await self.read_chunk(self.chunk_size))) if decode: return self.decode(data) return data async def read_chunk(self, size: int=chunk_size) -> bytes: """Reads body part content chunk of the specified size. size: chunk size """ if self._at_eof: return b'' if self._length: chunk = await self._read_chunk_from_length(size) else: chunk = await self._read_chunk_from_stream(size) self._read_bytes += len(chunk) if self._read_bytes == self._length: self._at_eof = True if self._at_eof: clrf = await self._content.readline() assert b'\r\n' == clrf, \ 'reader did not read all the data or it is malformed' return chunk async def _read_chunk_from_length(self, size: int) -> bytes: # Reads body part content chunk of the specified size. # The body part must has Content-Length header with proper value. assert self._length is not None, \ 'Content-Length required for chunked read' chunk_size = min(size, self._length - self._read_bytes) chunk = await self._content.read(chunk_size) return chunk async def _read_chunk_from_stream(self, size: int) -> bytes: # Reads content chunk of body part with unknown length. # The Content-Length header for body part is not necessary. 
assert size >= len(self._boundary) + 2, \ 'Chunk size must be greater or equal than boundary length + 2' first_chunk = self._prev_chunk is None if first_chunk: self._prev_chunk = await self._content.read(size) chunk = await self._content.read(size) self._content_eof += int(self._content.at_eof()) assert self._content_eof < 3, "Reading after EOF" assert self._prev_chunk is not None window = self._prev_chunk + chunk sub = b'\r\n' + self._boundary if first_chunk: idx = window.find(sub) else: idx = window.find(sub, max(0, len(self._prev_chunk) - len(sub))) if idx >= 0: # pushing boundary back to content with warnings.catch_warnings(): warnings.filterwarnings("ignore", category=DeprecationWarning) self._content.unread_data(window[idx:]) if size > idx: self._prev_chunk = self._prev_chunk[:idx] chunk = window[len(self._prev_chunk):idx] if not chunk: self._at_eof = True result = self._prev_chunk self._prev_chunk = chunk return result async def readline(self) -> bytes: """Reads body part by line by line.""" if self._at_eof: return b'' if self._unread: line = self._unread.popleft() else: line = await self._content.readline() if line.startswith(self._boundary): # the very last boundary may not come with \r\n, # so set single rules for everyone sline = line.rstrip(b'\r\n') boundary = self._boundary last_boundary = self._boundary + b'--' # ensure that we read exactly the boundary, not something alike if sline == boundary or sline == last_boundary: self._at_eof = True self._unread.append(line) return b'' else: next_line = await self._content.readline() if next_line.startswith(self._boundary): line = line[:-2] # strip CRLF but only once self._unread.append(next_line) return line async def release(self) -> None: """Like read(), but reads all the data to the void.""" if self._at_eof: return while not self._at_eof: await self.read_chunk(self.chunk_size) async def text(self, *, encoding: Optional[str]=None) -> str: """Like read(), but assumes that body part contains text data.""" data = await self.read(decode=True) # see https://www.w3.org/TR/html5/forms.html#multipart/form-data-encoding-algorithm # NOQA # and https://dvcs.w3.org/hg/xhr/raw-file/tip/Overview.html#dom-xmlhttprequest-send # NOQA encoding = encoding or self.get_charset(default='utf-8') return data.decode(encoding) async def json(self, *, encoding: Optional[str]=None) -> Optional[Dict[str, Any]]: """Like read(), but assumes that body parts contains JSON data.""" data = await self.read(decode=True) if not data: return None encoding = encoding or self.get_charset(default='utf-8') return json.loads(data.decode(encoding)) async def form(self, *, encoding: Optional[str]=None) -> List[Tuple[str, str]]: """Like read(), but assumes that body parts contains form urlencoded data. """ data = await self.read(decode=True) if not data: return [] if encoding is not None: real_encoding = encoding else: real_encoding = self.get_charset(default='utf-8') return parse_qsl(data.rstrip().decode(real_encoding), keep_blank_values=True, encoding=real_encoding) def at_eof(self) -> bool: """Returns True if the boundary was reached or False otherwise.""" return self._at_eof def decode(self, data: bytes) -> bytes: """Decodes data according the specified Content-Encoding or Content-Transfer-Encoding headers value. 
""" if CONTENT_TRANSFER_ENCODING in self.headers: data = self._decode_content_transfer(data) if CONTENT_ENCODING in self.headers: return self._decode_content(data) return data def _decode_content(self, data: bytes) -> bytes: encoding = self.headers.get(CONTENT_ENCODING, '').lower() if encoding == 'deflate': return zlib.decompress(data, -zlib.MAX_WBITS) elif encoding == 'gzip': return zlib.decompress(data, 16 + zlib.MAX_WBITS) elif encoding == 'identity': return data else: raise RuntimeError('unknown content encoding: {}'.format(encoding)) def _decode_content_transfer(self, data: bytes) -> bytes: encoding = self.headers.get(CONTENT_TRANSFER_ENCODING, '').lower() if encoding == 'base64': return base64.b64decode(data) elif encoding == 'quoted-printable': return binascii.a2b_qp(data) elif encoding in ('binary', '8bit', '7bit'): return data else: raise RuntimeError('unknown content transfer encoding: {}' ''.format(encoding)) def get_charset(self, default: str) -> str: """Returns charset parameter from Content-Type header or default.""" ctype = self.headers.get(CONTENT_TYPE, '') mimetype = parse_mimetype(ctype) return mimetype.parameters.get('charset', default) @reify def name(self) -> Optional[str]: """Returns name specified in Content-Disposition header or None if missed or header is malformed. """ _, params = parse_content_disposition( self.headers.get(CONTENT_DISPOSITION)) return content_disposition_filename(params, 'name') @reify def filename(self) -> Optional[str]: """Returns filename specified in Content-Disposition header or None if missed or header is malformed. """ _, params = parse_content_disposition( self.headers.get(CONTENT_DISPOSITION)) return content_disposition_filename(params, 'filename') @payload_type(BodyPartReader, order=Order.try_first) class BodyPartReaderPayload(Payload): def __init__(self, value: BodyPartReader, *args: Any, **kwargs: Any) -> None: super().__init__(value, *args, **kwargs) params = {} # type: Dict[str, str] if value.name is not None: params['name'] = value.name if value.filename is not None: params['filename'] = value.filename if params: self.set_content_disposition('attachment', True, **params) async def write(self, writer: Any) -> None: field = self._value chunk = await field.read_chunk(size=2**16) while chunk: await writer.write(field.decode(chunk)) chunk = await field.read_chunk(size=2**16) class MultipartReader: """Multipart body reader.""" #: Response wrapper, used when multipart readers constructs from response. response_wrapper_cls = MultipartResponseWrapper #: Multipart reader class, used to handle multipart/* body parts. #: None points to type(self) multipart_reader_cls = None #: Body part reader class for non multipart/* content types. part_reader_cls = BodyPartReader def __init__(self, headers: Mapping[str, str], content: StreamReader) -> None: self.headers = headers self._boundary = ('--' + self._get_boundary()).encode() self._content = content self._last_part = None # type: Optional[Union['MultipartReader', BodyPartReader]] # noqa self._at_eof = False self._at_bof = True self._unread = [] # type: List[bytes] def __aiter__(self) -> 'MultipartReader': return self async def __anext__( self, ) -> Union['MultipartReader', BodyPartReader]: part = await self.next() if part is None: raise StopAsyncIteration # NOQA return part @classmethod def from_response( cls, response: 'ClientResponse', ) -> MultipartResponseWrapper: """Constructs reader instance from HTTP response. 
:param response: :class:`~aiohttp.client.ClientResponse` instance """ obj = cls.response_wrapper_cls(response, cls(response.headers, response.content)) return obj def at_eof(self) -> bool: """Returns True if the final boundary was reached or False otherwise. """ return self._at_eof async def next( self, ) -> Optional[Union['MultipartReader', BodyPartReader]]: """Emits the next multipart body part.""" # So, if we're at BOF, we need to skip till the boundary. if self._at_eof: return None await self._maybe_release_last_part() if self._at_bof: await self._read_until_first_boundary() self._at_bof = False else: await self._read_boundary() if self._at_eof: # we just read the last boundary, nothing to do there return None self._last_part = await self.fetch_next_part() return self._last_part async def release(self) -> None: """Reads all the body parts to the void till the final boundary.""" while not self._at_eof: item = await self.next() if item is None: break await item.release() async def fetch_next_part( self, ) -> Union['MultipartReader', BodyPartReader]: """Returns the next body part reader.""" headers = await self._read_headers() return self._get_part_reader(headers) def _get_part_reader( self, headers: 'CIMultiDictProxy[str]', ) -> Union['MultipartReader', BodyPartReader]: """Dispatches the response by the `Content-Type` header, returning suitable reader instance. :param dict headers: Response headers """ ctype = headers.get(CONTENT_TYPE, '') mimetype = parse_mimetype(ctype) if mimetype.type == 'multipart': if self.multipart_reader_cls is None: return type(self)(headers, self._content) return self.multipart_reader_cls(headers, self._content) else: return self.part_reader_cls(self._boundary, headers, self._content) def _get_boundary(self) -> str: mimetype = parse_mimetype(self.headers[CONTENT_TYPE]) assert mimetype.type == 'multipart', ( 'multipart/* content type expected' ) if 'boundary' not in mimetype.parameters: raise ValueError('boundary missed for Content-Type: %s' % self.headers[CONTENT_TYPE]) boundary = mimetype.parameters['boundary'] if len(boundary) > 70: raise ValueError('boundary %r is too long (70 chars max)' % boundary) return boundary async def _readline(self) -> bytes: if self._unread: return self._unread.pop() return await self._content.readline() async def _read_until_first_boundary(self) -> None: while True: chunk = await self._readline() if chunk == b'': raise ValueError("Could not find starting boundary %r" % (self._boundary)) chunk = chunk.rstrip() if chunk == self._boundary: return elif chunk == self._boundary + b'--': self._at_eof = True return async def _read_boundary(self) -> None: chunk = (await self._readline()).rstrip() if chunk == self._boundary: pass elif chunk == self._boundary + b'--': self._at_eof = True epilogue = await self._readline() next_line = await self._readline() # the epilogue is expected and then either the end of input or the # parent multipart boundary, if the parent boundary is found then # it should be marked as unread and handed to the parent for # processing if next_line[:2] == b'--': self._unread.append(next_line) # otherwise the request is likely missing an epilogue and both # lines should be passed to the parent for processing # (this handles the old behavior gracefully) else: self._unread.extend([next_line, epilogue]) else: raise ValueError('Invalid boundary %r, expected %r' % (chunk, self._boundary)) async def _read_headers(self) -> 'CIMultiDictProxy[str]': lines = [b''] while True: chunk = await self._content.readline() chunk = 
chunk.strip() lines.append(chunk) if not chunk: break parser = HeadersParser() headers, raw_headers = parser.parse_headers(lines) return headers async def _maybe_release_last_part(self) -> None: """Ensures that the last read body part is read completely.""" if self._last_part is not None: if not self._last_part.at_eof(): await self._last_part.release() self._unread.extend(self._last_part._unread) self._last_part = None _Part = Tuple[Payload, str, str] class MultipartWriter(Payload): """Multipart body writer.""" def __init__(self, subtype: str='mixed', boundary: Optional[str]=None) -> None: boundary = boundary if boundary is not None else uuid.uuid4().hex # The underlying Payload API demands a str (utf-8), not bytes, # so we need to ensure we don't lose anything during conversion. # As a result, require the boundary to be ASCII only. # In both situations. try: self._boundary = boundary.encode('ascii') except UnicodeEncodeError: raise ValueError('boundary should contain ASCII only chars') \ from None ctype = ('multipart/{}; boundary={}' .format(subtype, self._boundary_value)) super().__init__(None, content_type=ctype) self._parts = [] # type: List[_Part] # noqa def __enter__(self) -> 'MultipartWriter': return self def __exit__(self, exc_type: Optional[Type[BaseException]], exc_val: Optional[BaseException], exc_tb: Optional[TracebackType]) -> None: pass def __iter__(self) -> Iterator[_Part]: return iter(self._parts) def __len__(self) -> int: return len(self._parts) def __bool__(self) -> bool: return True _valid_tchar_regex = re.compile(br"\A[!#$%&'*+\-.^_`|~\w]+\Z") _invalid_qdtext_char_regex = re.compile(br"[\x00-\x08\x0A-\x1F\x7F]") @property def _boundary_value(self) -> str: """Wrap boundary parameter value in quotes, if necessary. Reads self.boundary and returns a unicode sting. """ # Refer to RFCs 7231, 7230, 5234. # # parameter = token "=" ( token / quoted-string ) # token = 1*tchar # quoted-string = DQUOTE *( qdtext / quoted-pair ) DQUOTE # qdtext = HTAB / SP / %x21 / %x23-5B / %x5D-7E / obs-text # obs-text = %x80-FF # quoted-pair = "\" ( HTAB / SP / VCHAR / obs-text ) # tchar = "!" / "#" / "$" / "%" / "&" / "'" / "*" # / "+" / "-" / "." 
/ "^" / "_" / "`" / "|" / "~" # / DIGIT / ALPHA # ; any VCHAR, except delimiters # VCHAR = %x21-7E value = self._boundary if re.match(self._valid_tchar_regex, value): return value.decode('ascii') # cannot fail if re.search(self._invalid_qdtext_char_regex, value): raise ValueError("boundary value contains invalid characters") # escape %x5C and %x22 quoted_value_content = value.replace(b'\\', b'\\\\') quoted_value_content = quoted_value_content.replace(b'"', b'\\"') return '"' + quoted_value_content.decode('ascii') + '"' @property def boundary(self) -> str: return self._boundary.decode('ascii') def append( self, obj: Any, headers: Optional[MultiMapping[str]]=None ) -> Payload: if headers is None: headers = CIMultiDict() if isinstance(obj, Payload): obj.headers.update(headers) return self.append_payload(obj) else: try: payload = get_payload(obj, headers=headers) except LookupError: raise TypeError('Cannot create payload from %r' % obj) else: return self.append_payload(payload) def append_payload(self, payload: Payload) -> Payload: """Adds a new body part to multipart writer.""" # compression encoding = payload.headers.get( CONTENT_ENCODING, '', ).lower() # type: Optional[str] if encoding and encoding not in ('deflate', 'gzip', 'identity'): raise RuntimeError('unknown content encoding: {}'.format(encoding)) if encoding == 'identity': encoding = None # te encoding te_encoding = payload.headers.get( CONTENT_TRANSFER_ENCODING, '', ).lower() # type: Optional[str] if te_encoding not in ('', 'base64', 'quoted-printable', 'binary'): raise RuntimeError('unknown content transfer encoding: {}' ''.format(te_encoding)) if te_encoding == 'binary': te_encoding = None # size size = payload.size if size is not None and not (encoding or te_encoding): payload.headers[CONTENT_LENGTH] = str(size) self._parts.append((payload, encoding, te_encoding)) # type: ignore return payload def append_json( self, obj: Any, headers: Optional[MultiMapping[str]]=None ) -> Payload: """Helper to append JSON part.""" if headers is None: headers = CIMultiDict() return self.append_payload(JsonPayload(obj, headers=headers)) def append_form( self, obj: Union[Sequence[Tuple[str, str]], Mapping[str, str]], headers: Optional[MultiMapping[str]]=None ) -> Payload: """Helper to append form urlencoded part.""" assert isinstance(obj, (Sequence, Mapping)) if headers is None: headers = CIMultiDict() if isinstance(obj, Mapping): obj = list(obj.items()) data = urlencode(obj, doseq=True) return self.append_payload( StringPayload(data, headers=headers, content_type='application/x-www-form-urlencoded')) @property def size(self) -> Optional[int]: """Size of the payload.""" total = 0 for part, encoding, te_encoding in self._parts: if encoding or te_encoding or part.size is None: return None total += int( 2 + len(self._boundary) + 2 + # b'--'+self._boundary+b'\r\n' part.size + len(part._binary_headers) + 2 # b'\r\n' ) total += 2 + len(self._boundary) + 4 # b'--'+self._boundary+b'--\r\n' return total async def write(self, writer: Any, close_boundary: bool=True) -> None: """Write body.""" for part, encoding, te_encoding in self._parts: await writer.write(b'--' + self._boundary + b'\r\n') await writer.write(part._binary_headers) if encoding or te_encoding: w = MultipartPayloadWriter(writer) if encoding: w.enable_compression(encoding) if te_encoding: w.enable_encoding(te_encoding) await part.write(w) # type: ignore await w.write_eof() else: await part.write(writer) await writer.write(b'\r\n') if close_boundary: await writer.write(b'--' + self._boundary + 
b'--\r\n') class MultipartPayloadWriter: def __init__(self, writer: Any) -> None: self._writer = writer self._encoding = None # type: Optional[str] self._compress = None # type: Any self._encoding_buffer = None # type: Optional[bytearray] def enable_encoding(self, encoding: str) -> None: if encoding == 'base64': self._encoding = encoding self._encoding_buffer = bytearray() elif encoding == 'quoted-printable': self._encoding = 'quoted-printable' def enable_compression(self, encoding: str='deflate') -> None: zlib_mode = (16 + zlib.MAX_WBITS if encoding == 'gzip' else -zlib.MAX_WBITS) self._compress = zlib.compressobj(wbits=zlib_mode) async def write_eof(self) -> None: if self._compress is not None: chunk = self._compress.flush() if chunk: self._compress = None await self.write(chunk) if self._encoding == 'base64': if self._encoding_buffer: await self._writer.write(base64.b64encode( self._encoding_buffer)) async def write(self, chunk: bytes) -> None: if self._compress is not None: if chunk: chunk = self._compress.compress(chunk) if not chunk: return if self._encoding == 'base64': buf = self._encoding_buffer assert buf is not None buf.extend(chunk) if buf: div, mod = divmod(len(buf), 3) enc_chunk, self._encoding_buffer = ( buf[:div * 3], buf[div * 3:]) if enc_chunk: b64chunk = base64.b64encode(enc_chunk) await self._writer.write(b64chunk) elif self._encoding == 'quoted-printable': await self._writer.write(binascii.b2a_qp(chunk)) else: await self._writer.write(chunk) aiohttp-3.6.2/aiohttp/payload.py0000644000175100001650000003331313547410117017143 0ustar vstsdocker00000000000000import asyncio import enum import io import json import mimetypes import os import warnings from abc import ABC, abstractmethod from itertools import chain from typing import ( IO, TYPE_CHECKING, Any, ByteString, Dict, Iterable, Optional, Text, TextIO, Tuple, Type, Union, ) from multidict import CIMultiDict from . import hdrs from .abc import AbstractStreamWriter from .helpers import ( PY_36, content_disposition_header, guess_filename, parse_mimetype, sentinel, ) from .streams import DEFAULT_LIMIT, StreamReader from .typedefs import JSONEncoder, _CIMultiDict __all__ = ('PAYLOAD_REGISTRY', 'get_payload', 'payload_type', 'Payload', 'BytesPayload', 'StringPayload', 'IOBasePayload', 'BytesIOPayload', 'BufferedReaderPayload', 'TextIOPayload', 'StringIOPayload', 'JsonPayload', 'AsyncIterablePayload') TOO_LARGE_BYTES_BODY = 2 ** 20 # 1 MB if TYPE_CHECKING: # pragma: no cover from typing import List # noqa class LookupError(Exception): pass class Order(str, enum.Enum): normal = 'normal' try_first = 'try_first' try_last = 'try_last' def get_payload(data: Any, *args: Any, **kwargs: Any) -> 'Payload': return PAYLOAD_REGISTRY.get(data, *args, **kwargs) def register_payload(factory: Type['Payload'], type: Any, *, order: Order=Order.normal) -> None: PAYLOAD_REGISTRY.register(factory, type, order=order) class payload_type: def __init__(self, type: Any, *, order: Order=Order.normal) -> None: self.type = type self.order = order def __call__(self, factory: Type['Payload']) -> Type['Payload']: register_payload(factory, self.type, order=self.order) return factory class PayloadRegistry: """Payload registry. 
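# --- Illustrative sketch (not part of aiohttp itself): how the payload_type
# decorator above hooks a factory into PAYLOAD_REGISTRY so that get_payload()
# can dispatch on the value's type. MyBlob and MyBlobPayload are hypothetical
# names used only for this example.

class MyBlob:
    def __init__(self, data: bytes) -> None:
        self.data = data

@payload_type(MyBlob)
class MyBlobPayload(Payload):
    async def write(self, writer) -> None:
        # writer is an AbstractStreamWriter; write() accepts bytes
        await writer.write(self._value.data)

# get_payload(MyBlob(b'abc')) now returns a MyBlobPayload wrapping the value.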
note: we need zope.interface for more efficient adapter search """ def __init__(self) -> None: self._first = [] # type: List[Tuple[Type[Payload], Any]] self._normal = [] # type: List[Tuple[Type[Payload], Any]] self._last = [] # type: List[Tuple[Type[Payload], Any]] def get(self, data: Any, *args: Any, _CHAIN: Any=chain, **kwargs: Any) -> 'Payload': if isinstance(data, Payload): return data for factory, type in _CHAIN(self._first, self._normal, self._last): if isinstance(data, type): return factory(data, *args, **kwargs) raise LookupError() def register(self, factory: Type['Payload'], type: Any, *, order: Order=Order.normal) -> None: if order is Order.try_first: self._first.append((factory, type)) elif order is Order.normal: self._normal.append((factory, type)) elif order is Order.try_last: self._last.append((factory, type)) else: raise ValueError("Unsupported order {!r}".format(order)) class Payload(ABC): _default_content_type = 'application/octet-stream' # type: str _size = None # type: Optional[int] def __init__(self, value: Any, headers: Optional[ Union[ _CIMultiDict, Dict[str, str], Iterable[Tuple[str, str]] ] ] = None, content_type: Optional[str]=sentinel, filename: Optional[str]=None, encoding: Optional[str]=None, **kwargs: Any) -> None: self._encoding = encoding self._filename = filename self._headers = CIMultiDict() # type: _CIMultiDict self._value = value if content_type is not sentinel and content_type is not None: self._headers[hdrs.CONTENT_TYPE] = content_type elif self._filename is not None: content_type = mimetypes.guess_type(self._filename)[0] if content_type is None: content_type = self._default_content_type self._headers[hdrs.CONTENT_TYPE] = content_type else: self._headers[hdrs.CONTENT_TYPE] = self._default_content_type self._headers.update(headers or {}) @property def size(self) -> Optional[int]: """Size of the payload.""" return self._size @property def filename(self) -> Optional[str]: """Filename of the payload.""" return self._filename @property def headers(self) -> _CIMultiDict: """Custom item headers""" return self._headers @property def _binary_headers(self) -> bytes: return ''.join( [k + ': ' + v + '\r\n' for k, v in self.headers.items()] ).encode('utf-8') + b'\r\n' @property def encoding(self) -> Optional[str]: """Payload encoding""" return self._encoding @property def content_type(self) -> str: """Content type""" return self._headers[hdrs.CONTENT_TYPE] def set_content_disposition(self, disptype: str, quote_fields: bool=True, **params: Any) -> None: """Sets ``Content-Disposition`` header.""" self._headers[hdrs.CONTENT_DISPOSITION] = content_disposition_header( disptype, quote_fields=quote_fields, **params) @abstractmethod async def write(self, writer: AbstractStreamWriter) -> None: """Write payload. writer is an AbstractStreamWriter instance: """ class BytesPayload(Payload): def __init__(self, value: ByteString, *args: Any, **kwargs: Any) -> None: if not isinstance(value, (bytes, bytearray, memoryview)): raise TypeError("value argument must be byte-ish, not {!r}" .format(type(value))) if 'content_type' not in kwargs: kwargs['content_type'] = 'application/octet-stream' super().__init__(value, *args, **kwargs) self._size = len(value) if self._size > TOO_LARGE_BYTES_BODY: if PY_36: kwargs = {'source': self} else: kwargs = {} warnings.warn("Sending a large body directly with raw bytes might" " lock the event loop. 
You should probably pass an " "io.BytesIO object instead", ResourceWarning, **kwargs) async def write(self, writer: AbstractStreamWriter) -> None: await writer.write(self._value) class StringPayload(BytesPayload): def __init__(self, value: Text, *args: Any, encoding: Optional[str]=None, content_type: Optional[str]=None, **kwargs: Any) -> None: if encoding is None: if content_type is None: real_encoding = 'utf-8' content_type = 'text/plain; charset=utf-8' else: mimetype = parse_mimetype(content_type) real_encoding = mimetype.parameters.get('charset', 'utf-8') else: if content_type is None: content_type = 'text/plain; charset=%s' % encoding real_encoding = encoding super().__init__( value.encode(real_encoding), encoding=real_encoding, content_type=content_type, *args, **kwargs, ) class StringIOPayload(StringPayload): def __init__(self, value: IO[str], *args: Any, **kwargs: Any) -> None: super().__init__(value.read(), *args, **kwargs) class IOBasePayload(Payload): def __init__(self, value: IO[Any], disposition: str='attachment', *args: Any, **kwargs: Any) -> None: if 'filename' not in kwargs: kwargs['filename'] = guess_filename(value) super().__init__(value, *args, **kwargs) if self._filename is not None and disposition is not None: if hdrs.CONTENT_DISPOSITION not in self.headers: self.set_content_disposition( disposition, filename=self._filename ) async def write(self, writer: AbstractStreamWriter) -> None: loop = asyncio.get_event_loop() try: chunk = await loop.run_in_executor( None, self._value.read, DEFAULT_LIMIT ) while chunk: await writer.write(chunk) chunk = await loop.run_in_executor( None, self._value.read, DEFAULT_LIMIT ) finally: await loop.run_in_executor(None, self._value.close) class TextIOPayload(IOBasePayload): def __init__(self, value: TextIO, *args: Any, encoding: Optional[str]=None, content_type: Optional[str]=None, **kwargs: Any) -> None: if encoding is None: if content_type is None: encoding = 'utf-8' content_type = 'text/plain; charset=utf-8' else: mimetype = parse_mimetype(content_type) encoding = mimetype.parameters.get('charset', 'utf-8') else: if content_type is None: content_type = 'text/plain; charset=%s' % encoding super().__init__( value, content_type=content_type, encoding=encoding, *args, **kwargs, ) @property def size(self) -> Optional[int]: try: return os.fstat(self._value.fileno()).st_size - self._value.tell() except OSError: return None async def write(self, writer: AbstractStreamWriter) -> None: loop = asyncio.get_event_loop() try: chunk = await loop.run_in_executor( None, self._value.read, DEFAULT_LIMIT ) while chunk: await writer.write(chunk.encode(self._encoding)) chunk = await loop.run_in_executor( None, self._value.read, DEFAULT_LIMIT ) finally: await loop.run_in_executor(None, self._value.close) class BytesIOPayload(IOBasePayload): @property def size(self) -> int: position = self._value.tell() end = self._value.seek(0, os.SEEK_END) self._value.seek(position) return end - position class BufferedReaderPayload(IOBasePayload): @property def size(self) -> Optional[int]: try: return os.fstat(self._value.fileno()).st_size - self._value.tell() except OSError: # data.fileno() is not supported, e.g. 
# io.BufferedReader(io.BytesIO(b'data')) return None class JsonPayload(BytesPayload): def __init__(self, value: Any, encoding: str='utf-8', content_type: str='application/json', dumps: JSONEncoder=json.dumps, *args: Any, **kwargs: Any) -> None: super().__init__( dumps(value).encode(encoding), content_type=content_type, encoding=encoding, *args, **kwargs) if TYPE_CHECKING: # pragma: no cover from typing import AsyncIterator, AsyncIterable _AsyncIterator = AsyncIterator[bytes] _AsyncIterable = AsyncIterable[bytes] else: from collections.abc import AsyncIterable, AsyncIterator _AsyncIterator = AsyncIterator _AsyncIterable = AsyncIterable class AsyncIterablePayload(Payload): _iter = None # type: Optional[_AsyncIterator] def __init__(self, value: _AsyncIterable, *args: Any, **kwargs: Any) -> None: if not isinstance(value, AsyncIterable): raise TypeError("value argument must support " "collections.abc.AsyncIterablebe interface, " "got {!r}".format(type(value))) if 'content_type' not in kwargs: kwargs['content_type'] = 'application/octet-stream' super().__init__(value, *args, **kwargs) self._iter = value.__aiter__() async def write(self, writer: AbstractStreamWriter) -> None: if self._iter: try: # iter is not None check prevents rare cases # when the case iterable is used twice while True: chunk = await self._iter.__anext__() await writer.write(chunk) except StopAsyncIteration: self._iter = None class StreamReaderPayload(AsyncIterablePayload): def __init__(self, value: StreamReader, *args: Any, **kwargs: Any) -> None: super().__init__(value.iter_any(), *args, **kwargs) PAYLOAD_REGISTRY = PayloadRegistry() PAYLOAD_REGISTRY.register(BytesPayload, (bytes, bytearray, memoryview)) PAYLOAD_REGISTRY.register(StringPayload, str) PAYLOAD_REGISTRY.register(StringIOPayload, io.StringIO) PAYLOAD_REGISTRY.register(TextIOPayload, io.TextIOBase) PAYLOAD_REGISTRY.register(BytesIOPayload, io.BytesIO) PAYLOAD_REGISTRY.register( BufferedReaderPayload, (io.BufferedReader, io.BufferedRandom)) PAYLOAD_REGISTRY.register(IOBasePayload, io.IOBase) PAYLOAD_REGISTRY.register(StreamReaderPayload, StreamReader) # try_last for giving a chance to more specialized async interables like # multidict.BodyPartReaderPayload override the default PAYLOAD_REGISTRY.register(AsyncIterablePayload, AsyncIterable, order=Order.try_last) aiohttp-3.6.2/aiohttp/payload_streamer.py0000644000175100001650000000406713547410117021051 0ustar vstsdocker00000000000000""" Payload implemenation for coroutines as data provider. 
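# --- Illustrative sketch (not part of aiohttp itself): the async-generator
# style that the DeprecationWarning in `streamer` below points to as the
# replacement. Any async iterable is accepted as a request body because
# payload.py registers AsyncIterablePayload for collections.abc.AsyncIterable.

async def _demo_file_sender(file_name):
    with open(file_name, 'rb') as f:
        chunk = f.read(2 ** 16)
        while chunk:
            yield chunk
            chunk = f.read(2 ** 16)

# usage, assuming an open ClientSession `session`:
#     async with session.post(url, data=_demo_file_sender('huge_file')) as resp:
#         print(await resp.text())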
As a simple case, you can upload data from file:: @aiohttp.streamer async def file_sender(writer, file_name=None): with open(file_name, 'rb') as f: chunk = f.read(2**16) while chunk: await writer.write(chunk) chunk = f.read(2**16) Then you can use `file_sender` like this: async with session.post('http://httpbin.org/post', data=file_sender(file_name='huge_file')) as resp: print(await resp.text()) ..note:: Coroutine must accept `writer` as first argument """ import asyncio import warnings from typing import Any, Awaitable, Callable, Dict, Tuple from .abc import AbstractStreamWriter from .payload import Payload, payload_type __all__ = ('streamer',) class _stream_wrapper: def __init__(self, coro: Callable[..., Awaitable[None]], args: Tuple[Any, ...], kwargs: Dict[str, Any]) -> None: self.coro = asyncio.coroutine(coro) self.args = args self.kwargs = kwargs async def __call__(self, writer: AbstractStreamWriter) -> None: await self.coro(writer, *self.args, **self.kwargs) class streamer: def __init__(self, coro: Callable[..., Awaitable[None]]) -> None: warnings.warn("@streamer is deprecated, use async generators instead", DeprecationWarning, stacklevel=2) self.coro = coro def __call__(self, *args: Any, **kwargs: Any) -> _stream_wrapper: return _stream_wrapper(self.coro, args, kwargs) @payload_type(_stream_wrapper) class StreamWrapperPayload(Payload): async def write(self, writer: AbstractStreamWriter) -> None: await self._value(writer) @payload_type(streamer) class StreamPayload(StreamWrapperPayload): def __init__(self, value: Any, *args: Any, **kwargs: Any) -> None: super().__init__(value(), *args, **kwargs) async def write(self, writer: AbstractStreamWriter) -> None: await self._value(writer) aiohttp-3.6.2/aiohttp/py.typed0000644000175100001650000000000613547410117016630 0ustar vstsdocker00000000000000Markeraiohttp-3.6.2/aiohttp/pytest_plugin.py0000644000175100001650000002512313547410117020420 0ustar vstsdocker00000000000000import asyncio import contextlib import warnings from collections.abc import Callable import pytest from aiohttp.helpers import PY_37, isasyncgenfunction from aiohttp.web import Application from .test_utils import ( BaseTestServer, RawTestServer, TestClient, TestServer, loop_context, setup_test_loop, teardown_test_loop, ) from .test_utils import unused_port as _unused_port try: import uvloop except ImportError: # pragma: no cover uvloop = None try: import tokio except ImportError: # pragma: no cover tokio = None def pytest_addoption(parser): # type: ignore parser.addoption( '--aiohttp-fast', action='store_true', default=False, help='run tests faster by disabling extra checks') parser.addoption( '--aiohttp-loop', action='store', default='pyloop', help='run tests with specific loop: pyloop, uvloop, tokio or all') parser.addoption( '--aiohttp-enable-loop-debug', action='store_true', default=False, help='enable event loop debug mode') def pytest_fixture_setup(fixturedef): # type: ignore """ Allow fixtures to be coroutines. Run coroutine fixtures in an event loop. 
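For illustration only (``fake_db`` and ``connect_db`` are placeholders, not aiohttp names),
an async generator fixture handled by this hook could look like::

    @pytest.fixture
    async def fake_db(loop):
        db = await connect_db()   # hypothetical coroutine opening a resource
        yield db                  # teardown code after the yield runs in the same loop
        await db.close()

Note the explicit ``loop`` dependency: the wrapper below rejects asynchronous
fixtures whose fixture closure does not include the ``loop`` fixture.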
""" func = fixturedef.func if isasyncgenfunction(func): # async generator fixture is_async_gen = True elif asyncio.iscoroutinefunction(func): # regular async fixture is_async_gen = False else: # not an async fixture, nothing to do return strip_request = False if 'request' not in fixturedef.argnames: fixturedef.argnames += ('request',) strip_request = True def wrapper(*args, **kwargs): # type: ignore request = kwargs['request'] if strip_request: del kwargs['request'] # if neither the fixture nor the test use the 'loop' fixture, # 'getfixturevalue' will fail because the test is not parameterized # (this can be removed someday if 'loop' is no longer parameterized) if 'loop' not in request.fixturenames: raise Exception( "Asynchronous fixtures must depend on the 'loop' fixture or " "be used in tests depending from it." ) _loop = request.getfixturevalue('loop') if is_async_gen: # for async generators, we need to advance the generator once, # then advance it again in a finalizer gen = func(*args, **kwargs) def finalizer(): # type: ignore try: return _loop.run_until_complete(gen.__anext__()) except StopAsyncIteration: # NOQA pass request.addfinalizer(finalizer) return _loop.run_until_complete(gen.__anext__()) else: return _loop.run_until_complete(func(*args, **kwargs)) fixturedef.func = wrapper @pytest.fixture def fast(request): # type: ignore """--fast config option""" return request.config.getoption('--aiohttp-fast') @pytest.fixture def loop_debug(request): # type: ignore """--enable-loop-debug config option""" return request.config.getoption('--aiohttp-enable-loop-debug') @contextlib.contextmanager def _runtime_warning_context(): # type: ignore """ Context manager which checks for RuntimeWarnings, specifically to avoid "coroutine 'X' was never awaited" warnings being missed. If RuntimeWarnings occur in the context a RuntimeError is raised. """ with warnings.catch_warnings(record=True) as _warnings: yield rw = ['{w.filename}:{w.lineno}:{w.message}'.format(w=w) for w in _warnings # type: ignore if w.category == RuntimeWarning] if rw: raise RuntimeError('{} Runtime Warning{},\n{}'.format( len(rw), '' if len(rw) == 1 else 's', '\n'.join(rw) )) @contextlib.contextmanager def _passthrough_loop_context(loop, fast=False): # type: ignore """ setups and tears down a loop unless one is passed in via the loop argument when it's passed straight through. """ if loop: # loop already exists, pass it straight through yield loop else: # this shadows loop_context's standard behavior loop = setup_test_loop() yield loop teardown_test_loop(loop, fast=fast) def pytest_pycollect_makeitem(collector, name, obj): # type: ignore """ Fix pytest collecting for coroutines. """ if collector.funcnamefilter(name) and asyncio.iscoroutinefunction(obj): return list(collector._genfunctions(name, obj)) def pytest_pyfunc_call(pyfuncitem): # type: ignore """ Run coroutines in an event loop instead of a normal function call. 
""" fast = pyfuncitem.config.getoption("--aiohttp-fast") if asyncio.iscoroutinefunction(pyfuncitem.function): existing_loop = pyfuncitem.funcargs.get('proactor_loop')\ or pyfuncitem.funcargs.get('loop', None) with _runtime_warning_context(): with _passthrough_loop_context(existing_loop, fast=fast) as _loop: testargs = {arg: pyfuncitem.funcargs[arg] for arg in pyfuncitem._fixtureinfo.argnames} _loop.run_until_complete(pyfuncitem.obj(**testargs)) return True def pytest_generate_tests(metafunc): # type: ignore if 'loop_factory' not in metafunc.fixturenames: return loops = metafunc.config.option.aiohttp_loop avail_factories = {'pyloop': asyncio.DefaultEventLoopPolicy} if uvloop is not None: # pragma: no cover avail_factories['uvloop'] = uvloop.EventLoopPolicy if tokio is not None: # pragma: no cover avail_factories['tokio'] = tokio.EventLoopPolicy if loops == 'all': loops = 'pyloop,uvloop?,tokio?' factories = {} # type: ignore for name in loops.split(','): required = not name.endswith('?') name = name.strip(' ?') if name not in avail_factories: # pragma: no cover if required: raise ValueError( "Unknown loop '%s', available loops: %s" % ( name, list(factories.keys()))) else: continue factories[name] = avail_factories[name] metafunc.parametrize("loop_factory", list(factories.values()), ids=list(factories.keys())) @pytest.fixture def loop(loop_factory, fast, loop_debug): # type: ignore """Return an instance of the event loop.""" policy = loop_factory() asyncio.set_event_loop_policy(policy) with loop_context(fast=fast) as _loop: if loop_debug: _loop.set_debug(True) # pragma: no cover asyncio.set_event_loop(_loop) yield _loop @pytest.fixture def proactor_loop(): # type: ignore if not PY_37: policy = asyncio.get_event_loop_policy() policy._loop_factory = asyncio.ProactorEventLoop # type: ignore else: policy = asyncio.WindowsProactorEventLoopPolicy() # type: ignore asyncio.set_event_loop_policy(policy) with loop_context(policy.new_event_loop) as _loop: asyncio.set_event_loop(_loop) yield _loop @pytest.fixture def unused_port(aiohttp_unused_port): # type: ignore # pragma: no cover warnings.warn("Deprecated, use aiohttp_unused_port fixture instead", DeprecationWarning) return aiohttp_unused_port @pytest.fixture def aiohttp_unused_port(): # type: ignore """Return a port that is unused on the current host.""" return _unused_port @pytest.fixture def aiohttp_server(loop): # type: ignore """Factory to create a TestServer instance, given an app. aiohttp_server(app, **kwargs) """ servers = [] async def go(app, *, port=None, **kwargs): # type: ignore server = TestServer(app, port=port) await server.start_server(loop=loop, **kwargs) servers.append(server) return server yield go async def finalize(): # type: ignore while servers: await servers.pop().close() loop.run_until_complete(finalize()) @pytest.fixture def test_server(aiohttp_server): # type: ignore # pragma: no cover warnings.warn("Deprecated, use aiohttp_server fixture instead", DeprecationWarning) return aiohttp_server @pytest.fixture def aiohttp_raw_server(loop): # type: ignore """Factory to create a RawTestServer instance, given a web handler. 
aiohttp_raw_server(handler, **kwargs) """ servers = [] async def go(handler, *, port=None, **kwargs): # type: ignore server = RawTestServer(handler, port=port) await server.start_server(loop=loop, **kwargs) servers.append(server) return server yield go async def finalize(): # type: ignore while servers: await servers.pop().close() loop.run_until_complete(finalize()) @pytest.fixture def raw_test_server(aiohttp_raw_server): # type: ignore # pragma: no cover warnings.warn("Deprecated, use aiohttp_raw_server fixture instead", DeprecationWarning) return aiohttp_raw_server @pytest.fixture def aiohttp_client(loop): # type: ignore """Factory to create a TestClient instance. aiohttp_client(app, **kwargs) aiohttp_client(server, **kwargs) aiohttp_client(raw_server, **kwargs) """ clients = [] async def go(__param, *args, server_kwargs=None, **kwargs): # type: ignore if (isinstance(__param, Callable) and # type: ignore not isinstance(__param, (Application, BaseTestServer))): __param = __param(loop, *args, **kwargs) kwargs = {} else: assert not args, "args should be empty" if isinstance(__param, Application): server_kwargs = server_kwargs or {} server = TestServer(__param, loop=loop, **server_kwargs) client = TestClient(server, loop=loop, **kwargs) elif isinstance(__param, BaseTestServer): client = TestClient(__param, loop=loop, **kwargs) else: raise ValueError("Unknown argument type: %r" % type(__param)) await client.start_server() clients.append(client) return client yield go async def finalize(): # type: ignore while clients: await clients.pop().close() loop.run_until_complete(finalize()) @pytest.fixture def test_client(aiohttp_client): # type: ignore # pragma: no cover warnings.warn("Deprecated, use aiohttp_client fixture instead", DeprecationWarning) return aiohttp_client aiohttp-3.6.2/aiohttp/resolver.py0000644000175100001650000000705213547410117017354 0ustar vstsdocker00000000000000import asyncio import socket from typing import Any, Dict, List, Optional from .abc import AbstractResolver from .helpers import get_running_loop __all__ = ('ThreadedResolver', 'AsyncResolver', 'DefaultResolver') try: import aiodns # aiodns_default = hasattr(aiodns.DNSResolver, 'gethostbyname') except ImportError: # pragma: no cover aiodns = None aiodns_default = False class ThreadedResolver(AbstractResolver): """Use Executor for synchronous getaddrinfo() calls, which defaults to concurrent.futures.ThreadPoolExecutor. 
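A minimal usage sketch (the host name is an assumption; create the resolver from
inside a coroutine so a running event loop is available)::

    resolver = ThreadedResolver()
    infos = await resolver.resolve('example.com', 80)
    # each entry is a dict with 'hostname', 'host', 'port',
    # 'family', 'proto' and 'flags' keys

The same instance can be passed to ``aiohttp.TCPConnector(resolver=...)`` so the
client uses it for host lookups.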
""" def __init__(self, loop: Optional[asyncio.AbstractEventLoop]=None) -> None: self._loop = get_running_loop(loop) async def resolve(self, host: str, port: int=0, family: int=socket.AF_INET) -> List[Dict[str, Any]]: infos = await self._loop.getaddrinfo( host, port, type=socket.SOCK_STREAM, family=family) hosts = [] for family, _, proto, _, address in infos: hosts.append( {'hostname': host, 'host': address[0], 'port': address[1], 'family': family, 'proto': proto, 'flags': socket.AI_NUMERICHOST}) return hosts async def close(self) -> None: pass class AsyncResolver(AbstractResolver): """Use the `aiodns` package to make asynchronous DNS lookups""" def __init__(self, loop: Optional[asyncio.AbstractEventLoop]=None, *args: Any, **kwargs: Any) -> None: if aiodns is None: raise RuntimeError("Resolver requires aiodns library") self._loop = get_running_loop(loop) self._resolver = aiodns.DNSResolver(*args, loop=loop, **kwargs) if not hasattr(self._resolver, 'gethostbyname'): # aiodns 1.1 is not available, fallback to DNSResolver.query self.resolve = self._resolve_with_query # type: ignore async def resolve(self, host: str, port: int=0, family: int=socket.AF_INET) -> List[Dict[str, Any]]: try: resp = await self._resolver.gethostbyname(host, family) except aiodns.error.DNSError as exc: msg = exc.args[1] if len(exc.args) >= 1 else "DNS lookup failed" raise OSError(msg) from exc hosts = [] for address in resp.addresses: hosts.append( {'hostname': host, 'host': address, 'port': port, 'family': family, 'proto': 0, 'flags': socket.AI_NUMERICHOST}) if not hosts: raise OSError("DNS lookup failed") return hosts async def _resolve_with_query( self, host: str, port: int=0, family: int=socket.AF_INET) -> List[Dict[str, Any]]: if family == socket.AF_INET6: qtype = 'AAAA' else: qtype = 'A' try: resp = await self._resolver.query(host, qtype) except aiodns.error.DNSError as exc: msg = exc.args[1] if len(exc.args) >= 1 else "DNS lookup failed" raise OSError(msg) from exc hosts = [] for rr in resp: hosts.append( {'hostname': host, 'host': rr.host, 'port': port, 'family': family, 'proto': 0, 'flags': socket.AI_NUMERICHOST}) if not hosts: raise OSError("DNS lookup failed") return hosts async def close(self) -> None: return self._resolver.cancel() DefaultResolver = AsyncResolver if aiodns_default else ThreadedResolver aiohttp-3.6.2/aiohttp/signals.py0000644000175100001650000000166413547410117017156 0ustar vstsdocker00000000000000from aiohttp.frozenlist import FrozenList __all__ = ('Signal',) class Signal(FrozenList): """Coroutine-based signal implementation. To connect a callback to a signal, use any list method. Signals are fired using the send() coroutine, which takes named arguments. """ __slots__ = ('_owner',) def __init__(self, owner): super().__init__() self._owner = owner def __repr__(self): return ''.format(self._owner, self.frozen, list(self)) async def send(self, *args, **kwargs): """ Sends data to all registered receivers. """ if not self.frozen: raise RuntimeError("Cannot send non-frozen signal.") for receiver in self: await receiver(*args, **kwargs) # type: ignore aiohttp-3.6.2/aiohttp/signals.pyi0000644000175100001650000000050413547410117017317 0ustar vstsdocker00000000000000from typing import Any, Generic, TypeVar from aiohttp.frozenlist import FrozenList __all__ = ('Signal',) _T = TypeVar('_T') class Signal(FrozenList[_T], Generic[_T]): def __init__(self, owner: Any) -> None: ... def __repr__(self) -> str: ... async def send(self, *args: Any, **kwargs: Any) -> None: ... 
aiohttp-3.6.2/aiohttp/streams.py0000644000175100001650000004775313547410117017205 0ustar vstsdocker00000000000000import asyncio import collections import warnings from typing import List # noqa from typing import Awaitable, Callable, Generic, Optional, Tuple, TypeVar from .base_protocol import BaseProtocol from .helpers import BaseTimerContext, set_exception, set_result from .log import internal_logger try: # pragma: no cover from typing import Deque # noqa except ImportError: from typing_extensions import Deque # noqa __all__ = ( 'EMPTY_PAYLOAD', 'EofStream', 'StreamReader', 'DataQueue', 'FlowControlDataQueue') DEFAULT_LIMIT = 2 ** 16 _T = TypeVar('_T') class EofStream(Exception): """eof stream indication.""" class AsyncStreamIterator(Generic[_T]): def __init__(self, read_func: Callable[[], Awaitable[_T]]) -> None: self.read_func = read_func def __aiter__(self) -> 'AsyncStreamIterator[_T]': return self async def __anext__(self) -> _T: try: rv = await self.read_func() except EofStream: raise StopAsyncIteration # NOQA if rv == b'': raise StopAsyncIteration # NOQA return rv class ChunkTupleAsyncStreamIterator: def __init__(self, stream: 'StreamReader') -> None: self._stream = stream def __aiter__(self) -> 'ChunkTupleAsyncStreamIterator': return self async def __anext__(self) -> Tuple[bytes, bool]: rv = await self._stream.readchunk() if rv == (b'', False): raise StopAsyncIteration # NOQA return rv class AsyncStreamReaderMixin: def __aiter__(self) -> AsyncStreamIterator[bytes]: return AsyncStreamIterator(self.readline) # type: ignore def iter_chunked(self, n: int) -> AsyncStreamIterator[bytes]: """Returns an asynchronous iterator that yields chunks of size n. Python-3.5 available for Python 3.5+ only """ return AsyncStreamIterator(lambda: self.read(n)) # type: ignore def iter_any(self) -> AsyncStreamIterator[bytes]: """Returns an asynchronous iterator that yields all the available data as soon as it is received Python-3.5 available for Python 3.5+ only """ return AsyncStreamIterator(self.readany) # type: ignore def iter_chunks(self) -> ChunkTupleAsyncStreamIterator: """Returns an asynchronous iterator that yields chunks of data as they are received by the server. The yielded objects are tuples of (bytes, bool) as returned by the StreamReader.readchunk method. Python-3.5 available for Python 3.5+ only """ return ChunkTupleAsyncStreamIterator(self) # type: ignore class StreamReader(AsyncStreamReaderMixin): """An enhancement of asyncio.StreamReader. Supports asynchronous iteration by line, chunk or as available:: async for line in reader: ... async for chunk in reader.iter_chunked(1024): ... async for slice in reader.iter_any(): ... 
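Chunk boundaries of a chunked HTTP body can also be observed with ``iter_chunks()``,
which yields ``(data, end_of_http_chunk)`` tuples (a usage sketch; ``reader`` is an
attached instance)::

    async for data, end_of_http_chunk in reader.iter_chunks():
        ...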
""" total_bytes = 0 def __init__(self, protocol: BaseProtocol, *, limit: int=DEFAULT_LIMIT, timer: Optional[BaseTimerContext]=None, loop: Optional[asyncio.AbstractEventLoop]=None) -> None: self._protocol = protocol self._low_water = limit self._high_water = limit * 2 if loop is None: loop = asyncio.get_event_loop() self._loop = loop self._size = 0 self._cursor = 0 self._http_chunk_splits = None # type: Optional[List[int]] self._buffer = collections.deque() # type: Deque[bytes] self._buffer_offset = 0 self._eof = False self._waiter = None # type: Optional[asyncio.Future[None]] self._eof_waiter = None # type: Optional[asyncio.Future[None]] self._exception = None # type: Optional[BaseException] self._timer = timer self._eof_callbacks = [] # type: List[Callable[[], None]] def __repr__(self) -> str: info = [self.__class__.__name__] if self._size: info.append('%d bytes' % self._size) if self._eof: info.append('eof') if self._low_water != DEFAULT_LIMIT: info.append('low=%d high=%d' % (self._low_water, self._high_water)) if self._waiter: info.append('w=%r' % self._waiter) if self._exception: info.append('e=%r' % self._exception) return '<%s>' % ' '.join(info) def exception(self) -> Optional[BaseException]: return self._exception def set_exception(self, exc: BaseException) -> None: self._exception = exc self._eof_callbacks.clear() waiter = self._waiter if waiter is not None: self._waiter = None set_exception(waiter, exc) waiter = self._eof_waiter if waiter is not None: self._eof_waiter = None set_exception(waiter, exc) def on_eof(self, callback: Callable[[], None]) -> None: if self._eof: try: callback() except Exception: internal_logger.exception('Exception in eof callback') else: self._eof_callbacks.append(callback) def feed_eof(self) -> None: self._eof = True waiter = self._waiter if waiter is not None: self._waiter = None set_result(waiter, None) waiter = self._eof_waiter if waiter is not None: self._eof_waiter = None set_result(waiter, None) for cb in self._eof_callbacks: try: cb() except Exception: internal_logger.exception('Exception in eof callback') self._eof_callbacks.clear() def is_eof(self) -> bool: """Return True if 'feed_eof' was called.""" return self._eof def at_eof(self) -> bool: """Return True if the buffer is empty and 'feed_eof' was called.""" return self._eof and not self._buffer async def wait_eof(self) -> None: if self._eof: return assert self._eof_waiter is None self._eof_waiter = self._loop.create_future() try: await self._eof_waiter finally: self._eof_waiter = None def unread_data(self, data: bytes) -> None: """ rollback reading some data from stream, inserting it to buffer head. 
""" warnings.warn("unread_data() is deprecated " "and will be removed in future releases (#3260)", DeprecationWarning, stacklevel=2) if not data: return if self._buffer_offset: self._buffer[0] = self._buffer[0][self._buffer_offset:] self._buffer_offset = 0 self._size += len(data) self._cursor -= len(data) self._buffer.appendleft(data) self._eof_counter = 0 # TODO: size is ignored, remove the param later def feed_data(self, data: bytes, size: int=0) -> None: assert not self._eof, 'feed_data after feed_eof' if not data: return self._size += len(data) self._buffer.append(data) self.total_bytes += len(data) waiter = self._waiter if waiter is not None: self._waiter = None set_result(waiter, None) if (self._size > self._high_water and not self._protocol._reading_paused): self._protocol.pause_reading() def begin_http_chunk_receiving(self) -> None: if self._http_chunk_splits is None: if self.total_bytes: raise RuntimeError("Called begin_http_chunk_receiving when" "some data was already fed") self._http_chunk_splits = [] def end_http_chunk_receiving(self) -> None: if self._http_chunk_splits is None: raise RuntimeError("Called end_chunk_receiving without calling " "begin_chunk_receiving first") # self._http_chunk_splits contains logical byte offsets from start of # the body transfer. Each offset is the offset of the end of a chunk. # "Logical" means bytes, accessible for a user. # If no chunks containig logical data were received, current position # is difinitely zero. pos = self._http_chunk_splits[-1] if self._http_chunk_splits else 0 if self.total_bytes == pos: # We should not add empty chunks here. So we check for that. # Note, when chunked + gzip is used, we can receive a chunk # of compressed data, but that data may not be enough for gzip FSM # to yield any uncompressed data. That's why current position may # not change after receiving a chunk. return self._http_chunk_splits.append(self.total_bytes) # wake up readchunk when end of http chunk received waiter = self._waiter if waiter is not None: self._waiter = None set_result(waiter, None) async def _wait(self, func_name: str) -> None: # StreamReader uses a future to link the protocol feed_data() method # to a read coroutine. Running two read coroutines at the same time # would have an unexpected behaviour. It would not possible to know # which coroutine would get the next data. if self._waiter is not None: raise RuntimeError('%s() called while another coroutine is ' 'already waiting for incoming data' % func_name) waiter = self._waiter = self._loop.create_future() try: if self._timer: with self._timer: await waiter else: await waiter finally: self._waiter = None async def readline(self) -> bytes: if self._exception is not None: raise self._exception line = [] line_size = 0 not_enough = True while not_enough: while self._buffer and not_enough: offset = self._buffer_offset ichar = self._buffer[0].find(b'\n', offset) + 1 # Read from current offset to found b'\n' or to the end. data = self._read_nowait_chunk(ichar - offset if ichar else -1) line.append(data) line_size += len(data) if ichar: not_enough = False if line_size > self._high_water: raise ValueError('Line is too long') if self._eof: break if not_enough: await self._wait('readline') return b''.join(line) async def read(self, n: int=-1) -> bytes: if self._exception is not None: raise self._exception # migration problem; with DataQueue you have to catch # EofStream exception, so common way is to run payload.read() inside # infinite loop. 
what can cause real infinite loop with StreamReader # lets keep this code one major release. if __debug__: if self._eof and not self._buffer: self._eof_counter = getattr(self, '_eof_counter', 0) + 1 if self._eof_counter > 5: internal_logger.warning( 'Multiple access to StreamReader in eof state, ' 'might be infinite loop.', stack_info=True) if not n: return b'' if n < 0: # This used to just loop creating a new waiter hoping to # collect everything in self._buffer, but that would # deadlock if the subprocess sends more than self.limit # bytes. So just call self.readany() until EOF. blocks = [] while True: block = await self.readany() if not block: break blocks.append(block) return b''.join(blocks) # TODO: should be `if` instead of `while` # because waiter maybe triggered on chunk end, # without feeding any data while not self._buffer and not self._eof: await self._wait('read') return self._read_nowait(n) async def readany(self) -> bytes: if self._exception is not None: raise self._exception # TODO: should be `if` instead of `while` # because waiter maybe triggered on chunk end, # without feeding any data while not self._buffer and not self._eof: await self._wait('readany') return self._read_nowait(-1) async def readchunk(self) -> Tuple[bytes, bool]: """Returns a tuple of (data, end_of_http_chunk). When chunked transfer encoding is used, end_of_http_chunk is a boolean indicating if the end of the data corresponds to the end of a HTTP chunk , otherwise it is always False. """ while True: if self._exception is not None: raise self._exception while self._http_chunk_splits: pos = self._http_chunk_splits.pop(0) if pos == self._cursor: return (b"", True) if pos > self._cursor: return (self._read_nowait(pos-self._cursor), True) internal_logger.warning('Skipping HTTP chunk end due to data ' 'consumption beyond chunk boundary') if self._buffer: return (self._read_nowait_chunk(-1), False) # return (self._read_nowait(-1), False) if self._eof: # Special case for signifying EOF. # (b'', True) is not a final return value actually. return (b'', False) await self._wait('readchunk') async def readexactly(self, n: int) -> bytes: if self._exception is not None: raise self._exception blocks = [] # type: List[bytes] while n > 0: block = await self.read(n) if not block: partial = b''.join(blocks) raise asyncio.IncompleteReadError( partial, len(partial) + n) blocks.append(block) n -= len(block) return b''.join(blocks) def read_nowait(self, n: int=-1) -> bytes: # default was changed to be consistent with .read(-1) # # I believe the most users don't know about the method and # they are not affected. 
if self._exception is not None: raise self._exception if self._waiter and not self._waiter.done(): raise RuntimeError( 'Called while some coroutine is waiting for incoming data.') return self._read_nowait(n) def _read_nowait_chunk(self, n: int) -> bytes: first_buffer = self._buffer[0] offset = self._buffer_offset if n != -1 and len(first_buffer) - offset > n: data = first_buffer[offset:offset + n] self._buffer_offset += n elif offset: self._buffer.popleft() data = first_buffer[offset:] self._buffer_offset = 0 else: data = self._buffer.popleft() self._size -= len(data) self._cursor += len(data) chunk_splits = self._http_chunk_splits # Prevent memory leak: drop useless chunk splits while chunk_splits and chunk_splits[0] < self._cursor: chunk_splits.pop(0) if self._size < self._low_water and self._protocol._reading_paused: self._protocol.resume_reading() return data def _read_nowait(self, n: int) -> bytes: """ Read not more than n bytes, or whole buffer is n == -1 """ chunks = [] while self._buffer: chunk = self._read_nowait_chunk(n) chunks.append(chunk) if n != -1: n -= len(chunk) if n == 0: break return b''.join(chunks) if chunks else b'' class EmptyStreamReader(AsyncStreamReaderMixin): def exception(self) -> Optional[BaseException]: return None def set_exception(self, exc: BaseException) -> None: pass def on_eof(self, callback: Callable[[], None]) -> None: try: callback() except Exception: internal_logger.exception('Exception in eof callback') def feed_eof(self) -> None: pass def is_eof(self) -> bool: return True def at_eof(self) -> bool: return True async def wait_eof(self) -> None: return def feed_data(self, data: bytes, n: int=0) -> None: pass async def readline(self) -> bytes: return b'' async def read(self, n: int=-1) -> bytes: return b'' async def readany(self) -> bytes: return b'' async def readchunk(self) -> Tuple[bytes, bool]: return (b'', True) async def readexactly(self, n: int) -> bytes: raise asyncio.IncompleteReadError(b'', n) def read_nowait(self) -> bytes: return b'' EMPTY_PAYLOAD = EmptyStreamReader() class DataQueue(Generic[_T]): """DataQueue is a general-purpose blocking queue with one reader.""" def __init__(self, loop: asyncio.AbstractEventLoop) -> None: self._loop = loop self._eof = False self._waiter = None # type: Optional[asyncio.Future[None]] self._exception = None # type: Optional[BaseException] self._size = 0 self._buffer = collections.deque() # type: Deque[Tuple[_T, int]] def __len__(self) -> int: return len(self._buffer) def is_eof(self) -> bool: return self._eof def at_eof(self) -> bool: return self._eof and not self._buffer def exception(self) -> Optional[BaseException]: return self._exception def set_exception(self, exc: BaseException) -> None: self._eof = True self._exception = exc waiter = self._waiter if waiter is not None: self._waiter = None set_exception(waiter, exc) def feed_data(self, data: _T, size: int=0) -> None: self._size += size self._buffer.append((data, size)) waiter = self._waiter if waiter is not None: self._waiter = None set_result(waiter, None) def feed_eof(self) -> None: self._eof = True waiter = self._waiter if waiter is not None: self._waiter = None set_result(waiter, None) async def read(self) -> _T: if not self._buffer and not self._eof: assert not self._waiter self._waiter = self._loop.create_future() try: await self._waiter except (asyncio.CancelledError, asyncio.TimeoutError): self._waiter = None raise if self._buffer: data, size = self._buffer.popleft() self._size -= size return data else: if self._exception is not None: raise 
self._exception else: raise EofStream def __aiter__(self) -> AsyncStreamIterator[_T]: return AsyncStreamIterator(self.read) class FlowControlDataQueue(DataQueue[_T]): """FlowControlDataQueue resumes and pauses an underlying stream. It is a destination for parsed data.""" def __init__(self, protocol: BaseProtocol, *, limit: int=DEFAULT_LIMIT, loop: asyncio.AbstractEventLoop) -> None: super().__init__(loop=loop) self._protocol = protocol self._limit = limit * 2 def feed_data(self, data: _T, size: int=0) -> None: super().feed_data(data, size) if self._size > self._limit and not self._protocol._reading_paused: self._protocol.pause_reading() async def read(self) -> _T: try: return await super().read() finally: if self._size < self._limit and self._protocol._reading_paused: self._protocol.resume_reading() aiohttp-3.6.2/aiohttp/tcp_helpers.py0000644000175100001650000000313713547410117020023 0ustar vstsdocker00000000000000"""Helper methods to tune a TCP connection""" import asyncio import socket from contextlib import suppress from typing import Optional # noqa __all__ = ('tcp_keepalive', 'tcp_nodelay', 'tcp_cork') if hasattr(socket, 'TCP_CORK'): # pragma: no cover CORK = socket.TCP_CORK # type: Optional[int] elif hasattr(socket, 'TCP_NOPUSH'): # pragma: no cover CORK = socket.TCP_NOPUSH # type: ignore else: # pragma: no cover CORK = None if hasattr(socket, 'SO_KEEPALIVE'): def tcp_keepalive(transport: asyncio.Transport) -> None: sock = transport.get_extra_info('socket') if sock is not None: sock.setsockopt(socket.SOL_SOCKET, socket.SO_KEEPALIVE, 1) else: def tcp_keepalive( transport: asyncio.Transport) -> None: # pragma: no cover pass def tcp_nodelay(transport: asyncio.Transport, value: bool) -> None: sock = transport.get_extra_info('socket') if sock is None: return if sock.family not in (socket.AF_INET, socket.AF_INET6): return value = bool(value) # socket may be closed already, on windows OSError get raised with suppress(OSError): sock.setsockopt( socket.IPPROTO_TCP, socket.TCP_NODELAY, value) def tcp_cork(transport: asyncio.Transport, value: bool) -> None: sock = transport.get_extra_info('socket') if CORK is None: return if sock is None: return if sock.family not in (socket.AF_INET, socket.AF_INET6): return value = bool(value) with suppress(OSError): sock.setsockopt( socket.IPPROTO_TCP, CORK, value) aiohttp-3.6.2/aiohttp/test_utils.py0000644000175100001650000005057413547410117017721 0ustar vstsdocker00000000000000"""Utilities shared by tests.""" import asyncio import contextlib import functools import gc import inspect import socket import sys import unittest from abc import ABC, abstractmethod from types import TracebackType from typing import ( # noqa TYPE_CHECKING, Any, Callable, Iterator, List, Optional, Type, Union, ) from unittest import mock from multidict import CIMultiDict, CIMultiDictProxy from yarl import URL import aiohttp from aiohttp.client import ( ClientResponse, _RequestContextManager, _WSRequestContextManager, ) from . 
import ClientSession, hdrs from .abc import AbstractCookieJar from .client_reqrep import ClientResponse # noqa from .client_ws import ClientWebSocketResponse # noqa from .helpers import sentinel from .http import HttpVersion, RawRequestMessage from .signals import Signal from .web import ( Application, AppRunner, BaseRunner, Request, Server, ServerRunner, SockSite, UrlMappingMatchInfo, ) from .web_protocol import _RequestHandler if TYPE_CHECKING: # pragma: no cover from ssl import SSLContext else: SSLContext = None def get_unused_port_socket(host: str) -> socket.socket: return get_port_socket(host, 0) def get_port_socket(host: str, port: int) -> socket.socket: s = socket.socket(socket.AF_INET, socket.SOCK_STREAM) s.bind((host, port)) return s def unused_port() -> int: """Return a port that is unused on the current host.""" with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as s: s.bind(('127.0.0.1', 0)) return s.getsockname()[1] class BaseTestServer(ABC): __test__ = False def __init__(self, *, scheme: Union[str, object]=sentinel, loop: Optional[asyncio.AbstractEventLoop]=None, host: str='127.0.0.1', port: Optional[int]=None, skip_url_asserts: bool=False, **kwargs: Any) -> None: self._loop = loop self.runner = None # type: Optional[BaseRunner] self._root = None # type: Optional[URL] self.host = host self.port = port self._closed = False self.scheme = scheme self.skip_url_asserts = skip_url_asserts async def start_server(self, loop: Optional[asyncio.AbstractEventLoop]=None, **kwargs: Any) -> None: if self.runner: return self._loop = loop self._ssl = kwargs.pop('ssl', None) self.runner = await self._make_runner(**kwargs) await self.runner.setup() if not self.port: self.port = 0 _sock = get_port_socket(self.host, self.port) self.host, self.port = _sock.getsockname()[:2] site = SockSite(self.runner, sock=_sock, ssl_context=self._ssl) await site.start() server = site._server assert server is not None sockets = server.sockets assert sockets is not None self.port = sockets[0].getsockname()[1] if self.scheme is sentinel: if self._ssl: scheme = 'https' else: scheme = 'http' self.scheme = scheme self._root = URL('{}://{}:{}'.format(self.scheme, self.host, self.port)) @abstractmethod # pragma: no cover async def _make_runner(self, **kwargs: Any) -> BaseRunner: pass def make_url(self, path: str) -> URL: assert self._root is not None url = URL(path) if not self.skip_url_asserts: assert not url.is_absolute() return self._root.join(url) else: return URL(str(self._root) + path) @property def started(self) -> bool: return self.runner is not None @property def closed(self) -> bool: return self._closed @property def handler(self) -> Server: # for backward compatibility # web.Server instance runner = self.runner assert runner is not None assert runner.server is not None return runner.server async def close(self) -> None: """Close all fixtures created by the test client. After that point, the TestClient is no longer usable. This is an idempotent function: running close multiple times will not have any additional effects. close is also run when the object is garbage collected, and on exit when used as a context manager. 
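For illustration (``app`` is an assumed ``web.Application``), the asynchronous
context manager form starts the server and guarantees this cleanup::

    async with TestServer(app) as server:
        url = server.make_url('/')
        # ... issue requests against `url` ...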
""" if self.started and not self.closed: assert self.runner is not None await self.runner.cleanup() self._root = None self.port = None self._closed = True def __enter__(self) -> None: raise TypeError("Use async with instead") def __exit__(self, exc_type: Optional[Type[BaseException]], exc_value: Optional[BaseException], traceback: Optional[TracebackType]) -> None: # __exit__ should exist in pair with __enter__ but never executed pass # pragma: no cover async def __aenter__(self) -> 'BaseTestServer': await self.start_server(loop=self._loop) return self async def __aexit__(self, exc_type: Optional[Type[BaseException]], exc_value: Optional[BaseException], traceback: Optional[TracebackType]) -> None: await self.close() class TestServer(BaseTestServer): def __init__(self, app: Application, *, scheme: Union[str, object]=sentinel, host: str='127.0.0.1', port: Optional[int]=None, **kwargs: Any): self.app = app super().__init__(scheme=scheme, host=host, port=port, **kwargs) async def _make_runner(self, **kwargs: Any) -> BaseRunner: return AppRunner(self.app, **kwargs) class RawTestServer(BaseTestServer): def __init__(self, handler: _RequestHandler, *, scheme: Union[str, object]=sentinel, host: str='127.0.0.1', port: Optional[int]=None, **kwargs: Any) -> None: self._handler = handler super().__init__(scheme=scheme, host=host, port=port, **kwargs) async def _make_runner(self, debug: bool=True, **kwargs: Any) -> ServerRunner: srv = Server( self._handler, loop=self._loop, debug=debug, **kwargs) return ServerRunner(srv, debug=debug, **kwargs) class TestClient: """ A test client implementation. To write functional tests for aiohttp based servers. """ __test__ = False def __init__(self, server: BaseTestServer, *, cookie_jar: Optional[AbstractCookieJar]=None, loop: Optional[asyncio.AbstractEventLoop]=None, **kwargs: Any) -> None: if not isinstance(server, BaseTestServer): raise TypeError("server must be TestServer " "instance, found type: %r" % type(server)) self._server = server self._loop = loop if cookie_jar is None: cookie_jar = aiohttp.CookieJar(unsafe=True, loop=loop) self._session = ClientSession(loop=loop, cookie_jar=cookie_jar, **kwargs) self._closed = False self._responses = [] # type: List[ClientResponse] self._websockets = [] # type: List[ClientWebSocketResponse] async def start_server(self) -> None: await self._server.start_server(loop=self._loop) @property def host(self) -> str: return self._server.host @property def port(self) -> Optional[int]: return self._server.port @property def server(self) -> BaseTestServer: return self._server @property def app(self) -> Application: return getattr(self._server, "app", None) @property def session(self) -> ClientSession: """An internal aiohttp.ClientSession. Unlike the methods on the TestClient, client session requests do not automatically include the host in the url queried, and will require an absolute path to the resource. """ return self._session def make_url(self, path: str) -> URL: return self._server.make_url(path) async def _request(self, method: str, path: str, **kwargs: Any) -> ClientResponse: resp = await self._session.request( method, self.make_url(path), **kwargs ) # save it to close later self._responses.append(resp) return resp def request(self, method: str, path: str, **kwargs: Any) -> _RequestContextManager: """Routes a request to tested http server. The interface is identical to aiohttp.ClientSession.request, except the loop kwarg is overridden by the instance used by the test server. 
""" return _RequestContextManager( self._request(method, path, **kwargs) ) def get(self, path: str, **kwargs: Any) -> _RequestContextManager: """Perform an HTTP GET request.""" return _RequestContextManager( self._request(hdrs.METH_GET, path, **kwargs) ) def post(self, path: str, **kwargs: Any) -> _RequestContextManager: """Perform an HTTP POST request.""" return _RequestContextManager( self._request(hdrs.METH_POST, path, **kwargs) ) def options(self, path: str, **kwargs: Any) -> _RequestContextManager: """Perform an HTTP OPTIONS request.""" return _RequestContextManager( self._request(hdrs.METH_OPTIONS, path, **kwargs) ) def head(self, path: str, **kwargs: Any) -> _RequestContextManager: """Perform an HTTP HEAD request.""" return _RequestContextManager( self._request(hdrs.METH_HEAD, path, **kwargs) ) def put(self, path: str, **kwargs: Any) -> _RequestContextManager: """Perform an HTTP PUT request.""" return _RequestContextManager( self._request(hdrs.METH_PUT, path, **kwargs) ) def patch(self, path: str, **kwargs: Any) -> _RequestContextManager: """Perform an HTTP PATCH request.""" return _RequestContextManager( self._request(hdrs.METH_PATCH, path, **kwargs) ) def delete(self, path: str, **kwargs: Any) -> _RequestContextManager: """Perform an HTTP PATCH request.""" return _RequestContextManager( self._request(hdrs.METH_DELETE, path, **kwargs) ) def ws_connect(self, path: str, **kwargs: Any) -> _WSRequestContextManager: """Initiate websocket connection. The api corresponds to aiohttp.ClientSession.ws_connect. """ return _WSRequestContextManager( self._ws_connect(path, **kwargs) ) async def _ws_connect(self, path: str, **kwargs: Any) -> ClientWebSocketResponse: ws = await self._session.ws_connect( self.make_url(path), **kwargs) self._websockets.append(ws) return ws async def close(self) -> None: """Close all fixtures created by the test client. After that point, the TestClient is no longer usable. This is an idempotent function: running close multiple times will not have any additional effects. close is also run on exit when used as a(n) (asynchronous) context manager. """ if not self._closed: for resp in self._responses: resp.close() for ws in self._websockets: await ws.close() await self._session.close() await self._server.close() self._closed = True def __enter__(self) -> None: raise TypeError("Use async with instead") def __exit__(self, exc_type: Optional[Type[BaseException]], exc: Optional[BaseException], tb: Optional[TracebackType]) -> None: # __exit__ should exist in pair with __enter__ but never executed pass # pragma: no cover async def __aenter__(self) -> 'TestClient': await self.start_server() return self async def __aexit__(self, exc_type: Optional[Type[BaseException]], exc: Optional[BaseException], tb: Optional[TracebackType]) -> None: await self.close() class AioHTTPTestCase(unittest.TestCase): """A base class to allow for unittest web applications using aiohttp. Provides the following: * self.client (aiohttp.test_utils.TestClient): an aiohttp test client. * self.loop (asyncio.BaseEventLoop): the event loop in which the application and server are running. * self.app (aiohttp.web.Application): the application returned by self.get_application() Note that the TestClient's methods are asynchronous: you have to execute function on the test client using asynchronous methods. """ async def get_application(self) -> Application: """ This method should be overridden to return the aiohttp.web.Application object to test. 
""" return self.get_app() def get_app(self) -> Application: """Obsolete method used to constructing web application. Use .get_application() coroutine instead """ raise RuntimeError("Did you forget to define get_application()?") def setUp(self) -> None: self.loop = setup_test_loop() self.app = self.loop.run_until_complete(self.get_application()) self.server = self.loop.run_until_complete(self.get_server(self.app)) self.client = self.loop.run_until_complete( self.get_client(self.server)) self.loop.run_until_complete(self.client.start_server()) self.loop.run_until_complete(self.setUpAsync()) async def setUpAsync(self) -> None: pass def tearDown(self) -> None: self.loop.run_until_complete(self.tearDownAsync()) self.loop.run_until_complete(self.client.close()) teardown_test_loop(self.loop) async def tearDownAsync(self) -> None: pass async def get_server(self, app: Application) -> TestServer: """Return a TestServer instance.""" return TestServer(app, loop=self.loop) async def get_client(self, server: TestServer) -> TestClient: """Return a TestClient instance.""" return TestClient(server, loop=self.loop) def unittest_run_loop(func: Any, *args: Any, **kwargs: Any) -> Any: """A decorator dedicated to use with asynchronous methods of an AioHTTPTestCase. Handles executing an asynchronous function, using the self.loop of the AioHTTPTestCase. """ @functools.wraps(func, *args, **kwargs) def new_func(self: Any, *inner_args: Any, **inner_kwargs: Any) -> Any: return self.loop.run_until_complete( func(self, *inner_args, **inner_kwargs)) return new_func _LOOP_FACTORY = Callable[[], asyncio.AbstractEventLoop] @contextlib.contextmanager def loop_context(loop_factory: _LOOP_FACTORY=asyncio.new_event_loop, fast: bool=False) -> Iterator[asyncio.AbstractEventLoop]: """A contextmanager that creates an event_loop, for test purposes. Handles the creation and cleanup of a test loop. """ loop = setup_test_loop(loop_factory) yield loop teardown_test_loop(loop, fast=fast) def setup_test_loop( loop_factory: _LOOP_FACTORY=asyncio.new_event_loop ) -> asyncio.AbstractEventLoop: """Create and return an asyncio.BaseEventLoop instance. The caller should also call teardown_test_loop, once they are done with the loop. """ loop = loop_factory() try: module = loop.__class__.__module__ skip_watcher = 'uvloop' in module except AttributeError: # pragma: no cover # Just in case skip_watcher = True asyncio.set_event_loop(loop) if sys.platform != "win32" and not skip_watcher: policy = asyncio.get_event_loop_policy() watcher = asyncio.SafeChildWatcher() # type: ignore watcher.attach_loop(loop) with contextlib.suppress(NotImplementedError): policy.set_child_watcher(watcher) return loop def teardown_test_loop(loop: asyncio.AbstractEventLoop, fast: bool=False) -> None: """Teardown and cleanup an event_loop created by setup_test_loop. 
""" closed = loop.is_closed() if not closed: loop.call_soon(loop.stop) loop.run_forever() loop.close() if not fast: gc.collect() asyncio.set_event_loop(None) def _create_app_mock() -> mock.MagicMock: def get_dict(app: Any, key: str) -> Any: return app.__app_dict[key] def set_dict(app: Any, key: str, value: Any) -> None: app.__app_dict[key] = value app = mock.MagicMock() app.__app_dict = {} app.__getitem__ = get_dict app.__setitem__ = set_dict app._debug = False app.on_response_prepare = Signal(app) app.on_response_prepare.freeze() return app def _create_transport(sslcontext: Optional[SSLContext]=None) -> mock.Mock: transport = mock.Mock() def get_extra_info(key: str) -> Optional[SSLContext]: if key == 'sslcontext': return sslcontext else: return None transport.get_extra_info.side_effect = get_extra_info return transport def make_mocked_request(method: str, path: str, headers: Any=None, *, match_info: Any=sentinel, version: HttpVersion=HttpVersion(1, 1), closing: bool=False, app: Any=None, writer: Any=sentinel, protocol: Any=sentinel, transport: Any=sentinel, payload: Any=sentinel, sslcontext: Optional[SSLContext]=None, client_max_size: int=1024**2, loop: Any=...) -> Any: """Creates mocked web.Request testing purposes. Useful in unit tests, when spinning full web server is overkill or specific conditions and errors are hard to trigger. """ task = mock.Mock() if loop is ...: loop = mock.Mock() loop.create_future.return_value = () if version < HttpVersion(1, 1): closing = True if headers: headers = CIMultiDictProxy(CIMultiDict(headers)) raw_hdrs = tuple( (k.encode('utf-8'), v.encode('utf-8')) for k, v in headers.items()) else: headers = CIMultiDictProxy(CIMultiDict()) raw_hdrs = () chunked = 'chunked' in headers.get(hdrs.TRANSFER_ENCODING, '').lower() message = RawRequestMessage( method, path, version, headers, raw_hdrs, closing, False, False, chunked, URL(path)) if app is None: app = _create_app_mock() if transport is sentinel: transport = _create_transport(sslcontext) if protocol is sentinel: protocol = mock.Mock() protocol.transport = transport if writer is sentinel: writer = mock.Mock() writer.write_headers = make_mocked_coro(None) writer.write = make_mocked_coro(None) writer.write_eof = make_mocked_coro(None) writer.drain = make_mocked_coro(None) writer.transport = transport protocol.transport = transport protocol.writer = writer if payload is sentinel: payload = mock.Mock() req = Request(message, payload, protocol, writer, task, loop, client_max_size=client_max_size) match_info = UrlMappingMatchInfo( {} if match_info is sentinel else match_info, mock.Mock()) match_info.add_app(app) req._match_info = match_info return req def make_mocked_coro(return_value: Any=sentinel, raise_exception: Any=sentinel) -> Any: """Creates a coroutine mock.""" async def mock_coro(*args: Any, **kwargs: Any) -> Any: if raise_exception is not sentinel: raise raise_exception if not inspect.isawaitable(return_value): return return_value await return_value return mock.Mock(wraps=mock_coro) aiohttp-3.6.2/aiohttp/tracing.py0000644000175100001650000003205113547410117017137 0ustar vstsdocker00000000000000from types import SimpleNamespace from typing import TYPE_CHECKING, Awaitable, Callable, Type, Union import attr from multidict import CIMultiDict # noqa from yarl import URL from .client_reqrep import ClientResponse from .signals import Signal if TYPE_CHECKING: # pragma: no cover from .client import ClientSession # noqa _SignalArgs = Union[ 'TraceRequestStartParams', 'TraceRequestEndParams', 
'TraceRequestExceptionParams', 'TraceConnectionQueuedStartParams', 'TraceConnectionQueuedEndParams', 'TraceConnectionCreateStartParams', 'TraceConnectionCreateEndParams', 'TraceConnectionReuseconnParams', 'TraceDnsResolveHostStartParams', 'TraceDnsResolveHostEndParams', 'TraceDnsCacheHitParams', 'TraceDnsCacheMissParams', 'TraceRequestRedirectParams', 'TraceRequestChunkSentParams', 'TraceResponseChunkReceivedParams', ] _Signal = Signal[Callable[[ClientSession, SimpleNamespace, _SignalArgs], Awaitable[None]]] else: _Signal = Signal __all__ = ( 'TraceConfig', 'TraceRequestStartParams', 'TraceRequestEndParams', 'TraceRequestExceptionParams', 'TraceConnectionQueuedStartParams', 'TraceConnectionQueuedEndParams', 'TraceConnectionCreateStartParams', 'TraceConnectionCreateEndParams', 'TraceConnectionReuseconnParams', 'TraceDnsResolveHostStartParams', 'TraceDnsResolveHostEndParams', 'TraceDnsCacheHitParams', 'TraceDnsCacheMissParams', 'TraceRequestRedirectParams', 'TraceRequestChunkSentParams', 'TraceResponseChunkReceivedParams', ) class TraceConfig: """First-class used to trace requests launched via ClientSession objects.""" def __init__( self, trace_config_ctx_factory: Type[SimpleNamespace]=SimpleNamespace ) -> None: self._on_request_start = Signal(self) # type: _Signal self._on_request_chunk_sent = Signal(self) # type: _Signal self._on_response_chunk_received = Signal(self) # type: _Signal self._on_request_end = Signal(self) # type: _Signal self._on_request_exception = Signal(self) # type: _Signal self._on_request_redirect = Signal(self) # type: _Signal self._on_connection_queued_start = Signal(self) # type: _Signal self._on_connection_queued_end = Signal(self) # type: _Signal self._on_connection_create_start = Signal(self) # type: _Signal self._on_connection_create_end = Signal(self) # type: _Signal self._on_connection_reuseconn = Signal(self) # type: _Signal self._on_dns_resolvehost_start = Signal(self) # type: _Signal self._on_dns_resolvehost_end = Signal(self) # type: _Signal self._on_dns_cache_hit = Signal(self) # type: _Signal self._on_dns_cache_miss = Signal(self) # type: _Signal self._trace_config_ctx_factory = trace_config_ctx_factory # type: Type[SimpleNamespace] # noqa def trace_config_ctx( self, trace_request_ctx: SimpleNamespace=None ) -> SimpleNamespace: # noqa """ Return a new trace_config_ctx instance """ return self._trace_config_ctx_factory( trace_request_ctx=trace_request_ctx) def freeze(self) -> None: self._on_request_start.freeze() self._on_request_chunk_sent.freeze() self._on_response_chunk_received.freeze() self._on_request_end.freeze() self._on_request_exception.freeze() self._on_request_redirect.freeze() self._on_connection_queued_start.freeze() self._on_connection_queued_end.freeze() self._on_connection_create_start.freeze() self._on_connection_create_end.freeze() self._on_connection_reuseconn.freeze() self._on_dns_resolvehost_start.freeze() self._on_dns_resolvehost_end.freeze() self._on_dns_cache_hit.freeze() self._on_dns_cache_miss.freeze() @property def on_request_start(self) -> _Signal: return self._on_request_start @property def on_request_chunk_sent(self) -> _Signal: return self._on_request_chunk_sent @property def on_response_chunk_received(self) -> _Signal: return self._on_response_chunk_received @property def on_request_end(self) -> _Signal: return self._on_request_end @property def on_request_exception(self) -> _Signal: return self._on_request_exception @property def on_request_redirect(self) -> _Signal: return self._on_request_redirect @property def 
on_connection_queued_start(self) -> _Signal: return self._on_connection_queued_start @property def on_connection_queued_end(self) -> _Signal: return self._on_connection_queued_end @property def on_connection_create_start(self) -> _Signal: return self._on_connection_create_start @property def on_connection_create_end(self) -> _Signal: return self._on_connection_create_end @property def on_connection_reuseconn(self) -> _Signal: return self._on_connection_reuseconn @property def on_dns_resolvehost_start(self) -> _Signal: return self._on_dns_resolvehost_start @property def on_dns_resolvehost_end(self) -> _Signal: return self._on_dns_resolvehost_end @property def on_dns_cache_hit(self) -> _Signal: return self._on_dns_cache_hit @property def on_dns_cache_miss(self) -> _Signal: return self._on_dns_cache_miss @attr.s(frozen=True, slots=True) class TraceRequestStartParams: """ Parameters sent by the `on_request_start` signal""" method = attr.ib(type=str) url = attr.ib(type=URL) headers = attr.ib(type='CIMultiDict[str]') @attr.s(frozen=True, slots=True) class TraceRequestChunkSentParams: """ Parameters sent by the `on_request_chunk_sent` signal""" chunk = attr.ib(type=bytes) @attr.s(frozen=True, slots=True) class TraceResponseChunkReceivedParams: """ Parameters sent by the `on_response_chunk_received` signal""" chunk = attr.ib(type=bytes) @attr.s(frozen=True, slots=True) class TraceRequestEndParams: """ Parameters sent by the `on_request_end` signal""" method = attr.ib(type=str) url = attr.ib(type=URL) headers = attr.ib(type='CIMultiDict[str]') response = attr.ib(type=ClientResponse) @attr.s(frozen=True, slots=True) class TraceRequestExceptionParams: """ Parameters sent by the `on_request_exception` signal""" method = attr.ib(type=str) url = attr.ib(type=URL) headers = attr.ib(type='CIMultiDict[str]') exception = attr.ib(type=BaseException) @attr.s(frozen=True, slots=True) class TraceRequestRedirectParams: """ Parameters sent by the `on_request_redirect` signal""" method = attr.ib(type=str) url = attr.ib(type=URL) headers = attr.ib(type='CIMultiDict[str]') response = attr.ib(type=ClientResponse) @attr.s(frozen=True, slots=True) class TraceConnectionQueuedStartParams: """ Parameters sent by the `on_connection_queued_start` signal""" @attr.s(frozen=True, slots=True) class TraceConnectionQueuedEndParams: """ Parameters sent by the `on_connection_queued_end` signal""" @attr.s(frozen=True, slots=True) class TraceConnectionCreateStartParams: """ Parameters sent by the `on_connection_create_start` signal""" @attr.s(frozen=True, slots=True) class TraceConnectionCreateEndParams: """ Parameters sent by the `on_connection_create_end` signal""" @attr.s(frozen=True, slots=True) class TraceConnectionReuseconnParams: """ Parameters sent by the `on_connection_reuseconn` signal""" @attr.s(frozen=True, slots=True) class TraceDnsResolveHostStartParams: """ Parameters sent by the `on_dns_resolvehost_start` signal""" host = attr.ib(type=str) @attr.s(frozen=True, slots=True) class TraceDnsResolveHostEndParams: """ Parameters sent by the `on_dns_resolvehost_end` signal""" host = attr.ib(type=str) @attr.s(frozen=True, slots=True) class TraceDnsCacheHitParams: """ Parameters sent by the `on_dns_cache_hit` signal""" host = attr.ib(type=str) @attr.s(frozen=True, slots=True) class TraceDnsCacheMissParams: """ Parameters sent by the `on_dns_cache_miss` signal""" host = attr.ib(type=str) class Trace: """ Internal class used to keep together the main dependencies used at the moment of send a signal.""" def __init__(self, session: 
'ClientSession', trace_config: TraceConfig, trace_config_ctx: SimpleNamespace) -> None: self._trace_config = trace_config self._trace_config_ctx = trace_config_ctx self._session = session async def send_request_start(self, method: str, url: URL, headers: 'CIMultiDict[str]') -> None: return await self._trace_config.on_request_start.send( self._session, self._trace_config_ctx, TraceRequestStartParams(method, url, headers) ) async def send_request_chunk_sent(self, chunk: bytes) -> None: return await self._trace_config.on_request_chunk_sent.send( self._session, self._trace_config_ctx, TraceRequestChunkSentParams(chunk) ) async def send_response_chunk_received(self, chunk: bytes) -> None: return await self._trace_config.on_response_chunk_received.send( self._session, self._trace_config_ctx, TraceResponseChunkReceivedParams(chunk) ) async def send_request_end(self, method: str, url: URL, headers: 'CIMultiDict[str]', response: ClientResponse) -> None: return await self._trace_config.on_request_end.send( self._session, self._trace_config_ctx, TraceRequestEndParams(method, url, headers, response) ) async def send_request_exception(self, method: str, url: URL, headers: 'CIMultiDict[str]', exception: BaseException) -> None: return await self._trace_config.on_request_exception.send( self._session, self._trace_config_ctx, TraceRequestExceptionParams(method, url, headers, exception) ) async def send_request_redirect(self, method: str, url: URL, headers: 'CIMultiDict[str]', response: ClientResponse) -> None: return await self._trace_config._on_request_redirect.send( self._session, self._trace_config_ctx, TraceRequestRedirectParams(method, url, headers, response) ) async def send_connection_queued_start(self) -> None: return await self._trace_config.on_connection_queued_start.send( self._session, self._trace_config_ctx, TraceConnectionQueuedStartParams() ) async def send_connection_queued_end(self) -> None: return await self._trace_config.on_connection_queued_end.send( self._session, self._trace_config_ctx, TraceConnectionQueuedEndParams() ) async def send_connection_create_start(self) -> None: return await self._trace_config.on_connection_create_start.send( self._session, self._trace_config_ctx, TraceConnectionCreateStartParams() ) async def send_connection_create_end(self) -> None: return await self._trace_config.on_connection_create_end.send( self._session, self._trace_config_ctx, TraceConnectionCreateEndParams() ) async def send_connection_reuseconn(self) -> None: return await self._trace_config.on_connection_reuseconn.send( self._session, self._trace_config_ctx, TraceConnectionReuseconnParams() ) async def send_dns_resolvehost_start(self, host: str) -> None: return await self._trace_config.on_dns_resolvehost_start.send( self._session, self._trace_config_ctx, TraceDnsResolveHostStartParams(host) ) async def send_dns_resolvehost_end(self, host: str) -> None: return await self._trace_config.on_dns_resolvehost_end.send( self._session, self._trace_config_ctx, TraceDnsResolveHostEndParams(host) ) async def send_dns_cache_hit(self, host: str) -> None: return await self._trace_config.on_dns_cache_hit.send( self._session, self._trace_config_ctx, TraceDnsCacheHitParams(host) ) async def send_dns_cache_miss(self, host: str) -> None: return await self._trace_config.on_dns_cache_miss.send( self._session, self._trace_config_ctx, TraceDnsCacheMissParams(host) ) aiohttp-3.6.2/aiohttp/typedefs.py0000644000175100001650000000245413547410117017337 0ustar vstsdocker00000000000000import json import os # noqa import 
pathlib # noqa import sys from typing import ( TYPE_CHECKING, Any, Callable, Iterable, Mapping, Tuple, Union, ) from multidict import ( CIMultiDict, CIMultiDictProxy, MultiDict, MultiDictProxy, istr, ) from yarl import URL DEFAULT_JSON_ENCODER = json.dumps DEFAULT_JSON_DECODER = json.loads if TYPE_CHECKING: # pragma: no cover _CIMultiDict = CIMultiDict[str] _CIMultiDictProxy = CIMultiDictProxy[str] _MultiDict = MultiDict[str] _MultiDictProxy = MultiDictProxy[str] from http.cookies import BaseCookie # noqa else: _CIMultiDict = CIMultiDict _CIMultiDictProxy = CIMultiDictProxy _MultiDict = MultiDict _MultiDictProxy = MultiDictProxy Byteish = Union[bytes, bytearray, memoryview] JSONEncoder = Callable[[Any], str] JSONDecoder = Callable[[str], Any] LooseHeaders = Union[Mapping[Union[str, istr], str], _CIMultiDict, _CIMultiDictProxy] RawHeaders = Tuple[Tuple[bytes, bytes], ...] StrOrURL = Union[str, URL] LooseCookies = Union[Iterable[Tuple[str, 'BaseCookie[str]']], Mapping[str, 'BaseCookie[str]'], 'BaseCookie[str]'] if sys.version_info >= (3, 6): PathLike = Union[str, 'os.PathLike[str]'] else: PathLike = Union[str, pathlib.PurePath] aiohttp-3.6.2/aiohttp/web.py0000644000175100001650000004575613547410117016305 0ustar vstsdocker00000000000000import asyncio import logging import socket import sys from argparse import ArgumentParser from collections.abc import Iterable from importlib import import_module from typing import Any, Awaitable, Callable, List, Optional, Type, Union, cast from .abc import AbstractAccessLogger from .helpers import all_tasks from .log import access_logger from .web_app import Application as Application from .web_app import CleanupError as CleanupError from .web_exceptions import HTTPAccepted as HTTPAccepted from .web_exceptions import HTTPBadGateway as HTTPBadGateway from .web_exceptions import HTTPBadRequest as HTTPBadRequest from .web_exceptions import HTTPClientError as HTTPClientError from .web_exceptions import HTTPConflict as HTTPConflict from .web_exceptions import HTTPCreated as HTTPCreated from .web_exceptions import HTTPError as HTTPError from .web_exceptions import HTTPException as HTTPException from .web_exceptions import HTTPExpectationFailed as HTTPExpectationFailed from .web_exceptions import HTTPFailedDependency as HTTPFailedDependency from .web_exceptions import HTTPForbidden as HTTPForbidden from .web_exceptions import HTTPFound as HTTPFound from .web_exceptions import HTTPGatewayTimeout as HTTPGatewayTimeout from .web_exceptions import HTTPGone as HTTPGone from .web_exceptions import HTTPInsufficientStorage as HTTPInsufficientStorage from .web_exceptions import HTTPInternalServerError as HTTPInternalServerError from .web_exceptions import HTTPLengthRequired as HTTPLengthRequired from .web_exceptions import HTTPMethodNotAllowed as HTTPMethodNotAllowed from .web_exceptions import HTTPMisdirectedRequest as HTTPMisdirectedRequest from .web_exceptions import HTTPMovedPermanently as HTTPMovedPermanently from .web_exceptions import HTTPMultipleChoices as HTTPMultipleChoices from .web_exceptions import ( HTTPNetworkAuthenticationRequired as HTTPNetworkAuthenticationRequired, ) from .web_exceptions import HTTPNoContent as HTTPNoContent from .web_exceptions import ( HTTPNonAuthoritativeInformation as HTTPNonAuthoritativeInformation, ) from .web_exceptions import HTTPNotAcceptable as HTTPNotAcceptable from .web_exceptions import HTTPNotExtended as HTTPNotExtended from .web_exceptions import HTTPNotFound as HTTPNotFound from .web_exceptions import HTTPNotImplemented as 
HTTPNotImplemented from .web_exceptions import HTTPNotModified as HTTPNotModified from .web_exceptions import HTTPOk as HTTPOk from .web_exceptions import HTTPPartialContent as HTTPPartialContent from .web_exceptions import HTTPPaymentRequired as HTTPPaymentRequired from .web_exceptions import HTTPPermanentRedirect as HTTPPermanentRedirect from .web_exceptions import HTTPPreconditionFailed as HTTPPreconditionFailed from .web_exceptions import ( HTTPPreconditionRequired as HTTPPreconditionRequired, ) from .web_exceptions import ( HTTPProxyAuthenticationRequired as HTTPProxyAuthenticationRequired, ) from .web_exceptions import HTTPRedirection as HTTPRedirection from .web_exceptions import ( HTTPRequestEntityTooLarge as HTTPRequestEntityTooLarge, ) from .web_exceptions import ( HTTPRequestHeaderFieldsTooLarge as HTTPRequestHeaderFieldsTooLarge, ) from .web_exceptions import ( HTTPRequestRangeNotSatisfiable as HTTPRequestRangeNotSatisfiable, ) from .web_exceptions import HTTPRequestTimeout as HTTPRequestTimeout from .web_exceptions import HTTPRequestURITooLong as HTTPRequestURITooLong from .web_exceptions import HTTPResetContent as HTTPResetContent from .web_exceptions import HTTPSeeOther as HTTPSeeOther from .web_exceptions import HTTPServerError as HTTPServerError from .web_exceptions import HTTPServiceUnavailable as HTTPServiceUnavailable from .web_exceptions import HTTPSuccessful as HTTPSuccessful from .web_exceptions import HTTPTemporaryRedirect as HTTPTemporaryRedirect from .web_exceptions import HTTPTooManyRequests as HTTPTooManyRequests from .web_exceptions import HTTPUnauthorized as HTTPUnauthorized from .web_exceptions import ( HTTPUnavailableForLegalReasons as HTTPUnavailableForLegalReasons, ) from .web_exceptions import HTTPUnprocessableEntity as HTTPUnprocessableEntity from .web_exceptions import ( HTTPUnsupportedMediaType as HTTPUnsupportedMediaType, ) from .web_exceptions import HTTPUpgradeRequired as HTTPUpgradeRequired from .web_exceptions import HTTPUseProxy as HTTPUseProxy from .web_exceptions import ( HTTPVariantAlsoNegotiates as HTTPVariantAlsoNegotiates, ) from .web_exceptions import HTTPVersionNotSupported as HTTPVersionNotSupported from .web_fileresponse import FileResponse as FileResponse from .web_log import AccessLogger from .web_middlewares import middleware as middleware from .web_middlewares import ( normalize_path_middleware as normalize_path_middleware, ) from .web_protocol import PayloadAccessError as PayloadAccessError from .web_protocol import RequestHandler as RequestHandler from .web_protocol import RequestPayloadError as RequestPayloadError from .web_request import BaseRequest as BaseRequest from .web_request import FileField as FileField from .web_request import Request as Request from .web_response import ContentCoding as ContentCoding from .web_response import Response as Response from .web_response import StreamResponse as StreamResponse from .web_response import json_response as json_response from .web_routedef import AbstractRouteDef as AbstractRouteDef from .web_routedef import RouteDef as RouteDef from .web_routedef import RouteTableDef as RouteTableDef from .web_routedef import StaticDef as StaticDef from .web_routedef import delete as delete from .web_routedef import get as get from .web_routedef import head as head from .web_routedef import options as options from .web_routedef import patch as patch from .web_routedef import post as post from .web_routedef import put as put from .web_routedef import route as route from .web_routedef import 
static as static from .web_routedef import view as view from .web_runner import AppRunner as AppRunner from .web_runner import BaseRunner as BaseRunner from .web_runner import BaseSite as BaseSite from .web_runner import GracefulExit as GracefulExit from .web_runner import NamedPipeSite as NamedPipeSite from .web_runner import ServerRunner as ServerRunner from .web_runner import SockSite as SockSite from .web_runner import TCPSite as TCPSite from .web_runner import UnixSite as UnixSite from .web_server import Server as Server from .web_urldispatcher import AbstractResource as AbstractResource from .web_urldispatcher import AbstractRoute as AbstractRoute from .web_urldispatcher import DynamicResource as DynamicResource from .web_urldispatcher import PlainResource as PlainResource from .web_urldispatcher import Resource as Resource from .web_urldispatcher import ResourceRoute as ResourceRoute from .web_urldispatcher import StaticResource as StaticResource from .web_urldispatcher import UrlDispatcher as UrlDispatcher from .web_urldispatcher import UrlMappingMatchInfo as UrlMappingMatchInfo from .web_urldispatcher import View as View from .web_ws import WebSocketReady as WebSocketReady from .web_ws import WebSocketResponse as WebSocketResponse from .web_ws import WSMsgType as WSMsgType __all__ = ( # web_app 'Application', 'CleanupError', # web_exceptions 'HTTPAccepted', 'HTTPBadGateway', 'HTTPBadRequest', 'HTTPClientError', 'HTTPConflict', 'HTTPCreated', 'HTTPError', 'HTTPException', 'HTTPExpectationFailed', 'HTTPFailedDependency', 'HTTPForbidden', 'HTTPFound', 'HTTPGatewayTimeout', 'HTTPGone', 'HTTPInsufficientStorage', 'HTTPInternalServerError', 'HTTPLengthRequired', 'HTTPMethodNotAllowed', 'HTTPMisdirectedRequest', 'HTTPMovedPermanently', 'HTTPMultipleChoices', 'HTTPNetworkAuthenticationRequired', 'HTTPNoContent', 'HTTPNonAuthoritativeInformation', 'HTTPNotAcceptable', 'HTTPNotExtended', 'HTTPNotFound', 'HTTPNotImplemented', 'HTTPNotModified', 'HTTPOk', 'HTTPPartialContent', 'HTTPPaymentRequired', 'HTTPPermanentRedirect', 'HTTPPreconditionFailed', 'HTTPPreconditionRequired', 'HTTPProxyAuthenticationRequired', 'HTTPRedirection', 'HTTPRequestEntityTooLarge', 'HTTPRequestHeaderFieldsTooLarge', 'HTTPRequestRangeNotSatisfiable', 'HTTPRequestTimeout', 'HTTPRequestURITooLong', 'HTTPResetContent', 'HTTPSeeOther', 'HTTPServerError', 'HTTPServiceUnavailable', 'HTTPSuccessful', 'HTTPTemporaryRedirect', 'HTTPTooManyRequests', 'HTTPUnauthorized', 'HTTPUnavailableForLegalReasons', 'HTTPUnprocessableEntity', 'HTTPUnsupportedMediaType', 'HTTPUpgradeRequired', 'HTTPUseProxy', 'HTTPVariantAlsoNegotiates', 'HTTPVersionNotSupported', # web_fileresponse 'FileResponse', # web_middlewares 'middleware', 'normalize_path_middleware', # web_protocol 'PayloadAccessError', 'RequestHandler', 'RequestPayloadError', # web_request 'BaseRequest', 'FileField', 'Request', # web_response 'ContentCoding', 'Response', 'StreamResponse', 'json_response', # web_routedef 'AbstractRouteDef', 'RouteDef', 'RouteTableDef', 'StaticDef', 'delete', 'get', 'head', 'options', 'patch', 'post', 'put', 'route', 'static', 'view', # web_runner 'AppRunner', 'BaseRunner', 'BaseSite', 'GracefulExit', 'ServerRunner', 'SockSite', 'TCPSite', 'UnixSite', 'NamedPipeSite', # web_server 'Server', # web_urldispatcher 'AbstractResource', 'AbstractRoute', 'DynamicResource', 'PlainResource', 'Resource', 'ResourceRoute', 'StaticResource', 'UrlDispatcher', 'UrlMappingMatchInfo', 'View', # web_ws 'WebSocketReady', 'WebSocketResponse', 'WSMsgType', # web 
'run_app', ) try: from ssl import SSLContext except ImportError: # pragma: no cover SSLContext = Any # type: ignore async def _run_app(app: Union[Application, Awaitable[Application]], *, host: Optional[str]=None, port: Optional[int]=None, path: Optional[str]=None, sock: Optional[socket.socket]=None, shutdown_timeout: float=60.0, ssl_context: Optional[SSLContext]=None, print: Callable[..., None]=print, backlog: int=128, access_log_class: Type[AbstractAccessLogger]=AccessLogger, access_log_format: str=AccessLogger.LOG_FORMAT, access_log: Optional[logging.Logger]=access_logger, handle_signals: bool=True, reuse_address: Optional[bool]=None, reuse_port: Optional[bool]=None) -> None: # An internal function that does all the dirty work of running the application if asyncio.iscoroutine(app): app = await app # type: ignore app = cast(Application, app) runner = AppRunner(app, handle_signals=handle_signals, access_log_class=access_log_class, access_log_format=access_log_format, access_log=access_log) await runner.setup() sites = [] # type: List[BaseSite] try: if host is not None: if isinstance(host, (str, bytes, bytearray, memoryview)): sites.append(TCPSite(runner, host, port, shutdown_timeout=shutdown_timeout, ssl_context=ssl_context, backlog=backlog, reuse_address=reuse_address, reuse_port=reuse_port)) else: for h in host: sites.append(TCPSite(runner, h, port, shutdown_timeout=shutdown_timeout, ssl_context=ssl_context, backlog=backlog, reuse_address=reuse_address, reuse_port=reuse_port)) elif path is None and sock is None or port is not None: sites.append(TCPSite(runner, port=port, shutdown_timeout=shutdown_timeout, ssl_context=ssl_context, backlog=backlog, reuse_address=reuse_address, reuse_port=reuse_port)) if path is not None: if isinstance(path, (str, bytes, bytearray, memoryview)): sites.append(UnixSite(runner, path, shutdown_timeout=shutdown_timeout, ssl_context=ssl_context, backlog=backlog)) else: for p in path: sites.append(UnixSite(runner, p, shutdown_timeout=shutdown_timeout, ssl_context=ssl_context, backlog=backlog)) if sock is not None: if not isinstance(sock, Iterable): sites.append(SockSite(runner, sock, shutdown_timeout=shutdown_timeout, ssl_context=ssl_context, backlog=backlog)) else: for s in sock: sites.append(SockSite(runner, s, shutdown_timeout=shutdown_timeout, ssl_context=ssl_context, backlog=backlog)) for site in sites: await site.start() if print: # pragma: no branch names = sorted(str(s.name) for s in runner.sites) print("======== Running on {} ========\n" "(Press CTRL+C to quit)".format(', '.join(names))) while True: await asyncio.sleep(3600) # sleep forever by 1 hour intervals finally: await runner.cleanup() def _cancel_all_tasks(loop: asyncio.AbstractEventLoop) -> None: to_cancel = all_tasks(loop) if not to_cancel: return for task in to_cancel: task.cancel() loop.run_until_complete( asyncio.gather(*to_cancel, loop=loop, return_exceptions=True)) for task in to_cancel: if task.cancelled(): continue if task.exception() is not None: loop.call_exception_handler({ 'message': 'unhandled exception during asyncio.run() shutdown', 'exception': task.exception(), 'task': task, }) def run_app(app: Union[Application, Awaitable[Application]], *, host: Optional[str]=None, port: Optional[int]=None, path: Optional[str]=None, sock: Optional[socket.socket]=None, shutdown_timeout: float=60.0, ssl_context: Optional[SSLContext]=None, print: Callable[..., None]=print, backlog: int=128, access_log_class: Type[AbstractAccessLogger]=AccessLogger, access_log_format: str=AccessLogger.LOG_FORMAT, 
access_log: Optional[logging.Logger]=access_logger, handle_signals: bool=True, reuse_address: Optional[bool]=None, reuse_port: Optional[bool]=None) -> None: """Run an app locally""" loop = asyncio.get_event_loop() # Configure if and only if in debugging mode and using the default logger if loop.get_debug() and access_log and access_log.name == 'aiohttp.access': if access_log.level == logging.NOTSET: access_log.setLevel(logging.DEBUG) if not access_log.hasHandlers(): access_log.addHandler(logging.StreamHandler()) try: loop.run_until_complete(_run_app(app, host=host, port=port, path=path, sock=sock, shutdown_timeout=shutdown_timeout, ssl_context=ssl_context, print=print, backlog=backlog, access_log_class=access_log_class, access_log_format=access_log_format, access_log=access_log, handle_signals=handle_signals, reuse_address=reuse_address, reuse_port=reuse_port)) except (GracefulExit, KeyboardInterrupt): # pragma: no cover pass finally: _cancel_all_tasks(loop) if sys.version_info >= (3, 6): # don't use PY_36 to pass mypy loop.run_until_complete(loop.shutdown_asyncgens()) loop.close() def main(argv: List[str]) -> None: arg_parser = ArgumentParser( description="aiohttp.web Application server", prog="aiohttp.web" ) arg_parser.add_argument( "entry_func", help=("Callable returning the `aiohttp.web.Application` instance to " "run. Should be specified in the 'module:function' syntax."), metavar="entry-func" ) arg_parser.add_argument( "-H", "--hostname", help="TCP/IP hostname to serve on (default: %(default)r)", default="localhost" ) arg_parser.add_argument( "-P", "--port", help="TCP/IP port to serve on (default: %(default)r)", type=int, default="8080" ) arg_parser.add_argument( "-U", "--path", help="Unix file system path to serve on. Specifying a path will cause " "hostname and port arguments to be ignored.", ) args, extra_argv = arg_parser.parse_known_args(argv) # Import logic mod_str, _, func_str = args.entry_func.partition(":") if not func_str or not mod_str: arg_parser.error( "'entry-func' not in 'module:function' syntax" ) if mod_str.startswith("."): arg_parser.error("relative module names not supported") try: module = import_module(mod_str) except ImportError as ex: arg_parser.error("unable to import %s: %s" % (mod_str, ex)) try: func = getattr(module, func_str) except AttributeError: arg_parser.error("module %r has no attribute %r" % (mod_str, func_str)) # Compatibility logic if args.path is not None and not hasattr(socket, 'AF_UNIX'): arg_parser.error("file system paths not supported by your operating" " environment") logging.basicConfig(level=logging.DEBUG) app = func(extra_argv) run_app(app, host=args.hostname, port=args.port, path=args.path) arg_parser.exit(message="Stopped\n") if __name__ == "__main__": # pragma: no branch main(sys.argv[1:]) # pragma: no cover aiohttp-3.6.2/aiohttp/web_app.py0000644000175100001650000004156113547410117017133 0ustar vstsdocker00000000000000import asyncio import logging import warnings from functools import partial from typing import ( # noqa TYPE_CHECKING, Any, AsyncIterator, Awaitable, Callable, Dict, Iterable, Iterator, List, Mapping, MutableMapping, Optional, Sequence, Tuple, Type, Union, cast, ) from . 
import hdrs from .abc import ( AbstractAccessLogger, AbstractMatchInfo, AbstractRouter, AbstractStreamWriter, ) from .frozenlist import FrozenList from .helpers import DEBUG from .http_parser import RawRequestMessage from .log import web_logger from .signals import Signal from .streams import StreamReader from .web_log import AccessLogger from .web_middlewares import _fix_request_current_app from .web_protocol import RequestHandler from .web_request import Request from .web_response import StreamResponse from .web_routedef import AbstractRouteDef from .web_server import Server from .web_urldispatcher import ( AbstractResource, Domain, MaskDomain, MatchedSubAppResource, PrefixedSubAppResource, UrlDispatcher, ) __all__ = ('Application', 'CleanupError') if TYPE_CHECKING: # pragma: no cover _AppSignal = Signal[Callable[['Application'], Awaitable[None]]] _RespPrepareSignal = Signal[Callable[[Request, StreamResponse], Awaitable[None]]] _Handler = Callable[[Request], Awaitable[StreamResponse]] _Middleware = Union[Callable[[Request, _Handler], Awaitable[StreamResponse]], Callable[['Application', _Handler], # old-style Awaitable[_Handler]]] _Middlewares = FrozenList[_Middleware] _MiddlewaresHandlers = Optional[Sequence[Tuple[_Middleware, bool]]] _Subapps = List['Application'] else: # No type checker mode, skip types _AppSignal = Signal _RespPrepareSignal = Signal _Handler = Callable _Middleware = Callable _Middlewares = FrozenList _MiddlewaresHandlers = Optional[Sequence] _Subapps = List class Application(MutableMapping[str, Any]): ATTRS = frozenset([ 'logger', '_debug', '_router', '_loop', '_handler_args', '_middlewares', '_middlewares_handlers', '_run_middlewares', '_state', '_frozen', '_pre_frozen', '_subapps', '_on_response_prepare', '_on_startup', '_on_shutdown', '_on_cleanup', '_client_max_size', '_cleanup_ctx']) def __init__(self, *, logger: logging.Logger=web_logger, router: Optional[UrlDispatcher]=None, middlewares: Iterable[_Middleware]=(), handler_args: Mapping[str, Any]=None, client_max_size: int=1024**2, loop: Optional[asyncio.AbstractEventLoop]=None, debug: Any=... 
# mypy doesn't support ellipsis ) -> None: if router is None: router = UrlDispatcher() else: warnings.warn("router argument is deprecated", DeprecationWarning, stacklevel=2) assert isinstance(router, AbstractRouter), router if loop is not None: warnings.warn("loop argument is deprecated", DeprecationWarning, stacklevel=2) if debug is not ...: warnings.warn("debug argument is deprecated", DeprecationWarning, stacklevel=2) self._debug = debug self._router = router # type: UrlDispatcher self._loop = loop self._handler_args = handler_args self.logger = logger self._middlewares = FrozenList(middlewares) # type: _Middlewares # initialized on freezing self._middlewares_handlers = None # type: _MiddlewaresHandlers # initialized on freezing self._run_middlewares = None # type: Optional[bool] self._state = {} # type: Dict[str, Any] self._frozen = False self._pre_frozen = False self._subapps = [] # type: _Subapps self._on_response_prepare = Signal(self) # type: _RespPrepareSignal self._on_startup = Signal(self) # type: _AppSignal self._on_shutdown = Signal(self) # type: _AppSignal self._on_cleanup = Signal(self) # type: _AppSignal self._cleanup_ctx = CleanupContext() self._on_startup.append(self._cleanup_ctx._on_startup) self._on_cleanup.append(self._cleanup_ctx._on_cleanup) self._client_max_size = client_max_size def __init_subclass__(cls: Type['Application']) -> None: warnings.warn("Inheritance class {} from web.Application " "is discouraged".format(cls.__name__), DeprecationWarning, stacklevel=2) if DEBUG: # pragma: no cover def __setattr__(self, name: str, val: Any) -> None: if name not in self.ATTRS: warnings.warn("Setting custom web.Application.{} attribute " "is discouraged".format(name), DeprecationWarning, stacklevel=2) super().__setattr__(name, val) # MutableMapping API def __eq__(self, other: object) -> bool: return self is other def __getitem__(self, key: str) -> Any: return self._state[key] def _check_frozen(self) -> None: if self._frozen: warnings.warn("Changing state of started or joined " "application is deprecated", DeprecationWarning, stacklevel=3) def __setitem__(self, key: str, value: Any) -> None: self._check_frozen() self._state[key] = value def __delitem__(self, key: str) -> None: self._check_frozen() del self._state[key] def __len__(self) -> int: return len(self._state) def __iter__(self) -> Iterator[str]: return iter(self._state) ######## @property def loop(self) -> asyncio.AbstractEventLoop: # Technically the loop can be None # but we mask it by explicit type cast # to provide more convinient type annotation warnings.warn("loop property is deprecated", DeprecationWarning, stacklevel=2) return cast(asyncio.AbstractEventLoop, self._loop) def _set_loop(self, loop: Optional[asyncio.AbstractEventLoop]) -> None: if loop is None: loop = asyncio.get_event_loop() if self._loop is not None and self._loop is not loop: raise RuntimeError( "web.Application instance initialized with different loop") self._loop = loop # set loop debug if self._debug is ...: self._debug = loop.get_debug() # set loop to sub applications for subapp in self._subapps: subapp._set_loop(loop) @property def pre_frozen(self) -> bool: return self._pre_frozen def pre_freeze(self) -> None: if self._pre_frozen: return self._pre_frozen = True self._middlewares.freeze() self._router.freeze() self._on_response_prepare.freeze() self._cleanup_ctx.freeze() self._on_startup.freeze() self._on_shutdown.freeze() self._on_cleanup.freeze() self._middlewares_handlers = tuple(self._prepare_middleware()) # If current app and any 
subapp do not have middlewares avoid run all # of the code footprint that it implies, which have a middleware # hardcoded per app that sets up the current_app attribute. If no # middlewares are configured the handler will receive the proper # current_app without needing all of this code. self._run_middlewares = True if self.middlewares else False for subapp in self._subapps: subapp.pre_freeze() self._run_middlewares = (self._run_middlewares or subapp._run_middlewares) @property def frozen(self) -> bool: return self._frozen def freeze(self) -> None: if self._frozen: return self.pre_freeze() self._frozen = True for subapp in self._subapps: subapp.freeze() @property def debug(self) -> bool: warnings.warn("debug property is deprecated", DeprecationWarning, stacklevel=2) return self._debug def _reg_subapp_signals(self, subapp: 'Application') -> None: def reg_handler(signame: str) -> None: subsig = getattr(subapp, signame) async def handler(app: 'Application') -> None: await subsig.send(subapp) appsig = getattr(self, signame) appsig.append(handler) reg_handler('on_startup') reg_handler('on_shutdown') reg_handler('on_cleanup') def add_subapp(self, prefix: str, subapp: 'Application') -> AbstractResource: if not isinstance(prefix, str): raise TypeError("Prefix must be str") prefix = prefix.rstrip('/') if not prefix: raise ValueError("Prefix cannot be empty") factory = partial(PrefixedSubAppResource, prefix, subapp) return self._add_subapp(factory, subapp) def _add_subapp(self, resource_factory: Callable[[], AbstractResource], subapp: 'Application') -> AbstractResource: if self.frozen: raise RuntimeError( "Cannot add sub application to frozen application") if subapp.frozen: raise RuntimeError("Cannot add frozen application") resource = resource_factory() self.router.register_resource(resource) self._reg_subapp_signals(subapp) self._subapps.append(subapp) subapp.pre_freeze() if self._loop is not None: subapp._set_loop(self._loop) return resource def add_domain(self, domain: str, subapp: 'Application') -> AbstractResource: if not isinstance(domain, str): raise TypeError("Domain must be str") elif '*' in domain: rule = MaskDomain(domain) # type: Domain else: rule = Domain(domain) factory = partial(MatchedSubAppResource, rule, subapp) return self._add_subapp(factory, subapp) def add_routes(self, routes: Iterable[AbstractRouteDef]) -> None: self.router.add_routes(routes) @property def on_response_prepare(self) -> _RespPrepareSignal: return self._on_response_prepare @property def on_startup(self) -> _AppSignal: return self._on_startup @property def on_shutdown(self) -> _AppSignal: return self._on_shutdown @property def on_cleanup(self) -> _AppSignal: return self._on_cleanup @property def cleanup_ctx(self) -> 'CleanupContext': return self._cleanup_ctx @property def router(self) -> UrlDispatcher: return self._router @property def middlewares(self) -> _Middlewares: return self._middlewares def _make_handler(self, *, loop: Optional[asyncio.AbstractEventLoop]=None, access_log_class: Type[ AbstractAccessLogger]=AccessLogger, **kwargs: Any) -> Server: if not issubclass(access_log_class, AbstractAccessLogger): raise TypeError( 'access_log_class must be subclass of ' 'aiohttp.abc.AbstractAccessLogger, got {}'.format( access_log_class)) self._set_loop(loop) self.freeze() kwargs['debug'] = self._debug kwargs['access_log_class'] = access_log_class if self._handler_args: for k, v in self._handler_args.items(): kwargs[k] = v return Server(self._handle, # type: ignore request_factory=self._make_request, loop=self._loop, 
**kwargs) def make_handler(self, *, loop: Optional[asyncio.AbstractEventLoop]=None, access_log_class: Type[ AbstractAccessLogger]=AccessLogger, **kwargs: Any) -> Server: warnings.warn("Application.make_handler(...) is deprecated, " "use AppRunner API instead", DeprecationWarning, stacklevel=2) return self._make_handler(loop=loop, access_log_class=access_log_class, **kwargs) async def startup(self) -> None: """Causes on_startup signal Should be called in the event loop along with the request handler. """ await self.on_startup.send(self) async def shutdown(self) -> None: """Causes on_shutdown signal Should be called before cleanup() """ await self.on_shutdown.send(self) async def cleanup(self) -> None: """Causes on_cleanup signal Should be called after shutdown() """ await self.on_cleanup.send(self) def _make_request(self, message: RawRequestMessage, payload: StreamReader, protocol: RequestHandler, writer: AbstractStreamWriter, task: 'asyncio.Task[None]', _cls: Type[Request]=Request) -> Request: return _cls( message, payload, protocol, writer, task, self._loop, client_max_size=self._client_max_size) def _prepare_middleware(self) -> Iterator[Tuple[_Middleware, bool]]: for m in reversed(self._middlewares): if getattr(m, '__middleware_version__', None) == 1: yield m, True else: warnings.warn('old-style middleware "{!r}" deprecated, ' 'see #2252'.format(m), DeprecationWarning, stacklevel=2) yield m, False yield _fix_request_current_app(self), True async def _handle(self, request: Request) -> StreamResponse: loop = asyncio.get_event_loop() debug = loop.get_debug() match_info = await self._router.resolve(request) if debug: # pragma: no cover if not isinstance(match_info, AbstractMatchInfo): raise TypeError("match_info should be AbstractMatchInfo " "instance, not {!r}".format(match_info)) match_info.add_app(self) match_info.freeze() resp = None request._match_info = match_info # type: ignore expect = request.headers.get(hdrs.EXPECT) if expect: resp = await match_info.expect_handler(request) await request.writer.drain() if resp is None: handler = match_info.handler if self._run_middlewares: for app in match_info.apps[::-1]: for m, new_style in app._middlewares_handlers: # type: ignore # noqa if new_style: handler = partial(m, handler=handler) else: handler = await m(app, handler) # type: ignore resp = await handler(request) return resp def __call__(self) -> 'Application': """gunicorn compatibility""" return self def __repr__(self) -> str: return "".format(id(self)) def __bool__(self) -> bool: return True class CleanupError(RuntimeError): @property def exceptions(self) -> List[BaseException]: return self.args[1] if TYPE_CHECKING: # pragma: no cover _CleanupContextBase = FrozenList[Callable[[Application], AsyncIterator[None]]] else: _CleanupContextBase = FrozenList class CleanupContext(_CleanupContextBase): def __init__(self) -> None: super().__init__() self._exits = [] # type: List[AsyncIterator[None]] async def _on_startup(self, app: Application) -> None: for cb in self: it = cb(app).__aiter__() await it.__anext__() self._exits.append(it) async def _on_cleanup(self, app: Application) -> None: errors = [] for it in reversed(self._exits): try: await it.__anext__() except StopAsyncIteration: pass except Exception as exc: errors.append(exc) else: errors.append(RuntimeError("{!r} has more than one 'yield'" .format(it))) if errors: if len(errors) == 1: raise errors[0] else: raise CleanupError("Multiple errors on cleanup stage", errors) 
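A minimal usage sketch, added here for illustration only (it is not part of the distribution), showing how the Application, cleanup_ctx, and run_app pieces above compose. The handler and context names (hello, background_ctx) and the host/port values are assumptions, not anything defined by the package.

from aiohttp import web

async def hello(request: web.Request) -> web.Response:
    # plain request handler returning a JSON body
    return web.json_response({"hello": "world"})

async def background_ctx(app: web.Application):
    # cleanup context: code before the single yield runs at startup,
    # code after it runs at cleanup (driven by CleanupContext above)
    app["started"] = True
    yield
    app["started"] = False

app = web.Application()
app.router.add_get("/", hello)
app.cleanup_ctx.append(background_ctx)
web.run_app(app, host="127.0.0.1", port=8080)
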
aiohttp-3.6.2/aiohttp/web_exceptions.py0000644000175100001650000002357213547410117020536 0ustar vstsdocker00000000000000import warnings from typing import Any, Dict, Iterable, List, Optional, Set # noqa from yarl import URL from .typedefs import LooseHeaders, StrOrURL from .web_response import Response __all__ = ( 'HTTPException', 'HTTPError', 'HTTPRedirection', 'HTTPSuccessful', 'HTTPOk', 'HTTPCreated', 'HTTPAccepted', 'HTTPNonAuthoritativeInformation', 'HTTPNoContent', 'HTTPResetContent', 'HTTPPartialContent', 'HTTPMultipleChoices', 'HTTPMovedPermanently', 'HTTPFound', 'HTTPSeeOther', 'HTTPNotModified', 'HTTPUseProxy', 'HTTPTemporaryRedirect', 'HTTPPermanentRedirect', 'HTTPClientError', 'HTTPBadRequest', 'HTTPUnauthorized', 'HTTPPaymentRequired', 'HTTPForbidden', 'HTTPNotFound', 'HTTPMethodNotAllowed', 'HTTPNotAcceptable', 'HTTPProxyAuthenticationRequired', 'HTTPRequestTimeout', 'HTTPConflict', 'HTTPGone', 'HTTPLengthRequired', 'HTTPPreconditionFailed', 'HTTPRequestEntityTooLarge', 'HTTPRequestURITooLong', 'HTTPUnsupportedMediaType', 'HTTPRequestRangeNotSatisfiable', 'HTTPExpectationFailed', 'HTTPMisdirectedRequest', 'HTTPUnprocessableEntity', 'HTTPFailedDependency', 'HTTPUpgradeRequired', 'HTTPPreconditionRequired', 'HTTPTooManyRequests', 'HTTPRequestHeaderFieldsTooLarge', 'HTTPUnavailableForLegalReasons', 'HTTPServerError', 'HTTPInternalServerError', 'HTTPNotImplemented', 'HTTPBadGateway', 'HTTPServiceUnavailable', 'HTTPGatewayTimeout', 'HTTPVersionNotSupported', 'HTTPVariantAlsoNegotiates', 'HTTPInsufficientStorage', 'HTTPNotExtended', 'HTTPNetworkAuthenticationRequired', ) ############################################################ # HTTP Exceptions ############################################################ class HTTPException(Response, Exception): # You should set in subclasses: # status = 200 status_code = -1 empty_body = False __http_exception__ = True def __init__(self, *, headers: Optional[LooseHeaders]=None, reason: Optional[str]=None, body: Any=None, text: Optional[str]=None, content_type: Optional[str]=None) -> None: if body is not None: warnings.warn( "body argument is deprecated for http web exceptions", DeprecationWarning) Response.__init__(self, status=self.status_code, headers=headers, reason=reason, body=body, text=text, content_type=content_type) Exception.__init__(self, self.reason) if self.body is None and not self.empty_body: self.text = "{}: {}".format(self.status, self.reason) def __bool__(self) -> bool: return True class HTTPError(HTTPException): """Base class for exceptions with status codes in the 400s and 500s.""" class HTTPRedirection(HTTPException): """Base class for exceptions with status codes in the 300s.""" class HTTPSuccessful(HTTPException): """Base class for exceptions with status codes in the 200s.""" class HTTPOk(HTTPSuccessful): status_code = 200 class HTTPCreated(HTTPSuccessful): status_code = 201 class HTTPAccepted(HTTPSuccessful): status_code = 202 class HTTPNonAuthoritativeInformation(HTTPSuccessful): status_code = 203 class HTTPNoContent(HTTPSuccessful): status_code = 204 empty_body = True class HTTPResetContent(HTTPSuccessful): status_code = 205 empty_body = True class HTTPPartialContent(HTTPSuccessful): status_code = 206 ############################################################ # 3xx redirection ############################################################ class _HTTPMove(HTTPRedirection): def __init__(self, location: StrOrURL, *, headers: Optional[LooseHeaders]=None, reason: Optional[str]=None, body: Any=None, text: Optional[str]=None, 
content_type: Optional[str]=None) -> None: if not location: raise ValueError("HTTP redirects need a location to redirect to.") super().__init__(headers=headers, reason=reason, body=body, text=text, content_type=content_type) self.headers['Location'] = str(URL(location)) self.location = location class HTTPMultipleChoices(_HTTPMove): status_code = 300 class HTTPMovedPermanently(_HTTPMove): status_code = 301 class HTTPFound(_HTTPMove): status_code = 302 # This one is safe after a POST (the redirected location will be # retrieved with GET): class HTTPSeeOther(_HTTPMove): status_code = 303 class HTTPNotModified(HTTPRedirection): # FIXME: this should include a date or etag header status_code = 304 empty_body = True class HTTPUseProxy(_HTTPMove): # Not a move, but looks a little like one status_code = 305 class HTTPTemporaryRedirect(_HTTPMove): status_code = 307 class HTTPPermanentRedirect(_HTTPMove): status_code = 308 ############################################################ # 4xx client error ############################################################ class HTTPClientError(HTTPError): pass class HTTPBadRequest(HTTPClientError): status_code = 400 class HTTPUnauthorized(HTTPClientError): status_code = 401 class HTTPPaymentRequired(HTTPClientError): status_code = 402 class HTTPForbidden(HTTPClientError): status_code = 403 class HTTPNotFound(HTTPClientError): status_code = 404 class HTTPMethodNotAllowed(HTTPClientError): status_code = 405 def __init__(self, method: str, allowed_methods: Iterable[str], *, headers: Optional[LooseHeaders]=None, reason: Optional[str]=None, body: Any=None, text: Optional[str]=None, content_type: Optional[str]=None) -> None: allow = ','.join(sorted(allowed_methods)) super().__init__(headers=headers, reason=reason, body=body, text=text, content_type=content_type) self.headers['Allow'] = allow self.allowed_methods = set(allowed_methods) # type: Set[str] self.method = method.upper() class HTTPNotAcceptable(HTTPClientError): status_code = 406 class HTTPProxyAuthenticationRequired(HTTPClientError): status_code = 407 class HTTPRequestTimeout(HTTPClientError): status_code = 408 class HTTPConflict(HTTPClientError): status_code = 409 class HTTPGone(HTTPClientError): status_code = 410 class HTTPLengthRequired(HTTPClientError): status_code = 411 class HTTPPreconditionFailed(HTTPClientError): status_code = 412 class HTTPRequestEntityTooLarge(HTTPClientError): status_code = 413 def __init__(self, max_size: float, actual_size: float, **kwargs: Any) -> None: kwargs.setdefault( 'text', 'Maximum request body size {} exceeded, ' 'actual body size {}'.format(max_size, actual_size) ) super().__init__(**kwargs) class HTTPRequestURITooLong(HTTPClientError): status_code = 414 class HTTPUnsupportedMediaType(HTTPClientError): status_code = 415 class HTTPRequestRangeNotSatisfiable(HTTPClientError): status_code = 416 class HTTPExpectationFailed(HTTPClientError): status_code = 417 class HTTPMisdirectedRequest(HTTPClientError): status_code = 421 class HTTPUnprocessableEntity(HTTPClientError): status_code = 422 class HTTPFailedDependency(HTTPClientError): status_code = 424 class HTTPUpgradeRequired(HTTPClientError): status_code = 426 class HTTPPreconditionRequired(HTTPClientError): status_code = 428 class HTTPTooManyRequests(HTTPClientError): status_code = 429 class HTTPRequestHeaderFieldsTooLarge(HTTPClientError): status_code = 431 class HTTPUnavailableForLegalReasons(HTTPClientError): status_code = 451 def __init__(self, link: str, *, headers: Optional[LooseHeaders]=None, reason: 
Optional[str]=None, body: Any=None, text: Optional[str]=None, content_type: Optional[str]=None) -> None: super().__init__(headers=headers, reason=reason, body=body, text=text, content_type=content_type) self.headers['Link'] = '<%s>; rel="blocked-by"' % link self.link = link ############################################################ # 5xx Server Error ############################################################ # Response status codes beginning with the digit "5" indicate cases in # which the server is aware that it has erred or is incapable of # performing the request. Except when responding to a HEAD request, the # server SHOULD include an entity containing an explanation of the error # situation, and whether it is a temporary or permanent condition. User # agents SHOULD display any included entity to the user. These response # codes are applicable to any request method. class HTTPServerError(HTTPError): pass class HTTPInternalServerError(HTTPServerError): status_code = 500 class HTTPNotImplemented(HTTPServerError): status_code = 501 class HTTPBadGateway(HTTPServerError): status_code = 502 class HTTPServiceUnavailable(HTTPServerError): status_code = 503 class HTTPGatewayTimeout(HTTPServerError): status_code = 504 class HTTPVersionNotSupported(HTTPServerError): status_code = 505 class HTTPVariantAlsoNegotiates(HTTPServerError): status_code = 506 class HTTPInsufficientStorage(HTTPServerError): status_code = 507 class HTTPNotExtended(HTTPServerError): status_code = 510 class HTTPNetworkAuthenticationRequired(HTTPServerError): status_code = 511 aiohttp-3.6.2/aiohttp/web_fileresponse.py0000644000175100001650000003074413547410117021052 0ustar vstsdocker00000000000000import asyncio import mimetypes import os import pathlib from functools import partial from typing import ( # noqa IO, TYPE_CHECKING, Any, Awaitable, Callable, List, Optional, Union, cast, ) from . 
import hdrs from .abc import AbstractStreamWriter from .base_protocol import BaseProtocol from .helpers import set_exception, set_result from .http_writer import StreamWriter from .log import server_logger from .typedefs import LooseHeaders from .web_exceptions import ( HTTPNotModified, HTTPPartialContent, HTTPPreconditionFailed, HTTPRequestRangeNotSatisfiable, ) from .web_response import StreamResponse __all__ = ('FileResponse',) if TYPE_CHECKING: # pragma: no cover from .web_request import BaseRequest # noqa _T_OnChunkSent = Optional[Callable[[bytes], Awaitable[None]]] NOSENDFILE = bool(os.environ.get("AIOHTTP_NOSENDFILE")) class SendfileStreamWriter(StreamWriter): def __init__(self, protocol: BaseProtocol, loop: asyncio.AbstractEventLoop, fobj: IO[Any], count: int, on_chunk_sent: _T_OnChunkSent=None) -> None: super().__init__(protocol, loop, on_chunk_sent) self._sendfile_buffer = [] # type: List[bytes] self._fobj = fobj self._count = count self._offset = fobj.tell() self._in_fd = fobj.fileno() def _write(self, chunk: bytes) -> None: # we overwrite StreamWriter._write, so nothing can be appended to # _buffer, and nothing is written to the transport directly by the # parent class self.output_size += len(chunk) self._sendfile_buffer.append(chunk) def _sendfile_cb(self, fut: 'asyncio.Future[None]', out_fd: int) -> None: if fut.cancelled(): return try: if self._do_sendfile(out_fd): set_result(fut, None) except Exception as exc: set_exception(fut, exc) def _do_sendfile(self, out_fd: int) -> bool: try: n = os.sendfile(out_fd, self._in_fd, self._offset, self._count) if n == 0: # in_fd EOF reached n = self._count except (BlockingIOError, InterruptedError): n = 0 self.output_size += n self._offset += n self._count -= n assert self._count >= 0 return self._count == 0 def _done_fut(self, out_fd: int, fut: 'asyncio.Future[None]') -> None: self.loop.remove_writer(out_fd) async def sendfile(self) -> None: assert self.transport is not None out_socket = self.transport.get_extra_info('socket').dup() out_socket.setblocking(False) out_fd = out_socket.fileno() loop = self.loop data = b''.join(self._sendfile_buffer) try: await loop.sock_sendall(out_socket, data) if not self._do_sendfile(out_fd): fut = loop.create_future() fut.add_done_callback(partial(self._done_fut, out_fd)) loop.add_writer(out_fd, self._sendfile_cb, fut, out_fd) await fut except asyncio.CancelledError: raise except Exception: server_logger.debug('Socket error') self.transport.close() finally: out_socket.close() await super().write_eof() async def write_eof(self, chunk: bytes=b'') -> None: pass class FileResponse(StreamResponse): """A response object can be used to send files.""" def __init__(self, path: Union[str, pathlib.Path], chunk_size: int=256*1024, status: int=200, reason: Optional[str]=None, headers: Optional[LooseHeaders]=None) -> None: super().__init__(status=status, reason=reason, headers=headers) if isinstance(path, str): path = pathlib.Path(path) self._path = path self._chunk_size = chunk_size async def _sendfile_system(self, request: 'BaseRequest', fobj: IO[Any], count: int) -> AbstractStreamWriter: # Write count bytes of fobj to resp using # the os.sendfile system call. # # For details check # https://github.com/KeepSafe/aiohttp/issues/1177 # See https://github.com/KeepSafe/aiohttp/issues/958 for details # # request should be an aiohttp.web.Request instance. # fobj should be an open file object. # count should be an integer > 0. 
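# Note: os.sendfile() cannot be used over TLS transports, when no raw
# socket is exposed by the transport, or when response compression is
# enabled; the checks below route those cases to the chunked
# _sendfile_fallback() writer instead.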
transport = request.transport assert transport is not None if (transport.get_extra_info("sslcontext") or transport.get_extra_info("socket") is None or self.compression): writer = await self._sendfile_fallback(request, fobj, count) else: writer = SendfileStreamWriter( request.protocol, request._loop, fobj, count ) request._payload_writer = writer await super().prepare(request) await writer.sendfile() return writer async def _sendfile_fallback(self, request: 'BaseRequest', fobj: IO[Any], count: int) -> AbstractStreamWriter: # Mimic the _sendfile_system() method, but without using the # os.sendfile() system call. This should be used on systems # that don't support the os.sendfile(). # To keep memory usage low,fobj is transferred in chunks # controlled by the constructor's chunk_size argument. writer = await super().prepare(request) assert writer is not None chunk_size = self._chunk_size loop = asyncio.get_event_loop() chunk = await loop.run_in_executor(None, fobj.read, chunk_size) while chunk: await writer.write(chunk) count = count - chunk_size if count <= 0: break chunk = await loop.run_in_executor( None, fobj.read, min(chunk_size, count) ) await writer.drain() return writer if hasattr(os, "sendfile") and not NOSENDFILE: # pragma: no cover _sendfile = _sendfile_system else: # pragma: no cover _sendfile = _sendfile_fallback async def prepare( self, request: 'BaseRequest' ) -> Optional[AbstractStreamWriter]: filepath = self._path gzip = False if 'gzip' in request.headers.get(hdrs.ACCEPT_ENCODING, ''): gzip_path = filepath.with_name(filepath.name + '.gz') if gzip_path.is_file(): filepath = gzip_path gzip = True loop = asyncio.get_event_loop() st = await loop.run_in_executor(None, filepath.stat) modsince = request.if_modified_since if modsince is not None and st.st_mtime <= modsince.timestamp(): self.set_status(HTTPNotModified.status_code) self._length_check = False # Delete any Content-Length headers provided by user. HTTP 304 # should always have empty response body return await super().prepare(request) unmodsince = request.if_unmodified_since if unmodsince is not None and st.st_mtime > unmodsince.timestamp(): self.set_status(HTTPPreconditionFailed.status_code) return await super().prepare(request) if hdrs.CONTENT_TYPE not in self.headers: ct, encoding = mimetypes.guess_type(str(filepath)) if not ct: ct = 'application/octet-stream' should_set_ct = True else: encoding = 'gzip' if gzip else None should_set_ct = False status = self._status file_size = st.st_size count = file_size start = None ifrange = request.if_range if ifrange is None or st.st_mtime <= ifrange.timestamp(): # If-Range header check: # condition = cached date >= last modification date # return 206 if True else 200. # if False: # Range header would not be processed, return 200 # if True but Range header missing # return 200 try: rng = request.http_range start = rng.start end = rng.stop except ValueError: # https://tools.ietf.org/html/rfc7233: # A server generating a 416 (Range Not Satisfiable) response to # a byte-range request SHOULD send a Content-Range header field # with an unsatisfied-range value. # The complete-length in a 416 response indicates the current # length of the selected representation. # # Will do the same below. 
Many servers ignore this and do not # send a Content-Range header with HTTP 416 self.headers[hdrs.CONTENT_RANGE] = 'bytes */{0}'.format( file_size) self.set_status(HTTPRequestRangeNotSatisfiable.status_code) return await super().prepare(request) # If a range request has been made, convert start, end slice # notation into file pointer offset and count if start is not None or end is not None: if start < 0 and end is None: # return tail of file start += file_size if start < 0: # if Range:bytes=-1000 in request header but file size # is only 200, there would be trouble without this start = 0 count = file_size - start else: # rfc7233:If the last-byte-pos value is # absent, or if the value is greater than or equal to # the current length of the representation data, # the byte range is interpreted as the remainder # of the representation (i.e., the server replaces the # value of last-byte-pos with a value that is one less than # the current length of the selected representation). count = min(end if end is not None else file_size, file_size) - start if start >= file_size: # HTTP 416 should be returned in this case. # # According to https://tools.ietf.org/html/rfc7233: # If a valid byte-range-set includes at least one # byte-range-spec with a first-byte-pos that is less than # the current length of the representation, or at least one # suffix-byte-range-spec with a non-zero suffix-length, # then the byte-range-set is satisfiable. Otherwise, the # byte-range-set is unsatisfiable. self.headers[hdrs.CONTENT_RANGE] = 'bytes */{0}'.format( file_size) self.set_status(HTTPRequestRangeNotSatisfiable.status_code) return await super().prepare(request) status = HTTPPartialContent.status_code # Even though you are sending the whole file, you should still # return a HTTP 206 for a Range request. self.set_status(status) if should_set_ct: self.content_type = ct # type: ignore if encoding: self.headers[hdrs.CONTENT_ENCODING] = encoding if gzip: self.headers[hdrs.VARY] = hdrs.ACCEPT_ENCODING self.last_modified = st.st_mtime # type: ignore self.content_length = count self.headers[hdrs.ACCEPT_RANGES] = 'bytes' real_start = cast(int, start) if status == HTTPPartialContent.status_code: self.headers[hdrs.CONTENT_RANGE] = 'bytes {0}-{1}/{2}'.format( real_start, real_start + count - 1, file_size) fobj = await loop.run_in_executor(None, filepath.open, 'rb') if start: # be aware that start could be None or int=0 here. await loop.run_in_executor(None, fobj.seek, start) try: return await self._sendfile(request, fobj, count) finally: await loop.run_in_executor(None, fobj.close) aiohttp-3.6.2/aiohttp/web_log.py0000644000175100001650000002007713547410117017133 0ustar vstsdocker00000000000000import datetime import functools import logging import os import re from collections import namedtuple from typing import Any, Callable, Dict, Iterable, List, Tuple # noqa from .abc import AbstractAccessLogger from .web_request import BaseRequest from .web_response import StreamResponse KeyMethod = namedtuple('KeyMethod', 'key method') class AccessLogger(AbstractAccessLogger): """Helper object to log access. 
Usage: log = logging.getLogger("spam") log_format = "%a %{User-Agent}i" access_logger = AccessLogger(log, log_format) access_logger.log(request, response, time) Format: %% The percent sign %a Remote IP-address (IP-address of proxy if using reverse proxy) %t Time when the request was started to process %P The process ID of the child that serviced the request %r First line of request %s Response status code %b Size of response in bytes, including HTTP headers %T Time taken to serve the request, in seconds %Tf Time taken to serve the request, in seconds with floating fraction in .06f format %D Time taken to serve the request, in microseconds %{FOO}i request.headers['FOO'] %{FOO}o response.headers['FOO'] %{FOO}e os.environ['FOO'] """ LOG_FORMAT_MAP = { 'a': 'remote_address', 't': 'request_start_time', 'P': 'process_id', 'r': 'first_request_line', 's': 'response_status', 'b': 'response_size', 'T': 'request_time', 'Tf': 'request_time_frac', 'D': 'request_time_micro', 'i': 'request_header', 'o': 'response_header', } LOG_FORMAT = '%a %t "%r" %s %b "%{Referer}i" "%{User-Agent}i"' FORMAT_RE = re.compile(r'%(\{([A-Za-z0-9\-_]+)\}([ioe])|[atPrsbOD]|Tf?)') CLEANUP_RE = re.compile(r'(%[^s])') _FORMAT_CACHE = {} # type: Dict[str, Tuple[str, List[KeyMethod]]] def __init__(self, logger: logging.Logger, log_format: str=LOG_FORMAT) -> None: """Initialise the logger. logger is a logger object to be used for logging. log_format is a string with apache compatible log format description. """ super().__init__(logger, log_format=log_format) _compiled_format = AccessLogger._FORMAT_CACHE.get(log_format) if not _compiled_format: _compiled_format = self.compile_format(log_format) AccessLogger._FORMAT_CACHE[log_format] = _compiled_format self._log_format, self._methods = _compiled_format def compile_format(self, log_format: str) -> Tuple[str, List[KeyMethod]]: """Translate log_format into form usable by modulo formatting All known atoms will be replaced with %s Also methods for formatting of those atoms will be added to _methods in appropriate order For example we have log_format = "%a %t" This format will be translated to "%s %s" Also contents of _methods will be [self._format_a, self._format_t] These method will be called and results will be passed to translated string format. 
Each _format_* method receive 'args' which is list of arguments given to self.log Exceptions are _format_e, _format_i and _format_o methods which also receive key name (by functools.partial) """ # list of (key, method) tuples, we don't use an OrderedDict as users # can repeat the same key more than once methods = list() for atom in self.FORMAT_RE.findall(log_format): if atom[1] == '': format_key1 = self.LOG_FORMAT_MAP[atom[0]] m = getattr(AccessLogger, '_format_%s' % atom[0]) key_method = KeyMethod(format_key1, m) else: format_key2 = (self.LOG_FORMAT_MAP[atom[2]], atom[1]) m = getattr(AccessLogger, '_format_%s' % atom[2]) key_method = KeyMethod(format_key2, functools.partial(m, atom[1])) methods.append(key_method) log_format = self.FORMAT_RE.sub(r'%s', log_format) log_format = self.CLEANUP_RE.sub(r'%\1', log_format) return log_format, methods @staticmethod def _format_i(key: str, request: BaseRequest, response: StreamResponse, time: float) -> str: if request is None: return '(no headers)' # suboptimal, make istr(key) once return request.headers.get(key, '-') @staticmethod def _format_o(key: str, request: BaseRequest, response: StreamResponse, time: float) -> str: # suboptimal, make istr(key) once return response.headers.get(key, '-') @staticmethod def _format_a(request: BaseRequest, response: StreamResponse, time: float) -> str: if request is None: return '-' ip = request.remote return ip if ip is not None else '-' @staticmethod def _format_t(request: BaseRequest, response: StreamResponse, time: float) -> str: now = datetime.datetime.utcnow() start_time = now - datetime.timedelta(seconds=time) return start_time.strftime('[%d/%b/%Y:%H:%M:%S +0000]') @staticmethod def _format_P(request: BaseRequest, response: StreamResponse, time: float) -> str: return "<%s>" % os.getpid() @staticmethod def _format_r(request: BaseRequest, response: StreamResponse, time: float) -> str: if request is None: return '-' return '%s %s HTTP/%s.%s' % (request.method, request.path_qs, request.version.major, request.version.minor) @staticmethod def _format_s(request: BaseRequest, response: StreamResponse, time: float) -> int: return response.status @staticmethod def _format_b(request: BaseRequest, response: StreamResponse, time: float) -> int: return response.body_length @staticmethod def _format_T(request: BaseRequest, response: StreamResponse, time: float) -> str: return str(round(time)) @staticmethod def _format_Tf(request: BaseRequest, response: StreamResponse, time: float) -> str: return '%06f' % time @staticmethod def _format_D(request: BaseRequest, response: StreamResponse, time: float) -> str: return str(round(time * 1000000)) def _format_line(self, request: BaseRequest, response: StreamResponse, time: float) -> Iterable[Tuple[str, Callable[[BaseRequest, StreamResponse, float], str]]]: return [(key, method(request, response, time)) for key, method in self._methods] def log(self, request: BaseRequest, response: StreamResponse, time: float) -> None: try: fmt_info = self._format_line(request, response, time) values = list() extra = dict() for key, value in fmt_info: values.append(value) if key.__class__ is str: extra[key] = value else: k1, k2 = key dct = extra.get(k1, {}) # type: Any dct[k2] = value extra[k1] = dct self.logger.info(self._log_format % tuple(values), extra=extra) except Exception: self.logger.exception("Error in logging") aiohttp-3.6.2/aiohttp/web_middlewares.py0000644000175100001650000001013613547410117020645 0ustar vstsdocker00000000000000import re from typing import TYPE_CHECKING, Awaitable, 
Callable, Tuple, Type, TypeVar from .web_exceptions import HTTPPermanentRedirect, _HTTPMove from .web_request import Request from .web_response import StreamResponse from .web_urldispatcher import SystemRoute __all__ = ( 'middleware', 'normalize_path_middleware', ) if TYPE_CHECKING: # pragma: no cover from .web_app import Application # noqa _Func = TypeVar('_Func') async def _check_request_resolves(request: Request, path: str) -> Tuple[bool, Request]: alt_request = request.clone(rel_url=path) match_info = await request.app.router.resolve(alt_request) alt_request._match_info = match_info # type: ignore if match_info.http_exception is None: return True, alt_request return False, request def middleware(f: _Func) -> _Func: f.__middleware_version__ = 1 # type: ignore return f _Handler = Callable[[Request], Awaitable[StreamResponse]] _Middleware = Callable[[Request, _Handler], Awaitable[StreamResponse]] def normalize_path_middleware( *, append_slash: bool=True, remove_slash: bool=False, merge_slashes: bool=True, redirect_class: Type[_HTTPMove]=HTTPPermanentRedirect) -> _Middleware: """ Middleware factory which produces a middleware that normalizes the path of a request. By normalizing it means: - Add or remove a trailing slash to the path. - Double slashes are replaced by one. The middleware returns as soon as it finds a path that resolves correctly. The order if both merge and append/remove are enabled is 1) merge slashes 2) append/remove slash 3) both merge slashes and append/remove slash. If the path resolves with at least one of those conditions, it will redirect to the new path. Only one of `append_slash` and `remove_slash` can be enabled. If both are `True` the factory will raise an assertion error If `append_slash` is `True` the middleware will append a slash when needed. If a resource is defined with trailing slash and the request comes without it, it will append it automatically. If `remove_slash` is `True`, `append_slash` must be `False`. When enabled the middleware will remove trailing slashes and redirect if the resource is defined If merge_slashes is True, merge multiple consecutive slashes in the path into one. """ correct_configuration = not (append_slash and remove_slash) assert correct_configuration, "Cannot both remove and append slash" @middleware async def impl(request: Request, handler: _Handler) -> StreamResponse: if isinstance(request.match_info.route, SystemRoute): paths_to_check = [] if '?' in request.raw_path: path, query = request.raw_path.split('?', 1) query = '?' 
+ query else: query = '' path = request.raw_path if merge_slashes: paths_to_check.append(re.sub('//+', '/', path)) if append_slash and not request.path.endswith('/'): paths_to_check.append(path + '/') if remove_slash and request.path.endswith('/'): paths_to_check.append(path[:-1]) if merge_slashes and append_slash: paths_to_check.append( re.sub('//+', '/', path + '/')) if merge_slashes and remove_slash: merged_slashes = re.sub('//+', '/', path) paths_to_check.append(merged_slashes[:-1]) for path in paths_to_check: resolves, request = await _check_request_resolves( request, path) if resolves: raise redirect_class(request.raw_path + query) return await handler(request) return impl def _fix_request_current_app(app: 'Application') -> _Middleware: @middleware async def impl(request: Request, handler: _Handler) -> StreamResponse: with request.match_info.set_current_app(app): return await handler(request) return impl aiohttp-3.6.2/aiohttp/web_protocol.py0000644000175100001650000005326013547410117020213 0ustar vstsdocker00000000000000import asyncio import asyncio.streams import traceback import warnings from collections import deque from contextlib import suppress from html import escape as html_escape from http import HTTPStatus from logging import Logger from typing import ( TYPE_CHECKING, Any, Awaitable, Callable, Optional, Type, cast, ) import yarl from .abc import AbstractAccessLogger, AbstractStreamWriter from .base_protocol import BaseProtocol from .helpers import CeilTimeout, current_task from .http import ( HttpProcessingError, HttpRequestParser, HttpVersion10, RawRequestMessage, StreamWriter, ) from .log import access_logger, server_logger from .streams import EMPTY_PAYLOAD, StreamReader from .tcp_helpers import tcp_keepalive from .web_exceptions import HTTPException from .web_log import AccessLogger from .web_request import BaseRequest from .web_response import Response, StreamResponse __all__ = ('RequestHandler', 'RequestPayloadError', 'PayloadAccessError') if TYPE_CHECKING: # pragma: no cover from .web_server import Server # noqa _RequestFactory = Callable[[RawRequestMessage, StreamReader, 'RequestHandler', AbstractStreamWriter, 'asyncio.Task[None]'], BaseRequest] _RequestHandler = Callable[[BaseRequest], Awaitable[StreamResponse]] ERROR = RawRequestMessage( 'UNKNOWN', '/', HttpVersion10, {}, {}, True, False, False, False, yarl.URL('/')) class RequestPayloadError(Exception): """Payload parsing error.""" class PayloadAccessError(Exception): """Payload was accessed after response was sent.""" class RequestHandler(BaseProtocol): """HTTP protocol implementation. RequestHandler handles incoming HTTP request. It reads request line, request headers and request payload and calls handle_request() method. By default it always returns with 404 response. RequestHandler handles errors in incoming request, like bad status line, bad headers or incomplete payload. If any error occurs, connection gets closed. 
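A handler of this class is normally created per connection by ``aiohttp.web_server.Server`` acting as the protocol factory; a minimal, illustrative sketch of that wiring (assuming ``handler`` is a coroutine accepting a ``BaseRequest``) is ``server = Server(handler)`` followed by ``await loop.create_server(server, '0.0.0.0', 8080)``.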
:param keepalive_timeout: number of seconds before closing keep-alive connection :type keepalive_timeout: int or None :param bool tcp_keepalive: TCP keep-alive is on, default is on :param bool debug: enable debug mode :param logger: custom logger object :type logger: aiohttp.log.server_logger :param access_log_class: custom class for access_logger :type access_log_class: aiohttp.abc.AbstractAccessLogger :param access_log: custom logging object :type access_log: aiohttp.log.server_logger :param str access_log_format: access log format string :param loop: Optional event loop :param int max_line_size: Optional maximum header line size :param int max_field_size: Optional maximum header field size :param int max_headers: Optional maximum header size """ KEEPALIVE_RESCHEDULE_DELAY = 1 __slots__ = ('_request_count', '_keepalive', '_manager', '_request_handler', '_request_factory', '_tcp_keepalive', '_keepalive_time', '_keepalive_handle', '_keepalive_timeout', '_lingering_time', '_messages', '_message_tail', '_waiter', '_error_handler', '_task_handler', '_upgrade', '_payload_parser', '_request_parser', '_reading_paused', 'logger', 'debug', 'access_log', 'access_logger', '_close', '_force_close') def __init__(self, manager: 'Server', *, loop: asyncio.AbstractEventLoop, keepalive_timeout: float=75., # NGINX default is 75 secs tcp_keepalive: bool=True, logger: Logger=server_logger, access_log_class: Type[AbstractAccessLogger]=AccessLogger, access_log: Logger=access_logger, access_log_format: str=AccessLogger.LOG_FORMAT, debug: bool=False, max_line_size: int=8190, max_headers: int=32768, max_field_size: int=8190, lingering_time: float=10.0): super().__init__(loop) self._request_count = 0 self._keepalive = False self._manager = manager # type: Optional[Server] self._request_handler = manager.request_handler # type: Optional[_RequestHandler] # noqa self._request_factory = manager.request_factory # type: Optional[_RequestFactory] # noqa self._tcp_keepalive = tcp_keepalive # placeholder to be replaced on keepalive timeout setup self._keepalive_time = 0.0 self._keepalive_handle = None # type: Optional[asyncio.Handle] self._keepalive_timeout = keepalive_timeout self._lingering_time = float(lingering_time) self._messages = deque() # type: Any # Python 3.5 has no typing.Deque self._message_tail = b'' self._waiter = None # type: Optional[asyncio.Future[None]] self._error_handler = None # type: Optional[asyncio.Task[None]] self._task_handler = None # type: Optional[asyncio.Task[None]] self._upgrade = False self._payload_parser = None # type: Any self._request_parser = HttpRequestParser( self, loop, max_line_size=max_line_size, max_field_size=max_field_size, max_headers=max_headers, payload_exception=RequestPayloadError) # type: Optional[HttpRequestParser] # noqa self.logger = logger self.debug = debug self.access_log = access_log if access_log: self.access_logger = access_log_class( access_log, access_log_format) # type: Optional[AbstractAccessLogger] # noqa else: self.access_logger = None self._close = False self._force_close = False def __repr__(self) -> str: return "<{} {}>".format( self.__class__.__name__, 'connected' if self.transport is not None else 'disconnected') @property def keepalive_timeout(self) -> float: return self._keepalive_timeout async def shutdown(self, timeout: Optional[float]=15.0) -> None: """Worker process is about to exit, we need cleanup everything and stop accepting requests. 
It is especially important for keep-alive connections.""" self._force_close = True if self._keepalive_handle is not None: self._keepalive_handle.cancel() if self._waiter: self._waiter.cancel() # wait for handlers with suppress(asyncio.CancelledError, asyncio.TimeoutError): with CeilTimeout(timeout, loop=self._loop): if (self._error_handler is not None and not self._error_handler.done()): await self._error_handler if (self._task_handler is not None and not self._task_handler.done()): await self._task_handler # force-close non-idle handler if self._task_handler is not None: self._task_handler.cancel() if self.transport is not None: self.transport.close() self.transport = None def connection_made(self, transport: asyncio.BaseTransport) -> None: super().connection_made(transport) real_transport = cast(asyncio.Transport, transport) if self._tcp_keepalive: tcp_keepalive(real_transport) self._task_handler = self._loop.create_task(self.start()) assert self._manager is not None self._manager.connection_made(self, real_transport) def connection_lost(self, exc: Optional[BaseException]) -> None: if self._manager is None: return self._manager.connection_lost(self, exc) super().connection_lost(exc) self._manager = None self._force_close = True self._request_factory = None self._request_handler = None self._request_parser = None if self._keepalive_handle is not None: self._keepalive_handle.cancel() if self._task_handler is not None: self._task_handler.cancel() if self._error_handler is not None: self._error_handler.cancel() self._task_handler = None if self._payload_parser is not None: self._payload_parser.feed_eof() self._payload_parser = None def set_parser(self, parser: Any) -> None: # Actual type is WebReader assert self._payload_parser is None self._payload_parser = parser if self._message_tail: self._payload_parser.feed_data(self._message_tail) self._message_tail = b'' def eof_received(self) -> None: pass def data_received(self, data: bytes) -> None: if self._force_close or self._close: return # parse http messages if self._payload_parser is None and not self._upgrade: assert self._request_parser is not None try: messages, upgraded, tail = self._request_parser.feed_data(data) except HttpProcessingError as exc: # something happened during parsing self._error_handler = self._loop.create_task( self.handle_parse_error( StreamWriter(self, self._loop), 400, exc, exc.message)) self.close() except Exception as exc: # 500: internal error self._error_handler = self._loop.create_task( self.handle_parse_error( StreamWriter(self, self._loop), 500, exc)) self.close() else: if messages: # sometimes the parser returns no messages for (msg, payload) in messages: self._request_count += 1 self._messages.append((msg, payload)) waiter = self._waiter if waiter is not None: if not waiter.done(): # don't set result twice waiter.set_result(None) self._upgrade = upgraded if upgraded and tail: self._message_tail = tail # no parser, just store elif self._payload_parser is None and self._upgrade and data: self._message_tail += data # feed payload elif data: eof, tail = self._payload_parser.feed_data(data) if eof: self.close() def keep_alive(self, val: bool) -> None: """Set keep-alive connection mode. :param bool val: new state. 
""" self._keepalive = val if self._keepalive_handle: self._keepalive_handle.cancel() self._keepalive_handle = None def close(self) -> None: """Stop accepting new pipelinig messages and close connection when handlers done processing messages""" self._close = True if self._waiter: self._waiter.cancel() def force_close(self) -> None: """Force close connection""" self._force_close = True if self._waiter: self._waiter.cancel() if self.transport is not None: self.transport.close() self.transport = None def log_access(self, request: BaseRequest, response: StreamResponse, time: float) -> None: if self.access_logger is not None: self.access_logger.log(request, response, time) def log_debug(self, *args: Any, **kw: Any) -> None: if self.debug: self.logger.debug(*args, **kw) def log_exception(self, *args: Any, **kw: Any) -> None: self.logger.exception(*args, **kw) def _process_keepalive(self) -> None: if self._force_close or not self._keepalive: return next = self._keepalive_time + self._keepalive_timeout # handler in idle state if self._waiter: if self._loop.time() > next: self.force_close() return # not all request handlers are done, # reschedule itself to next second self._keepalive_handle = self._loop.call_later( self.KEEPALIVE_RESCHEDULE_DELAY, self._process_keepalive) async def start(self) -> None: """Process incoming request. It reads request line, request headers and request payload, then calls handle_request() method. Subclass has to override handle_request(). start() handles various exceptions in request or response handling. Connection is being closed always unless keep_alive(True) specified. """ loop = self._loop handler = self._task_handler assert handler is not None manager = self._manager assert manager is not None keepalive_timeout = self._keepalive_timeout resp = None assert self._request_factory is not None assert self._request_handler is not None while not self._force_close: if not self._messages: try: # wait for next request self._waiter = loop.create_future() await self._waiter except asyncio.CancelledError: break finally: self._waiter = None message, payload = self._messages.popleft() if self.access_log: now = loop.time() manager.requests_count += 1 writer = StreamWriter(self, loop) request = self._request_factory( message, payload, self, writer, handler) try: # a new task is used for copy context vars (#3406) task = self._loop.create_task( self._request_handler(request)) try: resp = await task except HTTPException as exc: resp = exc except (asyncio.CancelledError, ConnectionError): self.log_debug('Ignored premature client disconnection') break except asyncio.TimeoutError as exc: self.log_debug('Request handler timed out.', exc_info=exc) resp = self.handle_error(request, 504) except Exception as exc: resp = self.handle_error(request, 500, exc) else: # Deprecation warning (See #2415) if getattr(resp, '__http_exception__', False): warnings.warn( "returning HTTPException object is deprecated " "(#2415) and will be removed, " "please raise the exception instead", DeprecationWarning) # Drop the processed task from asyncio.Task.all_tasks() early del task if self.debug: if not isinstance(resp, StreamResponse): if resp is None: raise RuntimeError("Missing return " "statement on request handler") else: raise RuntimeError("Web-handler should return " "a response instance, " "got {!r}".format(resp)) try: prepare_meth = resp.prepare except AttributeError: if resp is None: raise RuntimeError("Missing return " "statement on request handler") else: raise RuntimeError("Web-handler should return 
" "a response instance, " "got {!r}".format(resp)) try: await prepare_meth(request) await resp.write_eof() except ConnectionError: self.log_debug('Ignored premature client disconnection 2') break # notify server about keep-alive self._keepalive = bool(resp.keep_alive) # log access if self.access_log: self.log_access(request, resp, loop.time() - now) # check payload if not payload.is_eof(): lingering_time = self._lingering_time if not self._force_close and lingering_time: self.log_debug( 'Start lingering close timer for %s sec.', lingering_time) now = loop.time() end_t = now + lingering_time with suppress( asyncio.TimeoutError, asyncio.CancelledError): while not payload.is_eof() and now < end_t: with CeilTimeout(end_t - now, loop=loop): # read and ignore await payload.readany() now = loop.time() # if payload still uncompleted if not payload.is_eof() and not self._force_close: self.log_debug('Uncompleted request.') self.close() payload.set_exception(PayloadAccessError()) except asyncio.CancelledError: self.log_debug('Ignored premature client disconnection ') break except RuntimeError as exc: if self.debug: self.log_exception( 'Unhandled runtime exception', exc_info=exc) self.force_close() except Exception as exc: self.log_exception('Unhandled exception', exc_info=exc) self.force_close() finally: if self.transport is None and resp is not None: self.log_debug('Ignored premature client disconnection.') elif not self._force_close: if self._keepalive and not self._close: # start keep-alive timer if keepalive_timeout is not None: now = self._loop.time() self._keepalive_time = now if self._keepalive_handle is None: self._keepalive_handle = loop.call_at( now + keepalive_timeout, self._process_keepalive) else: break # remove handler, close transport if no handlers left if not self._force_close: self._task_handler = None if self.transport is not None and self._error_handler is None: self.transport.close() def handle_error(self, request: BaseRequest, status: int=500, exc: Optional[BaseException]=None, message: Optional[str]=None) -> StreamResponse: """Handle errors. Returns HTTP response with specific status code. Logs additional information. It always closes current connection.""" self.log_exception("Error handling request", exc_info=exc) ct = 'text/plain' if status == HTTPStatus.INTERNAL_SERVER_ERROR: title = '{0.value} {0.phrase}'.format( HTTPStatus.INTERNAL_SERVER_ERROR ) msg = HTTPStatus.INTERNAL_SERVER_ERROR.description tb = None if self.debug: with suppress(Exception): tb = traceback.format_exc() if 'text/html' in request.headers.get('Accept', ''): if tb: tb = html_escape(tb) msg = '

<h2>Traceback:</h2>\n<pre>{}</pre>'.format(tb) message = ( "<html><head>" "<title>{title}</title>" "</head><body>\n<h1>{title}</h1>
" "\n{msg}\n\n" ).format(title=title, msg=msg) ct = 'text/html' else: if tb: msg = tb message = title + '\n\n' + msg resp = Response(status=status, text=message, content_type=ct) resp.force_close() # some data already got sent, connection is broken if request.writer.output_size > 0 or self.transport is None: self.force_close() return resp async def handle_parse_error(self, writer: AbstractStreamWriter, status: int, exc: Optional[BaseException]=None, message: Optional[str]=None) -> None: request = BaseRequest( # type: ignore ERROR, EMPTY_PAYLOAD, self, writer, current_task(), self._loop) resp = self.handle_error(request, status, exc, message) await resp.prepare(request) await resp.write_eof() if self.transport is not None: self.transport.close() self._error_handler = None aiohttp-3.6.2/aiohttp/web_request.py0000644000175100001650000006225413547410117020045 0ustar vstsdocker00000000000000import asyncio import datetime import io import re import socket import string import tempfile import types import warnings from email.utils import parsedate from http.cookies import SimpleCookie from types import MappingProxyType from typing import ( # noqa TYPE_CHECKING, Any, Dict, Iterator, Mapping, MutableMapping, Optional, Tuple, Union, cast, ) from urllib.parse import parse_qsl import attr from multidict import CIMultiDict, CIMultiDictProxy, MultiDict, MultiDictProxy from yarl import URL from . import hdrs from .abc import AbstractStreamWriter from .helpers import DEBUG, ChainMapProxy, HeadersMixin, reify, sentinel from .http_parser import RawRequestMessage from .multipart import BodyPartReader, MultipartReader from .streams import EmptyStreamReader, StreamReader from .typedefs import ( DEFAULT_JSON_DECODER, JSONDecoder, LooseHeaders, RawHeaders, StrOrURL, ) from .web_exceptions import HTTPRequestEntityTooLarge from .web_response import StreamResponse __all__ = ('BaseRequest', 'FileField', 'Request') if TYPE_CHECKING: # pragma: no cover from .web_app import Application # noqa from .web_urldispatcher import UrlMappingMatchInfo # noqa from .web_protocol import RequestHandler # noqa @attr.s(frozen=True, slots=True) class FileField: name = attr.ib(type=str) filename = attr.ib(type=str) file = attr.ib(type=io.BufferedReader) content_type = attr.ib(type=str) headers = attr.ib(type=CIMultiDictProxy) # type: CIMultiDictProxy[str] _TCHAR = string.digits + string.ascii_letters + r"!#$%&'*+.^_`|~-" # '-' at the end to prevent interpretation as range in a char class _TOKEN = r'[{tchar}]+'.format(tchar=_TCHAR) _QDTEXT = r'[{}]'.format( r''.join(chr(c) for c in (0x09, 0x20, 0x21) + tuple(range(0x23, 0x7F)))) # qdtext includes 0x5C to escape 0x5D ('\]') # qdtext excludes obs-text (because obsoleted, and encoding not specified) _QUOTED_PAIR = r'\\[\t !-~]' _QUOTED_STRING = r'"(?:{quoted_pair}|{qdtext})*"'.format( qdtext=_QDTEXT, quoted_pair=_QUOTED_PAIR) _FORWARDED_PAIR = ( r'({token})=({token}|{quoted_string})(:\d{{1,4}})?'.format( token=_TOKEN, quoted_string=_QUOTED_STRING)) _QUOTED_PAIR_REPLACE_RE = re.compile(r'\\([\t !-~])') # same pattern as _QUOTED_PAIR but contains a capture group _FORWARDED_PAIR_RE = re.compile(_FORWARDED_PAIR) ############################################################ # HTTP Request ############################################################ class BaseRequest(MutableMapping[str, Any], HeadersMixin): POST_METHODS = {hdrs.METH_PATCH, hdrs.METH_POST, hdrs.METH_PUT, hdrs.METH_TRACE, hdrs.METH_DELETE} ATTRS = HeadersMixin.ATTRS | frozenset([ '_message', '_protocol', '_payload_writer', 
'_payload', '_headers', '_method', '_version', '_rel_url', '_post', '_read_bytes', '_state', '_cache', '_task', '_client_max_size', '_loop', '_transport_sslcontext', '_transport_peername']) def __init__(self, message: RawRequestMessage, payload: StreamReader, protocol: 'RequestHandler', payload_writer: AbstractStreamWriter, task: 'asyncio.Task[None]', loop: asyncio.AbstractEventLoop, *, client_max_size: int=1024**2, state: Optional[Dict[str, Any]]=None, scheme: Optional[str]=None, host: Optional[str]=None, remote: Optional[str]=None) -> None: if state is None: state = {} self._message = message self._protocol = protocol self._payload_writer = payload_writer self._payload = payload self._headers = message.headers self._method = message.method self._version = message.version self._rel_url = message.url self._post = None # type: Optional[MultiDictProxy[Union[str, bytes, FileField]]] # noqa self._read_bytes = None # type: Optional[bytes] self._state = state self._cache = {} # type: Dict[str, Any] self._task = task self._client_max_size = client_max_size self._loop = loop transport = self._protocol.transport assert transport is not None self._transport_sslcontext = transport.get_extra_info('sslcontext') self._transport_peername = transport.get_extra_info('peername') if scheme is not None: self._cache['scheme'] = scheme if host is not None: self._cache['host'] = host if remote is not None: self._cache['remote'] = remote def clone(self, *, method: str=sentinel, rel_url: StrOrURL=sentinel, headers: LooseHeaders=sentinel, scheme: str=sentinel, host: str=sentinel, remote: str=sentinel) -> 'BaseRequest': """Clone itself with replacement some attributes. Creates and returns a new instance of Request object. If no parameters are given, an exact copy is returned. If a parameter is not passed, it will reuse the one from the current request object. 
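For example (illustrative only), ``request.clone(rel_url='/other/path', scheme='https')`` returns a request identical to the original except for the relative URL and scheme.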
""" if self._read_bytes: raise RuntimeError("Cannot clone request " "after reading its content") dct = {} # type: Dict[str, Any] if method is not sentinel: dct['method'] = method if rel_url is not sentinel: new_url = URL(rel_url) dct['url'] = new_url dct['path'] = str(new_url) if headers is not sentinel: # a copy semantic dct['headers'] = CIMultiDictProxy(CIMultiDict(headers)) dct['raw_headers'] = tuple((k.encode('utf-8'), v.encode('utf-8')) for k, v in headers.items()) message = self._message._replace(**dct) kwargs = {} if scheme is not sentinel: kwargs['scheme'] = scheme if host is not sentinel: kwargs['host'] = host if remote is not sentinel: kwargs['remote'] = remote return self.__class__( message, self._payload, self._protocol, self._payload_writer, self._task, self._loop, client_max_size=self._client_max_size, state=self._state.copy(), **kwargs) @property def task(self) -> 'asyncio.Task[None]': return self._task @property def protocol(self) -> 'RequestHandler': return self._protocol @property def transport(self) -> Optional[asyncio.Transport]: if self._protocol is None: return None return self._protocol.transport @property def writer(self) -> AbstractStreamWriter: return self._payload_writer @reify def message(self) -> RawRequestMessage: warnings.warn("Request.message is deprecated", DeprecationWarning, stacklevel=3) return self._message @reify def rel_url(self) -> URL: return self._rel_url @reify def loop(self) -> asyncio.AbstractEventLoop: warnings.warn("request.loop property is deprecated", DeprecationWarning, stacklevel=2) return self._loop # MutableMapping API def __getitem__(self, key: str) -> Any: return self._state[key] def __setitem__(self, key: str, value: Any) -> None: self._state[key] = value def __delitem__(self, key: str) -> None: del self._state[key] def __len__(self) -> int: return len(self._state) def __iter__(self) -> Iterator[str]: return iter(self._state) ######## @reify def secure(self) -> bool: """A bool indicating if the request is handled with SSL.""" return self.scheme == 'https' @reify def forwarded(self) -> Tuple[Mapping[str, str], ...]: """A tuple containing all parsed Forwarded header(s). Makes an effort to parse Forwarded headers as specified by RFC 7239: - It adds one (immutable) dictionary per Forwarded 'field-value', ie per proxy. The element corresponds to the data in the Forwarded field-value added by the first proxy encountered by the client. Each subsequent item corresponds to those added by later proxies. - It checks that every value has valid syntax in general as specified in section 4: either a 'token' or a 'quoted-string'. - It un-escapes found escape sequences. - It does NOT validate 'by' and 'for' contents as specified in section 6. - It does NOT validate 'host' contents (Host ABNF). - It does NOT validate 'proto' contents for valid URI scheme names. 
Returns a tuple containing one or more immutable dicts """ elems = [] for field_value in self._message.headers.getall(hdrs.FORWARDED, ()): length = len(field_value) pos = 0 need_separator = False elem = {} # type: Dict[str, str] elems.append(types.MappingProxyType(elem)) while 0 <= pos < length: match = _FORWARDED_PAIR_RE.match(field_value, pos) if match is not None: # got a valid forwarded-pair if need_separator: # bad syntax here, skip to next comma pos = field_value.find(',', pos) else: name, value, port = match.groups() if value[0] == '"': # quoted string: remove quotes and unescape value = _QUOTED_PAIR_REPLACE_RE.sub(r'\1', value[1:-1]) if port: value += port elem[name.lower()] = value pos += len(match.group(0)) need_separator = True elif field_value[pos] == ',': # next forwarded-element need_separator = False elem = {} elems.append(types.MappingProxyType(elem)) pos += 1 elif field_value[pos] == ';': # next forwarded-pair need_separator = False pos += 1 elif field_value[pos] in ' \t': # Allow whitespace even between forwarded-pairs, though # RFC 7239 doesn't. This simplifies code and is in line # with Postel's law. pos += 1 else: # bad syntax here, skip to next comma pos = field_value.find(',', pos) return tuple(elems) @reify def scheme(self) -> str: """A string representing the scheme of the request. Hostname is resolved in this order: - overridden value by .clone(scheme=new_scheme) call. - type of connection to peer: HTTPS if socket is SSL, HTTP otherwise. 'http' or 'https'. """ if self._transport_sslcontext: return 'https' else: return 'http' @reify def method(self) -> str: """Read only property for getting HTTP method. The value is upper-cased str like 'GET', 'POST', 'PUT' etc. """ return self._method @reify def version(self) -> Tuple[int, int]: """Read only property for getting HTTP version of request. Returns aiohttp.protocol.HttpVersion instance. """ return self._version @reify def host(self) -> str: """Hostname of the request. Hostname is resolved in this order: - overridden value by .clone(host=new_host) call. - HOST HTTP header - socket.getfqdn() value """ host = self._message.headers.get(hdrs.HOST) if host is not None: return host else: return socket.getfqdn() @reify def remote(self) -> Optional[str]: """Remote IP of client initiated HTTP request. The IP is resolved in this order: - overridden value by .clone(remote=new_remote) call. - peername of opened socket """ if isinstance(self._transport_peername, (list, tuple)): return self._transport_peername[0] else: return self._transport_peername @reify def url(self) -> URL: url = URL.build(scheme=self.scheme, host=self.host) return url.join(self._rel_url) @reify def path(self) -> str: """The URL including *PATH INFO* without the host or scheme. E.g., ``/app/blog`` """ return self._rel_url.path @reify def path_qs(self) -> str: """The URL including PATH_INFO and the query string. E.g, /app/blog?id=10 """ return str(self._rel_url) @reify def raw_path(self) -> str: """ The URL including raw *PATH INFO* without the host or scheme. Warning, the path is unquoted and may contains non valid URL characters E.g., ``/my%2Fpath%7Cwith%21some%25strange%24characters`` """ return self._message.path @reify def query(self) -> 'MultiDictProxy[str]': """A multidict with all the variables in the query string.""" return self._rel_url.query @reify def query_string(self) -> str: """The query string in the URL. 
E.g., id=10 """ return self._rel_url.query_string @reify def headers(self) -> 'CIMultiDictProxy[str]': """A case-insensitive multidict proxy with all headers.""" return self._headers @reify def raw_headers(self) -> RawHeaders: """A sequence of pairs for all headers.""" return self._message.raw_headers @staticmethod def _http_date(_date_str: str) -> Optional[datetime.datetime]: """Process a date string, return a datetime object """ if _date_str is not None: timetuple = parsedate(_date_str) if timetuple is not None: return datetime.datetime(*timetuple[:6], tzinfo=datetime.timezone.utc) return None @reify def if_modified_since(self) -> Optional[datetime.datetime]: """The value of If-Modified-Since HTTP header, or None. This header is represented as a `datetime` object. """ return self._http_date(self.headers.get(hdrs.IF_MODIFIED_SINCE)) @reify def if_unmodified_since(self) -> Optional[datetime.datetime]: """The value of If-Unmodified-Since HTTP header, or None. This header is represented as a `datetime` object. """ return self._http_date(self.headers.get(hdrs.IF_UNMODIFIED_SINCE)) @reify def if_range(self) -> Optional[datetime.datetime]: """The value of If-Range HTTP header, or None. This header is represented as a `datetime` object. """ return self._http_date(self.headers.get(hdrs.IF_RANGE)) @reify def keep_alive(self) -> bool: """Is keepalive enabled by client?""" return not self._message.should_close @reify def cookies(self) -> Mapping[str, str]: """Return request cookies. A read-only dictionary-like object. """ raw = self.headers.get(hdrs.COOKIE, '') parsed = SimpleCookie(raw) return MappingProxyType( {key: val.value for key, val in parsed.items()}) @reify def http_range(self) -> slice: """The content of Range HTTP header. Return a slice instance. """ rng = self._headers.get(hdrs.RANGE) start, end = None, None if rng is not None: try: pattern = r'^bytes=(\d*)-(\d*)$' start, end = re.findall(pattern, rng)[0] except IndexError: # pattern was not found in header raise ValueError("range not in acceptable format") end = int(end) if end else None start = int(start) if start else None if start is None and end is not None: # end with no start is to return tail of content start = -end end = None if start is not None and end is not None: # end is inclusive in range header, exclusive for slice end += 1 if start >= end: raise ValueError('start cannot be after end') if start is end is None: # No valid range supplied raise ValueError('No start or end of range specified') return slice(start, end, 1) @reify def content(self) -> StreamReader: """Return raw payload stream.""" return self._payload @property def has_body(self) -> bool: """Return True if request's HTTP BODY can be read, False otherwise.""" warnings.warn( "Deprecated, use .can_read_body #2005", DeprecationWarning, stacklevel=2) return not self._payload.at_eof() @property def can_read_body(self) -> bool: """Return True if request's HTTP BODY can be read, False otherwise.""" return not self._payload.at_eof() @reify def body_exists(self) -> bool: """Return True if request has HTTP BODY, False otherwise.""" return type(self._payload) is not EmptyStreamReader async def release(self) -> None: """Release request. Eat unread part of HTTP BODY if present. """ while not self._payload.at_eof(): await self._payload.readany() async def read(self) -> bytes: """Read request body if present. Returns bytes object with full request content. 
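Typical, illustrative use inside a handler is ``data = await request.read()``; the whole body is buffered in memory and ``HTTPRequestEntityTooLarge`` is raised once ``client_max_size`` is exceeded.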
""" if self._read_bytes is None: body = bytearray() while True: chunk = await self._payload.readany() body.extend(chunk) if self._client_max_size: body_size = len(body) if body_size >= self._client_max_size: raise HTTPRequestEntityTooLarge( max_size=self._client_max_size, actual_size=body_size ) if not chunk: break self._read_bytes = bytes(body) return self._read_bytes async def text(self) -> str: """Return BODY as text using encoding from .charset.""" bytes_body = await self.read() encoding = self.charset or 'utf-8' return bytes_body.decode(encoding) async def json(self, *, loads: JSONDecoder=DEFAULT_JSON_DECODER) -> Any: """Return BODY as JSON.""" body = await self.text() return loads(body) async def multipart(self) -> MultipartReader: """Return async iterator to process BODY as multipart.""" return MultipartReader(self._headers, self._payload) async def post(self) -> 'MultiDictProxy[Union[str, bytes, FileField]]': """Return POST parameters.""" if self._post is not None: return self._post if self._method not in self.POST_METHODS: self._post = MultiDictProxy(MultiDict()) return self._post content_type = self.content_type if (content_type not in ('', 'application/x-www-form-urlencoded', 'multipart/form-data')): self._post = MultiDictProxy(MultiDict()) return self._post out = MultiDict() # type: MultiDict[Union[str, bytes, FileField]] if content_type == 'multipart/form-data': multipart = await self.multipart() max_size = self._client_max_size field = await multipart.next() while field is not None: size = 0 field_ct = field.headers.get(hdrs.CONTENT_TYPE) if isinstance(field, BodyPartReader): if field.filename and field_ct: # store file in temp file tmp = tempfile.TemporaryFile() chunk = await field.read_chunk(size=2**16) while chunk: chunk = field.decode(chunk) tmp.write(chunk) size += len(chunk) if 0 < max_size < size: raise HTTPRequestEntityTooLarge( max_size=max_size, actual_size=size ) chunk = await field.read_chunk(size=2**16) tmp.seek(0) ff = FileField(field.name, field.filename, cast(io.BufferedReader, tmp), field_ct, field.headers) out.add(field.name, ff) else: # deal with ordinary data value = await field.read(decode=True) if field_ct is None or \ field_ct.startswith('text/'): charset = field.get_charset(default='utf-8') out.add(field.name, value.decode(charset)) else: out.add(field.name, value) size += len(value) if 0 < max_size < size: raise HTTPRequestEntityTooLarge( max_size=max_size, actual_size=size ) else: raise ValueError( 'To decode nested multipart you need ' 'to use custom reader', ) field = await multipart.next() else: data = await self.read() if data: charset = self.charset or 'utf-8' out.extend( parse_qsl( data.rstrip().decode(charset), keep_blank_values=True, encoding=charset)) self._post = MultiDictProxy(out) return self._post def __repr__(self) -> str: ascii_encodable_path = self.path.encode('ascii', 'backslashreplace') \ .decode('ascii') return "<{} {} {} >".format(self.__class__.__name__, self._method, ascii_encodable_path) def __eq__(self, other: object) -> bool: return id(self) == id(other) def __bool__(self) -> bool: return True async def _prepare_hook(self, response: StreamResponse) -> None: return class Request(BaseRequest): ATTRS = BaseRequest.ATTRS | frozenset(['_match_info']) def __init__(self, *args: Any, **kwargs: Any) -> None: super().__init__(*args, **kwargs) # matchdict, route_name, handler # or information about traversal lookup # initialized after route resolving self._match_info = None # type: Optional[UrlMappingMatchInfo] if DEBUG: def 
__setattr__(self, name: str, val: Any) -> None: if name not in self.ATTRS: warnings.warn("Setting custom {}.{} attribute " "is discouraged".format(self.__class__.__name__, name), DeprecationWarning, stacklevel=2) super().__setattr__(name, val) def clone(self, *, method: str=sentinel, rel_url: StrOrURL=sentinel, headers: LooseHeaders=sentinel, scheme: str=sentinel, host: str=sentinel, remote: str=sentinel) -> 'Request': ret = super().clone(method=method, rel_url=rel_url, headers=headers, scheme=scheme, host=host, remote=remote) new_ret = cast(Request, ret) new_ret._match_info = self._match_info return new_ret @reify def match_info(self) -> 'UrlMappingMatchInfo': """Result of route resolving.""" match_info = self._match_info assert match_info is not None return match_info @property def app(self) -> 'Application': """Application instance.""" match_info = self._match_info assert match_info is not None return match_info.current_app @property def config_dict(self) -> ChainMapProxy: match_info = self._match_info assert match_info is not None lst = match_info.apps app = self.app idx = lst.index(app) sublist = list(reversed(lst[:idx + 1])) return ChainMapProxy(sublist) async def _prepare_hook(self, response: StreamResponse) -> None: match_info = self._match_info if match_info is None: return for app in match_info._apps: await app.on_response_prepare.send(self, response) aiohttp-3.6.2/aiohttp/web_response.py0000644000175100001650000006161513547410117020213 0ustar vstsdocker00000000000000import asyncio # noqa import collections.abc # noqa import datetime import enum import json import math import time import warnings import zlib from concurrent.futures import Executor from email.utils import parsedate from http.cookies import SimpleCookie from typing import ( # noqa TYPE_CHECKING, Any, Dict, Iterator, Mapping, MutableMapping, Optional, Tuple, Union, cast, ) from multidict import CIMultiDict, istr from . import hdrs, payload from .abc import AbstractStreamWriter from .helpers import HeadersMixin, rfc822_formatted_time, sentinel from .http import RESPONSES, SERVER_SOFTWARE, HttpVersion10, HttpVersion11 from .payload import Payload from .typedefs import JSONEncoder, LooseHeaders __all__ = ('ContentCoding', 'StreamResponse', 'Response', 'json_response') if TYPE_CHECKING: # pragma: no cover from .web_request import BaseRequest # noqa BaseClass = MutableMapping[str, Any] else: BaseClass = collections.abc.MutableMapping class ContentCoding(enum.Enum): # The content codings that we have support for. 
# # Additional registered codings are listed at: # https://www.iana.org/assignments/http-parameters/http-parameters.xhtml#content-coding deflate = 'deflate' gzip = 'gzip' identity = 'identity' ############################################################ # HTTP Response classes ############################################################ class StreamResponse(BaseClass, HeadersMixin): _length_check = True def __init__(self, *, status: int=200, reason: Optional[str]=None, headers: Optional[LooseHeaders]=None) -> None: self._body = None self._keep_alive = None # type: Optional[bool] self._chunked = False self._compression = False self._compression_force = None # type: Optional[ContentCoding] self._cookies = SimpleCookie() self._req = None # type: Optional[BaseRequest] self._payload_writer = None # type: Optional[AbstractStreamWriter] self._eof_sent = False self._body_length = 0 self._state = {} # type: Dict[str, Any] if headers is not None: self._headers = CIMultiDict(headers) # type: CIMultiDict[str] else: self._headers = CIMultiDict() self.set_status(status, reason) @property def prepared(self) -> bool: return self._payload_writer is not None @property def task(self) -> 'asyncio.Task[None]': return getattr(self._req, 'task', None) @property def status(self) -> int: return self._status @property def chunked(self) -> bool: return self._chunked @property def compression(self) -> bool: return self._compression @property def reason(self) -> str: return self._reason def set_status(self, status: int, reason: Optional[str]=None, _RESPONSES: Mapping[int, Tuple[str, str]]=RESPONSES) -> None: assert not self.prepared, \ 'Cannot change the response status code after ' \ 'the headers have been sent' self._status = int(status) if reason is None: try: reason = _RESPONSES[self._status][0] except Exception: reason = '' self._reason = reason @property def keep_alive(self) -> Optional[bool]: return self._keep_alive def force_close(self) -> None: self._keep_alive = False @property def body_length(self) -> int: return self._body_length @property def output_length(self) -> int: warnings.warn('output_length is deprecated', DeprecationWarning) assert self._payload_writer return self._payload_writer.buffer_size def enable_chunked_encoding(self, chunk_size: Optional[int]=None) -> None: """Enables automatic chunked transfer encoding.""" self._chunked = True if hdrs.CONTENT_LENGTH in self._headers: raise RuntimeError("You can't enable chunked encoding when " "a content length is set") if chunk_size is not None: warnings.warn('Chunk size is deprecated #1615', DeprecationWarning) def enable_compression(self, force: Optional[Union[bool, ContentCoding]]=None ) -> None: """Enables response compression encoding.""" # Backwards compatibility for when force was a bool <0.17. 
if type(force) == bool: force = ContentCoding.deflate if force else ContentCoding.identity warnings.warn("Using boolean for force is deprecated #3318", DeprecationWarning) elif force is not None: assert isinstance(force, ContentCoding), ("force should one of " "None, bool or " "ContentEncoding") self._compression = True self._compression_force = force @property def headers(self) -> 'CIMultiDict[str]': return self._headers @property def cookies(self) -> SimpleCookie: return self._cookies def set_cookie(self, name: str, value: str, *, expires: Optional[str]=None, domain: Optional[str]=None, max_age: Optional[Union[int, str]]=None, path: str='/', secure: Optional[str]=None, httponly: Optional[str]=None, version: Optional[str]=None) -> None: """Set or update response cookie. Sets new cookie or updates existent with new value. Also updates only those params which are not None. """ old = self._cookies.get(name) if old is not None and old.coded_value == '': # deleted cookie self._cookies.pop(name, None) self._cookies[name] = value c = self._cookies[name] if expires is not None: c['expires'] = expires elif c.get('expires') == 'Thu, 01 Jan 1970 00:00:00 GMT': del c['expires'] if domain is not None: c['domain'] = domain if max_age is not None: c['max-age'] = str(max_age) elif 'max-age' in c: del c['max-age'] c['path'] = path if secure is not None: c['secure'] = secure if httponly is not None: c['httponly'] = httponly if version is not None: c['version'] = version def del_cookie(self, name: str, *, domain: Optional[str]=None, path: str='/') -> None: """Delete cookie. Creates new empty expired cookie. """ # TODO: do we need domain/path here? self._cookies.pop(name, None) self.set_cookie(name, '', max_age=0, expires="Thu, 01 Jan 1970 00:00:00 GMT", domain=domain, path=path) @property def content_length(self) -> Optional[int]: # Just a placeholder for adding setter return super().content_length @content_length.setter def content_length(self, value: Optional[int]) -> None: if value is not None: value = int(value) if self._chunked: raise RuntimeError("You can't set content length when " "chunked encoding is enable") self._headers[hdrs.CONTENT_LENGTH] = str(value) else: self._headers.pop(hdrs.CONTENT_LENGTH, None) @property def content_type(self) -> str: # Just a placeholder for adding setter return super().content_type @content_type.setter def content_type(self, value: str) -> None: self.content_type # read header values if needed self._content_type = str(value) self._generate_content_type_header() @property def charset(self) -> Optional[str]: # Just a placeholder for adding setter return super().charset @charset.setter def charset(self, value: Optional[str]) -> None: ctype = self.content_type # read header values if needed if ctype == 'application/octet-stream': raise RuntimeError("Setting charset for application/octet-stream " "doesn't make sense, setup content_type first") assert self._content_dict is not None if value is None: self._content_dict.pop('charset', None) else: self._content_dict['charset'] = str(value).lower() self._generate_content_type_header() @property def last_modified(self) -> Optional[datetime.datetime]: """The value of Last-Modified HTTP header, or None. This header is represented as a `datetime` object. 
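The matching setter accepts an int/float POSIX timestamp, a ``datetime`` or a preformatted string; for example (illustrative), ``resp.last_modified = datetime.datetime.utcnow()`` serializes to an HTTP-date string in the ``Last-Modified`` header.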
""" httpdate = self._headers.get(hdrs.LAST_MODIFIED) if httpdate is not None: timetuple = parsedate(httpdate) if timetuple is not None: return datetime.datetime(*timetuple[:6], tzinfo=datetime.timezone.utc) return None @last_modified.setter def last_modified(self, value: Optional[ Union[int, float, datetime.datetime, str]]) -> None: if value is None: self._headers.pop(hdrs.LAST_MODIFIED, None) elif isinstance(value, (int, float)): self._headers[hdrs.LAST_MODIFIED] = time.strftime( "%a, %d %b %Y %H:%M:%S GMT", time.gmtime(math.ceil(value))) elif isinstance(value, datetime.datetime): self._headers[hdrs.LAST_MODIFIED] = time.strftime( "%a, %d %b %Y %H:%M:%S GMT", value.utctimetuple()) elif isinstance(value, str): self._headers[hdrs.LAST_MODIFIED] = value def _generate_content_type_header( self, CONTENT_TYPE: istr=hdrs.CONTENT_TYPE) -> None: assert self._content_dict is not None assert self._content_type is not None params = '; '.join("{}={}".format(k, v) for k, v in self._content_dict.items()) if params: ctype = self._content_type + '; ' + params else: ctype = self._content_type self._headers[CONTENT_TYPE] = ctype async def _do_start_compression(self, coding: ContentCoding) -> None: if coding != ContentCoding.identity: assert self._payload_writer is not None self._headers[hdrs.CONTENT_ENCODING] = coding.value self._payload_writer.enable_compression(coding.value) # Compressed payload may have different content length, # remove the header self._headers.popall(hdrs.CONTENT_LENGTH, None) async def _start_compression(self, request: 'BaseRequest') -> None: if self._compression_force: await self._do_start_compression(self._compression_force) else: accept_encoding = request.headers.get( hdrs.ACCEPT_ENCODING, '').lower() for coding in ContentCoding: if coding.value in accept_encoding: await self._do_start_compression(coding) return async def prepare( self, request: 'BaseRequest' ) -> Optional[AbstractStreamWriter]: if self._eof_sent: return None if self._payload_writer is not None: return self._payload_writer await request._prepare_hook(self) return await self._start(request) async def _start(self, request: 'BaseRequest') -> AbstractStreamWriter: self._req = request keep_alive = self._keep_alive if keep_alive is None: keep_alive = request.keep_alive self._keep_alive = keep_alive version = request.version writer = self._payload_writer = request._payload_writer headers = self._headers for cookie in self._cookies.values(): value = cookie.output(header='')[1:] headers.add(hdrs.SET_COOKIE, value) if self._compression: await self._start_compression(request) if self._chunked: if version != HttpVersion11: raise RuntimeError( "Using chunked encoding is forbidden " "for HTTP/{0.major}.{0.minor}".format(request.version)) writer.enable_chunking() headers[hdrs.TRANSFER_ENCODING] = 'chunked' if hdrs.CONTENT_LENGTH in headers: del headers[hdrs.CONTENT_LENGTH] elif self._length_check: writer.length = self.content_length if writer.length is None: if version >= HttpVersion11: writer.enable_chunking() headers[hdrs.TRANSFER_ENCODING] = 'chunked' if hdrs.CONTENT_LENGTH in headers: del headers[hdrs.CONTENT_LENGTH] else: keep_alive = False headers.setdefault(hdrs.CONTENT_TYPE, 'application/octet-stream') headers.setdefault(hdrs.DATE, rfc822_formatted_time()) headers.setdefault(hdrs.SERVER, SERVER_SOFTWARE) # connection header if hdrs.CONNECTION not in headers: if keep_alive: if version == HttpVersion10: headers[hdrs.CONNECTION] = 'keep-alive' else: if version == HttpVersion11: headers[hdrs.CONNECTION] = 'close' # status 
line status_line = 'HTTP/{}.{} {} {}'.format( version[0], version[1], self._status, self._reason) await writer.write_headers(status_line, headers) return writer async def write(self, data: bytes) -> None: assert isinstance(data, (bytes, bytearray, memoryview)), \ "data argument must be byte-ish (%r)" % type(data) if self._eof_sent: raise RuntimeError("Cannot call write() after write_eof()") if self._payload_writer is None: raise RuntimeError("Cannot call write() before prepare()") await self._payload_writer.write(data) async def drain(self) -> None: assert not self._eof_sent, "EOF has already been sent" assert self._payload_writer is not None, \ "Response has not been started" warnings.warn("drain method is deprecated, use await resp.write()", DeprecationWarning, stacklevel=2) await self._payload_writer.drain() async def write_eof(self, data: bytes=b'') -> None: assert isinstance(data, (bytes, bytearray, memoryview)), \ "data argument must be byte-ish (%r)" % type(data) if self._eof_sent: return assert self._payload_writer is not None, \ "Response has not been started" await self._payload_writer.write_eof(data) self._eof_sent = True self._req = None self._body_length = self._payload_writer.output_size self._payload_writer = None def __repr__(self) -> str: if self._eof_sent: info = "eof" elif self.prepared: assert self._req is not None info = "{} {} ".format(self._req.method, self._req.path) else: info = "not prepared" return "<{} {} {}>".format(self.__class__.__name__, self.reason, info) def __getitem__(self, key: str) -> Any: return self._state[key] def __setitem__(self, key: str, value: Any) -> None: self._state[key] = value def __delitem__(self, key: str) -> None: del self._state[key] def __len__(self) -> int: return len(self._state) def __iter__(self) -> Iterator[str]: return iter(self._state) def __hash__(self) -> int: return hash(id(self)) def __eq__(self, other: object) -> bool: return self is other class Response(StreamResponse): def __init__(self, *, body: Any=None, status: int=200, reason: Optional[str]=None, text: Optional[str]=None, headers: Optional[LooseHeaders]=None, content_type: Optional[str]=None, charset: Optional[str]=None, zlib_executor_size: Optional[int]=None, zlib_executor: Executor=None) -> None: if body is not None and text is not None: raise ValueError("body and text are not allowed together") if headers is None: real_headers = CIMultiDict() # type: CIMultiDict[str] elif not isinstance(headers, CIMultiDict): real_headers = CIMultiDict(headers) else: real_headers = headers # = cast('CIMultiDict[str]', headers) if content_type is not None and "charset" in content_type: raise ValueError("charset must not be in content_type " "argument") if text is not None: if hdrs.CONTENT_TYPE in real_headers: if content_type or charset: raise ValueError("passing both Content-Type header and " "content_type or charset params " "is forbidden") else: # fast path for filling headers if not isinstance(text, str): raise TypeError("text argument must be str (%r)" % type(text)) if content_type is None: content_type = 'text/plain' if charset is None: charset = 'utf-8' real_headers[hdrs.CONTENT_TYPE] = ( content_type + '; charset=' + charset) body = text.encode(charset) text = None else: if hdrs.CONTENT_TYPE in real_headers: if content_type is not None or charset is not None: raise ValueError("passing both Content-Type header and " "content_type or charset params " "is forbidden") else: if content_type is not None: if charset is not None: content_type += '; charset=' + charset 
real_headers[hdrs.CONTENT_TYPE] = content_type super().__init__(status=status, reason=reason, headers=real_headers) if text is not None: self.text = text else: self.body = body self._compressed_body = None # type: Optional[bytes] self._zlib_executor_size = zlib_executor_size self._zlib_executor = zlib_executor @property def body(self) -> Optional[Union[bytes, Payload]]: return self._body @body.setter def body(self, body: bytes, CONTENT_TYPE: istr=hdrs.CONTENT_TYPE, CONTENT_LENGTH: istr=hdrs.CONTENT_LENGTH) -> None: if body is None: self._body = None # type: Optional[bytes] self._body_payload = False # type: bool elif isinstance(body, (bytes, bytearray)): self._body = body self._body_payload = False else: try: self._body = body = payload.PAYLOAD_REGISTRY.get(body) except payload.LookupError: raise ValueError('Unsupported body type %r' % type(body)) self._body_payload = True headers = self._headers # set content-length header if needed if not self._chunked and CONTENT_LENGTH not in headers: size = body.size if size is not None: headers[CONTENT_LENGTH] = str(size) # set content-type if CONTENT_TYPE not in headers: headers[CONTENT_TYPE] = body.content_type # copy payload headers if body.headers: for (key, value) in body.headers.items(): if key not in headers: headers[key] = value self._compressed_body = None @property def text(self) -> Optional[str]: if self._body is None: return None return self._body.decode(self.charset or 'utf-8') @text.setter def text(self, text: str) -> None: assert text is None or isinstance(text, str), \ "text argument must be str (%r)" % type(text) if self.content_type == 'application/octet-stream': self.content_type = 'text/plain' if self.charset is None: self.charset = 'utf-8' self._body = text.encode(self.charset) self._body_payload = False self._compressed_body = None @property def content_length(self) -> Optional[int]: if self._chunked: return None if hdrs.CONTENT_LENGTH in self._headers: return super().content_length if self._compressed_body is not None: # Return length of the compressed body return len(self._compressed_body) elif self._body_payload: # A payload without content length, or a compressed payload return None elif self._body is not None: return len(self._body) else: return 0 @content_length.setter def content_length(self, value: Optional[int]) -> None: raise RuntimeError("Content length is set automatically") async def write_eof(self, data: bytes=b'') -> None: if self._eof_sent: return if self._compressed_body is None: body = self._body # type: Optional[Union[bytes, Payload]] else: body = self._compressed_body assert not data, "data arg is not supported, got {!r}".format(data) assert self._req is not None assert self._payload_writer is not None if body is not None: if (self._req._method == hdrs.METH_HEAD or self._status in [204, 304]): await super().write_eof() elif self._body_payload: payload = cast(Payload, body) await payload.write(self._payload_writer) await super().write_eof() else: await super().write_eof(cast(bytes, body)) else: await super().write_eof() async def _start(self, request: 'BaseRequest') -> AbstractStreamWriter: if not self._chunked and hdrs.CONTENT_LENGTH not in self._headers: if not self._body_payload: if self._body is not None: self._headers[hdrs.CONTENT_LENGTH] = str(len(self._body)) else: self._headers[hdrs.CONTENT_LENGTH] = '0' return await super()._start(request) def _compress_body(self, zlib_mode: int) -> None: compressobj = zlib.compressobj(wbits=zlib_mode) body_in = self._body assert body_in is not None 
self._compressed_body = \ compressobj.compress(body_in) + compressobj.flush() async def _do_start_compression(self, coding: ContentCoding) -> None: if self._body_payload or self._chunked: return await super()._do_start_compression(coding) if coding != ContentCoding.identity: # Instead of using _payload_writer.enable_compression, # compress the whole body zlib_mode = (16 + zlib.MAX_WBITS if coding == ContentCoding.gzip else -zlib.MAX_WBITS) body_in = self._body assert body_in is not None if self._zlib_executor_size is not None and \ len(body_in) > self._zlib_executor_size: await asyncio.get_event_loop().run_in_executor( self._zlib_executor, self._compress_body, zlib_mode) else: self._compress_body(zlib_mode) body_out = self._compressed_body assert body_out is not None self._headers[hdrs.CONTENT_ENCODING] = coding.value self._headers[hdrs.CONTENT_LENGTH] = str(len(body_out)) def json_response(data: Any=sentinel, *, text: str=None, body: bytes=None, status: int=200, reason: Optional[str]=None, headers: LooseHeaders=None, content_type: str='application/json', dumps: JSONEncoder=json.dumps) -> Response: if data is not sentinel: if text or body: raise ValueError( "only one of data, text, or body should be specified" ) else: text = dumps(data) return Response(text=text, body=body, status=status, reason=reason, headers=headers, content_type=content_type) aiohttp-3.6.2/aiohttp/web_routedef.py0000644000175100001650000001372313547410117020167 0ustar vstsdocker00000000000000import abc import os # noqa from typing import ( TYPE_CHECKING, Any, Awaitable, Callable, Dict, Iterator, List, Optional, Sequence, Type, Union, overload, ) import attr from . import hdrs from .abc import AbstractView from .typedefs import PathLike if TYPE_CHECKING: # pragma: no cover from .web_urldispatcher import UrlDispatcher from .web_request import Request from .web_response import StreamResponse else: Request = StreamResponse = UrlDispatcher = None __all__ = ('AbstractRouteDef', 'RouteDef', 'StaticDef', 'RouteTableDef', 'head', 'options', 'get', 'post', 'patch', 'put', 'delete', 'route', 'view', 'static') class AbstractRouteDef(abc.ABC): @abc.abstractmethod def register(self, router: UrlDispatcher) -> None: pass # pragma: no cover _SimpleHandler = Callable[[Request], Awaitable[StreamResponse]] _HandlerType = Union[Type[AbstractView], _SimpleHandler] @attr.s(frozen=True, repr=False, slots=True) class RouteDef(AbstractRouteDef): method = attr.ib(type=str) path = attr.ib(type=str) handler = attr.ib() # type: _HandlerType kwargs = attr.ib(type=Dict[str, Any]) def __repr__(self) -> str: info = [] for name, value in sorted(self.kwargs.items()): info.append(", {}={!r}".format(name, value)) return (" {handler.__name__!r}" "{info}>".format(method=self.method, path=self.path, handler=self.handler, info=''.join(info))) def register(self, router: UrlDispatcher) -> None: if self.method in hdrs.METH_ALL: reg = getattr(router, 'add_'+self.method.lower()) reg(self.path, self.handler, **self.kwargs) else: router.add_route(self.method, self.path, self.handler, **self.kwargs) @attr.s(frozen=True, repr=False, slots=True) class StaticDef(AbstractRouteDef): prefix = attr.ib(type=str) path = attr.ib() # type: PathLike kwargs = attr.ib(type=Dict[str, Any]) def __repr__(self) -> str: info = [] for name, value in sorted(self.kwargs.items()): info.append(", {}={!r}".format(name, value)) return (" {path}" "{info}>".format(prefix=self.prefix, path=self.path, info=''.join(info))) def register(self, router: UrlDispatcher) -> None: 
router.add_static(self.prefix, self.path, **self.kwargs) def route(method: str, path: str, handler: _HandlerType, **kwargs: Any) -> RouteDef: return RouteDef(method, path, handler, kwargs) def head(path: str, handler: _HandlerType, **kwargs: Any) -> RouteDef: return route(hdrs.METH_HEAD, path, handler, **kwargs) def options(path: str, handler: _HandlerType, **kwargs: Any) -> RouteDef: return route(hdrs.METH_OPTIONS, path, handler, **kwargs) def get(path: str, handler: _HandlerType, *, name: Optional[str]=None, allow_head: bool=True, **kwargs: Any) -> RouteDef: return route(hdrs.METH_GET, path, handler, name=name, allow_head=allow_head, **kwargs) def post(path: str, handler: _HandlerType, **kwargs: Any) -> RouteDef: return route(hdrs.METH_POST, path, handler, **kwargs) def put(path: str, handler: _HandlerType, **kwargs: Any) -> RouteDef: return route(hdrs.METH_PUT, path, handler, **kwargs) def patch(path: str, handler: _HandlerType, **kwargs: Any) -> RouteDef: return route(hdrs.METH_PATCH, path, handler, **kwargs) def delete(path: str, handler: _HandlerType, **kwargs: Any) -> RouteDef: return route(hdrs.METH_DELETE, path, handler, **kwargs) def view(path: str, handler: Type[AbstractView], **kwargs: Any) -> RouteDef: return route(hdrs.METH_ANY, path, handler, **kwargs) def static(prefix: str, path: PathLike, **kwargs: Any) -> StaticDef: return StaticDef(prefix, path, kwargs) _Deco = Callable[[_HandlerType], _HandlerType] class RouteTableDef(Sequence[AbstractRouteDef]): """Route definition table""" def __init__(self) -> None: self._items = [] # type: List[AbstractRouteDef] def __repr__(self) -> str: return "".format(len(self._items)) @overload def __getitem__(self, index: int) -> AbstractRouteDef: ... # noqa @overload # noqa def __getitem__(self, index: slice) -> List[AbstractRouteDef]: ... 
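# --- Editorial usage sketch; not part of the aiohttp source tree. ----------
# The route helpers and RouteTableDef above can be used as decorators or as a
# plain list of RouteDef objects; handler names and paths are illustrative
# assumptions.
from aiohttp import web

routes = web.RouteTableDef()

@routes.get('/')
async def index(request: web.Request) -> web.Response:
    return web.Response(text='index')

@routes.post('/items')
async def create_item(request: web.Request) -> web.Response:
    data = await request.json()
    return web.json_response(data, status=201)

def make_app() -> web.Application:
    app = web.Application()
    app.add_routes(routes)                     # RouteTableDef is a Sequence[AbstractRouteDef]
    app.add_routes([web.get('/ping', index)])  # the functional helpers work the same way
    return app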
# noqa def __getitem__(self, index): # type: ignore # noqa return self._items[index] def __iter__(self) -> Iterator[AbstractRouteDef]: return iter(self._items) def __len__(self) -> int: return len(self._items) def __contains__(self, item: object) -> bool: return item in self._items def route(self, method: str, path: str, **kwargs: Any) -> _Deco: def inner(handler: _HandlerType) -> _HandlerType: self._items.append(RouteDef(method, path, handler, kwargs)) return handler return inner def head(self, path: str, **kwargs: Any) -> _Deco: return self.route(hdrs.METH_HEAD, path, **kwargs) def get(self, path: str, **kwargs: Any) -> _Deco: return self.route(hdrs.METH_GET, path, **kwargs) def post(self, path: str, **kwargs: Any) -> _Deco: return self.route(hdrs.METH_POST, path, **kwargs) def put(self, path: str, **kwargs: Any) -> _Deco: return self.route(hdrs.METH_PUT, path, **kwargs) def patch(self, path: str, **kwargs: Any) -> _Deco: return self.route(hdrs.METH_PATCH, path, **kwargs) def delete(self, path: str, **kwargs: Any) -> _Deco: return self.route(hdrs.METH_DELETE, path, **kwargs) def view(self, path: str, **kwargs: Any) -> _Deco: return self.route(hdrs.METH_ANY, path, **kwargs) def static(self, prefix: str, path: PathLike, **kwargs: Any) -> None: self._items.append(StaticDef(prefix, path, kwargs)) aiohttp-3.6.2/aiohttp/web_runner.py0000644000175100001650000002560613547410117017666 0ustar vstsdocker00000000000000import asyncio import signal import socket from abc import ABC, abstractmethod from typing import Any, List, Optional, Set from yarl import URL from .web_app import Application from .web_server import Server try: from ssl import SSLContext except ImportError: SSLContext = object # type: ignore __all__ = ('BaseSite', 'TCPSite', 'UnixSite', 'NamedPipeSite', 'SockSite', 'BaseRunner', 'AppRunner', 'ServerRunner', 'GracefulExit') class GracefulExit(SystemExit): code = 1 def _raise_graceful_exit() -> None: raise GracefulExit() class BaseSite(ABC): __slots__ = ('_runner', '_shutdown_timeout', '_ssl_context', '_backlog', '_server') def __init__(self, runner: 'BaseRunner', *, shutdown_timeout: float=60.0, ssl_context: Optional[SSLContext]=None, backlog: int=128) -> None: if runner.server is None: raise RuntimeError("Call runner.setup() before making a site") self._runner = runner self._shutdown_timeout = shutdown_timeout self._ssl_context = ssl_context self._backlog = backlog self._server = None # type: Optional[asyncio.AbstractServer] @property @abstractmethod def name(self) -> str: pass # pragma: no cover @abstractmethod async def start(self) -> None: self._runner._reg_site(self) async def stop(self) -> None: self._runner._check_site(self) if self._server is None: self._runner._unreg_site(self) return # not started yet self._server.close() # named pipes do not have wait_closed property if hasattr(self._server, 'wait_closed'): await self._server.wait_closed() await self._runner.shutdown() assert self._runner.server await self._runner.server.shutdown(self._shutdown_timeout) self._runner._unreg_site(self) class TCPSite(BaseSite): __slots__ = ('_host', '_port', '_reuse_address', '_reuse_port') def __init__(self, runner: 'BaseRunner', host: str=None, port: int=None, *, shutdown_timeout: float=60.0, ssl_context: Optional[SSLContext]=None, backlog: int=128, reuse_address: Optional[bool]=None, reuse_port: Optional[bool]=None) -> None: super().__init__(runner, shutdown_timeout=shutdown_timeout, ssl_context=ssl_context, backlog=backlog) if host is None: host = "0.0.0.0" self._host = host if port is 
None: port = 8443 if self._ssl_context else 8080 self._port = port self._reuse_address = reuse_address self._reuse_port = reuse_port @property def name(self) -> str: scheme = 'https' if self._ssl_context else 'http' return str(URL.build(scheme=scheme, host=self._host, port=self._port)) async def start(self) -> None: await super().start() loop = asyncio.get_event_loop() server = self._runner.server assert server is not None self._server = await loop.create_server( # type: ignore server, self._host, self._port, ssl=self._ssl_context, backlog=self._backlog, reuse_address=self._reuse_address, reuse_port=self._reuse_port) class UnixSite(BaseSite): __slots__ = ('_path', ) def __init__(self, runner: 'BaseRunner', path: str, *, shutdown_timeout: float=60.0, ssl_context: Optional[SSLContext]=None, backlog: int=128) -> None: super().__init__(runner, shutdown_timeout=shutdown_timeout, ssl_context=ssl_context, backlog=backlog) self._path = path @property def name(self) -> str: scheme = 'https' if self._ssl_context else 'http' return '{}://unix:{}:'.format(scheme, self._path) async def start(self) -> None: await super().start() loop = asyncio.get_event_loop() server = self._runner.server assert server is not None self._server = await loop.create_unix_server( server, self._path, ssl=self._ssl_context, backlog=self._backlog) class NamedPipeSite(BaseSite): __slots__ = ('_path', ) def __init__(self, runner: 'BaseRunner', path: str, *, shutdown_timeout: float=60.0) -> None: loop = asyncio.get_event_loop() if not isinstance(loop, asyncio.ProactorEventLoop): # type: ignore raise RuntimeError("Named Pipes only available in proactor" "loop under windows") super().__init__(runner, shutdown_timeout=shutdown_timeout) self._path = path @property def name(self) -> str: return self._path async def start(self) -> None: await super().start() loop = asyncio.get_event_loop() server = self._runner.server assert server is not None _server = await loop.start_serving_pipe( # type: ignore server, self._path ) self._server = _server[0] class SockSite(BaseSite): __slots__ = ('_sock', '_name') def __init__(self, runner: 'BaseRunner', sock: socket.socket, *, shutdown_timeout: float=60.0, ssl_context: Optional[SSLContext]=None, backlog: int=128) -> None: super().__init__(runner, shutdown_timeout=shutdown_timeout, ssl_context=ssl_context, backlog=backlog) self._sock = sock scheme = 'https' if self._ssl_context else 'http' if hasattr(socket, 'AF_UNIX') and sock.family == socket.AF_UNIX: name = '{}://unix:{}:'.format(scheme, sock.getsockname()) else: host, port = sock.getsockname()[:2] name = str(URL.build(scheme=scheme, host=host, port=port)) self._name = name @property def name(self) -> str: return self._name async def start(self) -> None: await super().start() loop = asyncio.get_event_loop() server = self._runner.server assert server is not None self._server = await loop.create_server( # type: ignore server, sock=self._sock, ssl=self._ssl_context, backlog=self._backlog) class BaseRunner(ABC): __slots__ = ('_handle_signals', '_kwargs', '_server', '_sites') def __init__(self, *, handle_signals: bool=False, **kwargs: Any) -> None: self._handle_signals = handle_signals self._kwargs = kwargs self._server = None # type: Optional[Server] self._sites = [] # type: List[BaseSite] @property def server(self) -> Optional[Server]: return self._server @property def addresses(self) -> List[str]: ret = [] # type: List[str] for site in self._sites: server = site._server if server is not None: sockets = server.sockets if sockets is not None: for 
sock in sockets: ret.append(sock.getsockname()) return ret @property def sites(self) -> Set[BaseSite]: return set(self._sites) async def setup(self) -> None: loop = asyncio.get_event_loop() if self._handle_signals: try: loop.add_signal_handler(signal.SIGINT, _raise_graceful_exit) loop.add_signal_handler(signal.SIGTERM, _raise_graceful_exit) except NotImplementedError: # pragma: no cover # add_signal_handler is not implemented on Windows pass self._server = await self._make_server() @abstractmethod async def shutdown(self) -> None: pass # pragma: no cover async def cleanup(self) -> None: loop = asyncio.get_event_loop() if self._server is None: # no started yet, do nothing return # The loop over sites is intentional, an exception on gather() # leaves self._sites in unpredictable state. # The loop guaranties that a site is either deleted on success or # still present on failure for site in list(self._sites): await site.stop() await self._cleanup_server() self._server = None if self._handle_signals: try: loop.remove_signal_handler(signal.SIGINT) loop.remove_signal_handler(signal.SIGTERM) except NotImplementedError: # pragma: no cover # remove_signal_handler is not implemented on Windows pass @abstractmethod async def _make_server(self) -> Server: pass # pragma: no cover @abstractmethod async def _cleanup_server(self) -> None: pass # pragma: no cover def _reg_site(self, site: BaseSite) -> None: if site in self._sites: raise RuntimeError("Site {} is already registered in runner {}" .format(site, self)) self._sites.append(site) def _check_site(self, site: BaseSite) -> None: if site not in self._sites: raise RuntimeError("Site {} is not registered in runner {}" .format(site, self)) def _unreg_site(self, site: BaseSite) -> None: if site not in self._sites: raise RuntimeError("Site {} is not registered in runner {}" .format(site, self)) self._sites.remove(site) class ServerRunner(BaseRunner): """Low-level web server runner""" __slots__ = ('_web_server',) def __init__(self, web_server: Server, *, handle_signals: bool=False, **kwargs: Any) -> None: super().__init__(handle_signals=handle_signals, **kwargs) self._web_server = web_server async def shutdown(self) -> None: pass async def _make_server(self) -> Server: return self._web_server async def _cleanup_server(self) -> None: pass class AppRunner(BaseRunner): """Web Application runner""" __slots__ = ('_app',) def __init__(self, app: Application, *, handle_signals: bool=False, **kwargs: Any) -> None: super().__init__(handle_signals=handle_signals, **kwargs) if not isinstance(app, Application): raise TypeError("The first argument should be web.Application " "instance, got {!r}".format(app)) self._app = app @property def app(self) -> Application: return self._app async def shutdown(self) -> None: await self._app.shutdown() async def _make_server(self) -> Server: loop = asyncio.get_event_loop() self._app._set_loop(loop) self._app.on_startup.freeze() await self._app.startup() self._app.freeze() return self._app._make_handler(loop=loop, **self._kwargs) async def _cleanup_server(self) -> None: await self._app.cleanup() aiohttp-3.6.2/aiohttp/web_server.py0000644000175100001650000000416513547410117017660 0ustar vstsdocker00000000000000"""Low level HTTP server.""" import asyncio from typing import Any, Awaitable, Callable, Dict, List, Optional # noqa from .abc import AbstractStreamWriter from .helpers import get_running_loop from .http_parser import RawRequestMessage from .streams import StreamReader from .web_protocol import RequestHandler, _RequestFactory, 
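# --- Editorial usage sketch; not part of the aiohttp source tree. ----------
# Driving AppRunner/TCPSite above by hand instead of web.run_app(); the host,
# port and sleep interval are illustrative assumptions.
import asyncio
from aiohttp import web

async def _serve(app: web.Application) -> None:
    runner = web.AppRunner(app)
    await runner.setup()                            # builds the low-level Server
    site = web.TCPSite(runner, host='127.0.0.1', port=8080)
    await site.start()
    try:
        await asyncio.sleep(3600)                   # keep serving (placeholder)
    finally:
        await runner.cleanup()                      # stops all sites, then the server

if __name__ == '__main__':
    asyncio.get_event_loop().run_until_complete(_serve(web.Application()))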
_RequestHandler from .web_request import BaseRequest __all__ = ('Server',) class Server: def __init__(self, handler: _RequestHandler, *, request_factory: Optional[_RequestFactory]=None, loop: Optional[asyncio.AbstractEventLoop]=None, **kwargs: Any) -> None: self._loop = get_running_loop(loop) self._connections = {} # type: Dict[RequestHandler, asyncio.Transport] self._kwargs = kwargs self.requests_count = 0 self.request_handler = handler self.request_factory = request_factory or self._make_request @property def connections(self) -> List[RequestHandler]: return list(self._connections.keys()) def connection_made(self, handler: RequestHandler, transport: asyncio.Transport) -> None: self._connections[handler] = transport def connection_lost(self, handler: RequestHandler, exc: Optional[BaseException]=None) -> None: if handler in self._connections: del self._connections[handler] def _make_request(self, message: RawRequestMessage, payload: StreamReader, protocol: RequestHandler, writer: AbstractStreamWriter, task: 'asyncio.Task[None]') -> BaseRequest: return BaseRequest( message, payload, protocol, writer, task, self._loop) async def shutdown(self, timeout: Optional[float]=None) -> None: coros = [conn.shutdown(timeout) for conn in self._connections] await asyncio.gather(*coros, loop=self._loop) self._connections.clear() def __call__(self) -> RequestHandler: return RequestHandler(self, loop=self._loop, **self._kwargs) aiohttp-3.6.2/aiohttp/web_urldispatcher.py0000644000175100001650000011377413547410117021232 0ustar vstsdocker00000000000000import abc import asyncio import base64 import hashlib import inspect import keyword import os import re import warnings from contextlib import contextmanager from functools import wraps from pathlib import Path from types import MappingProxyType from typing import ( # noqa TYPE_CHECKING, Any, Awaitable, Callable, Container, Dict, Generator, Iterable, Iterator, List, Mapping, Optional, Set, Sized, Tuple, Type, Union, cast, ) from yarl import URL from . import hdrs from .abc import AbstractMatchInfo, AbstractRouter, AbstractView from .helpers import DEBUG from .http import HttpVersion11 from .typedefs import PathLike from .web_exceptions import ( HTTPException, HTTPExpectationFailed, HTTPForbidden, HTTPMethodNotAllowed, HTTPNotFound, ) from .web_fileresponse import FileResponse from .web_request import Request from .web_response import Response, StreamResponse from .web_routedef import AbstractRouteDef __all__ = ('UrlDispatcher', 'UrlMappingMatchInfo', 'AbstractResource', 'Resource', 'PlainResource', 'DynamicResource', 'AbstractRoute', 'ResourceRoute', 'StaticResource', 'View') if TYPE_CHECKING: # pragma: no cover from .web_app import Application # noqa BaseDict = Dict[str, str] else: BaseDict = dict HTTP_METHOD_RE = re.compile(r"^[0-9A-Za-z!#\$%&'\*\+\-\.\^_`\|~]+$") ROUTE_RE = re.compile(r'(\{[_a-zA-Z][^{}]*(?:\{[^{}]*\}[^{}]*)*\})') PATH_SEP = re.escape('/') _WebHandler = Callable[[Request], Awaitable[StreamResponse]] _ExpectHandler = Callable[[Request], Awaitable[None]] _Resolve = Tuple[Optional[AbstractMatchInfo], Set[str]] class AbstractResource(Sized, Iterable['AbstractRoute']): def __init__(self, *, name: Optional[str]=None) -> None: self._name = name @property def name(self) -> Optional[str]: return self._name @property @abc.abstractmethod def canonical(self) -> str: """Exposes the resource's canonical path. 
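# --- Editorial usage sketch; not part of the aiohttp source tree. ----------
# The Server class above wraps a bare request handler; with ServerRunner and
# TCPSite it forms the documented low-level server API.  The handler name and
# port are illustrative assumptions.
import asyncio
from aiohttp import web

async def _low_level_handler(request: web.BaseRequest) -> web.Response:
    return web.Response(text='low-level OK')

async def _run_low_level() -> None:
    server = web.Server(_low_level_handler)
    runner = web.ServerRunner(server)
    await runner.setup()
    site = web.TCPSite(runner, 'localhost', 8090)
    await site.start()
    await asyncio.sleep(3600)                       # keep serving (placeholder)
    await runner.cleanup()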
For example '/foo/bar/{name}' """ @abc.abstractmethod # pragma: no branch def url_for(self, **kwargs: str) -> URL: """Construct url for resource with additional params.""" @abc.abstractmethod # pragma: no branch async def resolve(self, request: Request) -> _Resolve: """Resolve resource Return (UrlMappingMatchInfo, allowed_methods) pair.""" @abc.abstractmethod def add_prefix(self, prefix: str) -> None: """Add a prefix to processed URLs. Required for subapplications support. """ @abc.abstractmethod def get_info(self) -> Dict[str, Any]: """Return a dict with additional info useful for introspection""" def freeze(self) -> None: pass @abc.abstractmethod def raw_match(self, path: str) -> bool: """Perform a raw match against path""" class AbstractRoute(abc.ABC): def __init__(self, method: str, handler: Union[_WebHandler, Type[AbstractView]], *, expect_handler: _ExpectHandler=None, resource: AbstractResource=None) -> None: if expect_handler is None: expect_handler = _default_expect_handler assert asyncio.iscoroutinefunction(expect_handler), \ 'Coroutine is expected, got {!r}'.format(expect_handler) method = method.upper() if not HTTP_METHOD_RE.match(method): raise ValueError("{} is not allowed HTTP method".format(method)) assert callable(handler), handler if asyncio.iscoroutinefunction(handler): pass elif inspect.isgeneratorfunction(handler): warnings.warn("Bare generators are deprecated, " "use @coroutine wrapper", DeprecationWarning) elif (isinstance(handler, type) and issubclass(handler, AbstractView)): pass else: warnings.warn("Bare functions are deprecated, " "use async ones", DeprecationWarning) @wraps(handler) async def handler_wrapper(request: Request) -> StreamResponse: result = old_handler(request) if asyncio.iscoroutine(result): return await result return result # type: ignore old_handler = handler handler = handler_wrapper self._method = method self._handler = handler self._expect_handler = expect_handler self._resource = resource @property def method(self) -> str: return self._method @property def handler(self) -> _WebHandler: return self._handler @property @abc.abstractmethod def name(self) -> Optional[str]: """Optional route's name, always equals to resource's name.""" @property def resource(self) -> Optional[AbstractResource]: return self._resource @abc.abstractmethod def get_info(self) -> Dict[str, Any]: """Return a dict with additional info useful for introspection""" @abc.abstractmethod # pragma: no branch def url_for(self, *args: str, **kwargs: str) -> URL: """Construct url for route with additional params.""" async def handle_expect_header(self, request: Request) -> None: await self._expect_handler(request) class UrlMappingMatchInfo(BaseDict, AbstractMatchInfo): def __init__(self, match_dict: Dict[str, str], route: AbstractRoute): super().__init__(match_dict) self._route = route self._apps = [] # type: List[Application] self._current_app = None # type: Optional[Application] self._frozen = False @property def handler(self) -> _WebHandler: return self._route.handler @property def route(self) -> AbstractRoute: return self._route @property def expect_handler(self) -> _ExpectHandler: return self._route.handle_expect_header @property def http_exception(self) -> Optional[HTTPException]: return None def get_info(self) -> Dict[str, str]: return self._route.get_info() @property def apps(self) -> Tuple['Application', ...]: return tuple(self._apps) def add_app(self, app: 'Application') -> None: if self._frozen: raise RuntimeError("Cannot change apps stack after .freeze() call") if 
self._current_app is None: self._current_app = app self._apps.insert(0, app) @property def current_app(self) -> 'Application': app = self._current_app assert app is not None return app @contextmanager def set_current_app(self, app: 'Application') -> Generator[None, None, None]: if DEBUG: # pragma: no cover if app not in self._apps: raise RuntimeError( "Expected one of the following apps {!r}, got {!r}" .format(self._apps, app)) prev = self._current_app self._current_app = app try: yield finally: self._current_app = prev def freeze(self) -> None: self._frozen = True def __repr__(self) -> str: return "".format(super().__repr__(), self._route) class MatchInfoError(UrlMappingMatchInfo): def __init__(self, http_exception: HTTPException) -> None: self._exception = http_exception super().__init__({}, SystemRoute(self._exception)) @property def http_exception(self) -> HTTPException: return self._exception def __repr__(self) -> str: return "".format(self._exception.status, self._exception.reason) async def _default_expect_handler(request: Request) -> None: """Default handler for Expect header. Just send "100 Continue" to client. raise HTTPExpectationFailed if value of header is not "100-continue" """ expect = request.headers.get(hdrs.EXPECT) if request.version == HttpVersion11: if expect.lower() == "100-continue": await request.writer.write(b"HTTP/1.1 100 Continue\r\n\r\n") else: raise HTTPExpectationFailed(text="Unknown Expect: %s" % expect) class Resource(AbstractResource): def __init__(self, *, name: Optional[str]=None) -> None: super().__init__(name=name) self._routes = [] # type: List[ResourceRoute] def add_route(self, method: str, handler: Union[Type[AbstractView], _WebHandler], *, expect_handler: Optional[_ExpectHandler]=None ) -> 'ResourceRoute': for route_obj in self._routes: if route_obj.method == method or route_obj.method == hdrs.METH_ANY: raise RuntimeError("Added route will never be executed, " "method {route.method} is already " "registered".format(route=route_obj)) route_obj = ResourceRoute(method, handler, self, expect_handler=expect_handler) self.register_route(route_obj) return route_obj def register_route(self, route: 'ResourceRoute') -> None: assert isinstance(route, ResourceRoute), \ 'Instance of Route class is required, got {!r}'.format(route) self._routes.append(route) async def resolve(self, request: Request) -> _Resolve: allowed_methods = set() # type: Set[str] match_dict = self._match(request.rel_url.raw_path) if match_dict is None: return None, allowed_methods for route_obj in self._routes: route_method = route_obj.method allowed_methods.add(route_method) if (route_method == request.method or route_method == hdrs.METH_ANY): return (UrlMappingMatchInfo(match_dict, route_obj), allowed_methods) else: return None, allowed_methods @abc.abstractmethod def _match(self, path: str) -> Optional[Dict[str, str]]: pass # pragma: no cover def __len__(self) -> int: return len(self._routes) def __iter__(self) -> Iterator[AbstractRoute]: return iter(self._routes) # TODO: implement all abstract methods class PlainResource(Resource): def __init__(self, path: str, *, name: Optional[str]=None) -> None: super().__init__(name=name) assert not path or path.startswith('/') self._path = path @property def canonical(self) -> str: return self._path def freeze(self) -> None: if not self._path: self._path = '/' def add_prefix(self, prefix: str) -> None: assert prefix.startswith('/') assert not prefix.endswith('/') assert len(prefix) > 1 self._path = prefix + self._path def _match(self, path: str) -> 
Optional[Dict[str, str]]: # string comparison is about 10 times faster than regexp matching if self._path == path: return {} else: return None def raw_match(self, path: str) -> bool: return self._path == path def get_info(self) -> Dict[str, Any]: return {'path': self._path} def url_for(self) -> URL: # type: ignore return URL.build(path=self._path, encoded=True) def __repr__(self) -> str: name = "'" + self.name + "' " if self.name is not None else "" return "".format(name=name, path=self._path) class DynamicResource(Resource): DYN = re.compile(r'\{(?P[_a-zA-Z][_a-zA-Z0-9]*)\}') DYN_WITH_RE = re.compile( r'\{(?P[_a-zA-Z][_a-zA-Z0-9]*):(?P.+)\}') GOOD = r'[^{}/]+' def __init__(self, path: str, *, name: Optional[str]=None) -> None: super().__init__(name=name) pattern = '' formatter = '' for part in ROUTE_RE.split(path): match = self.DYN.fullmatch(part) if match: pattern += '(?P<{}>{})'.format(match.group('var'), self.GOOD) formatter += '{' + match.group('var') + '}' continue match = self.DYN_WITH_RE.fullmatch(part) if match: pattern += '(?P<{var}>{re})'.format(**match.groupdict()) formatter += '{' + match.group('var') + '}' continue if '{' in part or '}' in part: raise ValueError("Invalid path '{}'['{}']".format(path, part)) path = URL.build(path=part).raw_path formatter += path pattern += re.escape(path) try: compiled = re.compile(pattern) except re.error as exc: raise ValueError( "Bad pattern '{}': {}".format(pattern, exc)) from None assert compiled.pattern.startswith(PATH_SEP) assert formatter.startswith('/') self._pattern = compiled self._formatter = formatter @property def canonical(self) -> str: return self._formatter def add_prefix(self, prefix: str) -> None: assert prefix.startswith('/') assert not prefix.endswith('/') assert len(prefix) > 1 self._pattern = re.compile(re.escape(prefix)+self._pattern.pattern) self._formatter = prefix + self._formatter def _match(self, path: str) -> Optional[Dict[str, str]]: match = self._pattern.fullmatch(path) if match is None: return None else: return {key: URL.build(path=value, encoded=True).path for key, value in match.groupdict().items()} def raw_match(self, path: str) -> bool: return self._formatter == path def get_info(self) -> Dict[str, Any]: return {'formatter': self._formatter, 'pattern': self._pattern} def url_for(self, **parts: str) -> URL: url = self._formatter.format_map({k: URL.build(path=v).raw_path for k, v in parts.items()}) return URL.build(path=url) def __repr__(self) -> str: name = "'" + self.name + "' " if self.name is not None else "" return ("" .format(name=name, formatter=self._formatter)) class PrefixResource(AbstractResource): def __init__(self, prefix: str, *, name: Optional[str]=None) -> None: assert not prefix or prefix.startswith('/'), prefix assert prefix in ('', '/') or not prefix.endswith('/'), prefix super().__init__(name=name) self._prefix = URL.build(path=prefix).raw_path @property def canonical(self) -> str: return self._prefix def add_prefix(self, prefix: str) -> None: assert prefix.startswith('/') assert not prefix.endswith('/') assert len(prefix) > 1 self._prefix = prefix + self._prefix def raw_match(self, prefix: str) -> bool: return False # TODO: impl missing abstract methods class StaticResource(PrefixResource): VERSION_KEY = 'v' def __init__(self, prefix: str, directory: PathLike, *, name: Optional[str]=None, expect_handler: Optional[_ExpectHandler]=None, chunk_size: int=256 * 1024, show_index: bool=False, follow_symlinks: bool=False, append_version: bool=False) -> None: super().__init__(prefix, name=name) 
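# --- Editorial usage sketch; not part of the aiohttp source tree. ----------
# Named resources backed by PlainResource/DynamicResource above; the route
# name 'user' and the path are illustrative assumptions.
from aiohttp import web

async def user(request: web.Request) -> web.Response:
    return web.Response(text=request.match_info['name'])

def _reverse_url_example() -> None:
    app = web.Application()
    app.router.add_get('/users/{name}', user, name='user')
    url = app.router['user'].url_for(name='bob')    # DynamicResource.url_for()
    assert str(url) == '/users/bob'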
try: directory = Path(directory) if str(directory).startswith('~'): directory = Path(os.path.expanduser(str(directory))) directory = directory.resolve() if not directory.is_dir(): raise ValueError('Not a directory') except (FileNotFoundError, ValueError) as error: raise ValueError( "No directory exists at '{}'".format(directory)) from error self._directory = directory self._show_index = show_index self._chunk_size = chunk_size self._follow_symlinks = follow_symlinks self._expect_handler = expect_handler self._append_version = append_version self._routes = {'GET': ResourceRoute('GET', self._handle, self, expect_handler=expect_handler), 'HEAD': ResourceRoute('HEAD', self._handle, self, expect_handler=expect_handler)} def url_for(self, *, filename: Union[str, Path], # type: ignore append_version: Optional[bool]=None) -> URL: if append_version is None: append_version = self._append_version if isinstance(filename, Path): filename = str(filename) while filename.startswith('/'): filename = filename[1:] filename = '/' + filename # filename is not encoded url = URL.build(path=self._prefix + filename) if append_version: try: if filename.startswith('/'): filename = filename[1:] filepath = self._directory.joinpath(filename).resolve() if not self._follow_symlinks: filepath.relative_to(self._directory) except (ValueError, FileNotFoundError): # ValueError for case when path point to symlink # with follow_symlinks is False return url # relatively safe if filepath.is_file(): # TODO cache file content # with file watcher for cache invalidation with open(str(filepath), mode='rb') as f: file_bytes = f.read() h = self._get_file_hash(file_bytes) url = url.with_query({self.VERSION_KEY: h}) return url return url @staticmethod def _get_file_hash(byte_array: bytes) -> str: m = hashlib.sha256() # todo sha256 can be configurable param m.update(byte_array) b64 = base64.urlsafe_b64encode(m.digest()) return b64.decode('ascii') def get_info(self) -> Dict[str, Any]: return {'directory': self._directory, 'prefix': self._prefix} def set_options_route(self, handler: _WebHandler) -> None: if 'OPTIONS' in self._routes: raise RuntimeError('OPTIONS route was set already') self._routes['OPTIONS'] = ResourceRoute( 'OPTIONS', handler, self, expect_handler=self._expect_handler) async def resolve(self, request: Request) -> _Resolve: path = request.rel_url.raw_path method = request.method allowed_methods = set(self._routes) if not path.startswith(self._prefix): return None, set() if method not in allowed_methods: return None, allowed_methods match_dict = {'filename': URL.build(path=path[len(self._prefix)+1:], encoded=True).path} return (UrlMappingMatchInfo(match_dict, self._routes[method]), allowed_methods) def __len__(self) -> int: return len(self._routes) def __iter__(self) -> Iterator[AbstractRoute]: return iter(self._routes.values()) async def _handle(self, request: Request) -> StreamResponse: rel_url = request.match_info['filename'] try: filename = Path(rel_url) if filename.anchor: # rel_url is an absolute name like # /static/\\machine_name\c$ or /static/D:\path # where the static dir is totally different raise HTTPForbidden() filepath = self._directory.joinpath(filename).resolve() if not self._follow_symlinks: filepath.relative_to(self._directory) except (ValueError, FileNotFoundError) as error: # relatively safe raise HTTPNotFound() from error except HTTPForbidden: raise except Exception as error: # perm error or other kind! 
request.app.logger.exception(error) raise HTTPNotFound() from error # on opening a dir, load its contents if allowed if filepath.is_dir(): if self._show_index: try: return Response(text=self._directory_as_html(filepath), content_type="text/html") except PermissionError: raise HTTPForbidden() else: raise HTTPForbidden() elif filepath.is_file(): return FileResponse(filepath, chunk_size=self._chunk_size) else: raise HTTPNotFound def _directory_as_html(self, filepath: Path) -> str: # returns directory's index as html # sanity check assert filepath.is_dir() relative_path_to_dir = filepath.relative_to(self._directory).as_posix() index_of = "Index of /{}".format(relative_path_to_dir) h1 = "

<h1>{}</h1>

".format(index_of) index_list = [] dir_index = filepath.iterdir() for _file in sorted(dir_index): # show file url as relative to static path rel_path = _file.relative_to(self._directory).as_posix() file_url = self._prefix + '/' + rel_path # if file is a directory, add '/' to the end of the name if _file.is_dir(): file_name = "{}/".format(_file.name) else: file_name = _file.name index_list.append( '
<li><a href="{url}">{name}</a></li>'.format(url=file_url, name=file_name) ) ul = "<ul>\n{}\n</ul>
    ".format('\n'.join(index_list)) body = "\n{}\n{}\n".format(h1, ul) head_str = "\n{}\n".format(index_of) html = "\n{}\n{}\n".format(head_str, body) return html def __repr__(self) -> str: name = "'" + self.name + "'" if self.name is not None else "" return " {directory!r}>".format( name=name, path=self._prefix, directory=self._directory) class PrefixedSubAppResource(PrefixResource): def __init__(self, prefix: str, app: 'Application') -> None: super().__init__(prefix) self._app = app for resource in app.router.resources(): resource.add_prefix(prefix) def add_prefix(self, prefix: str) -> None: super().add_prefix(prefix) for resource in self._app.router.resources(): resource.add_prefix(prefix) def url_for(self, *args: str, **kwargs: str) -> URL: raise RuntimeError(".url_for() is not supported " "by sub-application root") def get_info(self) -> Dict[str, Any]: return {'app': self._app, 'prefix': self._prefix} async def resolve(self, request: Request) -> _Resolve: if not request.url.raw_path.startswith(self._prefix + '/') and \ request.url.raw_path != self._prefix: return None, set() match_info = await self._app.router.resolve(request) match_info.add_app(self._app) if isinstance(match_info.http_exception, HTTPMethodNotAllowed): methods = match_info.http_exception.allowed_methods else: methods = set() return match_info, methods def __len__(self) -> int: return len(self._app.router.routes()) def __iter__(self) -> Iterator[AbstractRoute]: return iter(self._app.router.routes()) def __repr__(self) -> str: return " {app!r}>".format( prefix=self._prefix, app=self._app) class AbstractRuleMatching(abc.ABC): @abc.abstractmethod # pragma: no branch async def match(self, request: Request) -> bool: """Return bool if the request satisfies the criteria""" @abc.abstractmethod # pragma: no branch def get_info(self) -> Dict[str, Any]: """Return a dict with additional info useful for introspection""" @property @abc.abstractmethod # pragma: no branch def canonical(self) -> str: """Return a str""" class Domain(AbstractRuleMatching): re_part = re.compile(r"(?!-)[a-z\d-]{1,63}(? None: super().__init__() self._domain = self.validation(domain) @property def canonical(self) -> str: return self._domain def validation(self, domain: str) -> str: if not isinstance(domain, str): raise TypeError("Domain must be str") domain = domain.rstrip('.').lower() if not domain: raise ValueError("Domain cannot be empty") elif '://' in domain: raise ValueError("Scheme not supported") url = URL('http://' + domain) if not all( self.re_part.fullmatch(x) for x in url.raw_host.split(".")): # type: ignore raise ValueError("Domain not valid") if url.port == 80: return url.raw_host # type: ignore return '{}:{}'.format(url.raw_host, url.port) async def match(self, request: Request) -> bool: host = request.headers.get(hdrs.HOST) return host and self.match_domain(host) def match_domain(self, host: str) -> bool: return host.lower() == self._domain def get_info(self) -> Dict[str, Any]: return {'domain': self._domain} class MaskDomain(Domain): re_part = re.compile(r"(?!-)[a-z\d\*-]{1,63}(? 
None: super().__init__(domain) mask = self._domain.replace('.', r'\.').replace('*', '.*') self._mask = re.compile(mask) @property def canonical(self) -> str: return self._mask.pattern def match_domain(self, host: str) -> bool: return self._mask.fullmatch(host) is not None class MatchedSubAppResource(PrefixedSubAppResource): def __init__(self, rule: AbstractRuleMatching, app: 'Application') -> None: AbstractResource.__init__(self) self._prefix = '' self._app = app self._rule = rule @property def canonical(self) -> str: return self._rule.canonical def get_info(self) -> Dict[str, Any]: return {'app': self._app, 'rule': self._rule} async def resolve(self, request: Request) -> _Resolve: if not await self._rule.match(request): return None, set() match_info = await self._app.router.resolve(request) match_info.add_app(self._app) if isinstance(match_info.http_exception, HTTPMethodNotAllowed): methods = match_info.http_exception.allowed_methods else: methods = set() return match_info, methods def __repr__(self) -> str: return " {app!r}>" \ "".format(app=self._app) class ResourceRoute(AbstractRoute): """A route with resource""" def __init__(self, method: str, handler: Union[_WebHandler, Type[AbstractView]], resource: AbstractResource, *, expect_handler: Optional[_ExpectHandler]=None) -> None: super().__init__(method, handler, expect_handler=expect_handler, resource=resource) def __repr__(self) -> str: return " {handler!r}".format( method=self.method, resource=self._resource, handler=self.handler) @property def name(self) -> Optional[str]: return self._resource.name # type: ignore def url_for(self, *args: str, **kwargs: str) -> URL: """Construct url for route with additional params.""" return self._resource.url_for(*args, **kwargs) # type: ignore def get_info(self) -> Dict[str, Any]: return self._resource.get_info() # type: ignore class SystemRoute(AbstractRoute): def __init__(self, http_exception: HTTPException) -> None: super().__init__(hdrs.METH_ANY, self._handle) self._http_exception = http_exception def url_for(self, *args: str, **kwargs: str) -> URL: raise RuntimeError(".url_for() is not allowed for SystemRoute") @property def name(self) -> Optional[str]: return None def get_info(self) -> Dict[str, Any]: return {'http_exception': self._http_exception} async def _handle(self, request: Request) -> StreamResponse: raise self._http_exception @property def status(self) -> int: return self._http_exception.status @property def reason(self) -> str: return self._http_exception.reason def __repr__(self) -> str: return "".format(self=self) class View(AbstractView): async def _iter(self) -> StreamResponse: if self.request.method not in hdrs.METH_ALL: self._raise_allowed_methods() method = getattr(self, self.request.method.lower(), None) if method is None: self._raise_allowed_methods() resp = await method() return resp def __await__(self) -> Generator[Any, None, StreamResponse]: return self._iter().__await__() def _raise_allowed_methods(self) -> None: allowed_methods = { m for m in hdrs.METH_ALL if hasattr(self, m.lower())} raise HTTPMethodNotAllowed(self.request.method, allowed_methods) class ResourcesView(Sized, Iterable[AbstractResource], Container[AbstractResource]): def __init__(self, resources: List[AbstractResource]) -> None: self._resources = resources def __len__(self) -> int: return len(self._resources) def __iter__(self) -> Iterator[AbstractResource]: yield from self._resources def __contains__(self, resource: object) -> bool: return resource in self._resources class RoutesView(Sized, 
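# --- Editorial usage sketch; not part of the aiohttp source tree. ----------
# PrefixedSubAppResource and MatchedSubAppResource above back the public
# add_subapp()/add_domain() calls; the prefixes and domain names are
# illustrative assumptions.
from aiohttp import web

def _compose_apps() -> web.Application:
    root = web.Application()
    admin = web.Application()
    root.add_subapp('/admin/', admin)               # PrefixedSubAppResource
    api = web.Application()
    root.add_domain('api.example.com', api)         # MatchedSubAppResource via Domain
    return root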
Iterable[AbstractRoute], Container[AbstractRoute]): def __init__(self, resources: List[AbstractResource]): self._routes = [] # type: List[AbstractRoute] for resource in resources: for route in resource: self._routes.append(route) def __len__(self) -> int: return len(self._routes) def __iter__(self) -> Iterator[AbstractRoute]: yield from self._routes def __contains__(self, route: object) -> bool: return route in self._routes class UrlDispatcher(AbstractRouter, Mapping[str, AbstractResource]): NAME_SPLIT_RE = re.compile(r'[.:-]') def __init__(self) -> None: super().__init__() self._resources = [] # type: List[AbstractResource] self._named_resources = {} # type: Dict[str, AbstractResource] async def resolve(self, request: Request) -> AbstractMatchInfo: method = request.method allowed_methods = set() # type: Set[str] for resource in self._resources: match_dict, allowed = await resource.resolve(request) if match_dict is not None: return match_dict else: allowed_methods |= allowed else: if allowed_methods: return MatchInfoError(HTTPMethodNotAllowed(method, allowed_methods)) else: return MatchInfoError(HTTPNotFound()) def __iter__(self) -> Iterator[str]: return iter(self._named_resources) def __len__(self) -> int: return len(self._named_resources) def __contains__(self, resource: object) -> bool: return resource in self._named_resources def __getitem__(self, name: str) -> AbstractResource: return self._named_resources[name] def resources(self) -> ResourcesView: return ResourcesView(self._resources) def routes(self) -> RoutesView: return RoutesView(self._resources) def named_resources(self) -> Mapping[str, AbstractResource]: return MappingProxyType(self._named_resources) def register_resource(self, resource: AbstractResource) -> None: assert isinstance(resource, AbstractResource), \ 'Instance of AbstractResource class is required, got {!r}'.format( resource) if self.frozen: raise RuntimeError( "Cannot register a resource into frozen router.") name = resource.name if name is not None: parts = self.NAME_SPLIT_RE.split(name) for part in parts: if not part.isidentifier() or keyword.iskeyword(part): raise ValueError('Incorrect route name {!r}, ' 'the name should be a sequence of ' 'python identifiers separated ' 'by dash, dot or column'.format(name)) if name in self._named_resources: raise ValueError('Duplicate {!r}, ' 'already handled by {!r}' .format(name, self._named_resources[name])) self._named_resources[name] = resource self._resources.append(resource) def add_resource(self, path: str, *, name: Optional[str]=None) -> Resource: if path and not path.startswith('/'): raise ValueError("path should be started with / or be empty") # Reuse last added resource if path and name are the same if self._resources: resource = self._resources[-1] if resource.name == name and resource.raw_match(path): return cast(Resource, resource) if not ('{' in path or '}' in path or ROUTE_RE.search(path)): url = URL.build(path=path) resource = PlainResource(url.raw_path, name=name) self.register_resource(resource) return resource resource = DynamicResource(path, name=name) self.register_resource(resource) return resource def add_route(self, method: str, path: str, handler: Union[_WebHandler, Type[AbstractView]], *, name: Optional[str]=None, expect_handler: Optional[_ExpectHandler]=None ) -> AbstractRoute: resource = self.add_resource(path, name=name) return resource.add_route(method, handler, expect_handler=expect_handler) def add_static(self, prefix: str, path: PathLike, *, name: Optional[str]=None, expect_handler: 
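# --- Editorial usage sketch; not part of the aiohttp source tree. ----------
# A class-based handler built on the View class above and registered through
# UrlDispatcher.add_view(); the class name and path are illustrative
# assumptions.
from aiohttp import web

class ItemView(web.View):
    async def get(self) -> web.Response:
        return web.json_response({'id': self.request.match_info['id']})

    async def delete(self) -> web.Response:
        return web.Response(status=204)

def _register_view(app: web.Application) -> None:
    app.router.add_view('/items/{id}', ItemView)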
Optional[_ExpectHandler]=None, chunk_size: int=256 * 1024, show_index: bool=False, follow_symlinks: bool=False, append_version: bool=False) -> AbstractResource: """Add static files view. prefix - url prefix path - folder with files """ assert prefix.startswith('/') if prefix.endswith('/'): prefix = prefix[:-1] resource = StaticResource(prefix, path, name=name, expect_handler=expect_handler, chunk_size=chunk_size, show_index=show_index, follow_symlinks=follow_symlinks, append_version=append_version) self.register_resource(resource) return resource def add_head(self, path: str, handler: _WebHandler, **kwargs: Any) -> AbstractRoute: """ Shortcut for add_route with method HEAD """ return self.add_route(hdrs.METH_HEAD, path, handler, **kwargs) def add_options(self, path: str, handler: _WebHandler, **kwargs: Any) -> AbstractRoute: """ Shortcut for add_route with method OPTIONS """ return self.add_route(hdrs.METH_OPTIONS, path, handler, **kwargs) def add_get(self, path: str, handler: _WebHandler, *, name: Optional[str]=None, allow_head: bool=True, **kwargs: Any) -> AbstractRoute: """ Shortcut for add_route with method GET, if allow_head is true another route is added allowing head requests to the same endpoint """ resource = self.add_resource(path, name=name) if allow_head: resource.add_route(hdrs.METH_HEAD, handler, **kwargs) return resource.add_route(hdrs.METH_GET, handler, **kwargs) def add_post(self, path: str, handler: _WebHandler, **kwargs: Any) -> AbstractRoute: """ Shortcut for add_route with method POST """ return self.add_route(hdrs.METH_POST, path, handler, **kwargs) def add_put(self, path: str, handler: _WebHandler, **kwargs: Any) -> AbstractRoute: """ Shortcut for add_route with method PUT """ return self.add_route(hdrs.METH_PUT, path, handler, **kwargs) def add_patch(self, path: str, handler: _WebHandler, **kwargs: Any) -> AbstractRoute: """ Shortcut for add_route with method PATCH """ return self.add_route(hdrs.METH_PATCH, path, handler, **kwargs) def add_delete(self, path: str, handler: _WebHandler, **kwargs: Any) -> AbstractRoute: """ Shortcut for add_route with method DELETE """ return self.add_route(hdrs.METH_DELETE, path, handler, **kwargs) def add_view(self, path: str, handler: Type[AbstractView], **kwargs: Any) -> AbstractRoute: """ Shortcut for add_route with ANY methods for a class-based view """ return self.add_route(hdrs.METH_ANY, path, handler, **kwargs) def freeze(self) -> None: super().freeze() for resource in self._resources: resource.freeze() def add_routes(self, routes: Iterable[AbstractRouteDef]) -> None: """Append routes to route table. Parameter should be a sequence of RouteDef objects. """ for route_def in routes: route_def.register(self) aiohttp-3.6.2/aiohttp/web_ws.py0000644000175100001650000004107613547410117017005 0ustar vstsdocker00000000000000import asyncio import base64 import binascii import hashlib import json from typing import Any, Iterable, Optional, Tuple import async_timeout import attr from multidict import CIMultiDict from . 
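# --- Editorial usage sketch; not part of the aiohttp source tree. ----------
# The UrlDispatcher shortcuts above, including add_static() with show_index
# and append_version; the handler, paths and directory are illustrative
# assumptions.
from aiohttp import web

async def _index(request: web.Request) -> web.Response:
    return web.Response(text='hello')

def _build_app() -> web.Application:
    app = web.Application()
    app.router.add_get('/', _index)                 # also registers HEAD by default
    app.router.add_static('/static/', path='static/',
                          show_index=True,          # serve a directory listing
                          append_version=True)      # url_for() gets a ?v=<hash> query
    return app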
import hdrs from .abc import AbstractStreamWriter from .helpers import call_later, set_result from .http import ( WS_CLOSED_MESSAGE, WS_CLOSING_MESSAGE, WS_KEY, WebSocketError, WebSocketReader, WebSocketWriter, WSMessage, ) from .http import WSMsgType as WSMsgType from .http import ws_ext_gen, ws_ext_parse from .log import ws_logger from .streams import EofStream, FlowControlDataQueue from .typedefs import JSONDecoder, JSONEncoder from .web_exceptions import HTTPBadRequest, HTTPException from .web_request import BaseRequest from .web_response import StreamResponse __all__ = ('WebSocketResponse', 'WebSocketReady', 'WSMsgType',) THRESHOLD_CONNLOST_ACCESS = 5 @attr.s(frozen=True, slots=True) class WebSocketReady: ok = attr.ib(type=bool) protocol = attr.ib(type=Optional[str]) def __bool__(self) -> bool: return self.ok class WebSocketResponse(StreamResponse): _length_check = False def __init__(self, *, timeout: float=10.0, receive_timeout: Optional[float]=None, autoclose: bool=True, autoping: bool=True, heartbeat: Optional[float]=None, protocols: Iterable[str]=(), compress: bool=True, max_msg_size: int=4*1024*1024) -> None: super().__init__(status=101) self._protocols = protocols self._ws_protocol = None # type: Optional[str] self._writer = None # type: Optional[WebSocketWriter] self._reader = None # type: Optional[FlowControlDataQueue[WSMessage]] self._closed = False self._closing = False self._conn_lost = 0 self._close_code = None # type: Optional[int] self._loop = None # type: Optional[asyncio.AbstractEventLoop] self._waiting = None # type: Optional[asyncio.Future[bool]] self._exception = None # type: Optional[BaseException] self._timeout = timeout self._receive_timeout = receive_timeout self._autoclose = autoclose self._autoping = autoping self._heartbeat = heartbeat self._heartbeat_cb = None if heartbeat is not None: self._pong_heartbeat = heartbeat / 2.0 self._pong_response_cb = None self._compress = compress self._max_msg_size = max_msg_size def _cancel_heartbeat(self) -> None: if self._pong_response_cb is not None: self._pong_response_cb.cancel() self._pong_response_cb = None if self._heartbeat_cb is not None: self._heartbeat_cb.cancel() self._heartbeat_cb = None def _reset_heartbeat(self) -> None: self._cancel_heartbeat() if self._heartbeat is not None: self._heartbeat_cb = call_later( self._send_heartbeat, self._heartbeat, self._loop) def _send_heartbeat(self) -> None: if self._heartbeat is not None and not self._closed: # fire-and-forget a task is not perfect but maybe ok for # sending ping. Otherwise we need a long-living heartbeat # task in the class. 
self._loop.create_task(self._writer.ping()) # type: ignore if self._pong_response_cb is not None: self._pong_response_cb.cancel() self._pong_response_cb = call_later( self._pong_not_received, self._pong_heartbeat, self._loop) def _pong_not_received(self) -> None: if self._req is not None and self._req.transport is not None: self._closed = True self._close_code = 1006 self._exception = asyncio.TimeoutError() self._req.transport.close() async def prepare(self, request: BaseRequest) -> AbstractStreamWriter: # make pre-check to don't hide it by do_handshake() exceptions if self._payload_writer is not None: return self._payload_writer protocol, writer = self._pre_start(request) payload_writer = await super().prepare(request) assert payload_writer is not None self._post_start(request, protocol, writer) await payload_writer.drain() return payload_writer def _handshake(self, request: BaseRequest) -> Tuple['CIMultiDict[str]', str, bool, bool]: headers = request.headers if 'websocket' != headers.get(hdrs.UPGRADE, '').lower().strip(): raise HTTPBadRequest( text=('No WebSocket UPGRADE hdr: {}\n Can ' '"Upgrade" only to "WebSocket".') .format(headers.get(hdrs.UPGRADE))) if 'upgrade' not in headers.get(hdrs.CONNECTION, '').lower(): raise HTTPBadRequest( text='No CONNECTION upgrade hdr: {}'.format( headers.get(hdrs.CONNECTION))) # find common sub-protocol between client and server protocol = None if hdrs.SEC_WEBSOCKET_PROTOCOL in headers: req_protocols = [str(proto.strip()) for proto in headers[hdrs.SEC_WEBSOCKET_PROTOCOL].split(',')] for proto in req_protocols: if proto in self._protocols: protocol = proto break else: # No overlap found: Return no protocol as per spec ws_logger.warning( 'Client protocols %r don’t overlap server-known ones %r', req_protocols, self._protocols) # check supported version version = headers.get(hdrs.SEC_WEBSOCKET_VERSION, '') if version not in ('13', '8', '7'): raise HTTPBadRequest( text='Unsupported version: {}'.format(version)) # check client handshake for validity key = headers.get(hdrs.SEC_WEBSOCKET_KEY) try: if not key or len(base64.b64decode(key)) != 16: raise HTTPBadRequest( text='Handshake error: {!r}'.format(key)) except binascii.Error: raise HTTPBadRequest( text='Handshake error: {!r}'.format(key)) from None accept_val = base64.b64encode( hashlib.sha1(key.encode() + WS_KEY).digest()).decode() response_headers = CIMultiDict( # type: ignore {hdrs.UPGRADE: 'websocket', hdrs.CONNECTION: 'upgrade', hdrs.SEC_WEBSOCKET_ACCEPT: accept_val}) notakeover = False compress = 0 if self._compress: extensions = headers.get(hdrs.SEC_WEBSOCKET_EXTENSIONS) # Server side always get return with no exception. 
# If something happened, just drop compress extension compress, notakeover = ws_ext_parse(extensions, isserver=True) if compress: enabledext = ws_ext_gen(compress=compress, isserver=True, server_notakeover=notakeover) response_headers[hdrs.SEC_WEBSOCKET_EXTENSIONS] = enabledext if protocol: response_headers[hdrs.SEC_WEBSOCKET_PROTOCOL] = protocol return (response_headers, # type: ignore protocol, compress, notakeover) def _pre_start(self, request: BaseRequest) -> Tuple[str, WebSocketWriter]: self._loop = request._loop headers, protocol, compress, notakeover = self._handshake( request) self._reset_heartbeat() self.set_status(101) self.headers.update(headers) self.force_close() self._compress = compress transport = request._protocol.transport assert transport is not None writer = WebSocketWriter(request._protocol, transport, compress=compress, notakeover=notakeover) return protocol, writer def _post_start(self, request: BaseRequest, protocol: str, writer: WebSocketWriter) -> None: self._ws_protocol = protocol self._writer = writer loop = self._loop assert loop is not None self._reader = FlowControlDataQueue( request._protocol, limit=2 ** 16, loop=loop) request.protocol.set_parser(WebSocketReader( self._reader, self._max_msg_size, compress=self._compress)) # disable HTTP keepalive for WebSocket request.protocol.keep_alive(False) def can_prepare(self, request: BaseRequest) -> WebSocketReady: if self._writer is not None: raise RuntimeError('Already started') try: _, protocol, _, _ = self._handshake(request) except HTTPException: return WebSocketReady(False, None) else: return WebSocketReady(True, protocol) @property def closed(self) -> bool: return self._closed @property def close_code(self) -> Optional[int]: return self._close_code @property def ws_protocol(self) -> Optional[str]: return self._ws_protocol @property def compress(self) -> bool: return self._compress def exception(self) -> Optional[BaseException]: return self._exception async def ping(self, message: bytes=b'') -> None: if self._writer is None: raise RuntimeError('Call .prepare() first') await self._writer.ping(message) async def pong(self, message: bytes=b'') -> None: # unsolicited pong if self._writer is None: raise RuntimeError('Call .prepare() first') await self._writer.pong(message) async def send_str(self, data: str, compress: Optional[bool]=None) -> None: if self._writer is None: raise RuntimeError('Call .prepare() first') if not isinstance(data, str): raise TypeError('data argument must be str (%r)' % type(data)) await self._writer.send(data, binary=False, compress=compress) async def send_bytes(self, data: bytes, compress: Optional[bool]=None) -> None: if self._writer is None: raise RuntimeError('Call .prepare() first') if not isinstance(data, (bytes, bytearray, memoryview)): raise TypeError('data argument must be byte-ish (%r)' % type(data)) await self._writer.send(data, binary=True, compress=compress) async def send_json(self, data: Any, compress: Optional[bool]=None, *, dumps: JSONEncoder=json.dumps) -> None: await self.send_str(dumps(data), compress=compress) async def write_eof(self) -> None: # type: ignore if self._eof_sent: return if self._payload_writer is None: raise RuntimeError("Response has not been started") await self.close() self._eof_sent = True async def close(self, *, code: int=1000, message: bytes=b'') -> bool: if self._writer is None: raise RuntimeError('Call .prepare() first') self._cancel_heartbeat() reader = self._reader assert reader is not None # we need to break `receive()` cycle first, # 
`close()` may be called from different task if self._waiting is not None and not self._closed: reader.feed_data(WS_CLOSING_MESSAGE, 0) await self._waiting if not self._closed: self._closed = True try: await self._writer.close(code, message) writer = self._payload_writer assert writer is not None await writer.drain() except (asyncio.CancelledError, asyncio.TimeoutError): self._close_code = 1006 raise except Exception as exc: self._close_code = 1006 self._exception = exc return True if self._closing: return True reader = self._reader assert reader is not None try: with async_timeout.timeout(self._timeout, loop=self._loop): msg = await reader.read() except asyncio.CancelledError: self._close_code = 1006 raise except Exception as exc: self._close_code = 1006 self._exception = exc return True if msg.type == WSMsgType.CLOSE: self._close_code = msg.data return True self._close_code = 1006 self._exception = asyncio.TimeoutError() return True else: return False async def receive(self, timeout: Optional[float]=None) -> WSMessage: if self._reader is None: raise RuntimeError('Call .prepare() first') loop = self._loop assert loop is not None while True: if self._waiting is not None: raise RuntimeError( 'Concurrent call to receive() is not allowed') if self._closed: self._conn_lost += 1 if self._conn_lost >= THRESHOLD_CONNLOST_ACCESS: raise RuntimeError('WebSocket connection is closed.') return WS_CLOSED_MESSAGE elif self._closing: return WS_CLOSING_MESSAGE try: self._waiting = loop.create_future() try: with async_timeout.timeout( timeout or self._receive_timeout, loop=self._loop): msg = await self._reader.read() self._reset_heartbeat() finally: waiter = self._waiting set_result(waiter, True) self._waiting = None except (asyncio.CancelledError, asyncio.TimeoutError): self._close_code = 1006 raise except EofStream: self._close_code = 1000 await self.close() return WSMessage(WSMsgType.CLOSED, None, None) except WebSocketError as exc: self._close_code = exc.code await self.close(code=exc.code) return WSMessage(WSMsgType.ERROR, exc, None) except Exception as exc: self._exception = exc self._closing = True self._close_code = 1006 await self.close() return WSMessage(WSMsgType.ERROR, exc, None) if msg.type == WSMsgType.CLOSE: self._closing = True self._close_code = msg.data if not self._closed and self._autoclose: await self.close() elif msg.type == WSMsgType.CLOSING: self._closing = True elif msg.type == WSMsgType.PING and self._autoping: await self.pong(msg.data) continue elif msg.type == WSMsgType.PONG and self._autoping: continue return msg async def receive_str(self, *, timeout: Optional[float]=None) -> str: msg = await self.receive(timeout) if msg.type != WSMsgType.TEXT: raise TypeError( "Received message {}:{!r} is not WSMsgType.TEXT".format( msg.type, msg.data)) return msg.data async def receive_bytes(self, *, timeout: Optional[float]=None) -> bytes: msg = await self.receive(timeout) if msg.type != WSMsgType.BINARY: raise TypeError( "Received message {}:{!r} is not bytes".format(msg.type, msg.data)) return msg.data async def receive_json(self, *, loads: JSONDecoder=json.loads, timeout: Optional[float]=None) -> Any: data = await self.receive_str(timeout=timeout) return loads(data) async def write(self, data: bytes) -> None: raise RuntimeError("Cannot call .write() for websocket") def __aiter__(self) -> 'WebSocketResponse': return self async def __anext__(self) -> WSMessage: msg = await self.receive() if msg.type in (WSMsgType.CLOSE, WSMsgType.CLOSING, WSMsgType.CLOSED): raise StopAsyncIteration # NOQA 
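# --- Editorial usage sketch; not part of the aiohttp source tree. ----------
# A server-side echo handler built on WebSocketResponse above; the route
# wiring and the heartbeat value are illustrative assumptions.
from aiohttp import web

async def websocket_echo(request: web.Request) -> web.WebSocketResponse:
    ws = web.WebSocketResponse(heartbeat=30.0)      # ping every 30s (see _send_heartbeat)
    await ws.prepare(request)
    async for msg in ws:                            # __aiter__/__anext__ defined above
        if msg.type == web.WSMsgType.TEXT:
            await ws.send_str(msg.data)             # echo text frames back
        elif msg.type == web.WSMsgType.ERROR:
            request.app.logger.warning('ws error: %s', ws.exception())
    return ws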
return msg aiohttp-3.6.2/aiohttp/worker.py0000644000175100001650000001776213547410117017035 0ustar vstsdocker00000000000000"""Async gunicorn worker for aiohttp.web""" import asyncio import os import re import signal import sys from types import FrameType from typing import Any, Awaitable, Callable, Optional, Union # noqa from gunicorn.config import AccessLogFormat as GunicornAccessLogFormat from gunicorn.workers import base from aiohttp import web from .helpers import set_result from .web_app import Application from .web_log import AccessLogger try: import ssl SSLContext = ssl.SSLContext # noqa except ImportError: # pragma: no cover ssl = None # type: ignore SSLContext = object # type: ignore __all__ = ('GunicornWebWorker', 'GunicornUVLoopWebWorker', 'GunicornTokioWebWorker') class GunicornWebWorker(base.Worker): DEFAULT_AIOHTTP_LOG_FORMAT = AccessLogger.LOG_FORMAT DEFAULT_GUNICORN_LOG_FORMAT = GunicornAccessLogFormat.default def __init__(self, *args: Any, **kw: Any) -> None: # pragma: no cover super().__init__(*args, **kw) self._task = None # type: Optional[asyncio.Task[None]] self.exit_code = 0 self._notify_waiter = None # type: Optional[asyncio.Future[bool]] def init_process(self) -> None: # create new event_loop after fork asyncio.get_event_loop().close() self.loop = asyncio.new_event_loop() asyncio.set_event_loop(self.loop) super().init_process() def run(self) -> None: self._task = self.loop.create_task(self._run()) try: # ignore all finalization problems self.loop.run_until_complete(self._task) except Exception: self.log.exception("Exception in gunicorn worker") if sys.version_info >= (3, 6): self.loop.run_until_complete(self.loop.shutdown_asyncgens()) self.loop.close() sys.exit(self.exit_code) async def _run(self) -> None: if isinstance(self.wsgi, Application): app = self.wsgi elif asyncio.iscoroutinefunction(self.wsgi): app = await self.wsgi() else: raise RuntimeError("wsgi app should be either Application or " "async function returning Application, got {}" .format(self.wsgi)) access_log = self.log.access_log if self.cfg.accesslog else None runner = web.AppRunner(app, logger=self.log, keepalive_timeout=self.cfg.keepalive, access_log=access_log, access_log_format=self._get_valid_log_format( self.cfg.access_log_format)) await runner.setup() ctx = self._create_ssl_context(self.cfg) if self.cfg.is_ssl else None runner = runner assert runner is not None server = runner.server assert server is not None for sock in self.sockets: site = web.SockSite( runner, sock, ssl_context=ctx, shutdown_timeout=self.cfg.graceful_timeout / 100 * 95) await site.start() # If our parent changed then we shut down. 
pid = os.getpid() try: while self.alive: # type: ignore self.notify() cnt = server.requests_count if self.cfg.max_requests and cnt > self.cfg.max_requests: self.alive = False self.log.info("Max requests, shutting down: %s", self) elif pid == os.getpid() and self.ppid != os.getppid(): self.alive = False self.log.info("Parent changed, shutting down: %s", self) else: await self._wait_next_notify() except BaseException: pass await runner.cleanup() def _wait_next_notify(self) -> 'asyncio.Future[bool]': self._notify_waiter_done() loop = self.loop assert loop is not None self._notify_waiter = waiter = loop.create_future() self.loop.call_later(1.0, self._notify_waiter_done, waiter) return waiter def _notify_waiter_done(self, waiter: 'asyncio.Future[bool]'=None) -> None: if waiter is None: waiter = self._notify_waiter if waiter is not None: set_result(waiter, True) if waiter is self._notify_waiter: self._notify_waiter = None def init_signals(self) -> None: # Set up signals through the event loop API. self.loop.add_signal_handler(signal.SIGQUIT, self.handle_quit, signal.SIGQUIT, None) self.loop.add_signal_handler(signal.SIGTERM, self.handle_exit, signal.SIGTERM, None) self.loop.add_signal_handler(signal.SIGINT, self.handle_quit, signal.SIGINT, None) self.loop.add_signal_handler(signal.SIGWINCH, self.handle_winch, signal.SIGWINCH, None) self.loop.add_signal_handler(signal.SIGUSR1, self.handle_usr1, signal.SIGUSR1, None) self.loop.add_signal_handler(signal.SIGABRT, self.handle_abort, signal.SIGABRT, None) # Don't let SIGTERM and SIGUSR1 disturb active requests # by interrupting system calls signal.siginterrupt(signal.SIGTERM, False) signal.siginterrupt(signal.SIGUSR1, False) def handle_quit(self, sig: int, frame: FrameType) -> None: self.alive = False # worker_int callback self.cfg.worker_int(self) # wakeup closing process self._notify_waiter_done() def handle_abort(self, sig: int, frame: FrameType) -> None: self.alive = False self.exit_code = 1 self.cfg.worker_abort(self) sys.exit(1) @staticmethod def _create_ssl_context(cfg: Any) -> 'SSLContext': """ Creates SSLContext instance for usage in asyncio.create_server. See ssl.SSLSocket.__init__ for more details. """ if ssl is None: # pragma: no cover raise RuntimeError('SSL is not supported.') ctx = ssl.SSLContext(cfg.ssl_version) ctx.load_cert_chain(cfg.certfile, cfg.keyfile) ctx.verify_mode = cfg.cert_reqs if cfg.ca_certs: ctx.load_verify_locations(cfg.ca_certs) if cfg.ciphers: ctx.set_ciphers(cfg.ciphers) return ctx def _get_valid_log_format(self, source_format: str) -> str: if source_format == self.DEFAULT_GUNICORN_LOG_FORMAT: return self.DEFAULT_AIOHTTP_LOG_FORMAT elif re.search(r'%\([^\)]+\)', source_format): raise ValueError( "Gunicorn's style options in form of `%(name)s` are not " "supported for the log formatting. Please use aiohttp's " "format specification to configure access log formatting: " "http://docs.aiohttp.org/en/stable/logging.html" "#format-specification" ) else: return source_format class GunicornUVLoopWebWorker(GunicornWebWorker): def init_process(self) -> None: import uvloop # Close any existing event loop before setting a # new policy. asyncio.get_event_loop().close() # Setup uvloop policy, so that every # asyncio.get_event_loop() will create an instance # of uvloop event loop. asyncio.set_event_loop_policy(uvloop.EventLoopPolicy()) super().init_process() class GunicornTokioWebWorker(GunicornWebWorker): def init_process(self) -> None: # pragma: no cover import tokio # Close any existing event loop before setting a # new policy. 
asyncio.get_event_loop().close() # Setup tokio policy, so that every # asyncio.get_event_loop() will create an instance # of tokio event loop. asyncio.set_event_loop_policy(tokio.EventLoopPolicy()) super().init_process() aiohttp-3.6.2/aiohttp.egg-info/0000755000175100001650000000000013547410140016623 5ustar vstsdocker00000000000000aiohttp-3.6.2/aiohttp.egg-info/PKG-INFO0000644000175100001650000007021313547410140017723 0ustar vstsdocker00000000000000Metadata-Version: 2.1 Name: aiohttp Version: 3.6.2 Summary: Async http client/server framework (asyncio) Home-page: https://github.com/aio-libs/aiohttp Author: Nikolay Kim Author-email: fafhrd91@gmail.com Maintainer: Nikolay Kim , Andrew Svetlov Maintainer-email: aio-libs@googlegroups.com License: Apache 2 Project-URL: Chat: Gitter, https://gitter.im/aio-libs/Lobby Project-URL: CI: AppVeyor, https://ci.appveyor.com/project/aio-libs/aiohttp Project-URL: CI: Circle, https://circleci.com/gh/aio-libs/aiohttp Project-URL: CI: Shippable, https://app.shippable.com/github/aio-libs/aiohttp Project-URL: CI: Travis, https://travis-ci.com/aio-libs/aiohttp Project-URL: Coverage: codecov, https://codecov.io/github/aio-libs/aiohttp Project-URL: Docs: RTD, https://docs.aiohttp.org Project-URL: GitHub: issues, https://github.com/aio-libs/aiohttp/issues Project-URL: GitHub: repo, https://github.com/aio-libs/aiohttp Description: ================================== Async http client/server framework ================================== .. image:: https://raw.githubusercontent.com/aio-libs/aiohttp/master/docs/_static/aiohttp-icon-128x128.png :height: 64px :width: 64px :alt: aiohttp logo | .. image:: https://travis-ci.com/aio-libs/aiohttp.svg?branch=master :target: https://travis-ci.com/aio-libs/aiohttp :align: right :alt: Travis status for master branch .. image:: https://ci.appveyor.com/api/projects/status/tnddy9k6pphl8w7k/branch/master?svg=true :target: https://ci.appveyor.com/project/aio-libs/aiohttp :align: right :alt: AppVeyor status for master branch .. image:: https://codecov.io/gh/aio-libs/aiohttp/branch/master/graph/badge.svg :target: https://codecov.io/gh/aio-libs/aiohttp :alt: codecov.io status for master branch .. image:: https://badge.fury.io/py/aiohttp.svg :target: https://pypi.org/project/aiohttp :alt: Latest PyPI package version .. image:: https://readthedocs.org/projects/aiohttp/badge/?version=latest :target: https://docs.aiohttp.org/ :alt: Latest Read The Docs .. image:: https://badges.gitter.im/Join%20Chat.svg :target: https://gitter.im/aio-libs/Lobby :alt: Chat on Gitter Key Features ============ - Supports both client and server side of HTTP protocol. - Supports both client and server Web-Sockets out-of-the-box and avoids Callback Hell. - Provides Web-server with middlewares and pluggable routing. Getting started =============== Client ------ To get something from the web: .. code-block:: python import aiohttp import asyncio async def fetch(session, url): async with session.get(url) as response: return await response.text() async def main(): async with aiohttp.ClientSession() as session: html = await fetch(session, 'http://python.org') print(html) if __name__ == '__main__': loop = asyncio.get_event_loop() loop.run_until_complete(main()) Server ------ An example using a simple server: .. 
code-block:: python # examples/server_simple.py from aiohttp import web async def handle(request): name = request.match_info.get('name', "Anonymous") text = "Hello, " + name return web.Response(text=text) async def wshandle(request): ws = web.WebSocketResponse() await ws.prepare(request) async for msg in ws: if msg.type == web.WSMsgType.text: await ws.send_str("Hello, {}".format(msg.data)) elif msg.type == web.WSMsgType.binary: await ws.send_bytes(msg.data) elif msg.type == web.WSMsgType.close: break return ws app = web.Application() app.add_routes([web.get('/', handle), web.get('/echo', wshandle), web.get('/{name}', handle)]) if __name__ == '__main__': web.run_app(app) Documentation ============= https://aiohttp.readthedocs.io/ Demos ===== https://github.com/aio-libs/aiohttp-demos External links ============== * `Third party libraries `_ * `Built with aiohttp `_ * `Powered by aiohttp `_ Feel free to make a Pull Request for adding your link to these pages! Communication channels ====================== *aio-libs* google group: https://groups.google.com/forum/#!forum/aio-libs Feel free to post your questions and ideas here. *gitter chat* https://gitter.im/aio-libs/Lobby We support `Stack Overflow `_. Please add *aiohttp* tag to your question there. Requirements ============ - Python >= 3.5.3 - async-timeout_ - attrs_ - chardet_ - multidict_ - yarl_ Optionally you may install the cChardet_ and aiodns_ libraries (highly recommended for sake of speed). .. _chardet: https://pypi.python.org/pypi/chardet .. _aiodns: https://pypi.python.org/pypi/aiodns .. _attrs: https://github.com/python-attrs/attrs .. _multidict: https://pypi.python.org/pypi/multidict .. _yarl: https://pypi.python.org/pypi/yarl .. _async-timeout: https://pypi.python.org/pypi/async_timeout .. _cChardet: https://pypi.python.org/pypi/cchardet License ======= ``aiohttp`` is offered under the Apache 2 license. Keepsafe ======== The aiohttp community would like to thank Keepsafe (https://www.getkeepsafe.com) for its support in the early days of the project. Source code =========== The latest developer version is available in a GitHub repository: https://github.com/aio-libs/aiohttp Benchmarks ========== If you are interested in efficiency, the AsyncIO community maintains a list of benchmarks on the official wiki: https://github.com/python/asyncio/wiki/Benchmarks ========= Changelog ========= .. You should *NOT* be adding new change log entries to this file, this file is managed by towncrier. You *may* edit previous change logs to fix problems like typo corrections or such. To add a new change log entry, please see https://pip.pypa.io/en/latest/development/#adding-a-news-entry we named the news folder "changes". WARNING: Don't drop the next directive! .. towncrier release notes start 3.6.2 (2019-10-09) ================== Features -------- - Made exceptions pickleable. Also changed the repr of some exceptions. `#4077 `_ - Use ``Iterable`` type hint instead of ``Sequence`` for ``Application`` *middleware* parameter. `#4125 `_ Bugfixes -------- - Reset the ``sock_read`` timeout each time data is received for a ``aiohttp.ClientResponse``. `#3808 `_ - Fix handling of expired cookies so they are not stored in CookieJar. `#4063 `_ - Fix misleading message in the string representation of ``ClientConnectorError``; ``self.ssl == None`` means default SSL context, not SSL disabled `#4097 `_ - Don't clobber HTTP status when using FileResponse. 
`#4106 `_ Improved Documentation ---------------------- - Added minimal required logging configuration to logging documentation. `#2469 `_ - Update docs to reflect proxy support. `#4100 `_ - Fix typo in code example in testing docs. `#4108 `_ Misc ---- - `#4102 `_ ---- 3.6.1 (2019-09-19) ================== Features -------- - Compatibility with Python 3.8. `#4056 `_ Bugfixes -------- - correct some exception string format `#4068 `_ - Emit a warning when ``ssl.OP_NO_COMPRESSION`` is unavailable because the runtime is built against an outdated OpenSSL. `#4052 `_ - Update multidict requirement to >= 4.5 `#4057 `_ Improved Documentation ---------------------- - Provide pytest-aiohttp namespace for pytest fixtures in docs. `#3723 `_ ---- 3.6.0 (2019-09-06) ================== Features -------- - Add support for Named Pipes (Site and Connector) under Windows. This feature requires Proactor event loop to work. `#3629 `_ - Removed ``Transfer-Encoding: chunked`` header from websocket responses to be compatible with more http proxy servers. `#3798 `_ - Accept non-GET request for starting websocket handshake on server side. `#3980 `_ Bugfixes -------- - Raise a ClientResponseError instead of an AssertionError for a blank HTTP Reason Phrase. `#3532 `_ - Fix an issue where cookies would sometimes not be set during a redirect. `#3576 `_ - Change normalize_path_middleware to use 308 redirect instead of 301. This behavior should prevent clients from being unable to use PUT/POST methods on endpoints that are redirected because of a trailing slash. `#3579 `_ - Drop the processed task from ``all_tasks()`` list early. It prevents logging about a task with unhandled exception when the server is used in conjunction with ``asyncio.run()``. `#3587 `_ - ``Signal`` type annotation changed from ``Signal[Callable[['TraceConfig'], Awaitable[None]]]`` to ``Signal[Callable[ClientSession, SimpleNamespace, ...]``. `#3595 `_ - Use sanitized URL as Location header in redirects `#3614 `_ - Improve typing annotations for multipart.py along with changes required by mypy in files that references multipart.py. `#3621 `_ - Close session created inside ``aiohttp.request`` when unhandled exception occurs `#3628 `_ - Cleanup per-chunk data in generic data read. Memory leak fixed. `#3631 `_ - Use correct type for add_view and family `#3633 `_ - Fix _keepalive field in __slots__ of ``RequestHandler``. `#3644 `_ - Properly handle ConnectionResetError, to silence the "Cannot write to closing transport" exception when clients disconnect uncleanly. `#3648 `_ - Suppress pytest warnings due to ``test_utils`` classes `#3660 `_ - Fix overshadowing of overlapped sub-application prefixes. `#3701 `_ - Fixed return type annotation for WSMessage.json() `#3720 `_ - Properly expose TooManyRedirects publicly as documented. `#3818 `_ - Fix missing brackets for IPv6 in proxy CONNECT request `#3841 `_ - Make the signature of ``aiohttp.test_utils.TestClient.request`` match ``asyncio.ClientSession.request`` according to the docs `#3852 `_ - Use correct style for re-exported imports, makes mypy ``--strict`` mode happy. `#3868 `_ - Fixed type annotation for add_view method of UrlDispatcher to accept any subclass of View `#3880 `_ - Made cython HTTP parser set Reason-Phrase of the response to an empty string if it is missing. `#3906 `_ - Add URL to the string representation of ClientResponseError. `#3959 `_ - Accept ``istr`` keys in ``LooseHeaders`` type hints. `#3976 `_ - Fixed race conditions in _resolve_host caching and throttling when tracing is enabled. 
`#4013 `_ - For URLs like "unix://localhost/..." set Host HTTP header to "localhost" instead of "localhost:None". `#4039 `_ Improved Documentation ---------------------- - Modify documentation for Background Tasks to remove deprecated usage of event loop. `#3526 `_ - use ``if __name__ == '__main__':`` in server examples. `#3775 `_ - Update documentation reference to the default access logger. `#3783 `_ - Improve documentation for ``web.BaseRequest.path`` and ``web.BaseRequest.raw_path``. `#3791 `_ - Removed deprecation warning in tracing example docs `#3964 `_ ---- 3.5.4 (2019-01-12) ================== Bugfixes -------- - Fix stream ``.read()`` / ``.readany()`` / ``.iter_any()`` which used to return a partial content only in case of compressed content `#3525 `_ 3.5.3 (2019-01-10) ================== Bugfixes -------- - Fix type stubs for ``aiohttp.web.run_app(access_log=True)`` and fix edge case of ``access_log=True`` and the event loop being in debug mode. `#3504 `_ - Fix ``aiohttp.ClientTimeout`` type annotations to accept ``None`` for fields `#3511 `_ - Send custom per-request cookies even if session jar is empty `#3515 `_ - Restore Linux binary wheels publishing on PyPI ---- 3.5.2 (2019-01-08) ================== Features -------- - ``FileResponse`` from ``web_fileresponse.py`` uses a ``ThreadPoolExecutor`` to work with files asynchronously. I/O based payloads from ``payload.py`` uses a ``ThreadPoolExecutor`` to work with I/O objects asynchronously. `#3313 `_ - Internal Server Errors in plain text if the browser does not support HTML. `#3483 `_ Bugfixes -------- - Preserve MultipartWriter parts headers on write. Refactor the way how ``Payload.headers`` are handled. Payload instances now always have headers and Content-Type defined. Fix Payload Content-Disposition header reset after initial creation. `#3035 `_ - Log suppressed exceptions in ``GunicornWebWorker``. `#3464 `_ - Remove wildcard imports. `#3468 `_ - Use the same task for app initialization and web server handling in gunicorn workers. It allows to use Python3.7 context vars smoothly. `#3471 `_ - Fix handling of chunked+gzipped response when first chunk does not give uncompressed data `#3477 `_ - Replace ``collections.MutableMapping`` with ``collections.abc.MutableMapping`` to avoid a deprecation warning. `#3480 `_ - ``Payload.size`` type annotation changed from ``Optional[float]`` to ``Optional[int]``. `#3484 `_ - Ignore done tasks when cancels pending activities on ``web.run_app`` finalization. `#3497 `_ Improved Documentation ---------------------- - Add documentation for ``aiohttp.web.HTTPException``. `#3490 `_ Misc ---- - `#3487 `_ ---- 3.5.1 (2018-12-24) ==================== - Fix a regression about ``ClientSession._requote_redirect_url`` modification in debug mode. 3.5.0 (2018-12-22) ==================== Features -------- - The library type annotations are checked in strict mode now. - Add support for setting cookies for individual request (`#2387 `_) - Application.add_domain implementation (`#2809 `_) - The default ``app`` in the request returned by ``test_utils.make_mocked_request`` can now have objects assigned to it and retrieved using the ``[]`` operator. (`#3174 `_) - Make ``request.url`` accessible when transport is closed. (`#3177 `_) - Add ``zlib_executor_size`` argument to ``Response`` constructor to allow compression to run in a background executor to avoid blocking the main thread and potentially triggering health check failures. 
(`#3205 `_) - Enable users to set ``ClientTimeout`` in ``aiohttp.request`` (`#3213 `_) - Don't raise a warning if ``NETRC`` environment variable is not set and ``~/.netrc`` file doesn't exist. (`#3267 `_) - Add default logging handler to web.run_app If the ``Application.debug``` flag is set and the default logger ``aiohttp.access`` is used, access logs will now be output using a *stderr* ``StreamHandler`` if no handlers are attached. Furthermore, if the default logger has no log level set, the log level will be set to ``DEBUG``. (`#3324 `_) - Add method argument to ``session.ws_connect()``. Sometimes server API requires a different HTTP method for WebSocket connection establishment. For example, ``Docker exec`` needs POST. (`#3378 `_) - Create a task per request handling. (`#3406 `_) Bugfixes -------- - Enable passing ``access_log_class`` via ``handler_args`` (`#3158 `_) - Return empty bytes with end-of-chunk marker in empty stream reader. (`#3186 `_) - Accept ``CIMultiDictProxy`` instances for ``headers`` argument in ``web.Response`` constructor. (`#3207 `_) - Don't uppercase HTTP method in parser (`#3233 `_) - Make method match regexp RFC-7230 compliant (`#3235 `_) - Add ``app.pre_frozen`` state to properly handle startup signals in sub-applications. (`#3237 `_) - Enhanced parsing and validation of helpers.BasicAuth.decode. (`#3239 `_) - Change imports from collections module in preparation for 3.8. (`#3258 `_) - Ensure Host header is added first to ClientRequest to better replicate browser (`#3265 `_) - Fix forward compatibility with Python 3.8: importing ABCs directly from the collections module will not be supported anymore. (`#3273 `_) - Keep the query string by ``normalize_path_middleware``. (`#3278 `_) - Fix missing parameter ``raise_for_status`` for aiohttp.request() (`#3290 `_) - Bracket IPv6 addresses in the HOST header (`#3304 `_) - Fix default message for server ping and pong frames. (`#3308 `_) - Fix tests/test_connector.py typo and tests/autobahn/server.py duplicate loop def. (`#3337 `_) - Fix false-negative indicator end_of_HTTP_chunk in StreamReader.readchunk function (`#3361 `_) - Release HTTP response before raising status exception (`#3364 `_) - Fix task cancellation when ``sendfile()`` syscall is used by static file handling. (`#3383 `_) - Fix stack trace for ``asyncio.TimeoutError`` which was not logged, when it is caught in the handler. (`#3414 `_) Improved Documentation ---------------------- - Improve documentation of ``Application.make_handler`` parameters. (`#3152 `_) - Fix BaseRequest.raw_headers doc. (`#3215 `_) - Fix typo in TypeError exception reason in ``web.Application._handle`` (`#3229 `_) - Make server access log format placeholder %b documentation reflect behavior and docstring. (`#3307 `_) Deprecations and Removals ------------------------- - Deprecate modification of ``session.requote_redirect_url`` (`#2278 `_) - Deprecate ``stream.unread_data()`` (`#3260 `_) - Deprecated use of boolean in ``resp.enable_compression()`` (`#3318 `_) - Encourage creation of aiohttp public objects inside a coroutine (`#3331 `_) - Drop dead ``Connection.detach()`` and ``Connection.writer``. Both methods were broken for more than 2 years. (`#3358 `_) - Deprecate ``app.loop``, ``request.loop``, ``client.loop`` and ``connector.loop`` properties. (`#3374 `_) - Deprecate explicit debug argument. Use asyncio debug mode instead. (`#3381 `_) - Deprecate body parameter in HTTPException (and derived classes) constructor. 
(`#3385 `_) - Deprecate bare connector close, use ``async with connector:`` and ``await connector.close()`` instead. (`#3417 `_) - Deprecate obsolete ``read_timeout`` and ``conn_timeout`` in ``ClientSession`` constructor. (`#3438 `_) Misc ---- - #3341, #3351 Platform: UNKNOWN Classifier: License :: OSI Approved :: Apache Software License Classifier: Intended Audience :: Developers Classifier: Programming Language :: Python Classifier: Programming Language :: Python :: 3 Classifier: Programming Language :: Python :: 3.5 Classifier: Programming Language :: Python :: 3.6 Classifier: Programming Language :: Python :: 3.7 Classifier: Development Status :: 5 - Production/Stable Classifier: Operating System :: POSIX Classifier: Operating System :: MacOS :: MacOS X Classifier: Operating System :: Microsoft :: Windows Classifier: Topic :: Internet :: WWW/HTTP Classifier: Framework :: AsyncIO Requires-Python: >=3.5.3 Provides-Extra: speedups aiohttp-3.6.2/aiohttp.egg-info/SOURCES.txt0000644000175100001650000001257213547410140020516 0ustar vstsdocker00000000000000CHANGES.rst CONTRIBUTORS.txt LICENSE.txt MANIFEST.in Makefile README.rst setup.cfg setup.py aiohttp/__init__.py aiohttp/_cparser.pxd aiohttp/_find_header.c aiohttp/_find_header.h aiohttp/_find_header.pxd aiohttp/_frozenlist.c aiohttp/_frozenlist.pyx aiohttp/_headers.pxi aiohttp/_helpers.c aiohttp/_helpers.pyi aiohttp/_helpers.pyx aiohttp/_http_parser.c aiohttp/_http_parser.pyx aiohttp/_http_writer.c aiohttp/_http_writer.pyx aiohttp/_websocket.c aiohttp/_websocket.pyx aiohttp/abc.py aiohttp/base_protocol.py aiohttp/client.py aiohttp/client_exceptions.py aiohttp/client_proto.py aiohttp/client_reqrep.py aiohttp/client_ws.py aiohttp/connector.py aiohttp/cookiejar.py aiohttp/formdata.py aiohttp/frozenlist.py aiohttp/frozenlist.pyi aiohttp/hdrs.py aiohttp/helpers.py aiohttp/http.py aiohttp/http_exceptions.py aiohttp/http_parser.py aiohttp/http_websocket.py aiohttp/http_writer.py aiohttp/locks.py aiohttp/log.py aiohttp/multipart.py aiohttp/payload.py aiohttp/payload_streamer.py aiohttp/py.typed aiohttp/pytest_plugin.py aiohttp/resolver.py aiohttp/signals.py aiohttp/signals.pyi aiohttp/streams.py aiohttp/tcp_helpers.py aiohttp/test_utils.py aiohttp/tracing.py aiohttp/typedefs.py aiohttp/web.py aiohttp/web_app.py aiohttp/web_exceptions.py aiohttp/web_fileresponse.py aiohttp/web_log.py aiohttp/web_middlewares.py aiohttp/web_protocol.py aiohttp/web_request.py aiohttp/web_response.py aiohttp/web_routedef.py aiohttp/web_runner.py aiohttp/web_server.py aiohttp/web_urldispatcher.py aiohttp/web_ws.py aiohttp/worker.py aiohttp.egg-info/PKG-INFO aiohttp.egg-info/SOURCES.txt aiohttp.egg-info/dependency_links.txt aiohttp.egg-info/requires.txt aiohttp.egg-info/top_level.txt docs/Makefile docs/abc.rst docs/aiohttp-icon.svg docs/aiohttp-plain.svg docs/built_with.rst docs/changes.rst docs/client.rst docs/client_advanced.rst docs/client_quickstart.rst docs/client_reference.rst docs/conf.py docs/contributing.rst docs/deployment.rst docs/essays.rst docs/external.rst docs/faq.rst docs/favicon.ico docs/glossary.rst docs/index.rst docs/logging.rst docs/make.bat docs/migration_to_2xx.rst docs/misc.rst docs/multipart.rst docs/multipart_reference.rst docs/new_router.rst docs/old-logo.png docs/old-logo.svg docs/powered_by.rst docs/signals.rst docs/spelling_wordlist.txt docs/streams.rst docs/structures.rst docs/testing.rst docs/third_party.rst docs/tracing_reference.rst docs/utilities.rst docs/web.rst docs/web_advanced.rst docs/web_lowlevel.rst docs/web_quickstart.rst 
docs/web_reference.rst docs/websocket_utilities.rst docs/whats_new_1_1.rst docs/whats_new_3_0.rst docs/_static/aiohttp-icon-128x128.png examples/background_tasks.py examples/cli_app.py examples/client_auth.py examples/client_json.py examples/client_ws.py examples/curl.py examples/fake_server.py examples/lowlevel_srv.py examples/server.crt examples/server.csr examples/server.key examples/server_simple.py examples/static_files.py examples/web_classview.py examples/web_cookies.py examples/web_rewrite_headers_middleware.py examples/web_srv.py examples/web_srv_route_deco.py examples/web_srv_route_table.py examples/web_ws.py examples/websocket.html examples/legacy/crawl.py examples/legacy/srv.py examples/legacy/tcp_protocol_parser.py tests/aiohttp.jpg tests/aiohttp.png tests/conftest.py tests/data.unknown_mime_type tests/hello.txt.gz tests/test_base_protocol.py tests/test_classbasedview.py tests/test_client_connection.py tests/test_client_exceptions.py tests/test_client_fingerprint.py tests/test_client_functional.py tests/test_client_proto.py tests/test_client_request.py tests/test_client_response.py tests/test_client_session.py tests/test_client_ws.py tests/test_client_ws_functional.py tests/test_connector.py tests/test_cookiejar.py tests/test_flowcontrol_streams.py tests/test_formdata.py tests/test_frozenlist.py tests/test_helpers.py tests/test_http_exceptions.py tests/test_http_parser.py tests/test_http_writer.py tests/test_locks.py tests/test_loop.py tests/test_multipart.py tests/test_multipart_helpers.py tests/test_payload.py tests/test_proxy.py tests/test_proxy_functional.py tests/test_pytest_plugin.py tests/test_resolver.py tests/test_route_def.py tests/test_run_app.py tests/test_signals.py tests/test_streams.py tests/test_tcp_helpers.py tests/test_test_utils.py tests/test_tracing.py tests/test_urldispatch.py tests/test_web_app.py tests/test_web_cli.py tests/test_web_exceptions.py tests/test_web_functional.py tests/test_web_log.py tests/test_web_middleware.py tests/test_web_protocol.py tests/test_web_request.py tests/test_web_request_handler.py tests/test_web_response.py tests/test_web_runner.py tests/test_web_sendfile.py tests/test_web_sendfile_functional.py tests/test_web_server.py tests/test_web_urldispatcher.py tests/test_web_websocket.py tests/test_web_websocket_functional.py tests/test_websocket_handshake.py tests/test_websocket_parser.py tests/test_websocket_writer.py tests/test_worker.py tests/autobahn/client.py tests/autobahn/fuzzingclient.json tests/autobahn/fuzzingserver.json tests/autobahn/server.py vendor/http-parser/.git vendor/http-parser/.gitignore vendor/http-parser/.mailmap vendor/http-parser/.travis.yml vendor/http-parser/AUTHORS vendor/http-parser/LICENSE-MIT vendor/http-parser/Makefile vendor/http-parser/README.md vendor/http-parser/bench.c vendor/http-parser/http_parser.c vendor/http-parser/http_parser.gyp vendor/http-parser/http_parser.h vendor/http-parser/test.c vendor/http-parser/contrib/parsertrace.c vendor/http-parser/contrib/url_parser.caiohttp-3.6.2/aiohttp.egg-info/dependency_links.txt0000644000175100001650000000000113547410140022671 0ustar vstsdocker00000000000000 aiohttp-3.6.2/aiohttp.egg-info/requires.txt0000644000175100001650000000030213547410140021216 0ustar vstsdocker00000000000000attrs>=17.3.0 chardet<4.0,>=2.0 multidict<5.0,>=4.5 async_timeout<4.0,>=3.0 yarl<2.0,>=1.0 [:python_version < "3.7"] idna-ssl>=1.0 typing_extensions>=3.6.5 [speedups] aiodns brotlipy cchardet 
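The ``speedups`` extra above only names optional accelerator packages. A minimal sketch (assuming nothing beyond the package names listed above; ``brotlipy`` is normally imported as ``brotli``) to check which of them are importable in the current environment::

    import importlib

    # Optional accelerators named in the ``speedups`` extra above.
    for name in ('aiodns', 'cchardet', 'brotli'):
        try:
            importlib.import_module(name)
            print(name, 'available')
        except ImportError:
            print(name, 'not installed')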
aiohttp-3.6.2/aiohttp.egg-info/top_level.txt0000644000175100001650000000001013547410140021344 0ustar vstsdocker00000000000000aiohttp aiohttp-3.6.2/docs/0000755000175100001650000000000013547410140014411 5ustar vstsdocker00000000000000aiohttp-3.6.2/docs/Makefile0000644000175100001650000001533313547410117016062 0ustar vstsdocker00000000000000# Makefile for Sphinx documentation # # You can set these variables from the command line. SPHINXOPTS = SPHINXBUILD = sphinx-build PAPER = BUILDDIR = _build # User-friendly check for sphinx-build ifeq ($(shell which $(SPHINXBUILD) >/dev/null 2>&1; echo $$?), 1) $(error The '$(SPHINXBUILD)' command was not found. Make sure you have Sphinx installed, then set the SPHINXBUILD environment variable to point to the full path of the '$(SPHINXBUILD)' executable. Alternatively you can add the directory with the executable to your PATH. If you don't have Sphinx installed, grab it from http://sphinx-doc.org/) endif # Internal variables. PAPEROPT_a4 = -D latex_paper_size=a4 PAPEROPT_letter = -D latex_paper_size=letter ALLSPHINXOPTS = -d $(BUILDDIR)/doctrees $(PAPEROPT_$(PAPER)) $(SPHINXOPTS) . # the i18n builder cannot share the environment and doctrees with the others I18NSPHINXOPTS = $(PAPEROPT_$(PAPER)) $(SPHINXOPTS) . .PHONY: help clean html dirhtml singlehtml pickle json htmlhelp qthelp devhelp epub latex latexpdf text man changes linkcheck doctest gettext help: @echo "Please use \`make ' where is one of" @echo " html to make standalone HTML files" @echo " dirhtml to make HTML files named index.html in directories" @echo " singlehtml to make a single large HTML file" @echo " pickle to make pickle files" @echo " json to make JSON files" @echo " htmlhelp to make HTML files and a HTML help project" @echo " qthelp to make HTML files and a qthelp project" @echo " devhelp to make HTML files and a Devhelp project" @echo " epub to make an epub" @echo " latex to make LaTeX files, you can set PAPER=a4 or PAPER=letter" @echo " latexpdf to make LaTeX files and run them through pdflatex" @echo " latexpdfja to make LaTeX files and run them through platex/dvipdfmx" @echo " text to make text files" @echo " man to make manual pages" @echo " texinfo to make Texinfo files" @echo " info to make Texinfo files and run them through makeinfo" @echo " gettext to make PO message catalogs" @echo " changes to make an overview of all changed/added/deprecated items" @echo " xml to make Docutils-native XML files" @echo " pseudoxml to make pseudoxml-XML files for display purposes" @echo " linkcheck to check all external links for integrity" @echo " doctest to run all doctests embedded in the documentation (if enabled)" clean: rm -rf $(BUILDDIR)/* html: $(SPHINXBUILD) -b html $(ALLSPHINXOPTS) $(BUILDDIR)/html @echo @echo "Build finished. The HTML pages are in $(BUILDDIR)/html." dirhtml: $(SPHINXBUILD) -b dirhtml $(ALLSPHINXOPTS) $(BUILDDIR)/dirhtml @echo @echo "Build finished. The HTML pages are in $(BUILDDIR)/dirhtml." singlehtml: $(SPHINXBUILD) -b singlehtml $(ALLSPHINXOPTS) $(BUILDDIR)/singlehtml @echo @echo "Build finished. The HTML page is in $(BUILDDIR)/singlehtml." pickle: $(SPHINXBUILD) -b pickle $(ALLSPHINXOPTS) $(BUILDDIR)/pickle @echo @echo "Build finished; now you can process the pickle files." json: $(SPHINXBUILD) -b json $(ALLSPHINXOPTS) $(BUILDDIR)/json @echo @echo "Build finished; now you can process the JSON files." 
htmlhelp: $(SPHINXBUILD) -b htmlhelp $(ALLSPHINXOPTS) $(BUILDDIR)/htmlhelp @echo @echo "Build finished; now you can run HTML Help Workshop with the" \ ".hhp project file in $(BUILDDIR)/htmlhelp." qthelp: $(SPHINXBUILD) -b qthelp $(ALLSPHINXOPTS) $(BUILDDIR)/qthelp @echo @echo "Build finished; now you can run "qcollectiongenerator" with the" \ ".qhcp project file in $(BUILDDIR)/qthelp, like this:" @echo "# qcollectiongenerator $(BUILDDIR)/qthelp/aiohttp.qhcp" @echo "To view the help file:" @echo "# assistant -collectionFile $(BUILDDIR)/qthelp/aiohttp.qhc" devhelp: $(SPHINXBUILD) -b devhelp $(ALLSPHINXOPTS) $(BUILDDIR)/devhelp @echo @echo "Build finished." @echo "To view the help file:" @echo "# mkdir -p $$HOME/.local/share/devhelp/aiohttp" @echo "# ln -s $(BUILDDIR)/devhelp $$HOME/.local/share/devhelp/aiohttp" @echo "# devhelp" epub: $(SPHINXBUILD) -b epub $(ALLSPHINXOPTS) $(BUILDDIR)/epub @echo @echo "Build finished. The epub file is in $(BUILDDIR)/epub." latex: $(SPHINXBUILD) -b latex $(ALLSPHINXOPTS) $(BUILDDIR)/latex @echo @echo "Build finished; the LaTeX files are in $(BUILDDIR)/latex." @echo "Run \`make' in that directory to run these through (pdf)latex" \ "(use \`make latexpdf' here to do that automatically)." latexpdf: $(SPHINXBUILD) -b latex $(ALLSPHINXOPTS) $(BUILDDIR)/latex @echo "Running LaTeX files through pdflatex..." $(MAKE) -C $(BUILDDIR)/latex all-pdf @echo "pdflatex finished; the PDF files are in $(BUILDDIR)/latex." latexpdfja: $(SPHINXBUILD) -b latex $(ALLSPHINXOPTS) $(BUILDDIR)/latex @echo "Running LaTeX files through platex and dvipdfmx..." $(MAKE) -C $(BUILDDIR)/latex all-pdf-ja @echo "pdflatex finished; the PDF files are in $(BUILDDIR)/latex." text: $(SPHINXBUILD) -b text $(ALLSPHINXOPTS) $(BUILDDIR)/text @echo @echo "Build finished. The text files are in $(BUILDDIR)/text." man: $(SPHINXBUILD) -b man $(ALLSPHINXOPTS) $(BUILDDIR)/man @echo @echo "Build finished. The manual pages are in $(BUILDDIR)/man." texinfo: $(SPHINXBUILD) -b texinfo $(ALLSPHINXOPTS) $(BUILDDIR)/texinfo @echo @echo "Build finished. The Texinfo files are in $(BUILDDIR)/texinfo." @echo "Run \`make' in that directory to run these through makeinfo" \ "(use \`make info' here to do that automatically)." info: $(SPHINXBUILD) -b texinfo $(ALLSPHINXOPTS) $(BUILDDIR)/texinfo @echo "Running Texinfo files through makeinfo..." make -C $(BUILDDIR)/texinfo info @echo "makeinfo finished; the Info files are in $(BUILDDIR)/texinfo." gettext: $(SPHINXBUILD) -b gettext $(I18NSPHINXOPTS) $(BUILDDIR)/locale @echo @echo "Build finished. The message catalogs are in $(BUILDDIR)/locale." changes: $(SPHINXBUILD) -b changes $(ALLSPHINXOPTS) $(BUILDDIR)/changes @echo @echo "The overview file is in $(BUILDDIR)/changes." linkcheck: $(SPHINXBUILD) -b linkcheck $(ALLSPHINXOPTS) $(BUILDDIR)/linkcheck @echo @echo "Link check complete; look for any errors in the above output " \ "or in $(BUILDDIR)/linkcheck/output.txt." doctest: $(SPHINXBUILD) -b doctest $(ALLSPHINXOPTS) $(BUILDDIR)/doctest @echo "Testing of doctests in the sources finished, look at the " \ "results in $(BUILDDIR)/doctest/output.txt." xml: $(SPHINXBUILD) -b xml $(ALLSPHINXOPTS) $(BUILDDIR)/xml @echo @echo "Build finished. The XML files are in $(BUILDDIR)/xml." pseudoxml: $(SPHINXBUILD) -b pseudoxml $(ALLSPHINXOPTS) $(BUILDDIR)/pseudoxml @echo @echo "Build finished. The pseudo-XML files are in $(BUILDDIR)/pseudoxml." spelling: $(SPHINXBUILD) -b spelling $(ALLSPHINXOPTS) $(BUILDDIR)/spelling @echo @echo "Build finished." 
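The targets above are thin wrappers around ``sphinx-build``. As a rough equivalent of ``make html`` run from the ``docs/`` directory, here is a sketch assuming Sphinx is installed; the paths mirror the ``BUILDDIR = _build`` default above::

    import subprocess

    # Same builder (-b html) and doctree cache (-d) as the Makefile's
    # ``html`` target, writing the output to _build/html.
    subprocess.run(
        ['sphinx-build', '-b', 'html',
         '-d', '_build/doctrees', '.', '_build/html'],
        check=True,
    )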
aiohttp-3.6.2/docs/_static/0000755000175100001650000000000013547410140016037 5ustar vstsdocker00000000000000aiohttp-3.6.2/docs/_static/aiohttp-icon-128x128.png0000644000175100001650000001064713547410117022112 0ustar vstsdocker00000000000000PNG  IHDRgAMA a cHRMz&u0`:pQ<PLTE)Z-[+[,\,[,\,[,[,[,[,[+[,Y,[,[,[,[-\.]3f+]-[,[,[+ZUU,\,\,[,[-[.['X-Z+[-[+\0X,\,[,[,[,[,Z,[,Z,Z-Z,[-Z+\-\,\-[,[,[,[,[-[.[+Z,[,[,Z-\)\,[,\,\-Z-Z,[,[+[-\.\,\,[+[+`,\,[+Y$I3f,[-[,[$[0`*],[,\,Z,]@@,[+Z.],[)Z,[+\,[,[+\,[,Z,[/^-Z,[-Y+[,[,]9U,Z,\,Z,Y+Y,[,[*Y+\+Z `,[,[-Z,[,[-Z-Y,[,\3U,['b,[+U+[,[,[+[,[,\,[,[+[+\,[+[-Z,[,Z,W+Z,[+^,\,[-[,[-[+Z+\,[.[+[-\+[(^1U-[,[,[,[,\,[,[-Y,[+U,Z-Z+\.],\,Z.],\-\-[+Z-\,[,[,[+U-Z,[,Z+\.\+Z+[,Z-[,\,[+[,[,\,[-[/^&Y,Z,\+\+[-[,[-[,\*X+[,[-Z-[,[,X,Z+Z,[.\-[,[+\+[+[,Z+Z,[+Z,[5tRNSIkjQPBҊAECw :ɯifWUdl8GrFD>g';M m7@40,%SNt&J?) z.<+/3( *y$5"#6Lզ^T}_-b\9 ]!K a|Άs[RY=xh¾u~coZV1p`n{2HveXq{bKGDj6 pHYs=tIME  #}Hh pIDATxڽg@WzD H]AY`A5 kT D "EA*jbDŨ(gBk`C#cS3 LM ea9]VCml*s;{a0|.t#-Q~Xru7ǻ48xL7i4﨑9~6vq 1U: UɺQ>1*>v"c#&$B] <(Z&GEKt&J_PAF43jdȴ%ї.Sڄ6~,Y &d(N?sUVjfb戙<\,[`S\p:# 7}W MHR8hR6WqJPTLng*J\҄iu ok$U2lrJU9m-'eby, 4uylX[핐I n`w$OA!dg*X[]nt8ܽ'LDj`B P. <*Ez<ϕw!tG9Ib}*\:AX/f6rC~bZKD|.CiVεK ρQ#DO$r@ J%hf4nU;hb) 3Nⵧ87}H 'і.rӤ> $xE s.j $œJv0T#&p0e;f$bEgM U ZCE:фxmPp8͉6KK<ԃ?`zX*'j":L'$s-h[RkکĮlk Eb/$u_SFrEjPDԲCgq!l'c9Ju@=9H)tk QDMA@ra@C2˨"ibԡ :b {C 'l+#߯PscHq-9 MI39uEt:2/QP{9 1+M 3^;hkV>{}jo)PAdbC-4ي.Ha\ XA~C{ƅ=*#,[rW/{UǷy,8oUfdX49[@(oT0k (3^W?BH?f]qsyle, BKg:*T2|]3$v'l|~$TɎLIpyq`og^gbhT3ݣ0U5d4|Ũ[4\P<]M7>c(Qaa-h&2y:m" wڕbĵj77]<ɔU@WBڼ%} Kq#?/UO#wCz"sVGeM`i-eüDl_Ů6xSj8,}-bL% ʊY$V.!L$8 4NjXB*r;!TɊ 7נoKuGP,$,| A{7N! Ϊ1Z¡frȩT=tH- O`),j-Tm`<%*UȰ|t7Ek 0NXzs val 0pL6}̰Lv@(Ё!pw_Z*#wf/]gnU-OˇL%965|x\acx`>`E ) |,)Bb"]oh)\'х '^KQ1Hz/_LGGx ד N՜ *A݆+2a#?8E=wd[u??ۨ|~䃟4K^}ʘ8žS[4'?Cʗʒ͸s G5SyOq'!MF>Z,L,媎x.b0V`@31Y< Ԫ3Ҝ\v(rED ٹl l..V E{A+[%w㺘xa!6WGCZ = dΉty"LjUWq!r7|#%tEXtdate:create2018-10-06T12:06:19+00:00\JZ%tEXtdate:modify2018-10-06T12:06:19+00:00-tEXtSoftwarewww.inkscape.org<IENDB`aiohttp-3.6.2/docs/abc.rst0000644000175100001650000001167413547410117015705 0ustar vstsdocker00000000000000.. _aiohttp-abc: Abstract Base Classes ===================== .. module:: aiohttp.abc Abstract routing ---------------- aiohttp has abstract classes for managing web interfaces. The most part of :mod:`aiohttp.web` is not intended to be inherited but few of them are. aiohttp.web is built on top of few concepts: *application*, *router*, *request* and *response*. *router* is a *pluggable* part: a library user may build a *router* from scratch, all other parts should work with new router seamlessly. :class:`AbstractRouter` has the only mandatory method: :meth:`AbstractRouter.resolve` coroutine. It must return an :class:`AbstractMatchInfo` instance. If the requested URL handler is found :meth:`AbstractMatchInfo.handler` is a :term:`web-handler` for requested URL and :attr:`AbstractMatchInfo.http_exception` is ``None``. Otherwise :attr:`AbstractMatchInfo.http_exception` is an instance of :exc:`~aiohttp.web.HTTPException` like *404: NotFound* or *405: Method Not Allowed*. :meth:`AbstractMatchInfo.handler` raises :attr:`~AbstractMatchInfo.http_exception` on call. .. class:: aiohttp.abc.AbstractRouter Abstract router, :class:`aiohttp.web.Application` accepts it as *router* parameter and returns as :attr:`aiohttp.web.Application.router`. .. coroutinemethod:: resolve(request) Performs URL resolving. It's an abstract method, should be overridden in *router* implementation. 
:param request: :class:`aiohttp.web.Request` instance for resolving, the request has :attr:`aiohttp.web.Request.match_info` equals to ``None`` at resolving stage. :return: :class:`AbstractMatchInfo` instance. .. class:: aiohttp.abc.AbstractMatchInfo Abstract *match info*, returned by :meth:`AbstractRouter.resolve` call. .. attribute:: http_exception :exc:`aiohttp.web.HTTPException` if no match was found, ``None`` otherwise. .. coroutinemethod:: handler(request) Abstract method performing :term:`web-handler` processing. :param request: :class:`aiohttp.web.Request` instance for resolving, the request has :attr:`aiohttp.web.Request.match_info` equals to ``None`` at resolving stage. :return: :class:`aiohttp.web.StreamResponse` or descendants. :raise: :class:`aiohttp.web.HTTPException` on error .. coroutinemethod:: expect_handler(request) Abstract method for handling *100-continue* processing. Abstract Class Based Views -------------------------- For *class based view* support aiohttp has abstract :class:`AbstractView` class which is *awaitable* (may be uses like ``await Cls()`` or ``yield from Cls()`` and has a *request* as an attribute. .. class:: AbstractView An abstract class, base for all *class based views* implementations. Methods ``__iter__`` and ``__await__`` should be overridden. .. attribute:: request :class:`aiohttp.web.Request` instance for performing the request. Abstract Cookie Jar ------------------- .. class:: aiohttp.abc.AbstractCookieJar The cookie jar instance is available as :attr:`ClientSession.cookie_jar`. The jar contains :class:`~http.cookies.Morsel` items for storing internal cookie data. API provides a count of saved cookies:: len(session.cookie_jar) These cookies may be iterated over:: for cookie in session.cookie_jar: print(cookie.key) print(cookie["domain"]) An abstract class for cookie storage. Implements :class:`collections.abc.Iterable` and :class:`collections.abc.Sized`. .. method:: update_cookies(cookies, response_url=None) Update cookies returned by server in ``Set-Cookie`` header. :param cookies: a :class:`collections.abc.Mapping` (e.g. :class:`dict`, :class:`~http.cookies.SimpleCookie`) or *iterable* of *pairs* with cookies returned by server's response. :param str response_url: URL of response, ``None`` for *shared cookies*. Regular cookies are coupled with server's URL and are sent only to this server, shared ones are sent in every client request. .. method:: filter_cookies(request_url) Return jar's cookies acceptable for URL and available in ``Cookie`` header for sending client requests for given URL. :param str response_url: request's URL for which cookies are asked. :return: :class:`http.cookies.SimpleCookie` with filtered cookies for given URL. Abstract Abstract Access Logger ------------------------------- .. class:: aiohttp.abc.AbstractAccessLogger An abstract class, base for all :class:`RequestHandler` ``access_logger`` implementations Method ``log`` should be overridden. .. method:: log(request, response, time) :param request: :class:`aiohttp.web.Request` object. :param response: :class:`aiohttp.web.Response` object. :param float time: Time taken to serve the request. aiohttp-3.6.2/docs/aiohttp-icon.svg0000644000175100001650000000774513547410117017551 0ustar vstsdocker00000000000000 image/svg+xml aiohttp-3.6.2/docs/aiohttp-plain.svg0000644000175100001650000000774413547410117017723 0ustar vstsdocker00000000000000 image/svg+xml aiohttp-3.6.2/docs/built_with.rst0000644000175100001650000000172213547410117017323 0ustar vstsdocker00000000000000.. 
_aiohttp-built-with: Built with aiohttp ================== aiohttp is used to build useful libraries built on top of it, and there's a page dedicated to list them: :ref:`aiohttp-3rd-party`. There are also projects that leverage the power of aiohttp to provide end-user tools, like command lines or software with full user interfaces. This page aims to list those projects. If you are using aiohttp in your software and if it's playing a central role, you can add it here in this list. You can also add a **Built with aiohttp** link somewhere in your project, pointing to ``_. * `Molotov `_ Load testing tool. * `Arsenic `_ Async WebDriver. * `Home Assistant `_ Home Automation Platform. * `Backend.AI `_ Code execution API service. * `doh-proxy `_ DNS Over HTTPS Proxy. aiohttp-3.6.2/docs/changes.rst0000644000175100001650000000011713547410117016556 0ustar vstsdocker00000000000000.. _aiohttp_changes: .. include:: ../CHANGES.rst .. include:: ../HISTORY.rst aiohttp-3.6.2/docs/client.rst0000644000175100001650000000045413547410117016430 0ustar vstsdocker00000000000000.. _aiohttp-client: Client ====== .. currentmodule:: aiohttp The page contains all information about aiohttp Client API: .. toctree:: :name: client Quickstart Advanced Usage Reference Tracing Reference aiohttp-3.6.2/docs/client_advanced.rst0000644000175100001650000004703513547410117020263 0ustar vstsdocker00000000000000.. _aiohttp-client-advanced: Advanced Client Usage ===================== .. currentmodule:: aiohttp .. _aiohttp-client-session: Client Session -------------- :class:`ClientSession` is the heart and the main entry point for all client API operations. Create the session first, use the instance for performing HTTP requests and initiating WebSocket connections. The session contains a cookie storage and connection pool, thus cookies and connections are shared between HTTP requests sent by the same session. Custom Request Headers ---------------------- If you need to add HTTP headers to a request, pass them in a :class:`dict` to the *headers* parameter. For example, if you want to specify the content-type directly:: url = 'http://example.com/image' payload = b'GIF89a\x01\x00\x01\x00\x00\xff\x00,\x00\x00' b'\x00\x00\x01\x00\x01\x00\x00\x02\x00;' headers = {'content-type': 'image/gif'} await session.post(url, data=payload, headers=headers) You also can set default headers for all session requests:: headers={"Authorization": "Basic bG9naW46cGFzcw=="} async with aiohttp.ClientSession(headers=headers) as session: async with session.get("http://httpbin.org/headers") as r: json_body = await r.json() assert json_body['headers']['Authorization'] == \ 'Basic bG9naW46cGFzcw==' Typical use case is sending JSON body. You can specify content type directly as shown above, but it is more convenient to use special keyword ``json``:: await session.post(url, json={'example': 'text'}) For *text/plain* :: await session.post(url, data='Привет, Мир!') Custom Cookies -------------- To send your own cookies to the server, you can use the *cookies* parameter of :class:`ClientSession` constructor:: url = 'http://httpbin.org/cookies' cookies = {'cookies_are': 'working'} async with ClientSession(cookies=cookies) as session: async with session.get(url) as resp: assert await resp.json() == { "cookies": {"cookies_are": "working"}} .. note:: ``httpbin.org/cookies`` endpoint returns request cookies in JSON-encoded body. To access session cookies see :attr:`ClientSession.cookie_jar`. 
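For example, to inspect every cookie the session has accumulated (including ones set by intermediate responses in a redirect chain) you can iterate the jar directly; a minimal sketch, assuming a hypothetical redirecting URL::

    async with aiohttp.ClientSession() as session:
        async with session.get('http://example.com/some/redirect/') as resp:
            await resp.read()
        # The jar stores Morsel items gathered from *all* responses,
        # not only the last one in the chain.
        for cookie in session.cookie_jar:
            print(cookie.key, cookie.value)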
:class:`~aiohttp.ClientSession` may be used for sharing cookies between multiple requests:: async with aiohttp.ClientSession() as session: await session.get( 'http://httpbin.org/cookies/set?my_cookie=my_value') filtered = session.cookie_jar.filter_cookies( 'http://httpbin.org') assert filtered['my_cookie'].value == 'my_value' async with session.get('http://httpbin.org/cookies') as r: json_body = await r.json() assert json_body['cookies']['my_cookie'] == 'my_value' Response Headers and Cookies ---------------------------- We can view the server's response :attr:`ClientResponse.headers` using a :class:`~multidict.CIMultiDictProxy`:: assert resp.headers == { 'ACCESS-CONTROL-ALLOW-ORIGIN': '*', 'CONTENT-TYPE': 'application/json', 'DATE': 'Tue, 15 Jul 2014 16:49:51 GMT', 'SERVER': 'gunicorn/18.0', 'CONTENT-LENGTH': '331', 'CONNECTION': 'keep-alive'} The dictionary is special, though: it's made just for HTTP headers. According to `RFC 7230 `_, HTTP Header names are case-insensitive. It also supports multiple values for the same key as HTTP protocol does. So, we can access the headers using any capitalization we want:: assert resp.headers['Content-Type'] == 'application/json' assert resp.headers.get('content-type') == 'application/json' All headers are converted from binary data using UTF-8 with ``surrogateescape`` option. That works fine on most cases but sometimes unconverted data is needed if a server uses nonstandard encoding. While these headers are malformed from :rfc:`7230` perspective they may be retrieved by using :attr:`ClientResponse.raw_headers` property:: assert resp.raw_headers == ( (b'SERVER', b'nginx'), (b'DATE', b'Sat, 09 Jan 2016 20:28:40 GMT'), (b'CONTENT-TYPE', b'text/html; charset=utf-8'), (b'CONTENT-LENGTH', b'12150'), (b'CONNECTION', b'keep-alive')) If a response contains some *HTTP Cookies*, you can quickly access them:: url = 'http://example.com/some/cookie/setting/url' async with session.get(url) as resp: print(resp.cookies['example_cookie_name']) .. note:: Response cookies contain only values, that were in ``Set-Cookie`` headers of the **last** request in redirection chain. To gather cookies between all redirection requests please use :ref:`aiohttp.ClientSession ` object. Redirection History ------------------- If a request was redirected, it is possible to view previous responses using the :attr:`~ClientResponse.history` attribute:: resp = await session.get('http://example.com/some/redirect/') assert resp.status == 200 assert resp.url = URL('http://example.com/some/other/url/') assert len(resp.history) == 1 assert resp.history[0].status == 301 assert resp.history[0].url = URL( 'http://example.com/some/redirect/') If no redirects occurred or ``allow_redirects`` is set to ``False``, history will be an empty sequence. Cookie Jar ---------- .. _aiohttp-client-cookie-safety: Cookie Safety ^^^^^^^^^^^^^ By default :class:`~aiohttp.ClientSession` uses strict version of :class:`aiohttp.CookieJar`. :rfc:`2109` explicitly forbids cookie accepting from URLs with IP address instead of DNS name (e.g. `http://127.0.0.1:80/cookie`). It's good but sometimes for testing we need to enable support for such cookies. It should be done by passing `unsafe=True` to :class:`aiohttp.CookieJar` constructor:: jar = aiohttp.CookieJar(unsafe=True) session = aiohttp.ClientSession(cookie_jar=jar) .. _aiohttp-client-dummy-cookie-jar: Dummy Cookie Jar ^^^^^^^^^^^^^^^^ Sometimes cookie processing is not desirable. 
For this purpose it's possible to pass :class:`aiohttp.DummyCookieJar` instance into client session:: jar = aiohttp.DummyCookieJar() session = aiohttp.ClientSession(cookie_jar=jar) Uploading pre-compressed data ----------------------------- To upload data that is already compressed before passing it to aiohttp, call the request function with the used compression algorithm name (usually ``deflate`` or ``gzip``) as the value of the ``Content-Encoding`` header:: async def my_coroutine(session, headers, my_data): data = zlib.compress(my_data) headers = {'Content-Encoding': 'deflate'} async with session.post('http://httpbin.org/post', data=data, headers=headers) pass Disabling content type validation for JSON responses ---------------------------------------------------- The standard explicitly restricts JSON ``Content-Type`` HTTP header to ``application/json`` or any extended form, e.g. ``application/vnd.custom-type+json``. Unfortunately, some servers send a wrong type, like ``text/html``. This can be worked around in two ways: 1. Pass the expected type explicitly (in this case checking will be strict, without the extended form support, so ``custom/xxx+type`` won't be accepted): ``await resp.json(content_type='custom/type')``. 2. Disable the check entirely: ``await resp.json(content_type=None)``. .. _aiohttp-client-tracing: Client Tracing -------------- The execution flow of a specific request can be followed attaching listeners coroutines to the signals provided by the :class:`TraceConfig` instance, this instance will be used as a parameter for the :class:`ClientSession` constructor having as a result a client that triggers the different signals supported by the :class:`TraceConfig`. By default any instance of :class:`ClientSession` class comes with the signals ability disabled. The following snippet shows how the start and the end signals of a request flow can be followed:: async def on_request_start( session, trace_config_ctx, params): print("Starting request") async def on_request_end(session, trace_config_ctx, params): print("Ending request") trace_config = aiohttp.TraceConfig() trace_config.on_request_start.append(on_request_start) trace_config.on_request_end.append(on_request_end) async with aiohttp.ClientSession( trace_configs=[trace_config]) as client: client.get('http://example.com/some/redirect/') The ``trace_configs`` is a list that can contain instances of :class:`TraceConfig` class that allow run the signals handlers coming from different :class:`TraceConfig` instances. The following example shows how two different :class:`TraceConfig` that have a different nature are installed to perform their job in each signal handle:: from mylib.traceconfig import AuditRequest from mylib.traceconfig import XRay async with aiohttp.ClientSession( trace_configs=[AuditRequest(), XRay()]) as client: client.get('http://example.com/some/redirect/') All signals take as a parameters first, the :class:`ClientSession` instance used by the specific request related to that signals and second, a :class:`SimpleNamespace` instance called ``trace_config_ctx``. 
The ``trace_config_ctx`` object can be used to share the state through to the different signals that belong to the same request and to the same :class:`TraceConfig` class, perhaps:: async def on_request_start( session, trace_config_ctx, params): trace_config_ctx.start = asyncio.get_event_loop().time() async def on_request_end(session, trace_config_ctx, params): elapsed = asyncio.get_event_loop().time() - trace_config_ctx.start print("Request took {}".format(elapsed)) The ``trace_config_ctx`` param is by default a :class:`SimpleNampespace` that is initialized at the beginning of the request flow. However, the factory used to create this object can be overwritten using the ``trace_config_ctx_factory`` constructor param of the :class:`TraceConfig` class. The ``trace_request_ctx`` param can given at the beginning of the request execution, accepted by all of the HTTP verbs, and will be passed as a keyword argument for the ``trace_config_ctx_factory`` factory. This param is useful to pass data that is only available at request time, perhaps:: async def on_request_start( session, trace_config_ctx, params): print(trace_config_ctx.trace_request_ctx) session.get('http://example.com/some/redirect/', trace_request_ctx={'foo': 'bar'}) .. seealso:: :ref:`aiohttp-client-tracing-reference` section for more information about the different signals supported. Connectors ---------- To tweak or change *transport* layer of requests you can pass a custom *connector* to :class:`~aiohttp.ClientSession` and family. For example:: conn = aiohttp.TCPConnector() session = aiohttp.ClientSession(connector=conn) .. note:: By default *session* object takes the ownership of the connector, among other things closing the connections once the *session* is closed. If you are keen on share the same *connector* through different *session* instances you must give the *connector_owner* parameter as **False** for each *session* instance. .. seealso:: :ref:`aiohttp-client-reference-connectors` section for more information about different connector types and configuration options. Limiting connection pool size ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ To limit amount of simultaneously opened connections you can pass *limit* parameter to *connector*:: conn = aiohttp.TCPConnector(limit=30) The example limits total amount of parallel connections to `30`. The default is `100`. If you explicitly want not to have limits, pass `0`. For example:: conn = aiohttp.TCPConnector(limit=0) To limit amount of simultaneously opened connection to the same endpoint (``(host, port, is_ssl)`` triple) you can pass *limit_per_host* parameter to *connector*:: conn = aiohttp.TCPConnector(limit_per_host=30) The example limits amount of parallel connections to the same to `30`. The default is `0` (no limit on per host bases). Tuning the DNS cache ^^^^^^^^^^^^^^^^^^^^ By default :class:`~aiohttp.TCPConnector` comes with the DNS cache table enabled, and resolutions will be cached by default for `10` seconds. 
This behavior can be changed either to change of the TTL for a resolution, as can be seen in the following example:: conn = aiohttp.TCPConnector(ttl_dns_cache=300) or disabling the use of the DNS cache table, meaning that all requests will end up making a DNS resolution, as the following example shows:: conn = aiohttp.TCPConnector(use_dns_cache=False) Resolving using custom nameservers ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ In order to specify the nameservers to when resolving the hostnames, :term:`aiodns` is required:: from aiohttp.resolver import AsyncResolver resolver = AsyncResolver(nameservers=["8.8.8.8", "8.8.4.4"]) conn = aiohttp.TCPConnector(resolver=resolver) Unix domain sockets ^^^^^^^^^^^^^^^^^^^ If your HTTP server uses UNIX domain sockets you can use :class:`~aiohttp.UnixConnector`:: conn = aiohttp.UnixConnector(path='/path/to/socket') session = aiohttp.ClientSession(connector=conn) Named pipes in Windows ^^^^^^^^^^^^^^^^^^^^^^ If your HTTP server uses Named pipes you can use :class:`~aiohttp.NamedPipeConnector`:: conn = aiohttp.NamedPipeConnector(path=r'\\.\pipe\') session = aiohttp.ClientSession(connector=conn) It will only work with the ProactorEventLoop SSL control for TCP sockets --------------------------- By default *aiohttp* uses strict checks for HTTPS protocol. Certification checks can be relaxed by setting *ssl* to ``False``:: r = await session.get('https://example.com', ssl=False) If you need to setup custom ssl parameters (use own certification files for example) you can create a :class:`ssl.SSLContext` instance and pass it into the proper :class:`ClientSession` method:: sslcontext = ssl.create_default_context( cafile='/path/to/ca-bundle.crt') r = await session.get('https://example.com', ssl=sslcontext) If you need to verify *self-signed* certificates, you can do the same thing as the previous example, but add another call to :meth:`ssl.SSLContext.load_cert_chain` with the key pair:: sslcontext = ssl.create_default_context( cafile='/path/to/ca-bundle.crt') sslcontext.load_cert_chain('/path/to/client/public/device.pem', '/path/to/client/private/device.key') r = await session.get('https://example.com', ssl=sslcontext) There is explicit errors when ssl verification fails :class:`aiohttp.ClientConnectorSSLError`:: try: await session.get('https://expired.badssl.com/') except aiohttp.ClientConnectorSSLError as e: assert isinstance(e, ssl.SSLError) :class:`aiohttp.ClientConnectorCertificateError`:: try: await session.get('https://wrong.host.badssl.com/') except aiohttp.ClientConnectorCertificateError as e: assert isinstance(e, ssl.CertificateError) If you need to skip both ssl related errors :class:`aiohttp.ClientSSLError`:: try: await session.get('https://expired.badssl.com/') except aiohttp.ClientSSLError as e: assert isinstance(e, ssl.SSLError) try: await session.get('https://wrong.host.badssl.com/') except aiohttp.ClientSSLError as e: assert isinstance(e, ssl.CertificateError) You may also verify certificates via *SHA256* fingerprint:: # Attempt to connect to https://www.python.org # with a pin to a bogus certificate: bad_fp = b'0'*64 exc = None try: r = await session.get('https://www.python.org', ssl=aiohttp.Fingerprint(bad_fp)) except aiohttp.FingerprintMismatch as e: exc = e assert exc is not None assert exc.expected == bad_fp # www.python.org cert's actual fingerprint assert exc.got == b'...' Note that this is the fingerprint of the DER-encoded certificate. 
If you have the certificate in PEM format, you can convert it to DER with,
e.g.::

    openssl x509 -in crt.pem -inform PEM -outform DER > crt.der

.. note::

   Tip: to convert from a hexadecimal digest to a binary byte-string, you
   can use :func:`binascii.unhexlify`.

The *ssl* parameter may also be passed to :class:`TCPConnector` to set a
default; a value given to :meth:`ClientSession.get` and friends overrides
that default.

Proxy support
-------------

aiohttp supports plain HTTP proxies and HTTP proxies that can be upgraded
to HTTPS via the HTTP CONNECT method. aiohttp does not support proxies that
must be connected to via ``https://``. To connect, use the *proxy*
parameter::

    async with aiohttp.ClientSession() as session:
        async with session.get("http://python.org",
                               proxy="http://proxy.com") as resp:
            print(resp.status)

It also supports proxy authorization::

    async with aiohttp.ClientSession() as session:
        proxy_auth = aiohttp.BasicAuth('user', 'pass')
        async with session.get("http://python.org",
                               proxy="http://proxy.com",
                               proxy_auth=proxy_auth) as resp:
            print(resp.status)

Authentication credentials can also be passed in the proxy URL::

    session.get("http://python.org",
                proxy="http://user:pass@some.proxy.com")

Contrary to the ``requests`` library, aiohttp won't read proxy settings
from environment variables by default. You can enable this by passing
``trust_env=True`` to the :class:`aiohttp.ClientSession` constructor, which
extracts the proxy configuration from the *HTTP_PROXY* or *HTTPS_PROXY*
*environment variables* (both are case insensitive)::

    async with aiohttp.ClientSession(trust_env=True) as session:
        async with session.get("http://python.org") as resp:
            print(resp.status)

Proxy credentials are read from the ``~/.netrc`` file if present (see
:class:`aiohttp.ClientSession` for more details).

Graceful Shutdown
-----------------

When a :class:`ClientSession` closes at the end of an ``async with`` block
(or through a direct :meth:`ClientSession.close()` call), the underlying
connection remains open due to asyncio internal details. In practice, the
underlying connection will close after a short while. However, if the event
loop is stopped before the underlying connection is closed, a
``ResourceWarning: unclosed transport`` warning is emitted (when warnings
are enabled).

To avoid this situation, a small delay must be added before closing the
event loop to allow any open underlying connections to close.

For a :class:`ClientSession` without SSL, a simple zero-sleep
(``await asyncio.sleep(0)``) will suffice::

    async def read_website():
        async with aiohttp.ClientSession() as session:
            async with session.get('http://example.org/') as resp:
                await resp.read()

    loop = asyncio.get_event_loop()
    loop.run_until_complete(read_website())
    # Zero-sleep to allow underlying connections to close
    loop.run_until_complete(asyncio.sleep(0))
    loop.close()

For a :class:`ClientSession` with SSL, the application must wait a short
duration before closing::

    ...
    # Wait 250 ms for the underlying SSL connections to close
    loop.run_until_complete(asyncio.sleep(0.250))
    loop.close()

Note that the appropriate amount of time to wait will vary from application
to application.

All of this will eventually become obsolete when the asyncio internals are
changed so that aiohttp itself can wait on the underlying connection to
close. Please follow issue `#1925 `_ for the progress on this.

aiohttp-3.6.2/docs/client_quickstart.rst0000644000175100001650000003326413547410117020707 0ustar vstsdocker00000000000000
.. _aiohttp-client-quickstart:

===================
 Client Quickstart
===================
.. currentmodule:: aiohttp

Eager to get started? This page gives a good introduction to getting
started with the aiohttp client API.

First, make sure that aiohttp is :ref:`installed ` and *up-to-date*.

Let's get started with some simple examples.

Make a Request
==============

Begin by importing the aiohttp module::

    import aiohttp

Now, let's try to get a web-page. For example, let's query
``http://httpbin.org/get``::

    async with aiohttp.ClientSession() as session:
        async with session.get('http://httpbin.org/get') as resp:
            print(resp.status)
            print(await resp.text())

Now we have a :class:`ClientSession` called ``session`` and a
:class:`ClientResponse` object called ``resp``. We can get all the
information we need from the response. The mandatory parameter of the
:meth:`ClientSession.get` coroutine is an HTTP *url* (:class:`str` or
:class:`yarl.URL` instance).

In order to make an HTTP POST request use the :meth:`ClientSession.post`
coroutine::

    session.post('http://httpbin.org/post', data=b'data')

Other HTTP methods are available as well::

    session.put('http://httpbin.org/put', data=b'data')
    session.delete('http://httpbin.org/delete')
    session.head('http://httpbin.org/get')
    session.options('http://httpbin.org/get')
    session.patch('http://httpbin.org/patch', data=b'data')

.. note::

   Don't create a session per request. Most likely you need a session per
   application which performs all requests altogether.

   More complex cases may require a session per site, e.g. one for Github
   and another one for Facebook APIs. Anyway, making a session for every
   request is a **very bad** idea.

   A session contains a connection pool inside. Connection reuse and
   keep-alives (both are on by default) may speed up total performance.

Using the session as a context manager is not mandatory, but then the
``await session.close()`` method should be called explicitly, e.g.::

    session = aiohttp.ClientSession()
    async with session.get('...'):
        # ...
    await session.close()

Passing Parameters In URLs
==========================

You often want to send some sort of data in the URL's query string. If you
were constructing the URL by hand, this data would be given as key/value
pairs in the URL after a question mark, e.g. ``httpbin.org/get?key=val``.
aiohttp allows you to provide these arguments as a :class:`dict`, using the
``params`` keyword argument. As an example, if you wanted to pass
``key1=value1`` and ``key2=value2`` to ``httpbin.org/get``, you would use
the following code::

    params = {'key1': 'value1', 'key2': 'value2'}
    async with session.get('http://httpbin.org/get',
                           params=params) as resp:
        expect = 'http://httpbin.org/get?key2=value2&key1=value1'
        assert str(resp.url) == expect

You can see that the URL has been correctly encoded by printing the URL.

For sending data with multiple values for the same key, :class:`MultiDict`
may be used as well; a sketch is shown at the end of this section.

It is also possible to pass a list of 2-item tuples as parameters; in that
case you can specify multiple values for each key::

    params = [('key', 'value1'), ('key', 'value2')]
    async with session.get('http://httpbin.org/get',
                           params=params) as r:
        expect = 'http://httpbin.org/get?key=value2&key=value1'
        assert str(r.url) == expect

You can also pass :class:`str` content as the param, but beware -- the
content is not encoded by the library. Note that ``+`` is not encoded::

    async with session.get('http://httpbin.org/get',
                           params='key=value+1') as r:
        assert str(r.url) == 'http://httpbin.org/get?key=value+1'
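As mentioned above, a :class:`MultiDict` can be passed directly when the
same key needs several values (a minimal sketch; ``multidict`` is installed
as an aiohttp dependency)::

    from multidict import MultiDict

    params = MultiDict([('key', 'value1'), ('key', 'value2')])
    async with session.get('http://httpbin.org/get',
                           params=params) as resp:
        # Both values for ``key`` end up in the query string.
        print(resp.url)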
.. note::

   *aiohttp* internally performs URL canonization before sending a request.

   Canonization encodes the *host* part by the :term:`IDNA` codec and
   applies :term:`requoting` to the *path* and *query* parts.

   For example ``URL('http://example.com/путь/%30?a=%31')`` is converted to
   ``URL('http://example.com/%D0%BF%D1%83%D1%82%D1%8C/0?a=1')``.

   Sometimes canonization is not desirable, e.g. if the server accepts the
   exact representation and does not requote the URL itself.

   To disable canonization use the ``encoded=True`` parameter for URL
   construction::

      await session.get(
          URL('http://example.com/%30', encoded=True))

.. warning::

   Passing *params* overrides ``encoded=True``, never use both options.

Response Content and Status Code
================================

We can read the content of the server's response and its status code.
Consider the GitHub time-line again::

    async with session.get('https://api.github.com/events') as resp:
        print(resp.status)
        print(await resp.text())

prints out something like::

    200
    '[{"created_at":"2015-06-12T14:06:22Z","public":true,"actor":{...

``aiohttp`` automatically decodes the content from the server. You can
specify a custom encoding for the :meth:`~ClientResponse.text` method::

    await resp.text(encoding='windows-1251')

Binary Response Content
=======================

You can also access the response body as bytes, for non-text requests::

    print(await resp.read())

::

    b'[{"created_at":"2015-06-12T14:06:22Z","public":true,"actor":{...

The ``gzip`` and ``deflate`` transfer-encodings are automatically decoded
for you.

To enable ``brotli`` transfer-encoding support, just install
`brotlipy `_.

JSON Request
============

Any of the session's request methods like :func:`request`,
:meth:`ClientSession.get`, :meth:`ClientSession.post` etc. accept the
``json`` parameter::

    async with aiohttp.ClientSession() as session:
        async with session.post(url, json={'test': 'object'}) as resp:
            ...

By default the session uses Python's standard :mod:`json` module for
serialization, but it is possible to use a different serializer.
:class:`ClientSession` accepts the ``json_serialize`` parameter::

    import ujson

    async with aiohttp.ClientSession(
            json_serialize=ujson.dumps) as session:
        await session.post(url, json={'test': 'object'})

.. note::

   The ``ujson`` library is faster than the standard :mod:`json` but
   slightly incompatible.

JSON Response Content
=====================

There's also a built-in JSON decoder, in case you're dealing with JSON
data::

    async with session.get('https://api.github.com/events') as resp:
        print(await resp.json())

In case JSON decoding fails, :meth:`~ClientResponse.json` will raise an
exception. It is possible to specify custom encoding and decoder functions
for the :meth:`~ClientResponse.json` call.

.. note::

   The methods above read the whole response body into memory. If you are
   planning on reading lots of data, consider using the streaming response
   method documented below.

Streaming Response Content
==========================

While the methods :meth:`~ClientResponse.read`,
:meth:`~ClientResponse.json` and :meth:`~ClientResponse.text` are very
convenient, you should use them carefully. All these methods load the whole
response into memory. For example, if you want to download files that are
several gigabytes in size, these methods will load all the data into
memory. Instead you can use the :attr:`~ClientResponse.content` attribute.
It is an instance of the :class:`aiohttp.StreamReader` class.
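For instance, the stream can be consumed chunk by chunk with an async
iterator (a minimal sketch; the chunk size is arbitrary)::

    async with session.get('https://api.github.com/events') as resp:
        # Read the body in fixed-size pieces without loading
        # the whole response into memory.
        async for chunk in resp.content.iter_chunked(1024):
            print(len(chunk))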
The ``gzip`` and ``deflate`` transfer-encodings are automatically decoded
for you::

    async with session.get('https://api.github.com/events') as resp:
        await resp.content.read(10)

In general, however, you should use a pattern like this to save what is
being streamed to a file::

    with open(filename, 'wb') as fd:
        while True:
            chunk = await resp.content.read(chunk_size)
            if not chunk:
                break
            fd.write(chunk)

It is not possible to use :meth:`~ClientResponse.read`,
:meth:`~ClientResponse.json` and :meth:`~ClientResponse.text` after
explicit reading from :attr:`~ClientResponse.content`.

More complicated POST requests
==============================

Typically, you want to send some form-encoded data -- much like an HTML
form. To do this, simply pass a dictionary to the *data* argument. Your
dictionary of data will automatically be form-encoded when the request is
made::

    payload = {'key1': 'value1', 'key2': 'value2'}
    async with session.post('http://httpbin.org/post',
                            data=payload) as resp:
        print(await resp.text())

::

    {
      ...
      "form": {
        "key2": "value2",
        "key1": "value1"
      },
      ...
    }

If you want to send data that is not form-encoded you can do it by passing
:class:`bytes` instead of a :class:`dict`. This data will be posted
directly and the content-type is set to ``'application/octet-stream'`` by
default::

    async with session.post(url, data=b'\x00Binary-data\x00') as resp:
        ...

If you want to send JSON data::

    async with session.post(url, json={'example': 'test'}) as resp:
        ...

To send text with an appropriate content-type just pass :class:`str`
data::

    async with session.post(url, data='Тест') as resp:
        ...

POST a Multipart-Encoded File
=============================

To upload Multipart-encoded files::

    url = 'http://httpbin.org/post'
    files = {'file': open('report.xls', 'rb')}

    await session.post(url, data=files)

You can set the ``filename`` and ``content_type`` explicitly::

    url = 'http://httpbin.org/post'
    data = FormData()
    data.add_field('file',
                   open('report.xls', 'rb'),
                   filename='report.xls',
                   content_type='application/vnd.ms-excel')

    await session.post(url, data=data)

If you pass a file object as the data parameter, aiohttp will stream it to
the server automatically. Check :class:`~aiohttp.streams.StreamReader` for
supported format information.

.. seealso:: :ref:`aiohttp-multipart`

Streaming uploads
=================

:mod:`aiohttp` supports multiple types of streaming uploads, which allows
you to send large files without reading them into memory.

As a simple case, simply provide a file-like object for your body::

    with open('massive-body', 'rb') as f:
        await session.post('http://httpbin.org/post', data=f)

Or you can use an *asynchronous generator*::

    async def file_sender(file_name=None):
        async with aiofiles.open(file_name, 'rb') as f:
            chunk = await f.read(64*1024)
            while chunk:
                yield chunk
                chunk = await f.read(64*1024)

    # Then you can use file_sender as a data provider:
    async with session.post('http://httpbin.org/post',
                            data=file_sender(file_name='huge_file')) as resp:
        print(await resp.text())

Because the :attr:`~aiohttp.ClientResponse.content` attribute is a
:class:`~aiohttp.StreamReader` (it provides the async iterator protocol),
you can chain get and post requests together::

    resp = await session.get('http://python.org')
    await session.post('http://httpbin.org/post', data=resp.content)

.. note::

   Python 3.5 has no native support for asynchronous generators, use the
   ``async_generator`` library as a workaround.

..
deprecated:: 3.1 ``aiohttp`` still supports ``aiohttp.streamer`` decorator but this approach is deprecated in favor of *asynchronous generators* as shown above. .. _aiohttp-client-websockets: WebSockets ========== :mod:`aiohttp` works with client websockets out-of-the-box. You have to use the :meth:`aiohttp.ClientSession.ws_connect` coroutine for client websocket connection. It accepts a *url* as a first parameter and returns :class:`ClientWebSocketResponse`, with that object you can communicate with websocket server using response's methods:: async with session.ws_connect('http://example.org/ws') as ws: async for msg in ws: if msg.type == aiohttp.WSMsgType.TEXT: if msg.data == 'close cmd': await ws.close() break else: await ws.send_str(msg.data + '/answer') elif msg.type == aiohttp.WSMsgType.ERROR: break You **must** use the only websocket task for both reading (e.g. ``await ws.receive()`` or ``async for msg in ws:``) and writing but may have multiple writer tasks which can only send data asynchronously (by ``await ws.send_str('data')`` for example). .. _aiohttp-client-timeouts: Timeouts ======== Timeout settings are stored in :class:`ClientTimeout` data structure. By default *aiohttp* uses a *total* 5min timeout, it means that the whole operation should finish in 5 minutes. The value could be overridden by *timeout* parameter for the session:: timeout = aiohttp.ClientTimeout(total=60) async with aiohttp.ClientSession(timeout=timeout) as session: ... Timeout could be overridden for a request like :meth:`ClientSession.get`:: async with session.get(url, timeout=timeout) as resp: ... Supported :class:`ClientTimeout` fields are: ``total`` The whole operation time including connection establishment, request sending and response reading. ``connect`` The time consists connection establishment for a new connection or waiting for a free connection from a pool if pool connection limits are exceeded. ``sock_connect`` A timeout for connecting to a peer for a new connection, not given from a pool. ``sock_read`` The maximum allowed timeout for period between reading a new data portion from a peer. All fields are floats, ``None`` or ``0`` disables a particular timeout check, see the :class:`ClientTimeout` reference for defaults and additional details. Thus the default timeout is:: aiohttp.ClientTimeout(total=5*60, connect=None, sock_connect=None, sock_read=None) aiohttp-3.6.2/docs/client_reference.rst0000644000175100001650000020263113547410117020447 0ustar vstsdocker00000000000000.. _aiohttp-client-reference: Client Reference ================ .. currentmodule:: aiohttp Client Session -------------- Client session is the recommended interface for making HTTP requests. Session encapsulates a *connection pool* (*connector* instance) and supports keepalives by default. Unless you are connecting to a large, unknown number of different servers over the lifetime of your application, it is suggested you use a single session for the lifetime of your application to benefit from connection pooling. Usage example:: import aiohttp import asyncio async def fetch(client): async with client.get('http://python.org') as resp: assert resp.status == 200 return await resp.text() async def main(): async with aiohttp.ClientSession() as client: html = await fetch(client) print(html) loop = asyncio.get_event_loop() loop.run_until_complete(main()) The client session supports the context manager protocol for self closing. .. 
class:: ClientSession(*, connector=None, loop=None, cookies=None, \ headers=None, skip_auto_headers=None, \ auth=None, json_serialize=json.dumps, \ version=aiohttp.HttpVersion11, \ cookie_jar=None, read_timeout=None, \ conn_timeout=None, \ timeout=sentinel, \ raise_for_status=False, \ connector_owner=True, \ auto_decompress=True, \ requote_redirect_url=False, \ trust_env=False, \ trace_configs=None) The class for creating client sessions and making requests. :param aiohttp.connector.BaseConnector connector: BaseConnector sub-class instance to support connection pooling. :param loop: :ref:`event loop` used for processing HTTP requests. If *loop* is ``None`` the constructor borrows it from *connector* if specified. :func:`asyncio.get_event_loop` is used for getting default event loop otherwise. .. deprecated:: 2.0 :param dict cookies: Cookies to send with the request (optional) :param headers: HTTP Headers to send with every request (optional). May be either *iterable of key-value pairs* or :class:`~collections.abc.Mapping` (e.g. :class:`dict`, :class:`~multidict.CIMultiDict`). :param skip_auto_headers: set of headers for which autogeneration should be skipped. *aiohttp* autogenerates headers like ``User-Agent`` or ``Content-Type`` if these headers are not explicitly passed. Using ``skip_auto_headers`` parameter allows to skip that generation. Note that ``Content-Length`` autogeneration can't be skipped. Iterable of :class:`str` or :class:`~aiohttp.istr` (optional) :param aiohttp.BasicAuth auth: an object that represents HTTP Basic Authorization (optional) :param version: supported HTTP version, ``HTTP 1.1`` by default. :param cookie_jar: Cookie Jar, :class:`AbstractCookieJar` instance. By default every session instance has own private cookie jar for automatic cookies processing but user may redefine this behavior by providing own jar implementation. One example is not processing cookies at all when working in proxy mode. If no cookie processing is needed, a :class:`aiohttp.DummyCookieJar` instance can be provided. :param callable json_serialize: Json *serializer* callable. By default :func:`json.dumps` function. :param bool raise_for_status: Automatically call :meth:`ClientResponse.raise_for_status()` for each response, ``False`` by default. This parameter can be overridden when you making a request, e.g.:: client_session = aiohttp.ClientSession(raise_for_status=True) resp = await client_session.get(url, raise_for_status=False) async with resp: assert resp.status == 200 Set the parameter to ``True`` if you need ``raise_for_status`` for most of cases but override ``raise_for_status`` for those requests where you need to handle responses with status 400 or higher. :param timeout: a :class:`ClientTimeout` settings structure, 5min total timeout by default. .. versionadded:: 3.3 :param float read_timeout: Request operations timeout. ``read_timeout`` is cumulative for all request operations (request, redirects, responses, data consuming). By default, the read timeout is 5*60 seconds. Use ``None`` or ``0`` to disable timeout checks. .. deprecated:: 3.3 Use ``timeout`` parameter instead. :param float conn_timeout: timeout for connection establishing (optional). Values ``0`` or ``None`` mean no timeout. .. deprecated:: 3.3 Use ``timeout`` parameter instead. :param bool connector_owner: Close connector instance on session closing. Setting the parameter to ``False`` allows to share connection pool between sessions without sharing session state: cookies etc. 
:param bool auto_decompress: Automatically decompress response body, ``True`` by default .. versionadded:: 2.3 :param bool trust_env: Get proxies information from *HTTP_PROXY* / *HTTPS_PROXY* environment variables if the parameter is ``True`` (``False`` by default). Get proxy credentials from ``~/.netrc`` file if present. .. seealso:: ``.netrc`` documentation: https://www.gnu.org/software/inetutils/manual/html_node/The-_002enetrc-file.html .. versionadded:: 2.3 .. versionchanged:: 3.0 Added support for ``~/.netrc`` file. :param bool requote_redirect_url: Apply *URL requoting* for redirection URLs if automatic redirection is enabled (``True`` by default). .. versionadded:: 3.5 :param trace_configs: A list of :class:`TraceConfig` instances used for client tracing. ``None`` (default) is used for request tracing disabling. See :ref:`aiohttp-client-tracing-reference` for more information. .. attribute:: closed ``True`` if the session has been closed, ``False`` otherwise. A read-only property. .. attribute:: connector :class:`aiohttp.connector.BaseConnector` derived instance used for the session. A read-only property. .. attribute:: cookie_jar The session cookies, :class:`~aiohttp.AbstractCookieJar` instance. Gives access to cookie jar's content and modifiers. A read-only property. .. attribute:: requote_redirect_url aiohttp re quote's redirect urls by default, but some servers require exact url from location header. To disable *re-quote* system set :attr:`requote_redirect_url` attribute to ``False``. .. versionadded:: 2.1 .. note:: This parameter affects all subsequent requests. .. deprecated:: 3.5 The attribute modification is deprecated. .. attribute:: loop A loop instance used for session creation. A read-only property. .. deprecated:: 3.5 .. comethod:: request(method, url, *, params=None, data=None, json=None,\ cookies=None, headers=None, skip_auto_headers=None, \ auth=None, allow_redirects=True,\ max_redirects=10,\ compress=None, chunked=None, expect100=False, raise_for_status=None,\ read_until_eof=True, proxy=None, proxy_auth=None,\ timeout=sentinel, ssl=None, \ verify_ssl=None, fingerprint=None, \ ssl_context=None, proxy_headers=None) :async-with: :coroutine: Performs an asynchronous HTTP request. Returns a response object. :param str method: HTTP method :param url: Request URL, :class:`str` or :class:`~yarl.URL`. :param params: Mapping, iterable of tuple of *key*/*value* pairs or string to be sent as parameters in the query string of the new request. Ignored for subsequent redirected requests (optional) Allowed values are: - :class:`collections.abc.Mapping` e.g. :class:`dict`, :class:`aiohttp.MultiDict` or :class:`aiohttp.MultiDictProxy` - :class:`collections.abc.Iterable` e.g. :class:`tuple` or :class:`list` - :class:`str` with preferably url-encoded content (**Warning:** content will not be encoded by *aiohttp*) :param data: The data to send in the body of the request. This can be a :class:`FormData` object or anything that can be passed into :class:`FormData`, e.g. a dictionary, bytes, or file-like object. (optional) :param json: Any json compatible python object (optional). *json* and *data* parameters could not be used at the same time. :param dict cookies: HTTP Cookies to send with the request (optional) Global session cookies and the explicitly set cookies will be merged when sending the request. .. versionadded:: 3.5 :param dict headers: HTTP Headers to send with the request (optional) :param skip_auto_headers: set of headers for which autogeneration should be skipped. 
*aiohttp* autogenerates headers like ``User-Agent`` or ``Content-Type`` if these headers are not explicitly passed. Using ``skip_auto_headers`` parameter allows to skip that generation. Iterable of :class:`str` or :class:`~aiohttp.istr` (optional) :param aiohttp.BasicAuth auth: an object that represents HTTP Basic Authorization (optional) :param bool allow_redirects: If set to ``False``, do not follow redirects. ``True`` by default (optional). :param int max_redirects: Maximum number of redirects to follow. ``10`` by default. :param bool compress: Set to ``True`` if request has to be compressed with deflate encoding. If `compress` can not be combined with a *Content-Encoding* and *Content-Length* headers. ``None`` by default (optional). :param int chunked: Enable chunked transfer encoding. It is up to the developer to decide how to chunk data streams. If chunking is enabled, aiohttp encodes the provided chunks in the "Transfer-encoding: chunked" format. If *chunked* is set, then the *Transfer-encoding* and *content-length* headers are disallowed. ``None`` by default (optional). :param bool expect100: Expect 100-continue response from server. ``False`` by default (optional). :param bool raise_for_status: Automatically call :meth:`ClientResponse.raise_for_status()` for response if set to ``True``. If set to ``None`` value from ``ClientSession`` will be used. ``None`` by default (optional). .. versionadded:: 3.4 :param bool read_until_eof: Read response until EOF if response does not have Content-Length header. ``True`` by default (optional). :param proxy: Proxy URL, :class:`str` or :class:`~yarl.URL` (optional) :param aiohttp.BasicAuth proxy_auth: an object that represents proxy HTTP Basic Authorization (optional) :param int timeout: override the session's timeout. .. versionchanged:: 3.3 The parameter is :class:`ClientTimeout` instance, :class:`float` is still supported for sake of backward compatibility. If :class:`float` is passed it is a *total* timeout. :param ssl: SSL validation mode. ``None`` for default SSL check (:func:`ssl.create_default_context` is used), ``False`` for skip SSL certificate validation, :class:`aiohttp.Fingerprint` for fingerprint validation, :class:`ssl.SSLContext` for custom SSL certificate validation. Supersedes *verify_ssl*, *ssl_context* and *fingerprint* parameters. .. versionadded:: 3.0 :param bool verify_ssl: Perform SSL certificate validation for *HTTPS* requests (enabled by default). May be disabled to skip validation for sites with invalid certificates. .. versionadded:: 2.3 .. deprecated:: 3.0 Use ``ssl=False`` :param bytes fingerprint: Pass the SHA256 digest of the expected certificate in DER format to verify that the certificate the server presents matches. Useful for `certificate pinning `_. Warning: use of MD5 or SHA1 digests is insecure and removed. .. versionadded:: 2.3 .. deprecated:: 3.0 Use ``ssl=aiohttp.Fingerprint(digest)`` :param ssl.SSLContext ssl_context: ssl context used for processing *HTTPS* requests (optional). *ssl_context* may be used for configuring certification authority channel, supported SSL options etc. .. versionadded:: 2.3 .. deprecated:: 3.0 Use ``ssl=ssl_context`` :param abc.Mapping proxy_headers: HTTP headers to send to the proxy if the parameter proxy has been provided. .. versionadded:: 2.3 :param trace_request_ctx: Object used to give as a kw param for each new :class:`TraceConfig` object instantiated, used to give information to the tracers that is only available at request time. .. 
versionadded:: 3.0 :return ClientResponse: a :class:`client response ` object. .. comethod:: get(url, *, allow_redirects=True, **kwargs) :async-with: :coroutine: Perform a ``GET`` request. In order to modify inner :meth:`request` parameters, provide `kwargs`. :param url: Request URL, :class:`str` or :class:`~yarl.URL` :param bool allow_redirects: If set to ``False``, do not follow redirects. ``True`` by default (optional). :return ClientResponse: a :class:`client response ` object. .. comethod:: post(url, *, data=None, **kwargs) :async-with: :coroutine: Perform a ``POST`` request. In order to modify inner :meth:`request` parameters, provide `kwargs`. :param url: Request URL, :class:`str` or :class:`~yarl.URL` :param data: Data to send in the body of the request; see :meth:`request` for details (optional) :return ClientResponse: a :class:`client response ` object. .. comethod:: put(url, *, data=None, **kwargs) :async-with: :coroutine: Perform a ``PUT`` request. In order to modify inner :meth:`request` parameters, provide `kwargs`. :param url: Request URL, :class:`str` or :class:`~yarl.URL` :param data: Data to send in the body of the request; see :meth:`request` for details (optional) :return ClientResponse: a :class:`client response ` object. .. comethod:: delete(url, **kwargs) :async-with: :coroutine: Perform a ``DELETE`` request. In order to modify inner :meth:`request` parameters, provide `kwargs`. :param url: Request URL, :class:`str` or :class:`~yarl.URL` :return ClientResponse: a :class:`client response ` object. .. comethod:: head(url, *, allow_redirects=False, **kwargs) :async-with: :coroutine: Perform a ``HEAD`` request. In order to modify inner :meth:`request` parameters, provide `kwargs`. :param url: Request URL, :class:`str` or :class:`~yarl.URL` :param bool allow_redirects: If set to ``False``, do not follow redirects. ``False`` by default (optional). :return ClientResponse: a :class:`client response ` object. .. comethod:: options(url, *, allow_redirects=True, **kwargs) :async-with: :coroutine: Perform an ``OPTIONS`` request. In order to modify inner :meth:`request` parameters, provide `kwargs`. :param url: Request URL, :class:`str` or :class:`~yarl.URL` :param bool allow_redirects: If set to ``False``, do not follow redirects. ``True`` by default (optional). :return ClientResponse: a :class:`client response ` object. .. comethod:: patch(url, *, data=None, **kwargs) :async-with: :coroutine: Perform a ``PATCH`` request. In order to modify inner :meth:`request` parameters, provide `kwargs`. :param url: Request URL, :class:`str` or :class:`~yarl.URL` :param data: Data to send in the body of the request; see :meth:`request` for details (optional) :return ClientResponse: a :class:`client response ` object. .. comethod:: ws_connect(url, *, method='GET', \ protocols=(), timeout=10.0,\ receive_timeout=None,\ auth=None,\ autoclose=True,\ autoping=True,\ heartbeat=None,\ origin=None, \ headers=None, \ proxy=None, proxy_auth=None, ssl=None, \ verify_ssl=None, fingerprint=None, \ ssl_context=None, proxy_headers=None, \ compress=0, max_msg_size=4194304) :async-with: :coroutine: Create a websocket connection. Returns a :class:`ClientWebSocketResponse` object. :param url: Websocket server url, :class:`str` or :class:`~yarl.URL` :param tuple protocols: Websocket protocols :param float timeout: Timeout for websocket to close. ``10`` seconds by default :param float receive_timeout: Timeout for websocket to receive complete message. 
``None`` (unlimited) seconds by default :param aiohttp.BasicAuth auth: an object that represents HTTP Basic Authorization (optional) :param bool autoclose: Automatically close websocket connection on close message from server. If *autoclose* is False then close procedure has to be handled manually. ``True`` by default :param bool autoping: automatically send *pong* on *ping* message from server. ``True`` by default :param float heartbeat: Send *ping* message every *heartbeat* seconds and wait *pong* response, if *pong* response is not received then close connection. The timer is reset on any data reception.(optional) :param str origin: Origin header to send to server(optional) :param dict headers: HTTP Headers to send with the request (optional) :param str proxy: Proxy URL, :class:`str` or :class:`~yarl.URL` (optional) :param aiohttp.BasicAuth proxy_auth: an object that represents proxy HTTP Basic Authorization (optional) :param ssl: SSL validation mode. ``None`` for default SSL check (:func:`ssl.create_default_context` is used), ``False`` for skip SSL certificate validation, :class:`aiohttp.Fingerprint` for fingerprint validation, :class:`ssl.SSLContext` for custom SSL certificate validation. Supersedes *verify_ssl*, *ssl_context* and *fingerprint* parameters. .. versionadded:: 3.0 :param bool verify_ssl: Perform SSL certificate validation for *HTTPS* requests (enabled by default). May be disabled to skip validation for sites with invalid certificates. .. versionadded:: 2.3 .. deprecated:: 3.0 Use ``ssl=False`` :param bytes fingerprint: Pass the SHA256 digest of the expected certificate in DER format to verify that the certificate the server presents matches. Useful for `certificate pinning `_. Note: use of MD5 or SHA1 digests is insecure and deprecated. .. versionadded:: 2.3 .. deprecated:: 3.0 Use ``ssl=aiohttp.Fingerprint(digest)`` :param ssl.SSLContext ssl_context: ssl context used for processing *HTTPS* requests (optional). *ssl_context* may be used for configuring certification authority channel, supported SSL options etc. .. versionadded:: 2.3 .. deprecated:: 3.0 Use ``ssl=ssl_context`` :param dict proxy_headers: HTTP headers to send to the proxy if the parameter proxy has been provided. .. versionadded:: 2.3 :param int compress: Enable Per-Message Compress Extension support. 0 for disable, 9 to 15 for window bit support. Default value is 0. .. versionadded:: 2.3 :param int max_msg_size: maximum size of read websocket message, 4 MB by default. To disable the size limit use ``0``. .. versionadded:: 3.3 :param str method: HTTP method to establish WebSocket connection, ``'GET'`` by default. .. versionadded:: 3.5 .. comethod:: close() Close underlying connector. Release all acquired resources. .. method:: detach() Detach connector from session without closing the former. Session is switched to closed state anyway. Basic API --------- While we encourage :class:`ClientSession` usage we also provide simple coroutines for making HTTP requests. Basic API is good for performing simple HTTP requests without keepaliving, cookies and complex connection stuff like properly configured SSL certification chaining. .. 
cofunction:: request(method, url, *, params=None, data=None, \ json=None,\ headers=None, cookies=None, auth=None, \ allow_redirects=True, max_redirects=10, \ encoding='utf-8', \ version=HttpVersion(major=1, minor=1), \ compress=None, chunked=None, expect100=False, raise_for_status=False, \ connector=None, loop=None,\ read_until_eof=True, timeout=sentinel) :async-with: Asynchronous context manager for performing an asynchronous HTTP request. Returns a :class:`ClientResponse` response object. :param str method: HTTP method :param url: Requested URL, :class:`str` or :class:`~yarl.URL` :param dict params: Parameters to be sent in the query string of the new request (optional) :param data: The data to send in the body of the request. This can be a :class:`FormData` object or anything that can be passed into :class:`FormData`, e.g. a dictionary, bytes, or file-like object. (optional) :param json: Any json compatible python object (optional). *json* and *data* parameters could not be used at the same time. :param dict headers: HTTP Headers to send with the request (optional) :param dict cookies: Cookies to send with the request (optional) :param aiohttp.BasicAuth auth: an object that represents HTTP Basic Authorization (optional) :param bool allow_redirects: If set to ``False``, do not follow redirects. ``True`` by default (optional). :param aiohttp.protocol.HttpVersion version: Request HTTP version (optional) :param bool compress: Set to ``True`` if request has to be compressed with deflate encoding. ``False`` instructs aiohttp to not compress data. ``None`` by default (optional). :param int chunked: Enables chunked transfer encoding. ``None`` by default (optional). :param bool expect100: Expect 100-continue response from server. ``False`` by default (optional). :param bool raise_for_status: Automatically call :meth:`ClientResponse.raise_for_status()` for response if set to ``True``. If set to ``None`` value from ``ClientSession`` will be used. ``None`` by default (optional). .. versionadded:: 3.4 :param aiohttp.connector.BaseConnector connector: BaseConnector sub-class instance to support connection pooling. :param bool read_until_eof: Read response until EOF if response does not have Content-Length header. ``True`` by default (optional). :param timeout: a :class:`ClientTimeout` settings structure, 5min total timeout by default. :param loop: :ref:`event loop` used for processing HTTP requests. If param is ``None``, :func:`asyncio.get_event_loop` is used for getting default event loop. .. deprecated:: 2.0 :return ClientResponse: a :class:`client response ` object. Usage:: import aiohttp async def fetch(): async with aiohttp.request('GET', 'http://python.org/') as resp: assert resp.status == 200 print(await resp.text()) .. _aiohttp-client-reference-connectors: Connectors ---------- Connectors are transports for aiohttp client API. There are standard connectors: 1. :class:`TCPConnector` for regular *TCP sockets* (both *HTTP* and *HTTPS* schemes supported). 2. :class:`UnixConnector` for connecting via UNIX socket (it's used mostly for testing purposes). All connector classes should be derived from :class:`BaseConnector`. By default all *connectors* support *keep-alive connections* (behavior is controlled by *force_close* constructor's parameter). BaseConnector ^^^^^^^^^^^^^ .. class:: BaseConnector(*, keepalive_timeout=15, \ force_close=False, limit=100, limit_per_host=0, \ enable_cleanup_closed=False, loop=None) Base class for all connectors. 
:param float keepalive_timeout: timeout for connection reusing after releasing (optional). Values ``0``. For disabling *keep-alive* feature use ``force_close=True`` flag. :param int limit: total number simultaneous connections. If *limit* is ``None`` the connector has no limit (default: 100). :param int limit_per_host: limit simultaneous connections to the same endpoint. Endpoints are the same if they are have equal ``(host, port, is_ssl)`` triple. If *limit* is ``0`` the connector has no limit (default: 0). :param bool force_close: close underlying sockets after connection releasing (optional). :param bool enable_cleanup_closed: some SSL servers do not properly complete SSL shutdown process, in that case asyncio leaks ssl connections. If this parameter is set to True, aiohttp additionally aborts underlining transport after 2 seconds. It is off by default. :param loop: :ref:`event loop` used for handling connections. If param is ``None``, :func:`asyncio.get_event_loop` is used for getting default event loop. .. deprecated:: 2.0 .. attribute:: closed Read-only property, ``True`` if connector is closed. .. attribute:: force_close Read-only property, ``True`` if connector should ultimately close connections on releasing. .. attribute:: limit The total number for simultaneous connections. If limit is 0 the connector has no limit. The default limit size is 100. .. attribute:: limit_per_host The limit for simultaneous connections to the same endpoint. Endpoints are the same if they are have equal ``(host, port, is_ssl)`` triple. If *limit_per_host* is ``None`` the connector has no limit per host. Read-only property. .. comethod:: close() Close all opened connections. .. comethod:: connect(request) Get a free connection from pool or create new one if connection is absent in the pool. The call may be paused if :attr:`limit` is exhausted until used connections returns to pool. :param aiohttp.ClientRequest request: request object which is connection initiator. :return: :class:`Connection` object. .. comethod:: _create_connection(req) Abstract method for actual connection establishing, should be overridden in subclasses. TCPConnector ^^^^^^^^^^^^ .. class:: TCPConnector(*, ssl=None, verify_ssl=True, fingerprint=None, \ use_dns_cache=True, ttl_dns_cache=10, \ family=0, ssl_context=None, local_addr=None, \ resolver=None, keepalive_timeout=sentinel, \ force_close=False, limit=100, limit_per_host=0, \ enable_cleanup_closed=False, loop=None) Connector for working with *HTTP* and *HTTPS* via *TCP* sockets. The most common transport. When you don't know what connector type to use, use a :class:`TCPConnector` instance. :class:`TCPConnector` inherits from :class:`BaseConnector`. Constructor accepts all parameters suitable for :class:`BaseConnector` plus several TCP-specific ones: :param ssl: SSL validation mode. ``None`` for default SSL check (:func:`ssl.create_default_context` is used), ``False`` for skip SSL certificate validation, :class:`aiohttp.Fingerprint` for fingerprint validation, :class:`ssl.SSLContext` for custom SSL certificate validation. Supersedes *verify_ssl*, *ssl_context* and *fingerprint* parameters. .. versionadded:: 3.0 :param bool verify_ssl: perform SSL certificate validation for *HTTPS* requests (enabled by default). May be disabled to skip validation for sites with invalid certificates. .. deprecated:: 2.3 Pass *verify_ssl* to ``ClientSession.get()`` etc. 
:param bytes fingerprint: pass the SHA256 digest of the expected certificate in DER format to verify that the certificate the server presents matches. Useful for `certificate pinning `_. Note: use of MD5 or SHA1 digests is insecure and deprecated. .. deprecated:: 2.3 Pass *verify_ssl* to ``ClientSession.get()`` etc. :param bool use_dns_cache: use internal cache for DNS lookups, ``True`` by default. Enabling an option *may* speedup connection establishing a bit but may introduce some *side effects* also. :param int ttl_dns_cache: expire after some seconds the DNS entries, ``None`` means cached forever. By default 10 seconds. By default DNS entries are cached forever, in some environments the IP addresses related to a specific HOST can change after a specific time. Use this option to keep the DNS cache updated refreshing each entry after N seconds. :param int limit: total number simultaneous connections. If *limit* is ``None`` the connector has no limit (default: 100). :param int limit_per_host: limit simultaneous connections to the same endpoint. Endpoints are the same if they are have equal ``(host, port, is_ssl)`` triple. If *limit* is ``0`` the connector has no limit (default: 0). :param aiohttp.abc.AbstractResolver resolver: custom resolver instance to use. ``aiohttp.DefaultResolver`` by default (asynchronous if ``aiodns>=1.1`` is installed). Custom resolvers allow to resolve hostnames differently than the way the host is configured. The resolver is ``aiohttp.ThreadedResolver`` by default, asynchronous version is pretty robust but might fail in very rare cases. :param int family: TCP socket family, both IPv4 and IPv6 by default. For *IPv4* only use :const:`socket.AF_INET`, for *IPv6* only -- :const:`socket.AF_INET6`. *family* is ``0`` by default, that means both IPv4 and IPv6 are accepted. To specify only concrete version please pass :const:`socket.AF_INET` or :const:`socket.AF_INET6` explicitly. :param ssl.SSLContext ssl_context: SSL context used for processing *HTTPS* requests (optional). *ssl_context* may be used for configuring certification authority channel, supported SSL options etc. :param tuple local_addr: tuple of ``(local_host, local_port)`` used to bind socket locally if specified. :param bool force_close: close underlying sockets after connection releasing (optional). :param bool enable_cleanup_closed: Some ssl servers do not properly complete SSL shutdown process, in that case asyncio leaks SSL connections. If this parameter is set to True, aiohttp additionally aborts underlining transport after 2 seconds. It is off by default. .. attribute:: family *TCP* socket family e.g. :const:`socket.AF_INET` or :const:`socket.AF_INET6` Read-only property. .. attribute:: dns_cache Use quick lookup in internal *DNS* cache for host names if ``True``. Read-only :class:`bool` property. .. attribute:: cached_hosts The cache of resolved hosts if :attr:`dns_cache` is enabled. Read-only :class:`types.MappingProxyType` property. .. method:: clear_dns_cache(self, host=None, port=None) Clear internal *DNS* cache. Remove specific entry if both *host* and *port* are specified, clear all cache otherwise. UnixConnector ^^^^^^^^^^^^^ .. class:: UnixConnector(path, *, conn_timeout=None, \ keepalive_timeout=30, limit=100, \ force_close=False, loop=None) Unix socket connector. Use :class:`UnixConnector` for sending *HTTP/HTTPS* requests through *UNIX Sockets* as underlying transport. UNIX sockets are handy for writing tests and making very fast connections between processes on the same host. 
:class:`UnixConnector` is inherited from :class:`BaseConnector`. Usage:: conn = UnixConnector(path='/path/to/socket') session = ClientSession(connector=conn) async with session.get('http://python.org') as resp: ... Constructor accepts all parameters suitable for :class:`BaseConnector` plus UNIX-specific one: :param str path: Unix socket path .. attribute:: path Path to *UNIX socket*, read-only :class:`str` property. Connection ^^^^^^^^^^ .. class:: Connection Encapsulates single connection in connector object. End user should never create :class:`Connection` instances manually but get it by :meth:`BaseConnector.connect` coroutine. .. attribute:: closed :class:`bool` read-only property, ``True`` if connection was closed, released or detached. .. attribute:: loop Event loop used for connection .. deprecated:: 3.5 .. attribute:: transport Connection transport .. method:: close() Close connection with forcibly closing underlying socket. .. method:: release() Release connection back to connector. Underlying socket is not closed, the connection may be reused later if timeout (30 seconds by default) for connection was not expired. Response object --------------- .. class:: ClientResponse Client response returned be :meth:`ClientSession.request` and family. User never creates the instance of ClientResponse class but gets it from API calls. :class:`ClientResponse` supports async context manager protocol, e.g.:: resp = await client_session.get(url) async with resp: assert resp.status == 200 After exiting from ``async with`` block response object will be *released* (see :meth:`release` coroutine). .. attribute:: version Response's version, :class:`HttpVersion` instance. .. attribute:: status HTTP status code of response (:class:`int`), e.g. ``200``. .. attribute:: reason HTTP status reason of response (:class:`str`), e.g. ``"OK"``. .. attribute:: method Request's method (:class:`str`). .. attribute:: url URL of request (:class:`~yarl.URL`). .. attribute:: real_url Unmodified URL of request with URL fragment unstripped (:class:`~yarl.URL`). .. versionadded:: 3.2 .. attribute:: connection :class:`Connection` used for handling response. .. attribute:: content Payload stream, which contains response's BODY (:class:`StreamReader`). It supports various reading methods depending on the expected format. When chunked transfer encoding is used by the server, allows retrieving the actual http chunks. Reading from the stream may raise :exc:`aiohttp.ClientPayloadError` if the response object is closed before response receives all data or in case if any transfer encoding related errors like misformed chunked encoding of broken compression data. .. attribute:: cookies HTTP cookies of response (*Set-Cookie* HTTP header, :class:`~http.cookies.SimpleCookie`). .. attribute:: headers A case-insensitive multidict proxy with HTTP headers of response, :class:`~multidict.CIMultiDictProxy`. .. attribute:: raw_headers Unmodified HTTP headers of response as unconverted bytes, a sequence of ``(key, value)`` pairs. .. attribute:: links Link HTTP header parsed into a :class:`~multidict.MultiDictProxy`. For each link, key is link param `rel` when it exists, or link url as :class:`str` otherwise, and value is :class:`~multidict.MultiDictProxy` of link params and url at key `url` as :class:`~yarl.URL` instance. .. versionadded:: 3.2 .. attribute:: content_type Read-only property with *content* part of *Content-Type* header. .. 
note:: Returns value is ``'application/octet-stream'`` if no Content-Type header present in HTTP headers according to :rfc:`2616`. To make sure Content-Type header is not present in the server reply, use :attr:`headers` or :attr:`raw_headers`, e.g. ``'CONTENT-TYPE' not in resp.headers``. .. attribute:: charset Read-only property that specifies the *encoding* for the request's BODY. The value is parsed from the *Content-Type* HTTP header. Returns :class:`str` like ``'utf-8'`` or ``None`` if no *Content-Type* header present in HTTP headers or it has no charset information. .. attribute:: content_disposition Read-only property that specified the *Content-Disposition* HTTP header. Instance of :class:`ContentDisposition` or ``None`` if no *Content-Disposition* header present in HTTP headers. .. attribute:: history A :class:`~collections.abc.Sequence` of :class:`ClientResponse` objects of preceding requests (earliest request first) if there were redirects, an empty sequence otherwise. .. method:: close() Close response and underlying connection. For :term:`keep-alive` support see :meth:`release`. .. comethod:: read() Read the whole response's body as :class:`bytes`. Close underlying connection if data reading gets an error, release connection otherwise. Raise an :exc:`aiohttp.ClientResponseError` if the data can't be read. :return bytes: read *BODY*. .. seealso:: :meth:`close`, :meth:`release`. .. comethod:: release() It is not required to call `release` on the response object. When the client fully receives the payload, the underlying connection automatically returns back to pool. If the payload is not fully read, the connection is closed .. method:: raise_for_status() Raise an :exc:`aiohttp.ClientResponseError` if the response status is 400 or higher. Do nothing for success responses (less than 400). .. comethod:: text(encoding=None) Read response's body and return decoded :class:`str` using specified *encoding* parameter. If *encoding* is ``None`` content encoding is autocalculated using ``Content-Type`` HTTP header and *chardet* tool if the header is not provided by server. :term:`cchardet` is used with fallback to :term:`chardet` if *cchardet* is not available. Close underlying connection if data reading gets an error, release connection otherwise. :param str encoding: text encoding used for *BODY* decoding, or ``None`` for encoding autodetection (default). :return str: decoded *BODY* :raise LookupError: if the encoding detected by chardet or cchardet is unknown by Python (e.g. VISCII). .. note:: If response has no ``charset`` info in ``Content-Type`` HTTP header :term:`cchardet` / :term:`chardet` is used for content encoding autodetection. It may hurt performance. If page encoding is known passing explicit *encoding* parameter might help:: await resp.text('ISO-8859-1') .. comethod:: json(*, encoding=None, loads=json.loads, \ content_type='application/json') Read response's body as *JSON*, return :class:`dict` using specified *encoding* and *loader*. If data is not still available a ``read`` call will be done, If *encoding* is ``None`` content encoding is autocalculated using :term:`cchardet` or :term:`chardet` as fallback if *cchardet* is not available. if response's `content-type` does not match `content_type` parameter :exc:`aiohttp.ContentTypeError` get raised. To disable content type check pass ``None`` value. :param str encoding: text encoding used for *BODY* decoding, or ``None`` for encoding autodetection (default). 
By the standard JSON encoding should be ``UTF-8`` but practice beats purity: some servers return non-UTF responses. Autodetection works pretty fine anyway. :param callable loads: :func:`callable` used for loading *JSON* data, :func:`json.loads` by default. :param str content_type: specify response's content-type, if content type does not match raise :exc:`aiohttp.ClientResponseError`. To disable `content-type` check, pass ``None`` as value. (default: `application/json`). :return: *BODY* as *JSON* data parsed by *loads* parameter or ``None`` if *BODY* is empty or contains white-spaces only. .. attribute:: request_info A namedtuple with request URL and headers from :class:`ClientRequest` object, :class:`aiohttp.RequestInfo` instance. .. method:: get_encoding() Automatically detect content encoding using ``charset`` info in ``Content-Type`` HTTP header. If this info is not exists or there are no appropriate codecs for encoding then :term:`cchardet` / :term:`chardet` is used. Beware that it is not always safe to use the result of this function to decode a response. Some encodings detected by cchardet are not known by Python (e.g. VISCII). .. versionadded:: 3.0 ClientWebSocketResponse ----------------------- To connect to a websocket server :func:`aiohttp.ws_connect` or :meth:`aiohttp.ClientSession.ws_connect` coroutines should be used, do not create an instance of class :class:`ClientWebSocketResponse` manually. .. class:: ClientWebSocketResponse() Class for handling client-side websockets. .. attribute:: closed Read-only property, ``True`` if :meth:`close` has been called or :const:`~aiohttp.WSMsgType.CLOSE` message has been received from peer. .. attribute:: protocol Websocket *subprotocol* chosen after :meth:`start` call. May be ``None`` if server and client protocols are not overlapping. .. method:: get_extra_info(name, default=None) Reads extra info from connection's transport .. method:: exception() Returns exception if any occurs or returns None. .. comethod:: ping(message=b'') Send :const:`~aiohttp.WSMsgType.PING` to peer. :param message: optional payload of *ping* message, :class:`str` (converted to *UTF-8* encoded bytes) or :class:`bytes`. .. versionchanged:: 3.0 The method is converted into :term:`coroutine` .. comethod:: pong(message=b'') Send :const:`~aiohttp.WSMsgType.PONG` to peer. :param message: optional payload of *pong* message, :class:`str` (converted to *UTF-8* encoded bytes) or :class:`bytes`. .. versionchanged:: 3.0 The method is converted into :term:`coroutine` .. comethod:: send_str(data, compress=None) Send *data* to peer as :const:`~aiohttp.WSMsgType.TEXT` message. :param str data: data to send. :param int compress: sets specific level of compression for single message, ``None`` for not overriding per-socket setting. :raise TypeError: if data is not :class:`str` .. versionchanged:: 3.0 The method is converted into :term:`coroutine`, *compress* parameter added. .. comethod:: send_bytes(data, compress=None) Send *data* to peer as :const:`~aiohttp.WSMsgType.BINARY` message. :param data: data to send. :param int compress: sets specific level of compression for single message, ``None`` for not overriding per-socket setting. :raise TypeError: if data is not :class:`bytes`, :class:`bytearray` or :class:`memoryview`. .. versionchanged:: 3.0 The method is converted into :term:`coroutine`, *compress* parameter added. .. comethod:: send_json(data, compress=None, *, dumps=json.dumps) Send *data* to peer as JSON string. :param data: data to send. 
:param int compress: sets specific level of compression for single message, ``None`` for not overriding per-socket setting. :param callable dumps: any :term:`callable` that accepts an object and returns a JSON string (:func:`json.dumps` by default). :raise RuntimeError: if connection is not started or closing :raise ValueError: if data is not serializable object :raise TypeError: if value returned by ``dumps(data)`` is not :class:`str` .. versionchanged:: 3.0 The method is converted into :term:`coroutine`, *compress* parameter added. .. comethod:: close(*, code=1000, message=b'') A :ref:`coroutine` that initiates closing handshake by sending :const:`~aiohttp.WSMsgType.CLOSE` message. It waits for close response from server. To add a timeout to `close()` call just wrap the call with `asyncio.wait()` or `asyncio.wait_for()`. :param int code: closing code :param message: optional payload of *pong* message, :class:`str` (converted to *UTF-8* encoded bytes) or :class:`bytes`. .. comethod:: receive() A :ref:`coroutine` that waits upcoming *data* message from peer and returns it. The coroutine implicitly handles :const:`~aiohttp.WSMsgType.PING`, :const:`~aiohttp.WSMsgType.PONG` and :const:`~aiohttp.WSMsgType.CLOSE` without returning the message. It process *ping-pong game* and performs *closing handshake* internally. :return: :class:`~aiohttp.WSMessage` .. coroutinemethod:: receive_str() A :ref:`coroutine` that calls :meth:`receive` but also asserts the message type is :const:`~aiohttp.WSMsgType.TEXT`. :return str: peer's message content. :raise TypeError: if message is :const:`~aiohttp.WSMsgType.BINARY`. .. coroutinemethod:: receive_bytes() A :ref:`coroutine` that calls :meth:`receive` but also asserts the message type is :const:`~aiohttp.WSMsgType.BINARY`. :return bytes: peer's message content. :raise TypeError: if message is :const:`~aiohttp.WSMsgType.TEXT`. .. coroutinemethod:: receive_json(*, loads=json.loads) A :ref:`coroutine` that calls :meth:`receive_str` and loads the JSON string to a Python dict. :param callable loads: any :term:`callable` that accepts :class:`str` and returns :class:`dict` with parsed JSON (:func:`json.loads` by default). :return dict: loaded JSON content :raise TypeError: if message is :const:`~aiohttp.WSMsgType.BINARY`. :raise ValueError: if message is not valid JSON. Utilities --------- ClientTimeout ^^^^^^^^^^^^^ .. class:: ClientTimeout(*, total=None, connect=None, \ sock_connect, sock_read=None) A data class for client timeout settings. See :ref:`aiohttp-client-timeouts` for usage examples. .. attribute:: total Total timeout for the whole request. :class:`float`, ``None`` by default. .. attribute:: connect Total timeout for acquiring a connection from pool. The time consists connection establishment for a new connection or waiting for a free connection from a pool if pool connection limits are exceeded. For pure socket connection establishment time use :attr:`sock_connect`. :class:`float`, ``None`` by default. .. attribute:: sock_connect A timeout for connecting to a peer for a new connection, not given from a pool. See also :attr:`connect`. :class:`float`, ``None`` by default. .. attribute:: sock_read A timeout for reading a portion of data from a peer. :class:`float`, ``None`` by default. .. versionadded:: 3.3 RequestInfo ^^^^^^^^^^^ .. class:: RequestInfo() A data class with request URL and headers from :class:`ClientRequest` object, available as :attr:`ClientResponse.request_info` attribute. .. attribute:: url Requested *url*, :class:`yarl.URL` instance. .. 
attribute:: method Request HTTP method like ``'GET'`` or ``'POST'``, :class:`str`. .. attribute:: headers HTTP headers for request, :class:`multidict.CIMultiDict` instance. .. attribute:: real_url Requested *url* with URL fragment unstripped, :class:`yarl.URL` instance. .. versionadded:: 3.2 BasicAuth ^^^^^^^^^ .. class:: BasicAuth(login, password='', encoding='latin1') HTTP basic authentication helper. :param str login: login :param str password: password :param str encoding: encoding (``'latin1'`` by default) Should be used for specifying authorization data in client API, e.g. *auth* parameter for :meth:`ClientSession.request`. .. classmethod:: decode(auth_header, encoding='latin1') Decode HTTP basic authentication credentials. :param str auth_header: The ``Authorization`` header to decode. :param str encoding: (optional) encoding ('latin1' by default) :return: decoded authentication data, :class:`BasicAuth`. .. classmethod:: from_url(url) Constructed credentials info from url's *user* and *password* parts. :return: credentials data, :class:`BasicAuth` or ``None`` is credentials are not provided. .. versionadded:: 2.3 .. method:: encode() Encode credentials into string suitable for ``Authorization`` header etc. :return: encoded authentication data, :class:`str`. CookieJar ^^^^^^^^^ .. class:: CookieJar(*, unsafe=False, loop=None) The cookie jar instance is available as :attr:`ClientSession.cookie_jar`. The jar contains :class:`~http.cookies.Morsel` items for storing internal cookie data. API provides a count of saved cookies:: len(session.cookie_jar) These cookies may be iterated over:: for cookie in session.cookie_jar: print(cookie.key) print(cookie["domain"]) The class implements :class:`collections.abc.Iterable`, :class:`collections.abc.Sized` and :class:`aiohttp.AbstractCookieJar` interfaces. Implements cookie storage adhering to RFC 6265. :param bool unsafe: (optional) Whether to accept cookies from IPs. :param bool loop: an :ref:`event loop` instance. See :class:`aiohttp.abc.AbstractCookieJar` .. deprecated:: 2.0 .. method:: update_cookies(cookies, response_url=None) Update cookies returned by server in ``Set-Cookie`` header. :param cookies: a :class:`collections.abc.Mapping` (e.g. :class:`dict`, :class:`~http.cookies.SimpleCookie`) or *iterable* of *pairs* with cookies returned by server's response. :param str response_url: URL of response, ``None`` for *shared cookies*. Regular cookies are coupled with server's URL and are sent only to this server, shared ones are sent in every client request. .. method:: filter_cookies(request_url) Return jar's cookies acceptable for URL and available in ``Cookie`` header for sending client requests for given URL. :param str response_url: request's URL for which cookies are asked. :return: :class:`http.cookies.SimpleCookie` with filtered cookies for given URL. .. method:: save(file_path) Write a pickled representation of cookies into the file at provided path. :param file_path: Path to file where cookies will be serialized, :class:`str` or :class:`pathlib.Path` instance. .. method:: load(file_path) Load a pickled representation of cookies from the file at provided path. :param file_path: Path to file from where cookies will be imported, :class:`str` or :class:`pathlib.Path` instance. .. class:: DummyCookieJar(*, loop=None) Dummy cookie jar which does not store cookies but ignores them. Could be useful e.g. for web crawlers to iterate over Internet without blowing up with saved cookies information. 
To install the dummy cookie jar, pass it into the session:: jar = aiohttp.DummyCookieJar() session = aiohttp.ClientSession(cookie_jar=jar) .. class:: Fingerprint(digest) Fingerprint helper for checking SSL certificates by *SHA256* digest. :param bytes digest: *SHA256* digest for certificate in DER-encoded binary form (see :meth:`ssl.SSLSocket.getpeercert`). To check the fingerprint, pass the object into the :meth:`ClientSession.get` call, e.g.:: import hashlib with open(path_to_cert, 'rb') as f: digest = hashlib.sha256(f.read()).digest() await session.get(url, ssl=aiohttp.Fingerprint(digest)) .. versionadded:: 3.0 FormData ^^^^^^^^ A :class:`FormData` object contains the form data and also handles encoding it into a body that is either ``multipart/form-data`` or ``application/x-www-form-urlencoded``. ``multipart/form-data`` is used if at least one field is an :class:`io.IOBase` object or was added with at least one optional argument to :meth:`add_field` (``content_type``, ``filename``, or ``content_transfer_encoding``). Otherwise, ``application/x-www-form-urlencoded`` is used. :class:`FormData` instances are callable and return a :class:`Payload` on being called. .. class:: FormData(fields, quote_fields=True, charset=None) Helper class for ``multipart/form-data`` and ``application/x-www-form-urlencoded`` body generation. :param fields: A container for the key/value pairs of this form. Possible types are: - :class:`dict` - :class:`tuple` or :class:`list` - :class:`io.IOBase`, e.g. a file-like object - :class:`multidict.MultiDict` or :class:`multidict.MultiDictProxy` If it is a :class:`tuple` or :class:`list`, it must be a valid argument for :meth:`add_fields`. For :class:`dict`, :class:`multidict.MultiDict`, and :class:`multidict.MultiDictProxy`, the keys and values must be valid `name` and `value` arguments to :meth:`add_field`, respectively. .. method:: add_field(name, value, content_type=None, filename=None,\ content_transfer_encoding=None) Add a field to the form. :param str name: Name of the field :param value: Value of the field Possible types are: - :class:`str` - :class:`bytes`, :class:`bytearray`, or :class:`memoryview` - :class:`io.IOBase`, e.g. a file-like object :param str content_type: The field's content-type header (optional) :param str filename: The field's filename (optional) If this is not set and ``value`` is a :class:`bytes`, :class:`bytearray`, or :class:`memoryview` object, the `name` argument is used as the filename unless ``content_transfer_encoding`` is specified. If ``filename`` is not set and ``value`` is an :class:`io.IOBase` object, the filename is extracted from the object if possible. :param str content_transfer_encoding: The field's content-transfer-encoding header (optional) .. method:: add_fields(fields) Add one or more fields to the form. :param fields: An iterable containing: - :class:`io.IOBase`, e.g. a file-like object - :class:`multidict.MultiDict` or :class:`multidict.MultiDictProxy` - :class:`tuple` or :class:`list` of length two, containing a name-value pair Client exceptions ----------------- The exception hierarchy was significantly modified in version 2.0. aiohttp defines only exceptions that cover connection handling and server response misbehavior. For developer-specific mistakes, aiohttp uses standard Python exceptions like :exc:`ValueError` or :exc:`TypeError`. Reading response content may raise a :exc:`ClientPayloadError` exception. This exception indicates errors specific to the payload encoding. 
Examples are invalid compressed data, malformed chunked-encoded chunks, or not enough data to satisfy the ``Content-Length`` header. All exceptions are available as members of the *aiohttp* module. .. exception:: ClientError Base class for all client-specific exceptions. Derived from :exc:`Exception` .. class:: ClientPayloadError This exception can only be raised while reading the response payload if one of these errors occurs: 1. invalid compression 2. malformed chunked encoding 3. not enough data to satisfy the ``Content-Length`` HTTP header. Derived from :exc:`ClientError` .. exception:: InvalidURL URL used for fetching is malformed, e.g. it does not contain the host part. Derived from :exc:`ClientError` and :exc:`ValueError` .. attribute:: url Invalid URL, :class:`yarl.URL` instance. .. class:: ContentDisposition Represents the ``Content-Disposition`` header. .. attribute:: value A :class:`str` instance. Value of the ``Content-Disposition`` header itself, e.g. ``attachment``. .. attribute:: filename A :class:`str` instance. Content filename extracted from parameters. May be ``None``. .. attribute:: parameters Read-only mapping with all parameters. Response errors ^^^^^^^^^^^^^^^ .. exception:: ClientResponseError These exceptions could happen after we get a response from the server. Derived from :exc:`ClientError` .. attribute:: request_info Instance of :class:`RequestInfo` object; contains information about the request. .. attribute:: status HTTP status code of the response (:class:`int`), e.g. ``400``. .. attribute:: message Message of the response (:class:`str`), e.g. ``"OK"``. .. attribute:: headers Headers in the response, a list of pairs. .. attribute:: history History from failed response, if available, else empty tuple. A :class:`tuple` of :class:`ClientResponse` objects used to handle redirection responses. .. attribute:: code HTTP status code of the response (:class:`int`), e.g. ``400``. .. deprecated:: 3.1 .. class:: WSServerHandshakeError WebSocket server response error. Derived from :exc:`ClientResponseError` .. class:: ContentTypeError Invalid content type. Derived from :exc:`ClientResponseError` .. versionadded:: 2.3 .. class:: TooManyRedirects The client was redirected too many times. The maximum number of redirects can be configured with the ``max_redirects`` parameter of :meth:`request`. Derived from :exc:`ClientResponseError` .. versionadded:: 3.2 Connection errors ^^^^^^^^^^^^^^^^^ .. class:: ClientConnectionError These exceptions are related to low-level connection problems. Derived from :exc:`ClientError` .. class:: ClientOSError Subset of connection errors that are initiated by an :exc:`OSError` exception. Derived from :exc:`ClientConnectionError` and :exc:`OSError` .. class:: ClientConnectorError Connector-related exceptions. Derived from :exc:`ClientOSError` .. class:: ClientProxyConnectionError Derived from :exc:`ClientConnectorError` .. class:: ServerConnectionError Derived from :exc:`ClientConnectionError` .. class:: ClientSSLError Derived from :exc:`ClientConnectorError` .. class:: ClientConnectorSSLError Response SSL error. Derived from :exc:`ClientSSLError` and :exc:`ssl.SSLError` .. class:: ClientConnectorCertificateError Response certificate error. Derived from :exc:`ClientSSLError` and :exc:`ssl.CertificateError` .. class:: ServerDisconnectedError Server disconnected. Derived from :exc:`ServerConnectionError` .. attribute:: message Partially parsed HTTP message (optional). .. class:: ServerTimeoutError Server operation timeout: read timeout, etc. 
Derived from :exc:`ServerConnectionError` and :exc:`asyncio.TimeoutError` .. class:: ServerFingerprintMismatch Server fingerprint mismatch. Derived from :exc:`ServerConnectionError` Hierarchy of exceptions ^^^^^^^^^^^^^^^^^^^^^^^ * :exc:`ClientError` * :exc:`ClientResponseError` * :exc:`ContentTypeError` * :exc:`WSServerHandshakeError` * :exc:`ClientHttpProxyError` * :exc:`ClientConnectionError` * :exc:`ClientOSError` * :exc:`ClientConnectorError` * :exc:`ClientSSLError` * :exc:`ClientConnectorCertificateError` * :exc:`ClientConnectorSSLError` * :exc:`ClientProxyConnectionError` * :exc:`ServerConnectionError` * :exc:`ServerDisconnectedError` * :exc:`ServerTimeoutError` * :exc:`ServerFingerprintMismatch` * :exc:`ClientPayloadError` * :exc:`InvalidURL` aiohttp-3.6.2/docs/conf.py0000644000175100001650000002534613547410117015726 0ustar vstsdocker00000000000000#!/usr/bin/env python3 # -*- coding: utf-8 -*- # # aiohttp documentation build configuration file, created by # sphinx-quickstart on Wed Mar 5 12:35:35 2014. # # This file is execfile()d with the current directory set to its # containing dir. # # Note that not all possible configuration values are present in this # autogenerated file. # # All configuration values have a default; values that are commented out # serve to show the default. import io import os import re import sys _docs_path = os.path.dirname(__file__) _version_path = os.path.abspath(os.path.join(_docs_path, '..', 'aiohttp', '__init__.py')) with io.open(_version_path, 'r', encoding='latin1') as fp: try: _version_info = re.search(r"^__version__ = '" r"(?P\d+)" r"\.(?P\d+)" r"\.(?P\d+)" r"(?P.*)?'$", fp.read(), re.M).groupdict() except IndexError: raise RuntimeError('Unable to determine version.') # -- General configuration ------------------------------------------------ # If your documentation needs a minimal Sphinx version, state it here. # needs_sphinx = '1.0' # Add any Sphinx extension module names here, as strings. They can be # extensions coming with Sphinx (named 'sphinx.ext.*') or your custom # ones. extensions = [ 'sphinx.ext.viewcode', 'sphinx.ext.intersphinx', 'sphinxcontrib.asyncio', 'sphinxcontrib.blockdiag', ] try: import sphinxcontrib.spelling # noqa extensions.append('sphinxcontrib.spelling') except ImportError: pass intersphinx_mapping = { 'python': ('http://docs.python.org/3', None), 'multidict': ('https://multidict.readthedocs.io/en/stable/', None), 'yarl': ('https://yarl.readthedocs.io/en/stable/', None), 'aiohttpjinja2': ('https://aiohttp-jinja2.readthedocs.io/en/stable/', None), 'aiohttpremotes': ('https://aiohttp-remotes.readthedocs.io/en/stable/', None), 'aiohttpsession': ('https://aiohttp-session.readthedocs.io/en/stable/', None), 'aiohttpdemos': ('https://aiohttp-demos.readthedocs.io/en/latest/', None), } # Add any paths that contain templates here, relative to this directory. templates_path = ['_templates'] # The suffix of source filenames. source_suffix = '.rst' # The encoding of source files. # source_encoding = 'utf-8-sig' # The master toctree document. master_doc = 'index' # General information about the project. project = 'aiohttp' copyright = '2013-2018, Aiohttp contributors' # The version info for the project you're documenting, acts as replacement for # |version| and |release|, also used in various other places throughout the # built documents. # # The short X.Y version. version = '{major}.{minor}'.format(**_version_info) # The full version, including alpha/beta/rc tags. 
release = '{major}.{minor}.{patch}{tag}'.format(**_version_info) # The language for content autogenerated by Sphinx. Refer to documentation # for a list of supported languages. # language = None # There are two options for replacing |today|: either, you set today to some # non-false value, then it is used: # today = '' # Else, today_fmt is used as the format for a strftime call. # today_fmt = '%B %d, %Y' # List of patterns, relative to source directory, that match files and # directories to ignore when looking for source files. exclude_patterns = ['_build'] # The reST default role (used for this markup: `text`) to use for all # documents. # default_role = None # If true, '()' will be appended to :func: etc. cross-reference text. # add_function_parentheses = True # If true, the current module name will be prepended to all description # unit titles (such as .. function::). # add_module_names = True # If true, sectionauthor and moduleauthor directives will be shown in the # output. They are ignored by default. # show_authors = False # The name of the Pygments (syntax highlighting) style to use. # pygments_style = 'sphinx' # The default language to highlight source code in. highlight_language = 'python3' # A list of ignored prefixes for module index sorting. # modindex_common_prefix = [] # If true, keep warnings as "system message" paragraphs in the built documents. # keep_warnings = False # -- Options for HTML output ---------------------------------------------- # The theme to use for HTML and HTML Help pages. See the documentation for # a list of builtin themes. html_theme = 'aiohttp_theme' # Theme options are theme-specific and customize the look and feel of a theme # further. For a list of options available for each theme, see the # documentation. html_theme_options = { 'logo': 'aiohttp-icon-128x128.png', 'description': 'Async HTTP client/server for asyncio and Python', 'canonical_url': 'http://docs.aiohttp.org/en/stable/', 'github_user': 'aio-libs', 'github_repo': 'aiohttp', 'github_button': True, 'github_type': 'star', 'github_banner': True, 'badges': [{'image': 'https://travis-ci.com/aio-libs/aiohttp.svg?branch=master', 'target': 'https://travis-ci.com/aio-libs/aiohttp', 'height': '20', 'alt': 'Travis CI status'}, {'image': 'https://codecov.io/github/aio-libs/aiohttp/coverage.svg?branch=master', 'target': 'https://codecov.io/github/aio-libs/aiohttp', 'height': '20', 'alt': 'Code coverage status'}, {'image': 'https://badge.fury.io/py/aiohttp.svg', 'target': 'https://badge.fury.io/py/aiohttp', 'height': '20', 'alt': 'Latest PyPI package version'}, {'image': 'https://badges.gitter.im/Join%20Chat.svg', 'target': 'https://gitter.im/aio-libs/Lobby', 'height': '20', 'alt': 'Chat on Gitter'}], } # Add any paths that contain custom themes here, relative to this directory. # html_theme_path = [alabaster.get_path()] # The name for this set of Sphinx documents. If None, it defaults to # " v documentation". # html_title = None # A shorter title for the navigation bar. Default is the same as html_title. # html_short_title = None # The name of an image file (relative to this directory) to place at the top # of the sidebar. # html_logo = 'aiohttp-icon.svg' # The name of an image file (within the static path) to use as favicon of the # docs. This file should be a Windows icon file (.ico) being 16x16 or 32x32 # pixels large. html_favicon = 'favicon.ico' # Add any paths that contain custom static files (such as style sheets) here, # relative to this directory. 
They are copied after the builtin static files, # so a file named "default.css" will overwrite the builtin "default.css". html_static_path = ['_static'] # Add any extra paths that contain custom files (such as robots.txt or # .htaccess) here, relative to this directory. These files are copied # directly to the root of the documentation. # html_extra_path = [] # If not '', a 'Last updated on:' timestamp is inserted at every page bottom, # using the given strftime format. # html_last_updated_fmt = '%b %d, %Y' # If true, SmartyPants will be used to convert quotes and dashes to # typographically correct entities. # html_use_smartypants = True # Custom sidebar templates, maps document names to template names. html_sidebars = { '**': [ 'about.html', 'navigation.html', 'searchbox.html', ] } # Additional templates that should be rendered to pages, maps page names to # template names. # html_additional_pages = {} # If false, no module index is generated. # html_domain_indices = True # If false, no index is generated. # html_use_index = True # If true, the index is split into individual pages for each letter. # html_split_index = False # If true, links to the reST sources are added to the pages. # html_show_sourcelink = True # If true, "Created using Sphinx" is shown in the HTML footer. Default is True. # html_show_sphinx = True # If true, "(C) Copyright ..." is shown in the HTML footer. Default is True. # html_show_copyright = True # If true, an OpenSearch description file will be output, and all pages will # contain a tag referring to it. The value of this option must be the # base URL from which the finished HTML is served. # html_use_opensearch = '' # This is the file name suffix for HTML files (e.g. ".xhtml"). # html_file_suffix = None # Output file base name for HTML help builder. htmlhelp_basename = 'aiohttpdoc' # -- Options for LaTeX output --------------------------------------------- latex_elements = { # The paper size ('letterpaper' or 'a4paper'). # 'papersize': 'letterpaper', # The font size ('10pt', '11pt' or '12pt'). # 'pointsize': '10pt', # Additional stuff for the LaTeX preamble. # 'preamble': '', } # Grouping the document tree into LaTeX files. List of tuples # (source start file, target name, title, # author, documentclass [howto, manual, or own class]). latex_documents = [ ('index', 'aiohttp.tex', 'aiohttp Documentation', 'aiohttp contributors', 'manual'), ] # The name of an image file (relative to this directory) to place at the top of # the title page. # latex_logo = None # For "manual" documents, if this is true, then toplevel headings are parts, # not chapters. # latex_use_parts = False # If true, show page references after internal links. # latex_show_pagerefs = False # If true, show URL addresses after external links. # latex_show_urls = False # Documents to append as an appendix to all manuals. # latex_appendices = [] # If false, no module index is generated. # latex_domain_indices = True # -- Options for manual page output --------------------------------------- # One entry per manual page. List of tuples # (source start file, name, description, authors, manual section). man_pages = [ ('index', 'aiohttp', 'aiohttp Documentation', ['aiohttp'], 1) ] # If true, show URL addresses after external links. # man_show_urls = False # -- Options for Texinfo output ------------------------------------------- # Grouping the document tree into Texinfo files. 
List of tuples # (source start file, target name, title, author, # dir menu entry, description, category) texinfo_documents = [ ('index', 'aiohttp', 'aiohttp Documentation', 'Aiohttp contributors', 'aiohttp', 'One line description of project.', 'Miscellaneous'), ] # Documents to append as an appendix to all manuals. # texinfo_appendices = [] # If false, no module index is generated. # texinfo_domain_indices = True # How to display URL addresses: 'footnote', 'no', or 'inline'. # texinfo_show_urls = 'footnote' # If true, do not generate a @detailmenu in the "Top" node's menu. # texinfo_no_detailmenu = False aiohttp-3.6.2/docs/contributing.rst0000644000175100001650000002061113547410117017656 0ustar vstsdocker00000000000000.. _aiohttp-contributing: Contributing ============ Instructions for contributors ----------------------------- In order to make a clone of the GitHub_ repo: open the link and press the "Fork" button on the upper-right menu of the web page. I hope everybody knows how to work with git and github nowadays :) Workflow is pretty straightforward: 1. Clone the GitHub_ repo using ``--recurse-submodules`` argument 2. Make a change 3. Make sure all tests passed 4. Add a file into ``CHANGES`` folder (`Changelog update`_). 5. Commit changes to own aiohttp clone 6. Make pull request from github page for your clone against master branch 7. Optionally make backport Pull Request(s) for landing a bug fix into released aiohttp versions. .. note:: The project uses *Squash-and-Merge* strategy for *GitHub Merge* button. Basically it means that there is **no need to rebase** a Pull Request against *master* branch. Just ``git merge`` *master* into your working copy (a fork) if needed. The Pull Request is automatically squashed into the single commit once the PR is accepted. Preconditions for running aiohttp test suite -------------------------------------------- We expect you to use a python virtual environment to run our tests. There are several ways to make a virtual environment. If you like to use *virtualenv* please run: .. code-block:: shell $ cd aiohttp $ virtualenv --python=`which python3` venv $ . venv/bin/activate For standard python *venv*: .. code-block:: shell $ cd aiohttp $ python3 -m venv venv $ . venv/bin/activate For *virtualenvwrapper*: .. code-block:: shell $ cd aiohttp $ mkvirtualenv --python=`which python3` aiohttp There are other tools like *pyvenv* but you know the rule of thumb now: create a python3 virtual environment and activate it. After that please install libraries required for development: .. code-block:: shell $ pip install -r requirements/dev.txt .. note:: If you plan to use ``pdb`` or ``ipdb`` within the test suite, execute: .. code-block:: shell $ py.test tests -s command to run the tests with disabled output capturing. Congratulations, you are ready to run the test suite! Run aiohttp test suite ---------------------- After all the preconditions are met you can run tests typing the next command: .. code-block:: shell $ make test The command at first will run the *flake8* tool (sorry, we don't accept pull requests with pep8 or pyflakes errors). On *flake8* success the tests will be run. Please take a look on the produced output. Any extra texts (print statements and so on) should be removed. Tests coverage -------------- We are trying hard to have good test coverage; please don't make it worse. Use: .. code-block:: shell $ make cov to run test suite and collect coverage information. 
Once the command has finished check your coverage at the file that appears in the last line of the output: ``open file:///.../aiohttp/htmlcov/index.html`` Please go to the link and make sure that your code change is covered. The project uses *codecov.io* for storing coverage results. Visit https://codecov.io/gh/aio-libs/aiohttp for looking on coverage of master branch, history, pull requests etc. The browser extension https://docs.codecov.io/docs/browser-extension is highly recommended for analyzing the coverage just in *Files Changed* tab on *GitHub Pull Request* review page. Documentation ------------- We encourage documentation improvements. Please before making a Pull Request about documentation changes run: .. code-block:: shell $ make doc Once it finishes it will output the index html page ``open file:///.../aiohttp/docs/_build/html/index.html``. Go to the link and make sure your doc changes looks good. Spell checking -------------- We use ``pyenchant`` and ``sphinxcontrib-spelling`` for running spell checker for documentation: .. code-block:: shell $ make doc-spelling Unfortunately there are problems with running spell checker on MacOS X. To run spell checker on Linux box you should install it first: .. code-block:: shell $ sudo apt-get install enchant $ pip install sphinxcontrib-spelling Changelog update ---------------- The ``CHANGES.rst`` file is managed using `towncrier `_ tool and all non trivial changes must be accompanied by a news entry. To add an entry to the news file, first you need to have created an issue describing the change you want to make. A Pull Request itself *may* function as such, but it is preferred to have a dedicated issue (for example, in case the PR ends up rejected due to code quality reasons). Once you have an issue or pull request, you take the number and you create a file inside of the ``CHANGES/`` directory named after that issue number with an extension of ``.removal``, ``.feature``, ``.bugfix``, or ``.doc``. Thus if your issue or PR number is ``1234`` and this change is fixing a bug, then you would create a file ``CHANGES/1234.bugfix``. PRs can span multiple categories by creating multiple files (for instance, if you added a feature and deprecated/removed the old feature at the same time, you would create ``CHANGES/NNNN.feature`` and ``CHANGES/NNNN.removal``). Likewise if a PR touches multiple issues/PRs you may create a file for each of them with the exact same contents and *Towncrier* will deduplicate them. The contents of this file are *reStructuredText* formatted text that will be used as the content of the news file entry. You do not need to reference the issue or PR numbers here as *towncrier* will automatically add a reference to all of the affected issues when rendering the news file. Making a Pull Request --------------------- After finishing all steps make a GitHub_ Pull Request with *master* base branch. Backporting ----------- All Pull Requests are created against *master* git branch. If the Pull Request is not a new functionality but bug fixing *backport* to maintenance branch would be desirable. *aiohttp* project committer may ask for making a *backport* of the PR into maintained branch(es), in this case he or she adds a github label like *needs backport to 3.1*. *Backporting* is performed *after* main PR merging into master. Please do the following steps: 1. Find *Pull Request's commit* for cherry-picking. 
*aiohttp* does *squashing* PRs on merging, so open your PR page on github and scroll down to message like ``asvetlov merged commit f7b8921 into master 9 days ago``. ``f7b8921`` is the required commit number. 2. Run `cherry_picker `_ tool for making backport PR (the tool is already pre-installed from ``./requirements/dev.txt``), e.g. ``cherry_picker f7b8921 3.1``. 3. In case of conflicts fix them and continue cherry-picking by ``cherry_picker --continue``. ``cherry_picker --abort`` stops the process. ``cherry_picker --status`` shows current cherry-picking status (like ``git status``) 4. After all conflicts are done the tool opens a New Pull Request page in a browser with pre-filed information. Create a backport Pull Request and wait for review/merging. 5. *aiohttp* *committer* should remove *backport Git label* after merging the backport. How to become an aiohttp committer ---------------------------------- Contribute! The easiest way is providing Pull Requests for issues in our bug tracker. But if you have a great idea for the library improvement -- please make an issue and Pull Request. The rules for committers are simple: 1. No wild commits! Everything should go through PRs. 2. Take a part in reviews. It's very important part of maintainer's activity. 3. Pickup issues created by others, especially if they are simple. 4. Keep test suite comprehensive. In practice it means leveling up coverage. 97% is not bad but we wish to have 100% someday. Well, 99% is good target too. 5. Don't hesitate to improve our docs. Documentation is very important thing, it's the key for project success. The documentation should not only cover our public API but help newbies to start using the project and shed a light on non-obvious gotchas. After positive answer aiohttp committer creates an issue on github with the proposal for nomination. If the proposal will collect only positive votes and no strong objection -- you'll be a new member in our team. .. _GitHub: https://github.com/aio-libs/aiohttp .. _ipdb: https://pypi.python.org/pypi/ipdb aiohttp-3.6.2/docs/deployment.rst0000644000175100001650000002201013547410117017322 0ustar vstsdocker00000000000000.. _aiohttp-deployment: ================= Server Deployment ================= There are several options for aiohttp server deployment: * Standalone server * Running a pool of backend servers behind of :term:`nginx`, HAProxy or other *reverse proxy server* * Using :term:`gunicorn` behind of *reverse proxy* Every method has own benefits and disadvantages. .. _aiohttp-deployment-standalone: Standalone ========== Just call :func:`aiohttp.web.run_app` function passing :class:`aiohttp.web.Application` instance. The method is very simple and could be the best solution in some trivial cases. But it does not utilize all CPU cores. For running multiple aiohttp server instances use *reverse proxies*. .. _aiohttp-deployment-nginx-supervisord: Nginx+supervisord ================= Running aiohttp servers behind :term:`nginx` makes several advantages. At first, nginx is the perfect frontend server. It may prevent many attacks based on malformed http protocol etc. Second, running several aiohttp instances behind nginx allows to utilize all CPU cores. Third, nginx serves static files much faster than built-in aiohttp static file support. But this way requires more complex configuration. Nginx configuration -------------------- Here is short extraction about writing Nginx configuration file. It does not cover all available Nginx options. 
For full reference read `Nginx tutorial `_ and `official Nginx documentation `_. First configure HTTP server itself: .. code-block:: nginx http { server { listen 80; client_max_body_size 4G; server_name example.com; location / { proxy_set_header Host $http_host; proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for; proxy_redirect off; proxy_buffering off; proxy_pass http://aiohttp; } location /static { # path for static files root /path/to/app/static; } } } This config listens on port ``80`` for server named ``example.com`` and redirects everything to ``aiohttp`` backend group. Also it serves static files from ``/path/to/app/static`` path as ``example.com/static``. Next we need to configure *aiohttp upstream group*: .. code-block:: nginx http { upstream aiohttp { # fail_timeout=0 means we always retry an upstream even if it failed # to return a good HTTP response # Unix domain servers server unix:/tmp/example_1.sock fail_timeout=0; server unix:/tmp/example_2.sock fail_timeout=0; server unix:/tmp/example_3.sock fail_timeout=0; server unix:/tmp/example_4.sock fail_timeout=0; # Unix domain sockets are used in this example due to their high performance, # but TCP/IP sockets could be used instead: # server 127.0.0.1:8081 fail_timeout=0; # server 127.0.0.1:8082 fail_timeout=0; # server 127.0.0.1:8083 fail_timeout=0; # server 127.0.0.1:8084 fail_timeout=0; } } All HTTP requests for ``http://example.com`` except ones for ``http://example.com/static`` will be redirected to ``example1.sock``, ``example2.sock``, ``example3.sock`` or ``example4.sock`` backend servers. By default, Nginx uses round-robin algorithm for backend selection. .. note:: Nginx is not the only existing *reverse proxy server* but the most popular one. Alternatives like HAProxy may be used as well. Supervisord ----------- After configuring Nginx we need to start our aiohttp backends. Better to use some tool for starting them automatically after system reboot or backend crash. There are very many ways to do it: Supervisord, Upstart, Systemd, Gaffer, Circus, Runit etc. Here we'll use `Supervisord `_ for example: .. code-block:: cfg [program:aiohttp] numprocs = 4 numprocs_start = 1 process_name = example_%(process_num)s ; Unix socket paths are specified by command line. command=/path/to/aiohttp_example.py --path=/tmp/example_%(process_num)s.sock ; We can just as easily pass TCP port numbers: ; command=/path/to/aiohttp_example.py --port=808%(process_num)s user=nobody autostart=true autorestart=true aiohttp server -------------- The last step is preparing aiohttp server for working with supervisord. Assuming we have properly configured :class:`aiohttp.web.Application` and port is specified by command line, the task is trivial: .. code-block:: python3 # aiohttp_example.py import argparse from aiohttp import web parser = argparse.ArgumentParser(description="aiohttp server example") parser.add_argument('--path') parser.add_argument('--port') if __name__ == '__main__': app = web.Application() # configure app args = parser.parse_args() web.run_app(app, path=args.path, port=args.port) For real use cases we perhaps need to configure other things like logging etc., but it's out of scope of the topic. .. _aiohttp-deployment-gunicorn: Nginx+Gunicorn ============== aiohttp can be deployed using `Gunicorn `_, which is based on a pre-fork worker model. Gunicorn launches your app as worker processes for handling incoming requests. 
In opposite to deployment with :ref:`bare Nginx ` the solution does not need to manually run several aiohttp processes and use tool like supervisord for monitoring it. But nothing is for free: running aiohttp application under gunicorn is slightly slower. Prepare environment ------------------- You firstly need to setup your deployment environment. This example is based on `Ubuntu `_ 16.04. Create a directory for your application:: >> mkdir myapp >> cd myapp Create Python virtual environment:: >> python3 -m venv venv >> source venv/bin/activate Now that the virtual environment is ready, we'll proceed to install aiohttp and gunicorn:: >> pip install gunicorn >> pip install aiohttp Application ----------- Lets write a simple application, which we will save to file. We'll name this file *my_app_module.py*:: from aiohttp import web async def index(request): return web.Response(text="Welcome home!") my_web_app = web.Application() my_web_app.router.add_get('/', index) Application factory ------------------- As an option an entry point could be a coroutine that accepts no parameters and returns an application instance:: from aiohttp import web async def index(request): return web.Response(text="Welcome home!") async def my_web_app(): app = web.Application() app.router.add_get('/', index) return app Start Gunicorn -------------- When `Running Gunicorn `_, you provide the name of the module, i.e. *my_app_module*, and the name of the app or application factory, i.e. *my_web_app*, along with other `Gunicorn Settings `_ provided as command line flags or in your config file. In this case, we will use: * the ``--bind`` flag to set the server's socket address; * the ``--worker-class`` flag to tell Gunicorn that we want to use a custom worker subclass instead of one of the Gunicorn default worker types; * you may also want to use the ``--workers`` flag to tell Gunicorn how many worker processes to use for handling requests. (See the documentation for recommendations on `How Many Workers? `_) * you may also want to use the ``--accesslog`` flag to enable the access log to be populated. (See :ref:`logging ` for more information.) The custom worker subclass is defined in ``aiohttp.GunicornWebWorker``:: >> gunicorn my_app_module:my_web_app --bind localhost:8080 --worker-class aiohttp.GunicornWebWorker [2017-03-11 18:27:21 +0000] [1249] [INFO] Starting gunicorn 19.7.1 [2017-03-11 18:27:21 +0000] [1249] [INFO] Listening at: http://127.0.0.1:8080 (1249) [2017-03-11 18:27:21 +0000] [1249] [INFO] Using worker: aiohttp.worker.GunicornWebWorker [2015-03-11 18:27:21 +0000] [1253] [INFO] Booting worker with pid: 1253 Gunicorn is now running and ready to serve requests to your app's worker processes. .. note:: If you want to use an alternative asyncio event loop `uvloop `_, you can use the ``aiohttp.GunicornUVLoopWebWorker`` worker class. More information ---------------- The Gunicorn documentation recommends deploying Gunicorn behind an Nginx proxy server. See the `official documentation `_ for more information about suggested nginx configuration. Logging configuration --------------------- ``aiohttp`` and ``gunicorn`` use different format for specifying access log. By default aiohttp uses own defaults:: '%a %t "%r" %s %b "%{Referer}i" "%{User-Agent}i"' For more information please read :ref:`Format Specification for Access Log `. aiohttp-3.6.2/docs/essays.rst0000644000175100001650000000014213547410117016453 0ustar vstsdocker00000000000000Essays ====== .. 
toctree:: new_router whats_new_1_1 migration_to_2xx whats_new_3_0 aiohttp-3.6.2/docs/external.rst0000644000175100001650000000061013547410117016766 0ustar vstsdocker00000000000000Who uses aiohttp? ================= The list of *aiohttp* users: both libraries, big projects and web sites. Please don't hesitate to add your awesome project to the list by making a Pull Request on GitHub_. If you like the project -- please go to GitHub_ and press *Star* button! .. toctree:: third_party built_with powered_by .. _GitHub: https://github.com/aio-libs/aiohttp aiohttp-3.6.2/docs/faq.rst0000644000175100001650000003340613547410117015724 0ustar vstsdocker00000000000000FAQ === .. contents:: :local: Are there plans for an @app.route decorator like in Flask? ---------------------------------------------------------- As of aiohttp 2.3, :class:`~aiohttp.web.RouteTableDef` provides an API similar to Flask's ``@app.route``. See :ref:`aiohttp-web-alternative-routes-definition`. Unlike Flask's ``@app.route``, :class:`~aiohttp.web.RouteTableDef` does not require an ``app`` in the module namespace (which often leads to circular imports). Instead, a :class:`~aiohttp.web.RouteTableDef` is decoupled from an application instance:: routes = web.RouteTableDef() @routes.get('/get') async def handle_get(request): ... @routes.post('/post') async def handle_post(request): ... app.router.add_routes(routes) Does aiohttp have a concept like Flask's "blueprint" or Django's "app"? ----------------------------------------------------------------------- If you're writing a large application, you may want to consider using :ref:`nested applications `, which are similar to Flask's "blueprints" or Django's "apps". See: :ref:`aiohttp-web-nested-applications`. How do I create a route that matches urls with a given prefix? -------------------------------------------------------------- You can do something like the following: :: app.router.add_route('*', '/path/to/{tail:.+}', sink_handler) The first argument, ``*``, matches any HTTP method (*GET, POST, OPTIONS*, etc). The second argument matches URLS with the desired prefix. The third argument is the handler function. Where do I put my database connection so handlers can access it? ---------------------------------------------------------------- :class:`aiohttp.web.Application` object supports the :class:`dict` interface and provides a place to store your database connections or any other resource you want to share between handlers. :: async def go(request): db = request.app['db'] cursor = await db.cursor() await cursor.execute('SELECT 42') # ... return web.Response(status=200, text='ok') async def init_app(loop): app = Application(loop=loop) db = await create_connection(user='user', password='123') app['db'] = db app.router.add_get('/', go) return app Why is Python 3.5.3 the lowest supported version? ------------------------------------------------- Python 3.5.2 fixes the protocol for async iterators: ``__aiter__()`` is not a coroutine but a regular function. Python 3.5.3 has a more important change: :func:`asyncio.get_event_loop` returns the running loop instance if called from a coroutine. Previously it returned a *default* loop, set by :func:`asyncio.set_event_loop`. Previous to Python 3.5.3, :func:`asyncio.get_event_loop` was not reliable, so users were forced to explicitly pass the event loop instance everywhere. If a future object were created for one event loop (e.g. the default loop) but a coroutine was run by another loop, the coroutine was never awaited. 
As a result, the task would hang. Keep in mind that every internal ``await`` expression either completes instantly or pauses, waiting for a future. It is extremely important that all tasks (coroutine runners) and futures use the same event loop. How can middleware store data for web handlers to use? ------------------------------------------------------ Both :class:`aiohttp.web.Request` and :class:`aiohttp.web.Application` support the :class:`dict` interface. Therefore, data may be stored inside a request object. :: async def handler(request): request['unique_key'] = data See the https://github.com/aio-libs/aiohttp_session code for an example. The ``aiohttp_session.get_session(request)`` method uses ``SESSION_KEY`` for saving request-specific session information. As of aiohttp 3.0, all response objects are dict-like structures as well. .. _aiohttp_faq_parallel_event_sources: Can a handler receive incoming events from different sources in parallel? ------------------------------------------------------------------------- Yes. As an example, we may have two event sources: 1. WebSocket for events from an end user 2. Redis PubSub for events from other parts of the application The most natural way to handle this is to create a separate task for PubSub handling. Parallel :meth:`aiohttp.web.WebSocketResponse.receive` calls are forbidden; a single task should perform WebSocket reading. However, other tasks may use the same WebSocket object for sending data to peers. :: async def handler(request): ws = web.WebSocketResponse() await ws.prepare(request) task = request.app.loop.create_task( read_subscription(ws, request.app['redis'])) try: async for msg in ws: # handle incoming messages # use ws.send_str() to send data back ... finally: task.cancel() async def read_subscription(ws, redis): channel, = await redis.subscribe('channel:1') try: async for msg in channel.iter(): answer = process_message(msg) await ws.send_str(answer) finally: await redis.unsubscribe('channel:1') .. _aiohttp_faq_terminating_websockets: How do I programmatically close a WebSocket server-side? -------------------------------------------------------- Let's say we have an application with two endpoints: 1. ``/echo``, a WebSocket echo server that authenticates the user 2. ``/logout_user`` that, when invoked, closes all open WebSockets for that user. One simple solution is to keep a shared registry of WebSocket responses for a user in the :class:`aiohttp.web.Application` instance and call :meth:`aiohttp.web.WebSocketResponse.close` on all of them in the ``/logout_user`` handler:: async def echo_handler(request): ws = web.WebSocketResponse() user_id = authenticate_user(request) await ws.prepare(request) request.app['websockets'][user_id].add(ws) try: async for msg in ws: await ws.send_str(msg.data) finally: request.app['websockets'][user_id].remove(ws) return ws async def logout_handler(request): user_id = authenticate_user(request) ws_closers = [ws.close() for ws in request.app['websockets'][user_id] if not ws.closed] # Watch out, this will keep us from returning the response # until all are closed ws_closers and await asyncio.gather(*ws_closers) return web.Response(text='OK') def main(): loop = asyncio.get_event_loop() app = web.Application(loop=loop) app.router.add_route('GET', '/echo', echo_handler) app.router.add_route('POST', '/logout', logout_handler) app['websockets'] = defaultdict(set) web.run_app(app, host='localhost', port=8080) How do I make a request from a specific IP address? 
--------------------------------------------------- If your system has several IP interfaces, you may choose one which will be used to bind a socket locally:: conn = aiohttp.TCPConnector(local_addr=('127.0.0.1', 0), loop=loop) async with aiohttp.ClientSession(connector=conn) as session: ... .. seealso:: :class:`aiohttp.TCPConnector` and the ``local_addr`` parameter. What is the API stability and deprecation policy? ------------------------------------------------- *aiohttp* follows strong `Semantic Versioning `_ (SemVer). Obsolete attributes and methods are marked as *deprecated* in the documentation and raise :class:`DeprecationWarning` upon usage. Assume aiohttp ``X.Y.Z`` where ``X`` is the major version, ``Y`` is the minor version and ``Z`` is the bugfix number. For example, if the latest released version is ``aiohttp==3.0.6``: ``3.0.7`` fixes some bugs but has no new features. ``3.1.0`` introduces new features and can deprecate some API but never removes it; all bug fixes from the previous release are merged in as well. ``4.0.0`` removes all deprecations collected from ``3.Y`` versions **except** deprecations from the **last** ``3.Y`` release. These deprecations will be removed by ``5.0.0``. Unfortunately we may have to break these rules when a **security vulnerability** is found. If a security problem cannot be fixed without breaking backward compatibility, a bugfix release may break compatibility. This is unlikely, but possible. All backward incompatible changes are explicitly marked in :ref:`the changelog `. How do I enable gzip compression globally for my entire application? -------------------------------------------------------------------- It's impossible. Choosing what to compress and what not to compress is a tricky matter. If you need global compression, write a custom middleware. Or enable compression in NGINX (you are deploying aiohttp behind a reverse proxy, right?). How do I manage a ClientSession within a web server? ---------------------------------------------------- :class:`aiohttp.ClientSession` should be created once for the lifetime of the server in order to benefit from connection pooling. Sessions save cookies internally. If you don't need cookie processing, use :class:`aiohttp.DummyCookieJar`. If you need separate cookies for different HTTP calls but want to process them in logical chains, use a single :class:`aiohttp.TCPConnector` with separate client sessions and ``connector_owner=False``. How do I access database connections from a subapplication? ----------------------------------------------------------- Restricting access from a subapplication to the main (or outer) app is a deliberate choice. A subapplication is an isolated unit by design. If you need to share a database object, do it explicitly:: subapp['db'] = mainapp['db'] mainapp.add_subapp('/prefix', subapp) How do I perform operations in a request handler after sending the response? ---------------------------------------------------------------------------- Middlewares can be written to handle post-response operations, but they run after every request. 
You can explicitly send the response by calling :meth:`aiohttp.web.Response.write_eof`, which starts sending before the handler returns, giving you a chance to execute follow-up operations:: async def ping_handler(request): """Send PONG and increase DB counter.""" # explicitly send the response resp = web.json_response({'message': 'PONG'}) await resp.prepare(request) await resp.write_eof() # increase the pong count APP['db'].inc_pong() return resp A :class:`aiohttp.web.Response` object must be returned. This is required by aiohttp web contracts, even though the response has already been sent. How do I make sure my custom middleware response will behave correctly? ------------------------------------------------------------------------ Sometimes your middleware handlers might need to send a custom response. This is just fine as long as you always create a new :class:`aiohttp.web.Response` object when required. The response object is a Finite State Machine. Once it has been dispatched by the server, it will reach its final state and cannot be used again. The following middleware will make the server hang once it serves the second response:: from aiohttp import web def misbehaved_middleware(): # don't do this! cached = web.Response(status=200, text='Hi, I am cached!') @web.middleware async def middleware(request, handler): # ignoring response for the sake of this example _res = handler(request) return cached return middleware The rule of thumb is *one request, one response*. Why is creating a ClientSession outside of an event loop dangerous? ------------------------------------------------------------------- The short answer is: the life-cycle of all asyncio objects should be shorter than the life-cycle of the event loop. The full explanation is longer. All asyncio objects should be correctly finished/disconnected/closed before the event loop shuts down. Otherwise the user can get unexpected behavior: in the best case it is a warning about an unclosed resource, in the worst case the program just hangs because an awaited coroutine is never resumed. Consider the following code from ``mod.py``:: import aiohttp session = aiohttp.ClientSession() async def fetch(url): async with session.get(url) as resp: return await resp.text() The session grabs the current event loop instance and stores it in a private variable. The main module imports the module and installs ``uvloop`` (an alternative fast event loop implementation). ``main.py``:: import asyncio import uvloop import mod asyncio.set_event_loop_policy(uvloop.EventLoopPolicy()) asyncio.run(main()) The code is broken: ``session`` is bound to the default ``asyncio`` loop at import time, but the loop is changed **after the import** by ``asyncio.set_event_loop_policy()``. As a result the ``fetch()`` call hangs. To avoid import dependency hell, *aiohttp* encourages creating the ``ClientSession`` from an async function. The same policy works for ``web.Application`` too. Another use case is unit test writing. Many test libraries (the *aiohttp test tools* first of all) create a new loop instance for every test function execution. This is done for the sake of test isolation. Otherwise pending activity (timers, network packets etc.) from a previous test may interfere with the current one, producing cryptic and unstable test failures. Note: *class variables* are actually hidden globals. 
The following code has the same problem as the ``mod.py`` example: the ``session`` variable is a hidden global object:: class A: session = aiohttp.ClientSession() async def fetch(self, url): async with self.session.get(url) as resp: return await resp.text() aiohttp-3.6.2/docs/favicon.ico0000644000175100001650000001027613547410117016544 0ustar vstsdocker00000000000000aiohttp-3.6.2/docs/glossary.rst0000644000175100001650000000653313547410117017015 0ustar vstsdocker00000000000000.. _aiohttp-glossary: ========== Glossary ========== .. if you add new entries, keep the alphabetical sorting! .. glossary:: :sorted: aiodns DNS resolver for asyncio. https://pypi.python.org/pypi/aiodns asyncio The library for writing single-threaded concurrent code using coroutines, multiplexing I/O access over sockets and other resources, running network clients and servers, and other related primitives. Reference implementation of :pep:`3156` https://pypi.python.org/pypi/asyncio/ callable Any object that can be called. Use :func:`callable` to check that. chardet The Universal Character Encoding Detector https://pypi.python.org/pypi/chardet/ cchardet cChardet is a high-speed universal character encoding detector, binding to charsetdetect. https://pypi.python.org/pypi/cchardet/ gunicorn Gunicorn 'Green Unicorn' is a Python WSGI HTTP Server for UNIX. http://gunicorn.org/ IDNA An Internationalized Domain Name in Applications (IDNA) is an industry standard for encoding Internet Domain Names that contain in whole or in part, in a language-specific script or alphabet, such as Arabic, Chinese, Cyrillic, Tamil, Hebrew or the Latin alphabet-based characters with diacritics or ligatures, such as French. These writing systems are encoded by computers in multi-byte Unicode. Internationalized domain names are stored in the Domain Name System as ASCII strings using Punycode transcription. 
keep-alive A technique for communicating between HTTP client and server when connection is not closed after sending response but kept open for sending next request through the same socket. It makes communication faster by getting rid of connection establishment for every request. nginx Nginx [engine x] is an HTTP and reverse proxy server, a mail proxy server, and a generic TCP/UDP proxy server. https://nginx.org/en/ percent-encoding A mechanism for encoding information in a Uniform Resource Locator (URL) if URL parts don't fit in safe characters space. requoting Applying :term:`percent-encoding` to non-safe symbols and decode percent encoded safe symbols back. According to :rfc:`3986` allowed path symbols are:: allowed = unreserved / pct-encoded / sub-delims / ":" / "@" / "/" pct-encoded = "%" HEXDIG HEXDIG unreserved = ALPHA / DIGIT / "-" / "." / "_" / "~" sub-delims = "!" / "$" / "&" / "'" / "(" / ")" / "*" / "+" / "," / ";" / "=" resource A concept reflects the HTTP **path**, every resource corresponds to *URI*. May have a unique name. Contains :term:`route`\'s for different HTTP methods. route A part of :term:`resource`, resource's *path* coupled with HTTP method. web-handler An endpoint that returns HTTP response. websocket A protocol providing full-duplex communication channels over a single TCP connection. The WebSocket protocol was standardized by the IETF as :rfc:`6455` yarl A library for operating with URL objects. https://pypi.python.org/pypi/yarl aiohttp-3.6.2/docs/index.rst0000644000175100001650000001166013547410117016262 0ustar vstsdocker00000000000000.. aiohttp documentation master file, created by sphinx-quickstart on Wed Mar 5 12:35:35 2014. You can adapt this file completely to your liking, but it should at least contain the root `toctree` directive. ================== Welcome to AIOHTTP ================== Asynchronous HTTP Client/Server for :term:`asyncio` and Python. Current version is |release|. .. _GitHub: https://github.com/aio-libs/aiohttp Key Features ============ - Supports both :ref:`aiohttp-client` and :ref:`HTTP Server `. - Supports both :ref:`Server WebSockets ` and :ref:`Client WebSockets ` out-of-the-box without the Callback Hell. - Web-server has :ref:`aiohttp-web-middlewares`, :ref:`aiohttp-web-signals` and pluggable routing. .. _aiohttp-installation: Library Installation ==================== .. code-block:: bash $ pip install aiohttp You may want to install *optional* :term:`cchardet` library as faster replacement for :term:`chardet`: .. code-block:: bash $ pip install cchardet For speeding up DNS resolving by client API you may install :term:`aiodns` as well. This option is highly recommended: .. code-block:: bash $ pip install aiodns Installing speedups altogether ------------------------------ The following will get you ``aiohttp`` along with :term:`chardet`, :term:`aiodns` and ``brotlipy`` in one bundle. No need to type separate commands anymore! .. 
code-block:: bash $ pip install aiohttp[speedups] Getting Started =============== Client example:: import aiohttp import asyncio async def fetch(session, url): async with session.get(url) as response: return await response.text() async def main(): async with aiohttp.ClientSession() as session: html = await fetch(session, 'http://python.org') print(html) if __name__ == '__main__': loop = asyncio.get_event_loop() loop.run_until_complete(main()) Server example:: from aiohttp import web async def handle(request): name = request.match_info.get('name', "Anonymous") text = "Hello, " + name return web.Response(text=text) app = web.Application() app.add_routes([web.get('/', handle), web.get('/{name}', handle)]) if __name__ == '__main__': web.run_app(app) For more information please visit :ref:`aiohttp-client` and :ref:`aiohttp-web` pages. What's new in aiohttp 3? ======================== Go to :ref:`aiohttp_whats_new_3_0` page for aiohttp 3.0 major release changes. Tutorial ======== :ref:`Polls tutorial ` Source code =========== The project is hosted on GitHub_ Please feel free to file an issue on the `bug tracker `_ if you have found a bug or have some suggestion in order to improve the library. The library uses `Travis `_ for Continuous Integration. Dependencies ============ - Python 3.5.3+ - *async_timeout* - *attrs* - *chardet* - *multidict* - *yarl* - *Optional* :term:`cchardet` as faster replacement for :term:`chardet`. Install it explicitly via: .. code-block:: bash $ pip install cchardet - *Optional* :term:`aiodns` for fast DNS resolving. The library is highly recommended. .. code-block:: bash $ pip install aiodns Communication channels ====================== *aio-libs* google group: https://groups.google.com/forum/#!forum/aio-libs Feel free to post your questions and ideas here. *gitter chat* https://gitter.im/aio-libs/Lobby We support `Stack Overflow `_. Please add *aiohttp* tag to your question there. Contributing ============ Please read the :ref:`instructions for contributors` before making a Pull Request. Authors and License =================== The ``aiohttp`` package is written mostly by Nikolay Kim and Andrew Svetlov. It's *Apache 2* licensed and freely available. Feel free to improve this package and send a pull request to GitHub_. .. _aiohttp-backward-compatibility-policy: Policy for Backward Incompatible Changes ======================================== *aiohttp* keeps backward compatibility. After deprecating some *Public API* (method, class, function argument, etc.) the library guaranties the usage of *deprecated API* is still allowed at least for a year and half after publishing new release with deprecation. All deprecations are reflected in documentation and raises :exc:`DeprecationWarning`. Sometimes we are forced to break the own rule for sake of very strong reason. Most likely the reason is a critical bug which cannot be solved without major API change, but we are working hard for keeping these changes as rare as possible. Table Of Contents ================= .. toctree:: :name: mastertoc :maxdepth: 2 client web utilities faq misc external contributing aiohttp-3.6.2/docs/logging.rst0000644000175100001650000001314113547410117016575 0ustar vstsdocker00000000000000.. _aiohttp-logging: Logging ======= .. currentmodule:: aiohttp *aiohttp* uses standard :mod:`logging` for tracking the library activity. 
We have the following loggers enumerated by names: - ``'aiohttp.access'`` - ``'aiohttp.client'`` - ``'aiohttp.internal'`` - ``'aiohttp.server'`` - ``'aiohttp.web'`` - ``'aiohttp.websocket'`` You may subscribe to these loggers for getting logging messages. The page does not provide instructions for logging subscribing while the most friendly method is :func:`logging.config.dictConfig` for configuring whole loggers in your application. Logging does not work out of the box. It requires at least minimal ``'logging'`` configuration. Example of minimal working logger setup:: import logging from aiohttp import web app = web.Application() logging.basicConfig(level=logging.DEBUG) web.run_app(app, port=5000) .. versionadded:: 4.0.0 Access logs ----------- Access logs are enabled by default. If the `debug` flag is set, and the default logger ``'aiohttp.access'`` is used, access logs will be output to :obj:`~sys.stderr` if no handlers are attached. Furthermore, if the default logger has no log level set, the log level will be set to :obj:`logging.DEBUG`. This logging may be controlled by :meth:`aiohttp.web.AppRunner` and :func:`aiohttp.web.run_app`. To override the default logger, pass an instance of :class:`logging.Logger` to override the default logger. .. note:: Use ``web.run_app(app, access_log=None)`` to disable access logs. In addition, *access_log_format* may be used to specify the log format. .. _aiohttp-logging-access-log-format-spec: Format specification ^^^^^^^^^^^^^^^^^^^^ The library provides custom micro-language to specifying info about request and response: +--------------+---------------------------------------------------------+ | Option | Meaning | +==============+=========================================================+ | ``%%`` | The percent sign | +--------------+---------------------------------------------------------+ | ``%a`` | Remote IP-address | | | (IP-address of proxy if using reverse proxy) | +--------------+---------------------------------------------------------+ | ``%t`` | Time when the request was started to process | +--------------+---------------------------------------------------------+ | ``%P`` | The process ID of the child that serviced the request | +--------------+---------------------------------------------------------+ | ``%r`` | First line of request | +--------------+---------------------------------------------------------+ | ``%s`` | Response status code | +--------------+---------------------------------------------------------+ | ``%b`` | Size of response in bytes, including HTTP headers | +--------------+---------------------------------------------------------+ | ``%T`` | The time taken to serve the request, in seconds | +--------------+---------------------------------------------------------+ | ``%Tf`` | The time taken to serve the request, in seconds | | | with fraction in %.06f format | +--------------+---------------------------------------------------------+ | ``%D`` | The time taken to serve the request, in microseconds | +--------------+---------------------------------------------------------+ | ``%{FOO}i`` | ``request.headers['FOO']`` | +--------------+---------------------------------------------------------+ | ``%{FOO}o`` | ``response.headers['FOO']`` | +--------------+---------------------------------------------------------+ The default access log format is:: '%a %t "%r" %s %b "%{Referer}i" "%{User-Agent}i"' .. versionadded:: 2.3.0 *access_log_class* introduced. 
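For example, a custom format may be passed directly to
:func:`aiohttp.web.run_app` via the *access_log_format* argument (a minimal
sketch; the chosen format string is only an illustration)::

    from aiohttp import web

    app = web.Application()
    # log the remote address, the request line, the status code
    # and the Referer header for every handled request
    web.run_app(app, access_log_format='%a "%r" %s "%{Referer}i"')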
Example of a drop-in replacement for the default access logger:: from aiohttp.abc import AbstractAccessLogger class AccessLogger(AbstractAccessLogger): def log(self, request, response, time): self.logger.info(f'{request.remote} ' f'"{request.method} {request.path} ' f'done in {time}s: {response.status}') .. _gunicorn-accesslog: Gunicorn access logs ^^^^^^^^^^^^^^^^^^^^ When `Gunicorn `_ is used for :ref:`deployment `, its default access log format will be automatically replaced with the default aiohttp's access log format. If Gunicorn's option access_logformat_ is specified explicitly, it should use aiohttp's format specification. Gunicorn's access log works only if accesslog_ is specified explicitly in your config or as a command line option. This configuration can be either a path or ``'-'``. If the application uses a custom logging setup intercepting the ``'gunicorn.access'`` logger, accesslog_ should be set to ``'-'`` to prevent Gunicorn to create an empty access log file upon every startup. Error logs ---------- :mod:`aiohttp.web` uses a logger named ``'aiohttp.server'`` to store errors given on web requests handling. This log is enabled by default. To use a different logger name, pass *logger* (:class:`logging.Logger` instance) to the :meth:`aiohttp.web.AppRunner` constructor. .. _access_logformat: http://docs.gunicorn.org/en/stable/settings.html#access-log-format .. _accesslog: http://docs.gunicorn.org/en/stable/settings.html#accesslog aiohttp-3.6.2/docs/make.bat0000644000175100001650000001505713547410117016032 0ustar vstsdocker00000000000000@ECHO OFF REM Command file for Sphinx documentation if "%SPHINXBUILD%" == "" ( set SPHINXBUILD=sphinx-build ) set BUILDDIR=_build set ALLSPHINXOPTS=-d %BUILDDIR%/doctrees %SPHINXOPTS% . set I18NSPHINXOPTS=%SPHINXOPTS% . if NOT "%PAPER%" == "" ( set ALLSPHINXOPTS=-D latex_paper_size=%PAPER% %ALLSPHINXOPTS% set I18NSPHINXOPTS=-D latex_paper_size=%PAPER% %I18NSPHINXOPTS% ) if "%1" == "" goto help if "%1" == "help" ( :help echo.Please use `make ^` where ^ is one of echo. html to make standalone HTML files echo. dirhtml to make HTML files named index.html in directories echo. singlehtml to make a single large HTML file echo. pickle to make pickle files echo. json to make JSON files echo. htmlhelp to make HTML files and a HTML help project echo. qthelp to make HTML files and a qthelp project echo. devhelp to make HTML files and a Devhelp project echo. epub to make an epub echo. latex to make LaTeX files, you can set PAPER=a4 or PAPER=letter echo. text to make text files echo. man to make manual pages echo. texinfo to make Texinfo files echo. gettext to make PO message catalogs echo. changes to make an overview over all changed/added/deprecated items echo. xml to make Docutils-native XML files echo. pseudoxml to make pseudoxml-XML files for display purposes echo. linkcheck to check all external links for integrity echo. doctest to run all doctests embedded in the documentation if enabled goto end ) if "%1" == "clean" ( for /d %%i in (%BUILDDIR%\*) do rmdir /q /s %%i del /q /s %BUILDDIR%\* goto end ) %SPHINXBUILD% 2> nul if errorlevel 9009 ( echo. echo.The 'sphinx-build' command was not found. Make sure you have Sphinx echo.installed, then set the SPHINXBUILD environment variable to point echo.to the full path of the 'sphinx-build' executable. Alternatively you echo.may add the Sphinx directory to PATH. echo. 
echo.If you don't have Sphinx installed, grab it from echo.http://sphinx-doc.org/ exit /b 1 ) if "%1" == "html" ( %SPHINXBUILD% -b html %ALLSPHINXOPTS% %BUILDDIR%/html if errorlevel 1 exit /b 1 echo. echo.Build finished. The HTML pages are in %BUILDDIR%/html. goto end ) if "%1" == "dirhtml" ( %SPHINXBUILD% -b dirhtml %ALLSPHINXOPTS% %BUILDDIR%/dirhtml if errorlevel 1 exit /b 1 echo. echo.Build finished. The HTML pages are in %BUILDDIR%/dirhtml. goto end ) if "%1" == "singlehtml" ( %SPHINXBUILD% -b singlehtml %ALLSPHINXOPTS% %BUILDDIR%/singlehtml if errorlevel 1 exit /b 1 echo. echo.Build finished. The HTML pages are in %BUILDDIR%/singlehtml. goto end ) if "%1" == "pickle" ( %SPHINXBUILD% -b pickle %ALLSPHINXOPTS% %BUILDDIR%/pickle if errorlevel 1 exit /b 1 echo. echo.Build finished; now you can process the pickle files. goto end ) if "%1" == "json" ( %SPHINXBUILD% -b json %ALLSPHINXOPTS% %BUILDDIR%/json if errorlevel 1 exit /b 1 echo. echo.Build finished; now you can process the JSON files. goto end ) if "%1" == "htmlhelp" ( %SPHINXBUILD% -b htmlhelp %ALLSPHINXOPTS% %BUILDDIR%/htmlhelp if errorlevel 1 exit /b 1 echo. echo.Build finished; now you can run HTML Help Workshop with the ^ .hhp project file in %BUILDDIR%/htmlhelp. goto end ) if "%1" == "qthelp" ( %SPHINXBUILD% -b qthelp %ALLSPHINXOPTS% %BUILDDIR%/qthelp if errorlevel 1 exit /b 1 echo. echo.Build finished; now you can run "qcollectiongenerator" with the ^ .qhcp project file in %BUILDDIR%/qthelp, like this: echo.^> qcollectiongenerator %BUILDDIR%\qthelp\aiohttp.qhcp echo.To view the help file: echo.^> assistant -collectionFile %BUILDDIR%\qthelp\aiohttp.ghc goto end ) if "%1" == "devhelp" ( %SPHINXBUILD% -b devhelp %ALLSPHINXOPTS% %BUILDDIR%/devhelp if errorlevel 1 exit /b 1 echo. echo.Build finished. goto end ) if "%1" == "epub" ( %SPHINXBUILD% -b epub %ALLSPHINXOPTS% %BUILDDIR%/epub if errorlevel 1 exit /b 1 echo. echo.Build finished. The epub file is in %BUILDDIR%/epub. goto end ) if "%1" == "latex" ( %SPHINXBUILD% -b latex %ALLSPHINXOPTS% %BUILDDIR%/latex if errorlevel 1 exit /b 1 echo. echo.Build finished; the LaTeX files are in %BUILDDIR%/latex. goto end ) if "%1" == "latexpdf" ( %SPHINXBUILD% -b latex %ALLSPHINXOPTS% %BUILDDIR%/latex cd %BUILDDIR%/latex make all-pdf cd %BUILDDIR%/.. echo. echo.Build finished; the PDF files are in %BUILDDIR%/latex. goto end ) if "%1" == "latexpdfja" ( %SPHINXBUILD% -b latex %ALLSPHINXOPTS% %BUILDDIR%/latex cd %BUILDDIR%/latex make all-pdf-ja cd %BUILDDIR%/.. echo. echo.Build finished; the PDF files are in %BUILDDIR%/latex. goto end ) if "%1" == "text" ( %SPHINXBUILD% -b text %ALLSPHINXOPTS% %BUILDDIR%/text if errorlevel 1 exit /b 1 echo. echo.Build finished. The text files are in %BUILDDIR%/text. goto end ) if "%1" == "man" ( %SPHINXBUILD% -b man %ALLSPHINXOPTS% %BUILDDIR%/man if errorlevel 1 exit /b 1 echo. echo.Build finished. The manual pages are in %BUILDDIR%/man. goto end ) if "%1" == "texinfo" ( %SPHINXBUILD% -b texinfo %ALLSPHINXOPTS% %BUILDDIR%/texinfo if errorlevel 1 exit /b 1 echo. echo.Build finished. The Texinfo files are in %BUILDDIR%/texinfo. goto end ) if "%1" == "gettext" ( %SPHINXBUILD% -b gettext %I18NSPHINXOPTS% %BUILDDIR%/locale if errorlevel 1 exit /b 1 echo. echo.Build finished. The message catalogs are in %BUILDDIR%/locale. goto end ) if "%1" == "changes" ( %SPHINXBUILD% -b changes %ALLSPHINXOPTS% %BUILDDIR%/changes if errorlevel 1 exit /b 1 echo. echo.The overview file is in %BUILDDIR%/changes. 
goto end ) if "%1" == "linkcheck" ( %SPHINXBUILD% -b linkcheck %ALLSPHINXOPTS% %BUILDDIR%/linkcheck if errorlevel 1 exit /b 1 echo. echo.Link check complete; look for any errors in the above output ^ or in %BUILDDIR%/linkcheck/output.txt. goto end ) if "%1" == "doctest" ( %SPHINXBUILD% -b doctest %ALLSPHINXOPTS% %BUILDDIR%/doctest if errorlevel 1 exit /b 1 echo. echo.Testing of doctests in the sources finished, look at the ^ results in %BUILDDIR%/doctest/output.txt. goto end ) if "%1" == "xml" ( %SPHINXBUILD% -b xml %ALLSPHINXOPTS% %BUILDDIR%/xml if errorlevel 1 exit /b 1 echo. echo.Build finished. The XML files are in %BUILDDIR%/xml. goto end ) if "%1" == "pseudoxml" ( %SPHINXBUILD% -b pseudoxml %ALLSPHINXOPTS% %BUILDDIR%/pseudoxml if errorlevel 1 exit /b 1 echo. echo.Build finished. The pseudo-XML files are in %BUILDDIR%/pseudoxml. goto end ) :end aiohttp-3.6.2/docs/migration_to_2xx.rst0000644000175100001650000001601013547410117020441 0ustar vstsdocker00000000000000.. _aiohttp-migration: Migration to 2.x ================ Client ------ chunking ^^^^^^^^ aiohttp does not support custom chunking sizes. It is up to the developer to decide how to chunk data streams. If chunking is enabled, aiohttp encodes the provided chunks in the "Transfer-encoding: chunked" format. aiohttp does not enable chunked encoding automatically even if a *transfer-encoding* header is supplied: *chunked* has to be set explicitly. If *chunked* is set, then the *Transfer-encoding* and *content-length* headers are disallowed. compression ^^^^^^^^^^^ Compression has to be enabled explicitly with the *compress* parameter. If compression is enabled, adding a *content-encoding* header is not allowed. Compression also enables the *chunked* transfer-encoding. Compression can not be combined with a *Content-Length* header. Client Connector ^^^^^^^^^^^^^^^^ 1. By default a connector object manages a total number of concurrent connections. This limit was a per host rule in version 1.x. In 2.x, the `limit` parameter defines how many concurrent connection connector can open and a new `limit_per_host` parameter defines the limit per host. By default there is no per-host limit. 2. BaseConnector.close is now a normal function as opposed to coroutine in version 1.x 3. BaseConnector.conn_timeout was moved to ClientSession ClientResponse.release ^^^^^^^^^^^^^^^^^^^^^^ Internal implementation was significantly redesigned. It is not required to call `release` on the response object. When the client fully receives the payload, the underlying connection automatically returns back to pool. If the payload is not fully read, the connection is closed Client exceptions ^^^^^^^^^^^^^^^^^ Exception hierarchy has been significantly modified. aiohttp now defines only exceptions that covers connection handling and server response misbehaviors. For developer specific mistakes, aiohttp uses python standard exceptions like ValueError or TypeError. Reading a response content may raise a ClientPayloadError exception. This exception indicates errors specific to the payload encoding. Such as invalid compressed data, malformed chunked-encoded chunks or not enough data that satisfy the content-length header. All exceptions are moved from `aiohttp.errors` module to top level `aiohttp` module. 
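For example, any failure raised by the client machinery can now be caught
through the top level package (a minimal sketch; the URL and session are
placeholders)::

    import aiohttp

    async def fetch(session, url):
        try:
            async with session.get(url) as resp:
                return await resp.text()
        except aiohttp.ClientError as exc:
            # ClientError is the common base class described below
            print('request failed:', exc)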
New hierarchy of exceptions: * `ClientError` - Base class for all client specific exceptions - `ClientResponseError` - exceptions that could happen after we get response from server * `WSServerHandshakeError` - web socket server response error - `ClientHttpProxyError` - proxy response - `ClientConnectionError` - exceptions related to low-level connection problems * `ClientOSError` - subset of connection errors that are initiated by an OSError exception - `ClientConnectorError` - connector related exceptions * `ClientProxyConnectionError` - proxy connection initialization error - `ServerConnectionError` - server connection related errors * `ServerDisconnectedError` - server disconnected * `ServerTimeoutError` - server operation timeout, (read timeout, etc) * `ServerFingerprintMismatch` - server fingerprint mismatch - `ClientPayloadError` - This exception can only be raised while reading the response payload if one of these errors occurs: invalid compression, malformed chunked encoding or not enough data that satisfy content-length header. Client payload (form-data) ^^^^^^^^^^^^^^^^^^^^^^^^^^ To unify form-data/payload handling a new `Payload` system was introduced. It handles customized handling of existing types and provide implementation for user-defined types. 1. FormData.__call__ does not take an encoding arg anymore and its return value changes from an iterator or bytes to a Payload instance. aiohttp provides payload adapters for some standard types like `str`, `byte`, `io.IOBase`, `StreamReader` or `DataQueue`. 2. a generator is not supported as data provider anymore, `streamer` can be used instead. For example, to upload data from file:: @aiohttp.streamer def file_sender(writer, file_name=None): with open(file_name, 'rb') as f: chunk = f.read(2**16) while chunk: yield from writer.write(chunk) chunk = f.read(2**16) # Then you can use `file_sender` like this: async with session.post('http://httpbin.org/post', data=file_sender(file_name='huge_file')) as resp: print(await resp.text()) Various ^^^^^^^ 1. the `encoding` parameter is deprecated in `ClientSession.request()`. Payload encoding is controlled at the payload level. It is possible to specify an encoding for each payload instance. 2. the `version` parameter is removed in `ClientSession.request()` client version can be specified in the `ClientSession` constructor. 3. `aiohttp.MsgType` dropped, use `aiohttp.WSMsgType` instead. 4. `ClientResponse.url` is an instance of `yarl.URL` class (`url_obj` is deprecated) 5. `ClientResponse.raise_for_status()` raises :exc:`aiohttp.ClientResponseError` exception 6. `ClientResponse.json()` is strict about response's content type. if content type does not match, it raises :exc:`aiohttp.ClientResponseError` exception. To disable content type check you can pass ``None`` as `content_type` parameter. Server ------ ServerHttpProtocol and low-level details ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ Internal implementation was significantly redesigned to provide better performance and support HTTP pipelining. ServerHttpProtocol is dropped, implementation is merged with RequestHandler a lot of low-level api's are dropped. Application ^^^^^^^^^^^ 1. Constructor parameter `loop` is deprecated. Loop is get configured by application runner, `run_app` function for any of gunicorn workers. 2. `Application.router.add_subapp` is dropped, use `Application.add_subapp` instead 3. `Application.finished` is dropped, use `Application.cleanup` instead WebRequest and WebResponse ^^^^^^^^^^^^^^^^^^^^^^^^^^ 1. 
the `GET` and `POST` attributes no longer exist. Use the `query` attribute
   instead of `GET`.

2. A custom chunking size is not supported for `WebResponse.chunked` - the
   developer is responsible for the actual chunking.

3. Payloads are supported as body, so it is possible to use a client
   response's content object as the body parameter for `WebResponse`.

4. The `FileSender` API is dropped; it is replaced with the more general
   `FileResponse` class::

       async def handle(request):
           return web.FileResponse('path-to-file.txt')

5. `WebSocketResponse.protocol` is renamed to
   `WebSocketResponse.ws_protocol`. `WebSocketResponse.protocol` is now an
   instance of the `RequestHandler` class.

RequestPayloadError
^^^^^^^^^^^^^^^^^^^

Reading a request's payload may raise a `RequestPayloadError` exception.
The behavior is similar to `ClientPayloadError`.

WSGI
^^^^

*WSGI* support has been dropped, as well as gunicorn wsgi support. We still
provide default and uvloop gunicorn workers for `web.Application`.

aiohttp-3.6.2/docs/misc.rst
.. _aiohttp-misc:

Miscellaneous
=============

Helpful pages.

.. toctree::
   :name: misc

   essays
   glossary

.. toctree::
   :titlesonly:

   changes

Indices and tables
------------------

* :ref:`genindex`
* :ref:`modindex`
* :ref:`search`

aiohttp-3.6.2/docs/multipart.rst
.. currentmodule:: aiohttp

.. _aiohttp-multipart:

Working with Multipart
======================

``aiohttp`` supports a full featured multipart reader and writer. Both are
designed with streaming processing in mind to avoid an unwanted memory
footprint, which may be significant if you're dealing with large payloads;
this also means that most I/O operations may only be executed a single time.

Reading Multipart Responses
---------------------------

Assume you made a request, as usual, and want to process the response
multipart data::

    async with aiohttp.request(...) as resp:
        pass

First, you need to wrap the response with
:meth:`MultipartReader.from_response`. This keeps the implementation of
:class:`MultipartReader` separated from the response and the connection
routines, which makes it more portable::

    reader = aiohttp.MultipartReader.from_response(resp)

Let's assume that with this response you received a JSON document and
multiple files for it, but you don't need all of them, just a specific one.

So first you need to enter a loop where the multipart body is processed::

    metadata = None
    filedata = None
    while True:
        part = await reader.next()

The returned type depends on what the next part is: if it's a simple body
part you'll get a :class:`BodyPartReader` instance here, otherwise it will
be another :class:`MultipartReader` instance for a nested multipart.
Remember that the multipart format is recursive and supports multiple
levels of nested body parts. When there are no more parts left to fetch,
``None`` is returned - that's the signal to break the loop::

    if part is None:
        break

Both :class:`BodyPartReader` and :class:`MultipartReader` provide access to
the body part headers: this allows you to filter parts by their
attributes::

    if part.headers[aiohttp.hdrs.CONTENT_TYPE] == 'application/json':
        metadata = await part.json()
        continue

Neither :class:`BodyPartReader` nor :class:`MultipartReader` instances read
the whole body part data unless explicitly asked to.
:class:`BodyPartReader` provides a set of helper methods to fetch popular
content types in a friendly way:

- :meth:`BodyPartReader.text` for plain text data;
- :meth:`BodyPartReader.json` for JSON;
- :meth:`BodyPartReader.form` for `application/x-www-form-urlencoded`

Each of these methods automatically recognizes whether the content is
compressed with the `gzip` or `deflate` encoding (while it respects the
`identity` one), or whether the transfer encoding is base64 or
`quoted-printable` - in each case the result is automatically decoded. If
you need the raw binary data as-is, there are the :meth:`BodyPartReader.read`
and :meth:`BodyPartReader.read_chunk` coroutine methods to read it all in a
single shot or chunk by chunk, respectively.

When you have to deal with multipart files, the
:attr:`BodyPartReader.filename` property comes in handy. It is a smart
helper which handles the `Content-Disposition` header correctly and extracts
the filename attribute from it::

    if part.filename != 'secret.txt':
        continue

If the current body part does not match your expectations and you want to
skip it, just continue the loop to start the next iteration. Here is where
the magic happens. Before fetching the next body part, ``await
reader.next()`` ensures that the previous one was read completely. If it was
not, all of its content is sent to the void in order to fetch the next part.
So you don't have to care about cleanup routines while you're within the
loop.

Once you've found the part for the file you were searching for, just read
it. Let's handle it as-is, without applying any decoding magic::

    filedata = await part.read(decode=False)

Later you may decide to decode the data. It's still simple to do::

    filedata = part.decode(filedata)

Once you are done with the multipart processing, just break the loop::

    break

Sending Multipart Requests
--------------------------

:class:`MultipartWriter` provides an interface to build a multipart payload
from Python data and serialize it into a chunked binary stream. Since the
multipart format is recursive and supports deep nesting, you can use the
``with`` statement to design your multipart data close to how it will be
sent::

    with aiohttp.MultipartWriter('mixed') as mpwriter:
        ...
        with aiohttp.MultipartWriter('related') as subwriter:
            ...
        mpwriter.append(subwriter)

        with aiohttp.MultipartWriter('related') as subwriter:
            ...
            with aiohttp.MultipartWriter('related') as subsubwriter:
                ...
            subwriter.append(subsubwriter)
        mpwriter.append(subwriter)

        with aiohttp.MultipartWriter('related') as subwriter:
            ...
        mpwriter.append(subwriter)

:meth:`MultipartWriter.append` is used to join new body parts into a single
stream. It accepts various inputs and determines what default headers should
be used for them.
For text data the default `Content-Type` is
:mimetype:`text/plain; charset=utf-8`::

    mpwriter.append('hello')

For binary data :mimetype:`application/octet-stream` is used::

    mpwriter.append(b'aiohttp')

You can always override these defaults by passing your own headers as the
second argument::

    mpwriter.append(io.BytesIO(b'GIF89a...'),
                    {'CONTENT-TYPE': 'image/gif'})

For file objects the `Content-Type` will be determined by using Python's
:mod:`mimetypes` module, and additionally the `Content-Disposition` header
will include the file's basename::

    part = root.append(open(__file__, 'rb'))

If you want to send a file with a different name, just take the
:class:`Payload` instance which :meth:`MultipartWriter.append` always
returns and set `Content-Disposition` explicitly by using the
:meth:`Payload.set_content_disposition` helper::

    part.set_content_disposition('attachment', filename='secret.txt')

Additionally, you may want to set other headers here::

    part.headers[aiohttp.hdrs.CONTENT_ID] = 'X-12345'

If you set `Content-Encoding`, it will be automatically applied to the data
on serialization (see below)::

    part.headers[aiohttp.hdrs.CONTENT_ENCODING] = 'gzip'

There are also the :meth:`MultipartWriter.append_json` and
:meth:`MultipartWriter.append_form` helpers which are useful for working
with JSON and form urlencoded data, so you don't have to encode it manually
every time::

    mpwriter.append_json({'test': 'passed'})
    mpwriter.append_form([('key', 'value')])

When it's done, to make a request just pass the root
:class:`MultipartWriter` instance as the ``data`` argument of
:meth:`aiohttp.ClientSession.request`::

    await session.post('http://example.com', data=mpwriter)

Behind the scenes :meth:`MultipartWriter.write` will yield chunks of every
part, and if a body part has `Content-Encoding` or
`Content-Transfer-Encoding` they will be applied to the streamed content.

Please note that on :meth:`MultipartWriter.write` all the file objects will
be read until the end, and there is no way to repeat the request without
rewinding their pointers to the start.

Example: MJPEG streaming with ``multipart/x-mixed-replace``. By default
:meth:`MultipartWriter.write` appends the closing ``--boundary--`` and
breaks your content. Providing ``close_boundary=False`` prevents this::

    my_boundary = 'some-boundary'

    response = web.StreamResponse(
        status=200,
        reason='OK',
        headers={
            'Content-Type': 'multipart/x-mixed-replace;boundary={}'.format(my_boundary)
        }
    )

    while True:
        frame = get_jpeg_frame()
        with MultipartWriter('image/jpeg', boundary=my_boundary) as mpwriter:
            mpwriter.append(frame, {
                'Content-Type': 'image/jpeg'
            })
            await mpwriter.write(response, close_boundary=False)
        await response.drain()

Hacking Multipart
-----------------

The Internet is full of terror and sometimes you may find a server which
implements multipart support in strange ways, where an obvious solution does
not work. For instance, if the server uses :class:`cgi.FieldStorage` then
you have to ensure that no body part contains a `Content-Length` header::

    for part in mpwriter:
        part.headers.pop(aiohttp.hdrs.CONTENT_LENGTH, None)

On the other hand, some servers may require you to specify `Content-Length`
for the whole multipart request. `aiohttp` does not do that since it sends
multipart using chunked transfer encoding by default.
To overcome this issue, you have to serialize the :class:`MultipartWriter`
on your own in order to calculate its size::

    class Writer:
        def __init__(self):
            self.buffer = bytearray()

        async def write(self, data):
            self.buffer.extend(data)

    writer = Writer()
    await mpwriter.write(writer)
    await session.post('http://example.com',
                       data=writer.buffer, headers=mpwriter.headers)

Sometimes the server response may not be well formed: it may or may not
contain nested parts. For instance, we request a resource which returns JSON
documents with the files attached to them. If a document has any
attachments, they are returned as a nested multipart. If it has none, it
responds with plain body parts:

.. code-block:: none

    CONTENT-TYPE: multipart/mixed; boundary=--:

    --:
    CONTENT-TYPE: application/json

    {"_id": "foo"}
    --:
    CONTENT-TYPE: multipart/related; boundary=----:

    ----:
    CONTENT-TYPE: application/json

    {"_id": "bar"}
    ----:
    CONTENT-TYPE: text/plain
    CONTENT-DISPOSITION: attachment; filename=bar.txt

    bar! bar! bar!
    ----:--
    --:
    CONTENT-TYPE: application/json

    {"_id": "boo"}
    --:
    CONTENT-TYPE: multipart/related; boundary=----:

    ----:
    CONTENT-TYPE: application/json

    {"_id": "baz"}
    ----:
    CONTENT-TYPE: text/plain
    CONTENT-DISPOSITION: attachment; filename=baz.txt

    baz! baz! baz!
    ----:--
    --:--

Reading this kind of data in a single stream is possible, but it is not
clean at all::

    result = []
    while True:
        part = await reader.next()
        if part is None:
            break
        if isinstance(part, aiohttp.MultipartReader):
            # Fetching files
            while True:
                filepart = await part.next()
                if filepart is None:
                    break
                result[-1].append((await filepart.read()))
        else:
            # Fetching document
            result.append([(await part.json())])

Let's hack a reader so that it returns pairs of a document and a reader of
the related files on each iteration::

    class PairsMultipartReader(aiohttp.MultipartReader):

        # keep a reference to the original reader class
        multipart_reader_cls = aiohttp.MultipartReader

        async def next(self):
            """Emits a tuple of document object (:class:`dict`) and multipart
            reader of the followed attachments (if any).

            :rtype: tuple
            """
            reader = await super().next()
            if self._at_eof:
                return None, None
            if isinstance(reader, self.multipart_reader_cls):
                part = await reader.next()
                doc = await part.json()
            else:
                doc = await reader.json()
            return doc, reader

And this gives us a much cleaner solution::

    reader = PairsMultipartReader.from_response(resp)
    result = []
    while True:
        doc, files_reader = await reader.next()
        if doc is None:
            break

        files = []
        while True:
            filepart = await files_reader.next()
            if filepart is None:
                break
            files.append((await filepart.read()))

        result.append((doc, files))

.. seealso:: :ref:`aiohttp-multipart-reference`

aiohttp-3.6.2/docs/multipart_reference.rst
.. currentmodule:: aiohttp

.. _aiohttp-multipart-reference:

Multipart reference
===================

.. class:: MultipartResponseWrapper(resp, stream)

   Wrapper around the :class:`MultipartReader` that takes care of the
   underlying connection and closes it when needed.

   .. method:: at_eof()

      Returns ``True`` when all response data has been read.

      :rtype: bool

   .. comethod:: next()

      Emits the next multipart reader object.

   .. comethod:: release()

      Releases the connection gracefully, reading all the content to the
      void.

.. class:: BodyPartReader(boundary, headers, content)

   Multipart reader for a single body part.

   .. comethod:: read(*, decode=False)

      Reads the body part data.

      :param bool decode: Decodes the data following the encoding method
                          from the ``Content-Encoding`` header. If the
                          header is missing, the data remains untouched.

      :rtype: bytearray

   .. comethod:: read_chunk(size=chunk_size)

      Reads a chunk of the body part content of the specified size.

      :param int size: chunk size

      :rtype: bytearray

   .. comethod:: readline()

      Reads the body part line by line.

      :rtype: bytearray

   .. comethod:: release()

      Like :meth:`read`, but reads all the data to the void.

      :rtype: None

   .. comethod:: text(*, encoding=None)

      Like :meth:`read`, but assumes that the body part contains text data.

      :param str encoding: Custom text encoding. Overrides the charset
                           parameter of the ``Content-Type`` header.

      :rtype: str

   .. comethod:: json(*, encoding=None)

      Like :meth:`read`, but assumes that the body part contains JSON data.

      :param str encoding: Custom JSON encoding. Overrides the charset
                           parameter of the ``Content-Type`` header.

   .. comethod:: form(*, encoding=None)

      Like :meth:`read`, but assumes that the body part contains form
      urlencoded data.

      :param str encoding: Custom form encoding. Overrides the charset
                           parameter of the ``Content-Type`` header.

   .. method:: at_eof()

      Returns ``True`` if the boundary was reached or ``False`` otherwise.

      :rtype: bool

   .. method:: decode(data)

      Decodes data according to the specified ``Content-Encoding`` or
      ``Content-Transfer-Encoding`` header value.

      Supports the ``gzip``, ``deflate`` and ``identity`` encodings for the
      ``Content-Encoding`` header.

      Supports the ``base64``, ``quoted-printable`` and ``binary`` encodings
      for the ``Content-Transfer-Encoding`` header.

      :param bytearray data: Data to decode.

      :raises: :exc:`RuntimeError` - if the encoding is unknown.

      :rtype: bytes

   .. method:: get_charset(default=None)

      Returns the charset parameter from the ``Content-Type`` header or
      *default*.

   .. attribute:: name

      A field *name* specified in the ``Content-Disposition`` header or
      ``None`` if it is missing or the header is malformed.

      Read-only :class:`str` property.

   .. attribute:: filename

      A field *filename* specified in the ``Content-Disposition`` header or
      ``None`` if it is missing or the header is malformed.

      Read-only :class:`str` property.

.. class:: MultipartReader(headers, content)

   Multipart body reader.

   .. classmethod:: from_response(cls, response)

      Constructs a reader instance from an HTTP response.

      :param response: :class:`~aiohttp.client.ClientResponse` instance

   .. method:: at_eof()

      Returns ``True`` if the final boundary was reached or ``False``
      otherwise.

      :rtype: bool

   .. comethod:: next()

      Emits the next multipart body part.

   .. comethod:: release()

      Reads all the body parts to the void till the final boundary.

   .. comethod:: fetch_next_part()

      Returns the next body part reader.

.. class:: MultipartWriter(subtype='mixed', boundary=None, close_boundary=True)

   Multipart body writer.

   ``boundary`` may be an ASCII-only string.

   .. attribute:: boundary

      The string (:class:`str`) representation of the boundary.

      .. versionchanged:: 3.0

         Property type was changed from :class:`bytes` to :class:`str`.

   .. method:: append(obj, headers=None)

      Appends an object to the writer.

   .. method:: append_payload(payload)

      Adds a new body part to the multipart writer.

   .. method:: append_json(obj, headers=None)

      Helper to append a JSON part.

   .. method:: append_form(obj, headers=None)

      Helper to append a form urlencoded part.

   .. attribute:: size

      Size of the payload.

   .. comethod:: write(writer, close_boundary=True)

      Writes the body.

      :param bool close_boundary: Whether to emit the closing boundary. You
                                  may want to disable it when streaming
                                  (``multipart/x-mixed-replace``).

      .. versionadded:: 3.4

         Support for the ``close_boundary`` argument.
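A minimal usage sketch tying the reader classes above together (``resp`` is
assumed to be a :class:`~aiohttp.ClientResponse` obtained elsewhere)::

    reader = aiohttp.MultipartReader.from_response(resp)
    while True:
        part = await reader.next()
        if part is None:
            break
        if isinstance(part, aiohttp.MultipartReader):
            # nested multipart: skip its parts by releasing the reader
            await part.release()
        else:
            # plain body part: read and decode its payload
            data = await part.read(decode=True)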
aiohttp-3.6.2/docs/new_router.rst
.. _aiohttp-router-refactoring-021:

Router refactoring in 0.21
==========================

Rationale
---------

The first generation (v1) of the router mapped a ``(method, path)`` pair to
a :term:`web-handler`. The mapping is named a **route**. Routes used to
have unique names, if any.

The main mistake with that design is coupling the **route** to the
``(method, path)`` pair, while URL construction really operates with
**resources** (**location** is a synonym). The HTTP method is not part of
the URI; it is applied only when sending the HTTP request.

Having different **route names** for the same path is confusing. Moreover,
**named routes** constructed for the same path should have unique,
non-overlapping names, which is cumbersome in certain situations.

On the other hand, sometimes it's desirable to bind several HTTP methods to
the same web handler. For the *v1* router this can be solved by passing '*'
as the HTTP method. Class based views usually require the '*' method as
well.

Implementation
--------------

The change introduces the **resource** as a first class citizen::

    resource = router.add_resource('/path/{to}', name='name')

A *Resource* has a **path** (dynamic or constant) and an optional **name**.
The name is **unique** in the router context.

A *Resource* has **routes**.

A *Route* corresponds to an *HTTP method* and a :term:`web-handler` for
that method::

    route = resource.add_route('GET', handler)

The user may still use a wildcard for accepting all HTTP methods (maybe we
will add something like ``resource.add_wildcard(handler)`` later).

Since **names** belong to **resources** now, ``app.router['name']`` returns
a **resource** instance instead of :class:`aiohttp.web.Route`.

A **resource** has a ``.url()`` method, so
``app.router['name'].url(parts={'a': 'b'}, query={'arg': 'param'})`` still
works as usual.

The change allows rewriting static file handling and implementing nested
applications as well.

Decoupling the *HTTP location* from the *HTTP method* makes life easier.

Backward compatibility
----------------------

The refactoring is 99% compatible with the previous implementation. 99%
means all examples and most existing code work without modifications, but
there are subtle backward API incompatibilities.

``app.router['name']`` returns a :class:`aiohttp.web.BaseResource` instance
instead of :class:`aiohttp.web.Route`, but the resource has the same, most
useful ``resource.url(...)`` method, so the end user should feel no
difference.

``route.match(...)`` is **not** supported anymore; use
:meth:`aiohttp.web.AbstractResource.resolve` instead.

``app.router.add_route(method, path, handler, name='name')`` is now just a
shortcut for::

    resource = app.router.add_resource(path, name=name)
    route = resource.add_route(method, handler)
    return route

``app.router.register_route(...)`` is still supported; it creates an
:class:`aiohttp.web.ResourceAdapter` for every call (but it is deprecated
now).

aiohttp-3.6.2/docs/old-logo.png  (binary image data not shown)
Z oGQ GАD+y!WVx#$5% 6o\iJx*F{ved@K AAADD/6<RR2+!}`.WV8;UV;f;wn{キ?~ڵklhhxqg{{{Vggg^__r``zhh*=N^5mF`0 vβc0?k4^hl:SJCCCgyY;^v?{MΝ[QZZcǎV;^ "W$ߒ#_w!E&*C\W!A $@   {CpUJlԸÅR2;2addd433s'{kjjںh``zddMF^3Lǩp2LzN|822@uwwwQkk?{effNMMwINx"EdȐo%D@%@   &,9&*8!7Ćܘ$"7ĆBC&2LrSO='N,>vrddV"/ %I&[~222r`mm|'N,x'=eʔd)!"!"3S;j   >Rtm/rΌKΝ[qڵ?g TTC²N"F;.]z/<}x D\Ǯ/31 $@  +JCj,jŝpթaS~ݟ9kjkk.{`0pg %"<66vϯ̜~fCD "S LB   HrUrBpuo8fn7$ĆX _ /^|^Jzd2u0c=%fԡR.*^|w7&(B:BA< Rb  !AAA$9dIo8|%8np܀"ձa8|pK6:IjA!fsJ]xڵ?VWW/={dw $$ ""1ouAAIuqx*8S\uop%7\ݟGbc FWqJLdAAAU OƔVWW/}v``"ddd|kk .ۺuϽ b Rv$?  "B%GǗ] WDZQ… wGDK?ÇK_r%Aa\u|I??H~AAIYc*FT<Rߑ/^|JȲ-z%,{KR]lkk;nؾ}NHaHHG_  !.W!Ww3 sT 0sܹo~`c\we _)@+AAADIOT&"8D7~ϥK6wwwj4葔 I7 T_v.Fa]  v   08TCFwy罥uuuϏA)] FFFZ[[\qqqʝwy/ו3K G_H~AADJo8܍SL:244tez$8Jh{ @3@[`'$>  "$XUrHeqHx,8wgYv ""V\8rܹ/c #AAAAtqT:N<RԴSR8== DT JzqGqqqɓa3@NC  X@rx:*Grxq70SnK.mS>z#B,dhhkOOeWኍ%A]AAI#]V!!w1 sY~|ܹY:Sz#S C{__ҥK.\3FzEzn   ߋ_x#9qHvq|K_RTTܼG\`5 |I|Դ+???p}  $@Io $] _Ν[]h2y  T_tisllOb/bAAAȊ%XӨ#ƣ//\{nDG(AAE7FV6HD$=&i,;l?F~|}##/$>  usJIt .fYv"1:44t?~L$T$"?y DAAALHtC8"Gr9!%9>:44t8 =pz19=qy fAA#Pv 5>ی}7=oiio0q 7LE >x!6AA$:!5">?tz,!/zeݻg1 3ōp.t]$>  HtMtHusƸpˑO>5ez  Fʕ+HLLwFȋ?>H|AAp):<[WvsY)--M)Q 2:{ܹ&M"?>ƌdË.$>  "Prx+:9>ø޴m7ǔnlla0Z鱂 "1LY?tyEjˋԆ0AAA ؊w1 s~\tiy&z ;ر>ǦN#~E݆w!  s؊I===e,ˎҳAD)ənǿ   I>gy!_bgsغ9ޣ F}zڵ?:t}v}|t܅r>  "Tt:ӱn;vjoo2ͷ ">F{zz:Hw}xrꯜAAQ": :c2bɓ=w K6GW~"1Faނq zh;.[CT#W+1|e\w~}C_=M0ko/"޻t}{37X] >.AADO9\غ9o&DP?U6`鄾:4-`)|Ts9tT<ҕh_GpP*$~<Ƣfxe6j]sD_+QkƢ~<$p*ђyt#zN?1Q F{4 ֠o"lkooz駧2]]s>H|AANc+'O\244:h χ$נy#1K;:l-@uū~_ {cVXJy KE$4/EG"t݀O`P];07cǚN8a{Ƿ-9`<,㟀SAA!$:6|΅M[%CH~ҥK5sg0 At#`ϡsP7/PxվKկP+FTX8ʄps_fKd{U_>w~uxi:׼ x?y}tF>,-fF&&Nqէ~_>܍|L4AADOHm[a)<-c+=tE' ơvƭ_Ю؈쥨(3_%^usivϰrDxL@DruLpg'6A{^[k^I5ў zǭBU-(ƒqwÌbr>H|AADLa7Sqs Ɓc-O~o|yҶ7N5n~A(F 3P4=; 5Ovg`:Kף042CHaəxA  "dG E|m+gϞ}6a6@} #Wh+YtԼdlRnw&ao2qo nKET4j{m-"R{soѵoxퟁnq %r]Vnhҕ9 _{G9";"AK3qh|~:e!uNC3SdHÎ,CobR:?g6h4/];/,#AAo_6 mtՎގ }oG/4]IN.%3 -ih=]"2Qdt﷯L;0}y)j0u*Hu[~_W$+5s|=_oWbDwːh}yZ^ghzq2QB8vYު1h/5j4Z[> ֫W?񵶮$>  #:)1tJrrOkkk5ͽty(4-0AtoHlv02~(JGp%2CaEzWEi]JI30xHk*W 19;g},,2DM>i~+ce%K8#>>#O$>  ?y~㊔ "ݼy/ZZZ,;JW8Uy s&7n25$vW2vհ7 4[d%3] 52DFCבyΚo9iev5t[ѼٕtQQZ8T^kf}=._LCOlA9ŗU2/1ZU4<=7g$5&"e]]]EO  Pt/̮"tuTlW   X.sղ+)tO̍Y;]"rCJl8uk:#*4dȶ/HirR@mR@[t!V^%Gs|}]))2.Dfc 3;^"D(CBDnw y`4^pk9_QQ1?DI  HtKt+T*N#'Bs2K_:7rñkCNdž\1Cp` c$KJd?τ:{&49͙TY }xJ$7a\r&W_0)N.wkzrDJ8 $ǰ21\ɐ]!:BA R 5OOESQt4OFaC%LÝtD*SN-w  UUUU*%w{%R oĆUjpx5VV9BC(3\ ]}D*wC͂4I$˜Pf+cge*KPybʀ>o#:Ĉ"G1v(c0z01r`"Qx*BIq\tuB+ 1 slI|Au_&(:&":G1bOLǮ$IFAƾ4e)3 ͙ih9փ|KuR} їy50XC1T,5Rh[YJUVR4K-L:g&4͝e+]}7Ӯ ((CLn̟c\i0)ҝXuW0~M"gTT8V.;q] Dh$@m)x[kRČGd2п?/TϞHsйk6:^h1 ;] j|a,UtԽ05OoV*˸#u+:jt Ľ^:ATQѸ/7 OĆUjXņd 4[SڲЖ.reT`_09`X00(Vn+_7p|3E^OybD΂.7,^|XK}8Ne\ G Ҵ+MUOGhllDCCPWWܸq&@Ss}$>>+!>E`S  NvL$C(:nFt={AF ]i YOXYBiD'bC[6^:Kƫ ޭА.$)Zʜo_lqEXB*X7)b)^wzn'R4NſrD"g'BŗRZAy#B&"A@-@Ǜ9hoi@kk+Z[[҂&455͛:/VP fW.\nҤIRkeq1  Htw!:eدt ]8]O ܃ٕʚo7&*7<KmeQ Ʋ;!n ;[0\ ,ER$TutpcccwX| 6%AADHWV!yEJt|a): kih` aC%: Wiyf*j, 4ohyySȨԈpսs]׆ gaV.\r1Xpno"ëiWFʎҤFߋ!}6;8?j"DLy{nB@VCVCRall cccvddd&@j/bqPA]h9fܼ?Rlv7/@j$ j###92">o:k$ć/MI|A3@RYIOGu>})tq%Sޖgu,휅v'e  Wc)wm8wl \qMj>;>DJfȩY/= gu{!sfǁˎ6;&?D) RûNNV V FcOQT70R] QtJGEj@tHuw 凵fWs@ߛ{謹64`>33,2ć.s! _9V1Y :+W=8{ğа#nT:űc\pt2]ݻgO {!vox*7 G*]e gH!P%I! fŀˎ6R2DK RaR h4h4`0JCIF-hTASScSUJAOXZW|۸Zy6!fhZ.QS_D2oܸNZeKc.AAHN@R;ٹsqjg--cQJ]㒣gOmf>ˆq(woH W]NqaZ )-[l!YJ#Mx$[S91`cWVb!V"Q<Z%z5fL&as-X2 DJoG[˺=G ?:aˍW!3c\~p$?} ?e7 >h̅  er:c#p'Oˏ~ZGrh/|_%*9GU3]?k4x47gè\fOp߹kCJlpeZ`_sa`y£8!|^pGH^=d˲8,ڕ1t1T,waN$o>n.Pb[זK(?b/܏(߂Xr Jv|ٻ~y']\R|ADs:1iٳgt4pa؍Yu/L-9GU3v]1gk9o[9v8%R "8DGStnlBml~o.Y KpS6iwGqr?l)(St~ԝʗX68 Tg*@ĺ?䌾x*?$Gͮ$G`:zl7z._Ie=s! 
CI(2s:+w1 38M|LW@mPt~ Y*CjTEšXbS춦Op.mhKXJ-8wÙ5 p\|rA@[a$H~2S><.?c֡C7c/U{mvw2:ʷ@Uw.=:ӣG.`1|c.BAA]W̌|_Y|wuu`\؉Wr9)9܍uqVa K` RFTBQpDBPdPPiPk6ړ` c|pދL|[Yx<<"@49/m/7:6_aP8bÆ `{| u{Auux52fիO,{y~9&ϣP4\жs\tBrU 8mYqXp >"##-UBAi9ÏG^tA~]~FĻ?F_|)?Cۡ ٕV0r8Do4dg|wkl1BA]_9uR^_GW;ߣoBk/!{%/GKrۮp=bN;d8X N£`F 0 0ןz86̋+kb'"Q3+탈厾~LT~}IFT^}} 2^{a{|wc.{ AuuxyK}߾"_۷ Dc9) y YqˑpnWp?b/9LAXPcqZˎȓև * \nbDg0g²YaTO0qۋU| ;mܗƃА @͡h*Zꩧ2m.10c._Sۃ  CY9]+woܸq6X?AcC; Yf*4BK΃h1KȊr$XQ.69T!-<˟ƒ-?.khǙ5@[Wz{$]tpGˉ\#ihx/RrԪ[#/h]U&?rVm; ʗR---ӟ7PS  ( %q|W_}u`h+Ϯ~P|]ǟ@ݑDGsCh[5hL%9FVʑFUtqI+<#ZvpyB8(JqaEU]1P˹Cj*?Lv%O~He~8ȒG- ZV9!;Qshʷ[># CөS1ÛPS  pJ*5rҥKW|I} d1ws7GjF{tEWރ.jCl\E*ݨOK`P鼈nA8U@Abh#sD-hE<؋Ԫ[]+QkК-y>;0)W [z/,B8*3~AAdլT(ʤI|fnwnCgPRTtA{tCW:l@w\r9|-9UrC#V.F&˟j \8\Vh R+{qv*9=%UE^(>@'Fol>eّW>5iҤ{CM}–AA~MWǔW^ye%flcv!][==tdGIjۚڹ#]+dףxSׇU|y5Vt#ZсBMۃ  qTWǯ~8@ׯ |be`r_#t6`@ ^CȊ*"9,DF[Eț"]ɡ(hI|ّCm/غ>aP}elCNͼN61z`w{{{ CMq$= "-{N:L7}k7m7rVJG0TC =pv*H~vGQBK?(.4_e p]=[2DB #]SǑµp1XC1|Dp 9;g~'j믯dCM_aMmr! [5;ɡCjJaÆ_ +h4r3~l*9c+VF WlĈr[ɡ1BVlJHP 0-#LKB5)K>-z::CqSK^B8SU>ɏd1!6"qCy '0_rʟ3n;t{Ln ߮vy:f枋/n1Ct.CU-Qsȋ9 6cr3Fdz> 9< >t̊z hKyc+ـIA:(a& Viɼ]x*&?mzyY|l3Fn]V!kX}Q9Nͪ["?lD$C8"R#/TmrM~X><quRM^222r u{ADˎPvcOj>9Ƽ5n9܍8- mvG\ȊS.GH*]Ї(DePi&ϟ`Ãb#P{ b_a!;凧y.>ʖg-Cj]{!XX= gѫW>nHzA$:|L:Ѭ)۶m7a;A[!gflNtToJn9#+a-9Y~_)*-N$%Q#`O^&4P)/ZJ#?\]C\UT|wq +1p ̺QzeWRAAv ,eÓ+c}9'Zn':ĂH])ɡVCld%s9BxM-(E`<ĠRZtA 4 LLChz1vէ&MMHzAD:]󟧍]AySP+ u{lD۱K7Mt m0ex1#+5p <>(eMjq֙0 ٠R(_m0j>.GKj^yf*j0 wSOMQzX=@S jWg=kҤI\zO,˪:$Wt cO jw'nO2)3 mEEc+qa:S[Q7Ur,9?(Խc)TLN tz(x^~"Xv؋lȋOg6ŇC4b44/@sQ4u{>?2e{Ј AAQvHuu &ظq/GFFޤ+w^ :Z`:ڏCWZp[q=dM+Q!?!h>$54pdr *- |0zoo!4dst~ϻ>l=N|>.rs>zJ6h wǡu,<Ӆ7j}_ZAD_uuX~S0=&|=>0 _f2ss7{dNTusHx.   ټsf@}.p2FVGaZkևl+~ԅzmm9]kׇ+ѓݻhećWAf;.A7@S_v{ :Z:..0SΝr #E?fw,ѓ=} џekmm@t2wzS[ۀ؍DꦕP 0E3eɁټrA>pfh9q> *rنV܉GnLa}#O|<7 [`7r [xZ=AA$;a8zbN73z#!GEGB /HrANc+VpVX !àX > *yA>wCv8&_y* F|X7X VXpgIHn2п?I|tuMG3SQtV=fhL]'O\aۃ  {!w-T?lmmҵƅ0j1l]Bt*\-^Mz;!!ENmP7GLQA# * ~e,8 QN>M0Rw}z=[ć|֮;\C9^g͊?d@Sa5> AADtW9rdN{k{yߗ,cK*Y u2]V(9d-rItx`;і$/Ę>?(hplAxfQPićr pz<?}sR: vcPB|H 3szw^oOsqhJ#.AD_ں:&O|occ+t}2G}fMt4앗Rt.t)eKa8Z4nlQt_HtLbQL"ePit>TANs5uO /Q&>P?O븋XAڬDh$@}$aA >^ctP__ɓt{| \)I ",d?GXH>ÿ}+.V}z:4EF :teKW,^`>)$:^|J$,|&՛_򁤍l^._ v KG!a>J>+DL@|4,>wcj`ll۶m 㼾;Afva$#Lzٕf)h;?As ;qsʚP,AFRt-:NoSS4$ * f%w<ؼ2bټicL*M'Y-O3fޅ0,1+DćXG ;| tAl6o[˸4C#.AD_NԔI #jvۋ&1ԝ0*\ r)L%0/W~UrӊSS`<,Fܔr۝bc $`Ύ=N|M^gՎBFs}ǎsk)AQ!;܍HW#,w1 s˗8NM@ҡPgټRyPK1&T`K*)ZBtXJDLQBr_% ~ *_c^N?]^\)~ GfŁ͎ 7:`/>z؉R]cl*A ˪?Ǚ@S#.n +;Dǿ dm"]#,qqq??IWv+f'{DIIt5Eɇ%.lWd>$'#= 縐9>XLrs?'ɎxkU|HntK={OWrG\(׃ i-,w1 sOQQQ`h=:rbU,:%Kl"e /%HJ[x0-')"EyaW83_WXL%7ԃ">جXćv.\ZCw-]40 yyy},4BAu^uˌ]3s:"jv' $u%:F֭+ވKoZnjA\Sy$$!;4s9%koq"A$$%=<<#sfňQ.Cv 3*[`Ӛ禡|{8Mr7n<Lj|r24Gć0tFaG|GK3qR!#.'Onq7?#.A3)vtG7܃S/ens:\m^ftBb7C)_CޣIB|MmGI*=|MqQ2?[48o^c%wyLb[] C!wTGӾ$h[.хD8Z7W~NG1pwYNT0/eA|XMC >:{4gց?dG\\\ "×+ga.0Sx≩ ]xZ-n޼i갊]]]A__188!`tt~>r94g mB/DR_͂#oL+Y_=pNW*S[ kOU (;(h_в"啋d=/%0*yoP2^kJNUR/H=XП} ѓ=]Yqxe@:2и/u{Q+ MY..4ͧg  Jv6rf ]ec7rM&:mc``6166Z FFN^@ uI :xqvItK#,#e 7UE*W$)b^ XI>q;Va/>bKmC >%KlcT >ң?o!zs[ >Z@:nOE^^|ؕ߁ի"ÏzV3[\#.B͈ I §+g4iҽMMM{3v$V 4_CSC*:mCRA@BՎF& ff,˂ӎ-Gt.xky1 }`KH=R^@-s5gUqBC_L!ßB;a˝l39c\|X=ե|U| ćۣ;at{Ot^ {&Mt/3յ$= Htx-;ՅAv|aɌH^ٳ:44:p's@3SQt40M/D3t,GWSy}c###6ѡVmCىhfٕ h{x0PnN ( ,G"wy }ZQ󰭰|iáC&9$!U;kPU|h:2Z ŇؘۣfWZ8N>ܹsƈzLz+CaADeX^7^zi^oKǚ0WPt=7 /gh1 F9=}2{! 64Ó1kËm[XvW_}q^](=&1#. 
> 'Q__@T4؊LuuXWZwB&:c:oP<++~JVL9sa>/~ܭf~Ql ǎ .N͐Ri;R\dY*+ yW_CLH ^4_W$ϻ;\|&Q߻;B_lF|o\vc.sO`ڨ3\ 3% pRǼ0y_QvĠ晩޹sxWm|Eaj D!+aʊl]l^)H^}\4з:4 \͑Z[)ԏzlҝ$ J__.f|lׁn0MTzl%Ut1WzJ' (@R~?`پbVM+~p!,-EI$&%;B*Ó1kGP7g~ #?׃L Hv]vXIy6OLh4ף5j[8ju갎t _D#ՑeͬUtYq`$VfUk疧h8 pqީsA\:ҹ0̅dX 0׿n] ӥC0 *@G(9;(y-p$(|]YsJ"bc.X= 5N_2AzXs=܅#׃ "!!'Ԛq0:ujlo7Q3Pҗfh9 AמtA8CfcPF'`H"TGJ&+DBc\aI9'^v Xp᪲cu}@ӛn) j#f˟ .*PkR)), $J`,Ti +Z㉭0_7 N@_0ZTE0--Mrfzz3P6tBY3J~ϱ*8"pJPe._b0W,|`P.^ z*ASƗt)J`d n-`b/B"-DwteGyh=鸙}ݓ4G|1Z#ż(=9 НG<@Ӂt4OEݞNBG}^]aw2aV9A$:&N*VaNo}h~ w1 v]eavu$غ:EIWwv+65}T@Yg]FbEKѡ/ ]Q"+`+Bw& ס?ީCL|ny׏+bf$*|wHXx@A]>{ȟ@E@61Ft{˖u{ -nn}i~&=:::1zAXᤎcٳ:<|1 @'?rXGÓ*Vt?hc`ozspGC@@kª\Eb²cVQLb²#(I &>XX=\Ԃ1nE=:+q0bddsq3#fJ\ Hv- |*;}t '4]vuW]#]#KWG@W|Q9quG܉߂ځU+lMvJEdGY:Lۡ937NCX$=_϶Q':l b $OrHZx?.ƒ˟Ƈ-,tۘt]HsGnPE&NWOLHXrLi Adf?ߘYt]PRaW0Ԙ>+%V*`F;[AClҤ:'@A?[0~M*gB灳PvX8zݑ>ammGq\蜭n\1Wnan߁5jRzLys7\HzADp.Iy-E;*wţi^uuĺ:W+&:MPfї( q]@_6Sa %bNtPeǂwwP]bfuf_ gd|7r n*a.[SisWǛ;5П!ǁ![k *JENn (Kτw`:;[1PSH7 XsTzL/ mp! vUvlbOkoFsf jz7]bf]}P?ݳ4*ez! %BWgu{1L7NCW H2%4;ac+Ib THb<{]> [9\~/?ҔQ{?0;0S9_HzADt&RD V&jӮUPRRl>4\ ]t W rQuv,tCwt) c-刋[/9n^b p=JP$ˉ%V-3sa|WjjD=j`¨ %:F|V/Zzu|jvuXdGIMv+IzrF\\z_wj͍V|/I&R 0 ?dPS_w{[=MKäugNk?"=b78JHzA`[7'333d2uGۛ[ߓ`[7kh9 ,֮U|GGhUDw`V F_ca%?et>s-J:@3‚vBBzpZޯ pjw2[2oc<4F!ab&=ñۣ0a`Mѷd2eee0 s@z|C  BLvxvz˲cʱf4jwǠ~OSWG9Q1T7˿]WvnTtТ.;<>=nGs|Aþ8̌Gx`L^vxfT'u-Mq{ę-RBLZgKlevJPwv\ﻠ aA Xǹ vmj(űIn"TJz%ǒ u{=h1hɀq%F9pҥ2=AR#,B1N^FݾxEþ84H@DIB%C82t0.T]!jU؅˟(<;0_pTb{y/A"&;%- ]J@3DRCFnjtahlC Vbb>[&PF&@s [nuR˖ad FKx1T$hh:}qqRuM;%pm$Akk BLtTvL4޶ܨj4}1AݞX4eGD$dD}W cxٵ)2j>Ém/mq4)ԓ[n Zt: @pGWHw_8A",J+l=^v{t礣+; G/=ңfw :ׁ5j|(abYe$;U^<Ɏp>ücp1 ??=9IFǑ$JDDC)FR 3Nf|m/I _ٻ2k֬Mo#KP;Vde.:xїT Vad0>{.Y|W/;7}Wm \v-U㻋hJ^~]L}# 5W!B١+Ma276hF^-O`B@J)T@PdQi!!b[=Ma KAwN2:n=XU8@z|ADt/Z$j#I.JHq:Tw]l ,lmk +Ie .^~ h!0ЦFeEW<tx Gb"]]z=Z-z=tbZQ,<OvX< 8#B t{X7˗v{$F\1\f'=zrѕdpq l]||_2kk;IzA76K.dǦM~ޡV4;aO`a0ZUie$Jvusm]\Ndtu! /W0?쎭;?9xIŇ*M(=Q%&vls+^hVJz( 8i=n􌳤]xpd~#*Fn|ۃ=AU^z )#.7Ңj^cMAa*;yY-ZGk8maX:,Cc)]<[0箎Y1 bٵ Ŵj!~D*4*Yy)Nc2>suFQ:#LBCvDki4~_aXú֠\:h\lJ30Vtrp5R/ڳQs ^z)FBzL 0ّd2b5Qdz^vK3.M, `K1;`R6R:b)E]ټ^POWjxLJDG^r;ԅ"Y ;йKCGvDkik&DߏcXxQt.m{ ·x|\fjUww ]am)N\APB922v0)nW[\Z%9zWך/_$0,wkAA&";&]r%*dI5쐵,t# \yXE 8e y C3[ud`NFZ&!o3-NmGHKMrI !/;l҃:}^ )S;Fy2##.5cМ3?Zr=O?Y/ǿ A'کh;?FC^pU^Et-]22%e7씋7 zZ* dDz?=ySӱNnNm .(ݳ%0kh haf2n诃*a4EP$B}QABs)hmaX c-7Syi:( !gZ׹ވC҃ " :;D?aWA!|%ڌ0Z>+ ;a(]|h S0'edink986;^xO~['[\ os=AFğmmmk$= ,;&Mt_GGGI ZA^G@vxaTZOM<`p 4=Y%+UAFtyV.h7T=AG0 0'nS)a~;[\s9eG  (4i S Hx-7IͶph;6Y'y`W03B.Saݑ)uvCsk .p%5bt{^w1"VNGD[G^>̴cXsKb_eA$;5;xX m+DI;&!aa)*J ؈ڪ$9l8%@֋d9=rz3pl%_vw<|)S;Èյr=:L5cR fh2ҙ hAe5& 3;L#hJ '9yb+gAпĀ+y8xmXCs< Sa`+rt/| F\rz8EZ#]zI Hvx';\vv455OYG~yr:LGX9.BRve mY ũ&b,?yTd=Bh:XKmkR 0?Jq5ޯL\20 A_ʎڗ# fjjƊp1d^нBUEsm, 6cǷX@Ų #VvOO *=d 7ا"aNA쐐Ȏ˕ׯ_!QGzpR9yެS`Vd_vC0^xB[8.GY#^xO(K%VJzKmL6#=:=>O҃ Heǝn$G8mbi?2WRvH^zRP]0/)!(_uAD vvLb+ ;f澷zk3s$kz{c` '=l9ɵlbc4 h2Ud@Ȁ^cy0Y>;UU+OLnǃˎ nƂ+`j4>N0h*¬SY Iυ0 Su2 !y#>a~K3U\Rp-e mb:F/y 'H[f/a,OSKS߹FK1Rbb 3MFWN2EzC^~mi3b"}qrdpA҃ Ȏ/Ȏo1 zӧ8h8ȿ1fˎ$;1P,;ԥZDUv3le[ʋ`+R~E(Z(&ƒUڄV nǰ@xA(ONAԟr!Pߋ{[ߕ*ƥ<&=J^zKӡ*XiMz K$'Q}_m6;wn-0Z߱IzCFȎt1q lyuOdG.N&0(3`Po H!i"w-ˮPD1fEIЗ̅x.TNC!TU._Vtb#ؓWx T>D zeQs!8.=qa?@2(apq1\n'=Gc&ǫM{>qڊqLY҃6),ˎD Ɏ/?Nv du*;㲃ȀՑʷldt ]x gD+ N H#ERx ƁܙT\yFp`D?zP#DlE:HcДeؤǨOǥGEf7 ˎeee "dǿȔ_=\d7я3.;Z%QvUٸ9ɎV ;BîJ\C]k _ȎȎ{S| ye쉱&)kg]nb OG\R+c0ZRC qoj sqf3P# VH26HJږC\jĠ獗pޡh͙3?-AD? 
A'A] b_*)B [@ĺW?gMAltb3 =aLlA( BRrC*Hoh=5Ƒ5e %Ǒ%Gs' zJKKF#566ݎrl_޻+':}v tJRXab8*TGydw}xzq_xʕ;CC0Z%c>Y2S!X ^q):;;+@O b+@%T2P+B(og'm}Ɲ;ȟ7@ؓˉ-\N߿SlQMmOLKn F8= 8Ӻ L}%u`tjBؙ@NWܳj 9bh4L:u61+5# ʹ\GG-~O&a̝;wD>z@RE>kɫtZ} N,s|`.5y+?,FV:::+@Z3wi L }*%T+BO\bsp& 2D'!B_EI8>K chL\ pE=KȓؐԆR!ԽBZ_r4DM{&y$j38'%GyTHmşt.txB_>DG*IufaP{5|7<~bcŊ M&a"Q1頮BjIKuɳ.y6է̦sCȡTOZ[[- @} +@$T +AB:Awb '}ysff`2DLxGJ:$["̩C[v&e=HYBr/8:ӤVT =~靚q!)Uʒ>9~s܎[{CCÙu-aB I` IN~" !NG4\MQaƼKpmlW_<2I=3z_ե?M$o1j-_!)_tSjDGKI 1ڎUN_|<1𸩧ZOKRicT fP$mTSMM Q}}}@4f?r(2D]B}iy?~!!)#Ixƒ;{~IOl֔x*+J"0+ <ƼCNd w8 W`9sRC[ne􊊔H򿎦򿎦ҏҏqԜuNv 0EGirrP[H$!Q =^CCM~ŦGre;u?棯i<*6MX|{L&L&ʏ;a`!uN` /ADɘGfSԗ?΃ޑ"δYRz ˲?~Arhv葦LaE|S !%1 p Dr&R_'9Il%7yohzaG8j8:.':Pr 'r=N_gFQz`=ZsWEi:KD/jvj0#_***ⱐ VdHbrΦӛƙ%qZHrvrv,3Hx⽝HpF 'Е=[8@deMO~8VRO VlhJS tV7%:rJq#ңfq 0HiRSk.z{w1G 1qu4Rtj,o*--r1Sžk{ŋPR j$Z—!""{88Õ!RB/E\Ut$IΦ+L/G si-vI 0.Cho)4L%W!%2d_h윱${AFl#7 kID;8'c>u1z=8XE_CtDȊa[jW]4 nyq餎jll*-N;qo f 3L{S?{%Hm,KMi807MӒ1/0Ckf]Z@׳8"$:s!]lN3ޱ IQoo/Nu[$LS_洐q L_S6Z>AAhfPoJ-\śĐ)X4ƖȞly+ӝԕ>_l#-.0s7)ԖdLd?M{'Q^;L~5jwzwLIcɔ4R1Tu ?M埌2@Hp@tȉ[8R+Zz:"]JT0G$bZ\*6ZG-n@墎]K^DeRߔJI0 rm_ 1mKʐv \D)3]ӝ*,DRĞmp8zzzQ?$sW :?+, R"CLf řΜ^A AR]@j {'_nܨ:? hcz j{.$wW3`?lssy ~ZW" CN#RSZsQRlk9CޖRjr=s$+@)1 OJ4DH I k]?ړ#:ŤH\K6-0v#sȖ1lޮqCl+8mqd˘J/2dF@h=Yɚ ˨-%N@j(Kl6vMJPzzOd7 ݇7l6[o=Atucf RZᖚDʕ+z<^'=܎v*?Lߢǩ| UD, "$B!2$#,{gM"ɳbuvv+0u0!81>ӧWPxA #L#ػ r"/3Fjˡi=0STj%6?qm8UrTw-u*|Y3<#:~PZH1ZKI >tM{h=EKB#..nŋS\.W7nלg?'%A*wH4P"Dh5& CgSӮԲ{f`,{x"mP[[+j%JNmjy*X H GMʃ('ס#kq ˂%.* 7!0Ćlr㓱Toy"gk~m/++˝5k|?r>}cJM{#>~xa1w}mF_z:Vuڊ?rϜ@IgR՞x272ϢflMMu)s.eէ&e.5N;c9#0MeLjI]L:#Y,immeoXSSđ5;L vH6G#M?N_|cyH*ChZ2Ҡ1Oo,qT{"HuSszwmOqdJGUIrX6*z1TɘOԘN*ю_alxb<0':቎_3rGFCk)) =–zQlx'>Fz뭓bpk=\;Al3QWIz3$ӾyAB"/C"DPM5[Q]c? }Bd 2_fs455QSS5/F<,2rs]@>p(y>N)Ԓ6OʌH]ٿjċJ K )atGy/Q%ySv{ݱcǶ|͓Ȉ_hzF{}~hLj#n߾}H[ۀ8娗ө[T!b2jO<=FOI]t*UoF[Q gj? >0 aoB Bid";jBF*8Ic |Q+4|S!U_H- ^OUIC\Qɘ Qh6SPRkDۉ_8@Dޣe_1b ’jJMR|kǯx{ha<3rY7q_ kT֛ 9=hE/B;fTTqdJ^C 2Ld2ڵkT]]kר3rd45-BQkb`? {YIKqsM!+2Bdo, T;*vq<*6ETj?G{fQád='.+ gk䪯tƍwm+:∎ :GO@PCMCM_|Ӌ(Hk;fz%/$TAS$*d&{h4REEE`*++ň/%b6-6qmJjvO 2@`$BG)!hdA!(38Scn?Isɸk>UAi˩ejq@9N R'L0#:\0E-t ]G\*6C|_|02=|0oƉmZ[t咫AB+'L&F*--*--/r^1?@OQTlXOG$3׶ _D{?]+Z/:! "$inK d8զ?MMy5uՌ76oɌp)_t+ :R!:MRH +ć5H{DK|.#>a2 3 eMP_o٫QlZ 6 T UNIX*S*=Ɣ%!a"4|(|,˿%OQe$>3_ skz~!J QMz\xHlٲe=0e>ϱ;}mRFBtD+U|GG4OUӧ=wna<P9tz͗~s.ڿIOOmE9yjafȑ#o߾}cCC"B nyg#Jζ*k)'G헁SwYA\=DKNC!vv_K99U[Wb\2 o5jxPspEm*D=Dx#=J{-6)-8|#v%W^p\.++MLL\„9k+őbzU%G̪)$EOлTQZ .Zwy#?ef7<`0mX.`HYUTT> =mElm/"勎|~鹺г1)>! !0̘7&\.|-,++yW1i%k+RrA":#}m$ŇS~ꃿM}>nɯ744 /0(X,K 9%ń9a{~Bt%} w页CKi$Q >z>Ԭ>f͚5xfmv{ R Ìe9VRtsڞĴ.ሏJ]JN Ì{UVVq8?bm9iĈ|ُ2Bpr\!v@t#":uԇغ`㮻ZXXgjAA{?OuB%Z9Ԉ2SEW0Ň~j{>]~JRb'ffEoZf]p!uڵKGГVĎJ#҉@|İPSpo+/~ FC0̸ 6$;wnj-%"BbVw *HpVVne9"hY!.>^P&W~HJ}ǤIpXeMQQчsI 1b++z904Ň=ZNwQкZ}5jno.@WUZ*r dhI4WrWt/h|hP)`8/mmmWXu+@G_[[/r/aUjOZ#͡g)DćP4=Eʋ^~xX};&%%mzjFgggPhߗ]}af)S +++ꨯE-dSrއ$=L*RHh9 : >1T| V!"1bĸwO={vGKK%e{UaٳgkNTKq%XCkC|To-wW|h9E˺ԇ /R+/zQZV_?eFo>Y\\ Ս =<b\pBjRR&NF{/9I!9垲"%!WBfm%"Ra(>CCCT/!&@ we 0coᆉ6lH,** م >nb\:w-[p7N0S1*B}zIdS##pV^•r/wKnK9rW_}uS> * @t:gN:/zĈU ))qxTOɡW7D#%+/ђWD~M 0_zjj-x<} @T76HAA{6lH8~Έ8nOEJrE":Hs]G$]"E~Hu~H"G8fԨQ56p.7f7n߿!V2VpHwqȭh=]E{ei0hG8<CMCvpKPaxk0 Ì:uW\9Vɍo/_|099yԩS2 #Wp(KqĚ&>!itE,K$H|||BRRҦmHПܸpB`x{Æ "r/8fBOQ*8k*b]ZVUbaN}E~(9VkCH\;# w-[ U,:U 16 .&''>iҤ9<1FJoT#8~\Vr ʊ4D0NCʋ1[/b>F:V'mذ!`0}=rTel6sʕCOyS'7)b %0?|1ВCNt[iȏG4Sѐzh w{e)9 !c"dժUK_Rl\z5'6Rb rCJpT7šTrHɡ# ,w#T#՗p_  JYr7O^~Ϝ9d4󛛛u8fez4J 0777_4gΜٶo߾ׯ_o, 6Rbk)br> ?gBKFz ~C/9~(䈤Tk+ CG,5Jz%@LB rŨjDXaƏ5jڵkڵ뷟_KJJf6k=6뒒'Ou׮]]zQFMH % Ԇ\rCMzCh=E௩hMqYUQ"9r #CI/r(Y}!TIHIui"DH<* CBd0ĭZjiRRҦϟ?bp8YuA ބvvvVQVV{=7|#Fg5b?rC,CJp2T$G$`G凒^~~hI ! 
"sB "!/)#BR!BQ!2r zS>oӪ566~mZ{zzZXb^*pXƯ^p!ԩS y #GxZx@Fl(YKJZܚ>p%h9 :@̈H-?Nh] \ # "$B!JdX2DLIaf]w5mK{)))o?~'dL"|JWW׵erGKgg絶+fd**))>|rQQG)))o{W^""CxP#57Un(Mo J8RJ%* Sr fCOCMCO&"$A z2!"%EĈ >bŊ%7o^u֗~⤳g|+WdVVV.l6X,:::*:;;vp4stv\n˲lt:;=n訰X,f/*++\rpgϞS\\TTTavv;iiiغuK7o^bŊ%=L5\11F:?@f(BR?%3F~-'t1z )*bI-Z%`HˏpS ?DJȥ@Ē b w!jdW'ˮD3J䈘$y1FbfȌЈyK1yBHJb<,dP"6-7Ԭ(jSH(? Q+AaAD "?aWeWH $\YTRDZɋ_\3pFFjhbk)J@ H8 9C՗p_$4T"D Q*D#B+J"5wˌҟ~P^I !Tf&:*5ĆXjC/dTTc$DnJ~"&@t(Ih !B2DQ#E䈜 {d~qC]ĐJGl6R䆒Z5hJ9C.uEQ*Aʉ2DQ+F!Y"&MHJm*$ƿK>`=dHPq 6@DI W1s ; +dWG"dIX|.H ` ʤ%tEXtdate:create2018-12-15T21:55:18+00:00{f%tEXtdate:modify2018-12-15T21:55:18+00:00 FIENDB`aiohttp-3.6.2/docs/old-logo.svg0000644000175100001650000014174613547410117016667 0ustar vstsdocker00000000000000 aiohttp-icon Created with Sketch. aiohttp-3.6.2/docs/powered_by.rst0000644000175100001650000000330513547410117017307 0ustar vstsdocker00000000000000.. _aiohttp-powered-by: Powered by aiohttp ================== Web sites powered by aiohttp. Feel free to fork documentation on github, add a link to your site and make a Pull Request! * `Farmer Business Network `_ * `Home Assistant `_ * `KeepSafe `_ * `Skyscanner Hotels `_ * `Ocean S.A. `_ * `GNS3 `_ * `TutorCruncher socket `_ * `Morpheus messaging microservice `_ * `Eyepea - Custom telephony solutions `_ * `ALLOcloud - Telephony in the cloud `_ * `helpmanual - comprehensive help and man page database `_ * `bedevere `_ - CPython's GitHub bot, helps maintain and identify issues with a CPython pull request. * `miss-islington `_ - CPython's GitHub bot, backports and merge CPython's pull requests * `noa technologies - Bike-sharing management platform `_ - SSE endpoint, pushes real time updates of bikes location. * `Wargaming: World of Tanks `_ * `Yandex `_ * `Rambler `_ * `Escargot `_ - Chat server * `Prom.ua `_ - Online trading platform * `globo.com `_ - (some parts) Brazilian largest media portal * `Glose `_ - Social reader for E-Books * `Emoji Generator `_ - Text icon generator aiohttp-3.6.2/docs/signals.rst0000644000175100001650000000220413547410117016605 0ustar vstsdocker00000000000000Signals ======= .. currentmodule:: aiohttp Signal is a list of registered asynchronous callbacks. The signal's life-cycle has two stages: after creation its content could be filled by using standard list operations: ``sig.append()`` etc. After ``sig.freeze()`` call the signal is *frozen*: adding, removing and dropping callbacks are forbidden. The only available operation is calling previously registered callbacks by ``await sig.send(data)``. For concrete usage examples see :ref:`signals in aiohttp.web ` chapter. .. versionchanged:: 3.0 ``sig.send()`` call is forbidden for non-frozen signal. Support for regular (non-async) callbacks is dropped. All callbacks should be async functions. .. class:: Signal The signal, implements :class:`collections.abc.MutableSequence` interface. .. comethod:: send(*args, **kwargs) Call all registered callbacks one by one starting from the begin of list. .. attribute:: frozen ``True`` if :meth:`freeze` was called, read-only property. .. method:: freeze() Freeze the list. After the call any content modification is forbidden. 
aiohttp-3.6.2/docs/spelling_wordlist.txt0000644000175100001650000000517313547410117020724 0ustar vstsdocker00000000000000
abc aiodns aioes aiohttp aiohttpdemo aiohttp’s aiopg alives api api’s app app’s apps arg
Arsenic async asyncio auth autocalculated autodetection autogenerates autogeneration awaitable
backend backends backport Backport Backporting backports BaseEventLoop basename BasicAuth
BodyPartReader boolean botocore bugfix Bugfixes builtin BytesIO cchardet cChardet Changelog
charset charsetdetect chunked chunking CIMultiDict ClientSession cls cmd codec Codings
committer committers config Config configs conjunction contextmanager CookieJar coroutine
Coroutine coroutines cpu CPython css ctor Ctrl cython Cython Cythonize cythonized de
deduplicate # de-facto: deprecations DER Dev dict Dict Discord django Django dns DNSResolver
docstring Dup elasticsearch encodings env environ eof epoll Facebook facto fallback fallbacks
filename finalizers frontend getall gethostbyname github google gunicorn gunicorn’s gzipped
hackish highlevel hostnames HTTPException HttpProcessingError httpretty https impl
incapsulates Indices infos initializer inline intaking io ip IP ipdb IPv ish iterable
iterables javascript Jinja json keepalive keepalived keepalives keepaliving kwarg latin linux
localhost Locator login lookup lookups lossless Mako manylinux metadata microservice
middleware middlewares miltidict misbehaviors misformed Mongo msg MsgType multi multidict
multidict’s multidicts Multidicts multipart Multipart mypy Nagle Nagle’s namedtuple
nameservers namespace nginx Nginx Nikolay noop nowait OAuth Online optimizations os outcoming
Overridable Paolini param params pathlib peername ping pipelining pluggable plugin poller
pong Postgres pre proactor programmatically proxied PRs pubsub Punycode py pyenv pyflakes
pytest Pytest Quickstart quote’s readonly readpayload rebase redirections Redis refactor
refactored refactoring regex regexps regexs reloader renderer renderers repo repr repr’s
RequestContextManager request’s Request’s requote requoting resolvehost resolvers reusage
reuseconn Runit runtime sa Satisfiable schemas sendfile serializable shourtcuts skipuntil
Skyscanner SocketSocketTransport ssl SSLContext startup subapplication subclasses submodules
subpackage subprotocol subprotocols subtype supervisord Supervisord Svetlov symlink symlinks
syscall syscalls Systemd tarball TCP teardown Teardown TestClient Testsuite Tf timestamps TLS
toolbar toplevel towncrier tp tuples UI un unawaited unclosed unhandled unicode unittest
Unittest unix unsets unstripped upstr url urldispatcher urlencoded url’s urls utf utils
uvloop vcvarsall waituntil webapp websocket websocket’s websockets Websockets wildcard
Workflow ws wsgi WSMessage WSMsgType wss www xxx yarl
aiohttp-3.6.2/docs/streams.rst0000644000175100001650000001116213547410117016626 0ustar vstsdocker00000000000000
.. _aiohttp-streams:

Streaming API
=============

.. currentmodule:: aiohttp

``aiohttp`` uses streams for retrieving *BODIES*:
:attr:`aiohttp.web.Request.content` and
:attr:`aiohttp.ClientResponse.content` are properties with stream API.

.. class:: StreamReader

   The reader of an incoming stream.

   Users should never instantiate streams manually but use the existing
   :attr:`aiohttp.web.Request.content` and
   :attr:`aiohttp.ClientResponse.content` properties for accessing raw
   BODY data.

Reading Methods
---------------

.. comethod:: StreamReader.read(n=-1)

   Read up to *n* bytes. If *n* is not provided, or set to ``-1``, read
   until EOF and return all read bytes.

   If the EOF was received and the internal buffer is empty, return an
   empty bytes object.

   :param int n: how many bytes to read, ``-1`` for the whole stream.

   :return bytes: the given data
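For example (an illustrative sketch only; the ``session`` and ``url``
objects and the helper names are assumptions, not part of the
reference), the whole payload or fixed-size portions of it can be read
like this::

   async def read_whole_body(session, url):
       async with session.get(url) as resp:
           # resp.content is a StreamReader; read() without an argument
           # buffers the complete payload in memory.
           return await resp.content.read()

   async def read_in_portions(session, url, portion_size=1024):
       portions = []
       async with session.get(url) as resp:
           while True:
               portion = await resp.content.read(portion_size)
               if not portion:      # an empty bytes object means EOF
                   break
               portions.append(portion)
       return b''.join(portions)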
.. comethod:: StreamReader.readany()

   Read the next data portion from the stream.

   Returns immediately if the internal buffer has data.

   :return bytes: the given data

.. comethod:: StreamReader.readexactly(n)

   Read exactly *n* bytes.

   Raise an :exc:`asyncio.IncompleteReadError` if the end of the stream
   is reached before *n* can be read; the
   :attr:`asyncio.IncompleteReadError.partial` attribute of the
   exception contains the partially read bytes.

   :param int n: how many bytes to read.

   :return bytes: the given data

.. comethod:: StreamReader.readline()

   Read one line, where “line” is a sequence of bytes ending
   with ``\n``.

   If EOF is received, and ``\n`` was not found, the method will return
   the partially read bytes.

   If the EOF was received and the internal buffer is empty, return an
   empty bytes object.

   :return bytes: the given line

.. comethod:: StreamReader.readchunk()

   Read a chunk of data as it was received by the server.

   Returns a tuple of (data, end_of_HTTP_chunk).

   When chunked transfer encoding is used, end_of_HTTP_chunk is a
   :class:`bool` indicating if the end of the data corresponds to the
   end of a HTTP chunk, otherwise it is always ``False``.

   :return tuple[bytes, bool]: a chunk of data and a :class:`bool` that
      is ``True`` when the end of the returned chunk corresponds to the
      end of a HTTP chunk.

Asynchronous Iteration Support
------------------------------

Stream reader supports asynchronous iteration over BODY.

By default it iterates over lines::

   async for line in response.content:
       print(line)

Also there are methods for iterating over data chunks with maximum size
limit and over any available data.

.. comethod:: StreamReader.iter_chunked(n)
   :async-for:

   Iterates over data chunks with maximum size limit::

      async for data in response.content.iter_chunked(1024):
          print(data)

.. comethod:: StreamReader.iter_any()
   :async-for:

   Iterates over data chunks in order of intaking them into the stream::

      async for data in response.content.iter_any():
          print(data)

.. comethod:: StreamReader.iter_chunks()
   :async-for:

   Iterates over data chunks as received from the server::

      async for data, _ in response.content.iter_chunks():
          print(data)

   If chunked transfer encoding is used, the original http chunks
   formatting can be retrieved by reading the second element of the
   returned tuples::

      buffer = b""

      async for data, end_of_http_chunk in response.content.iter_chunks():
          buffer += data
          if end_of_http_chunk:
              print(buffer)
              buffer = b""

Helpers
-------

.. method:: StreamReader.exception()

   Get the exception that occurred on data reading.

.. method:: is_eof()

   Return ``True`` if EOF was reached.

   The internal buffer may not be empty at the moment.

   .. seealso::

      :meth:`StreamReader.at_eof()`

.. method:: StreamReader.at_eof()

   Return ``True`` if the buffer is empty and EOF was reached.

.. method:: StreamReader.read_nowait(n=None)

   Returns data from the internal buffer if any, an empty bytes object
   otherwise.

   Raises :exc:`RuntimeError` if another coroutine is waiting for the
   stream.

   :param int n: how many bytes to read, ``-1`` for the whole internal
      buffer.

   :return bytes: the given data

.. method:: StreamReader.unread_data(data)

   Rollback reading some data from the stream, inserting it to the
   buffer head.

   :param bytes data: data to push back into the stream.

   .. warning:: The method does not wake up waiters.

      E.g. :meth:`~StreamReader.read()` will not be resumed.

.. comethod:: wait_eof()

   Wait for EOF.

   The given data may be accessible by upcoming read calls.
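To tie the helpers together (a sketch only; the function name
``sniff_payload`` and the gzip check are illustrative assumptions, not
part of the reference), a handler can peek at the first bytes of a
stream and push them back with :meth:`StreamReader.unread_data` so that
later reads still see the complete payload::

   async def sniff_payload(reader):
       # ``reader`` is an aiohttp.StreamReader, e.g. ``request.content``
       prefix = await reader.read(2)        # consume the first two bytes
       reader.unread_data(prefix)           # push them back onto the buffer head
       is_gzip = prefix == b'\x1f\x8b'      # gzip magic number
       body = await reader.read()           # full body, prefix included
       return is_gzip, body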
The given data may be accessible by upcoming read calls. aiohttp-3.6.2/docs/structures.rst0000644000175100001650000000234413547410117017375 0ustar vstsdocker00000000000000.. _aiohttp-structures: Common data structures ====================== .. module:: aiohttp .. currentmodule:: aiohttp Common data structures used by *aiohttp* internally. FrozenList ---------- A list-like structure which implements :class:`collections.abc.MutableSequence`. The list is *mutable* unless :meth:`FrozenList.freeze` is called, after that the list modification raises :exc:`RuntimeError`. .. class:: FrozenList(items) Construct a new *non-frozen* list from *items* iterable. The list implements all :class:`collections.abc.MutableSequence` methods plus two additional APIs. .. attribute:: frozen A read-only property, ``True`` is the list is *frozen* (modifications are forbidden). .. method:: freeze() Freeze the list. There is no way to *thaw* it back. ChainMapProxy ------------- An *immutable* version of :class:`collections.ChainMap`. Internally the proxy is a list of mappings (dictionaries), if the requested key is not present in the first mapping the second is looked up and so on. The class supports :class:`collections.abc.Mapping` interface. .. class:: ChainMapProxy(maps) Create a new chained mapping proxy from a list of mappings (*maps*). .. versionadded:: 3.2 aiohttp-3.6.2/docs/testing.rst0000644000175100001650000005671213547410117016637 0ustar vstsdocker00000000000000.. _aiohttp-testing: Testing ======= .. currentmodule:: aiohttp.test_utils Testing aiohttp web servers --------------------------- aiohttp provides plugin for *pytest* making writing web server tests extremely easy, it also provides :ref:`test framework agnostic utilities ` for testing with other frameworks such as :ref:`unittest `. Before starting to write your tests, you may also be interested on reading :ref:`how to write testable services` that interact with the loop. For using pytest plugin please install pytest-aiohttp_ library: .. code-block:: shell $ pip install pytest-aiohttp If you don't want to install *pytest-aiohttp* for some reason you may insert ``pytest_plugins = 'aiohttp.pytest_plugin'`` line into ``conftest.py`` instead for the same functionality. Provisional Status ~~~~~~~~~~~~~~~~~~ The module is a **provisional**. *aiohttp* has a year and half period for removing deprecated API (:ref:`aiohttp-backward-compatibility-policy`). But for :mod:`aiohttp.test_tools` the deprecation period could be reduced. Moreover we may break *backward compatibility* without *deprecation period* for some very strong reason. The Test Client and Servers ~~~~~~~~~~~~~~~~~~~~~~~~~~~ *aiohttp* test utils provides a scaffolding for testing aiohttp-based web servers. They are consist of two parts: running test server and making HTTP requests to this server. :class:`~aiohttp.test_utils.TestServer` runs :class:`aiohttp.web.Application` based server, :class:`~aiohttp.test_utils.RawTestServer` starts :class:`aiohttp.web.WebServer` low level server. For performing HTTP requests to these servers you have to create a test client: :class:`~aiohttp.test_utils.TestClient` instance. The client incapsulates :class:`aiohttp.ClientSession` by providing proxy methods to the client for common operations such as *ws_connect*, *get*, *post*, etc. Pytest ~~~~~~ .. currentmodule:: pytest_aiohttp The :data:`aiohttp_client` fixture available from pytest-aiohttp_ plugin allows you to create a client to make requests to test your app. 
A simple test would be::

   from aiohttp import web

   async def hello(request):
       return web.Response(text='Hello, world')

   async def test_hello(aiohttp_client, loop):
       app = web.Application()
       app.router.add_get('/', hello)
       client = await aiohttp_client(app)
       resp = await client.get('/')
       assert resp.status == 200
       text = await resp.text()
       assert 'Hello, world' in text

It also provides access to the app instance, allowing tests to check the
state of the app.  Tests can be made even more succinct with a fixture to
create an app test client::

   import pytest
   from aiohttp import web

   async def previous(request):
       if request.method == 'POST':
           request.app['value'] = (await request.post())['value']
           return web.Response(body=b'thanks for the data')
       return web.Response(
           body='value: {}'.format(request.app['value']).encode('utf-8'))

   @pytest.fixture
   def cli(loop, aiohttp_client):
       app = web.Application()
       app.router.add_get('/', previous)
       app.router.add_post('/', previous)
       return loop.run_until_complete(aiohttp_client(app))

   async def test_set_value(cli):
       resp = await cli.post('/', data={'value': 'foo'})
       assert resp.status == 200
       assert await resp.text() == 'thanks for the data'
       assert cli.server.app['value'] == 'foo'

   async def test_get_value(cli):
       cli.server.app['value'] = 'bar'
       resp = await cli.get('/')
       assert resp.status == 200
       assert await resp.text() == 'value: bar'

The pytest plugin provides the following fixtures:

.. data:: aiohttp_server(app, *, port=None, **kwargs)

   A fixture factory that creates
   :class:`~aiohttp.test_utils.TestServer`::

      async def test_f(aiohttp_server):
          app = web.Application()
          # fill route table

          server = await aiohttp_server(app)

   The server will be destroyed on exit from the test function.

   *app* is the :class:`aiohttp.web.Application` used to start the server.

   *port* is optional; it is the port the server is run at, and if it is
   not provided a random unused port is used.

   .. versionadded:: 3.0

   *kwargs* are parameters passed to
   :meth:`aiohttp.web.Application.make_handler`

   .. versionchanged:: 3.0

   .. deprecated:: 3.2

      The fixture was renamed from ``test_server`` to ``aiohttp_server``.

.. data:: aiohttp_client(app, server_kwargs=None, **kwargs)
          aiohttp_client(server, **kwargs)
          aiohttp_client(raw_server, **kwargs)

   A fixture factory that creates
   :class:`~aiohttp.test_utils.TestClient` for access to the tested
   server::

      async def test_f(aiohttp_client):
          app = web.Application()
          # fill route table

          client = await aiohttp_client(app)
          resp = await client.get('/')

   The *client* and responses are cleaned up after the test function
   finishes.

   The fixture accepts :class:`aiohttp.web.Application`,
   :class:`aiohttp.test_utils.TestServer` or
   :class:`aiohttp.test_utils.RawTestServer` instance.

   *server_kwargs* are parameters passed to the test server if an app is
   passed, else ignored.

   *kwargs* are parameters passed to
   :class:`aiohttp.test_utils.TestClient` constructor.

   .. versionchanged:: 3.0

      The fixture was renamed from ``test_client`` to ``aiohttp_client``.

.. data:: aiohttp_raw_server(handler, *, port=None, **kwargs)

   A fixture factory that creates a
   :class:`~aiohttp.test_utils.RawTestServer` instance from the given web
   handler::

      async def test_f(aiohttp_raw_server, aiohttp_client):

          async def handler(request):
              return web.Response(text="OK")

          raw_server = await aiohttp_raw_server(handler)
          client = await aiohttp_client(raw_server)
          resp = await client.get('/')

   *handler* should be a coroutine which accepts a request and returns a
   response.

   *port* is optional; it is the port the server is run at, and if it is
   not provided a random unused port is used.

   .. versionadded:: 3.0
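The fixtures compose: a server created by ``aiohttp_server`` may be passed
straight to ``aiohttp_client``, which is convenient when a test also needs a
handle on the server object.  A minimal sketch (it assumes the ``hello``
handler from the first example above is in scope)::

   async def test_hello_with_server(aiohttp_server, aiohttp_client):
       app = web.Application()
       app.router.add_get('/', hello)
       server = await aiohttp_server(app)
       client = await aiohttp_client(server)
       resp = await client.get('/')
       assert resp.status == 200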
.. data:: aiohttp_unused_port()

   Function to return an unused port number for IPv4 TCP protocol::

      async def test_f(aiohttp_client, aiohttp_unused_port):
          port = aiohttp_unused_port()
          app = web.Application()
          # fill route table

          client = await aiohttp_client(app, server_kwargs={'port': port})
          ...

   .. versionchanged:: 3.0

      The fixture was renamed from ``unused_port`` to ``aiohttp_unused_port``.

.. _aiohttp-testing-unittest-example:
.. _aiohttp-testing-unittest-style:

Unittest
~~~~~~~~

.. currentmodule:: aiohttp.test_utils

To test applications with the standard library's unittest or
unittest-based functionality, the AioHTTPTestCase is provided::

   from aiohttp.test_utils import AioHTTPTestCase, unittest_run_loop
   from aiohttp import web

   class MyAppTestCase(AioHTTPTestCase):

       async def get_application(self):
           """
           Override the get_application method to return your application.
           """
           async def hello(request):
               return web.Response(text='Hello, world')

           app = web.Application()
           app.router.add_get('/', hello)
           return app

       # the unittest_run_loop decorator can be used in tandem with
       # the AioHTTPTestCase to simplify running
       # tests that are asynchronous
       @unittest_run_loop
       async def test_example(self):
           resp = await self.client.request("GET", "/")
           assert resp.status == 200
           text = await resp.text()
           assert "Hello, world" in text

       # a vanilla example
       def test_example_vanilla(self):
           async def test_get_route():
               url = "/"
               resp = await self.client.request("GET", url)
               assert resp.status == 200
               text = await resp.text()
               assert "Hello, world" in text

           self.loop.run_until_complete(test_get_route())

.. class:: AioHTTPTestCase

   A base class that allows testing aiohttp web applications with unittest.

   Derived from :class:`unittest.TestCase`.

   Provides the following:

   .. attribute:: client

      an aiohttp test client, :class:`TestClient` instance.

   .. attribute:: server

      an aiohttp test server, :class:`TestServer` instance.

      .. versionadded:: 2.3

   .. attribute:: loop

      The event loop in which the application and server are running.

      .. deprecated:: 3.5

   .. attribute:: app

      The application returned by :meth:`get_application`
      (:class:`aiohttp.web.Application` instance).

   .. comethod:: get_client()

      This async method can be overridden to return the :class:`TestClient`
      object used in the test.

      :return: :class:`TestClient` instance.

      .. versionadded:: 2.3

   .. comethod:: get_server()

      This async method can be overridden to return the :class:`TestServer`
      object used in the test.

      :return: :class:`TestServer` instance.

      .. versionadded:: 2.3

   .. comethod:: get_application()

      This async method should be overridden to return the
      :class:`aiohttp.web.Application` object to test.

      :return: :class:`aiohttp.web.Application` instance.

   .. comethod:: setUpAsync()

      This async method does nothing by default and can be overridden to
      execute asynchronous code during the ``setUp`` stage of the
      ``TestCase``.

      .. versionadded:: 2.3

   .. comethod:: tearDownAsync()

      This async method does nothing by default and can be overridden to
      execute asynchronous code during the ``tearDown`` stage of the
      ``TestCase``.

      .. versionadded:: 2.3

   .. method:: setUp()

      Standard test initialization method.

   .. method:: tearDown()

      Standard test finalization method.

   .. note::

      The ``TestClient``'s methods are asynchronous: you have to execute
      functions on the test client using asynchronous methods.

      A basic test class wraps every test method with the
      :func:`unittest_run_loop` decorator::

         class TestA(AioHTTPTestCase):

             @unittest_run_loop
             async def test_f(self):
                 resp = await self.client.get('/')
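``setUpAsync`` and ``tearDownAsync`` are handy for preparing asynchronous
resources shared by the tests.  A minimal sketch, assuming a hypothetical
``connect_db()`` coroutine that is not part of aiohttp::

   class MyDbTestCase(AioHTTPTestCase):

       async def get_application(self):
           return web.Application()

       async def setUpAsync(self):
           # runs in the test loop as part of setUp()
           self.db = await connect_db()

       async def tearDownAsync(self):
           # runs in the test loop as part of tearDown()
           await self.db.close()

..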
decorator:: unittest_run_loop: A decorator dedicated to use with asynchronous methods of an :class:`AioHTTPTestCase`. Handles executing an asynchronous function, using the :attr:`AioHTTPTestCase.loop` of the :class:`AioHTTPTestCase`. Faking request object --------------------- aiohttp provides test utility for creating fake :class:`aiohttp.web.Request` objects: :func:`aiohttp.test_utils.make_mocked_request`, it could be useful in case of simple unit tests, like handler tests, or simulate error conditions that hard to reproduce on real server:: from aiohttp import web from aiohttp.test_utils import make_mocked_request def handler(request): assert request.headers.get('token') == 'x' return web.Response(body=b'data') def test_handler(): req = make_mocked_request('GET', '/', headers={'token': 'x'}) resp = handler(req) assert resp.body == b'data' .. warning:: We don't recommend to apply :func:`~aiohttp.test_utils.make_mocked_request` everywhere for testing web-handler's business object -- please use test client and real networking via 'localhost' as shown in examples before. :func:`~aiohttp.test_utils.make_mocked_request` exists only for testing complex cases (e.g. emulating network errors) which are extremely hard or even impossible to test by conventional way. .. function:: make_mocked_request(method, path, headers=None, *, \ version=HttpVersion(1, 1), \ closing=False, \ app=None, \ match_info=sentinel, \ reader=sentinel, \ writer=sentinel, \ transport=sentinel, \ payload=sentinel, \ sslcontext=None, \ loop=...) Creates mocked web.Request testing purposes. Useful in unit tests, when spinning full web server is overkill or specific conditions and errors are hard to trigger. :param method: str, that represents HTTP method, like; GET, POST. :type method: str :param path: str, The URL including *PATH INFO* without the host or scheme :type path: str :param headers: mapping containing the headers. Can be anything accepted by the multidict.CIMultiDict constructor. :type headers: dict, multidict.CIMultiDict, list of pairs :param match_info: mapping containing the info to match with url parameters. :type match_info: dict :param version: namedtuple with encoded HTTP version :type version: aiohttp.protocol.HttpVersion :param closing: flag indicates that connection should be closed after response. :type closing: bool :param app: the aiohttp.web application attached for fake request :type app: aiohttp.web.Application :param writer: object for managing outcoming data :type writer: aiohttp.StreamWriter :param transport: asyncio transport instance :type transport: asyncio.transports.Transport :param payload: raw payload reader object :type payload: aiohttp.StreamReader :param sslcontext: ssl.SSLContext object, for HTTPS connection :type sslcontext: ssl.SSLContext :param loop: An event loop instance, mocked loop by default. :type loop: :class:`asyncio.AbstractEventLoop` :return: :class:`aiohttp.web.Request` object. .. versionadded:: 2.3 *match_info* parameter. .. _aiohttp-testing-writing-testable-services: .. _aiohttp-testing-framework-agnostic-utilities: Framework Agnostic Utilities ~~~~~~~~~~~~~~~~~~~~~~~~~~~~ High level test creation:: from aiohttp.test_utils import TestClient, TestServer, loop_context from aiohttp import request # loop_context is provided as a utility. You can use any # asyncio.BaseEventLoop class in its place. 
with loop_context() as loop: app = _create_example_app() with TestClient(TestServer(app), loop=loop) as client: async def test_get_route(): nonlocal client resp = await client.get("/") assert resp.status == 200 text = await resp.text() assert "Hello, world" in text loop.run_until_complete(test_get_route()) If it's preferred to handle the creation / teardown on a more granular basis, the TestClient object can be used directly:: from aiohttp.test_utils import TestClient, TestServer with loop_context() as loop: app = _create_example_app() client = TestClient(TestServer(app), loop=loop) loop.run_until_complete(client.start_server()) root = "http://127.0.0.1:{}".format(port) async def test_get_route(): resp = await client.get("/") assert resp.status == 200 text = await resp.text() assert "Hello, world" in text loop.run_until_complete(test_get_route()) loop.run_until_complete(client.close()) A full list of the utilities provided can be found at the :data:`api reference ` Testing API Reference --------------------- Test server ~~~~~~~~~~~ Runs given :class:`aiohttp.web.Application` instance on random TCP port. After creation the server is not started yet, use :meth:`~aiohttp.test_utils.TestServer.start_server` for actual server starting and :meth:`~aiohttp.test_utils.TestServer.close` for stopping/cleanup. Test server usually works in conjunction with :class:`aiohttp.test_utils.TestClient` which provides handy client methods for accessing to the server. .. class:: BaseTestServer(*, scheme='http', host='127.0.0.1', port=None) Base class for test servers. :param str scheme: HTTP scheme, non-protected ``"http"`` by default. :param str host: a host for TCP socket, IPv4 *local host* (``'127.0.0.1'``) by default. :param int port: optional port for TCP socket, if not provided a random unused port is used. .. versionadded:: 3.0 .. attribute:: scheme A *scheme* for tested application, ``'http'`` for non-protected run and ``'https'`` for TLS encrypted server. .. attribute:: host *host* used to start a test server. .. attribute:: port *port* used to start the test server. .. attribute:: handler :class:`aiohttp.web.WebServer` used for HTTP requests serving. .. attribute:: server :class:`asyncio.AbstractServer` used for managing accepted connections. .. comethod:: start_server(loop=None, **kwargs) :param loop: the event_loop to use :type loop: asyncio.AbstractEventLoop Start a test server. .. comethod:: close() Stop and finish executed test server. .. method:: make_url(path) Return an *absolute* :class:`~yarl.URL` for given *path*. .. class:: RawTestServer(handler, *, scheme="http", host='127.0.0.1') Low-level test server (derived from :class:`BaseTestServer`). :param handler: a coroutine for handling web requests. The handler should accept :class:`aiohttp.web.BaseRequest` and return a response instance, e.g. :class:`~aiohttp.web.StreamResponse` or :class:`~aiohttp.web.Response`. The handler could raise :class:`~aiohttp.web.HTTPException` as a signal for non-200 HTTP response. :param str scheme: HTTP scheme, non-protected ``"http"`` by default. :param str host: a host for TCP socket, IPv4 *local host* (``'127.0.0.1'``) by default. :param int port: optional port for TCP socket, if not provided a random unused port is used. .. versionadded:: 3.0 .. class:: TestServer(app, *, scheme="http", host='127.0.0.1') Test server (derived from :class:`BaseTestServer`) for starting :class:`~aiohttp.web.Application`. :param app: :class:`aiohttp.web.Application` instance to run. 
:param str scheme: HTTP scheme, non-protected ``"http"`` by default. :param str host: a host for TCP socket, IPv4 *local host* (``'127.0.0.1'``) by default. :param int port: optional port for TCP socket, if not provided a random unused port is used. .. versionadded:: 3.0 .. attribute:: app :class:`aiohttp.web.Application` instance to run. Test Client ~~~~~~~~~~~ .. class:: TestClient(app_or_server, *, loop=None, \ scheme='http', host='127.0.0.1', \ cookie_jar=None, **kwargs) A test client used for making calls to tested server. :param app_or_server: :class:`BaseTestServer` instance for making client requests to it. In order to pass a :class:`aiohttp.web.Application` you need to convert it first to :class:`TestServer` first with ``TestServer(app)``. :param cookie_jar: an optional :class:`aiohttp.CookieJar` instance, may be useful with ``CookieJar(unsafe=True)`` option. :param str scheme: HTTP scheme, non-protected ``"http"`` by default. :param asyncio.AbstractEventLoop loop: the event_loop to use :param str host: a host for TCP socket, IPv4 *local host* (``'127.0.0.1'``) by default. .. attribute:: scheme A *scheme* for tested application, ``'http'`` for non-protected run and ``'https'`` for TLS encrypted server. .. attribute:: host *host* used to start a test server. .. attribute:: port *port* used to start the server .. attribute:: server :class:`BaseTestServer` test server instance used in conjunction with client. .. attribute:: app An alias for :attr:`self.server.app`. return ``None`` if ``self.server`` is not :class:`TestServer` instance(e.g. :class:`RawTestServer` instance for test low-level server). .. attribute:: session An internal :class:`aiohttp.ClientSession`. Unlike the methods on the :class:`TestClient`, client session requests do not automatically include the host in the url queried, and will require an absolute path to the resource. .. comethod:: start_server(**kwargs) Start a test server. .. comethod:: close() Stop and finish executed test server. .. method:: make_url(path) Return an *absolute* :class:`~yarl.URL` for given *path*. .. comethod:: request(method, path, *args, **kwargs) Routes a request to tested http server. The interface is identical to :meth:`aiohttp.ClientSession.request`, except the loop kwarg is overridden by the instance used by the test server. .. comethod:: get(path, *args, **kwargs) Perform an HTTP GET request. .. comethod:: post(path, *args, **kwargs) Perform an HTTP POST request. .. comethod:: options(path, *args, **kwargs) Perform an HTTP OPTIONS request. .. comethod:: head(path, *args, **kwargs) Perform an HTTP HEAD request. .. comethod:: put(path, *args, **kwargs) Perform an HTTP PUT request. .. comethod:: patch(path, *args, **kwargs) Perform an HTTP PATCH request. .. comethod:: delete(path, *args, **kwargs) Perform an HTTP DELETE request. .. comethod:: ws_connect(path, *args, **kwargs) Initiate websocket connection. The api corresponds to :meth:`aiohttp.ClientSession.ws_connect`. Utilities ~~~~~~~~~ .. function:: make_mocked_coro(return_value) Creates a coroutine mock. Behaves like a coroutine which returns *return_value*. But it is also a mock object, you might test it as usual :class:`~unittest.mock.Mock`:: mocked = make_mocked_coro(1) assert 1 == await mocked(1, 2) mocked.assert_called_with(1, 2) :param return_value: A value that the the mock object will return when called. :returns: A mock object that behaves as a coroutine which returns *return_value* when called. .. 
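:func:`make_mocked_coro` pairs naturally with
:func:`make_mocked_request` for unit testing a handler that awaits an
asynchronous dependency.  A minimal sketch; the ``fetch_user`` dependency and
the handler are hypothetical placeholders, not part of aiohttp::

   from aiohttp import web
   from aiohttp.test_utils import make_mocked_coro, make_mocked_request

   async def handler(request):
       user = await request.app['fetch_user'](request.query['id'])
       return web.json_response({'name': user})

   async def test_handler():
       app = web.Application()
       app['fetch_user'] = make_mocked_coro('Bob')  # stubbed async dependency
       req = make_mocked_request('GET', '/?id=42', app=app)
       resp = await handler(req)
       assert resp.status == 200
       app['fetch_user'].assert_called_with('42')

..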
function:: unused_port() Return an unused port number for IPv4 TCP protocol. :return int: ephemeral port number which could be reused by test server. .. function:: loop_context(loop_factory=) A contextmanager that creates an event_loop, for test purposes. Handles the creation and cleanup of a test loop. .. function:: setup_test_loop(loop_factory=) Create and return an :class:`asyncio.AbstractEventLoop` instance. The caller should also call teardown_test_loop, once they are done with the loop. .. note:: As side effect the function changes asyncio *default loop* by :func:`asyncio.set_event_loop` call. Previous default loop is not restored. It should not be a problem for test suite: every test expects a new test loop instance anyway. .. versionchanged:: 3.1 The function installs a created event loop as *default*. .. function:: teardown_test_loop(loop) Teardown and cleanup an event_loop created by setup_test_loop. :param loop: the loop to teardown :type loop: asyncio.AbstractEventLoop .. _pytest: http://pytest.org/latest/ .. _pytest-aiohttp: https://pypi.python.org/pypi/pytest-aiohttp aiohttp-3.6.2/docs/third_party.rst0000644000175100001650000002032513547410117017502 0ustar vstsdocker00000000000000.. _aiohttp-3rd-party: Third-Party libraries ===================== aiohttp is not the library for making HTTP requests and creating WEB server only. It is the grand basement for libraries built *on top* of aiohttp. This page is a list of these tools. Please feel free to add your open sourced library if it's not enlisted yet by making Pull Request to https://github.com/aio-libs/aiohttp/ * Why do you might want to include your awesome library into the list? * Just because the list increases your library visibility. People will have an easy way to find it. Officially supported -------------------- This list contains libraries which are supported by *aio-libs* team and located on https://github.com/aio-libs aiohttp extensions ^^^^^^^^^^^^^^^^^^ - `aiohttp-session `_ provides sessions for :mod:`aiohttp.web`. - `aiohttp-debugtoolbar `_ is a library for *debug toolbar* support for :mod:`aiohttp.web`. - `aiohttp-security `_ auth and permissions for :mod:`aiohttp.web`. - `aiohttp-devtools `_ provides development tools for :mod:`aiohttp.web` applications. - `aiohttp-cors `_ CORS support for aiohttp. - `aiohttp-sse `_ Server-sent events support for aiohttp. - `pytest-aiohttp `_ pytest plugin for aiohttp support. - `aiohttp-mako `_ Mako template renderer for aiohttp.web. - `aiohttp-jinja2 `_ Jinja2 template renderer for aiohttp.web. - `aiozipkin `_ distributed tracing instrumentation for `aiohttp` client and server. Database drivers ^^^^^^^^^^^^^^^^ - `aiopg `_ PostgreSQL async driver. - `aiomysql `_ MySql async driver. - `aioredis `_ Redis async driver. Other tools ^^^^^^^^^^^ - `aiodocker `_ Python Docker API client based on asyncio and aiohttp. - `aiobotocore `_ asyncio support for botocore library using aiohttp. Approved third-party libraries ------------------------------ The libraries are not part of ``aio-libs`` but they are proven to be very well written and highly recommended for usage. - `uvloop `_ Ultra fast implementation of asyncio event loop on top of ``libuv``. We are highly recommending to use it instead of standard ``asyncio``. Database drivers ^^^^^^^^^^^^^^^^ - `asyncpg `_ Another PostgreSQL async driver. It's much faster than ``aiopg`` but it is not drop-in replacement -- the API is different. Anyway please take a look on it -- the driver is really incredible fast. 
Others ------ The list of libraries which are exists but not enlisted in former categories. They may be perfect or not -- we don't know. Please add your library reference here first and after some time period ask to raise the status. - `aiohttp-cache `_ A cache system for aiohttp server. - `aiocache `_ Caching for asyncio with multiple backends (framework agnostic) - `gain `_ Web crawling framework based on asyncio for everyone. - `aiohttp-swagger `_ Swagger API Documentation builder for aiohttp server. - `aiohttp-swaggerify `_ Library to automatically generate swagger2.0 definition for aiohttp endpoints. - `aiohttp-validate `_ Simple library that helps you validate your API endpoints requests/responses with json schema. - `raven-aiohttp `_ An aiohttp transport for raven-python (Sentry client). - `webargs `_ A friendly library for parsing HTTP request arguments, with built-in support for popular web frameworks, including Flask, Django, Bottle, Tornado, Pyramid, webapp2, Falcon, and aiohttp. - `aioauth-client `_ OAuth client for aiohttp. - `aiohttpretty `_ A simple asyncio compatible httpretty mock using aiohttp. - `aioresponses `_ a helper for mock/fake web requests in python aiohttp package. - `aiohttp-transmute `_ A transmute implementation for aiohttp. - `aiohttp_apiset `_ Package to build routes using swagger specification. - `aiohttp-login `_ Registration and authorization (including social) for aiohttp applications. - `aiohttp_utils `_ Handy utilities for building aiohttp.web applications. - `aiohttpproxy `_ Simple aiohttp HTTP proxy. - `aiohttp_traversal `_ Traversal based router for aiohttp.web. - `aiohttp_autoreload `_ Makes aiohttp server auto-reload on source code change. - `gidgethub `_ An async GitHub API library for Python. - `aiohttp_jrpc `_ aiohttp JSON-RPC service. - `fbemissary `_ A bot framework for the Facebook Messenger platform, built on asyncio and aiohttp. - `aioslacker `_ slacker wrapper for asyncio. - `aioreloader `_ Port of tornado reloader to asyncio. - `aiohttp_babel `_ Babel localization support for aiohttp. - `python-mocket `_ a socket mock framework - for all kinds of socket animals, web-clients included. - `aioraft `_ asyncio RAFT algorithm based on aiohttp. - `home-assistant `_ Open-source home automation platform running on Python 3. - `discord.py `_ Discord client library. - `aiohttp-graphql `_ GraphQL and GraphIQL interface for aiohttp. - `aiohttp-sentry `_ An aiohttp middleware for reporting errors to Sentry. Python 3.5+ is required. - `aiohttp-datadog `_ An aiohttp middleware for reporting metrics to DataDog. Python 3.5+ is required. - `async-v20 `_ Asynchronous FOREX client for OANDA's v20 API. Python 3.6+ - `aiohttp-jwt `_ An aiohttp middleware for JWT(JSON Web Token) support. Python 3.5+ is required. - `AWS Xray Python SDK `_ Native tracing support for Aiohttp applications. - `GINO `_ An asyncio ORM on top of SQLAlchemy core, delivered with an aiohttp extension. - `aiohttp-apispec `_ Build and document REST APIs with ``aiohttp`` and ``apispec``. - `eider-py `_ Python implementation of the `Eider RPC protocol `_. - `asynapplicationinsights `_ A client for `Azure Application Insights `_ implemented using ``aiohttp`` client, including a middleware for ``aiohttp`` servers to collect web apps telemetry. - `aiogmaps `_ Asynchronous client for Google Maps API Web Services. Python 3.6+ required. aiohttp-3.6.2/docs/tracing_reference.rst0000644000175100001650000002741213547410117020622 0ustar vstsdocker00000000000000.. 
_aiohttp-client-tracing-reference: Tracing Reference ================= .. currentmodule:: aiohttp .. versionadded:: 3.0 A reference for client tracing API. .. seealso:: :ref:`aiohttp-client-tracing` for tracing usage instructions. Request life cycle ------------------ A request goes through the following stages and corresponding fallbacks. Overview ^^^^^^^^ .. blockdiag:: :desctable: blockdiag { orientation = portrait; start[shape=beginpoint, description="on_request_start"]; redirect[description="on_request_redirect"]; end[shape=endpoint, description="on_request_end"]; exception[shape=flowchart.terminator, description="on_request_exception"]; acquire_connection[description="Connection acquiring"]; headers_received; headers_sent; chunk_sent[description="on_request_chunk_sent"]; chunk_received[description="on_response_chunk_received"]; start -> acquire_connection; acquire_connection -> headers_sent; headers_sent -> headers_received; headers_sent -> chunk_sent; chunk_sent -> chunk_sent; chunk_sent -> headers_received; headers_received -> chunk_received; chunk_received -> chunk_received; chunk_received -> end; headers_received -> redirect; headers_received -> end; redirect -> headers_sent; chunk_received -> exception; chunk_sent -> exception; headers_sent -> exception; } Connection acquiring ^^^^^^^^^^^^^^^^^^^^ .. blockdiag:: :desctable: blockdiag { orientation = portrait; begin[shape=beginpoint]; end[shape=endpoint]; exception[shape=flowchart.terminator, description="Exception raised"]; queued_start[description="on_connection_queued_start"]; queued_end[description="on_connection_queued_end"]; create_start[description="on_connection_create_start"]; create_end[description="on_connection_create_end"]; reuseconn[description="on_connection_reuseconn"]; resolve_dns[description="DNS resolving"]; sock_connect[description="Connection establishment"]; begin -> reuseconn; begin -> create_start; create_start -> resolve_dns; resolve_dns -> exception; resolve_dns -> sock_connect; sock_connect -> exception; sock_connect -> create_end -> end; begin -> queued_start; queued_start -> queued_end; queued_end -> reuseconn; queued_end -> create_start; reuseconn -> end; } DNS resolving ^^^^^^^^^^^^^ .. blockdiag:: :desctable: blockdiag { orientation = portrait; begin[shape=beginpoint]; end[shape=endpoint]; exception[shape=flowchart.terminator, description="Exception raised"]; resolve_start[description="on_dns_resolvehost_start"]; resolve_end[description="on_dns_resolvehost_end"]; cache_hit[description="on_dns_cache_hit"]; cache_miss[description="on_dns_cache_miss"]; begin -> cache_hit -> end; begin -> cache_miss -> resolve_start; resolve_start -> resolve_end -> end; resolve_start -> exception; } TraceConfig ----------- .. class:: TraceConfig(trace_config_ctx_factory=SimpleNamespace) Trace config is the configuration object used to trace requests launched by a :class:`ClientSession` object using different events related to different parts of the request flow. :param trace_config_ctx_factory: factory used to create trace contexts, default class used :class:`types.SimpleNamespace` .. method:: trace_config_ctx(trace_request_ctx=None) :param trace_request_ctx: Will be used to pass as a kw for the ``trace_config_ctx_factory``. Build a new trace context from the config. Every signal handler should have the following signature:: async def on_signal(session, context, params): ... 
where ``session`` is :class:`ClientSession` instance, ``context`` is an object returned by :meth:`trace_config_ctx` call and ``params`` is a data class with signal parameters. The type of ``params`` depends on subscribed signal and described below. .. attribute:: on_request_start Property that gives access to the signals that will be executed when a request starts. ``params`` is :class:`aiohttp.TraceRequestStartParams` instance. .. attribute:: on_request_chunk_sent Property that gives access to the signals that will be executed when a chunk of request body is sent. ``params`` is :class:`aiohttp.TraceRequestChunkSentParams` instance. .. versionadded:: 3.1 .. attribute:: on_response_chunk_received Property that gives access to the signals that will be executed when a chunk of response body is received. ``params`` is :class:`aiohttp.TraceResponseChunkReceivedParams` instance. .. versionadded:: 3.1 .. attribute:: on_request_redirect Property that gives access to the signals that will be executed when a redirect happens during a request flow. ``params`` is :class:`aiohttp.TraceRequestRedirectParams` instance. .. attribute:: on_request_end Property that gives access to the signals that will be executed when a request ends. ``params`` is :class:`aiohttp.TraceRequestEndParams` instance. .. attribute:: on_request_exception Property that gives access to the signals that will be executed when a request finishes with an exception. ``params`` is :class:`aiohttp.TraceRequestExceptionParams` instance. .. attribute:: on_connection_queued_start Property that gives access to the signals that will be executed when a request has been queued waiting for an available connection. ``params`` is :class:`aiohttp.TraceConnectionQueuedStartParams` instance. .. attribute:: on_connection_queued_end Property that gives access to the signals that will be executed when a request that was queued already has an available connection. ``params`` is :class:`aiohttp.TraceConnectionQueuedEndParams` instance. .. attribute:: on_connection_create_start Property that gives access to the signals that will be executed when a request creates a new connection. ``params`` is :class:`aiohttp.TraceConnectionCreateStartParams` instance. .. attribute:: on_connection_create_end Property that gives access to the signals that will be executed when a request that created a new connection finishes its creation. ``params`` is :class:`aiohttp.TraceConnectionCreateEndParams` instance. .. attribute:: on_connection_reuseconn Property that gives access to the signals that will be executed when a request reuses a connection. ``params`` is :class:`aiohttp.TraceConnectionReuseconnParams` instance. .. attribute:: on_dns_resolvehost_start Property that gives access to the signals that will be executed when a request starts to resolve the domain related with the request. ``params`` is :class:`aiohttp.TraceDnsResolveHostStartParams` instance. .. attribute:: on_dns_resolvehost_end Property that gives access to the signals that will be executed when a request finishes to resolve the domain related with the request. ``params`` is :class:`aiohttp.TraceDnsResolveHostEndParams` instance. .. attribute:: on_dns_cache_hit Property that gives access to the signals that will be executed when a request was able to use a cached DNS resolution for the domain related with the request. ``params`` is :class:`aiohttp.TraceDnsCacheHitParams` instance. .. 
attribute:: on_dns_cache_miss Property that gives access to the signals that will be executed when a request was not able to use a cached DNS resolution for the domain related with the request. ``params`` is :class:`aiohttp.TraceDnsCacheMissParams` instance. TraceRequestStartParams ----------------------- .. class:: TraceRequestStartParams See :attr:`TraceConfig.on_request_start` for details. .. attribute:: method Method that will be used to make the request. .. attribute:: url URL that will be used for the request. .. attribute:: headers Headers that will be used for the request, can be mutated. TraceRequestChunkSentParams --------------------------- .. class:: TraceRequestChunkSentParams .. versionadded:: 3.1 See :attr:`TraceConfig.on_request_chunk_sent` for details. .. attribute:: chunk Bytes of chunk sent TraceResponseChunkSentParams ---------------------------- .. class:: TraceResponseChunkSentParams .. versionadded:: 3.1 See :attr:`TraceConfig.on_response_chunk_received` for details. .. attribute:: chunk Bytes of chunk received TraceRequestEndParams --------------------- .. class:: TraceRequestEndParams See :attr:`TraceConfig.on_request_end` for details. .. attribute:: method Method used to make the request. .. attribute:: url URL used for the request. .. attribute:: headers Headers used for the request. .. attribute:: response Response :class:`ClientResponse`. TraceRequestExceptionParams --------------------------- .. class:: TraceRequestExceptionParams See :attr:`TraceConfig.on_request_exception` for details. .. attribute:: method Method used to make the request. .. attribute:: url URL used for the request. .. attribute:: headers Headers used for the request. .. attribute:: exception Exception raised during the request. TraceRequestRedirectParams -------------------------- .. class:: TraceRequestRedirectParams See :attr:`TraceConfig.on_request_redirect` for details. .. attribute:: method Method used to get this redirect request. .. attribute:: url URL used for this redirect request. .. attribute:: headers Headers used for this redirect. .. attribute:: response Response :class:`ClientResponse` got from the redirect. TraceConnectionQueuedStartParams -------------------------------- .. class:: TraceConnectionQueuedStartParams See :attr:`TraceConfig.on_connection_queued_start` for details. There are no attributes right now. TraceConnectionQueuedEndParams ------------------------------ .. class:: TraceConnectionQueuedEndParams See :attr:`TraceConfig.on_connection_queued_end` for details. There are no attributes right now. TraceConnectionCreateStartParams -------------------------------- .. class:: TraceConnectionCreateStartParams See :attr:`TraceConfig.on_connection_create_start` for details. There are no attributes right now. TraceConnectionCreateEndParams ------------------------------ .. class:: TraceConnectionCreateEndParams See :attr:`TraceConfig.on_connection_create_end` for details. There are no attributes right now. TraceConnectionReuseconnParams ------------------------------ .. class:: TraceConnectionReuseconnParams See :attr:`TraceConfig.on_connection_reuseconn` for details. There are no attributes right now. TraceDnsResolveHostStartParams ------------------------------ .. class:: TraceDnsResolveHostStartParams See :attr:`TraceConfig.on_dns_resolvehost_start` for details. .. attribute:: Host Host that will be resolved. TraceDnsResolveHostEndParams ---------------------------- .. class:: TraceDnsResolveHostEndParams See :attr:`TraceConfig.on_dns_resolvehost_end` for details. 
.. attribute:: Host Host that has been resolved. TraceDnsCacheHitParams ---------------------- .. class:: TraceDnsCacheHitParams See :attr:`TraceConfig.on_dns_cache_hit` for details. .. attribute:: Host Host found in the cache. TraceDnsCacheMissParams ----------------------- .. class:: TraceDnsCacheMissParams See :attr:`TraceConfig.on_dns_cache_miss` for details. .. attribute:: Host Host didn't find the cache. aiohttp-3.6.2/docs/utilities.rst0000644000175100001650000000040713547410117017163 0ustar vstsdocker00000000000000.. _aiohttp-utilities: Utilities ========= Miscellaneous API Shared between Client And Server. .. currentmodule:: aiohttp .. toctree:: :name: utilities abc multipart multipart_reference streams signals structures websocket_utilities aiohttp-3.6.2/docs/web.rst0000644000175100001650000000057513547410117015733 0ustar vstsdocker00000000000000.. _aiohttp-web: Server ====== .. module:: aiohttp.web The page contains all information about aiohttp Server API: .. toctree:: :name: server Tutorial Quickstart Advanced Usage Low Level Reference Logging Testing Deployment aiohttp-3.6.2/docs/web_advanced.rst0000644000175100001650000010213613547410117017554 0ustar vstsdocker00000000000000.. _aiohttp-web-advanced: Web Server Advanced =================== .. currentmodule:: aiohttp.web Unicode support --------------- *aiohttp* does :term:`requoting` of incoming request path. Unicode (non-ASCII) symbols are processed transparently on both *route adding* and *resolving* (internally everything is converted to :term:`percent-encoding` form by :term:`yarl` library). But in case of custom regular expressions for :ref:`aiohttp-web-variable-handler` please take care that URL is *percent encoded*: if you pass Unicode patterns they don't match to *requoted* path. Web Handler Cancellation ------------------------ .. warning:: :term:`web-handler` execution could be canceled on every ``await`` if client drops connection without reading entire response's BODY. The behavior is very different from classic WSGI frameworks like Flask and Django. Sometimes it is a desirable behavior: on processing ``GET`` request the code might fetch data from database or other web resource, the fetching is potentially slow. Canceling this fetch is very good: the peer dropped connection already, there is no reason to waste time and resources (memory etc) by getting data from DB without any chance to send it back to peer. But sometimes the cancellation is bad: on ``POST`` request very often is needed to save data to DB regardless to peer closing. Cancellation prevention could be implemented in several ways: * Applying :func:`asyncio.shield` to coroutine that saves data into DB. * Spawning a new task for DB saving * Using aiojobs_ or other third party library. :func:`asyncio.shield` works pretty good. The only disadvantage is you need to split web handler into exactly two async functions: one for handler itself and other for protected code. For example the following snippet is not safe:: async def handler(request): await asyncio.shield(write_to_redis(request)) await asyncio.shield(write_to_postgres(request)) return web.Response(text='OK') Cancellation might be occurred just after saving data in REDIS, ``write_to_postgres`` will be not called. 
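The safe variant suggested above moves all critical writes into a single
coroutine and shields only that coroutine (``write_to_redis`` and
``write_to_postgres`` are the same hypothetical helpers as in the snippet
above)::

   async def write_both(request):
       # The shielded coroutine keeps running to completion even if the
       # handler task is cancelled, so both writes happen together.
       await write_to_redis(request)
       await write_to_postgres(request)

   async def handler(request):
       await asyncio.shield(write_both(request))
       return web.Response(text='OK')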
Spawning a new task is much worse: there is no place to ``await`` spawned tasks:: async def handler(request): request.loop.create_task(write_to_redis(request)) return web.Response(text='OK') In this case errors from ``write_to_redis`` are not awaited, it leads to many asyncio log messages *Future exception was never retrieved* and *Task was destroyed but it is pending!*. Moreover on :ref:`aiohttp-web-graceful-shutdown` phase *aiohttp* don't wait for these tasks, you have a great chance to loose very important data. On other hand aiojobs_ provides an API for spawning new jobs and awaiting their results etc. It stores all scheduled activity in internal data structures and could terminate them gracefully:: from aiojobs.aiohttp import setup, spawn async def coro(timeout): await asyncio.sleep(timeout) # do something in background async def handler(request): await spawn(request, coro()) return web.Response() app = web.Application() setup(app) app.router.add_get('/', handler) All not finished jobs will be terminated on :attr:`Application.on_cleanup` signal. To prevent cancellation of the whole :term:`web-handler` use ``@atomic`` decorator:: from aiojobs.aiohttp import atomic @atomic async def handler(request): await write_to_db() return web.Response() app = web.Application() setup(app) app.router.add_post('/', handler) It prevents all ``handler`` async function from cancellation, ``write_to_db`` will be never interrupted. .. _aiojobs: http://aiojobs.readthedocs.io/en/latest/ Passing a coroutine into run_app and Gunicorn --------------------------------------------- :func:`run_app` accepts either application instance or a coroutine for making an application. The coroutine based approach allows to perform async IO before making an app:: async def app_factory(): await pre_init() app = web.Application() app.router.add_get(...) return app web.run_app(app_factory()) Gunicorn worker supports a factory as well. For Gunicorn the factory should accept zero parameters:: async def my_web_app(): app = web.Application() app.router.add_get(...) return app Start gunicorn: .. code-block:: shell $ gunicorn my_app_module:my_web_app --bind localhost:8080 --worker-class aiohttp.GunicornWebWorker .. versionadded:: 3.1 Custom Routing Criteria ----------------------- Sometimes you need to register :ref:`handlers ` on more complex criteria than simply a *HTTP method* and *path* pair. Although :class:`UrlDispatcher` does not support any extra criteria, routing based on custom conditions can be accomplished by implementing a second layer of routing in your application. The following example shows custom routing based on the *HTTP Accept* header:: class AcceptChooser: def __init__(self): self._accepts = {} async def do_route(self, request): for accept in request.headers.getall('ACCEPT', []): acceptor = self._accepts.get(accept) if acceptor is not None: return (await acceptor(request)) raise HTTPNotAcceptable() def reg_acceptor(self, accept, handler): self._accepts[accept] = handler async def handle_json(request): # do json handling async def handle_xml(request): # do xml handling chooser = AcceptChooser() app.add_routes([web.get('/', chooser.do_route)]) chooser.reg_acceptor('application/json', handle_json) chooser.reg_acceptor('application/xml', handle_xml) .. _aiohttp-web-static-file-handling: Static file handling -------------------- The best way to handle static files (images, JavaScripts, CSS files etc.) is using `Reverse Proxy`_ like `nginx`_ or `CDN`_ services. .. 
_Reverse Proxy: https://en.wikipedia.org/wiki/Reverse_proxy .. _nginx: https://nginx.org/ .. _CDN: https://en.wikipedia.org/wiki/Content_delivery_network But for development it's very convenient to handle static files by aiohttp server itself. To do it just register a new static route by :meth:`RouteTableDef.static` or :func:`static` calls:: app.add_routes([web.static('/prefix', path_to_static_folder)]) routes.static('/prefix', path_to_static_folder) When a directory is accessed within a static route then the server responses to client with ``HTTP/403 Forbidden`` by default. Displaying folder index instead could be enabled with ``show_index`` parameter set to ``True``:: web.static('/prefix', path_to_static_folder, show_index=True) When a symlink from the static directory is accessed, the server responses to client with ``HTTP/404 Not Found`` by default. To allow the server to follow symlinks, parameter ``follow_symlinks`` should be set to ``True``:: web.static('/prefix', path_to_static_folder, follow_symlinks=True) When you want to enable cache busting, parameter ``append_version`` can be set to ``True`` Cache busting is the process of appending some form of file version hash to the filename of resources like JavaScript and CSS files. The performance advantage of doing this is that we can tell the browser to cache these files indefinitely without worrying about the client not getting the latest version when the file changes:: web.static('/prefix', path_to_static_folder, append_version=True) Template Rendering ------------------ :mod:`aiohttp.web` does not support template rendering out-of-the-box. However, there is a third-party library, :mod:`aiohttp_jinja2`, which is supported by the *aiohttp* authors. Using it is rather simple. First, setup a *jinja2 environment* with a call to :func:`aiohttp_jinja2.setup`:: app = web.Application() aiohttp_jinja2.setup(app, loader=jinja2.FileSystemLoader('/path/to/templates/folder')) After that you may use the template engine in your :ref:`handlers `. The most convenient way is to simply wrap your handlers with the :func:`aiohttp_jinja2.template` decorator:: @aiohttp_jinja2.template('tmpl.jinja2') async def handler(request): return {'name': 'Andrew', 'surname': 'Svetlov'} If you prefer the `Mako`_ template engine, please take a look at the `aiohttp_mako`_ library. .. warning:: :func:`aiohttp_jinja2.template` should be applied **before** :meth:`RouteTableDef.get` decorator and family, e.g. it must be the *first* (most *down* decorator in the chain):: @routes.get('/path') @aiohttp_jinja2.template('tmpl.jinja2') async def handler(request): return {'name': 'Andrew', 'surname': 'Svetlov'} .. _Mako: http://www.makotemplates.org/ .. _aiohttp_mako: https://github.com/aio-libs/aiohttp_mako .. _aiohttp-web-websocket-read-same-task: Reading from the same task in WebSockets ---------------------------------------- Reading from the *WebSocket* (``await ws.receive()``) **must only** be done inside the request handler *task*; however, writing (``ws.send_str(...)``) to the *WebSocket*, closing (``await ws.close()``) and canceling the handler task may be delegated to other tasks. See also :ref:`FAQ section `. :mod:`aiohttp.web` creates an implicit :class:`asyncio.Task` for handling every incoming request. .. note:: While :mod:`aiohttp.web` itself only supports *WebSockets* without downgrading to *LONG-POLLING*, etc., our team supports SockJS_, an aiohttp-based library for implementing SockJS-compatible server code. .. _SockJS: https://github.com/aio-libs/sockjs .. 
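For example, writes can be delegated to a helper task while reading stays in
the handler task.  A rough sketch; the ``next_notification()`` source is a
hypothetical placeholder::

   import asyncio
   from aiohttp import web

   async def websocket_handler(request):
       ws = web.WebSocketResponse()
       await ws.prepare(request)

       async def writer():
           # Writing from another task is allowed.
           while not ws.closed:
               await ws.send_str(await next_notification())

       writer_task = asyncio.ensure_future(writer())
       try:
           async for msg in ws:  # reading must stay in the handler task
               ...
       finally:
           writer_task.cancel()
       return ws

..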
warning:: Parallel reads from websocket are forbidden, there is no possibility to call :meth:`WebSocketResponse.receive` from two tasks. See :ref:`FAQ section ` for instructions how to solve the problem. .. _aiohttp-web-data-sharing: Data Sharing aka No Singletons Please ------------------------------------- :mod:`aiohttp.web` discourages the use of *global variables*, aka *singletons*. Every variable should have its own context that is *not global*. So, :class:`Application` and :class:`Request` support a :class:`collections.abc.MutableMapping` interface (i.e. they are dict-like objects), allowing them to be used as data stores. .. _aiohttp-web-data-sharing-app-config: Application's config ^^^^^^^^^^^^^^^^^^^^ For storing *global-like* variables, feel free to save them in an :class:`Application` instance:: app['my_private_key'] = data and get it back in the :term:`web-handler`:: async def handler(request): data = request.app['my_private_key'] In case of :ref:`nested applications ` the desired lookup strategy could be the following: 1. Search the key in the current nested application. 2. If the key is not found continue searching in the parent application(s). For this please use :attr:`Request.config_dict` read-only property:: async def handler(request): data = request.config_dict['my_private_key'] Request's storage ^^^^^^^^^^^^^^^^^ Variables that are only needed for the lifetime of a :class:`Request`, can be stored in a :class:`Request`:: async def handler(request): request['my_private_key'] = "data" ... This is mostly useful for :ref:`aiohttp-web-middlewares` and :ref:`aiohttp-web-signals` handlers to store data for further processing by the next handlers in the chain. Response's storage ^^^^^^^^^^^^^^^^^^ :class:`StreamResponse` and :class:`Response` objects also support :class:`collections.abc.MutableMapping` interface. This is useful when you want to share data with signals and middlewares once all the work in the handler is done:: async def handler(request): [ do all the work ] response['my_metric'] = 123 return response Naming hint ^^^^^^^^^^^ To avoid clashing with other *aiohttp* users and third-party libraries, please choose a unique key name for storing data. If your code is published on PyPI, then the project name is most likely unique and safe to use as the key. Otherwise, something based on your company name/url would be satisfactory (i.e. ``org.company.app``). .. _aiohttp-web-contextvars: ContextVars support ------------------- Starting from Python 3.7 asyncio has :mod:`Context Variables ` as a context-local storage (a generalization of thread-local concept that works with asyncio tasks also). *aiohttp* server supports it in the following way: * A server inherits the current task's context used when creating it. :func:`aiohttp.web.run_app()` runs a task for handling all underlying jobs running the app, but alternatively :ref:`aiohttp-web-app-runners` can be used. * Application initialization / finalization events (:attr:`Application.cleanup_ctx`, :attr:`Application.on_startup` and :attr:`Application.on_shutdown`, :attr:`Application.on_cleanup`) are executed inside the same context. E.g. all context modifications made on application startup a visible on teardown. * On every request handling *aiohttp* creates a context copy. :term:`web-handler` has all variables installed on initialization stage. But the context modification made by a handler or middleware is invisible to another HTTP request handling call. 
An example of context vars usage:: from contextvars import ContextVar from aiohttp import web VAR = ContextVar('VAR', default='default') async def coro(): return VAR.get() async def handler(request): var = VAR.get() VAR.set('handler') ret = await coro() return web.Response(text='\n'.join([var, ret])) async def on_startup(app): print('on_startup', VAR.get()) VAR.set('on_startup') async def on_cleanup(app): print('on_cleanup', VAR.get()) VAR.set('on_cleanup') async def init(): print('init', VAR.get()) VAR.set('init') app = web.Application() app.router.add_get('/', handler) app.on_startup.append(on_startup) app.on_cleanup.append(on_cleanup) return app web.run_app(init()) print('done', VAR.get()) .. versionadded:: 3.5 .. _aiohttp-web-middlewares: Middlewares ----------- :mod:`aiohttp.web` provides a powerful mechanism for customizing :ref:`request handlers` via *middlewares*. A *middleware* is a coroutine that can modify either the request or response. For example, here's a simple *middleware* which appends ``' wink'`` to the response:: from aiohttp.web import middleware @middleware async def middleware(request, handler): resp = await handler(request) resp.text = resp.text + ' wink' return resp .. note:: The example won't work with streamed responses or websockets Every *middleware* should accept two parameters, a :class:`request ` instance and a *handler*, and return the response or raise an exception. If the exception is not an instance of :exc:`HTTPException` it is converted to ``500`` :exc:`HTTPInternalServerError` after processing the middlewares chain. .. warning:: Second argument should be named *handler* exactly. When creating an :class:`Application`, these *middlewares* are passed to the keyword-only ``middlewares`` parameter:: app = web.Application(middlewares=[middleware_1, middleware_2]) Internally, a single :ref:`request handler ` is constructed by applying the middleware chain to the original handler in reverse order, and is called by the :class:`RequestHandler` as a regular *handler*. Since *middlewares* are themselves coroutines, they may perform extra ``await`` calls when creating a new handler, e.g. call database etc. *Middlewares* usually call the handler, but they may choose to ignore it, e.g. displaying *403 Forbidden page* or raising :exc:`HTTPForbidden` exception if the user does not have permissions to access the underlying resource. They may also render errors raised by the handler, perform some pre- or post-processing like handling *CORS* and so on. The following code demonstrates middlewares execution order:: from aiohttp import web async def test(request): print('Handler function called') return web.Response(text="Hello") @web.middleware async def middleware1(request, handler): print('Middleware 1 called') response = await handler(request) print('Middleware 1 finished') return response @web.middleware async def middleware2(request, handler): print('Middleware 2 called') response = await handler(request) print('Middleware 2 finished') return response app = web.Application(middlewares=[middleware1, middleware2]) app.router.add_get('/', test) web.run_app(app) Produced output:: Middleware 1 called Middleware 2 called Handler function called Middleware 2 finished Middleware 1 finished Example ^^^^^^^ A common use of middlewares is to implement custom error pages. 
The following example will render 404 errors using a JSON response, as might be appropriate a JSON REST service:: from aiohttp import web @web.middleware async def error_middleware(request, handler): try: response = await handler(request) if response.status != 404: return response message = response.message except web.HTTPException as ex: if ex.status != 404: raise message = ex.reason return web.json_response({'error': message}) app = web.Application(middlewares=[error_middleware]) Middleware Factory ^^^^^^^^^^^^^^^^^^ A *middleware factory* is a function that creates a middleware with passed arguments. For example, here's a trivial *middleware factory*:: def middleware_factory(text): @middleware async def sample_middleware(request, handler): resp = await handler(request) resp.text = resp.text + text return resp return sample_middleware Remember that contrary to regular middlewares you need the result of a middleware factory not the function itself. So when passing a middleware factory to an app you actually need to call it:: app = web.Application(middlewares=[middleware_factory(' wink')]) .. _aiohttp-web-signals: Signals ------- Although :ref:`middlewares ` can customize :ref:`request handlers` before or after a :class:`Response` has been prepared, they can't customize a :class:`Response` **while** it's being prepared. For this :mod:`aiohttp.web` provides *signals*. For example, a middleware can only change HTTP headers for *unprepared* responses (see :meth:`StreamResponse.prepare`), but sometimes we need a hook for changing HTTP headers for streamed responses and WebSockets. This can be accomplished by subscribing to the :attr:`Application.on_response_prepare` signal:: async def on_prepare(request, response): response.headers['My-Header'] = 'value' app.on_response_prepare.append(on_prepare) Additionally, the :attr:`Application.on_startup` and :attr:`Application.on_cleanup` signals can be subscribed to for application component setup and tear down accordingly. The following example will properly initialize and dispose an aiopg connection engine:: from aiopg.sa import create_engine async def create_aiopg(app): app['pg_engine'] = await create_engine( user='postgre', database='postgre', host='localhost', port=5432, password='' ) async def dispose_aiopg(app): app['pg_engine'].close() await app['pg_engine'].wait_closed() app.on_startup.append(create_aiopg) app.on_cleanup.append(dispose_aiopg) Signal handlers should not return a value but may modify incoming mutable parameters. Signal handlers will be run sequentially, in order they were added. All handlers must be asynchronous since *aiohttp* 3.0. .. _aiohttp-web-cleanup-ctx: Cleanup Context --------------- Bare :attr:`Application.on_startup` / :attr:`Application.on_cleanup` pair still has a pitfall: signals handlers are independent on each other. E.g. we have ``[create_pg, create_redis]`` in *startup* signal and ``[dispose_pg, dispose_redis]`` in *cleanup*. If, for example, ``create_pg(app)`` call fails ``create_redis(app)`` is not called. But on application cleanup both ``dispose_pg(app)`` and ``dispose_redis(app)`` are still called: *cleanup signal* has no knowledge about startup/cleanup pairs and their execution state. 
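In code the fragile pairing looks roughly like this (``create_pg``,
``create_redis`` and the matching ``dispose_*`` helpers are hypothetical
placeholders)::

   app = web.Application()
   # startup handlers run in order; if create_pg() fails,
   # create_redis() is never called ...
   app.on_startup.append(create_pg)
   app.on_startup.append(create_redis)
   # ... yet on cleanup both dispose handlers run regardless
   # of how far startup got
   app.on_cleanup.append(dispose_pg)
   app.on_cleanup.append(dispose_redis)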
The solution is :attr:`Application.cleanup_ctx` usage:: async def pg_engine(app): app['pg_engine'] = await create_engine( user='postgre', database='postgre', host='localhost', port=5432, password='' ) yield app['pg_engine'].close() await app['pg_engine'].wait_closed() app.cleanup_ctx.append(pg_engine) The attribute is a list of *asynchronous generators*, a code *before* ``yield`` is an initialization stage (called on *startup*), a code *after* ``yield`` is executed on *cleanup*. The generator must have only one ``yield``. *aiohttp* guarantees that *cleanup code* is called if and only if *startup code* was successfully finished. Asynchronous generators are supported by Python 3.6+, on Python 3.5 please use `async_generator `_ library. .. versionadded:: 3.1 .. _aiohttp-web-nested-applications: Nested applications ------------------- Sub applications are designed for solving the problem of the big monolithic code base. Let's assume we have a project with own business logic and tools like administration panel and debug toolbar. Administration panel is a separate application by its own nature but all toolbar URLs are served by prefix like ``/admin``. Thus we'll create a totally separate application named ``admin`` and connect it to main app with prefix by :meth:`Application.add_subapp`:: admin = web.Application() # setup admin routes, signals and middlewares app.add_subapp('/admin/', admin) Middlewares and signals from ``app`` and ``admin`` are chained. It means that if URL is ``'/admin/something'`` middlewares from ``app`` are applied first and ``admin.middlewares`` are the next in the call chain. The same is going for :attr:`Application.on_response_prepare` signal -- the signal is delivered to both top level ``app`` and ``admin`` if processing URL is routed to ``admin`` sub-application. Common signals like :attr:`Application.on_startup`, :attr:`Application.on_shutdown` and :attr:`Application.on_cleanup` are delivered to all registered sub-applications. The passed parameter is sub-application instance, not top-level application. Third level sub-applications can be nested into second level ones -- there are no limitation for nesting level. Url reversing for sub-applications should generate urls with proper prefix. But for getting URL sub-application's router should be used:: admin = web.Application() admin.add_routes([web.get('/resource', handler, name='name')]) app.add_subapp('/admin/', admin) url = admin.router['name'].url_for() The generated ``url`` from example will have a value ``URL('/admin/resource')``. If main application should do URL reversing for sub-application it could use the following explicit technique:: admin = web.Application() admin.add_routes([web.get('/resource', handler, name='name')]) app.add_subapp('/admin/', admin) app['admin'] = admin async def handler(request): # main application's handler admin = request.app['admin'] url = admin.router['name'].url_for() .. _aiohttp-web-expect-header: *Expect* Header --------------- :mod:`aiohttp.web` supports *Expect* header. By default it sends ``HTTP/1.1 100 Continue`` line to client, or raises :exc:`HTTPExpectationFailed` if header value is not equal to "100-continue". It is possible to specify custom *Expect* header handler on per route basis. This handler gets called if *Expect* header exist in request after receiving all headers and before processing application's :ref:`aiohttp-web-middlewares` and route handler. Handler can return *None*, in that case the request processing continues as usual. 
If handler returns an instance of class :class:`StreamResponse`, *request handler* uses it as response. Also handler can raise a subclass of :exc:`HTTPException`. In this case all further processing will not happen and client will receive appropriate http response. .. note:: A server that does not understand or is unable to comply with any of the expectation values in the Expect field of a request MUST respond with appropriate error status. The server MUST respond with a 417 (Expectation Failed) status if any of the expectations cannot be met or, if there are other problems with the request, some other 4xx status. http://www.w3.org/Protocols/rfc2616/rfc2616-sec14.html#sec14.20 If all checks pass, the custom handler *must* write a *HTTP/1.1 100 Continue* status code before returning. The following example shows how to setup a custom handler for the *Expect* header:: async def check_auth(request): if request.version != aiohttp.HttpVersion11: return if request.headers.get('EXPECT') != '100-continue': raise HTTPExpectationFailed(text="Unknown Expect: %s" % expect) if request.headers.get('AUTHORIZATION') is None: raise HTTPForbidden() request.transport.write(b"HTTP/1.1 100 Continue\r\n\r\n") async def hello(request): return web.Response(body=b"Hello, world") app = web.Application() app.add_routes([web.add_get('/', hello, expect_handler=check_auth)]) .. _aiohttp-web-custom-resource: Custom resource implementation ------------------------------ To register custom resource use :meth:`UrlDispatcher.register_resource`. Resource instance must implement `AbstractResource` interface. .. _aiohttp-web-app-runners: Application runners ------------------- :func:`run_app` provides a simple *blocking* API for running an :class:`Application`. For starting the application *asynchronously* or serving on multiple HOST/PORT :class:`AppRunner` exists. The simple startup code for serving HTTP site on ``'localhost'``, port ``8080`` looks like:: runner = web.AppRunner(app) await runner.setup() site = web.TCPSite(runner, 'localhost', 8080) await site.start() To stop serving call :meth:`AppRunner.cleanup`:: await runner.cleanup() .. versionadded:: 3.0 .. _aiohttp-web-graceful-shutdown: Graceful shutdown ------------------ Stopping *aiohttp web server* by just closing all connections is not always satisfactory. The problem is: if application supports :term:`websocket`\s or *data streaming* it most likely has open connections at server shutdown time. The *library* has no knowledge how to close them gracefully but developer can help by registering :attr:`Application.on_shutdown` signal handler and call the signal on *web server* closing. Developer should keep a list of opened connections (:class:`Application` is a good candidate). The following :term:`websocket` snippet shows an example for websocket handler:: from aiohttp import web import weakref app = web.Application() app['websockets'] = weakref.WeakSet() async def websocket_handler(request): ws = web.WebSocketResponse() await ws.prepare(request) request.app['websockets'].add(ws) try: async for msg in ws: ... finally: request.app['websockets'].discard(ws) return ws Signal handler may look like:: from aiohttp import WSCloseCode async def on_shutdown(app): for ws in set(app['websockets']): await ws.close(code=WSCloseCode.GOING_AWAY, message='Server shutdown') app.on_shutdown.append(on_shutdown) Both :func:`run_app` and :meth:`AppRunner.cleanup` call shutdown signal handlers. .. 
_aiohttp-web-background-tasks: Background tasks ----------------- Sometimes there's a need to perform some asynchronous operations just after application start-up. Even more, in some sophisticated systems there could be a need to run some background tasks in the event loop along with the application's request handler. Such as listening to message queue or other network message/event sources (e.g. ZeroMQ, Redis Pub/Sub, AMQP, etc.) to react to received messages within the application. For example the background task could listen to ZeroMQ on :data:`zmq.SUB` socket, process and forward retrieved messages to clients connected via WebSocket that are stored somewhere in the application (e.g. in the :obj:`application['websockets']` list). To run such short and long running background tasks aiohttp provides an ability to register :attr:`Application.on_startup` signal handler(s) that will run along with the application's request handler. For example there's a need to run one quick task and two long running tasks that will live till the application is alive. The appropriate background tasks could be registered as an :attr:`Application.on_startup` signal handlers as shown in the example below:: async def listen_to_redis(app): try: sub = await aioredis.create_redis(('localhost', 6379)) ch, *_ = await sub.subscribe('news') async for msg in ch.iter(encoding='utf-8'): # Forward message to all connected websockets: for ws in app['websockets']: ws.send_str('{}: {}'.format(ch.name, msg)) except asyncio.CancelledError: pass finally: await sub.unsubscribe(ch.name) await sub.quit() async def start_background_tasks(app): app['redis_listener'] = asyncio.create_task(listen_to_redis(app)) async def cleanup_background_tasks(app): app['redis_listener'].cancel() await app['redis_listener'] app = web.Application() app.on_startup.append(start_background_tasks) app.on_cleanup.append(cleanup_background_tasks) web.run_app(app) The task :func:`listen_to_redis` will run forever. To shut it down correctly :attr:`Application.on_cleanup` signal handler may be used to send a cancellation to it. Handling error pages -------------------- Pages like *404 Not Found* and *500 Internal Error* could be handled by custom middleware, see :ref:`polls demo ` for example. .. _aiohttp-web-forwarded-support: Deploying behind a Proxy ------------------------ As discussed in :ref:`aiohttp-deployment` the preferable way is deploying *aiohttp* web server behind a *Reverse Proxy Server* like :term:`nginx` for production usage. In this way properties like :attr:`BaseRequest.scheme` :attr:`BaseRequest.host` and :attr:`BaseRequest.remote` are incorrect. Real values should be given from proxy server, usually either ``Forwarded`` or old-fashion ``X-Forwarded-For``, ``X-Forwarded-Host``, ``X-Forwarded-Proto`` HTTP headers are used. *aiohttp* does not take *forwarded* headers into account by default because it produces *security issue*: HTTP client might add these headers too, pushing non-trusted data values. That's why *aiohttp server* should setup *forwarded* headers in custom middleware in tight conjunction with *reverse proxy configuration*. For changing :attr:`BaseRequest.scheme` :attr:`BaseRequest.host` and :attr:`BaseRequest.remote` the middleware might use :meth:`BaseRequest.clone`. .. seealso:: https://github.com/aio-libs/aiohttp-remotes provides secure helpers for modifying *scheme*, *host* and *remote* attributes according to ``Forwarded`` and ``X-Forwarded-*`` HTTP headers. 
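For illustration only, such a middleware might look like the following sketch. It assumes the application is reachable *exclusively* through a trusted reverse proxy that always sets the ``X-Forwarded-*`` headers; the middleware name is just an example::

    from aiohttp import web

    @web.middleware
    async def forwarded_middleware(request, handler):
        # Only safe when a trusted reverse proxy overwrites these headers.
        scheme = request.headers.get('X-Forwarded-Proto', request.scheme)
        host = request.headers.get('X-Forwarded-Host', request.host)
        forwarded_for = request.headers.get('X-Forwarded-For')
        remote = (forwarded_for.split(',')[0].strip()
                  if forwarded_for else request.remote)
        request = request.clone(scheme=scheme, host=host, remote=remote)
        return await handler(request)

    app = web.Application(middlewares=[forwarded_middleware])

For anything beyond such a controlled setup, prefer the ready-made helpers from ``aiohttp-remotes``.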
Swagger support --------------- `aiohttp-swagger `_ is a library that allow to add Swagger documentation and embed the Swagger-UI into your :mod:`aiohttp.web` project. CORS support ------------ :mod:`aiohttp.web` itself does not support `Cross-Origin Resource Sharing `_, but there is an aiohttp plugin for it: `aiohttp_cors `_. Debug Toolbar ------------- `aiohttp-debugtoolbar`_ is a very useful library that provides a debugging toolbar while you're developing an :mod:`aiohttp.web` application. Install it with ``pip``: .. code-block:: shell $ pip install aiohttp_debugtoolbar Just call :func:`aiohttp_debugtoolbar.setup`:: import aiohttp_debugtoolbar from aiohttp_debugtoolbar import toolbar_middleware_factory app = web.Application() aiohttp_debugtoolbar.setup(app) The toolbar is ready to use. Enjoy!!! .. _aiohttp-debugtoolbar: https://github.com/aio-libs/aiohttp_debugtoolbar Dev Tools --------- `aiohttp-devtools`_ provides a couple of tools to simplify development of :mod:`aiohttp.web` applications. Install with ``pip``: .. code-block:: shell $ pip install aiohttp-devtools * ``runserver`` provides a development server with auto-reload, live-reload, static file serving and aiohttp_debugtoolbar_ integration. * ``start`` is a `cookiecutter command which does the donkey work of creating new :mod:`aiohttp.web` Applications. Documentation and a complete tutorial of creating and running an app locally are available at `aiohttp-devtools`_. .. _aiohttp-devtools: https://github.com/aio-libs/aiohttp-devtools aiohttp-3.6.2/docs/web_lowlevel.rst0000644000175100001650000000521713547410117017642 0ustar vstsdocker00000000000000.. _aiohttp-web-lowlevel: Low Level Server ================ .. currentmodule:: aiohttp.web This topic describes :mod:`aiohttp.web` based *low level* API. Abstract -------- Sometimes user don't need high-level concepts introduced in :ref:`aiohttp-web`: applications, routers, middlewares and signals. All what is needed is supporting asynchronous callable which accepts a request and returns a response object. This is done by introducing :class:`aiohttp.web.Server` class which serves a *protocol factory* role for :meth:`asyncio.AbstractEventLoop.create_server` and bridges data stream to *web handler* and sends result back. Low level *web handler* should accept the single :class:`BaseRequest` parameter and performs one of the following actions: 1. Return a :class:`Response` with the whole HTTP body stored in memory. 2. Create a :class:`StreamResponse`, send headers by :meth:`StreamResponse.prepare` call, send data chunks by :meth:`StreamResponse.write` and return finished response. 3. Raise :class:`HTTPException` derived exception (see :ref:`aiohttp-web-exceptions` section). All other exceptions not derived from :class:`HTTPException` leads to *500 Internal Server Error* response. 4. Initiate and process Web-Socket connection by :class:`WebSocketResponse` using (see :ref:`aiohttp-web-websockets`). 
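For example, option 2 above could be sketched as a minimal streaming *web handler*::

    from aiohttp import web

    async def stream_handler(request):
        resp = web.StreamResponse()
        resp.content_type = 'text/plain'
        await resp.prepare(request)           # send status line and headers
        for chunk in (b'hello ', b'low-level ', b'world'):
            await resp.write(chunk)           # send body chunks
        await resp.write_eof()                # finish the response
        return resp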
Run a Basic Low-Level Server ---------------------------- The following code demonstrates very trivial usage example:: import asyncio from aiohttp import web async def handler(request): return web.Response(text="OK") async def main(): server = web.Server(handler) runner = web.ServerRunner(server) await runner.setup() site = web.TCPSite(runner, 'localhost', 8080) await site.start() print("======= Serving on http://127.0.0.1:8080/ ======") # pause here for very long time by serving HTTP requests and # waiting for keyboard interruption await asyncio.sleep(100*3600) loop = asyncio.get_event_loop() try: loop.run_until_complete(main()) except KeyboardInterrupt: pass loop.close() In the snippet we have ``handler`` which returns a regular :class:`Response` with ``"OK"`` in BODY. This *handler* is processed by ``server`` (:class:`Server` which acts as *protocol factory*). Network communication is created by :ref:`runners API ` to serve ``http://127.0.0.1:8080/``. The handler should process every request for every *path*, e.g. ``GET``, ``POST``, Web-Socket. The example is very basic: it always return ``200 OK`` response, real life code is much more complex usually. aiohttp-3.6.2/docs/web_quickstart.rst0000644000175100001650000005442113547410117020204 0ustar vstsdocker00000000000000.. _aiohttp-web-quickstart: Web Server Quickstart ===================== .. currentmodule:: aiohttp.web Run a Simple Web Server ----------------------- In order to implement a web server, first create a :ref:`request handler `. A request handler must be a :ref:`coroutine ` that accepts a :class:`Request` instance as its only parameter and returns a :class:`Response` instance:: from aiohttp import web async def hello(request): return web.Response(text="Hello, world") Next, create an :class:`Application` instance and register the request handler on a particular *HTTP method* and *path*:: app = web.Application() app.add_routes([web.get('/', hello)]) After that, run the application by :func:`run_app` call:: web.run_app(app) That's it. Now, head over to ``http://localhost:8080/`` to see the results. Alternatively if you prefer *route decorators* create a *route table* and register a :term:`web-handler`:: routes = web.RouteTableDef() @routes.get('/') async def hello(request): return web.Response(text="Hello, world") app = web.Application() app.add_routes(routes) web.run_app(app) Both ways essentially do the same work, the difference is only in your taste: do you prefer *Django style* with famous ``urls.py`` or *Flask* with shiny route decorators. *aiohttp* server documentation uses both ways in code snippets to emphasize their equality, switching from one style to another is very trivial. .. seealso:: :ref:`aiohttp-web-graceful-shutdown` section explains what :func:`run_app` does and how to implement complex server initialization/finalization from scratch. :ref:`aiohttp-web-app-runners` for more handling more complex cases like *asynchronous* web application serving and multiple hosts support. .. _aiohttp-web-cli: Command Line Interface (CLI) ---------------------------- :mod:`aiohttp.web` implements a basic CLI for quickly serving an :class:`Application` in *development* over TCP/IP: .. 
code-block:: shell $ python -m aiohttp.web -H localhost -P 8080 package.module:init_func ``package.module:init_func`` should be an importable :term:`callable` that accepts a list of any non-parsed command-line arguments and returns an :class:`Application` instance after setting it up:: def init_func(argv): app = web.Application() app.router.add_get("/", index_handler) return app .. _aiohttp-web-handler: Handler ------- A request handler must be a :ref:`coroutine` that accepts a :class:`Request` instance as its only argument and returns a :class:`StreamResponse` derived (e.g. :class:`Response`) instance:: async def handler(request): return web.Response() Handlers are setup to handle requests by registering them with the :meth:`Application.add_routes` on a particular route (*HTTP method* and *path* pair) using helpers like :func:`get` and :func:`post`:: app.add_routes([web.get('/', handler), web.post('/post', post_handler), web.put('/put', put_handler)]) Or use *route decorators*:: routes = web.RouteTableDef() @routes.get('/') async def get_handler(request): ... @routes.post('/post') async def post_handler(request): ... @routes.put('/put') async def put_handler(request): ... app.add_routes(routes) Wildcard *HTTP method* is also supported by :func:`route` or :meth:`RouteTableDef.route`, allowing a handler to serve incoming requests on a *path* having **any** *HTTP method*:: app.add_routes([web.route('*', '/path', all_handler)]) The *HTTP method* can be queried later in the request handler using the :attr:`Request.method` property. By default endpoints added with ``GET`` method will accept ``HEAD`` requests and return the same response headers as they would for a ``GET`` request. You can also deny ``HEAD`` requests on a route:: web.get('/', handler, allow_head=False) Here ``handler`` won't be called on ``HEAD`` request and the server will respond with ``405: Method Not Allowed``. .. _aiohttp-web-resource-and-route: Resources and Routes -------------------- Internally routes are served by :attr:`Application.router` (:class:`UrlDispatcher` instance). The *router* is a list of *resources*. Resource is an entry in *route table* which corresponds to requested URL. Resource in turn has at least one *route*. Route corresponds to handling *HTTP method* by calling *web handler*. Thus when you add a *route* the *resouce* object is created under the hood. The library implementation **merges** all subsequent route additions for the same path adding the only resource for all HTTP methods. Consider two examples:: app.add_routes([web.get('/path1', get_1), web.post('/path1', post_1), web.get('/path2', get_2), web.post('/path2', post_2)] and:: app.add_routes([web.get('/path1', get_1), web.get('/path2', get_2), web.post('/path2', post_2), web.post('/path1', post_1)] First one is *optimized*. You have got the idea. .. _aiohttp-web-variable-handler: Variable Resources ^^^^^^^^^^^^^^^^^^ Resource may have *variable path* also. For instance, a resource with the path ``'/a/{name}/c'`` would match all incoming requests with paths such as ``'/a/b/c'``, ``'/a/1/c'``, and ``'/a/etc/c'``. A variable *part* is specified in the form ``{identifier}``, where the ``identifier`` can be used later in a :ref:`request handler ` to access the matched value for that *part*. 
This is done by looking up the ``identifier`` in the :attr:`Request.match_info` mapping:: @routes.get('/{name}') async def variable_handler(request): return web.Response( text="Hello, {}".format(request.match_info['name'])) By default, each *part* matches the regular expression ``[^{}/]+``. You can also specify a custom regex in the form ``{identifier:regex}``:: web.get(r'/{name:\d+}', handler) .. _aiohttp-web-named-routes: Reverse URL Constructing using Named Resources ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ Routes can also be given a *name*:: @routes.get('/root', name='root') async def handler(request): ... Which can then be used to access and build a *URL* for that resource later (e.g. in a :ref:`request handler `):: url == request.app.router['root'].url_for().with_query({"a": "b", "c": "d"}) assert url == URL('/root?a=b&c=d') A more interesting example is building *URLs* for :ref:`variable resources `:: app.router.add_resource(r'/{user}/info', name='user-info') In this case you can also pass in the *parts* of the route:: url = request.app.router['user-info'].url_for(user='john_doe') url_with_qs = url.with_query("a=b") assert url_with_qs == '/john_doe/info?a=b' Organizing Handlers in Classes ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ As discussed above, :ref:`handlers ` can be first-class coroutines:: async def hello(request): return web.Response(text="Hello, world") app.router.add_get('/', hello) But sometimes it's convenient to group logically similar handlers into a Python *class*. Since :mod:`aiohttp.web` does not dictate any implementation details, application developers can organize handlers in classes if they so wish:: class Handler: def __init__(self): pass async def handle_intro(self, request): return web.Response(text="Hello, world") async def handle_greeting(self, request): name = request.match_info.get('name', "Anonymous") txt = "Hello, {}".format(name) return web.Response(text=txt) handler = Handler() app.add_routes([web.get('/intro', handler.handle_intro), web.get('/greet/{name}', handler.handle_greeting)] .. _aiohttp-web-class-based-views: Class Based Views ^^^^^^^^^^^^^^^^^ :mod:`aiohttp.web` has support for *class based views*. You can derive from :class:`View` and define methods for handling http requests:: class MyView(web.View): async def get(self): return await get_resp(self.request) async def post(self): return await post_resp(self.request) Handlers should be coroutines accepting *self* only and returning response object as regular :term:`web-handler`. Request object can be retrieved by :attr:`View.request` property. After implementing the view (``MyView`` from example above) should be registered in application's router:: web.view('/path/to', MyView) or:: @routes.view('/path/to') class MyView(web.View): ... Example will process GET and POST requests for */path/to* but raise *405 Method not allowed* exception for unimplemented HTTP methods. Resource Views ^^^^^^^^^^^^^^ *All* registered resources in a router can be viewed using the :meth:`UrlDispatcher.resources` method:: for resource in app.router.resources(): print(resource) A *subset* of the resources that were registered with a *name* can be viewed using the :meth:`UrlDispatcher.named_resources` method:: for name, resource in app.router.named_resources().items(): print(name, resource) .. 
_aiohttp-web-alternative-routes-definition: Alternative ways for registering routes --------------------------------------- Code examples shown above use *imperative* style for adding new routes: they call ``app.router.add_get(...)`` etc. There are two alternatives: route tables and route decorators. Route tables look like Django way:: async def handle_get(request): ... async def handle_post(request): ... app.router.add_routes([web.get('/get', handle_get), web.post('/post', handle_post), The snippet calls :meth:`~aiohttp.web.UrlDispather.add_routes` to register a list of *route definitions* (:class:`aiohttp.web.RouteDef` instances) created by :func:`aiohttp.web.get` or :func:`aiohttp.web.post` functions. .. seealso:: :ref:`aiohttp-web-route-def` reference. Route decorators are closer to Flask approach:: routes = web.RouteTableDef() @routes.get('/get') async def handle_get(request): ... @routes.post('/post') async def handle_post(request): ... app.router.add_routes(routes) It is also possible to use decorators with class-based views:: routes = web.RouteTableDef() @routes.view("/view") class MyView(web.View): async def get(self): ... async def post(self): ... app.router.add_routes(routes) The example creates a :class:`aiohttp.web.RouteTableDef` container first. The container is a list-like object with additional decorators :meth:`aiohttp.web.RouteTableDef.get`, :meth:`aiohttp.web.RouteTableDef.post` etc. for registering new routes. After filling the container :meth:`~aiohttp.web.UrlDispather.add_routes` is used for adding registered *route definitions* into application's router. .. seealso:: :ref:`aiohttp-web-route-table-def` reference. All tree ways (imperative calls, route tables and decorators) are equivalent, you could use what do you prefer or even mix them on your own. .. versionadded:: 2.3 JSON Response ------------- It is a common case to return JSON data in response, :mod:`aiohttp.web` provides a shortcut for returning JSON -- :func:`aiohttp.web.json_response`:: async def handler(request): data = {'some': 'data'} return web.json_response(data) The shortcut method returns :class:`aiohttp.web.Response` instance so you can for example set cookies before returning it from handler. User Sessions ------------- Often you need a container for storing user data across requests. The concept is usually called a *session*. :mod:`aiohttp.web` has no built-in concept of a *session*, however, there is a third-party library, :mod:`aiohttp_session`, that adds *session* support:: import asyncio import time import base64 from cryptography import fernet from aiohttp import web from aiohttp_session import setup, get_session, session_middleware from aiohttp_session.cookie_storage import EncryptedCookieStorage async def handler(request): session = await get_session(request) last_visit = session['last_visit'] if 'last_visit' in session else None text = 'Last visited: {}'.format(last_visit) return web.Response(text=text) async def make_app(): app = web.Application() # secret_key must be 32 url-safe base64-encoded bytes fernet_key = fernet.Fernet.generate_key() secret_key = base64.urlsafe_b64decode(fernet_key) setup(app, EncryptedCookieStorage(secret_key)) app.add_routes([web.get('/', handler)]) return app web.run_app(make_app()) .. _aiohttp-web-forms: HTTP Forms ---------- HTTP Forms are supported out of the box. If form's method is ``"GET"`` (``
<form method="get">``) use :attr:`Request.query` for getting form data.

To access form data with the ``"POST"`` method use :meth:`Request.post` or :meth:`Request.multipart`.

:meth:`Request.post` accepts both ``'application/x-www-form-urlencoded'`` and ``'multipart/form-data'`` form data encodings (e.g. ``<form enctype="multipart/form-data">``). It stores file data in a temporary directory. If ``client_max_size`` is specified, ``post`` raises a ``ValueError`` exception when the limit is exceeded.

For efficiency, use :meth:`Request.multipart`; it is especially effective for uploading large files (:ref:`aiohttp-web-file-upload`).

Values submitted by the following form:

.. code-block:: html
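
    <!-- Illustrative reconstruction: a login form whose field names
         match the do_login() handler below -->
    <form action="/login" method="post" accept-charset="utf-8"
          enctype="application/x-www-form-urlencoded">

        <label for="login">Login</label>
        <input id="login" name="login" type="text" value="" autofocus/>

        <label for="password">Password</label>
        <input id="password" name="password" type="password" value=""/>

        <input type="submit" value="login"/>
    </form>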
could be accessed as::

    async def do_login(request):
        data = await request.post()
        login = data['login']
        password = data['password']

.. _aiohttp-web-file-upload:

File Uploads
------------

:mod:`aiohttp.web` has built-in support for handling files uploaded from the browser.

First, make sure that the HTML ``<form>`` element has its *enctype* attribute set to ``enctype="multipart/form-data"``. As an example, here is a form that accepts an MP3 file:

.. code-block:: html
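
    <!-- Illustrative reconstruction: an upload form matching the
         store_mp3_handler() examples below -->
    <form action="/store/mp3" method="post" accept-charset="utf-8"
          enctype="multipart/form-data">

        <label for="name">Name</label>
        <input id="name" name="name" type="text" value=""/>

        <label for="mp3">Mp3</label>
        <input id="mp3" name="mp3" type="file" value=""/>

        <input type="submit" value="submit"/>
    </form>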
    Then, in the :ref:`request handler ` you can access the file input field as a :class:`FileField` instance. :class:`FileField` is simply a container for the file as well as some of its metadata:: async def store_mp3_handler(request): # WARNING: don't do that if you plan to receive large files! data = await request.post() mp3 = data['mp3'] # .filename contains the name of the file in string format. filename = mp3.filename # .file contains the actual file data that needs to be stored somewhere. mp3_file = data['mp3'].file content = mp3_file.read() return web.Response(body=content, headers=MultiDict( {'CONTENT-DISPOSITION': mp3_file})) You might have noticed a big warning in the example above. The general issue is that :meth:`Request.post` reads the whole payload in memory, resulting in possible :abbr:`OOM (Out Of Memory)` errors. To avoid this, for multipart uploads, you should use :meth:`Request.multipart` which returns a :ref:`multipart reader `:: async def store_mp3_handler(request): reader = await request.multipart() # /!\ Don't forget to validate your inputs /!\ # reader.next() will `yield` the fields of your form field = await reader.next() assert field.name == 'name' name = await field.read(decode=True) field = await reader.next() assert field.name == 'mp3' filename = field.filename # You cannot rely on Content-Length if transfer is chunked. size = 0 with open(os.path.join('/spool/yarrr-media/mp3/', filename), 'wb') as f: while True: chunk = await field.read_chunk() # 8192 bytes by default. if not chunk: break size += len(chunk) f.write(chunk) return web.Response(text='{} sized of {} successfully stored' ''.format(filename, size)) .. _aiohttp-web-websockets: WebSockets ---------- :mod:`aiohttp.web` supports *WebSockets* out-of-the-box. To setup a *WebSocket*, create a :class:`WebSocketResponse` in a :ref:`request handler ` and then use it to communicate with the peer:: async def websocket_handler(request): ws = web.WebSocketResponse() await ws.prepare(request) async for msg in ws: if msg.type == aiohttp.WSMsgType.TEXT: if msg.data == 'close': await ws.close() else: await ws.send_str(msg.data + '/answer') elif msg.type == aiohttp.WSMsgType.ERROR: print('ws connection closed with exception %s' % ws.exception()) print('websocket connection closed') return ws The handler should be registered as HTTP GET processor:: app.add_routes([web.get('/ws', websocket_handler)]) .. _aiohttp-web-redirects: Redirects --------- To redirect user to another endpoint - raise :class:`HTTPFound` with an absolute URL, relative URL or view name (the argument from router):: raise web.HTTPFound('/redirect') The following example shows redirect to view named 'login' in routes:: async def handler(request): location = request.app.router['login'].url_for() raise web.HTTPFound(location=location) router.add_get('/handler', handler) router.add_get('/login', login_handler, name='login') Example with login validation:: @aiohttp_jinja2.template('login.html') async def login(request): if request.method == 'POST': form = await request.post() error = validate_login(form) if error: return {'error': error} else: # login form is valid location = request.app.router['index'].url_for() raise web.HTTPFound(location=location) return {} app.router.add_get('/', index, name='index') app.router.add_get('/login', login, name='login') app.router.add_post('/login', login, name='login') .. _aiohttp-web-exceptions: Exceptions ---------- :mod:`aiohttp.web` defines a set of exceptions for every *HTTP status code*. 
Each exception is a subclass of :class:`~HTTPException` and relates to a single HTTP status code:: async def handler(request): raise aiohttp.web.HTTPFound('/redirect') .. warning:: Returning :class:`~HTTPException` or its subclasses is deprecated and will be removed in subsequent aiohttp versions. Each exception class has a status code according to :rfc:`2068`: codes with 100-300 are not really errors; 400s are client errors, and 500s are server errors. HTTP Exception hierarchy chart:: Exception HTTPException HTTPSuccessful * 200 - HTTPOk * 201 - HTTPCreated * 202 - HTTPAccepted * 203 - HTTPNonAuthoritativeInformation * 204 - HTTPNoContent * 205 - HTTPResetContent * 206 - HTTPPartialContent HTTPRedirection * 300 - HTTPMultipleChoices * 301 - HTTPMovedPermanently * 302 - HTTPFound * 303 - HTTPSeeOther * 304 - HTTPNotModified * 305 - HTTPUseProxy * 307 - HTTPTemporaryRedirect * 308 - HTTPPermanentRedirect HTTPError HTTPClientError * 400 - HTTPBadRequest * 401 - HTTPUnauthorized * 402 - HTTPPaymentRequired * 403 - HTTPForbidden * 404 - HTTPNotFound * 405 - HTTPMethodNotAllowed * 406 - HTTPNotAcceptable * 407 - HTTPProxyAuthenticationRequired * 408 - HTTPRequestTimeout * 409 - HTTPConflict * 410 - HTTPGone * 411 - HTTPLengthRequired * 412 - HTTPPreconditionFailed * 413 - HTTPRequestEntityTooLarge * 414 - HTTPRequestURITooLong * 415 - HTTPUnsupportedMediaType * 416 - HTTPRequestRangeNotSatisfiable * 417 - HTTPExpectationFailed * 421 - HTTPMisdirectedRequest * 422 - HTTPUnprocessableEntity * 424 - HTTPFailedDependency * 426 - HTTPUpgradeRequired * 428 - HTTPPreconditionRequired * 429 - HTTPTooManyRequests * 431 - HTTPRequestHeaderFieldsTooLarge * 451 - HTTPUnavailableForLegalReasons HTTPServerError * 500 - HTTPInternalServerError * 501 - HTTPNotImplemented * 502 - HTTPBadGateway * 503 - HTTPServiceUnavailable * 504 - HTTPGatewayTimeout * 505 - HTTPVersionNotSupported * 506 - HTTPVariantAlsoNegotiates * 507 - HTTPInsufficientStorage * 510 - HTTPNotExtended * 511 - HTTPNetworkAuthenticationRequired All HTTP exceptions have the same constructor signature:: HTTPNotFound(*, headers=None, reason=None, body=None, text=None, content_type=None) If not directly specified, *headers* will be added to the *default response headers*. Classes :class:`HTTPMultipleChoices`, :class:`HTTPMovedPermanently`, :class:`HTTPFound`, :class:`HTTPSeeOther`, :class:`HTTPUseProxy`, :class:`HTTPTemporaryRedirect` have the following constructor signature:: HTTPFound(location, *, headers=None, reason=None, body=None, text=None, content_type=None) where *location* is value for *Location HTTP header*. :class:`HTTPMethodNotAllowed` is constructed by providing the incoming unsupported method and list of allowed methods:: HTTPMethodNotAllowed(method, allowed_methods, *, headers=None, reason=None, body=None, text=None, content_type=None) aiohttp-3.6.2/docs/web_reference.rst0000644000175100001650000026161513547410117017755 0ustar vstsdocker00000000000000.. _aiohttp-web-reference: Server Reference ================ .. currentmodule:: aiohttp.web .. _aiohttp-web-request: Request and Base Request ------------------------ The Request object contains all the information about an incoming HTTP request. :class:`BaseRequest` is used for :ref:`Low-Level Servers` (which have no applications, routers, signals and middlewares). :class:`Request` has an :attr:`Request.app` and :attr:`Request.match_info` attributes. 
A :class:`BaseRequest` / :class:`Request` are :obj:`dict` like objects, allowing them to be used for :ref:`sharing data` among :ref:`aiohttp-web-middlewares` and :ref:`aiohttp-web-signals` handlers. .. class:: BaseRequest .. attribute:: version *HTTP version* of request, Read-only property. Returns :class:`aiohttp.protocol.HttpVersion` instance. .. attribute:: method *HTTP method*, read-only property. The value is upper-cased :class:`str` like ``"GET"``, ``"POST"``, ``"PUT"`` etc. .. attribute:: url A :class:`~yarl.URL` instance with absolute URL to resource (*scheme*, *host* and *port* are included). .. note:: In case of malformed request (e.g. without ``"HOST"`` HTTP header) the absolute url may be unavailable. .. attribute:: rel_url A :class:`~yarl.URL` instance with relative URL to resource (contains *path*, *query* and *fragment* parts only, *scheme*, *host* and *port* are excluded). The property is equal to ``.url.relative()`` but is always present. .. seealso:: A note from :attr:`url`. .. attribute:: scheme A string representing the scheme of the request. The scheme is ``'https'`` if transport for request handling is *SSL*, ``'http'`` otherwise. The value could be overridden by :meth:`~BaseRequest.clone`. Read-only :class:`str` property. .. versionchanged:: 2.3 *Forwarded* and *X-Forwarded-Proto* are not used anymore. Call ``.clone(scheme=new_scheme)`` for setting up the value explicitly. .. seealso:: :ref:`aiohttp-web-forwarded-support` .. attribute:: secure Shorthand for ``request.url.scheme == 'https'`` Read-only :class:`bool` property. .. seealso:: :attr:`scheme` .. attribute:: forwarded A tuple containing all parsed Forwarded header(s). Makes an effort to parse Forwarded headers as specified by :rfc:`7239`: - It adds one (immutable) dictionary per Forwarded ``field-value``, i.e. per proxy. The element corresponds to the data in the Forwarded ``field-value`` added by the first proxy encountered by the client. Each subsequent item corresponds to those added by later proxies. - It checks that every value has valid syntax in general as specified in :rfc:`7239#section-4`: either a ``token`` or a ``quoted-string``. - It un-escapes ``quoted-pairs``. - It does NOT validate 'by' and 'for' contents as specified in :rfc:`7239#section-6`. - It does NOT validate ``host`` contents (Host ABNF). - It does NOT validate ``proto`` contents for valid URI scheme names. Returns a tuple containing one or more ``MappingProxy`` objects .. seealso:: :attr:`scheme` .. seealso:: :attr:`host` .. attribute:: host Host name of the request, resolved in this order: - Overridden value by :meth:`~BaseRequest.clone` call. - *Host* HTTP header - :func:`socket.gtfqdn` Read-only :class:`str` property. .. versionchanged:: 2.3 *Forwarded* and *X-Forwarded-Host* are not used anymore. Call ``.clone(host=new_host)`` for setting up the value explicitly. .. seealso:: :ref:`aiohttp-web-forwarded-support` .. attribute:: remote Originating IP address of a client initiated HTTP request. The IP is resolved through the following headers, in this order: - Overridden value by :meth:`~BaseRequest.clone` call. - Peer name of opened socket. Read-only :class:`str` property. Call ``.clone(remote=new_remote)`` for setting up the value explicitly. .. versionadded:: 2.3 .. seealso:: :ref:`aiohttp-web-forwarded-support` .. attribute:: path_qs The URL including PATH_INFO and the query string. e.g., ``/app/blog?id=10`` Read-only :class:`str` property. .. attribute:: path The URL including *PATH INFO* without the host or scheme. 
e.g., ``/app/blog``. The path is URL-decoded. For raw path info see :attr:`raw_path`. Read-only :class:`str` property. .. attribute:: raw_path The URL including raw *PATH INFO* without the host or scheme. Warning, the path may be URL-encoded and may contain invalid URL characters, e.g. ``/my%2Fpath%7Cwith%21some%25strange%24characters``. For URL-decoded version please take a look on :attr:`path`. Read-only :class:`str` property. .. attribute:: query A multidict with all the variables in the query string. Read-only :class:`~multidict.MultiDictProxy` lazy property. .. attribute:: query_string The query string in the URL, e.g., ``id=10`` Read-only :class:`str` property. .. attribute:: headers A case-insensitive multidict proxy with all headers. Read-only :class:`~multidict.CIMultiDictProxy` property. .. attribute:: raw_headers HTTP headers of response as unconverted bytes, a sequence of ``(key, value)`` pairs. .. attribute:: keep_alive ``True`` if keep-alive connection enabled by HTTP client and protocol version supports it, otherwise ``False``. Read-only :class:`bool` property. .. attribute:: transport An :ref:`transport` used to process request, Read-only property. The property can be used, for example, for getting IP address of client's peer:: peername = request.transport.get_extra_info('peername') if peername is not None: host, port = peername .. attribute:: loop An event loop instance used by HTTP request handling. Read-only :class:`asyncio.AbstractEventLoop` property. .. deprecated:: 3.5 .. attribute:: cookies A multidict of all request's cookies. Read-only :class:`~multidict.MultiDictProxy` lazy property. .. attribute:: content A :class:`~aiohttp.StreamReader` instance, input stream for reading request's *BODY*. Read-only property. .. attribute:: body_exists Return ``True`` if request has *HTTP BODY*, ``False`` otherwise. Read-only :class:`bool` property. .. versionadded:: 2.3 .. attribute:: can_read_body Return ``True`` if request's *HTTP BODY* can be read, ``False`` otherwise. Read-only :class:`bool` property. .. versionadded:: 2.3 .. attribute:: has_body Return ``True`` if request's *HTTP BODY* can be read, ``False`` otherwise. Read-only :class:`bool` property. .. deprecated:: 2.3 Use :meth:`can_read_body` instead. .. attribute:: content_type Read-only property with *content* part of *Content-Type* header. Returns :class:`str` like ``'text/html'`` .. note:: Returns value is ``'application/octet-stream'`` if no Content-Type header present in HTTP headers according to :rfc:`2616` .. attribute:: charset Read-only property that specifies the *encoding* for the request's BODY. The value is parsed from the *Content-Type* HTTP header. Returns :class:`str` like ``'utf-8'`` or ``None`` if *Content-Type* has no charset information. .. attribute:: content_length Read-only property that returns length of the request's BODY. The value is parsed from the *Content-Length* HTTP header. Returns :class:`int` or ``None`` if *Content-Length* is absent. .. attribute:: http_range Read-only property that returns information about *Range* HTTP header. Returns a :class:`slice` where ``.start`` is *left inclusive bound*, ``.stop`` is *right exclusive bound* and ``.step`` is ``1``. The property might be used in two manners: 1. Attribute-access style (example assumes that both left and right borders are set, the real logic for case of open bounds is more complex):: rng = request.http_range with open(filename, 'rb') as f: f.seek(rng.start) return f.read(rng.stop-rng.start) 2. 
Slice-style:: return buffer[request.http_range] .. attribute:: if_modified_since Read-only property that returns the date specified in the *If-Modified-Since* header. Returns :class:`datetime.datetime` or ``None`` if *If-Modified-Since* header is absent or is not a valid HTTP date. .. attribute:: if_unmodified_since Read-only property that returns the date specified in the *If-Unmodified-Since* header. Returns :class:`datetime.datetime` or ``None`` if *If-Unmodified-Since* header is absent or is not a valid HTTP date. .. versionadded:: 3.1 .. attribute:: if_range Read-only property that returns the date specified in the *If-Range* header. Returns :class:`datetime.datetime` or ``None`` if *If-Range* header is absent or is not a valid HTTP date. .. versionadded:: 3.1 .. method:: clone(*, method=..., rel_url=..., headers=...) Clone itself with replacement some attributes. Creates and returns a new instance of Request object. If no parameters are given, an exact copy is returned. If a parameter is not passed, it will reuse the one from the current request object. :param str method: http method :param rel_url: url to use, :class:`str` or :class:`~yarl.URL` :param headers: :class:`~multidict.CIMultiDict` or compatible headers container. :return: a cloned :class:`Request` instance. .. comethod:: read() Read request body, returns :class:`bytes` object with body content. .. note:: The method **does** store read data internally, subsequent :meth:`~Request.read` call will return the same value. .. comethod:: text() Read request body, decode it using :attr:`charset` encoding or ``UTF-8`` if no encoding was specified in *MIME-type*. Returns :class:`str` with body content. .. note:: The method **does** store read data internally, subsequent :meth:`~Request.text` call will return the same value. .. comethod:: json(*, loads=json.loads) Read request body decoded as *json*. The method is just a boilerplate :ref:`coroutine ` implemented as:: async def json(self, *, loads=json.loads): body = await self.text() return loads(body) :param callable loads: any :term:`callable` that accepts :class:`str` and returns :class:`dict` with parsed JSON (:func:`json.loads` by default). .. note:: The method **does** store read data internally, subsequent :meth:`~Request.json` call will return the same value. .. comethod:: multipart() Returns :class:`aiohttp.multipart.MultipartReader` which processes incoming *multipart* request. The method is just a boilerplate :ref:`coroutine ` implemented as:: async def multipart(self, *, reader=aiohttp.multipart.MultipartReader): return reader(self.headers, self._payload) This method is a coroutine for consistency with the else reader methods. .. warning:: The method **does not** store read data internally. That means once you exhausts multipart reader, you cannot get the request payload one more time. .. seealso:: :ref:`aiohttp-multipart` .. versionchanged:: 3.4 Dropped *reader* parameter. .. comethod:: post() A :ref:`coroutine ` that reads POST parameters from request body. Returns :class:`~multidict.MultiDictProxy` instance filled with parsed data. If :attr:`method` is not *POST*, *PUT*, *PATCH*, *TRACE* or *DELETE* or :attr:`content_type` is not empty or *application/x-www-form-urlencoded* or *multipart/form-data* returns empty multidict. .. note:: The method **does** store read data internally, subsequent :meth:`~Request.post` call will return the same value. .. comethod:: release() Release request. Eat unread part of HTTP BODY if present. .. 
note:: User code may never call :meth:`~Request.release`, all required work will be processed by :mod:`aiohttp.web` internal machinery. .. class:: Request A request used for receiving request's information by *web handler*. Every :ref:`handler` accepts a request instance as the first positional parameter. The class in derived from :class:`BaseRequest`, shares all parent's attributes and methods but has a couple of additional properties: .. attribute:: match_info Read-only property with :class:`~aiohttp.abc.AbstractMatchInfo` instance for result of route resolving. .. note:: Exact type of property depends on used router. If ``app.router`` is :class:`UrlDispatcher` the property contains :class:`UrlMappingMatchInfo` instance. .. attribute:: app An :class:`Application` instance used to call :ref:`request handler `, Read-only property. .. attribute:: config_dict A :class:`aiohttp.ChainMapProxy` instance for mapping all properties from the current application returned by :attr:`app` property and all its parents. .. seealso:: :ref:`aiohttp-web-data-sharing-app-config` .. versionadded:: 3.2 .. note:: You should never create the :class:`Request` instance manually -- :mod:`aiohttp.web` does it for you. But :meth:`~BaseRequest.clone` may be used for cloning *modified* request copy with changed *path*, *method* etc. .. _aiohttp-web-response: Response classes ---------------- For now, :mod:`aiohttp.web` has three classes for the *HTTP response*: :class:`StreamResponse`, :class:`Response` and :class:`FileResponse`. Usually you need to use the second one. :class:`StreamResponse` is intended for streaming data, while :class:`Response` contains *HTTP BODY* as an attribute and sends own content as single piece with the correct *Content-Length HTTP header*. For sake of design decisions :class:`Response` is derived from :class:`StreamResponse` parent class. The response supports *keep-alive* handling out-of-the-box if *request* supports it. You can disable *keep-alive* by :meth:`~StreamResponse.force_close` though. The common case for sending an answer from :ref:`web-handler` is returning a :class:`Response` instance:: async def handler(request): return Response(text="All right!") Response classes are :obj:`dict` like objects, allowing them to be used for :ref:`sharing data` among :ref:`aiohttp-web-middlewares` and :ref:`aiohttp-web-signals` handlers:: resp['key'] = value .. versionadded:: 3.0 Dict-like interface support. StreamResponse ^^^^^^^^^^^^^^ .. class:: StreamResponse(*, status=200, reason=None) The base class for the *HTTP response* handling. Contains methods for setting *HTTP response headers*, *cookies*, *response status code*, writing *HTTP response BODY* and so on. The most important thing you should know about *response* --- it is *Finite State Machine*. That means you can do any manipulations with *headers*, *cookies* and *status code* only before :meth:`prepare` coroutine is called. Once you call :meth:`prepare` any change of the *HTTP header* part will raise :exc:`RuntimeError` exception. Any :meth:`write` call after :meth:`write_eof` is also forbidden. :param int status: HTTP status code, ``200`` by default. :param str reason: HTTP reason. If param is ``None`` reason will be calculated basing on *status* parameter. Otherwise pass :class:`str` with arbitrary *status* explanation.. .. attribute:: prepared Read-only :class:`bool` property, ``True`` if :meth:`prepare` has been called, ``False`` otherwise. .. attribute:: task A task that serves HTTP request handling. 
May be useful for graceful shutdown of long-running requests (streaming, long polling or web-socket). .. attribute:: status Read-only property for *HTTP response status code*, :class:`int`. ``200`` (OK) by default. .. attribute:: reason Read-only property for *HTTP response reason*, :class:`str`. .. method:: set_status(status, reason=None) Set :attr:`status` and :attr:`reason`. *reason* value is auto calculated if not specified (``None``). .. attribute:: keep_alive Read-only property, copy of :attr:`Request.keep_alive` by default. Can be switched to ``False`` by :meth:`force_close` call. .. method:: force_close Disable :attr:`keep_alive` for connection. There are no ways to enable it back. .. attribute:: compression Read-only :class:`bool` property, ``True`` if compression is enabled. ``False`` by default. .. seealso:: :meth:`enable_compression` .. method:: enable_compression(force=None) Enable compression. When *force* is unset compression encoding is selected based on the request's *Accept-Encoding* header. *Accept-Encoding* is not checked if *force* is set to a :class:`ContentCoding`. .. seealso:: :attr:`compression` .. attribute:: chunked Read-only property, indicates if chunked encoding is on. Can be enabled by :meth:`enable_chunked_encoding` call. .. seealso:: :attr:`enable_chunked_encoding` .. method:: enable_chunked_encoding Enables :attr:`chunked` encoding for response. There are no ways to disable it back. With enabled :attr:`chunked` encoding each :meth:`write` operation encoded in separate chunk. .. warning:: chunked encoding can be enabled for ``HTTP/1.1`` only. Setting up both :attr:`content_length` and chunked encoding is mutually exclusive. .. seealso:: :attr:`chunked` .. attribute:: headers :class:`~multidict.CIMultiDict` instance for *outgoing* *HTTP headers*. .. attribute:: cookies An instance of :class:`http.cookies.SimpleCookie` for *outgoing* cookies. .. warning:: Direct setting up *Set-Cookie* header may be overwritten by explicit calls to cookie manipulation. We are encourage using of :attr:`cookies` and :meth:`set_cookie`, :meth:`del_cookie` for cookie manipulations. .. method:: set_cookie(name, value, *, path='/', expires=None, \ domain=None, max_age=None, \ secure=None, httponly=None, version=None) Convenient way for setting :attr:`cookies`, allows to specify some additional properties like *max_age* in a single call. :param str name: cookie name :param str value: cookie value (will be converted to :class:`str` if value has another type). :param expires: expiration date (optional) :param str domain: cookie domain (optional) :param int max_age: defines the lifetime of the cookie, in seconds. The delta-seconds value is a decimal non- negative integer. After delta-seconds seconds elapse, the client should discard the cookie. A value of zero means the cookie should be discarded immediately. (optional) :param str path: specifies the subset of URLs to which this cookie applies. (optional, ``'/'`` by default) :param bool secure: attribute (with no value) directs the user agent to use only (unspecified) secure means to contact the origin server whenever it sends back this cookie. The user agent (possibly under the user's control) may determine what level of security it considers appropriate for "secure" cookies. The *secure* should be considered security advice from the server to the user agent, indicating that it is in the session's interest to protect the cookie contents. 
(optional) :param bool httponly: ``True`` if the cookie HTTP only (optional) :param int version: a decimal integer, identifies to which version of the state management specification the cookie conforms. (Optional, *version=1* by default) .. warning:: In HTTP version 1.1, ``expires`` was deprecated and replaced with the easier-to-use ``max-age``, but Internet Explorer (IE6, IE7, and IE8) **does not** support ``max-age``. .. method:: del_cookie(name, *, path='/', domain=None) Deletes cookie. :param str name: cookie name :param str domain: optional cookie domain :param str path: optional cookie path, ``'/'`` by default .. attribute:: content_length *Content-Length* for outgoing response. .. attribute:: content_type *Content* part of *Content-Type* for outgoing response. .. attribute:: charset *Charset* aka *encoding* part of *Content-Type* for outgoing response. The value converted to lower-case on attribute assigning. .. attribute:: last_modified *Last-Modified* header for outgoing response. This property accepts raw :class:`str` values, :class:`datetime.datetime` objects, Unix timestamps specified as an :class:`int` or a :class:`float` object, and the value ``None`` to unset the header. .. comethod:: prepare(request) :param aiohttp.web.Request request: HTTP request object, that the response answers. Send *HTTP header*. You should not change any header data after calling this method. The coroutine calls :attr:`~aiohttp.web.Application.on_response_prepare` signal handlers. .. comethod:: write(data) Send byte-ish data as the part of *response BODY*:: await resp.write(data) :meth:`prepare` must be invoked before the call. Raises :exc:`TypeError` if data is not :class:`bytes`, :class:`bytearray` or :class:`memoryview` instance. Raises :exc:`RuntimeError` if :meth:`prepare` has not been called. Raises :exc:`RuntimeError` if :meth:`write_eof` has been called. .. comethod:: write_eof() A :ref:`coroutine` *may* be called as a mark of the *HTTP response* processing finish. *Internal machinery* will call this method at the end of the request processing if needed. After :meth:`write_eof` call any manipulations with the *response* object are forbidden. Response ^^^^^^^^ .. class:: Response(*, body=None, status=200, reason=None, text=None, \ headers=None, content_type=None, charset=None, \ zlib_executor_size=sentinel, zlib_executor=None) The most usable response class, inherited from :class:`StreamResponse`. Accepts *body* argument for setting the *HTTP response BODY*. The actual :attr:`body` sending happens in overridden :meth:`~StreamResponse.write_eof`. :param bytes body: response's BODY :param int status: HTTP status code, 200 OK by default. :param collections.abc.Mapping headers: HTTP headers that should be added to response's ones. :param str text: response's BODY :param str content_type: response's content type. ``'text/plain'`` if *text* is passed also, ``'application/octet-stream'`` otherwise. :param str charset: response's charset. ``'utf-8'`` if *text* is passed also, ``None`` otherwise. :param int zlib_executor_size: length in bytes which will trigger zlib compression of body to happen in an executor .. versionadded:: 3.5 :param int zlib_executor: executor to use for zlib compression .. versionadded:: 3.5 .. attribute:: body Read-write attribute for storing response's content aka BODY, :class:`bytes`. Setting :attr:`body` also recalculates :attr:`~StreamResponse.content_length` value. 
Assigning :class:`str` to :attr:`body` will make the :attr:`body` type of :class:`aiohttp.payload.StringPayload`, which tries to encode the given data based on *Content-Type* HTTP header, while defaulting to ``UTF-8``. Resetting :attr:`body` (assigning ``None``) sets :attr:`~StreamResponse.content_length` to ``None`` too, dropping *Content-Length* HTTP header. .. attribute:: text Read-write attribute for storing response's content, represented as string, :class:`str`. Setting :attr:`text` also recalculates :attr:`~StreamResponse.content_length` value and :attr:`~StreamResponse.body` value Resetting :attr:`text` (assigning ``None``) sets :attr:`~StreamResponse.content_length` to ``None`` too, dropping *Content-Length* HTTP header. WebSocketResponse ^^^^^^^^^^^^^^^^^ .. class:: WebSocketResponse(*, timeout=10.0, receive_timeout=None, \ autoclose=True, autoping=True, heartbeat=None, \ protocols=(), compress=True, max_msg_size=4194304) Class for handling server-side websockets, inherited from :class:`StreamResponse`. After starting (by :meth:`prepare` call) the response you cannot use :meth:`~StreamResponse.write` method but should to communicate with websocket client by :meth:`send_str`, :meth:`receive` and others. To enable back-pressure from slow websocket clients treat methods :meth:`ping()`, :meth:`pong()`, :meth:`send_str()`, :meth:`send_bytes()`, :meth:`send_json()` as coroutines. By default write buffer size is set to 64k. :param bool autoping: Automatically send :const:`~aiohttp.WSMsgType.PONG` on :const:`~aiohttp.WSMsgType.PING` message from client, and handle :const:`~aiohttp.WSMsgType.PONG` responses from client. Note that server does not send :const:`~aiohttp.WSMsgType.PING` requests, you need to do this explicitly using :meth:`ping` method. :param float heartbeat: Send `ping` message every `heartbeat` seconds and wait `pong` response, close connection if `pong` response is not received. The timer is reset on any data reception. :param float receive_timeout: Timeout value for `receive` operations. Default value is None (no timeout for receive operation) :param bool compress: Enable per-message deflate extension support. False for disabled, default value is True. :param int max_msg_size: maximum size of read websocket message, 4 MB by default. To disable the size limit use ``0``. .. versionadded:: 3.3 The class supports ``async for`` statement for iterating over incoming messages:: ws = web.WebSocketResponse() await ws.prepare(request) async for msg in ws: print(msg.data) .. comethod:: prepare(request) Starts websocket. After the call you can use websocket methods. :param aiohttp.web.Request request: HTTP request object, that the response answers. :raises HTTPException: if websocket handshake has failed. .. method:: can_prepare(request) Performs checks for *request* data to figure out if websocket can be started on the request. If :meth:`can_prepare` call is success then :meth:`prepare` will success too. :param aiohttp.web.Request request: HTTP request object, that the response answers. :return: :class:`WebSocketReady` instance. :attr:`WebSocketReady.ok` is ``True`` on success, :attr:`WebSocketReady.protocol` is websocket subprotocol which is passed by client and accepted by server (one of *protocols* sequence from :class:`WebSocketResponse` ctor). :attr:`WebSocketReady.protocol` may be ``None`` if client and server subprotocols are not overlapping. .. note:: The method never raises exception. .. 
attribute:: closed Read-only property, ``True`` if connection has been closed or in process of closing. :const:`~aiohttp.WSMsgType.CLOSE` message has been received from peer. .. attribute:: close_code Read-only property, close code from peer. It is set to ``None`` on opened connection. .. attribute:: ws_protocol Websocket *subprotocol* chosen after :meth:`start` call. May be ``None`` if server and client protocols are not overlapping. .. method:: exception() Returns last occurred exception or None. .. comethod:: ping(message=b'') Send :const:`~aiohttp.WSMsgType.PING` to peer. :param message: optional payload of *ping* message, :class:`str` (converted to *UTF-8* encoded bytes) or :class:`bytes`. :raise RuntimeError: if connections is not started or closing. .. versionchanged:: 3.0 The method is converted into :term:`coroutine` .. comethod:: pong(message=b'') Send *unsolicited* :const:`~aiohttp.WSMsgType.PONG` to peer. :param message: optional payload of *pong* message, :class:`str` (converted to *UTF-8* encoded bytes) or :class:`bytes`. :raise RuntimeError: if connections is not started or closing. .. versionchanged:: 3.0 The method is converted into :term:`coroutine` .. comethod:: send_str(data, compress=None) Send *data* to peer as :const:`~aiohttp.WSMsgType.TEXT` message. :param str data: data to send. :param int compress: sets specific level of compression for single message, ``None`` for not overriding per-socket setting. :raise RuntimeError: if connection is not started or closing :raise TypeError: if data is not :class:`str` .. versionchanged:: 3.0 The method is converted into :term:`coroutine`, *compress* parameter added. .. comethod:: send_bytes(data, compress=None) Send *data* to peer as :const:`~aiohttp.WSMsgType.BINARY` message. :param data: data to send. :param int compress: sets specific level of compression for single message, ``None`` for not overriding per-socket setting. :raise RuntimeError: if connection is not started or closing :raise TypeError: if data is not :class:`bytes`, :class:`bytearray` or :class:`memoryview`. .. versionchanged:: 3.0 The method is converted into :term:`coroutine`, *compress* parameter added. .. comethod:: send_json(data, compress=None, *, dumps=json.dumps) Send *data* to peer as JSON string. :param data: data to send. :param int compress: sets specific level of compression for single message, ``None`` for not overriding per-socket setting. :param callable dumps: any :term:`callable` that accepts an object and returns a JSON string (:func:`json.dumps` by default). :raise RuntimeError: if connection is not started or closing :raise ValueError: if data is not serializable object :raise TypeError: if value returned by ``dumps`` param is not :class:`str` .. versionchanged:: 3.0 The method is converted into :term:`coroutine`, *compress* parameter added. .. comethod:: close(*, code=1000, message=b'') A :ref:`coroutine` that initiates closing handshake by sending :const:`~aiohttp.WSMsgType.CLOSE` message. It is safe to call `close()` from different task. :param int code: closing code :param message: optional payload of *pong* message, :class:`str` (converted to *UTF-8* encoded bytes) or :class:`bytes`. :raise RuntimeError: if connection is not started .. comethod:: receive(timeout=None) A :ref:`coroutine` that waits upcoming *data* message from peer and returns it. The coroutine implicitly handles :const:`~aiohttp.WSMsgType.PING`, :const:`~aiohttp.WSMsgType.PONG` and :const:`~aiohttp.WSMsgType.CLOSE` without returning the message. 
It process *ping-pong game* and performs *closing handshake* internally. .. note:: Can only be called by the request handling task. :param timeout: timeout for `receive` operation. timeout value overrides response`s receive_timeout attribute. :return: :class:`~aiohttp.WSMessage` :raise RuntimeError: if connection is not started .. comethod:: receive_str(*, timeout=None) A :ref:`coroutine` that calls :meth:`receive` but also asserts the message type is :const:`~aiohttp.WSMsgType.TEXT`. .. note:: Can only be called by the request handling task. :param timeout: timeout for `receive` operation. timeout value overrides response`s receive_timeout attribute. :return str: peer's message content. :raise TypeError: if message is :const:`~aiohttp.WSMsgType.BINARY`. .. comethod:: receive_bytes(*, timeout=None) A :ref:`coroutine` that calls :meth:`receive` but also asserts the message type is :const:`~aiohttp.WSMsgType.BINARY`. .. note:: Can only be called by the request handling task. :param timeout: timeout for `receive` operation. timeout value overrides response`s receive_timeout attribute. :return bytes: peer's message content. :raise TypeError: if message is :const:`~aiohttp.WSMsgType.TEXT`. .. comethod:: receive_json(*, loads=json.loads, timeout=None) A :ref:`coroutine` that calls :meth:`receive_str` and loads the JSON string to a Python dict. .. note:: Can only be called by the request handling task. :param callable loads: any :term:`callable` that accepts :class:`str` and returns :class:`dict` with parsed JSON (:func:`json.loads` by default). :param timeout: timeout for `receive` operation. timeout value overrides response`s receive_timeout attribute. :return dict: loaded JSON content :raise TypeError: if message is :const:`~aiohttp.WSMsgType.BINARY`. :raise ValueError: if message is not valid JSON. .. seealso:: :ref:`WebSockets handling` WebSocketReady ^^^^^^^^^^^^^^ .. class:: WebSocketReady A named tuple for returning result from :meth:`WebSocketResponse.can_prepare`. Has :class:`bool` check implemented, e.g.:: if not await ws.can_prepare(...): cannot_start_websocket() .. attribute:: ok ``True`` if websocket connection can be established, ``False`` otherwise. .. attribute:: protocol :class:`str` represented selected websocket sub-protocol. .. seealso:: :meth:`WebSocketResponse.can_prepare` json_response ^^^^^^^^^^^^^ .. function:: json_response([data], *, text=None, body=None, \ status=200, reason=None, headers=None, \ content_type='application/json', \ dumps=json.dumps) Return :class:`Response` with predefined ``'application/json'`` content type and *data* encoded by ``dumps`` parameter (:func:`json.dumps` by default). HTTP Exceptions ^^^^^^^^^^^^^^^ Errors can also be returned by raising a HTTP exception instance from within the handler. .. class:: HTTPException(*, headers=None, reason=None, text=None, content_type=None) Low-level HTTP failure. :param headers: headers for the response :type headers: dict or multidict.CIMultiDict :param str reason: reason included in the response :param str text: response's body :param str content_type: response's content type. This is passed through to the :class:`Response` initializer. Sub-classes of ``HTTPException`` exist for the standard HTTP response codes as described in :ref:`aiohttp-web-exceptions` and the expected usage is to simply raise the appropriate exception type to respond with a specific HTTP response code. 
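For illustration, a short hedged sketch of that usage; the ``'user_id'`` path parameter and the ``fetch_user`` helper are assumptions made up for this example::

    from aiohttp import web

    async def handler(request):
        # fetch_user is a hypothetical lookup coroutine, not an aiohttp API.
        user = await fetch_user(request.match_info['user_id'])
        if user is None:
            # Raising the exception finishes the handler and sends a 404 response.
            raise web.HTTPNotFound(text='user not found')
        return web.json_response({'name': user['name']})
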
Since ``HTTPException`` is a sub-class of :class:`Response`, it contains the methods and properties that allow you to directly manipulate details of the response. .. attribute:: status_code HTTP status code for this exception class. This attribute is usually defined at the class level. ``self.status_code`` is passed to the :class:`Response` initializer. .. _aiohttp-web-app-and-router: Application and Router ---------------------- Application ^^^^^^^^^^^ Application is a synonym for web-server. To get fully working example, you have to make *application*, register supported urls in *router* and pass it to :func:`aiohttp.web.run_app` or :class:`aiohttp.web.AppRunner`. *Application* contains a *router* instance and a list of callbacks that will be called during application finishing. :class:`Application` is a :obj:`dict`-like object, so you can use it for :ref:`sharing data` globally by storing arbitrary properties for later access from a :ref:`handler` via the :attr:`Request.app` property:: app = Application() app['database'] = await aiopg.create_engine(**db_config) async def handler(request): with (await request.app['database']) as conn: conn.execute("DELETE * FROM table") Although :class:`Application` is a :obj:`dict`-like object, it can't be duplicated like one using :meth:`Application.copy`. .. class:: Application(*, logger=, router=None,middlewares=(), \ handler_args=None, client_max_size=1024**2, \ loop=None, debug=...) The class inherits :class:`dict`. :param logger: :class:`logging.Logger` instance for storing application logs. By default the value is ``logging.getLogger("aiohttp.web")`` :param router: :class:`aiohttp.abc.AbstractRouter` instance, the system creates :class:`UrlDispatcher` by default if *router* is ``None``. .. deprecated:: 3.3 The custom routers support is deprecated, the parameter will be removed in 4.0. :param middlewares: :class:`list` of middleware factories, see :ref:`aiohttp-web-middlewares` for details. :param handler_args: dict-like object that overrides keyword arguments of :meth:`Application.make_handler` :param client_max_size: client's maximum size in a request, in bytes. If a POST request exceeds this value, it raises an `HTTPRequestEntityTooLarge` exception. :param loop: event loop .. deprecated:: 2.0 The parameter is deprecated. Loop is get set during freeze stage. :param debug: Switches debug mode. .. deprecated:: 3.5 Use asyncio :ref:`asyncio-debug-mode` instead. .. attribute:: router Read-only property that returns *router instance*. .. attribute:: logger :class:`logging.Logger` instance for storing application logs. .. attribute:: loop :ref:`event loop` used for processing HTTP requests. .. deprecated:: 3.5 .. attribute:: debug Boolean value indicating whether the debug mode is turned on or off. .. deprecated:: 3.5 Use asyncio :ref:`asyncio-debug-mode` instead. .. attribute:: on_response_prepare A :class:`~aiohttp.Signal` that is fired at the beginning of :meth:`StreamResponse.prepare` with parameters *request* and *response*. It can be used, for example, to add custom headers to each response before sending. Signal handlers should have the following signature:: async def on_prepare(request, response): pass .. attribute:: on_startup A :class:`~aiohttp.Signal` that is fired on application start-up. Subscribers may use the signal to run background tasks in the event loop along with the application's request handler just after the application start-up. Signal handlers should have the following signature:: async def on_startup(app): pass .. 
seealso:: :ref:`aiohttp-web-signals`. .. attribute:: on_shutdown A :class:`~aiohttp.Signal` that is fired on application shutdown. Subscribers may use the signal for gracefully closing long running connections, e.g. websockets and data streaming. Signal handlers should have the following signature:: async def on_shutdown(app): pass It's up to end user to figure out which :term:`web-handler`\s are still alive and how to finish them properly. We suggest keeping a list of long running handlers in :class:`Application` dictionary. .. seealso:: :ref:`aiohttp-web-graceful-shutdown` and :attr:`on_cleanup`. .. attribute:: on_cleanup A :class:`~aiohttp.Signal` that is fired on application cleanup. Subscribers may use the signal for gracefully closing connections to database server etc. Signal handlers should have the following signature:: async def on_cleanup(app): pass .. seealso:: :ref:`aiohttp-web-signals` and :attr:`on_shutdown`. .. attribute:: cleanup_ctx A list of *context generators* for *startup*/*cleanup* handling. Signal handlers should have the following signature:: async def context(app): # do startup stuff yield # do cleanup .. versionadded:: 3.1 .. seealso:: :ref:`aiohttp-web-cleanup-ctx`. .. method:: add_subapp(prefix, subapp) Register nested sub-application under given path *prefix*. In resolving process if request's path starts with *prefix* then further resolving is passed to *subapp*. :param str prefix: path's prefix for the resource. :param Application subapp: nested application attached under *prefix*. :returns: a :class:`PrefixedSubAppResource` instance. .. method:: add_domain(domain, subapp) Register nested sub-application that serves the domain name or domain name mask. In resolving process if request.headers['host'] matches the pattern *domain* then further resolving is passed to *subapp*. :param str domain: domain or mask of domain for the resource. :param Application subapp: nested application. :returns: a :class:`MatchedSubAppResource` instance. .. method:: add_routes(routes_table) Register route definitions from *routes_table*. The table is a :class:`list` of :class:`RouteDef` items or :class:`RouteTableDef`. The method is a shortcut for ``app.router.add_routes(routes_table)``, see also :meth:`UrlDispatcher.add_routes`. .. versionadded:: 3.1 .. method:: make_handler(loop=None, **kwargs) Creates HTTP protocol factory for handling requests. :param loop: :ref:`event loop` used for processing HTTP requests. If param is ``None`` :func:`asyncio.get_event_loop` used for getting default event loop. .. deprecated:: 2.0 :param bool tcp_keepalive: Enable TCP Keep-Alive. Default: ``True``. :param int keepalive_timeout: Number of seconds before closing Keep-Alive connection. Default: ``75`` seconds (NGINX's default value). :param logger: Custom logger object. Default: :data:`aiohttp.log.server_logger`. :param access_log: Custom logging object. Default: :data:`aiohttp.log.access_logger`. :param access_log_class: Class for `access_logger`. Default: :data:`aiohttp.helpers.AccessLogger`. Must to be a subclass of :class:`aiohttp.abc.AbstractAccessLogger`. :param str access_log_format: Access log format string. Default: :attr:`helpers.AccessLogger.LOG_FORMAT`. :param int max_line_size: Optional maximum header line size. Default: ``8190``. :param int max_headers: Optional maximum header size. Default: ``32768``. :param int max_field_size: Optional maximum header field size. Default: ``8190``. 
:param float lingering_time: Maximum time during which the server reads and ignores additional data coming from the client when lingering close is on. Use ``0`` to disable lingering on server channel closing. You should pass result of the method as *protocol_factory* to :meth:`~asyncio.AbstractEventLoop.create_server`, e.g.:: loop = asyncio.get_event_loop() app = Application() # setup route table # app.router.add_route(...) await loop.create_server(app.make_handler(), '0.0.0.0', 8080) .. deprecated:: 3.2 The method is deprecated and will be removed in future aiohttp versions. Please use :ref:`aiohttp-web-app-runners` instead. .. comethod:: startup() A :ref:`coroutine` that will be called along with the application's request handler. The purpose of the method is calling :attr:`on_startup` signal handlers. .. comethod:: shutdown() A :ref:`coroutine` that should be called on server stopping but before :meth:`cleanup()`. The purpose of the method is calling :attr:`on_shutdown` signal handlers. .. comethod:: cleanup() A :ref:`coroutine` that should be called on server stopping but after :meth:`shutdown`. The purpose of the method is calling :attr:`on_cleanup` signal handlers. .. note:: Application object has :attr:`router` attribute but has no ``add_route()`` method. The reason is: we want to support different router implementations (even maybe not url-matching based but traversal ones). For sake of that fact we have very trivial ABC for :class:`AbstractRouter`: it should have only :meth:`AbstractRouter.resolve` coroutine. No methods for adding routes or route reversing (getting URL by route name). All those are router implementation details (but, sure, you need to deal with that methods after choosing the router for your application). Server ^^^^^^ A protocol factory compatible with :meth:`~asyncio.AbstreactEventLoop.create_server`. .. class:: Server The class is responsible for creating HTTP protocol objects that can handle HTTP connections. .. attribute:: connections List of all currently opened connections. .. attribute:: requests_count Amount of processed requests. .. comethod:: Server.shutdown(timeout) A :ref:`coroutine` that should be called to close all opened connections. Router ^^^^^^ For dispatching URLs to :ref:`handlers` :mod:`aiohttp.web` uses *routers*. Router is any object that implements :class:`AbstractRouter` interface. :mod:`aiohttp.web` provides an implementation called :class:`UrlDispatcher`. :class:`Application` uses :class:`UrlDispatcher` as :meth:`router` by default. .. class:: UrlDispatcher() Straightforward url-matching router, implements :class:`collections.abc.Mapping` for access to *named routes*. Before running :class:`Application` you should fill *route table* first by calling :meth:`add_route` and :meth:`add_static`. :ref:`Handler` lookup is performed by iterating on added *routes* in FIFO order. The first matching *route* will be used to call corresponding *handler*. If on route creation you specify *name* parameter the result is *named route*. *Named route* can be retrieved by ``app.router[name]`` call, checked for existence by ``name in app.router`` etc. .. seealso:: :ref:`Route classes ` .. method:: add_resource(path, *, name=None) Append a :term:`resource` to the end of route table. *path* may be either *constant* string like ``'/a/b/c'`` or *variable rule* like ``'/a/{var}'`` (see :ref:`handling variable paths `) :param str path: resource path spec. :param str name: optional resource name. 
:return: created resource instance (:class:`PlainResource` or :class:`DynamicResource`). .. method:: add_route(method, path, handler, *, \ name=None, expect_handler=None) Append :ref:`handler` to the end of route table. *path* may be either *constant* string like ``'/a/b/c'`` or *variable rule* like ``'/a/{var}'`` (see :ref:`handling variable paths `) Pay attention please: *handler* is converted to coroutine internally when it is a regular function. :param str method: HTTP method for route. Should be one of ``'GET'``, ``'POST'``, ``'PUT'``, ``'DELETE'``, ``'PATCH'``, ``'HEAD'``, ``'OPTIONS'`` or ``'*'`` for any method. The parameter is case-insensitive, e.g. you can push ``'get'`` as well as ``'GET'``. :param str path: route path. Should be started with slash (``'/'``). :param callable handler: route handler. :param str name: optional route name. :param coroutine expect_handler: optional *expect* header handler. :returns: new :class:`PlainRoute` or :class:`DynamicRoute` instance. .. method:: add_routes(routes_table) Register route definitions from *routes_table*. The table is a :class:`list` of :class:`RouteDef` items or :class:`RouteTableDef`. .. versionadded:: 2.3 .. method:: add_get(path, handler, *, name=None, allow_head=True, **kwargs) Shortcut for adding a GET handler. Calls the :meth:`add_route` with \ ``method`` equals to ``'GET'``. If *allow_head* is ``True`` (default) the route for method HEAD is added with the same handler as for GET. If *name* is provided the name for HEAD route is suffixed with ``'-head'``. For example ``router.add_get(path, handler, name='route')`` call adds two routes: first for GET with name ``'route'`` and second for HEAD with name ``'route-head'``. .. method:: add_post(path, handler, **kwargs) Shortcut for adding a POST handler. Calls the :meth:`add_route` with \ ``method`` equals to ``'POST'``. .. method:: add_head(path, handler, **kwargs) Shortcut for adding a HEAD handler. Calls the :meth:`add_route` with \ ``method`` equals to ``'HEAD'``. .. method:: add_put(path, handler, **kwargs) Shortcut for adding a PUT handler. Calls the :meth:`add_route` with \ ``method`` equals to ``'PUT'``. .. method:: add_patch(path, handler, **kwargs) Shortcut for adding a PATCH handler. Calls the :meth:`add_route` with \ ``method`` equals to ``'PATCH'``. .. method:: add_delete(path, handler, **kwargs) Shortcut for adding a DELETE handler. Calls the :meth:`add_route` with \ ``method`` equals to ``'DELETE'``. .. method:: add_view(path, handler, **kwargs) Shortcut for adding a class-based view handler. Calls the \ :meth:`add_route` with ``method`` equals to ``'*'``. .. versionadded:: 3.0 .. method:: add_static(prefix, path, *, name=None, expect_handler=None, \ chunk_size=256*1024, \ response_factory=StreamResponse, \ show_index=False, \ follow_symlinks=False, \ append_version=False) Adds a router and a handler for returning static files. Useful for serving static content like images, javascript and css files. On platforms that support it, the handler will transfer files more efficiently using the ``sendfile`` system call. In some situations it might be necessary to avoid using the ``sendfile`` system call even if the platform supports it. This can be accomplished by by setting environment variable ``AIOHTTP_NOSENDFILE=1``. If a gzip version of the static content exists at file path + ``.gz``, it will be used for the response. .. warning:: Use :meth:`add_static` for development only. In production, static content should be processed by web servers like *nginx* or *apache*. 
:param str prefix: URL path prefix for handled static files :param path: path to the folder in file system that contains handled static files, :class:`str` or :class:`pathlib.Path`. :param str name: optional route name. :param coroutine expect_handler: optional *expect* header handler. :param int chunk_size: size of single chunk for file downloading, 256Kb by default. Increasing *chunk_size* parameter to, say, 1Mb may increase file downloading speed but consumes more memory. :param bool show_index: flag for allowing to show indexes of a directory, by default it's not allowed and HTTP/403 will be returned on directory access. :param bool follow_symlinks: flag for allowing to follow symlinks from a directory, by default it's not allowed and HTTP/404 will be returned on access. :param bool append_version: flag for adding file version (hash) to the url query string, this value will be used as default when you call to :meth:`StaticRoute.url` and :meth:`StaticRoute.url_for` methods. :returns: new :class:`StaticRoute` instance. .. comethod:: resolve(request) A :ref:`coroutine` that returns :class:`AbstractMatchInfo` for *request*. The method never raises exception, but returns :class:`AbstractMatchInfo` instance with: 1. :attr:`~AbstractMatchInfo.http_exception` assigned to :exc:`HTTPException` instance. 2. :attr:`~AbstractMatchInfo.handler` which raises :exc:`HTTPNotFound` or :exc:`HTTPMethodNotAllowed` on handler's execution if there is no registered route for *request*. *Middlewares* can process that exceptions to render pretty-looking error page for example. Used by internal machinery, end user unlikely need to call the method. .. note:: The method uses :attr:`Request.raw_path` for pattern matching against registered routes. .. method:: resources() The method returns a *view* for *all* registered resources. The view is an object that allows to: 1. Get size of the router table:: len(app.router.resources()) 2. Iterate over registered resources:: for resource in app.router.resources(): print(resource) 3. Make a check if the resources is registered in the router table:: route in app.router.resources() .. method:: routes() The method returns a *view* for *all* registered routes. .. method:: named_resources() Returns a :obj:`dict`-like :class:`types.MappingProxyType` *view* over *all* named **resources**. The view maps every named resource's **name** to the :class:`BaseResource` instance. It supports the usual :obj:`dict`-like operations, except for any mutable operations (i.e. it's **read-only**):: len(app.router.named_resources()) for name, resource in app.router.named_resources().items(): print(name, resource) "name" in app.router.named_resources() app.router.named_resources()["name"] .. _aiohttp-web-resource: Resource ^^^^^^^^ Default router :class:`UrlDispatcher` operates with :term:`resource`\s. Resource is an item in *routing table* which has a *path*, an optional unique *name* and at least one :term:`route`. :term:`web-handler` lookup is performed in the following way: 1. Router iterates over *resources* one-by-one. 2. If *resource* matches to requested URL the resource iterates over own *routes*. 3. If route matches to requested HTTP method (or ``'*'`` wildcard) the route's handler is used as found :term:`web-handler`. The lookup is finished. 4. Otherwise router tries next resource from the *routing table*. 5. 
If the end of *routing table* is reached and no *resource* / *route* pair found the *router* returns special :class:`AbstractMatchInfo` instance with :attr:`AbstractMatchInfo.http_exception` is not ``None`` but :exc:`HTTPException` with either *HTTP 404 Not Found* or *HTTP 405 Method Not Allowed* status code. Registered :attr:`AbstractMatchInfo.handler` raises this exception on call. User should never instantiate resource classes but give it by :meth:`UrlDispatcher.add_resource` call. After that he may add a :term:`route` by calling :meth:`Resource.add_route`. :meth:`UrlDispatcher.add_route` is just shortcut for:: router.add_resource(path).add_route(method, handler) Resource with a *name* is called *named resource*. The main purpose of *named resource* is constructing URL by route name for passing it into *template engine* for example:: url = app.router['resource_name'].url_for().with_query({'a': 1, 'b': 2}) Resource classes hierarchy:: AbstractResource Resource PlainResource DynamicResource StaticResource .. class:: AbstractResource A base class for all resources. Inherited from :class:`collections.abc.Sized` and :class:`collections.abc.Iterable`. ``len(resource)`` returns amount of :term:`route`\s belongs to the resource, ``for route in resource`` allows to iterate over these routes. .. attribute:: name Read-only *name* of resource or ``None``. .. attribute:: canonical Read-only *canonical path* associate with the resource. For example ``/path/to`` or ``/path/{to}`` .. versionadded:: 3.3 .. comethod:: resolve(request) Resolve resource by finding appropriate :term:`web-handler` for ``(method, path)`` combination. :return: (*match_info*, *allowed_methods*) pair. *allowed_methods* is a :class:`set` or HTTP methods accepted by resource. *match_info* is either :class:`UrlMappingMatchInfo` if request is resolved or ``None`` if no :term:`route` is found. .. method:: get_info() A resource description, e.g. ``{'path': '/path/to'}`` or ``{'formatter': '/path/{to}', 'pattern': re.compile(r'^/path/(?P[a-zA-Z][_a-zA-Z0-9]+)$`` .. method:: url_for(*args, **kwargs) Construct an URL for route with additional params. *args* and **kwargs** depend on a parameters list accepted by inherited resource class. :return: :class:`~yarl.URL` -- resulting URL instance. .. class:: Resource A base class for new-style resources, inherits :class:`AbstractResource`. .. method:: add_route(method, handler, *, expect_handler=None) Add a :term:`web-handler` to resource. :param str method: HTTP method for route. Should be one of ``'GET'``, ``'POST'``, ``'PUT'``, ``'DELETE'``, ``'PATCH'``, ``'HEAD'``, ``'OPTIONS'`` or ``'*'`` for any method. The parameter is case-insensitive, e.g. you can push ``'get'`` as well as ``'GET'``. The method should be unique for resource. :param callable handler: route handler. :param coroutine expect_handler: optional *expect* header handler. :returns: new :class:`ResourceRoute` instance. .. class:: PlainResource A resource, inherited from :class:`Resource`. The class corresponds to resources with plain-text matching, ``'/path/to'`` for example. .. attribute:: canonical Read-only *canonical path* associate with the resource. Returns the path used to create the PlainResource. For example ``/path/to`` .. versionadded:: 3.3 .. method:: url_for() Returns a :class:`~yarl.URL` for the resource. .. class:: DynamicResource A resource, inherited from :class:`Resource`. The class corresponds to resources with :ref:`variable ` matching, e.g. ``'/path/{to}/{param}'`` etc. .. 
attribute:: canonical Read-only *canonical path* associate with the resource. Returns the formatter obtained from the path used to create the DynamicResource. For example, from a path ``/get/{num:^\d+}``, it returns ``/get/{num}`` .. versionadded:: 3.3 .. method:: url_for(**params) Returns a :class:`~yarl.URL` for the resource. :param params: -- a variable substitutions for dynamic resource. E.g. for ``'/path/{to}/{param}'`` pattern the method should be called as ``resource.url_for(to='val1', param='val2')`` .. class:: StaticResource A resource, inherited from :class:`Resource`. The class corresponds to resources for :ref:`static file serving `. .. attribute:: canonical Read-only *canonical path* associate with the resource. Returns the prefix used to create the StaticResource. For example ``/prefix`` .. versionadded:: 3.3 .. method:: url_for(filename, append_version=None) Returns a :class:`~yarl.URL` for file path under resource prefix. :param filename: -- a file name substitution for static file handler. Accepts both :class:`str` and :class:`pathlib.Path`. E.g. an URL for ``'/prefix/dir/file.txt'`` should be generated as ``resource.url_for(filename='dir/file.txt')`` :param bool append_version: -- a flag for adding file version (hash) to the url query string for cache boosting By default has value from a constructor (``False`` by default) When set to ``True`` - ``v=FILE_HASH`` query string param will be added When set to ``False`` has no impact if file not found has no impact .. class:: PrefixedSubAppResource A resource for serving nested applications. The class instance is returned by :class:`~aiohttp.web.Application.add_subapp` call. .. attribute:: canonical Read-only *canonical path* associate with the resource. Returns the prefix used to create the PrefixedSubAppResource. For example ``/prefix`` .. versionadded:: 3.3 .. method:: url_for(**kwargs) The call is not allowed, it raises :exc:`RuntimeError`. .. _aiohttp-web-route: Route ^^^^^ Route has *HTTP method* (wildcard ``'*'`` is an option), :term:`web-handler` and optional *expect handler*. Every route belong to some resource. Route classes hierarchy:: AbstractRoute ResourceRoute SystemRoute :class:`ResourceRoute` is the route used for resources, :class:`SystemRoute` serves URL resolving errors like *404 Not Found* and *405 Method Not Allowed*. .. class:: AbstractRoute Base class for routes served by :class:`UrlDispatcher`. .. attribute:: method HTTP method handled by the route, e.g. *GET*, *POST* etc. .. attribute:: handler :ref:`handler` that processes the route. .. attribute:: name Name of the route, always equals to name of resource which owns the route. .. attribute:: resource Resource instance which holds the route, ``None`` for :class:`SystemRoute`. .. method:: url_for(*args, **kwargs) Abstract method for constructing url handled by the route. Actually it's a shortcut for ``route.resource.url_for(...)``. .. comethod:: handle_expect_header(request) ``100-continue`` handler. .. class:: ResourceRoute The route class for handling different HTTP methods for :class:`Resource`. .. class:: SystemRoute The route class for handling URL resolution errors like like *404 Not Found* and *405 Method Not Allowed*. .. attribute:: status HTTP status code .. attribute:: reason HTTP status reason .. _aiohttp-web-route-def: RouteDef and StaticDef ^^^^^^^^^^^^^^^^^^^^^^ Route definition, a description for not registered yet route. Could be used for filing route table by providing a list of route definitions (Django style). 
The definition is created by functions like :func:`get` or :func:`post`, list of definitions could be added to router by :meth:`UrlDispatcher.add_routes` call:: from aiohttp import web async def handle_get(request): ... async def handle_post(request): ... app.router.add_routes([web.get('/get', handle_get), web.post('/post', handle_post), .. class:: AbstractRouteDef A base class for route definitions. Inherited from :class:`abc.ABC`. .. versionadded:: 3.1 .. method:: register(router) Register itself into :class:`UrlDispatcher`. Abstract method, should be overridden by subclasses. .. class:: RouteDef A definition of not registered yet route. Implements :class:`AbstractRouteDef`. .. versionadded:: 2.3 .. versionchanged:: 3.1 The class implements :class:`AbstractRouteDef` interface. .. attribute:: method HTTP method (``GET``, ``POST`` etc.) (:class:`str`). .. attribute:: path Path to resource, e.g. ``/path/to``. Could contain ``{}`` brackets for :ref:`variable resources ` (:class:`str`). .. attribute:: handler An async function to handle HTTP request. .. attribute:: kwargs A :class:`dict` of additional arguments. .. class:: StaticDef A definition of static file resource. Implements :class:`AbstractRouteDef`. .. versionadded:: 3.1 .. attribute:: prefix A prefix used for static file handling, e.g. ``/static``. .. attribute:: path File system directory to serve, :class:`str` or :class:`pathlib.Path` (e.g. ``'/home/web-service/path/to/static'``. .. attribute:: kwargs A :class:`dict` of additional arguments, see :meth:`UrlDispatcher.add_static` for a list of supported options. .. function:: get(path, handler, *, name=None, allow_head=True, \ expect_handler=None) Return :class:`RouteDef` for processing ``GET`` requests. See :meth:`UrlDispatcher.add_get` for information about parameters. .. versionadded:: 2.3 .. function:: post(path, handler, *, name=None, expect_handler=None) Return :class:`RouteDef` for processing ``POST`` requests. See :meth:`UrlDispatcher.add_post` for information about parameters. .. versionadded:: 2.3 .. function:: head(path, handler, *, name=None, expect_handler=None) Return :class:`RouteDef` for processing ``HEAD`` requests. See :meth:`UrlDispatcher.add_head` for information about parameters. .. versionadded:: 2.3 .. function:: put(path, handler, *, name=None, expect_handler=None) Return :class:`RouteDef` for processing ``PUT`` requests. See :meth:`UrlDispatcher.add_put` for information about parameters. .. versionadded:: 2.3 .. function:: patch(path, handler, *, name=None, expect_handler=None) Return :class:`RouteDef` for processing ``PATCH`` requests. See :meth:`UrlDispatcher.add_patch` for information about parameters. .. versionadded:: 2.3 .. function:: delete(path, handler, *, name=None, expect_handler=None) Return :class:`RouteDef` for processing ``DELETE`` requests. See :meth:`UrlDispatcher.add_delete` for information about parameters. .. versionadded:: 2.3 .. function:: view(path, handler, *, name=None, expect_handler=None) Return :class:`RouteDef` for processing ``ANY`` requests. See :meth:`UrlDispatcher.add_view` for information about parameters. .. versionadded:: 3.0 .. function:: static(prefix, path, *, name=None, expect_handler=None, \ chunk_size=256*1024, \ show_index=False, follow_symlinks=False, \ append_version=False) Return :class:`StaticDef` for processing static files. See :meth:`UrlDispatcher.add_static` for information about supported parameters. .. versionadded:: 3.1 .. 
function:: route(method, path, handler, *, name=None, expect_handler=None) Return :class:`RouteDef` for processing requests that decided by ``method``. See :meth:`UrlDispatcher.add_route` for information about parameters. .. versionadded:: 2.3 .. _aiohttp-web-route-table-def: RouteTableDef ^^^^^^^^^^^^^ A routes table definition used for describing routes by decorators (Flask style):: from aiohttp import web routes = web.RouteTableDef() @routes.get('/get') async def handle_get(request): ... @routes.post('/post') async def handle_post(request): ... app.router.add_routes(routes) @routes.view("/view") class MyView(web.View): async def get(self): ... async def post(self): ... .. class:: RouteTableDef() A sequence of :class:`RouteDef` instances (implements :class:`abc.collections.Sequence` protocol). In addition to all standard :class:`list` methods the class provides also methods like ``get()`` and ``post()`` for adding new route definition. .. versionadded:: 2.3 .. decoratormethod:: get(path, *, allow_head=True, \ name=None, expect_handler=None) Add a new :class:`RouteDef` item for registering ``GET`` web-handler. See :meth:`UrlDispatcher.add_get` for information about parameters. .. decoratormethod:: post(path, *, name=None, expect_handler=None) Add a new :class:`RouteDef` item for registering ``POST`` web-handler. See :meth:`UrlDispatcher.add_post` for information about parameters. .. decoratormethod:: head(path, *, name=None, expect_handler=None) Add a new :class:`RouteDef` item for registering ``HEAD`` web-handler. See :meth:`UrlDispatcher.add_head` for information about parameters. .. decoratormethod:: put(path, *, name=None, expect_handler=None) Add a new :class:`RouteDef` item for registering ``PUT`` web-handler. See :meth:`UrlDispatcher.add_put` for information about parameters. .. decoratormethod:: patch(path, *, name=None, expect_handler=None) Add a new :class:`RouteDef` item for registering ``PATCH`` web-handler. See :meth:`UrlDispatcher.add_patch` for information about parameters. .. decoratormethod:: delete(path, *, name=None, expect_handler=None) Add a new :class:`RouteDef` item for registering ``DELETE`` web-handler. See :meth:`UrlDispatcher.add_delete` for information about parameters. .. decoratormethod:: view(path, *, name=None, expect_handler=None) Add a new :class:`RouteDef` item for registering ``ANY`` methods against a class-based view. See :meth:`UrlDispatcher.add_view` for information about parameters. .. versionadded:: 3.0 .. method:: static(prefix, path, *, name=None, expect_handler=None, \ chunk_size=256*1024, \ show_index=False, follow_symlinks=False, \ append_version=False) Add a new :class:`StaticDef` item for registering static files processor. See :meth:`UrlDispatcher.add_static` for information about supported parameters. .. versionadded:: 3.1 .. decoratormethod:: route(method, path, *, name=None, expect_handler=None) Add a new :class:`RouteDef` item for registering a web-handler for arbitrary HTTP method. See :meth:`UrlDispatcher.add_route` for information about parameters. MatchInfo ^^^^^^^^^ After route matching web application calls found handler if any. Matching result can be accessible from handler as :attr:`Request.match_info` attribute. In general the result may be any object derived from :class:`AbstractMatchInfo` (:class:`UrlMappingMatchInfo` for default :class:`UrlDispatcher` router). .. class:: UrlMappingMatchInfo Inherited from :class:`dict` and :class:`AbstractMatchInfo`. Dict items are filled by matching info and is :term:`resource`\-specific. 
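A minimal sketch of reading the match result from a handler (the route pattern and the ``name`` parameter are assumptions for the example only)::

    from aiohttp import web

    async def greet(request):
        # '{name}' in the route below is captured by the router and
        # exposed through request.match_info.
        name = request.match_info['name']
        return web.Response(text='Hello, {}'.format(name))

    app = web.Application()
    app.router.add_get('/greet/{name}', greet)
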
.. attribute:: expect_handler A coroutine for handling ``100-continue``. .. attribute:: handler A coroutine for handling request. .. attribute:: route :class:`Route` instance for url matching. View ^^^^ .. class:: View(request) Inherited from :class:`AbstractView`. Base class for class based views. Implementations should derive from :class:`View` and override methods for handling HTTP verbs like ``get()`` or ``post()``:: class MyView(View): async def get(self): resp = await get_response(self.request) return resp async def post(self): resp = await post_response(self.request) return resp app.router.add_view('/view', MyView) The view raises *405 Method Not allowed* (:class:`HTTPMethodNotAllowed`) if requested web verb is not supported. :param request: instance of :class:`Request` that has initiated a view processing. .. attribute:: request Request sent to view's constructor, read-only property. Overridable coroutine methods: ``connect()``, ``delete()``, ``get()``, ``head()``, ``options()``, ``patch()``, ``post()``, ``put()``, ``trace()``. .. seealso:: :ref:`aiohttp-web-class-based-views` .. _aiohttp-web-app-runners-reference: Running Applications -------------------- To start web application there is ``AppRunner`` and site classes. Runner is a storage for running application, sites are for running application on specific TCP or Unix socket, e.g.:: runner = web.AppRunner(app) await runner.setup() site = web.TCPSite(runner, 'localhost', 8080) await site.start() # wait for finish signal await runner.cleanup() .. versionadded:: 3.0 :class:`AppRunner` / :class:`ServerRunner` and :class:`TCPSite` / :class:`UnixSite` / :class:`SockSite` are added in aiohttp 3.0 .. class:: BaseRunner A base class for runners. Use :class:`AppRunner` for serving :class:`Application`, :class:`ServerRunner` for low-level :class:`Server`. .. attribute:: server Low-level web :class:`Server` for handling HTTP requests, read-only attribute. .. attribute:: addresses A :class:`list` of served sockets addresses. See :meth:`socket.getsockname` for items type. .. versionadded:: 3.3 .. attribute:: sites A read-only :class:`set` of served sites (:class:`TCPSite` / :class:`UnixSite` / :class:`NamedPipeSite` / :class:`SockSite` instances). .. comethod:: setup() Initialize the server. Should be called before adding sites. .. comethod:: cleanup() Stop handling all registered sites and cleanup used resources. .. class:: AppRunner(app, *, handle_signals=False, **kwargs) A runner for :class:`Application`. Used with conjunction with sites to serve on specific port. Inherited from :class:`BaseRunner`. :param Application app: web application instance to serve. :param bool handle_signals: add signal handlers for :data:`signal.SIGINT` and :data:`signal.SIGTERM` (``False`` by default). :param kwargs: named parameters to pass into web protocol. .. attribute:: app Read-only attribute for accessing to :class:`Application` served instance. .. comethod:: setup() Initialize application. Should be called before adding sites. The method calls :attr:`Application.on_startup` registered signals. .. comethod:: cleanup() Stop handling all registered sites and cleanup used resources. :attr:`Application.on_shutdown` and :attr:`Application.on_cleanup` signals are called internally. .. class:: ServerRunner(web_server, *, handle_signals=False, **kwargs) A runner for low-level :class:`Server`. Used with conjunction with sites to serve on specific port. Inherited from :class:`BaseRunner`. :param Server web_server: low-level web server instance to serve. 
:param bool handle_signals: add signal handlers for :data:`signal.SIGINT` and :data:`signal.SIGTERM` (``False`` by default). :param kwargs: named parameters to pass into web protocol. .. seealso:: :ref:`aiohttp-web-lowlevel` demonstrates low-level server usage .. class:: BaseSite An abstract class for handled sites. .. attribute:: name An identifier for site, read-only :class:`str` property. Could be a handled URL or UNIX socket path. .. comethod:: start() Start handling a site. .. comethod:: stop() Stop handling a site. .. class:: TCPSite(runner, host=None, port=None, *, \ shutdown_timeout=60.0, ssl_context=None, \ backlog=128, reuse_address=None, \ reuse_port=None) Serve a runner on TCP socket. :param runner: a runner to serve. :param str host: HOST to listen on, ``'0.0.0.0'`` if ``None`` (default). :param int port: PORT to listed on, ``8080`` if ``None`` (default). :param float shutdown_timeout: a timeout for closing opened connections on :meth:`BaseSite.stop` call. :param ssl_context: a :class:`ssl.SSLContext` instance for serving SSL/TLS secure server, ``None`` for plain HTTP server (default). :param int backlog: a number of unaccepted connections that the system will allow before refusing new connections, see :meth:`socket.listen` for details. ``128`` by default. :param bool reuse_address: tells the kernel to reuse a local socket in TIME_WAIT state, without waiting for its natural timeout to expire. If not specified will automatically be set to True on UNIX. :param bool reuse_port: tells the kernel to allow this endpoint to be bound to the same port as other existing endpoints are bound to, so long as they all set this flag when being created. This option is not supported on Windows. .. class:: UnixSite(runner, path, *, \ shutdown_timeout=60.0, ssl_context=None, \ backlog=128) Serve a runner on UNIX socket. :param runner: a runner to serve. :param str path: PATH to UNIX socket to listen. :param float shutdown_timeout: a timeout for closing opened connections on :meth:`BaseSite.stop` call. :param ssl_context: a :class:`ssl.SSLContext` instance for serving SSL/TLS secure server, ``None`` for plain HTTP server (default). :param int backlog: a number of unaccepted connections that the system will allow before refusing new connections, see :meth:`socket.listen` for details. ``128`` by default. .. class:: NamedPipeSite(runner, path, *, shutdown_timeout=60.0) Serve a runner on Named Pipe in Windows. :param runner: a runner to serve. :param str path: PATH of named pipe to listen. :param float shutdown_timeout: a timeout for closing opened connections on :meth:`BaseSite.stop` call. .. class:: SockSite(runner, sock, *, \ shutdown_timeout=60.0, ssl_context=None, \ backlog=128) Serve a runner on UNIX socket. :param runner: a runner to serve. :param sock: :class:`socket.socket` to listen. :param float shutdown_timeout: a timeout for closing opened connections on :meth:`BaseSite.stop` call. :param ssl_context: a :class:`ssl.SSLContext` instance for serving SSL/TLS secure server, ``None`` for plain HTTP server (default). :param int backlog: a number of unaccepted connections that the system will allow before refusing new connections, see :meth:`socket.listen` for details. ``128`` by default. Utilities --------- .. class:: FileField A :class:`~collections.namedtuple` instance that is returned as multidict value by :meth:`Request.POST` if field is uploaded file. .. attribute:: name Field name .. attribute:: filename File name as specified by uploading (client) side. .. 
attribute:: file An :class:`io.IOBase` instance with content of uploaded file. .. attribute:: content_type *MIME type* of uploaded file, ``'text/plain'`` by default. .. seealso:: :ref:`aiohttp-web-file-upload` .. function:: run_app(app, *, host=None, port=None, path=None, \ sock=None, shutdown_timeout=60.0, \ ssl_context=None, print=print, backlog=128, \ access_log_class=aiohttp.helpers.AccessLogger, \ access_log_format=aiohttp.helpers.AccessLogger.LOG_FORMAT, \ access_log=aiohttp.log.access_logger, \ handle_signals=True, \ reuse_address=None, \ reuse_port=None) A utility function for running an application, serving it until keyboard interrupt and performing a :ref:`aiohttp-web-graceful-shutdown`. Suitable as handy tool for scaffolding aiohttp based projects. Perhaps production config will use more sophisticated runner but it good enough at least at very beginning stage. The server will listen on any host or Unix domain socket path you supply. If no hosts or paths are supplied, or only a port is supplied, a TCP server listening on 0.0.0.0 (all hosts) will be launched. Distributing HTTP traffic to multiple hosts or paths on the same application process provides no performance benefit as the requests are handled on the same event loop. See :doc:`deployment` for ways of distributing work for increased performance. :param app: :class:`Application` instance to run or a *coroutine* that returns an application. :param str host: TCP/IP host or a sequence of hosts for HTTP server. Default is ``'0.0.0.0'`` if *port* has been specified or if *path* is not supplied. :param int port: TCP/IP port for HTTP server. Default is ``8080`` for plain text HTTP and ``8443`` for HTTP via SSL (when *ssl_context* parameter is specified). :param str path: file system path for HTTP server Unix domain socket. A sequence of file system paths can be used to bind multiple domain sockets. Listening on Unix domain sockets is not supported by all operating systems. :param socket sock: a preexisting socket object to accept connections on. A sequence of socket objects can be passed. :param int shutdown_timeout: a delay to wait for graceful server shutdown before disconnecting all open client sockets hard way. A system with properly :ref:`aiohttp-web-graceful-shutdown` implemented never waits for this timeout but closes a server in a few milliseconds. :param ssl_context: :class:`ssl.SSLContext` for HTTPS server, ``None`` for HTTP connection. :param print: a callable compatible with :func:`print`. May be used to override STDOUT output or suppress it. Passing `None` disables output. :param int backlog: the number of unaccepted connections that the system will allow before refusing new connections (``128`` by default). :param access_log_class: class for `access_logger`. Default: :data:`aiohttp.helpers.AccessLogger`. Must to be a subclass of :class:`aiohttp.abc.AbstractAccessLogger`. :param access_log: :class:`logging.Logger` instance used for saving access logs. Use ``None`` for disabling logs for sake of speedup. :param access_log_format: access log format, see :ref:`aiohttp-logging-access-log-format-spec` for details. :param bool handle_signals: override signal TERM handling to gracefully exit the application. :param bool reuse_address: tells the kernel to reuse a local socket in TIME_WAIT state, without waiting for its natural timeout to expire. If not specified will automatically be set to True on UNIX. 
:param bool reuse_port: tells the kernel to allow this endpoint to be bound to the same port as other existing endpoints are bound to, so long as they all set this flag when being created. This option is not supported on Windows. .. versionadded:: 3.0 Support *access_log_class* parameter. Support *reuse_address*, *reuse_port* parameter. .. versionadded:: 3.1 Accept a coroutine as *app* parameter. Constants --------- .. class:: ContentCoding An :class:`enum.Enum` class of available Content Codings. .. attribute:: deflate *DEFLATE compression* .. attribute:: gzip *GZIP compression* .. attribute:: identity *no compression* Middlewares ----------- Normalize path middleware ^^^^^^^^^^^^^^^^^^^^^^^^^ .. function:: normalize_path_middleware(*, \ append_slash=True, \ remove_slash=False, \ merge_slashes=True, \ redirect_class=HTTPPermanentRedirect) Middleware factory which produces a middleware that normalizes the path of a request. By normalizing it means: - Add or remove a trailing slash to the path. - Double slashes are replaced by one. The middleware returns as soon as it finds a path that resolves correctly. The order if both merge and append/remove are enabled is: 1. *merge_slashes* 2. *append_slash* or *remove_slash* 3. both *merge_slashes* and *append_slash* or *remove_slash* If the path resolves with at least one of those conditions, it will redirect to the new path. Only one of *append_slash* and *remove_slash* can be enabled. If both are ``True`` the factory will raise an ``AssertionError`` If *append_slash* is ``True`` the middleware will append a slash when needed. If a resource is defined with trailing slash and the request comes without it, it will append it automatically. If *remove_slash* is ``True``, *append_slash* must be ``False``. When enabled the middleware will remove trailing slashes and redirect if the resource is defined. If *merge_slashes* is ``True``, merge multiple consecutive slashes in the path into one. .. versionadded:: 3.4 Support for *remove_slash* aiohttp-3.6.2/docs/websocket_utilities.rst0000644000175100001650000001050013547410117021224 0ustar vstsdocker00000000000000WebSocket utilities =================== .. currentmodule:: aiohttp .. class:: WSCloseCode An :class:`~enum.IntEnum` for keeping close message code. .. attribute:: OK A normal closure, meaning that the purpose for which the connection was established has been fulfilled. .. attribute:: GOING_AWAY An endpoint is "going away", such as a server going down or a browser having navigated away from a page. .. attribute:: PROTOCOL_ERROR An endpoint is terminating the connection due to a protocol error. .. attribute:: UNSUPPORTED_DATA An endpoint is terminating the connection because it has received a type of data it cannot accept (e.g., an endpoint that understands only text data MAY send this if it receives a binary message). .. attribute:: INVALID_TEXT An endpoint is terminating the connection because it has received data within a message that was not consistent with the type of the message (e.g., non-UTF-8 :rfc:`3629` data within a text message). .. attribute:: POLICY_VIOLATION An endpoint is terminating the connection because it has received a message that violates its policy. This is a generic status code that can be returned when there is no other more suitable status code (e.g., :attr:`~WSCloseCode.unsupported_data` or :attr:`~WSCloseCode.message_too_big`) or if there is a need to hide specific details about the policy. .. 
attribute:: MESSAGE_TOO_BIG An endpoint is terminating the connection because it has received a message that is too big for it to process. .. attribute:: MANDATORY_EXTENSION An endpoint (client) is terminating the connection because it has expected the server to negotiate one or more extension, but the server did not return them in the response message of the WebSocket handshake. The list of extensions that are needed should appear in the /reason/ part of the Close frame. Note that this status code is not used by the server, because it can fail the WebSocket handshake instead. .. attribute:: INTERNAL_ERROR A server is terminating the connection because it encountered an unexpected condition that prevented it from fulfilling the request. .. attribute:: SERVICE_RESTART The service is restarted. a client may reconnect, and if it chooses to do, should reconnect using a randomized delay of 5-30s. .. attribute:: TRY_AGAIN_LATER The service is experiencing overload. A client should only connect to a different IP (when there are multiple for the target) or reconnect to the same IP upon user action. .. class:: WSMsgType An :class:`~enum.IntEnum` for describing :class:`WSMessage` type. .. attribute:: CONTINUATION A mark for continuation frame, user will never get the message with this type. .. attribute:: TEXT Text message, the value has :class:`str` type. .. attribute:: BINARY Binary message, the value has :class:`bytes` type. .. attribute:: PING Ping frame (sent by client peer). .. attribute:: PONG Pong frame, answer on ping. Sent by server peer. .. attribute:: CLOSE Close frame. .. attribute:: CLOSED FRAME Actually not frame but a flag indicating that websocket was closed. .. attribute:: ERROR Actually not frame but a flag indicating that websocket was received an error. .. class:: WSMessage Websocket message, returned by ``.receive()`` calls. .. attribute:: type Message type, :class:`WSMsgType` instance. .. attribute:: data Message payload. 1. :class:`str` for :attr:`WSMsgType.TEXT` messages. 2. :class:`bytes` for :attr:`WSMsgType.BINARY` messages. 3. :class:`WSCloseCode` for :attr:`WSMsgType.CLOSE` messages. 4. :class:`bytes` for :attr:`WSMsgType.PING` messages. 5. :class:`bytes` for :attr:`WSMsgType.PONG` messages. .. attribute:: extra Additional info, :class:`str`. Makes sense only for :attr:`WSMsgType.CLOSE` messages, contains optional message description. .. method:: json(*, loads=json.loads) Returns parsed JSON data. :param loads: optional JSON decoder function. aiohttp-3.6.2/docs/whats_new_1_1.rst0000644000175100001650000001046513547410117017614 0ustar vstsdocker00000000000000========================= What's new in aiohttp 1.1 ========================= YARL and URL encoding ====================== Since aiohttp 1.1 the library uses :term:`yarl` for URL processing. New API ------- :class:`yarl.URL` gives handy methods for URL operations etc. Client API still accepts :class:`str` everywhere *url* is used, e.g. ``session.get('http://example.com')`` works as well as ``session.get(yarl.URL('http://example.com'))``. Internal API has been switched to :class:`yarl.URL`. :class:`aiohttp.CookieJar` accepts :class:`~yarl.URL` instances only. On server side has added :class:`web.Request.url` and :class:`web.Request.rel_url` properties for representing relative and absolute request's URL. URL using is the recommended way, already existed properties for retrieving URL parts are deprecated and will be eventually removed. Redirection web exceptions accepts :class:`yarl.URL` as *location* parameter. 
:class:`str` is still supported and will be supported forever. Reverse URL processing for *router* has been changed. The main API is :class:`aiohttp.web.Request.url_for(name, **kwargs)` which returns a :class:`yarl.URL` instance for named resource. It does not support *query args* but adding *args* is trivial: ``request.url_for('named_resource', param='a').with_query(arg='val')``. The method returns a *relative* URL, absolute URL may be constructed by ``request.url.join(request.url_for(...)`` call. URL encoding ------------ YARL encodes all non-ASCII symbols on :class:`yarl.URL` creation. Thus ``URL('https://www.python.org/путь')`` becomes ``'https://www.python.org/%D0%BF%D1%83%D1%82%D1%8C'``. On filling route table it's possible to use both non-ASCII and percent encoded paths:: app.router.add_get('/путь', handler) and:: app.router.add_get('/%D0%BF%D1%83%D1%82%D1%8C', handler) are the same. Internally ``'/путь'`` is converted into percent-encoding representation. Route matching also accepts both URL forms: raw and encoded by converting the route pattern to *canonical* (encoded) form on route registration. Sub-Applications ================ Sub applications are designed for solving the problem of the big monolithic code base. Let's assume we have a project with own business logic and tools like administration panel and debug toolbar. Administration panel is a separate application by its own nature but all toolbar URLs are served by prefix like ``/admin``. Thus we'll create a totally separate application named ``admin`` and connect it to main app with prefix:: admin = web.Application() # setup admin routes, signals and middlewares app.add_subapp('/admin/', admin) Middlewares and signals from ``app`` and ``admin`` are chained. It means that if URL is ``'/admin/something'`` middlewares from ``app`` are applied first and ``admin.middlewares`` are the next in the call chain. The same is going for :attr:`~aiohttp.web.Application.on_response_prepare` signal -- the signal is delivered to both top level ``app`` and ``admin`` if processing URL is routed to ``admin`` sub-application. Common signals like :attr:`~aiohttp.web.Application.on_startup`, :attr:`~aiohttp.web.Application.on_shutdown` and :attr:`~aiohttp.web.Application.on_cleanup` are delivered to all registered sub-applications. The passed parameter is sub-application instance, not top-level application. Third level sub-applications can be nested into second level ones -- there are no limitation for nesting level. Url reversing ------------- Url reversing for sub-applications should generate urls with proper prefix. But for getting URL sub-application's router should be used:: admin = web.Application() admin.add_get('/resource', handler, name='name') app.add_subapp('/admin/', admin) url = admin.router['name'].url_for() The generated ``url`` from example will have a value ``URL('/admin/resource')``. Application freezing ==================== Application can be used either as main app (``app.make_handler()``) or as sub-application -- not both cases at the same time. After connecting application by ``.add_subapp()`` call or starting serving web-server as toplevel application the application is **frozen**. It means that registering new routes, signals and middlewares is forbidden. Changing state (``app['name'] = 'value'``) of frozen application is deprecated and will be eventually removed. aiohttp-3.6.2/docs/whats_new_3_0.rst0000644000175100001650000000461413547410117017614 0ustar vstsdocker00000000000000.. 
.. _aiohttp_whats_new_3_0:

=========================
What's new in aiohttp 3.0
=========================

async/await everywhere
======================

The main change is dropping ``yield from`` support and using ``async``/``await`` everywhere. Farewell, Python 3.4.

The minimal supported Python version is **3.5.3** now. Why not *3.5.0*? Because *3.5.3* has a crucial change: :func:`asyncio.get_event_loop()` returns the running loop instead of the *default* one, which may be different, e.g.::

   loop = asyncio.new_event_loop()
   loop.run_until_complete(f())

Note that :func:`asyncio.set_event_loop` was not called here, and the default loop is not equal to the one actually being executed.

Application Runners
===================

People constantly asked about the ability to run aiohttp servers together with other asyncio code, but :func:`aiohttp.web.run_app` is a blocking synchronous call.

aiohttp had support for starting the application without ``run_app``, but the API was very low-level and cumbersome.

Now application runners solve the task in a few lines of code; see :ref:`aiohttp-web-app-runners` for details.

Client Tracing
==============

Another long-awaited feature is tracing the client request life cycle to figure out when and why a client request spends time waiting for connection establishment, getting server response headers, etc.

This is now possible by registering special signal handlers for every request processing stage. :ref:`aiohttp-client-tracing` provides more info about the feature.

HTTPS support
=============

Unfortunately asyncio has a bug with checking SSL certificates for non-ASCII site DNS names, e.g. https://историк.рф or https://雜草工作室.香港.

The bug has been fixed only in the upcoming Python 3.7 (the change requires breaking backward compatibility in the :mod:`ssl` API).

aiohttp installs a fix for older Python versions (3.5 and 3.6).

Dropped obsolete API
====================

A switch to a new major version is a great chance to drop already deprecated features.

The release dropped a lot; see :ref:`aiohttp_changes` for details.

All removals were already marked as deprecated or related to very low-level implementation details. If user code did not raise :exc:`DeprecationWarning`, it is most likely compatible with aiohttp 3.0.

Summary
=======

Enjoy the aiohttp 3.0 release!

The full change log is here: :ref:`aiohttp_changes`.
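As a brief illustration of the application runner API mentioned above (a hedged sketch, not taken from the release notes; the handler, host and port are placeholders)::

   import asyncio
   from aiohttp import web

   async def handle(request):
       return web.Response(text="OK")

   async def main():
       app = web.Application()
       app.router.add_get('/', handle)
       runner = web.AppRunner(app)
       await runner.setup()
       site = web.TCPSite(runner, 'localhost', 8080)
       await site.start()
       # The server now runs inside the current event loop alongside
       # any other asyncio code; clean up when done.
       await asyncio.sleep(3600)
       await runner.cleanup()

   asyncio.get_event_loop().run_until_complete(main())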
aiohttp-3.6.2/examples/0000755000175100001650000000000013547410140015277 5ustar vstsdocker00000000000000aiohttp-3.6.2/examples/background_tasks.py0000755000175100001650000000342513547410117021210 0ustar vstsdocker00000000000000#!/usr/bin/env python3 """Example of aiohttp.web.Application.on_startup signal handler""" import asyncio import aioredis from aiohttp import web async def websocket_handler(request): ws = web.WebSocketResponse() await ws.prepare(request) request.app['websockets'].append(ws) try: async for msg in ws: print(msg) await asyncio.sleep(1) finally: request.app['websockets'].remove(ws) return ws async def on_shutdown(app): for ws in app['websockets']: await ws.close(code=999, message='Server shutdown') async def listen_to_redis(app): try: sub = await aioredis.create_redis(('localhost', 6379), loop=app.loop) ch, *_ = await sub.subscribe('news') async for msg in ch.iter(encoding='utf-8'): # Forward message to all connected websockets: for ws in app['websockets']: await ws.send_str('{}: {}'.format(ch.name, msg)) print("message in {}: {}".format(ch.name, msg)) except asyncio.CancelledError: pass finally: print('Cancel Redis listener: close connection...') await sub.unsubscribe(ch.name) await sub.quit() print('Redis connection closed.') async def start_background_tasks(app): app['redis_listener'] = app.loop.create_task(listen_to_redis(app)) async def cleanup_background_tasks(app): print('cleanup background tasks...') app['redis_listener'].cancel() await app['redis_listener'] def init(): app = web.Application() app['websockets'] = [] app.router.add_get('/news', websocket_handler) app.on_startup.append(start_background_tasks) app.on_cleanup.append(cleanup_background_tasks) app.on_shutdown.append(on_shutdown) return app web.run_app(init()) aiohttp-3.6.2/examples/cli_app.py0000755000175100001650000000260013547410117017265 0ustar vstsdocker00000000000000""" Example of serving an Application using the `aiohttp.web` CLI. Serve this app using:: $ python -m aiohttp.web -H localhost -P 8080 --repeat 10 cli_app:init \ > "Hello World" Here ``--repeat`` & ``"Hello World"`` are application specific command-line arguments. `aiohttp.web` only parses & consumes the command-line arguments it needs (i.e. ``-H``, ``-P`` & ``entry-func``) and passes on any additional arguments to the `cli_app:init` function for processing. 
""" from argparse import ArgumentParser from aiohttp import web def display_message(req): args = req.app["args"] text = "\n".join([args.message] * args.repeat) return web.Response(text=text) def init(argv): arg_parser = ArgumentParser( prog="aiohttp.web ...", description="Application CLI", add_help=False ) # Positional argument arg_parser.add_argument( "message", help="message to print" ) # Optional argument arg_parser.add_argument( "--repeat", help="number of times to repeat message", type=int, default="1" ) # Avoid conflict with -h from `aiohttp.web` CLI parser arg_parser.add_argument( "--app-help", help="show this message and exit", action="help" ) args = arg_parser.parse_args(argv) app = web.Application() app["args"] = args app.router.add_get('/', display_message) return app aiohttp-3.6.2/examples/client_auth.py0000755000175100001650000000105013547410117020153 0ustar vstsdocker00000000000000import asyncio import aiohttp async def fetch(session): print('Query http://httpbin.org/basic-auth/andrew/password') async with session.get( 'http://httpbin.org/basic-auth/andrew/password') as resp: print(resp.status) body = await resp.text() print(body) async def go(loop): async with aiohttp.ClientSession( auth=aiohttp.BasicAuth('andrew', 'password'), loop=loop) as session: await fetch(session) loop = asyncio.get_event_loop() loop.run_until_complete(go(loop)) aiohttp-3.6.2/examples/client_json.py0000755000175100001650000000070013547410117020164 0ustar vstsdocker00000000000000import asyncio import aiohttp async def fetch(session): print('Query http://httpbin.org/get') async with session.get( 'http://httpbin.org/get') as resp: print(resp.status) data = await resp.json() print(data) async def go(loop): async with aiohttp.ClientSession(loop=loop) as session: await fetch(session) loop = asyncio.get_event_loop() loop.run_until_complete(go(loop)) loop.close() aiohttp-3.6.2/examples/client_ws.py0000755000175100001650000000407513547410117017655 0ustar vstsdocker00000000000000#!/usr/bin/env python3 """websocket cmd client for wssrv.py example.""" import argparse import asyncio import signal import sys import aiohttp async def start_client(loop, url): name = input('Please enter your name: ') # input reader def stdin_callback(): line = sys.stdin.buffer.readline().decode('utf-8') if not line: loop.stop() else: ws.send_str(name + ': ' + line) loop.add_reader(sys.stdin.fileno(), stdin_callback) async def dispatch(): while True: msg = await ws.receive() if msg.type == aiohttp.WSMsgType.TEXT: print('Text: ', msg.data.strip()) elif msg.type == aiohttp.WSMsgType.BINARY: print('Binary: ', msg.data) elif msg.type == aiohttp.WSMsgType.PING: ws.pong() elif msg.type == aiohttp.WSMsgType.PONG: print('Pong received') else: if msg.type == aiohttp.WSMsgType.CLOSE: await ws.close() elif msg.type == aiohttp.WSMsgType.ERROR: print('Error during receive %s' % ws.exception()) elif msg.type == aiohttp.WSMsgType.CLOSED: pass break # send request async with aiohttp.ws_connect(url, autoclose=False, autoping=False) as ws: await dispatch() ARGS = argparse.ArgumentParser( description="websocket console client for wssrv.py example.") ARGS.add_argument( '--host', action="store", dest='host', default='127.0.0.1', help='Host name') ARGS.add_argument( '--port', action="store", dest='port', default=8080, type=int, help='Port number') if __name__ == '__main__': args = ARGS.parse_args() if ':' in args.host: args.host, port = args.host.split(':', 1) args.port = int(port) url = 'http://{}:{}'.format(args.host, args.port) loop = 
asyncio.get_event_loop() loop.add_signal_handler(signal.SIGINT, loop.stop) loop.create_task(start_client(loop, url)) loop.run_forever() aiohttp-3.6.2/examples/curl.py0000755000175100001650000000165113547410117016630 0ustar vstsdocker00000000000000#!/usr/bin/env python3 import argparse import asyncio import aiohttp async def curl(url): async with aiohttp.ClientSession() as session: async with session.request('GET', url) as response: print(repr(response)) chunk = await response.content.read() print('Downloaded: %s' % len(chunk)) if __name__ == '__main__': ARGS = argparse.ArgumentParser(description="GET url example") ARGS.add_argument('url', nargs=1, metavar='URL', help="URL to download") ARGS.add_argument('--iocp', default=False, action="store_true", help="Use ProactorEventLoop on Windows") options = ARGS.parse_args() if options.iocp: from asyncio import events, windows_events el = windows_events.ProactorEventLoop() events.set_event_loop(el) loop = asyncio.get_event_loop() loop.run_until_complete(curl(options.url[0])) aiohttp-3.6.2/examples/fake_server.py0000755000175100001650000000737113547410117020164 0ustar vstsdocker00000000000000import asyncio import pathlib import socket import ssl import aiohttp from aiohttp import web from aiohttp.resolver import DefaultResolver from aiohttp.test_utils import unused_port class FakeResolver: _LOCAL_HOST = {0: '127.0.0.1', socket.AF_INET: '127.0.0.1', socket.AF_INET6: '::1'} def __init__(self, fakes, *, loop): """fakes -- dns -> port dict""" self._fakes = fakes self._resolver = DefaultResolver(loop=loop) async def resolve(self, host, port=0, family=socket.AF_INET): fake_port = self._fakes.get(host) if fake_port is not None: return [{'hostname': host, 'host': self._LOCAL_HOST[family], 'port': fake_port, 'family': family, 'proto': 0, 'flags': socket.AI_NUMERICHOST}] else: return await self._resolver.resolve(host, port, family) class FakeFacebook: def __init__(self, *, loop): self.loop = loop self.app = web.Application(loop=loop) self.app.router.add_routes( [web.get('/v2.7/me', self.on_me), web.get('/v2.7/me/friends', self.on_my_friends)]) self.runner = None here = pathlib.Path(__file__) ssl_cert = here.parent / 'server.crt' ssl_key = here.parent / 'server.key' self.ssl_context = ssl.create_default_context(ssl.Purpose.CLIENT_AUTH) self.ssl_context.load_cert_chain(str(ssl_cert), str(ssl_key)) async def start(self): port = unused_port() self.runner = web.AppRunner(self.app) await self.runner.setup() site = web.TCPSite(self.runner, '127.0.0.1', port, ssl_context=self.ssl_context) await site.start() return {'graph.facebook.com': port} async def stop(self): await self.runner.cleanup() async def on_me(self, request): return web.json_response({ "name": "John Doe", "id": "12345678901234567" }) async def on_my_friends(self, request): return web.json_response({ "data": [ { "name": "Bill Doe", "id": "233242342342" }, { "name": "Mary Doe", "id": "2342342343222" }, { "name": "Alex Smith", "id": "234234234344" }, ], "paging": { "cursors": { "before": "QVFIUjRtc2c5NEl0ajN", "after": "QVFIUlpFQWM0TmVuaDRad0dt", }, "next": ("https://graph.facebook.com/v2.7/12345678901234567/" "friends?access_token=EAACEdEose0cB") }, "summary": { "total_count": 3 }}) async def main(loop): token = "ER34gsSGGS34XCBKd7u" fake_facebook = FakeFacebook(loop=loop) info = await fake_facebook.start() resolver = FakeResolver(info, loop=loop) connector = aiohttp.TCPConnector(loop=loop, resolver=resolver, verify_ssl=False) async with aiohttp.ClientSession(connector=connector, loop=loop) as session: 
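            # The requests below target graph.facebook.com, but FakeResolver maps that
            # host to the local TLS site started above, so no real Facebook traffic is sent.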
async with session.get('https://graph.facebook.com/v2.7/me', params={'access_token': token}) as resp: print(await resp.json()) async with session.get('https://graph.facebook.com/v2.7/me/friends', params={'access_token': token}) as resp: print(await resp.json()) await fake_facebook.stop() loop = asyncio.get_event_loop() loop.run_until_complete(main(loop)) aiohttp-3.6.2/examples/legacy/0000755000175100001650000000000013547410140016543 5ustar vstsdocker00000000000000aiohttp-3.6.2/examples/legacy/crawl.py0000755000175100001650000000607113547410117020240 0ustar vstsdocker00000000000000#!/usr/bin/env python3 import asyncio import logging import re import signal import sys import urllib.parse import aiohttp class Crawler: def __init__(self, rooturl, loop, maxtasks=100): self.rooturl = rooturl self.loop = loop self.todo = set() self.busy = set() self.done = {} self.tasks = set() self.sem = asyncio.Semaphore(maxtasks, loop=loop) # connector stores cookies between requests and uses connection pool self.session = aiohttp.ClientSession(loop=loop) async def run(self): t = asyncio.ensure_future(self.addurls([(self.rooturl, '')]), loop=self.loop) await asyncio.sleep(1, loop=self.loop) while self.busy: await asyncio.sleep(1, loop=self.loop) await t await self.session.close() self.loop.stop() async def addurls(self, urls): for url, parenturl in urls: url = urllib.parse.urljoin(parenturl, url) url, frag = urllib.parse.urldefrag(url) if (url.startswith(self.rooturl) and url not in self.busy and url not in self.done and url not in self.todo): self.todo.add(url) await self.sem.acquire() task = asyncio.ensure_future(self.process(url), loop=self.loop) task.add_done_callback(lambda t: self.sem.release()) task.add_done_callback(self.tasks.remove) self.tasks.add(task) async def process(self, url): print('processing:', url) self.todo.remove(url) self.busy.add(url) try: resp = await self.session.get(url) except Exception as exc: print('...', url, 'has error', repr(str(exc))) self.done[url] = False else: if (resp.status == 200 and ('text/html' in resp.headers.get('content-type'))): data = (await resp.read()).decode('utf-8', 'replace') urls = re.findall(r'(?i)href=["\']?([^\s"\'<>]+)', data) asyncio.Task(self.addurls([(u, url) for u in urls])) resp.close() self.done[url] = True self.busy.remove(url) print(len(self.done), 'completed tasks,', len(self.tasks), 'still pending, todo', len(self.todo)) def main(): loop = asyncio.get_event_loop() c = Crawler(sys.argv[1], loop) asyncio.ensure_future(c.run(), loop=loop) try: loop.add_signal_handler(signal.SIGINT, loop.stop) except RuntimeError: pass loop.run_forever() print('todo:', len(c.todo)) print('busy:', len(c.busy)) print('done:', len(c.done), '; ok:', sum(c.done.values())) print('tasks:', len(c.tasks)) if __name__ == '__main__': if '--iocp' in sys.argv: from asyncio import events, windows_events sys.argv.remove('--iocp') logging.info('using iocp') el = windows_events.ProactorEventLoop() events.set_event_loop(el) main() aiohttp-3.6.2/examples/legacy/srv.py0000755000175100001650000001235313547410117017742 0ustar vstsdocker00000000000000#!/usr/bin/env python3 """Simple server written using an event loop.""" import argparse import asyncio import logging import os import sys import aiohttp import aiohttp.server try: import ssl except ImportError: # pragma: no cover ssl = None class HttpRequestHandler(aiohttp.server.ServerHttpProtocol): async def handle_request(self, message, payload): print('method = {!r}; path = {!r}; version = {!r}'.format( message.method, message.path, 
message.version)) path = message.path if (not (path.isprintable() and path.startswith('/')) or '/.' in path): print('bad path', repr(path)) path = None else: path = '.' + path if not os.path.exists(path): print('no file', repr(path)) path = None else: isdir = os.path.isdir(path) if not path: raise aiohttp.HttpProcessingError(code=404) for hdr, val in message.headers.items(): print(hdr, val) if isdir and not path.endswith('/'): path = path + '/' raise aiohttp.HttpProcessingError( code=302, headers=(('URI', path), ('Location', path))) response = aiohttp.Response( self.writer, 200, http_version=message.version) response.add_header('Transfer-Encoding', 'chunked') # content encoding accept_encoding = message.headers.get('accept-encoding', '').lower() if 'deflate' in accept_encoding: response.add_header('Content-Encoding', 'deflate') response.add_compression_filter('deflate') elif 'gzip' in accept_encoding: response.add_header('Content-Encoding', 'gzip') response.add_compression_filter('gzip') response.add_chunking_filter(1025) if isdir: response.add_header('Content-type', 'text/html') response.send_headers() response.write(b'
    ') else: response.add_header('Content-type', 'text/plain') response.send_headers() try: with open(path, 'rb') as fp: chunk = fp.read(8192) while chunk: response.write(chunk) chunk = fp.read(8192) except OSError: response.write(b'Cannot open') await response.write_eof() if response.keep_alive(): self.keep_alive(True) ARGS = argparse.ArgumentParser(description="Run simple HTTP server.") ARGS.add_argument( '--host', action="store", dest='host', default='127.0.0.1', help='Host name') ARGS.add_argument( '--port', action="store", dest='port', default=8080, type=int, help='Port number') # make iocp and ssl mutually exclusive because ProactorEventLoop is # incompatible with SSL group = ARGS.add_mutually_exclusive_group() group.add_argument( '--iocp', action="store_true", dest='iocp', help='Windows IOCP event loop') group.add_argument( '--ssl', action="store_true", dest='ssl', help='Run ssl mode.') ARGS.add_argument( '--sslcert', action="store", dest='certfile', help='SSL cert file.') ARGS.add_argument( '--sslkey', action="store", dest='keyfile', help='SSL key file.') def main(): args = ARGS.parse_args() if ':' in args.host: args.host, port = args.host.split(':', 1) args.port = int(port) if args.iocp: from asyncio import windows_events sys.argv.remove('--iocp') logging.info('using iocp') el = windows_events.ProactorEventLoop() asyncio.set_event_loop(el) if args.ssl: here = os.path.join(os.path.dirname(__file__), 'tests') if args.certfile: certfile = args.certfile or os.path.join(here, 'sample.crt') keyfile = args.keyfile or os.path.join(here, 'sample.key') else: certfile = os.path.join(here, 'sample.crt') keyfile = os.path.join(here, 'sample.key') sslcontext = ssl.SSLContext(ssl.PROTOCOL_SSLv23) sslcontext.load_cert_chain(certfile, keyfile) else: sslcontext = None loop = asyncio.get_event_loop() f = loop.create_server( lambda: HttpRequestHandler(debug=True, keep_alive=75), args.host, args.port, ssl=sslcontext) svr = loop.run_until_complete(f) socks = svr.sockets print('serving on', socks[0].getsockname()) try: loop.run_forever() except KeyboardInterrupt: pass if __name__ == '__main__': main() aiohttp-3.6.2/examples/legacy/tcp_protocol_parser.py0000755000175100001650000001145013547410117023210 0ustar vstsdocker00000000000000#!/usr/bin/env python3 """Protocol parser example.""" import argparse import asyncio import collections import aiohttp try: import signal except ImportError: signal = None MSG_TEXT = b'text:' MSG_PING = b'ping:' MSG_PONG = b'pong:' MSG_STOP = b'stop:' Message = collections.namedtuple('Message', ('tp', 'data')) def my_protocol_parser(out, buf): """Parser is used with StreamParser for incremental protocol parsing. Parser is a generator function, but it is not a coroutine. Usually parsers are implemented as a state machine. 
more details in asyncio/parsers.py existing parsers: * HTTP protocol parsers asyncio/http/protocol.py * websocket parser asyncio/http/websocket.py """ while True: tp = yield from buf.read(5) if tp in (MSG_PING, MSG_PONG): # skip line yield from buf.skipuntil(b'\r\n') out.feed_data(Message(tp, None)) elif tp == MSG_STOP: out.feed_data(Message(tp, None)) elif tp == MSG_TEXT: # read text text = yield from buf.readuntil(b'\r\n') out.feed_data(Message(tp, text.strip().decode('utf-8'))) else: raise ValueError('Unknown protocol prefix.') class MyProtocolWriter: def __init__(self, transport): self.transport = transport def ping(self): self.transport.write(b'ping:\r\n') def pong(self): self.transport.write(b'pong:\r\n') def stop(self): self.transport.write(b'stop:\r\n') def send_text(self, text): self.transport.write( 'text:{}\r\n'.format(text.strip()).encode('utf-8')) class EchoServer(asyncio.Protocol): def connection_made(self, transport): print('Connection made') self.transport = transport self.stream = aiohttp.StreamParser() asyncio.Task(self.dispatch()) def data_received(self, data): self.stream.feed_data(data) def eof_received(self): self.stream.feed_eof() def connection_lost(self, exc): print('Connection lost') async def dispatch(self): reader = self.stream.set_parser(my_protocol_parser) writer = MyProtocolWriter(self.transport) while True: try: msg = await reader.read() except aiohttp.ConnectionError: # client has been disconnected break print('Message received: {}'.format(msg)) if msg.type == MSG_PING: writer.pong() elif msg.type == MSG_TEXT: writer.send_text('Re: ' + msg.data) elif msg.type == MSG_STOP: self.transport.close() break async def start_client(loop, host, port): transport, stream = await loop.create_connection( aiohttp.StreamProtocol, host, port) reader = stream.reader.set_parser(my_protocol_parser) writer = MyProtocolWriter(transport) writer.ping() message = 'This is the message. It will be echoed.' 
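    # Client loop: once the server answers the ping with a pong, send the text
    # message; when the echoed text comes back, send the stop command and close.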
while True: try: msg = await reader.read() except aiohttp.ConnectionError: print('Server has been disconnected.') break print('Message received: {}'.format(msg)) if msg.type == MSG_PONG: writer.send_text(message) print('data sent:', message) elif msg.type == MSG_TEXT: writer.stop() print('stop sent') break transport.close() def start_server(loop, host, port): f = loop.create_server(EchoServer, host, port) srv = loop.run_until_complete(f) x = srv.sockets[0] print('serving on', x.getsockname()) loop.run_forever() ARGS = argparse.ArgumentParser(description="Protocol parser example.") ARGS.add_argument( '--server', action="store_true", dest='server', default=False, help='Run tcp server') ARGS.add_argument( '--client', action="store_true", dest='client', default=False, help='Run tcp client') ARGS.add_argument( '--host', action="store", dest='host', default='127.0.0.1', help='Host name') ARGS.add_argument( '--port', action="store", dest='port', default=9999, type=int, help='Port number') if __name__ == '__main__': args = ARGS.parse_args() if ':' in args.host: args.host, port = args.host.split(':', 1) args.port = int(port) if (not (args.server or args.client)) or (args.server and args.client): print('Please specify --server or --client\n') ARGS.print_help() else: loop = asyncio.get_event_loop() if signal is not None: loop.add_signal_handler(signal.SIGINT, loop.stop) if args.server: start_server(loop, args.host, args.port) else: loop.run_until_complete(start_client(loop, args.host, args.port)) aiohttp-3.6.2/examples/lowlevel_srv.py0000644000175100001650000000104413547410117020377 0ustar vstsdocker00000000000000import asyncio from aiohttp import web async def handler(request): return web.Response(text="OK") async def main(loop): server = web.Server(handler) await loop.create_server(server, "127.0.0.1", 8080) print("======= Serving on http://127.0.0.1:8080/ ======") # pause here for very long time by serving HTTP requests and # waiting for keyboard interruption await asyncio.sleep(100*3600) loop = asyncio.get_event_loop() try: loop.run_until_complete(main(loop)) except KeyboardInterrupt: pass loop.close() aiohttp-3.6.2/examples/server.crt0000644000175100001650000000211713547410117017324 0ustar vstsdocker00000000000000-----BEGIN CERTIFICATE----- MIIDADCCAegCCQCgevpPMuTTLzANBgkqhkiG9w0BAQsFADBCMQswCQYDVQQGEwJV QTEQMA4GA1UECAwHVWtyYWluZTEhMB8GA1UECgwYSW50ZXJuZXQgV2lkZ2l0cyBQ dHkgTHRkMB4XDTE2MDgwNzIzMTMwOFoXDTI2MDgwNTIzMTMwOFowQjELMAkGA1UE BhMCVUExEDAOBgNVBAgMB1VrcmFpbmUxITAfBgNVBAoMGEludGVybmV0IFdpZGdp dHMgUHR5IEx0ZDCCASIwDQYJKoZIhvcNAQEBBQADggEPADCCAQoCggEBAOUgkn3j X/sdg6GGueGDHCM+snIUVY3fM6D4jXjyBhnT3TqKG1lJwCGYR11AD+2SJYppU+w4 QaF6YZwMeZBKy+mVQ9+CrVYyKQE7j9H8XgNEHV9BQzoragT8lia8eC5aOQzUeX8A xCSSbsnyT/X+S1IKdd0txLOeZOD6pWwJoc3dpDELglk2b1tzhyN2GjQv3aRHj55P x7127MeZyRXwODFpXrpbnwih4OqkA4EYtmqFbZttGEzMhd4Y5mkbyuRbGM+IE99o QJMvnIkjAfUo0aKnDrcAIkWCkwLIci9TIG6u3R1P2Tn+HYVntzQZ4BnxanbFNQ5S 9ARd3529EmO3BzUCAwEAATANBgkqhkiG9w0BAQsFAAOCAQEAXyiw1+YUnTEDI3C/ vq1Vn9pnwZALVQPiPlTqEGkl/nbq0suMmeZZG7pwrOJp3wr+sGwRAv9sPTro6srf Vj12wTo4LrTRKEDuS+AUJl0Mut7cPGIUKo+MGeZmmnDjMqcjljN3AO47ef4eWYo5 XGe4r4NDABEk5auOD/vQW5IiIMdmWsaMJ+0mZNpAV2NhAD/6ia28VvSL/yuaNqDW TYTUYHWLH08H6M6qrQ7FdoIDyYR5siqBukQzeqlnuq45bQ3ViYttNIkzZN4jbWJV /MFYLuJQ/fNoalDIC+ec0EIa9NbrfpoocJ8h6HlmWOqkES4QpBSOrkVid64Cdy3P JgiEWg== -----END CERTIFICATE----- aiohttp-3.6.2/examples/server.csr0000644000175100001650000000167013547410117017326 0ustar vstsdocker00000000000000-----BEGIN CERTIFICATE REQUEST----- 
MIIChzCCAW8CAQAwQjELMAkGA1UEBhMCVUExEDAOBgNVBAgMB1VrcmFpbmUxITAf BgNVBAoMGEludGVybmV0IFdpZGdpdHMgUHR5IEx0ZDCCASIwDQYJKoZIhvcNAQEB BQADggEPADCCAQoCggEBAOUgkn3jX/sdg6GGueGDHCM+snIUVY3fM6D4jXjyBhnT 3TqKG1lJwCGYR11AD+2SJYppU+w4QaF6YZwMeZBKy+mVQ9+CrVYyKQE7j9H8XgNE HV9BQzoragT8lia8eC5aOQzUeX8AxCSSbsnyT/X+S1IKdd0txLOeZOD6pWwJoc3d pDELglk2b1tzhyN2GjQv3aRHj55Px7127MeZyRXwODFpXrpbnwih4OqkA4EYtmqF bZttGEzMhd4Y5mkbyuRbGM+IE99oQJMvnIkjAfUo0aKnDrcAIkWCkwLIci9TIG6u 3R1P2Tn+HYVntzQZ4BnxanbFNQ5S9ARd3529EmO3BzUCAwEAAaAAMA0GCSqGSIb3 DQEBCwUAA4IBAQDO/PSd29KgisTdGXhntg7yBEhBAjsDW7uQCrdrPSZtFyN6wUHy /1yrrWe56ZuW8jpuP5tG0eTZ+0bT2RXIRot8a2Cc3eBhpoe8M3d84yXjKAoHutGE 5IK+TViQdvT3pT3a7pTmjlf8Ojq9tx+U2ckiz8Ccnjd9yM47M9NgMhrS1aBpVZSt gOD+zzrqMML4xks9id94H7bi9Tgs3AbEJIyDpBpoK6i4OvK7KTidCngCg80qmdTy bcScLapoy1Ped2BKKuxWdOOlP+mDJatc/pcfBLE13AncQjJgMerS9M5RWCBjmRow A+aB6fBEU8bOTrqCryfBeTiV6xzyDDcIXtc6 -----END CERTIFICATE REQUEST----- aiohttp-3.6.2/examples/server.key0000644000175100001650000000321313547410117017322 0ustar vstsdocker00000000000000-----BEGIN RSA PRIVATE KEY----- MIIEowIBAAKCAQEA5SCSfeNf+x2DoYa54YMcIz6ychRVjd8zoPiNePIGGdPdOoob WUnAIZhHXUAP7ZIlimlT7DhBoXphnAx5kErL6ZVD34KtVjIpATuP0fxeA0QdX0FD OitqBPyWJrx4Llo5DNR5fwDEJJJuyfJP9f5LUgp13S3Es55k4PqlbAmhzd2kMQuC WTZvW3OHI3YaNC/dpEePnk/HvXbsx5nJFfA4MWleulufCKHg6qQDgRi2aoVtm20Y TMyF3hjmaRvK5FsYz4gT32hAky+ciSMB9SjRoqcOtwAiRYKTAshyL1Mgbq7dHU/Z Of4dhWe3NBngGfFqdsU1DlL0BF3fnb0SY7cHNQIDAQABAoIBAG9BJ6B03VADfrzZ vDwh+3Gpqd/2u6wNqvYIejk123yDATLBiJIMW3x0goJm7tT+V7gjeJqEnmmYEPlC nWxQxT6AOdq3iw8FgB+XGjhuAAA5/MEZ4VjHZ81QEGBytzBaosT2DqB6cMMJTz5D qEvb1Brb9WsWJCLLUFRloBkbfDOG9lMvt34ixYTTmqjsVj5WByD5BhzKH51OJ72L 00IYpvrsEOtSev1hNV4199CHPYE90T/YQVooRBiHtTcfN+/KNVJu6Rf/zcaJ3WMS 1l3MBI8HwMimjKKkbddpoMHyFMtSNmS9Yq+4a9w7XZo1F5rt88hYSCtAF8HRAarX 0VBCJmkCgYEA9HenBBnmfDoN857femzoTHdWQQrZQ4YPAKHvKPlcgudizE5tQbs0 iTpwm+IsecgJS2Rio7zY+P7A5nKFz3N5c0IX3smYo0J2PoakkLAm25KMxFZYBuz4 MFWVdfByAU7d28BdNfyOVbA2kU2eal9lJ0yPLpMLbH8+bbvw5uBS808CgYEA7++p ftwib3DvKWMpl6G5eA1C2xprdbE0jm2fSr3LYp/vZ4QN2V6kK2YIlyUqQvhYCnxX oIP3v2MWDRHKKwJtBWR4+t23PaDaSXS2Ifm0qhRxwSm/oqpAJQXbR7VzxXp4/4FP 1SgkLe51bubc4h+cDngqBLcplCanvj52CqhqzDsCgYAEIhG8zANNjl22BLWaiETV Jh9bMifCMH4IcLRuaOjbfbX55kmKlvOobkiBGi3OUUd28teIFSVF8GiqfL0uaLFg 9XkZ1yaxe+or3HLjz1aY171xhFQwqcj4aDoCqHIE+6Rclr/8raxqXnRNuJY5DivT okO5cdr7lpsjl83W2WwNmQKBgCPXi1xWChbXqgJmu8nY8NnMMVaFpdPY+t7j5U3G +GDtP1gZU/BKwP9yqInblWqXqp82X+isjg/a/2pIZAj0vdB2Z9Qh1sOwCau7cZG1 uZVGpI+UavojsJ1XOKCHrJmtZ/HTIVfYPT9XRdehSRHGYwuOS8iUi/ODqr8ymXOS IRINAoGBAMEmhTihgFz6Y8ezRK3QTubguehHZG1zIvtgVhOk+8hRUTSJPI9nBJPC 4gOZsPx4g2oLK6PiudPR79bhxRxPACCMnXkdwZ/8FaIdmvRHsWVs8T80wID0wthI r5hW4uqi9CcKZrGWH7mx9cVJktspeGUczvKyzNMfCaojwzA/49Z1 -----END RSA PRIVATE KEY----- aiohttp-3.6.2/examples/server_simple.py0000644000175100001650000000136013547410117020534 0ustar vstsdocker00000000000000# server_simple.py from aiohttp import web async def handle(request): name = request.match_info.get("name", "Anonymous") text = "Hello, " + name return web.Response(text=text) async def wshandle(request): ws = web.WebSocketResponse() await ws.prepare(request) async for msg in ws: if msg.type == web.WSMsgType.text: await ws.send_str("Hello, {}".format(msg.data)) elif msg.type == web.WSMsgType.binary: await ws.send_bytes(msg.data) elif msg.type == web.WSMsgType.close: break return ws app = web.Application() app.add_routes([web.get("/", handle), web.get("/echo", wshandle), web.get("/{name}", handle)]) web.run_app(app) aiohttp-3.6.2/examples/static_files.py0000755000175100001650000000023613547410117020332 0ustar 
vstsdocker00000000000000import pathlib from aiohttp import web app = web.Application() app.router.add_static('/', pathlib.Path(__file__).parent, show_index=True) web.run_app(app) aiohttp-3.6.2/examples/web_classview.py0000755000175100001650000000256413547410117020524 0ustar vstsdocker00000000000000#!/usr/bin/env python3 """Example for aiohttp.web class based views """ import functools import json from aiohttp import web class MyView(web.View): async def get(self): return web.json_response({ 'method': 'get', 'args': dict(self.request.GET), 'headers': dict(self.request.headers), }, dumps=functools.partial(json.dumps, indent=4)) async def post(self): data = await self.request.post() return web.json_response({ 'method': 'post', 'args': dict(self.request.GET), 'data': dict(data), 'headers': dict(self.request.headers), }, dumps=functools.partial(json.dumps, indent=4)) async def index(request): txt = """ Class based view example

    Class based view example

    • / This page
    • /get Returns GET data.
    • /post Returns POST data.
    """ return web.Response(text=txt, content_type='text/html') def init(): app = web.Application() app.router.add_get('/', index) app.router.add_get('/get', MyView) app.router.add_post('/post', MyView) return app web.run_app(init()) aiohttp-3.6.2/examples/web_cookies.py0000755000175100001650000000156013547410117020153 0ustar vstsdocker00000000000000#!/usr/bin/env python3 """Example for aiohttp.web basic server with cookies. """ from pprint import pformat from aiohttp import web tmpl = '''\ Login
    Logout
    {}
    ''' async def root(request): resp = web.Response(content_type='text/html') resp.text = tmpl.format(pformat(request.cookies)) return resp async def login(request): resp = web.HTTPFound(location='/') resp.set_cookie('AUTH', 'secret') return resp async def logout(request): resp = web.HTTPFound(location='/') resp.del_cookie('AUTH') return resp def init(loop): app = web.Application(loop=loop) app.router.add_get('/', root) app.router.add_get('/login', login) app.router.add_get('/logout', logout) return app web.run_app(init()) aiohttp-3.6.2/examples/web_rewrite_headers_middleware.py0000755000175100001650000000114013547410117024062 0ustar vstsdocker00000000000000#!/usr/bin/env python3 """ Example for rewriting response headers by middleware. """ from aiohttp import web async def handler(request): return web.Response(text="Everything is fine") @web.middleware async def middleware(request, handler): try: response = await handler(request) except web.HTTPException as exc: raise exc if not response.prepared: response.headers['SERVER'] = "Secured Server Software" return response def init(): app = web.Application(middlewares=[middleware]) app.router.add_get('/', handler) return app web.run_app(init()) aiohttp-3.6.2/examples/web_srv.py0000755000175100001650000000253213547410117017331 0ustar vstsdocker00000000000000#!/usr/bin/env python3 """Example for aiohttp.web basic server """ import textwrap from aiohttp import web async def intro(request): txt = textwrap.dedent("""\ Type {url}/hello/John {url}/simple or {url}/change_body in browser url bar """).format(url='127.0.0.1:8080') binary = txt.encode('utf8') resp = web.StreamResponse() resp.content_length = len(binary) resp.content_type = 'text/plain' await resp.prepare(request) await resp.write(binary) return resp async def simple(request): return web.Response(text="Simple answer") async def change_body(request): resp = web.Response() resp.body = b"Body changed" resp.content_type = 'text/plain' return resp async def hello(request): resp = web.StreamResponse() name = request.match_info.get('name', 'Anonymous') answer = ('Hello, ' + name).encode('utf8') resp.content_length = len(answer) resp.content_type = 'text/plain' await resp.prepare(request) await resp.write(answer) await resp.write_eof() return resp def init(): app = web.Application() app.router.add_get('/', intro) app.router.add_get('/simple', simple) app.router.add_get('/change_body', change_body) app.router.add_get('/hello/{name}', hello) app.router.add_get('/hello', hello) return app web.run_app(init()) aiohttp-3.6.2/examples/web_srv_route_deco.py0000644000175100001650000000250113547410117021532 0ustar vstsdocker00000000000000#!/usr/bin/env python3 """Example for aiohttp.web basic server with decorator definition for routes """ import textwrap from aiohttp import web routes = web.RouteTableDef() @routes.get('/') async def intro(request): txt = textwrap.dedent("""\ Type {url}/hello/John {url}/simple or {url}/change_body in browser url bar """).format(url='127.0.0.1:8080') binary = txt.encode('utf8') resp = web.StreamResponse() resp.content_length = len(binary) resp.content_type = 'text/plain' await resp.prepare(request) await resp.write(binary) return resp @routes.get('/simple') async def simple(request): return web.Response(text="Simple answer") @routes.get('/change_body') async def change_body(request): resp = web.Response() resp.body = b"Body changed" resp.content_type = 'text/plain' return resp @routes.get('/hello') async def hello(request): resp = web.StreamResponse() name = 
request.match_info.get('name', 'Anonymous') answer = ('Hello, ' + name).encode('utf8') resp.content_length = len(answer) resp.content_type = 'text/plain' await resp.prepare(request) await resp.write(answer) await resp.write_eof() return resp def init(): app = web.Application() app.router.add_routes(routes) return app web.run_app(init()) aiohttp-3.6.2/examples/web_srv_route_table.py0000644000175100001650000000260013547410117021707 0ustar vstsdocker00000000000000#!/usr/bin/env python3 """Example for aiohttp.web basic server with table definition for routes """ import textwrap from aiohttp import web async def intro(request): txt = textwrap.dedent("""\ Type {url}/hello/John {url}/simple or {url}/change_body in browser url bar """).format(url='127.0.0.1:8080') binary = txt.encode('utf8') resp = web.StreamResponse() resp.content_length = len(binary) resp.content_type = 'text/plain' await resp.prepare(request) await resp.write(binary) return resp async def simple(request): return web.Response(text="Simple answer") async def change_body(request): resp = web.Response() resp.body = b"Body changed" resp.content_type = 'text/plain' return resp async def hello(request): resp = web.StreamResponse() name = request.match_info.get('name', 'Anonymous') answer = ('Hello, ' + name).encode('utf8') resp.content_length = len(answer) resp.content_type = 'text/plain' await resp.prepare(request) await resp.write(answer) await resp.write_eof() return resp def init(): app = web.Application() app.router.add_routes([ web.get('/', intro), web.get('/simple', simple), web.get('/change_body', change_body), web.get('/hello/{name}', hello), web.get('/hello', hello), ]) return app web.run_app(init()) aiohttp-3.6.2/examples/web_ws.py0000755000175100001650000000261113547410117017146 0ustar vstsdocker00000000000000#!/usr/bin/env python3 """Example for aiohttp.web websocket server """ import os from aiohttp import web WS_FILE = os.path.join(os.path.dirname(__file__), 'websocket.html') async def wshandler(request): resp = web.WebSocketResponse() available = resp.can_prepare(request) if not available: with open(WS_FILE, 'rb') as fp: return web.Response(body=fp.read(), content_type='text/html') await resp.prepare(request) await resp.send_str('Welcome!!!') try: print('Someone joined.') for ws in request.app['sockets']: await ws.send_str('Someone joined') request.app['sockets'].append(resp) async for msg in resp: if msg.type == web.WSMsgType.TEXT: for ws in request.app['sockets']: if ws is not resp: await ws.send_str(msg.data) else: return resp return resp finally: request.app['sockets'].remove(resp) print('Someone disconnected.') for ws in request.app['sockets']: await ws.send_str('Someone disconnected.') async def on_shutdown(app): for ws in app['sockets']: await ws.close() def init(): app = web.Application() app['sockets'] = [] app.router.add_get('/', wshandler) app.on_shutdown.append(on_shutdown) return app web.run_app(init()) aiohttp-3.6.2/examples/websocket.html0000644000175100001650000000447313547410117020167 0ustar vstsdocker00000000000000

    Chat!

     | Status: disconnected
    aiohttp-3.6.2/setup.cfg0000644000175100001650000000252013547410140015301 0ustar vstsdocker00000000000000[aliases] test = pytest [metadata] license_file = LICENSE.txt [pep8] max-line-length = 79 [easy_install] zip_ok = false [flake8] ignore = N801,N802,N803,E226,W504,E252,E301,E302,E704,W503,W504,F811 max-line-length = 79 [isort] multi_line_output = 3 include_trailing_comma = True force_grid_wrap = 0 use_parentheses = True known_third_party = jinja2,pytest,multidict,yarl,gunicorn,freezegun,async_generator known_first_party = aiohttp,aiohttp_jinja2,aiopg [report] exclude_lines = @abc.abstractmethod @abstractmethod [coverage:run] branch = True source = aiohttp, tests omit = site-packages [mypy] follow_imports = silent strict_optional = True warn_redundant_casts = True check_untyped_defs = True disallow_any_generics = True disallow_untyped_defs = True warn_unused_ignores = True [mypy-pytest] ignore_missing_imports = true [mypy-uvloop] ignore_missing_imports = true [mypy-tokio] ignore_missing_imports = true [mypy-async_generator] ignore_missing_imports = true [mypy-aiodns] ignore_missing_imports = true [mypy-gunicorn.config] ignore_missing_imports = true [mypy-gunicorn.workers] ignore_missing_imports = true [mypy-brotli] ignore_missing_imports = true [mypy-chardet] ignore_missing_imports = true [mypy-cchardet] ignore_missing_imports = true [mypy-idna_ssl] ignore_missing_imports = true [egg_info] tag_build = tag_date = 0 aiohttp-3.6.2/setup.py0000644000175100001650000001210113547410117015172 0ustar vstsdocker00000000000000import pathlib import re import sys from distutils.command.build_ext import build_ext from distutils.errors import ( CCompilerError, DistutilsExecError, DistutilsPlatformError, ) from setuptools import Extension, setup if sys.version_info < (3, 5, 3): raise RuntimeError("aiohttp 3.x requires Python 3.5.3+") here = pathlib.Path(__file__).parent if ( (here / '.git').exists() and not (here / 'vendor/http-parser/README.md').exists() ): print("Install submodules when building from git clone", file=sys.stderr) print("Hint:", file=sys.stderr) print(" git submodule update --init", file=sys.stderr) sys.exit(2) # NOTE: makefile cythonizes all Cython modules extensions = [Extension('aiohttp._websocket', ['aiohttp/_websocket.c']), Extension('aiohttp._http_parser', ['aiohttp/_http_parser.c', 'vendor/http-parser/http_parser.c', 'aiohttp/_find_header.c'], define_macros=[('HTTP_PARSER_STRICT', 0)], ), Extension('aiohttp._frozenlist', ['aiohttp/_frozenlist.c']), Extension('aiohttp._helpers', ['aiohttp/_helpers.c']), Extension('aiohttp._http_writer', ['aiohttp/_http_writer.c'])] class BuildFailed(Exception): pass class ve_build_ext(build_ext): # This class allows C extension building to fail. 
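    # run() and build_extension() below convert compiler/platform errors into
    # BuildFailed so that setup() can fall back to the pure-Python build.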
def run(self): try: build_ext.run(self) except (DistutilsPlatformError, FileNotFoundError): raise BuildFailed() def build_extension(self, ext): try: build_ext.build_extension(self, ext) except (CCompilerError, DistutilsExecError, DistutilsPlatformError, ValueError): raise BuildFailed() txt = (here / 'aiohttp' / '__init__.py').read_text('utf-8') try: version = re.findall(r"^__version__ = '([^']+)'\r?$", txt, re.M)[0] except IndexError: raise RuntimeError('Unable to determine version.') install_requires = [ 'attrs>=17.3.0', 'chardet>=2.0,<4.0', 'multidict>=4.5,<5.0', 'async_timeout>=3.0,<4.0', 'yarl>=1.0,<2.0', 'idna-ssl>=1.0; python_version<"3.7"', 'typing_extensions>=3.6.5; python_version<"3.7"', ] def read(f): return (here / f).read_text('utf-8').strip() NEEDS_PYTEST = {'pytest', 'test'}.intersection(sys.argv) pytest_runner = ['pytest-runner'] if NEEDS_PYTEST else [] tests_require = [ 'pytest', 'gunicorn', 'pytest-timeout', 'async-generator', 'pytest-xdist', ] args = dict( name='aiohttp', version=version, description='Async http client/server framework (asyncio)', long_description='\n\n'.join((read('README.rst'), read('CHANGES.rst'))), classifiers=[ 'License :: OSI Approved :: Apache Software License', 'Intended Audience :: Developers', 'Programming Language :: Python', 'Programming Language :: Python :: 3', 'Programming Language :: Python :: 3.5', 'Programming Language :: Python :: 3.6', 'Programming Language :: Python :: 3.7', 'Development Status :: 5 - Production/Stable', 'Operating System :: POSIX', 'Operating System :: MacOS :: MacOS X', 'Operating System :: Microsoft :: Windows', 'Topic :: Internet :: WWW/HTTP', 'Framework :: AsyncIO', ], author='Nikolay Kim', author_email='fafhrd91@gmail.com', maintainer=', '.join(('Nikolay Kim ', 'Andrew Svetlov ')), maintainer_email='aio-libs@googlegroups.com', url='https://github.com/aio-libs/aiohttp', project_urls={ 'Chat: Gitter': 'https://gitter.im/aio-libs/Lobby', 'CI: AppVeyor': 'https://ci.appveyor.com/project/aio-libs/aiohttp', 'CI: Circle': 'https://circleci.com/gh/aio-libs/aiohttp', 'CI: Shippable': 'https://app.shippable.com/github/aio-libs/aiohttp', 'CI: Travis': 'https://travis-ci.com/aio-libs/aiohttp', 'Coverage: codecov': 'https://codecov.io/github/aio-libs/aiohttp', 'Docs: RTD': 'https://docs.aiohttp.org', 'GitHub: issues': 'https://github.com/aio-libs/aiohttp/issues', 'GitHub: repo': 'https://github.com/aio-libs/aiohttp', }, license='Apache 2', packages=['aiohttp'], python_requires='>=3.5.3', install_requires=install_requires, extras_require={ 'speedups': [ 'aiodns', 'brotlipy', 'cchardet', ], }, tests_require=tests_require, setup_requires=pytest_runner, include_package_data=True, ext_modules=extensions, cmdclass=dict(build_ext=ve_build_ext), ) try: setup(**args) except BuildFailed: print("************************************************************") print("Cannot compile C accelerator module, use pure python version") print("************************************************************") del args['ext_modules'] del args['cmdclass'] setup(**args) aiohttp-3.6.2/tests/0000755000175100001650000000000013547410140014623 5ustar vstsdocker00000000000000aiohttp-3.6.2/tests/aiohttp.jpg0000644000175100001650000001476013547410117017011 0ustar vstsdocker00000000000000JFIFHHXExifMM*i&8Photoshop 3.08BIM8BIM%ُ B~C C  " @!1AQa"2qBRb#3Cr$%47 !1AQaq"B2Rbr ?і6=ec C!q3g╂<7>Ԟ?,S5ȴHXKקX)6U?~F= ]kLCMm:JodaqoIR(JʗR*~ùbLfD%X}ϓNcگFӱJEHupYf*Ϙ=EeI;4 )9^xxBsnTL7,&T~}ҧM!,ë4l EHX8A!Nmduzg>2hi3!$ 4cݾ?)s'/,@vI\flXŅG pErF}ĭ5b :H/mxPj*OO*}qlC; xĘinB3lv_3a  
[binary image data (tests/aiohttp.jpg) omitted]
[binary PNG image data omitted: icon rendered from file:///home/andrew/projects/aiohttp/docs/aiohttp-icon.svg, created/modified 2018-10-06T12:06:20+00:00]
aiohttp-3.6.2/tests/autobahn/0000755000175100001650000000000013547410140016424 5ustar vstsdocker00000000000000
aiohttp-3.6.2/tests/autobahn/client.py0000644000175100001650000000250713547410117020264 0ustar vstsdocker00000000000000#!/usr/bin/env python3
import asyncio

import aiohttp


async def client(loop, url, name):
    ws = await aiohttp.ws_connect(url + '/getCaseCount')
    num_tests = int((await ws.receive()).data)
    print('running %d cases' % num_tests)
    await ws.close()

    for i in range(1, num_tests + 1):
        print('running test case:', i)
        text_url = url + '/runCase?case=%d&agent=%s' % (i, name)
        ws = await aiohttp.ws_connect(text_url)

        while True:
            msg = await ws.receive()

            if msg.type == aiohttp.WSMsgType.text:
                await ws.send_str(msg.data)
            elif msg.type == aiohttp.WSMsgType.binary:
                await ws.send_bytes(msg.data)
            elif msg.type == aiohttp.WSMsgType.close:
                await ws.close()
                break
            else:
                break

    url = url + '/updateReports?agent=%s' % name
    ws = await aiohttp.ws_connect(url)
    await ws.close()


async def run(loop, url, name):
    try:
        await client(loop, url, name)
    except Exception:
        import traceback
        traceback.print_exc()


if __name__ == '__main__':
    loop = asyncio.get_event_loop()
    try:
        loop.run_until_complete(run(loop, 'http://localhost:9001', 'aiohttp'))
    except KeyboardInterrupt:
        pass
    finally:
        loop.close()
aiohttp-3.6.2/tests/autobahn/fuzzingclient.json0000644000175100001650000000042413547410117022216 0ustar vstsdocker00000000000000{
    "options": {"failByDrop": false},
    "outdir": "./reports/servers",

    "servers": [{"agent": "AutobahnServer",
                 "url": "ws://localhost:9001",
                 "options": {"version": 18}}],

    "cases": ["*"],
    "exclude-cases": ["12.*", "13.*"],
    "exclude-agent-cases": {}
}
aiohttp-3.6.2/tests/autobahn/fuzzingserver.json0000644000175100001650000000033113547410117022243 0ustar vstsdocker00000000000000{
    "url": "ws://localhost:9001",
    "options": {"failByDrop": false},
    "outdir": "./reports/clients",
    "webport": 8080,

    "cases": ["*"],
    "exclude-cases": ["12.*", "13.*"],
    "exclude-agent-cases": {}
}
aiohttp-3.6.2/tests/autobahn/server.py0000644000175100001650000000257213547410117020316 0ustar vstsdocker00000000000000#!/usr/bin/env python3
import asyncio
import logging

from aiohttp import web


async def wshandler(request):
    ws = web.WebSocketResponse(autoclose=False)
    is_ws = ws.can_prepare(request)
    if not is_ws:
        return web.HTTPBadRequest()

    await ws.prepare(request)

    while True:
        msg = await ws.receive()

        if msg.type == web.WSMsgType.text:
            await ws.send_str(msg.data)
        elif msg.type == web.WSMsgType.binary:
            await ws.send_bytes(msg.data)
        elif msg.type == web.WSMsgType.close:
            await ws.close()
            break
        else:
            break

    return ws


async def main(loop):
    app = web.Application()
    app.router.add_route('GET', '/', wshandler)

    handler = app._make_handler()
    srv = await loop.create_server(handler, '127.0.0.1', 9001)
    print("Server started at http://127.0.0.1:9001")
    return app, srv, handler


async def finish(app, srv, handler):
    srv.close()
    await handler.shutdown()
    await srv.wait_closed()


if __name__ == '__main__':
    loop = asyncio.get_event_loop()
    logging.basicConfig(level=logging.DEBUG,
                        format='%(asctime)s %(levelname)s %(message)s')

    app, srv, handler = loop.run_until_complete(main(loop))
    try:
        loop.run_forever()
    except KeyboardInterrupt:
        loop.run_until_complete(finish(app, srv, handler))
aiohttp-3.6.2/tests/conftest.py0000644000175100001650000000401613547410117017027 0ustar vstsdocker00000000000000import hashlib
import pathlib
import shutil
import ssl
import tempfile
import uuid

import pytest

try:
    import trustme
    TRUSTME = True
except ImportError:
    TRUSTME = False


pytest_plugins = ['aiohttp.pytest_plugin', 'pytester']


@pytest.fixture
def shorttmpdir():
    """Provides a temporary directory with a shorter file system path than
    the tmpdir fixture.
    """
    tmpdir = pathlib.Path(tempfile.mkdtemp())
    yield tmpdir
    # str(tmpdir) is required, Python 3.5 doesn't have __fspath__
    # concept
    shutil.rmtree(str(tmpdir), ignore_errors=True)


@pytest.fixture
def tls_certificate_authority():
    if not TRUSTME:
        pytest.xfail("trustme fails on 32bit Linux")
    return trustme.CA()


@pytest.fixture
def tls_certificate(tls_certificate_authority):
    return tls_certificate_authority.issue_server_cert(
        'localhost',
        '127.0.0.1',
        '::1',
    )


@pytest.fixture
def ssl_ctx(tls_certificate):
    ssl_ctx = ssl.SSLContext(ssl.PROTOCOL_SSLv23)
    tls_certificate.configure_cert(ssl_ctx)
    return ssl_ctx


@pytest.fixture
def client_ssl_ctx(tls_certificate_authority):
    ssl_ctx = ssl.create_default_context(purpose=ssl.Purpose.SERVER_AUTH)
    tls_certificate_authority.configure_trust(ssl_ctx)
    return ssl_ctx


@pytest.fixture
def tls_ca_certificate_pem_path(tls_certificate_authority):
    with tls_certificate_authority.cert_pem.tempfile() as ca_cert_pem:
        yield ca_cert_pem


@pytest.fixture
def tls_certificate_pem_path(tls_certificate):
    with tls_certificate.private_key_and_cert_chain_pem.tempfile() as cert_pem:
        yield cert_pem


@pytest.fixture
def tls_certificate_pem_bytes(tls_certificate):
    return tls_certificate.cert_chain_pems[0].bytes()


@pytest.fixture
def tls_certificate_fingerprint_sha256(tls_certificate_pem_bytes):
    tls_cert_der = ssl.PEM_cert_to_DER_cert(tls_certificate_pem_bytes.decode())
    return hashlib.sha256(tls_cert_der).digest()


@pytest.fixture
def pipe_name():
    name = r'\\.\pipe\{}'.format(uuid.uuid4().hex)
    return name
aiohttp-3.6.2/tests/data.unknown_mime_type0000644000175100001650000000001513547410117021225 0ustar vstsdocker00000000000000file content
aiohttp-3.6.2/tests/hello.txt.gz0000644000175100001650000000005413547410117017111 0ustar vstsdocker00000000000000[binary data omitted: gzip-compressed hello.txt]
aiohttp-3.6.2/tests/test_base_protocol.py0000644000175100001650000001050013547410117021067 0ustar vstsdocker00000000000000import asyncio
from contextlib import suppress
from unittest import mock

import pytest

from aiohttp.base_protocol import BaseProtocol


async def test_loop() -> None:
    loop = asyncio.get_event_loop()
    asyncio.set_event_loop(None)
    pr = BaseProtocol(loop)
    assert pr._loop is loop


async def test_pause_writing() -> None:
    loop = asyncio.get_event_loop()
    pr = BaseProtocol(loop)
    assert not pr._paused
    pr.pause_writing()
    assert pr._paused


async def test_resume_writing_no_waiters() -> None:
    loop = asyncio.get_event_loop()
    pr = BaseProtocol(loop=loop)
    pr.pause_writing()
    assert pr._paused
    pr.resume_writing()
    assert not pr._paused


async def test_connection_made() -> None:
    loop = asyncio.get_event_loop()
    pr = BaseProtocol(loop=loop)
    tr = mock.Mock()
    assert pr.transport is None
    pr.connection_made(tr)
    assert pr.transport is not None


async def test_connection_lost_not_paused() -> None:
    loop = asyncio.get_event_loop()
    pr = BaseProtocol(loop=loop)
    tr = mock.Mock()
pr.connection_made(tr) assert not pr._connection_lost pr.connection_lost(None) assert pr.transport is None assert pr._connection_lost async def test_connection_lost_paused_without_waiter() -> None: loop = asyncio.get_event_loop() pr = BaseProtocol(loop=loop) tr = mock.Mock() pr.connection_made(tr) assert not pr._connection_lost pr.pause_writing() pr.connection_lost(None) assert pr.transport is None assert pr._connection_lost async def test_drain_lost() -> None: loop = asyncio.get_event_loop() pr = BaseProtocol(loop=loop) tr = mock.Mock() pr.connection_made(tr) pr.connection_lost(None) with pytest.raises(ConnectionResetError): await pr._drain_helper() async def test_drain_not_paused() -> None: loop = asyncio.get_event_loop() pr = BaseProtocol(loop=loop) tr = mock.Mock() pr.connection_made(tr) assert pr._drain_waiter is None await pr._drain_helper() assert pr._drain_waiter is None async def test_resume_drain_waited() -> None: loop = asyncio.get_event_loop() pr = BaseProtocol(loop=loop) tr = mock.Mock() pr.connection_made(tr) pr.pause_writing() t = loop.create_task(pr._drain_helper()) await asyncio.sleep(0) assert pr._drain_waiter is not None pr.resume_writing() assert (await t) is None assert pr._drain_waiter is None async def test_lost_drain_waited_ok() -> None: loop = asyncio.get_event_loop() pr = BaseProtocol(loop=loop) tr = mock.Mock() pr.connection_made(tr) pr.pause_writing() t = loop.create_task(pr._drain_helper()) await asyncio.sleep(0) assert pr._drain_waiter is not None pr.connection_lost(None) assert (await t) is None assert pr._drain_waiter is None async def test_lost_drain_waited_exception() -> None: loop = asyncio.get_event_loop() pr = BaseProtocol(loop=loop) tr = mock.Mock() pr.connection_made(tr) pr.pause_writing() t = loop.create_task(pr._drain_helper()) await asyncio.sleep(0) assert pr._drain_waiter is not None exc = RuntimeError() pr.connection_lost(exc) with pytest.raises(RuntimeError) as cm: await t assert cm.value is exc assert pr._drain_waiter is None async def test_lost_drain_cancelled() -> None: loop = asyncio.get_event_loop() pr = BaseProtocol(loop=loop) tr = mock.Mock() pr.connection_made(tr) pr.pause_writing() fut = loop.create_future() async def wait(): fut.set_result(None) await pr._drain_helper() t = loop.create_task(wait()) await fut t.cancel() assert pr._drain_waiter is not None pr.connection_lost(None) with suppress(asyncio.CancelledError): await t assert pr._drain_waiter is None async def test_resume_drain_cancelled() -> None: loop = asyncio.get_event_loop() pr = BaseProtocol(loop=loop) tr = mock.Mock() pr.connection_made(tr) pr.pause_writing() fut = loop.create_future() async def wait(): fut.set_result(None) await pr._drain_helper() t = loop.create_task(wait()) await fut t.cancel() assert pr._drain_waiter is not None pr.resume_writing() with suppress(asyncio.CancelledError): await t assert pr._drain_waiter is None aiohttp-3.6.2/tests/test_classbasedview.py0000644000175100001650000000243713547410117021245 0ustar vstsdocker00000000000000from unittest import mock import pytest from aiohttp import web from aiohttp.web_urldispatcher import View def test_ctor() -> None: request = mock.Mock() view = View(request) assert view.request is request async def test_render_ok() -> None: resp = web.Response(text='OK') class MyView(View): async def get(self): return resp request = mock.Mock() request.method = 'GET' resp2 = await MyView(request) assert resp is resp2 async def test_render_unknown_method() -> None: class MyView(View): async def get(self): return 
web.Response(text='OK') options = get request = mock.Mock() request.method = 'UNKNOWN' with pytest.raises(web.HTTPMethodNotAllowed) as ctx: await MyView(request) assert ctx.value.headers['allow'] == 'GET,OPTIONS' assert ctx.value.status == 405 async def test_render_unsupported_method() -> None: class MyView(View): async def get(self): return web.Response(text='OK') options = delete = get request = mock.Mock() request.method = 'POST' with pytest.raises(web.HTTPMethodNotAllowed) as ctx: await MyView(request) assert ctx.value.headers['allow'] == 'DELETE,GET,OPTIONS' assert ctx.value.status == 405 aiohttp-3.6.2/tests/test_client_connection.py0000644000175100001650000000653413547410117021745 0ustar vstsdocker00000000000000import gc from unittest import mock import pytest from aiohttp.connector import Connection @pytest.fixture def key(): return object() @pytest.fixture def loop(): return mock.Mock() @pytest.fixture def connector(): return mock.Mock() @pytest.fixture def protocol(): return mock.Mock(should_close=False) def test_ctor(connector, key, protocol, loop) -> None: conn = Connection(connector, key, protocol, loop) with pytest.warns(DeprecationWarning): assert conn.loop is loop assert conn.protocol is protocol conn.close() def test_callbacks_on_close(connector, key, protocol, loop) -> None: conn = Connection(connector, key, protocol, loop) notified = False def cb(): nonlocal notified notified = True conn.add_callback(cb) conn.close() assert notified def test_callbacks_on_release(connector, key, protocol, loop) -> None: conn = Connection(connector, key, protocol, loop) notified = False def cb(): nonlocal notified notified = True conn.add_callback(cb) conn.release() assert notified def test_callbacks_exception(connector, key, protocol, loop) -> None: conn = Connection(connector, key, protocol, loop) notified = False def cb1(): raise Exception def cb2(): nonlocal notified notified = True conn.add_callback(cb1) conn.add_callback(cb2) conn.close() assert notified def test_del(connector, key, protocol, loop) -> None: loop.is_closed.return_value = False conn = Connection(connector, key, protocol, loop) exc_handler = mock.Mock() loop.set_exception_handler(exc_handler) with pytest.warns(ResourceWarning): del conn gc.collect() connector._release.assert_called_with(key, protocol, should_close=True) msg = {'client_connection': mock.ANY, # conn was deleted 'message': 'Unclosed connection'} if loop.get_debug(): msg['source_traceback'] = mock.ANY loop.call_exception_handler.assert_called_with(msg) def test_close(connector, key, protocol, loop) -> None: conn = Connection(connector, key, protocol, loop) assert not conn.closed conn.close() assert conn._protocol is None connector._release.assert_called_with(key, protocol, should_close=True) assert conn.closed def test_release(connector, key, protocol, loop) -> None: conn = Connection(connector, key, protocol, loop) assert not conn.closed conn.release() assert not protocol.transport.close.called assert conn._protocol is None connector._release.assert_called_with(key, protocol, should_close=False) assert conn.closed def test_release_proto_should_close(connector, key, protocol, loop) -> None: protocol.should_close = True conn = Connection(connector, key, protocol, loop) assert not conn.closed conn.release() assert not protocol.transport.close.called assert conn._protocol is None connector._release.assert_called_with(key, protocol, should_close=True) assert conn.closed def test_release_released(connector, key, protocol, loop) -> None: conn = 
Connection(connector, key, protocol, loop) conn.release() connector._release.reset_mock() conn.release() assert not protocol.transport.close.called assert conn._protocol is None assert not connector._release.called aiohttp-3.6.2/tests/test_client_exceptions.py0000644000175100001650000002715013547410117021764 0ustar vstsdocker00000000000000"""Tests for client_exceptions.py""" import errno import pickle import sys from unittest import mock import pytest from aiohttp import client, client_reqrep class TestClientResponseError: request_info = client.RequestInfo(url='http://example.com', method='GET', headers={}, real_url='http://example.com') def test_default_status(self) -> None: err = client.ClientResponseError(history=(), request_info=self.request_info) assert err.status == 0 def test_status(self) -> None: err = client.ClientResponseError(status=400, history=(), request_info=self.request_info) assert err.status == 400 def test_pickle(self) -> None: err = client.ClientResponseError(request_info=self.request_info, history=()) for proto in range(pickle.HIGHEST_PROTOCOL + 1): pickled = pickle.dumps(err, proto) err2 = pickle.loads(pickled) assert err2.request_info == self.request_info assert err2.history == () assert err2.status == 0 assert err2.message == '' assert err2.headers is None err = client.ClientResponseError(request_info=self.request_info, history=(), status=400, message='Something wrong', headers={}) err.foo = 'bar' for proto in range(pickle.HIGHEST_PROTOCOL + 1): pickled = pickle.dumps(err, proto) err2 = pickle.loads(pickled) assert err2.request_info == self.request_info assert err2.history == () assert err2.status == 400 assert err2.message == 'Something wrong' assert err2.headers == {} assert err2.foo == 'bar' def test_repr(self) -> None: err = client.ClientResponseError(request_info=self.request_info, history=()) assert repr(err) == ("ClientResponseError(%r, ())" % (self.request_info,)) err = client.ClientResponseError(request_info=self.request_info, history=(), status=400, message='Something wrong', headers={}) assert repr(err) == ("ClientResponseError(%r, (), status=400, " "message='Something wrong', headers={})" % (self.request_info,)) def test_str(self) -> None: err = client.ClientResponseError(request_info=self.request_info, history=(), status=400, message='Something wrong', headers={}) assert str(err) == ("400, message='Something wrong', " "url='http://example.com'") def test_response_status() -> None: request_info = mock.Mock(real_url='http://example.com') err = client.ClientResponseError(status=400, history=None, request_info=request_info) assert err.status == 400 def test_response_deprecated_code_property() -> None: request_info = mock.Mock(real_url='http://example.com') with pytest.warns(DeprecationWarning): err = client.ClientResponseError(code=400, history=None, request_info=request_info) with pytest.warns(DeprecationWarning): assert err.code == err.status with pytest.warns(DeprecationWarning): err.code = '404' with pytest.warns(DeprecationWarning): assert err.code == err.status def test_response_both_code_and_status() -> None: with pytest.raises(ValueError): client.ClientResponseError(code=400, status=400, history=None, request_info=None) class TestClientConnectorError: connection_key = client_reqrep.ConnectionKey( host='example.com', port=8080, is_ssl=False, ssl=None, proxy=None, proxy_auth=None, proxy_headers_hash=None) def test_ctor(self) -> None: err = client.ClientConnectorError( connection_key=self.connection_key, os_error=OSError(errno.ENOENT, 'No such file')) 
assert err.errno == errno.ENOENT assert err.strerror == 'No such file' assert err.os_error.errno == errno.ENOENT assert err.os_error.strerror == 'No such file' assert err.host == 'example.com' assert err.port == 8080 assert err.ssl is None def test_pickle(self) -> None: err = client.ClientConnectorError( connection_key=self.connection_key, os_error=OSError(errno.ENOENT, 'No such file')) err.foo = 'bar' for proto in range(pickle.HIGHEST_PROTOCOL + 1): pickled = pickle.dumps(err, proto) err2 = pickle.loads(pickled) assert err2.errno == errno.ENOENT assert err2.strerror == 'No such file' assert err2.os_error.errno == errno.ENOENT assert err2.os_error.strerror == 'No such file' assert err2.host == 'example.com' assert err2.port == 8080 assert err2.ssl is None assert err2.foo == 'bar' def test_repr(self) -> None: os_error = OSError(errno.ENOENT, 'No such file') err = client.ClientConnectorError(connection_key=self.connection_key, os_error=os_error) assert repr(err) == ("ClientConnectorError(%r, %r)" % (self.connection_key, os_error)) def test_str(self) -> None: err = client.ClientConnectorError( connection_key=self.connection_key, os_error=OSError(errno.ENOENT, 'No such file')) assert str(err) == ("Cannot connect to host example.com:8080 ssl:" "default [No such file]") class TestClientConnectorCertificateError: connection_key = client_reqrep.ConnectionKey( host='example.com', port=8080, is_ssl=False, ssl=None, proxy=None, proxy_auth=None, proxy_headers_hash=None) def test_ctor(self) -> None: certificate_error = Exception('Bad certificate') err = client.ClientConnectorCertificateError( connection_key=self.connection_key, certificate_error=certificate_error) assert err.certificate_error == certificate_error assert err.host == 'example.com' assert err.port == 8080 assert err.ssl is False def test_pickle(self) -> None: certificate_error = Exception('Bad certificate') err = client.ClientConnectorCertificateError( connection_key=self.connection_key, certificate_error=certificate_error) err.foo = 'bar' for proto in range(pickle.HIGHEST_PROTOCOL + 1): pickled = pickle.dumps(err, proto) err2 = pickle.loads(pickled) assert err2.certificate_error.args == ('Bad certificate',) assert err2.host == 'example.com' assert err2.port == 8080 assert err2.ssl is False assert err2.foo == 'bar' def test_repr(self) -> None: certificate_error = Exception('Bad certificate') err = client.ClientConnectorCertificateError( connection_key=self.connection_key, certificate_error=certificate_error) assert repr(err) == ("ClientConnectorCertificateError(%r, %r)" % (self.connection_key, certificate_error)) def test_str(self) -> None: certificate_error = Exception('Bad certificate') err = client.ClientConnectorCertificateError( connection_key=self.connection_key, certificate_error=certificate_error) assert str(err) == ("Cannot connect to host example.com:8080 ssl:False" " [Exception: ('Bad certificate',)]") class TestServerDisconnectedError: def test_ctor(self) -> None: err = client.ServerDisconnectedError() assert err.message is None err = client.ServerDisconnectedError(message='No connection') assert err.message == 'No connection' def test_pickle(self) -> None: err = client.ServerDisconnectedError(message='No connection') err.foo = 'bar' for proto in range(pickle.HIGHEST_PROTOCOL + 1): pickled = pickle.dumps(err, proto) err2 = pickle.loads(pickled) assert err2.message == 'No connection' assert err2.foo == 'bar' def test_repr(self) -> None: err = client.ServerDisconnectedError() assert repr(err) == "ServerDisconnectedError()" err = 
client.ServerDisconnectedError(message='No connection') if sys.version_info < (3, 7): assert repr(err) == "ServerDisconnectedError('No connection',)" else: assert repr(err) == "ServerDisconnectedError('No connection')" def test_str(self) -> None: err = client.ServerDisconnectedError() assert str(err) == '' err = client.ServerDisconnectedError(message='No connection') assert str(err) == 'No connection' class TestServerFingerprintMismatch: def test_ctor(self) -> None: err = client.ServerFingerprintMismatch(expected=b'exp', got=b'got', host='example.com', port=8080) assert err.expected == b'exp' assert err.got == b'got' assert err.host == 'example.com' assert err.port == 8080 def test_pickle(self) -> None: err = client.ServerFingerprintMismatch(expected=b'exp', got=b'got', host='example.com', port=8080) err.foo = 'bar' for proto in range(pickle.HIGHEST_PROTOCOL + 1): pickled = pickle.dumps(err, proto) err2 = pickle.loads(pickled) assert err2.expected == b'exp' assert err2.got == b'got' assert err2.host == 'example.com' assert err2.port == 8080 assert err2.foo == 'bar' def test_repr(self) -> None: err = client.ServerFingerprintMismatch(b'exp', b'got', 'example.com', 8080) assert repr(err) == ("") class TestInvalidURL: def test_ctor(self) -> None: err = client.InvalidURL(url=':wrong:url:') assert err.url == ':wrong:url:' def test_pickle(self) -> None: err = client.InvalidURL(url=':wrong:url:') err.foo = 'bar' for proto in range(pickle.HIGHEST_PROTOCOL + 1): pickled = pickle.dumps(err, proto) err2 = pickle.loads(pickled) assert err2.url == ':wrong:url:' assert err2.foo == 'bar' def test_repr(self) -> None: err = client.InvalidURL(url=':wrong:url:') assert repr(err) == "" def test_str(self) -> None: err = client.InvalidURL(url=':wrong:url:') assert str(err) == ':wrong:url:' aiohttp-3.6.2/tests/test_client_fingerprint.py0000644000175100001650000000473113547410117022132 0ustar vstsdocker00000000000000import hashlib from unittest import mock import pytest import aiohttp from aiohttp.client_reqrep import _merge_ssl_params ssl = pytest.importorskip('ssl') def test_fingerprint_sha256() -> None: sha256 = hashlib.sha256(b'12345678'*64).digest() fp = aiohttp.Fingerprint(sha256) assert fp.fingerprint == sha256 def test_fingerprint_sha1() -> None: sha1 = hashlib.sha1(b'12345678'*64).digest() with pytest.raises(ValueError): aiohttp.Fingerprint(sha1) def test_fingerprint_md5() -> None: md5 = hashlib.md5(b'12345678'*64).digest() with pytest.raises(ValueError): aiohttp.Fingerprint(md5) def test_fingerprint_check_no_ssl() -> None: sha256 = hashlib.sha256(b'12345678'*64).digest() fp = aiohttp.Fingerprint(sha256) transport = mock.Mock() transport.get_extra_info.return_value = None assert fp.check(transport) is None def test__merge_ssl_params_verify_ssl() -> None: with pytest.warns(DeprecationWarning): assert _merge_ssl_params(None, False, None, None) is False def test__merge_ssl_params_verify_ssl_conflict() -> None: ctx = ssl.SSLContext() with pytest.warns(DeprecationWarning): with pytest.raises(ValueError): _merge_ssl_params(ctx, False, None, None) def test__merge_ssl_params_ssl_context() -> None: ctx = ssl.SSLContext() with pytest.warns(DeprecationWarning): assert _merge_ssl_params(None, None, ctx, None) is ctx def test__merge_ssl_params_ssl_context_conflict() -> None: ctx1 = ssl.SSLContext() ctx2 = ssl.SSLContext() with pytest.warns(DeprecationWarning): with pytest.raises(ValueError): _merge_ssl_params(ctx1, None, ctx2, None) def test__merge_ssl_params_fingerprint() -> None: digest = 
hashlib.sha256(b'123').digest() with pytest.warns(DeprecationWarning): ret = _merge_ssl_params(None, None, None, digest) assert ret.fingerprint == digest def test__merge_ssl_params_fingerprint_conflict() -> None: fingerprint = aiohttp.Fingerprint(hashlib.sha256(b'123').digest()) ctx = ssl.SSLContext() with pytest.warns(DeprecationWarning): with pytest.raises(ValueError): _merge_ssl_params(ctx, None, None, fingerprint) def test__merge_ssl_params_ssl() -> None: ctx = ssl.SSLContext() assert ctx is _merge_ssl_params(ctx, None, None, None) def test__merge_ssl_params_invlid() -> None: with pytest.raises(TypeError): _merge_ssl_params(object(), None, None, None) aiohttp-3.6.2/tests/test_client_functional.py0000644000175100001650000024725113547410117021753 0ustar vstsdocker00000000000000"""HTTP client functional tests against aiohttp.web server""" import asyncio import datetime import http.cookies import io import json import pathlib import socket from unittest import mock import pytest from async_generator import async_generator, yield_ from multidict import MultiDict import aiohttp from aiohttp import Fingerprint, ServerFingerprintMismatch, hdrs, web from aiohttp.abc import AbstractResolver from aiohttp.client_exceptions import TooManyRedirects from aiohttp.test_utils import unused_port @pytest.fixture def here(): return pathlib.Path(__file__).parent @pytest.fixture def fname(here): return here / 'conftest.py' def ceil(val): return val async def test_keepalive_two_requests_success( aiohttp_client) -> None: async def handler(request): body = await request.read() assert b'' == body return web.Response(body=b'OK') app = web.Application() app.router.add_route('GET', '/', handler) connector = aiohttp.TCPConnector(limit=1) client = await aiohttp_client(app, connector=connector) resp1 = await client.get('/') await resp1.read() resp2 = await client.get('/') await resp2.read() assert 1 == len(client._session.connector._conns) async def test_keepalive_response_released(aiohttp_client) -> None: async def handler(request): body = await request.read() assert b'' == body return web.Response(body=b'OK') app = web.Application() app.router.add_route('GET', '/', handler) connector = aiohttp.TCPConnector(limit=1) client = await aiohttp_client(app, connector=connector) resp1 = await client.get('/') resp1.release() resp2 = await client.get('/') resp2.release() assert 1 == len(client._session.connector._conns) async def test_keepalive_server_force_close_connection(aiohttp_client) -> None: async def handler(request): body = await request.read() assert b'' == body response = web.Response(body=b'OK') response.force_close() return response app = web.Application() app.router.add_route('GET', '/', handler) connector = aiohttp.TCPConnector(limit=1) client = await aiohttp_client(app, connector=connector) resp1 = await client.get('/') resp1.close() resp2 = await client.get('/') resp2.close() assert 0 == len(client._session.connector._conns) async def test_release_early(aiohttp_client) -> None: async def handler(request): await request.read() return web.Response(body=b'OK') app = web.Application() app.router.add_route('GET', '/', handler) client = await aiohttp_client(app) resp = await client.get('/') assert resp.closed assert 1 == len(client._session.connector._conns) async def test_HTTP_304(aiohttp_client) -> None: async def handler(request): body = await request.read() assert b'' == body return web.Response(status=304) app = web.Application() app.router.add_route('GET', '/', handler) client = await aiohttp_client(app) resp 
= await client.get('/') assert resp.status == 304 content = await resp.read() assert content == b'' async def test_HTTP_304_WITH_BODY(aiohttp_client) -> None: async def handler(request): body = await request.read() assert b'' == body return web.Response(body=b'test', status=304) app = web.Application() app.router.add_route('GET', '/', handler) client = await aiohttp_client(app) resp = await client.get('/') assert resp.status == 304 content = await resp.read() assert content == b'' async def test_auto_header_user_agent(aiohttp_client) -> None: async def handler(request): assert 'aiohttp' in request.headers['user-agent'] return web.Response() app = web.Application() app.router.add_route('GET', '/', handler) client = await aiohttp_client(app) resp = await client.get('/') assert 200 == resp.status async def test_skip_auto_headers_user_agent(aiohttp_client) -> None: async def handler(request): assert hdrs.USER_AGENT not in request.headers return web.Response() app = web.Application() app.router.add_route('GET', '/', handler) client = await aiohttp_client(app) resp = await client.get('/', skip_auto_headers=['user-agent']) assert 200 == resp.status async def test_skip_default_auto_headers_user_agent(aiohttp_client) -> None: async def handler(request): assert hdrs.USER_AGENT not in request.headers return web.Response() app = web.Application() app.router.add_route('GET', '/', handler) client = await aiohttp_client(app, skip_auto_headers=['user-agent']) resp = await client.get('/') assert 200 == resp.status async def test_skip_auto_headers_content_type(aiohttp_client) -> None: async def handler(request): assert hdrs.CONTENT_TYPE not in request.headers return web.Response() app = web.Application() app.router.add_route('GET', '/', handler) client = await aiohttp_client(app) resp = await client.get('/', skip_auto_headers=['content-type']) assert 200 == resp.status async def test_post_data_bytesio(aiohttp_client) -> None: data = b'some buffer' async def handler(request): assert len(data) == request.content_length val = await request.read() assert data == val return web.Response() app = web.Application() app.router.add_route('POST', '/', handler) client = await aiohttp_client(app) resp = await client.post('/', data=io.BytesIO(data)) assert 200 == resp.status async def test_post_data_with_bytesio_file(aiohttp_client) -> None: data = b'some buffer' async def handler(request): post_data = await request.post() assert ['file'] == list(post_data.keys()) assert data == post_data['file'].file.read() return web.Response() app = web.Application() app.router.add_route('POST', '/', handler) client = await aiohttp_client(app) resp = await client.post('/', data={'file': io.BytesIO(data)}) assert 200 == resp.status async def test_post_data_stringio(aiohttp_client) -> None: data = 'some buffer' async def handler(request): assert len(data) == request.content_length assert request.headers['CONTENT-TYPE'] == 'text/plain; charset=utf-8' val = await request.text() assert data == val return web.Response() app = web.Application() app.router.add_route('POST', '/', handler) client = await aiohttp_client(app) resp = await client.post('/', data=io.StringIO(data)) assert 200 == resp.status async def test_post_data_textio_encoding(aiohttp_client) -> None: data = 'текст' async def handler(request): assert request.headers['CONTENT-TYPE'] == 'text/plain; charset=koi8-r' val = await request.text() assert data == val return web.Response() app = web.Application() app.router.add_route('POST', '/', handler) client = await 
aiohttp_client(app) pl = aiohttp.TextIOPayload(io.StringIO(data), encoding='koi8-r') resp = await client.post('/', data=pl) assert 200 == resp.status async def test_ssl_client( aiohttp_server, ssl_ctx, aiohttp_client, client_ssl_ctx, ) -> None: connector = aiohttp.TCPConnector(ssl=client_ssl_ctx) async def handler(request): return web.Response(text='Test message') app = web.Application() app.router.add_route('GET', '/', handler) server = await aiohttp_server(app, ssl=ssl_ctx) client = await aiohttp_client(server, connector=connector) resp = await client.get('/') assert 200 == resp.status txt = await resp.text() assert txt == 'Test message' async def test_tcp_connector_fingerprint_ok( aiohttp_server, aiohttp_client, ssl_ctx, tls_certificate_fingerprint_sha256, ): tls_fingerprint = Fingerprint(tls_certificate_fingerprint_sha256) async def handler(request): return web.Response(text='Test message') connector = aiohttp.TCPConnector(ssl=tls_fingerprint) app = web.Application() app.router.add_route('GET', '/', handler) server = await aiohttp_server(app, ssl=ssl_ctx) client = await aiohttp_client(server, connector=connector) resp = await client.get('/') assert resp.status == 200 resp.close() async def test_tcp_connector_fingerprint_fail( aiohttp_server, aiohttp_client, ssl_ctx, tls_certificate_fingerprint_sha256, ): async def handler(request): return web.Response(text='Test message') bad_fingerprint = b'\x00' * len(tls_certificate_fingerprint_sha256) connector = aiohttp.TCPConnector(ssl=Fingerprint(bad_fingerprint)) app = web.Application() app.router.add_route('GET', '/', handler) server = await aiohttp_server(app, ssl=ssl_ctx) client = await aiohttp_client(server, connector=connector) with pytest.raises(ServerFingerprintMismatch) as cm: await client.get('/') exc = cm.value assert exc.expected == bad_fingerprint assert exc.got == tls_certificate_fingerprint_sha256 async def test_format_task_get(aiohttp_server) -> None: loop = asyncio.get_event_loop() async def handler(request): return web.Response(body=b'OK') app = web.Application() app.router.add_route('GET', '/', handler) server = await aiohttp_server(app) client = aiohttp.ClientSession() task = loop.create_task(client.get(server.make_url('/'))) assert "{}".format(task).startswith(" None: async def handler(request): assert 'q=t est' in request.rel_url.query_string return web.Response() app = web.Application() app.router.add_route('GET', '/', handler) client = await aiohttp_client(app) resp = await client.get('/', params='q=t+est') assert 200 == resp.status async def test_drop_params_on_redirect(aiohttp_client) -> None: async def handler_redirect(request): return web.Response(status=301, headers={'Location': '/ok?a=redirect'}) async def handler_ok(request): assert request.rel_url.query_string == 'a=redirect' return web.Response(status=200) app = web.Application() app.router.add_route('GET', '/ok', handler_ok) app.router.add_route('GET', '/redirect', handler_redirect) client = await aiohttp_client(app) resp = await client.get('/redirect', params={'a': 'initial'}) assert resp.status == 200 async def test_drop_fragment_on_redirect(aiohttp_client) -> None: async def handler_redirect(request): return web.Response(status=301, headers={'Location': '/ok#fragment'}) async def handler_ok(request): return web.Response(status=200) app = web.Application() app.router.add_route('GET', '/ok', handler_ok) app.router.add_route('GET', '/redirect', handler_redirect) client = await aiohttp_client(app) resp = await client.get('/redirect') assert resp.status == 200 
assert resp.url.path == '/ok' async def test_drop_fragment(aiohttp_client) -> None: async def handler_ok(request): return web.Response(status=200) app = web.Application() app.router.add_route('GET', '/ok', handler_ok) client = await aiohttp_client(app) resp = await client.get('/ok#fragment') assert resp.status == 200 assert resp.url.path == '/ok' async def test_history(aiohttp_client) -> None: async def handler_redirect(request): return web.Response(status=301, headers={'Location': '/ok'}) async def handler_ok(request): return web.Response(status=200) app = web.Application() app.router.add_route('GET', '/ok', handler_ok) app.router.add_route('GET', '/redirect', handler_redirect) client = await aiohttp_client(app) resp = await client.get('/ok') assert len(resp.history) == 0 assert resp.status == 200 resp_redirect = await client.get('/redirect') assert len(resp_redirect.history) == 1 assert resp_redirect.history[0].status == 301 assert resp_redirect.status == 200 async def test_keepalive_closed_by_server(aiohttp_client) -> None: async def handler(request): body = await request.read() assert b'' == body resp = web.Response(body=b'OK') resp.force_close() return resp app = web.Application() app.router.add_route('GET', '/', handler) connector = aiohttp.TCPConnector(limit=1) client = await aiohttp_client(app, connector=connector) resp1 = await client.get('/') val1 = await resp1.read() assert val1 == b'OK' resp2 = await client.get('/') val2 = await resp2.read() assert val2 == b'OK' assert 0 == len(client._session.connector._conns) async def test_wait_for(aiohttp_client) -> None: async def handler(request): return web.Response(body=b'OK') app = web.Application() app.router.add_route('GET', '/', handler) client = await aiohttp_client(app) resp = await asyncio.wait_for(client.get('/'), 10) assert resp.status == 200 txt = await resp.text() assert txt == 'OK' async def test_raw_headers(aiohttp_client) -> None: async def handler(request): return web.Response() app = web.Application() app.router.add_route('GET', '/', handler) client = await aiohttp_client(app) resp = await client.get('/') assert resp.status == 200 raw_headers = tuple((bytes(h), bytes(v)) for h, v in resp.raw_headers) assert raw_headers == ((b'Content-Length', b'0'), (b'Content-Type', b'application/octet-stream'), (b'Date', mock.ANY), (b'Server', mock.ANY)) resp.close() async def test_host_header_first(aiohttp_client) -> None: async def handler(request): assert list(request.headers)[0] == hdrs.HOST return web.Response() app = web.Application() app.router.add_route('GET', '/', handler) client = await aiohttp_client(app) resp = await client.get('/') assert resp.status == 200 async def test_empty_header_values(aiohttp_client) -> None: async def handler(request): resp = web.Response() resp.headers['X-Empty'] = '' return resp app = web.Application() app.router.add_route('GET', '/', handler) client = await aiohttp_client(app) resp = await client.get('/') assert resp.status == 200 raw_headers = tuple((bytes(h), bytes(v)) for h, v in resp.raw_headers) assert raw_headers == ((b'X-Empty', b''), (b'Content-Length', b'0'), (b'Content-Type', b'application/octet-stream'), (b'Date', mock.ANY), (b'Server', mock.ANY)) resp.close() async def test_204_with_gzipped_content_encoding(aiohttp_client) -> None: async def handler(request): resp = web.StreamResponse(status=204) resp.content_length = 0 resp.content_type = 'application/json' # resp.enable_compression(web.ContentCoding.gzip) resp.headers['Content-Encoding'] = 'gzip' await resp.prepare(request) return 
resp app = web.Application() app.router.add_route('DELETE', '/', handler) client = await aiohttp_client(app) resp = await client.delete('/') assert resp.status == 204 assert resp.closed async def test_timeout_on_reading_headers(aiohttp_client, mocker) -> None: mocker.patch('aiohttp.helpers.ceil').side_effect = ceil async def handler(request): resp = web.StreamResponse() await asyncio.sleep(0.1) await resp.prepare(request) return resp app = web.Application() app.router.add_route('GET', '/', handler) client = await aiohttp_client(app) with pytest.raises(asyncio.TimeoutError): await client.get('/', timeout=0.01) async def test_timeout_on_conn_reading_headers(aiohttp_client, mocker) -> None: # tests case where user did not set a connection timeout mocker.patch('aiohttp.helpers.ceil').side_effect = ceil async def handler(request): resp = web.StreamResponse() await asyncio.sleep(0.1) await resp.prepare(request) return resp app = web.Application() app.router.add_route('GET', '/', handler) conn = aiohttp.TCPConnector() client = await aiohttp_client(app, connector=conn) with pytest.raises(asyncio.TimeoutError): await client.get('/', timeout=0.01) async def test_timeout_on_session_read_timeout(aiohttp_client, mocker) -> None: mocker.patch('aiohttp.helpers.ceil').side_effect = ceil async def handler(request): resp = web.StreamResponse() await asyncio.sleep(0.1) await resp.prepare(request) return resp app = web.Application() app.router.add_route('GET', '/', handler) conn = aiohttp.TCPConnector() client = await aiohttp_client( app, connector=conn, timeout=aiohttp.ClientTimeout(sock_read=0.01)) with pytest.raises(asyncio.TimeoutError): await client.get('/') async def test_read_timeout_between_chunks(aiohttp_client, mocker) -> None: mocker.patch('aiohttp.helpers.ceil').side_effect = ceil async def handler(request): resp = aiohttp.web.StreamResponse() await resp.prepare(request) # write data 4 times, with pauses. Total time 0.4 seconds. for _ in range(4): await asyncio.sleep(0.1) await resp.write(b'data\n') return resp app = web.Application() app.add_routes([web.get('/', handler)]) # A timeout of 0.2 seconds should apply per read. timeout = aiohttp.ClientTimeout(sock_read=0.2) client = await aiohttp_client(app, timeout=timeout) res = b'' async with await client.get('/') as resp: res += await resp.read() assert res == b'data\n' * 4 async def test_read_timeout_on_reading_chunks(aiohttp_client, mocker) -> None: mocker.patch('aiohttp.helpers.ceil').side_effect = ceil async def handler(request): resp = aiohttp.web.StreamResponse() await resp.prepare(request) await resp.write(b'data\n') await asyncio.sleep(1) await resp.write(b'data\n') return resp app = web.Application() app.add_routes([web.get('/', handler)]) # A timeout of 0.2 seconds should apply per read. 
timeout = aiohttp.ClientTimeout(sock_read=0.2) client = await aiohttp_client(app, timeout=timeout) async with await client.get('/') as resp: assert (await resp.content.read(5)) == b'data\n' with pytest.raises(asyncio.TimeoutError): await resp.content.read() async def test_timeout_on_reading_data(aiohttp_client, mocker) -> None: loop = asyncio.get_event_loop() mocker.patch('aiohttp.helpers.ceil').side_effect = ceil fut = loop.create_future() async def handler(request): resp = web.StreamResponse(headers={'content-length': '100'}) await resp.prepare(request) fut.set_result(None) await asyncio.sleep(0.2) return resp app = web.Application() app.router.add_route('GET', '/', handler) client = await aiohttp_client(app) resp = await client.get('/', timeout=1) await fut with pytest.raises(asyncio.TimeoutError): await resp.read() async def test_timeout_none(aiohttp_client, mocker) -> None: mocker.patch('aiohttp.helpers.ceil').side_effect = ceil async def handler(request): resp = web.StreamResponse() await resp.prepare(request) return resp app = web.Application() app.router.add_route('GET', '/', handler) client = await aiohttp_client(app) resp = await client.get('/', timeout=None) assert resp.status == 200 async def test_readline_error_on_conn_close(aiohttp_client) -> None: loop = asyncio.get_event_loop() async def handler(request): resp_ = web.StreamResponse() await resp_.prepare(request) # make sure connection is closed by client. with pytest.raises(aiohttp.ServerDisconnectedError): for _ in range(10): await resp_.write(b'data\n') await asyncio.sleep(0.5) return resp_ app = web.Application() app.router.add_route('GET', '/', handler) server = await aiohttp_client(app) session = aiohttp.ClientSession() try: timer_started = False url, headers = server.make_url('/'), {'Connection': 'Keep-alive'} resp = await session.get(url, headers=headers) with pytest.raises(aiohttp.ClientConnectionError): while True: data = await resp.content.readline() data = data.strip() if not data: break assert data == b'data' if not timer_started: def do_release(): loop.create_task(resp.release()) loop.call_later(1.0, do_release) timer_started = True finally: await session.close() async def test_no_error_on_conn_close_if_eof(aiohttp_client) -> None: async def handler(request): resp_ = web.StreamResponse() await resp_.prepare(request) await resp_.write(b'data\n') await asyncio.sleep(0.5) return resp_ app = web.Application() app.router.add_route('GET', '/', handler) server = await aiohttp_client(app) session = aiohttp.ClientSession() try: url, headers = server.make_url('/'), {'Connection': 'Keep-alive'} resp = await session.get(url, headers=headers) while True: data = await resp.content.readline() data = data.strip() if not data: break assert data == b'data' assert resp.content.exception() is None finally: await session.close() async def test_error_not_overwrote_on_conn_close(aiohttp_client) -> None: async def handler(request): resp_ = web.StreamResponse() await resp_.prepare(request) return resp_ app = web.Application() app.router.add_route('GET', '/', handler) server = await aiohttp_client(app) session = aiohttp.ClientSession() try: url, headers = server.make_url('/'), {'Connection': 'Keep-alive'} resp = await session.get(url, headers=headers) resp.content.set_exception(ValueError()) finally: await session.close() assert isinstance(resp.content.exception(), ValueError) async def test_HTTP_200_OK_METHOD(aiohttp_client) -> None: async def handler(request): return web.Response(text=request.method) app = web.Application() for meth 
in ('get', 'post', 'put', 'delete', 'head', 'patch', 'options'): app.router.add_route(meth.upper(), '/', handler) client = await aiohttp_client(app) for meth in ('get', 'post', 'put', 'delete', 'head', 'patch', 'options'): resp = await client.request(meth, '/') assert resp.status == 200 assert len(resp.history) == 0 content1 = await resp.read() content2 = await resp.read() assert content1 == content2 content = await resp.text() if meth == 'head': assert b'' == content1 else: assert meth.upper() == content async def test_HTTP_200_OK_METHOD_connector(aiohttp_client) -> None: async def handler(request): return web.Response(text=request.method) conn = aiohttp.TCPConnector() conn.clear_dns_cache() app = web.Application() for meth in ('get', 'post', 'put', 'delete', 'head'): app.router.add_route(meth.upper(), '/', handler) client = await aiohttp_client(app, connector=conn) for meth in ('get', 'post', 'put', 'delete', 'head'): resp = await client.request(meth, '/') content1 = await resp.read() content2 = await resp.read() assert content1 == content2 content = await resp.text() assert resp.status == 200 if meth == 'head': assert b'' == content1 else: assert meth.upper() == content async def test_HTTP_302_REDIRECT_GET(aiohttp_client) -> None: async def handler(request): return web.Response(text=request.method) async def redirect(request): raise web.HTTPFound(location='/') app = web.Application() app.router.add_get('/', handler) app.router.add_get('/redirect', redirect) client = await aiohttp_client(app) resp = await client.get('/redirect') assert 200 == resp.status assert 1 == len(resp.history) resp.close() async def test_HTTP_302_REDIRECT_HEAD(aiohttp_client) -> None: async def handler(request): return web.Response(text=request.method) async def redirect(request): raise web.HTTPFound(location='/') app = web.Application() app.router.add_get('/', handler) app.router.add_get('/redirect', redirect) app.router.add_head('/', handler) app.router.add_head('/redirect', redirect) client = await aiohttp_client(app) resp = await client.request('head', '/redirect') assert 200 == resp.status assert 1 == len(resp.history) assert resp.method == 'HEAD' resp.close() async def test_HTTP_302_REDIRECT_NON_HTTP(aiohttp_client) -> None: async def redirect(request): raise web.HTTPFound(location='ftp://127.0.0.1/test/') app = web.Application() app.router.add_get('/redirect', redirect) client = await aiohttp_client(app) with pytest.raises(ValueError): await client.get('/redirect') async def test_HTTP_302_REDIRECT_POST(aiohttp_client) -> None: async def handler(request): return web.Response(text=request.method) async def redirect(request): raise web.HTTPFound(location='/') app = web.Application() app.router.add_get('/', handler) app.router.add_post('/redirect', redirect) client = await aiohttp_client(app) resp = await client.post('/redirect') assert 200 == resp.status assert 1 == len(resp.history) txt = await resp.text() assert txt == 'GET' resp.close() async def test_HTTP_302_REDIRECT_POST_with_content_length_hdr( aiohttp_client) -> None: async def handler(request): return web.Response(text=request.method) async def redirect(request): await request.read() raise web.HTTPFound(location='/') data = json.dumps({'some': 'data'}) app = web.Application() app.router.add_get('/', handler) app.router.add_post('/redirect', redirect) client = await aiohttp_client(app) resp = await client.post( '/redirect', data=data, headers={'Content-Length': str(len(data))} ) assert 200 == resp.status assert 1 == len(resp.history) txt = await 
resp.text() assert txt == 'GET' resp.close() async def test_HTTP_307_REDIRECT_POST(aiohttp_client) -> None: async def handler(request): return web.Response(text=request.method) async def redirect(request): await request.read() raise web.HTTPTemporaryRedirect(location='/') app = web.Application() app.router.add_post('/', handler) app.router.add_post('/redirect', redirect) client = await aiohttp_client(app) resp = await client.post('/redirect', data={'some': 'data'}) assert 200 == resp.status assert 1 == len(resp.history) txt = await resp.text() assert txt == 'POST' resp.close() async def test_HTTP_308_PERMANENT_REDIRECT_POST(aiohttp_client) -> None: async def handler(request): return web.Response(text=request.method) async def redirect(request): await request.read() raise web.HTTPPermanentRedirect(location='/') app = web.Application() app.router.add_post('/', handler) app.router.add_post('/redirect', redirect) client = await aiohttp_client(app) resp = await client.post('/redirect', data={'some': 'data'}) assert 200 == resp.status assert 1 == len(resp.history) txt = await resp.text() assert txt == 'POST' resp.close() async def test_HTTP_302_max_redirects(aiohttp_client) -> None: async def handler(request): return web.Response(text=request.method) async def redirect(request): count = int(request.match_info['count']) if count: raise web.HTTPFound(location='/redirect/{}'.format(count-1)) else: raise web.HTTPFound(location='/') app = web.Application() app.router.add_get('/', handler) app.router.add_get(r'/redirect/{count:\d+}', redirect) client = await aiohttp_client(app) with pytest.raises(TooManyRedirects) as ctx: await client.get('/redirect/5', max_redirects=2) assert 2 == len(ctx.value.history) assert ctx.value.request_info.url.path == '/redirect/5' assert ctx.value.request_info.method == 'GET' async def test_HTTP_200_GET_WITH_PARAMS(aiohttp_client) -> None: async def handler(request): return web.Response(text='&'.join( k+'='+v for k, v in request.query.items())) app = web.Application() app.router.add_get('/', handler) client = await aiohttp_client(app) resp = await client.get('/', params={'q': 'test'}) assert 200 == resp.status txt = await resp.text() assert txt == 'q=test' resp.close() async def test_HTTP_200_GET_WITH_MultiDict_PARAMS(aiohttp_client) -> None: async def handler(request): return web.Response(text='&'.join( k+'='+v for k, v in request.query.items())) app = web.Application() app.router.add_get('/', handler) client = await aiohttp_client(app) resp = await client.get('/', params=MultiDict([('q', 'test'), ('q', 'test2')])) assert 200 == resp.status txt = await resp.text() assert txt == 'q=test&q=test2' resp.close() async def test_HTTP_200_GET_WITH_MIXED_PARAMS(aiohttp_client) -> None: async def handler(request): return web.Response(text='&'.join( k+'='+v for k, v in request.query.items())) app = web.Application() app.router.add_get('/', handler) client = await aiohttp_client(app) resp = await client.get('/?test=true', params={'q': 'test'}) assert 200 == resp.status txt = await resp.text() assert txt == 'test=true&q=test' resp.close() async def test_POST_DATA(aiohttp_client) -> None: async def handler(request): data = await request.post() return web.json_response(dict(data)) app = web.Application() app.router.add_post('/', handler) client = await aiohttp_client(app) resp = await client.post('/', data={'some': 'data'}) assert 200 == resp.status content = await resp.json() assert content == {'some': 'data'} resp.close() async def test_POST_DATA_with_explicit_formdata(aiohttp_client) 
-> None: async def handler(request): data = await request.post() return web.json_response(dict(data)) app = web.Application() app.router.add_post('/', handler) client = await aiohttp_client(app) form = aiohttp.FormData() form.add_field('name', 'text') resp = await client.post('/', data=form) assert 200 == resp.status content = await resp.json() assert content == {'name': 'text'} resp.close() async def test_POST_DATA_with_charset(aiohttp_client) -> None: async def handler(request): mp = await request.multipart() part = await mp.next() text = await part.text() return web.Response(text=text) app = web.Application() app.router.add_post('/', handler) client = await aiohttp_client(app) form = aiohttp.FormData() form.add_field('name', 'текст', content_type='text/plain; charset=koi8-r') resp = await client.post('/', data=form) assert 200 == resp.status content = await resp.text() assert content == 'текст' resp.close() async def test_POST_DATA_formdats_with_charset(aiohttp_client) -> None: async def handler(request): mp = await request.post() assert 'name' in mp return web.Response(text=mp['name']) app = web.Application() app.router.add_post('/', handler) client = await aiohttp_client(app) form = aiohttp.FormData(charset='koi8-r') form.add_field('name', 'текст') resp = await client.post('/', data=form) assert 200 == resp.status content = await resp.text() assert content == 'текст' resp.close() async def test_POST_DATA_with_charset_post(aiohttp_client) -> None: async def handler(request): data = await request.post() return web.Response(text=data['name']) app = web.Application() app.router.add_post('/', handler) client = await aiohttp_client(app) form = aiohttp.FormData() form.add_field('name', 'текст', content_type='text/plain; charset=koi8-r') resp = await client.post('/', data=form) assert 200 == resp.status content = await resp.text() assert content == 'текст' resp.close() async def test_POST_DATA_with_context_transfer_encoding( aiohttp_client) -> None: async def handler(request): data = await request.post() assert data['name'] == 'text' return web.Response(text=data['name']) app = web.Application() app.router.add_post('/', handler) client = await aiohttp_client(app) form = aiohttp.FormData() form.add_field('name', 'text', content_transfer_encoding='base64') resp = await client.post('/', data=form) assert 200 == resp.status content = await resp.text() assert content == 'text' resp.close() async def test_POST_DATA_with_content_type_context_transfer_encoding( aiohttp_client): async def handler(request): data = await request.post() assert data['name'] == 'text' return web.Response(body=data['name']) app = web.Application() app.router.add_post('/', handler) client = await aiohttp_client(app) form = aiohttp.FormData() form.add_field('name', 'text', content_type='text/plain', content_transfer_encoding='base64') resp = await client.post('/', data=form) assert 200 == resp.status content = await resp.text() assert content == 'text' resp.close() async def test_POST_MultiDict(aiohttp_client) -> None: async def handler(request): data = await request.post() assert data == MultiDict([('q', 'test1'), ('q', 'test2')]) return web.Response() app = web.Application() app.router.add_post('/', handler) client = await aiohttp_client(app) resp = await client.post('/', data=MultiDict( [('q', 'test1'), ('q', 'test2')])) assert 200 == resp.status resp.close() async def test_POST_DATA_DEFLATE(aiohttp_client) -> None: async def handler(request): data = await request.post() return web.json_response(dict(data)) app = 
web.Application() app.router.add_post('/', handler) client = await aiohttp_client(app) resp = await client.post('/', data={'some': 'data'}, compress=True) assert 200 == resp.status content = await resp.json() assert content == {'some': 'data'} resp.close() async def test_POST_FILES(aiohttp_client, fname) -> None: async def handler(request): data = await request.post() assert data['some'].filename == fname.name with fname.open('rb') as f: content1 = f.read() content2 = data['some'].file.read() assert content1 == content2 assert data['test'].file.read() == b'data' return web.Response() app = web.Application() app.router.add_post('/', handler) client = await aiohttp_client(app) with fname.open() as f: resp = await client.post( '/', data={'some': f, 'test': b'data'}, chunked=True) assert 200 == resp.status resp.close() async def test_POST_FILES_DEFLATE(aiohttp_client, fname) -> None: async def handler(request): data = await request.post() assert data['some'].filename == fname.name with fname.open('rb') as f: content1 = f.read() content2 = data['some'].file.read() assert content1 == content2 return web.Response() app = web.Application() app.router.add_post('/', handler) client = await aiohttp_client(app) with fname.open() as f: resp = await client.post( '/', data={'some': f}, chunked=True, compress='deflate' ) assert 200 == resp.status resp.close() async def test_POST_bytes(aiohttp_client) -> None: body = b'0' * 12345 async def handler(request): data = await request.read() assert body == data return web.Response() app = web.Application() app.router.add_post('/', handler) client = await aiohttp_client(app) resp = await client.post('/', data=body) assert 200 == resp.status resp.close() async def test_POST_bytes_too_large(aiohttp_client) -> None: body = b'0' * (2 ** 20 + 1) async def handler(request): data = await request.content.read() assert body == data return web.Response() app = web.Application() app.router.add_post('/', handler) client = await aiohttp_client(app) with pytest.warns(ResourceWarning): resp = await client.post('/', data=body) assert 200 == resp.status resp.close() async def test_POST_FILES_STR(aiohttp_client, fname) -> None: async def handler(request): data = await request.post() with fname.open() as f: content1 = f.read() content2 = data['some'] assert content1 == content2 return web.Response() app = web.Application() app.router.add_post('/', handler) client = await aiohttp_client(app) with fname.open() as f: resp = await client.post('/', data={'some': f.read()}) assert 200 == resp.status resp.close() async def test_POST_FILES_STR_SIMPLE(aiohttp_client, fname) -> None: async def handler(request): data = await request.read() with fname.open('rb') as f: content = f.read() assert content == data return web.Response() app = web.Application() app.router.add_post('/', handler) client = await aiohttp_client(app) with fname.open() as f: resp = await client.post('/', data=f.read()) assert 200 == resp.status resp.close() async def test_POST_FILES_LIST(aiohttp_client, fname) -> None: async def handler(request): data = await request.post() assert fname.name == data['some'].filename with fname.open('rb') as f: content = f.read() assert content == data['some'].file.read() return web.Response() app = web.Application() app.router.add_post('/', handler) client = await aiohttp_client(app) with fname.open() as f: resp = await client.post('/', data=[('some', f)]) assert 200 == resp.status resp.close() async def test_POST_FILES_CT(aiohttp_client, fname) -> None: async def handler(request): data = 
await request.post() assert fname.name == data['some'].filename assert 'text/plain' == data['some'].content_type with fname.open('rb') as f: content = f.read() assert content == data['some'].file.read() return web.Response() app = web.Application() app.router.add_post('/', handler) client = await aiohttp_client(app) with fname.open() as f: form = aiohttp.FormData() form.add_field('some', f, content_type='text/plain') resp = await client.post('/', data=form) assert 200 == resp.status resp.close() async def test_POST_FILES_SINGLE(aiohttp_client, fname) -> None: async def handler(request): data = await request.text() with fname.open('r') as f: content = f.read() assert content == data # if system cannot determine 'application/pgp-keys' MIME type # then use 'application/octet-stream' default assert request.content_type in ['application/pgp-keys', 'text/plain', 'application/octet-stream'] assert 'content-disposition' not in request.headers return web.Response() app = web.Application() app.router.add_post('/', handler) client = await aiohttp_client(app) with fname.open() as f: resp = await client.post('/', data=f) assert 200 == resp.status resp.close() async def test_POST_FILES_SINGLE_content_disposition( aiohttp_client, fname) -> None: async def handler(request): data = await request.text() with fname.open('r') as f: content = f.read() assert content == data # if system cannot determine 'application/pgp-keys' MIME type # then use 'application/octet-stream' default assert request.content_type in ['application/pgp-keys', 'text/plain', 'application/octet-stream'] assert request.headers['content-disposition'] == ( "inline; filename=\"conftest.py\"; filename*=utf-8''conftest.py") return web.Response() app = web.Application() app.router.add_post('/', handler) client = await aiohttp_client(app) with fname.open() as f: resp = await client.post( '/', data=aiohttp.get_payload(f, disposition='inline')) assert 200 == resp.status resp.close() async def test_POST_FILES_SINGLE_BINARY(aiohttp_client, fname) -> None: async def handler(request): data = await request.read() with fname.open('rb') as f: content = f.read() assert content == data # if system cannot determine 'application/pgp-keys' MIME type # then use 'application/octet-stream' default assert request.content_type in ['application/pgp-keys', 'text/plain', 'text/x-python', 'application/octet-stream'] return web.Response() app = web.Application() app.router.add_post('/', handler) client = await aiohttp_client(app) with fname.open('rb') as f: resp = await client.post('/', data=f) assert 200 == resp.status resp.close() async def test_POST_FILES_IO(aiohttp_client) -> None: async def handler(request): data = await request.post() assert b'data' == data['unknown'].file.read() assert data['unknown'].content_type == 'application/octet-stream' assert data['unknown'].filename == 'unknown' return web.Response() app = web.Application() app.router.add_post('/', handler) client = await aiohttp_client(app) data = io.BytesIO(b'data') resp = await client.post('/', data=[data]) assert 200 == resp.status resp.close() async def test_POST_FILES_IO_WITH_PARAMS(aiohttp_client) -> None: async def handler(request): data = await request.post() assert data['test'] == 'true' assert data['unknown'].content_type == 'application/octet-stream' assert data['unknown'].filename == 'unknown' assert data['unknown'].file.read() == b'data' assert data.getall('q') == ['t1', 't2'] return web.Response() app = web.Application() app.router.add_post('/', handler) client = await aiohttp_client(app) 
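# --- Editor's note: hedged illustration, not part of the original test
# file. The file-upload tests above accept several possible content types
# because the guessed MIME type of the uploaded file depends on the local
# mimetypes database, with 'application/octet-stream' as the generic
# fallback. A minimal standalone sketch of that lookup (helper name is
# illustrative only):
import mimetypes

def guess_upload_content_type(filename):
    # guess_type returns (type, encoding); fall back to the generic binary
    # type when the platform database has no mapping for the extension.
    ctype, _encoding = mimetypes.guess_type(filename)
    return ctype or 'application/octet-stream'

# e.g. guess_upload_content_type('conftest.py') commonly yields
# 'text/x-python', while an unknown extension yields the fallback above,
# which is why the assertions list more than one acceptable value.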
data = io.BytesIO(b'data') resp = await client.post( '/', data=(('test', 'true'), MultiDict([('q', 't1'), ('q', 't2')]), data) ) assert 200 == resp.status resp.close() async def test_POST_FILES_WITH_DATA(aiohttp_client, fname) -> None: async def handler(request): data = await request.post() assert data['test'] == 'true' assert data['some'].content_type in ['application/pgp-keys', 'text/plain; charset=utf-8', 'application/octet-stream'] assert data['some'].filename == fname.name with fname.open('rb') as f: assert data['some'].file.read() == f.read() return web.Response() app = web.Application() app.router.add_post('/', handler) client = await aiohttp_client(app) with fname.open() as f: resp = await client.post('/', data={'test': 'true', 'some': f}) assert 200 == resp.status resp.close() async def test_POST_STREAM_DATA(aiohttp_client, fname) -> None: async def handler(request): assert request.content_type == 'application/octet-stream' content = await request.read() with fname.open('rb') as f: expected = f.read() assert request.content_length == len(expected) assert content == expected return web.Response() app = web.Application() app.router.add_post('/', handler) client = await aiohttp_client(app) with fname.open('rb') as f: data_size = len(f.read()) with pytest.warns(DeprecationWarning): @aiohttp.streamer async def stream(writer, fname): with fname.open('rb') as f: data = f.read(100) while data: await writer.write(data) data = f.read(100) resp = await client.post( '/', data=stream(fname), headers={'Content-Length': str(data_size)}) assert 200 == resp.status resp.close() async def test_POST_STREAM_DATA_no_params(aiohttp_client, fname) -> None: async def handler(request): assert request.content_type == 'application/octet-stream' content = await request.read() with fname.open('rb') as f: expected = f.read() assert request.content_length == len(expected) assert content == expected return web.Response() app = web.Application() app.router.add_post('/', handler) client = await aiohttp_client(app) with fname.open('rb') as f: data_size = len(f.read()) with pytest.warns(DeprecationWarning): @aiohttp.streamer async def stream(writer): with fname.open('rb') as f: data = f.read(100) while data: await writer.write(data) data = f.read(100) resp = await client.post( '/', data=stream, headers={'Content-Length': str(data_size)}) assert 200 == resp.status resp.close() async def test_json(aiohttp_client) -> None: async def handler(request): assert request.content_type == 'application/json' data = await request.json() return web.Response(body=aiohttp.JsonPayload(data)) app = web.Application() app.router.add_post('/', handler) client = await aiohttp_client(app) resp = await client.post('/', json={'some': 'data'}) assert 200 == resp.status content = await resp.json() assert content == {'some': 'data'} resp.close() with pytest.raises(ValueError): await client.post('/', data="some data", json={'some': 'data'}) async def test_json_custom(aiohttp_client) -> None: async def handler(request): assert request.content_type == 'application/json' data = await request.json() return web.Response(body=aiohttp.JsonPayload(data)) used = False def dumps(obj): nonlocal used used = True return json.dumps(obj) app = web.Application() app.router.add_post('/', handler) client = await aiohttp_client(app, json_serialize=dumps) resp = await client.post('/', json={'some': 'data'}) assert 200 == resp.status assert used content = await resp.json() assert content == {'some': 'data'} resp.close() with pytest.raises(ValueError): await 
client.post('/', data="some data", json={'some': 'data'}) async def test_expect_continue(aiohttp_client) -> None: expect_called = False async def handler(request): data = await request.post() assert data == {'some': 'data'} return web.Response() async def expect_handler(request): nonlocal expect_called expect = request.headers.get(hdrs.EXPECT) if expect.lower() == "100-continue": request.transport.write(b"HTTP/1.1 100 Continue\r\n\r\n") expect_called = True app = web.Application() app.router.add_post('/', handler, expect_handler=expect_handler) client = await aiohttp_client(app) resp = await client.post('/', data={'some': 'data'}, expect100=True) assert 200 == resp.status resp.close() assert expect_called async def test_encoding_deflate(aiohttp_client) -> None: async def handler(request): resp = web.Response(text='text') resp.enable_chunked_encoding() resp.enable_compression(web.ContentCoding.deflate) return resp app = web.Application() app.router.add_get('/', handler) client = await aiohttp_client(app) resp = await client.get('/') assert 200 == resp.status txt = await resp.text() assert txt == 'text' resp.close() async def test_encoding_deflate_nochunk(aiohttp_client) -> None: async def handler(request): resp = web.Response(text='text') resp.enable_compression(web.ContentCoding.deflate) return resp app = web.Application() app.router.add_get('/', handler) client = await aiohttp_client(app) resp = await client.get('/') assert 200 == resp.status txt = await resp.text() assert txt == 'text' resp.close() async def test_encoding_gzip(aiohttp_client) -> None: async def handler(request): resp = web.Response(text='text') resp.enable_chunked_encoding() resp.enable_compression(web.ContentCoding.gzip) return resp app = web.Application() app.router.add_get('/', handler) client = await aiohttp_client(app) resp = await client.get('/') assert 200 == resp.status txt = await resp.text() assert txt == 'text' resp.close() async def test_encoding_gzip_write_by_chunks(aiohttp_client) -> None: async def handler(request): resp = web.StreamResponse() resp.enable_compression(web.ContentCoding.gzip) await resp.prepare(request) await resp.write(b'0') await resp.write(b'0') return resp app = web.Application() app.router.add_get('/', handler) client = await aiohttp_client(app) resp = await client.get('/') assert 200 == resp.status txt = await resp.text() assert txt == '00' resp.close() async def test_encoding_gzip_nochunk(aiohttp_client) -> None: async def handler(request): resp = web.Response(text='text') resp.enable_compression(web.ContentCoding.gzip) return resp app = web.Application() app.router.add_get('/', handler) client = await aiohttp_client(app) resp = await client.get('/') assert 200 == resp.status txt = await resp.text() assert txt == 'text' resp.close() async def test_bad_payload_compression(aiohttp_client) -> None: async def handler(request): resp = web.Response(text='text') resp.headers['Content-Encoding'] = 'gzip' return resp app = web.Application() app.router.add_get('/', handler) client = await aiohttp_client(app) resp = await client.get('/') assert 200 == resp.status with pytest.raises(aiohttp.ClientPayloadError): await resp.read() resp.close() async def test_bad_payload_chunked_encoding(aiohttp_client) -> None: async def handler(request): resp = web.StreamResponse() resp.force_close() resp._length_check = False resp.headers['Transfer-Encoding'] = 'chunked' writer = await resp.prepare(request) await writer.write(b'9\r\n\r\n') await writer.write_eof() return resp app = web.Application() 
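# --- Editor's note: hedged sketch, not part of the original test file.
# The surrounding "bad payload" tests make the server advertise an
# encoding or framing that the body does not actually satisfy, so the
# client fails while decoding and raises ClientPayloadError. The
# decompression failure itself can be reproduced standalone with zlib
# (function name is illustrative only):
import zlib

def gunzip(raw):
    # wbits = 16 + MAX_WBITS selects gzip framing, the same framing an
    # HTTP client expects for Content-Encoding: gzip.
    return zlib.decompress(raw, 16 + zlib.MAX_WBITS)

# gunzip(b'text') raises zlib.error ("incorrect header check") because the
# plain bytes are not a gzip stream; the client surfaces an analogous
# failure as aiohttp.ClientPayloadError in the tests here.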
app.router.add_get('/', handler) client = await aiohttp_client(app) resp = await client.get('/') assert 200 == resp.status with pytest.raises(aiohttp.ClientPayloadError): await resp.read() resp.close() async def test_bad_payload_content_length(aiohttp_client) -> None: async def handler(request): resp = web.Response(text='text') resp.headers['Content-Length'] = '10000' resp.force_close() return resp app = web.Application() app.router.add_get('/', handler) client = await aiohttp_client(app) resp = await client.get('/') assert 200 == resp.status with pytest.raises(aiohttp.ClientPayloadError): await resp.read() resp.close() async def test_payload_content_length_by_chunks(aiohttp_client) -> None: async def handler(request): resp = web.StreamResponse(headers={'content-length': '3'}) await resp.prepare(request) await resp.write(b'answer') await resp.write(b'two') request.transport.close() return resp app = web.Application() app.router.add_get('/', handler) client = await aiohttp_client(app) resp = await client.get('/') data = await resp.read() assert data == b'ans' resp.close() async def test_chunked(aiohttp_client) -> None: async def handler(request): resp = web.Response(text='text') resp.enable_chunked_encoding() return resp app = web.Application() app.router.add_get('/', handler) client = await aiohttp_client(app) resp = await client.get('/') assert 200 == resp.status assert resp.headers['Transfer-Encoding'] == 'chunked' txt = await resp.text() assert txt == 'text' resp.close() async def test_shortcuts(aiohttp_client) -> None: async def handler(request): return web.Response(text=request.method) app = web.Application() for meth in ('get', 'post', 'put', 'delete', 'head', 'patch', 'options'): app.router.add_route(meth.upper(), '/', handler) client = await aiohttp_client(app) for meth in ('get', 'post', 'put', 'delete', 'head', 'patch', 'options'): coro = getattr(client.session, meth) resp = await coro(client.make_url('/')) assert resp.status == 200 assert len(resp.history) == 0 content1 = await resp.read() content2 = await resp.read() assert content1 == content2 content = await resp.text() if meth == 'head': assert b'' == content1 else: assert meth.upper() == content async def test_cookies(aiohttp_client) -> None: async def handler(request): assert request.cookies.keys() == {'test1', 'test3'} assert request.cookies['test1'] == '123' assert request.cookies['test3'] == '456' return web.Response() c = http.cookies.Morsel() c.set('test3', '456', '456') app = web.Application() app.router.add_get('/', handler) client = await aiohttp_client( app, cookies={'test1': '123', 'test2': c}) resp = await client.get('/') assert 200 == resp.status resp.close() async def test_cookies_per_request(aiohttp_client) -> None: async def handler(request): assert request.cookies.keys() == {'test1', 'test3', 'test4', 'test6'} assert request.cookies['test1'] == '123' assert request.cookies['test3'] == '456' assert request.cookies['test4'] == '789' assert request.cookies['test6'] == 'abc' return web.Response() c = http.cookies.Morsel() c.set('test3', '456', '456') app = web.Application() app.router.add_get('/', handler) client = await aiohttp_client( app, cookies={'test1': '123', 'test2': c}) rc = http.cookies.Morsel() rc.set('test6', 'abc', 'abc') resp = await client.get( '/', cookies={'test4': '789', 'test5': rc}) assert 200 == resp.status resp.close() async def test_cookies_redirect(aiohttp_client) -> None: async def redirect1(request): ret = web.Response(status=301, headers={'Location': '/redirect2'}) ret.set_cookie('c', 
'1') return ret async def redirect2(request): ret = web.Response(status=301, headers={'Location': '/'}) ret.set_cookie('c', '2') return ret async def handler(request): assert request.cookies.keys() == {'c'} assert request.cookies['c'] == '2' return web.Response() app = web.Application() app.router.add_get('/redirect1', redirect1) app.router.add_get('/redirect2', redirect2) app.router.add_get('/', handler) client = await aiohttp_client(app) resp = await client.get('/redirect1') assert 200 == resp.status resp.close() async def test_cookies_on_empty_session_jar(aiohttp_client) -> None: async def handler(request): assert 'custom-cookie' in request.cookies assert request.cookies['custom-cookie'] == 'abc' return web.Response() app = web.Application() app.router.add_get('/', handler) client = await aiohttp_client( app, cookies=None) resp = await client.get('/', cookies={'custom-cookie': 'abc'}) assert 200 == resp.status resp.close() async def test_morsel_with_attributes(aiohttp_client) -> None: # A comment from original test: # # No cookie attribute should pass here # they are only used as filters # whether to send particular cookie or not. # E.g. if cookie expires it just becomes thrown away. # Server who sent the cookie with some attributes # already knows them, no need to send this back again and again async def handler(request): assert request.cookies.keys() == {'test3'} assert request.cookies['test3'] == '456' return web.Response() c = http.cookies.Morsel() c.set('test3', '456', '456') c['httponly'] = True c['secure'] = True c['max-age'] = 1000 app = web.Application() app.router.add_get('/', handler) client = await aiohttp_client(app, cookies={'test2': c}) resp = await client.get('/') assert 200 == resp.status resp.close() async def test_set_cookies(aiohttp_client) -> None: async def handler(request): ret = web.Response() ret.set_cookie('c1', 'cookie1') ret.set_cookie('c2', 'cookie2') ret.headers.add('Set-Cookie', 'ISAWPLB{A7F52349-3531-4DA9-8776-F74BC6F4F1BB}=' '{925EC0B8-CB17-4BEB-8A35-1033813B0523}; ' 'HttpOnly; Path=/') return ret app = web.Application() app.router.add_get('/', handler) client = await aiohttp_client(app) with mock.patch('aiohttp.client_reqrep.client_logger') as m_log: resp = await client.get('/') assert 200 == resp.status cookie_names = {c.key for c in client.session.cookie_jar} assert cookie_names == {'c1', 'c2'} resp.close() m_log.warning.assert_called_with('Can not load response cookies: %s', mock.ANY) async def test_set_cookies_expired(aiohttp_client) -> None: async def handler(request): ret = web.Response() ret.set_cookie('c1', 'cookie1') ret.set_cookie('c2', 'cookie2') ret.headers.add('Set-Cookie', 'c3=cookie3; ' 'HttpOnly; Path=/' " Expires=Tue, 1 Jan 1980 12:00:00 GMT; ") return ret app = web.Application() app.router.add_get('/', handler) client = await aiohttp_client(app) resp = await client.get('/') assert 200 == resp.status cookie_names = {c.key for c in client.session.cookie_jar} assert cookie_names == {'c1', 'c2'} resp.close() async def test_set_cookies_max_age(aiohttp_client) -> None: async def handler(request): ret = web.Response() ret.set_cookie('c1', 'cookie1') ret.set_cookie('c2', 'cookie2') ret.headers.add('Set-Cookie', 'c3=cookie3; ' 'HttpOnly; Path=/' " Max-Age=1; ") return ret app = web.Application() app.router.add_get('/', handler) client = await aiohttp_client(app) resp = await client.get('/') assert 200 == resp.status cookie_names = {c.key for c in client.session.cookie_jar} assert cookie_names == {'c1', 'c2', 'c3'} await asyncio.sleep(2) 
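# --- Editor's note: hedged illustration, not part of the original test
# file; the class name is invented for the sketch. test_set_cookies_max_age
# relies on the session cookie jar evicting a cookie once its Max-Age
# lifetime has elapsed. The bookkeeping it depends on, reduced to a
# standalone toy jar:
import time

class TinyCookieJar:
    def __init__(self):
        self._expiry = {}  # cookie name -> absolute monotonic deadline

    def set(self, name, max_age):
        self._expiry[name] = time.monotonic() + max_age

    def live_names(self):
        now = time.monotonic()
        return {name for name, deadline in self._expiry.items()
                if deadline > now}

# jar = TinyCookieJar(); jar.set('c3', 1); time.sleep(2)
# After the sleep, 'c3' is no longer in jar.live_names(), which is the
# behaviour the assertions just above and below this point verify.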
cookie_names = {c.key for c in client.session.cookie_jar} assert cookie_names == {'c1', 'c2'} resp.close() async def test_set_cookies_max_age_overflow(aiohttp_client) -> None: async def handler(request): ret = web.Response() ret.headers.add('Set-Cookie', 'overflow=overflow; ' 'HttpOnly; Path=/' " Max-Age=" + str(overflow) + "; ") return ret overflow = int(datetime.datetime.max.replace( tzinfo=datetime.timezone.utc).timestamp()) empty = None try: empty = (datetime.datetime.now(datetime.timezone.utc) + datetime.timedelta(seconds=overflow)) except OverflowError as ex: assert isinstance(ex, OverflowError) assert not isinstance(empty, datetime.datetime) app = web.Application() app.router.add_get('/', handler) client = await aiohttp_client(app) resp = await client.get('/') assert 200 == resp.status for cookie in client.session.cookie_jar: if cookie.key == 'overflow': assert int(cookie['max-age']) == int(overflow) resp.close() async def test_request_conn_error() -> None: client = aiohttp.ClientSession() with pytest.raises(aiohttp.ClientConnectionError): await client.get('http://0.0.0.0:1') await client.close() @pytest.mark.xfail async def test_broken_connection(aiohttp_client) -> None: async def handler(request): request.transport.close() return web.Response(text='answer'*1000) app = web.Application() app.router.add_get('/', handler) client = await aiohttp_client(app) with pytest.raises(aiohttp.ClientResponseError): await client.get('/') async def test_broken_connection_2(aiohttp_client) -> None: async def handler(request): resp = web.StreamResponse(headers={'content-length': '1000'}) await resp.prepare(request) await resp.write(b'answer') request.transport.close() return resp app = web.Application() app.router.add_get('/', handler) client = await aiohttp_client(app) resp = await client.get('/') with pytest.raises(aiohttp.ClientPayloadError): await resp.read() resp.close() async def test_custom_headers(aiohttp_client) -> None: async def handler(request): assert request.headers["x-api-key"] == "foo" return web.Response() app = web.Application() app.router.add_post('/', handler) client = await aiohttp_client(app) resp = await client.post('/', headers={ "Content-Type": "application/json", "x-api-key": "foo"}) assert resp.status == 200 async def test_redirect_to_absolute_url(aiohttp_client) -> None: async def handler(request): return web.Response(text=request.method) async def redirect(request): raise web.HTTPFound(location=client.make_url('/')) app = web.Application() app.router.add_get('/', handler) app.router.add_get('/redirect', redirect) client = await aiohttp_client(app) resp = await client.get('/redirect') assert 200 == resp.status resp.close() async def test_redirect_without_location_header(aiohttp_client) -> None: body = b'redirect' async def handler_redirect(request): return web.Response(status=301, body=body) app = web.Application() app.router.add_route('GET', '/redirect', handler_redirect) client = await aiohttp_client(app) resp = await client.get('/redirect') data = await resp.read() assert data == body async def test_chunked_deprecated(aiohttp_client) -> None: async def handler_redirect(request): return web.Response(status=301) app = web.Application() app.router.add_route('GET', '/redirect', handler_redirect) client = await aiohttp_client(app) with pytest.warns(DeprecationWarning): await client.post('/', chunked=1024) async def test_raise_for_status(aiohttp_client) -> None: async def handler_redirect(request): raise web.HTTPBadRequest() app = web.Application() 
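# --- Editor's note: hedged illustration, not part of the original test
# file; the helper name is invented. test_set_cookies_max_age_overflow
# above picks a Max-Age so large that converting it into an absolute
# expiry datetime overflows, so a cookie jar has to guard that arithmetic.
# Standalone version of the guarded conversion the test itself performs:
import datetime

def expiry_from_max_age(max_age_seconds):
    now = datetime.datetime.now(datetime.timezone.utc)
    try:
        return now + datetime.timedelta(seconds=max_age_seconds)
    except OverflowError:
        # Too far in the future to represent as a datetime.
        return None

# With max_age_seconds taken from datetime.datetime.max's timestamp, the
# addition overflows and None comes back, mirroring the test's check that
# no expiry datetime could be computed for the 'overflow' cookie.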
app.router.add_route('GET', '/', handler_redirect) client = await aiohttp_client(app, raise_for_status=True) with pytest.raises(aiohttp.ClientResponseError): await client.get('/') async def test_raise_for_status_per_request(aiohttp_client) -> None: async def handler_redirect(request): raise web.HTTPBadRequest() app = web.Application() app.router.add_route('GET', '/', handler_redirect) client = await aiohttp_client(app) with pytest.raises(aiohttp.ClientResponseError): await client.get('/', raise_for_status=True) async def test_raise_for_status_disable_per_request(aiohttp_client) -> None: async def handler_redirect(request): raise web.HTTPBadRequest() app = web.Application() app.router.add_route('GET', '/', handler_redirect) client = await aiohttp_client(app, raise_for_status=True) resp = await client.get('/', raise_for_status=False) assert 400 == resp.status resp.close() async def test_request_raise_for_status_default(aiohttp_server) -> None: async def handler(request): raise web.HTTPBadRequest() app = web.Application() app.router.add_get('/', handler) server = await aiohttp_server(app) async with aiohttp.request('GET', server.make_url('/')) as resp: assert resp.status == 400 async def test_request_raise_for_status_disabled(aiohttp_server) -> None: async def handler(request): raise web.HTTPBadRequest() app = web.Application() app.router.add_get('/', handler) server = await aiohttp_server(app) url = server.make_url('/') async with aiohttp.request('GET', url, raise_for_status=False) as resp: assert resp.status == 400 async def test_request_raise_for_status_enabled(aiohttp_server) -> None: async def handler(request): raise web.HTTPBadRequest() app = web.Application() app.router.add_get('/', handler) server = await aiohttp_server(app) url = server.make_url('/') with pytest.raises(aiohttp.ClientResponseError): async with aiohttp.request('GET', url, raise_for_status=True): assert False, "never executed" # pragma: no cover async def test_invalid_idna() -> None: session = aiohttp.ClientSession() try: with pytest.raises(aiohttp.InvalidURL): await session.get('http://\u2061owhefopw.com') finally: await session.close() async def test_creds_in_auth_and_url() -> None: session = aiohttp.ClientSession() try: with pytest.raises(ValueError): await session.get('http://user:pass@example.com', auth=aiohttp.BasicAuth('user2', 'pass2')) finally: await session.close() async def test_drop_auth_on_redirect_to_other_host(aiohttp_server) -> None: async def srv1(request): assert request.host == 'host1.com' assert request.headers['Authorization'] == 'Basic dXNlcjpwYXNz' raise web.HTTPFound('http://host2.com/path2') async def srv2(request): assert request.host == 'host2.com' assert 'Authorization' not in request.headers return web.Response() app = web.Application() app.router.add_route('GET', '/path1', srv1) app.router.add_route('GET', '/path2', srv2) server = await aiohttp_server(app) class FakeResolver(AbstractResolver): async def resolve(self, host, port=0, family=socket.AF_INET): return [{'hostname': host, 'host': server.host, 'port': server.port, 'family': socket.AF_INET, 'proto': 0, 'flags': socket.AI_NUMERICHOST}] async def close(self): pass connector = aiohttp.TCPConnector(resolver=FakeResolver()) async with aiohttp.ClientSession(connector=connector) as client: resp = await client.get( 'http://host1.com/path1', auth=aiohttp.BasicAuth('user', 'pass') ) assert resp.status == 200 resp = await client.get( 'http://host1.com/path1', headers={'Authorization': 'Basic dXNlcjpwYXNz'} ) assert resp.status == 200 async def 
test_async_with_session() -> None: with pytest.warns(None) as cm: async with aiohttp.ClientSession() as session: pass assert len(cm.list) == 0 assert session.closed async def test_session_close_awaitable() -> None: session = aiohttp.ClientSession() with pytest.warns(None) as cm: await session.close() assert len(cm.list) == 0 assert session.closed async def test_close_run_until_complete_not_deprecated() -> None: session = aiohttp.ClientSession() with pytest.warns(None) as cm: await session.close() assert len(cm.list) == 0 async def test_close_resp_on_error_async_with_session(aiohttp_server) -> None: async def handler(request): resp = web.StreamResponse(headers={'content-length': '100'}) await resp.prepare(request) await asyncio.sleep(0.1) return resp app = web.Application() app.router.add_get('/', handler) server = await aiohttp_server(app) async with aiohttp.ClientSession() as session: with pytest.raises(RuntimeError): async with session.get(server.make_url('/')) as resp: resp.content.set_exception(RuntimeError()) await resp.read() assert len(session._connector._conns) == 0 async def test_release_resp_on_normal_exit_from_cm(aiohttp_server) -> None: async def handler(request): return web.Response() app = web.Application() app.router.add_get('/', handler) server = await aiohttp_server(app) async with aiohttp.ClientSession() as session: async with session.get(server.make_url('/')) as resp: await resp.read() assert len(session._connector._conns) == 1 async def test_non_close_detached_session_on_error_cm(aiohttp_server) -> None: async def handler(request): resp = web.StreamResponse(headers={'content-length': '100'}) await resp.prepare(request) await asyncio.sleep(0.1) return resp app = web.Application() app.router.add_get('/', handler) server = await aiohttp_server(app) session = aiohttp.ClientSession() cm = session.get(server.make_url('/')) assert not session.closed with pytest.raises(RuntimeError): async with cm as resp: resp.content.set_exception(RuntimeError()) await resp.read() assert not session.closed async def test_close_detached_session_on_non_existing_addr() -> None: class FakeResolver(AbstractResolver): async def resolve(host, port=0, family=socket.AF_INET): return {} async def close(self): pass connector = aiohttp.TCPConnector(resolver=FakeResolver()) session = aiohttp.ClientSession(connector=connector) async with session: cm = session.get('http://non-existing.example.com') assert not session.closed with pytest.raises(Exception): await cm assert session.closed async def test_aiohttp_request_context_manager(aiohttp_server) -> None: async def handler(request): return web.Response() app = web.Application() app.router.add_get('/', handler) server = await aiohttp_server(app) async with aiohttp.request('GET', server.make_url('/')) as resp: await resp.read() assert resp.status == 200 async def test_aiohttp_request_ctx_manager_close_sess_on_error( ssl_ctx, aiohttp_server) -> None: async def handler(request): return web.Response() app = web.Application() app.router.add_get('/', handler) server = await aiohttp_server(app, ssl=ssl_ctx) cm = aiohttp.request('GET', server.make_url('/')) with pytest.raises(aiohttp.ClientConnectionError): async with cm: pass assert cm._session.closed async def test_aiohttp_request_ctx_manager_not_found() -> None: with pytest.raises(aiohttp.ClientConnectionError): async with aiohttp.request('GET', 'http://wrong-dns-name.com'): assert False, "never executed" # pragma: no cover async def test_aiohttp_request_coroutine(aiohttp_server) -> None: async def 
handler(request): return web.Response() app = web.Application() app.router.add_get('/', handler) server = await aiohttp_server(app) with pytest.raises(TypeError): await aiohttp.request('GET', server.make_url('/')) async def test_yield_from_in_session_request(aiohttp_client) -> None: # a test for backward compatibility with yield from syntax async def handler(request): return web.Response() app = web.Application() app.router.add_get('/', handler) client = await aiohttp_client(app) resp = await client.get('/') assert resp.status == 200 async def test_close_context_manager(aiohttp_client) -> None: # a test for backward compatibility with yield from syntax async def handler(request): return web.Response() app = web.Application() app.router.add_get('/', handler) client = await aiohttp_client(app) ctx = client.get('/') ctx.close() assert not ctx._coro.cr_running async def test_session_auth(aiohttp_client) -> None: async def handler(request): return web.json_response({'headers': dict(request.headers)}) app = web.Application() app.router.add_get('/', handler) client = await aiohttp_client(app, auth=aiohttp.BasicAuth("login", "pass")) r = await client.get('/') assert r.status == 200 content = await r.json() assert content['headers']["Authorization"] == "Basic bG9naW46cGFzcw==" async def test_session_auth_override(aiohttp_client) -> None: async def handler(request): return web.json_response({'headers': dict(request.headers)}) app = web.Application() app.router.add_get('/', handler) client = await aiohttp_client(app, auth=aiohttp.BasicAuth("login", "pass")) r = await client.get('/', auth=aiohttp.BasicAuth("other_login", "pass")) assert r.status == 200 content = await r.json() val = content['headers']["Authorization"] assert val == "Basic b3RoZXJfbG9naW46cGFzcw==" async def test_session_auth_header_conflict(aiohttp_client) -> None: async def handler(request): return web.Response() app = web.Application() app.router.add_get('/', handler) client = await aiohttp_client(app, auth=aiohttp.BasicAuth("login", "pass")) headers = {'Authorization': "Basic b3RoZXJfbG9naW46cGFzcw=="} with pytest.raises(ValueError): await client.get('/', headers=headers) async def test_session_headers(aiohttp_client) -> None: async def handler(request): return web.json_response({'headers': dict(request.headers)}) app = web.Application() app.router.add_get('/', handler) client = await aiohttp_client(app, headers={"X-Real-IP": "192.168.0.1"}) r = await client.get('/') assert r.status == 200 content = await r.json() assert content['headers']["X-Real-IP"] == "192.168.0.1" async def test_session_headers_merge(aiohttp_client) -> None: async def handler(request): return web.json_response({'headers': dict(request.headers)}) app = web.Application() app.router.add_get('/', handler) client = await aiohttp_client(app, headers=[ ("X-Real-IP", "192.168.0.1"), ("X-Sent-By", "requests")]) r = await client.get('/', headers={"X-Sent-By": "aiohttp"}) assert r.status == 200 content = await r.json() assert content['headers']["X-Real-IP"] == "192.168.0.1" assert content['headers']["X-Sent-By"] == "aiohttp" async def test_multidict_headers(aiohttp_client) -> None: async def handler(request): assert await request.read() == data return web.Response() app = web.Application() app.router.add_post('/', handler) client = await aiohttp_client(app) data = b'sample data' r = await client.post('/', data=data, headers=MultiDict( {'Content-Length': str(len(data))})) assert r.status == 200 async def test_request_conn_closed(aiohttp_client) -> None: async def 
handler(request): request.transport.close() return web.Response() app = web.Application() app.router.add_get('/', handler) client = await aiohttp_client(app) with pytest.raises(aiohttp.ServerDisconnectedError): resp = await client.get('/') await resp.read() async def test_dont_close_explicit_connector(aiohttp_client) -> None: async def handler(request): return web.Response() app = web.Application() app.router.add_get('/', handler) client = await aiohttp_client(app) r = await client.get('/') await r.read() assert 1 == len(client.session.connector._conns) async def test_server_close_keepalive_connection() -> None: loop = asyncio.get_event_loop() class Proto(asyncio.Protocol): def connection_made(self, transport): self.transp = transport self.data = b'' def data_received(self, data): self.data += data if data.endswith(b'\r\n\r\n'): self.transp.write( b'HTTP/1.1 200 OK\r\n' b'CONTENT-LENGTH: 2\r\n' b'CONNECTION: close\r\n' b'\r\n' b'ok') self.transp.close() def connection_lost(self, exc): self.transp = None server = await loop.create_server( Proto, '127.0.0.1', unused_port()) addr = server.sockets[0].getsockname() connector = aiohttp.TCPConnector(limit=1) session = aiohttp.ClientSession(connector=connector) url = 'http://{}:{}/'.format(*addr) for i in range(2): r = await session.request('GET', url) await r.read() assert 0 == len(connector._conns) await session.close() connector.close() server.close() await server.wait_closed() async def test_handle_keepalive_on_closed_connection() -> None: loop = asyncio.get_event_loop() class Proto(asyncio.Protocol): def connection_made(self, transport): self.transp = transport self.data = b'' def data_received(self, data): self.data += data if data.endswith(b'\r\n\r\n'): self.transp.write( b'HTTP/1.1 200 OK\r\n' b'CONTENT-LENGTH: 2\r\n' b'\r\n' b'ok') self.transp.close() def connection_lost(self, exc): self.transp = None server = await loop.create_server( Proto, '127.0.0.1', unused_port()) addr = server.sockets[0].getsockname() connector = aiohttp.TCPConnector(limit=1) session = aiohttp.ClientSession(connector=connector) url = 'http://{}:{}/'.format(*addr) r = await session.request('GET', url) await r.read() assert 1 == len(connector._conns) with pytest.raises(aiohttp.ClientConnectionError): await session.request('GET', url) assert 0 == len(connector._conns) await session.close() connector.close() server.close() await server.wait_closed() async def test_error_in_performing_request(ssl_ctx, aiohttp_client, aiohttp_server): async def handler(request): return web.Response() def exception_handler(loop, context): # skip log messages about destroyed but pending tasks pass loop = asyncio.get_event_loop() loop.set_exception_handler(exception_handler) app = web.Application() app.router.add_route('GET', '/', handler) server = await aiohttp_server(app, ssl=ssl_ctx) conn = aiohttp.TCPConnector(limit=1) client = await aiohttp_client(server, connector=conn) with pytest.raises(aiohttp.ClientConnectionError): await client.get('/') # second try should not hang with pytest.raises(aiohttp.ClientConnectionError): await client.get('/') async def test_await_after_cancelling(aiohttp_client) -> None: loop = asyncio.get_event_loop() async def handler(request): return web.Response() app = web.Application() app.router.add_route('GET', '/', handler) client = await aiohttp_client(app) fut1 = loop.create_future() fut2 = loop.create_future() async def fetch1(): resp = await client.get('/') assert resp.status == 200 fut1.set_result(None) with pytest.raises(asyncio.CancelledError): await 
fut2 resp.release() async def fetch2(): await fut1 resp = await client.get('/') assert resp.status == 200 async def canceller(): await fut1 fut2.cancel() await asyncio.gather(fetch1(), fetch2(), canceller()) async def test_async_payload_generator(aiohttp_client) -> None: async def handler(request): data = await request.read() assert data == b'1234567890' * 100 return web.Response() app = web.Application() app.add_routes([web.post('/', handler)]) client = await aiohttp_client(app) @async_generator async def gen(): for i in range(100): await yield_(b'1234567890') resp = await client.post('/', data=gen()) assert resp.status == 200 async def test_read_from_closed_response(aiohttp_client) -> None: async def handler(request): return web.Response(body=b'data') app = web.Application() app.add_routes([web.get('/', handler)]) client = await aiohttp_client(app) async with client.get('/') as resp: assert resp.status == 200 with pytest.raises(aiohttp.ClientConnectionError): await resp.read() async def test_read_from_closed_response2(aiohttp_client) -> None: async def handler(request): return web.Response(body=b'data') app = web.Application() app.add_routes([web.get('/', handler)]) client = await aiohttp_client(app) async with client.get('/') as resp: assert resp.status == 200 await resp.read() with pytest.raises(aiohttp.ClientConnectionError): await resp.read() async def test_read_from_closed_content(aiohttp_client) -> None: async def handler(request): return web.Response(body=b'data') app = web.Application() app.add_routes([web.get('/', handler)]) client = await aiohttp_client(app) async with client.get('/') as resp: assert resp.status == 200 with pytest.raises(aiohttp.ClientConnectionError): await resp.content.readline() async def test_read_timeout(aiohttp_client) -> None: async def handler(request): await asyncio.sleep(5) return web.Response() app = web.Application() app.add_routes([web.get('/', handler)]) timeout = aiohttp.ClientTimeout(sock_read=0.1) client = await aiohttp_client(app, timeout=timeout) with pytest.raises(aiohttp.ServerTimeoutError): await client.get('/') async def test_read_timeout_on_prepared_response(aiohttp_client) -> None: async def handler(request): resp = aiohttp.web.StreamResponse() await resp.prepare(request) await asyncio.sleep(5) await resp.drain() return resp app = web.Application() app.add_routes([web.get('/', handler)]) timeout = aiohttp.ClientTimeout(sock_read=0.1) client = await aiohttp_client(app, timeout=timeout) with pytest.raises(aiohttp.ServerTimeoutError): async with await client.get('/') as resp: await resp.read() aiohttp-3.6.2/tests/test_client_proto.py0000644000175100001650000000763513547410117020754 0ustar vstsdocker00000000000000from unittest import mock from yarl import URL from aiohttp import http from aiohttp.client_exceptions import ClientOSError, ServerDisconnectedError from aiohttp.client_proto import ResponseHandler from aiohttp.client_reqrep import ClientResponse from aiohttp.helpers import TimerNoop async def test_oserror(loop) -> None: proto = ResponseHandler(loop=loop) transport = mock.Mock() proto.connection_made(transport) proto.connection_lost(OSError()) assert proto.should_close assert isinstance(proto.exception(), ClientOSError) async def test_pause_resume_on_error(loop) -> None: proto = ResponseHandler(loop=loop) transport = mock.Mock() proto.connection_made(transport) proto.pause_reading() assert proto._reading_paused proto.resume_reading() assert not proto._reading_paused async def test_client_proto_bad_message(loop) -> None: proto = 
ResponseHandler(loop=loop) transport = mock.Mock() proto.connection_made(transport) proto.set_response_params() proto.data_received(b'HTTP\r\n\r\n') assert proto.should_close assert transport.close.called assert isinstance(proto.exception(), http.HttpProcessingError) async def test_uncompleted_message(loop) -> None: proto = ResponseHandler(loop=loop) transport = mock.Mock() proto.connection_made(transport) proto.set_response_params(read_until_eof=True) proto.data_received(b'HTTP/1.1 301 Moved Permanently\r\n' b'Location: http://python.org/') proto.connection_lost(None) exc = proto.exception() assert isinstance(exc, ServerDisconnectedError) assert exc.message.code == 301 assert dict(exc.message.headers) == {'Location': 'http://python.org/'} async def test_client_protocol_readuntil_eof(loop) -> None: proto = ResponseHandler(loop=loop) transport = mock.Mock() proto.connection_made(transport) conn = mock.Mock() conn.protocol = proto proto.data_received(b'HTTP/1.1 200 Ok\r\n\r\n') response = ClientResponse('get', URL('http://def-cl-resp.org'), writer=mock.Mock(), continue100=None, timer=TimerNoop(), request_info=mock.Mock(), traces=[], loop=loop, session=mock.Mock()) proto.set_response_params(read_until_eof=True) await response.start(conn) assert not response.content.is_eof() proto.data_received(b'0000') data = await response.content.readany() assert data == b'0000' proto.data_received(b'1111') data = await response.content.readany() assert data == b'1111' proto.connection_lost(None) assert response.content.is_eof() async def test_empty_data(loop) -> None: proto = ResponseHandler(loop=loop) proto.data_received(b'') # do nothing async def test_schedule_timeout(loop) -> None: proto = ResponseHandler(loop=loop) proto.set_response_params(read_timeout=1) assert proto._read_timeout_handle is not None async def test_drop_timeout(loop) -> None: proto = ResponseHandler(loop=loop) proto.set_response_params(read_timeout=1) assert proto._read_timeout_handle is not None proto._drop_timeout() assert proto._read_timeout_handle is None async def test_reschedule_timeout(loop) -> None: proto = ResponseHandler(loop=loop) proto.set_response_params(read_timeout=1) assert proto._read_timeout_handle is not None h = proto._read_timeout_handle proto._reschedule_timeout() assert proto._read_timeout_handle is not None assert proto._read_timeout_handle is not h async def test_eof_received(loop) -> None: proto = ResponseHandler(loop=loop) proto.set_response_params(read_timeout=1) assert proto._read_timeout_handle is not None proto.eof_received() assert proto._read_timeout_handle is None aiohttp-3.6.2/tests/test_client_request.py0000644000175100001650000011674213547410117021301 0ustar vstsdocker00000000000000# coding: utf-8 import asyncio import hashlib import io import os.path import urllib.parse import zlib from http.cookies import SimpleCookie from unittest import mock import pytest from async_generator import async_generator, yield_ from multidict import CIMultiDict, CIMultiDictProxy, istr from yarl import URL import aiohttp from aiohttp import BaseConnector, hdrs, payload from aiohttp.client_reqrep import ( ClientRequest, ClientResponse, Fingerprint, _merge_ssl_params, ) from aiohttp.test_utils import make_mocked_coro @pytest.fixture def make_request(loop): request = None def maker(method, url, *args, **kwargs): nonlocal request request = ClientRequest(method, URL(url), *args, loop=loop, **kwargs) return request yield maker if request is not None: loop.run_until_complete(request.close()) @pytest.fixture def buf(): 
return bytearray() @pytest.fixture def protocol(loop, transport): protocol = mock.Mock() protocol.transport = transport protocol._drain_helper.return_value = loop.create_future() protocol._drain_helper.return_value.set_result(None) return protocol @pytest.fixture def transport(buf): transport = mock.Mock() def write(chunk): buf.extend(chunk) async def write_eof(): pass transport.write.side_effect = write transport.write_eof.side_effect = write_eof transport.is_closing.return_value = False return transport @pytest.fixture def conn(transport, protocol): return mock.Mock( transport=transport, protocol=protocol ) def test_method1(make_request) -> None: req = make_request('get', 'http://python.org/') assert req.method == 'GET' def test_method2(make_request) -> None: req = make_request('head', 'http://python.org/') assert req.method == 'HEAD' def test_method3(make_request) -> None: req = make_request('HEAD', 'http://python.org/') assert req.method == 'HEAD' def test_version_1_0(make_request) -> None: req = make_request('get', 'http://python.org/', version='1.0') assert req.version == (1, 0) def test_version_default(make_request) -> None: req = make_request('get', 'http://python.org/') assert req.version == (1, 1) def test_request_info(make_request) -> None: req = make_request('get', 'http://python.org/') assert req.request_info == aiohttp.RequestInfo(URL('http://python.org/'), 'GET', req.headers) def test_request_info_with_fragment(make_request) -> None: req = make_request('get', 'http://python.org/#urlfragment') assert req.request_info == aiohttp.RequestInfo( URL('http://python.org/'), 'GET', req.headers, URL('http://python.org/#urlfragment')) def test_version_err(make_request) -> None: with pytest.raises(ValueError): make_request('get', 'http://python.org/', version='1.c') def test_https_proxy(make_request) -> None: with pytest.raises(ValueError): make_request( 'get', 'http://python.org/', proxy=URL('https://proxy.org')) def test_keep_alive(make_request) -> None: req = make_request('get', 'http://python.org/', version=(0, 9)) assert not req.keep_alive() req = make_request('get', 'http://python.org/', version=(1, 0)) assert not req.keep_alive() req = make_request('get', 'http://python.org/', version=(1, 0), headers={'connection': 'keep-alive'}) assert req.keep_alive() req = make_request('get', 'http://python.org/', version=(1, 1)) assert req.keep_alive() req = make_request('get', 'http://python.org/', version=(1, 1), headers={'connection': 'close'}) assert not req.keep_alive() def test_host_port_default_http(make_request) -> None: req = make_request('get', 'http://python.org/') assert req.host == 'python.org' assert req.port == 80 assert not req.ssl def test_host_port_default_https(make_request) -> None: req = make_request('get', 'https://python.org/') assert req.host == 'python.org' assert req.port == 443 assert req.is_ssl() def test_host_port_nondefault_http(make_request) -> None: req = make_request('get', 'http://python.org:960/') assert req.host == 'python.org' assert req.port == 960 assert not req.is_ssl() def test_host_port_nondefault_https(make_request) -> None: req = make_request('get', 'https://python.org:960/') assert req.host == 'python.org' assert req.port == 960 assert req.is_ssl() def test_host_port_default_ws(make_request) -> None: req = make_request('get', 'ws://python.org/') assert req.host == 'python.org' assert req.port == 80 assert not req.is_ssl() def test_host_port_default_wss(make_request) -> None: req = make_request('get', 'wss://python.org/') assert req.host == 
'python.org' assert req.port == 443 assert req.is_ssl() def test_host_port_nondefault_ws(make_request) -> None: req = make_request('get', 'ws://python.org:960/') assert req.host == 'python.org' assert req.port == 960 assert not req.is_ssl() def test_host_port_nondefault_wss(make_request) -> None: req = make_request('get', 'wss://python.org:960/') assert req.host == 'python.org' assert req.port == 960 assert req.is_ssl() def test_host_port_none_port(make_request) -> None: req = make_request('get', 'unix://localhost/path') assert req.headers['Host'] == 'localhost' def test_host_port_err(make_request) -> None: with pytest.raises(ValueError): make_request('get', 'http://python.org:123e/') def test_hostname_err(make_request) -> None: with pytest.raises(ValueError): make_request('get', 'http://:8080/') def test_host_header_host_first(make_request) -> None: req = make_request('get', 'http://python.org/') assert list(req.headers)[0] == 'Host' def test_host_header_host_without_port(make_request) -> None: req = make_request('get', 'http://python.org/') assert req.headers['HOST'] == 'python.org' def test_host_header_host_with_default_port(make_request) -> None: req = make_request('get', 'http://python.org:80/') assert req.headers['HOST'] == 'python.org' def test_host_header_host_with_nondefault_port(make_request) -> None: req = make_request('get', 'http://python.org:99/') assert req.headers['HOST'] == 'python.org:99' def test_host_header_host_idna_encode(make_request) -> None: req = make_request('get', 'http://xn--9caa.com') assert req.headers['HOST'] == 'xn--9caa.com' def test_host_header_host_unicode(make_request) -> None: req = make_request('get', 'http://éé.com') assert req.headers['HOST'] == 'xn--9caa.com' def test_host_header_explicit_host(make_request) -> None: req = make_request('get', 'http://python.org/', headers={'host': 'example.com'}) assert req.headers['HOST'] == 'example.com' def test_host_header_explicit_host_with_port(make_request) -> None: req = make_request('get', 'http://python.org/', headers={'host': 'example.com:99'}) assert req.headers['HOST'] == 'example.com:99' def test_host_header_ipv4(make_request) -> None: req = make_request('get', 'http://127.0.0.2') assert req.headers['HOST'] == '127.0.0.2' def test_host_header_ipv6(make_request) -> None: req = make_request('get', 'http://[::2]') assert req.headers['HOST'] == '[::2]' def test_host_header_ipv4_with_port(make_request) -> None: req = make_request('get', 'http://127.0.0.2:99') assert req.headers['HOST'] == '127.0.0.2:99' def test_host_header_ipv6_with_port(make_request) -> None: req = make_request('get', 'http://[::2]:99') assert req.headers['HOST'] == '[::2]:99' def test_default_loop(loop) -> None: asyncio.set_event_loop(loop) req = ClientRequest('get', URL('http://python.org/')) assert req.loop is loop def test_default_headers_useragent(make_request) -> None: req = make_request('get', 'http://python.org/') assert 'SERVER' not in req.headers assert 'USER-AGENT' in req.headers def test_default_headers_useragent_custom(make_request) -> None: req = make_request('get', 'http://python.org/', headers={'user-agent': 'my custom agent'}) assert 'USER-Agent' in req.headers assert 'my custom agent' == req.headers['User-Agent'] def test_skip_default_useragent_header(make_request) -> None: req = make_request('get', 'http://python.org/', skip_auto_headers=set([istr('user-agent')])) assert 'User-Agent' not in req.headers def test_headers(make_request) -> None: req = make_request('post', 'http://python.org/', headers={'Content-Type': 
'text/plain'}) assert 'CONTENT-TYPE' in req.headers assert req.headers['CONTENT-TYPE'] == 'text/plain' assert req.headers['ACCEPT-ENCODING'] == 'gzip, deflate' def test_headers_list(make_request) -> None: req = make_request('post', 'http://python.org/', headers=[('Content-Type', 'text/plain')]) assert 'CONTENT-TYPE' in req.headers assert req.headers['CONTENT-TYPE'] == 'text/plain' def test_headers_default(make_request) -> None: req = make_request('get', 'http://python.org/', headers={'ACCEPT-ENCODING': 'deflate'}) assert req.headers['ACCEPT-ENCODING'] == 'deflate' def test_invalid_url(make_request) -> None: with pytest.raises(aiohttp.InvalidURL): make_request('get', 'hiwpefhipowhefopw') def test_no_path(make_request) -> None: req = make_request('get', 'http://python.org') assert '/' == req.url.path def test_ipv6_default_http_port(make_request) -> None: req = make_request('get', 'http://[2001:db8::1]/') assert req.host == '2001:db8::1' assert req.port == 80 assert not req.ssl def test_ipv6_default_https_port(make_request) -> None: req = make_request('get', 'https://[2001:db8::1]/') assert req.host == '2001:db8::1' assert req.port == 443 assert req.is_ssl() def test_ipv6_nondefault_http_port(make_request) -> None: req = make_request('get', 'http://[2001:db8::1]:960/') assert req.host == '2001:db8::1' assert req.port == 960 assert not req.is_ssl() def test_ipv6_nondefault_https_port(make_request) -> None: req = make_request('get', 'https://[2001:db8::1]:960/') assert req.host == '2001:db8::1' assert req.port == 960 assert req.is_ssl() def test_basic_auth(make_request) -> None: req = make_request('get', 'http://python.org', auth=aiohttp.BasicAuth('nkim', '1234')) assert 'AUTHORIZATION' in req.headers assert 'Basic bmtpbToxMjM0' == req.headers['AUTHORIZATION'] def test_basic_auth_utf8(make_request) -> None: req = make_request('get', 'http://python.org', auth=aiohttp.BasicAuth('nkim', 'секрет', 'utf-8')) assert 'AUTHORIZATION' in req.headers assert 'Basic bmtpbTrRgdC10LrRgNC10YI=' == req.headers['AUTHORIZATION'] def test_basic_auth_tuple_forbidden(make_request) -> None: with pytest.raises(TypeError): make_request('get', 'http://python.org', auth=('nkim', '1234')) def test_basic_auth_from_url(make_request) -> None: req = make_request('get', 'http://nkim:1234@python.org') assert 'AUTHORIZATION' in req.headers assert 'Basic bmtpbToxMjM0' == req.headers['AUTHORIZATION'] assert 'python.org' == req.host def test_basic_auth_from_url_overridden(make_request) -> None: req = make_request('get', 'http://garbage@python.org', auth=aiohttp.BasicAuth('nkim', '1234')) assert 'AUTHORIZATION' in req.headers assert 'Basic bmtpbToxMjM0' == req.headers['AUTHORIZATION'] assert 'python.org' == req.host def test_path_is_not_double_encoded1(make_request) -> None: req = make_request('get', "http://0.0.0.0/get/test case") assert req.url.raw_path == "/get/test%20case" def test_path_is_not_double_encoded2(make_request) -> None: req = make_request('get', "http://0.0.0.0/get/test%2fcase") assert req.url.raw_path == "/get/test%2Fcase" def test_path_is_not_double_encoded3(make_request) -> None: req = make_request('get', "http://0.0.0.0/get/test%20case") assert req.url.raw_path == "/get/test%20case" def test_path_safe_chars_preserved(make_request) -> None: req = make_request('get', "http://0.0.0.0/get/:=+/%2B/") assert req.url.path == "/get/:=+/+/" def test_params_are_added_before_fragment1(make_request) -> None: req = make_request('GET', "http://example.com/path#fragment", params={"a": "b"}) assert str(req.url) == 
"http://example.com/path?a=b" def test_params_are_added_before_fragment2(make_request) -> None: req = make_request('GET', "http://example.com/path?key=value#fragment", params={"a": "b"}) assert str(req.url) == "http://example.com/path?key=value&a=b" def test_path_not_contain_fragment1(make_request) -> None: req = make_request('GET', "http://example.com/path#fragment") assert req.url.path == "/path" def test_path_not_contain_fragment2(make_request) -> None: req = make_request('GET', "http://example.com/path?key=value#fragment") assert str(req.url) == "http://example.com/path?key=value" def test_cookies(make_request) -> None: req = make_request('get', 'http://test.com/path', cookies={'cookie1': 'val1'}) assert 'COOKIE' in req.headers assert 'cookie1=val1' == req.headers['COOKIE'] def test_cookies_merge_with_headers(make_request) -> None: req = make_request('get', 'http://test.com/path', headers={'cookie': 'cookie1=val1'}, cookies={'cookie2': 'val2'}) assert 'cookie1=val1; cookie2=val2' == req.headers['COOKIE'] def test_unicode_get1(make_request) -> None: req = make_request('get', 'http://python.org', params={'foo': 'f\xf8\xf8'}) assert 'http://python.org/?foo=f%C3%B8%C3%B8' == str(req.url) def test_unicode_get2(make_request) -> None: req = make_request('', 'http://python.org', params={'f\xf8\xf8': 'f\xf8\xf8'}) assert 'http://python.org/?f%C3%B8%C3%B8=f%C3%B8%C3%B8' == str(req.url) def test_unicode_get3(make_request) -> None: req = make_request('', 'http://python.org', params={'foo': 'foo'}) assert 'http://python.org/?foo=foo' == str(req.url) def test_unicode_get4(make_request) -> None: def join(*suffix): return urllib.parse.urljoin('http://python.org/', '/'.join(suffix)) req = make_request('', join('\xf8'), params={'foo': 'foo'}) assert 'http://python.org/%C3%B8?foo=foo' == str(req.url) def test_query_multivalued_param(make_request) -> None: for meth in ClientRequest.ALL_METHODS: req = make_request( meth, 'http://python.org', params=(('test', 'foo'), ('test', 'baz'))) assert str(req.url) == 'http://python.org/?test=foo&test=baz' def test_query_str_param(make_request) -> None: for meth in ClientRequest.ALL_METHODS: req = make_request(meth, 'http://python.org', params='test=foo') assert str(req.url) == 'http://python.org/?test=foo' def test_query_bytes_param_raises(make_request) -> None: for meth in ClientRequest.ALL_METHODS: with pytest.raises(TypeError): make_request(meth, 'http://python.org', params=b'test=foo') def test_query_str_param_is_not_encoded(make_request) -> None: for meth in ClientRequest.ALL_METHODS: req = make_request(meth, 'http://python.org', params='test=f+oo') assert str(req.url) == 'http://python.org/?test=f+oo' def test_params_update_path_and_url(make_request) -> None: req = make_request('get', 'http://python.org', params=(('test', 'foo'), ('test', 'baz'))) assert str(req.url) == 'http://python.org/?test=foo&test=baz' def test_params_empty_path_and_url(make_request) -> None: req_empty = make_request('get', 'http://python.org', params={}) assert str(req_empty.url) == 'http://python.org' req_none = make_request('get', 'http://python.org') assert str(req_none.url) == 'http://python.org' def test_gen_netloc_all(make_request) -> None: req = make_request('get', 'https://aiohttp:pwpwpw@' + '12345678901234567890123456789' + '012345678901234567890:8080') assert req.headers['HOST'] == '12345678901234567890123456789' +\ '012345678901234567890:8080' def test_gen_netloc_no_port(make_request) -> None: req = make_request('get', 'https://aiohttp:pwpwpw@' + 
'12345678901234567890123456789' + '012345678901234567890/') assert req.headers['HOST'] == '12345678901234567890123456789' +\ '012345678901234567890' async def test_connection_header(loop, conn) -> None: req = ClientRequest('get', URL('http://python.org'), loop=loop) req.keep_alive = mock.Mock() req.headers.clear() req.keep_alive.return_value = True req.version = (1, 1) req.headers.clear() await req.send(conn) assert req.headers.get('CONNECTION') is None req.version = (1, 0) req.headers.clear() await req.send(conn) assert req.headers.get('CONNECTION') == 'keep-alive' req.keep_alive.return_value = False req.version = (1, 1) req.headers.clear() await req.send(conn) assert req.headers.get('CONNECTION') == 'close' async def test_no_content_length(loop, conn) -> None: req = ClientRequest('get', URL('http://python.org'), loop=loop) resp = await req.send(conn) assert req.headers.get('CONTENT-LENGTH') is None await req.close() resp.close() async def test_no_content_length_head(loop, conn) -> None: req = ClientRequest('head', URL('http://python.org'), loop=loop) resp = await req.send(conn) assert req.headers.get('CONTENT-LENGTH') is None await req.close() resp.close() async def test_content_type_auto_header_get(loop, conn) -> None: req = ClientRequest('get', URL('http://python.org'), loop=loop) resp = await req.send(conn) assert 'CONTENT-TYPE' not in req.headers resp.close() async def test_content_type_auto_header_form(loop, conn) -> None: req = ClientRequest('post', URL('http://python.org'), data={'hey': 'you'}, loop=loop) resp = await req.send(conn) assert 'application/x-www-form-urlencoded' == \ req.headers.get('CONTENT-TYPE') resp.close() async def test_content_type_auto_header_bytes(loop, conn) -> None: req = ClientRequest('post', URL('http://python.org'), data=b'hey you', loop=loop) resp = await req.send(conn) assert 'application/octet-stream' == req.headers.get('CONTENT-TYPE') resp.close() async def test_content_type_skip_auto_header_bytes(loop, conn) -> None: req = ClientRequest('post', URL('http://python.org'), data=b'hey you', skip_auto_headers={'Content-Type'}, loop=loop) resp = await req.send(conn) assert 'CONTENT-TYPE' not in req.headers resp.close() async def test_content_type_skip_auto_header_form(loop, conn) -> None: req = ClientRequest('post', URL('http://python.org'), data={'hey': 'you'}, loop=loop, skip_auto_headers={'Content-Type'}) resp = await req.send(conn) assert 'CONTENT-TYPE' not in req.headers resp.close() async def test_content_type_auto_header_content_length_no_skip(loop, conn) -> None: req = ClientRequest('post', URL('http://python.org'), data=io.BytesIO(b'hey'), skip_auto_headers={'Content-Length'}, loop=loop) resp = await req.send(conn) assert req.headers.get('CONTENT-LENGTH') == '3' resp.close() async def test_urlencoded_formdata_charset(loop, conn) -> None: req = ClientRequest( 'post', URL('http://python.org'), data=aiohttp.FormData({'hey': 'you'}, charset='koi8-r'), loop=loop) await req.send(conn) assert 'application/x-www-form-urlencoded; charset=koi8-r' == \ req.headers.get('CONTENT-TYPE') async def test_post_data(loop, conn) -> None: for meth in ClientRequest.POST_METHODS: req = ClientRequest( meth, URL('http://python.org/'), data={'life': '42'}, loop=loop) resp = await req.send(conn) assert '/' == req.url.path assert b'life=42' == req.body._value assert 'application/x-www-form-urlencoded' ==\ req.headers['CONTENT-TYPE'] await req.close() resp.close() async def test_pass_falsy_data(loop) -> None: with mock.patch( 
            'aiohttp.client_reqrep.ClientRequest.update_body_from_data'):
        req = ClientRequest(
            'post', URL('http://python.org/'), data={}, loop=loop)
        req.update_body_from_data.assert_called_once_with({})
        await req.close()


async def test_pass_falsy_data_file(loop, tmpdir) -> None:
    testfile = tmpdir.join('tmpfile').open('w+b')
    testfile.write(b'data')
    testfile.seek(0)
    skip = frozenset([hdrs.CONTENT_TYPE])
    req = ClientRequest(
        'post', URL('http://python.org/'),
        data=testfile,
        skip_auto_headers=skip,
        loop=loop)
    assert req.headers.get('CONTENT-LENGTH', None) is not None
    await req.close()


# The Elasticsearch API requires sending a request body with GET requests
async def test_get_with_data(loop) -> None:
    for meth in ClientRequest.GET_METHODS:
        req = ClientRequest(
            meth, URL('http://python.org/'), data={'life': '42'},
            loop=loop)
        assert '/' == req.url.path
        assert b'life=42' == req.body._value
        await req.close()


async def test_bytes_data(loop, conn) -> None:
    for meth in ClientRequest.POST_METHODS:
        req = ClientRequest(
            meth, URL('http://python.org/'),
            data=b'binary data', loop=loop)
        resp = await req.send(conn)
        assert '/' == req.url.path
        assert isinstance(req.body, payload.BytesPayload)
        assert b'binary data' == req.body._value
        assert 'application/octet-stream' == req.headers['CONTENT-TYPE']
        await req.close()
        resp.close()


async def test_content_encoding(loop, conn) -> None:
    req = ClientRequest('post', URL('http://python.org/'), data='foo',
                        compress='deflate', loop=loop)
    with mock.patch('aiohttp.client_reqrep.StreamWriter') as m_writer:
        m_writer.return_value.write_headers = make_mocked_coro()
        resp = await req.send(conn)
    assert req.headers['TRANSFER-ENCODING'] == 'chunked'
    assert req.headers['CONTENT-ENCODING'] == 'deflate'
    m_writer.return_value\
        .enable_compression.assert_called_with('deflate')
    await req.close()
    resp.close()


async def test_content_encoding_dont_set_headers_if_no_body(loop,
                                                            conn) -> None:
    req = ClientRequest('post', URL('http://python.org/'),
                        compress='deflate', loop=loop)
    with mock.patch('aiohttp.client_reqrep.http'):
        resp = await req.send(conn)
    assert 'TRANSFER-ENCODING' not in req.headers
    assert 'CONTENT-ENCODING' not in req.headers
    await req.close()
    resp.close()


async def test_content_encoding_header(loop, conn) -> None:
    req = ClientRequest(
        'post', URL('http://python.org/'), data='foo',
        headers={'Content-Encoding': 'deflate'}, loop=loop)
    with mock.patch('aiohttp.client_reqrep.StreamWriter') as m_writer:
        m_writer.return_value.write_headers = make_mocked_coro()
        resp = await req.send(conn)

    assert not m_writer.return_value.enable_compression.called
    assert not m_writer.return_value.enable_chunking.called
    await req.close()
    resp.close()


async def test_compress_and_content_encoding(loop, conn) -> None:
    with pytest.raises(ValueError):
        ClientRequest('post', URL('http://python.org/'), data='foo',
                      headers={'content-encoding': 'deflate'},
                      compress='deflate', loop=loop)


async def test_chunked(loop, conn) -> None:
    req = ClientRequest(
        'post', URL('http://python.org/'),
        headers={'TRANSFER-ENCODING': 'gzip'}, loop=loop)
    resp = await req.send(conn)
    assert 'gzip' == req.headers['TRANSFER-ENCODING']
    await req.close()
    resp.close()


async def test_chunked2(loop, conn) -> None:
    req = ClientRequest(
        'post', URL('http://python.org/'),
        headers={'Transfer-encoding': 'chunked'}, loop=loop)
    resp = await req.send(conn)
    assert 'chunked' == req.headers['TRANSFER-ENCODING']
    await req.close()
    resp.close()


async def test_chunked_explicit(loop, conn) -> None:
    req = ClientRequest(
        'post', URL('http://python.org/'), chunked=True, loop=loop)
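    # With chunked=True the outgoing request should advertise
    # Transfer-Encoding: chunked and call enable_chunking() on the
    # (mocked) StreamWriter, which is what the assertions below verify.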
with mock.patch('aiohttp.client_reqrep.StreamWriter') as m_writer: m_writer.return_value.write_headers = make_mocked_coro() resp = await req.send(conn) assert 'chunked' == req.headers['TRANSFER-ENCODING'] m_writer.return_value.enable_chunking.assert_called_with() await req.close() resp.close() async def test_chunked_length(loop, conn) -> None: with pytest.raises(ValueError): ClientRequest( 'post', URL('http://python.org/'), headers={'CONTENT-LENGTH': '1000'}, chunked=True, loop=loop) async def test_chunked_transfer_encoding(loop, conn) -> None: with pytest.raises(ValueError): ClientRequest( 'post', URL('http://python.org/'), headers={'TRANSFER-ENCODING': 'chunked'}, chunked=True, loop=loop) async def test_file_upload_not_chunked(loop) -> None: here = os.path.dirname(__file__) fname = os.path.join(here, 'aiohttp.png') with open(fname, 'rb') as f: req = ClientRequest( 'post', URL('http://python.org/'), data=f, loop=loop) assert not req.chunked assert req.headers['CONTENT-LENGTH'] == str(os.path.getsize(fname)) await req.close() async def test_precompressed_data_stays_intact(loop) -> None: data = zlib.compress(b'foobar') req = ClientRequest( 'post', URL('http://python.org/'), data=data, headers={'CONTENT-ENCODING': 'deflate'}, compress=False, loop=loop) assert not req.compress assert not req.chunked assert req.headers['CONTENT-ENCODING'] == 'deflate' await req.close() async def test_file_upload_not_chunked_seek(loop) -> None: here = os.path.dirname(__file__) fname = os.path.join(here, 'aiohttp.png') with open(fname, 'rb') as f: f.seek(100) req = ClientRequest( 'post', URL('http://python.org/'), data=f, loop=loop) assert req.headers['CONTENT-LENGTH'] == \ str(os.path.getsize(fname) - 100) await req.close() async def test_file_upload_force_chunked(loop) -> None: here = os.path.dirname(__file__) fname = os.path.join(here, 'aiohttp.png') with open(fname, 'rb') as f: req = ClientRequest( 'post', URL('http://python.org/'), data=f, chunked=True, loop=loop) assert req.chunked assert 'CONTENT-LENGTH' not in req.headers await req.close() async def test_expect100(loop, conn) -> None: req = ClientRequest('get', URL('http://python.org/'), expect100=True, loop=loop) resp = await req.send(conn) assert '100-continue' == req.headers['EXPECT'] assert req._continue is not None req.terminate() resp.close() async def test_expect_100_continue_header(loop, conn) -> None: req = ClientRequest('get', URL('http://python.org/'), headers={'expect': '100-continue'}, loop=loop) resp = await req.send(conn) assert '100-continue' == req.headers['EXPECT'] assert req._continue is not None req.terminate() resp.close() async def test_data_stream(loop, buf, conn) -> None: @async_generator async def gen(): await yield_(b'binary data') await yield_(b' result') req = ClientRequest( 'POST', URL('http://python.org/'), data=gen(), loop=loop) assert req.chunked assert req.headers['TRANSFER-ENCODING'] == 'chunked' resp = await req.send(conn) assert asyncio.isfuture(req._writer) await resp.wait_for_close() assert req._writer is None assert buf.split(b'\r\n\r\n', 1)[1] == \ b'b\r\nbinary data\r\n7\r\n result\r\n0\r\n\r\n' await req.close() async def test_data_stream_deprecated(loop, buf, conn) -> None: with pytest.warns(DeprecationWarning): @aiohttp.streamer async def gen(writer): await writer.write(b'binary data') await writer.write(b' result') req = ClientRequest( 'POST', URL('http://python.org/'), data=gen(), loop=loop) assert req.chunked assert req.headers['TRANSFER-ENCODING'] == 'chunked' resp = await req.send(conn) assert 
asyncio.isfuture(req._writer) await resp.wait_for_close() assert req._writer is None assert buf.split(b'\r\n\r\n', 1)[1] == \ b'b\r\nbinary data\r\n7\r\n result\r\n0\r\n\r\n' await req.close() async def test_data_file(loop, buf, conn) -> None: req = ClientRequest( 'POST', URL('http://python.org/'), data=io.BufferedReader(io.BytesIO(b'*' * 2)), loop=loop) assert req.chunked assert isinstance(req.body, payload.BufferedReaderPayload) assert req.headers['TRANSFER-ENCODING'] == 'chunked' resp = await req.send(conn) assert asyncio.isfuture(req._writer) await resp.wait_for_close() assert req._writer is None assert buf.split(b'\r\n\r\n', 1)[1] == \ b'2\r\n' + b'*' * 2 + b'\r\n0\r\n\r\n' await req.close() async def test_data_stream_exc(loop, conn) -> None: fut = loop.create_future() @async_generator async def gen(): await yield_(b'binary data') await fut req = ClientRequest( 'POST', URL('http://python.org/'), data=gen(), loop=loop) assert req.chunked assert req.headers['TRANSFER-ENCODING'] == 'chunked' async def throw_exc(): await asyncio.sleep(0.01) fut.set_exception(ValueError) loop.create_task(throw_exc()) await req.send(conn) await req._writer # assert conn.close.called assert conn.protocol.set_exception.called await req.close() async def test_data_stream_exc_deprecated(loop, conn) -> None: fut = loop.create_future() with pytest.warns(DeprecationWarning): @aiohttp.streamer async def gen(writer): await writer.write(b'binary data') await fut req = ClientRequest( 'POST', URL('http://python.org/'), data=gen(), loop=loop) assert req.chunked assert req.headers['TRANSFER-ENCODING'] == 'chunked' async def throw_exc(): await asyncio.sleep(0.01, loop=loop) fut.set_exception(ValueError) loop.create_task(throw_exc()) await req.send(conn) await req._writer # assert conn.close.called assert conn.protocol.set_exception.called await req.close() async def test_data_stream_exc_chain(loop, conn) -> None: fut = loop.create_future() @async_generator async def gen(): await fut req = ClientRequest('POST', URL('http://python.org/'), data=gen(), loop=loop) inner_exc = ValueError() async def throw_exc(): await asyncio.sleep(0.01) fut.set_exception(inner_exc) loop.create_task(throw_exc()) await req.send(conn) await req._writer # assert connection.close.called assert conn.protocol.set_exception.called outer_exc = conn.protocol.set_exception.call_args[0][0] assert isinstance(outer_exc, ValueError) assert inner_exc is outer_exc assert inner_exc is outer_exc await req.close() async def test_data_stream_exc_chain_deprecated(loop, conn) -> None: fut = loop.create_future() with pytest.warns(DeprecationWarning): @aiohttp.streamer async def gen(writer): await fut req = ClientRequest('POST', URL('http://python.org/'), data=gen(), loop=loop) inner_exc = ValueError() async def throw_exc(): await asyncio.sleep(0.01, loop=loop) fut.set_exception(inner_exc) loop.create_task(throw_exc()) await req.send(conn) await req._writer # assert connection.close.called assert conn.protocol.set_exception.called outer_exc = conn.protocol.set_exception.call_args[0][0] assert isinstance(outer_exc, ValueError) assert inner_exc is outer_exc assert inner_exc is outer_exc await req.close() async def test_data_stream_continue(loop, buf, conn) -> None: @async_generator async def gen(): await yield_(b'binary data') await yield_(b' result') req = ClientRequest( 'POST', URL('http://python.org/'), data=gen(), expect100=True, loop=loop) assert req.chunked async def coro(): await asyncio.sleep(0.0001) req._continue.set_result(1) loop.create_task(coro()) resp = 
await req.send(conn) await req._writer assert buf.split(b'\r\n\r\n', 1)[1] == \ b'b\r\nbinary data\r\n7\r\n result\r\n0\r\n\r\n' await req.close() resp.close() async def test_data_stream_continue_deprecated(loop, buf, conn) -> None: with pytest.warns(DeprecationWarning): @aiohttp.streamer async def gen(writer): await writer.write(b'binary data') await writer.write(b' result') await writer.write_eof() req = ClientRequest( 'POST', URL('http://python.org/'), data=gen(), expect100=True, loop=loop) assert req.chunked async def coro(): await asyncio.sleep(0.0001, loop=loop) req._continue.set_result(1) loop.create_task(coro()) resp = await req.send(conn) await req._writer assert buf.split(b'\r\n\r\n', 1)[1] == \ b'b\r\nbinary data\r\n7\r\n result\r\n0\r\n\r\n' await req.close() resp.close() async def test_data_continue(loop, buf, conn) -> None: req = ClientRequest( 'POST', URL('http://python.org/'), data=b'data', expect100=True, loop=loop) async def coro(): await asyncio.sleep(0.0001) req._continue.set_result(1) loop.create_task(coro()) resp = await req.send(conn) await req._writer assert buf.split(b'\r\n\r\n', 1)[1] == b'data' await req.close() resp.close() async def test_close(loop, buf, conn) -> None: @async_generator async def gen(): await asyncio.sleep(0.00001) await yield_(b'result') req = ClientRequest( 'POST', URL('http://python.org/'), data=gen(), loop=loop) resp = await req.send(conn) await req.close() assert buf.split(b'\r\n\r\n', 1)[1] == b'6\r\nresult\r\n0\r\n\r\n' await req.close() resp.close() async def test_close_deprecated(loop, buf, conn) -> None: with pytest.warns(DeprecationWarning): @aiohttp.streamer async def gen(writer): await asyncio.sleep(0.00001, loop=loop) await writer.write(b'result') req = ClientRequest( 'POST', URL('http://python.org/'), data=gen(), loop=loop) resp = await req.send(conn) await req.close() assert buf.split(b'\r\n\r\n', 1)[1] == b'6\r\nresult\r\n0\r\n\r\n' await req.close() resp.close() async def test_custom_response_class(loop, conn) -> None: class CustomResponse(ClientResponse): def read(self, decode=False): return 'customized!' req = ClientRequest( 'GET', URL('http://python.org/'), response_class=CustomResponse, loop=loop) resp = await req.send(conn) assert 'customized!' 
== resp.read() await req.close() resp.close() async def test_oserror_on_write_bytes(loop, conn) -> None: req = ClientRequest( 'POST', URL('http://python.org/'), loop=loop) writer = mock.Mock() writer.write.side_effect = OSError await req.write_bytes(writer, conn) assert conn.protocol.set_exception.called exc = conn.protocol.set_exception.call_args[0][0] assert isinstance(exc, aiohttp.ClientOSError) async def test_terminate(loop, conn) -> None: req = ClientRequest('get', URL('http://python.org'), loop=loop) resp = await req.send(conn) assert req._writer is not None writer = req._writer = mock.Mock() req.terminate() assert req._writer is None writer.cancel.assert_called_with() resp.close() def test_terminate_with_closed_loop(loop, conn) -> None: req = resp = writer = None async def go(): nonlocal req, resp, writer req = ClientRequest('get', URL('http://python.org')) resp = await req.send(conn) assert req._writer is not None writer = req._writer = mock.Mock() await asyncio.sleep(0.05) loop.run_until_complete(go()) loop.close() req.terminate() assert req._writer is None assert not writer.cancel.called resp.close() def test_terminate_without_writer(loop) -> None: req = ClientRequest('get', URL('http://python.org'), loop=loop) assert req._writer is None req.terminate() assert req._writer is None async def test_custom_req_rep(loop) -> None: conn = None class CustomResponse(ClientResponse): async def start(self, connection, read_until_eof=False): nonlocal conn conn = connection self.status = 123 self.reason = 'Test OK' self._headers = CIMultiDictProxy(CIMultiDict()) self.cookies = SimpleCookie() return called = False class CustomRequest(ClientRequest): async def send(self, conn): resp = self.response_class(self.method, self.url, writer=self._writer, continue100=self._continue, timer=self._timer, request_info=self.request_info, traces=self._traces, loop=self.loop, session=self._session) self.response = resp nonlocal called called = True return resp async def create_connection(req, traces, timeout): assert isinstance(req, CustomRequest) return mock.Mock() connector = BaseConnector(loop=loop) connector._create_connection = create_connection session = aiohttp.ClientSession( request_class=CustomRequest, response_class=CustomResponse, connector=connector, loop=loop) resp = await session.request( 'get', URL('http://example.com/path/to')) assert isinstance(resp, CustomResponse) assert called resp.close() await session.close() conn.close() def test_verify_ssl_false_with_ssl_context(loop, ssl_ctx) -> None: with pytest.warns(DeprecationWarning): with pytest.raises(ValueError): _merge_ssl_params(None, verify_ssl=False, ssl_context=ssl_ctx, fingerprint=None) def test_bad_fingerprint(loop) -> None: with pytest.raises(ValueError): Fingerprint(b'invalid') def test_insecure_fingerprint_md5(loop) -> None: with pytest.raises(ValueError): Fingerprint(hashlib.md5(b"foo").digest()) def test_insecure_fingerprint_sha1(loop) -> None: with pytest.raises(ValueError): Fingerprint(hashlib.sha1(b"foo").digest()) aiohttp-3.6.2/tests/test_client_response.py0000644000175100001650000011614313547410117021442 0ustar vstsdocker00000000000000# -*- coding: utf-8 -*- """Tests for aiohttp/client.py""" import gc import sys from unittest import mock import pytest from multidict import CIMultiDict from yarl import URL import aiohttp from aiohttp import http from aiohttp.client_reqrep import ClientResponse, RequestInfo from aiohttp.helpers import TimerNoop from aiohttp.test_utils import make_mocked_coro @pytest.fixture def session(): return 
mock.Mock() async def test_http_processing_error(session) -> None: loop = mock.Mock() request_info = mock.Mock() response = ClientResponse( 'get', URL('http://del-cl-resp.org'), request_info=request_info, writer=mock.Mock(), continue100=None, timer=TimerNoop(), traces=[], loop=loop, session=session) loop.get_debug = mock.Mock() loop.get_debug.return_value = True connection = mock.Mock() connection.protocol = aiohttp.DataQueue(loop) connection.protocol.set_response_params = mock.Mock() connection.protocol.set_exception(http.HttpProcessingError()) with pytest.raises(aiohttp.ClientResponseError) as info: await response.start(connection) assert info.value.request_info is request_info def test_del(session) -> None: loop = mock.Mock() response = ClientResponse('get', URL('http://del-cl-resp.org'), request_info=mock.Mock(), writer=mock.Mock(), continue100=None, timer=TimerNoop(), traces=[], loop=loop, session=session) loop.get_debug = mock.Mock() loop.get_debug.return_value = True connection = mock.Mock() response._closed = False response._connection = connection loop.set_exception_handler(lambda loop, ctx: None) with pytest.warns(ResourceWarning): del response gc.collect() connection.release.assert_called_with() def test_close(loop, session) -> None: response = ClientResponse('get', URL('http://def-cl-resp.org'), request_info=mock.Mock(), writer=mock.Mock(), continue100=None, timer=TimerNoop(), traces=[], loop=loop, session=session) response._closed = False response._connection = mock.Mock() response.close() assert response.connection is None response.close() response.close() def test_wait_for_100_1(loop, session) -> None: response = ClientResponse( 'get', URL('http://python.org'), continue100=object(), request_info=mock.Mock(), writer=mock.Mock(), timer=TimerNoop(), traces=[], loop=loop, session=session) assert response._continue is not None response.close() def test_wait_for_100_2(loop, session) -> None: response = ClientResponse( 'get', URL('http://python.org'), request_info=mock.Mock(), continue100=None, writer=mock.Mock(), timer=TimerNoop(), traces=[], loop=loop, session=session) assert response._continue is None response.close() def test_repr(loop, session) -> None: response = ClientResponse('get', URL('http://def-cl-resp.org'), request_info=mock.Mock(), writer=mock.Mock(), continue100=None, timer=TimerNoop(), traces=[], loop=loop, session=session) response.status = 200 response.reason = 'Ok' assert ''\ in repr(response) def test_repr_non_ascii_url() -> None: response = ClientResponse('get', URL('http://fake-host.org/\u03bb'), request_info=mock.Mock(), writer=mock.Mock(), continue100=None, timer=TimerNoop(), traces=[], loop=mock.Mock(), session=mock.Mock()) assert ""\ in repr(response) def test_repr_non_ascii_reason() -> None: response = ClientResponse('get', URL('http://fake-host.org/path'), request_info=mock.Mock(), writer=mock.Mock(), continue100=None, timer=TimerNoop(), traces=[], loop=mock.Mock(), session=mock.Mock()) response.reason = '\u03bb' assert ""\ in repr(response) def test_url_obj_deprecated() -> None: response = ClientResponse('get', URL('http://fake-host.org/'), request_info=mock.Mock(), writer=mock.Mock(), continue100=None, timer=TimerNoop(), traces=[], loop=mock.Mock(), session=mock.Mock()) with pytest.warns(DeprecationWarning): response.url_obj async def test_read_and_release_connection(loop, session) -> None: response = ClientResponse('get', URL('http://def-cl-resp.org'), request_info=mock.Mock(), writer=mock.Mock(), continue100=None, timer=TimerNoop(), traces=[], 
loop=loop, session=session) def side_effect(*args, **kwargs): fut = loop.create_future() fut.set_result(b'payload') return fut content = response.content = mock.Mock() content.read.side_effect = side_effect res = await response.read() assert res == b'payload' assert response._connection is None async def test_read_and_release_connection_with_error(loop, session) -> None: response = ClientResponse('get', URL('http://def-cl-resp.org'), request_info=mock.Mock(), writer=mock.Mock(), continue100=None, timer=TimerNoop(), traces=[], loop=loop, session=session) content = response.content = mock.Mock() content.read.return_value = loop.create_future() content.read.return_value.set_exception(ValueError) with pytest.raises(ValueError): await response.read() assert response._closed async def test_release(loop, session) -> None: response = ClientResponse('get', URL('http://def-cl-resp.org'), request_info=mock.Mock(), writer=mock.Mock(), continue100=None, timer=TimerNoop(), traces=[], loop=loop, session=session) fut = loop.create_future() fut.set_result(b'') content = response.content = mock.Mock() content.readany.return_value = fut response.release() assert response._connection is None @pytest.mark.skipif(sys.implementation.name != 'cpython', reason="Other implementations has different GC strategies") async def test_release_on_del(loop, session) -> None: connection = mock.Mock() connection.protocol.upgraded = False def run(conn): response = ClientResponse('get', URL('http://def-cl-resp.org'), request_info=mock.Mock(), writer=mock.Mock(), continue100=None, timer=TimerNoop(), traces=[], loop=loop, session=session) response._closed = False response._connection = conn run(connection) assert connection.release.called async def test_response_eof(loop, session) -> None: response = ClientResponse('get', URL('http://def-cl-resp.org'), request_info=mock.Mock(), writer=mock.Mock(), continue100=None, timer=TimerNoop(), traces=[], loop=loop, session=session) response._closed = False conn = response._connection = mock.Mock() conn.protocol.upgraded = False response._response_eof() assert conn.release.called assert response._connection is None async def test_response_eof_upgraded(loop, session) -> None: response = ClientResponse('get', URL('http://def-cl-resp.org'), request_info=mock.Mock(), writer=mock.Mock(), continue100=None, timer=TimerNoop(), traces=[], loop=loop, session=session) conn = response._connection = mock.Mock() conn.protocol.upgraded = True response._response_eof() assert not conn.release.called assert response._connection is conn async def test_response_eof_after_connection_detach(loop, session) -> None: response = ClientResponse('get', URL('http://def-cl-resp.org'), request_info=mock.Mock(), writer=mock.Mock(), continue100=None, timer=TimerNoop(), traces=[], loop=loop, session=session) response._closed = False conn = response._connection = mock.Mock() conn.protocol = None response._response_eof() assert conn.release.called assert response._connection is None async def test_text(loop, session) -> None: response = ClientResponse('get', URL('http://def-cl-resp.org'), request_info=mock.Mock(), writer=mock.Mock(), continue100=None, timer=TimerNoop(), traces=[], loop=loop, session=session) def side_effect(*args, **kwargs): fut = loop.create_future() fut.set_result('{"тест": "пройден"}'.encode('cp1251')) return fut response._headers = { 'Content-Type': 'application/json;charset=cp1251'} content = response.content = mock.Mock() content.read.side_effect = side_effect res = await response.text() assert res == 
'{"тест": "пройден"}' assert response._connection is None async def test_text_bad_encoding(loop, session) -> None: response = ClientResponse('get', URL('http://def-cl-resp.org'), request_info=mock.Mock(), writer=mock.Mock(), continue100=None, timer=TimerNoop(), traces=[], loop=loop, session=session) def side_effect(*args, **kwargs): fut = loop.create_future() fut.set_result('{"тестkey": "пройденvalue"}'.encode('cp1251')) return fut # lie about the encoding response._headers = { 'Content-Type': 'application/json;charset=utf-8'} content = response.content = mock.Mock() content.read.side_effect = side_effect with pytest.raises(UnicodeDecodeError): await response.text() # only the valid utf-8 characters will be returned res = await response.text(errors='ignore') assert res == '{"key": "value"}' assert response._connection is None async def test_text_custom_encoding(loop, session) -> None: response = ClientResponse('get', URL('http://def-cl-resp.org'), request_info=mock.Mock(), writer=mock.Mock(), continue100=None, timer=TimerNoop(), traces=[], loop=loop, session=session) def side_effect(*args, **kwargs): fut = loop.create_future() fut.set_result('{"тест": "пройден"}'.encode('cp1251')) return fut response._headers = { 'Content-Type': 'application/json'} content = response.content = mock.Mock() content.read.side_effect = side_effect response.get_encoding = mock.Mock() res = await response.text(encoding='cp1251') assert res == '{"тест": "пройден"}' assert response._connection is None assert not response.get_encoding.called async def test_text_detect_encoding(loop, session) -> None: response = ClientResponse('get', URL('http://def-cl-resp.org'), request_info=mock.Mock(), writer=mock.Mock(), continue100=None, timer=TimerNoop(), traces=[], loop=loop, session=session) def side_effect(*args, **kwargs): fut = loop.create_future() fut.set_result('{"тест": "пройден"}'.encode('cp1251')) return fut response._headers = {'Content-Type': 'text/plain'} content = response.content = mock.Mock() content.read.side_effect = side_effect await response.read() res = await response.text() assert res == '{"тест": "пройден"}' assert response._connection is None async def test_text_detect_encoding_if_invalid_charset(loop, session) -> None: response = ClientResponse('get', URL('http://def-cl-resp.org'), request_info=mock.Mock(), writer=mock.Mock(), continue100=None, timer=TimerNoop(), traces=[], loop=loop, session=session) def side_effect(*args, **kwargs): fut = loop.create_future() fut.set_result('{"тест": "пройден"}'.encode('cp1251')) return fut response._headers = {'Content-Type': 'text/plain;charset=invalid'} content = response.content = mock.Mock() content.read.side_effect = side_effect await response.read() res = await response.text() assert res == '{"тест": "пройден"}' assert response._connection is None assert response.get_encoding().lower() in ('windows-1251', 'maccyrillic') async def test_text_after_read(loop, session) -> None: response = ClientResponse('get', URL('http://def-cl-resp.org'), request_info=mock.Mock(), writer=mock.Mock(), continue100=None, timer=TimerNoop(), traces=[], loop=loop, session=session) def side_effect(*args, **kwargs): fut = loop.create_future() fut.set_result('{"тест": "пройден"}'.encode('cp1251')) return fut response._headers = { 'Content-Type': 'application/json;charset=cp1251'} content = response.content = mock.Mock() content.read.side_effect = side_effect res = await response.text() assert res == '{"тест": "пройден"}' assert response._connection is None async def test_json(loop, 
session) -> None: response = ClientResponse('get', URL('http://def-cl-resp.org'), request_info=mock.Mock(), writer=mock.Mock(), continue100=None, timer=TimerNoop(), traces=[], loop=loop, session=session) def side_effect(*args, **kwargs): fut = loop.create_future() fut.set_result('{"тест": "пройден"}'.encode('cp1251')) return fut response._headers = { 'Content-Type': 'application/json;charset=cp1251'} content = response.content = mock.Mock() content.read.side_effect = side_effect res = await response.json() assert res == {'тест': 'пройден'} assert response._connection is None async def test_json_extended_content_type(loop, session) -> None: response = ClientResponse('get', URL('http://def-cl-resp.org'), request_info=mock.Mock(), writer=mock.Mock(), continue100=None, timer=TimerNoop(), traces=[], loop=loop, session=session) def side_effect(*args, **kwargs): fut = loop.create_future() fut.set_result('{"тест": "пройден"}'.encode('cp1251')) return fut response._headers = { 'Content-Type': 'application/this.is-1_content+subtype+json;charset=cp1251'} content = response.content = mock.Mock() content.read.side_effect = side_effect res = await response.json() assert res == {'тест': 'пройден'} assert response._connection is None async def test_json_custom_content_type(loop, session) -> None: response = ClientResponse('get', URL('http://def-cl-resp.org'), request_info=mock.Mock(), writer=mock.Mock(), continue100=None, timer=TimerNoop(), traces=[], loop=loop, session=session) def side_effect(*args, **kwargs): fut = loop.create_future() fut.set_result('{"тест": "пройден"}'.encode('cp1251')) return fut response._headers = { 'Content-Type': 'custom/type;charset=cp1251'} content = response.content = mock.Mock() content.read.side_effect = side_effect res = await response.json(content_type='custom/type') assert res == {'тест': 'пройден'} assert response._connection is None async def test_json_custom_loader(loop, session) -> None: response = ClientResponse('get', URL('http://def-cl-resp.org'), request_info=mock.Mock(), writer=mock.Mock(), continue100=None, timer=TimerNoop(), traces=[], loop=loop, session=session) response._headers = { 'Content-Type': 'application/json;charset=cp1251'} response._body = b'data' def custom(content): return content + '-custom' res = await response.json(loads=custom) assert res == 'data-custom' async def test_json_invalid_content_type(loop, session) -> None: response = ClientResponse('get', URL('http://def-cl-resp.org'), request_info=mock.Mock(), writer=mock.Mock(), continue100=None, timer=TimerNoop(), traces=[], loop=loop, session=session) response._headers = { 'Content-Type': 'data/octet-stream'} response._body = b'' with pytest.raises(aiohttp.ContentTypeError) as info: await response.json() assert info.value.request_info == response.request_info async def test_json_no_content(loop, session) -> None: response = ClientResponse('get', URL('http://def-cl-resp.org'), request_info=mock.Mock(), writer=mock.Mock(), continue100=None, timer=TimerNoop(), traces=[], loop=loop, session=session) response._headers = { 'Content-Type': 'data/octet-stream'} response._body = b'' res = await response.json(content_type=None) assert res is None async def test_json_override_encoding(loop, session) -> None: response = ClientResponse('get', URL('http://def-cl-resp.org'), request_info=mock.Mock(), writer=mock.Mock(), continue100=None, timer=TimerNoop(), traces=[], loop=loop, session=session) def side_effect(*args, **kwargs): fut = loop.create_future() fut.set_result('{"тест": 
"пройден"}'.encode('cp1251')) return fut response._headers = { 'Content-Type': 'application/json;charset=utf8'} content = response.content = mock.Mock() content.read.side_effect = side_effect response.get_encoding = mock.Mock() res = await response.json(encoding='cp1251') assert res == {'тест': 'пройден'} assert response._connection is None assert not response.get_encoding.called def test_get_encoding_unknown(loop, session) -> None: response = ClientResponse('get', URL('http://def-cl-resp.org'), request_info=mock.Mock(), writer=mock.Mock(), continue100=None, timer=TimerNoop(), traces=[], loop=loop, session=session) response._headers = {'Content-Type': 'application/json'} with mock.patch('aiohttp.client_reqrep.chardet') as m_chardet: m_chardet.detect.return_value = {'encoding': None} assert response.get_encoding() == 'utf-8' def test_raise_for_status_2xx() -> None: response = ClientResponse('get', URL('http://def-cl-resp.org'), request_info=mock.Mock(), writer=mock.Mock(), continue100=None, timer=TimerNoop(), traces=[], loop=mock.Mock(), session=mock.Mock()) response.status = 200 response.reason = 'OK' response.raise_for_status() # should not raise def test_raise_for_status_4xx() -> None: response = ClientResponse('get', URL('http://def-cl-resp.org'), request_info=mock.Mock(), writer=mock.Mock(), continue100=None, timer=TimerNoop(), traces=[], loop=mock.Mock(), session=mock.Mock()) response.status = 409 response.reason = 'CONFLICT' with pytest.raises(aiohttp.ClientResponseError) as cm: response.raise_for_status() assert str(cm.value.status) == '409' assert str(cm.value.message) == "CONFLICT" assert response.closed def test_raise_for_status_4xx_without_reason() -> None: response = ClientResponse('get', URL('http://def-cl-resp.org'), request_info=mock.Mock(), writer=mock.Mock(), continue100=None, timer=TimerNoop(), traces=[], loop=mock.Mock(), session=mock.Mock()) response.status = 404 response.reason = '' with pytest.raises(aiohttp.ClientResponseError) as cm: response.raise_for_status() assert str(cm.value.status) == '404' assert str(cm.value.message) == '' assert response.closed def test_resp_host() -> None: response = ClientResponse('get', URL('http://del-cl-resp.org'), request_info=mock.Mock(), writer=mock.Mock(), continue100=None, timer=TimerNoop(), traces=[], loop=mock.Mock(), session=mock.Mock()) assert 'del-cl-resp.org' == response.host def test_content_type() -> None: response = ClientResponse('get', URL('http://def-cl-resp.org'), request_info=mock.Mock(), writer=mock.Mock(), continue100=None, timer=TimerNoop(), traces=[], loop=mock.Mock(), session=mock.Mock()) response._headers = {'Content-Type': 'application/json;charset=cp1251'} assert 'application/json' == response.content_type def test_content_type_no_header() -> None: response = ClientResponse('get', URL('http://def-cl-resp.org'), request_info=mock.Mock(), writer=mock.Mock(), continue100=None, timer=TimerNoop(), traces=[], loop=mock.Mock(), session=mock.Mock()) response._headers = {} assert 'application/octet-stream' == response.content_type def test_charset() -> None: response = ClientResponse('get', URL('http://def-cl-resp.org'), request_info=mock.Mock(), writer=mock.Mock(), continue100=None, timer=TimerNoop(), traces=[], loop=mock.Mock(), session=mock.Mock()) response._headers = {'Content-Type': 'application/json;charset=cp1251'} assert 'cp1251' == response.charset def test_charset_no_header() -> None: response = ClientResponse('get', URL('http://def-cl-resp.org'), request_info=mock.Mock(), writer=mock.Mock(), 
continue100=None, timer=TimerNoop(), traces=[], loop=mock.Mock(), session=mock.Mock()) response._headers = {} assert response.charset is None def test_charset_no_charset() -> None: response = ClientResponse('get', URL('http://def-cl-resp.org'), request_info=mock.Mock(), writer=mock.Mock(), continue100=None, timer=TimerNoop(), traces=[], loop=mock.Mock(), session=mock.Mock()) response._headers = {'Content-Type': 'application/json'} assert response.charset is None def test_content_disposition_full() -> None: response = ClientResponse('get', URL('http://def-cl-resp.org'), request_info=mock.Mock(), writer=mock.Mock(), continue100=None, timer=TimerNoop(), traces=[], loop=mock.Mock(), session=mock.Mock()) response._headers = {'Content-Disposition': 'attachment; filename="archive.tar.gz"; foo=bar'} assert 'attachment' == response.content_disposition.type assert 'bar' == response.content_disposition.parameters["foo"] assert 'archive.tar.gz' == response.content_disposition.filename with pytest.raises(TypeError): response.content_disposition.parameters["foo"] = "baz" def test_content_disposition_no_parameters() -> None: response = ClientResponse('get', URL('http://def-cl-resp.org'), request_info=mock.Mock(), writer=mock.Mock(), continue100=None, timer=TimerNoop(), traces=[], loop=mock.Mock(), session=mock.Mock()) response._headers = {'Content-Disposition': 'attachment'} assert 'attachment' == response.content_disposition.type assert response.content_disposition.filename is None assert {} == response.content_disposition.parameters def test_content_disposition_no_header() -> None: response = ClientResponse('get', URL('http://def-cl-resp.org'), request_info=mock.Mock(), writer=mock.Mock(), continue100=None, timer=TimerNoop(), traces=[], loop=mock.Mock(), session=mock.Mock()) response._headers = {} assert response.content_disposition is None def test_response_request_info() -> None: url = 'http://def-cl-resp.org' headers = {'Content-Type': 'application/json;charset=cp1251'} response = ClientResponse( 'get', URL(url), request_info=RequestInfo( url, 'get', headers ), writer=mock.Mock(), continue100=None, timer=TimerNoop(), traces=[], loop=mock.Mock(), session=mock.Mock() ) assert url == response.request_info.url assert 'get' == response.request_info.method assert headers == response.request_info.headers def test_request_info_in_exception() -> None: url = 'http://def-cl-resp.org' headers = {'Content-Type': 'application/json;charset=cp1251'} response = ClientResponse( 'get', URL(url), request_info=RequestInfo( url, 'get', headers ), writer=mock.Mock(), continue100=None, timer=TimerNoop(), traces=[], loop=mock.Mock(), session=mock.Mock() ) response.status = 409 response.reason = 'CONFLICT' with pytest.raises(aiohttp.ClientResponseError) as cm: response.raise_for_status() assert cm.value.request_info == response.request_info def test_no_redirect_history_in_exception() -> None: url = 'http://def-cl-resp.org' headers = {'Content-Type': 'application/json;charset=cp1251'} response = ClientResponse( 'get', URL(url), request_info=RequestInfo( url, 'get', headers ), writer=mock.Mock(), continue100=None, timer=TimerNoop(), traces=[], loop=mock.Mock(), session=mock.Mock() ) response.status = 409 response.reason = 'CONFLICT' with pytest.raises(aiohttp.ClientResponseError) as cm: response.raise_for_status() assert () == cm.value.history def test_redirect_history_in_exception() -> None: hist_url = 'http://def-cl-resp.org' url = 'http://def-cl-resp.org/index.htm' hist_headers = {'Content-Type': 
'application/json;charset=cp1251', 'Location': url } headers = {'Content-Type': 'application/json;charset=cp1251'} response = ClientResponse( 'get', URL(url), request_info=RequestInfo( url, 'get', headers ), writer=mock.Mock(), continue100=None, timer=TimerNoop(), traces=[], loop=mock.Mock(), session=mock.Mock() ) response.status = 409 response.reason = 'CONFLICT' hist_response = ClientResponse( 'get', URL(hist_url), request_info=RequestInfo( url, 'get', headers ), writer=mock.Mock(), continue100=None, timer=TimerNoop(), traces=[], loop=mock.Mock(), session=mock.Mock() ) hist_response._headers = hist_headers hist_response.status = 301 hist_response.reason = 'REDIRECT' response._history = [hist_response] with pytest.raises(aiohttp.ClientResponseError) as cm: response.raise_for_status() assert [hist_response] == cm.value.history async def test_response_read_triggers_callback(loop, session) -> None: trace = mock.Mock() trace.send_response_chunk_received = make_mocked_coro() response_body = b'This is response' response = ClientResponse( 'get', URL('http://def-cl-resp.org'), request_info=mock.Mock, writer=mock.Mock(), continue100=None, timer=TimerNoop(), loop=loop, session=session, traces=[trace] ) def side_effect(*args, **kwargs): fut = loop.create_future() fut.set_result(response_body) return fut response._headers = { 'Content-Type': 'application/json;charset=cp1251'} content = response.content = mock.Mock() content.read.side_effect = side_effect res = await response.read() assert res == response_body assert response._connection is None assert trace.send_response_chunk_received.called assert ( trace.send_response_chunk_received.call_args == mock.call(response_body) ) def test_response_real_url(loop, session) -> None: url = URL('http://def-cl-resp.org/#urlfragment') response = ClientResponse('get', url, request_info=mock.Mock(), writer=mock.Mock(), continue100=None, timer=TimerNoop(), traces=[], loop=loop, session=session) assert response.url == url.with_fragment(None) assert response.real_url == url def test_response_links_comma_separated(loop, session) -> None: url = URL('http://def-cl-resp.org/') response = ClientResponse('get', url, request_info=mock.Mock(), writer=mock.Mock(), continue100=None, timer=TimerNoop(), traces=[], loop=loop, session=session) response._headers = CIMultiDict([ ( "Link", ('; rel=next, ' '; rel=home') ) ]) assert ( response.links == {'next': {'url': URL('http://example.com/page/1.html'), 'rel': 'next'}, 'home': {'url': URL('http://example.com/'), 'rel': 'home'} } ) def test_response_links_multiple_headers(loop, session) -> None: url = URL('http://def-cl-resp.org/') response = ClientResponse('get', url, request_info=mock.Mock(), writer=mock.Mock(), continue100=None, timer=TimerNoop(), traces=[], loop=loop, session=session) response._headers = CIMultiDict([ ( "Link", '; rel=next' ), ( "Link", '; rel=home' ) ]) assert ( response.links == {'next': {'url': URL('http://example.com/page/1.html'), 'rel': 'next'}, 'home': {'url': URL('http://example.com/'), 'rel': 'home'} } ) def test_response_links_no_rel(loop, session) -> None: url = URL('http://def-cl-resp.org/') response = ClientResponse('get', url, request_info=mock.Mock(), writer=mock.Mock(), continue100=None, timer=TimerNoop(), traces=[], loop=loop, session=session) response._headers = CIMultiDict([ ( "Link", '' ) ]) assert ( response.links == { 'http://example.com/': {'url': URL('http://example.com/')} } ) def test_response_links_quoted(loop, session) -> None: url = URL('http://def-cl-resp.org/') response = 
ClientResponse('get', url, request_info=mock.Mock(), writer=mock.Mock(), continue100=None, timer=TimerNoop(), traces=[], loop=loop, session=session) response._headers = CIMultiDict([ ( "Link", '; rel="home-page"' ), ]) assert ( response.links == {'home-page': {'url': URL('http://example.com/'), 'rel': 'home-page'} } ) def test_response_links_relative(loop, session) -> None: url = URL('http://def-cl-resp.org/') response = ClientResponse('get', url, request_info=mock.Mock(), writer=mock.Mock(), continue100=None, timer=TimerNoop(), traces=[], loop=loop, session=session) response._headers = CIMultiDict([ ( "Link", '; rel=rel' ), ]) assert ( response.links == {'rel': {'url': URL('http://def-cl-resp.org/relative/path'), 'rel': 'rel'} } ) def test_response_links_empty(loop, session) -> None: url = URL('http://def-cl-resp.org/') response = ClientResponse('get', url, request_info=mock.Mock(), writer=mock.Mock(), continue100=None, timer=TimerNoop(), traces=[], loop=loop, session=session) response._headers = CIMultiDict() assert response.links == {} aiohttp-3.6.2/tests/test_client_session.py0000644000175100001650000006027313547410117021271 0ustar vstsdocker00000000000000import asyncio import contextlib import gc import json import re from http.cookies import SimpleCookie from io import BytesIO from unittest import mock import pytest from multidict import CIMultiDict, MultiDict from yarl import URL import aiohttp from aiohttp import client, hdrs, web from aiohttp.client import ClientSession from aiohttp.client_reqrep import ClientRequest from aiohttp.connector import BaseConnector, TCPConnector from aiohttp.helpers import DEBUG, PY_36 from aiohttp.test_utils import make_mocked_coro @pytest.fixture def connector(loop): async def make_conn(): return BaseConnector(loop=loop) conn = loop.run_until_complete(make_conn()) proto = mock.Mock() conn._conns['a'] = [(proto, 123)] yield conn conn.close() @pytest.fixture def create_session(loop): session = None async def maker(*args, **kwargs): nonlocal session session = ClientSession(*args, loop=loop, **kwargs) return session yield maker if session is not None: loop.run_until_complete(session.close()) @pytest.fixture def session(create_session, loop): return loop.run_until_complete(create_session()) @pytest.fixture def params(): return dict( headers={"Authorization": "Basic ..."}, max_redirects=2, encoding="latin1", version=aiohttp.HttpVersion10, compress="deflate", chunked=True, expect100=True, read_until_eof=False) async def test_close_coro(create_session) -> None: session = await create_session() await session.close() async def test_init_headers_simple_dict(create_session) -> None: session = await create_session(headers={"h1": "header1", "h2": "header2"}) assert (sorted(session._default_headers.items()) == ([("h1", "header1"), ("h2", "header2")])) async def test_init_headers_list_of_tuples(create_session) -> None: session = await create_session(headers=[("h1", "header1"), ("h2", "header2"), ("h3", "header3")]) assert (session._default_headers == CIMultiDict([("h1", "header1"), ("h2", "header2"), ("h3", "header3")])) async def test_init_headers_MultiDict(create_session) -> None: session = await create_session(headers=MultiDict([("h1", "header1"), ("h2", "header2"), ("h3", "header3")])) assert (session._default_headers == CIMultiDict([("H1", "header1"), ("H2", "header2"), ("H3", "header3")])) async def test_init_headers_list_of_tuples_with_duplicates( create_session) -> None: session = await create_session(headers=[("h1", "header11"), ("h2", "header21"), ("h1", 
"header12")]) assert (session._default_headers == CIMultiDict([("H1", "header11"), ("H2", "header21"), ("H1", "header12")])) async def test_init_cookies_with_simple_dict(create_session) -> None: session = await create_session(cookies={"c1": "cookie1", "c2": "cookie2"}) cookies = session.cookie_jar.filter_cookies() assert set(cookies) == {'c1', 'c2'} assert cookies['c1'].value == 'cookie1' assert cookies['c2'].value == 'cookie2' async def test_init_cookies_with_list_of_tuples(create_session) -> None: session = await create_session(cookies=[("c1", "cookie1"), ("c2", "cookie2")]) cookies = session.cookie_jar.filter_cookies() assert set(cookies) == {'c1', 'c2'} assert cookies['c1'].value == 'cookie1' assert cookies['c2'].value == 'cookie2' async def test_merge_headers(create_session) -> None: # Check incoming simple dict session = await create_session(headers={"h1": "header1", "h2": "header2"}) headers = session._prepare_headers({"h1": "h1"}) assert isinstance(headers, CIMultiDict) assert headers == {"h1": "h1", "h2": "header2"} async def test_merge_headers_with_multi_dict(create_session) -> None: session = await create_session(headers={"h1": "header1", "h2": "header2"}) headers = session._prepare_headers(MultiDict([("h1", "h1")])) assert isinstance(headers, CIMultiDict) assert headers == {"h1": "h1", "h2": "header2"} async def test_merge_headers_with_list_of_tuples(create_session) -> None: session = await create_session(headers={"h1": "header1", "h2": "header2"}) headers = session._prepare_headers([("h1", "h1")]) assert isinstance(headers, CIMultiDict) assert headers == {"h1": "h1", "h2": "header2"} async def test_merge_headers_with_list_of_tuples_duplicated_names( create_session) -> None: session = await create_session(headers={"h1": "header1", "h2": "header2"}) headers = session._prepare_headers([("h1", "v1"), ("h1", "v2")]) assert isinstance(headers, CIMultiDict) assert list(sorted(headers.items())) == [("h1", "v1"), ("h1", "v2"), ("h2", "header2")] def test_http_GET(session, params) -> None: # Python 3.8 will auto use mock.AsyncMock, it has different behavior with mock.patch( "aiohttp.client.ClientSession._request", new_callable=mock.MagicMock ) as patched: session.get("http://test.example.com", params={"x": 1}, **params) assert patched.called, "`ClientSession._request` not called" assert list(patched.call_args) == [("GET", "http://test.example.com",), dict( params={"x": 1}, allow_redirects=True, **params)] def test_http_OPTIONS(session, params) -> None: with mock.patch( "aiohttp.client.ClientSession._request", new_callable=mock.MagicMock ) as patched: session.options("http://opt.example.com", params={"x": 2}, **params) assert patched.called, "`ClientSession._request` not called" assert list(patched.call_args) == [("OPTIONS", "http://opt.example.com",), dict( params={"x": 2}, allow_redirects=True, **params)] def test_http_HEAD(session, params) -> None: with mock.patch( "aiohttp.client.ClientSession._request", new_callable=mock.MagicMock ) as patched: session.head("http://head.example.com", params={"x": 2}, **params) assert patched.called, "`ClientSession._request` not called" assert list(patched.call_args) == [("HEAD", "http://head.example.com",), dict( params={"x": 2}, allow_redirects=False, **params)] def test_http_POST(session, params) -> None: with mock.patch( "aiohttp.client.ClientSession._request", new_callable=mock.MagicMock ) as patched: session.post("http://post.example.com", params={"x": 2}, data="Some_data", **params) assert patched.called, "`ClientSession._request` not called" 
assert list(patched.call_args) == [("POST", "http://post.example.com",), dict( params={"x": 2}, data="Some_data", **params)] def test_http_PUT(session, params) -> None: with mock.patch( "aiohttp.client.ClientSession._request", new_callable=mock.MagicMock ) as patched: session.put("http://put.example.com", params={"x": 2}, data="Some_data", **params) assert patched.called, "`ClientSession._request` not called" assert list(patched.call_args) == [("PUT", "http://put.example.com",), dict( params={"x": 2}, data="Some_data", **params)] def test_http_PATCH(session, params) -> None: with mock.patch( "aiohttp.client.ClientSession._request", new_callable=mock.MagicMock ) as patched: session.patch("http://patch.example.com", params={"x": 2}, data="Some_data", **params) assert patched.called, "`ClientSession._request` not called" assert list(patched.call_args) == [("PATCH", "http://patch.example.com",), dict( params={"x": 2}, data="Some_data", **params)] def test_http_DELETE(session, params) -> None: with mock.patch( "aiohttp.client.ClientSession._request", new_callable=mock.MagicMock ) as patched: session.delete("http://delete.example.com", params={"x": 2}, **params) assert patched.called, "`ClientSession._request` not called" assert list(patched.call_args) == [("DELETE", "http://delete.example.com",), dict( params={"x": 2}, **params)] async def test_close(create_session, connector) -> None: session = await create_session(connector=connector) await session.close() assert session.connector is None assert connector.closed async def test_closed(session) -> None: assert not session.closed await session.close() assert session.closed async def test_connector(create_session, loop, mocker) -> None: connector = TCPConnector(loop=loop) mocker.spy(connector, 'close') session = await create_session(connector=connector) assert session.connector is connector await session.close() assert connector.close.called connector.close() async def test_create_connector(create_session, loop, mocker) -> None: session = await create_session() connector = session.connector mocker.spy(session.connector, 'close') await session.close() assert connector.close.called def test_connector_loop(loop) -> None: with contextlib.ExitStack() as stack: another_loop = asyncio.new_event_loop() stack.enter_context(contextlib.closing(another_loop)) async def make_connector(): return TCPConnector() connector = another_loop.run_until_complete(make_connector()) stack.enter_context(contextlib.closing(connector)) with pytest.raises(RuntimeError) as ctx: async def make_sess(): return ClientSession(connector=connector, loop=loop) loop.run_until_complete(make_sess()) assert re.match("Session and connector has to use same event loop", str(ctx.value)) def test_detach(session) -> None: conn = session.connector try: assert not conn.closed session.detach() assert session.connector is None assert session.closed assert not conn.closed finally: conn.close() async def test_request_closed_session(session) -> None: await session.close() with pytest.raises(RuntimeError): await session.request('get', '/') def test_close_flag_for_closed_connector(session) -> None: conn = session.connector assert not session.closed conn.close() assert session.closed async def test_double_close(connector, create_session) -> None: session = await create_session(connector=connector) await session.close() assert session.connector is None await session.close() assert session.closed assert connector.closed async def test_del(connector, loop) -> None: loop.set_debug(False) # N.B. 
don't use session fixture, it stores extra reference internally session = ClientSession(connector=connector, loop=loop) logs = [] loop.set_exception_handler(lambda loop, ctx: logs.append(ctx)) with pytest.warns(ResourceWarning): del session gc.collect() assert len(logs) == 1 expected = {'client_session': mock.ANY, 'message': 'Unclosed client session'} assert logs[0] == expected async def test_del_debug(connector, loop) -> None: loop.set_debug(True) # N.B. don't use session fixture, it stores extra reference internally session = ClientSession(connector=connector, loop=loop) logs = [] loop.set_exception_handler(lambda loop, ctx: logs.append(ctx)) with pytest.warns(ResourceWarning): del session gc.collect() assert len(logs) == 1 expected = {'client_session': mock.ANY, 'message': 'Unclosed client session', 'source_traceback': mock.ANY} assert logs[0] == expected async def test_context_manager(connector, loop) -> None: with pytest.raises(TypeError): with ClientSession(loop=loop, connector=connector) as session: pass assert session.closed async def test_borrow_connector_loop(connector, create_session, loop) -> None: session = ClientSession(connector=connector, loop=None) try: assert session._loop, loop finally: await session.close() async def test_reraise_os_error(create_session) -> None: err = OSError(1, "permission error") req = mock.Mock() req_factory = mock.Mock(return_value=req) req.send = mock.Mock(side_effect=err) session = await create_session(request_class=req_factory) async def create_connection(req, traces, timeout): # return self.transport, self.protocol return mock.Mock() session._connector._create_connection = create_connection session._connector._release = mock.Mock() with pytest.raises(aiohttp.ClientOSError) as ctx: await session.request('get', 'http://example.com') e = ctx.value assert e.errno == err.errno assert e.strerror == err.strerror async def test_close_conn_on_error(create_session) -> None: class UnexpectedException(BaseException): pass err = UnexpectedException("permission error") req = mock.Mock() req_factory = mock.Mock(return_value=req) req.send = mock.Mock(side_effect=err) session = await create_session(request_class=req_factory) connections = [] original_connect = session._connector.connect async def connect(req, traces, timeout): conn = await original_connect(req, traces, timeout) connections.append(conn) return conn async def create_connection(req, traces, timeout): # return self.transport, self.protocol conn = mock.Mock() return conn session._connector.connect = connect session._connector._create_connection = create_connection session._connector._release = mock.Mock() with pytest.raises(UnexpectedException): async with session.request('get', 'http://example.com') as resp: await resp.text() # normally called during garbage collection. 
triggers an exception # if the connection wasn't already closed for c in connections: c.__del__() async def test_cookie_jar_usage(loop, aiohttp_client) -> None: req_url = None jar = mock.Mock() jar.filter_cookies.return_value = None async def handler(request): nonlocal req_url req_url = "http://%s/" % request.host resp = web.Response() resp.set_cookie("response", "resp_value") return resp app = web.Application() app.router.add_route('GET', '/', handler) session = await aiohttp_client( app, cookies={"request": "req_value"}, cookie_jar=jar ) # Updating the cookie jar with initial user defined cookies jar.update_cookies.assert_called_with({"request": "req_value"}) jar.update_cookies.reset_mock() resp = await session.get("/") await resp.release() # Filtering the cookie jar before sending the request, # getting the request URL as only parameter jar.filter_cookies.assert_called_with(URL(req_url)) # Updating the cookie jar with the response cookies assert jar.update_cookies.called resp_cookies = jar.update_cookies.call_args[0][0] assert isinstance(resp_cookies, SimpleCookie) assert "response" in resp_cookies assert resp_cookies["response"].value == "resp_value" async def test_session_default_version(loop) -> None: session = aiohttp.ClientSession(loop=loop) assert session.version == aiohttp.HttpVersion11 async def test_session_loop(loop) -> None: session = aiohttp.ClientSession(loop=loop) with pytest.warns(DeprecationWarning): assert session.loop is loop await session.close() def test_proxy_str(session, params) -> None: with mock.patch( "aiohttp.client.ClientSession._request", new_callable=mock.MagicMock ) as patched: session.get("http://test.example.com", proxy='http://proxy.com', **params) assert patched.called, "`ClientSession._request` not called" assert list(patched.call_args) == [("GET", "http://test.example.com",), dict( allow_redirects=True, proxy='http://proxy.com', **params)] async def test_request_tracing(loop, aiohttp_client) -> None: async def handler(request): return web.json_response({'ok': True}) app = web.Application() app.router.add_post('/', handler) trace_config_ctx = mock.Mock() trace_request_ctx = {} body = 'This is request body' gathered_req_body = BytesIO() gathered_res_body = BytesIO() on_request_start = mock.Mock(side_effect=make_mocked_coro(mock.Mock())) on_request_redirect = mock.Mock(side_effect=make_mocked_coro(mock.Mock())) on_request_end = mock.Mock(side_effect=make_mocked_coro(mock.Mock())) async def on_request_chunk_sent(session, context, params): gathered_req_body.write(params.chunk) async def on_response_chunk_received(session, context, params): gathered_res_body.write(params.chunk) trace_config = aiohttp.TraceConfig( trace_config_ctx_factory=mock.Mock(return_value=trace_config_ctx) ) trace_config.on_request_start.append(on_request_start) trace_config.on_request_end.append(on_request_end) trace_config.on_request_chunk_sent.append(on_request_chunk_sent) trace_config.on_response_chunk_received.append(on_response_chunk_received) trace_config.on_request_redirect.append(on_request_redirect) session = await aiohttp_client(app, trace_configs=[trace_config]) async with session.post( '/', data=body, trace_request_ctx=trace_request_ctx) as resp: await resp.json() on_request_start.assert_called_once_with( session.session, trace_config_ctx, aiohttp.TraceRequestStartParams( hdrs.METH_POST, session.make_url('/'), CIMultiDict() ) ) on_request_end.assert_called_once_with( session.session, trace_config_ctx, aiohttp.TraceRequestEndParams( hdrs.METH_POST, session.make_url('/'), 
CIMultiDict(), resp ) ) assert not on_request_redirect.called assert gathered_req_body.getvalue() == body.encode('utf8') assert gathered_res_body.getvalue() == json.dumps( {'ok': True}).encode('utf8') async def test_request_tracing_exception(loop) -> None: on_request_end = mock.Mock(side_effect=make_mocked_coro(mock.Mock())) on_request_exception = mock.Mock( side_effect=make_mocked_coro(mock.Mock()) ) trace_config = aiohttp.TraceConfig() trace_config.on_request_end.append(on_request_end) trace_config.on_request_exception.append(on_request_exception) with mock.patch("aiohttp.client.TCPConnector.connect") as connect_patched: error = Exception() f = loop.create_future() f.set_exception(error) connect_patched.return_value = f session = aiohttp.ClientSession( loop=loop, trace_configs=[trace_config] ) try: await session.get('http://example.com') except Exception: pass on_request_exception.assert_called_once_with( session, mock.ANY, aiohttp.TraceRequestExceptionParams( hdrs.METH_GET, URL("http://example.com"), CIMultiDict(), error ) ) assert not on_request_end.called async def test_request_tracing_interpose_headers(loop, aiohttp_client) -> None: async def handler(request): return web.Response() app = web.Application() app.router.add_get('/', handler) class MyClientRequest(ClientRequest): headers = None def __init__(self, *args, **kwargs): super(MyClientRequest, self).__init__(*args, **kwargs) MyClientRequest.headers = self.headers async def new_headers( session, trace_config_ctx, data): data.headers['foo'] = 'bar' trace_config = aiohttp.TraceConfig() trace_config.on_request_start.append(new_headers) session = await aiohttp_client( app, request_class=MyClientRequest, trace_configs=[trace_config] ) await session.get('/') assert MyClientRequest.headers['foo'] == 'bar' @pytest.mark.skipif(not PY_36, reason="Python 3.6+ required") def test_client_session_inheritance() -> None: with pytest.warns(DeprecationWarning): class A(ClientSession): pass @pytest.mark.skipif(not DEBUG, reason="The check is applied in DEBUG mode only") async def test_client_session_custom_attr(loop) -> None: session = ClientSession(loop=loop) with pytest.warns(DeprecationWarning): session.custom = None async def test_client_session_timeout_args(loop) -> None: session1 = ClientSession(loop=loop) assert session1._timeout == client.DEFAULT_TIMEOUT with pytest.warns(DeprecationWarning): session2 = ClientSession(loop=loop, read_timeout=20*60, conn_timeout=30*60) assert session2._timeout == client.ClientTimeout(total=20*60, connect=30*60) with pytest.raises(ValueError): ClientSession(loop=loop, timeout=client.ClientTimeout(total=10*60), read_timeout=20*60) with pytest.raises(ValueError): ClientSession(loop=loop, timeout=client.ClientTimeout(total=10 * 60), conn_timeout=30 * 60) async def test_requote_redirect_url_default() -> None: session = ClientSession() assert session.requote_redirect_url async def test_requote_redirect_url_default_disable() -> None: session = ClientSession(requote_redirect_url=False) assert not session.requote_redirect_url async def test_requote_redirect_setter() -> None: session = ClientSession() assert session.requote_redirect_url with pytest.warns(DeprecationWarning): session.requote_redirect_url = False assert not session.requote_redirect_url aiohttp-3.6.2/tests/test_client_ws.py0000644000175100001650000006230713547410117020237 0ustar vstsdocker00000000000000import asyncio import base64 import hashlib import os from unittest import mock import pytest import aiohttp from aiohttp import client, hdrs from 
aiohttp.http import WS_KEY from aiohttp.log import ws_logger from aiohttp.streams import EofStream from aiohttp.test_utils import make_mocked_coro @pytest.fixture def key_data(): return os.urandom(16) @pytest.fixture def key(key_data): return base64.b64encode(key_data) @pytest.fixture def ws_key(key): return base64.b64encode(hashlib.sha1(key + WS_KEY).digest()).decode() async def test_ws_connect(ws_key, loop, key_data) -> None: resp = mock.Mock() resp.status = 101 resp.headers = { hdrs.UPGRADE: hdrs.WEBSOCKET, hdrs.CONNECTION: hdrs.UPGRADE, hdrs.SEC_WEBSOCKET_ACCEPT: ws_key, hdrs.SEC_WEBSOCKET_PROTOCOL: 'chat' } with mock.patch('aiohttp.client.os') as m_os: with mock.patch('aiohttp.client.ClientSession.request') as m_req: m_os.urandom.return_value = key_data m_req.return_value = loop.create_future() m_req.return_value.set_result(resp) res = await aiohttp.ClientSession(loop=loop).ws_connect( 'http://test.org', protocols=('t1', 't2', 'chat')) assert isinstance(res, client.ClientWebSocketResponse) assert res.protocol == 'chat' assert hdrs.ORIGIN not in m_req.call_args[1]["headers"] async def test_ws_connect_with_origin(key_data, loop) -> None: resp = mock.Mock() resp.status = 403 with mock.patch('aiohttp.client.os') as m_os: with mock.patch('aiohttp.client.ClientSession.request') as m_req: m_os.urandom.return_value = key_data m_req.return_value = loop.create_future() m_req.return_value.set_result(resp) origin = 'https://example.org/page.html' with pytest.raises(client.WSServerHandshakeError): await aiohttp.ClientSession(loop=loop).ws_connect( 'http://test.org', origin=origin) assert hdrs.ORIGIN in m_req.call_args[1]["headers"] assert m_req.call_args[1]["headers"][hdrs.ORIGIN] == origin async def test_ws_connect_custom_response(loop, ws_key, key_data) -> None: class CustomResponse(client.ClientWebSocketResponse): def read(self, decode=False): return 'customized!' resp = mock.Mock() resp.status = 101 resp.headers = { hdrs.UPGRADE: hdrs.WEBSOCKET, hdrs.CONNECTION: hdrs.UPGRADE, hdrs.SEC_WEBSOCKET_ACCEPT: ws_key, } with mock.patch('aiohttp.client.os') as m_os: with mock.patch('aiohttp.client.ClientSession.request') as m_req: m_os.urandom.return_value = key_data m_req.return_value = loop.create_future() m_req.return_value.set_result(resp) res = await aiohttp.ClientSession( ws_response_class=CustomResponse, loop=loop).ws_connect( 'http://test.org') assert res.read() == 'customized!' 
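# ---------------------------------------------------------------------------
# Illustrative sketch (not part of the original test module): the ws_key
# fixture above encodes the RFC 6455 handshake rule that a server answers the
# client's Sec-WebSocket-Key with base64(sha1(key + WS_KEY)), where WS_KEY is
# the fixed GUID b"258EAFA5-E914-47DA-95CA-C5AB0DC85B11" (imported here from
# aiohttp.http). The hypothetical helper below derives that accept value using
# only the base64/hashlib modules already imported at the top of this file;
# the handshake-error tests that follow effectively assert the failure cases
# (wrong status, wrong upgrade/connection headers, or a challenge mismatch).

def _expected_ws_accept(client_key: bytes) -> str:
    """Return the Sec-WebSocket-Accept value a compliant server should send
    back for the given base64-encoded Sec-WebSocket-Key."""
    magic = b"258EAFA5-E914-47DA-95CA-C5AB0DC85B11"  # RFC 6455 handshake GUID
    return base64.b64encode(hashlib.sha1(client_key + magic).digest()).decode()

# Usage sketch: _expected_ws_accept(base64.b64encode(os.urandom(16))) produces
# the same header value the ws_key fixture computes via aiohttp's WS_KEY.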
async def test_ws_connect_err_status(loop, ws_key, key_data) -> None: resp = mock.Mock() resp.status = 500 resp.headers = { hdrs.UPGRADE: hdrs.WEBSOCKET, hdrs.CONNECTION: hdrs.UPGRADE, hdrs.SEC_WEBSOCKET_ACCEPT: ws_key } with mock.patch('aiohttp.client.os') as m_os: with mock.patch('aiohttp.client.ClientSession.request') as m_req: m_os.urandom.return_value = key_data m_req.return_value = loop.create_future() m_req.return_value.set_result(resp) with pytest.raises(client.WSServerHandshakeError) as ctx: await aiohttp.ClientSession(loop=loop).ws_connect( 'http://test.org', protocols=('t1', 't2', 'chat')) assert ctx.value.message == 'Invalid response status' async def test_ws_connect_err_upgrade(loop, ws_key, key_data) -> None: resp = mock.Mock() resp.status = 101 resp.headers = { hdrs.UPGRADE: 'test', hdrs.CONNECTION: hdrs.UPGRADE, hdrs.SEC_WEBSOCKET_ACCEPT: ws_key } with mock.patch('aiohttp.client.os') as m_os: with mock.patch('aiohttp.client.ClientSession.request') as m_req: m_os.urandom.return_value = key_data m_req.return_value = loop.create_future() m_req.return_value.set_result(resp) with pytest.raises(client.WSServerHandshakeError) as ctx: await aiohttp.ClientSession(loop=loop).ws_connect( 'http://test.org', protocols=('t1', 't2', 'chat')) assert ctx.value.message == 'Invalid upgrade header' async def test_ws_connect_err_conn(loop, ws_key, key_data) -> None: resp = mock.Mock() resp.status = 101 resp.headers = { hdrs.UPGRADE: hdrs.WEBSOCKET, hdrs.CONNECTION: 'close', hdrs.SEC_WEBSOCKET_ACCEPT: ws_key } with mock.patch('aiohttp.client.os') as m_os: with mock.patch('aiohttp.client.ClientSession.request') as m_req: m_os.urandom.return_value = key_data m_req.return_value = loop.create_future() m_req.return_value.set_result(resp) with pytest.raises(client.WSServerHandshakeError) as ctx: await aiohttp.ClientSession(loop=loop).ws_connect( 'http://test.org', protocols=('t1', 't2', 'chat')) assert ctx.value.message == 'Invalid connection header' async def test_ws_connect_err_challenge(loop, ws_key, key_data) -> None: resp = mock.Mock() resp.status = 101 resp.headers = { hdrs.UPGRADE: hdrs.WEBSOCKET, hdrs.CONNECTION: hdrs.UPGRADE, hdrs.SEC_WEBSOCKET_ACCEPT: 'asdfasdfasdfasdfasdfasdf' } with mock.patch('aiohttp.client.os') as m_os: with mock.patch('aiohttp.client.ClientSession.request') as m_req: m_os.urandom.return_value = key_data m_req.return_value = loop.create_future() m_req.return_value.set_result(resp) with pytest.raises(client.WSServerHandshakeError) as ctx: await aiohttp.ClientSession(loop=loop).ws_connect( 'http://test.org', protocols=('t1', 't2', 'chat')) assert ctx.value.message == 'Invalid challenge response' async def test_ws_connect_common_headers(ws_key, loop, key_data) -> None: """Emulate a headers dict being reused for a second ws_connect. In this scenario, we need to ensure that the newly generated secret key is sent to the server, not the stale key. 
""" headers = {} async def test_connection() -> None: async def mock_get(*args, **kwargs): resp = mock.Mock() resp.status = 101 key = kwargs.get('headers').get(hdrs.SEC_WEBSOCKET_KEY) accept = base64.b64encode( hashlib.sha1(base64.b64encode(base64.b64decode(key)) + WS_KEY) .digest()).decode() resp.headers = { hdrs.UPGRADE: hdrs.WEBSOCKET, hdrs.CONNECTION: hdrs.UPGRADE, hdrs.SEC_WEBSOCKET_ACCEPT: accept, hdrs.SEC_WEBSOCKET_PROTOCOL: 'chat' } return resp with mock.patch('aiohttp.client.os') as m_os: with mock.patch('aiohttp.client.ClientSession.request', side_effect=mock_get) as m_req: m_os.urandom.return_value = key_data res = await aiohttp.ClientSession(loop=loop).ws_connect( 'http://test.org', protocols=('t1', 't2', 'chat'), headers=headers) assert isinstance(res, client.ClientWebSocketResponse) assert res.protocol == 'chat' assert hdrs.ORIGIN not in m_req.call_args[1]["headers"] await test_connection() # Generate a new ws key key_data = os.urandom(16) await test_connection() async def test_close(loop, ws_key, key_data) -> None: resp = mock.Mock() resp.status = 101 resp.headers = { hdrs.UPGRADE: hdrs.WEBSOCKET, hdrs.CONNECTION: hdrs.UPGRADE, hdrs.SEC_WEBSOCKET_ACCEPT: ws_key, } with mock.patch('aiohttp.client.WebSocketWriter') as WebSocketWriter: with mock.patch('aiohttp.client.os') as m_os: with mock.patch('aiohttp.client.ClientSession.request') as m_req: m_os.urandom.return_value = key_data m_req.return_value = loop.create_future() m_req.return_value.set_result(resp) writer = mock.Mock() WebSocketWriter.return_value = writer writer.close = make_mocked_coro() session = aiohttp.ClientSession(loop=loop) resp = await session.ws_connect( 'http://test.org') assert not resp.closed resp._reader.feed_data( aiohttp.WSMessage(aiohttp.WSMsgType.CLOSE, b'', b''), 0) res = await resp.close() writer.close.assert_called_with(1000, b'') assert resp.closed assert res assert resp.exception() is None # idempotent res = await resp.close() assert not res assert writer.close.call_count == 1 await session.close() async def test_close_eofstream(loop, ws_key, key_data) -> None: resp = mock.Mock() resp.status = 101 resp.headers = { hdrs.UPGRADE: hdrs.WEBSOCKET, hdrs.CONNECTION: hdrs.UPGRADE, hdrs.SEC_WEBSOCKET_ACCEPT: ws_key, } with mock.patch('aiohttp.client.WebSocketWriter') as WebSocketWriter: with mock.patch('aiohttp.client.os') as m_os: with mock.patch('aiohttp.client.ClientSession.request') as m_req: m_os.urandom.return_value = key_data m_req.return_value = loop.create_future() m_req.return_value.set_result(resp) writer = WebSocketWriter.return_value = mock.Mock() session = aiohttp.ClientSession(loop=loop) resp = await session.ws_connect('http://test.org') assert not resp.closed exc = EofStream() resp._reader.set_exception(exc) await resp.receive() writer.close.assert_called_with(1000, b'') assert resp.closed await session.close() async def test_close_exc(loop, ws_key, key_data) -> None: resp = mock.Mock() resp.status = 101 resp.headers = { hdrs.UPGRADE: hdrs.WEBSOCKET, hdrs.CONNECTION: hdrs.UPGRADE, hdrs.SEC_WEBSOCKET_ACCEPT: ws_key, } with mock.patch('aiohttp.client.WebSocketWriter') as WebSocketWriter: with mock.patch('aiohttp.client.os') as m_os: with mock.patch('aiohttp.client.ClientSession.request') as m_req: m_os.urandom.return_value = key_data m_req.return_value = loop.create_future() m_req.return_value.set_result(resp) writer = mock.Mock() WebSocketWriter.return_value = writer writer.close = make_mocked_coro() session = aiohttp.ClientSession(loop=loop) resp = await 
session.ws_connect('http://test.org') assert not resp.closed exc = ValueError() resp._reader.set_exception(exc) await resp.close() assert resp.closed assert resp.exception() is exc await session.close() async def test_close_exc2(loop, ws_key, key_data) -> None: resp = mock.Mock() resp.status = 101 resp.headers = { hdrs.UPGRADE: hdrs.WEBSOCKET, hdrs.CONNECTION: hdrs.UPGRADE, hdrs.SEC_WEBSOCKET_ACCEPT: ws_key, } with mock.patch('aiohttp.client.WebSocketWriter') as WebSocketWriter: with mock.patch('aiohttp.client.os') as m_os: with mock.patch('aiohttp.client.ClientSession.request') as m_req: m_os.urandom.return_value = key_data m_req.return_value = loop.create_future() m_req.return_value.set_result(resp) writer = WebSocketWriter.return_value = mock.Mock() resp = await aiohttp.ClientSession(loop=loop).ws_connect( 'http://test.org') assert not resp.closed exc = ValueError() writer.close.side_effect = exc await resp.close() assert resp.closed assert resp.exception() is exc resp._closed = False writer.close.side_effect = asyncio.CancelledError() with pytest.raises(asyncio.CancelledError): await resp.close() async def test_send_data_after_close(ws_key, key_data, loop, mocker) -> None: resp = mock.Mock() resp.status = 101 resp.headers = { hdrs.UPGRADE: hdrs.WEBSOCKET, hdrs.CONNECTION: hdrs.UPGRADE, hdrs.SEC_WEBSOCKET_ACCEPT: ws_key, } with mock.patch('aiohttp.client.os') as m_os: with mock.patch('aiohttp.client.ClientSession.request') as m_req: m_os.urandom.return_value = key_data m_req.return_value = loop.create_future() m_req.return_value.set_result(resp) resp = await aiohttp.ClientSession(loop=loop).ws_connect( 'http://test.org') resp._writer._closing = True mocker.spy(ws_logger, 'warning') for meth, args in ((resp.ping, ()), (resp.pong, ()), (resp.send_str, ('s',)), (resp.send_bytes, (b'b',)), (resp.send_json, ({},))): await meth(*args) assert ws_logger.warning.called ws_logger.warning.reset_mock() async def test_send_data_type_errors(ws_key, key_data, loop) -> None: resp = mock.Mock() resp.status = 101 resp.headers = { hdrs.UPGRADE: hdrs.WEBSOCKET, hdrs.CONNECTION: hdrs.UPGRADE, hdrs.SEC_WEBSOCKET_ACCEPT: ws_key, } with mock.patch('aiohttp.client.WebSocketWriter') as WebSocketWriter: with mock.patch('aiohttp.client.os') as m_os: with mock.patch('aiohttp.client.ClientSession.request') as m_req: m_os.urandom.return_value = key_data m_req.return_value = loop.create_future() m_req.return_value.set_result(resp) WebSocketWriter.return_value = mock.Mock() resp = await aiohttp.ClientSession(loop=loop).ws_connect( 'http://test.org') with pytest.raises(TypeError): await resp.send_str(b's') with pytest.raises(TypeError): await resp.send_bytes('b') with pytest.raises(TypeError): await resp.send_json(set()) async def test_reader_read_exception(ws_key, key_data, loop) -> None: hresp = mock.Mock() hresp.status = 101 hresp.headers = { hdrs.UPGRADE: hdrs.WEBSOCKET, hdrs.CONNECTION: hdrs.UPGRADE, hdrs.SEC_WEBSOCKET_ACCEPT: ws_key, } with mock.patch('aiohttp.client.WebSocketWriter') as WebSocketWriter: with mock.patch('aiohttp.client.os') as m_os: with mock.patch('aiohttp.client.ClientSession.request') as m_req: m_os.urandom.return_value = key_data m_req.return_value = loop.create_future() m_req.return_value.set_result(hresp) writer = mock.Mock() WebSocketWriter.return_value = writer writer.close = make_mocked_coro() session = aiohttp.ClientSession(loop=loop) resp = await session.ws_connect('http://test.org') exc = ValueError() resp._reader.set_exception(exc) msg = await resp.receive() assert msg.type == 
aiohttp.WSMsgType.ERROR assert resp.exception() is exc await session.close() async def test_receive_runtime_err(loop) -> None: resp = client.ClientWebSocketResponse( mock.Mock(), mock.Mock(), mock.Mock(), mock.Mock(), 10.0, True, True, loop) resp._waiting = True with pytest.raises(RuntimeError): await resp.receive() async def test_ws_connect_close_resp_on_err(loop, ws_key, key_data) -> None: resp = mock.Mock() resp.status = 500 resp.headers = { hdrs.UPGRADE: hdrs.WEBSOCKET, hdrs.CONNECTION: hdrs.UPGRADE, hdrs.SEC_WEBSOCKET_ACCEPT: ws_key } with mock.patch('aiohttp.client.os') as m_os: with mock.patch('aiohttp.client.ClientSession.request') as m_req: m_os.urandom.return_value = key_data m_req.return_value = loop.create_future() m_req.return_value.set_result(resp) with pytest.raises(client.WSServerHandshakeError): await aiohttp.ClientSession(loop=loop).ws_connect( 'http://test.org', protocols=('t1', 't2', 'chat')) resp.close.assert_called_with() async def test_ws_connect_non_overlapped_protocols(ws_key, loop, key_data) -> None: resp = mock.Mock() resp.status = 101 resp.headers = { hdrs.UPGRADE: hdrs.WEBSOCKET, hdrs.CONNECTION: hdrs.UPGRADE, hdrs.SEC_WEBSOCKET_ACCEPT: ws_key, hdrs.SEC_WEBSOCKET_PROTOCOL: 'other,another' } with mock.patch('aiohttp.client.os') as m_os: with mock.patch('aiohttp.client.ClientSession.request') as m_req: m_os.urandom.return_value = key_data m_req.return_value = loop.create_future() m_req.return_value.set_result(resp) res = await aiohttp.ClientSession(loop=loop).ws_connect( 'http://test.org', protocols=('t1', 't2', 'chat')) assert res.protocol is None async def test_ws_connect_non_overlapped_protocols_2(ws_key, loop, key_data) -> None: resp = mock.Mock() resp.status = 101 resp.headers = { hdrs.UPGRADE: hdrs.WEBSOCKET, hdrs.CONNECTION: hdrs.UPGRADE, hdrs.SEC_WEBSOCKET_ACCEPT: ws_key, hdrs.SEC_WEBSOCKET_PROTOCOL: 'other,another' } with mock.patch('aiohttp.client.os') as m_os: with mock.patch('aiohttp.client.ClientSession.request') as m_req: m_os.urandom.return_value = key_data m_req.return_value = loop.create_future() m_req.return_value.set_result(resp) connector = aiohttp.TCPConnector(loop=loop, force_close=True) res = await aiohttp.ClientSession( connector=connector, loop=loop).ws_connect( 'http://test.org', protocols=('t1', 't2', 'chat')) assert res.protocol is None del res async def test_ws_connect_deflate(loop, ws_key, key_data) -> None: resp = mock.Mock() resp.status = 101 resp.headers = { hdrs.UPGRADE: hdrs.WEBSOCKET, hdrs.CONNECTION: hdrs.UPGRADE, hdrs.SEC_WEBSOCKET_ACCEPT: ws_key, hdrs.SEC_WEBSOCKET_EXTENSIONS: 'permessage-deflate', } with mock.patch('aiohttp.client.os') as m_os: with mock.patch('aiohttp.client.ClientSession.request') as m_req: m_os.urandom.return_value = key_data m_req.return_value = loop.create_future() m_req.return_value.set_result(resp) res = await aiohttp.ClientSession(loop=loop).ws_connect( 'http://test.org', compress=15) assert res.compress == 15 assert res.client_notakeover is False async def test_ws_connect_deflate_per_message(loop, ws_key, key_data) -> None: resp = mock.Mock() resp.status = 101 resp.headers = { hdrs.UPGRADE: hdrs.WEBSOCKET, hdrs.CONNECTION: hdrs.UPGRADE, hdrs.SEC_WEBSOCKET_ACCEPT: ws_key, hdrs.SEC_WEBSOCKET_EXTENSIONS: 'permessage-deflate', } with mock.patch('aiohttp.client.WebSocketWriter') as WebSocketWriter: with mock.patch('aiohttp.client.os') as m_os: with mock.patch('aiohttp.client.ClientSession.request') as m_req: m_os.urandom.return_value = key_data m_req.return_value = loop.create_future() 
m_req.return_value.set_result(resp) writer = WebSocketWriter.return_value = mock.Mock() send = writer.send = make_mocked_coro() session = aiohttp.ClientSession(loop=loop) resp = await session.ws_connect('http://test.org') await resp.send_str('string', compress=-1) send.assert_called_with('string', binary=False, compress=-1) await resp.send_bytes(b'bytes', compress=15) send.assert_called_with(b'bytes', binary=True, compress=15) await resp.send_json([{}], compress=-9) send.assert_called_with('[{}]', binary=False, compress=-9) await session.close() async def test_ws_connect_deflate_server_not_support(loop, ws_key, key_data) -> None: resp = mock.Mock() resp.status = 101 resp.headers = { hdrs.UPGRADE: hdrs.WEBSOCKET, hdrs.CONNECTION: hdrs.UPGRADE, hdrs.SEC_WEBSOCKET_ACCEPT: ws_key, } with mock.patch('aiohttp.client.os') as m_os: with mock.patch('aiohttp.client.ClientSession.request') as m_req: m_os.urandom.return_value = key_data m_req.return_value = loop.create_future() m_req.return_value.set_result(resp) res = await aiohttp.ClientSession(loop=loop).ws_connect( 'http://test.org', compress=15) assert res.compress == 0 assert res.client_notakeover is False async def test_ws_connect_deflate_notakeover(loop, ws_key, key_data) -> None: resp = mock.Mock() resp.status = 101 resp.headers = { hdrs.UPGRADE: hdrs.WEBSOCKET, hdrs.CONNECTION: hdrs.UPGRADE, hdrs.SEC_WEBSOCKET_ACCEPT: ws_key, hdrs.SEC_WEBSOCKET_EXTENSIONS: 'permessage-deflate; ' 'client_no_context_takeover', } with mock.patch('aiohttp.client.os') as m_os: with mock.patch('aiohttp.client.ClientSession.request') as m_req: m_os.urandom.return_value = key_data m_req.return_value = loop.create_future() m_req.return_value.set_result(resp) res = await aiohttp.ClientSession(loop=loop).ws_connect( 'http://test.org', compress=15) assert res.compress == 15 assert res.client_notakeover is True async def test_ws_connect_deflate_client_wbits(loop, ws_key, key_data) -> None: resp = mock.Mock() resp.status = 101 resp.headers = { hdrs.UPGRADE: hdrs.WEBSOCKET, hdrs.CONNECTION: hdrs.UPGRADE, hdrs.SEC_WEBSOCKET_ACCEPT: ws_key, hdrs.SEC_WEBSOCKET_EXTENSIONS: 'permessage-deflate; ' 'client_max_window_bits=10', } with mock.patch('aiohttp.client.os') as m_os: with mock.patch('aiohttp.client.ClientSession.request') as m_req: m_os.urandom.return_value = key_data m_req.return_value = loop.create_future() m_req.return_value.set_result(resp) res = await aiohttp.ClientSession(loop=loop).ws_connect( 'http://test.org', compress=15) assert res.compress == 10 assert res.client_notakeover is False async def test_ws_connect_deflate_client_wbits_bad(loop, ws_key, key_data) -> None: resp = mock.Mock() resp.status = 101 resp.headers = { hdrs.UPGRADE: hdrs.WEBSOCKET, hdrs.CONNECTION: hdrs.UPGRADE, hdrs.SEC_WEBSOCKET_ACCEPT: ws_key, hdrs.SEC_WEBSOCKET_EXTENSIONS: 'permessage-deflate; ' 'client_max_window_bits=6', } with mock.patch('aiohttp.client.os') as m_os: with mock.patch('aiohttp.client.ClientSession.request') as m_req: m_os.urandom.return_value = key_data m_req.return_value = loop.create_future() m_req.return_value.set_result(resp) with pytest.raises(client.WSServerHandshakeError): await aiohttp.ClientSession(loop=loop).ws_connect( 'http://test.org', compress=15) async def test_ws_connect_deflate_server_ext_bad(loop, ws_key, key_data) -> None: resp = mock.Mock() resp.status = 101 resp.headers = { hdrs.UPGRADE: hdrs.WEBSOCKET, hdrs.CONNECTION: hdrs.UPGRADE, hdrs.SEC_WEBSOCKET_ACCEPT: ws_key, hdrs.SEC_WEBSOCKET_EXTENSIONS: 'permessage-deflate; bad', } with 
mock.patch('aiohttp.client.os') as m_os: with mock.patch('aiohttp.client.ClientSession.request') as m_req: m_os.urandom.return_value = key_data m_req.return_value = loop.create_future() m_req.return_value.set_result(resp) with pytest.raises(client.WSServerHandshakeError): await aiohttp.ClientSession(loop=loop).ws_connect( 'http://test.org', compress=15) aiohttp-3.6.2/tests/test_client_ws_functional.py0000644000175100001650000005173113547410117022460 0ustar vstsdocker00000000000000import asyncio import async_timeout import pytest import aiohttp from aiohttp import hdrs, web @pytest.fixture def ceil(mocker): def ceil(val): return val mocker.patch('aiohttp.helpers.ceil').side_effect = ceil async def test_send_recv_text(aiohttp_client) -> None: async def handler(request): ws = web.WebSocketResponse() await ws.prepare(request) msg = await ws.receive_str() await ws.send_str(msg+'/answer') await ws.close() return ws app = web.Application() app.router.add_route('GET', '/', handler) client = await aiohttp_client(app) resp = await client.ws_connect('/') await resp.send_str('ask') assert resp.get_extra_info('socket') is not None data = await resp.receive_str() assert data == 'ask/answer' await resp.close() assert resp.get_extra_info('socket') is None async def test_send_recv_bytes_bad_type(aiohttp_client) -> None: async def handler(request): ws = web.WebSocketResponse() await ws.prepare(request) msg = await ws.receive_str() await ws.send_str(msg+'/answer') await ws.close() return ws app = web.Application() app.router.add_route('GET', '/', handler) client = await aiohttp_client(app) resp = await client.ws_connect('/') await resp.send_str('ask') with pytest.raises(TypeError): await resp.receive_bytes() await resp.close() async def test_send_recv_bytes(aiohttp_client) -> None: async def handler(request): ws = web.WebSocketResponse() await ws.prepare(request) msg = await ws.receive_bytes() await ws.send_bytes(msg+b'/answer') await ws.close() return ws app = web.Application() app.router.add_route('GET', '/', handler) client = await aiohttp_client(app) resp = await client.ws_connect('/') await resp.send_bytes(b'ask') data = await resp.receive_bytes() assert data == b'ask/answer' await resp.close() async def test_send_recv_text_bad_type(aiohttp_client) -> None: async def handler(request): ws = web.WebSocketResponse() await ws.prepare(request) msg = await ws.receive_bytes() await ws.send_bytes(msg+b'/answer') await ws.close() return ws app = web.Application() app.router.add_route('GET', '/', handler) client = await aiohttp_client(app) resp = await client.ws_connect('/') await resp.send_bytes(b'ask') with pytest.raises(TypeError): await resp.receive_str() await resp.close() async def test_send_recv_json(aiohttp_client) -> None: async def handler(request): ws = web.WebSocketResponse() await ws.prepare(request) data = await ws.receive_json() await ws.send_json({'response': data['request']}) await ws.close() return ws app = web.Application() app.router.add_route('GET', '/', handler) client = await aiohttp_client(app) resp = await client.ws_connect('/') payload = {'request': 'test'} await resp.send_json(payload) data = await resp.receive_json() assert data['response'] == payload['request'] await resp.close() async def test_ping_pong(aiohttp_client) -> None: loop = asyncio.get_event_loop() closed = loop.create_future() async def handler(request): ws = web.WebSocketResponse() await ws.prepare(request) msg = await ws.receive_bytes() await ws.ping() await ws.send_bytes(msg+b'/answer') try: await ws.close() finally: 
closed.set_result(1) return ws app = web.Application() app.router.add_route('GET', '/', handler) client = await aiohttp_client(app) resp = await client.ws_connect('/') await resp.ping() await resp.send_bytes(b'ask') msg = await resp.receive() assert msg.type == aiohttp.WSMsgType.BINARY assert msg.data == b'ask/answer' msg = await resp.receive() assert msg.type == aiohttp.WSMsgType.CLOSE await resp.close() await closed async def test_ping_pong_manual(aiohttp_client) -> None: loop = asyncio.get_event_loop() closed = loop.create_future() async def handler(request): ws = web.WebSocketResponse() await ws.prepare(request) msg = await ws.receive_bytes() await ws.ping() await ws.send_bytes(msg+b'/answer') try: await ws.close() finally: closed.set_result(1) return ws app = web.Application() app.router.add_route('GET', '/', handler) client = await aiohttp_client(app) resp = await client.ws_connect('/', autoping=False) await resp.ping() await resp.send_bytes(b'ask') msg = await resp.receive() assert msg.type == aiohttp.WSMsgType.PONG msg = await resp.receive() assert msg.type == aiohttp.WSMsgType.PING await resp.pong() msg = await resp.receive() assert msg.data == b'ask/answer' msg = await resp.receive() assert msg.type == aiohttp.WSMsgType.CLOSE await closed async def test_close(aiohttp_client) -> None: async def handler(request): ws = web.WebSocketResponse() await ws.prepare(request) await ws.receive_bytes() await ws.send_str('test') await ws.receive() return ws app = web.Application() app.router.add_route('GET', '/', handler) client = await aiohttp_client(app) resp = await client.ws_connect('/') await resp.send_bytes(b'ask') closed = await resp.close() assert closed assert resp.closed assert resp.close_code == 1000 msg = await resp.receive() assert msg.type == aiohttp.WSMsgType.CLOSED async def test_concurrent_close(aiohttp_client) -> None: client_ws = None async def handler(request): nonlocal client_ws ws = web.WebSocketResponse() await ws.prepare(request) await ws.receive_bytes() await ws.send_str('test') await client_ws.close() msg = await ws.receive() assert msg.type == aiohttp.WSMsgType.CLOSE return ws app = web.Application() app.router.add_route('GET', '/', handler) client = await aiohttp_client(app) ws = client_ws = await client.ws_connect('/') await ws.send_bytes(b'ask') msg = await ws.receive() assert msg.type == aiohttp.WSMsgType.CLOSING await asyncio.sleep(0.01) msg = await ws.receive() assert msg.type == aiohttp.WSMsgType.CLOSED async def test_close_from_server(aiohttp_client) -> None: loop = asyncio.get_event_loop() closed = loop.create_future() async def handler(request): ws = web.WebSocketResponse() await ws.prepare(request) try: await ws.receive_bytes() await ws.close() finally: closed.set_result(1) return ws app = web.Application() app.router.add_route('GET', '/', handler) client = await aiohttp_client(app) resp = await client.ws_connect('/') await resp.send_bytes(b'ask') msg = await resp.receive() assert msg.type == aiohttp.WSMsgType.CLOSE assert resp.closed msg = await resp.receive() assert msg.type == aiohttp.WSMsgType.CLOSED await closed async def test_close_manual(aiohttp_client) -> None: loop = asyncio.get_event_loop() closed = loop.create_future() async def handler(request): ws = web.WebSocketResponse() await ws.prepare(request) await ws.receive_bytes() await ws.send_str('test') try: await ws.close() finally: closed.set_result(1) return ws app = web.Application() app.router.add_route('GET', '/', handler) client = await aiohttp_client(app) resp = await client.ws_connect('/', 
autoclose=False) await resp.send_bytes(b'ask') msg = await resp.receive() assert msg.data == 'test' msg = await resp.receive() assert msg.type == aiohttp.WSMsgType.CLOSE assert msg.data == 1000 assert msg.extra == '' assert not resp.closed await resp.close() await closed assert resp.closed async def test_close_timeout(aiohttp_client) -> None: async def handler(request): ws = web.WebSocketResponse() await ws.prepare(request) await ws.receive_bytes() await ws.send_str('test') await asyncio.sleep(1) return ws app = web.Application() app.router.add_route('GET', '/', handler) client = await aiohttp_client(app) resp = await client.ws_connect('/', timeout=0.2, autoclose=False) await resp.send_bytes(b'ask') msg = await resp.receive() assert msg.data == 'test' assert msg.type == aiohttp.WSMsgType.TEXT msg = await resp.close() assert resp.closed assert isinstance(resp.exception(), asyncio.TimeoutError) async def test_close_cancel(aiohttp_client) -> None: loop = asyncio.get_event_loop() async def handler(request): ws = web.WebSocketResponse() await ws.prepare(request) await ws.receive_bytes() await ws.send_str('test') await asyncio.sleep(10) app = web.Application() app.router.add_route('GET', '/', handler) client = await aiohttp_client(app) resp = await client.ws_connect('/', autoclose=False) await resp.send_bytes(b'ask') text = await resp.receive() assert text.data == 'test' t = loop.create_task(resp.close()) await asyncio.sleep(0.1) t.cancel() await asyncio.sleep(0.1) assert resp.closed assert resp.exception() is None async def test_override_default_headers(aiohttp_client) -> None: async def handler(request): assert request.headers[hdrs.SEC_WEBSOCKET_VERSION] == '8' ws = web.WebSocketResponse() await ws.prepare(request) await ws.send_str('answer') await ws.close() return ws app = web.Application() app.router.add_route('GET', '/', handler) headers = {hdrs.SEC_WEBSOCKET_VERSION: '8'} client = await aiohttp_client(app) resp = await client.ws_connect('/', headers=headers) msg = await resp.receive() assert msg.data == 'answer' await resp.close() async def test_additional_headers(aiohttp_client) -> None: async def handler(request): assert request.headers['x-hdr'] == 'xtra' ws = web.WebSocketResponse() await ws.prepare(request) await ws.send_str('answer') await ws.close() return ws app = web.Application() app.router.add_route('GET', '/', handler) client = await aiohttp_client(app) resp = await client.ws_connect('/', headers={'x-hdr': 'xtra'}) msg = await resp.receive() assert msg.data == 'answer' await resp.close() async def test_recv_protocol_error(aiohttp_client) -> None: async def handler(request): ws = web.WebSocketResponse() await ws.prepare(request) await ws.receive_str() ws._writer.transport.write(b'01234' * 100) await ws.close() return ws app = web.Application() app.router.add_route('GET', '/', handler) client = await aiohttp_client(app) resp = await client.ws_connect('/') await resp.send_str('ask') msg = await resp.receive() assert msg.type == aiohttp.WSMsgType.ERROR assert type(msg.data) is aiohttp.WebSocketError assert msg.data.code == aiohttp.WSCloseCode.PROTOCOL_ERROR assert str(msg.data) == 'Received frame with non-zero reserved bits' assert msg.extra is None await resp.close() async def test_recv_timeout(aiohttp_client) -> None: async def handler(request): ws = web.WebSocketResponse() await ws.prepare(request) await ws.receive_str() await asyncio.sleep(0.1) await ws.close() return ws app = web.Application() app.router.add_route('GET', '/', handler) client = await aiohttp_client(app) resp = 
await client.ws_connect('/') await resp.send_str('ask') with pytest.raises(asyncio.TimeoutError): with async_timeout.timeout(0.01): await resp.receive() await resp.close() async def test_receive_timeout(aiohttp_client) -> None: async def handler(request): ws = web.WebSocketResponse() await ws.prepare(request) await ws.receive() await ws.close() return ws app = web.Application() app.router.add_route('GET', '/', handler) client = await aiohttp_client(app) resp = await client.ws_connect('/', receive_timeout=0.1) with pytest.raises(asyncio.TimeoutError): await resp.receive(0.05) await resp.close() async def test_custom_receive_timeout(aiohttp_client) -> None: async def handler(request): ws = web.WebSocketResponse() await ws.prepare(request) await ws.receive() await ws.close() return ws app = web.Application() app.router.add_route('GET', '/', handler) client = await aiohttp_client(app) resp = await client.ws_connect('/') with pytest.raises(asyncio.TimeoutError): await resp.receive(0.05) await resp.close() async def test_heartbeat(aiohttp_client, ceil) -> None: ping_received = False async def handler(request): nonlocal ping_received ws = web.WebSocketResponse(autoping=False) await ws.prepare(request) msg = await ws.receive() if msg.type == aiohttp.WSMsgType.ping: ping_received = True await ws.close() return ws app = web.Application() app.router.add_route('GET', '/', handler) client = await aiohttp_client(app) resp = await client.ws_connect('/', heartbeat=0.01) await asyncio.sleep(0.1) await resp.receive() await resp.close() assert ping_received async def test_heartbeat_no_pong(aiohttp_client, ceil) -> None: ping_received = False async def handler(request): nonlocal ping_received ws = web.WebSocketResponse(autoping=False) await ws.prepare(request) msg = await ws.receive() if msg.type == aiohttp.WSMsgType.ping: ping_received = True await ws.receive() return ws app = web.Application() app.router.add_route('GET', '/', handler) client = await aiohttp_client(app) resp = await client.ws_connect('/', heartbeat=0.05) await resp.receive() await resp.receive() assert ping_received async def test_send_recv_compress(aiohttp_client) -> None: async def handler(request): ws = web.WebSocketResponse() await ws.prepare(request) msg = await ws.receive_str() await ws.send_str(msg+'/answer') await ws.close() return ws app = web.Application() app.router.add_route('GET', '/', handler) client = await aiohttp_client(app) resp = await client.ws_connect('/', compress=15) await resp.send_str('ask') assert resp.compress == 15 data = await resp.receive_str() assert data == 'ask/answer' await resp.close() assert resp.get_extra_info('socket') is None async def test_send_recv_compress_wbits(aiohttp_client) -> None: async def handler(request): ws = web.WebSocketResponse() await ws.prepare(request) msg = await ws.receive_str() await ws.send_str(msg+'/answer') await ws.close() return ws app = web.Application() app.router.add_route('GET', '/', handler) client = await aiohttp_client(app) resp = await client.ws_connect('/', compress=9) await resp.send_str('ask') # Client indicates supports wbits 15 # Server supports wbit 15 for decode assert resp.compress == 15 data = await resp.receive_str() assert data == 'ask/answer' await resp.close() assert resp.get_extra_info('socket') is None async def test_send_recv_compress_wbit_error(aiohttp_client) -> None: async def handler(request): ws = web.WebSocketResponse() await ws.prepare(request) msg = await ws.receive_bytes() await ws.send_bytes(msg+b'/answer') await ws.close() return ws app = 
web.Application() app.router.add_route('GET', '/', handler) client = await aiohttp_client(app) with pytest.raises(ValueError): await client.ws_connect('/', compress=1) async def test_ws_client_async_for(aiohttp_client) -> None: items = ['q1', 'q2', 'q3'] async def handler(request): ws = web.WebSocketResponse() await ws.prepare(request) for i in items: await ws.send_str(i) await ws.close() return ws app = web.Application() app.router.add_route('GET', '/', handler) client = await aiohttp_client(app) resp = await client.ws_connect('/') it = iter(items) async for msg in resp: assert msg.data == next(it) with pytest.raises(StopIteration): next(it) assert resp.closed async def test_ws_async_with(aiohttp_server) -> None: async def handler(request): ws = web.WebSocketResponse() await ws.prepare(request) msg = await ws.receive() await ws.send_str(msg.data + '/answer') await ws.close() return ws app = web.Application() app.router.add_route('GET', '/', handler) server = await aiohttp_server(app) async with aiohttp.ClientSession() as client: async with client.ws_connect(server.make_url('/')) as ws: await ws.send_str('request') msg = await ws.receive() assert msg.data == 'request/answer' assert ws.closed async def test_ws_async_with_send(aiohttp_server) -> None: # send_xxx methods have to return awaitable objects async def handler(request): ws = web.WebSocketResponse() await ws.prepare(request) msg = await ws.receive() await ws.send_str(msg.data + '/answer') await ws.close() return ws app = web.Application() app.router.add_route('GET', '/', handler) server = await aiohttp_server(app) async with aiohttp.ClientSession() as client: async with client.ws_connect(server.make_url('/')) as ws: await ws.send_str('request') msg = await ws.receive() assert msg.data == 'request/answer' assert ws.closed async def test_ws_async_with_shortcut(aiohttp_server) -> None: async def handler(request): ws = web.WebSocketResponse() await ws.prepare(request) msg = await ws.receive() await ws.send_str(msg.data + '/answer') await ws.close() return ws app = web.Application() app.router.add_route('GET', '/', handler) server = await aiohttp_server(app) async with aiohttp.ClientSession() as client: async with client.ws_connect(server.make_url('/')) as ws: await ws.send_str('request') msg = await ws.receive() assert msg.data == 'request/answer' assert ws.closed async def test_closed_async_for(aiohttp_client) -> None: loop = asyncio.get_event_loop() closed = loop.create_future() async def handler(request): ws = web.WebSocketResponse() await ws.prepare(request) try: await ws.send_bytes(b'started') await ws.receive_bytes() finally: closed.set_result(1) return ws app = web.Application() app.router.add_route('GET', '/', handler) client = await aiohttp_client(app) resp = await client.ws_connect('/') messages = [] async for msg in resp: messages.append(msg) if b'started' == msg.data: await resp.send_bytes(b'ask') await resp.close() assert 1 == len(messages) assert messages[0].type == aiohttp.WSMsgType.BINARY assert messages[0].data == b'started' assert resp.closed await closed async def test_peer_connection_lost(aiohttp_client) -> None: async def handler(request): ws = web.WebSocketResponse() await ws.prepare(request) msg = await ws.receive_str() assert msg == 'ask' await ws.send_str('answer') request.transport.close() await asyncio.sleep(10) return ws app = web.Application() app.router.add_route('GET', '/', handler) client = await aiohttp_client(app) resp = await client.ws_connect('/') await resp.send_str('ask') assert 'answer' == await 
resp.receive_str() msg = await resp.receive() assert msg.type == aiohttp.WSMsgType.CLOSED await resp.close() async def test_peer_connection_lost_iter(aiohttp_client) -> None: async def handler(request): ws = web.WebSocketResponse() await ws.prepare(request) msg = await ws.receive_str() assert msg == 'ask' await ws.send_str('answer') request.transport.close() await asyncio.sleep(100) return ws app = web.Application() app.router.add_route('GET', '/', handler) client = await aiohttp_client(app) resp = await client.ws_connect('/') await resp.send_str('ask') async for msg in resp: assert 'answer' == msg.data await resp.close() aiohttp-3.6.2/tests/test_connector.py0000644000175100001650000020503713547410117020241 0ustar vstsdocker00000000000000"""Tests of http client with custom Connector""" import asyncio import gc import hashlib import platform import socket import ssl import sys import uuid from collections import deque from unittest import mock import pytest from yarl import URL import aiohttp from aiohttp import client, web from aiohttp.client import ClientRequest, ClientTimeout from aiohttp.client_reqrep import ConnectionKey from aiohttp.connector import Connection, TCPConnector, _DNSCacheTable from aiohttp.helpers import PY_37 from aiohttp.locks import EventResultOrError from aiohttp.test_utils import make_mocked_coro, unused_port from aiohttp.tracing import Trace @pytest.fixture() def key(): """Connection key""" return ConnectionKey('localhost', 80, False, None, None, None, None) @pytest.fixture def key2(): """Connection key""" return ConnectionKey('localhost', 80, False, None, None, None, None) @pytest.fixture def ssl_key(): """Connection key""" return ConnectionKey('localhost', 80, True, None, None, None, None) @pytest.fixture def unix_sockname(shorttmpdir): sock_path = shorttmpdir / 'socket.sock' return str(sock_path) @pytest.fixture def unix_server(loop, unix_sockname): runners = [] async def go(app): runner = web.AppRunner(app) runners.append(runner) await runner.setup() site = web.UnixSite(runner, unix_sockname) await site.start() yield go for runner in runners: loop.run_until_complete(runner.cleanup()) @pytest.fixture def named_pipe_server(proactor_loop, pipe_name): runners = [] async def go(app): runner = web.AppRunner(app) runners.append(runner) await runner.setup() site = web.NamedPipeSite(runner, pipe_name) await site.start() yield go for runner in runners: proactor_loop.run_until_complete(runner.cleanup()) def create_mocked_conn(conn_closing_result=None, **kwargs): loop = asyncio.get_event_loop() proto = mock.Mock(**kwargs) proto.closed = loop.create_future() proto.closed.set_result(conn_closing_result) return proto def test_connection_del(loop) -> None: connector = mock.Mock() key = mock.Mock() protocol = mock.Mock() loop.set_debug(0) conn = Connection(connector, key, protocol, loop=loop) exc_handler = mock.Mock() loop.set_exception_handler(exc_handler) with pytest.warns(ResourceWarning): del conn gc.collect() connector._release.assert_called_with( key, protocol, should_close=True ) msg = { 'message': mock.ANY, 'client_connection': mock.ANY, } exc_handler.assert_called_with(loop, msg) def test_connection_del_loop_debug(loop) -> None: connector = mock.Mock() key = mock.Mock() protocol = mock.Mock() loop.set_debug(1) conn = Connection(connector, key, protocol, loop=loop) exc_handler = mock.Mock() loop.set_exception_handler(exc_handler) with pytest.warns(ResourceWarning): del conn gc.collect() msg = { 'message': mock.ANY, 'client_connection': mock.ANY, 'source_traceback': 
mock.ANY } exc_handler.assert_called_with(loop, msg) def test_connection_del_loop_closed(loop) -> None: connector = mock.Mock() key = mock.Mock() protocol = mock.Mock() loop.set_debug(1) conn = Connection(connector, key, protocol, loop=loop) exc_handler = mock.Mock() loop.set_exception_handler(exc_handler) loop.close() with pytest.warns(ResourceWarning): del conn gc.collect() assert not connector._release.called assert not exc_handler.called async def test_del(loop) -> None: conn = aiohttp.BaseConnector() proto = mock.Mock(should_close=False) conn._release('a', proto) conns_impl = conn._conns exc_handler = mock.Mock() loop.set_exception_handler(exc_handler) with pytest.warns(ResourceWarning): del conn gc.collect() assert not conns_impl proto.close.assert_called_with() msg = {'connector': mock.ANY, # conn was deleted 'connections': mock.ANY, 'message': 'Unclosed connector'} if loop.get_debug(): msg['source_traceback'] = mock.ANY exc_handler.assert_called_with(loop, msg) @pytest.mark.xfail async def test_del_with_scheduled_cleanup(loop) -> None: loop.set_debug(True) conn = aiohttp.BaseConnector(loop=loop, keepalive_timeout=0.01) transp = mock.Mock() conn._conns['a'] = [(transp, 123)] conns_impl = conn._conns exc_handler = mock.Mock() loop.set_exception_handler(exc_handler) with pytest.warns(ResourceWarning): # obviously doesn't deletion because loop has a strong # reference to connector's instance method, isn't it? del conn await asyncio.sleep(0.01, loop=loop) gc.collect() assert not conns_impl transp.close.assert_called_with() msg = {'connector': mock.ANY, # conn was deleted 'message': 'Unclosed connector'} if loop.get_debug(): msg['source_traceback'] = mock.ANY exc_handler.assert_called_with(loop, msg) @pytest.mark.skipif(sys.implementation.name != 'cpython', reason="CPython GC is required for the test") def test_del_with_closed_loop(loop) -> None: async def make_conn(): return aiohttp.BaseConnector() conn = loop.run_until_complete(make_conn()) transp = mock.Mock() conn._conns['a'] = [(transp, 123)] conns_impl = conn._conns exc_handler = mock.Mock() loop.set_exception_handler(exc_handler) loop.close() with pytest.warns(ResourceWarning): del conn gc.collect() assert not conns_impl assert not transp.close.called assert exc_handler.called async def test_del_empty_connector(loop) -> None: conn = aiohttp.BaseConnector(loop=loop) exc_handler = mock.Mock() loop.set_exception_handler(exc_handler) del conn assert not exc_handler.called async def test_create_conn(loop) -> None: conn = aiohttp.BaseConnector(loop=loop) with pytest.raises(NotImplementedError): await conn._create_connection(object(), [], object()) async def test_context_manager(loop) -> None: conn = aiohttp.BaseConnector(loop=loop) with pytest.warns(DeprecationWarning): with conn as c: assert conn is c assert conn.closed async def test_async_context_manager(loop) -> None: conn = aiohttp.BaseConnector(loop=loop) async with conn as c: assert conn is c assert conn.closed async def test_close(loop) -> None: proto = mock.Mock() conn = aiohttp.BaseConnector(loop=loop) assert not conn.closed conn._conns[('host', 8080, False)] = [(proto, object())] conn.close() assert not conn._conns assert proto.close.called assert conn.closed async def test_get(loop) -> None: conn = aiohttp.BaseConnector(loop=loop) assert conn._get(1) is None proto = mock.Mock() conn._conns[1] = [(proto, loop.time())] assert conn._get(1) == proto conn.close() async def test_get_expired(loop) -> None: conn = aiohttp.BaseConnector(loop=loop) key = ConnectionKey('localhost', 80, 
False, None, None, None, None) assert conn._get(key) is None proto = mock.Mock() conn._conns[key] = [(proto, loop.time() - 1000)] assert conn._get(key) is None assert not conn._conns conn.close() async def test_get_expired_ssl(loop) -> None: conn = aiohttp.BaseConnector(loop=loop, enable_cleanup_closed=True) key = ConnectionKey('localhost', 80, True, None, None, None, None) assert conn._get(key) is None proto = mock.Mock() transport = proto.transport conn._conns[key] = [(proto, loop.time() - 1000)] assert conn._get(key) is None assert not conn._conns assert conn._cleanup_closed_transports == [transport] conn.close() async def test_release_acquired(loop, key) -> None: proto = mock.Mock() conn = aiohttp.BaseConnector(loop=loop, limit=5) conn._release_waiter = mock.Mock() conn._acquired.add(proto) conn._acquired_per_host[key].add(proto) conn._release_acquired(key, proto) assert 0 == len(conn._acquired) assert 0 == len(conn._acquired_per_host) assert conn._release_waiter.called conn._release_acquired(key, proto) assert 0 == len(conn._acquired) assert 0 == len(conn._acquired_per_host) conn.close() async def test_release_acquired_closed(loop, key) -> None: proto = mock.Mock() conn = aiohttp.BaseConnector(loop=loop, limit=5) conn._release_waiter = mock.Mock() conn._acquired.add(proto) conn._acquired_per_host[key].add(proto) conn._closed = True conn._release_acquired(key, proto) assert 1 == len(conn._acquired) assert 1 == len(conn._acquired_per_host[key]) assert not conn._release_waiter.called conn.close() async def test_release(loop, key) -> None: conn = aiohttp.BaseConnector(loop=loop) conn._release_waiter = mock.Mock() proto = mock.Mock(should_close=False) conn._acquired.add(proto) conn._acquired_per_host[key].add(proto) conn._release(key, proto) assert conn._release_waiter.called assert conn._conns[key][0][0] == proto assert conn._conns[key][0][1] == pytest.approx(loop.time(), abs=0.1) assert not conn._cleanup_closed_transports conn.close() async def test_release_ssl_transport(loop, ssl_key) -> None: conn = aiohttp.BaseConnector(loop=loop, enable_cleanup_closed=True) conn._release_waiter = mock.Mock() proto = mock.Mock() transport = proto.transport conn._acquired.add(proto) conn._acquired_per_host[ssl_key].add(proto) conn._release(ssl_key, proto, should_close=True) assert conn._cleanup_closed_transports == [transport] conn.close() async def test_release_already_closed(loop) -> None: conn = aiohttp.BaseConnector(loop=loop) proto = mock.Mock() key = 1 conn._acquired.add(proto) conn.close() conn._release_waiters = mock.Mock() conn._release_acquired = mock.Mock() conn._release(key, proto) assert not conn._release_waiters.called assert not conn._release_acquired.called async def test_release_waiter_no_limit(loop, key, key2) -> None: # limit is 0 conn = aiohttp.BaseConnector(limit=0, loop=loop) w = mock.Mock() w.done.return_value = False conn._waiters[key].append(w) conn._release_waiter() assert len(conn._waiters[key]) == 0 assert w.done.called conn.close() async def test_release_waiter_first_available(loop, key, key2) -> None: conn = aiohttp.BaseConnector(loop=loop) w1, w2 = mock.Mock(), mock.Mock() w1.done.return_value = False w2.done.return_value = False conn._waiters[key].append(w2) conn._waiters[key2].append(w1) conn._release_waiter() assert (w1.set_result.called and not w2.set_result.called or not w1.set_result.called and w2.set_result.called) conn.close() async def test_release_waiter_release_first(loop, key, key2) -> None: conn = aiohttp.BaseConnector(loop=loop, limit=1) w1, w2 = 
mock.Mock(), mock.Mock() w1.done.return_value = False w2.done.return_value = False conn._waiters[key] = deque([w1, w2]) conn._release_waiter() assert w1.set_result.called assert not w2.set_result.called conn.close() async def test_release_waiter_skip_done_waiter(loop, key, key2) -> None: conn = aiohttp.BaseConnector(loop=loop, limit=1) w1, w2 = mock.Mock(), mock.Mock() w1.done.return_value = True w2.done.return_value = False conn._waiters[key] = deque([w1, w2]) conn._release_waiter() assert not w1.set_result.called assert w2.set_result.called conn.close() async def test_release_waiter_per_host(loop, key, key2) -> None: # no limit conn = aiohttp.BaseConnector(loop=loop, limit=0, limit_per_host=2) w1, w2 = mock.Mock(), mock.Mock() w1.done.return_value = False w2.done.return_value = False conn._waiters[key] = deque([w1]) conn._waiters[key2] = deque([w2]) conn._release_waiter() assert ((w1.set_result.called and not w2.set_result.called) or (not w1.set_result.called and w2.set_result.called)) conn.close() async def test_release_waiter_no_available(loop, key, key2) -> None: # limit is 0 conn = aiohttp.BaseConnector(limit=0, loop=loop) w = mock.Mock() w.done.return_value = False conn._waiters[key].append(w) conn._available_connections = mock.Mock(return_value=0) conn._release_waiter() assert len(conn._waiters) == 1 assert not w.done.called conn.close() async def test_release_close(loop, key) -> None: conn = aiohttp.BaseConnector(loop=loop) proto = mock.Mock(should_close=True) conn._acquired.add(proto) conn._release(key, proto) assert not conn._conns assert proto.close.called async def test__drop_acquire_per_host1(loop) -> None: conn = aiohttp.BaseConnector(loop=loop) conn._drop_acquired_per_host(123, 456) assert len(conn._acquired_per_host) == 0 async def test__drop_acquire_per_host2(loop) -> None: conn = aiohttp.BaseConnector(loop=loop) conn._acquired_per_host[123].add(456) conn._drop_acquired_per_host(123, 456) assert len(conn._acquired_per_host) == 0 async def test__drop_acquire_per_host3(loop) -> None: conn = aiohttp.BaseConnector(loop=loop) conn._acquired_per_host[123].add(456) conn._acquired_per_host[123].add(789) conn._drop_acquired_per_host(123, 456) assert len(conn._acquired_per_host) == 1 assert conn._acquired_per_host[123] == {789} async def test_tcp_connector_certificate_error(loop) -> None: req = ClientRequest('GET', URL('https://127.0.0.1:443'), loop=loop) async def certificate_error(*args, **kwargs): raise ssl.CertificateError conn = aiohttp.TCPConnector(loop=loop) conn._loop.create_connection = certificate_error with pytest.raises(aiohttp.ClientConnectorCertificateError) as ctx: await conn.connect(req, [], ClientTimeout()) assert isinstance(ctx.value, ssl.CertificateError) assert isinstance(ctx.value.certificate_error, ssl.CertificateError) assert isinstance(ctx.value, aiohttp.ClientSSLError) async def test_tcp_connector_multiple_hosts_errors(loop) -> None: conn = aiohttp.TCPConnector(loop=loop) ip1 = '192.168.1.1' ip2 = '192.168.1.2' ip3 = '192.168.1.3' ip4 = '192.168.1.4' ip5 = '192.168.1.5' ips = [ip1, ip2, ip3, ip4, ip5] ips_tried = [] fingerprint = hashlib.sha256(b'foo').digest() req = ClientRequest('GET', URL('https://mocked.host'), ssl=aiohttp.Fingerprint(fingerprint), loop=loop) async def _resolve_host(host, port, traces=None): return [{ 'hostname': host, 'host': ip, 'port': port, 'family': socket.AF_INET, 'proto': 0, 'flags': socket.AI_NUMERICHOST} for ip in ips] conn._resolve_host = _resolve_host os_error = certificate_error = ssl_error = fingerprint_error = False 
connected = False async def create_connection(*args, **kwargs): nonlocal os_error, certificate_error, ssl_error, fingerprint_error nonlocal connected ip = args[1] ips_tried.append(ip) if ip == ip1: os_error = True raise OSError if ip == ip2: certificate_error = True raise ssl.CertificateError if ip == ip3: ssl_error = True raise ssl.SSLError if ip == ip4: fingerprint_error = True tr, pr = mock.Mock(), mock.Mock() def get_extra_info(param): if param == 'sslcontext': return True if param == 'ssl_object': s = mock.Mock() s.getpeercert.return_value = b'not foo' return s if param == 'peername': return ('192.168.1.5', 12345) assert False, param tr.get_extra_info = get_extra_info return tr, pr if ip == ip5: connected = True tr, pr = mock.Mock(), mock.Mock() def get_extra_info(param): if param == 'sslcontext': return True if param == 'ssl_object': s = mock.Mock() s.getpeercert.return_value = b'foo' return s assert False tr.get_extra_info = get_extra_info return tr, pr assert False conn._loop.create_connection = create_connection await conn.connect(req, [], ClientTimeout()) assert ips == ips_tried assert os_error assert certificate_error assert ssl_error assert fingerprint_error assert connected async def test_tcp_connector_resolve_host(loop) -> None: conn = aiohttp.TCPConnector(loop=loop, use_dns_cache=True) res = await conn._resolve_host('localhost', 8080) assert res for rec in res: if rec['family'] == socket.AF_INET: assert rec['host'] == '127.0.0.1' assert rec['hostname'] == 'localhost' assert rec['port'] == 8080 elif rec['family'] == socket.AF_INET6: assert rec['hostname'] == 'localhost' assert rec['port'] == 8080 if platform.system() == 'Darwin': assert rec['host'] in ('::1', 'fe80::1', 'fe80::1%lo0') else: assert rec['host'] == '::1' @pytest.fixture def dns_response(loop): async def coro(): # simulates a network operation await asyncio.sleep(0, loop=loop) return ["127.0.0.1"] return coro async def test_tcp_connector_dns_cache_not_expired(loop, dns_response) -> None: with mock.patch('aiohttp.connector.DefaultResolver') as m_resolver: conn = aiohttp.TCPConnector( loop=loop, use_dns_cache=True, ttl_dns_cache=10 ) m_resolver().resolve.return_value = dns_response() await conn._resolve_host('localhost', 8080) await conn._resolve_host('localhost', 8080) m_resolver().resolve.assert_called_once_with( 'localhost', 8080, family=0 ) async def test_tcp_connector_dns_cache_forever(loop, dns_response) -> None: with mock.patch('aiohttp.connector.DefaultResolver') as m_resolver: conn = aiohttp.TCPConnector( loop=loop, use_dns_cache=True, ttl_dns_cache=10 ) m_resolver().resolve.return_value = dns_response() await conn._resolve_host('localhost', 8080) await conn._resolve_host('localhost', 8080) m_resolver().resolve.assert_called_once_with( 'localhost', 8080, family=0 ) async def test_tcp_connector_use_dns_cache_disabled(loop, dns_response) -> None: with mock.patch('aiohttp.connector.DefaultResolver') as m_resolver: conn = aiohttp.TCPConnector(loop=loop, use_dns_cache=False) m_resolver().resolve.side_effect = [dns_response(), dns_response()] await conn._resolve_host('localhost', 8080) await conn._resolve_host('localhost', 8080) m_resolver().resolve.assert_has_calls([ mock.call('localhost', 8080, family=0), mock.call('localhost', 8080, family=0) ]) async def test_tcp_connector_dns_throttle_requests(loop, dns_response) -> None: with mock.patch('aiohttp.connector.DefaultResolver') as m_resolver: conn = aiohttp.TCPConnector( loop=loop, use_dns_cache=True, ttl_dns_cache=10 ) m_resolver().resolve.return_value = 
dns_response() loop.create_task(conn._resolve_host('localhost', 8080)) loop.create_task(conn._resolve_host('localhost', 8080)) await asyncio.sleep(0, loop=loop) m_resolver().resolve.assert_called_once_with( 'localhost', 8080, family=0 ) async def test_tcp_connector_dns_throttle_requests_exception_spread( loop) -> None: with mock.patch('aiohttp.connector.DefaultResolver') as m_resolver: conn = aiohttp.TCPConnector( loop=loop, use_dns_cache=True, ttl_dns_cache=10 ) e = Exception() m_resolver().resolve.side_effect = e r1 = loop.create_task(conn._resolve_host('localhost', 8080)) r2 = loop.create_task(conn._resolve_host('localhost', 8080)) await asyncio.sleep(0, loop=loop) assert r1.exception() == e assert r2.exception() == e async def test_tcp_connector_dns_throttle_requests_cancelled_when_close( loop, dns_response): with mock.patch('aiohttp.connector.DefaultResolver') as m_resolver: conn = aiohttp.TCPConnector( loop=loop, use_dns_cache=True, ttl_dns_cache=10 ) m_resolver().resolve.return_value = dns_response() loop.create_task(conn._resolve_host('localhost', 8080)) f = loop.create_task(conn._resolve_host('localhost', 8080)) await asyncio.sleep(0, loop=loop) conn.close() with pytest.raises(asyncio.CancelledError): await f async def test_tcp_connector_dns_tracing(loop, dns_response) -> None: session = mock.Mock() trace_config_ctx = mock.Mock() on_dns_resolvehost_start = mock.Mock( side_effect=make_mocked_coro(mock.Mock()) ) on_dns_resolvehost_end = mock.Mock( side_effect=make_mocked_coro(mock.Mock()) ) on_dns_cache_hit = mock.Mock( side_effect=make_mocked_coro(mock.Mock()) ) on_dns_cache_miss = mock.Mock( side_effect=make_mocked_coro(mock.Mock()) ) trace_config = aiohttp.TraceConfig( trace_config_ctx_factory=mock.Mock(return_value=trace_config_ctx) ) trace_config.on_dns_resolvehost_start.append(on_dns_resolvehost_start) trace_config.on_dns_resolvehost_end.append(on_dns_resolvehost_end) trace_config.on_dns_cache_hit.append(on_dns_cache_hit) trace_config.on_dns_cache_miss.append(on_dns_cache_miss) trace_config.freeze() traces = [ Trace( session, trace_config, trace_config.trace_config_ctx() ) ] with mock.patch('aiohttp.connector.DefaultResolver') as m_resolver: conn = aiohttp.TCPConnector( loop=loop, use_dns_cache=True, ttl_dns_cache=10 ) m_resolver().resolve.return_value = dns_response() await conn._resolve_host( 'localhost', 8080, traces=traces ) on_dns_resolvehost_start.assert_called_once_with( session, trace_config_ctx, aiohttp.TraceDnsResolveHostStartParams('localhost') ) on_dns_resolvehost_end.assert_called_once_with( session, trace_config_ctx, aiohttp.TraceDnsResolveHostEndParams('localhost') ) on_dns_cache_miss.assert_called_once_with( session, trace_config_ctx, aiohttp.TraceDnsCacheMissParams('localhost') ) assert not on_dns_cache_hit.called await conn._resolve_host( 'localhost', 8080, traces=traces ) on_dns_cache_hit.assert_called_once_with( session, trace_config_ctx, aiohttp.TraceDnsCacheHitParams('localhost') ) async def test_tcp_connector_dns_tracing_cache_disabled(loop, dns_response) -> None: session = mock.Mock() trace_config_ctx = mock.Mock() on_dns_resolvehost_start = mock.Mock( side_effect=make_mocked_coro(mock.Mock()) ) on_dns_resolvehost_end = mock.Mock( side_effect=make_mocked_coro(mock.Mock()) ) trace_config = aiohttp.TraceConfig( trace_config_ctx_factory=mock.Mock(return_value=trace_config_ctx) ) trace_config.on_dns_resolvehost_start.append(on_dns_resolvehost_start) trace_config.on_dns_resolvehost_end.append(on_dns_resolvehost_end) trace_config.freeze() traces = [ Trace( 
session, trace_config, trace_config.trace_config_ctx() ) ] with mock.patch('aiohttp.connector.DefaultResolver') as m_resolver: conn = aiohttp.TCPConnector( loop=loop, use_dns_cache=False ) m_resolver().resolve.side_effect = [ dns_response(), dns_response() ] await conn._resolve_host( 'localhost', 8080, traces=traces ) await conn._resolve_host( 'localhost', 8080, traces=traces ) on_dns_resolvehost_start.assert_has_calls([ mock.call( session, trace_config_ctx, aiohttp.TraceDnsResolveHostStartParams('localhost') ), mock.call( session, trace_config_ctx, aiohttp.TraceDnsResolveHostStartParams('localhost') ) ]) on_dns_resolvehost_end.assert_has_calls([ mock.call( session, trace_config_ctx, aiohttp.TraceDnsResolveHostEndParams('localhost') ), mock.call( session, trace_config_ctx, aiohttp.TraceDnsResolveHostEndParams('localhost') ) ]) async def test_tcp_connector_dns_tracing_throttle_requests( loop, dns_response) -> None: session = mock.Mock() trace_config_ctx = mock.Mock() on_dns_cache_hit = mock.Mock( side_effect=make_mocked_coro(mock.Mock()) ) on_dns_cache_miss = mock.Mock( side_effect=make_mocked_coro(mock.Mock()) ) trace_config = aiohttp.TraceConfig( trace_config_ctx_factory=mock.Mock(return_value=trace_config_ctx) ) trace_config.on_dns_cache_hit.append(on_dns_cache_hit) trace_config.on_dns_cache_miss.append(on_dns_cache_miss) trace_config.freeze() traces = [ Trace( session, trace_config, trace_config.trace_config_ctx() ) ] with mock.patch('aiohttp.connector.DefaultResolver') as m_resolver: conn = aiohttp.TCPConnector( loop=loop, use_dns_cache=True, ttl_dns_cache=10 ) m_resolver().resolve.return_value = dns_response() loop.create_task(conn._resolve_host('localhost', 8080, traces=traces)) loop.create_task(conn._resolve_host('localhost', 8080, traces=traces)) await asyncio.sleep(0, loop=loop) on_dns_cache_hit.assert_called_once_with( session, trace_config_ctx, aiohttp.TraceDnsCacheHitParams('localhost') ) on_dns_cache_miss.assert_called_once_with( session, trace_config_ctx, aiohttp.TraceDnsCacheMissParams('localhost') ) async def test_dns_error(loop) -> None: connector = aiohttp.TCPConnector(loop=loop) connector._resolve_host = make_mocked_coro( raise_exception=OSError('dont take it serious')) req = ClientRequest( 'GET', URL('http://www.python.org'), loop=loop) with pytest.raises(aiohttp.ClientConnectorError): await connector.connect(req, [], ClientTimeout()) async def test_get_pop_empty_conns(loop) -> None: # see issue #473 conn = aiohttp.BaseConnector(loop=loop) key = ('127.0.0.1', 80, False) conn._conns[key] = [] proto = conn._get(key) assert proto is None assert not conn._conns async def test_release_close_do_not_add_to_pool(loop, key) -> None: # see issue #473 conn = aiohttp.BaseConnector(loop=loop) proto = mock.Mock(should_close=True) conn._acquired.add(proto) conn._release(key, proto) assert not conn._conns async def test_release_close_do_not_delete_existing_connections(key) -> None: proto1 = mock.Mock() conn = aiohttp.BaseConnector() conn._conns[key] = [(proto1, 1)] proto = mock.Mock(should_close=True) conn._acquired.add(proto) conn._release(key, proto) assert conn._conns[key] == [(proto1, 1)] assert proto.close.called conn.close() async def test_release_not_started(loop) -> None: conn = aiohttp.BaseConnector(loop=loop) proto = mock.Mock(should_close=False) key = 1 conn._acquired.add(proto) conn._release(key, proto) # assert conn._conns == {1: [(proto, 10)]} rec = conn._conns[1] assert rec[0][0] == proto assert rec[0][1] == pytest.approx(loop.time(), abs=0.05) assert not 
proto.close.called conn.close() async def test_release_not_opened(loop, key) -> None: conn = aiohttp.BaseConnector(loop=loop) proto = mock.Mock() conn._acquired.add(proto) conn._release(key, proto) assert proto.close.called async def test_connect(loop, key) -> None: proto = mock.Mock() proto.is_connected.return_value = True req = ClientRequest('GET', URL('http://localhost:80'), loop=loop) conn = aiohttp.BaseConnector(loop=loop) conn._conns[key] = [(proto, loop.time())] conn._create_connection = mock.Mock() conn._create_connection.return_value = loop.create_future() conn._create_connection.return_value.set_result(proto) connection = await conn.connect(req, [], ClientTimeout()) assert not conn._create_connection.called assert connection._protocol is proto assert connection.transport is proto.transport assert isinstance(connection, Connection) connection.close() async def test_connect_tracing(loop) -> None: session = mock.Mock() trace_config_ctx = mock.Mock() on_connection_create_start = mock.Mock( side_effect=make_mocked_coro(mock.Mock()) ) on_connection_create_end = mock.Mock( side_effect=make_mocked_coro(mock.Mock()) ) trace_config = aiohttp.TraceConfig( trace_config_ctx_factory=mock.Mock(return_value=trace_config_ctx) ) trace_config.on_connection_create_start.append(on_connection_create_start) trace_config.on_connection_create_end.append(on_connection_create_end) trace_config.freeze() traces = [ Trace( session, trace_config, trace_config.trace_config_ctx() ) ] proto = mock.Mock() proto.is_connected.return_value = True req = ClientRequest('GET', URL('http://host:80'), loop=loop) conn = aiohttp.BaseConnector(loop=loop) conn._create_connection = mock.Mock() conn._create_connection.return_value = loop.create_future() conn._create_connection.return_value.set_result(proto) conn2 = await conn.connect(req, traces, ClientTimeout()) conn2.release() on_connection_create_start.assert_called_with( session, trace_config_ctx, aiohttp.TraceConnectionCreateStartParams() ) on_connection_create_end.assert_called_with( session, trace_config_ctx, aiohttp.TraceConnectionCreateEndParams() ) async def test_close_during_connect(loop) -> None: proto = mock.Mock() proto.is_connected.return_value = True fut = loop.create_future() req = ClientRequest('GET', URL('http://host:80'), loop=loop) conn = aiohttp.BaseConnector(loop=loop) conn._create_connection = mock.Mock() conn._create_connection.return_value = fut task = loop.create_task(conn.connect(req, None, ClientTimeout())) await asyncio.sleep(0, loop=loop) conn.close() fut.set_result(proto) with pytest.raises(aiohttp.ClientConnectionError): await task assert proto.close.called async def test_ctor_cleanup() -> None: loop = mock.Mock() loop.time.return_value = 1.5 conn = aiohttp.BaseConnector( loop=loop, keepalive_timeout=10, enable_cleanup_closed=True) assert conn._cleanup_handle is None assert conn._cleanup_closed_handle is not None async def test_cleanup(key) -> None: testset = { key: [(mock.Mock(), 10), (mock.Mock(), 300)], } testset[key][0][0].is_connected.return_value = True testset[key][1][0].is_connected.return_value = False loop = mock.Mock() loop.time.return_value = 300 conn = aiohttp.BaseConnector(loop=loop) conn._conns = testset existing_handle = conn._cleanup_handle = mock.Mock() conn._cleanup() assert existing_handle.cancel.called assert conn._conns == {} assert conn._cleanup_handle is not None async def test_cleanup_close_ssl_transport(ssl_key) -> None: proto = mock.Mock() transport = proto.transport testset = {ssl_key: [(proto, 10)]} loop = mock.Mock() 
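# The mocked clock below (300) is far past the connection timestamp (10), so
# _cleanup() treats the SSL connection as stale and queues its transport on
# _cleanup_closed_transports for the deferred "cleanup closed" abort pass,
# which is what the final assertion checks.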
loop.time.return_value = 300 conn = aiohttp.BaseConnector(loop=loop, enable_cleanup_closed=True) conn._conns = testset existing_handle = conn._cleanup_handle = mock.Mock() conn._cleanup() assert existing_handle.cancel.called assert conn._conns == {} assert conn._cleanup_closed_transports == [transport] async def test_cleanup2() -> None: testset = {1: [(mock.Mock(), 300)]} testset[1][0][0].is_connected.return_value = True loop = mock.Mock() loop.time.return_value = 300 conn = aiohttp.BaseConnector(loop=loop, keepalive_timeout=10) conn._conns = testset conn._cleanup() assert conn._conns == testset assert conn._cleanup_handle is not None loop.call_at.assert_called_with(310, mock.ANY, mock.ANY) conn.close() async def test_cleanup3(key) -> None: testset = {key: [(mock.Mock(), 290.1), (mock.Mock(), 305.1)]} testset[key][0][0].is_connected.return_value = True loop = mock.Mock() loop.time.return_value = 308.5 conn = aiohttp.BaseConnector(loop=loop, keepalive_timeout=10) conn._conns = testset conn._cleanup() assert conn._conns == {key: [testset[key][1]]} assert conn._cleanup_handle is not None loop.call_at.assert_called_with(319, mock.ANY, mock.ANY) conn.close() async def test_cleanup_closed(loop, mocker) -> None: if not hasattr(loop, '__dict__'): pytest.skip("can not override loop attributes") mocker.spy(loop, 'call_at') conn = aiohttp.BaseConnector(loop=loop, enable_cleanup_closed=True) tr = mock.Mock() conn._cleanup_closed_handle = cleanup_closed_handle = mock.Mock() conn._cleanup_closed_transports = [tr] conn._cleanup_closed() assert tr.abort.called assert not conn._cleanup_closed_transports assert loop.call_at.called assert cleanup_closed_handle.cancel.called async def test_cleanup_closed_disabled(loop, mocker) -> None: conn = aiohttp.BaseConnector( loop=loop, enable_cleanup_closed=False) tr = mock.Mock() conn._cleanup_closed_transports = [tr] conn._cleanup_closed() assert tr.abort.called assert not conn._cleanup_closed_transports async def test_tcp_connector_ctor(loop) -> None: conn = aiohttp.TCPConnector(loop=loop) assert conn._ssl is None assert conn.use_dns_cache assert conn.family == 0 async def test_tcp_connector_ctor_fingerprint_valid(loop) -> None: valid = aiohttp.Fingerprint(hashlib.sha256(b"foo").digest()) conn = aiohttp.TCPConnector(ssl=valid, loop=loop) assert conn._ssl is valid async def test_insecure_fingerprint_md5(loop) -> None: with pytest.raises(ValueError): aiohttp.TCPConnector( ssl=aiohttp.Fingerprint(hashlib.md5(b"foo").digest()), loop=loop) async def test_insecure_fingerprint_sha1(loop) -> None: with pytest.raises(ValueError): aiohttp.TCPConnector( ssl=aiohttp.Fingerprint(hashlib.sha1(b"foo").digest()), loop=loop) async def test_tcp_connector_clear_dns_cache(loop) -> None: conn = aiohttp.TCPConnector(loop=loop) hosts = ['a', 'b'] conn._cached_hosts.add(('localhost', 123), hosts) conn._cached_hosts.add(('localhost', 124), hosts) conn.clear_dns_cache('localhost', 123) with pytest.raises(KeyError): conn._cached_hosts.next_addrs(('localhost', 123)) assert conn._cached_hosts.next_addrs(('localhost', 124)) == hosts # Remove removed element is OK conn.clear_dns_cache('localhost', 123) with pytest.raises(KeyError): conn._cached_hosts.next_addrs(('localhost', 123)) conn.clear_dns_cache() with pytest.raises(KeyError): conn._cached_hosts.next_addrs(('localhost', 124)) async def test_tcp_connector_clear_dns_cache_bad_args(loop) -> None: conn = aiohttp.TCPConnector(loop=loop) with pytest.raises(ValueError): conn.clear_dns_cache('localhost') async def 
test_dont_recreate_ssl_context(loop) -> None: conn = aiohttp.TCPConnector(loop=loop) ctx = conn._make_ssl_context(True) assert ctx is conn._make_ssl_context(True) async def test_dont_recreate_ssl_context2(loop) -> None: conn = aiohttp.TCPConnector(loop=loop) ctx = conn._make_ssl_context(False) assert ctx is conn._make_ssl_context(False) async def test___get_ssl_context1(loop) -> None: conn = aiohttp.TCPConnector(loop=loop) req = mock.Mock() req.is_ssl.return_value = False assert conn._get_ssl_context(req) is None async def test___get_ssl_context2(loop) -> None: ctx = ssl.SSLContext() conn = aiohttp.TCPConnector(loop=loop) req = mock.Mock() req.is_ssl.return_value = True req.ssl = ctx assert conn._get_ssl_context(req) is ctx async def test___get_ssl_context3(loop) -> None: ctx = ssl.SSLContext() conn = aiohttp.TCPConnector(loop=loop, ssl=ctx) req = mock.Mock() req.is_ssl.return_value = True req.ssl = None assert conn._get_ssl_context(req) is ctx async def test___get_ssl_context4(loop) -> None: ctx = ssl.SSLContext() conn = aiohttp.TCPConnector(loop=loop, ssl=ctx) req = mock.Mock() req.is_ssl.return_value = True req.ssl = False assert conn._get_ssl_context(req) is conn._make_ssl_context(False) async def test___get_ssl_context5(loop) -> None: ctx = ssl.SSLContext() conn = aiohttp.TCPConnector(loop=loop, ssl=ctx) req = mock.Mock() req.is_ssl.return_value = True req.ssl = aiohttp.Fingerprint(hashlib.sha256(b'1').digest()) assert conn._get_ssl_context(req) is conn._make_ssl_context(False) async def test___get_ssl_context6(loop) -> None: conn = aiohttp.TCPConnector(loop=loop) req = mock.Mock() req.is_ssl.return_value = True req.ssl = None assert conn._get_ssl_context(req) is conn._make_ssl_context(True) async def test_close_twice(loop) -> None: proto = mock.Mock() conn = aiohttp.BaseConnector(loop=loop) conn._conns[1] = [(proto, object())] conn.close() assert not conn._conns assert proto.close.called assert conn.closed conn._conns = 'Invalid' # fill with garbage conn.close() assert conn.closed async def test_close_cancels_cleanup_handle(loop) -> None: conn = aiohttp.BaseConnector(loop=loop) conn._release(1, mock.Mock(should_close=False)) assert conn._cleanup_handle is not None conn.close() assert conn._cleanup_handle is None async def test_close_abort_closed_transports(loop) -> None: tr = mock.Mock() conn = aiohttp.BaseConnector(loop=loop) conn._cleanup_closed_transports.append(tr) conn.close() assert not conn._cleanup_closed_transports assert tr.abort.called assert conn.closed async def test_close_cancels_cleanup_closed_handle(loop) -> None: conn = aiohttp.BaseConnector(loop=loop, enable_cleanup_closed=True) assert conn._cleanup_closed_handle is not None conn.close() assert conn._cleanup_closed_handle is None async def test_ctor_with_default_loop(loop) -> None: conn = aiohttp.BaseConnector() assert loop is conn._loop async def test_connect_with_limit(loop, key) -> None: proto = mock.Mock() proto.is_connected.return_value = True req = ClientRequest('GET', URL('http://localhost:80'), loop=loop, response_class=mock.Mock()) conn = aiohttp.BaseConnector(loop=loop, limit=1) conn._conns[key] = [(proto, loop.time())] conn._create_connection = mock.Mock() conn._create_connection.return_value = loop.create_future() conn._create_connection.return_value.set_result(proto) connection1 = await conn.connect(req, None, ClientTimeout()) assert connection1._protocol == proto assert 1 == len(conn._acquired) assert proto in conn._acquired assert key in conn._acquired_per_host assert proto in 
conn._acquired_per_host[key] acquired = False async def f(): nonlocal acquired connection2 = await conn.connect(req, None, ClientTimeout()) acquired = True assert 1 == len(conn._acquired) assert 1 == len(conn._acquired_per_host[key]) connection2.release() task = loop.create_task(f()) await asyncio.sleep(0.01, loop=loop) assert not acquired connection1.release() await asyncio.sleep(0, loop=loop) assert acquired await task conn.close() async def test_connect_queued_operation_tracing(loop, key) -> None: session = mock.Mock() trace_config_ctx = mock.Mock() on_connection_queued_start = mock.Mock( side_effect=make_mocked_coro(mock.Mock()) ) on_connection_queued_end = mock.Mock( side_effect=make_mocked_coro(mock.Mock()) ) trace_config = aiohttp.TraceConfig( trace_config_ctx_factory=mock.Mock(return_value=trace_config_ctx) ) trace_config.on_connection_queued_start.append(on_connection_queued_start) trace_config.on_connection_queued_end.append(on_connection_queued_end) trace_config.freeze() traces = [ Trace( session, trace_config, trace_config.trace_config_ctx() ) ] proto = mock.Mock() proto.is_connected.return_value = True req = ClientRequest('GET', URL('http://localhost1:80'), loop=loop, response_class=mock.Mock()) conn = aiohttp.BaseConnector(loop=loop, limit=1) conn._conns[key] = [(proto, loop.time())] conn._create_connection = mock.Mock() conn._create_connection.return_value = loop.create_future() conn._create_connection.return_value.set_result(proto) connection1 = await conn.connect(req, traces, ClientTimeout()) async def f(): connection2 = await conn.connect(req, traces, ClientTimeout()) on_connection_queued_start.assert_called_with( session, trace_config_ctx, aiohttp.TraceConnectionQueuedStartParams() ) on_connection_queued_end.assert_called_with( session, trace_config_ctx, aiohttp.TraceConnectionQueuedEndParams() ) connection2.release() task = asyncio.ensure_future(f(), loop=loop) await asyncio.sleep(0.01, loop=loop) connection1.release() await task conn.close() async def test_connect_reuseconn_tracing(loop, key) -> None: session = mock.Mock() trace_config_ctx = mock.Mock() on_connection_reuseconn = mock.Mock( side_effect=make_mocked_coro(mock.Mock()) ) trace_config = aiohttp.TraceConfig( trace_config_ctx_factory=mock.Mock(return_value=trace_config_ctx) ) trace_config.on_connection_reuseconn.append(on_connection_reuseconn) trace_config.freeze() traces = [ Trace( session, trace_config, trace_config.trace_config_ctx() ) ] proto = mock.Mock() proto.is_connected.return_value = True req = ClientRequest('GET', URL('http://localhost:80'), loop=loop, response_class=mock.Mock()) conn = aiohttp.BaseConnector(loop=loop, limit=1) conn._conns[key] = [(proto, loop.time())] conn2 = await conn.connect(req, traces, ClientTimeout()) conn2.release() on_connection_reuseconn.assert_called_with( session, trace_config_ctx, aiohttp.TraceConnectionReuseconnParams() ) conn.close() async def test_connect_with_limit_and_limit_per_host(loop, key) -> None: proto = mock.Mock() proto.is_connected.return_value = True req = ClientRequest('GET', URL('http://localhost:80'), loop=loop) conn = aiohttp.BaseConnector(loop=loop, limit=1000, limit_per_host=1) conn._conns[key] = [(proto, loop.time())] conn._create_connection = mock.Mock() conn._create_connection.return_value = loop.create_future() conn._create_connection.return_value.set_result(proto) acquired = False connection1 = await conn.connect(req, None, ClientTimeout()) async def f(): nonlocal acquired connection2 = await conn.connect(req, None, ClientTimeout()) acquired = 
True assert 1 == len(conn._acquired) assert 1 == len(conn._acquired_per_host[key]) connection2.release() task = loop.create_task(f()) await asyncio.sleep(0.01, loop=loop) assert not acquired connection1.release() await asyncio.sleep(0, loop=loop) assert acquired await task conn.close() async def test_connect_with_no_limit_and_limit_per_host(loop, key) -> None: proto = mock.Mock() proto.is_connected.return_value = True req = ClientRequest('GET', URL('http://localhost1:80'), loop=loop) conn = aiohttp.BaseConnector(loop=loop, limit=0, limit_per_host=1) conn._conns[key] = [(proto, loop.time())] conn._create_connection = mock.Mock() conn._create_connection.return_value = loop.create_future() conn._create_connection.return_value.set_result(proto) acquired = False connection1 = await conn.connect(req, None, ClientTimeout()) async def f(): nonlocal acquired connection2 = await conn.connect(req, None, ClientTimeout()) acquired = True connection2.release() task = loop.create_task(f()) await asyncio.sleep(0.01, loop=loop) assert not acquired connection1.release() await asyncio.sleep(0, loop=loop) assert acquired await task conn.close() async def test_connect_with_no_limits(loop, key) -> None: proto = mock.Mock() proto.is_connected.return_value = True req = ClientRequest('GET', URL('http://localhost:80'), loop=loop) conn = aiohttp.BaseConnector(loop=loop, limit=0, limit_per_host=0) conn._conns[key] = [(proto, loop.time())] conn._create_connection = mock.Mock() conn._create_connection.return_value = loop.create_future() conn._create_connection.return_value.set_result(proto) acquired = False connection1 = await conn.connect(req, None, ClientTimeout()) async def f(): nonlocal acquired connection2 = await conn.connect(req, None, ClientTimeout()) acquired = True assert 1 == len(conn._acquired) assert 1 == len(conn._acquired_per_host[key]) connection2.release() task = loop.create_task(f()) await asyncio.sleep(0.01, loop=loop) assert acquired connection1.release() await task conn.close() async def test_connect_with_limit_cancelled(loop) -> None: proto = mock.Mock() proto.is_connected.return_value = True req = ClientRequest('GET', URL('http://host:80'), loop=loop) conn = aiohttp.BaseConnector(loop=loop, limit=1) key = ('host', 80, False) conn._conns[key] = [(proto, loop.time())] conn._create_connection = mock.Mock() conn._create_connection.return_value = loop.create_future() conn._create_connection.return_value.set_result(proto) connection = await conn.connect(req, None, ClientTimeout()) assert connection._protocol == proto assert connection.transport == proto.transport assert 1 == len(conn._acquired) with pytest.raises(asyncio.TimeoutError): # limit exhausted await asyncio.wait_for(conn.connect(req, None, ClientTimeout()), 0.01, loop=loop) connection.close() async def test_connect_with_capacity_release_waiters(loop) -> None: def check_with_exc(err): conn = aiohttp.BaseConnector(limit=1, loop=loop) conn._create_connection = mock.Mock() conn._create_connection.return_value = \ loop.create_future() conn._create_connection.return_value.set_exception(err) with pytest.raises(Exception): req = mock.Mock() yield from conn.connect(req, None, ClientTimeout()) assert not conn._waiters check_with_exc(OSError(1, 'permission error')) check_with_exc(RuntimeError()) check_with_exc(asyncio.TimeoutError()) async def test_connect_with_limit_concurrent(loop) -> None: proto = mock.Mock() proto.should_close = False proto.is_connected.return_value = True req = ClientRequest('GET', URL('http://host:80'), loop=loop) max_connections 
= 2 num_connections = 0 conn = aiohttp.BaseConnector(limit=max_connections, loop=loop) # Use a real coroutine for _create_connection; a mock would mask # problems that only happen when the method yields. async def create_connection(req, traces, timeout): nonlocal num_connections num_connections += 1 await asyncio.sleep(0, loop=loop) # Make a new transport mock each time because acquired # transports are stored in a set. Reusing the same object # messes with the count. proto = mock.Mock(should_close=False) proto.is_connected.return_value = True return proto conn._create_connection = create_connection # Simulate something like a crawler. It opens a connection, does # something with it, closes it, then creates tasks that make more # connections and waits for them to finish. The crawler is started # with multiple concurrent requests and stops when it hits a # predefined maximum number of requests. max_requests = 10 num_requests = 0 start_requests = max_connections + 1 async def f(start=True): nonlocal num_requests if num_requests == max_requests: return num_requests += 1 if not start: connection = await conn.connect(req, None, ClientTimeout()) await asyncio.sleep(0, loop=loop) connection.release() tasks = [ loop.create_task(f(start=False)) for i in range(start_requests) ] await asyncio.wait(tasks, loop=loop) await f() conn.close() assert max_connections == num_connections async def test_connect_waiters_cleanup(loop) -> None: proto = mock.Mock() proto.is_connected.return_value = True req = ClientRequest('GET', URL('http://host:80'), loop=loop) conn = aiohttp.BaseConnector(loop=loop, limit=1) conn._available_connections = mock.Mock(return_value=0) t = loop.create_task(conn.connect(req, None, ClientTimeout())) await asyncio.sleep(0, loop=loop) assert conn._waiters.keys() t.cancel() await asyncio.sleep(0, loop=loop) assert not conn._waiters.keys() async def test_connect_waiters_cleanup_key_error(loop) -> None: proto = mock.Mock() proto.is_connected.return_value = True req = ClientRequest('GET', URL('http://host:80'), loop=loop) conn = aiohttp.BaseConnector(loop=loop, limit=1) conn._available_connections = mock.Mock(return_value=0) t = loop.create_task(conn.connect(req, None, ClientTimeout())) await asyncio.sleep(0, loop=loop) assert conn._waiters.keys() # we delete the entry explicitly before the # canceled connection grabs the loop again, we # must expect a none failure termination conn._waiters.clear() t.cancel() await asyncio.sleep(0, loop=loop) assert not conn._waiters.keys() == [] async def test_close_with_acquired_connection(loop) -> None: proto = mock.Mock() proto.is_connected.return_value = True req = ClientRequest('GET', URL('http://host:80'), loop=loop) conn = aiohttp.BaseConnector(loop=loop, limit=1) key = ('host', 80, False) conn._conns[key] = [(proto, loop.time())] conn._create_connection = mock.Mock() conn._create_connection.return_value = loop.create_future() conn._create_connection.return_value.set_result(proto) connection = await conn.connect(req, None, ClientTimeout()) assert 1 == len(conn._acquired) conn.close() assert 0 == len(conn._acquired) assert conn.closed proto.close.assert_called_with() assert not connection.closed connection.close() assert connection.closed async def test_default_force_close(loop) -> None: connector = aiohttp.BaseConnector(loop=loop) assert not connector.force_close async def test_limit_property(loop) -> None: conn = aiohttp.BaseConnector(loop=loop, limit=15) assert 15 == conn.limit conn.close() async def test_limit_per_host_property(loop) -> None: conn 
= aiohttp.BaseConnector(loop=loop, limit_per_host=15) assert 15 == conn.limit_per_host conn.close() async def test_limit_property_default(loop) -> None: conn = aiohttp.BaseConnector(loop=loop) assert conn.limit == 100 conn.close() async def test_limit_per_host_property_default(loop) -> None: conn = aiohttp.BaseConnector(loop=loop) assert conn.limit_per_host == 0 conn.close() async def test_force_close_and_explicit_keep_alive(loop) -> None: with pytest.raises(ValueError): aiohttp.BaseConnector(loop=loop, keepalive_timeout=30, force_close=True) conn = aiohttp.BaseConnector(loop=loop, force_close=True, keepalive_timeout=None) assert conn conn = aiohttp.BaseConnector(loop=loop, force_close=True) assert conn async def test_error_on_connection(loop, key) -> None: conn = aiohttp.BaseConnector(limit=1, loop=loop) req = mock.Mock() req.connection_key = key proto = mock.Mock() i = 0 fut = loop.create_future() exc = OSError() async def create_connection(req, traces, timeout): nonlocal i i += 1 if i == 1: await fut raise exc elif i == 2: return proto conn._create_connection = create_connection t1 = loop.create_task(conn.connect(req, None, ClientTimeout())) t2 = loop.create_task(conn.connect(req, None, ClientTimeout())) await asyncio.sleep(0, loop=loop) assert not t1.done() assert not t2.done() assert len(conn._acquired_per_host[key]) == 1 fut.set_result(None) with pytest.raises(OSError): await t1 ret = await t2 assert len(conn._acquired_per_host[key]) == 1 assert ret._key == key assert ret.protocol == proto assert proto in conn._acquired ret.release() async def test_cancelled_waiter(loop) -> None: conn = aiohttp.BaseConnector(limit=1, loop=loop) req = mock.Mock() req.connection_key = 'key' proto = mock.Mock() async def create_connection(req, traces=None): await asyncio.sleep(1) return proto conn._create_connection = create_connection conn._acquired.add(proto) conn2 = loop.create_task(conn.connect(req, None, ClientTimeout())) await asyncio.sleep(0, loop=loop) conn2.cancel() with pytest.raises(asyncio.CancelledError): await conn2 async def test_error_on_connection_with_cancelled_waiter(loop, key) -> None: conn = aiohttp.BaseConnector(limit=1, loop=loop) req = mock.Mock() req.connection_key = key proto = mock.Mock() i = 0 fut1 = loop.create_future() fut2 = loop.create_future() exc = OSError() async def create_connection(req, traces, timeout): nonlocal i i += 1 if i == 1: await fut1 raise exc if i == 2: await fut2 elif i == 3: return proto conn._create_connection = create_connection t1 = loop.create_task(conn.connect(req, None, ClientTimeout())) t2 = loop.create_task(conn.connect(req, None, ClientTimeout())) t3 = loop.create_task(conn.connect(req, None, ClientTimeout())) await asyncio.sleep(0, loop=loop) assert not t1.done() assert not t2.done() assert len(conn._acquired_per_host[key]) == 1 fut1.set_result(None) fut2.cancel() with pytest.raises(OSError): await t1 with pytest.raises(asyncio.CancelledError): await t2 ret = await t3 assert len(conn._acquired_per_host[key]) == 1 assert ret._key == key assert ret.protocol == proto assert proto in conn._acquired ret.release() async def test_tcp_connector(aiohttp_client, loop) -> None: async def handler(request): return web.Response() app = web.Application() app.router.add_get('/', handler) client = await aiohttp_client(app) r = await client.get('/') assert r.status == 200 @pytest.mark.skipif(not hasattr(socket, 'AF_UNIX'), reason="requires unix socket") async def test_unix_connector_not_found(loop) -> None: connector = aiohttp.UnixConnector('/' + 
uuid.uuid4().hex, loop=loop) req = ClientRequest( 'GET', URL('http://www.python.org'), loop=loop) with pytest.raises(aiohttp.ClientConnectorError): await connector.connect(req, None, ClientTimeout()) @pytest.mark.skipif(not hasattr(socket, 'AF_UNIX'), reason="requires unix socket") async def test_unix_connector_permission(loop) -> None: loop.create_unix_connection = make_mocked_coro( raise_exception=PermissionError()) connector = aiohttp.UnixConnector('/' + uuid.uuid4().hex, loop=loop) req = ClientRequest( 'GET', URL('http://www.python.org'), loop=loop) with pytest.raises(aiohttp.ClientConnectorError): await connector.connect(req, None, ClientTimeout()) @pytest.mark.skipif(platform.system() != "Windows", reason="Proactor Event loop present only in Windows") async def test_named_pipe_connector_wrong_loop(loop, pipe_name) -> None: with pytest.raises(RuntimeError): aiohttp.NamedPipeConnector(pipe_name, loop=loop) @pytest.mark.skipif(platform.system() != "Windows", reason="Proactor Event loop present only in Windows") async def test_named_pipe_connector_not_found( proactor_loop, pipe_name ) -> None: connector = aiohttp.NamedPipeConnector(pipe_name, loop=proactor_loop) req = ClientRequest( 'GET', URL('http://www.python.org'), loop=proactor_loop) with pytest.raises(aiohttp.ClientConnectorError): await connector.connect(req, None, ClientTimeout()) @pytest.mark.skipif(platform.system() != "Windows", reason="Proactor Event loop present only in Windows") async def test_named_pipe_connector_permission( proactor_loop, pipe_name ) -> None: proactor_loop.create_pipe_connection = make_mocked_coro( raise_exception=PermissionError() ) connector = aiohttp.NamedPipeConnector(pipe_name, loop=proactor_loop) req = ClientRequest( 'GET', URL('http://www.python.org'), loop=proactor_loop) with pytest.raises(aiohttp.ClientConnectorError): await connector.connect(req, None, ClientTimeout()) async def test_default_use_dns_cache() -> None: conn = aiohttp.TCPConnector() assert conn.use_dns_cache async def test_resolver_not_called_with_address_is_ip(loop) -> None: resolver = mock.MagicMock() connector = aiohttp.TCPConnector(resolver=resolver) req = ClientRequest('GET', URL('http://127.0.0.1:{}'.format(unused_port())), loop=loop, response_class=mock.Mock()) with pytest.raises(OSError): await connector.connect(req, None, ClientTimeout()) resolver.resolve.assert_not_called() async def test_tcp_connector_raise_connector_ssl_error( aiohttp_server, ssl_ctx, ) -> None: async def handler(request): return web.Response() app = web.Application() app.router.add_get('/', handler) srv = await aiohttp_server(app, ssl=ssl_ctx) port = unused_port() conn = aiohttp.TCPConnector(local_addr=('127.0.0.1', port)) session = aiohttp.ClientSession(connector=conn) url = srv.make_url('/') if PY_37: err = aiohttp.ClientConnectorCertificateError else: err = aiohttp.ClientConnectorSSLError with pytest.raises(err) as ctx: await session.get(url) if PY_37: assert isinstance(ctx.value, aiohttp.ClientConnectorCertificateError) assert isinstance(ctx.value.certificate_error, ssl.SSLError) else: assert isinstance(ctx.value, aiohttp.ClientSSLError) assert isinstance(ctx.value.os_error, ssl.SSLError) await session.close() async def test_tcp_connector_do_not_raise_connector_ssl_error( aiohttp_server, ssl_ctx, client_ssl_ctx, ) -> None: async def handler(request): return web.Response() app = web.Application() app.router.add_get('/', handler) srv = await aiohttp_server(app, ssl=ssl_ctx) port = unused_port() conn = aiohttp.TCPConnector(local_addr=('127.0.0.1', 
port)) session = aiohttp.ClientSession(connector=conn) url = srv.make_url('/') r = await session.get(url, ssl=client_ssl_ctx) r.release() first_conn = next(iter(conn._conns.values()))[0][0] try: _sslcontext = first_conn.transport._ssl_protocol._sslcontext except AttributeError: _sslcontext = first_conn.transport._sslcontext assert _sslcontext is client_ssl_ctx r.close() await session.close() conn.close() async def test_tcp_connector_uses_provided_local_addr(aiohttp_server) -> None: async def handler(request): return web.Response() app = web.Application() app.router.add_get('/', handler) srv = await aiohttp_server(app) port = unused_port() conn = aiohttp.TCPConnector(local_addr=('127.0.0.1', port)) session = aiohttp.ClientSession(connector=conn) url = srv.make_url('/') r = await session.get(url) r.release() first_conn = next(iter(conn._conns.values()))[0][0] assert first_conn.transport.get_extra_info( 'sockname') == ('127.0.0.1', port) r.close() await session.close() conn.close() @pytest.mark.skipif(not hasattr(socket, 'AF_UNIX'), reason='requires UNIX sockets') async def test_unix_connector(unix_server, unix_sockname) -> None: async def handler(request): return web.Response() app = web.Application() app.router.add_get('/', handler) await unix_server(app) url = "http://127.0.0.1/" connector = aiohttp.UnixConnector(unix_sockname) assert unix_sockname == connector.path session = client.ClientSession(connector=connector) r = await session.get(url) assert r.status == 200 r.close() await session.close() @pytest.mark.skipif(platform.system() != "Windows", reason="Proactor Event loop present only in Windows") async def test_named_pipe_connector( proactor_loop, named_pipe_server, pipe_name ) -> None: async def handler(request): return web.Response() app = web.Application() app.router.add_get('/', handler) await named_pipe_server(app) url = "http://this-does-not-matter.com" connector = aiohttp.NamedPipeConnector(pipe_name) assert pipe_name == connector.path session = client.ClientSession(connector=connector) r = await session.get(url) assert r.status == 200 r.close() await session.close() class TestDNSCacheTable: @pytest.fixture def dns_cache_table(self): return _DNSCacheTable() def test_next_addrs_basic(self, dns_cache_table) -> None: dns_cache_table.add('localhost', ['127.0.0.1']) dns_cache_table.add('foo', ['127.0.0.2']) addrs = dns_cache_table.next_addrs('localhost') assert addrs == ['127.0.0.1'] addrs = dns_cache_table.next_addrs('foo') assert addrs == ['127.0.0.2'] with pytest.raises(KeyError): dns_cache_table.next_addrs('no-such-host') def test_remove(self, dns_cache_table) -> None: dns_cache_table.add('localhost', ['127.0.0.1']) dns_cache_table.remove('localhost') with pytest.raises(KeyError): dns_cache_table.next_addrs('localhost') def test_clear(self, dns_cache_table) -> None: dns_cache_table.add('localhost', ['127.0.0.1']) dns_cache_table.clear() with pytest.raises(KeyError): dns_cache_table.next_addrs('localhost') def test_not_expired_ttl_None(self, dns_cache_table) -> None: dns_cache_table.add('localhost', ['127.0.0.1']) assert not dns_cache_table.expired('localhost') def test_not_expired_ttl(self) -> None: dns_cache_table = _DNSCacheTable(ttl=0.1) dns_cache_table.add('localhost', ['127.0.0.1']) assert not dns_cache_table.expired('localhost') async def test_expired_ttl(self, loop) -> None: dns_cache_table = _DNSCacheTable(ttl=0.01) dns_cache_table.add('localhost', ['127.0.0.1']) await asyncio.sleep(0.02, loop=loop) assert dns_cache_table.expired('localhost') def test_next_addrs(self, 
dns_cache_table) -> None:
        dns_cache_table.add('foo', ['127.0.0.1', '127.0.0.2', '127.0.0.3'])

        # Each call to next_addrs returns the hosts using
        # a round-robin strategy.
        addrs = dns_cache_table.next_addrs('foo')
        assert addrs == ['127.0.0.1', '127.0.0.2', '127.0.0.3']

        addrs = dns_cache_table.next_addrs('foo')
        assert addrs == ['127.0.0.2', '127.0.0.3', '127.0.0.1']

        addrs = dns_cache_table.next_addrs('foo')
        assert addrs == ['127.0.0.3', '127.0.0.1', '127.0.0.2']

        addrs = dns_cache_table.next_addrs('foo')
        assert addrs == ['127.0.0.1', '127.0.0.2', '127.0.0.3']

    def test_next_addrs_single(self, dns_cache_table) -> None:
        dns_cache_table.add('foo', ['127.0.0.1'])

        addrs = dns_cache_table.next_addrs('foo')
        assert addrs == ['127.0.0.1']

        addrs = dns_cache_table.next_addrs('foo')
        assert addrs == ['127.0.0.1']


async def test_connector_cache_trace_race():
    class DummyTracer:
        async def send_dns_cache_hit(self, *args, **kwargs):
            connector._cached_hosts.remove(("", 0))

    token = object()

    connector = TCPConnector()
    connector._cached_hosts.add(("", 0), [token])

    traces = [DummyTracer()]
    assert await connector._resolve_host("", 0, traces) == [token]


async def test_connector_throttle_trace_race(loop):
    key = ("", 0)
    token = object()

    class DummyTracer:
        async def send_dns_cache_hit(self, *args, **kwargs):
            event = connector._throttle_dns_events.pop(key)
            event.set()
            connector._cached_hosts.add(key, [token])

    connector = TCPConnector()
    connector._throttle_dns_events[key] = EventResultOrError(loop)
    traces = [DummyTracer()]
    assert await connector._resolve_host("", 0, traces) == [token]
aiohttp-3.6.2/tests/test_cookiejar.py0000644000175100001650000005503313547410117020214 0ustar vstsdocker00000000000000
import asyncio
import datetime
import itertools
import os
import tempfile
import unittest
from http.cookies import SimpleCookie
from unittest import mock

import pytest
from freezegun import freeze_time
from yarl import URL

from aiohttp import CookieJar, DummyCookieJar


@pytest.fixture
def cookies_to_send():
    return SimpleCookie(
        "shared-cookie=first; "
        "domain-cookie=second; Domain=example.com; "
        "subdomain1-cookie=third; Domain=test1.example.com; "
        "subdomain2-cookie=fourth; Domain=test2.example.com; "
        "dotted-domain-cookie=fifth; Domain=.example.com; "
        "different-domain-cookie=sixth; Domain=different.org; "
        "secure-cookie=seventh; Domain=secure.com; Secure; "
        "no-path-cookie=eighth; Domain=pathtest.com; "
        "path1-cookie=nineth; Domain=pathtest.com; Path=/; "
        "path2-cookie=tenth; Domain=pathtest.com; Path=/one; "
        "path3-cookie=eleventh; Domain=pathtest.com; Path=/one/two; "
        "path4-cookie=twelfth; Domain=pathtest.com; Path=/one/two/; "
        "expires-cookie=thirteenth; Domain=expirestest.com; Path=/;"
        " Expires=Tue, 1 Jan 2039 12:00:00 GMT; "
        "max-age-cookie=fourteenth; Domain=maxagetest.com; Path=/;"
        " Max-Age=60; "
        "invalid-max-age-cookie=fifteenth; Domain=invalid-values.com; "
        " Max-Age=string; "
        "invalid-expires-cookie=sixteenth; Domain=invalid-values.com; "
        " Expires=string;"
    )


@pytest.fixture
def cookies_to_send_with_expired():
    return SimpleCookie(
        "shared-cookie=first; "
        "domain-cookie=second; Domain=example.com; "
        "subdomain1-cookie=third; Domain=test1.example.com; "
        "subdomain2-cookie=fourth; Domain=test2.example.com; "
        "dotted-domain-cookie=fifth; Domain=.example.com; "
        "different-domain-cookie=sixth; Domain=different.org; "
        "secure-cookie=seventh; Domain=secure.com; Secure; "
        "no-path-cookie=eighth; Domain=pathtest.com; "
        "path1-cookie=nineth; Domain=pathtest.com; Path=/; "
        "path2-cookie=tenth; Domain=pathtest.com; Path=/one; 
" "path3-cookie=eleventh; Domain=pathtest.com; Path=/one/two; " "path4-cookie=twelfth; Domain=pathtest.com; Path=/one/two/; " "expires-cookie=thirteenth; Domain=expirestest.com; Path=/;" " Expires=Tue, 1 Jan 1980 12:00:00 GMT; " "max-age-cookie=fourteenth; Domain=maxagetest.com; Path=/;" " Max-Age=60; " "invalid-max-age-cookie=fifteenth; Domain=invalid-values.com; " " Max-Age=string; " "invalid-expires-cookie=sixteenth; Domain=invalid-values.com; " " Expires=string;" ) @pytest.fixture def cookies_to_receive(): return SimpleCookie( "unconstrained-cookie=first; Path=/; " "domain-cookie=second; Domain=example.com; Path=/; " "subdomain1-cookie=third; Domain=test1.example.com; Path=/; " "subdomain2-cookie=fourth; Domain=test2.example.com; Path=/; " "dotted-domain-cookie=fifth; Domain=.example.com; Path=/; " "different-domain-cookie=sixth; Domain=different.org; Path=/; " "no-path-cookie=seventh; Domain=pathtest.com; " "path-cookie=eighth; Domain=pathtest.com; Path=/somepath; " "wrong-path-cookie=nineth; Domain=pathtest.com; Path=somepath;" ) def test_date_parsing() -> None: parse_func = CookieJar._parse_date utc = datetime.timezone.utc assert parse_func("") is None # 70 -> 1970 assert parse_func("Tue, 1 Jan 70 00:00:00 GMT") == \ datetime.datetime(1970, 1, 1, tzinfo=utc) # 10 -> 2010 assert parse_func("Tue, 1 Jan 10 00:00:00 GMT") == \ datetime.datetime(2010, 1, 1, tzinfo=utc) # No day of week string assert parse_func("1 Jan 1970 00:00:00 GMT") == \ datetime.datetime(1970, 1, 1, tzinfo=utc) # No timezone string assert parse_func("Tue, 1 Jan 1970 00:00:00") == \ datetime.datetime(1970, 1, 1, tzinfo=utc) # No year assert parse_func("Tue, 1 Jan 00:00:00 GMT") is None # No month assert parse_func("Tue, 1 1970 00:00:00 GMT") is None # No day of month assert parse_func("Tue, Jan 1970 00:00:00 GMT") is None # No time assert parse_func("Tue, 1 Jan 1970 GMT") is None # Invalid day of month assert parse_func("Tue, 0 Jan 1970 00:00:00 GMT") is None # Invalid year assert parse_func("Tue, 1 Jan 1500 00:00:00 GMT") is None # Invalid time assert parse_func("Tue, 1 Jan 1970 77:88:99 GMT") is None def test_domain_matching() -> None: test_func = CookieJar._is_domain_match assert test_func("test.com", "test.com") assert test_func("test.com", "sub.test.com") assert not test_func("test.com", "") assert not test_func("test.com", "test.org") assert not test_func("diff-test.com", "test.com") assert not test_func("test.com", "diff-test.com") assert not test_func("test.com", "127.0.0.1") def test_path_matching() -> None: test_func = CookieJar._is_path_match assert test_func("/", "") assert test_func("", "/") assert test_func("/file", "") assert test_func("/folder/file", "") assert test_func("/", "/") assert test_func("/file", "/") assert test_func("/file", "/file") assert test_func("/folder/", "/folder/") assert test_func("/folder/", "/") assert test_func("/folder/file", "/") assert not test_func("/", "/file") assert not test_func("/", "/folder/") assert not test_func("/file", "/folder/file") assert not test_func("/folder/", "/folder/file") assert not test_func("/different-file", "/file") assert not test_func("/different-folder/", "/folder/") async def test_constructor(loop, cookies_to_send, cookies_to_receive) -> None: jar = CookieJar(loop=loop) jar.update_cookies(cookies_to_send) jar_cookies = SimpleCookie() for cookie in jar: dict.__setitem__(jar_cookies, cookie.key, cookie) expected_cookies = cookies_to_send assert jar_cookies == expected_cookies assert jar._loop is loop async def test_constructor_with_expired(loop, 
cookies_to_send_with_expired, cookies_to_receive) -> None: jar = CookieJar() jar.update_cookies(cookies_to_send_with_expired) jar_cookies = SimpleCookie() for cookie in jar: dict.__setitem__(jar_cookies, cookie.key, cookie) expected_cookies = cookies_to_send_with_expired assert jar_cookies != expected_cookies assert jar._loop is loop async def test_save_load(loop, cookies_to_send, cookies_to_receive) -> None: file_path = tempfile.mkdtemp() + '/aiohttp.test.cookie' # export cookie jar jar_save = CookieJar(loop=loop) jar_save.update_cookies(cookies_to_receive) jar_save.save(file_path=file_path) jar_load = CookieJar(loop=loop) jar_load.load(file_path=file_path) jar_test = SimpleCookie() for cookie in jar_load: jar_test[cookie.key] = cookie os.unlink(file_path) assert jar_test == cookies_to_receive async def test_update_cookie_with_unicode_domain(loop) -> None: cookies = ( "idna-domain-first=first; Domain=xn--9caa.com; Path=/;", "idna-domain-second=second; Domain=xn--9caa.com; Path=/;", ) jar = CookieJar(loop=loop) jar.update_cookies(SimpleCookie(cookies[0]), URL("http://éé.com/")) jar.update_cookies(SimpleCookie(cookies[1]), URL("http://xn--9caa.com/")) jar_test = SimpleCookie() for cookie in jar: jar_test[cookie.key] = cookie assert jar_test == SimpleCookie(" ".join(cookies)) async def test_filter_cookie_with_unicode_domain(loop) -> None: jar = CookieJar() jar.update_cookies(SimpleCookie( "idna-domain-first=first; Domain=xn--9caa.com; Path=/; " )) assert len(jar.filter_cookies(URL("http://éé.com"))) == 1 assert len(jar.filter_cookies(URL("http://xn--9caa.com"))) == 1 async def test_domain_filter_ip_cookie_send(loop) -> None: jar = CookieJar(loop=loop) cookies = SimpleCookie( "shared-cookie=first; " "domain-cookie=second; Domain=example.com; " "subdomain1-cookie=third; Domain=test1.example.com; " "subdomain2-cookie=fourth; Domain=test2.example.com; " "dotted-domain-cookie=fifth; Domain=.example.com; " "different-domain-cookie=sixth; Domain=different.org; " "secure-cookie=seventh; Domain=secure.com; Secure; " "no-path-cookie=eighth; Domain=pathtest.com; " "path1-cookie=nineth; Domain=pathtest.com; Path=/; " "path2-cookie=tenth; Domain=pathtest.com; Path=/one; " "path3-cookie=eleventh; Domain=pathtest.com; Path=/one/two; " "path4-cookie=twelfth; Domain=pathtest.com; Path=/one/two/; " "expires-cookie=thirteenth; Domain=expirestest.com; Path=/;" " Expires=Tue, 1 Jan 1980 12:00:00 GMT; " "max-age-cookie=fourteenth; Domain=maxagetest.com; Path=/;" " Max-Age=60; " "invalid-max-age-cookie=fifteenth; Domain=invalid-values.com; " " Max-Age=string; " "invalid-expires-cookie=sixteenth; Domain=invalid-values.com; " " Expires=string;" ) jar.update_cookies(cookies) cookies_sent = jar.filter_cookies(URL("http://1.2.3.4/")).output( header='Cookie:') assert cookies_sent == 'Cookie: shared-cookie=first' async def test_domain_filter_ip_cookie_receive(cookies_to_receive) -> None: jar = CookieJar() jar.update_cookies(cookies_to_receive, URL("http://1.2.3.4/")) assert len(jar) == 0 async def test_preserving_ip_domain_cookies(loop) -> None: jar = CookieJar(loop=loop, unsafe=True) jar.update_cookies(SimpleCookie( "shared-cookie=first; " "ip-cookie=second; Domain=127.0.0.1;" )) cookies_sent = jar.filter_cookies(URL("http://127.0.0.1/")).output( header='Cookie:') assert cookies_sent == ('Cookie: ip-cookie=second\r\n' 'Cookie: shared-cookie=first') async def test_preserving_quoted_cookies(loop) -> None: jar = CookieJar(loop=loop, unsafe=True) jar.update_cookies(SimpleCookie( "ip-cookie=\"second\"; Domain=127.0.0.1;" )) 
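# The double quotes are part of the stored cookie value and must be preserved
# verbatim when the jar emits the Cookie header.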
cookies_sent = jar.filter_cookies(URL("http://127.0.0.1/")).output( header='Cookie:') assert cookies_sent == 'Cookie: ip-cookie=\"second\"' async def test_ignore_domain_ending_with_dot(loop) -> None: jar = CookieJar(loop=loop, unsafe=True) jar.update_cookies(SimpleCookie("cookie=val; Domain=example.com.;"), URL("http://www.example.com")) cookies_sent = jar.filter_cookies(URL("http://www.example.com/")) assert cookies_sent.output(header='Cookie:') == "Cookie: cookie=val" cookies_sent = jar.filter_cookies(URL("http://example.com/")) assert cookies_sent.output(header='Cookie:') == "" class TestCookieJarBase(unittest.TestCase): def setUp(self): self.loop = asyncio.new_event_loop() asyncio.set_event_loop(None) # N.B. those need to be overridden in child test cases async def make_jar(): return CookieJar() self.jar = self.loop.run_until_complete(make_jar()) def tearDown(self): self.loop.close() def request_reply_with_same_url(self, url): self.jar.update_cookies(self.cookies_to_send) cookies_sent = self.jar.filter_cookies(URL(url)) self.jar.clear() self.jar.update_cookies(self.cookies_to_receive, URL(url)) cookies_received = SimpleCookie() for cookie in self.jar: dict.__setitem__(cookies_received, cookie.key, cookie) self.jar.clear() return cookies_sent, cookies_received class TestCookieJarSafe(TestCookieJarBase): def setUp(self): super().setUp() self.cookies_to_send = SimpleCookie( "shared-cookie=first; " "domain-cookie=second; Domain=example.com; " "subdomain1-cookie=third; Domain=test1.example.com; " "subdomain2-cookie=fourth; Domain=test2.example.com; " "dotted-domain-cookie=fifth; Domain=.example.com; " "different-domain-cookie=sixth; Domain=different.org; " "secure-cookie=seventh; Domain=secure.com; Secure; " "no-path-cookie=eighth; Domain=pathtest.com; " "path1-cookie=nineth; Domain=pathtest.com; Path=/; " "path2-cookie=tenth; Domain=pathtest.com; Path=/one; " "path3-cookie=eleventh; Domain=pathtest.com; Path=/one/two; " "path4-cookie=twelfth; Domain=pathtest.com; Path=/one/two/; " "expires-cookie=thirteenth; Domain=expirestest.com; Path=/;" " Expires=Tue, 1 Jan 1980 12:00:00 GMT; " "max-age-cookie=fourteenth; Domain=maxagetest.com; Path=/;" " Max-Age=60; " "invalid-max-age-cookie=fifteenth; Domain=invalid-values.com; " " Max-Age=string; " "invalid-expires-cookie=sixteenth; Domain=invalid-values.com; " " Expires=string;" ) self.cookies_to_receive = SimpleCookie( "unconstrained-cookie=first; Path=/; " "domain-cookie=second; Domain=example.com; Path=/; " "subdomain1-cookie=third; Domain=test1.example.com; Path=/; " "subdomain2-cookie=fourth; Domain=test2.example.com; Path=/; " "dotted-domain-cookie=fifth; Domain=.example.com; Path=/; " "different-domain-cookie=sixth; Domain=different.org; Path=/; " "no-path-cookie=seventh; Domain=pathtest.com; " "path-cookie=eighth; Domain=pathtest.com; Path=/somepath; " "wrong-path-cookie=nineth; Domain=pathtest.com; Path=somepath;" ) async def make_jar(): return CookieJar() self.jar = self.loop.run_until_complete(make_jar()) def timed_request(self, url, update_time, send_time): if isinstance(update_time, int): update_time = datetime.timedelta(seconds=update_time) elif isinstance(update_time, float): update_time = datetime.datetime.fromtimestamp(update_time) if isinstance(send_time, int): send_time = datetime.timedelta(seconds=send_time) elif isinstance(send_time, float): send_time = datetime.datetime.fromtimestamp(send_time) with freeze_time(update_time): self.jar.update_cookies(self.cookies_to_send) with freeze_time(send_time): cookies_sent = 
self.jar.filter_cookies(URL(url)) self.jar.clear() return cookies_sent def test_domain_filter_same_host(self) -> None: cookies_sent, cookies_received = ( self.request_reply_with_same_url("http://example.com/")) self.assertEqual(set(cookies_sent.keys()), { "shared-cookie", "domain-cookie", "dotted-domain-cookie" }) self.assertEqual(set(cookies_received.keys()), { "unconstrained-cookie", "domain-cookie", "dotted-domain-cookie" }) def test_domain_filter_same_host_and_subdomain(self) -> None: cookies_sent, cookies_received = ( self.request_reply_with_same_url("http://test1.example.com/")) self.assertEqual(set(cookies_sent.keys()), { "shared-cookie", "domain-cookie", "subdomain1-cookie", "dotted-domain-cookie" }) self.assertEqual(set(cookies_received.keys()), { "unconstrained-cookie", "domain-cookie", "subdomain1-cookie", "dotted-domain-cookie" }) def test_domain_filter_same_host_diff_subdomain(self) -> None: cookies_sent, cookies_received = ( self.request_reply_with_same_url("http://different.example.com/")) self.assertEqual(set(cookies_sent.keys()), { "shared-cookie", "domain-cookie", "dotted-domain-cookie" }) self.assertEqual(set(cookies_received.keys()), { "unconstrained-cookie", "domain-cookie", "dotted-domain-cookie" }) def test_domain_filter_diff_host(self) -> None: cookies_sent, cookies_received = ( self.request_reply_with_same_url("http://different.org/")) self.assertEqual(set(cookies_sent.keys()), { "shared-cookie", "different-domain-cookie" }) self.assertEqual(set(cookies_received.keys()), { "unconstrained-cookie", "different-domain-cookie" }) def test_domain_filter_host_only(self) -> None: self.jar.update_cookies(self.cookies_to_receive, URL("http://example.com/")) cookies_sent = self.jar.filter_cookies(URL("http://example.com/")) self.assertIn("unconstrained-cookie", set(cookies_sent.keys())) cookies_sent = self.jar.filter_cookies(URL("http://different.org/")) self.assertNotIn("unconstrained-cookie", set(cookies_sent.keys())) def test_secure_filter(self) -> None: cookies_sent, _ = ( self.request_reply_with_same_url("http://secure.com/")) self.assertEqual(set(cookies_sent.keys()), { "shared-cookie" }) cookies_sent, _ = ( self.request_reply_with_same_url("https://secure.com/")) self.assertEqual(set(cookies_sent.keys()), { "shared-cookie", "secure-cookie" }) def test_path_filter_root(self) -> None: cookies_sent, _ = ( self.request_reply_with_same_url("http://pathtest.com/")) self.assertEqual(set(cookies_sent.keys()), { "shared-cookie", "no-path-cookie", "path1-cookie" }) def test_path_filter_folder(self) -> None: cookies_sent, _ = ( self.request_reply_with_same_url("http://pathtest.com/one/")) self.assertEqual(set(cookies_sent.keys()), { "shared-cookie", "no-path-cookie", "path1-cookie", "path2-cookie" }) def test_path_filter_file(self) -> None: cookies_sent, _ = self.request_reply_with_same_url( "http://pathtest.com/one/two") self.assertEqual(set(cookies_sent.keys()), { "shared-cookie", "no-path-cookie", "path1-cookie", "path2-cookie", "path3-cookie" }) def test_path_filter_subfolder(self) -> None: cookies_sent, _ = self.request_reply_with_same_url( "http://pathtest.com/one/two/") self.assertEqual(set(cookies_sent.keys()), { "shared-cookie", "no-path-cookie", "path1-cookie", "path2-cookie", "path3-cookie", "path4-cookie" }) def test_path_filter_subsubfolder(self) -> None: cookies_sent, _ = self.request_reply_with_same_url( "http://pathtest.com/one/two/three/") self.assertEqual(set(cookies_sent.keys()), { "shared-cookie", "no-path-cookie", "path1-cookie", "path2-cookie", 
"path3-cookie", "path4-cookie" }) def test_path_filter_different_folder(self) -> None: cookies_sent, _ = ( self.request_reply_with_same_url("http://pathtest.com/hundred/")) self.assertEqual(set(cookies_sent.keys()), { "shared-cookie", "no-path-cookie", "path1-cookie" }) def test_path_value(self) -> None: _, cookies_received = ( self.request_reply_with_same_url("http://pathtest.com/")) self.assertEqual(set(cookies_received.keys()), { "unconstrained-cookie", "no-path-cookie", "path-cookie", "wrong-path-cookie" }) self.assertEqual(cookies_received["no-path-cookie"]["path"], "/") self.assertEqual(cookies_received["path-cookie"]["path"], "/somepath") self.assertEqual(cookies_received["wrong-path-cookie"]["path"], "/") def test_expires(self) -> None: ts_before = datetime.datetime( 1975, 1, 1, tzinfo=datetime.timezone.utc).timestamp() ts_after = datetime.datetime( 2115, 1, 1, tzinfo=datetime.timezone.utc).timestamp() cookies_sent = self.timed_request( "http://expirestest.com/", ts_before, ts_before) self.assertEqual(set(cookies_sent.keys()), { "shared-cookie", "expires-cookie" }) cookies_sent = self.timed_request( "http://expirestest.com/", ts_before, ts_after) self.assertEqual(set(cookies_sent.keys()), { "shared-cookie" }) def test_max_age(self) -> None: cookies_sent = self.timed_request( "http://maxagetest.com/", 1000, 1000) self.assertEqual(set(cookies_sent.keys()), { "shared-cookie", "max-age-cookie" }) cookies_sent = self.timed_request( "http://maxagetest.com/", 1000, 2000) self.assertEqual(set(cookies_sent.keys()), { "shared-cookie" }) def test_invalid_values(self) -> None: cookies_sent, cookies_received = ( self.request_reply_with_same_url("http://invalid-values.com/")) self.assertEqual(set(cookies_sent.keys()), { "shared-cookie", "invalid-max-age-cookie", "invalid-expires-cookie" }) cookie = cookies_sent["invalid-max-age-cookie"] self.assertEqual(cookie["max-age"], "") cookie = cookies_sent["invalid-expires-cookie"] self.assertEqual(cookie["expires"], "") def test_cookie_not_expired_when_added_after_removal(self) -> None: """Test case for https://github.com/aio-libs/aiohttp/issues/2084""" timestamps = [533588.993, 533588.993, 533588.993, 533588.993, 533589.093, 533589.093] loop = mock.Mock() loop.time.side_effect = itertools.chain( timestamps, itertools.cycle([timestamps[-1]])) async def make_jar(): return CookieJar(unsafe=True) jar = self.loop.run_until_complete(make_jar()) # Remove `foo` cookie. jar.update_cookies(SimpleCookie('foo=""; Max-Age=0')) # Set `foo` cookie to `bar`. jar.update_cookies(SimpleCookie('foo="bar"')) # Assert that there is a cookie. 
assert len(jar) == 1 async def test_dummy_cookie_jar() -> None: cookie = SimpleCookie('foo=bar; Domain=example.com;') dummy_jar = DummyCookieJar() assert len(dummy_jar) == 0 dummy_jar.update_cookies(cookie) assert len(dummy_jar) == 0 with pytest.raises(StopIteration): next(iter(dummy_jar)) assert not dummy_jar.filter_cookies(URL("http://example.com/")) dummy_jar.clear() aiohttp-3.6.2/tests/test_flowcontrol_streams.py0000644000175100001650000001013113547410117022342 0ustar vstsdocker00000000000000from unittest import mock import pytest from aiohttp import streams @pytest.fixture def protocol(): return mock.Mock(_reading_paused=False) @pytest.fixture def stream(loop, protocol): out = streams.StreamReader(protocol, limit=1, loop=loop) out._allow_pause = True return out @pytest.fixture def buffer(loop, protocol): out = streams.FlowControlDataQueue(protocol, limit=1, loop=loop) out._allow_pause = True return out class TestFlowControlStreamReader: async def test_read(self, stream) -> None: stream.feed_data(b'da', 2) res = await stream.read(1) assert res == b'd' assert not stream._protocol.resume_reading.called async def test_read_resume_paused(self, stream) -> None: stream.feed_data(b'test', 4) stream._protocol._reading_paused = True res = await stream.read(1) assert res == b't' assert stream._protocol.pause_reading.called async def test_readline(self, stream) -> None: stream.feed_data(b'd\n', 5) res = await stream.readline() assert res == b'd\n' assert not stream._protocol.resume_reading.called async def test_readline_resume_paused(self, stream) -> None: stream._protocol._reading_paused = True stream.feed_data(b'd\n', 5) res = await stream.readline() assert res == b'd\n' assert stream._protocol.resume_reading.called async def test_readany(self, stream) -> None: stream.feed_data(b'data', 4) res = await stream.readany() assert res == b'data' assert not stream._protocol.resume_reading.called async def test_readany_resume_paused(self, stream) -> None: stream._protocol._reading_paused = True stream.feed_data(b'data', 4) res = await stream.readany() assert res == b'data' assert stream._protocol.resume_reading.called async def test_readchunk(self, stream) -> None: stream.feed_data(b'data', 4) res, end_of_http_chunk = await stream.readchunk() assert res == b'data' assert not end_of_http_chunk assert not stream._protocol.resume_reading.called async def test_readchunk_resume_paused(self, stream) -> None: stream._protocol._reading_paused = True stream.feed_data(b'data', 4) res, end_of_http_chunk = await stream.readchunk() assert res == b'data' assert not end_of_http_chunk assert stream._protocol.resume_reading.called async def test_readexactly(self, stream) -> None: stream.feed_data(b'data', 4) res = await stream.readexactly(3) assert res == b'dat' assert not stream._protocol.resume_reading.called async def test_feed_data(self, stream) -> None: stream._protocol._reading_paused = False stream.feed_data(b'datadata', 8) assert stream._protocol.pause_reading.called async def test_read_nowait(self, stream) -> None: stream._protocol._reading_paused = True stream.feed_data(b'data1', 5) stream.feed_data(b'data2', 5) stream.feed_data(b'data3', 5) res = await stream.read(5) assert res == b'data1' assert stream._protocol.resume_reading.call_count == 0 res = stream.read_nowait(5) assert res == b'data2' assert stream._protocol.resume_reading.call_count == 0 res = stream.read_nowait(5) assert res == b'data3' assert stream._protocol.resume_reading.call_count == 1 stream._protocol._reading_paused = False res = 
stream.read_nowait(5) assert res == b'' assert stream._protocol.resume_reading.call_count == 1 class TestFlowControlDataQueue: def test_feed_pause(self, buffer) -> None: buffer._protocol._reading_paused = False buffer.feed_data(object(), 100) assert buffer._protocol.pause_reading.called async def test_resume_on_read(self, buffer) -> None: buffer.feed_data(object(), 100) buffer._protocol._reading_paused = True await buffer.read() assert buffer._protocol.resume_reading.called aiohttp-3.6.2/tests/test_formdata.py0000644000175100001650000000445513547410117020045 0ustar vstsdocker00000000000000from unittest import mock import pytest from aiohttp.formdata import FormData @pytest.fixture def buf(): return bytearray() @pytest.fixture def writer(buf): writer = mock.Mock() async def write(chunk): buf.extend(chunk) writer.write.side_effect = write return writer def test_formdata_multipart(buf, writer) -> None: form = FormData() assert not form.is_multipart form.add_field('test', b'test', filename='test.txt') assert form.is_multipart def test_invalid_formdata_payload() -> None: form = FormData() form.add_field('test', object(), filename='test.txt') with pytest.raises(TypeError): form() def test_invalid_formdata_params() -> None: with pytest.raises(TypeError): FormData('asdasf') def test_invalid_formdata_params2() -> None: with pytest.raises(TypeError): FormData('as') # 2-char str is not allowed def test_invalid_formdata_content_type() -> None: form = FormData() invalid_vals = [0, 0.1, {}, [], b'foo'] for invalid_val in invalid_vals: with pytest.raises(TypeError): form.add_field('foo', 'bar', content_type=invalid_val) def test_invalid_formdata_filename() -> None: form = FormData() invalid_vals = [0, 0.1, {}, [], b'foo'] for invalid_val in invalid_vals: with pytest.raises(TypeError): form.add_field('foo', 'bar', filename=invalid_val) def test_invalid_formdata_content_transfer_encoding() -> None: form = FormData() invalid_vals = [0, 0.1, {}, [], b'foo'] for invalid_val in invalid_vals: with pytest.raises(TypeError): form.add_field('foo', 'bar', content_transfer_encoding=invalid_val) async def test_formdata_field_name_is_quoted(buf, writer) -> None: form = FormData(charset="ascii") form.add_field("emails[]", "xxx@x.co", content_type="multipart/form-data") payload = form() await payload.write(writer) assert b'name="emails%5B%5D"' in buf async def test_formdata_field_name_is_not_quoted(buf, writer) -> None: form = FormData(quote_fields=False, charset="ascii") form.add_field("emails[]", "xxx@x.co", content_type="multipart/form-data") payload = form() await payload.write(writer) assert b'name="emails[]"' in buf aiohttp-3.6.2/tests/test_frozenlist.py0000644000175100001650000001401713547410117020442 0ustar vstsdocker00000000000000from collections.abc import MutableSequence import pytest from aiohttp.frozenlist import FrozenList, PyFrozenList class FrozenListMixin: FrozenList = NotImplemented SKIP_METHODS = {'__abstractmethods__', '__slots__'} def test_subclass(self) -> None: assert issubclass(self.FrozenList, MutableSequence) def test_iface(self) -> None: for name in set(dir(MutableSequence)) - self.SKIP_METHODS: if name.startswith('_') and not name.endswith('_'): continue assert hasattr(self.FrozenList, name) def test_ctor_default(self) -> None: _list = self.FrozenList([]) assert not _list.frozen def test_ctor(self) -> None: _list = self.FrozenList([1]) assert not _list.frozen def test_ctor_copy_list(self) -> None: orig = [1] _list = self.FrozenList(orig) del _list[0] assert _list != orig def test_freeze(self) 
-> None: _list = self.FrozenList() _list.freeze() assert _list.frozen def test_repr(self) -> None: _list = self.FrozenList([1]) assert repr(_list) == '' _list.freeze() assert repr(_list) == '' def test_getitem(self) -> None: _list = self.FrozenList([1, 2]) assert _list[1] == 2 def test_setitem(self) -> None: _list = self.FrozenList([1, 2]) _list[1] = 3 assert _list[1] == 3 def test_delitem(self) -> None: _list = self.FrozenList([1, 2]) del _list[0] assert len(_list) == 1 assert _list[0] == 2 def test_len(self) -> None: _list = self.FrozenList([1]) assert len(_list) == 1 def test_iter(self) -> None: _list = self.FrozenList([1, 2]) assert list(iter(_list)) == [1, 2] def test_reversed(self) -> None: _list = self.FrozenList([1, 2]) assert list(reversed(_list)) == [2, 1] def test_eq(self) -> None: _list = self.FrozenList([1]) assert _list == [1] def test_ne(self) -> None: _list = self.FrozenList([1]) assert _list != [2] def test_le(self) -> None: _list = self.FrozenList([1]) assert _list <= [1] def test_lt(self) -> None: _list = self.FrozenList([1]) assert _list <= [3] def test_ge(self) -> None: _list = self.FrozenList([1]) assert _list >= [1] def test_gt(self) -> None: _list = self.FrozenList([2]) assert _list > [1] def test_insert(self) -> None: _list = self.FrozenList([2]) _list.insert(0, 1) assert _list == [1, 2] def test_frozen_setitem(self) -> None: _list = self.FrozenList([1]) _list.freeze() with pytest.raises(RuntimeError): _list[0] = 2 def test_frozen_delitem(self) -> None: _list = self.FrozenList([1]) _list.freeze() with pytest.raises(RuntimeError): del _list[0] def test_frozen_insert(self) -> None: _list = self.FrozenList([1]) _list.freeze() with pytest.raises(RuntimeError): _list.insert(0, 2) def test_contains(self) -> None: _list = self.FrozenList([2]) assert 2 in _list def test_iadd(self) -> None: _list = self.FrozenList([1]) _list += [2] assert _list == [1, 2] def test_iadd_frozen(self) -> None: _list = self.FrozenList([1]) _list.freeze() with pytest.raises(RuntimeError): _list += [2] assert _list == [1] def test_index(self) -> None: _list = self.FrozenList([1]) assert _list.index(1) == 0 def test_remove(self) -> None: _list = self.FrozenList([1]) _list.remove(1) assert len(_list) == 0 def test_remove_frozen(self) -> None: _list = self.FrozenList([1]) _list.freeze() with pytest.raises(RuntimeError): _list.remove(1) assert _list == [1] def test_clear(self) -> None: _list = self.FrozenList([1]) _list.clear() assert len(_list) == 0 def test_clear_frozen(self) -> None: _list = self.FrozenList([1]) _list.freeze() with pytest.raises(RuntimeError): _list.clear() assert _list == [1] def test_extend(self) -> None: _list = self.FrozenList([1]) _list.extend([2]) assert _list == [1, 2] def test_extend_frozen(self) -> None: _list = self.FrozenList([1]) _list.freeze() with pytest.raises(RuntimeError): _list.extend([2]) assert _list == [1] def test_reverse(self) -> None: _list = self.FrozenList([1, 2]) _list.reverse() assert _list == [2, 1] def test_reverse_frozen(self) -> None: _list = self.FrozenList([1, 2]) _list.freeze() with pytest.raises(RuntimeError): _list.reverse() assert _list == [1, 2] def test_pop(self) -> None: _list = self.FrozenList([1, 2]) assert _list.pop(0) == 1 assert _list == [2] def test_pop_default(self) -> None: _list = self.FrozenList([1, 2]) assert _list.pop() == 2 assert _list == [1] def test_pop_frozen(self) -> None: _list = self.FrozenList([1, 2]) _list.freeze() with pytest.raises(RuntimeError): _list.pop() assert _list == [1, 2] def test_append(self) -> None: _list 
= self.FrozenList([1, 2]) _list.append(3) assert _list == [1, 2, 3] def test_append_frozen(self) -> None: _list = self.FrozenList([1, 2]) _list.freeze() with pytest.raises(RuntimeError): _list.append(3) assert _list == [1, 2] def test_count(self) -> None: _list = self.FrozenList([1, 2]) assert _list.count(1) == 1 class TestFrozenList(FrozenListMixin): FrozenList = FrozenList class TestFrozenListPy(FrozenListMixin): FrozenList = PyFrozenList aiohttp-3.6.2/tests/test_helpers.py0000644000175100001650000004032713547410117017710 0ustar vstsdocker00000000000000import asyncio import base64 import gc import os import platform import tempfile from unittest import mock import pytest from multidict import MultiDict from yarl import URL from aiohttp import helpers IS_PYPY = platform.python_implementation() == 'PyPy' # ------------------- parse_mimetype ---------------------------------- @pytest.mark.parametrize('mimetype, expected', [ ('', helpers.MimeType('', '', '', MultiDict())), ('*', helpers.MimeType('*', '*', '', MultiDict())), ('application/json', helpers.MimeType('application', 'json', '', MultiDict())), ('application/json; charset=utf-8', helpers.MimeType('application', 'json', '', MultiDict({'charset': 'utf-8'}))), ('''application/json; charset=utf-8;''', helpers.MimeType('application', 'json', '', MultiDict({'charset': 'utf-8'}))), ('ApPlIcAtIoN/JSON;ChaRseT="UTF-8"', helpers.MimeType('application', 'json', '', MultiDict({'charset': 'UTF-8'}))), ('application/rss+xml', helpers.MimeType('application', 'rss', 'xml', MultiDict())), ('text/plain;base64', helpers.MimeType('text', 'plain', '', MultiDict({'base64': ''}))) ]) def test_parse_mimetype(mimetype, expected) -> None: result = helpers.parse_mimetype(mimetype) assert isinstance(result, helpers.MimeType) assert result == expected # ------------------- guess_filename ---------------------------------- def test_guess_filename_with_tempfile() -> None: with tempfile.TemporaryFile() as fp: assert (helpers.guess_filename(fp, 'no-throw') is not None) # ------------------- BasicAuth ----------------------------------- def test_basic_auth1() -> None: # missing password here with pytest.raises(ValueError): helpers.BasicAuth(None) def test_basic_auth2() -> None: with pytest.raises(ValueError): helpers.BasicAuth('nkim', None) def test_basic_with_auth_colon_in_login() -> None: with pytest.raises(ValueError): helpers.BasicAuth('nkim:1', 'pwd') def test_basic_auth3() -> None: auth = helpers.BasicAuth('nkim') assert auth.login == 'nkim' assert auth.password == '' def test_basic_auth4() -> None: auth = helpers.BasicAuth('nkim', 'pwd') assert auth.login == 'nkim' assert auth.password == 'pwd' assert auth.encode() == 'Basic bmtpbTpwd2Q=' @pytest.mark.parametrize('header', ( 'Basic bmtpbTpwd2Q=', 'basic bmtpbTpwd2Q=', )) def test_basic_auth_decode(header) -> None: auth = helpers.BasicAuth.decode(header) assert auth.login == 'nkim' assert auth.password == 'pwd' def test_basic_auth_invalid() -> None: with pytest.raises(ValueError): helpers.BasicAuth.decode('bmtpbTpwd2Q=') def test_basic_auth_decode_not_basic() -> None: with pytest.raises(ValueError): helpers.BasicAuth.decode('Complex bmtpbTpwd2Q=') def test_basic_auth_decode_bad_base64() -> None: with pytest.raises(ValueError): helpers.BasicAuth.decode('Basic bmtpbTpwd2Q') @pytest.mark.parametrize('header', ('Basic ???', 'Basic ')) def test_basic_auth_decode_illegal_chars_base64(header) -> None: with pytest.raises(ValueError, match='Invalid base64 encoding.'): helpers.BasicAuth.decode(header) def 
test_basic_auth_decode_invalid_credentials() -> None: with pytest.raises(ValueError, match='Invalid credentials.'): header = 'Basic {}'.format(base64.b64encode(b'username').decode()) helpers.BasicAuth.decode(header) @pytest.mark.parametrize('credentials, expected_auth', ( (':', helpers.BasicAuth( login='', password='', encoding='latin1')), ('username:', helpers.BasicAuth( login='username', password='', encoding='latin1')), (':password', helpers.BasicAuth( login='', password='password', encoding='latin1')), ('username:password', helpers.BasicAuth( login='username', password='password', encoding='latin1')), )) def test_basic_auth_decode_blank_username(credentials, expected_auth) -> None: header = 'Basic {}'.format(base64.b64encode(credentials.encode()).decode()) assert helpers.BasicAuth.decode(header) == expected_auth def test_basic_auth_from_url() -> None: url = URL('http://user:pass@example.com') auth = helpers.BasicAuth.from_url(url) assert auth.login == 'user' assert auth.password == 'pass' def test_basic_auth_from_not_url() -> None: with pytest.raises(TypeError): helpers.BasicAuth.from_url('http://user:pass@example.com') class ReifyMixin: reify = NotImplemented def test_reify(self) -> None: class A: def __init__(self): self._cache = {} @self.reify def prop(self): return 1 a = A() assert 1 == a.prop def test_reify_class(self) -> None: class A: def __init__(self): self._cache = {} @self.reify def prop(self): """Docstring.""" return 1 assert isinstance(A.prop, self.reify) assert 'Docstring.' == A.prop.__doc__ def test_reify_assignment(self) -> None: class A: def __init__(self): self._cache = {} @self.reify def prop(self): return 1 a = A() with pytest.raises(AttributeError): a.prop = 123 class TestPyReify(ReifyMixin): reify = helpers.reify_py if not helpers.NO_EXTENSIONS and not IS_PYPY: class TestCReify(ReifyMixin): reify = helpers.reify_c # ----------------------------------- is_ip_address() ---------------------- def test_is_ip_address() -> None: assert helpers.is_ip_address("127.0.0.1") assert helpers.is_ip_address("::1") assert helpers.is_ip_address("FE80:0000:0000:0000:0202:B3FF:FE1E:8329") # Hostnames assert not helpers.is_ip_address("localhost") assert not helpers.is_ip_address("www.example.com") # Out of range assert not helpers.is_ip_address("999.999.999.999") # Contain a port assert not helpers.is_ip_address("127.0.0.1:80") assert not helpers.is_ip_address("[2001:db8:0:1]:80") # Too many "::" assert not helpers.is_ip_address("1200::AB00:1234::2552:7777:1313") def test_is_ip_address_bytes() -> None: assert helpers.is_ip_address(b"127.0.0.1") assert helpers.is_ip_address(b"::1") assert helpers.is_ip_address(b"FE80:0000:0000:0000:0202:B3FF:FE1E:8329") # Hostnames assert not helpers.is_ip_address(b"localhost") assert not helpers.is_ip_address(b"www.example.com") # Out of range assert not helpers.is_ip_address(b"999.999.999.999") # Contain a port assert not helpers.is_ip_address(b"127.0.0.1:80") assert not helpers.is_ip_address(b"[2001:db8:0:1]:80") # Too many "::" assert not helpers.is_ip_address(b"1200::AB00:1234::2552:7777:1313") def test_ipv4_addresses() -> None: ip_addresses = [ '0.0.0.0', '127.0.0.1', '255.255.255.255', ] for address in ip_addresses: assert helpers.is_ipv4_address(address) assert not helpers.is_ipv6_address(address) assert helpers.is_ip_address(address) def test_ipv6_addresses() -> None: ip_addresses = [ '0:0:0:0:0:0:0:0', 'FFFF:FFFF:FFFF:FFFF:FFFF:FFFF:FFFF:FFFF', '00AB:0002:3008:8CFD:00AB:0002:3008:8CFD', '00ab:0002:3008:8cfd:00ab:0002:3008:8cfd', 
'AB:02:3008:8CFD:AB:02:3008:8CFD', 'AB:02:3008:8CFD::02:3008:8CFD', '::', '1::1', ] for address in ip_addresses: assert not helpers.is_ipv4_address(address) assert helpers.is_ipv6_address(address) assert helpers.is_ip_address(address) def test_host_addresses() -> None: hosts = [ 'www.four.part.host', 'www.python.org', 'foo.bar', 'localhost', ] for host in hosts: assert not helpers.is_ip_address(host) def test_is_ip_address_invalid_type() -> None: with pytest.raises(TypeError): helpers.is_ip_address(123) with pytest.raises(TypeError): helpers.is_ip_address(object()) # ----------------------------------- TimeoutHandle ------------------- def test_timeout_handle(loop) -> None: handle = helpers.TimeoutHandle(loop, 10.2) cb = mock.Mock() handle.register(cb) assert cb == handle._callbacks[0][0] handle.close() assert not handle._callbacks def test_timeout_handle_cb_exc(loop) -> None: handle = helpers.TimeoutHandle(loop, 10.2) cb = mock.Mock() handle.register(cb) cb.side_effect = ValueError() handle() assert cb.called assert not handle._callbacks def test_timer_context_cancelled() -> None: with mock.patch('aiohttp.helpers.asyncio') as m_asyncio: m_asyncio.TimeoutError = asyncio.TimeoutError loop = mock.Mock() ctx = helpers.TimerContext(loop) ctx.timeout() with pytest.raises(asyncio.TimeoutError): with ctx: pass if helpers.PY_37: assert m_asyncio.current_task.return_value.cancel.called else: assert m_asyncio.Task.current_task.return_value.cancel.called def test_timer_context_no_task(loop) -> None: with pytest.raises(RuntimeError): with helpers.TimerContext(loop): pass # -------------------------------- CeilTimeout -------------------------- async def test_weakref_handle(loop) -> None: cb = mock.Mock() helpers.weakref_handle(cb, 'test', 0.01, loop, False) await asyncio.sleep(0.1) assert cb.test.called async def test_weakref_handle_weak(loop) -> None: cb = mock.Mock() helpers.weakref_handle(cb, 'test', 0.01, loop, False) del cb gc.collect() await asyncio.sleep(0.1) def test_ceil_call_later() -> None: cb = mock.Mock() loop = mock.Mock() loop.time.return_value = 10.1 helpers.call_later(cb, 10.1, loop) loop.call_at.assert_called_with(21.0, cb) def test_ceil_call_later_no_timeout() -> None: cb = mock.Mock() loop = mock.Mock() helpers.call_later(cb, 0, loop) assert not loop.call_at.called async def test_ceil_timeout(loop) -> None: with helpers.CeilTimeout(None, loop=loop) as timeout: assert timeout._timeout is None assert timeout._cancel_handler is None def test_ceil_timeout_no_task(loop) -> None: with pytest.raises(RuntimeError): with helpers.CeilTimeout(10, loop=loop): pass # -------------------------------- ContentDisposition ------------------- def test_content_disposition() -> None: assert (helpers.content_disposition_header('attachment', foo='bar') == 'attachment; foo="bar"') def test_content_disposition_bad_type() -> None: with pytest.raises(ValueError): helpers.content_disposition_header('foo bar') with pytest.raises(ValueError): helpers.content_disposition_header('тест') with pytest.raises(ValueError): helpers.content_disposition_header('foo\x00bar') with pytest.raises(ValueError): helpers.content_disposition_header('') def test_set_content_disposition_bad_param() -> None: with pytest.raises(ValueError): helpers.content_disposition_header('inline', **{'foo bar': 'baz'}) with pytest.raises(ValueError): helpers.content_disposition_header('inline', **{'тест': 'baz'}) with pytest.raises(ValueError): helpers.content_disposition_header('inline', **{'': 'baz'}) with pytest.raises(ValueError): 
helpers.content_disposition_header('inline', **{'foo\x00bar': 'baz'}) # --------------------- proxies_from_env ------------------------------ def test_proxies_from_env_http(mocker) -> None: url = URL('http://aiohttp.io/path') mocker.patch.dict(os.environ, {'http_proxy': str(url)}) ret = helpers.proxies_from_env() assert ret.keys() == {'http'} assert ret['http'].proxy == url assert ret['http'].proxy_auth is None def test_proxies_from_env_http_proxy_for_https_proto(mocker) -> None: url = URL('http://aiohttp.io/path') mocker.patch.dict(os.environ, {'https_proxy': str(url)}) ret = helpers.proxies_from_env() assert ret.keys() == {'https'} assert ret['https'].proxy == url assert ret['https'].proxy_auth is None def test_proxies_from_env_https_proxy_skipped(mocker) -> None: url = URL('https://aiohttp.io/path') mocker.patch.dict(os.environ, {'https_proxy': str(url)}) log = mocker.patch('aiohttp.log.client_logger.warning') assert helpers.proxies_from_env() == {} log.assert_called_with('HTTPS proxies %s are not supported, ignoring', URL('https://aiohttp.io/path')) def test_proxies_from_env_http_with_auth(mocker) -> None: url = URL('http://user:pass@aiohttp.io/path') mocker.patch.dict(os.environ, {'http_proxy': str(url)}) ret = helpers.proxies_from_env() assert ret.keys() == {'http'} assert ret['http'].proxy == url.with_user(None) proxy_auth = ret['http'].proxy_auth assert proxy_auth.login == 'user' assert proxy_auth.password == 'pass' assert proxy_auth.encoding == 'latin1' # ------------ get_running_loop --------------------------------- def test_get_running_loop_not_running(loop) -> None: with pytest.warns(DeprecationWarning): helpers.get_running_loop() async def test_get_running_loop_ok(loop) -> None: assert helpers.get_running_loop() is loop # ------------- set_result / set_exception ---------------------- async def test_set_result(loop) -> None: fut = loop.create_future() helpers.set_result(fut, 123) assert 123 == await fut async def test_set_result_cancelled(loop) -> None: fut = loop.create_future() fut.cancel() helpers.set_result(fut, 123) with pytest.raises(asyncio.CancelledError): await fut async def test_set_exception(loop) -> None: fut = loop.create_future() helpers.set_exception(fut, RuntimeError()) with pytest.raises(RuntimeError): await fut async def test_set_exception_cancelled(loop) -> None: fut = loop.create_future() fut.cancel() helpers.set_exception(fut, RuntimeError()) with pytest.raises(asyncio.CancelledError): await fut # ----------- ChainMapProxy -------------------------- class TestChainMapProxy: @pytest.mark.skipif(not helpers.PY_36, reason="Requires Python 3.6+") def test_inheritance(self) -> None: with pytest.raises(TypeError): class A(helpers.ChainMapProxy): pass def test_getitem(self) -> None: d1 = {'a': 2, 'b': 3} d2 = {'a': 1} cp = helpers.ChainMapProxy([d1, d2]) assert cp['a'] == 2 assert cp['b'] == 3 def test_getitem_not_found(self) -> None: d = {'a': 1} cp = helpers.ChainMapProxy([d]) with pytest.raises(KeyError): cp['b'] def test_get(self) -> None: d1 = {'a': 2, 'b': 3} d2 = {'a': 1} cp = helpers.ChainMapProxy([d1, d2]) assert cp.get('a') == 2 def test_get_default(self) -> None: d1 = {'a': 2, 'b': 3} d2 = {'a': 1} cp = helpers.ChainMapProxy([d1, d2]) assert cp.get('c', 4) == 4 def test_get_non_default(self) -> None: d1 = {'a': 2, 'b': 3} d2 = {'a': 1} cp = helpers.ChainMapProxy([d1, d2]) assert cp.get('a', 4) == 2 def test_len(self) -> None: d1 = {'a': 2, 'b': 3} d2 = {'a': 1} cp = helpers.ChainMapProxy([d1, d2]) assert len(cp) == 2 def test_iter(self) -> None: d1 = 
{'a': 2, 'b': 3} d2 = {'a': 1} cp = helpers.ChainMapProxy([d1, d2]) assert set(cp) == {'a', 'b'} def test_contains(self) -> None: d1 = {'a': 2, 'b': 3} d2 = {'a': 1} cp = helpers.ChainMapProxy([d1, d2]) assert 'a' in cp assert 'b' in cp assert 'c' not in cp def test_bool(self) -> None: assert helpers.ChainMapProxy([{'a': 1}]) assert not helpers.ChainMapProxy([{}, {}]) assert not helpers.ChainMapProxy([]) def test_repr(self) -> None: d1 = {'a': 2, 'b': 3} d2 = {'a': 1} cp = helpers.ChainMapProxy([d1, d2]) expected = "ChainMapProxy({!r}, {!r})".format(d1, d2) assert expected == repr(cp) aiohttp-3.6.2/tests/test_http_exceptions.py0000644000175100001650000001257313547410117021470 0ustar vstsdocker00000000000000"""Tests for http_exceptions.py""" import pickle from aiohttp import http_exceptions class TestHttpProcessingError: def test_ctor(self) -> None: err = http_exceptions.HttpProcessingError( code=500, message='Internal error', headers={}) assert err.code == 500 assert err.message == 'Internal error' assert err.headers == {} def test_pickle(self) -> None: err = http_exceptions.HttpProcessingError( code=500, message='Internal error', headers={}) err.foo = 'bar' for proto in range(pickle.HIGHEST_PROTOCOL + 1): pickled = pickle.dumps(err, proto) err2 = pickle.loads(pickled) assert err2.code == 500 assert err2.message == 'Internal error' assert err2.headers == {} assert err2.foo == 'bar' def test_str(self) -> None: err = http_exceptions.HttpProcessingError( code=500, message='Internal error', headers={}) assert str(err) == "500, message='Internal error'" def test_repr(self) -> None: err = http_exceptions.HttpProcessingError( code=500, message='Internal error', headers={}) assert repr(err) == ("") class TestBadHttpMessage: def test_ctor(self) -> None: err = http_exceptions.BadHttpMessage('Bad HTTP message', headers={}) assert err.code == 400 assert err.message == 'Bad HTTP message' assert err.headers == {} def test_pickle(self) -> None: err = http_exceptions.BadHttpMessage( message='Bad HTTP message', headers={}) err.foo = 'bar' for proto in range(pickle.HIGHEST_PROTOCOL + 1): pickled = pickle.dumps(err, proto) err2 = pickle.loads(pickled) assert err2.code == 400 assert err2.message == 'Bad HTTP message' assert err2.headers == {} assert err2.foo == 'bar' def test_str(self) -> None: err = http_exceptions.BadHttpMessage( message='Bad HTTP message', headers={}) assert str(err) == "400, message='Bad HTTP message'" def test_repr(self) -> None: err = http_exceptions.BadHttpMessage( message='Bad HTTP message', headers={}) assert repr(err) == "" class TestLineTooLong: def test_ctor(self) -> None: err = http_exceptions.LineTooLong('spam', '10', '12') assert err.code == 400 assert err.message == 'Got more than 10 bytes (12) when reading spam.' 
assert err.headers is None def test_pickle(self) -> None: err = http_exceptions.LineTooLong( line='spam', limit='10', actual_size='12') err.foo = 'bar' for proto in range(pickle.HIGHEST_PROTOCOL + 1): pickled = pickle.dumps(err, proto) err2 = pickle.loads(pickled) assert err2.code == 400 assert err2.message == ('Got more than 10 bytes (12) ' 'when reading spam.') assert err2.headers is None assert err2.foo == 'bar' def test_str(self) -> None: err = http_exceptions.LineTooLong( line='spam', limit='10', actual_size='12') assert str(err) == ("400, message='Got more than 10 bytes (12) " "when reading spam.'") def test_repr(self) -> None: err = http_exceptions.LineTooLong( line='spam', limit='10', actual_size='12') assert repr(err) == ("") class TestInvalidHeader: def test_ctor(self) -> None: err = http_exceptions.InvalidHeader('X-Spam') assert err.code == 400 assert err.message == 'Invalid HTTP Header: X-Spam' assert err.headers is None def test_pickle(self) -> None: err = http_exceptions.InvalidHeader(hdr='X-Spam') err.foo = 'bar' for proto in range(pickle.HIGHEST_PROTOCOL + 1): pickled = pickle.dumps(err, proto) err2 = pickle.loads(pickled) assert err2.code == 400 assert err2.message == 'Invalid HTTP Header: X-Spam' assert err2.headers is None assert err2.foo == 'bar' def test_str(self) -> None: err = http_exceptions.InvalidHeader(hdr='X-Spam') assert str(err) == "400, message='Invalid HTTP Header: X-Spam'" def test_repr(self) -> None: err = http_exceptions.InvalidHeader(hdr='X-Spam') assert repr(err) == ("") class TestBadStatusLine: def test_ctor(self) -> None: err = http_exceptions.BadStatusLine('Test') assert err.line == 'Test' assert str(err) == 'Test' def test_ctor2(self) -> None: err = http_exceptions.BadStatusLine(b'') assert err.line == "b''" assert str(err) == "b''" def test_pickle(self) -> None: err = http_exceptions.BadStatusLine('Test') err.foo = 'bar' for proto in range(pickle.HIGHEST_PROTOCOL + 1): pickled = pickle.dumps(err, proto) err2 = pickle.loads(pickled) assert err2.line == 'Test' assert err2.foo == 'bar' aiohttp-3.6.2/tests/test_http_parser.py0000644000175100001650000007444413547410117020610 0ustar vstsdocker00000000000000"""Tests for aiohttp/protocol.py""" import asyncio import zlib from unittest import mock import pytest from multidict import CIMultiDict from yarl import URL import aiohttp from aiohttp import http_exceptions, streams from aiohttp.http_parser import ( DeflateBuffer, HttpPayloadParser, HttpRequestParserPy, HttpResponseParserPy, ) try: import brotli except ImportError: brotli = None REQUEST_PARSERS = [HttpRequestParserPy] RESPONSE_PARSERS = [HttpResponseParserPy] try: from aiohttp.http_parser import HttpRequestParserC, HttpResponseParserC REQUEST_PARSERS.append(HttpRequestParserC) RESPONSE_PARSERS.append(HttpResponseParserC) except ImportError: # pragma: no cover pass @pytest.fixture def protocol(): return mock.Mock() @pytest.fixture(params=REQUEST_PARSERS) def parser(loop, protocol, request): """Parser implementations""" return request.param(protocol, loop, max_line_size=8190, max_headers=32768, max_field_size=8190) @pytest.fixture(params=REQUEST_PARSERS) def request_cls(request): """Request Parser class""" return request.param @pytest.fixture(params=RESPONSE_PARSERS) def response(loop, protocol, request): """Parser implementations""" return request.param(protocol, loop, max_line_size=8190, max_headers=32768, max_field_size=8190) @pytest.fixture(params=RESPONSE_PARSERS) def response_cls(request): """Parser implementations""" return request.param 
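# ---------------------------------------------------------------------------
# Editor's illustrative sketch (not part of the original test module).
# The fixtures above parametrize every test over the pure-Python parser and,
# when the extension is importable, the C parser.  The sketch below shows the
# minimal way such a parser is driven, mirroring the constructor arguments
# used by the fixtures in this file; the mock protocol object and the freshly
# created event loop are assumptions of the example, not fixtures.
def _example_feed_simple_request():  # leading underscore: not collected by pytest
    import asyncio
    from unittest import mock

    loop = asyncio.new_event_loop()
    parser = HttpRequestParserPy(mock.Mock(), loop,
                                 max_line_size=8190,
                                 max_headers=32768,
                                 max_field_size=8190)
    # feed_data() returns the parsed (message, payload) pairs, an "upgraded"
    # flag and any unconsumed tail bytes, exactly as asserted in the tests below.
    messages, upgraded, tail = parser.feed_data(b'GET /demo HTTP/1.1\r\n\r\n')
    msg, payload = messages[0]
    assert msg.method == 'GET' and msg.path == '/demo'
    loop.close()
    return msg, payload
# ---------------------------------------------------------------------------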
@pytest.fixture def stream(): return mock.Mock() def test_parse_headers(parser) -> None: text = b'''GET /test HTTP/1.1\r test: line\r continue\r test2: data\r \r ''' messages, upgrade, tail = parser.feed_data(text) assert len(messages) == 1 msg = messages[0][0] assert list(msg.headers.items()) == [('test', 'line continue'), ('test2', 'data')] assert msg.raw_headers == ((b'test', b'line continue'), (b'test2', b'data')) assert not msg.should_close assert msg.compression is None assert not msg.upgrade def test_parse(parser) -> None: text = b'GET /test HTTP/1.1\r\n\r\n' messages, upgrade, tail = parser.feed_data(text) assert len(messages) == 1 msg, _ = messages[0] assert msg.compression is None assert not msg.upgrade assert msg.method == 'GET' assert msg.path == '/test' assert msg.version == (1, 1) async def test_parse_body(parser) -> None: text = b'GET /test HTTP/1.1\r\nContent-Length: 4\r\n\r\nbody' messages, upgrade, tail = parser.feed_data(text) assert len(messages) == 1 _, payload = messages[0] body = await payload.read(4) assert body == b'body' async def test_parse_body_with_CRLF(parser) -> None: text = b'\r\nGET /test HTTP/1.1\r\nContent-Length: 4\r\n\r\nbody' messages, upgrade, tail = parser.feed_data(text) assert len(messages) == 1 _, payload = messages[0] body = await payload.read(4) assert body == b'body' def test_parse_delayed(parser) -> None: text = b'GET /test HTTP/1.1\r\n' messages, upgrade, tail = parser.feed_data(text) assert len(messages) == 0 assert not upgrade messages, upgrade, tail = parser.feed_data(b'\r\n') assert len(messages) == 1 msg = messages[0][0] assert msg.method == 'GET' def test_headers_multi_feed(parser) -> None: text1 = b'GET /test HTTP/1.1\r\n' text2 = b'test: line\r' text3 = b'\n continue\r\n\r\n' messages, upgrade, tail = parser.feed_data(text1) assert len(messages) == 0 messages, upgrade, tail = parser.feed_data(text2) assert len(messages) == 0 messages, upgrade, tail = parser.feed_data(text3) assert len(messages) == 1 msg = messages[0][0] assert list(msg.headers.items()) == [('test', 'line continue')] assert msg.raw_headers == ((b'test', b'line continue'),) assert not msg.should_close assert msg.compression is None assert not msg.upgrade def test_headers_split_field(parser) -> None: text1 = b'GET /test HTTP/1.1\r\n' text2 = b't' text3 = b'es' text4 = b't: value\r\n\r\n' messages, upgrade, tail = parser.feed_data(text1) messages, upgrade, tail = parser.feed_data(text2) messages, upgrade, tail = parser.feed_data(text3) assert len(messages) == 0 messages, upgrade, tail = parser.feed_data(text4) assert len(messages) == 1 msg = messages[0][0] assert list(msg.headers.items()) == [('test', 'value')] assert msg.raw_headers == ((b'test', b'value'),) assert not msg.should_close assert msg.compression is None assert not msg.upgrade def test_parse_headers_multi(parser) -> None: text = (b'GET /test HTTP/1.1\r\n' b'Set-Cookie: c1=cookie1\r\n' b'Set-Cookie: c2=cookie2\r\n\r\n') messages, upgrade, tail = parser.feed_data(text) assert len(messages) == 1 msg = messages[0][0] assert list(msg.headers.items()) == [('Set-Cookie', 'c1=cookie1'), ('Set-Cookie', 'c2=cookie2')] assert msg.raw_headers == ((b'Set-Cookie', b'c1=cookie1'), (b'Set-Cookie', b'c2=cookie2')) assert not msg.should_close assert msg.compression is None def test_conn_default_1_0(parser) -> None: text = b'GET /test HTTP/1.0\r\n\r\n' messages, upgrade, tail = parser.feed_data(text) msg = messages[0][0] assert msg.should_close def test_conn_default_1_1(parser) -> None: text = b'GET /test HTTP/1.1\r\n\r\n' 
messages, upgrade, tail = parser.feed_data(text) msg = messages[0][0] assert not msg.should_close def test_conn_close(parser) -> None: text = (b'GET /test HTTP/1.1\r\n' b'connection: close\r\n\r\n') messages, upgrade, tail = parser.feed_data(text) msg = messages[0][0] assert msg.should_close def test_conn_close_1_0(parser) -> None: text = (b'GET /test HTTP/1.0\r\n' b'connection: close\r\n\r\n') messages, upgrade, tail = parser.feed_data(text) msg = messages[0][0] assert msg.should_close def test_conn_keep_alive_1_0(parser) -> None: text = (b'GET /test HTTP/1.0\r\n' b'connection: keep-alive\r\n\r\n') messages, upgrade, tail = parser.feed_data(text) msg = messages[0][0] assert not msg.should_close def test_conn_keep_alive_1_1(parser) -> None: text = (b'GET /test HTTP/1.1\r\n' b'connection: keep-alive\r\n\r\n') messages, upgrade, tail = parser.feed_data(text) msg = messages[0][0] assert not msg.should_close def test_conn_other_1_0(parser) -> None: text = (b'GET /test HTTP/1.0\r\n' b'connection: test\r\n\r\n') messages, upgrade, tail = parser.feed_data(text) msg = messages[0][0] assert msg.should_close def test_conn_other_1_1(parser) -> None: text = (b'GET /test HTTP/1.1\r\n' b'connection: test\r\n\r\n') messages, upgrade, tail = parser.feed_data(text) msg = messages[0][0] assert not msg.should_close def test_request_chunked(parser) -> None: text = (b'GET /test HTTP/1.1\r\n' b'transfer-encoding: chunked\r\n\r\n') messages, upgrade, tail = parser.feed_data(text) msg, payload = messages[0] assert msg.chunked assert not upgrade assert isinstance(payload, streams.StreamReader) def test_conn_upgrade(parser) -> None: text = (b'GET /test HTTP/1.1\r\n' b'connection: upgrade\r\n' b'upgrade: websocket\r\n\r\n') messages, upgrade, tail = parser.feed_data(text) msg = messages[0][0] assert not msg.should_close assert msg.upgrade assert upgrade def test_compression_empty(parser) -> None: text = (b'GET /test HTTP/1.1\r\n' b'content-encoding: \r\n\r\n') messages, upgrade, tail = parser.feed_data(text) msg = messages[0][0] assert msg.compression is None def test_compression_deflate(parser) -> None: text = (b'GET /test HTTP/1.1\r\n' b'content-encoding: deflate\r\n\r\n') messages, upgrade, tail = parser.feed_data(text) msg = messages[0][0] assert msg.compression == 'deflate' def test_compression_gzip(parser) -> None: text = (b'GET /test HTTP/1.1\r\n' b'content-encoding: gzip\r\n\r\n') messages, upgrade, tail = parser.feed_data(text) msg = messages[0][0] assert msg.compression == 'gzip' @pytest.mark.skipif(brotli is None, reason="brotli is not installed") def test_compression_brotli(parser) -> None: text = (b'GET /test HTTP/1.1\r\n' b'content-encoding: br\r\n\r\n') messages, upgrade, tail = parser.feed_data(text) msg = messages[0][0] assert msg.compression == 'br' def test_compression_unknown(parser) -> None: text = (b'GET /test HTTP/1.1\r\n' b'content-encoding: compress\r\n\r\n') messages, upgrade, tail = parser.feed_data(text) msg = messages[0][0] assert msg.compression is None def test_headers_connect(parser) -> None: text = (b'CONNECT www.google.com HTTP/1.1\r\n' b'content-length: 0\r\n\r\n') messages, upgrade, tail = parser.feed_data(text) msg, payload = messages[0] assert upgrade assert isinstance(payload, streams.StreamReader) def test_headers_old_websocket_key1(parser) -> None: text = (b'GET /test HTTP/1.1\r\n' b'SEC-WEBSOCKET-KEY1: line\r\n\r\n') with pytest.raises(http_exceptions.BadHttpMessage): parser.feed_data(text) def test_headers_content_length_err_1(parser) -> None: text = (b'GET /test 
HTTP/1.1\r\n' b'content-length: line\r\n\r\n') with pytest.raises(http_exceptions.BadHttpMessage): parser.feed_data(text) def test_headers_content_length_err_2(parser) -> None: text = (b'GET /test HTTP/1.1\r\n' b'content-length: -1\r\n\r\n') with pytest.raises(http_exceptions.BadHttpMessage): parser.feed_data(text) def test_invalid_header(parser) -> None: text = (b'GET /test HTTP/1.1\r\n' b'test line\r\n\r\n') with pytest.raises(http_exceptions.BadHttpMessage): parser.feed_data(text) def test_invalid_name(parser) -> None: text = (b'GET /test HTTP/1.1\r\n' b'test[]: line\r\n\r\n') with pytest.raises(http_exceptions.BadHttpMessage): parser.feed_data(text) @pytest.mark.parametrize('size', [40960, 8191]) def test_max_header_field_size(parser, size) -> None: name = b't' * size text = (b'GET /test HTTP/1.1\r\n' + name + b':data\r\n\r\n') match = ("400, message='Got more than 8190 bytes \\({}\\) when reading" .format(size)) with pytest.raises(http_exceptions.LineTooLong, match=match): parser.feed_data(text) def test_max_header_field_size_under_limit(parser) -> None: name = b't' * 8190 text = (b'GET /test HTTP/1.1\r\n' + name + b':data\r\n\r\n') messages, upgrade, tail = parser.feed_data(text) msg = messages[0][0] assert msg.method == 'GET' assert msg.path == '/test' assert msg.version == (1, 1) assert msg.headers == CIMultiDict({name.decode(): 'data'}) assert msg.raw_headers == ((name, b'data'),) assert not msg.should_close assert msg.compression is None assert not msg.upgrade assert not msg.chunked assert msg.url == URL('/test') @pytest.mark.parametrize('size', [40960, 8191]) def test_max_header_value_size(parser, size) -> None: name = b't' * size text = (b'GET /test HTTP/1.1\r\n' b'data:' + name + b'\r\n\r\n') match = ("400, message='Got more than 8190 bytes \\({}\\) when reading" .format(size)) with pytest.raises(http_exceptions.LineTooLong, match=match): parser.feed_data(text) def test_max_header_value_size_under_limit(parser) -> None: value = b'A' * 8190 text = (b'GET /test HTTP/1.1\r\n' b'data:' + value + b'\r\n\r\n') messages, upgrade, tail = parser.feed_data(text) msg = messages[0][0] assert msg.method == 'GET' assert msg.path == '/test' assert msg.version == (1, 1) assert msg.headers == CIMultiDict({'data': value.decode()}) assert msg.raw_headers == ((b'data', value),) assert not msg.should_close assert msg.compression is None assert not msg.upgrade assert not msg.chunked assert msg.url == URL('/test') @pytest.mark.parametrize('size', [40965, 8191]) def test_max_header_value_size_continuation(parser, size) -> None: name = b'T' * (size - 5) text = (b'GET /test HTTP/1.1\r\n' b'data: test\r\n ' + name + b'\r\n\r\n') match = ("400, message='Got more than 8190 bytes \\({}\\) when reading" .format(size)) with pytest.raises(http_exceptions.LineTooLong, match=match): parser.feed_data(text) def test_max_header_value_size_continuation_under_limit(parser) -> None: value = b'A' * 8185 text = (b'GET /test HTTP/1.1\r\n' b'data: test\r\n ' + value + b'\r\n\r\n') messages, upgrade, tail = parser.feed_data(text) msg = messages[0][0] assert msg.method == 'GET' assert msg.path == '/test' assert msg.version == (1, 1) assert msg.headers == CIMultiDict({'data': 'test ' + value.decode()}) assert msg.raw_headers == ((b'data', b'test ' + value),) assert not msg.should_close assert msg.compression is None assert not msg.upgrade assert not msg.chunked assert msg.url == URL('/test') def test_http_request_parser(parser) -> None: text = b'GET /path HTTP/1.1\r\n\r\n' messages, upgrade, tail = parser.feed_data(text) 
msg = messages[0][0] assert msg.method == 'GET' assert msg.path == '/path' assert msg.version == (1, 1) assert msg.headers == CIMultiDict() assert msg.raw_headers == () assert not msg.should_close assert msg.compression is None assert not msg.upgrade assert not msg.chunked assert msg.url == URL('/path') def test_http_request_bad_status_line(parser) -> None: text = b'getpath \r\n\r\n' with pytest.raises(http_exceptions.BadStatusLine): parser.feed_data(text) def test_http_request_upgrade(parser) -> None: text = (b'GET /test HTTP/1.1\r\n' b'connection: upgrade\r\n' b'upgrade: websocket\r\n\r\n' b'some raw data') messages, upgrade, tail = parser.feed_data(text) msg = messages[0][0] assert not msg.should_close assert msg.upgrade assert upgrade assert tail == b'some raw data' def test_http_request_parser_utf8(parser) -> None: text = 'GET /path HTTP/1.1\r\nx-test:тест\r\n\r\n'.encode('utf-8') messages, upgrade, tail = parser.feed_data(text) msg = messages[0][0] assert msg.method == 'GET' assert msg.path == '/path' assert msg.version == (1, 1) assert msg.headers == CIMultiDict([('X-TEST', 'тест')]) assert msg.raw_headers == ((b'x-test', 'тест'.encode('utf-8')),) assert not msg.should_close assert msg.compression is None assert not msg.upgrade assert not msg.chunked assert msg.url == URL('/path') def test_http_request_parser_non_utf8(parser) -> None: text = 'GET /path HTTP/1.1\r\nx-test:тест\r\n\r\n'.encode('cp1251') msg = parser.feed_data(text)[0][0][0] assert msg.method == 'GET' assert msg.path == '/path' assert msg.version == (1, 1) assert msg.headers == CIMultiDict([('X-TEST', 'тест'.encode('cp1251') .decode('utf8', 'surrogateescape'))]) assert msg.raw_headers == ((b'x-test', 'тест'.encode('cp1251')),) assert not msg.should_close assert msg.compression is None assert not msg.upgrade assert not msg.chunked assert msg.url == URL('/path') def test_http_request_parser_two_slashes(parser) -> None: text = b'GET //path HTTP/1.1\r\n\r\n' msg = parser.feed_data(text)[0][0][0] assert msg.method == 'GET' assert msg.path == '//path' assert msg.version == (1, 1) assert not msg.should_close assert msg.compression is None assert not msg.upgrade assert not msg.chunked def test_http_request_parser_bad_method(parser) -> None: with pytest.raises(http_exceptions.BadStatusLine): parser.feed_data(b'=":(e),[T];?" 
/get HTTP/1.1\r\n\r\n') def test_http_request_parser_bad_version(parser) -> None: with pytest.raises(http_exceptions.BadHttpMessage): parser.feed_data(b'GET //get HT/11\r\n\r\n') @pytest.mark.parametrize('size', [40965, 8191]) def test_http_request_max_status_line(parser, size) -> None: path = b't' * (size - 5) match = ("400, message='Got more than 8190 bytes \\({}\\) when reading" .format(size)) with pytest.raises(http_exceptions.LineTooLong, match=match): parser.feed_data( b'GET /path' + path + b' HTTP/1.1\r\n\r\n') def test_http_request_max_status_line_under_limit(parser) -> None: path = b't' * (8190 - 5) messages, upgraded, tail = parser.feed_data( b'GET /path' + path + b' HTTP/1.1\r\n\r\n') msg = messages[0][0] assert msg.method == 'GET' assert msg.path == '/path' + path.decode() assert msg.version == (1, 1) assert msg.headers == CIMultiDict() assert msg.raw_headers == () assert not msg.should_close assert msg.compression is None assert not msg.upgrade assert not msg.chunked assert msg.url == URL('/path' + path.decode()) def test_http_response_parser_utf8(response) -> None: text = 'HTTP/1.1 200 Ok\r\nx-test:тест\r\n\r\n'.encode('utf-8') messages, upgraded, tail = response.feed_data(text) assert len(messages) == 1 msg = messages[0][0] assert msg.version == (1, 1) assert msg.code == 200 assert msg.reason == 'Ok' assert msg.headers == CIMultiDict([('X-TEST', 'тест')]) assert msg.raw_headers == ((b'x-test', 'тест'.encode('utf-8')),) assert not upgraded assert not tail @pytest.mark.parametrize('size', [40962, 8191]) def test_http_response_parser_bad_status_line_too_long(response, size) -> None: reason = b't' * (size - 2) match = ("400, message='Got more than 8190 bytes \\({}\\) when reading" .format(size)) with pytest.raises(http_exceptions.LineTooLong, match=match): response.feed_data( b'HTTP/1.1 200 Ok' + reason + b'\r\n\r\n') def test_http_response_parser_status_line_under_limit(response) -> None: reason = b'O' * 8190 messages, upgraded, tail = response.feed_data( b'HTTP/1.1 200 ' + reason + b'\r\n\r\n') msg = messages[0][0] assert msg.version == (1, 1) assert msg.code == 200 assert msg.reason == reason.decode() def test_http_response_parser_bad_version(response) -> None: with pytest.raises(http_exceptions.BadHttpMessage): response.feed_data(b'HT/11 200 Ok\r\n\r\n') def test_http_response_parser_no_reason(response) -> None: msg = response.feed_data(b'HTTP/1.1 200\r\n\r\n')[0][0][0] assert msg.version == (1, 1) assert msg.code == 200 assert msg.reason == '' def test_http_response_parser_bad(response) -> None: with pytest.raises(http_exceptions.BadHttpMessage): response.feed_data(b'HTT/1\r\n\r\n') def test_http_response_parser_code_under_100(response) -> None: msg = response.feed_data(b'HTTP/1.1 99 test\r\n\r\n')[0][0][0] assert msg.code == 99 def test_http_response_parser_code_above_999(response) -> None: with pytest.raises(http_exceptions.BadHttpMessage): response.feed_data(b'HTTP/1.1 9999 test\r\n\r\n') def test_http_response_parser_code_not_int(response) -> None: with pytest.raises(http_exceptions.BadHttpMessage): response.feed_data(b'HTTP/1.1 ttt test\r\n\r\n') def test_http_request_chunked_payload(parser) -> None: text = (b'GET /test HTTP/1.1\r\n' b'transfer-encoding: chunked\r\n\r\n') msg, payload = parser.feed_data(text)[0][0] assert msg.chunked assert not payload.is_eof() assert isinstance(payload, streams.StreamReader) parser.feed_data(b'4\r\ndata\r\n4\r\nline\r\n0\r\n\r\n') assert b'dataline' == b''.join(d for d in payload._buffer) assert [4, 8] == payload._http_chunk_splits 
assert payload.is_eof() def test_http_request_chunked_payload_and_next_message(parser) -> None: text = (b'GET /test HTTP/1.1\r\n' b'transfer-encoding: chunked\r\n\r\n') msg, payload = parser.feed_data(text)[0][0] messages, upgraded, tail = parser.feed_data( b'4\r\ndata\r\n4\r\nline\r\n0\r\n\r\n' b'POST /test2 HTTP/1.1\r\n' b'transfer-encoding: chunked\r\n\r\n') assert b'dataline' == b''.join(d for d in payload._buffer) assert [4, 8] == payload._http_chunk_splits assert payload.is_eof() assert len(messages) == 1 msg2, payload2 = messages[0] assert msg2.method == 'POST' assert msg2.chunked assert not payload2.is_eof() def test_http_request_chunked_payload_chunks(parser) -> None: text = (b'GET /test HTTP/1.1\r\n' b'transfer-encoding: chunked\r\n\r\n') msg, payload = parser.feed_data(text)[0][0] parser.feed_data(b'4\r\ndata\r') parser.feed_data(b'\n4') parser.feed_data(b'\r') parser.feed_data(b'\n') parser.feed_data(b'li') parser.feed_data(b'ne\r\n0\r\n') parser.feed_data(b'test: test\r\n') assert b'dataline' == b''.join(d for d in payload._buffer) assert [4, 8] == payload._http_chunk_splits assert not payload.is_eof() parser.feed_data(b'\r\n') assert b'dataline' == b''.join(d for d in payload._buffer) assert [4, 8] == payload._http_chunk_splits assert payload.is_eof() def test_parse_chunked_payload_chunk_extension(parser) -> None: text = (b'GET /test HTTP/1.1\r\n' b'transfer-encoding: chunked\r\n\r\n') msg, payload = parser.feed_data(text)[0][0] parser.feed_data( b'4;test\r\ndata\r\n4\r\nline\r\n0\r\ntest: test\r\n\r\n') assert b'dataline' == b''.join(d for d in payload._buffer) assert [4, 8] == payload._http_chunk_splits assert payload.is_eof() def _test_parse_no_length_or_te_on_post(loop, protocol, request_cls): parser = request_cls(protocol, loop, readall=True) text = b'POST /test HTTP/1.1\r\n\r\n' msg, payload = parser.feed_data(text)[0][0] assert payload.is_eof() def test_parse_payload_response_without_body(loop, protocol, response_cls) -> None: parser = response_cls(protocol, loop, response_with_body=False) text = (b'HTTP/1.1 200 Ok\r\n' b'content-length: 10\r\n\r\n') msg, payload = parser.feed_data(text)[0][0] assert payload.is_eof() def test_parse_length_payload(response) -> None: text = (b'HTTP/1.1 200 Ok\r\n' b'content-length: 4\r\n\r\n') msg, payload = response.feed_data(text)[0][0] assert not payload.is_eof() response.feed_data(b'da') response.feed_data(b't') response.feed_data(b'aHT') assert payload.is_eof() assert b'data' == b''.join(d for d in payload._buffer) def test_parse_no_length_payload(parser) -> None: text = b'PUT / HTTP/1.1\r\n\r\n' msg, payload = parser.feed_data(text)[0][0] assert payload.is_eof() def test_partial_url(parser) -> None: messages, upgrade, tail = parser.feed_data(b'GET /te') assert len(messages) == 0 messages, upgrade, tail = parser.feed_data(b'st HTTP/1.1\r\n\r\n') assert len(messages) == 1 msg, payload = messages[0] assert msg.method == 'GET' assert msg.path == '/test' assert msg.version == (1, 1) assert payload.is_eof() def test_url_parse_non_strict_mode(parser) -> None: payload = 'GET /test/тест HTTP/1.1\r\n\r\n'.encode('utf-8') messages, upgrade, tail = parser.feed_data(payload) assert len(messages) == 1 msg, payload = messages[0] assert msg.method == 'GET' assert msg.path == '/test/тест' assert msg.version == (1, 1) assert payload.is_eof() class TestParsePayload: async def test_parse_eof_payload(self, stream) -> None: out = aiohttp.FlowControlDataQueue(stream, loop=asyncio.get_event_loop()) p = HttpPayloadParser(out, readall=True) 
p.feed_data(b'data') p.feed_eof() assert out.is_eof() assert [(bytearray(b'data'), 4)] == list(out._buffer) async def test_parse_no_body(self, stream) -> None: out = aiohttp.FlowControlDataQueue(stream, loop=asyncio.get_event_loop()) p = HttpPayloadParser(out, method='PUT') assert out.is_eof() assert p.done async def test_parse_length_payload_eof(self, stream) -> None: out = aiohttp.FlowControlDataQueue(stream, loop=asyncio.get_event_loop()) p = HttpPayloadParser(out, length=4) p.feed_data(b'da') with pytest.raises(http_exceptions.ContentLengthError): p.feed_eof() async def test_parse_chunked_payload_size_error(self, stream) -> None: out = aiohttp.FlowControlDataQueue(stream, loop=asyncio.get_event_loop()) p = HttpPayloadParser(out, chunked=True) with pytest.raises(http_exceptions.TransferEncodingError): p.feed_data(b'blah\r\n') assert isinstance(out.exception(), http_exceptions.TransferEncodingError) async def test_http_payload_parser_length(self, stream) -> None: out = aiohttp.FlowControlDataQueue(stream, loop=asyncio.get_event_loop()) p = HttpPayloadParser(out, length=2) eof, tail = p.feed_data(b'1245') assert eof assert b'12' == b''.join(d for d, _ in out._buffer) assert b'45' == tail _comp = zlib.compressobj(wbits=-zlib.MAX_WBITS) _COMPRESSED = b''.join([_comp.compress(b'data'), _comp.flush()]) async def test_http_payload_parser_deflate(self, stream) -> None: length = len(self._COMPRESSED) out = aiohttp.FlowControlDataQueue(stream, loop=asyncio.get_event_loop()) p = HttpPayloadParser( out, length=length, compression='deflate') p.feed_data(self._COMPRESSED) assert b'data' == b''.join(d for d, _ in out._buffer) assert out.is_eof() async def test_http_payload_parser_deflate_no_wbits(self, stream) -> None: comp = zlib.compressobj() COMPRESSED = b''.join([comp.compress(b'data'), comp.flush()]) length = len(COMPRESSED) out = aiohttp.FlowControlDataQueue(stream, loop=asyncio.get_event_loop()) p = HttpPayloadParser( out, length=length, compression='deflate') p.feed_data(COMPRESSED) assert b'data' == b''.join(d for d, _ in out._buffer) assert out.is_eof() async def test_http_payload_parser_length_zero(self, stream) -> None: out = aiohttp.FlowControlDataQueue(stream, loop=asyncio.get_event_loop()) p = HttpPayloadParser(out, length=0) assert p.done assert out.is_eof() @pytest.mark.skipif(brotli is None, reason="brotli is not installed") async def test_http_payload_brotli(self, stream) -> None: compressed = brotli.compress(b'brotli data') out = aiohttp.FlowControlDataQueue(stream, loop=asyncio.get_event_loop()) p = HttpPayloadParser( out, length=len(compressed), compression='br') p.feed_data(compressed) assert b'brotli data' == b''.join(d for d, _ in out._buffer) assert out.is_eof() class TestDeflateBuffer: async def test_feed_data(self, stream) -> None: buf = aiohttp.FlowControlDataQueue(stream, loop=asyncio.get_event_loop()) dbuf = DeflateBuffer(buf, 'deflate') dbuf.decompressor = mock.Mock() dbuf.decompressor.decompress.return_value = b'line' dbuf.feed_data(b'data', 4) assert [b'line'] == list(d for d, _ in buf._buffer) async def test_feed_data_err(self, stream) -> None: buf = aiohttp.FlowControlDataQueue(stream, loop=asyncio.get_event_loop()) dbuf = DeflateBuffer(buf, 'deflate') exc = ValueError() dbuf.decompressor = mock.Mock() dbuf.decompressor.decompress.side_effect = exc with pytest.raises(http_exceptions.ContentEncodingError): dbuf.feed_data(b'data', 4) async def test_feed_eof(self, stream) -> None: buf = aiohttp.FlowControlDataQueue(stream, loop=asyncio.get_event_loop()) dbuf = 
DeflateBuffer(buf, 'deflate') dbuf.decompressor = mock.Mock() dbuf.decompressor.flush.return_value = b'line' dbuf.feed_eof() assert [b'line'] == list(d for d, _ in buf._buffer) assert buf._eof async def test_feed_eof_err_deflate(self, stream) -> None: buf = aiohttp.FlowControlDataQueue(stream, loop=asyncio.get_event_loop()) dbuf = DeflateBuffer(buf, 'deflate') dbuf.decompressor = mock.Mock() dbuf.decompressor.flush.return_value = b'line' dbuf.decompressor.eof = False with pytest.raises(http_exceptions.ContentEncodingError): dbuf.feed_eof() async def test_feed_eof_no_err_gzip(self, stream) -> None: buf = aiohttp.FlowControlDataQueue(stream, loop=asyncio.get_event_loop()) dbuf = DeflateBuffer(buf, 'gzip') dbuf.decompressor = mock.Mock() dbuf.decompressor.flush.return_value = b'line' dbuf.decompressor.eof = False dbuf.feed_eof() assert [b'line'] == list(d for d, _ in buf._buffer) async def test_feed_eof_no_err_brotli(self, stream) -> None: buf = aiohttp.FlowControlDataQueue(stream, loop=asyncio.get_event_loop()) dbuf = DeflateBuffer(buf, 'br') dbuf.decompressor = mock.Mock() dbuf.decompressor.flush.return_value = b'line' dbuf.decompressor.eof = False dbuf.feed_eof() assert [b'line'] == list(d for d, _ in buf._buffer) async def test_empty_body(self, stream) -> None: buf = aiohttp.FlowControlDataQueue(stream, loop=asyncio.get_event_loop()) dbuf = DeflateBuffer(buf, 'deflate') dbuf.feed_eof() assert buf.at_eof() aiohttp-3.6.2/tests/test_http_writer.py0000644000175100001650000001340513547410117020616 0ustar vstsdocker00000000000000"""Tests for aiohttp/http_writer.py""" import zlib from unittest import mock import pytest from aiohttp import http from aiohttp.test_utils import make_mocked_coro @pytest.fixture def buf(): return bytearray() @pytest.fixture def transport(buf): transport = mock.Mock() def write(chunk): buf.extend(chunk) transport.write.side_effect = write transport.is_closing.return_value = False return transport @pytest.fixture def protocol(loop, transport): protocol = mock.Mock(transport=transport) protocol._drain_helper = make_mocked_coro() return protocol def test_payloadwriter_properties(transport, protocol, loop) -> None: writer = http.StreamWriter(protocol, loop) assert writer.protocol == protocol assert writer.transport == transport async def test_write_payload_eof(transport, protocol, loop) -> None: write = transport.write = mock.Mock() msg = http.StreamWriter(protocol, loop) await msg.write(b'data1') await msg.write(b'data2') await msg.write_eof() content = b''.join([c[1][0] for c in list(write.mock_calls)]) assert b'data1data2' == content.split(b'\r\n\r\n', 1)[-1] async def test_write_payload_chunked(buf, protocol, transport, loop) -> None: msg = http.StreamWriter(protocol, loop) msg.enable_chunking() await msg.write(b'data') await msg.write_eof() assert b'4\r\ndata\r\n0\r\n\r\n' == buf async def test_write_payload_chunked_multiple(buf, protocol, transport, loop) -> None: msg = http.StreamWriter(protocol, loop) msg.enable_chunking() await msg.write(b'data1') await msg.write(b'data2') await msg.write_eof() assert b'5\r\ndata1\r\n5\r\ndata2\r\n0\r\n\r\n' == buf async def test_write_payload_length(protocol, transport, loop) -> None: write = transport.write = mock.Mock() msg = http.StreamWriter(protocol, loop) msg.length = 2 await msg.write(b'd') await msg.write(b'ata') await msg.write_eof() content = b''.join([c[1][0] for c in list(write.mock_calls)]) assert b'da' == content.split(b'\r\n\r\n', 1)[-1] async def test_write_payload_chunked_filter(protocol, transport, loop) -> 
None: write = transport.write = mock.Mock() msg = http.StreamWriter(protocol, loop) msg.enable_chunking() await msg.write(b'da') await msg.write(b'ta') await msg.write_eof() content = b''.join([c[1][0] for c in list(write.mock_calls)]) assert content.endswith(b'2\r\nda\r\n2\r\nta\r\n0\r\n\r\n') async def test_write_payload_chunked_filter_mutiple_chunks( protocol, transport, loop): write = transport.write = mock.Mock() msg = http.StreamWriter(protocol, loop) msg.enable_chunking() await msg.write(b'da') await msg.write(b'ta') await msg.write(b'1d') await msg.write(b'at') await msg.write(b'a2') await msg.write_eof() content = b''.join([c[1][0] for c in list(write.mock_calls)]) assert content.endswith( b'2\r\nda\r\n2\r\nta\r\n2\r\n1d\r\n2\r\nat\r\n' b'2\r\na2\r\n0\r\n\r\n') compressor = zlib.compressobj(wbits=-zlib.MAX_WBITS) COMPRESSED = b''.join([compressor.compress(b'data'), compressor.flush()]) async def test_write_payload_deflate_compression(protocol, transport, loop) -> None: write = transport.write = mock.Mock() msg = http.StreamWriter(protocol, loop) msg.enable_compression('deflate') await msg.write(b'data') await msg.write_eof() chunks = [c[1][0] for c in list(write.mock_calls)] assert all(chunks) content = b''.join(chunks) assert COMPRESSED == content.split(b'\r\n\r\n', 1)[-1] async def test_write_payload_deflate_and_chunked( buf, protocol, transport, loop): msg = http.StreamWriter(protocol, loop) msg.enable_compression('deflate') msg.enable_chunking() await msg.write(b'da') await msg.write(b'ta') await msg.write_eof() assert b'6\r\nKI,I\x04\x00\r\n0\r\n\r\n' == buf async def test_write_drain(protocol, transport, loop) -> None: msg = http.StreamWriter(protocol, loop) msg.drain = make_mocked_coro() await msg.write(b'1' * (64 * 1024 * 2), drain=False) assert not msg.drain.called await msg.write(b'1', drain=True) assert msg.drain.called assert msg.buffer_size == 0 async def test_write_calls_callback(protocol, transport, loop) -> None: on_chunk_sent = make_mocked_coro() msg = http.StreamWriter( protocol, loop, on_chunk_sent=on_chunk_sent ) chunk = b'1' await msg.write(chunk) assert on_chunk_sent.called assert on_chunk_sent.call_args == mock.call(chunk) async def test_write_eof_calls_callback(protocol, transport, loop) -> None: on_chunk_sent = make_mocked_coro() msg = http.StreamWriter( protocol, loop, on_chunk_sent=on_chunk_sent ) chunk = b'1' await msg.write_eof(chunk=chunk) assert on_chunk_sent.called assert on_chunk_sent.call_args == mock.call(chunk) async def test_write_to_closing_transport(protocol, transport, loop) -> None: msg = http.StreamWriter(protocol, loop) await msg.write(b'Before closing') transport.is_closing.return_value = True with pytest.raises(ConnectionResetError): await msg.write(b'After closing') async def test_drain(protocol, transport, loop) -> None: msg = http.StreamWriter(protocol, loop) await msg.drain() assert protocol._drain_helper.called async def test_drain_no_transport(protocol, transport, loop) -> None: msg = http.StreamWriter(protocol, loop) msg._protocol.transport = None await msg.drain() assert not protocol._drain_helper.called aiohttp-3.6.2/tests/test_locks.py0000644000175100001650000000242313547410117017354 0ustar vstsdocker00000000000000"""Tests of custom aiohttp locks implementations""" import asyncio import pytest from aiohttp.locks import EventResultOrError class TestEventResultOrError: async def test_set_exception(self, loop) -> None: ev = EventResultOrError(loop=loop) async def c(): try: await ev.wait() except Exception as e: return e return 1 
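# ---------------------------------------------------------------------------
# Illustrative aside (not part of the original test suite): a rough sketch of
# the behaviour these tests expect from aiohttp.locks.EventResultOrError,
# built here on a bare Future.  The real implementation differs in detail;
# this only shows why wait() either completes normally after set() or raises
# the exception passed via set(exc=...), and why cancel() makes every pending
# wait() raise asyncio.CancelledError even if set() is called afterwards.

class EventResultOrErrorSketch:  # hypothetical stand-in, not the real class

    def __init__(self, loop):
        self._fut = loop.create_future()

    def set(self, exc=None):
        if self._fut.done():        # e.g. cancel() already ran
            return
        if exc is None:
            self._fut.set_result(None)
        else:
            self._fut.set_exception(exc)

    def cancel(self):
        if not self._fut.done():
            self._fut.cancel()      # pending waiters see CancelledError

    async def wait(self):
        return await self._fut      # returns, raises exc, or is cancelled
# ---------------------------------------------------------------------------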
        t = loop.create_task(c())
        await asyncio.sleep(0, loop=loop)
        e = Exception()
        ev.set(exc=e)
        assert (await t) == e

    async def test_set(self, loop) -> None:
        ev = EventResultOrError(loop=loop)

        async def c():
            await ev.wait()
            return 1

        t = loop.create_task(c())
        await asyncio.sleep(0, loop=loop)
        ev.set()
        assert (await t) == 1

    async def test_cancel_waiters(self, loop) -> None:
        ev = EventResultOrError(loop=loop)

        async def c():
            await ev.wait()

        t1 = loop.create_task(c())
        t2 = loop.create_task(c())
        await asyncio.sleep(0, loop=loop)
        ev.cancel()
        ev.set()

        with pytest.raises(asyncio.CancelledError):
            await t1

        with pytest.raises(asyncio.CancelledError):
            await t2
aiohttp-3.6.2/tests/test_loop.py0000644000175100001650000000221413547410117017210 0ustar vstsdocker00000000000000
import asyncio
import platform
import threading

import pytest

from aiohttp import web
from aiohttp.test_utils import AioHTTPTestCase, unittest_run_loop


@pytest.mark.skipif(platform.system() == "Windows",
                    reason="the test is not valid for Windows")
async def test_subprocess_co(loop) -> None:
    assert isinstance(threading.current_thread(), threading._MainThread)
    proc = await asyncio.create_subprocess_shell(
        "exit 0",
        loop=loop,
        stdin=asyncio.subprocess.DEVNULL,
        stdout=asyncio.subprocess.DEVNULL,
        stderr=asyncio.subprocess.DEVNULL)
    await proc.wait()


class TestCase(AioHTTPTestCase):

    async def get_application(self):
        app = web.Application()
        app.on_startup.append(self.on_startup_hook)
        return app

    async def on_startup_hook(self, app):
        self.on_startup_called = True

    @unittest_run_loop
    async def test_on_startup_hook(self) -> None:
        self.assertTrue(self.on_startup_called)

    def test_default_loop(self) -> None:
        self.assertIs(self.loop, asyncio.get_event_loop())


def test_default_loop(loop) -> None:
    assert asyncio.get_event_loop() is loop
aiohttp-3.6.2/tests/test_multipart.py0000644000175100001650000012465413547410117020275 0ustar vstsdocker00000000000000
import asyncio
import io
import json
import zlib
from unittest import mock

import pytest

import aiohttp
from aiohttp import payload
from aiohttp.hdrs import (
    CONTENT_DISPOSITION,
    CONTENT_ENCODING,
    CONTENT_TRANSFER_ENCODING,
    CONTENT_TYPE,
)
from aiohttp.helpers import parse_mimetype
from aiohttp.multipart import MultipartResponseWrapper
from aiohttp.streams import DEFAULT_LIMIT as stream_reader_default_limit
from aiohttp.streams import StreamReader
from aiohttp.test_utils import make_mocked_coro

BOUNDARY = b'--:'


@pytest.fixture
def buf():
    return bytearray()


@pytest.fixture
def stream(buf):
    writer = mock.Mock()

    async def write(chunk):
        buf.extend(chunk)

    writer.write.side_effect = write
    return writer


@pytest.fixture
def writer():
    return aiohttp.MultipartWriter(boundary=':')


class Response:

    def __init__(self, headers, content):
        self.headers = headers
        self.content = content


class Stream:

    def __init__(self, content):
        self.content = io.BytesIO(content)

    async def read(self, size=None):
        return self.content.read(size)

    def at_eof(self):
        return self.content.tell() == len(self.content.getbuffer())

    async def readline(self):
        return self.content.readline()

    def unread_data(self, data):
        self.content = io.BytesIO(data + self.content.read())


class StreamWithShortenRead(Stream):

    def __init__(self, content):
        self._first = True
        super().__init__(content)

    async def read(self, size=None):
        if size is not None and self._first:
            self._first = False
            size = size // 2
        return await super().read(size)


class TestMultipartResponseWrapper:

    def test_at_eof(self) -> None:
        wrapper = MultipartResponseWrapper(mock.Mock(), mock.Mock())
        wrapper.at_eof()
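# ---------------------------------------------------------------------------
# Illustrative aside (not part of the original test suite): the wire format
# that the Stream(...) fixtures in this file spell out by hand.  With the
# boundary ":" each part opens with "--:", its headers end at a blank line,
# and "--:--" closes the whole body.  ``build_multipart`` is a hypothetical
# helper used only to make that framing explicit; aiohttp's MultipartWriter
# emits the same shape (plus automatically generated part headers).

def build_multipart(parts, boundary=':'):
    out = bytearray()
    for headers, body in parts:
        out += b'--' + boundary.encode() + b'\r\n'
        for name, value in headers.items():
            out += ('%s: %s\r\n' % (name, value)).encode()
        out += b'\r\n' + body + b'\r\n'
    out += b'--' + boundary.encode() + b'--'
    return bytes(out)

# Reproduces the fixture used by several reader tests below.
assert build_multipart([({}, b'hello')]) == b'--:\r\n\r\nhello\r\n--:--'
# ---------------------------------------------------------------------------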
assert wrapper.resp.content.at_eof.called async def test_next(self) -> None: wrapper = MultipartResponseWrapper(mock.Mock(), mock.Mock()) wrapper.stream.next = make_mocked_coro(b'') wrapper.stream.at_eof.return_value = False await wrapper.next() assert wrapper.stream.next.called async def test_release(self) -> None: wrapper = MultipartResponseWrapper(mock.Mock(), mock.Mock()) wrapper.resp.release = make_mocked_coro(None) await wrapper.release() assert wrapper.resp.release.called async def test_release_when_stream_at_eof(self) -> None: wrapper = MultipartResponseWrapper(mock.Mock(), mock.Mock()) wrapper.resp.release = make_mocked_coro(None) wrapper.stream.next = make_mocked_coro(b'') wrapper.stream.at_eof.return_value = True await wrapper.next() assert wrapper.stream.next.called assert wrapper.resp.release.called class TestPartReader: async def test_next(self) -> None: obj = aiohttp.BodyPartReader( BOUNDARY, {}, Stream(b'Hello, world!\r\n--:')) result = await obj.next() assert b'Hello, world!' == result assert obj.at_eof() async def test_next_next(self) -> None: obj = aiohttp.BodyPartReader( BOUNDARY, {}, Stream(b'Hello, world!\r\n--:')) result = await obj.next() assert b'Hello, world!' == result assert obj.at_eof() result = await obj.next() assert result is None async def test_read(self) -> None: obj = aiohttp.BodyPartReader( BOUNDARY, {}, Stream(b'Hello, world!\r\n--:')) result = await obj.read() assert b'Hello, world!' == result assert obj.at_eof() async def test_read_chunk_at_eof(self) -> None: obj = aiohttp.BodyPartReader( BOUNDARY, {}, Stream(b'--:')) obj._at_eof = True result = await obj.read_chunk() assert b'' == result async def test_read_chunk_without_content_length(self) -> None: obj = aiohttp.BodyPartReader( BOUNDARY, {}, Stream(b'Hello, world!\r\n--:')) c1 = await obj.read_chunk(8) c2 = await obj.read_chunk(8) c3 = await obj.read_chunk(8) assert c1 + c2 == b'Hello, world!' assert c3 == b'' async def test_read_incomplete_chunk(self) -> None: loop = asyncio.get_event_loop() stream = Stream(b'') def prepare(data): f = loop.create_future() f.set_result(data) return f with mock.patch.object(stream, 'read', side_effect=[ prepare(b'Hello, '), prepare(b'World'), prepare(b'!\r\n--:'), prepare(b'') ]): obj = aiohttp.BodyPartReader( BOUNDARY, {}, stream) c1 = await obj.read_chunk(8) assert c1 == b'Hello, ' c2 = await obj.read_chunk(8) assert c2 == b'World' c3 = await obj.read_chunk(8) assert c3 == b'!' async def test_read_all_at_once(self) -> None: stream = Stream(b'Hello, World!\r\n--:--\r\n') obj = aiohttp.BodyPartReader(BOUNDARY, {}, stream) result = await obj.read_chunk() assert b'Hello, World!' == result result = await obj.read_chunk() assert b'' == result assert obj.at_eof() async def test_read_incomplete_body_chunked(self) -> None: stream = Stream(b'Hello, World!\r\n-') obj = aiohttp.BodyPartReader(BOUNDARY, {}, stream) result = b'' with pytest.raises(AssertionError): for _ in range(4): result += await obj.read_chunk(7) assert b'Hello, World!\r\n-' == result async def test_read_boundary_with_incomplete_chunk(self) -> None: loop = asyncio.get_event_loop() stream = Stream(b'') def prepare(data): f = loop.create_future() f.set_result(data) return f with mock.patch.object(stream, 'read', side_effect=[ prepare(b'Hello, World'), prepare(b'!\r\n'), prepare(b'--:'), prepare(b'') ]): obj = aiohttp.BodyPartReader( BOUNDARY, {}, stream) c1 = await obj.read_chunk(12) assert c1 == b'Hello, World' c2 = await obj.read_chunk(8) assert c2 == b'!' 
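# ---------------------------------------------------------------------------
# Illustrative aside (not part of the original test suite): how the
# Content-Encoding values exercised just below map onto zlib window settings.
# The byte strings are the same fixtures used in the gzip/deflate read tests;
# BodyPartReader.read(decode=True) performs an equivalent decode internally.

import zlib

# HTTP "deflate" is, in practice, a raw deflate stream: negative wbits.
deflate_body = b'\x0b\xc9\xccMU(\xc9W\x08J\xcdI\xacP\x04\x00'
d = zlib.decompressobj(wbits=-zlib.MAX_WBITS)
assert d.decompress(deflate_body) + d.flush() == b'Time to Relax!'

# "gzip" wraps deflate in a header and trailer: 16 + MAX_WBITS expects them.
gzip_body = (b'\x1f\x8b\x08\x00\x00\x00\x00\x00\x00\x03\x0b\xc9\xccMU'
             b'(\xc9W\x08J\xcdI\xacP\x04\x00$\xfb\x9eV\x0e\x00\x00\x00')
d = zlib.decompressobj(wbits=16 + zlib.MAX_WBITS)
assert d.decompress(gzip_body) + d.flush() == b'Time to Relax!'
# ---------------------------------------------------------------------------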
c3 = await obj.read_chunk(8) assert c3 == b'' async def test_multi_read_chunk(self) -> None: stream = Stream(b'Hello,\r\n--:\r\n\r\nworld!\r\n--:--') obj = aiohttp.BodyPartReader(BOUNDARY, {}, stream) result = await obj.read_chunk(8) assert b'Hello,' == result result = await obj.read_chunk(8) assert b'' == result assert obj.at_eof() async def test_read_chunk_properly_counts_read_bytes(self) -> None: expected = b'.' * 10 size = len(expected) obj = aiohttp.BodyPartReader( BOUNDARY, {'CONTENT-LENGTH': size}, StreamWithShortenRead(expected + b'\r\n--:--')) result = bytearray() while True: chunk = await obj.read_chunk() if not chunk: break result.extend(chunk) assert size == len(result) assert b'.' * size == result assert obj.at_eof() async def test_read_does_not_read_boundary(self) -> None: stream = Stream(b'Hello, world!\r\n--:') obj = aiohttp.BodyPartReader( BOUNDARY, {}, stream) result = await obj.read() assert b'Hello, world!' == result assert b'--:' == (await stream.read()) async def test_multiread(self) -> None: obj = aiohttp.BodyPartReader( BOUNDARY, {}, Stream(b'Hello,\r\n--:\r\n\r\nworld!\r\n--:--')) result = await obj.read() assert b'Hello,' == result result = await obj.read() assert b'' == result assert obj.at_eof() async def test_read_multiline(self) -> None: obj = aiohttp.BodyPartReader( BOUNDARY, {}, Stream(b'Hello\n,\r\nworld!\r\n--:--')) result = await obj.read() assert b'Hello\n,\r\nworld!' == result result = await obj.read() assert b'' == result assert obj.at_eof() async def test_read_respects_content_length(self) -> None: obj = aiohttp.BodyPartReader( BOUNDARY, {'CONTENT-LENGTH': 100500}, Stream(b'.' * 100500 + b'\r\n--:--')) result = await obj.read() assert b'.' * 100500 == result assert obj.at_eof() async def test_read_with_content_encoding_gzip(self) -> None: obj = aiohttp.BodyPartReader( BOUNDARY, {CONTENT_ENCODING: 'gzip'}, Stream(b'\x1f\x8b\x08\x00\x00\x00\x00\x00\x00\x03\x0b\xc9\xccMU' b'(\xc9W\x08J\xcdI\xacP\x04\x00$\xfb\x9eV\x0e\x00\x00\x00' b'\r\n--:--')) result = await obj.read(decode=True) assert b'Time to Relax!' == result async def test_read_with_content_encoding_deflate(self) -> None: obj = aiohttp.BodyPartReader( BOUNDARY, {CONTENT_ENCODING: 'deflate'}, Stream(b'\x0b\xc9\xccMU(\xc9W\x08J\xcdI\xacP\x04\x00\r\n--:--')) result = await obj.read(decode=True) assert b'Time to Relax!' == result async def test_read_with_content_encoding_identity(self) -> None: thing = (b'\x1f\x8b\x08\x00\x00\x00\x00\x00\x00\x03\x0b\xc9\xccMU' b'(\xc9W\x08J\xcdI\xacP\x04\x00$\xfb\x9eV\x0e\x00\x00\x00' b'\r\n') obj = aiohttp.BodyPartReader( BOUNDARY, {CONTENT_ENCODING: 'identity'}, Stream(thing + b'--:--')) result = await obj.read(decode=True) assert thing[:-2] == result async def test_read_with_content_encoding_unknown(self) -> None: obj = aiohttp.BodyPartReader( BOUNDARY, {CONTENT_ENCODING: 'snappy'}, Stream(b'\x0e4Time to Relax!\r\n--:--')) with pytest.raises(RuntimeError): await obj.read(decode=True) async def test_read_with_content_transfer_encoding_base64(self) -> None: obj = aiohttp.BodyPartReader( BOUNDARY, {CONTENT_TRANSFER_ENCODING: 'base64'}, Stream(b'VGltZSB0byBSZWxheCE=\r\n--:--')) result = await obj.read(decode=True) assert b'Time to Relax!' 
== result async def test_read_with_content_transfer_encoding_quoted_printable( self) -> None: obj = aiohttp.BodyPartReader( BOUNDARY, {CONTENT_TRANSFER_ENCODING: 'quoted-printable'}, Stream(b'=D0=9F=D1=80=D0=B8=D0=B2=D0=B5=D1=82,' b' =D0=BC=D0=B8=D1=80!\r\n--:--')) result = await obj.read(decode=True) expected = (b'\xd0\x9f\xd1\x80\xd0\xb8\xd0\xb2\xd0\xb5\xd1\x82,' b' \xd0\xbc\xd0\xb8\xd1\x80!') assert result == expected @pytest.mark.parametrize('encoding', ('binary', '8bit', '7bit')) async def test_read_with_content_transfer_encoding_binary( self, encoding) -> None: data = b'\xd0\x9f\xd1\x80\xd0\xb8\xd0\xb2\xd0\xb5\xd1\x82,' \ b' \xd0\xbc\xd0\xb8\xd1\x80!' obj = aiohttp.BodyPartReader( BOUNDARY, {CONTENT_TRANSFER_ENCODING: encoding}, Stream(data + b'\r\n--:--')) result = await obj.read(decode=True) assert data == result async def test_read_with_content_transfer_encoding_unknown(self) -> None: obj = aiohttp.BodyPartReader( BOUNDARY, {CONTENT_TRANSFER_ENCODING: 'unknown'}, Stream(b'\x0e4Time to Relax!\r\n--:--')) with pytest.raises(RuntimeError): await obj.read(decode=True) async def test_read_text(self) -> None: obj = aiohttp.BodyPartReader( BOUNDARY, {}, Stream(b'Hello, world!\r\n--:--')) result = await obj.text() assert 'Hello, world!' == result async def test_read_text_default_encoding(self) -> None: obj = aiohttp.BodyPartReader( BOUNDARY, {}, Stream('Привет, Мир!\r\n--:--'.encode('utf-8'))) result = await obj.text() assert 'Привет, Мир!' == result async def test_read_text_encoding(self) -> None: obj = aiohttp.BodyPartReader( BOUNDARY, {}, Stream('Привет, Мир!\r\n--:--'.encode('cp1251'))) result = await obj.text(encoding='cp1251') assert 'Привет, Мир!' == result async def test_read_text_guess_encoding(self) -> None: obj = aiohttp.BodyPartReader( BOUNDARY, {CONTENT_TYPE: 'text/plain;charset=cp1251'}, Stream('Привет, Мир!\r\n--:--'.encode('cp1251'))) result = await obj.text() assert 'Привет, Мир!' == result async def test_read_text_compressed(self) -> None: obj = aiohttp.BodyPartReader( BOUNDARY, {CONTENT_ENCODING: 'deflate', CONTENT_TYPE: 'text/plain'}, Stream(b'\x0b\xc9\xccMU(\xc9W\x08J\xcdI\xacP\x04\x00\r\n--:--')) result = await obj.text() assert 'Time to Relax!' 
== result async def test_read_text_while_closed(self) -> None: obj = aiohttp.BodyPartReader( BOUNDARY, {CONTENT_TYPE: 'text/plain'}, Stream(b'')) obj._at_eof = True result = await obj.text() assert '' == result async def test_read_json(self) -> None: obj = aiohttp.BodyPartReader( BOUNDARY, {CONTENT_TYPE: 'application/json'}, Stream(b'{"test": "passed"}\r\n--:--')) result = await obj.json() assert {'test': 'passed'} == result async def test_read_json_encoding(self) -> None: obj = aiohttp.BodyPartReader( BOUNDARY, {CONTENT_TYPE: 'application/json'}, Stream('{"тест": "пассед"}\r\n--:--'.encode('cp1251'))) result = await obj.json(encoding='cp1251') assert {'тест': 'пассед'} == result async def test_read_json_guess_encoding(self) -> None: obj = aiohttp.BodyPartReader( BOUNDARY, {CONTENT_TYPE: 'application/json; charset=cp1251'}, Stream('{"тест": "пассед"}\r\n--:--'.encode('cp1251'))) result = await obj.json() assert {'тест': 'пассед'} == result async def test_read_json_compressed(self) -> None: obj = aiohttp.BodyPartReader( BOUNDARY, {CONTENT_ENCODING: 'deflate', CONTENT_TYPE: 'application/json'}, Stream(b'\xabV*I-.Q\xb2RP*H,.NMQ\xaa\x05\x00\r\n--:--')) result = await obj.json() assert {'test': 'passed'} == result async def test_read_json_while_closed(self) -> None: stream = Stream(b'') obj = aiohttp.BodyPartReader( BOUNDARY, {CONTENT_TYPE: 'application/json'}, stream) obj._at_eof = True result = await obj.json() assert result is None async def test_read_form(self) -> None: obj = aiohttp.BodyPartReader( BOUNDARY, {CONTENT_TYPE: 'application/x-www-form-urlencoded'}, Stream(b'foo=bar&foo=baz&boo=\r\n--:--')) result = await obj.form() assert [('foo', 'bar'), ('foo', 'baz'), ('boo', '')] == result async def test_read_form_encoding(self) -> None: obj = aiohttp.BodyPartReader( BOUNDARY, {CONTENT_TYPE: 'application/x-www-form-urlencoded'}, Stream('foo=bar&foo=baz&boo=\r\n--:--'.encode('cp1251'))) result = await obj.form(encoding='cp1251') assert [('foo', 'bar'), ('foo', 'baz'), ('boo', '')] == result async def test_read_form_guess_encoding(self) -> None: obj = aiohttp.BodyPartReader( BOUNDARY, {CONTENT_TYPE: 'application/x-www-form-urlencoded; charset=utf-8'}, Stream('foo=bar&foo=baz&boo=\r\n--:--'.encode('utf-8'))) result = await obj.form() assert [('foo', 'bar'), ('foo', 'baz'), ('boo', '')] == result async def test_read_form_while_closed(self) -> None: stream = Stream(b'') obj = aiohttp.BodyPartReader( BOUNDARY, {CONTENT_TYPE: 'application/x-www-form-urlencoded'}, stream) obj._at_eof = True result = await obj.form() assert not result async def test_readline(self) -> None: obj = aiohttp.BodyPartReader( BOUNDARY, {}, Stream(b'Hello\n,\r\nworld!\r\n--:--')) result = await obj.readline() assert b'Hello\n' == result result = await obj.readline() assert b',\r\n' == result result = await obj.readline() assert b'world!' == result result = await obj.readline() assert b'' == result assert obj.at_eof() async def test_release(self) -> None: stream = Stream(b'Hello,\r\n--:\r\n\r\nworld!\r\n--:--') obj = aiohttp.BodyPartReader( BOUNDARY, {}, stream) await obj.release() assert obj.at_eof() assert b'--:\r\n\r\nworld!\r\n--:--' == stream.content.read() async def test_release_respects_content_length(self) -> None: obj = aiohttp.BodyPartReader( BOUNDARY, {'CONTENT-LENGTH': 100500}, Stream(b'.' 
* 100500 + b'\r\n--:--')) result = await obj.release() assert result is None assert obj.at_eof() async def test_release_release(self) -> None: stream = Stream(b'Hello,\r\n--:\r\n\r\nworld!\r\n--:--') obj = aiohttp.BodyPartReader( BOUNDARY, {}, stream) await obj.release() await obj.release() assert b'--:\r\n\r\nworld!\r\n--:--' == stream.content.read() async def test_filename(self) -> None: part = aiohttp.BodyPartReader( BOUNDARY, {CONTENT_DISPOSITION: 'attachment; filename=foo.html'}, None) assert 'foo.html' == part.filename async def test_reading_long_part(self) -> None: size = 2 * stream_reader_default_limit protocol = mock.Mock(_reading_paused=False) stream = StreamReader(protocol) stream.feed_data(b'0' * size + b'\r\n--:--') stream.feed_eof() obj = aiohttp.BodyPartReader( BOUNDARY, {}, stream) data = await obj.read() assert len(data) == size class TestMultipartReader: def test_from_response(self) -> None: resp = Response({CONTENT_TYPE: 'multipart/related;boundary=":"'}, Stream(b'--:\r\n\r\nhello\r\n--:--')) res = aiohttp.MultipartReader.from_response(resp) assert isinstance(res, MultipartResponseWrapper) assert isinstance(res.stream, aiohttp.MultipartReader) def test_bad_boundary(self) -> None: resp = Response( {CONTENT_TYPE: 'multipart/related;boundary=' + 'a' * 80}, Stream(b'')) with pytest.raises(ValueError): aiohttp.MultipartReader.from_response(resp) def test_dispatch(self) -> None: reader = aiohttp.MultipartReader( {CONTENT_TYPE: 'multipart/related;boundary=":"'}, Stream(b'--:\r\n\r\necho\r\n--:--')) res = reader._get_part_reader({CONTENT_TYPE: 'text/plain'}) assert isinstance(res, reader.part_reader_cls) def test_dispatch_bodypart(self) -> None: reader = aiohttp.MultipartReader( {CONTENT_TYPE: 'multipart/related;boundary=":"'}, Stream(b'--:\r\n\r\necho\r\n--:--')) res = reader._get_part_reader({CONTENT_TYPE: 'text/plain'}) assert isinstance(res, reader.part_reader_cls) def test_dispatch_multipart(self) -> None: reader = aiohttp.MultipartReader( {CONTENT_TYPE: 'multipart/related;boundary=":"'}, Stream(b'----:--\r\n' b'\r\n' b'test\r\n' b'----:--\r\n' b'\r\n' b'passed\r\n' b'----:----\r\n' b'--:--')) res = reader._get_part_reader( {CONTENT_TYPE: 'multipart/related;boundary=--:--'}) assert isinstance(res, reader.__class__) def test_dispatch_custom_multipart_reader(self) -> None: class CustomReader(aiohttp.MultipartReader): pass reader = aiohttp.MultipartReader( {CONTENT_TYPE: 'multipart/related;boundary=":"'}, Stream(b'----:--\r\n' b'\r\n' b'test\r\n' b'----:--\r\n' b'\r\n' b'passed\r\n' b'----:----\r\n' b'--:--')) reader.multipart_reader_cls = CustomReader res = reader._get_part_reader( {CONTENT_TYPE: 'multipart/related;boundary=--:--'}) assert isinstance(res, CustomReader) async def test_emit_next(self) -> None: reader = aiohttp.MultipartReader( {CONTENT_TYPE: 'multipart/related;boundary=":"'}, Stream(b'--:\r\n\r\necho\r\n--:--')) res = await reader.next() assert isinstance(res, reader.part_reader_cls) async def test_invalid_boundary(self) -> None: reader = aiohttp.MultipartReader( {CONTENT_TYPE: 'multipart/related;boundary=":"'}, Stream(b'---:\r\n\r\necho\r\n---:--')) with pytest.raises(ValueError): await reader.next() async def test_release(self) -> None: reader = aiohttp.MultipartReader( {CONTENT_TYPE: 'multipart/mixed;boundary=":"'}, Stream(b'--:\r\n' b'Content-Type: multipart/related;boundary=--:--\r\n' b'\r\n' b'----:--\r\n' b'\r\n' b'test\r\n' b'----:--\r\n' b'\r\n' b'passed\r\n' b'----:----\r\n' b'\r\n' b'--:--')) await reader.release() assert reader.at_eof() async def 
test_release_release(self) -> None: reader = aiohttp.MultipartReader( {CONTENT_TYPE: 'multipart/related;boundary=":"'}, Stream(b'--:\r\n\r\necho\r\n--:--')) await reader.release() assert reader.at_eof() await reader.release() assert reader.at_eof() async def test_release_next(self) -> None: reader = aiohttp.MultipartReader( {CONTENT_TYPE: 'multipart/related;boundary=":"'}, Stream(b'--:\r\n\r\necho\r\n--:--')) await reader.release() assert reader.at_eof() res = await reader.next() assert res is None async def test_second_next_releases_previous_object(self) -> None: reader = aiohttp.MultipartReader( {CONTENT_TYPE: 'multipart/related;boundary=":"'}, Stream(b'--:\r\n' b'\r\n' b'test\r\n' b'--:\r\n' b'\r\n' b'passed\r\n' b'--:--')) first = await reader.next() assert isinstance(first, aiohttp.BodyPartReader) second = await reader.next() assert first.at_eof() assert not second.at_eof() async def test_release_without_read_the_last_object(self) -> None: reader = aiohttp.MultipartReader( {CONTENT_TYPE: 'multipart/related;boundary=":"'}, Stream(b'--:\r\n' b'\r\n' b'test\r\n' b'--:\r\n' b'\r\n' b'passed\r\n' b'--:--')) first = await reader.next() second = await reader.next() third = await reader.next() assert first.at_eof() assert second.at_eof() assert second.at_eof() assert third is None async def test_read_chunk_by_length_doesnt_breaks_reader(self) -> None: reader = aiohttp.MultipartReader( {CONTENT_TYPE: 'multipart/related;boundary=":"'}, Stream(b'--:\r\n' b'Content-Length: 4\r\n\r\n' b'test' b'\r\n--:\r\n' b'Content-Length: 6\r\n\r\n' b'passed' b'\r\n--:--')) body_parts = [] while True: read_part = b'' part = await reader.next() if part is None: break while not part.at_eof(): read_part += await part.read_chunk(3) body_parts.append(read_part) assert body_parts == [b'test', b'passed'] async def test_read_chunk_from_stream_doesnt_breaks_reader(self) -> None: reader = aiohttp.MultipartReader( {CONTENT_TYPE: 'multipart/related;boundary=":"'}, Stream(b'--:\r\n' b'\r\n' b'chunk' b'\r\n--:\r\n' b'\r\n' b'two_chunks' b'\r\n--:--')) body_parts = [] while True: read_part = b'' part = await reader.next() if part is None: break while not part.at_eof(): chunk = await part.read_chunk(5) assert chunk read_part += chunk body_parts.append(read_part) assert body_parts == [b'chunk', b'two_chunks'] async def test_reading_skips_prelude(self) -> None: reader = aiohttp.MultipartReader( {CONTENT_TYPE: 'multipart/related;boundary=":"'}, Stream(b'Multi-part data is not supported.\r\n' b'\r\n' b'--:\r\n' b'\r\n' b'test\r\n' b'--:\r\n' b'\r\n' b'passed\r\n' b'--:--')) first = await reader.next() assert isinstance(first, aiohttp.BodyPartReader) second = await reader.next() assert first.at_eof() assert not second.at_eof() async def test_writer(writer) -> None: assert writer.size == 7 assert writer.boundary == ':' async def test_writer_serialize_io_chunk(buf, stream, writer) -> None: flo = io.BytesIO(b'foobarbaz') writer.append(flo) await writer.write(stream) assert (buf == b'--:\r\nContent-Type: application/octet-stream' b'\r\nContent-Length: 9\r\n\r\nfoobarbaz\r\n--:--\r\n') async def test_writer_serialize_json(buf, stream, writer) -> None: writer.append_json({'привет': 'мир'}) await writer.write(stream) assert (b'{"\\u043f\\u0440\\u0438\\u0432\\u0435\\u0442":' b' "\\u043c\\u0438\\u0440"}' in buf) async def test_writer_serialize_form(buf, stream, writer) -> None: data = [('foo', 'bar'), ('foo', 'baz'), ('boo', 'zoo')] writer.append_form(data) await writer.write(stream) assert (b'foo=bar&foo=baz&boo=zoo' in buf) async def 
test_writer_serialize_form_dict(buf, stream, writer) -> None: data = {'hello': 'мир'} writer.append_form(data) await writer.write(stream) assert (b'hello=%D0%BC%D0%B8%D1%80' in buf) async def test_writer_write(buf, stream, writer) -> None: writer.append('foo-bar-baz') writer.append_json({'test': 'passed'}) writer.append_form({'test': 'passed'}) writer.append_form([('one', 1), ('two', 2)]) sub_multipart = aiohttp.MultipartWriter(boundary='::') sub_multipart.append('nested content') sub_multipart.headers['X-CUSTOM'] = 'test' writer.append(sub_multipart) await writer.write(stream) assert ( (b'--:\r\n' b'Content-Type: text/plain; charset=utf-8\r\n' b'Content-Length: 11\r\n\r\n' b'foo-bar-baz' b'\r\n' b'--:\r\n' b'Content-Type: application/json\r\n' b'Content-Length: 18\r\n\r\n' b'{"test": "passed"}' b'\r\n' b'--:\r\n' b'Content-Type: application/x-www-form-urlencoded\r\n' b'Content-Length: 11\r\n\r\n' b'test=passed' b'\r\n' b'--:\r\n' b'Content-Type: application/x-www-form-urlencoded\r\n' b'Content-Length: 11\r\n\r\n' b'one=1&two=2' b'\r\n' b'--:\r\n' b'Content-Type: multipart/mixed; boundary="::"\r\n' b'X-CUSTOM: test\r\nContent-Length: 93\r\n\r\n' b'--::\r\n' b'Content-Type: text/plain; charset=utf-8\r\n' b'Content-Length: 14\r\n\r\n' b'nested content\r\n' b'--::--\r\n' b'\r\n' b'--:--\r\n') == bytes(buf)) async def test_writer_write_no_close_boundary(buf, stream) -> None: writer = aiohttp.MultipartWriter(boundary=':') writer.append('foo-bar-baz') writer.append_json({'test': 'passed'}) writer.append_form({'test': 'passed'}) writer.append_form([('one', 1), ('two', 2)]) await writer.write(stream, close_boundary=False) assert ( (b'--:\r\n' b'Content-Type: text/plain; charset=utf-8\r\n' b'Content-Length: 11\r\n\r\n' b'foo-bar-baz' b'\r\n' b'--:\r\n' b'Content-Type: application/json\r\n' b'Content-Length: 18\r\n\r\n' b'{"test": "passed"}' b'\r\n' b'--:\r\n' b'Content-Type: application/x-www-form-urlencoded\r\n' b'Content-Length: 11\r\n\r\n' b'test=passed' b'\r\n' b'--:\r\n' b'Content-Type: application/x-www-form-urlencoded\r\n' b'Content-Length: 11\r\n\r\n' b'one=1&two=2' b'\r\n') == bytes(buf)) async def test_writer_write_no_parts(buf, stream, writer) -> None: await writer.write(stream) assert b'--:--\r\n' == bytes(buf) async def test_writer_serialize_with_content_encoding_gzip(buf, stream, writer): writer.append('Time to Relax!', {CONTENT_ENCODING: 'gzip'}) await writer.write(stream) headers, message = bytes(buf).split(b'\r\n\r\n', 1) assert (b'--:\r\nContent-Type: text/plain; charset=utf-8\r\n' b'Content-Encoding: gzip' == headers) decompressor = zlib.decompressobj(wbits=16+zlib.MAX_WBITS) data = decompressor.decompress(message.split(b'\r\n')[0]) data += decompressor.flush() assert b'Time to Relax!' 
== data async def test_writer_serialize_with_content_encoding_deflate(buf, stream, writer): writer.append('Time to Relax!', {CONTENT_ENCODING: 'deflate'}) await writer.write(stream) headers, message = bytes(buf).split(b'\r\n\r\n', 1) assert (b'--:\r\nContent-Type: text/plain; charset=utf-8\r\n' b'Content-Encoding: deflate' == headers) thing = b'\x0b\xc9\xccMU(\xc9W\x08J\xcdI\xacP\x04\x00\r\n--:--\r\n' assert thing == message async def test_writer_serialize_with_content_encoding_identity(buf, stream, writer): thing = b'\x0b\xc9\xccMU(\xc9W\x08J\xcdI\xacP\x04\x00' writer.append(thing, {CONTENT_ENCODING: 'identity'}) await writer.write(stream) headers, message = bytes(buf).split(b'\r\n\r\n', 1) assert (b'--:\r\nContent-Type: application/octet-stream\r\n' b'Content-Encoding: identity\r\n' b'Content-Length: 16' == headers) assert thing == message.split(b'\r\n')[0] def test_writer_serialize_with_content_encoding_unknown(buf, stream, writer): with pytest.raises(RuntimeError): writer.append('Time to Relax!', {CONTENT_ENCODING: 'snappy'}) async def test_writer_with_content_transfer_encoding_base64(buf, stream, writer): writer.append('Time to Relax!', {CONTENT_TRANSFER_ENCODING: 'base64'}) await writer.write(stream) headers, message = bytes(buf).split(b'\r\n\r\n', 1) assert (b'--:\r\nContent-Type: text/plain; charset=utf-8\r\n' b'Content-Transfer-Encoding: base64' == headers) assert b'VGltZSB0byBSZWxheCE=' == message.split(b'\r\n')[0] async def test_writer_content_transfer_encoding_quote_printable(buf, stream, writer): writer.append('Привет, мир!', {CONTENT_TRANSFER_ENCODING: 'quoted-printable'}) await writer.write(stream) headers, message = bytes(buf).split(b'\r\n\r\n', 1) assert (b'--:\r\nContent-Type: text/plain; charset=utf-8\r\n' b'Content-Transfer-Encoding: quoted-printable' == headers) assert (b'=D0=9F=D1=80=D0=B8=D0=B2=D0=B5=D1=82,' b' =D0=BC=D0=B8=D1=80!' 
== message.split(b'\r\n')[0]) def test_writer_content_transfer_encoding_unknown(buf, stream, writer) -> None: with pytest.raises(RuntimeError): writer.append('Time to Relax!', {CONTENT_TRANSFER_ENCODING: 'unknown'}) class TestMultipartWriter: def test_default_subtype(self, writer) -> None: mimetype = parse_mimetype(writer.headers.get(CONTENT_TYPE)) assert 'multipart' == mimetype.type assert 'mixed' == mimetype.subtype def test_unquoted_boundary(self) -> None: writer = aiohttp.MultipartWriter(boundary='abc123') expected = {CONTENT_TYPE: 'multipart/mixed; boundary=abc123'} assert expected == writer.headers def test_quoted_boundary(self) -> None: writer = aiohttp.MultipartWriter(boundary=R'\"') expected = {CONTENT_TYPE: R'multipart/mixed; boundary="\\\""'} assert expected == writer.headers def test_bad_boundary(self) -> None: with pytest.raises(ValueError): aiohttp.MultipartWriter(boundary='тест') with pytest.raises(ValueError): aiohttp.MultipartWriter(boundary='test\n') def test_default_headers(self, writer) -> None: expected = {CONTENT_TYPE: 'multipart/mixed; boundary=":"'} assert expected == writer.headers def test_iter_parts(self, writer) -> None: writer.append('foo') writer.append('bar') writer.append('baz') assert 3 == len(list(writer)) def test_append(self, writer) -> None: assert 0 == len(writer) writer.append('hello, world!') assert 1 == len(writer) assert isinstance(writer._parts[0][0], payload.Payload) def test_append_with_headers(self, writer) -> None: writer.append('hello, world!', {'x-foo': 'bar'}) assert 1 == len(writer) assert 'x-foo' in writer._parts[0][0].headers assert writer._parts[0][0].headers['x-foo'] == 'bar' def test_append_json(self, writer) -> None: writer.append_json({'foo': 'bar'}) assert 1 == len(writer) part = writer._parts[0][0] assert part.headers[CONTENT_TYPE] == 'application/json' def test_append_part(self, writer) -> None: part = payload.get_payload( 'test', headers={CONTENT_TYPE: 'text/plain'}) writer.append(part, {CONTENT_TYPE: 'test/passed'}) assert 1 == len(writer) part = writer._parts[0][0] assert part.headers[CONTENT_TYPE] == 'test/passed' def test_append_json_overrides_content_type(self, writer) -> None: writer.append_json({'foo': 'bar'}, {CONTENT_TYPE: 'test/passed'}) assert 1 == len(writer) part = writer._parts[0][0] assert part.headers[CONTENT_TYPE] == 'test/passed' def test_append_form(self, writer) -> None: writer.append_form({'foo': 'bar'}, {CONTENT_TYPE: 'test/passed'}) assert 1 == len(writer) part = writer._parts[0][0] assert part.headers[CONTENT_TYPE] == 'test/passed' def test_append_multipart(self, writer) -> None: subwriter = aiohttp.MultipartWriter(boundary=':') subwriter.append_json({'foo': 'bar'}) writer.append(subwriter, {CONTENT_TYPE: 'test/passed'}) assert 1 == len(writer) part = writer._parts[0][0] assert part.headers[CONTENT_TYPE] == 'test/passed' def test_with(self) -> None: with aiohttp.MultipartWriter(boundary=':') as writer: writer.append('foo') writer.append(b'bar') writer.append_json({'baz': True}) assert 3 == len(writer) def test_append_int_not_allowed(self) -> None: with pytest.raises(TypeError): with aiohttp.MultipartWriter(boundary=':') as writer: writer.append(1) def test_append_float_not_allowed(self) -> None: with pytest.raises(TypeError): with aiohttp.MultipartWriter(boundary=':') as writer: writer.append(1.1) def test_append_none_not_allowed(self) -> None: with pytest.raises(TypeError): with aiohttp.MultipartWriter(boundary=':') as writer: writer.append(None) async def test_write_preserves_content_disposition( self, 
buf, stream ) -> None: with aiohttp.MultipartWriter(boundary=':') as writer: part = writer.append(b'foo', headers={CONTENT_TYPE: 'test/passed'}) part.set_content_disposition('form-data', filename='bug') await writer.write(stream) headers, message = bytes(buf).split(b'\r\n\r\n', 1) assert headers == ( b'--:\r\n' b'Content-Type: test/passed\r\n' b'Content-Length: 3\r\n' b'Content-Disposition:' b' form-data; filename="bug"; filename*=utf-8\'\'bug' ) assert message == b'foo\r\n--:--\r\n' async def test_preserve_content_disposition_header(self, buf, stream): """ https://github.com/aio-libs/aiohttp/pull/3475#issuecomment-451072381 """ with open(__file__, 'rb') as fobj: with aiohttp.MultipartWriter('form-data', boundary=':') as writer: part = writer.append( fobj, headers={ CONTENT_DISPOSITION: 'attachments; filename="bug.py"', CONTENT_TYPE: 'text/python', } ) content_length = part.size await writer.write(stream) assert part.headers[CONTENT_TYPE] == 'text/python' assert part.headers[CONTENT_DISPOSITION] == ( 'attachments; filename="bug.py"' ) headers, _ = bytes(buf).split(b'\r\n\r\n', 1) assert headers == ( b'--:\r\n' b'Content-Type: text/python\r\n' b'Content-Disposition: attachments; filename="bug.py"\r\n' b'Content-Length: %s' b'' % (str(content_length).encode(),) ) async def test_set_content_disposition_override(self, buf, stream): """ https://github.com/aio-libs/aiohttp/pull/3475#issuecomment-451072381 """ with open(__file__, 'rb') as fobj: with aiohttp.MultipartWriter('form-data', boundary=':') as writer: part = writer.append( fobj, headers={ CONTENT_DISPOSITION: 'attachments; filename="bug.py"', CONTENT_TYPE: 'text/python', } ) content_length = part.size await writer.write(stream) assert part.headers[CONTENT_TYPE] == 'text/python' assert part.headers[CONTENT_DISPOSITION] == ( 'attachments; filename="bug.py"' ) headers, _ = bytes(buf).split(b'\r\n\r\n', 1) assert headers == ( b'--:\r\n' b'Content-Type: text/python\r\n' b'Content-Disposition: attachments; filename="bug.py"\r\n' b'Content-Length: %s' b'' % (str(content_length).encode(),) ) async def test_reset_content_disposition_header(self, buf, stream): """ https://github.com/aio-libs/aiohttp/pull/3475#issuecomment-451072381 """ with open(__file__, 'rb') as fobj: with aiohttp.MultipartWriter('form-data', boundary=':') as writer: part = writer.append( fobj, headers={CONTENT_TYPE: 'text/plain'}, ) content_length = part.size assert CONTENT_DISPOSITION in part.headers part.set_content_disposition('attachments', filename='bug.py') await writer.write(stream) headers, _ = bytes(buf).split(b'\r\n\r\n', 1) assert headers == ( b'--:\r\n' b'Content-Type: text/plain\r\n' b'Content-Disposition:' b' attachments; filename="bug.py"; filename*=utf-8\'\'bug.py\r\n' b'Content-Length: %s' b'' % (str(content_length).encode(),) ) async def test_async_for_reader() -> None: data = [ {"test": "passed"}, 42, b'plain text', b'aiohttp\n', b'no epilogue'] reader = aiohttp.MultipartReader( headers={CONTENT_TYPE: 'multipart/mixed; boundary=":"'}, content=Stream(b'\r\n'.join([ b'--:', b'Content-Type: application/json', b'', json.dumps(data[0]).encode(), b'--:', b'Content-Type: application/json', b'', json.dumps(data[1]).encode(), b'--:', b'Content-Type: multipart/related; boundary="::"', b'', b'--::', b'Content-Type: text/plain', b'', data[2], b'--::', b'Content-Disposition: attachment; filename="aiohttp"', b'Content-Type: text/plain', b'Content-Length: 28', b'Content-Encoding: gzip', b'', b'\x1f\x8b\x08\x00\x00\x00\x00\x00\x00\x03K\xcc\xcc\xcf())' 
b'\xe0\x02\x00\xd6\x90\xe2O\x08\x00\x00\x00', b'--::', b'Content-Type: multipart/related; boundary=":::"', b'', b'--:::', b'Content-Type: text/plain', b'', data[4], b'--:::--', b'--::--', b'', b'--:--', b'']))) idata = iter(data) async def check(reader): async for part in reader: if isinstance(part, aiohttp.BodyPartReader): if part.headers[CONTENT_TYPE] == 'application/json': assert next(idata) == (await part.json()) else: assert next(idata) == await part.read(decode=True) else: await check(part) await check(reader) async def test_async_for_bodypart() -> None: part = aiohttp.BodyPartReader( boundary=b'--:', headers={}, content=Stream(b'foobarbaz\r\n--:--')) async for data in part: assert data == b'foobarbaz' aiohttp-3.6.2/tests/test_multipart_helpers.py0000644000175100001650000006527713547410117022024 0ustar vstsdocker00000000000000import pytest import aiohttp from aiohttp import content_disposition_filename, parse_content_disposition class TestParseContentDisposition: # http://greenbytes.de/tech/tc2231/ def test_parse_empty(self) -> None: disptype, params = parse_content_disposition(None) assert disptype is None assert {} == params def test_inlonly(self) -> None: disptype, params = parse_content_disposition('inline') assert 'inline' == disptype assert {} == params def test_inlonlyquoted(self) -> None: with pytest.warns(aiohttp.BadContentDispositionHeader): disptype, params = parse_content_disposition('"inline"') assert disptype is None assert {} == params def test_semicolon(self) -> None: disptype, params = parse_content_disposition( 'form-data; name="data"; filename="file ; name.mp4"') assert disptype == 'form-data' assert params == {'name': 'data', 'filename': 'file ; name.mp4'} def test_inlwithasciifilename(self) -> None: disptype, params = parse_content_disposition( 'inline; filename="foo.html"') assert 'inline' == disptype assert {'filename': 'foo.html'} == params def test_inlwithfnattach(self) -> None: disptype, params = parse_content_disposition( 'inline; filename="Not an attachment!"') assert 'inline' == disptype assert {'filename': 'Not an attachment!'} == params def test_attonly(self) -> None: disptype, params = parse_content_disposition('attachment') assert 'attachment' == disptype assert {} == params def test_attonlyquoted(self) -> None: with pytest.warns(aiohttp.BadContentDispositionHeader): disptype, params = parse_content_disposition('"attachment"') assert disptype is None assert {} == params def test_attonlyucase(self) -> None: disptype, params = parse_content_disposition('ATTACHMENT') assert 'attachment' == disptype assert {} == params def test_attwithasciifilename(self) -> None: disptype, params = parse_content_disposition( 'attachment; filename="foo.html"') assert 'attachment' == disptype assert {'filename': 'foo.html'} == params def test_inlwithasciifilenamepdf(self) -> None: disptype, params = parse_content_disposition( 'attachment; filename="foo.pdf"') assert 'attachment' == disptype assert {'filename': 'foo.pdf'} == params def test_attwithasciifilename25(self) -> None: disptype, params = parse_content_disposition( 'attachment; filename="0000000000111111111122222"') assert 'attachment' == disptype assert {'filename': '0000000000111111111122222'} == params def test_attwithasciifilename35(self) -> None: disptype, params = parse_content_disposition( 'attachment; filename="00000000001111111111222222222233333"') assert 'attachment' == disptype assert {'filename': '00000000001111111111222222222233333'} == params def test_attwithasciifnescapedchar(self) -> None: disptype, 
params = parse_content_disposition( r'attachment; filename="f\oo.html"') assert 'attachment' == disptype assert {'filename': 'foo.html'} == params def test_attwithasciifnescapedquote(self) -> None: disptype, params = parse_content_disposition( 'attachment; filename="\"quoting\" tested.html"') assert 'attachment' == disptype assert {'filename': '"quoting" tested.html'} == params @pytest.mark.skip('need more smart parser which respects quoted text') def test_attwithquotedsemicolon(self) -> None: disptype, params = parse_content_disposition( 'attachment; filename="Here\'s a semicolon;.html"') assert 'attachment' == disptype assert {'filename': 'Here\'s a semicolon;.html'} == params def test_attwithfilenameandextparam(self) -> None: disptype, params = parse_content_disposition( 'attachment; foo="bar"; filename="foo.html"') assert 'attachment' == disptype assert {'filename': 'foo.html', 'foo': 'bar'} == params def test_attwithfilenameandextparamescaped(self) -> None: disptype, params = parse_content_disposition( 'attachment; foo="\"\\";filename="foo.html"') assert 'attachment' == disptype assert {'filename': 'foo.html', 'foo': '"\\'} == params def test_attwithasciifilenameucase(self) -> None: disptype, params = parse_content_disposition( 'attachment; FILENAME="foo.html"') assert 'attachment' == disptype assert {'filename': 'foo.html'} == params def test_attwithasciifilenamenq(self) -> None: disptype, params = parse_content_disposition( 'attachment; filename=foo.html') assert 'attachment' == disptype assert {'filename': 'foo.html'} == params def test_attwithtokfncommanq(self) -> None: with pytest.warns(aiohttp.BadContentDispositionHeader): disptype, params = parse_content_disposition( 'attachment; filename=foo,bar.html') assert disptype is None assert {} == params def test_attwithasciifilenamenqs(self) -> None: with pytest.warns(aiohttp.BadContentDispositionHeader): disptype, params = parse_content_disposition( 'attachment; filename=foo.html ;') assert disptype is None assert {} == params def test_attemptyparam(self) -> None: with pytest.warns(aiohttp.BadContentDispositionHeader): disptype, params = parse_content_disposition( 'attachment; ;filename=foo') assert disptype is None assert {} == params def test_attwithasciifilenamenqws(self) -> None: with pytest.warns(aiohttp.BadContentDispositionHeader): disptype, params = parse_content_disposition( 'attachment; filename=foo bar.html') assert disptype is None assert {} == params def test_attwithfntokensq(self) -> None: disptype, params = parse_content_disposition( "attachment; filename='foo.html'") assert 'attachment' == disptype assert {'filename': "'foo.html'"} == params def test_attwithisofnplain(self) -> None: disptype, params = parse_content_disposition( 'attachment; filename="foo-ä.html"') assert 'attachment' == disptype assert {'filename': 'foo-ä.html'} == params def test_attwithutf8fnplain(self) -> None: disptype, params = parse_content_disposition( 'attachment; filename="foo-ä.html"') assert 'attachment' == disptype assert {'filename': 'foo-ä.html'} == params def test_attwithfnrawpctenca(self) -> None: disptype, params = parse_content_disposition( 'attachment; filename="foo-%41.html"') assert 'attachment' == disptype assert {'filename': 'foo-%41.html'} == params def test_attwithfnusingpct(self) -> None: disptype, params = parse_content_disposition( 'attachment; filename="50%.html"') assert 'attachment' == disptype assert {'filename': '50%.html'} == params def test_attwithfnrawpctencaq(self) -> None: disptype, params = 
parse_content_disposition( r'attachment; filename="foo-%\41.html"') assert 'attachment' == disptype assert {'filename': r'foo-%41.html'} == params def test_attwithnamepct(self) -> None: disptype, params = parse_content_disposition( 'attachment; filename="foo-%41.html"') assert 'attachment' == disptype assert {'filename': 'foo-%41.html'} == params def test_attwithfilenamepctandiso(self) -> None: disptype, params = parse_content_disposition( 'attachment; filename="ä-%41.html"') assert 'attachment' == disptype assert {'filename': 'ä-%41.html'} == params def test_attwithfnrawpctenclong(self) -> None: disptype, params = parse_content_disposition( 'attachment; filename="foo-%c3%a4-%e2%82%ac.html"') assert 'attachment' == disptype assert {'filename': 'foo-%c3%a4-%e2%82%ac.html'} == params def test_attwithasciifilenamews1(self) -> None: disptype, params = parse_content_disposition( 'attachment; filename ="foo.html"') assert 'attachment' == disptype assert {'filename': 'foo.html'} == params def test_attwith2filenames(self) -> None: with pytest.warns(aiohttp.BadContentDispositionHeader): disptype, params = parse_content_disposition( 'attachment; filename="foo.html"; filename="bar.html"') assert disptype is None assert {} == params def test_attfnbrokentoken(self) -> None: with pytest.warns(aiohttp.BadContentDispositionHeader): disptype, params = parse_content_disposition( 'attachment; filename=foo[1](2).html') assert disptype is None assert {} == params def test_attfnbrokentokeniso(self) -> None: with pytest.warns(aiohttp.BadContentDispositionHeader): disptype, params = parse_content_disposition( 'attachment; filename=foo-ä.html') assert disptype is None assert {} == params def test_attfnbrokentokenutf(self) -> None: with pytest.warns(aiohttp.BadContentDispositionHeader): disptype, params = parse_content_disposition( 'attachment; filename=foo-ä.html') assert disptype is None assert {} == params def test_attmissingdisposition(self) -> None: with pytest.warns(aiohttp.BadContentDispositionHeader): disptype, params = parse_content_disposition( 'filename=foo.html') assert disptype is None assert {} == params def test_attmissingdisposition2(self) -> None: with pytest.warns(aiohttp.BadContentDispositionHeader): disptype, params = parse_content_disposition( 'x=y; filename=foo.html') assert disptype is None assert {} == params def test_attmissingdisposition3(self) -> None: with pytest.warns(aiohttp.BadContentDispositionHeader): disptype, params = parse_content_disposition( '"foo; filename=bar;baz"; filename=qux') assert disptype is None assert {} == params def test_attmissingdisposition4(self) -> None: with pytest.warns(aiohttp.BadContentDispositionHeader): disptype, params = parse_content_disposition( 'filename=foo.html, filename=bar.html') assert disptype is None assert {} == params def test_emptydisposition(self) -> None: with pytest.warns(aiohttp.BadContentDispositionHeader): disptype, params = parse_content_disposition( '; filename=foo.html') assert disptype is None assert {} == params def test_doublecolon(self) -> None: with pytest.warns(aiohttp.BadContentDispositionHeader): disptype, params = parse_content_disposition( ': inline; attachment; filename=foo.html') assert disptype is None assert {} == params def test_attandinline(self) -> None: with pytest.warns(aiohttp.BadContentDispositionHeader): disptype, params = parse_content_disposition( 'inline; attachment; filename=foo.html') assert disptype is None assert {} == params def test_attandinline2(self) -> None: with 
pytest.warns(aiohttp.BadContentDispositionHeader): disptype, params = parse_content_disposition( 'attachment; inline; filename=foo.html') assert disptype is None assert {} == params def test_attbrokenquotedfn(self) -> None: with pytest.warns(aiohttp.BadContentDispositionHeader): disptype, params = parse_content_disposition( 'attachment; filename="foo.html".txt') assert disptype is None assert {} == params def test_attbrokenquotedfn2(self) -> None: with pytest.warns(aiohttp.BadContentDispositionHeader): disptype, params = parse_content_disposition( 'attachment; filename="bar') assert disptype is None assert {} == params def test_attbrokenquotedfn3(self) -> None: with pytest.warns(aiohttp.BadContentDispositionHeader): disptype, params = parse_content_disposition( 'attachment; filename=foo"bar;baz"qux') assert disptype is None assert {} == params def test_attmultinstances(self) -> None: with pytest.warns(aiohttp.BadContentDispositionHeader): disptype, params = parse_content_disposition( 'attachment; filename=foo.html, attachment; filename=bar.html') assert disptype is None assert {} == params def test_attmissingdelim(self) -> None: with pytest.warns(aiohttp.BadContentDispositionHeader): disptype, params = parse_content_disposition( 'attachment; foo=foo filename=bar') assert disptype is None assert {} == params def test_attmissingdelim2(self) -> None: with pytest.warns(aiohttp.BadContentDispositionHeader): disptype, params = parse_content_disposition( 'attachment; filename=bar foo=foo') assert disptype is None assert {} == params def test_attmissingdelim3(self) -> None: with pytest.warns(aiohttp.BadContentDispositionHeader): disptype, params = parse_content_disposition( 'attachment filename=bar') assert disptype is None assert {} == params def test_attreversed(self) -> None: with pytest.warns(aiohttp.BadContentDispositionHeader): disptype, params = parse_content_disposition( 'filename=foo.html; attachment') assert disptype is None assert {} == params def test_attconfusedparam(self) -> None: disptype, params = parse_content_disposition( 'attachment; xfilename=foo.html') assert 'attachment' == disptype assert {'xfilename': 'foo.html'} == params def test_attabspath(self) -> None: disptype, params = parse_content_disposition( 'attachment; filename="/foo.html"') assert 'attachment' == disptype assert {'filename': 'foo.html'} == params def test_attabspathwin(self) -> None: disptype, params = parse_content_disposition( 'attachment; filename="\\foo.html"') assert 'attachment' == disptype assert {'filename': 'foo.html'} == params def test_attcdate(self) -> None: disptype, params = parse_content_disposition( 'attachment; creation-date="Wed, 12 Feb 1997 16:29:51 -0500"') assert 'attachment' == disptype assert {'creation-date': 'Wed, 12 Feb 1997 16:29:51 -0500'} == params def test_attmdate(self) -> None: disptype, params = parse_content_disposition( 'attachment; modification-date="Wed, 12 Feb 1997 16:29:51 -0500"') assert 'attachment' == disptype assert {'modification-date': 'Wed, 12 Feb 1997 16:29:51 -0500'} == params def test_dispext(self) -> None: disptype, params = parse_content_disposition('foobar') assert 'foobar' == disptype assert {} == params def test_dispextbadfn(self) -> None: disptype, params = parse_content_disposition( 'attachment; example="filename=example.txt"') assert 'attachment' == disptype assert {'example': 'filename=example.txt'} == params def test_attwithisofn2231iso(self) -> None: disptype, params = parse_content_disposition( "attachment; filename*=iso-8859-1''foo-%E4.html") assert 
'attachment' == disptype assert {'filename*': 'foo-ä.html'} == params def test_attwithfn2231utf8(self) -> None: disptype, params = parse_content_disposition( "attachment; filename*=UTF-8''foo-%c3%a4-%e2%82%ac.html") assert 'attachment' == disptype assert {'filename*': 'foo-ä-€.html'} == params def test_attwithfn2231noc(self) -> None: disptype, params = parse_content_disposition( "attachment; filename*=''foo-%c3%a4-%e2%82%ac.html") assert 'attachment' == disptype assert {'filename*': 'foo-ä-€.html'} == params def test_attwithfn2231utf8comp(self) -> None: disptype, params = parse_content_disposition( "attachment; filename*=UTF-8''foo-a%cc%88.html") assert 'attachment' == disptype assert {'filename*': 'foo-ä.html'} == params @pytest.mark.skip('should raise decoding error: %82 is invalid for latin1') def test_attwithfn2231utf8_bad(self) -> None: with pytest.warns(aiohttp.BadContentDispositionParam): disptype, params = parse_content_disposition( "attachment; filename*=iso-8859-1''foo-%c3%a4-%e2%82%ac.html") assert 'attachment' == disptype assert {} == params @pytest.mark.skip('should raise decoding error: %E4 is invalid for utf-8') def test_attwithfn2231iso_bad(self) -> None: with pytest.warns(aiohttp.BadContentDispositionParam): disptype, params = parse_content_disposition( "attachment; filename*=utf-8''foo-%E4.html") assert 'attachment' == disptype assert {} == params def test_attwithfn2231ws1(self) -> None: with pytest.warns(aiohttp.BadContentDispositionParam): disptype, params = parse_content_disposition( "attachment; filename *=UTF-8''foo-%c3%a4.html") assert 'attachment' == disptype assert {} == params def test_attwithfn2231ws2(self) -> None: disptype, params = parse_content_disposition( "attachment; filename*= UTF-8''foo-%c3%a4.html") assert 'attachment' == disptype assert {'filename*': 'foo-ä.html'} == params def test_attwithfn2231ws3(self) -> None: disptype, params = parse_content_disposition( "attachment; filename* =UTF-8''foo-%c3%a4.html") assert 'attachment' == disptype assert {'filename*': 'foo-ä.html'} == params def test_attwithfn2231quot(self) -> None: with pytest.warns(aiohttp.BadContentDispositionParam): disptype, params = parse_content_disposition( "attachment; filename*=\"UTF-8''foo-%c3%a4.html\"") assert 'attachment' == disptype assert {} == params def test_attwithfn2231quot2(self) -> None: with pytest.warns(aiohttp.BadContentDispositionParam): disptype, params = parse_content_disposition( "attachment; filename*=\"foo%20bar.html\"") assert 'attachment' == disptype assert {} == params def test_attwithfn2231singleqmissing(self) -> None: with pytest.warns(aiohttp.BadContentDispositionParam): disptype, params = parse_content_disposition( "attachment; filename*=UTF-8'foo-%c3%a4.html") assert 'attachment' == disptype assert {} == params @pytest.mark.skip('urllib.parse.unquote is tolerate to standalone % chars') def test_attwithfn2231nbadpct1(self) -> None: with pytest.warns(aiohttp.BadContentDispositionParam): disptype, params = parse_content_disposition( "attachment; filename*=UTF-8''foo%") assert 'attachment' == disptype assert {} == params @pytest.mark.skip('urllib.parse.unquote is tolerate to standalone % chars') def test_attwithfn2231nbadpct2(self) -> None: with pytest.warns(aiohttp.BadContentDispositionParam): disptype, params = parse_content_disposition( "attachment; filename*=UTF-8''f%oo.html") assert 'attachment' == disptype assert {} == params def test_attwithfn2231dpct(self) -> None: disptype, params = parse_content_disposition( "attachment; 
filename*=UTF-8''A-%2541.html") assert 'attachment' == disptype assert {'filename*': 'A-%41.html'} == params def test_attwithfn2231abspathdisguised(self) -> None: disptype, params = parse_content_disposition( "attachment; filename*=UTF-8''%5cfoo.html") assert 'attachment' == disptype assert {'filename*': '\\foo.html'} == params def test_attfncont(self) -> None: disptype, params = parse_content_disposition( 'attachment; filename*0="foo."; filename*1="html"') assert 'attachment' == disptype assert {'filename*0': 'foo.', 'filename*1': 'html'} == params def test_attfncontqs(self) -> None: disptype, params = parse_content_disposition( r'attachment; filename*0="foo"; filename*1="\b\a\r.html"') assert 'attachment' == disptype assert {'filename*0': 'foo', 'filename*1': 'bar.html'} == params def test_attfncontenc(self) -> None: disptype, params = parse_content_disposition( 'attachment; filename*0*=UTF-8''foo-%c3%a4; filename*1=".html"') assert 'attachment' == disptype assert {'filename*0*': 'UTF-8''foo-%c3%a4', 'filename*1': '.html'} == params def test_attfncontlz(self) -> None: disptype, params = parse_content_disposition( 'attachment; filename*0="foo"; filename*01="bar"') assert 'attachment' == disptype assert {'filename*0': 'foo', 'filename*01': 'bar'} == params def test_attfncontnc(self) -> None: disptype, params = parse_content_disposition( 'attachment; filename*0="foo"; filename*2="bar"') assert 'attachment' == disptype assert {'filename*0': 'foo', 'filename*2': 'bar'} == params def test_attfnconts1(self) -> None: disptype, params = parse_content_disposition( 'attachment; filename*0="foo."; filename*2="html"') assert 'attachment' == disptype assert {'filename*0': 'foo.', 'filename*2': 'html'} == params def test_attfncontord(self) -> None: disptype, params = parse_content_disposition( 'attachment; filename*1="bar"; filename*0="foo"') assert 'attachment' == disptype assert {'filename*0': 'foo', 'filename*1': 'bar'} == params def test_attfnboth(self) -> None: disptype, params = parse_content_disposition( 'attachment; filename="foo-ae.html";' " filename*=UTF-8''foo-%c3%a4.html") assert 'attachment' == disptype assert {'filename': 'foo-ae.html', 'filename*': 'foo-ä.html'} == params def test_attfnboth2(self) -> None: disptype, params = parse_content_disposition( "attachment; filename*=UTF-8''foo-%c3%a4.html;" ' filename="foo-ae.html"') assert 'attachment' == disptype assert {'filename': 'foo-ae.html', 'filename*': 'foo-ä.html'} == params def test_attfnboth3(self) -> None: disptype, params = parse_content_disposition( "attachment; filename*0*=ISO-8859-15''euro-sign%3d%a4;" " filename*=ISO-8859-1''currency-sign%3d%a4") assert 'attachment' == disptype assert {'filename*': 'currency-sign=¤', 'filename*0*': "ISO-8859-15''euro-sign%3d%a4"} == params def test_attnewandfn(self) -> None: disptype, params = parse_content_disposition( 'attachment; foobar=x; filename="foo.html"') assert 'attachment' == disptype assert {'foobar': 'x', 'filename': 'foo.html'} == params def test_attrfc2047token(self) -> None: with pytest.warns(aiohttp.BadContentDispositionHeader): disptype, params = parse_content_disposition( 'attachment; filename==?ISO-8859-1?Q?foo-=E4.html?=') assert disptype is None assert {} == params def test_attrfc2047quoted(self) -> None: disptype, params = parse_content_disposition( 'attachment; filename="=?ISO-8859-1?Q?foo-=E4.html?="') assert 'attachment' == disptype assert {'filename': '=?ISO-8859-1?Q?foo-=E4.html?='} == params def test_bad_continuous_param(self) -> None: with 
pytest.warns(aiohttp.BadContentDispositionParam): disptype, params = parse_content_disposition( 'attachment; filename*0=foo bar') assert 'attachment' == disptype assert {} == params class TestContentDispositionFilename: # http://greenbytes.de/tech/tc2231/ def test_no_filename(self) -> None: assert content_disposition_filename({}) is None assert content_disposition_filename({'foo': 'bar'}) is None def test_filename(self) -> None: params = {'filename': 'foo.html'} assert 'foo.html' == content_disposition_filename(params) def test_filename_ext(self) -> None: params = {'filename*': 'файл.html'} assert 'файл.html' == content_disposition_filename(params) def test_attfncont(self) -> None: params = {'filename*0': 'foo.', 'filename*1': 'html'} assert 'foo.html' == content_disposition_filename(params) def test_attfncontqs(self) -> None: params = {'filename*0': 'foo', 'filename*1': 'bar.html'} assert 'foobar.html' == content_disposition_filename(params) def test_attfncontenc(self) -> None: params = {'filename*0*': "UTF-8''foo-%c3%a4", 'filename*1': '.html'} assert 'foo-ä.html' == content_disposition_filename(params) def test_attfncontlz(self) -> None: params = {'filename*0': 'foo', 'filename*01': 'bar'} assert 'foo' == content_disposition_filename(params) def test_attfncontnc(self) -> None: params = {'filename*0': 'foo', 'filename*2': 'bar'} assert 'foo' == content_disposition_filename(params) def test_attfnconts1(self) -> None: params = {'filename*1': 'foo', 'filename*2': 'bar'} assert content_disposition_filename(params) is None def test_attfnboth(self) -> None: params = {'filename': 'foo-ae.html', 'filename*': 'foo-ä.html'} assert 'foo-ä.html' == content_disposition_filename(params) def test_attfnboth3(self) -> None: params = {'filename*0*': "ISO-8859-15''euro-sign%3d%a4", 'filename*': 'currency-sign=¤'} assert 'currency-sign=¤' == content_disposition_filename(params) def test_attrfc2047quoted(self) -> None: params = {'filename': '=?ISO-8859-1?Q?foo-=E4.html?='} assert '=?ISO-8859-1?Q?foo-=E4.html?=' == content_disposition_filename( params) aiohttp-3.6.2/tests/test_payload.py0000644000175100001650000000702613547410117017676 0ustar vstsdocker00000000000000import asyncio from io import StringIO from unittest import mock import pytest from async_generator import async_generator from aiohttp import payload, streams @pytest.fixture def registry(): old = payload.PAYLOAD_REGISTRY reg = payload.PAYLOAD_REGISTRY = payload.PayloadRegistry() yield reg payload.PAYLOAD_REGISTRY = old class Payload(payload.Payload): async def write(self, writer): pass def test_register_type(registry) -> None: class TestProvider: pass payload.register_payload(Payload, TestProvider) p = payload.get_payload(TestProvider()) assert isinstance(p, Payload) def test_register_unsupported_order(registry) -> None: class TestProvider: pass with pytest.raises(ValueError): payload.register_payload(Payload, TestProvider, order=object()) def test_payload_ctor() -> None: p = Payload('test', encoding='utf-8', filename='test.txt') assert p._value == 'test' assert p._encoding == 'utf-8' assert p.size is None assert p.filename == 'test.txt' assert p.content_type == 'text/plain' def test_payload_content_type() -> None: p = Payload('test', headers={'content-type': 'application/json'}) assert p.content_type == 'application/json' def test_bytes_payload_default_content_type() -> None: p = payload.BytesPayload(b'data') assert p.content_type == 'application/octet-stream' def test_bytes_payload_explicit_content_type() -> None: p = 
payload.BytesPayload(b'data', content_type='application/custom') assert p.content_type == 'application/custom' def test_bytes_payload_bad_type() -> None: with pytest.raises(TypeError): payload.BytesPayload(object()) def test_string_payload() -> None: p = payload.StringPayload('test') assert p.encoding == 'utf-8' assert p.content_type == 'text/plain; charset=utf-8' p = payload.StringPayload('test', encoding='koi8-r') assert p.encoding == 'koi8-r' assert p.content_type == 'text/plain; charset=koi8-r' p = payload.StringPayload( 'test', content_type='text/plain; charset=koi8-r') assert p.encoding == 'koi8-r' assert p.content_type == 'text/plain; charset=koi8-r' def test_string_io_payload() -> None: s = StringIO('ű' * 5000) p = payload.StringIOPayload(s) assert p.encoding == 'utf-8' assert p.content_type == 'text/plain; charset=utf-8' assert p.size == 10000 def test_async_iterable_payload_default_content_type() -> None: @async_generator async def gen(): pass p = payload.AsyncIterablePayload(gen()) assert p.content_type == 'application/octet-stream' def test_async_iterable_payload_explicit_content_type() -> None: @async_generator async def gen(): pass p = payload.AsyncIterablePayload(gen(), content_type='application/custom') assert p.content_type == 'application/custom' def test_async_iterable_payload_not_async_iterable() -> None: with pytest.raises(TypeError): payload.AsyncIterablePayload(object()) async def test_stream_reader_long_lines() -> None: loop = asyncio.get_event_loop() DATA = b'0' * 1024 ** 3 stream = streams.StreamReader(mock.Mock(), loop=loop) stream.feed_data(DATA) stream.feed_eof() body = payload.get_payload(stream) writer = mock.Mock() writer.write.return_value = loop.create_future() writer.write.return_value.set_result(None) await body.write(writer) writer.write.assert_called_once_with(mock.ANY) (chunk,), _ = writer.write.call_args assert len(chunk) == len(DATA) aiohttp-3.6.2/tests/test_proxy.py0000644000175100001650000005634113547410117017432 0ustar vstsdocker00000000000000import asyncio import gc import socket import ssl import unittest from unittest import mock from yarl import URL import aiohttp from aiohttp.client_reqrep import ClientRequest, ClientResponse from aiohttp.helpers import TimerNoop from aiohttp.test_utils import make_mocked_coro class TestProxy(unittest.TestCase): response_mock_attrs = { 'status': 200, } mocked_response = mock.Mock(**response_mock_attrs) clientrequest_mock_attrs = { 'return_value.send.return_value.start': make_mocked_coro(mocked_response), } def setUp(self): self.loop = asyncio.new_event_loop() asyncio.set_event_loop(None) def tearDown(self): # just in case if we have transport close callbacks self.loop.stop() self.loop.run_forever() self.loop.close() gc.collect() @mock.patch('aiohttp.connector.ClientRequest') def test_connect(self, ClientRequestMock) -> None: req = ClientRequest( 'GET', URL('http://www.python.org'), proxy=URL('http://proxy.example.com'), loop=self.loop, ) self.assertEqual(str(req.proxy), 'http://proxy.example.com') # mock all the things! 
async def make_conn(): return aiohttp.TCPConnector() connector = self.loop.run_until_complete(make_conn()) connector._resolve_host = make_mocked_coro([mock.MagicMock()]) proto = mock.Mock(**{ 'transport.get_extra_info.return_value': False, }) self.loop.create_connection = make_mocked_coro( (proto.transport, proto)) conn = self.loop.run_until_complete( connector.connect(req, None, aiohttp.ClientTimeout())) self.assertEqual(req.url, URL('http://www.python.org')) self.assertIs(conn._protocol, proto) self.assertIs(conn.transport, proto.transport) ClientRequestMock.assert_called_with( 'GET', URL('http://proxy.example.com'), auth=None, headers={'Host': 'www.python.org'}, loop=self.loop, ssl=None) @mock.patch('aiohttp.connector.ClientRequest') def test_proxy_headers(self, ClientRequestMock) -> None: req = ClientRequest( 'GET', URL('http://www.python.org'), proxy=URL('http://proxy.example.com'), proxy_headers={'Foo': 'Bar'}, loop=self.loop) self.assertEqual(str(req.proxy), 'http://proxy.example.com') # mock all the things! async def make_conn(): return aiohttp.TCPConnector() connector = self.loop.run_until_complete(make_conn()) connector._resolve_host = make_mocked_coro([mock.MagicMock()]) proto = mock.Mock(**{ 'transport.get_extra_info.return_value': False, }) self.loop.create_connection = make_mocked_coro( (proto.transport, proto)) conn = self.loop.run_until_complete(connector.connect( req, None, aiohttp.ClientTimeout())) self.assertEqual(req.url, URL('http://www.python.org')) self.assertIs(conn._protocol, proto) self.assertIs(conn.transport, proto.transport) ClientRequestMock.assert_called_with( 'GET', URL('http://proxy.example.com'), auth=None, headers={'Host': 'www.python.org', 'Foo': 'Bar'}, loop=self.loop, ssl=None) def test_proxy_auth(self) -> None: with self.assertRaises(ValueError) as ctx: ClientRequest( 'GET', URL('http://python.org'), proxy=URL('http://proxy.example.com'), proxy_auth=('user', 'pass'), loop=mock.Mock()) self.assertEqual( ctx.exception.args[0], "proxy_auth must be None or BasicAuth() tuple", ) def test_proxy_dns_error(self) -> None: async def make_conn(): return aiohttp.TCPConnector() connector = self.loop.run_until_complete(make_conn()) connector._resolve_host = make_mocked_coro( raise_exception=OSError('dont take it serious')) req = ClientRequest( 'GET', URL('http://www.python.org'), proxy=URL('http://proxy.example.com'), loop=self.loop, ) expected_headers = dict(req.headers) with self.assertRaises(aiohttp.ClientConnectorError): self.loop.run_until_complete(connector.connect( req, None, aiohttp.ClientTimeout())) self.assertEqual(req.url.path, '/') self.assertEqual(dict(req.headers), expected_headers) def test_proxy_connection_error(self) -> None: async def make_conn(): return aiohttp.TCPConnector() connector = self.loop.run_until_complete(make_conn()) connector._resolve_host = make_mocked_coro([{ 'hostname': 'www.python.org', 'host': '127.0.0.1', 'port': 80, 'family': socket.AF_INET, 'proto': 0, 'flags': socket.AI_NUMERICHOST}]) connector._loop.create_connection = make_mocked_coro( raise_exception=OSError('dont take it serious')) req = ClientRequest( 'GET', URL('http://www.python.org'), proxy=URL('http://proxy.example.com'), loop=self.loop, ) with self.assertRaises(aiohttp.ClientProxyConnectionError): self.loop.run_until_complete(connector.connect( req, None, aiohttp.ClientTimeout())) @mock.patch('aiohttp.connector.ClientRequest') def test_https_connect(self, ClientRequestMock) -> None: proxy_req = ClientRequest('GET', URL('http://proxy.example.com'), loop=self.loop) 
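        # Descriptive note (editorial addition): the patched ClientRequest class
        # hands back this pre-built proxy_req, so the request the connector creates
        # internally for the proxy can be inspected below -- e.g. that its method is
        # rewritten to CONNECT and its URL to the https target.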
ClientRequestMock.return_value = proxy_req proxy_resp = ClientResponse('get', URL('http://proxy.example.com'), request_info=mock.Mock(), writer=mock.Mock(), continue100=None, timer=TimerNoop(), traces=[], loop=self.loop, session=mock.Mock()) proxy_req.send = make_mocked_coro(proxy_resp) proxy_resp.start = make_mocked_coro(mock.Mock(status=200)) async def make_conn(): return aiohttp.TCPConnector() connector = self.loop.run_until_complete(make_conn()) connector._resolve_host = make_mocked_coro( [{'hostname': 'hostname', 'host': '127.0.0.1', 'port': 80, 'family': socket.AF_INET, 'proto': 0, 'flags': 0}]) tr, proto = mock.Mock(), mock.Mock() self.loop.create_connection = make_mocked_coro((tr, proto)) req = ClientRequest( 'GET', URL('https://www.python.org'), proxy=URL('http://proxy.example.com'), loop=self.loop, ) self.loop.run_until_complete( connector._create_connection(req, None, aiohttp.ClientTimeout())) self.assertEqual(req.url.path, '/') self.assertEqual(proxy_req.method, 'CONNECT') self.assertEqual(proxy_req.url, URL('https://www.python.org')) tr.close.assert_called_once_with() tr.get_extra_info.assert_called_with('socket', default=None) self.loop.run_until_complete(proxy_req.close()) proxy_resp.close() self.loop.run_until_complete(req.close()) @mock.patch('aiohttp.connector.ClientRequest') def test_https_connect_certificate_error(self, ClientRequestMock) -> None: proxy_req = ClientRequest('GET', URL('http://proxy.example.com'), loop=self.loop) ClientRequestMock.return_value = proxy_req proxy_resp = ClientResponse('get', URL('http://proxy.example.com'), request_info=mock.Mock(), writer=mock.Mock(), continue100=None, timer=TimerNoop(), traces=[], loop=self.loop, session=mock.Mock()) proxy_req.send = make_mocked_coro(proxy_resp) proxy_resp.start = make_mocked_coro(mock.Mock(status=200)) async def make_conn(): return aiohttp.TCPConnector() connector = self.loop.run_until_complete(make_conn()) connector._resolve_host = make_mocked_coro( [{'hostname': 'hostname', 'host': '127.0.0.1', 'port': 80, 'family': socket.AF_INET, 'proto': 0, 'flags': 0}]) seq = 0 async def create_connection(*args, **kwargs): nonlocal seq seq += 1 # connection to http://proxy.example.com if seq == 1: return mock.Mock(), mock.Mock() # connection to https://www.python.org elif seq == 2: raise ssl.CertificateError else: assert False self.loop.create_connection = create_connection req = ClientRequest( 'GET', URL('https://www.python.org'), proxy=URL('http://proxy.example.com'), loop=self.loop, ) with self.assertRaises(aiohttp.ClientConnectorCertificateError): self.loop.run_until_complete(connector._create_connection( req, None, aiohttp.ClientTimeout())) @mock.patch('aiohttp.connector.ClientRequest') def test_https_connect_ssl_error(self, ClientRequestMock) -> None: proxy_req = ClientRequest('GET', URL('http://proxy.example.com'), loop=self.loop) ClientRequestMock.return_value = proxy_req proxy_resp = ClientResponse('get', URL('http://proxy.example.com'), request_info=mock.Mock(), writer=mock.Mock(), continue100=None, timer=TimerNoop(), traces=[], loop=self.loop, session=mock.Mock()) proxy_req.send = make_mocked_coro(proxy_resp) proxy_resp.start = make_mocked_coro(mock.Mock(status=200)) async def make_conn(): return aiohttp.TCPConnector() connector = self.loop.run_until_complete(make_conn()) connector._resolve_host = make_mocked_coro( [{'hostname': 'hostname', 'host': '127.0.0.1', 'port': 80, 'family': socket.AF_INET, 'proto': 0, 'flags': 0}]) seq = 0 async def create_connection(*args, **kwargs): nonlocal seq seq += 1 # 
connection to http://proxy.example.com if seq == 1: return mock.Mock(), mock.Mock() # connection to https://www.python.org elif seq == 2: raise ssl.SSLError else: assert False self.loop.create_connection = create_connection req = ClientRequest( 'GET', URL('https://www.python.org'), proxy=URL('http://proxy.example.com'), loop=self.loop, ) with self.assertRaises(aiohttp.ClientConnectorSSLError): self.loop.run_until_complete(connector._create_connection( req, None, aiohttp.ClientTimeout())) @mock.patch('aiohttp.connector.ClientRequest') def test_https_connect_runtime_error(self, ClientRequestMock) -> None: proxy_req = ClientRequest('GET', URL('http://proxy.example.com'), loop=self.loop) ClientRequestMock.return_value = proxy_req proxy_resp = ClientResponse('get', URL('http://proxy.example.com'), request_info=mock.Mock(), writer=mock.Mock(), continue100=None, timer=TimerNoop(), traces=[], loop=self.loop, session=mock.Mock()) proxy_req.send = make_mocked_coro(proxy_resp) proxy_resp.start = make_mocked_coro(mock.Mock(status=200)) async def make_conn(): return aiohttp.TCPConnector() connector = self.loop.run_until_complete(make_conn()) connector._resolve_host = make_mocked_coro( [{'hostname': 'hostname', 'host': '127.0.0.1', 'port': 80, 'family': socket.AF_INET, 'proto': 0, 'flags': 0}]) tr, proto = mock.Mock(), mock.Mock() tr.get_extra_info.return_value = None self.loop.create_connection = make_mocked_coro((tr, proto)) req = ClientRequest( 'GET', URL('https://www.python.org'), proxy=URL('http://proxy.example.com'), loop=self.loop, ) with self.assertRaisesRegex( RuntimeError, "Transport does not expose socket instance"): self.loop.run_until_complete(connector._create_connection( req, None, aiohttp.ClientTimeout())) self.loop.run_until_complete(proxy_req.close()) proxy_resp.close() self.loop.run_until_complete(req.close()) @mock.patch('aiohttp.connector.ClientRequest') def test_https_connect_http_proxy_error(self, ClientRequestMock) -> None: proxy_req = ClientRequest('GET', URL('http://proxy.example.com'), loop=self.loop) ClientRequestMock.return_value = proxy_req proxy_resp = ClientResponse('get', URL('http://proxy.example.com'), request_info=mock.Mock(), writer=mock.Mock(), continue100=None, timer=TimerNoop(), traces=[], loop=self.loop, session=mock.Mock()) proxy_req.send = make_mocked_coro(proxy_resp) proxy_resp.start = make_mocked_coro( mock.Mock(status=400, reason='bad request')) async def make_conn(): return aiohttp.TCPConnector() connector = self.loop.run_until_complete(make_conn()) connector._resolve_host = make_mocked_coro( [{'hostname': 'hostname', 'host': '127.0.0.1', 'port': 80, 'family': socket.AF_INET, 'proto': 0, 'flags': 0}]) tr, proto = mock.Mock(), mock.Mock() tr.get_extra_info.return_value = None self.loop.create_connection = make_mocked_coro((tr, proto)) req = ClientRequest( 'GET', URL('https://www.python.org'), proxy=URL('http://proxy.example.com'), loop=self.loop, ) with self.assertRaisesRegex( aiohttp.ClientHttpProxyError, "400, message='bad request'"): self.loop.run_until_complete(connector._create_connection( req, None, aiohttp.ClientTimeout())) self.loop.run_until_complete(proxy_req.close()) proxy_resp.close() self.loop.run_until_complete(req.close()) @mock.patch('aiohttp.connector.ClientRequest') def test_https_connect_resp_start_error(self, ClientRequestMock) -> None: proxy_req = ClientRequest('GET', URL('http://proxy.example.com'), loop=self.loop) ClientRequestMock.return_value = proxy_req proxy_resp = ClientResponse('get', URL('http://proxy.example.com'), 
request_info=mock.Mock(), writer=mock.Mock(), continue100=None, timer=TimerNoop(), traces=[], loop=self.loop, session=mock.Mock()) proxy_req.send = make_mocked_coro(proxy_resp) proxy_resp.start = make_mocked_coro( raise_exception=OSError("error message")) async def make_conn(): return aiohttp.TCPConnector() connector = self.loop.run_until_complete(make_conn()) connector._resolve_host = make_mocked_coro( [{'hostname': 'hostname', 'host': '127.0.0.1', 'port': 80, 'family': socket.AF_INET, 'proto': 0, 'flags': 0}]) tr, proto = mock.Mock(), mock.Mock() tr.get_extra_info.return_value = None self.loop.create_connection = make_mocked_coro((tr, proto)) req = ClientRequest( 'GET', URL('https://www.python.org'), proxy=URL('http://proxy.example.com'), loop=self.loop, ) with self.assertRaisesRegex(OSError, "error message"): self.loop.run_until_complete(connector._create_connection( req, None, aiohttp.ClientTimeout())) @mock.patch('aiohttp.connector.ClientRequest') def test_request_port(self, ClientRequestMock) -> None: proxy_req = ClientRequest('GET', URL('http://proxy.example.com'), loop=self.loop) ClientRequestMock.return_value = proxy_req async def make_conn(): return aiohttp.TCPConnector() connector = self.loop.run_until_complete(make_conn()) connector._resolve_host = make_mocked_coro( [{'hostname': 'hostname', 'host': '127.0.0.1', 'port': 80, 'family': socket.AF_INET, 'proto': 0, 'flags': 0}]) tr, proto = mock.Mock(), mock.Mock() tr.get_extra_info.return_value = None self.loop.create_connection = make_mocked_coro((tr, proto)) req = ClientRequest( 'GET', URL('http://localhost:1234/path'), proxy=URL('http://proxy.example.com'), loop=self.loop, ) self.loop.run_until_complete(connector._create_connection( req, None, aiohttp.ClientTimeout())) self.assertEqual(req.url, URL('http://localhost:1234/path')) def test_proxy_auth_property(self) -> None: req = aiohttp.ClientRequest( 'GET', URL('http://localhost:1234/path'), proxy=URL('http://proxy.example.com'), proxy_auth=aiohttp.helpers.BasicAuth('user', 'pass'), loop=self.loop) self.assertEqual(('user', 'pass', 'latin1'), req.proxy_auth) def test_proxy_auth_property_default(self) -> None: req = aiohttp.ClientRequest( 'GET', URL('http://localhost:1234/path'), proxy=URL('http://proxy.example.com'), loop=self.loop) self.assertIsNone(req.proxy_auth) @mock.patch('aiohttp.connector.ClientRequest') def test_https_connect_pass_ssl_context(self, ClientRequestMock) -> None: proxy_req = ClientRequest('GET', URL('http://proxy.example.com'), loop=self.loop) ClientRequestMock.return_value = proxy_req proxy_resp = ClientResponse('get', URL('http://proxy.example.com'), request_info=mock.Mock(), writer=mock.Mock(), continue100=None, timer=TimerNoop(), traces=[], loop=self.loop, session=mock.Mock()) proxy_req.send = make_mocked_coro(proxy_resp) proxy_resp.start = make_mocked_coro(mock.Mock(status=200)) async def make_conn(): return aiohttp.TCPConnector() connector = self.loop.run_until_complete(make_conn()) connector._resolve_host = make_mocked_coro( [{'hostname': 'hostname', 'host': '127.0.0.1', 'port': 80, 'family': socket.AF_INET, 'proto': 0, 'flags': 0}]) tr, proto = mock.Mock(), mock.Mock() self.loop.create_connection = make_mocked_coro((tr, proto)) req = ClientRequest( 'GET', URL('https://www.python.org'), proxy=URL('http://proxy.example.com'), loop=self.loop, ) self.loop.run_until_complete(connector._create_connection( req, None, aiohttp.ClientTimeout())) self.loop.create_connection.assert_called_with( mock.ANY, ssl=connector._make_ssl_context(True), sock=mock.ANY, 
server_hostname='www.python.org') self.assertEqual(req.url.path, '/') self.assertEqual(proxy_req.method, 'CONNECT') self.assertEqual(proxy_req.url, URL('https://www.python.org')) tr.close.assert_called_once_with() tr.get_extra_info.assert_called_with('socket', default=None) self.loop.run_until_complete(proxy_req.close()) proxy_resp.close() self.loop.run_until_complete(req.close()) @mock.patch('aiohttp.connector.ClientRequest') def test_https_auth(self, ClientRequestMock) -> None: proxy_req = ClientRequest('GET', URL('http://proxy.example.com'), auth=aiohttp.helpers.BasicAuth('user', 'pass'), loop=self.loop) ClientRequestMock.return_value = proxy_req proxy_resp = ClientResponse('get', URL('http://proxy.example.com'), request_info=mock.Mock(), writer=mock.Mock(), continue100=None, timer=TimerNoop(), traces=[], loop=self.loop, session=mock.Mock()) proxy_req.send = make_mocked_coro(proxy_resp) proxy_resp.start = make_mocked_coro(mock.Mock(status=200)) async def make_conn(): return aiohttp.TCPConnector() connector = self.loop.run_until_complete(make_conn()) connector._resolve_host = make_mocked_coro( [{'hostname': 'hostname', 'host': '127.0.0.1', 'port': 80, 'family': socket.AF_INET, 'proto': 0, 'flags': 0}]) tr, proto = mock.Mock(), mock.Mock() self.loop.create_connection = make_mocked_coro((tr, proto)) self.assertIn('AUTHORIZATION', proxy_req.headers) self.assertNotIn('PROXY-AUTHORIZATION', proxy_req.headers) req = ClientRequest( 'GET', URL('https://www.python.org'), proxy=URL('http://proxy.example.com'), loop=self.loop ) self.assertNotIn('AUTHORIZATION', req.headers) self.assertNotIn('PROXY-AUTHORIZATION', req.headers) self.loop.run_until_complete( connector._create_connection(req, None, aiohttp.ClientTimeout())) self.assertEqual(req.url.path, '/') self.assertNotIn('AUTHORIZATION', req.headers) self.assertNotIn('PROXY-AUTHORIZATION', req.headers) self.assertNotIn('AUTHORIZATION', proxy_req.headers) self.assertIn('PROXY-AUTHORIZATION', proxy_req.headers) connector._resolve_host.assert_called_with( 'proxy.example.com', 80, traces=mock.ANY) self.loop.run_until_complete(proxy_req.close()) proxy_resp.close() self.loop.run_until_complete(req.close()) aiohttp-3.6.2/tests/test_proxy_functional.py0000644000175100001650000005070713547410117021654 0ustar vstsdocker00000000000000import asyncio import os import pathlib from unittest import mock import pytest from yarl import URL import aiohttp from aiohttp import web @pytest.fixture def proxy_test_server(aiohttp_raw_server, loop, monkeypatch): """Handle all proxy requests and imitate remote server response.""" _patch_ssl_transport(monkeypatch) default_response = dict( status=200, headers=None, body=None) proxy_mock = mock.Mock() async def proxy_handler(request): proxy_mock.request = request proxy_mock.requests_list.append(request) response = default_response.copy() if isinstance(proxy_mock.return_value, dict): response.update(proxy_mock.return_value) headers = response['headers'] if not headers: headers = {} if request.method == 'CONNECT': response['body'] = None response['headers'] = headers resp = web.Response(**response) await resp.prepare(request) await resp.write_eof() return resp async def proxy_server(): proxy_mock.request = None proxy_mock.auth = None proxy_mock.requests_list = [] server = await aiohttp_raw_server(proxy_handler) proxy_mock.server = server proxy_mock.url = server.make_url('/') return proxy_mock return proxy_server @pytest.fixture() def get_request(loop): async def _request(method='GET', *, url, trust_env=False, **kwargs): 
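        # Descriptive note (editorial addition): each call builds a throw-away
        # connector/session pair with certificate checks disabled, always releases
        # the response and closes the session, so tests never share connections.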
connector = aiohttp.TCPConnector(ssl=False, loop=loop) client = aiohttp.ClientSession(connector=connector, trust_env=trust_env) try: resp = await client.request(method, url, **kwargs) await resp.release() return resp finally: await client.close() return _request async def test_proxy_http_absolute_path(proxy_test_server, get_request) -> None: url = 'http://aiohttp.io/path?query=yes' proxy = await proxy_test_server() await get_request(url=url, proxy=proxy.url) assert len(proxy.requests_list) == 1 assert proxy.request.method == 'GET' assert proxy.request.host == 'aiohttp.io' assert proxy.request.path_qs == 'http://aiohttp.io/path?query=yes' async def test_proxy_http_raw_path(proxy_test_server, get_request) -> None: url = 'http://aiohttp.io:2561/space sheep?q=can:fly' raw_url = 'http://aiohttp.io:2561/space%20sheep?q=can:fly' proxy = await proxy_test_server() await get_request(url=url, proxy=proxy.url) assert proxy.request.host == 'aiohttp.io:2561' assert proxy.request.path_qs == raw_url async def test_proxy_http_idna_support(proxy_test_server, get_request) -> None: url = 'http://éé.com/' raw_url = 'http://xn--9caa.com/' proxy = await proxy_test_server() await get_request(url=url, proxy=proxy.url) assert proxy.request.host == 'xn--9caa.com' assert proxy.request.path_qs == raw_url async def test_proxy_http_connection_error(get_request) -> None: url = 'http://aiohttp.io/path' proxy_url = 'http://localhost:2242/' with pytest.raises(aiohttp.ClientConnectorError): await get_request(url=url, proxy=proxy_url) async def test_proxy_http_bad_response(proxy_test_server, get_request) -> None: url = 'http://aiohttp.io/path' proxy = await proxy_test_server() proxy.return_value = dict( status=502, headers={'Proxy-Agent': 'TestProxy'}) resp = await get_request(url=url, proxy=proxy.url) assert resp.status == 502 assert resp.headers['Proxy-Agent'] == 'TestProxy' async def test_proxy_http_auth(proxy_test_server, get_request) -> None: url = 'http://aiohttp.io/path' proxy = await proxy_test_server() await get_request(url=url, proxy=proxy.url) assert 'Authorization' not in proxy.request.headers assert 'Proxy-Authorization' not in proxy.request.headers auth = aiohttp.BasicAuth('user', 'pass') await get_request(url=url, auth=auth, proxy=proxy.url) assert 'Authorization' in proxy.request.headers assert 'Proxy-Authorization' not in proxy.request.headers await get_request(url=url, proxy_auth=auth, proxy=proxy.url) assert 'Authorization' not in proxy.request.headers assert 'Proxy-Authorization' in proxy.request.headers await get_request(url=url, auth=auth, proxy_auth=auth, proxy=proxy.url) assert 'Authorization' in proxy.request.headers assert 'Proxy-Authorization' in proxy.request.headers async def test_proxy_http_auth_utf8(proxy_test_server, get_request) -> None: url = 'http://aiohttp.io/path' auth = aiohttp.BasicAuth('юзер', 'пасс', 'utf-8') proxy = await proxy_test_server() await get_request(url=url, auth=auth, proxy=proxy.url) assert 'Authorization' in proxy.request.headers assert 'Proxy-Authorization' not in proxy.request.headers async def test_proxy_http_auth_from_url(proxy_test_server, get_request) -> None: url = 'http://aiohttp.io/path' proxy = await proxy_test_server() auth_url = URL(url).with_user('user').with_password('pass') await get_request(url=auth_url, proxy=proxy.url) assert 'Authorization' in proxy.request.headers assert 'Proxy-Authorization' not in proxy.request.headers proxy_url = URL(proxy.url).with_user('user').with_password('pass') await get_request(url=url, proxy=proxy_url) assert 'Authorization' 
not in proxy.request.headers assert 'Proxy-Authorization' in proxy.request.headers async def test_proxy_http_acquired_cleanup(proxy_test_server, loop) -> None: url = 'http://aiohttp.io/path' conn = aiohttp.TCPConnector(loop=loop) sess = aiohttp.ClientSession(connector=conn, loop=loop) proxy = await proxy_test_server() assert 0 == len(conn._acquired) resp = await sess.get(url, proxy=proxy.url) assert resp.closed assert 0 == len(conn._acquired) await sess.close() @pytest.mark.skip('we need to reconsider how we test this') async def test_proxy_http_acquired_cleanup_force(proxy_test_server, loop) -> None: url = 'http://aiohttp.io/path' conn = aiohttp.TCPConnector(force_close=True, loop=loop) sess = aiohttp.ClientSession(connector=conn, loop=loop) proxy = await proxy_test_server() assert 0 == len(conn._acquired) async def request(): resp = await sess.get(url, proxy=proxy.url) assert 1 == len(conn._acquired) await resp.release() await request() assert 0 == len(conn._acquired) await sess.close() @pytest.mark.skip('we need to reconsider how we test this') async def test_proxy_http_multi_conn_limit(proxy_test_server, loop) -> None: url = 'http://aiohttp.io/path' limit, multi_conn_num = 1, 5 conn = aiohttp.TCPConnector(limit=limit, loop=loop) sess = aiohttp.ClientSession(connector=conn, loop=loop) proxy = await proxy_test_server() current_pid = None async def request(pid): # process requests only one by one nonlocal current_pid resp = await sess.get(url, proxy=proxy.url) current_pid = pid await asyncio.sleep(0.2, loop=loop) assert current_pid == pid await resp.release() return resp requests = [request(pid) for pid in range(multi_conn_num)] responses = await asyncio.gather(*requests, loop=loop) assert len(responses) == multi_conn_num assert set(resp.status for resp in responses) == {200} await sess.close() @pytest.mark.xfail async def xtest_proxy_https_connect(proxy_test_server, get_request): proxy = await proxy_test_server() url = 'https://www.google.com.ua/search?q=aiohttp proxy' await get_request(url=url, proxy=proxy.url) connect = proxy.requests_list[0] assert connect.method == 'CONNECT' assert connect.path == 'www.google.com.ua:443' assert connect.host == 'www.google.com.ua' assert proxy.request.host == 'www.google.com.ua' assert proxy.request.path_qs == '/search?q=aiohttp+proxy' @pytest.mark.xfail async def xtest_proxy_https_connect_with_port(proxy_test_server, get_request): proxy = await proxy_test_server() url = 'https://secure.aiohttp.io:2242/path' await get_request(url=url, proxy=proxy.url) connect = proxy.requests_list[0] assert connect.method == 'CONNECT' assert connect.path == 'secure.aiohttp.io:2242' assert connect.host == 'secure.aiohttp.io:2242' assert proxy.request.host == 'secure.aiohttp.io:2242' assert proxy.request.path_qs == '/path' @pytest.mark.xfail async def xtest_proxy_https_send_body(proxy_test_server, loop): sess = aiohttp.ClientSession(loop=loop) proxy = await proxy_test_server() proxy.return_value = {'status': 200, 'body': b'1'*(2**20)} url = 'https://www.google.com.ua/search?q=aiohttp proxy' resp = await sess.get(url, proxy=proxy.url) body = await resp.read() await resp.release() await sess.close() assert body == b'1'*(2**20) @pytest.mark.xfail async def xtest_proxy_https_idna_support(proxy_test_server, get_request): url = 'https://éé.com/' proxy = await proxy_test_server() await get_request(url=url, proxy=proxy.url) connect = proxy.requests_list[0] assert connect.method == 'CONNECT' assert connect.path == 'xn--9caa.com:443' assert connect.host == 'xn--9caa.com' async 
def test_proxy_https_connection_error(get_request) -> None: url = 'https://secure.aiohttp.io/path' proxy_url = 'http://localhost:2242/' with pytest.raises(aiohttp.ClientConnectorError): await get_request(url=url, proxy=proxy_url) async def test_proxy_https_bad_response(proxy_test_server, get_request) -> None: url = 'https://secure.aiohttp.io/path' proxy = await proxy_test_server() proxy.return_value = dict( status=502, headers={'Proxy-Agent': 'TestProxy'}) with pytest.raises(aiohttp.ClientHttpProxyError): await get_request(url=url, proxy=proxy.url) assert len(proxy.requests_list) == 1 assert proxy.request.method == 'CONNECT' assert proxy.request.path == 'secure.aiohttp.io:443' @pytest.mark.xfail async def xtest_proxy_https_auth(proxy_test_server, get_request): url = 'https://secure.aiohttp.io/path' auth = aiohttp.BasicAuth('user', 'pass') proxy = await proxy_test_server() await get_request(url=url, proxy=proxy.url) connect = proxy.requests_list[0] assert 'Authorization' not in connect.headers assert 'Proxy-Authorization' not in connect.headers assert 'Authorization' not in proxy.request.headers assert 'Proxy-Authorization' not in proxy.request.headers proxy = await proxy_test_server() await get_request(url=url, auth=auth, proxy=proxy.url) connect = proxy.requests_list[0] assert 'Authorization' not in connect.headers assert 'Proxy-Authorization' not in connect.headers assert 'Authorization' in proxy.request.headers assert 'Proxy-Authorization' not in proxy.request.headers proxy = await proxy_test_server() await get_request(url=url, proxy_auth=auth, proxy=proxy.url) connect = proxy.requests_list[0] assert 'Authorization' not in connect.headers assert 'Proxy-Authorization' in connect.headers assert 'Authorization' not in proxy.request.headers assert 'Proxy-Authorization' not in proxy.request.headers proxy = await proxy_test_server() await get_request(url=url, auth=auth, proxy_auth=auth, proxy=proxy.url) connect = proxy.requests_list[0] assert 'Authorization' not in connect.headers assert 'Proxy-Authorization' in connect.headers assert 'Authorization' in proxy.request.headers assert 'Proxy-Authorization' not in proxy.request.headers @pytest.mark.xfail async def xtest_proxy_https_acquired_cleanup(proxy_test_server, loop): url = 'https://secure.aiohttp.io/path' conn = aiohttp.TCPConnector(loop=loop) sess = aiohttp.ClientSession(connector=conn, loop=loop) proxy = await proxy_test_server() assert 0 == len(conn._acquired) async def request(): resp = await sess.get(url, proxy=proxy.url) assert 1 == len(conn._acquired) await resp.release() await request() assert 0 == len(conn._acquired) await sess.close() @pytest.mark.xfail async def xtest_proxy_https_acquired_cleanup_force(proxy_test_server, loop): url = 'https://secure.aiohttp.io/path' conn = aiohttp.TCPConnector(force_close=True, loop=loop) sess = aiohttp.ClientSession(connector=conn, loop=loop) proxy = await proxy_test_server() assert 0 == len(conn._acquired) async def request(): resp = await sess.get(url, proxy=proxy.url) assert 1 == len(conn._acquired) await resp.release() await request() assert 0 == len(conn._acquired) await sess.close() @pytest.mark.xfail async def xtest_proxy_https_multi_conn_limit(proxy_test_server, loop): url = 'https://secure.aiohttp.io/path' limit, multi_conn_num = 1, 5 conn = aiohttp.TCPConnector(limit=limit, loop=loop) sess = aiohttp.ClientSession(connector=conn, loop=loop) proxy = await proxy_test_server() current_pid = None async def request(pid): # process requests only one by one nonlocal current_pid resp = await 
sess.get(url, proxy=proxy.url) current_pid = pid await asyncio.sleep(0.2, loop=loop) assert current_pid == pid await resp.release() return resp requests = [request(pid) for pid in range(multi_conn_num)] responses = await asyncio.gather(*requests, loop=loop) assert len(responses) == multi_conn_num assert set(resp.status for resp in responses) == {200} await sess.close() def _patch_ssl_transport(monkeypatch): """Make ssl transport substitution to prevent ssl handshake.""" def _make_ssl_transport_dummy(self, rawsock, protocol, sslcontext, waiter=None, **kwargs): return self._make_socket_transport(rawsock, protocol, waiter, extra=kwargs.get('extra'), server=kwargs.get('server')) monkeypatch.setattr( "asyncio.selector_events.BaseSelectorEventLoop._make_ssl_transport", _make_ssl_transport_dummy) original_is_file = pathlib.Path.is_file def mock_is_file(self): """ make real netrc file invisible in home dir """ if self.name in ['_netrc', '.netrc'] and self.parent == self.home(): return False else: return original_is_file(self) async def test_proxy_from_env_http(proxy_test_server, get_request, mocker) -> None: url = 'http://aiohttp.io/path' proxy = await proxy_test_server() mocker.patch.dict(os.environ, {'http_proxy': str(proxy.url)}) mocker.patch('pathlib.Path.is_file', mock_is_file) await get_request(url=url, trust_env=True) assert len(proxy.requests_list) == 1 assert proxy.request.method == 'GET' assert proxy.request.host == 'aiohttp.io' assert proxy.request.path_qs == 'http://aiohttp.io/path' assert 'Proxy-Authorization' not in proxy.request.headers async def test_proxy_from_env_http_with_auth(proxy_test_server, get_request, mocker): url = 'http://aiohttp.io/path' proxy = await proxy_test_server() auth = aiohttp.BasicAuth('user', 'pass') mocker.patch.dict(os.environ, {'http_proxy': str(proxy.url .with_user(auth.login) .with_password(auth.password))}) await get_request(url=url, trust_env=True) assert len(proxy.requests_list) == 1 assert proxy.request.method == 'GET' assert proxy.request.host == 'aiohttp.io' assert proxy.request.path_qs == 'http://aiohttp.io/path' assert proxy.request.headers['Proxy-Authorization'] == auth.encode() async def test_proxy_from_env_http_with_auth_from_netrc( proxy_test_server, get_request, tmpdir, mocker): url = 'http://aiohttp.io/path' proxy = await proxy_test_server() auth = aiohttp.BasicAuth('user', 'pass') netrc_file = tmpdir.join('test_netrc') netrc_file_data = 'machine 127.0.0.1 login %s password %s' % ( auth.login, auth.password) with open(str(netrc_file), 'w') as f: f.write(netrc_file_data) mocker.patch.dict(os.environ, {'http_proxy': str(proxy.url), 'NETRC': str(netrc_file)}) await get_request(url=url, trust_env=True) assert len(proxy.requests_list) == 1 assert proxy.request.method == 'GET' assert proxy.request.host == 'aiohttp.io' assert proxy.request.path_qs == 'http://aiohttp.io/path' assert proxy.request.headers['Proxy-Authorization'] == auth.encode() async def test_proxy_from_env_http_without_auth_from_netrc( proxy_test_server, get_request, tmpdir, mocker): url = 'http://aiohttp.io/path' proxy = await proxy_test_server() auth = aiohttp.BasicAuth('user', 'pass') netrc_file = tmpdir.join('test_netrc') netrc_file_data = 'machine 127.0.0.2 login %s password %s' % ( auth.login, auth.password) with open(str(netrc_file), 'w') as f: f.write(netrc_file_data) mocker.patch.dict(os.environ, {'http_proxy': str(proxy.url), 'NETRC': str(netrc_file)}) await get_request(url=url, trust_env=True) assert len(proxy.requests_list) == 1 assert proxy.request.method == 'GET' 
assert proxy.request.host == 'aiohttp.io' assert proxy.request.path_qs == 'http://aiohttp.io/path' assert 'Proxy-Authorization' not in proxy.request.headers async def test_proxy_from_env_http_without_auth_from_wrong_netrc( proxy_test_server, get_request, tmpdir, mocker): url = 'http://aiohttp.io/path' proxy = await proxy_test_server() auth = aiohttp.BasicAuth('user', 'pass') netrc_file = tmpdir.join('test_netrc') invalid_data = 'machine 127.0.0.1 %s pass %s' % ( auth.login, auth.password) with open(str(netrc_file), 'w') as f: f.write(invalid_data) mocker.patch.dict(os.environ, {'http_proxy': str(proxy.url), 'NETRC': str(netrc_file)}) await get_request(url=url, trust_env=True) assert len(proxy.requests_list) == 1 assert proxy.request.method == 'GET' assert proxy.request.host == 'aiohttp.io' assert proxy.request.path_qs == 'http://aiohttp.io/path' assert 'Proxy-Authorization' not in proxy.request.headers @pytest.mark.xfail async def xtest_proxy_from_env_https(proxy_test_server, get_request, mocker): url = 'https://aiohttp.io/path' proxy = await proxy_test_server() mocker.patch.dict(os.environ, {'https_proxy': str(proxy.url)}) mock.patch('pathlib.Path.is_file', mock_is_file) await get_request(url=url, trust_env=True) assert len(proxy.requests_list) == 2 assert proxy.request.method == 'GET' assert proxy.request.host == 'aiohttp.io' assert proxy.request.path_qs == 'https://aiohttp.io/path' assert 'Proxy-Authorization' not in proxy.request.headers @pytest.mark.xfail async def xtest_proxy_from_env_https_with_auth(proxy_test_server, get_request, mocker): url = 'https://aiohttp.io/path' proxy = await proxy_test_server() auth = aiohttp.BasicAuth('user', 'pass') mocker.patch.dict(os.environ, {'https_proxy': str(proxy.url .with_user(auth.login) .with_password(auth.password))}) await get_request(url=url, trust_env=True) assert len(proxy.requests_list) == 2 assert proxy.request.method == 'GET' assert proxy.request.host == 'aiohttp.io' assert proxy.request.path_qs == '/path' assert 'Proxy-Authorization' not in proxy.request.headers r2 = proxy.requests_list[0] assert r2.method == 'CONNECT' assert r2.host == 'aiohttp.io' assert r2.path_qs == '/path' assert r2.headers['Proxy-Authorization'] == auth.encode() async def test_proxy_auth() -> None: async with aiohttp.ClientSession() as session: with pytest.raises( ValueError, match=r"proxy_auth must be None or BasicAuth\(\) tuple"): await session.get('http://python.org', proxy='http://proxy.example.com', proxy_auth=('user', 'pass')) aiohttp-3.6.2/tests/test_pytest_plugin.py0000644000175100001650000001472613547410117021160 0ustar vstsdocker00000000000000import os import platform import sys import pytest pytest_plugins = 'pytester' CONFTEST = ''' pytest_plugins = 'aiohttp.pytest_plugin' ''' IS_PYPY = platform.python_implementation() == 'PyPy' def test_aiohttp_plugin(testdir) -> None: testdir.makepyfile("""\ import pytest from unittest import mock from aiohttp import web async def hello(request): return web.Response(body=b'Hello, world') def create_app(loop=None): app = web.Application() app.router.add_route('GET', '/', hello) return app async def test_hello(aiohttp_client) -> None: client = await aiohttp_client(create_app) resp = await client.get('/') assert resp.status == 200 text = await resp.text() assert 'Hello, world' in text async def test_hello_from_app(aiohttp_client, loop) -> None: app = web.Application() app.router.add_get('/', hello) client = await aiohttp_client(app) resp = await client.get('/') assert resp.status == 200 text = await resp.text() assert 
'Hello, world' in text async def test_hello_with_loop(aiohttp_client, loop) -> None: client = await aiohttp_client(create_app) resp = await client.get('/') assert resp.status == 200 text = await resp.text() assert 'Hello, world' in text async def test_set_args(aiohttp_client, loop) -> None: with pytest.raises(AssertionError): app = web.Application() await aiohttp_client(app, 1, 2, 3) async def test_set_keyword_args(aiohttp_client, loop) -> None: app = web.Application() with pytest.raises(TypeError): await aiohttp_client(app, param=1) async def test_noop() -> None: pass async def previous(request): if request.method == 'POST': with pytest.warns(DeprecationWarning): request.app['value'] = (await request.post())['value'] return web.Response(body=b'thanks for the data') else: v = request.app.get('value', 'unknown') return web.Response(body='value: {}'.format(v).encode()) def create_stateful_app(loop): app = web.Application() app.router.add_route('*', '/', previous) return app @pytest.fixture def cli(loop, aiohttp_client): return loop.run_until_complete(aiohttp_client(create_stateful_app)) async def test_set_value(cli) -> None: resp = await cli.post('/', data={'value': 'foo'}) assert resp.status == 200 text = await resp.text() assert text == 'thanks for the data' assert cli.server.app['value'] == 'foo' async def test_get_value(cli) -> None: resp = await cli.get('/') assert resp.status == 200 text = await resp.text() assert text == 'value: unknown' with pytest.warns(DeprecationWarning): cli.server.app['value'] = 'bar' resp = await cli.get('/') assert resp.status == 200 text = await resp.text() assert text == 'value: bar' def test_noncoro() -> None: assert True async def test_failed_to_create_client(aiohttp_client) -> None: def make_app(loop): raise RuntimeError() with pytest.raises(RuntimeError): await aiohttp_client(make_app) async def test_custom_port_aiohttp_client(aiohttp_client, aiohttp_unused_port): port = aiohttp_unused_port() client = await aiohttp_client(create_app, server_kwargs={'port': port}) assert client.port == port resp = await client.get('/') assert resp.status == 200 text = await resp.text() assert 'Hello, world' in text async def test_custom_port_test_server(aiohttp_server, aiohttp_unused_port): app = create_app() port = aiohttp_unused_port() server = await aiohttp_server(app, port=port) assert server.port == port """) testdir.makeconftest(CONFTEST) result = testdir.runpytest('-p', 'no:sugar', '--aiohttp-loop=pyloop') result.assert_outcomes(passed=12) def test_warning_checks(testdir) -> None: testdir.makepyfile("""\ async def foobar(): return 123 async def test_good() -> None: v = await foobar() assert v == 123 async def test_bad() -> None: foobar() """) testdir.makeconftest(CONFTEST) result = testdir.runpytest('-p', 'no:sugar', '-s', '-W', 'default', '--aiohttp-loop=pyloop') expected_outcomes = ( {'failed': 0, 'passed': 2} if IS_PYPY and bool(os.environ.get('PYTHONASYNCIODEBUG')) else {'failed': 1, 'passed': 1} ) """Under PyPy "coroutine 'foobar' was never awaited" does not happen.""" result.assert_outcomes(**expected_outcomes) def test_aiohttp_plugin_async_fixture(testdir, capsys) -> None: testdir.makepyfile("""\ import pytest from aiohttp import web async def hello(request): return web.Response(body=b'Hello, world') def create_app(loop): app = web.Application() app.router.add_route('GET', '/', hello) return app @pytest.fixture async def cli(aiohttp_client): client = await aiohttp_client(create_app) return client @pytest.fixture async def foo(): return 42 @pytest.fixture 
async def bar(request): # request should be accessible in async fixtures if needed return request.function async def test_hello(cli) -> None: resp = await cli.get('/') assert resp.status == 200 def test_foo(loop, foo) -> None: assert foo == 42 def test_foo_without_loop(foo) -> None: # will raise an error because there is no loop pass def test_bar(loop, bar) -> None: assert bar is test_bar """) testdir.makeconftest(CONFTEST) result = testdir.runpytest('-p', 'no:sugar', '--aiohttp-loop=pyloop') result.assert_outcomes(passed=3, error=1) result.stdout.fnmatch_lines( "*Asynchronous fixtures must depend on the 'loop' fixture " "or be used in tests depending from it." ) @pytest.mark.skipif(sys.version_info < (3, 6), reason='old python') def test_aiohttp_plugin_async_gen_fixture(testdir) -> None: testdir.makepyfile("""\ import pytest from unittest import mock from aiohttp import web canary = mock.Mock() async def hello(request): return web.Response(body=b'Hello, world') def create_app(loop): app = web.Application() app.router.add_route('GET', '/', hello) return app @pytest.fixture async def cli(aiohttp_client): yield await aiohttp_client(create_app) canary() async def test_hello(cli) -> None: resp = await cli.get('/') assert resp.status == 200 def test_finalized() -> None: assert canary.called is True """) testdir.makeconftest(CONFTEST) result = testdir.runpytest('-p', 'no:sugar', '--aiohttp-loop=pyloop') result.assert_outcomes(passed=2) aiohttp-3.6.2/tests/test_resolver.py0000644000175100001650000001707113547410117020107 0ustar vstsdocker00000000000000import asyncio import ipaddress import socket from unittest.mock import Mock, patch import pytest from aiohttp.resolver import AsyncResolver, DefaultResolver, ThreadedResolver try: import aiodns gethostbyname = hasattr(aiodns.DNSResolver, 'gethostbyname') except ImportError: aiodns = None gethostbyname = False class FakeResult: def __init__(self, addresses): self.addresses = addresses class FakeQueryResult: def __init__(self, host): self.host = host async def fake_result(addresses): return FakeResult(addresses=tuple(addresses)) async def fake_query_result(result): return [FakeQueryResult(host=h) for h in result] def fake_addrinfo(hosts): async def fake(*args, **kwargs): if not hosts: raise socket.gaierror return list([(None, None, None, None, [h, 0]) for h in hosts]) return fake @pytest.mark.skipif(not gethostbyname, reason="aiodns 1.1 required") async def test_async_resolver_positive_lookup(loop) -> None: with patch('aiodns.DNSResolver') as mock: mock().gethostbyname.return_value = fake_result(['127.0.0.1']) resolver = AsyncResolver(loop=loop) real = await resolver.resolve('www.python.org') ipaddress.ip_address(real[0]['host']) mock().gethostbyname.assert_called_with('www.python.org', socket.AF_INET) @pytest.mark.skipif(aiodns is None, reason="aiodns required") async def test_async_resolver_query_positive_lookup(loop) -> None: with patch('aiodns.DNSResolver') as mock: del mock().gethostbyname mock().query.return_value = fake_query_result(['127.0.0.1']) resolver = AsyncResolver(loop=loop) real = await resolver.resolve('www.python.org') ipaddress.ip_address(real[0]['host']) mock().query.assert_called_with('www.python.org', 'A') @pytest.mark.skipif(not gethostbyname, reason="aiodns 1.1 required") async def test_async_resolver_multiple_replies(loop) -> None: with patch('aiodns.DNSResolver') as mock: ips = ['127.0.0.1', '127.0.0.2', '127.0.0.3', '127.0.0.4'] mock().gethostbyname.return_value = fake_result(ips) resolver = AsyncResolver(loop=loop) real = 
await resolver.resolve('www.google.com') ips = [ipaddress.ip_address(x['host']) for x in real] assert len(ips) > 3, "Expecting multiple addresses" @pytest.mark.skipif(aiodns is None, reason="aiodns required") async def test_async_resolver_query_multiple_replies(loop) -> None: with patch('aiodns.DNSResolver') as mock: del mock().gethostbyname ips = ['127.0.0.1', '127.0.0.2', '127.0.0.3', '127.0.0.4'] mock().query.return_value = fake_query_result(ips) resolver = AsyncResolver(loop=loop) real = await resolver.resolve('www.google.com') ips = [ipaddress.ip_address(x['host']) for x in real] @pytest.mark.skipif(not gethostbyname, reason="aiodns 1.1 required") async def test_async_resolver_negative_lookup(loop) -> None: with patch('aiodns.DNSResolver') as mock: mock().gethostbyname.side_effect = aiodns.error.DNSError() resolver = AsyncResolver(loop=loop) with pytest.raises(OSError): await resolver.resolve('doesnotexist.bla') @pytest.mark.skipif(aiodns is None, reason="aiodns required") async def test_async_resolver_query_negative_lookup(loop) -> None: with patch('aiodns.DNSResolver') as mock: del mock().gethostbyname mock().query.side_effect = aiodns.error.DNSError() resolver = AsyncResolver(loop=loop) with pytest.raises(OSError): await resolver.resolve('doesnotexist.bla') @pytest.mark.skipif(aiodns is None, reason="aiodns required") async def test_async_resolver_no_hosts_in_query(loop) -> None: with patch('aiodns.DNSResolver') as mock: del mock().gethostbyname mock().query.return_value = fake_query_result([]) resolver = AsyncResolver(loop=loop) with pytest.raises(OSError): await resolver.resolve('doesnotexist.bla') @pytest.mark.skipif(not gethostbyname, reason="aiodns 1.1 required") async def test_async_resolver_no_hosts_in_gethostbyname(loop) -> None: with patch('aiodns.DNSResolver') as mock: mock().gethostbyname.return_value = fake_result([]) resolver = AsyncResolver(loop=loop) with pytest.raises(OSError): await resolver.resolve('doesnotexist.bla') async def test_threaded_resolver_positive_lookup() -> None: loop = Mock() loop.getaddrinfo = fake_addrinfo(["127.0.0.1"]) resolver = ThreadedResolver(loop=loop) real = await resolver.resolve('www.python.org') ipaddress.ip_address(real[0]['host']) async def test_threaded_resolver_multiple_replies() -> None: loop = Mock() ips = ['127.0.0.1', '127.0.0.2', '127.0.0.3', '127.0.0.4'] loop.getaddrinfo = fake_addrinfo(ips) resolver = ThreadedResolver(loop=loop) real = await resolver.resolve('www.google.com') ips = [ipaddress.ip_address(x['host']) for x in real] assert len(ips) > 3, "Expecting multiple addresses" async def test_threaded_negative_lookup() -> None: loop = Mock() ips = [] loop.getaddrinfo = fake_addrinfo(ips) resolver = ThreadedResolver(loop=loop) with pytest.raises(socket.gaierror): await resolver.resolve('doesnotexist.bla') async def test_close_for_threaded_resolver(loop) -> None: resolver = ThreadedResolver(loop=loop) await resolver.close() @pytest.mark.skipif(aiodns is None, reason="aiodns required") async def test_close_for_async_resolver(loop) -> None: resolver = AsyncResolver(loop=loop) await resolver.close() async def test_default_loop_for_threaded_resolver(loop) -> None: asyncio.set_event_loop(loop) resolver = ThreadedResolver() assert resolver._loop is loop @pytest.mark.skipif(aiodns is None, reason="aiodns required") async def test_default_loop_for_async_resolver(loop) -> None: asyncio.set_event_loop(loop) resolver = AsyncResolver() assert resolver._loop is loop @pytest.mark.skipif(not gethostbyname, reason="aiodns 1.1 required") 
async def test_async_resolver_ipv6_positive_lookup(loop) -> None: with patch('aiodns.DNSResolver') as mock: mock().gethostbyname.return_value = fake_result(['::1']) resolver = AsyncResolver(loop=loop) real = await resolver.resolve('www.python.org', family=socket.AF_INET6) ipaddress.ip_address(real[0]['host']) mock().gethostbyname.assert_called_with('www.python.org', socket.AF_INET6) @pytest.mark.skipif(aiodns is None, reason="aiodns required") async def test_async_resolver_query_ipv6_positive_lookup(loop) -> None: with patch('aiodns.DNSResolver') as mock: del mock().gethostbyname mock().query.return_value = fake_query_result(['::1']) resolver = AsyncResolver(loop=loop) real = await resolver.resolve('www.python.org', family=socket.AF_INET6) ipaddress.ip_address(real[0]['host']) mock().query.assert_called_with('www.python.org', 'AAAA') async def test_async_resolver_aiodns_not_present(loop, monkeypatch) -> None: monkeypatch.setattr("aiohttp.resolver.aiodns", None) with pytest.raises(RuntimeError): AsyncResolver(loop=loop) def test_default_resolver() -> None: # if gethostbyname: # assert DefaultResolver is AsyncResolver # else: # assert DefaultResolver is ThreadedResolver assert DefaultResolver is ThreadedResolver aiohttp-3.6.2/tests/test_route_def.py0000644000175100001650000001532713547410117020224 0ustar vstsdocker00000000000000import pathlib import pytest from yarl import URL from aiohttp import web from aiohttp.web_urldispatcher import UrlDispatcher @pytest.fixture def router(): return UrlDispatcher() def test_get(router) -> None: async def handler(request): pass router.add_routes([web.get('/', handler)]) assert len(router.routes()) == 2 # GET and HEAD route = list(router.routes())[1] assert route.handler is handler assert route.method == 'GET' assert str(route.url_for()) == '/' route2 = list(router.routes())[0] assert route2.handler is handler assert route2.method == 'HEAD' def test_head(router) -> None: async def handler(request): pass router.add_routes([web.head('/', handler)]) assert len(router.routes()) == 1 route = list(router.routes())[0] assert route.handler is handler assert route.method == 'HEAD' assert str(route.url_for()) == '/' def test_options(router) -> None: async def handler(request): pass router.add_routes([web.options('/', handler)]) assert len(router.routes()) == 1 route = list(router.routes())[0] assert route.handler is handler assert route.method == 'OPTIONS' assert str(route.url_for()) == '/' def test_post(router) -> None: async def handler(request): pass router.add_routes([web.post('/', handler)]) route = list(router.routes())[0] assert route.handler is handler assert route.method == 'POST' assert str(route.url_for()) == '/' def test_put(router) -> None: async def handler(request): pass router.add_routes([web.put('/', handler)]) assert len(router.routes()) == 1 route = list(router.routes())[0] assert route.handler is handler assert route.method == 'PUT' assert str(route.url_for()) == '/' def test_patch(router) -> None: async def handler(request): pass router.add_routes([web.patch('/', handler)]) assert len(router.routes()) == 1 route = list(router.routes())[0] assert route.handler is handler assert route.method == 'PATCH' assert str(route.url_for()) == '/' def test_delete(router) -> None: async def handler(request): pass router.add_routes([web.delete('/', handler)]) assert len(router.routes()) == 1 route = list(router.routes())[0] assert route.handler is handler assert route.method == 'DELETE' assert str(route.url_for()) == '/' def test_route(router) -> None: async 
def handler(request): pass router.add_routes([web.route('OTHER', '/', handler)]) assert len(router.routes()) == 1 route = list(router.routes())[0] assert route.handler is handler assert route.method == 'OTHER' assert str(route.url_for()) == '/' def test_static(router) -> None: folder = pathlib.Path(__file__).parent router.add_routes([web.static('/prefix', folder)]) assert len(router.resources()) == 1 # 2 routes: for HEAD and GET resource = list(router.resources())[0] info = resource.get_info() assert info['prefix'] == '/prefix' assert info['directory'] == folder url = resource.url_for(filename='aiohttp.png') assert url == URL('/prefix/aiohttp.png') def test_head_deco(router) -> None: routes = web.RouteTableDef() @routes.head('/path') async def handler(request): pass router.add_routes(routes) assert len(router.routes()) == 1 route = list(router.routes())[0] assert route.method == 'HEAD' assert str(route.url_for()) == '/path' def test_get_deco(router) -> None: routes = web.RouteTableDef() @routes.get('/path') async def handler(request): pass router.add_routes(routes) assert len(router.routes()) == 2 route1 = list(router.routes())[0] assert route1.method == 'HEAD' assert str(route1.url_for()) == '/path' route2 = list(router.routes())[1] assert route2.method == 'GET' assert str(route2.url_for()) == '/path' def test_post_deco(router) -> None: routes = web.RouteTableDef() @routes.post('/path') async def handler(request): pass router.add_routes(routes) assert len(router.routes()) == 1 route = list(router.routes())[0] assert route.method == 'POST' assert str(route.url_for()) == '/path' def test_put_deco(router) -> None: routes = web.RouteTableDef() @routes.put('/path') async def handler(request): pass router.add_routes(routes) assert len(router.routes()) == 1 route = list(router.routes())[0] assert route.method == 'PUT' assert str(route.url_for()) == '/path' def test_patch_deco(router) -> None: routes = web.RouteTableDef() @routes.patch('/path') async def handler(request): pass router.add_routes(routes) assert len(router.routes()) == 1 route = list(router.routes())[0] assert route.method == 'PATCH' assert str(route.url_for()) == '/path' def test_delete_deco(router) -> None: routes = web.RouteTableDef() @routes.delete('/path') async def handler(request): pass router.add_routes(routes) assert len(router.routes()) == 1 route = list(router.routes())[0] assert route.method == 'DELETE' assert str(route.url_for()) == '/path' def test_route_deco(router) -> None: routes = web.RouteTableDef() @routes.route('OTHER', '/path') async def handler(request): pass router.add_routes(routes) assert len(router.routes()) == 1 route = list(router.routes())[0] assert route.method == 'OTHER' assert str(route.url_for()) == '/path' def test_routedef_sequence_protocol() -> None: routes = web.RouteTableDef() @routes.delete('/path') async def handler(request): pass assert len(routes) == 1 info = routes[0] assert isinstance(info, web.RouteDef) assert info in routes assert list(routes)[0] is info def test_repr_route_def() -> None: routes = web.RouteTableDef() @routes.get('/path') async def handler(request): pass rd = routes[0] assert repr(rd) == " 'handler'>" def test_repr_route_def_with_extra_info() -> None: routes = web.RouteTableDef() @routes.get('/path', extra='info') async def handler(request): pass rd = routes[0] assert repr(rd) == " 'handler', extra='info'>" def test_repr_static_def() -> None: routes = web.RouteTableDef() routes.static('/prefix', '/path', name='name') rd = routes[0] assert repr(rd) == " /path, 
name='name'>" def test_repr_route_table_def() -> None: routes = web.RouteTableDef() @routes.get('/path') async def handler(request): pass assert repr(routes) == "" aiohttp-3.6.2/tests/test_run_app.py0000644000175100001650000005453613547410117017721 0ustar vstsdocker00000000000000import asyncio import contextlib import logging import os import platform import signal import socket import ssl import subprocess import sys from unittest import mock from uuid import uuid4 import pytest from aiohttp import web from aiohttp.helpers import PY_37 from aiohttp.test_utils import make_mocked_coro # Test for features of OS' socket support _has_unix_domain_socks = hasattr(socket, 'AF_UNIX') if _has_unix_domain_socks: _abstract_path_sock = socket.socket(socket.AF_UNIX, socket.SOCK_STREAM) try: _abstract_path_sock.bind(b"\x00" + uuid4().hex.encode('ascii')) # type: ignore # noqa except FileNotFoundError: _abstract_path_failed = True else: _abstract_path_failed = False finally: _abstract_path_sock.close() del _abstract_path_sock else: _abstract_path_failed = True skip_if_no_abstract_paths = pytest.mark.skipif( _abstract_path_failed, reason="Linux-style abstract paths are not supported." ) skip_if_no_unix_socks = pytest.mark.skipif( not _has_unix_domain_socks, reason="Unix domain sockets are not supported" ) del _has_unix_domain_socks, _abstract_path_failed HAS_IPV6 = socket.has_ipv6 if HAS_IPV6: # The socket.has_ipv6 flag may be True if Python was built with IPv6 # support, but the target system still may not have it. # So let's ensure that we really have IPv6 support. try: socket.socket(socket.AF_INET6, socket.SOCK_STREAM) except OSError: HAS_IPV6 = False # tokio event loop does not allow to override attributes def skip_if_no_dict(loop): if not hasattr(loop, '__dict__'): pytest.skip("can not override loop attributes") def skip_if_on_windows(): if platform.system() == "Windows": pytest.skip("the test is not valid for Windows") @pytest.fixture def patched_loop(loop): skip_if_no_dict(loop) server = mock.Mock() server.wait_closed = make_mocked_coro(None) loop.create_server = make_mocked_coro(server) unix_server = mock.Mock() unix_server.wait_closed = make_mocked_coro(None) loop.create_unix_server = make_mocked_coro(unix_server) asyncio.set_event_loop(loop) return loop def stopper(loop): def raiser(): raise KeyboardInterrupt def f(*args): loop.call_soon(raiser) return f def test_run_app_http(patched_loop) -> None: app = web.Application() startup_handler = make_mocked_coro() app.on_startup.append(startup_handler) cleanup_handler = make_mocked_coro() app.on_cleanup.append(cleanup_handler) web.run_app(app, print=stopper(patched_loop)) patched_loop.create_server.assert_called_with(mock.ANY, '0.0.0.0', 8080, ssl=None, backlog=128, reuse_address=None, reuse_port=None) startup_handler.assert_called_once_with(app) cleanup_handler.assert_called_once_with(app) def test_run_app_close_loop(patched_loop) -> None: app = web.Application() web.run_app(app, print=stopper(patched_loop)) patched_loop.create_server.assert_called_with(mock.ANY, '0.0.0.0', 8080, ssl=None, backlog=128, reuse_address=None, reuse_port=None) assert patched_loop.is_closed() mock_unix_server_single = [ mock.call(mock.ANY, '/tmp/testsock1.sock', ssl=None, backlog=128), ] mock_unix_server_multi = [ mock.call(mock.ANY, '/tmp/testsock1.sock', ssl=None, backlog=128), mock.call(mock.ANY, '/tmp/testsock2.sock', ssl=None, backlog=128), ] mock_server_single = [ mock.call(mock.ANY, '127.0.0.1', 8080, ssl=None, backlog=128, reuse_address=None, reuse_port=None), ] 
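# The ``mock_server_*`` / ``mock_unix_server_*`` lists above and below encode
# the calls that ``web.run_app()`` is expected to make on the patched event
# loop: each TCP binding (derived from ``host``/``port``/``sock``) becomes one
# ``loop.create_server()`` call and each unix-socket ``path`` entry becomes one
# ``loop.create_unix_server()`` call, all sharing the ``ssl`` and ``backlog``
# options.  A minimal sketch of the corresponding user-facing call follows;
# the name ``my_app`` is illustrative and not taken from this module:
#
#     my_app = web.Application()
#     web.run_app(
#         my_app,
#         host=('127.0.0.1', '192.168.1.1'),  # -> two create_server() calls
#         path='/tmp/testsock1.sock',         # -> one create_unix_server() call
#         port=8000,
#     )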
mock_server_multi = [ mock.call(mock.ANY, '127.0.0.1', 8080, ssl=None, backlog=128, reuse_address=None, reuse_port=None), mock.call(mock.ANY, '192.168.1.1', 8080, ssl=None, backlog=128, reuse_address=None, reuse_port=None), ] mock_server_default_8989 = [ mock.call(mock.ANY, '0.0.0.0', 8989, ssl=None, backlog=128, reuse_address=None, reuse_port=None) ] mock_socket = mock.Mock(getsockname=lambda: ('mock-socket', 123)) mixed_bindings_tests = ( ( # type: ignore "Nothing Specified", {}, [mock.call(mock.ANY, '0.0.0.0', 8080, ssl=None, backlog=128, reuse_address=None, reuse_port=None)], [] ), ( "Port Only", {'port': 8989}, mock_server_default_8989, [] ), ( "Multiple Hosts", {'host': ('127.0.0.1', '192.168.1.1')}, mock_server_multi, [] ), ( "Multiple Paths", {'path': ('/tmp/testsock1.sock', '/tmp/testsock2.sock')}, [], mock_unix_server_multi ), ( "Multiple Paths, Port", {'path': ('/tmp/testsock1.sock', '/tmp/testsock2.sock'), 'port': 8989}, mock_server_default_8989, mock_unix_server_multi, ), ( "Multiple Paths, Single Host", {'path': ('/tmp/testsock1.sock', '/tmp/testsock2.sock'), 'host': '127.0.0.1'}, mock_server_single, mock_unix_server_multi ), ( "Single Path, Single Host", {'path': '/tmp/testsock1.sock', 'host': '127.0.0.1'}, mock_server_single, mock_unix_server_single ), ( "Single Path, Multiple Hosts", {'path': '/tmp/testsock1.sock', 'host': ('127.0.0.1', '192.168.1.1')}, mock_server_multi, mock_unix_server_single ), ( "Single Path, Port", {'path': '/tmp/testsock1.sock', 'port': 8989}, mock_server_default_8989, mock_unix_server_single ), ( "Multiple Paths, Multiple Hosts, Port", {'path': ('/tmp/testsock1.sock', '/tmp/testsock2.sock'), 'host': ('127.0.0.1', '192.168.1.1'), 'port': 8000}, [mock.call(mock.ANY, '127.0.0.1', 8000, ssl=None, backlog=128, reuse_address=None, reuse_port=None), mock.call(mock.ANY, '192.168.1.1', 8000, ssl=None, backlog=128, reuse_address=None, reuse_port=None)], mock_unix_server_multi ), ( "Only socket", {"sock": [mock_socket]}, [mock.call(mock.ANY, ssl=None, sock=mock_socket, backlog=128)], [], ), ( "Socket, port", {"sock": [mock_socket], "port": 8765}, [mock.call(mock.ANY, '0.0.0.0', 8765, ssl=None, backlog=128, reuse_address=None, reuse_port=None), mock.call(mock.ANY, sock=mock_socket, ssl=None, backlog=128)], [], ), ( "Socket, Host, No port", {"sock": [mock_socket], "host": 'localhost'}, [mock.call(mock.ANY, 'localhost', 8080, ssl=None, backlog=128, reuse_address=None, reuse_port=None), mock.call(mock.ANY, sock=mock_socket, ssl=None, backlog=128)], [], ), ( "reuse_port", {"reuse_port": True}, [mock.call(mock.ANY, '0.0.0.0', 8080, ssl=None, backlog=128, reuse_address=None, reuse_port=True)], [] ), ( "reuse_address", {"reuse_address": False}, [mock.call(mock.ANY, '0.0.0.0', 8080, ssl=None, backlog=128, reuse_address=False, reuse_port=None)], [] ), ( "reuse_port, reuse_address", {"reuse_address": True, "reuse_port": True}, [mock.call(mock.ANY, '0.0.0.0', 8080, ssl=None, backlog=128, reuse_address=True, reuse_port=True)], [] ), ( "Port, reuse_port", {'port': 8989, "reuse_port": True}, [mock.call(mock.ANY, '0.0.0.0', 8989, ssl=None, backlog=128, reuse_address=None, reuse_port=True)], [] ), ( "Multiple Hosts, reuse_port", {'host': ('127.0.0.1', '192.168.1.1'), "reuse_port": True}, [ mock.call(mock.ANY, '127.0.0.1', 8080, ssl=None, backlog=128, reuse_address=None, reuse_port=True), mock.call(mock.ANY, '192.168.1.1', 8080, ssl=None, backlog=128, reuse_address=None, reuse_port=True), ], [] ), ( "Multiple Paths, Port, reuse_address", {'path': ('/tmp/testsock1.sock', 
'/tmp/testsock2.sock'), 'port': 8989, 'reuse_address': False}, [mock.call(mock.ANY, '0.0.0.0', 8989, ssl=None, backlog=128, reuse_address=False, reuse_port=None)], mock_unix_server_multi, ), ( "Multiple Paths, Single Host, reuse_address, reuse_port", {'path': ('/tmp/testsock1.sock', '/tmp/testsock2.sock'), 'host': '127.0.0.1', 'reuse_address': True, 'reuse_port': True}, [ mock.call(mock.ANY, '127.0.0.1', 8080, ssl=None, backlog=128, reuse_address=True, reuse_port=True), ], mock_unix_server_multi ), ) mixed_bindings_test_ids = [test[0] for test in mixed_bindings_tests] mixed_bindings_test_params = [test[1:] for test in mixed_bindings_tests] @pytest.mark.parametrize( 'run_app_kwargs, expected_server_calls, expected_unix_server_calls', mixed_bindings_test_params, ids=mixed_bindings_test_ids ) def test_run_app_mixed_bindings(run_app_kwargs, expected_server_calls, expected_unix_server_calls, patched_loop): app = web.Application() web.run_app(app, print=stopper(patched_loop), **run_app_kwargs) assert (patched_loop.create_unix_server.mock_calls == expected_unix_server_calls) assert (patched_loop.create_server.mock_calls == expected_server_calls) def test_run_app_https(patched_loop) -> None: app = web.Application() ssl_context = ssl.create_default_context() web.run_app(app, ssl_context=ssl_context, print=stopper(patched_loop)) patched_loop.create_server.assert_called_with( mock.ANY, '0.0.0.0', 8443, ssl=ssl_context, backlog=128, reuse_address=None, reuse_port=None) def test_run_app_nondefault_host_port(patched_loop, aiohttp_unused_port) -> None: port = aiohttp_unused_port() host = '127.0.0.1' app = web.Application() web.run_app(app, host=host, port=port, print=stopper(patched_loop)) patched_loop.create_server.assert_called_with(mock.ANY, host, port, ssl=None, backlog=128, reuse_address=None, reuse_port=None) def test_run_app_custom_backlog(patched_loop) -> None: app = web.Application() web.run_app(app, backlog=10, print=stopper(patched_loop)) patched_loop.create_server.assert_called_with( mock.ANY, '0.0.0.0', 8080, ssl=None, backlog=10, reuse_address=None, reuse_port=None) def test_run_app_custom_backlog_unix(patched_loop) -> None: app = web.Application() web.run_app(app, path='/tmp/tmpsock.sock', backlog=10, print=stopper(patched_loop)) patched_loop.create_unix_server.assert_called_with( mock.ANY, '/tmp/tmpsock.sock', ssl=None, backlog=10) @skip_if_no_unix_socks def test_run_app_http_unix_socket(patched_loop, shorttmpdir) -> None: app = web.Application() sock_path = str(shorttmpdir / 'socket.sock') printer = mock.Mock(wraps=stopper(patched_loop)) web.run_app(app, path=sock_path, print=printer) patched_loop.create_unix_server.assert_called_with(mock.ANY, sock_path, ssl=None, backlog=128) assert "http://unix:{}:".format(sock_path) in printer.call_args[0][0] @skip_if_no_unix_socks def test_run_app_https_unix_socket(patched_loop, shorttmpdir) -> None: app = web.Application() sock_path = str(shorttmpdir / 'socket.sock') ssl_context = ssl.create_default_context() printer = mock.Mock(wraps=stopper(patched_loop)) web.run_app(app, path=sock_path, ssl_context=ssl_context, print=printer) patched_loop.create_unix_server.assert_called_with( mock.ANY, sock_path, ssl=ssl_context, backlog=128) assert "https://unix:{}:".format(sock_path) in printer.call_args[0][0] @skip_if_no_unix_socks @skip_if_no_abstract_paths def test_run_app_abstract_linux_socket(patched_loop) -> None: sock_path = b"\x00" + uuid4().hex.encode('ascii') app = web.Application() web.run_app( app, path=sock_path.decode('ascii', 'ignore'), 
print=stopper(patched_loop)) patched_loop.create_unix_server.assert_called_with( mock.ANY, sock_path.decode('ascii'), ssl=None, backlog=128 ) def test_run_app_preexisting_inet_socket(patched_loop, mocker) -> None: app = web.Application() sock = socket.socket() with contextlib.closing(sock): sock.bind(('0.0.0.0', 0)) _, port = sock.getsockname() printer = mock.Mock(wraps=stopper(patched_loop)) web.run_app(app, sock=sock, print=printer) patched_loop.create_server.assert_called_with( mock.ANY, sock=sock, backlog=128, ssl=None ) assert "http://0.0.0.0:{}".format(port) in printer.call_args[0][0] @pytest.mark.skipif(not HAS_IPV6, reason="IPv6 is not available") def test_run_app_preexisting_inet6_socket(patched_loop) -> None: app = web.Application() sock = socket.socket(socket.AF_INET6) with contextlib.closing(sock): sock.bind(('::', 0)) port = sock.getsockname()[1] printer = mock.Mock(wraps=stopper(patched_loop)) web.run_app(app, sock=sock, print=printer) patched_loop.create_server.assert_called_with( mock.ANY, sock=sock, backlog=128, ssl=None ) assert "http://[::]:{}".format(port) in printer.call_args[0][0] @skip_if_no_unix_socks def test_run_app_preexisting_unix_socket(patched_loop, mocker) -> None: app = web.Application() sock_path = '/tmp/test_preexisting_sock1' sock = socket.socket(socket.AF_UNIX) with contextlib.closing(sock): sock.bind(sock_path) os.unlink(sock_path) printer = mock.Mock(wraps=stopper(patched_loop)) web.run_app(app, sock=sock, print=printer) patched_loop.create_server.assert_called_with( mock.ANY, sock=sock, backlog=128, ssl=None ) assert "http://unix:{}:".format(sock_path) in printer.call_args[0][0] def test_run_app_multiple_preexisting_sockets(patched_loop) -> None: app = web.Application() sock1 = socket.socket() sock2 = socket.socket() with contextlib.closing(sock1), contextlib.closing(sock2): sock1.bind(('0.0.0.0', 0)) _, port1 = sock1.getsockname() sock2.bind(('0.0.0.0', 0)) _, port2 = sock2.getsockname() printer = mock.Mock(wraps=stopper(patched_loop)) web.run_app(app, sock=(sock1, sock2), print=printer) patched_loop.create_server.assert_has_calls([ mock.call(mock.ANY, sock=sock1, backlog=128, ssl=None), mock.call(mock.ANY, sock=sock2, backlog=128, ssl=None) ]) assert "http://0.0.0.0:{}".format(port1) in printer.call_args[0][0] assert "http://0.0.0.0:{}".format(port2) in printer.call_args[0][0] _script_test_signal = """ from aiohttp import web app = web.Application() web.run_app(app, host=()) """ def test_sigint() -> None: skip_if_on_windows() proc = subprocess.Popen([sys.executable, "-u", "-c", _script_test_signal], stdout=subprocess.PIPE) for line in proc.stdout: if line.startswith(b"======== Running on"): break proc.send_signal(signal.SIGINT) assert proc.wait() == 0 def test_sigterm() -> None: skip_if_on_windows() proc = subprocess.Popen([sys.executable, "-u", "-c", _script_test_signal], stdout=subprocess.PIPE) for line in proc.stdout: if line.startswith(b"======== Running on"): break proc.terminate() assert proc.wait() == 0 def test_startup_cleanup_signals_even_on_failure(patched_loop) -> None: patched_loop.create_server = mock.Mock(side_effect=RuntimeError()) app = web.Application() startup_handler = make_mocked_coro() app.on_startup.append(startup_handler) cleanup_handler = make_mocked_coro() app.on_cleanup.append(cleanup_handler) with pytest.raises(RuntimeError): web.run_app(app, print=stopper(patched_loop)) startup_handler.assert_called_once_with(app) cleanup_handler.assert_called_once_with(app) def test_run_app_coro(patched_loop) -> None: startup_handler = 
cleanup_handler = None async def make_app(): nonlocal startup_handler, cleanup_handler app = web.Application() startup_handler = make_mocked_coro() app.on_startup.append(startup_handler) cleanup_handler = make_mocked_coro() app.on_cleanup.append(cleanup_handler) return app web.run_app(make_app(), print=stopper(patched_loop)) patched_loop.create_server.assert_called_with(mock.ANY, '0.0.0.0', 8080, ssl=None, backlog=128, reuse_address=None, reuse_port=None) startup_handler.assert_called_once_with(mock.ANY) cleanup_handler.assert_called_once_with(mock.ANY) def test_run_app_default_logger(monkeypatch, patched_loop): patched_loop.set_debug(True) logger = web.access_logger attrs = { 'hasHandlers.return_value': False, 'level': logging.NOTSET, 'name': 'aiohttp.access', } mock_logger = mock.create_autospec(logger, name='mock_access_logger') mock_logger.configure_mock(**attrs) app = web.Application() web.run_app(app, print=stopper(patched_loop), access_log=mock_logger) mock_logger.setLevel.assert_any_call(logging.DEBUG) mock_logger.hasHandlers.assert_called_with() assert isinstance(mock_logger.addHandler.call_args[0][0], logging.StreamHandler) def test_run_app_default_logger_setup_requires_debug(patched_loop): patched_loop.set_debug(False) logger = web.access_logger attrs = { 'hasHandlers.return_value': False, 'level': logging.NOTSET, 'name': 'aiohttp.access', } mock_logger = mock.create_autospec(logger, name='mock_access_logger') mock_logger.configure_mock(**attrs) app = web.Application() web.run_app(app, print=stopper(patched_loop), access_log=mock_logger) mock_logger.setLevel.assert_not_called() mock_logger.hasHandlers.assert_not_called() mock_logger.addHandler.assert_not_called() def test_run_app_default_logger_setup_requires_default_logger(patched_loop): patched_loop.set_debug(True) logger = web.access_logger attrs = { 'hasHandlers.return_value': False, 'level': logging.NOTSET, 'name': None, } mock_logger = mock.create_autospec(logger, name='mock_access_logger') mock_logger.configure_mock(**attrs) app = web.Application() web.run_app(app, print=stopper(patched_loop), access_log=mock_logger) mock_logger.setLevel.assert_not_called() mock_logger.hasHandlers.assert_not_called() mock_logger.addHandler.assert_not_called() def test_run_app_default_logger_setup_only_if_unconfigured(patched_loop): patched_loop.set_debug(True) logger = web.access_logger attrs = { 'hasHandlers.return_value': True, 'level': None, 'name': 'aiohttp.access', } mock_logger = mock.create_autospec(logger, name='mock_access_logger') mock_logger.configure_mock(**attrs) app = web.Application() web.run_app(app, print=stopper(patched_loop), access_log=mock_logger) mock_logger.setLevel.assert_not_called() mock_logger.hasHandlers.assert_called_with() mock_logger.addHandler.assert_not_called() def test_run_app_cancels_all_pending_tasks(patched_loop): app = web.Application() task = None async def on_startup(app): nonlocal task loop = asyncio.get_event_loop() task = loop.create_task(asyncio.sleep(1000)) app.on_startup.append(on_startup) web.run_app(app, print=stopper(patched_loop)) assert task.cancelled() def test_run_app_cancels_done_tasks(patched_loop): app = web.Application() task = None async def coro(): return 123 async def on_startup(app): nonlocal task loop = asyncio.get_event_loop() task = loop.create_task(coro()) app.on_startup.append(on_startup) web.run_app(app, print=stopper(patched_loop)) assert task.done() def test_run_app_cancels_failed_tasks(patched_loop): app = web.Application() task = None exc = RuntimeError("FAIL") async 
def fail(): try: await asyncio.sleep(1000) except asyncio.CancelledError: raise exc async def on_startup(app): nonlocal task loop = asyncio.get_event_loop() task = loop.create_task(fail()) await asyncio.sleep(0.01) app.on_startup.append(on_startup) exc_handler = mock.Mock() patched_loop.set_exception_handler(exc_handler) web.run_app(app, print=stopper(patched_loop)) assert task.done() msg = { 'message': 'unhandled exception during asyncio.run() shutdown', 'exception': exc, 'task': task, } exc_handler.assert_called_with(patched_loop, msg) @pytest.mark.skipif(not PY_37, reason="contextvars support is required") def test_run_app_context_vars(patched_loop): from contextvars import ContextVar count = 0 VAR = ContextVar('VAR', default='default') async def on_startup(app): nonlocal count assert 'init' == VAR.get() VAR.set('on_startup') count += 1 async def on_cleanup(app): nonlocal count assert 'on_startup' == VAR.get() count += 1 async def init(): nonlocal count assert 'default' == VAR.get() VAR.set('init') app = web.Application() app.on_startup.append(on_startup) app.on_cleanup.append(on_cleanup) count += 1 return app web.run_app(init(), print=stopper(patched_loop)) assert count == 3 aiohttp-3.6.2/tests/test_signals.py0000644000175100001650000000720513547410117017704 0ustar vstsdocker00000000000000import re from unittest import mock import pytest from multidict import CIMultiDict from aiohttp.signals import Signal from aiohttp.test_utils import make_mocked_coro, make_mocked_request from aiohttp.web import Application, Response @pytest.fixture def app(): return Application() def make_request(app, method, path, headers=CIMultiDict()): return make_mocked_request(method, path, headers, app=app) async def test_add_signal_handler_not_a_callable(app) -> None: callback = True app.on_response_prepare.append(callback) app.on_response_prepare.freeze() with pytest.raises(TypeError): await app.on_response_prepare(None, None) async def test_function_signal_dispatch(app) -> None: signal = Signal(app) kwargs = {'foo': 1, 'bar': 2} callback_mock = mock.Mock() async def callback(**kwargs): callback_mock(**kwargs) signal.append(callback) signal.freeze() await signal.send(**kwargs) callback_mock.assert_called_once_with(**kwargs) async def test_function_signal_dispatch2(app) -> None: signal = Signal(app) args = {'a', 'b'} kwargs = {'foo': 1, 'bar': 2} callback_mock = mock.Mock() async def callback(*args, **kwargs): callback_mock(*args, **kwargs) signal.append(callback) signal.freeze() await signal.send(*args, **kwargs) callback_mock.assert_called_once_with(*args, **kwargs) async def test_response_prepare(app) -> None: callback = mock.Mock() async def cb(*args, **kwargs): callback(*args, **kwargs) app.on_response_prepare.append(cb) app.on_response_prepare.freeze() request = make_request(app, 'GET', '/') response = Response(body=b'') await response.prepare(request) callback.assert_called_once_with(request, response) async def test_non_coroutine(app) -> None: signal = Signal(app) kwargs = {'foo': 1, 'bar': 2} callback = mock.Mock() signal.append(callback) signal.freeze() with pytest.raises(TypeError): await signal.send(**kwargs) def test_setitem(app) -> None: signal = Signal(app) m1 = mock.Mock() signal.append(m1) assert signal[0] is m1 m2 = mock.Mock() signal[0] = m2 assert signal[0] is m2 def test_delitem(app) -> None: signal = Signal(app) m1 = mock.Mock() signal.append(m1) assert len(signal) == 1 del signal[0] assert len(signal) == 0 def test_cannot_append_to_frozen_signal(app) -> None: signal = Signal(app) m1 = 
mock.Mock() m2 = mock.Mock() signal.append(m1) signal.freeze() with pytest.raises(RuntimeError): signal.append(m2) assert list(signal) == [m1] def test_cannot_setitem_in_frozen_signal(app) -> None: signal = Signal(app) m1 = mock.Mock() m2 = mock.Mock() signal.append(m1) signal.freeze() with pytest.raises(RuntimeError): signal[0] = m2 assert list(signal) == [m1] def test_cannot_delitem_in_frozen_signal(app) -> None: signal = Signal(app) m1 = mock.Mock() signal.append(m1) signal.freeze() with pytest.raises(RuntimeError): del signal[0] assert list(signal) == [m1] async def test_cannot_send_non_frozen_signal(app) -> None: signal = Signal(app) callback = make_mocked_coro() signal.append(callback) with pytest.raises(RuntimeError): await signal.send() assert not callback.called async def test_repr(app) -> None: signal = Signal(app) callback = make_mocked_coro() signal.append(callback) assert re.match(r", frozen=False, " r"\[\]>", repr(signal)) aiohttp-3.6.2/tests/test_streams.py0000644000175100001650000011676113547410117017732 0ustar vstsdocker00000000000000"""Tests for streams.py""" import abc import asyncio import gc import re import types from collections import defaultdict from itertools import groupby from unittest import mock import pytest from aiohttp import streams DATA = b'line1\nline2\nline3\n' def chunkify(seq, n): for i in range(0, len(seq), n): yield seq[i:i+n] async def create_stream(): loop = asyncio.get_event_loop() protocol = mock.Mock(_reading_paused=False) stream = streams.StreamReader(protocol, loop=loop) stream.feed_data(DATA) stream.feed_eof() return stream @pytest.fixture def protocol(): return mock.Mock(_reading_paused=False) MEMLEAK_SKIP_TYPES = ( *(getattr(types, name) for name in types.__all__ if name.endswith('Type')), mock.Mock, abc.ABCMeta, ) def get_memory_usage(obj): objs = [obj] # Memory leak may be caused by leaked links to same objects. 
# Without link counting, [1,2,3] is indistiguishable from [1,2,3,3,3,3,3,3] known = defaultdict(int) known[id(obj)] += 1 while objs: refs = gc.get_referents(*objs) objs = [] for obj in refs: if isinstance(obj, MEMLEAK_SKIP_TYPES): continue i = id(obj) known[i] += 1 if known[i] == 1: objs.append(obj) # Make list of unhashable objects uniq objs.sort(key=id) objs = [next(g) for (i, g) in groupby(objs, id)] return sum(known.values()) class TestStreamReader: DATA = b'line1\nline2\nline3\n' def _make_one(self, *args, **kwargs): return streams.StreamReader(mock.Mock(_reading_paused=False), *args, **kwargs) async def test_create_waiter(self) -> None: loop = asyncio.get_event_loop() stream = self._make_one(loop=loop) stream._waiter = loop.create_future with pytest.raises(RuntimeError): await stream._wait('test') def test_ctor_global_loop(self) -> None: loop = asyncio.new_event_loop() asyncio.set_event_loop(loop) stream = streams.StreamReader(mock.Mock(_reading_paused=False)) assert stream._loop is loop async def test_at_eof(self) -> None: stream = self._make_one() assert not stream.at_eof() stream.feed_data(b'some data\n') assert not stream.at_eof() await stream.readline() assert not stream.at_eof() stream.feed_data(b'some data\n') stream.feed_eof() await stream.readline() assert stream.at_eof() async def test_wait_eof(self) -> None: loop = asyncio.get_event_loop() stream = self._make_one() wait_task = loop.create_task(stream.wait_eof()) async def cb(): await asyncio.sleep(0.1) stream.feed_eof() loop.create_task(cb()) await wait_task assert stream.is_eof() assert stream._eof_waiter is None async def test_wait_eof_eof(self) -> None: loop = asyncio.get_event_loop() stream = self._make_one() stream.feed_eof() wait_task = loop.create_task(stream.wait_eof()) await wait_task assert stream.is_eof() async def test_feed_empty_data(self) -> None: stream = self._make_one() stream.feed_data(b'') stream.feed_eof() data = await stream.read() assert b'' == data async def test_feed_nonempty_data(self) -> None: stream = self._make_one() stream.feed_data(self.DATA) stream.feed_eof() data = await stream.read() assert self.DATA == data async def test_read_zero(self) -> None: # Read zero bytes. stream = self._make_one() stream.feed_data(self.DATA) data = await stream.read(0) assert b'' == data stream.feed_eof() data = await stream.read() assert self.DATA == data async def test_read(self) -> None: loop = asyncio.get_event_loop() # Read bytes. stream = self._make_one() read_task = loop.create_task(stream.read(30)) def cb(): stream.feed_data(self.DATA) loop.call_soon(cb) data = await read_task assert self.DATA == data stream.feed_eof() data = await stream.read() assert b'' == data async def test_read_line_breaks(self) -> None: # Read bytes without line breaks. 
stream = self._make_one() stream.feed_data(b'line1') stream.feed_data(b'line2') data = await stream.read(5) assert b'line1' == data data = await stream.read(5) assert b'line2' == data async def test_read_all(self) -> None: # Read all available buffered bytes stream = self._make_one() stream.feed_data(b'line1') stream.feed_data(b'line2') stream.feed_eof() data = await stream.read() assert b'line1line2' == data async def test_read_up_to(self) -> None: # Read available buffered bytes up to requested amount stream = self._make_one() stream.feed_data(b'line1') stream.feed_data(b'line2') data = await stream.read(8) assert b'line1lin' == data data = await stream.read(8) assert b'e2' == data async def test_read_eof(self) -> None: loop = asyncio.get_event_loop() # Read bytes, stop at eof. stream = self._make_one() read_task = loop.create_task(stream.read(1024)) def cb(): stream.feed_eof() loop.call_soon(cb) data = await read_task assert b'' == data data = await stream.read() assert data == b'' async def test_read_eof_infinite(self) -> None: # Read bytes. stream = self._make_one() stream.feed_eof() with mock.patch('aiohttp.streams.internal_logger') as internal_logger: await stream.read() await stream.read() await stream.read() await stream.read() await stream.read() await stream.read() assert internal_logger.warning.called async def test_read_eof_unread_data_no_warning(self) -> None: # Read bytes. stream = self._make_one() stream.feed_eof() with mock.patch('aiohttp.streams.internal_logger') as internal_logger: await stream.read() await stream.read() await stream.read() await stream.read() await stream.read() with pytest.warns(DeprecationWarning): stream.unread_data(b'data') await stream.read() await stream.read() assert not internal_logger.warning.called async def test_read_until_eof(self) -> None: loop = asyncio.get_event_loop() # Read all bytes until eof. stream = self._make_one() read_task = loop.create_task(stream.read(-1)) def cb(): stream.feed_data(b'chunk1\n') stream.feed_data(b'chunk2') stream.feed_eof() loop.call_soon(cb) data = await read_task assert b'chunk1\nchunk2' == data data = await stream.read() assert b'' == data async def test_read_exception(self) -> None: stream = self._make_one() stream.feed_data(b'line\n') data = await stream.read(2) assert b'li' == data stream.set_exception(ValueError()) with pytest.raises(ValueError): await stream.read(2) async def test_readline(self) -> None: loop = asyncio.get_event_loop() # Read one line. 'readline' will need to wait for the data # to come from 'cb' stream = self._make_one() stream.feed_data(b'chunk1 ') read_task = loop.create_task(stream.readline()) def cb(): stream.feed_data(b'chunk2 ') stream.feed_data(b'chunk3 ') stream.feed_data(b'\n chunk4') loop.call_soon(cb) line = await read_task assert b'chunk1 chunk2 chunk3 \n' == line stream.feed_eof() data = await stream.read() assert b' chunk4' == data async def test_readline_limit_with_existing_data(self) -> None: # Read one line. The data is in StreamReader's buffer # before the event loop is run. stream = self._make_one(limit=2) stream.feed_data(b'li') stream.feed_data(b'ne1\nline2\n') with pytest.raises(ValueError): await stream.readline() # The buffer should contain the remaining data after exception stream.feed_eof() data = await stream.read() assert b'line2\n' == data async def test_readline_limit(self) -> None: loop = asyncio.get_event_loop() # Read one line. StreamReaders are fed with data after # their 'readline' methods are called. 
stream = self._make_one(limit=4) def cb(): stream.feed_data(b'chunk1') stream.feed_data(b'chunk2\n') stream.feed_data(b'chunk3\n') stream.feed_eof() loop.call_soon(cb) with pytest.raises(ValueError): await stream.readline() data = await stream.read() assert b'chunk3\n' == data async def test_readline_nolimit_nowait(self) -> None: # All needed data for the first 'readline' call will be # in the buffer. stream = self._make_one() stream.feed_data(self.DATA[:6]) stream.feed_data(self.DATA[6:]) line = await stream.readline() assert b'line1\n' == line stream.feed_eof() data = await stream.read() assert b'line2\nline3\n' == data async def test_readline_eof(self) -> None: stream = self._make_one() stream.feed_data(b'some data') stream.feed_eof() line = await stream.readline() assert b'some data' == line async def test_readline_empty_eof(self) -> None: stream = self._make_one() stream.feed_eof() line = await stream.readline() assert b'' == line async def test_readline_read_byte_count(self) -> None: stream = self._make_one() stream.feed_data(self.DATA) await stream.readline() data = await stream.read(7) assert b'line2\nl' == data stream.feed_eof() data = await stream.read() assert b'ine3\n' == data async def test_readline_exception(self) -> None: stream = self._make_one() stream.feed_data(b'line\n') data = await stream.readline() assert b'line\n' == data stream.set_exception(ValueError()) with pytest.raises(ValueError): await stream.readline() async def test_readexactly_zero_or_less(self) -> None: # Read exact number of bytes (zero or less). stream = self._make_one() stream.feed_data(self.DATA) data = await stream.readexactly(0) assert b'' == data stream.feed_eof() data = await stream.read() assert self.DATA == data stream = self._make_one() stream.feed_data(self.DATA) data = await stream.readexactly(-1) assert b'' == data stream.feed_eof() data = await stream.read() assert self.DATA == data async def test_readexactly(self) -> None: loop = asyncio.get_event_loop() # Read exact number of bytes. stream = self._make_one() n = 2 * len(self.DATA) read_task = loop.create_task(stream.readexactly(n)) def cb(): stream.feed_data(self.DATA) stream.feed_data(self.DATA) stream.feed_data(self.DATA) loop.call_soon(cb) data = await read_task assert self.DATA + self.DATA == data stream.feed_eof() data = await stream.read() assert self.DATA == data async def test_readexactly_eof(self) -> None: loop = asyncio.get_event_loop() # Read exact number of bytes (eof). 
stream = self._make_one(loop=loop) n = 2 * len(self.DATA) read_task = loop.create_task(stream.readexactly(n)) def cb(): stream.feed_data(self.DATA) stream.feed_eof() loop.call_soon(cb) with pytest.raises(asyncio.IncompleteReadError) as cm: await read_task assert cm.value.partial == self.DATA assert cm.value.expected == n assert (str(cm.value) == '18 bytes read on a total of 36 expected bytes') data = await stream.read() assert b'' == data async def test_readexactly_exception(self) -> None: stream = self._make_one() stream.feed_data(b'line\n') data = await stream.readexactly(2) assert b'li' == data stream.set_exception(ValueError()) with pytest.raises(ValueError): await stream.readexactly(2) async def test_unread_data(self) -> None: stream = self._make_one() stream.feed_data(b'line1') stream.feed_data(b'line2') stream.feed_data(b'onemoreline') data = await stream.read(5) assert b'line1' == data with pytest.warns(DeprecationWarning): stream.unread_data(data) data = await stream.read(5) assert b'line1' == data data = await stream.read(4) assert b'line' == data with pytest.warns(DeprecationWarning): stream.unread_data(b'line1line') data = b'' while len(data) < 10: data += await stream.read(10) assert b'line1line2' == data data = await stream.read(7) assert b'onemore' == data with pytest.warns(DeprecationWarning): stream.unread_data(data) data = b'' while len(data) < 11: data += await stream.read(11) assert b'onemoreline' == data with pytest.warns(DeprecationWarning): stream.unread_data(b'line') data = await stream.read(4) assert b'line' == data stream.feed_eof() with pytest.warns(DeprecationWarning): stream.unread_data(b'at_eof') data = await stream.read(6) assert b'at_eof' == data async def test_exception(self) -> None: stream = self._make_one() assert stream.exception() is None exc = ValueError() stream.set_exception(exc) assert stream.exception() is exc async def test_exception_waiter(self) -> None: loop = asyncio.get_event_loop() stream = self._make_one() async def set_err(): stream.set_exception(ValueError()) t1 = loop.create_task(stream.readline()) t2 = loop.create_task(set_err()) await asyncio.wait([t1, t2]) with pytest.raises(ValueError): t1.result() async def test_exception_cancel(self) -> None: loop = asyncio.get_event_loop() stream = self._make_one() async def read_a_line(): await stream.readline() t = loop.create_task(read_a_line()) await asyncio.sleep(0) t.cancel() await asyncio.sleep(0) # The following line fails if set_exception() isn't careful. 
stream.set_exception(RuntimeError('message')) await asyncio.sleep(0) assert stream._waiter is None async def test_readany_eof(self) -> None: loop = asyncio.get_event_loop() stream = self._make_one() read_task = loop.create_task(stream.readany()) loop.call_soon(stream.feed_data, b'chunk1\n') data = await read_task assert b'chunk1\n' == data stream.feed_eof() data = await stream.read() assert b'' == data async def test_readany_empty_eof(self) -> None: loop = asyncio.get_event_loop() stream = self._make_one() stream.feed_eof() read_task = loop.create_task(stream.readany()) data = await read_task assert b'' == data async def test_readany_exception(self) -> None: stream = self._make_one() stream.feed_data(b'line\n') data = await stream.readany() assert b'line\n' == data stream.set_exception(ValueError()) with pytest.raises(ValueError): await stream.readany() async def test_read_nowait(self) -> None: stream = self._make_one() stream.feed_data(b'line1\nline2\n') assert stream.read_nowait() == b'line1\nline2\n' assert stream.read_nowait() == b'' stream.feed_eof() data = await stream.read() assert b'' == data async def test_read_nowait_n(self) -> None: stream = self._make_one() stream.feed_data(b'line1\nline2\n') assert stream.read_nowait(4) == b'line' assert stream.read_nowait() == b'1\nline2\n' assert stream.read_nowait() == b'' stream.feed_eof() data = await stream.read() assert b'' == data async def test_read_nowait_exception(self) -> None: stream = self._make_one() stream.feed_data(b'line\n') stream.set_exception(ValueError()) with pytest.raises(ValueError): stream.read_nowait() async def test_read_nowait_waiter(self) -> None: loop = asyncio.get_event_loop() stream = self._make_one() stream.feed_data(b'line\n') stream._waiter = loop.create_future() with pytest.raises(RuntimeError): stream.read_nowait() async def test_readchunk(self) -> None: loop = asyncio.get_event_loop() stream = self._make_one() def cb(): stream.feed_data(b'chunk1') stream.feed_data(b'chunk2') stream.feed_eof() loop.call_soon(cb) data, end_of_chunk = await stream.readchunk() assert b'chunk1' == data assert not end_of_chunk data, end_of_chunk = await stream.readchunk() assert b'chunk2' == data assert not end_of_chunk data, end_of_chunk = await stream.readchunk() assert b'' == data assert not end_of_chunk async def test_readchunk_wait_eof(self) -> None: loop = asyncio.get_event_loop() stream = self._make_one() async def cb(): await asyncio.sleep(0.1) stream.feed_eof() loop.create_task(cb()) data, end_of_chunk = await stream.readchunk() assert b"" == data assert not end_of_chunk assert stream.is_eof() async def test_begin_and_end_chunk_receiving(self) -> None: stream = self._make_one() stream.begin_http_chunk_receiving() stream.feed_data(b'part1') stream.feed_data(b'part2') stream.end_http_chunk_receiving() data, end_of_chunk = await stream.readchunk() assert b'part1part2' == data assert end_of_chunk stream.begin_http_chunk_receiving() stream.feed_data(b'part3') data, end_of_chunk = await stream.readchunk() assert b'part3' == data assert not end_of_chunk stream.end_http_chunk_receiving() data, end_of_chunk = await stream.readchunk() assert b'' == data assert end_of_chunk stream.feed_eof() data, end_of_chunk = await stream.readchunk() assert b'' == data assert not end_of_chunk async def test_readany_chunk_end_race(self) -> None: stream = self._make_one() stream.begin_http_chunk_receiving() stream.feed_data(b'part1') data = await stream.readany() assert data == b'part1' loop = asyncio.get_event_loop() task = 
loop.create_task(stream.readany()) # Give a chance for task to create waiter and start waiting for it. await asyncio.sleep(0.1) assert stream._waiter is not None assert not task.done() # Just for sure. # This will trigger waiter, but without feeding any data. # The stream should re-create waiter again. stream.end_http_chunk_receiving() # Give a chance for task to resolve. # If everything is OK, previous action SHOULD NOT resolve the task. await asyncio.sleep(0.1) assert not task.done() # The actual test. stream.begin_http_chunk_receiving() # This SHOULD unblock the task actually. stream.feed_data(b'part2') stream.end_http_chunk_receiving() data = await task assert data == b'part2' async def test_end_chunk_receiving_without_begin(self) -> None: stream = self._make_one() with pytest.raises(RuntimeError): stream.end_http_chunk_receiving() async def test_readchunk_with_unread(self) -> None: """Test that stream.unread does not break controlled chunk receiving. """ stream = self._make_one() # Send 2 chunks stream.begin_http_chunk_receiving() stream.feed_data(b'part1') stream.end_http_chunk_receiving() stream.begin_http_chunk_receiving() stream.feed_data(b'part2') stream.end_http_chunk_receiving() # Read only one chunk data, end_of_chunk = await stream.readchunk() # Try to unread a part of the first chunk with pytest.warns(DeprecationWarning): stream.unread_data(b'rt1') # The end_of_chunk signal was already received for the first chunk, # so we receive up to the second one data, end_of_chunk = await stream.readchunk() assert b'rt1part2' == data assert end_of_chunk # Unread a part of the second chunk with pytest.warns(DeprecationWarning): stream.unread_data(b'rt2') data, end_of_chunk = await stream.readchunk() assert b'rt2' == data # end_of_chunk was already received for this chunk assert not end_of_chunk stream.feed_eof() data, end_of_chunk = await stream.readchunk() assert b'' == data assert not end_of_chunk async def test_readchunk_with_other_read_calls(self) -> None: """Test that stream.readchunk works when other read calls are made on the stream. """ stream = self._make_one() stream.begin_http_chunk_receiving() stream.feed_data(b'part1') stream.end_http_chunk_receiving() stream.begin_http_chunk_receiving() stream.feed_data(b'part2') stream.end_http_chunk_receiving() stream.begin_http_chunk_receiving() stream.feed_data(b'part3') stream.end_http_chunk_receiving() data = await stream.read(7) assert b'part1pa' == data data, end_of_chunk = await stream.readchunk() assert b'rt2' == data assert end_of_chunk # Corner case between read/readchunk data = await stream.read(5) assert b'part3' == data data, end_of_chunk = await stream.readchunk() assert b'' == data assert end_of_chunk stream.feed_eof() data, end_of_chunk = await stream.readchunk() assert b'' == data assert not end_of_chunk async def test_chunksplits_memory_leak(self) -> None: """ Test for memory leak on chunksplits """ stream = self._make_one() N = 500 # Warm-up variables stream.begin_http_chunk_receiving() stream.feed_data(b'Y' * N) stream.end_http_chunk_receiving() await stream.read(N) N = 300 before = get_memory_usage(stream) for _ in range(N): stream.begin_http_chunk_receiving() stream.feed_data(b'X') stream.end_http_chunk_receiving() await stream.read(N) after = get_memory_usage(stream) assert abs(after - before) == 0 async def test_read_empty_chunks(self) -> None: """Test that feeding empty chunks does not break stream""" stream = self._make_one() # Simulate empty first chunk. 
This is significant special case stream.begin_http_chunk_receiving() stream.end_http_chunk_receiving() stream.begin_http_chunk_receiving() stream.feed_data(b'ungzipped') stream.end_http_chunk_receiving() # Possible when compression is enabled. stream.begin_http_chunk_receiving() stream.end_http_chunk_receiving() # is also possible stream.begin_http_chunk_receiving() stream.end_http_chunk_receiving() stream.begin_http_chunk_receiving() stream.feed_data(b' data') stream.end_http_chunk_receiving() stream.feed_eof() data = await stream.read() assert data == b'ungzipped data' async def test_readchunk_separate_http_chunk_tail(self) -> None: """Test that stream.readchunk returns (b'', True) when end of http chunk received after body """ loop = asyncio.get_event_loop() stream = self._make_one() stream.begin_http_chunk_receiving() stream.feed_data(b'part1') data, end_of_chunk = await stream.readchunk() assert b'part1' == data assert not end_of_chunk async def cb(): await asyncio.sleep(0.1) stream.end_http_chunk_receiving() loop.create_task(cb()) data, end_of_chunk = await stream.readchunk() assert b'' == data assert end_of_chunk stream.begin_http_chunk_receiving() stream.feed_data(b'part2') data, end_of_chunk = await stream.readchunk() assert b'part2' == data assert not end_of_chunk stream.end_http_chunk_receiving() stream.begin_http_chunk_receiving() stream.feed_data(b'part3') stream.end_http_chunk_receiving() data, end_of_chunk = await stream.readchunk() assert b'' == data assert end_of_chunk data, end_of_chunk = await stream.readchunk() assert b'part3' == data assert end_of_chunk stream.begin_http_chunk_receiving() stream.feed_data(b'part4') data, end_of_chunk = await stream.readchunk() assert b'part4' == data assert not end_of_chunk async def cb(): await asyncio.sleep(0.1) stream.end_http_chunk_receiving() stream.feed_eof() loop.create_task(cb()) data, end_of_chunk = await stream.readchunk() assert b'' == data assert end_of_chunk data, end_of_chunk = await stream.readchunk() assert b'' == data assert not end_of_chunk async def test___repr__(self) -> None: stream = self._make_one() assert "" == repr(stream) async def test___repr__nondefault_limit(self) -> None: stream = self._make_one(limit=123) assert "" == repr(stream) async def test___repr__eof(self) -> None: stream = self._make_one() stream.feed_eof() assert "" == repr(stream) async def test___repr__data(self) -> None: stream = self._make_one() stream.feed_data(b'data') assert "" == repr(stream) async def test___repr__exception(self) -> None: loop = asyncio.get_event_loop() stream = self._make_one(loop=loop) exc = RuntimeError() stream.set_exception(exc) assert "" == repr(stream) async def test___repr__waiter(self) -> None: loop = asyncio.get_event_loop() stream = self._make_one() stream._waiter = loop.create_future() assert re.search(r">", repr(stream)) stream._waiter.set_result(None) await stream._waiter stream._waiter = None assert "" == repr(stream) async def test_unread_empty(self) -> None: stream = self._make_one() stream.feed_data(b'line1') stream.feed_eof() with pytest.warns(DeprecationWarning): stream.unread_data(b'') data = await stream.read(5) assert b'line1' == data assert stream.at_eof() async def test_empty_stream_reader() -> None: s = streams.EmptyStreamReader() assert s.set_exception(ValueError()) is None assert s.exception() is None assert s.feed_eof() is None assert s.feed_data(b'data') is None assert s.at_eof() assert (await s.wait_eof()) is None assert await s.read() == b'' assert await s.readline() == b'' assert await 
s.readany() == b'' assert await s.readchunk() == (b'', True) with pytest.raises(asyncio.IncompleteReadError): await s.readexactly(10) assert s.read_nowait() == b'' @pytest.fixture async def buffer(loop): return streams.DataQueue(loop) class TestDataQueue: def test_is_eof(self, buffer) -> None: assert not buffer.is_eof() buffer.feed_eof() assert buffer.is_eof() def test_at_eof(self, buffer) -> None: assert not buffer.at_eof() buffer.feed_eof() assert buffer.at_eof() buffer._buffer.append(object()) assert not buffer.at_eof() def test_feed_data(self, buffer) -> None: item = object() buffer.feed_data(item, 1) assert [(item, 1)] == list(buffer._buffer) def test_feed_eof(self, buffer) -> None: buffer.feed_eof() assert buffer._eof async def test_read(self, buffer) -> None: loop = asyncio.get_event_loop() item = object() def cb(): buffer.feed_data(item, 1) loop.call_soon(cb) data = await buffer.read() assert item is data async def test_read_eof(self, buffer) -> None: loop = asyncio.get_event_loop() def cb(): buffer.feed_eof() loop.call_soon(cb) with pytest.raises(streams.EofStream): await buffer.read() async def test_read_cancelled(self, buffer) -> None: loop = asyncio.get_event_loop() read_task = loop.create_task(buffer.read()) await asyncio.sleep(0) waiter = buffer._waiter assert asyncio.isfuture(waiter) read_task.cancel() with pytest.raises(asyncio.CancelledError): await read_task assert waiter.cancelled() assert buffer._waiter is None buffer.feed_data(b'test', 4) assert buffer._waiter is None async def test_read_until_eof(self, buffer) -> None: item = object() buffer.feed_data(item, 1) buffer.feed_eof() data = await buffer.read() assert data is item with pytest.raises(streams.EofStream): await buffer.read() async def test_read_exc(self, buffer) -> None: item = object() buffer.feed_data(item) buffer.set_exception(ValueError) data = await buffer.read() assert item is data with pytest.raises(ValueError): await buffer.read() async def test_read_exception(self, buffer) -> None: buffer.set_exception(ValueError()) with pytest.raises(ValueError): await buffer.read() async def test_read_exception_with_data(self, buffer) -> None: val = object() buffer.feed_data(val, 1) buffer.set_exception(ValueError()) assert val is (await buffer.read()) with pytest.raises(ValueError): await buffer.read() async def test_read_exception_on_wait(self, buffer) -> None: loop = asyncio.get_event_loop() read_task = loop.create_task(buffer.read()) await asyncio.sleep(0) assert asyncio.isfuture(buffer._waiter) buffer.feed_eof() buffer.set_exception(ValueError()) with pytest.raises(ValueError): await read_task def test_exception(self, buffer) -> None: assert buffer.exception() is None exc = ValueError() buffer.set_exception(exc) assert buffer.exception() is exc async def test_exception_waiter(self, buffer) -> None: loop = asyncio.get_event_loop() async def set_err(): buffer.set_exception(ValueError()) t1 = loop.create_task(buffer.read()) t2 = loop.create_task(set_err()) await asyncio.wait([t1, t2]) with pytest.raises(ValueError): t1.result() async def test_feed_data_waiters(protocol) -> None: loop = asyncio.get_event_loop() reader = streams.StreamReader(protocol, loop=loop) waiter = reader._waiter = loop.create_future() eof_waiter = reader._eof_waiter = loop.create_future() reader.feed_data(b'1') assert list(reader._buffer) == [b'1'] assert reader._size == 1 assert reader.total_bytes == 1 assert waiter.done() assert not eof_waiter.done() assert reader._waiter is None assert reader._eof_waiter is eof_waiter async def 
test_feed_data_completed_waiters(protocol) -> None: loop = asyncio.get_event_loop() reader = streams.StreamReader(protocol, loop=loop) waiter = reader._waiter = loop.create_future() waiter.set_result(1) reader.feed_data(b'1') assert reader._waiter is None async def test_feed_eof_waiters(protocol) -> None: loop = asyncio.get_event_loop() reader = streams.StreamReader(protocol, loop=loop) waiter = reader._waiter = loop.create_future() eof_waiter = reader._eof_waiter = loop.create_future() reader.feed_eof() assert reader._eof assert waiter.done() assert eof_waiter.done() assert reader._waiter is None assert reader._eof_waiter is None async def test_feed_eof_cancelled(protocol) -> None: loop = asyncio.get_event_loop() reader = streams.StreamReader(protocol, loop=loop) waiter = reader._waiter = loop.create_future() eof_waiter = reader._eof_waiter = loop.create_future() waiter.set_result(1) eof_waiter.set_result(1) reader.feed_eof() assert waiter.done() assert eof_waiter.done() assert reader._waiter is None assert reader._eof_waiter is None async def test_on_eof(protocol) -> None: loop = asyncio.get_event_loop() reader = streams.StreamReader(protocol, loop=loop) on_eof = mock.Mock() reader.on_eof(on_eof) assert not on_eof.called reader.feed_eof() assert on_eof.called async def test_on_eof_empty_reader() -> None: reader = streams.EmptyStreamReader() on_eof = mock.Mock() reader.on_eof(on_eof) assert on_eof.called async def test_on_eof_exc_in_callback(protocol) -> None: loop = asyncio.get_event_loop() reader = streams.StreamReader(protocol, loop=loop) on_eof = mock.Mock() on_eof.side_effect = ValueError reader.on_eof(on_eof) assert not on_eof.called reader.feed_eof() assert on_eof.called assert not reader._eof_callbacks async def test_on_eof_exc_in_callback_empty_stream_reader() -> None: reader = streams.EmptyStreamReader() on_eof = mock.Mock() on_eof.side_effect = ValueError reader.on_eof(on_eof) assert on_eof.called async def test_on_eof_eof_is_set(protocol) -> None: loop = asyncio.get_event_loop() reader = streams.StreamReader(protocol, loop=loop) reader.feed_eof() on_eof = mock.Mock() reader.on_eof(on_eof) assert on_eof.called assert not reader._eof_callbacks async def test_on_eof_eof_is_set_exception(protocol) -> None: loop = asyncio.get_event_loop() reader = streams.StreamReader(protocol, loop=loop) reader.feed_eof() on_eof = mock.Mock() on_eof.side_effect = ValueError reader.on_eof(on_eof) assert on_eof.called assert not reader._eof_callbacks async def test_set_exception(protocol) -> None: loop = asyncio.get_event_loop() reader = streams.StreamReader(protocol, loop=loop) waiter = reader._waiter = loop.create_future() eof_waiter = reader._eof_waiter = loop.create_future() exc = ValueError() reader.set_exception(exc) assert waiter.exception() is exc assert eof_waiter.exception() is exc assert reader._waiter is None assert reader._eof_waiter is None async def test_set_exception_cancelled(protocol) -> None: loop = asyncio.get_event_loop() reader = streams.StreamReader(protocol, loop=loop) waiter = reader._waiter = loop.create_future() eof_waiter = reader._eof_waiter = loop.create_future() waiter.set_result(1) eof_waiter.set_result(1) exc = ValueError() reader.set_exception(exc) assert waiter.exception() is None assert eof_waiter.exception() is None assert reader._waiter is None assert reader._eof_waiter is None async def test_set_exception_eof_callbacks(protocol) -> None: loop = asyncio.get_event_loop() reader = streams.StreamReader(protocol, loop=loop) on_eof = mock.Mock() reader.on_eof(on_eof) 
reader.set_exception(ValueError()) assert not on_eof.called assert not reader._eof_callbacks async def test_stream_reader_lines() -> None: line_iter = iter(DATA.splitlines(keepends=True)) async for line in await create_stream(): assert line == next(line_iter, None) pytest.raises(StopIteration, next, line_iter) async def test_stream_reader_chunks_complete() -> None: """Tests if chunked iteration works if the chunking works out (i.e. the data is divisible by the chunk size) """ chunk_iter = chunkify(DATA, 9) async for data in (await create_stream()).iter_chunked(9): assert data == next(chunk_iter, None) pytest.raises(StopIteration, next, chunk_iter) async def test_stream_reader_chunks_incomplete() -> None: """Tests if chunked iteration works if the last chunk is incomplete""" chunk_iter = chunkify(DATA, 8) async for data in (await create_stream()).iter_chunked(8): assert data == next(chunk_iter, None) pytest.raises(StopIteration, next, chunk_iter) async def test_data_queue_empty() -> None: """Tests that async looping yields nothing if nothing is there""" loop = asyncio.get_event_loop() buffer = streams.DataQueue(loop) buffer.feed_eof() async for _ in buffer: # NOQA assert False async def test_data_queue_items() -> None: """Tests that async looping yields objects identically""" loop = asyncio.get_event_loop() buffer = streams.DataQueue(loop) items = [object(), object()] buffer.feed_data(items[0], 1) buffer.feed_data(items[1], 1) buffer.feed_eof() item_iter = iter(items) async for item in buffer: assert item is next(item_iter, None) pytest.raises(StopIteration, next, item_iter) async def test_stream_reader_iter_any() -> None: it = iter([b'line1\nline2\nline3\n']) async for raw in (await create_stream()).iter_any(): assert raw == next(it) pytest.raises(StopIteration, next, it) async def test_stream_reader_iter() -> None: it = iter([b'line1\n', b'line2\n', b'line3\n']) async for raw in await create_stream(): assert raw == next(it) pytest.raises(StopIteration, next, it) async def test_stream_reader_iter_chunks_no_chunked_encoding() -> None: it = iter([b'line1\nline2\nline3\n']) async for data, end_of_chunk in (await create_stream()).iter_chunks(): assert (data, end_of_chunk) == (next(it), False) pytest.raises(StopIteration, next, it) async def test_stream_reader_iter_chunks_chunked_encoding(protocol) -> None: loop = asyncio.get_event_loop() stream = streams.StreamReader(protocol, loop=loop) for line in DATA.splitlines(keepends=True): stream.begin_http_chunk_receiving() stream.feed_data(line) stream.end_http_chunk_receiving() stream.feed_eof() it = iter([b'line1\n', b'line2\n', b'line3\n']) async for data, end_of_chunk in stream.iter_chunks(): assert (data, end_of_chunk) == (next(it), True) pytest.raises(StopIteration, next, it) aiohttp-3.6.2/tests/test_tcp_helpers.py0000644000175100001650000001127113547410117020552 0ustar vstsdocker00000000000000import socket from unittest import mock import pytest from aiohttp.tcp_helpers import CORK, tcp_cork, tcp_nodelay has_ipv6 = socket.has_ipv6 if has_ipv6: # The socket.has_ipv6 flag may be True if Python was built with IPv6 # support, but the target system still may not have it. # So let's ensure that we really have IPv6 support. 
try: socket.socket(socket.AF_INET6, socket.SOCK_STREAM) except OSError: has_ipv6 = False # nodelay def test_tcp_nodelay_exception() -> None: transport = mock.Mock() s = mock.Mock() s.setsockopt = mock.Mock() s.family = socket.AF_INET s.setsockopt.side_effect = OSError transport.get_extra_info.return_value = s tcp_nodelay(transport, True) s.setsockopt.assert_called_with( socket.IPPROTO_TCP, socket.TCP_NODELAY, True ) def test_tcp_nodelay_enable() -> None: transport = mock.Mock() with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as s: transport.get_extra_info.return_value = s tcp_nodelay(transport, True) assert s.getsockopt(socket.IPPROTO_TCP, socket.TCP_NODELAY) def test_tcp_nodelay_enable_and_disable() -> None: transport = mock.Mock() with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as s: transport.get_extra_info.return_value = s tcp_nodelay(transport, True) assert s.getsockopt(socket.IPPROTO_TCP, socket.TCP_NODELAY) tcp_nodelay(transport, False) assert not s.getsockopt(socket.IPPROTO_TCP, socket.TCP_NODELAY) @pytest.mark.skipif(not has_ipv6, reason="IPv6 is not available") def test_tcp_nodelay_enable_ipv6() -> None: transport = mock.Mock() with socket.socket(socket.AF_INET6, socket.SOCK_STREAM) as s: transport.get_extra_info.return_value = s tcp_nodelay(transport, True) assert s.getsockopt(socket.IPPROTO_TCP, socket.TCP_NODELAY) @pytest.mark.skipif(not hasattr(socket, 'AF_UNIX'), reason="requires unix sockets") def test_tcp_nodelay_enable_unix() -> None: # do not set nodelay for unix socket transport = mock.Mock() s = mock.Mock(family=socket.AF_UNIX, type=socket.SOCK_STREAM) transport.get_extra_info.return_value = s tcp_nodelay(transport, True) assert not s.setsockopt.called def test_tcp_nodelay_enable_no_socket() -> None: transport = mock.Mock() transport.get_extra_info.return_value = None tcp_nodelay(transport, True) # cork @pytest.mark.skipif(CORK is None, reason="TCP_CORK or TCP_NOPUSH required") def test_tcp_cork_enable() -> None: transport = mock.Mock() with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as s: transport.get_extra_info.return_value = s tcp_cork(transport, True) assert s.getsockopt(socket.IPPROTO_TCP, CORK) @pytest.mark.skipif(CORK is None, reason="TCP_CORK or TCP_NOPUSH required") def test_set_cork_enable_and_disable() -> None: transport = mock.Mock() with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as s: transport.get_extra_info.return_value = s tcp_cork(transport, True) assert s.getsockopt(socket.IPPROTO_TCP, CORK) tcp_cork(transport, False) assert not s.getsockopt(socket.IPPROTO_TCP, CORK) @pytest.mark.skipif(not has_ipv6, reason="IPv6 is not available") @pytest.mark.skipif(CORK is None, reason="TCP_CORK or TCP_NOPUSH required") def test_set_cork_enable_ipv6() -> None: transport = mock.Mock() with socket.socket(socket.AF_INET6, socket.SOCK_STREAM) as s: transport.get_extra_info.return_value = s tcp_cork(transport, True) assert s.getsockopt(socket.IPPROTO_TCP, CORK) @pytest.mark.skipif(not hasattr(socket, 'AF_UNIX'), reason="requires unix sockets") @pytest.mark.skipif(CORK is None, reason="TCP_CORK or TCP_NOPUSH required") def test_set_cork_enable_unix() -> None: transport = mock.Mock() s = mock.Mock(family=socket.AF_UNIX, type=socket.SOCK_STREAM) transport.get_extra_info.return_value = s tcp_cork(transport, True) assert not s.setsockopt.called @pytest.mark.skipif(CORK is None, reason="TCP_CORK or TCP_NOPUSH required") def test_set_cork_enable_no_socket() -> None: transport = mock.Mock() transport.get_extra_info.return_value = None 
tcp_cork(transport, True) @pytest.mark.skipif(CORK is None, reason="TCP_CORK or TCP_NOPUSH required") def test_set_cork_exception() -> None: transport = mock.Mock() s = mock.Mock() s.setsockopt = mock.Mock() s.family = socket.AF_INET s.setsockopt.side_effect = OSError transport.get_extra_info.return_value = s tcp_cork(transport, True) s.setsockopt.assert_called_with( socket.IPPROTO_TCP, CORK, True ) aiohttp-3.6.2/tests/test_test_utils.py0000644000175100001650000002264213547410117020445 0ustar vstsdocker00000000000000import gzip from unittest import mock import pytest from multidict import CIMultiDict, CIMultiDictProxy from yarl import URL import aiohttp from aiohttp import web from aiohttp.test_utils import AioHTTPTestCase from aiohttp.test_utils import RawTestServer as _RawTestServer from aiohttp.test_utils import TestClient as _TestClient from aiohttp.test_utils import TestServer as _TestServer from aiohttp.test_utils import ( loop_context, make_mocked_request, unittest_run_loop, ) _hello_world_str = "Hello, world" _hello_world_bytes = _hello_world_str.encode('utf-8') _hello_world_gz = gzip.compress(_hello_world_bytes) def _create_example_app(): async def hello(request): return web.Response(body=_hello_world_bytes) async def websocket_handler(request): ws = web.WebSocketResponse() await ws.prepare(request) msg = await ws.receive() if msg.type == aiohttp.WSMsgType.TEXT: if msg.data == 'close': await ws.close() else: await ws.send_str(msg.data + '/answer') return ws async def cookie_handler(request): resp = web.Response(body=_hello_world_bytes) resp.set_cookie('cookie', 'val') return resp app = web.Application() app.router.add_route('*', '/', hello) app.router.add_route('*', '/websocket', websocket_handler) app.router.add_route('*', '/cookie', cookie_handler) return app # these exist to test the pytest scenario @pytest.fixture def loop(): with loop_context() as loop: yield loop @pytest.fixture def app(): return _create_example_app() @pytest.fixture def test_client(loop, app) -> None: async def make_client(): return _TestClient(_TestServer(app, loop=loop), loop=loop) client = loop.run_until_complete(make_client()) loop.run_until_complete(client.start_server()) yield client loop.run_until_complete(client.close()) def test_with_test_server_fails(loop) -> None: app = _create_example_app() with pytest.raises(TypeError): with _TestServer(app, loop=loop): pass async def test_with_client_fails(loop) -> None: app = _create_example_app() with pytest.raises(TypeError): with _TestClient(_TestServer(app, loop=loop), loop=loop): pass async def test_aiohttp_client_close_is_idempotent() -> None: """ a test client, called multiple times, should not attempt to close the server again. 
""" app = _create_example_app() client = _TestClient(_TestServer(app)) await client.close() await client.close() class TestAioHTTPTestCase(AioHTTPTestCase): def get_app(self): return _create_example_app() @unittest_run_loop async def test_example_with_loop(self) -> None: request = await self.client.request("GET", "/") assert request.status == 200 text = await request.text() assert _hello_world_str == text def test_example(self) -> None: async def test_get_route() -> None: resp = await self.client.request("GET", "/") assert resp.status == 200 text = await resp.text() assert _hello_world_str == text self.loop.run_until_complete(test_get_route()) def test_get_route(loop, test_client) -> None: async def test_get_route() -> None: resp = await test_client.request("GET", "/") assert resp.status == 200 text = await resp.text() assert _hello_world_str == text loop.run_until_complete(test_get_route()) async def test_client_websocket(loop, test_client) -> None: resp = await test_client.ws_connect("/websocket") await resp.send_str("foo") msg = await resp.receive() assert msg.type == aiohttp.WSMsgType.TEXT assert "foo" in msg.data await resp.send_str("close") msg = await resp.receive() assert msg.type == aiohttp.WSMsgType.CLOSE async def test_client_cookie(loop, test_client) -> None: assert not test_client.session.cookie_jar await test_client.get("/cookie") cookies = list(test_client.session.cookie_jar) assert cookies[0].key == 'cookie' assert cookies[0].value == 'val' @pytest.mark.parametrize("method", [ "get", "post", "options", "post", "put", "patch", "delete" ]) async def test_test_client_methods(method, loop, test_client) -> None: resp = await getattr(test_client, method)("/") assert resp.status == 200 text = await resp.text() assert _hello_world_str == text async def test_test_client_head(loop, test_client) -> None: resp = await test_client.head("/") assert resp.status == 200 @pytest.mark.parametrize( "headers", [{'token': 'x'}, CIMultiDict({'token': 'x'}), {}]) def test_make_mocked_request(headers) -> None: req = make_mocked_request('GET', '/', headers=headers) assert req.method == "GET" assert req.path == "/" assert isinstance(req, web.Request) assert isinstance(req.headers, CIMultiDictProxy) def test_make_mocked_request_sslcontext() -> None: req = make_mocked_request('GET', '/') assert req.transport.get_extra_info('sslcontext') is None def test_make_mocked_request_unknown_extra_info() -> None: req = make_mocked_request('GET', '/') assert req.transport.get_extra_info('unknown_extra_info') is None def test_make_mocked_request_app() -> None: app = mock.Mock() req = make_mocked_request('GET', '/', app=app) assert req.app is app def test_make_mocked_request_app_can_store_values() -> None: req = make_mocked_request('GET', '/') req.app['a_field'] = 'a_value' assert req.app['a_field'] == 'a_value' def test_make_mocked_request_match_info() -> None: req = make_mocked_request('GET', '/', match_info={'a': '1', 'b': '2'}) assert req.match_info == {'a': '1', 'b': '2'} def test_make_mocked_request_content() -> None: payload = mock.Mock() req = make_mocked_request('GET', '/', payload=payload) assert req.content is payload def test_make_mocked_request_transport() -> None: transport = mock.Mock() req = make_mocked_request('GET', '/', transport=transport) assert req.transport is transport async def test_test_client_props(loop) -> None: app = _create_example_app() client = _TestClient(_TestServer(app, host='127.0.0.1', loop=loop), loop=loop) assert client.host == '127.0.0.1' assert client.port is None async with 
client: assert isinstance(client.port, int) assert client.server is not None assert client.app is not None assert client.port is None async def test_test_client_raw_server_props(loop) -> None: async def hello(request): return web.Response(body=_hello_world_bytes) client = _TestClient(_RawTestServer(hello, host='127.0.0.1', loop=loop), loop=loop) assert client.host == '127.0.0.1' assert client.port is None async with client: assert isinstance(client.port, int) assert client.server is not None assert client.app is None assert client.port is None async def test_test_server_context_manager(loop) -> None: app = _create_example_app() async with _TestServer(app, loop=loop) as server: client = aiohttp.ClientSession(loop=loop) resp = await client.head(server.make_url('/')) assert resp.status == 200 resp.close() await client.close() def test_client_unsupported_arg() -> None: with pytest.raises(TypeError) as e: _TestClient('string') assert str(e.value) == \ "server must be TestServer instance, found type: " async def test_server_make_url_yarl_compatibility(loop) -> None: app = _create_example_app() async with _TestServer(app, loop=loop) as server: make_url = server.make_url assert make_url(URL('/foo')) == make_url('/foo') with pytest.raises(AssertionError): make_url('http://foo.com') with pytest.raises(AssertionError): make_url(URL('http://foo.com')) def test_testcase_no_app(testdir, loop) -> None: testdir.makepyfile( """ from aiohttp.test_utils import AioHTTPTestCase class InvalidTestCase(AioHTTPTestCase): def test_noop(self) -> None: pass """) result = testdir.runpytest() result.stdout.fnmatch_lines(["*RuntimeError*"]) async def test_server_context_manager(app, loop) -> None: async with _TestServer(app, loop=loop) as server: async with aiohttp.ClientSession(loop=loop) as client: async with client.head(server.make_url('/')) as resp: assert resp.status == 200 @pytest.mark.parametrize("method", [ "head", "get", "post", "options", "post", "put", "patch", "delete" ]) async def test_client_context_manager_response(method, app, loop) -> None: async with _TestClient(_TestServer(app), loop=loop) as client: async with getattr(client, method)('/') as resp: assert resp.status == 200 if method != 'head': text = await resp.text() assert "Hello, world" in text async def test_custom_port(loop, app, aiohttp_unused_port) -> None: port = aiohttp_unused_port() client = _TestClient(_TestServer(app, loop=loop, port=port), loop=loop) await client.start_server() assert client.server.port == port resp = await client.get('/') assert resp.status == 200 text = await resp.text() assert _hello_world_str == text await client.close() aiohttp-3.6.2/tests/test_tracing.py0000644000175100001650000001154413547410117017674 0ustar vstsdocker00000000000000from types import SimpleNamespace from unittest.mock import Mock import pytest from aiohttp.test_utils import make_mocked_coro from aiohttp.tracing import ( Trace, TraceConfig, TraceConnectionCreateEndParams, TraceConnectionCreateStartParams, TraceConnectionQueuedEndParams, TraceConnectionQueuedStartParams, TraceConnectionReuseconnParams, TraceDnsCacheHitParams, TraceDnsCacheMissParams, TraceDnsResolveHostEndParams, TraceDnsResolveHostStartParams, TraceRequestChunkSentParams, TraceRequestEndParams, TraceRequestExceptionParams, TraceRequestRedirectParams, TraceRequestStartParams, TraceResponseChunkReceivedParams, ) class TestTraceConfig: def test_trace_config_ctx_default(self) -> None: trace_config = TraceConfig() assert isinstance(trace_config.trace_config_ctx(), SimpleNamespace) def 
test_trace_config_ctx_factory(self) -> None: trace_config = TraceConfig(trace_config_ctx_factory=dict) assert isinstance(trace_config.trace_config_ctx(), dict) def test_trace_config_ctx_request_ctx(self) -> None: trace_request_ctx = Mock() trace_config = TraceConfig() trace_config_ctx = trace_config.trace_config_ctx( trace_request_ctx=trace_request_ctx) assert trace_config_ctx.trace_request_ctx is trace_request_ctx def test_freeze(self) -> None: trace_config = TraceConfig() trace_config.freeze() assert trace_config.on_request_start.frozen assert trace_config.on_request_chunk_sent.frozen assert trace_config.on_response_chunk_received.frozen assert trace_config.on_request_end.frozen assert trace_config.on_request_exception.frozen assert trace_config.on_request_redirect.frozen assert trace_config.on_connection_queued_start.frozen assert trace_config.on_connection_queued_end.frozen assert trace_config.on_connection_create_start.frozen assert trace_config.on_connection_create_end.frozen assert trace_config.on_connection_reuseconn.frozen assert trace_config.on_dns_resolvehost_start.frozen assert trace_config.on_dns_resolvehost_end.frozen assert trace_config.on_dns_cache_hit.frozen assert trace_config.on_dns_cache_miss.frozen class TestTrace: @pytest.mark.parametrize('signal,params,param_obj', [ ( 'request_start', (Mock(), Mock(), Mock()), TraceRequestStartParams ), ( 'request_chunk_sent', (Mock(), ), TraceRequestChunkSentParams ), ( 'response_chunk_received', (Mock(), ), TraceResponseChunkReceivedParams ), ( 'request_end', (Mock(), Mock(), Mock(), Mock()), TraceRequestEndParams ), ( 'request_exception', (Mock(), Mock(), Mock(), Mock()), TraceRequestExceptionParams ), ( 'request_redirect', (Mock(), Mock(), Mock(), Mock()), TraceRequestRedirectParams ), ( 'connection_queued_start', (), TraceConnectionQueuedStartParams ), ( 'connection_queued_end', (), TraceConnectionQueuedEndParams ), ( 'connection_create_start', (), TraceConnectionCreateStartParams ), ( 'connection_create_end', (), TraceConnectionCreateEndParams ), ( 'connection_reuseconn', (), TraceConnectionReuseconnParams ), ( 'dns_resolvehost_start', (Mock(),), TraceDnsResolveHostStartParams ), ( 'dns_resolvehost_end', (Mock(),), TraceDnsResolveHostEndParams ), ( 'dns_cache_hit', (Mock(),), TraceDnsCacheHitParams ), ( 'dns_cache_miss', (Mock(),), TraceDnsCacheMissParams ) ]) async def test_send(self, signal, params, param_obj) -> None: session = Mock() trace_request_ctx = Mock() callback = Mock(side_effect=make_mocked_coro(Mock())) trace_config = TraceConfig() getattr(trace_config, "on_%s" % signal).append(callback) trace_config.freeze() trace = Trace( session, trace_config, trace_config.trace_config_ctx(trace_request_ctx=trace_request_ctx) ) await getattr(trace, "send_%s" % signal)(*params) callback.assert_called_once_with( session, SimpleNamespace(trace_request_ctx=trace_request_ctx), param_obj(*params) ) aiohttp-3.6.2/tests/test_urldispatch.py0000644000175100001650000012033613547410117020567 0ustar vstsdocker00000000000000import os import pathlib import re from collections.abc import Container, Iterable, Mapping, MutableMapping, Sized from urllib.parse import unquote import pytest from yarl import URL import aiohttp from aiohttp import hdrs, web from aiohttp.test_utils import make_mocked_request from aiohttp.web import HTTPMethodNotAllowed, HTTPNotFound, Response from aiohttp.web_urldispatcher import ( PATH_SEP, AbstractResource, Domain, DynamicResource, MaskDomain, PlainResource, ResourceRoute, StaticResource, SystemRoute, View, 
_default_expect_handler, ) def make_handler(): async def handler(request): return Response(request) # pragma: no cover return handler @pytest.fixture def app(): return web.Application() @pytest.fixture def router(app): return app.router @pytest.fixture def fill_routes(router): def go(): route1 = router.add_route('GET', '/plain', make_handler()) route2 = router.add_route('GET', '/variable/{name}', make_handler()) resource = router.add_static('/static', os.path.dirname(aiohttp.__file__)) return [route1, route2] + list(resource) return go def test_register_uncommon_http_methods(router) -> None: uncommon_http_methods = { 'PROPFIND', 'PROPPATCH', 'COPY', 'LOCK', 'UNLOCK', 'MOVE', 'SUBSCRIBE', 'UNSUBSCRIBE', 'NOTIFY' } for method in uncommon_http_methods: router.add_route(method, '/handler/to/path', make_handler()) async def test_add_route_root(router) -> None: handler = make_handler() router.add_route('GET', '/', handler) req = make_mocked_request('GET', '/') info = await router.resolve(req) assert info is not None assert 0 == len(info) assert handler is info.handler assert info.route.name is None async def test_add_route_simple(router) -> None: handler = make_handler() router.add_route('GET', '/handler/to/path', handler) req = make_mocked_request('GET', '/handler/to/path') info = await router.resolve(req) assert info is not None assert 0 == len(info) assert handler is info.handler assert info.route.name is None async def test_add_with_matchdict(router) -> None: handler = make_handler() router.add_route('GET', '/handler/{to}', handler) req = make_mocked_request('GET', '/handler/tail') info = await router.resolve(req) assert info is not None assert {'to': 'tail'} == info assert handler is info.handler assert info.route.name is None async def test_add_with_matchdict_with_colon(router) -> None: handler = make_handler() router.add_route('GET', '/handler/{to}', handler) req = make_mocked_request('GET', '/handler/1:2:3') info = await router.resolve(req) assert info is not None assert {'to': '1:2:3'} == info assert handler is info.handler assert info.route.name is None async def test_add_route_with_add_get_shortcut(router) -> None: handler = make_handler() router.add_get('/handler/to/path', handler) req = make_mocked_request('GET', '/handler/to/path') info = await router.resolve(req) assert info is not None assert 0 == len(info) assert handler is info.handler assert info.route.name is None async def test_add_route_with_add_post_shortcut(router) -> None: handler = make_handler() router.add_post('/handler/to/path', handler) req = make_mocked_request('POST', '/handler/to/path') info = await router.resolve(req) assert info is not None assert 0 == len(info) assert handler is info.handler assert info.route.name is None async def test_add_route_with_add_put_shortcut(router) -> None: handler = make_handler() router.add_put('/handler/to/path', handler) req = make_mocked_request('PUT', '/handler/to/path') info = await router.resolve(req) assert info is not None assert 0 == len(info) assert handler is info.handler assert info.route.name is None async def test_add_route_with_add_patch_shortcut(router) -> None: handler = make_handler() router.add_patch('/handler/to/path', handler) req = make_mocked_request('PATCH', '/handler/to/path') info = await router.resolve(req) assert info is not None assert 0 == len(info) assert handler is info.handler assert info.route.name is None async def test_add_route_with_add_delete_shortcut(router) -> None: handler = make_handler() router.add_delete('/handler/to/path', handler) req 
= make_mocked_request('DELETE', '/handler/to/path') info = await router.resolve(req) assert info is not None assert 0 == len(info) assert handler is info.handler assert info.route.name is None async def test_add_route_with_add_head_shortcut(router) -> None: handler = make_handler() router.add_head('/handler/to/path', handler) req = make_mocked_request('HEAD', '/handler/to/path') info = await router.resolve(req) assert info is not None assert 0 == len(info) assert handler is info.handler assert info.route.name is None async def test_add_with_name(router) -> None: handler = make_handler() router.add_route('GET', '/handler/to/path', handler, name='name') req = make_mocked_request('GET', '/handler/to/path') info = await router.resolve(req) assert info is not None assert 'name' == info.route.name async def test_add_with_tailing_slash(router) -> None: handler = make_handler() router.add_route('GET', '/handler/to/path/', handler) req = make_mocked_request('GET', '/handler/to/path/') info = await router.resolve(req) assert info is not None assert {} == info assert handler is info.handler def test_add_invalid_path(router) -> None: handler = make_handler() with pytest.raises(ValueError): router.add_route('GET', '/{/', handler) def test_add_url_invalid1(router) -> None: handler = make_handler() with pytest.raises(ValueError): router.add_route('post', '/post/{id', handler) def test_add_url_invalid2(router) -> None: handler = make_handler() with pytest.raises(ValueError): router.add_route('post', '/post/{id{}}', handler) def test_add_url_invalid3(router) -> None: handler = make_handler() with pytest.raises(ValueError): router.add_route('post', '/post/{id{}', handler) def test_add_url_invalid4(router) -> None: handler = make_handler() with pytest.raises(ValueError): router.add_route('post', '/post/{id"}', handler) async def test_add_url_escaping(router) -> None: handler = make_handler() router.add_route('GET', '/+$', handler) req = make_mocked_request('GET', '/+$') info = await router.resolve(req) assert info is not None assert handler is info.handler async def test_any_method(router) -> None: handler = make_handler() route = router.add_route(hdrs.METH_ANY, '/', handler) req = make_mocked_request('GET', '/') info1 = await router.resolve(req) assert info1 is not None assert route is info1.route req = make_mocked_request('POST', '/') info2 = await router.resolve(req) assert info2 is not None assert info1.route is info2.route async def test_match_second_result_in_table(router) -> None: handler1 = make_handler() handler2 = make_handler() router.add_route('GET', '/h1', handler1) router.add_route('POST', '/h2', handler2) req = make_mocked_request('POST', '/h2') info = await router.resolve(req) assert info is not None assert {} == info assert handler2 is info.handler async def test_raise_method_not_allowed(router) -> None: handler1 = make_handler() handler2 = make_handler() router.add_route('GET', '/', handler1) router.add_route('POST', '/', handler2) req = make_mocked_request('PUT', '/') match_info = await router.resolve(req) assert isinstance(match_info.route, SystemRoute) assert {} == match_info with pytest.raises(HTTPMethodNotAllowed) as ctx: await match_info.handler(req) exc = ctx.value assert 'PUT' == exc.method assert 405 == exc.status assert {'POST', 'GET'} == exc.allowed_methods async def test_raise_method_not_found(router) -> None: handler = make_handler() router.add_route('GET', '/a', handler) req = make_mocked_request('GET', '/b') match_info = await router.resolve(req) assert 
isinstance(match_info.route, SystemRoute) assert {} == match_info with pytest.raises(HTTPNotFound) as ctx: await match_info.handler(req) exc = ctx.value assert 404 == exc.status def test_double_add_url_with_the_same_name(router) -> None: handler1 = make_handler() handler2 = make_handler() router.add_route('GET', '/get', handler1, name='name') regexp = ("Duplicate 'name', already handled by") with pytest.raises(ValueError) as ctx: router.add_route('GET', '/get_other', handler2, name='name') assert re.match(regexp, str(ctx.value)) def test_route_plain(router) -> None: handler = make_handler() route = router.add_route('GET', '/get', handler, name='name') route2 = next(iter(router['name'])) url = route2.url_for() assert '/get' == str(url) assert route is route2 def test_route_unknown_route_name(router) -> None: with pytest.raises(KeyError): router['unknown'] def test_route_dynamic(router) -> None: handler = make_handler() route = router.add_route('GET', '/get/{name}', handler, name='name') route2 = next(iter(router['name'])) url = route2.url_for(name='John') assert '/get/John' == str(url) assert route is route2 def test_add_static(router) -> None: resource = router.add_static('/st', os.path.dirname(aiohttp.__file__), name='static') assert router['static'] is resource url = resource.url_for(filename='/dir/a.txt') assert '/st/dir/a.txt' == str(url) assert len(resource) == 2 def test_add_static_append_version(router) -> None: resource = router.add_static('/st', os.path.dirname(__file__), name='static') url = resource.url_for(filename='/data.unknown_mime_type', append_version=True) expect_url = '/st/data.unknown_mime_type?' \ 'v=aUsn8CHEhhszc81d28QmlcBW0KQpfS2F4trgQKhOYd8%3D' assert expect_url == str(url) def test_add_static_append_version_set_from_constructor(router) -> None: resource = router.add_static('/st', os.path.dirname(__file__), append_version=True, name='static') url = resource.url_for(filename='/data.unknown_mime_type') expect_url = '/st/data.unknown_mime_type?' \ 'v=aUsn8CHEhhszc81d28QmlcBW0KQpfS2F4trgQKhOYd8%3D' assert expect_url == str(url) def test_add_static_append_version_override_constructor(router) -> None: resource = router.add_static('/st', os.path.dirname(__file__), append_version=True, name='static') url = resource.url_for(filename='/data.unknown_mime_type', append_version=False) expect_url = '/st/data.unknown_mime_type' assert expect_url == str(url) def test_add_static_append_version_filename_without_slash(router) -> None: resource = router.add_static('/st', os.path.dirname(__file__), name='static') url = resource.url_for(filename='data.unknown_mime_type', append_version=True) expect_url = '/st/data.unknown_mime_type?' 
\ 'v=aUsn8CHEhhszc81d28QmlcBW0KQpfS2F4trgQKhOYd8%3D' assert expect_url == str(url) def test_add_static_append_version_non_exists_file(router) -> None: resource = router.add_static('/st', os.path.dirname(__file__), name='static') url = resource.url_for(filename='/non_exists_file', append_version=True) assert '/st/non_exists_file' == str(url) def test_add_static_append_version_non_exists_file_without_slash( router) -> None: resource = router.add_static('/st', os.path.dirname(__file__), name='static') url = resource.url_for(filename='non_exists_file', append_version=True) assert '/st/non_exists_file' == str(url) def test_add_static_append_version_follow_symlink(router, tmpdir) -> None: """ Tests the access to a symlink, in static folder with apeend_version """ tmp_dir_path = str(tmpdir) symlink_path = os.path.join(tmp_dir_path, 'append_version_symlink') symlink_target_path = os.path.dirname(__file__) os.symlink(symlink_target_path, symlink_path, True) # Register global static route: resource = router.add_static('/st', tmp_dir_path, follow_symlinks=True, append_version=True) url = resource.url_for( filename='/append_version_symlink/data.unknown_mime_type') expect_url = '/st/append_version_symlink/data.unknown_mime_type?' \ 'v=aUsn8CHEhhszc81d28QmlcBW0KQpfS2F4trgQKhOYd8%3D' assert expect_url == str(url) def test_add_static_append_version_not_follow_symlink(router, tmpdir) -> None: """ Tests the access to a symlink, in static folder with apeend_version """ tmp_dir_path = str(tmpdir) symlink_path = os.path.join(tmp_dir_path, 'append_version_symlink') symlink_target_path = os.path.dirname(__file__) os.symlink(symlink_target_path, symlink_path, True) # Register global static route: resource = router.add_static('/st', tmp_dir_path, follow_symlinks=False, append_version=True) filename = '/append_version_symlink/data.unknown_mime_type' url = resource.url_for(filename=filename) assert '/st/append_version_symlink/data.unknown_mime_type' == str(url) def test_plain_not_match(router) -> None: handler = make_handler() router.add_route('GET', '/get/path', handler, name='name') route = router['name'] assert route._match('/another/path') is None def test_dynamic_not_match(router) -> None: handler = make_handler() router.add_route('GET', '/get/{name}', handler, name='name') route = router['name'] assert route._match('/another/path') is None async def test_static_not_match(router) -> None: router.add_static('/pre', os.path.dirname(aiohttp.__file__), name='name') resource = router['name'] ret = await resource.resolve( make_mocked_request('GET', '/another/path')) assert (None, set()) == ret def test_dynamic_with_trailing_slash(router) -> None: handler = make_handler() router.add_route('GET', '/get/{name}/', handler, name='name') route = router['name'] assert {'name': 'John'} == route._match('/get/John/') def test_len(router) -> None: handler = make_handler() router.add_route('GET', '/get1', handler, name='name1') router.add_route('GET', '/get2', handler, name='name2') assert 2 == len(router) def test_iter(router) -> None: handler = make_handler() router.add_route('GET', '/get1', handler, name='name1') router.add_route('GET', '/get2', handler, name='name2') assert {'name1', 'name2'} == set(iter(router)) def test_contains(router) -> None: handler = make_handler() router.add_route('GET', '/get1', handler, name='name1') router.add_route('GET', '/get2', handler, name='name2') assert 'name1' in router assert 'name3' not in router def test_static_repr(router) -> None: router.add_static('/get', 
os.path.dirname(aiohttp.__file__), name='name') assert re.match(r" None: route = router.add_static('/prefix', os.path.dirname(aiohttp.__file__)) assert '/prefix' == route._prefix def test_static_remove_trailing_slash(router) -> None: route = router.add_static('/prefix/', os.path.dirname(aiohttp.__file__)) assert '/prefix' == route._prefix async def test_add_route_with_re(router) -> None: handler = make_handler() router.add_route('GET', r'/handler/{to:\d+}', handler) req = make_mocked_request('GET', '/handler/1234') info = await router.resolve(req) assert info is not None assert {'to': '1234'} == info router.add_route('GET', r'/handler/{name}.html', handler) req = make_mocked_request('GET', '/handler/test.html') info = await router.resolve(req) assert {'name': 'test'} == info async def test_add_route_with_re_and_slashes(router) -> None: handler = make_handler() router.add_route('GET', r'/handler/{to:[^/]+/?}', handler) req = make_mocked_request('GET', '/handler/1234/') info = await router.resolve(req) assert info is not None assert {'to': '1234/'} == info router.add_route('GET', r'/handler/{to:.+}', handler) req = make_mocked_request('GET', '/handler/1234/5/6/7') info = await router.resolve(req) assert info is not None assert {'to': '1234/5/6/7'} == info async def test_add_route_with_re_not_match(router) -> None: handler = make_handler() router.add_route('GET', r'/handler/{to:\d+}', handler) req = make_mocked_request('GET', '/handler/tail') match_info = await router.resolve(req) assert isinstance(match_info.route, SystemRoute) assert {} == match_info with pytest.raises(HTTPNotFound): await match_info.handler(req) async def test_add_route_with_re_including_slashes(router) -> None: handler = make_handler() router.add_route('GET', r'/handler/{to:.+}/tail', handler) req = make_mocked_request('GET', '/handler/re/with/slashes/tail') info = await router.resolve(req) assert info is not None assert {'to': 're/with/slashes'} == info def test_add_route_with_invalid_re(router) -> None: handler = make_handler() with pytest.raises(ValueError) as ctx: router.add_route('GET', r'/handler/{to:+++}', handler) s = str(ctx.value) assert s.startswith("Bad pattern '" + PATH_SEP + "handler" + PATH_SEP + "(?P+++)': nothing to repeat") assert ctx.value.__cause__ is None def test_route_dynamic_with_regex_spec(router) -> None: handler = make_handler() route = router.add_route('GET', r'/get/{num:^\d+}', handler, name='name') url = route.url_for(num='123') assert '/get/123' == str(url) def test_route_dynamic_with_regex_spec_and_trailing_slash(router) -> None: handler = make_handler() route = router.add_route('GET', r'/get/{num:^\d+}/', handler, name='name') url = route.url_for(num='123') assert '/get/123/' == str(url) def test_route_dynamic_with_regex(router) -> None: handler = make_handler() route = router.add_route('GET', r'/{one}/{two:.+}', handler) url = route.url_for(one='1', two='2') assert '/1/2' == str(url) def test_route_dynamic_quoting(router) -> None: handler = make_handler() route = router.add_route('GET', r'/{arg}', handler) url = route.url_for(arg='1 2/текст') assert '/1%202/%D1%82%D0%B5%D0%BA%D1%81%D1%82' == str(url) async def test_regular_match_info(router) -> None: handler = make_handler() router.add_route('GET', '/get/{name}', handler) req = make_mocked_request('GET', '/get/john') match_info = await router.resolve(req) assert {'name': 'john'} == match_info assert re.match(">", repr(match_info)) async def test_match_info_with_plus(router) -> None: handler = make_handler() router.add_route('GET', 
'/get/{version}', handler) req = make_mocked_request('GET', '/get/1.0+test') match_info = await router.resolve(req) assert {'version': '1.0+test'} == match_info async def test_not_found_repr(router) -> None: req = make_mocked_request('POST', '/path/to') match_info = await router.resolve(req) assert "" == repr(match_info) async def test_not_allowed_repr(router) -> None: handler = make_handler() router.add_route('GET', '/path/to', handler) handler2 = make_handler() router.add_route('POST', '/path/to', handler2) req = make_mocked_request('PUT', '/path/to') match_info = await router.resolve(req) assert "" == repr(match_info) def test_default_expect_handler(router) -> None: route = router.add_route('GET', '/', make_handler()) assert route._expect_handler is _default_expect_handler def test_custom_expect_handler_plain(router) -> None: async def handler(request): pass route = router.add_route( 'GET', '/', make_handler(), expect_handler=handler) assert route._expect_handler is handler assert isinstance(route, ResourceRoute) def test_custom_expect_handler_dynamic(router) -> None: async def handler(request): pass route = router.add_route( 'GET', '/get/{name}', make_handler(), expect_handler=handler) assert route._expect_handler is handler assert isinstance(route, ResourceRoute) def test_expect_handler_non_coroutine(router) -> None: def handler(request): pass with pytest.raises(AssertionError): router.add_route('GET', '/', make_handler(), expect_handler=handler) async def test_dynamic_match_non_ascii(router) -> None: handler = make_handler() router.add_route('GET', '/{var}', handler) req = make_mocked_request( 'GET', '/%D1%80%D1%83%D1%81%20%D1%82%D0%B5%D0%BA%D1%81%D1%82') match_info = await router.resolve(req) assert {'var': 'рус текст'} == match_info async def test_dynamic_match_with_static_part(router) -> None: handler = make_handler() router.add_route('GET', '/{name}.html', handler) req = make_mocked_request('GET', '/file.html') match_info = await router.resolve(req) assert {'name': 'file'} == match_info async def test_dynamic_match_two_part2(router) -> None: handler = make_handler() router.add_route('GET', '/{name}.{ext}', handler) req = make_mocked_request('GET', '/file.html') match_info = await router.resolve(req) assert {'name': 'file', 'ext': 'html'} == match_info async def test_dynamic_match_unquoted_path(router) -> None: handler = make_handler() router.add_route('GET', '/{path}/{subpath}', handler) resource_id = 'my%2Fpath%7Cwith%21some%25strange%24characters' req = make_mocked_request('GET', '/path/{0}'.format(resource_id)) match_info = await router.resolve(req) assert match_info == { 'path': 'path', 'subpath': unquote(resource_id) } def test_add_route_not_started_with_slash(router) -> None: with pytest.raises(ValueError): handler = make_handler() router.add_route('GET', 'invalid_path', handler) def test_add_route_invalid_method(router) -> None: sample_bad_methods = { 'BAD METHOD', 'B@D_METHOD', '[BAD_METHOD]', '{BAD_METHOD}', '(BAD_METHOD)', 'B?D_METHOD', } for bad_method in sample_bad_methods: with pytest.raises(ValueError): handler = make_handler() router.add_route(bad_method, '/path', handler) def test_routes_view_len(router, fill_routes) -> None: fill_routes() assert 4 == len(router.routes()) def test_routes_view_iter(router, fill_routes) -> None: routes = fill_routes() assert list(routes) == list(router.routes()) def test_routes_view_contains(router, fill_routes) -> None: routes = fill_routes() for route in routes: assert route in router.routes() def test_routes_abc(router) -> None: 
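# --- Added orientation note (not from the original test suite) ---
# test_routes_abc / test_named_resources below check that router.routes(),
# router.resources() and router.named_resources() behave as read-only
# collection views.  A typical named-resource lookup, as exercised by
# test_route_plain / test_route_dynamic above, is roughly (the route name
# and path are illustrative only):
#
#     router.add_route('GET', '/users/{name}', handler, name='user')
#     url = router['user'].url_for(name='john')   # -> URL('/users/john')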
assert isinstance(router.routes(), Sized) assert isinstance(router.routes(), Iterable) assert isinstance(router.routes(), Container) def test_named_resources_abc(router) -> None: assert isinstance(router.named_resources(), Mapping) assert not isinstance(router.named_resources(), MutableMapping) def test_named_resources(router) -> None: route1 = router.add_route('GET', '/plain', make_handler(), name='route1') route2 = router.add_route('GET', '/variable/{name}', make_handler(), name='route2') route3 = router.add_static('/static', os.path.dirname(aiohttp.__file__), name='route3') names = {route1.name, route2.name, route3.name} assert 3 == len(router.named_resources()) for name in names: assert name in router.named_resources() assert isinstance(router.named_resources()[name], AbstractResource) def test_resource_iter(router) -> None: async def handler(request): pass resource = router.add_resource('/path') r1 = resource.add_route('GET', handler) r2 = resource.add_route('POST', handler) assert 2 == len(resource) assert [r1, r2] == list(resource) def test_deprecate_bare_generators(router) -> None: resource = router.add_resource('/path') def gen(request): yield with pytest.warns(DeprecationWarning): resource.add_route('GET', gen) def test_view_route(router) -> None: resource = router.add_resource('/path') route = resource.add_route('GET', View) assert View is route.handler def test_resource_route_match(router) -> None: async def handler(request): pass resource = router.add_resource('/path') route = resource.add_route('GET', handler) assert {} == route.resource._match('/path') def test_error_on_double_route_adding(router) -> None: async def handler(request): pass resource = router.add_resource('/path') resource.add_route('GET', handler) with pytest.raises(RuntimeError): resource.add_route('GET', handler) def test_error_on_adding_route_after_wildcard(router) -> None: async def handler(request): pass resource = router.add_resource('/path') resource.add_route('*', handler) with pytest.raises(RuntimeError): resource.add_route('GET', handler) async def test_http_exception_is_none_when_resolved(router) -> None: handler = make_handler() router.add_route('GET', '/', handler) req = make_mocked_request('GET', '/') info = await router.resolve(req) assert info.http_exception is None async def test_http_exception_is_not_none_when_not_resolved(router) -> None: handler = make_handler() router.add_route('GET', '/', handler) req = make_mocked_request('GET', '/abc') info = await router.resolve(req) assert info.http_exception.status == 404 async def test_match_info_get_info_plain(router) -> None: handler = make_handler() router.add_route('GET', '/', handler) req = make_mocked_request('GET', '/') info = await router.resolve(req) assert info.get_info() == {'path': '/'} async def test_match_info_get_info_dynamic(router) -> None: handler = make_handler() router.add_route('GET', '/{a}', handler) req = make_mocked_request('GET', '/value') info = await router.resolve(req) assert info.get_info() == { 'pattern': re.compile(PATH_SEP+'(?P[^{}/]+)'), 'formatter': '/{a}'} async def test_match_info_get_info_dynamic2(router) -> None: handler = make_handler() router.add_route('GET', '/{a}/{b}', handler) req = make_mocked_request('GET', '/path/to') info = await router.resolve(req) assert info.get_info() == { 'pattern': re.compile(PATH_SEP + '(?P[^{}/]+)' + PATH_SEP + '(?P[^{}/]+)'), 'formatter': '/{a}/{b}'} def test_static_resource_get_info(router) -> None: directory = pathlib.Path(aiohttp.__file__).parent.resolve() resource = 
router.add_static('/st', directory) assert resource.get_info() == {'directory': directory, 'prefix': '/st'} async def test_system_route_get_info(router) -> None: handler = make_handler() router.add_route('GET', '/', handler) req = make_mocked_request('GET', '/abc') info = await router.resolve(req) assert info.get_info()['http_exception'].status == 404 def test_resources_view_len(router) -> None: router.add_resource('/plain') router.add_resource('/variable/{name}') assert 2 == len(router.resources()) def test_resources_view_iter(router) -> None: resource1 = router.add_resource('/plain') resource2 = router.add_resource('/variable/{name}') resources = [resource1, resource2] assert list(resources) == list(router.resources()) def test_resources_view_contains(router) -> None: resource1 = router.add_resource('/plain') resource2 = router.add_resource('/variable/{name}') resources = [resource1, resource2] for resource in resources: assert resource in router.resources() def test_resources_abc(router) -> None: assert isinstance(router.resources(), Sized) assert isinstance(router.resources(), Iterable) assert isinstance(router.resources(), Container) def test_static_route_user_home(router) -> None: here = pathlib.Path(aiohttp.__file__).parent home = pathlib.Path(os.path.expanduser('~')) if not str(here).startswith(str(home)): # pragma: no cover pytest.skip("aiohttp folder is not placed in user's HOME") static_dir = '~/' + str(here.relative_to(home)) route = router.add_static('/st', static_dir) assert here == route.get_info()['directory'] def test_static_route_points_to_file(router) -> None: here = pathlib.Path(aiohttp.__file__).parent / '__init__.py' with pytest.raises(ValueError): router.add_static('/st', here) async def test_404_for_static_resource(router) -> None: resource = router.add_static('/st', os.path.dirname(aiohttp.__file__)) ret = await resource.resolve( make_mocked_request('GET', '/unknown/path')) assert (None, set()) == ret async def test_405_for_resource_adapter(router) -> None: resource = router.add_static('/st', os.path.dirname(aiohttp.__file__)) ret = await resource.resolve( make_mocked_request('POST', '/st/abc.py')) assert (None, {'HEAD', 'GET'}) == ret async def test_check_allowed_method_for_found_resource(router) -> None: handler = make_handler() resource = router.add_resource('/') resource.add_route('GET', handler) ret = await resource.resolve(make_mocked_request('GET', '/')) assert ret[0] is not None assert {'GET'} == ret[1] def test_url_for_in_static_resource(router) -> None: resource = router.add_static('/static', os.path.dirname(aiohttp.__file__)) assert URL('/static/file.txt') == resource.url_for(filename='file.txt') def test_url_for_in_static_resource_pathlib(router) -> None: resource = router.add_static('/static', os.path.dirname(aiohttp.__file__)) assert URL('/static/file.txt') == resource.url_for( filename=pathlib.Path('file.txt')) def test_url_for_in_resource_route(router) -> None: route = router.add_route('GET', '/get/{name}', make_handler(), name='name') assert URL('/get/John') == route.url_for(name='John') def test_subapp_get_info(app) -> None: subapp = web.Application() resource = subapp.add_subapp('/pre', subapp) assert resource.get_info() == {'prefix': '/pre', 'app': subapp} @pytest.mark.parametrize('domain,error', [ (None, TypeError), ('', ValueError), ('http://dom', ValueError), ('*.example.com', ValueError), ('example$com', ValueError), ]) def test_domain_validation_error(domain, error): with pytest.raises(error): Domain(domain) def test_domain_valid(): assert 
Domain('example.com:81').canonical == 'example.com:81' assert MaskDomain('*.example.com').canonical == r'.*\.example\.com' assert Domain('пуни.код').canonical == 'xn--h1ajfq.xn--d1alm' @pytest.mark.parametrize('a,b,result', [ ('example.com', 'example.com', True), ('example.com:81', 'example.com:81', True), ('example.com:81', 'example.com', False), ('пуникод', 'xn--d1ahgkhc2a', True), ('*.example.com', 'jpg.example.com', True), ('*.example.com', 'a.example.com', True), ('*.example.com', 'example.com', False), ]) def test_match_domain(a, b, result): if '*' in a: rule = MaskDomain(a) else: rule = Domain(a) assert rule.match_domain(b) is result def test_add_subapp_errors(app): with pytest.raises(TypeError): app.add_subapp(1, web.Application()) def test_subapp_rule_resource(app): subapp = web.Application() subapp.router.add_get('/', make_handler()) rule = Domain('example.com') assert rule.get_info() == {'domain': 'example.com'} resource = app.add_domain('example.com', subapp) assert resource.canonical == 'example.com' assert resource.get_info() == {'rule': resource._rule, 'app': subapp} resource.add_prefix('/a') resource.raw_match('/b') assert len(resource) assert list(resource) assert repr(resource).startswith(' None: subapp = web.Application() resource = app.add_subapp('/pre', subapp) with pytest.raises(RuntimeError): resource.url_for() def test_subapp_repr(app) -> None: subapp = web.Application() resource = app.add_subapp('/pre', subapp) assert repr(resource).startswith( ' None: subapp = web.Application() subapp.router.add_get('/', make_handler(), allow_head=False) subapp.router.add_post('/', make_handler()) resource = app.add_subapp('/pre', subapp) assert len(resource) == 2 def test_subapp_iter(app) -> None: subapp = web.Application() r1 = subapp.router.add_get('/', make_handler(), allow_head=False) r2 = subapp.router.add_post('/', make_handler()) resource = app.add_subapp('/pre', subapp) assert list(resource) == [r1, r2] def test_invalid_route_name(router) -> None: with pytest.raises(ValueError): router.add_get('/', make_handler(), name='invalid name') def test_frozen_router(router) -> None: router.freeze() with pytest.raises(RuntimeError): router.add_get('/', make_handler()) def test_frozen_router_subapp(app) -> None: subapp = web.Application() subapp.freeze() with pytest.raises(RuntimeError): app.add_subapp('/pre', subapp) def test_frozen_app_on_subapp(app) -> None: app.freeze() subapp = web.Application() with pytest.raises(RuntimeError): app.add_subapp('/pre', subapp) def test_set_options_route(router) -> None: resource = router.add_static('/static', os.path.dirname(aiohttp.__file__)) options = None for route in resource: if route.method == 'OPTIONS': options = route assert options is None resource.set_options_route(make_handler()) for route in resource: if route.method == 'OPTIONS': options = route assert options is not None with pytest.raises(RuntimeError): resource.set_options_route(make_handler()) def test_dynamic_url_with_name_started_from_underscore(router) -> None: route = router.add_route('GET', '/get/{_name}', make_handler()) assert URL('/get/John') == route.url_for(_name='John') def test_cannot_add_subapp_with_empty_prefix(app) -> None: subapp = web.Application() with pytest.raises(ValueError): app.add_subapp('', subapp) def test_cannot_add_subapp_with_slash_prefix(app) -> None: subapp = web.Application() with pytest.raises(ValueError): app.add_subapp('/', subapp) async def test_convert_empty_path_to_slash_on_freezing(router) -> None: handler = make_handler() route = 
router.add_get('', handler) resource = route.resource assert resource.get_info() == {'path': ''} router.freeze() assert resource.get_info() == {'path': '/'} def test_deprecate_non_coroutine(router) -> None: def handler(request): pass with pytest.warns(DeprecationWarning): router.add_route('GET', '/handler', handler) def test_plain_resource_canonical() -> None: canonical = '/plain/path' res = PlainResource(path=canonical) assert res.canonical == canonical def test_dynamic_resource_canonical() -> None: canonicals = { '/get/{name}': '/get/{name}', r'/get/{num:^\d+}': '/get/{num}', r'/handler/{to:\d+}': r'/handler/{to}', r'/{one}/{two:.+}': r'/{one}/{two}', } for pattern, canonical in canonicals.items(): res = DynamicResource(path=pattern) assert res.canonical == canonical def test_static_resource_canonical() -> None: prefix = '/prefix' directory = str(os.path.dirname(aiohttp.__file__)) canonical = prefix res = StaticResource(prefix=prefix, directory=directory) assert res.canonical == canonical def test_prefixed_subapp_resource_canonical(app) -> None: canonical = '/prefix' subapp = web.Application() res = subapp.add_subapp(canonical, subapp) assert res.canonical == canonical async def test_prefixed_subapp_overlap(app) -> None: """ Subapp should not overshadow other subapps with overlapping prefixes """ subapp1 = web.Application() handler1 = make_handler() subapp1.router.add_get('/a', handler1) app.add_subapp('/s', subapp1) subapp2 = web.Application() handler2 = make_handler() subapp2.router.add_get('/b', handler2) app.add_subapp('/ss', subapp2) match_info = await app.router.resolve(make_mocked_request('GET', '/s/a')) assert match_info.route.handler is handler1 match_info = await app.router.resolve(make_mocked_request('GET', '/ss/b')) assert match_info.route.handler is handler2 async def test_prefixed_subapp_empty_route(app) -> None: subapp = web.Application() handler = make_handler() subapp.router.add_get('', handler) app.add_subapp('/s', subapp) match_info = await app.router.resolve(make_mocked_request('GET', '/s')) assert match_info.route.handler is handler match_info = await app.router.resolve(make_mocked_request('GET', '/s/')) assert "" == repr(match_info) async def test_prefixed_subapp_root_route(app) -> None: subapp = web.Application() handler = make_handler() subapp.router.add_get('/', handler) app.add_subapp('/s', subapp) match_info = await app.router.resolve(make_mocked_request('GET', '/s/')) assert match_info.route.handler is handler match_info = await app.router.resolve(make_mocked_request('GET', '/s')) assert "" == repr(match_info) aiohttp-3.6.2/tests/test_web_app.py0000644000175100001650000003522113547410117017660 0ustar vstsdocker00000000000000import asyncio from unittest import mock import pytest from async_generator import async_generator, yield_ from aiohttp import log, web from aiohttp.abc import AbstractAccessLogger, AbstractRouter from aiohttp.helpers import DEBUG, PY_36 from aiohttp.test_utils import make_mocked_coro async def test_app_ctor() -> None: loop = asyncio.get_event_loop() with pytest.warns(DeprecationWarning): app = web.Application(loop=loop) with pytest.warns(DeprecationWarning): assert loop is app.loop assert app.logger is log.web_logger def test_app_call() -> None: app = web.Application() assert app is app() def test_app_default_loop() -> None: app = web.Application() with pytest.warns(DeprecationWarning): assert app.loop is None async def test_set_loop() -> None: loop = asyncio.get_event_loop() app = web.Application() app._set_loop(loop) with 
pytest.warns(DeprecationWarning): assert app.loop is loop def test_set_loop_default_loop() -> None: loop = asyncio.new_event_loop() asyncio.set_event_loop(loop) app = web.Application() app._set_loop(None) with pytest.warns(DeprecationWarning): assert app.loop is loop asyncio.set_event_loop(None) def test_set_loop_with_different_loops() -> None: loop = asyncio.new_event_loop() app = web.Application() app._set_loop(loop) with pytest.warns(DeprecationWarning): assert app.loop is loop with pytest.raises(RuntimeError): app._set_loop(loop=object()) @pytest.mark.parametrize('debug', [True, False]) async def test_app_make_handler_debug_exc(mocker, debug) -> None: with pytest.warns(DeprecationWarning): app = web.Application(debug=debug) srv = mocker.patch('aiohttp.web_app.Server') with pytest.warns(DeprecationWarning): assert app.debug == debug app._make_handler() srv.assert_called_with(app._handle, request_factory=app._make_request, access_log_class=mock.ANY, loop=asyncio.get_event_loop(), debug=debug) async def test_app_make_handler_args(mocker) -> None: app = web.Application(handler_args={'test': True}) srv = mocker.patch('aiohttp.web_app.Server') app._make_handler() srv.assert_called_with(app._handle, request_factory=app._make_request, access_log_class=mock.ANY, loop=asyncio.get_event_loop(), debug=mock.ANY, test=True) async def test_app_make_handler_access_log_class(mocker) -> None: class Logger: pass app = web.Application() with pytest.raises(TypeError): app._make_handler(access_log_class=Logger) class Logger(AbstractAccessLogger): def log(self, request, response, time): self.logger.info('msg') srv = mocker.patch('aiohttp.web_app.Server') app._make_handler(access_log_class=Logger) srv.assert_called_with(app._handle, access_log_class=Logger, request_factory=app._make_request, loop=asyncio.get_event_loop(), debug=mock.ANY) app = web.Application(handler_args={'access_log_class': Logger}) app._make_handler(access_log_class=Logger) srv.assert_called_with(app._handle, access_log_class=Logger, request_factory=app._make_request, loop=asyncio.get_event_loop(), debug=mock.ANY) async def test_app_make_handler_raises_deprecation_warning() -> None: app = web.Application() with pytest.warns(DeprecationWarning): app.make_handler() async def test_app_register_on_finish() -> None: app = web.Application() cb1 = make_mocked_coro(None) cb2 = make_mocked_coro(None) app.on_cleanup.append(cb1) app.on_cleanup.append(cb2) app.freeze() await app.cleanup() cb1.assert_called_once_with(app) cb2.assert_called_once_with(app) async def test_app_register_coro() -> None: app = web.Application() fut = asyncio.get_event_loop().create_future() async def cb(app): await asyncio.sleep(0.001) fut.set_result(123) app.on_cleanup.append(cb) app.freeze() await app.cleanup() assert fut.done() assert 123 == fut.result() def test_non_default_router() -> None: router = mock.Mock(spec=AbstractRouter) with pytest.warns(DeprecationWarning): app = web.Application(router=router) assert router is app.router def test_logging() -> None: logger = mock.Mock() app = web.Application() app.logger = logger assert app.logger is logger async def test_on_shutdown() -> None: app = web.Application() called = False async def on_shutdown(app_param): nonlocal called assert app is app_param called = True app.on_shutdown.append(on_shutdown) app.freeze() await app.shutdown() assert called async def test_on_startup() -> None: app = web.Application() long_running1_called = False long_running2_called = False all_long_running_called = False async def 
long_running1(app_param): nonlocal long_running1_called assert app is app_param long_running1_called = True async def long_running2(app_param): nonlocal long_running2_called assert app is app_param long_running2_called = True async def on_startup_all_long_running(app_param): nonlocal all_long_running_called assert app is app_param all_long_running_called = True return await asyncio.gather(long_running1(app_param), long_running2(app_param)) app.on_startup.append(on_startup_all_long_running) app.freeze() await app.startup() assert long_running1_called assert long_running2_called assert all_long_running_called def test_app_delitem() -> None: app = web.Application() app['key'] = 'value' assert len(app) == 1 del app['key'] assert len(app) == 0 def test_app_freeze() -> None: app = web.Application() subapp = mock.Mock() subapp._middlewares = () app._subapps.append(subapp) app.freeze() assert subapp.freeze.called app.freeze() assert len(subapp.freeze.call_args_list) == 1 def test_equality() -> None: app1 = web.Application() app2 = web.Application() assert app1 == app1 assert app1 != app2 def test_app_run_middlewares() -> None: root = web.Application() sub = web.Application() root.add_subapp('/sub', sub) root.freeze() assert root._run_middlewares is False @web.middleware async def middleware(request, handler): return await handler(request) root = web.Application(middlewares=[middleware]) sub = web.Application() root.add_subapp('/sub', sub) root.freeze() assert root._run_middlewares is True root = web.Application() sub = web.Application(middlewares=[middleware]) root.add_subapp('/sub', sub) root.freeze() assert root._run_middlewares is True def test_subapp_pre_frozen_after_adding() -> None: app = web.Application() subapp = web.Application() app.add_subapp('/prefix', subapp) assert subapp.pre_frozen assert not subapp.frozen @pytest.mark.skipif(not PY_36, reason="Python 3.6+ required") def test_app_inheritance() -> None: with pytest.warns(DeprecationWarning): class A(web.Application): pass @pytest.mark.skipif(not DEBUG, reason="The check is applied in DEBUG mode only") def test_app_custom_attr() -> None: app = web.Application() with pytest.warns(DeprecationWarning): app.custom = None async def test_cleanup_ctx() -> None: app = web.Application() out = [] def f(num): @async_generator async def inner(app): out.append('pre_' + str(num)) await yield_(None) out.append('post_' + str(num)) return inner app.cleanup_ctx.append(f(1)) app.cleanup_ctx.append(f(2)) app.freeze() await app.startup() assert out == ['pre_1', 'pre_2'] await app.cleanup() assert out == ['pre_1', 'pre_2', 'post_2', 'post_1'] async def test_cleanup_ctx_exception_on_startup() -> None: app = web.Application() out = [] exc = Exception('fail') def f(num, fail=False): @async_generator async def inner(app): out.append('pre_' + str(num)) if fail: raise exc await yield_(None) out.append('post_' + str(num)) return inner app.cleanup_ctx.append(f(1)) app.cleanup_ctx.append(f(2, True)) app.cleanup_ctx.append(f(3)) app.freeze() with pytest.raises(Exception) as ctx: await app.startup() assert ctx.value is exc assert out == ['pre_1', 'pre_2'] await app.cleanup() assert out == ['pre_1', 'pre_2', 'post_1'] async def test_cleanup_ctx_exception_on_cleanup() -> None: app = web.Application() out = [] exc = Exception('fail') def f(num, fail=False): @async_generator async def inner(app): out.append('pre_' + str(num)) await yield_(None) out.append('post_' + str(num)) if fail: raise exc return inner app.cleanup_ctx.append(f(1)) app.cleanup_ctx.append(f(2, True)) 
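# --- Added orientation note (not from the original test suite) ---
# These cleanup_ctx tests use the ``async_generator`` backport
# (@async_generator / await yield_(None)).  On Python 3.6+ the same
# cleanup context is normally written as a native async generator; an
# equivalent of the ``f(num)`` factory used in these tests would look
# roughly like:
#
#     def f(num):
#         async def inner(app):
#             out.append('pre_' + str(num))    # runs during app.startup()
#             yield
#             out.append('post_' + str(num))   # runs during app.cleanup()
#         return inner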
app.cleanup_ctx.append(f(3)) app.freeze() await app.startup() assert out == ['pre_1', 'pre_2', 'pre_3'] with pytest.raises(Exception) as ctx: await app.cleanup() assert ctx.value is exc assert out == ['pre_1', 'pre_2', 'pre_3', 'post_3', 'post_2', 'post_1'] async def test_cleanup_ctx_exception_on_cleanup_multiple() -> None: app = web.Application() out = [] def f(num, fail=False): @async_generator async def inner(app): out.append('pre_' + str(num)) await yield_(None) out.append('post_' + str(num)) if fail: raise Exception('fail_' + str(num)) return inner app.cleanup_ctx.append(f(1)) app.cleanup_ctx.append(f(2, True)) app.cleanup_ctx.append(f(3, True)) app.freeze() await app.startup() assert out == ['pre_1', 'pre_2', 'pre_3'] with pytest.raises(web.CleanupError) as ctx: await app.cleanup() exc = ctx.value assert len(exc.exceptions) == 2 assert str(exc.exceptions[0]) == 'fail_3' assert str(exc.exceptions[1]) == 'fail_2' assert out == ['pre_1', 'pre_2', 'pre_3', 'post_3', 'post_2', 'post_1'] async def test_cleanup_ctx_multiple_yields() -> None: app = web.Application() out = [] def f(num): @async_generator async def inner(app): out.append('pre_' + str(num)) await yield_(None) out.append('post_' + str(num)) await yield_(None) return inner app.cleanup_ctx.append(f(1)) app.freeze() await app.startup() assert out == ['pre_1'] with pytest.raises(RuntimeError) as ctx: await app.cleanup() assert "has more than one 'yield'" in str(ctx.value) assert out == ['pre_1', 'post_1'] async def test_subapp_chained_config_dict_visibility(aiohttp_client) -> None: async def main_handler(request): assert request.config_dict['key1'] == 'val1' assert 'key2' not in request.config_dict return web.Response(status=200) root = web.Application() root['key1'] = 'val1' root.add_routes([web.get('/', main_handler)]) async def sub_handler(request): assert request.config_dict['key1'] == 'val1' assert request.config_dict['key2'] == 'val2' return web.Response(status=201) sub = web.Application() sub['key2'] = 'val2' sub.add_routes([web.get('/', sub_handler)]) root.add_subapp('/sub', sub) client = await aiohttp_client(root) resp = await client.get('/') assert resp.status == 200 resp = await client.get('/sub/') assert resp.status == 201 async def test_subapp_chained_config_dict_overriding(aiohttp_client) -> None: async def main_handler(request): assert request.config_dict['key'] == 'val1' return web.Response(status=200) root = web.Application() root['key'] = 'val1' root.add_routes([web.get('/', main_handler)]) async def sub_handler(request): assert request.config_dict['key'] == 'val2' return web.Response(status=201) sub = web.Application() sub['key'] = 'val2' sub.add_routes([web.get('/', sub_handler)]) root.add_subapp('/sub', sub) client = await aiohttp_client(root) resp = await client.get('/') assert resp.status == 200 resp = await client.get('/sub/') assert resp.status == 201 async def test_subapp_on_startup(aiohttp_client) -> None: subapp = web.Application() startup_called = False async def on_startup(app): nonlocal startup_called startup_called = True app['startup'] = True subapp.on_startup.append(on_startup) ctx_pre_called = False ctx_post_called = False @async_generator async def cleanup_ctx(app): nonlocal ctx_pre_called, ctx_post_called ctx_pre_called = True app['cleanup'] = True await yield_(None) ctx_post_called = True subapp.cleanup_ctx.append(cleanup_ctx) shutdown_called = False async def on_shutdown(app): nonlocal shutdown_called shutdown_called = True subapp.on_shutdown.append(on_shutdown) cleanup_called = False async def 
on_cleanup(app): nonlocal cleanup_called cleanup_called = True subapp.on_cleanup.append(on_cleanup) app = web.Application() app.add_subapp('/subapp', subapp) assert not startup_called assert not ctx_pre_called assert not ctx_post_called assert not shutdown_called assert not cleanup_called assert subapp.on_startup.frozen assert subapp.cleanup_ctx.frozen assert subapp.on_shutdown.frozen assert subapp.on_cleanup.frozen assert subapp.router.frozen client = await aiohttp_client(app) assert startup_called assert ctx_pre_called assert not ctx_post_called assert not shutdown_called assert not cleanup_called await client.close() assert startup_called assert ctx_pre_called assert ctx_post_called assert shutdown_called assert cleanup_called def test_app_iter(): app = web.Application() app['a'] = '1' app['b'] = '2' assert sorted(list(app)) == ['a', 'b'] def test_app_boolean() -> None: app = web.Application() assert app aiohttp-3.6.2/tests/test_web_cli.py0000644000175100001650000001032713547410117017647 0ustar vstsdocker00000000000000import pytest from aiohttp import web def test_entry_func_empty(mocker) -> None: error = mocker.patch("aiohttp.web.ArgumentParser.error", side_effect=SystemExit) argv = [""] with pytest.raises(SystemExit): web.main(argv) error.assert_called_with( "'entry-func' not in 'module:function' syntax" ) def test_entry_func_only_module(mocker) -> None: argv = ["test"] error = mocker.patch("aiohttp.web.ArgumentParser.error", side_effect=SystemExit) with pytest.raises(SystemExit): web.main(argv) error.assert_called_with( "'entry-func' not in 'module:function' syntax" ) def test_entry_func_only_function(mocker) -> None: argv = [":test"] error = mocker.patch("aiohttp.web.ArgumentParser.error", side_effect=SystemExit) with pytest.raises(SystemExit): web.main(argv) error.assert_called_with( "'entry-func' not in 'module:function' syntax" ) def test_entry_func_only_separator(mocker) -> None: argv = [":"] error = mocker.patch("aiohttp.web.ArgumentParser.error", side_effect=SystemExit) with pytest.raises(SystemExit): web.main(argv) error.assert_called_with( "'entry-func' not in 'module:function' syntax" ) def test_entry_func_relative_module(mocker) -> None: argv = [".a.b:c"] error = mocker.patch("aiohttp.web.ArgumentParser.error", side_effect=SystemExit) with pytest.raises(SystemExit): web.main(argv) error.assert_called_with("relative module names not supported") def test_entry_func_non_existent_module(mocker) -> None: argv = ["alpha.beta:func"] mocker.patch("aiohttp.web.import_module", side_effect=ImportError("Test Error")) error = mocker.patch("aiohttp.web.ArgumentParser.error", side_effect=SystemExit) with pytest.raises(SystemExit): web.main(argv) error.assert_called_with('unable to import alpha.beta: Test Error') def test_entry_func_non_existent_attribute(mocker) -> None: argv = ["alpha.beta:func"] import_module = mocker.patch("aiohttp.web.import_module") error = mocker.patch("aiohttp.web.ArgumentParser.error", side_effect=SystemExit) module = import_module("alpha.beta") del module.func with pytest.raises(SystemExit): web.main(argv) error.assert_called_with( "module %r has no attribute %r" % ("alpha.beta", "func") ) def test_path_when_unsupported(mocker, monkeypatch) -> None: argv = "--path=test_path.sock alpha.beta:func".split() mocker.patch("aiohttp.web.import_module") monkeypatch.delattr("socket.AF_UNIX", raising=False) error = mocker.patch("aiohttp.web.ArgumentParser.error", side_effect=SystemExit) with pytest.raises(SystemExit): web.main(argv) error.assert_called_with("file system 
paths not supported by your" " operating environment") def test_entry_func_call(mocker) -> None: mocker.patch("aiohttp.web.run_app") import_module = mocker.patch("aiohttp.web.import_module") argv = ("-H testhost -P 6666 --extra-optional-eins alpha.beta:func " "--extra-optional-zwei extra positional args").split() module = import_module("alpha.beta") with pytest.raises(SystemExit): web.main(argv) module.func.assert_called_with( ("--extra-optional-eins --extra-optional-zwei extra positional " "args").split() ) def test_running_application(mocker) -> None: run_app = mocker.patch("aiohttp.web.run_app") import_module = mocker.patch("aiohttp.web.import_module") exit = mocker.patch("aiohttp.web.ArgumentParser.exit", side_effect=SystemExit) argv = ("-H testhost -P 6666 --extra-optional-eins alpha.beta:func " "--extra-optional-zwei extra positional args").split() module = import_module("alpha.beta") app = module.func() with pytest.raises(SystemExit): web.main(argv) run_app.assert_called_with(app, host="testhost", port=6666, path=None) exit.assert_called_with(message="Stopped\n") aiohttp-3.6.2/tests/test_web_exceptions.py0000644000175100001650000001332113547410117021256 0ustar vstsdocker00000000000000import collections import re from traceback import format_exception from unittest import mock import pytest from aiohttp import helpers, signals, web from aiohttp.test_utils import make_mocked_request @pytest.fixture def buf(): return bytearray() @pytest.fixture def http_request(buf): method = 'GET' path = '/' writer = mock.Mock() writer.drain.return_value = () def append(data=b''): buf.extend(data) return helpers.noop() async def write_headers(status_line, headers): headers = status_line + '\r\n' + ''.join( [k + ': ' + v + '\r\n' for k, v in headers.items()]) headers = headers.encode('utf-8') + b'\r\n' buf.extend(headers) writer.buffer_data.side_effect = append writer.write.side_effect = append writer.write_eof.side_effect = append writer.write_headers.side_effect = write_headers app = mock.Mock() app._debug = False app.on_response_prepare = signals.Signal(app) app.on_response_prepare.freeze() req = make_mocked_request(method, path, app=app, writer=writer) return req def test_all_http_exceptions_exported() -> None: assert 'HTTPException' in web.__all__ for name in dir(web): if name.startswith('_'): continue obj = getattr(web, name) if isinstance(obj, type) and issubclass(obj, web.HTTPException): assert name in web.__all__ async def test_HTTPOk(buf, http_request) -> None: resp = web.HTTPOk() await resp.prepare(http_request) await resp.write_eof() txt = buf.decode('utf8') assert re.match(('HTTP/1.1 200 OK\r\n' 'Content-Type: text/plain; charset=utf-8\r\n' 'Content-Length: 7\r\n' 'Date: .+\r\n' 'Server: .+\r\n\r\n' '200: OK'), txt) def test_terminal_classes_has_status_code() -> None: terminals = set() for name in dir(web): obj = getattr(web, name) if isinstance(obj, type) and issubclass(obj, web.HTTPException): terminals.add(obj) dup = frozenset(terminals) for cls1 in dup: for cls2 in dup: if cls1 in cls2.__bases__: terminals.discard(cls1) for cls in terminals: assert cls.status_code is not None codes = collections.Counter(cls.status_code for cls in terminals) assert None not in codes assert 1 == codes.most_common(1)[0][1] async def test_HTTPFound(buf, http_request) -> None: resp = web.HTTPFound(location='/redirect') assert '/redirect' == resp.location assert '/redirect' == resp.headers['location'] await resp.prepare(http_request) await resp.write_eof() txt = buf.decode('utf8') assert re.match('HTTP/1.1 
302 Found\r\n' 'Content-Type: text/plain; charset=utf-8\r\n' 'Location: /redirect\r\n' 'Content-Length: 10\r\n' 'Date: .+\r\n' 'Server: .+\r\n\r\n' '302: Found', txt) def test_HTTPFound_empty_location() -> None: with pytest.raises(ValueError): web.HTTPFound(location='') with pytest.raises(ValueError): web.HTTPFound(location=None) def test_HTTPFound_location_CRLF() -> None: exc = web.HTTPFound(location='/redirect\r\n') assert '\r\n' not in exc.headers['Location'] async def test_HTTPMethodNotAllowed(buf, http_request) -> None: resp = web.HTTPMethodNotAllowed('get', ['POST', 'PUT']) assert 'GET' == resp.method assert {'POST', 'PUT'} == resp.allowed_methods assert 'POST,PUT' == resp.headers['allow'] await resp.prepare(http_request) await resp.write_eof() txt = buf.decode('utf8') assert re.match('HTTP/1.1 405 Method Not Allowed\r\n' 'Content-Type: text/plain; charset=utf-8\r\n' 'Allow: POST,PUT\r\n' 'Content-Length: 23\r\n' 'Date: .+\r\n' 'Server: .+\r\n\r\n' '405: Method Not Allowed', txt) def test_override_body_with_text() -> None: resp = web.HTTPNotFound(text="Page not found") assert 404 == resp.status assert "Page not found".encode('utf-8') == resp.body assert "Page not found" == resp.text assert "text/plain" == resp.content_type assert "utf-8" == resp.charset def test_override_body_with_binary() -> None: txt = "Page not found" with pytest.warns(DeprecationWarning): resp = web.HTTPNotFound(body=txt.encode('utf-8'), content_type="text/html") assert 404 == resp.status assert txt.encode('utf-8') == resp.body assert txt == resp.text assert "text/html" == resp.content_type assert resp.charset is None def test_default_body() -> None: resp = web.HTTPOk() assert b'200: OK' == resp.body def test_empty_body_204() -> None: resp = web.HTTPNoContent() assert resp.body is None def test_empty_body_205() -> None: resp = web.HTTPNoContent() assert resp.body is None def test_empty_body_304() -> None: resp = web.HTTPNoContent() resp.body is None def test_link_header_451(buf) -> None: resp = web.HTTPUnavailableForLegalReasons(link='http://warning.or.kr/') assert 'http://warning.or.kr/' == resp.link assert '; rel="blocked-by"' == resp.headers['Link'] def test_HTTPException_retains_cause() -> None: with pytest.raises(web.HTTPException) as ei: try: raise Exception('CustomException') except Exception as exc: raise web.HTTPException() from exc tb = ''.join(format_exception(ei.type, ei.value, ei.tb)) assert 'CustomException' in tb assert 'direct cause' in tb aiohttp-3.6.2/tests/test_web_functional.py0000644000175100001650000015751213547410117021252 0ustar vstsdocker00000000000000import asyncio import io import json import pathlib import socket import zlib from unittest import mock import pytest from async_generator import async_generator, yield_ from multidict import CIMultiDictProxy, MultiDict from yarl import URL import aiohttp from aiohttp import ( FormData, HttpVersion10, HttpVersion11, TraceConfig, multipart, web, ) from aiohttp.test_utils import make_mocked_coro try: import ssl except ImportError: ssl = None # type: ignore @pytest.fixture def here(): return pathlib.Path(__file__).parent @pytest.fixture def fname(here): return here / 'conftest.py' async def test_simple_get(aiohttp_client) -> None: async def handler(request): body = await request.read() assert b'' == body return web.Response(body=b'OK') app = web.Application() app.router.add_get('/', handler) client = await aiohttp_client(app) resp = await client.get('/') assert 200 == resp.status txt = await resp.text() assert 'OK' == txt async def 
test_simple_get_with_text(aiohttp_client) -> None: async def handler(request): body = await request.read() assert b'' == body return web.Response(text='OK', headers={'content-type': 'text/plain'}) app = web.Application() app.router.add_get('/', handler) client = await aiohttp_client(app) resp = await client.get('/') assert 200 == resp.status txt = await resp.text() assert 'OK' == txt async def test_handler_returns_not_response(aiohttp_server, aiohttp_client) -> None: asyncio.get_event_loop().set_debug(True) logger = mock.Mock() async def handler(request): return 'abc' app = web.Application() app.router.add_get('/', handler) server = await aiohttp_server(app, logger=logger) client = await aiohttp_client(server) with pytest.raises(aiohttp.ServerDisconnectedError): await client.get('/') logger.exception.assert_called_with('Unhandled runtime exception', exc_info=mock.ANY) async def test_handler_returns_none(aiohttp_server, aiohttp_client) -> None: asyncio.get_event_loop().set_debug(True) logger = mock.Mock() async def handler(request): return None app = web.Application() app.router.add_get('/', handler) server = await aiohttp_server(app, logger=logger) client = await aiohttp_client(server) with pytest.raises(aiohttp.ServerDisconnectedError): await client.get('/') # Actual error text is placed in exc_info logger.exception.assert_called_with('Unhandled runtime exception', exc_info=mock.ANY) async def test_head_returns_empty_body(aiohttp_client) -> None: async def handler(request): return web.Response(body=b'test') app = web.Application() app.router.add_head('/', handler) client = await aiohttp_client(app, version=HttpVersion11) resp = await client.head('/') assert 200 == resp.status txt = await resp.text() assert '' == txt async def test_response_before_complete(aiohttp_client) -> None: async def handler(request): return web.Response(body=b'OK') app = web.Application() app.router.add_post('/', handler) client = await aiohttp_client(app) data = b'0' * 1024 * 1024 resp = await client.post('/', data=data) assert 200 == resp.status text = await resp.text() assert 'OK' == text async def test_post_form(aiohttp_client) -> None: async def handler(request): data = await request.post() assert {'a': '1', 'b': '2', 'c': ''} == data return web.Response(body=b'OK') app = web.Application() app.router.add_post('/', handler) client = await aiohttp_client(app) resp = await client.post('/', data={'a': 1, 'b': 2, 'c': ''}) assert 200 == resp.status txt = await resp.text() assert 'OK' == txt async def test_post_text(aiohttp_client) -> None: async def handler(request): data = await request.text() assert 'русский' == data data2 = await request.text() assert data == data2 return web.Response(text=data) app = web.Application() app.router.add_post('/', handler) client = await aiohttp_client(app) resp = await client.post('/', data='русский') assert 200 == resp.status txt = await resp.text() assert 'русский' == txt async def test_post_json(aiohttp_client) -> None: dct = {'key': 'текст'} async def handler(request): data = await request.json() assert dct == data data2 = await request.json(loads=json.loads) assert data == data2 resp = web.Response() resp.content_type = 'application/json' resp.body = json.dumps(data).encode('utf8') return resp app = web.Application() app.router.add_post('/', handler) client = await aiohttp_client(app) headers = {'Content-Type': 'application/json'} resp = await client.post('/', data=json.dumps(dct), headers=headers) assert 200 == resp.status data = await resp.json() assert dct == data async def 
test_multipart(aiohttp_client) -> None: with multipart.MultipartWriter() as writer: writer.append('test') writer.append_json({'passed': True}) async def handler(request): reader = await request.multipart() assert isinstance(reader, multipart.MultipartReader) part = await reader.next() assert isinstance(part, multipart.BodyPartReader) thing = await part.text() assert thing == 'test' part = await reader.next() assert isinstance(part, multipart.BodyPartReader) assert part.headers['Content-Type'] == 'application/json' thing = await part.json() assert thing == {'passed': True} resp = web.Response() resp.content_type = 'application/json' resp.body = b'' return resp app = web.Application() app.router.add_post('/', handler) client = await aiohttp_client(app) resp = await client.post('/', data=writer) assert 200 == resp.status await resp.release() async def test_multipart_empty(aiohttp_client) -> None: with multipart.MultipartWriter() as writer: pass async def handler(request): reader = await request.multipart() assert isinstance(reader, multipart.MultipartReader) async for part in reader: assert False, 'Unexpected part found in reader: {!r}'.format(part) return web.Response() app = web.Application() app.router.add_post('/', handler) client = await aiohttp_client(app) resp = await client.post('/', data=writer) assert 200 == resp.status await resp.release() async def test_multipart_content_transfer_encoding(aiohttp_client) -> None: """For issue #1168""" with multipart.MultipartWriter() as writer: writer.append(b'\x00' * 10, headers={'Content-Transfer-Encoding': 'binary'}) async def handler(request): reader = await request.multipart() assert isinstance(reader, multipart.MultipartReader) part = await reader.next() assert isinstance(part, multipart.BodyPartReader) assert part.headers['Content-Transfer-Encoding'] == 'binary' thing = await part.read() assert thing == b'\x00' * 10 resp = web.Response() resp.content_type = 'application/json' resp.body = b'' return resp app = web.Application() app.router.add_post('/', handler) client = await aiohttp_client(app) resp = await client.post('/', data=writer) assert 200 == resp.status await resp.release() async def test_render_redirect(aiohttp_client) -> None: async def handler(request): raise web.HTTPMovedPermanently(location='/path') app = web.Application() app.router.add_get('/', handler) client = await aiohttp_client(app) resp = await client.get('/', allow_redirects=False) assert 301 == resp.status txt = await resp.text() assert '301: Moved Permanently' == txt assert '/path' == resp.headers['location'] async def test_post_single_file(aiohttp_client) -> None: here = pathlib.Path(__file__).parent def check_file(fs): fullname = here / fs.filename with fullname.open() as f: test_data = f.read().encode() data = fs.file.read() assert test_data == data async def handler(request): data = await request.post() assert ['data.unknown_mime_type'] == list(data.keys()) for fs in data.values(): check_file(fs) fs.file.close() resp = web.Response(body=b'OK') return resp app = web.Application() app.router.add_post('/', handler) client = await aiohttp_client(app) fname = here / 'data.unknown_mime_type' resp = await client.post('/', data=[fname.open()]) assert 200 == resp.status async def test_files_upload_with_same_key(aiohttp_client) -> None: async def handler(request): data = await request.post() files = data.getall('file') file_names = set() for _file in files: assert not _file.file.closed if _file.filename == 'test1.jpeg': assert _file.file.read() == b'binary data 1' if 
_file.filename == 'test2.jpeg': assert _file.file.read() == b'binary data 2' file_names.add(_file.filename) assert len(files) == 2 assert file_names == {'test1.jpeg', 'test2.jpeg'} resp = web.Response(body=b'OK') return resp app = web.Application() app.router.add_post('/', handler) client = await aiohttp_client(app) data = FormData() data.add_field('file', b'binary data 1', content_type='image/jpeg', filename='test1.jpeg') data.add_field('file', b'binary data 2', content_type='image/jpeg', filename='test2.jpeg') resp = await client.post('/', data=data) assert 200 == resp.status async def test_post_files(aiohttp_client) -> None: here = pathlib.Path(__file__).parent def check_file(fs): fullname = here / fs.filename with fullname.open() as f: test_data = f.read().encode() data = fs.file.read() assert test_data == data async def handler(request): data = await request.post() assert ['data.unknown_mime_type', 'conftest.py'] == list(data.keys()) for fs in data.values(): check_file(fs) fs.file.close() resp = web.Response(body=b'OK') return resp app = web.Application() app.router.add_post('/', handler) client = await aiohttp_client(app) with (here / 'data.unknown_mime_type').open() as f1: with (here / 'conftest.py').open() as f2: resp = await client.post('/', data=[f1, f2]) assert 200 == resp.status async def test_release_post_data(aiohttp_client) -> None: async def handler(request): await request.release() chunk = await request.content.readany() assert chunk == b'' return web.Response() app = web.Application() app.router.add_post('/', handler) client = await aiohttp_client(app) resp = await client.post('/', data='post text') assert 200 == resp.status async def test_POST_DATA_with_content_transfer_encoding( aiohttp_client) -> None: async def handler(request): data = await request.post() assert b'123' == data['name'] return web.Response() app = web.Application() app.router.add_post('/', handler) client = await aiohttp_client(app) form = FormData() form.add_field('name', b'123', content_transfer_encoding='base64') resp = await client.post('/', data=form) assert 200 == resp.status async def test_post_form_with_duplicate_keys(aiohttp_client) -> None: async def handler(request): data = await request.post() lst = list(data.items()) assert [('a', '1'), ('a', '2')] == lst return web.Response() app = web.Application() app.router.add_post('/', handler) client = await aiohttp_client(app) resp = await client.post('/', data=MultiDict([('a', 1), ('a', 2)])) assert 200 == resp.status def test_repr_for_application() -> None: app = web.Application() assert "<Application 0x{:x}>".format(id(app)) == repr(app) async def test_expect_default_handler_unknown(aiohttp_client) -> None: """Test default Expect handler for unknown Expect value. A server that does not understand or is unable to comply with any of the expectation values in the Expect field of a request MUST respond with appropriate error status. The server MUST respond with a 417 (Expectation Failed) status if any of the expectations cannot be met or, if there are other problems with the request, some other 4xx status.
http://www.w3.org/Protocols/rfc2616/rfc2616-sec14.html#sec14.20 """ async def handler(request): await request.post() pytest.xfail('Handler should not proceed to this point in case of ' 'unknown Expect header') app = web.Application() app.router.add_post('/', handler) client = await aiohttp_client(app) resp = await client.post('/', headers={'Expect': 'SPAM'}) assert 417 == resp.status async def test_100_continue(aiohttp_client) -> None: async def handler(request): data = await request.post() assert b'123' == data['name'] return web.Response() form = FormData() form.add_field('name', b'123', content_transfer_encoding='base64') app = web.Application() app.router.add_post('/', handler) client = await aiohttp_client(app) resp = await client.post('/', data=form, expect100=True) assert 200 == resp.status async def test_100_continue_custom(aiohttp_client) -> None: expect_received = False async def handler(request): data = await request.post() assert b'123' == data['name'] return web.Response() async def expect_handler(request): nonlocal expect_received expect_received = True if request.version == HttpVersion11: await request.writer.write(b"HTTP/1.1 100 Continue\r\n\r\n") form = FormData() form.add_field('name', b'123', content_transfer_encoding='base64') app = web.Application() app.router.add_post('/', handler, expect_handler=expect_handler) client = await aiohttp_client(app) resp = await client.post('/', data=form, expect100=True) assert 200 == resp.status assert expect_received async def test_100_continue_custom_response(aiohttp_client) -> None: async def handler(request): data = await request.post() assert b'123', data['name'] return web.Response() async def expect_handler(request): if request.version == HttpVersion11: if auth_err: raise web.HTTPForbidden() await request.writer.write(b"HTTP/1.1 100 Continue\r\n\r\n") form = FormData() form.add_field('name', b'123', content_transfer_encoding='base64') app = web.Application() app.router.add_post('/', handler, expect_handler=expect_handler) client = await aiohttp_client(app) auth_err = False resp = await client.post('/', data=form, expect100=True) assert 200 == resp.status auth_err = True resp = await client.post('/', data=form, expect100=True) assert 403 == resp.status async def test_100_continue_for_not_found(aiohttp_client) -> None: app = web.Application() client = await aiohttp_client(app) resp = await client.post('/not_found', data='data', expect100=True) assert 404 == resp.status async def test_100_continue_for_not_allowed(aiohttp_client) -> None: async def handler(request): return web.Response() app = web.Application() app.router.add_post('/', handler) client = await aiohttp_client(app) resp = await client.get('/', expect100=True) assert 405 == resp.status async def test_http11_keep_alive_default(aiohttp_client) -> None: async def handler(request): return web.Response() app = web.Application() app.router.add_get('/', handler) client = await aiohttp_client(app, version=HttpVersion11) resp = await client.get('/') assert 200 == resp.status assert resp.version == HttpVersion11 assert 'Connection' not in resp.headers @pytest.mark.xfail async def test_http10_keep_alive_default(aiohttp_client) -> None: async def handler(request): return web.Response() app = web.Application() app.router.add_get('/', handler) client = await aiohttp_client(app, version=HttpVersion10) resp = await client.get('/') assert 200 == resp.status assert resp.version == HttpVersion10 assert resp.headers['Connection'] == 'keep-alive' async def 
test_http10_keep_alive_with_headers_close(aiohttp_client) -> None: async def handler(request): await request.read() return web.Response(body=b'OK') app = web.Application() app.router.add_get('/', handler) client = await aiohttp_client(app, version=HttpVersion10) headers = {'Connection': 'close'} resp = await client.get('/', headers=headers) assert 200 == resp.status assert resp.version == HttpVersion10 assert 'Connection' not in resp.headers async def test_http10_keep_alive_with_headers(aiohttp_client) -> None: async def handler(request): await request.read() return web.Response(body=b'OK') app = web.Application() app.router.add_get('/', handler) client = await aiohttp_client(app, version=HttpVersion10) headers = {'Connection': 'keep-alive'} resp = await client.get('/', headers=headers) assert 200 == resp.status assert resp.version == HttpVersion10 assert resp.headers['Connection'] == 'keep-alive' async def test_upload_file(aiohttp_client) -> None: here = pathlib.Path(__file__).parent fname = here / 'aiohttp.png' with fname.open('rb') as f: data = f.read() async def handler(request): form = await request.post() raw_data = form['file'].file.read() assert data == raw_data return web.Response() app = web.Application() app.router.add_post('/', handler) client = await aiohttp_client(app) resp = await client.post('/', data={'file': data}) assert 200 == resp.status async def test_upload_file_object(aiohttp_client) -> None: here = pathlib.Path(__file__).parent fname = here / 'aiohttp.png' with fname.open('rb') as f: data = f.read() async def handler(request): form = await request.post() raw_data = form['file'].file.read() assert data == raw_data return web.Response() app = web.Application() app.router.add_post('/', handler) client = await aiohttp_client(app) with fname.open('rb') as f: resp = await client.post('/', data={'file': f}) assert 200 == resp.status async def test_empty_content_for_query_without_body(aiohttp_client) -> None: async def handler(request): assert not request.body_exists assert not request.can_read_body with pytest.warns(DeprecationWarning): assert not request.has_body return web.Response() app = web.Application() app.router.add_post('/', handler) client = await aiohttp_client(app) resp = await client.post('/') assert 200 == resp.status async def test_empty_content_for_query_with_body(aiohttp_client) -> None: async def handler(request): assert request.body_exists assert request.can_read_body with pytest.warns(DeprecationWarning): assert request.has_body body = await request.read() return web.Response(body=body) app = web.Application() app.router.add_post('/', handler) client = await aiohttp_client(app) resp = await client.post('/', data=b'data') assert 200 == resp.status async def test_get_with_empty_arg(aiohttp_client) -> None: async def handler(request): assert 'arg' in request.query assert '' == request.query['arg'] return web.Response() app = web.Application() app.router.add_get('/', handler) client = await aiohttp_client(app) resp = await client.get('/?arg') assert 200 == resp.status async def test_large_header(aiohttp_client) -> None: async def handler(request): return web.Response() app = web.Application() app.router.add_get('/', handler) client = await aiohttp_client(app) headers = {'Long-Header': 'ab' * 8129} resp = await client.get('/', headers=headers) assert 400 == resp.status async def test_large_header_allowed(aiohttp_client, aiohttp_server) -> None: async def handler(request): return web.Response() app = web.Application() app.router.add_post('/', handler) server 
= await aiohttp_server(app, max_field_size=81920) client = await aiohttp_client(server) headers = {'Long-Header': 'ab' * 8129} resp = await client.post('/', headers=headers) assert 200 == resp.status async def test_get_with_empty_arg_with_equal(aiohttp_client) -> None: async def handler(request): assert 'arg' in request.query assert '' == request.query['arg'] return web.Response() app = web.Application() app.router.add_get('/', handler) client = await aiohttp_client(app) resp = await client.get('/?arg=') assert 200 == resp.status async def test_response_with_async_gen(aiohttp_client, fname) -> None: with fname.open('rb') as f: data = f.read() data_size = len(data) @async_generator async def stream(f_name): with f_name.open('rb') as f: data = f.read(100) while data: await yield_(data) data = f.read(100) async def handler(request): headers = {'Content-Length': str(data_size)} return web.Response(body=stream(fname), headers=headers) app = web.Application() app.router.add_get('/', handler) client = await aiohttp_client(app) resp = await client.get('/') assert 200 == resp.status resp_data = await resp.read() assert resp_data == data assert resp.headers.get('Content-Length') == str(len(resp_data)) async def test_response_with_streamer(aiohttp_client, fname) -> None: with fname.open('rb') as f: data = f.read() data_size = len(data) with pytest.warns(DeprecationWarning): @aiohttp.streamer async def stream(writer, f_name): with f_name.open('rb') as f: data = f.read(100) while data: await writer.write(data) data = f.read(100) async def handler(request): headers = {'Content-Length': str(data_size)} return web.Response(body=stream(fname), headers=headers) app = web.Application() app.router.add_get('/', handler) client = await aiohttp_client(app) resp = await client.get('/') assert 200 == resp.status resp_data = await resp.read() assert resp_data == data assert resp.headers.get('Content-Length') == str(len(resp_data)) async def test_response_with_async_gen_no_params(aiohttp_client, fname) -> None: with fname.open('rb') as f: data = f.read() data_size = len(data) @async_generator async def stream(): with fname.open('rb') as f: data = f.read(100) while data: await yield_(data) data = f.read(100) async def handler(request): headers = {'Content-Length': str(data_size)} return web.Response(body=stream(), headers=headers) app = web.Application() app.router.add_get('/', handler) client = await aiohttp_client(app) resp = await client.get('/') assert 200 == resp.status resp_data = await resp.read() assert resp_data == data assert resp.headers.get('Content-Length') == str(len(resp_data)) async def test_response_with_streamer_no_params(aiohttp_client, fname) -> None: with fname.open('rb') as f: data = f.read() data_size = len(data) with pytest.warns(DeprecationWarning): @aiohttp.streamer async def stream(writer): with fname.open('rb') as f: data = f.read(100) while data: await writer.write(data) data = f.read(100) async def handler(request): headers = {'Content-Length': str(data_size)} return web.Response(body=stream, headers=headers) app = web.Application() app.router.add_get('/', handler) client = await aiohttp_client(app) resp = await client.get('/') assert 200 == resp.status resp_data = await resp.read() assert resp_data == data assert resp.headers.get('Content-Length') == str(len(resp_data)) async def test_response_with_file(aiohttp_client, fname) -> None: with fname.open('rb') as f: data = f.read() async def handler(request): return web.Response(body=fname.open('rb')) app = web.Application() 
app.router.add_get('/', handler) client = await aiohttp_client(app) resp = await client.get('/') assert 200 == resp.status resp_data = await resp.read() expected_content_disposition = ( 'attachment; filename="conftest.py"; filename*=utf-8\'\'conftest.py' ) assert resp_data == data assert resp.headers.get('Content-Type') in ( 'application/octet-stream', 'text/x-python', 'text/plain', ) assert resp.headers.get('Content-Length') == str(len(resp_data)) assert ( resp.headers.get('Content-Disposition') == expected_content_disposition ) async def test_response_with_file_ctype(aiohttp_client, fname) -> None: with fname.open('rb') as f: data = f.read() async def handler(request): return web.Response( body=fname.open('rb'), headers={'content-type': 'text/binary'}) app = web.Application() app.router.add_get('/', handler) client = await aiohttp_client(app) resp = await client.get('/') assert 200 == resp.status resp_data = await resp.read() expected_content_disposition = ( 'attachment; filename="conftest.py"; filename*=utf-8\'\'conftest.py' ) assert resp_data == data assert resp.headers.get('Content-Type') == 'text/binary' assert resp.headers.get('Content-Length') == str(len(resp_data)) assert ( resp.headers.get('Content-Disposition') == expected_content_disposition ) async def test_response_with_payload_disp(aiohttp_client, fname) -> None: with fname.open('rb') as f: data = f.read() async def handler(request): pl = aiohttp.get_payload(fname.open('rb')) pl.set_content_disposition('inline', filename='test.txt') return web.Response( body=pl, headers={'content-type': 'text/binary'}) app = web.Application() app.router.add_get('/', handler) client = await aiohttp_client(app) resp = await client.get('/') assert 200 == resp.status resp_data = await resp.read() assert resp_data == data assert resp.headers.get('Content-Type') == 'text/binary' assert resp.headers.get('Content-Length') == str(len(resp_data)) assert (resp.headers.get('Content-Disposition') == 'inline; filename="test.txt"; filename*=utf-8\'\'test.txt') async def test_response_with_payload_stringio(aiohttp_client, fname) -> None: async def handler(request): return web.Response(body=io.StringIO('test')) app = web.Application() app.router.add_get('/', handler) client = await aiohttp_client(app) resp = await client.get('/') assert 200 == resp.status resp_data = await resp.read() assert resp_data == b'test' async def test_response_with_precompressed_body_gzip(aiohttp_client) -> None: async def handler(request): headers = {'Content-Encoding': 'gzip'} zcomp = zlib.compressobj(wbits=16 + zlib.MAX_WBITS) data = zcomp.compress(b'mydata') + zcomp.flush() return web.Response(body=data, headers=headers) app = web.Application() app.router.add_get('/', handler) client = await aiohttp_client(app) resp = await client.get('/') assert 200 == resp.status data = await resp.read() assert b'mydata' == data assert resp.headers.get('Content-Encoding') == 'gzip' async def test_response_with_precompressed_body_deflate( aiohttp_client) -> None: async def handler(request): headers = {'Content-Encoding': 'deflate'} zcomp = zlib.compressobj(wbits=-zlib.MAX_WBITS) data = zcomp.compress(b'mydata') + zcomp.flush() return web.Response(body=data, headers=headers) app = web.Application() app.router.add_get('/', handler) client = await aiohttp_client(app) resp = await client.get('/') assert 200 == resp.status data = await resp.read() assert b'mydata' == data assert resp.headers.get('Content-Encoding') == 'deflate' async def test_bad_request_payload(aiohttp_client) -> None: async def 
handler(request): assert request.method == 'POST' with pytest.raises(aiohttp.web.RequestPayloadError): await request.content.read() return web.Response() app = web.Application() app.router.add_post('/', handler) client = await aiohttp_client(app) resp = await client.post( '/', data=b'test', headers={'content-encoding': 'gzip'}) assert 200 == resp.status async def test_stream_response_multiple_chunks(aiohttp_client) -> None: async def handler(request): resp = web.StreamResponse() resp.enable_chunked_encoding() await resp.prepare(request) await resp.write(b'x') await resp.write(b'y') await resp.write(b'z') return resp app = web.Application() app.router.add_get('/', handler) client = await aiohttp_client(app) resp = await client.get('/') assert 200 == resp.status data = await resp.read() assert b'xyz' == data async def test_start_without_routes(aiohttp_client) -> None: app = web.Application() client = await aiohttp_client(app) resp = await client.get('/') assert 404 == resp.status async def test_requests_count(aiohttp_client) -> None: async def handler(request): return web.Response() app = web.Application() app.router.add_get('/', handler) client = await aiohttp_client(app) assert client.server.handler.requests_count == 0 resp = await client.get('/') assert 200 == resp.status assert client.server.handler.requests_count == 1 resp = await client.get('/') assert 200 == resp.status assert client.server.handler.requests_count == 2 resp = await client.get('/') assert 200 == resp.status assert client.server.handler.requests_count == 3 async def test_redirect_url(aiohttp_client) -> None: async def redirector(request): raise web.HTTPFound(location=URL('/redirected')) async def redirected(request): return web.Response() app = web.Application() app.router.add_get('/redirector', redirector) app.router.add_get('/redirected', redirected) client = await aiohttp_client(app) resp = await client.get('/redirector') assert resp.status == 200 async def test_simple_subapp(aiohttp_client) -> None: async def handler(request): return web.Response(text="OK") app = web.Application() subapp = web.Application() subapp.router.add_get('/to', handler) app.add_subapp('/path', subapp) client = await aiohttp_client(app) resp = await client.get('/path/to') assert resp.status == 200 txt = await resp.text() assert 'OK' == txt async def test_subapp_reverse_url(aiohttp_client) -> None: async def handler(request): raise web.HTTPMovedPermanently( location=subapp.router['name'].url_for()) async def handler2(request): return web.Response(text="OK") app = web.Application() subapp = web.Application() subapp.router.add_get('/to', handler) subapp.router.add_get('/final', handler2, name='name') app.add_subapp('/path', subapp) client = await aiohttp_client(app) resp = await client.get('/path/to') assert resp.status == 200 txt = await resp.text() assert 'OK' == txt assert resp.url.path == '/path/final' async def test_subapp_reverse_variable_url(aiohttp_client) -> None: async def handler(request): raise web.HTTPMovedPermanently( location=subapp.router['name'].url_for(part='final')) async def handler2(request): return web.Response(text="OK") app = web.Application() subapp = web.Application() subapp.router.add_get('/to', handler) subapp.router.add_get('/{part}', handler2, name='name') app.add_subapp('/path', subapp) client = await aiohttp_client(app) resp = await client.get('/path/to') assert resp.status == 200 txt = await resp.text() assert 'OK' == txt assert resp.url.path == '/path/final' async def 
test_subapp_reverse_static_url(aiohttp_client) -> None: fname = 'aiohttp.png' async def handler(request): raise web.HTTPMovedPermanently( location=subapp.router['name'].url_for(filename=fname)) app = web.Application() subapp = web.Application() subapp.router.add_get('/to', handler) here = pathlib.Path(__file__).parent subapp.router.add_static('/static', here, name='name') app.add_subapp('/path', subapp) client = await aiohttp_client(app) resp = await client.get('/path/to') assert resp.url.path == '/path/static/' + fname assert resp.status == 200 body = await resp.read() with (here / fname).open('rb') as f: assert body == f.read() async def test_subapp_app(aiohttp_client) -> None: async def handler(request): assert request.app is subapp return web.Response(text='OK') app = web.Application() subapp = web.Application() subapp.router.add_get('/to', handler) app.add_subapp('/path/', subapp) client = await aiohttp_client(app) resp = await client.get('/path/to') assert resp.status == 200 txt = await resp.text() assert 'OK' == txt async def test_subapp_not_found(aiohttp_client) -> None: async def handler(request): return web.Response(text='OK') app = web.Application() subapp = web.Application() subapp.router.add_get('/to', handler) app.add_subapp('/path/', subapp) client = await aiohttp_client(app) resp = await client.get('/path/other') assert resp.status == 404 async def test_subapp_not_found2(aiohttp_client) -> None: async def handler(request): return web.Response(text='OK') app = web.Application() subapp = web.Application() subapp.router.add_get('/to', handler) app.add_subapp('/path/', subapp) client = await aiohttp_client(app) resp = await client.get('/invalid/other') assert resp.status == 404 async def test_subapp_not_allowed(aiohttp_client) -> None: async def handler(request): return web.Response(text='OK') app = web.Application() subapp = web.Application() subapp.router.add_get('/to', handler) app.add_subapp('/path/', subapp) client = await aiohttp_client(app) resp = await client.post('/path/to') assert resp.status == 405 assert resp.headers['Allow'] == 'GET,HEAD' async def test_subapp_cannot_add_app_in_handler(aiohttp_client) -> None: async def handler(request): request.match_info.add_app(app) return web.Response(text='OK') app = web.Application() subapp = web.Application() subapp.router.add_get('/to', handler) app.add_subapp('/path/', subapp) client = await aiohttp_client(app) resp = await client.get('/path/to') assert resp.status == 500 async def test_subapp_middlewares(aiohttp_client) -> None: order = [] async def handler(request): return web.Response(text='OK') async def middleware_factory(app, handler): async def middleware(request): order.append((1, app)) resp = await handler(request) assert 200 == resp.status order.append((2, app)) return resp return middleware app = web.Application(middlewares=[middleware_factory]) subapp1 = web.Application(middlewares=[middleware_factory]) subapp2 = web.Application(middlewares=[middleware_factory]) subapp2.router.add_get('/to', handler) with pytest.warns(DeprecationWarning): subapp1.add_subapp('/b/', subapp2) app.add_subapp('/a/', subapp1) client = await aiohttp_client(app) resp = await client.get('/a/b/to') assert resp.status == 200 assert [(1, app), (1, subapp1), (1, subapp2), (2, subapp2), (2, subapp1), (2, app)] == order async def test_subapp_on_response_prepare(aiohttp_client) -> None: order = [] async def handler(request): return web.Response(text='OK') def make_signal(app): async def on_response(request, response): order.append(app) return 
on_response app = web.Application() app.on_response_prepare.append(make_signal(app)) subapp1 = web.Application() subapp1.on_response_prepare.append(make_signal(subapp1)) subapp2 = web.Application() subapp2.on_response_prepare.append(make_signal(subapp2)) subapp2.router.add_get('/to', handler) subapp1.add_subapp('/b/', subapp2) app.add_subapp('/a/', subapp1) client = await aiohttp_client(app) resp = await client.get('/a/b/to') assert resp.status == 200 assert [app, subapp1, subapp2] == order async def test_subapp_on_startup(aiohttp_server) -> None: order = [] async def on_signal(app): order.append(app) app = web.Application() app.on_startup.append(on_signal) subapp1 = web.Application() subapp1.on_startup.append(on_signal) subapp2 = web.Application() subapp2.on_startup.append(on_signal) subapp1.add_subapp('/b/', subapp2) app.add_subapp('/a/', subapp1) await aiohttp_server(app) assert [app, subapp1, subapp2] == order async def test_subapp_on_shutdown(aiohttp_server) -> None: order = [] async def on_signal(app): order.append(app) app = web.Application() app.on_shutdown.append(on_signal) subapp1 = web.Application() subapp1.on_shutdown.append(on_signal) subapp2 = web.Application() subapp2.on_shutdown.append(on_signal) subapp1.add_subapp('/b/', subapp2) app.add_subapp('/a/', subapp1) server = await aiohttp_server(app) await server.close() assert [app, subapp1, subapp2] == order async def test_subapp_on_cleanup(aiohttp_server) -> None: order = [] async def on_signal(app): order.append(app) app = web.Application() app.on_cleanup.append(on_signal) subapp1 = web.Application() subapp1.on_cleanup.append(on_signal) subapp2 = web.Application() subapp2.on_cleanup.append(on_signal) subapp1.add_subapp('/b/', subapp2) app.add_subapp('/a/', subapp1) server = await aiohttp_server(app) await server.close() assert [app, subapp1, subapp2] == order @pytest.mark.parametrize('route,expected,middlewares', [ ('/sub/', ['A: root', 'C: sub', 'D: sub'], 'AC'), ('/', ['A: root', 'B: root'], 'AC'), ('/sub/', ['A: root', 'D: sub'], 'A'), ('/', ['A: root', 'B: root'], 'A'), ('/sub/', ['C: sub', 'D: sub'], 'C'), ('/', ['B: root'], 'C'), ('/sub/', ['D: sub'], ''), ('/', ['B: root'], ''), ]) async def test_subapp_middleware_context(aiohttp_client, route, expected, middlewares): values = [] def show_app_context(appname): @web.middleware async def middleware(request, handler): values.append('{}: {}'.format( appname, request.app['my_value'])) return await handler(request) return middleware def make_handler(appname): async def handler(request): values.append('{}: {}'.format( appname, request.app['my_value'])) return web.Response(text='Ok') return handler app = web.Application() app['my_value'] = 'root' if 'A' in middlewares: app.middlewares.append(show_app_context('A')) app.router.add_get('/', make_handler('B')) subapp = web.Application() subapp['my_value'] = 'sub' if 'C' in middlewares: subapp.middlewares.append(show_app_context('C')) subapp.router.add_get('/', make_handler('D')) app.add_subapp('/sub/', subapp) client = await aiohttp_client(app) resp = await client.get(route) assert 200 == resp.status assert 'Ok' == await resp.text() assert expected == values async def test_custom_date_header(aiohttp_client) -> None: async def handler(request): return web.Response(headers={'Date': 'Sun, 30 Oct 2016 03:13:52 GMT'}) app = web.Application() app.router.add_get('/', handler) client = await aiohttp_client(app) resp = await client.get('/') assert 200 == resp.status assert resp.headers['Date'] == 'Sun, 30 Oct 2016 03:13:52 GMT' async def 
test_response_prepared_with_clone(aiohttp_client) -> None: async def handler(request): cloned = request.clone() resp = web.StreamResponse() await resp.prepare(cloned) return resp app = web.Application() app.router.add_get('/', handler) client = await aiohttp_client(app) resp = await client.get('/') assert 200 == resp.status async def test_app_max_client_size(aiohttp_client) -> None: async def handler(request): await request.post() return web.Response(body=b'ok') max_size = 1024**2 app = web.Application() app.router.add_post('/', handler) client = await aiohttp_client(app) data = {"long_string": max_size * 'x' + 'xxx'} with pytest.warns(ResourceWarning): resp = await client.post('/', data=data) assert 413 == resp.status resp_text = await resp.text() assert 'Maximum request body size 1048576 exceeded, ' \ 'actual body size 1048591' in resp_text async def test_app_max_client_size_adjusted(aiohttp_client) -> None: async def handler(request): await request.post() return web.Response(body=b'ok') default_max_size = 1024**2 custom_max_size = default_max_size * 2 app = web.Application(client_max_size=custom_max_size) app.router.add_post('/', handler) client = await aiohttp_client(app) data = {'long_string': default_max_size * 'x' + 'xxx'} with pytest.warns(ResourceWarning): resp = await client.post('/', data=data) assert 200 == resp.status resp_text = await resp.text() assert 'ok' == resp_text too_large_data = {'log_string': custom_max_size * 'x' + "xxx"} with pytest.warns(ResourceWarning): resp = await client.post('/', data=too_large_data) assert 413 == resp.status resp_text = await resp.text() assert 'Maximum request body size 2097152 exceeded, ' \ 'actual body size 2097166' in resp_text async def test_app_max_client_size_none(aiohttp_client) -> None: async def handler(request): await request.post() return web.Response(body=b'ok') default_max_size = 1024**2 custom_max_size = None app = web.Application(client_max_size=custom_max_size) app.router.add_post('/', handler) client = await aiohttp_client(app) data = {'long_string': default_max_size * 'x' + 'xxx'} with pytest.warns(ResourceWarning): resp = await client.post('/', data=data) assert 200 == resp.status resp_text = await resp.text() assert 'ok' == resp_text too_large_data = {'log_string': default_max_size * 2 * 'x'} with pytest.warns(ResourceWarning): resp = await client.post('/', data=too_large_data) assert 200 == resp.status resp_text = await resp.text() assert resp_text == 'ok' async def test_post_max_client_size(aiohttp_client) -> None: async def handler(request): await request.post() return web.Response() app = web.Application(client_max_size=10) app.router.add_post('/', handler) client = await aiohttp_client(app) data = {'long_string': 1024 * 'x', 'file': io.BytesIO(b'test')} resp = await client.post('/', data=data) assert 413 == resp.status resp_text = await resp.text() assert 'Maximum request body size 10 exceeded, ' \ 'actual body size 1024' in resp_text async def test_post_max_client_size_for_file(aiohttp_client) -> None: async def handler(request): await request.post() return web.Response() app = web.Application(client_max_size=2) app.router.add_post('/', handler) client = await aiohttp_client(app) data = {'file': io.BytesIO(b'test')} resp = await client.post('/', data=data) assert 413 == resp.status async def test_response_with_bodypart(aiohttp_client) -> None: async def handler(request): reader = await request.multipart() part = await reader.next() return web.Response(body=part) app = web.Application(client_max_size=2) 
app.router.add_post('/', handler) client = await aiohttp_client(app) data = {'file': io.BytesIO(b'test')} resp = await client.post('/', data=data) assert 200 == resp.status body = await resp.read() assert body == b'test' disp = multipart.parse_content_disposition( resp.headers['content-disposition']) assert disp == ('attachment', {'name': 'file', 'filename': 'file', 'filename*': 'file'}) async def test_response_with_bodypart_named(aiohttp_client, tmpdir) -> None: async def handler(request): reader = await request.multipart() part = await reader.next() return web.Response(body=part) app = web.Application(client_max_size=2) app.router.add_post('/', handler) client = await aiohttp_client(app) f = tmpdir.join('foobar.txt') f.write_text('test', encoding='utf8') data = {'file': open(str(f), 'rb')} resp = await client.post('/', data=data) assert 200 == resp.status body = await resp.read() assert body == b'test' disp = multipart.parse_content_disposition( resp.headers['content-disposition']) assert disp == ( 'attachment', {'name': 'file', 'filename': 'foobar.txt', 'filename*': 'foobar.txt'} ) async def test_response_with_bodypart_invalid_name(aiohttp_client) -> None: async def handler(request): reader = await request.multipart() part = await reader.next() return web.Response(body=part) app = web.Application(client_max_size=2) app.router.add_post('/', handler) client = await aiohttp_client(app) with aiohttp.MultipartWriter() as mpwriter: mpwriter.append(b'test') resp = await client.post('/', data=mpwriter) assert 200 == resp.status body = await resp.read() assert body == b'test' assert 'content-disposition' not in resp.headers async def test_request_clone(aiohttp_client) -> None: async def handler(request): r2 = request.clone(method='POST') assert r2.method == 'POST' assert r2.match_info is request.match_info return web.Response() app = web.Application() app.router.add_get('/', handler) client = await aiohttp_client(app) resp = await client.get('/') assert 200 == resp.status async def test_await(aiohttp_server) -> None: async def handler(request): resp = web.StreamResponse(headers={'content-length': str(4)}) await resp.prepare(request) with pytest.warns(DeprecationWarning): await resp.drain() await asyncio.sleep(0.01) await resp.write(b'test') await asyncio.sleep(0.01) await resp.write_eof() return resp app = web.Application() app.router.add_route('GET', '/', handler) server = await aiohttp_server(app) async with aiohttp.ClientSession() as session: resp = await session.get(server.make_url('/')) assert resp.status == 200 assert resp.connection is not None await resp.read() await resp.release() assert resp.connection is None async def test_response_context_manager(aiohttp_server) -> None: async def handler(request): return web.Response() app = web.Application() app.router.add_route('GET', '/', handler) server = await aiohttp_server(app) resp = await aiohttp.ClientSession().get(server.make_url('/')) async with resp: assert resp.status == 200 assert resp.connection is None assert resp.connection is None async def test_response_context_manager_error(aiohttp_server) -> None: async def handler(request): return web.Response(text='some text') app = web.Application() app.router.add_route('GET', '/', handler) server = await aiohttp_server(app) session = aiohttp.ClientSession() cm = session.get(server.make_url('/')) resp = await cm with pytest.raises(RuntimeError): async with resp: assert resp.status == 200 resp.content.set_exception(RuntimeError()) await resp.read() assert resp.closed assert 
len(session._connector._conns) == 1 async def aiohttp_client_api_context_manager(aiohttp_server): async def handler(request): return web.Response() app = web.Application() app.router.add_route('GET', '/', handler) server = await aiohttp_server(app) async with aiohttp.ClientSession() as session: async with session.get(server.make_url('/')) as resp: assert resp.status == 200 assert resp.connection is None assert resp.connection is None async def test_context_manager_close_on_release(aiohttp_server, mocker) -> None: async def handler(request): resp = web.StreamResponse() await resp.prepare(request) with pytest.warns(DeprecationWarning): await resp.drain() await asyncio.sleep(10) return resp app = web.Application() app.router.add_route('GET', '/', handler) server = await aiohttp_server(app) async with aiohttp.ClientSession() as session: resp = await session.get(server.make_url('/')) proto = resp.connection._protocol mocker.spy(proto, 'close') async with resp: assert resp.status == 200 assert resp.connection is not None assert resp.connection is None assert proto.close.called async def test_iter_any(aiohttp_server) -> None: data = b'0123456789' * 1024 async def handler(request): buf = [] async for raw in request.content.iter_any(): buf.append(raw) assert b''.join(buf) == data return web.Response() app = web.Application() app.router.add_route('POST', '/', handler) server = await aiohttp_server(app) async with aiohttp.ClientSession() as session: async with session.post(server.make_url('/'), data=data) as resp: assert resp.status == 200 async def test_request_tracing(aiohttp_server) -> None: on_request_start = mock.Mock(side_effect=make_mocked_coro(mock.Mock())) on_request_end = mock.Mock(side_effect=make_mocked_coro(mock.Mock())) on_dns_resolvehost_start = mock.Mock( side_effect=make_mocked_coro(mock.Mock())) on_dns_resolvehost_end = mock.Mock( side_effect=make_mocked_coro(mock.Mock())) on_request_redirect = mock.Mock(side_effect=make_mocked_coro(mock.Mock())) on_connection_create_start = mock.Mock( side_effect=make_mocked_coro(mock.Mock())) on_connection_create_end = mock.Mock( side_effect=make_mocked_coro(mock.Mock())) async def redirector(request): raise web.HTTPFound(location=URL('/redirected')) async def redirected(request): return web.Response() trace_config = TraceConfig() trace_config.on_request_start.append(on_request_start) trace_config.on_request_end.append(on_request_end) trace_config.on_request_redirect.append(on_request_redirect) trace_config.on_connection_create_start.append( on_connection_create_start) trace_config.on_connection_create_end.append( on_connection_create_end) trace_config.on_dns_resolvehost_start.append( on_dns_resolvehost_start) trace_config.on_dns_resolvehost_end.append( on_dns_resolvehost_end) app = web.Application() app.router.add_get('/redirector', redirector) app.router.add_get('/redirected', redirected) server = await aiohttp_server(app) class FakeResolver: _LOCAL_HOST = {0: '127.0.0.1', socket.AF_INET: '127.0.0.1'} def __init__(self, fakes): """fakes -- dns -> port dict""" self._fakes = fakes self._resolver = aiohttp.DefaultResolver() async def resolve(self, host, port=0, family=socket.AF_INET): fake_port = self._fakes.get(host) if fake_port is not None: return [{'hostname': host, 'host': self._LOCAL_HOST[family], 'port': fake_port, 'family': socket.AF_INET, 'proto': 0, 'flags': socket.AI_NUMERICHOST}] else: return await self._resolver.resolve(host, port, family) resolver = FakeResolver({'example.com': server.port}) connector = 
aiohttp.TCPConnector(resolver=resolver) client = aiohttp.ClientSession(connector=connector, trace_configs=[trace_config]) await client.get('http://example.com/redirector', data="foo") assert on_request_start.called assert on_request_end.called assert on_dns_resolvehost_start.called assert on_dns_resolvehost_end.called assert on_request_redirect.called assert on_connection_create_start.called assert on_connection_create_end.called await client.close() async def test_return_http_exception_deprecated(aiohttp_client) -> None: async def handler(request): return web.HTTPForbidden() app = web.Application() app.router.add_route('GET', '/', handler) client = await aiohttp_client(app) with pytest.warns(DeprecationWarning): await client.get('/') async def test_request_path(aiohttp_client) -> None: async def handler(request): assert request.path_qs == '/path%20to?a=1' assert request.path == '/path to' assert request.raw_path == '/path%20to?a=1' return web.Response(body=b'OK') app = web.Application() app.router.add_get('/path to', handler) client = await aiohttp_client(app) resp = await client.get('/path to', params={'a': '1'}) assert 200 == resp.status txt = await resp.text() assert 'OK' == txt async def test_app_add_routes(aiohttp_client) -> None: async def handler(request): return web.Response() app = web.Application() app.add_routes([web.get('/get', handler)]) client = await aiohttp_client(app) resp = await client.get('/get') assert resp.status == 200 async def test_request_headers_type(aiohttp_client) -> None: async def handler(request): assert isinstance(request.headers, CIMultiDictProxy) return web.Response() app = web.Application() app.add_routes([web.get('/get', handler)]) client = await aiohttp_client(app) resp = await client.get('/get') assert resp.status == 200 async def test_signal_on_error_handler(aiohttp_client) -> None: async def on_prepare(request, response): response.headers['X-Custom'] = 'val' app = web.Application() app.on_response_prepare.append(on_prepare) client = await aiohttp_client(app) resp = await client.get('/') assert resp.status == 404 assert resp.headers['X-Custom'] == 'val' aiohttp-3.6.2/tests/test_web_log.py0000644000175100001650000001275513547410117017670 0ustar vstsdocker00000000000000import datetime import platform from unittest import mock import pytest import aiohttp from aiohttp.abc import AbstractAccessLogger from aiohttp.web_log import AccessLogger IS_PYPY = platform.python_implementation() == 'PyPy' def test_access_logger_format() -> None: log_format = '%T "%{ETag}o" %X {X} %%P' mock_logger = mock.Mock() access_logger = AccessLogger(mock_logger, log_format) expected = '%s "%s" %%X {X} %%%s' assert expected == access_logger._log_format @pytest.mark.skipif( IS_PYPY, reason=""" Because of patching :py:class:`datetime.datetime`, under PyPy it fails in :py:func:`isinstance` call in :py:meth:`datetime.datetime.__sub__` (called from :py:meth:`aiohttp.AccessLogger._format_t`): *** TypeError: isinstance() arg 2 must be a class, type, or tuple of classes and types (Pdb) from datetime import datetime (Pdb) isinstance(now, datetime) *** TypeError: isinstance() arg 2 must be a class, type, or tuple of classes and types (Pdb) datetime.__class__ (Pdb) isinstance(now, datetime.__class__) False Ref: https://bitbucket.org/pypy/pypy/issues/1187/call-to-isinstance-in-__sub__-self-other Ref: https://github.com/celery/celery/issues/811 Ref: https://stackoverflow.com/a/46102240/595220 """, # noqa: E501 ) def test_access_logger_atoms(mocker) -> None: utcnow = datetime.datetime(1843, 
                                1, 1, 0, 30)
    mock_datetime = mocker.patch("datetime.datetime")
    mock_getpid = mocker.patch("os.getpid")
    mock_datetime.utcnow.return_value = utcnow
    mock_getpid.return_value = 42
    log_format = '%a %t %P %r %s %b %T %Tf %D "%{H1}i" "%{H2}i"'
    mock_logger = mock.Mock()
    access_logger = AccessLogger(mock_logger, log_format)
    request = mock.Mock(headers={'H1': 'a', 'H2': 'b'},
                        method="GET", path_qs="/path",
                        version=aiohttp.HttpVersion(1, 1),
                        remote="127.0.0.2")
    response = mock.Mock(headers={}, body_length=42, status=200)
    access_logger.log(request, response, 3.1415926)
    assert not mock_logger.exception.called
    expected = ('127.0.0.2 [01/Jan/1843:00:29:56 +0000] <42> '
                'GET /path HTTP/1.1 200 42 3 3.141593 3141593 "a" "b"')
    extra = {
        'first_request_line': 'GET /path HTTP/1.1',
        'process_id': '<42>',
        'remote_address': '127.0.0.2',
        'request_start_time': '[01/Jan/1843:00:29:56 +0000]',
        'request_time': '3',
        'request_time_frac': '3.141593',
        'request_time_micro': '3141593',
        'response_size': 42,
        'response_status': 200,
        'request_header': {'H1': 'a', 'H2': 'b'},
    }
    mock_logger.info.assert_called_with(expected, extra=extra)


def test_access_logger_dicts() -> None:
    log_format = '%{User-Agent}i %{Content-Length}o %{None}i'
    mock_logger = mock.Mock()
    access_logger = AccessLogger(mock_logger, log_format)
    request = mock.Mock(headers={"User-Agent": "Mock/1.0"},
                        version=(1, 1),
                        remote="127.0.0.2")
    response = mock.Mock(headers={"Content-Length": 123})
    access_logger.log(request, response, 0.0)
    assert not mock_logger.error.called
    expected = 'Mock/1.0 123 -'
    extra = {
        'request_header': {"User-Agent": "Mock/1.0", 'None': '-'},
        'response_header': {'Content-Length': 123}
    }
    mock_logger.info.assert_called_with(expected, extra=extra)


def test_access_logger_unix_socket() -> None:
    log_format = '|%a|'
    mock_logger = mock.Mock()
    access_logger = AccessLogger(mock_logger, log_format)
    request = mock.Mock(headers={"User-Agent": "Mock/1.0"},
                        version=(1, 1),
                        remote="")
    response = mock.Mock()
    access_logger.log(request, response, 0.0)
    assert not mock_logger.error.called
    expected = '||'
    mock_logger.info.assert_called_with(expected,
                                        extra={'remote_address': ''})


def test_logger_no_message() -> None:
    mock_logger = mock.Mock()
    access_logger = AccessLogger(mock_logger,
                                 "%r %{content-type}i")
    extra_dict = {
        'first_request_line': '-',
        'request_header': {'content-type': '(no headers)'}
    }
    access_logger.log(None, None, 0.0)
    mock_logger.info.assert_called_with("- (no headers)", extra=extra_dict)


def test_logger_internal_error() -> None:
    mock_logger = mock.Mock()
    access_logger = AccessLogger(mock_logger, "%D")
    access_logger.log(None, None, 'invalid')
    mock_logger.exception.assert_called_with("Error in logging")


def test_logger_no_transport() -> None:
    mock_logger = mock.Mock()
    access_logger = AccessLogger(mock_logger, "%a")
    access_logger.log(None, None, 0)
    mock_logger.info.assert_called_with("-", extra={'remote_address': '-'})


def test_logger_abc() -> None:
    class Logger(AbstractAccessLogger):
        def log(self, request, response, time):
            1 / 0

    mock_logger = mock.Mock()
    access_logger = Logger(mock_logger, None)

    with pytest.raises(ZeroDivisionError):
        access_logger.log(None, None, None)

    class Logger(AbstractAccessLogger):
        def log(self, request, response, time):
            self.logger.info(self.log_format.format(
                request=request,
                response=response,
                time=time
            ))

    mock_logger = mock.Mock()
    access_logger = Logger(mock_logger, '{request} {response} {time}')
    access_logger.log('request', 'response', 1)
    mock_logger.info.assert_called_with('request response 1')
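The test_web_log.py cases above exercise ``AccessLogger`` format atoms and the ``AbstractAccessLogger`` ABC. For orientation only, the sketch below (not part of this distribution) shows how such a logger is typically plugged into a running application; the ``PlainAccessLogger`` name, route, and message format are illustrative assumptions, while ``access_log_class`` is the actual ``web.run_app`` parameter.

# Illustrative sketch only, not part of aiohttp's test suite: wiring a
# custom AbstractAccessLogger subclass into a running application.
import logging

from aiohttp import web
from aiohttp.abc import AbstractAccessLogger


class PlainAccessLogger(AbstractAccessLogger):
    """Log 'METHOD path -> status (elapsed)' for every handled request."""

    def log(self, request, response, time):
        # self.logger is the logger passed by aiohttp at startup.
        self.logger.info('%s %s -> %s (%.6fs)',
                         request.method, request.path,
                         response.status, time)


async def hello(request):
    return web.Response(text='hello')


if __name__ == '__main__':
    logging.basicConfig(level=logging.INFO)
    app = web.Application()
    app.router.add_get('/', hello)
    # access_log_class accepts any AbstractAccessLogger subclass.
    web.run_app(app, access_log_class=PlainAccessLogger)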
aiohttp-3.6.2/tests/test_web_middleware.py0000644000175100001650000003565313547410117021226 0ustar vstsdocker00000000000000import re import pytest from yarl import URL from aiohttp import web async def test_middleware_modifies_response(loop, aiohttp_client) -> None: async def handler(request): return web.Response(body=b'OK') @web.middleware async def middleware(request, handler): resp = await handler(request) assert 200 == resp.status resp.set_status(201) resp.text = resp.text + '[MIDDLEWARE]' return resp app = web.Application() app.middlewares.append(middleware) app.router.add_route('GET', '/', handler) client = await aiohttp_client(app) resp = await client.get('/') assert 201 == resp.status txt = await resp.text() assert 'OK[MIDDLEWARE]' == txt async def test_middleware_handles_exception(loop, aiohttp_client) -> None: async def handler(request): raise RuntimeError('Error text') @web.middleware async def middleware(request, handler): with pytest.raises(RuntimeError) as ctx: await handler(request) return web.Response(status=501, text=str(ctx.value) + '[MIDDLEWARE]') app = web.Application() app.middlewares.append(middleware) app.router.add_route('GET', '/', handler) client = await aiohttp_client(app) resp = await client.get('/') assert 501 == resp.status txt = await resp.text() assert 'Error text[MIDDLEWARE]' == txt async def test_middleware_chain(loop, aiohttp_client) -> None: async def handler(request): return web.Response(text='OK') def make_middleware(num): @web.middleware async def middleware(request, handler): resp = await handler(request) resp.text = resp.text + '[{}]'.format(num) return resp return middleware app = web.Application() app.middlewares.append(make_middleware(1)) app.middlewares.append(make_middleware(2)) app.router.add_route('GET', '/', handler) client = await aiohttp_client(app) resp = await client.get('/') assert 200 == resp.status txt = await resp.text() assert 'OK[2][1]' == txt @pytest.fixture def cli(loop, aiohttp_client): async def handler(request): return web.Response(text="OK") def wrapper(extra_middlewares): app = web.Application() app.router.add_route( 'GET', '/resource1', handler) app.router.add_route( 'GET', '/resource2/', handler) app.router.add_route( 'GET', '/resource1/a/b', handler) app.router.add_route( 'GET', '/resource2/a/b/', handler) app.router.add_route( 'GET', '/resource2/a/b%2Fc/', handler) app.middlewares.extend(extra_middlewares) return aiohttp_client(app, server_kwargs={'skip_url_asserts': True}) return wrapper class TestNormalizePathMiddleware: @pytest.mark.parametrize("path, status", [ ('/resource1', 200), ('/resource1/', 404), ('/resource2', 200), ('/resource2/', 200), ('/resource1?p1=1&p2=2', 200), ('/resource1/?p1=1&p2=2', 404), ('/resource2?p1=1&p2=2', 200), ('/resource2/?p1=1&p2=2', 200), ('/resource2/a/b%2Fc', 200), ('/resource2/a/b%2Fc/', 200) ]) async def test_add_trailing_when_necessary( self, path, status, cli): extra_middlewares = [ web.normalize_path_middleware(merge_slashes=False)] client = await cli(extra_middlewares) resp = await client.get(path) assert resp.status == status assert resp.url.query == URL(path).query @pytest.mark.parametrize("path, status", [ ('/resource1', 200), ('/resource1/', 200), ('/resource2', 404), ('/resource2/', 200), ('/resource1?p1=1&p2=2', 200), ('/resource1/?p1=1&p2=2', 200), ('/resource2?p1=1&p2=2', 404), ('/resource2/?p1=1&p2=2', 200), ('/resource2/a/b%2Fc', 404), ('/resource2/a/b%2Fc/', 200) ]) async def test_remove_trailing_when_necessary(self, path, status, cli) -> None: extra_middlewares = [ 
web.normalize_path_middleware( append_slash=False, remove_slash=True, merge_slashes=False)] client = await cli(extra_middlewares) resp = await client.get(path) assert resp.status == status assert resp.url.query == URL(path).query @pytest.mark.parametrize("path, status", [ ('/resource1', 200), ('/resource1/', 404), ('/resource2', 404), ('/resource2/', 200), ('/resource1?p1=1&p2=2', 200), ('/resource1/?p1=1&p2=2', 404), ('/resource2?p1=1&p2=2', 404), ('/resource2/?p1=1&p2=2', 200), ('/resource2/a/b%2Fc', 404), ('/resource2/a/b%2Fc/', 200) ]) async def test_no_trailing_slash_when_disabled( self, path, status, cli): extra_middlewares = [ web.normalize_path_middleware( append_slash=False, merge_slashes=False)] client = await cli(extra_middlewares) resp = await client.get(path) assert resp.status == status assert resp.url.query == URL(path).query @pytest.mark.parametrize("path, status", [ ('/resource1/a/b', 200), ('//resource1//a//b', 200), ('//resource1//a//b/', 404), ('///resource1//a//b', 200), ('/////resource1/a///b', 200), ('/////resource1/a//b/', 404), ('/resource1/a/b?p=1', 200), ('//resource1//a//b?p=1', 200), ('//resource1//a//b/?p=1', 404), ('///resource1//a//b?p=1', 200), ('/////resource1/a///b?p=1', 200), ('/////resource1/a//b/?p=1', 404), ]) async def test_merge_slash(self, path, status, cli) -> None: extra_middlewares = [ web.normalize_path_middleware(append_slash=False)] client = await cli(extra_middlewares) resp = await client.get(path) assert resp.status == status assert resp.url.query == URL(path).query @pytest.mark.parametrize("path, status", [ ('/resource1/a/b', 200), ('/resource1/a/b/', 404), ('//resource2//a//b', 200), ('//resource2//a//b/', 200), ('///resource1//a//b', 200), ('///resource1//a//b/', 404), ('/////resource1/a///b', 200), ('/////resource1/a///b/', 404), ('/resource2/a/b', 200), ('//resource2//a//b', 200), ('//resource2//a//b/', 200), ('///resource2//a//b', 200), ('///resource2//a//b/', 200), ('/////resource2/a///b', 200), ('/////resource2/a///b/', 200), ('/resource1/a/b?p=1', 200), ('/resource1/a/b/?p=1', 404), ('//resource2//a//b?p=1', 200), ('//resource2//a//b/?p=1', 200), ('///resource1//a//b?p=1', 200), ('///resource1//a//b/?p=1', 404), ('/////resource1/a///b?p=1', 200), ('/////resource1/a///b/?p=1', 404), ('/resource2/a/b?p=1', 200), ('//resource2//a//b?p=1', 200), ('//resource2//a//b/?p=1', 200), ('///resource2//a//b?p=1', 200), ('///resource2//a//b/?p=1', 200), ('/////resource2/a///b?p=1', 200), ('/////resource2/a///b/?p=1', 200) ]) async def test_append_and_merge_slash(self, path, status, cli) -> None: extra_middlewares = [ web.normalize_path_middleware()] client = await cli(extra_middlewares) resp = await client.get(path) assert resp.status == status assert resp.url.query == URL(path).query @pytest.mark.parametrize("path, status", [ ('/resource1/a/b', 200), ('/resource1/a/b/', 200), ('//resource2//a//b', 404), ('//resource2//a//b/', 200), ('///resource1//a//b', 200), ('///resource1//a//b/', 200), ('/////resource1/a///b', 200), ('/////resource1/a///b/', 200), ('/////resource1/a///b///', 200), ('/resource2/a/b', 404), ('//resource2//a//b', 404), ('//resource2//a//b/', 200), ('///resource2//a//b', 404), ('///resource2//a//b/', 200), ('/////resource2/a///b', 404), ('/////resource2/a///b/', 200), ('/resource1/a/b?p=1', 200), ('/resource1/a/b/?p=1', 200), ('//resource2//a//b?p=1', 404), ('//resource2//a//b/?p=1', 200), ('///resource1//a//b?p=1', 200), ('///resource1//a//b/?p=1', 200), ('/////resource1/a///b?p=1', 200), ('/////resource1/a///b/?p=1', 200), 
('/resource2/a/b?p=1', 404), ('//resource2//a//b?p=1', 404), ('//resource2//a//b/?p=1', 200), ('///resource2//a//b?p=1', 404), ('///resource2//a//b/?p=1', 200), ('/////resource2/a///b?p=1', 404), ('/////resource2/a///b/?p=1', 200) ]) async def test_remove_and_merge_slash(self, path, status, cli) -> None: extra_middlewares = [ web.normalize_path_middleware( append_slash=False, remove_slash=True)] client = await cli(extra_middlewares) resp = await client.get(path) assert resp.status == status assert resp.url.query == URL(path).query async def test_cannot_remove_and_add_slash(self) -> None: with pytest.raises(AssertionError): web.normalize_path_middleware(append_slash=True, remove_slash=True) async def test_old_style_middleware(loop, aiohttp_client) -> None: async def handler(request): return web.Response(body=b'OK') async def middleware_factory(app, handler): async def middleware(request): resp = await handler(request) assert 200 == resp.status resp.set_status(201) resp.text = resp.text + '[old style middleware]' return resp return middleware with pytest.warns(DeprecationWarning) as warning_checker: app = web.Application() app.middlewares.append(middleware_factory) app.router.add_route('GET', '/', handler) client = await aiohttp_client(app) resp = await client.get('/') assert 201 == resp.status txt = await resp.text() assert 'OK[old style middleware]' == txt assert len(warning_checker) == 1 msg = str(warning_checker.list[0].message) assert re.match('^old-style middleware ' '".' 'middleware_factory at 0x[0-9a-fA-F]+>" ' 'deprecated, see #2252$', msg) async def test_mixed_middleware(loop, aiohttp_client) -> None: async def handler(request): return web.Response(body=b'OK') async def m_old1(app, handler): async def middleware(request): resp = await handler(request) resp.text += '[old style 1]' return resp return middleware @web.middleware async def m_new1(request, handler): resp = await handler(request) resp.text += '[new style 1]' return resp async def m_old2(app, handler): async def middleware(request): resp = await handler(request) resp.text += '[old style 2]' return resp return middleware @web.middleware async def m_new2(request, handler): resp = await handler(request) resp.text += '[new style 2]' return resp middlewares = m_old1, m_new1, m_old2, m_new2 with pytest.warns(DeprecationWarning) as w: app = web.Application(middlewares=middlewares) app.router.add_route('GET', '/', handler) client = await aiohttp_client(app) resp = await client.get('/') assert 200 == resp.status txt = await resp.text() assert 'OK[new style 2][old style 2][new style 1][old style 1]' == txt assert len(w) == 2 tmpl = ('^old-style middleware ' '".' 
'{} at 0x[0-9a-fA-F]+>" ' 'deprecated, see #2252$') p1 = tmpl.format('m_old1') p2 = tmpl.format('m_old2') assert re.match(p2, str(w.list[0].message)) assert re.match(p1, str(w.list[1].message)) async def test_old_style_middleware_class(loop, aiohttp_client) -> None: async def handler(request): return web.Response(body=b'OK') class Middleware: async def __call__(self, app, handler): async def middleware(request): resp = await handler(request) assert 200 == resp.status resp.set_status(201) resp.text = resp.text + '[old style middleware]' return resp return middleware with pytest.warns(DeprecationWarning) as warning_checker: app = web.Application() app.middlewares.append(Middleware()) app.router.add_route('GET', '/', handler) client = await aiohttp_client(app) resp = await client.get('/') assert 201 == resp.status txt = await resp.text() assert 'OK[old style middleware]' == txt assert len(warning_checker) == 1 msg = str(warning_checker.list[0].message) assert re.match('^old-style middleware ' '".Middleware object ' 'at 0x[0-9a-fA-F]+>" deprecated, see #2252$', msg) async def test_new_style_middleware_class(loop, aiohttp_client) -> None: async def handler(request): return web.Response(body=b'OK') @web.middleware class Middleware: async def __call__(self, request, handler): resp = await handler(request) assert 200 == resp.status resp.set_status(201) resp.text = resp.text + '[new style middleware]' return resp with pytest.warns(None) as warning_checker: app = web.Application() app.middlewares.append(Middleware()) app.router.add_route('GET', '/', handler) client = await aiohttp_client(app) resp = await client.get('/') assert 201 == resp.status txt = await resp.text() assert 'OK[new style middleware]' == txt assert len(warning_checker) == 0 async def test_new_style_middleware_method(loop, aiohttp_client) -> None: async def handler(request): return web.Response(body=b'OK') class Middleware: @web.middleware async def call(self, request, handler): resp = await handler(request) assert 200 == resp.status resp.set_status(201) resp.text = resp.text + '[new style middleware]' return resp with pytest.warns(None) as warning_checker: app = web.Application() app.middlewares.append(Middleware().call) app.router.add_route('GET', '/', handler) client = await aiohttp_client(app) resp = await client.get('/') assert 201 == resp.status txt = await resp.text() assert 'OK[new style middleware]' == txt assert len(warning_checker) == 0 aiohttp-3.6.2/tests/test_web_protocol.py0000644000175100001650000005264513547410117020752 0ustar vstsdocker00000000000000"""Tests for aiohttp/server.py""" import asyncio import platform import socket from functools import partial from unittest import mock import pytest from aiohttp import helpers, http, streams, web IS_MACOS = platform.system() == 'Darwin' @pytest.fixture def make_srv(loop, manager): srv = None def maker(*, cls=web.RequestHandler, **kwargs): nonlocal srv m = kwargs.pop('manager', manager) srv = cls(m, loop=loop, access_log=None, **kwargs) return srv yield maker if srv is not None: if srv.transport is not None: srv.connection_lost(None) @pytest.fixture def manager(request_handler, loop): async def maker(): return web.Server(request_handler) return loop.run_until_complete(maker()) @pytest.fixture def srv(make_srv, transport): srv = make_srv() srv.connection_made(transport) transport.close.side_effect = partial(srv.connection_lost, None) with mock.patch.object( web.RequestHandler, '_drain_helper', side_effect=helpers.noop ): yield srv @pytest.fixture def buf(): return 
bytearray() @pytest.fixture def request_handler(): async def handler(request): return web.Response() m = mock.Mock() m.side_effect = handler return m @pytest.fixture def handle_with_error(): def wrapper(exc=ValueError): async def handle(request): raise exc h = mock.Mock() h.side_effect = handle return h return wrapper @pytest.fixture def writer(srv): return http.StreamWriter(srv, srv.transport, srv._loop) @pytest.fixture def transport(buf): transport = mock.Mock() def write(chunk): buf.extend(chunk) transport.write.side_effect = write transport.is_closing.return_value = False return transport @pytest.fixture def ceil(mocker): def ceil(val): return val mocker.patch('aiohttp.helpers.ceil').side_effect = ceil async def test_shutdown(srv, transport) -> None: loop = asyncio.get_event_loop() assert transport is srv.transport srv._keepalive = True task_handler = srv._task_handler assert srv._waiter is not None assert srv._task_handler is not None t0 = loop.time() await srv.shutdown() t1 = loop.time() assert t1 - t0 < 0.05, t1-t0 assert transport.close.called assert srv.transport is None assert not srv._task_handler await asyncio.sleep(0.1) assert task_handler.done() async def test_double_shutdown(srv, transport) -> None: await srv.shutdown() assert transport.close.called assert srv.transport is None transport.reset_mock() await srv.shutdown() assert not transport.close.called assert srv.transport is None async def test_shutdown_wait_error_handler(srv, transport) -> None: loop = asyncio.get_event_loop() async def _error_handle(): pass srv._error_handler = loop.create_task(_error_handle()) await srv.shutdown() assert srv._error_handler.done() async def test_close_after_response(srv, transport) -> None: srv.data_received( b'GET / HTTP/1.0\r\n' b'Host: example.com\r\n' b'Content-Length: 0\r\n\r\n') h = srv._task_handler await asyncio.sleep(0.1) assert srv._waiter is None assert srv._task_handler is None assert transport.close.called assert srv.transport is None assert h.done() def test_connection_made(make_srv) -> None: srv = make_srv() srv.connection_made(mock.Mock()) assert not srv._force_close def test_connection_made_with_tcp_keepaplive(make_srv, transport) -> None: srv = make_srv() sock = mock.Mock() transport.get_extra_info.return_value = sock srv.connection_made(transport) sock.setsockopt.assert_called_with(socket.SOL_SOCKET, socket.SO_KEEPALIVE, 1) def test_connection_made_without_tcp_keepaplive(make_srv) -> None: srv = make_srv(tcp_keepalive=False) sock = mock.Mock() transport = mock.Mock() transport.get_extra_info.return_value = sock srv.connection_made(transport) assert not sock.setsockopt.called def test_eof_received(make_srv) -> None: srv = make_srv() srv.connection_made(mock.Mock()) srv.eof_received() # assert srv.reader._eof async def test_connection_lost(srv) -> None: srv.data_received( b'GET / HTTP/1.1\r\n' b'Host: example.com\r\n' b'Content-Length: 0\r\n\r\n') srv._keepalive = True handle = srv._task_handler await asyncio.sleep(0) # wait for .start() starting srv.connection_lost(None) assert srv._force_close await handle assert not srv._task_handler def test_srv_keep_alive(srv) -> None: assert not srv._keepalive srv.keep_alive(True) assert srv._keepalive srv.keep_alive(False) assert not srv._keepalive def test_srv_keep_alive_disable(srv) -> None: handle = srv._keepalive_handle = mock.Mock() srv.keep_alive(False) assert not srv._keepalive assert srv._keepalive_handle is None handle.cancel.assert_called_with() async def test_simple(srv, buf) -> None: srv.data_received( b'GET / 
HTTP/1.1\r\n\r\n') await asyncio.sleep(0.05) assert buf.startswith(b'HTTP/1.1 200 OK\r\n') async def test_bad_method(srv, buf) -> None: srv.data_received( b':BAD; / HTTP/1.0\r\n' b'Host: example.com\r\n\r\n') await asyncio.sleep(0) assert buf.startswith(b'HTTP/1.0 400 Bad Request\r\n') async def test_data_received_error(srv, buf) -> None: transport = srv.transport srv._request_parser = mock.Mock() srv._request_parser.feed_data.side_effect = TypeError srv.data_received( b'!@#$ / HTTP/1.0\r\n' b'Host: example.com\r\n\r\n') await asyncio.sleep(0) assert buf.startswith(b'HTTP/1.0 500 Internal Server Error\r\n') assert transport.close.called assert srv._error_handler is None async def test_line_too_long(srv, buf) -> None: srv.data_received(b''.join([b'a' for _ in range(10000)]) + b'\r\n\r\n') await asyncio.sleep(0) assert buf.startswith(b'HTTP/1.0 400 Bad Request\r\n') async def test_invalid_content_length(srv, buf) -> None: srv.data_received( b'GET / HTTP/1.0\r\n' b'Host: example.com\r\n' b'Content-Length: sdgg\r\n\r\n') await asyncio.sleep(0) assert buf.startswith(b'HTTP/1.0 400 Bad Request\r\n') async def test_unhandled_runtime_error( make_srv, transport, request_handler ): async def handle(request): resp = web.Response() resp.write_eof = mock.Mock() resp.write_eof.side_effect = RuntimeError return resp srv = make_srv(lingering_time=0) srv.debug = True srv.connection_made(transport) srv.logger.exception = mock.Mock() request_handler.side_effect = handle srv.data_received( b'GET / HTTP/1.0\r\n' b'Host: example.com\r\n' b'Content-Length: 0\r\n\r\n') await srv._task_handler assert request_handler.called srv.logger.exception.assert_called_with( "Unhandled runtime exception", exc_info=mock.ANY) async def test_handle_uncompleted( make_srv, transport, handle_with_error, request_handler): closed = False def close(): nonlocal closed closed = True transport.close.side_effect = close srv = make_srv(lingering_time=0) srv.connection_made(transport) srv.logger.exception = mock.Mock() request_handler.side_effect = handle_with_error() srv.data_received( b'GET / HTTP/1.0\r\n' b'Host: example.com\r\n' b'Content-Length: 50000\r\n\r\n') await srv._task_handler assert request_handler.called assert closed srv.logger.exception.assert_called_with( "Error handling request", exc_info=mock.ANY) @pytest.mark.xfail( IS_MACOS, raises=TypeError, reason='Intermittently fails on macOS', ) async def test_handle_uncompleted_pipe( make_srv, transport, request_handler, handle_with_error): closed = False normal_completed = False def close(): nonlocal closed closed = True transport.close.side_effect = close srv = make_srv(lingering_time=0) srv.connection_made(transport) srv.logger.exception = mock.Mock() async def handle(request): nonlocal normal_completed normal_completed = True await asyncio.sleep(0.05) return web.Response() # normal request_handler.side_effect = handle srv.data_received( b'GET / HTTP/1.1\r\n' b'Host: example.com\r\n' b'Content-Length: 0\r\n\r\n') await asyncio.sleep(0) # with exception request_handler.side_effect = handle_with_error() srv.data_received( b'GET / HTTP/1.1\r\n' b'Host: example.com\r\n' b'Content-Length: 50000\r\n\r\n') assert srv._task_handler await asyncio.sleep(0) await srv._task_handler assert normal_completed assert request_handler.called assert closed srv.logger.exception.assert_called_with( "Error handling request", exc_info=mock.ANY) async def test_lingering(srv, transport) -> None: assert not transport.close.called async def handle(message, request, writer): pass with mock.patch.object( 
web.RequestHandler, 'handle_request', create=True, new=handle ): srv.data_received( b'GET / HTTP/1.0\r\n' b'Host: example.com\r\n' b'Content-Length: 3\r\n\r\n') await asyncio.sleep(0.05) assert not transport.close.called srv.data_received(b'123') await asyncio.sleep(0) transport.close.assert_called_with() async def test_lingering_disabled(make_srv, transport, request_handler) -> None: async def handle_request(request): await asyncio.sleep(0) srv = make_srv(lingering_time=0) srv.connection_made(transport) request_handler.side_effect = handle_request await asyncio.sleep(0) assert not transport.close.called srv.data_received( b'GET / HTTP/1.0\r\n' b'Host: example.com\r\n' b'Content-Length: 50\r\n\r\n') await asyncio.sleep(0) assert not transport.close.called await asyncio.sleep(0.05) transport.close.assert_called_with() async def test_lingering_timeout( make_srv, transport, ceil, request_handler ): async def handle_request(request): await asyncio.sleep(0) srv = make_srv(lingering_time=1e-30) srv.connection_made(transport) request_handler.side_effect = handle_request await asyncio.sleep(0.05) assert not transport.close.called srv.data_received( b'GET / HTTP/1.0\r\n' b'Host: example.com\r\n' b'Content-Length: 50\r\n\r\n') await asyncio.sleep(0) assert not transport.close.called await asyncio.sleep(0.05) transport.close.assert_called_with() async def test_handle_payload_access_error( make_srv, transport, request_handler ): srv = make_srv(lingering_time=0) srv.connection_made(transport) srv.data_received( b'POST /test HTTP/1.1\r\n' b'Content-Length: 9\r\n\r\n' b'some data' ) # start request_handler task await asyncio.sleep(0.05) with pytest.raises(web.PayloadAccessError): await request_handler.call_args[0][0].content.read() async def test_handle_cancel(make_srv, transport) -> None: log = mock.Mock() srv = make_srv(logger=log, debug=True) srv.connection_made(transport) async def handle_request(message, payload, writer): await asyncio.sleep(10) async def cancel(): srv._task_handler.cancel() with mock.patch.object( web.RequestHandler, 'handle_request', create=True, new=handle_request ): srv.data_received( b'GET / HTTP/1.0\r\n' b'Content-Length: 10\r\n' b'Host: example.com\r\n\r\n') await asyncio.gather(srv._task_handler, cancel()) assert log.debug.called async def test_handle_cancelled(make_srv, transport) -> None: log = mock.Mock() srv = make_srv(logger=log, debug=True) srv.connection_made(transport) # start request_handler task await asyncio.sleep(0) srv.data_received( b'GET / HTTP/1.0\r\n' b'Host: example.com\r\n\r\n') r_handler = srv._task_handler assert (await r_handler) is None async def test_handle_400(srv, buf, transport) -> None: srv.data_received(b'GET / HT/asd\r\n\r\n') await asyncio.sleep(0) assert b'400 Bad Request' in buf async def test_keep_alive(make_srv, transport, ceil) -> None: loop = asyncio.get_event_loop() srv = make_srv(keepalive_timeout=0.05) future = loop.create_future() future.set_result(1) with mock.patch.object( web.RequestHandler, 'KEEPALIVE_RESCHEDULE_DELAY', new=0.1 ), mock.patch.object( web.RequestHandler, 'handle_request', create=True, return_value=future ): srv.connection_made(transport) srv.keep_alive(True) srv.data_received( b'GET / HTTP/1.1\r\n' b'Host: example.com\r\n' b'Content-Length: 0\r\n\r\n') waiter = None while waiter is None: await asyncio.sleep(0) waiter = srv._waiter assert srv._keepalive_handle is not None assert not transport.close.called await asyncio.sleep(0.2) assert transport.close.called assert waiter.cancelled async def 
test_srv_process_request_without_timeout(make_srv, transport) -> None: srv = make_srv() srv.connection_made(transport) srv.data_received( b'GET / HTTP/1.0\r\n' b'Host: example.com\r\n\r\n') await srv._task_handler assert transport.close.called def test_keep_alive_timeout_default(srv) -> None: assert 75 == srv.keepalive_timeout def test_keep_alive_timeout_nondefault(make_srv) -> None: srv = make_srv(keepalive_timeout=10) assert 10 == srv.keepalive_timeout async def test_supports_connect_method(srv, transport, request_handler) -> None: srv.data_received( b'CONNECT aiohttp.readthedocs.org:80 HTTP/1.0\r\n' b'Content-Length: 0\r\n\r\n') await asyncio.sleep(0.1) assert request_handler.called assert isinstance( request_handler.call_args[0][0].content, streams.StreamReader) async def test_content_length_0(srv, request_handler) -> None: srv.data_received( b'GET / HTTP/1.1\r\n' b'Host: example.org\r\n' b'Content-Length: 0\r\n\r\n') await asyncio.sleep(0) assert request_handler.called assert request_handler.call_args[0][0].content == streams.EMPTY_PAYLOAD def test_rudimentary_transport(srv) -> None: transport = mock.Mock() srv.connection_made(transport) srv.pause_reading() assert srv._reading_paused assert transport.pause_reading.called srv.resume_reading() assert not srv._reading_paused assert transport.resume_reading.called transport.resume_reading.side_effect = NotImplementedError() transport.pause_reading.side_effect = NotImplementedError() srv._reading_paused = False srv.pause_reading() assert srv._reading_paused srv.resume_reading() assert not srv._reading_paused async def test_close(srv, transport) -> None: transport.close.side_effect = partial(srv.connection_lost, None) srv.connection_made(transport) await asyncio.sleep(0) handle_request = mock.Mock() handle_request.side_effect = helpers.noop with mock.patch.object( web.RequestHandler, 'handle_request', create=True, new=handle_request ): assert transport is srv.transport srv._keepalive = True srv.data_received( b'GET / HTTP/1.1\r\n' b'Host: example.com\r\n' b'Content-Length: 0\r\n\r\n' b'GET / HTTP/1.1\r\n' b'Host: example.com\r\n' b'Content-Length: 0\r\n\r\n') await asyncio.sleep(0.05) assert srv._task_handler assert srv._waiter srv.close() await asyncio.sleep(0) assert srv._task_handler is None assert srv.transport is None assert transport.close.called async def test_pipeline_multiple_messages( srv, transport, request_handler ): transport.close.side_effect = partial(srv.connection_lost, None) processed = 0 async def handle(request): nonlocal processed processed += 1 return web.Response() request_handler.side_effect = handle assert transport is srv.transport srv._keepalive = True srv.data_received( b'GET / HTTP/1.1\r\n' b'Host: example.com\r\n' b'Content-Length: 0\r\n\r\n' b'GET / HTTP/1.1\r\n' b'Host: example.com\r\n' b'Content-Length: 0\r\n\r\n') assert srv._task_handler is not None assert len(srv._messages) == 2 assert srv._waiter is not None await asyncio.sleep(0.05) assert srv._task_handler is not None assert srv._waiter is not None assert processed == 2 async def test_pipeline_response_order( srv, buf, transport, request_handler ): transport.close.side_effect = partial(srv.connection_lost, None) srv._keepalive = True processed = [] async def handle1(request): nonlocal processed await asyncio.sleep(0.01) resp = web.StreamResponse() await resp.prepare(request) await resp.write(b'test1') await resp.write_eof() processed.append(1) return resp request_handler.side_effect = handle1 srv.data_received( b'GET / HTTP/1.1\r\n' b'Host: 
example.com\r\n' b'Content-Length: 0\r\n\r\n') await asyncio.sleep(0) # second async def handle2(request): nonlocal processed resp = web.StreamResponse() await resp.prepare(request) await resp.write(b'test2') await resp.write_eof() processed.append(2) return resp request_handler.side_effect = handle2 srv.data_received( b'GET / HTTP/1.1\r\n' b'Host: example.com\r\n' b'Content-Length: 0\r\n\r\n') await asyncio.sleep(0) assert srv._task_handler is not None await asyncio.sleep(0.1) assert processed == [1, 2] def test_data_received_close(srv) -> None: srv.close() srv.data_received( b'GET / HTTP/1.1\r\n' b'Host: example.com\r\n' b'Content-Length: 0\r\n\r\n') assert not srv._messages def test_data_received_force_close(srv) -> None: srv.force_close() srv.data_received( b'GET / HTTP/1.1\r\n' b'Host: example.com\r\n' b'Content-Length: 0\r\n\r\n') assert not srv._messages async def test__process_keepalive(srv) -> None: loop = asyncio.get_event_loop() # wait till the waiter is waiting await asyncio.sleep(0) assert srv._waiter is not None srv._keepalive_time = 1 srv._keepalive = True srv._keepalive_timeout = 1 expired_time = srv._keepalive_time + srv._keepalive_timeout + 1 with mock.patch.object(loop, "time", return_value=expired_time): srv._process_keepalive() assert srv._force_close async def test__process_keepalive_schedule_next(srv) -> None: loop = asyncio.get_event_loop() # wait till the waiter is waiting await asyncio.sleep(0) srv._keepalive = True srv._keepalive_time = 1 srv._keepalive_timeout = 1 expire_time = srv._keepalive_time + srv._keepalive_timeout with mock.patch.object(loop, "time", return_value=expire_time): with mock.patch.object(loop, "call_later") as call_later_patched: srv._process_keepalive() call_later_patched.assert_called_with( 1, srv._process_keepalive ) async def test__process_keepalive_force_close(srv) -> None: loop = asyncio.get_event_loop() srv._force_close = True with mock.patch.object(loop, "call_at") as call_at_patched: srv._process_keepalive() assert not call_at_patched.called async def test_two_data_received_without_waking_up_start_task(srv) -> None: # make a chance to srv.start() method start waiting for srv._waiter await asyncio.sleep(0.01) assert srv._waiter is not None srv.data_received( b'GET / HTTP/1.1\r\n' b'Host: ex.com\r\n' b'Content-Length: 1\r\n\r\n' b'a') srv.data_received( b'GET / HTTP/1.1\r\n' b'Host: ex.com\r\n' b'Content-Length: 1\r\n\r\n' b'b') assert len(srv._messages) == 2 assert srv._waiter.done() await asyncio.sleep(0.01) async def test_client_disconnect(aiohttp_server) -> None: async def handler(request): await request.content.read(10) return web.Response() logger = mock.Mock() app = web.Application() app._debug = True app.router.add_route('POST', '/', handler) server = await aiohttp_server(app, logger=logger) if helpers.PY_38: writer = await asyncio.connect('127.0.0.1', server.port) else: _, writer = await asyncio.open_connection('127.0.0.1', server.port) writer.write("""POST / HTTP/1.1\r Connection: keep-alive\r Content-Length: 10\r Host: localhost:{port}\r \r """.format(port=server.port).encode("ascii")) await writer.drain() await asyncio.sleep(0.1) writer.write(b"x") writer.close() await asyncio.sleep(0.1) logger.debug.assert_called_with('Ignored premature client disconnection 2') aiohttp-3.6.2/tests/test_web_request.py0000644000175100001650000005407013547410117020573 0ustar vstsdocker00000000000000import asyncio import socket from collections.abc import MutableMapping from unittest import mock import pytest from multidict import CIMultiDict, 
CIMultiDictProxy, MultiDict from yarl import URL from aiohttp import HttpVersion from aiohttp.helpers import DEBUG from aiohttp.http_parser import RawRequestMessage from aiohttp.streams import StreamReader from aiohttp.test_utils import make_mocked_request from aiohttp.web import BaseRequest, HTTPRequestEntityTooLarge @pytest.fixture def protocol(): return mock.Mock(_reading_paused=False) def test_base_ctor() -> None: message = RawRequestMessage( 'GET', '/path/to?a=1&b=2', HttpVersion(1, 1), CIMultiDictProxy(CIMultiDict()), (), False, False, False, False, URL('/path/to?a=1&b=2')) req = BaseRequest(message, mock.Mock(), mock.Mock(), mock.Mock(), mock.Mock(), mock.Mock()) assert 'GET' == req.method assert HttpVersion(1, 1) == req.version assert req.host == socket.getfqdn() assert '/path/to?a=1&b=2' == req.path_qs assert '/path/to' == req.path assert 'a=1&b=2' == req.query_string assert CIMultiDict() == req.headers assert () == req.raw_headers get = req.query assert MultiDict([('a', '1'), ('b', '2')]) == get # second call should return the same object assert get is req.query assert req.keep_alive assert req def test_ctor() -> None: req = make_mocked_request('GET', '/path/to?a=1&b=2') assert 'GET' == req.method assert HttpVersion(1, 1) == req.version assert req.host == socket.getfqdn() assert '/path/to?a=1&b=2' == req.path_qs assert '/path/to' == req.path assert 'a=1&b=2' == req.query_string assert CIMultiDict() == req.headers assert () == req.raw_headers get = req.query assert MultiDict([('a', '1'), ('b', '2')]) == get # second call should return the same object assert get is req.query assert req.keep_alive # just make sure that all lines of make_mocked_request covered headers = CIMultiDict(FOO='bar') payload = mock.Mock() protocol = mock.Mock() app = mock.Mock() req = make_mocked_request('GET', '/path/to?a=1&b=2', headers=headers, protocol=protocol, payload=payload, app=app) assert req.app is app assert req.content is payload assert req.protocol is protocol assert req.transport is protocol.transport assert req.headers == headers assert req.raw_headers == ((b'FOO', b'bar'),) assert req.task is req._task def test_deprecated_message() -> None: req = make_mocked_request('GET', '/path/to?a=1&b=2') with pytest.warns(DeprecationWarning): assert req.message == req._message def test_doubleslashes() -> None: # NB: //foo/bar is an absolute URL with foo netloc and /bar path req = make_mocked_request('GET', '/bar//foo/') assert '/bar//foo/' == req.path def test_content_type_not_specified() -> None: req = make_mocked_request('Get', '/') assert 'application/octet-stream' == req.content_type def test_content_type_from_spec() -> None: req = make_mocked_request('Get', '/', CIMultiDict([('CONTENT-TYPE', 'application/json')])) assert 'application/json' == req.content_type def test_content_type_from_spec_with_charset() -> None: req = make_mocked_request( 'Get', '/', CIMultiDict([('CONTENT-TYPE', 'text/html; charset=UTF-8')])) assert 'text/html' == req.content_type assert 'UTF-8' == req.charset def test_calc_content_type_on_getting_charset() -> None: req = make_mocked_request( 'Get', '/', CIMultiDict([('CONTENT-TYPE', 'text/html; charset=UTF-8')])) assert 'UTF-8' == req.charset assert 'text/html' == req.content_type def test_urlencoded_querystring() -> None: req = make_mocked_request( 'GET', '/yandsearch?text=%D1%82%D0%B5%D0%BA%D1%81%D1%82') assert {'text': 'текст'} == req.query def test_non_ascii_path() -> None: req = make_mocked_request('GET', '/путь') assert '/путь' == req.path def test_non_ascii_raw_path() -> 
None: req = make_mocked_request('GET', '/путь') assert '/путь' == req.raw_path def test_content_length() -> None: req = make_mocked_request('Get', '/', CIMultiDict([('CONTENT-LENGTH', '123')])) assert 123 == req.content_length def test_range_to_slice_head() -> None: def bytes_gen(size): for i in range(size): yield i % 256 payload = bytearray(bytes_gen(10000)) req = make_mocked_request( 'GET', '/', headers=CIMultiDict([('RANGE', 'bytes=0-499')]), payload=payload) assert isinstance(req.http_range, slice) assert req.content[req.http_range] == payload[:500] def test_range_to_slice_mid() -> None: def bytes_gen(size): for i in range(size): yield i % 256 payload = bytearray(bytes_gen(10000)) req = make_mocked_request( 'GET', '/', headers=CIMultiDict([('RANGE', 'bytes=500-999')]), payload=payload) assert isinstance(req.http_range, slice) assert req.content[req.http_range] == payload[500:1000] def test_range_to_slice_tail_start() -> None: def bytes_gen(size): for i in range(size): yield i % 256 payload = bytearray(bytes_gen(10000)) req = make_mocked_request( 'GET', '/', headers=CIMultiDict([('RANGE', 'bytes=9500-')]), payload=payload) assert isinstance(req.http_range, slice) assert req.content[req.http_range] == payload[-500:] def test_range_to_slice_tail_stop() -> None: def bytes_gen(size): for i in range(size): yield i % 256 payload = bytearray(bytes_gen(10000)) req = make_mocked_request( 'GET', '/', headers=CIMultiDict([('RANGE', 'bytes=-500')]), payload=payload) assert isinstance(req.http_range, slice) assert req.content[req.http_range] == payload[-500:] def test_non_keepalive_on_http10() -> None: req = make_mocked_request('GET', '/', version=HttpVersion(1, 0)) assert not req.keep_alive def test_non_keepalive_on_closing() -> None: req = make_mocked_request('GET', '/', closing=True) assert not req.keep_alive async def test_call_POST_on_GET_request() -> None: req = make_mocked_request('GET', '/') ret = await req.post() assert CIMultiDict() == ret async def test_call_POST_on_weird_content_type() -> None: req = make_mocked_request( 'POST', '/', headers=CIMultiDict({'CONTENT-TYPE': 'something/weird'})) ret = await req.post() assert CIMultiDict() == ret async def test_call_POST_twice() -> None: req = make_mocked_request('GET', '/') ret1 = await req.post() ret2 = await req.post() assert ret1 is ret2 def test_no_request_cookies() -> None: req = make_mocked_request('GET', '/') assert req.cookies == {} cookies = req.cookies assert cookies is req.cookies def test_request_cookie() -> None: headers = CIMultiDict(COOKIE='cookie1=value1; cookie2=value2') req = make_mocked_request('GET', '/', headers=headers) assert req.cookies == {'cookie1': 'value1', 'cookie2': 'value2'} def test_request_cookie__set_item() -> None: headers = CIMultiDict(COOKIE='name=value') req = make_mocked_request('GET', '/', headers=headers) assert req.cookies == {'name': 'value'} with pytest.raises(TypeError): req.cookies['my'] = 'value' def test_match_info() -> None: req = make_mocked_request('GET', '/') assert req._match_info is req.match_info def test_request_is_mutable_mapping() -> None: req = make_mocked_request('GET', '/') assert isinstance(req, MutableMapping) req['key'] = 'value' assert 'value' == req['key'] def test_request_delitem() -> None: req = make_mocked_request('GET', '/') req['key'] = 'value' assert 'value' == req['key'] del req['key'] assert 'key' not in req def test_request_len() -> None: req = make_mocked_request('GET', '/') assert len(req) == 0 req['key'] = 'value' assert len(req) == 1 def test_request_iter() -> 
None: req = make_mocked_request('GET', '/') req['key'] = 'value' req['key2'] = 'value2' assert set(req) == {'key', 'key2'} def test___repr__() -> None: req = make_mocked_request('GET', '/path/to') assert "" == repr(req) def test___repr___non_ascii_path() -> None: req = make_mocked_request('GET', '/path/\U0001f415\U0001f308') assert "" == repr(req) def test_http_scheme() -> None: req = make_mocked_request('GET', '/', headers={'Host': 'example.com'}) assert "http" == req.scheme assert req.secure is False def test_https_scheme_by_ssl_transport() -> None: req = make_mocked_request('GET', '/', headers={'Host': 'example.com'}, sslcontext=True) assert "https" == req.scheme assert req.secure is True def test_single_forwarded_header() -> None: header = 'by=identifier;for=identifier;host=identifier;proto=identifier' req = make_mocked_request('GET', '/', headers=CIMultiDict({'Forwarded': header})) assert req.forwarded[0]['by'] == 'identifier' assert req.forwarded[0]['for'] == 'identifier' assert req.forwarded[0]['host'] == 'identifier' assert req.forwarded[0]['proto'] == 'identifier' @pytest.mark.parametrize( "forward_for_in, forward_for_out", [ ("1.2.3.4:1234", "1.2.3.4:1234"), ("1.2.3.4", "1.2.3.4"), ('"[2001:db8:cafe::17]:1234"', '[2001:db8:cafe::17]:1234'), ('"[2001:db8:cafe::17]"', '[2001:db8:cafe::17]'), ]) def test_forwarded_node_identifier(forward_for_in, forward_for_out) -> None: header = 'for={}'.format(forward_for_in) req = make_mocked_request('GET', '/', headers=CIMultiDict({'Forwarded': header})) assert req.forwarded == ({'for': forward_for_out},) def test_single_forwarded_header_camelcase() -> None: header = 'bY=identifier;fOr=identifier;HOst=identifier;pRoTO=identifier' req = make_mocked_request('GET', '/', headers=CIMultiDict({'Forwarded': header})) assert req.forwarded[0]['by'] == 'identifier' assert req.forwarded[0]['for'] == 'identifier' assert req.forwarded[0]['host'] == 'identifier' assert req.forwarded[0]['proto'] == 'identifier' def test_single_forwarded_header_single_param() -> None: header = 'BY=identifier' req = make_mocked_request('GET', '/', headers=CIMultiDict({'Forwarded': header})) assert req.forwarded[0]['by'] == 'identifier' def test_single_forwarded_header_multiple_param() -> None: header = 'By=identifier1,BY=identifier2, By=identifier3 , BY=identifier4' req = make_mocked_request('GET', '/', headers=CIMultiDict({'Forwarded': header})) assert len(req.forwarded) == 4 assert req.forwarded[0]['by'] == 'identifier1' assert req.forwarded[1]['by'] == 'identifier2' assert req.forwarded[2]['by'] == 'identifier3' assert req.forwarded[3]['by'] == 'identifier4' def test_single_forwarded_header_quoted_escaped() -> None: header = r'BY=identifier;pROTO="\lala lan\d\~ 123\!&"' req = make_mocked_request('GET', '/', headers=CIMultiDict({'Forwarded': header})) assert req.forwarded[0]['by'] == 'identifier' assert req.forwarded[0]['proto'] == 'lala land~ 123!&' def test_single_forwarded_header_custom_param() -> None: header = r'BY=identifier;PROTO=https;SOME="other, \"value\""' req = make_mocked_request('GET', '/', headers=CIMultiDict({'Forwarded': header})) assert len(req.forwarded) == 1 assert req.forwarded[0]['by'] == 'identifier' assert req.forwarded[0]['proto'] == 'https' assert req.forwarded[0]['some'] == 'other, "value"' def test_single_forwarded_header_empty_params() -> None: # This is allowed by the grammar given in RFC 7239 header = ';For=identifier;;PROTO=https;;;' req = make_mocked_request('GET', '/', headers=CIMultiDict({'Forwarded': header})) assert req.forwarded[0]['for'] 
== 'identifier' assert req.forwarded[0]['proto'] == 'https' def test_single_forwarded_header_bad_separator() -> None: header = 'BY=identifier PROTO=https' req = make_mocked_request('GET', '/', headers=CIMultiDict({'Forwarded': header})) assert 'proto' not in req.forwarded[0] def test_single_forwarded_header_injection1() -> None: # We might receive a header like this if we're sitting behind a reverse # proxy that blindly appends a forwarded-element without checking # the syntax of existing field-values. We should be able to recover # the appended element anyway. header = 'for=_injected;by=", for=_real' req = make_mocked_request('GET', '/', headers=CIMultiDict({'Forwarded': header})) assert len(req.forwarded) == 2 assert 'by' not in req.forwarded[0] assert req.forwarded[1]['for'] == '_real' def test_single_forwarded_header_injection2() -> None: header = 'very bad syntax, for=_real' req = make_mocked_request('GET', '/', headers=CIMultiDict({'Forwarded': header})) assert len(req.forwarded) == 2 assert 'for' not in req.forwarded[0] assert req.forwarded[1]['for'] == '_real' def test_single_forwarded_header_long_quoted_string() -> None: header = 'for="' + '\\\\' * 5000 + '"' req = make_mocked_request('GET', '/', headers=CIMultiDict({'Forwarded': header})) assert req.forwarded[0]['for'] == '\\' * 5000 def test_multiple_forwarded_headers() -> None: headers = CIMultiDict() headers.add('Forwarded', 'By=identifier1;for=identifier2, BY=identifier3') headers.add('Forwarded', 'By=identifier4;fOr=identifier5') req = make_mocked_request('GET', '/', headers=headers) assert len(req.forwarded) == 3 assert req.forwarded[0]['by'] == 'identifier1' assert req.forwarded[0]['for'] == 'identifier2' assert req.forwarded[1]['by'] == 'identifier3' assert req.forwarded[2]['by'] == 'identifier4' assert req.forwarded[2]['for'] == 'identifier5' def test_multiple_forwarded_headers_bad_syntax() -> None: headers = CIMultiDict() headers.add('Forwarded', 'for=_1;by=_2') headers.add('Forwarded', 'invalid value') headers.add('Forwarded', '') headers.add('Forwarded', 'for=_3;by=_4') req = make_mocked_request('GET', '/', headers=headers) assert len(req.forwarded) == 4 assert req.forwarded[0]['for'] == '_1' assert 'for' not in req.forwarded[1] assert 'for' not in req.forwarded[2] assert req.forwarded[3]['by'] == '_4' def test_multiple_forwarded_headers_injection() -> None: headers = CIMultiDict() # This could be sent by an attacker, hoping to "shadow" the second header. headers.add('Forwarded', 'for=_injected;by="') # This is added by our trusted reverse proxy. 
headers.add('Forwarded', 'for=_real;by=_actual_proxy') req = make_mocked_request('GET', '/', headers=headers) assert len(req.forwarded) == 2 assert 'by' not in req.forwarded[0] assert req.forwarded[1]['for'] == '_real' assert req.forwarded[1]['by'] == '_actual_proxy' def test_host_by_host_header() -> None: req = make_mocked_request('GET', '/', headers=CIMultiDict({'Host': 'example.com'})) assert req.host == 'example.com' def test_raw_headers() -> None: req = make_mocked_request('GET', '/', headers=CIMultiDict({'X-HEADER': 'aaa'})) assert req.raw_headers == ((b'X-HEADER', b'aaa'),) def test_rel_url() -> None: req = make_mocked_request('GET', '/path') assert URL('/path') == req.rel_url def test_url_url() -> None: req = make_mocked_request('GET', '/path', headers={'HOST': 'example.com'}) assert URL('http://example.com/path') == req.url def test_clone() -> None: req = make_mocked_request('GET', '/path') req2 = req.clone() assert req2.method == 'GET' assert req2.rel_url == URL('/path') def test_clone_client_max_size() -> None: req = make_mocked_request('GET', '/path', client_max_size=1024) req2 = req.clone() assert req._client_max_size == req2._client_max_size assert req2._client_max_size == 1024 def test_clone_method() -> None: req = make_mocked_request('GET', '/path') req2 = req.clone(method='POST') assert req2.method == 'POST' assert req2.rel_url == URL('/path') def test_clone_rel_url() -> None: req = make_mocked_request('GET', '/path') req2 = req.clone(rel_url=URL('/path2')) assert req2.rel_url == URL('/path2') def test_clone_rel_url_str() -> None: req = make_mocked_request('GET', '/path') req2 = req.clone(rel_url='/path2') assert req2.rel_url == URL('/path2') def test_clone_headers() -> None: req = make_mocked_request('GET', '/path', headers={'A': 'B'}) req2 = req.clone(headers=CIMultiDict({'B': 'C'})) assert req2.headers == CIMultiDict({'B': 'C'}) assert req2.raw_headers == ((b'B', b'C'),) def test_clone_headers_dict() -> None: req = make_mocked_request('GET', '/path', headers={'A': 'B'}) req2 = req.clone(headers={'B': 'C'}) assert req2.headers == CIMultiDict({'B': 'C'}) assert req2.raw_headers == ((b'B', b'C'),) async def test_cannot_clone_after_read(protocol) -> None: payload = StreamReader(protocol) payload.feed_data(b'data') payload.feed_eof() req = make_mocked_request('GET', '/path', payload=payload) await req.read() with pytest.raises(RuntimeError): req.clone() async def test_make_too_big_request(protocol) -> None: payload = StreamReader(protocol) large_file = 1024 ** 2 * b'x' too_large_file = large_file + b'x' payload.feed_data(too_large_file) payload.feed_eof() req = make_mocked_request('POST', '/', payload=payload) with pytest.raises(HTTPRequestEntityTooLarge) as err: await req.read() assert err.value.status_code == 413 async def test_make_too_big_request_adjust_limit(protocol) -> None: payload = StreamReader(protocol) large_file = 1024 ** 2 * b'x' too_large_file = large_file + b'x' payload.feed_data(too_large_file) payload.feed_eof() max_size = 1024**2 + 2 req = make_mocked_request('POST', '/', payload=payload, client_max_size=max_size) txt = await req.read() assert len(txt) == 1024**2 + 1 async def test_multipart_formdata(protocol) -> None: payload = StreamReader(protocol) payload.feed_data(b"""-----------------------------326931944431359\r Content-Disposition: form-data; name="a"\r \r b\r -----------------------------326931944431359\r Content-Disposition: form-data; name="c"\r \r d\r -----------------------------326931944431359--\r\n""") content_type = "multipart/form-data; 
boundary="\ "---------------------------326931944431359" payload.feed_eof() req = make_mocked_request('POST', '/', headers={'CONTENT-TYPE': content_type}, payload=payload) result = await req.post() assert dict(result) == {'a': 'b', 'c': 'd'} async def test_make_too_big_request_limit_None(protocol) -> None: payload = StreamReader(protocol) large_file = 1024 ** 2 * b'x' too_large_file = large_file + b'x' payload.feed_data(too_large_file) payload.feed_eof() max_size = None req = make_mocked_request('POST', '/', payload=payload, client_max_size=max_size) txt = await req.read() assert len(txt) == 1024**2 + 1 def test_remote_peername_tcp() -> None: transp = mock.Mock() transp.get_extra_info.return_value = ('10.10.10.10', 1234) req = make_mocked_request('GET', '/', transport=transp) assert req.remote == '10.10.10.10' def test_remote_peername_unix() -> None: transp = mock.Mock() transp.get_extra_info.return_value = '/path/to/sock' req = make_mocked_request('GET', '/', transport=transp) assert req.remote == '/path/to/sock' def test_save_state_on_clone() -> None: req = make_mocked_request('GET', '/') req['key'] = 'val' req2 = req.clone() req2['key'] = 'val2' assert req['key'] == 'val' assert req2['key'] == 'val2' def test_clone_scheme() -> None: req = make_mocked_request('GET', '/') req2 = req.clone(scheme='https') assert req2.scheme == 'https' def test_clone_host() -> None: req = make_mocked_request('GET', '/') req2 = req.clone(host='example.com') assert req2.host == 'example.com' def test_clone_remote() -> None: req = make_mocked_request('GET', '/') req2 = req.clone(remote='11.11.11.11') assert req2.remote == '11.11.11.11' @pytest.mark.skipif(not DEBUG, reason="The check is applied in DEBUG mode only") def test_request_custom_attr() -> None: req = make_mocked_request('GET', '/') with pytest.warns(DeprecationWarning): req.custom = None def test_remote_with_closed_transport() -> None: transp = mock.Mock() transp.get_extra_info.return_value = ('10.10.10.10', 1234) req = make_mocked_request('GET', '/', transport=transp) req._protocol = None assert req.remote == '10.10.10.10' def test_url_http_with_closed_transport() -> None: req = make_mocked_request('GET', '/') req._protocol = None assert str(req.url).startswith('http://') def test_url_https_with_closed_transport() -> None: req = make_mocked_request('GET', '/', sslcontext=True) req._protocol = None assert str(req.url).startswith('https://') def test_eq() -> None: req1 = make_mocked_request('GET', '/path/to?a=1&b=2') req2 = make_mocked_request('GET', '/path/to?a=1&b=2') assert req1 != req2 assert req1 == req1 async def test_loop_prop() -> None: loop = asyncio.get_event_loop() req = make_mocked_request('GET', '/path', loop=loop) with pytest.warns(DeprecationWarning): assert req.loop is loop aiohttp-3.6.2/tests/test_web_request_handler.py0000644000175100001650000000275613547410117022274 0ustar vstsdocker00000000000000from unittest import mock from aiohttp import web from aiohttp.test_utils import make_mocked_coro async def serve(request): return web.Response() async def test_repr() -> None: manager = web.Server(serve) handler = manager() assert '' == repr(handler) handler.transport = object() assert '' == repr(handler) async def test_connections() -> None: manager = web.Server(serve) assert manager.connections == [] handler = object() transport = object() manager.connection_made(handler, transport) assert manager.connections == [handler] manager.connection_lost(handler, None) assert manager.connections == [] async def test_shutdown_no_timeout() -> None: 
manager = web.Server(serve) handler = mock.Mock() handler.shutdown = make_mocked_coro(mock.Mock()) transport = mock.Mock() manager.connection_made(handler, transport) await manager.shutdown() manager.connection_lost(handler, None) assert manager.connections == [] handler.shutdown.assert_called_with(None) async def test_shutdown_timeout() -> None: manager = web.Server(serve) handler = mock.Mock() handler.shutdown = make_mocked_coro(mock.Mock()) transport = mock.Mock() manager.connection_made(handler, transport) await manager.shutdown(timeout=0.1) manager.connection_lost(handler, None) assert manager.connections == [] handler.shutdown.assert_called_with(0.1) aiohttp-3.6.2/tests/test_web_response.py0000644000175100001650000010211213547410117020730 0ustar vstsdocker00000000000000import collections.abc import datetime import gzip import json import re from concurrent.futures import ThreadPoolExecutor from unittest import mock import pytest from multidict import CIMultiDict, CIMultiDictProxy from aiohttp import HttpVersion, HttpVersion10, HttpVersion11, hdrs, signals from aiohttp.payload import BytesPayload from aiohttp.test_utils import make_mocked_coro, make_mocked_request from aiohttp.web import ContentCoding, Response, StreamResponse, json_response def make_request(method, path, headers=CIMultiDict(), version=HttpVersion11, on_response_prepare=None, **kwargs): app = kwargs.pop('app', None) or mock.Mock() app._debug = False if on_response_prepare is None: on_response_prepare = signals.Signal(app) app.on_response_prepare = on_response_prepare app.on_response_prepare.freeze() protocol = kwargs.pop('protocol', None) or mock.Mock() return make_mocked_request(method, path, headers, version=version, protocol=protocol, app=app, **kwargs) @pytest.fixture def buf(): return bytearray() @pytest.fixture def writer(buf): writer = mock.Mock() def acquire(cb): cb(writer.transport) def buffer_data(chunk): buf.extend(chunk) def write(chunk): buf.extend(chunk) async def write_headers(status_line, headers): headers = status_line + '\r\n' + ''.join( [k + ': ' + v + '\r\n' for k, v in headers.items()]) headers = headers.encode('utf-8') + b'\r\n' buf.extend(headers) async def write_eof(chunk=b''): buf.extend(chunk) writer.acquire.side_effect = acquire writer.transport.write.side_effect = write writer.write.side_effect = write writer.write_eof.side_effect = write_eof writer.write_headers.side_effect = write_headers writer.buffer_data.side_effect = buffer_data writer.drain.return_value = () return writer def test_stream_response_ctor() -> None: resp = StreamResponse() assert 200 == resp.status assert resp.keep_alive is None assert resp.task is None req = mock.Mock() resp._req = req assert resp.task is req.task def test_stream_response_hashable() -> None: # should not raise exception hash(StreamResponse()) def test_stream_response_eq() -> None: resp1 = StreamResponse() resp2 = StreamResponse() assert resp1 == resp1 assert not resp1 == resp2 def test_stream_response_is_mutable_mapping() -> None: resp = StreamResponse() assert isinstance(resp, collections.abc.MutableMapping) resp['key'] = 'value' assert 'value' == resp['key'] def test_stream_response_delitem() -> None: resp = StreamResponse() resp['key'] = 'value' del resp['key'] assert 'key' not in resp def test_stream_response_len() -> None: resp = StreamResponse() assert len(resp) == 0 resp['key'] = 'value' assert len(resp) == 1 def test_request_iter() -> None: resp = StreamResponse() resp['key'] = 'value' resp['key2'] = 'value2' assert set(resp) == {'key', 'key2'} 
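# The tests below exercise StreamResponse.content_length handling: the setter
# mirrors the Content-Length header, accepts None to drop the header again,
# and raises RuntimeError once chunked encoding has been enabled.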
def test_content_length() -> None: resp = StreamResponse() assert resp.content_length is None def test_content_length_setter() -> None: resp = StreamResponse() resp.content_length = 234 assert 234 == resp.content_length def test_content_length_setter_with_enable_chunked_encoding() -> None: resp = StreamResponse() resp.enable_chunked_encoding() with pytest.raises(RuntimeError): resp.content_length = 234 def test_drop_content_length_header_on_setting_len_to_None() -> None: resp = StreamResponse() resp.content_length = 1 assert "1" == resp.headers['Content-Length'] resp.content_length = None assert 'Content-Length' not in resp.headers def test_set_content_length_to_None_on_non_set() -> None: resp = StreamResponse() resp.content_length = None assert 'Content-Length' not in resp.headers resp.content_length = None assert 'Content-Length' not in resp.headers def test_setting_content_type() -> None: resp = StreamResponse() resp.content_type = 'text/html' assert 'text/html' == resp.headers['content-type'] def test_setting_charset() -> None: resp = StreamResponse() resp.content_type = 'text/html' resp.charset = 'koi8-r' assert 'text/html; charset=koi8-r' == resp.headers['content-type'] def test_default_charset() -> None: resp = StreamResponse() assert resp.charset is None def test_reset_charset() -> None: resp = StreamResponse() resp.content_type = 'text/html' resp.charset = None assert resp.charset is None def test_reset_charset_after_setting() -> None: resp = StreamResponse() resp.content_type = 'text/html' resp.charset = 'koi8-r' resp.charset = None assert resp.charset is None def test_charset_without_content_type() -> None: resp = StreamResponse() with pytest.raises(RuntimeError): resp.charset = 'koi8-r' def test_last_modified_initial() -> None: resp = StreamResponse() assert resp.last_modified is None def test_last_modified_string() -> None: resp = StreamResponse() dt = datetime.datetime(1990, 1, 2, 3, 4, 5, 0, datetime.timezone.utc) resp.last_modified = 'Mon, 2 Jan 1990 03:04:05 GMT' assert resp.last_modified == dt def test_last_modified_timestamp() -> None: resp = StreamResponse() dt = datetime.datetime(1970, 1, 1, 0, 0, 0, 0, datetime.timezone.utc) resp.last_modified = 0 assert resp.last_modified == dt resp.last_modified = 0.0 assert resp.last_modified == dt def test_last_modified_datetime() -> None: resp = StreamResponse() dt = datetime.datetime(2001, 2, 3, 4, 5, 6, 0, datetime.timezone.utc) resp.last_modified = dt assert resp.last_modified == dt def test_last_modified_reset() -> None: resp = StreamResponse() resp.last_modified = 0 resp.last_modified = None assert resp.last_modified is None async def test_start() -> None: req = make_request('GET', '/') resp = StreamResponse() assert resp.keep_alive is None msg = await resp.prepare(req) assert msg.write_headers.called msg2 = await resp.prepare(req) assert msg is msg2 assert resp.keep_alive req2 = make_request('GET', '/') # with pytest.raises(RuntimeError): msg3 = await resp.prepare(req2) assert msg is msg3 async def test_chunked_encoding() -> None: req = make_request('GET', '/') resp = StreamResponse() assert not resp.chunked resp.enable_chunked_encoding() assert resp.chunked msg = await resp.prepare(req) assert msg.chunked def test_enable_chunked_encoding_with_content_length() -> None: resp = StreamResponse() resp.content_length = 234 with pytest.raises(RuntimeError): resp.enable_chunked_encoding() async def test_chunk_size() -> None: req = make_request('GET', '/') resp = StreamResponse() assert not resp.chunked with 
pytest.warns(DeprecationWarning): resp.enable_chunked_encoding(chunk_size=8192) assert resp.chunked msg = await resp.prepare(req) assert msg.chunked assert msg.enable_chunking.called assert msg.filter is not None async def test_chunked_encoding_forbidden_for_http_10() -> None: req = make_request('GET', '/', version=HttpVersion10) resp = StreamResponse() resp.enable_chunked_encoding() with pytest.raises(RuntimeError) as ctx: await resp.prepare(req) assert re.match("Using chunked encoding is forbidden for HTTP/1.0", str(ctx.value)) async def test_compression_no_accept() -> None: req = make_request('GET', '/') resp = StreamResponse() assert not resp.chunked assert not resp.compression resp.enable_compression() assert resp.compression msg = await resp.prepare(req) assert not msg.enable_compression.called async def test_force_compression_no_accept_backwards_compat() -> None: req = make_request('GET', '/') resp = StreamResponse() assert not resp.chunked assert not resp.compression with pytest.warns(DeprecationWarning): resp.enable_compression(force=True) assert resp.compression msg = await resp.prepare(req) assert msg.enable_compression.called assert msg.filter is not None async def test_force_compression_false_backwards_compat() -> None: req = make_request('GET', '/') resp = StreamResponse() assert not resp.compression with pytest.warns(DeprecationWarning): resp.enable_compression(force=False) assert resp.compression msg = await resp.prepare(req) assert not msg.enable_compression.called async def test_compression_default_coding() -> None: req = make_request( 'GET', '/', headers=CIMultiDict({hdrs.ACCEPT_ENCODING: 'gzip, deflate'})) resp = StreamResponse() assert not resp.chunked assert not resp.compression resp.enable_compression() assert resp.compression msg = await resp.prepare(req) msg.enable_compression.assert_called_with('deflate') assert 'deflate' == resp.headers.get(hdrs.CONTENT_ENCODING) assert msg.filter is not None async def test_force_compression_deflate() -> None: req = make_request( 'GET', '/', headers=CIMultiDict({hdrs.ACCEPT_ENCODING: 'gzip, deflate'})) resp = StreamResponse() resp.enable_compression(ContentCoding.deflate) assert resp.compression msg = await resp.prepare(req) msg.enable_compression.assert_called_with('deflate') assert 'deflate' == resp.headers.get(hdrs.CONTENT_ENCODING) async def test_force_compression_no_accept_deflate() -> None: req = make_request('GET', '/') resp = StreamResponse() resp.enable_compression(ContentCoding.deflate) assert resp.compression msg = await resp.prepare(req) msg.enable_compression.assert_called_with('deflate') assert 'deflate' == resp.headers.get(hdrs.CONTENT_ENCODING) async def test_force_compression_gzip() -> None: req = make_request( 'GET', '/', headers=CIMultiDict({hdrs.ACCEPT_ENCODING: 'gzip, deflate'})) resp = StreamResponse() resp.enable_compression(ContentCoding.gzip) assert resp.compression msg = await resp.prepare(req) msg.enable_compression.assert_called_with('gzip') assert 'gzip' == resp.headers.get(hdrs.CONTENT_ENCODING) async def test_force_compression_no_accept_gzip() -> None: req = make_request('GET', '/') resp = StreamResponse() resp.enable_compression(ContentCoding.gzip) assert resp.compression msg = await resp.prepare(req) msg.enable_compression.assert_called_with('gzip') assert 'gzip' == resp.headers.get(hdrs.CONTENT_ENCODING) async def test_change_content_threaded_compression_enabled() -> None: req = make_request('GET', '/') body_thread_size = 1024 body = b'answer' * body_thread_size resp = Response(body=body, 
zlib_executor_size=body_thread_size) resp.enable_compression(ContentCoding.gzip) await resp.prepare(req) assert gzip.decompress(resp._compressed_body) == body async def test_change_content_threaded_compression_enabled_explicit() -> None: req = make_request('GET', '/') body_thread_size = 1024 body = b'answer' * body_thread_size with ThreadPoolExecutor(1) as executor: resp = Response(body=body, zlib_executor_size=body_thread_size, zlib_executor=executor) resp.enable_compression(ContentCoding.gzip) await resp.prepare(req) assert gzip.decompress(resp._compressed_body) == body async def test_change_content_length_if_compression_enabled() -> None: req = make_request('GET', '/') resp = Response(body=b'answer') resp.enable_compression(ContentCoding.gzip) await resp.prepare(req) assert resp.content_length is not None and \ resp.content_length != len(b'answer') async def test_set_content_length_if_compression_enabled() -> None: writer = mock.Mock() async def write_headers(status_line, headers): assert hdrs.CONTENT_LENGTH in headers assert headers[hdrs.CONTENT_LENGTH] == '26' assert hdrs.TRANSFER_ENCODING not in headers writer.write_headers.side_effect = write_headers req = make_request('GET', '/', writer=writer) resp = Response(body=b'answer') resp.enable_compression(ContentCoding.gzip) await resp.prepare(req) assert resp.content_length == 26 del resp.headers[hdrs.CONTENT_LENGTH] assert resp.content_length == 26 async def test_remove_content_length_if_compression_enabled_http11() -> None: writer = mock.Mock() async def write_headers(status_line, headers): assert hdrs.CONTENT_LENGTH not in headers assert headers.get(hdrs.TRANSFER_ENCODING, '') == 'chunked' writer.write_headers.side_effect = write_headers req = make_request('GET', '/', writer=writer) resp = StreamResponse() resp.content_length = 123 resp.enable_compression(ContentCoding.gzip) await resp.prepare(req) assert resp.content_length is None async def test_remove_content_length_if_compression_enabled_http10() -> None: writer = mock.Mock() async def write_headers(status_line, headers): assert hdrs.CONTENT_LENGTH not in headers assert hdrs.TRANSFER_ENCODING not in headers writer.write_headers.side_effect = write_headers req = make_request('GET', '/', version=HttpVersion10, writer=writer) resp = StreamResponse() resp.content_length = 123 resp.enable_compression(ContentCoding.gzip) await resp.prepare(req) assert resp.content_length is None async def test_force_compression_identity() -> None: writer = mock.Mock() async def write_headers(status_line, headers): assert hdrs.CONTENT_LENGTH in headers assert hdrs.TRANSFER_ENCODING not in headers writer.write_headers.side_effect = write_headers req = make_request('GET', '/', writer=writer) resp = StreamResponse() resp.content_length = 123 resp.enable_compression(ContentCoding.identity) await resp.prepare(req) assert resp.content_length == 123 async def test_force_compression_identity_response() -> None: writer = mock.Mock() async def write_headers(status_line, headers): assert headers[hdrs.CONTENT_LENGTH] == "6" assert hdrs.TRANSFER_ENCODING not in headers writer.write_headers.side_effect = write_headers req = make_request('GET', '/', writer=writer) resp = Response(body=b'answer') resp.enable_compression(ContentCoding.identity) await resp.prepare(req) assert resp.content_length == 6 async def test_rm_content_length_if_compression_http11() -> None: writer = mock.Mock() async def write_headers(status_line, headers): assert hdrs.CONTENT_LENGTH not in headers assert headers.get(hdrs.TRANSFER_ENCODING, '') 
== 'chunked' writer.write_headers.side_effect = write_headers req = make_request('GET', '/', writer=writer) payload = BytesPayload(b'answer', headers={"X-Test-Header": "test"}) resp = Response(body=payload) assert resp.content_length == 6 resp.body = payload resp.enable_compression(ContentCoding.gzip) await resp.prepare(req) assert resp.content_length is None async def test_rm_content_length_if_compression_http10() -> None: writer = mock.Mock() async def write_headers(status_line, headers): assert hdrs.CONTENT_LENGTH not in headers assert hdrs.TRANSFER_ENCODING not in headers writer.write_headers.side_effect = write_headers req = make_request('GET', '/', version=HttpVersion10, writer=writer) resp = Response(body=BytesPayload(b'answer')) resp.enable_compression(ContentCoding.gzip) await resp.prepare(req) assert resp.content_length is None async def test_content_length_on_chunked() -> None: req = make_request('GET', '/') resp = Response(body=b'answer') assert resp.content_length == 6 resp.enable_chunked_encoding() assert resp.content_length is None await resp.prepare(req) async def test_write_non_byteish() -> None: resp = StreamResponse() await resp.prepare(make_request('GET', '/')) with pytest.raises(AssertionError): await resp.write(123) async def test_write_before_start() -> None: resp = StreamResponse() with pytest.raises(RuntimeError): await resp.write(b'data') async def test_cannot_write_after_eof() -> None: resp = StreamResponse() req = make_request('GET', '/') await resp.prepare(req) await resp.write(b'data') await resp.write_eof() req.writer.write.reset_mock() with pytest.raises(RuntimeError): await resp.write(b'next data') assert not req.writer.write.called async def test___repr___after_eof() -> None: resp = StreamResponse() await resp.prepare(make_request('GET', '/')) assert resp.prepared await resp.write(b'data') await resp.write_eof() assert not resp.prepared resp_repr = repr(resp) assert resp_repr == '' async def test_cannot_write_eof_before_headers() -> None: resp = StreamResponse() with pytest.raises(AssertionError): await resp.write_eof() async def test_cannot_write_eof_twice() -> None: resp = StreamResponse() writer = mock.Mock() resp_impl = await resp.prepare(make_request('GET', '/')) resp_impl.write = make_mocked_coro(None) resp_impl.write_eof = make_mocked_coro(None) await resp.write(b'data') assert resp_impl.write.called await resp.write_eof() resp_impl.write.reset_mock() await resp.write_eof() assert not writer.write.called def test_force_close() -> None: resp = StreamResponse() assert resp.keep_alive is None resp.force_close() assert resp.keep_alive is False async def test_response_output_length() -> None: resp = StreamResponse() await resp.prepare(make_request('GET', '/')) with pytest.warns(DeprecationWarning): assert resp.output_length def test_response_cookies() -> None: resp = StreamResponse() assert resp.cookies == {} assert str(resp.cookies) == '' resp.set_cookie('name', 'value') assert str(resp.cookies) == 'Set-Cookie: name=value; Path=/' resp.set_cookie('name', 'other_value') assert str(resp.cookies) == 'Set-Cookie: name=other_value; Path=/' resp.cookies['name'] = 'another_other_value' resp.cookies['name']['max-age'] = 10 assert (str(resp.cookies) == 'Set-Cookie: name=another_other_value; Max-Age=10; Path=/') resp.del_cookie('name') expected = ('Set-Cookie: name=("")?; ' 'expires=Thu, 01 Jan 1970 00:00:00 GMT; Max-Age=0; Path=/') assert re.match(expected, str(resp.cookies)) resp.set_cookie('name', 'value', domain='local.host') expected = 'Set-Cookie: 
name=value; Domain=local.host; Path=/' assert str(resp.cookies) == expected def test_response_cookie_path() -> None: resp = StreamResponse() assert resp.cookies == {} resp.set_cookie('name', 'value', path='/some/path') assert str(resp.cookies) == 'Set-Cookie: name=value; Path=/some/path' resp.set_cookie('name', 'value', expires='123') assert (str(resp.cookies) == 'Set-Cookie: name=value; expires=123; Path=/') resp.set_cookie('name', 'value', domain='example.com', path='/home', expires='123', max_age='10', secure=True, httponly=True, version='2.0') assert (str(resp.cookies).lower() == 'set-cookie: name=value; ' 'domain=example.com; ' 'expires=123; ' 'httponly; ' 'max-age=10; ' 'path=/home; ' 'secure; ' 'version=2.0') def test_response_cookie__issue_del_cookie() -> None: resp = StreamResponse() assert resp.cookies == {} assert str(resp.cookies) == '' resp.del_cookie('name') expected = ('Set-Cookie: name=("")?; ' 'expires=Thu, 01 Jan 1970 00:00:00 GMT; Max-Age=0; Path=/') assert re.match(expected, str(resp.cookies)) def test_cookie_set_after_del() -> None: resp = StreamResponse() resp.del_cookie('name') resp.set_cookie('name', 'val') # check for Max-Age dropped expected = 'Set-Cookie: name=val; Path=/' assert str(resp.cookies) == expected def test_set_status_with_reason() -> None: resp = StreamResponse() resp.set_status(200, "Everithing is fine!") assert 200 == resp.status assert "Everithing is fine!" == resp.reason async def test_start_force_close() -> None: req = make_request('GET', '/') resp = StreamResponse() resp.force_close() assert not resp.keep_alive await resp.prepare(req) assert not resp.keep_alive async def test___repr__() -> None: req = make_request('GET', '/path/to') resp = StreamResponse(reason=301) await resp.prepare(req) assert "" == repr(resp) def test___repr___not_prepared() -> None: resp = StreamResponse(reason=301) assert "" == repr(resp) async def test_keep_alive_http10_default() -> None: req = make_request('GET', '/', version=HttpVersion10) resp = StreamResponse() await resp.prepare(req) assert not resp.keep_alive async def test_keep_alive_http10_switched_on() -> None: headers = CIMultiDict(Connection='keep-alive') req = make_request('GET', '/', version=HttpVersion10, headers=headers) req._message = req._message._replace(should_close=False) resp = StreamResponse() await resp.prepare(req) assert resp.keep_alive async def test_keep_alive_http09() -> None: headers = CIMultiDict(Connection='keep-alive') req = make_request('GET', '/', version=HttpVersion(0, 9), headers=headers) resp = StreamResponse() await resp.prepare(req) assert not resp.keep_alive async def test_prepare_twice() -> None: req = make_request('GET', '/') resp = StreamResponse() impl1 = await resp.prepare(req) impl2 = await resp.prepare(req) assert impl1 is impl2 async def test_prepare_calls_signal() -> None: app = mock.Mock() sig = make_mocked_coro() on_response_prepare = signals.Signal(app) on_response_prepare.append(sig) req = make_request('GET', '/', app=app, on_response_prepare=on_response_prepare) resp = StreamResponse() await resp.prepare(req) sig.assert_called_with(req, resp) # Response class def test_response_ctor() -> None: resp = Response() assert 200 == resp.status assert 'OK' == resp.reason assert resp.body is None assert resp.content_length == 0 assert 'CONTENT-LENGTH' not in resp.headers async def test_ctor_with_headers_and_status() -> None: resp = Response(body=b'body', status=201, headers={'Age': '12', 'DATE': 'date'}) assert 201 == resp.status assert b'body' == resp.body assert 
resp.headers['AGE'] == '12' req = make_mocked_request('GET', '/') await resp._start(req) assert 4 == resp.content_length assert resp.headers['CONTENT-LENGTH'] == '4' def test_ctor_content_type() -> None: resp = Response(content_type='application/json') assert 200 == resp.status assert 'OK' == resp.reason assert 0 == resp.content_length assert (CIMultiDict([('CONTENT-TYPE', 'application/json')]) == resp.headers) def test_ctor_text_body_combined() -> None: with pytest.raises(ValueError): Response(body=b'123', text='test text') async def test_ctor_text() -> None: resp = Response(text='test text') assert 200 == resp.status assert 'OK' == resp.reason assert 9 == resp.content_length assert (CIMultiDict( [('CONTENT-TYPE', 'text/plain; charset=utf-8')]) == resp.headers) assert resp.body == b'test text' assert resp.text == 'test text' resp.headers['DATE'] = 'date' req = make_mocked_request('GET', '/', version=HttpVersion11) await resp._start(req) assert resp.headers['CONTENT-LENGTH'] == '9' def test_ctor_charset() -> None: resp = Response(text='текст', charset='koi8-r') assert 'текст'.encode('koi8-r') == resp.body assert 'koi8-r' == resp.charset def test_ctor_charset_default_utf8() -> None: resp = Response(text='test test', charset=None) assert 'utf-8' == resp.charset def test_ctor_charset_in_content_type() -> None: with pytest.raises(ValueError): Response(text='test test', content_type='text/plain; charset=utf-8') def test_ctor_charset_without_text() -> None: resp = Response(content_type='text/plain', charset='koi8-r') assert 'koi8-r' == resp.charset def test_ctor_content_type_with_extra() -> None: resp = Response(text='test test', content_type='text/plain; version=0.0.4') assert resp.content_type == 'text/plain' assert resp.headers['content-type'] == \ 'text/plain; version=0.0.4; charset=utf-8' def test_ctor_both_content_type_param_and_header_with_text() -> None: with pytest.raises(ValueError): Response(headers={'Content-Type': 'application/json'}, content_type='text/html', text='text') def test_ctor_both_charset_param_and_header_with_text() -> None: with pytest.raises(ValueError): Response(headers={'Content-Type': 'application/json'}, charset='koi8-r', text='text') def test_ctor_both_content_type_param_and_header() -> None: with pytest.raises(ValueError): Response(headers={'Content-Type': 'application/json'}, content_type='text/html') def test_ctor_both_charset_param_and_header() -> None: with pytest.raises(ValueError): Response(headers={'Content-Type': 'application/json'}, charset='koi8-r') async def test_assign_nonbyteish_body() -> None: resp = Response(body=b'data') with pytest.raises(ValueError): resp.body = 123 assert b'data' == resp.body assert 4 == resp.content_length resp.headers['DATE'] = 'date' req = make_mocked_request('GET', '/', version=HttpVersion11) await resp._start(req) assert resp.headers['CONTENT-LENGTH'] == '4' assert 4 == resp.content_length def test_assign_nonstr_text() -> None: resp = Response(text='test') with pytest.raises(AssertionError): resp.text = b'123' assert b'test' == resp.body assert 4 == resp.content_length def test_response_set_content_length() -> None: resp = Response() with pytest.raises(RuntimeError): resp.content_length = 1 async def test_send_headers_for_empty_body(buf, writer) -> None: req = make_request('GET', '/', writer=writer) resp = Response() await resp.prepare(req) await resp.write_eof() txt = buf.decode('utf8') assert re.match('HTTP/1.1 200 OK\r\n' 'Content-Length: 0\r\n' 'Content-Type: application/octet-stream\r\n' 'Date: .+\r\n' 'Server: 
.+\r\n\r\n', txt) async def test_render_with_body(buf, writer) -> None: req = make_request('GET', '/', writer=writer) resp = Response(body=b'data') await resp.prepare(req) await resp.write_eof() txt = buf.decode('utf8') assert re.match('HTTP/1.1 200 OK\r\n' 'Content-Length: 4\r\n' 'Content-Type: application/octet-stream\r\n' 'Date: .+\r\n' 'Server: .+\r\n\r\n' 'data', txt) async def test_send_set_cookie_header(buf, writer) -> None: resp = Response() resp.cookies['name'] = 'value' req = make_request('GET', '/', writer=writer) await resp.prepare(req) await resp.write_eof() txt = buf.decode('utf8') assert re.match('HTTP/1.1 200 OK\r\n' 'Content-Length: 0\r\n' 'Set-Cookie: name=value\r\n' 'Content-Type: application/octet-stream\r\n' 'Date: .+\r\n' 'Server: .+\r\n\r\n', txt) async def test_consecutive_write_eof() -> None: writer = mock.Mock() writer.write_eof = make_mocked_coro() writer.write_headers = make_mocked_coro() req = make_request('GET', '/', writer=writer) data = b'data' resp = Response(body=data) await resp.prepare(req) await resp.write_eof() await resp.write_eof() writer.write_eof.assert_called_once_with(data) def test_set_text_with_content_type() -> None: resp = Response() resp.content_type = "text/html" resp.text = "text" assert "text" == resp.text assert b"text" == resp.body assert "text/html" == resp.content_type def test_set_text_with_charset() -> None: resp = Response() resp.content_type = 'text/plain' resp.charset = "KOI8-R" resp.text = "текст" assert "текст" == resp.text assert "текст".encode('koi8-r') == resp.body assert "koi8-r" == resp.charset def test_default_content_type_in_stream_response() -> None: resp = StreamResponse() assert resp.content_type == 'application/octet-stream' def test_default_content_type_in_response() -> None: resp = Response() assert resp.content_type == 'application/octet-stream' def test_content_type_with_set_text() -> None: resp = Response(text='text') assert resp.content_type == 'text/plain' def test_content_type_with_set_body() -> None: resp = Response(body=b'body') assert resp.content_type == 'application/octet-stream' def test_started_when_not_started() -> None: resp = StreamResponse() assert not resp.prepared async def test_started_when_started() -> None: resp = StreamResponse() await resp.prepare(make_request('GET', '/')) assert resp.prepared async def test_drain_before_start() -> None: resp = StreamResponse() with pytest.raises(AssertionError): await resp.drain() async def test_changing_status_after_prepare_raises() -> None: resp = StreamResponse() await resp.prepare(make_request('GET', '/')) with pytest.raises(AssertionError): resp.set_status(400) def test_nonstr_text_in_ctor() -> None: with pytest.raises(TypeError): Response(text=b'data') def test_text_in_ctor_with_content_type() -> None: resp = Response(text='data', content_type='text/html') assert 'data' == resp.text assert 'text/html' == resp.content_type def test_text_in_ctor_with_content_type_header() -> None: resp = Response(text='текст', headers={'Content-Type': 'text/html; charset=koi8-r'}) assert 'текст'.encode('koi8-r') == resp.body assert 'text/html' == resp.content_type assert 'koi8-r' == resp.charset def test_text_in_ctor_with_content_type_header_multidict() -> None: headers = CIMultiDict({'Content-Type': 'text/html; charset=koi8-r'}) resp = Response(text='текст', headers=headers) assert 'текст'.encode('koi8-r') == resp.body assert 'text/html' == resp.content_type assert 'koi8-r' == resp.charset def test_body_in_ctor_with_content_type_header_multidict() -> None: headers = 
CIMultiDict({'Content-Type': 'text/html; charset=koi8-r'}) resp = Response(body='текст'.encode('koi8-r'), headers=headers) assert 'текст'.encode('koi8-r') == resp.body assert 'text/html' == resp.content_type assert 'koi8-r' == resp.charset def test_text_with_empty_payload() -> None: resp = Response(status=200) assert resp.body is None assert resp.text is None def test_response_with_content_length_header_without_body() -> None: resp = Response(headers={'Content-Length': 123}) assert resp.content_length == 123 def test_response_with_immutable_headers() -> None: resp = Response(text='text', headers=CIMultiDictProxy(CIMultiDict({'Header': 'Value'}))) assert resp.headers == {'Header': 'Value', 'Content-Type': 'text/plain; charset=utf-8'} class TestJSONResponse: def test_content_type_is_application_json_by_default(self) -> None: resp = json_response('') assert 'application/json' == resp.content_type def test_passing_text_only(self) -> None: resp = json_response(text=json.dumps('jaysawn')) assert resp.text == json.dumps('jaysawn') def test_data_and_text_raises_value_error(self) -> None: with pytest.raises(ValueError) as excinfo: json_response(data='foo', text='bar') expected_message = ( 'only one of data, text, or body should be specified' ) assert expected_message == excinfo.value.args[0] def test_data_and_body_raises_value_error(self) -> None: with pytest.raises(ValueError) as excinfo: json_response(data='foo', body=b'bar') expected_message = ( 'only one of data, text, or body should be specified' ) assert expected_message == excinfo.value.args[0] def test_text_is_json_encoded(self) -> None: resp = json_response({'foo': 42}) assert json.dumps({'foo': 42}) == resp.text def test_content_type_is_overrideable(self) -> None: resp = json_response({'foo': 42}, content_type='application/vnd.json+api') assert 'application/vnd.json+api' == resp.content_type aiohttp-3.6.2/tests/test_web_runner.py0000644000175100001650000000765113547410117020417 0ustar vstsdocker00000000000000import asyncio import platform import signal import pytest from aiohttp import web from aiohttp.test_utils import get_unused_port_socket @pytest.fixture def app(): return web.Application() @pytest.fixture def make_runner(loop, app): asyncio.set_event_loop(loop) runners = [] def go(**kwargs): runner = web.AppRunner(app, **kwargs) runners.append(runner) return runner yield go for runner in runners: loop.run_until_complete(runner.cleanup()) async def test_site_for_nonfrozen_app(make_runner) -> None: runner = make_runner() with pytest.raises(RuntimeError): web.TCPSite(runner) assert len(runner.sites) == 0 @pytest.mark.skipif(platform.system() == "Windows", reason="the test is not valid for Windows") async def test_runner_setup_handle_signals(make_runner) -> None: runner = make_runner(handle_signals=True) await runner.setup() assert signal.getsignal(signal.SIGTERM) is not signal.SIG_DFL await runner.cleanup() assert signal.getsignal(signal.SIGTERM) is signal.SIG_DFL @pytest.mark.skipif(platform.system() == "Windows", reason="the test is not valid for Windows") async def test_runner_setup_without_signal_handling(make_runner) -> None: runner = make_runner(handle_signals=False) await runner.setup() assert signal.getsignal(signal.SIGTERM) is signal.SIG_DFL await runner.cleanup() assert signal.getsignal(signal.SIGTERM) is signal.SIG_DFL async def test_site_double_added(make_runner) -> None: _sock = get_unused_port_socket('127.0.0.1') runner = make_runner() await runner.setup() site = web.SockSite(runner, _sock) await site.start() with 
pytest.raises(RuntimeError): await site.start() assert len(runner.sites) == 1 async def test_site_stop_not_started(make_runner) -> None: runner = make_runner() await runner.setup() site = web.TCPSite(runner) with pytest.raises(RuntimeError): await site.stop() assert len(runner.sites) == 0 async def test_custom_log_format(make_runner) -> None: runner = make_runner(access_log_format='abc') await runner.setup() assert runner.server._kwargs['access_log_format'] == 'abc' async def test_unreg_site(make_runner) -> None: runner = make_runner() await runner.setup() site = web.TCPSite(runner) with pytest.raises(RuntimeError): runner._unreg_site(site) async def test_app_property(make_runner, app) -> None: runner = make_runner() assert runner.app is app def test_non_app() -> None: with pytest.raises(TypeError): web.AppRunner(object()) @pytest.mark.skipif(platform.system() == "Windows", reason="Unix socket support is required") async def test_addresses(make_runner, shorttmpdir) -> None: _sock = get_unused_port_socket('127.0.0.1') runner = make_runner() await runner.setup() tcp = web.SockSite(runner, _sock) await tcp.start() path = str(shorttmpdir / 'tmp.sock') unix = web.UnixSite(runner, path) await unix.start() actual_addrs = runner.addresses expected_host, expected_post = _sock.getsockname()[:2] assert actual_addrs == [(expected_host, expected_post), path] @pytest.mark.skipif(platform.system() != "Windows", reason="Proactor Event loop present only in Windows") async def test_named_pipe_runner_wrong_loop(app, pipe_name) -> None: runner = web.AppRunner(app) await runner.setup() with pytest.raises(RuntimeError): web.NamedPipeSite(runner, pipe_name) @pytest.mark.skipif(platform.system() != "Windows", reason="Proactor Event loop present only in Windows") async def test_named_pipe_runner_proactor_loop( proactor_loop, app, pipe_name ) -> None: runner = web.AppRunner(app) await runner.setup() pipe = web.NamedPipeSite(runner, pipe_name) await pipe.start() await runner.cleanup() aiohttp-3.6.2/tests/test_web_sendfile.py0000644000175100001650000000731713547410117020676 0ustar vstsdocker00000000000000from unittest import mock from aiohttp import hdrs from aiohttp.test_utils import make_mocked_coro, make_mocked_request from aiohttp.web_fileresponse import FileResponse def test_using_gzip_if_header_present_and_file_available(loop) -> None: request = make_mocked_request( 'GET', 'http://python.org/logo.png', headers={ hdrs.ACCEPT_ENCODING: 'gzip' } ) gz_filepath = mock.Mock() gz_filepath.open = mock.mock_open() gz_filepath.is_file.return_value = True gz_filepath.stat.return_value = mock.MagicMock() gz_filepath.stat.st_size = 1024 filepath = mock.Mock() filepath.name = 'logo.png' filepath.open = mock.mock_open() filepath.with_name.return_value = gz_filepath file_sender = FileResponse(filepath) file_sender._sendfile = make_mocked_coro(None) loop.run_until_complete(file_sender.prepare(request)) assert not filepath.open.called assert gz_filepath.open.called def test_gzip_if_header_not_present_and_file_available(loop) -> None: request = make_mocked_request( 'GET', 'http://python.org/logo.png', headers={ } ) gz_filepath = mock.Mock() gz_filepath.open = mock.mock_open() gz_filepath.is_file.return_value = True filepath = mock.Mock() filepath.name = 'logo.png' filepath.open = mock.mock_open() filepath.with_name.return_value = gz_filepath filepath.stat.return_value = mock.MagicMock() filepath.stat.st_size = 1024 file_sender = FileResponse(filepath) file_sender._sendfile = make_mocked_coro(None) 
loop.run_until_complete(file_sender.prepare(request)) assert filepath.open.called assert not gz_filepath.open.called def test_gzip_if_header_not_present_and_file_not_available(loop) -> None: request = make_mocked_request( 'GET', 'http://python.org/logo.png', headers={ } ) gz_filepath = mock.Mock() gz_filepath.open = mock.mock_open() gz_filepath.is_file.return_value = False filepath = mock.Mock() filepath.name = 'logo.png' filepath.open = mock.mock_open() filepath.with_name.return_value = gz_filepath filepath.stat.return_value = mock.MagicMock() filepath.stat.st_size = 1024 file_sender = FileResponse(filepath) file_sender._sendfile = make_mocked_coro(None) loop.run_until_complete(file_sender.prepare(request)) assert filepath.open.called assert not gz_filepath.open.called def test_gzip_if_header_present_and_file_not_available(loop) -> None: request = make_mocked_request( 'GET', 'http://python.org/logo.png', headers={ hdrs.ACCEPT_ENCODING: 'gzip' } ) gz_filepath = mock.Mock() gz_filepath.open = mock.mock_open() gz_filepath.is_file.return_value = False filepath = mock.Mock() filepath.name = 'logo.png' filepath.open = mock.mock_open() filepath.with_name.return_value = gz_filepath filepath.stat.return_value = mock.MagicMock() filepath.stat.st_size = 1024 file_sender = FileResponse(filepath) file_sender._sendfile = make_mocked_coro(None) loop.run_until_complete(file_sender.prepare(request)) assert filepath.open.called assert not gz_filepath.open.called def test_status_controlled_by_user(loop) -> None: request = make_mocked_request( 'GET', 'http://python.org/logo.png', headers={ } ) filepath = mock.Mock() filepath.name = 'logo.png' filepath.open = mock.mock_open() filepath.stat.return_value = mock.MagicMock() filepath.stat.st_size = 1024 file_sender = FileResponse(filepath, status=203) file_sender._sendfile = make_mocked_coro(None) loop.run_until_complete(file_sender.prepare(request)) assert file_sender._status == 203 aiohttp-3.6.2/tests/test_web_sendfile_functional.py0000644000175100001650000006010413547410117023111 0ustar vstsdocker00000000000000import asyncio import os import pathlib import socket import zlib import pytest import aiohttp from aiohttp import web try: import ssl except ImportError: ssl = None # type: ignore @pytest.fixture(params=['sendfile', 'fallback'], ids=['sendfile', 'fallback']) def sender(request): def maker(*args, **kwargs): ret = web.FileResponse(*args, **kwargs) if request.param == 'fallback': ret._sendfile = ret._sendfile_fallback return ret return maker async def test_static_file_ok(aiohttp_client, sender) -> None: filepath = pathlib.Path(__file__).parent / 'data.unknown_mime_type' async def handler(request): return sender(filepath) app = web.Application() app.router.add_get('/', handler) client = await aiohttp_client(app) resp = await client.get('/') assert resp.status == 200 txt = await resp.text() assert 'file content' == txt.rstrip() assert 'application/octet-stream' == resp.headers['Content-Type'] assert resp.headers.get('Content-Encoding') is None await resp.release() async def test_static_file_ok_string_path(aiohttp_client, sender) -> None: filepath = pathlib.Path(__file__).parent / 'data.unknown_mime_type' async def handler(request): return sender(str(filepath)) app = web.Application() app.router.add_get('/', handler) client = await aiohttp_client(app) resp = await client.get('/') assert resp.status == 200 txt = await resp.text() assert 'file content' == txt.rstrip() assert 'application/octet-stream' == resp.headers['Content-Type'] assert 
resp.headers.get('Content-Encoding') is None await resp.release() async def test_static_file_not_exists(aiohttp_client) -> None: app = web.Application() client = await aiohttp_client(app) resp = await client.get('/fake') assert resp.status == 404 await resp.release() async def test_static_file_name_too_long(aiohttp_client) -> None: app = web.Application() client = await aiohttp_client(app) resp = await client.get('/x*500') assert resp.status == 404 await resp.release() async def test_static_file_upper_directory(aiohttp_client) -> None: app = web.Application() client = await aiohttp_client(app) resp = await client.get('/../../') assert resp.status == 404 await resp.release() async def test_static_file_with_content_type(aiohttp_client, sender) -> None: filepath = (pathlib.Path(__file__).parent / 'aiohttp.jpg') async def handler(request): return sender(filepath, chunk_size=16) app = web.Application() app.router.add_get('/', handler) client = await aiohttp_client(app) resp = await client.get('/') assert resp.status == 200 body = await resp.read() with filepath.open('rb') as f: content = f.read() assert content == body assert resp.headers['Content-Type'] == 'image/jpeg' assert resp.headers.get('Content-Encoding') is None resp.close() async def test_static_file_custom_content_type(aiohttp_client, sender) -> None: filepath = (pathlib.Path(__file__).parent / 'hello.txt.gz') async def handler(request): resp = sender(filepath, chunk_size=16) resp.content_type = 'application/pdf' return resp app = web.Application() app.router.add_get('/', handler) client = await aiohttp_client(app) resp = await client.get('/') assert resp.status == 200 body = await resp.read() with filepath.open('rb') as f: content = f.read() assert content == body assert resp.headers['Content-Type'] == 'application/pdf' assert resp.headers.get('Content-Encoding') is None resp.close() async def test_static_file_custom_content_type_compress(aiohttp_client, sender): filepath = (pathlib.Path(__file__).parent / 'hello.txt') async def handler(request): resp = sender(filepath, chunk_size=16) resp.content_type = 'application/pdf' return resp app = web.Application() app.router.add_get('/', handler) client = await aiohttp_client(app) resp = await client.get('/') assert resp.status == 200 body = await resp.read() assert b'hello aiohttp\n' == body assert resp.headers['Content-Type'] == 'application/pdf' assert resp.headers.get('Content-Encoding') == 'gzip' resp.close() async def test_static_file_with_content_encoding(aiohttp_client, sender) -> None: filepath = pathlib.Path(__file__).parent / 'hello.txt.gz' async def handler(request): return sender(filepath) app = web.Application() app.router.add_get('/', handler) client = await aiohttp_client(app) resp = await client.get('/') assert 200 == resp.status body = await resp.read() assert b'hello aiohttp\n' == body ct = resp.headers['CONTENT-TYPE'] assert 'text/plain' == ct encoding = resp.headers['CONTENT-ENCODING'] assert 'gzip' == encoding resp.close() async def test_static_file_if_modified_since(aiohttp_client, sender) -> None: filename = 'data.unknown_mime_type' filepath = pathlib.Path(__file__).parent / filename async def handler(request): return sender(filepath) app = web.Application() app.router.add_get('/', handler) client = await aiohttp_client(app) resp = await client.get('/') assert 200 == resp.status lastmod = resp.headers.get('Last-Modified') assert lastmod is not None resp.close() resp = await client.get('/', headers={'If-Modified-Since': lastmod}) body = await resp.read() assert 304 
== resp.status assert resp.headers.get('Content-Length') is None assert b'' == body resp.close() async def test_static_file_if_modified_since_past_date(aiohttp_client, sender) -> None: filename = 'data.unknown_mime_type' filepath = pathlib.Path(__file__).parent / filename async def handler(request): return sender(filepath) app = web.Application() app.router.add_get('/', handler) client = await aiohttp_client(app) lastmod = 'Mon, 1 Jan 1990 01:01:01 GMT' resp = await client.get('/', headers={'If-Modified-Since': lastmod}) assert 200 == resp.status resp.close() async def test_static_file_if_modified_since_invalid_date(aiohttp_client, sender): filename = 'data.unknown_mime_type' filepath = pathlib.Path(__file__).parent / filename async def handler(request): return sender(filepath) app = web.Application() app.router.add_get('/', handler) client = await aiohttp_client(app) lastmod = 'not a valid HTTP-date' resp = await client.get('/', headers={'If-Modified-Since': lastmod}) assert 200 == resp.status resp.close() async def test_static_file_if_modified_since_future_date(aiohttp_client, sender): filename = 'data.unknown_mime_type' filepath = pathlib.Path(__file__).parent / filename async def handler(request): return sender(filepath) app = web.Application() app.router.add_get('/', handler) client = await aiohttp_client(app) lastmod = 'Fri, 31 Dec 9999 23:59:59 GMT' resp = await client.get('/', headers={'If-Modified-Since': lastmod}) body = await resp.read() assert 304 == resp.status assert resp.headers.get('Content-Length') is None assert b'' == body resp.close() @pytest.mark.skipif(not ssl, reason="ssl not supported") async def test_static_file_ssl( aiohttp_server, ssl_ctx, aiohttp_client, client_ssl_ctx, ) -> None: dirname = os.path.dirname(__file__) filename = 'data.unknown_mime_type' app = web.Application() app.router.add_static('/static', dirname) server = await aiohttp_server(app, ssl=ssl_ctx) conn = aiohttp.TCPConnector(ssl=client_ssl_ctx) client = await aiohttp_client(server, connector=conn) resp = await client.get('/static/'+filename) assert 200 == resp.status txt = await resp.text() assert 'file content' == txt.rstrip() ct = resp.headers['CONTENT-TYPE'] assert 'application/octet-stream' == ct assert resp.headers.get('CONTENT-ENCODING') is None async def test_static_file_directory_traversal_attack(aiohttp_client) -> None: dirname = os.path.dirname(__file__) relpath = '../README.rst' assert os.path.isfile(os.path.join(dirname, relpath)) app = web.Application() app.router.add_static('/static', dirname) client = await aiohttp_client(app) resp = await client.get('/static/'+relpath) assert 404 == resp.status url_relpath2 = '/static/dir/../' + relpath resp = await client.get(url_relpath2) assert 404 == resp.status url_abspath = \ '/static/' + os.path.abspath(os.path.join(dirname, relpath)) resp = await client.get(url_abspath) assert 403 == resp.status def test_static_route_path_existence_check() -> None: directory = os.path.dirname(__file__) web.StaticResource("/", directory) nodirectory = os.path.join(directory, "nonexistent-uPNiOEAg5d") with pytest.raises(ValueError): web.StaticResource("/", nodirectory) async def test_static_file_huge(aiohttp_client, tmpdir) -> None: filename = 'huge_data.unknown_mime_type' # fill 20MB file with tmpdir.join(filename).open('w') as f: for i in range(1024*20): f.write(chr(i % 64 + 0x20) * 1024) file_st = os.stat(str(tmpdir.join(filename))) app = web.Application() app.router.add_static('/static', str(tmpdir)) client = await aiohttp_client(app) resp = await 
client.get('/static/'+filename) assert 200 == resp.status ct = resp.headers['CONTENT-TYPE'] assert 'application/octet-stream' == ct assert resp.headers.get('CONTENT-ENCODING') is None assert int(resp.headers.get('CONTENT-LENGTH')) == file_st.st_size f = tmpdir.join(filename).open('rb') off = 0 cnt = 0 while off < file_st.st_size: chunk = await resp.content.readany() expected = f.read(len(chunk)) assert chunk == expected off += len(chunk) cnt += 1 f.close() async def test_static_file_range(aiohttp_client, sender) -> None: filepath = (pathlib.Path(__file__).parent.parent / 'LICENSE.txt') filesize = filepath.stat().st_size async def handler(request): return sender(filepath, chunk_size=16) app = web.Application() app.router.add_get('/', handler) client = await aiohttp_client(app) with filepath.open('rb') as f: content = f.read() # Ensure the whole file requested in parts is correct responses = await asyncio.gather( client.get('/', headers={'Range': 'bytes=0-999'}), client.get('/', headers={'Range': 'bytes=1000-1999'}), client.get('/', headers={'Range': 'bytes=2000-'}), ) assert len(responses) == 3 assert responses[0].status == 206, \ "failed 'bytes=0-999': %s" % responses[0].reason assert responses[0].headers['Content-Range'] == 'bytes 0-999/{0}'.format( filesize), 'failed: Content-Range Error' assert responses[1].status == 206, \ "failed 'bytes=1000-1999': %s" % responses[1].reason assert responses[1].headers['Content-Range'] == \ 'bytes 1000-1999/{0}'.format(filesize), 'failed: Content-Range Error' assert responses[2].status == 206, \ "failed 'bytes=2000-': %s" % responses[2].reason assert responses[2].headers['Content-Range'] == \ 'bytes 2000-{0}/{1}'.format(filesize - 1, filesize), \ 'failed: Content-Range Error' body = await asyncio.gather( *(resp.read() for resp in responses), ) assert len(body[0]) == 1000, \ "failed 'bytes=0-999', received %d bytes" % len(body[0]) assert len(body[1]) == 1000, \ "failed 'bytes=1000-1999', received %d bytes" % len(body[1]) responses[0].close() responses[1].close() responses[2].close() assert content == b"".join(body) async def test_static_file_range_end_bigger_than_size( aiohttp_client, sender ): filepath = (pathlib.Path(__file__).parent / 'aiohttp.png') async def handler(request): return sender(filepath, chunk_size=16) app = web.Application() app.router.add_get('/', handler) client = await aiohttp_client(app) with filepath.open('rb') as f: content = f.read() # Ensure the whole file requested in parts is correct response = await client.get( '/', headers={'Range': 'bytes=54000-55000'}) assert response.status == 206, \ "failed 'bytes=54000-55000': %s" % response.reason assert response.headers['Content-Range'] == \ 'bytes 54000-54996/54997', 'failed: Content-Range Error' body = await response.read() assert len(body) == 997, \ "failed 'bytes=54000-55000', received %d bytes" % len(body) assert content[54000:] == body async def test_static_file_range_beyond_eof(aiohttp_client, sender) -> None: filepath = (pathlib.Path(__file__).parent / 'aiohttp.png') async def handler(request): return sender(filepath, chunk_size=16) app = web.Application() app.router.add_get('/', handler) client = await aiohttp_client(app) # Ensure the whole file requested in parts is correct response = await client.get( '/', headers={'Range': 'bytes=1000000-1200000'}) assert response.status == 416, \ "failed 'bytes=1000000-1200000': %s" % response.reason async def test_static_file_range_tail(aiohttp_client, sender) -> None: filepath = (pathlib.Path(__file__).parent / 'aiohttp.png') async def 
handler(request): return sender(filepath, chunk_size=16) app = web.Application() app.router.add_get('/', handler) client = await aiohttp_client(app) with filepath.open('rb') as f: content = f.read() # Ensure the tail of the file is correct resp = await client.get('/', headers={'Range': 'bytes=-500'}) assert resp.status == 206, resp.reason assert resp.headers['Content-Range'] == 'bytes 54497-54996/54997', \ 'failed: Content-Range Error' body4 = await resp.read() resp.close() assert content[-500:] == body4 # Ensure out-of-range tails could be handled resp2 = await client.get('/', headers={'Range': 'bytes=-99999999999999'}) assert resp2.status == 206, resp.reason assert resp2.headers['Content-Range'] == 'bytes 0-54996/54997', \ 'failed: Content-Range Error' async def test_static_file_invalid_range(aiohttp_client, sender) -> None: filepath = (pathlib.Path(__file__).parent / 'aiohttp.png') async def handler(request): return sender(filepath, chunk_size=16) app = web.Application() app.router.add_get('/', handler) client = await aiohttp_client(app) # range must be in bytes resp = await client.get('/', headers={'Range': 'blocks=0-10'}) assert resp.status == 416, 'Range must be in bytes' resp.close() # start > end resp = await client.get('/', headers={'Range': 'bytes=100-0'}) assert resp.status == 416, "Range start can't be greater than end" resp.close() # start > end resp = await client.get('/', headers={'Range': 'bytes=10-9'}) assert resp.status == 416, "Range start can't be greater than end" resp.close() # non-number range resp = await client.get('/', headers={'Range': 'bytes=a-f'}) assert resp.status == 416, 'Range must be integers' resp.close() # double dash range resp = await client.get('/', headers={'Range': 'bytes=0--10'}) assert resp.status == 416, 'double dash in range' resp.close() # no range resp = await client.get('/', headers={'Range': 'bytes=-'}) assert resp.status == 416, 'no range given' resp.close() async def test_static_file_if_unmodified_since_past_with_range( aiohttp_client, sender): filename = 'data.unknown_mime_type' filepath = pathlib.Path(__file__).parent / filename async def handler(request): return sender(filepath) app = web.Application() app.router.add_get('/', handler) client = await aiohttp_client(app) lastmod = 'Mon, 1 Jan 1990 01:01:01 GMT' resp = await client.get('/', headers={ 'If-Unmodified-Since': lastmod, 'Range': 'bytes=2-'}) assert 412 == resp.status resp.close() async def test_static_file_if_unmodified_since_future_with_range( aiohttp_client, sender): filename = 'data.unknown_mime_type' filepath = pathlib.Path(__file__).parent / filename async def handler(request): return sender(filepath) app = web.Application() app.router.add_get('/', handler) client = await aiohttp_client(app) lastmod = 'Fri, 31 Dec 9999 23:59:59 GMT' resp = await client.get('/', headers={ 'If-Unmodified-Since': lastmod, 'Range': 'bytes=2-'}) assert 206 == resp.status assert resp.headers['Content-Range'] == 'bytes 2-12/13' assert resp.headers['Content-Length'] == '11' resp.close() async def test_static_file_if_range_past_with_range( aiohttp_client, sender): filename = 'data.unknown_mime_type' filepath = pathlib.Path(__file__).parent / filename async def handler(request): return sender(filepath) app = web.Application() app.router.add_get('/', handler) client = await aiohttp_client(app) lastmod = 'Mon, 1 Jan 1990 01:01:01 GMT' resp = await client.get('/', headers={ 'If-Range': lastmod, 'Range': 'bytes=2-'}) assert 200 == resp.status assert resp.headers['Content-Length'] == '13' resp.close() 
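# The conditional-request tests below combine If-Range / If-Unmodified-Since
# validators with and without a Range header: a validator dated in the future
# keeps the expected 206 partial or 200 full response, a stale If-Range falls
# back to the full 200 response, and a stale If-Unmodified-Since yields
# 412 Precondition Failed.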
async def test_static_file_if_range_future_with_range( aiohttp_client, sender): filename = 'data.unknown_mime_type' filepath = pathlib.Path(__file__).parent / filename async def handler(request): return sender(filepath) app = web.Application() app.router.add_get('/', handler) client = await aiohttp_client(app) lastmod = 'Fri, 31 Dec 9999 23:59:59 GMT' resp = await client.get('/', headers={ 'If-Range': lastmod, 'Range': 'bytes=2-'}) assert 206 == resp.status assert resp.headers['Content-Range'] == 'bytes 2-12/13' assert resp.headers['Content-Length'] == '11' resp.close() async def test_static_file_if_unmodified_since_past_without_range( aiohttp_client, sender): filename = 'data.unknown_mime_type' filepath = pathlib.Path(__file__).parent / filename async def handler(request): return sender(filepath) app = web.Application() app.router.add_get('/', handler) client = await aiohttp_client(app) lastmod = 'Mon, 1 Jan 1990 01:01:01 GMT' resp = await client.get('/', headers={'If-Unmodified-Since': lastmod}) assert 412 == resp.status resp.close() async def test_static_file_if_unmodified_since_future_without_range( aiohttp_client, sender): filename = 'data.unknown_mime_type' filepath = pathlib.Path(__file__).parent / filename async def handler(request): return sender(filepath) app = web.Application() app.router.add_get('/', handler) client = await aiohttp_client(app) lastmod = 'Fri, 31 Dec 9999 23:59:59 GMT' resp = await client.get('/', headers={'If-Unmodified-Since': lastmod}) assert 200 == resp.status assert resp.headers['Content-Length'] == '13' resp.close() async def test_static_file_if_range_past_without_range( aiohttp_client, sender): filename = 'data.unknown_mime_type' filepath = pathlib.Path(__file__).parent / filename async def handler(request): return sender(filepath) app = web.Application() app.router.add_get('/', handler) client = await aiohttp_client(app) lastmod = 'Mon, 1 Jan 1990 01:01:01 GMT' resp = await client.get('/', headers={'If-Range': lastmod}) assert 200 == resp.status assert resp.headers['Content-Length'] == '13' resp.close() async def test_static_file_if_range_future_without_range( aiohttp_client, sender): filename = 'data.unknown_mime_type' filepath = pathlib.Path(__file__).parent / filename async def handler(request): return sender(filepath) app = web.Application() app.router.add_get('/', handler) client = await aiohttp_client(app) lastmod = 'Fri, 31 Dec 9999 23:59:59 GMT' resp = await client.get('/', headers={'If-Range': lastmod}) assert 200 == resp.status assert resp.headers['Content-Length'] == '13' resp.close() async def test_static_file_if_unmodified_since_invalid_date(aiohttp_client, sender): filename = 'data.unknown_mime_type' filepath = pathlib.Path(__file__).parent / filename async def handler(request): return sender(filepath) app = web.Application() app.router.add_get('/', handler) client = await aiohttp_client(app) lastmod = 'not a valid HTTP-date' resp = await client.get('/', headers={'If-Unmodified-Since': lastmod}) assert 200 == resp.status resp.close() async def test_static_file_if_range_invalid_date(aiohttp_client, sender): filename = 'data.unknown_mime_type' filepath = pathlib.Path(__file__).parent / filename async def handler(request): return sender(filepath) app = web.Application() app.router.add_get('/', handler) client = await aiohttp_client(app) lastmod = 'not a valid HTTP-date' resp = await client.get('/', headers={'If-Range': lastmod}) assert 200 == resp.status resp.close() async def test_static_file_compression(aiohttp_client, sender) -> None: 
filepath = pathlib.Path(__file__).parent / 'data.unknown_mime_type' async def handler(request): ret = sender(filepath) ret.enable_compression() return ret app = web.Application() app.router.add_get('/', handler) client = await aiohttp_client(app, auto_decompress=False) resp = await client.get('/') assert resp.status == 200 zcomp = zlib.compressobj(wbits=-zlib.MAX_WBITS) expected_body = zcomp.compress(b'file content\n') + zcomp.flush() assert expected_body == await resp.read() assert 'application/octet-stream' == resp.headers['Content-Type'] assert resp.headers.get('Content-Encoding') == 'deflate' await resp.release() async def test_static_file_huge_cancel(aiohttp_client, tmpdir) -> None: filename = 'huge_data.unknown_mime_type' # fill 100MB file with tmpdir.join(filename).open('w') as f: for i in range(1024*20): f.write(chr(i % 64 + 0x20) * 1024) task = None async def handler(request): nonlocal task task = request.task # reduce send buffer size tr = request.transport sock = tr.get_extra_info('socket') sock.setsockopt(socket.SOL_SOCKET, socket.SO_SNDBUF, 1024) ret = web.FileResponse(pathlib.Path(str(tmpdir.join(filename)))) return ret app = web.Application() app.router.add_get('/', handler) client = await aiohttp_client(app) resp = await client.get('/') assert resp.status == 200 task.cancel() await asyncio.sleep(0) data = b'' while True: try: data += await resp.content.read(1024) except aiohttp.ClientPayloadError: break assert len(data) < 1024 * 1024 * 20 async def test_static_file_huge_error(aiohttp_client, tmpdir) -> None: filename = 'huge_data.unknown_mime_type' # fill 20MB file with tmpdir.join(filename).open('wb') as f: f.seek(20*1024*1024) f.write(b'1') async def handler(request): # reduce send buffer size tr = request.transport sock = tr.get_extra_info('socket') sock.setsockopt(socket.SOL_SOCKET, socket.SO_SNDBUF, 1024) ret = web.FileResponse(pathlib.Path(str(tmpdir.join(filename)))) return ret app = web.Application() app.router.add_get('/', handler) client = await aiohttp_client(app) resp = await client.get('/') assert resp.status == 200 # raise an exception on server side resp.close() aiohttp-3.6.2/tests/test_web_server.py0000644000175100001650000001222013547410117020400 0ustar vstsdocker00000000000000import asyncio from unittest import mock import pytest from aiohttp import client, web async def test_simple_server(aiohttp_raw_server, aiohttp_client) -> None: async def handler(request): return web.Response(text=str(request.rel_url)) server = await aiohttp_raw_server(handler) cli = await aiohttp_client(server) resp = await cli.get('/path/to') assert resp.status == 200 txt = await resp.text() assert txt == '/path/to' async def test_raw_server_not_http_exception(aiohttp_raw_server, aiohttp_client): exc = RuntimeError("custom runtime error") async def handler(request): raise exc logger = mock.Mock() server = await aiohttp_raw_server(handler, logger=logger, debug=False) cli = await aiohttp_client(server) resp = await cli.get('/path/to') assert resp.status == 500 assert resp.headers['Content-Type'].startswith('text/plain') txt = await resp.text() assert txt.startswith('500 Internal Server Error') assert 'Traceback' not in txt logger.exception.assert_called_with( "Error handling request", exc_info=exc) async def test_raw_server_handler_timeout(aiohttp_raw_server, aiohttp_client) -> None: exc = asyncio.TimeoutError("error") async def handler(request): raise exc logger = mock.Mock() server = await aiohttp_raw_server(handler, logger=logger) cli = await aiohttp_client(server) resp = await 
cli.get('/path/to')
    assert resp.status == 504

    await resp.text()
    logger.debug.assert_called_with("Request handler timed out.",
                                    exc_info=exc)


async def test_raw_server_do_not_swallow_exceptions(aiohttp_raw_server,
                                                     aiohttp_client):
    async def handler(request):
        raise asyncio.CancelledError()

    logger = mock.Mock()
    server = await aiohttp_raw_server(handler, logger=logger)
    cli = await aiohttp_client(server)

    with pytest.raises(client.ServerDisconnectedError):
        await cli.get('/path/to')
    logger.debug.assert_called_with('Ignored premature client disconnection')


async def test_raw_server_cancelled_in_write_eof(aiohttp_raw_server,
                                                 aiohttp_client):
    async def handler(request):
        resp = web.Response(text=str(request.rel_url))
        resp.write_eof = mock.Mock(
            side_effect=asyncio.CancelledError("error"))
        return resp

    logger = mock.Mock()
    server = await aiohttp_raw_server(handler, logger=logger)
    cli = await aiohttp_client(server)

    resp = await cli.get('/path/to')
    with pytest.raises(client.ClientPayloadError):
        await resp.read()

    logger.debug.assert_called_with('Ignored premature client disconnection ')


async def test_raw_server_not_http_exception_debug(aiohttp_raw_server,
                                                    aiohttp_client):
    exc = RuntimeError("custom runtime error")

    async def handler(request):
        raise exc

    logger = mock.Mock()
    server = await aiohttp_raw_server(handler, logger=logger, debug=True)
    cli = await aiohttp_client(server)
    resp = await cli.get('/path/to')
    assert resp.status == 500
    assert resp.headers['Content-Type'].startswith('text/plain')

    txt = await resp.text()
    assert 'Traceback (most recent call last):\n' in txt

    logger.exception.assert_called_with(
        "Error handling request", exc_info=exc)


async def test_raw_server_html_exception(aiohttp_raw_server,
                                         aiohttp_client):
    exc = RuntimeError("custom runtime error")

    async def handler(request):
        raise exc

    logger = mock.Mock()
    server = await aiohttp_raw_server(handler, logger=logger, debug=False)
    cli = await aiohttp_client(server)
    resp = await cli.get('/path/to', headers={'Accept': 'text/html'})
    assert resp.status == 500
    assert resp.headers['Content-Type'].startswith('text/html')

    txt = await resp.text()
    assert txt == (
        '<html><head><title>500 Internal Server Error</title></head><body>\n'
        '<h1>500 Internal Server Error</h1>\n'
        'Server got itself in trouble\n'
        '</body></html>\n'
    )

    logger.exception.assert_called_with(
        "Error handling request", exc_info=exc)


async def test_raw_server_html_exception_debug(aiohttp_raw_server,
                                               aiohttp_client):
    exc = RuntimeError("custom runtime error")

    async def handler(request):
        raise exc

    logger = mock.Mock()
    server = await aiohttp_raw_server(handler, logger=logger, debug=True)
    cli = await aiohttp_client(server)
    resp = await cli.get('/path/to', headers={'Accept': 'text/html'})
    assert resp.status == 500
    assert resp.headers['Content-Type'].startswith('text/html')

    txt = await resp.text()
    assert txt.startswith(
        '<html><head><title>500 Internal Server Error</title></head><body>\n'
        '<h1>500 Internal Server Error</h1>\n'
        '<h2>Traceback:</h2>\n'
        '<pre>Traceback (most recent call last):\n'
    )

    logger.exception.assert_called_with(
        "Error handling request", exc_info=exc)
    aiohttp-3.6.2/tests/test_web_urldispatcher.py0000644000175100001650000003323213547410117021751 0ustar  vstsdocker00000000000000import functools
    import os
    import pathlib
    import shutil
    import tempfile
    from unittest import mock
    from unittest.mock import MagicMock
    
    import pytest
    
    from aiohttp import abc, web
    from aiohttp.web_urldispatcher import SystemRoute
    
    
    @pytest.fixture(scope='function')
    def tmp_dir_path(request):
        """
        Give a path for a temporary directory
        The directory is destroyed at the end of the test.
        """
        # Temporary directory.
        tmp_dir = tempfile.mkdtemp()
    
        def teardown():
            # Delete the whole directory:
            shutil.rmtree(tmp_dir)
    
        request.addfinalizer(teardown)
        return tmp_dir
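
    # Hedged editor's sketch (not part of aiohttp): newer pytest releases ship
    # a built-in ``tmp_path`` fixture that creates and cleans up temporary
    # directories on its own, so an equivalent fixture (hypothetical name)
    # could be reduced to:
    @pytest.fixture
    def tmp_dir_path_alt(tmp_path):
        # pytest manages the lifetime of ``tmp_path`` automatically.
        return str(tmp_path)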
    
    
    @pytest.mark.parametrize(
        "show_index,status,prefix,data",
        [pytest.param(False, 403, '/', None, id="index_forbidden"),
         pytest.param(True, 200, '/',
                      b'\n\nIndex of /.\n'
                      b'\n\n

    Index of /.

    \n
    \n\n', id="index_root"), pytest.param(True, 200, '/static', b'\n\nIndex of /.\n' b'\n\n

    Index of /.

    \n\n\n', id="index_static")]) async def test_access_root_of_static_handler(tmp_dir_path, aiohttp_client, show_index, status, prefix, data) -> None: """ Tests the operation of static file server. Try to access the root of static file server, and make sure that correct HTTP statuses are returned depending if we directory index should be shown or not. """ # Put a file inside tmp_dir_path: my_file_path = os.path.join(tmp_dir_path, 'my_file') with open(my_file_path, 'w') as fw: fw.write('hello') my_dir_path = os.path.join(tmp_dir_path, 'my_dir') os.mkdir(my_dir_path) my_file_path = os.path.join(my_dir_path, 'my_file_in_dir') with open(my_file_path, 'w') as fw: fw.write('world') app = web.Application() # Register global static route: app.router.add_static(prefix, tmp_dir_path, show_index=show_index) client = await aiohttp_client(app) # Request the root of the static directory. r = await client.get(prefix) assert r.status == status if data: assert r.headers['Content-Type'] == "text/html; charset=utf-8" read_ = (await r.read()) assert read_ == data async def test_follow_symlink(tmp_dir_path, aiohttp_client) -> None: """ Tests the access to a symlink, in static folder """ data = 'hello world' my_dir_path = os.path.join(tmp_dir_path, 'my_dir') os.mkdir(my_dir_path) my_file_path = os.path.join(my_dir_path, 'my_file_in_dir') with open(my_file_path, 'w') as fw: fw.write(data) my_symlink_path = os.path.join(tmp_dir_path, 'my_symlink') os.symlink(my_dir_path, my_symlink_path) app = web.Application() # Register global static route: app.router.add_static('/', tmp_dir_path, follow_symlinks=True) client = await aiohttp_client(app) # Request the root of the static directory. r = await client.get('/my_symlink/my_file_in_dir') assert r.status == 200 assert (await r.text()) == data @pytest.mark.parametrize('dir_name,filename,data', [ ('', 'test file.txt', 'test text'), ('test dir name', 'test dir file .txt', 'test text file folder') ]) async def test_access_to_the_file_with_spaces(tmp_dir_path, aiohttp_client, dir_name, filename, data): """ Checks operation of static files with spaces """ my_dir_path = os.path.join(tmp_dir_path, dir_name) if dir_name: os.mkdir(my_dir_path) my_file_path = os.path.join(my_dir_path, filename) with open(my_file_path, 'w') as fw: fw.write(data) app = web.Application() url = os.path.join('/', dir_name, filename) app.router.add_static('/', tmp_dir_path) client = await aiohttp_client(app) r = await client.get(url) assert r.status == 200 assert (await r.text()) == data async def test_access_non_existing_resource(tmp_dir_path, aiohttp_client) -> None: """ Tests accessing non-existing resource Try to access a non-exiting resource and make sure that 404 HTTP status returned. """ app = web.Application() # Register global static route: app.router.add_static('/', tmp_dir_path, show_index=True) client = await aiohttp_client(app) # Request the root of the static directory. 
r = await client.get('/non_existing_resource') assert r.status == 404 @pytest.mark.parametrize('registered_path,request_url', [ ('/a:b', '/a:b'), ('/a@b', '/a@b'), ('/a:b', '/a%3Ab'), ]) async def test_url_escaping(aiohttp_client, registered_path, request_url) -> None: """ Tests accessing a resource with """ app = web.Application() async def handler(request): return web.Response() app.router.add_get(registered_path, handler) client = await aiohttp_client(app) r = await client.get(request_url) assert r.status == 200 async def test_handler_metadata_persistence() -> None: """ Tests accessing metadata of a handler after registering it on the app router. """ app = web.Application() async def async_handler(request): """Doc""" return web.Response() def sync_handler(request): """Doc""" return web.Response() app.router.add_get('/async', async_handler) with pytest.warns(DeprecationWarning): app.router.add_get('/sync', sync_handler) for resource in app.router.resources(): for route in resource: assert route.handler.__doc__ == 'Doc' async def test_unauthorized_folder_access(tmp_dir_path, aiohttp_client) -> None: """ Tests the unauthorized access to a folder of static file server. Try to list a folder content of static file server when server does not have permissions to do so for the folder. """ my_dir_path = os.path.join(tmp_dir_path, 'my_dir') os.mkdir(my_dir_path) app = web.Application() with mock.patch('pathlib.Path.__new__') as path_constructor: path = MagicMock() path.joinpath.return_value = path path.resolve.return_value = path path.iterdir.return_value.__iter__.side_effect = PermissionError() path_constructor.return_value = path # Register global static route: app.router.add_static('/', tmp_dir_path, show_index=True) client = await aiohttp_client(app) # Request the root of the static directory. r = await client.get('/my_dir') assert r.status == 403 async def test_access_symlink_loop(tmp_dir_path, aiohttp_client) -> None: """ Tests the access to a looped symlink, which could not be resolved. """ my_dir_path = os.path.join(tmp_dir_path, 'my_symlink') os.symlink(my_dir_path, my_dir_path) app = web.Application() # Register global static route: app.router.add_static('/', tmp_dir_path, show_index=True) client = await aiohttp_client(app) # Request the root of the static directory. r = await client.get('/my_symlink') assert r.status == 404 async def test_access_special_resource(tmp_dir_path, aiohttp_client) -> None: """ Tests the access to a resource that is neither a file nor a directory. Checks that if a special resource is accessed (f.e. named pipe or UNIX domain socket) then 404 HTTP status returned. """ app = web.Application() with mock.patch('pathlib.Path.__new__') as path_constructor: special = MagicMock() special.is_dir.return_value = False special.is_file.return_value = False path = MagicMock() path.joinpath.side_effect = lambda p: (special if p == 'special' else path) path.resolve.return_value = path special.resolve.return_value = special path_constructor.return_value = path # Register global static route: app.router.add_static('/', tmp_dir_path, show_index=True) client = await aiohttp_client(app) # Request the root of the static directory. 
r = await client.get('/special') assert r.status == 403 async def test_partially_applied_handler(aiohttp_client) -> None: app = web.Application() async def handler(data, request): return web.Response(body=data) with pytest.warns(DeprecationWarning): app.router.add_route('GET', '/', functools.partial(handler, b'hello')) client = await aiohttp_client(app) r = await client.get('/') data = (await r.read()) assert data == b'hello' def test_system_route() -> None: route = SystemRoute(web.HTTPCreated(reason='test')) with pytest.raises(RuntimeError): route.url_for() assert route.name is None assert route.resource is None assert "" == repr(route) assert 201 == route.status assert 'test' == route.reason async def test_412_is_returned(aiohttp_client) -> None: class MyRouter(abc.AbstractRouter): async def resolve(self, request): raise web.HTTPPreconditionFailed() with pytest.warns(DeprecationWarning): app = web.Application(router=MyRouter()) client = await aiohttp_client(app) resp = await client.get('/') assert resp.status == 412 async def test_allow_head(aiohttp_client) -> None: """ Test allow_head on routes. """ app = web.Application() async def handler(_): return web.Response() app.router.add_get('/a', handler, name='a') app.router.add_get('/b', handler, allow_head=False, name='b') client = await aiohttp_client(app) r = await client.get('/a') assert r.status == 200 await r.release() r = await client.head('/a') assert r.status == 200 await r.release() r = await client.get('/b') assert r.status == 200 await r.release() r = await client.head('/b') assert r.status == 405 await r.release() @pytest.mark.parametrize("path", [ '/a', '/{a}', ]) def test_reuse_last_added_resource(path) -> None: """ Test that adding a route with the same name and path of the last added resource doesn't create a new resource. 
""" app = web.Application() async def handler(request): return web.Response() app.router.add_get(path, handler, name="a") app.router.add_post(path, handler, name="a") assert len(app.router.resources()) == 1 def test_resource_raw_match() -> None: app = web.Application() async def handler(request): return web.Response() route = app.router.add_get("/a", handler, name="a") assert route.resource.raw_match("/a") route = app.router.add_get("/{b}", handler, name="b") assert route.resource.raw_match("/{b}") resource = app.router.add_static("/static", ".") assert not resource.raw_match("/static") async def test_add_view(aiohttp_client) -> None: app = web.Application() class MyView(web.View): async def get(self): return web.Response() async def post(self): return web.Response() app.router.add_view("/a", MyView) client = await aiohttp_client(app) r = await client.get("/a") assert r.status == 200 await r.release() r = await client.post("/a") assert r.status == 200 await r.release() r = await client.put("/a") assert r.status == 405 await r.release() async def test_decorate_view(aiohttp_client) -> None: routes = web.RouteTableDef() @routes.view("/a") class MyView(web.View): async def get(self): return web.Response() async def post(self): return web.Response() app = web.Application() app.router.add_routes(routes) client = await aiohttp_client(app) r = await client.get("/a") assert r.status == 200 await r.release() r = await client.post("/a") assert r.status == 200 await r.release() r = await client.put("/a") assert r.status == 405 await r.release() async def test_web_view(aiohttp_client) -> None: app = web.Application() class MyView(web.View): async def get(self): return web.Response() async def post(self): return web.Response() app.router.add_routes([ web.view("/a", MyView) ]) client = await aiohttp_client(app) r = await client.get("/a") assert r.status == 200 await r.release() r = await client.post("/a") assert r.status == 200 await r.release() r = await client.put("/a") assert r.status == 405 await r.release() async def test_static_absolute_url(aiohttp_client, tmpdir) -> None: # requested url is an absolute name like # /static/\\machine_name\c$ or /static/D:\path # where the static dir is totally different app = web.Application() fname = tmpdir / 'file.txt' fname.write_text('sample text', 'ascii') here = pathlib.Path(__file__).parent app.router.add_static('/static', here) client = await aiohttp_client(app) resp = await client.get('/static/' + str(fname)) assert resp.status == 403 aiohttp-3.6.2/tests/test_web_websocket.py0000644000175100001650000003310413547410117021064 0ustar vstsdocker00000000000000import asyncio from unittest import mock import pytest from multidict import CIMultiDict from aiohttp import WSMessage, WSMsgType, signals from aiohttp.log import ws_logger from aiohttp.streams import EofStream from aiohttp.test_utils import make_mocked_coro, make_mocked_request from aiohttp.web import HTTPBadRequest, WebSocketResponse from aiohttp.web_ws import WS_CLOSED_MESSAGE, WebSocketReady @pytest.fixture def app(loop): ret = mock.Mock() ret.loop = loop ret._debug = False ret.on_response_prepare = signals.Signal(ret) ret.on_response_prepare.freeze() return ret @pytest.fixture def protocol(): ret = mock.Mock() ret.set_parser.return_value = ret return ret @pytest.fixture def make_request(app, protocol): def maker(method, path, headers=None, protocols=False): if headers is None: headers = CIMultiDict( {'HOST': 'server.example.com', 'UPGRADE': 'websocket', 'CONNECTION': 'Upgrade', 'SEC-WEBSOCKET-KEY': 
'dGhlIHNhbXBsZSBub25jZQ==', 'ORIGIN': 'http://example.com', 'SEC-WEBSOCKET-VERSION': '13'}) if protocols: headers['SEC-WEBSOCKET-PROTOCOL'] = 'chat, superchat' return make_mocked_request( method, path, headers, app=app, protocol=protocol, loop=app.loop) return maker async def test_nonstarted_ping() -> None: ws = WebSocketResponse() with pytest.raises(RuntimeError): await ws.ping() async def test_nonstarted_pong() -> None: ws = WebSocketResponse() with pytest.raises(RuntimeError): await ws.pong() async def test_nonstarted_send_str() -> None: ws = WebSocketResponse() with pytest.raises(RuntimeError): await ws.send_str('string') async def test_nonstarted_send_bytes() -> None: ws = WebSocketResponse() with pytest.raises(RuntimeError): await ws.send_bytes(b'bytes') async def test_nonstarted_send_json() -> None: ws = WebSocketResponse() with pytest.raises(RuntimeError): await ws.send_json({'type': 'json'}) async def test_nonstarted_close() -> None: ws = WebSocketResponse() with pytest.raises(RuntimeError): await ws.close() async def test_nonstarted_receive_str() -> None: ws = WebSocketResponse() with pytest.raises(RuntimeError): await ws.receive_str() async def test_nonstarted_receive_bytes() -> None: ws = WebSocketResponse() with pytest.raises(RuntimeError): await ws.receive_bytes() async def test_nonstarted_receive_json() -> None: ws = WebSocketResponse() with pytest.raises(RuntimeError): await ws.receive_json() async def test_receive_str_nonstring(make_request) -> None: req = make_request('GET', '/') ws = WebSocketResponse() await ws.prepare(req) async def receive(): return WSMessage(WSMsgType.BINARY, b'data', b'') ws.receive = receive with pytest.raises(TypeError): await ws.receive_str() async def test_receive_bytes_nonsbytes(make_request) -> None: req = make_request('GET', '/') ws = WebSocketResponse() await ws.prepare(req) async def receive(): return WSMessage(WSMsgType.TEXT, 'data', b'') ws.receive = receive with pytest.raises(TypeError): await ws.receive_bytes() async def test_send_str_nonstring(make_request) -> None: req = make_request('GET', '/') ws = WebSocketResponse() await ws.prepare(req) with pytest.raises(TypeError): await ws.send_str(b'bytes') async def test_send_bytes_nonbytes(make_request) -> None: req = make_request('GET', '/') ws = WebSocketResponse() await ws.prepare(req) with pytest.raises(TypeError): await ws.send_bytes('string') async def test_send_json_nonjson(make_request) -> None: req = make_request('GET', '/') ws = WebSocketResponse() await ws.prepare(req) with pytest.raises(TypeError): await ws.send_json(set()) async def test_write_non_prepared() -> None: ws = WebSocketResponse() with pytest.raises(RuntimeError): await ws.write(b'data') def test_websocket_ready() -> None: websocket_ready = WebSocketReady(True, 'chat') assert websocket_ready.ok is True assert websocket_ready.protocol == 'chat' def test_websocket_not_ready() -> None: websocket_ready = WebSocketReady(False, None) assert websocket_ready.ok is False assert websocket_ready.protocol is None def test_websocket_ready_unknown_protocol() -> None: websocket_ready = WebSocketReady(True, None) assert websocket_ready.ok is True assert websocket_ready.protocol is None def test_bool_websocket_ready() -> None: websocket_ready = WebSocketReady(True, None) assert bool(websocket_ready) is True def test_bool_websocket_not_ready() -> None: websocket_ready = WebSocketReady(False, None) assert bool(websocket_ready) is False def test_can_prepare_ok(make_request) -> None: req = make_request('GET', '/', protocols=True) ws = 
WebSocketResponse(protocols=('chat',)) assert WebSocketReady(True, 'chat') == ws.can_prepare(req) def test_can_prepare_unknown_protocol(make_request) -> None: req = make_request('GET', '/') ws = WebSocketResponse() assert WebSocketReady(True, None) == ws.can_prepare(req) def test_can_prepare_without_upgrade(make_request) -> None: req = make_request('GET', '/', headers=CIMultiDict({})) ws = WebSocketResponse() assert WebSocketReady(False, None) == ws.can_prepare(req) async def test_can_prepare_started(make_request) -> None: req = make_request('GET', '/') ws = WebSocketResponse() await ws.prepare(req) with pytest.raises(RuntimeError) as ctx: ws.can_prepare(req) assert 'Already started' in str(ctx.value) def test_closed_after_ctor() -> None: ws = WebSocketResponse() assert not ws.closed assert ws.close_code is None async def test_send_str_closed(make_request, mocker) -> None: req = make_request('GET', '/') ws = WebSocketResponse() await ws.prepare(req) ws._reader.feed_data(WS_CLOSED_MESSAGE, 0) await ws.close() mocker.spy(ws_logger, 'warning') await ws.send_str('string') assert ws_logger.warning.called async def test_send_bytes_closed(make_request, mocker) -> None: req = make_request('GET', '/') ws = WebSocketResponse() await ws.prepare(req) ws._reader.feed_data(WS_CLOSED_MESSAGE, 0) await ws.close() mocker.spy(ws_logger, 'warning') await ws.send_bytes(b'bytes') assert ws_logger.warning.called async def test_send_json_closed(make_request, mocker) -> None: req = make_request('GET', '/') ws = WebSocketResponse() await ws.prepare(req) ws._reader.feed_data(WS_CLOSED_MESSAGE, 0) await ws.close() mocker.spy(ws_logger, 'warning') await ws.send_json({'type': 'json'}) assert ws_logger.warning.called async def test_ping_closed(make_request, mocker) -> None: req = make_request('GET', '/') ws = WebSocketResponse() await ws.prepare(req) ws._reader.feed_data(WS_CLOSED_MESSAGE, 0) await ws.close() mocker.spy(ws_logger, 'warning') await ws.ping() assert ws_logger.warning.called async def test_pong_closed(make_request, mocker) -> None: req = make_request('GET', '/') ws = WebSocketResponse() await ws.prepare(req) ws._reader.feed_data(WS_CLOSED_MESSAGE, 0) await ws.close() mocker.spy(ws_logger, 'warning') await ws.pong() assert ws_logger.warning.called async def test_close_idempotent(make_request) -> None: req = make_request('GET', '/') ws = WebSocketResponse() await ws.prepare(req) ws._reader.feed_data(WS_CLOSED_MESSAGE, 0) assert (await ws.close(code=1, message='message1')) assert ws.closed assert not (await ws.close(code=2, message='message2')) async def test_prepare_post_method_ok(make_request) -> None: req = make_request('POST', '/') ws = WebSocketResponse() await ws.prepare(req) assert ws.prepared async def test_prepare_without_upgrade(make_request) -> None: req = make_request('GET', '/', headers=CIMultiDict({})) ws = WebSocketResponse() with pytest.raises(HTTPBadRequest): await ws.prepare(req) async def test_wait_closed_before_start() -> None: ws = WebSocketResponse() with pytest.raises(RuntimeError): await ws.close() async def test_write_eof_not_started() -> None: ws = WebSocketResponse() with pytest.raises(RuntimeError): await ws.write_eof() async def test_write_eof_idempotent(make_request) -> None: req = make_request('GET', '/') ws = WebSocketResponse() await ws.prepare(req) ws._reader.feed_data(WS_CLOSED_MESSAGE, 0) await ws.close() await ws.write_eof() await ws.write_eof() await ws.write_eof() async def test_receive_eofstream_in_reader(make_request, loop) -> None: req = make_request('GET', '/') ws = 
WebSocketResponse() await ws.prepare(req) ws._reader = mock.Mock() exc = EofStream() res = loop.create_future() res.set_exception(exc) ws._reader.read = make_mocked_coro(res) ws._payload_writer.drain = mock.Mock() ws._payload_writer.drain.return_value = loop.create_future() ws._payload_writer.drain.return_value.set_result(True) msg = await ws.receive() assert msg.type == WSMsgType.CLOSED assert ws.closed async def test_receive_exc_in_reader(make_request, loop) -> None: req = make_request('GET', '/') ws = WebSocketResponse() await ws.prepare(req) ws._reader = mock.Mock() exc = ValueError() res = loop.create_future() res.set_exception(exc) ws._reader.read = make_mocked_coro(res) ws._payload_writer.drain = mock.Mock() ws._payload_writer.drain.return_value = loop.create_future() ws._payload_writer.drain.return_value.set_result(True) msg = await ws.receive() assert msg.type == WSMsgType.ERROR assert msg.data is exc assert ws.exception() is exc async def test_receive_cancelled(make_request, loop) -> None: req = make_request('GET', '/') ws = WebSocketResponse() await ws.prepare(req) ws._reader = mock.Mock() res = loop.create_future() res.set_exception(asyncio.CancelledError()) ws._reader.read = make_mocked_coro(res) with pytest.raises(asyncio.CancelledError): await ws.receive() async def test_receive_timeouterror(make_request, loop) -> None: req = make_request('GET', '/') ws = WebSocketResponse() await ws.prepare(req) ws._reader = mock.Mock() res = loop.create_future() res.set_exception(asyncio.TimeoutError()) ws._reader.read = make_mocked_coro(res) with pytest.raises(asyncio.TimeoutError): await ws.receive() async def test_multiple_receive_on_close_connection(make_request) -> None: req = make_request('GET', '/') ws = WebSocketResponse() await ws.prepare(req) ws._reader.feed_data(WS_CLOSED_MESSAGE, 0) await ws.close() await ws.receive() await ws.receive() await ws.receive() await ws.receive() with pytest.raises(RuntimeError): await ws.receive() async def test_concurrent_receive(make_request) -> None: req = make_request('GET', '/') ws = WebSocketResponse() await ws.prepare(req) ws._waiting = True with pytest.raises(RuntimeError): await ws.receive() async def test_close_exc(make_request, loop, mocker) -> None: req = make_request('GET', '/') ws = WebSocketResponse() await ws.prepare(req) ws._reader = mock.Mock() exc = ValueError() ws._reader.read.return_value = loop.create_future() ws._reader.read.return_value.set_exception(exc) ws._payload_writer.drain = mock.Mock() ws._payload_writer.drain.return_value = loop.create_future() ws._payload_writer.drain.return_value.set_result(True) await ws.close() assert ws.closed assert ws.exception() is exc ws._closed = False ws._reader.read.return_value = loop.create_future() ws._reader.read.return_value.set_exception(asyncio.CancelledError()) with pytest.raises(asyncio.CancelledError): await ws.close() assert ws.close_code == 1006 async def test_close_exc2(make_request) -> None: req = make_request('GET', '/') ws = WebSocketResponse() await ws.prepare(req) exc = ValueError() ws._writer = mock.Mock() ws._writer.close.side_effect = exc await ws.close() assert ws.closed assert ws.exception() is exc ws._closed = False ws._writer.close.side_effect = asyncio.CancelledError() with pytest.raises(asyncio.CancelledError): await ws.close() async def test_prepare_twice_idempotent(make_request) -> None: req = make_request('GET', '/') ws = WebSocketResponse() impl1 = await ws.prepare(req) impl2 = await ws.prepare(req) assert impl1 is impl2 async def 
test_send_with_per_message_deflate(make_request, mocker) -> None: req = make_request('GET', '/') ws = WebSocketResponse() await ws.prepare(req) writer_send = ws._writer.send = make_mocked_coro() await ws.send_str('string', compress=15) writer_send.assert_called_with('string', binary=False, compress=15) await ws.send_bytes(b'bytes', compress=0) writer_send.assert_called_with(b'bytes', binary=True, compress=0) await ws.send_json('[{}]', compress=9) writer_send.assert_called_with('"[{}]"', binary=False, compress=9) async def test_no_transfer_encoding_header(make_request, mocker) -> None: req = make_request('GET', '/') ws = WebSocketResponse() await ws._start(req) assert 'Transfer-Encoding' not in ws.headers aiohttp-3.6.2/tests/test_web_websocket_functional.py0000644000175100001650000004726213547410117023320 0ustar vstsdocker00000000000000"""HTTP websocket server functional tests""" import asyncio import pytest import aiohttp from aiohttp import web from aiohttp.http import WSMsgType @pytest.fixture def ceil(mocker): def ceil(val): return val mocker.patch('aiohttp.helpers.ceil').side_effect = ceil async def test_websocket_can_prepare(loop, aiohttp_client) -> None: async def handler(request): ws = web.WebSocketResponse() if not ws.can_prepare(request): raise web.HTTPUpgradeRequired() return web.Response() app = web.Application() app.router.add_route('GET', '/', handler) client = await aiohttp_client(app) resp = await client.get('/') assert resp.status == 426 async def test_websocket_json(loop, aiohttp_client) -> None: async def handler(request): ws = web.WebSocketResponse() if not ws.can_prepare(request): return web.HTTPUpgradeRequired() await ws.prepare(request) msg = await ws.receive() msg_json = msg.json() answer = msg_json['test'] await ws.send_str(answer) await ws.close() return ws app = web.Application() app.router.add_route('GET', '/', handler) client = await aiohttp_client(app) ws = await client.ws_connect('/') expected_value = 'value' payload = '{"test": "%s"}' % expected_value await ws.send_str(payload) resp = await ws.receive() assert resp.data == expected_value async def test_websocket_json_invalid_message(loop, aiohttp_client) -> None: async def handler(request): ws = web.WebSocketResponse() await ws.prepare(request) try: await ws.receive_json() except ValueError: await ws.send_str('ValueError was raised') else: raise Exception('No Exception') finally: await ws.close() return ws app = web.Application() app.router.add_route('GET', '/', handler) client = await aiohttp_client(app) ws = await client.ws_connect('/') payload = 'NOT A VALID JSON STRING' await ws.send_str(payload) data = await ws.receive_str() assert 'ValueError was raised' in data async def test_websocket_send_json(loop, aiohttp_client) -> None: async def handler(request): ws = web.WebSocketResponse() await ws.prepare(request) data = await ws.receive_json() await ws.send_json(data) await ws.close() return ws app = web.Application() app.router.add_route('GET', '/', handler) client = await aiohttp_client(app) ws = await client.ws_connect('/') expected_value = 'value' await ws.send_json({'test': expected_value}) data = await ws.receive_json() assert data['test'] == expected_value async def test_websocket_receive_json(loop, aiohttp_client) -> None: async def handler(request): ws = web.WebSocketResponse() await ws.prepare(request) data = await ws.receive_json() answer = data['test'] await ws.send_str(answer) await ws.close() return ws app = web.Application() app.router.add_route('GET', '/', handler) client = await 
aiohttp_client(app) ws = await client.ws_connect('/') expected_value = 'value' payload = '{"test": "%s"}' % expected_value await ws.send_str(payload) resp = await ws.receive() assert resp.data == expected_value async def test_send_recv_text(loop, aiohttp_client) -> None: closed = loop.create_future() async def handler(request): ws = web.WebSocketResponse() await ws.prepare(request) msg = await ws.receive_str() await ws.send_str(msg+'/answer') await ws.close() closed.set_result(1) return ws app = web.Application() app.router.add_route('GET', '/', handler) client = await aiohttp_client(app) ws = await client.ws_connect('/') await ws.send_str('ask') msg = await ws.receive() assert msg.type == aiohttp.WSMsgType.TEXT assert 'ask/answer' == msg.data msg = await ws.receive() assert msg.type == aiohttp.WSMsgType.CLOSE assert msg.data == 1000 assert msg.extra == '' assert ws.closed assert ws.close_code == 1000 await closed async def test_send_recv_bytes(loop, aiohttp_client) -> None: closed = loop.create_future() async def handler(request): ws = web.WebSocketResponse() await ws.prepare(request) msg = await ws.receive_bytes() await ws.send_bytes(msg+b'/answer') await ws.close() closed.set_result(1) return ws app = web.Application() app.router.add_route('GET', '/', handler) client = await aiohttp_client(app) ws = await client.ws_connect('/') await ws.send_bytes(b'ask') msg = await ws.receive() assert msg.type == aiohttp.WSMsgType.BINARY assert b'ask/answer' == msg.data msg = await ws.receive() assert msg.type == aiohttp.WSMsgType.CLOSE assert msg.data == 1000 assert msg.extra == '' assert ws.closed assert ws.close_code == 1000 await closed async def test_send_recv_json(loop, aiohttp_client) -> None: closed = loop.create_future() async def handler(request): ws = web.WebSocketResponse() await ws.prepare(request) data = await ws.receive_json() await ws.send_json({'response': data['request']}) await ws.close() closed.set_result(1) return ws app = web.Application() app.router.add_route('GET', '/', handler) client = await aiohttp_client(app) ws = await client.ws_connect('/') await ws.send_str('{"request": "test"}') msg = await ws.receive() data = msg.json() assert msg.type == aiohttp.WSMsgType.TEXT assert data['response'] == 'test' msg = await ws.receive() assert msg.type == aiohttp.WSMsgType.CLOSE assert msg.data == 1000 assert msg.extra == '' await ws.close() await closed async def test_close_timeout(loop, aiohttp_client) -> None: aborted = loop.create_future() async def handler(request): ws = web.WebSocketResponse(timeout=0.1) await ws.prepare(request) assert 'request' == (await ws.receive_str()) await ws.send_str('reply') begin = ws._loop.time() assert (await ws.close()) elapsed = ws._loop.time() - begin assert elapsed < 0.201, \ 'close() should have returned before ' \ 'at most 2x timeout.' assert ws.close_code == 1006 assert isinstance(ws.exception(), asyncio.TimeoutError) aborted.set_result(1) return ws app = web.Application() app.router.add_route('GET', '/', handler) client = await aiohttp_client(app) ws = await client.ws_connect('/') await ws.send_str('request') assert 'reply' == (await ws.receive_str()) # The server closes here. Then the client sends bogus messages with an # internval shorter than server-side close timeout, to make the server # hanging indefinitely. 
await asyncio.sleep(0.08, loop=loop) msg = await ws._reader.read() assert msg.type == WSMsgType.CLOSE await ws.send_str('hang') # i am not sure what do we test here # under uvloop this code raises RuntimeError try: await asyncio.sleep(0.08, loop=loop) await ws.send_str('hang') await asyncio.sleep(0.08, loop=loop) await ws.send_str('hang') await asyncio.sleep(0.08, loop=loop) await ws.send_str('hang') except RuntimeError: pass await asyncio.sleep(0.08, loop=loop) assert (await aborted) await ws.close() async def test_concurrent_close(loop, aiohttp_client) -> None: srv_ws = None async def handler(request): nonlocal srv_ws ws = srv_ws = web.WebSocketResponse( autoclose=False, protocols=('foo', 'bar')) await ws.prepare(request) msg = await ws.receive() assert msg.type == WSMsgType.CLOSING msg = await ws.receive() assert msg.type == WSMsgType.CLOSING await asyncio.sleep(0, loop=loop) msg = await ws.receive() assert msg.type == WSMsgType.CLOSED return ws app = web.Application() app.router.add_get('/', handler) client = await aiohttp_client(app) ws = await client.ws_connect('/', autoclose=False, protocols=('eggs', 'bar')) await srv_ws.close(code=1007) msg = await ws.receive() assert msg.type == WSMsgType.CLOSE await asyncio.sleep(0, loop=loop) msg = await ws.receive() assert msg.type == WSMsgType.CLOSED async def test_auto_pong_with_closing_by_peer(loop, aiohttp_client) -> None: closed = loop.create_future() async def handler(request): ws = web.WebSocketResponse() await ws.prepare(request) await ws.receive() msg = await ws.receive() assert msg.type == WSMsgType.CLOSE assert msg.data == 1000 assert msg.extra == 'exit message' closed.set_result(None) return ws app = web.Application() app.router.add_get('/', handler) client = await aiohttp_client(app) ws = await client.ws_connect('/', autoclose=False, autoping=False) await ws.ping() await ws.send_str('ask') msg = await ws.receive() assert msg.type == WSMsgType.PONG await ws.close(code=1000, message='exit message') await closed async def test_ping(loop, aiohttp_client) -> None: closed = loop.create_future() async def handler(request): ws = web.WebSocketResponse() await ws.prepare(request) await ws.ping('data') await ws.receive() closed.set_result(None) return ws app = web.Application() app.router.add_get('/', handler) client = await aiohttp_client(app) ws = await client.ws_connect('/', autoping=False) msg = await ws.receive() assert msg.type == WSMsgType.PING assert msg.data == b'data' await ws.pong() await ws.close() await closed async def aiohttp_client_ping(loop, aiohttp_client): closed = loop.create_future() async def handler(request): ws = web.WebSocketResponse() await ws.prepare(request) await ws.receive() closed.set_result(None) return ws app = web.Application() app.router.add_get('/', handler) client = await aiohttp_client(app) ws = await client.ws_connect('/', autoping=False) await ws.ping('data') msg = await ws.receive() assert msg.type == WSMsgType.PONG assert msg.data == b'data' await ws.pong() await ws.close() async def test_pong(loop, aiohttp_client) -> None: closed = loop.create_future() async def handler(request): ws = web.WebSocketResponse(autoping=False) await ws.prepare(request) msg = await ws.receive() assert msg.type == WSMsgType.PING await ws.pong('data') msg = await ws.receive() assert msg.type == WSMsgType.CLOSE assert msg.data == 1000 assert msg.extra == 'exit message' closed.set_result(None) return ws app = web.Application() app.router.add_get('/', handler) client = await aiohttp_client(app) ws = await client.ws_connect('/', 
autoping=False) await ws.ping('data') msg = await ws.receive() assert msg.type == WSMsgType.PONG assert msg.data == b'data' await ws.close(code=1000, message='exit message') await closed async def test_change_status(loop, aiohttp_client) -> None: closed = loop.create_future() async def handler(request): ws = web.WebSocketResponse() ws.set_status(200) assert 200 == ws.status await ws.prepare(request) assert 101 == ws.status await ws.close() closed.set_result(None) return ws app = web.Application() app.router.add_get('/', handler) client = await aiohttp_client(app) ws = await client.ws_connect('/', autoping=False) await ws.close() await closed await ws.close() async def test_handle_protocol(loop, aiohttp_client) -> None: closed = loop.create_future() async def handler(request): ws = web.WebSocketResponse(protocols=('foo', 'bar')) await ws.prepare(request) await ws.close() assert 'bar' == ws.ws_protocol closed.set_result(None) return ws app = web.Application() app.router.add_get('/', handler) client = await aiohttp_client(app) ws = await client.ws_connect('/', protocols=('eggs', 'bar')) await ws.close() await closed async def test_server_close_handshake(loop, aiohttp_client) -> None: closed = loop.create_future() async def handler(request): ws = web.WebSocketResponse(protocols=('foo', 'bar')) await ws.prepare(request) await ws.close() closed.set_result(None) return ws app = web.Application() app.router.add_get('/', handler) client = await aiohttp_client(app) ws = await client.ws_connect('/', autoclose=False, protocols=('eggs', 'bar')) msg = await ws.receive() assert msg.type == WSMsgType.CLOSE await ws.close() await closed async def aiohttp_client_close_handshake(loop, aiohttp_client, ceil): closed = loop.create_future() async def handler(request): ws = web.WebSocketResponse( autoclose=False, protocols=('foo', 'bar')) await ws.prepare(request) msg = await ws.receive() assert msg.type == WSMsgType.CLOSE assert not ws.closed await ws.close() assert ws.closed assert ws.close_code == 1007 msg = await ws.receive() assert msg.type == WSMsgType.CLOSED closed.set_result(None) return ws app = web.Application() app.router.add_get('/', handler) client = await aiohttp_client(app) ws = await client.ws_connect('/', autoclose=False, protocols=('eggs', 'bar')) await ws.close(code=1007) msg = await ws.receive() assert msg.type == WSMsgType.CLOSED await closed async def test_server_close_handshake_server_eats_client_messages( loop, aiohttp_client ): closed = loop.create_future() async def handler(request): ws = web.WebSocketResponse(protocols=('foo', 'bar')) await ws.prepare(request) await ws.close() closed.set_result(None) return ws app = web.Application() app.router.add_get('/', handler) client = await aiohttp_client(app) ws = await client.ws_connect('/', autoclose=False, autoping=False, protocols=('eggs', 'bar')) msg = await ws.receive() assert msg.type == WSMsgType.CLOSE await ws.send_str('text') await ws.send_bytes(b'bytes') await ws.ping() await ws.close() await closed async def test_receive_timeout(loop, aiohttp_client) -> None: raised = False async def handler(request): ws = web.WebSocketResponse(receive_timeout=0.1) await ws.prepare(request) try: await ws.receive() except asyncio.TimeoutError: nonlocal raised raised = True await ws.close() return ws app = web.Application() app.router.add_get('/', handler) client = await aiohttp_client(app) ws = await client.ws_connect('/') await ws.receive() await ws.close() assert raised async def test_custom_receive_timeout(loop, aiohttp_client) -> None: raised = 
False async def handler(request): ws = web.WebSocketResponse(receive_timeout=None) await ws.prepare(request) try: await ws.receive(0.1) except asyncio.TimeoutError: nonlocal raised raised = True await ws.close() return ws app = web.Application() app.router.add_get('/', handler) client = await aiohttp_client(app) ws = await client.ws_connect('/') await ws.receive() await ws.close() assert raised async def test_heartbeat(loop, aiohttp_client, ceil) -> None: async def handler(request): ws = web.WebSocketResponse(heartbeat=0.05) await ws.prepare(request) await ws.receive() await ws.close() return ws app = web.Application() app.router.add_get('/', handler) client = await aiohttp_client(app) ws = await client.ws_connect('/', autoping=False) msg = await ws.receive() assert msg.type == aiohttp.WSMsgType.ping await ws.close() async def test_heartbeat_no_pong(loop, aiohttp_client, ceil) -> None: cancelled = False async def handler(request): nonlocal cancelled ws = web.WebSocketResponse(heartbeat=0.05) await ws.prepare(request) try: await ws.receive() except asyncio.CancelledError: cancelled = True return ws app = web.Application() app.router.add_get('/', handler) client = await aiohttp_client(app) ws = await client.ws_connect('/', autoping=False) msg = await ws.receive() assert msg.type == aiohttp.WSMsgType.ping await ws.receive() assert cancelled async def test_server_ws_async_for(loop, aiohttp_server) -> None: closed = loop.create_future() async def handler(request): ws = web.WebSocketResponse() await ws.prepare(request) async for msg in ws: assert msg.type == aiohttp.WSMsgType.TEXT s = msg.data await ws.send_str(s + '/answer') await ws.close() closed.set_result(1) return ws app = web.Application() app.router.add_route('GET', '/', handler) server = await aiohttp_server(app) async with aiohttp.ClientSession(loop=loop) as sm: async with sm.ws_connect(server.make_url('/')) as resp: items = ['q1', 'q2', 'q3'] for item in items: await resp.send_str(item) msg = await resp.receive() assert msg.type == aiohttp.WSMsgType.TEXT assert item + '/answer' == msg.data await resp.close() await closed async def test_closed_async_for(loop, aiohttp_client) -> None: closed = loop.create_future() async def handler(request): ws = web.WebSocketResponse() await ws.prepare(request) messages = [] async for msg in ws: messages.append(msg) if 'stop' == msg.data: await ws.send_str('stopping') await ws.close() assert 1 == len(messages) assert messages[0].type == WSMsgType.TEXT assert messages[0].data == 'stop' closed.set_result(None) return ws app = web.Application() app.router.add_get('/', handler) client = await aiohttp_client(app) ws = await client.ws_connect('/') await ws.send_str('stop') msg = await ws.receive() assert msg.type == WSMsgType.TEXT assert msg.data == 'stopping' await ws.close() await closed async def test_websocket_disable_keepalive(loop, aiohttp_client) -> None: async def handler(request): ws = web.WebSocketResponse() if not ws.can_prepare(request): return web.Response(text='OK') assert request.protocol._keepalive await ws.prepare(request) assert not request.protocol._keepalive assert not request.protocol._keepalive_handle await ws.send_str('OK') await ws.close() return ws app = web.Application() app.router.add_route('GET', '/', handler) client = await aiohttp_client(app) resp = await client.get('/') txt = await resp.text() assert txt == 'OK' ws = await client.ws_connect('/') data = await ws.receive_str() assert data == 'OK' 
aiohttp-3.6.2/tests/test_websocket_handshake.py0000644000175100001650000002066713547410117022247 0ustar vstsdocker00000000000000"""Tests for http/websocket.py""" import base64 import os import pytest from aiohttp import web from aiohttp.test_utils import make_mocked_request def gen_ws_headers(protocols='', compress=0, extension_text='', server_notakeover=False, client_notakeover=False): key = base64.b64encode(os.urandom(16)).decode() hdrs = [('Upgrade', 'websocket'), ('Connection', 'upgrade'), ('Sec-Websocket-Version', '13'), ('Sec-Websocket-Key', key)] if protocols: hdrs += [('Sec-Websocket-Protocol', protocols)] if compress: params = 'permessage-deflate' if compress < 15: params += '; server_max_window_bits=' + str(compress) if server_notakeover: params += '; server_no_context_takeover' if client_notakeover: params += '; client_no_context_takeover' if extension_text: params += '; ' + extension_text hdrs += [('Sec-Websocket-Extensions', params)] return hdrs, key async def test_no_upgrade() -> None: ws = web.WebSocketResponse() req = make_mocked_request('GET', '/') with pytest.raises(web.HTTPBadRequest): await ws.prepare(req) async def test_no_connection() -> None: ws = web.WebSocketResponse() req = make_mocked_request('GET', '/', headers={'Upgrade': 'websocket', 'Connection': 'keep-alive'}) with pytest.raises(web.HTTPBadRequest): await ws.prepare(req) async def test_protocol_version_unset() -> None: ws = web.WebSocketResponse() req = make_mocked_request('GET', '/', headers={'Upgrade': 'websocket', 'Connection': 'upgrade'}) with pytest.raises(web.HTTPBadRequest): await ws.prepare(req) async def test_protocol_version_not_supported() -> None: ws = web.WebSocketResponse() req = make_mocked_request('GET', '/', headers={'Upgrade': 'websocket', 'Connection': 'upgrade', 'Sec-Websocket-Version': '1'}) with pytest.raises(web.HTTPBadRequest): await ws.prepare(req) async def test_protocol_key_not_present() -> None: ws = web.WebSocketResponse() req = make_mocked_request('GET', '/', headers={'Upgrade': 'websocket', 'Connection': 'upgrade', 'Sec-Websocket-Version': '13'}) with pytest.raises(web.HTTPBadRequest): await ws.prepare(req) async def test_protocol_key_invalid() -> None: ws = web.WebSocketResponse() req = make_mocked_request('GET', '/', headers={'Upgrade': 'websocket', 'Connection': 'upgrade', 'Sec-Websocket-Version': '13', 'Sec-Websocket-Key': '123'}) with pytest.raises(web.HTTPBadRequest): await ws.prepare(req) async def test_protocol_key_bad_size() -> None: ws = web.WebSocketResponse() sec_key = base64.b64encode(os.urandom(2)) val = sec_key.decode() req = make_mocked_request('GET', '/', headers={'Upgrade': 'websocket', 'Connection': 'upgrade', 'Sec-Websocket-Version': '13', 'Sec-Websocket-Key': val}) with pytest.raises(web.HTTPBadRequest): await ws.prepare(req) async def test_handshake_ok() -> None: hdrs, sec_key = gen_ws_headers() ws = web.WebSocketResponse() req = make_mocked_request('GET', '/', headers=hdrs) await ws.prepare(req) assert ws.ws_protocol is None async def test_handshake_protocol() -> None: # Tests if one protocol is returned by handshake proto = 'chat' ws = web.WebSocketResponse(protocols={'chat'}) req = make_mocked_request('GET', '/', headers=gen_ws_headers(proto)[0]) await ws.prepare(req) assert ws.ws_protocol == proto async def test_handshake_protocol_agreement() -> None: # Tests if the right protocol is selected given multiple best_proto = 'worse_proto' wanted_protos = ['best', 'chat', 'worse_proto'] server_protos = 'worse_proto,chat' ws = 
web.WebSocketResponse(protocols=wanted_protos) req = make_mocked_request('GET', '/', headers=gen_ws_headers(server_protos)[0]) await ws.prepare(req) assert ws.ws_protocol == best_proto async def test_handshake_protocol_unsupported(caplog) -> None: # Tests if a protocol mismatch handshake warns and returns None proto = 'chat' req = make_mocked_request('GET', '/', headers=gen_ws_headers('test')[0]) ws = web.WebSocketResponse(protocols=[proto]) await ws.prepare(req) assert (caplog.records[-1].msg == 'Client protocols %r don’t overlap server-known ones %r') assert ws.ws_protocol is None async def test_handshake_compress() -> None: hdrs, sec_key = gen_ws_headers(compress=15) req = make_mocked_request('GET', '/', headers=hdrs) ws = web.WebSocketResponse() await ws.prepare(req) assert ws.compress == 15 def test_handshake_compress_server_notakeover() -> None: hdrs, sec_key = gen_ws_headers(compress=15, server_notakeover=True) req = make_mocked_request('GET', '/', headers=hdrs) ws = web.WebSocketResponse() headers, _, compress, notakeover = ws._handshake(req) assert compress == 15 assert notakeover is True assert 'Sec-Websocket-Extensions' in headers assert headers['Sec-Websocket-Extensions'] == ( 'permessage-deflate; server_no_context_takeover') def test_handshake_compress_client_notakeover() -> None: hdrs, sec_key = gen_ws_headers(compress=15, client_notakeover=True) req = make_mocked_request('GET', '/', headers=hdrs) ws = web.WebSocketResponse() headers, _, compress, notakeover = ws._handshake(req) assert 'Sec-Websocket-Extensions' in headers assert headers['Sec-Websocket-Extensions'] == ( 'permessage-deflate'), hdrs assert compress == 15 def test_handshake_compress_wbits() -> None: hdrs, sec_key = gen_ws_headers(compress=9) req = make_mocked_request('GET', '/', headers=hdrs) ws = web.WebSocketResponse() headers, _, compress, notakeover = ws._handshake(req) assert 'Sec-Websocket-Extensions' in headers assert headers['Sec-Websocket-Extensions'] == ( 'permessage-deflate; server_max_window_bits=9') assert compress == 9 def test_handshake_compress_wbits_error() -> None: hdrs, sec_key = gen_ws_headers(compress=6) req = make_mocked_request('GET', '/', headers=hdrs) ws = web.WebSocketResponse() headers, _, compress, notakeover = ws._handshake(req) assert 'Sec-Websocket-Extensions' not in headers assert compress == 0 def test_handshake_compress_bad_ext() -> None: hdrs, sec_key = gen_ws_headers(compress=15, extension_text='bad') req = make_mocked_request('GET', '/', headers=hdrs) ws = web.WebSocketResponse() headers, _, compress, notakeover = ws._handshake(req) assert 'Sec-Websocket-Extensions' not in headers assert compress == 0 def test_handshake_compress_multi_ext_bad() -> None: hdrs, sec_key = gen_ws_headers(compress=15, extension_text='bad, permessage-deflate') req = make_mocked_request('GET', '/', headers=hdrs) ws = web.WebSocketResponse() headers, _, compress, notakeover = ws._handshake(req) assert 'Sec-Websocket-Extensions' in headers assert headers['Sec-Websocket-Extensions'] == 'permessage-deflate' def test_handshake_compress_multi_ext_wbits() -> None: hdrs, sec_key = gen_ws_headers(compress=6, extension_text=', permessage-deflate') req = make_mocked_request('GET', '/', headers=hdrs) ws = web.WebSocketResponse() headers, _, compress, notakeover = ws._handshake(req) assert 'Sec-Websocket-Extensions' in headers assert headers['Sec-Websocket-Extensions'] == 'permessage-deflate' assert compress == 15 def test_handshake_no_transfer_encoding() -> None: hdrs, sec_key = gen_ws_headers() req = 
make_mocked_request('GET', '/', headers=hdrs) ws = web.WebSocketResponse() headers, _, compress, notakeover = ws._handshake(req) assert 'Transfer-Encoding' not in headers aiohttp-3.6.2/tests/test_websocket_parser.py0000644000175100001650000004032013547410117021601 0ustar vstsdocker00000000000000import pickle import random import struct import zlib from unittest import mock import pytest import aiohttp from aiohttp import http_websocket from aiohttp.http import WebSocketError, WSCloseCode, WSMessage, WSMsgType from aiohttp.http_websocket import ( _WS_DEFLATE_TRAILING, PACK_CLOSE_CODE, PACK_LEN1, PACK_LEN2, PACK_LEN3, WebSocketReader, _websocket_mask, ) def build_frame(message, opcode, use_mask=False, noheader=False, is_fin=True, compress=False): """Send a frame over the websocket with message as its payload.""" if compress: compressobj = zlib.compressobj(wbits=-9) message = compressobj.compress(message) message = message + compressobj.flush(zlib.Z_SYNC_FLUSH) if message.endswith(_WS_DEFLATE_TRAILING): message = message[:-4] msg_length = len(message) if use_mask: # pragma: no cover mask_bit = 0x80 else: mask_bit = 0 if is_fin: header_first_byte = 0x80 | opcode else: header_first_byte = opcode if compress: header_first_byte |= 0x40 if msg_length < 126: header = PACK_LEN1( header_first_byte, msg_length | mask_bit) elif msg_length < (1 << 16): # pragma: no cover header = PACK_LEN2( header_first_byte, 126 | mask_bit, msg_length) else: header = PACK_LEN3( header_first_byte, 127 | mask_bit, msg_length) if use_mask: # pragma: no cover mask = random.randrange(0, 0xffffffff) mask = mask.to_bytes(4, 'big') message = bytearray(message) _websocket_mask(mask, message) if noheader: return message else: return header + mask + message else: if noheader: return message else: return header + message def build_close_frame(code=1000, message=b'', noheader=False): """Close the websocket, sending the specified code and message.""" if isinstance(message, str): # pragma: no cover message = message.encode('utf-8') return build_frame( PACK_CLOSE_CODE(code) + message, opcode=WSMsgType.CLOSE, noheader=noheader) @pytest.fixture() def out(loop): return aiohttp.DataQueue(loop) @pytest.fixture() def parser(out): return WebSocketReader(out, 4*1024*1024) def test_parse_frame(parser) -> None: parser.parse_frame(struct.pack('!BB', 0b00000001, 0b00000001)) res = parser.parse_frame(b'1') fin, opcode, payload, compress = res[0] assert (0, 1, b'1', False) == (fin, opcode, payload, not not compress) def test_parse_frame_length0(parser) -> None: fin, opcode, payload, compress = parser.parse_frame( struct.pack('!BB', 0b00000001, 0b00000000))[0] assert (0, 1, b'', False) == (fin, opcode, payload, not not compress) def test_parse_frame_length2(parser) -> None: parser.parse_frame(struct.pack('!BB', 0b00000001, 126)) parser.parse_frame(struct.pack('!H', 4)) res = parser.parse_frame(b'1234') fin, opcode, payload, compress = res[0] assert (0, 1, b'1234', False) == (fin, opcode, payload, not not compress) def test_parse_frame_length4(parser) -> None: parser.parse_frame(struct.pack('!BB', 0b00000001, 127)) parser.parse_frame(struct.pack('!Q', 4)) fin, opcode, payload, compress = parser.parse_frame(b'1234')[0] assert (0, 1, b'1234', False) == (fin, opcode, payload, not not compress) def test_parse_frame_mask(parser) -> None: parser.parse_frame(struct.pack('!BB', 0b00000001, 0b10000001)) parser.parse_frame(b'0001') fin, opcode, payload, compress = parser.parse_frame(b'1')[0] assert (0, 1, b'\x01', False) == (fin, opcode, payload, not not 
compress) def test_parse_frame_header_reversed_bits(out, parser) -> None: with pytest.raises(WebSocketError): parser.parse_frame(struct.pack('!BB', 0b01100000, 0b00000000)) raise out.exception() def test_parse_frame_header_control_frame(out, parser) -> None: with pytest.raises(WebSocketError): parser.parse_frame(struct.pack('!BB', 0b00001000, 0b00000000)) raise out.exception() def _test_parse_frame_header_new_data_err(out, parser): with pytest.raises(WebSocketError): parser.parse_frame(struct.pack('!BB', 0b000000000, 0b00000000)) raise out.exception() def test_parse_frame_header_payload_size(out, parser) -> None: with pytest.raises(WebSocketError): parser.parse_frame(struct.pack('!BB', 0b10001000, 0b01111110)) raise out.exception() def test_ping_frame(out, parser) -> None: parser.parse_frame = mock.Mock() parser.parse_frame.return_value = [(1, WSMsgType.PING, b'data', False)] parser.feed_data(b'') res = out._buffer[0] assert res == ((WSMsgType.PING, b'data', ''), 4) def test_pong_frame(out, parser) -> None: parser.parse_frame = mock.Mock() parser.parse_frame.return_value = [(1, WSMsgType.PONG, b'data', False)] parser.feed_data(b'') res = out._buffer[0] assert res == ((WSMsgType.PONG, b'data', ''), 4) def test_close_frame(out, parser) -> None: parser.parse_frame = mock.Mock() parser.parse_frame.return_value = [(1, WSMsgType.CLOSE, b'', False)] parser.feed_data(b'') res = out._buffer[0] assert res == ((WSMsgType.CLOSE, 0, ''), 0) def test_close_frame_info(out, parser) -> None: parser.parse_frame = mock.Mock() parser.parse_frame.return_value = [(1, WSMsgType.CLOSE, b'0112345', False)] parser.feed_data(b'') res = out._buffer[0] assert res == (WSMessage(WSMsgType.CLOSE, 12337, '12345'), 0) def test_close_frame_invalid(out, parser) -> None: parser.parse_frame = mock.Mock() parser.parse_frame.return_value = [(1, WSMsgType.CLOSE, b'1', False)] parser.feed_data(b'') assert isinstance(out.exception(), WebSocketError) assert out.exception().code == WSCloseCode.PROTOCOL_ERROR def test_close_frame_invalid_2(out, parser) -> None: data = build_close_frame(code=1) with pytest.raises(WebSocketError) as ctx: parser._feed_data(data) assert ctx.value.code == WSCloseCode.PROTOCOL_ERROR def test_close_frame_unicode_err(parser) -> None: data = build_close_frame( code=1000, message=b'\xf4\x90\x80\x80') with pytest.raises(WebSocketError) as ctx: parser._feed_data(data) assert ctx.value.code == WSCloseCode.INVALID_TEXT def test_unknown_frame(out, parser) -> None: parser.parse_frame = mock.Mock() parser.parse_frame.return_value = [(1, WSMsgType.CONTINUATION, b'', False)] with pytest.raises(WebSocketError): parser.feed_data(b'') raise out.exception() def test_simple_text(out, parser) -> None: data = build_frame(b'text', WSMsgType.TEXT) parser._feed_data(data) res = out._buffer[0] assert res == ((WSMsgType.TEXT, 'text', ''), 4) def test_simple_text_unicode_err(parser) -> None: data = build_frame(b'\xf4\x90\x80\x80', WSMsgType.TEXT) with pytest.raises(WebSocketError) as ctx: parser._feed_data(data) assert ctx.value.code == WSCloseCode.INVALID_TEXT def test_simple_binary(out, parser) -> None: parser.parse_frame = mock.Mock() parser.parse_frame.return_value = [(1, WSMsgType.BINARY, b'binary', False)] parser.feed_data(b'') res = out._buffer[0] assert res == ((WSMsgType.BINARY, b'binary', ''), 6) def test_fragmentation_header(out, parser) -> None: data = build_frame(b'a', WSMsgType.TEXT) parser._feed_data(data[:1]) parser._feed_data(data[1:]) res = out._buffer[0] assert res == (WSMessage(WSMsgType.TEXT, 'a', ''), 1) def 
test_continuation(out, parser) -> None: data1 = build_frame(b'line1', WSMsgType.TEXT, is_fin=False) parser._feed_data(data1) data2 = build_frame(b'line2', WSMsgType.CONTINUATION) parser._feed_data(data2) res = out._buffer[0] assert res == (WSMessage(WSMsgType.TEXT, 'line1line2', ''), 10) def test_continuation_with_ping(out, parser) -> None: parser.parse_frame = mock.Mock() parser.parse_frame.return_value = [ (0, WSMsgType.TEXT, b'line1', False), (0, WSMsgType.PING, b'', False), (1, WSMsgType.CONTINUATION, b'line2', False), ] data1 = build_frame(b'line1', WSMsgType.TEXT, is_fin=False) parser._feed_data(data1) data2 = build_frame(b'', WSMsgType.PING) parser._feed_data(data2) data3 = build_frame(b'line2', WSMsgType.CONTINUATION) parser._feed_data(data3) res = out._buffer[0] assert res == (WSMessage(WSMsgType.PING, b'', ''), 0) res = out._buffer[1] assert res == (WSMessage(WSMsgType.TEXT, 'line1line2', ''), 10) def test_continuation_err(out, parser) -> None: parser.parse_frame = mock.Mock() parser.parse_frame.return_value = [ (0, WSMsgType.TEXT, b'line1', False), (1, WSMsgType.TEXT, b'line2', False)] with pytest.raises(WebSocketError): parser._feed_data(b'') def test_continuation_with_close(out, parser) -> None: parser.parse_frame = mock.Mock() parser.parse_frame.return_value = [ (0, WSMsgType.TEXT, b'line1', False), (0, WSMsgType.CLOSE, build_close_frame(1002, b'test', noheader=True), False), (1, WSMsgType.CONTINUATION, b'line2', False), ] parser.feed_data(b'') res = out._buffer[0] assert res, (WSMessage(WSMsgType.CLOSE, 1002, 'test'), 0) res = out._buffer[1] assert res == (WSMessage(WSMsgType.TEXT, 'line1line2', ''), 10) def test_continuation_with_close_unicode_err(out, parser) -> None: parser.parse_frame = mock.Mock() parser.parse_frame.return_value = [ (0, WSMsgType.TEXT, b'line1', False), (0, WSMsgType.CLOSE, build_close_frame(1000, b'\xf4\x90\x80\x80', noheader=True), False), (1, WSMsgType.CONTINUATION, b'line2', False)] with pytest.raises(WebSocketError) as ctx: parser._feed_data(b'') assert ctx.value.code == WSCloseCode.INVALID_TEXT def test_continuation_with_close_bad_code(out, parser) -> None: parser.parse_frame = mock.Mock() parser.parse_frame.return_value = [ (0, WSMsgType.TEXT, b'line1', False), (0, WSMsgType.CLOSE, build_close_frame(1, b'test', noheader=True), False), (1, WSMsgType.CONTINUATION, b'line2', False)] with pytest.raises(WebSocketError) as ctx: parser._feed_data(b'') assert ctx.value.code == WSCloseCode.PROTOCOL_ERROR def test_continuation_with_close_bad_payload(out, parser) -> None: parser.parse_frame = mock.Mock() parser.parse_frame.return_value = [ (0, WSMsgType.TEXT, b'line1', False), (0, WSMsgType.CLOSE, b'1', False), (1, WSMsgType.CONTINUATION, b'line2', False)] with pytest.raises(WebSocketError) as ctx: parser._feed_data(b'') assert ctx.value.code, WSCloseCode.PROTOCOL_ERROR def test_continuation_with_close_empty(out, parser) -> None: parser.parse_frame = mock.Mock() parser.parse_frame.return_value = [ (0, WSMsgType.TEXT, b'line1', False), (0, WSMsgType.CLOSE, b'', False), (1, WSMsgType.CONTINUATION, b'line2', False), ] parser.feed_data(b'') res = out._buffer[0] assert res, (WSMessage(WSMsgType.CLOSE, 0, ''), 0) res = out._buffer[1] assert res == (WSMessage(WSMsgType.TEXT, 'line1line2', ''), 10) websocket_mask_data = b'some very long data for masking by websocket' websocket_mask_mask = b'1234' websocket_mask_masked = (b'B]^Q\x11DVFH\x12_[_U\x13PPFR\x14W]A\x14\\S@_X' b'\\T\x14SK\x13CTP@[RYV@') def test_websocket_mask_python() -> None: message = 
bytearray(websocket_mask_data) http_websocket._websocket_mask_python( websocket_mask_mask, message) assert message == websocket_mask_masked @pytest.mark.skipif(not hasattr(http_websocket, '_websocket_mask_cython'), reason='Requires Cython') def test_websocket_mask_cython() -> None: message = bytearray(websocket_mask_data) http_websocket._websocket_mask_cython( websocket_mask_mask, message) assert message == websocket_mask_masked def test_websocket_mask_python_empty() -> None: message = bytearray() http_websocket._websocket_mask_python( websocket_mask_mask, message) assert message == bytearray() @pytest.mark.skipif(not hasattr(http_websocket, '_websocket_mask_cython'), reason='Requires Cython') def test_websocket_mask_cython_empty() -> None: message = bytearray() http_websocket._websocket_mask_cython( websocket_mask_mask, message) assert message == bytearray() def test_msgtype_aliases() -> None: assert aiohttp.WSMsgType.TEXT == aiohttp.WSMsgType.text assert aiohttp.WSMsgType.BINARY == aiohttp.WSMsgType.binary assert aiohttp.WSMsgType.PING == aiohttp.WSMsgType.ping assert aiohttp.WSMsgType.PONG == aiohttp.WSMsgType.pong assert aiohttp.WSMsgType.CLOSE == aiohttp.WSMsgType.close assert aiohttp.WSMsgType.CLOSED == aiohttp.WSMsgType.closed assert aiohttp.WSMsgType.ERROR == aiohttp.WSMsgType.error def test_parse_compress_frame_single(parser) -> None: parser.parse_frame(struct.pack('!BB', 0b11000001, 0b00000001)) res = parser.parse_frame(b'1') fin, opcode, payload, compress = res[0] assert (1, 1, b'1', True) == (fin, opcode, payload, not not compress) def test_parse_compress_frame_multi(parser) -> None: parser.parse_frame(struct.pack('!BB', 0b01000001, 126)) parser.parse_frame(struct.pack('!H', 4)) res = parser.parse_frame(b'1234') fin, opcode, payload, compress = res[0] assert (0, 1, b'1234', True) == (fin, opcode, payload, not not compress) parser.parse_frame(struct.pack('!BB', 0b10000001, 126)) parser.parse_frame(struct.pack('!H', 4)) res = parser.parse_frame(b'1234') fin, opcode, payload, compress = res[0] assert (1, 1, b'1234', True) == (fin, opcode, payload, not not compress) parser.parse_frame(struct.pack('!BB', 0b10000001, 126)) parser.parse_frame(struct.pack('!H', 4)) res = parser.parse_frame(b'1234') fin, opcode, payload, compress = res[0] assert (1, 1, b'1234', False) == (fin, opcode, payload, not not compress) def test_parse_compress_error_frame(parser) -> None: parser.parse_frame(struct.pack('!BB', 0b01000001, 0b00000001)) parser.parse_frame(b'1') with pytest.raises(WebSocketError) as ctx: parser.parse_frame(struct.pack('!BB', 0b11000001, 0b00000001)) parser.parse_frame(b'1') assert ctx.value.code == WSCloseCode.PROTOCOL_ERROR def test_parse_no_compress_frame_single() -> None: parser_no_compress = WebSocketReader(out, 0, compress=False) with pytest.raises(WebSocketError) as ctx: parser_no_compress.parse_frame(struct.pack( '!BB', 0b11000001, 0b00000001)) parser_no_compress.parse_frame(b'1') assert ctx.value.code == WSCloseCode.PROTOCOL_ERROR def test_msg_too_large(out) -> None: parser = WebSocketReader(out, 256, compress=False) data = build_frame(b'text'*256, WSMsgType.TEXT) with pytest.raises(WebSocketError) as ctx: parser._feed_data(data) assert ctx.value.code == WSCloseCode.MESSAGE_TOO_BIG def test_msg_too_large_not_fin(out) -> None: parser = WebSocketReader(out, 256, compress=False) data = build_frame(b'text'*256, WSMsgType.TEXT, is_fin=False) with pytest.raises(WebSocketError) as ctx: parser._feed_data(data) assert ctx.value.code == WSCloseCode.MESSAGE_TOO_BIG def 
test_compressed_msg_too_large(out) -> None: parser = WebSocketReader(out, 256, compress=True) data = build_frame(b'aaa'*256, WSMsgType.TEXT, compress=True) with pytest.raises(WebSocketError) as ctx: parser._feed_data(data) assert ctx.value.code == WSCloseCode.MESSAGE_TOO_BIG class TestWebSocketError: def test_ctor(self) -> None: err = WebSocketError(WSCloseCode.PROTOCOL_ERROR, 'Something invalid') assert err.code == WSCloseCode.PROTOCOL_ERROR assert str(err) == 'Something invalid' def test_pickle(self) -> None: err = WebSocketError(WSCloseCode.PROTOCOL_ERROR, 'Something invalid') err.foo = 'bar' for proto in range(pickle.HIGHEST_PROTOCOL + 1): pickled = pickle.dumps(err, proto) err2 = pickle.loads(pickled) assert err2.code == WSCloseCode.PROTOCOL_ERROR assert str(err2) == 'Something invalid' assert err2.foo == 'bar' aiohttp-3.6.2/tests/test_websocket_writer.py0000644000175100001650000000657413547410117021636 0ustar vstsdocker00000000000000import random from unittest import mock import pytest from aiohttp.http import WebSocketWriter from aiohttp.test_utils import make_mocked_coro @pytest.fixture def protocol(): ret = mock.Mock() ret._drain_helper = make_mocked_coro() return ret @pytest.fixture def transport(): return mock.Mock() @pytest.fixture def writer(protocol, transport): return WebSocketWriter(protocol, transport, use_mask=False) async def test_pong(writer) -> None: await writer.pong() writer.transport.write.assert_called_with(b'\x8a\x00') async def test_ping(writer) -> None: await writer.ping() writer.transport.write.assert_called_with(b'\x89\x00') async def test_send_text(writer) -> None: await writer.send(b'text') writer.transport.write.assert_called_with(b'\x81\x04text') async def test_send_binary(writer) -> None: await writer.send('binary', True) writer.transport.write.assert_called_with(b'\x82\x06binary') async def test_send_binary_long(writer) -> None: await writer.send(b'b' * 127, True) assert writer.transport.write.call_args[0][0].startswith(b'\x82~\x00\x7fb') async def test_send_binary_very_long(writer) -> None: await writer.send(b'b' * 65537, True) assert (writer.transport.write.call_args_list[0][0][0] == b'\x82\x7f\x00\x00\x00\x00\x00\x01\x00\x01') assert writer.transport.write.call_args_list[1][0][0] == b'b' * 65537 async def test_close(writer) -> None: await writer.close(1001, 'msg') writer.transport.write.assert_called_with(b'\x88\x05\x03\xe9msg') await writer.close(1001, b'msg') writer.transport.write.assert_called_with(b'\x88\x05\x03\xe9msg') # Test that Service Restart close code is also supported await writer.close(1012, b'msg') writer.transport.write.assert_called_with(b'\x88\x05\x03\xf4msg') async def test_send_text_masked(protocol, transport) -> None: writer = WebSocketWriter(protocol, transport, use_mask=True, random=random.Random(123)) await writer.send(b'text') writer.transport.write.assert_called_with(b'\x81\x84\rg\xb3fy\x02\xcb\x12') async def test_send_compress_text(protocol, transport) -> None: writer = WebSocketWriter(protocol, transport, compress=15) await writer.send(b'text') writer.transport.write.assert_called_with(b'\xc1\x06*I\xad(\x01\x00') await writer.send(b'text') writer.transport.write.assert_called_with(b'\xc1\x05*\x01b\x00\x00') async def test_send_compress_text_notakeover(protocol, transport) -> None: writer = WebSocketWriter(protocol, transport, compress=15, notakeover=True) await writer.send(b'text') writer.transport.write.assert_called_with(b'\xc1\x06*I\xad(\x01\x00') await writer.send(b'text') 
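    # With notakeover=True the compressor context is not carried over between
    # messages, so sending b'text' twice yields byte-identical frames (compare
    # test_send_compress_text above, where the retained context shortens the
    # second frame to b'\xc1\x05*\x01b\x00\x00').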
writer.transport.write.assert_called_with(b'\xc1\x06*I\xad(\x01\x00') async def test_send_compress_text_per_message(protocol, transport) -> None: writer = WebSocketWriter(protocol, transport) await writer.send(b'text', compress=15) writer.transport.write.assert_called_with(b'\xc1\x06*I\xad(\x01\x00') await writer.send(b'text') writer.transport.write.assert_called_with(b'\x81\x04text') await writer.send(b'text', compress=15) writer.transport.write.assert_called_with(b'\xc1\x06*I\xad(\x01\x00') aiohttp-3.6.2/tests/test_worker.py0000644000175100001650000001762113547410117017560 0ustar vstsdocker00000000000000"""Tests for aiohttp/worker.py""" import asyncio import os import socket import ssl from unittest import mock import pytest from aiohttp import web base_worker = pytest.importorskip('aiohttp.worker') try: import uvloop except ImportError: uvloop = None WRONG_LOG_FORMAT = '%a "%{Referrer}i" %(h)s %(l)s %s' ACCEPTABLE_LOG_FORMAT = '%a "%{Referrer}i" %s' # tokio event loop does not allow to override attributes def skip_if_no_dict(loop): if not hasattr(loop, '__dict__'): pytest.skip("can not override loop attributes") class BaseTestWorker: def __init__(self): self.servers = {} self.exit_code = 0 self._notify_waiter = None self.cfg = mock.Mock() self.cfg.graceful_timeout = 100 self.pid = 'pid' self.wsgi = web.Application() class AsyncioWorker(BaseTestWorker, # type: ignore base_worker.GunicornWebWorker): pass PARAMS = [AsyncioWorker] if uvloop is not None: class UvloopWorker(BaseTestWorker, # type: ignore base_worker.GunicornUVLoopWebWorker): pass PARAMS.append(UvloopWorker) @pytest.fixture(params=PARAMS) def worker(request, loop): asyncio.set_event_loop(loop) ret = request.param() ret.notify = mock.Mock() return ret def test_init_process(worker) -> None: with mock.patch('aiohttp.worker.asyncio') as m_asyncio: try: worker.init_process() except TypeError: pass assert m_asyncio.get_event_loop.return_value.close.called assert m_asyncio.new_event_loop.called assert m_asyncio.set_event_loop.called def test_run(worker, loop) -> None: worker.log = mock.Mock() worker.cfg = mock.Mock() worker.cfg.access_log_format = ACCEPTABLE_LOG_FORMAT worker.cfg.is_ssl = False worker.sockets = [] worker.loop = loop with pytest.raises(SystemExit): worker.run() worker.log.exception.assert_not_called() assert loop.is_closed() def test_run_async_factory(worker, loop) -> None: worker.log = mock.Mock() worker.cfg = mock.Mock() worker.cfg.access_log_format = ACCEPTABLE_LOG_FORMAT worker.cfg.is_ssl = False worker.sockets = [] app = worker.wsgi async def make_app(): return app worker.wsgi = make_app worker.loop = loop worker.alive = False with pytest.raises(SystemExit): worker.run() worker.log.exception.assert_not_called() assert loop.is_closed() def test_run_not_app(worker, loop) -> None: worker.log = mock.Mock() worker.cfg = mock.Mock() worker.cfg.access_log_format = ACCEPTABLE_LOG_FORMAT worker.loop = loop worker.wsgi = "not-app" worker.alive = False with pytest.raises(SystemExit): worker.run() worker.log.exception.assert_called_with('Exception in gunicorn worker') assert loop.is_closed() def test_handle_quit(worker, loop) -> None: worker.loop = mock.Mock() worker.handle_quit(object(), object()) assert not worker.alive assert worker.exit_code == 0 worker.loop.call_later.asset_called_with( 0.1, worker._notify_waiter_done) def test_handle_abort(worker) -> None: with mock.patch('aiohttp.worker.sys') as m_sys: worker.handle_abort(object(), object()) assert not worker.alive assert worker.exit_code == 1 
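    # The failure exit code must also be propagated to the process itself:
    # handle_abort is expected to call sys.exit(1) (patched here as m_sys).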
m_sys.exit.assert_called_with(1) def test__wait_next_notify(worker) -> None: worker.loop = mock.Mock() worker._notify_waiter_done = mock.Mock() fut = worker._wait_next_notify() assert worker._notify_waiter == fut worker.loop.call_later.assert_called_with(1.0, worker._notify_waiter_done, fut) def test__notify_waiter_done(worker) -> None: worker._notify_waiter = None worker._notify_waiter_done() assert worker._notify_waiter is None waiter = worker._notify_waiter = mock.Mock() worker._notify_waiter.done.return_value = False worker._notify_waiter_done() assert worker._notify_waiter is None waiter.set_result.assert_called_with(True) def test__notify_waiter_done_explicit_waiter(worker) -> None: worker._notify_waiter = None assert worker._notify_waiter is None waiter = worker._notify_waiter = mock.Mock() waiter.done.return_value = False waiter2 = worker._notify_waiter = mock.Mock() worker._notify_waiter_done(waiter) assert worker._notify_waiter is waiter2 waiter.set_result.assert_called_with(True) assert not waiter2.set_result.called def test_init_signals(worker) -> None: worker.loop = mock.Mock() worker.init_signals() assert worker.loop.add_signal_handler.called @pytest.mark.parametrize('source,result', [ (ACCEPTABLE_LOG_FORMAT, ACCEPTABLE_LOG_FORMAT), (AsyncioWorker.DEFAULT_GUNICORN_LOG_FORMAT, AsyncioWorker.DEFAULT_AIOHTTP_LOG_FORMAT), ]) def test__get_valid_log_format_ok(worker, source, result) -> None: assert result == worker._get_valid_log_format(source) def test__get_valid_log_format_exc(worker) -> None: with pytest.raises(ValueError) as exc: worker._get_valid_log_format(WRONG_LOG_FORMAT) assert '%(name)s' in str(exc.value) async def test__run_ok_parent_changed(worker, loop, aiohttp_unused_port) -> None: skip_if_no_dict(loop) worker.ppid = 0 worker.alive = True sock = socket.socket() addr = ('localhost', aiohttp_unused_port()) sock.bind(addr) worker.sockets = [sock] worker.log = mock.Mock() worker.loop = loop worker.cfg.access_log_format = ACCEPTABLE_LOG_FORMAT worker.cfg.max_requests = 0 worker.cfg.is_ssl = False await worker._run() worker.notify.assert_called_with() worker.log.info.assert_called_with("Parent changed, shutting down: %s", worker) async def test__run_exc(worker, loop, aiohttp_unused_port) -> None: skip_if_no_dict(loop) worker.ppid = os.getppid() worker.alive = True sock = socket.socket() addr = ('localhost', aiohttp_unused_port()) sock.bind(addr) worker.sockets = [sock] worker.log = mock.Mock() worker.loop = loop worker.cfg.access_log_format = ACCEPTABLE_LOG_FORMAT worker.cfg.max_requests = 0 worker.cfg.is_ssl = False def raiser(): waiter = worker._notify_waiter worker.alive = False waiter.set_exception(RuntimeError()) loop.call_later(0.1, raiser) await worker._run() worker.notify.assert_called_with() def test__create_ssl_context_without_certs_and_ciphers( worker, tls_certificate_pem_path, ) -> None: worker.cfg.ssl_version = ssl.PROTOCOL_SSLv23 worker.cfg.cert_reqs = ssl.CERT_OPTIONAL worker.cfg.certfile = tls_certificate_pem_path worker.cfg.keyfile = tls_certificate_pem_path worker.cfg.ca_certs = None worker.cfg.ciphers = None ctx = worker._create_ssl_context(worker.cfg) assert isinstance(ctx, ssl.SSLContext) def test__create_ssl_context_with_ciphers( worker, tls_certificate_pem_path, ) -> None: worker.cfg.ssl_version = ssl.PROTOCOL_SSLv23 worker.cfg.cert_reqs = ssl.CERT_OPTIONAL worker.cfg.certfile = tls_certificate_pem_path worker.cfg.keyfile = tls_certificate_pem_path worker.cfg.ca_certs = None worker.cfg.ciphers = '3DES PSK' ctx = worker._create_ssl_context(worker.cfg) 
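    # Supplying an explicit cipher string ('3DES PSK') should still produce a usable SSLContext.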
assert isinstance(ctx, ssl.SSLContext) def test__create_ssl_context_with_ca_certs( worker, tls_ca_certificate_pem_path, tls_certificate_pem_path, ) -> None: worker.cfg.ssl_version = ssl.PROTOCOL_SSLv23 worker.cfg.cert_reqs = ssl.CERT_OPTIONAL worker.cfg.certfile = tls_certificate_pem_path worker.cfg.keyfile = tls_certificate_pem_path worker.cfg.ca_certs = tls_ca_certificate_pem_path worker.cfg.ciphers = None ctx = worker._create_ssl_context(worker.cfg) assert isinstance(ctx, ssl.SSLContext) aiohttp-3.6.2/vendor/0000755000175100001650000000000013547410140014756 5ustar vstsdocker00000000000000aiohttp-3.6.2/vendor/http-parser/0000755000175100001650000000000013547410140017227 5ustar vstsdocker00000000000000aiohttp-3.6.2/vendor/http-parser/.git0000644000175100001650000000005613547410117020020 0ustar vstsdocker00000000000000gitdir: ../../.git/modules/vendor/http-parser aiohttp-3.6.2/vendor/http-parser/.gitignore0000644000175100001650000000037713547410117021232 0ustar vstsdocker00000000000000/out/ core tags *.o test test_g test_fast bench url_parser parsertrace parsertrace_g *.mk *.Makefile *.so.* *.exe.* *.exe *.a # Visual Studio uglies *.suo *.sln *.vcxproj *.vcxproj.filters *.vcxproj.user *.opensdf *.ncrunchsolution* *.sdf *.vsp *.psess aiohttp-3.6.2/vendor/http-parser/.mailmap0000644000175100001650000000074013547410117020655 0ustar vstsdocker00000000000000# update AUTHORS with: # git log --all --reverse --format='%aN <%aE>' | perl -ne 'BEGIN{print "# Authors ordered by first contribution.\n"} print unless $h{$_}; $h{$_} = 1' > AUTHORS Ryan Dahl Salman Haq Simon Zimmermann Thomas LE ROUX LE ROUX Thomas Thomas LE ROUX Thomas LE ROUX Fedor Indutny aiohttp-3.6.2/vendor/http-parser/.travis.yml0000644000175100001650000000020413547410117021340 0ustar vstsdocker00000000000000language: c compiler: - clang - gcc script: - "make" notifications: email: false irc: - "irc.freenode.net#node-ci" aiohttp-3.6.2/vendor/http-parser/AUTHORS0000644000175100001650000000470613547410117020312 0ustar vstsdocker00000000000000# Authors ordered by first contribution. Ryan Dahl Jeremy Hinegardner Sergey Shepelev Joe Damato tomika Phoenix Sol Cliff Frey Ewen Cheslack-Postava Santiago Gala Tim Becker Jeff Terrace Ben Noordhuis Nathan Rajlich Mark Nottingham Aman Gupta Tim Becker Sean Cunningham Peter Griess Salman Haq Cliff Frey Jon Kolb Fouad Mardini Paul Querna Felix Geisendörfer koichik Andre Caron Ivo Raisr James McLaughlin David Gwynne Thomas LE ROUX Randy Rizun Andre Louis Caron Simon Zimmermann Erik Dubbelboer Martell Malone Bertrand Paquet BogDan Vatra Peter Faiman Corey Richardson Tóth Tamás Cam Swords Chris Dickinson Uli Köhler Charlie Somerville Patrik Stutz Fedor Indutny runner Alexis Campailla David Wragg Vinnie Falco Alex Butum Rex Feng Alex Kocharin Mark Koopman Helge Heß Alexis La Goutte George Miroshnykov Maciej Małecki Marc O'Morain Jeff Pinner Timothy J Fontaine Akagi201 Romain Giraud Jay Satiro Arne Steen Kjell Schubert Olivier Mengué aiohttp-3.6.2/vendor/http-parser/LICENSE-MIT0000644000175100001650000000206513547410117020672 0ustar vstsdocker00000000000000Copyright Joyent, Inc. and other Node contributors. 
Permission is hereby granted, free of charge, to any person obtaining a copy of this software and associated documentation files (the "Software"), to deal in the Software without restriction, including without limitation the rights to use, copy, modify, merge, publish, distribute, sublicense, and/or sell copies of the Software, and to permit persons to whom the Software is furnished to do so, subject to the following conditions: The above copyright notice and this permission notice shall be included in all copies or substantial portions of the Software. THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE. aiohttp-3.6.2/vendor/http-parser/Makefile0000644000175100001650000001224413547410117020676 0ustar vstsdocker00000000000000# Copyright Joyent, Inc. and other Node contributors. All rights reserved. # # Permission is hereby granted, free of charge, to any person obtaining a copy # of this software and associated documentation files (the "Software"), to # deal in the Software without restriction, including without limitation the # rights to use, copy, modify, merge, publish, distribute, sublicense, and/or # sell copies of the Software, and to permit persons to whom the Software is # furnished to do so, subject to the following conditions: # # The above copyright notice and this permission notice shall be included in # all copies or substantial portions of the Software. # # THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR # IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, # FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE # AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER # LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING # FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS # IN THE SOFTWARE. PLATFORM ?= $(shell sh -c 'uname -s | tr "[A-Z]" "[a-z]"') HELPER ?= BINEXT ?= SOLIBNAME = libhttp_parser SOMAJOR = 2 SOMINOR = 8 SOREV = 1 ifeq (darwin,$(PLATFORM)) SOEXT ?= dylib SONAME ?= $(SOLIBNAME).$(SOMAJOR).$(SOMINOR).$(SOEXT) LIBNAME ?= $(SOLIBNAME).$(SOMAJOR).$(SOMINOR).$(SOREV).$(SOEXT) else ifeq (wine,$(PLATFORM)) CC = winegcc BINEXT = .exe.so HELPER = wine else SOEXT ?= so SONAME ?= $(SOLIBNAME).$(SOEXT).$(SOMAJOR).$(SOMINOR) LIBNAME ?= $(SOLIBNAME).$(SOEXT).$(SOMAJOR).$(SOMINOR).$(SOREV) endif CC?=gcc AR?=ar CPPFLAGS ?= LDFLAGS ?= CPPFLAGS += -I. 
CPPFLAGS_DEBUG = $(CPPFLAGS) -DHTTP_PARSER_STRICT=1 CPPFLAGS_DEBUG += $(CPPFLAGS_DEBUG_EXTRA) CPPFLAGS_FAST = $(CPPFLAGS) -DHTTP_PARSER_STRICT=0 CPPFLAGS_FAST += $(CPPFLAGS_FAST_EXTRA) CPPFLAGS_BENCH = $(CPPFLAGS_FAST) CFLAGS += -Wall -Wextra -Werror CFLAGS_DEBUG = $(CFLAGS) -O0 -g $(CFLAGS_DEBUG_EXTRA) CFLAGS_FAST = $(CFLAGS) -O3 $(CFLAGS_FAST_EXTRA) CFLAGS_BENCH = $(CFLAGS_FAST) -Wno-unused-parameter CFLAGS_LIB = $(CFLAGS_FAST) -fPIC LDFLAGS_LIB = $(LDFLAGS) -shared INSTALL ?= install PREFIX ?= /usr/local LIBDIR = $(PREFIX)/lib INCLUDEDIR = $(PREFIX)/include ifeq (darwin,$(PLATFORM)) LDFLAGS_LIB += -Wl,-install_name,$(LIBDIR)/$(SONAME) else # TODO(bnoordhuis) The native SunOS linker expects -h rather than -soname... LDFLAGS_LIB += -Wl,-soname=$(SONAME) endif test: test_g test_fast $(HELPER) ./test_g$(BINEXT) $(HELPER) ./test_fast$(BINEXT) test_g: http_parser_g.o test_g.o $(CC) $(CFLAGS_DEBUG) $(LDFLAGS) http_parser_g.o test_g.o -o $@ test_g.o: test.c http_parser.h Makefile $(CC) $(CPPFLAGS_DEBUG) $(CFLAGS_DEBUG) -c test.c -o $@ http_parser_g.o: http_parser.c http_parser.h Makefile $(CC) $(CPPFLAGS_DEBUG) $(CFLAGS_DEBUG) -c http_parser.c -o $@ test_fast: http_parser.o test.o http_parser.h $(CC) $(CFLAGS_FAST) $(LDFLAGS) http_parser.o test.o -o $@ test.o: test.c http_parser.h Makefile $(CC) $(CPPFLAGS_FAST) $(CFLAGS_FAST) -c test.c -o $@ bench: http_parser.o bench.o $(CC) $(CFLAGS_BENCH) $(LDFLAGS) http_parser.o bench.o -o $@ bench.o: bench.c http_parser.h Makefile $(CC) $(CPPFLAGS_BENCH) $(CFLAGS_BENCH) -c bench.c -o $@ http_parser.o: http_parser.c http_parser.h Makefile $(CC) $(CPPFLAGS_FAST) $(CFLAGS_FAST) -c http_parser.c test-run-timed: test_fast while(true) do time $(HELPER) ./test_fast$(BINEXT) > /dev/null; done test-valgrind: test_g valgrind ./test_g libhttp_parser.o: http_parser.c http_parser.h Makefile $(CC) $(CPPFLAGS_FAST) $(CFLAGS_LIB) -c http_parser.c -o libhttp_parser.o library: libhttp_parser.o $(CC) $(LDFLAGS_LIB) -o $(LIBNAME) $< package: http_parser.o $(AR) rcs libhttp_parser.a http_parser.o url_parser: http_parser.o contrib/url_parser.c $(CC) $(CPPFLAGS_FAST) $(CFLAGS_FAST) $^ -o $@ url_parser_g: http_parser_g.o contrib/url_parser.c $(CC) $(CPPFLAGS_DEBUG) $(CFLAGS_DEBUG) $^ -o $@ parsertrace: http_parser.o contrib/parsertrace.c $(CC) $(CPPFLAGS_FAST) $(CFLAGS_FAST) $^ -o parsertrace$(BINEXT) parsertrace_g: http_parser_g.o contrib/parsertrace.c $(CC) $(CPPFLAGS_DEBUG) $(CFLAGS_DEBUG) $^ -o parsertrace_g$(BINEXT) tags: http_parser.c http_parser.h test.c ctags $^ install: library $(INSTALL) -D http_parser.h $(DESTDIR)$(INCLUDEDIR)/http_parser.h $(INSTALL) -D $(LIBNAME) $(DESTDIR)$(LIBDIR)/$(LIBNAME) ln -s $(LIBNAME) $(DESTDIR)$(LIBDIR)/$(SONAME) ln -s $(LIBNAME) $(DESTDIR)$(LIBDIR)/$(SOLIBNAME).$(SOEXT) install-strip: library $(INSTALL) -D http_parser.h $(DESTDIR)$(INCLUDEDIR)/http_parser.h $(INSTALL) -D -s $(LIBNAME) $(DESTDIR)$(LIBDIR)/$(LIBNAME) ln -s $(LIBNAME) $(DESTDIR)$(LIBDIR)/$(SONAME) ln -s $(LIBNAME) $(DESTDIR)$(LIBDIR)/$(SOLIBNAME).$(SOEXT) uninstall: rm $(DESTDIR)$(INCLUDEDIR)/http_parser.h rm $(DESTDIR)$(LIBDIR)/$(SOLIBNAME).$(SOEXT) rm $(DESTDIR)$(LIBDIR)/$(SONAME) rm $(DESTDIR)$(LIBDIR)/$(LIBNAME) clean: rm -f *.o *.a tags test test_fast test_g \ http_parser.tar libhttp_parser.so.* \ url_parser url_parser_g parsertrace parsertrace_g \ *.exe *.exe.so contrib/url_parser.c: http_parser.h contrib/parsertrace.c: http_parser.h .PHONY: clean package test-run test-run-timed test-valgrind install install-strip uninstall 
aiohttp-3.6.2/vendor/http-parser/README.md0000644000175100001650000002217713547410117020523 0ustar vstsdocker00000000000000HTTP Parser =========== [![Build Status](https://api.travis-ci.org/nodejs/http-parser.svg?branch=master)](https://travis-ci.org/nodejs/http-parser) This is a parser for HTTP messages written in C. It parses both requests and responses. The parser is designed to be used in performance HTTP applications. It does not make any syscalls nor allocations, it does not buffer data, it can be interrupted at anytime. Depending on your architecture, it only requires about 40 bytes of data per message stream (in a web server that is per connection). Features: * No dependencies * Handles persistent streams (keep-alive). * Decodes chunked encoding. * Upgrade support * Defends against buffer overflow attacks. The parser extracts the following information from HTTP messages: * Header fields and values * Content-Length * Request method * Response status code * Transfer-Encoding * HTTP version * Request URL * Message body Usage ----- One `http_parser` object is used per TCP connection. Initialize the struct using `http_parser_init()` and set the callbacks. That might look something like this for a request parser: ```c http_parser_settings settings; settings.on_url = my_url_callback; settings.on_header_field = my_header_field_callback; /* ... */ http_parser *parser = malloc(sizeof(http_parser)); http_parser_init(parser, HTTP_REQUEST); parser->data = my_socket; ``` When data is received on the socket execute the parser and check for errors. ```c size_t len = 80*1024, nparsed; char buf[len]; ssize_t recved; recved = recv(fd, buf, len, 0); if (recved < 0) { /* Handle error. */ } /* Start up / continue the parser. * Note we pass recved==0 to signal that EOF has been received. */ nparsed = http_parser_execute(parser, &settings, buf, recved); if (parser->upgrade) { /* handle new protocol */ } else if (nparsed != recved) { /* Handle error. Usually just close the connection. */ } ``` `http_parser` needs to know where the end of the stream is. For example, sometimes servers send responses without Content-Length and expect the client to consume input (for the body) until EOF. To tell `http_parser` about EOF, give `0` as the fourth parameter to `http_parser_execute()`. Callbacks and errors can still be encountered during an EOF, so one must still be prepared to receive them. Scalar valued message information such as `status_code`, `method`, and the HTTP version are stored in the parser structure. This data is only temporally stored in `http_parser` and gets reset on each new message. If this information is needed later, copy it out of the structure during the `headers_complete` callback. The parser decodes the transfer-encoding for both requests and responses transparently. That is, a chunked encoding is decoded before being sent to the on_body callback. The Special Problem of Upgrade ------------------------------ `http_parser` supports upgrading the connection to a different protocol. An increasingly common example of this is the WebSocket protocol which sends a request like GET /demo HTTP/1.1 Upgrade: WebSocket Connection: Upgrade Host: example.com Origin: http://example.com WebSocket-Protocol: sample followed by non-HTTP data. (See [RFC6455](https://tools.ietf.org/html/rfc6455) for more information the WebSocket protocol.) To support this, the parser will treat this as a normal HTTP message without a body, issuing both on_headers_complete and on_message_complete callbacks. 
However http_parser_execute() will stop parsing at the end of the headers and return. The user is expected to check if `parser->upgrade` has been set to 1 after `http_parser_execute()` returns. Non-HTTP data begins at the buffer supplied offset by the return value of `http_parser_execute()`. Callbacks --------- During the `http_parser_execute()` call, the callbacks set in `http_parser_settings` will be executed. The parser maintains state and never looks behind, so buffering the data is not necessary. If you need to save certain data for later usage, you can do that from the callbacks. There are two types of callbacks: * notification `typedef int (*http_cb) (http_parser*);` Callbacks: on_message_begin, on_headers_complete, on_message_complete. * data `typedef int (*http_data_cb) (http_parser*, const char *at, size_t length);` Callbacks: (requests only) on_url, (common) on_header_field, on_header_value, on_body; Callbacks must return 0 on success. Returning a non-zero value indicates error to the parser, making it exit immediately. For cases where it is necessary to pass local information to/from a callback, the `http_parser` object's `data` field can be used. An example of such a case is when using threads to handle a socket connection, parse a request, and then give a response over that socket. By instantiation of a thread-local struct containing relevant data (e.g. accepted socket, allocated memory for callbacks to write into, etc), a parser's callbacks are able to communicate data between the scope of the thread and the scope of the callback in a threadsafe manner. This allows `http_parser` to be used in multi-threaded contexts. Example: ```c typedef struct { socket_t sock; void* buffer; int buf_len; } custom_data_t; int my_url_callback(http_parser* parser, const char *at, size_t length) { /* access to thread local custom_data_t struct. Use this access save parsed data for later use into thread local buffer, or communicate over socket */ parser->data; ... return 0; } ... void http_parser_thread(socket_t sock) { int nparsed = 0; /* allocate memory for user data */ custom_data_t *my_data = malloc(sizeof(custom_data_t)); /* some information for use by callbacks. * achieves thread -> callback information flow */ my_data->sock = sock; /* instantiate a thread-local parser */ http_parser *parser = malloc(sizeof(http_parser)); http_parser_init(parser, HTTP_REQUEST); /* initialise parser */ /* this custom data reference is accessible through the reference to the parser supplied to callback functions */ parser->data = my_data; http_parser_settings settings; /* set up callbacks */ settings.on_url = my_url_callback; /* execute parser */ nparsed = http_parser_execute(parser, &settings, buf, recved); ... /* parsed information copied from callback. can now perform action on data copied into thread-local memory from callbacks. achieves callback -> thread information flow */ my_data->buffer; ... } ``` In case you parse HTTP message in chunks (i.e. `read()` request line from socket, parse, read half headers, parse, etc) your data callbacks may be called more than once. `http_parser` guarantees that data pointer is only valid for the lifetime of callback. You can also `read()` into a heap allocated buffer to avoid copying memory around if this fits your application. Reading headers may be a tricky task if you read/parse headers partially. 
Basically, you need to remember whether last header callback was field or value and apply the following logic: (on_header_field and on_header_value shortened to on_h_*) ------------------------ ------------ -------------------------------------------- | State (prev. callback) | Callback | Description/action | ------------------------ ------------ -------------------------------------------- | nothing (first call) | on_h_field | Allocate new buffer and copy callback data | | | | into it | ------------------------ ------------ -------------------------------------------- | value | on_h_field | New header started. | | | | Copy current name,value buffers to headers | | | | list and allocate new buffer for new name | ------------------------ ------------ -------------------------------------------- | field | on_h_field | Previous name continues. Reallocate name | | | | buffer and append callback data to it | ------------------------ ------------ -------------------------------------------- | field | on_h_value | Value for current header started. Allocate | | | | new buffer and copy callback data to it | ------------------------ ------------ -------------------------------------------- | value | on_h_value | Value continues. Reallocate value buffer | | | | and append callback data to it | ------------------------ ------------ -------------------------------------------- Parsing URLs ------------ A simplistic zero-copy URL parser is provided as `http_parser_parse_url()`. Users of this library may wish to use it to parse URLs constructed from consecutive `on_url` callbacks. See examples of reading in headers: * [partial example](http://gist.github.com/155877) in C * [from http-parser tests](http://github.com/joyent/http-parser/blob/37a0ff8/test.c#L403) in C * [from Node library](http://github.com/joyent/node/blob/842eaf4/src/http.js#L284) in Javascript aiohttp-3.6.2/vendor/http-parser/bench.c0000644000175100001650000000731213547410117020461 0ustar vstsdocker00000000000000/* Copyright Fedor Indutny. All rights reserved. * * Permission is hereby granted, free of charge, to any person obtaining a copy * of this software and associated documentation files (the "Software"), to * deal in the Software without restriction, including without limitation the * rights to use, copy, modify, merge, publish, distribute, sublicense, and/or * sell copies of the Software, and to permit persons to whom the Software is * furnished to do so, subject to the following conditions: * * The above copyright notice and this permission notice shall be included in * all copies or substantial portions of the Software. * * THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR * IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, * FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE * AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER * LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING * FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS * IN THE SOFTWARE. 
*/ #include "http_parser.h" #include #include #include #include #include /* 8 gb */ static const int64_t kBytes = 8LL << 30; static const char data[] = "POST /joyent/http-parser HTTP/1.1\r\n" "Host: github.com\r\n" "DNT: 1\r\n" "Accept-Encoding: gzip, deflate, sdch\r\n" "Accept-Language: ru-RU,ru;q=0.8,en-US;q=0.6,en;q=0.4\r\n" "User-Agent: Mozilla/5.0 (Macintosh; Intel Mac OS X 10_10_1) " "AppleWebKit/537.36 (KHTML, like Gecko) " "Chrome/39.0.2171.65 Safari/537.36\r\n" "Accept: text/html,application/xhtml+xml,application/xml;q=0.9," "image/webp,*/*;q=0.8\r\n" "Referer: https://github.com/joyent/http-parser\r\n" "Connection: keep-alive\r\n" "Transfer-Encoding: chunked\r\n" "Cache-Control: max-age=0\r\n\r\nb\r\nhello world\r\n0\r\n"; static const size_t data_len = sizeof(data) - 1; static int on_info(http_parser* p) { return 0; } static int on_data(http_parser* p, const char *at, size_t length) { return 0; } static http_parser_settings settings = { .on_message_begin = on_info, .on_headers_complete = on_info, .on_message_complete = on_info, .on_header_field = on_data, .on_header_value = on_data, .on_url = on_data, .on_status = on_data, .on_body = on_data }; int bench(int iter_count, int silent) { struct http_parser parser; int i; int err; struct timeval start; struct timeval end; if (!silent) { err = gettimeofday(&start, NULL); assert(err == 0); } fprintf(stderr, "req_len=%d\n", (int) data_len); for (i = 0; i < iter_count; i++) { size_t parsed; http_parser_init(&parser, HTTP_REQUEST); parsed = http_parser_execute(&parser, &settings, data, data_len); assert(parsed == data_len); } if (!silent) { double elapsed; double bw; double total; err = gettimeofday(&end, NULL); assert(err == 0); fprintf(stdout, "Benchmark result:\n"); elapsed = (double) (end.tv_sec - start.tv_sec) + (end.tv_usec - start.tv_usec) * 1e-6f; total = (double) iter_count * data_len; bw = (double) total / elapsed; fprintf(stdout, "%.2f mb | %.2f mb/s | %.2f req/sec | %.2f s\n", (double) total / (1024 * 1024), bw / (1024 * 1024), (double) iter_count / elapsed, elapsed); fflush(stdout); } return 0; } int main(int argc, char** argv) { int64_t iterations; iterations = kBytes / (int64_t) data_len; if (argc == 2 && strcmp(argv[1], "infinite") == 0) { for (;;) bench(iterations, 1); return 0; } else { return bench(iterations, 0); } } aiohttp-3.6.2/vendor/http-parser/contrib/0000755000175100001650000000000013547410140020667 5ustar vstsdocker00000000000000aiohttp-3.6.2/vendor/http-parser/contrib/parsertrace.c0000644000175100001650000001013413547410117023351 0ustar vstsdocker00000000000000/* Copyright Joyent, Inc. and other Node contributors. * * Permission is hereby granted, free of charge, to any person obtaining a copy * of this software and associated documentation files (the "Software"), to * deal in the Software without restriction, including without limitation the * rights to use, copy, modify, merge, publish, distribute, sublicense, and/or * sell copies of the Software, and to permit persons to whom the Software is * furnished to do so, subject to the following conditions: * * The above copyright notice and this permission notice shall be included in * all copies or substantial portions of the Software. * * THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR * IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, * FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. 
IN NO EVENT SHALL THE * AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER * LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING * FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS * IN THE SOFTWARE. */ /* Dump what the parser finds to stdout as it happen */ #include "http_parser.h" #include #include #include int on_message_begin(http_parser* _) { (void)_; printf("\n***MESSAGE BEGIN***\n\n"); return 0; } int on_headers_complete(http_parser* _) { (void)_; printf("\n***HEADERS COMPLETE***\n\n"); return 0; } int on_message_complete(http_parser* _) { (void)_; printf("\n***MESSAGE COMPLETE***\n\n"); return 0; } int on_url(http_parser* _, const char* at, size_t length) { (void)_; printf("Url: %.*s\n", (int)length, at); return 0; } int on_header_field(http_parser* _, const char* at, size_t length) { (void)_; printf("Header field: %.*s\n", (int)length, at); return 0; } int on_header_value(http_parser* _, const char* at, size_t length) { (void)_; printf("Header value: %.*s\n", (int)length, at); return 0; } int on_body(http_parser* _, const char* at, size_t length) { (void)_; printf("Body: %.*s\n", (int)length, at); return 0; } void usage(const char* name) { fprintf(stderr, "Usage: %s $type $filename\n" " type: -x, where x is one of {r,b,q}\n" " parses file as a Response, reQuest, or Both\n", name); exit(EXIT_FAILURE); } int main(int argc, char* argv[]) { enum http_parser_type file_type; if (argc != 3) { usage(argv[0]); } char* type = argv[1]; if (type[0] != '-') { usage(argv[0]); } switch (type[1]) { /* in the case of "-", type[1] will be NUL */ case 'r': file_type = HTTP_RESPONSE; break; case 'q': file_type = HTTP_REQUEST; break; case 'b': file_type = HTTP_BOTH; break; default: usage(argv[0]); } char* filename = argv[2]; FILE* file = fopen(filename, "r"); if (file == NULL) { perror("fopen"); goto fail; } fseek(file, 0, SEEK_END); long file_length = ftell(file); if (file_length == -1) { perror("ftell"); goto fail; } fseek(file, 0, SEEK_SET); char* data = malloc(file_length); if (fread(data, 1, file_length, file) != (size_t)file_length) { fprintf(stderr, "couldn't read entire file\n"); free(data); goto fail; } http_parser_settings settings; memset(&settings, 0, sizeof(settings)); settings.on_message_begin = on_message_begin; settings.on_url = on_url; settings.on_header_field = on_header_field; settings.on_header_value = on_header_value; settings.on_headers_complete = on_headers_complete; settings.on_body = on_body; settings.on_message_complete = on_message_complete; http_parser parser; http_parser_init(&parser, file_type); size_t nparsed = http_parser_execute(&parser, &settings, data, file_length); free(data); if (nparsed != (size_t)file_length) { fprintf(stderr, "Error: %s (%s)\n", http_errno_description(HTTP_PARSER_ERRNO(&parser)), http_errno_name(HTTP_PARSER_ERRNO(&parser))); goto fail; } return EXIT_SUCCESS; fail: fclose(file); return EXIT_FAILURE; } aiohttp-3.6.2/vendor/http-parser/contrib/url_parser.c0000644000175100001650000000217713547410117023224 0ustar vstsdocker00000000000000#include "http_parser.h" #include #include void dump_url (const char *url, const struct http_parser_url *u) { unsigned int i; printf("\tfield_set: 0x%x, port: %u\n", u->field_set, u->port); for (i = 0; i < UF_MAX; i++) { if ((u->field_set & (1 << i)) == 0) { printf("\tfield_data[%u]: unset\n", i); continue; } printf("\tfield_data[%u]: off: %u, len: %u, part: %.*s\n", i, u->field_data[i].off, u->field_data[i].len, u->field_data[i].len, url + 
u->field_data[i].off); } } int main(int argc, char ** argv) { struct http_parser_url u; int len, connect, result; if (argc != 3) { printf("Syntax : %s connect|get url\n", argv[0]); return 1; } len = strlen(argv[2]); connect = strcmp("connect", argv[1]) == 0 ? 1 : 0; printf("Parsing %s, connect %d\n", argv[2], connect); http_parser_url_init(&u); result = http_parser_parse_url(argv[2], len, connect, &u); if (result != 0) { printf("Parse error : %d\n", result); return result; } printf("Parse ok, result : \n"); dump_url(argv[2], &u); return 0; } aiohttp-3.6.2/vendor/http-parser/http_parser.c0000644000175100001650000021321513547410117021736 0ustar vstsdocker00000000000000/* Copyright Joyent, Inc. and other Node contributors. * * Permission is hereby granted, free of charge, to any person obtaining a copy * of this software and associated documentation files (the "Software"), to * deal in the Software without restriction, including without limitation the * rights to use, copy, modify, merge, publish, distribute, sublicense, and/or * sell copies of the Software, and to permit persons to whom the Software is * furnished to do so, subject to the following conditions: * * The above copyright notice and this permission notice shall be included in * all copies or substantial portions of the Software. * * THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR * IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, * FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE * AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER * LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING * FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS * IN THE SOFTWARE. */ #include "http_parser.h" #include #include #include #include #include #ifndef ULLONG_MAX # define ULLONG_MAX ((uint64_t) -1) /* 2^64-1 */ #endif #ifndef MIN # define MIN(a,b) ((a) < (b) ? (a) : (b)) #endif #ifndef ARRAY_SIZE # define ARRAY_SIZE(a) (sizeof(a) / sizeof((a)[0])) #endif #ifndef BIT_AT # define BIT_AT(a, i) \ (!!((unsigned int) (a)[(unsigned int) (i) >> 3] & \ (1 << ((unsigned int) (i) & 7)))) #endif #ifndef ELEM_AT # define ELEM_AT(a, i, v) ((unsigned int) (i) < ARRAY_SIZE(a) ? 
(a)[(i)] : (v)) #endif #define SET_ERRNO(e) \ do { \ parser->nread = nread; \ parser->http_errno = (e); \ } while(0) #define CURRENT_STATE() p_state #define UPDATE_STATE(V) p_state = (enum state) (V); #define RETURN(V) \ do { \ parser->nread = nread; \ parser->state = CURRENT_STATE(); \ return (V); \ } while (0); #define REEXECUTE() \ goto reexecute; \ #ifdef __GNUC__ # define LIKELY(X) __builtin_expect(!!(X), 1) # define UNLIKELY(X) __builtin_expect(!!(X), 0) #else # define LIKELY(X) (X) # define UNLIKELY(X) (X) #endif /* Run the notify callback FOR, returning ER if it fails */ #define CALLBACK_NOTIFY_(FOR, ER) \ do { \ assert(HTTP_PARSER_ERRNO(parser) == HPE_OK); \ \ if (LIKELY(settings->on_##FOR)) { \ parser->state = CURRENT_STATE(); \ if (UNLIKELY(0 != settings->on_##FOR(parser))) { \ SET_ERRNO(HPE_CB_##FOR); \ } \ UPDATE_STATE(parser->state); \ \ /* We either errored above or got paused; get out */ \ if (UNLIKELY(HTTP_PARSER_ERRNO(parser) != HPE_OK)) { \ return (ER); \ } \ } \ } while (0) /* Run the notify callback FOR and consume the current byte */ #define CALLBACK_NOTIFY(FOR) CALLBACK_NOTIFY_(FOR, p - data + 1) /* Run the notify callback FOR and don't consume the current byte */ #define CALLBACK_NOTIFY_NOADVANCE(FOR) CALLBACK_NOTIFY_(FOR, p - data) /* Run data callback FOR with LEN bytes, returning ER if it fails */ #define CALLBACK_DATA_(FOR, LEN, ER) \ do { \ assert(HTTP_PARSER_ERRNO(parser) == HPE_OK); \ \ if (FOR##_mark) { \ if (LIKELY(settings->on_##FOR)) { \ parser->state = CURRENT_STATE(); \ if (UNLIKELY(0 != \ settings->on_##FOR(parser, FOR##_mark, (LEN)))) { \ SET_ERRNO(HPE_CB_##FOR); \ } \ UPDATE_STATE(parser->state); \ \ /* We either errored above or got paused; get out */ \ if (UNLIKELY(HTTP_PARSER_ERRNO(parser) != HPE_OK)) { \ return (ER); \ } \ } \ FOR##_mark = NULL; \ } \ } while (0) /* Run the data callback FOR and consume the current byte */ #define CALLBACK_DATA(FOR) \ CALLBACK_DATA_(FOR, p - FOR##_mark, p - data + 1) /* Run the data callback FOR and don't consume the current byte */ #define CALLBACK_DATA_NOADVANCE(FOR) \ CALLBACK_DATA_(FOR, p - FOR##_mark, p - data) /* Set the mark FOR; non-destructive if mark is already set */ #define MARK(FOR) \ do { \ if (!FOR##_mark) { \ FOR##_mark = p; \ } \ } while (0) /* Don't allow the total size of the HTTP headers (including the status * line) to exceed HTTP_MAX_HEADER_SIZE. This check is here to protect * embedders against denial-of-service attacks where the attacker feeds * us a never-ending header that the embedder keeps buffering. * * This check is arguably the responsibility of embedders but we're doing * it on the embedder's behalf because most won't bother and this way we * make the web a little safer. HTTP_MAX_HEADER_SIZE is still far bigger * than any reasonable request or response so this should never affect * day-to-day operation. */ #define COUNT_HEADER_SIZE(V) \ do { \ nread += (V); \ if (UNLIKELY(nread > (HTTP_MAX_HEADER_SIZE))) { \ SET_ERRNO(HPE_HEADER_OVERFLOW); \ goto error; \ } \ } while (0) #define PROXY_CONNECTION "proxy-connection" #define CONNECTION "connection" #define CONTENT_LENGTH "content-length" #define TRANSFER_ENCODING "transfer-encoding" #define UPGRADE "upgrade" #define CHUNKED "chunked" #define KEEP_ALIVE "keep-alive" #define CLOSE "close" static const char *method_strings[] = { #define XX(num, name, string) #string, HTTP_METHOD_MAP(XX) #undef XX }; /* Tokens as defined by rfc 2616. Also lowercases them. 
* token = 1* * separators = "(" | ")" | "<" | ">" | "@" * | "," | ";" | ":" | "\" | <"> * | "/" | "[" | "]" | "?" | "=" * | "{" | "}" | SP | HT */ static const char tokens[256] = { /* 0 nul 1 soh 2 stx 3 etx 4 eot 5 enq 6 ack 7 bel */ 0, 0, 0, 0, 0, 0, 0, 0, /* 8 bs 9 ht 10 nl 11 vt 12 np 13 cr 14 so 15 si */ 0, 0, 0, 0, 0, 0, 0, 0, /* 16 dle 17 dc1 18 dc2 19 dc3 20 dc4 21 nak 22 syn 23 etb */ 0, 0, 0, 0, 0, 0, 0, 0, /* 24 can 25 em 26 sub 27 esc 28 fs 29 gs 30 rs 31 us */ 0, 0, 0, 0, 0, 0, 0, 0, /* 32 sp 33 ! 34 " 35 # 36 $ 37 % 38 & 39 ' */ ' ', '!', 0, '#', '$', '%', '&', '\'', /* 40 ( 41 ) 42 * 43 + 44 , 45 - 46 . 47 / */ 0, 0, '*', '+', 0, '-', '.', 0, /* 48 0 49 1 50 2 51 3 52 4 53 5 54 6 55 7 */ '0', '1', '2', '3', '4', '5', '6', '7', /* 56 8 57 9 58 : 59 ; 60 < 61 = 62 > 63 ? */ '8', '9', 0, 0, 0, 0, 0, 0, /* 64 @ 65 A 66 B 67 C 68 D 69 E 70 F 71 G */ 0, 'a', 'b', 'c', 'd', 'e', 'f', 'g', /* 72 H 73 I 74 J 75 K 76 L 77 M 78 N 79 O */ 'h', 'i', 'j', 'k', 'l', 'm', 'n', 'o', /* 80 P 81 Q 82 R 83 S 84 T 85 U 86 V 87 W */ 'p', 'q', 'r', 's', 't', 'u', 'v', 'w', /* 88 X 89 Y 90 Z 91 [ 92 \ 93 ] 94 ^ 95 _ */ 'x', 'y', 'z', 0, 0, 0, '^', '_', /* 96 ` 97 a 98 b 99 c 100 d 101 e 102 f 103 g */ '`', 'a', 'b', 'c', 'd', 'e', 'f', 'g', /* 104 h 105 i 106 j 107 k 108 l 109 m 110 n 111 o */ 'h', 'i', 'j', 'k', 'l', 'm', 'n', 'o', /* 112 p 113 q 114 r 115 s 116 t 117 u 118 v 119 w */ 'p', 'q', 'r', 's', 't', 'u', 'v', 'w', /* 120 x 121 y 122 z 123 { 124 | 125 } 126 ~ 127 del */ 'x', 'y', 'z', 0, '|', 0, '~', 0 }; static const int8_t unhex[256] = {-1,-1,-1,-1,-1,-1,-1,-1,-1,-1,-1,-1,-1,-1,-1,-1 ,-1,-1,-1,-1,-1,-1,-1,-1,-1,-1,-1,-1,-1,-1,-1,-1 ,-1,-1,-1,-1,-1,-1,-1,-1,-1,-1,-1,-1,-1,-1,-1,-1 , 0, 1, 2, 3, 4, 5, 6, 7, 8, 9,-1,-1,-1,-1,-1,-1 ,-1,10,11,12,13,14,15,-1,-1,-1,-1,-1,-1,-1,-1,-1 ,-1,-1,-1,-1,-1,-1,-1,-1,-1,-1,-1,-1,-1,-1,-1,-1 ,-1,10,11,12,13,14,15,-1,-1,-1,-1,-1,-1,-1,-1,-1 ,-1,-1,-1,-1,-1,-1,-1,-1,-1,-1,-1,-1,-1,-1,-1,-1 }; #if HTTP_PARSER_STRICT # define T(v) 0 #else # define T(v) v #endif static const uint8_t normal_url_char[32] = { /* 0 nul 1 soh 2 stx 3 etx 4 eot 5 enq 6 ack 7 bel */ 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0, /* 8 bs 9 ht 10 nl 11 vt 12 np 13 cr 14 so 15 si */ 0 | T(2) | 0 | 0 | T(16) | 0 | 0 | 0, /* 16 dle 17 dc1 18 dc2 19 dc3 20 dc4 21 nak 22 syn 23 etb */ 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0, /* 24 can 25 em 26 sub 27 esc 28 fs 29 gs 30 rs 31 us */ 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0, /* 32 sp 33 ! 34 " 35 # 36 $ 37 % 38 & 39 ' */ 0 | 2 | 4 | 0 | 16 | 32 | 64 | 128, /* 40 ( 41 ) 42 * 43 + 44 , 45 - 46 . 47 / */ 1 | 2 | 4 | 8 | 16 | 32 | 64 | 128, /* 48 0 49 1 50 2 51 3 52 4 53 5 54 6 55 7 */ 1 | 2 | 4 | 8 | 16 | 32 | 64 | 128, /* 56 8 57 9 58 : 59 ; 60 < 61 = 62 > 63 ? 
*/ 1 | 2 | 4 | 8 | 16 | 32 | 64 | 0, /* 64 @ 65 A 66 B 67 C 68 D 69 E 70 F 71 G */ 1 | 2 | 4 | 8 | 16 | 32 | 64 | 128, /* 72 H 73 I 74 J 75 K 76 L 77 M 78 N 79 O */ 1 | 2 | 4 | 8 | 16 | 32 | 64 | 128, /* 80 P 81 Q 82 R 83 S 84 T 85 U 86 V 87 W */ 1 | 2 | 4 | 8 | 16 | 32 | 64 | 128, /* 88 X 89 Y 90 Z 91 [ 92 \ 93 ] 94 ^ 95 _ */ 1 | 2 | 4 | 8 | 16 | 32 | 64 | 128, /* 96 ` 97 a 98 b 99 c 100 d 101 e 102 f 103 g */ 1 | 2 | 4 | 8 | 16 | 32 | 64 | 128, /* 104 h 105 i 106 j 107 k 108 l 109 m 110 n 111 o */ 1 | 2 | 4 | 8 | 16 | 32 | 64 | 128, /* 112 p 113 q 114 r 115 s 116 t 117 u 118 v 119 w */ 1 | 2 | 4 | 8 | 16 | 32 | 64 | 128, /* 120 x 121 y 122 z 123 { 124 | 125 } 126 ~ 127 del */ 1 | 2 | 4 | 8 | 16 | 32 | 64 | 0, }; #undef T enum state { s_dead = 1 /* important that this is > 0 */ , s_start_req_or_res , s_res_or_resp_H , s_start_res , s_res_H , s_res_HT , s_res_HTT , s_res_HTTP , s_res_http_major , s_res_http_dot , s_res_http_minor , s_res_http_end , s_res_first_status_code , s_res_status_code , s_res_status_start , s_res_status , s_res_line_almost_done , s_start_req , s_req_method , s_req_spaces_before_url , s_req_schema , s_req_schema_slash , s_req_schema_slash_slash , s_req_server_start , s_req_server , s_req_server_with_at , s_req_path , s_req_query_string_start , s_req_query_string , s_req_fragment_start , s_req_fragment , s_req_http_start , s_req_http_H , s_req_http_HT , s_req_http_HTT , s_req_http_HTTP , s_req_http_major , s_req_http_dot , s_req_http_minor , s_req_http_end , s_req_line_almost_done , s_header_field_start , s_header_field , s_header_value_discard_ws , s_header_value_discard_ws_almost_done , s_header_value_discard_lws , s_header_value_start , s_header_value , s_header_value_lws , s_header_almost_done , s_chunk_size_start , s_chunk_size , s_chunk_parameters , s_chunk_size_almost_done , s_headers_almost_done , s_headers_done /* Important: 's_headers_done' must be the last 'header' state. All * states beyond this must be 'body' states. It is used for overflow * checking. See the PARSING_HEADER() macro. */ , s_chunk_data , s_chunk_data_almost_done , s_chunk_data_done , s_body_identity , s_body_identity_eof , s_message_done }; #define PARSING_HEADER(state) (state <= s_headers_done) enum header_states { h_general = 0 , h_C , h_CO , h_CON , h_matching_connection , h_matching_proxy_connection , h_matching_content_length , h_matching_transfer_encoding , h_matching_upgrade , h_connection , h_content_length , h_content_length_num , h_content_length_ws , h_transfer_encoding , h_upgrade , h_matching_transfer_encoding_chunked , h_matching_connection_token_start , h_matching_connection_keep_alive , h_matching_connection_close , h_matching_connection_upgrade , h_matching_connection_token , h_transfer_encoding_chunked , h_connection_keep_alive , h_connection_close , h_connection_upgrade }; enum http_host_state { s_http_host_dead = 1 , s_http_userinfo_start , s_http_userinfo , s_http_host_start , s_http_host_v6_start , s_http_host , s_http_host_v6 , s_http_host_v6_end , s_http_host_v6_zone_start , s_http_host_v6_zone , s_http_host_port_start , s_http_host_port }; /* Macros for character classes; depends on strict-mode */ #define CR '\r' #define LF '\n' #define LOWER(c) (unsigned char)(c | 0x20) #define IS_ALPHA(c) (LOWER(c) >= 'a' && LOWER(c) <= 'z') #define IS_NUM(c) ((c) >= '0' && (c) <= '9') #define IS_ALPHANUM(c) (IS_ALPHA(c) || IS_NUM(c)) #define IS_HEX(c) (IS_NUM(c) || (LOWER(c) >= 'a' && LOWER(c) <= 'f')) #define IS_MARK(c) ((c) == '-' || (c) == '_' || (c) == '.' || \ (c) == '!' 
|| (c) == '~' || (c) == '*' || (c) == '\'' || (c) == '(' || \ (c) == ')') #define IS_USERINFO_CHAR(c) (IS_ALPHANUM(c) || IS_MARK(c) || (c) == '%' || \ (c) == ';' || (c) == ':' || (c) == '&' || (c) == '=' || (c) == '+' || \ (c) == '$' || (c) == ',') #define STRICT_TOKEN(c) ((c == ' ') ? 0 : tokens[(unsigned char)c]) #if HTTP_PARSER_STRICT #define TOKEN(c) STRICT_TOKEN(c) #define IS_URL_CHAR(c) (BIT_AT(normal_url_char, (unsigned char)c)) #define IS_HOST_CHAR(c) (IS_ALPHANUM(c) || (c) == '.' || (c) == '-') #else #define TOKEN(c) tokens[(unsigned char)c] #define IS_URL_CHAR(c) \ (BIT_AT(normal_url_char, (unsigned char)c) || ((c) & 0x80)) #define IS_HOST_CHAR(c) \ (IS_ALPHANUM(c) || (c) == '.' || (c) == '-' || (c) == '_') #endif /** * Verify that a char is a valid visible (printable) US-ASCII * character or %x80-FF **/ #define IS_HEADER_CHAR(ch) \ (ch == CR || ch == LF || ch == 9 || ((unsigned char)ch > 31 && ch != 127)) #define start_state (parser->type == HTTP_REQUEST ? s_start_req : s_start_res) #if HTTP_PARSER_STRICT # define STRICT_CHECK(cond) \ do { \ if (cond) { \ SET_ERRNO(HPE_STRICT); \ goto error; \ } \ } while (0) # define NEW_MESSAGE() (http_should_keep_alive(parser) ? start_state : s_dead) #else # define STRICT_CHECK(cond) # define NEW_MESSAGE() start_state #endif /* Map errno values to strings for human-readable output */ #define HTTP_STRERROR_GEN(n, s) { "HPE_" #n, s }, static struct { const char *name; const char *description; } http_strerror_tab[] = { HTTP_ERRNO_MAP(HTTP_STRERROR_GEN) }; #undef HTTP_STRERROR_GEN int http_message_needs_eof(const http_parser *parser); /* Our URL parser. * * This is designed to be shared by http_parser_execute() for URL validation, * hence it has a state transition + byte-for-byte interface. In addition, it * is meant to be embedded in http_parser_parse_url(), which does the dirty * work of turning state transitions URL components for its API. * * This function should only be invoked with non-space characters. It is * assumed that the caller cares about (and can detect) the transition between * URL and non-URL states by looking for these. */ static enum state parse_url_char(enum state s, const char ch) { if (ch == ' ' || ch == '\r' || ch == '\n') { return s_dead; } #if HTTP_PARSER_STRICT if (ch == '\t' || ch == '\f') { return s_dead; } #endif switch (s) { case s_req_spaces_before_url: /* Proxied requests are followed by scheme of an absolute URI (alpha). * All methods except CONNECT are followed by '/' or '*'. */ if (ch == '/' || ch == '*') { return s_req_path; } if (IS_ALPHA(ch)) { return s_req_schema; } break; case s_req_schema: if (IS_ALPHA(ch)) { return s; } if (ch == ':') { return s_req_schema_slash; } break; case s_req_schema_slash: if (ch == '/') { return s_req_schema_slash_slash; } break; case s_req_schema_slash_slash: if (ch == '/') { return s_req_server_start; } break; case s_req_server_with_at: if (ch == '@') { return s_dead; } /* fall through */ case s_req_server_start: case s_req_server: if (ch == '/') { return s_req_path; } if (ch == '?') { return s_req_query_string_start; } if (ch == '@') { return s_req_server_with_at; } if (IS_USERINFO_CHAR(ch) || ch == '[' || ch == ']') { return s_req_server; } break; case s_req_path: if (IS_URL_CHAR(ch)) { return s; } switch (ch) { case '?': return s_req_query_string_start; case '#': return s_req_fragment_start; } break; case s_req_query_string_start: case s_req_query_string: if (IS_URL_CHAR(ch)) { return s_req_query_string; } switch (ch) { case '?': /* allow extra '?' 
in query string */ return s_req_query_string; case '#': return s_req_fragment_start; } break; case s_req_fragment_start: if (IS_URL_CHAR(ch)) { return s_req_fragment; } switch (ch) { case '?': return s_req_fragment; case '#': return s; } break; case s_req_fragment: if (IS_URL_CHAR(ch)) { return s; } switch (ch) { case '?': case '#': return s; } break; default: break; } /* We should never fall out of the switch above unless there's an error */ return s_dead; } size_t http_parser_execute (http_parser *parser, const http_parser_settings *settings, const char *data, size_t len) { char c, ch; int8_t unhex_val; const char *p = data; const char *header_field_mark = 0; const char *header_value_mark = 0; const char *url_mark = 0; const char *body_mark = 0; const char *status_mark = 0; enum state p_state = (enum state) parser->state; const unsigned int lenient = parser->lenient_http_headers; uint32_t nread = parser->nread; /* We're in an error state. Don't bother doing anything. */ if (HTTP_PARSER_ERRNO(parser) != HPE_OK) { return 0; } if (len == 0) { switch (CURRENT_STATE()) { case s_body_identity_eof: /* Use of CALLBACK_NOTIFY() here would erroneously return 1 byte read if * we got paused. */ CALLBACK_NOTIFY_NOADVANCE(message_complete); return 0; case s_dead: case s_start_req_or_res: case s_start_res: case s_start_req: return 0; default: SET_ERRNO(HPE_INVALID_EOF_STATE); return 1; } } if (CURRENT_STATE() == s_header_field) header_field_mark = data; if (CURRENT_STATE() == s_header_value) header_value_mark = data; switch (CURRENT_STATE()) { case s_req_path: case s_req_schema: case s_req_schema_slash: case s_req_schema_slash_slash: case s_req_server_start: case s_req_server: case s_req_server_with_at: case s_req_query_string_start: case s_req_query_string: case s_req_fragment_start: case s_req_fragment: url_mark = data; break; case s_res_status: status_mark = data; break; default: break; } for (p=data; p != data + len; p++) { ch = *p; if (PARSING_HEADER(CURRENT_STATE())) COUNT_HEADER_SIZE(1); reexecute: switch (CURRENT_STATE()) { case s_dead: /* this state is used after a 'Connection: close' message * the parser will error out if it reads another message */ if (LIKELY(ch == CR || ch == LF)) break; SET_ERRNO(HPE_CLOSED_CONNECTION); goto error; case s_start_req_or_res: { if (ch == CR || ch == LF) break; parser->flags = 0; parser->content_length = ULLONG_MAX; if (ch == 'H') { UPDATE_STATE(s_res_or_resp_H); CALLBACK_NOTIFY(message_begin); } else { parser->type = HTTP_REQUEST; UPDATE_STATE(s_start_req); REEXECUTE(); } break; } case s_res_or_resp_H: if (ch == 'T') { parser->type = HTTP_RESPONSE; UPDATE_STATE(s_res_HT); } else { if (UNLIKELY(ch != 'E')) { SET_ERRNO(HPE_INVALID_CONSTANT); goto error; } parser->type = HTTP_REQUEST; parser->method = HTTP_HEAD; parser->index = 2; UPDATE_STATE(s_req_method); } break; case s_start_res: { if (ch == CR || ch == LF) break; parser->flags = 0; parser->content_length = ULLONG_MAX; if (ch == 'H') { UPDATE_STATE(s_res_H); } else { SET_ERRNO(HPE_INVALID_CONSTANT); goto error; } CALLBACK_NOTIFY(message_begin); break; } case s_res_H: STRICT_CHECK(ch != 'T'); UPDATE_STATE(s_res_HT); break; case s_res_HT: STRICT_CHECK(ch != 'T'); UPDATE_STATE(s_res_HTT); break; case s_res_HTT: STRICT_CHECK(ch != 'P'); UPDATE_STATE(s_res_HTTP); break; case s_res_HTTP: STRICT_CHECK(ch != '/'); UPDATE_STATE(s_res_http_major); break; case s_res_http_major: if (UNLIKELY(!IS_NUM(ch))) { SET_ERRNO(HPE_INVALID_VERSION); goto error; } parser->http_major = ch - '0'; UPDATE_STATE(s_res_http_dot); break; 
case s_res_http_dot: { if (UNLIKELY(ch != '.')) { SET_ERRNO(HPE_INVALID_VERSION); goto error; } UPDATE_STATE(s_res_http_minor); break; } case s_res_http_minor: if (UNLIKELY(!IS_NUM(ch))) { SET_ERRNO(HPE_INVALID_VERSION); goto error; } parser->http_minor = ch - '0'; UPDATE_STATE(s_res_http_end); break; case s_res_http_end: { if (UNLIKELY(ch != ' ')) { SET_ERRNO(HPE_INVALID_VERSION); goto error; } UPDATE_STATE(s_res_first_status_code); break; } case s_res_first_status_code: { if (!IS_NUM(ch)) { if (ch == ' ') { break; } SET_ERRNO(HPE_INVALID_STATUS); goto error; } parser->status_code = ch - '0'; UPDATE_STATE(s_res_status_code); break; } case s_res_status_code: { if (!IS_NUM(ch)) { switch (ch) { case ' ': UPDATE_STATE(s_res_status_start); break; case CR: case LF: UPDATE_STATE(s_res_status_start); REEXECUTE(); break; default: SET_ERRNO(HPE_INVALID_STATUS); goto error; } break; } parser->status_code *= 10; parser->status_code += ch - '0'; if (UNLIKELY(parser->status_code > 999)) { SET_ERRNO(HPE_INVALID_STATUS); goto error; } break; } case s_res_status_start: { MARK(status); UPDATE_STATE(s_res_status); parser->index = 0; if (ch == CR || ch == LF) REEXECUTE(); break; } case s_res_status: if (ch == CR) { UPDATE_STATE(s_res_line_almost_done); CALLBACK_DATA(status); break; } if (ch == LF) { UPDATE_STATE(s_header_field_start); CALLBACK_DATA(status); break; } break; case s_res_line_almost_done: STRICT_CHECK(ch != LF); UPDATE_STATE(s_header_field_start); break; case s_start_req: { if (ch == CR || ch == LF) break; parser->flags = 0; parser->content_length = ULLONG_MAX; if (UNLIKELY(!IS_ALPHA(ch))) { SET_ERRNO(HPE_INVALID_METHOD); goto error; } parser->method = (enum http_method) 0; parser->index = 1; switch (ch) { case 'A': parser->method = HTTP_ACL; break; case 'B': parser->method = HTTP_BIND; break; case 'C': parser->method = HTTP_CONNECT; /* or COPY, CHECKOUT */ break; case 'D': parser->method = HTTP_DELETE; break; case 'G': parser->method = HTTP_GET; break; case 'H': parser->method = HTTP_HEAD; break; case 'L': parser->method = HTTP_LOCK; /* or LINK */ break; case 'M': parser->method = HTTP_MKCOL; /* or MOVE, MKACTIVITY, MERGE, M-SEARCH, MKCALENDAR */ break; case 'N': parser->method = HTTP_NOTIFY; break; case 'O': parser->method = HTTP_OPTIONS; break; case 'P': parser->method = HTTP_POST; /* or PROPFIND|PROPPATCH|PUT|PATCH|PURGE */ break; case 'R': parser->method = HTTP_REPORT; /* or REBIND */ break; case 'S': parser->method = HTTP_SUBSCRIBE; /* or SEARCH, SOURCE */ break; case 'T': parser->method = HTTP_TRACE; break; case 'U': parser->method = HTTP_UNLOCK; /* or UNSUBSCRIBE, UNBIND, UNLINK */ break; default: SET_ERRNO(HPE_INVALID_METHOD); goto error; } UPDATE_STATE(s_req_method); CALLBACK_NOTIFY(message_begin); break; } case s_req_method: { const char *matcher; if (UNLIKELY(ch == '\0')) { SET_ERRNO(HPE_INVALID_METHOD); goto error; } matcher = method_strings[parser->method]; if (ch == ' ' && matcher[parser->index] == '\0') { UPDATE_STATE(s_req_spaces_before_url); } else if (ch == matcher[parser->index]) { ; /* nada */ } else if ((ch >= 'A' && ch <= 'Z') || ch == '-') { switch (parser->method << 16 | parser->index << 8 | ch) { #define XX(meth, pos, ch, new_meth) \ case (HTTP_##meth << 16 | pos << 8 | ch): \ parser->method = HTTP_##new_meth; break; XX(POST, 1, 'U', PUT) XX(POST, 1, 'A', PATCH) XX(POST, 1, 'R', PROPFIND) XX(PUT, 2, 'R', PURGE) XX(CONNECT, 1, 'H', CHECKOUT) XX(CONNECT, 2, 'P', COPY) XX(MKCOL, 1, 'O', MOVE) XX(MKCOL, 1, 'E', MERGE) XX(MKCOL, 1, '-', MSEARCH) XX(MKCOL, 2, 'A', MKACTIVITY) 
XX(MKCOL, 3, 'A', MKCALENDAR) XX(SUBSCRIBE, 1, 'E', SEARCH) XX(SUBSCRIBE, 1, 'O', SOURCE) XX(REPORT, 2, 'B', REBIND) XX(PROPFIND, 4, 'P', PROPPATCH) XX(LOCK, 1, 'I', LINK) XX(UNLOCK, 2, 'S', UNSUBSCRIBE) XX(UNLOCK, 2, 'B', UNBIND) XX(UNLOCK, 3, 'I', UNLINK) #undef XX default: SET_ERRNO(HPE_INVALID_METHOD); goto error; } } else { SET_ERRNO(HPE_INVALID_METHOD); goto error; } ++parser->index; break; } case s_req_spaces_before_url: { if (ch == ' ') break; MARK(url); if (parser->method == HTTP_CONNECT) { UPDATE_STATE(s_req_server_start); } UPDATE_STATE(parse_url_char(CURRENT_STATE(), ch)); if (UNLIKELY(CURRENT_STATE() == s_dead)) { SET_ERRNO(HPE_INVALID_URL); goto error; } break; } case s_req_schema: case s_req_schema_slash: case s_req_schema_slash_slash: case s_req_server_start: { switch (ch) { /* No whitespace allowed here */ case ' ': case CR: case LF: SET_ERRNO(HPE_INVALID_URL); goto error; default: UPDATE_STATE(parse_url_char(CURRENT_STATE(), ch)); if (UNLIKELY(CURRENT_STATE() == s_dead)) { SET_ERRNO(HPE_INVALID_URL); goto error; } } break; } case s_req_server: case s_req_server_with_at: case s_req_path: case s_req_query_string_start: case s_req_query_string: case s_req_fragment_start: case s_req_fragment: { switch (ch) { case ' ': UPDATE_STATE(s_req_http_start); CALLBACK_DATA(url); break; case CR: case LF: parser->http_major = 0; parser->http_minor = 9; UPDATE_STATE((ch == CR) ? s_req_line_almost_done : s_header_field_start); CALLBACK_DATA(url); break; default: UPDATE_STATE(parse_url_char(CURRENT_STATE(), ch)); if (UNLIKELY(CURRENT_STATE() == s_dead)) { SET_ERRNO(HPE_INVALID_URL); goto error; } } break; } case s_req_http_start: switch (ch) { case 'H': UPDATE_STATE(s_req_http_H); break; case ' ': break; default: SET_ERRNO(HPE_INVALID_CONSTANT); goto error; } break; case s_req_http_H: STRICT_CHECK(ch != 'T'); UPDATE_STATE(s_req_http_HT); break; case s_req_http_HT: STRICT_CHECK(ch != 'T'); UPDATE_STATE(s_req_http_HTT); break; case s_req_http_HTT: STRICT_CHECK(ch != 'P'); UPDATE_STATE(s_req_http_HTTP); break; case s_req_http_HTTP: STRICT_CHECK(ch != '/'); UPDATE_STATE(s_req_http_major); break; case s_req_http_major: if (UNLIKELY(!IS_NUM(ch))) { SET_ERRNO(HPE_INVALID_VERSION); goto error; } parser->http_major = ch - '0'; UPDATE_STATE(s_req_http_dot); break; case s_req_http_dot: { if (UNLIKELY(ch != '.')) { SET_ERRNO(HPE_INVALID_VERSION); goto error; } UPDATE_STATE(s_req_http_minor); break; } case s_req_http_minor: if (UNLIKELY(!IS_NUM(ch))) { SET_ERRNO(HPE_INVALID_VERSION); goto error; } parser->http_minor = ch - '0'; UPDATE_STATE(s_req_http_end); break; case s_req_http_end: { if (ch == CR) { UPDATE_STATE(s_req_line_almost_done); break; } if (ch == LF) { UPDATE_STATE(s_header_field_start); break; } SET_ERRNO(HPE_INVALID_VERSION); goto error; break; } /* end of request line */ case s_req_line_almost_done: { if (UNLIKELY(ch != LF)) { SET_ERRNO(HPE_LF_EXPECTED); goto error; } UPDATE_STATE(s_header_field_start); break; } case s_header_field_start: { if (ch == CR) { UPDATE_STATE(s_headers_almost_done); break; } if (ch == LF) { /* they might be just sending \n instead of \r\n so this would be * the second \n to denote the end of headers*/ UPDATE_STATE(s_headers_almost_done); REEXECUTE(); } c = TOKEN(ch); if (UNLIKELY(!c)) { SET_ERRNO(HPE_INVALID_HEADER_TOKEN); goto error; } MARK(header_field); parser->index = 0; UPDATE_STATE(s_header_field); switch (c) { case 'c': parser->header_state = h_C; break; case 'p': parser->header_state = h_matching_proxy_connection; break; case 't': parser->header_state = 
h_matching_transfer_encoding; break; case 'u': parser->header_state = h_matching_upgrade; break; default: parser->header_state = h_general; break; } break; } case s_header_field: { const char* start = p; for (; p != data + len; p++) { ch = *p; c = TOKEN(ch); if (!c) break; switch (parser->header_state) { case h_general: { size_t limit = data + len - p; limit = MIN(limit, HTTP_MAX_HEADER_SIZE); while (p+1 < data + limit && TOKEN(p[1])) { p++; } break; } case h_C: parser->index++; parser->header_state = (c == 'o' ? h_CO : h_general); break; case h_CO: parser->index++; parser->header_state = (c == 'n' ? h_CON : h_general); break; case h_CON: parser->index++; switch (c) { case 'n': parser->header_state = h_matching_connection; break; case 't': parser->header_state = h_matching_content_length; break; default: parser->header_state = h_general; break; } break; /* connection */ case h_matching_connection: parser->index++; if (parser->index > sizeof(CONNECTION)-1 || c != CONNECTION[parser->index]) { parser->header_state = h_general; } else if (parser->index == sizeof(CONNECTION)-2) { parser->header_state = h_connection; } break; /* proxy-connection */ case h_matching_proxy_connection: parser->index++; if (parser->index > sizeof(PROXY_CONNECTION)-1 || c != PROXY_CONNECTION[parser->index]) { parser->header_state = h_general; } else if (parser->index == sizeof(PROXY_CONNECTION)-2) { parser->header_state = h_connection; } break; /* content-length */ case h_matching_content_length: parser->index++; if (parser->index > sizeof(CONTENT_LENGTH)-1 || c != CONTENT_LENGTH[parser->index]) { parser->header_state = h_general; } else if (parser->index == sizeof(CONTENT_LENGTH)-2) { parser->header_state = h_content_length; } break; /* transfer-encoding */ case h_matching_transfer_encoding: parser->index++; if (parser->index > sizeof(TRANSFER_ENCODING)-1 || c != TRANSFER_ENCODING[parser->index]) { parser->header_state = h_general; } else if (parser->index == sizeof(TRANSFER_ENCODING)-2) { parser->header_state = h_transfer_encoding; } break; /* upgrade */ case h_matching_upgrade: parser->index++; if (parser->index > sizeof(UPGRADE)-1 || c != UPGRADE[parser->index]) { parser->header_state = h_general; } else if (parser->index == sizeof(UPGRADE)-2) { parser->header_state = h_upgrade; } break; case h_connection: case h_content_length: case h_transfer_encoding: case h_upgrade: if (ch != ' ') parser->header_state = h_general; break; default: assert(0 && "Unknown header_state"); break; } } if (p == data + len) { --p; COUNT_HEADER_SIZE(p - start); break; } COUNT_HEADER_SIZE(p - start); if (ch == ':') { UPDATE_STATE(s_header_value_discard_ws); CALLBACK_DATA(header_field); break; } SET_ERRNO(HPE_INVALID_HEADER_TOKEN); goto error; } case s_header_value_discard_ws: if (ch == ' ' || ch == '\t') break; if (ch == CR) { UPDATE_STATE(s_header_value_discard_ws_almost_done); break; } if (ch == LF) { UPDATE_STATE(s_header_value_discard_lws); break; } /* fall through */ case s_header_value_start: { MARK(header_value); UPDATE_STATE(s_header_value); parser->index = 0; c = LOWER(ch); switch (parser->header_state) { case h_upgrade: parser->flags |= F_UPGRADE; parser->header_state = h_general; break; case h_transfer_encoding: /* looking for 'Transfer-Encoding: chunked' */ if ('c' == c) { parser->header_state = h_matching_transfer_encoding_chunked; } else { parser->header_state = h_general; } break; case h_content_length: if (UNLIKELY(!IS_NUM(ch))) { SET_ERRNO(HPE_INVALID_CONTENT_LENGTH); goto error; } if (parser->flags & F_CONTENTLENGTH) { 
SET_ERRNO(HPE_UNEXPECTED_CONTENT_LENGTH); goto error; } parser->flags |= F_CONTENTLENGTH; parser->content_length = ch - '0'; parser->header_state = h_content_length_num; break; case h_connection: /* looking for 'Connection: keep-alive' */ if (c == 'k') { parser->header_state = h_matching_connection_keep_alive; /* looking for 'Connection: close' */ } else if (c == 'c') { parser->header_state = h_matching_connection_close; } else if (c == 'u') { parser->header_state = h_matching_connection_upgrade; } else { parser->header_state = h_matching_connection_token; } break; /* Multi-value `Connection` header */ case h_matching_connection_token_start: break; default: parser->header_state = h_general; break; } break; } case s_header_value: { const char* start = p; enum header_states h_state = (enum header_states) parser->header_state; for (; p != data + len; p++) { ch = *p; if (ch == CR) { UPDATE_STATE(s_header_almost_done); parser->header_state = h_state; CALLBACK_DATA(header_value); break; } if (ch == LF) { UPDATE_STATE(s_header_almost_done); COUNT_HEADER_SIZE(p - start); parser->header_state = h_state; CALLBACK_DATA_NOADVANCE(header_value); REEXECUTE(); } if (!lenient && !IS_HEADER_CHAR(ch)) { SET_ERRNO(HPE_INVALID_HEADER_TOKEN); goto error; } c = LOWER(ch); switch (h_state) { case h_general: { const char* p_cr; const char* p_lf; size_t limit = data + len - p; limit = MIN(limit, HTTP_MAX_HEADER_SIZE); p_cr = (const char*) memchr(p, CR, limit); p_lf = (const char*) memchr(p, LF, limit); if (p_cr != NULL) { if (p_lf != NULL && p_cr >= p_lf) p = p_lf; else p = p_cr; } else if (UNLIKELY(p_lf != NULL)) { p = p_lf; } else { p = data + len; } --p; break; } case h_connection: case h_transfer_encoding: assert(0 && "Shouldn't get here."); break; case h_content_length: if (ch == ' ') break; h_state = h_content_length_num; /* fall through */ case h_content_length_num: { uint64_t t; if (ch == ' ') { h_state = h_content_length_ws; break; } if (UNLIKELY(!IS_NUM(ch))) { SET_ERRNO(HPE_INVALID_CONTENT_LENGTH); parser->header_state = h_state; goto error; } t = parser->content_length; t *= 10; t += ch - '0'; /* Overflow? Test against a conservative limit for simplicity. 
*/ if (UNLIKELY((ULLONG_MAX - 10) / 10 < parser->content_length)) { SET_ERRNO(HPE_INVALID_CONTENT_LENGTH); parser->header_state = h_state; goto error; } parser->content_length = t; break; } case h_content_length_ws: if (ch == ' ') break; SET_ERRNO(HPE_INVALID_CONTENT_LENGTH); parser->header_state = h_state; goto error; /* Transfer-Encoding: chunked */ case h_matching_transfer_encoding_chunked: parser->index++; if (parser->index > sizeof(CHUNKED)-1 || c != CHUNKED[parser->index]) { h_state = h_general; } else if (parser->index == sizeof(CHUNKED)-2) { h_state = h_transfer_encoding_chunked; } break; case h_matching_connection_token_start: /* looking for 'Connection: keep-alive' */ if (c == 'k') { h_state = h_matching_connection_keep_alive; /* looking for 'Connection: close' */ } else if (c == 'c') { h_state = h_matching_connection_close; } else if (c == 'u') { h_state = h_matching_connection_upgrade; } else if (STRICT_TOKEN(c)) { h_state = h_matching_connection_token; } else if (c == ' ' || c == '\t') { /* Skip lws */ } else { h_state = h_general; } break; /* looking for 'Connection: keep-alive' */ case h_matching_connection_keep_alive: parser->index++; if (parser->index > sizeof(KEEP_ALIVE)-1 || c != KEEP_ALIVE[parser->index]) { h_state = h_matching_connection_token; } else if (parser->index == sizeof(KEEP_ALIVE)-2) { h_state = h_connection_keep_alive; } break; /* looking for 'Connection: close' */ case h_matching_connection_close: parser->index++; if (parser->index > sizeof(CLOSE)-1 || c != CLOSE[parser->index]) { h_state = h_matching_connection_token; } else if (parser->index == sizeof(CLOSE)-2) { h_state = h_connection_close; } break; /* looking for 'Connection: upgrade' */ case h_matching_connection_upgrade: parser->index++; if (parser->index > sizeof(UPGRADE) - 1 || c != UPGRADE[parser->index]) { h_state = h_matching_connection_token; } else if (parser->index == sizeof(UPGRADE)-2) { h_state = h_connection_upgrade; } break; case h_matching_connection_token: if (ch == ',') { h_state = h_matching_connection_token_start; parser->index = 0; } break; case h_transfer_encoding_chunked: if (ch != ' ') h_state = h_general; break; case h_connection_keep_alive: case h_connection_close: case h_connection_upgrade: if (ch == ',') { if (h_state == h_connection_keep_alive) { parser->flags |= F_CONNECTION_KEEP_ALIVE; } else if (h_state == h_connection_close) { parser->flags |= F_CONNECTION_CLOSE; } else if (h_state == h_connection_upgrade) { parser->flags |= F_CONNECTION_UPGRADE; } h_state = h_matching_connection_token_start; parser->index = 0; } else if (ch != ' ') { h_state = h_matching_connection_token; } break; default: UPDATE_STATE(s_header_value); h_state = h_general; break; } } parser->header_state = h_state; if (p == data + len) --p; COUNT_HEADER_SIZE(p - start); break; } case s_header_almost_done: { if (UNLIKELY(ch != LF)) { SET_ERRNO(HPE_LF_EXPECTED); goto error; } UPDATE_STATE(s_header_value_lws); break; } case s_header_value_lws: { if (ch == ' ' || ch == '\t') { UPDATE_STATE(s_header_value_start); REEXECUTE(); } /* finished the header */ switch (parser->header_state) { case h_connection_keep_alive: parser->flags |= F_CONNECTION_KEEP_ALIVE; break; case h_connection_close: parser->flags |= F_CONNECTION_CLOSE; break; case h_transfer_encoding_chunked: parser->flags |= F_CHUNKED; break; case h_connection_upgrade: parser->flags |= F_CONNECTION_UPGRADE; break; default: break; } UPDATE_STATE(s_header_field_start); REEXECUTE(); } case s_header_value_discard_ws_almost_done: { STRICT_CHECK(ch != LF); 
UPDATE_STATE(s_header_value_discard_lws); break; } case s_header_value_discard_lws: { if (ch == ' ' || ch == '\t') { UPDATE_STATE(s_header_value_discard_ws); break; } else { switch (parser->header_state) { case h_connection_keep_alive: parser->flags |= F_CONNECTION_KEEP_ALIVE; break; case h_connection_close: parser->flags |= F_CONNECTION_CLOSE; break; case h_connection_upgrade: parser->flags |= F_CONNECTION_UPGRADE; break; case h_transfer_encoding_chunked: parser->flags |= F_CHUNKED; break; default: break; } /* header value was empty */ MARK(header_value); UPDATE_STATE(s_header_field_start); CALLBACK_DATA_NOADVANCE(header_value); REEXECUTE(); } } case s_headers_almost_done: { STRICT_CHECK(ch != LF); if (parser->flags & F_TRAILING) { /* End of a chunked request */ UPDATE_STATE(s_message_done); CALLBACK_NOTIFY_NOADVANCE(chunk_complete); REEXECUTE(); } /* Cannot use chunked encoding and a content-length header together per the HTTP specification. */ if ((parser->flags & F_CHUNKED) && (parser->flags & F_CONTENTLENGTH)) { SET_ERRNO(HPE_UNEXPECTED_CONTENT_LENGTH); goto error; } UPDATE_STATE(s_headers_done); /* Set this here so that on_headers_complete() callbacks can see it */ if ((parser->flags & F_UPGRADE) && (parser->flags & F_CONNECTION_UPGRADE)) { /* For responses, "Upgrade: foo" and "Connection: upgrade" are * mandatory only when it is a 101 Switching Protocols response, * otherwise it is purely informational, to announce support. */ parser->upgrade = (parser->type == HTTP_REQUEST || parser->status_code == 101); } else { parser->upgrade = (parser->method == HTTP_CONNECT); } /* Here we call the headers_complete callback. This is somewhat * different than other callbacks because if the user returns 1, we * will interpret that as saying that this message has no body. This * is needed for the annoying case of recieving a response to a HEAD * request. * * We'd like to use CALLBACK_NOTIFY_NOADVANCE() here but we cannot, so * we have to simulate it by handling a change in errno below. */ if (settings->on_headers_complete) { switch (settings->on_headers_complete(parser)) { case 0: break; case 2: parser->upgrade = 1; /* fall through */ case 1: parser->flags |= F_SKIPBODY; break; default: SET_ERRNO(HPE_CB_headers_complete); RETURN(p - data); /* Error */ } } if (HTTP_PARSER_ERRNO(parser) != HPE_OK) { RETURN(p - data); } REEXECUTE(); } case s_headers_done: { int hasBody; STRICT_CHECK(ch != LF); parser->nread = 0; nread = 0; hasBody = parser->flags & F_CHUNKED || (parser->content_length > 0 && parser->content_length != ULLONG_MAX); if (parser->upgrade && (parser->method == HTTP_CONNECT || (parser->flags & F_SKIPBODY) || !hasBody)) { /* Exit, the rest of the message is in a different protocol. 
*/ UPDATE_STATE(NEW_MESSAGE()); CALLBACK_NOTIFY(message_complete); RETURN((p - data) + 1); } if (parser->flags & F_SKIPBODY) { UPDATE_STATE(NEW_MESSAGE()); CALLBACK_NOTIFY(message_complete); } else if (parser->flags & F_CHUNKED) { /* chunked encoding - ignore Content-Length header */ UPDATE_STATE(s_chunk_size_start); } else { if (parser->content_length == 0) { /* Content-Length header given but zero: Content-Length: 0\r\n */ UPDATE_STATE(NEW_MESSAGE()); CALLBACK_NOTIFY(message_complete); } else if (parser->content_length != ULLONG_MAX) { /* Content-Length header given and non-zero */ UPDATE_STATE(s_body_identity); } else { if (!http_message_needs_eof(parser)) { /* Assume content-length 0 - read the next */ UPDATE_STATE(NEW_MESSAGE()); CALLBACK_NOTIFY(message_complete); } else { /* Read body until EOF */ UPDATE_STATE(s_body_identity_eof); } } } break; } case s_body_identity: { uint64_t to_read = MIN(parser->content_length, (uint64_t) ((data + len) - p)); assert(parser->content_length != 0 && parser->content_length != ULLONG_MAX); /* The difference between advancing content_length and p is because * the latter will automaticaly advance on the next loop iteration. * Further, if content_length ends up at 0, we want to see the last * byte again for our message complete callback. */ MARK(body); parser->content_length -= to_read; p += to_read - 1; if (parser->content_length == 0) { UPDATE_STATE(s_message_done); /* Mimic CALLBACK_DATA_NOADVANCE() but with one extra byte. * * The alternative to doing this is to wait for the next byte to * trigger the data callback, just as in every other case. The * problem with this is that this makes it difficult for the test * harness to distinguish between complete-on-EOF and * complete-on-length. It's not clear that this distinction is * important for applications, but let's keep it for now. */ CALLBACK_DATA_(body, p - body_mark + 1, p - data); REEXECUTE(); } break; } /* read until EOF */ case s_body_identity_eof: MARK(body); p = data + len - 1; break; case s_message_done: UPDATE_STATE(NEW_MESSAGE()); CALLBACK_NOTIFY(message_complete); if (parser->upgrade) { /* Exit, the rest of the message is in a different protocol. */ RETURN((p - data) + 1); } break; case s_chunk_size_start: { assert(nread == 1); assert(parser->flags & F_CHUNKED); unhex_val = unhex[(unsigned char)ch]; if (UNLIKELY(unhex_val == -1)) { SET_ERRNO(HPE_INVALID_CHUNK_SIZE); goto error; } parser->content_length = unhex_val; UPDATE_STATE(s_chunk_size); break; } case s_chunk_size: { uint64_t t; assert(parser->flags & F_CHUNKED); if (ch == CR) { UPDATE_STATE(s_chunk_size_almost_done); break; } unhex_val = unhex[(unsigned char)ch]; if (unhex_val == -1) { if (ch == ';' || ch == ' ') { UPDATE_STATE(s_chunk_parameters); break; } SET_ERRNO(HPE_INVALID_CHUNK_SIZE); goto error; } t = parser->content_length; t *= 16; t += unhex_val; /* Overflow? Test against a conservative limit for simplicity. */ if (UNLIKELY((ULLONG_MAX - 16) / 16 < parser->content_length)) { SET_ERRNO(HPE_INVALID_CONTENT_LENGTH); goto error; } parser->content_length = t; break; } case s_chunk_parameters: { assert(parser->flags & F_CHUNKED); /* just ignore this shit. 
TODO check for overflow */ if (ch == CR) { UPDATE_STATE(s_chunk_size_almost_done); break; } break; } case s_chunk_size_almost_done: { assert(parser->flags & F_CHUNKED); STRICT_CHECK(ch != LF); parser->nread = 0; nread = 0; if (parser->content_length == 0) { parser->flags |= F_TRAILING; UPDATE_STATE(s_header_field_start); } else { UPDATE_STATE(s_chunk_data); } CALLBACK_NOTIFY(chunk_header); break; } case s_chunk_data: { uint64_t to_read = MIN(parser->content_length, (uint64_t) ((data + len) - p)); assert(parser->flags & F_CHUNKED); assert(parser->content_length != 0 && parser->content_length != ULLONG_MAX); /* See the explanation in s_body_identity for why the content * length and data pointers are managed this way. */ MARK(body); parser->content_length -= to_read; p += to_read - 1; if (parser->content_length == 0) { UPDATE_STATE(s_chunk_data_almost_done); } break; } case s_chunk_data_almost_done: assert(parser->flags & F_CHUNKED); assert(parser->content_length == 0); STRICT_CHECK(ch != CR); UPDATE_STATE(s_chunk_data_done); CALLBACK_DATA(body); break; case s_chunk_data_done: assert(parser->flags & F_CHUNKED); STRICT_CHECK(ch != LF); parser->nread = 0; nread = 0; UPDATE_STATE(s_chunk_size_start); CALLBACK_NOTIFY(chunk_complete); break; default: assert(0 && "unhandled state"); SET_ERRNO(HPE_INVALID_INTERNAL_STATE); goto error; } } /* Run callbacks for any marks that we have leftover after we ran out of * bytes. There should be at most one of these set, so it's OK to invoke * them in series (unset marks will not result in callbacks). * * We use the NOADVANCE() variety of callbacks here because 'p' has already * overflowed 'data' and this allows us to correct for the off-by-one that * we'd otherwise have (since CALLBACK_DATA() is meant to be run with a 'p' * value that's in-bounds). */ assert(((header_field_mark ? 1 : 0) + (header_value_mark ? 1 : 0) + (url_mark ? 1 : 0) + (body_mark ? 1 : 0) + (status_mark ? 1 : 0)) <= 1); CALLBACK_DATA_NOADVANCE(header_field); CALLBACK_DATA_NOADVANCE(header_value); CALLBACK_DATA_NOADVANCE(url); CALLBACK_DATA_NOADVANCE(body); CALLBACK_DATA_NOADVANCE(status); RETURN(len); error: if (HTTP_PARSER_ERRNO(parser) == HPE_OK) { SET_ERRNO(HPE_UNKNOWN); } RETURN(p - data); } /* Does the parser need to see an EOF to find the end of the message? */ int http_message_needs_eof (const http_parser *parser) { if (parser->type == HTTP_REQUEST) { return 0; } /* See RFC 2616 section 4.4 */ if (parser->status_code / 100 == 1 || /* 1xx e.g. 
Continue */ parser->status_code == 204 || /* No Content */ parser->status_code == 304 || /* Not Modified */ parser->flags & F_SKIPBODY) { /* response to a HEAD request */ return 0; } if ((parser->flags & F_CHUNKED) || parser->content_length != ULLONG_MAX) { return 0; } return 1; } int http_should_keep_alive (const http_parser *parser) { if (parser->http_major > 0 && parser->http_minor > 0) { /* HTTP/1.1 */ if (parser->flags & F_CONNECTION_CLOSE) { return 0; } } else { /* HTTP/1.0 or earlier */ if (!(parser->flags & F_CONNECTION_KEEP_ALIVE)) { return 0; } } return !http_message_needs_eof(parser); } const char * http_method_str (enum http_method m) { return ELEM_AT(method_strings, m, ""); } const char * http_status_str (enum http_status s) { switch (s) { #define XX(num, name, string) case HTTP_STATUS_##name: return #string; HTTP_STATUS_MAP(XX) #undef XX default: return ""; } } void http_parser_init (http_parser *parser, enum http_parser_type t) { void *data = parser->data; /* preserve application data */ memset(parser, 0, sizeof(*parser)); parser->data = data; parser->type = t; parser->state = (t == HTTP_REQUEST ? s_start_req : (t == HTTP_RESPONSE ? s_start_res : s_start_req_or_res)); parser->http_errno = HPE_OK; } void http_parser_settings_init(http_parser_settings *settings) { memset(settings, 0, sizeof(*settings)); } const char * http_errno_name(enum http_errno err) { assert(((size_t) err) < ARRAY_SIZE(http_strerror_tab)); return http_strerror_tab[err].name; } const char * http_errno_description(enum http_errno err) { assert(((size_t) err) < ARRAY_SIZE(http_strerror_tab)); return http_strerror_tab[err].description; } static enum http_host_state http_parse_host_char(enum http_host_state s, const char ch) { switch(s) { case s_http_userinfo: case s_http_userinfo_start: if (ch == '@') { return s_http_host_start; } if (IS_USERINFO_CHAR(ch)) { return s_http_userinfo; } break; case s_http_host_start: if (ch == '[') { return s_http_host_v6_start; } if (IS_HOST_CHAR(ch)) { return s_http_host; } break; case s_http_host: if (IS_HOST_CHAR(ch)) { return s_http_host; } /* fall through */ case s_http_host_v6_end: if (ch == ':') { return s_http_host_port_start; } break; case s_http_host_v6: if (ch == ']') { return s_http_host_v6_end; } /* fall through */ case s_http_host_v6_start: if (IS_HEX(ch) || ch == ':' || ch == '.') { return s_http_host_v6; } if (s == s_http_host_v6 && ch == '%') { return s_http_host_v6_zone_start; } break; case s_http_host_v6_zone: if (ch == ']') { return s_http_host_v6_end; } /* fall through */ case s_http_host_v6_zone_start: /* RFC 6874 Zone ID consists of 1*( unreserved / pct-encoded) */ if (IS_ALPHANUM(ch) || ch == '%' || ch == '.' || ch == '-' || ch == '_' || ch == '~') { return s_http_host_v6_zone; } break; case s_http_host_port: case s_http_host_port_start: if (IS_NUM(ch)) { return s_http_host_port; } break; default: break; } return s_http_host_dead; } static int http_parse_host(const char * buf, struct http_parser_url *u, int found_at) { enum http_host_state s; const char *p; size_t buflen = u->field_data[UF_HOST].off + u->field_data[UF_HOST].len; assert(u->field_set & (1 << UF_HOST)); u->field_data[UF_HOST].len = 0; s = found_at ? 
s_http_userinfo_start : s_http_host_start; for (p = buf + u->field_data[UF_HOST].off; p < buf + buflen; p++) { enum http_host_state new_s = http_parse_host_char(s, *p); if (new_s == s_http_host_dead) { return 1; } switch(new_s) { case s_http_host: if (s != s_http_host) { u->field_data[UF_HOST].off = p - buf; } u->field_data[UF_HOST].len++; break; case s_http_host_v6: if (s != s_http_host_v6) { u->field_data[UF_HOST].off = p - buf; } u->field_data[UF_HOST].len++; break; case s_http_host_v6_zone_start: case s_http_host_v6_zone: u->field_data[UF_HOST].len++; break; case s_http_host_port: if (s != s_http_host_port) { u->field_data[UF_PORT].off = p - buf; u->field_data[UF_PORT].len = 0; u->field_set |= (1 << UF_PORT); } u->field_data[UF_PORT].len++; break; case s_http_userinfo: if (s != s_http_userinfo) { u->field_data[UF_USERINFO].off = p - buf ; u->field_data[UF_USERINFO].len = 0; u->field_set |= (1 << UF_USERINFO); } u->field_data[UF_USERINFO].len++; break; default: break; } s = new_s; } /* Make sure we don't end somewhere unexpected */ switch (s) { case s_http_host_start: case s_http_host_v6_start: case s_http_host_v6: case s_http_host_v6_zone_start: case s_http_host_v6_zone: case s_http_host_port_start: case s_http_userinfo: case s_http_userinfo_start: return 1; default: break; } return 0; } void http_parser_url_init(struct http_parser_url *u) { memset(u, 0, sizeof(*u)); } int http_parser_parse_url(const char *buf, size_t buflen, int is_connect, struct http_parser_url *u) { enum state s; const char *p; enum http_parser_url_fields uf, old_uf; int found_at = 0; if (buflen == 0) { return 1; } u->port = u->field_set = 0; s = is_connect ? s_req_server_start : s_req_spaces_before_url; old_uf = UF_MAX; for (p = buf; p < buf + buflen; p++) { s = parse_url_char(s, *p); /* Figure out the next field that we're operating on */ switch (s) { case s_dead: return 1; /* Skip delimeters */ case s_req_schema_slash: case s_req_schema_slash_slash: case s_req_server_start: case s_req_query_string_start: case s_req_fragment_start: continue; case s_req_schema: uf = UF_SCHEMA; break; case s_req_server_with_at: found_at = 1; /* fall through */ case s_req_server: uf = UF_HOST; break; case s_req_path: uf = UF_PATH; break; case s_req_query_string: uf = UF_QUERY; break; case s_req_fragment: uf = UF_FRAGMENT; break; default: assert(!"Unexpected state"); return 1; } /* Nothing's changed; soldier on */ if (uf == old_uf) { u->field_data[uf].len++; continue; } u->field_data[uf].off = p - buf; u->field_data[uf].len = 1; u->field_set |= (1 << uf); old_uf = uf; } /* host must be present if there is a schema */ /* parsing http:///toto will fail */ if ((u->field_set & (1 << UF_SCHEMA)) && (u->field_set & (1 << UF_HOST)) == 0) { return 1; } if (u->field_set & (1 << UF_HOST)) { if (http_parse_host(buf, u, found_at) != 0) { return 1; } } /* CONNECT requests can only contain "hostname:port" */ if (is_connect && u->field_set != ((1 << UF_HOST)|(1 << UF_PORT))) { return 1; } if (u->field_set & (1 << UF_PORT)) { uint16_t off; uint16_t len; const char* p; const char* end; unsigned long v; off = u->field_data[UF_PORT].off; len = u->field_data[UF_PORT].len; end = buf + off + len; /* NOTE: The characters are already validated and are in the [0-9] range */ assert(off + len <= buflen && "Port number overflow"); v = 0; for (p = buf + off; p < end; p++) { v *= 10; v += *p - '0'; /* Ports have a max value of 2^16 */ if (v > 0xffff) { return 1; } } u->port = (uint16_t) v; } return 0; } void http_parser_pause(http_parser *parser, int paused) { /* 
Users should only be pausing/unpausing a parser that is not in an error * state. In non-debug builds, there's not much that we can do about this * other than ignore it. */ if (HTTP_PARSER_ERRNO(parser) == HPE_OK || HTTP_PARSER_ERRNO(parser) == HPE_PAUSED) { uint32_t nread = parser->nread; /* used by the SET_ERRNO macro */ SET_ERRNO((paused) ? HPE_PAUSED : HPE_OK); } else { assert(0 && "Attempting to pause parser in error state"); } } int http_body_is_final(const struct http_parser *parser) { return parser->state == s_message_done; } unsigned long http_parser_version(void) { return HTTP_PARSER_VERSION_MAJOR * 0x10000 | HTTP_PARSER_VERSION_MINOR * 0x00100 | HTTP_PARSER_VERSION_PATCH * 0x00001; } aiohttp-3.6.2/vendor/http-parser/http_parser.gyp0000644000175100001650000000544713547410117022321 0ustar vstsdocker00000000000000# This file is used with the GYP meta build system. # http://code.google.com/p/gyp/ # To build try this: # svn co http://gyp.googlecode.com/svn/trunk gyp # ./gyp/gyp -f make --depth=`pwd` http_parser.gyp # ./out/Debug/test { 'target_defaults': { 'default_configuration': 'Debug', 'configurations': { # TODO: hoist these out and put them somewhere common, because # RuntimeLibrary MUST MATCH across the entire project 'Debug': { 'defines': [ 'DEBUG', '_DEBUG' ], 'cflags': [ '-Wall', '-Wextra', '-O0', '-g', '-ftrapv' ], 'msvs_settings': { 'VCCLCompilerTool': { 'RuntimeLibrary': 1, # static debug }, }, }, 'Release': { 'defines': [ 'NDEBUG' ], 'cflags': [ '-Wall', '-Wextra', '-O3' ], 'msvs_settings': { 'VCCLCompilerTool': { 'RuntimeLibrary': 0, # static release }, }, } }, 'msvs_settings': { 'VCCLCompilerTool': { }, 'VCLibrarianTool': { }, 'VCLinkerTool': { 'GenerateDebugInformation': 'true', }, }, 'conditions': [ ['OS == "win"', { 'defines': [ 'WIN32' ], }] ], }, 'targets': [ { 'target_name': 'http_parser', 'type': 'static_library', 'include_dirs': [ '.' ], 'direct_dependent_settings': { 'defines': [ 'HTTP_PARSER_STRICT=0' ], 'include_dirs': [ '.' ], }, 'defines': [ 'HTTP_PARSER_STRICT=0' ], 'sources': [ './http_parser.c', ], 'conditions': [ ['OS=="win"', { 'msvs_settings': { 'VCCLCompilerTool': { # Compile as C++. http_parser.c is actually C99, but C++ is # close enough in this case. 'CompileAs': 2, }, }, }] ], }, { 'target_name': 'http_parser_strict', 'type': 'static_library', 'include_dirs': [ '.' ], 'direct_dependent_settings': { 'defines': [ 'HTTP_PARSER_STRICT=1' ], 'include_dirs': [ '.' ], }, 'defines': [ 'HTTP_PARSER_STRICT=1' ], 'sources': [ './http_parser.c', ], 'conditions': [ ['OS=="win"', { 'msvs_settings': { 'VCCLCompilerTool': { # Compile as C++. http_parser.c is actually C99, but C++ is # close enough in this case. 'CompileAs': 2, }, }, }] ], }, { 'target_name': 'test-nonstrict', 'type': 'executable', 'dependencies': [ 'http_parser' ], 'sources': [ 'test.c' ] }, { 'target_name': 'test-strict', 'type': 'executable', 'dependencies': [ 'http_parser_strict' ], 'sources': [ 'test.c' ] } ] } aiohttp-3.6.2/vendor/http-parser/http_parser.h0000644000175100001650000004466013547410117021751 0ustar vstsdocker00000000000000/* Copyright Joyent, Inc. and other Node contributors. All rights reserved. 
* * Permission is hereby granted, free of charge, to any person obtaining a copy * of this software and associated documentation files (the "Software"), to * deal in the Software without restriction, including without limitation the * rights to use, copy, modify, merge, publish, distribute, sublicense, and/or * sell copies of the Software, and to permit persons to whom the Software is * furnished to do so, subject to the following conditions: * * The above copyright notice and this permission notice shall be included in * all copies or substantial portions of the Software. * * THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR * IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, * FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE * AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER * LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING * FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS * IN THE SOFTWARE. */ #ifndef http_parser_h #define http_parser_h #ifdef __cplusplus extern "C" { #endif /* Also update SONAME in the Makefile whenever you change these. */ #define HTTP_PARSER_VERSION_MAJOR 2 #define HTTP_PARSER_VERSION_MINOR 8 #define HTTP_PARSER_VERSION_PATCH 1 #include #if defined(_WIN32) && !defined(__MINGW32__) && \ (!defined(_MSC_VER) || _MSC_VER<1600) && !defined(__WINE__) #include typedef __int8 int8_t; typedef unsigned __int8 uint8_t; typedef __int16 int16_t; typedef unsigned __int16 uint16_t; typedef __int32 int32_t; typedef unsigned __int32 uint32_t; typedef __int64 int64_t; typedef unsigned __int64 uint64_t; #else #include #endif /* Compile with -DHTTP_PARSER_STRICT=0 to make less checks, but run * faster */ #ifndef HTTP_PARSER_STRICT # define HTTP_PARSER_STRICT 1 #endif /* Maximium header size allowed. If the macro is not defined * before including this header then the default is used. To * change the maximum header size, define the macro in the build * environment (e.g. -DHTTP_MAX_HEADER_SIZE=). To remove * the effective limit on the size of the header, define the macro * to a very large number (e.g. -DHTTP_MAX_HEADER_SIZE=0x7fffffff) */ #ifndef HTTP_MAX_HEADER_SIZE # define HTTP_MAX_HEADER_SIZE (80*1024) #endif typedef struct http_parser http_parser; typedef struct http_parser_settings http_parser_settings; /* Callbacks should return non-zero to indicate an error. The parser will * then halt execution. * * The one exception is on_headers_complete. In a HTTP_RESPONSE parser * returning '1' from on_headers_complete will tell the parser that it * should not expect a body. This is used when receiving a response to a * HEAD request which may contain 'Content-Length' or 'Transfer-Encoding: * chunked' headers that indicate the presence of a body. * * Returning `2` from on_headers_complete will tell parser that it should not * expect neither a body nor any futher responses on this connection. This is * useful for handling responses to a CONNECT request which may not contain * `Upgrade` or `Connection: upgrade` headers. * * http_data_cb does not return data chunks. It will be called arbitrarily * many times for each string. E.G. you might get 10 callbacks for "on_url" * each providing just a few characters more data. 
*/ typedef int (*http_data_cb) (http_parser*, const char *at, size_t length); typedef int (*http_cb) (http_parser*); /* Status Codes */ #define HTTP_STATUS_MAP(XX) \ XX(100, CONTINUE, Continue) \ XX(101, SWITCHING_PROTOCOLS, Switching Protocols) \ XX(102, PROCESSING, Processing) \ XX(200, OK, OK) \ XX(201, CREATED, Created) \ XX(202, ACCEPTED, Accepted) \ XX(203, NON_AUTHORITATIVE_INFORMATION, Non-Authoritative Information) \ XX(204, NO_CONTENT, No Content) \ XX(205, RESET_CONTENT, Reset Content) \ XX(206, PARTIAL_CONTENT, Partial Content) \ XX(207, MULTI_STATUS, Multi-Status) \ XX(208, ALREADY_REPORTED, Already Reported) \ XX(226, IM_USED, IM Used) \ XX(300, MULTIPLE_CHOICES, Multiple Choices) \ XX(301, MOVED_PERMANENTLY, Moved Permanently) \ XX(302, FOUND, Found) \ XX(303, SEE_OTHER, See Other) \ XX(304, NOT_MODIFIED, Not Modified) \ XX(305, USE_PROXY, Use Proxy) \ XX(307, TEMPORARY_REDIRECT, Temporary Redirect) \ XX(308, PERMANENT_REDIRECT, Permanent Redirect) \ XX(400, BAD_REQUEST, Bad Request) \ XX(401, UNAUTHORIZED, Unauthorized) \ XX(402, PAYMENT_REQUIRED, Payment Required) \ XX(403, FORBIDDEN, Forbidden) \ XX(404, NOT_FOUND, Not Found) \ XX(405, METHOD_NOT_ALLOWED, Method Not Allowed) \ XX(406, NOT_ACCEPTABLE, Not Acceptable) \ XX(407, PROXY_AUTHENTICATION_REQUIRED, Proxy Authentication Required) \ XX(408, REQUEST_TIMEOUT, Request Timeout) \ XX(409, CONFLICT, Conflict) \ XX(410, GONE, Gone) \ XX(411, LENGTH_REQUIRED, Length Required) \ XX(412, PRECONDITION_FAILED, Precondition Failed) \ XX(413, PAYLOAD_TOO_LARGE, Payload Too Large) \ XX(414, URI_TOO_LONG, URI Too Long) \ XX(415, UNSUPPORTED_MEDIA_TYPE, Unsupported Media Type) \ XX(416, RANGE_NOT_SATISFIABLE, Range Not Satisfiable) \ XX(417, EXPECTATION_FAILED, Expectation Failed) \ XX(421, MISDIRECTED_REQUEST, Misdirected Request) \ XX(422, UNPROCESSABLE_ENTITY, Unprocessable Entity) \ XX(423, LOCKED, Locked) \ XX(424, FAILED_DEPENDENCY, Failed Dependency) \ XX(426, UPGRADE_REQUIRED, Upgrade Required) \ XX(428, PRECONDITION_REQUIRED, Precondition Required) \ XX(429, TOO_MANY_REQUESTS, Too Many Requests) \ XX(431, REQUEST_HEADER_FIELDS_TOO_LARGE, Request Header Fields Too Large) \ XX(451, UNAVAILABLE_FOR_LEGAL_REASONS, Unavailable For Legal Reasons) \ XX(500, INTERNAL_SERVER_ERROR, Internal Server Error) \ XX(501, NOT_IMPLEMENTED, Not Implemented) \ XX(502, BAD_GATEWAY, Bad Gateway) \ XX(503, SERVICE_UNAVAILABLE, Service Unavailable) \ XX(504, GATEWAY_TIMEOUT, Gateway Timeout) \ XX(505, HTTP_VERSION_NOT_SUPPORTED, HTTP Version Not Supported) \ XX(506, VARIANT_ALSO_NEGOTIATES, Variant Also Negotiates) \ XX(507, INSUFFICIENT_STORAGE, Insufficient Storage) \ XX(508, LOOP_DETECTED, Loop Detected) \ XX(510, NOT_EXTENDED, Not Extended) \ XX(511, NETWORK_AUTHENTICATION_REQUIRED, Network Authentication Required) \ enum http_status { #define XX(num, name, string) HTTP_STATUS_##name = num, HTTP_STATUS_MAP(XX) #undef XX }; /* Request Methods */ #define HTTP_METHOD_MAP(XX) \ XX(0, DELETE, DELETE) \ XX(1, GET, GET) \ XX(2, HEAD, HEAD) \ XX(3, POST, POST) \ XX(4, PUT, PUT) \ /* pathological */ \ XX(5, CONNECT, CONNECT) \ XX(6, OPTIONS, OPTIONS) \ XX(7, TRACE, TRACE) \ /* WebDAV */ \ XX(8, COPY, COPY) \ XX(9, LOCK, LOCK) \ XX(10, MKCOL, MKCOL) \ XX(11, MOVE, MOVE) \ XX(12, PROPFIND, PROPFIND) \ XX(13, PROPPATCH, PROPPATCH) \ XX(14, SEARCH, SEARCH) \ XX(15, UNLOCK, UNLOCK) \ XX(16, BIND, BIND) \ XX(17, REBIND, REBIND) \ XX(18, UNBIND, UNBIND) \ XX(19, ACL, ACL) \ /* subversion */ \ XX(20, REPORT, REPORT) \ XX(21, MKACTIVITY, MKACTIVITY) \ XX(22, 
CHECKOUT, CHECKOUT) \ XX(23, MERGE, MERGE) \ /* upnp */ \ XX(24, MSEARCH, M-SEARCH) \ XX(25, NOTIFY, NOTIFY) \ XX(26, SUBSCRIBE, SUBSCRIBE) \ XX(27, UNSUBSCRIBE, UNSUBSCRIBE) \ /* RFC-5789 */ \ XX(28, PATCH, PATCH) \ XX(29, PURGE, PURGE) \ /* CalDAV */ \ XX(30, MKCALENDAR, MKCALENDAR) \ /* RFC-2068, section 19.6.1.2 */ \ XX(31, LINK, LINK) \ XX(32, UNLINK, UNLINK) \ /* icecast */ \ XX(33, SOURCE, SOURCE) \ enum http_method { #define XX(num, name, string) HTTP_##name = num, HTTP_METHOD_MAP(XX) #undef XX }; enum http_parser_type { HTTP_REQUEST, HTTP_RESPONSE, HTTP_BOTH }; /* Flag values for http_parser.flags field */ enum flags { F_CHUNKED = 1 << 0 , F_CONNECTION_KEEP_ALIVE = 1 << 1 , F_CONNECTION_CLOSE = 1 << 2 , F_CONNECTION_UPGRADE = 1 << 3 , F_TRAILING = 1 << 4 , F_UPGRADE = 1 << 5 , F_SKIPBODY = 1 << 6 , F_CONTENTLENGTH = 1 << 7 }; /* Map for errno-related constants * * The provided argument should be a macro that takes 2 arguments. */ #define HTTP_ERRNO_MAP(XX) \ /* No error */ \ XX(OK, "success") \ \ /* Callback-related errors */ \ XX(CB_message_begin, "the on_message_begin callback failed") \ XX(CB_url, "the on_url callback failed") \ XX(CB_header_field, "the on_header_field callback failed") \ XX(CB_header_value, "the on_header_value callback failed") \ XX(CB_headers_complete, "the on_headers_complete callback failed") \ XX(CB_body, "the on_body callback failed") \ XX(CB_message_complete, "the on_message_complete callback failed") \ XX(CB_status, "the on_status callback failed") \ XX(CB_chunk_header, "the on_chunk_header callback failed") \ XX(CB_chunk_complete, "the on_chunk_complete callback failed") \ \ /* Parsing-related errors */ \ XX(INVALID_EOF_STATE, "stream ended at an unexpected time") \ XX(HEADER_OVERFLOW, \ "too many header bytes seen; overflow detected") \ XX(CLOSED_CONNECTION, \ "data received after completed connection: close message") \ XX(INVALID_VERSION, "invalid HTTP version") \ XX(INVALID_STATUS, "invalid HTTP status code") \ XX(INVALID_METHOD, "invalid HTTP method") \ XX(INVALID_URL, "invalid URL") \ XX(INVALID_HOST, "invalid host") \ XX(INVALID_PORT, "invalid port") \ XX(INVALID_PATH, "invalid path") \ XX(INVALID_QUERY_STRING, "invalid query string") \ XX(INVALID_FRAGMENT, "invalid fragment") \ XX(LF_EXPECTED, "LF character expected") \ XX(INVALID_HEADER_TOKEN, "invalid character in header") \ XX(INVALID_CONTENT_LENGTH, \ "invalid character in content-length header") \ XX(UNEXPECTED_CONTENT_LENGTH, \ "unexpected content-length header") \ XX(INVALID_CHUNK_SIZE, \ "invalid character in chunk size header") \ XX(INVALID_CONSTANT, "invalid constant string") \ XX(INVALID_INTERNAL_STATE, "encountered unexpected internal state")\ XX(STRICT, "strict mode assertion failed") \ XX(PAUSED, "parser is paused") \ XX(UNKNOWN, "an unknown error occurred") /* Define HPE_* values for each errno value above */ #define HTTP_ERRNO_GEN(n, s) HPE_##n, enum http_errno { HTTP_ERRNO_MAP(HTTP_ERRNO_GEN) }; #undef HTTP_ERRNO_GEN /* Get an http_errno value from an http_parser */ #define HTTP_PARSER_ERRNO(p) ((enum http_errno) (p)->http_errno) struct http_parser { /** PRIVATE **/ unsigned int type : 2; /* enum http_parser_type */ unsigned int flags : 8; /* F_* values from 'flags' enum; semi-public */ unsigned int state : 7; /* enum state from http_parser.c */ unsigned int header_state : 7; /* enum header_state from http_parser.c */ unsigned int index : 7; /* index into current matcher */ unsigned int lenient_http_headers : 1; uint32_t nread; /* # bytes read in various scenarios */ uint64_t 
content_length; /* # bytes in body (0 if no Content-Length header) */ /** READ-ONLY **/ unsigned short http_major; unsigned short http_minor; unsigned int status_code : 16; /* responses only */ unsigned int method : 8; /* requests only */ unsigned int http_errno : 7; /* 1 = Upgrade header was present and the parser has exited because of that. * 0 = No upgrade header present. * Should be checked when http_parser_execute() returns in addition to * error checking. */ unsigned int upgrade : 1; /** PUBLIC **/ void *data; /* A pointer to get hook to the "connection" or "socket" object */ }; struct http_parser_settings { http_cb on_message_begin; http_data_cb on_url; http_data_cb on_status; http_data_cb on_header_field; http_data_cb on_header_value; http_cb on_headers_complete; http_data_cb on_body; http_cb on_message_complete; /* When on_chunk_header is called, the current chunk length is stored * in parser->content_length. */ http_cb on_chunk_header; http_cb on_chunk_complete; }; enum http_parser_url_fields { UF_SCHEMA = 0 , UF_HOST = 1 , UF_PORT = 2 , UF_PATH = 3 , UF_QUERY = 4 , UF_FRAGMENT = 5 , UF_USERINFO = 6 , UF_MAX = 7 }; /* Result structure for http_parser_parse_url(). * * Callers should index into field_data[] with UF_* values iff field_set * has the relevant (1 << UF_*) bit set. As a courtesy to clients (and * because we probably have padding left over), we convert any port to * a uint16_t. */ struct http_parser_url { uint16_t field_set; /* Bitmask of (1 << UF_*) values */ uint16_t port; /* Converted UF_PORT string */ struct { uint16_t off; /* Offset into buffer in which field starts */ uint16_t len; /* Length of run in buffer */ } field_data[UF_MAX]; }; /* Returns the library version. Bits 16-23 contain the major version number, * bits 8-15 the minor version number and bits 0-7 the patch level. * Usage example: * * unsigned long version = http_parser_version(); * unsigned major = (version >> 16) & 255; * unsigned minor = (version >> 8) & 255; * unsigned patch = version & 255; * printf("http_parser v%u.%u.%u\n", major, minor, patch); */ unsigned long http_parser_version(void); void http_parser_init(http_parser *parser, enum http_parser_type type); /* Initialize http_parser_settings members to 0 */ void http_parser_settings_init(http_parser_settings *settings); /* Executes the parser. Returns number of parsed bytes. Sets * `parser->http_errno` on error. */ size_t http_parser_execute(http_parser *parser, const http_parser_settings *settings, const char *data, size_t len); /* If http_should_keep_alive() in the on_headers_complete or * on_message_complete callback returns 0, then this should be * the last message on the connection. * If you are the server, respond with the "Connection: close" header. * If you are the client, close the connection. */ int http_should_keep_alive(const http_parser *parser); /* Returns a string version of the HTTP method. */ const char *http_method_str(enum http_method m); /* Returns a string version of the HTTP status code. 
*/ const char *http_status_str(enum http_status s); /* Return a string name of the given error */ const char *http_errno_name(enum http_errno err); /* Return a string description of the given error */ const char *http_errno_description(enum http_errno err); /* Initialize all http_parser_url members to 0 */ void http_parser_url_init(struct http_parser_url *u); /* Parse a URL; return nonzero on failure */ int http_parser_parse_url(const char *buf, size_t buflen, int is_connect, struct http_parser_url *u); /* Pause or un-pause the parser; a nonzero value pauses */ void http_parser_pause(http_parser *parser, int paused); /* Checks if this is the final chunk of the body. */ int http_body_is_final(const http_parser *parser); #ifdef __cplusplus } #endif #endif aiohttp-3.6.2/vendor/http-parser/test.c0000644000175100001650000035200513547410117020363 0ustar vstsdocker00000000000000/* Copyright Joyent, Inc. and other Node contributors. All rights reserved. * * Permission is hereby granted, free of charge, to any person obtaining a copy * of this software and associated documentation files (the "Software"), to * deal in the Software without restriction, including without limitation the * rights to use, copy, modify, merge, publish, distribute, sublicense, and/or * sell copies of the Software, and to permit persons to whom the Software is * furnished to do so, subject to the following conditions: * * The above copyright notice and this permission notice shall be included in * all copies or substantial portions of the Software. * * THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR * IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, * FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE * AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER * LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING * FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS * IN THE SOFTWARE. */ #include "http_parser.h" #include #include #include #include /* rand */ #include #include #if defined(__APPLE__) # undef strlncpy #endif /* defined(__APPLE__) */ #undef TRUE #define TRUE 1 #undef FALSE #define FALSE 0 #define MAX_HEADERS 13 #define MAX_ELEMENT_SIZE 2048 #define MAX_CHUNKS 16 #define MIN(a,b) ((a) < (b) ? 
(a) : (b)) #define ARRAY_SIZE(x) (sizeof(x) / sizeof(*x)) static http_parser parser; struct message { const char *name; // for debugging purposes const char *raw; enum http_parser_type type; enum http_method method; int status_code; char response_status[MAX_ELEMENT_SIZE]; char request_path[MAX_ELEMENT_SIZE]; char request_url[MAX_ELEMENT_SIZE]; char fragment[MAX_ELEMENT_SIZE]; char query_string[MAX_ELEMENT_SIZE]; char body[MAX_ELEMENT_SIZE]; size_t body_size; const char *host; const char *userinfo; uint16_t port; int num_headers; enum { NONE=0, FIELD, VALUE } last_header_element; char headers [MAX_HEADERS][2][MAX_ELEMENT_SIZE]; int should_keep_alive; int num_chunks; int num_chunks_complete; int chunk_lengths[MAX_CHUNKS]; const char *upgrade; // upgraded body unsigned short http_major; unsigned short http_minor; int message_begin_cb_called; int headers_complete_cb_called; int message_complete_cb_called; int status_cb_called; int message_complete_on_eof; int body_is_final; }; static int currently_parsing_eof; static struct message messages[5]; static int num_messages; static http_parser_settings *current_pause_parser; /* * R E Q U E S T S * */ const struct message requests[] = #define CURL_GET 0 { {.name= "curl get" ,.type= HTTP_REQUEST ,.raw= "GET /test HTTP/1.1\r\n" "User-Agent: curl/7.18.0 (i486-pc-linux-gnu) libcurl/7.18.0 OpenSSL/0.9.8g zlib/1.2.3.3 libidn/1.1\r\n" "Host: 0.0.0.0=5000\r\n" "Accept: */*\r\n" "\r\n" ,.should_keep_alive= TRUE ,.message_complete_on_eof= FALSE ,.http_major= 1 ,.http_minor= 1 ,.method= HTTP_GET ,.query_string= "" ,.fragment= "" ,.request_path= "/test" ,.request_url= "/test" ,.num_headers= 3 ,.headers= { { "User-Agent", "curl/7.18.0 (i486-pc-linux-gnu) libcurl/7.18.0 OpenSSL/0.9.8g zlib/1.2.3.3 libidn/1.1" } , { "Host", "0.0.0.0=5000" } , { "Accept", "*/*" } } ,.body= "" } #define FIREFOX_GET 1 , {.name= "firefox get" ,.type= HTTP_REQUEST ,.raw= "GET /favicon.ico HTTP/1.1\r\n" "Host: 0.0.0.0=5000\r\n" "User-Agent: Mozilla/5.0 (X11; U; Linux i686; en-US; rv:1.9) Gecko/2008061015 Firefox/3.0\r\n" "Accept: text/html,application/xhtml+xml,application/xml;q=0.9,*/*;q=0.8\r\n" "Accept-Language: en-us,en;q=0.5\r\n" "Accept-Encoding: gzip,deflate\r\n" "Accept-Charset: ISO-8859-1,utf-8;q=0.7,*;q=0.7\r\n" "Keep-Alive: 300\r\n" "Connection: keep-alive\r\n" "\r\n" ,.should_keep_alive= TRUE ,.message_complete_on_eof= FALSE ,.http_major= 1 ,.http_minor= 1 ,.method= HTTP_GET ,.query_string= "" ,.fragment= "" ,.request_path= "/favicon.ico" ,.request_url= "/favicon.ico" ,.num_headers= 8 ,.headers= { { "Host", "0.0.0.0=5000" } , { "User-Agent", "Mozilla/5.0 (X11; U; Linux i686; en-US; rv:1.9) Gecko/2008061015 Firefox/3.0" } , { "Accept", "text/html,application/xhtml+xml,application/xml;q=0.9,*/*;q=0.8" } , { "Accept-Language", "en-us,en;q=0.5" } , { "Accept-Encoding", "gzip,deflate" } , { "Accept-Charset", "ISO-8859-1,utf-8;q=0.7,*;q=0.7" } , { "Keep-Alive", "300" } , { "Connection", "keep-alive" } } ,.body= "" } #define DUMBLUCK 2 , {.name= "dumbluck" ,.type= HTTP_REQUEST ,.raw= "GET /dumbluck HTTP/1.1\r\n" "aaaaaaaaaaaaa:++++++++++\r\n" "\r\n" ,.should_keep_alive= TRUE ,.message_complete_on_eof= FALSE ,.http_major= 1 ,.http_minor= 1 ,.method= HTTP_GET ,.query_string= "" ,.fragment= "" ,.request_path= "/dumbluck" ,.request_url= "/dumbluck" ,.num_headers= 1 ,.headers= { { "aaaaaaaaaaaaa", "++++++++++" } } ,.body= "" } #define FRAGMENT_IN_URI 3 , {.name= "fragment in url" ,.type= HTTP_REQUEST ,.raw= "GET /forums/1/topics/2375?page=1#posts-17408 HTTP/1.1\r\n" "\r\n" 
,.should_keep_alive= TRUE ,.message_complete_on_eof= FALSE ,.http_major= 1 ,.http_minor= 1 ,.method= HTTP_GET ,.query_string= "page=1" ,.fragment= "posts-17408" ,.request_path= "/forums/1/topics/2375" /* XXX request url does include fragment? */ ,.request_url= "/forums/1/topics/2375?page=1#posts-17408" ,.num_headers= 0 ,.body= "" } #define GET_NO_HEADERS_NO_BODY 4 , {.name= "get no headers no body" ,.type= HTTP_REQUEST ,.raw= "GET /get_no_headers_no_body/world HTTP/1.1\r\n" "\r\n" ,.should_keep_alive= TRUE ,.message_complete_on_eof= FALSE /* would need Connection: close */ ,.http_major= 1 ,.http_minor= 1 ,.method= HTTP_GET ,.query_string= "" ,.fragment= "" ,.request_path= "/get_no_headers_no_body/world" ,.request_url= "/get_no_headers_no_body/world" ,.num_headers= 0 ,.body= "" } #define GET_ONE_HEADER_NO_BODY 5 , {.name= "get one header no body" ,.type= HTTP_REQUEST ,.raw= "GET /get_one_header_no_body HTTP/1.1\r\n" "Accept: */*\r\n" "\r\n" ,.should_keep_alive= TRUE ,.message_complete_on_eof= FALSE /* would need Connection: close */ ,.http_major= 1 ,.http_minor= 1 ,.method= HTTP_GET ,.query_string= "" ,.fragment= "" ,.request_path= "/get_one_header_no_body" ,.request_url= "/get_one_header_no_body" ,.num_headers= 1 ,.headers= { { "Accept" , "*/*" } } ,.body= "" } #define GET_FUNKY_CONTENT_LENGTH 6 , {.name= "get funky content length body hello" ,.type= HTTP_REQUEST ,.raw= "GET /get_funky_content_length_body_hello HTTP/1.0\r\n" "conTENT-Length: 5\r\n" "\r\n" "HELLO" ,.should_keep_alive= FALSE ,.message_complete_on_eof= FALSE ,.http_major= 1 ,.http_minor= 0 ,.method= HTTP_GET ,.query_string= "" ,.fragment= "" ,.request_path= "/get_funky_content_length_body_hello" ,.request_url= "/get_funky_content_length_body_hello" ,.num_headers= 1 ,.headers= { { "conTENT-Length" , "5" } } ,.body= "HELLO" } #define POST_IDENTITY_BODY_WORLD 7 , {.name= "post identity body world" ,.type= HTTP_REQUEST ,.raw= "POST /post_identity_body_world?q=search#hey HTTP/1.1\r\n" "Accept: */*\r\n" "Transfer-Encoding: identity\r\n" "Content-Length: 5\r\n" "\r\n" "World" ,.should_keep_alive= TRUE ,.message_complete_on_eof= FALSE ,.http_major= 1 ,.http_minor= 1 ,.method= HTTP_POST ,.query_string= "q=search" ,.fragment= "hey" ,.request_path= "/post_identity_body_world" ,.request_url= "/post_identity_body_world?q=search#hey" ,.num_headers= 3 ,.headers= { { "Accept", "*/*" } , { "Transfer-Encoding", "identity" } , { "Content-Length", "5" } } ,.body= "World" } #define POST_CHUNKED_ALL_YOUR_BASE 8 , {.name= "post - chunked body: all your base are belong to us" ,.type= HTTP_REQUEST ,.raw= "POST /post_chunked_all_your_base HTTP/1.1\r\n" "Transfer-Encoding: chunked\r\n" "\r\n" "1e\r\nall your base are belong to us\r\n" "0\r\n" "\r\n" ,.should_keep_alive= TRUE ,.message_complete_on_eof= FALSE ,.http_major= 1 ,.http_minor= 1 ,.method= HTTP_POST ,.query_string= "" ,.fragment= "" ,.request_path= "/post_chunked_all_your_base" ,.request_url= "/post_chunked_all_your_base" ,.num_headers= 1 ,.headers= { { "Transfer-Encoding" , "chunked" } } ,.body= "all your base are belong to us" ,.num_chunks_complete= 2 ,.chunk_lengths= { 0x1e } } #define TWO_CHUNKS_MULT_ZERO_END 9 , {.name= "two chunks ; triple zero ending" ,.type= HTTP_REQUEST ,.raw= "POST /two_chunks_mult_zero_end HTTP/1.1\r\n" "Transfer-Encoding: chunked\r\n" "\r\n" "5\r\nhello\r\n" "6\r\n world\r\n" "000\r\n" "\r\n" ,.should_keep_alive= TRUE ,.message_complete_on_eof= FALSE ,.http_major= 1 ,.http_minor= 1 ,.method= HTTP_POST ,.query_string= "" ,.fragment= "" ,.request_path= 
"/two_chunks_mult_zero_end" ,.request_url= "/two_chunks_mult_zero_end" ,.num_headers= 1 ,.headers= { { "Transfer-Encoding", "chunked" } } ,.body= "hello world" ,.num_chunks_complete= 3 ,.chunk_lengths= { 5, 6 } } #define CHUNKED_W_TRAILING_HEADERS 10 , {.name= "chunked with trailing headers. blech." ,.type= HTTP_REQUEST ,.raw= "POST /chunked_w_trailing_headers HTTP/1.1\r\n" "Transfer-Encoding: chunked\r\n" "\r\n" "5\r\nhello\r\n" "6\r\n world\r\n" "0\r\n" "Vary: *\r\n" "Content-Type: text/plain\r\n" "\r\n" ,.should_keep_alive= TRUE ,.message_complete_on_eof= FALSE ,.http_major= 1 ,.http_minor= 1 ,.method= HTTP_POST ,.query_string= "" ,.fragment= "" ,.request_path= "/chunked_w_trailing_headers" ,.request_url= "/chunked_w_trailing_headers" ,.num_headers= 3 ,.headers= { { "Transfer-Encoding", "chunked" } , { "Vary", "*" } , { "Content-Type", "text/plain" } } ,.body= "hello world" ,.num_chunks_complete= 3 ,.chunk_lengths= { 5, 6 } } #define CHUNKED_W_NONSENSE_AFTER_LENGTH 11 , {.name= "with nonsense after the length" ,.type= HTTP_REQUEST ,.raw= "POST /chunked_w_nonsense_after_length HTTP/1.1\r\n" "Transfer-Encoding: chunked\r\n" "\r\n" "5; ilovew3;whattheluck=aretheseparametersfor\r\nhello\r\n" "6; blahblah; blah\r\n world\r\n" "0\r\n" "\r\n" ,.should_keep_alive= TRUE ,.message_complete_on_eof= FALSE ,.http_major= 1 ,.http_minor= 1 ,.method= HTTP_POST ,.query_string= "" ,.fragment= "" ,.request_path= "/chunked_w_nonsense_after_length" ,.request_url= "/chunked_w_nonsense_after_length" ,.num_headers= 1 ,.headers= { { "Transfer-Encoding", "chunked" } } ,.body= "hello world" ,.num_chunks_complete= 3 ,.chunk_lengths= { 5, 6 } } #define WITH_QUOTES 12 , {.name= "with quotes" ,.type= HTTP_REQUEST ,.raw= "GET /with_\"stupid\"_quotes?foo=\"bar\" HTTP/1.1\r\n\r\n" ,.should_keep_alive= TRUE ,.message_complete_on_eof= FALSE ,.http_major= 1 ,.http_minor= 1 ,.method= HTTP_GET ,.query_string= "foo=\"bar\"" ,.fragment= "" ,.request_path= "/with_\"stupid\"_quotes" ,.request_url= "/with_\"stupid\"_quotes?foo=\"bar\"" ,.num_headers= 0 ,.headers= { } ,.body= "" } #define APACHEBENCH_GET 13 /* The server receiving this request SHOULD NOT wait for EOF * to know that content-length == 0. * How to represent this in a unit test? message_complete_on_eof * Compare with NO_CONTENT_LENGTH_RESPONSE. */ , {.name = "apachebench get" ,.type= HTTP_REQUEST ,.raw= "GET /test HTTP/1.0\r\n" "Host: 0.0.0.0:5000\r\n" "User-Agent: ApacheBench/2.3\r\n" "Accept: */*\r\n\r\n" ,.should_keep_alive= FALSE ,.message_complete_on_eof= FALSE ,.http_major= 1 ,.http_minor= 0 ,.method= HTTP_GET ,.query_string= "" ,.fragment= "" ,.request_path= "/test" ,.request_url= "/test" ,.num_headers= 3 ,.headers= { { "Host", "0.0.0.0:5000" } , { "User-Agent", "ApacheBench/2.3" } , { "Accept", "*/*" } } ,.body= "" } #define QUERY_URL_WITH_QUESTION_MARK_GET 14 /* Some clients include '?' characters in query strings. 
*/ , {.name = "query url with question mark" ,.type= HTTP_REQUEST ,.raw= "GET /test.cgi?foo=bar?baz HTTP/1.1\r\n\r\n" ,.should_keep_alive= TRUE ,.message_complete_on_eof= FALSE ,.http_major= 1 ,.http_minor= 1 ,.method= HTTP_GET ,.query_string= "foo=bar?baz" ,.fragment= "" ,.request_path= "/test.cgi" ,.request_url= "/test.cgi?foo=bar?baz" ,.num_headers= 0 ,.headers= {} ,.body= "" } #define PREFIX_NEWLINE_GET 15 /* Some clients, especially after a POST in a keep-alive connection, * will send an extra CRLF before the next request */ , {.name = "newline prefix get" ,.type= HTTP_REQUEST ,.raw= "\r\nGET /test HTTP/1.1\r\n\r\n" ,.should_keep_alive= TRUE ,.message_complete_on_eof= FALSE ,.http_major= 1 ,.http_minor= 1 ,.method= HTTP_GET ,.query_string= "" ,.fragment= "" ,.request_path= "/test" ,.request_url= "/test" ,.num_headers= 0 ,.headers= { } ,.body= "" } #define UPGRADE_REQUEST 16 , {.name = "upgrade request" ,.type= HTTP_REQUEST ,.raw= "GET /demo HTTP/1.1\r\n" "Host: example.com\r\n" "Connection: Upgrade\r\n" "Sec-WebSocket-Key2: 12998 5 Y3 1 .P00\r\n" "Sec-WebSocket-Protocol: sample\r\n" "Upgrade: WebSocket\r\n" "Sec-WebSocket-Key1: 4 @1 46546xW%0l 1 5\r\n" "Origin: http://example.com\r\n" "\r\n" "Hot diggity dogg" ,.should_keep_alive= TRUE ,.message_complete_on_eof= FALSE ,.http_major= 1 ,.http_minor= 1 ,.method= HTTP_GET ,.query_string= "" ,.fragment= "" ,.request_path= "/demo" ,.request_url= "/demo" ,.num_headers= 7 ,.upgrade="Hot diggity dogg" ,.headers= { { "Host", "example.com" } , { "Connection", "Upgrade" } , { "Sec-WebSocket-Key2", "12998 5 Y3 1 .P00" } , { "Sec-WebSocket-Protocol", "sample" } , { "Upgrade", "WebSocket" } , { "Sec-WebSocket-Key1", "4 @1 46546xW%0l 1 5" } , { "Origin", "http://example.com" } } ,.body= "" } #define CONNECT_REQUEST 17 , {.name = "connect request" ,.type= HTTP_REQUEST ,.raw= "CONNECT 0-home0.netscape.com:443 HTTP/1.0\r\n" "User-agent: Mozilla/1.1N\r\n" "Proxy-authorization: basic aGVsbG86d29ybGQ=\r\n" "\r\n" "some data\r\n" "and yet even more data" ,.should_keep_alive= FALSE ,.message_complete_on_eof= FALSE ,.http_major= 1 ,.http_minor= 0 ,.method= HTTP_CONNECT ,.query_string= "" ,.fragment= "" ,.request_path= "" ,.request_url= "0-home0.netscape.com:443" ,.num_headers= 2 ,.upgrade="some data\r\nand yet even more data" ,.headers= { { "User-agent", "Mozilla/1.1N" } , { "Proxy-authorization", "basic aGVsbG86d29ybGQ=" } } ,.body= "" } #define REPORT_REQ 18 , {.name= "report request" ,.type= HTTP_REQUEST ,.raw= "REPORT /test HTTP/1.1\r\n" "\r\n" ,.should_keep_alive= TRUE ,.message_complete_on_eof= FALSE ,.http_major= 1 ,.http_minor= 1 ,.method= HTTP_REPORT ,.query_string= "" ,.fragment= "" ,.request_path= "/test" ,.request_url= "/test" ,.num_headers= 0 ,.headers= {} ,.body= "" } #define NO_HTTP_VERSION 19 , {.name= "request with no http version" ,.type= HTTP_REQUEST ,.raw= "GET /\r\n" "\r\n" ,.should_keep_alive= FALSE ,.message_complete_on_eof= FALSE ,.http_major= 0 ,.http_minor= 9 ,.method= HTTP_GET ,.query_string= "" ,.fragment= "" ,.request_path= "/" ,.request_url= "/" ,.num_headers= 0 ,.headers= {} ,.body= "" } #define MSEARCH_REQ 20 , {.name= "m-search request" ,.type= HTTP_REQUEST ,.raw= "M-SEARCH * HTTP/1.1\r\n" "HOST: 239.255.255.250:1900\r\n" "MAN: \"ssdp:discover\"\r\n" "ST: \"ssdp:all\"\r\n" "\r\n" ,.should_keep_alive= TRUE ,.message_complete_on_eof= FALSE ,.http_major= 1 ,.http_minor= 1 ,.method= HTTP_MSEARCH ,.query_string= "" ,.fragment= "" ,.request_path= "*" ,.request_url= "*" ,.num_headers= 3 ,.headers= { { "HOST", "239.255.255.250:1900" } 
, { "MAN", "\"ssdp:discover\"" } , { "ST", "\"ssdp:all\"" } } ,.body= "" } #define LINE_FOLDING_IN_HEADER 21 , {.name= "line folding in header value" ,.type= HTTP_REQUEST ,.raw= "GET / HTTP/1.1\r\n" "Line1: abc\r\n" "\tdef\r\n" " ghi\r\n" "\t\tjkl\r\n" " mno \r\n" "\t \tqrs\r\n" "Line2: \t line2\t\r\n" "Line3:\r\n" " line3\r\n" "Line4: \r\n" " \r\n" "Connection:\r\n" " close\r\n" "\r\n" ,.should_keep_alive= FALSE ,.message_complete_on_eof= FALSE ,.http_major= 1 ,.http_minor= 1 ,.method= HTTP_GET ,.query_string= "" ,.fragment= "" ,.request_path= "/" ,.request_url= "/" ,.num_headers= 5 ,.headers= { { "Line1", "abc\tdef ghi\t\tjkl mno \t \tqrs" } , { "Line2", "line2\t" } , { "Line3", "line3" } , { "Line4", "" } , { "Connection", "close" }, } ,.body= "" } #define QUERY_TERMINATED_HOST 22 , {.name= "host terminated by a query string" ,.type= HTTP_REQUEST ,.raw= "GET http://hypnotoad.org?hail=all HTTP/1.1\r\n" "\r\n" ,.should_keep_alive= TRUE ,.message_complete_on_eof= FALSE ,.http_major= 1 ,.http_minor= 1 ,.method= HTTP_GET ,.query_string= "hail=all" ,.fragment= "" ,.request_path= "" ,.request_url= "http://hypnotoad.org?hail=all" ,.host= "hypnotoad.org" ,.num_headers= 0 ,.headers= { } ,.body= "" } #define QUERY_TERMINATED_HOSTPORT 23 , {.name= "host:port terminated by a query string" ,.type= HTTP_REQUEST ,.raw= "GET http://hypnotoad.org:1234?hail=all HTTP/1.1\r\n" "\r\n" ,.should_keep_alive= TRUE ,.message_complete_on_eof= FALSE ,.http_major= 1 ,.http_minor= 1 ,.method= HTTP_GET ,.query_string= "hail=all" ,.fragment= "" ,.request_path= "" ,.request_url= "http://hypnotoad.org:1234?hail=all" ,.host= "hypnotoad.org" ,.port= 1234 ,.num_headers= 0 ,.headers= { } ,.body= "" } #define SPACE_TERMINATED_HOSTPORT 24 , {.name= "host:port terminated by a space" ,.type= HTTP_REQUEST ,.raw= "GET http://hypnotoad.org:1234 HTTP/1.1\r\n" "\r\n" ,.should_keep_alive= TRUE ,.message_complete_on_eof= FALSE ,.http_major= 1 ,.http_minor= 1 ,.method= HTTP_GET ,.query_string= "" ,.fragment= "" ,.request_path= "" ,.request_url= "http://hypnotoad.org:1234" ,.host= "hypnotoad.org" ,.port= 1234 ,.num_headers= 0 ,.headers= { } ,.body= "" } #define PATCH_REQ 25 , {.name = "PATCH request" ,.type= HTTP_REQUEST ,.raw= "PATCH /file.txt HTTP/1.1\r\n" "Host: www.example.com\r\n" "Content-Type: application/example\r\n" "If-Match: \"e0023aa4e\"\r\n" "Content-Length: 10\r\n" "\r\n" "cccccccccc" ,.should_keep_alive= TRUE ,.message_complete_on_eof= FALSE ,.http_major= 1 ,.http_minor= 1 ,.method= HTTP_PATCH ,.query_string= "" ,.fragment= "" ,.request_path= "/file.txt" ,.request_url= "/file.txt" ,.num_headers= 4 ,.headers= { { "Host", "www.example.com" } , { "Content-Type", "application/example" } , { "If-Match", "\"e0023aa4e\"" } , { "Content-Length", "10" } } ,.body= "cccccccccc" } #define CONNECT_CAPS_REQUEST 26 , {.name = "connect caps request" ,.type= HTTP_REQUEST ,.raw= "CONNECT HOME0.NETSCAPE.COM:443 HTTP/1.0\r\n" "User-agent: Mozilla/1.1N\r\n" "Proxy-authorization: basic aGVsbG86d29ybGQ=\r\n" "\r\n" ,.should_keep_alive= FALSE ,.message_complete_on_eof= FALSE ,.http_major= 1 ,.http_minor= 0 ,.method= HTTP_CONNECT ,.query_string= "" ,.fragment= "" ,.request_path= "" ,.request_url= "HOME0.NETSCAPE.COM:443" ,.num_headers= 2 ,.upgrade="" ,.headers= { { "User-agent", "Mozilla/1.1N" } , { "Proxy-authorization", "basic aGVsbG86d29ybGQ=" } } ,.body= "" } #if !HTTP_PARSER_STRICT #define UTF8_PATH_REQ 27 , {.name= "utf-8 path request" ,.type= HTTP_REQUEST ,.raw= "GET /δ¶/δt/pope?q=1#narf HTTP/1.1\r\n" "Host: github.com\r\n" "\r\n" 
,.should_keep_alive= TRUE ,.message_complete_on_eof= FALSE ,.http_major= 1 ,.http_minor= 1 ,.method= HTTP_GET ,.query_string= "q=1" ,.fragment= "narf" ,.request_path= "/δ¶/δt/pope" ,.request_url= "/δ¶/δt/pope?q=1#narf" ,.num_headers= 1 ,.headers= { {"Host", "github.com" } } ,.body= "" } #define HOSTNAME_UNDERSCORE 28 , {.name = "hostname underscore" ,.type= HTTP_REQUEST ,.raw= "CONNECT home_0.netscape.com:443 HTTP/1.0\r\n" "User-agent: Mozilla/1.1N\r\n" "Proxy-authorization: basic aGVsbG86d29ybGQ=\r\n" "\r\n" ,.should_keep_alive= FALSE ,.message_complete_on_eof= FALSE ,.http_major= 1 ,.http_minor= 0 ,.method= HTTP_CONNECT ,.query_string= "" ,.fragment= "" ,.request_path= "" ,.request_url= "home_0.netscape.com:443" ,.num_headers= 2 ,.upgrade="" ,.headers= { { "User-agent", "Mozilla/1.1N" } , { "Proxy-authorization", "basic aGVsbG86d29ybGQ=" } } ,.body= "" } #endif /* !HTTP_PARSER_STRICT */ /* see https://github.com/ry/http-parser/issues/47 */ #define EAT_TRAILING_CRLF_NO_CONNECTION_CLOSE 29 , {.name = "eat CRLF between requests, no \"Connection: close\" header" ,.raw= "POST / HTTP/1.1\r\n" "Host: www.example.com\r\n" "Content-Type: application/x-www-form-urlencoded\r\n" "Content-Length: 4\r\n" "\r\n" "q=42\r\n" /* note the trailing CRLF */ ,.should_keep_alive= TRUE ,.message_complete_on_eof= FALSE ,.http_major= 1 ,.http_minor= 1 ,.method= HTTP_POST ,.query_string= "" ,.fragment= "" ,.request_path= "/" ,.request_url= "/" ,.num_headers= 3 ,.upgrade= 0 ,.headers= { { "Host", "www.example.com" } , { "Content-Type", "application/x-www-form-urlencoded" } , { "Content-Length", "4" } } ,.body= "q=42" } /* see https://github.com/ry/http-parser/issues/47 */ #define EAT_TRAILING_CRLF_WITH_CONNECTION_CLOSE 30 , {.name = "eat CRLF between requests even if \"Connection: close\" is set" ,.raw= "POST / HTTP/1.1\r\n" "Host: www.example.com\r\n" "Content-Type: application/x-www-form-urlencoded\r\n" "Content-Length: 4\r\n" "Connection: close\r\n" "\r\n" "q=42\r\n" /* note the trailing CRLF */ ,.should_keep_alive= FALSE ,.message_complete_on_eof= FALSE /* input buffer isn't empty when on_message_complete is called */ ,.http_major= 1 ,.http_minor= 1 ,.method= HTTP_POST ,.query_string= "" ,.fragment= "" ,.request_path= "/" ,.request_url= "/" ,.num_headers= 4 ,.upgrade= 0 ,.headers= { { "Host", "www.example.com" } , { "Content-Type", "application/x-www-form-urlencoded" } , { "Content-Length", "4" } , { "Connection", "close" } } ,.body= "q=42" } #define PURGE_REQ 31 , {.name = "PURGE request" ,.type= HTTP_REQUEST ,.raw= "PURGE /file.txt HTTP/1.1\r\n" "Host: www.example.com\r\n" "\r\n" ,.should_keep_alive= TRUE ,.message_complete_on_eof= FALSE ,.http_major= 1 ,.http_minor= 1 ,.method= HTTP_PURGE ,.query_string= "" ,.fragment= "" ,.request_path= "/file.txt" ,.request_url= "/file.txt" ,.num_headers= 1 ,.headers= { { "Host", "www.example.com" } } ,.body= "" } #define SEARCH_REQ 32 , {.name = "SEARCH request" ,.type= HTTP_REQUEST ,.raw= "SEARCH / HTTP/1.1\r\n" "Host: www.example.com\r\n" "\r\n" ,.should_keep_alive= TRUE ,.message_complete_on_eof= FALSE ,.http_major= 1 ,.http_minor= 1 ,.method= HTTP_SEARCH ,.query_string= "" ,.fragment= "" ,.request_path= "/" ,.request_url= "/" ,.num_headers= 1 ,.headers= { { "Host", "www.example.com" } } ,.body= "" } #define PROXY_WITH_BASIC_AUTH 33 , {.name= "host:port and basic_auth" ,.type= HTTP_REQUEST ,.raw= "GET http://a%12:b!&*$@hypnotoad.org:1234/toto HTTP/1.1\r\n" "\r\n" ,.should_keep_alive= TRUE ,.message_complete_on_eof= FALSE ,.http_major= 1 ,.http_minor= 1 ,.method= 
HTTP_GET ,.fragment= "" ,.request_path= "/toto" ,.request_url= "http://a%12:b!&*$@hypnotoad.org:1234/toto" ,.host= "hypnotoad.org" ,.userinfo= "a%12:b!&*$" ,.port= 1234 ,.num_headers= 0 ,.headers= { } ,.body= "" } #define LINE_FOLDING_IN_HEADER_WITH_LF 34 , {.name= "line folding in header value" ,.type= HTTP_REQUEST ,.raw= "GET / HTTP/1.1\n" "Line1: abc\n" "\tdef\n" " ghi\n" "\t\tjkl\n" " mno \n" "\t \tqrs\n" "Line2: \t line2\t\n" "Line3:\n" " line3\n" "Line4: \n" " \n" "Connection:\n" " close\n" "\n" ,.should_keep_alive= FALSE ,.message_complete_on_eof= FALSE ,.http_major= 1 ,.http_minor= 1 ,.method= HTTP_GET ,.query_string= "" ,.fragment= "" ,.request_path= "/" ,.request_url= "/" ,.num_headers= 5 ,.headers= { { "Line1", "abc\tdef ghi\t\tjkl mno \t \tqrs" } , { "Line2", "line2\t" } , { "Line3", "line3" } , { "Line4", "" } , { "Connection", "close" }, } ,.body= "" } #define CONNECTION_MULTI 35 , {.name = "multiple connection header values with folding" ,.type= HTTP_REQUEST ,.raw= "GET /demo HTTP/1.1\r\n" "Host: example.com\r\n" "Connection: Something,\r\n" " Upgrade, ,Keep-Alive\r\n" "Sec-WebSocket-Key2: 12998 5 Y3 1 .P00\r\n" "Sec-WebSocket-Protocol: sample\r\n" "Upgrade: WebSocket\r\n" "Sec-WebSocket-Key1: 4 @1 46546xW%0l 1 5\r\n" "Origin: http://example.com\r\n" "\r\n" "Hot diggity dogg" ,.should_keep_alive= TRUE ,.message_complete_on_eof= FALSE ,.http_major= 1 ,.http_minor= 1 ,.method= HTTP_GET ,.query_string= "" ,.fragment= "" ,.request_path= "/demo" ,.request_url= "/demo" ,.num_headers= 7 ,.upgrade="Hot diggity dogg" ,.headers= { { "Host", "example.com" } , { "Connection", "Something, Upgrade, ,Keep-Alive" } , { "Sec-WebSocket-Key2", "12998 5 Y3 1 .P00" } , { "Sec-WebSocket-Protocol", "sample" } , { "Upgrade", "WebSocket" } , { "Sec-WebSocket-Key1", "4 @1 46546xW%0l 1 5" } , { "Origin", "http://example.com" } } ,.body= "" } #define CONNECTION_MULTI_LWS 36 , {.name = "multiple connection header values with folding and lws" ,.type= HTTP_REQUEST ,.raw= "GET /demo HTTP/1.1\r\n" "Connection: keep-alive, upgrade\r\n" "Upgrade: WebSocket\r\n" "\r\n" "Hot diggity dogg" ,.should_keep_alive= TRUE ,.message_complete_on_eof= FALSE ,.http_major= 1 ,.http_minor= 1 ,.method= HTTP_GET ,.query_string= "" ,.fragment= "" ,.request_path= "/demo" ,.request_url= "/demo" ,.num_headers= 2 ,.upgrade="Hot diggity dogg" ,.headers= { { "Connection", "keep-alive, upgrade" } , { "Upgrade", "WebSocket" } } ,.body= "" } #define CONNECTION_MULTI_LWS_CRLF 37 , {.name = "multiple connection header values with folding and lws" ,.type= HTTP_REQUEST ,.raw= "GET /demo HTTP/1.1\r\n" "Connection: keep-alive, \r\n upgrade\r\n" "Upgrade: WebSocket\r\n" "\r\n" "Hot diggity dogg" ,.should_keep_alive= TRUE ,.message_complete_on_eof= FALSE ,.http_major= 1 ,.http_minor= 1 ,.method= HTTP_GET ,.query_string= "" ,.fragment= "" ,.request_path= "/demo" ,.request_url= "/demo" ,.num_headers= 2 ,.upgrade="Hot diggity dogg" ,.headers= { { "Connection", "keep-alive, upgrade" } , { "Upgrade", "WebSocket" } } ,.body= "" } #define UPGRADE_POST_REQUEST 38 , {.name = "upgrade post request" ,.type= HTTP_REQUEST ,.raw= "POST /demo HTTP/1.1\r\n" "Host: example.com\r\n" "Connection: Upgrade\r\n" "Upgrade: HTTP/2.0\r\n" "Content-Length: 15\r\n" "\r\n" "sweet post body" "Hot diggity dogg" ,.should_keep_alive= TRUE ,.message_complete_on_eof= FALSE ,.http_major= 1 ,.http_minor= 1 ,.method= HTTP_POST ,.request_path= "/demo" ,.request_url= "/demo" ,.num_headers= 4 ,.upgrade="Hot diggity dogg" ,.headers= { { "Host", "example.com" } , { "Connection", 
"Upgrade" } , { "Upgrade", "HTTP/2.0" } , { "Content-Length", "15" } } ,.body= "sweet post body" } #define CONNECT_WITH_BODY_REQUEST 39 , {.name = "connect with body request" ,.type= HTTP_REQUEST ,.raw= "CONNECT foo.bar.com:443 HTTP/1.0\r\n" "User-agent: Mozilla/1.1N\r\n" "Proxy-authorization: basic aGVsbG86d29ybGQ=\r\n" "Content-Length: 10\r\n" "\r\n" "blarfcicle" ,.should_keep_alive= FALSE ,.message_complete_on_eof= FALSE ,.http_major= 1 ,.http_minor= 0 ,.method= HTTP_CONNECT ,.request_url= "foo.bar.com:443" ,.num_headers= 3 ,.upgrade="blarfcicle" ,.headers= { { "User-agent", "Mozilla/1.1N" } , { "Proxy-authorization", "basic aGVsbG86d29ybGQ=" } , { "Content-Length", "10" } } ,.body= "" } /* Examples from the Internet draft for LINK/UNLINK methods: * https://tools.ietf.org/id/draft-snell-link-method-01.html#rfc.section.5 */ #define LINK_REQUEST 40 , {.name = "link request" ,.type= HTTP_REQUEST ,.raw= "LINK /images/my_dog.jpg HTTP/1.1\r\n" "Host: example.com\r\n" "Link: ; rel=\"tag\"\r\n" "Link: ; rel=\"tag\"\r\n" "\r\n" ,.should_keep_alive= TRUE ,.message_complete_on_eof= FALSE ,.http_major= 1 ,.http_minor= 1 ,.method= HTTP_LINK ,.request_path= "/images/my_dog.jpg" ,.request_url= "/images/my_dog.jpg" ,.query_string= "" ,.fragment= "" ,.num_headers= 3 ,.headers= { { "Host", "example.com" } , { "Link", "; rel=\"tag\"" } , { "Link", "; rel=\"tag\"" } } ,.body= "" } #define UNLINK_REQUEST 41 , {.name = "unlink request" ,.type= HTTP_REQUEST ,.raw= "UNLINK /images/my_dog.jpg HTTP/1.1\r\n" "Host: example.com\r\n" "Link: ; rel=\"tag\"\r\n" "\r\n" ,.should_keep_alive= TRUE ,.message_complete_on_eof= FALSE ,.http_major= 1 ,.http_minor= 1 ,.method= HTTP_UNLINK ,.request_path= "/images/my_dog.jpg" ,.request_url= "/images/my_dog.jpg" ,.query_string= "" ,.fragment= "" ,.num_headers= 2 ,.headers= { { "Host", "example.com" } , { "Link", "; rel=\"tag\"" } } ,.body= "" } #define SOURCE_REQUEST 42 , {.name = "source request" ,.type= HTTP_REQUEST ,.raw= "SOURCE /music/sweet/music HTTP/1.1\r\n" "Host: example.com\r\n" "\r\n" ,.should_keep_alive= TRUE ,.message_complete_on_eof= FALSE ,.http_major= 1 ,.http_minor= 1 ,.method= HTTP_SOURCE ,.request_path= "/music/sweet/music" ,.request_url= "/music/sweet/music" ,.query_string= "" ,.fragment= "" ,.num_headers= 1 ,.headers= { { "Host", "example.com" } } ,.body= "" } }; /* * R E S P O N S E S * */ const struct message responses[] = #define GOOGLE_301 0 { {.name= "google 301" ,.type= HTTP_RESPONSE ,.raw= "HTTP/1.1 301 Moved Permanently\r\n" "Location: http://www.google.com/\r\n" "Content-Type: text/html; charset=UTF-8\r\n" "Date: Sun, 26 Apr 2009 11:11:49 GMT\r\n" "Expires: Tue, 26 May 2009 11:11:49 GMT\r\n" "X-$PrototypeBI-Version: 1.6.0.3\r\n" /* $ char in header field */ "Cache-Control: public, max-age=2592000\r\n" "Server: gws\r\n" "Content-Length: 219 \r\n" "\r\n" "\n" "301 Moved\n" "

    301 Moved

    \n" "The document has moved\n" "here.\r\n" "\r\n" ,.should_keep_alive= TRUE ,.message_complete_on_eof= FALSE ,.http_major= 1 ,.http_minor= 1 ,.status_code= 301 ,.response_status= "Moved Permanently" ,.num_headers= 8 ,.headers= { { "Location", "http://www.google.com/" } , { "Content-Type", "text/html; charset=UTF-8" } , { "Date", "Sun, 26 Apr 2009 11:11:49 GMT" } , { "Expires", "Tue, 26 May 2009 11:11:49 GMT" } , { "X-$PrototypeBI-Version", "1.6.0.3" } , { "Cache-Control", "public, max-age=2592000" } , { "Server", "gws" } , { "Content-Length", "219 " } } ,.body= "\n" "301 Moved\n" "

    301 Moved

    \n" "The document has moved\n" "here.\r\n" "\r\n" } #define NO_CONTENT_LENGTH_RESPONSE 1 /* The client should wait for the server's EOF. That is, when content-length * is not specified, and "Connection: close", the end of body is specified * by the EOF. * Compare with APACHEBENCH_GET */ , {.name= "no content-length response" ,.type= HTTP_RESPONSE ,.raw= "HTTP/1.1 200 OK\r\n" "Date: Tue, 04 Aug 2009 07:59:32 GMT\r\n" "Server: Apache\r\n" "X-Powered-By: Servlet/2.5 JSP/2.1\r\n" "Content-Type: text/xml; charset=utf-8\r\n" "Connection: close\r\n" "\r\n" "\n" "\n" " \n" " \n" " SOAP-ENV:Client\n" " Client Error\n" " \n" " \n" "" ,.should_keep_alive= FALSE ,.message_complete_on_eof= TRUE ,.http_major= 1 ,.http_minor= 1 ,.status_code= 200 ,.response_status= "OK" ,.num_headers= 5 ,.headers= { { "Date", "Tue, 04 Aug 2009 07:59:32 GMT" } , { "Server", "Apache" } , { "X-Powered-By", "Servlet/2.5 JSP/2.1" } , { "Content-Type", "text/xml; charset=utf-8" } , { "Connection", "close" } } ,.body= "\n" "\n" " \n" " \n" " SOAP-ENV:Client\n" " Client Error\n" " \n" " \n" "" } #define NO_HEADERS_NO_BODY_404 2 , {.name= "404 no headers no body" ,.type= HTTP_RESPONSE ,.raw= "HTTP/1.1 404 Not Found\r\n\r\n" ,.should_keep_alive= FALSE ,.message_complete_on_eof= TRUE ,.http_major= 1 ,.http_minor= 1 ,.status_code= 404 ,.response_status= "Not Found" ,.num_headers= 0 ,.headers= {} ,.body_size= 0 ,.body= "" } #define NO_REASON_PHRASE 3 , {.name= "301 no response phrase" ,.type= HTTP_RESPONSE ,.raw= "HTTP/1.1 301\r\n\r\n" ,.should_keep_alive = FALSE ,.message_complete_on_eof= TRUE ,.http_major= 1 ,.http_minor= 1 ,.status_code= 301 ,.response_status= "" ,.num_headers= 0 ,.headers= {} ,.body= "" } #define TRAILING_SPACE_ON_CHUNKED_BODY 4 , {.name="200 trailing space on chunked body" ,.type= HTTP_RESPONSE ,.raw= "HTTP/1.1 200 OK\r\n" "Content-Type: text/plain\r\n" "Transfer-Encoding: chunked\r\n" "\r\n" "25 \r\n" "This is the data in the first chunk\r\n" "\r\n" "1C\r\n" "and this is the second one\r\n" "\r\n" "0 \r\n" "\r\n" ,.should_keep_alive= TRUE ,.message_complete_on_eof= FALSE ,.http_major= 1 ,.http_minor= 1 ,.status_code= 200 ,.response_status= "OK" ,.num_headers= 2 ,.headers= { {"Content-Type", "text/plain" } , {"Transfer-Encoding", "chunked" } } ,.body_size = 37+28 ,.body = "This is the data in the first chunk\r\n" "and this is the second one\r\n" ,.num_chunks_complete= 3 ,.chunk_lengths= { 0x25, 0x1c } } #define NO_CARRIAGE_RET 5 , {.name="no carriage ret" ,.type= HTTP_RESPONSE ,.raw= "HTTP/1.1 200 OK\n" "Content-Type: text/html; charset=utf-8\n" "Connection: close\n" "\n" "these headers are from http://news.ycombinator.com/" ,.should_keep_alive= FALSE ,.message_complete_on_eof= TRUE ,.http_major= 1 ,.http_minor= 1 ,.status_code= 200 ,.response_status= "OK" ,.num_headers= 2 ,.headers= { {"Content-Type", "text/html; charset=utf-8" } , {"Connection", "close" } } ,.body= "these headers are from http://news.ycombinator.com/" } #define PROXY_CONNECTION 6 , {.name="proxy connection" ,.type= HTTP_RESPONSE ,.raw= "HTTP/1.1 200 OK\r\n" "Content-Type: text/html; charset=UTF-8\r\n" "Content-Length: 11\r\n" "Proxy-Connection: close\r\n" "Date: Thu, 31 Dec 2009 20:55:48 +0000\r\n" "\r\n" "hello world" ,.should_keep_alive= FALSE ,.message_complete_on_eof= FALSE ,.http_major= 1 ,.http_minor= 1 ,.status_code= 200 ,.response_status= "OK" ,.num_headers= 4 ,.headers= { {"Content-Type", "text/html; charset=UTF-8" } , {"Content-Length", "11" } , {"Proxy-Connection", "close" } , {"Date", "Thu, 31 Dec 2009 20:55:48 +0000"} } 
,.body= "hello world" } #define UNDERSTORE_HEADER_KEY 7 // shown by // curl -o /dev/null -v "http://ad.doubleclick.net/pfadx/DARTSHELLCONFIGXML;dcmt=text/xml;" , {.name="underscore header key" ,.type= HTTP_RESPONSE ,.raw= "HTTP/1.1 200 OK\r\n" "Server: DCLK-AdSvr\r\n" "Content-Type: text/xml\r\n" "Content-Length: 0\r\n" "DCLK_imp: v7;x;114750856;0-0;0;17820020;0/0;21603567/21621457/1;;~okv=;dcmt=text/xml;;~cs=o\r\n\r\n" ,.should_keep_alive= TRUE ,.message_complete_on_eof= FALSE ,.http_major= 1 ,.http_minor= 1 ,.status_code= 200 ,.response_status= "OK" ,.num_headers= 4 ,.headers= { {"Server", "DCLK-AdSvr" } , {"Content-Type", "text/xml" } , {"Content-Length", "0" } , {"DCLK_imp", "v7;x;114750856;0-0;0;17820020;0/0;21603567/21621457/1;;~okv=;dcmt=text/xml;;~cs=o" } } ,.body= "" } #define BONJOUR_MADAME_FR 8 /* The client should not merge two headers fields when the first one doesn't * have a value. */ , {.name= "bonjourmadame.fr" ,.type= HTTP_RESPONSE ,.raw= "HTTP/1.0 301 Moved Permanently\r\n" "Date: Thu, 03 Jun 2010 09:56:32 GMT\r\n" "Server: Apache/2.2.3 (Red Hat)\r\n" "Cache-Control: public\r\n" "Pragma: \r\n" "Location: http://www.bonjourmadame.fr/\r\n" "Vary: Accept-Encoding\r\n" "Content-Length: 0\r\n" "Content-Type: text/html; charset=UTF-8\r\n" "Connection: keep-alive\r\n" "\r\n" ,.should_keep_alive= TRUE ,.message_complete_on_eof= FALSE ,.http_major= 1 ,.http_minor= 0 ,.status_code= 301 ,.response_status= "Moved Permanently" ,.num_headers= 9 ,.headers= { { "Date", "Thu, 03 Jun 2010 09:56:32 GMT" } , { "Server", "Apache/2.2.3 (Red Hat)" } , { "Cache-Control", "public" } , { "Pragma", "" } , { "Location", "http://www.bonjourmadame.fr/" } , { "Vary", "Accept-Encoding" } , { "Content-Length", "0" } , { "Content-Type", "text/html; charset=UTF-8" } , { "Connection", "keep-alive" } } ,.body= "" } #define RES_FIELD_UNDERSCORE 9 /* Should handle spaces in header fields */ , {.name= "field underscore" ,.type= HTTP_RESPONSE ,.raw= "HTTP/1.1 200 OK\r\n" "Date: Tue, 28 Sep 2010 01:14:13 GMT\r\n" "Server: Apache\r\n" "Cache-Control: no-cache, must-revalidate\r\n" "Expires: Mon, 26 Jul 1997 05:00:00 GMT\r\n" ".et-Cookie: PlaxoCS=1274804622353690521; path=/; domain=.plaxo.com\r\n" "Vary: Accept-Encoding\r\n" "_eep-Alive: timeout=45\r\n" /* semantic value ignored */ "_onnection: Keep-Alive\r\n" /* semantic value ignored */ "Transfer-Encoding: chunked\r\n" "Content-Type: text/html\r\n" "Connection: close\r\n" "\r\n" "0\r\n\r\n" ,.should_keep_alive= FALSE ,.message_complete_on_eof= FALSE ,.http_major= 1 ,.http_minor= 1 ,.status_code= 200 ,.response_status= "OK" ,.num_headers= 11 ,.headers= { { "Date", "Tue, 28 Sep 2010 01:14:13 GMT" } , { "Server", "Apache" } , { "Cache-Control", "no-cache, must-revalidate" } , { "Expires", "Mon, 26 Jul 1997 05:00:00 GMT" } , { ".et-Cookie", "PlaxoCS=1274804622353690521; path=/; domain=.plaxo.com" } , { "Vary", "Accept-Encoding" } , { "_eep-Alive", "timeout=45" } , { "_onnection", "Keep-Alive" } , { "Transfer-Encoding", "chunked" } , { "Content-Type", "text/html" } , { "Connection", "close" } } ,.body= "" ,.num_chunks_complete= 1 ,.chunk_lengths= {} } #define NON_ASCII_IN_STATUS_LINE 10 /* Should handle non-ASCII in status line */ , {.name= "non-ASCII in status line" ,.type= HTTP_RESPONSE ,.raw= "HTTP/1.1 500 Oriëntatieprobleem\r\n" "Date: Fri, 5 Nov 2010 23:07:12 GMT+2\r\n" "Content-Length: 0\r\n" "Connection: close\r\n" "\r\n" ,.should_keep_alive= FALSE ,.message_complete_on_eof= FALSE ,.http_major= 1 ,.http_minor= 1 ,.status_code= 500 ,.response_status= 
"Oriëntatieprobleem" ,.num_headers= 3 ,.headers= { { "Date", "Fri, 5 Nov 2010 23:07:12 GMT+2" } , { "Content-Length", "0" } , { "Connection", "close" } } ,.body= "" } #define HTTP_VERSION_0_9 11 /* Should handle HTTP/0.9 */ , {.name= "http version 0.9" ,.type= HTTP_RESPONSE ,.raw= "HTTP/0.9 200 OK\r\n" "\r\n" ,.should_keep_alive= FALSE ,.message_complete_on_eof= TRUE ,.http_major= 0 ,.http_minor= 9 ,.status_code= 200 ,.response_status= "OK" ,.num_headers= 0 ,.headers= {} ,.body= "" } #define NO_CONTENT_LENGTH_NO_TRANSFER_ENCODING_RESPONSE 12 /* The client should wait for the server's EOF. That is, when neither * content-length nor transfer-encoding is specified, the end of body * is specified by the EOF. */ , {.name= "neither content-length nor transfer-encoding response" ,.type= HTTP_RESPONSE ,.raw= "HTTP/1.1 200 OK\r\n" "Content-Type: text/plain\r\n" "\r\n" "hello world" ,.should_keep_alive= FALSE ,.message_complete_on_eof= TRUE ,.http_major= 1 ,.http_minor= 1 ,.status_code= 200 ,.response_status= "OK" ,.num_headers= 1 ,.headers= { { "Content-Type", "text/plain" } } ,.body= "hello world" } #define NO_BODY_HTTP10_KA_200 13 , {.name= "HTTP/1.0 with keep-alive and EOF-terminated 200 status" ,.type= HTTP_RESPONSE ,.raw= "HTTP/1.0 200 OK\r\n" "Connection: keep-alive\r\n" "\r\n" ,.should_keep_alive= FALSE ,.message_complete_on_eof= TRUE ,.http_major= 1 ,.http_minor= 0 ,.status_code= 200 ,.response_status= "OK" ,.num_headers= 1 ,.headers= { { "Connection", "keep-alive" } } ,.body_size= 0 ,.body= "" } #define NO_BODY_HTTP10_KA_204 14 , {.name= "HTTP/1.0 with keep-alive and a 204 status" ,.type= HTTP_RESPONSE ,.raw= "HTTP/1.0 204 No content\r\n" "Connection: keep-alive\r\n" "\r\n" ,.should_keep_alive= TRUE ,.message_complete_on_eof= FALSE ,.http_major= 1 ,.http_minor= 0 ,.status_code= 204 ,.response_status= "No content" ,.num_headers= 1 ,.headers= { { "Connection", "keep-alive" } } ,.body_size= 0 ,.body= "" } #define NO_BODY_HTTP11_KA_200 15 , {.name= "HTTP/1.1 with an EOF-terminated 200 status" ,.type= HTTP_RESPONSE ,.raw= "HTTP/1.1 200 OK\r\n" "\r\n" ,.should_keep_alive= FALSE ,.message_complete_on_eof= TRUE ,.http_major= 1 ,.http_minor= 1 ,.status_code= 200 ,.response_status= "OK" ,.num_headers= 0 ,.headers={} ,.body_size= 0 ,.body= "" } #define NO_BODY_HTTP11_KA_204 16 , {.name= "HTTP/1.1 with a 204 status" ,.type= HTTP_RESPONSE ,.raw= "HTTP/1.1 204 No content\r\n" "\r\n" ,.should_keep_alive= TRUE ,.message_complete_on_eof= FALSE ,.http_major= 1 ,.http_minor= 1 ,.status_code= 204 ,.response_status= "No content" ,.num_headers= 0 ,.headers={} ,.body_size= 0 ,.body= "" } #define NO_BODY_HTTP11_NOKA_204 17 , {.name= "HTTP/1.1 with a 204 status and keep-alive disabled" ,.type= HTTP_RESPONSE ,.raw= "HTTP/1.1 204 No content\r\n" "Connection: close\r\n" "\r\n" ,.should_keep_alive= FALSE ,.message_complete_on_eof= FALSE ,.http_major= 1 ,.http_minor= 1 ,.status_code= 204 ,.response_status= "No content" ,.num_headers= 1 ,.headers= { { "Connection", "close" } } ,.body_size= 0 ,.body= "" } #define NO_BODY_HTTP11_KA_CHUNKED_200 18 , {.name= "HTTP/1.1 with chunked endocing and a 200 response" ,.type= HTTP_RESPONSE ,.raw= "HTTP/1.1 200 OK\r\n" "Transfer-Encoding: chunked\r\n" "\r\n" "0\r\n" "\r\n" ,.should_keep_alive= TRUE ,.message_complete_on_eof= FALSE ,.http_major= 1 ,.http_minor= 1 ,.status_code= 200 ,.response_status= "OK" ,.num_headers= 1 ,.headers= { { "Transfer-Encoding", "chunked" } } ,.body_size= 0 ,.body= "" ,.num_chunks_complete= 1 } #if !HTTP_PARSER_STRICT #define SPACE_IN_FIELD_RES 19 /* 
Should handle spaces in header fields */ , {.name= "field space" ,.type= HTTP_RESPONSE ,.raw= "HTTP/1.1 200 OK\r\n" "Server: Microsoft-IIS/6.0\r\n" "X-Powered-By: ASP.NET\r\n" "en-US Content-Type: text/xml\r\n" /* this is the problem */ "Content-Type: text/xml\r\n" "Content-Length: 16\r\n" "Date: Fri, 23 Jul 2010 18:45:38 GMT\r\n" "Connection: keep-alive\r\n" "\r\n" "hello" /* fake body */ ,.should_keep_alive= TRUE ,.message_complete_on_eof= FALSE ,.http_major= 1 ,.http_minor= 1 ,.status_code= 200 ,.response_status= "OK" ,.num_headers= 7 ,.headers= { { "Server", "Microsoft-IIS/6.0" } , { "X-Powered-By", "ASP.NET" } , { "en-US Content-Type", "text/xml" } , { "Content-Type", "text/xml" } , { "Content-Length", "16" } , { "Date", "Fri, 23 Jul 2010 18:45:38 GMT" } , { "Connection", "keep-alive" } } ,.body= "hello" } #endif /* !HTTP_PARSER_STRICT */ #define AMAZON_COM 20 , {.name= "amazon.com" ,.type= HTTP_RESPONSE ,.raw= "HTTP/1.1 301 MovedPermanently\r\n" "Date: Wed, 15 May 2013 17:06:33 GMT\r\n" "Server: Server\r\n" "x-amz-id-1: 0GPHKXSJQ826RK7GZEB2\r\n" "p3p: policyref=\"http://www.amazon.com/w3c/p3p.xml\",CP=\"CAO DSP LAW CUR ADM IVAo IVDo CONo OTPo OUR DELi PUBi OTRi BUS PHY ONL UNI PUR FIN COM NAV INT DEM CNT STA HEA PRE LOC GOV OTC \"\r\n" "x-amz-id-2: STN69VZxIFSz9YJLbz1GDbxpbjG6Qjmmq5E3DxRhOUw+Et0p4hr7c/Q8qNcx4oAD\r\n" "Location: http://www.amazon.com/Dan-Brown/e/B000AP9DSU/ref=s9_pop_gw_al1?_encoding=UTF8&refinementId=618073011&pf_rd_m=ATVPDKIKX0DER&pf_rd_s=center-2&pf_rd_r=0SHYY5BZXN3KR20BNFAY&pf_rd_t=101&pf_rd_p=1263340922&pf_rd_i=507846\r\n" "Vary: Accept-Encoding,User-Agent\r\n" "Content-Type: text/html; charset=ISO-8859-1\r\n" "Transfer-Encoding: chunked\r\n" "\r\n" "1\r\n" "\n\r\n" "0\r\n" "\r\n" ,.should_keep_alive= TRUE ,.message_complete_on_eof= FALSE ,.http_major= 1 ,.http_minor= 1 ,.status_code= 301 ,.response_status= "MovedPermanently" ,.num_headers= 9 ,.headers= { { "Date", "Wed, 15 May 2013 17:06:33 GMT" } , { "Server", "Server" } , { "x-amz-id-1", "0GPHKXSJQ826RK7GZEB2" } , { "p3p", "policyref=\"http://www.amazon.com/w3c/p3p.xml\",CP=\"CAO DSP LAW CUR ADM IVAo IVDo CONo OTPo OUR DELi PUBi OTRi BUS PHY ONL UNI PUR FIN COM NAV INT DEM CNT STA HEA PRE LOC GOV OTC \"" } , { "x-amz-id-2", "STN69VZxIFSz9YJLbz1GDbxpbjG6Qjmmq5E3DxRhOUw+Et0p4hr7c/Q8qNcx4oAD" } , { "Location", "http://www.amazon.com/Dan-Brown/e/B000AP9DSU/ref=s9_pop_gw_al1?_encoding=UTF8&refinementId=618073011&pf_rd_m=ATVPDKIKX0DER&pf_rd_s=center-2&pf_rd_r=0SHYY5BZXN3KR20BNFAY&pf_rd_t=101&pf_rd_p=1263340922&pf_rd_i=507846" } , { "Vary", "Accept-Encoding,User-Agent" } , { "Content-Type", "text/html; charset=ISO-8859-1" } , { "Transfer-Encoding", "chunked" } } ,.body= "\n" ,.num_chunks_complete= 2 ,.chunk_lengths= { 1 } } #define EMPTY_REASON_PHRASE_AFTER_SPACE 20 , {.name= "empty reason phrase after space" ,.type= HTTP_RESPONSE ,.raw= "HTTP/1.1 200 \r\n" "\r\n" ,.should_keep_alive= FALSE ,.message_complete_on_eof= TRUE ,.http_major= 1 ,.http_minor= 1 ,.status_code= 200 ,.response_status= "" ,.num_headers= 0 ,.headers= {} ,.body= "" } #define CONTENT_LENGTH_X 21 , {.name= "Content-Length-X" ,.type= HTTP_RESPONSE ,.raw= "HTTP/1.1 200 OK\r\n" "Content-Length-X: 0\r\n" "Transfer-Encoding: chunked\r\n" "\r\n" "2\r\n" "OK\r\n" "0\r\n" "\r\n" ,.should_keep_alive= TRUE ,.message_complete_on_eof= FALSE ,.http_major= 1 ,.http_minor= 1 ,.status_code= 200 ,.response_status= "OK" ,.num_headers= 2 ,.headers= { { "Content-Length-X", "0" } , { "Transfer-Encoding", "chunked" } } ,.body= "OK" ,.num_chunks_complete= 2 
,.chunk_lengths= { 2 } } #define HTTP_101_RESPONSE_WITH_UPGRADE_HEADER 22 , {.name= "HTTP 101 response with Upgrade header" ,.type= HTTP_RESPONSE ,.raw= "HTTP/1.1 101 Switching Protocols\r\n" "Connection: upgrade\r\n" "Upgrade: h2c\r\n" "\r\n" "proto" ,.should_keep_alive= TRUE ,.message_complete_on_eof= FALSE ,.http_major= 1 ,.http_minor= 1 ,.status_code= 101 ,.response_status= "Switching Protocols" ,.upgrade= "proto" ,.num_headers= 2 ,.headers= { { "Connection", "upgrade" } , { "Upgrade", "h2c" } } } #define HTTP_101_RESPONSE_WITH_UPGRADE_HEADER_AND_CONTENT_LENGTH 23 , {.name= "HTTP 101 response with Upgrade and Content-Length header" ,.type= HTTP_RESPONSE ,.raw= "HTTP/1.1 101 Switching Protocols\r\n" "Connection: upgrade\r\n" "Upgrade: h2c\r\n" "Content-Length: 4\r\n" "\r\n" "body" "proto" ,.should_keep_alive= TRUE ,.message_complete_on_eof= FALSE ,.http_major= 1 ,.http_minor= 1 ,.status_code= 101 ,.response_status= "Switching Protocols" ,.body= "body" ,.upgrade= "proto" ,.num_headers= 3 ,.headers= { { "Connection", "upgrade" } , { "Upgrade", "h2c" } , { "Content-Length", "4" } } } #define HTTP_101_RESPONSE_WITH_UPGRADE_HEADER_AND_TRANSFER_ENCODING 24 , {.name= "HTTP 101 response with Upgrade and Transfer-Encoding header" ,.type= HTTP_RESPONSE ,.raw= "HTTP/1.1 101 Switching Protocols\r\n" "Connection: upgrade\r\n" "Upgrade: h2c\r\n" "Transfer-Encoding: chunked\r\n" "\r\n" "2\r\n" "bo\r\n" "2\r\n" "dy\r\n" "0\r\n" "\r\n" "proto" ,.should_keep_alive= TRUE ,.message_complete_on_eof= FALSE ,.http_major= 1 ,.http_minor= 1 ,.status_code= 101 ,.response_status= "Switching Protocols" ,.body= "body" ,.upgrade= "proto" ,.num_headers= 3 ,.headers= { { "Connection", "upgrade" } , { "Upgrade", "h2c" } , { "Transfer-Encoding", "chunked" } } ,.num_chunks_complete= 3 ,.chunk_lengths= { 2, 2 } } #define HTTP_200_RESPONSE_WITH_UPGRADE_HEADER 25 , {.name= "HTTP 200 response with Upgrade header" ,.type= HTTP_RESPONSE ,.raw= "HTTP/1.1 200 OK\r\n" "Connection: upgrade\r\n" "Upgrade: h2c\r\n" "\r\n" "body" ,.should_keep_alive= FALSE ,.message_complete_on_eof= TRUE ,.http_major= 1 ,.http_minor= 1 ,.status_code= 200 ,.response_status= "OK" ,.body= "body" ,.upgrade= NULL ,.num_headers= 2 ,.headers= { { "Connection", "upgrade" } , { "Upgrade", "h2c" } } } #define HTTP_200_RESPONSE_WITH_UPGRADE_HEADER_AND_CONTENT_LENGTH 26 , {.name= "HTTP 200 response with Upgrade and Content-Length header" ,.type= HTTP_RESPONSE ,.raw= "HTTP/1.1 200 OK\r\n" "Connection: upgrade\r\n" "Upgrade: h2c\r\n" "Content-Length: 4\r\n" "\r\n" "body" ,.should_keep_alive= TRUE ,.message_complete_on_eof= FALSE ,.http_major= 1 ,.http_minor= 1 ,.status_code= 200 ,.response_status= "OK" ,.num_headers= 3 ,.body= "body" ,.upgrade= NULL ,.headers= { { "Connection", "upgrade" } , { "Upgrade", "h2c" } , { "Content-Length", "4" } } } #define HTTP_200_RESPONSE_WITH_UPGRADE_HEADER_AND_TRANSFER_ENCODING 27 , {.name= "HTTP 200 response with Upgrade and Transfer-Encoding header" ,.type= HTTP_RESPONSE ,.raw= "HTTP/1.1 200 OK\r\n" "Connection: upgrade\r\n" "Upgrade: h2c\r\n" "Transfer-Encoding: chunked\r\n" "\r\n" "2\r\n" "bo\r\n" "2\r\n" "dy\r\n" "0\r\n" "\r\n" ,.should_keep_alive= TRUE ,.message_complete_on_eof= FALSE ,.http_major= 1 ,.http_minor= 1 ,.status_code= 200 ,.response_status= "OK" ,.num_headers= 3 ,.body= "body" ,.upgrade= NULL ,.headers= { { "Connection", "upgrade" } , { "Upgrade", "h2c" } , { "Transfer-Encoding", "chunked" } } ,.num_chunks_complete= 3 ,.chunk_lengths= { 2, 2 } } }; /* strnlen() is a POSIX.2008 addition. 
Can't rely on it being available so * define it ourselves. */ size_t strnlen(const char *s, size_t maxlen) { const char *p; p = memchr(s, '\0', maxlen); if (p == NULL) return maxlen; return p - s; } size_t strlncat(char *dst, size_t len, const char *src, size_t n) { size_t slen; size_t dlen; size_t rlen; size_t ncpy; slen = strnlen(src, n); dlen = strnlen(dst, len); if (dlen < len) { rlen = len - dlen; ncpy = slen < rlen ? slen : (rlen - 1); memcpy(dst + dlen, src, ncpy); dst[dlen + ncpy] = '\0'; } assert(len > slen + dlen); return slen + dlen; } size_t strlncpy(char *dst, size_t len, const char *src, size_t n) { size_t slen; size_t ncpy; slen = strnlen(src, n); if (len > 0) { ncpy = slen < len ? slen : (len - 1); memcpy(dst, src, ncpy); dst[ncpy] = '\0'; } assert(len > slen); return slen; } int request_url_cb (http_parser *p, const char *buf, size_t len) { assert(p == &parser); strlncat(messages[num_messages].request_url, sizeof(messages[num_messages].request_url), buf, len); return 0; } int header_field_cb (http_parser *p, const char *buf, size_t len) { assert(p == &parser); struct message *m = &messages[num_messages]; if (m->last_header_element != FIELD) m->num_headers++; strlncat(m->headers[m->num_headers-1][0], sizeof(m->headers[m->num_headers-1][0]), buf, len); m->last_header_element = FIELD; return 0; } int header_value_cb (http_parser *p, const char *buf, size_t len) { assert(p == &parser); struct message *m = &messages[num_messages]; strlncat(m->headers[m->num_headers-1][1], sizeof(m->headers[m->num_headers-1][1]), buf, len); m->last_header_element = VALUE; return 0; } void check_body_is_final (const http_parser *p) { if (messages[num_messages].body_is_final) { fprintf(stderr, "\n\n *** Error http_body_is_final() should return 1 " "on last on_body callback call " "but it doesn't! ***\n\n"); assert(0); abort(); } messages[num_messages].body_is_final = http_body_is_final(p); } int body_cb (http_parser *p, const char *buf, size_t len) { assert(p == &parser); strlncat(messages[num_messages].body, sizeof(messages[num_messages].body), buf, len); messages[num_messages].body_size += len; check_body_is_final(p); // printf("body_cb: '%s'\n", requests[num_messages].body); return 0; } int count_body_cb (http_parser *p, const char *buf, size_t len) { assert(p == &parser); assert(buf); messages[num_messages].body_size += len; check_body_is_final(p); return 0; } int message_begin_cb (http_parser *p) { assert(p == &parser); assert(!messages[num_messages].message_begin_cb_called); messages[num_messages].message_begin_cb_called = TRUE; return 0; } int headers_complete_cb (http_parser *p) { assert(p == &parser); messages[num_messages].method = parser.method; messages[num_messages].status_code = parser.status_code; messages[num_messages].http_major = parser.http_major; messages[num_messages].http_minor = parser.http_minor; messages[num_messages].headers_complete_cb_called = TRUE; messages[num_messages].should_keep_alive = http_should_keep_alive(&parser); return 0; } int message_complete_cb (http_parser *p) { assert(p == &parser); if (messages[num_messages].should_keep_alive != http_should_keep_alive(&parser)) { fprintf(stderr, "\n\n *** Error http_should_keep_alive() should have same " "value in both on_message_complete and on_headers_complete " "but it doesn't! 
***\n\n"); assert(0); abort(); } if (messages[num_messages].body_size && http_body_is_final(p) && !messages[num_messages].body_is_final) { fprintf(stderr, "\n\n *** Error http_body_is_final() should return 1 " "on last on_body callback call " "but it doesn't! ***\n\n"); assert(0); abort(); } messages[num_messages].message_complete_cb_called = TRUE; messages[num_messages].message_complete_on_eof = currently_parsing_eof; num_messages++; return 0; } int response_status_cb (http_parser *p, const char *buf, size_t len) { assert(p == &parser); messages[num_messages].status_cb_called = TRUE; strlncat(messages[num_messages].response_status, sizeof(messages[num_messages].response_status), buf, len); return 0; } int chunk_header_cb (http_parser *p) { assert(p == &parser); int chunk_idx = messages[num_messages].num_chunks; messages[num_messages].num_chunks++; if (chunk_idx < MAX_CHUNKS) { messages[num_messages].chunk_lengths[chunk_idx] = p->content_length; } return 0; } int chunk_complete_cb (http_parser *p) { assert(p == &parser); /* Here we want to verify that each chunk_header_cb is matched by a * chunk_complete_cb, so not only should the total number of calls to * both callbacks be the same, but they also should be interleaved * properly */ assert(messages[num_messages].num_chunks == messages[num_messages].num_chunks_complete + 1); messages[num_messages].num_chunks_complete++; return 0; } /* These dontcall_* callbacks exist so that we can verify that when we're * paused, no additional callbacks are invoked */ int dontcall_message_begin_cb (http_parser *p) { if (p) { } // gcc fprintf(stderr, "\n\n*** on_message_begin() called on paused parser ***\n\n"); abort(); } int dontcall_header_field_cb (http_parser *p, const char *buf, size_t len) { if (p || buf || len) { } // gcc fprintf(stderr, "\n\n*** on_header_field() called on paused parser ***\n\n"); abort(); } int dontcall_header_value_cb (http_parser *p, const char *buf, size_t len) { if (p || buf || len) { } // gcc fprintf(stderr, "\n\n*** on_header_value() called on paused parser ***\n\n"); abort(); } int dontcall_request_url_cb (http_parser *p, const char *buf, size_t len) { if (p || buf || len) { } // gcc fprintf(stderr, "\n\n*** on_request_url() called on paused parser ***\n\n"); abort(); } int dontcall_body_cb (http_parser *p, const char *buf, size_t len) { if (p || buf || len) { } // gcc fprintf(stderr, "\n\n*** on_body_cb() called on paused parser ***\n\n"); abort(); } int dontcall_headers_complete_cb (http_parser *p) { if (p) { } // gcc fprintf(stderr, "\n\n*** on_headers_complete() called on paused " "parser ***\n\n"); abort(); } int dontcall_message_complete_cb (http_parser *p) { if (p) { } // gcc fprintf(stderr, "\n\n*** on_message_complete() called on paused " "parser ***\n\n"); abort(); } int dontcall_response_status_cb (http_parser *p, const char *buf, size_t len) { if (p || buf || len) { } // gcc fprintf(stderr, "\n\n*** on_status() called on paused parser ***\n\n"); abort(); } int dontcall_chunk_header_cb (http_parser *p) { if (p) { } // gcc fprintf(stderr, "\n\n*** on_chunk_header() called on paused parser ***\n\n"); exit(1); } int dontcall_chunk_complete_cb (http_parser *p) { if (p) { } // gcc fprintf(stderr, "\n\n*** on_chunk_complete() " "called on paused parser ***\n\n"); exit(1); } static http_parser_settings settings_dontcall = {.on_message_begin = dontcall_message_begin_cb ,.on_header_field = dontcall_header_field_cb ,.on_header_value = dontcall_header_value_cb ,.on_url = dontcall_request_url_cb ,.on_status = 
dontcall_response_status_cb ,.on_body = dontcall_body_cb ,.on_headers_complete = dontcall_headers_complete_cb ,.on_message_complete = dontcall_message_complete_cb ,.on_chunk_header = dontcall_chunk_header_cb ,.on_chunk_complete = dontcall_chunk_complete_cb }; /* These pause_* callbacks always pause the parser and just invoke the regular * callback that tracks content. Before returning, we overwrite the parser * settings to point to the _dontcall variety so that we can verify that * the pause actually did, you know, pause. */ int pause_message_begin_cb (http_parser *p) { http_parser_pause(p, 1); *current_pause_parser = settings_dontcall; return message_begin_cb(p); } int pause_header_field_cb (http_parser *p, const char *buf, size_t len) { http_parser_pause(p, 1); *current_pause_parser = settings_dontcall; return header_field_cb(p, buf, len); } int pause_header_value_cb (http_parser *p, const char *buf, size_t len) { http_parser_pause(p, 1); *current_pause_parser = settings_dontcall; return header_value_cb(p, buf, len); } int pause_request_url_cb (http_parser *p, const char *buf, size_t len) { http_parser_pause(p, 1); *current_pause_parser = settings_dontcall; return request_url_cb(p, buf, len); } int pause_body_cb (http_parser *p, const char *buf, size_t len) { http_parser_pause(p, 1); *current_pause_parser = settings_dontcall; return body_cb(p, buf, len); } int pause_headers_complete_cb (http_parser *p) { http_parser_pause(p, 1); *current_pause_parser = settings_dontcall; return headers_complete_cb(p); } int pause_message_complete_cb (http_parser *p) { http_parser_pause(p, 1); *current_pause_parser = settings_dontcall; return message_complete_cb(p); } int pause_response_status_cb (http_parser *p, const char *buf, size_t len) { http_parser_pause(p, 1); *current_pause_parser = settings_dontcall; return response_status_cb(p, buf, len); } int pause_chunk_header_cb (http_parser *p) { http_parser_pause(p, 1); *current_pause_parser = settings_dontcall; return chunk_header_cb(p); } int pause_chunk_complete_cb (http_parser *p) { http_parser_pause(p, 1); *current_pause_parser = settings_dontcall; return chunk_complete_cb(p); } int connect_headers_complete_cb (http_parser *p) { headers_complete_cb(p); return 1; } int connect_message_complete_cb (http_parser *p) { messages[num_messages].should_keep_alive = http_should_keep_alive(&parser); return message_complete_cb(p); } static http_parser_settings settings_pause = {.on_message_begin = pause_message_begin_cb ,.on_header_field = pause_header_field_cb ,.on_header_value = pause_header_value_cb ,.on_url = pause_request_url_cb ,.on_status = pause_response_status_cb ,.on_body = pause_body_cb ,.on_headers_complete = pause_headers_complete_cb ,.on_message_complete = pause_message_complete_cb ,.on_chunk_header = pause_chunk_header_cb ,.on_chunk_complete = pause_chunk_complete_cb }; static http_parser_settings settings = {.on_message_begin = message_begin_cb ,.on_header_field = header_field_cb ,.on_header_value = header_value_cb ,.on_url = request_url_cb ,.on_status = response_status_cb ,.on_body = body_cb ,.on_headers_complete = headers_complete_cb ,.on_message_complete = message_complete_cb ,.on_chunk_header = chunk_header_cb ,.on_chunk_complete = chunk_complete_cb }; static http_parser_settings settings_count_body = {.on_message_begin = message_begin_cb ,.on_header_field = header_field_cb ,.on_header_value = header_value_cb ,.on_url = request_url_cb ,.on_status = response_status_cb ,.on_body = count_body_cb ,.on_headers_complete = headers_complete_cb 
,.on_message_complete = message_complete_cb ,.on_chunk_header = chunk_header_cb ,.on_chunk_complete = chunk_complete_cb }; static http_parser_settings settings_connect = {.on_message_begin = message_begin_cb ,.on_header_field = header_field_cb ,.on_header_value = header_value_cb ,.on_url = request_url_cb ,.on_status = response_status_cb ,.on_body = dontcall_body_cb ,.on_headers_complete = connect_headers_complete_cb ,.on_message_complete = connect_message_complete_cb ,.on_chunk_header = chunk_header_cb ,.on_chunk_complete = chunk_complete_cb }; static http_parser_settings settings_null = {.on_message_begin = 0 ,.on_header_field = 0 ,.on_header_value = 0 ,.on_url = 0 ,.on_status = 0 ,.on_body = 0 ,.on_headers_complete = 0 ,.on_message_complete = 0 ,.on_chunk_header = 0 ,.on_chunk_complete = 0 }; void parser_init (enum http_parser_type type) { num_messages = 0; http_parser_init(&parser, type); memset(&messages, 0, sizeof messages); } size_t parse (const char *buf, size_t len) { size_t nparsed; currently_parsing_eof = (len == 0); nparsed = http_parser_execute(&parser, &settings, buf, len); return nparsed; } size_t parse_count_body (const char *buf, size_t len) { size_t nparsed; currently_parsing_eof = (len == 0); nparsed = http_parser_execute(&parser, &settings_count_body, buf, len); return nparsed; } size_t parse_pause (const char *buf, size_t len) { size_t nparsed; http_parser_settings s = settings_pause; currently_parsing_eof = (len == 0); current_pause_parser = &s; nparsed = http_parser_execute(&parser, current_pause_parser, buf, len); return nparsed; } size_t parse_connect (const char *buf, size_t len) { size_t nparsed; currently_parsing_eof = (len == 0); nparsed = http_parser_execute(&parser, &settings_connect, buf, len); return nparsed; } static inline int check_str_eq (const struct message *m, const char *prop, const char *expected, const char *found) { if ((expected == NULL) != (found == NULL)) { printf("\n*** Error: %s in '%s' ***\n\n", prop, m->name); printf("expected %s\n", (expected == NULL) ? "NULL" : expected); printf(" found %s\n", (found == NULL) ? 
"NULL" : found); return 0; } if (expected != NULL && 0 != strcmp(expected, found)) { printf("\n*** Error: %s in '%s' ***\n\n", prop, m->name); printf("expected '%s'\n", expected); printf(" found '%s'\n", found); return 0; } return 1; } static inline int check_num_eq (const struct message *m, const char *prop, int expected, int found) { if (expected != found) { printf("\n*** Error: %s in '%s' ***\n\n", prop, m->name); printf("expected %d\n", expected); printf(" found %d\n", found); return 0; } return 1; } #define MESSAGE_CHECK_STR_EQ(expected, found, prop) \ if (!check_str_eq(expected, #prop, expected->prop, found->prop)) return 0 #define MESSAGE_CHECK_NUM_EQ(expected, found, prop) \ if (!check_num_eq(expected, #prop, expected->prop, found->prop)) return 0 #define MESSAGE_CHECK_URL_EQ(u, expected, found, prop, fn) \ do { \ char ubuf[256]; \ \ if ((u)->field_set & (1 << (fn))) { \ memcpy(ubuf, (found)->request_url + (u)->field_data[(fn)].off, \ (u)->field_data[(fn)].len); \ ubuf[(u)->field_data[(fn)].len] = '\0'; \ } else { \ ubuf[0] = '\0'; \ } \ \ check_str_eq(expected, #prop, expected->prop, ubuf); \ } while(0) int message_eq (int index, int connect, const struct message *expected) { int i; struct message *m = &messages[index]; MESSAGE_CHECK_NUM_EQ(expected, m, http_major); MESSAGE_CHECK_NUM_EQ(expected, m, http_minor); if (expected->type == HTTP_REQUEST) { MESSAGE_CHECK_NUM_EQ(expected, m, method); } else { MESSAGE_CHECK_NUM_EQ(expected, m, status_code); MESSAGE_CHECK_STR_EQ(expected, m, response_status); assert(m->status_cb_called); } if (!connect) { MESSAGE_CHECK_NUM_EQ(expected, m, should_keep_alive); MESSAGE_CHECK_NUM_EQ(expected, m, message_complete_on_eof); } assert(m->message_begin_cb_called); assert(m->headers_complete_cb_called); assert(m->message_complete_cb_called); MESSAGE_CHECK_STR_EQ(expected, m, request_url); /* Check URL components; we can't do this w/ CONNECT since it doesn't * send us a well-formed URL. */ if (*m->request_url && m->method != HTTP_CONNECT) { struct http_parser_url u; if (http_parser_parse_url(m->request_url, strlen(m->request_url), 0, &u)) { fprintf(stderr, "\n\n*** failed to parse URL %s ***\n\n", m->request_url); abort(); } if (expected->host) { MESSAGE_CHECK_URL_EQ(&u, expected, m, host, UF_HOST); } if (expected->userinfo) { MESSAGE_CHECK_URL_EQ(&u, expected, m, userinfo, UF_USERINFO); } m->port = (u.field_set & (1 << UF_PORT)) ? 
u.port : 0; MESSAGE_CHECK_URL_EQ(&u, expected, m, query_string, UF_QUERY); MESSAGE_CHECK_URL_EQ(&u, expected, m, fragment, UF_FRAGMENT); MESSAGE_CHECK_URL_EQ(&u, expected, m, request_path, UF_PATH); MESSAGE_CHECK_NUM_EQ(expected, m, port); } if (connect) { check_num_eq(m, "body_size", 0, m->body_size); } else if (expected->body_size) { MESSAGE_CHECK_NUM_EQ(expected, m, body_size); } else { MESSAGE_CHECK_STR_EQ(expected, m, body); } if (connect) { check_num_eq(m, "num_chunks_complete", 0, m->num_chunks_complete); } else { assert(m->num_chunks == m->num_chunks_complete); MESSAGE_CHECK_NUM_EQ(expected, m, num_chunks_complete); for (i = 0; i < m->num_chunks && i < MAX_CHUNKS; i++) { MESSAGE_CHECK_NUM_EQ(expected, m, chunk_lengths[i]); } } MESSAGE_CHECK_NUM_EQ(expected, m, num_headers); int r; for (i = 0; i < m->num_headers; i++) { r = check_str_eq(expected, "header field", expected->headers[i][0], m->headers[i][0]); if (!r) return 0; r = check_str_eq(expected, "header value", expected->headers[i][1], m->headers[i][1]); if (!r) return 0; } if (!connect) { MESSAGE_CHECK_STR_EQ(expected, m, upgrade); } return 1; } /* Given a sequence of varargs messages, return the number of them that the * parser should successfully parse, taking into account that upgraded * messages prevent all subsequent messages from being parsed. */ size_t count_parsed_messages(const size_t nmsgs, ...) { size_t i; va_list ap; va_start(ap, nmsgs); for (i = 0; i < nmsgs; i++) { struct message *m = va_arg(ap, struct message *); if (m->upgrade) { va_end(ap); return i + 1; } } va_end(ap); return nmsgs; } /* Given a sequence of bytes and the number of these that we were able to * parse, verify that upgrade bodies are correct. */ void upgrade_message_fix(char *body, const size_t nread, const size_t nmsgs, ...) 
{ va_list ap; size_t i; size_t off = 0; va_start(ap, nmsgs); for (i = 0; i < nmsgs; i++) { struct message *m = va_arg(ap, struct message *); off += strlen(m->raw); if (m->upgrade) { off -= strlen(m->upgrade); /* Check the portion of the response after its specified upgrade */ if (!check_str_eq(m, "upgrade", body + off, body + nread)) { abort(); } /* Fix up the response so that message_eq() will verify the beginning * of the upgrade */ *(body + nread + strlen(m->upgrade)) = '\0'; messages[num_messages -1 ].upgrade = body + nread; va_end(ap); return; } } va_end(ap); printf("\n\n*** Error: expected a message with upgrade ***\n"); abort(); } static void print_error (const char *raw, size_t error_location) { fprintf(stderr, "\n*** %s ***\n\n", http_errno_description(HTTP_PARSER_ERRNO(&parser))); int this_line = 0, char_len = 0; size_t i, j, len = strlen(raw), error_location_line = 0; for (i = 0; i < len; i++) { if (i == error_location) this_line = 1; switch (raw[i]) { case '\r': char_len = 2; fprintf(stderr, "\\r"); break; case '\n': fprintf(stderr, "\\n\n"); if (this_line) goto print; error_location_line = 0; continue; default: char_len = 1; fputc(raw[i], stderr); break; } if (!this_line) error_location_line += char_len; } fprintf(stderr, "[eof]\n"); print: for (j = 0; j < error_location_line; j++) { fputc(' ', stderr); } fprintf(stderr, "^\n\nerror location: %u\n", (unsigned int)error_location); } void test_preserve_data (void) { char my_data[] = "application-specific data"; http_parser parser; parser.data = my_data; http_parser_init(&parser, HTTP_REQUEST); if (parser.data != my_data) { printf("\n*** parser.data not preserved accross http_parser_init ***\n\n"); abort(); } } struct url_test { const char *name; const char *url; int is_connect; struct http_parser_url u; int rv; }; const struct url_test url_tests[] = { {.name="proxy request" ,.url="http://hostname/" ,.is_connect=0 ,.u= {.field_set=(1 << UF_SCHEMA) | (1 << UF_HOST) | (1 << UF_PATH) ,.port=0 ,.field_data= {{ 0, 4 } /* UF_SCHEMA */ ,{ 7, 8 } /* UF_HOST */ ,{ 0, 0 } /* UF_PORT */ ,{ 15, 1 } /* UF_PATH */ ,{ 0, 0 } /* UF_QUERY */ ,{ 0, 0 } /* UF_FRAGMENT */ ,{ 0, 0 } /* UF_USERINFO */ } } ,.rv=0 } , {.name="proxy request with port" ,.url="http://hostname:444/" ,.is_connect=0 ,.u= {.field_set=(1 << UF_SCHEMA) | (1 << UF_HOST) | (1 << UF_PORT) | (1 << UF_PATH) ,.port=444 ,.field_data= {{ 0, 4 } /* UF_SCHEMA */ ,{ 7, 8 } /* UF_HOST */ ,{ 16, 3 } /* UF_PORT */ ,{ 19, 1 } /* UF_PATH */ ,{ 0, 0 } /* UF_QUERY */ ,{ 0, 0 } /* UF_FRAGMENT */ ,{ 0, 0 } /* UF_USERINFO */ } } ,.rv=0 } , {.name="CONNECT request" ,.url="hostname:443" ,.is_connect=1 ,.u= {.field_set=(1 << UF_HOST) | (1 << UF_PORT) ,.port=443 ,.field_data= {{ 0, 0 } /* UF_SCHEMA */ ,{ 0, 8 } /* UF_HOST */ ,{ 9, 3 } /* UF_PORT */ ,{ 0, 0 } /* UF_PATH */ ,{ 0, 0 } /* UF_QUERY */ ,{ 0, 0 } /* UF_FRAGMENT */ ,{ 0, 0 } /* UF_USERINFO */ } } ,.rv=0 } , {.name="CONNECT request but not connect" ,.url="hostname:443" ,.is_connect=0 ,.rv=1 } , {.name="proxy ipv6 request" ,.url="http://[1:2::3:4]/" ,.is_connect=0 ,.u= {.field_set=(1 << UF_SCHEMA) | (1 << UF_HOST) | (1 << UF_PATH) ,.port=0 ,.field_data= {{ 0, 4 } /* UF_SCHEMA */ ,{ 8, 8 } /* UF_HOST */ ,{ 0, 0 } /* UF_PORT */ ,{ 17, 1 } /* UF_PATH */ ,{ 0, 0 } /* UF_QUERY */ ,{ 0, 0 } /* UF_FRAGMENT */ ,{ 0, 0 } /* UF_USERINFO */ } } ,.rv=0 } , {.name="proxy ipv6 request with port" ,.url="http://[1:2::3:4]:67/" ,.is_connect=0 ,.u= {.field_set=(1 << UF_SCHEMA) | (1 << UF_HOST) | (1 << UF_PORT) | (1 << UF_PATH) ,.port=67 ,.field_data= {{ 0, 4 } /* 
UF_SCHEMA */ ,{ 8, 8 } /* UF_HOST */ ,{ 18, 2 } /* UF_PORT */ ,{ 20, 1 } /* UF_PATH */ ,{ 0, 0 } /* UF_QUERY */ ,{ 0, 0 } /* UF_FRAGMENT */ ,{ 0, 0 } /* UF_USERINFO */ } } ,.rv=0 } , {.name="CONNECT ipv6 address" ,.url="[1:2::3:4]:443" ,.is_connect=1 ,.u= {.field_set=(1 << UF_HOST) | (1 << UF_PORT) ,.port=443 ,.field_data= {{ 0, 0 } /* UF_SCHEMA */ ,{ 1, 8 } /* UF_HOST */ ,{ 11, 3 } /* UF_PORT */ ,{ 0, 0 } /* UF_PATH */ ,{ 0, 0 } /* UF_QUERY */ ,{ 0, 0 } /* UF_FRAGMENT */ ,{ 0, 0 } /* UF_USERINFO */ } } ,.rv=0 } , {.name="ipv4 in ipv6 address" ,.url="http://[2001:0000:0000:0000:0000:0000:1.9.1.1]/" ,.is_connect=0 ,.u= {.field_set=(1 << UF_SCHEMA) | (1 << UF_HOST) | (1 << UF_PATH) ,.port=0 ,.field_data= {{ 0, 4 } /* UF_SCHEMA */ ,{ 8, 37 } /* UF_HOST */ ,{ 0, 0 } /* UF_PORT */ ,{ 46, 1 } /* UF_PATH */ ,{ 0, 0 } /* UF_QUERY */ ,{ 0, 0 } /* UF_FRAGMENT */ ,{ 0, 0 } /* UF_USERINFO */ } } ,.rv=0 } , {.name="extra ? in query string" ,.url="http://a.tbcdn.cn/p/fp/2010c/??fp-header-min.css,fp-base-min.css," "fp-channel-min.css,fp-product-min.css,fp-mall-min.css,fp-category-min.css," "fp-sub-min.css,fp-gdp4p-min.css,fp-css3-min.css,fp-misc-min.css?t=20101022.css" ,.is_connect=0 ,.u= {.field_set=(1<field_set, u->port); for (i = 0; i < UF_MAX; i++) { if ((u->field_set & (1 << i)) == 0) { printf("\tfield_data[%u]: unset\n", i); continue; } printf("\tfield_data[%u]: off: %u len: %u part: \"%.*s\n\"", i, u->field_data[i].off, u->field_data[i].len, u->field_data[i].len, url + u->field_data[i].off); } } void test_parse_url (void) { struct http_parser_url u; const struct url_test *test; unsigned int i; int rv; for (i = 0; i < (sizeof(url_tests) / sizeof(url_tests[0])); i++) { test = &url_tests[i]; memset(&u, 0, sizeof(u)); rv = http_parser_parse_url(test->url, test->url ? 
strlen(test->url) : 0, test->is_connect, &u); if (test->rv == 0) { if (rv != 0) { printf("\n*** http_parser_parse_url(\"%s\") \"%s\" test failed, " "unexpected rv %d ***\n\n", test->url, test->name, rv); abort(); } if (memcmp(&u, &test->u, sizeof(u)) != 0) { printf("\n*** http_parser_parse_url(\"%s\") \"%s\" failed ***\n", test->url, test->name); printf("target http_parser_url:\n"); dump_url(test->url, &test->u); printf("result http_parser_url:\n"); dump_url(test->url, &u); abort(); } } else { /* test->rv != 0 */ if (rv == 0) { printf("\n*** http_parser_parse_url(\"%s\") \"%s\" test failed, " "unexpected rv %d ***\n\n", test->url, test->name, rv); abort(); } } } } void test_method_str (void) { assert(0 == strcmp("GET", http_method_str(HTTP_GET))); assert(0 == strcmp("", http_method_str(1337))); } void test_status_str (void) { assert(0 == strcmp("OK", http_status_str(HTTP_STATUS_OK))); assert(0 == strcmp("Not Found", http_status_str(HTTP_STATUS_NOT_FOUND))); assert(0 == strcmp("", http_status_str(1337))); } void test_message (const struct message *message) { size_t raw_len = strlen(message->raw); size_t msg1len; for (msg1len = 0; msg1len < raw_len; msg1len++) { parser_init(message->type); size_t read; const char *msg1 = message->raw; const char *msg2 = msg1 + msg1len; size_t msg2len = raw_len - msg1len; if (msg1len) { assert(num_messages == 0); messages[0].headers_complete_cb_called = FALSE; read = parse(msg1, msg1len); if (!messages[0].headers_complete_cb_called && parser.nread != read) { assert(parser.nread == read); print_error(msg1, read); abort(); } if (message->upgrade && parser.upgrade && num_messages > 0) { messages[num_messages - 1].upgrade = msg1 + read; goto test; } if (read != msg1len) { print_error(msg1, read); abort(); } } read = parse(msg2, msg2len); if (message->upgrade && parser.upgrade) { messages[num_messages - 1].upgrade = msg2 + read; goto test; } if (read != msg2len) { print_error(msg2, read); abort(); } read = parse(NULL, 0); if (read != 0) { print_error(message->raw, read); abort(); } test: if (num_messages != 1) { printf("\n*** num_messages != 1 after testing '%s' ***\n\n", message->name); abort(); } if(!message_eq(0, 0, message)) abort(); } } void test_message_count_body (const struct message *message) { parser_init(message->type); size_t read; size_t l = strlen(message->raw); size_t i, toread; size_t chunk = 4024; for (i = 0; i < l; i+= chunk) { toread = MIN(l-i, chunk); read = parse_count_body(message->raw + i, toread); if (read != toread) { print_error(message->raw, read); abort(); } } read = parse_count_body(NULL, 0); if (read != 0) { print_error(message->raw, read); abort(); } if (num_messages != 1) { printf("\n*** num_messages != 1 after testing '%s' ***\n\n", message->name); abort(); } if(!message_eq(0, 0, message)) abort(); } void test_simple_type (const char *buf, enum http_errno err_expected, enum http_parser_type type) { parser_init(type); enum http_errno err; parse(buf, strlen(buf)); err = HTTP_PARSER_ERRNO(&parser); parse(NULL, 0); /* In strict mode, allow us to pass with an unexpected HPE_STRICT as * long as the caller isn't expecting success. 
*/ #if HTTP_PARSER_STRICT if (err_expected != err && err_expected != HPE_OK && err != HPE_STRICT) { #else if (err_expected != err) { #endif fprintf(stderr, "\n*** test_simple expected %s, but saw %s ***\n\n%s\n", http_errno_name(err_expected), http_errno_name(err), buf); abort(); } } void test_simple (const char *buf, enum http_errno err_expected) { test_simple_type(buf, err_expected, HTTP_REQUEST); } void test_invalid_header_content (int req, const char* str) { http_parser parser; http_parser_init(&parser, req ? HTTP_REQUEST : HTTP_RESPONSE); size_t parsed; const char *buf; buf = req ? "GET / HTTP/1.1\r\n" : "HTTP/1.1 200 OK\r\n"; parsed = http_parser_execute(&parser, &settings_null, buf, strlen(buf)); assert(parsed == strlen(buf)); buf = str; size_t buflen = strlen(buf); parsed = http_parser_execute(&parser, &settings_null, buf, buflen); if (parsed != buflen) { assert(HTTP_PARSER_ERRNO(&parser) == HPE_INVALID_HEADER_TOKEN); return; } fprintf(stderr, "\n*** Error expected but none in invalid header content test ***\n"); abort(); } void test_invalid_header_field_content_error (int req) { test_invalid_header_content(req, "Foo: F\01ailure"); test_invalid_header_content(req, "Foo: B\02ar"); } void test_invalid_header_field (int req, const char* str) { http_parser parser; http_parser_init(&parser, req ? HTTP_REQUEST : HTTP_RESPONSE); size_t parsed; const char *buf; buf = req ? "GET / HTTP/1.1\r\n" : "HTTP/1.1 200 OK\r\n"; parsed = http_parser_execute(&parser, &settings_null, buf, strlen(buf)); assert(parsed == strlen(buf)); buf = str; size_t buflen = strlen(buf); parsed = http_parser_execute(&parser, &settings_null, buf, buflen); if (parsed != buflen) { assert(HTTP_PARSER_ERRNO(&parser) == HPE_INVALID_HEADER_TOKEN); return; } fprintf(stderr, "\n*** Error expected but none in invalid header token test ***\n"); abort(); } void test_invalid_header_field_token_error (int req) { test_invalid_header_field(req, "Fo@: Failure"); test_invalid_header_field(req, "Foo\01\test: Bar"); } void test_double_content_length_error (int req) { http_parser parser; http_parser_init(&parser, req ? HTTP_REQUEST : HTTP_RESPONSE); size_t parsed; const char *buf; buf = req ? "GET / HTTP/1.1\r\n" : "HTTP/1.1 200 OK\r\n"; parsed = http_parser_execute(&parser, &settings_null, buf, strlen(buf)); assert(parsed == strlen(buf)); buf = "Content-Length: 0\r\nContent-Length: 1\r\n\r\n"; size_t buflen = strlen(buf); parsed = http_parser_execute(&parser, &settings_null, buf, buflen); if (parsed != buflen) { assert(HTTP_PARSER_ERRNO(&parser) == HPE_UNEXPECTED_CONTENT_LENGTH); return; } fprintf(stderr, "\n*** Error expected but none in double content-length test ***\n"); abort(); } void test_chunked_content_length_error (int req) { http_parser parser; http_parser_init(&parser, req ? HTTP_REQUEST : HTTP_RESPONSE); size_t parsed; const char *buf; buf = req ? "GET / HTTP/1.1\r\n" : "HTTP/1.1 200 OK\r\n"; parsed = http_parser_execute(&parser, &settings_null, buf, strlen(buf)); assert(parsed == strlen(buf)); buf = "Transfer-Encoding: chunked\r\nContent-Length: 1\r\n\r\n"; size_t buflen = strlen(buf); parsed = http_parser_execute(&parser, &settings_null, buf, buflen); if (parsed != buflen) { assert(HTTP_PARSER_ERRNO(&parser) == HPE_UNEXPECTED_CONTENT_LENGTH); return; } fprintf(stderr, "\n*** Error expected but none in chunked content-length test ***\n"); abort(); } void test_header_cr_no_lf_error (int req) { http_parser parser; http_parser_init(&parser, req ? HTTP_REQUEST : HTTP_RESPONSE); size_t parsed; const char *buf; buf = req ? 
"GET / HTTP/1.1\r\n" : "HTTP/1.1 200 OK\r\n"; parsed = http_parser_execute(&parser, &settings_null, buf, strlen(buf)); assert(parsed == strlen(buf)); buf = "Foo: 1\rBar: 1\r\n\r\n"; size_t buflen = strlen(buf); parsed = http_parser_execute(&parser, &settings_null, buf, buflen); if (parsed != buflen) { assert(HTTP_PARSER_ERRNO(&parser) == HPE_LF_EXPECTED); return; } fprintf(stderr, "\n*** Error expected but none in header whitespace test ***\n"); abort(); } void test_no_overflow_parse_url (void) { int rv; struct http_parser_url u; http_parser_url_init(&u); rv = http_parser_parse_url("http://example.com:8001", 22, 0, &u); if (rv != 0) { fprintf(stderr, "\n*** test_no_overflow_parse_url invalid return value=%d\n", rv); abort(); } if (u.port != 800) { fprintf(stderr, "\n*** test_no_overflow_parse_url invalid port number=%d\n", u.port); abort(); } } void test_header_overflow_error (int req) { http_parser parser; http_parser_init(&parser, req ? HTTP_REQUEST : HTTP_RESPONSE); size_t parsed; const char *buf; buf = req ? "GET / HTTP/1.1\r\n" : "HTTP/1.0 200 OK\r\n"; parsed = http_parser_execute(&parser, &settings_null, buf, strlen(buf)); assert(parsed == strlen(buf)); buf = "header-key: header-value\r\n"; size_t buflen = strlen(buf); int i; for (i = 0; i < 10000; i++) { parsed = http_parser_execute(&parser, &settings_null, buf, buflen); if (parsed != buflen) { //fprintf(stderr, "error found on iter %d\n", i); assert(HTTP_PARSER_ERRNO(&parser) == HPE_HEADER_OVERFLOW); return; } } fprintf(stderr, "\n*** Error expected but none in header overflow test ***\n"); abort(); } void test_header_nread_value () { http_parser parser; http_parser_init(&parser, HTTP_REQUEST); size_t parsed; const char *buf; buf = "GET / HTTP/1.1\r\nheader: value\nhdr: value\r\n"; parsed = http_parser_execute(&parser, &settings_null, buf, strlen(buf)); assert(parsed == strlen(buf)); assert(parser.nread == strlen(buf)); } static void test_content_length_overflow (const char *buf, size_t buflen, int expect_ok) { http_parser parser; http_parser_init(&parser, HTTP_RESPONSE); http_parser_execute(&parser, &settings_null, buf, buflen); if (expect_ok) assert(HTTP_PARSER_ERRNO(&parser) == HPE_OK); else assert(HTTP_PARSER_ERRNO(&parser) == HPE_INVALID_CONTENT_LENGTH); } void test_header_content_length_overflow_error (void) { #define X(size) \ "HTTP/1.1 200 OK\r\n" \ "Content-Length: " #size "\r\n" \ "\r\n" const char a[] = X(1844674407370955160); /* 2^64 / 10 - 1 */ const char b[] = X(18446744073709551615); /* 2^64-1 */ const char c[] = X(18446744073709551616); /* 2^64 */ #undef X test_content_length_overflow(a, sizeof(a) - 1, 1); /* expect ok */ test_content_length_overflow(b, sizeof(b) - 1, 0); /* expect failure */ test_content_length_overflow(c, sizeof(c) - 1, 0); /* expect failure */ } void test_chunk_content_length_overflow_error (void) { #define X(size) \ "HTTP/1.1 200 OK\r\n" \ "Transfer-Encoding: chunked\r\n" \ "\r\n" \ #size "\r\n" \ "..." const char a[] = X(FFFFFFFFFFFFFFE); /* 2^64 / 16 - 1 */ const char b[] = X(FFFFFFFFFFFFFFFF); /* 2^64-1 */ const char c[] = X(10000000000000000); /* 2^64 */ #undef X test_content_length_overflow(a, sizeof(a) - 1, 1); /* expect ok */ test_content_length_overflow(b, sizeof(b) - 1, 0); /* expect failure */ test_content_length_overflow(c, sizeof(c) - 1, 0); /* expect failure */ } void test_no_overflow_long_body (int req, size_t length) { http_parser parser; http_parser_init(&parser, req ? 
HTTP_REQUEST : HTTP_RESPONSE); size_t parsed; size_t i; char buf1[3000]; size_t buf1len = sprintf(buf1, "%s\r\nConnection: Keep-Alive\r\nContent-Length: %lu\r\n\r\n", req ? "POST / HTTP/1.0" : "HTTP/1.0 200 OK", (unsigned long)length); parsed = http_parser_execute(&parser, &settings_null, buf1, buf1len); if (parsed != buf1len) goto err; for (i = 0; i < length; i++) { char foo = 'a'; parsed = http_parser_execute(&parser, &settings_null, &foo, 1); if (parsed != 1) goto err; } parsed = http_parser_execute(&parser, &settings_null, buf1, buf1len); if (parsed != buf1len) goto err; return; err: fprintf(stderr, "\n*** error in test_no_overflow_long_body %s of length %lu ***\n", req ? "REQUEST" : "RESPONSE", (unsigned long)length); abort(); } void test_multiple3 (const struct message *r1, const struct message *r2, const struct message *r3) { int message_count = count_parsed_messages(3, r1, r2, r3); char total[ strlen(r1->raw) + strlen(r2->raw) + strlen(r3->raw) + 1 ]; total[0] = '\0'; strcat(total, r1->raw); strcat(total, r2->raw); strcat(total, r3->raw); parser_init(r1->type); size_t read; read = parse(total, strlen(total)); if (parser.upgrade) { upgrade_message_fix(total, read, 3, r1, r2, r3); goto test; } if (read != strlen(total)) { print_error(total, read); abort(); } read = parse(NULL, 0); if (read != 0) { print_error(total, read); abort(); } test: if (message_count != num_messages) { fprintf(stderr, "\n\n*** Parser didn't see 3 messages only %d *** \n", num_messages); abort(); } if (!message_eq(0, 0, r1)) abort(); if (message_count > 1 && !message_eq(1, 0, r2)) abort(); if (message_count > 2 && !message_eq(2, 0, r3)) abort(); } /* SCAN through every possible breaking to make sure the * parser can handle getting the content in any chunks that * might come from the socket */ void test_scan (const struct message *r1, const struct message *r2, const struct message *r3) { char total[80*1024] = "\0"; char buf1[80*1024] = "\0"; char buf2[80*1024] = "\0"; char buf3[80*1024] = "\0"; strcat(total, r1->raw); strcat(total, r2->raw); strcat(total, r3->raw); size_t read; int total_len = strlen(total); int total_ops = 2 * (total_len - 1) * (total_len - 2) / 2; int ops = 0 ; size_t buf1_len, buf2_len, buf3_len; int message_count = count_parsed_messages(3, r1, r2, r3); int i,j,type_both; for (type_both = 0; type_both < 2; type_both ++ ) { for (j = 2; j < total_len; j ++ ) { for (i = 1; i < j; i ++ ) { if (ops % 1000 == 0) { printf("\b\b\b\b%3.0f%%", 100 * (float)ops /(float)total_ops); fflush(stdout); } ops += 1; parser_init(type_both ? 
HTTP_BOTH : r1->type); buf1_len = i; strlncpy(buf1, sizeof(buf1), total, buf1_len); buf1[buf1_len] = 0; buf2_len = j - i; strlncpy(buf2, sizeof(buf1), total+i, buf2_len); buf2[buf2_len] = 0; buf3_len = total_len - j; strlncpy(buf3, sizeof(buf1), total+j, buf3_len); buf3[buf3_len] = 0; assert(num_messages == 0); messages[0].headers_complete_cb_called = FALSE; read = parse(buf1, buf1_len); if (!messages[0].headers_complete_cb_called && parser.nread != read) { print_error(buf1, read); goto error; } if (parser.upgrade) goto test; if (read != buf1_len) { print_error(buf1, read); goto error; } read += parse(buf2, buf2_len); if (parser.upgrade) goto test; if (read != buf1_len + buf2_len) { print_error(buf2, read); goto error; } read += parse(buf3, buf3_len); if (parser.upgrade) goto test; if (read != buf1_len + buf2_len + buf3_len) { print_error(buf3, read); goto error; } parse(NULL, 0); test: if (parser.upgrade) { upgrade_message_fix(total, read, 3, r1, r2, r3); } if (message_count != num_messages) { fprintf(stderr, "\n\nParser didn't see %d messages only %d\n", message_count, num_messages); goto error; } if (!message_eq(0, 0, r1)) { fprintf(stderr, "\n\nError matching messages[0] in test_scan.\n"); goto error; } if (message_count > 1 && !message_eq(1, 0, r2)) { fprintf(stderr, "\n\nError matching messages[1] in test_scan.\n"); goto error; } if (message_count > 2 && !message_eq(2, 0, r3)) { fprintf(stderr, "\n\nError matching messages[2] in test_scan.\n"); goto error; } } } } puts("\b\b\b\b100%"); return; error: fprintf(stderr, "i=%d j=%d\n", i, j); fprintf(stderr, "buf1 (%u) %s\n\n", (unsigned int)buf1_len, buf1); fprintf(stderr, "buf2 (%u) %s\n\n", (unsigned int)buf2_len , buf2); fprintf(stderr, "buf3 (%u) %s\n", (unsigned int)buf3_len, buf3); abort(); } // user required to free the result // string terminated by \0 char * create_large_chunked_message (int body_size_in_kb, const char* headers) { int i; size_t wrote = 0; size_t headers_len = strlen(headers); size_t bufsize = headers_len + (5+1024+2)*body_size_in_kb + 6; char * buf = malloc(bufsize); memcpy(buf, headers, headers_len); wrote += headers_len; for (i = 0; i < body_size_in_kb; i++) { // write 1kb chunk into the body. memcpy(buf + wrote, "400\r\n", 5); wrote += 5; memset(buf + wrote, 'C', 1024); wrote += 1024; strcpy(buf + wrote, "\r\n"); wrote += 2; } memcpy(buf + wrote, "0\r\n\r\n", 6); wrote += 6; assert(wrote == bufsize); return buf; } /* Verify that we can pause parsing at any of the bytes in the * message and still get the result that we're expecting. */ void test_message_pause (const struct message *msg) { char *buf = (char*) msg->raw; size_t buflen = strlen(msg->raw); size_t nread; parser_init(msg->type); do { nread = parse_pause(buf, buflen); // We can only set the upgrade buffer once we've gotten our message // completion callback. 
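    // NOTE (added comment; assumes parse_pause(), defined earlier in this
    // file, installs callbacks that pause the parser via
    // http_parser_pause(&parser, 1)): each http_parser_execute() call then
    // stops early and reports HPE_PAUSED, and the loop below clears the
    // pause with http_parser_pause(&parser, 0) before re-feeding the bytes
    // that were not consumed.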
if (messages[0].message_complete_cb_called && msg->upgrade && parser.upgrade) { messages[0].upgrade = buf + nread; goto test; } if (nread < buflen) { // Not much do to if we failed a strict-mode check if (HTTP_PARSER_ERRNO(&parser) == HPE_STRICT) { return; } assert (HTTP_PARSER_ERRNO(&parser) == HPE_PAUSED); } buf += nread; buflen -= nread; http_parser_pause(&parser, 0); } while (buflen > 0); nread = parse_pause(NULL, 0); assert (nread == 0); test: if (num_messages != 1) { printf("\n*** num_messages != 1 after testing '%s' ***\n\n", msg->name); abort(); } if(!message_eq(0, 0, msg)) abort(); } /* Verify that body and next message won't be parsed in responses to CONNECT */ void test_message_connect (const struct message *msg) { char *buf = (char*) msg->raw; size_t buflen = strlen(msg->raw); parser_init(msg->type); parse_connect(buf, buflen); if (num_messages != 1) { printf("\n*** num_messages != 1 after testing '%s' ***\n\n", msg->name); abort(); } if(!message_eq(0, 1, msg)) abort(); } int main (void) { unsigned i, j, k; unsigned long version; unsigned major; unsigned minor; unsigned patch; version = http_parser_version(); major = (version >> 16) & 255; minor = (version >> 8) & 255; patch = version & 255; printf("http_parser v%u.%u.%u (0x%06lx)\n", major, minor, patch, version); printf("sizeof(http_parser) = %u\n", (unsigned int)sizeof(http_parser)); //// API test_preserve_data(); test_parse_url(); test_method_str(); test_status_str(); //// NREAD test_header_nread_value(); //// OVERFLOW CONDITIONS test_no_overflow_parse_url(); test_header_overflow_error(HTTP_REQUEST); test_no_overflow_long_body(HTTP_REQUEST, 1000); test_no_overflow_long_body(HTTP_REQUEST, 100000); test_header_overflow_error(HTTP_RESPONSE); test_no_overflow_long_body(HTTP_RESPONSE, 1000); test_no_overflow_long_body(HTTP_RESPONSE, 100000); test_header_content_length_overflow_error(); test_chunk_content_length_overflow_error(); //// HEADER FIELD CONDITIONS test_double_content_length_error(HTTP_REQUEST); test_chunked_content_length_error(HTTP_REQUEST); test_header_cr_no_lf_error(HTTP_REQUEST); test_invalid_header_field_token_error(HTTP_REQUEST); test_invalid_header_field_content_error(HTTP_REQUEST); test_double_content_length_error(HTTP_RESPONSE); test_chunked_content_length_error(HTTP_RESPONSE); test_header_cr_no_lf_error(HTTP_RESPONSE); test_invalid_header_field_token_error(HTTP_RESPONSE); test_invalid_header_field_content_error(HTTP_RESPONSE); test_simple_type( "POST / HTTP/1.1\r\n" "Content-Length: 42 \r\n" // Note the surrounding whitespace. 
"\r\n", HPE_OK, HTTP_REQUEST); test_simple_type( "POST / HTTP/1.1\r\n" "Content-Length: 4 2\r\n" "\r\n", HPE_INVALID_CONTENT_LENGTH, HTTP_REQUEST); test_simple_type( "POST / HTTP/1.1\r\n" "Content-Length: 13 37\r\n" "\r\n", HPE_INVALID_CONTENT_LENGTH, HTTP_REQUEST); //// RESPONSES test_simple_type("HTP/1.1 200 OK\r\n\r\n", HPE_INVALID_VERSION, HTTP_RESPONSE); test_simple_type("HTTP/01.1 200 OK\r\n\r\n", HPE_INVALID_VERSION, HTTP_RESPONSE); test_simple_type("HTTP/11.1 200 OK\r\n\r\n", HPE_INVALID_VERSION, HTTP_RESPONSE); test_simple_type("HTTP/1.01 200 OK\r\n\r\n", HPE_INVALID_VERSION, HTTP_RESPONSE); test_simple_type("HTTP/1.1\t200 OK\r\n\r\n", HPE_INVALID_VERSION, HTTP_RESPONSE); test_simple_type("\rHTTP/1.1\t200 OK\r\n\r\n", HPE_INVALID_VERSION, HTTP_RESPONSE); for (i = 0; i < ARRAY_SIZE(responses); i++) { test_message(&responses[i]); } for (i = 0; i < ARRAY_SIZE(responses); i++) { test_message_pause(&responses[i]); } for (i = 0; i < ARRAY_SIZE(responses); i++) { test_message_connect(&responses[i]); } for (i = 0; i < ARRAY_SIZE(responses); i++) { if (!responses[i].should_keep_alive) continue; for (j = 0; j < ARRAY_SIZE(responses); j++) { if (!responses[j].should_keep_alive) continue; for (k = 0; k < ARRAY_SIZE(responses); k++) { test_multiple3(&responses[i], &responses[j], &responses[k]); } } } test_message_count_body(&responses[NO_HEADERS_NO_BODY_404]); test_message_count_body(&responses[TRAILING_SPACE_ON_CHUNKED_BODY]); // test very large chunked response { char * msg = create_large_chunked_message(31337, "HTTP/1.0 200 OK\r\n" "Transfer-Encoding: chunked\r\n" "Content-Type: text/plain\r\n" "\r\n"); struct message large_chunked = {.name= "large chunked" ,.type= HTTP_RESPONSE ,.raw= msg ,.should_keep_alive= FALSE ,.message_complete_on_eof= FALSE ,.http_major= 1 ,.http_minor= 0 ,.status_code= 200 ,.response_status= "OK" ,.num_headers= 2 ,.headers= { { "Transfer-Encoding", "chunked" } , { "Content-Type", "text/plain" } } ,.body_size= 31337*1024 ,.num_chunks_complete= 31338 }; for (i = 0; i < MAX_CHUNKS; i++) { large_chunked.chunk_lengths[i] = 1024; } test_message_count_body(&large_chunked); free(msg); } printf("response scan 1/2 "); test_scan( &responses[TRAILING_SPACE_ON_CHUNKED_BODY] , &responses[NO_BODY_HTTP10_KA_204] , &responses[NO_REASON_PHRASE] ); printf("response scan 2/2 "); test_scan( &responses[BONJOUR_MADAME_FR] , &responses[UNDERSTORE_HEADER_KEY] , &responses[NO_CARRIAGE_RET] ); puts("responses okay"); /// REQUESTS test_simple("GET / HTP/1.1\r\n\r\n", HPE_INVALID_VERSION); test_simple("GET / HTTP/01.1\r\n\r\n", HPE_INVALID_VERSION); test_simple("GET / HTTP/11.1\r\n\r\n", HPE_INVALID_VERSION); test_simple("GET / HTTP/1.01\r\n\r\n", HPE_INVALID_VERSION); // Extended characters - see nodejs/test/parallel/test-http-headers-obstext.js test_simple("GET / HTTP/1.1\r\n" "Test: Düsseldorf\r\n", HPE_OK); // Well-formed but incomplete test_simple("GET / HTTP/1.1\r\n" "Content-Type: text/plain\r\n" "Content-Length: 6\r\n" "\r\n" "fooba", HPE_OK); static const char *all_methods[] = { "DELETE", "GET", "HEAD", "POST", "PUT", //"CONNECT", //CONNECT can't be tested like other methods, it's a tunnel "OPTIONS", "TRACE", "COPY", "LOCK", "MKCOL", "MOVE", "PROPFIND", "PROPPATCH", "SEARCH", "UNLOCK", "BIND", "REBIND", "UNBIND", "ACL", "REPORT", "MKACTIVITY", "CHECKOUT", "MERGE", "M-SEARCH", "NOTIFY", "SUBSCRIBE", "UNSUBSCRIBE", "PATCH", "PURGE", "MKCALENDAR", "LINK", "UNLINK", 0 }; const char **this_method; for (this_method = all_methods; *this_method; this_method++) { char buf[200]; sprintf(buf, 
"%s / HTTP/1.1\r\n\r\n", *this_method); test_simple(buf, HPE_OK); } static const char *bad_methods[] = { "ASDF", "C******", "COLA", "GEM", "GETA", "M****", "MKCOLA", "PROPPATCHA", "PUN", "PX", "SA", "hello world", 0 }; for (this_method = bad_methods; *this_method; this_method++) { char buf[200]; sprintf(buf, "%s / HTTP/1.1\r\n\r\n", *this_method); test_simple(buf, HPE_INVALID_METHOD); } // illegal header field name line folding test_simple("GET / HTTP/1.1\r\n" "name\r\n" " : value\r\n" "\r\n", HPE_INVALID_HEADER_TOKEN); const char *dumbluck2 = "GET / HTTP/1.1\r\n" "X-SSL-Nonsense: -----BEGIN CERTIFICATE-----\r\n" "\tMIIFbTCCBFWgAwIBAgICH4cwDQYJKoZIhvcNAQEFBQAwcDELMAkGA1UEBhMCVUsx\r\n" "\tETAPBgNVBAoTCGVTY2llbmNlMRIwEAYDVQQLEwlBdXRob3JpdHkxCzAJBgNVBAMT\r\n" "\tAkNBMS0wKwYJKoZIhvcNAQkBFh5jYS1vcGVyYXRvckBncmlkLXN1cHBvcnQuYWMu\r\n" "\tdWswHhcNMDYwNzI3MTQxMzI4WhcNMDcwNzI3MTQxMzI4WjBbMQswCQYDVQQGEwJV\r\n" "\tSzERMA8GA1UEChMIZVNjaWVuY2UxEzARBgNVBAsTCk1hbmNoZXN0ZXIxCzAJBgNV\r\n" "\tBAcTmrsogriqMWLAk1DMRcwFQYDVQQDEw5taWNoYWVsIHBhcmQYJKoZIhvcNAQEB\r\n" "\tBQADggEPADCCAQoCggEBANPEQBgl1IaKdSS1TbhF3hEXSl72G9J+WC/1R64fAcEF\r\n" "\tW51rEyFYiIeZGx/BVzwXbeBoNUK41OK65sxGuflMo5gLflbwJtHBRIEKAfVVp3YR\r\n" "\tgW7cMA/s/XKgL1GEC7rQw8lIZT8RApukCGqOVHSi/F1SiFlPDxuDfmdiNzL31+sL\r\n" "\t0iwHDdNkGjy5pyBSB8Y79dsSJtCW/iaLB0/n8Sj7HgvvZJ7x0fr+RQjYOUUfrePP\r\n" "\tu2MSpFyf+9BbC/aXgaZuiCvSR+8Snv3xApQY+fULK/xY8h8Ua51iXoQ5jrgu2SqR\r\n" "\twgA7BUi3G8LFzMBl8FRCDYGUDy7M6QaHXx1ZWIPWNKsCAwEAAaOCAiQwggIgMAwG\r\n" "\tA1UdEwEB/wQCMAAwEQYJYIZIAYb4QgHTTPAQDAgWgMA4GA1UdDwEB/wQEAwID6DAs\r\n" "\tBglghkgBhvhCAQ0EHxYdVUsgZS1TY2llbmNlIFVzZXIgQ2VydGlmaWNhdGUwHQYD\r\n" "\tVR0OBBYEFDTt/sf9PeMaZDHkUIldrDYMNTBZMIGaBgNVHSMEgZIwgY+AFAI4qxGj\r\n" "\tloCLDdMVKwiljjDastqooXSkcjBwMQswCQYDVQQGEwJVSzERMA8GA1UEChMIZVNj\r\n" "\taWVuY2UxEjAQBgNVBAsTCUF1dGhvcml0eTELMAkGA1UEAxMCQ0ExLTArBgkqhkiG\r\n" "\t9w0BCQEWHmNhLW9wZXJhdG9yQGdyaWQtc3VwcG9ydC5hYy51a4IBADApBgNVHRIE\r\n" "\tIjAggR5jYS1vcGVyYXRvckBncmlkLXN1cHBvcnQuYWMudWswGQYDVR0gBBIwEDAO\r\n" "\tBgwrBgEEAdkvAQEBAQYwPQYJYIZIAYb4QgEEBDAWLmh0dHA6Ly9jYS5ncmlkLXN1\r\n" "\tcHBvcnQuYWMudmT4sopwqlBWsvcHViL2NybC9jYWNybC5jcmwwPQYJYIZIAYb4QgEDBDAWLmh0\r\n" "\tdHA6Ly9jYS5ncmlkLXN1cHBvcnQuYWMudWsvcHViL2NybC9jYWNybC5jcmwwPwYD\r\n" "\tVR0fBDgwNjA0oDKgMIYuaHR0cDovL2NhLmdyaWQt5hYy51ay9wdWIv\r\n" "\tY3JsL2NhY3JsLmNybDANBgkqhkiG9w0BAQUFAAOCAQEAS/U4iiooBENGW/Hwmmd3\r\n" "\tXCy6Zrt08YjKCzGNjorT98g8uGsqYjSxv/hmi0qlnlHs+k/3Iobc3LjS5AMYr5L8\r\n" "\tUO7OSkgFFlLHQyC9JzPfmLCAugvzEbyv4Olnsr8hbxF1MbKZoQxUZtMVu29wjfXk\r\n" "\thTeApBv7eaKCWpSp7MCbvgzm74izKhu3vlDk9w6qVrxePfGgpKPqfHiOoGhFnbTK\r\n" "\twTC6o2xq5y0qZ03JonF7OJspEd3I5zKY3E+ov7/ZhW6DqT8UFvsAdjvQbXyhV8Eu\r\n" "\tYhixw1aKEPzNjNowuIseVogKOLXxWI5vAi5HgXdS0/ES5gDGsABo4fqovUKlgop3\r\n" "\tRA==\r\n" "\t-----END CERTIFICATE-----\r\n" "\r\n"; test_simple(dumbluck2, HPE_OK); const char *corrupted_connection = "GET / HTTP/1.1\r\n" "Host: www.example.com\r\n" "Connection\r\033\065\325eep-Alive\r\n" "Accept-Encoding: gzip\r\n" "\r\n"; test_simple(corrupted_connection, HPE_INVALID_HEADER_TOKEN); const char *corrupted_header_name = "GET / HTTP/1.1\r\n" "Host: www.example.com\r\n" "X-Some-Header\r\033\065\325eep-Alive\r\n" "Accept-Encoding: gzip\r\n" "\r\n"; test_simple(corrupted_header_name, HPE_INVALID_HEADER_TOKEN); #if 0 // NOTE(Wed Nov 18 11:57:27 CET 2009) this seems okay. we just read body // until EOF. 
// // no content-length // error if there is a body without content length const char *bad_get_no_headers_no_body = "GET /bad_get_no_headers_no_body/world HTTP/1.1\r\n" "Accept: */*\r\n" "\r\n" "HELLO"; test_simple(bad_get_no_headers_no_body, 0); #endif /* TODO sending junk and large headers gets rejected */ /* check to make sure our predefined requests are okay */ for (i = 0; i < ARRAY_SIZE(requests); i++) { test_message(&requests[i]); } for (i = 0; i < ARRAY_SIZE(requests); i++) { test_message_pause(&requests[i]); } for (i = 0; i < ARRAY_SIZE(requests); i++) { if (!requests[i].should_keep_alive) continue; for (j = 0; j < ARRAY_SIZE(requests); j++) { if (!requests[j].should_keep_alive) continue; for (k = 0; k < ARRAY_SIZE(requests); k++) { test_multiple3(&requests[i], &requests[j], &requests[k]); } } } printf("request scan 1/4 "); test_scan( &requests[GET_NO_HEADERS_NO_BODY] , &requests[GET_ONE_HEADER_NO_BODY] , &requests[GET_NO_HEADERS_NO_BODY] ); printf("request scan 2/4 "); test_scan( &requests[POST_CHUNKED_ALL_YOUR_BASE] , &requests[POST_IDENTITY_BODY_WORLD] , &requests[GET_FUNKY_CONTENT_LENGTH] ); printf("request scan 3/4 "); test_scan( &requests[TWO_CHUNKS_MULT_ZERO_END] , &requests[CHUNKED_W_TRAILING_HEADERS] , &requests[CHUNKED_W_NONSENSE_AFTER_LENGTH] ); printf("request scan 4/4 "); test_scan( &requests[QUERY_URL_WITH_QUESTION_MARK_GET] , &requests[PREFIX_NEWLINE_GET ] , &requests[CONNECT_REQUEST] ); puts("requests okay"); return 0; }
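/* Illustrative sketch (added alongside the tests above; not part of the
 * upstream http-parser test suite).  It shows, in minimal form, the callback
 * pattern the tests exercise: zero an http_parser_settings struct, register a
 * data callback, and push bytes through http_parser_execute().  The names
 * demo_on_url and demo_minimal_parse are assumptions introduced only for this
 * example. */

int
demo_on_url (http_parser *p, const char *at, size_t length)
{
  (void) p;
  /* Callback data is a (pointer, length) slice of the caller's buffer; it is
   * not NUL-terminated and may arrive in several pieces if the request is fed
   * to the parser in chunks. */
  printf("url: %.*s\n", (int) length, at);
  return 0;  /* a non-zero return would make the parser stop with an error */
}

void
demo_minimal_parse (void)
{
  const char *req = "GET /demo HTTP/1.1\r\nHost: example.org\r\n\r\n";
  http_parser demo_parser;
  http_parser_settings demo_settings;

  /* Unset callbacks are simply skipped by the parser. */
  memset(&demo_settings, 0, sizeof(demo_settings));
  demo_settings.on_url = demo_on_url;

  http_parser_init(&demo_parser, HTTP_REQUEST);
  size_t nparsed = http_parser_execute(&demo_parser, &demo_settings,
                                       req, strlen(req));

  /* A complete, well-formed request should be consumed in full and leave no
   * error recorded on the parser. */
  assert(nparsed == strlen(req));
  assert(HTTP_PARSER_ERRNO(&demo_parser) == HPE_OK);
}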