multidict-4.7.3/0000755000175100001650000000000013602414216014012 5ustar vstsdocker00000000000000multidict-4.7.3/CHANGES.rst0000644000175100001650000001002513602414211015605 0ustar vstsdocker00000000000000========= Changelog ========= .. You should *NOT* be adding new change log entries to this file, this file is managed by towncrier. You *may* edit previous change logs to fix problems like typo corrections or such. To add a new change log entry, please see https://pip.pypa.io/en/latest/development/#adding-a-news-entry we named the news folder "changes". WARNING: Don't drop the next directive! .. towncrier release notes start 4.7.3 (2019-12-30) ================== Features -------- - Implement ``__sizeof__`` function to correctly calculate all internal structures size. `#444 `_ - Expose ``getversion()`` function. `#451 `_ Bugfixes -------- - Fix crashes in ``popone``/``popall`` when default is returned. `#450 `_ Improved Documentation ---------------------- - Corrected the documentation for ``MultiDict.extend()`` `#446 `_ ---- 4.7.2 (2019-12-20) ================== Bugfixes -------- - Fix crashing when multidict is used pyinstaller `#432 `_ - Fix typing for `CIMultiDict.copy` `#434 `_ - Fix memory leak in ``MultiDict.copy()`` `#443 `_ ---- 4.7.1 (2019-12-12) ================== Bugfixes -------- - `CIMultiDictProxy.copy` return object type `multidict._multidict.CIMultiDict` `#427 `_ - Make `CIMultiDict` subclassable again `#416 `_ - Fix regression, multidict can be constructed from arbitrary iterable of pairs again. `#418 `_ - `CIMultiDict.add` may be called with keyword arguments `#421 `_ Improved Documentation ---------------------- - Mention ``MULTIDICT_NO_EXTENSIONS`` environment variable in docs. `#393 `_ - Document the fact that ``istr`` preserves the casing of argument untouched but uses internal lower-cased copy for keys comparison. `#419 `_ ---- 4.7.0 (2019-12-10) ================== Features -------- - Replace Cython optimization with pure C `#249 `_ - Implement ``__length_hint__()`` for iterators `#310 `_ - Support the MultiDict[str] generic specialization in the runtime. `#392 `_ - Embed pair_list_t structure into MultiDict Python object `#395 `_ - Embed multidict pairs for small dictionaries to amortize the memory usage. `#396 `_ - Support weak references to C Extension classes. `#399 `_ - Add docstrings to provided classes. `#400 `_ - Merge ``multidict._istr`` back with ``multidict._multidict``. `#409 `_ Bugfixes -------- - Explicitly call ``tp_free`` slot on deallocation. `#407 `_ - Return class from __class_getitem__ to simplify subclassing `#413 `_ ---- 4.6.1 (2019-11-21) ==================== Bugfixes -------- - Fix PyPI link for GitHub Issues badge. `#391 `_ 4.6.0 (2019-11-20) ==================== Bugfixes -------- - Fix GC object tracking. `#314 `_ - Preserve the case of `istr` strings. `#374 `_ - Generate binary wheels for Python 3.8. multidict-4.7.3/LICENSE0000644000175100001650000002612513602414211015020 0ustar vstsdocker00000000000000 Apache License Version 2.0, January 2004 http://www.apache.org/licenses/ TERMS AND CONDITIONS FOR USE, REPRODUCTION, AND DISTRIBUTION 1. Definitions. "License" shall mean the terms and conditions for use, reproduction, and distribution as defined by Sections 1 through 9 of this document. "Licensor" shall mean the copyright owner or entity authorized by the copyright owner that is granting the License. 
"Legal Entity" shall mean the union of the acting entity and all other entities that control, are controlled by, or are under common control with that entity. For the purposes of this definition, "control" means (i) the power, direct or indirect, to cause the direction or management of such entity, whether by contract or otherwise, or (ii) ownership of fifty percent (50%) or more of the outstanding shares, or (iii) beneficial ownership of such entity. "You" (or "Your") shall mean an individual or Legal Entity exercising permissions granted by this License. "Source" form shall mean the preferred form for making modifications, including but not limited to software source code, documentation source, and configuration files. "Object" form shall mean any form resulting from mechanical transformation or translation of a Source form, including but not limited to compiled object code, generated documentation, and conversions to other media types. "Work" shall mean the work of authorship, whether in Source or Object form, made available under the License, as indicated by a copyright notice that is included in or attached to the work (an example is provided in the Appendix below). "Derivative Works" shall mean any work, whether in Source or Object form, that is based on (or derived from) the Work and for which the editorial revisions, annotations, elaborations, or other modifications represent, as a whole, an original work of authorship. For the purposes of this License, Derivative Works shall not include works that remain separable from, or merely link (or bind by name) to the interfaces of, the Work and Derivative Works thereof. "Contribution" shall mean any work of authorship, including the original version of the Work and any modifications or additions to that Work or Derivative Works thereof, that is intentionally submitted to Licensor for inclusion in the Work by the copyright owner or by an individual or Legal Entity authorized to submit on behalf of the copyright owner. For the purposes of this definition, "submitted" means any form of electronic, verbal, or written communication sent to the Licensor or its representatives, including but not limited to communication on electronic mailing lists, source code control systems, and issue tracking systems that are managed by, or on behalf of, the Licensor for the purpose of discussing and improving the Work, but excluding communication that is conspicuously marked or otherwise designated in writing by the copyright owner as "Not a Contribution." "Contributor" shall mean Licensor and any individual or Legal Entity on behalf of whom a Contribution has been received by Licensor and subsequently incorporated within the Work. 2. Grant of Copyright License. Subject to the terms and conditions of this License, each Contributor hereby grants to You a perpetual, worldwide, non-exclusive, no-charge, royalty-free, irrevocable copyright license to reproduce, prepare Derivative Works of, publicly display, publicly perform, sublicense, and distribute the Work and such Derivative Works in Source or Object form. 3. Grant of Patent License. 
Subject to the terms and conditions of this License, each Contributor hereby grants to You a perpetual, worldwide, non-exclusive, no-charge, royalty-free, irrevocable (except as stated in this section) patent license to make, have made, use, offer to sell, sell, import, and otherwise transfer the Work, where such license applies only to those patent claims licensable by such Contributor that are necessarily infringed by their Contribution(s) alone or by combination of their Contribution(s) with the Work to which such Contribution(s) was submitted. If You institute patent litigation against any entity (including a cross-claim or counterclaim in a lawsuit) alleging that the Work or a Contribution incorporated within the Work constitutes direct or contributory patent infringement, then any patent licenses granted to You under this License for that Work shall terminate as of the date such litigation is filed. 4. Redistribution. You may reproduce and distribute copies of the Work or Derivative Works thereof in any medium, with or without modifications, and in Source or Object form, provided that You meet the following conditions: (a) You must give any other recipients of the Work or Derivative Works a copy of this License; and (b) You must cause any modified files to carry prominent notices stating that You changed the files; and (c) You must retain, in the Source form of any Derivative Works that You distribute, all copyright, patent, trademark, and attribution notices from the Source form of the Work, excluding those notices that do not pertain to any part of the Derivative Works; and (d) If the Work includes a "NOTICE" text file as part of its distribution, then any Derivative Works that You distribute must include a readable copy of the attribution notices contained within such NOTICE file, excluding those notices that do not pertain to any part of the Derivative Works, in at least one of the following places: within a NOTICE text file distributed as part of the Derivative Works; within the Source form or documentation, if provided along with the Derivative Works; or, within a display generated by the Derivative Works, if and wherever such third-party notices normally appear. The contents of the NOTICE file are for informational purposes only and do not modify the License. You may add Your own attribution notices within Derivative Works that You distribute, alongside or as an addendum to the NOTICE text from the Work, provided that such additional attribution notices cannot be construed as modifying the License. You may add Your own copyright statement to Your modifications and may provide additional or different license terms and conditions for use, reproduction, or distribution of Your modifications, or for any such Derivative Works as a whole, provided Your use, reproduction, and distribution of the Work otherwise complies with the conditions stated in this License. 5. Submission of Contributions. Unless You explicitly state otherwise, any Contribution intentionally submitted for inclusion in the Work by You to the Licensor shall be under the terms and conditions of this License, without any additional terms or conditions. Notwithstanding the above, nothing herein shall supersede or modify the terms of any separate license agreement you may have executed with Licensor regarding such Contributions. 6. Trademarks. 
This License does not grant permission to use the trade names, trademarks, service marks, or product names of the Licensor, except as required for reasonable and customary use in describing the origin of the Work and reproducing the content of the NOTICE file. 7. Disclaimer of Warranty. Unless required by applicable law or agreed to in writing, Licensor provides the Work (and each Contributor provides its Contributions) on an "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied, including, without limitation, any warranties or conditions of TITLE, NON-INFRINGEMENT, MERCHANTABILITY, or FITNESS FOR A PARTICULAR PURPOSE. You are solely responsible for determining the appropriateness of using or redistributing the Work and assume any risks associated with Your exercise of permissions under this License. 8. Limitation of Liability. In no event and under no legal theory, whether in tort (including negligence), contract, or otherwise, unless required by applicable law (such as deliberate and grossly negligent acts) or agreed to in writing, shall any Contributor be liable to You for damages, including any direct, indirect, special, incidental, or consequential damages of any character arising as a result of this License or out of the use or inability to use the Work (including but not limited to damages for loss of goodwill, work stoppage, computer failure or malfunction, or any and all other commercial damages or losses), even if such Contributor has been advised of the possibility of such damages. 9. Accepting Warranty or Additional Liability. While redistributing the Work or Derivative Works thereof, You may choose to offer, and charge a fee for, acceptance of support, warranty, indemnity, or other liability obligations and/or rights consistent with this License. However, in accepting such obligations, You may act only on Your own behalf and on Your sole responsibility, not on behalf of any other Contributor, and only if You agree to indemnify, defend, and hold each Contributor harmless for any liability incurred by, or claims asserted against, such Contributor by reason of your accepting any such warranty or additional liability. END OF TERMS AND CONDITIONS APPENDIX: How to apply the Apache License to your work. To apply the Apache License to your work, attach the following boilerplate notice, with the fields enclosed by brackets "{}" replaced with your own identifying information. (Don't include the brackets!) The text should be enclosed in the appropriate comment syntax for the file format. We also recommend that a file or class name and description of purpose be included on the same "printed page" as the copyright notice for easier identification within third-party archives. Copyright 2016-2017 Andrew Svetlov Licensed under the Apache License, Version 2.0 (the "License"); you may not use this file except in compliance with the License. You may obtain a copy of the License at http://www.apache.org/licenses/LICENSE-2.0 Unless required by applicable law or agreed to in writing, software distributed under the License is distributed on an "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the License for the specific language governing permissions and limitations under the License. 
multidict-4.7.3/MANIFEST.in0000644000175100001650000000042513602414211015544 0ustar vstsdocker00000000000000include LICENSE include CHANGES.rst include README.rst include Makefile graft multidict graft docs graft tests global-exclude *.pyc include multidict/*.c exclude multidict/_multidict.html exclude multidict/*.so exclude multidict/*.pyd exclude multidict/*.pyd prune docs/_build multidict-4.7.3/Makefile0000644000175100001650000000451513602414211015452 0ustar vstsdocker00000000000000# Some simple testing tasks (sorry, UNIX only). .PHONY: all build flake test vtest cov clean doc mypy PYXS = $(wildcard multidict/*.pyx) SRC = multidict tests setup.py all: test .install-deps: $(shell find requirements -type f) pip install -r requirements/dev.txt @touch .install-deps .flake: .install-deps $(shell find multidict -type f) \ $(shell find tests -type f) flake8 multidict tests @if ! isort -c -rc multidict tests; then \ echo "Import sort errors, run 'make fmt' to fix them!!!"; \ isort --diff -rc multidict tests; \ false; \ fi @touch .flake isort-check: @if ! isort -c -rc $(SRC); then \ echo "Import sort errors, run 'make fmt' to fix them!!!"; \ isort --diff -c -rc $(SRC); \ false; \ fi flake8: flake8 $(SRC) black-check: @if ! isort -c -rc $(SRC); then \ echo "black errors, run 'make fmt' to fix them!!!"; \ black -t py35 --diff --check $(SRC); \ false; \ fi mypy: mypy multidict tests lint: flake8 black-check mypy isort-check fmt: black -t py35 $(SRC) isort -rc $(SRC) check_changes: ./tools/check_changes.py .develop: .install-deps $(shell find multidict -type f) .flake check_changes mypy pip install -e . @touch .develop test: .develop @pytest -q vtest: .develop @pytest -s -v cov-dev: .develop @pytest --cov-report=html @echo "open file://`pwd`/htmlcov/index.html" cov-ci-run: .develop @echo "Regular run" @pytest --cov-report=html cov-dev-full: cov-ci-run @echo "open file://`pwd`/htmlcov/index.html" doc: @make -C docs html SPHINXOPTS="-W -E" @echo "open file://`pwd`/docs/_build/html/index.html" doc-spelling: @make -C docs spelling SPHINXOPTS="-W -E" install: @pip install -U 'pip' @pip install -Ur requirements/dev.txt install-dev: .develop clean: rm -rf `find . -name __pycache__` rm -f `find . -type f -name '*.py[co]' ` rm -f `find . -type f -name '*~' ` rm -f `find . -type f -name '.*~' ` rm -f `find . -type f -name '@*' ` rm -f `find . -type f -name '#*#' ` rm -f `find . -type f -name '*.orig' ` rm -f `find . -type f -name '*.rej' ` rm -f .coverage rm -rf coverage rm -rf build rm -rf cover rm -rf htmlcov make -C docs clean SPHINXBUILD=false python3 setup.py clean rm -f multidict/*.html rm -f multidict/*.so rm -f multidict/*.pyd rm -rf .tox multidict-4.7.3/PKG-INFO0000644000175100001650000001150413602414216015110 0ustar vstsdocker00000000000000Metadata-Version: 1.2 Name: multidict Version: 4.7.3 Summary: multidict implementation Home-page: https://github.com/aio-libs/multidict Author: Andrew Svetlov Author-email: andrew.svetlov@gmail.com License: Apache 2 Project-URL: Chat: Gitter, https://gitter.im/aio-libs/Lobby Project-URL: CI: Azure Pipelines, https://dev.azure.com/aio-libs/multidict/_build Project-URL: Coverage: codecov, https://codecov.io/github/aio-libs/multidict Project-URL: Docs: RTD, https://multidict.readthedocs.io Project-URL: GitHub: issues, https://github.com/aio-libs/multidict/issues Project-URL: GitHub: repo, https://github.com/aio-libs/multidict Description: ========= multidict ========= .. 
image:: https://dev.azure.com/aio-libs/multidict/_apis/build/status/CI?branchName=master :target: https://dev.azure.com/aio-libs/multidict/_build :alt: Azure Pipelines status for master branch .. image:: https://codecov.io/gh/aio-libs/multidict/branch/master/graph/badge.svg :target: https://codecov.io/gh/aio-libs/multidict :alt: Coverage metrics .. image:: https://img.shields.io/pypi/v/multidict.svg :target: https://pypi.org/project/multidict :alt: PyPI .. image:: https://readthedocs.org/projects/multidict/badge/?version=latest :target: http://multidict.readthedocs.org/en/latest/?badge=latest :alt: Documentationb .. image:: https://img.shields.io/pypi/pyversions/multidict.svg :target: https://pypi.org/project/multidict :alt: Python versions .. image:: https://badges.gitter.im/Join%20Chat.svg :target: https://gitter.im/aio-libs/Lobby :alt: Chat on Gitter Multidict is dict-like collection of *key-value pairs* where key might be occurred more than once in the container. Introduction ------------ *HTTP Headers* and *URL query string* require specific data structure: *multidict*. It behaves mostly like a regular ``dict`` but it may have several *values* for the same *key* and *preserves insertion ordering*. The *key* is ``str`` (or ``istr`` for case-insensitive dictionaries). ``multidict`` has four multidict classes: ``MultiDict``, ``MultiDictProxy``, ``CIMultiDict`` and ``CIMultiDictProxy``. Immutable proxies (``MultiDictProxy`` and ``CIMultiDictProxy``) provide a dynamic view for the proxied multidict, the view reflects underlying collection changes. They implement the ``collections.abc.Mapping`` interface. Regular mutable (``MultiDict`` and ``CIMultiDict``) classes implement ``collections.abc.MutableMapping`` and allows to change their own content. *Case insensitive* (``CIMultiDict`` and ``CIMultiDictProxy``) ones assume the *keys* are case insensitive, e.g.:: >>> dct = CIMultiDict(key='val') >>> 'Key' in dct True >>> dct['Key'] 'val' *Keys* should be ``str`` or ``istr`` instances. The library has optional C Extensions for sake of speed. License ------- Apache 2 Library Installation -------------------- .. code-block:: bash $ pip install multidict The library is Python 3 only! PyPI contains binary wheels for Linux, Windows and MacOS. If you want to install ``multidict`` on another operation system (or *Alpine Linux* inside a Docker) the Tarball will be used to compile the library from sources. It requires C compiler and Python headers installed. To skip the compilation please use `MULTIDICT_NO_EXTENSIONS` environment variable, e.g.: .. code-block:: bash $ MULTIDICT_NO_EXTENSIONS=1 pip install multidict Please note, Pure Python (uncompiled) version is about 20-50 times slower depending on the usage scenario!!! Changelog --------- See `RTD page `_. Platform: UNKNOWN Classifier: License :: OSI Approved :: Apache Software License Classifier: Intended Audience :: Developers Classifier: Programming Language :: Python Classifier: Programming Language :: Python :: 3 Classifier: Programming Language :: Python :: 3.5 Classifier: Programming Language :: Python :: 3.6 Classifier: Programming Language :: Python :: 3.7 Classifier: Programming Language :: Python :: 3.8 Classifier: Development Status :: 5 - Production/Stable Requires-Python: >=3.5 multidict-4.7.3/README.rst0000644000175100001650000000564113602414211015502 0ustar vstsdocker00000000000000========= multidict ========= .. 
image:: https://dev.azure.com/aio-libs/multidict/_apis/build/status/CI?branchName=master :target: https://dev.azure.com/aio-libs/multidict/_build :alt: Azure Pipelines status for master branch .. image:: https://codecov.io/gh/aio-libs/multidict/branch/master/graph/badge.svg :target: https://codecov.io/gh/aio-libs/multidict :alt: Coverage metrics .. image:: https://img.shields.io/pypi/v/multidict.svg :target: https://pypi.org/project/multidict :alt: PyPI .. image:: https://readthedocs.org/projects/multidict/badge/?version=latest :target: http://multidict.readthedocs.org/en/latest/?badge=latest :alt: Documentation .. image:: https://img.shields.io/pypi/pyversions/multidict.svg :target: https://pypi.org/project/multidict :alt: Python versions .. image:: https://badges.gitter.im/Join%20Chat.svg :target: https://gitter.im/aio-libs/Lobby :alt: Chat on Gitter Multidict is a dict-like collection of *key-value pairs* where the same key may occur more than once in the container. Introduction ------------ *HTTP Headers* and *URL query strings* require a specific data structure: *multidict*. It behaves mostly like a regular ``dict`` but it may have several *values* for the same *key* and *preserves insertion ordering*. The *key* is ``str`` (or ``istr`` for case-insensitive dictionaries). ``multidict`` has four multidict classes: ``MultiDict``, ``MultiDictProxy``, ``CIMultiDict`` and ``CIMultiDictProxy``. Immutable proxies (``MultiDictProxy`` and ``CIMultiDictProxy``) provide a dynamic view of the proxied multidict; the view reflects underlying collection changes. They implement the ``collections.abc.Mapping`` interface. Regular mutable classes (``MultiDict`` and ``CIMultiDict``) implement ``collections.abc.MutableMapping`` and allow changing their own content. The *case-insensitive* ones (``CIMultiDict`` and ``CIMultiDictProxy``) assume the *keys* are case insensitive, e.g.:: >>> dct = CIMultiDict(key='val') >>> 'Key' in dct True >>> dct['Key'] 'val' *Keys* should be ``str`` or ``istr`` instances. The library has optional C Extensions for the sake of speed. License ------- Apache 2 Library Installation -------------------- .. code-block:: bash $ pip install multidict The library is Python 3 only! PyPI contains binary wheels for Linux, Windows and macOS. If you want to install ``multidict`` on another operating system (or *Alpine Linux* inside a Docker container) the tarball will be used to compile the library from source. It requires a C compiler and Python headers to be installed. To skip the compilation, please use the `MULTIDICT_NO_EXTENSIONS` environment variable, e.g.: .. code-block:: bash $ MULTIDICT_NO_EXTENSIONS=1 pip install multidict Please note that the pure Python (uncompiled) version is about 20-50 times slower, depending on the usage scenario! Changelog --------- See `RTD page `_. multidict-4.7.3/docs/0000755000175100001650000000000013602414216014742 5ustar vstsdocker00000000000000multidict-4.7.3/docs/Makefile0000644000175100001650000001533313602414211016402 0ustar vstsdocker00000000000000# Makefile for Sphinx documentation # # You can set these variables from the command line. SPHINXOPTS = SPHINXBUILD = sphinx-build PAPER = BUILDDIR = _build # User-friendly check for sphinx-build ifeq ($(shell which $(SPHINXBUILD) >/dev/null 2>&1; echo $$?), 1) $(error The '$(SPHINXBUILD)' command was not found. Make sure you have Sphinx installed, then set the SPHINXBUILD environment variable to point to the full path of the '$(SPHINXBUILD)' executable. Alternatively you can add the directory with the executable to your PATH.
If you don't have Sphinx installed, grab it from http://sphinx-doc.org/) endif # Internal variables. PAPEROPT_a4 = -D latex_paper_size=a4 PAPEROPT_letter = -D latex_paper_size=letter ALLSPHINXOPTS = -d $(BUILDDIR)/doctrees $(PAPEROPT_$(PAPER)) $(SPHINXOPTS) . # the i18n builder cannot share the environment and doctrees with the others I18NSPHINXOPTS = $(PAPEROPT_$(PAPER)) $(SPHINXOPTS) . .PHONY: help clean html dirhtml singlehtml pickle json htmlhelp qthelp devhelp epub latex latexpdf text man changes linkcheck doctest gettext help: @echo "Please use \`make ' where is one of" @echo " html to make standalone HTML files" @echo " dirhtml to make HTML files named index.html in directories" @echo " singlehtml to make a single large HTML file" @echo " pickle to make pickle files" @echo " json to make JSON files" @echo " htmlhelp to make HTML files and a HTML help project" @echo " qthelp to make HTML files and a qthelp project" @echo " devhelp to make HTML files and a Devhelp project" @echo " epub to make an epub" @echo " latex to make LaTeX files, you can set PAPER=a4 or PAPER=letter" @echo " latexpdf to make LaTeX files and run them through pdflatex" @echo " latexpdfja to make LaTeX files and run them through platex/dvipdfmx" @echo " text to make text files" @echo " man to make manual pages" @echo " texinfo to make Texinfo files" @echo " info to make Texinfo files and run them through makeinfo" @echo " gettext to make PO message catalogs" @echo " changes to make an overview of all changed/added/deprecated items" @echo " xml to make Docutils-native XML files" @echo " pseudoxml to make pseudoxml-XML files for display purposes" @echo " linkcheck to check all external links for integrity" @echo " doctest to run all doctests embedded in the documentation (if enabled)" clean: rm -rf $(BUILDDIR)/* html: $(SPHINXBUILD) -b html $(ALLSPHINXOPTS) $(BUILDDIR)/html @echo @echo "Build finished. The HTML pages are in $(BUILDDIR)/html." dirhtml: $(SPHINXBUILD) -b dirhtml $(ALLSPHINXOPTS) $(BUILDDIR)/dirhtml @echo @echo "Build finished. The HTML pages are in $(BUILDDIR)/dirhtml." singlehtml: $(SPHINXBUILD) -b singlehtml $(ALLSPHINXOPTS) $(BUILDDIR)/singlehtml @echo @echo "Build finished. The HTML page is in $(BUILDDIR)/singlehtml." pickle: $(SPHINXBUILD) -b pickle $(ALLSPHINXOPTS) $(BUILDDIR)/pickle @echo @echo "Build finished; now you can process the pickle files." json: $(SPHINXBUILD) -b json $(ALLSPHINXOPTS) $(BUILDDIR)/json @echo @echo "Build finished; now you can process the JSON files." htmlhelp: $(SPHINXBUILD) -b htmlhelp $(ALLSPHINXOPTS) $(BUILDDIR)/htmlhelp @echo @echo "Build finished; now you can run HTML Help Workshop with the" \ ".hhp project file in $(BUILDDIR)/htmlhelp." qthelp: $(SPHINXBUILD) -b qthelp $(ALLSPHINXOPTS) $(BUILDDIR)/qthelp @echo @echo "Build finished; now you can run "qcollectiongenerator" with the" \ ".qhcp project file in $(BUILDDIR)/qthelp, like this:" @echo "# qcollectiongenerator $(BUILDDIR)/qthelp/aiohttp.qhcp" @echo "To view the help file:" @echo "# assistant -collectionFile $(BUILDDIR)/qthelp/aiohttp.qhc" devhelp: $(SPHINXBUILD) -b devhelp $(ALLSPHINXOPTS) $(BUILDDIR)/devhelp @echo @echo "Build finished." @echo "To view the help file:" @echo "# mkdir -p $$HOME/.local/share/devhelp/aiohttp" @echo "# ln -s $(BUILDDIR)/devhelp $$HOME/.local/share/devhelp/aiohttp" @echo "# devhelp" epub: $(SPHINXBUILD) -b epub $(ALLSPHINXOPTS) $(BUILDDIR)/epub @echo @echo "Build finished. The epub file is in $(BUILDDIR)/epub." 
latex: $(SPHINXBUILD) -b latex $(ALLSPHINXOPTS) $(BUILDDIR)/latex @echo @echo "Build finished; the LaTeX files are in $(BUILDDIR)/latex." @echo "Run \`make' in that directory to run these through (pdf)latex" \ "(use \`make latexpdf' here to do that automatically)." latexpdf: $(SPHINXBUILD) -b latex $(ALLSPHINXOPTS) $(BUILDDIR)/latex @echo "Running LaTeX files through pdflatex..." $(MAKE) -C $(BUILDDIR)/latex all-pdf @echo "pdflatex finished; the PDF files are in $(BUILDDIR)/latex." latexpdfja: $(SPHINXBUILD) -b latex $(ALLSPHINXOPTS) $(BUILDDIR)/latex @echo "Running LaTeX files through platex and dvipdfmx..." $(MAKE) -C $(BUILDDIR)/latex all-pdf-ja @echo "pdflatex finished; the PDF files are in $(BUILDDIR)/latex." text: $(SPHINXBUILD) -b text $(ALLSPHINXOPTS) $(BUILDDIR)/text @echo @echo "Build finished. The text files are in $(BUILDDIR)/text." man: $(SPHINXBUILD) -b man $(ALLSPHINXOPTS) $(BUILDDIR)/man @echo @echo "Build finished. The manual pages are in $(BUILDDIR)/man." texinfo: $(SPHINXBUILD) -b texinfo $(ALLSPHINXOPTS) $(BUILDDIR)/texinfo @echo @echo "Build finished. The Texinfo files are in $(BUILDDIR)/texinfo." @echo "Run \`make' in that directory to run these through makeinfo" \ "(use \`make info' here to do that automatically)." info: $(SPHINXBUILD) -b texinfo $(ALLSPHINXOPTS) $(BUILDDIR)/texinfo @echo "Running Texinfo files through makeinfo..." make -C $(BUILDDIR)/texinfo info @echo "makeinfo finished; the Info files are in $(BUILDDIR)/texinfo." gettext: $(SPHINXBUILD) -b gettext $(I18NSPHINXOPTS) $(BUILDDIR)/locale @echo @echo "Build finished. The message catalogs are in $(BUILDDIR)/locale." changes: $(SPHINXBUILD) -b changes $(ALLSPHINXOPTS) $(BUILDDIR)/changes @echo @echo "The overview file is in $(BUILDDIR)/changes." linkcheck: $(SPHINXBUILD) -b linkcheck $(ALLSPHINXOPTS) $(BUILDDIR)/linkcheck @echo @echo "Link check complete; look for any errors in the above output " \ "or in $(BUILDDIR)/linkcheck/output.txt." doctest: $(SPHINXBUILD) -b doctest $(ALLSPHINXOPTS) $(BUILDDIR)/doctest @echo "Testing of doctests in the sources finished, look at the " \ "results in $(BUILDDIR)/doctest/output.txt." xml: $(SPHINXBUILD) -b xml $(ALLSPHINXOPTS) $(BUILDDIR)/xml @echo @echo "Build finished. The XML files are in $(BUILDDIR)/xml." pseudoxml: $(SPHINXBUILD) -b pseudoxml $(ALLSPHINXOPTS) $(BUILDDIR)/pseudoxml @echo @echo "Build finished. The pseudo-XML files are in $(BUILDDIR)/pseudoxml." spelling: $(SPHINXBUILD) -b spelling $(ALLSPHINXOPTS) $(BUILDDIR)/spelling @echo @echo "Build finished." multidict-4.7.3/docs/benchmark.rst0000644000175100001650000000340013602414211017426 0ustar vstsdocker00000000000000.. _benchmarking-reference: ========== Benchmarks ========== Introduction ------------ Benchmarks allow tracking performance from release to release and verifying that the latest changes have not affected it drastically. Benchmarks are based on the `perf `_ module. How to run ---------- The dependencies from `requirements/dev.txt` should be installed before running the benchmarks. Please also make sure that you have `configured `_ your OS to get reliable results. To run the benchmarks, execute the following command: .. code-block:: bash $ python benchmarks/benchmark.py This runs benchmarks for both classes (:class:`MultiDict` and :class:`CIMultiDict`) of both implementations (`Python` and `Cython`). To run benchmarks for a specific class of a specific implementation, please use the `--impl` option: ..
code-block:: bash $ python benchmarks/benchmark.py --impl multidict_cython would run benchmarks only for :class:`MultiDict` implemented in `Cython`. Please use `--help` to see all available options. Most of the options are described at `perf's Runner `_ documentation. How to compare implementations ------------------------------ `--impl` option allows to run benchmarks for a specific implementation of class. Combined with the `compare_to `_ command of :mod:`perf` module we can get a good picture of how implementation performs: .. code-block:: bash $ python benchmarks/benchmark.py --impl multidict_cython -o multidict_cy.json $ python benchmarks/benchmark.py --impl multidict_python -o multidict_py.json $ python -m perf compare_to multidict_cy.json multidict_py.json multidict-4.7.3/docs/changes.rst0000644000175100001650000000012113602414211017071 0ustar vstsdocker00000000000000.. _multidict_changes: .. include:: ../CHANGES.rst .. include:: ../HISTORY.rst multidict-4.7.3/docs/conf.py0000644000175100001650000002346713602414211016250 0ustar vstsdocker00000000000000#!/usr/bin/env python3 # -*- coding: utf-8 -*- # # multidict documentation build configuration file, created by # sphinx-quickstart on Wed Mar 5 12:35:35 2014. # # This file is execfile()d with the current directory set to its # containing dir. # # Note that not all possible configuration values are present in this # autogenerated file. # # All configuration values have a default; values that are commented out # serve to show the default. import codecs import datetime import os import re import sys _docs_path = os.path.dirname(__file__) _version_path = os.path.abspath(os.path.join(_docs_path, '..', 'multidict', '__init__.py')) with codecs.open(_version_path, 'r', 'latin1') as fp: try: _version_info = re.search(r'^__version__ = "' r"(?P\d+)" r"\.(?P\d+)" r"\.(?P\d+)" r'(?P.*)?"$', fp.read(), re.M).groupdict() except IndexError: raise RuntimeError('Unable to determine version.') import alabaster # -- General configuration ------------------------------------------------ # If your documentation needs a minimal Sphinx version, state it here. # needs_sphinx = '1.0' # Add any Sphinx extension module names here, as strings. They can be # extensions coming with Sphinx (named 'sphinx.ext.*') or your custom # ones. extensions = [ 'sphinx.ext.extlinks', 'sphinx.ext.intersphinx', 'sphinx.ext.viewcode', 'alabaster', ] try: import sphinxcontrib.spelling extensions.append('sphinxcontrib.spelling') except ImportError: pass intersphinx_mapping = { 'python': ('http://docs.python.org/3', None), 'aiohttp': ('https://aiohttp.readthedocs.io/en/stable/', None), } # Add any paths that contain templates here, relative to this directory. templates_path = ['_templates'] # The suffix of source filenames. source_suffix = '.rst' # The encoding of source files. # source_encoding = 'utf-8-sig' # The master toctree document. master_doc = 'index' # General information about the project. org = 'aio-libs' project = 'multidict' copyright = ( '2016β€’{end_year}, Andrew Svetlov'. format(end_year=datetime.date.today().year) ) # The version info for the project you're documenting, acts as replacement for # |version| and |release|, also used in various other places throughout the # built documents. # # The short X.Y version. version = '{major}.{minor}'.format(**_version_info) # The full version, including alpha/beta/rc tags. release = '{major}.{minor}.{patch}-{tag}'.format(**_version_info) # The language for content autogenerated by Sphinx. 
Refer to documentation # for a list of supported languages. # language = None # There are two options for replacing |today|: either, you set today to some # non-false value, then it is used: # today = '' # Else, today_fmt is used as the format for a strftime call. # today_fmt = '%B %d, %Y' # List of patterns, relative to source directory, that match files and # directories to ignore when looking for source files. exclude_patterns = ['_build'] # The reST default role (used for this markup: `text`) to use for all # documents. # default_role = None # If true, '()' will be appended to :func: etc. cross-reference text. # add_function_parentheses = True # If true, the current module name will be prepended to all description # unit titles (such as .. function::). # add_module_names = True # If true, sectionauthor and moduleauthor directives will be shown in the # output. They are ignored by default. # show_authors = False # The name of the Pygments (syntax highlighting) style to use. pygments_style = 'sphinx' # The default language to highlight source code in. highlight_language = 'python' # A list of ignored prefixes for module index sorting. # modindex_common_prefix = [] # If true, keep warnings as "system message" paragraphs in the built documents. # keep_warnings = False # -- Options for HTML output ---------------------------------------------- # The theme to use for HTML and HTML Help pages. See the documentation for # a list of builtin themes. html_theme = 'alabaster' # Theme options are theme-specific and customize the look and feel of a theme # further. For a list of options available for each theme, see the # documentation. html_theme_options = { # 'logo': 'aiohttp-icon-128x128.png', 'description': project, 'github_user': org, 'github_repo': project, 'github_button': True, 'github_type': 'star', 'github_banner': True, 'travis_button': True, 'codecov_button': True, 'pre_bg': '#FFF6E5', 'note_bg': '#E5ECD1', 'note_border': '#BFCF8C', 'body_text': '#482C0A', 'sidebar_text': '#49443E', 'sidebar_header': '#4B4032', } # Add any paths that contain custom themes here, relative to this directory. html_theme_path = [alabaster.get_path()] # The name for this set of Sphinx documents. If None, it defaults to # " v documentation". # html_title = None # A shorter title for the navigation bar. Default is the same as html_title. # html_short_title = None # The name of an image file (relative to this directory) to place at the top # of the sidebar. # html_logo = 'aiohttp-icon.svg' # The name of an image file (within the static path) to use as favicon of the # docs. This file should be a Windows icon file (.ico) being 16x16 or 32x32 # pixels large. # html_favicon = 'aiohttp-icon.ico' # Add any paths that contain custom static files (such as style sheets) here, # relative to this directory. They are copied after the builtin static files, # so a file named "default.css" will overwrite the builtin "default.css". # html_static_path = ['_static'] # Add any extra paths that contain custom files (such as robots.txt or # .htaccess) here, relative to this directory. These files are copied # directly to the root of the documentation. # html_extra_path = [] # If not '', a 'Last updated on:' timestamp is inserted at every page bottom, # using the given strftime format. # html_last_updated_fmt = '%b %d, %Y' # If true, SmartyPants will be used to convert quotes and dashes to # typographically correct entities. # html_use_smartypants = True # Custom sidebar templates, maps document names to template names. 
html_sidebars = { '**': [ 'about.html', 'navigation.html', 'searchbox.html', ] } # Additional templates that should be rendered to pages, maps page names to # template names. # html_additional_pages = {} # If false, no module index is generated. # html_domain_indices = True # If false, no index is generated. # html_use_index = True # If true, the index is split into individual pages for each letter. # html_split_index = False # If true, links to the reST sources are added to the pages. # html_show_sourcelink = True # If true, "Created using Sphinx" is shown in the HTML footer. Default is True. # html_show_sphinx = True # If true, "(C) Copyright ..." is shown in the HTML footer. Default is True. # html_show_copyright = True # If true, an OpenSearch description file will be output, and all pages will # contain a tag referring to it. The value of this option must be the # base URL from which the finished HTML is served. # html_use_opensearch = '' # This is the file name suffix for HTML files (e.g. ".xhtml"). # html_file_suffix = None # Output file base name for HTML help builder. htmlhelp_basename = 'multidictdoc' # -- Options for LaTeX output --------------------------------------------- latex_elements = { # The paper size ('letterpaper' or 'a4paper'). # 'papersize': 'letterpaper', # The font size ('10pt', '11pt' or '12pt'). # 'pointsize': '10pt', # Additional stuff for the LaTeX preamble. # 'preamble': '', } # Grouping the document tree into LaTeX files. List of tuples # (source start file, target name, title, # author, documentclass [howto, manual, or own class]). latex_documents = [ ('index', 'multidict.tex', 'multidict Documentation', 'Andrew Svetlov', 'manual'), ] # The name of an image file (relative to this directory) to place at the top of # the title page. # latex_logo = None # For "manual" documents, if this is true, then toplevel headings are parts, # not chapters. # latex_use_parts = False # If true, show page references after internal links. # latex_show_pagerefs = False # If true, show URL addresses after external links. # latex_show_urls = False # Documents to append as an appendix to all manuals. # latex_appendices = [] # If false, no module index is generated. # latex_domain_indices = True # -- Options for manual page output --------------------------------------- # One entry per manual page. List of tuples # (source start file, name, description, authors, manual section). man_pages = [ ('index', project, 'multidict Documentation', ['Andrew Svetlov'], 1) ] # If true, show URL addresses after external links. # man_show_urls = False # -- Options for Texinfo output ------------------------------------------- # Grouping the document tree into Texinfo files. List of tuples # (source start file, target name, title, author, # dir menu entry, description, category) texinfo_documents = [ ('index', project, 'multidict Documentation', 'Andrew Svetlov', project, 'One line description of project.', 'Miscellaneous'), ] # Documents to append as an appendix to all manuals. # texinfo_appendices = [] # If false, no module index is generated. # texinfo_domain_indices = True # How to display URL addresses: 'footnote', 'no', or 'inline'. # texinfo_show_urls = 'footnote' # If true, do not generate a @detailmenu in the "Top" node's menu. 
# texinfo_no_detailmenu = False github_repo_url = f'https://github.com/{org}/{project}' extlinks = { 'issue': (f'{github_repo_url}/issues/%s', '#'), 'pr': (f'{github_repo_url}/pull/%s', 'PR #'), } multidict-4.7.3/docs/index.rst0000644000175100001650000000605213602414211016601 0ustar vstsdocker00000000000000.. aiohttp documentation master file, created by sphinx-quickstart on Wed Mar 5 12:35:35 2014. You can adapt this file completely to your liking, but it should at least contain the root `toctree` directive. multidict ========= Multidicts are useful for working with HTTP headers, URL query args etc. The code was extracted from aiohttp library. Introduction ------------ *HTTP Headers* and *URL query string* require specific data structure: *multidict*. It behaves mostly like a regular :class:`dict` but it may have several *values* for the same *key* and *preserves insertion ordering*. The *key* is :class:`str` (or :class:`istr` for case-insensitive dictionaries). :mod:`multidict` has four multidict classes: :class:`MultiDict`, :class:`MultiDictProxy`, :class:`CIMultiDict` and :class:`CIMultiDictProxy`. Immutable proxies (:class:`MultiDictProxy` and :class:`CIMultiDictProxy`) provide a dynamic view for the proxied multidict, the view reflects underlying collection changes. They implement the :class:`~collections.abc.Mapping` interface. Regular mutable (:class:`MultiDict` and :class:`CIMultiDict`) classes implement :class:`~collections.abc.MutableMapping` and allows to change their own content. *Case insensitive* (:class:`CIMultiDict` and :class:`CIMultiDictProxy`) ones assume the *keys* are case insensitive, e.g.:: >>> dct = CIMultiDict(key='val') >>> 'Key' in dct True >>> dct['Key'] 'val' *Keys* should be either :class:`str` or :class:`istr` instance. The library has optional C Extensions for sake of speed. Library Installation -------------------- .. code-block:: bash $ pip install multidict The library is Python 3 only! PyPI contains binary wheels for Linux, Windows and MacOS. If you want to install ``multidict`` on another operation system (or *Alpine Linux* inside a Docker) the Tarball will be used to compile the library from sources. It requires C compiler and Python headers installed. To skip the compilation please use `MULTIDICT_NO_EXTENSIONS` environment variable, e.g.: .. code-block:: bash $ MULTIDICT_NO_EXTENSIONS=1 pip install multidict Please note, Pure Python (uncompiled) version is about 20-50 times slower depending on the usage scenario!!! Source code ----------- The project is hosted on GitHub_ Please file an issue on the `bug tracker `_ if you have found a bug or have some suggestion in order to improve the library. The library uses `Azure Pipelines `_ for Continuous Integration. Discussion list --------------- *aio-libs* google group: https://groups.google.com/forum/#!forum/aio-libs Feel free to post your questions and ideas here. Authors and License ------------------- The ``multidict`` package is written by Andrew Svetlov. It's *Apache 2* licensed and freely available. Contents -------- .. toctree:: multidict benchmark changes Indices and tables ================== * :ref:`genindex` * :ref:`modindex` * :ref:`search` .. _GitHub: https://github.com/aio-libs/multidict multidict-4.7.3/docs/make.bat0000644000175100001650000001505713602414211016352 0ustar vstsdocker00000000000000@ECHO OFF REM Command file for Sphinx documentation if "%SPHINXBUILD%" == "" ( set SPHINXBUILD=sphinx-build ) set BUILDDIR=_build set ALLSPHINXOPTS=-d %BUILDDIR%/doctrees %SPHINXOPTS% . 
set I18NSPHINXOPTS=%SPHINXOPTS% . if NOT "%PAPER%" == "" ( set ALLSPHINXOPTS=-D latex_paper_size=%PAPER% %ALLSPHINXOPTS% set I18NSPHINXOPTS=-D latex_paper_size=%PAPER% %I18NSPHINXOPTS% ) if "%1" == "" goto help if "%1" == "help" ( :help echo.Please use `make ^` where ^ is one of echo. html to make standalone HTML files echo. dirhtml to make HTML files named index.html in directories echo. singlehtml to make a single large HTML file echo. pickle to make pickle files echo. json to make JSON files echo. htmlhelp to make HTML files and a HTML help project echo. qthelp to make HTML files and a qthelp project echo. devhelp to make HTML files and a Devhelp project echo. epub to make an epub echo. latex to make LaTeX files, you can set PAPER=a4 or PAPER=letter echo. text to make text files echo. man to make manual pages echo. texinfo to make Texinfo files echo. gettext to make PO message catalogs echo. changes to make an overview over all changed/added/deprecated items echo. xml to make Docutils-native XML files echo. pseudoxml to make pseudoxml-XML files for display purposes echo. linkcheck to check all external links for integrity echo. doctest to run all doctests embedded in the documentation if enabled goto end ) if "%1" == "clean" ( for /d %%i in (%BUILDDIR%\*) do rmdir /q /s %%i del /q /s %BUILDDIR%\* goto end ) %SPHINXBUILD% 2> nul if errorlevel 9009 ( echo. echo.The 'sphinx-build' command was not found. Make sure you have Sphinx echo.installed, then set the SPHINXBUILD environment variable to point echo.to the full path of the 'sphinx-build' executable. Alternatively you echo.may add the Sphinx directory to PATH. echo. echo.If you don't have Sphinx installed, grab it from echo.http://sphinx-doc.org/ exit /b 1 ) if "%1" == "html" ( %SPHINXBUILD% -b html %ALLSPHINXOPTS% %BUILDDIR%/html if errorlevel 1 exit /b 1 echo. echo.Build finished. The HTML pages are in %BUILDDIR%/html. goto end ) if "%1" == "dirhtml" ( %SPHINXBUILD% -b dirhtml %ALLSPHINXOPTS% %BUILDDIR%/dirhtml if errorlevel 1 exit /b 1 echo. echo.Build finished. The HTML pages are in %BUILDDIR%/dirhtml. goto end ) if "%1" == "singlehtml" ( %SPHINXBUILD% -b singlehtml %ALLSPHINXOPTS% %BUILDDIR%/singlehtml if errorlevel 1 exit /b 1 echo. echo.Build finished. The HTML pages are in %BUILDDIR%/singlehtml. goto end ) if "%1" == "pickle" ( %SPHINXBUILD% -b pickle %ALLSPHINXOPTS% %BUILDDIR%/pickle if errorlevel 1 exit /b 1 echo. echo.Build finished; now you can process the pickle files. goto end ) if "%1" == "json" ( %SPHINXBUILD% -b json %ALLSPHINXOPTS% %BUILDDIR%/json if errorlevel 1 exit /b 1 echo. echo.Build finished; now you can process the JSON files. goto end ) if "%1" == "htmlhelp" ( %SPHINXBUILD% -b htmlhelp %ALLSPHINXOPTS% %BUILDDIR%/htmlhelp if errorlevel 1 exit /b 1 echo. echo.Build finished; now you can run HTML Help Workshop with the ^ .hhp project file in %BUILDDIR%/htmlhelp. goto end ) if "%1" == "qthelp" ( %SPHINXBUILD% -b qthelp %ALLSPHINXOPTS% %BUILDDIR%/qthelp if errorlevel 1 exit /b 1 echo. echo.Build finished; now you can run "qcollectiongenerator" with the ^ .qhcp project file in %BUILDDIR%/qthelp, like this: echo.^> qcollectiongenerator %BUILDDIR%\qthelp\aiohttp.qhcp echo.To view the help file: echo.^> assistant -collectionFile %BUILDDIR%\qthelp\aiohttp.ghc goto end ) if "%1" == "devhelp" ( %SPHINXBUILD% -b devhelp %ALLSPHINXOPTS% %BUILDDIR%/devhelp if errorlevel 1 exit /b 1 echo. echo.Build finished. goto end ) if "%1" == "epub" ( %SPHINXBUILD% -b epub %ALLSPHINXOPTS% %BUILDDIR%/epub if errorlevel 1 exit /b 1 echo. 
echo.Build finished. The epub file is in %BUILDDIR%/epub. goto end ) if "%1" == "latex" ( %SPHINXBUILD% -b latex %ALLSPHINXOPTS% %BUILDDIR%/latex if errorlevel 1 exit /b 1 echo. echo.Build finished; the LaTeX files are in %BUILDDIR%/latex. goto end ) if "%1" == "latexpdf" ( %SPHINXBUILD% -b latex %ALLSPHINXOPTS% %BUILDDIR%/latex cd %BUILDDIR%/latex make all-pdf cd %BUILDDIR%/.. echo. echo.Build finished; the PDF files are in %BUILDDIR%/latex. goto end ) if "%1" == "latexpdfja" ( %SPHINXBUILD% -b latex %ALLSPHINXOPTS% %BUILDDIR%/latex cd %BUILDDIR%/latex make all-pdf-ja cd %BUILDDIR%/.. echo. echo.Build finished; the PDF files are in %BUILDDIR%/latex. goto end ) if "%1" == "text" ( %SPHINXBUILD% -b text %ALLSPHINXOPTS% %BUILDDIR%/text if errorlevel 1 exit /b 1 echo. echo.Build finished. The text files are in %BUILDDIR%/text. goto end ) if "%1" == "man" ( %SPHINXBUILD% -b man %ALLSPHINXOPTS% %BUILDDIR%/man if errorlevel 1 exit /b 1 echo. echo.Build finished. The manual pages are in %BUILDDIR%/man. goto end ) if "%1" == "texinfo" ( %SPHINXBUILD% -b texinfo %ALLSPHINXOPTS% %BUILDDIR%/texinfo if errorlevel 1 exit /b 1 echo. echo.Build finished. The Texinfo files are in %BUILDDIR%/texinfo. goto end ) if "%1" == "gettext" ( %SPHINXBUILD% -b gettext %I18NSPHINXOPTS% %BUILDDIR%/locale if errorlevel 1 exit /b 1 echo. echo.Build finished. The message catalogs are in %BUILDDIR%/locale. goto end ) if "%1" == "changes" ( %SPHINXBUILD% -b changes %ALLSPHINXOPTS% %BUILDDIR%/changes if errorlevel 1 exit /b 1 echo. echo.The overview file is in %BUILDDIR%/changes. goto end ) if "%1" == "linkcheck" ( %SPHINXBUILD% -b linkcheck %ALLSPHINXOPTS% %BUILDDIR%/linkcheck if errorlevel 1 exit /b 1 echo. echo.Link check complete; look for any errors in the above output ^ or in %BUILDDIR%/linkcheck/output.txt. goto end ) if "%1" == "doctest" ( %SPHINXBUILD% -b doctest %ALLSPHINXOPTS% %BUILDDIR%/doctest if errorlevel 1 exit /b 1 echo. echo.Testing of doctests in the sources finished, look at the ^ results in %BUILDDIR%/doctest/output.txt. goto end ) if "%1" == "xml" ( %SPHINXBUILD% -b xml %ALLSPHINXOPTS% %BUILDDIR%/xml if errorlevel 1 exit /b 1 echo. echo.Build finished. The XML files are in %BUILDDIR%/xml. goto end ) if "%1" == "pseudoxml" ( %SPHINXBUILD% -b pseudoxml %ALLSPHINXOPTS% %BUILDDIR%/pseudoxml if errorlevel 1 exit /b 1 echo. echo.Build finished. The pseudo-XML files are in %BUILDDIR%/pseudoxml. goto end ) :end multidict-4.7.3/docs/multidict.rst0000644000175100001650000002577013602414211017500 0ustar vstsdocker00000000000000.. _multidict-reference: ============ Reference ============ .. module:: multidict MultiDict ========= .. class:: MultiDict(**kwargs) MultiDict(mapping, **kwargs) MultiDict(iterable, **kwargs) Creates a mutable multidict instance. Accepted parameters are the same as for :class:`dict`. If the same key appears several times it will be added, e.g.:: >>> d = MultiDict([('a', 1), ('b', 2), ('a', 3)]) >>> d .. method:: len(d) Return the number of items in multidict *d*. .. method:: d[key] Return the **first** item of *d* with key *key*. Raises a :exc:`KeyError` if key is not in the multidict. .. method:: d[key] = value Set ``d[key]`` to *value*. Replace all items where key is equal to *key* with single item ``(key, value)``. .. method:: del d[key] Remove all items where key is equal to *key* from *d*. Raises a :exc:`KeyError` if *key* is not in the map. .. method:: key in d Return ``True`` if d has a key *key*, else ``False``. .. method:: key not in d Equivalent to ``not (key in d)`` .. 
method:: iter(d) Return an iterator over the keys of the dictionary. This is a shortcut for ``iter(d.keys())``. .. method:: add(key, value) Append ``(key, value)`` pair to the dictionary. .. method:: clear() Remove all items from the dictionary. .. method:: copy() Return a shallow copy of the dictionary. .. method:: extend([other]) Extend the dictionary with the key/value pairs from *other*, appending the pairs to this dictionary. For existing keys, values are added. Returns ``None``. :meth:`extend` accepts either another dictionary object or an iterable of key/value pairs (as tuples or other iterables of length two). If keyword arguments are specified, the dictionary is then extended with those key/value pairs: ``d.extend(red=1, blue=2)``. Effectively the same as calling :meth:`add` for every ``(key, value)`` pair. Also see :meth:`update`, for a version that replaces existing keys. .. method:: getone(key[, default]) Return the **first** value for *key* if *key* is in the dictionary, else *default*. Raises :exc:`KeyError` if *default* is not given and *key* is not found. ``d[key]`` is equivalent to ``d.getone(key)``. .. method:: getall(key[, default]) Return a list of all values for *key* if *key* is in the dictionary, else *default*. Raises :exc:`KeyError` if *default* is not given and *key* is not found. .. method:: get(key[, default]) Return the **first** value for *key* if *key* is in the dictionary, else *default*. If *default* is not given, it defaults to ``None``, so that this method never raises a :exc:`KeyError`. ``d.get(key)`` is equivalent to ``d.getone(key, None)``. .. method:: keys() Return a new view of the dictionary's keys. View contains all keys, possibly with duplicates. .. method:: items() Return a new view of the dictionary's items (``(key, value)`` pairs). View contains all items, multiple items can have the same key. .. method:: values() Return a new view of the dictionary's values. View contains all values. .. method:: popone(key[, default]) If *key* is in the dictionary, remove it and return its the **first** value, else return *default*. If *default* is not given and *key* is not in the dictionary, a :exc:`KeyError` is raised. .. versionadded:: 3.0 .. method:: pop(key[, default]) An alias to :meth:`pop` .. versionchanged:: 3.0 Now only *first* occurrence is removed (was all). .. method:: popall(key[, default]) If *key* is in the dictionary, remove all occurrences and return a :class:`list` of all values in corresponding order (as :meth:`getall` does). If *key* is not found and *default* is provided return *default*. If *default* is not given and *key* is not in the dictionary, a :exc:`KeyError` is raised. .. versionadded:: 3.0 .. method:: popitem() Remove and return an arbitrary ``(key, value)`` pair from the dictionary. :meth:`popitem` is useful to destructively iterate over a dictionary, as often used in set algorithms. If the dictionary is empty, calling :meth:`popitem` raises a :exc:`KeyError`. .. method:: setdefault(key[, default]) If *key* is in the dictionary, return its the **first** value. If not, insert *key* with a value of *default* and return *default*. *default* defaults to ``None``. .. method:: update([other]) Update the dictionary with the key/value pairs from *other*, overwriting existing keys. Returns ``None``. :meth:`update` accepts either another dictionary object or an iterable of key/value pairs (as tuples or other iterables of length two). 
If keyword arguments are specified, the dictionary is then updated with those key/value pairs: ``d.update(red=1, blue=2)``. Also see :meth:`extend` for a method that adds to existing keys rather than update them. .. seealso:: :class:`MultiDictProxy` can be used to create a read-only view of a :class:`MultiDict`. CIMultiDict =========== .. class:: CIMultiDict(**kwargs) CIMultiDict(mapping, **kwargs) CIMultiDict(iterable, **kwargs) Create a case insensitive multidict instance. The behavior is the same as of :class:`MultiDict` but key comparisons are case insensitive, e.g.:: >>> dct = CIMultiDict(a='val') >>> 'A' in dct True >>> dct['A'] 'val' >>> dct['a'] 'val' >>> dct['b'] = 'new val' >>> dct['B'] 'new val' The class is inherited from :class:`MultiDict`. .. seealso:: :class:`CIMultiDictProxy` can be used to create a read-only view of a :class:`CIMultiDict`. MultiDictProxy ============== .. class:: MultiDictProxy(multidict) Create an immutable multidict proxy. It provides a dynamic view on the multidict’s entries, which means that when the multidict changes, the view reflects these changes. Raises :exc:`TypeError` if *multidict* is not a :class:`MultiDict` instance. .. method:: len(d) Return number of items in multidict *d*. .. method:: d[key] Return the **first** item of *d* with key *key*. Raises a :exc:`KeyError` if key is not in the multidict. .. method:: key in d Return ``True`` if d has a key *key*, else ``False``. .. method:: key not in d Equivalent to ``not (key in d)`` .. method:: iter(d) Return an iterator over the keys of the dictionary. This is a shortcut for ``iter(d.keys())``. .. method:: copy() Return a shallow copy of the underlying multidict. .. method:: getone(key[, default]) Return the **first** value for *key* if *key* is in the dictionary, else *default*. Raises :exc:`KeyError` if *default* is not given and *key* is not found. ``d[key]`` is equivalent to ``d.getone(key)``. .. method:: getall(key[, default]) Return a list of all values for *key* if *key* is in the dictionary, else *default*. Raises :exc:`KeyError` if *default* is not given and *key* is not found. .. method:: get(key[, default]) Return the **first** value for *key* if *key* is in the dictionary, else *default*. If *default* is not given, it defaults to ``None``, so that this method never raises a :exc:`KeyError`. ``d.get(key)`` is equivalent to ``d.getone(key, None)``. .. method:: keys() Return a new view of the dictionary's keys. View contains all keys, possibly with duplicates. .. method:: items() Return a new view of the dictionary's items (``(key, value)`` pairs). View contains all items, multiple items can have the same key. .. method:: values() Return a new view of the dictionary's values. View contains all values. CIMultiDictProxy ================ .. class:: CIMultiDictProxy(multidict) Case insensitive version of :class:`MultiDictProxy`. Raises :exc:`TypeError` is *multidict* is not :class:`CIMultiDict` instance. The class is inherited from :class:`MultiDict`. Version ======= All multidicts have an internal version flag. It's changed on every dict update, thus the flag could be used for checks like cache expiring etc. .. function:: getversion(mdict) Return a version of given *mdict* object (works for proxies also). The type of returned value is opaque and should be used for equality tests only (``==`` and ``!=``), ordering is not allowed while not prohibited explicitly. .. versionadded:: 3.0 .. 
istr
====

:class:`CIMultiDict` accepts :class:`str` as a *key* argument for dict
lookups but uses case-folded (lower-cased) strings for the comparison
internally. Processing is more efficient when the *key* is known to be
already case-folded, because the :meth:`~str.lower` call can be skipped.
Performance-sensitive code may create case-folded string keys explicitly
beforehand, e.g.::

   >>> key = istr('Key')
   >>> key
   'Key'
   >>> mdict = CIMultiDict(key='value')
   >>> key in mdict
   True
   >>> mdict[key]
   'value'

For best performance :class:`istr` strings should be created once and stored
for later reuse; see :mod:`aiohttp.hdrs` for an example.

.. class:: istr(object='')
           istr(bytes_or_buffer[, encoding[, errors]])

   Create a new **case-folded** string object from the given *object*.
   If *encoding* or *errors* are specified, then the object must expose a
   data buffer that will be decoded using the given encoding and error
   handler.

   Otherwise, returns the result of ``object.__str__()`` (if defined) or
   ``repr(object)``.

   *encoding* defaults to ``sys.getdefaultencoding()``.

   *errors* defaults to ``'strict'``.

   The class is inherited from :class:`str` and has all regular string
   methods.

.. versionchanged:: 2.0

   ``upstr`` is a deprecated alias for ``istr``.

.. versionchanged:: 3.7

   ``istr`` doesn't title-case its argument anymore but uses internal
   lower-cased data for fast case-insensitive comparison.


Abstract Base Classes
=====================

The module provides two ABCs: ``MultiMapping`` and ``MutableMultiMapping``.
They are similar to :class:`collections.abc.Mapping` and
:class:`collections.abc.MutableMapping` and are inherited from them.

.. versionadded:: 3.3


Typing
======

The library ships with embedded type annotations, so mypy picks them up
automatically.

:class:`MultiDict`, :class:`CIMultiDict`, :class:`MultiDictProxy`, and
:class:`CIMultiDictProxy` are *generic* types; please use the corresponding
notation for multidict value types, e.g. ``md: MultiDict[str] = MultiDict()``.

The type of multidict keys is always :class:`str` or a class derived from a
string.

.. versionadded:: 3.7

multidict-4.7.3/docs/spelling_wordlist.txt0000644000175100001650000000137413602414211021247 0ustar vstsdocker00000000000000
aiohttp args async autocalculated autodetection autogenerates autogeneration basename bugfixes cchardet cChardet changelog charset charsetdetect criterias css ctor Ctrl cython deallocation dict docstrings eof fallback fastpath filename getitem github google gunicorn Gunicorn Indices IP IPv ish istr iterable iterables javascript json keepalive keepalives keepaliving lookups manylinux middleware middlewares multidict multidicts Multidicts multipart Multipart mypy Nikolay param params performant pickable pre proxied pyenv pyinstaller refactor refactored regex regexs repo runtime subclassable subclassing subprotocol subprotocols Svetlov toolbar toolset tuples un uncompiled upstr url urlencoded urls utf websocket websockets Websockets wildcard Workflow wsgi
multidict-4.7.3/multidict/0000755000175100001650000000000013602414216016010 5ustar vstsdocker00000000000000multidict-4.7.3/multidict/__init__.py0000644000175100001650000000165513602414211020123 0ustar vstsdocker00000000000000"""Multidict implementation.

HTTP Headers and URL query string require specific data structure:
multidict. It behaves mostly like a dict but it can have several
values for the same key.
""" from ._abc import MultiMapping, MutableMultiMapping from ._compat import USE_CYTHON_EXTENSIONS __all__ = ( "MultiMapping", "MutableMultiMapping", "MultiDictProxy", "CIMultiDictProxy", "MultiDict", "CIMultiDict", "upstr", "istr", "getversion" ) __version__ = "4.7.3" try: if not USE_CYTHON_EXTENSIONS: raise ImportError from ._multidict import ( MultiDictProxy, CIMultiDictProxy, MultiDict, CIMultiDict, istr, getversion, ) except ImportError: # pragma: no cover from ._multidict_py import ( MultiDictProxy, CIMultiDictProxy, MultiDict, CIMultiDict, istr, getversion, ) upstr = istr multidict-4.7.3/multidict/__init__.pyi0000644000175100001650000001150313602414211020265 0ustar vstsdocker00000000000000import abc from typing import ( Dict, Generic, Iterable, Iterator, List, Mapping, MutableMapping, Tuple, TypeVar, Union, overload, ) class istr(str): ... upstr = istr _S = Union[str, istr] _T = TypeVar("_T") _T_co = TypeVar("_T_co", covariant=True) _D = TypeVar("_D") class MultiMapping(Mapping[_S, _T_co]): @overload @abc.abstractmethod def getall(self, key: _S) -> List[_T_co]: ... @overload @abc.abstractmethod def getall(self, key: _S, default: _D) -> Union[List[_T_co], _D]: ... @overload @abc.abstractmethod def getone(self, key: _S) -> _T_co: ... @overload @abc.abstractmethod def getone(self, key: _S, default: _D) -> Union[_T_co, _D]: ... _Arg = Union[Mapping[_S, _T], Dict[_S, _T], MultiMapping[_T], Iterable[Tuple[_S, _T]]] class MutableMultiMapping(MultiMapping[_T], MutableMapping[_S, _T], Generic[_T]): @abc.abstractmethod def add(self, key: _S, value: _T) -> None: ... @abc.abstractmethod def extend(self, arg: _Arg[_T] = ..., **kwargs: _T) -> None: ... @overload @abc.abstractmethod def popone(self, key: _S) -> _T: ... @overload @abc.abstractmethod def popone(self, key: _S, default: _D) -> Union[_T, _D]: ... @overload @abc.abstractmethod def popall(self, key: _S) -> List[_T]: ... @overload @abc.abstractmethod def popall(self, key: _S, default: _D) -> Union[List[_T], _D]: ... class MultiDict(MutableMultiMapping[_T], Generic[_T]): def __init__(self, arg: _Arg[_T] = ..., **kwargs: _T) -> None: ... def copy(self) -> MultiDict[_T]: ... def __getitem__(self, k: _S) -> _T: ... def __setitem__(self, k: _S, v: _T) -> None: ... def __delitem__(self, v: _S) -> None: ... def __iter__(self) -> Iterator[_S]: ... def __len__(self) -> int: ... @overload def getall(self, key: _S) -> List[_T]: ... @overload def getall(self, key: _S, default: _D) -> Union[List[_T], _D]: ... @overload def getone(self, key: _S) -> _T: ... @overload def getone(self, key: _S, default: _D) -> Union[_T, _D]: ... def add(self, key: _S, value: _T) -> None: ... def extend(self, arg: _Arg[_T] = ..., **kwargs: _T) -> None: ... @overload def popone(self, key: _S) -> _T: ... @overload def popone(self, key: _S, default: _D) -> Union[_T, _D]: ... @overload def popall(self, key: _S) -> List[_T]: ... @overload def popall(self, key: _S, default: _D) -> Union[List[_T], _D]: ... class CIMultiDict(MutableMultiMapping[_T], Generic[_T]): def __init__(self, arg: _Arg[_T] = ..., **kwargs: _T) -> None: ... def copy(self) -> CIMultiDict[_T]: ... def __getitem__(self, k: _S) -> _T: ... def __setitem__(self, k: _S, v: _T) -> None: ... def __delitem__(self, v: _S) -> None: ... def __iter__(self) -> Iterator[_S]: ... def __len__(self) -> int: ... @overload def getall(self, key: _S) -> List[_T]: ... @overload def getall(self, key: _S, default: _D) -> Union[List[_T], _D]: ... @overload def getone(self, key: _S) -> _T: ... 
@overload def getone(self, key: _S, default: _D) -> Union[_T, _D]: ... def add(self, key: _S, value: _T) -> None: ... def extend(self, arg: _Arg[_T] = ..., **kwargs: _T) -> None: ... @overload def popone(self, key: _S) -> _T: ... @overload def popone(self, key: _S, default: _D) -> Union[_T, _D]: ... @overload def popall(self, key: _S) -> List[_T]: ... @overload def popall(self, key: _S, default: _D) -> Union[List[_T], _D]: ... class MultiDictProxy(MultiMapping[_T], Generic[_T]): def __init__( self, arg: Union[MultiMapping[_T], MutableMultiMapping[_T]] ) -> None: ... def copy(self) -> MultiDict[_T]: ... def __getitem__(self, k: _S) -> _T: ... def __iter__(self) -> Iterator[_S]: ... def __len__(self) -> int: ... @overload def getall(self, key: _S) -> List[_T]: ... @overload def getall(self, key: _S, default: _D) -> Union[List[_T], _D]: ... @overload def getone(self, key: _S) -> _T: ... @overload def getone(self, key: _S, default: _D) -> Union[_T, _D]: ... class CIMultiDictProxy(MultiMapping[_T], Generic[_T]): def __init__( self, arg: Union[MultiMapping[_T], MutableMultiMapping[_T]] ) -> None: ... def __getitem__(self, k: _S) -> _T: ... def __iter__(self) -> Iterator[_S]: ... def __len__(self) -> int: ... @overload def getall(self, key: _S) -> List[_T]: ... @overload def getall(self, key: _S, default: _D) -> Union[List[_T], _D]: ... @overload def getone(self, key: _S) -> _T: ... @overload def getone(self, key: _S, default: _D) -> Union[_T, _D]: ... def copy(self) -> CIMultiDict[_T]: ... def getversion( md: Union[MultiDict[_T], CIMultiDict[_T], MultiDictProxy[_T], CIMultiDictProxy[_T]] ) -> int: ... multidict-4.7.3/multidict/_abc.py0000644000175100001650000000200113602414211017232 0ustar vstsdocker00000000000000import abc from collections.abc import Mapping, MutableMapping class _TypingMeta(abc.ABCMeta): # A fake metaclass to satisfy typing deps in runtime # basically MultiMapping[str] and other generic-like type instantiations # are emulated. # Note: real type hints are provided by __init__.pyi stub file def __getitem__(self, key): return self class MultiMapping(Mapping, metaclass=_TypingMeta): @abc.abstractmethod def getall(self, key, default=None): raise KeyError @abc.abstractmethod def getone(self, key, default=None): raise KeyError class MutableMultiMapping(MultiMapping, MutableMapping): @abc.abstractmethod def add(self, key, value): raise NotImplementedError @abc.abstractmethod def extend(self, *args, **kwargs): raise NotImplementedError @abc.abstractmethod def popone(self, key, default=None): raise KeyError @abc.abstractmethod def popall(self, key, default=None): raise KeyError multidict-4.7.3/multidict/_compat.py0000644000175100001650000000055313602414211020002 0ustar vstsdocker00000000000000import os import platform NO_EXTENSIONS = bool(os.environ.get("MULTIDICT_NO_EXTENSIONS")) PYPY = platform.python_implementation() == "PyPy" USE_CYTHON_EXTENSIONS = USE_CYTHON = not NO_EXTENSIONS and not PYPY if USE_CYTHON_EXTENSIONS: try: from . 
import _multidict # noqa except ImportError: USE_CYTHON_EXTENSIONS = USE_CYTHON = False multidict-4.7.3/multidict/_multidict.c0000644000175100001650000011703613602414211020314 0ustar vstsdocker00000000000000#include "Python.h" #include "structmember.h" // Include order important #include "_multilib/defs.h" #include "_multilib/istr.h" #include "_multilib/pair_list.h" #include "_multilib/dict.h" #include "_multilib/iter.h" #include "_multilib/views.h" static PyObject *collections_abc_mapping; static PyObject *collections_abc_mut_mapping; static PyObject *collections_abc_mut_multi_mapping; static PyTypeObject multidict_type; static PyTypeObject cimultidict_type; static PyTypeObject multidict_proxy_type; static PyTypeObject cimultidict_proxy_type; static PyObject *repr_func; #define MultiDict_CheckExact(o) (Py_TYPE(o) == &multidict_type) #define CIMultiDict_CheckExact(o) (Py_TYPE(o) == &cimultidict_type) #define MultiDictProxy_CheckExact(o) (Py_TYPE(o) == &multidict_proxy_type) #define CIMultiDictProxy_CheckExact(o) (Py_TYPE(o) == &cimultidict_proxy_type) /* Helper macro for something like isinstance(obj, Base) */ #define _MultiDict_Check(o) \ ((MultiDict_CheckExact(o)) || \ (CIMultiDict_CheckExact(o)) || \ (MultiDictProxy_CheckExact(o)) || \ (CIMultiDictProxy_CheckExact(o))) /******************** Internal Methods ********************/ /* Forward declaration */ static PyObject *multidict_items(MultiDictObject *self); static inline PyObject * _multidict_getone(MultiDictObject *self, PyObject *key, PyObject *_default) { PyObject *val = pair_list_get_one(&self->pairs, key); if (val == NULL && PyErr_ExceptionMatches(PyExc_KeyError) && _default != NULL) { PyErr_Clear(); Py_INCREF(_default); return _default; } return val; } static inline int _multidict_eq(MultiDictObject *self, MultiDictObject *other) { Py_ssize_t pos1 = 0, pos2 = 0; Py_hash_t h1 = 0, h2 = 0; PyObject *identity1 = NULL, *identity2 = NULL, *value1 = NULL, *value2 = NULL; int cmp_identity = 0, cmp_value = 0; if (self == other) { return 1; } if (pair_list_len(&self->pairs) != pair_list_len(&other->pairs)) { return 0; } while (_pair_list_next(&self->pairs, &pos1, &identity1, NULL, &value1, &h1) && _pair_list_next(&other->pairs, &pos2, &identity2, NULL, &value2, &h2)) { if (h1 != h2) { return 0; } cmp_identity = PyObject_RichCompareBool(identity1, identity2, Py_NE); if (cmp_identity < 0) { return -1; } cmp_value = PyObject_RichCompareBool(value1, value2, Py_NE); if (cmp_value < 0) { return -1; } if (cmp_identity || cmp_value) { return 0; } } return 1; } static inline int _multidict_update_items(MultiDictObject *self, pair_list_t *pairs) { return pair_list_update(&self->pairs, pairs); } static inline int _multidict_append_items(MultiDictObject *self, pair_list_t *pairs) { PyObject *key = NULL, *value = NULL; Py_ssize_t pos = 0; while (_pair_list_next(pairs, &pos, NULL, &key, &value, NULL)) { if (pair_list_add(&self->pairs, key, value) < 0) { return -1; } } return 0; } static inline int _multidict_append_items_seq(MultiDictObject *self, PyObject *arg, const char *name) { PyObject *key = NULL, *value = NULL, *item = NULL, *iter = PyObject_GetIter(arg); if (iter == NULL) { return -1; } while ((item = PyIter_Next(iter)) != NULL) { if (PyTuple_CheckExact(item)) { if (PyTuple_GET_SIZE(item) != 2) { goto invalid_type; } key = PyTuple_GET_ITEM(item, 0); Py_INCREF(key); value = PyTuple_GET_ITEM(item, 1); Py_INCREF(value); } else if (PyList_CheckExact(item)) { if (PyList_GET_SIZE(item) != 2) { goto invalid_type; } key = PyList_GET_ITEM(item, 0); 
Py_INCREF(key); value = PyList_GET_ITEM(item, 1); Py_INCREF(value); } else if (PySequence_Check(item)) { if (PySequence_Size(item) != 2) { goto invalid_type; } key = PySequence_GetItem(item, 0); value = PySequence_GetItem(item, 1); } else { goto invalid_type; } if (pair_list_add(&self->pairs, key, value) < 0) { goto fail; } Py_CLEAR(key); Py_CLEAR(value); Py_CLEAR(item); } Py_DECREF(iter); if (PyErr_Occurred()) { return -1; } return 0; invalid_type: PyErr_Format( PyExc_TypeError, "%s takes either dict or list of (key, value) pairs", name, NULL ); goto fail; fail: Py_XDECREF(key); Py_XDECREF(value); Py_XDECREF(item); Py_DECREF(iter); return -1; } static inline int _multidict_list_extend(PyObject *list, PyObject *target_list) { PyObject *item = NULL, *iter = PyObject_GetIter(target_list); if (iter == NULL) { return -1; } while ((item = PyIter_Next(iter)) != NULL) { if (PyList_Append(list, item) < 0) { Py_DECREF(item); Py_DECREF(iter); return -1; } Py_DECREF(item); } Py_DECREF(iter); if (PyErr_Occurred()) { return -1; } return 0; } static inline int _multidict_extend_with_args(MultiDictObject *self, PyObject *arg, PyObject *kwds, const char *name, int do_add) { PyObject *arg_items = NULL, /* tracked by GC */ *kwds_items = NULL; /* new reference */ pair_list_t *pairs = NULL; int err = 0; // TODO: mb can be refactored more clear if (_MultiDict_Check(arg) && kwds == NULL) { if (MultiDict_CheckExact(arg) || CIMultiDict_CheckExact(arg)) { pairs = &((MultiDictObject*)arg)->pairs; } else if (MultiDictProxy_CheckExact(arg) || CIMultiDictProxy_CheckExact(arg)) { pairs = &((MultiDictProxyObject*)arg)->md->pairs; } if (do_add) { return _multidict_append_items(self, pairs); } return _multidict_update_items(self, pairs); } if (PyObject_HasAttrString(arg, "items")) { if (_MultiDict_Check(arg)) { arg_items = multidict_items((MultiDictObject*)arg); } else { arg_items = PyMapping_Items(arg); } if (arg_items == NULL) { return -1; } } else { arg_items = arg; Py_INCREF(arg_items); } if (kwds && PyArg_ValidateKeywordArguments(kwds)) { kwds_items = PyDict_Items(kwds); err = _multidict_list_extend(arg_items, kwds_items); Py_DECREF(kwds_items); if (err < 0) { Py_DECREF(arg_items); return -1; } } if (do_add) { err = _multidict_append_items_seq(self, arg_items, name); } else { err = pair_list_update_from_seq(&self->pairs, arg_items); } Py_DECREF(arg_items); return err; } static inline int _multidict_extend_with_kwds(MultiDictObject *self, PyObject *kwds, const char *name, int do_add) { PyObject *arg = NULL; int err = 0; if (!PyArg_ValidateKeywordArguments(kwds)) { return -1; } arg = PyDict_Items(kwds); if (do_add) { err = _multidict_append_items_seq(self, arg, name); } else { err = pair_list_update_from_seq(&self->pairs, arg); } Py_DECREF(arg); return err; } static inline int _multidict_extend(MultiDictObject *self, PyObject *args, PyObject *kwds, const char *name, int do_add) { PyObject *arg = NULL; if (args && PyObject_Length(args) > 1) { PyErr_Format( PyExc_TypeError, "%s takes at most 1 positional argument (%zd given)", name, PyObject_Length(args), NULL ); return -1; } if (args && PyObject_Length(args) > 0) { if (!PyArg_UnpackTuple(args, name, 0, 1, &arg)) { return -1; } if (_multidict_extend_with_args(self, arg, kwds, name, do_add) < 0) { return -1; } } else if (kwds && PyObject_Length(kwds) > 0) { if (_multidict_extend_with_kwds(self, kwds, name, do_add) < 0) { return -1; } } return 0; } static inline PyObject * _multidict_copy(MultiDictObject *self, PyTypeObject *multidict_tp_object) { MultiDictObject 
*new_multidict = NULL; PyObject *arg_items = NULL, *items = NULL; new_multidict = (MultiDictObject*)PyType_GenericNew( multidict_tp_object, NULL, NULL); if (new_multidict == NULL) { return NULL; } if (multidict_tp_object->tp_init( (PyObject*)new_multidict, NULL, NULL) < 0) { return NULL; } items = multidict_items(self); if (items == NULL) { goto fail; } // TODO: "Implementation looks as slow as possible ..." arg_items = PyTuple_New(1); if (arg_items == NULL) { goto fail; } Py_INCREF(items); PyTuple_SET_ITEM(arg_items, 0, items); if (_multidict_extend( new_multidict, arg_items, NULL, "copy", 1) < 0) { goto fail; } Py_DECREF(items); Py_DECREF(arg_items); return (PyObject*)new_multidict; fail: Py_XDECREF(items); Py_XDECREF(arg_items); Py_DECREF(new_multidict); return NULL; } static inline PyObject * _multidict_proxy_copy(MultiDictProxyObject *self, PyTypeObject *type) { PyObject *new_multidict = PyType_GenericNew(type, NULL, NULL); if (new_multidict == NULL) { goto fail; } if (type->tp_init(new_multidict, NULL, NULL) < 0) { goto fail; } if (_multidict_extend_with_args( (MultiDictObject*)new_multidict, (PyObject*)self, NULL, "copy", 1) < 0) { goto fail; } return new_multidict; fail: Py_XDECREF(new_multidict); return NULL; } /******************** Base Methods ********************/ static inline PyObject * multidict_getall(MultiDictObject *self, PyObject *args, PyObject *kwds) { PyObject *list = NULL, *key = NULL, *_default = NULL; static char *getall_keywords[] = {"key", "default", NULL}; if (!PyArg_ParseTupleAndKeywords(args, kwds, "O|O:getall", getall_keywords, &key, &_default)) { return NULL; } list = pair_list_get_all(&self->pairs, key); if (list == NULL && PyErr_ExceptionMatches(PyExc_KeyError) && _default != NULL) { PyErr_Clear(); Py_INCREF(_default); return _default; } return list; } static inline PyObject * multidict_getone(MultiDictObject *self, PyObject *args, PyObject *kwds) { PyObject *key = NULL, *_default = NULL; static char *getone_keywords[] = {"key", "default", NULL}; if (!PyArg_ParseTupleAndKeywords(args, kwds, "O|O:getone", getone_keywords, &key, &_default)) { return NULL; } return _multidict_getone(self, key, _default); } static inline PyObject * multidict_get(MultiDictObject *self, PyObject *args, PyObject *kwds) { PyObject *key = NULL, *_default = Py_None, *ret; static char *getone_keywords[] = {"key", "default", NULL}; if (!PyArg_ParseTupleAndKeywords(args, kwds, "O|O:getone", getone_keywords, &key, &_default)) { return NULL; } ret = _multidict_getone(self, key, _default); return ret; } static inline PyObject * multidict_keys(MultiDictObject *self) { return multidict_keysview_new((PyObject*)self); } static inline PyObject * multidict_items(MultiDictObject *self) { return multidict_itemsview_new((PyObject*)self); } static inline PyObject * multidict_values(MultiDictObject *self) { return multidict_valuesview_new((PyObject*)self); } static inline PyObject * multidict_reduce(MultiDictObject *self) { PyObject *items = NULL, *items_list = NULL, *args = NULL, *result = NULL; items = multidict_items(self); if (items == NULL) { goto ret; } items_list = PySequence_List(items); if (items_list == NULL) { goto ret; } args = PyTuple_Pack(1, items_list); if (args == NULL) { goto ret; } result = PyTuple_Pack(2, Py_TYPE(self), args); ret: Py_XDECREF(args); Py_XDECREF(items_list); Py_XDECREF(items); return result; } static inline PyObject * multidict_repr(PyObject *self) { return PyObject_CallFunctionObjArgs( repr_func, self, NULL); } static inline Py_ssize_t 
multidict_mp_len(MultiDictObject *self) { return pair_list_len(&self->pairs); } static inline PyObject * multidict_mp_subscript(MultiDictObject *self, PyObject *key) { return _multidict_getone(self, key, NULL); } static inline int multidict_mp_as_subscript(MultiDictObject *self, PyObject *key, PyObject *val) { if (val == NULL) { return pair_list_del(&self->pairs, key); } else { return pair_list_replace(&self->pairs, key, val); } } static inline int multidict_sq_contains(MultiDictObject *self, PyObject *key) { return pair_list_contains(&self->pairs, key); } static inline PyObject * multidict_tp_iter(MultiDictObject *self) { return PyObject_GetIter(multidict_keysview_new((PyObject*)self)); } static inline PyObject * multidict_tp_richcompare(PyObject *self, PyObject *other, int op) { // TODO: refactoring me with love int cmp = 0; if (op != Py_EQ && op != Py_NE) { Py_RETURN_NOTIMPLEMENTED; } if (MultiDict_CheckExact(other) || CIMultiDict_CheckExact(other)) { cmp = _multidict_eq( (MultiDictObject*)self, (MultiDictObject*)other ); if (cmp < 0) { return NULL; } if (op == Py_NE) { cmp = !cmp; } return PyBool_FromLong(cmp); } if (MultiDictProxy_CheckExact(other) || CIMultiDictProxy_CheckExact(other)) { cmp = _multidict_eq( (MultiDictObject*)self, ((MultiDictProxyObject*)other)->md ); if (cmp < 0) { return NULL; } if (op == Py_NE) { cmp = !cmp; } return PyBool_FromLong(cmp); } cmp = PyObject_IsInstance(other, (PyObject*)collections_abc_mapping); if (cmp < 0) { return NULL; } if (cmp) { cmp = pair_list_eq_to_mapping(&((MultiDictObject*)self)->pairs, other); if (cmp < 0) { return NULL; } if (op == Py_NE) { cmp = !cmp; } return PyBool_FromLong(cmp); } Py_RETURN_NOTIMPLEMENTED; } static inline void multidict_tp_dealloc(MultiDictObject *self) { PyObject_GC_UnTrack(self); Py_TRASHCAN_SAFE_BEGIN(self); if (self->weaklist != NULL) { PyObject_ClearWeakRefs((PyObject *)self); }; pair_list_dealloc(&self->pairs); Py_TYPE(self)->tp_free((PyObject *)self); Py_TRASHCAN_SAFE_END(self); } static inline int multidict_tp_traverse(MultiDictObject *self, visitproc visit, void *arg) { return pair_list_traverse(&self->pairs, visit, arg); } static inline int multidict_tp_clear(MultiDictObject *self) { return pair_list_clear(&self->pairs); } PyDoc_STRVAR(multidict_getall_doc, "Return a list of all values matching the key."); PyDoc_STRVAR(multidict_getone_doc, "Get first value matching the key."); PyDoc_STRVAR(multidict_get_doc, "Get first value matching the key.\n\nThe method is alias for .getone()."); PyDoc_STRVAR(multidict_keys_doc, "Return a new view of the dictionary's keys."); PyDoc_STRVAR(multidict_items_doc, "Return a new view of the dictionary's items *(key, value) pairs)."); PyDoc_STRVAR(multidict_values_doc, "Return a new view of the dictionary's values."); /******************** MultiDict ********************/ static inline int multidict_tp_init(MultiDictObject *self, PyObject *args, PyObject *kwds) { if (pair_list_init(&self->pairs) < 0) { return -1; } if (_multidict_extend(self, args, kwds, "MultiDict", 1) < 0) { return -1; } return 0; } static inline PyObject * multidict_add(MultiDictObject *self, PyObject *args, PyObject *kwds) { PyObject *key = NULL, *val = NULL; static char *kwlist[] = {"key", "value", NULL}; if (!PyArg_ParseTupleAndKeywords(args, kwds, "OO:add", kwlist, &key, &val)) { return NULL; } if (pair_list_add(&self->pairs, key, val) < 0) { return NULL; } Py_RETURN_NONE; } static inline PyObject * multidict_copy(MultiDictObject *self) { return _multidict_copy(self, &multidict_type); } static inline 
PyObject * multidict_extend(MultiDictObject *self, PyObject *args, PyObject *kwds) { if (_multidict_extend(self, args, kwds, "extend", 1) < 0) { return NULL; } Py_RETURN_NONE; } static inline PyObject * multidict_clear(MultiDictObject *self) { if (pair_list_clear(&self->pairs) < 0) { return NULL; } Py_RETURN_NONE; } static inline PyObject * multidict_setdefault(MultiDictObject *self, PyObject *args, PyObject *kwds) { PyObject *key = NULL, *_default = NULL; static char *setdefault_keywords[] = {"key", "default", NULL}; if (!PyArg_ParseTupleAndKeywords(args, kwds, "O|O:setdefault", setdefault_keywords, &key, &_default)) { return NULL; } return pair_list_set_default(&self->pairs, key, _default); } static inline PyObject * multidict_popone(MultiDictObject *self, PyObject *args, PyObject *kwds) { PyObject *key = NULL, *_default = NULL, *ret_val = NULL; static char *popone_keywords[] = {"key", "default", NULL}; if (!PyArg_ParseTupleAndKeywords(args, kwds, "O|O:popone", popone_keywords, &key, &_default)) { return NULL; } ret_val = pair_list_pop_one(&self->pairs, key); if (ret_val == NULL && PyErr_ExceptionMatches(PyExc_KeyError) && _default != NULL) { PyErr_Clear(); Py_INCREF(_default); return _default; } return ret_val; } static inline PyObject * multidict_popall(MultiDictObject *self, PyObject *args, PyObject *kwds) { PyObject *key = NULL, *_default = NULL, *ret_val = NULL; static char *popall_keywords[] = {"key", "default", NULL}; if (!PyArg_ParseTupleAndKeywords(args, kwds, "O|O:popall", popall_keywords, &key, &_default)) { return NULL; } ret_val = pair_list_pop_all(&self->pairs, key); if (ret_val == NULL && PyErr_ExceptionMatches(PyExc_KeyError) && _default != NULL) { PyErr_Clear(); Py_INCREF(_default); return _default; } return ret_val; } static inline PyObject * multidict_popitem(MultiDictObject *self) { return pair_list_pop_item(&self->pairs); } static inline PyObject * multidict_update(MultiDictObject *self, PyObject *args, PyObject *kwds) { if (_multidict_extend(self, args, kwds, "update", 0) < 0) { return NULL; } Py_RETURN_NONE; } PyDoc_STRVAR(multidict_add_doc, "Add the key and value, not overwriting any previous value."); PyDoc_STRVAR(multidict_copy_doc, "Return a copy of itself."); PyDoc_STRVAR(multdicit_method_extend_doc, "Extend current MultiDict with more values.\n\ This method must be used instead of update."); PyDoc_STRVAR(multidict_clear_doc, "Remove all items from MultiDict"); PyDoc_STRVAR(multidict_setdefault_doc, "Return value for key, set value to default if key is not present."); PyDoc_STRVAR(multidict_popone_doc, "Remove the last occurrence of key and return the corresponding value.\n\n\ If key is not found, default is returned if given, otherwise KeyError is \ raised.\n"); PyDoc_STRVAR(multidict_popall_doc, "Remove all occurrences of key and return the list of corresponding values.\n\n\ If key is not found, default is returned if given, otherwise KeyError is \ raised.\n"); PyDoc_STRVAR(multidict_popitem_doc, "Remove and return an arbitrary (key, value) pair."); PyDoc_STRVAR(multidict_update_doc, "Update the dictionary from *other*, overwriting existing keys."); static inline PyObject * multidict_class_getitem(PyObject *self, PyObject *arg) { Py_INCREF(self); return self; } PyDoc_STRVAR(sizeof__doc__, "D.__sizeof__() -> size of D in memory, in bytes"); static inline PyObject * _multidict_sizeof(MultiDictObject *self) { Py_ssize_t size = sizeof(MultiDictObject); if (self->pairs.pairs != self->pairs.buffer) { size += (Py_ssize_t)sizeof(pair_t) * self->pairs.capacity; } 
return PyLong_FromSsize_t(size); } static PySequenceMethods multidict_sequence = { .sq_contains = (objobjproc)multidict_sq_contains, }; static PyMappingMethods multidict_mapping = { .mp_length = (lenfunc)multidict_mp_len, .mp_subscript = (binaryfunc)multidict_mp_subscript, .mp_ass_subscript = (objobjargproc)multidict_mp_as_subscript, }; static PyMethodDef multidict_methods[] = { { "getall", (PyCFunction)multidict_getall, METH_VARARGS | METH_KEYWORDS, multidict_getall_doc }, { "getone", (PyCFunction)multidict_getone, METH_VARARGS | METH_KEYWORDS, multidict_getone_doc }, { "get", (PyCFunction)multidict_get, METH_VARARGS | METH_KEYWORDS, multidict_get_doc }, { "keys", (PyCFunction)multidict_keys, METH_NOARGS, multidict_keys_doc }, { "items", (PyCFunction)multidict_items, METH_NOARGS, multidict_items_doc }, { "values", (PyCFunction)multidict_values, METH_NOARGS, multidict_values_doc }, { "add", (PyCFunction)multidict_add, METH_VARARGS | METH_KEYWORDS, multidict_add_doc }, { "copy", (PyCFunction)multidict_copy, METH_NOARGS, multidict_copy_doc }, { "extend", (PyCFunction)multidict_extend, METH_VARARGS | METH_KEYWORDS, multdicit_method_extend_doc }, { "clear", (PyCFunction)multidict_clear, METH_NOARGS, multidict_clear_doc }, { "setdefault", (PyCFunction)multidict_setdefault, METH_VARARGS | METH_KEYWORDS, multidict_setdefault_doc }, { "popone", (PyCFunction)multidict_popone, METH_VARARGS | METH_KEYWORDS, multidict_popone_doc }, { "pop", (PyCFunction)multidict_popone, METH_VARARGS | METH_KEYWORDS, multidict_popone_doc }, { "popall", (PyCFunction)multidict_popall, METH_VARARGS | METH_KEYWORDS, multidict_popall_doc }, { "popitem", (PyCFunction)multidict_popitem, METH_NOARGS, multidict_popitem_doc }, { "update", (PyCFunction)multidict_update, METH_VARARGS | METH_KEYWORDS, multidict_update_doc }, { "__reduce__", (PyCFunction)multidict_reduce, METH_NOARGS, NULL, }, { "__class_getitem__", (PyCFunction)multidict_class_getitem, METH_O | METH_CLASS, NULL }, { "__sizeof__", (PyCFunction)_multidict_sizeof, METH_NOARGS, sizeof__doc__, }, { NULL, NULL } /* sentinel */ }; PyDoc_STRVAR(MultDict_doc, "Dictionary with the support for duplicate keys."); static PyTypeObject multidict_type = { PyVarObject_HEAD_INIT(NULL, 0) "multidict._multidict.MultiDict", /* tp_name */ sizeof(MultiDictObject), /* tp_basicsize */ .tp_dealloc = (destructor)multidict_tp_dealloc, .tp_repr = (reprfunc)multidict_repr, .tp_as_sequence = &multidict_sequence, .tp_as_mapping = &multidict_mapping, .tp_flags = Py_TPFLAGS_DEFAULT | Py_TPFLAGS_BASETYPE | Py_TPFLAGS_HAVE_GC, .tp_doc = MultDict_doc, .tp_traverse = (traverseproc)multidict_tp_traverse, .tp_clear = (inquiry)multidict_tp_clear, .tp_richcompare = (richcmpfunc)multidict_tp_richcompare, .tp_weaklistoffset = offsetof(MultiDictObject, weaklist), .tp_iter = (getiterfunc)multidict_tp_iter, .tp_methods = multidict_methods, .tp_init = (initproc)multidict_tp_init, .tp_alloc = PyType_GenericAlloc, .tp_new = PyType_GenericNew, .tp_free = PyObject_GC_Del, }; /******************** CIMultiDict ********************/ static inline int cimultidict_tp_init(MultiDictObject *self, PyObject *args, PyObject *kwds) { if (ci_pair_list_init(&self->pairs) < 0) { return -1; } if (_multidict_extend(self, args, kwds, "CIMultiDict", 1) < 0) { return -1; } return 0; } static inline PyObject * cimultidict_copy(MultiDictObject *self) { return _multidict_copy(self, &cimultidict_type); } PyDoc_STRVAR(cimultidict_copy_doc, "Return a copy of itself."); static PyMethodDef cimultidict_methods[] = { { "copy", 
(PyCFunction)cimultidict_copy, METH_NOARGS, cimultidict_copy_doc }, { NULL, NULL } /* sentinel */ }; PyDoc_STRVAR(CIMultDict_doc, "Dictionary with the support for duplicate case-insensitive keys."); static PyTypeObject cimultidict_type = { PyVarObject_HEAD_INIT(NULL, 0) "multidict._multidict.CIMultiDict", /* tp_name */ sizeof(MultiDictObject), /* tp_basicsize */ .tp_dealloc = (destructor)multidict_tp_dealloc, .tp_flags = Py_TPFLAGS_DEFAULT | Py_TPFLAGS_BASETYPE | Py_TPFLAGS_HAVE_GC, .tp_doc = CIMultDict_doc, .tp_traverse = (traverseproc)multidict_tp_traverse, .tp_clear = (inquiry)multidict_tp_clear, .tp_weaklistoffset = offsetof(MultiDictObject, weaklist), .tp_methods = cimultidict_methods, .tp_base = &multidict_type, .tp_init = (initproc)cimultidict_tp_init, .tp_alloc = PyType_GenericAlloc, .tp_new = PyType_GenericNew, .tp_free = PyObject_GC_Del, }; /******************** MultiDictProxy ********************/ static inline int multidict_proxy_tp_init(MultiDictProxyObject *self, PyObject *args, PyObject *kwds) { PyObject *arg = NULL; MultiDictObject *md = NULL; if (!PyArg_UnpackTuple(args, "multidict._multidict.MultiDictProxy", 0, 1, &arg)) { return -1; } if (arg == NULL) { PyErr_Format( PyExc_TypeError, "__init__() missing 1 required positional argument: 'arg'" ); return -1; } if (!MultiDictProxy_CheckExact(arg) && !CIMultiDict_CheckExact(arg) && !MultiDict_CheckExact(arg)) { PyErr_Format( PyExc_TypeError, "ctor requires MultiDict or MultiDictProxy instance, " "not ", Py_TYPE(arg)->tp_name ); return -1; } md = (MultiDictObject*)arg; if (MultiDictProxy_CheckExact(arg)) { md = ((MultiDictProxyObject*)arg)->md; } Py_INCREF(md); self->md = md; return 0; } static inline PyObject * multidict_proxy_getall(MultiDictProxyObject *self, PyObject *args, PyObject *kwds) { return multidict_getall(self->md, args, kwds); } static inline PyObject * multidict_proxy_getone(MultiDictProxyObject *self, PyObject *args, PyObject *kwds) { return multidict_getone(self->md, args, kwds); } static inline PyObject * multidict_proxy_get(MultiDictProxyObject *self, PyObject *args, PyObject *kwds) { return multidict_get(self->md, args, kwds); } static inline PyObject * multidict_proxy_keys(MultiDictProxyObject *self) { return multidict_keys(self->md); } static inline PyObject * multidict_proxy_items(MultiDictProxyObject *self) { return multidict_items(self->md); } static inline PyObject * multidict_proxy_values(MultiDictProxyObject *self) { return multidict_values(self->md); } static inline PyObject * multidict_proxy_copy(MultiDictProxyObject *self) { return _multidict_proxy_copy(self, &multidict_type); } static inline PyObject * multidict_proxy_reduce(MultiDictProxyObject *self) { PyErr_Format( PyExc_TypeError, "can't pickle %s objects", Py_TYPE(self)->tp_name ); return NULL; } static inline Py_ssize_t multidict_proxy_mp_len(MultiDictProxyObject *self) { return multidict_mp_len(self->md); } static inline PyObject * multidict_proxy_mp_subscript(MultiDictProxyObject *self, PyObject *key) { return multidict_mp_subscript(self->md, key); } static inline int multidict_proxy_sq_contains(MultiDictProxyObject *self, PyObject *key) { return multidict_sq_contains(self->md, key); } static inline PyObject * multidict_proxy_tp_iter(MultiDictProxyObject *self) { return multidict_tp_iter(self->md); } static inline PyObject * multidict_proxy_tp_richcompare(MultiDictProxyObject *self, PyObject *other, int op) { return multidict_tp_richcompare((PyObject*)self->md, other, op); } static inline void 
multidict_proxy_tp_dealloc(MultiDictProxyObject *self) { PyObject_GC_UnTrack(self); if (self->weaklist != NULL) { PyObject_ClearWeakRefs((PyObject *)self); }; Py_XDECREF(self->md); Py_TYPE(self)->tp_free((PyObject *)self); } static inline int multidict_proxy_tp_traverse(MultiDictProxyObject *self, visitproc visit, void *arg) { Py_VISIT(self->md); return 0; } static inline int multidict_proxy_tp_clear(MultiDictProxyObject *self) { Py_CLEAR(self->md); return 0; } static PySequenceMethods multidict_proxy_sequence = { .sq_contains = (objobjproc)multidict_proxy_sq_contains, }; static PyMappingMethods multidict_proxy_mapping = { .mp_length = (lenfunc)multidict_proxy_mp_len, .mp_subscript = (binaryfunc)multidict_proxy_mp_subscript, }; static PyMethodDef multidict_proxy_methods[] = { { "getall", (PyCFunction)multidict_proxy_getall, METH_VARARGS | METH_KEYWORDS, multidict_getall_doc }, { "getone", (PyCFunction)multidict_proxy_getone, METH_VARARGS | METH_KEYWORDS, multidict_getone_doc }, { "get", (PyCFunction)multidict_proxy_get, METH_VARARGS | METH_KEYWORDS, multidict_get_doc }, { "keys", (PyCFunction)multidict_proxy_keys, METH_NOARGS, multidict_keys_doc }, { "items", (PyCFunction)multidict_proxy_items, METH_NOARGS, multidict_items_doc }, { "values", (PyCFunction)multidict_proxy_values, METH_NOARGS, multidict_values_doc }, { "copy", (PyCFunction)multidict_proxy_copy, METH_NOARGS, multidict_copy_doc }, { "__reduce__", (PyCFunction)multidict_proxy_reduce, METH_NOARGS, NULL }, { "__class_getitem__", (PyCFunction)multidict_class_getitem, METH_O | METH_CLASS, NULL }, { NULL, NULL } /* sentinel */ }; PyDoc_STRVAR(MultDictProxy_doc, "Read-only proxy for MultiDict instance."); static PyTypeObject multidict_proxy_type = { PyVarObject_HEAD_INIT(NULL, 0) "multidict._multidict.MultiDictProxy", /* tp_name */ sizeof(MultiDictProxyObject), /* tp_basicsize */ .tp_dealloc = (destructor)multidict_proxy_tp_dealloc, .tp_repr = (reprfunc)multidict_repr, .tp_as_sequence = &multidict_proxy_sequence, .tp_as_mapping = &multidict_proxy_mapping, .tp_flags = Py_TPFLAGS_DEFAULT | Py_TPFLAGS_BASETYPE | Py_TPFLAGS_HAVE_GC, .tp_doc = MultDictProxy_doc, .tp_traverse = (traverseproc)multidict_proxy_tp_traverse, .tp_clear = (inquiry)multidict_proxy_tp_clear, .tp_richcompare = (richcmpfunc)multidict_proxy_tp_richcompare, .tp_weaklistoffset = offsetof(MultiDictProxyObject, weaklist), .tp_iter = (getiterfunc)multidict_proxy_tp_iter, .tp_methods = multidict_proxy_methods, .tp_init = (initproc)multidict_proxy_tp_init, .tp_alloc = PyType_GenericAlloc, .tp_new = PyType_GenericNew, .tp_free = PyObject_GC_Del, }; /******************** CIMultiDictProxy ********************/ static inline int cimultidict_proxy_tp_init(MultiDictProxyObject *self, PyObject *args, PyObject *kwds) { PyObject *arg = NULL; MultiDictObject *md = NULL; if (!PyArg_UnpackTuple(args, "multidict._multidict.CIMultiDictProxy", 1, 1, &arg)) { return -1; } if (arg == NULL) { PyErr_Format( PyExc_TypeError, "__init__() missing 1 required positional argument: 'arg'" ); return -1; } if (!CIMultiDictProxy_CheckExact(arg) && !CIMultiDict_CheckExact(arg)) { PyErr_Format( PyExc_TypeError, "ctor requires CIMultiDict or CIMultiDictProxy instance, " "not ", Py_TYPE(arg)->tp_name ); return -1; } md = (MultiDictObject*)arg; if (CIMultiDictProxy_CheckExact(arg)) { md = ((MultiDictProxyObject*)arg)->md; } Py_INCREF(md); self->md = md; return 0; } static inline PyObject * cimultidict_proxy_copy(MultiDictProxyObject *self) { return _multidict_proxy_copy(self, &cimultidict_type); } 
PyDoc_STRVAR(CIMultDictProxy_doc, "Read-only proxy for CIMultiDict instance."); PyDoc_STRVAR(cimultidict_proxy_copy_doc, "Return copy of itself"); static PyMethodDef cimultidict_proxy_methods[] = { { "copy", (PyCFunction)cimultidict_proxy_copy, METH_NOARGS, cimultidict_proxy_copy_doc }, { NULL, NULL } /* sentinel */ }; static PyTypeObject cimultidict_proxy_type = { PyVarObject_HEAD_INIT(NULL, 0) "multidict._multidict.CIMultiDictProxy", /* tp_name */ sizeof(MultiDictProxyObject), /* tp_basicsize */ .tp_dealloc = (destructor)multidict_proxy_tp_dealloc, .tp_flags = Py_TPFLAGS_DEFAULT | Py_TPFLAGS_BASETYPE | Py_TPFLAGS_HAVE_GC, .tp_doc = CIMultDictProxy_doc, .tp_traverse = (traverseproc)multidict_proxy_tp_traverse, .tp_clear = (inquiry)multidict_proxy_tp_clear, .tp_richcompare = (richcmpfunc)multidict_proxy_tp_richcompare, .tp_weaklistoffset = offsetof(MultiDictProxyObject, weaklist), .tp_methods = cimultidict_proxy_methods, .tp_base = &multidict_proxy_type, .tp_init = (initproc)cimultidict_proxy_tp_init, .tp_alloc = PyType_GenericAlloc, .tp_new = PyType_GenericNew, .tp_free = PyObject_GC_Del, }; /******************** Other functions ********************/ static inline PyObject * getversion(PyObject *self, PyObject *md) { pair_list_t *pairs = NULL; if (MultiDict_CheckExact(md) || CIMultiDict_CheckExact(md)) { pairs = &((MultiDictObject*)md)->pairs; } else if (MultiDictProxy_CheckExact(md) || CIMultiDictProxy_CheckExact(md)) { pairs = &((MultiDictProxyObject*)md)->md->pairs; } else { PyErr_Format(PyExc_TypeError, "unexpected type"); return NULL; } return PyLong_FromUnsignedLong(pair_list_version(pairs)); } /******************** Module ********************/ static inline void module_free(void *m) { Py_CLEAR(collections_abc_mapping); Py_CLEAR(collections_abc_mut_mapping); Py_CLEAR(collections_abc_mut_multi_mapping); } static PyMethodDef multidict_module_methods[] = { { "getversion", (PyCFunction)getversion, METH_O }, { NULL, NULL } /* sentinel */ }; static PyModuleDef multidict_module = { PyModuleDef_HEAD_INIT, /* m_base */ "_multidict", /* m_name */ .m_size = -1, .m_methods = multidict_module_methods, .m_free = (freefunc)module_free, }; PyMODINIT_FUNC PyInit__multidict() { PyObject *module = NULL, *reg_func_call_result = NULL; #define WITH_MOD(NAME) \ Py_CLEAR(module); \ module = PyImport_ImportModule(NAME); \ if (module == NULL) { \ goto fail; \ } #define GET_MOD_ATTR(VAR, NAME) \ VAR = PyObject_GetAttrString(module, NAME); \ if (VAR == NULL) { \ goto fail; \ } if (multidict_views_init() < 0) { goto fail; } if (multidict_iter_init() < 0) { goto fail; } if (istr_init() < 0) { goto fail; } if (PyType_Ready(&multidict_type) < 0 || PyType_Ready(&cimultidict_type) < 0 || PyType_Ready(&multidict_proxy_type) < 0 || PyType_Ready(&cimultidict_proxy_type) < 0) { goto fail; } WITH_MOD("collections.abc"); GET_MOD_ATTR(collections_abc_mapping, "Mapping"); WITH_MOD("multidict._abc"); GET_MOD_ATTR(collections_abc_mut_mapping, "MultiMapping"); WITH_MOD("multidict._abc"); GET_MOD_ATTR(collections_abc_mut_multi_mapping, "MutableMultiMapping"); WITH_MOD("multidict._multidict_base"); GET_MOD_ATTR(repr_func, "_mdrepr"); /* Register in _abc mappings (CI)MultiDict and (CI)MultiDictProxy */ reg_func_call_result = PyObject_CallMethod( collections_abc_mut_mapping, "register", "O", (PyObject*)&multidict_proxy_type ); if (reg_func_call_result == NULL) { goto fail; } Py_DECREF(reg_func_call_result); reg_func_call_result = PyObject_CallMethod( collections_abc_mut_mapping, "register", "O", (PyObject*)&cimultidict_proxy_type 
); if (reg_func_call_result == NULL) { goto fail; } Py_DECREF(reg_func_call_result); reg_func_call_result = PyObject_CallMethod( collections_abc_mut_multi_mapping, "register", "O", (PyObject*)&multidict_type ); if (reg_func_call_result == NULL) { goto fail; } Py_DECREF(reg_func_call_result); reg_func_call_result = PyObject_CallMethod( collections_abc_mut_multi_mapping, "register", "O", (PyObject*)&cimultidict_type ); if (reg_func_call_result == NULL) { goto fail; } Py_DECREF(reg_func_call_result); /* Instantiate this module */ module = PyModule_Create(&multidict_module); Py_INCREF(&istr_type); if (PyModule_AddObject( module, "istr", (PyObject*)&istr_type) < 0) { goto fail; } Py_INCREF(&multidict_type); if (PyModule_AddObject( module, "MultiDict", (PyObject*)&multidict_type) < 0) { goto fail; } Py_INCREF(&cimultidict_type); if (PyModule_AddObject( module, "CIMultiDict", (PyObject*)&cimultidict_type) < 0) { goto fail; } Py_INCREF(&multidict_proxy_type); if (PyModule_AddObject( module, "MultiDictProxy", (PyObject*)&multidict_proxy_type) < 0) { goto fail; } Py_INCREF(&cimultidict_proxy_type); if (PyModule_AddObject( module, "CIMultiDictProxy", (PyObject*)&cimultidict_proxy_type) < 0) { goto fail; } return module; fail: Py_XDECREF(collections_abc_mapping); Py_XDECREF(collections_abc_mut_mapping); Py_XDECREF(collections_abc_mut_multi_mapping); return NULL; #undef WITH_MOD #undef GET_MOD_ATTR } multidict-4.7.3/multidict/_multidict_base.py0000644000175100001650000000731713602414211021514 0ustar vstsdocker00000000000000from collections.abc import ItemsView, Iterable, KeysView, Set, ValuesView def _abc_itemsview_register(view_cls): ItemsView.register(view_cls) def _abc_keysview_register(view_cls): KeysView.register(view_cls) def _abc_valuesview_register(view_cls): ValuesView.register(view_cls) def _viewbaseset_richcmp(view, other, op): if op == 0: # < if not isinstance(other, Set): return NotImplemented return len(view) < len(other) and view <= other elif op == 1: # <= if not isinstance(other, Set): return NotImplemented if len(view) > len(other): return False for elem in view: if elem not in other: return False return True elif op == 2: # == if not isinstance(other, Set): return NotImplemented return len(view) == len(other) and view <= other elif op == 3: # != return not view == other elif op == 4: # > if not isinstance(other, Set): return NotImplemented return len(view) > len(other) and view >= other elif op == 5: # >= if not isinstance(other, Set): return NotImplemented if len(view) < len(other): return False for elem in other: if elem not in view: return False return True def _viewbaseset_and(view, other): if not isinstance(other, Iterable): return NotImplemented if isinstance(view, Set): view = set(iter(view)) if isinstance(other, Set): other = set(iter(other)) if not isinstance(other, Set): other = set(iter(other)) return view & other def _viewbaseset_or(view, other): if not isinstance(other, Iterable): return NotImplemented if isinstance(view, Set): view = set(iter(view)) if isinstance(other, Set): other = set(iter(other)) if not isinstance(other, Set): other = set(iter(other)) return view | other def _viewbaseset_sub(view, other): if not isinstance(other, Iterable): return NotImplemented if isinstance(view, Set): view = set(iter(view)) if isinstance(other, Set): other = set(iter(other)) if not isinstance(other, Set): other = set(iter(other)) return view - other def _viewbaseset_xor(view, other): if not isinstance(other, Iterable): return NotImplemented if isinstance(view, Set): view = 
set(iter(view)) if isinstance(other, Set): other = set(iter(other)) if not isinstance(other, Set): other = set(iter(other)) return view ^ other def _itemsview_isdisjoint(view, other): "Return True if two sets have a null intersection." for v in other: if v in view: return False return True def _itemsview_repr(view): lst = [] for k, v in view: lst.append("{!r}: {!r}".format(k, v)) body = ", ".join(lst) return "{}({})".format(view.__class__.__name__, body) def _keysview_isdisjoint(view, other): "Return True if two sets have a null intersection." for k in other: if k in view: return False return True def _keysview_repr(view): lst = [] for k in view: lst.append("{!r}".format(k)) body = ", ".join(lst) return "{}({})".format(view.__class__.__name__, body) def _valuesview_repr(view): lst = [] for v in view: lst.append("{!r}".format(v)) body = ", ".join(lst) return "{}({})".format(view.__class__.__name__, body) def _mdrepr(md): lst = [] for k, v in md.items(): lst.append("'{}': {!r}".format(k, v)) body = ", ".join(lst) return "<{}({})>".format(md.__class__.__name__, body) multidict-4.7.3/multidict/_multidict_py.py0000644000175100001650000003455513602414211021236 0ustar vstsdocker00000000000000import sys from array import array from collections import abc from ._abc import MultiMapping, MutableMultiMapping _marker = object() class istr(str): """Case insensitive str.""" __is_istr__ = True upstr = istr # for relaxing backward compatibility problems def getversion(md): if not isinstance(md, _Base): raise TypeError("Parameter should be multidict or proxy") return md._impl._version _version = array("Q", [0]) class _Impl: __slots__ = ("_items", "_version") def __init__(self): self._items = [] self.incr_version() def incr_version(self): global _version v = _version v[0] += 1 self._version = v[0] if sys.implementation.name != "pypy": def __sizeof__(self): return object.__sizeof__(self) + sys.getsizeof(self._items) class _Base: def _title(self, key): return key def getall(self, key, default=_marker): """Return a list of all values matching the key.""" identity = self._title(key) res = [v for i, k, v in self._impl._items if i == identity] if res: return res if not res and default is not _marker: return default raise KeyError("Key not found: %r" % key) def getone(self, key, default=_marker): """Get first value matching the key.""" identity = self._title(key) for i, k, v in self._impl._items: if i == identity: return v if default is not _marker: return default raise KeyError("Key not found: %r" % key) # Mapping interface # def __getitem__(self, key): return self.getone(key) def get(self, key, default=None): """Get first value matching the key. The method is alias for .getone(). 
""" return self.getone(key, default) def __iter__(self): return iter(self.keys()) def __len__(self): return len(self._impl._items) def keys(self): """Return a new view of the dictionary's keys.""" return _KeysView(self._impl) def items(self): """Return a new view of the dictionary's items *(key, value) pairs).""" return _ItemsView(self._impl) def values(self): """Return a new view of the dictionary's values.""" return _ValuesView(self._impl) def __eq__(self, other): if not isinstance(other, abc.Mapping): return NotImplemented if isinstance(other, _Base): lft = self._impl._items rht = other._impl._items if len(lft) != len(rht): return False for (i1, k2, v1), (i2, k2, v2) in zip(lft, rht): if i1 != i2 or v1 != v2: return False return True if len(self._impl._items) != len(other): return False for k, v in self.items(): nv = other.get(k, _marker) if v != nv: return False return True def __contains__(self, key): identity = self._title(key) for i, k, v in self._impl._items: if i == identity: return True return False def __repr__(self): body = ", ".join("'{}': {!r}".format(k, v) for k, v in self.items()) return "<{}({})>".format(self.__class__.__name__, body) class MultiDictProxy(_Base, MultiMapping): """Read-only proxy for MultiDict instance.""" def __init__(self, arg): if not isinstance(arg, (MultiDict, MultiDictProxy)): raise TypeError( "ctor requires MultiDict or MultiDictProxy instance" ", not {}".format(type(arg)) ) self._impl = arg._impl def __reduce__(self): raise TypeError("can't pickle {} objects".format(self.__class__.__name__)) def copy(self): """Return a copy of itself.""" return MultiDict(self.items()) class CIMultiDictProxy(MultiDictProxy): """Read-only proxy for CIMultiDict instance.""" def __init__(self, arg): if not isinstance(arg, (CIMultiDict, CIMultiDictProxy)): raise TypeError( "ctor requires CIMultiDict or CIMultiDictProxy instance" ", not {}".format(type(arg)) ) self._impl = arg._impl def _title(self, key): return key.title() def copy(self): """Return a copy of itself.""" return CIMultiDict(self.items()) class MultiDict(_Base, MutableMultiMapping): """Dictionary with the support for duplicate keys.""" def __init__(self, *args, **kwargs): self._impl = _Impl() self._extend(args, kwargs, self.__class__.__name__, self._extend_items) if sys.implementation.name != "pypy": def __sizeof__(self): return object.__sizeof__(self) + sys.getsizeof(self._impl) def __reduce__(self): return (self.__class__, (list(self.items()),)) def _title(self, key): return key def _key(self, key): if isinstance(key, str): return key else: raise TypeError( "MultiDict keys should be either str " "or subclasses of str" ) def add(self, key, value): identity = self._title(key) self._impl._items.append((identity, self._key(key), value)) self._impl.incr_version() def copy(self): """Return a copy of itself.""" cls = self.__class__ return cls(self.items()) __copy__ = copy def extend(self, *args, **kwargs): """Extend current MultiDict with more values. This method must be used instead of update. 
""" self._extend(args, kwargs, "extend", self._extend_items) def _extend(self, args, kwargs, name, method): if len(args) > 1: raise TypeError( "{} takes at most 1 positional argument" " ({} given)".format(name, len(args)) ) if args: arg = args[0] if isinstance(args[0], (MultiDict, MultiDictProxy)) and not kwargs: items = arg._impl._items else: if hasattr(arg, "items"): arg = arg.items() if kwargs: arg = list(arg) arg.extend(list(kwargs.items())) items = [] for item in arg: if not len(item) == 2: raise TypeError( "{} takes either dict or list of (key, value) " "tuples".format(name) ) items.append((self._title(item[0]), self._key(item[0]), item[1])) method(items) else: method( [ (self._title(key), self._key(key), value) for key, value in kwargs.items() ] ) def _extend_items(self, items): for identity, key, value in items: self.add(key, value) def clear(self): """Remove all items from MultiDict.""" self._impl._items.clear() self._impl.incr_version() # Mapping interface # def __setitem__(self, key, value): self._replace(key, value) def __delitem__(self, key): identity = self._title(key) items = self._impl._items found = False for i in range(len(items) - 1, -1, -1): if items[i][0] == identity: del items[i] found = True if not found: raise KeyError(key) else: self._impl.incr_version() def setdefault(self, key, default=None): """Return value for key, set value to default if key is not present.""" identity = self._title(key) for i, k, v in self._impl._items: if i == identity: return v self.add(key, default) return default def popone(self, key, default=_marker): """Remove specified key and return the corresponding value. If key is not found, d is returned if given, otherwise KeyError is raised. """ identity = self._title(key) for i in range(len(self._impl._items)): if self._impl._items[i][0] == identity: value = self._impl._items[i][2] del self._impl._items[i] self._impl.incr_version() return value if default is _marker: raise KeyError(key) else: return default pop = popone # type: ignore def popall(self, key, default=_marker): """Remove all occurrences of key and return the list of corresponding values. If key is not found, default is returned if given, otherwise KeyError is raised. 
""" found = False identity = self._title(key) ret = [] for i in range(len(self._impl._items) - 1, -1, -1): item = self._impl._items[i] if item[0] == identity: ret.append(item[2]) del self._impl._items[i] self._impl.incr_version() found = True if not found: if default is _marker: raise KeyError(key) else: return default else: ret.reverse() return ret def popitem(self): """Remove and return an arbitrary (key, value) pair.""" if self._impl._items: i = self._impl._items.pop(0) self._impl.incr_version() return i[1], i[2] else: raise KeyError("empty multidict") def update(self, *args, **kwargs): """Update the dictionary from *other*, overwriting existing keys.""" self._extend(args, kwargs, "update", self._update_items) def _update_items(self, items): if not items: return used_keys = {} for identity, key, value in items: start = used_keys.get(identity, 0) for i in range(start, len(self._impl._items)): item = self._impl._items[i] if item[0] == identity: used_keys[identity] = i + 1 self._impl._items[i] = (identity, key, value) break else: self._impl._items.append((identity, key, value)) used_keys[identity] = len(self._impl._items) # drop tails i = 0 while i < len(self._impl._items): item = self._impl._items[i] identity = item[0] pos = used_keys.get(identity) if pos is None: i += 1 continue if i >= pos: del self._impl._items[i] else: i += 1 self._impl.incr_version() def _replace(self, key, value): key = self._key(key) identity = self._title(key) items = self._impl._items for i in range(len(items)): item = items[i] if item[0] == identity: items[i] = (identity, key, value) # i points to last found item rgt = i self._impl.incr_version() break else: self._impl._items.append((identity, key, value)) self._impl.incr_version() return # remove all tail items i = rgt + 1 while i < len(items): item = items[i] if item[0] == identity: del items[i] else: i += 1 class CIMultiDict(MultiDict): """Dictionary with the support for duplicate case-insensitive keys.""" def _title(self, key): return key.title() class _Iter: __slots__ = ("_size", "_iter") def __init__(self, size, iterator): self._size = size self._iter = iterator def __iter__(self): return self def __next__(self): return next(self._iter) def __length_hint__(self): return self._size class _ViewBase: def __init__(self, impl): self._impl = impl self._version = impl._version def __len__(self): return len(self._impl._items) class _ItemsView(_ViewBase, abc.ItemsView): def __contains__(self, item): assert isinstance(item, tuple) or isinstance(item, list) assert len(item) == 2 for i, k, v in self._impl._items: if item[0] == k and item[1] == v: return True return False def __iter__(self): return _Iter(len(self), self._iter()) def _iter(self): for i, k, v in self._impl._items: if self._version != self._impl._version: raise RuntimeError("Dictionary changed during iteration") yield k, v def __repr__(self): lst = [] for item in self._impl._items: lst.append("{!r}: {!r}".format(item[1], item[2])) body = ", ".join(lst) return "{}({})".format(self.__class__.__name__, body) class _ValuesView(_ViewBase, abc.ValuesView): def __contains__(self, value): for item in self._impl._items: if item[2] == value: return True return False def __iter__(self): return _Iter(len(self), self._iter()) def _iter(self): for item in self._impl._items: if self._version != self._impl._version: raise RuntimeError("Dictionary changed during iteration") yield item[2] def __repr__(self): lst = [] for item in self._impl._items: lst.append("{!r}".format(item[2])) body = ", ".join(lst) return 
"{}({})".format(self.__class__.__name__, body) class _KeysView(_ViewBase, abc.KeysView): def __contains__(self, key): for item in self._impl._items: if item[1] == key: return True return False def __iter__(self): return _Iter(len(self), self._iter()) def _iter(self): for item in self._impl._items: if self._version != self._impl._version: raise RuntimeError("Dictionary changed during iteration") yield item[1] def __repr__(self): lst = [] for item in self._impl._items: lst.append("{!r}".format(item[1])) body = ", ".join(lst) return "{}({})".format(self.__class__.__name__, body) multidict-4.7.3/multidict/_multilib/0000755000175100001650000000000013602414216017770 5ustar vstsdocker00000000000000multidict-4.7.3/multidict/_multilib/defs.h0000644000175100001650000000116313602414211021056 0ustar vstsdocker00000000000000#ifndef _MULTIDICT_DEFS_H #define _MULTIDICT_DEFS_H #ifdef __cplusplus extern "C" { #endif _Py_IDENTIFIER(lower); /* We link this module statically for convenience. If compiled as a shared library instead, some compilers don't allow addresses of Python objects defined in other libraries to be used in static initializers here. The DEFERRED_ADDRESS macro is used to tag the slots where such addresses appear; the module init function must fill in the tagged slots at runtime. The argument is for documentation -- the macro ignores it. */ #define DEFERRED_ADDRESS(ADDR) 0 #ifdef __cplusplus } #endif #endif multidict-4.7.3/multidict/_multilib/dict.h0000644000175100001650000000056013602414211021060 0ustar vstsdocker00000000000000#ifndef _MULTIDICT_C_H #define _MULTIDICT_C_H #ifdef __cplusplus extern "C" { #endif typedef struct { // 16 or 24 for GC prefix PyObject_HEAD // 16 PyObject *weaklist; pair_list_t pairs; } MultiDictObject; typedef struct { PyObject_HEAD PyObject *weaklist; MultiDictObject *md; } MultiDictProxyObject; #ifdef __cplusplus } #endif #endif multidict-4.7.3/multidict/_multilib/istr.h0000644000175100001650000000357213602414211021124 0ustar vstsdocker00000000000000#ifndef _MULTIDICT_ISTR_H #define _MULTIDICT_ISTR_H #ifdef __cplusplus extern "C" { #endif typedef struct { PyUnicodeObject str; PyObject * canonical; } istrobject; PyDoc_STRVAR(istr__doc__, "istr class implementation"); static PyTypeObject istr_type; static inline void istr_dealloc(istrobject *self) { Py_XDECREF(self->canonical); PyUnicode_Type.tp_dealloc((PyObject*)self); } static inline PyObject * istr_new(PyTypeObject *type, PyObject *args, PyObject *kwds) { PyObject *x = NULL; static char *kwlist[] = {"object", "encoding", "errors", 0}; PyObject *encoding = NULL; PyObject *errors = NULL; PyObject *s = NULL; PyObject * ret = NULL; if (!PyArg_ParseTupleAndKeywords(args, kwds, "|OOO:str", kwlist, &x, &encoding, &errors)) { return NULL; } if (x != NULL && Py_TYPE(x) == &istr_type) { Py_INCREF(x); return x; } ret = PyUnicode_Type.tp_new(type, args, kwds); if (!ret) { goto fail; } s =_PyObject_CallMethodId(ret, &PyId_lower, NULL); if (!s) { goto fail; } ((istrobject*)ret)->canonical = s; s = NULL; /* the reference is stollen by .canonical */ return ret; fail: Py_XDECREF(ret); return NULL; } static PyTypeObject istr_type = { PyVarObject_HEAD_INIT(DEFERRED_ADDRESS(&PyType_Type), 0) "multidict._multidict.istr", sizeof(istrobject), .tp_dealloc = (destructor)istr_dealloc, .tp_flags = Py_TPFLAGS_DEFAULT | Py_TPFLAGS_BASETYPE | Py_TPFLAGS_UNICODE_SUBCLASS, .tp_doc = istr__doc__, .tp_base = DEFERRED_ADDRESS(&PyUnicode_Type), .tp_new = (newfunc)istr_new, }; static inline int istr_init(void) { istr_type.tp_base = 
&PyUnicode_Type; if (PyType_Ready(&istr_type) < 0) { return -1; } return 0; } #ifdef __cplusplus } #endif #endif multidict-4.7.3/multidict/_multilib/iter.h0000644000175100001650000001365713602414211021113 0ustar vstsdocker00000000000000#ifndef _MULTIDICT_ITER_H #define _MULTIDICT_ITER_H #ifdef __cplusplus extern "C" { #endif static PyTypeObject multidict_items_iter_type; static PyTypeObject multidict_values_iter_type; static PyTypeObject multidict_keys_iter_type; typedef struct multidict_iter { PyObject_HEAD MultiDictObject *md; // MultiDict or CIMultiDict Py_ssize_t current; uint64_t version; } MultidictIter; static inline void _init_iter(MultidictIter *it, MultiDictObject *md) { Py_INCREF(md); it->md = md; it->current = 0; it->version = pair_list_version(&md->pairs); } static inline PyObject * multidict_items_iter_new(MultiDictObject *md) { MultidictIter *it = PyObject_GC_New( MultidictIter, &multidict_items_iter_type); if (it == NULL) { return NULL; } _init_iter(it, md); PyObject_GC_Track(it); return (PyObject *)it; } static inline PyObject * multidict_keys_iter_new(MultiDictObject *md) { MultidictIter *it = PyObject_GC_New( MultidictIter, &multidict_keys_iter_type); if (it == NULL) { return NULL; } _init_iter(it, md); PyObject_GC_Track(it); return (PyObject *)it; } static inline PyObject * multidict_values_iter_new(MultiDictObject *md) { MultidictIter *it = PyObject_GC_New( MultidictIter, &multidict_values_iter_type); if (it == NULL) { return NULL; } _init_iter(it, md); PyObject_GC_Track(it); return (PyObject *)it; } static inline PyObject * multidict_items_iter_iternext(MultidictIter *self) { PyObject *key = NULL; PyObject *value = NULL; PyObject *ret = NULL; if (self->version != pair_list_version(&self->md->pairs)) { PyErr_SetString(PyExc_RuntimeError, "Dictionary changed during iteration"); return NULL; } if (!_pair_list_next(&self->md->pairs, &self->current, NULL, &key, &value, NULL)) { PyErr_SetNone(PyExc_StopIteration); return NULL; } ret = PyTuple_Pack(2, key, value); if (ret == NULL) { return NULL; } return ret; } static inline PyObject * multidict_values_iter_iternext(MultidictIter *self) { PyObject *value = NULL; if (self->version != pair_list_version(&self->md->pairs)) { PyErr_SetString(PyExc_RuntimeError, "Dictionary changed during iteration"); return NULL; } if (!pair_list_next(&self->md->pairs, &self->current, NULL, NULL, &value)) { PyErr_SetNone(PyExc_StopIteration); return NULL; } Py_INCREF(value); return value; } static inline PyObject * multidict_keys_iter_iternext(MultidictIter *self) { PyObject *key = NULL; if (self->version != pair_list_version(&self->md->pairs)) { PyErr_SetString(PyExc_RuntimeError, "Dictionary changed during iteration"); return NULL; } if (!pair_list_next(&self->md->pairs, &self->current, NULL, &key, NULL)) { PyErr_SetNone(PyExc_StopIteration); return NULL; } Py_INCREF(key); return key; } static inline void multidict_iter_dealloc(MultidictIter *self) { PyObject_GC_UnTrack(self); Py_XDECREF(self->md); PyObject_GC_Del(self); } static inline int multidict_iter_traverse(MultidictIter *self, visitproc visit, void *arg) { Py_VISIT(self->md); return 0; } static inline int multidict_iter_clear(MultidictIter *self) { Py_CLEAR(self->md); return 0; } static inline PyObject * multidict_iter_len(MultidictIter *self) { return PyLong_FromLong(pair_list_len(&self->md->pairs)); } PyDoc_STRVAR(length_hint_doc, "Private method returning an estimate of len(list(it))."); static PyMethodDef multidict_iter_methods[] = { { "__length_hint__", 
(PyCFunction)(void(*)(void))multidict_iter_len, METH_NOARGS, length_hint_doc }, { NULL, NULL } /* sentinel */ }; /***********************************************************************/ static PyTypeObject multidict_items_iter_type = { PyVarObject_HEAD_INIT(DEFERRED_ADDRESS(&PyType_Type), 0) "multidict._multidict._itemsiter", /* tp_name */ sizeof(MultidictIter), /* tp_basicsize */ .tp_dealloc = (destructor)multidict_iter_dealloc, .tp_flags = Py_TPFLAGS_DEFAULT | Py_TPFLAGS_HAVE_GC, .tp_traverse = (traverseproc)multidict_iter_traverse, .tp_clear = (inquiry)multidict_iter_clear, .tp_iter = PyObject_SelfIter, .tp_iternext = (iternextfunc)multidict_items_iter_iternext, .tp_methods = multidict_iter_methods, }; static PyTypeObject multidict_values_iter_type = { PyVarObject_HEAD_INIT(DEFERRED_ADDRESS(&PyType_Type), 0) "multidict._multidict._valuesiter", /* tp_name */ sizeof(MultidictIter), /* tp_basicsize */ .tp_dealloc = (destructor)multidict_iter_dealloc, .tp_flags = Py_TPFLAGS_DEFAULT | Py_TPFLAGS_HAVE_GC, .tp_traverse = (traverseproc)multidict_iter_traverse, .tp_clear = (inquiry)multidict_iter_clear, .tp_iter = PyObject_SelfIter, .tp_iternext = (iternextfunc)multidict_values_iter_iternext, .tp_methods = multidict_iter_methods, }; static PyTypeObject multidict_keys_iter_type = { PyVarObject_HEAD_INIT(DEFERRED_ADDRESS(&PyType_Type), 0) "multidict._multidict._keysiter", /* tp_name */ sizeof(MultidictIter), /* tp_basicsize */ .tp_dealloc = (destructor)multidict_iter_dealloc, .tp_flags = Py_TPFLAGS_DEFAULT | Py_TPFLAGS_HAVE_GC, .tp_traverse = (traverseproc)multidict_iter_traverse, .tp_clear = (inquiry)multidict_iter_clear, .tp_iter = PyObject_SelfIter, .tp_iternext = (iternextfunc)multidict_keys_iter_iternext, .tp_methods = multidict_iter_methods, }; static inline int multidict_iter_init() { if (PyType_Ready(&multidict_items_iter_type) < 0 || PyType_Ready(&multidict_values_iter_type) < 0 || PyType_Ready(&multidict_keys_iter_type) < 0) { return -1; } return 0; } #ifdef __cplusplus } #endif #endif multidict-4.7.3/multidict/_multilib/pair_list.h0000644000175100001650000006447113602414211022136 0ustar vstsdocker00000000000000#ifndef _MULTIDICT_PAIR_LIST_H #define _MULTIDICT_PAIR_LIST_H #ifdef __cplusplus extern "C" { #endif #include #include #include typedef PyObject * (*calc_identity_func)(PyObject *key); typedef struct pair { PyObject *identity; // 8 PyObject *key; // 8 PyObject *value; // 8 Py_hash_t hash; // 8 } pair_t; /* Note about the structure size With 29 pairs the MultiDict object size is slightly less than 1KiB (1000-1008 bytes depending on Python version, plus extra 12 bytes for memory allocator internal structures). As the result the max reserved size is 1020 bytes at most. To fit into 512 bytes, the structure can contain only 13 pairs which is too small, e.g. https://www.python.org returns 16 headers (9 of them are caching proxy information though). The embedded buffer intention is to fit the vast majority of possible HTTP headers into the buffer without allocating an extra memory block. */ #if (PY_VERSION_HEX < 0x03080000) #define EMBEDDED_CAPACITY 28 #else #define EMBEDDED_CAPACITY 29 #endif typedef struct pair_list { // 40 Py_ssize_t capacity; // 8 Py_ssize_t size; // 8 uint64_t version; // 8 calc_identity_func calc_identity; // 8 pair_t *pairs; // 8 pair_t buffer[EMBEDDED_CAPACITY]; } pair_list_t; #define MIN_CAPACITY 63 #define CAPACITY_STEP 64 /* Global counter used to set ma_version_tag field of dictionary. 
* It is incremented each time that a dictionary is created and each * time that a dictionary is modified. */ static uint64_t pair_list_global_version = 0; #define NEXT_VERSION() (++pair_list_global_version) static inline int str_cmp(PyObject *s1, PyObject *s2) { PyObject *ret = PyUnicode_RichCompare(s1, s2, Py_EQ); if (ret == Py_True) { Py_DECREF(ret); return 1; } else if (ret == NULL) { return -1; } else { Py_DECREF(ret); return 0; } } static inline PyObject * key_to_str(PyObject *key) { PyObject *ret; PyTypeObject *type = Py_TYPE(key); if (type == &istr_type) { ret = ((istrobject*)key)->canonical; Py_INCREF(ret); return ret; } if (PyUnicode_CheckExact(key)) { Py_INCREF(key); return key; } if (PyUnicode_Check(key)) { return PyObject_Str(key); } PyErr_SetString(PyExc_TypeError, "MultiDict keys should be either str " "or subclasses of str"); return NULL; } static inline PyObject * ci_key_to_str(PyObject *key) { PyObject *ret; PyTypeObject *type = Py_TYPE(key); if (type == &istr_type) { ret = ((istrobject*)key)->canonical; Py_INCREF(ret); return ret; } if (PyUnicode_Check(key)) { return _PyObject_CallMethodId(key, &PyId_lower, NULL); } PyErr_SetString(PyExc_TypeError, "CIMultiDict keys should be either str " "or subclasses of str"); return NULL; } static inline pair_t * pair_list_get(pair_list_t *list, Py_ssize_t i) { pair_t *item = list->pairs + i; return item; } static inline int pair_list_grow(pair_list_t *list) { // Grow by one element if needed Py_ssize_t new_capacity; pair_t *new_pairs; if (list->size < list->capacity) { return 0; } if (list->pairs == list->buffer) { new_pairs = PyMem_New(pair_t, MIN_CAPACITY); memcpy(new_pairs, list->buffer, (size_t)list->capacity * sizeof(pair_t)); list->pairs = new_pairs; list->capacity = MIN_CAPACITY; return 0; } else { new_capacity = list->capacity + CAPACITY_STEP; new_pairs = PyMem_Resize(list->pairs, pair_t, (size_t)new_capacity); if (NULL == new_pairs) { // Resizing error return -1; } list->pairs = new_pairs; list->capacity = new_capacity; return 0; } } static inline int pair_list_shrink(pair_list_t *list) { // Shrink by one element if needed. // Optimization is applied to prevent jitter // (grow-shrink-grow-shrink on adding-removing the single element // when the buffer is full). // To prevent this, the buffer is resized if the size is less than the capacity // by 2*CAPACITY_STEP factor. // The switch back to embedded buffer is never performed for both reasons: // the code simplicity and the jitter prevention. 
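// Illustrative note (not part of the original source): a concrete capacity
// trace for the policy described above, derived from EMBEDDED_CAPACITY,
// MIN_CAPACITY and CAPACITY_STEP as defined in this header:
//   - up to EMBEDDED_CAPACITY pairs (29 on Python >= 3.8, 28 before) stay in
//     the embedded buffer;
//   - the next pair_list_add() moves the pairs to a heap block of
//     MIN_CAPACITY (63) entries, and further growth proceeds in
//     CAPACITY_STEP (64) increments: 63 -> 127 -> 191 -> ...;
//   - pair_list_shrink() gives one step back only once capacity - size
//     reaches 2 * CAPACITY_STEP (128) and never drops below MIN_CAPACITY,
//     so adding and removing a single pair near a boundary does not cause
//     realloc churn.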
pair_t *new_pairs; Py_ssize_t new_capacity; if (list->capacity - list->size < 2 * CAPACITY_STEP) { return 0; } new_capacity = list->capacity - CAPACITY_STEP; if (new_capacity < MIN_CAPACITY) { return 0; } new_pairs = PyMem_Resize(list->pairs, pair_t, (size_t)new_capacity); if (NULL == new_pairs) { // Resizing error return -1; } list->pairs = new_pairs; list->capacity = new_capacity; return 0; } static inline int _pair_list_init(pair_list_t *list, calc_identity_func calc_identity) { list->pairs = list->buffer; list->capacity = EMBEDDED_CAPACITY; list->size = 0; list->version = NEXT_VERSION(); list->calc_identity = calc_identity; return 0; } static inline int pair_list_init(pair_list_t *list) { return _pair_list_init(list, key_to_str); } static inline int ci_pair_list_init(pair_list_t *list) { return _pair_list_init(list, ci_key_to_str); } static inline void pair_list_dealloc(pair_list_t *list) { pair_t *pair; Py_ssize_t pos; for (pos = 0; pos < list->size; pos++) { pair = pair_list_get(list, pos); Py_XDECREF(pair->identity); Py_XDECREF(pair->key); Py_XDECREF(pair->value); } /* Strictly speaking, resetting size and capacity and assigning pairs to buffer is not necessary. Do it to consistency and idemotency. The cleanup doesn't hurt performance. !!! !!! The buffer deletion is crucial though. !!! */ list->size = 0; if (list->pairs != list->buffer) { PyMem_Del(list->pairs); list->pairs = list->buffer; list->capacity = EMBEDDED_CAPACITY; } } static inline Py_ssize_t pair_list_len(pair_list_t *list) { return list->size; } static inline int _pair_list_add_with_hash(pair_list_t *list, PyObject *identity, PyObject *key, PyObject *value, Py_hash_t hash) { pair_t *pair; if (pair_list_grow(list) < 0) { return -1; } pair = pair_list_get(list, list->size); Py_INCREF(identity); pair->identity = identity; Py_INCREF(key); pair->key = key; Py_INCREF(value); pair->value = value; pair->hash = hash; list->version = NEXT_VERSION(); list->size += 1; return 0; } static inline int pair_list_add(pair_list_t *list, PyObject *key, PyObject *value) { Py_hash_t hash; PyObject *identity = NULL; int ret; identity = list->calc_identity(key); if (identity == NULL) { goto fail; } hash = PyObject_Hash(identity); if (hash == -1) { goto fail; } ret = _pair_list_add_with_hash(list, identity, key, value, hash); Py_DECREF(identity); return ret; fail: Py_XDECREF(identity); return -1; } static inline int pair_list_del_at(pair_list_t *list, Py_ssize_t pos) { // return 1 on success, -1 on failure Py_ssize_t tail; pair_t *pair; pair = pair_list_get(list, pos); Py_DECREF(pair->identity); Py_DECREF(pair->key); Py_DECREF(pair->value); list->size -= 1; list->version = NEXT_VERSION(); if (list->size == pos) { // remove from tail, no need to shift body return 0; } tail = list->size - pos; // TODO: raise an error if tail < 0 memmove((void *)pair_list_get(list, pos), (void *)pair_list_get(list, pos + 1), sizeof(pair_t) * (size_t)tail); return pair_list_shrink(list); } static inline int _pair_list_drop_tail(pair_list_t *list, PyObject *identity, Py_hash_t hash, Py_ssize_t pos) { // return 1 if deleted, 0 if not found pair_t *pair; int ret; int found = 0; if (pos >= list->size) { return 0; } for (; pos < list->size; pos++) { pair = pair_list_get(list, pos); if (pair->hash != hash) { continue; } ret = str_cmp(pair->identity, identity); if (ret > 0) { if (pair_list_del_at(list, pos) < 0) { return -1; } found = 1; pos--; } else if (ret == -1) { return -1; } } return found; } static inline int _pair_list_del_hash(pair_list_t *list, PyObject 
*identity, PyObject *key, Py_hash_t hash) { int ret = _pair_list_drop_tail(list, identity, hash, 0); if (ret < 0) { return -1; } else if (ret == 0) { PyErr_SetObject(PyExc_KeyError, key); return -1; } else { list->version = NEXT_VERSION(); return 0; } } static inline int pair_list_del(pair_list_t *list, PyObject *key) { PyObject *identity = NULL; Py_hash_t hash; int ret; identity = list->calc_identity(key); if (identity == NULL) { goto fail; } hash = PyObject_Hash(identity); if (hash == -1) { goto fail; } ret = _pair_list_del_hash(list, identity, key, hash); Py_DECREF(identity); return ret; fail: Py_XDECREF(identity); return -1; } static inline uint64_t pair_list_version(pair_list_t *list) { return list->version; } static inline int _pair_list_next(pair_list_t *list, Py_ssize_t *ppos, PyObject **pidentity, PyObject **pkey, PyObject **pvalue, Py_hash_t *phash) { pair_t *pair; if (*ppos >= list->size) { return 0; } pair = pair_list_get(list, *ppos); if (pidentity) { *pidentity = pair->identity; } if (pkey) { *pkey = pair->key; } if (pvalue) { *pvalue = pair->value; } if (phash) { *phash = pair->hash; } *ppos += 1; return 1; } static inline int pair_list_next(pair_list_t *list, Py_ssize_t *ppos, PyObject **pidentity, PyObject **pkey, PyObject **pvalue) { Py_hash_t hash; return _pair_list_next(list, ppos, pidentity, pkey, pvalue, &hash); } static inline int pair_list_contains(pair_list_t *list, PyObject *key) { Py_hash_t hash1, hash2; Py_ssize_t pos = 0; PyObject *ident = NULL; PyObject *identity = NULL; int tmp; ident = list->calc_identity(key); if (ident == NULL) { goto fail; } hash1 = PyObject_Hash(ident); if (hash1 == -1) { goto fail; } while (_pair_list_next(list, &pos, &identity, NULL, NULL, &hash2)) { if (hash1 != hash2) { continue; } tmp = str_cmp(ident, identity); if (tmp > 0) { Py_DECREF(ident); return 1; } else if (tmp < 0) { goto fail; } } Py_DECREF(ident); return 0; fail: Py_XDECREF(ident); return -1; } static inline PyObject * pair_list_get_one(pair_list_t *list, PyObject *key) { Py_hash_t hash1, hash2; Py_ssize_t pos = 0; PyObject *ident = NULL; PyObject *identity = NULL; PyObject *value = NULL; int tmp; ident = list->calc_identity(key); if (ident == NULL) { goto fail; } hash1 = PyObject_Hash(ident); if (hash1 == -1) { goto fail; } while (_pair_list_next(list, &pos, &identity, NULL, &value, &hash2)) { if (hash1 != hash2) { continue; } tmp = str_cmp(ident, identity); if (tmp > 0) { Py_INCREF(value); Py_DECREF(ident); return value; } else if (tmp < 0) { goto fail; } } Py_DECREF(ident); PyErr_SetObject(PyExc_KeyError, key); return NULL; fail: Py_XDECREF(ident); return NULL; } static inline PyObject * pair_list_get_all(pair_list_t *list, PyObject *key) { Py_hash_t hash1, hash2; Py_ssize_t pos = 0; PyObject *ident = NULL; PyObject *identity = NULL; PyObject *value = NULL; PyObject *res = NULL; int tmp; ident = list->calc_identity(key); if (ident == NULL) { goto fail; } hash1 = PyObject_Hash(ident); if (hash1 == -1) { goto fail; } while (_pair_list_next(list, &pos, &identity, NULL, &value, &hash2)) { if (hash1 != hash2) { continue; } tmp = str_cmp(ident, identity); if (tmp > 0) { if (res == NULL) { res = PyList_New(1); if (res == NULL) { goto fail; } if (PyList_SetItem(res, 0, value) < 0) { goto fail; } Py_INCREF(value); } else if (PyList_Append(res, value) < 0) { goto fail; } } else if (tmp < 0) { goto fail; } } if (res == NULL) { PyErr_SetObject(PyExc_KeyError, key); } Py_DECREF(ident); return res; fail: Py_XDECREF(ident); Py_XDECREF(res); return NULL; } static inline PyObject * 
pair_list_set_default(pair_list_t *list, PyObject *key, PyObject *value) { Py_hash_t hash1, hash2; Py_ssize_t pos = 0; PyObject *ident = NULL; PyObject *identity = NULL; PyObject *value2 = NULL; int tmp; ident = list->calc_identity(key); if (ident == NULL) { goto fail; } hash1 = PyObject_Hash(ident); if (hash1 == -1) { goto fail; } while (_pair_list_next(list, &pos, &identity, NULL, &value2, &hash2)) { if (hash1 != hash2) { continue; } tmp = str_cmp(ident, identity); if (tmp > 0) { Py_INCREF(value2); Py_DECREF(ident); return value2; } else if (tmp < 0) { goto fail; } } if (_pair_list_add_with_hash(list, ident, key, value, hash1) < 0) { goto fail; } Py_INCREF(value); Py_DECREF(ident); return value; fail: Py_XDECREF(ident); return NULL; } static inline PyObject * pair_list_pop_one(pair_list_t *list, PyObject *key) { pair_t *pair; Py_hash_t hash; Py_ssize_t pos; PyObject *value = NULL; int tmp; PyObject *ident = NULL; ident = list->calc_identity(key); if (ident == NULL) { goto fail; } hash = PyObject_Hash(ident); if (hash == -1) { goto fail; } for (pos=0; pos < list->size; pos++) { pair = pair_list_get(list, pos); if (pair->hash != hash) { continue; } tmp = str_cmp(ident, pair->identity); if (tmp > 0) { value = pair->value; Py_INCREF(value); if (pair_list_del_at(list, pos) < 0) { goto fail; } Py_DECREF(ident); return value; } else if (tmp < 0) { goto fail; } } PyErr_SetObject(PyExc_KeyError, key); goto fail; fail: Py_XDECREF(value); Py_XDECREF(ident); return NULL; } static inline PyObject * pair_list_pop_all(pair_list_t *list, PyObject *key) { Py_hash_t hash; Py_ssize_t pos; pair_t *pair; int tmp; PyObject *res = NULL; PyObject *ident = NULL; ident = list->calc_identity(key); if (ident == NULL) { goto fail; } hash = PyObject_Hash(ident); if (hash == -1) { goto fail; } if (list->size == 0) { PyErr_SetObject(PyExc_KeyError, ident); goto fail; } for (pos = list->size - 1; pos >= 0; pos--) { pair = pair_list_get(list, pos); if (hash != pair->hash) { continue; } tmp = str_cmp(ident, pair->identity); if (tmp > 0) { if (res == NULL) { res = PyList_New(1); if (res == NULL) { goto fail; } if (PyList_SetItem(res, 0, pair->value) < 0) { goto fail; } Py_INCREF(pair->value); } else if (PyList_Append(res, pair->value) < 0) { goto fail; } if (pair_list_del_at(list, pos) < 0) { goto fail; } } else if (tmp < 0) { goto fail; } } if (res == NULL) { PyErr_SetObject(PyExc_KeyError, key); } else if (PyList_Reverse(res) < 0) { goto fail; } Py_DECREF(ident); return res; fail: Py_XDECREF(ident); Py_XDECREF(res); return NULL; } static inline PyObject * pair_list_pop_item(pair_list_t *list) { PyObject *ret; pair_t *pair; if (list->size == 0) { PyErr_SetString(PyExc_KeyError, "empty multidict"); return NULL; } pair = pair_list_get(list, 0); ret = PyTuple_Pack(2, pair->key, pair->value); if (ret == NULL) { return NULL; } if (pair_list_del_at(list, 0) < 0) { Py_DECREF(ret); return NULL; } return ret; } static inline int pair_list_replace(pair_list_t *list, PyObject * key, PyObject *value) { pair_t *pair; Py_ssize_t pos; int tmp; int found = 0; PyObject *identity = NULL; Py_hash_t hash; identity = list->calc_identity(key); if (identity == NULL) { goto fail; } hash = PyObject_Hash(identity); if (hash == -1) { goto fail; } for (pos = 0; pos < list->size; pos++) { pair = pair_list_get(list, pos); if (hash != pair->hash) { continue; } tmp = str_cmp(identity, pair->identity); if (tmp > 0) { found = 1; Py_INCREF(key); Py_DECREF(pair->key); pair->key = key; Py_INCREF(value); Py_DECREF(pair->value); pair->value = value; break; } 
else if (tmp < 0) { goto fail; } } if (!found) { if (_pair_list_add_with_hash(list, identity, key, value, hash) < 0) { goto fail; } Py_DECREF(identity); return 0; } else { list->version = NEXT_VERSION(); if (_pair_list_drop_tail(list, identity, hash, pos+1) < 0) { goto fail; } Py_DECREF(identity); return 0; } fail: Py_XDECREF(identity); return -1; } static inline int _dict_set_number(PyObject *dict, PyObject *key, Py_ssize_t num) { PyObject *tmp = PyLong_FromSsize_t(num); if (tmp == NULL) { return -1; } if (PyDict_SetItem(dict, key, tmp) < 0) { Py_DECREF(tmp); return -1; } return 0; } static inline int _pair_list_post_update(pair_list_t *list, PyObject* used_keys, Py_ssize_t pos) { pair_t *pair; PyObject *tmp; Py_ssize_t num; for (; pos < list->size; pos++) { pair = pair_list_get(list, pos); tmp = PyDict_GetItem(used_keys, pair->identity); if (tmp == NULL) { // not found continue; } num = PyLong_AsSsize_t(tmp); if (num == -1) { if (!PyErr_Occurred()) { PyErr_SetString(PyExc_RuntimeError, "invalid internal state"); } return -1; } if (pos >= num) { // del self[pos] if (pair_list_del_at(list, pos) < 0) { return -1; } pos--; } } list->version = NEXT_VERSION(); return 0; } // TODO: need refactoring function name static inline int _pair_list_update(pair_list_t *list, PyObject *key, PyObject *value, PyObject *used_keys, PyObject *identity, Py_hash_t hash) { PyObject *item = NULL; pair_t *pair = NULL; Py_ssize_t pos; int found; int ident_cmp_res; item = PyDict_GetItem(used_keys, identity); if (item == NULL) { pos = 0; } else { pos = PyLong_AsSsize_t(item); if (pos == -1) { if (!PyErr_Occurred()) { PyErr_SetString(PyExc_RuntimeError, "invalid internal state"); } return -1; } } found = 0; for (; pos < list->size; pos++) { pair = pair_list_get(list, pos); if (pair->hash != hash) { continue; } ident_cmp_res = str_cmp(pair->identity, identity); if (ident_cmp_res > 0) { Py_INCREF(key); Py_DECREF(pair->key); pair->key = key; Py_INCREF(value); Py_DECREF(pair->value); pair->value = value; if (_dict_set_number(used_keys, pair->identity, pos + 1) < 0) { return -1; } found = 1; break; } else if (ident_cmp_res < 0) { return -1; } } if (!found) { if (_pair_list_add_with_hash(list, identity, key, value, hash) < 0) { return -1; } if (_dict_set_number(used_keys, identity, list->size) < 0) { return -1; } } return 0; } static inline int pair_list_update(pair_list_t *list, pair_list_t *other) { PyObject *used_keys = NULL; pair_t *pair = NULL; Py_ssize_t pos; if (other->size == 0) { return 0; } used_keys = PyDict_New(); if (used_keys == NULL) { return -1; } for (pos = 0; pos < other->size; pos++) { pair = pair_list_get(other, pos); if (_pair_list_update(list, pair->key, pair->value, used_keys, pair->identity, pair->hash) < 0) { goto fail; } } if (_pair_list_post_update(list, used_keys, 0) < 0) { goto fail; } Py_DECREF(used_keys); return 0; fail: Py_XDECREF(used_keys); return -1; } static inline int pair_list_update_from_seq(pair_list_t *list, PyObject *seq) { PyObject *it = NULL; // iter(seq) PyObject *fast = NULL; // item as a 2-tuple or 2-list PyObject *item = NULL; // seq[i] PyObject *used_keys = NULL; // dict() PyObject *key = NULL; PyObject *value = NULL; PyObject *identity = NULL; Py_hash_t hash; Py_ssize_t i; Py_ssize_t n; it = PyObject_GetIter(seq); if (it == NULL) { return -1; } used_keys = PyDict_New(); if (used_keys == NULL) { goto fail_1; } for (i = 0; ; ++i) { // i - index into seq of current element fast = NULL; item = PyIter_Next(it); if (item == NULL) { if (PyErr_Occurred()) { goto fail_1; } break; } 
// Convert item to sequence, and verify length 2. fast = PySequence_Fast(item, ""); if (fast == NULL) { if (PyErr_ExceptionMatches(PyExc_TypeError)) { PyErr_Format(PyExc_TypeError, "multidict cannot convert sequence element #%zd" " to a sequence", i); } goto fail_1; } n = PySequence_Fast_GET_SIZE(fast); if (n != 2) { PyErr_Format(PyExc_ValueError, "multidict update sequence element #%zd " "has length %zd; 2 is required", i, n); goto fail_1; } key = PySequence_Fast_GET_ITEM(fast, 0); value = PySequence_Fast_GET_ITEM(fast, 1); Py_INCREF(key); Py_INCREF(value); identity = list->calc_identity(key); if (identity == NULL) { goto fail_1; } hash = PyObject_Hash(identity); if (hash == -1) { goto fail_1; } if (_pair_list_update(list, key, value, used_keys, identity, hash) < 0) { goto fail_1; } Py_DECREF(key); Py_DECREF(value); Py_DECREF(fast); Py_DECREF(item); Py_DECREF(identity); } if (_pair_list_post_update(list, used_keys, 0) < 0) { goto fail_2; } Py_DECREF(it); Py_DECREF(used_keys); return 0; fail_1: Py_XDECREF(key); Py_XDECREF(value); Py_XDECREF(fast); Py_XDECREF(item); Py_XDECREF(identity); fail_2: Py_XDECREF(it); Py_XDECREF(used_keys); return -1; } static inline int pair_list_eq_to_mapping(pair_list_t *list, PyObject *other) { PyObject *key = NULL; PyObject *avalue = NULL; PyObject *bvalue = NULL; Py_ssize_t pos; int cmp; if (!PyMapping_Check(other)) { PyErr_Format(PyExc_TypeError, "other argument must be a mapping, not %s", Py_TYPE(other)->tp_name); return -1; } if (pair_list_len(list) != PyMapping_Length(other)) { return 0; } pos = 0; while (pair_list_next(list, &pos, NULL, &key, &avalue)) { bvalue = PyObject_GetItem(other, key); if (bvalue == NULL) { PyErr_Clear(); return 0; } cmp = PyObject_RichCompareBool(avalue, bvalue, Py_EQ); Py_DECREF(bvalue); if (cmp < 0) { return -1; } else if (cmp > 0) { continue; } else { return 0; } } return 1; } /***********************************************************************/ static inline int pair_list_traverse(pair_list_t *list, visitproc visit, void *arg) { pair_t *pair = NULL; Py_ssize_t pos; for (pos = 0; pos < list->size; pos++) { pair = pair_list_get(list, pos); // Don't need traverse the identity: it is a terminal Py_VISIT(pair->key); Py_VISIT(pair->value); } return 0; } static inline int pair_list_clear(pair_list_t *list) { pair_t *pair = NULL; Py_ssize_t pos; if (list->size == 0) { return 0; } list->version = NEXT_VERSION(); for (pos = 0; pos < list->size; pos++) { pair = pair_list_get(list, pos); Py_CLEAR(pair->key); Py_CLEAR(pair->identity); Py_CLEAR(pair->value); } list->size = 0; if (list->pairs != list->buffer) { PyMem_Del(list->pairs); list->pairs = list->buffer; } return 0; } #ifdef __cplusplus } #endif #endif multidict-4.7.3/multidict/_multilib/views.h0000644000175100001650000003043313602414211021274 0ustar vstsdocker00000000000000#ifndef _MULTIDICT_VIEWS_H #define _MULTIDICT_VIEWS_H #ifdef __cplusplus extern "C" { #endif static PyTypeObject multidict_itemsview_type; static PyTypeObject multidict_valuesview_type; static PyTypeObject multidict_keysview_type; static PyObject *viewbaseset_richcmp_func; static PyObject *viewbaseset_and_func; static PyObject *viewbaseset_or_func; static PyObject *viewbaseset_sub_func; static PyObject *viewbaseset_xor_func; static PyObject *abc_itemsview_register_func; static PyObject *abc_keysview_register_func; static PyObject *abc_valuesview_register_func; static PyObject *itemsview_isdisjoint_func; static PyObject *itemsview_repr_func; static PyObject *keysview_repr_func; static PyObject 
*keysview_isdisjoint_func; static PyObject *valuesview_repr_func; typedef struct { PyObject_HEAD PyObject *md; } _Multidict_ViewObject; /********** Base **********/ static inline void _init_view(_Multidict_ViewObject *self, PyObject *md) { Py_INCREF(md); self->md = md; } static inline void multidict_view_dealloc(_Multidict_ViewObject *self) { PyObject_GC_UnTrack(self); Py_XDECREF(self->md); PyObject_GC_Del(self); } static inline int multidict_view_traverse(_Multidict_ViewObject *self, visitproc visit, void *arg) { Py_VISIT(self->md); return 0; } static inline int multidict_view_clear(_Multidict_ViewObject *self) { Py_CLEAR(self->md); return 0; } static inline Py_ssize_t multidict_view_len(_Multidict_ViewObject *self) { return pair_list_len(&((MultiDictObject*)self->md)->pairs); } static inline PyObject * multidict_view_richcompare(PyObject *self, PyObject *other, int op) { PyObject *ret; PyObject *op_obj = PyLong_FromLong(op); if (op_obj == NULL) { return NULL; } ret = PyObject_CallFunctionObjArgs( viewbaseset_richcmp_func, self, other, op_obj, NULL); Py_DECREF(op_obj); return ret; } static inline PyObject * multidict_view_and(PyObject *self, PyObject *other) { return PyObject_CallFunctionObjArgs( viewbaseset_and_func, self, other, NULL); } static inline PyObject * multidict_view_or(PyObject *self, PyObject *other) { return PyObject_CallFunctionObjArgs( viewbaseset_or_func, self, other, NULL); } static inline PyObject * multidict_view_sub(PyObject *self, PyObject *other) { return PyObject_CallFunctionObjArgs( viewbaseset_sub_func, self, other, NULL); } static inline PyObject * multidict_view_xor(PyObject *self, PyObject *other) { return PyObject_CallFunctionObjArgs( viewbaseset_xor_func, self, other, NULL); } static PyNumberMethods multidict_view_as_number = { .nb_subtract = (binaryfunc)multidict_view_sub, .nb_and = (binaryfunc)multidict_view_and, .nb_xor = (binaryfunc)multidict_view_xor, .nb_or = (binaryfunc)multidict_view_or, }; /********** Items **********/ static inline PyObject * multidict_itemsview_new(PyObject *md) { _Multidict_ViewObject *mv = PyObject_GC_New( _Multidict_ViewObject, &multidict_itemsview_type); if (mv == NULL) { return NULL; } _init_view(mv, md); PyObject_GC_Track(mv); return (PyObject *)mv; } static inline PyObject * multidict_itemsview_iter(_Multidict_ViewObject *self) { return multidict_items_iter_new((MultiDictObject*)self->md); } static inline PyObject * multidict_itemsview_repr(_Multidict_ViewObject *self) { return PyObject_CallFunctionObjArgs( itemsview_repr_func, self, NULL); } static inline PyObject * multidict_itemsview_isdisjoint(_Multidict_ViewObject *self, PyObject *other) { return PyObject_CallFunctionObjArgs( itemsview_isdisjoint_func, self, other, NULL); } PyDoc_STRVAR(itemsview_isdisjoint_doc, "Return True if two sets have a null intersection."); static PyMethodDef multidict_itemsview_methods[] = { { "isdisjoint", (PyCFunction)multidict_itemsview_isdisjoint, METH_O, itemsview_isdisjoint_doc }, { NULL, NULL } /* sentinel */ }; static inline int multidict_itemsview_contains(_Multidict_ViewObject *self, PyObject *obj) { PyObject *akey = NULL, *aval = NULL, *bkey = NULL, *bval = NULL, *iter = NULL, *item = NULL; int ret1, ret2; if (!PyTuple_Check(obj) || PyTuple_GET_SIZE(obj) != 2) { return 0; } bkey = PyTuple_GET_ITEM(obj, 0); bval = PyTuple_GET_ITEM(obj, 1); iter = multidict_itemsview_iter(self); if (iter == NULL) { return 0; } while ((item = PyIter_Next(iter)) != NULL) { akey = PyTuple_GET_ITEM(item, 0); aval = PyTuple_GET_ITEM(item, 1); ret1 = 
PyObject_RichCompareBool(akey, bkey, Py_EQ); if (ret1 < 0) { Py_DECREF(iter); Py_DECREF(item); return -1; } ret2 = PyObject_RichCompareBool(aval, bval, Py_EQ); if (ret2 < 0) { Py_DECREF(iter); Py_DECREF(item); return -1; } if (ret1 > 0 && ret2 > 0) { Py_DECREF(iter); Py_DECREF(item); return 1; } Py_DECREF(item); } Py_DECREF(iter); if (PyErr_Occurred()) { return -1; } return 0; } static PySequenceMethods multidict_itemsview_as_sequence = { .sq_length = (lenfunc)multidict_view_len, .sq_contains = (objobjproc)multidict_itemsview_contains, }; static PyTypeObject multidict_itemsview_type = { PyVarObject_HEAD_INIT(DEFERRED_ADDRESS(&PyType_Type), 0) "multidict._multidict._ItemsView", /* tp_name */ sizeof(_Multidict_ViewObject), /* tp_basicsize */ .tp_dealloc = (destructor)multidict_view_dealloc, .tp_repr = (reprfunc)multidict_itemsview_repr, .tp_as_number = &multidict_view_as_number, .tp_as_sequence = &multidict_itemsview_as_sequence, .tp_getattro = PyObject_GenericGetAttr, .tp_flags = Py_TPFLAGS_DEFAULT | Py_TPFLAGS_HAVE_GC, .tp_traverse = (traverseproc)multidict_view_traverse, .tp_clear = (inquiry)multidict_view_clear, .tp_richcompare = multidict_view_richcompare, .tp_iter = (getiterfunc)multidict_itemsview_iter, .tp_methods = multidict_itemsview_methods, }; /********** Keys **********/ static inline PyObject * multidict_keysview_new(PyObject *md) { _Multidict_ViewObject *mv = PyObject_GC_New( _Multidict_ViewObject, &multidict_keysview_type); if (mv == NULL) { return NULL; } _init_view(mv, md); PyObject_GC_Track(mv); return (PyObject *)mv; } static inline PyObject * multidict_keysview_iter(_Multidict_ViewObject *self) { return multidict_keys_iter_new(((MultiDictObject*)self->md)); } static inline PyObject * multidict_keysview_repr(_Multidict_ViewObject *self) { return PyObject_CallFunctionObjArgs( keysview_repr_func, self, NULL); } static inline PyObject * multidict_keysview_isdisjoint(_Multidict_ViewObject *self, PyObject *other) { return PyObject_CallFunctionObjArgs( keysview_isdisjoint_func, self, other, NULL); } PyDoc_STRVAR(keysview_isdisjoint_doc, "Return True if two sets have a null intersection."); static PyMethodDef multidict_keysview_methods[] = { { "isdisjoint", (PyCFunction)multidict_keysview_isdisjoint, METH_O, keysview_isdisjoint_doc }, { NULL, NULL } /* sentinel */ }; static inline int multidict_keysview_contains(_Multidict_ViewObject *self, PyObject *key) { return pair_list_contains(&((MultiDictObject*)self->md)->pairs, key); } static PySequenceMethods multidict_keysview_as_sequence = { .sq_length = (lenfunc)multidict_view_len, .sq_contains = (objobjproc)multidict_keysview_contains, }; static PyTypeObject multidict_keysview_type = { PyVarObject_HEAD_INIT(DEFERRED_ADDRESS(&PyType_Type), 0) "multidict._multidict._KeysView", /* tp_name */ sizeof(_Multidict_ViewObject), /* tp_basicsize */ .tp_dealloc = (destructor)multidict_view_dealloc, .tp_repr = (reprfunc)multidict_keysview_repr, .tp_as_number = &multidict_view_as_number, .tp_as_sequence = &multidict_keysview_as_sequence, .tp_getattro = PyObject_GenericGetAttr, .tp_flags = Py_TPFLAGS_DEFAULT | Py_TPFLAGS_HAVE_GC, .tp_traverse = (traverseproc)multidict_view_traverse, .tp_clear = (inquiry)multidict_view_clear, .tp_richcompare = multidict_view_richcompare, .tp_iter = (getiterfunc)multidict_keysview_iter, .tp_methods = multidict_keysview_methods, }; /********** Values **********/ static inline PyObject * multidict_valuesview_new(PyObject *md) { _Multidict_ViewObject *mv = PyObject_GC_New( _Multidict_ViewObject, 
&multidict_valuesview_type); if (mv == NULL) { return NULL; } _init_view(mv, md); PyObject_GC_Track(mv); return (PyObject *)mv; } static inline PyObject * multidict_valuesview_iter(_Multidict_ViewObject *self) { return multidict_values_iter_new(((MultiDictObject*)self->md)); } static inline PyObject * multidict_valuesview_repr(_Multidict_ViewObject *self) { return PyObject_CallFunctionObjArgs( valuesview_repr_func, self, NULL); } static PySequenceMethods multidict_valuesview_as_sequence = { .sq_length = (lenfunc)multidict_view_len, }; static PyTypeObject multidict_valuesview_type = { PyVarObject_HEAD_INIT(DEFERRED_ADDRESS(&PyType_Type), 0) "multidict._multidict._ValuesView", /* tp_name */ sizeof(_Multidict_ViewObject), /* tp_basicsize */ .tp_dealloc = (destructor)multidict_view_dealloc, .tp_repr = (reprfunc)multidict_valuesview_repr, .tp_as_sequence = &multidict_valuesview_as_sequence, .tp_getattro = PyObject_GenericGetAttr, .tp_flags = Py_TPFLAGS_DEFAULT | Py_TPFLAGS_HAVE_GC, .tp_traverse = (traverseproc)multidict_view_traverse, .tp_clear = (inquiry)multidict_view_clear, .tp_iter = (getiterfunc)multidict_valuesview_iter, }; static inline int multidict_views_init() { PyObject *reg_func_call_result = NULL; PyObject *module = PyImport_ImportModule("multidict._multidict_base"); if (module == NULL) { goto fail; } #define GET_MOD_ATTR(VAR, NAME) \ VAR = PyObject_GetAttrString(module, NAME); \ if (VAR == NULL) { \ goto fail; \ } GET_MOD_ATTR(viewbaseset_richcmp_func, "_viewbaseset_richcmp"); GET_MOD_ATTR(viewbaseset_and_func, "_viewbaseset_and"); GET_MOD_ATTR(viewbaseset_or_func, "_viewbaseset_or"); GET_MOD_ATTR(viewbaseset_sub_func, "_viewbaseset_sub"); GET_MOD_ATTR(viewbaseset_xor_func, "_viewbaseset_xor"); GET_MOD_ATTR(abc_itemsview_register_func, "_abc_itemsview_register"); GET_MOD_ATTR(abc_keysview_register_func, "_abc_keysview_register"); GET_MOD_ATTR(abc_valuesview_register_func, "_abc_valuesview_register"); GET_MOD_ATTR(itemsview_repr_func, "_itemsview_isdisjoint"); GET_MOD_ATTR(itemsview_repr_func, "_itemsview_repr"); GET_MOD_ATTR(keysview_repr_func, "_keysview_repr"); GET_MOD_ATTR(keysview_isdisjoint_func, "_keysview_isdisjoint"); GET_MOD_ATTR(valuesview_repr_func, "_valuesview_repr"); if (PyType_Ready(&multidict_itemsview_type) < 0 || PyType_Ready(&multidict_valuesview_type) < 0 || PyType_Ready(&multidict_keysview_type) < 0) { goto fail; } // abc.ItemsView.register(_ItemsView) reg_func_call_result = PyObject_CallFunctionObjArgs( abc_itemsview_register_func, (PyObject*)&multidict_itemsview_type, NULL); if (reg_func_call_result == NULL) { goto fail; } Py_DECREF(reg_func_call_result); // abc.KeysView.register(_KeysView) reg_func_call_result = PyObject_CallFunctionObjArgs( abc_keysview_register_func, (PyObject*)&multidict_keysview_type, NULL); if (reg_func_call_result == NULL) { goto fail; } Py_DECREF(reg_func_call_result); // abc.ValuesView.register(_KeysView) reg_func_call_result = PyObject_CallFunctionObjArgs( abc_valuesview_register_func, (PyObject*)&multidict_valuesview_type, NULL); if (reg_func_call_result == NULL) { goto fail; } Py_DECREF(reg_func_call_result); Py_DECREF(module); return 0; fail: Py_CLEAR(module); return -1; #undef GET_MOD_ATTR } #ifdef __cplusplus } #endif #endif multidict-4.7.3/multidict/py.typed0000644000175100001650000000001713602414211017500 0ustar vstsdocker00000000000000PEP-561 marker.multidict-4.7.3/multidict.egg-info/0000755000175100001650000000000013602414216017502 5ustar 
vstsdocker00000000000000multidict-4.7.3/multidict.egg-info/PKG-INFO0000644000175100001650000001150413602414216020600 0ustar vstsdocker00000000000000Metadata-Version: 1.2 Name: multidict Version: 4.7.3 Summary: multidict implementation Home-page: https://github.com/aio-libs/multidict Author: Andrew Svetlov Author-email: andrew.svetlov@gmail.com License: Apache 2 Project-URL: Chat: Gitter, https://gitter.im/aio-libs/Lobby Project-URL: CI: Azure Pipelines, https://dev.azure.com/aio-libs/multidict/_build Project-URL: Coverage: codecov, https://codecov.io/github/aio-libs/multidict Project-URL: Docs: RTD, https://multidict.readthedocs.io Project-URL: GitHub: issues, https://github.com/aio-libs/multidict/issues Project-URL: GitHub: repo, https://github.com/aio-libs/multidict Description: ========= multidict ========= .. image:: https://dev.azure.com/aio-libs/multidict/_apis/build/status/CI?branchName=master :target: https://dev.azure.com/aio-libs/multidict/_build :alt: Azure Pipelines status for master branch .. image:: https://codecov.io/gh/aio-libs/multidict/branch/master/graph/badge.svg :target: https://codecov.io/gh/aio-libs/multidict :alt: Coverage metrics .. image:: https://img.shields.io/pypi/v/multidict.svg :target: https://pypi.org/project/multidict :alt: PyPI .. image:: https://readthedocs.org/projects/multidict/badge/?version=latest :target: http://multidict.readthedocs.org/en/latest/?badge=latest :alt: Documentationb .. image:: https://img.shields.io/pypi/pyversions/multidict.svg :target: https://pypi.org/project/multidict :alt: Python versions .. image:: https://badges.gitter.im/Join%20Chat.svg :target: https://gitter.im/aio-libs/Lobby :alt: Chat on Gitter Multidict is dict-like collection of *key-value pairs* where key might be occurred more than once in the container. Introduction ------------ *HTTP Headers* and *URL query string* require specific data structure: *multidict*. It behaves mostly like a regular ``dict`` but it may have several *values* for the same *key* and *preserves insertion ordering*. The *key* is ``str`` (or ``istr`` for case-insensitive dictionaries). ``multidict`` has four multidict classes: ``MultiDict``, ``MultiDictProxy``, ``CIMultiDict`` and ``CIMultiDictProxy``. Immutable proxies (``MultiDictProxy`` and ``CIMultiDictProxy``) provide a dynamic view for the proxied multidict, the view reflects underlying collection changes. They implement the ``collections.abc.Mapping`` interface. Regular mutable (``MultiDict`` and ``CIMultiDict``) classes implement ``collections.abc.MutableMapping`` and allows to change their own content. *Case insensitive* (``CIMultiDict`` and ``CIMultiDictProxy``) ones assume the *keys* are case insensitive, e.g.:: >>> dct = CIMultiDict(key='val') >>> 'Key' in dct True >>> dct['Key'] 'val' *Keys* should be ``str`` or ``istr`` instances. The library has optional C Extensions for sake of speed. License ------- Apache 2 Library Installation -------------------- .. code-block:: bash $ pip install multidict The library is Python 3 only! PyPI contains binary wheels for Linux, Windows and MacOS. If you want to install ``multidict`` on another operation system (or *Alpine Linux* inside a Docker) the Tarball will be used to compile the library from sources. It requires C compiler and Python headers installed. To skip the compilation please use `MULTIDICT_NO_EXTENSIONS` environment variable, e.g.: .. 
code-block:: bash $ MULTIDICT_NO_EXTENSIONS=1 pip install multidict Please note, Pure Python (uncompiled) version is about 20-50 times slower depending on the usage scenario!!! Changelog --------- See `RTD page `_. Platform: UNKNOWN Classifier: License :: OSI Approved :: Apache Software License Classifier: Intended Audience :: Developers Classifier: Programming Language :: Python Classifier: Programming Language :: Python :: 3 Classifier: Programming Language :: Python :: 3.5 Classifier: Programming Language :: Python :: 3.6 Classifier: Programming Language :: Python :: 3.7 Classifier: Programming Language :: Python :: 3.8 Classifier: Development Status :: 5 - Production/Stable Requires-Python: >=3.5 multidict-4.7.3/multidict.egg-info/SOURCES.txt0000644000175100001650000000311413602414216021365 0ustar vstsdocker00000000000000CHANGES.rst LICENSE MANIFEST.in Makefile README.rst setup.cfg setup.py docs/Makefile docs/benchmark.rst docs/changes.rst docs/conf.py docs/index.rst docs/make.bat docs/multidict.rst docs/spelling_wordlist.txt multidict/__init__.py multidict/__init__.pyi multidict/_abc.py multidict/_compat.py multidict/_multidict.c multidict/_multidict_base.py multidict/_multidict_py.py multidict/py.typed multidict.egg-info/PKG-INFO multidict.egg-info/SOURCES.txt multidict.egg-info/dependency_links.txt multidict.egg-info/top_level.txt multidict/_multilib/defs.h multidict/_multilib/dict.h multidict/_multilib/istr.h multidict/_multilib/iter.h multidict/_multilib/pair_list.h multidict/_multilib/views.h tests/cimultidict.pickle.0 tests/cimultidict.pickle.1 tests/cimultidict.pickle.2 tests/cimultidict.pickle.3 tests/cimultidict.pickle.4 tests/cimultidict.pickle.5 tests/conftest.py tests/gen_pickles.py tests/multidict.pickle.0 tests/multidict.pickle.1 tests/multidict.pickle.2 tests/multidict.pickle.3 tests/multidict.pickle.4 tests/multidict.pickle.5 tests/pycimultidict.pickle.0 tests/pycimultidict.pickle.1 tests/pycimultidict.pickle.2 tests/pycimultidict.pickle.3 tests/pycimultidict.pickle.4 tests/pycimultidict.pickle.5 tests/pymultidict.pickle.0 tests/pymultidict.pickle.1 tests/pymultidict.pickle.2 tests/pymultidict.pickle.3 tests/pymultidict.pickle.4 tests/pymultidict.pickle.5 tests/test_abc.py tests/test_copy.py tests/test_guard.py tests/test_istr.py tests/test_multidict.py tests/test_mutable_multidict.py tests/test_mypy.py tests/test_pickle.py tests/test_types.py tests/test_update.py tests/test_version.pymultidict-4.7.3/multidict.egg-info/dependency_links.txt0000644000175100001650000000000113602414216023550 0ustar vstsdocker00000000000000 multidict-4.7.3/multidict.egg-info/top_level.txt0000644000175100001650000000001213602414216022225 0ustar vstsdocker00000000000000multidict multidict-4.7.3/setup.cfg0000644000175100001650000000140013602414216015626 0ustar vstsdocker00000000000000[aliases] test = pytest [metadata] license_file = LICENSE long_description = file: README.rst [flake8] ignore = E302,E701,E305,E704,F811,N811, W503 max-line-length = 88 [isort] multi_line_output = 3 include_trailing_comma = True force_grid_wrap = 0 use_parentheses = True known_first_party = multidict known_third_party = pytest [tool:pytest] testpaths = tests norecursedirs = dist build .tox docs requirements tools addopts = --doctest-modules --cov=multidict --cov-report term-missing:skip-covered --cov-report xml --junitxml=junit-test-results.xml -v doctest_optionflags = ALLOW_UNICODE ELLIPSIS junit_family = xunit2 [mypy-pytest] ignore_missing_imports = true [mypy-multidict._multidict] ignore_missing_imports = true 
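# Illustrative note (not part of the original setup.cfg): the README above
# documents the MULTIDICT_NO_EXTENSIONS environment variable for skipping the
# C-extension build, e.g.
#   $ MULTIDICT_NO_EXTENSIONS=1 pip install multidict
# A quick way to check which implementation actually ended up in use is
# something like
#   $ python -c "from multidict import MultiDict; print(MultiDict.__module__)"
# which prints multidict._multidict for the C extension and
# multidict._multidict_py for the pure-Python fallback.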
[egg_info] tag_build = tag_date = 0 multidict-4.7.3/setup.py0000644000175100001650000000513713602414211015525 0ustar vstsdocker00000000000000import codecs import os import platform import re import sys from setuptools import Extension, setup NO_EXTENSIONS = bool(os.environ.get("MULTIDICT_NO_EXTENSIONS")) if sys.implementation.name != "cpython": NO_EXTENSIONS = True CFLAGS = ["-O2"] # CFLAGS = ['-g'] if platform.system() != "Windows": CFLAGS.extend( [ "-std=c99", "-Wall", "-Wsign-compare", "-Wconversion", "-fno-strict-aliasing", "-pedantic", ] ) extensions = [ Extension( "multidict._multidict", ["multidict/_multidict.c"], extra_compile_args=CFLAGS, ), ] with codecs.open( os.path.join( os.path.abspath(os.path.dirname(__file__)), "multidict", "__init__.py" ), "r", "latin1", ) as fp: try: version = re.findall(r'^__version__ = "([^"]+)"\r?$', fp.read(), re.M)[0] except IndexError: raise RuntimeError("Unable to determine version.") def read(f): return open(os.path.join(os.path.dirname(__file__), f)).read().strip() args = dict( name="multidict", version=version, description=("multidict implementation"), long_description=read("README.rst"), classifiers=[ "License :: OSI Approved :: Apache Software License", "Intended Audience :: Developers", "Programming Language :: Python", "Programming Language :: Python :: 3", "Programming Language :: Python :: 3.5", "Programming Language :: Python :: 3.6", "Programming Language :: Python :: 3.7", "Programming Language :: Python :: 3.8", "Development Status :: 5 - Production/Stable", ], author="Andrew Svetlov", author_email="andrew.svetlov@gmail.com", url="https://github.com/aio-libs/multidict", project_urls={ "Chat: Gitter": "https://gitter.im/aio-libs/Lobby", "CI: Azure Pipelines": "https://dev.azure.com/aio-libs/multidict/_build", "Coverage: codecov": "https://codecov.io/github/aio-libs/multidict", "Docs: RTD": "https://multidict.readthedocs.io", "GitHub: issues": "https://github.com/aio-libs/multidict/issues", "GitHub: repo": "https://github.com/aio-libs/multidict", }, license="Apache 2", packages=["multidict"], python_requires=">=3.5", include_package_data=True, ) if not NO_EXTENSIONS: print("**********************") print("* Accellerated build *") print("**********************") setup(ext_modules=extensions, **args) else: print("*********************") print("* Pure Python build *") print("*********************") setup(**args) multidict-4.7.3/tests/0000755000175100001650000000000013602414216015154 5ustar vstsdocker00000000000000multidict-4.7.3/tests/cimultidict.pickle.00000644000175100001650000000012113602414211021002 0ustar vstsdocker00000000000000cmultidict._multidict CIMultiDict p0 ((lp1 (Va p2 L1L tp3 a(g2 L2L tp4 atp5 Rp6 .multidict-4.7.3/tests/cimultidict.pickle.10000644000175100001650000000010713602414211021007 0ustar vstsdocker00000000000000cmultidict._multidict CIMultiDict q(]q((XaqKtq(hKtqetqRq.multidict-4.7.3/tests/cimultidict.pickle.20000644000175100001650000000010613602414211021007 0ustar vstsdocker00000000000000€cmultidict._multidict CIMultiDict q]q(XaqK†qhK†qe…qRq.multidict-4.7.3/tests/cimultidict.pickle.30000644000175100001650000000010613602414211021010 0ustar vstsdocker00000000000000€cmultidict._multidict CIMultiDict q]q(XaqK†qhK†qe…qRq.multidict-4.7.3/tests/cimultidict.pickle.40000644000175100001650000000011113602414211021005 0ustar vstsdocker00000000000000€•>Œmultidict._multidict”Œ CIMultiDict”“”]”(Œa”K†”hK†”e…”R”.multidict-4.7.3/tests/cimultidict.pickle.50000644000175100001650000000011113602414211021006 0ustar 
vstsdocker00000000000000€•>Œmultidict._multidict”Œ CIMultiDict”“”]”(Œa”K†”hK†”e…”R”.multidict-4.7.3/tests/conftest.py0000644000175100001650000000123013602414211017342 0ustar vstsdocker00000000000000import pickle import pytest from multidict._compat import USE_CYTHON_EXTENSIONS OPTIONAL_CYTHON = ( () if USE_CYTHON_EXTENSIONS else pytest.mark.skip(reason="No Cython extensions available") ) @pytest.fixture( scope="session", params=[ pytest.param("multidict._multidict", marks=OPTIONAL_CYTHON), "multidict._multidict_py", ], ) def _multidict(request): return pytest.importorskip(request.param) def pytest_generate_tests(metafunc): if "pickle_protocol" in metafunc.fixturenames: metafunc.parametrize( "pickle_protocol", list(range(pickle.HIGHEST_PROTOCOL + 1)), scope="session" ) multidict-4.7.3/tests/gen_pickles.py0000644000175100001650000000142613602414211020007 0ustar vstsdocker00000000000000import pickle from multidict._compat import USE_CYTHON from multidict._multidict_py import CIMultiDict as PyCIMultiDict # noqa from multidict._multidict_py import MultiDict as PyMultiDict # noqa try: from multidict._multidict import MultiDict, CIMultiDict # noqa except ImportError: pass def write(name, proto): cls = globals()[name] d = cls([("a", 1), ("a", 2)]) with open("{}.pickle.{}".format(name.lower(), proto), "wb") as f: pickle.dump(d, f, proto) def generate(): if not USE_CYTHON: raise RuntimeError("Cython is required") for proto in range(pickle.HIGHEST_PROTOCOL + 1): for name in ("MultiDict", "CIMultiDict", "PyMultiDict", "PyCIMultiDict"): write(name, proto) if __name__ == "__main__": generate() multidict-4.7.3/tests/multidict.pickle.00000644000175100001650000000011713602414211020473 0ustar vstsdocker00000000000000cmultidict._multidict MultiDict p0 ((lp1 (Va p2 L1L tp3 a(g2 L2L tp4 atp5 Rp6 .multidict-4.7.3/tests/multidict.pickle.10000644000175100001650000000010513602414211020471 0ustar vstsdocker00000000000000cmultidict._multidict MultiDict q(]q((XaqKtq(hKtqetqRq.multidict-4.7.3/tests/multidict.pickle.20000644000175100001650000000010413602414211020471 0ustar vstsdocker00000000000000€cmultidict._multidict MultiDict q]q(XaqK†qhK†qe…qRq.multidict-4.7.3/tests/multidict.pickle.30000644000175100001650000000010413602414211020472 0ustar vstsdocker00000000000000€cmultidict._multidict MultiDict q]q(XaqK†qhK†qe…qRq.multidict-4.7.3/tests/multidict.pickle.40000644000175100001650000000010713602414211020476 0ustar vstsdocker00000000000000€•<Œmultidict._multidict”Œ MultiDict”“”]”(Œa”K†”hK†”e…”R”.multidict-4.7.3/tests/multidict.pickle.50000644000175100001650000000010713602414211020477 0ustar vstsdocker00000000000000€•<Œmultidict._multidict”Œ MultiDict”“”]”(Œa”K†”hK†”e…”R”.multidict-4.7.3/tests/pycimultidict.pickle.00000644000175100001650000000012413602414211021356 0ustar vstsdocker00000000000000cmultidict._multidict_py CIMultiDict p0 ((lp1 (Va p2 L1L tp3 a(g2 L2L tp4 atp5 Rp6 .multidict-4.7.3/tests/pycimultidict.pickle.10000644000175100001650000000011213602414211021354 0ustar vstsdocker00000000000000cmultidict._multidict_py CIMultiDict q(]q((XaqKtq(hKtqetqRq.multidict-4.7.3/tests/pycimultidict.pickle.20000644000175100001650000000011113602414211021354 0ustar vstsdocker00000000000000€cmultidict._multidict_py CIMultiDict q]q(XaqK†qhK†qe…qRq.multidict-4.7.3/tests/pycimultidict.pickle.30000644000175100001650000000011113602414211021355 0ustar vstsdocker00000000000000€cmultidict._multidict_py CIMultiDict q]q(XaqK†qhK†qe…qRq.multidict-4.7.3/tests/pycimultidict.pickle.40000644000175100001650000000011413602414211021361 0ustar 
vstsdocker00000000000000€•AŒmultidict._multidict_py”Œ CIMultiDict”“”]”(Œa”K†”hK†”e…”R”.multidict-4.7.3/tests/pycimultidict.pickle.50000644000175100001650000000011413602414211021362 0ustar vstsdocker00000000000000€•AŒmultidict._multidict_py”Œ CIMultiDict”“”]”(Œa”K†”hK†”e…”R”.multidict-4.7.3/tests/pymultidict.pickle.00000644000175100001650000000012213602414211021040 0ustar vstsdocker00000000000000cmultidict._multidict_py MultiDict p0 ((lp1 (Va p2 L1L tp3 a(g2 L2L tp4 atp5 Rp6 .multidict-4.7.3/tests/pymultidict.pickle.10000644000175100001650000000011013602414211021036 0ustar vstsdocker00000000000000cmultidict._multidict_py MultiDict q(]q((XaqKtq(hKtqetqRq.multidict-4.7.3/tests/pymultidict.pickle.20000644000175100001650000000010713602414211021045 0ustar vstsdocker00000000000000€cmultidict._multidict_py MultiDict q]q(XaqK†qhK†qe…qRq.multidict-4.7.3/tests/pymultidict.pickle.30000644000175100001650000000010713602414211021046 0ustar vstsdocker00000000000000€cmultidict._multidict_py MultiDict q]q(XaqK†qhK†qe…qRq.multidict-4.7.3/tests/pymultidict.pickle.40000644000175100001650000000011213602414211021043 0ustar vstsdocker00000000000000€•?Œmultidict._multidict_py”Œ MultiDict”“”]”(Œa”K†”hK†”e…”R”.multidict-4.7.3/tests/pymultidict.pickle.50000644000175100001650000000011213602414211021044 0ustar vstsdocker00000000000000€•?Œmultidict._multidict_py”Œ MultiDict”“”]”(Œa”K†”hK†”e…”R”.multidict-4.7.3/tests/test_abc.py0000644000175100001650000000613013602414211017305 0ustar vstsdocker00000000000000from collections.abc import Mapping, MutableMapping import pytest from multidict import MultiMapping, MutableMultiMapping from multidict._compat import USE_CYTHON from multidict._multidict_py import CIMultiDict as PyCIMultiDict from multidict._multidict_py import CIMultiDictProxy as PyCIMultiDictProxy from multidict._multidict_py import MultiDict as PyMultiDict # noqa: E402 from multidict._multidict_py import MultiDictProxy as PyMultiDictProxy if USE_CYTHON: from multidict._multidict import ( MultiDict, CIMultiDict, MultiDictProxy, CIMultiDictProxy, ) @pytest.fixture( params=([MultiDict, CIMultiDict] if USE_CYTHON else []) + [PyMultiDict, PyCIMultiDict], ids=(["MultiDict", "CIMultiDict"] if USE_CYTHON else []) + ["PyMultiDict", "PyCIMultiDict"], ) def cls(request): return request.param @pytest.fixture( params=( [(MultiDictProxy, MultiDict), (CIMultiDictProxy, CIMultiDict)] if USE_CYTHON else [] ) + [(PyMultiDictProxy, PyMultiDict), (PyCIMultiDictProxy, PyCIMultiDict)], ids=(["MultiDictProxy", "CIMultiDictProxy"] if USE_CYTHON else []) + ["PyMultiDictProxy", "PyCIMultiDictProxy"], ) def proxy_classes(request): return request.param def test_abc_inheritance(): assert issubclass(MultiMapping, Mapping) assert not issubclass(MultiMapping, MutableMapping) assert issubclass(MutableMultiMapping, Mapping) assert issubclass(MutableMultiMapping, MutableMapping) class A(MultiMapping): def __getitem__(self, key): pass def __iter__(self): pass def __len__(self): pass def getall(self, key, default=None): super().getall(key, default) def getone(self, key, default=None): super().getone(key, default) def test_abc_getall(): with pytest.raises(KeyError): A().getall("key") def test_abc_getone(): with pytest.raises(KeyError): A().getone("key") class B(A, MutableMultiMapping): def __setitem__(self, key, value): pass def __delitem__(self, key): pass def add(self, key, value): super().add(key, value) def extend(self, *args, **kwargs): super().extend(*args, **kwargs) def popall(self, key, default=None): super().popall(key, default) def 
popone(self, key, default=None): super().popone(key, default) def test_abc_add(): with pytest.raises(NotImplementedError): B().add("key", "val") def test_abc_extend(): with pytest.raises(NotImplementedError): B().extend() def test_abc_popone(): with pytest.raises(KeyError): B().popone("key") def test_abc_popall(): with pytest.raises(KeyError): B().popall("key") def test_multidict_inheritance(cls): assert issubclass(cls, MultiMapping) assert issubclass(cls, MutableMultiMapping) def test_proxy_inheritance(proxy_classes): proxy, _ = proxy_classes assert issubclass(proxy, MultiMapping) assert not issubclass(proxy, MutableMultiMapping) def test_generic_type_in_runtime(): MultiMapping[str] MutableMultiMapping[str] multidict-4.7.3/tests/test_copy.py0000644000175100001650000000347013602414211017536 0ustar vstsdocker00000000000000import copy import pytest from multidict._compat import USE_CYTHON from multidict._multidict_py import CIMultiDict as PyCIMultiDict from multidict._multidict_py import CIMultiDictProxy as PyCIMultiDictProxy from multidict._multidict_py import MultiDict as PyMultiDict # noqa: E402 from multidict._multidict_py import MultiDictProxy as PyMultiDictProxy if USE_CYTHON: from multidict._multidict import ( MultiDict, CIMultiDict, MultiDictProxy, CIMultiDictProxy, ) @pytest.fixture( params=([MultiDict, CIMultiDict] if USE_CYTHON else []) + [PyMultiDict, PyCIMultiDict], ids=(["MultiDict", "CIMultiDict"] if USE_CYTHON else []) + ["PyMultiDict", "PyCIMultiDict"], ) def cls(request): return request.param @pytest.fixture( params=( [(MultiDictProxy, MultiDict), (CIMultiDictProxy, CIMultiDict)] if USE_CYTHON else [] ) + [(PyMultiDictProxy, PyMultiDict), (PyCIMultiDictProxy, PyCIMultiDict)], ids=(["MultiDictProxy", "CIMultiDictProxy"] if USE_CYTHON else []) + ["PyMultiDictProxy", "PyCIMultiDictProxy"], ) def proxy_classes(request): return request.param def test_copy(cls): d = cls() d["foo"] = 6 d2 = d.copy() d2["foo"] = 7 assert d["foo"] == 6 assert d2["foo"] == 7 def test_copy_proxy(proxy_classes): proxy_cls, dict_cls = proxy_classes d = dict_cls() d["foo"] = 6 p = proxy_cls(d) d2 = p.copy() d2["foo"] = 7 assert d["foo"] == 6 assert p["foo"] == 6 assert d2["foo"] == 7 def test_copy_std_copy(cls): d = cls() d["foo"] = 6 d2 = copy.copy(d) d2["foo"] = 7 assert d["foo"] == 6 assert d2["foo"] == 7 def test_ci_multidict_clone(cls): d = cls(foo=6) d2 = cls(d) d2["foo"] = 7 assert d["foo"] == 6 assert d2["foo"] == 7 multidict-4.7.3/tests/test_guard.py0000644000175100001650000000151613602414211017665 0ustar vstsdocker00000000000000import pytest from multidict._compat import USE_CYTHON from multidict._multidict_py import MultiDict as PyMultiDict # noqa: E402 if USE_CYTHON: from multidict._multidict import MultiDict @pytest.fixture( params=([MultiDict] if USE_CYTHON else []) + [PyMultiDict], ids=(["MultiDict"] if USE_CYTHON else []) + ["PyMultiDict"], ) def cls(request): return request.param def test_guard_items(cls): md = cls({"a": "b"}) it = iter(md.items()) md["a"] = "c" with pytest.raises(RuntimeError): next(it) def test_guard_keys(cls): md = cls({"a": "b"}) it = iter(md.keys()) md["a"] = "c" with pytest.raises(RuntimeError): next(it) def test_guard_values(cls): md = cls({"a": "b"}) it = iter(md.values()) md["a"] = "c" with pytest.raises(RuntimeError): next(it) multidict-4.7.3/tests/test_istr.py0000644000175100001650000000314713602414211017546 0ustar vstsdocker00000000000000import gc import sys import pytest from multidict._compat import USE_CYTHON from multidict._multidict_py import istr as 
_istr # noqa: E402 if USE_CYTHON: from multidict._multidict import istr IMPLEMENTATION = getattr(sys, "implementation") # to suppress mypy error class IStrMixin: cls = NotImplemented def test_ctor(self): s = self.cls() assert "" == s def test_ctor_str(self): s = self.cls("aBcD") assert "aBcD" == s def test_ctor_istr(self): s = self.cls("A") s2 = self.cls(s) assert "A" == s assert s == s2 def test_ctor_buffer(self): s = self.cls(b"aBc") assert "b'aBc'" == s def test_ctor_repr(self): s = self.cls(None) assert "None" == s def test_str(self): s = self.cls("aBcD") s1 = str(s) assert s1 == "aBcD" assert type(s1) is str def test_eq(self): s1 = "Abc" s2 = self.cls(s1) assert s1 == s2 class TestPyIStr(IStrMixin): cls = _istr @staticmethod def _create_strs(): _istr("foobarbaz") istr2 = _istr() _istr(istr2) @pytest.mark.skipif( IMPLEMENTATION.name != "cpython", reason="PyPy has different GC implementation" ) def test_leak(self): gc.collect() cnt = len(gc.get_objects()) for _ in range(10000): self._create_strs() gc.collect() cnt2 = len(gc.get_objects()) assert abs(cnt - cnt2) < 10 # on PyPy these numbers are not equal if USE_CYTHON: class TestIStr(IStrMixin): cls = istr multidict-4.7.3/tests/test_multidict.py0000644000175100001650000003210513602414211020557 0ustar vstsdocker00000000000000import gc import operator import sys import weakref from functools import reduce import pytest import multidict def chained_callable(module, callables): """ Returns callable that will get and call all given objects in module in exact order. If `names` is a single object's name function will return object itself. Will treat `names` of type `str` as a list of single element. """ callables = (callables,) if isinstance(callables, str) else callables _callable, *rest = (getattr(module, name) for name in callables) def chained_call(*args, **kwargs): return reduce(lambda res, c: c(res), rest, _callable(*args, **kwargs)) return chained_call if len(rest) > 0 else _callable @pytest.fixture(scope="function") def cls(request, _multidict): return chained_callable(_multidict, request.param) dict_cls = proxy_cls = cls @pytest.mark.parametrize("cls", ["MultiDict", "CIMultiDict"], indirect=True) def test_exposed_names(cls): name = cls.__name__ while name.startswith("_"): name = name[1:] assert name in multidict.__all__ @pytest.mark.parametrize( "cls, key_cls", [("MultiDict", str), (("MultiDict", "MultiDictProxy"), str)], indirect=["cls"], ) def test__iter__types(cls, key_cls): d = cls([("key", "one"), ("key2", "two"), ("key", 3)]) for i in d: assert type(i) is key_cls, (type(i), key_cls) @pytest.mark.parametrize( "dict_cls, proxy_cls", [("MultiDict", "MultiDictProxy"), ("CIMultiDict", "CIMultiDictProxy")], indirect=True, ) def test_proxy_copy(dict_cls, proxy_cls): d1 = dict_cls(key="value", a="b") p1 = proxy_cls(d1) d2 = p1.copy() assert d1 == d2 assert d1 is not d2 @pytest.mark.skipif( sys.version_info < (3, 7), reason="__class_getitem__ is supported started from Python 3.7", ) @pytest.mark.parametrize( "cls", ["MultiDict", "CIMultiDict", "MultiDictProxy", "CIMultiDictProxy"], indirect=True, ) def test_class_getitem(cls): assert cls[str] is cls @pytest.mark.parametrize( "cls", ["MultiDict", "CIMultiDict", "MultiDictProxy", "CIMultiDictProxy"], indirect=True, ) def test_subclassing(cls): class MyClass(cls): pass class BaseMultiDictTest: def test_instantiate__empty(self, cls): d = cls() assert d == {} assert len(d) == 0 assert list(d.keys()) == [] assert list(d.values()) == [] assert list(d.items()) == [] assert cls() != list() with 
pytest.raises(TypeError, match=r"(2 given)"): cls(("key1", "value1"), ("key2", "value2")) @pytest.mark.parametrize("arg0", [[("key", "value1")], {"key": "value1"}]) def test_instantiate__from_arg0(self, cls, arg0): d = cls(arg0) assert d == {"key": "value1"} assert len(d) == 1 assert list(d.keys()) == ["key"] assert list(d.values()) == ["value1"] assert list(d.items()) == [("key", "value1")] def test_instantiate__with_kwargs(self, cls): d = cls([("key", "value1")], key2="value2") assert d == {"key": "value1", "key2": "value2"} assert len(d) == 2 assert sorted(d.keys()) == ["key", "key2"] assert sorted(d.values()) == ["value1", "value2"] assert sorted(d.items()) == [("key", "value1"), ("key2", "value2")] def test_instantiate__from_generator(self, cls): d = cls((str(i), i) for i in range(2)) assert d == {"0": 0, "1": 1} assert len(d) == 2 assert sorted(d.keys()) == ["0", "1"] assert sorted(d.values()) == [0, 1] assert sorted(d.items()) == [("0", 0), ("1", 1)] def test_instantiate__from_list_of_lists(self, cls): d = cls([["key", "value1"]]) assert d == {"key": "value1"} def test_instantiate__from_list_of_custom_pairs(self, cls): class Pair: def __len__(self): return 2 def __getitem__(self, pos): if pos == 0: return "key" elif pos == 1: return "value1" else: raise IndexError d = cls([Pair()]) assert d == {"key": "value1"} def test_getone(self, cls): d = cls([("key", "value1")], key="value2") assert d.getone("key") == "value1" assert d.get("key") == "value1" assert d["key"] == "value1" with pytest.raises(KeyError, match="key2"): d["key2"] with pytest.raises(KeyError, match="key2"): d.getone("key2") assert d.getone("key2", "default") == "default" def test__iter__( self, cls, ): d = cls([("key", "one"), ("key2", "two"), ("key", 3)]) assert list(d) == ["key", "key2", "key"] def test_keys__contains(self, cls): d = cls([("key", "one"), ("key2", "two"), ("key", 3)]) assert list(d.keys()) == ["key", "key2", "key"] assert "key" in d.keys() assert "key2" in d.keys() assert "foo" not in d.keys() def test_values__contains(self, cls): d = cls([("key", "one"), ("key", "two"), ("key", 3)]) assert list(d.values()) == ["one", "two", 3] assert "one" in d.values() assert "two" in d.values() assert 3 in d.values() assert "foo" not in d.values() def test_items__contains(self, cls): d = cls([("key", "one"), ("key", "two"), ("key", 3)]) assert list(d.items()) == [("key", "one"), ("key", "two"), ("key", 3)] assert ("key", "one") in d.items() assert ("key", "two") in d.items() assert ("key", 3) in d.items() assert ("foo", "bar") not in d.items() def test_cannot_create_from_unaccepted(self, cls): with pytest.raises(TypeError): cls([(1, 2, 3)]) def test_keys_is_set_less(self, cls): d = cls([("key", "value1")]) assert d.keys() < {"key", "key2"} def test_keys_is_set_less_equal(self, cls): d = cls([("key", "value1")]) assert d.keys() <= {"key"} def test_keys_is_set_equal(self, cls): d = cls([("key", "value1")]) assert d.keys() == {"key"} def test_keys_is_set_greater(self, cls): d = cls([("key", "value1")]) assert {"key", "key2"} > d.keys() def test_keys_is_set_greater_equal(self, cls): d = cls([("key", "value1")]) assert {"key"} >= d.keys() def test_keys_is_set_not_equal(self, cls): d = cls([("key", "value1")]) assert d.keys() != {"key2"} def test_eq(self, cls): d = cls([("key", "value1")]) assert {"key": "value1"} == d def test_eq2(self, cls): d1 = cls([("key", "value1")]) d2 = cls([("key2", "value1")]) assert d1 != d2 def test_eq3(self, cls): d1 = cls([("key", "value1")]) d2 = cls() assert d1 != d2 def 
test_eq_other_mapping_contains_more_keys(self, cls): d1 = cls(foo="bar") d2 = dict(foo="bar", bar="baz") assert d1 != d2 def test_ne(self, cls): d = cls([("key", "value1")]) assert d != {"key": "another_value"} def test_and(self, cls): d = cls([("key", "value1")]) assert {"key"} == d.keys() & {"key", "key2"} def test_and2(self, cls): d = cls([("key", "value1")]) assert {"key"} == {"key", "key2"} & d.keys() def test_or(self, cls): d = cls([("key", "value1")]) assert {"key", "key2"} == d.keys() | {"key2"} def test_or2(self, cls): d = cls([("key", "value1")]) assert {"key", "key2"} == {"key2"} | d.keys() def test_sub(self, cls): d = cls([("key", "value1"), ("key2", "value2")]) assert {"key"} == d.keys() - {"key2"} def test_sub2(self, cls): d = cls([("key", "value1"), ("key2", "value2")]) assert {"key3"} == {"key", "key2", "key3"} - d.keys() def test_xor(self, cls): d = cls([("key", "value1"), ("key2", "value2")]) assert {"key", "key3"} == d.keys() ^ {"key2", "key3"} def test_xor2(self, cls): d = cls([("key", "value1"), ("key2", "value2")]) assert {"key", "key3"} == {"key2", "key3"} ^ d.keys() @pytest.mark.parametrize("_set, expected", [({"key2"}, True), ({"key"}, False)]) def test_isdisjoint(self, cls, _set, expected): d = cls([("key", "value1")]) assert d.keys().isdisjoint(_set) == expected def test_repr_issue_410(self, cls): d = cls() try: raise Exception pytest.fail("Should never happen") # pragma: no cover except Exception as e: repr(d) assert sys.exc_info()[1] == e @pytest.mark.parametrize( "op", [operator.or_, operator.and_, operator.sub, operator.xor] ) @pytest.mark.parametrize("other", [{"other"}]) def test_op_issue_410(self, cls, op, other): d = cls([("key", "value")]) try: raise Exception pytest.fail("Should never happen") # pragma: no cover except Exception as e: op(d.keys(), other) assert sys.exc_info()[1] == e def test_weakref(self, cls): called = False def cb(wr): nonlocal called called = True d = cls() wr = weakref.ref(d, cb) del d gc.collect() assert called del wr def test_iter_length_hint_keys(self, cls): md = cls(a=1, b=2) it = iter(md.keys()) assert it.__length_hint__() == 2 def test_iter_length_hint_items(self, cls): md = cls(a=1, b=2) it = iter(md.items()) assert it.__length_hint__() == 2 def test_iter_length_hint_values(self, cls): md = cls(a=1, b=2) it = iter(md.values()) assert it.__length_hint__() == 2 class TestMultiDict(BaseMultiDictTest): @pytest.fixture(params=["MultiDict", ("MultiDict", "MultiDictProxy")]) def cls(self, request, _multidict): return chained_callable(_multidict, request.param) def test__repr__(self, cls): d = cls() _cls = type(d) assert str(d) == "<%s()>" % _cls.__name__ d = cls([("key", "one"), ("key", "two")]) assert str(d) == "<%s('key': 'one', 'key': 'two')>" % _cls.__name__ def test_getall(self, cls): d = cls([("key", "value1")], key="value2") assert d != {"key": "value1"} assert len(d) == 2 assert d.getall("key") == ["value1", "value2"] with pytest.raises(KeyError, match="some_key"): d.getall("some_key") default = object() assert d.getall("some_key", default) is default def test_preserve_stable_ordering(self, cls): d = cls([("a", 1), ("b", "2"), ("a", 3)]) s = "&".join("{}={}".format(k, v) for k, v in d.items()) assert s == "a=1&b=2&a=3" def test_get(self, cls): d = cls([("a", 1), ("a", 2)]) assert d["a"] == 1 def test_items__repr__(self, cls): d = cls([("key", "value1")], key="value2") expected = "_ItemsView('key': 'value1', 'key': 'value2')" assert repr(d.items()) == expected def test_keys__repr__(self, cls): d = cls([("key", "value1")], 
key="value2") assert repr(d.keys()) == "_KeysView('key', 'key')" def test_values__repr__(self, cls): d = cls([("key", "value1")], key="value2") assert repr(d.values()) == "_ValuesView('value1', 'value2')" class TestCIMultiDict(BaseMultiDictTest): @pytest.fixture(params=["CIMultiDict", ("CIMultiDict", "CIMultiDictProxy")]) def cls(self, request, _multidict): return chained_callable(_multidict, request.param) def test_basics(self, cls): d = cls([("KEY", "value1")], KEY="value2") assert d.getone("key") == "value1" assert d.get("key") == "value1" assert d.get("key2", "val") == "val" assert d["key"] == "value1" assert "key" in d with pytest.raises(KeyError, match="key2"): d["key2"] with pytest.raises(KeyError, match="key2"): d.getone("key2") def test_getall(self, cls): d = cls([("KEY", "value1")], KEY="value2") assert not d == {"KEY": "value1"} assert len(d) == 2 assert d.getall("key") == ["value1", "value2"] with pytest.raises(KeyError, match="some_key"): d.getall("some_key") def test_get(self, cls): d = cls([("A", 1), ("a", 2)]) assert 1 == d["a"] def test__repr__(self, cls): d = cls([("KEY", "value1")], key="value2") _cls = type(d) expected = "<%s('KEY': 'value1', 'key': 'value2')>" % _cls.__name__ assert str(d) == expected def test_items__repr__(self, cls): d = cls([("KEY", "value1")], key="value2") expected = "_ItemsView('KEY': 'value1', 'key': 'value2')" assert repr(d.items()) == expected def test_keys__repr__(self, cls): d = cls([("KEY", "value1")], key="value2") assert repr(d.keys()) == "_KeysView('KEY', 'key')" def test_values__repr__(self, cls): d = cls([("KEY", "value1")], key="value2") assert repr(d.values()) == "_ValuesView('value1', 'value2')" multidict-4.7.3/tests/test_mutable_multidict.py0000644000175100001650000003033213602414211022270 0ustar vstsdocker00000000000000import string import sys import pytest class TestMutableMultiDict: @pytest.fixture def cls(self, _multidict): return _multidict.MultiDict @pytest.fixture def proxy_cls(self, _multidict): return _multidict.MultiDictProxy @pytest.fixture def istr(self, _multidict): return _multidict.istr def test_copy(self, cls): d1 = cls(key="value", a="b") d2 = d1.copy() assert d1 == d2 assert d1 is not d2 def test__repr__(self, cls): d = cls() assert str(d) == "<%s()>" % cls.__name__ d = cls([("key", "one"), ("key", "two")]) expected = "<%s('key': 'one', 'key': 'two')>" % cls.__name__ assert str(d) == expected def test_getall(self, cls): d = cls([("key", "value1")], key="value2") assert len(d) == 2 assert d.getall("key") == ["value1", "value2"] with pytest.raises(KeyError, match="some_key"): d.getall("some_key") default = object() assert d.getall("some_key", default) is default def test_add(self, cls): d = cls() assert d == {} d["key"] = "one" assert d == {"key": "one"} assert d.getall("key") == ["one"] d["key"] = "two" assert d == {"key": "two"} assert d.getall("key") == ["two"] d.add("key", "one") assert 2 == len(d) assert d.getall("key") == ["two", "one"] d.add("foo", "bar") assert 3 == len(d) assert d.getall("foo") == ["bar"] def test_extend(self, cls): d = cls() assert d == {} d.extend([("key", "one"), ("key", "two")], key=3, foo="bar") assert d != {"key": "one", "foo": "bar"} assert 4 == len(d) itms = d.items() # we can't guarantee order of kwargs assert ("key", "one") in itms assert ("key", "two") in itms assert ("key", 3) in itms assert ("foo", "bar") in itms other = cls(bar="baz") assert other == {"bar": "baz"} d.extend(other) assert ("bar", "baz") in d.items() d.extend({"foo": "moo"}) assert ("foo", "moo") in d.items() 
d.extend() assert 6 == len(d) with pytest.raises(TypeError): d.extend("foo", "bar") def test_extend_from_proxy(self, cls, proxy_cls): d = cls([("a", "a"), ("b", "b")]) proxy = proxy_cls(d) d2 = cls() d2.extend(proxy) assert [("a", "a"), ("b", "b")] == list(d2.items()) def test_clear(self, cls): d = cls([("key", "one")], key="two", foo="bar") d.clear() assert d == {} assert list(d.items()) == [] def test_del(self, cls): d = cls([("key", "one"), ("key", "two")], foo="bar") assert list(d.keys()) == ["key", "key", "foo"] del d["key"] assert d == {"foo": "bar"} assert list(d.items()) == [("foo", "bar")] with pytest.raises(KeyError, match="key"): del d["key"] def test_set_default(self, cls): d = cls([("key", "one"), ("key", "two")], foo="bar") assert "one" == d.setdefault("key", "three") assert "three" == d.setdefault("otherkey", "three") assert "otherkey" in d assert "three" == d["otherkey"] def test_popitem(self, cls): d = cls() d.add("key", "val1") d.add("key", "val2") assert ("key", "val1") == d.popitem() assert [("key", "val2")] == list(d.items()) def test_popitem_empty_multidict(self, cls): d = cls() with pytest.raises(KeyError): d.popitem() def test_pop(self, cls): d = cls() d.add("key", "val1") d.add("key", "val2") assert "val1" == d.pop("key") assert {"key": "val2"} == d def test_pop2(self, cls): d = cls() d.add("key", "val1") d.add("key2", "val2") d.add("key", "val3") assert "val1" == d.pop("key") assert [("key2", "val2"), ("key", "val3")] == list(d.items()) def test_pop_default(self, cls): d = cls(other="val") assert "default" == d.pop("key", "default") assert "other" in d def test_pop_raises(self, cls): d = cls(other="val") with pytest.raises(KeyError, match="key"): d.pop("key") assert "other" in d def test_replacement_order(self, cls): d = cls() d.add("key1", "val1") d.add("key2", "val2") d.add("key1", "val3") d.add("key2", "val4") d["key1"] = "val" expected = [("key1", "val"), ("key2", "val2"), ("key2", "val4")] assert expected == list(d.items()) def test_nonstr_key(self, cls): d = cls() with pytest.raises(TypeError): d[1] = "val" def test_istr_key(self, cls, istr): d = cls() d[istr("1")] = "val" assert type(list(d.keys())[0]) is istr def test_str_derived_key(self, cls): class A(str): pass d = cls() d[A("1")] = "val" assert type(list(d.keys())[0]) is A def test_istr_key_add(self, cls, istr): d = cls() d.add(istr("1"), "val") assert type(list(d.keys())[0]) is istr def test_str_derived_key_add(self, cls): class A(str): pass d = cls() d.add(A("1"), "val") assert type(list(d.keys())[0]) is A def test_popall(self, cls): d = cls() d.add("key1", "val1") d.add("key2", "val2") d.add("key1", "val3") ret = d.popall("key1") assert ["val1", "val3"] == ret assert {"key2": "val2"} == d def test_popall_default(self, cls): d = cls() assert "val" == d.popall("key", "val") def test_popall_key_error(self, cls): d = cls() with pytest.raises(KeyError, match="key"): d.popall("key") def test_large_multidict_resizing(self, cls): SIZE = 1024 d = cls() for i in range(SIZE): d["key" + str(i)] = i for i in range(SIZE - 1): del d["key" + str(i)] assert {"key" + str(SIZE - 1): SIZE - 1} == d class TestCIMutableMultiDict: @pytest.fixture def cls(self, _multidict): return _multidict.CIMultiDict @pytest.fixture def proxy_cls(self, _multidict): return _multidict.CIMultiDictProxy @pytest.fixture def istr(self, _multidict): return _multidict.istr def test_getall(self, cls): d = cls([("KEY", "value1")], KEY="value2") assert d != {"KEY": "value1"} assert len(d) == 2 assert d.getall("key") == ["value1", "value2"] with 
pytest.raises(KeyError, match="some_key"): d.getall("some_key") def test_ctor(self, cls): d = cls(k1="v1") assert "v1" == d["K1"] assert ("k1", "v1") in d.items() def test_setitem(self, cls): d = cls() d["k1"] = "v1" assert "v1" == d["K1"] assert ("k1", "v1") in d.items() def test_delitem(self, cls): d = cls() d["k1"] = "v1" assert "K1" in d del d["k1"] assert "K1" not in d def test_copy(self, cls): d1 = cls(key="KEY", a="b") d2 = d1.copy() assert d1 == d2 assert d1.items() == d2.items() assert d1 is not d2 def test__repr__(self, cls): d = cls() assert str(d) == "<%s()>" % cls.__name__ d = cls([("KEY", "one"), ("KEY", "two")]) expected = "<%s('KEY': 'one', 'KEY': 'two')>" % cls.__name__ assert str(d) == expected def test_add(self, cls): d = cls() assert d == {} d["KEY"] = "one" assert ("KEY", "one") in d.items() assert d == cls({"Key": "one"}) assert d.getall("key") == ["one"] d["KEY"] = "two" assert ("KEY", "two") in d.items() assert d == cls({"Key": "two"}) assert d.getall("key") == ["two"] d.add("KEY", "one") assert ("KEY", "one") in d.items() assert 2 == len(d) assert d.getall("key") == ["two", "one"] d.add("FOO", "bar") assert ("FOO", "bar") in d.items() assert 3 == len(d) assert d.getall("foo") == ["bar"] d.add(key="test", value="test") assert ("test", "test") in d.items() assert 4 == len(d) assert d.getall("test") == ["test"] def test_extend(self, cls): d = cls() assert d == {} d.extend([("KEY", "one"), ("key", "two")], key=3, foo="bar") assert 4 == len(d) itms = d.items() # we can't guarantee order of kwargs assert ("KEY", "one") in itms assert ("key", "two") in itms assert ("key", 3) in itms assert ("foo", "bar") in itms other = cls(Bar="baz") assert other == {"Bar": "baz"} d.extend(other) assert ("Bar", "baz") in d.items() assert "bar" in d d.extend({"Foo": "moo"}) assert ("Foo", "moo") in d.items() assert "foo" in d d.extend() assert 6 == len(d) with pytest.raises(TypeError): d.extend("foo", "bar") def test_extend_from_proxy(self, cls, proxy_cls): d = cls([("a", "a"), ("b", "b")]) proxy = proxy_cls(d) d2 = cls() d2.extend(proxy) assert [("a", "a"), ("b", "b")] == list(d2.items()) def test_clear(self, cls): d = cls([("KEY", "one")], key="two", foo="bar") d.clear() assert d == {} assert list(d.items()) == [] def test_del(self, cls): d = cls([("KEY", "one"), ("key", "two")], foo="bar") del d["key"] assert d == {"foo": "bar"} assert list(d.items()) == [("foo", "bar")] with pytest.raises(KeyError, match="key"): del d["key"] def test_set_default(self, cls): d = cls([("KEY", "one"), ("key", "two")], foo="bar") assert "one" == d.setdefault("key", "three") assert "three" == d.setdefault("otherkey", "three") assert "otherkey" in d assert ("otherkey", "three") in d.items() assert "three" == d["OTHERKEY"] def test_popitem(self, cls): d = cls() d.add("KEY", "val1") d.add("key", "val2") pair = d.popitem() assert ("KEY", "val1") == pair assert isinstance(pair[0], str) assert [("key", "val2")] == list(d.items()) def test_popitem_empty_multidict(self, cls): d = cls() with pytest.raises(KeyError): d.popitem() def test_pop(self, cls): d = cls() d.add("KEY", "val1") d.add("key", "val2") assert "val1" == d.pop("KEY") assert {"key": "val2"} == d def test_pop_lowercase(self, cls): d = cls() d.add("KEY", "val1") d.add("key", "val2") assert "val1" == d.pop("key") assert {"key": "val2"} == d def test_pop_default(self, cls): d = cls(OTHER="val") assert "default" == d.pop("key", "default") assert "other" in d def test_pop_raises(self, cls): d = cls(OTHER="val") with pytest.raises(KeyError, match="KEY"): 
d.pop("KEY") assert "other" in d def test_extend_with_istr(self, cls, istr): us = istr("aBc") d = cls() d.extend([(us, "val")]) assert [("aBc", "val")] == list(d.items()) def test_copy_istr(self, cls, istr): d = cls({istr("Foo"): "bar"}) d2 = d.copy() assert d == d2 def test_eq(self, cls): d1 = cls(Key="val") d2 = cls(KEY="val") assert d1 == d2 @pytest.mark.skipif(sys.implementation.name == "pypy", reason="getsizeof() is not implemented on PyPy") def test_sizeof(self, cls): md = cls() s1 = sys.getsizeof(md) for i in string.ascii_lowercase: for j in string.ascii_uppercase: md[i + j] = i + j # multidict should be resized s2 = sys.getsizeof(md) assert s2 > s1 @pytest.mark.skipif(sys.implementation.name == "pypy", reason="getsizeof() is not implemented on PyPy") def test_min_sizeof(self, cls): md = cls() assert sys.getsizeof(md) < 1024 multidict-4.7.3/tests/test_mypy.py0000644000175100001650000000056213602414211017561 0ustar vstsdocker00000000000000import multidict def test_classes_not_abstract() -> None: d1 = multidict.MultiDict({"a": "b"}) # type: multidict.MultiDict[str] d2 = multidict.CIMultiDict({"a": "b"}) # type: multidict.CIMultiDict[str] d3 = multidict.MultiDictProxy(d1) d4 = multidict.CIMultiDictProxy(d2) d1.getone("a") d2.getall("a") d3.getone("a") d4.getall("a") multidict-4.7.3/tests/test_pickle.py0000644000175100001650000000413013602414211020025 0ustar vstsdocker00000000000000import pickle from pathlib import Path import pytest from multidict._compat import USE_CYTHON from multidict._multidict_py import CIMultiDict as PyCIMultiDict from multidict._multidict_py import CIMultiDictProxy as PyCIMultiDictProxy from multidict._multidict_py import MultiDict as PyMultiDict # noqa: E402 from multidict._multidict_py import MultiDictProxy as PyMultiDictProxy if USE_CYTHON: from multidict._multidict import ( MultiDict, CIMultiDict, MultiDictProxy, CIMultiDictProxy, ) here = Path(__file__).resolve().parent @pytest.fixture( params=(["MultiDict", "CIMultiDict"] if USE_CYTHON else []) + ["PyMultiDict", "PyCIMultiDict"] ) def cls_name(request): return request.param @pytest.fixture( params=([MultiDict, CIMultiDict] if USE_CYTHON else []) + [PyMultiDict, PyCIMultiDict], ids=(["MultiDict", "CIMultiDict"] if USE_CYTHON else []) + ["PyMultiDict", "PyCIMultiDict"], ) def cls(request): return request.param @pytest.fixture( params=( [(MultiDictProxy, MultiDict), (CIMultiDictProxy, CIMultiDict)] if USE_CYTHON else [] ) + [(PyMultiDictProxy, PyMultiDict), (PyCIMultiDictProxy, PyCIMultiDict)], ids=(["MultiDictProxy", "CIMultiDictProxy"] if USE_CYTHON else []) + ["PyMultiDictProxy", "PyCIMultiDictProxy"], ) def proxy_classes(request): return request.param def test_pickle(cls, pickle_protocol): d = cls([("a", 1), ("a", 2)]) pbytes = pickle.dumps(d, pickle_protocol) obj = pickle.loads(pbytes) assert d == obj assert isinstance(obj, cls) def test_pickle_proxy(proxy_classes): proxy_cls, dict_cls = proxy_classes d = dict_cls([("a", 1), ("a", 2)]) proxy = proxy_cls(d) with pytest.raises(TypeError): pickle.dumps(proxy) def test_load_from_file(pickle_protocol, cls_name): cls = globals()[cls_name] d = cls([("a", 1), ("a", 2)]) fname = "{}.pickle.{}".format(cls_name.lower(), pickle_protocol) p = here / fname with p.open("rb") as f: obj = pickle.load(f) assert d == obj assert isinstance(obj, cls) multidict-4.7.3/tests/test_types.py0000644000175100001650000000432313602414211017726 0ustar vstsdocker00000000000000import pytest def test_proxies(_multidict): assert issubclass(_multidict.CIMultiDictProxy, 
_multidict.MultiDictProxy) def test_dicts(_multidict): assert issubclass(_multidict.CIMultiDict, _multidict.MultiDict) def test_proxy_not_inherited_from_dict(_multidict): assert not issubclass(_multidict.MultiDictProxy, _multidict.MultiDict) def test_dict_not_inherited_from_proxy(_multidict): assert not issubclass(_multidict.MultiDict, _multidict.MultiDictProxy) def test_multidict_proxy_copy_type(_multidict): d = _multidict.MultiDict(key="val") p = _multidict.MultiDictProxy(d) assert isinstance(p.copy(), _multidict.MultiDict) def test_cimultidict_proxy_copy_type(_multidict): d = _multidict.CIMultiDict(key="val") p = _multidict.CIMultiDictProxy(d) assert isinstance(p.copy(), _multidict.CIMultiDict) def test_create_multidict_proxy_from_nonmultidict(_multidict): with pytest.raises(TypeError): _multidict.MultiDictProxy({}) def test_create_multidict_proxy_from_cimultidict(_multidict): d = _multidict.CIMultiDict(key="val") p = _multidict.MultiDictProxy(d) assert p == d def test_create_multidict_proxy_from_multidict_proxy_from_mdict(_multidict): d = _multidict.MultiDict(key="val") p = _multidict.MultiDictProxy(d) assert p == d p2 = _multidict.MultiDictProxy(p) assert p2 == p def test_create_cimultidict_proxy_from_cimultidict_proxy_from_ci(_multidict): d = _multidict.CIMultiDict(key="val") p = _multidict.CIMultiDictProxy(d) assert p == d p2 = _multidict.CIMultiDictProxy(p) assert p2 == p def test_create_cimultidict_proxy_from_nonmultidict(_multidict): with pytest.raises( TypeError, match=( "ctor requires CIMultiDict or CIMultiDictProxy instance, " "not " ), ): _multidict.CIMultiDictProxy({}) def test_create_ci_multidict_proxy_from_multidict(_multidict): d = _multidict.MultiDict(key="val") with pytest.raises( TypeError, match=( "ctor requires CIMultiDict or CIMultiDictProxy instance, " "not " ), ): _multidict.CIMultiDictProxy(d) multidict-4.7.3/tests/test_update.py0000644000175100001650000000614613602414211020051 0ustar vstsdocker00000000000000import pytest from multidict._compat import USE_CYTHON from multidict._multidict_py import CIMultiDict as PyCIMultiDict from multidict._multidict_py import MultiDict as PyMultiDict # noqa: E402 if USE_CYTHON: from multidict._multidict import MultiDict, CIMultiDict @pytest.fixture( params=([MultiDict, CIMultiDict] if USE_CYTHON else []) + [PyMultiDict, PyCIMultiDict], ids=(["MultiDict", "CIMultiDict"] if USE_CYTHON else []) + ["PyMultiDict", "PyCIMultiDict"], ) def cls(request): return request.param @pytest.fixture def md_cls(_multidict): return _multidict.MultiDict @pytest.fixture def ci_md_cls(_multidict): return _multidict.CIMultiDict @pytest.fixture def istr(_multidict): return _multidict.istr def test_update_replace(cls): obj1 = cls([("a", 1), ("b", 2), ("a", 3), ("c", 10)]) obj2 = cls([("a", 4), ("b", 5), ("a", 6)]) obj1.update(obj2) expected = [("a", 4), ("b", 5), ("a", 6), ("c", 10)] assert list(obj1.items()) == expected def test_update_append(cls): obj1 = cls([("a", 1), ("b", 2), ("a", 3), ("c", 10)]) obj2 = cls([("a", 4), ("a", 5), ("a", 6)]) obj1.update(obj2) expected = [("a", 4), ("b", 2), ("a", 5), ("c", 10), ("a", 6)] assert list(obj1.items()) == expected def test_update_remove(cls): obj1 = cls([("a", 1), ("b", 2), ("a", 3), ("c", 10)]) obj2 = cls([("a", 4)]) obj1.update(obj2) expected = [("a", 4), ("b", 2), ("c", 10)] assert list(obj1.items()) == expected def test_update_replace_seq(cls): obj1 = cls([("a", 1), ("b", 2), ("a", 3), ("c", 10)]) obj2 = [("a", 4), ("b", 5), ("a", 6)] obj1.update(obj2) expected = [("a", 4), ("b", 5), ("a", 6), ("c", 
10)] assert list(obj1.items()) == expected def test_update_replace_seq2(cls): obj1 = cls([("a", 1), ("b", 2), ("a", 3), ("c", 10)]) obj1.update([("a", 4)], b=5, a=6) expected = [("a", 4), ("b", 5), ("a", 6), ("c", 10)] assert list(obj1.items()) == expected def test_update_append_seq(cls): obj1 = cls([("a", 1), ("b", 2), ("a", 3), ("c", 10)]) obj2 = [("a", 4), ("a", 5), ("a", 6)] obj1.update(obj2) expected = [("a", 4), ("b", 2), ("a", 5), ("c", 10), ("a", 6)] assert list(obj1.items()) == expected def test_update_remove_seq(cls): obj1 = cls([("a", 1), ("b", 2), ("a", 3), ("c", 10)]) obj2 = [("a", 4)] obj1.update(obj2) expected = [("a", 4), ("b", 2), ("c", 10)] assert list(obj1.items()) == expected def test_update_md(md_cls): d = md_cls() d.add("key", "val1") d.add("key", "val2") d.add("key2", "val3") d.update(key="val") assert [("key", "val"), ("key2", "val3")] == list(d.items()) def test_update_istr_ci_md(ci_md_cls, istr): d = ci_md_cls() d.add(istr("KEY"), "val1") d.add("key", "val2") d.add("key2", "val3") d.update({istr("key"): "val"}) assert [("key", "val"), ("key2", "val3")] == list(d.items()) def test_update_ci_md(ci_md_cls): d = ci_md_cls() d.add("KEY", "val1") d.add("key", "val2") d.add("key2", "val3") d.update(Key="val") assert [("Key", "val"), ("key2", "val3")] == list(d.items()) multidict-4.7.3/tests/test_version.py0000644000175100001650000001061413602414211020247 0ustar vstsdocker00000000000000import pytest from multidict._compat import USE_CYTHON from multidict._multidict_py import CIMultiDict as _CIMultiDict from multidict._multidict_py import MultiDict as _MultiDict # noqa: E402 from multidict._multidict_py import getversion as _getversion if USE_CYTHON: from multidict._multidict import MultiDict, CIMultiDict, getversion class VersionMixin: cls = NotImplemented def getver(self, md): raise NotImplementedError def test_getversion_bad_param(self): with pytest.raises(TypeError): self.getver(1) def test_ctor(self): m1 = self.cls() v1 = self.getver(m1) m2 = self.cls() v2 = self.getver(m2) assert v1 != v2 def test_add(self): m = self.cls() v = self.getver(m) m.add("key", "val") assert self.getver(m) > v def test_delitem(self): m = self.cls() m.add("key", "val") v = self.getver(m) del m["key"] assert self.getver(m) > v def test_delitem_not_found(self): m = self.cls() m.add("key", "val") v = self.getver(m) with pytest.raises(KeyError): del m["notfound"] assert self.getver(m) == v def test_setitem(self): m = self.cls() m.add("key", "val") v = self.getver(m) m["key"] = "val2" assert self.getver(m) > v def test_setitem_not_found(self): m = self.cls() m.add("key", "val") v = self.getver(m) m["notfound"] = "val2" assert self.getver(m) > v def test_clear(self): m = self.cls() m.add("key", "val") v = self.getver(m) m.clear() assert self.getver(m) > v def test_setdefault(self): m = self.cls() m.add("key", "val") v = self.getver(m) m.setdefault("key2", "val2") assert self.getver(m) > v def test_popone(self): m = self.cls() m.add("key", "val") v = self.getver(m) m.popone("key") assert self.getver(m) > v def test_popone_default(self): m = self.cls() m.add("key", "val") v = self.getver(m) m.popone("key2", "default") assert self.getver(m) == v def test_popone_key_error(self): m = self.cls() m.add("key", "val") v = self.getver(m) with pytest.raises(KeyError): m.popone("key2") assert self.getver(m) == v def test_pop(self): m = self.cls() m.add("key", "val") v = self.getver(m) m.pop("key") assert self.getver(m) > v def test_pop_default(self): m = self.cls() m.add("key", "val") v = self.getver(m) 
m.pop("key2", "default") assert self.getver(m) == v def test_pop_key_error(self): m = self.cls() m.add("key", "val") v = self.getver(m) with pytest.raises(KeyError): m.pop("key2") assert self.getver(m) == v def test_popall(self): m = self.cls() m.add("key", "val") v = self.getver(m) m.popall("key") assert self.getver(m) > v def test_popall_default(self): m = self.cls() m.add("key", "val") v = self.getver(m) m.popall("key2", "default") assert self.getver(m) == v def test_popall_key_error(self): m = self.cls() m.add("key", "val") v = self.getver(m) with pytest.raises(KeyError): m.popall("key2") assert self.getver(m) == v def test_popitem(self): m = self.cls() m.add("key", "val") v = self.getver(m) m.popitem() assert self.getver(m) > v def test_popitem_key_error(self): m = self.cls() v = self.getver(m) with pytest.raises(KeyError): m.popitem() assert self.getver(m) == v if USE_CYTHON: class TestMultiDict(VersionMixin): cls = MultiDict def getver(self, md): return getversion(md) if USE_CYTHON: class TestCIMultiDict(VersionMixin): cls = CIMultiDict def getver(self, md): return getversion(md) class TestPyMultiDict(VersionMixin): cls = _MultiDict def getver(self, md): return _getversion(md) class TestPyCIMultiDict(VersionMixin): cls = _CIMultiDict def getver(self, md): return _getversion(md)
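Editor's note (not part of the original 4.7.3 archive): the test modules above exercise the public multidict API — multiple values per key, read-only proxy views, and case-insensitive keys. The short sketch below is an illustrative usage example of that same public API (MultiDict, MultiDictProxy, CIMultiDict, istr); it assumes the installed multidict package and mirrors behaviours asserted in the tests rather than adding anything new.

# Minimal usage sketch of the behaviours covered by the test suite above.
from multidict import CIMultiDict, MultiDict, MultiDictProxy, istr

md = MultiDict([("key", "one")])
md.add("key", "two")                         # keeps both pairs, unlike dict
assert md.getall("key") == ["one", "two"]    # all values for a key
assert md.getone("key") == "one"             # first value only

proxy = MultiDictProxy(md)                   # read-only view over md
assert proxy["key"] == "one"

ci = CIMultiDict([(istr("Content-Type"), "text/plain")])
assert ci["content-type"] == "text/plain"    # lookups ignore case
assert list(ci.keys()) == ["Content-Type"]   # original casing is preserved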