astroid-3.2.2/0000775000175000017500000000000014622475517013106 5ustar epsilonepsilonastroid-3.2.2/MANIFEST.in0000664000175000017500000000015714622475517014647 0ustar epsilonepsiloninclude README.rst include requirements*.txt include tox.ini recursive-include tests *.py graft tests/testdata astroid-3.2.2/setup.cfg0000664000175000017500000000044414622475517014731 0ustar epsilonepsilon# Setuptools v62.6 doesn't support editable installs with just 'pyproject.toml' (PEP 660). # Keep this file until it does! [metadata] # wheel doesn't yet read license_files from pyproject.toml - tools.setuptools # Keep it here until it does! license_files = LICENSE CONTRIBUTORS.txt astroid-3.2.2/README.rst0000664000175000017500000000543014622475517014577 0ustar epsilonepsilonAstroid ======= .. image:: https://codecov.io/gh/pylint-dev/astroid/branch/main/graph/badge.svg?token=Buxy4WptLb :target: https://codecov.io/gh/pylint-dev/astroid :alt: Coverage badge from codecov .. image:: https://readthedocs.org/projects/astroid/badge/?version=latest :target: http://astroid.readthedocs.io/en/latest/?badge=latest :alt: Documentation Status .. image:: https://img.shields.io/badge/code%20style-black-000000.svg :target: https://github.com/ambv/black .. image:: https://results.pre-commit.ci/badge/github/pylint-dev/astroid/main.svg :target: https://results.pre-commit.ci/latest/github/pylint-dev/astroid/main :alt: pre-commit.ci status .. |tidelift_logo| image:: https://raw.githubusercontent.com/pylint-dev/astroid/main/doc/media/Tidelift_Logos_RGB_Tidelift_Shorthand_On-White.png :width: 200 :alt: Tidelift .. list-table:: :widths: 10 100 * - |tidelift_logo| - Professional support for astroid is available as part of the `Tidelift Subscription`_. Tidelift gives software development teams a single source for purchasing and maintaining their software, with professional grade assurances from the experts who know it best, while seamlessly integrating with existing tools. .. 
_Tidelift Subscription: https://tidelift.com/subscription/pkg/pypi-astroid?utm_source=pypi-astroid&utm_medium=referral&utm_campaign=readme

What's this?
------------

The aim of this module is to provide a common base representation of Python
source code. It is currently the library powering pylint's capabilities.

It provides a compatible representation which comes from the `_ast` module. It
rebuilds the tree generated by the builtin `_ast` module by recursively walking
down the AST and building an extended AST. The new node classes have additional
methods and attributes for different usages. They include some support for
static inference and local name scopes. Furthermore, astroid can also build
partial trees by inspecting living objects.

Installation
------------

Extract the tarball, jump into the created directory and run::

    pip install .

If you want to do an editable installation, you can run::

    pip install -e .

If you have any questions, please mail the code-quality@python.org mailing list
for support. See http://mail.python.org/mailman/listinfo/code-quality for
subscription information and archives.

Documentation
-------------

http://astroid.readthedocs.io/en/latest/

Python Versions
---------------

astroid 2.0 is currently available for Python 3 only. If you want Python 2
support, use an older version of astroid (though note that those versions are
no longer supported).

Test
----

Tests are in the 'tests' subdirectory.
To launch the whole tests suite, you can use either `tox` or `pytest`:: tox pytest astroid-3.2.2/requirements_full.txt0000664000175000017500000000032314622475517017412 0ustar epsilonepsilon-r requirements_minimal.txt -r requirements_dev.txt # Packages used to run additional tests attrs nose numpy>=1.17.0; python_version<"3.12" python-dateutil PyQt6 regex six urllib3>1,<2 typing_extensions>=4.4.0 astroid-3.2.2/doc/0000775000175000017500000000000014622475517013653 5ustar epsilonepsilonastroid-3.2.2/doc/requirements.txt0000664000175000017500000000004114622475517017132 0ustar epsilonepsilon-e . sphinx~=7.3 furo==2024.4.27 astroid-3.2.2/doc/api/0000775000175000017500000000000014622475517014424 5ustar epsilonepsilonastroid-3.2.2/doc/api/astroid.exceptions.rst0000664000175000017500000000112214622475517020777 0ustar epsilonepsilonExceptions ========== .. automodule:: astroid.exceptions .. rubric:: Exceptions .. autosummary:: AstroidBuildingError AstroidError AstroidImportError AstroidIndexError AstroidSyntaxError AstroidTypeError AttributeInferenceError DuplicateBasesError InconsistentMroError InferenceError MroError NameInferenceError NoDefault NotFoundError ParentMissingError ResolveError SuperArgumentTypeError SuperError TooManyLevelsError UnresolvableName UseInferenceDefault astroid-3.2.2/doc/api/general.rst0000664000175000017500000000006214622475517016571 0ustar epsilonepsilonGeneral API ------------ .. automodule:: astroid astroid-3.2.2/doc/api/index.rst0000664000175000017500000000017014622475517016263 0ustar epsilonepsilon API === .. toctree:: :maxdepth: 2 :titlesonly: general astroid.nodes base_nodes astroid.exceptions astroid-3.2.2/doc/api/astroid.nodes.rst0000664000175000017500000000412114622475517017730 0ustar epsilonepsilonNodes ===== .. _nodes: Nodes ----- .. 
autosummary:: :toctree: nodes :template: autosummary_class.rst astroid.nodes.AnnAssign astroid.nodes.Arguments astroid.nodes.Assert astroid.nodes.Assign astroid.nodes.AssignAttr astroid.nodes.AssignName astroid.nodes.AsyncFor astroid.nodes.AsyncFunctionDef astroid.nodes.AsyncWith astroid.nodes.Attribute astroid.nodes.AugAssign astroid.nodes.Await astroid.nodes.BinOp astroid.nodes.BoolOp astroid.nodes.Break astroid.nodes.Call astroid.nodes.ClassDef astroid.nodes.Compare astroid.nodes.Comprehension astroid.nodes.Const astroid.nodes.Continue astroid.nodes.Decorators astroid.nodes.DelAttr astroid.nodes.DelName astroid.nodes.Delete astroid.nodes.Dict astroid.nodes.DictComp astroid.nodes.DictUnpack astroid.nodes.EmptyNode astroid.nodes.ExceptHandler astroid.nodes.Expr astroid.nodes.For astroid.nodes.FormattedValue astroid.nodes.FunctionDef astroid.nodes.GeneratorExp astroid.nodes.Global astroid.nodes.If astroid.nodes.IfExp astroid.nodes.Import astroid.nodes.ImportFrom astroid.nodes.JoinedStr astroid.nodes.Keyword astroid.nodes.Lambda astroid.nodes.List astroid.nodes.ListComp astroid.nodes.Match astroid.nodes.MatchAs astroid.nodes.MatchCase astroid.nodes.MatchClass astroid.nodes.MatchMapping astroid.nodes.MatchOr astroid.nodes.MatchSequence astroid.nodes.MatchSingleton astroid.nodes.MatchStar astroid.nodes.MatchValue astroid.nodes.Module astroid.nodes.Name astroid.nodes.Nonlocal astroid.nodes.ParamSpec astroid.nodes.Pass astroid.nodes.Raise astroid.nodes.Return astroid.nodes.Set astroid.nodes.SetComp astroid.nodes.Slice astroid.nodes.Starred astroid.nodes.Subscript astroid.nodes.Try astroid.nodes.TryStar astroid.nodes.Tuple astroid.nodes.TypeAlias astroid.nodes.TypeVar astroid.nodes.TypeVarTuple astroid.nodes.UnaryOp astroid.nodes.Unknown astroid.nodes.While astroid.nodes.With astroid.nodes.Yield astroid.nodes.YieldFrom astroid-3.2.2/doc/api/base_nodes.rst0000664000175000017500000000051214622475517017256 0ustar epsilonepsilonBase Nodes ========== These are abstract node 
classes that :ref:`other nodes ` inherit from. .. autosummary:: :toctree: base_nodes :template: autosummary_class.rst astroid.nodes.BaseContainer astroid.nodes.ComprehensionScope astroid.nodes.LocalsDictNodeNG astroid.nodes.NodeNG astroid.nodes.Pattern astroid-3.2.2/doc/changelog.rst0000664000175000017500000000003214622475517016327 0ustar epsilonepsilon.. include:: ../ChangeLog astroid-3.2.2/doc/conf.py0000664000175000017500000000510014622475517015146 0ustar epsilonepsilon# Licensed under the LGPL: https://www.gnu.org/licenses/old-licenses/lgpl-2.1.en.html # For details: https://github.com/pylint-dev/astroid/blob/main/LICENSE # Copyright (c) https://github.com/pylint-dev/astroid/blob/main/CONTRIBUTORS.txt import os import sys from datetime import datetime # If extensions (or modules to document with autodoc) are in another directory, # add these directories to sys.path here. If the directory is relative to the # documentation root, use os.path.abspath to make it absolute, like shown here. sys.path.insert(0, os.path.abspath("..")) # -- General configuration ----------------------------------------------------- # Add any Sphinx extension module names here, as strings. They can be extensions # coming with Sphinx (named 'sphinx.ext.*') or your custom ones. extensions = [ "sphinx.ext.autodoc", "sphinx.ext.autosummary", "sphinx.ext.intersphinx", "sphinx.ext.napoleon", "sphinx.ext.viewcode", ] # Add any paths that contain templates here, relative to this directory. templates_path = ["_templates"] # The suffix of source filenames. source_suffix = ".rst" # The master toctree document. root_doc = "index" # General information about the project. 
project = "Astroid" current_year = datetime.utcnow().year contributors = "Logilab, and astroid contributors" copyright = f"2003-{current_year}, {contributors}" from astroid.__pkginfo__ import __version__ # noqa release = __version__ # List of patterns, relative to source directory, that match files and # directories to ignore when looking for source files. exclude_patterns = ["_build"] # The name of the Pygments (syntax highlighting) style to use. pygments_style = "sphinx" # -- Options for HTML output --------------------------------------------------- # The theme to use for HTML and HTML Help pages. See the documentation for # a list of builtin themes. html_theme = "furo" # Add any paths that contain custom static files (such as style sheets) here, # relative to this directory. They are copied after the builtin static files, # so a file named "default.css" will overwrite the builtin "default.css". html_static_path = ["media"] # Output file base name for HTML help builder. htmlhelp_basename = "Pylintdoc" # -- Options for Autodoc ------------------------------------------------------- autodoc_default_options = { "members": True, "show-inheritance": True, "undoc-members": True, } intersphinx_mapping = { # Use dev so that the documentation builds when we are adding support for # upcoming Python versions. "python": ("https://docs.python.org/dev", None), } astroid-3.2.2/doc/release.md0000664000175000017500000001175414622475517015625 0ustar epsilonepsilon# Releasing an astroid version So, you want to release the `X.Y.Z` version of astroid ? ## Releasing a major or minor version **Before releasing a major or minor version check if there are any unreleased commits on the maintenance branch. If so, release a last patch release first. See `Releasing a patch version`.** - Remove the empty changelog for the last unreleased patch version `X.Y-1.Z'`. (For example: `v2.3.5`) - Check the result of `git diff vX.Y-1.Z' ChangeLog`. 
  (For example: `git diff v2.3.4 ChangeLog`)
- Install the release dependencies: `pip3 install -r requirements_minimal.txt`
- Bump the version and release by using `tbump X.Y.0 --no-push --no-tag`.
  (For example: `tbump 2.4.0 --no-push --no-tag`)
- Check the commit created with `git show`; amend the commit if required.
- Move the `main` branch up to a dev version with `tbump`:

```bash
tbump X.Y+1.0-dev0 --no-tag --no-push  # You can interrupt after the first step
git commit -am "Upgrade the version to x.y+1.0-dev0 following x.y.0 release"
```

For example:

```bash
tbump 2.5.0-dev0 --no-tag --no-push
git commit -am "Upgrade the version to 2.5.0-dev0 following 2.4.0 release"
```

Check the commit and then push to a release branch.

- Open a merge request with the two commits (no one can push directly on `main`)
- Trigger the "release tests" workflow in GitHub Actions.
- After the merge, recover the merged commits on `main` and tag the first one
  (the version should be `X.Y.Z`) as `vX.Y.Z`. (For example: `v2.4.0`)
- Push the tag.
- Release the version on GitHub with the same name as the tag and copy and
  paste the appropriate changelog in the description. This triggers the PyPI
  release.
- Delete the `maintenance/X.Y-1.x` branch. (For example: `maintenance/2.3.x`)
- Create a `maintenance/X.Y.x` branch (for example: `maintenance/2.4.x`) based
  on the tag from the release. The maintenance branches are protected; you
  won't be able to fix the branch after the fact if you create it from `main`.

## Backporting a fix from `main` to the maintenance branch

Whenever a PR on `main` should be released in a patch release on the current
maintenance branch:

- Label the PR with `backport maintenance/X.Y-1.x`.
  (For example `backport maintenance/2.3.x`)
- Squash the PR before merging (alternatively rebase if there's a single commit)
- (If the automated cherry-pick has conflicts)
  - Add a `Needs backport` label and do it manually.
- You might alternatively also: - Cherry-pick the changes that create the conflict if it's not a new feature before doing the original PR cherry-pick manually. - Decide to wait for the next minor to release the PR - In any case upgrade the milestones in the original PR and newly cherry-picked PR to match reality. - Release a patch version ## Releasing a patch version We release patch versions when a crash or a bug is fixed on the main branch and has been cherry-picked on the maintenance branch. Below, we will be releasing X.Y-1.Z (where X.Y is the version under development on `main`.) - Branch `release/X.Y-1.Z` off of `maintenance/X.Y.x` - Check the result of `git diff vX.Y-1.Z-1 ChangeLog`. (For example: `git diff v2.3.4 ChangeLog`) - Install the release dependencies: `pip3 install -r requirements_minimal.txt` - Bump the version and release by using `tbump X.Y-1.Z --no-tag --no-push`. (For example: `tbump 2.3.5 --no-tag --no-push`. We're not ready to tag before code review.) - Check the result visually with `git show`. - Open a merge request against `maintenance/X.Y-1.x` to run the CI tests for this branch. - Consider copying the changelog into the body of the PR to examine the rendered markdown. - Wait for an approval. Avoid using a merge commit. Avoid deleting the maintenance branch. - Checkout `maintenance/X.Y.x` and fast-forward to the new commit. - Create and push the tag: `git tag vX.Y-1.Z` && `git push --tags` - Release the version on GitHub with the same name as the tag and copy and paste the appropriate changelog in the description. This triggers the PyPI release. - Freeze the main branch. - Branch `post-X.Y-1.Z` from `main`. - `git merge maintenance/X.Y-1.x`: this should have the changelog for `X.Y-1.Z+1` (For example `v2.3.6`). This merge is required so `pre-commit autoupdate` works for pylint. - Fix version conflicts properly, meaning preserve the version numbers of the form `X.Y.0-devZ` (For example: `2.4.0-dev6`). - Open a merge request against main. 
Ensure a merge commit is used, because pre-commit need the patch release tag to be in the main branch history to consider the patch release as the latest version and this won't be the case with rebase or squash. You can defend against trigger-happy future selves by enabling auto-merge with the merge commit strategy. - Wait for approval. Again, use a merge commit. - Unblock the main branch. - Close the milestone and open a new patch-release milestone. ## Milestone handling We move issues that were not done to the next milestone and block releases only if there are any open issues labelled as `blocker`. astroid-3.2.2/doc/make.bat0000664000175000017500000001064514622475517015266 0ustar epsilonepsilon@ECHO OFF REM Command file for Sphinx documentation if "%SPHINXBUILD%" == "" ( set SPHINXBUILD=sphinx-build ) set BUILDDIR=build set ALLSPHINXOPTS=-d %BUILDDIR%/doctrees %SPHINXOPTS% source if NOT "%PAPER%" == "" ( set ALLSPHINXOPTS=-D latex_paper_size=%PAPER% %ALLSPHINXOPTS% ) if "%1" == "" goto help if "%1" == "help" ( :help echo.Please use `make ^` where ^ is one of echo. html to make standalone HTML files echo. dirhtml to make HTML files named index.html in directories echo. singlehtml to make a single large HTML file echo. pickle to make pickle files echo. json to make JSON files echo. htmlhelp to make HTML files and a HTML help project echo. qthelp to make HTML files and a qthelp project echo. devhelp to make HTML files and a Devhelp project echo. epub to make an epub echo. latex to make LaTeX files, you can set PAPER=a4 or PAPER=letter echo. text to make text files echo. man to make manual pages echo. changes to make an overview over all changed/added/deprecated items echo. linkcheck to check all external links for integrity echo. 
doctest to run all doctests embedded in the documentation if enabled goto end ) if "%1" == "clean" ( for /d %%i in (%BUILDDIR%\*) do rmdir /q /s %%i del /q /s %BUILDDIR%\* goto end ) if "%1" == "html" ( %SPHINXBUILD% -b html %ALLSPHINXOPTS% %BUILDDIR%/html if errorlevel 1 exit /b 1 echo. echo.Build finished. The HTML pages are in %BUILDDIR%/html. goto end ) if "%1" == "dirhtml" ( %SPHINXBUILD% -b dirhtml %ALLSPHINXOPTS% %BUILDDIR%/dirhtml if errorlevel 1 exit /b 1 echo. echo.Build finished. The HTML pages are in %BUILDDIR%/dirhtml. goto end ) if "%1" == "singlehtml" ( %SPHINXBUILD% -b singlehtml %ALLSPHINXOPTS% %BUILDDIR%/singlehtml if errorlevel 1 exit /b 1 echo. echo.Build finished. The HTML pages are in %BUILDDIR%/singlehtml. goto end ) if "%1" == "pickle" ( %SPHINXBUILD% -b pickle %ALLSPHINXOPTS% %BUILDDIR%/pickle if errorlevel 1 exit /b 1 echo. echo.Build finished; now you can process the pickle files. goto end ) if "%1" == "json" ( %SPHINXBUILD% -b json %ALLSPHINXOPTS% %BUILDDIR%/json if errorlevel 1 exit /b 1 echo. echo.Build finished; now you can process the JSON files. goto end ) if "%1" == "htmlhelp" ( %SPHINXBUILD% -b htmlhelp %ALLSPHINXOPTS% %BUILDDIR%/htmlhelp if errorlevel 1 exit /b 1 echo. echo.Build finished; now you can run HTML Help Workshop with the ^ .hhp project file in %BUILDDIR%/htmlhelp. goto end ) if "%1" == "qthelp" ( %SPHINXBUILD% -b qthelp %ALLSPHINXOPTS% %BUILDDIR%/qthelp if errorlevel 1 exit /b 1 echo. echo.Build finished; now you can run "qcollectiongenerator" with the ^ .qhcp project file in %BUILDDIR%/qthelp, like this: echo.^> qcollectiongenerator %BUILDDIR%\qthelp\Astroid.qhcp echo.To view the help file: echo.^> assistant -collectionFile %BUILDDIR%\qthelp\Astroid.ghc goto end ) if "%1" == "devhelp" ( %SPHINXBUILD% -b devhelp %ALLSPHINXOPTS% %BUILDDIR%/devhelp if errorlevel 1 exit /b 1 echo. echo.Build finished. goto end ) if "%1" == "epub" ( %SPHINXBUILD% -b epub %ALLSPHINXOPTS% %BUILDDIR%/epub if errorlevel 1 exit /b 1 echo. 
echo.Build finished. The epub file is in %BUILDDIR%/epub. goto end ) if "%1" == "latex" ( %SPHINXBUILD% -b latex %ALLSPHINXOPTS% %BUILDDIR%/latex if errorlevel 1 exit /b 1 echo. echo.Build finished; the LaTeX files are in %BUILDDIR%/latex. goto end ) if "%1" == "text" ( %SPHINXBUILD% -b text %ALLSPHINXOPTS% %BUILDDIR%/text if errorlevel 1 exit /b 1 echo. echo.Build finished. The text files are in %BUILDDIR%/text. goto end ) if "%1" == "man" ( %SPHINXBUILD% -b man %ALLSPHINXOPTS% %BUILDDIR%/man if errorlevel 1 exit /b 1 echo. echo.Build finished. The manual pages are in %BUILDDIR%/man. goto end ) if "%1" == "changes" ( %SPHINXBUILD% -b changes %ALLSPHINXOPTS% %BUILDDIR%/changes if errorlevel 1 exit /b 1 echo. echo.The overview file is in %BUILDDIR%/changes. goto end ) if "%1" == "linkcheck" ( %SPHINXBUILD% -b linkcheck %ALLSPHINXOPTS% %BUILDDIR%/linkcheck if errorlevel 1 exit /b 1 echo. echo.Link check complete; look for any errors in the above output ^ or in %BUILDDIR%/linkcheck/output.txt. goto end ) if "%1" == "doctest" ( %SPHINXBUILD% -b doctest %ALLSPHINXOPTS% %BUILDDIR%/doctest if errorlevel 1 exit /b 1 echo. echo.Testing of doctests in the sources finished, look at the ^ results in %BUILDDIR%/doctest/output.txt. goto end ) :end astroid-3.2.2/doc/inference.rst0000664000175000017500000000704014622475517016344 0ustar epsilonepsilon.. _inference: Inference Introduction ====================== What/where is 'inference' ? --------------------------- The inference is a mechanism through which *astroid* tries to interpret statically your Python code. How does it work ? ------------------ The magic is handled by :meth:`NodeNG.infer` method. *astroid* usually provides inference support for various Python primitives, such as protocols and statements, but it can also be enriched via `inference transforms`. In both cases the :meth:`infer` must return a *generator* which iterates through the various *values* the node could take. 
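For instance, here is a minimal sketch of consuming the generator returned by :meth:`infer`. The two-branch assignment below is a made-up example, chosen so that the node can take more than one value:

```python
import astroid

# ``x`` is assigned in two branches that astroid cannot decide between
# statically, so inference yields both candidate values.
node = astroid.extract_node('''
import random
if random.random() > 0.5:
    x = 1
else:
    x = 2
x
''')

# ``infer()`` returns a generator of potential values; here each yielded
# value is a Const node wrapping the inferred constant.
print(sorted(value.value for value in node.infer()))  # [1, 2]
```

Each yielded object is itself a node (or one of the special inference values described below), so you can keep walking or inferring from it.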
In some cases the value yielded will not be a node found in the AST of the node
but an instance of a special inference class such as :obj:`Uninferable` or
:class:`Instance`. Namely, the special singleton :obj:`Uninferable` is yielded
when the inference reaches a point where it can't follow the code and is thus
unable to guess a value, while instances of the :class:`Instance` class are
yielded when the current node is inferred to be an instance of some known
class.

Crash course into astroid's inference
-------------------------------------

Let's see some examples of how inference works in ``astroid``. First we'll
need to take a detour through some of ``astroid``'s APIs. ``astroid`` offers a
relatively similar API to the builtin ``ast`` module, that is, you can do
``astroid.parse(string)`` to get an AST out of the given string::

    >>> tree = astroid.parse('a + b')
    >>> tree
    >>>
    >>> print(tree.repr_tree())
    Module(
       name='',
       doc=None,
       file='',
       path=[''],
       package=False,
       pure_python=True,
       future_imports=set(),
       body=[Expr(value=BinOp(
                op='+',
                left=Name(name='a'),
                right=Name(name='b')))])

The :meth:`repr_tree` method is super useful to inspect how a tree actually
looks. Most of the time you can access the same fields as those represented in
the output of :meth:`repr_tree`, so you can do ``tree.body[0].value.left`` to
get the left-hand side operand of the addition operation.

Another useful function is :func:`astroid.extract_node`, which, given a
string, tries to extract one or more nodes from it::

    >>> node = astroid.extract_node('''
    ... a = 1
    ... b = 2
    ... c
    ... ''')

In that example, the node that is going to be returned is the last node from
the tree, so it will be the ``Name(c)`` node. You can also use
:func:`astroid.extract_node` to extract multiple nodes::

    >>> nodes = astroid.extract_node('''
    ... a = 1 #@
    ... b = 2 #@
    ... c
    ... ''')

You can use the ``#@`` comment to annotate the lines for which you want the
corresponding nodes to be extracted.
In that example, what we're going to extract is two ``Expr`` nodes, which are,
in astroid's parlance, two statements; you can access their underlying
``Assign`` nodes using the ``.value`` attribute.

Now let's see how we can use ``astroid`` to infer what's going on with your
code. The main method that you can use is :meth:`infer`. It returns a
generator with all the potential values that ``astroid`` can extract for a
piece of code::

    >>> name_node = astroid.extract_node('''
    ... a = 1
    ... b = 2
    ... c = a + b
    ... c
    ... ''')
    >>> inferred = next(name_node.infer())
    >>> inferred
    >>> inferred.value
    3

From this example you can see that ``astroid`` is capable of *inferring* what
``c`` might hold, which is a constant value with the number 3.

astroid-3.2.2/doc/index.rst

.. Astroid documentation main file, created by sphinx-quickstart on
   Wed Jun 26 15:00:40 2013. You can adapt this file completely to your
   liking, but it should at least contain the root `toctree` directive.

.. Please see the documentation for the Sphinx Python domain :
   http://sphinx-doc.org/domains.html#the-python-domain and the autodoc
   extension http://sphinx-doc.org/ext/autodoc.html

Welcome to astroid's documentation!
===================================

**astroid** is a library for AST parsing, static analysis and inference,
currently powering most of **pylint** capabilities. It offers support for
parsing Python source code into ASTs, similar to how the builtin **ast**
module works.
On top of that, it can partially infer various Python constructs, as seen in
the following example::

    from astroid import parse
    module = parse('''
    def func(first, second):
        return first + second

    arg_1 = 2
    arg_2 = 3
    func(arg_1, arg_2)
    ''')

    >>> module.body[-1]
    >>> inferred = next(module.body[-1].value.infer())
    >>> inferred
    >>> inferred.value
    5

**astroid** also allows the user to write various inference transforms for
enhancing its Python understanding, also helping **pylint** in the process of
figuring out the dynamic nature of Python.

Support
-------

.. image:: media/Tidelift_Logos_RGB_Tidelift_Shorthand_On-White.png
   :width: 75
   :alt: Tidelift
   :align: left
   :class: tideliftlogo

Professional support for astroid is available as part of the `Tidelift
Subscription`_. Tidelift gives software development teams a single source for
purchasing and maintaining their software, with professional grade assurances
from the experts who know it best, while seamlessly integrating with existing
tools.

.. _Tidelift Subscription: https://tidelift.com/subscription/pkg/pypi-astroid?utm_source=pypi-astroid&utm_medium=referral&utm_campaign=readme

.. toctree::
   :maxdepth: 2
   :hidden:

   inference
   extending
   api/index
   whatsnew

.. toctree::
   :hidden:
   :caption: Indices

   genindex
   modindex

astroid-3.2.2/doc/Makefile

# Makefile for Sphinx documentation

# You can set these variables from the command line.
SPHINXOPTS    =
SPHINXBUILD   = sphinx-build
PAPER         =
BUILDDIR      = build

# Internal variables.
ALLSPHINXOPTS = -d $(BUILDDIR)/doctrees $(PAPEROPT_$(PAPER)) $(SPHINXOPTS) -n .

.PHONY: help clean html linkcheck

help:
	@echo "Please use \`make <target>' where <target> is one of"
	@echo "  html      to make standalone HTML files"
	@echo "  linkcheck to check all external links for integrity"

clean:
	-rm -rf $(BUILDDIR)/*

html:
	$(SPHINXBUILD) -b html $(ALLSPHINXOPTS) $(BUILDDIR)/html
	@echo
	@echo "Build finished. The HTML pages are in $(BUILDDIR)/html."
linkcheck:
	$(SPHINXBUILD) -b linkcheck $(ALLSPHINXOPTS) $(BUILDDIR)/linkcheck
	@echo
	@echo "Link check complete; look for any errors in the above output " \
	      "or in $(BUILDDIR)/linkcheck/output.txt."

astroid-3.2.2/doc/media/Tidelift_Logos_RGB_Tidelift_Shorthand_On-White_small.png
[binary PNG image data omitted]
astroid-3.2.2/doc/media/Tidelift_Logos_RGB_Tidelift_Shorthand_On-White.png
[binary PNG image data omitted]
astroid-3.2.2/doc/whatsnew.rst

######################
What's New in astroid
######################

The "Changelog" contains *all* nontrivial changes to astroid for the current
version.

.. toctree::
   :maxdepth: 2

   changelog

astroid-3.2.2/doc/extending.rst

Extending astroid syntax tree
=============================

Sometimes astroid will miss some potentially important information, for
instance with libraries that rely on dynamic features of the language. In some
other cases, you may want to customize the way inference works, for instance
to teach **astroid** that calls to `collections.namedtuple` return a class
with some known attributes.

Modifications in the AST are possible in a couple of ways.

AST transforms
^^^^^^^^^^^^^^

**astroid** has support for AST transformations, which, given a node, should
return either the same node but modified, or a completely new node. The
transform functions need to be registered with the underlying manager, that
is, a class that **astroid** uses internally for all things configuration
related. You can access the manager using `astroid.MANAGER`.

The transform functions need to receive three parameters, with the third one
being optional:

* the type of the node for which the transform will be applied
* the transform function itself
* optionally, but strongly recommended, a transform predicate function.
This function receives the node as an argument and is expected to return a
boolean specifying whether the transform should be applied to this node or
not.

AST transforms - example
------------------------

Let's see some examples! Say that we love the Python 3.6 feature called
``f-strings``: you might have heard of them, and now you want to use them in
your Python 3.6+ project as well. So instead of
``"your name is {}".format(name)`` we'd want to rewrite this to
``f"your name is {name}"``.

One thing you could do with astroid is to partially rewrite a tree and then
dump it back on disk to get the new modifications. Let's see an example in
which we rewrite our code so that it uses f-strings instead of ``.format()``.
While there are some technicalities to be aware of, such as the fact that
astroid is an AST (abstract syntax tree), while for code round-tripping you
might want a CST (concrete syntax tree) instead, for the purpose of this
example we'll just consider all the round-trip edge cases as being irrelevant.

First of all, let's write a simple function that receives a node and returns
the same node unmodified::

    def format_to_fstring_transform(node):
        return node

    astroid.MANAGER.register_transform(...)

For the registration of the transform, we are most likely interested in
registering it for ``astroid.Call``, which is the node for function calls, so
this now becomes::

    def format_to_fstring_transform(node):
        return node

    astroid.MANAGER.register_transform(
        astroid.Call,
        format_to_fstring_transform,
    )

The next step would be to do the actual transformation, but before delving
into that, let's see some important concepts that nodes in astroid have:

* they have a parent. Every time we build a node, we have to provide a parent
* most of the time they have a line number and a column offset as well
* a node might also have children that are nodes as well.
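As a quick illustration of those concepts, here is a small sketch that inspects the parent, position, and children of a parsed node (``parent``, ``lineno``, ``col_offset``, and ``get_children()`` are all public astroid APIs):

```python
import astroid

# Extract the Call node for ``print("hi")``.
call = astroid.extract_node('print("hi")')

# Every node knows its parent -- here the enclosing Expr statement...
print(type(call.parent).__name__)  # Expr

# ...its position in the source...
print(call.lineno, call.col_offset)  # 1 0

# ...and its children, which are nodes themselves: the ``print`` Name
# node and the Const node holding the "hi" argument.
print([type(child).__name__ for child in call.get_children()])  # ['Name', 'Const']
```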
You can check what a node needs by accessing its ``_astroid_fields``, ``_other_fields``, and ``_other_other_fields`` properties. They are all tuples of strings, where the strings are attribute names. The first one contains attributes that are nodes (so basically children of a node), the second one contains non-AST objects (such as strings or other objects), while the third one can contain both AST and non-AST objects.

When instantiating a node, the non-AST parameters are usually passed via the constructor, while the AST parameters are provided via the ``postinit()`` method. The only exception is that the parent is also passed via the constructor. Instantiating a new node might look like this::

    new_node = FunctionDef(
        name='my_new_function',
        lineno=3,
        col_offset=0,
        parent=the_parent_of_this_function,
    )
    new_node.postinit(
        args=args,
        body=body,
        returns=returns,
        doc_node=nodes.Const(value='the docstring of this function'),
    )

Now, with this knowledge, let's see how our transform might look::

    def format_to_fstring_transform(node):
        f_string_node = astroid.JoinedStr(
            lineno=node.lineno,
            col_offset=node.col_offset,
            parent=node.parent,
        )
        formatted_value_node = astroid.FormattedValue(
            lineno=node.lineno,
            col_offset=node.col_offset,
            parent=node.parent,
        )
        formatted_value_node.postinit(value=node.args[0])

        # Remove the {} since it will be represented as
        # formatted_value_node
        string = astroid.Const(node.func.expr.value.replace('{}', ''))

        f_string_node.postinit(values=[string, formatted_value_node])
        return f_string_node

    astroid.MANAGER.register_transform(
        astroid.Call,
        format_to_fstring_transform,
    )

There are a couple of things going on, so let's see what we did:

* ``JoinedStr`` is used to represent the f-string AST node. The catch is that the ``JoinedStr`` is formed out of the strings that don't contain a formatting placeholder, followed by the ``FormattedValue`` nodes, which contain the f-string's formatting placeholders.
* ``node.args`` holds a list of all the arguments passed in our function call, so ``node.args[0]`` actually points to the name variable that we passed.
* ``node.func.expr`` is the string that we use for formatting.
* We call ``postinit()`` with the value being the aforementioned name. This results in the f-string now being complete.

You can now check whether your transform did its job correctly by getting the string representation of the node::

    from astroid import parse

    tree = parse('''
    "my name is {}".format(name)
    ''')
    print(tree.as_string())

The output should print ``f"my name is {name}"``, and that's how you do AST transformations with astroid!

AST inference tip transforms
^^^^^^^^^^^^^^^^^^^^^^^^^^^^

Another interesting transform you can do with the AST is to provide a so-called ``inference tip``. **astroid** can be used as more than an AST library: it also offers basic support for inference. It can infer what names might mean in a given context, it can be used to resolve attributes in a highly complex class hierarchy, etc. We generally call this mechanism ``inference`` throughout the project.

An inference tip (or ``brain tip``, as another alias we might use) is a normal transform that's only called when we try to infer a particular node. Say for instance you want to infer the result of a particular function call. Here's how you'd set up an inference tip. As seen, you need to wrap the transform with ``inference_tip``.
It should also receive an optional parameter ``context``, which is the inference context that will be used for that particular block of inference, and it is supposed to return an iterator::

    def infer_my_custom_call(call_node, context=None):
        # Do some transformation here
        return iter((new_node, ))

    MANAGER.register_transform(
        nodes.Call,
        inference_tip(infer_my_custom_call),
        _looks_like_my_custom_call,
    )

This transform is now going to be triggered whenever **astroid** figures out a node for which the transform pattern should apply.

Module extender transforms
^^^^^^^^^^^^^^^^^^^^^^^^^^

Another form of transform is the module extender transform. This one can be used to partially alter a module without going through the intricacies of writing a transform that operates on AST nodes. The module extender transform adds the new nodes provided by the transform function to the module that we want to extend.

To register a module extender transform, use the ``astroid.register_module_extender`` function. You'll need to pass a manager instance, the fully qualified name of the module you want to extend, and a transform function. The transform function should not receive any parameters and is expected to return an instance of ``astroid.Module``.

Here's an example that might be useful::

    def my_custom_module():
        return astroid.parse('''
        class SomeClass:
            ...
        class SomeOtherClass:
            ...
        ''')

    register_module_extender(astroid.MANAGER, 'mymodule', my_custom_module)

Failed import hooks
^^^^^^^^^^^^^^^^^^^

If you want to control the behaviour of astroid when it cannot import some module, you can use ``MANAGER.register_failed_import_hook`` to register a transform that's called whenever an import fails.
The transform receives the module name that failed and is expected to return an instance of :class:`astroid.Module`; otherwise it must raise ``AstroidBuildingError``, as seen in the following example::

    def failed_custom_import(modname):
        if modname != 'my_custom_module':
            # Don't know about this module
            raise AstroidBuildingError(modname=modname)
        return astroid.parse('''
        class ThisIsAFakeClass:
            pass
        ''')

    MANAGER.register_failed_import_hook(failed_custom_import)
astroid-3.2.2/.git-blame-ignore-revs0000664000175000017500000000031514622475517017205 0ustar  epsilonepsilon# Initial formatting of astroid
add5f7b8eba427de9d39caae864bbc6dc37ef980
# Apply black on doc/conf.py
2dd9027054db541871713ef1cb1ae89513d05555
# Black's 2024 style
396f01a15d1cb0351b33654acdeedde64f537a0c
astroid-3.2.2/LICENSE0000664000175000017500000006362414622475517014116 0ustar  epsilonepsilon                  GNU LESSER GENERAL PUBLIC LICENSE
                       Version 2.1, February 1999

 Copyright (C) 1991, 1999 Free Software Foundation, Inc.
 51 Franklin St, Fifth Floor, Boston, MA 02110-1301 USA
 Everyone is permitted to copy and distribute verbatim copies
 of this license document, but changing it is not allowed.

[This is the first released version of the Lesser GPL. It also counts
 as the successor of the GNU Library Public License, version 2, hence
 the version number 2.1.]

                            Preamble

  The licenses for most software are designed to take away your freedom to share and change it. By contrast, the GNU General Public Licenses are intended to guarantee your freedom to share and change free software--to make sure the software is free for all its users.

  This license, the Lesser General Public License, applies to some specially designated software packages--typically libraries--of the Free Software Foundation and other authors who decide to use it.
You can use it too, but we suggest you first think carefully about whether this license or the ordinary General Public License is the better strategy to use in any particular case, based on the explanations below. When we speak of free software, we are referring to freedom of use, not price. Our General Public Licenses are designed to make sure that you have the freedom to distribute copies of free software (and charge for this service if you wish); that you receive source code or can get it if you want it; that you can change the software and use pieces of it in new free programs; and that you are informed that you can do these things. To protect your rights, we need to make restrictions that forbid distributors to deny you these rights or to ask you to surrender these rights. These restrictions translate to certain responsibilities for you if you distribute copies of the library or if you modify it. For example, if you distribute copies of the library, whether gratis or for a fee, you must give the recipients all the rights that we gave you. You must make sure that they, too, receive or can get the source code. If you link other code with the library, you must provide complete object files to the recipients, so that they can relink them with the library after making changes to the library and recompiling it. And you must show them these terms so they know their rights. We protect your rights with a two-step method: (1) we copyright the library, and (2) we offer you this license, which gives you legal permission to copy, distribute and/or modify the library. To protect each distributor, we want to make it very clear that there is no warranty for the free library. Also, if the library is modified by someone else and passed on, the recipients should know that what they have is not the original version, so that the original author's reputation will not be affected by problems that might be introduced by others. 
Finally, software patents pose a constant threat to the existence of any free program. We wish to make sure that a company cannot effectively restrict the users of a free program by obtaining a restrictive license from a patent holder. Therefore, we insist that any patent license obtained for a version of the library must be consistent with the full freedom of use specified in this license. Most GNU software, including some libraries, is covered by the ordinary GNU General Public License. This license, the GNU Lesser General Public License, applies to certain designated libraries, and is quite different from the ordinary General Public License. We use this license for certain libraries in order to permit linking those libraries into non-free programs. When a program is linked with a library, whether statically or using a shared library, the combination of the two is legally speaking a combined work, a derivative of the original library. The ordinary General Public License therefore permits such linking only if the entire combination fits its criteria of freedom. The Lesser General Public License permits more lax criteria for linking other code with the library. We call this license the "Lesser" General Public License because it does Less to protect the user's freedom than the ordinary General Public License. It also provides other free software developers Less of an advantage over competing non-free programs. These disadvantages are the reason we use the ordinary General Public License for many libraries. However, the Lesser license provides advantages in certain special circumstances. For example, on rare occasions, there may be a special need to encourage the widest possible use of a certain library, so that it becomes a de-facto standard. To achieve this, non-free programs must be allowed to use the library. A more frequent case is that a free library does the same job as widely used non-free libraries. 
In this case, there is little to gain by limiting the free library to free software only, so we use the Lesser General Public License. In other cases, permission to use a particular library in non-free programs enables a greater number of people to use a large body of free software. For example, permission to use the GNU C Library in non-free programs enables many more people to use the whole GNU operating system, as well as its variant, the GNU/Linux operating system. Although the Lesser General Public License is Less protective of the users' freedom, it does ensure that the user of a program that is linked with the Library has the freedom and the wherewithal to run that program using a modified version of the Library. The precise terms and conditions for copying, distribution and modification follow. Pay close attention to the difference between a "work based on the library" and a "work that uses the library". The former contains code derived from the library, whereas the latter must be combined with the library in order to run. GNU LESSER GENERAL PUBLIC LICENSE TERMS AND CONDITIONS FOR COPYING, DISTRIBUTION AND MODIFICATION 0. This License Agreement applies to any software library or other program which contains a notice placed by the copyright holder or other authorized party saying it may be distributed under the terms of this Lesser General Public License (also called "this License"). Each licensee is addressed as "you". A "library" means a collection of software functions and/or data prepared so as to be conveniently linked with application programs (which use some of those functions and data) to form executables. The "Library", below, refers to any such software library or work which has been distributed under these terms. 
A "work based on the Library" means either the Library or any derivative work under copyright law: that is to say, a work containing the Library or a portion of it, either verbatim or with modifications and/or translated straightforwardly into another language. (Hereinafter, translation is included without limitation in the term "modification".) "Source code" for a work means the preferred form of the work for making modifications to it. For a library, complete source code means all the source code for all modules it contains, plus any associated interface definition files, plus the scripts used to control compilation and installation of the library. Activities other than copying, distribution and modification are not covered by this License; they are outside its scope. The act of running a program using the Library is not restricted, and output from such a program is covered only if its contents constitute a work based on the Library (independent of the use of the Library in a tool for writing it). Whether that is true depends on what the Library does and what the program that uses the Library does. 1. You may copy and distribute verbatim copies of the Library's complete source code as you receive it, in any medium, provided that you conspicuously and appropriately publish on each copy an appropriate copyright notice and disclaimer of warranty; keep intact all the notices that refer to this License and to the absence of any warranty; and distribute a copy of this License along with the Library. You may charge a fee for the physical act of transferring a copy, and you may at your option offer warranty protection in exchange for a fee. 2. You may modify your copy or copies of the Library or any portion of it, thus forming a work based on the Library, and copy and distribute such modifications or work under the terms of Section 1 above, provided that you also meet all of these conditions: a) The modified work must itself be a software library. 
b) You must cause the files modified to carry prominent notices stating that you changed the files and the date of any change. c) You must cause the whole of the work to be licensed at no charge to all third parties under the terms of this License. d) If a facility in the modified Library refers to a function or a table of data to be supplied by an application program that uses the facility, other than as an argument passed when the facility is invoked, then you must make a good faith effort to ensure that, in the event an application does not supply such function or table, the facility still operates, and performs whatever part of its purpose remains meaningful. (For example, a function in a library to compute square roots has a purpose that is entirely well-defined independent of the application. Therefore, Subsection 2d requires that any application-supplied function or table used by this function must be optional: if the application does not supply it, the square root function must still compute square roots.) These requirements apply to the modified work as a whole. If identifiable sections of that work are not derived from the Library, and can be reasonably considered independent and separate works in themselves, then this License, and its terms, do not apply to those sections when you distribute them as separate works. But when you distribute the same sections as part of a whole which is a work based on the Library, the distribution of the whole must be on the terms of this License, whose permissions for other licensees extend to the entire whole, and thus to each and every part regardless of who wrote it. Thus, it is not the intent of this section to claim rights or contest your rights to work written entirely by you; rather, the intent is to exercise the right to control the distribution of derivative or collective works based on the Library. 
In addition, mere aggregation of another work not based on the Library with the Library (or with a work based on the Library) on a volume of a storage or distribution medium does not bring the other work under the scope of this License. 3. You may opt to apply the terms of the ordinary GNU General Public License instead of this License to a given copy of the Library. To do this, you must alter all the notices that refer to this License, so that they refer to the ordinary GNU General Public License, version 2, instead of to this License. (If a newer version than version 2 of the ordinary GNU General Public License has appeared, then you can specify that version instead if you wish.) Do not make any other change in these notices. Once this change is made in a given copy, it is irreversible for that copy, so the ordinary GNU General Public License applies to all subsequent copies and derivative works made from that copy. This option is useful when you wish to copy part of the code of the Library into a program that is not a library. 4. You may copy and distribute the Library (or a portion or derivative of it, under Section 2) in object code or executable form under the terms of Sections 1 and 2 above provided that you accompany it with the complete corresponding machine-readable source code, which must be distributed under the terms of Sections 1 and 2 above on a medium customarily used for software interchange. If distribution of object code is made by offering access to copy from a designated place, then offering equivalent access to copy the source code from the same place satisfies the requirement to distribute the source code, even though third parties are not compelled to copy the source along with the object code. 5. A program that contains no derivative of any portion of the Library, but is designed to work with the Library by being compiled or linked with it, is called a "work that uses the Library". 
Such a work, in isolation, is not a derivative work of the Library, and therefore falls outside the scope of this License. However, linking a "work that uses the Library" with the Library creates an executable that is a derivative of the Library (because it contains portions of the Library), rather than a "work that uses the library". The executable is therefore covered by this License. Section 6 states terms for distribution of such executables. When a "work that uses the Library" uses material from a header file that is part of the Library, the object code for the work may be a derivative work of the Library even though the source code is not. Whether this is true is especially significant if the work can be linked without the Library, or if the work is itself a library. The threshold for this to be true is not precisely defined by law. If such an object file uses only numerical parameters, data structure layouts and accessors, and small macros and small inline functions (ten lines or less in length), then the use of the object file is unrestricted, regardless of whether it is legally a derivative work. (Executables containing this object code plus portions of the Library will still fall under Section 6.) Otherwise, if the work is a derivative of the Library, you may distribute the object code for the work under the terms of Section 6. Any executables containing that work also fall under Section 6, whether or not they are linked directly with the Library itself. 6. As an exception to the Sections above, you may also combine or link a "work that uses the Library" with the Library to produce a work containing portions of the Library, and distribute that work under terms of your choice, provided that the terms permit modification of the work for the customer's own use and reverse engineering for debugging such modifications. 
You must give prominent notice with each copy of the work that the Library is used in it and that the Library and its use are covered by this License. You must supply a copy of this License. If the work during execution displays copyright notices, you must include the copyright notice for the Library among them, as well as a reference directing the user to the copy of this License. Also, you must do one of these things: a) Accompany the work with the complete corresponding machine-readable source code for the Library including whatever changes were used in the work (which must be distributed under Sections 1 and 2 above); and, if the work is an executable linked with the Library, with the complete machine-readable "work that uses the Library", as object code and/or source code, so that the user can modify the Library and then relink to produce a modified executable containing the modified Library. (It is understood that the user who changes the contents of definitions files in the Library will not necessarily be able to recompile the application to use the modified definitions.) b) Use a suitable shared library mechanism for linking with the Library. A suitable mechanism is one that (1) uses at run time a copy of the library already present on the user's computer system, rather than copying library functions into the executable, and (2) will operate properly with a modified version of the library, if the user installs one, as long as the modified version is interface-compatible with the version that the work was made with. c) Accompany the work with a written offer, valid for at least three years, to give the same user the materials specified in Subsection 6a, above, for a charge no more than the cost of performing this distribution. d) If distribution of the work is made by offering access to copy from a designated place, offer equivalent access to copy the above specified materials from the same place. 
e) Verify that the user has already received a copy of these materials or that you have already sent this user a copy. For an executable, the required form of the "work that uses the Library" must include any data and utility programs needed for reproducing the executable from it. However, as a special exception, the materials to be distributed need not include anything that is normally distributed (in either source or binary form) with the major components (compiler, kernel, and so on) of the operating system on which the executable runs, unless that component itself accompanies the executable. It may happen that this requirement contradicts the license restrictions of other proprietary libraries that do not normally accompany the operating system. Such a contradiction means you cannot use both them and the Library together in an executable that you distribute. 7. You may place library facilities that are a work based on the Library side-by-side in a single library together with other library facilities not covered by this License, and distribute such a combined library, provided that the separate distribution of the work based on the Library and of the other library facilities is otherwise permitted, and provided that you do these two things: a) Accompany the combined library with a copy of the same work based on the Library, uncombined with any other library facilities. This must be distributed under the terms of the Sections above. b) Give prominent notice with the combined library of the fact that part of it is a work based on the Library, and explaining where to find the accompanying uncombined form of the same work. 8. You may not copy, modify, sublicense, link with, or distribute the Library except as expressly provided under this License. Any attempt otherwise to copy, modify, sublicense, link with, or distribute the Library is void, and will automatically terminate your rights under this License. 
However, parties who have received copies, or rights, from you under this License will not have their licenses terminated so long as such parties remain in full compliance. 9. You are not required to accept this License, since you have not signed it. However, nothing else grants you permission to modify or distribute the Library or its derivative works. These actions are prohibited by law if you do not accept this License. Therefore, by modifying or distributing the Library (or any work based on the Library), you indicate your acceptance of this License to do so, and all its terms and conditions for copying, distributing or modifying the Library or works based on it. 10. Each time you redistribute the Library (or any work based on the Library), the recipient automatically receives a license from the original licensor to copy, distribute, link with or modify the Library subject to these terms and conditions. You may not impose any further restrictions on the recipients' exercise of the rights granted herein. You are not responsible for enforcing compliance by third parties with this License. 11. If, as a consequence of a court judgment or allegation of patent infringement or for any other reason (not limited to patent issues), conditions are imposed on you (whether by court order, agreement or otherwise) that contradict the conditions of this License, they do not excuse you from the conditions of this License. If you cannot distribute so as to satisfy simultaneously your obligations under this License and any other pertinent obligations, then as a consequence you may not distribute the Library at all. For example, if a patent license would not permit royalty-free redistribution of the Library by all those who receive copies directly or indirectly through you, then the only way you could satisfy both it and this License would be to refrain entirely from distribution of the Library. 
If any portion of this section is held invalid or unenforceable under any particular circumstance, the balance of the section is intended to apply, and the section as a whole is intended to apply in other circumstances. It is not the purpose of this section to induce you to infringe any patents or other property right claims or to contest validity of any such claims; this section has the sole purpose of protecting the integrity of the free software distribution system which is implemented by public license practices. Many people have made generous contributions to the wide range of software distributed through that system in reliance on consistent application of that system; it is up to the author/donor to decide if he or she is willing to distribute software through any other system and a licensee cannot impose that choice. This section is intended to make thoroughly clear what is believed to be a consequence of the rest of this License. 12. If the distribution and/or use of the Library is restricted in certain countries either by patents or by copyrighted interfaces, the original copyright holder who places the Library under this License may add an explicit geographical distribution limitation excluding those countries, so that distribution is permitted only in or among countries not thus excluded. In such case, this License incorporates the limitation as if written in the body of this License. 13. The Free Software Foundation may publish revised and/or new versions of the Lesser General Public License from time to time. Such new versions will be similar in spirit to the present version, but may differ in detail to address new problems or concerns. Each version is given a distinguishing version number. If the Library specifies a version number of this License which applies to it and "any later version", you have the option of following the terms and conditions either of that version or of any later version published by the Free Software Foundation. 
If the Library does not specify a license version number, you may choose any version ever published by the Free Software Foundation. 14. If you wish to incorporate parts of the Library into other free programs whose distribution conditions are incompatible with these, write to the author to ask for permission. For software which is copyrighted by the Free Software Foundation, write to the Free Software Foundation; we sometimes make exceptions for this. Our decision will be guided by the two goals of preserving the free status of all derivatives of our free software and of promoting the sharing and reuse of software generally. NO WARRANTY 15. BECAUSE THE LIBRARY IS LICENSED FREE OF CHARGE, THERE IS NO WARRANTY FOR THE LIBRARY, TO THE EXTENT PERMITTED BY APPLICABLE LAW. EXCEPT WHEN OTHERWISE STATED IN WRITING THE COPYRIGHT HOLDERS AND/OR OTHER PARTIES PROVIDE THE LIBRARY "AS IS" WITHOUT WARRANTY OF ANY KIND, EITHER EXPRESSED OR IMPLIED, INCLUDING, BUT NOT LIMITED TO, THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE. THE ENTIRE RISK AS TO THE QUALITY AND PERFORMANCE OF THE LIBRARY IS WITH YOU. SHOULD THE LIBRARY PROVE DEFECTIVE, YOU ASSUME THE COST OF ALL NECESSARY SERVICING, REPAIR OR CORRECTION. 16. IN NO EVENT UNLESS REQUIRED BY APPLICABLE LAW OR AGREED TO IN WRITING WILL ANY COPYRIGHT HOLDER, OR ANY OTHER PARTY WHO MAY MODIFY AND/OR REDISTRIBUTE THE LIBRARY AS PERMITTED ABOVE, BE LIABLE TO YOU FOR DAMAGES, INCLUDING ANY GENERAL, SPECIAL, INCIDENTAL OR CONSEQUENTIAL DAMAGES ARISING OUT OF THE USE OR INABILITY TO USE THE LIBRARY (INCLUDING BUT NOT LIMITED TO LOSS OF DATA OR DATA BEING RENDERED INACCURATE OR LOSSES SUSTAINED BY YOU OR THIRD PARTIES OR A FAILURE OF THE LIBRARY TO OPERATE WITH ANY OTHER SOFTWARE), EVEN IF SUCH HOLDER OR OTHER PARTY HAS BEEN ADVISED OF THE POSSIBILITY OF SUCH DAMAGES. 
END OF TERMS AND CONDITIONS How to Apply These Terms to Your New Libraries If you develop a new library, and you want it to be of the greatest possible use to the public, we recommend making it free software that everyone can redistribute and change. You can do so by permitting redistribution under these terms (or, alternatively, under the terms of the ordinary General Public License). To apply these terms, attach the following notices to the library. It is safest to attach them to the start of each source file to most effectively convey the exclusion of warranty; and each file should have at least the "copyright" line and a pointer to where the full notice is found. Copyright (C) This library is free software; you can redistribute it and/or modify it under the terms of the GNU Lesser General Public License as published by the Free Software Foundation; either version 2.1 of the License, or (at your option) any later version. This library is distributed in the hope that it will be useful, but WITHOUT ANY WARRANTY; without even the implied warranty of MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU Lesser General Public License for more details. You should have received a copy of the GNU Lesser General Public License along with this library; if not, write to the Free Software Foundation, Inc., 51 Franklin St, Fifth Floor, Boston, MA 02110-1301 USA Also add information on how to contact you by electronic and paper mail. You should also get your employer (if you work as a programmer) or your school, if any, to sign a "copyright disclaimer" for the library, if necessary. Here is a sample; alter the names: Yoyodyne, Inc., hereby disclaims all copyright interest in the library `Frob' (a library for tweaking knobs) written by James Random Hacker. , 1 April 1990 Ty Coon, President of Vice That's all there is to it! astroid-3.2.2/pylintrc0000664000175000017500000002505014622475517014677 0ustar epsilonepsilon[MASTER] # Specify a configuration file. 
#rcfile=

# Python code to execute, usually for sys.path manipulation such as
# pygtk.require().
#init-hook=

# Add files or directories to the blacklist. They should be base names, not
# paths.
ignore=CVS

# Pickle collected data for later comparisons.
persistent=yes

# List of plugins (as comma separated values of python module names) to load,
# usually to register additional checkers.
load-plugins=
    pylint.extensions.check_elif,
    pylint.extensions.bad_builtin,
    pylint.extensions.code_style,
    pylint.extensions.overlapping_exceptions,
    pylint.extensions.typing,
    pylint.extensions.set_membership,
    pylint.extensions.redefined_variable_type,
    pylint.extensions.for_any_all,

# Use multiple processes to speed up Pylint.
jobs=1

# Allow loading of arbitrary C extensions. Extensions are imported into the
# active Python interpreter and may run arbitrary code.
unsafe-load-any-extension=no

# A comma-separated list of package or module names from where C extensions may
# be loaded. Extensions are loaded into the active Python interpreter and may
# run arbitrary code
extension-pkg-whitelist=

# Minimum supported python version
py-version = 3.8.0

[REPORTS]

# Set the output format. Available formats are text, parseable, colorized, msvs
# (visual studio) and html. You can also give a reporter class, eg
# mypackage.mymodule.MyReporterClass.
output-format=text

# Tells whether to display a full report or only the messages
reports=no

# Python expression which should return a note less than 10 (10 is the highest
# note). You have access to the variables errors, warning, statement, which
# respectively contain the number of errors / warnings messages and the total
# number of statements analyzed. This is used by the global evaluation report
# (RP0004).
evaluation=10.0 - ((float(5 * error + warning + refactor + convention) / statement) * 10)

# Template used to display messages. This is a python new-style format string
# used to format the message information.
See doc for all details #msg-template= [MESSAGES CONTROL] # Only show warnings with the listed confidence levels. Leave empty to show # all. Valid levels: HIGH, INFERENCE, INFERENCE_FAILURE, UNDEFINED confidence= # Disable the message, report, category or checker with the given id(s). You # can either give multiple identifiers separated by comma (,) or put this # option multiple times (only on the command line, not in the configuration # file where it should appear only once).You can also use "--disable=all" to # disable everything first and then re-enable specific checks. For example, if # you want to run only the similarities checker, you can use "--disable=all # --enable=similarities". If you want to run only the classes checker, but have # no Warning level messages displayed, use"--disable=all --enable=classes # --disable=W" disable=fixme, invalid-name, missing-docstring, too-few-public-methods, too-many-public-methods, too-many-boolean-expressions, too-many-branches, too-many-statements, # We know about it and we're doing our best to remove it in 2.0 (oups) cyclic-import, # Requires major redesign for fixing this (and private # access in the same project is fine) protected-access, # API requirements in most of the occurrences unused-argument, # black handles these format, # We might want to disable new checkers from master that do not exists # in latest published pylint bad-option-value, # Legacy warning not checked in astroid/brain before we # transitioned to setuptools and added an init.py duplicate-code, # This one would help performance but we need to fix the pipeline first # and there are a lot of warning for it consider-using-f-string, consider-using-assignment-expr, enable=useless-suppression [BASIC] # List of builtins function names that should not be used, separated by a comma bad-functions= # Good variable names which should always be accepted, separated by a comma good-names=i,j,k,e,ex,f,m,cm,Run,_,n,op,it # Bad variable names which should always be 
refused, separated by a comma bad-names=foo,bar,baz,toto,tutu,tata # Colon-delimited sets of names that determine each other's naming style when # the name regexes allow several styles. name-group= # Include a hint for the correct naming format with invalid-name include-naming-hint=no # Regular expression matching correct attribute names attr-rgx=[a-z_][a-z0-9_]{2,30}$ # Regular expression matching correct constant names const-rgx=(([A-Z_][A-Z0-9_]*)|(__.*__))$ # Regular expression matching correct method names method-rgx=[a-z_][a-z0-9_]{2,30}$ # Regular expression matching correct inline iteration names inlinevar-rgx=[A-Za-z_][A-Za-z0-9_]*$ # Regular expression matching correct class names class-rgx=[A-Z_][a-zA-Z0-9]+$ # Regular expression matching correct argument names argument-rgx=[a-z_][a-z0-9_]{2,30}$ # Regular expression matching correct module names module-rgx=(([a-z_][a-z0-9_]*)|([A-Z][a-zA-Z0-9]+))$ # Regular expression matching correct function names function-rgx=[a-z_][a-z0-9_]{2,30}$ # Regular expression matching correct variable names variable-rgx=[a-z_][a-z0-9_]{2,30}$ # Regular expression matching correct class attribute names class-attribute-rgx=([A-Za-z_][A-Za-z0-9_]{2,30}|(__.*__))$ # Regular expression which should only match function or class names that do # not require a docstring. no-docstring-rgx=__.*__ # Minimum line length for functions/classes that require docstrings, shorter # ones are exempt. docstring-min-length=-1 [FORMAT] # Maximum number of characters on a single line. max-line-length=100 # Regexp for a line that is allowed to be longer than the limit. ignore-long-lines=^\s*(# )??$ # Allow the body of an if to be on the same line as the test if there is no # else. single-line-if-stmt=no # Maximum number of lines in a module max-module-lines=3000 # String used as indentation unit. This is usually " " (4 spaces) or "\t" (1 # tab). indent-string=' ' # Number of spaces of indent required inside a hanging or continued line. 
indent-after-paren=4 # Expected format of line ending, e.g. empty (any line ending), LF or CRLF. expected-line-ending-format= [LOGGING] # Logging modules to check that the string format arguments are in logging # function parameter format logging-modules=logging [MISCELLANEOUS] # List of note tags to take in consideration, separated by a comma. notes=FIXME,XXX,TODO [SIMILARITIES] # Minimum lines number of a similarity. min-similarity-lines=4 # Ignore comments when computing similarities. ignore-comments=yes # Ignore docstrings when computing similarities. ignore-docstrings=yes # Ignore imports when computing similarities. ignore-imports=yes [SPELLING] # Spelling dictionary name. Available dictionaries: none. To make it working # install python-enchant package. spelling-dict= # List of comma separated words that should not be checked. spelling-ignore-words= # A path to a file that contains private dictionary; one word per line. spelling-private-dict-file= # Tells whether to store unknown words to indicated private dictionary in # --spelling-private-dict-file option instead of raising a message. spelling-store-unknown-words=no [TYPECHECK] ignore-on-opaque-inference=n # List of module names for which member attributes should not be checked # (useful for modules/projects where namespaces are manipulated during runtime # and thus existing member attributes cannot be deduced by static analysis ignored-modules=typed_ast.ast3 # List of classes names for which member attributes should not be checked # (useful for classes with attributes dynamically set). ignored-classes=SQLObject # Regex pattern to define which classes are considered mixins. mixin-class-rgx=.*Mix[Ii]n # List of members which are set dynamically and missed by pylint inference # system, and so shouldn't trigger E0201 when accessed. Python regular # expressions are accepted. 
generated-members=REQUEST,acl_users,aq_parent,argparse.Namespace [VARIABLES] # Tells whether we should check for unused import in __init__ files. init-import=no # A regular expression matching the name of dummy variables (i.e. expectedly # not used). dummy-variables-rgx=_$|dummy # List of additional names supposed to be defined in builtins. Remember that # you should avoid to define new builtins when possible. additional-builtins= # List of strings which can identify a callback function by name. A callback # name must start or end with one of those strings. callbacks=cb_,_cb [CLASSES] # List of method names used to declare (i.e. assign) instance attributes. defining-attr-methods=__init__,__new__,setUp # List of valid names for the first argument in a class method. valid-classmethod-first-arg=cls # List of valid names for the first argument in a metaclass class method. valid-metaclass-classmethod-first-arg=mcs # List of member names, which should be excluded from the protected access # warning. exclude-protected=_asdict,_fields,_replace,_source,_make [DESIGN] # Maximum number of arguments for function / method max-args=10 # Argument names that match this expression will be ignored. Default to name # with leading underscore ignored-argument-names=_.* # Maximum number of locals for function / method body max-locals=25 # Maximum number of return / yield for function / method body max-returns=10 # Maximum number of branch for function / method body max-branches=25 # Maximum number of statements in function / method body max-statements=60 # Maximum number of parents for a class (see R0901). max-parents=10 # Maximum number of attributes for a class (see R0902). max-attributes=15 # Minimum number of public methods for a class (see R0903). min-public-methods=2 # Maximum number of public methods for a class (see R0904). 
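As an aside on the TYPECHECK options above: `mixin-class-rgx=.*Mix[Ii]n` is an ordinary Python regular expression that pylint matches against class names to decide which classes count as mixins. A quick sketch (the candidate names below are invented for the demo; only `LookupMixIn` also appears in this codebase):

```python
import re

# Same pattern as the mixin-class-rgx option in the TYPECHECK section.
MIXIN_RGX = re.compile(r".*Mix[Ii]n")

# Hypothetical class names; re.match anchors at the start of the string,
# so ".*" lets the "MixIn"/"Mixin" suffix appear anywhere after it.
candidates = ["LookupMixIn", "ComparableMixin", "Comparable"]
mixins = [name for name in candidates if MIXIN_RGX.match(name)]
print(mixins)  # ['LookupMixIn', 'ComparableMixin']
```

Note that `re.match` only anchors at the start; because the pattern has no trailing `$`, a name like `MixInHelper` would also match.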
max-public-methods=20 [IMPORTS] # Deprecated modules which should not be used, separated by a comma deprecated-modules=stringprep,optparse # Create a graph of every (i.e. internal and external) dependencies in the # given file (report RP0402 must not be disabled) import-graph= # Create a graph of external dependencies in the given file (report RP0402 must # not be disabled) ext-import-graph= # Create a graph of internal dependencies in the given file (report RP0402 must # not be disabled) int-import-graph= [EXCEPTIONS] # Exceptions that will emit a warning when being caught. Defaults to # "Exception" overgeneral-exceptions=builtins.Exception [TYPING] # Annotations are used exclusively for type checking runtime-typing = no astroid-3.2.2/.pre-commit-config.yaml0000664000175000017500000000377114622475517017377 0ustar epsilonepsilonci: skip: [pylint] repos: - repo: https://github.com/pre-commit/pre-commit-hooks rev: v4.6.0 hooks: - id: trailing-whitespace exclude: .github/|tests/testdata - id: end-of-file-fixer exclude: tests/testdata - repo: https://github.com/astral-sh/ruff-pre-commit rev: "v0.4.3" hooks: - id: ruff exclude: tests/testdata args: ["--fix"] - repo: https://github.com/Pierre-Sassoulas/copyright_notice_precommit rev: 0.1.2 hooks: - id: copyright-notice args: ["--notice=script/copyright.txt", "--enforce-all", "--autofix"] exclude: tests/testdata|setup.py types: [python] - repo: https://github.com/asottile/pyupgrade rev: v3.15.2 hooks: - id: pyupgrade exclude: tests/testdata args: [--py38-plus] - repo: https://github.com/Pierre-Sassoulas/black-disable-checker/ rev: v1.1.3 hooks: - id: black-disable-checker exclude: tests/test_nodes_lineno.py - repo: https://github.com/psf/black rev: 24.4.2 hooks: - id: black args: [--safe, --quiet] exclude: tests/testdata - repo: local hooks: - id: pylint name: pylint entry: pylint language: system types: [python] args: [ "-rn", "-sn", "--rcfile=pylintrc", # "--load-plugins=pylint.extensions.docparams", We're not ready for 
that ] exclude: tests/testdata|conf.py - repo: https://github.com/pre-commit/mirrors-mypy rev: v1.10.0 hooks: - id: mypy name: mypy entry: mypy language: python types: [python] args: [] require_serial: true additional_dependencies: ["types-typed-ast"] exclude: tests/testdata| # exclude everything, we're not ready - repo: https://github.com/pre-commit/mirrors-prettier rev: v4.0.0-alpha.8 hooks: - id: prettier args: [--prose-wrap=always, --print-width=88] astroid-3.2.2/.readthedocs.yaml0000664000175000017500000000036214622475517016336 0ustar epsilonepsilon# See https://docs.readthedocs.io/en/stable/config-file/v2.html for details version: 2 sphinx: configuration: doc/conf.py build: os: ubuntu-22.04 tools: python: "3.11" python: install: - requirements: doc/requirements.txt astroid-3.2.2/astroid/0000775000175000017500000000000014622475517014553 5ustar epsilonepsilonastroid-3.2.2/astroid/manager.py0000664000175000017500000004435514622475517016552 0ustar epsilonepsilon# Licensed under the LGPL: https://www.gnu.org/licenses/old-licenses/lgpl-2.1.en.html # For details: https://github.com/pylint-dev/astroid/blob/main/LICENSE # Copyright (c) https://github.com/pylint-dev/astroid/blob/main/CONTRIBUTORS.txt """astroid manager: avoid multiple astroid build of a same module when possible by providing a class responsible to get astroid representation from various source and using a cache of built modules) """ from __future__ import annotations import collections import os import types import zipimport from collections.abc import Callable, Iterator, Sequence from typing import Any, ClassVar from astroid import nodes from astroid.context import InferenceContext, _invalidate_cache from astroid.exceptions import AstroidBuildingError, AstroidImportError from astroid.interpreter._import import spec, util from astroid.modutils import ( NoSourceFile, _cache_normalize_path_, file_info_from_modpath, get_source_file, is_module_name_part_of_extension_package_whitelist, is_python_source, 
is_stdlib_module, load_module_from_name, modpath_from_file, ) from astroid.transforms import TransformVisitor from astroid.typing import AstroidManagerBrain, InferenceResult ZIP_IMPORT_EXTS = (".zip", ".egg", ".whl", ".pyz", ".pyzw") def safe_repr(obj: Any) -> str: try: return repr(obj) except Exception: # pylint: disable=broad-except return "???" class AstroidManager: """Responsible to build astroid from files or modules. Use the Borg (singleton) pattern. """ name = "astroid loader" brain: ClassVar[AstroidManagerBrain] = { "astroid_cache": {}, "_mod_file_cache": {}, "_failed_import_hooks": [], "always_load_extensions": False, "optimize_ast": False, "max_inferable_values": 100, "extension_package_whitelist": set(), "module_denylist": set(), "_transform": TransformVisitor(), "prefer_stubs": False, } def __init__(self) -> None: # NOTE: cache entries are added by the [re]builder self.astroid_cache = AstroidManager.brain["astroid_cache"] self._mod_file_cache = AstroidManager.brain["_mod_file_cache"] self._failed_import_hooks = AstroidManager.brain["_failed_import_hooks"] self.extension_package_whitelist = AstroidManager.brain[ "extension_package_whitelist" ] self.module_denylist = AstroidManager.brain["module_denylist"] self._transform = AstroidManager.brain["_transform"] self.prefer_stubs = AstroidManager.brain["prefer_stubs"] @property def always_load_extensions(self) -> bool: return AstroidManager.brain["always_load_extensions"] @always_load_extensions.setter def always_load_extensions(self, value: bool) -> None: AstroidManager.brain["always_load_extensions"] = value @property def optimize_ast(self) -> bool: return AstroidManager.brain["optimize_ast"] @optimize_ast.setter def optimize_ast(self, value: bool) -> None: AstroidManager.brain["optimize_ast"] = value @property def max_inferable_values(self) -> int: return AstroidManager.brain["max_inferable_values"] @max_inferable_values.setter def max_inferable_values(self, value: int) -> None: 
AstroidManager.brain["max_inferable_values"] = value @property def register_transform(self): # This and unregister_transform below are exported for convenience return self._transform.register_transform @property def unregister_transform(self): return self._transform.unregister_transform @property def builtins_module(self) -> nodes.Module: return self.astroid_cache["builtins"] @property def prefer_stubs(self) -> bool: return AstroidManager.brain["prefer_stubs"] @prefer_stubs.setter def prefer_stubs(self, value: bool) -> None: AstroidManager.brain["prefer_stubs"] = value def visit_transforms(self, node: nodes.NodeNG) -> InferenceResult: """Visit the transforms and apply them to the given *node*.""" return self._transform.visit(node) def ast_from_file( self, filepath: str, modname: str | None = None, fallback: bool = True, source: bool = False, ) -> nodes.Module: """Given a module name, return the astroid object.""" if modname is None: try: modname = ".".join(modpath_from_file(filepath)) except ImportError: modname = filepath if ( modname in self.astroid_cache and self.astroid_cache[modname].file == filepath ): return self.astroid_cache[modname] # Call get_source_file() only after a cache miss, # since it calls os.path.exists(). try: filepath = get_source_file( filepath, include_no_ext=True, prefer_stubs=self.prefer_stubs ) source = True except NoSourceFile: pass # Second attempt on the cache after get_source_file(). 
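The class docstring above says `AstroidManager` uses the Borg (shared-state singleton) pattern: all mutable state lives in the class-level `brain` dict, and instance properties such as `max_inferable_values` merely read and write it. A minimal standalone model of that design (illustrative only — `BorgManager` is a made-up name, not part of astroid) shows why any two instances always agree:

```python
from typing import Any, ClassVar


class BorgManager:
    """Tiny model of AstroidManager's shared-state ('Borg') pattern.

    All mutable state lives in one class-level dict, so every
    instance observes the same configuration. Illustrative only.
    """

    brain: ClassVar[dict[str, Any]] = {"max_inferable_values": 100}

    @property
    def max_inferable_values(self) -> int:
        return BorgManager.brain["max_inferable_values"]

    @max_inferable_values.setter
    def max_inferable_values(self, value: int) -> None:
        BorgManager.brain["max_inferable_values"] = value


first = BorgManager()
second = BorgManager()
first.max_inferable_values = 50
print(second.max_inferable_values)  # 50: both instances share `brain`
```

This is why test helpers such as `brainless_manager()` below must explicitly replace the shared containers on one instance to get isolated state.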
if ( modname in self.astroid_cache and self.astroid_cache[modname].file == filepath ): return self.astroid_cache[modname] if source: # pylint: disable=import-outside-toplevel; circular import from astroid.builder import AstroidBuilder return AstroidBuilder(self).file_build(filepath, modname) if fallback and modname: return self.ast_from_module_name(modname) raise AstroidBuildingError("Unable to build an AST for {path}.", path=filepath) def ast_from_string( self, data: str, modname: str = "", filepath: str | None = None ) -> nodes.Module: """Given some source code as a string, return its corresponding astroid object. """ # pylint: disable=import-outside-toplevel; circular import from astroid.builder import AstroidBuilder return AstroidBuilder(self).string_build(data, modname, filepath) def _build_stub_module(self, modname: str) -> nodes.Module: # pylint: disable=import-outside-toplevel; circular import from astroid.builder import AstroidBuilder return AstroidBuilder(self).string_build("", modname) def _build_namespace_module( self, modname: str, path: Sequence[str] ) -> nodes.Module: # pylint: disable=import-outside-toplevel; circular import from astroid.builder import build_namespace_package_module return build_namespace_package_module(modname, path) def _can_load_extension(self, modname: str) -> bool: if self.always_load_extensions: return True if is_stdlib_module(modname): return True return is_module_name_part_of_extension_package_whitelist( modname, self.extension_package_whitelist ) def ast_from_module_name( # noqa: C901 self, modname: str | None, context_file: str | None = None, use_cache: bool = True, ) -> nodes.Module: """Given a module name, return the astroid object.""" if modname is None: raise AstroidBuildingError("No module name given.") # Sometimes we don't want to use the cache. 
For example, when we're # importing a module with the same name as the file that is importing # we want to fallback on the import system to make sure we get the correct # module. if modname in self.module_denylist: raise AstroidImportError(f"Skipping ignored module {modname!r}") if modname in self.astroid_cache and use_cache: return self.astroid_cache[modname] if modname == "__main__": return self._build_stub_module(modname) if context_file: old_cwd = os.getcwd() os.chdir(os.path.dirname(context_file)) try: found_spec = self.file_from_module_name(modname, context_file) if found_spec.type == spec.ModuleType.PY_ZIPMODULE: module = self.zip_import_data(found_spec.location) if module is not None: return module elif found_spec.type in ( spec.ModuleType.C_BUILTIN, spec.ModuleType.C_EXTENSION, ): if ( found_spec.type == spec.ModuleType.C_EXTENSION and not self._can_load_extension(modname) ): return self._build_stub_module(modname) try: named_module = load_module_from_name(modname) except Exception as e: raise AstroidImportError( "Loading {modname} failed with:\n{error}", modname=modname, path=found_spec.location, ) from e return self.ast_from_module(named_module, modname) elif found_spec.type == spec.ModuleType.PY_COMPILED: raise AstroidImportError( "Unable to load compiled module {modname}.", modname=modname, path=found_spec.location, ) elif found_spec.type == spec.ModuleType.PY_NAMESPACE: return self._build_namespace_module( modname, found_spec.submodule_search_locations or [] ) elif found_spec.type == spec.ModuleType.PY_FROZEN: if found_spec.location is None: return self._build_stub_module(modname) # For stdlib frozen modules we can determine the location and # can therefore create a module from the source file return self.ast_from_file(found_spec.location, modname, fallback=False) if found_spec.location is None: raise AstroidImportError( "Can't find a file for module {modname}.", modname=modname ) return self.ast_from_file(found_spec.location, modname, fallback=False) 
except AstroidBuildingError as e: for hook in self._failed_import_hooks: try: return hook(modname) except AstroidBuildingError: pass raise e finally: if context_file: os.chdir(old_cwd) def zip_import_data(self, filepath: str) -> nodes.Module | None: if zipimport is None: return None # pylint: disable=import-outside-toplevel; circular import from astroid.builder import AstroidBuilder builder = AstroidBuilder(self) for ext in ZIP_IMPORT_EXTS: try: eggpath, resource = filepath.rsplit(ext + os.path.sep, 1) except ValueError: continue try: importer = zipimport.zipimporter(eggpath + ext) zmodname = resource.replace(os.path.sep, ".") if importer.is_package(resource): zmodname = zmodname + ".__init__" module = builder.string_build( importer.get_source(resource), zmodname, filepath ) return module except Exception: # pylint: disable=broad-except continue return None def file_from_module_name( self, modname: str, contextfile: str | None ) -> spec.ModuleSpec: try: value = self._mod_file_cache[(modname, contextfile)] except KeyError: try: value = file_info_from_modpath( modname.split("."), context_file=contextfile ) except ImportError as e: value = AstroidImportError( "Failed to import module {modname} with error:\n{error}.", modname=modname, # we remove the traceback here to save on memory usage (since these exceptions are cached) error=e.with_traceback(None), ) self._mod_file_cache[(modname, contextfile)] = value if isinstance(value, AstroidBuildingError): # we remove the traceback here to save on memory usage (since these exceptions are cached) raise value.with_traceback(None) # pylint: disable=no-member return value def ast_from_module( self, module: types.ModuleType, modname: str | None = None ) -> nodes.Module: """Given an imported module, return the astroid object.""" modname = modname or module.__name__ if modname in self.astroid_cache: return self.astroid_cache[modname] try: # some builtin modules don't have __file__ attribute filepath = module.__file__ if 
is_python_source(filepath): # Type is checked in is_python_source return self.ast_from_file(filepath, modname) # type: ignore[arg-type] except AttributeError: pass # pylint: disable=import-outside-toplevel; circular import from astroid.builder import AstroidBuilder return AstroidBuilder(self).module_build(module, modname) def ast_from_class(self, klass: type, modname: str | None = None) -> nodes.ClassDef: """Get astroid for the given class.""" if modname is None: try: modname = klass.__module__ except AttributeError as exc: raise AstroidBuildingError( "Unable to get module for class {class_name}.", cls=klass, class_repr=safe_repr(klass), modname=modname, ) from exc modastroid = self.ast_from_module_name(modname) ret = modastroid.getattr(klass.__name__)[0] assert isinstance(ret, nodes.ClassDef) return ret def infer_ast_from_something( self, obj: object, context: InferenceContext | None = None ) -> Iterator[InferenceResult]: """Infer astroid for the given class.""" if hasattr(obj, "__class__") and not isinstance(obj, type): klass = obj.__class__ elif isinstance(obj, type): klass = obj else: raise AstroidBuildingError( # pragma: no cover "Unable to get type for {class_repr}.", cls=None, class_repr=safe_repr(obj), ) try: modname = klass.__module__ except AttributeError as exc: raise AstroidBuildingError( "Unable to get module for {class_repr}.", cls=klass, class_repr=safe_repr(klass), ) from exc except Exception as exc: raise AstroidImportError( "Unexpected error while retrieving module for {class_repr}:\n" "{error}", cls=klass, class_repr=safe_repr(klass), ) from exc try: name = klass.__name__ except AttributeError as exc: raise AstroidBuildingError( "Unable to get name for {class_repr}:\n", cls=klass, class_repr=safe_repr(klass), ) from exc except Exception as exc: raise AstroidImportError( "Unexpected error while retrieving name for {class_repr}:\n{error}", cls=klass, class_repr=safe_repr(klass), ) from exc # take care, on living object __module__ is regularly wrong 
:( modastroid = self.ast_from_module_name(modname) if klass is obj: yield from modastroid.igetattr(name, context) else: for inferred in modastroid.igetattr(name, context): yield inferred.instantiate_class() def register_failed_import_hook(self, hook: Callable[[str], nodes.Module]) -> None: """Registers a hook to resolve imports that cannot be found otherwise. `hook` must be a function that accepts a single argument `modname` which contains the name of the module or package that could not be imported. If `hook` can resolve the import, must return a node of type `astroid.Module`, otherwise, it must raise `AstroidBuildingError`. """ self._failed_import_hooks.append(hook) def cache_module(self, module: nodes.Module) -> None: """Cache a module if no module with the same name is known yet.""" self.astroid_cache.setdefault(module.name, module) def bootstrap(self) -> None: """Bootstrap the required AST modules needed for the manager to work. The bootstrap usually involves building the AST for the builtins module, which is required by the rest of astroid to work correctly. """ from astroid import raw_building # pylint: disable=import-outside-toplevel raw_building._astroid_bootstrapping() def clear_cache(self) -> None: """Clear the underlying caches, bootstrap the builtins module and re-register transforms. 
""" # import here because of cyclic imports # pylint: disable=import-outside-toplevel from astroid.brain.helpers import register_all_brains from astroid.inference_tip import clear_inference_tip_cache from astroid.interpreter._import.spec import _find_spec from astroid.interpreter.objectmodel import ObjectModel from astroid.nodes._base_nodes import LookupMixIn from astroid.nodes.scoped_nodes import ClassDef clear_inference_tip_cache() _invalidate_cache() # inference context cache self.astroid_cache.clear() # NB: not a new TransformVisitor() AstroidManager.brain["_transform"].transforms = collections.defaultdict(list) for lru_cache in ( LookupMixIn.lookup, _cache_normalize_path_, util.is_namespace, ObjectModel.attributes, ClassDef._metaclass_lookup_attribute, _find_spec, ): lru_cache.cache_clear() # type: ignore[attr-defined] self.bootstrap() # Reload brain plugins. During initialisation this is done in astroid.manager.py register_all_brains(self) astroid-3.2.2/astroid/test_utils.py0000664000175000017500000000465214622475517017333 0ustar epsilonepsilon# Licensed under the LGPL: https://www.gnu.org/licenses/old-licenses/lgpl-2.1.en.html # For details: https://github.com/pylint-dev/astroid/blob/main/LICENSE # Copyright (c) https://github.com/pylint-dev/astroid/blob/main/CONTRIBUTORS.txt """Utility functions for test code that uses astroid ASTs as input.""" from __future__ import annotations import contextlib import functools import sys import warnings from collections.abc import Callable import pytest from astroid import manager, nodes, transforms def require_version(minver: str = "0.0.0", maxver: str = "4.0.0") -> Callable: """Compare version of python interpreter to the given one and skips the test if older.""" def parse(python_version: str) -> tuple[int, ...]: try: return tuple(int(v) for v in python_version.split(".")) except ValueError as e: msg = f"{python_version} is not a correct version : should be X.Y[.Z]." 
raise ValueError(msg) from e min_version = parse(minver) max_version = parse(maxver) def check_require_version(f): current: tuple[int, int, int] = sys.version_info[:3] if min_version < current <= max_version: return f version: str = ".".join(str(v) for v in sys.version_info) @functools.wraps(f) def new_f(*args, **kwargs): if current <= min_version: pytest.skip(f"Needs Python > {minver}. Current version is {version}.") elif current > max_version: pytest.skip(f"Needs Python <= {maxver}. Current version is {version}.") return new_f return check_require_version def get_name_node(start_from, name, index=0): return [n for n in start_from.nodes_of_class(nodes.Name) if n.name == name][index] @contextlib.contextmanager def enable_warning(warning): warnings.simplefilter("always", warning) try: yield finally: # Reset it to default value, so it will take # into account the values from the -W flag. warnings.simplefilter("default", warning) def brainless_manager(): m = manager.AstroidManager() # avoid caching into the AstroidManager borg since we get problems # with other tests : m.__dict__ = {} m._failed_import_hooks = [] m.astroid_cache = {} m._mod_file_cache = {} m._transform = transforms.TransformVisitor() m.extension_package_whitelist = set() m.module_denylist = set() return m astroid-3.2.2/astroid/helpers.py0000664000175000017500000002721614622475517016577 0ustar epsilonepsilon# Licensed under the LGPL: https://www.gnu.org/licenses/old-licenses/lgpl-2.1.en.html # For details: https://github.com/pylint-dev/astroid/blob/main/LICENSE # Copyright (c) https://github.com/pylint-dev/astroid/blob/main/CONTRIBUTORS.txt """Various helper utilities.""" from __future__ import annotations import warnings from collections.abc import Generator from astroid import bases, manager, nodes, objects, raw_building, util from astroid.context import CallContext, InferenceContext from astroid.exceptions import ( AstroidTypeError, AttributeInferenceError, InferenceError, MroError, 
_NonDeducibleTypeHierarchy, ) from astroid.nodes import scoped_nodes from astroid.typing import InferenceResult from astroid.util import safe_infer as real_safe_infer def safe_infer( node: nodes.NodeNG | bases.Proxy | util.UninferableBase, context: InferenceContext | None = None, ) -> InferenceResult | None: # When removing, also remove the real_safe_infer alias warnings.warn( "Import safe_infer from astroid.util; this shim in astroid.helpers will be removed.", DeprecationWarning, stacklevel=2, ) return real_safe_infer(node, context=context) def _build_proxy_class(cls_name: str, builtins: nodes.Module) -> nodes.ClassDef: proxy = raw_building.build_class(cls_name) proxy.parent = builtins return proxy def _function_type( function: nodes.Lambda | nodes.FunctionDef | bases.UnboundMethod, builtins: nodes.Module, ) -> nodes.ClassDef: if isinstance(function, (scoped_nodes.Lambda, scoped_nodes.FunctionDef)): if function.root().name == "builtins": cls_name = "builtin_function_or_method" else: cls_name = "function" elif isinstance(function, bases.BoundMethod): cls_name = "method" else: cls_name = "function" return _build_proxy_class(cls_name, builtins) def _object_type( node: InferenceResult, context: InferenceContext | None = None ) -> Generator[InferenceResult | None, None, None]: astroid_manager = manager.AstroidManager() builtins = astroid_manager.builtins_module context = context or InferenceContext() for inferred in node.infer(context=context): if isinstance(inferred, scoped_nodes.ClassDef): if inferred.newstyle: metaclass = inferred.metaclass(context=context) if metaclass: yield metaclass continue yield builtins.getattr("type")[0] elif isinstance( inferred, (scoped_nodes.Lambda, bases.UnboundMethod, scoped_nodes.FunctionDef), ): yield _function_type(inferred, builtins) elif isinstance(inferred, scoped_nodes.Module): yield _build_proxy_class("module", builtins) elif isinstance(inferred, nodes.Unknown): raise InferenceError elif isinstance(inferred, 
util.UninferableBase):
            yield inferred
        elif isinstance(inferred, (bases.Proxy, nodes.Slice, objects.Super)):
            yield inferred._proxied
        else:  # pragma: no cover
            raise AssertionError(f"We don't handle {type(inferred)} currently")


def object_type(
    node: InferenceResult, context: InferenceContext | None = None
) -> InferenceResult | None:
    """Obtain the type of the given node.

    This is used to implement the ``type`` builtin, which means that it's
    used for inferring type calls, as well as used in a couple of other places
    in the inference.

    The node will be inferred first, so this function can support all
    sorts of objects, as long as they support inference.
    """
    try:
        types = set(_object_type(node, context))
    except InferenceError:
        return util.Uninferable
    if len(types) > 1 or not types:
        return util.Uninferable
    return next(iter(types))


def _object_type_is_subclass(
    obj_type: InferenceResult | None,
    class_or_seq: list[InferenceResult],
    context: InferenceContext | None = None,
) -> util.UninferableBase | bool:
    if isinstance(obj_type, util.UninferableBase) or not isinstance(
        obj_type, nodes.ClassDef
    ):
        return util.Uninferable

    # Instances are not types
    class_seq = [
        item if not isinstance(item, bases.Instance) else util.Uninferable
        for item in class_or_seq
    ]
    # strict compatibility with issubclass
    # issubclass(type, (object, 1)) evaluates to true
    # issubclass(object, (1, type)) raises TypeError
    for klass in class_seq:
        if isinstance(klass, util.UninferableBase):
            raise AstroidTypeError("arg 2 must be a type or tuple of types")

        for obj_subclass in obj_type.mro():
            if obj_subclass == klass:
                return True
    return False


def object_isinstance(
    node: InferenceResult,
    class_or_seq: list[InferenceResult],
    context: InferenceContext | None = None,
) -> util.UninferableBase | bool:
    """Check if a node 'isinstance' any node in class_or_seq.

    :raises AstroidTypeError: if the given ``classes_or_seq`` are not types
    """
    obj_type = object_type(node, context)
    if isinstance(obj_type, util.UninferableBase):
        return util.Uninferable
    return _object_type_is_subclass(obj_type, class_or_seq, context=context)


def object_issubclass(
    node: nodes.NodeNG,
    class_or_seq: list[InferenceResult],
    context: InferenceContext | None = None,
) -> util.UninferableBase | bool:
    """Check if a type is a subclass of any node in class_or_seq.

    :raises AstroidTypeError: if the given ``classes_or_seq`` are not types
    :raises AstroidError: if the type of the given node cannot be inferred
        or its type's mro doesn't work
    """
    if not isinstance(node, nodes.ClassDef):
        raise TypeError(f"{node} needs to be a ClassDef node")
    return _object_type_is_subclass(node, class_or_seq, context=context)


def has_known_bases(klass, context: InferenceContext | None = None) -> bool:
    """Return whether all base classes of a class could be inferred."""
    try:
        return klass._all_bases_known
    except AttributeError:
        pass
    for base in klass.bases:
        result = real_safe_infer(base, context=context)
        # TODO: check for A->B->A->B pattern in class structure too?
        if (
            not isinstance(result, scoped_nodes.ClassDef)
            or result is klass
            or not has_known_bases(result, context=context)
        ):
            klass._all_bases_known = False
            return False
    klass._all_bases_known = True
    return True


def _type_check(type1, type2) -> bool:
    if not all(map(has_known_bases, (type1, type2))):
        raise _NonDeducibleTypeHierarchy

    if not all([type1.newstyle, type2.newstyle]):
        return False
    try:
        return type1 in type2.mro()[:-1]
    except MroError as e:
        # The MRO is invalid.
        raise _NonDeducibleTypeHierarchy from e


def is_subtype(type1, type2) -> bool:
    """Check if *type1* is a subtype of *type2*."""
    return _type_check(type1=type2, type2=type1)


def is_supertype(type1, type2) -> bool:
    """Check if *type2* is a supertype of *type1*."""
    return _type_check(type1, type2)


def class_instance_as_index(node: bases.Instance) -> nodes.Const | None:
    """Get the value as an index for the given instance.

    If an instance provides an __index__ method, then it can
    be used in some scenarios where an integer is expected,
    for instance when multiplying or subscripting a list.
    """
    context = InferenceContext()
    try:
        for inferred in node.igetattr("__index__", context=context):
            if not isinstance(inferred, bases.BoundMethod):
                continue

            context.boundnode = node
            context.callcontext = CallContext(args=[], callee=inferred)
            for result in inferred.infer_call_result(node, context=context):
                if isinstance(result, nodes.Const) and isinstance(result.value, int):
                    return result
    except InferenceError:
        pass
    return None


def object_len(node, context: InferenceContext | None = None):
    """Infer length of given node object.
    :param node: Node to infer length of
    :type node: Union[nodes.ClassDef, nodes.Instance]

    :raises AstroidTypeError: If an invalid node is returned
        from __len__ method or no __len__ method exists
    :raises InferenceError: If the given node cannot be inferred
        or if multiple nodes are inferred or if the code executed in python
        would result in an infinite recursive check for length
    :rtype: int
    :return: Integer length of node
    """
    # pylint: disable=import-outside-toplevel; circular import
    from astroid.objects import FrozenSet

    inferred_node = real_safe_infer(node, context=context)

    # prevent self referential length calls from causing a recursion error
    # see https://github.com/pylint-dev/astroid/issues/777
    node_frame = node.frame()
    if (
        isinstance(node_frame, scoped_nodes.FunctionDef)
        and node_frame.name == "__len__"
        and isinstance(inferred_node, bases.Proxy)
        and inferred_node._proxied == node_frame.parent
    ):
        message = (
            "Self referential __len__ function will "
            "cause a RecursionError on line {} of {}".format(
                node.lineno, node.root().file
            )
        )
        raise InferenceError(message)

    if inferred_node is None or isinstance(inferred_node, util.UninferableBase):
        raise InferenceError(node=node)
    if isinstance(inferred_node, nodes.Const) and isinstance(
        inferred_node.value, (bytes, str)
    ):
        return len(inferred_node.value)
    if isinstance(inferred_node, (nodes.List, nodes.Set, nodes.Tuple, FrozenSet)):
        return len(inferred_node.elts)
    if isinstance(inferred_node, nodes.Dict):
        return len(inferred_node.items)

    node_type = object_type(inferred_node, context=context)
    if not node_type:
        raise InferenceError(node=node)

    try:
        len_call = next(node_type.igetattr("__len__", context=context))
    except StopIteration as e:
        raise AstroidTypeError(str(e)) from e
    except AttributeInferenceError as e:
        raise AstroidTypeError(
            f"object of type '{node_type.pytype()}' has no len()"
        ) from e

    inferred = len_call.infer_call_result(node, context)
    if isinstance(inferred, util.UninferableBase):
        raise InferenceError(node=node, context=context)
    result_of_len = next(inferred, None)
    if (
        isinstance(result_of_len, nodes.Const)
        and result_of_len.pytype() == "builtins.int"
    ):
        return result_of_len.value
    if (
        result_of_len is None
        or isinstance(result_of_len, bases.Instance)
        and result_of_len.is_subtype_of("builtins.int")
    ):
        # Fake a result as we don't know the arguments of the instance call.
        return 0
    raise AstroidTypeError(
        f"'{result_of_len}' object cannot be interpreted as an integer"
    )


def _higher_function_scope(node: nodes.NodeNG) -> nodes.FunctionDef | None:
    """Search for the first function which encloses the given scope.

    This can be used for looking up in that function's
    scope, in case looking up in a lower scope for a particular
    name fails.

    :param node: A scope node.
    :returns:
        ``None``, if no parent function scope was found,
        otherwise an instance of :class:`astroid.nodes.scoped_nodes.Function`,
        which encloses the given node.
    """
    current = node
    while current.parent and not isinstance(current.parent, nodes.FunctionDef):
        current = current.parent
    if current and current.parent:
        return current.parent
    return None


# ==== astroid-3.2.2/astroid/transforms.py ====
# Licensed under the LGPL: https://www.gnu.org/licenses/old-licenses/lgpl-2.1.en.html
# For details: https://github.com/pylint-dev/astroid/blob/main/LICENSE
# Copyright (c) https://github.com/pylint-dev/astroid/blob/main/CONTRIBUTORS.txt

from __future__ import annotations

import warnings
from collections import defaultdict
from collections.abc import Callable
from typing import TYPE_CHECKING, List, Optional, Tuple, TypeVar, Union, cast, overload

from astroid.context import _invalidate_cache
from astroid.typing import SuccessfulInferenceResult, TransformFn

if TYPE_CHECKING:
    from astroid import nodes

_SuccessfulInferenceResultT = TypeVar(
    "_SuccessfulInferenceResultT", bound=SuccessfulInferenceResult
)
_Predicate = Optional[Callable[[_SuccessfulInferenceResultT], bool]]

_Vistables = Union[
    "nodes.NodeNG", List["nodes.NodeNG"], Tuple["nodes.NodeNG", ...], str, None
]
_VisitReturns = Union[
    SuccessfulInferenceResult,
    List[SuccessfulInferenceResult],
    Tuple[SuccessfulInferenceResult, ...],
    str,
    None,
]


class TransformVisitor:
    """A visitor for handling transforms.

    The standard approach of using it is to call
    :meth:`~visit` with an *astroid* module and the class
    will take care of the rest, walking the tree and running the
    transforms for each encountered node.

    Based on its usage in AstroidManager.brain, it should not be reinstantiated.
    """

    def __init__(self) -> None:
        # The typing here is incorrect, but it's the best we can do
        # Refer to register_transform and unregister_transform for the correct types
        self.transforms: defaultdict[
            type[SuccessfulInferenceResult],
            list[
                tuple[
                    TransformFn[SuccessfulInferenceResult],
                    _Predicate[SuccessfulInferenceResult],
                ]
            ],
        ] = defaultdict(list)

    def _transform(self, node: SuccessfulInferenceResult) -> SuccessfulInferenceResult:
        """Call matching transforms for the given node if any and return the
        transformed node.
        """
        cls = node.__class__

        for transform_func, predicate in self.transforms[cls]:
            if predicate is None or predicate(node):
                ret = transform_func(node)
                # if the transformation function returns something, it's
                # expected to be a replacement for the node
                if ret is not None:
                    _invalidate_cache()
                    node = ret

                    if ret.__class__ != cls:
                        # Can no longer apply the rest of the transforms.
                        break
        return node

    def _visit(self, node: nodes.NodeNG) -> SuccessfulInferenceResult:
        for name in node._astroid_fields:
            value = getattr(node, name)
            value = cast(_Vistables, value)
            visited = self._visit_generic(value)
            if visited != value:
                setattr(node, name, visited)
        return self._transform(node)

    @overload
    def _visit_generic(self, node: None) -> None: ...

    @overload
    def _visit_generic(self, node: str) -> str: ...

    @overload
    def _visit_generic(
        self, node: list[nodes.NodeNG]
    ) -> list[SuccessfulInferenceResult]: ...
    @overload
    def _visit_generic(
        self, node: tuple[nodes.NodeNG, ...]
    ) -> tuple[SuccessfulInferenceResult, ...]: ...

    @overload
    def _visit_generic(self, node: nodes.NodeNG) -> SuccessfulInferenceResult: ...

    def _visit_generic(self, node: _Vistables) -> _VisitReturns:
        if isinstance(node, list):
            return [self._visit_generic(child) for child in node]
        if isinstance(node, tuple):
            return tuple(self._visit_generic(child) for child in node)
        if not node or isinstance(node, str):
            return node

        try:
            return self._visit(node)
        except RecursionError:
            # Returning the node untransformed is better than giving up.
            warnings.warn(
                f"Astroid was unable to transform {node}.\n"
                "Some functionality will be missing unless the system recursion limit is lifted.\n"
                "From pylint, try: --init-hook='import sys; sys.setrecursionlimit(2000)' or higher.",
                UserWarning,
                stacklevel=0,
            )
            return node

    def register_transform(
        self,
        node_class: type[_SuccessfulInferenceResultT],
        transform: TransformFn[_SuccessfulInferenceResultT],
        predicate: _Predicate[_SuccessfulInferenceResultT] | None = None,
    ) -> None:
        """Register `transform(node)` function to be applied on the given node.

        The transform will only be applied if `predicate` is None or returns true
        when called with the node as argument.

        The transform function may return a value which is then used to
        substitute the original node in the tree.
        """
        self.transforms[node_class].append((transform, predicate))  # type: ignore[index, arg-type]

    def unregister_transform(
        self,
        node_class: type[_SuccessfulInferenceResultT],
        transform: TransformFn[_SuccessfulInferenceResultT],
        predicate: _Predicate[_SuccessfulInferenceResultT] | None = None,
    ) -> None:
        """Unregister the given transform."""
        self.transforms[node_class].remove((transform, predicate))  # type: ignore[index, arg-type]

    def visit(self, node: nodes.NodeNG) -> SuccessfulInferenceResult:
        """Walk the given astroid *tree* and transform each encountered node.
        Only the nodes which have transforms registered will actually
        be replaced or changed.
        """
        return self._visit(node)


# ==== astroid-3.2.2/astroid/__init__.py ====
# Licensed under the LGPL: https://www.gnu.org/licenses/old-licenses/lgpl-2.1.en.html
# For details: https://github.com/pylint-dev/astroid/blob/main/LICENSE
# Copyright (c) https://github.com/pylint-dev/astroid/blob/main/CONTRIBUTORS.txt

"""Python Abstract Syntax Tree New Generation.

The aim of this module is to provide a common base representation of
python source code for projects such as pychecker, pyreverse, pylint...
Well, actually the development of this library is essentially governed by
pylint's needs.

It mimics the class defined in the python's _ast module with some
additional methods and attributes. New node instances are not fully
compatible with python's _ast.

Instance attributes are added by a builder object, which can either generate
extended ast (let's call them astroid ;) by visiting an existing ast tree or
by inspecting living objects.

Main modules are:

* nodes and scoped_nodes for more information about methods and
  attributes added to different node classes

* the manager contains a high level object to get astroid trees from
  source files and living objects. It maintains a cache of previously
  constructed trees for quick access

* builder contains the class responsible for building astroid trees
"""

import functools
import tokenize

# isort: off
# We have an isort: off on 'astroid.nodes' because of a circular import.
from astroid.nodes import node_classes, scoped_nodes

# isort: on

from astroid import raw_building
from astroid.__pkginfo__ import __version__, version
from astroid.bases import BaseInstance, BoundMethod, Instance, UnboundMethod
from astroid.brain.helpers import register_module_extender
from astroid.builder import extract_node, parse
from astroid.const import PY310_PLUS, Context
from astroid.exceptions import (
    AstroidBuildingError,
    AstroidError,
    AstroidImportError,
    AstroidIndexError,
    AstroidSyntaxError,
    AstroidTypeError,
    AstroidValueError,
    AttributeInferenceError,
    DuplicateBasesError,
    InconsistentMroError,
    InferenceError,
    InferenceOverwriteError,
    MroError,
    NameInferenceError,
    NoDefault,
    NotFoundError,
    ParentMissingError,
    ResolveError,
    StatementMissing,
    SuperArgumentTypeError,
    SuperError,
    TooManyLevelsError,
    UnresolvableName,
    UseInferenceDefault,
)
from astroid.inference_tip import _inference_tip_cached, inference_tip
from astroid.objects import ExceptionInstance

# isort: off
# It's impossible to import from astroid.nodes with a wildcard, because
# there is a cyclic import that prevents creating an __all__ in astroid/nodes
# and we need astroid/scoped_nodes and astroid/node_classes to work. So
# importing with a wildcard would clash with astroid/nodes/scoped_nodes
# and astroid/nodes/node_classes.
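As an aside, the `TransformVisitor` defined in `transforms.py` above keeps a per-class registry of `(transform, predicate)` pairs, and a transform's non-`None` return value replaces the node in the tree. The stdlib-only sketch below illustrates that registry logic with hypothetical names (`MiniTransformVisitor`, `Num` are illustrative, not part of astroid):

```python
from collections import defaultdict


class MiniTransformVisitor:
    """Toy mirror of TransformVisitor: per-class lists of (transform, predicate)."""

    def __init__(self):
        self.transforms = defaultdict(list)

    def register_transform(self, node_class, transform, predicate=None):
        self.transforms[node_class].append((transform, predicate))

    def visit(self, node):
        cls = type(node)
        for transform, predicate in self.transforms[cls]:
            if predicate is None or predicate(node):
                ret = transform(node)
                # A non-None return value is a replacement for the node.
                if ret is not None:
                    node = ret
                    if type(ret) is not cls:
                        # Class changed: remaining transforms no longer apply.
                        break
        return node


class Num:
    def __init__(self, value):
        self.value = value


v = MiniTransformVisitor()
# Double positive numbers only; the predicate gates the transform.
v.register_transform(Num, lambda n: Num(n.value * 2), predicate=lambda n: n.value > 0)
print(v.visit(Num(3)).value)   # 6
print(v.visit(Num(-3)).value)  # -3 (predicate rejects, node unchanged)
```

The real class additionally recurses over `_astroid_fields` and invalidates astroid's inference cache on every replacement; this sketch only shows the dispatch-and-replace core.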
from astroid.astroid_manager import MANAGER
from astroid.nodes import (
    CONST_CLS,
    AnnAssign,
    Arguments,
    Assert,
    Assign,
    AssignAttr,
    AssignName,
    AsyncFor,
    AsyncFunctionDef,
    AsyncWith,
    Attribute,
    AugAssign,
    Await,
    BinOp,
    BoolOp,
    Break,
    Call,
    ClassDef,
    Compare,
    Comprehension,
    ComprehensionScope,
    Const,
    Continue,
    Decorators,
    DelAttr,
    Delete,
    DelName,
    Dict,
    DictComp,
    DictUnpack,
    EmptyNode,
    EvaluatedObject,
    ExceptHandler,
    Expr,
    For,
    FormattedValue,
    FunctionDef,
    GeneratorExp,
    Global,
    If,
    IfExp,
    Import,
    ImportFrom,
    JoinedStr,
    Keyword,
    Lambda,
    List,
    ListComp,
    Match,
    MatchAs,
    MatchCase,
    MatchClass,
    MatchMapping,
    MatchOr,
    MatchSequence,
    MatchSingleton,
    MatchStar,
    MatchValue,
    Module,
    Name,
    NamedExpr,
    NodeNG,
    Nonlocal,
    ParamSpec,
    Pass,
    Raise,
    Return,
    Set,
    SetComp,
    Slice,
    Starred,
    Subscript,
    Try,
    TryStar,
    Tuple,
    TypeAlias,
    TypeVar,
    TypeVarTuple,
    UnaryOp,
    Unknown,
    While,
    With,
    Yield,
    YieldFrom,
    are_exclusive,
    builtin_lookup,
    unpack_infer,
    function_to_method,
)

# isort: on

from astroid.util import Uninferable

# Performance hack for tokenize. See https://bugs.python.org/issue43014
# Adapted from https://github.com/PyCQA/pycodestyle/pull/993
if (
    not PY310_PLUS
    and callable(getattr(tokenize, "_compile", None))
    and getattr(tokenize._compile, "__wrapped__", None) is None  # type: ignore[attr-defined]
):
    tokenize._compile = functools.lru_cache(tokenize._compile)  # type: ignore[attr-defined]


# ==== astroid-3.2.2/astroid/__pkginfo__.py ====
# Licensed under the LGPL: https://www.gnu.org/licenses/old-licenses/lgpl-2.1.en.html
# For details: https://github.com/pylint-dev/astroid/blob/main/LICENSE
# Copyright (c) https://github.com/pylint-dev/astroid/blob/main/CONTRIBUTORS.txt

__version__ = "3.2.2"
version = __version__


# ==== astroid-3.2.2/astroid/constraint.py ====
# Licensed under the LGPL: https://www.gnu.org/licenses/old-licenses/lgpl-2.1.en.html
# For details: https://github.com/pylint-dev/astroid/blob/main/LICENSE
# Copyright (c) https://github.com/pylint-dev/astroid/blob/main/CONTRIBUTORS.txt

"""Classes representing different types of constraints on inference values."""

from __future__ import annotations

import sys
from abc import ABC, abstractmethod
from collections.abc import Iterator
from typing import TYPE_CHECKING, Union

from astroid import nodes, util
from astroid.typing import InferenceResult

if sys.version_info >= (3, 11):
    from typing import Self
else:
    from typing_extensions import Self

if TYPE_CHECKING:
    from astroid import bases

_NameNodes = Union[nodes.AssignAttr, nodes.Attribute, nodes.AssignName, nodes.Name]


class Constraint(ABC):
    """Represents a single constraint on a variable."""

    def __init__(self, node: nodes.NodeNG, negate: bool) -> None:
        self.node = node
        """The node that this constraint applies to."""
        self.negate = negate
        """True if this constraint is negated.
        E.g., "is not" instead of "is"."""

    @classmethod
    @abstractmethod
    def match(
        cls, node: _NameNodes, expr: nodes.NodeNG, negate: bool = False
    ) -> Self | None:
        """Return a new constraint for node matched from expr, if expr matches
        the constraint pattern.

        If negate is True, negate the constraint.
        """

    @abstractmethod
    def satisfied_by(self, inferred: InferenceResult) -> bool:
        """Return True if this constraint is satisfied by the given inferred value."""


class NoneConstraint(Constraint):
    """Represents an "is None" or "is not None" constraint."""

    CONST_NONE: nodes.Const = nodes.Const(None)

    @classmethod
    def match(
        cls, node: _NameNodes, expr: nodes.NodeNG, negate: bool = False
    ) -> Self | None:
        """Return a new constraint for node matched from expr, if expr matches
        the constraint pattern.

        Negate the constraint based on the value of negate.
        """
        if isinstance(expr, nodes.Compare) and len(expr.ops) == 1:
            left = expr.left
            op, right = expr.ops[0]
            if op in {"is", "is not"} and (
                _matches(left, node) and _matches(right, cls.CONST_NONE)
            ):
                negate = (op == "is" and negate) or (op == "is not" and not negate)
                return cls(node=node, negate=negate)

        return None

    def satisfied_by(self, inferred: InferenceResult) -> bool:
        """Return True if this constraint is satisfied by the given inferred value."""
        # Assume true if uninferable
        if isinstance(inferred, util.UninferableBase):
            return True

        # Return the XOR of self.negate and matches(inferred, self.CONST_NONE)
        return self.negate ^ _matches(inferred, self.CONST_NONE)


def get_constraints(
    expr: _NameNodes, frame: nodes.LocalsDictNodeNG
) -> dict[nodes.If, set[Constraint]]:
    """Returns the constraints for the given expression.

    The returned dictionary maps the node where the constraint was generated
    to the corresponding constraint(s).

    Constraints are computed statically by analysing the code surrounding expr.
    Currently this only supports constraints generated from if conditions.
    """
    current_node: nodes.NodeNG | None = expr
    constraints_mapping: dict[nodes.If, set[Constraint]] = {}
    while current_node is not None and current_node is not frame:
        parent = current_node.parent
        if isinstance(parent, nodes.If):
            branch, _ = parent.locate_child(current_node)
            constraints: set[Constraint] | None = None
            if branch == "body":
                constraints = set(_match_constraint(expr, parent.test))
            elif branch == "orelse":
                constraints = set(_match_constraint(expr, parent.test, invert=True))

            if constraints:
                constraints_mapping[parent] = constraints
        current_node = parent

    return constraints_mapping


ALL_CONSTRAINT_CLASSES = frozenset((NoneConstraint,))
"""All supported constraint types."""


def _matches(node1: nodes.NodeNG | bases.Proxy, node2: nodes.NodeNG) -> bool:
    """Returns True if the two nodes match."""
    if isinstance(node1, nodes.Name) and isinstance(node2, nodes.Name):
        return node1.name == node2.name
    if isinstance(node1, nodes.Attribute) and isinstance(node2, nodes.Attribute):
        return node1.attrname == node2.attrname and _matches(node1.expr, node2.expr)
    if isinstance(node1, nodes.Const) and isinstance(node2, nodes.Const):
        return node1.value == node2.value

    return False


def _match_constraint(
    node: _NameNodes, expr: nodes.NodeNG, invert: bool = False
) -> Iterator[Constraint]:
    """Yields all constraint patterns for node that match."""
    for constraint_cls in ALL_CONSTRAINT_CLASSES:
        constraint = constraint_cls.match(node, expr, invert)
        if constraint:
            yield constraint


# ==== astroid-3.2.2/astroid/modutils.py ====
# Licensed under the LGPL: https://www.gnu.org/licenses/old-licenses/lgpl-2.1.en.html
# For details: https://github.com/pylint-dev/astroid/blob/main/LICENSE
# Copyright (c) https://github.com/pylint-dev/astroid/blob/main/CONTRIBUTORS.txt

"""Python modules manipulation utility functions.
:type PY_SOURCE_EXTS: tuple(str)
:var PY_SOURCE_EXTS: list of possible python source file extensions

:type STD_LIB_DIRS: set of str
:var STD_LIB_DIRS: directories where standard modules are located

:type BUILTIN_MODULES: dict
:var BUILTIN_MODULES: dictionary with builtin module names as keys
"""

from __future__ import annotations

import importlib
import importlib.machinery
import importlib.util
import io
import itertools
import logging
import os
import sys
import sysconfig
import types
import warnings
from collections.abc import Callable, Iterable, Sequence
from contextlib import redirect_stderr, redirect_stdout
from functools import lru_cache
from pathlib import Path

from astroid.const import IS_JYTHON, IS_PYPY, PY310_PLUS
from astroid.interpreter._import import spec, util

if PY310_PLUS:
    from sys import stdlib_module_names
else:
    from astroid._backport_stdlib_names import stdlib_module_names

logger = logging.getLogger(__name__)


if sys.platform.startswith("win"):
    PY_SOURCE_EXTS = ("py", "pyw", "pyi")
    PY_SOURCE_EXTS_STUBS_FIRST = ("pyi", "pyw", "py")
    PY_COMPILED_EXTS = ("dll", "pyd")
else:
    PY_SOURCE_EXTS = ("py", "pyi")
    PY_SOURCE_EXTS_STUBS_FIRST = ("pyi", "py")
    PY_COMPILED_EXTS = ("so",)


# TODO: Adding `platstdlib` is a fix for a workaround in virtualenv. At some point we should
# revisit whether this is still necessary. See https://github.com/pylint-dev/astroid/pull/1323.
STD_LIB_DIRS = {sysconfig.get_path("stdlib"), sysconfig.get_path("platstdlib")}

if os.name == "nt":
    STD_LIB_DIRS.add(os.path.join(sys.prefix, "dlls"))
    try:
        # real_prefix is defined when running inside virtual environments,
        # created with the **virtualenv** library.
        # Deprecated in virtualenv==16.7.9
        # See: https://github.com/pypa/virtualenv/issues/1622
        STD_LIB_DIRS.add(os.path.join(sys.real_prefix, "dlls"))  # type: ignore[attr-defined]
    except AttributeError:
        # sys.base_exec_prefix is always defined, but in a virtual environment
        # created with the stdlib **venv** module, it points to the original
        # installation, if the virtual env is activated.
        try:
            STD_LIB_DIRS.add(os.path.join(sys.base_exec_prefix, "dlls"))
        except AttributeError:
            pass

if IS_PYPY and sys.version_info < (3, 8):
    # PyPy stores the stdlib in two places: sys.prefix/lib_pypy and sys.prefix/lib-python/3
    # sysconfig.get_path on PyPy returns the first, but without an underscore so we patch this manually.
    # Beginning with 3.8 the stdlib is only stored in: sys.prefix/pypy{py_version_short}
    STD_LIB_DIRS.add(str(Path(sysconfig.get_path("stdlib")).parent / "lib_pypy"))
    STD_LIB_DIRS.add(str(Path(sysconfig.get_path("stdlib")).parent / "lib-python/3"))

    # TODO: This is a fix for a workaround in virtualenv. At some point we should revisit
    # whether this is still necessary. See https://github.com/pylint-dev/astroid/pull/1324.
    STD_LIB_DIRS.add(str(Path(sysconfig.get_path("platstdlib")).parent / "lib_pypy"))
    STD_LIB_DIRS.add(
        str(Path(sysconfig.get_path("platstdlib")).parent / "lib-python/3")
    )

if os.name == "posix":
    # Need the real prefix if we're in a virtualenv, otherwise
    # the usual one will do.
    # Deprecated in virtualenv==16.7.9
    # See: https://github.com/pypa/virtualenv/issues/1622
    try:
        prefix: str = sys.real_prefix  # type: ignore[attr-defined]
    except AttributeError:
        prefix = sys.prefix

    def _posix_path(path: str) -> str:
        base_python = "python%d.%d" % sys.version_info[:2]
        return os.path.join(prefix, path, base_python)

    STD_LIB_DIRS.add(_posix_path("lib"))
    if sys.maxsize > 2**32:
        # This tries to fix a problem with /usr/lib64 builds,
        # where systems are running both 32-bit and 64-bit code
        # on the same machine, which reflects into the places where
        # standard library could be found. More details can be found
        # here http://bugs.python.org/issue1294959.
        # An easy reproducing case would be
        # https://github.com/pylint-dev/pylint/issues/712#issuecomment-163178753
        STD_LIB_DIRS.add(_posix_path("lib64"))

EXT_LIB_DIRS = {sysconfig.get_path("purelib"), sysconfig.get_path("platlib")}
BUILTIN_MODULES = dict.fromkeys(sys.builtin_module_names, True)


class NoSourceFile(Exception):
    """Exception raised when we are not able to get a python
    source file for a precompiled file.
    """


def _normalize_path(path: str) -> str:
    """Resolve symlinks in path and convert to absolute path.

    Note that environment variables and ~ in the path need to be expanded in
    advance.

    This can be cached by using _cache_normalize_path.
    """
    return os.path.normcase(os.path.realpath(path))


def _path_from_filename(filename: str, is_jython: bool = IS_JYTHON) -> str:
    if not is_jython:
        return filename
    head, has_pyclass, _ = filename.partition("$py.class")
    if has_pyclass:
        return head + ".py"
    return filename


def _handle_blacklist(
    blacklist: Sequence[str], dirnames: list[str], filenames: list[str]
) -> None:
    """Remove files/directories in the black list.
    dirnames/filenames are usually from os.walk
    """
    for norecurs in blacklist:
        if norecurs in dirnames:
            dirnames.remove(norecurs)
        elif norecurs in filenames:
            filenames.remove(norecurs)


@lru_cache
def _cache_normalize_path_(path: str) -> str:
    return _normalize_path(path)


def _cache_normalize_path(path: str) -> str:
    """Normalize path with caching."""
    # _module_file calls abspath on every path in sys.path every time it's
    # called; on a larger codebase this easily adds up to half a second just
    # assembling path components. This cache alleviates that.
    if not path:  # don't cache result for ''
        return _normalize_path(path)
    return _cache_normalize_path_(path)


def load_module_from_name(dotted_name: str) -> types.ModuleType:
    """Load a Python module from its name.

    :type dotted_name: str
    :param dotted_name: python name of a module or package

    :raise ImportError: if the module or package is not found

    :rtype: module
    :return: the loaded module
    """
    try:
        return sys.modules[dotted_name]
    except KeyError:
        pass

    # Capture and log anything emitted during import to avoid
    # contaminating JSON reports in pylint
    with redirect_stderr(io.StringIO()) as stderr, redirect_stdout(
        io.StringIO()
    ) as stdout:
        module = importlib.import_module(dotted_name)

    stderr_value = stderr.getvalue()
    if stderr_value:
        logger.error(
            "Captured stderr while importing %s:\n%s", dotted_name, stderr_value
        )
    stdout_value = stdout.getvalue()
    if stdout_value:
        logger.info(
            "Captured stdout while importing %s:\n%s", dotted_name, stdout_value
        )

    return module


def load_module_from_modpath(parts: Sequence[str]) -> types.ModuleType:
    """Load a python module from its split name.

    :param parts:
        python name of a module or package split on '.'

    :raise ImportError: if the module or package is not found

    :return: the loaded module
    """
    return load_module_from_name(".".join(parts))


def load_module_from_file(filepath: str) -> types.ModuleType:
    """Load a Python module from its path.

    :type filepath: str
    :param filepath: path to the python module or package

    :raise ImportError: if the module or package is not found

    :rtype: module
    :return: the loaded module
    """
    modpath = modpath_from_file(filepath)
    return load_module_from_modpath(modpath)


def check_modpath_has_init(path: str, mod_path: list[str]) -> bool:
    """Check there are some __init__.py all along the way."""
    modpath: list[str] = []
    for part in mod_path:
        modpath.append(part)
        path = os.path.join(path, part)
        if not _has_init(path):
            old_namespace = util.is_namespace(".".join(modpath))
            if not old_namespace:
                return False
    return True


def _get_relative_base_path(filename: str, path_to_check: str) -> list[str] | None:
    """Extracts the relative mod path of the file to import from.

    Check if a file is within the passed in path and if so, returns the
    relative mod path from the one passed in.

    If the filename is not in path_to_check, returns None.

    Note this function will look for both abs and realpath of the file;
    this allows finding the relative base path even if the file is a
    symlink of a file in the passed in path.

    Examples:
        _get_relative_base_path("/a/b/c/d.py", "/a/b") ->  ["c","d"]
        _get_relative_base_path("/a/b/c/d.py", "/dev") ->  None
    """
    importable_path = None
    path_to_check = os.path.normcase(path_to_check)
    abs_filename = os.path.abspath(filename)
    if os.path.normcase(abs_filename).startswith(path_to_check):
        importable_path = abs_filename

    real_filename = os.path.realpath(filename)
    if os.path.normcase(real_filename).startswith(path_to_check):
        importable_path = real_filename

    if importable_path:
        base_path = os.path.splitext(importable_path)[0]
        relative_base_path = base_path[len(path_to_check) :]
        return [pkg for pkg in relative_base_path.split(os.sep) if pkg]

    return None


def modpath_from_file_with_callback(
    filename: str,
    path: Sequence[str] | None = None,
    is_package_cb: Callable[[str, list[str]], bool] | None = None,
) -> list[str]:
    filename = os.path.expanduser(_path_from_filename(filename))
    paths_to_check = sys.path.copy()
    if path:
        paths_to_check += path
    for pathname in itertools.chain(
        paths_to_check, map(_cache_normalize_path, paths_to_check)
    ):
        if not pathname:
            continue
        modpath = _get_relative_base_path(filename, pathname)
        if not modpath:
            continue
        assert is_package_cb is not None
        if is_package_cb(pathname, modpath[:-1]):
            return modpath

    raise ImportError(
        "Unable to find module for {} in {}".format(filename, ", \n".join(sys.path))
    )


def modpath_from_file(filename: str, path: Sequence[str] | None = None) -> list[str]:
    """Get the corresponding split module's name from a filename.

    This function will return the name of a module or package split on `.`.

    :type filename: str
    :param filename: file's path for which we want the module's name

    :type Optional[List[str]] path:
        Optional list of paths where the module or package should be
        searched (use sys.path if nothing or None is given)

    :raise ImportError:
        if the corresponding module's name has not been found

    :rtype: list(str)
    :return: the corresponding split module's name
    """
    return modpath_from_file_with_callback(filename, path, check_modpath_has_init)


def file_from_modpath(
    modpath: list[str],
    path: Sequence[str] | None = None,
    context_file: str | None = None,
) -> str | None:
    return file_info_from_modpath(modpath, path, context_file).location


def file_info_from_modpath(
    modpath: list[str],
    path: Sequence[str] | None = None,
    context_file: str | None = None,
) -> spec.ModuleSpec:
    """Given a mod path (i.e. split module / package name), return the
    corresponding file.

    Giving priority to source file over precompiled file if it exists.

    :param modpath:
        split module's name (i.e name of a module or package split
        on '.')
        (this means explicit relative imports that start with dots have
        empty strings in this list!)
:param path: optional list of path where the module or package should be searched (use sys.path if nothing or None is given) :param context_file: context file to consider, necessary if the identifier has been introduced using a relative import unresolvable in the actual context (i.e. modutils) :raise ImportError: if there is no such module in the directory :return: the path to the module's file or None if it's an integrated builtin module such as 'sys' """ if context_file is not None: context: str | None = os.path.dirname(context_file) else: context = context_file if modpath[0] == "xml": # handle _xmlplus try: return _spec_from_modpath(["_xmlplus"] + modpath[1:], path, context) except ImportError: return _spec_from_modpath(modpath, path, context) elif modpath == ["os", "path"]: # FIXME: currently ignoring search_path... return spec.ModuleSpec( name="os.path", location=os.path.__file__, type=spec.ModuleType.PY_SOURCE, ) return _spec_from_modpath(modpath, path, context) def get_module_part(dotted_name: str, context_file: str | None = None) -> str: """Given a dotted name return the module part of the name : >>> get_module_part('astroid.as_string.dump') 'astroid.as_string' :param dotted_name: full name of the identifier we are interested in :param context_file: context file to consider, necessary if the identifier has been introduced using a relative import unresolvable in the actual context (i.e. 
modutils) :raise ImportError: if there is no such module in the directory :return: the module part of the name or None if we have not been able at all to import the given name XXX: deprecated, since it doesn't handle package precedence over module (see #10066) """ # os.path trick if dotted_name.startswith("os.path"): return "os.path" parts = dotted_name.split(".") if context_file is not None: # first check for builtin module which won't be considered latter # in that case (path != None) if parts[0] in BUILTIN_MODULES: if len(parts) > 2: raise ImportError(dotted_name) return parts[0] # don't use += or insert, we want a new list to be created ! path: list[str] | None = None starti = 0 if parts[0] == "": assert ( context_file is not None ), "explicit relative import, but no context_file?" path = [] # prevent resolving the import non-relatively starti = 1 # for all further dots: change context while starti < len(parts) and parts[starti] == "": starti += 1 assert ( context_file is not None ), "explicit relative import, but no context_file?" context_file = os.path.dirname(context_file) for i in range(starti, len(parts)): try: file_from_modpath( parts[starti : i + 1], path=path, context_file=context_file ) except ImportError: if i < max(1, len(parts) - 2): raise return ".".join(parts[:i]) return dotted_name def get_module_files( src_directory: str, blacklist: Sequence[str], list_all: bool = False ) -> list[str]: """Given a package directory return a list of all available python module's files in the package and its subpackages. :param src_directory: path of the directory corresponding to the package :param blacklist: iterable list of files or directories to ignore. 
:param list_all: get files from all paths, including ones without __init__.py :return: the list of all available python module's files in the package and its subpackages """ files: list[str] = [] for directory, dirnames, filenames in os.walk(src_directory): if directory in blacklist: continue _handle_blacklist(blacklist, dirnames, filenames) # check for __init__.py if not list_all and {"__init__.py", "__init__.pyi"}.isdisjoint(filenames): dirnames[:] = () continue for filename in filenames: if _is_python_file(filename): src = os.path.join(directory, filename) files.append(src) return files def get_source_file( filename: str, include_no_ext: bool = False, prefer_stubs: bool = False ) -> str: """Given a python module's file name return the matching source file name (the filename will be returned identically if it's already an absolute path to a python source file). :param filename: python module's file name :raise NoSourceFile: if no source file exists on the file system :return: the absolute path of the source file if it exists """ filename = os.path.abspath(_path_from_filename(filename)) base, orig_ext = os.path.splitext(filename) if orig_ext == ".pyi" and os.path.exists(f"{base}{orig_ext}"): return f"{base}{orig_ext}" for ext in PY_SOURCE_EXTS_STUBS_FIRST if prefer_stubs else PY_SOURCE_EXTS: source_path = f"{base}.{ext}" if os.path.exists(source_path): return source_path if include_no_ext and not orig_ext and os.path.exists(base): return base raise NoSourceFile(filename) def is_python_source(filename: str | None) -> bool: """Return: True if the filename is a python source file.""" if not filename: return False return os.path.splitext(filename)[1][1:] in PY_SOURCE_EXTS def is_stdlib_module(modname: str) -> bool: """Return: True if the modname is in the standard library""" return modname.split(".")[0] in stdlib_module_names def module_in_path(modname: str, path: str | Iterable[str]) -> bool: """Try to determine if a module is imported from one of the specified paths 
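The `__init__` pruning in `get_module_files` works by mutating `dirnames` in place, which tells `os.walk` not to descend into non-package directories. A minimal standalone demonstration (the layout and helper name are invented; the root directory is exempted here for brevity, and `.pyi` stubs are ignored):

```python
import os
import tempfile


def python_files(src_directory: str) -> list[str]:
    files = []
    for directory, dirnames, filenames in os.walk(src_directory):
        # Assigning to dirnames[:] (not rebinding the name) prunes the walk.
        if directory != src_directory and "__init__.py" not in filenames:
            dirnames[:] = ()
            continue
        files.extend(
            os.path.join(directory, f) for f in filenames if f.endswith(".py")
        )
    return files


with tempfile.TemporaryDirectory() as tmp:
    os.makedirs(os.path.join(tmp, "pkg"))
    os.makedirs(os.path.join(tmp, "data"))  # no __init__.py: not a package
    for rel in ("pkg/__init__.py", "pkg/mod.py", "data/script.py"):
        open(os.path.join(tmp, *rel.split("/")), "w").close()
    found = sorted(os.path.basename(f) for f in python_files(tmp))
    print(found)  # ['__init__.py', 'mod.py']
```

`data/script.py` is deliberately excluded: its directory has no `__init__.py`, mirroring the `isdisjoint` check above when `list_all` is false.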
:param modname: name of the module :param path: paths to consider :return: true if the module: - is located on the path listed in one of the directory in `paths` """ modname = modname.split(".")[0] try: filename = file_from_modpath([modname]) except ImportError: # Import failed, we can't check path if we don't know it return False if filename is None: # No filename likely means it's compiled in, or potentially a namespace return False filename = _normalize_path(filename) if isinstance(path, str): return filename.startswith(_cache_normalize_path(path)) return any(filename.startswith(_cache_normalize_path(entry)) for entry in path) def is_standard_module(modname: str, std_path: Iterable[str] | None = None) -> bool: """Try to guess if a module is a standard python module (by default, see `std_path` parameter's description). :param modname: name of the module we are interested in :param std_path: list of path considered has standard :return: true if the module: - is located on the path listed in one of the directory in `std_path` - is a built-in module """ warnings.warn( "is_standard_module() is deprecated. Use, is_stdlib_module() or module_in_path() instead", DeprecationWarning, stacklevel=2, ) modname = modname.split(".")[0] try: filename = file_from_modpath([modname]) except ImportError: # import failed, i'm probably not so wrong by supposing it's # not standard... return False # modules which are not living in a file are considered standard # (sys and __builtin__ for instance) if filename is None: # we assume there are no namespaces in stdlib return not util.is_namespace(modname) filename = _normalize_path(filename) for path in EXT_LIB_DIRS: if filename.startswith(_cache_normalize_path(path)): return False if std_path is None: std_path = STD_LIB_DIRS return any(filename.startswith(_cache_normalize_path(path)) for path in std_path) def is_relative(modname: str, from_file: str) -> bool: """Return true if the given module name is relative to the given file name. 
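`module_in_path` reduces to a normalized-prefix test on the module's file location. A rough standalone equivalent (the names `_normalize` and `file_in_path` are ours; astroid additionally caches the normalization, and the trailing-separator guard is our tweak, not astroid's):

```python
import json
import os
import sysconfig


def _normalize(path: str) -> str:
    # realpath resolves symlinks; normcase folds case on Windows.
    return os.path.normcase(os.path.realpath(path))


def file_in_path(filename: str, paths) -> bool:
    filename = _normalize(filename)
    if isinstance(paths, str):
        paths = [paths]
    # Appending os.sep avoids false prefix hits like '/usr/lib/python3.12x'.
    return any(filename.startswith(_normalize(p) + os.sep) for p in paths)


stdlib = sysconfig.get_paths()["stdlib"]
print(file_in_path(json.__file__, stdlib))  # typically True on CPython
```

Without the normalization, symlinked installs (pyenv, Homebrew) and case-insensitive filesystems would defeat the plain `startswith` comparison.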
:param modname: name of the module we are interested in :param from_file: path of the module from which modname has been imported :return: true if the module has been imported relatively to `from_file` """ if not os.path.isdir(from_file): from_file = os.path.dirname(from_file) if from_file in sys.path: return False return bool( importlib.machinery.PathFinder.find_spec( modname.split(".", maxsplit=1)[0], [from_file] ) ) # internal only functions ##################################################### def _spec_from_modpath( modpath: list[str], path: Sequence[str] | None = None, context: str | None = None, ) -> spec.ModuleSpec: """Given a mod path (i.e. split module / package name), return the corresponding spec. this function is used internally, see `file_from_modpath`'s documentation for more information """ assert modpath location = None if context is not None: try: found_spec = spec.find_spec(modpath, [context]) location = found_spec.location except ImportError: found_spec = spec.find_spec(modpath, path) location = found_spec.location else: found_spec = spec.find_spec(modpath, path) if found_spec.type == spec.ModuleType.PY_COMPILED: try: assert found_spec.location is not None location = get_source_file(found_spec.location) return found_spec._replace( location=location, type=spec.ModuleType.PY_SOURCE ) except NoSourceFile: return found_spec._replace(location=location) elif found_spec.type == spec.ModuleType.C_BUILTIN: # integrated builtin module return found_spec._replace(location=None) elif found_spec.type == spec.ModuleType.PKG_DIRECTORY: assert found_spec.location is not None location = _has_init(found_spec.location) return found_spec._replace(location=location, type=spec.ModuleType.PY_SOURCE) return found_spec def _is_python_file(filename: str) -> bool: """Return true if the given filename should be considered as a python file. 
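`is_relative` above boils down to a single `PathFinder.find_spec` probe with an explicit search path, so only the module's own directory is consulted, never `sys.path`. A self-contained check (the module name `mymod` is invented for the demo):

```python
import importlib.machinery
import os
import tempfile

with tempfile.TemporaryDirectory() as tmp:
    with open(os.path.join(tmp, "mymod.py"), "w") as f:
        f.write("x = 1\n")
    # Passing an explicit path list restricts the search to that directory.
    found = importlib.machinery.PathFinder.find_spec("mymod", [tmp])
    missing = importlib.machinery.PathFinder.find_spec("definitely_absent_mod", [tmp])
    print(found is not None, missing is None)  # True True
```

Note the early return in `is_relative` when `from_file` is on `sys.path`: a module found there is an absolute import even though the same probe would succeed.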
.pyc and .pyo are ignored
    """
    return filename.endswith((".py", ".pyi", ".so", ".pyd", ".pyw"))


def _has_init(directory: str) -> str | None:
    """If the given directory has a valid __init__ file, return its path,
    else return None.
    """
    mod_or_pack = os.path.join(directory, "__init__")
    for ext in (*PY_SOURCE_EXTS, "pyc", "pyo"):
        if os.path.exists(mod_or_pack + "." + ext):
            return mod_or_pack + "." + ext
    return None


def is_namespace(specobj: spec.ModuleSpec) -> bool:
    return specobj.type == spec.ModuleType.PY_NAMESPACE


def is_directory(specobj: spec.ModuleSpec) -> bool:
    return specobj.type == spec.ModuleType.PKG_DIRECTORY


def is_module_name_part_of_extension_package_whitelist(
    module_name: str, package_whitelist: set[str]
) -> bool:
    """
    Returns True if one part of the module name is in the package whitelist.

    >>> is_module_name_part_of_extension_package_whitelist('numpy.core.umath', {'numpy'})
    True
    """
    parts = module_name.split(".")
    return any(
        ".".join(parts[:x]) in package_whitelist for x in range(1, len(parts) + 1)
    )

astroid-3.2.2/astroid/rebuilder.py

# Licensed under the LGPL: https://www.gnu.org/licenses/old-licenses/lgpl-2.1.en.html
# For details: https://github.com/pylint-dev/astroid/blob/main/LICENSE
# Copyright (c) https://github.com/pylint-dev/astroid/blob/main/CONTRIBUTORS.txt

"""This module contains utilities for rebuilding an _ast tree in order to get a single Astroid representation.
""" from __future__ import annotations import ast import sys import token from collections.abc import Callable, Generator from io import StringIO from tokenize import TokenInfo, generate_tokens from typing import TYPE_CHECKING, Final, TypeVar, Union, cast, overload from astroid import nodes from astroid._ast import ParserModule, get_parser_module, parse_function_type_comment from astroid.const import IS_PYPY, PY38, PY39_PLUS, PY312_PLUS, Context from astroid.manager import AstroidManager from astroid.nodes import NodeNG from astroid.nodes.node_classes import AssignName from astroid.nodes.utils import Position from astroid.typing import InferenceResult REDIRECT: Final[dict[str, str]] = { "arguments": "Arguments", "comprehension": "Comprehension", "ListCompFor": "Comprehension", "GenExprFor": "Comprehension", "excepthandler": "ExceptHandler", "keyword": "Keyword", "match_case": "MatchCase", } T_Doc = TypeVar( "T_Doc", "ast.Module", "ast.ClassDef", Union["ast.FunctionDef", "ast.AsyncFunctionDef"], ) _FunctionT = TypeVar("_FunctionT", nodes.FunctionDef, nodes.AsyncFunctionDef) _ForT = TypeVar("_ForT", nodes.For, nodes.AsyncFor) _WithT = TypeVar("_WithT", nodes.With, nodes.AsyncWith) NodesWithDocsType = Union[nodes.Module, nodes.ClassDef, nodes.FunctionDef] # noinspection PyMethodMayBeStatic class TreeRebuilder: """Rebuilds the _ast tree to become an Astroid tree.""" def __init__( self, manager: AstroidManager, parser_module: ParserModule | None = None, data: str | None = None, ) -> None: self._manager = manager self._data = data.split("\n") if data else None self._global_names: list[dict[str, list[nodes.Global]]] = [] self._import_from_nodes: list[nodes.ImportFrom] = [] self._delayed_assattr: list[nodes.AssignAttr] = [] self._visit_meths: dict[type[ast.AST], Callable[[ast.AST, NodeNG], NodeNG]] = {} if parser_module is None: self._parser_module = get_parser_module() else: self._parser_module = parser_module def _get_doc(self, node: T_Doc) -> tuple[T_Doc, ast.Constant | 
ast.Str | None]: """Return the doc ast node.""" try: if node.body and isinstance(node.body[0], ast.Expr): first_value = node.body[0].value if isinstance(first_value, ast.Constant) and isinstance( first_value.value, str ): doc_ast_node = first_value node.body = node.body[1:] # The ast parser of python < 3.8 sets col_offset of multi-line strings to -1 # as it is unable to determine the value correctly. We reset this to None. if doc_ast_node.col_offset == -1: doc_ast_node.col_offset = None return node, doc_ast_node except IndexError: pass # ast built from scratch return node, None def _get_context( self, node: ( ast.Attribute | ast.List | ast.Name | ast.Subscript | ast.Starred | ast.Tuple ), ) -> Context: return self._parser_module.context_classes.get(type(node.ctx), Context.Load) def _get_position_info( self, node: ast.ClassDef | ast.FunctionDef | ast.AsyncFunctionDef, parent: nodes.ClassDef | nodes.FunctionDef | nodes.AsyncFunctionDef, ) -> Position | None: """Return position information for ClassDef and FunctionDef nodes. In contrast to AST positions, these only include the actual keyword(s) and the class / function name. >>> @decorator >>> async def some_func(var: int) -> None: >>> ^^^^^^^^^^^^^^^^^^^ """ if not self._data: return None end_lineno = node.end_lineno if node.body: end_lineno = node.body[0].lineno # pylint: disable-next=unsubscriptable-object data = "\n".join(self._data[node.lineno - 1 : end_lineno]) start_token: TokenInfo | None = None keyword_tokens: tuple[int, ...] 
= (token.NAME,) if isinstance(parent, nodes.AsyncFunctionDef): search_token = "async" elif isinstance(parent, nodes.FunctionDef): search_token = "def" else: search_token = "class" for t in generate_tokens(StringIO(data).readline): if ( start_token is not None and t.type == token.NAME and t.string == node.name ): break if t.type in keyword_tokens: if t.string == search_token: start_token = t continue if t.string in {"def"}: continue start_token = None else: return None return Position( lineno=node.lineno + start_token.start[0] - 1, col_offset=start_token.start[1], end_lineno=node.lineno + t.end[0] - 1, end_col_offset=t.end[1], ) def _reset_end_lineno(self, newnode: nodes.NodeNG) -> None: """Reset end_lineno and end_col_offset attributes for PyPy 3.8. For some nodes, these are either set to -1 or only partially assigned. To keep consistency across astroid and pylint, reset all. This has been fixed in PyPy 3.9. For reference, an (incomplete) list of nodes with issues: - ClassDef - For - FunctionDef - While - Call - If - Decorators - Try - With - Assign """ newnode.end_lineno = None newnode.end_col_offset = None for child_node in newnode.get_children(): self._reset_end_lineno(child_node) def visit_module( self, node: ast.Module, modname: str, modpath: str, package: bool ) -> nodes.Module: """Visit a Module node by returning a fresh instance of it. Note: Method not called by 'visit' """ node, doc_ast_node = self._get_doc(node) newnode = nodes.Module( name=modname, file=modpath, path=[modpath], package=package, ) newnode.postinit( [self.visit(child, newnode) for child in node.body], doc_node=self.visit(doc_ast_node, newnode), ) if IS_PYPY and PY38: self._reset_end_lineno(newnode) return newnode if TYPE_CHECKING: # noqa: C901 @overload def visit(self, node: ast.arg, parent: NodeNG) -> nodes.AssignName: ... @overload def visit(self, node: ast.arguments, parent: NodeNG) -> nodes.Arguments: ... @overload def visit(self, node: ast.Assert, parent: NodeNG) -> nodes.Assert: ... 
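The token scan in `_get_position_info` can be isolated: tokenize the header text and find the `async`/`def`/`class` keyword followed by the name. A simplified sketch (no decorator or comment handling; the helper name is ours):

```python
import token
from io import StringIO
from tokenize import generate_tokens


def keyword_and_name_span(source: str, name: str):
    """Return ((start row, col) of the keyword, (end row, col) of the name)."""
    start = None
    for t in generate_tokens(StringIO(source).readline):
        if t.type == token.NAME and t.string in {"async", "def", "class"} and start is None:
            start = t
        if start is not None and t.type == token.NAME and t.string == name:
            return start.start, t.end
    return None


span = keyword_and_name_span("async def some_func(var: int) -> None: ...\n", "some_func")
print(span)  # ((1, 0), (1, 19))
```

This matches the docstring's illustration above: the reported span covers only `async def some_func`, not the arguments or return annotation.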
@overload def visit( self, node: ast.AsyncFunctionDef, parent: NodeNG ) -> nodes.AsyncFunctionDef: ... @overload def visit(self, node: ast.AsyncFor, parent: NodeNG) -> nodes.AsyncFor: ... @overload def visit(self, node: ast.Await, parent: NodeNG) -> nodes.Await: ... @overload def visit(self, node: ast.AsyncWith, parent: NodeNG) -> nodes.AsyncWith: ... @overload def visit(self, node: ast.Assign, parent: NodeNG) -> nodes.Assign: ... @overload def visit(self, node: ast.AnnAssign, parent: NodeNG) -> nodes.AnnAssign: ... @overload def visit(self, node: ast.AugAssign, parent: NodeNG) -> nodes.AugAssign: ... @overload def visit(self, node: ast.BinOp, parent: NodeNG) -> nodes.BinOp: ... @overload def visit(self, node: ast.BoolOp, parent: NodeNG) -> nodes.BoolOp: ... @overload def visit(self, node: ast.Break, parent: NodeNG) -> nodes.Break: ... @overload def visit(self, node: ast.Call, parent: NodeNG) -> nodes.Call: ... @overload def visit(self, node: ast.ClassDef, parent: NodeNG) -> nodes.ClassDef: ... @overload def visit(self, node: ast.Continue, parent: NodeNG) -> nodes.Continue: ... @overload def visit(self, node: ast.Compare, parent: NodeNG) -> nodes.Compare: ... @overload def visit( self, node: ast.comprehension, parent: NodeNG ) -> nodes.Comprehension: ... @overload def visit(self, node: ast.Delete, parent: NodeNG) -> nodes.Delete: ... @overload def visit(self, node: ast.Dict, parent: NodeNG) -> nodes.Dict: ... @overload def visit(self, node: ast.DictComp, parent: NodeNG) -> nodes.DictComp: ... @overload def visit(self, node: ast.Expr, parent: NodeNG) -> nodes.Expr: ... @overload def visit( self, node: ast.ExceptHandler, parent: NodeNG ) -> nodes.ExceptHandler: ... @overload def visit(self, node: ast.For, parent: NodeNG) -> nodes.For: ... @overload def visit(self, node: ast.ImportFrom, parent: NodeNG) -> nodes.ImportFrom: ... @overload def visit(self, node: ast.FunctionDef, parent: NodeNG) -> nodes.FunctionDef: ... 
@overload def visit( self, node: ast.GeneratorExp, parent: NodeNG ) -> nodes.GeneratorExp: ... @overload def visit(self, node: ast.Attribute, parent: NodeNG) -> nodes.Attribute: ... @overload def visit(self, node: ast.Global, parent: NodeNG) -> nodes.Global: ... @overload def visit(self, node: ast.If, parent: NodeNG) -> nodes.If: ... @overload def visit(self, node: ast.IfExp, parent: NodeNG) -> nodes.IfExp: ... @overload def visit(self, node: ast.Import, parent: NodeNG) -> nodes.Import: ... @overload def visit(self, node: ast.JoinedStr, parent: NodeNG) -> nodes.JoinedStr: ... @overload def visit( self, node: ast.FormattedValue, parent: NodeNG ) -> nodes.FormattedValue: ... @overload def visit(self, node: ast.NamedExpr, parent: NodeNG) -> nodes.NamedExpr: ... if sys.version_info < (3, 9): # Not used in Python 3.9+ @overload def visit( self, node: ast.ExtSlice, parent: nodes.Subscript ) -> nodes.Tuple: ... @overload def visit(self, node: ast.Index, parent: nodes.Subscript) -> NodeNG: ... @overload def visit(self, node: ast.keyword, parent: NodeNG) -> nodes.Keyword: ... @overload def visit(self, node: ast.Lambda, parent: NodeNG) -> nodes.Lambda: ... @overload def visit(self, node: ast.List, parent: NodeNG) -> nodes.List: ... @overload def visit(self, node: ast.ListComp, parent: NodeNG) -> nodes.ListComp: ... @overload def visit( self, node: ast.Name, parent: NodeNG ) -> nodes.Name | nodes.Const | nodes.AssignName | nodes.DelName: ... @overload def visit(self, node: ast.Nonlocal, parent: NodeNG) -> nodes.Nonlocal: ... @overload def visit(self, node: ast.Constant, parent: NodeNG) -> nodes.Const: ... if sys.version_info >= (3, 12): @overload def visit(self, node: ast.ParamSpec, parent: NodeNG) -> nodes.ParamSpec: ... @overload def visit(self, node: ast.Pass, parent: NodeNG) -> nodes.Pass: ... @overload def visit(self, node: ast.Raise, parent: NodeNG) -> nodes.Raise: ... @overload def visit(self, node: ast.Return, parent: NodeNG) -> nodes.Return: ... 
@overload def visit(self, node: ast.Set, parent: NodeNG) -> nodes.Set: ... @overload def visit(self, node: ast.SetComp, parent: NodeNG) -> nodes.SetComp: ... @overload def visit(self, node: ast.Slice, parent: nodes.Subscript) -> nodes.Slice: ... @overload def visit(self, node: ast.Subscript, parent: NodeNG) -> nodes.Subscript: ... @overload def visit(self, node: ast.Starred, parent: NodeNG) -> nodes.Starred: ... @overload def visit(self, node: ast.Try, parent: NodeNG) -> nodes.Try: ... if sys.version_info >= (3, 11): @overload def visit(self, node: ast.TryStar, parent: NodeNG) -> nodes.TryStar: ... @overload def visit(self, node: ast.Tuple, parent: NodeNG) -> nodes.Tuple: ... if sys.version_info >= (3, 12): @overload def visit(self, node: ast.TypeAlias, parent: NodeNG) -> nodes.TypeAlias: ... @overload def visit(self, node: ast.TypeVar, parent: NodeNG) -> nodes.TypeVar: ... @overload def visit( self, node: ast.TypeVarTuple, parent: NodeNG ) -> nodes.TypeVarTuple: ... @overload def visit(self, node: ast.UnaryOp, parent: NodeNG) -> nodes.UnaryOp: ... @overload def visit(self, node: ast.While, parent: NodeNG) -> nodes.While: ... @overload def visit(self, node: ast.With, parent: NodeNG) -> nodes.With: ... @overload def visit(self, node: ast.Yield, parent: NodeNG) -> nodes.Yield: ... @overload def visit(self, node: ast.YieldFrom, parent: NodeNG) -> nodes.YieldFrom: ... if sys.version_info >= (3, 10): @overload def visit(self, node: ast.Match, parent: NodeNG) -> nodes.Match: ... @overload def visit( self, node: ast.match_case, parent: NodeNG ) -> nodes.MatchCase: ... @overload def visit( self, node: ast.MatchValue, parent: NodeNG ) -> nodes.MatchValue: ... @overload def visit( self, node: ast.MatchSingleton, parent: NodeNG ) -> nodes.MatchSingleton: ... @overload def visit( self, node: ast.MatchSequence, parent: NodeNG ) -> nodes.MatchSequence: ... @overload def visit( self, node: ast.MatchMapping, parent: NodeNG ) -> nodes.MatchMapping: ... 
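All of these overloads funnel into one dynamically dispatched `visit`. The lookup pattern, name mangling through a `REDIRECT`-style table plus per-class memoization of the `getattr` result, looks like this in miniature (the class `MiniVisitor` is invented for illustration):

```python
import ast

# Lowercase ast class names that need a canonical spelling, as in REDIRECT.
REDIRECT = {"keyword": "Keyword"}


class MiniVisitor:
    def __init__(self):
        self._visit_meths = {}

    def visit(self, node):
        cls = node.__class__
        try:
            method = self._visit_meths[cls]
        except KeyError:
            name = cls.__name__
            method = getattr(self, "visit_" + REDIRECT.get(name, name).lower())
            self._visit_meths[cls] = method  # memoize the getattr lookup
        return method(node)

    def visit_binop(self, node):
        return f"BinOp({self.visit(node.left)}, {self.visit(node.right)})"

    def visit_constant(self, node):
        return repr(node.value)


expr = ast.parse("1 + 2", mode="eval").body
print(MiniVisitor().visit(expr))  # BinOp(1, 2)
```

Caching per concrete `ast` class matters because `visit` is called once per node of a potentially large tree, while the set of node classes is small and fixed.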
@overload def visit( self, node: ast.MatchClass, parent: NodeNG ) -> nodes.MatchClass: ... @overload def visit(self, node: ast.MatchStar, parent: NodeNG) -> nodes.MatchStar: ... @overload def visit(self, node: ast.MatchAs, parent: NodeNG) -> nodes.MatchAs: ... @overload def visit(self, node: ast.MatchOr, parent: NodeNG) -> nodes.MatchOr: ... @overload def visit(self, node: ast.pattern, parent: NodeNG) -> nodes.Pattern: ... @overload def visit(self, node: ast.AST, parent: NodeNG) -> NodeNG: ... @overload def visit(self, node: None, parent: NodeNG) -> None: ... def visit(self, node: ast.AST | None, parent: NodeNG) -> NodeNG | None: if node is None: return None cls = node.__class__ if cls in self._visit_meths: visit_method = self._visit_meths[cls] else: cls_name = cls.__name__ visit_name = "visit_" + REDIRECT.get(cls_name, cls_name).lower() visit_method = getattr(self, visit_name) self._visit_meths[cls] = visit_method return visit_method(node, parent) def _save_assignment(self, node: nodes.AssignName | nodes.DelName) -> None: """Save assignment situation since node.parent is not available yet.""" if self._global_names and node.name in self._global_names[-1]: node.root().set_local(node.name, node) else: assert node.parent assert node.name node.parent.set_local(node.name, node) def visit_arg(self, node: ast.arg, parent: NodeNG) -> nodes.AssignName: """Visit an arg node by returning a fresh AssName instance.""" return self.visit_assignname(node, parent, node.arg) def visit_arguments(self, node: ast.arguments, parent: NodeNG) -> nodes.Arguments: """Visit an Arguments node by returning a fresh instance of it.""" vararg: str | None = None kwarg: str | None = None vararg_node = node.vararg kwarg_node = node.kwarg newnode = nodes.Arguments( node.vararg.arg if node.vararg else None, node.kwarg.arg if node.kwarg else None, parent, ( AssignName( vararg_node.arg, vararg_node.lineno, vararg_node.col_offset, parent, end_lineno=vararg_node.end_lineno, 
end_col_offset=vararg_node.end_col_offset, ) if vararg_node else None ), ( AssignName( kwarg_node.arg, kwarg_node.lineno, kwarg_node.col_offset, parent, end_lineno=kwarg_node.end_lineno, end_col_offset=kwarg_node.end_col_offset, ) if kwarg_node else None ), ) args = [self.visit(child, newnode) for child in node.args] defaults = [self.visit(child, newnode) for child in node.defaults] varargannotation: NodeNG | None = None kwargannotation: NodeNG | None = None if node.vararg: vararg = node.vararg.arg varargannotation = self.visit(node.vararg.annotation, newnode) if node.kwarg: kwarg = node.kwarg.arg kwargannotation = self.visit(node.kwarg.annotation, newnode) if PY38: # In Python 3.8 'end_lineno' and 'end_col_offset' # for 'kwonlyargs' don't include the annotation. for arg in node.kwonlyargs: if arg.annotation is not None: arg.end_lineno = arg.annotation.end_lineno arg.end_col_offset = arg.annotation.end_col_offset kwonlyargs = [self.visit(child, newnode) for child in node.kwonlyargs] kw_defaults = [self.visit(child, newnode) for child in node.kw_defaults] annotations = [self.visit(arg.annotation, newnode) for arg in node.args] kwonlyargs_annotations = [ self.visit(arg.annotation, newnode) for arg in node.kwonlyargs ] posonlyargs = [self.visit(child, newnode) for child in node.posonlyargs] posonlyargs_annotations = [ self.visit(arg.annotation, newnode) for arg in node.posonlyargs ] type_comment_args = [ self.check_type_comment(child, parent=newnode) for child in node.args ] type_comment_kwonlyargs = [ self.check_type_comment(child, parent=newnode) for child in node.kwonlyargs ] type_comment_posonlyargs = [ self.check_type_comment(child, parent=newnode) for child in node.posonlyargs ] newnode.postinit( args=args, defaults=defaults, kwonlyargs=kwonlyargs, posonlyargs=posonlyargs, kw_defaults=kw_defaults, annotations=annotations, kwonlyargs_annotations=kwonlyargs_annotations, posonlyargs_annotations=posonlyargs_annotations, varargannotation=varargannotation, 
kwargannotation=kwargannotation, type_comment_args=type_comment_args, type_comment_kwonlyargs=type_comment_kwonlyargs, type_comment_posonlyargs=type_comment_posonlyargs, ) # save argument names in locals: assert newnode.parent if vararg: newnode.parent.set_local(vararg, newnode) if kwarg: newnode.parent.set_local(kwarg, newnode) return newnode def visit_assert(self, node: ast.Assert, parent: NodeNG) -> nodes.Assert: """Visit a Assert node by returning a fresh instance of it.""" newnode = nodes.Assert( lineno=node.lineno, col_offset=node.col_offset, end_lineno=node.end_lineno, end_col_offset=node.end_col_offset, parent=parent, ) msg: NodeNG | None = None if node.msg: msg = self.visit(node.msg, newnode) newnode.postinit(self.visit(node.test, newnode), msg) return newnode def check_type_comment( self, node: ast.Assign | ast.arg | ast.For | ast.AsyncFor | ast.With | ast.AsyncWith, parent: ( nodes.Assign | nodes.Arguments | nodes.For | nodes.AsyncFor | nodes.With | nodes.AsyncWith ), ) -> NodeNG | None: if not node.type_comment: return None try: type_comment_ast = self._parser_module.parse(node.type_comment) except SyntaxError: # Invalid type comment, just skip it. return None # For '# type: # any comment' ast.parse returns a Module node, # without any nodes in the body. if not type_comment_ast.body: return None type_object = self.visit(type_comment_ast.body[0], parent=parent) if not isinstance(type_object, nodes.Expr): return None return type_object.value def check_function_type_comment( self, node: ast.FunctionDef | ast.AsyncFunctionDef, parent: NodeNG ) -> tuple[NodeNG | None, list[NodeNG]] | None: if not node.type_comment: return None try: type_comment_ast = parse_function_type_comment(node.type_comment) except SyntaxError: # Invalid type comment, just skip it. 
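`check_type_comment` and `check_function_type_comment` depend on the compiler surfacing `# type:` comments. The raw stdlib mechanics, which these methods wrap, can be seen directly with the `ast` module:

```python
import ast

# Variable annotations via comments require type_comments=True at parse time.
source = "x = []  # type: list[int]\n"
tree = ast.parse(source, type_comments=True)
assign = tree.body[0]
print(assign.type_comment)  # list[int]

# The comment text is itself parsed as an expression to get an AST back.
annotation = ast.parse(assign.type_comment, mode="eval").body
print(type(annotation).__name__)  # Subscript

# Function type comments use the dedicated 'func_type' parse mode,
# yielding an ast.FunctionType with argtypes and returns.
ft = ast.parse("(int, str) -> None", mode="func_type")
print(len(ft.argtypes))  # 2
```

An invalid comment raises `SyntaxError` from the inner parse, which is why both methods above simply skip it.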
return None if not type_comment_ast: return None returns: NodeNG | None = None argtypes: list[NodeNG] = [ self.visit(elem, parent) for elem in (type_comment_ast.argtypes or []) ] if type_comment_ast.returns: returns = self.visit(type_comment_ast.returns, parent) return returns, argtypes def visit_asyncfunctiondef( self, node: ast.AsyncFunctionDef, parent: NodeNG ) -> nodes.AsyncFunctionDef: return self._visit_functiondef(nodes.AsyncFunctionDef, node, parent) def visit_asyncfor(self, node: ast.AsyncFor, parent: NodeNG) -> nodes.AsyncFor: return self._visit_for(nodes.AsyncFor, node, parent) def visit_await(self, node: ast.Await, parent: NodeNG) -> nodes.Await: newnode = nodes.Await( lineno=node.lineno, col_offset=node.col_offset, end_lineno=node.end_lineno, end_col_offset=node.end_col_offset, parent=parent, ) newnode.postinit(value=self.visit(node.value, newnode)) return newnode def visit_asyncwith(self, node: ast.AsyncWith, parent: NodeNG) -> nodes.AsyncWith: return self._visit_with(nodes.AsyncWith, node, parent) def visit_assign(self, node: ast.Assign, parent: NodeNG) -> nodes.Assign: """Visit a Assign node by returning a fresh instance of it.""" newnode = nodes.Assign( lineno=node.lineno, col_offset=node.col_offset, end_lineno=node.end_lineno, end_col_offset=node.end_col_offset, parent=parent, ) type_annotation = self.check_type_comment(node, parent=newnode) newnode.postinit( targets=[self.visit(child, newnode) for child in node.targets], value=self.visit(node.value, newnode), type_annotation=type_annotation, ) return newnode def visit_annassign(self, node: ast.AnnAssign, parent: NodeNG) -> nodes.AnnAssign: """Visit an AnnAssign node by returning a fresh instance of it.""" newnode = nodes.AnnAssign( lineno=node.lineno, col_offset=node.col_offset, end_lineno=node.end_lineno, end_col_offset=node.end_col_offset, parent=parent, ) newnode.postinit( target=self.visit(node.target, newnode), annotation=self.visit(node.annotation, newnode), simple=node.simple, 
value=self.visit(node.value, newnode), ) return newnode @overload def visit_assignname( self, node: ast.AST, parent: NodeNG, node_name: str ) -> nodes.AssignName: ... @overload def visit_assignname( self, node: ast.AST, parent: NodeNG, node_name: None ) -> None: ... def visit_assignname( self, node: ast.AST, parent: NodeNG, node_name: str | None ) -> nodes.AssignName | None: """Visit a node and return a AssignName node. Note: Method not called by 'visit' """ if node_name is None: return None newnode = nodes.AssignName( name=node_name, lineno=node.lineno, col_offset=node.col_offset, end_lineno=node.end_lineno, end_col_offset=node.end_col_offset, parent=parent, ) self._save_assignment(newnode) return newnode def visit_augassign(self, node: ast.AugAssign, parent: NodeNG) -> nodes.AugAssign: """Visit a AugAssign node by returning a fresh instance of it.""" newnode = nodes.AugAssign( op=self._parser_module.bin_op_classes[type(node.op)] + "=", lineno=node.lineno, col_offset=node.col_offset, end_lineno=node.end_lineno, end_col_offset=node.end_col_offset, parent=parent, ) newnode.postinit( self.visit(node.target, newnode), self.visit(node.value, newnode) ) return newnode def visit_binop(self, node: ast.BinOp, parent: NodeNG) -> nodes.BinOp: """Visit a BinOp node by returning a fresh instance of it.""" newnode = nodes.BinOp( op=self._parser_module.bin_op_classes[type(node.op)], lineno=node.lineno, col_offset=node.col_offset, end_lineno=node.end_lineno, end_col_offset=node.end_col_offset, parent=parent, ) newnode.postinit( self.visit(node.left, newnode), self.visit(node.right, newnode) ) return newnode def visit_boolop(self, node: ast.BoolOp, parent: NodeNG) -> nodes.BoolOp: """Visit a BoolOp node by returning a fresh instance of it.""" newnode = nodes.BoolOp( op=self._parser_module.bool_op_classes[type(node.op)], lineno=node.lineno, col_offset=node.col_offset, end_lineno=node.end_lineno, end_col_offset=node.end_col_offset, parent=parent, ) 
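`visit_augassign`, `visit_binop`, and `visit_boolop` translate `ast` operator classes into astroid's string form through lookup tables such as `bin_op_classes`. A cut-down equivalent (the table here is a partial stand-in, not astroid's actual mapping):

```python
import ast

BIN_OPS = {ast.Add: "+", ast.Sub: "-", ast.Mult: "*", ast.Div: "/", ast.Pow: "**"}


def op_symbol(node: ast.BinOp) -> str:
    # Keyed on the operator's class, since ast.Add etc. carry no state.
    return BIN_OPS[type(node.op)]


expr = ast.parse("2 ** 8", mode="eval").body
print(op_symbol(expr))        # **
print(op_symbol(expr) + "=")  # augmented form, as built in visit_augassign
```

Appending `"="` is exactly how `visit_augassign` derives `**=` and friends from the same table.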
newnode.postinit([self.visit(child, newnode) for child in node.values]) return newnode def visit_break(self, node: ast.Break, parent: NodeNG) -> nodes.Break: """Visit a Break node by returning a fresh instance of it.""" return nodes.Break( lineno=node.lineno, col_offset=node.col_offset, end_lineno=node.end_lineno, end_col_offset=node.end_col_offset, parent=parent, ) def visit_call(self, node: ast.Call, parent: NodeNG) -> nodes.Call: """Visit a CallFunc node by returning a fresh instance of it.""" newnode = nodes.Call( lineno=node.lineno, col_offset=node.col_offset, end_lineno=node.end_lineno, end_col_offset=node.end_col_offset, parent=parent, ) newnode.postinit( func=self.visit(node.func, newnode), args=[self.visit(child, newnode) for child in node.args], keywords=[self.visit(child, newnode) for child in node.keywords], ) return newnode def visit_classdef( self, node: ast.ClassDef, parent: NodeNG, newstyle: bool = True ) -> nodes.ClassDef: """Visit a ClassDef node to become astroid.""" node, doc_ast_node = self._get_doc(node) newnode = nodes.ClassDef( name=node.name, lineno=node.lineno, col_offset=node.col_offset, end_lineno=node.end_lineno, end_col_offset=node.end_col_offset, parent=parent, ) metaclass = None for keyword in node.keywords: if keyword.arg == "metaclass": metaclass = self.visit(keyword, newnode).value break decorators = self.visit_decorators(node, newnode) newnode.postinit( [self.visit(child, newnode) for child in node.bases], [self.visit(child, newnode) for child in node.body], decorators, newstyle, metaclass, [ self.visit(kwd, newnode) for kwd in node.keywords if kwd.arg != "metaclass" ], position=self._get_position_info(node, newnode), doc_node=self.visit(doc_ast_node, newnode), type_params=( [self.visit(param, newnode) for param in node.type_params] if PY312_PLUS else [] ), ) return newnode def visit_continue(self, node: ast.Continue, parent: NodeNG) -> nodes.Continue: """Visit a Continue node by returning a fresh instance of it.""" return 
nodes.Continue( lineno=node.lineno, col_offset=node.col_offset, end_lineno=node.end_lineno, end_col_offset=node.end_col_offset, parent=parent, ) def visit_compare(self, node: ast.Compare, parent: NodeNG) -> nodes.Compare: """Visit a Compare node by returning a fresh instance of it.""" newnode = nodes.Compare( lineno=node.lineno, col_offset=node.col_offset, end_lineno=node.end_lineno, end_col_offset=node.end_col_offset, parent=parent, ) newnode.postinit( self.visit(node.left, newnode), [ ( self._parser_module.cmp_op_classes[op.__class__], self.visit(expr, newnode), ) for (op, expr) in zip(node.ops, node.comparators) ], ) return newnode def visit_comprehension( self, node: ast.comprehension, parent: NodeNG ) -> nodes.Comprehension: """Visit a Comprehension node by returning a fresh instance of it.""" newnode = nodes.Comprehension( parent=parent, # Comprehension nodes don't have these attributes # see https://docs.python.org/3/library/ast.html#abstract-grammar lineno=None, col_offset=None, end_lineno=None, end_col_offset=None, ) newnode.postinit( self.visit(node.target, newnode), self.visit(node.iter, newnode), [self.visit(child, newnode) for child in node.ifs], bool(node.is_async), ) return newnode def visit_decorators( self, node: ast.ClassDef | ast.FunctionDef | ast.AsyncFunctionDef, parent: NodeNG, ) -> nodes.Decorators | None: """Visit a Decorators node by returning a fresh instance of it. Note: Method not called by 'visit' """ if not node.decorator_list: return None # /!\ node is actually an _ast.FunctionDef node while # parent is an astroid.nodes.FunctionDef node # Set the line number of the first decorator for Python 3.8+. 
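The line-number bookkeeping in `visit_decorators` reflects a Python 3.8+ behavior: `ast` reports a decorated function's `lineno` at the `def` line, while each decorator keeps its own position, so the `Decorators` node must take its start from `decorator_list[0]`. A quick check of that behavior:

```python
import ast

tree = ast.parse("@decorator\n@other\ndef f():\n    pass\n")
func = tree.body[0]
print(func.lineno)                         # 3, the 'def' line
print(func.decorator_list[0].lineno)       # 1, the first decorator
print(func.decorator_list[-1].end_lineno)  # 2, the last decorator
```

This is also why `_get_position_info` starts tokenizing at `node.lineno`: the decorators above the header never enter the scan.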
lineno = node.decorator_list[0].lineno end_lineno = node.decorator_list[-1].end_lineno end_col_offset = node.decorator_list[-1].end_col_offset newnode = nodes.Decorators( lineno=lineno, col_offset=node.col_offset, end_lineno=end_lineno, end_col_offset=end_col_offset, parent=parent, ) newnode.postinit([self.visit(child, newnode) for child in node.decorator_list]) return newnode def visit_delete(self, node: ast.Delete, parent: NodeNG) -> nodes.Delete: """Visit a Delete node by returning a fresh instance of it.""" newnode = nodes.Delete( lineno=node.lineno, col_offset=node.col_offset, end_lineno=node.end_lineno, end_col_offset=node.end_col_offset, parent=parent, ) newnode.postinit([self.visit(child, newnode) for child in node.targets]) return newnode def _visit_dict_items( self, node: ast.Dict, parent: NodeNG, newnode: nodes.Dict ) -> Generator[tuple[NodeNG, NodeNG], None, None]: for key, value in zip(node.keys, node.values): rebuilt_key: NodeNG rebuilt_value = self.visit(value, newnode) if not key: # Extended unpacking rebuilt_key = nodes.DictUnpack( lineno=rebuilt_value.lineno, col_offset=rebuilt_value.col_offset, end_lineno=rebuilt_value.end_lineno, end_col_offset=rebuilt_value.end_col_offset, parent=parent, ) else: rebuilt_key = self.visit(key, newnode) yield rebuilt_key, rebuilt_value def visit_dict(self, node: ast.Dict, parent: NodeNG) -> nodes.Dict: """Visit a Dict node by returning a fresh instance of it.""" newnode = nodes.Dict( lineno=node.lineno, col_offset=node.col_offset, end_lineno=node.end_lineno, end_col_offset=node.end_col_offset, parent=parent, ) items: list[tuple[InferenceResult, InferenceResult]] = list( self._visit_dict_items(node, parent, newnode) ) newnode.postinit(items) return newnode def visit_dictcomp(self, node: ast.DictComp, parent: NodeNG) -> nodes.DictComp: """Visit a DictComp node by returning a fresh instance of it.""" newnode = nodes.DictComp( lineno=node.lineno, col_offset=node.col_offset, end_lineno=node.end_lineno, 
end_col_offset=node.end_col_offset, parent=parent, ) newnode.postinit( self.visit(node.key, newnode), self.visit(node.value, newnode), [self.visit(child, newnode) for child in node.generators], ) return newnode def visit_expr(self, node: ast.Expr, parent: NodeNG) -> nodes.Expr: """Visit a Expr node by returning a fresh instance of it.""" newnode = nodes.Expr( lineno=node.lineno, col_offset=node.col_offset, end_lineno=node.end_lineno, end_col_offset=node.end_col_offset, parent=parent, ) newnode.postinit(self.visit(node.value, newnode)) return newnode def visit_excepthandler( self, node: ast.ExceptHandler, parent: NodeNG ) -> nodes.ExceptHandler: """Visit an ExceptHandler node by returning a fresh instance of it.""" newnode = nodes.ExceptHandler( lineno=node.lineno, col_offset=node.col_offset, end_lineno=node.end_lineno, end_col_offset=node.end_col_offset, parent=parent, ) newnode.postinit( self.visit(node.type, newnode), self.visit_assignname(node, newnode, node.name), [self.visit(child, newnode) for child in node.body], ) return newnode @overload def _visit_for( self, cls: type[nodes.For], node: ast.For, parent: NodeNG ) -> nodes.For: ... @overload def _visit_for( self, cls: type[nodes.AsyncFor], node: ast.AsyncFor, parent: NodeNG ) -> nodes.AsyncFor: ... 
    def _visit_for(
        self, cls: type[_ForT], node: ast.For | ast.AsyncFor, parent: NodeNG
    ) -> _ForT:
        """Visit a For node by returning a fresh instance of it."""
        col_offset = node.col_offset
        if IS_PYPY and not PY39_PLUS and isinstance(node, ast.AsyncFor) and self._data:
            # pylint: disable-next=unsubscriptable-object
            col_offset = self._data[node.lineno - 1].index("async")

        newnode = cls(
            lineno=node.lineno,
            col_offset=col_offset,
            end_lineno=node.end_lineno,
            end_col_offset=node.end_col_offset,
            parent=parent,
        )
        type_annotation = self.check_type_comment(node, parent=newnode)
        newnode.postinit(
            target=self.visit(node.target, newnode),
            iter=self.visit(node.iter, newnode),
            body=[self.visit(child, newnode) for child in node.body],
            orelse=[self.visit(child, newnode) for child in node.orelse],
            type_annotation=type_annotation,
        )
        return newnode

    def visit_for(self, node: ast.For, parent: NodeNG) -> nodes.For:
        return self._visit_for(nodes.For, node, parent)

    def visit_importfrom(
        self, node: ast.ImportFrom, parent: NodeNG
    ) -> nodes.ImportFrom:
        """Visit an ImportFrom node by returning a fresh instance of it."""
        names = [(alias.name, alias.asname) for alias in node.names]
        newnode = nodes.ImportFrom(
            fromname=node.module or "",
            names=names,
            level=node.level or None,
            lineno=node.lineno,
            col_offset=node.col_offset,
            end_lineno=node.end_lineno,
            end_col_offset=node.end_col_offset,
            parent=parent,
        )
        # store From names to add them to locals after building
        self._import_from_nodes.append(newnode)
        return newnode

    @overload
    def _visit_functiondef(
        self, cls: type[nodes.FunctionDef], node: ast.FunctionDef, parent: NodeNG
    ) -> nodes.FunctionDef: ...

    @overload
    def _visit_functiondef(
        self,
        cls: type[nodes.AsyncFunctionDef],
        node: ast.AsyncFunctionDef,
        parent: NodeNG,
    ) -> nodes.AsyncFunctionDef: ...

    def _visit_functiondef(
        self,
        cls: type[_FunctionT],
        node: ast.FunctionDef | ast.AsyncFunctionDef,
        parent: NodeNG,
    ) -> _FunctionT:
        """Visit a FunctionDef node to become astroid."""
        self._global_names.append({})
        node, doc_ast_node = self._get_doc(node)

        lineno = node.lineno
        if node.decorator_list:
            # Python 3.8 sets the line number of a decorated function
            # to be the actual line number of the function, but the
            # previous versions expected the decorator's line number instead.
            # We reset the function's line number to that of the
            # first decorator to maintain backward compatibility.
            # It's not ideal but this discrepancy was baked into
            # the framework for *years*.
            lineno = node.decorator_list[0].lineno

        newnode = cls(
            name=node.name,
            lineno=lineno,
            col_offset=node.col_offset,
            end_lineno=node.end_lineno,
            end_col_offset=node.end_col_offset,
            parent=parent,
        )
        decorators = self.visit_decorators(node, newnode)
        returns: NodeNG | None
        if node.returns:
            returns = self.visit(node.returns, newnode)
        else:
            returns = None

        type_comment_args = type_comment_returns = None
        type_comment_annotation = self.check_function_type_comment(node, newnode)
        if type_comment_annotation:
            type_comment_returns, type_comment_args = type_comment_annotation
        newnode.postinit(
            args=self.visit(node.args, newnode),
            body=[self.visit(child, newnode) for child in node.body],
            decorators=decorators,
            returns=returns,
            type_comment_returns=type_comment_returns,
            type_comment_args=type_comment_args,
            position=self._get_position_info(node, newnode),
            doc_node=self.visit(doc_ast_node, newnode),
            type_params=(
                [self.visit(param, newnode) for param in node.type_params]
                if PY312_PLUS
                else []
            ),
        )
        self._global_names.pop()
        return newnode

    def visit_functiondef(
        self, node: ast.FunctionDef, parent: NodeNG
    ) -> nodes.FunctionDef:
        return self._visit_functiondef(nodes.FunctionDef, node, parent)

    def visit_generatorexp(
        self, node: ast.GeneratorExp, parent: NodeNG
    ) -> nodes.GeneratorExp:
        """Visit a GeneratorExp node by returning a fresh instance of it."""
        newnode = nodes.GeneratorExp(
            lineno=node.lineno,
            col_offset=node.col_offset,
            end_lineno=node.end_lineno,
            end_col_offset=node.end_col_offset,
            parent=parent,
        )
        newnode.postinit(
            self.visit(node.elt, newnode),
            [self.visit(child, newnode) for child in node.generators],
        )
        return newnode

    def visit_attribute(
        self, node: ast.Attribute, parent: NodeNG
    ) -> nodes.Attribute | nodes.AssignAttr | nodes.DelAttr:
        """Visit an Attribute node by returning a fresh instance of it."""
        context = self._get_context(node)
        newnode: nodes.Attribute | nodes.AssignAttr | nodes.DelAttr
        if context == Context.Del:
            # FIXME : maybe we should reintroduce and visit_delattr ?
            # for instance, deactivating assign_ctx
            newnode = nodes.DelAttr(
                attrname=node.attr,
                lineno=node.lineno,
                col_offset=node.col_offset,
                end_lineno=node.end_lineno,
                end_col_offset=node.end_col_offset,
                parent=parent,
            )
        elif context == Context.Store:
            newnode = nodes.AssignAttr(
                attrname=node.attr,
                lineno=node.lineno,
                col_offset=node.col_offset,
                end_lineno=node.end_lineno,
                end_col_offset=node.end_col_offset,
                parent=parent,
            )
            # Prohibit a local save if we are in an ExceptHandler.
            if not isinstance(parent, nodes.ExceptHandler):
                # mypy doesn't recognize that newnode has to be AssignAttr because it
                # doesn't support ParamSpec
                # See https://github.com/python/mypy/issues/8645
                self._delayed_assattr.append(newnode)  # type: ignore[arg-type]
        else:
            newnode = nodes.Attribute(
                attrname=node.attr,
                lineno=node.lineno,
                col_offset=node.col_offset,
                end_lineno=node.end_lineno,
                end_col_offset=node.end_col_offset,
                parent=parent,
            )
        newnode.postinit(self.visit(node.value, newnode))
        return newnode

    def visit_global(self, node: ast.Global, parent: NodeNG) -> nodes.Global:
        """Visit a Global node to become astroid."""
        newnode = nodes.Global(
            names=node.names,
            lineno=node.lineno,
            col_offset=node.col_offset,
            end_lineno=node.end_lineno,
            end_col_offset=node.end_col_offset,
            parent=parent,
        )
        if self._global_names:  # global at the module level, no effect
            for name in node.names:
                self._global_names[-1].setdefault(name, []).append(newnode)
        return newnode

    def visit_if(self, node: ast.If, parent: NodeNG) -> nodes.If:
        """Visit an If node by returning a fresh instance of it."""
        newnode = nodes.If(
            lineno=node.lineno,
            col_offset=node.col_offset,
            end_lineno=node.end_lineno,
            end_col_offset=node.end_col_offset,
            parent=parent,
        )
        newnode.postinit(
            self.visit(node.test, newnode),
            [self.visit(child, newnode) for child in node.body],
            [self.visit(child, newnode) for child in node.orelse],
        )
        return newnode

    def visit_ifexp(self, node: ast.IfExp, parent: NodeNG) -> nodes.IfExp:
        """Visit an IfExp node by returning a fresh instance of it."""
        newnode = nodes.IfExp(
            lineno=node.lineno,
            col_offset=node.col_offset,
            end_lineno=node.end_lineno,
            end_col_offset=node.end_col_offset,
            parent=parent,
        )
        newnode.postinit(
            self.visit(node.test, newnode),
            self.visit(node.body, newnode),
            self.visit(node.orelse, newnode),
        )
        return newnode

    def visit_import(self, node: ast.Import, parent: NodeNG) -> nodes.Import:
        """Visit an Import node by returning a fresh instance of it."""
        names = [(alias.name, alias.asname) for alias in node.names]
        newnode = nodes.Import(
            names=names,
            lineno=node.lineno,
            col_offset=node.col_offset,
            end_lineno=node.end_lineno,
            end_col_offset=node.end_col_offset,
            parent=parent,
        )
        # save import names in parent's locals:
        for name, asname in newnode.names:
            name = asname or name
            parent.set_local(name.split(".")[0], newnode)
        return newnode

    def visit_joinedstr(self, node: ast.JoinedStr, parent: NodeNG) -> nodes.JoinedStr:
        newnode = nodes.JoinedStr(
            lineno=node.lineno,
            col_offset=node.col_offset,
            end_lineno=node.end_lineno,
            end_col_offset=node.end_col_offset,
            parent=parent,
        )
        newnode.postinit([self.visit(child, newnode) for child in node.values])
        return newnode

    def visit_formattedvalue(
        self, node: ast.FormattedValue, parent: NodeNG
    ) -> nodes.FormattedValue:
        newnode = nodes.FormattedValue(
            lineno=node.lineno,
            col_offset=node.col_offset,
            end_lineno=node.end_lineno,
            end_col_offset=node.end_col_offset,
            parent=parent,
        )
        newnode.postinit(
            value=self.visit(node.value, newnode),
            conversion=node.conversion,
            format_spec=self.visit(node.format_spec, newnode),
        )
        return newnode

    def visit_namedexpr(self, node: ast.NamedExpr, parent: NodeNG) -> nodes.NamedExpr:
        newnode = nodes.NamedExpr(
            lineno=node.lineno,
            col_offset=node.col_offset,
            end_lineno=node.end_lineno,
            end_col_offset=node.end_col_offset,
            parent=parent,
        )
        newnode.postinit(
            self.visit(node.target, newnode), self.visit(node.value, newnode)
        )
        return newnode

    if sys.version_info < (3, 9):
        # Not used in Python 3.9+.
        def visit_extslice(
            self, node: ast.ExtSlice, parent: nodes.Subscript
        ) -> nodes.Tuple:
            """Visit an ExtSlice node by returning a fresh instance of Tuple."""
            # ExtSlice doesn't have lineno or col_offset information
            newnode = nodes.Tuple(ctx=Context.Load, parent=parent)
            newnode.postinit([self.visit(dim, newnode) for dim in node.dims])
            return newnode

        def visit_index(self, node: ast.Index, parent: nodes.Subscript) -> NodeNG:
            """Visit an Index node by returning a fresh instance of NodeNG."""
            return self.visit(node.value, parent)

    def visit_keyword(self, node: ast.keyword, parent: NodeNG) -> nodes.Keyword:
        """Visit a Keyword node by returning a fresh instance of it."""
        newnode = nodes.Keyword(
            arg=node.arg,
            # position attributes added in 3.9
            lineno=getattr(node, "lineno", None),
            col_offset=getattr(node, "col_offset", None),
            end_lineno=getattr(node, "end_lineno", None),
            end_col_offset=getattr(node, "end_col_offset", None),
            parent=parent,
        )
        newnode.postinit(self.visit(node.value, newnode))
        return newnode

    def visit_lambda(self, node: ast.Lambda, parent: NodeNG) -> nodes.Lambda:
        """Visit a Lambda node by returning a fresh instance of it."""
        newnode = nodes.Lambda(
            lineno=node.lineno,
            col_offset=node.col_offset,
            end_lineno=node.end_lineno,
            end_col_offset=node.end_col_offset,
            parent=parent,
        )
        newnode.postinit(self.visit(node.args, newnode), self.visit(node.body, newnode))
        return newnode

    def visit_list(self, node: ast.List, parent: NodeNG) -> nodes.List:
        """Visit a List node by returning a fresh instance of it."""
        context = self._get_context(node)
        newnode = nodes.List(
            ctx=context,
            lineno=node.lineno,
            col_offset=node.col_offset,
            end_lineno=node.end_lineno,
            end_col_offset=node.end_col_offset,
            parent=parent,
        )
        newnode.postinit([self.visit(child, newnode) for child in node.elts])
        return newnode

    def visit_listcomp(self, node: ast.ListComp, parent: NodeNG) -> nodes.ListComp:
        """Visit a ListComp node by returning a fresh instance of it."""
        newnode = nodes.ListComp(
            lineno=node.lineno,
            col_offset=node.col_offset,
            end_lineno=node.end_lineno,
            end_col_offset=node.end_col_offset,
            parent=parent,
        )
        newnode.postinit(
            self.visit(node.elt, newnode),
            [self.visit(child, newnode) for child in node.generators],
        )
        return newnode

    def visit_name(
        self, node: ast.Name, parent: NodeNG
    ) -> nodes.Name | nodes.AssignName | nodes.DelName:
        """Visit a Name node by returning a fresh instance of it."""
        context = self._get_context(node)
        newnode: nodes.Name | nodes.AssignName | nodes.DelName
        if context == Context.Del:
            newnode = nodes.DelName(
                name=node.id,
                lineno=node.lineno,
                col_offset=node.col_offset,
                end_lineno=node.end_lineno,
                end_col_offset=node.end_col_offset,
                parent=parent,
            )
        elif context == Context.Store:
            newnode = nodes.AssignName(
                name=node.id,
                lineno=node.lineno,
                col_offset=node.col_offset,
                end_lineno=node.end_lineno,
                end_col_offset=node.end_col_offset,
                parent=parent,
            )
        else:
            newnode = nodes.Name(
                name=node.id,
                lineno=node.lineno,
                col_offset=node.col_offset,
                end_lineno=node.end_lineno,
                end_col_offset=node.end_col_offset,
                parent=parent,
            )
        # XXX REMOVE me :
        if context in (Context.Del, Context.Store):  # 'Aug' ??
            newnode = cast(Union[nodes.AssignName, nodes.DelName], newnode)
            self._save_assignment(newnode)
        return newnode

    def visit_nonlocal(self, node: ast.Nonlocal, parent: NodeNG) -> nodes.Nonlocal:
        """Visit a Nonlocal node and return a new instance of it."""
        return nodes.Nonlocal(
            names=node.names,
            lineno=node.lineno,
            col_offset=node.col_offset,
            end_lineno=node.end_lineno,
            end_col_offset=node.end_col_offset,
            parent=parent,
        )

    def visit_constant(self, node: ast.Constant, parent: NodeNG) -> nodes.Const:
        """Visit a Constant node by returning a fresh instance of Const."""
        return nodes.Const(
            value=node.value,
            kind=node.kind,
            lineno=node.lineno,
            col_offset=node.col_offset,
            end_lineno=node.end_lineno,
            end_col_offset=node.end_col_offset,
            parent=parent,
        )

    def visit_paramspec(self, node: ast.ParamSpec, parent: NodeNG) -> nodes.ParamSpec:
        """Visit a ParamSpec node by returning a fresh instance of it."""
        newnode = nodes.ParamSpec(
            lineno=node.lineno,
            col_offset=node.col_offset,
            end_lineno=node.end_lineno,
            end_col_offset=node.end_col_offset,
            parent=parent,
        )
        # Add AssignName node for 'node.name'
        # https://bugs.python.org/issue43994
        newnode.postinit(name=self.visit_assignname(node, newnode, node.name))
        return newnode

    def visit_pass(self, node: ast.Pass, parent: NodeNG) -> nodes.Pass:
        """Visit a Pass node by returning a fresh instance of it."""
        return nodes.Pass(
            lineno=node.lineno,
            col_offset=node.col_offset,
            end_lineno=node.end_lineno,
            end_col_offset=node.end_col_offset,
            parent=parent,
        )

    def visit_raise(self, node: ast.Raise, parent: NodeNG) -> nodes.Raise:
        """Visit a Raise node by returning a fresh instance of it."""
        newnode = nodes.Raise(
            lineno=node.lineno,
            col_offset=node.col_offset,
            end_lineno=node.end_lineno,
            end_col_offset=node.end_col_offset,
            parent=parent,
        )
        # no traceback; anyway it is not used in Pylint
        newnode.postinit(
            exc=self.visit(node.exc, newnode),
            cause=self.visit(node.cause, newnode),
        )
        return newnode

    def visit_return(self, node: ast.Return, parent: NodeNG) -> nodes.Return:
        """Visit a Return node by returning a fresh instance of it."""
        newnode = nodes.Return(
            lineno=node.lineno,
            col_offset=node.col_offset,
            end_lineno=node.end_lineno,
            end_col_offset=node.end_col_offset,
            parent=parent,
        )
        newnode.postinit(self.visit(node.value, newnode))
        return newnode

    def visit_set(self, node: ast.Set, parent: NodeNG) -> nodes.Set:
        """Visit a Set node by returning a fresh instance of it."""
        newnode = nodes.Set(
            lineno=node.lineno,
            col_offset=node.col_offset,
            end_lineno=node.end_lineno,
            end_col_offset=node.end_col_offset,
            parent=parent,
        )
        newnode.postinit([self.visit(child, newnode) for child in node.elts])
        return newnode

    def visit_setcomp(self, node: ast.SetComp, parent: NodeNG) -> nodes.SetComp:
        """Visit a SetComp node by returning a fresh instance of it."""
        newnode = nodes.SetComp(
            lineno=node.lineno,
            col_offset=node.col_offset,
            end_lineno=node.end_lineno,
            end_col_offset=node.end_col_offset,
            parent=parent,
        )
        newnode.postinit(
            self.visit(node.elt, newnode),
            [self.visit(child, newnode) for child in node.generators],
        )
        return newnode

    def visit_slice(self, node: ast.Slice, parent: nodes.Subscript) -> nodes.Slice:
        """Visit a Slice node by returning a fresh instance of it."""
        newnode = nodes.Slice(
            # position attributes added in 3.9
            lineno=getattr(node, "lineno", None),
            col_offset=getattr(node, "col_offset", None),
            end_lineno=getattr(node, "end_lineno", None),
            end_col_offset=getattr(node, "end_col_offset", None),
            parent=parent,
        )
        newnode.postinit(
            lower=self.visit(node.lower, newnode),
            upper=self.visit(node.upper, newnode),
            step=self.visit(node.step, newnode),
        )
        return newnode

    def visit_subscript(self, node: ast.Subscript, parent: NodeNG) -> nodes.Subscript:
        """Visit a Subscript node by returning a fresh instance of it."""
        context = self._get_context(node)
        newnode = nodes.Subscript(
            ctx=context,
            lineno=node.lineno,
            col_offset=node.col_offset,
            end_lineno=node.end_lineno,
            end_col_offset=node.end_col_offset,
            parent=parent,
        )
        newnode.postinit(
            self.visit(node.value, newnode), self.visit(node.slice, newnode)
        )
        return newnode

    def visit_starred(self, node: ast.Starred, parent: NodeNG) -> nodes.Starred:
        """Visit a Starred node and return a new instance of it."""
        context = self._get_context(node)
        newnode = nodes.Starred(
            ctx=context,
            lineno=node.lineno,
            col_offset=node.col_offset,
            end_lineno=node.end_lineno,
            end_col_offset=node.end_col_offset,
            parent=parent,
        )
        newnode.postinit(self.visit(node.value, newnode))
        return newnode

    def visit_try(self, node: ast.Try, parent: NodeNG) -> nodes.Try:
        """Visit a Try node by returning a fresh instance of it."""
        newnode = nodes.Try(
            lineno=node.lineno,
            col_offset=node.col_offset,
            end_lineno=node.end_lineno,
            end_col_offset=node.end_col_offset,
            parent=parent,
        )
        newnode.postinit(
            body=[self.visit(child, newnode) for child in node.body],
            handlers=[self.visit(child, newnode) for child in node.handlers],
            orelse=[self.visit(child, newnode) for child in node.orelse],
            finalbody=[self.visit(child, newnode) for child in node.finalbody],
        )
        return newnode

    def visit_trystar(self, node: ast.TryStar, parent: NodeNG) -> nodes.TryStar:
        newnode = nodes.TryStar(
            lineno=node.lineno,
            col_offset=node.col_offset,
            end_lineno=node.end_lineno,
            end_col_offset=node.end_col_offset,
            parent=parent,
        )
        newnode.postinit(
            body=[self.visit(n, newnode) for n in node.body],
            handlers=[self.visit(n, newnode) for n in node.handlers],
            orelse=[self.visit(n, newnode) for n in node.orelse],
            finalbody=[self.visit(n, newnode) for n in node.finalbody],
        )
        return newnode

    def visit_tuple(self, node: ast.Tuple, parent: NodeNG) -> nodes.Tuple:
        """Visit a Tuple node by returning a fresh instance of it."""
        context = self._get_context(node)
        newnode = nodes.Tuple(
            ctx=context,
            lineno=node.lineno,
            col_offset=node.col_offset,
            end_lineno=node.end_lineno,
            end_col_offset=node.end_col_offset,
            parent=parent,
        )
        newnode.postinit([self.visit(child, newnode) for child in node.elts])
        return newnode

    def visit_typealias(self, node: ast.TypeAlias, parent: NodeNG) -> nodes.TypeAlias:
        """Visit a TypeAlias node by returning a fresh instance of it."""
        newnode = nodes.TypeAlias(
            lineno=node.lineno,
            col_offset=node.col_offset,
            end_lineno=node.end_lineno,
            end_col_offset=node.end_col_offset,
            parent=parent,
        )
        newnode.postinit(
            name=self.visit(node.name, newnode),
            type_params=[self.visit(p, newnode) for p in node.type_params],
            value=self.visit(node.value, newnode),
        )
        return newnode

    def visit_typevar(self, node: ast.TypeVar, parent: NodeNG) -> nodes.TypeVar:
        """Visit a TypeVar node by returning a fresh instance of it."""
        newnode = nodes.TypeVar(
            lineno=node.lineno,
            col_offset=node.col_offset,
            end_lineno=node.end_lineno,
            end_col_offset=node.end_col_offset,
            parent=parent,
        )
        # Add AssignName node for 'node.name'
        # https://bugs.python.org/issue43994
        newnode.postinit(
            name=self.visit_assignname(node, newnode, node.name),
            bound=self.visit(node.bound, newnode),
        )
        return newnode

    def visit_typevartuple(
        self, node: ast.TypeVarTuple, parent: NodeNG
    ) -> nodes.TypeVarTuple:
        """Visit a TypeVarTuple node by returning a fresh instance of it."""
        newnode = nodes.TypeVarTuple(
            lineno=node.lineno,
            col_offset=node.col_offset,
            end_lineno=node.end_lineno,
            end_col_offset=node.end_col_offset,
            parent=parent,
        )
        # Add AssignName node for 'node.name'
        # https://bugs.python.org/issue43994
        newnode.postinit(name=self.visit_assignname(node, newnode, node.name))
        return newnode

    def visit_unaryop(self, node: ast.UnaryOp, parent: NodeNG) -> nodes.UnaryOp:
        """Visit a UnaryOp node by returning a fresh instance of it."""
        newnode = nodes.UnaryOp(
            op=self._parser_module.unary_op_classes[node.op.__class__],
            lineno=node.lineno,
            col_offset=node.col_offset,
            end_lineno=node.end_lineno,
            end_col_offset=node.end_col_offset,
            parent=parent,
        )
        newnode.postinit(self.visit(node.operand, newnode))
        return newnode

    def visit_while(self, node: ast.While, parent: NodeNG) -> nodes.While:
        """Visit a While node by returning a fresh instance of it."""
        newnode = nodes.While(
            lineno=node.lineno,
            col_offset=node.col_offset,
            end_lineno=node.end_lineno,
            end_col_offset=node.end_col_offset,
            parent=parent,
        )
        newnode.postinit(
            self.visit(node.test, newnode),
            [self.visit(child, newnode) for child in node.body],
            [self.visit(child, newnode) for child in node.orelse],
        )
        return newnode

    @overload
    def _visit_with(
        self, cls: type[nodes.With], node: ast.With, parent: NodeNG
    ) -> nodes.With: ...

    @overload
    def _visit_with(
        self, cls: type[nodes.AsyncWith], node: ast.AsyncWith, parent: NodeNG
    ) -> nodes.AsyncWith: ...

    def _visit_with(
        self,
        cls: type[_WithT],
        node: ast.With | ast.AsyncWith,
        parent: NodeNG,
    ) -> _WithT:
        col_offset = node.col_offset
        if IS_PYPY and not PY39_PLUS and isinstance(node, ast.AsyncWith) and self._data:
            # pylint: disable-next=unsubscriptable-object
            col_offset = self._data[node.lineno - 1].index("async")

        newnode = cls(
            lineno=node.lineno,
            col_offset=col_offset,
            end_lineno=node.end_lineno,
            end_col_offset=node.end_col_offset,
            parent=parent,
        )

        def visit_child(child: ast.withitem) -> tuple[NodeNG, NodeNG | None]:
            expr = self.visit(child.context_expr, newnode)
            var = self.visit(child.optional_vars, newnode)
            return expr, var

        type_annotation = self.check_type_comment(node, parent=newnode)
        newnode.postinit(
            items=[visit_child(child) for child in node.items],
            body=[self.visit(child, newnode) for child in node.body],
            type_annotation=type_annotation,
        )
        return newnode

    def visit_with(self, node: ast.With, parent: NodeNG) -> NodeNG:
        return self._visit_with(nodes.With, node, parent)

    def visit_yield(self, node: ast.Yield, parent: NodeNG) -> NodeNG:
        """Visit a Yield node by returning a fresh instance of it."""
        newnode = nodes.Yield(
            lineno=node.lineno,
            col_offset=node.col_offset,
            end_lineno=node.end_lineno,
            end_col_offset=node.end_col_offset,
            parent=parent,
        )
        newnode.postinit(self.visit(node.value, newnode))
        return newnode

    def visit_yieldfrom(self, node: ast.YieldFrom, parent: NodeNG) -> NodeNG:
        newnode = nodes.YieldFrom(
            lineno=node.lineno,
            col_offset=node.col_offset,
            end_lineno=node.end_lineno,
            end_col_offset=node.end_col_offset,
            parent=parent,
        )
        newnode.postinit(self.visit(node.value, newnode))
        return newnode

    if sys.version_info >= (3, 10):

        def visit_match(self, node: ast.Match, parent: NodeNG) -> nodes.Match:
            newnode = nodes.Match(
                lineno=node.lineno,
                col_offset=node.col_offset,
                end_lineno=node.end_lineno,
                end_col_offset=node.end_col_offset,
                parent=parent,
            )
            newnode.postinit(
                subject=self.visit(node.subject, newnode),
                cases=[self.visit(case, newnode) for case in node.cases],
            )
            return newnode

        def visit_matchcase(
            self, node: ast.match_case, parent: NodeNG
        ) -> nodes.MatchCase:
            newnode = nodes.MatchCase(parent=parent)
            newnode.postinit(
                pattern=self.visit(node.pattern, newnode),
                guard=self.visit(node.guard, newnode),
                body=[self.visit(child, newnode) for child in node.body],
            )
            return newnode

        def visit_matchvalue(
            self, node: ast.MatchValue, parent: NodeNG
        ) -> nodes.MatchValue:
            newnode = nodes.MatchValue(
                lineno=node.lineno,
                col_offset=node.col_offset,
                end_lineno=node.end_lineno,
                end_col_offset=node.end_col_offset,
                parent=parent,
            )
            newnode.postinit(value=self.visit(node.value, newnode))
            return newnode

        def visit_matchsingleton(
            self, node: ast.MatchSingleton, parent: NodeNG
        ) -> nodes.MatchSingleton:
            return nodes.MatchSingleton(
                value=node.value,
                lineno=node.lineno,
                col_offset=node.col_offset,
                end_lineno=node.end_lineno,
                end_col_offset=node.end_col_offset,
                parent=parent,
            )

        def visit_matchsequence(
            self, node: ast.MatchSequence, parent: NodeNG
        ) -> nodes.MatchSequence:
            newnode = nodes.MatchSequence(
                lineno=node.lineno,
                col_offset=node.col_offset,
                end_lineno=node.end_lineno,
                end_col_offset=node.end_col_offset,
                parent=parent,
            )
            newnode.postinit(
                patterns=[self.visit(pattern, newnode) for pattern in node.patterns]
            )
            return newnode

        def visit_matchmapping(
            self, node: ast.MatchMapping, parent: NodeNG
        ) -> nodes.MatchMapping:
            newnode = nodes.MatchMapping(
                lineno=node.lineno,
                col_offset=node.col_offset,
                end_lineno=node.end_lineno,
                end_col_offset=node.end_col_offset,
                parent=parent,
            )
            # Add AssignName node for 'node.name'
            # https://bugs.python.org/issue43994
            newnode.postinit(
                keys=[self.visit(child, newnode) for child in node.keys],
                patterns=[self.visit(pattern, newnode) for pattern in node.patterns],
                rest=self.visit_assignname(node, newnode, node.rest),
            )
            return newnode

        def visit_matchclass(
            self, node: ast.MatchClass, parent: NodeNG
        ) -> nodes.MatchClass:
            newnode = nodes.MatchClass(
                lineno=node.lineno,
                col_offset=node.col_offset,
                end_lineno=node.end_lineno,
                end_col_offset=node.end_col_offset,
                parent=parent,
            )
            newnode.postinit(
                cls=self.visit(node.cls, newnode),
                patterns=[self.visit(pattern, newnode) for pattern in node.patterns],
                kwd_attrs=node.kwd_attrs,
                kwd_patterns=[
                    self.visit(pattern, newnode) for pattern in node.kwd_patterns
                ],
            )
            return newnode

        def visit_matchstar(
            self, node: ast.MatchStar, parent: NodeNG
        ) -> nodes.MatchStar:
            newnode = nodes.MatchStar(
                lineno=node.lineno,
                col_offset=node.col_offset,
                end_lineno=node.end_lineno,
                end_col_offset=node.end_col_offset,
                parent=parent,
            )
            # Add AssignName node for 'node.name'
            # https://bugs.python.org/issue43994
            newnode.postinit(name=self.visit_assignname(node, newnode, node.name))
            return newnode

        def visit_matchas(self, node: ast.MatchAs, parent: NodeNG) -> nodes.MatchAs:
            newnode = nodes.MatchAs(
                lineno=node.lineno,
                col_offset=node.col_offset,
                end_lineno=node.end_lineno,
                end_col_offset=node.end_col_offset,
                parent=parent,
            )
            # Add AssignName node for 'node.name'
            # https://bugs.python.org/issue43994
            newnode.postinit(
                pattern=self.visit(node.pattern, newnode),
                name=self.visit_assignname(node, newnode, node.name),
            )
            return newnode

        def visit_matchor(self, node: ast.MatchOr, parent: NodeNG) -> nodes.MatchOr:
            newnode = nodes.MatchOr(
                lineno=node.lineno,
                col_offset=node.col_offset,
                end_lineno=node.end_lineno,
                end_col_offset=node.end_col_offset,
                parent=parent,
            )
            newnode.postinit(
                patterns=[self.visit(pattern, newnode) for pattern in node.patterns]
            )
            return newnode

astroid-3.2.2/astroid/objects.py

# Licensed under the LGPL: https://www.gnu.org/licenses/old-licenses/lgpl-2.1.en.html
# For details: https://github.com/pylint-dev/astroid/blob/main/LICENSE
# Copyright (c) https://github.com/pylint-dev/astroid/blob/main/CONTRIBUTORS.txt

"""
Inference objects are a way to represent composite AST nodes,
which are used only as inference results, so they can't be found in the
original AST tree. For instance, inferring the following frozenset use,
leads to an inferred FrozenSet:

    Call(func=Name('frozenset'), args=Tuple(...))
"""

from __future__ import annotations

from collections.abc import Generator, Iterator
from functools import cached_property
from typing import Any, Literal, NoReturn, TypeVar

from astroid import bases, util
from astroid.context import InferenceContext
from astroid.exceptions import (
    AttributeInferenceError,
    InferenceError,
    MroError,
    SuperError,
)
from astroid.interpreter import objectmodel
from astroid.manager import AstroidManager
from astroid.nodes import node_classes, scoped_nodes
from astroid.typing import InferenceResult, SuccessfulInferenceResult

_T = TypeVar("_T")


class FrozenSet(node_classes.BaseContainer):
    """Class representing a FrozenSet composite node."""

    def pytype(self) -> Literal["builtins.frozenset"]:
        return "builtins.frozenset"

    def _infer(self, context: InferenceContext | None = None, **kwargs: Any):
        yield self

    @cached_property
    def _proxied(self):  # pylint: disable=method-hidden
        ast_builtins = AstroidManager().builtins_module
        return ast_builtins.getattr("frozenset")[0]


class Super(node_classes.NodeNG):
    """Proxy class over a super call.

    This class offers almost the same behaviour as Python's super,
    which is MRO lookups for retrieving attributes from the parents.

    The *mro_pointer* is the place in the MRO from where we should
    start looking, not counting it.
    *mro_type* is the object which provides the MRO, it can be
    both a type or an instance.
    *self_class* is the class where the super call is, while
    *scope* is the function where the super call is.
    """

    special_attributes = objectmodel.SuperModel()

    def __init__(
        self,
        mro_pointer: SuccessfulInferenceResult,
        mro_type: SuccessfulInferenceResult,
        self_class: scoped_nodes.ClassDef,
        scope: scoped_nodes.FunctionDef,
        call: node_classes.Call,
    ) -> None:
        self.type = mro_type
        self.mro_pointer = mro_pointer
        self._class_based = False
        self._self_class = self_class
        self._scope = scope
        super().__init__(
            parent=scope,
            lineno=scope.lineno,
            col_offset=scope.col_offset,
            end_lineno=scope.end_lineno,
            end_col_offset=scope.end_col_offset,
        )

    def _infer(self, context: InferenceContext | None = None, **kwargs: Any):
        yield self

    def super_mro(self):
        """Get the MRO which will be used to lookup attributes in this super."""
        if not isinstance(self.mro_pointer, scoped_nodes.ClassDef):
            raise SuperError(
                "The first argument to super must be a subtype of "
                "type, not {mro_pointer}.",
                super_=self,
            )

        if isinstance(self.type, scoped_nodes.ClassDef):
            # `super(type, type)`, most likely in a class method.
            self._class_based = True
            mro_type = self.type
        else:
            mro_type = getattr(self.type, "_proxied", None)
            if not isinstance(mro_type, (bases.Instance, scoped_nodes.ClassDef)):
                raise SuperError(
                    "The second argument to super must be an "
                    "instance or subtype of type, not {type}.",
                    super_=self,
                )

        if not mro_type.newstyle:
            raise SuperError("Unable to call super on old-style classes.", super_=self)

        mro = mro_type.mro()
        if self.mro_pointer not in mro:
            raise SuperError(
                "The second argument to super must be an "
                "instance or subtype of type, not {type}.",
                super_=self,
            )

        index = mro.index(self.mro_pointer)
        return mro[index + 1 :]

    @cached_property
    def _proxied(self):
        ast_builtins = AstroidManager().builtins_module
        return ast_builtins.getattr("super")[0]

    def pytype(self) -> Literal["builtins.super"]:
        return "builtins.super"

    def display_type(self) -> str:
        return "Super of"

    @property
    def name(self):
        """Get the name of the MRO pointer."""
        return self.mro_pointer.name

    def qname(self) -> Literal["super"]:
        return "super"

    def igetattr(  # noqa: C901
        self, name: str, context: InferenceContext | None = None
    ) -> Iterator[InferenceResult]:
        """Retrieve the inferred values of the given attribute name."""
        # '__class__' is a special attribute that should be taken directly
        # from the special attributes dict
        if name == "__class__":
            yield self.special_attributes.lookup(name)
            return

        try:
            mro = self.super_mro()
        # Don't let invalid MROs or invalid super calls
        # leak out as is from this function.
        except SuperError as exc:
            raise AttributeInferenceError(
                (
                    "Lookup for {name} on {target!r} because super call {super!r} "
                    "is invalid."
                ),
                target=self,
                attribute=name,
                context=context,
                super_=exc.super_,
            ) from exc
        except MroError as exc:
            raise AttributeInferenceError(
                (
                    "Lookup for {name} on {target!r} failed because {cls!r} has an "
                    "invalid MRO."
                ),
                target=self,
                attribute=name,
                context=context,
                mros=exc.mros,
                cls=exc.cls,
            ) from exc

        found = False
        for cls in mro:
            if name not in cls.locals:
                continue

            found = True
            for inferred in bases._infer_stmts([cls[name]], context, frame=self):
                if not isinstance(inferred, scoped_nodes.FunctionDef):
                    yield inferred
                    continue

                # We can obtain different descriptors from a super depending
                # on what we are accessing and where the super call is.
                if inferred.type == "classmethod":
                    yield bases.BoundMethod(inferred, cls)
                elif self._scope.type == "classmethod" and inferred.type == "method":
                    yield inferred
                elif self._class_based or inferred.type == "staticmethod":
                    yield inferred
                elif isinstance(inferred, Property):
                    function = inferred.function
                    try:
                        yield from function.infer_call_result(
                            caller=self, context=context
                        )
                    except InferenceError:
                        yield util.Uninferable
                elif bases._is_property(inferred):
                    # TODO: support other descriptors as well.
                    try:
                        yield from inferred.infer_call_result(self, context)
                    except InferenceError:
                        yield util.Uninferable
                else:
                    yield bases.BoundMethod(inferred, cls)

        # Only if we haven't found any explicit overwrites for the
        # attribute we look it up in the special attributes
        if not found and name in self.special_attributes:
            yield self.special_attributes.lookup(name)
            return

        if not found:
            raise AttributeInferenceError(target=self, attribute=name, context=context)

    def getattr(self, name, context: InferenceContext | None = None):
        return list(self.igetattr(name, context=context))


class ExceptionInstance(bases.Instance):
    """Class for instances of exceptions.

    It has special treatment for some of the exceptions' attributes,
    which are transformed at runtime into certain concrete objects, such as
    the case of .args.
    """

    @cached_property
    def special_attributes(self):
        qname = self.qname()
        instance = objectmodel.BUILTIN_EXCEPTIONS.get(
            qname, objectmodel.ExceptionInstanceModel
        )
        return instance()(self)


class DictInstance(bases.Instance):
    """Special kind of instances for dictionaries.

    This instance knows the underlying object model of the dictionaries,
    which means that methods such as .values or .items can be properly inferred.
    """

    special_attributes = objectmodel.DictModel()


# Custom objects tailored for dictionaries, which are used to
# disambiguate between the types of Python 2 dict's method returns
# and Python 3 (where they return set like objects).
class DictItems(bases.Proxy):
    __str__ = node_classes.NodeNG.__str__
    __repr__ = node_classes.NodeNG.__repr__


class DictKeys(bases.Proxy):
    __str__ = node_classes.NodeNG.__str__
    __repr__ = node_classes.NodeNG.__repr__


class DictValues(bases.Proxy):
    __str__ = node_classes.NodeNG.__str__
    __repr__ = node_classes.NodeNG.__repr__


class PartialFunction(scoped_nodes.FunctionDef):
    """A class representing partial function obtained via functools.partial."""

    def __init__(self, call, name=None, lineno=None, col_offset=None, parent=None):
        # TODO: Pass end_lineno, end_col_offset and parent as well
        super().__init__(
            name,
            lineno=lineno,
            col_offset=col_offset,
            parent=node_classes.Unknown(),
            end_col_offset=0,
            end_lineno=0,
        )
        # A typical FunctionDef automatically adds its name to the parent scope,
        # but a partial should not, so defer setting parent until after init
        self.parent = parent
        self.filled_args = call.positional_arguments[1:]
        self.filled_keywords = call.keyword_arguments

        wrapped_function = call.positional_arguments[0]
        inferred_wrapped_function = next(wrapped_function.infer())
        if isinstance(inferred_wrapped_function, PartialFunction):
            self.filled_args = inferred_wrapped_function.filled_args + self.filled_args
            self.filled_keywords = {
                **inferred_wrapped_function.filled_keywords,
                **self.filled_keywords,
            }

        self.filled_positionals =
len(self.filled_args) def infer_call_result( self, caller: SuccessfulInferenceResult | None, context: InferenceContext | None = None, ) -> Iterator[InferenceResult]: if context: assert ( context.callcontext ), "CallContext should be set before inferring call result" current_passed_keywords = { keyword for (keyword, _) in context.callcontext.keywords } for keyword, value in self.filled_keywords.items(): if keyword not in current_passed_keywords: context.callcontext.keywords.append((keyword, value)) call_context_args = context.callcontext.args or [] context.callcontext.args = self.filled_args + call_context_args return super().infer_call_result(caller=caller, context=context) def qname(self) -> str: return self.__class__.__name__ # TODO: Hack to solve the circular import problem between node_classes and objects # This is not needed in 2.0, which has a cleaner design overall node_classes.Dict.__bases__ = (node_classes.NodeNG, DictInstance) class Property(scoped_nodes.FunctionDef): """Class representing a Python property.""" def __init__(self, function, name=None, lineno=None, col_offset=None, parent=None): self.function = function super().__init__( name, lineno=lineno, col_offset=col_offset, parent=parent, end_col_offset=function.end_col_offset, end_lineno=function.end_lineno, ) special_attributes = objectmodel.PropertyModel() type = "property" def pytype(self) -> Literal["builtins.property"]: return "builtins.property" def infer_call_result( self, caller: SuccessfulInferenceResult | None, context: InferenceContext | None = None, ) -> NoReturn: raise InferenceError("Properties are not callable") def _infer( self: _T, context: InferenceContext | None = None, **kwargs: Any ) -> Generator[_T, None, None]: yield self astroid-3.2.2/astroid/builder.py0000664000175000017500000004444714622475517016570 0ustar epsilonepsilon# Licensed under the LGPL: https://www.gnu.org/licenses/old-licenses/lgpl-2.1.en.html # For details: https://github.com/pylint-dev/astroid/blob/main/LICENSE 
# Copyright (c) https://github.com/pylint-dev/astroid/blob/main/CONTRIBUTORS.txt """The AstroidBuilder makes astroid from living object and / or from _ast. The builder is not thread safe and can't be used to parse different sources at the same time. """ from __future__ import annotations import ast import os import textwrap import types import warnings from collections.abc import Iterator, Sequence from io import TextIOWrapper from tokenize import detect_encoding from astroid import bases, modutils, nodes, raw_building, rebuilder, util from astroid._ast import ParserModule, get_parser_module from astroid.const import PY312_PLUS from astroid.exceptions import AstroidBuildingError, AstroidSyntaxError, InferenceError from astroid.manager import AstroidManager # The name of the transient function that is used to # wrap expressions to be extracted when calling # extract_node. _TRANSIENT_FUNCTION = "__" # The comment used to select a statement to be extracted # when calling extract_node. _STATEMENT_SELECTOR = "#@" MISPLACED_TYPE_ANNOTATION_ERROR = "misplaced type annotation" if PY312_PLUS: warnings.filterwarnings("ignore", "invalid escape sequence", SyntaxWarning) def open_source_file(filename: str) -> tuple[TextIOWrapper, str, str]: # pylint: disable=consider-using-with with open(filename, "rb") as byte_stream: encoding = detect_encoding(byte_stream.readline)[0] stream = open(filename, newline=None, encoding=encoding) data = stream.read() return stream, encoding, data def _can_assign_attr(node: nodes.ClassDef, attrname: str | None) -> bool: try: slots = node.slots() except NotImplementedError: pass else: if slots and attrname not in {slot.value for slot in slots}: return False return node.qname() != "builtins.object" class AstroidBuilder(raw_building.InspectBuilder): """Class for building an astroid tree from source code or from a live module. The param *manager* specifies the manager class which should be used. If no manager is given, then the default one will be used. 
The param *apply_transforms* determines if the transforms should be applied after the tree was built from source or from a live object, by default being True. """ def __init__( self, manager: AstroidManager | None = None, apply_transforms: bool = True ) -> None: super().__init__(manager) self._apply_transforms = apply_transforms if not raw_building.InspectBuilder.bootstrapped: raw_building._astroid_bootstrapping() def module_build( self, module: types.ModuleType, modname: str | None = None ) -> nodes.Module: """Build an astroid from a living module instance.""" node = None path = getattr(module, "__file__", None) loader = getattr(module, "__loader__", None) # Prefer the loader to get the source rather than assuming we have a # filesystem to read the source file from ourselves. if loader: modname = modname or module.__name__ source = loader.get_source(modname) if source: node = self.string_build(source, modname, path=path) if node is None and path is not None: path_, ext = os.path.splitext(modutils._path_from_filename(path)) if ext in {".py", ".pyc", ".pyo"} and os.path.exists(path_ + ".py"): node = self.file_build(path_ + ".py", modname) if node is None: # this is a built-in module # get a partial representation by introspection node = self.inspect_build(module, modname=modname, path=path) if self._apply_transforms: # We have to handle transformation by ourselves since the # rebuilder isn't called for builtin nodes node = self._manager.visit_transforms(node) assert isinstance(node, nodes.Module) return node def file_build(self, path: str, modname: str | None = None) -> nodes.Module: """Build astroid from a source code file (i.e. from an ast). 
*path* is expected to be a python source file """ try: stream, encoding, data = open_source_file(path) except OSError as exc: raise AstroidBuildingError( "Unable to load file {path}:\n{error}", modname=modname, path=path, error=exc, ) from exc except (SyntaxError, LookupError) as exc: raise AstroidSyntaxError( "Python 3 encoding specification error or unknown encoding:\n" "{error}", modname=modname, path=path, error=exc, ) from exc except UnicodeError as exc: # wrong encoding # detect_encoding returns utf-8 if no encoding specified raise AstroidBuildingError( "Wrong or no encoding specified for {filename}.", filename=path ) from exc with stream: # get module name if necessary if modname is None: try: modname = ".".join(modutils.modpath_from_file(path)) except ImportError: modname = os.path.splitext(os.path.basename(path))[0] # build astroid representation module, builder = self._data_build(data, modname, path) return self._post_build(module, builder, encoding) def string_build( self, data: str, modname: str = "", path: str | None = None ) -> nodes.Module: """Build astroid from source code string.""" module, builder = self._data_build(data, modname, path) module.file_bytes = data.encode("utf-8") return self._post_build(module, builder, "utf-8") def _post_build( self, module: nodes.Module, builder: rebuilder.TreeRebuilder, encoding: str ) -> nodes.Module: """Handles encoding and delayed nodes after a module has been built.""" module.file_encoding = encoding self._manager.cache_module(module) # post tree building steps after we stored the module in the cache: for from_node in builder._import_from_nodes: if from_node.modname == "__future__": for symbol, _ in from_node.names: module.future_imports.add(symbol) self.add_from_names_to_locals(from_node) # handle delayed assattr nodes for delayed in builder._delayed_assattr: self.delayed_assattr(delayed) # Visit the transforms if self._apply_transforms: module = self._manager.visit_transforms(module) return module def 
_data_build( self, data: str, modname: str, path: str | None ) -> tuple[nodes.Module, rebuilder.TreeRebuilder]: """Build tree node from data and add some information.""" try: node, parser_module = _parse_string( data, type_comments=True, modname=modname ) except (TypeError, ValueError, SyntaxError) as exc: raise AstroidSyntaxError( "Parsing Python code failed:\n{error}", source=data, modname=modname, path=path, error=exc, ) from exc if path is not None: node_file = os.path.abspath(path) else: node_file = "<?>" if modname.endswith(".__init__"): modname = modname[:-9] package = True else: package = ( path is not None and os.path.splitext(os.path.basename(path))[0] == "__init__" ) builder = rebuilder.TreeRebuilder(self._manager, parser_module, data) module = builder.visit_module(node, modname, node_file, package) return module, builder def add_from_names_to_locals(self, node: nodes.ImportFrom) -> None: """Store imported names to the locals. Re-sort the locals if coming from a delayed node. """ def _key_func(node: nodes.NodeNG) -> int: return node.fromlineno or 0 def sort_locals(my_list: list[nodes.NodeNG]) -> None: my_list.sort(key=_key_func) assert node.parent # It should always default to the module for name, asname in node.names: if name == "*": try: imported = node.do_import_module() except AstroidBuildingError: continue for name in imported.public_names(): node.parent.set_local(name, node) sort_locals(node.parent.scope().locals[name]) # type: ignore[arg-type] else: node.parent.set_local(asname or name, node) sort_locals(node.parent.scope().locals[asname or name]) # type: ignore[arg-type] def delayed_assattr(self, node: nodes.AssignAttr) -> None: """Visit an AssignAttr node. This adds the name to locals and handles member definitions. 
""" from astroid import objects # pylint: disable=import-outside-toplevel try: for inferred in node.expr.infer(): if isinstance(inferred, util.UninferableBase): continue try: # pylint: disable=unidiomatic-typecheck # We want a narrow check on the # parent type, not all of its subclasses if ( type(inferred) == bases.Instance or type(inferred) == objects.ExceptionInstance ): inferred = inferred._proxied iattrs = inferred.instance_attrs if not _can_assign_attr(inferred, node.attrname): continue elif isinstance(inferred, bases.Instance): # Const, Tuple or other containers that inherit from # `Instance` continue elif isinstance(inferred, (bases.Proxy, util.UninferableBase)): continue elif inferred.is_function: iattrs = inferred.instance_attrs else: iattrs = inferred.locals except AttributeError: # XXX log error continue values = iattrs.setdefault(node.attrname, []) if node in values: continue values.append(node) except InferenceError: pass def build_namespace_package_module(name: str, path: Sequence[str]) -> nodes.Module: module = nodes.Module(name, path=path, package=True) module.postinit(body=[], doc_node=None) return module def parse( code: str, module_name: str = "", path: str | None = None, apply_transforms: bool = True, ) -> nodes.Module: """Parses a source string in order to obtain an astroid AST from it. :param str code: The code for the module. :param str module_name: The name for the module, if any :param str path: The path for the module :param bool apply_transforms: Apply the transforms for the give code. Use it if you don't want the default transforms to be applied. """ code = textwrap.dedent(code) builder = AstroidBuilder( manager=AstroidManager(), apply_transforms=apply_transforms ) return builder.string_build(code, modname=module_name, path=path) def _extract_expressions(node: nodes.NodeNG) -> Iterator[nodes.NodeNG]: """Find expressions in a call to _TRANSIENT_FUNCTION and extract them. 
The function walks the AST recursively to search for expressions that are wrapped into a call to _TRANSIENT_FUNCTION. If it finds such an expression, it completely removes the function call node from the tree, replacing it by the wrapped expression inside the parent. :param node: An astroid node. :type node: astroid.bases.NodeNG :yields: The sequence of wrapped expressions on the modified tree. """ if ( isinstance(node, nodes.Call) and isinstance(node.func, nodes.Name) and node.func.name == _TRANSIENT_FUNCTION ): real_expr = node.args[0] assert node.parent real_expr.parent = node.parent # Search for node in all _astroid_fields (the fields checked when # get_children is called) of its parent. Some of those fields may # be lists or tuples, in which case the elements need to be checked. # When we find it, replace it by real_expr, so that the AST looks # like no call to _TRANSIENT_FUNCTION ever took place. for name in node.parent._astroid_fields: child = getattr(node.parent, name) if isinstance(child, list): for idx, compound_child in enumerate(child): if compound_child is node: child[idx] = real_expr elif child is node: setattr(node.parent, name, real_expr) yield real_expr else: for child in node.get_children(): yield from _extract_expressions(child) def _find_statement_by_line(node: nodes.NodeNG, line: int) -> nodes.NodeNG | None: """Extracts the statement on a specific line from an AST. If the line number of node matches line, it will be returned; otherwise its children are iterated and the function is called recursively. :param node: An astroid node. :type node: astroid.bases.NodeNG :param line: The line number of the statement to extract. :type line: int :returns: The statement on the line, or None if no statement for the line can be found. 
:rtype: astroid.bases.NodeNG or None """ if isinstance(node, (nodes.ClassDef, nodes.FunctionDef, nodes.MatchCase)): # This is an inaccuracy in the AST: the nodes that can be # decorated do not carry explicit information on which line # the actual definition (class/def) starts, but .fromlineno seems to # be close enough. node_line = node.fromlineno else: node_line = node.lineno if node_line == line: return node for child in node.get_children(): result = _find_statement_by_line(child, line) if result: return result return None def extract_node(code: str, module_name: str = "") -> nodes.NodeNG | list[nodes.NodeNG]: """Parses some Python code as a module and extracts a designated AST node. Statements: To extract one or more statement nodes, append #@ to the end of the line Examples: >>> def x(): >>> def y(): >>> return 1 #@ The return statement will be extracted. >>> class X(object): >>> def meth(self): #@ >>> pass The function object 'meth' will be extracted. Expressions: To extract arbitrary expressions, surround them with the fake function call __(...). After parsing, the surrounded expression will be returned and the whole AST (accessible via the returned node's parent attribute) will look like the function call was never there in the first place. Examples: >>> a = __(1) The const node will be extracted. >>> def x(d=__(foo.bar)): pass The node containing the default argument will be extracted. >>> def foo(a, b): >>> return 0 < __(len(a)) < b The node containing the function call 'len' will be extracted. If no statements or expressions are selected, the last toplevel statement will be returned. If the selected statement is a discard statement (i.e. an expression turned into a statement), the wrapped expression is returned instead. For convenience, singleton lists are unpacked. :param str code: A piece of Python code that is parsed as a module. Will be passed through textwrap.dedent first. :param str module_name: The name of the module. 
:returns: The designated node from the parse tree, or a list of nodes. """ def _extract(node: nodes.NodeNG | None) -> nodes.NodeNG | None: if isinstance(node, nodes.Expr): return node.value return node requested_lines: list[int] = [] for idx, line in enumerate(code.splitlines()): if line.strip().endswith(_STATEMENT_SELECTOR): requested_lines.append(idx + 1) tree = parse(code, module_name=module_name) if not tree.body: raise ValueError("Empty tree, cannot extract from it") extracted: list[nodes.NodeNG | None] = [] if requested_lines: extracted = [_find_statement_by_line(tree, line) for line in requested_lines] # Modifies the tree. extracted.extend(_extract_expressions(tree)) if not extracted: extracted.append(tree.body[-1]) extracted = [_extract(node) for node in extracted] extracted_without_none = [node for node in extracted if node is not None] if len(extracted_without_none) == 1: return extracted_without_none[0] return extracted_without_none def _extract_single_node(code: str, module_name: str = "") -> nodes.NodeNG: """Call extract_node while making sure that only one value is returned.""" ret = extract_node(code, module_name) if isinstance(ret, list): return ret[0] return ret def _parse_string( data: str, type_comments: bool = True, modname: str | None = None ) -> tuple[ast.Module, ParserModule]: parser_module = get_parser_module(type_comments=type_comments) try: parsed = parser_module.parse( data + "\n", type_comments=type_comments, filename=modname ) except SyntaxError as exc: # If the type annotations are misplaced for some reason, we do not want # to fail the entire parsing of the file, so we need to retry the parsing without # type comment support. 
if exc.args[0] != MISPLACED_TYPE_ANNOTATION_ERROR or not type_comments: raise parser_module = get_parser_module(type_comments=False) parsed = parser_module.parse(data + "\n", type_comments=False) return parsed, parser_module astroid-3.2.2/astroid/arguments.py0000664000175000017500000003127614622475517017143 0ustar epsilonepsilon# Licensed under the LGPL: https://www.gnu.org/licenses/old-licenses/lgpl-2.1.en.html # For details: https://github.com/pylint-dev/astroid/blob/main/LICENSE # Copyright (c) https://github.com/pylint-dev/astroid/blob/main/CONTRIBUTORS.txt from __future__ import annotations from astroid import nodes from astroid.bases import Instance from astroid.context import CallContext, InferenceContext from astroid.exceptions import InferenceError, NoDefault from astroid.typing import InferenceResult from astroid.util import Uninferable, UninferableBase, safe_infer class CallSite: """Class for understanding arguments passed into a call site. It needs a call context, which contains the arguments and the keyword arguments that were passed into a given call site. In order to infer what an argument represents, call :meth:`infer_argument` with the corresponding function node and the argument name. :param callcontext: An instance of :class:`astroid.context.CallContext`, that holds the arguments for the call site. :param argument_context_map: Additional contexts per node, passed in from :attr:`astroid.context.Context.extra_context` :param context: An instance of :class:`astroid.context.Context`. 
""" def __init__( self, callcontext: CallContext, argument_context_map=None, context: InferenceContext | None = None, ): if argument_context_map is None: argument_context_map = {} self.argument_context_map = argument_context_map args = callcontext.args keywords = callcontext.keywords self.duplicated_keywords: set[str] = set() self._unpacked_args = self._unpack_args(args, context=context) self._unpacked_kwargs = self._unpack_keywords(keywords, context=context) self.positional_arguments = [ arg for arg in self._unpacked_args if not isinstance(arg, UninferableBase) ] self.keyword_arguments = { key: value for key, value in self._unpacked_kwargs.items() if not isinstance(value, UninferableBase) } @classmethod def from_call(cls, call_node, context: InferenceContext | None = None): """Get a CallSite object from the given Call node. context will be used to force a single inference path. """ # Determine the callcontext from the given `context` object if any. context = context or InferenceContext() callcontext = CallContext(call_node.args, call_node.keywords) return cls(callcontext, context=context) def has_invalid_arguments(self): """Check if in the current CallSite were passed *invalid* arguments. This can mean multiple things. For instance, if an unpacking of an invalid object was passed, then this method will return True. Other cases can be when the arguments can't be inferred by astroid, for example, by passing objects which aren't known statically. """ return len(self.positional_arguments) != len(self._unpacked_args) def has_invalid_keywords(self) -> bool: """Check if in the current CallSite were passed *invalid* keyword arguments. For instance, unpacking a dictionary with integer keys is invalid (**{1:2}), because the keys must be strings, which will make this method to return True. Other cases where this might return True if objects which can't be inferred were passed. 
""" return len(self.keyword_arguments) != len(self._unpacked_kwargs) def _unpack_keywords( self, keywords: list[tuple[str | None, nodes.NodeNG]], context: InferenceContext | None = None, ): values: dict[str | None, InferenceResult] = {} context = context or InferenceContext() context.extra_context = self.argument_context_map for name, value in keywords: if name is None: # Then it's an unpacking operation (**) inferred = safe_infer(value, context=context) if not isinstance(inferred, nodes.Dict): # Not something we can work with. values[name] = Uninferable continue for dict_key, dict_value in inferred.items: dict_key = safe_infer(dict_key, context=context) if not isinstance(dict_key, nodes.Const): values[name] = Uninferable continue if not isinstance(dict_key.value, str): values[name] = Uninferable continue if dict_key.value in values: # The name is already in the dictionary values[dict_key.value] = Uninferable self.duplicated_keywords.add(dict_key.value) continue values[dict_key.value] = dict_value else: values[name] = value return values def _unpack_args(self, args, context: InferenceContext | None = None): values = [] context = context or InferenceContext() context.extra_context = self.argument_context_map for arg in args: if isinstance(arg, nodes.Starred): inferred = safe_infer(arg.value, context=context) if isinstance(inferred, UninferableBase): values.append(Uninferable) continue if not hasattr(inferred, "elts"): values.append(Uninferable) continue values.extend(inferred.elts) else: values.append(arg) return values def infer_argument( self, funcnode: InferenceResult, name: str, context: InferenceContext ): # noqa: C901 """Infer a function argument value according to the call context.""" if not isinstance(funcnode, (nodes.FunctionDef, nodes.Lambda)): raise InferenceError( f"Can not infer function argument value for non-function node {funcnode!r}.", call_site=self, func=funcnode, arg=name, context=context, ) if name in self.duplicated_keywords: raise 
InferenceError( "The arguments passed to {func!r} have duplicate keywords.", call_site=self, func=funcnode, arg=name, context=context, ) # Look into the keywords first, maybe it's already there. try: return self.keyword_arguments[name].infer(context) except KeyError: pass # Too many arguments given and no variable arguments. if len(self.positional_arguments) > len(funcnode.args.args): if not funcnode.args.vararg and not funcnode.args.posonlyargs: raise InferenceError( "Too many positional arguments " "passed to {func!r} that does " "not have *args.", call_site=self, func=funcnode, arg=name, context=context, ) positional = self.positional_arguments[: len(funcnode.args.args)] vararg = self.positional_arguments[len(funcnode.args.args) :] # preserving previous behavior, when vararg and kwarg were not included in find_argname results if name in [funcnode.args.vararg, funcnode.args.kwarg]: argindex = None else: argindex = funcnode.args.find_argname(name)[0] kwonlyargs = {arg.name for arg in funcnode.args.kwonlyargs} kwargs = { key: value for key, value in self.keyword_arguments.items() if key not in kwonlyargs } # If there are too few positionals compared to # what the function expects to receive, check to see # if the missing positional arguments were passed # as keyword arguments and if so, place them into the # positional args list. if len(positional) < len(funcnode.args.args): for func_arg in funcnode.args.args: if func_arg.name in kwargs: arg = kwargs.pop(func_arg.name) positional.append(arg) if argindex is not None: boundnode = context.boundnode # 2. first argument of instance/class method if argindex == 0 and funcnode.type in {"method", "classmethod"}: # context.boundnode is None when an instance method is called with # the class, e.g. MyClass.method(obj, ...). In this case, self # is the first argument. if boundnode is None and funcnode.type == "method" and positional: return positional[0].infer(context=context) if boundnode is None: # XXX can do better ? 
boundnode = funcnode.parent.frame() if isinstance(boundnode, nodes.ClassDef): # Verify that we're accessing a method # of the metaclass through a class, as in # `cls.metaclass_method`. In this case, the # first argument is always the class. method_scope = funcnode.parent.scope() if method_scope is boundnode.metaclass(context=context): return iter((boundnode,)) if funcnode.type == "method": if not isinstance(boundnode, Instance): boundnode = boundnode.instantiate_class() return iter((boundnode,)) if funcnode.type == "classmethod": return iter((boundnode,)) # if we have a method, extract one position # from the index, so we'll take in account # the extra parameter represented by `self` or `cls` if funcnode.type in {"method", "classmethod"} and boundnode: argindex -= 1 # 2. search arg index try: return self.positional_arguments[argindex].infer(context) except IndexError: pass if funcnode.args.kwarg == name: # It wants all the keywords that were passed into # the call site. if self.has_invalid_keywords(): raise InferenceError( "Inference failed to find values for all keyword arguments " "to {func!r}: {unpacked_kwargs!r} doesn't correspond to " "{keyword_arguments!r}.", keyword_arguments=self.keyword_arguments, unpacked_kwargs=self._unpacked_kwargs, call_site=self, func=funcnode, arg=name, context=context, ) kwarg = nodes.Dict( lineno=funcnode.args.lineno, col_offset=funcnode.args.col_offset, parent=funcnode.args, end_lineno=funcnode.args.end_lineno, end_col_offset=funcnode.args.end_col_offset, ) kwarg.postinit( [(nodes.const_factory(key), value) for key, value in kwargs.items()] ) return iter((kwarg,)) if funcnode.args.vararg == name: # It wants all the args that were passed into # the call site. 
if self.has_invalid_arguments(): raise InferenceError( "Inference failed to find values for all positional " "arguments to {func!r}: {unpacked_args!r} doesn't " "correspond to {positional_arguments!r}.", positional_arguments=self.positional_arguments, unpacked_args=self._unpacked_args, call_site=self, func=funcnode, arg=name, context=context, ) args = nodes.Tuple( lineno=funcnode.args.lineno, col_offset=funcnode.args.col_offset, parent=funcnode.args, ) args.postinit(vararg) return iter((args,)) # Check if it's a default parameter. try: return funcnode.args.default_value(name).infer(context) except NoDefault: pass raise InferenceError( "No value found for argument {arg} to {func!r}", call_site=self, func=funcnode, arg=name, context=context, ) astroid-3.2.2/astroid/inference_tip.py0000664000175000017500000001076614622475517017751 0ustar epsilonepsilon# Licensed under the LGPL: https://www.gnu.org/licenses/old-licenses/lgpl-2.1.en.html # For details: https://github.com/pylint-dev/astroid/blob/main/LICENSE # Copyright (c) https://github.com/pylint-dev/astroid/blob/main/CONTRIBUTORS.txt """Transform utilities (filters and decorator).""" from __future__ import annotations from collections import OrderedDict from collections.abc import Generator from typing import Any, TypeVar from astroid.context import InferenceContext from astroid.exceptions import InferenceOverwriteError, UseInferenceDefault from astroid.nodes import NodeNG from astroid.typing import ( InferenceResult, InferFn, TransformFn, ) _cache: OrderedDict[ tuple[InferFn[Any], NodeNG, InferenceContext | None], list[InferenceResult] ] = OrderedDict() _CURRENTLY_INFERRING: set[tuple[InferFn[Any], NodeNG]] = set() _NodesT = TypeVar("_NodesT", bound=NodeNG) def clear_inference_tip_cache() -> None: """Clear the inference tips cache.""" _cache.clear() def _inference_tip_cached(func: InferFn[_NodesT]) -> InferFn[_NodesT]: """Cache decorator used for inference tips.""" def inner( node: _NodesT, context: InferenceContext 
| None = None, **kwargs: Any, ) -> Generator[InferenceResult, None, None]: partial_cache_key = (func, node) if partial_cache_key in _CURRENTLY_INFERRING: # If through recursion we end up trying to infer the same # func + node we raise here. _CURRENTLY_INFERRING.remove(partial_cache_key) raise UseInferenceDefault if context is not None and context.is_empty(): # Fresh, empty contexts will defeat the cache. context = None try: yield from _cache[func, node, context] return except KeyError: # Recursion guard with a partial cache key. # Using the full key causes a recursion error on PyPy. # It's a pragmatic compromise to avoid so much recursive inference # with slightly different contexts while still passing the simple # test cases included with this commit. _CURRENTLY_INFERRING.add(partial_cache_key) try: # May raise UseInferenceDefault result = _cache[func, node, context] = list( func(node, context, **kwargs) ) except Exception as e: # Suppress the KeyError from the cache miss. raise e from None finally: # Remove recursion guard. try: _CURRENTLY_INFERRING.remove(partial_cache_key) except KeyError: pass # Recursion may beat us to the punch. if len(_cache) > 64: _cache.popitem(last=False) # https://github.com/pylint-dev/pylint/issues/8686 yield from result # pylint: disable=used-before-assignment return inner def inference_tip( infer_function: InferFn[_NodesT], raise_on_overwrite: bool = False ) -> TransformFn[_NodesT]: """Given an instance specific inference function, return a function to be given to AstroidManager().register_transform to set this inference function. :param bool raise_on_overwrite: Raise an `InferenceOverwriteError` if the inference tip will overwrite another. Used for debugging Typical usage .. sourcecode:: python AstroidManager().register_transform(Call, inference_tip(infer_named_tuple), predicate) .. Note:: Using an inference tip will override any previously set inference tip for the given node. 
Use a predicate in the transform to prevent excess overwrites. """ def transform( node: _NodesT, infer_function: InferFn[_NodesT] = infer_function ) -> _NodesT: if ( raise_on_overwrite and node._explicit_inference is not None and node._explicit_inference is not infer_function ): raise InferenceOverwriteError( "Inference already set to {existing_inference}. " "Trying to overwrite with {new_inference} for {node}".format( existing_inference=node._explicit_inference, new_inference=infer_function, node=node, ) ) node._explicit_inference = _inference_tip_cached(infer_function) return node return transform astroid-3.2.2/astroid/const.py0000664000175000017500000000153614622475517016260 0ustar epsilonepsilon# Licensed under the LGPL: https://www.gnu.org/licenses/old-licenses/lgpl-2.1.en.html # For details: https://github.com/pylint-dev/astroid/blob/main/LICENSE # Copyright (c) https://github.com/pylint-dev/astroid/blob/main/CONTRIBUTORS.txt import enum import sys PY38 = sys.version_info[:2] == (3, 8) PY39_PLUS = sys.version_info >= (3, 9) PY310_PLUS = sys.version_info >= (3, 10) PY311_PLUS = sys.version_info >= (3, 11) PY312_PLUS = sys.version_info >= (3, 12) PY313_PLUS = sys.version_info >= (3, 13) WIN32 = sys.platform == "win32" IS_PYPY = sys.implementation.name == "pypy" IS_JYTHON = sys.implementation.name == "jython" # pylint: disable-next=no-member PYPY_7_3_11_PLUS = IS_PYPY and sys.pypy_version_info >= (7, 3, 11) # type: ignore[attr-defined] class Context(enum.Enum): Load = 1 Store = 2 Del = 3 _EMPTY_OBJECT_MARKER = object() astroid-3.2.2/astroid/typing.py0000664000175000017500000000527414622475517016447 0ustar epsilonepsilon# Licensed under the LGPL: https://www.gnu.org/licenses/old-licenses/lgpl-2.1.en.html # For details: https://github.com/pylint-dev/astroid/blob/main/LICENSE # Copyright (c) https://github.com/pylint-dev/astroid/blob/main/CONTRIBUTORS.txt from __future__ import annotations from typing import ( TYPE_CHECKING, Any, Callable, Generator, Generic,
Protocol, TypedDict, TypeVar, Union, ) if TYPE_CHECKING: from collections.abc import Iterator from astroid import bases, exceptions, nodes, transforms, util from astroid.context import InferenceContext from astroid.interpreter._import import spec class InferenceErrorInfo(TypedDict): """Store additional Inference error information raised with StopIteration exception. """ node: nodes.NodeNG context: InferenceContext | None class AstroidManagerBrain(TypedDict): """Dictionary to store relevant information for a AstroidManager class.""" astroid_cache: dict[str, nodes.Module] _mod_file_cache: dict[ tuple[str, str | None], spec.ModuleSpec | exceptions.AstroidImportError ] _failed_import_hooks: list[Callable[[str], nodes.Module]] always_load_extensions: bool optimize_ast: bool max_inferable_values: int extension_package_whitelist: set[str] _transform: transforms.TransformVisitor InferenceResult = Union["nodes.NodeNG", "util.UninferableBase", "bases.Proxy"] SuccessfulInferenceResult = Union["nodes.NodeNG", "bases.Proxy"] _SuccessfulInferenceResultT = TypeVar( "_SuccessfulInferenceResultT", bound=SuccessfulInferenceResult ) _SuccessfulInferenceResultT_contra = TypeVar( "_SuccessfulInferenceResultT_contra", bound=SuccessfulInferenceResult, contravariant=True, ) ConstFactoryResult = Union[ "nodes.List", "nodes.Set", "nodes.Tuple", "nodes.Dict", "nodes.Const", "nodes.EmptyNode", ] InferBinaryOp = Callable[ [ _SuccessfulInferenceResultT, Union["nodes.AugAssign", "nodes.BinOp"], str, InferenceResult, "InferenceContext", SuccessfulInferenceResult, ], Generator[InferenceResult, None, None], ] class InferFn(Protocol, Generic[_SuccessfulInferenceResultT_contra]): def __call__( self, node: _SuccessfulInferenceResultT_contra, context: InferenceContext | None = None, **kwargs: Any, ) -> Iterator[InferenceResult]: ... 
# pragma: no cover class TransformFn(Protocol, Generic[_SuccessfulInferenceResultT]): def __call__( self, node: _SuccessfulInferenceResultT, infer_function: InferFn[_SuccessfulInferenceResultT] = ..., ) -> _SuccessfulInferenceResultT | None: ... # pragma: no cover astroid-3.2.2/astroid/_backport_stdlib_names.py0000664000175000017500000001556214622475517021626 0ustar epsilonepsilon# Licensed under the LGPL: https://www.gnu.org/licenses/old-licenses/lgpl-2.1.en.html # For details: https://github.com/pylint-dev/astroid/blob/main/LICENSE # Copyright (c) https://github.com/pylint-dev/astroid/blob/main/CONTRIBUTORS.txt """ Shim to support Python versions < 3.10 that don't have sys.stdlib_module_names These values were created by cherry-picking the commits from https://bugs.python.org/issue42955 into each version, but may be updated manually if changes are needed. """ import sys # TODO: Remove this file when Python 3.9 is no longer supported PY_3_7 = frozenset( { "__future__", "_abc", "_ast", "_asyncio", "_bisect", "_blake2", "_bootlocale", "_bz2", "_codecs", "_codecs_cn", "_codecs_hk", "_codecs_iso2022", "_codecs_jp", "_codecs_kr", "_codecs_tw", "_collections", "_collections_abc", "_compat_pickle", "_compression", "_contextvars", "_crypt", "_csv", "_ctypes", "_curses", "_curses_panel", "_datetime", "_dbm", "_decimal", "_dummy_thread", "_elementtree", "_functools", "_gdbm", "_hashlib", "_heapq", "_imp", "_io", "_json", "_locale", "_lsprof", "_lzma", "_markupbase", "_md5", "_msi", "_multibytecodec", "_multiprocessing", "_opcode", "_operator", "_osx_support", "_pickle", "_posixsubprocess", "_py_abc", "_pydecimal", "_pyio", "_queue", "_random", "_sha1", "_sha256", "_sha3", "_sha512", "_signal", "_sitebuiltins", "_socket", "_sqlite3", "_sre", "_ssl", "_stat", "_string", "_strptime", "_struct", "_symtable", "_thread", "_threading_local", "_tkinter", "_tracemalloc", "_uuid", "_warnings", "_weakref", "_weakrefset", "_winapi", "abc", "aifc", "antigravity", "argparse", "array", 
"ast", "asynchat", "asyncio", "asyncore", "atexit", "audioop", "base64", "bdb", "binascii", "binhex", "bisect", "builtins", "bz2", "cProfile", "calendar", "cgi", "cgitb", "chunk", "cmath", "cmd", "code", "codecs", "codeop", "collections", "colorsys", "compileall", "concurrent", "configparser", "contextlib", "contextvars", "copy", "copyreg", "crypt", "csv", "ctypes", "curses", "dataclasses", "datetime", "dbm", "decimal", "difflib", "dis", "distutils", "doctest", "dummy_threading", "email", "encodings", "ensurepip", "enum", "errno", "faulthandler", "fcntl", "filecmp", "fileinput", "fnmatch", "formatter", "fractions", "ftplib", "functools", "gc", "genericpath", "getopt", "getpass", "gettext", "glob", "grp", "gzip", "hashlib", "heapq", "hmac", "html", "http", "idlelib", "imaplib", "imghdr", "imp", "importlib", "inspect", "io", "ipaddress", "itertools", "json", "keyword", "lib2to3", "linecache", "locale", "logging", "lzma", "macpath", "mailbox", "mailcap", "marshal", "math", "mimetypes", "mmap", "modulefinder", "msilib", "msvcrt", "multiprocessing", "netrc", "nis", "nntplib", "nt", "ntpath", "nturl2path", "numbers", "opcode", "operator", "optparse", "os", "ossaudiodev", "parser", "pathlib", "pdb", "pickle", "pickletools", "pipes", "pkgutil", "platform", "plistlib", "poplib", "posix", "posixpath", "pprint", "profile", "pstats", "pty", "pwd", "py_compile", "pyclbr", "pydoc", "pydoc_data", "pyexpat", "queue", "quopri", "random", "re", "readline", "reprlib", "resource", "rlcompleter", "runpy", "sched", "secrets", "select", "selectors", "shelve", "shlex", "shutil", "signal", "site", "smtpd", "smtplib", "sndhdr", "socket", "socketserver", "spwd", "sqlite3", "sre_compile", "sre_constants", "sre_parse", "ssl", "stat", "statistics", "string", "stringprep", "struct", "subprocess", "sunau", "symbol", "symtable", "sys", "sysconfig", "syslog", "tabnanny", "tarfile", "telnetlib", "tempfile", "termios", "textwrap", "this", "threading", "time", "timeit", "tkinter", "token", "tokenize", 
"trace", "traceback", "tracemalloc", "tty", "turtle", "turtledemo", "types", "typing", "unicodedata", "unittest", "urllib", "uu", "uuid", "venv", "warnings", "wave", "weakref", "webbrowser", "winreg", "winsound", "wsgiref", "xdrlib", "xml", "xmlrpc", "zipapp", "zipfile", "zipimport", "zlib", } ) PY_3_8 = frozenset( PY_3_7 - { "macpath", } | { "_posixshmem", "_statistics", "_xxsubinterpreters", } ) PY_3_9 = frozenset( PY_3_8 - { "_dummy_thread", "dummy_threading", } | { "_aix_support", "_bootsubprocess", "_peg_parser", "_zoneinfo", "graphlib", "zoneinfo", } ) if sys.version_info[:2] == (3, 7): stdlib_module_names = PY_3_7 elif sys.version_info[:2] == (3, 8): stdlib_module_names = PY_3_8 elif sys.version_info[:2] == (3, 9): stdlib_module_names = PY_3_9 else: raise AssertionError("This module is only intended as a backport for Python <= 3.9") astroid-3.2.2/astroid/filter_statements.py0000664000175000017500000002225314622475517020665 0ustar epsilonepsilon# Licensed under the LGPL: https://www.gnu.org/licenses/old-licenses/lgpl-2.1.en.html # For details: https://github.com/pylint-dev/astroid/blob/main/LICENSE # Copyright (c) https://github.com/pylint-dev/astroid/blob/main/CONTRIBUTORS.txt """_filter_stmts and helper functions. This method gets used in LocalsDictnodes.NodeNG._scope_lookup. It is not considered public. 
""" from __future__ import annotations from typing import TYPE_CHECKING from astroid import nodes from astroid.typing import SuccessfulInferenceResult if TYPE_CHECKING: from astroid.nodes import _base_nodes def _get_filtered_node_statements( base_node: nodes.NodeNG, stmt_nodes: list[nodes.NodeNG] ) -> list[tuple[nodes.NodeNG, _base_nodes.Statement]]: statements = [(node, node.statement()) for node in stmt_nodes] # Next we check if we have ExceptHandlers that are parent # of the underlying variable, in which case the last one survives if len(statements) > 1 and all( isinstance(stmt, nodes.ExceptHandler) for _, stmt in statements ): statements = [ (node, stmt) for node, stmt in statements if stmt.parent_of(base_node) ] return statements def _is_from_decorator(node) -> bool: """Return whether the given node is the child of a decorator.""" return any(isinstance(parent, nodes.Decorators) for parent in node.node_ancestors()) def _get_if_statement_ancestor(node: nodes.NodeNG) -> nodes.If | None: """Return the first parent node that is an If node (or None).""" for parent in node.node_ancestors(): if isinstance(parent, nodes.If): return parent return None def _filter_stmts( base_node: _base_nodes.LookupMixIn, stmts: list[SuccessfulInferenceResult], frame: nodes.LocalsDictNodeNG, offset: int, ) -> list[nodes.NodeNG]: """Filter the given list of statements to remove ignorable statements. If base_node is not a frame itself and the name is found in the inner frame locals, statements will be filtered to remove ignorable statements according to base_node's location. :param stmts: The statements to filter. :param frame: The frame that all of the given statements belong to. :param offset: The line offset to filter statements up to. :returns: The filtered statements. 
""" # if offset == -1, my actual frame is not the inner frame but its parent # # class A(B): pass # # we need this to resolve B correctly if offset == -1: myframe = base_node.frame().parent.frame() else: myframe = base_node.frame() # If the frame of this node is the same as the statement # of this node, then the node is part of a class or # a function definition and the frame of this node should be the # the upper frame, not the frame of the definition. # For more information why this is important, # see Pylint issue #295. # For example, for 'b', the statement is the same # as the frame / scope: # # def test(b=1): # ... if base_node.parent and base_node.statement() is myframe and myframe.parent: myframe = myframe.parent.frame() mystmt: _base_nodes.Statement | None = None if base_node.parent: mystmt = base_node.statement() # line filtering if we are in the same frame # # take care node may be missing lineno information (this is the case for # nodes inserted for living objects) if myframe is frame and mystmt and mystmt.fromlineno is not None: assert mystmt.fromlineno is not None, mystmt mylineno = mystmt.fromlineno + offset else: # disabling lineno filtering mylineno = 0 _stmts: list[nodes.NodeNG] = [] _stmt_parents = [] statements = _get_filtered_node_statements(base_node, stmts) for node, stmt in statements: # line filtering is on and we have reached our location, break if stmt.fromlineno and stmt.fromlineno > mylineno > 0: break # Ignore decorators with the same name as the # decorated function # Fixes issue #375 if mystmt is stmt and _is_from_decorator(base_node): continue if node.has_base(base_node): break if isinstance(node, nodes.EmptyNode): # EmptyNode does not have assign_type(), so just add it and move on _stmts.append(node) continue assign_type = node.assign_type() _stmts, done = assign_type._get_filtered_stmts(base_node, node, _stmts, mystmt) if done: break optional_assign = assign_type.optional_assign if optional_assign and 
assign_type.parent_of(base_node): # we are inside a loop, loop var assignment is hiding previous # assignment _stmts = [node] _stmt_parents = [stmt.parent] continue if isinstance(assign_type, nodes.NamedExpr): # If the NamedExpr is in an if statement we do some basic control flow inference if_parent = _get_if_statement_ancestor(assign_type) if if_parent: # If the if statement is within another if statement we append the node # to possible statements if _get_if_statement_ancestor(if_parent): optional_assign = False _stmts.append(node) _stmt_parents.append(stmt.parent) # Else we assume that it will be evaluated else: _stmts = [node] _stmt_parents = [stmt.parent] else: _stmts = [node] _stmt_parents = [stmt.parent] # XXX comment various branches below!!! try: pindex = _stmt_parents.index(stmt.parent) except ValueError: pass else: # we got a parent index, this means the currently visited node # is at the same block level as a previously visited node if _stmts[pindex].assign_type().parent_of(assign_type): # both statements are not at the same block level continue # if currently visited node is following previously considered # assignment and both are not exclusive, we can drop the # previous one. For instance in the following code :: # # if a: # x = 1 # else: # x = 2 # print x # # we can't remove neither x = 1 nor x = 2 when looking for 'x' # of 'print x'; while in the following :: # # x = 1 # x = 2 # print x # # we can remove x = 1 when we see x = 2 # # moreover, on loop assignment types, assignment won't # necessarily be done if the loop has no iteration, so we don't # want to clear previous assignments if any (hence the test on # optional_assign) if not (optional_assign or nodes.are_exclusive(_stmts[pindex], node)): del _stmt_parents[pindex] del _stmts[pindex] # If base_node and node are exclusive, then we can ignore node if nodes.are_exclusive(base_node, node): continue # An AssignName node overrides previous assignments if: # 1. node's statement always assigns # 2. 
node and base_node are in the same block (i.e., has the same parent as base_node) if isinstance(node, (nodes.NamedExpr, nodes.AssignName)): if isinstance(stmt, nodes.ExceptHandler): # If node's statement is an ExceptHandler, then it is the variable # bound to the caught exception. If base_node is not contained within # the exception handler block, node should override previous assignments; # otherwise, node should be ignored, as an exception variable # is local to the handler block. if stmt.parent_of(base_node): _stmts = [] _stmt_parents = [] else: continue elif not optional_assign and mystmt and stmt.parent is mystmt.parent: _stmts = [] _stmt_parents = [] elif isinstance(node, nodes.DelName): # Remove all previously stored assignments _stmts = [] _stmt_parents = [] continue # Add the new assignment _stmts.append(node) if isinstance(node, nodes.Arguments) or isinstance( node.parent, nodes.Arguments ): # Special case for _stmt_parents when node is a function parameter; # in this case, stmt is the enclosing FunctionDef, which is what we # want to add to _stmt_parents, not stmt.parent. This case occurs when # node is an Arguments node (representing varargs or kwargs parameter), # and when node.parent is an Arguments node (other parameters). # See issue #180. _stmt_parents.append(stmt) else: _stmt_parents.append(stmt.parent) return _stmts astroid-3.2.2/astroid/astroid_manager.py0000664000175000017500000000133114622475517020262 0ustar epsilonepsilon""" This file contain the global astroid MANAGER. It prevents a circular import that happened when the only possibility to import it was from astroid.__init__.py. This AstroidManager is a singleton/borg so it's possible to instantiate an AstroidManager() directly. 
""" # Licensed under the LGPL: https://www.gnu.org/licenses/old-licenses/lgpl-2.1.en.html # For details: https://github.com/pylint-dev/astroid/blob/main/LICENSE # Copyright (c) https://github.com/pylint-dev/astroid/blob/main/CONTRIBUTORS.txt from astroid.brain.helpers import register_all_brains from astroid.manager import AstroidManager MANAGER = AstroidManager() # Register all brains after instantiating the singleton Manager register_all_brains(MANAGER) astroid-3.2.2/astroid/bases.py0000664000175000017500000006532414622475517016234 0ustar epsilonepsilon# Licensed under the LGPL: https://www.gnu.org/licenses/old-licenses/lgpl-2.1.en.html # For details: https://github.com/pylint-dev/astroid/blob/main/LICENSE # Copyright (c) https://github.com/pylint-dev/astroid/blob/main/CONTRIBUTORS.txt """This module contains base classes and functions for the nodes and some inference utils. """ from __future__ import annotations import collections import collections.abc from collections.abc import Iterable, Iterator from typing import TYPE_CHECKING, Any, Literal from astroid import decorators, nodes from astroid.const import PY310_PLUS from astroid.context import ( CallContext, InferenceContext, bind_context_to_node, copy_context, ) from astroid.exceptions import ( AstroidTypeError, AttributeInferenceError, InferenceError, NameInferenceError, ) from astroid.interpreter import objectmodel from astroid.typing import ( InferenceErrorInfo, InferenceResult, SuccessfulInferenceResult, ) from astroid.util import Uninferable, UninferableBase, safe_infer if TYPE_CHECKING: from astroid.constraint import Constraint PROPERTIES = {"builtins.property", "abc.abstractproperty"} if PY310_PLUS: PROPERTIES.add("enum.property") # List of possible property names. We use this list in order # to see if a method is a property or not. This should be # pretty reliable and fast, the alternative being to check each # decorator to see if its a real property-like descriptor, which # can be too complicated. 
# Also, these aren't qualified, because each project can # define them, we shouldn't expect to know every possible # property-like decorator! POSSIBLE_PROPERTIES = { "cached_property", "cachedproperty", "lazyproperty", "lazy_property", "reify", "lazyattribute", "lazy_attribute", "LazyProperty", "lazy", "cache_readonly", "DynamicClassAttribute", } def _is_property( meth: nodes.FunctionDef | UnboundMethod, context: InferenceContext | None = None ) -> bool: decoratornames = meth.decoratornames(context=context) if PROPERTIES.intersection(decoratornames): return True stripped = { name.split(".")[-1] for name in decoratornames if not isinstance(name, UninferableBase) } if any(name in stripped for name in POSSIBLE_PROPERTIES): return True # Lookup for subclasses of *property* if not meth.decorators: return False for decorator in meth.decorators.nodes or (): inferred = safe_infer(decorator, context=context) if inferred is None or isinstance(inferred, UninferableBase): continue if isinstance(inferred, nodes.ClassDef): for base_class in inferred.bases: if not isinstance(base_class, nodes.Name): continue module, _ = base_class.lookup(base_class.name) if ( isinstance(module, nodes.Module) and module.name == "builtins" and base_class.name == "property" ): return True return False class Proxy: """A simple proxy object. Note: Subclasses of this object will need a custom __getattr__ if new instance attributes are created. See the Const class """ _proxied: nodes.ClassDef | nodes.FunctionDef | nodes.Lambda | UnboundMethod def __init__( self, proxied: ( nodes.ClassDef | nodes.FunctionDef | nodes.Lambda | UnboundMethod | None ) = None, ) -> None: if proxied is None: # This is a hack to allow calling this __init__ during bootstrapping of # builtin classes and their docstrings. # For Const, Generator, and UnionType nodes the _proxied attribute # is set during bootstrapping # as we first need to build the ClassDef that they can proxy. 
# Thus, if proxied is None self should be a Const or Generator # as that is the only way _proxied will be correctly set as a ClassDef. assert isinstance(self, (nodes.Const, Generator, UnionType)) else: self._proxied = proxied def __getattr__(self, name: str) -> Any: if name == "_proxied": return self.__class__._proxied if name in self.__dict__: return self.__dict__[name] return getattr(self._proxied, name) def infer( # type: ignore[return] self, context: InferenceContext | None = None, **kwargs: Any ) -> collections.abc.Generator[InferenceResult, None, InferenceErrorInfo | None]: yield self def _infer_stmts( stmts: Iterable[InferenceResult], context: InferenceContext | None, frame: nodes.NodeNG | BaseInstance | None = None, ) -> collections.abc.Generator[InferenceResult, None, None]: """Return an iterator on statements inferred by each statement in *stmts*.""" inferred = False constraint_failed = False if context is not None: name = context.lookupname context = context.clone() if name is not None: constraints = context.constraints.get(name, {}) else: constraints = {} else: name = None constraints = {} context = InferenceContext() for stmt in stmts: if isinstance(stmt, UninferableBase): yield stmt inferred = True continue context.lookupname = stmt._infer_name(frame, name) try: stmt_constraints: set[Constraint] = set() for constraint_stmt, potential_constraints in constraints.items(): if not constraint_stmt.parent_of(stmt): stmt_constraints.update(potential_constraints) for inf in stmt.infer(context=context): if all(constraint.satisfied_by(inf) for constraint in stmt_constraints): yield inf inferred = True else: constraint_failed = True except NameInferenceError: continue except InferenceError: yield Uninferable inferred = True if not inferred and constraint_failed: yield Uninferable elif not inferred: raise InferenceError( "Inference failed for all members of {stmts!r}.", stmts=stmts, frame=frame, context=context, ) def _infer_method_result_truth( instance: 
Instance, method_name: str, context: InferenceContext ) -> bool | UninferableBase: # Get the method from the instance and try to infer # its return's truth value. meth = next(instance.igetattr(method_name, context=context), None) if meth and hasattr(meth, "infer_call_result"): if not meth.callable(): return Uninferable try: context.callcontext = CallContext(args=[], callee=meth) for value in meth.infer_call_result(instance, context=context): if isinstance(value, UninferableBase): return value try: inferred = next(value.infer(context=context)) except StopIteration as e: raise InferenceError(context=context) from e return inferred.bool_value() except InferenceError: pass return Uninferable class BaseInstance(Proxy): """An instance base class, which provides lookup methods for potential instances. """ _proxied: nodes.ClassDef special_attributes: objectmodel.ObjectModel def display_type(self) -> str: return "Instance of" def getattr( self, name: str, context: InferenceContext | None = None, lookupclass: bool = True, ) -> list[InferenceResult]: try: values = self._proxied.instance_attr(name, context) except AttributeInferenceError as exc: if self.special_attributes and name in self.special_attributes: return [self.special_attributes.lookup(name)] if lookupclass: # Class attributes not available through the instance # unless they are explicitly defined. return self._proxied.getattr(name, context, class_context=False) raise AttributeInferenceError( target=self, attribute=name, context=context ) from exc # since we've no context information, return matching class members as # well if lookupclass: try: return values + self._proxied.getattr( name, context, class_context=False ) except AttributeInferenceError: pass return values def igetattr( self, name: str, context: InferenceContext | None = None ) -> Iterator[InferenceResult]: """Inferred getattr.""" if not context: context = InferenceContext() try: context.lookupname = name # XXX frame should be self._proxied, or not ? 
get_attr = self.getattr(name, context, lookupclass=False) yield from _infer_stmts( self._wrap_attr(get_attr, context), context, frame=self ) except AttributeInferenceError: try: # fallback to class.igetattr since it has some logic to handle # descriptors # But only if the _proxied is the Class. if self._proxied.__class__.__name__ != "ClassDef": raise attrs = self._proxied.igetattr(name, context, class_context=False) yield from self._wrap_attr(attrs, context) except AttributeInferenceError as error: raise InferenceError(**vars(error)) from error def _wrap_attr( self, attrs: Iterable[InferenceResult], context: InferenceContext | None = None ) -> Iterator[InferenceResult]: """Wrap bound methods of attrs in a InstanceMethod proxies.""" for attr in attrs: if isinstance(attr, UnboundMethod): if _is_property(attr): yield from attr.infer_call_result(self, context) else: yield BoundMethod(attr, self) elif isinstance(attr, nodes.Lambda): if attr.args.arguments and attr.args.arguments[0].name == "self": yield BoundMethod(attr, self) continue yield attr else: yield attr def infer_call_result( self, caller: SuccessfulInferenceResult | None, context: InferenceContext | None = None, ) -> Iterator[InferenceResult]: """Infer what a class instance is returning when called.""" context = bind_context_to_node(context, self) inferred = False # If the call is an attribute on the instance, we infer the attribute itself if isinstance(caller, nodes.Call) and isinstance(caller.func, nodes.Attribute): for res in self.igetattr(caller.func.attrname, context): inferred = True yield res # Otherwise we infer the call to the __call__ dunder normally for node in self._proxied.igetattr("__call__", context): if isinstance(node, UninferableBase) or not node.callable(): continue if isinstance(node, BaseInstance) and node._proxied is self._proxied: inferred = True yield node # Prevent recursion. 
continue for res in node.infer_call_result(caller, context): inferred = True yield res if not inferred: raise InferenceError(node=self, caller=caller, context=context) class Instance(BaseInstance): """A special node representing a class instance.""" special_attributes = objectmodel.InstanceModel() def __init__(self, proxied: nodes.ClassDef | None) -> None: super().__init__(proxied) @decorators.yes_if_nothing_inferred def infer_binary_op( self, opnode: nodes.AugAssign | nodes.BinOp, operator: str, other: InferenceResult, context: InferenceContext, method: SuccessfulInferenceResult, ) -> Generator[InferenceResult, None, None]: return method.infer_call_result(self, context) def __repr__(self) -> str: return "<Instance of {}.{} at 0x{}>".format( self._proxied.root().name, self._proxied.name, id(self) ) def __str__(self) -> str: return f"Instance of {self._proxied.root().name}.{self._proxied.name}" def callable(self) -> bool: try: self._proxied.getattr("__call__", class_context=False) return True except AttributeInferenceError: return False def pytype(self) -> str: return self._proxied.qname() def display_type(self) -> str: return "Instance of" def bool_value( self, context: InferenceContext | None = None ) -> bool | UninferableBase: """Infer the truth value for an Instance. The truth value of an instance is determined by these conditions: * if it implements __bool__ on Python 3 or __nonzero__ on Python 2, then its bool value will be determined by calling this special method and checking its result. * when this method is not defined, __len__() is called, if it is defined, and the object is considered true if its result is nonzero. If a class defines neither __len__() nor __bool__(), all its instances are considered true. """ context = context or InferenceContext() context.boundnode = self try: result = _infer_method_result_truth(self, "__bool__", context) except (InferenceError, AttributeInferenceError): # Fallback to __len__.
try: result = _infer_method_result_truth(self, "__len__", context) except (AttributeInferenceError, InferenceError): return True return result def getitem( self, index: nodes.Const, context: InferenceContext | None = None ) -> InferenceResult | None: new_context = bind_context_to_node(context, self) if not context: context = new_context method = next(self.igetattr("__getitem__", context=context), None) # Create a new CallContext for providing index as an argument. new_context.callcontext = CallContext(args=[index], callee=method) if not isinstance(method, BoundMethod): raise InferenceError( "Could not find __getitem__ for {node!r}.", node=self, context=context ) if len(method.args.arguments) != 2: # (self, index) raise AstroidTypeError( "__getitem__ for {node!r} does not have correct signature", node=self, context=context, ) return next(method.infer_call_result(self, new_context), None) class UnboundMethod(Proxy): """A special node representing a method not bound to an instance.""" _proxied: nodes.FunctionDef | UnboundMethod special_attributes: ( objectmodel.BoundMethodModel | objectmodel.UnboundMethodModel ) = objectmodel.UnboundMethodModel() def __repr__(self) -> str: assert self._proxied.parent, "Expected a parent node" frame = self._proxied.parent.frame() return "<{} {} of {} at 0x{}>".format( self.__class__.__name__, self._proxied.name, frame.qname(), id(self) ) def implicit_parameters(self) -> Literal[0, 1]: return 0 def is_bound(self) -> bool: return False def getattr(self, name: str, context: InferenceContext | None = None): if name in self.special_attributes: return [self.special_attributes.lookup(name)] return self._proxied.getattr(name, context) def igetattr( self, name: str, context: InferenceContext | None = None ) -> Iterator[InferenceResult]: if name in self.special_attributes: return iter((self.special_attributes.lookup(name),)) return self._proxied.igetattr(name, context) def infer_call_result( self, caller: SuccessfulInferenceResult | None,
context: InferenceContext | None = None, ) -> Iterator[InferenceResult]: """ The boundnode of the regular context with a function called on ``object.__new__`` will be of type ``object``, which is incorrect for the argument in general. If no context is given the ``object.__new__`` call argument will be correctly inferred except when inside a call that requires the additional context (such as a classmethod) of the boundnode to determine which class the method was called from """ # If we're unbound method __new__ of a builtin, the result is an # instance of the class given as first argument. if self._proxied.name == "__new__": assert self._proxied.parent, "Expected a parent node" qname = self._proxied.parent.frame().qname() # Avoid checking builtins.type: _infer_type_new_call() does more validation if qname.startswith("builtins.") and qname != "builtins.type": return self._infer_builtin_new(caller, context or InferenceContext()) return self._proxied.infer_call_result(caller, context) def _infer_builtin_new( self, caller: SuccessfulInferenceResult | None, context: InferenceContext, ) -> collections.abc.Generator[ nodes.Const | Instance | UninferableBase, None, None ]: if not isinstance(caller, nodes.Call): return if not caller.args: return # Attempt to create a constant if len(caller.args) > 1: value = None if isinstance(caller.args[1], nodes.Const): value = caller.args[1].value else: inferred_arg = next(caller.args[1].infer(), None) if isinstance(inferred_arg, nodes.Const): value = inferred_arg.value if value is not None: const = nodes.const_factory(value) assert not isinstance(const, nodes.EmptyNode) yield const return node_context = context.extra_context.get(caller.args[0]) for inferred in caller.args[0].infer(context=node_context): if isinstance(inferred, UninferableBase): yield inferred if isinstance(inferred, nodes.ClassDef): yield Instance(inferred) raise InferenceError def bool_value(self, context: InferenceContext | None = None) -> Literal[True]: return True 
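# The UnboundMethod/BoundMethod pair above relies on Proxy.__getattr__
# forwarding: any attribute not defined on the wrapper resolves against the
# wrapped function node, while the bound variant carries the receiver and
# reports one implicit call parameter (except for __new__, which acts like a
# classmethod but takes its class explicitly). A minimal standalone sketch of
# that pattern follows; the Tiny* names are hypothetical illustrations, not
# part of astroid's API.

```python
class TinyProxy:
    """Forward unknown attribute lookups to the wrapped object."""

    def __init__(self, proxied):
        self._proxied = proxied

    def __getattr__(self, name):
        # Only invoked when normal lookup fails, so attributes set in
        # __init__ (like _proxied itself) are found without recursion.
        return getattr(self._proxied, name)


class TinyUnboundMethod(TinyProxy):
    def implicit_parameters(self):
        # An unbound method adds no implicit argument to a call.
        return 0


class TinyBoundMethod(TinyUnboundMethod):
    def __init__(self, proxied, bound):
        super().__init__(proxied)
        self.bound = bound  # the receiver the method is bound to

    def implicit_parameters(self):
        # Bound methods implicitly pass the receiver, except __new__,
        # which behaves like a classmethod with an explicit class argument.
        return 0 if self._proxied.__name__ == "__new__" else 1


def greet(self):
    return "hello"


bound = TinyBoundMethod(greet, bound=object())
print(bound.__name__, bound.implicit_parameters())  # greet 1
```

# Accessing bound.__name__ falls through to greet via __getattr__, which is
# the same trick that lets astroid's wrappers expose the wrapped
# FunctionDef's attributes without redefining them.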
class BoundMethod(UnboundMethod): """A special node representing a method bound to an instance.""" special_attributes = objectmodel.BoundMethodModel() def __init__( self, proxy: nodes.FunctionDef | nodes.Lambda | UnboundMethod, bound: SuccessfulInferenceResult, ) -> None: super().__init__(proxy) self.bound = bound def implicit_parameters(self) -> Literal[0, 1]: if self.name == "__new__": # __new__ acts as a classmethod but the class argument is not implicit. return 0 return 1 def is_bound(self) -> Literal[True]: return True def _infer_type_new_call( self, caller: nodes.Call, context: InferenceContext ) -> nodes.ClassDef | None: # noqa: C901 """Try to infer what type.__new__(mcs, name, bases, attrs) returns. In order for such call to be valid, the metaclass needs to be a subtype of ``type``, the name needs to be a string, the bases needs to be a tuple of classes """ # pylint: disable=import-outside-toplevel; circular import from astroid.nodes import Pass # Verify the metaclass try: mcs = next(caller.args[0].infer(context=context)) except StopIteration as e: raise InferenceError(context=context) from e if not isinstance(mcs, nodes.ClassDef): # Not a valid first argument. return None if not mcs.is_subtype_of("builtins.type"): # Not a valid metaclass. return None # Verify the name try: name = next(caller.args[1].infer(context=context)) except StopIteration as e: raise InferenceError(context=context) from e if not isinstance(name, nodes.Const): # Not a valid name, needs to be a const. return None if not isinstance(name.value, str): # Needs to be a string. return None # Verify the bases try: bases = next(caller.args[2].infer(context=context)) except StopIteration as e: raise InferenceError(context=context) from e if not isinstance(bases, nodes.Tuple): # Needs to be a tuple. 
return None try: inferred_bases = [next(elt.infer(context=context)) for elt in bases.elts] except StopIteration as e: raise InferenceError(context=context) from e if any(not isinstance(base, nodes.ClassDef) for base in inferred_bases): # All the bases needs to be Classes return None # Verify the attributes. try: attrs = next(caller.args[3].infer(context=context)) except StopIteration as e: raise InferenceError(context=context) from e if not isinstance(attrs, nodes.Dict): # Needs to be a dictionary. return None cls_locals: dict[str, list[InferenceResult]] = collections.defaultdict(list) for key, value in attrs.items: try: key = next(key.infer(context=context)) except StopIteration as e: raise InferenceError(context=context) from e try: value = next(value.infer(context=context)) except StopIteration as e: raise InferenceError(context=context) from e # Ignore non string keys if isinstance(key, nodes.Const) and isinstance(key.value, str): cls_locals[key.value].append(value) # Build the class from now. cls = mcs.__class__( name=name.value, lineno=caller.lineno or 0, col_offset=caller.col_offset or 0, parent=caller, end_lineno=caller.end_lineno, end_col_offset=caller.end_col_offset, ) empty = Pass( parent=cls, lineno=caller.lineno, col_offset=caller.col_offset, end_lineno=caller.end_lineno, end_col_offset=caller.end_col_offset, ) cls.postinit( bases=bases.elts, body=[empty], decorators=None, newstyle=True, metaclass=mcs, keywords=[], ) cls.locals = cls_locals return cls def infer_call_result( self, caller: SuccessfulInferenceResult | None, context: InferenceContext | None = None, ) -> Iterator[InferenceResult]: context = bind_context_to_node(context, self.bound) if ( isinstance(self.bound, nodes.ClassDef) and self.bound.name == "type" and self.name == "__new__" and isinstance(caller, nodes.Call) and len(caller.args) == 4 ): # Check if we have a ``type.__new__(mcs, name, bases, attrs)`` call. 
new_cls = self._infer_type_new_call(caller, context) if new_cls: return iter((new_cls,)) return super().infer_call_result(caller, context) def bool_value(self, context: InferenceContext | None = None) -> Literal[True]: return True class Generator(BaseInstance): """A special node representing a generator. Proxied class is set once for all in raw_building. """ # We defer initialization of special_attributes to the __init__ method since the constructor # of GeneratorModel requires the raw_building to be complete # TODO: This should probably be refactored. special_attributes: objectmodel.GeneratorModel def __init__( self, parent: nodes.FunctionDef, generator_initial_context: InferenceContext | None = None, ) -> None: super().__init__() self.parent = parent self._call_context = copy_context(generator_initial_context) # See comment above: this is a deferred initialization. Generator.special_attributes = objectmodel.GeneratorModel() def infer_yield_types(self) -> Iterator[InferenceResult]: yield from self.parent.infer_yield_result(self._call_context) def callable(self) -> Literal[False]: return False def pytype(self) -> str: return "builtins.generator" def display_type(self) -> str: return "Generator" def bool_value(self, context: InferenceContext | None = None) -> Literal[True]: return True def __repr__(self) -> str: return f"<Generator({self._proxied.name}) l.{self.lineno} at 0x{id(self)}>" def __str__(self) -> str: return f"Generator({self._proxied.name})" class AsyncGenerator(Generator): """Special node representing an async generator.""" def pytype(self) -> Literal["builtins.async_generator"]: return "builtins.async_generator" def display_type(self) -> str: return "AsyncGenerator" def __repr__(self) -> str: return f"<AsyncGenerator({self._proxied.name}) l.{self.lineno} at 0x{id(self)}>" def __str__(self) -> str: return f"AsyncGenerator({self._proxied.name})" class UnionType(BaseInstance): """Special node representing new style typing unions. Proxied class is set once for all in raw_building.
""" def __init__( self, left: UnionType | nodes.ClassDef | nodes.Const, right: UnionType | nodes.ClassDef | nodes.Const, parent: nodes.NodeNG | None = None, ) -> None: super().__init__() self.parent = parent self.left = left self.right = right def callable(self) -> Literal[False]: return False def bool_value(self, context: InferenceContext | None = None) -> Literal[True]: return True def pytype(self) -> Literal["types.UnionType"]: return "types.UnionType" def display_type(self) -> str: return "UnionType" def __repr__(self) -> str: return f"" def __str__(self) -> str: return f"UnionType({self._proxied.name})" astroid-3.2.2/astroid/interpreter/0000775000175000017500000000000014622475517017116 5ustar epsilonepsilonastroid-3.2.2/astroid/interpreter/__init__.py0000664000175000017500000000000014622475517021215 0ustar epsilonepsilonastroid-3.2.2/astroid/interpreter/objectmodel.py0000664000175000017500000010344114622475517021762 0ustar epsilonepsilon# Licensed under the LGPL: https://www.gnu.org/licenses/old-licenses/lgpl-2.1.en.html # For details: https://github.com/pylint-dev/astroid/blob/main/LICENSE # Copyright (c) https://github.com/pylint-dev/astroid/blob/main/CONTRIBUTORS.txt """ Data object model, as per https://docs.python.org/3/reference/datamodel.html. This module describes, at least partially, a data object model for some of astroid's nodes. The model contains special attributes that nodes such as functions, classes, modules etc have, such as __doc__, __class__, __module__ etc, being used when doing attribute lookups over nodes. For instance, inferring `obj.__class__` will first trigger an inference of the `obj` variable. If it was successfully inferred, then an attribute `__class__ will be looked for in the inferred object. This is the part where the data model occurs. The model is attached to those nodes and the lookup mechanism will try to see if attributes such as `__class__` are defined by the model or not. 
If they are defined, the model will be requested to return the corresponding value of that attribute. Thus the model can be viewed as a special part of the lookup mechanism. """ from __future__ import annotations import itertools import os import pprint import types from collections.abc import Iterator from functools import lru_cache from typing import TYPE_CHECKING, Any, Literal import astroid from astroid import bases, nodes, util from astroid.context import InferenceContext, copy_context from astroid.exceptions import AttributeInferenceError, InferenceError, NoDefault from astroid.manager import AstroidManager from astroid.nodes import node_classes from astroid.typing import InferenceResult, SuccessfulInferenceResult if TYPE_CHECKING: from astroid.objects import Property IMPL_PREFIX = "attr_" LEN_OF_IMPL_PREFIX = len(IMPL_PREFIX) def _dunder_dict(instance, attributes): obj = node_classes.Dict( parent=instance, lineno=instance.lineno, col_offset=instance.col_offset, end_lineno=instance.end_lineno, end_col_offset=instance.end_col_offset, ) # Convert the keys to node strings keys = [ node_classes.Const(value=value, parent=obj) for value in list(attributes.keys()) ] # The original attribute has a list of elements for each key, # but that is not useful for retrieving the special attribute's value. # In this case, we're picking the last value from each list. values = [elem[-1] for elem in attributes.values()] obj.postinit(list(zip(keys, values))) return obj def _get_bound_node(model: ObjectModel) -> Any: # TODO: Use isinstance instead of try ... 
except after _instance has typing try: return model._instance._proxied except AttributeError: return model._instance class ObjectModel: def __init__(self): self._instance = None def __repr__(self): result = [] cname = type(self).__name__ string = "%(cname)s(%(fields)s)" alignment = len(cname) + 1 for field in sorted(self.attributes()): width = 80 - len(field) - alignment lines = pprint.pformat(field, indent=2, width=width).splitlines(True) inner = [lines[0]] for line in lines[1:]: inner.append(" " * alignment + line) result.append("".join(inner)) return string % { "cname": cname, "fields": (",\n" + " " * alignment).join(result), } def __call__(self, instance): self._instance = instance return self def __get__(self, instance, cls=None): # ObjectModel needs to be a descriptor so that just doing # `special_attributes = SomeObjectModel` should be enough in the body of a node. # But at the same time, node.special_attributes should return an object # which can be used for manipulating the special attributes. That's the reason # we pass the instance through which it got accessed to ObjectModel.__call__, # returning itself afterwards, so we can still have access to the # underlying data model and to the instance for which it got accessed. return self(instance) def __contains__(self, name) -> bool: return name in self.attributes() @lru_cache # noqa def attributes(self) -> list[str]: """Get the attributes which are exported by this object model.""" return [o[LEN_OF_IMPL_PREFIX:] for o in dir(self) if o.startswith(IMPL_PREFIX)] def lookup(self, name): """Look up the given *name* in the current model. It should return an AST or an interpreter object, but if the name is not found, then an AttributeInferenceError will be raised.
""" if name in self.attributes(): return getattr(self, IMPL_PREFIX + name) raise AttributeInferenceError(target=self._instance, attribute=name) @property def attr___new__(self) -> bases.BoundMethod: """Calling cls.__new__(type) on an object returns an instance of 'type'.""" from astroid import builder # pylint: disable=import-outside-toplevel node: nodes.FunctionDef = builder.extract_node( """def __new__(self, cls): return cls()""" ) # We set the parent as being the ClassDef of 'object' as that # triggers correct inference as a call to __new__ in bases.py node.parent = AstroidManager().builtins_module["object"] return bases.BoundMethod(proxy=node, bound=_get_bound_node(self)) @property def attr___init__(self) -> bases.BoundMethod: """Calling cls.__init__() normally returns None.""" from astroid import builder # pylint: disable=import-outside-toplevel # The *args and **kwargs are necessary not to trigger warnings about missing # or extra parameters for '__init__' methods we don't infer correctly. # This BoundMethod is the fallback value for those. 
node: nodes.FunctionDef = builder.extract_node( """def __init__(self, *args, **kwargs): return None""" ) # We set the parent as being the ClassDef of 'object' as that # is where this method originally comes from node.parent = AstroidManager().builtins_module["object"] return bases.BoundMethod(proxy=node, bound=_get_bound_node(self)) class ModuleModel(ObjectModel): def _builtins(self): builtins_ast_module = AstroidManager().builtins_module return builtins_ast_module.special_attributes.lookup("__dict__") @property def attr_builtins(self): return self._builtins() @property def attr___path__(self): if not self._instance.package: raise AttributeInferenceError(target=self._instance, attribute="__path__") path_objs = [ node_classes.Const( value=( path if not path.endswith("__init__.py") else os.path.dirname(path) ), parent=self._instance, ) for path in self._instance.path ] container = node_classes.List(parent=self._instance) container.postinit(path_objs) return container @property def attr___name__(self): return node_classes.Const(value=self._instance.name, parent=self._instance) @property def attr___doc__(self): return node_classes.Const( value=getattr(self._instance.doc_node, "value", None), parent=self._instance, ) @property def attr___file__(self): return node_classes.Const(value=self._instance.file, parent=self._instance) @property def attr___dict__(self): return _dunder_dict(self._instance, self._instance.globals) @property def attr___package__(self): if not self._instance.package: value = "" else: value = self._instance.name return node_classes.Const(value=value, parent=self._instance) # These are related to the Python 3 implementation of the # import system, # https://docs.python.org/3/reference/import.html#import-related-module-attributes @property def attr___spec__(self): # No handling for now. return node_classes.Unknown() @property def attr___loader__(self): # No handling for now. 
return node_classes.Unknown() @property def attr___cached__(self): # No handling for now. return node_classes.Unknown() class FunctionModel(ObjectModel): @property def attr___name__(self): return node_classes.Const(value=self._instance.name, parent=self._instance) @property def attr___doc__(self): return node_classes.Const( value=getattr(self._instance.doc_node, "value", None), parent=self._instance, ) @property def attr___qualname__(self): return node_classes.Const(value=self._instance.qname(), parent=self._instance) @property def attr___defaults__(self): func = self._instance if not func.args.defaults: return node_classes.Const(value=None, parent=func) defaults_obj = node_classes.Tuple(parent=func) defaults_obj.postinit(func.args.defaults) return defaults_obj @property def attr___annotations__(self): obj = node_classes.Dict( parent=self._instance, lineno=self._instance.lineno, col_offset=self._instance.col_offset, end_lineno=self._instance.end_lineno, end_col_offset=self._instance.end_col_offset, ) if not self._instance.returns: returns = None else: returns = self._instance.returns args = self._instance.args pair_annotations = itertools.chain( zip(args.args or [], args.annotations), zip(args.kwonlyargs, args.kwonlyargs_annotations), zip(args.posonlyargs or [], args.posonlyargs_annotations), ) annotations = { arg.name: annotation for (arg, annotation) in pair_annotations if annotation } if args.varargannotation: annotations[args.vararg] = args.varargannotation if args.kwargannotation: annotations[args.kwarg] = args.kwargannotation if returns: annotations["return"] = returns items = [ (node_classes.Const(key, parent=obj), value) for (key, value) in annotations.items() ] obj.postinit(items) return obj @property def attr___dict__(self): return node_classes.Dict( parent=self._instance, lineno=self._instance.lineno, col_offset=self._instance.col_offset, end_lineno=self._instance.end_lineno, end_col_offset=self._instance.end_col_offset, ) attr___globals__ = 
attr___dict__ @property def attr___kwdefaults__(self): def _default_args(args, parent): for arg in args.kwonlyargs: try: default = args.default_value(arg.name) except NoDefault: continue name = node_classes.Const(arg.name, parent=parent) yield name, default args = self._instance.args obj = node_classes.Dict( parent=self._instance, lineno=self._instance.lineno, col_offset=self._instance.col_offset, end_lineno=self._instance.end_lineno, end_col_offset=self._instance.end_col_offset, ) defaults = dict(_default_args(args, obj)) obj.postinit(list(defaults.items())) return obj @property def attr___module__(self): return node_classes.Const(self._instance.root().qname()) @property def attr___get__(self): func = self._instance class DescriptorBoundMethod(bases.BoundMethod): """Bound method which knows how to understand calling descriptor binding. """ def implicit_parameters(self) -> Literal[0]: # Different than BoundMethod since the signature # is different. return 0 def infer_call_result( self, caller: SuccessfulInferenceResult | None, context: InferenceContext | None = None, ) -> Iterator[bases.BoundMethod]: if len(caller.args) > 2 or len(caller.args) < 1: raise InferenceError( "Invalid arguments for descriptor binding", target=self, context=context, ) context = copy_context(context) try: cls = next(caller.args[0].infer(context=context)) except StopIteration as e: raise InferenceError(context=context, node=caller.args[0]) from e if isinstance(cls, util.UninferableBase): raise InferenceError( "Invalid class inferred", target=self, context=context ) # For some reason func is a Node that the below # code is not expecting if isinstance(func, bases.BoundMethod): yield func return # Rebuild the original value, but with the parent set as the # class where it will be bound. 
new_func = func.__class__( name=func.name, lineno=func.lineno, col_offset=func.col_offset, parent=func.parent, end_lineno=func.end_lineno, end_col_offset=func.end_col_offset, ) # pylint: disable=no-member new_func.postinit( func.args, func.body, func.decorators, func.returns, doc_node=func.doc_node, ) # Build a proper bound method that points to our newly built function. proxy = bases.UnboundMethod(new_func) yield bases.BoundMethod(proxy=proxy, bound=cls) @property def args(self): """Overwrite the underlying args to match those of the underlying func. Usually the underlying *func* is a function/method, as in: def test(self): pass This has only the *self* parameter but when we access test.__get__ we get a new object which has two parameters, *self* and *type*. """ nonlocal func arguments = astroid.Arguments( parent=func.args.parent, vararg=None, kwarg=None ) positional_or_keyword_params = func.args.args.copy() positional_or_keyword_params.append( astroid.AssignName( name="type", lineno=0, col_offset=0, parent=arguments, end_lineno=None, end_col_offset=None, ) ) positional_only_params = func.args.posonlyargs.copy() arguments.postinit( args=positional_or_keyword_params, posonlyargs=positional_only_params, defaults=[], kwonlyargs=[], kw_defaults=[], annotations=[], kwonlyargs_annotations=[], posonlyargs_annotations=[], ) return arguments return DescriptorBoundMethod(proxy=self._instance, bound=self._instance) # These are here just for completion. 
@property def attr___ne__(self): return node_classes.Unknown() attr___subclasshook__ = attr___ne__ attr___str__ = attr___ne__ attr___sizeof__ = attr___ne__ attr___setattr___ = attr___ne__ attr___repr__ = attr___ne__ attr___reduce__ = attr___ne__ attr___reduce_ex__ = attr___ne__ attr___lt__ = attr___ne__ attr___eq__ = attr___ne__ attr___gt__ = attr___ne__ attr___format__ = attr___ne__ attr___delattr___ = attr___ne__ attr___getattribute__ = attr___ne__ attr___hash__ = attr___ne__ attr___dir__ = attr___ne__ attr___call__ = attr___ne__ attr___class__ = attr___ne__ attr___closure__ = attr___ne__ attr___code__ = attr___ne__ class ClassModel(ObjectModel): def __init__(self): # Add a context so that inferences called from an instance don't recurse endlessly self.context = InferenceContext() super().__init__() @property def attr___module__(self): return node_classes.Const(self._instance.root().qname()) @property def attr___name__(self): return node_classes.Const(self._instance.name) @property def attr___qualname__(self): return node_classes.Const(self._instance.qname()) @property def attr___doc__(self): return node_classes.Const(getattr(self._instance.doc_node, "value", None)) @property def attr___mro__(self): if not self._instance.newstyle: raise AttributeInferenceError(target=self._instance, attribute="__mro__") mro = self._instance.mro() obj = node_classes.Tuple(parent=self._instance) obj.postinit(mro) return obj @property def attr_mro(self): if not self._instance.newstyle: raise AttributeInferenceError(target=self._instance, attribute="mro") other_self = self # Cls.mro is a method and we need to return one in order to have a proper inference. # The method we're returning is capable of inferring the underlying MRO though. 
class MroBoundMethod(bases.BoundMethod): def infer_call_result( self, caller: SuccessfulInferenceResult | None, context: InferenceContext | None = None, ) -> Iterator[node_classes.Tuple]: yield other_self.attr___mro__ implicit_metaclass = self._instance.implicit_metaclass() mro_method = implicit_metaclass.locals["mro"][0] return MroBoundMethod(proxy=mro_method, bound=implicit_metaclass) @property def attr___bases__(self): obj = node_classes.Tuple() context = InferenceContext() elts = list(self._instance._inferred_bases(context)) obj.postinit(elts=elts) return obj @property def attr___class__(self): # pylint: disable=import-outside-toplevel; circular import from astroid import helpers return helpers.object_type(self._instance) @property def attr___subclasses__(self): """Get the subclasses of the underlying class. This looks only in the current module for retrieving the subclasses, thus it might miss a couple of them. """ if not self._instance.newstyle: raise AttributeInferenceError( target=self._instance, attribute="__subclasses__" ) qname = self._instance.qname() root = self._instance.root() classes = [ cls for cls in root.nodes_of_class(nodes.ClassDef) if cls != self._instance and cls.is_subtype_of(qname, context=self.context) ] obj = node_classes.List(parent=self._instance) obj.postinit(classes) class SubclassesBoundMethod(bases.BoundMethod): def infer_call_result( self, caller: SuccessfulInferenceResult | None, context: InferenceContext | None = None, ) -> Iterator[node_classes.List]: yield obj implicit_metaclass = self._instance.implicit_metaclass() subclasses_method = implicit_metaclass.locals["__subclasses__"][0] return SubclassesBoundMethod(proxy=subclasses_method, bound=implicit_metaclass) @property def attr___dict__(self): return node_classes.Dict( parent=self._instance, lineno=self._instance.lineno, col_offset=self._instance.col_offset, end_lineno=self._instance.end_lineno, end_col_offset=self._instance.end_col_offset, ) @property def attr___call__(self): 
"""Calling a class A() returns an instance of A.""" return self._instance.instantiate_class() class SuperModel(ObjectModel): @property def attr___thisclass__(self): return self._instance.mro_pointer @property def attr___self_class__(self): return self._instance._self_class @property def attr___self__(self): return self._instance.type @property def attr___class__(self): return self._instance._proxied class UnboundMethodModel(ObjectModel): @property def attr___class__(self): # pylint: disable=import-outside-toplevel; circular import from astroid import helpers return helpers.object_type(self._instance) @property def attr___func__(self): return self._instance._proxied @property def attr___self__(self): return node_classes.Const(value=None, parent=self._instance) attr_im_func = attr___func__ attr_im_class = attr___class__ attr_im_self = attr___self__ class ContextManagerModel(ObjectModel): """Model for context managers. Based on 3.3.9 of the Data Model documentation: https://docs.python.org/3/reference/datamodel.html#with-statement-context-managers """ @property def attr___enter__(self) -> bases.BoundMethod: """Representation of the base implementation of __enter__. As per Python documentation: Enter the runtime context related to this object. The with statement will bind this method's return value to the target(s) specified in the as clause of the statement, if any. """ from astroid import builder # pylint: disable=import-outside-toplevel node: nodes.FunctionDef = builder.extract_node("""def __enter__(self): ...""") # We set the parent as being the ClassDef of 'object' as that # is where this method originally comes from node.parent = AstroidManager().builtins_module["object"] return bases.BoundMethod(proxy=node, bound=_get_bound_node(self)) @property def attr___exit__(self) -> bases.BoundMethod: """Representation of the base implementation of __exit__. As per Python documentation: Exit the runtime context related to this object. 
The parameters describe the exception that caused the context to be exited. If the context was exited without an exception, all three arguments will be None. """ from astroid import builder # pylint: disable=import-outside-toplevel node: nodes.FunctionDef = builder.extract_node( """def __exit__(self, exc_type, exc_value, traceback): ...""" ) # We set the parent as being the ClassDef of 'object' as that # is where this method originally comes from node.parent = AstroidManager().builtins_module["object"] return bases.BoundMethod(proxy=node, bound=_get_bound_node(self)) class BoundMethodModel(FunctionModel): @property def attr___func__(self): return self._instance._proxied._proxied @property def attr___self__(self): return self._instance.bound class GeneratorModel(FunctionModel, ContextManagerModel): def __new__(cls, *args, **kwargs): # Append the values from the GeneratorType unto this object. ret = super().__new__(cls, *args, **kwargs) generator = AstroidManager().builtins_module["generator"] for name, values in generator.locals.items(): method = values[0] def patched(cls, meth=method): return meth setattr(type(ret), IMPL_PREFIX + name, property(patched)) return ret @property def attr___name__(self): return node_classes.Const( value=self._instance.parent.name, parent=self._instance ) @property def attr___doc__(self): return node_classes.Const( value=getattr(self._instance.parent.doc_node, "value", None), parent=self._instance, ) class AsyncGeneratorModel(GeneratorModel): def __new__(cls, *args, **kwargs): # Append the values from the AGeneratorType unto this object. ret = super().__new__(cls, *args, **kwargs) astroid_builtins = AstroidManager().builtins_module generator = astroid_builtins.get("async_generator") if generator is None: # Make it backward compatible. 
generator = astroid_builtins.get("generator") for name, values in generator.locals.items(): method = values[0] def patched(cls, meth=method): return meth setattr(type(ret), IMPL_PREFIX + name, property(patched)) return ret class InstanceModel(ObjectModel): @property def attr___class__(self): return self._instance._proxied @property def attr___module__(self): return node_classes.Const(self._instance.root().qname()) @property def attr___doc__(self): return node_classes.Const(getattr(self._instance.doc_node, "value", None)) @property def attr___dict__(self): return _dunder_dict(self._instance, self._instance.instance_attrs) # Exception instances class ExceptionInstanceModel(InstanceModel): @property def attr_args(self) -> nodes.Tuple: return nodes.Tuple(parent=self._instance) @property def attr___traceback__(self): builtins_ast_module = AstroidManager().builtins_module traceback_type = builtins_ast_module[types.TracebackType.__name__] return traceback_type.instantiate_class() class SyntaxErrorInstanceModel(ExceptionInstanceModel): @property def attr_text(self): return node_classes.Const("") class OSErrorInstanceModel(ExceptionInstanceModel): @property def attr_filename(self): return node_classes.Const("") @property def attr_errno(self): return node_classes.Const(0) @property def attr_strerror(self): return node_classes.Const("") attr_filename2 = attr_filename class ImportErrorInstanceModel(ExceptionInstanceModel): @property def attr_name(self): return node_classes.Const("") @property def attr_path(self): return node_classes.Const("") class UnicodeDecodeErrorInstanceModel(ExceptionInstanceModel): @property def attr_object(self): return node_classes.Const(b"") BUILTIN_EXCEPTIONS = { "builtins.SyntaxError": SyntaxErrorInstanceModel, "builtins.ImportError": ImportErrorInstanceModel, "builtins.UnicodeDecodeError": UnicodeDecodeErrorInstanceModel, # These are all similar to OSError in terms of attributes "builtins.OSError": OSErrorInstanceModel, "builtins.BlockingIOError": 
OSErrorInstanceModel, "builtins.BrokenPipeError": OSErrorInstanceModel, "builtins.ChildProcessError": OSErrorInstanceModel, "builtins.ConnectionAbortedError": OSErrorInstanceModel, "builtins.ConnectionError": OSErrorInstanceModel, "builtins.ConnectionRefusedError": OSErrorInstanceModel, "builtins.ConnectionResetError": OSErrorInstanceModel, "builtins.FileExistsError": OSErrorInstanceModel, "builtins.FileNotFoundError": OSErrorInstanceModel, "builtins.InterruptedError": OSErrorInstanceModel, "builtins.IsADirectoryError": OSErrorInstanceModel, "builtins.NotADirectoryError": OSErrorInstanceModel, "builtins.PermissionError": OSErrorInstanceModel, "builtins.ProcessLookupError": OSErrorInstanceModel, "builtins.TimeoutError": OSErrorInstanceModel, } class DictModel(ObjectModel): @property def attr___class__(self): return self._instance._proxied def _generic_dict_attribute(self, obj, name): """Generate a bound method that can infer the given *obj*.""" class DictMethodBoundMethod(astroid.BoundMethod): def infer_call_result( self, caller: SuccessfulInferenceResult | None, context: InferenceContext | None = None, ) -> Iterator[InferenceResult]: yield obj meth = next(self._instance._proxied.igetattr(name), None) return DictMethodBoundMethod(proxy=meth, bound=self._instance) @property def attr_items(self): from astroid import objects # pylint: disable=import-outside-toplevel elems = [] obj = node_classes.List(parent=self._instance) for key, value in self._instance.items: elem = node_classes.Tuple(parent=obj) elem.postinit((key, value)) elems.append(elem) obj.postinit(elts=elems) items_obj = objects.DictItems(obj) return self._generic_dict_attribute(items_obj, "items") @property def attr_keys(self): from astroid import objects # pylint: disable=import-outside-toplevel keys = [key for (key, _) in self._instance.items] obj = node_classes.List(parent=self._instance) obj.postinit(elts=keys) keys_obj = objects.DictKeys(obj) return self._generic_dict_attribute(keys_obj, "keys") 
@property def attr_values(self): from astroid import objects # pylint: disable=import-outside-toplevel values = [value for (_, value) in self._instance.items] obj = node_classes.List(parent=self._instance) obj.postinit(values) values_obj = objects.DictValues(obj) return self._generic_dict_attribute(values_obj, "values") class PropertyModel(ObjectModel): """Model for a builtin property.""" def _init_function(self, name): function = nodes.FunctionDef( name=name, parent=self._instance, lineno=self._instance.lineno, col_offset=self._instance.col_offset, end_lineno=self._instance.end_lineno, end_col_offset=self._instance.end_col_offset, ) args = nodes.Arguments(parent=function, vararg=None, kwarg=None) args.postinit( args=[], defaults=[], kwonlyargs=[], kw_defaults=[], annotations=[], posonlyargs=[], posonlyargs_annotations=[], kwonlyargs_annotations=[], ) function.postinit(args=args, body=[]) return function @property def attr_fget(self): func = self._instance class PropertyFuncAccessor(nodes.FunctionDef): def infer_call_result( self, caller: SuccessfulInferenceResult | None, context: InferenceContext | None = None, ) -> Iterator[InferenceResult]: nonlocal func if caller and len(caller.args) != 1: raise InferenceError( "fget() needs a single argument", target=self, context=context ) yield from func.function.infer_call_result( caller=caller, context=context ) property_accessor = PropertyFuncAccessor( name="fget", parent=self._instance, lineno=self._instance.lineno, col_offset=self._instance.col_offset, end_lineno=self._instance.end_lineno, end_col_offset=self._instance.end_col_offset, ) property_accessor.postinit(args=func.args, body=func.body) return property_accessor @property def attr_fset(self): func = self._instance def find_setter(func: Property) -> astroid.FunctionDef | None: """ Given a property, find the corresponding setter function and returns it. 
            :param func: property for which the setter has to be found
            :return: the setter function or None
            """
            for target in [
                t for t in func.parent.get_children() if t.name == func.function.name
            ]:
                for dec_name in target.decoratornames():
                    if dec_name.endswith(func.function.name + ".setter"):
                        return target
            return None

        func_setter = find_setter(func)
        if not func_setter:
            raise InferenceError(
                f"Unable to find the setter of property {func.function.name}"
            )

        class PropertyFuncAccessor(nodes.FunctionDef):
            def infer_call_result(
                self,
                caller: SuccessfulInferenceResult | None,
                context: InferenceContext | None = None,
            ) -> Iterator[InferenceResult]:
                nonlocal func_setter
                if caller and len(caller.args) != 2:
                    raise InferenceError(
                        "fset() needs two arguments", target=self, context=context
                    )
                yield from func_setter.infer_call_result(caller=caller, context=context)

        property_accessor = PropertyFuncAccessor(
            name="fset",
            parent=self._instance,
            lineno=self._instance.lineno,
            col_offset=self._instance.col_offset,
            end_lineno=self._instance.end_lineno,
            end_col_offset=self._instance.end_col_offset,
        )
        property_accessor.postinit(args=func_setter.args, body=func_setter.body)
        return property_accessor

    @property
    def attr_setter(self):
        return self._init_function("setter")

    @property
    def attr_deleter(self):
        return self._init_function("deleter")

    @property
    def attr_getter(self):
        return self._init_function("getter")


# pylint: enable=import-outside-toplevel

astroid-3.2.2/astroid/interpreter/_import/__init__.py

astroid-3.2.2/astroid/interpreter/_import/spec.py

# Licensed under the LGPL: https://www.gnu.org/licenses/old-licenses/lgpl-2.1.en.html
# For details: https://github.com/pylint-dev/astroid/blob/main/LICENSE
# Copyright (c)
https://github.com/pylint-dev/astroid/blob/main/CONTRIBUTORS.txt from __future__ import annotations import abc import enum import importlib import importlib.machinery import importlib.util import os import pathlib import sys import types import warnings import zipimport from collections.abc import Iterable, Iterator, Sequence from functools import lru_cache from pathlib import Path from typing import Any, Literal, NamedTuple, Protocol from astroid.const import PY310_PLUS from astroid.modutils import EXT_LIB_DIRS from . import util # The MetaPathFinder protocol comes from typeshed, which says: # Intentionally omits one deprecated and one optional method of `importlib.abc.MetaPathFinder` class _MetaPathFinder(Protocol): def find_spec( self, fullname: str, path: Sequence[str] | None, target: types.ModuleType | None = ..., ) -> importlib.machinery.ModuleSpec | None: ... # pragma: no cover class ModuleType(enum.Enum): """Python module types used for ModuleSpec.""" C_BUILTIN = enum.auto() C_EXTENSION = enum.auto() PKG_DIRECTORY = enum.auto() PY_CODERESOURCE = enum.auto() PY_COMPILED = enum.auto() PY_FROZEN = enum.auto() PY_RESOURCE = enum.auto() PY_SOURCE = enum.auto() PY_ZIPMODULE = enum.auto() PY_NAMESPACE = enum.auto() _MetaPathFinderModuleTypes: dict[str, ModuleType] = { # Finders created by setuptools editable installs "_EditableFinder": ModuleType.PY_SOURCE, "_EditableNamespaceFinder": ModuleType.PY_NAMESPACE, # Finders create by six "_SixMetaPathImporter": ModuleType.PY_SOURCE, } _EditableFinderClasses: set[str] = { "_EditableFinder", "_EditableNamespaceFinder", } class ModuleSpec(NamedTuple): """Defines a class similar to PEP 420's ModuleSpec. A module spec defines a name of a module, its type, location and where submodules can be found, if the module is a package. 
""" name: str type: ModuleType | None location: str | None = None origin: str | None = None submodule_search_locations: Sequence[str] | None = None class Finder: """A finder is a class which knows how to find a particular module.""" def __init__(self, path: Sequence[str] | None = None) -> None: self._path = path or sys.path @abc.abstractmethod def find_module( self, modname: str, module_parts: Sequence[str], processed: list[str], submodule_path: Sequence[str] | None, ) -> ModuleSpec | None: """Find the given module. Each finder is responsible for each protocol of finding, as long as they all return a ModuleSpec. :param modname: The module which needs to be searched. :param module_parts: It should be a list of strings, where each part contributes to the module's namespace. :param processed: What parts from the module parts were processed so far. :param submodule_path: A list of paths where the module can be looked into. :returns: A ModuleSpec, describing how and where the module was found, None, otherwise. 
""" def contribute_to_path( self, spec: ModuleSpec, processed: list[str] ) -> Sequence[str] | None: """Get a list of extra paths where this finder can search.""" class ImportlibFinder(Finder): """A finder based on the importlib module.""" _SUFFIXES: Sequence[tuple[str, ModuleType]] = ( [(s, ModuleType.C_EXTENSION) for s in importlib.machinery.EXTENSION_SUFFIXES] + [(s, ModuleType.PY_SOURCE) for s in importlib.machinery.SOURCE_SUFFIXES] + [(s, ModuleType.PY_COMPILED) for s in importlib.machinery.BYTECODE_SUFFIXES] ) def find_module( self, modname: str, module_parts: Sequence[str], processed: list[str], submodule_path: Sequence[str] | None, ) -> ModuleSpec | None: if submodule_path is not None: submodule_path = list(submodule_path) elif modname in sys.builtin_module_names: return ModuleSpec( name=modname, location=None, type=ModuleType.C_BUILTIN, ) else: try: with warnings.catch_warnings(): warnings.filterwarnings("ignore", category=UserWarning) spec = importlib.util.find_spec(modname) if ( spec and spec.loader # type: ignore[comparison-overlap] # noqa: E501 is importlib.machinery.FrozenImporter ): # No need for BuiltinImporter; builtins handled above return ModuleSpec( name=modname, location=getattr(spec.loader_state, "filename", None), type=ModuleType.PY_FROZEN, ) except ValueError: pass submodule_path = sys.path suffixes = (".py", ".pyi", importlib.machinery.BYTECODE_SUFFIXES[0]) for entry in submodule_path: package_directory = os.path.join(entry, modname) for suffix in suffixes: package_file_name = "__init__" + suffix file_path = os.path.join(package_directory, package_file_name) if os.path.isfile(file_path): return ModuleSpec( name=modname, location=package_directory, type=ModuleType.PKG_DIRECTORY, ) for suffix, type_ in ImportlibFinder._SUFFIXES: file_name = modname + suffix file_path = os.path.join(entry, file_name) if os.path.isfile(file_path): return ModuleSpec(name=modname, location=file_path, type=type_) return None def contribute_to_path( self, spec: 
ModuleSpec, processed: list[str] ) -> Sequence[str] | None: if spec.location is None: # Builtin. return None if _is_setuptools_namespace(Path(spec.location)): # extend_path is called, search sys.path for module/packages # of this name see pkgutil.extend_path documentation path = [ os.path.join(p, *processed) for p in sys.path if os.path.isdir(os.path.join(p, *processed)) ] elif spec.name == "distutils" and not any( spec.location.lower().startswith(ext_lib_dir.lower()) for ext_lib_dir in EXT_LIB_DIRS ): # virtualenv below 20.0 patches distutils in an unexpected way # so we just find the location of distutils that will be # imported to avoid spurious import-error messages # https://github.com/pylint-dev/pylint/issues/5645 # A regression test to create this scenario exists in release-tests.yml # and can be triggered manually from GitHub Actions distutils_spec = importlib.util.find_spec("distutils") if distutils_spec and distutils_spec.origin: origin_path = Path( distutils_spec.origin ) # e.g. .../distutils/__init__.py path = [str(origin_path.parent)] # e.g. 
.../distutils else: path = [spec.location] else: path = [spec.location] return path class ExplicitNamespacePackageFinder(ImportlibFinder): """A finder for the explicit namespace packages.""" def find_module( self, modname: str, module_parts: Sequence[str], processed: list[str], submodule_path: Sequence[str] | None, ) -> ModuleSpec | None: if processed: modname = ".".join([*processed, modname]) if util.is_namespace(modname) and modname in sys.modules: submodule_path = sys.modules[modname].__path__ return ModuleSpec( name=modname, location="", origin="namespace", type=ModuleType.PY_NAMESPACE, submodule_search_locations=submodule_path, ) return None def contribute_to_path( self, spec: ModuleSpec, processed: list[str] ) -> Sequence[str] | None: return spec.submodule_search_locations class ZipFinder(Finder): """Finder that knows how to find a module inside zip files.""" def __init__(self, path: Sequence[str]) -> None: super().__init__(path) for entry_path in path: if entry_path not in sys.path_importer_cache: try: sys.path_importer_cache[entry_path] = zipimport.zipimporter( # type: ignore[assignment] entry_path ) except zipimport.ZipImportError: continue def find_module( self, modname: str, module_parts: Sequence[str], processed: list[str], submodule_path: Sequence[str] | None, ) -> ModuleSpec | None: try: file_type, filename, path = _search_zip(module_parts) except ImportError: return None return ModuleSpec( name=modname, location=filename, origin="egg", type=file_type, submodule_search_locations=path, ) class PathSpecFinder(Finder): """Finder based on importlib.machinery.PathFinder.""" def find_module( self, modname: str, module_parts: Sequence[str], processed: list[str], submodule_path: Sequence[str] | None, ) -> ModuleSpec | None: spec = importlib.machinery.PathFinder.find_spec(modname, path=submodule_path) if spec is not None: is_namespace_pkg = spec.origin is None location = spec.origin if not is_namespace_pkg else None module_type = ModuleType.PY_NAMESPACE if 
is_namespace_pkg else None return ModuleSpec( name=spec.name, location=location, origin=spec.origin, type=module_type, submodule_search_locations=list(spec.submodule_search_locations or []), ) return spec def contribute_to_path( self, spec: ModuleSpec, processed: list[str] ) -> Sequence[str] | None: if spec.type == ModuleType.PY_NAMESPACE: return spec.submodule_search_locations return None _SPEC_FINDERS = ( ImportlibFinder, ZipFinder, PathSpecFinder, ExplicitNamespacePackageFinder, ) def _is_setuptools_namespace(location: pathlib.Path) -> bool: try: with open(location / "__init__.py", "rb") as stream: data = stream.read(4096) except OSError: return False extend_path = b"pkgutil" in data and b"extend_path" in data declare_namespace = ( b"pkg_resources" in data and b"declare_namespace(__name__)" in data ) return extend_path or declare_namespace def _get_zipimporters() -> Iterator[tuple[str, zipimport.zipimporter]]: for filepath, importer in sys.path_importer_cache.items(): if isinstance(importer, zipimport.zipimporter): yield filepath, importer def _search_zip( modpath: Sequence[str], ) -> tuple[Literal[ModuleType.PY_ZIPMODULE], str, str]: for filepath, importer in _get_zipimporters(): if PY310_PLUS: found: Any = importer.find_spec(modpath[0]) else: found = importer.find_module(modpath[0]) if found: if PY310_PLUS: if not importer.find_spec(os.path.sep.join(modpath)): raise ImportError( "No module named %s in %s/%s" % (".".join(modpath[1:]), filepath, modpath) ) elif not importer.find_module(os.path.sep.join(modpath)): raise ImportError( "No module named %s in %s/%s" % (".".join(modpath[1:]), filepath, modpath) ) return ( ModuleType.PY_ZIPMODULE, os.path.abspath(filepath) + os.path.sep + os.path.sep.join(modpath), filepath, ) raise ImportError(f"No module named {'.'.join(modpath)}") def _find_spec_with_path( search_path: Sequence[str], modname: str, module_parts: list[str], processed: list[str], submodule_path: Sequence[str] | None, ) -> tuple[Finder | 
_MetaPathFinder, ModuleSpec]: for finder in _SPEC_FINDERS: finder_instance = finder(search_path) spec = finder_instance.find_module( modname, module_parts, processed, submodule_path ) if spec is None: continue return finder_instance, spec # Support for custom finders for meta_finder in sys.meta_path: # See if we support the customer import hook of the meta_finder meta_finder_name = meta_finder.__class__.__name__ if meta_finder_name not in _MetaPathFinderModuleTypes: # Setuptools>62 creates its EditableFinders dynamically and have # "type" as their __class__.__name__. We check __name__ as well # to see if we can support the finder. try: meta_finder_name = meta_finder.__name__ # type: ignore[attr-defined] except AttributeError: continue if meta_finder_name not in _MetaPathFinderModuleTypes: continue module_type = _MetaPathFinderModuleTypes[meta_finder_name] # Meta path finders are supposed to have a find_spec method since # Python 3.4. However, some third-party finders do not implement it. # PEP302 does not refer to find_spec as well. # See: https://github.com/pylint-dev/astroid/pull/1752/ if not hasattr(meta_finder, "find_spec"): continue spec = meta_finder.find_spec(modname, submodule_path) if spec: return ( meta_finder, ModuleSpec( spec.name, module_type, spec.origin, spec.origin, spec.submodule_search_locations, ), ) raise ImportError(f"No module named {'.'.join(module_parts)}") def find_spec(modpath: Iterable[str], path: Iterable[str] | None = None) -> ModuleSpec: """Find a spec for the given module. :type modpath: list or tuple :param modpath: split module's name (i.e name of a module or package split on '.'), with leading empty strings for explicit relative import :type path: list or None :param path: optional list of path where the module or package should be searched (use sys.path if nothing or None is given) :rtype: ModuleSpec :return: A module spec, which describes how the module was found and where. 
    """
    return _find_spec(tuple(modpath), tuple(path) if path else None)


@lru_cache(maxsize=1024)
def _find_spec(module_path: tuple, path: tuple) -> ModuleSpec:
    _path = path or sys.path

    # Need a copy for not mutating the argument.
    modpath = list(module_path)

    submodule_path = None
    module_parts = modpath[:]
    processed: list[str] = []

    while modpath:
        modname = modpath.pop(0)

        finder, spec = _find_spec_with_path(
            _path, modname, module_parts, processed, submodule_path or path
        )
        processed.append(modname)
        if modpath:
            if isinstance(finder, Finder):
                submodule_path = finder.contribute_to_path(spec, processed)
            # If modname is a package from an editable install, update submodule_path
            # so that the next module in the path will be found inside of it using importlib.
            # Existence of __name__ is guaranteed by _find_spec_with_path.
            elif finder.__name__ in _EditableFinderClasses:  # type: ignore[attr-defined]
                submodule_path = spec.submodule_search_locations

        if spec.type == ModuleType.PKG_DIRECTORY:
            spec = spec._replace(submodule_search_locations=submodule_path)

    return spec

astroid-3.2.2/astroid/interpreter/_import/util.py

# Licensed under the LGPL: https://www.gnu.org/licenses/old-licenses/lgpl-2.1.en.html
# For details: https://github.com/pylint-dev/astroid/blob/main/LICENSE
# Copyright (c) https://github.com/pylint-dev/astroid/blob/main/CONTRIBUTORS.txt

from __future__ import annotations

import pathlib
import sys
from functools import lru_cache
from importlib._bootstrap_external import _NamespacePath
from importlib.util import _find_spec_from_path  # type: ignore[attr-defined]

from astroid.const import IS_PYPY

if sys.version_info >= (3, 11):
    from importlib.machinery import NamespaceLoader
else:
    from importlib._bootstrap_external import _NamespaceLoader as NamespaceLoader


@lru_cache(maxsize=4096)
def is_namespace(modname: str) -> bool:
    from astroid.modutils import (  # pylint: disable=import-outside-toplevel
        EXT_LIB_DIRS,
        STD_LIB_DIRS,
    )

    STD_AND_EXT_LIB_DIRS = STD_LIB_DIRS.union(EXT_LIB_DIRS)

    if modname in sys.builtin_module_names:
        return False

    found_spec = None

    # find_spec() attempts to import parent packages when given dotted paths.
    # That's unacceptable here, so we fall back to _find_spec_from_path(), which does
    # not, but requires instead that each single parent ('astroid', 'nodes', etc.)
    # be specced from left to right.
    processed_components = []
    last_submodule_search_locations: _NamespacePath | None = None
    for component in modname.split("."):
        processed_components.append(component)
        working_modname = ".".join(processed_components)
        try:
            # Both the modname and the path are built iteratively, with the
            # path (e.g. ['a', 'a/b', 'a/b/c']) lagging the modname by one
            found_spec = _find_spec_from_path(
                working_modname, path=last_submodule_search_locations
            )
        except AttributeError:
            return False
        except ValueError:
            if modname == "__main__":
                return False
            try:
                # .pth files will be on sys.modules
                # __spec__ is set inconsistently on PyPy so we can't really rely on the heuristic here
                # See: https://foss.heptapod.net/pypy/pypy/-/issues/3736
                # Check first fragment of modname, e.g.
                # "astroid", not "astroid.interpreter"
                # because of cffi's behavior
                # See: https://github.com/pylint-dev/astroid/issues/1776
                mod = sys.modules[processed_components[0]]
                return (
                    mod.__spec__ is None
                    and getattr(mod, "__file__", None) is None
                    and hasattr(mod, "__path__")
                    and not IS_PYPY
                )
            except KeyError:
                return False
            except AttributeError:
                # Workaround for "py" module
                # https://github.com/pytest-dev/apipkg/issues/13
                return False
        except KeyError:
            # Intermediate steps might raise KeyErrors
            # https://github.com/python/cpython/issues/93334
            # TODO: update if fixed in importlib
            # For tree a > b > c.py
            # >>> from importlib.machinery import PathFinder
            # >>> PathFinder.find_spec('a.b', ['a'])
            # KeyError: 'a'
            # Repair last_submodule_search_locations
            if last_submodule_search_locations:
                # pylint: disable=unsubscriptable-object
                last_item = last_submodule_search_locations[-1]
                # e.g. for failure example above, add 'a/b' and keep going
                # so that find_spec('a.b.c', path=['a', 'a/b']) succeeds
                assumed_location = pathlib.Path(last_item) / component
                last_submodule_search_locations.append(str(assumed_location))
            continue

        # Update last_submodule_search_locations for next iteration
        if found_spec and found_spec.submodule_search_locations:
            # But immediately return False if we can detect we are in stdlib
            # or external lib (e.g site-packages)
            if any(
                any(location.startswith(lib_dir) for lib_dir in STD_AND_EXT_LIB_DIRS)
                for location in found_spec.submodule_search_locations
            ):
                return False
            last_submodule_search_locations = found_spec.submodule_search_locations

    return (
        found_spec is not None
        and found_spec.submodule_search_locations is not None
        and found_spec.origin is None
        and (
            found_spec.loader is None or isinstance(found_spec.loader, NamespaceLoader)
        )
    )

astroid-3.2.2/astroid/interpreter/dunder_lookup.py

# Licensed under the LGPL: https://www.gnu.org/licenses/old-licenses/lgpl-2.1.en.html
# For details:
https://github.com/pylint-dev/astroid/blob/main/LICENSE
# Copyright (c) https://github.com/pylint-dev/astroid/blob/main/CONTRIBUTORS.txt

"""Contains logic for retrieving special methods.

This implementation does not rely on the dot attribute access
logic, found in ``.getattr()``. The difference between these two
is that the dunder methods are looked up with the type slots
(you can find more about these here
http://lucumr.pocoo.org/2014/8/16/the-python-i-would-like-to-see/)
As such, the lookup for the special methods is actually simpler than
the dot attribute access.
"""

from __future__ import annotations

import itertools
from typing import TYPE_CHECKING

import astroid
from astroid.exceptions import AttributeInferenceError

if TYPE_CHECKING:
    from astroid import nodes
    from astroid.context import InferenceContext


def _lookup_in_mro(node, name) -> list:
    attrs = node.locals.get(name, [])

    nodes = itertools.chain.from_iterable(
        ancestor.locals.get(name, []) for ancestor in node.ancestors(recurs=True)
    )
    values = list(itertools.chain(attrs, nodes))
    if not values:
        raise AttributeInferenceError(attribute=name, target=node)

    return values


def lookup(
    node: nodes.NodeNG, name: str, context: InferenceContext | None = None
) -> list:
    """Lookup the given special method name in the given *node*.

    If the special method was found, then a list of attributes
    will be returned. Otherwise, `astroid.AttributeInferenceError`
    is going to be raised.
    """
    if isinstance(
        node, (astroid.List, astroid.Tuple, astroid.Const, astroid.Dict, astroid.Set)
    ):
        return _builtin_lookup(node, name)
    if isinstance(node, astroid.Instance):
        return _lookup_in_mro(node, name)
    if isinstance(node, astroid.ClassDef):
        return _class_lookup(node, name, context=context)

    raise AttributeInferenceError(attribute=name, target=node)


def _class_lookup(
    node: nodes.ClassDef, name: str, context: InferenceContext | None = None
) -> list:
    metaclass = node.metaclass(context=context)
    if metaclass is None:
        raise AttributeInferenceError(attribute=name, target=node)

    return _lookup_in_mro(metaclass, name)


def _builtin_lookup(node, name) -> list:
    values = node.locals.get(name, [])
    if not values:
        raise AttributeInferenceError(attribute=name, target=node)

    return values

astroid-3.2.2/astroid/raw_building.py

# Licensed under the LGPL: https://www.gnu.org/licenses/old-licenses/lgpl-2.1.en.html
# For details: https://github.com/pylint-dev/astroid/blob/main/LICENSE
# Copyright (c) https://github.com/pylint-dev/astroid/blob/main/CONTRIBUTORS.txt

"""this module contains a set of functions to create astroid trees from scratch
(build_* functions) or from living object (object_build_* functions)
"""

from __future__ import annotations

import builtins
import inspect
import io
import logging
import os
import sys
import types
import warnings
from collections.abc import Iterable
from contextlib import redirect_stderr, redirect_stdout
from typing import Any, Union

from astroid import bases, nodes
from astroid.const import _EMPTY_OBJECT_MARKER, IS_PYPY
from astroid.manager import AstroidManager
from astroid.nodes import node_classes

logger = logging.getLogger(__name__)

_FunctionTypes = Union[
    types.FunctionType,
    types.MethodType,
    types.BuiltinFunctionType,
    types.WrapperDescriptorType,
    types.MethodDescriptorType,
    types.ClassMethodDescriptorType,
]

# the keys of CONST_CLS eg python builtin types
_CONSTANTS =
tuple(node_classes.CONST_CLS) TYPE_NONE = type(None) TYPE_NOTIMPLEMENTED = type(NotImplemented) TYPE_ELLIPSIS = type(...) def _attach_local_node(parent, node, name: str) -> None: node.name = name # needed by add_local_node parent.add_local_node(node) def _add_dunder_class(func, member) -> None: """Add a __class__ member to the given func node, if we can determine it.""" python_cls = member.__class__ cls_name = getattr(python_cls, "__name__", None) if not cls_name: return cls_bases = [ancestor.__name__ for ancestor in python_cls.__bases__] ast_klass = build_class(cls_name, cls_bases, python_cls.__doc__) func.instance_attrs["__class__"] = [ast_klass] def attach_dummy_node(node, name: str, runtime_object=_EMPTY_OBJECT_MARKER) -> None: """create a dummy node and register it in the locals of the given node with the specified name """ enode = nodes.EmptyNode() enode.object = runtime_object _attach_local_node(node, enode, name) def attach_const_node(node, name: str, value) -> None: """create a Const node and register it in the locals of the given node with the specified name """ if name not in node.special_attributes: _attach_local_node(node, nodes.const_factory(value), name) def attach_import_node(node, modname: str, membername: str) -> None: """create a ImportFrom node and register it in the locals of the given node with the specified name """ from_node = nodes.ImportFrom(modname, [(membername, None)]) _attach_local_node(node, from_node, membername) def build_module(name: str, doc: str | None = None) -> nodes.Module: """create and initialize an astroid Module node""" node = nodes.Module(name, pure_python=False, package=False) node.postinit( body=[], doc_node=nodes.Const(value=doc) if doc else None, ) return node def build_class( name: str, basenames: Iterable[str] = (), doc: str | None = None ) -> nodes.ClassDef: """Create and initialize an astroid ClassDef node.""" node = nodes.ClassDef( name, lineno=0, col_offset=0, end_lineno=0, end_col_offset=0, 
parent=nodes.Unknown(), ) node.postinit( bases=[ nodes.Name( name=base, lineno=0, col_offset=0, parent=node, end_lineno=None, end_col_offset=None, ) for base in basenames ], body=[], decorators=None, doc_node=nodes.Const(value=doc) if doc else None, ) return node def build_function( name: str, args: list[str] | None = None, posonlyargs: list[str] | None = None, defaults: list[Any] | None = None, doc: str | None = None, kwonlyargs: list[str] | None = None, kwonlydefaults: list[Any] | None = None, ) -> nodes.FunctionDef: """create and initialize an astroid FunctionDef node""" # first argument is now a list of decorators func = nodes.FunctionDef( name, lineno=0, col_offset=0, parent=node_classes.Unknown(), end_col_offset=0, end_lineno=0, ) argsnode = nodes.Arguments(parent=func, vararg=None, kwarg=None) # If args is None we don't have any information about the signature # (in contrast to when there are no arguments and args == []). We pass # this to the builder to indicate this. if args is not None: # We set the lineno and col_offset to 0 because we don't have any # information about the location of the function definition. arguments = [ nodes.AssignName( name=arg, parent=argsnode, lineno=0, col_offset=0, end_lineno=None, end_col_offset=None, ) for arg in args ] else: arguments = None default_nodes: list[nodes.NodeNG] | None if defaults is None: default_nodes = None else: default_nodes = [] for default in defaults: default_node = nodes.const_factory(default) default_node.parent = argsnode default_nodes.append(default_node) kwonlydefault_nodes: list[nodes.NodeNG | None] | None if kwonlydefaults is None: kwonlydefault_nodes = None else: kwonlydefault_nodes = [] for kwonlydefault in kwonlydefaults: kwonlydefault_node = nodes.const_factory(kwonlydefault) kwonlydefault_node.parent = argsnode kwonlydefault_nodes.append(kwonlydefault_node) # We set the lineno and col_offset to 0 because we don't have any # information about the location of the kwonly and posonlyargs. 
argsnode.postinit( args=arguments, defaults=default_nodes, kwonlyargs=[ nodes.AssignName( name=arg, parent=argsnode, lineno=0, col_offset=0, end_lineno=None, end_col_offset=None, ) for arg in kwonlyargs or () ], kw_defaults=kwonlydefault_nodes, annotations=[], posonlyargs=[ nodes.AssignName( name=arg, parent=argsnode, lineno=0, col_offset=0, end_lineno=None, end_col_offset=None, ) for arg in posonlyargs or () ], kwonlyargs_annotations=[], posonlyargs_annotations=[], ) func.postinit( args=argsnode, body=[], doc_node=nodes.Const(value=doc) if doc else None, ) if args: register_arguments(func) return func def build_from_import(fromname: str, names: list[str]) -> nodes.ImportFrom: """create and initialize an astroid ImportFrom import statement""" return nodes.ImportFrom(fromname, [(name, None) for name in names]) def register_arguments(func: nodes.FunctionDef, args: list | None = None) -> None: """add given arguments to local args is a list that may contains nested lists (i.e. def func(a, (b, c, d)): ...) """ # If no args are passed in, get the args from the function. if args is None: if func.args.vararg: func.set_local(func.args.vararg, func.args) if func.args.kwarg: func.set_local(func.args.kwarg, func.args) args = func.args.args # If the function has no args, there is nothing left to do. if args is None: return for arg in args: if isinstance(arg, nodes.AssignName): func.set_local(arg.name, arg) else: register_arguments(func, arg.elts) def object_build_class( node: nodes.Module | nodes.ClassDef, member: type, localname: str ) -> nodes.ClassDef: """create astroid for a living class object""" basenames = [base.__name__ for base in member.__bases__] return _base_class_object_build(node, member, basenames, localname=localname) def _get_args_info_from_callable( member: _FunctionTypes, ) -> tuple[list[str], list[str], list[Any], list[str], list[Any]]: """Returns args, posonlyargs, defaults, kwonlyargs. :note: currently ignores the return annotation. 
""" signature = inspect.signature(member) args: list[str] = [] defaults: list[Any] = [] posonlyargs: list[str] = [] kwonlyargs: list[str] = [] kwonlydefaults: list[Any] = [] for param_name, param in signature.parameters.items(): if param.kind == inspect.Parameter.POSITIONAL_ONLY: posonlyargs.append(param_name) elif param.kind == inspect.Parameter.POSITIONAL_OR_KEYWORD: args.append(param_name) elif param.kind == inspect.Parameter.VAR_POSITIONAL: args.append(param_name) elif param.kind == inspect.Parameter.VAR_KEYWORD: args.append(param_name) elif param.kind == inspect.Parameter.KEYWORD_ONLY: kwonlyargs.append(param_name) if param.default is not inspect.Parameter.empty: kwonlydefaults.append(param.default) continue if param.default is not inspect.Parameter.empty: defaults.append(param.default) return args, posonlyargs, defaults, kwonlyargs, kwonlydefaults def object_build_function( node: nodes.Module | nodes.ClassDef, member: _FunctionTypes, localname: str ) -> None: """create astroid for a living function object""" ( args, posonlyargs, defaults, kwonlyargs, kwonly_defaults, ) = _get_args_info_from_callable(member) func = build_function( getattr(member, "__name__", None) or localname, args, posonlyargs, defaults, member.__doc__, kwonlyargs=kwonlyargs, kwonlydefaults=kwonly_defaults, ) node.add_local_node(func, localname) def object_build_datadescriptor( node: nodes.Module | nodes.ClassDef, member: type, name: str ) -> nodes.ClassDef: """create astroid for a living data descriptor object""" return _base_class_object_build(node, member, [], name) def object_build_methoddescriptor( node: nodes.Module | nodes.ClassDef, member: _FunctionTypes, localname: str, ) -> None: """create astroid for a living method descriptor object""" # FIXME get arguments ? 
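The parameter-kind walk in `_get_args_info_from_callable` above is driven by `inspect.signature`; a minimal stdlib sketch of the same classification (the `sample` function is hypothetical, purely for illustration):

```python
import inspect

# Hypothetical function covering every parameter kind that
# _get_args_info_from_callable distinguishes.
def sample(pos_only, /, normal, *args, kw_only=3, **kwargs):
    return None

# Map each parameter name to its inspect.Parameter.kind.
kinds = {
    name: param.kind.name
    for name, param in inspect.signature(sample).parameters.items()
}
assert kinds["pos_only"] == "POSITIONAL_ONLY"
assert kinds["normal"] == "POSITIONAL_OR_KEYWORD"
assert kinds["args"] == "VAR_POSITIONAL"
assert kinds["kw_only"] == "KEYWORD_ONLY"
assert kinds["kwargs"] == "VAR_KEYWORD"
```

Note that astroid folds `VAR_POSITIONAL` and `VAR_KEYWORD` parameters into the plain `args` list here, while keyword-only defaults are collected separately in `kwonlydefaults`.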
func = build_function( getattr(member, "__name__", None) or localname, doc=member.__doc__ ) node.add_local_node(func, localname) _add_dunder_class(func, member) def _base_class_object_build( node: nodes.Module | nodes.ClassDef, member: type, basenames: list[str], name: str | None = None, localname: str | None = None, ) -> nodes.ClassDef: """create astroid for a living class object, with a given set of base names (e.g. ancestors) """ class_name = name or getattr(member, "__name__", None) or localname assert isinstance(class_name, str) klass = build_class( class_name, basenames, member.__doc__, ) klass._newstyle = isinstance(member, type) node.add_local_node(klass, localname) try: # limit the instantiation trick since it's too dangerous # (such as infinite test execution...) # this at least resolves common case such as Exception.args, # OSError.errno if issubclass(member, Exception): instdict = member().__dict__ else: raise TypeError except TypeError: pass else: for item_name, obj in instdict.items(): valnode = nodes.EmptyNode() valnode.object = obj valnode.parent = klass valnode.lineno = 1 klass.instance_attrs[item_name] = [valnode] return klass def _build_from_function( node: nodes.Module | nodes.ClassDef, name: str, member: _FunctionTypes, module: types.ModuleType, ) -> None: # verify this is not an imported function try: code = member.__code__ # type: ignore[union-attr] except AttributeError: # Some implementations don't provide the code object, # such as Jython. code = None filename = getattr(code, "co_filename", None) if filename is None: assert isinstance(member, object) object_build_methoddescriptor(node, member, name) elif filename != getattr(module, "__file__", None): attach_dummy_node(node, name, member) else: object_build_function(node, member, name) def _safe_has_attribute(obj, member: str) -> bool: """Required because unexpected RunTimeError can be raised. 
See https://github.com/pylint-dev/astroid/issues/1958 """ try: return hasattr(obj, member) except Exception: # pylint: disable=broad-except return False class InspectBuilder: """class for building nodes from living object this is actually a really minimal representation, including only Module, FunctionDef and ClassDef nodes and some others as guessed. """ bootstrapped: bool = False def __init__(self, manager_instance: AstroidManager | None = None) -> None: self._manager = manager_instance or AstroidManager() self._done: dict[types.ModuleType | type, nodes.Module | nodes.ClassDef] = {} self._module: types.ModuleType def inspect_build( self, module: types.ModuleType, modname: str | None = None, path: str | None = None, ) -> nodes.Module: """build astroid from a living module (i.e. using inspect) this is used when there is no python source code available (either because it's a built-in module or because the .py is not available) """ self._module = module if modname is None: modname = module.__name__ try: node = build_module(modname, module.__doc__) except AttributeError: # in jython, java modules have no __doc__ (see #109562) node = build_module(modname) if path is None: node.path = node.file = path else: node.path = [os.path.abspath(path)] node.file = node.path[0] node.name = modname self._manager.cache_module(node) node.package = hasattr(module, "__path__") self._done = {} self.object_build(node, module) return node def object_build( self, node: nodes.Module | nodes.ClassDef, obj: types.ModuleType | type ) -> None: """recursive method which create a partial ast from real objects (only function, class, and method are handled) """ if obj in self._done: return None self._done[obj] = node for name in dir(obj): # inspect.ismethod() and inspect.isbuiltin() in PyPy return # the opposite of what they do in CPython for __class_getitem__. 
pypy__class_getitem__ = IS_PYPY and name == "__class_getitem__" try: with warnings.catch_warnings(): warnings.simplefilter("ignore") member = getattr(obj, name) except AttributeError: # damned ExtensionClass.Base, I know you're there ! attach_dummy_node(node, name) continue if inspect.ismethod(member) and not pypy__class_getitem__: member = member.__func__ if inspect.isfunction(member): _build_from_function(node, name, member, self._module) elif inspect.isbuiltin(member) or pypy__class_getitem__: if self.imported_member(node, member, name): continue object_build_methoddescriptor(node, member, name) elif inspect.isclass(member): if self.imported_member(node, member, name): continue if member in self._done: class_node = self._done[member] assert isinstance(class_node, nodes.ClassDef) if class_node not in node.locals.get(name, ()): node.add_local_node(class_node, name) else: class_node = object_build_class(node, member, name) # recursion self.object_build(class_node, member) if name == "__class__" and class_node.parent is None: class_node.parent = self._done[self._module] elif inspect.ismethoddescriptor(member): object_build_methoddescriptor(node, member, name) elif inspect.isdatadescriptor(member): object_build_datadescriptor(node, member, name) elif isinstance(member, _CONSTANTS): attach_const_node(node, name, member) elif inspect.isroutine(member): # This should be called for Jython, where some builtin # methods aren't caught by isbuiltin branch. _build_from_function(node, name, member, self._module) elif _safe_has_attribute(member, "__all__"): module = build_module(name) _attach_local_node(node, module, name) # recursion self.object_build(module, member) else: # create an empty node so that the name is actually defined attach_dummy_node(node, name, member) return None def imported_member(self, node, member, name: str) -> bool: """verify this is not an imported class or handle it""" # /!\ some classes like ExtensionClass doesn't have a __module__ # attribute ! 
Also, this may trigger an exception on badly built module # (see http://www.logilab.org/ticket/57299 for instance) try: modname = getattr(member, "__module__", None) except TypeError: modname = None if modname is None: if name in {"__new__", "__subclasshook__"}: # Python 2.5.1 (r251:54863, Sep 1 2010, 22:03:14) # >>> print object.__new__.__module__ # None modname = builtins.__name__ else: attach_dummy_node(node, name, member) return True # On PyPy during bootstrapping we infer _io while _module is # builtins. In CPython _io names itself io, see http://bugs.python.org/issue18602 # Therefore, this basically checks whether we are not in PyPy. if modname == "_io" and not self._module.__name__ == "builtins": return False real_name = {"gtk": "gtk_gtk"}.get(modname, modname) if real_name != self._module.__name__: # check if it sounds valid and then add an import node, else use a # dummy node try: with redirect_stderr(io.StringIO()) as stderr, redirect_stdout( io.StringIO() ) as stdout: getattr(sys.modules[modname], name) stderr_value = stderr.getvalue() if stderr_value: logger.error( "Captured stderr while getting %s from %s:\n%s", name, sys.modules[modname], stderr_value, ) stdout_value = stdout.getvalue() if stdout_value: logger.info( "Captured stdout while getting %s from %s:\n%s", name, sys.modules[modname], stdout_value, ) except (KeyError, AttributeError): attach_dummy_node(node, name, member) else: attach_import_node(node, modname, name) return True return False # astroid bootstrapping ###################################################### _CONST_PROXY: dict[type, nodes.ClassDef] = {} def _set_proxied(const) -> nodes.ClassDef: # TODO : find a nicer way to handle this situation; return _CONST_PROXY[const.value.__class__] def _astroid_bootstrapping() -> None: """astroid bootstrapping the builtins module""" # this boot strapping is necessary since we need the Const nodes to # inspect_build builtins, and then we can proxy Const builder = InspectBuilder() 
astroid_builtin = builder.inspect_build(builtins) for cls, node_cls in node_classes.CONST_CLS.items(): if cls is TYPE_NONE: proxy = build_class("NoneType") proxy.parent = astroid_builtin elif cls is TYPE_NOTIMPLEMENTED: proxy = build_class("NotImplementedType") proxy.parent = astroid_builtin elif cls is TYPE_ELLIPSIS: proxy = build_class("Ellipsis") proxy.parent = astroid_builtin else: proxy = astroid_builtin.getattr(cls.__name__)[0] assert isinstance(proxy, nodes.ClassDef) if cls in (dict, list, set, tuple): node_cls._proxied = proxy else: _CONST_PROXY[cls] = proxy # Set the builtin module as parent for some builtins. nodes.Const._proxied = property(_set_proxied) _GeneratorType = nodes.ClassDef( types.GeneratorType.__name__, lineno=0, col_offset=0, end_lineno=0, end_col_offset=0, parent=nodes.Unknown(), ) _GeneratorType.parent = astroid_builtin generator_doc_node = ( nodes.Const(value=types.GeneratorType.__doc__) if types.GeneratorType.__doc__ else None ) _GeneratorType.postinit( bases=[], body=[], decorators=None, doc_node=generator_doc_node, ) bases.Generator._proxied = _GeneratorType builder.object_build(bases.Generator._proxied, types.GeneratorType) if hasattr(types, "AsyncGeneratorType"): _AsyncGeneratorType = nodes.ClassDef( types.AsyncGeneratorType.__name__, lineno=0, col_offset=0, end_lineno=0, end_col_offset=0, parent=nodes.Unknown(), ) _AsyncGeneratorType.parent = astroid_builtin async_generator_doc_node = ( nodes.Const(value=types.AsyncGeneratorType.__doc__) if types.AsyncGeneratorType.__doc__ else None ) _AsyncGeneratorType.postinit( bases=[], body=[], decorators=None, doc_node=async_generator_doc_node, ) bases.AsyncGenerator._proxied = _AsyncGeneratorType builder.object_build(bases.AsyncGenerator._proxied, types.AsyncGeneratorType) if hasattr(types, "UnionType"): _UnionTypeType = nodes.ClassDef( types.UnionType.__name__, lineno=0, col_offset=0, end_lineno=0, end_col_offset=0, parent=nodes.Unknown(), ) _UnionTypeType.parent = astroid_builtin 
union_type_doc_node = ( nodes.Const(value=types.UnionType.__doc__) if types.UnionType.__doc__ else None ) _UnionTypeType.postinit( bases=[], body=[], decorators=None, doc_node=union_type_doc_node, ) bases.UnionType._proxied = _UnionTypeType builder.object_build(bases.UnionType._proxied, types.UnionType) builtin_types = ( types.GetSetDescriptorType, types.GeneratorType, types.MemberDescriptorType, TYPE_NONE, TYPE_NOTIMPLEMENTED, types.FunctionType, types.MethodType, types.BuiltinFunctionType, types.ModuleType, types.TracebackType, ) for _type in builtin_types: if _type.__name__ not in astroid_builtin: klass = nodes.ClassDef( _type.__name__, lineno=0, col_offset=0, end_lineno=0, end_col_offset=0, parent=nodes.Unknown(), ) klass.parent = astroid_builtin klass.postinit( bases=[], body=[], decorators=None, doc_node=nodes.Const(value=_type.__doc__) if _type.__doc__ else None, ) builder.object_build(klass, _type) astroid_builtin[_type.__name__] = klass InspectBuilder.bootstrapped = True # pylint: disable-next=import-outside-toplevel from astroid.brain.brain_builtin_inference import on_bootstrap # Instantiates an AstroidBuilder(), which is where # InspectBuilder.bootstrapped is checked, so place after bootstrapped=True. 
on_bootstrap() astroid-3.2.2/astroid/decorators.py0000664000175000017500000002061614622475517017277 0ustar epsilonepsilon# Licensed under the LGPL: https://www.gnu.org/licenses/old-licenses/lgpl-2.1.en.html # For details: https://github.com/pylint-dev/astroid/blob/main/LICENSE # Copyright (c) https://github.com/pylint-dev/astroid/blob/main/CONTRIBUTORS.txt """A few useful function/method decorators.""" from __future__ import annotations import functools import inspect import sys import warnings from collections.abc import Callable, Generator from typing import TypeVar from astroid import util from astroid.context import InferenceContext from astroid.exceptions import InferenceError from astroid.typing import InferenceResult if sys.version_info >= (3, 10): from typing import ParamSpec else: from typing_extensions import ParamSpec _R = TypeVar("_R") _P = ParamSpec("_P") def path_wrapper(func): """Return the given infer function wrapped to handle the path. Used to stop inference if the node has already been looked at for a given `InferenceContext` to prevent infinite recursion """ @functools.wraps(func) def wrapped( node, context: InferenceContext | None = None, _func=func, **kwargs ) -> Generator: """Wrapper function handling context.""" if context is None: context = InferenceContext() if context.push(node): return yielded = set() for res in _func(node, context, **kwargs): # unproxy only true instance, not const, tuple, dict... 
if res.__class__.__name__ == "Instance": ares = res._proxied else: ares = res if ares not in yielded: yield res yielded.add(ares) return wrapped def yes_if_nothing_inferred( func: Callable[_P, Generator[InferenceResult, None, None]] ) -> Callable[_P, Generator[InferenceResult, None, None]]: def inner( *args: _P.args, **kwargs: _P.kwargs ) -> Generator[InferenceResult, None, None]: generator = func(*args, **kwargs) try: yield next(generator) except StopIteration: # generator is empty yield util.Uninferable return yield from generator return inner def raise_if_nothing_inferred( func: Callable[_P, Generator[InferenceResult, None, None]], ) -> Callable[_P, Generator[InferenceResult, None, None]]: def inner( *args: _P.args, **kwargs: _P.kwargs ) -> Generator[InferenceResult, None, None]: generator = func(*args, **kwargs) try: yield next(generator) except StopIteration as error: # generator is empty if error.args: raise InferenceError(**error.args[0]) from error raise InferenceError( "StopIteration raised without any error information." ) from error except RecursionError as error: raise InferenceError( f"RecursionError raised with limit {sys.getrecursionlimit()}." ) from error yield from generator return inner # Expensive decorators only used to emit Deprecation warnings. # If no other than the default DeprecationWarning are enabled, # fall back to passthrough implementations. if util.check_warnings_filter(): # noqa: C901 def deprecate_default_argument_values( astroid_version: str = "3.0", **arguments: str ) -> Callable[[Callable[_P, _R]], Callable[_P, _R]]: """Decorator which emits a DeprecationWarning if any arguments specified are None or not passed at all. Arguments should be a key-value mapping, with the key being the argument to check and the value being a type annotation as string for the value of the argument. To improve performance, only used when DeprecationWarnings other than the default one are enabled. 
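    A self-contained sketch of the signature-index technique used below, with
    a simplified hypothetical ``warn_if_missing`` helper (not astroid's actual
    decorator) standing in for ``deprecate_default_argument_values``:

```python
import functools
import inspect
import warnings


def warn_if_missing(arg: str):
    """Emit a DeprecationWarning when ``arg`` is absent or None."""

    def deco(func):
        @functools.wraps(func)
        def wrapper(*args, **kwargs):
            # Locate the positional index of ``arg`` in the signature.
            index = list(inspect.signature(func).parameters).index(arg)
            if arg in kwargs:
                missing = kwargs[arg] is None
            else:
                missing = len(args) <= index or args[index] is None
            if missing:
                warnings.warn(
                    f"'{arg}' will be a required argument",
                    DeprecationWarning,
                    stacklevel=2,
                )
            return func(*args, **kwargs)

        return wrapper

    return deco


@warn_if_missing("doc")
def build_module(name, doc=None):
    return name


with warnings.catch_warnings(record=True) as caught:
    warnings.simplefilter("always")
    build_module("mymod")  # no doc given -> DeprecationWarning

assert len(caught) == 1
assert issubclass(caught[0].category, DeprecationWarning)
```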
""" # Helpful links # Decorator for DeprecationWarning: https://stackoverflow.com/a/49802489 # Typing of stacked decorators: https://stackoverflow.com/a/68290080 def deco(func: Callable[_P, _R]) -> Callable[_P, _R]: """Decorator function.""" @functools.wraps(func) def wrapper(*args: _P.args, **kwargs: _P.kwargs) -> _R: """Emit DeprecationWarnings if conditions are met.""" keys = list(inspect.signature(func).parameters.keys()) for arg, type_annotation in arguments.items(): try: index = keys.index(arg) except ValueError: raise ValueError( f"Can't find argument '{arg}' for '{args[0].__class__.__qualname__}'" ) from None if ( # Check kwargs # - if found, check it's not None (arg in kwargs and kwargs[arg] is None) # Check args # - make sure not in kwargs # - len(args) needs to be long enough, if too short # arg can't be in args either # - args[index] should not be None or arg not in kwargs and ( index == -1 or len(args) <= index or (len(args) > index and args[index] is None) ) ): warnings.warn( f"'{arg}' will be a required argument for " f"'{args[0].__class__.__qualname__}.{func.__name__}'" f" in astroid {astroid_version} " f"('{arg}' should be of type: '{type_annotation}')", DeprecationWarning, stacklevel=2, ) return func(*args, **kwargs) return wrapper return deco def deprecate_arguments( astroid_version: str = "3.0", **arguments: str ) -> Callable[[Callable[_P, _R]], Callable[_P, _R]]: """Decorator which emits a DeprecationWarning if any arguments specified are passed. Arguments should be a key-value mapping, with the key being the argument to check and the value being a string that explains what to do instead of passing the argument. To improve performance, only used when DeprecationWarnings other than the default one are enabled. 
""" def deco(func: Callable[_P, _R]) -> Callable[_P, _R]: @functools.wraps(func) def wrapper(*args: _P.args, **kwargs: _P.kwargs) -> _R: keys = list(inspect.signature(func).parameters.keys()) for arg, note in arguments.items(): try: index = keys.index(arg) except ValueError: raise ValueError( f"Can't find argument '{arg}' for '{args[0].__class__.__qualname__}'" ) from None if arg in kwargs or len(args) > index: warnings.warn( f"The argument '{arg}' for " f"'{args[0].__class__.__qualname__}.{func.__name__}' is deprecated " f"and will be removed in astroid {astroid_version} ({note})", DeprecationWarning, stacklevel=2, ) return func(*args, **kwargs) return wrapper return deco else: def deprecate_default_argument_values( astroid_version: str = "3.0", **arguments: str ) -> Callable[[Callable[_P, _R]], Callable[_P, _R]]: """Passthrough decorator to improve performance if DeprecationWarnings are disabled. """ def deco(func: Callable[_P, _R]) -> Callable[_P, _R]: """Decorator function.""" return func return deco def deprecate_arguments( astroid_version: str = "3.0", **arguments: str ) -> Callable[[Callable[_P, _R]], Callable[_P, _R]]: """Passthrough decorator to improve performance if DeprecationWarnings are disabled. 
""" def deco(func: Callable[_P, _R]) -> Callable[_P, _R]: """Decorator function.""" return func return deco astroid-3.2.2/astroid/brain/0000775000175000017500000000000014622475517015646 5ustar epsilonepsilonastroid-3.2.2/astroid/brain/brain_boto3.py0000664000175000017500000000206014622475517020417 0ustar epsilonepsilon# Licensed under the LGPL: https://www.gnu.org/licenses/old-licenses/lgpl-2.1.en.html # For details: https://github.com/pylint-dev/astroid/blob/main/LICENSE # Copyright (c) https://github.com/pylint-dev/astroid/blob/main/CONTRIBUTORS.txt """Astroid hooks for understanding ``boto3.ServiceRequest()``.""" from astroid import extract_node from astroid.manager import AstroidManager from astroid.nodes.scoped_nodes import ClassDef BOTO_SERVICE_FACTORY_QUALIFIED_NAME = "boto3.resources.base.ServiceResource" def service_request_transform(node): """Transform ServiceResource to look like dynamic classes.""" code = """ def __getattr__(self, attr): return 0 """ func_getattr = extract_node(code) node.locals["__getattr__"] = [func_getattr] return node def _looks_like_boto3_service_request(node) -> bool: return node.qname() == BOTO_SERVICE_FACTORY_QUALIFIED_NAME def register(manager: AstroidManager) -> None: manager.register_transform( ClassDef, service_request_transform, _looks_like_boto3_service_request ) astroid-3.2.2/astroid/brain/brain_type.py0000664000175000017500000000476314622475517020366 0ustar epsilonepsilon# Licensed under the LGPL: https://www.gnu.org/licenses/old-licenses/lgpl-2.1.en.html # For details: https://github.com/pylint-dev/astroid/blob/main/LICENSE # Copyright (c) https://github.com/pylint-dev/astroid/blob/main/CONTRIBUTORS.txt """ Astroid hooks for type support. Starting from python3.9, type object behaves as it had __class_getitem__ method. However it was not possible to simply add this method inside type's body, otherwise all types would also have this method. In this case it would have been possible to write str[int]. 
Guido van Rossum proposed a hack to handle this in the interpreter:
https://github.com/python/cpython/blob/67e394562d67cbcd0ac8114e5439494e7645b8f5/Objects/abstract.c#L181-L184

This brain follows the same logic. It is not wise to permanently add the
__class_getitem__ method to the type object. Instead we choose to add it only
for a Subscript node whose value is the name ``type``. This way type[int]
is allowed whereas str[int] is not.

Thanks to Lukasz Langa for the fruitful discussion.
"""

from __future__ import annotations

from astroid import extract_node, inference_tip, nodes
from astroid.const import PY39_PLUS
from astroid.context import InferenceContext
from astroid.exceptions import UseInferenceDefault
from astroid.manager import AstroidManager


def _looks_like_type_subscript(node) -> bool:
    """
    Try to figure out if a Name node is used inside a type-related subscript.

    :param node: node to check
    :type node: astroid.nodes.node_classes.NodeNG
    :return: whether the node is a Name node inside a type-related subscript
    """
    if isinstance(node, nodes.Name) and isinstance(node.parent, nodes.Subscript):
        return node.name == "type"
    return False


def infer_type_sub(node, context: InferenceContext | None = None):
    """
    Infer a type[...] subscript.
:param node: node to infer :type node: astroid.nodes.node_classes.NodeNG :return: the inferred node :rtype: nodes.NodeNG """ node_scope, _ = node.scope().lookup("type") if not isinstance(node_scope, nodes.Module) or node_scope.qname() != "builtins": raise UseInferenceDefault() class_src = """ class type: def __class_getitem__(cls, key): return cls """ node = extract_node(class_src) return node.infer(context=context) def register(manager: AstroidManager) -> None: if PY39_PLUS: manager.register_transform( nodes.Name, inference_tip(infer_type_sub), _looks_like_type_subscript ) astroid-3.2.2/astroid/brain/brain_io.py0000664000175000017500000000306514622475517020006 0ustar epsilonepsilon# Licensed under the LGPL: https://www.gnu.org/licenses/old-licenses/lgpl-2.1.en.html # For details: https://github.com/pylint-dev/astroid/blob/main/LICENSE # Copyright (c) https://github.com/pylint-dev/astroid/blob/main/CONTRIBUTORS.txt """Astroid brain hints for some of the _io C objects.""" from astroid.manager import AstroidManager from astroid.nodes import ClassDef BUFFERED = {"BufferedWriter", "BufferedReader"} TextIOWrapper = "TextIOWrapper" FileIO = "FileIO" BufferedWriter = "BufferedWriter" def _generic_io_transform(node, name, cls): """Transform the given name, by adding the given *class* as a member of the node. """ io_module = AstroidManager().ast_from_module_name("_io") attribute_object = io_module[cls] instance = attribute_object.instantiate_class() node.locals[name] = [instance] def _transform_text_io_wrapper(node): # This is not always correct, since it can vary with the type of the descriptor, # being stdout, stderr or stdin. 
But we cannot get access to the name of the # stream, which is why we are using the BufferedWriter class as a default # value return _generic_io_transform(node, name="buffer", cls=BufferedWriter) def _transform_buffered(node): return _generic_io_transform(node, name="raw", cls=FileIO) def register(manager: AstroidManager) -> None: manager.register_transform( ClassDef, _transform_buffered, lambda node: node.name in BUFFERED ) manager.register_transform( ClassDef, _transform_text_io_wrapper, lambda node: node.name == TextIOWrapper ) astroid-3.2.2/astroid/brain/brain_typing.py0000664000175000017500000004112514622475517020710 0ustar epsilonepsilon# Licensed under the LGPL: https://www.gnu.org/licenses/old-licenses/lgpl-2.1.en.html # For details: https://github.com/pylint-dev/astroid/blob/main/LICENSE # Copyright (c) https://github.com/pylint-dev/astroid/blob/main/CONTRIBUTORS.txt """Astroid hooks for typing.py support.""" from __future__ import annotations import textwrap import typing from collections.abc import Iterator from functools import partial from typing import Final from astroid import context, extract_node, inference_tip from astroid.brain.helpers import register_module_extender from astroid.builder import AstroidBuilder, _extract_single_node from astroid.const import PY39_PLUS, PY312_PLUS from astroid.exceptions import ( AstroidSyntaxError, AttributeInferenceError, InferenceError, UseInferenceDefault, ) from astroid.manager import AstroidManager from astroid.nodes.node_classes import ( Assign, AssignName, Attribute, Call, Const, JoinedStr, Name, NodeNG, Subscript, Tuple, ) from astroid.nodes.scoped_nodes import ClassDef, FunctionDef TYPING_TYPEVARS = {"TypeVar", "NewType"} TYPING_TYPEVARS_QUALIFIED: Final = { "typing.TypeVar", "typing.NewType", "typing_extensions.TypeVar", } TYPING_TYPEDDICT_QUALIFIED: Final = {"typing.TypedDict", "typing_extensions.TypedDict"} TYPING_TYPE_TEMPLATE = """ class Meta(type): def __getitem__(self, item): return self @property 
def __args__(self): return () class {0}(metaclass=Meta): pass """ TYPING_MEMBERS = set(getattr(typing, "__all__", [])) TYPING_ALIAS = frozenset( ( "typing.Hashable", "typing.Awaitable", "typing.Coroutine", "typing.AsyncIterable", "typing.AsyncIterator", "typing.Iterable", "typing.Iterator", "typing.Reversible", "typing.Sized", "typing.Container", "typing.Collection", "typing.Callable", "typing.AbstractSet", "typing.MutableSet", "typing.Mapping", "typing.MutableMapping", "typing.Sequence", "typing.MutableSequence", "typing.ByteString", "typing.Tuple", "typing.List", "typing.Deque", "typing.Set", "typing.FrozenSet", "typing.MappingView", "typing.KeysView", "typing.ItemsView", "typing.ValuesView", "typing.ContextManager", "typing.AsyncContextManager", "typing.Dict", "typing.DefaultDict", "typing.OrderedDict", "typing.Counter", "typing.ChainMap", "typing.Generator", "typing.AsyncGenerator", "typing.Type", "typing.Pattern", "typing.Match", ) ) CLASS_GETITEM_TEMPLATE = """ @classmethod def __class_getitem__(cls, item): return cls """ def looks_like_typing_typevar_or_newtype(node) -> bool: func = node.func if isinstance(func, Attribute): return func.attrname in TYPING_TYPEVARS if isinstance(func, Name): return func.name in TYPING_TYPEVARS return False def infer_typing_typevar_or_newtype( node: Call, context_itton: context.InferenceContext | None = None ) -> Iterator[ClassDef]: """Infer a typing.TypeVar(...) or typing.NewType(...) 
call.""" try: func = next(node.func.infer(context=context_itton)) except (InferenceError, StopIteration) as exc: raise UseInferenceDefault from exc if func.qname() not in TYPING_TYPEVARS_QUALIFIED: raise UseInferenceDefault if not node.args: raise UseInferenceDefault # Cannot infer from a dynamic class name (f-string) if isinstance(node.args[0], JoinedStr): raise UseInferenceDefault typename = node.args[0].as_string().strip("'") try: node = extract_node(TYPING_TYPE_TEMPLATE.format(typename)) except AstroidSyntaxError as exc: raise InferenceError from exc return node.infer(context=context_itton) def _looks_like_typing_subscript(node) -> bool: """Try to figure out if a Subscript node *might* be a typing-related subscript.""" if isinstance(node, Name): return node.name in TYPING_MEMBERS if isinstance(node, Attribute): return node.attrname in TYPING_MEMBERS if isinstance(node, Subscript): return _looks_like_typing_subscript(node.value) return False def infer_typing_attr( node: Subscript, ctx: context.InferenceContext | None = None ) -> Iterator[ClassDef]: """Infer a typing.X[...] subscript.""" try: value = next(node.value.infer()) # type: ignore[union-attr] # value shouldn't be None for Subscript. except (InferenceError, StopIteration) as exc: raise UseInferenceDefault from exc if not value.qname().startswith("typing.") or value.qname() in TYPING_ALIAS: # If typing subscript belongs to an alias handle it separately. raise UseInferenceDefault if isinstance(value, ClassDef) and value.qname() in { "typing.Generic", "typing.Annotated", "typing_extensions.Annotated", }: # typing.Generic and typing.Annotated (PY39) are subscriptable # through __class_getitem__. 
Since astroid can't easily # infer the native methods, replace them for an easy inference tip func_to_add = _extract_single_node(CLASS_GETITEM_TEMPLATE) value.locals["__class_getitem__"] = [func_to_add] if ( isinstance(node.parent, ClassDef) and node in node.parent.bases and getattr(node.parent, "__cache", None) ): # node.parent.slots is evaluated and cached before the inference tip # is first applied. Remove the last result to allow a recalculation of slots cache = node.parent.__cache # type: ignore[attr-defined] # Unrecognized getattr if cache.get(node.parent.slots) is not None: del cache[node.parent.slots] # Avoid re-instantiating this class every time it's seen node._explicit_inference = lambda node, context: iter([value]) return iter([value]) node = extract_node(TYPING_TYPE_TEMPLATE.format(value.qname().split(".")[-1])) return node.infer(context=ctx) def _looks_like_generic_class_pep695(node: ClassDef) -> bool: """Check if class is using type parameter. Python 3.12+.""" return len(node.type_params) > 0 def infer_typing_generic_class_pep695( node: ClassDef, ctx: context.InferenceContext | None = None ) -> Iterator[ClassDef]: """Add __class_getitem__ for generic classes. 
Python 3.12+.""" func_to_add = _extract_single_node(CLASS_GETITEM_TEMPLATE) node.locals["__class_getitem__"] = [func_to_add] return iter([node]) def _looks_like_typedDict( # pylint: disable=invalid-name node: FunctionDef | ClassDef, ) -> bool: """Check if node is TypedDict FunctionDef.""" return node.qname() in TYPING_TYPEDDICT_QUALIFIED def infer_old_typedDict( # pylint: disable=invalid-name node: ClassDef, ctx: context.InferenceContext | None = None ) -> Iterator[ClassDef]: func_to_add = _extract_single_node("dict") node.locals["__call__"] = [func_to_add] return iter([node]) def infer_typedDict( # pylint: disable=invalid-name node: FunctionDef, ctx: context.InferenceContext | None = None ) -> Iterator[ClassDef]: """Replace TypedDict FunctionDef with ClassDef.""" class_def = ClassDef( name="TypedDict", lineno=node.lineno, col_offset=node.col_offset, parent=node.parent, end_lineno=node.end_lineno, end_col_offset=node.end_col_offset, ) class_def.postinit(bases=[extract_node("dict")], body=[], decorators=None) func_to_add = _extract_single_node("dict") class_def.locals["__call__"] = [func_to_add] return iter([class_def]) def _looks_like_typing_alias(node: Call) -> bool: """ Returns True if the node corresponds to a call to _alias function. For example : MutableSet = _alias(collections.abc.MutableSet, T) :param node: call node """ return ( isinstance(node.func, Name) # TODO: remove _DeprecatedGenericAlias when Py3.14 min and node.func.name in {"_alias", "_DeprecatedGenericAlias"} and ( # _alias function works also for builtins object such as list and dict isinstance(node.args[0], (Attribute, Name)) ) ) def _forbid_class_getitem_access(node: ClassDef) -> None: """Disable the access to __class_getitem__ method for the node in parameters.""" def full_raiser(origin_func, attr, *args, **kwargs): """ Raises an AttributeInferenceError in case of access to __class_getitem__ method. Otherwise, just call origin_func. 
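        A standalone sketch of this partial-based per-instance override; the
        ``FakeNode`` class is hypothetical, and plain AttributeError stands in
        for astroid's AttributeInferenceError:

```python
from functools import partial


class FakeNode:
    def getattr(self, attr):
        return [f"resolved {attr}"]


def full_raiser(origin_func, attr, *args, **kwargs):
    # Block one attribute, delegate everything else unchanged.
    if attr == "__class_getitem__":
        raise AttributeError("__class_getitem__ access is not allowed")
    return origin_func(attr, *args, **kwargs)


node = FakeNode()
# Rebind the *instance* attribute: the original bound method is frozen
# into the partial, the same shape as _forbid_class_getitem_access uses.
node.getattr = partial(full_raiser, node.getattr)

assert node.getattr("bases") == ["resolved bases"]
try:
    node.getattr("__class_getitem__")
    blocked = False
except AttributeError:
    blocked = True
assert blocked
```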
""" if attr == "__class_getitem__": raise AttributeInferenceError("__class_getitem__ access is not allowed") return origin_func(attr, *args, **kwargs) try: node.getattr("__class_getitem__") # If we are here, then we are sure to modify an object that does have # __class_getitem__ method (which origin is the protocol defined in # collections module) whereas the typing module considers it should not. # We do not want __class_getitem__ to be found in the classdef partial_raiser = partial(full_raiser, node.getattr) node.getattr = partial_raiser except AttributeInferenceError: pass def infer_typing_alias( node: Call, ctx: context.InferenceContext | None = None ) -> Iterator[ClassDef]: """ Infers the call to _alias function Insert ClassDef, with same name as aliased class, in mro to simulate _GenericAlias. :param node: call node :param context: inference context # TODO: evaluate if still necessary when Py3.12 is minimum """ if ( not isinstance(node.parent, Assign) or not len(node.parent.targets) == 1 or not isinstance(node.parent.targets[0], AssignName) ): raise UseInferenceDefault try: res = next(node.args[0].infer(context=ctx)) except StopIteration as e: raise InferenceError(node=node.args[0], context=ctx) from e assign_name = node.parent.targets[0] class_def = ClassDef( name=assign_name.name, lineno=assign_name.lineno, col_offset=assign_name.col_offset, parent=node.parent, end_lineno=assign_name.end_lineno, end_col_offset=assign_name.end_col_offset, ) if isinstance(res, ClassDef): # Only add `res` as base if it's a `ClassDef` # This isn't the case for `typing.Pattern` and `typing.Match` class_def.postinit(bases=[res], body=[], decorators=None) maybe_type_var = node.args[1] if ( not PY39_PLUS and not (isinstance(maybe_type_var, Tuple) and not maybe_type_var.elts) or PY39_PLUS and isinstance(maybe_type_var, Const) and maybe_type_var.value > 0 ): # If typing alias is subscriptable, add `__class_getitem__` to ClassDef func_to_add = 
_extract_single_node(CLASS_GETITEM_TEMPLATE) class_def.locals["__class_getitem__"] = [func_to_add] else: # If not, make sure that `__class_getitem__` access is forbidden. # This is an issue in cases where the aliased class implements it, # but the typing alias isn't subscriptable. E.g., `typing.ByteString` for PY39+ _forbid_class_getitem_access(class_def) # Avoid re-instantiating this class every time it's seen node._explicit_inference = lambda node, context: iter([class_def]) return iter([class_def]) def _looks_like_special_alias(node: Call) -> bool: """Return True if call is for Tuple or Callable alias. In PY37 and PY38 the call is to '_VariadicGenericAlias' with 'tuple' as first argument. In PY39+ it is replaced by a call to '_TupleType'. PY37: Tuple = _VariadicGenericAlias(tuple, (), inst=False, special=True) PY39: Tuple = _TupleType(tuple, -1, inst=False, name='Tuple') PY37: Callable = _VariadicGenericAlias(collections.abc.Callable, (), special=True) PY39: Callable = _CallableType(collections.abc.Callable, 2) """ return isinstance(node.func, Name) and ( not PY39_PLUS and node.func.name == "_VariadicGenericAlias" and ( isinstance(node.args[0], Name) and node.args[0].name == "tuple" or isinstance(node.args[0], Attribute) and node.args[0].as_string() == "collections.abc.Callable" ) or PY39_PLUS and ( node.func.name == "_TupleType" and isinstance(node.args[0], Name) and node.args[0].name == "tuple" or node.func.name == "_CallableType" and isinstance(node.args[0], Attribute) and node.args[0].as_string() == "collections.abc.Callable" ) ) def infer_special_alias( node: Call, ctx: context.InferenceContext | None = None ) -> Iterator[ClassDef]: """Infer call to tuple alias as new subscriptable class typing.Tuple.""" if not ( isinstance(node.parent, Assign) and len(node.parent.targets) == 1 and isinstance(node.parent.targets[0], AssignName) ): raise UseInferenceDefault try: res = next(node.args[0].infer(context=ctx)) except StopIteration as e: raise 
InferenceError(node=node.args[0], context=ctx) from e assign_name = node.parent.targets[0] class_def = ClassDef( name=assign_name.name, parent=node.parent, lineno=assign_name.lineno, col_offset=assign_name.col_offset, end_lineno=assign_name.end_lineno, end_col_offset=assign_name.end_col_offset, ) class_def.postinit(bases=[res], body=[], decorators=None) func_to_add = _extract_single_node(CLASS_GETITEM_TEMPLATE) class_def.locals["__class_getitem__"] = [func_to_add] # Avoid re-instantiating this class every time it's seen node._explicit_inference = lambda node, context: iter([class_def]) return iter([class_def]) def _looks_like_typing_cast(node: Call) -> bool: return isinstance(node, Call) and ( isinstance(node.func, Name) and node.func.name == "cast" or isinstance(node.func, Attribute) and node.func.attrname == "cast" ) def infer_typing_cast( node: Call, ctx: context.InferenceContext | None = None ) -> Iterator[NodeNG]: """Infer call to cast() returning same type as casted-from var.""" if not isinstance(node.func, (Name, Attribute)): raise UseInferenceDefault try: func = next(node.func.infer(context=ctx)) except (InferenceError, StopIteration) as exc: raise UseInferenceDefault from exc if ( not isinstance(func, FunctionDef) or func.qname() != "typing.cast" or len(node.args) != 2 ): raise UseInferenceDefault return node.args[1].infer(context=ctx) def _typing_transform(): return AstroidBuilder(AstroidManager()).string_build( textwrap.dedent( """ class Generic: @classmethod def __class_getitem__(cls, item): return cls class ParamSpec: @property def args(self): return ParamSpecArgs(self) @property def kwargs(self): return ParamSpecKwargs(self) class ParamSpecArgs: ... class ParamSpecKwargs: ... class TypeAlias: ... class Type: @classmethod def __class_getitem__(cls, item): return cls class TypeVar: @classmethod def __class_getitem__(cls, item): return cls class TypeVarTuple: ... 
""" ) ) def register(manager: AstroidManager) -> None: manager.register_transform( Call, inference_tip(infer_typing_typevar_or_newtype), looks_like_typing_typevar_or_newtype, ) manager.register_transform( Subscript, inference_tip(infer_typing_attr), _looks_like_typing_subscript ) manager.register_transform( Call, inference_tip(infer_typing_cast), _looks_like_typing_cast ) if PY39_PLUS: manager.register_transform( FunctionDef, inference_tip(infer_typedDict), _looks_like_typedDict ) else: manager.register_transform( ClassDef, inference_tip(infer_old_typedDict), _looks_like_typedDict ) manager.register_transform( Call, inference_tip(infer_typing_alias), _looks_like_typing_alias ) manager.register_transform( Call, inference_tip(infer_special_alias), _looks_like_special_alias ) if PY312_PLUS: register_module_extender(manager, "typing", _typing_transform) manager.register_transform( ClassDef, inference_tip(infer_typing_generic_class_pep695), _looks_like_generic_class_pep695, ) astroid-3.2.2/astroid/brain/brain_responses.py0000664000175000017500000000360014622475517021413 0ustar epsilonepsilon# Licensed under the LGPL: https://www.gnu.org/licenses/old-licenses/lgpl-2.1.en.html # For details: https://github.com/pylint-dev/astroid/blob/main/LICENSE # Copyright (c) https://github.com/pylint-dev/astroid/blob/main/CONTRIBUTORS.txt """ Astroid hooks for responses. It might need to be manually updated from the public methods of :class:`responses.RequestsMock`. 
See: https://github.com/getsentry/responses/blob/master/responses.py """ from astroid.brain.helpers import register_module_extender from astroid.builder import parse from astroid.manager import AstroidManager def responses_funcs(): return parse( """ DELETE = "DELETE" GET = "GET" HEAD = "HEAD" OPTIONS = "OPTIONS" PATCH = "PATCH" POST = "POST" PUT = "PUT" response_callback = None def reset(): return def add( method=None, # method or ``Response`` url=None, body="", adding_headers=None, *args, **kwargs ): return def add_passthru(prefix): return def remove(method_or_response=None, url=None): return def replace(method_or_response=None, url=None, body="", *args, **kwargs): return def add_callback( method, url, callback, match_querystring=False, content_type="text/plain" ): return calls = [] def __enter__(): return def __exit__(type, value, traceback): success = type is None return success def activate(func): return func def start(): return def stop(allow_assert=True): return """ ) def register(manager: AstroidManager) -> None: register_module_extender(manager, "responses", responses_funcs) astroid-3.2.2/astroid/brain/brain_numpy_core_multiarray.py0000664000175000017500000001046014622475517024025 0ustar epsilonepsilon# Licensed under the LGPL: https://www.gnu.org/licenses/old-licenses/lgpl-2.1.en.html # For details: https://github.com/pylint-dev/astroid/blob/main/LICENSE # Copyright (c) https://github.com/pylint-dev/astroid/blob/main/CONTRIBUTORS.txt """Astroid hooks for numpy.core.multiarray module.""" import functools from astroid.brain.brain_numpy_utils import ( attribute_looks_like_numpy_member, infer_numpy_member, name_looks_like_numpy_member, ) from astroid.brain.helpers import register_module_extender from astroid.builder import parse from astroid.inference_tip import inference_tip from astroid.manager import AstroidManager from astroid.nodes.node_classes import Attribute, Name def numpy_core_multiarray_transform(): return parse( """ # different functions defined in 
multiarray.py def inner(a, b): return numpy.ndarray([0, 0]) def vdot(a, b): return numpy.ndarray([0, 0]) """ ) METHODS_TO_BE_INFERRED = { "array": """def array(object, dtype=None, copy=True, order='K', subok=False, ndmin=0): return numpy.ndarray([0, 0])""", "dot": """def dot(a, b, out=None): return numpy.ndarray([0, 0])""", "empty_like": """def empty_like(a, dtype=None, order='K', subok=True): return numpy.ndarray((0, 0))""", "concatenate": """def concatenate(arrays, axis=None, out=None): return numpy.ndarray((0, 0))""", "where": """def where(condition, x=None, y=None): return numpy.ndarray([0, 0])""", "empty": """def empty(shape, dtype=float, order='C'): return numpy.ndarray([0, 0])""", "bincount": """def bincount(x, weights=None, minlength=0): return numpy.ndarray([0, 0])""", "busday_count": """def busday_count( begindates, enddates, weekmask='1111100', holidays=[], busdaycal=None, out=None ): return numpy.ndarray([0, 0])""", "busday_offset": """def busday_offset( dates, offsets, roll='raise', weekmask='1111100', holidays=None, busdaycal=None, out=None ): return numpy.ndarray([0, 0])""", "can_cast": """def can_cast(from_, to, casting='safe'): return True""", "copyto": """def copyto(dst, src, casting='same_kind', where=True): return None""", "datetime_as_string": """def datetime_as_string(arr, unit=None, timezone='naive', casting='same_kind'): return numpy.ndarray([0, 0])""", "is_busday": """def is_busday(dates, weekmask='1111100', holidays=None, busdaycal=None, out=None): return numpy.ndarray([0, 0])""", "lexsort": """def lexsort(keys, axis=-1): return numpy.ndarray([0, 0])""", "may_share_memory": """def may_share_memory(a, b, max_work=None): return True""", # Not yet available because dtype is not yet present in those brains # "min_scalar_type": """def min_scalar_type(a): # return numpy.dtype('int16')""", "packbits": """def packbits(a, axis=None, bitorder='big'): return numpy.ndarray([0, 0])""", # Not yet available because dtype is not yet present in those 
brains # "result_type": """def result_type(*arrays_and_dtypes): # return numpy.dtype('int16')""", "shares_memory": """def shares_memory(a, b, max_work=None): return True""", "unpackbits": """def unpackbits(a, axis=None, count=None, bitorder='big'): return numpy.ndarray([0, 0])""", "unravel_index": """def unravel_index(indices, shape, order='C'): return (numpy.ndarray([0, 0]),)""", "zeros": """def zeros(shape, dtype=float, order='C'): return numpy.ndarray([0, 0])""", } def register(manager: AstroidManager) -> None: register_module_extender( manager, "numpy.core.multiarray", numpy_core_multiarray_transform ) for method_name, function_src in METHODS_TO_BE_INFERRED.items(): inference_function = functools.partial(infer_numpy_member, function_src) manager.register_transform( Attribute, inference_tip(inference_function), functools.partial(attribute_looks_like_numpy_member, method_name), ) manager.register_transform( Name, inference_tip(inference_function), functools.partial(name_looks_like_numpy_member, method_name), ) astroid-3.2.2/astroid/brain/brain_numpy_core_einsumfunc.py0000664000175000017500000000156514622475517024016 0ustar epsilonepsilon# Licensed under the LGPL: https://www.gnu.org/licenses/old-licenses/lgpl-2.1.en.html # For details: https://github.com/pylint-dev/astroid/blob/main/LICENSE # Copyright (c) https://github.com/pylint-dev/astroid/blob/main/CONTRIBUTORS.txt """ Astroid hooks for numpy.core.einsumfunc module: https://github.com/numpy/numpy/blob/main/numpy/core/einsumfunc.py. 
""" from astroid import nodes from astroid.brain.helpers import register_module_extender from astroid.builder import parse from astroid.manager import AstroidManager def numpy_core_einsumfunc_transform() -> nodes.Module: return parse( """ def einsum(*operands, out=None, optimize=False, **kwargs): return numpy.ndarray([0, 0]) """ ) def register(manager: AstroidManager) -> None: register_module_extender( manager, "numpy.core.einsumfunc", numpy_core_einsumfunc_transform ) astroid-3.2.2/astroid/brain/brain_datetime.py0000664000175000017500000000140314622475517021165 0ustar epsilonepsilon# Licensed under the LGPL: https://www.gnu.org/licenses/old-licenses/lgpl-2.1.en.html # For details: https://github.com/pylint-dev/astroid/blob/main/LICENSE # Copyright (c) https://github.com/pylint-dev/astroid/blob/main/CONTRIBUTORS.txt from astroid.brain.helpers import register_module_extender from astroid.builder import AstroidBuilder from astroid.const import PY312_PLUS from astroid.manager import AstroidManager def datetime_transform(): """The datetime module was C-accelerated in Python 3.12, so use the Python source.""" return AstroidBuilder(AstroidManager()).string_build("from _pydatetime import *") def register(manager: AstroidManager) -> None: if PY312_PLUS: register_module_extender(manager, "datetime", datetime_transform) astroid-3.2.2/astroid/brain/brain_scipy_signal.py0000775000175000017500000000443014622475517022063 0ustar epsilonepsilon# Licensed under the LGPL: https://www.gnu.org/licenses/old-licenses/lgpl-2.1.en.html # For details: https://github.com/pylint-dev/astroid/blob/main/LICENSE # Copyright (c) https://github.com/pylint-dev/astroid/blob/main/CONTRIBUTORS.txt """Astroid hooks for scipy.signal module.""" from astroid.brain.helpers import register_module_extender from astroid.builder import parse from astroid.manager import AstroidManager def scipy_signal(): return parse( """ # different functions defined in scipy.signals def barthann(M, sym=True): return 
numpy.ndarray([0]) def bartlett(M, sym=True): return numpy.ndarray([0]) def blackman(M, sym=True): return numpy.ndarray([0]) def blackmanharris(M, sym=True): return numpy.ndarray([0]) def bohman(M, sym=True): return numpy.ndarray([0]) def boxcar(M, sym=True): return numpy.ndarray([0]) def chebwin(M, at, sym=True): return numpy.ndarray([0]) def cosine(M, sym=True): return numpy.ndarray([0]) def exponential(M, center=None, tau=1.0, sym=True): return numpy.ndarray([0]) def flattop(M, sym=True): return numpy.ndarray([0]) def gaussian(M, std, sym=True): return numpy.ndarray([0]) def general_gaussian(M, p, sig, sym=True): return numpy.ndarray([0]) def hamming(M, sym=True): return numpy.ndarray([0]) def hann(M, sym=True): return numpy.ndarray([0]) def hanning(M, sym=True): return numpy.ndarray([0]) def impulse2(system, X0=None, T=None, N=None, **kwargs): return numpy.ndarray([0]), numpy.ndarray([0]) def kaiser(M, beta, sym=True): return numpy.ndarray([0]) def nuttall(M, sym=True): return numpy.ndarray([0]) def parzen(M, sym=True): return numpy.ndarray([0]) def slepian(M, width, sym=True): return numpy.ndarray([0]) def step2(system, X0=None, T=None, N=None, **kwargs): return numpy.ndarray([0]), numpy.ndarray([0]) def triang(M, sym=True): return numpy.ndarray([0]) def tukey(M, alpha=0.5, sym=True): return numpy.ndarray([0]) """ ) def register(manager: AstroidManager) -> None: register_module_extender(manager, "scipy.signal", scipy_signal) astroid-3.2.2/astroid/brain/brain_six.py0000664000175000017500000001676214622475517020212 0ustar epsilonepsilon# Licensed under the LGPL: https://www.gnu.org/licenses/old-licenses/lgpl-2.1.en.html # For details: https://github.com/pylint-dev/astroid/blob/main/LICENSE # Copyright (c) https://github.com/pylint-dev/astroid/blob/main/CONTRIBUTORS.txt """Astroid hooks for six module.""" from textwrap import dedent from astroid import nodes from astroid.brain.helpers import register_module_extender from astroid.builder import AstroidBuilder from 
astroid.exceptions import ( AstroidBuildingError, AttributeInferenceError, InferenceError, ) from astroid.manager import AstroidManager SIX_ADD_METACLASS = "six.add_metaclass" SIX_WITH_METACLASS = "six.with_metaclass" def default_predicate(line): return line.strip() def _indent(text, prefix, predicate=default_predicate) -> str: """Adds 'prefix' to the beginning of selected lines in 'text'. If 'predicate' is provided, 'prefix' will only be added to the lines where 'predicate(line)' is True. If 'predicate' is not provided, it will default to adding 'prefix' to all non-empty lines that do not consist solely of whitespace characters. """ def prefixed_lines(): for line in text.splitlines(True): yield prefix + line if predicate(line) else line return "".join(prefixed_lines()) _IMPORTS = """ import _io cStringIO = _io.StringIO filter = filter from itertools import filterfalse input = input from sys import intern map = map range = range from importlib import reload reload_module = lambda module: reload(module) from functools import reduce from shlex import quote as shlex_quote from io import StringIO from collections import UserDict, UserList, UserString xrange = range zip = zip from itertools import zip_longest import builtins import configparser import copyreg import _dummy_thread import http.cookiejar as http_cookiejar import http.cookies as http_cookies import html.entities as html_entities import html.parser as html_parser import http.client as http_client import http.server as http_server BaseHTTPServer = CGIHTTPServer = SimpleHTTPServer = http.server import pickle as cPickle import queue import reprlib import socketserver import _thread import winreg import xmlrpc.server as xmlrpc_server import xmlrpc.client as xmlrpc_client import urllib.robotparser as urllib_robotparser import email.mime.multipart as email_mime_multipart import email.mime.nonmultipart as email_mime_nonmultipart import email.mime.text as email_mime_text import email.mime.base as email_mime_base 
import urllib.parse as urllib_parse import urllib.error as urllib_error import tkinter import tkinter.dialog as tkinter_dialog import tkinter.filedialog as tkinter_filedialog import tkinter.scrolledtext as tkinter_scrolledtext import tkinter.simpledialog as tkinter_simpledialog import tkinter.tix as tkinter_tix import tkinter.ttk as tkinter_ttk import tkinter.constants as tkinter_constants import tkinter.dnd as tkinter_dnd import tkinter.colorchooser as tkinter_colorchooser import tkinter.commondialog as tkinter_commondialog import tkinter.filedialog as tkinter_tkfiledialog import tkinter.font as tkinter_font import tkinter.messagebox as tkinter_messagebox import urllib import urllib.request as urllib_request import urllib.robotparser as urllib_robotparser import urllib.parse as urllib_parse import urllib.error as urllib_error """ def six_moves_transform(): code = dedent( """ class Moves(object): {} moves = Moves() """ ).format(_indent(_IMPORTS, " ")) module = AstroidBuilder(AstroidManager()).string_build(code) module.name = "six.moves" return module def _six_fail_hook(modname): """Fix six.moves imports due to the dynamic nature of this class.
Construct a pseudo-module which contains all the necessary imports for six :param modname: Name of failed module :type modname: str :return: An astroid module :rtype: nodes.Module """ attribute_of = modname != "six.moves" and modname.startswith("six.moves") if modname != "six.moves" and not attribute_of: raise AstroidBuildingError(modname=modname) module = AstroidBuilder(AstroidManager()).string_build(_IMPORTS) module.name = "six.moves" if attribute_of: # Facilitate import of submodules in Moves start_index = len(module.name) attribute = modname[start_index:].lstrip(".").replace(".", "_") try: import_attr = module.getattr(attribute)[0] except AttributeInferenceError as exc: raise AstroidBuildingError(modname=modname) from exc if isinstance(import_attr, nodes.Import): submodule = AstroidManager().ast_from_module_name(import_attr.names[0][0]) return submodule # Let dummy submodule imports pass through # This will cause an Uninferable result, which is okay return module def _looks_like_decorated_with_six_add_metaclass(node) -> bool: if not node.decorators: return False for decorator in node.decorators.nodes: if not isinstance(decorator, nodes.Call): continue if decorator.func.as_string() == SIX_ADD_METACLASS: return True return False def transform_six_add_metaclass(node): # pylint: disable=inconsistent-return-statements """Check if the given class node is decorated with *six.add_metaclass*. If so, inject its argument as the metaclass of the underlying class. 
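The runtime behavior being modeled can be sketched with a simplified stand-in for ``six.add_metaclass`` (all names below are illustrative, not six's actual implementation, which also handles ``__slots__``):

```python
class Meta(type):
    # Metaclass that records every class it creates.
    created = []

    def __new__(mcs, name, bases, namespace):
        cls = super().__new__(mcs, name, bases, namespace)
        Meta.created.append(name)
        return cls


def add_metaclass(metaclass):
    # Simplified stand-in: rebuild the decorated class under the
    # requested metaclass.
    def wrapper(cls):
        namespace = dict(cls.__dict__)
        # These descriptors belong to the old class and must not be copied.
        namespace.pop("__dict__", None)
        namespace.pop("__weakref__", None)
        return metaclass(cls.__name__, cls.__bases__, namespace)

    return wrapper


@add_metaclass(Meta)
class Widget:
    pass


print(type(Widget).__name__)  # Meta
```

This is exactly the effect the transform injects into the inferred tree: the decorator's first argument becomes the class's ``_metaclass``.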
""" if not node.decorators: return for decorator in node.decorators.nodes: if not isinstance(decorator, nodes.Call): continue try: func = next(decorator.func.infer()) except (InferenceError, StopIteration): continue if func.qname() == SIX_ADD_METACLASS and decorator.args: metaclass = decorator.args[0] node._metaclass = metaclass return node return def _looks_like_nested_from_six_with_metaclass(node) -> bool: if len(node.bases) != 1: return False base = node.bases[0] if not isinstance(base, nodes.Call): return False try: if hasattr(base.func, "expr"): # format when explicit 'six.with_metaclass' is used mod = base.func.expr.name func = base.func.attrname func = f"{mod}.{func}" else: # format when 'with_metaclass' is used directly (local import from six) # check reference module to avoid 'with_metaclass' name clashes mod = base.parent.parent import_from = mod.locals["with_metaclass"][0] func = f"{import_from.modname}.{base.func.name}" except (AttributeError, KeyError, IndexError): return False return func == SIX_WITH_METACLASS def transform_six_with_metaclass(node): """Check if the given class node is defined with *six.with_metaclass*. If so, inject its argument as the metaclass of the underlying class. 
""" call = node.bases[0] node._metaclass = call.args[0] return node def register(manager: AstroidManager) -> None: register_module_extender(manager, "six", six_moves_transform) register_module_extender( manager, "requests.packages.urllib3.packages.six", six_moves_transform ) manager.register_failed_import_hook(_six_fail_hook) manager.register_transform( nodes.ClassDef, transform_six_add_metaclass, _looks_like_decorated_with_six_add_metaclass, ) manager.register_transform( nodes.ClassDef, transform_six_with_metaclass, _looks_like_nested_from_six_with_metaclass, ) astroid-3.2.2/astroid/brain/brain_threading.py0000664000175000017500000000163214622475517021342 0ustar epsilonepsilon# Licensed under the LGPL: https://www.gnu.org/licenses/old-licenses/lgpl-2.1.en.html # For details: https://github.com/pylint-dev/astroid/blob/main/LICENSE # Copyright (c) https://github.com/pylint-dev/astroid/blob/main/CONTRIBUTORS.txt from astroid.brain.helpers import register_module_extender from astroid.builder import parse from astroid.manager import AstroidManager def _thread_transform(): return parse( """ class lock(object): def acquire(self, blocking=True, timeout=-1): return False def release(self): pass def __enter__(self): return True def __exit__(self, *args): pass def locked(self): return False def Lock(*args, **kwargs): return lock() """ ) def register(manager: AstroidManager) -> None: register_module_extender(manager, "threading", _thread_transform) astroid-3.2.2/astroid/brain/brain_numpy_core_fromnumeric.py0000664000175000017500000000143014622475517024157 0ustar epsilonepsilon# Licensed under the LGPL: https://www.gnu.org/licenses/old-licenses/lgpl-2.1.en.html # For details: https://github.com/pylint-dev/astroid/blob/main/LICENSE # Copyright (c) https://github.com/pylint-dev/astroid/blob/main/CONTRIBUTORS.txt """Astroid hooks for numpy.core.fromnumeric module.""" from astroid.brain.helpers import register_module_extender from astroid.builder import parse from astroid.manager 
import AstroidManager def numpy_core_fromnumeric_transform(): return parse( """ def sum(a, axis=None, dtype=None, out=None, keepdims=None, initial=None): return numpy.ndarray([0, 0]) """ ) def register(manager: AstroidManager) -> None: register_module_extender( manager, "numpy.core.fromnumeric", numpy_core_fromnumeric_transform ) astroid-3.2.2/astroid/brain/brain_pytest.py0000664000175000017500000000433614622475517020731 0ustar epsilonepsilon# Licensed under the LGPL: https://www.gnu.org/licenses/old-licenses/lgpl-2.1.en.html # For details: https://github.com/pylint-dev/astroid/blob/main/LICENSE # Copyright (c) https://github.com/pylint-dev/astroid/blob/main/CONTRIBUTORS.txt """Astroid hooks for pytest.""" from astroid.brain.helpers import register_module_extender from astroid.builder import AstroidBuilder from astroid.manager import AstroidManager def pytest_transform(): return AstroidBuilder(AstroidManager()).string_build( """ try: import _pytest.mark import _pytest.recwarn import _pytest.runner import _pytest.python import _pytest.skipping import _pytest.assertion except ImportError: pass else: deprecated_call = _pytest.recwarn.deprecated_call warns = _pytest.recwarn.warns exit = _pytest.runner.exit fail = _pytest.runner.fail skip = _pytest.runner.skip importorskip = _pytest.runner.importorskip xfail = _pytest.skipping.xfail mark = _pytest.mark.MarkGenerator() raises = _pytest.python.raises # New in pytest 3.0 try: approx = _pytest.python.approx register_assert_rewrite = _pytest.assertion.register_assert_rewrite except AttributeError: pass # Moved in pytest 3.0 try: import _pytest.freeze_support freeze_includes = _pytest.freeze_support.freeze_includes except ImportError: try: import _pytest.genscript freeze_includes = _pytest.genscript.freeze_includes except ImportError: pass try: import _pytest.debugging set_trace = _pytest.debugging.pytestPDB().set_trace except ImportError: try: import _pytest.pdb set_trace = _pytest.pdb.pytestPDB().set_trace except 
ImportError: pass try: import _pytest.fixtures fixture = _pytest.fixtures.fixture yield_fixture = _pytest.fixtures.yield_fixture except ImportError: try: import _pytest.python fixture = _pytest.python.fixture yield_fixture = _pytest.python.yield_fixture except ImportError: pass """ ) def register(manager: AstroidManager) -> None: register_module_extender(manager, "pytest", pytest_transform) register_module_extender(manager, "py.test", pytest_transform) astroid-3.2.2/astroid/brain/helpers.py0000664000175000017500000001041414622475517017662 0ustar epsilonepsilon# Licensed under the LGPL: https://www.gnu.org/licenses/old-licenses/lgpl-2.1.en.html # For details: https://github.com/pylint-dev/astroid/blob/main/LICENSE # Copyright (c) https://github.com/pylint-dev/astroid/blob/main/CONTRIBUTORS.txt from __future__ import annotations from collections.abc import Callable from astroid.manager import AstroidManager from astroid.nodes.scoped_nodes import Module def register_module_extender( manager: AstroidManager, module_name: str, get_extension_mod: Callable[[], Module] ) -> None: def transform(node: Module) -> None: extension_module = get_extension_mod() for name, objs in extension_module.locals.items(): node.locals[name] = objs for obj in objs: if obj.parent is extension_module: obj.parent = node manager.register_transform(Module, transform, lambda n: n.name == module_name) # pylint: disable-next=too-many-locals def register_all_brains(manager: AstroidManager) -> None: from astroid.brain import ( # pylint: disable=import-outside-toplevel brain_argparse, brain_attrs, brain_boto3, brain_builtin_inference, brain_collections, brain_crypt, brain_ctypes, brain_curses, brain_dataclasses, brain_datetime, brain_dateutil, brain_fstrings, brain_functools, brain_gi, brain_hashlib, brain_http, brain_hypothesis, brain_io, brain_mechanize, brain_multiprocessing, brain_namedtuple_enum, brain_nose, brain_numpy_core_einsumfunc, brain_numpy_core_fromnumeric, brain_numpy_core_function_base, 
brain_numpy_core_multiarray, brain_numpy_core_numeric, brain_numpy_core_numerictypes, brain_numpy_core_umath, brain_numpy_ma, brain_numpy_ndarray, brain_numpy_random_mtrand, brain_pathlib, brain_pkg_resources, brain_pytest, brain_qt, brain_random, brain_re, brain_regex, brain_responses, brain_scipy_signal, brain_signal, brain_six, brain_sqlalchemy, brain_ssl, brain_subprocess, brain_threading, brain_type, brain_typing, brain_unittest, brain_uuid, ) brain_argparse.register(manager) brain_attrs.register(manager) brain_boto3.register(manager) brain_builtin_inference.register(manager) brain_collections.register(manager) brain_crypt.register(manager) brain_ctypes.register(manager) brain_curses.register(manager) brain_dataclasses.register(manager) brain_datetime.register(manager) brain_dateutil.register(manager) brain_fstrings.register(manager) brain_functools.register(manager) brain_gi.register(manager) brain_hashlib.register(manager) brain_http.register(manager) brain_hypothesis.register(manager) brain_io.register(manager) brain_mechanize.register(manager) brain_multiprocessing.register(manager) brain_namedtuple_enum.register(manager) brain_nose.register(manager) brain_numpy_core_einsumfunc.register(manager) brain_numpy_core_fromnumeric.register(manager) brain_numpy_core_function_base.register(manager) brain_numpy_core_multiarray.register(manager) brain_numpy_core_numerictypes.register(manager) brain_numpy_core_umath.register(manager) brain_numpy_random_mtrand.register(manager) brain_numpy_ma.register(manager) brain_numpy_ndarray.register(manager) brain_numpy_core_numeric.register(manager) brain_pathlib.register(manager) brain_pkg_resources.register(manager) brain_pytest.register(manager) brain_qt.register(manager) brain_random.register(manager) brain_re.register(manager) brain_regex.register(manager) brain_responses.register(manager) brain_scipy_signal.register(manager) brain_signal.register(manager) brain_six.register(manager) brain_sqlalchemy.register(manager) 
brain_ssl.register(manager) brain_subprocess.register(manager) brain_threading.register(manager) brain_type.register(manager) brain_typing.register(manager) brain_unittest.register(manager) brain_uuid.register(manager) astroid-3.2.2/astroid/brain/brain_multiprocessing.py0000664000175000017500000000627414622475517022633 0ustar epsilonepsilon# Licensed under the LGPL: https://www.gnu.org/licenses/old-licenses/lgpl-2.1.en.html # For details: https://github.com/pylint-dev/astroid/blob/main/LICENSE # Copyright (c) https://github.com/pylint-dev/astroid/blob/main/CONTRIBUTORS.txt from astroid.bases import BoundMethod from astroid.brain.helpers import register_module_extender from astroid.builder import parse from astroid.exceptions import InferenceError from astroid.manager import AstroidManager from astroid.nodes.scoped_nodes import FunctionDef def _multiprocessing_transform(): module = parse( """ from multiprocessing.managers import SyncManager def Manager(): return SyncManager() """ ) # Multiprocessing uses a getattr lookup inside contexts, # in order to get the attributes they need. Since it's extremely # dynamic, we use this approach to fake it. node = parse( """ from multiprocessing.context import DefaultContext, BaseContext default = DefaultContext() base = BaseContext() """ ) try: context = next(node["default"].infer()) base = next(node["base"].infer()) except (InferenceError, StopIteration): return module for node in (context, base): for key, value in node.locals.items(): if key.startswith("_"): continue value = value[0] if isinstance(value, FunctionDef): # We need to rebound this, since otherwise # it will have an extra argument (self). 
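# The binding issue described above is ordinary Python descriptor behavior,
# which a plain stdlib example makes concrete (names here are illustrative):

```python
class Context:
    def Process(self, *args, **kwargs):
        return ("Process", args)


# Accessed through the class, the function is unbound and still
# expects an explicit first argument:
print(Context.Process(Context(), "work"))  # ('Process', ('work',))

# Accessed through an instance, Python binds it, so no extra argument
# is needed -- the behavior BoundMethod reproduces for inference.
print(Context().Process("work"))  # ('Process', ('work',))
```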
value = BoundMethod(value, node) module[key] = value return module def _multiprocessing_managers_transform(): return parse( """ import array import threading import multiprocessing.pool as pool import queue class Namespace(object): pass class Value(object): def __init__(self, typecode, value, lock=True): self._typecode = typecode self._value = value def get(self): return self._value def set(self, value): self._value = value def __repr__(self): return '%s(%r, %r)'%(type(self).__name__, self._typecode, self._value) value = property(get, set) def Array(typecode, sequence, lock=True): return array.array(typecode, sequence) class SyncManager(object): Queue = JoinableQueue = queue.Queue Event = threading.Event RLock = threading.RLock Lock = threading.Lock BoundedSemaphore = threading.BoundedSemaphore Condition = threading.Condition Barrier = threading.Barrier Pool = pool.Pool list = list dict = dict Value = Value Array = Array Namespace = Namespace __enter__ = lambda self: self __exit__ = lambda *args: args def start(self, initializer=None, initargs=None): pass def shutdown(self): pass """ ) def register(manager: AstroidManager) -> None: register_module_extender( manager, "multiprocessing.managers", _multiprocessing_managers_transform ) register_module_extender(manager, "multiprocessing", _multiprocessing_transform) astroid-3.2.2/astroid/brain/brain_numpy_core_numeric.py0000664000175000017500000000331414622475517023276 0ustar epsilonepsilon# Licensed under the LGPL: https://www.gnu.org/licenses/old-licenses/lgpl-2.1.en.html # For details: https://github.com/pylint-dev/astroid/blob/main/LICENSE # Copyright (c) https://github.com/pylint-dev/astroid/blob/main/CONTRIBUTORS.txt """Astroid hooks for numpy.core.numeric module.""" import functools from astroid.brain.brain_numpy_utils import ( attribute_looks_like_numpy_member, infer_numpy_member, ) from astroid.brain.helpers import register_module_extender from astroid.builder import parse from astroid.inference_tip import 
inference_tip from astroid.manager import AstroidManager from astroid.nodes.node_classes import Attribute def numpy_core_numeric_transform(): return parse( """ # different functions defined in numeric.py import numpy def zeros_like(a, dtype=None, order='K', subok=True, shape=None): return numpy.ndarray((0, 0)) def ones_like(a, dtype=None, order='K', subok=True, shape=None): return numpy.ndarray((0, 0)) def full_like(a, fill_value, dtype=None, order='K', subok=True, shape=None): return numpy.ndarray((0, 0)) """ ) METHODS_TO_BE_INFERRED = { "ones": """def ones(shape, dtype=None, order='C'): return numpy.ndarray([0, 0])""" } def register(manager: AstroidManager) -> None: register_module_extender( manager, "numpy.core.numeric", numpy_core_numeric_transform ) for method_name, function_src in METHODS_TO_BE_INFERRED.items(): inference_function = functools.partial(infer_numpy_member, function_src) manager.register_transform( Attribute, inference_tip(inference_function), functools.partial(attribute_looks_like_numpy_member, method_name), ) astroid-3.2.2/astroid/brain/brain_argparse.py0000664000175000017500000000342514622475517021203 0ustar epsilonepsilon# Licensed under the LGPL: https://www.gnu.org/licenses/old-licenses/lgpl-2.1.en.html # For details: https://github.com/pylint-dev/astroid/blob/main/LICENSE # Copyright (c) https://github.com/pylint-dev/astroid/blob/main/CONTRIBUTORS.txt from __future__ import annotations from astroid import arguments, inference_tip, nodes from astroid.context import InferenceContext from astroid.exceptions import UseInferenceDefault from astroid.manager import AstroidManager def infer_namespace(node, context: InferenceContext | None = None): callsite = arguments.CallSite.from_call(node, context=context) if not callsite.keyword_arguments: # Cannot make sense of it. 
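# What infer_namespace models is the plain runtime behavior of
# argparse.Namespace, which stores its keyword arguments as instance
# attributes -- hence one fake attribute node per keyword argument:

```python
import argparse

ns = argparse.Namespace(user="alice", retries=3)
print(ns.user)     # alice
print(ns.retries)  # 3
```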
raise UseInferenceDefault() class_node = nodes.ClassDef( "Namespace", lineno=node.lineno, col_offset=node.col_offset, parent=nodes.Unknown(), end_lineno=node.end_lineno, end_col_offset=node.end_col_offset, ) # Set parent manually until ClassDef constructor fixed: # https://github.com/pylint-dev/astroid/issues/1490 class_node.parent = node.parent for attr in set(callsite.keyword_arguments): fake_node = nodes.EmptyNode() fake_node.parent = class_node fake_node.attrname = attr class_node.instance_attrs[attr] = [fake_node] return iter((class_node.instantiate_class(),)) def _looks_like_namespace(node) -> bool: func = node.func if isinstance(func, nodes.Attribute): return ( func.attrname == "Namespace" and isinstance(func.expr, nodes.Name) and func.expr.name == "argparse" ) return False def register(manager: AstroidManager) -> None: manager.register_transform( nodes.Call, inference_tip(infer_namespace), _looks_like_namespace ) astroid-3.2.2/astroid/brain/brain_sqlalchemy.py0000664000175000017500000000204514622475517021536 0ustar epsilonepsilon# Licensed under the LGPL: https://www.gnu.org/licenses/old-licenses/lgpl-2.1.en.html # For details: https://github.com/pylint-dev/astroid/blob/main/LICENSE # Copyright (c) https://github.com/pylint-dev/astroid/blob/main/CONTRIBUTORS.txt from astroid.brain.helpers import register_module_extender from astroid.builder import parse from astroid.manager import AstroidManager def _session_transform(): return parse( """ from sqlalchemy.orm.session import Session class sessionmaker: def __init__( self, bind=None, class_=Session, autoflush=True, autocommit=False, expire_on_commit=True, info=None, **kw ): return def __call__(self, **local_kw): return Session() def configure(self, **new_kw): return return Session() """ ) def register(manager: AstroidManager) -> None: register_module_extender(manager, "sqlalchemy.orm.session", _session_transform) astroid-3.2.2/astroid/brain/brain_curses.py0000664000175000017500000000671114622475517020704 
0ustar epsilonepsilon# Licensed under the LGPL: https://www.gnu.org/licenses/old-licenses/lgpl-2.1.en.html # For details: https://github.com/pylint-dev/astroid/blob/main/LICENSE # Copyright (c) https://github.com/pylint-dev/astroid/blob/main/CONTRIBUTORS.txt from astroid.brain.helpers import register_module_extender from astroid.builder import parse from astroid.manager import AstroidManager def _curses_transform(): return parse( """ A_ALTCHARSET = 1 A_BLINK = 1 A_BOLD = 1 A_DIM = 1 A_INVIS = 1 A_ITALIC = 1 A_NORMAL = 1 A_PROTECT = 1 A_REVERSE = 1 A_STANDOUT = 1 A_UNDERLINE = 1 A_HORIZONTAL = 1 A_LEFT = 1 A_LOW = 1 A_RIGHT = 1 A_TOP = 1 A_VERTICAL = 1 A_CHARTEXT = 1 A_ATTRIBUTES = 1 A_CHARTEXT = 1 A_COLOR = 1 KEY_MIN = 1 KEY_BREAK = 1 KEY_DOWN = 1 KEY_UP = 1 KEY_LEFT = 1 KEY_RIGHT = 1 KEY_HOME = 1 KEY_BACKSPACE = 1 KEY_F0 = 1 KEY_Fn = 1 KEY_DL = 1 KEY_IL = 1 KEY_DC = 1 KEY_IC = 1 KEY_EIC = 1 KEY_CLEAR = 1 KEY_EOS = 1 KEY_EOL = 1 KEY_SF = 1 KEY_SR = 1 KEY_NPAGE = 1 KEY_PPAGE = 1 KEY_STAB = 1 KEY_CTAB = 1 KEY_CATAB = 1 KEY_ENTER = 1 KEY_SRESET = 1 KEY_RESET = 1 KEY_PRINT = 1 KEY_LL = 1 KEY_A1 = 1 KEY_A3 = 1 KEY_B2 = 1 KEY_C1 = 1 KEY_C3 = 1 KEY_BTAB = 1 KEY_BEG = 1 KEY_CANCEL = 1 KEY_CLOSE = 1 KEY_COMMAND = 1 KEY_COPY = 1 KEY_CREATE = 1 KEY_END = 1 KEY_EXIT = 1 KEY_FIND = 1 KEY_HELP = 1 KEY_MARK = 1 KEY_MESSAGE = 1 KEY_MOVE = 1 KEY_NEXT = 1 KEY_OPEN = 1 KEY_OPTIONS = 1 KEY_PREVIOUS = 1 KEY_REDO = 1 KEY_REFERENCE = 1 KEY_REFRESH = 1 KEY_REPLACE = 1 KEY_RESTART = 1 KEY_RESUME = 1 KEY_SAVE = 1 KEY_SBEG = 1 KEY_SCANCEL = 1 KEY_SCOMMAND = 1 KEY_SCOPY = 1 KEY_SCREATE = 1 KEY_SDC = 1 KEY_SDL = 1 KEY_SELECT = 1 KEY_SEND = 1 KEY_SEOL = 1 KEY_SEXIT = 1 KEY_SFIND = 1 KEY_SHELP = 1 KEY_SHOME = 1 KEY_SIC = 1 KEY_SLEFT = 1 KEY_SMESSAGE = 1 KEY_SMOVE = 1 KEY_SNEXT = 1 KEY_SOPTIONS = 1 KEY_SPREVIOUS = 1 KEY_SPRINT = 1 KEY_SREDO = 1 KEY_SREPLACE = 1 KEY_SRIGHT = 1 KEY_SRSUME = 1 KEY_SSAVE = 1 KEY_SSUSPEND = 1 KEY_SUNDO = 1 KEY_SUSPEND = 1 KEY_UNDO = 1 KEY_MOUSE = 1 KEY_RESIZE = 1 
KEY_MAX = 1 ACS_BBSS = 1 ACS_BLOCK = 1 ACS_BOARD = 1 ACS_BSBS = 1 ACS_BSSB = 1 ACS_BSSS = 1 ACS_BTEE = 1 ACS_BULLET = 1 ACS_CKBOARD = 1 ACS_DARROW = 1 ACS_DEGREE = 1 ACS_DIAMOND = 1 ACS_GEQUAL = 1 ACS_HLINE = 1 ACS_LANTERN = 1 ACS_LARROW = 1 ACS_LEQUAL = 1 ACS_LLCORNER = 1 ACS_LRCORNER = 1 ACS_LTEE = 1 ACS_NEQUAL = 1 ACS_PI = 1 ACS_PLMINUS = 1 ACS_PLUS = 1 ACS_RARROW = 1 ACS_RTEE = 1 ACS_S1 = 1 ACS_S3 = 1 ACS_S7 = 1 ACS_S9 = 1 ACS_SBBS = 1 ACS_SBSB = 1 ACS_SBSS = 1 ACS_SSBB = 1 ACS_SSBS = 1 ACS_SSSB = 1 ACS_SSSS = 1 ACS_STERLING = 1 ACS_TTEE = 1 ACS_UARROW = 1 ACS_ULCORNER = 1 ACS_URCORNER = 1 ACS_VLINE = 1 COLOR_BLACK = 1 COLOR_BLUE = 1 COLOR_CYAN = 1 COLOR_GREEN = 1 COLOR_MAGENTA = 1 COLOR_RED = 1 COLOR_WHITE = 1 COLOR_YELLOW = 1 """ ) def register(manager: AstroidManager) -> None: register_module_extender(manager, "curses", _curses_transform) astroid-3.2.2/astroid/brain/brain_qt.py0000664000175000017500000000547214622475517020027 0ustar epsilonepsilon# Licensed under the LGPL: https://www.gnu.org/licenses/old-licenses/lgpl-2.1.en.html # For details: https://github.com/pylint-dev/astroid/blob/main/LICENSE # Copyright (c) https://github.com/pylint-dev/astroid/blob/main/CONTRIBUTORS.txt """Astroid hooks for the PyQT library.""" from astroid import nodes, parse from astroid.brain.helpers import register_module_extender from astroid.builder import AstroidBuilder from astroid.manager import AstroidManager def _looks_like_signal( node: nodes.FunctionDef, signal_name: str = "pyqtSignal" ) -> bool: """Detect a Signal node.""" klasses = node.instance_attrs.get("__class__", []) # On PySide2 or PySide6 (since Qt 5.15.2) the Signal class changed locations if node.qname().partition(".")[0] in {"PySide2", "PySide6"}: return any(cls.qname() == "Signal" for cls in klasses) # pragma: no cover if klasses: try: return klasses[0].name == signal_name except AttributeError: # pragma: no cover # return False if the cls does not have a name attribute pass return False def 
transform_pyqt_signal(node: nodes.FunctionDef) -> None: module = parse( """ _UNSET = object() class pyqtSignal(object): def connect(self, slot, type=None, no_receiver_check=False): pass def disconnect(self, slot=_UNSET): pass def emit(self, *args): pass """ ) signal_cls: nodes.ClassDef = module["pyqtSignal"] node.instance_attrs["emit"] = [signal_cls["emit"]] node.instance_attrs["disconnect"] = [signal_cls["disconnect"]] node.instance_attrs["connect"] = [signal_cls["connect"]] def transform_pyside_signal(node: nodes.FunctionDef) -> None: module = parse( """ class NotPySideSignal(object): def connect(self, receiver, type=None): pass def disconnect(self, receiver): pass def emit(self, *args): pass """ ) signal_cls: nodes.ClassDef = module["NotPySideSignal"] node.instance_attrs["connect"] = [signal_cls["connect"]] node.instance_attrs["disconnect"] = [signal_cls["disconnect"]] node.instance_attrs["emit"] = [signal_cls["emit"]] def pyqt4_qtcore_transform(): return AstroidBuilder(AstroidManager()).string_build( """ def SIGNAL(signal_name): pass class QObject(object): def emit(self, signal): pass """ ) def register(manager: AstroidManager) -> None: register_module_extender(manager, "PyQt4.QtCore", pyqt4_qtcore_transform) manager.register_transform( nodes.FunctionDef, transform_pyqt_signal, _looks_like_signal ) manager.register_transform( nodes.ClassDef, transform_pyside_signal, lambda node: node.qname() in {"PySide.QtCore.Signal", "PySide2.QtCore.Signal"}, ) astroid-3.2.2/astroid/brain/__init__.py0000664000175000017500000000000014622475517017745 0ustar epsilonepsilonastroid-3.2.2/astroid/brain/brain_unittest.py0000664000175000017500000000216014622475517021251 0ustar epsilonepsilon# Licensed under the LGPL: https://www.gnu.org/licenses/old-licenses/lgpl-2.1.en.html # For details: https://github.com/pylint-dev/astroid/blob/main/LICENSE # Copyright (c) https://github.com/pylint-dev/astroid/blob/main/CONTRIBUTORS.txt """Astroid hooks for unittest module.""" from 
astroid.brain.helpers import register_module_extender from astroid.builder import parse from astroid.manager import AstroidManager def IsolatedAsyncioTestCaseImport(): """ In the unittest package, the IsolatedAsyncioTestCase class is imported lazily, i.e. only when the ``__getattr__`` method of the unittest module is called with 'IsolatedAsyncioTestCase' as its argument. Thus IsolatedAsyncioTestCase is not imported statically (at import time). This function mocks a conventional static import of IsolatedAsyncioTestCase. (see https://github.com/pylint-dev/pylint/issues/4060) """ return parse( """ from .async_case import IsolatedAsyncioTestCase """ ) def register(manager: AstroidManager) -> None: register_module_extender(manager, "unittest", IsolatedAsyncioTestCaseImport) astroid-3.2.2/astroid/brain/brain_numpy_core_umath.py0000664000175000017500000001151314622475517022752 0ustar epsilonepsilon# Licensed under the LGPL: https://www.gnu.org/licenses/old-licenses/lgpl-2.1.en.html # For details: https://github.com/pylint-dev/astroid/blob/main/LICENSE # Copyright (c) https://github.com/pylint-dev/astroid/blob/main/CONTRIBUTORS.txt # Note: starting with version 1.18 the numpy module has a `__getattr__` method, which prevents # `pylint` from emitting `no-member` messages for all of numpy's attributes.
(see pylint's module # typecheck in `_emit_no_member` function) """Astroid hooks for numpy.core.umath module.""" from astroid.brain.helpers import register_module_extender from astroid.builder import parse from astroid.manager import AstroidManager def numpy_core_umath_transform(): ufunc_optional_keyword_arguments = ( """out=None, where=True, casting='same_kind', order='K', """ """dtype=None, subok=True""" ) return parse( """ class FakeUfunc: def __init__(self): self.__doc__ = str() self.__name__ = str() self.nin = 0 self.nout = 0 self.nargs = 0 self.ntypes = 0 self.types = None self.identity = None self.signature = None @classmethod def reduce(cls, a, axis=None, dtype=None, out=None): return numpy.ndarray([0, 0]) @classmethod def accumulate(cls, array, axis=None, dtype=None, out=None): return numpy.ndarray([0, 0]) @classmethod def reduceat(cls, a, indices, axis=None, dtype=None, out=None): return numpy.ndarray([0, 0]) @classmethod def outer(cls, A, B, **kwargs): return numpy.ndarray([0, 0]) @classmethod def at(cls, a, indices, b=None): return numpy.ndarray([0, 0]) class FakeUfuncOneArg(FakeUfunc): def __call__(self, x, {opt_args:s}): return numpy.ndarray([0, 0]) class FakeUfuncOneArgBis(FakeUfunc): def __call__(self, x, {opt_args:s}): return numpy.ndarray([0, 0]), numpy.ndarray([0, 0]) class FakeUfuncTwoArgs(FakeUfunc): def __call__(self, x1, x2, {opt_args:s}): return numpy.ndarray([0, 0]) # Constants e = 2.718281828459045 euler_gamma = 0.5772156649015329 # One arg functions with optional kwargs arccos = FakeUfuncOneArg() arccosh = FakeUfuncOneArg() arcsin = FakeUfuncOneArg() arcsinh = FakeUfuncOneArg() arctan = FakeUfuncOneArg() arctanh = FakeUfuncOneArg() cbrt = FakeUfuncOneArg() conj = FakeUfuncOneArg() conjugate = FakeUfuncOneArg() cosh = FakeUfuncOneArg() deg2rad = FakeUfuncOneArg() degrees = FakeUfuncOneArg() exp2 = FakeUfuncOneArg() expm1 = FakeUfuncOneArg() fabs = FakeUfuncOneArg() frexp = FakeUfuncOneArgBis() isfinite = FakeUfuncOneArg() isinf = 
FakeUfuncOneArg() log = FakeUfuncOneArg() log1p = FakeUfuncOneArg() log2 = FakeUfuncOneArg() logical_not = FakeUfuncOneArg() modf = FakeUfuncOneArgBis() negative = FakeUfuncOneArg() positive = FakeUfuncOneArg() rad2deg = FakeUfuncOneArg() radians = FakeUfuncOneArg() reciprocal = FakeUfuncOneArg() rint = FakeUfuncOneArg() sign = FakeUfuncOneArg() signbit = FakeUfuncOneArg() sinh = FakeUfuncOneArg() spacing = FakeUfuncOneArg() square = FakeUfuncOneArg() tan = FakeUfuncOneArg() tanh = FakeUfuncOneArg() trunc = FakeUfuncOneArg() # Two args functions with optional kwargs add = FakeUfuncTwoArgs() bitwise_and = FakeUfuncTwoArgs() bitwise_or = FakeUfuncTwoArgs() bitwise_xor = FakeUfuncTwoArgs() copysign = FakeUfuncTwoArgs() divide = FakeUfuncTwoArgs() divmod = FakeUfuncTwoArgs() equal = FakeUfuncTwoArgs() float_power = FakeUfuncTwoArgs() floor_divide = FakeUfuncTwoArgs() fmax = FakeUfuncTwoArgs() fmin = FakeUfuncTwoArgs() fmod = FakeUfuncTwoArgs() greater = FakeUfuncTwoArgs() gcd = FakeUfuncTwoArgs() hypot = FakeUfuncTwoArgs() heaviside = FakeUfuncTwoArgs() lcm = FakeUfuncTwoArgs() ldexp = FakeUfuncTwoArgs() left_shift = FakeUfuncTwoArgs() less = FakeUfuncTwoArgs() logaddexp = FakeUfuncTwoArgs() logaddexp2 = FakeUfuncTwoArgs() logical_and = FakeUfuncTwoArgs() logical_or = FakeUfuncTwoArgs() logical_xor = FakeUfuncTwoArgs() maximum = FakeUfuncTwoArgs() minimum = FakeUfuncTwoArgs() multiply = FakeUfuncTwoArgs() nextafter = FakeUfuncTwoArgs() not_equal = FakeUfuncTwoArgs() power = FakeUfuncTwoArgs() remainder = FakeUfuncTwoArgs() right_shift = FakeUfuncTwoArgs() subtract = FakeUfuncTwoArgs() true_divide = FakeUfuncTwoArgs() """.format( opt_args=ufunc_optional_keyword_arguments ) ) def register(manager: AstroidManager) -> None: register_module_extender(manager, "numpy.core.umath", numpy_core_umath_transform) astroid-3.2.2/astroid/brain/brain_numpy_core_numerictypes.py0000664000175000017500000002063614622475517024371 0ustar epsilonepsilon# Licensed under the LGPL: 
https://www.gnu.org/licenses/old-licenses/lgpl-2.1.en.html # For details: https://github.com/pylint-dev/astroid/blob/main/LICENSE # Copyright (c) https://github.com/pylint-dev/astroid/blob/main/CONTRIBUTORS.txt # TODO(hippo91) : correct the methods signature. """Astroid hooks for numpy.core.numerictypes module.""" from astroid.brain.brain_numpy_utils import numpy_supports_type_hints from astroid.brain.helpers import register_module_extender from astroid.builder import parse from astroid.manager import AstroidManager def numpy_core_numerictypes_transform(): # TODO: Uniformize the generic API with the ndarray one. # According to numpy doc the generic object should expose # the same API than ndarray. This has been done here partially # through the astype method. generic_src = """ class generic(object): def __init__(self, value): self.T = np.ndarray([0, 0]) self.base = None self.data = None self.dtype = None self.flags = None # Should be a numpy.flatiter instance but not available for now # Putting an array instead so that iteration and indexing are authorized self.flat = np.ndarray([0, 0]) self.imag = None self.itemsize = None self.nbytes = None self.ndim = None self.real = None self.size = None self.strides = None def all(self): return uninferable def any(self): return uninferable def argmax(self): return uninferable def argmin(self): return uninferable def argsort(self): return uninferable def astype(self, dtype, order='K', casting='unsafe', subok=True, copy=True): return np.ndarray([0, 0]) def base(self): return uninferable def byteswap(self): return uninferable def choose(self): return uninferable def clip(self): return uninferable def compress(self): return uninferable def conj(self): return uninferable def conjugate(self): return uninferable def copy(self): return uninferable def cumprod(self): return uninferable def cumsum(self): return uninferable def data(self): return uninferable def diagonal(self): return uninferable def dtype(self): return uninferable def 
dump(self): return uninferable def dumps(self): return uninferable def fill(self): return uninferable def flags(self): return uninferable def flat(self): return uninferable def flatten(self): return uninferable def getfield(self): return uninferable def imag(self): return uninferable def item(self): return uninferable def itemset(self): return uninferable def itemsize(self): return uninferable def max(self): return uninferable def mean(self): return uninferable def min(self): return uninferable def nbytes(self): return uninferable def ndim(self): return uninferable def newbyteorder(self): return uninferable def nonzero(self): return uninferable def prod(self): return uninferable def ptp(self): return uninferable def put(self): return uninferable def ravel(self): return uninferable def real(self): return uninferable def repeat(self): return uninferable def reshape(self): return uninferable def resize(self): return uninferable def round(self): return uninferable def searchsorted(self): return uninferable def setfield(self): return uninferable def setflags(self): return uninferable def shape(self): return uninferable def size(self): return uninferable def sort(self): return uninferable def squeeze(self): return uninferable def std(self): return uninferable def strides(self): return uninferable def sum(self): return uninferable def swapaxes(self): return uninferable def take(self): return uninferable def tobytes(self): return uninferable def tofile(self): return uninferable def tolist(self): return uninferable def tostring(self): return uninferable def trace(self): return uninferable def transpose(self): return uninferable def var(self): return uninferable def view(self): return uninferable """ if numpy_supports_type_hints(): generic_src += """ @classmethod def __class_getitem__(cls, value): return cls """ return parse( generic_src + """ class dtype(object): def __init__(self, obj, align=False, copy=False): self.alignment = None self.base = None self.byteorder = None 
self.char = None self.descr = None self.fields = None self.flags = None self.hasobject = None self.isalignedstruct = None self.isbuiltin = None self.isnative = None self.itemsize = None self.kind = None self.metadata = None self.name = None self.names = None self.num = None self.shape = None self.str = None self.subdtype = None self.type = None def newbyteorder(self, new_order='S'): return uninferable def __neg__(self): return uninferable class busdaycalendar(object): def __init__(self, weekmask='1111100', holidays=None): self.holidays = None self.weekmask = None class flexible(generic): pass class bool_(generic): pass class number(generic): def __neg__(self): return uninferable class datetime64(generic): def __init__(self, nb, unit=None): pass class void(flexible): def __init__(self, *args, **kwargs): self.base = None self.dtype = None self.flags = None def getfield(self): return uninferable def setfield(self): return uninferable class character(flexible): pass class integer(number): def __init__(self, value): self.denominator = None self.numerator = None class inexact(number): pass class str_(str, character): def maketrans(self, x, y=None, z=None): return uninferable class bytes_(bytes, character): def fromhex(self, string): return uninferable def maketrans(self, frm, to): return uninferable class signedinteger(integer): pass class unsignedinteger(integer): pass class complexfloating(inexact): pass class floating(inexact): pass class float64(floating, float): def fromhex(self, string): return uninferable class uint64(unsignedinteger): pass class complex64(complexfloating): pass class int16(signedinteger): pass class float96(floating): pass class int8(signedinteger): pass class uint32(unsignedinteger): pass class uint8(unsignedinteger): pass class _typedict(dict): pass class complex192(complexfloating): pass class timedelta64(signedinteger): def __init__(self, nb, unit=None): pass class int32(signedinteger): pass class uint16(unsignedinteger): pass class 
float32(floating): pass class complex128(complexfloating, complex): pass class float16(floating): pass class int64(signedinteger): pass buffer_type = memoryview bool8 = bool_ byte = int8 bytes0 = bytes_ cdouble = complex128 cfloat = complex128 clongdouble = complex192 clongfloat = complex192 complex_ = complex128 csingle = complex64 double = float64 float_ = float64 half = float16 int0 = int32 int_ = int32 intc = int32 intp = int32 long = int32 longcomplex = complex192 longdouble = float96 longfloat = float96 longlong = int64 object0 = object_ object_ = object_ short = int16 single = float32 singlecomplex = complex64 str0 = str_ string_ = bytes_ ubyte = uint8 uint = uint32 uint0 = uint32 uintc = uint32 uintp = uint32 ulonglong = uint64 unicode = str_ unicode_ = str_ ushort = uint16 void0 = void """ ) def register(manager: AstroidManager) -> None: register_module_extender( manager, "numpy.core.numerictypes", numpy_core_numerictypes_transform ) astroid-3.2.2/astroid/brain/brain_numpy_ndarray.py0000664000175000017500000002155214622475517022270 0ustar epsilonepsilon# Licensed under the LGPL: https://www.gnu.org/licenses/old-licenses/lgpl-2.1.en.html # For details: https://github.com/pylint-dev/astroid/blob/main/LICENSE # Copyright (c) https://github.com/pylint-dev/astroid/blob/main/CONTRIBUTORS.txt """Astroid hooks for numpy ndarray class.""" from __future__ import annotations from astroid.brain.brain_numpy_utils import numpy_supports_type_hints from astroid.builder import extract_node from astroid.context import InferenceContext from astroid.inference_tip import inference_tip from astroid.manager import AstroidManager from astroid.nodes.node_classes import Attribute def infer_numpy_ndarray(node, context: InferenceContext | None = None): ndarray = """ class ndarray(object): def __init__(self, shape, dtype=float, buffer=None, offset=0, strides=None, order=None): self.T = numpy.ndarray([0, 0]) self.base = None self.ctypes = None self.data = None self.dtype = None 
self.flags = None # Should be a numpy.flatiter instance but not available for now # Putting an array instead so that iteration and indexing are authorized self.flat = np.ndarray([0, 0]) self.imag = np.ndarray([0, 0]) self.itemsize = None self.nbytes = None self.ndim = None self.real = np.ndarray([0, 0]) self.shape = numpy.ndarray([0, 0]) self.size = None self.strides = None def __abs__(self): return numpy.ndarray([0, 0]) def __add__(self, value): return numpy.ndarray([0, 0]) def __and__(self, value): return numpy.ndarray([0, 0]) def __array__(self, dtype=None): return numpy.ndarray([0, 0]) def __array_wrap__(self, obj): return numpy.ndarray([0, 0]) def __contains__(self, key): return True def __copy__(self): return numpy.ndarray([0, 0]) def __deepcopy__(self, memo): return numpy.ndarray([0, 0]) def __divmod__(self, value): return (numpy.ndarray([0, 0]), numpy.ndarray([0, 0])) def __eq__(self, value): return numpy.ndarray([0, 0]) def __float__(self): return 0. def __floordiv__(self): return numpy.ndarray([0, 0]) def __ge__(self, value): return numpy.ndarray([0, 0]) def __getitem__(self, key): return uninferable def __gt__(self, value): return numpy.ndarray([0, 0]) def __iadd__(self, value): return numpy.ndarray([0, 0]) def __iand__(self, value): return numpy.ndarray([0, 0]) def __ifloordiv__(self, value): return numpy.ndarray([0, 0]) def __ilshift__(self, value): return numpy.ndarray([0, 0]) def __imod__(self, value): return numpy.ndarray([0, 0]) def __imul__(self, value): return numpy.ndarray([0, 0]) def __int__(self): return 0 def __invert__(self): return numpy.ndarray([0, 0]) def __ior__(self, value): return numpy.ndarray([0, 0]) def __ipow__(self, value): return numpy.ndarray([0, 0]) def __irshift__(self, value): return numpy.ndarray([0, 0]) def __isub__(self, value): return numpy.ndarray([0, 0]) def __itruediv__(self, value): return numpy.ndarray([0, 0]) def __ixor__(self, value): return numpy.ndarray([0, 0]) def __le__(self, value): return numpy.ndarray([0, 
0]) def __len__(self): return 1 def __lshift__(self, value): return numpy.ndarray([0, 0]) def __lt__(self, value): return numpy.ndarray([0, 0]) def __matmul__(self, value): return numpy.ndarray([0, 0]) def __mod__(self, value): return numpy.ndarray([0, 0]) def __mul__(self, value): return numpy.ndarray([0, 0]) def __ne__(self, value): return numpy.ndarray([0, 0]) def __neg__(self): return numpy.ndarray([0, 0]) def __or__(self, value): return numpy.ndarray([0, 0]) def __pos__(self): return numpy.ndarray([0, 0]) def __pow__(self): return numpy.ndarray([0, 0]) def __repr__(self): return str() def __rshift__(self): return numpy.ndarray([0, 0]) def __setitem__(self, key, value): return uninferable def __str__(self): return str() def __sub__(self, value): return numpy.ndarray([0, 0]) def __truediv__(self, value): return numpy.ndarray([0, 0]) def __xor__(self, value): return numpy.ndarray([0, 0]) def all(self, axis=None, out=None, keepdims=False): return np.ndarray([0, 0]) def any(self, axis=None, out=None, keepdims=False): return np.ndarray([0, 0]) def argmax(self, axis=None, out=None): return np.ndarray([0, 0]) def argmin(self, axis=None, out=None): return np.ndarray([0, 0]) def argpartition(self, kth, axis=-1, kind='introselect', order=None): return np.ndarray([0, 0]) def argsort(self, axis=-1, kind='quicksort', order=None): return np.ndarray([0, 0]) def astype(self, dtype, order='K', casting='unsafe', subok=True, copy=True): return np.ndarray([0, 0]) def byteswap(self, inplace=False): return np.ndarray([0, 0]) def choose(self, choices, out=None, mode='raise'): return np.ndarray([0, 0]) def clip(self, min=None, max=None, out=None): return np.ndarray([0, 0]) def compress(self, condition, axis=None, out=None): return np.ndarray([0, 0]) def conj(self): return np.ndarray([0, 0]) def conjugate(self): return np.ndarray([0, 0]) def copy(self, order='C'): return np.ndarray([0, 0]) def cumprod(self, axis=None, dtype=None, out=None): return np.ndarray([0, 0]) def cumsum(self, 
axis=None, dtype=None, out=None): return np.ndarray([0, 0]) def diagonal(self, offset=0, axis1=0, axis2=1): return np.ndarray([0, 0]) def dot(self, b, out=None): return np.ndarray([0, 0]) def dump(self, file): return None def dumps(self): return str() def fill(self, value): return None def flatten(self, order='C'): return np.ndarray([0, 0]) def getfield(self, dtype, offset=0): return np.ndarray([0, 0]) def item(self, *args): return uninferable def itemset(self, *args): return None def max(self, axis=None, out=None): return np.ndarray([0, 0]) def mean(self, axis=None, dtype=None, out=None, keepdims=False): return np.ndarray([0, 0]) def min(self, axis=None, out=None, keepdims=False): return np.ndarray([0, 0]) def newbyteorder(self, new_order='S'): return np.ndarray([0, 0]) def nonzero(self): return (1,) def partition(self, kth, axis=-1, kind='introselect', order=None): return None def prod(self, axis=None, dtype=None, out=None, keepdims=False): return np.ndarray([0, 0]) def ptp(self, axis=None, out=None): return np.ndarray([0, 0]) def put(self, indices, values, mode='raise'): return None def ravel(self, order='C'): return np.ndarray([0, 0]) def repeat(self, repeats, axis=None): return np.ndarray([0, 0]) def reshape(self, shape, order='C'): return np.ndarray([0, 0]) def resize(self, new_shape, refcheck=True): return None def round(self, decimals=0, out=None): return np.ndarray([0, 0]) def searchsorted(self, v, side='left', sorter=None): return np.ndarray([0, 0]) def setfield(self, val, dtype, offset=0): return None def setflags(self, write=None, align=None, uic=None): return None def sort(self, axis=-1, kind='quicksort', order=None): return None def squeeze(self, axis=None): return np.ndarray([0, 0]) def std(self, axis=None, dtype=None, out=None, ddof=0, keepdims=False): return np.ndarray([0, 0]) def sum(self, axis=None, dtype=None, out=None, keepdims=False): return np.ndarray([0, 0]) def swapaxes(self, axis1, axis2): return np.ndarray([0, 0]) def take(self, indices, 
axis=None, out=None, mode='raise'): return np.ndarray([0, 0]) def tobytes(self, order='C'): return b'' def tofile(self, fid, sep="", format="%s"): return None def tolist(self, ): return [] def tostring(self, order='C'): return b'' def trace(self, offset=0, axis1=0, axis2=1, dtype=None, out=None): return np.ndarray([0, 0]) def transpose(self, *axes): return np.ndarray([0, 0]) def var(self, axis=None, dtype=None, out=None, ddof=0, keepdims=False): return np.ndarray([0, 0]) def view(self, dtype=None, type=None): return np.ndarray([0, 0]) """ if numpy_supports_type_hints(): ndarray += """ @classmethod def __class_getitem__(cls, value): return cls """ node = extract_node(ndarray) return node.infer(context=context) def _looks_like_numpy_ndarray(node) -> bool: return isinstance(node, Attribute) and node.attrname == "ndarray" def register(manager: AstroidManager) -> None: manager.register_transform( Attribute, inference_tip(infer_numpy_ndarray), _looks_like_numpy_ndarray, ) astroid-3.2.2/astroid/brain/brain_pathlib.py0000664000175000017500000000311214622475517021013 0ustar epsilonepsilon# Licensed under the LGPL: https://www.gnu.org/licenses/old-licenses/lgpl-2.1.en.html # For details: https://github.com/pylint-dev/astroid/blob/main/LICENSE # Copyright (c) https://github.com/pylint-dev/astroid/blob/main/CONTRIBUTORS.txt from __future__ import annotations from collections.abc import Iterator from astroid import bases, context, inference_tip, nodes from astroid.builder import _extract_single_node from astroid.exceptions import InferenceError, UseInferenceDefault from astroid.manager import AstroidManager PATH_TEMPLATE = """ from pathlib import Path Path """ def _looks_like_parents_subscript(node: nodes.Subscript) -> bool: if not ( isinstance(node.value, nodes.Attribute) and node.value.attrname == "parents" ): return False try: value = next(node.value.infer()) except (InferenceError, StopIteration): return False return ( isinstance(value, bases.Instance) and 
isinstance(value._proxied, nodes.ClassDef) and value.qname() == "pathlib._PathParents" ) def infer_parents_subscript( subscript_node: nodes.Subscript, ctx: context.InferenceContext | None = None ) -> Iterator[bases.Instance]: if isinstance(subscript_node.slice, nodes.Const): path_cls = next(_extract_single_node(PATH_TEMPLATE).infer()) return iter([path_cls.instantiate_class()]) raise UseInferenceDefault def register(manager: AstroidManager) -> None: manager.register_transform( nodes.Subscript, inference_tip(infer_parents_subscript), _looks_like_parents_subscript, ) astroid-3.2.2/astroid/brain/brain_http.py0000664000175000017500000002467714622475517020372 0ustar epsilonepsilon# Licensed under the LGPL: https://www.gnu.org/licenses/old-licenses/lgpl-2.1.en.html # For details: https://github.com/pylint-dev/astroid/blob/main/LICENSE # Copyright (c) https://github.com/pylint-dev/astroid/blob/main/CONTRIBUTORS.txt """Astroid brain hints for some of the `http` module.""" import textwrap from astroid.brain.helpers import register_module_extender from astroid.builder import AstroidBuilder from astroid.manager import AstroidManager def _http_transform(): code = textwrap.dedent( """ from enum import IntEnum from collections import namedtuple _HTTPStatus = namedtuple('_HTTPStatus', 'value phrase description') class HTTPStatus(IntEnum): @property def phrase(self): return "" @property def value(self): return 0 @property def description(self): return "" # informational CONTINUE = _HTTPStatus(100, 'Continue', 'Request received, please continue') SWITCHING_PROTOCOLS = _HTTPStatus(101, 'Switching Protocols', 'Switching to new protocol; obey Upgrade header') PROCESSING = _HTTPStatus(102, 'Processing', '') OK = _HTTPStatus(200, 'OK', 'Request fulfilled, document follows') CREATED = _HTTPStatus(201, 'Created', 'Document created, URL follows') ACCEPTED = _HTTPStatus(202, 'Accepted', 'Request accepted, processing continues off-line') NON_AUTHORITATIVE_INFORMATION = _HTTPStatus(203, 
'Non-Authoritative Information', 'Request fulfilled from cache') NO_CONTENT = _HTTPStatus(204, 'No Content', 'Request fulfilled, nothing follows') RESET_CONTENT =_HTTPStatus(205, 'Reset Content', 'Clear input form for further input') PARTIAL_CONTENT = _HTTPStatus(206, 'Partial Content', 'Partial content follows') MULTI_STATUS = _HTTPStatus(207, 'Multi-Status', '') ALREADY_REPORTED = _HTTPStatus(208, 'Already Reported', '') IM_USED = _HTTPStatus(226, 'IM Used', '') MULTIPLE_CHOICES = _HTTPStatus(300, 'Multiple Choices', 'Object has several resources -- see URI list') MOVED_PERMANENTLY = _HTTPStatus(301, 'Moved Permanently', 'Object moved permanently -- see URI list') FOUND = _HTTPStatus(302, 'Found', 'Object moved temporarily -- see URI list') SEE_OTHER = _HTTPStatus(303, 'See Other', 'Object moved -- see Method and URL list') NOT_MODIFIED = _HTTPStatus(304, 'Not Modified', 'Document has not changed since given time') USE_PROXY = _HTTPStatus(305, 'Use Proxy', 'You must use proxy specified in Location to access this resource') TEMPORARY_REDIRECT = _HTTPStatus(307, 'Temporary Redirect', 'Object moved temporarily -- see URI list') PERMANENT_REDIRECT = _HTTPStatus(308, 'Permanent Redirect', 'Object moved permanently -- see URI list') BAD_REQUEST = _HTTPStatus(400, 'Bad Request', 'Bad request syntax or unsupported method') UNAUTHORIZED = _HTTPStatus(401, 'Unauthorized', 'No permission -- see authorization schemes') PAYMENT_REQUIRED = _HTTPStatus(402, 'Payment Required', 'No payment -- see charging schemes') FORBIDDEN = _HTTPStatus(403, 'Forbidden', 'Request forbidden -- authorization will not help') NOT_FOUND = _HTTPStatus(404, 'Not Found', 'Nothing matches the given URI') METHOD_NOT_ALLOWED = _HTTPStatus(405, 'Method Not Allowed', 'Specified method is invalid for this resource') NOT_ACCEPTABLE = _HTTPStatus(406, 'Not Acceptable', 'URI not available in preferred format') PROXY_AUTHENTICATION_REQUIRED = _HTTPStatus(407, 'Proxy Authentication Required', 'You must 
authenticate with this proxy before proceeding')
        REQUEST_TIMEOUT = _HTTPStatus(408, 'Request Timeout',
            'Request timed out; try again later')
        CONFLICT = _HTTPStatus(409, 'Conflict', 'Request conflict')
        GONE = _HTTPStatus(410, 'Gone',
            'URI no longer exists and has been permanently removed')
        LENGTH_REQUIRED = _HTTPStatus(411, 'Length Required',
            'Client must specify Content-Length')
        PRECONDITION_FAILED = _HTTPStatus(412, 'Precondition Failed',
            'Precondition in headers is false')
        REQUEST_ENTITY_TOO_LARGE = _HTTPStatus(413, 'Request Entity Too Large',
            'Entity is too large')
        REQUEST_URI_TOO_LONG = _HTTPStatus(414, 'Request-URI Too Long',
            'URI is too long')
        UNSUPPORTED_MEDIA_TYPE = _HTTPStatus(415, 'Unsupported Media Type',
            'Entity body in unsupported format')
        REQUESTED_RANGE_NOT_SATISFIABLE = _HTTPStatus(416,
            'Requested Range Not Satisfiable',
            'Cannot satisfy request range')
        EXPECTATION_FAILED = _HTTPStatus(417, 'Expectation Failed',
            'Expect condition could not be satisfied')
        MISDIRECTED_REQUEST = _HTTPStatus(421, 'Misdirected Request',
            'Server is not able to produce a response')
        UNPROCESSABLE_ENTITY = _HTTPStatus(422, 'Unprocessable Entity')
        LOCKED = _HTTPStatus(423, 'Locked')
        FAILED_DEPENDENCY = _HTTPStatus(424, 'Failed Dependency')
        UPGRADE_REQUIRED = _HTTPStatus(426, 'Upgrade Required')
        PRECONDITION_REQUIRED = _HTTPStatus(428, 'Precondition Required',
            'The origin server requires the request to be conditional')
        TOO_MANY_REQUESTS = _HTTPStatus(429, 'Too Many Requests',
            'The user has sent too many requests in '
            'a given amount of time ("rate limiting")')
        REQUEST_HEADER_FIELDS_TOO_LARGE = _HTTPStatus(431,
            'Request Header Fields Too Large',
            'The server is unwilling to process the request because its header '
            'fields are too large')
        UNAVAILABLE_FOR_LEGAL_REASONS = _HTTPStatus(451,
            'Unavailable For Legal Reasons',
            'The server is denying access to the '
            'resource as a consequence of a legal demand')
        INTERNAL_SERVER_ERROR = _HTTPStatus(500, 'Internal Server Error',
            'Server got itself in trouble')
        NOT_IMPLEMENTED = _HTTPStatus(501, 'Not Implemented',
            'Server does not support this operation')
        BAD_GATEWAY = _HTTPStatus(502, 'Bad Gateway',
            'Invalid responses from another server/proxy')
        SERVICE_UNAVAILABLE = _HTTPStatus(503, 'Service Unavailable',
            'The server cannot process the request due to a high load')
        GATEWAY_TIMEOUT = _HTTPStatus(504, 'Gateway Timeout',
            'The gateway server did not receive a timely response')
        HTTP_VERSION_NOT_SUPPORTED = _HTTPStatus(505, 'HTTP Version Not Supported',
            'Cannot fulfill request')
        VARIANT_ALSO_NEGOTIATES = _HTTPStatus(506, 'Variant Also Negotiates')
        INSUFFICIENT_STORAGE = _HTTPStatus(507, 'Insufficient Storage')
        LOOP_DETECTED = _HTTPStatus(508, 'Loop Detected')
        NOT_EXTENDED = _HTTPStatus(510, 'Not Extended')
        NETWORK_AUTHENTICATION_REQUIRED = _HTTPStatus(511,
            'Network Authentication Required',
            'The client needs to authenticate to gain network access')
    """
    )
    return AstroidBuilder(AstroidManager()).string_build(code)


def _http_client_transform():
    return AstroidBuilder(AstroidManager()).string_build(
        textwrap.dedent(
            """
    from http import HTTPStatus

    CONTINUE = HTTPStatus.CONTINUE
    SWITCHING_PROTOCOLS = HTTPStatus.SWITCHING_PROTOCOLS
    PROCESSING = HTTPStatus.PROCESSING
    OK = HTTPStatus.OK
    CREATED = HTTPStatus.CREATED
    ACCEPTED = HTTPStatus.ACCEPTED
    NON_AUTHORITATIVE_INFORMATION = HTTPStatus.NON_AUTHORITATIVE_INFORMATION
    NO_CONTENT = HTTPStatus.NO_CONTENT
    RESET_CONTENT = HTTPStatus.RESET_CONTENT
    PARTIAL_CONTENT = HTTPStatus.PARTIAL_CONTENT
    MULTI_STATUS = HTTPStatus.MULTI_STATUS
    ALREADY_REPORTED = HTTPStatus.ALREADY_REPORTED
    IM_USED = HTTPStatus.IM_USED
    MULTIPLE_CHOICES = HTTPStatus.MULTIPLE_CHOICES
    MOVED_PERMANENTLY = HTTPStatus.MOVED_PERMANENTLY
    FOUND = HTTPStatus.FOUND
    SEE_OTHER = HTTPStatus.SEE_OTHER
    NOT_MODIFIED = HTTPStatus.NOT_MODIFIED
    USE_PROXY = HTTPStatus.USE_PROXY
    TEMPORARY_REDIRECT = HTTPStatus.TEMPORARY_REDIRECT
    PERMANENT_REDIRECT = HTTPStatus.PERMANENT_REDIRECT
    BAD_REQUEST = HTTPStatus.BAD_REQUEST
    UNAUTHORIZED = HTTPStatus.UNAUTHORIZED
    PAYMENT_REQUIRED = HTTPStatus.PAYMENT_REQUIRED
    FORBIDDEN = HTTPStatus.FORBIDDEN
    NOT_FOUND = HTTPStatus.NOT_FOUND
    METHOD_NOT_ALLOWED = HTTPStatus.METHOD_NOT_ALLOWED
    NOT_ACCEPTABLE = HTTPStatus.NOT_ACCEPTABLE
    PROXY_AUTHENTICATION_REQUIRED = HTTPStatus.PROXY_AUTHENTICATION_REQUIRED
    REQUEST_TIMEOUT = HTTPStatus.REQUEST_TIMEOUT
    CONFLICT = HTTPStatus.CONFLICT
    GONE = HTTPStatus.GONE
    LENGTH_REQUIRED = HTTPStatus.LENGTH_REQUIRED
    PRECONDITION_FAILED = HTTPStatus.PRECONDITION_FAILED
    REQUEST_ENTITY_TOO_LARGE = HTTPStatus.REQUEST_ENTITY_TOO_LARGE
    REQUEST_URI_TOO_LONG = HTTPStatus.REQUEST_URI_TOO_LONG
    UNSUPPORTED_MEDIA_TYPE = HTTPStatus.UNSUPPORTED_MEDIA_TYPE
    REQUESTED_RANGE_NOT_SATISFIABLE = HTTPStatus.REQUESTED_RANGE_NOT_SATISFIABLE
    EXPECTATION_FAILED = HTTPStatus.EXPECTATION_FAILED
    UNPROCESSABLE_ENTITY = HTTPStatus.UNPROCESSABLE_ENTITY
    LOCKED = HTTPStatus.LOCKED
    FAILED_DEPENDENCY = HTTPStatus.FAILED_DEPENDENCY
    UPGRADE_REQUIRED = HTTPStatus.UPGRADE_REQUIRED
    PRECONDITION_REQUIRED = HTTPStatus.PRECONDITION_REQUIRED
    TOO_MANY_REQUESTS = HTTPStatus.TOO_MANY_REQUESTS
    REQUEST_HEADER_FIELDS_TOO_LARGE = HTTPStatus.REQUEST_HEADER_FIELDS_TOO_LARGE
    INTERNAL_SERVER_ERROR = HTTPStatus.INTERNAL_SERVER_ERROR
    NOT_IMPLEMENTED = HTTPStatus.NOT_IMPLEMENTED
    BAD_GATEWAY = HTTPStatus.BAD_GATEWAY
    SERVICE_UNAVAILABLE = HTTPStatus.SERVICE_UNAVAILABLE
    GATEWAY_TIMEOUT = HTTPStatus.GATEWAY_TIMEOUT
    HTTP_VERSION_NOT_SUPPORTED = HTTPStatus.HTTP_VERSION_NOT_SUPPORTED
    VARIANT_ALSO_NEGOTIATES = HTTPStatus.VARIANT_ALSO_NEGOTIATES
    INSUFFICIENT_STORAGE = HTTPStatus.INSUFFICIENT_STORAGE
    LOOP_DETECTED = HTTPStatus.LOOP_DETECTED
    NOT_EXTENDED = HTTPStatus.NOT_EXTENDED
    NETWORK_AUTHENTICATION_REQUIRED = HTTPStatus.NETWORK_AUTHENTICATION_REQUIRED
    """
        )
    )


def register(manager: AstroidManager) -> None:
    register_module_extender(manager, "http", _http_transform)
    register_module_extender(manager, "http.client", _http_client_transform)
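# Module extenders like the two registered above work by parsing a string of
# replacement source and grafting its top-level names onto the real module.
# The sketch below is a stdlib-only illustration of the first half of that
# mechanism — it is NOT the astroid API (astroid's AstroidBuilder does its own
# parse and node wrapping); `mock_source` and `exposed` are illustrative names.
# It parses a trimmed version of the http.client mock with `ast` to show which
# module-level names the extender would contribute.

import ast
import textwrap

# A trimmed stand-in for the mock source the http.client brain builds above;
# the full brain assigns one module-level alias per HTTPStatus member.
mock_source = textwrap.dedent(
    """
    from http import HTTPStatus

    OK = HTTPStatus.OK
    NOT_FOUND = HTTPStatus.NOT_FOUND
    INTERNAL_SERVER_ERROR = HTTPStatus.INTERNAL_SERVER_ERROR
    """
)

# Listing the Assign targets in the parsed body shows the names that would be
# added to the extended module's locals.
tree = ast.parse(mock_source)
exposed = [
    target.id
    for stmt in tree.body
    if isinstance(stmt, ast.Assign)
    for target in stmt.targets
    if isinstance(target, ast.Name)
]
print(exposed)  # ['OK', 'NOT_FOUND', 'INTERNAL_SERVER_ERROR']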
astroid-3.2.2/astroid/brain/brain_subprocess.py0000664000175000017500000000575014622475517021572 0ustar epsilonepsilon# Licensed under the LGPL: https://www.gnu.org/licenses/old-licenses/lgpl-2.1.en.html # For details: https://github.com/pylint-dev/astroid/blob/main/LICENSE # Copyright (c) https://github.com/pylint-dev/astroid/blob/main/CONTRIBUTORS.txt import textwrap from astroid.brain.helpers import register_module_extender from astroid.builder import parse from astroid.const import PY39_PLUS, PY310_PLUS, PY311_PLUS from astroid.manager import AstroidManager def _subprocess_transform(): communicate = (bytes("string", "ascii"), bytes("string", "ascii")) communicate_signature = "def communicate(self, input=None, timeout=None)" args = """\ self, args, bufsize=-1, executable=None, stdin=None, stdout=None, stderr=None, preexec_fn=None, close_fds=True, shell=False, cwd=None, env=None, universal_newlines=None, startupinfo=None, creationflags=0, restore_signals=True, start_new_session=False, pass_fds=(), *, encoding=None, errors=None, text=None""" if PY39_PLUS: args += ", user=None, group=None, extra_groups=None, umask=-1" if PY310_PLUS: args += ", pipesize=-1" if PY311_PLUS: args += ", process_group=None" init = f""" def __init__({args}): pass""" wait_signature = "def wait(self, timeout=None)" ctx_manager = """ def __enter__(self): return self def __exit__(self, *args): pass """ py3_args = "args = []" check_output_signature = """ check_output( args, *, stdin=None, stderr=None, shell=False, cwd=None, encoding=None, errors=None, universal_newlines=False, timeout=None, env=None, text=None, restore_signals=True, preexec_fn=None, pass_fds=(), input=None, bufsize=0, executable=None, close_fds=False, startupinfo=None, creationflags=0, start_new_session=False ): """.strip() code = textwrap.dedent( f""" def {check_output_signature} if universal_newlines: return "" return b"" class Popen(object): returncode = pid = 0 stdin = stdout = stderr = file() {py3_args} 
{communicate_signature}: return {communicate!r} {wait_signature}: return self.returncode def poll(self): return self.returncode def send_signal(self, signal): pass def terminate(self): pass def kill(self): pass {ctx_manager} """ ) if PY39_PLUS: code += """ @classmethod def __class_getitem__(cls, item): pass """ init_lines = textwrap.dedent(init).splitlines() indented_init = "\n".join(" " * 4 + line for line in init_lines) code += indented_init return parse(code) def register(manager: AstroidManager) -> None: register_module_extender(manager, "subprocess", _subprocess_transform) astroid-3.2.2/astroid/brain/brain_pkg_resources.py0000664000175000017500000000431414622475517022250 0ustar epsilonepsilon# Licensed under the LGPL: https://www.gnu.org/licenses/old-licenses/lgpl-2.1.en.html # For details: https://github.com/pylint-dev/astroid/blob/main/LICENSE # Copyright (c) https://github.com/pylint-dev/astroid/blob/main/CONTRIBUTORS.txt from astroid import parse from astroid.brain.helpers import register_module_extender from astroid.manager import AstroidManager def pkg_resources_transform(): return parse( """ def require(*requirements): return pkg_resources.working_set.require(*requirements) def run_script(requires, script_name): return pkg_resources.working_set.run_script(requires, script_name) def iter_entry_points(group, name=None): return pkg_resources.working_set.iter_entry_points(group, name) def resource_exists(package_or_requirement, resource_name): return get_provider(package_or_requirement).has_resource(resource_name) def resource_isdir(package_or_requirement, resource_name): return get_provider(package_or_requirement).resource_isdir( resource_name) def resource_filename(package_or_requirement, resource_name): return get_provider(package_or_requirement).get_resource_filename( self, resource_name) def resource_stream(package_or_requirement, resource_name): return get_provider(package_or_requirement).get_resource_stream( self, resource_name) def 
resource_string(package_or_requirement, resource_name): return get_provider(package_or_requirement).get_resource_string( self, resource_name) def resource_listdir(package_or_requirement, resource_name): return get_provider(package_or_requirement).resource_listdir( resource_name) def extraction_error(): pass def get_cache_path(archive_name, names=()): extract_path = self.extraction_path or get_default_cache() target_path = os.path.join(extract_path, archive_name+'-tmp', *names) return target_path def postprocess(tempname, filename): pass def set_extraction_path(path): pass def cleanup_resources(force=False): pass def get_distribution(dist): return Distribution(dist) _namespace_packages = {} """ ) def register(manager: AstroidManager) -> None: register_module_extender(manager, "pkg_resources", pkg_resources_transform) astroid-3.2.2/astroid/brain/brain_collections.py0000664000175000017500000001061214622475517021711 0ustar epsilonepsilon# Licensed under the LGPL: https://www.gnu.org/licenses/old-licenses/lgpl-2.1.en.html # For details: https://github.com/pylint-dev/astroid/blob/main/LICENSE # Copyright (c) https://github.com/pylint-dev/astroid/blob/main/CONTRIBUTORS.txt from __future__ import annotations from astroid.brain.helpers import register_module_extender from astroid.builder import extract_node, parse from astroid.const import PY39_PLUS from astroid.context import InferenceContext from astroid.exceptions import AttributeInferenceError from astroid.manager import AstroidManager from astroid.nodes.scoped_nodes import ClassDef def _collections_transform(): return parse( """ class defaultdict(dict): default_factory = None def __missing__(self, key): pass def __getitem__(self, key): return default_factory """ + _deque_mock() + _ordered_dict_mock() ) def _deque_mock(): base_deque_class = """ class deque(object): maxlen = 0 def __init__(self, iterable=None, maxlen=None): self.iterable = iterable or [] def append(self, x): pass def appendleft(self, x): pass def 
clear(self): pass def count(self, x): return 0 def extend(self, iterable): pass def extendleft(self, iterable): pass def pop(self): return self.iterable[0] def popleft(self): return self.iterable[0] def remove(self, value): pass def reverse(self): return reversed(self.iterable) def rotate(self, n=1): return self def __iter__(self): return self def __reversed__(self): return self.iterable[::-1] def __getitem__(self, index): return self.iterable[index] def __setitem__(self, index, value): pass def __delitem__(self, index): pass def __bool__(self): return bool(self.iterable) def __nonzero__(self): return bool(self.iterable) def __contains__(self, o): return o in self.iterable def __len__(self): return len(self.iterable) def __copy__(self): return deque(self.iterable) def copy(self): return deque(self.iterable) def index(self, x, start=0, end=0): return 0 def insert(self, i, x): pass def __add__(self, other): pass def __iadd__(self, other): pass def __mul__(self, other): pass def __imul__(self, other): pass def __rmul__(self, other): pass""" if PY39_PLUS: base_deque_class += """ @classmethod def __class_getitem__(self, item): return cls""" return base_deque_class def _ordered_dict_mock(): base_ordered_dict_class = """ class OrderedDict(dict): def __reversed__(self): return self[::-1] def move_to_end(self, key, last=False): pass""" if PY39_PLUS: base_ordered_dict_class += """ @classmethod def __class_getitem__(cls, item): return cls""" return base_ordered_dict_class def _looks_like_subscriptable(node: ClassDef) -> bool: """ Returns True if the node corresponds to a ClassDef of the Collections.abc module that supports subscripting. 
:param node: ClassDef node """ if node.qname().startswith("_collections") or node.qname().startswith( "collections" ): try: node.getattr("__class_getitem__") return True except AttributeInferenceError: pass return False CLASS_GET_ITEM_TEMPLATE = """ @classmethod def __class_getitem__(cls, item): return cls """ def easy_class_getitem_inference(node, context: InferenceContext | None = None): # Here __class_getitem__ exists but is quite a mess to infer thus # put an easy inference tip func_to_add = extract_node(CLASS_GET_ITEM_TEMPLATE) node.locals["__class_getitem__"] = [func_to_add] def register(manager: AstroidManager) -> None: register_module_extender(manager, "collections", _collections_transform) if PY39_PLUS: # Starting with Python39 some objects of the collection module are subscriptable # thanks to the __class_getitem__ method but the way it is implemented in # _collection_abc makes it difficult to infer. (We would have to handle AssignName inference in the # getitem method of the ClassDef class) Instead we put here a mock of the __class_getitem__ method manager.register_transform( ClassDef, easy_class_getitem_inference, _looks_like_subscriptable ) astroid-3.2.2/astroid/brain/brain_hypothesis.py0000664000175000017500000000341014622475517021570 0ustar epsilonepsilon# Licensed under the LGPL: https://www.gnu.org/licenses/old-licenses/lgpl-2.1.en.html # For details: https://github.com/pylint-dev/astroid/blob/main/LICENSE # Copyright (c) https://github.com/pylint-dev/astroid/blob/main/CONTRIBUTORS.txt """ Astroid hook for the Hypothesis library. Without this hook pylint reports no-value-for-parameter for use of strategies defined using the `@hypothesis.strategies.composite` decorator. 
For example: from hypothesis import strategies as st @st.composite def a_strategy(draw): return draw(st.integers()) a_strategy() """ from astroid.manager import AstroidManager from astroid.nodes.scoped_nodes import FunctionDef COMPOSITE_NAMES = ( "composite", "st.composite", "strategies.composite", "hypothesis.strategies.composite", ) def is_decorated_with_st_composite(node) -> bool: """Return whether a decorated node has @st.composite applied.""" if node.decorators and node.args.args and node.args.args[0].name == "draw": for decorator_attribute in node.decorators.nodes: if decorator_attribute.as_string() in COMPOSITE_NAMES: return True return False def remove_draw_parameter_from_composite_strategy(node): """Given that the FunctionDef is decorated with @st.composite, remove the first argument (`draw`) - it's always supplied by Hypothesis so we don't need to emit the no-value-for-parameter lint. """ del node.args.args[0] del node.args.annotations[0] del node.args.type_comment_args[0] return node def register(manager: AstroidManager) -> None: manager.register_transform( node_class=FunctionDef, transform=remove_draw_parameter_from_composite_strategy, predicate=is_decorated_with_st_composite, ) astroid-3.2.2/astroid/brain/brain_builtin_inference.py0000664000175000017500000011107714622475517023066 0ustar epsilonepsilon# Licensed under the LGPL: https://www.gnu.org/licenses/old-licenses/lgpl-2.1.en.html # For details: https://github.com/pylint-dev/astroid/blob/main/LICENSE # Copyright (c) https://github.com/pylint-dev/astroid/blob/main/CONTRIBUTORS.txt """Astroid hooks for various builtins.""" from __future__ import annotations import itertools from collections.abc import Callable, Iterable from functools import partial from typing import TYPE_CHECKING, Any, Iterator, NoReturn, Type, Union, cast from astroid import arguments, helpers, inference_tip, nodes, objects, util from astroid.builder import AstroidBuilder from astroid.context import InferenceContext from 
astroid.exceptions import ( AstroidTypeError, AttributeInferenceError, InferenceError, MroError, UseInferenceDefault, ) from astroid.manager import AstroidManager from astroid.nodes import scoped_nodes from astroid.typing import ( ConstFactoryResult, InferenceResult, SuccessfulInferenceResult, ) if TYPE_CHECKING: from astroid.bases import Instance ContainerObjects = Union[ objects.FrozenSet, objects.DictItems, objects.DictKeys, objects.DictValues, ] BuiltContainers = Union[ Type[tuple], Type[list], Type[set], Type[frozenset], ] CopyResult = Union[ nodes.Dict, nodes.List, nodes.Set, objects.FrozenSet, ] OBJECT_DUNDER_NEW = "object.__new__" STR_CLASS = """ class whatever(object): def join(self, iterable): return {rvalue} def replace(self, old, new, count=None): return {rvalue} def format(self, *args, **kwargs): return {rvalue} def encode(self, encoding='ascii', errors=None): return b'' def decode(self, encoding='ascii', errors=None): return u'' def capitalize(self): return {rvalue} def title(self): return {rvalue} def lower(self): return {rvalue} def upper(self): return {rvalue} def swapcase(self): return {rvalue} def index(self, sub, start=None, end=None): return 0 def find(self, sub, start=None, end=None): return 0 def count(self, sub, start=None, end=None): return 0 def strip(self, chars=None): return {rvalue} def lstrip(self, chars=None): return {rvalue} def rstrip(self, chars=None): return {rvalue} def rjust(self, width, fillchar=None): return {rvalue} def center(self, width, fillchar=None): return {rvalue} def ljust(self, width, fillchar=None): return {rvalue} """ BYTES_CLASS = """ class whatever(object): def join(self, iterable): return {rvalue} def replace(self, old, new, count=None): return {rvalue} def decode(self, encoding='ascii', errors=None): return u'' def capitalize(self): return {rvalue} def title(self): return {rvalue} def lower(self): return {rvalue} def upper(self): return {rvalue} def swapcase(self): return {rvalue} def index(self, sub, 
start=None, end=None): return 0 def find(self, sub, start=None, end=None): return 0 def count(self, sub, start=None, end=None): return 0 def strip(self, chars=None): return {rvalue} def lstrip(self, chars=None): return {rvalue} def rstrip(self, chars=None): return {rvalue} def rjust(self, width, fillchar=None): return {rvalue} def center(self, width, fillchar=None): return {rvalue} def ljust(self, width, fillchar=None): return {rvalue} """ def _use_default() -> NoReturn: # pragma: no cover raise UseInferenceDefault() def _extend_string_class(class_node, code, rvalue): """Function to extend builtin str/unicode class.""" code = code.format(rvalue=rvalue) fake = AstroidBuilder(AstroidManager()).string_build(code)["whatever"] for method in fake.mymethods(): method.parent = class_node method.lineno = None method.col_offset = None if "__class__" in method.locals: method.locals["__class__"] = [class_node] class_node.locals[method.name] = [method] method.parent = class_node def _extend_builtins(class_transforms): builtin_ast = AstroidManager().builtins_module for class_name, transform in class_transforms.items(): transform(builtin_ast[class_name]) def on_bootstrap(): """Called by astroid_bootstrapping().""" _extend_builtins( { "bytes": partial(_extend_string_class, code=BYTES_CLASS, rvalue="b''"), "str": partial(_extend_string_class, code=STR_CLASS, rvalue="''"), } ) def _builtin_filter_predicate(node, builtin_name) -> bool: if ( builtin_name == "type" and node.root().name == "re" and isinstance(node.func, nodes.Name) and node.func.name == "type" and isinstance(node.parent, nodes.Assign) and len(node.parent.targets) == 1 and isinstance(node.parent.targets[0], nodes.AssignName) and node.parent.targets[0].name in {"Pattern", "Match"} ): # Handle re.Pattern and re.Match in brain_re # Match these patterns from stdlib/re.py # ```py # Pattern = type(...) # Match = type(...) 
# ``` return False if isinstance(node.func, nodes.Name) and node.func.name == builtin_name: return True if isinstance(node.func, nodes.Attribute): return ( node.func.attrname == "fromkeys" and isinstance(node.func.expr, nodes.Name) and node.func.expr.name == "dict" ) return False def register_builtin_transform( manager: AstroidManager, transform, builtin_name ) -> None: """Register a new transform function for the given *builtin_name*. The transform function must accept two parameters, a node and an optional context. """ def _transform_wrapper( node: nodes.Call, context: InferenceContext | None = None, **kwargs: Any ) -> Iterator: result = transform(node, context=context) if result: if not result.parent: # Let the transformation function determine # the parent for its result. Otherwise, # we set it to be the node we transformed from. result.parent = node if result.lineno is None: result.lineno = node.lineno # Can be a 'Module' see https://github.com/pylint-dev/pylint/issues/4671 # We don't have a regression test on this one: tread carefully if hasattr(result, "col_offset") and result.col_offset is None: result.col_offset = node.col_offset return iter([result]) manager.register_transform( nodes.Call, inference_tip(_transform_wrapper), partial(_builtin_filter_predicate, builtin_name=builtin_name), ) def _container_generic_inference( node: nodes.Call, context: InferenceContext | None, node_type: type[nodes.BaseContainer], transform: Callable[[SuccessfulInferenceResult], nodes.BaseContainer | None], ) -> nodes.BaseContainer: args = node.args if not args: return node_type( lineno=node.lineno, col_offset=node.col_offset, parent=node.parent, end_lineno=node.end_lineno, end_col_offset=node.end_col_offset, ) if len(node.args) > 1: raise UseInferenceDefault() (arg,) = args transformed = transform(arg) if not transformed: try: inferred = next(arg.infer(context=context)) except (InferenceError, StopIteration) as exc: raise UseInferenceDefault from exc if isinstance(inferred, 
util.UninferableBase): raise UseInferenceDefault transformed = transform(inferred) if not transformed or isinstance(transformed, util.UninferableBase): raise UseInferenceDefault return transformed def _container_generic_transform( arg: SuccessfulInferenceResult, context: InferenceContext | None, klass: type[nodes.BaseContainer], iterables: tuple[type[nodes.BaseContainer] | type[ContainerObjects], ...], build_elts: BuiltContainers, ) -> nodes.BaseContainer | None: elts: Iterable | str | bytes if isinstance(arg, klass): return arg if isinstance(arg, iterables): arg = cast(Union[nodes.BaseContainer, ContainerObjects], arg) if all(isinstance(elt, nodes.Const) for elt in arg.elts): elts = [cast(nodes.Const, elt).value for elt in arg.elts] else: # TODO: Does not handle deduplication for sets. elts = [] for element in arg.elts: if not element: continue inferred = util.safe_infer(element, context=context) if inferred: evaluated_object = nodes.EvaluatedObject( original=element, value=inferred ) elts.append(evaluated_object) elif isinstance(arg, nodes.Dict): # Dicts need to have consts as strings already. 
elts = [ item[0].value if isinstance(item[0], nodes.Const) else _use_default() for item in arg.items ] elif isinstance(arg, nodes.Const) and isinstance(arg.value, (str, bytes)): elts = arg.value else: return None return klass.from_elements(elts=build_elts(elts)) def _infer_builtin_container( node: nodes.Call, context: InferenceContext | None, klass: type[nodes.BaseContainer], iterables: tuple[type[nodes.NodeNG] | type[ContainerObjects], ...], build_elts: BuiltContainers, ) -> nodes.BaseContainer: transform_func = partial( _container_generic_transform, context=context, klass=klass, iterables=iterables, build_elts=build_elts, ) return _container_generic_inference(node, context, klass, transform_func) # pylint: disable=invalid-name infer_tuple = partial( _infer_builtin_container, klass=nodes.Tuple, iterables=( nodes.List, nodes.Set, objects.FrozenSet, objects.DictItems, objects.DictKeys, objects.DictValues, ), build_elts=tuple, ) infer_list = partial( _infer_builtin_container, klass=nodes.List, iterables=( nodes.Tuple, nodes.Set, objects.FrozenSet, objects.DictItems, objects.DictKeys, objects.DictValues, ), build_elts=list, ) infer_set = partial( _infer_builtin_container, klass=nodes.Set, iterables=(nodes.List, nodes.Tuple, objects.FrozenSet, objects.DictKeys), build_elts=set, ) infer_frozenset = partial( _infer_builtin_container, klass=objects.FrozenSet, iterables=(nodes.List, nodes.Tuple, nodes.Set, objects.FrozenSet, objects.DictKeys), build_elts=frozenset, ) def _get_elts(arg, context): def is_iterable(n): return isinstance(n, (nodes.List, nodes.Tuple, nodes.Set)) try: inferred = next(arg.infer(context)) except (InferenceError, StopIteration) as exc: raise UseInferenceDefault from exc if isinstance(inferred, nodes.Dict): items = inferred.items elif is_iterable(inferred): items = [] for elt in inferred.elts: # If an item is not a pair of two items, # then fallback to the default inference. # Also, take in consideration only hashable items, # tuples and consts. 
We are choosing Names as well. if not is_iterable(elt): raise UseInferenceDefault() if len(elt.elts) != 2: raise UseInferenceDefault() if not isinstance(elt.elts[0], (nodes.Tuple, nodes.Const, nodes.Name)): raise UseInferenceDefault() items.append(tuple(elt.elts)) else: raise UseInferenceDefault() return items def infer_dict(node: nodes.Call, context: InferenceContext | None = None) -> nodes.Dict: """Try to infer a dict call to a Dict node. The function treats the following cases: * dict() * dict(mapping) * dict(iterable) * dict(iterable, **kwargs) * dict(mapping, **kwargs) * dict(**kwargs) If a case can't be inferred, we'll fallback to default inference. """ call = arguments.CallSite.from_call(node, context=context) if call.has_invalid_arguments() or call.has_invalid_keywords(): raise UseInferenceDefault args = call.positional_arguments kwargs = list(call.keyword_arguments.items()) items: list[tuple[InferenceResult, InferenceResult]] if not args and not kwargs: # dict() return nodes.Dict( lineno=node.lineno, col_offset=node.col_offset, parent=node.parent, end_lineno=node.end_lineno, end_col_offset=node.end_col_offset, ) if kwargs and not args: # dict(a=1, b=2, c=4) items = [(nodes.Const(key), value) for key, value in kwargs] elif len(args) == 1 and kwargs: # dict(some_iterable, b=2, c=4) elts = _get_elts(args[0], context) keys = [(nodes.Const(key), value) for key, value in kwargs] items = elts + keys elif len(args) == 1: items = _get_elts(args[0], context) else: raise UseInferenceDefault() value = nodes.Dict( col_offset=node.col_offset, lineno=node.lineno, parent=node.parent, end_lineno=node.end_lineno, end_col_offset=node.end_col_offset, ) value.postinit(items) return value def infer_super( node: nodes.Call, context: InferenceContext | None = None ) -> objects.Super: """Understand super calls. There are some restrictions for what can be understood: * unbounded super (one argument form) is not understood. 
* if the super call is not inside a function (classmethod or method), then the default inference will be used. * if the super arguments can't be inferred, the default inference will be used. """ if len(node.args) == 1: # Ignore unbounded super. raise UseInferenceDefault scope = node.scope() if not isinstance(scope, nodes.FunctionDef): # Ignore non-method uses of super. raise UseInferenceDefault if scope.type not in ("classmethod", "method"): # Not interested in staticmethods. raise UseInferenceDefault cls = scoped_nodes.get_wrapping_class(scope) assert cls is not None if not node.args: mro_pointer = cls # In we are in a classmethod, the interpreter will fill # automatically the class as the second argument, not an instance. if scope.type == "classmethod": mro_type = cls else: mro_type = cls.instantiate_class() else: try: mro_pointer = next(node.args[0].infer(context=context)) except (InferenceError, StopIteration) as exc: raise UseInferenceDefault from exc try: mro_type = next(node.args[1].infer(context=context)) except (InferenceError, StopIteration) as exc: raise UseInferenceDefault from exc if isinstance(mro_pointer, util.UninferableBase) or isinstance( mro_type, util.UninferableBase ): # No way we could understand this. raise UseInferenceDefault super_obj = objects.Super( mro_pointer=mro_pointer, mro_type=mro_type, self_class=cls, scope=scope, call=node, ) super_obj.parent = node return super_obj def _infer_getattr_args(node, context): if len(node.args) not in (2, 3): # Not a valid getattr call. raise UseInferenceDefault try: obj = next(node.args[0].infer(context=context)) attr = next(node.args[1].infer(context=context)) except (InferenceError, StopIteration) as exc: raise UseInferenceDefault from exc if isinstance(obj, util.UninferableBase) or isinstance(attr, util.UninferableBase): # If one of the arguments is something we can't infer, # then also make the result of the getattr call something # which is unknown. 
return util.Uninferable, util.Uninferable is_string = isinstance(attr, nodes.Const) and isinstance(attr.value, str) if not is_string: raise UseInferenceDefault return obj, attr.value def infer_getattr(node, context: InferenceContext | None = None): """Understand getattr calls. If one of the arguments is an Uninferable object, then the result will be an Uninferable object. Otherwise, the normal attribute lookup will be done. """ obj, attr = _infer_getattr_args(node, context) if ( isinstance(obj, util.UninferableBase) or isinstance(attr, util.UninferableBase) or not hasattr(obj, "igetattr") ): return util.Uninferable try: return next(obj.igetattr(attr, context=context)) except (StopIteration, InferenceError, AttributeInferenceError): if len(node.args) == 3: # Try to infer the default and return it instead. try: return next(node.args[2].infer(context=context)) except (StopIteration, InferenceError) as exc: raise UseInferenceDefault from exc raise UseInferenceDefault def infer_hasattr(node, context: InferenceContext | None = None): """Understand hasattr calls. This always guarantees three possible outcomes for calling hasattr: Const(False) when we are sure that the object doesn't have the intended attribute, Const(True) when we know that the object has the attribute and Uninferable when we are unsure of the outcome of the function call. """ try: obj, attr = _infer_getattr_args(node, context) if ( isinstance(obj, util.UninferableBase) or isinstance(attr, util.UninferableBase) or not hasattr(obj, "getattr") ): return util.Uninferable obj.getattr(attr, context=context) except UseInferenceDefault: # Can't infer something from this function call. return util.Uninferable except AttributeInferenceError: # Doesn't have it. return nodes.Const(False) return nodes.Const(True) def infer_callable(node, context: InferenceContext | None = None): """Understand callable calls. 
This follows Python's semantics, where an object is callable if it provides an attribute __call__, even though that attribute is something which can't be called. """ if len(node.args) != 1: # Invalid callable call. raise UseInferenceDefault argument = node.args[0] try: inferred = next(argument.infer(context=context)) except (InferenceError, StopIteration): return util.Uninferable if isinstance(inferred, util.UninferableBase): return util.Uninferable return nodes.Const(inferred.callable()) def infer_property( node: nodes.Call, context: InferenceContext | None = None ) -> objects.Property: """Understand `property` class. This only infers the output of `property` call, not the arguments themselves. """ if len(node.args) < 1: # Invalid property call. raise UseInferenceDefault getter = node.args[0] try: inferred = next(getter.infer(context=context)) except (InferenceError, StopIteration) as exc: raise UseInferenceDefault from exc if not isinstance(inferred, (nodes.FunctionDef, nodes.Lambda)): raise UseInferenceDefault prop_func = objects.Property( function=inferred, name=inferred.name, lineno=node.lineno, col_offset=node.col_offset, ) # Set parent outside __init__: https://github.com/pylint-dev/astroid/issues/1490 prop_func.parent = node prop_func.postinit( body=[], args=inferred.args, doc_node=getattr(inferred, "doc_node", None), ) return prop_func def infer_bool(node, context: InferenceContext | None = None): """Understand bool calls.""" if len(node.args) > 1: # Invalid bool call. 
raise UseInferenceDefault if not node.args: return nodes.Const(False) argument = node.args[0] try: inferred = next(argument.infer(context=context)) except (InferenceError, StopIteration): return util.Uninferable if isinstance(inferred, util.UninferableBase): return util.Uninferable bool_value = inferred.bool_value(context=context) if isinstance(bool_value, util.UninferableBase): return util.Uninferable return nodes.Const(bool_value) def infer_type(node, context: InferenceContext | None = None): """Understand the one-argument form of *type*.""" if len(node.args) != 1: raise UseInferenceDefault return helpers.object_type(node.args[0], context) def infer_slice(node, context: InferenceContext | None = None): """Understand `slice` calls.""" args = node.args if not 0 < len(args) <= 3: raise UseInferenceDefault infer_func = partial(util.safe_infer, context=context) args = [infer_func(arg) for arg in args] for arg in args: if not arg or isinstance(arg, util.UninferableBase): raise UseInferenceDefault if not isinstance(arg, nodes.Const): raise UseInferenceDefault if not isinstance(arg.value, (type(None), int)): raise UseInferenceDefault if len(args) < 3: # Make sure we have 3 arguments. args.extend([None] * (3 - len(args))) slice_node = nodes.Slice( lineno=node.lineno, col_offset=node.col_offset, parent=node.parent, end_lineno=node.end_lineno, end_col_offset=node.end_col_offset, ) slice_node.postinit(*args) return slice_node def _infer_object__new__decorator( node: nodes.ClassDef, context: InferenceContext | None = None, **kwargs: Any ) -> Iterator[Instance]: # Instantiate class immediately # since that's what @object.__new__ does return iter((node.instantiate_class(),)) def _infer_object__new__decorator_check(node) -> bool: """Predicate before inference_tip. 
Check if the given ClassDef has an @object.__new__ decorator """ if not node.decorators: return False for decorator in node.decorators.nodes: if isinstance(decorator, nodes.Attribute): if decorator.as_string() == OBJECT_DUNDER_NEW: return True return False def infer_issubclass(callnode, context: InferenceContext | None = None): """Infer issubclass() calls. :param nodes.Call callnode: an `issubclass` call :param InferenceContext context: the context for the inference :rtype nodes.Const: Boolean Const value of the `issubclass` call :raises UseInferenceDefault: If the node cannot be inferred """ call = arguments.CallSite.from_call(callnode, context=context) if call.keyword_arguments: # issubclass doesn't support keyword arguments raise UseInferenceDefault("TypeError: issubclass() takes no keyword arguments") if len(call.positional_arguments) != 2: raise UseInferenceDefault( f"Expected two arguments, got {len(call.positional_arguments)}" ) # The left hand argument is the obj to be checked obj_node, class_or_tuple_node = call.positional_arguments try: obj_type = next(obj_node.infer(context=context)) except (InferenceError, StopIteration) as exc: raise UseInferenceDefault from exc if not isinstance(obj_type, nodes.ClassDef): raise UseInferenceDefault("TypeError: arg 1 must be class") # The right hand argument is the class(es) that the given # object is to be checked against. try: class_container = _class_or_tuple_to_container( class_or_tuple_node, context=context ) except InferenceError as exc: raise UseInferenceDefault from exc try: issubclass_bool = helpers.object_issubclass(obj_type, class_container, context) except AstroidTypeError as exc: raise UseInferenceDefault("TypeError: " + str(exc)) from exc except MroError as exc: raise UseInferenceDefault from exc return nodes.Const(issubclass_bool) def infer_isinstance( callnode: nodes.Call, context: InferenceContext | None = None ) -> nodes.Const: """Infer isinstance calls. 
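As a plain-Python reminder of the semantics being inferred (illustrative only):

```python
# isinstance() accepts a class or a tuple of classes and returns a
# bool, which this inference models as a Const node.
assert isinstance(1, int)
assert isinstance("a", (int, str))
assert not isinstance([], (int, str))
```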
:param nodes.Call callnode: an isinstance call :raises UseInferenceDefault: If the node cannot be inferred """ call = arguments.CallSite.from_call(callnode, context=context) if call.keyword_arguments: # isinstance doesn't support keyword arguments raise UseInferenceDefault("TypeError: isinstance() takes no keyword arguments") if len(call.positional_arguments) != 2: raise UseInferenceDefault( f"Expected two arguments, got {len(call.positional_arguments)}" ) # The left hand argument is the obj to be checked obj_node, class_or_tuple_node = call.positional_arguments # The right hand argument is the class(es) that the given # obj is to be checked against try: class_container = _class_or_tuple_to_container( class_or_tuple_node, context=context ) except InferenceError as exc: raise UseInferenceDefault from exc try: isinstance_bool = helpers.object_isinstance(obj_node, class_container, context) except AstroidTypeError as exc: raise UseInferenceDefault("TypeError: " + str(exc)) from exc except MroError as exc: raise UseInferenceDefault from exc if isinstance(isinstance_bool, util.UninferableBase): raise UseInferenceDefault return nodes.Const(isinstance_bool) def _class_or_tuple_to_container( node: InferenceResult, context: InferenceContext | None = None ) -> list[InferenceResult]: # Move inference results into a container # to simplify later logic # raises InferenceError if any of the inferences fall through try: node_infer = next(node.infer(context=context)) except StopIteration as e: raise InferenceError(node=node, context=context) from e # arg2 MUST be a type or a TUPLE of types # for isinstance if isinstance(node_infer, nodes.Tuple): try: class_container = [ next(node.infer(context=context)) for node in node_infer.elts ] except StopIteration as e: raise InferenceError(node=node, context=context) from e else: class_container = [node_infer] return class_container def infer_len(node, context: InferenceContext | None = None): """Infer length calls. 
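As a plain-Python reminder of the semantics being inferred (illustrative only):

```python
# len() succeeds only for sized objects; for unsized arguments it
# raises TypeError, which this inference surfaces as UseInferenceDefault.
assert len([1, 2, 3]) == 3
try:
    len(5)  # type: ignore[arg-type]
except TypeError:
    pass
```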
:param nodes.Call node: len call to infer :param context.InferenceContext: node context :rtype nodes.Const: a Const node with the inferred length, if possible """ call = arguments.CallSite.from_call(node, context=context) if call.keyword_arguments: raise UseInferenceDefault("TypeError: len() must take no keyword arguments") if len(call.positional_arguments) != 1: raise UseInferenceDefault( "TypeError: len() must take exactly one argument " "({len}) given".format(len=len(call.positional_arguments)) ) [argument_node] = call.positional_arguments try: return nodes.Const(helpers.object_len(argument_node, context=context)) except (AstroidTypeError, InferenceError) as exc: raise UseInferenceDefault(str(exc)) from exc def infer_str(node, context: InferenceContext | None = None): """Infer str() calls. :param nodes.Call node: str() call to infer :param context.InferenceContext: node context :rtype nodes.Const: a Const containing an empty string """ call = arguments.CallSite.from_call(node, context=context) if call.keyword_arguments: raise UseInferenceDefault("TypeError: str() must take no keyword arguments") try: return nodes.Const("") except (AstroidTypeError, InferenceError) as exc: raise UseInferenceDefault(str(exc)) from exc def infer_int(node, context: InferenceContext | None = None): """Infer int() calls. 
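As a plain-Python reminder of the conversion being approximated (illustrative only):

```python
# int() of an int or a numeric string converts; with no argument it is 0.
assert int("42") == 42
assert int() == 0
# Where the real int() raises ValueError, this inference falls back to
# Const(0) rather than raising.
try:
    int("abc")
except ValueError:
    pass
```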
:param nodes.Call node: int() call to infer :param context.InferenceContext: node context :rtype nodes.Const: a Const containing the integer value of the int() call """ call = arguments.CallSite.from_call(node, context=context) if call.keyword_arguments: raise UseInferenceDefault("TypeError: int() must take no keyword arguments") if call.positional_arguments: try: first_value = next(call.positional_arguments[0].infer(context=context)) except (InferenceError, StopIteration) as exc: raise UseInferenceDefault(str(exc)) from exc if isinstance(first_value, util.UninferableBase): raise UseInferenceDefault if isinstance(first_value, nodes.Const) and isinstance( first_value.value, (int, str) ): try: actual_value = int(first_value.value) except ValueError: return nodes.Const(0) return nodes.Const(actual_value) return nodes.Const(0) def infer_dict_fromkeys(node, context: InferenceContext | None = None): """Infer dict.fromkeys. :param nodes.Call node: dict.fromkeys() call to infer :param context.InferenceContext context: node context :rtype nodes.Dict: a Dictionary containing the values that astroid was able to infer. In case the inference failed for any reason, an empty dictionary will be inferred instead. 
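For orientation, the runtime behaviour mirrored here (plain Python, illustrative only):

```python
# dict.fromkeys() maps every key to the same default value (None if
# omitted); keys may come from any iterable, including str and dict.
assert dict.fromkeys(["a", "b"]) == {"a": None, "b": None}
assert dict.fromkeys("ab", 0) == {"a": 0, "b": 0}
assert dict.fromkeys({"x": 1, "y": 2}) == {"x": None, "y": None}
```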
""" def _build_dict_with_elements(elements): new_node = nodes.Dict( col_offset=node.col_offset, lineno=node.lineno, parent=node.parent, end_lineno=node.end_lineno, end_col_offset=node.end_col_offset, ) new_node.postinit(elements) return new_node call = arguments.CallSite.from_call(node, context=context) if call.keyword_arguments: raise UseInferenceDefault("TypeError: int() must take no keyword arguments") if len(call.positional_arguments) not in {1, 2}: raise UseInferenceDefault( "TypeError: Needs between 1 and 2 positional arguments" ) default = nodes.Const(None) values = call.positional_arguments[0] try: inferred_values = next(values.infer(context=context)) except (InferenceError, StopIteration): return _build_dict_with_elements([]) if inferred_values is util.Uninferable: return _build_dict_with_elements([]) # Limit to a couple of potential values, as this can become pretty complicated accepted_iterable_elements = (nodes.Const,) if isinstance(inferred_values, (nodes.List, nodes.Set, nodes.Tuple)): elements = inferred_values.elts for element in elements: if not isinstance(element, accepted_iterable_elements): # Fallback to an empty dict return _build_dict_with_elements([]) elements_with_value = [(element, default) for element in elements] return _build_dict_with_elements(elements_with_value) if isinstance(inferred_values, nodes.Const) and isinstance( inferred_values.value, (str, bytes) ): elements_with_value = [ (nodes.Const(element), default) for element in inferred_values.value ] return _build_dict_with_elements(elements_with_value) if isinstance(inferred_values, nodes.Dict): keys = inferred_values.itered() for key in keys: if not isinstance(key, accepted_iterable_elements): # Fallback to an empty dict return _build_dict_with_elements([]) elements_with_value = [(element, default) for element in keys] return _build_dict_with_elements(elements_with_value) # Fallback to an empty dictionary return _build_dict_with_elements([]) def _infer_copy_method( node: 
nodes.Call, context: InferenceContext | None = None, **kwargs: Any ) -> Iterator[CopyResult]: assert isinstance(node.func, nodes.Attribute) inferred_orig, inferred_copy = itertools.tee(node.func.expr.infer(context=context)) if all( isinstance( inferred_node, (nodes.Dict, nodes.List, nodes.Set, objects.FrozenSet) ) for inferred_node in inferred_orig ): return cast(Iterator[CopyResult], inferred_copy) raise UseInferenceDefault def _is_str_format_call(node: nodes.Call) -> bool: """Catch calls to str.format().""" if not isinstance(node.func, nodes.Attribute) or not node.func.attrname == "format": return False if isinstance(node.func.expr, nodes.Name): value = util.safe_infer(node.func.expr) else: value = node.func.expr return isinstance(value, nodes.Const) and isinstance(value.value, str) def _infer_str_format_call( node: nodes.Call, context: InferenceContext | None = None, **kwargs: Any ) -> Iterator[ConstFactoryResult | util.UninferableBase]: """Return a Const node based on the template and passed arguments.""" call = arguments.CallSite.from_call(node, context=context) assert isinstance(node.func, (nodes.Attribute, nodes.AssignAttr, nodes.DelAttr)) value: nodes.Const if isinstance(node.func.expr, nodes.Name): if not (inferred := util.safe_infer(node.func.expr)) or not isinstance( inferred, nodes.Const ): return iter([util.Uninferable]) value = inferred elif isinstance(node.func.expr, nodes.Const): value = node.func.expr else: # pragma: no cover return iter([util.Uninferable]) format_template = value.value # Get the positional arguments passed inferred_positional: list[nodes.Const] = [] for i in call.positional_arguments: one_inferred = util.safe_infer(i, context) if not isinstance(one_inferred, nodes.Const): return iter([util.Uninferable]) inferred_positional.append(one_inferred) pos_values: list[str] = [i.value for i in inferred_positional] # Get the keyword arguments passed inferred_keyword: dict[str, nodes.Const] = {} for k, v in call.keyword_arguments.items(): 
one_inferred = util.safe_infer(v, context) if not isinstance(one_inferred, nodes.Const): return iter([util.Uninferable]) inferred_keyword[k] = one_inferred keyword_values: dict[str, str] = {k: v.value for k, v in inferred_keyword.items()} try: formatted_string = format_template.format(*pos_values, **keyword_values) except (AttributeError, IndexError, KeyError, TypeError, ValueError): # AttributeError: named field in format string was not found in the arguments # IndexError: there are too few arguments to interpolate # TypeError: Unsupported format string # ValueError: Unknown format code return iter([util.Uninferable]) return iter([nodes.const_factory(formatted_string)]) def register(manager: AstroidManager) -> None: # Builtins inference register_builtin_transform(manager, infer_bool, "bool") register_builtin_transform(manager, infer_super, "super") register_builtin_transform(manager, infer_callable, "callable") register_builtin_transform(manager, infer_property, "property") register_builtin_transform(manager, infer_getattr, "getattr") register_builtin_transform(manager, infer_hasattr, "hasattr") register_builtin_transform(manager, infer_tuple, "tuple") register_builtin_transform(manager, infer_set, "set") register_builtin_transform(manager, infer_list, "list") register_builtin_transform(manager, infer_dict, "dict") register_builtin_transform(manager, infer_frozenset, "frozenset") register_builtin_transform(manager, infer_type, "type") register_builtin_transform(manager, infer_slice, "slice") register_builtin_transform(manager, infer_isinstance, "isinstance") register_builtin_transform(manager, infer_issubclass, "issubclass") register_builtin_transform(manager, infer_len, "len") register_builtin_transform(manager, infer_str, "str") register_builtin_transform(manager, infer_int, "int") register_builtin_transform(manager, infer_dict_fromkeys, "dict.fromkeys") # Infer object.__new__ calls manager.register_transform( nodes.ClassDef, 
inference_tip(_infer_object__new__decorator), _infer_object__new__decorator_check, ) manager.register_transform( nodes.Call, inference_tip(_infer_copy_method), lambda node: isinstance(node.func, nodes.Attribute) and node.func.attrname == "copy", ) manager.register_transform( nodes.Call, inference_tip(_infer_str_format_call), _is_str_format_call, ) astroid-3.2.2/astroid/brain/brain_signal.py0000664000175000017500000000753414622475517020661 0ustar epsilonepsilon# Licensed under the LGPL: https://www.gnu.org/licenses/old-licenses/lgpl-2.1.en.html # For details: https://github.com/pylint-dev/astroid/blob/main/LICENSE # Copyright (c) https://github.com/pylint-dev/astroid/blob/main/CONTRIBUTORS.txt """Astroid hooks for the signal library. The signal module generates the 'Signals', 'Handlers' and 'Sigmasks' IntEnums dynamically using the IntEnum._convert() classmethod, which modifies the module globals. Astroid is unable to handle this type of code. Without these hooks, the following are erroneously triggered by Pylint: * E1101: Module 'signal' has no 'Signals' member (no-member) * E1101: Module 'signal' has no 'Handlers' member (no-member) * E1101: Module 'signal' has no 'Sigmasks' member (no-member) These enums are defined slightly differently depending on the user's operating system and platform. These platform differences should follow the current Python typeshed stdlib `signal.pyi` stub file, available at: * https://github.com/python/typeshed/blob/master/stdlib/signal.pyi Note that the enum.auto() values defined here for the Signals, Handlers and Sigmasks IntEnums are just dummy integer values, and do not correspond to the actual standard signal numbers - which may vary depending on the system. 
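For illustration, the dynamically created members these hooks make visible (plain stdlib usage):

```python
import signal

# These members are generated at runtime via IntEnum._convert(), which
# is why astroid needs hand-written stubs to see them.
assert isinstance(signal.Signals.SIGINT, signal.Signals)
assert signal.Handlers.SIG_DFL in signal.Handlers
```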
""" import sys from astroid.brain.helpers import register_module_extender from astroid.builder import parse from astroid.manager import AstroidManager def _signals_enums_transform(): """Generates the AST for 'Signals', 'Handlers' and 'Sigmasks' IntEnums.""" return parse(_signals_enum() + _handlers_enum() + _sigmasks_enum()) def _signals_enum() -> str: """Generates the source code for the Signals int enum.""" signals_enum = """ import enum class Signals(enum.IntEnum): SIGABRT = enum.auto() SIGEMT = enum.auto() SIGFPE = enum.auto() SIGILL = enum.auto() SIGINFO = enum.auto() SIGINT = enum.auto() SIGSEGV = enum.auto() SIGTERM = enum.auto() """ if sys.platform != "win32": signals_enum += """ SIGALRM = enum.auto() SIGBUS = enum.auto() SIGCHLD = enum.auto() SIGCONT = enum.auto() SIGHUP = enum.auto() SIGIO = enum.auto() SIGIOT = enum.auto() SIGKILL = enum.auto() SIGPIPE = enum.auto() SIGPROF = enum.auto() SIGQUIT = enum.auto() SIGSTOP = enum.auto() SIGSYS = enum.auto() SIGTRAP = enum.auto() SIGTSTP = enum.auto() SIGTTIN = enum.auto() SIGTTOU = enum.auto() SIGURG = enum.auto() SIGUSR1 = enum.auto() SIGUSR2 = enum.auto() SIGVTALRM = enum.auto() SIGWINCH = enum.auto() SIGXCPU = enum.auto() SIGXFSZ = enum.auto() """ if sys.platform == "win32": signals_enum += """ SIGBREAK = enum.auto() """ if sys.platform not in ("darwin", "win32"): signals_enum += """ SIGCLD = enum.auto() SIGPOLL = enum.auto() SIGPWR = enum.auto() SIGRTMAX = enum.auto() SIGRTMIN = enum.auto() """ return signals_enum def _handlers_enum() -> str: """Generates the source code for the Handlers int enum.""" return """ import enum class Handlers(enum.IntEnum): SIG_DFL = enum.auto() SIG_IGN = eunm.auto() """ def _sigmasks_enum() -> str: """Generates the source code for the Sigmasks int enum.""" if sys.platform != "win32": return """ import enum class Sigmasks(enum.IntEnum): SIG_BLOCK = enum.auto() SIG_UNBLOCK = enum.auto() SIG_SETMASK = enum.auto() """ return "" def register(manager: AstroidManager) -> None: 
register_module_extender(manager, "signal", _signals_enums_transform) astroid-3.2.2/astroid/brain/brain_random.py0000664000175000017500000000614214622475517020656 0ustar epsilonepsilon# Licensed under the LGPL: https://www.gnu.org/licenses/old-licenses/lgpl-2.1.en.html # For details: https://github.com/pylint-dev/astroid/blob/main/LICENSE # Copyright (c) https://github.com/pylint-dev/astroid/blob/main/CONTRIBUTORS.txt from __future__ import annotations import random from astroid.context import InferenceContext from astroid.exceptions import UseInferenceDefault from astroid.inference_tip import inference_tip from astroid.manager import AstroidManager from astroid.nodes.node_classes import ( Attribute, Call, Const, EvaluatedObject, List, Name, Set, Tuple, ) from astroid.util import safe_infer ACCEPTED_ITERABLES_FOR_SAMPLE = (List, Set, Tuple) def _clone_node_with_lineno(node, parent, lineno): if isinstance(node, EvaluatedObject): node = node.original cls = node.__class__ other_fields = node._other_fields _astroid_fields = node._astroid_fields init_params = { "lineno": lineno, "col_offset": node.col_offset, "parent": parent, "end_lineno": node.end_lineno, "end_col_offset": node.end_col_offset, } postinit_params = {param: getattr(node, param) for param in _astroid_fields} if other_fields: init_params.update({param: getattr(node, param) for param in other_fields}) new_node = cls(**init_params) if hasattr(node, "postinit") and _astroid_fields: new_node.postinit(**postinit_params) return new_node def infer_random_sample(node, context: InferenceContext | None = None): if len(node.args) != 2: raise UseInferenceDefault inferred_length = safe_infer(node.args[1], context=context) if not isinstance(inferred_length, Const): raise UseInferenceDefault if not isinstance(inferred_length.value, int): raise UseInferenceDefault inferred_sequence = safe_infer(node.args[0], context=context) if not inferred_sequence: raise UseInferenceDefault if not isinstance(inferred_sequence, 
ACCEPTED_ITERABLES_FOR_SAMPLE): raise UseInferenceDefault if inferred_length.value > len(inferred_sequence.elts): # In this case, this will raise a ValueError raise UseInferenceDefault try: elts = random.sample(inferred_sequence.elts, inferred_length.value) except ValueError as exc: raise UseInferenceDefault from exc new_node = List( lineno=node.lineno, col_offset=node.col_offset, parent=node.scope(), end_lineno=node.end_lineno, end_col_offset=node.end_col_offset, ) new_elts = [ _clone_node_with_lineno(elt, parent=new_node, lineno=new_node.lineno) for elt in elts ] new_node.postinit(new_elts) return iter((new_node,)) def _looks_like_random_sample(node) -> bool: func = node.func if isinstance(func, Attribute): return func.attrname == "sample" if isinstance(func, Name): return func.name == "sample" return False def register(manager: AstroidManager) -> None: manager.register_transform( Call, inference_tip(infer_random_sample), _looks_like_random_sample ) astroid-3.2.2/astroid/brain/brain_fstrings.py0000664000175000017500000000532114622475517021233 0ustar epsilonepsilon# Licensed under the LGPL: https://www.gnu.org/licenses/old-licenses/lgpl-2.1.en.html # For details: https://github.com/pylint-dev/astroid/blob/main/LICENSE # Copyright (c) https://github.com/pylint-dev/astroid/blob/main/CONTRIBUTORS.txt from __future__ import annotations import collections.abc from typing import TypeVar from astroid import nodes from astroid.manager import AstroidManager _NodeT = TypeVar("_NodeT", bound=nodes.NodeNG) def _clone_node_with_lineno( node: _NodeT, parent: nodes.NodeNG, lineno: int | None ) -> _NodeT: cls = node.__class__ other_fields = node._other_fields _astroid_fields = node._astroid_fields init_params = { "lineno": lineno, "col_offset": node.col_offset, "parent": parent, "end_lineno": node.end_lineno, "end_col_offset": node.end_col_offset, } postinit_params = {param: getattr(node, param) for param in _astroid_fields} if other_fields: init_params.update({param: 
getattr(node, param) for param in other_fields}) new_node = cls(**init_params) if hasattr(node, "postinit") and _astroid_fields: for param, child in postinit_params.items(): if child and not isinstance(child, collections.abc.Sequence): cloned_child = _clone_node_with_lineno( node=child, lineno=new_node.lineno, parent=new_node ) postinit_params[param] = cloned_child new_node.postinit(**postinit_params) return new_node def _transform_formatted_value( # pylint: disable=inconsistent-return-statements node: nodes.FormattedValue, ) -> nodes.FormattedValue | None: if node.value and node.value.lineno == 1: if node.lineno != node.value.lineno: new_node = nodes.FormattedValue( lineno=node.lineno, col_offset=node.col_offset, parent=node.parent, end_lineno=node.end_lineno, end_col_offset=node.end_col_offset, ) new_value = _clone_node_with_lineno( node=node.value, lineno=node.lineno, parent=new_node ) new_node.postinit( value=new_value, conversion=node.conversion, format_spec=node.format_spec, ) return new_node # TODO: this fix tries to *patch* http://bugs.python.org/issue29051 # The problem is that FormattedValue.value, which is a Name node, # has wrong line numbers, usually 1. This creates problems for pylint, # which expects correct line numbers for things such as message control. 
def register(manager: AstroidManager) -> None: manager.register_transform(nodes.FormattedValue, _transform_formatted_value) astroid-3.2.2/astroid/brain/brain_numpy_ma.py0000664000175000017500000000166414622475517021227 0ustar epsilonepsilon# Licensed under the LGPL: https://www.gnu.org/licenses/old-licenses/lgpl-2.1.en.html # For details: https://github.com/pylint-dev/astroid/blob/main/LICENSE # Copyright (c) https://github.com/pylint-dev/astroid/blob/main/CONTRIBUTORS.txt """Astroid hooks for numpy ma module.""" from astroid.brain.helpers import register_module_extender from astroid.builder import parse from astroid.manager import AstroidManager def numpy_ma_transform(): """ Infer the call of various numpy.ma functions. :param node: node to infer :param context: inference context """ return parse( """ import numpy.ma def masked_where(condition, a, copy=True): return numpy.ma.masked_array(a, mask=[]) def masked_invalid(a, copy=True): return numpy.ma.masked_array(a, mask=[]) """ ) def register(manager: AstroidManager) -> None: register_module_extender(manager, "numpy.ma", numpy_ma_transform) astroid-3.2.2/astroid/brain/brain_hashlib.py0000664000175000017500000000547114622475517021014 0ustar epsilonepsilon# Licensed under the LGPL: https://www.gnu.org/licenses/old-licenses/lgpl-2.1.en.html # For details: https://github.com/pylint-dev/astroid/blob/main/LICENSE # Copyright (c) https://github.com/pylint-dev/astroid/blob/main/CONTRIBUTORS.txt from astroid.brain.helpers import register_module_extender from astroid.builder import parse from astroid.const import PY39_PLUS from astroid.manager import AstroidManager def _hashlib_transform(): maybe_usedforsecurity = ", usedforsecurity=True" if PY39_PLUS else "" init_signature = f"value=''{maybe_usedforsecurity}" digest_signature = "self" shake_digest_signature = "self, length" template = """ class %(name)s: def __init__(self, %(init_signature)s): pass def digest(%(digest_signature)s): return %(digest)s def copy(self): return 
self def update(self, value): pass def hexdigest(%(digest_signature)s): return '' @property def name(self): return %(name)r @property def block_size(self): return 1 @property def digest_size(self): return 1 """ algorithms_with_signature = dict.fromkeys( [ "md5", "sha1", "sha224", "sha256", "sha384", "sha512", "sha3_224", "sha3_256", "sha3_384", "sha3_512", ], (init_signature, digest_signature), ) blake2b_signature = ( "data=b'', *, digest_size=64, key=b'', salt=b'', " "person=b'', fanout=1, depth=1, leaf_size=0, node_offset=0, " f"node_depth=0, inner_size=0, last_node=False{maybe_usedforsecurity}" ) blake2s_signature = ( "data=b'', *, digest_size=32, key=b'', salt=b'', " "person=b'', fanout=1, depth=1, leaf_size=0, node_offset=0, " f"node_depth=0, inner_size=0, last_node=False{maybe_usedforsecurity}" ) shake_algorithms = dict.fromkeys( ["shake_128", "shake_256"], (init_signature, shake_digest_signature), ) algorithms_with_signature.update(shake_algorithms) algorithms_with_signature.update( { "blake2b": (blake2b_signature, digest_signature), "blake2s": (blake2s_signature, digest_signature), } ) classes = "".join( template % { "name": hashfunc, "digest": 'b""', "init_signature": init_signature, "digest_signature": digest_signature, } for hashfunc, ( init_signature, digest_signature, ) in algorithms_with_signature.items() ) return parse(classes) def register(manager: AstroidManager) -> None: register_module_extender(manager, "hashlib", _hashlib_transform) astroid-3.2.2/astroid/brain/brain_re.py0000664000175000017500000000567314622475517020014 0ustar epsilonepsilon# Licensed under the LGPL: https://www.gnu.org/licenses/old-licenses/lgpl-2.1.en.html # For details: https://github.com/pylint-dev/astroid/blob/main/LICENSE # Copyright (c) https://github.com/pylint-dev/astroid/blob/main/CONTRIBUTORS.txt from __future__ import annotations from astroid import context, inference_tip, nodes from astroid.brain.helpers import register_module_extender from astroid.builder import 
_extract_single_node, parse from astroid.const import PY39_PLUS, PY311_PLUS from astroid.manager import AstroidManager def _re_transform() -> nodes.Module: # The RegexFlag enum exposes all its entries by updating globals() # In 3.6-3.10 all flags come from sre_compile # On 3.11+ all flags come from re._compiler if PY311_PLUS: import_compiler = "import re._compiler as _compiler" else: import_compiler = "import sre_compile as _compiler" return parse( f""" {import_compiler} NOFLAG = 0 ASCII = _compiler.SRE_FLAG_ASCII IGNORECASE = _compiler.SRE_FLAG_IGNORECASE LOCALE = _compiler.SRE_FLAG_LOCALE UNICODE = _compiler.SRE_FLAG_UNICODE MULTILINE = _compiler.SRE_FLAG_MULTILINE DOTALL = _compiler.SRE_FLAG_DOTALL VERBOSE = _compiler.SRE_FLAG_VERBOSE TEMPLATE = _compiler.SRE_FLAG_TEMPLATE DEBUG = _compiler.SRE_FLAG_DEBUG A = ASCII I = IGNORECASE L = LOCALE U = UNICODE M = MULTILINE S = DOTALL X = VERBOSE T = TEMPLATE """ ) CLASS_GETITEM_TEMPLATE = """ @classmethod def __class_getitem__(cls, item): return cls """ def _looks_like_pattern_or_match(node: nodes.Call) -> bool: """Check for re.Pattern or re.Match call in stdlib. Match these patterns from stdlib/re.py ```py Pattern = type(...) Match = type(...) ``` """ return ( node.root().name == "re" and isinstance(node.func, nodes.Name) and node.func.name == "type" and isinstance(node.parent, nodes.Assign) and len(node.parent.targets) == 1 and isinstance(node.parent.targets[0], nodes.AssignName) and node.parent.targets[0].name in {"Pattern", "Match"} ) def infer_pattern_match(node: nodes.Call, ctx: context.InferenceContext | None = None): """Infer re.Pattern and re.Match as classes. For PY39+ add `__class_getitem__`. 
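For illustration, the runtime classes this tip reconstructs (plain stdlib usage):

```python
import re

# re.Pattern and re.Match are created via type(...) in the stdlib,
# which is why astroid needs this tip to treat them as classes.
assert isinstance(re.compile("x"), re.Pattern)
assert isinstance(re.match("x", "x"), re.Match)
# On Python 3.9+ both support subscripting (e.g. re.Pattern[str]),
# which the injected __class_getitem__ approximates.
```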
""" class_def = nodes.ClassDef( name=node.parent.targets[0].name, lineno=node.lineno, col_offset=node.col_offset, parent=node.parent, end_lineno=node.end_lineno, end_col_offset=node.end_col_offset, ) if PY39_PLUS: func_to_add = _extract_single_node(CLASS_GETITEM_TEMPLATE) class_def.locals["__class_getitem__"] = [func_to_add] return iter([class_def]) def register(manager: AstroidManager) -> None: register_module_extender(manager, "re", _re_transform) manager.register_transform( nodes.Call, inference_tip(infer_pattern_match), _looks_like_pattern_or_match ) astroid-3.2.2/astroid/brain/brain_attrs.py0000664000175000017500000000616014622475517020533 0ustar epsilonepsilon# Licensed under the LGPL: https://www.gnu.org/licenses/old-licenses/lgpl-2.1.en.html # For details: https://github.com/pylint-dev/astroid/blob/main/LICENSE # Copyright (c) https://github.com/pylint-dev/astroid/blob/main/CONTRIBUTORS.txt """ Astroid hook for the attrs library Without this hook pylint reports unsupported-assignment-operation for attrs classes """ from astroid.manager import AstroidManager from astroid.nodes.node_classes import AnnAssign, Assign, AssignName, Call, Unknown from astroid.nodes.scoped_nodes import ClassDef from astroid.util import safe_infer ATTRIB_NAMES = frozenset( ( "attr.Factory", "attr.ib", "attrib", "attr.attrib", "attr.field", "attrs.field", "field", ) ) ATTRS_NAMES = frozenset( ( "attr.s", "attrs", "attr.attrs", "attr.attributes", "attr.define", "attr.mutable", "attr.frozen", "attrs.define", "attrs.mutable", "attrs.frozen", ) ) def is_decorated_with_attrs(node, decorator_names=ATTRS_NAMES) -> bool: """Return whether a decorated node has an attr decorator applied.""" if not node.decorators: return False for decorator_attribute in node.decorators.nodes: if isinstance(decorator_attribute, Call): # decorator with arguments decorator_attribute = decorator_attribute.func if decorator_attribute.as_string() in decorator_names: return True inferred = 
safe_infer(decorator_attribute) if inferred and inferred.root().name == "attr._next_gen": return True return False def attr_attributes_transform(node: ClassDef) -> None: """Given that the ClassNode has an attr decorator, rewrite class attributes as instance attributes """ # Astroid can't infer this attribute properly # Prevents https://github.com/pylint-dev/pylint/issues/1884 node.locals["__attrs_attrs__"] = [Unknown(parent=node)] for cdef_body_node in node.body: if not isinstance(cdef_body_node, (Assign, AnnAssign)): continue if isinstance(cdef_body_node.value, Call): if cdef_body_node.value.func.as_string() not in ATTRIB_NAMES: continue else: continue targets = ( cdef_body_node.targets if hasattr(cdef_body_node, "targets") else [cdef_body_node.target] ) for target in targets: rhs_node = Unknown( lineno=cdef_body_node.lineno, col_offset=cdef_body_node.col_offset, parent=cdef_body_node, ) if isinstance(target, AssignName): # Could be a subscript if the code analysed is # i = Optional[str] = "" # See https://github.com/pylint-dev/pylint/issues/4439 node.locals[target.name] = [rhs_node] node.instance_attrs[target.name] = [rhs_node] def register(manager: AstroidManager) -> None: manager.register_transform( ClassDef, attr_attributes_transform, is_decorated_with_attrs ) astroid-3.2.2/astroid/brain/brain_numpy_core_function_base.py0000664000175000017500000000260014622475517024450 0ustar epsilonepsilon# Licensed under the LGPL: https://www.gnu.org/licenses/old-licenses/lgpl-2.1.en.html # For details: https://github.com/pylint-dev/astroid/blob/main/LICENSE # Copyright (c) https://github.com/pylint-dev/astroid/blob/main/CONTRIBUTORS.txt """Astroid hooks for numpy.core.function_base module.""" import functools from astroid.brain.brain_numpy_utils import ( attribute_looks_like_numpy_member, infer_numpy_member, ) from astroid.inference_tip import inference_tip from astroid.manager import AstroidManager from astroid.nodes.node_classes import Attribute METHODS_TO_BE_INFERRED = { 
"linspace": """def linspace(start, stop, num=50, endpoint=True, retstep=False, dtype=None, axis=0): return numpy.ndarray([0, 0])""", "logspace": """def logspace(start, stop, num=50, endpoint=True, base=10.0, dtype=None, axis=0): return numpy.ndarray([0, 0])""", "geomspace": """def geomspace(start, stop, num=50, endpoint=True, dtype=None, axis=0): return numpy.ndarray([0, 0])""", } def register(manager: AstroidManager) -> None: for func_name, func_src in METHODS_TO_BE_INFERRED.items(): inference_function = functools.partial(infer_numpy_member, func_src) manager.register_transform( Attribute, inference_tip(inference_function), functools.partial(attribute_looks_like_numpy_member, func_name), ) astroid-3.2.2/astroid/brain/brain_ssl.py0000664000175000017500000001463214622475517020202 0ustar epsilonepsilon# Licensed under the LGPL: https://www.gnu.org/licenses/old-licenses/lgpl-2.1.en.html # For details: https://github.com/pylint-dev/astroid/blob/main/LICENSE # Copyright (c) https://github.com/pylint-dev/astroid/blob/main/CONTRIBUTORS.txt """Astroid hooks for the ssl library.""" from astroid import parse from astroid.brain.helpers import register_module_extender from astroid.const import PY310_PLUS from astroid.manager import AstroidManager def _verifyflags_enum() -> str: enum = """ class VerifyFlags(_IntFlag): VERIFY_DEFAULT = 0 VERIFY_CRL_CHECK_LEAF = 1 VERIFY_CRL_CHECK_CHAIN = 2 VERIFY_X509_STRICT = 3 VERIFY_X509_TRUSTED_FIRST = 4""" if PY310_PLUS: enum += """ VERIFY_ALLOW_PROXY_CERTS = 5 VERIFY_X509_PARTIAL_CHAIN = 6 """ return enum def _options_enum() -> str: enum = """ class Options(_IntFlag): OP_ALL = 1 OP_NO_SSLv2 = 2 OP_NO_SSLv3 = 3 OP_NO_TLSv1 = 4 OP_NO_TLSv1_1 = 5 OP_NO_TLSv1_2 = 6 OP_NO_TLSv1_3 = 7 OP_CIPHER_SERVER_PREFERENCE = 8 OP_SINGLE_DH_USE = 9 OP_SINGLE_ECDH_USE = 10 OP_NO_COMPRESSION = 11 OP_NO_TICKET = 12 OP_NO_RENEGOTIATION = 13 OP_ENABLE_MIDDLEBOX_COMPAT = 14""" return enum def ssl_transform(): return parse( """ # Import necessary for conversion of 
objects defined in C into enums from enum import IntEnum as _IntEnum, IntFlag as _IntFlag from _ssl import OPENSSL_VERSION_NUMBER, OPENSSL_VERSION_INFO, OPENSSL_VERSION from _ssl import _SSLContext, MemoryBIO from _ssl import ( SSLError, SSLZeroReturnError, SSLWantReadError, SSLWantWriteError, SSLSyscallError, SSLEOFError, ) from _ssl import CERT_NONE, CERT_OPTIONAL, CERT_REQUIRED from _ssl import txt2obj as _txt2obj, nid2obj as _nid2obj from _ssl import RAND_status, RAND_add, RAND_bytes, RAND_pseudo_bytes try: from _ssl import RAND_egd except ImportError: # LibreSSL does not provide RAND_egd pass from _ssl import (OP_ALL, OP_CIPHER_SERVER_PREFERENCE, OP_NO_COMPRESSION, OP_NO_SSLv2, OP_NO_SSLv3, OP_NO_TLSv1, OP_NO_TLSv1_1, OP_NO_TLSv1_2, OP_SINGLE_DH_USE, OP_SINGLE_ECDH_USE) from _ssl import (ALERT_DESCRIPTION_ACCESS_DENIED, ALERT_DESCRIPTION_BAD_CERTIFICATE, ALERT_DESCRIPTION_BAD_CERTIFICATE_HASH_VALUE, ALERT_DESCRIPTION_BAD_CERTIFICATE_STATUS_RESPONSE, ALERT_DESCRIPTION_BAD_RECORD_MAC, ALERT_DESCRIPTION_CERTIFICATE_EXPIRED, ALERT_DESCRIPTION_CERTIFICATE_REVOKED, ALERT_DESCRIPTION_CERTIFICATE_UNKNOWN, ALERT_DESCRIPTION_CERTIFICATE_UNOBTAINABLE, ALERT_DESCRIPTION_CLOSE_NOTIFY, ALERT_DESCRIPTION_DECODE_ERROR, ALERT_DESCRIPTION_DECOMPRESSION_FAILURE, ALERT_DESCRIPTION_DECRYPT_ERROR, ALERT_DESCRIPTION_HANDSHAKE_FAILURE, ALERT_DESCRIPTION_ILLEGAL_PARAMETER, ALERT_DESCRIPTION_INSUFFICIENT_SECURITY, ALERT_DESCRIPTION_INTERNAL_ERROR, ALERT_DESCRIPTION_NO_RENEGOTIATION, ALERT_DESCRIPTION_PROTOCOL_VERSION, ALERT_DESCRIPTION_RECORD_OVERFLOW, ALERT_DESCRIPTION_UNEXPECTED_MESSAGE, ALERT_DESCRIPTION_UNKNOWN_CA, ALERT_DESCRIPTION_UNKNOWN_PSK_IDENTITY, ALERT_DESCRIPTION_UNRECOGNIZED_NAME, ALERT_DESCRIPTION_UNSUPPORTED_CERTIFICATE, ALERT_DESCRIPTION_UNSUPPORTED_EXTENSION, ALERT_DESCRIPTION_USER_CANCELLED) from _ssl import (SSL_ERROR_EOF, SSL_ERROR_INVALID_ERROR_CODE, SSL_ERROR_SSL, SSL_ERROR_SYSCALL, SSL_ERROR_WANT_CONNECT, SSL_ERROR_WANT_READ, SSL_ERROR_WANT_WRITE, 
SSL_ERROR_WANT_X509_LOOKUP, SSL_ERROR_ZERO_RETURN) from _ssl import VERIFY_CRL_CHECK_CHAIN, VERIFY_CRL_CHECK_LEAF, VERIFY_DEFAULT, VERIFY_X509_STRICT from _ssl import HAS_SNI, HAS_ECDH, HAS_NPN, HAS_ALPN from _ssl import _OPENSSL_API_VERSION from _ssl import PROTOCOL_SSLv23, PROTOCOL_TLSv1, PROTOCOL_TLSv1_1, PROTOCOL_TLSv1_2 from _ssl import PROTOCOL_TLS, PROTOCOL_TLS_CLIENT, PROTOCOL_TLS_SERVER class AlertDescription(_IntEnum): ALERT_DESCRIPTION_ACCESS_DENIED = 0 ALERT_DESCRIPTION_BAD_CERTIFICATE = 1 ALERT_DESCRIPTION_BAD_CERTIFICATE_HASH_VALUE = 2 ALERT_DESCRIPTION_BAD_CERTIFICATE_STATUS_RESPONSE = 3 ALERT_DESCRIPTION_BAD_RECORD_MAC = 4 ALERT_DESCRIPTION_CERTIFICATE_EXPIRED = 5 ALERT_DESCRIPTION_CERTIFICATE_REVOKED = 6 ALERT_DESCRIPTION_CERTIFICATE_UNKNOWN = 7 ALERT_DESCRIPTION_CERTIFICATE_UNOBTAINABLE = 8 ALERT_DESCRIPTION_CLOSE_NOTIFY = 9 ALERT_DESCRIPTION_DECODE_ERROR = 10 ALERT_DESCRIPTION_DECOMPRESSION_FAILURE = 11 ALERT_DESCRIPTION_DECRYPT_ERROR = 12 ALERT_DESCRIPTION_HANDSHAKE_FAILURE = 13 ALERT_DESCRIPTION_ILLEGAL_PARAMETER = 14 ALERT_DESCRIPTION_INSUFFICIENT_SECURITY = 15 ALERT_DESCRIPTION_INTERNAL_ERROR = 16 ALERT_DESCRIPTION_NO_RENEGOTIATION = 17 ALERT_DESCRIPTION_PROTOCOL_VERSION = 18 ALERT_DESCRIPTION_RECORD_OVERFLOW = 19 ALERT_DESCRIPTION_UNEXPECTED_MESSAGE = 20 ALERT_DESCRIPTION_UNKNOWN_CA = 21 ALERT_DESCRIPTION_UNKNOWN_PSK_IDENTITY = 22 ALERT_DESCRIPTION_UNRECOGNIZED_NAME = 23 ALERT_DESCRIPTION_UNSUPPORTED_CERTIFICATE = 24 ALERT_DESCRIPTION_UNSUPPORTED_EXTENSION = 25 ALERT_DESCRIPTION_USER_CANCELLED = 26 class SSLErrorNumber(_IntEnum): SSL_ERROR_EOF = 0 SSL_ERROR_INVALID_ERROR_CODE = 1 SSL_ERROR_SSL = 2 SSL_ERROR_SYSCALL = 3 SSL_ERROR_WANT_CONNECT = 4 SSL_ERROR_WANT_READ = 5 SSL_ERROR_WANT_WRITE = 6 SSL_ERROR_WANT_X509_LOOKUP = 7 SSL_ERROR_ZERO_RETURN = 8 class VerifyMode(_IntEnum): CERT_NONE = 0 CERT_OPTIONAL = 1 CERT_REQUIRED = 2 """ + _verifyflags_enum() + _options_enum() ) def register(manager: AstroidManager) -> None: 
register_module_extender(manager, "ssl", ssl_transform) astroid-3.2.2/astroid/brain/brain_ctypes.py0000664000175000017500000000524014622475517020703 0ustar epsilonepsilon# Licensed under the LGPL: https://www.gnu.org/licenses/old-licenses/lgpl-2.1.en.html # For details: https://github.com/pylint-dev/astroid/blob/main/LICENSE # Copyright (c) https://github.com/pylint-dev/astroid/blob/main/CONTRIBUTORS.txt """ Astroid hooks for ctypes module. Inside the ctypes module, the value class is defined inside the C coded module _ctypes. Thus astroid doesn't know that the value member is a builtin type among float, int, bytes or str. """ import sys from astroid.brain.helpers import register_module_extender from astroid.builder import parse from astroid.manager import AstroidManager def enrich_ctypes_redefined_types(): """ For each ctypes redefined types, overload 'value' and '_type_' members definition. Overloading 'value' is mandatory otherwise astroid cannot infer the correct type for it. Overloading '_type_' is necessary because the class definition made here replaces the original one, in which '_type_' member is defined. Luckily those original class definitions are very short and contain only the '_type_' member definition. 
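The runtime behaviour mocked below can be sanity-checked against the real stdlib
ctypes module; a minimal illustrative sketch (not part of the brain itself):

```python
# Real ctypes simple types expose a plain-builtin 'value' attribute and a
# one-character '_type_' code -- exactly what the mocked classes reproduce.
# (c_double is used for the '_type_' check because its code is the same on
# all platforms, unlike c_int, which may alias c_long.)
import ctypes

n = ctypes.c_int(7)
assert isinstance(n.value, int) and n.value == 7

d = ctypes.c_double(1.5)
assert isinstance(d.value, float) and d.value == 1.5
assert ctypes.c_double._type_ == "d"
```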
""" c_class_to_type = ( ("c_byte", "int", "b"), ("c_char", "bytes", "c"), ("c_double", "float", "d"), ("c_float", "float", "f"), ("c_int", "int", "i"), ("c_int16", "int", "h"), ("c_int32", "int", "i"), ("c_int64", "int", "l"), ("c_int8", "int", "b"), ("c_long", "int", "l"), ("c_longdouble", "float", "g"), ("c_longlong", "int", "l"), ("c_short", "int", "h"), ("c_size_t", "int", "L"), ("c_ssize_t", "int", "l"), ("c_ubyte", "int", "B"), ("c_uint", "int", "I"), ("c_uint16", "int", "H"), ("c_uint32", "int", "I"), ("c_uint64", "int", "L"), ("c_uint8", "int", "B"), ("c_ulong", "int", "L"), ("c_ulonglong", "int", "L"), ("c_ushort", "int", "H"), ("c_wchar", "str", "u"), ) src = [ """ from _ctypes import _SimpleCData class c_bool(_SimpleCData): def __init__(self, value): self.value = True self._type_ = '?' """ ] for c_type, builtin_type, type_code in c_class_to_type: src.append( f""" class {c_type}(_SimpleCData): def __init__(self, value): self.value = {builtin_type}(value) self._type_ = '{type_code}' """ ) return parse("\n".join(src)) def register(manager: AstroidManager) -> None: if not hasattr(sys, "pypy_version_info"): # No need of this module in pypy where everything is written in python register_module_extender(manager, "ctypes", enrich_ctypes_redefined_types) astroid-3.2.2/astroid/brain/brain_dateutil.py0000664000175000017500000000146314622475517021212 0ustar epsilonepsilon# Licensed under the LGPL: https://www.gnu.org/licenses/old-licenses/lgpl-2.1.en.html # For details: https://github.com/pylint-dev/astroid/blob/main/LICENSE # Copyright (c) https://github.com/pylint-dev/astroid/blob/main/CONTRIBUTORS.txt """Astroid hooks for dateutil.""" import textwrap from astroid.brain.helpers import register_module_extender from astroid.builder import AstroidBuilder from astroid.manager import AstroidManager def dateutil_transform(): return AstroidBuilder(AstroidManager()).string_build( textwrap.dedent( """ import datetime def parse(timestr, parserinfo=None, **kwargs): return 
datetime.datetime() """ ) ) def register(manager: AstroidManager) -> None: register_module_extender(manager, "dateutil.parser", dateutil_transform) astroid-3.2.2/astroid/brain/brain_numpy_utils.py0000664000175000017500000000477114622475517021774 0ustar epsilonepsilon# Licensed under the LGPL: https://www.gnu.org/licenses/old-licenses/lgpl-2.1.en.html # For details: https://github.com/pylint-dev/astroid/blob/main/LICENSE # Copyright (c) https://github.com/pylint-dev/astroid/blob/main/CONTRIBUTORS.txt """Different utilities for the numpy brains.""" from __future__ import annotations from astroid.builder import extract_node from astroid.context import InferenceContext from astroid.nodes.node_classes import Attribute, Import, Name # Class subscript is available in numpy starting with version 1.20.0 NUMPY_VERSION_TYPE_HINTS_SUPPORT = ("1", "20", "0") def numpy_supports_type_hints() -> bool: """Returns True if numpy supports type hints.""" np_ver = _get_numpy_version() return np_ver and np_ver > NUMPY_VERSION_TYPE_HINTS_SUPPORT def _get_numpy_version() -> tuple[str, str, str]: """ Return the numpy version number if numpy can be imported. Otherwise returns ('0', '0', '0') """ try: import numpy # pylint: disable=import-outside-toplevel return tuple(numpy.version.version.split(".")) except (ImportError, AttributeError): return ("0", "0", "0") def infer_numpy_member(src, node, context: InferenceContext | None = None): node = extract_node(src) return node.infer(context=context) def _is_a_numpy_module(node: Name) -> bool: """ Returns True if the node is a representation of a numpy module. For example in : import numpy as np x = np.linspace(1, 2) The node is a representation of the numpy module. :param node: node to test :return: True if the node is a representation of the numpy module. 
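The alias lookup described above can be illustrated with the stdlib ast module
alone; a rough equivalent of the check (illustration only, using plain ast
nodes rather than astroid's):

```python
# Resolve "which module is the name 'np' bound to?" by collecting the
# asname -> name mapping from every Import statement in the tree.
import ast

tree = ast.parse("import numpy as np\nx = np.linspace(1, 2)")
aliases = {
    alias.asname or alias.name: alias.name
    for node in ast.walk(tree)
    if isinstance(node, ast.Import)
    for alias in node.names
}
assert aliases.get("np") == "numpy"
```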
""" module_nickname = node.name potential_import_target = [ x for x in node.lookup(module_nickname)[1] if isinstance(x, Import) ] return any( ("numpy", module_nickname) in target.names or ("numpy", None) in target.names for target in potential_import_target ) def name_looks_like_numpy_member(member_name: str, node: Name) -> bool: """ Returns True if the Name is a member of numpy whose name is member_name. """ return node.name == member_name and node.root().name.startswith("numpy") def attribute_looks_like_numpy_member(member_name: str, node: Attribute) -> bool: """ Returns True if the Attribute is a member of numpy whose name is member_name. """ return ( node.attrname == member_name and isinstance(node.expr, Name) and _is_a_numpy_module(node.expr) ) astroid-3.2.2/astroid/brain/brain_crypt.py0000664000175000017500000000162314622475517020536 0ustar epsilonepsilon# Licensed under the LGPL: https://www.gnu.org/licenses/old-licenses/lgpl-2.1.en.html # For details: https://github.com/pylint-dev/astroid/blob/main/LICENSE # Copyright (c) https://github.com/pylint-dev/astroid/blob/main/CONTRIBUTORS.txt from astroid.brain.helpers import register_module_extender from astroid.builder import parse from astroid.manager import AstroidManager def _re_transform(): return parse( """ from collections import namedtuple _Method = namedtuple('_Method', 'name ident salt_chars total_size') METHOD_SHA512 = _Method('SHA512', '6', 16, 106) METHOD_SHA256 = _Method('SHA256', '5', 16, 63) METHOD_BLOWFISH = _Method('BLOWFISH', 2, 'b', 22) METHOD_MD5 = _Method('MD5', '1', 8, 34) METHOD_CRYPT = _Method('CRYPT', None, 2, 13) """ ) def register(manager: AstroidManager) -> None: register_module_extender(manager, "crypt", _re_transform) astroid-3.2.2/astroid/brain/brain_namedtuple_enum.py0000664000175000017500000005655014622475517022570 0ustar epsilonepsilon# Licensed under the LGPL: https://www.gnu.org/licenses/old-licenses/lgpl-2.1.en.html # For details: 
https://github.com/pylint-dev/astroid/blob/main/LICENSE # Copyright (c) https://github.com/pylint-dev/astroid/blob/main/CONTRIBUTORS.txt """Astroid hooks for the Python standard library.""" from __future__ import annotations import functools import keyword from collections.abc import Iterator from textwrap import dedent from typing import Final import astroid from astroid import arguments, bases, inference_tip, nodes, util from astroid.builder import AstroidBuilder, _extract_single_node, extract_node from astroid.context import InferenceContext from astroid.exceptions import ( AstroidTypeError, AstroidValueError, InferenceError, UseInferenceDefault, ) from astroid.manager import AstroidManager ENUM_QNAME: Final[str] = "enum.Enum" TYPING_NAMEDTUPLE_QUALIFIED: Final = { "typing.NamedTuple", "typing_extensions.NamedTuple", } TYPING_NAMEDTUPLE_BASENAMES: Final = { "NamedTuple", "typing.NamedTuple", "typing_extensions.NamedTuple", } def _infer_first(node, context): if isinstance(node, util.UninferableBase): raise UseInferenceDefault try: value = next(node.infer(context=context)) except StopIteration as exc: raise InferenceError from exc if isinstance(value, util.UninferableBase): raise UseInferenceDefault() return value def _find_func_form_arguments(node, context): def _extract_namedtuple_arg_or_keyword( # pylint: disable=inconsistent-return-statements position, key_name=None ): if len(args) > position: return _infer_first(args[position], context) if key_name and key_name in found_keywords: return _infer_first(found_keywords[key_name], context) args = node.args keywords = node.keywords found_keywords = ( {keyword.arg: keyword.value for keyword in keywords} if keywords else {} ) name = _extract_namedtuple_arg_or_keyword(position=0, key_name="typename") names = _extract_namedtuple_arg_or_keyword(position=1, key_name="field_names") if name and names: return name.value, names raise UseInferenceDefault() def infer_func_form( node: nodes.Call, base_type: list[nodes.NodeNG], 
context: InferenceContext | None = None, enum: bool = False, ) -> tuple[nodes.ClassDef, str, list[str]]: """Specific inference function for namedtuple or Python 3 enum.""" # node is a Call node, class name as first argument and generated class # attributes as second argument # namedtuple or enums list of attributes can be a list of strings or a # whitespace-separate string try: name, names = _find_func_form_arguments(node, context) try: attributes: list[str] = names.value.replace(",", " ").split() except AttributeError as exc: # Handle attributes of NamedTuples if not enum: attributes = [] fields = _get_namedtuple_fields(node) if fields: fields_node = extract_node(fields) attributes = [ _infer_first(const, context).value for const in fields_node.elts ] # Handle attributes of Enums else: # Enums supports either iterator of (name, value) pairs # or mappings. if hasattr(names, "items") and isinstance(names.items, list): attributes = [ _infer_first(const[0], context).value for const in names.items if isinstance(const[0], nodes.Const) ] elif hasattr(names, "elts"): # Enums can support either ["a", "b", "c"] # or [("a", 1), ("b", 2), ...], but they can't # be mixed. 
if all(isinstance(const, nodes.Tuple) for const in names.elts): attributes = [ _infer_first(const.elts[0], context).value for const in names.elts if isinstance(const, nodes.Tuple) ] else: attributes = [ _infer_first(const, context).value for const in names.elts ] else: raise AttributeError from exc if not attributes: raise AttributeError from exc except (AttributeError, InferenceError) as exc: raise UseInferenceDefault from exc if not enum: # namedtuple maps sys.intern(str()) over over field_names attributes = [str(attr) for attr in attributes] # XXX this should succeed *unless* __str__/__repr__ is incorrect or throws # in which case we should not have inferred these values and raised earlier attributes = [attr for attr in attributes if " " not in attr] # If we can't infer the name of the class, don't crash, up to this point # we know it is a namedtuple anyway. name = name or "Uninferable" # we want to return a Class node instance with proper attributes set class_node = nodes.ClassDef( name, lineno=node.lineno, col_offset=node.col_offset, end_lineno=node.end_lineno, end_col_offset=node.end_col_offset, parent=nodes.Unknown(), ) # A typical ClassDef automatically adds its name to the parent scope, # but doing so causes problems, so defer setting parent until after init # see: https://github.com/pylint-dev/pylint/issues/5982 class_node.parent = node.parent class_node.postinit( # set base class=tuple bases=base_type, body=[], decorators=None, ) # XXX add __init__(*attributes) method for attr in attributes: fake_node = nodes.EmptyNode() fake_node.parent = class_node fake_node.attrname = attr class_node.instance_attrs[attr] = [fake_node] return class_node, name, attributes def _has_namedtuple_base(node): """Predicate for class inference tip. 
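The check reduces to a truthy set intersection over the class's base names;
a self-contained sketch with illustrative values (the constant is re-stated
here so the snippet runs on its own):

```python
# Non-empty intersection -> truthy -> treated as a typing.NamedTuple subclass.
TYPING_NAMEDTUPLE_BASENAMES = {
    "NamedTuple", "typing.NamedTuple", "typing_extensions.NamedTuple",
}
assert {"NamedTuple", "object"} & TYPING_NAMEDTUPLE_BASENAMES
assert not ({"object", "int"} & TYPING_NAMEDTUPLE_BASENAMES)
```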
:type node: ClassDef :rtype: bool """ return set(node.basenames) & TYPING_NAMEDTUPLE_BASENAMES def _looks_like(node, name) -> bool: func = node.func if isinstance(func, nodes.Attribute): return func.attrname == name if isinstance(func, nodes.Name): return func.name == name return False _looks_like_namedtuple = functools.partial(_looks_like, name="namedtuple") _looks_like_enum = functools.partial(_looks_like, name="Enum") _looks_like_typing_namedtuple = functools.partial(_looks_like, name="NamedTuple") def infer_named_tuple( node: nodes.Call, context: InferenceContext | None = None ) -> Iterator[nodes.ClassDef]: """Specific inference function for namedtuple Call node.""" tuple_base_name: list[nodes.NodeNG] = [ nodes.Name( name="tuple", parent=node.root(), lineno=0, col_offset=0, end_lineno=None, end_col_offset=None, ) ] class_node, name, attributes = infer_func_form( node, tuple_base_name, context=context ) call_site = arguments.CallSite.from_call(node, context=context) node = extract_node("import collections; collections.namedtuple") try: func = next(node.infer()) except StopIteration as e: raise InferenceError(node=node) from e try: rename = next( call_site.infer_argument(func, "rename", context or InferenceContext()) ).bool_value() except (InferenceError, StopIteration): rename = False try: attributes = _check_namedtuple_attributes(name, attributes, rename) except AstroidTypeError as exc: raise UseInferenceDefault("TypeError: " + str(exc)) from exc except AstroidValueError as exc: raise UseInferenceDefault("ValueError: " + str(exc)) from exc replace_args = ", ".join(f"{arg}=None" for arg in attributes) field_def = ( " {name} = property(lambda self: self[{index:d}], " "doc='Alias for field number {index:d}')" ) field_defs = "\n".join( field_def.format(name=name, index=index) for index, name in enumerate(attributes) ) fake = AstroidBuilder(AstroidManager()).string_build( f""" class {name}(tuple): __slots__ = () _fields = {attributes!r} def _asdict(self): return 
self.__dict__ @classmethod def _make(cls, iterable, new=tuple.__new__, len=len): return new(cls, iterable) def _replace(self, {replace_args}): return self def __getnewargs__(self): return tuple(self) {field_defs} """ ) class_node.locals["_asdict"] = fake.body[0].locals["_asdict"] class_node.locals["_make"] = fake.body[0].locals["_make"] class_node.locals["_replace"] = fake.body[0].locals["_replace"] class_node.locals["_fields"] = fake.body[0].locals["_fields"] for attr in attributes: class_node.locals[attr] = fake.body[0].locals[attr] # we use UseInferenceDefault, we can't be a generator so return an iterator return iter([class_node]) def _get_renamed_namedtuple_attributes(field_names): names = list(field_names) seen = set() for i, name in enumerate(field_names): if ( not all(c.isalnum() or c == "_" for c in name) or keyword.iskeyword(name) or not name or name[0].isdigit() or name.startswith("_") or name in seen ): names[i] = "_%d" % i seen.add(name) return tuple(names) def _check_namedtuple_attributes(typename, attributes, rename=False): attributes = tuple(attributes) if rename: attributes = _get_renamed_namedtuple_attributes(attributes) # The following snippet is derived from the CPython Lib/collections/__init__.py sources # for name in (typename, *attributes): if not isinstance(name, str): raise AstroidTypeError("Type names and field names must be strings") if not name.isidentifier(): raise AstroidValueError( "Type names and field names must be valid" + f"identifiers: {name!r}" ) if keyword.iskeyword(name): raise AstroidValueError( f"Type names and field names cannot be a keyword: {name!r}" ) seen = set() for name in attributes: if name.startswith("_") and not rename: raise AstroidValueError( f"Field names cannot start with an underscore: {name!r}" ) if name in seen: raise AstroidValueError(f"Encountered duplicate field name: {name!r}") seen.add(name) # return attributes def infer_enum( node: nodes.Call, context: InferenceContext | None = None ) -> 
Iterator[bases.Instance]: """Specific inference function for enum Call node.""" # Raise `UseInferenceDefault` if `node` is a call to a a user-defined Enum. try: inferred = node.func.infer(context) except (InferenceError, StopIteration) as exc: raise UseInferenceDefault from exc if not any( isinstance(item, nodes.ClassDef) and item.qname() == ENUM_QNAME for item in inferred ): raise UseInferenceDefault enum_meta = _extract_single_node( """ class EnumMeta(object): 'docstring' def __call__(self, node): class EnumAttribute(object): name = '' value = 0 return EnumAttribute() def __iter__(self): class EnumAttribute(object): name = '' value = 0 return [EnumAttribute()] def __reversed__(self): class EnumAttribute(object): name = '' value = 0 return (EnumAttribute, ) def __next__(self): return next(iter(self)) def __getitem__(self, attr): class Value(object): @property def name(self): return '' @property def value(self): return attr return Value() __members__ = [''] """ ) class_node = infer_func_form(node, [enum_meta], context=context, enum=True)[0] return iter([class_node.instantiate_class()]) INT_FLAG_ADDITION_METHODS = """ def __or__(self, other): return {name}(self.value | other.value) def __and__(self, other): return {name}(self.value & other.value) def __xor__(self, other): return {name}(self.value ^ other.value) def __add__(self, other): return {name}(self.value + other.value) def __div__(self, other): return {name}(self.value / other.value) def __invert__(self): return {name}(~self.value) def __mul__(self, other): return {name}(self.value * other.value) """ def infer_enum_class(node: nodes.ClassDef) -> nodes.ClassDef: """Specific inference for enums.""" for basename in (b for cls in node.mro() for b in cls.basenames): if node.root().name == "enum": # Skip if the class is directly from enum module. 
break dunder_members = {} target_names = set() for local, values in node.locals.items(): if ( any(not isinstance(value, nodes.AssignName) for value in values) or local == "_ignore_" ): continue stmt = values[0].statement() if isinstance(stmt, nodes.Assign): if isinstance(stmt.targets[0], nodes.Tuple): targets = stmt.targets[0].itered() else: targets = stmt.targets elif isinstance(stmt, nodes.AnnAssign): targets = [stmt.target] else: continue inferred_return_value = None if stmt.value is not None: if isinstance(stmt.value, nodes.Const): if isinstance(stmt.value.value, str): inferred_return_value = repr(stmt.value.value) else: inferred_return_value = stmt.value.value else: inferred_return_value = stmt.value.as_string() new_targets = [] for target in targets: if isinstance(target, nodes.Starred): continue target_names.add(target.name) # Replace all the assignments with our mocked class. classdef = dedent( """ class {name}({types}): @property def value(self): return {return_value} @property def _value_(self): return {return_value} @property def name(self): return "{name}" @property def _name_(self): return "{name}" """.format( name=target.name, types=", ".join(node.basenames), return_value=inferred_return_value, ) ) if "IntFlag" in basename: # Alright, we need to add some additional methods. 
# Unfortunately we still can't infer the resulting objects as # Enum members, but once we'll be able to do that, the following # should result in some nice symbolic execution classdef += INT_FLAG_ADDITION_METHODS.format(name=target.name) fake = AstroidBuilder( AstroidManager(), apply_transforms=False ).string_build(classdef)[target.name] fake.parent = target.parent for method in node.mymethods(): fake.locals[method.name] = [method] new_targets.append(fake.instantiate_class()) if stmt.value is None: continue dunder_members[local] = fake node.locals[local] = new_targets # The undocumented `_value2member_map_` member: node.locals["_value2member_map_"] = [ nodes.Dict( parent=node, lineno=node.lineno, col_offset=node.col_offset, end_lineno=node.end_lineno, end_col_offset=node.end_col_offset, ) ] members = nodes.Dict( parent=node, lineno=node.lineno, col_offset=node.col_offset, end_lineno=node.end_lineno, end_col_offset=node.end_col_offset, ) members.postinit( [ ( nodes.Const(k, parent=members), nodes.Name( v.name, parent=members, lineno=v.lineno, col_offset=v.col_offset, end_lineno=v.end_lineno, end_col_offset=v.end_col_offset, ), ) for k, v in dunder_members.items() ] ) node.locals["__members__"] = [members] # The enum.Enum class itself defines two @DynamicClassAttribute data-descriptors # "name" and "value" (which we override in the mocked class for each enum member # above). When dealing with inference of an arbitrary instance of the enum # class, e.g. in a method defined in the class body like: # class SomeEnum(enum.Enum): # def method(self): # self.name # <- here # In the absence of an enum member called "name" or "value", these attributes # should resolve to the descriptor on that particular instance, i.e. enum member. # For "value", we have no idea what that should be, but for "name", we at least # know that it should be a string, so infer that as a guess. 
if "name" not in target_names: code = dedent( """ @property def name(self): return '' """ ) name_dynamicclassattr = AstroidBuilder(AstroidManager()).string_build(code)[ "name" ] node.locals["name"] = [name_dynamicclassattr] break return node def infer_typing_namedtuple_class(class_node, context: InferenceContext | None = None): """Infer a subclass of typing.NamedTuple.""" # Check if it has the corresponding bases annassigns_fields = [ annassign.target.name for annassign in class_node.body if isinstance(annassign, nodes.AnnAssign) ] code = dedent( """ from collections import namedtuple namedtuple({typename!r}, {fields!r}) """ ).format(typename=class_node.name, fields=",".join(annassigns_fields)) node = extract_node(code) try: generated_class_node = next(infer_named_tuple(node, context)) except StopIteration as e: raise InferenceError(node=node, context=context) from e for method in class_node.mymethods(): generated_class_node.locals[method.name] = [method] for body_node in class_node.body: if isinstance(body_node, nodes.Assign): for target in body_node.targets: attr = target.name generated_class_node.locals[attr] = class_node.locals[attr] elif isinstance(body_node, nodes.ClassDef): generated_class_node.locals[body_node.name] = [body_node] return iter((generated_class_node,)) def infer_typing_namedtuple_function(node, context: InferenceContext | None = None): """ Starting with python3.9, NamedTuple is a function of the typing module. The class NamedTuple is build dynamically through a call to `type` during initialization of the `_NamedTuple` variable. """ klass = extract_node( """ from typing import _NamedTuple _NamedTuple """ ) return klass.infer(context) def infer_typing_namedtuple( node: nodes.Call, context: InferenceContext | None = None ) -> Iterator[nodes.ClassDef]: """Infer a typing.NamedTuple(...) call.""" # This is essentially a namedtuple with different arguments # so we extract the args and infer a named tuple. 
try: func = next(node.func.infer()) except (InferenceError, StopIteration) as exc: raise UseInferenceDefault from exc if func.qname() not in TYPING_NAMEDTUPLE_QUALIFIED: raise UseInferenceDefault if len(node.args) != 2: raise UseInferenceDefault if not isinstance(node.args[1], (nodes.List, nodes.Tuple)): raise UseInferenceDefault return infer_named_tuple(node, context) def _get_namedtuple_fields(node: nodes.Call) -> str: """Get and return fields of a NamedTuple in code-as-a-string. Because the fields are represented in their code form we can extract a node from them later on. """ names = [] container = None try: container = next(node.args[1].infer()) except (InferenceError, StopIteration) as exc: raise UseInferenceDefault from exc # We pass on IndexError as we'll try to infer 'field_names' from the keywords except IndexError: pass if not container: for keyword_node in node.keywords: if keyword_node.arg == "field_names": try: container = next(keyword_node.value.infer()) except (InferenceError, StopIteration) as exc: raise UseInferenceDefault from exc break if not isinstance(container, nodes.BaseContainer): raise UseInferenceDefault for elt in container.elts: if isinstance(elt, nodes.Const): names.append(elt.as_string()) continue if not isinstance(elt, (nodes.List, nodes.Tuple)): raise UseInferenceDefault if len(elt.elts) != 2: raise UseInferenceDefault names.append(elt.elts[0].as_string()) if names: field_names = f"({','.join(names)},)" else: field_names = "" return field_names def _is_enum_subclass(cls: astroid.ClassDef) -> bool: """Return whether cls is a subclass of an Enum.""" return cls.is_subtype_of("enum.Enum") def register(manager: AstroidManager) -> None: manager.register_transform( nodes.Call, inference_tip(infer_named_tuple), _looks_like_namedtuple ) manager.register_transform(nodes.Call, inference_tip(infer_enum), _looks_like_enum) manager.register_transform( nodes.ClassDef, infer_enum_class, predicate=_is_enum_subclass ) manager.register_transform( 
nodes.ClassDef, inference_tip(infer_typing_namedtuple_class), _has_namedtuple_base, ) manager.register_transform( nodes.FunctionDef, inference_tip(infer_typing_namedtuple_function), lambda node: node.name == "NamedTuple" and getattr(node.root(), "name", None) == "typing", ) manager.register_transform( nodes.Call, inference_tip(infer_typing_namedtuple), _looks_like_typing_namedtuple, ) astroid-3.2.2/astroid/brain/brain_nose.py0000664000175000017500000000454214622475517020344 0ustar epsilonepsilon# Licensed under the LGPL: https://www.gnu.org/licenses/old-licenses/lgpl-2.1.en.html # For details: https://github.com/pylint-dev/astroid/blob/main/LICENSE # Copyright (c) https://github.com/pylint-dev/astroid/blob/main/CONTRIBUTORS.txt """Hooks for nose library.""" import re import textwrap from astroid.bases import BoundMethod from astroid.brain.helpers import register_module_extender from astroid.builder import AstroidBuilder from astroid.exceptions import InferenceError from astroid.manager import AstroidManager from astroid.nodes import List, Module CAPITALS = re.compile("([A-Z])") def _pep8(name, caps=CAPITALS): return caps.sub(lambda m: "_" + m.groups()[0].lower(), name) def _nose_tools_functions(): """Get an iterator of names and bound methods.""" module = AstroidBuilder().string_build( textwrap.dedent( """ import unittest class Test(unittest.TestCase): pass a = Test() """ ) ) try: case = next(module["a"].infer()) except (InferenceError, StopIteration): return for method in case.methods(): if method.name.startswith("assert") and "_" not in method.name: pep8_name = _pep8(method.name) yield pep8_name, BoundMethod(method, case) if method.name == "assertEqual": # nose also exports assert_equals. 
yield "assert_equals", BoundMethod(method, case) def _nose_tools_transform(node): for method_name, method in _nose_tools_functions(): node.locals[method_name] = [method] def _nose_tools_trivial_transform(): """Custom transform for the nose.tools module.""" stub = AstroidBuilder().string_build("""__all__ = []""") all_entries = ["ok_", "eq_"] for pep8_name, method in _nose_tools_functions(): all_entries.append(pep8_name) stub[pep8_name] = method # Update the __all__ variable, since nose.tools # does this manually with .append. all_assign = stub["__all__"].parent all_object = List(all_entries) all_object.parent = all_assign all_assign.value = all_object return stub def register(manager: AstroidManager) -> None: register_module_extender( manager, "nose.tools.trivial", _nose_tools_trivial_transform ) manager.register_transform( Module, _nose_tools_transform, lambda n: n.name == "nose.tools" ) astroid-3.2.2/astroid/brain/brain_regex.py0000664000175000017500000000664714622475517020522 0ustar epsilonepsilon# Licensed under the LGPL: https://www.gnu.org/licenses/old-licenses/lgpl-2.1.en.html # For details: https://github.com/pylint-dev/astroid/blob/main/LICENSE # Copyright (c) https://github.com/pylint-dev/astroid/blob/main/CONTRIBUTORS.txt from __future__ import annotations from astroid import context, inference_tip, nodes from astroid.brain.helpers import register_module_extender from astroid.builder import _extract_single_node, parse from astroid.const import PY39_PLUS from astroid.manager import AstroidManager def _regex_transform() -> nodes.Module: """The RegexFlag enum exposes all its entries by updating globals(). We hard-code the flags for now. # pylint: disable-next=line-too-long See https://github.com/mrabarnett/mrab-regex/blob/2022.10.31/regex_3/regex.py#L200 """ return parse( """ A = ASCII = 0x80 # Assume ASCII locale. B = BESTMATCH = 0x1000 # Best fuzzy match. D = DEBUG = 0x200 # Print parsed pattern. 
E = ENHANCEMATCH = 0x8000 # Attempt to improve the fit after finding the first # fuzzy match. F = FULLCASE = 0x4000 # Unicode full case-folding. I = IGNORECASE = 0x2 # Ignore case. L = LOCALE = 0x4 # Assume current 8-bit locale. M = MULTILINE = 0x8 # Make anchors look for newline. P = POSIX = 0x10000 # POSIX-style matching (leftmost longest). R = REVERSE = 0x400 # Search backwards. S = DOTALL = 0x10 # Make dot match newline. U = UNICODE = 0x20 # Assume Unicode locale. V0 = VERSION0 = 0x2000 # Old legacy behaviour. DEFAULT_VERSION = V0 V1 = VERSION1 = 0x100 # New enhanced behaviour. W = WORD = 0x800 # Default Unicode word breaks. X = VERBOSE = 0x40 # Ignore whitespace and comments. T = TEMPLATE = 0x1 # Template (present because re module has it). """ ) CLASS_GETITEM_TEMPLATE = """ @classmethod def __class_getitem__(cls, item): return cls """ def _looks_like_pattern_or_match(node: nodes.Call) -> bool: """Check for regex.Pattern or regex.Match call in stdlib. Match these patterns from stdlib/re.py ```py Pattern = type(...) Match = type(...) ``` """ return ( node.root().name == "regex.regex" and isinstance(node.func, nodes.Name) and node.func.name == "type" and isinstance(node.parent, nodes.Assign) and len(node.parent.targets) == 1 and isinstance(node.parent.targets[0], nodes.AssignName) and node.parent.targets[0].name in {"Pattern", "Match"} ) def infer_pattern_match(node: nodes.Call, ctx: context.InferenceContext | None = None): """Infer regex.Pattern and regex.Match as classes. For PY39+ add `__class_getitem__`. 
""" class_def = nodes.ClassDef( name=node.parent.targets[0].name, lineno=node.lineno, col_offset=node.col_offset, parent=node.parent, end_lineno=node.end_lineno, end_col_offset=node.end_col_offset, ) if PY39_PLUS: func_to_add = _extract_single_node(CLASS_GETITEM_TEMPLATE) class_def.locals["__class_getitem__"] = [func_to_add] return iter([class_def]) def register(manager: AstroidManager) -> None: register_module_extender(manager, "regex", _regex_transform) manager.register_transform( nodes.Call, inference_tip(infer_pattern_match), _looks_like_pattern_or_match ) astroid-3.2.2/astroid/brain/brain_functools.py0000664000175000017500000001436614622475517021421 0ustar epsilonepsilon# Licensed under the LGPL: https://www.gnu.org/licenses/old-licenses/lgpl-2.1.en.html # For details: https://github.com/pylint-dev/astroid/blob/main/LICENSE # Copyright (c) https://github.com/pylint-dev/astroid/blob/main/CONTRIBUTORS.txt """Astroid hooks for understanding functools library module.""" from __future__ import annotations from collections.abc import Iterator from functools import partial from itertools import chain from astroid import BoundMethod, arguments, extract_node, nodes, objects from astroid.context import InferenceContext from astroid.exceptions import InferenceError, UseInferenceDefault from astroid.inference_tip import inference_tip from astroid.interpreter import objectmodel from astroid.manager import AstroidManager from astroid.nodes.node_classes import AssignName, Attribute, Call, Name from astroid.nodes.scoped_nodes import FunctionDef from astroid.typing import InferenceResult, SuccessfulInferenceResult from astroid.util import UninferableBase, safe_infer LRU_CACHE = "functools.lru_cache" class LruWrappedModel(objectmodel.FunctionModel): """Special attribute model for functions decorated with functools.lru_cache. The said decorators patches at decoration time some functions onto the decorated function. 
""" @property def attr___wrapped__(self): return self._instance @property def attr_cache_info(self): cache_info = extract_node( """ from functools import _CacheInfo _CacheInfo(0, 0, 0, 0) """ ) class CacheInfoBoundMethod(BoundMethod): def infer_call_result( self, caller: SuccessfulInferenceResult | None, context: InferenceContext | None = None, ) -> Iterator[InferenceResult]: res = safe_infer(cache_info) assert res is not None yield res return CacheInfoBoundMethod(proxy=self._instance, bound=self._instance) @property def attr_cache_clear(self): node = extract_node("""def cache_clear(self): pass""") return BoundMethod(proxy=node, bound=self._instance.parent.scope()) def _transform_lru_cache(node, context: InferenceContext | None = None) -> None: # TODO: this is not ideal, since the node should be immutable, # but due to https://github.com/pylint-dev/astroid/issues/354, # there's not much we can do now. # Replacing the node would work partially, because, # in pylint, the old node would still be available, leading # to spurious false positives. 
node.special_attributes = LruWrappedModel()(node) def _functools_partial_inference( node: nodes.Call, context: InferenceContext | None = None ) -> Iterator[objects.PartialFunction]: call = arguments.CallSite.from_call(node, context=context) number_of_positional = len(call.positional_arguments) if number_of_positional < 1: raise UseInferenceDefault("functools.partial takes at least one argument") if number_of_positional == 1 and not call.keyword_arguments: raise UseInferenceDefault( "functools.partial needs at least to have some filled arguments" ) partial_function = call.positional_arguments[0] try: inferred_wrapped_function = next(partial_function.infer(context=context)) except (InferenceError, StopIteration) as exc: raise UseInferenceDefault from exc if isinstance(inferred_wrapped_function, UninferableBase): raise UseInferenceDefault("Cannot infer the wrapped function") if not isinstance(inferred_wrapped_function, FunctionDef): raise UseInferenceDefault("The wrapped function is not a function") # Determine if the passed keywords into the callsite are supported # by the wrapped function. 
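`_functools_partial_inference` below rejects call sites that pass keywords the wrapped function does not accept. At runtime `functools.partial` is just as strict, but the failure only surfaces when the partial is finally called, which is why checking eagerly during inference is useful:

```python
from functools import partial

def greet(name, punctuation="!"):
    return f"Hello, {name}{punctuation}"

hi = partial(greet, punctuation="?")
assert hi("Ada") == "Hello, Ada?"

# An unknown keyword is accepted by partial() itself but fails at call time.
bad = partial(greet, nonsense=1)
try:
    bad("Ada")
    raised = False
except TypeError:
    raised = True
assert raised
```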
if not inferred_wrapped_function.args: function_parameters = [] else: function_parameters = chain( inferred_wrapped_function.args.args or (), inferred_wrapped_function.args.posonlyargs or (), inferred_wrapped_function.args.kwonlyargs or (), ) parameter_names = { param.name for param in function_parameters if isinstance(param, AssignName) } if set(call.keyword_arguments) - parameter_names: raise UseInferenceDefault("wrapped function received unknown parameters") partial_function = objects.PartialFunction( call, name=inferred_wrapped_function.name, lineno=inferred_wrapped_function.lineno, col_offset=inferred_wrapped_function.col_offset, parent=node.parent, ) partial_function.postinit( args=inferred_wrapped_function.args, body=inferred_wrapped_function.body, decorators=inferred_wrapped_function.decorators, returns=inferred_wrapped_function.returns, type_comment_returns=inferred_wrapped_function.type_comment_returns, type_comment_args=inferred_wrapped_function.type_comment_args, doc_node=inferred_wrapped_function.doc_node, ) return iter((partial_function,)) def _looks_like_lru_cache(node) -> bool: """Check if the given function node is decorated with lru_cache.""" if not node.decorators: return False for decorator in node.decorators.nodes: if not isinstance(decorator, (Attribute, Call)): continue if _looks_like_functools_member(decorator, "lru_cache"): return True return False def _looks_like_functools_member(node: Attribute | Call, member: str) -> bool: """Check if the given Call node is the wanted member of functools.""" if isinstance(node, Attribute): return node.attrname == member if isinstance(node.func, Name): return node.func.name == member if isinstance(node.func, Attribute): return ( node.func.attrname == member and isinstance(node.func.expr, Name) and node.func.expr.name == "functools" ) return False _looks_like_partial = partial(_looks_like_functools_member, member="partial") def register(manager: AstroidManager) -> None: 
manager.register_transform(FunctionDef, _transform_lru_cache, _looks_like_lru_cache) manager.register_transform( Call, inference_tip(_functools_partial_inference), _looks_like_partial, ) astroid-3.2.2/astroid/brain/brain_numpy_random_mtrand.py0000664000175000017500000000665014622475517023457 0ustar epsilonepsilon# Licensed under the LGPL: https://www.gnu.org/licenses/old-licenses/lgpl-2.1.en.html # For details: https://github.com/pylint-dev/astroid/blob/main/LICENSE # Copyright (c) https://github.com/pylint-dev/astroid/blob/main/CONTRIBUTORS.txt # TODO(hippo91) : correct the functions return types """Astroid hooks for numpy.random.mtrand module.""" from astroid.brain.helpers import register_module_extender from astroid.builder import parse from astroid.manager import AstroidManager def numpy_random_mtrand_transform(): return parse( """ def beta(a, b, size=None): return uninferable def binomial(n, p, size=None): return uninferable def bytes(length): return uninferable def chisquare(df, size=None): return uninferable def choice(a, size=None, replace=True, p=None): return uninferable def dirichlet(alpha, size=None): return uninferable def exponential(scale=1.0, size=None): return uninferable def f(dfnum, dfden, size=None): return uninferable def gamma(shape, scale=1.0, size=None): return uninferable def geometric(p, size=None): return uninferable def get_state(): return uninferable def gumbel(loc=0.0, scale=1.0, size=None): return uninferable def hypergeometric(ngood, nbad, nsample, size=None): return uninferable def laplace(loc=0.0, scale=1.0, size=None): return uninferable def logistic(loc=0.0, scale=1.0, size=None): return uninferable def lognormal(mean=0.0, sigma=1.0, size=None): return uninferable def logseries(p, size=None): return uninferable def multinomial(n, pvals, size=None): return uninferable def multivariate_normal(mean, cov, size=None): return uninferable def negative_binomial(n, p, size=None): return uninferable def noncentral_chisquare(df, nonc, 
size=None): return uninferable def noncentral_f(dfnum, dfden, nonc, size=None): return uninferable def normal(loc=0.0, scale=1.0, size=None): return uninferable def pareto(a, size=None): return uninferable def permutation(x): return uninferable def poisson(lam=1.0, size=None): return uninferable def power(a, size=None): return uninferable def rand(*args): return uninferable def randint(low, high=None, size=None, dtype='l'): import numpy return numpy.ndarray((1,1)) def randn(*args): return uninferable def random(size=None): return uninferable def random_integers(low, high=None, size=None): return uninferable def random_sample(size=None): return uninferable def rayleigh(scale=1.0, size=None): return uninferable def seed(seed=None): return uninferable def set_state(state): return uninferable def shuffle(x): return uninferable def standard_cauchy(size=None): return uninferable def standard_exponential(size=None): return uninferable def standard_gamma(shape, size=None): return uninferable def standard_normal(size=None): return uninferable def standard_t(df, size=None): return uninferable def triangular(left, mode, right, size=None): return uninferable def uniform(low=0.0, high=1.0, size=None): return uninferable def vonmises(mu, kappa, size=None): return uninferable def wald(mean, scale, size=None): return uninferable def weibull(a, size=None): return uninferable def zipf(a, size=None): return uninferable """ ) def register(manager: AstroidManager) -> None: register_module_extender( manager, "numpy.random.mtrand", numpy_random_mtrand_transform ) astroid-3.2.2/astroid/brain/brain_dataclasses.py0000664000175000017500000005327614622475517021677 0ustar epsilonepsilon# Licensed under the LGPL: https://www.gnu.org/licenses/old-licenses/lgpl-2.1.en.html # For details: https://github.com/pylint-dev/astroid/blob/main/LICENSE # Copyright (c) https://github.com/pylint-dev/astroid/blob/main/CONTRIBUTORS.txt """ Astroid hook for the dataclasses library. 
Support built-in dataclasses, pydantic.dataclasses, and marshmallow_dataclass-annotated dataclasses. References: - https://docs.python.org/3/library/dataclasses.html - https://pydantic-docs.helpmanual.io/usage/dataclasses/ - https://lovasoa.github.io/marshmallow_dataclass/ """ from __future__ import annotations from collections.abc import Iterator from typing import Literal, Tuple, Union from astroid import bases, context, nodes from astroid.builder import parse from astroid.const import PY39_PLUS, PY310_PLUS from astroid.exceptions import AstroidSyntaxError, InferenceError, UseInferenceDefault from astroid.inference_tip import inference_tip from astroid.manager import AstroidManager from astroid.typing import InferenceResult from astroid.util import Uninferable, UninferableBase, safe_infer _FieldDefaultReturn = Union[ None, Tuple[Literal["default"], nodes.NodeNG], Tuple[Literal["default_factory"], nodes.Call], ] DATACLASSES_DECORATORS = frozenset(("dataclass",)) FIELD_NAME = "field" DATACLASS_MODULES = frozenset( ("dataclasses", "marshmallow_dataclass", "pydantic.dataclasses") ) DEFAULT_FACTORY = "_HAS_DEFAULT_FACTORY" # based on typing.py def is_decorated_with_dataclass( node: nodes.ClassDef, decorator_names: frozenset[str] = DATACLASSES_DECORATORS ) -> bool: """Return True if a decorated node has a `dataclass` decorator applied.""" if not isinstance(node, nodes.ClassDef) or not node.decorators: return False return any( _looks_like_dataclass_decorator(decorator_attribute, decorator_names) for decorator_attribute in node.decorators.nodes ) def dataclass_transform(node: nodes.ClassDef) -> None: """Rewrite a dataclass to be easily understood by pylint.""" node.is_dataclass = True for assign_node in _get_dataclass_attributes(node): name = assign_node.target.name rhs_node = nodes.Unknown( lineno=assign_node.lineno, col_offset=assign_node.col_offset, parent=assign_node, ) rhs_node = AstroidManager().visit_transforms(rhs_node) node.instance_attrs[name] = [rhs_node] if 
not _check_generate_dataclass_init(node): return kw_only_decorated = False if PY310_PLUS and node.decorators.nodes: for decorator in node.decorators.nodes: if not isinstance(decorator, nodes.Call): kw_only_decorated = False break for keyword in decorator.keywords: if keyword.arg == "kw_only": kw_only_decorated = keyword.value.bool_value() init_str = _generate_dataclass_init( node, list(_get_dataclass_attributes(node, init=True)), kw_only_decorated, ) try: init_node = parse(init_str)["__init__"] except AstroidSyntaxError: pass else: init_node.parent = node init_node.lineno, init_node.col_offset = None, None node.locals["__init__"] = [init_node] root = node.root() if DEFAULT_FACTORY not in root.locals: new_assign = parse(f"{DEFAULT_FACTORY} = object()").body[0] new_assign.parent = root root.locals[DEFAULT_FACTORY] = [new_assign.targets[0]] def _get_dataclass_attributes( node: nodes.ClassDef, init: bool = False ) -> Iterator[nodes.AnnAssign]: """Yield the AnnAssign nodes of dataclass attributes for the node. If init is True, also include InitVars. """ for assign_node in node.body: if not isinstance(assign_node, nodes.AnnAssign) or not isinstance( assign_node.target, nodes.AssignName ): continue # Annotation is never None if _is_class_var(assign_node.annotation): # type: ignore[arg-type] continue if _is_keyword_only_sentinel(assign_node.annotation): continue # Annotation is never None if not init and _is_init_var(assign_node.annotation): # type: ignore[arg-type] continue yield assign_node def _check_generate_dataclass_init(node: nodes.ClassDef) -> bool: """Return True if we should generate an __init__ method for node. 
This is True when: - node doesn't define its own __init__ method - the dataclass decorator was called *without* the keyword argument init=False """ if "__init__" in node.locals: return False found = None for decorator_attribute in node.decorators.nodes: if not isinstance(decorator_attribute, nodes.Call): continue if _looks_like_dataclass_decorator(decorator_attribute): found = decorator_attribute if found is None: return True # Check for keyword arguments of the form init=False return not any( keyword.arg == "init" and not keyword.value.bool_value() # type: ignore[union-attr] # value is never None for keyword in found.keywords ) def _find_arguments_from_base_classes( node: nodes.ClassDef, ) -> tuple[ dict[str, tuple[str | None, str | None]], dict[str, tuple[str | None, str | None]] ]: """Iterate through all bases and get their typing and defaults.""" pos_only_store: dict[str, tuple[str | None, str | None]] = {} kw_only_store: dict[str, tuple[str | None, str | None]] = {} # See TODO down below # all_have_defaults = True for base in reversed(node.mro()): if not base.is_dataclass: continue try: base_init: nodes.FunctionDef = base.locals["__init__"][0] except KeyError: continue pos_only, kw_only = base_init.args._get_arguments_data() for posarg, data in pos_only.items(): # if data[1] is None: # if all_have_defaults and pos_only_store: # # TODO: This should return an Uninferable as this would raise # # a TypeError at runtime. However, transforms can't return # # Uninferables currently. 
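The commented-out TODO in `_find_arguments_from_base_classes` refers to real runtime behavior: when a dataclass subclass introduces a field without a default after inheriting fields that have defaults, `dataclasses` raises `TypeError` at class-creation time. A quick stdlib demonstration:

```python
from dataclasses import dataclass

@dataclass
class Base:
    a: int = 0

# A non-default field following an inherited default field is rejected when
# the class is created -- the TypeError the TODO says inference cannot yet model.
try:
    @dataclass
    class Child(Base):
        b: int
    failed = False
except TypeError:
    failed = True
assert failed
```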
# pass # all_have_defaults = False pos_only_store[posarg] = data for kwarg, data in kw_only.items(): kw_only_store[kwarg] = data return pos_only_store, kw_only_store def _parse_arguments_into_strings( pos_only_store: dict[str, tuple[str | None, str | None]], kw_only_store: dict[str, tuple[str | None, str | None]], ) -> tuple[str, str]: """Parse positional and keyword arguments into strings for an __init__ method.""" pos_only, kw_only = "", "" for pos_arg, data in pos_only_store.items(): pos_only += pos_arg if data[0]: pos_only += ": " + data[0] if data[1]: pos_only += " = " + data[1] pos_only += ", " for kw_arg, data in kw_only_store.items(): kw_only += kw_arg if data[0]: kw_only += ": " + data[0] if data[1]: kw_only += " = " + data[1] kw_only += ", " return pos_only, kw_only def _get_previous_field_default(node: nodes.ClassDef, name: str) -> nodes.NodeNG | None: """Get the default value of a previously defined field.""" for base in reversed(node.mro()): if not base.is_dataclass: continue if name in base.locals: for assign in base.locals[name]: if ( isinstance(assign.parent, nodes.AnnAssign) and assign.parent.value and isinstance(assign.parent.value, nodes.Call) and _looks_like_dataclass_field_call(assign.parent.value) ): default = _get_field_default(assign.parent.value) if default: return default[1] return None def _generate_dataclass_init( # pylint: disable=too-many-locals node: nodes.ClassDef, assigns: list[nodes.AnnAssign], kw_only_decorated: bool ) -> str: """Return an init method for a dataclass given the targets.""" params: list[str] = [] kw_only_params: list[str] = [] assignments: list[str] = [] prev_pos_only_store, prev_kw_only_store = _find_arguments_from_base_classes(node) for assign in assigns: name, annotation, value = assign.target.name, assign.annotation, assign.value # Check whether this assign is overriden by a property assignment property_node: nodes.FunctionDef | None = None for additional_assign in node.locals[name]: if not 
isinstance(additional_assign, nodes.FunctionDef): continue if not additional_assign.decorators: continue if "builtins.property" in additional_assign.decoratornames(): property_node = additional_assign break is_field = isinstance(value, nodes.Call) and _looks_like_dataclass_field_call( value, check_scope=False ) if is_field: # Skip any fields that have `init=False` if any( keyword.arg == "init" and not keyword.value.bool_value() for keyword in value.keywords # type: ignore[union-attr] # value is never None ): # Also remove the name from the previous arguments to be inserted later prev_pos_only_store.pop(name, None) prev_kw_only_store.pop(name, None) continue if _is_init_var(annotation): # type: ignore[arg-type] # annotation is never None init_var = True if isinstance(annotation, nodes.Subscript): annotation = annotation.slice else: # Cannot determine type annotation for parameter from InitVar annotation = None assignment_str = "" else: init_var = False assignment_str = f"self.{name} = {name}" ann_str, default_str = None, None if annotation is not None: ann_str = annotation.as_string() if value: if is_field: result = _get_field_default(value) # type: ignore[arg-type] if result: default_type, default_node = result if default_type == "default": default_str = default_node.as_string() elif default_type == "default_factory": default_str = DEFAULT_FACTORY assignment_str = ( f"self.{name} = {default_node.as_string()} " f"if {name} is {DEFAULT_FACTORY} else {name}" ) else: default_str = value.as_string() elif property_node: # We set the result of the property call as default # This hides the fact that this would normally be a 'property object' # But we can't represent those as string try: # Call str to make sure also Uninferable gets stringified default_str = str( next(property_node.infer_call_result(None)).as_string() ) except (InferenceError, StopIteration): pass else: # Even with `init=False` the default value still can be propogated to # later assignments. 
Creating weird signatures like: # (self, a: str = 1) -> None previous_default = _get_previous_field_default(node, name) if previous_default: default_str = previous_default.as_string() # Construct the param string to add to the init if necessary param_str = name if ann_str is not None: param_str += f": {ann_str}" if default_str is not None: param_str += f" = {default_str}" # If the field is a kw_only field, we need to add it to the kw_only_params # This overwrites whether or not the class is kw_only decorated if is_field: kw_only = [k for k in value.keywords if k.arg == "kw_only"] # type: ignore[union-attr] if kw_only: if kw_only[0].value.bool_value(): kw_only_params.append(param_str) else: params.append(param_str) continue # If kw_only decorated, we need to add all parameters to the kw_only_params if kw_only_decorated: if name in prev_kw_only_store: prev_kw_only_store[name] = (ann_str, default_str) else: kw_only_params.append(param_str) else: # If the name was previously seen, overwrite that data # pylint: disable-next=else-if-used if name in prev_pos_only_store: prev_pos_only_store[name] = (ann_str, default_str) elif name in prev_kw_only_store: params = [name, *params] prev_kw_only_store.pop(name) else: params.append(param_str) if not init_var: assignments.append(assignment_str) prev_pos_only, prev_kw_only = _parse_arguments_into_strings( prev_pos_only_store, prev_kw_only_store ) # Construct the new init method paramter string # First we do the positional only parameters, making sure to add the # the self parameter and the comma to allow adding keyword only parameters params_string = "" if "self" in prev_pos_only else "self, " params_string += prev_pos_only + ", ".join(params) if not params_string.endswith(", "): params_string += ", " # Then we add the keyword only parameters if prev_kw_only or kw_only_params: params_string += "*, " params_string += f"{prev_kw_only}{', '.join(kw_only_params)}" assignments_string = "\n ".join(assignments) if assignments else "pass" 
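`_generate_dataclass_init` reproduces, as source text, the `__init__` that `dataclasses` synthesizes at runtime, including the `_HAS_DEFAULT_FACTORY` sentinel trick for `default_factory` fields (CPython uses an equivalent sentinel internally). The runtime result can be inspected for comparison:

```python
import inspect
from dataclasses import dataclass, field

@dataclass
class Point:
    x: int
    y: int = 0
    tags: list = field(default_factory=list)

sig = str(inspect.signature(Point.__init__))
print(sig)  # e.g. (self, x: int, y: int = 0, tags: list = <factory>) -> None
```

The generated signature keeps each field's annotation and default, with `default_factory` fields shown via a sentinel placeholder rather than a concrete value.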
return f"def __init__({params_string}) -> None:\n {assignments_string}" def infer_dataclass_attribute( node: nodes.Unknown, ctx: context.InferenceContext | None = None ) -> Iterator[InferenceResult]: """Inference tip for an Unknown node that was dynamically generated to represent a dataclass attribute. In the case that a default value is provided, that is inferred first. Then, an Instance of the annotated class is yielded. """ assign = node.parent if not isinstance(assign, nodes.AnnAssign): yield Uninferable return annotation, value = assign.annotation, assign.value if value is not None: yield from value.infer(context=ctx) if annotation is not None: yield from _infer_instance_from_annotation(annotation, ctx=ctx) else: yield Uninferable def infer_dataclass_field_call( node: nodes.Call, ctx: context.InferenceContext | None = None ) -> Iterator[InferenceResult]: """Inference tip for dataclass field calls.""" if not isinstance(node.parent, (nodes.AnnAssign, nodes.Assign)): raise UseInferenceDefault result = _get_field_default(node) if not result: yield Uninferable else: default_type, default = result if default_type == "default": yield from default.infer(context=ctx) else: new_call = parse(default.as_string()).body[0].value new_call.parent = node.parent yield from new_call.infer(context=ctx) def _looks_like_dataclass_decorator( node: nodes.NodeNG, decorator_names: frozenset[str] = DATACLASSES_DECORATORS ) -> bool: """Return True if node looks like a dataclass decorator. Uses inference to lookup the value of the node, and if that fails, matches against specific names. 
""" if isinstance(node, nodes.Call): # decorator with arguments node = node.func try: inferred = next(node.infer()) except (InferenceError, StopIteration): inferred = Uninferable if isinstance(inferred, UninferableBase): if isinstance(node, nodes.Name): return node.name in decorator_names if isinstance(node, nodes.Attribute): return node.attrname in decorator_names return False return ( isinstance(inferred, nodes.FunctionDef) and inferred.name in decorator_names and inferred.root().name in DATACLASS_MODULES ) def _looks_like_dataclass_attribute(node: nodes.Unknown) -> bool: """Return True if node was dynamically generated as the child of an AnnAssign statement. """ parent = node.parent if not parent: return False scope = parent.scope() return ( isinstance(parent, nodes.AnnAssign) and isinstance(scope, nodes.ClassDef) and is_decorated_with_dataclass(scope) ) def _looks_like_dataclass_field_call( node: nodes.Call, check_scope: bool = True ) -> bool: """Return True if node is calling dataclasses field or Field from an AnnAssign statement directly in the body of a ClassDef. If check_scope is False, skips checking the statement and body. """ if check_scope: stmt = node.statement() scope = stmt.scope() if not ( isinstance(stmt, nodes.AnnAssign) and stmt.value is not None and isinstance(scope, nodes.ClassDef) and is_decorated_with_dataclass(scope) ): return False try: inferred = next(node.func.infer()) except (InferenceError, StopIteration): return False if not isinstance(inferred, nodes.FunctionDef): return False return inferred.name == FIELD_NAME and inferred.root().name in DATACLASS_MODULES def _get_field_default(field_call: nodes.Call) -> _FieldDefaultReturn: """Return a the default value of a field call, and the corresponding keyword argument name. field(default=...) results in the ... node field(default_factory=...) results in a Call node with func ... 
and no arguments If neither or both arguments are present, return ("", None) instead, indicating that there is not a valid default value. """ default, default_factory = None, None for keyword in field_call.keywords: if keyword.arg == "default": default = keyword.value elif keyword.arg == "default_factory": default_factory = keyword.value if default is not None and default_factory is None: return "default", default if default is None and default_factory is not None: new_call = nodes.Call( lineno=field_call.lineno, col_offset=field_call.col_offset, parent=field_call.parent, end_lineno=field_call.end_lineno, end_col_offset=field_call.end_col_offset, ) new_call.postinit(func=default_factory, args=[], keywords=[]) return "default_factory", new_call return None def _is_class_var(node: nodes.NodeNG) -> bool: """Return True if node is a ClassVar, with or without subscripting.""" if PY39_PLUS: try: inferred = next(node.infer()) except (InferenceError, StopIteration): return False return getattr(inferred, "name", "") == "ClassVar" # Before Python 3.9, inference returns typing._SpecialForm instead of ClassVar. # Our backup is to inspect the node's structure. 
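The pre-3.9 fallback in `_is_class_var` is a purely structural check on the annotation node. A stdlib-`ast` analog of that check (the helper name here is illustrative; astroid's `Subscript`/`Name`/`Attribute` nodes play the roles of their `ast` counterparts):

```python
import ast

def looks_like_classvar(annotation) -> bool:
    # Mirrors the pre-3.9 fallback: ClassVar[...] appears as a Subscript whose
    # value is either the bare name "ClassVar" or an attribute access on it
    # (e.g. typing.ClassVar).
    return isinstance(annotation, ast.Subscript) and (
        (isinstance(annotation.value, ast.Name) and annotation.value.id == "ClassVar")
        or (isinstance(annotation.value, ast.Attribute) and annotation.value.attr == "ClassVar")
    )

ann = ast.parse("x: typing.ClassVar[int]").body[0].annotation
assert looks_like_classvar(ann)
```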
return isinstance(node, nodes.Subscript) and ( isinstance(node.value, nodes.Name) and node.value.name == "ClassVar" or isinstance(node.value, nodes.Attribute) and node.value.attrname == "ClassVar" ) def _is_keyword_only_sentinel(node: nodes.NodeNG) -> bool: """Return True if node is the KW_ONLY sentinel.""" if not PY310_PLUS: return False inferred = safe_infer(node) return ( isinstance(inferred, bases.Instance) and inferred.qname() == "dataclasses._KW_ONLY_TYPE" ) def _is_init_var(node: nodes.NodeNG) -> bool: """Return True if node is an InitVar, with or without subscripting.""" try: inferred = next(node.infer()) except (InferenceError, StopIteration): return False return getattr(inferred, "name", "") == "InitVar" # Allowed typing classes for which we support inferring instances _INFERABLE_TYPING_TYPES = frozenset( ( "Dict", "FrozenSet", "List", "Set", "Tuple", ) ) def _infer_instance_from_annotation( node: nodes.NodeNG, ctx: context.InferenceContext | None = None ) -> Iterator[UninferableBase | bases.Instance]: """Infer an instance corresponding to the type annotation represented by node. Currently has limited support for the typing module. 
""" klass = None try: klass = next(node.infer(context=ctx)) except (InferenceError, StopIteration): yield Uninferable if not isinstance(klass, nodes.ClassDef): yield Uninferable elif klass.root().name in { "typing", "_collections_abc", "", }: # "" because of synthetic nodes in brain_typing.py if klass.name in _INFERABLE_TYPING_TYPES: yield klass.instantiate_class() else: yield Uninferable else: yield klass.instantiate_class() def register(manager: AstroidManager) -> None: manager.register_transform( nodes.ClassDef, dataclass_transform, is_decorated_with_dataclass ) manager.register_transform( nodes.Call, inference_tip(infer_dataclass_field_call, raise_on_overwrite=True), _looks_like_dataclass_field_call, ) manager.register_transform( nodes.Unknown, inference_tip(infer_dataclass_attribute, raise_on_overwrite=True), _looks_like_dataclass_attribute, ) astroid-3.2.2/astroid/brain/brain_mechanize.py0000664000175000017500000000521214622475517021336 0ustar epsilonepsilon# Licensed under the LGPL: https://www.gnu.org/licenses/old-licenses/lgpl-2.1.en.html # For details: https://github.com/pylint-dev/astroid/blob/main/LICENSE # Copyright (c) https://github.com/pylint-dev/astroid/blob/main/CONTRIBUTORS.txt from astroid.brain.helpers import register_module_extender from astroid.builder import AstroidBuilder from astroid.manager import AstroidManager def mechanize_transform(): return AstroidBuilder(AstroidManager()).string_build( """class Browser(object): def __getattr__(self, name): return None def __getitem__(self, name): return None def __setitem__(self, name, val): return None def back(self, n=1): return None def clear_history(self): return None def click(self, *args, **kwds): return None def click_link(self, link=None, **kwds): return None def close(self): return None def encoding(self): return None def find_link( self, text=None, text_regex=None, name=None, name_regex=None, url=None, url_regex=None, tag=None, predicate=None, nr=0, ): return None def follow_link(self, 
link=None, **kwds): return None def forms(self): return None def geturl(self): return None def global_form(self): return None def links(self, **kwds): return None def open_local_file(self, filename): return None def open(self, url, data=None, timeout=None): return None def open_novisit(self, url, data=None, timeout=None): return None def open_local_file(self, filename): return None def reload(self): return None def response(self): return None def select_form(self, name=None, predicate=None, nr=None, **attrs): return None def set_cookie(self, cookie_string): return None def set_handle_referer(self, handle): return None def set_header(self, header, value=None): return None def set_html(self, html, url="http://example.com/"): return None def set_response(self, response): return None def set_simple_cookie(self, name, value, domain, path="/"): return None def submit(self, *args, **kwds): return None def title(self): return None def viewing_html(self): return None def visit_response(self, response, request=None): return None """ ) def register(manager: AstroidManager) -> None: register_module_extender(manager, "mechanize", mechanize_transform) astroid-3.2.2/astroid/brain/brain_gi.py0000664000175000017500000001665614622475517020010 0ustar epsilonepsilon# Licensed under the LGPL: https://www.gnu.org/licenses/old-licenses/lgpl-2.1.en.html # For details: https://github.com/pylint-dev/astroid/blob/main/LICENSE # Copyright (c) https://github.com/pylint-dev/astroid/blob/main/CONTRIBUTORS.txt """Astroid hooks for the Python 2 GObject introspection bindings. 
Helps with understanding everything imported from 'gi.repository' """ # pylint:disable=import-error,import-outside-toplevel import inspect import itertools import re import sys import warnings from astroid import nodes from astroid.builder import AstroidBuilder from astroid.exceptions import AstroidBuildingError from astroid.manager import AstroidManager _inspected_modules = {} _identifier_re = r"^[A-Za-z_]\w*$" _special_methods = frozenset( { "__lt__", "__le__", "__eq__", "__ne__", "__ge__", "__gt__", "__iter__", "__getitem__", "__setitem__", "__delitem__", "__len__", "__bool__", "__nonzero__", "__next__", "__str__", "__contains__", "__enter__", "__exit__", "__repr__", "__getattr__", "__setattr__", "__delattr__", "__del__", "__hash__", } ) def _gi_build_stub(parent): # noqa: C901 """ Inspect the passed module recursively and build stubs for functions, classes, etc. """ classes = {} functions = {} constants = {} methods = {} for name in dir(parent): if name.startswith("__") and name not in _special_methods: continue # Check if this is a valid name in python if not re.match(_identifier_re, name): continue try: obj = getattr(parent, name) except Exception: # pylint: disable=broad-except # gi.module.IntrospectionModule.__getattr__() can raise all kinds of things # like ValueError, TypeError, NotImplementedError, RepositoryError, etc continue if inspect.isclass(obj): classes[name] = obj elif inspect.isfunction(obj) or inspect.isbuiltin(obj): functions[name] = obj elif inspect.ismethod(obj) or inspect.ismethoddescriptor(obj): methods[name] = obj elif ( str(obj).startswith(" bool: # Return whether this looks like a call to gi.require_version(, ) # Only accept function calls with two constant arguments if len(node.args) != 2: return False if not all(isinstance(arg, nodes.Const) for arg in node.args): return False func = node.func if isinstance(func, nodes.Attribute): if func.attrname != "require_version": return False if isinstance(func.expr, nodes.Name) and 
func.expr.name == "gi": return True return False if isinstance(func, nodes.Name): return func.name == "require_version" return False def _register_require_version(node): # Load the gi.require_version locally try: import gi gi.require_version(node.args[0].value, node.args[1].value) except Exception: # pylint:disable=broad-except pass return node def register(manager: AstroidManager) -> None: manager.register_failed_import_hook(_import_gi_module) manager.register_transform( nodes.Call, _register_require_version, _looks_like_require_version ) astroid-3.2.2/astroid/brain/brain_uuid.py0000664000175000017500000000132714622475517020344 0ustar epsilonepsilon# Licensed under the LGPL: https://www.gnu.org/licenses/old-licenses/lgpl-2.1.en.html # For details: https://github.com/pylint-dev/astroid/blob/main/LICENSE # Copyright (c) https://github.com/pylint-dev/astroid/blob/main/CONTRIBUTORS.txt """Astroid hooks for the UUID module.""" from astroid.manager import AstroidManager from astroid.nodes.node_classes import Const from astroid.nodes.scoped_nodes import ClassDef def _patch_uuid_class(node: ClassDef) -> None: # The .int member is patched using __dict__ node.locals["int"] = [Const(0, parent=node)] def register(manager: AstroidManager) -> None: manager.register_transform( ClassDef, _patch_uuid_class, lambda node: node.qname() == "uuid.UUID" ) astroid-3.2.2/astroid/protocols.py0000664000175000017500000007575214622475517017171 0ustar epsilonepsilon# Licensed under the LGPL: https://www.gnu.org/licenses/old-licenses/lgpl-2.1.en.html # For details: https://github.com/pylint-dev/astroid/blob/main/LICENSE # Copyright (c) https://github.com/pylint-dev/astroid/blob/main/CONTRIBUTORS.txt """This module contains a set of functions to handle python protocols for nodes where it makes sense. 
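The brain modules above all follow the same shape: a predicate that recognizes a node (by qualified name or call pattern) plus a transform that rewrites it, both handed to `manager.register_transform`. A minimal standalone sketch of that predicate-plus-transform registry pattern (class and function names here are illustrative, not astroid's actual API):

```python
class Node:
    """Toy AST node base class for the sketch."""

class ClassDefNode(Node):
    def __init__(self, qname: str):
        self.qname = qname
        self.locals: dict[str, list] = {}

class TransformRegistry:
    """Sketch of predicate-gated transform registration."""
    def __init__(self):
        self._transforms = []

    def register_transform(self, node_type, transform, predicate=None):
        self._transforms.append((node_type, transform, predicate))

    def visit(self, node):
        # Apply every transform whose type and predicate both match.
        for node_type, transform, predicate in self._transforms:
            if isinstance(node, node_type) and (predicate is None or predicate(node)):
                node = transform(node) or node
        return node

def patch_uuid(node):
    # Mirrors the idea in _patch_uuid_class above: seed a synthetic member.
    node.locals["int"] = [0]

registry = TransformRegistry()
registry.register_transform(
    ClassDefNode, patch_uuid, predicate=lambda n: n.qname == "uuid.UUID"
)
uuid_cls = registry.visit(ClassDefNode("uuid.UUID"))
other_cls = registry.visit(ClassDefNode("os.PathLike"))
assert "int" in uuid_cls.locals
assert "int" not in other_cls.locals
```

The predicate keeps transforms cheap: only nodes that already look like the target (here, by qualified name) pay for the rewrite.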
""" from __future__ import annotations import collections import itertools import operator as operator_mod from collections.abc import Callable, Generator, Iterator, Sequence from typing import TYPE_CHECKING, Any, TypeVar from astroid import bases, decorators, nodes, util from astroid.const import Context from astroid.context import InferenceContext, copy_context from astroid.exceptions import ( AstroidIndexError, AstroidTypeError, AttributeInferenceError, InferenceError, NoDefault, ) from astroid.nodes import node_classes from astroid.typing import ( ConstFactoryResult, InferenceResult, SuccessfulInferenceResult, ) if TYPE_CHECKING: _TupleListNodeT = TypeVar("_TupleListNodeT", nodes.Tuple, nodes.List) _CONTEXTLIB_MGR = "contextlib.contextmanager" _UNARY_OPERATORS: dict[str, Callable[[Any], Any]] = { "+": operator_mod.pos, "-": operator_mod.neg, "~": operator_mod.invert, "not": operator_mod.not_, } def _infer_unary_op(obj: Any, op: str) -> ConstFactoryResult: """Perform unary operation on `obj`, unless it is `NotImplemented`. Can raise TypeError if operation is unsupported. 
""" if obj is NotImplemented: value = obj else: func = _UNARY_OPERATORS[op] value = func(obj) return nodes.const_factory(value) def tuple_infer_unary_op(self, op): return _infer_unary_op(tuple(self.elts), op) def list_infer_unary_op(self, op): return _infer_unary_op(self.elts, op) def set_infer_unary_op(self, op): return _infer_unary_op(set(self.elts), op) def const_infer_unary_op(self, op): return _infer_unary_op(self.value, op) def dict_infer_unary_op(self, op): return _infer_unary_op(dict(self.items), op) # Binary operations BIN_OP_IMPL = { "+": lambda a, b: a + b, "-": lambda a, b: a - b, "/": lambda a, b: a / b, "//": lambda a, b: a // b, "*": lambda a, b: a * b, "**": lambda a, b: a**b, "%": lambda a, b: a % b, "&": lambda a, b: a & b, "|": lambda a, b: a | b, "^": lambda a, b: a ^ b, "<<": lambda a, b: a << b, ">>": lambda a, b: a >> b, "@": operator_mod.matmul, } for _KEY, _IMPL in list(BIN_OP_IMPL.items()): BIN_OP_IMPL[_KEY + "="] = _IMPL @decorators.yes_if_nothing_inferred def const_infer_binary_op( self: nodes.Const, opnode: nodes.AugAssign | nodes.BinOp, operator: str, other: InferenceResult, context: InferenceContext, _: SuccessfulInferenceResult, ) -> Generator[ConstFactoryResult | util.UninferableBase, None, None]: not_implemented = nodes.Const(NotImplemented) if isinstance(other, nodes.Const): if ( operator == "**" and isinstance(self.value, (int, float)) and isinstance(other.value, (int, float)) and (self.value > 1e5 or other.value > 1e5) ): yield not_implemented return try: impl = BIN_OP_IMPL[operator] try: yield nodes.const_factory(impl(self.value, other.value)) except TypeError: # ArithmeticError is not enough: float >> float is a TypeError yield not_implemented except Exception: # pylint: disable=broad-except yield util.Uninferable except TypeError: yield not_implemented elif isinstance(self.value, str) and operator == "%": # TODO(cpopa): implement string interpolation later on. 
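The `BIN_OP_IMPL` table above maps operator tokens to plain callables and then reuses each entry for the augmented form by appending `"="`; `const_infer_binary_op` applies the looked-up callable and treats `TypeError` as `NotImplemented`. A minimal self-contained sketch of that table-driven dispatch (names are illustrative, not astroid's):

```python
import operator

# Map operator tokens to plain callables, as in the table above.
BIN_OPS = {
    "+": operator.add,
    "-": operator.sub,
    "*": operator.mul,
    "%": operator.mod,
    ">>": operator.rshift,
}
# Augmented assignment operators ("+=", "-=", ...) reuse the same callables.
for _key, _impl in list(BIN_OPS.items()):
    BIN_OPS[_key + "="] = _impl

def apply_bin_op(op: str, left, right):
    """Apply `op` to two constant values; unsupported operand types
    surface as NotImplemented rather than an exception."""
    try:
        return BIN_OPS[op](left, right)
    except TypeError:
        # e.g. 1.0 >> 2.0 raises TypeError (not ArithmeticError),
        # matching the comment in const_infer_binary_op above.
        return NotImplemented

assert apply_bin_op("+", 2, 3) == 5
assert apply_bin_op("+=", "a", "b") == "ab"
assert apply_bin_op(">>", 1.0, 2.0) is NotImplemented
```

Building the augmented forms from the plain table keeps the two operator families from drifting apart.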
yield util.Uninferable else: yield not_implemented def _multiply_seq_by_int( self: _TupleListNodeT, opnode: nodes.AugAssign | nodes.BinOp, value: int, context: InferenceContext, ) -> _TupleListNodeT: node = self.__class__(parent=opnode) if value > 1e8: node.elts = [util.Uninferable] return node filtered_elts = ( util.safe_infer(elt, context) or util.Uninferable for elt in self.elts if not isinstance(elt, util.UninferableBase) ) node.elts = list(filtered_elts) * value return node def _filter_uninferable_nodes( elts: Sequence[InferenceResult], context: InferenceContext ) -> Iterator[SuccessfulInferenceResult]: for elt in elts: if isinstance(elt, util.UninferableBase): yield nodes.Unknown() else: for inferred in elt.infer(context): if not isinstance(inferred, util.UninferableBase): yield inferred else: yield nodes.Unknown() @decorators.yes_if_nothing_inferred def tl_infer_binary_op( self: _TupleListNodeT, opnode: nodes.AugAssign | nodes.BinOp, operator: str, other: InferenceResult, context: InferenceContext, method: SuccessfulInferenceResult, ) -> Generator[_TupleListNodeT | nodes.Const | util.UninferableBase, None, None]: """Infer a binary operation on a tuple or list. The instance on which the binary operation is performed is a tuple or list. 
This refers to the left-hand side of the operation, so: 'tuple() + 1' or '[] + A()' """ from astroid import helpers # pylint: disable=import-outside-toplevel # For tuples and list the boundnode is no longer the tuple or list instance context.boundnode = None not_implemented = nodes.Const(NotImplemented) if isinstance(other, self.__class__) and operator == "+": node = self.__class__(parent=opnode) node.elts = list( itertools.chain( _filter_uninferable_nodes(self.elts, context), _filter_uninferable_nodes(other.elts, context), ) ) yield node elif isinstance(other, nodes.Const) and operator == "*": if not isinstance(other.value, int): yield not_implemented return yield _multiply_seq_by_int(self, opnode, other.value, context) elif isinstance(other, bases.Instance) and operator == "*": # Verify if the instance supports __index__. as_index = helpers.class_instance_as_index(other) if not as_index: yield util.Uninferable elif not isinstance(as_index.value, int): # pragma: no cover # already checked by class_instance_as_index() but faster than casting raise AssertionError("Please open a bug report.") else: yield _multiply_seq_by_int(self, opnode, as_index.value, context) else: yield not_implemented @decorators.yes_if_nothing_inferred def instance_class_infer_binary_op( self: nodes.ClassDef, opnode: nodes.AugAssign | nodes.BinOp, operator: str, other: InferenceResult, context: InferenceContext, method: SuccessfulInferenceResult, ) -> Generator[InferenceResult, None, None]: return method.infer_call_result(self, context) # assignment ################################################################## # pylint: disable-next=pointless-string-statement """The assigned_stmts method is responsible to return the assigned statement (e.g. not inferred) according to the assignment type. The `assign_path` argument is used to record the lhs path of the original node. 
For instance if we want assigned statements for 'c' in 'a, (b,c)', assign_path will be [1, 1] once arrived to the Assign node. The `context` argument is the current inference context which should be given to any intermediary inference necessary. """ def _resolve_looppart(parts, assign_path, context): """Recursive function to resolve multiple assignments on loops.""" assign_path = assign_path[:] index = assign_path.pop(0) for part in parts: if isinstance(part, util.UninferableBase): continue if not hasattr(part, "itered"): continue try: itered = part.itered() except TypeError: continue try: if isinstance(itered[index], (nodes.Const, nodes.Name)): itered = [part] except IndexError: pass for stmt in itered: index_node = nodes.Const(index) try: assigned = stmt.getitem(index_node, context) except (AttributeError, AstroidTypeError, AstroidIndexError): continue if not assign_path: # we achieved to resolved the assignment path, # don't infer the last part yield assigned elif isinstance(assigned, util.UninferableBase): break else: # we are not yet on the last part of the path # search on each possibly inferred value try: yield from _resolve_looppart( assigned.infer(context), assign_path, context ) except InferenceError: break @decorators.raise_if_nothing_inferred def for_assigned_stmts( self: nodes.For | nodes.Comprehension, node: node_classes.AssignedStmtsPossibleNode = None, context: InferenceContext | None = None, assign_path: list[int] | None = None, ) -> Any: if isinstance(self, nodes.AsyncFor) or getattr(self, "is_async", False): # Skip inferring of async code for now return { "node": self, "unknown": node, "assign_path": assign_path, "context": context, } if assign_path is None: for lst in self.iter.infer(context): if isinstance(lst, (nodes.Tuple, nodes.List)): yield from lst.elts else: yield from _resolve_looppart(self.iter.infer(context), assign_path, context) return { "node": self, "unknown": node, "assign_path": assign_path, "context": context, } def 
sequence_assigned_stmts( self: nodes.Tuple | nodes.List, node: node_classes.AssignedStmtsPossibleNode = None, context: InferenceContext | None = None, assign_path: list[int] | None = None, ) -> Any: if assign_path is None: assign_path = [] try: index = self.elts.index(node) # type: ignore[arg-type] except ValueError as exc: raise InferenceError( "Tried to retrieve a node {node!r} which does not exist", node=self, assign_path=assign_path, context=context, ) from exc assign_path.insert(0, index) return self.parent.assigned_stmts( node=self, context=context, assign_path=assign_path ) def assend_assigned_stmts( self: nodes.AssignName | nodes.AssignAttr, node: node_classes.AssignedStmtsPossibleNode = None, context: InferenceContext | None = None, assign_path: list[int] | None = None, ) -> Any: return self.parent.assigned_stmts(node=self, context=context) def _arguments_infer_argname( self, name: str | None, context: InferenceContext ) -> Generator[InferenceResult, None, None]: # arguments information may be missing, in which case we can't do anything # more from astroid import arguments # pylint: disable=import-outside-toplevel if not self.arguments: yield util.Uninferable return args = [arg for arg in self.arguments if arg.name not in [self.vararg, self.kwarg]] functype = self.parent.type # first argument of instance/class method if ( args and getattr(self.arguments[0], "name", None) == name and functype != "staticmethod" ): cls = self.parent.parent.scope() is_metaclass = isinstance(cls, nodes.ClassDef) and cls.type == "metaclass" # If this is a metaclass, then the first argument will always # be the class, not an instance. 
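`_arguments_infer_argname` above yields the class for a classmethod's first parameter, an instance for a plain method's, and skips staticmethods entirely. The runtime binding rules it models can be checked directly with plain Python, no astroid required:

```python
class Greeter:
    def method(self):
        # In a plain method the implicit first argument is the instance.
        return self

    @classmethod
    def cls_method(cls):
        # In a classmethod the implicit first argument is the class itself,
        # whether called on the class or on an instance.
        return cls

    @staticmethod
    def static_method():
        # A staticmethod binds no implicit first argument at all, which is
        # why the inference above excludes functype == "staticmethod".
        return None

g = Greeter()
assert g.method() is g
assert g.cls_method() is Greeter
assert Greeter.cls_method() is Greeter
assert g.static_method() is None
```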
if context.boundnode and isinstance(context.boundnode, bases.Instance): cls = context.boundnode._proxied if is_metaclass or functype == "classmethod": yield cls return if functype == "method": yield cls.instantiate_class() return if context and context.callcontext: callee = context.callcontext.callee while hasattr(callee, "_proxied"): callee = callee._proxied if getattr(callee, "name", None) == self.parent.name: call_site = arguments.CallSite(context.callcontext, context.extra_context) yield from call_site.infer_argument(self.parent, name, context) return if name == self.vararg: vararg = nodes.const_factory(()) vararg.parent = self if not args and self.parent.name == "__init__": cls = self.parent.parent.scope() vararg.elts = [cls.instantiate_class()] yield vararg return if name == self.kwarg: kwarg = nodes.const_factory({}) kwarg.parent = self yield kwarg return # if there is a default value, yield it. And then yield Uninferable to reflect # we can't guess given argument value try: context = copy_context(context) yield from self.default_value(name).infer(context) yield util.Uninferable except NoDefault: yield util.Uninferable def arguments_assigned_stmts( self: nodes.Arguments, node: node_classes.AssignedStmtsPossibleNode = None, context: InferenceContext | None = None, assign_path: list[int] | None = None, ) -> Any: from astroid import arguments # pylint: disable=import-outside-toplevel try: node_name = node.name # type: ignore[union-attr] except AttributeError: # Added to handle edge cases where node.name is not defined. 
# https://github.com/pylint-dev/astroid/pull/1644#discussion_r901545816 node_name = None # pragma: no cover if context and context.callcontext: callee = context.callcontext.callee while hasattr(callee, "_proxied"): callee = callee._proxied else: return _arguments_infer_argname(self, node_name, context) if node and getattr(callee, "name", None) == node.frame().name: # reset call context/name callcontext = context.callcontext context = copy_context(context) context.callcontext = None args = arguments.CallSite(callcontext, context=context) return args.infer_argument(self.parent, node_name, context) return _arguments_infer_argname(self, node_name, context) @decorators.raise_if_nothing_inferred def assign_assigned_stmts( self: nodes.AugAssign | nodes.Assign | nodes.AnnAssign | nodes.TypeAlias, node: node_classes.AssignedStmtsPossibleNode = None, context: InferenceContext | None = None, assign_path: list[int] | None = None, ) -> Any: if not assign_path: yield self.value return None yield from _resolve_assignment_parts( self.value.infer(context), assign_path, context ) return { "node": self, "unknown": node, "assign_path": assign_path, "context": context, } def assign_annassigned_stmts( self: nodes.AnnAssign, node: node_classes.AssignedStmtsPossibleNode = None, context: InferenceContext | None = None, assign_path: list[int] | None = None, ) -> Any: for inferred in assign_assigned_stmts(self, node, context, assign_path): if inferred is None: yield util.Uninferable else: yield inferred def _resolve_assignment_parts(parts, assign_path, context): """Recursive function to resolve multiple assignments.""" assign_path = assign_path[:] index = assign_path.pop(0) for part in parts: assigned = None if isinstance(part, nodes.Dict): # A dictionary in an iterating context try: assigned, _ = part.items[index] except IndexError: return elif hasattr(part, "getitem"): index_node = nodes.Const(index) try: assigned = part.getitem(index_node, context) except (AstroidTypeError, 
AstroidIndexError): return if not assigned: return if not assign_path: # we achieved to resolved the assignment path, don't infer the # last part yield assigned elif isinstance(assigned, util.UninferableBase): return else: # we are not yet on the last part of the path search on each # possibly inferred value try: yield from _resolve_assignment_parts( assigned.infer(context), assign_path, context ) except InferenceError: return @decorators.raise_if_nothing_inferred def excepthandler_assigned_stmts( self: nodes.ExceptHandler, node: node_classes.AssignedStmtsPossibleNode = None, context: InferenceContext | None = None, assign_path: list[int] | None = None, ) -> Any: from astroid import objects # pylint: disable=import-outside-toplevel for assigned in node_classes.unpack_infer(self.type): if isinstance(assigned, nodes.ClassDef): assigned = objects.ExceptionInstance(assigned) yield assigned return { "node": self, "unknown": node, "assign_path": assign_path, "context": context, } def _infer_context_manager(self, mgr, context): try: inferred = next(mgr.infer(context=context)) except StopIteration as e: raise InferenceError(node=mgr) from e if isinstance(inferred, bases.Generator): # Check if it is decorated with contextlib.contextmanager. func = inferred.parent if not func.decorators: raise InferenceError( "No decorators found on inferred generator %s", node=func ) for decorator_node in func.decorators.nodes: decorator = next(decorator_node.infer(context=context), None) if isinstance(decorator, nodes.FunctionDef): if decorator.qname() == _CONTEXTLIB_MGR: break else: # It doesn't interest us. 
raise InferenceError(node=func) try: yield next(inferred.infer_yield_types()) except StopIteration as e: raise InferenceError(node=func) from e elif isinstance(inferred, bases.Instance): try: enter = next(inferred.igetattr("__enter__", context=context)) except (InferenceError, AttributeInferenceError, StopIteration) as exc: raise InferenceError(node=inferred) from exc if not isinstance(enter, bases.BoundMethod): raise InferenceError(node=enter) yield from enter.infer_call_result(self, context) else: raise InferenceError(node=mgr) @decorators.raise_if_nothing_inferred def with_assigned_stmts( self: nodes.With, node: node_classes.AssignedStmtsPossibleNode = None, context: InferenceContext | None = None, assign_path: list[int] | None = None, ) -> Any: """Infer names and other nodes from a *with* statement. This enables only inference for name binding in a *with* statement. For instance, in the following code, inferring `func` will return the `ContextManager` class, not whatever ``__enter__`` returns. We are doing this intentionally, because we consider that the context manager result is whatever __enter__ returns and what it is binded using the ``as`` keyword. class ContextManager(object): def __enter__(self): return 42 with ContextManager() as f: pass # ContextManager().infer() will return ContextManager # f.infer() will return 42. Arguments: self: nodes.With node: The target of the assignment, `as (a, b)` in `with foo as (a, b)`. context: Inference context used for caching already inferred objects assign_path: A list of indices, where each index specifies what item to fetch from the inference results. """ try: mgr = next(mgr for (mgr, vars) in self.items if vars == node) except StopIteration: return None if assign_path is None: yield from _infer_context_manager(self, mgr, context) else: for result in _infer_context_manager(self, mgr, context): # Walk the assign_path and get the item at the final index. 
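The context-manager inference above distinguishes two cases: for a `contextlib.contextmanager`-decorated generator the `as` target is whatever the generator yields (hence `infer_yield_types()`), while for an ordinary instance it is the return value of `__enter__`. The runtime behaviour being modelled:

```python
from contextlib import contextmanager

@contextmanager
def managed():
    # The value yielded here is what `with managed() as f` binds to f.
    yield 42

class Manager:
    def __enter__(self):
        # For a plain instance, the bound value comes from __enter__.
        return "entered"

    def __exit__(self, *exc):
        return False

with managed() as f:
    assert f == 42
with Manager() as g:
    assert g == "entered"
```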
obj = result for index in assign_path: if not hasattr(obj, "elts"): raise InferenceError( "Wrong type ({targets!r}) for {node!r} assignment", node=self, targets=node, assign_path=assign_path, context=context, ) try: obj = obj.elts[index] except IndexError as exc: raise InferenceError( "Tried to infer a nonexistent target with index {index} " "in {node!r}.", node=self, targets=node, assign_path=assign_path, context=context, ) from exc except TypeError as exc: raise InferenceError( "Tried to unpack a non-iterable value in {node!r}.", node=self, targets=node, assign_path=assign_path, context=context, ) from exc yield obj return { "node": self, "unknown": node, "assign_path": assign_path, "context": context, } @decorators.raise_if_nothing_inferred def named_expr_assigned_stmts( self: nodes.NamedExpr, node: node_classes.AssignedStmtsPossibleNode, context: InferenceContext | None = None, assign_path: list[int] | None = None, ) -> Any: """Infer names and other nodes from an assignment expression.""" if self.target == node: yield from self.value.infer(context=context) else: raise InferenceError( "Cannot infer NamedExpr node {node!r}", node=self, assign_path=assign_path, context=context, ) @decorators.yes_if_nothing_inferred def starred_assigned_stmts( # noqa: C901 self: nodes.Starred, node: node_classes.AssignedStmtsPossibleNode = None, context: InferenceContext | None = None, assign_path: list[int] | None = None, ) -> Any: """ Arguments: self: nodes.Starred node: a node related to the current underlying Node. context: Inference context used for caching already inferred objects assign_path: A list of indices, where each index specifies what item to fetch from the inference results. 
""" # pylint: disable=too-many-locals,too-many-statements def _determine_starred_iteration_lookups( starred: nodes.Starred, target: nodes.Tuple, lookups: list[tuple[int, int]] ) -> None: # Determine the lookups for the rhs of the iteration itered = target.itered() for index, element in enumerate(itered): if ( isinstance(element, nodes.Starred) and element.value.name == starred.value.name ): lookups.append((index, len(itered))) break if isinstance(element, nodes.Tuple): lookups.append((index, len(element.itered()))) _determine_starred_iteration_lookups(starred, element, lookups) stmt = self.statement() if not isinstance(stmt, (nodes.Assign, nodes.For)): raise InferenceError( "Statement {stmt!r} enclosing {node!r} must be an Assign or For node.", node=self, stmt=stmt, unknown=node, context=context, ) if context is None: context = InferenceContext() if isinstance(stmt, nodes.Assign): value = stmt.value lhs = stmt.targets[0] if not isinstance(lhs, nodes.BaseContainer): yield util.Uninferable return if sum(1 for _ in lhs.nodes_of_class(nodes.Starred)) > 1: raise InferenceError( "Too many starred arguments in the assignment targets {lhs!r}.", node=self, targets=lhs, unknown=node, context=context, ) try: rhs = next(value.infer(context)) except (InferenceError, StopIteration): yield util.Uninferable return if isinstance(rhs, util.UninferableBase) or not hasattr(rhs, "itered"): yield util.Uninferable return try: elts = collections.deque(rhs.itered()) # type: ignore[union-attr] except TypeError: yield util.Uninferable return # Unpack iteratively the values from the rhs of the assignment, # until the find the starred node. What will remain will # be the list of values which the Starred node will represent # This is done in two steps, from left to right to remove # anything before the starred node and from right to left # to remove anything after the starred node. 
for index, left_node in enumerate(lhs.elts): if not isinstance(left_node, nodes.Starred): if not elts: break elts.popleft() continue lhs_elts = collections.deque(reversed(lhs.elts[index:])) for right_node in lhs_elts: if not isinstance(right_node, nodes.Starred): if not elts: break elts.pop() continue # We're done unpacking. packed = nodes.List( ctx=Context.Store, parent=self, lineno=lhs.lineno, col_offset=lhs.col_offset, ) packed.postinit(elts=list(elts)) yield packed break if isinstance(stmt, nodes.For): try: inferred_iterable = next(stmt.iter.infer(context=context)) except (InferenceError, StopIteration): yield util.Uninferable return if isinstance(inferred_iterable, util.UninferableBase) or not hasattr( inferred_iterable, "itered" ): yield util.Uninferable return try: itered = inferred_iterable.itered() # type: ignore[union-attr] except TypeError: yield util.Uninferable return target = stmt.target if not isinstance(target, nodes.Tuple): raise InferenceError( "Could not make sense of this, the target must be a tuple", context=context, ) lookups: list[tuple[int, int]] = [] _determine_starred_iteration_lookups(self, target, lookups) if not lookups: raise InferenceError( "Could not make sense of this, needs at least a lookup", context=context ) # Make the last lookup a slice, since that what we want for a Starred node last_element_index, last_element_length = lookups[-1] is_starred_last = last_element_index == (last_element_length - 1) lookup_slice = slice( last_element_index, None if is_starred_last else (last_element_length - last_element_index), ) last_lookup = lookup_slice for element in itered: # We probably want to infer the potential values *for each* element in an # iterable, but we can't infer a list of all values, when only a list of # step values are expected: # # for a, *b in [...]: # b # # *b* should now point to just the elements at that particular iteration step, # which astroid can't know about. 
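The two-pass trimming above (popleft one value per plain target before the starred name, then pop one per plain target after it) reproduces Python's own starred-assignment semantics. A self-contained sketch of just that deque algorithm, checked against the interpreter:

```python
import collections

def starred_slice(targets_before: int, targets_after: int, values):
    """Sketch of the deque-based trimming used above: drop one value per
    plain target on the left, one per plain target on the right; whatever
    remains belongs to the starred name."""
    elts = collections.deque(values)
    for _ in range(targets_before):
        elts.popleft()
    for _ in range(targets_after):
        elts.pop()
    return list(elts)

# Equivalent to: a, *b, c = [1, 2, 3, 4, 5]
a, *b, c = [1, 2, 3, 4, 5]
assert starred_slice(1, 1, [1, 2, 3, 4, 5]) == b == [2, 3, 4]
assert starred_slice(2, 0, [1, 2, 3]) == [3]
```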
found_element = None for index, lookup in enumerate(lookups): if not hasattr(element, "itered"): break if index + 1 is len(lookups): cur_lookup: slice | int = last_lookup else: # Grab just the index, not the whole length cur_lookup = lookup[0] try: itered_inner_element = element.itered() element = itered_inner_element[cur_lookup] except IndexError: break except TypeError: # Most likely the itered() call failed, cannot make sense of this yield util.Uninferable return else: found_element = element unpacked = nodes.List( ctx=Context.Store, parent=self, lineno=self.lineno, col_offset=self.col_offset, ) unpacked.postinit(elts=found_element or []) yield unpacked return yield util.Uninferable @decorators.yes_if_nothing_inferred def match_mapping_assigned_stmts( self: nodes.MatchMapping, node: nodes.AssignName, context: InferenceContext | None = None, assign_path: None = None, ) -> Generator[nodes.NodeNG, None, None]: """Return empty generator (return -> raises StopIteration) so inferred value is Uninferable. """ return yield @decorators.yes_if_nothing_inferred def match_star_assigned_stmts( self: nodes.MatchStar, node: nodes.AssignName, context: InferenceContext | None = None, assign_path: None = None, ) -> Generator[nodes.NodeNG, None, None]: """Return empty generator (return -> raises StopIteration) so inferred value is Uninferable. """ return yield @decorators.yes_if_nothing_inferred def match_as_assigned_stmts( self: nodes.MatchAs, node: nodes.AssignName, context: InferenceContext | None = None, assign_path: None = None, ) -> Generator[nodes.NodeNG, None, None]: """Infer MatchAs as the Match subject if it's the only MatchCase pattern else raise StopIteration to yield Uninferable. 
""" if ( isinstance(self.parent, nodes.MatchCase) and isinstance(self.parent.parent, nodes.Match) and self.pattern is None ): yield self.parent.parent.subject @decorators.yes_if_nothing_inferred def generic_type_assigned_stmts( self: nodes.TypeVar | nodes.TypeVarTuple | nodes.ParamSpec, node: nodes.AssignName, context: InferenceContext | None = None, assign_path: None = None, ) -> Generator[nodes.NodeNG, None, None]: """Hack. Return any Node so inference doesn't fail when evaluating __class_getitem__. Revert if it's causing issues. """ yield nodes.Const(None) astroid-3.2.2/astroid/nodes/0000775000175000017500000000000014622475517015663 5ustar epsilonepsilonastroid-3.2.2/astroid/nodes/scoped_nodes/0000775000175000017500000000000014622475517020330 5ustar epsilonepsilonastroid-3.2.2/astroid/nodes/scoped_nodes/__init__.py0000664000175000017500000000231514622475517022442 0ustar epsilonepsilon# Licensed under the LGPL: https://www.gnu.org/licenses/old-licenses/lgpl-2.1.en.html # For details: https://github.com/pylint-dev/astroid/blob/main/LICENSE # Copyright (c) https://github.com/pylint-dev/astroid/blob/main/CONTRIBUTORS.txt """This module contains all classes that are considered a "scoped" node and anything related. A scope node is a node that opens a new local scope in the language definition: Module, ClassDef, FunctionDef (and Lambda, GeneratorExp, DictComp and SetComp to some extent). 
""" from astroid.nodes.scoped_nodes.mixin import ComprehensionScope, LocalsDictNodeNG from astroid.nodes.scoped_nodes.scoped_nodes import ( AsyncFunctionDef, ClassDef, DictComp, FunctionDef, GeneratorExp, Lambda, ListComp, Module, SetComp, _is_metaclass, function_to_method, get_wrapping_class, ) from astroid.nodes.scoped_nodes.utils import builtin_lookup __all__ = ( "AsyncFunctionDef", "ClassDef", "ComprehensionScope", "DictComp", "FunctionDef", "GeneratorExp", "Lambda", "ListComp", "LocalsDictNodeNG", "Module", "SetComp", "builtin_lookup", "function_to_method", "get_wrapping_class", "_is_metaclass", ) astroid-3.2.2/astroid/nodes/scoped_nodes/mixin.py0000664000175000017500000001575714622475517022045 0ustar epsilonepsilon# Licensed under the LGPL: https://www.gnu.org/licenses/old-licenses/lgpl-2.1.en.html # For details: https://github.com/pylint-dev/astroid/blob/main/LICENSE # Copyright (c) https://github.com/pylint-dev/astroid/blob/main/CONTRIBUTORS.txt """This module contains mixin classes for scoped nodes.""" from __future__ import annotations from typing import TYPE_CHECKING, TypeVar, overload from astroid.exceptions import ParentMissingError from astroid.filter_statements import _filter_stmts from astroid.nodes import _base_nodes, scoped_nodes from astroid.nodes.scoped_nodes.utils import builtin_lookup from astroid.typing import InferenceResult, SuccessfulInferenceResult if TYPE_CHECKING: from astroid import nodes _T = TypeVar("_T") class LocalsDictNodeNG(_base_nodes.LookupMixIn): """this class provides locals handling common to Module, FunctionDef and ClassDef nodes, including a dict like interface for direct access to locals information """ # attributes below are set by the builder module or by raw factories locals: dict[str, list[InferenceResult]] """A map of the name of a local variable to the node defining the local.""" def qname(self) -> str: """Get the 'qualified' name of the node. For example: module.name, module.class.name ... 
:returns: The qualified name. :rtype: str """ # pylint: disable=no-member; github.com/pylint-dev/astroid/issues/278 if self.parent is None: return self.name try: return f"{self.parent.frame().qname()}.{self.name}" except ParentMissingError: return self.name def scope(self: _T) -> _T: """The first parent node defining a new scope. :returns: The first parent scope node. :rtype: Module or FunctionDef or ClassDef or Lambda or GenExpr """ return self def scope_lookup( self, node: _base_nodes.LookupMixIn, name: str, offset: int = 0 ) -> tuple[LocalsDictNodeNG, list[nodes.NodeNG]]: """Lookup where the given variable is assigned. :param node: The node to look for assignments up to. Any assignments after the given node are ignored. :param name: The name of the variable to find assignments for. :param offset: The line offset to filter statements up to. :returns: This scope node and the list of assignments associated to the given name according to the scope where it has been found (locals, globals or builtin). """ raise NotImplementedError def _scope_lookup( self, node: _base_nodes.LookupMixIn, name: str, offset: int = 0 ) -> tuple[LocalsDictNodeNG, list[nodes.NodeNG]]: """XXX method for interfacing the scope lookup""" try: stmts = _filter_stmts(node, self.locals[name], self, offset) except KeyError: stmts = () if stmts: return self, stmts # Handle nested scopes: since class names do not extend to nested # scopes (e.g., methods), we find the next enclosing non-class scope pscope = self.parent and self.parent.scope() while pscope is not None: if not isinstance(pscope, scoped_nodes.ClassDef): return pscope.scope_lookup(node, name) pscope = pscope.parent and pscope.parent.scope() # self is at the top level of a module, or is enclosed only by ClassDefs return builtin_lookup(name) def set_local(self, name: str, stmt: nodes.NodeNG) -> None: """Define that the given name is declared in the given statement node. .. seealso:: :meth:`scope` :param name: The name that is being defined. 
        :param stmt: The statement that defines the given name.
        """
        # assert not stmt in self.locals.get(name, ()), (self, stmt)
        self.locals.setdefault(name, []).append(stmt)

    __setitem__ = set_local

    def _append_node(self, child: nodes.NodeNG) -> None:
        """Append a child, linking it in the tree."""
        # pylint: disable=no-member; depending on the class
        # which uses the current class as a mixin or base class.
        # It's rewritten in 2.0, so it makes no sense for now
        # to spend development time on it.
        self.body.append(child)  # type: ignore[attr-defined]
        child.parent = self

    @overload
    def add_local_node(
        self, child_node: nodes.ClassDef, name: str | None = ...
    ) -> None: ...

    @overload
    def add_local_node(self, child_node: nodes.NodeNG, name: str) -> None: ...

    def add_local_node(self, child_node: nodes.NodeNG, name: str | None = None) -> None:
        """Append a child that should alter the locals of this scope node.

        :param child_node: The child node that will alter locals.

        :param name: The name of the local that will be altered by the
            given child node.
        """
        if name != "__class__":
            # adding a __class__ node as a child would cause infinite recursion later!
            self._append_node(child_node)
        self.set_local(name or child_node.name, child_node)  # type: ignore[attr-defined]

    def __getitem__(self, item: str) -> SuccessfulInferenceResult:
        """The first node that defines the given local.

        :param item: The name of the locally defined object.

        :raises KeyError: If the name is not defined.
        """
        return self.locals[item][0]

    def __iter__(self):
        """Iterate over the names of locals defined in this scoped node.

        :returns: The names of the defined locals.
        :rtype: iterable(str)
        """
        return iter(self.keys())

    def keys(self):
        """The names of locals defined in this scoped node.

        :returns: The names of the defined locals.
        :rtype: list(str)
        """
        return list(self.locals.keys())

    def values(self):
        """The nodes that define the locals in this scoped node.

        :returns: The nodes that define locals.
        :rtype: list(NodeNG)
        """
        # pylint: disable=consider-using-dict-items
        # It looks like this class overrides items/keys/values;
        # probably not worth the headache
        return [self[key] for key in self.keys()]

    def items(self):
        """Get the names of the locals and the node that defines the local.

        :returns: The names of locals and their associated node.
        :rtype: list(tuple(str, NodeNG))
        """
        return list(zip(self.keys(), self.values()))

    def __contains__(self, name) -> bool:
        """Check if a local is defined in this scope.

        :param name: The name of the local to check for.
        :type name: str

        :returns: Whether this node has a local of the given name.
        """
        return name in self.locals


class ComprehensionScope(LocalsDictNodeNG):
    """Scoping for different types of comprehensions."""

    scope_lookup = LocalsDictNodeNG._scope_lookup

    generators: list[nodes.Comprehension]
    """The generators that are looped through."""
astroid-3.2.2/astroid/nodes/scoped_nodes/scoped_nodes.py0000664000175000017500000031502714622475517023357 0ustar epsilonepsilon# Licensed under the LGPL: https://www.gnu.org/licenses/old-licenses/lgpl-2.1.en.html
# For details: https://github.com/pylint-dev/astroid/blob/main/LICENSE
# Copyright (c) https://github.com/pylint-dev/astroid/blob/main/CONTRIBUTORS.txt

"""
This module contains the classes for "scoped" nodes, i.e. those which open a new
local scope in the language definition: Module, ClassDef, FunctionDef (and
Lambda, GeneratorExp, DictComp and SetComp to some extent).
""" from __future__ import annotations import io import itertools import os import warnings from collections.abc import Generator, Iterable, Iterator, Sequence from functools import cached_property, lru_cache from typing import TYPE_CHECKING, Any, ClassVar, Literal, NoReturn, TypeVar from astroid import bases, protocols, util from astroid.const import IS_PYPY, PY38, PY39_PLUS, PYPY_7_3_11_PLUS from astroid.context import ( CallContext, InferenceContext, bind_context_to_node, copy_context, ) from astroid.exceptions import ( AstroidBuildingError, AstroidTypeError, AttributeInferenceError, DuplicateBasesError, InconsistentMroError, InferenceError, MroError, ParentMissingError, StatementMissing, TooManyLevelsError, ) from astroid.interpreter.dunder_lookup import lookup from astroid.interpreter.objectmodel import ClassModel, FunctionModel, ModuleModel from astroid.manager import AstroidManager from astroid.nodes import ( Arguments, Const, NodeNG, Unknown, _base_nodes, const_factory, node_classes, ) from astroid.nodes.scoped_nodes.mixin import ComprehensionScope, LocalsDictNodeNG from astroid.nodes.scoped_nodes.utils import builtin_lookup from astroid.nodes.utils import Position from astroid.typing import ( InferBinaryOp, InferenceErrorInfo, InferenceResult, SuccessfulInferenceResult, ) if TYPE_CHECKING: from astroid import nodes, objects from astroid.nodes._base_nodes import LookupMixIn ITER_METHODS = ("__iter__", "__getitem__") EXCEPTION_BASE_CLASSES = frozenset({"Exception", "BaseException"}) BUILTIN_DESCRIPTORS = frozenset( {"classmethod", "staticmethod", "builtins.classmethod", "builtins.staticmethod"} ) _T = TypeVar("_T") def _c3_merge(sequences, cls, context): """Merges MROs in *sequences* to a single MRO using the C3 algorithm. Adapted from http://www.python.org/download/releases/2.3/mro/. 
""" result = [] while True: sequences = [s for s in sequences if s] # purge empty sequences if not sequences: return result for s1 in sequences: # find merge candidates among seq heads candidate = s1[0] for s2 in sequences: if candidate in s2[1:]: candidate = None break # reject the current head, it appears later else: break if not candidate: # Show all the remaining bases, which were considered as # candidates for the next mro sequence. raise InconsistentMroError( message="Cannot create a consistent method resolution order " "for MROs {mros} of class {cls!r}.", mros=sequences, cls=cls, context=context, ) result.append(candidate) # remove the chosen candidate for seq in sequences: if seq[0] == candidate: del seq[0] return None def clean_typing_generic_mro(sequences: list[list[ClassDef]]) -> None: """A class can inherit from typing.Generic directly, as base, and as base of bases. The merged MRO must however only contain the last entry. To prepare for _c3_merge, remove some typing.Generic entries from sequences if multiple are present. This method will check if Generic is in inferred_bases and also part of bases_mro. If true, remove it from inferred_bases as well as its entry the bases_mro. 
Format sequences: [[self]] + bases_mro + [inferred_bases] """ bases_mro = sequences[1:-1] inferred_bases = sequences[-1] # Check if Generic is part of inferred_bases for i, base in enumerate(inferred_bases): if base.qname() == "typing.Generic": position_in_inferred_bases = i break else: return # Check if also part of bases_mro # Ignore entry for typing.Generic for i, seq in enumerate(bases_mro): if i == position_in_inferred_bases: continue if any(base.qname() == "typing.Generic" for base in seq): break else: return # Found multiple Generics in mro, remove entry from inferred_bases # and the corresponding one from bases_mro inferred_bases.pop(position_in_inferred_bases) bases_mro.pop(position_in_inferred_bases) def clean_duplicates_mro( sequences: list[list[ClassDef]], cls: ClassDef, context: InferenceContext | None, ) -> list[list[ClassDef]]: for sequence in sequences: seen = set() for node in sequence: lineno_and_qname = (node.lineno, node.qname()) if lineno_and_qname in seen: raise DuplicateBasesError( message="Duplicates found in MROs {mros} for {cls!r}.", mros=sequences, cls=cls, context=context, ) seen.add(lineno_and_qname) return sequences def function_to_method(n, klass): if isinstance(n, FunctionDef): if n.type == "classmethod": return bases.BoundMethod(n, klass) if n.type == "property": return n if n.type != "staticmethod": return bases.UnboundMethod(n) return n class Module(LocalsDictNodeNG): """Class representing an :class:`ast.Module` node. >>> import astroid >>> node = astroid.extract_node('import astroid') >>> node >>> node.parent """ _astroid_fields = ("doc_node", "body") doc_node: Const | None """The doc node associated with this node.""" # attributes below are set by the builder module or by raw factories file_bytes: str | bytes | None = None """The string/bytes that this ast was built from.""" file_encoding: str | None = None """The encoding of the source file. This is used to get unicode out of a source file. Python 2 only. 
""" special_attributes = ModuleModel() """The names of special attributes that this module has.""" # names of module attributes available through the global scope scope_attrs: ClassVar[set[str]] = { "__name__", "__doc__", "__file__", "__path__", "__package__", } """The names of module attributes available through the global scope.""" _other_fields = ( "name", "file", "path", "package", "pure_python", "future_imports", ) _other_other_fields = ("locals", "globals") def __init__( self, name: str, file: str | None = None, path: Sequence[str] | None = None, package: bool = False, pure_python: bool = True, ) -> None: self.name = name """The name of the module.""" self.file = file """The path to the file that this ast has been extracted from. This will be ``None`` when the representation has been built from a built-in module. """ self.path = path self.package = package """Whether the node represents a package or a module.""" self.pure_python = pure_python """Whether the ast was built from source.""" self.globals: dict[str, list[InferenceResult]] """A map of the name of a global variable to the node defining the global.""" self.locals = self.globals = {} """A map of the name of a local variable to the node defining the local.""" self.body: list[node_classes.NodeNG] = [] """The contents of the module.""" self.future_imports: set[str] = set() """The imports from ``__future__``.""" super().__init__( lineno=0, parent=None, col_offset=0, end_lineno=None, end_col_offset=None ) # pylint: enable=redefined-builtin def postinit( self, body: list[node_classes.NodeNG], *, doc_node: Const | None = None ): self.body = body self.doc_node = doc_node def _get_stream(self): if self.file_bytes is not None: return io.BytesIO(self.file_bytes) if self.file is not None: # pylint: disable=consider-using-with stream = open(self.file, "rb") return stream return None def stream(self): """Get a stream to the underlying file or bytes. 
:type: file or io.BytesIO or None """ return self._get_stream() def block_range(self, lineno: int) -> tuple[int, int]: """Get a range from where this node starts to where this node ends. :param lineno: Unused. :returns: The range of line numbers that this node belongs to. """ return self.fromlineno, self.tolineno def scope_lookup( self, node: LookupMixIn, name: str, offset: int = 0 ) -> tuple[LocalsDictNodeNG, list[node_classes.NodeNG]]: """Lookup where the given variable is assigned. :param node: The node to look for assignments up to. Any assignments after the given node are ignored. :param name: The name of the variable to find assignments for. :param offset: The line offset to filter statements up to. :returns: This scope node and the list of assignments associated to the given name according to the scope where it has been found (locals, globals or builtin). """ if name in self.scope_attrs and name not in self.locals: try: return self, self.getattr(name) except AttributeInferenceError: return self, [] return self._scope_lookup(node, name, offset) def pytype(self) -> Literal["builtins.module"]: """Get the name of the type that this node represents. :returns: The name of the type. """ return "builtins.module" def display_type(self) -> str: """A human readable type of this node. :returns: The type of this node. 
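`Module.scope_lookup` above falls back to `scope_attrs` (`__name__`, `__doc__`, `__file__`, `__path__`, `__package__`) when the name was never assigned in the source. That mirrors runtime behaviour, where every module object carries these attributes implicitly:

```python
import types

# A fresh module object, with nothing assigned in its "source":
mod = types.ModuleType("example")
print(mod.__name__)  # example
print(mod.__doc__)   # None -- the attribute exists even though nothing set it
```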
:rtype: str """ return "Module" def getattr( self, name, context: InferenceContext | None = None, ignore_locals=False ): if not name: raise AttributeInferenceError(target=self, attribute=name, context=context) result = [] name_in_locals = name in self.locals if name in self.special_attributes and not ignore_locals and not name_in_locals: result = [self.special_attributes.lookup(name)] if name == "__name__": result.append(const_factory("__main__")) elif not ignore_locals and name_in_locals: result = self.locals[name] elif self.package: try: result = [self.import_module(name, relative_only=True)] except (AstroidBuildingError, SyntaxError) as exc: raise AttributeInferenceError( target=self, attribute=name, context=context ) from exc result = [n for n in result if not isinstance(n, node_classes.DelName)] if result: return result raise AttributeInferenceError(target=self, attribute=name, context=context) def igetattr( self, name: str, context: InferenceContext | None = None ) -> Iterator[InferenceResult]: """Infer the possible values of the given variable. :param name: The name of the variable to infer. :returns: The inferred possible values. """ # set lookup name since this is necessary to infer on import nodes for # instance context = copy_context(context) context.lookupname = name try: return bases._infer_stmts(self.getattr(name, context), context, frame=self) except AttributeInferenceError as error: raise InferenceError( str(error), target=self, attribute=name, context=context ) from error def fully_defined(self) -> bool: """Check if this module has been build from a .py file. If so, the module contains a complete representation, including the code. :returns: Whether the module has been built from a .py file. """ return self.file is not None and self.file.endswith(".py") def statement(self, *, future: Literal[None, True] = None) -> NoReturn: """The first parent node, including self, marked as statement node. 
When called on a :class:`Module` this raises a StatementMissing. """ if future is not None: warnings.warn( "The future arg will be removed in astroid 4.0.", DeprecationWarning, stacklevel=2, ) raise StatementMissing(target=self) def previous_sibling(self): """The previous sibling statement. :returns: The previous sibling statement node. :rtype: NodeNG or None """ def next_sibling(self): """The next sibling statement node. :returns: The next sibling statement node. :rtype: NodeNG or None """ _absolute_import_activated = True def absolute_import_activated(self) -> bool: """Whether :pep:`328` absolute import behaviour has been enabled. :returns: Whether :pep:`328` has been enabled. """ return self._absolute_import_activated def import_module( self, modname: str, relative_only: bool = False, level: int | None = None, use_cache: bool = True, ) -> Module: """Get the ast for a given module as if imported from this module. :param modname: The name of the module to "import". :param relative_only: Whether to only consider relative imports. :param level: The level of relative import. :param use_cache: Whether to use the astroid_cache of modules. :returns: The imported module ast. """ if relative_only and level is None: level = 0 absmodname = self.relative_to_absolute_name(modname, level) try: return AstroidManager().ast_from_module_name( absmodname, use_cache=use_cache ) except AstroidBuildingError: # we only want to import a sub module or package of this module, # skip here if relative_only: raise # Don't repeat the same operation, e.g. for missing modules # like "_winapi" or "nt" on POSIX systems. if modname == absmodname: raise return AstroidManager().ast_from_module_name(modname, use_cache=use_cache) def relative_to_absolute_name(self, modname: str, level: int | None) -> str: """Get the absolute module name for a relative import. The relative import can be implicit or explicit. :param modname: The module name to convert. :param level: The level of relative import. 
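The core of what `relative_to_absolute_name` computes can be sketched with a simplified helper (`relative_to_absolute` below is hypothetical and skips astroid's namespace-package special case): `level` leading dots strip that many trailing components from the importing module's dotted name.

```python
def relative_to_absolute(importer, modname, level, is_package=False):
    """Resolve 'from <level dots><modname> import ...' seen in *importer*."""
    if is_package:
        level -= 1  # a package's __init__ counts as its own first level
    if importer.count(".") < level:
        raise ValueError("too many levels of relative import")
    package = importer.rsplit(".", level)[0] if level else importer
    return f"{package}.{modname}" if modname else package


# 'from . import utils' inside mypkg/sub/mod.py:
print(relative_to_absolute("mypkg.sub.mod", "utils", 1))  # mypkg.sub.utils
# 'from ..core import x' inside mypkg/sub/mod.py:
print(relative_to_absolute("mypkg.sub.mod", "core", 2))   # mypkg.core

# The stdlib performs the same resolution; its package argument is the
# importing module's __package__, hence 'mypkg.sub' here:
from importlib.util import resolve_name

print(resolve_name("..core", "mypkg.sub"))  # mypkg.core
```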
:returns: The absolute module name. :raises TooManyLevelsError: When the relative import refers to a module too far above this one. """ # XXX this returns non sens when called on an absolute import # like 'pylint.checkers.astroid.utils' # XXX doesn't return absolute name if self.name isn't absolute name if self.absolute_import_activated() and level is None: return modname if level: if self.package: level = level - 1 package_name = self.name.rsplit(".", level)[0] elif ( self.path and not os.path.exists(os.path.dirname(self.path[0]) + "/__init__.py") and os.path.exists( os.path.dirname(self.path[0]) + "/" + modname.split(".")[0] ) ): level = level - 1 package_name = "" else: package_name = self.name.rsplit(".", level)[0] if level and self.name.count(".") < level: raise TooManyLevelsError(level=level, name=self.name) elif self.package: package_name = self.name else: package_name = self.name.rsplit(".", 1)[0] if package_name: if not modname: return package_name return f"{package_name}.{modname}" return modname def wildcard_import_names(self): """The list of imported names when this module is 'wildcard imported'. It doesn't include the '__builtins__' name which is added by the current CPython implementation of wildcard imports. :returns: The list of imported names. :rtype: list(str) """ # We separate the different steps of lookup in try/excepts # to avoid catching too many Exceptions default = [name for name in self.keys() if not name.startswith("_")] try: all_values = self["__all__"] except KeyError: return default try: explicit = next(all_values.assigned_stmts()) except (InferenceError, StopIteration): return default except AttributeError: # not an assignment node # XXX infer? return default # Try our best to detect the exported name. 
inferred = [] try: explicit = next(explicit.infer()) except (InferenceError, StopIteration): return default if not isinstance(explicit, (node_classes.Tuple, node_classes.List)): return default def str_const(node) -> bool: return isinstance(node, node_classes.Const) and isinstance(node.value, str) for node in explicit.elts: if str_const(node): inferred.append(node.value) else: try: inferred_node = next(node.infer()) except (InferenceError, StopIteration): continue if str_const(inferred_node): inferred.append(inferred_node.value) return inferred def public_names(self): """The list of the names that are publicly available in this module. :returns: The list of public names. :rtype: list(str) """ return [name for name in self.keys() if not name.startswith("_")] def bool_value(self, context: InferenceContext | None = None) -> bool: """Determine the boolean value of this node. :returns: The boolean value of this node. For a :class:`Module` this is always ``True``. """ return True def get_children(self): yield from self.body def frame(self: _T, *, future: Literal[None, True] = None) -> _T: """The node's frame node. A frame node is a :class:`Module`, :class:`FunctionDef`, :class:`ClassDef` or :class:`Lambda`. :returns: The node itself. """ return self def _infer( self, context: InferenceContext | None = None, **kwargs: Any ) -> Generator[Module, None, None]: yield self class GeneratorExp(ComprehensionScope): """Class representing an :class:`ast.GeneratorExp` node. 
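The `wildcard_import_names` logic above (prefer an explicit, statically resolvable `__all__`, else all public top-level names) has a rough stdlib-only analogue using the `ast` module. `wildcard_names` is a hypothetical sketch; astroid additionally infers `__all__` values that are built dynamically:

```python
import ast


def wildcard_names(source):
    """Names exported by 'from mod import *': explicit __all__ if present,
    otherwise all public (non-underscore) top-level names."""
    tree = ast.parse(source)
    names, explicit = [], None
    for node in tree.body:
        if isinstance(node, (ast.FunctionDef, ast.ClassDef)):
            names.append(node.name)
        elif isinstance(node, ast.Assign):
            for target in node.targets:
                if not isinstance(target, ast.Name):
                    continue
                names.append(target.id)
                if target.id == "__all__" and isinstance(
                    node.value, (ast.List, ast.Tuple)
                ):
                    # keep only string constants, like str_const above
                    explicit = [
                        elt.value
                        for elt in node.value.elts
                        if isinstance(elt, ast.Constant) and isinstance(elt.value, str)
                    ]
    if explicit is not None:
        return explicit
    return [n for n in names if not n.startswith("_")]


print(wildcard_names("def helper(): pass\n_private = 1\n__all__ = ['helper']\n"))
# ['helper']
print(wildcard_names("def f(): pass\n_x = 1\n"))
# ['f']
```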
>>> import astroid >>> node = astroid.extract_node('(thing for thing in things if thing)') >>> node """ _astroid_fields = ("elt", "generators") _other_other_fields = ("locals",) elt: NodeNG """The element that forms the output of the expression.""" def __init__( self, lineno: int, col_offset: int, parent: NodeNG, *, end_lineno: int | None, end_col_offset: int | None, ) -> None: self.locals = {} """A map of the name of a local variable to the node defining the local.""" self.generators: list[nodes.Comprehension] = [] """The generators that are looped through.""" super().__init__( lineno=lineno, col_offset=col_offset, end_lineno=end_lineno, end_col_offset=end_col_offset, parent=parent, ) def postinit(self, elt: NodeNG, generators: list[nodes.Comprehension]) -> None: self.elt = elt self.generators = generators def bool_value(self, context: InferenceContext | None = None) -> Literal[True]: """Determine the boolean value of this node. :returns: The boolean value of this node. For a :class:`GeneratorExp` this is always ``True``. """ return True def get_children(self): yield self.elt yield from self.generators class DictComp(ComprehensionScope): """Class representing an :class:`ast.DictComp` node. 
>>> import astroid >>> node = astroid.extract_node('{k:v for k, v in things if k > v}') >>> node """ _astroid_fields = ("key", "value", "generators") _other_other_fields = ("locals",) key: NodeNG """What produces the keys.""" value: NodeNG """What produces the values.""" def __init__( self, lineno: int, col_offset: int, parent: NodeNG, *, end_lineno: int | None, end_col_offset: int | None, ) -> None: self.locals = {} """A map of the name of a local variable to the node defining the local.""" super().__init__( lineno=lineno, col_offset=col_offset, end_lineno=end_lineno, end_col_offset=end_col_offset, parent=parent, ) def postinit( self, key: NodeNG, value: NodeNG, generators: list[nodes.Comprehension] ) -> None: self.key = key self.value = value self.generators = generators def bool_value(self, context: InferenceContext | None = None): """Determine the boolean value of this node. :returns: The boolean value of this node. For a :class:`DictComp` this is always :class:`Uninferable`. :rtype: Uninferable """ return util.Uninferable def get_children(self): yield self.key yield self.value yield from self.generators class SetComp(ComprehensionScope): """Class representing an :class:`ast.SetComp` node. 
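The contrast between `GeneratorExp.bool_value` (always `True`) and `DictComp.bool_value` (`Uninferable`) is consistent with runtime behaviour: a generator object is truthy even when it will yield nothing, while the truthiness of a comprehension's result depends on its contents:

```python
empty_gen = (x for x in [])
print(bool(empty_gen))            # True -- generator objects are always truthy
print(bool({k: k for k in []}))   # False -- the resulting empty dict is falsy
print(bool({k: k for k in [1]}))  # True
```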
>>> import astroid >>> node = astroid.extract_node('{thing for thing in things if thing}') >>> node """ _astroid_fields = ("elt", "generators") _other_other_fields = ("locals",) elt: NodeNG """The element that forms the output of the expression.""" def __init__( self, lineno: int, col_offset: int, parent: NodeNG, *, end_lineno: int | None, end_col_offset: int | None, ) -> None: self.locals = {} """A map of the name of a local variable to the node defining the local.""" self.generators: list[nodes.Comprehension] = [] """The generators that are looped through.""" super().__init__( lineno=lineno, col_offset=col_offset, end_lineno=end_lineno, end_col_offset=end_col_offset, parent=parent, ) def postinit(self, elt: NodeNG, generators: list[nodes.Comprehension]) -> None: self.elt = elt self.generators = generators def bool_value(self, context: InferenceContext | None = None): """Determine the boolean value of this node. :returns: The boolean value of this node. For a :class:`SetComp` this is always :class:`Uninferable`. :rtype: Uninferable """ return util.Uninferable def get_children(self): yield self.elt yield from self.generators class ListComp(ComprehensionScope): """Class representing an :class:`ast.ListComp` node. 
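Each comprehension node above owns a `locals` dict because, since Python 3, comprehensions run in their own scope (the behaviour `ComprehensionScope` models): the loop variable does not leak into the enclosing namespace.

```python
thing = "outer"
squares = [thing * 2 for thing in range(3)]
print(squares)  # [0, 2, 4]
print(thing)    # outer -- the comprehension's loop variable did not leak
```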
>>> import astroid >>> node = astroid.extract_node('[thing for thing in things if thing]') >>> node """ _astroid_fields = ("elt", "generators") _other_other_fields = ("locals",) elt: NodeNG """The element that forms the output of the expression.""" def __init__( self, lineno: int, col_offset: int, parent: NodeNG, *, end_lineno: int | None, end_col_offset: int | None, ) -> None: self.locals = {} """A map of the name of a local variable to the node defining it.""" self.generators: list[nodes.Comprehension] = [] """The generators that are looped through.""" super().__init__( lineno=lineno, col_offset=col_offset, end_lineno=end_lineno, end_col_offset=end_col_offset, parent=parent, ) def postinit(self, elt: NodeNG, generators: list[nodes.Comprehension]): self.elt = elt self.generators = generators def bool_value(self, context: InferenceContext | None = None): """Determine the boolean value of this node. :returns: The boolean value of this node. For a :class:`ListComp` this is always :class:`Uninferable`. :rtype: Uninferable """ return util.Uninferable def get_children(self): yield self.elt yield from self.generators def _infer_decorator_callchain(node): """Detect decorator call chaining and see if the end result is a static or a classmethod. """ if not isinstance(node, FunctionDef): return None if not node.parent: return None try: result = next(node.infer_call_result(node.parent), None) except InferenceError: return None if isinstance(result, bases.Instance): result = result._proxied if isinstance(result, ClassDef): if result.is_subtype_of("builtins.classmethod"): return "classmethod" if result.is_subtype_of("builtins.staticmethod"): return "staticmethod" if isinstance(result, FunctionDef): if not result.decorators: return None # Determine if this function is decorated with one of the builtin descriptors we want. 
    for decorator in result.decorators.nodes:
        if isinstance(decorator, node_classes.Name):
            if decorator.name in BUILTIN_DESCRIPTORS:
                return decorator.name
        if (
            isinstance(decorator, node_classes.Attribute)
            and isinstance(decorator.expr, node_classes.Name)
            and decorator.expr.name == "builtins"
            and decorator.attrname in BUILTIN_DESCRIPTORS
        ):
            return decorator.attrname
    return None


class Lambda(_base_nodes.FilterStmtsBaseNode, LocalsDictNodeNG):
    """Class representing an :class:`ast.Lambda` node.

    >>> import astroid
    >>> node = astroid.extract_node('lambda arg: arg + 1')
    >>> node
    <Lambda.<lambda> l.1 at 0x7f23b2e41518>
    """

    _astroid_fields: ClassVar[tuple[str, ...]] = ("args", "body")
    _other_other_fields: ClassVar[tuple[str, ...]] = ("locals",)
    name = "<lambda>"
    is_lambda = True

    special_attributes = FunctionModel()
    """The names of special attributes that this function has."""

    args: Arguments
    """The arguments that the function takes."""

    body: NodeNG
    """The contents of the function body."""

    def implicit_parameters(self) -> Literal[0]:
        return 0

    @property
    def type(self) -> Literal["method", "function"]:
        """Whether this is a method or function.

        :returns: 'method' if this is a method, 'function' otherwise.
        """
        if self.args.arguments and self.args.arguments[0].name == "self":
            if self.parent and isinstance(self.parent.scope(), ClassDef):
                return "method"
        return "function"

    def __init__(
        self,
        lineno: int,
        col_offset: int,
        parent: NodeNG,
        *,
        end_lineno: int | None,
        end_col_offset: int | None,
    ):
        self.locals = {}
        """A map of the name of a local variable to the node defining it."""

        self.instance_attrs: dict[str, list[NodeNG]] = {}

        super().__init__(
            lineno=lineno,
            col_offset=col_offset,
            end_lineno=end_lineno,
            end_col_offset=end_col_offset,
            parent=parent,
        )

    def postinit(self, args: Arguments, body: NodeNG) -> None:
        self.args = args
        self.body = body

    def pytype(self) -> Literal["builtins.instancemethod", "builtins.function"]:
        """Get the name of the type that this node represents.

        :returns: The name of the type.
""" if "method" in self.type: return "builtins.instancemethod" return "builtins.function" def display_type(self) -> str: """A human readable type of this node. :returns: The type of this node. :rtype: str """ if "method" in self.type: return "Method" return "Function" def callable(self) -> Literal[True]: """Whether this node defines something that is callable. :returns: Whether this defines something that is callable For a :class:`Lambda` this is always ``True``. """ return True def argnames(self) -> list[str]: """Get the names of each of the arguments, including that of the collections of variable-length arguments ("args", "kwargs", etc.), as well as positional-only and keyword-only arguments. :returns: The names of the arguments. :rtype: list(str) """ if self.args.arguments: # maybe None with builtin functions names = [elt.name for elt in self.args.arguments] else: names = [] return names def infer_call_result( self, caller: SuccessfulInferenceResult | None, context: InferenceContext | None = None, ) -> Iterator[InferenceResult]: """Infer what the function returns when called.""" return self.body.infer(context) def scope_lookup( self, node: LookupMixIn, name: str, offset: int = 0 ) -> tuple[LocalsDictNodeNG, list[NodeNG]]: """Lookup where the given names is assigned. :param node: The node to look for assignments up to. Any assignments after the given node are ignored. :param name: The name to find assignments for. :param offset: The line offset to filter statements up to. :returns: This scope node and the list of assignments associated to the given name according to the scope where it has been found (locals, globals or builtin). 
""" if (self.args.defaults and node in self.args.defaults) or ( self.args.kw_defaults and node in self.args.kw_defaults ): if not self.parent: raise ParentMissingError(target=self) frame = self.parent.frame() # line offset to avoid that def func(f=func) resolve the default # value to the defined function offset = -1 else: # check this is not used in function decorators frame = self return frame._scope_lookup(node, name, offset) def bool_value(self, context: InferenceContext | None = None) -> Literal[True]: """Determine the boolean value of this node. :returns: The boolean value of this node. For a :class:`Lambda` this is always ``True``. """ return True def get_children(self): yield self.args yield self.body def frame(self: _T, *, future: Literal[None, True] = None) -> _T: """The node's frame node. A frame node is a :class:`Module`, :class:`FunctionDef`, :class:`ClassDef` or :class:`Lambda`. :returns: The node itself. """ return self def getattr( self, name: str, context: InferenceContext | None = None ) -> list[NodeNG]: if not name: raise AttributeInferenceError(target=self, attribute=name, context=context) found_attrs = [] if name in self.instance_attrs: found_attrs = self.instance_attrs[name] if name in self.special_attributes: found_attrs.append(self.special_attributes.lookup(name)) if found_attrs: return found_attrs raise AttributeInferenceError(target=self, attribute=name) def _infer( self, context: InferenceContext | None = None, **kwargs: Any ) -> Generator[Lambda, None, None]: yield self def _get_yield_nodes_skip_functions(self): """A Lambda node can contain a Yield node in the body.""" yield from self.body._get_yield_nodes_skip_functions() class FunctionDef( _base_nodes.MultiLineBlockNode, _base_nodes.FilterStmtsBaseNode, _base_nodes.Statement, LocalsDictNodeNG, ): """Class representing an :class:`ast.FunctionDef`. >>> import astroid >>> node = astroid.extract_node(''' ... def my_func(arg): ... return arg + 1 ... 
''') >>> node """ _astroid_fields = ( "decorators", "args", "returns", "type_params", "doc_node", "body", ) _multi_line_block_fields = ("body",) returns = None decorators: node_classes.Decorators | None """The decorators that are applied to this method or function.""" doc_node: Const | None """The doc node associated with this node.""" args: Arguments """The arguments that the function takes.""" is_function = True """Whether this node indicates a function. For a :class:`FunctionDef` this is always ``True``. :type: bool """ type_annotation = None """If present, this will contain the type annotation passed by a type comment :type: NodeNG or None """ type_comment_args = None """ If present, this will contain the type annotation for arguments passed by a type comment """ type_comment_returns = None """If present, this will contain the return type annotation, passed by a type comment""" # attributes below are set by the builder module or by raw factories _other_fields = ("name", "position") _other_other_fields = ( "locals", "_type", "type_comment_returns", "type_comment_args", ) _type = None name = "" special_attributes = FunctionModel() """The names of special attributes that this function has.""" def __init__( self, name: str, lineno: int, col_offset: int, parent: NodeNG, *, end_lineno: int | None, end_col_offset: int | None, ) -> None: self.name = name """The name of the function.""" self.locals = {} """A map of the name of a local variable to the node defining it.""" self.body: list[NodeNG] = [] """The contents of the function body.""" self.type_params: list[nodes.TypeVar | nodes.ParamSpec | nodes.TypeVarTuple] = ( [] ) """PEP 695 (Python 3.12+) type params, e.g. 
first 'T' in def func[T]() -> T: ...""" self.instance_attrs: dict[str, list[NodeNG]] = {} super().__init__( lineno=lineno, col_offset=col_offset, end_lineno=end_lineno, end_col_offset=end_col_offset, parent=parent, ) if parent and not isinstance(parent, Unknown): frame = parent.frame() frame.set_local(name, self) def postinit( self, args: Arguments, body: list[NodeNG], decorators: node_classes.Decorators | None = None, returns=None, type_comment_returns=None, type_comment_args=None, *, position: Position | None = None, doc_node: Const | None = None, type_params: ( list[nodes.TypeVar | nodes.ParamSpec | nodes.TypeVarTuple] | None ) = None, ): """Do some setup after initialisation. :param args: The arguments that the function takes. :param body: The contents of the function body. :param decorators: The decorators that are applied to this method or function. :params type_comment_returns: The return type annotation passed via a type comment. :params type_comment_args: The args type annotation passed via a type comment. :params position: Position of function keyword(s) and name. :param doc_node: The doc node associated with this node. :param type_params: The type_params associated with this node. """ self.args = args self.body = body self.decorators = decorators self.returns = returns self.type_comment_returns = type_comment_returns self.type_comment_args = type_comment_args self.position = position self.doc_node = doc_node self.type_params = type_params or [] @cached_property def extra_decorators(self) -> list[node_classes.Call]: """The extra decorators that this function can have. Additional decorators are considered when they are used as assignments, as in ``method = staticmethod(method)``. The property will return all the callables that are used for decoration. 
""" if not self.parent or not isinstance(frame := self.parent.frame(), ClassDef): return [] decorators: list[node_classes.Call] = [] for assign in frame._assign_nodes_in_scope: if isinstance(assign.value, node_classes.Call) and isinstance( assign.value.func, node_classes.Name ): for assign_node in assign.targets: if not isinstance(assign_node, node_classes.AssignName): # Support only `name = callable(name)` continue if assign_node.name != self.name: # Interested only in the assignment nodes that # decorates the current method. continue try: meth = frame[self.name] except KeyError: continue else: # Must be a function and in the same frame as the # original method. if ( isinstance(meth, FunctionDef) and assign_node.frame() == frame ): decorators.append(assign.value) return decorators def pytype(self) -> Literal["builtins.instancemethod", "builtins.function"]: """Get the name of the type that this node represents. :returns: The name of the type. """ if "method" in self.type: return "builtins.instancemethod" return "builtins.function" def display_type(self) -> str: """A human readable type of this node. :returns: The type of this node. :rtype: str """ if "method" in self.type: return "Method" return "Function" def callable(self) -> Literal[True]: return True def argnames(self) -> list[str]: """Get the names of each of the arguments, including that of the collections of variable-length arguments ("args", "kwargs", etc.), as well as positional-only and keyword-only arguments. :returns: The names of the arguments. 
:rtype: list(str) """ if self.args.arguments: # maybe None with builtin functions names = [elt.name for elt in self.args.arguments] else: names = [] return names def getattr( self, name: str, context: InferenceContext | None = None ) -> list[NodeNG]: if not name: raise AttributeInferenceError(target=self, attribute=name, context=context) found_attrs = [] if name in self.instance_attrs: found_attrs = self.instance_attrs[name] if name in self.special_attributes: found_attrs.append(self.special_attributes.lookup(name)) if found_attrs: return found_attrs raise AttributeInferenceError(target=self, attribute=name) @cached_property def type(self) -> str: # pylint: disable=too-many-return-statements # noqa: C901 """The function type for this node. Possible values are: method, function, staticmethod, classmethod. """ for decorator in self.extra_decorators: if decorator.func.name in BUILTIN_DESCRIPTORS: return decorator.func.name if not self.parent: raise ParentMissingError(target=self) frame = self.parent.frame() type_name = "function" if isinstance(frame, ClassDef): if self.name == "__new__": return "classmethod" if self.name == "__init_subclass__": return "classmethod" if self.name == "__class_getitem__": return "classmethod" type_name = "method" if not self.decorators: return type_name for node in self.decorators.nodes: if isinstance(node, node_classes.Name): if node.name in BUILTIN_DESCRIPTORS: return node.name if ( isinstance(node, node_classes.Attribute) and isinstance(node.expr, node_classes.Name) and node.expr.name == "builtins" and node.attrname in BUILTIN_DESCRIPTORS ): return node.attrname if isinstance(node, node_classes.Call): # Handle the following case: # @some_decorator(arg1, arg2) # def func(...) # try: current = next(node.func.infer()) except (InferenceError, StopIteration): continue _type = _infer_decorator_callchain(current) if _type is not None: return _type try: for inferred in node.infer(): # Check to see if this returns a static or a class method. 
_type = _infer_decorator_callchain(inferred) if _type is not None: return _type if not isinstance(inferred, ClassDef): continue for ancestor in inferred.ancestors(): if not isinstance(ancestor, ClassDef): continue if ancestor.is_subtype_of("builtins.classmethod"): return "classmethod" if ancestor.is_subtype_of("builtins.staticmethod"): return "staticmethod" except InferenceError: pass return type_name @cached_property def fromlineno(self) -> int: """The first line that this node appears on in the source code. Can also return 0 if the line can not be determined. """ # lineno is the line number of the first decorator, we want the def # statement lineno. Similar to 'ClassDef.fromlineno' lineno = self.lineno or 0 if self.decorators is not None: lineno += sum( node.tolineno - (node.lineno or 0) + 1 for node in self.decorators.nodes ) return lineno or 0 @cached_property def blockstart_tolineno(self): """The line on which the beginning of this block ends. :type: int """ return self.args.tolineno def implicit_parameters(self) -> Literal[0, 1]: return 1 if self.is_bound() else 0 def block_range(self, lineno: int) -> tuple[int, int]: """Get a range from the given line number to where this node ends. :param lineno: Unused. :returns: The range of line numbers that this node belongs to, """ return self.fromlineno, self.tolineno def igetattr( self, name: str, context: InferenceContext | None = None ) -> Iterator[InferenceResult]: """Inferred getattr, which returns an iterator of inferred statements.""" try: return bases._infer_stmts(self.getattr(name, context), context, frame=self) except AttributeInferenceError as error: raise InferenceError( str(error), target=self, attribute=name, context=context ) from error def is_method(self) -> bool: """Check if this function node represents a method. :returns: Whether this is a method. """ # check we are defined in a ClassDef, because this is usually expected # (e.g. pylint...) 
when is_method() returns True
        return (
            self.type != "function"
            and self.parent is not None
            and isinstance(self.parent.frame(), ClassDef)
        )

    def decoratornames(self, context: InferenceContext | None = None) -> set[str]:
        """Get the qualified names of each of the decorators on this function.

        :param context:
            An inference context that can be passed to inference functions
        :returns: The names of the decorators.
        """
        result = set()
        decoratornodes = []
        if self.decorators is not None:
            decoratornodes += self.decorators.nodes
        decoratornodes += self.extra_decorators
        for decnode in decoratornodes:
            try:
                for infnode in decnode.infer(context=context):
                    result.add(infnode.qname())
            except InferenceError:
                continue
        return result

    def is_bound(self) -> bool:
        """Check if the function is bound to an instance or class.

        :returns: Whether the function is bound to an instance or class.
        """
        return self.type in {"method", "classmethod"}

    def is_abstract(self, pass_is_abstract=True, any_raise_is_abstract=False) -> bool:
        """Check if the method is abstract.

        A method is considered abstract if any of the following is true:
        * The only statement is 'raise NotImplementedError'
        * The only statement is 'raise <any exception>' and any_raise_is_abstract is True
        * The only statement is 'pass' and pass_is_abstract is True
        * The method is annotated with abc.abstractproperty/abc.abstractmethod

        :returns: Whether the method is abstract.
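
        For example (doctest-style, illustrative)::

            >>> import astroid
            >>> node = astroid.extract_node('''
            def method(self):
                raise NotImplementedError
            ''')
            >>> node.is_abstract()
            True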
""" if self.decorators: for node in self.decorators.nodes: try: inferred = next(node.infer()) except (InferenceError, StopIteration): continue if inferred and inferred.qname() in { "abc.abstractproperty", "abc.abstractmethod", }: return True for child_node in self.body: if isinstance(child_node, node_classes.Raise): if any_raise_is_abstract: return True if child_node.raises_not_implemented(): return True return pass_is_abstract and isinstance(child_node, node_classes.Pass) # empty function is the same as function with a single "pass" statement if pass_is_abstract: return True return False def is_generator(self) -> bool: """Check if this is a generator function. :returns: Whether this is a generator function. """ yields_without_lambdas = set(self._get_yield_nodes_skip_lambdas()) yields_without_functions = set(self._get_yield_nodes_skip_functions()) # Want an intersecting member that is neither in a lambda nor a function return bool(yields_without_lambdas & yields_without_functions) def _infer( self, context: InferenceContext | None = None, **kwargs: Any ) -> Generator[objects.Property | FunctionDef, None, InferenceErrorInfo]: from astroid import objects # pylint: disable=import-outside-toplevel if not self.decorators or not bases._is_property(self): yield self return InferenceErrorInfo(node=self, context=context) # When inferring a property, we instantiate a new `objects.Property` object, # which in turn, because it inherits from `FunctionDef`, sets itself in the locals # of the wrapping frame. This means that every time we infer a property, the locals # are mutated with a new instance of the property. To avoid this, we detect this # scenario and avoid passing the `parent` argument to the constructor. 
if not self.parent: raise ParentMissingError(target=self) parent_frame = self.parent.frame() property_already_in_parent_locals = self.name in parent_frame.locals and any( isinstance(val, objects.Property) for val in parent_frame.locals[self.name] ) # We also don't want to pass parent if the definition is within a Try node if isinstance( self.parent, (node_classes.Try, node_classes.If), ): property_already_in_parent_locals = True prop_func = objects.Property( function=self, name=self.name, lineno=self.lineno, parent=self.parent if not property_already_in_parent_locals else None, col_offset=self.col_offset, ) if property_already_in_parent_locals: prop_func.parent = self.parent prop_func.postinit(body=[], args=self.args, doc_node=self.doc_node) yield prop_func return InferenceErrorInfo(node=self, context=context) def infer_yield_result(self, context: InferenceContext | None = None): """Infer what the function yields when called :returns: What the function yields :rtype: iterable(NodeNG or Uninferable) or None """ # pylint: disable=not-an-iterable # https://github.com/pylint-dev/astroid/issues/1015 for yield_ in self.nodes_of_class(node_classes.Yield): if yield_.value is None: const = node_classes.Const(None) const.parent = yield_ const.lineno = yield_.lineno yield const elif yield_.scope() == self: yield from yield_.value.infer(context=context) def infer_call_result( self, caller: SuccessfulInferenceResult | None, context: InferenceContext | None = None, ) -> Iterator[InferenceResult]: """Infer what the function returns when called.""" if self.is_generator(): if isinstance(self, AsyncFunctionDef): generator_cls: type[bases.Generator] = bases.AsyncGenerator else: generator_cls = bases.Generator result = generator_cls(self, generator_initial_context=context) yield result return # This is really a gigantic hack to work around metaclass generators # that return transient class-generating functions. 
Pylint's AST structure # cannot handle a base class object that is only used for calling __new__, # but does not contribute to the inheritance structure itself. We inject # a fake class into the hierarchy here for several well-known metaclass # generators, and filter it out later. if ( self.name == "with_metaclass" and caller is not None and self.args.args and len(self.args.args) == 1 and self.args.vararg is not None ): if isinstance(caller.args, Arguments): assert caller.args.args is not None metaclass = next(caller.args.args[0].infer(context), None) elif isinstance(caller.args, list): metaclass = next(caller.args[0].infer(context), None) else: raise TypeError( # pragma: no cover f"caller.args was neither Arguments nor list; got {type(caller.args)}" ) if isinstance(metaclass, ClassDef): try: class_bases = [ # Find the first non-None inferred base value next( b for b in arg.infer( context=context.clone() if context else context ) if not (isinstance(b, Const) and b.value is None) ) for arg in caller.args[1:] ] except StopIteration as e: raise InferenceError(node=caller.args[1:], context=context) from e new_class = ClassDef( name="temporary_class", lineno=0, col_offset=0, end_lineno=0, end_col_offset=0, parent=self, ) new_class.hide = True new_class.postinit( bases=[ base for base in class_bases if not isinstance(base, util.UninferableBase) ], body=[], decorators=None, metaclass=metaclass, ) yield new_class return returns = self._get_return_nodes_skip_functions() first_return = next(returns, None) if not first_return: if self.body: if self.is_abstract(pass_is_abstract=True, any_raise_is_abstract=True): yield util.Uninferable else: yield node_classes.Const(None) return raise InferenceError("The function does not have any return statements") for returnnode in itertools.chain((first_return,), returns): if returnnode.value is None: yield node_classes.Const(None) else: try: yield from returnnode.value.infer(context) except InferenceError: yield util.Uninferable def 
bool_value(self, context: InferenceContext | None = None) -> bool: """Determine the boolean value of this node. :returns: The boolean value of this node. For a :class:`FunctionDef` this is always ``True``. """ return True def get_children(self): if self.decorators is not None: yield self.decorators yield self.args if self.returns is not None: yield self.returns yield from self.type_params yield from self.body def scope_lookup( self, node: LookupMixIn, name: str, offset: int = 0 ) -> tuple[LocalsDictNodeNG, list[nodes.NodeNG]]: """Lookup where the given name is assigned.""" if name == "__class__": # __class__ is an implicit closure reference created by the compiler # if any methods in a class body refer to either __class__ or super. # In our case, we want to be able to look it up in the current scope # when `__class__` is being used. if self.parent and isinstance(frame := self.parent.frame(), ClassDef): return self, [frame] if (self.args.defaults and node in self.args.defaults) or ( self.args.kw_defaults and node in self.args.kw_defaults ): if not self.parent: raise ParentMissingError(target=self) frame = self.parent.frame() # line offset to avoid that def func(f=func) resolve the default # value to the defined function offset = -1 else: # check this is not used in function decorators frame = self return frame._scope_lookup(node, name, offset) def frame(self: _T, *, future: Literal[None, True] = None) -> _T: """The node's frame node. A frame node is a :class:`Module`, :class:`FunctionDef`, :class:`ClassDef` or :class:`Lambda`. :returns: The node itself. """ return self class AsyncFunctionDef(FunctionDef): """Class representing an :class:`ast.FunctionDef` node. A :class:`AsyncFunctionDef` is an asynchronous function created with the `async` keyword. 
>>> import astroid
    >>> node = astroid.extract_node('''
    async def func(things):
        async for thing in things:
            print(thing)
    ''')
    >>> node
    <AsyncFunctionDef.func l.2 at 0x...>
    >>> node.body[0]
    <AsyncFor l.3 at 0x...>
    """


def _is_metaclass(klass, seen=None, context: InferenceContext | None = None) -> bool:
    """Return if the given class can be used as a metaclass."""
    if klass.name == "type":
        return True
    if seen is None:
        seen = set()
    for base in klass.bases:
        try:
            for baseobj in base.infer(context=context):
                baseobj_name = baseobj.qname()
                if baseobj_name in seen:
                    continue
                seen.add(baseobj_name)
                if isinstance(baseobj, bases.Instance):
                    # not abstract
                    return False
                if baseobj is klass:
                    continue
                if not isinstance(baseobj, ClassDef):
                    continue
                if baseobj._type == "metaclass":
                    return True
                if _is_metaclass(baseobj, seen, context=context):
                    return True
        except InferenceError:
            continue
    return False


def _class_type(klass, ancestors=None, context: InferenceContext | None = None):
    """Return a ClassDef node type, to differentiate metaclasses and
    exceptions from 'regular' classes.
    """
    # XXX we have to store ancestors in case we have an ancestor loop
    if klass._type is not None:
        return klass._type
    if _is_metaclass(klass, context=context):
        klass._type = "metaclass"
    elif klass.name.endswith("Exception"):
        klass._type = "exception"
    else:
        if ancestors is None:
            ancestors = set()
        klass_name = klass.qname()
        if klass_name in ancestors:
            # XXX we are in loop ancestors, and have found no type
            klass._type = "class"
            return "class"
        ancestors.add(klass_name)
        for base in klass.ancestors(recurs=False):
            name = _class_type(base, ancestors)
            if name != "class":
                if name == "metaclass" and not _is_metaclass(klass):
                    # don't propagate it if the current class
                    # can't be a metaclass
                    continue
                klass._type = base.type
                break
    if klass._type is None:
        klass._type = "class"
    return klass._type


def get_wrapping_class(node):
    """Get the class that wraps the given node.

    We consider that a class wraps a node if the class
    is a parent for the said node.
:returns: The class that wraps the given node :rtype: ClassDef or None """ klass = node.frame() while klass is not None and not isinstance(klass, ClassDef): if klass.parent is None: klass = None else: klass = klass.parent.frame() return klass class ClassDef( # pylint: disable=too-many-instance-attributes _base_nodes.FilterStmtsBaseNode, LocalsDictNodeNG, _base_nodes.Statement ): """Class representing an :class:`ast.ClassDef` node. >>> import astroid >>> node = astroid.extract_node(''' class Thing: def my_meth(self, arg): return arg + self.offset ''') >>> node """ # some of the attributes below are set by the builder module or # by a raw factories # a dictionary of class instances attributes _astroid_fields = ( "decorators", "bases", "keywords", "doc_node", "body", "type_params", ) # name decorators = None """The decorators that are applied to this class. :type: Decorators or None """ special_attributes = ClassModel() """The names of special attributes that this class has. :type: objectmodel.ClassModel """ _type = None _metaclass: NodeNG | None = None _metaclass_hack = False hide = False type = property( _class_type, doc=( "The class type for this node.\n\n" "Possible values are: class, metaclass, exception.\n\n" ":type: str" ), ) _other_fields = ("name", "is_dataclass", "position") _other_other_fields = ("locals", "_newstyle") _newstyle: bool | None = None def __init__( self, name: str, lineno: int, col_offset: int, parent: NodeNG, *, end_lineno: int | None, end_col_offset: int | None, ) -> None: self.instance_attrs: dict[str, NodeNG] = {} self.locals = {} """A map of the name of a local variable to the node defining it.""" self.keywords: list[node_classes.Keyword] = [] """The keywords given to the class definition. This is usually for :pep:`3115` style metaclass declaration. 
""" self.bases: list[SuccessfulInferenceResult] = [] """What the class inherits from.""" self.body: list[NodeNG] = [] """The contents of the class body.""" self.name = name """The name of the class.""" self.decorators = None """The decorators that are applied to this class.""" self.doc_node: Const | None = None """The doc node associated with this node.""" self.is_dataclass: bool = False """Whether this class is a dataclass.""" self.type_params: list[nodes.TypeVar | nodes.ParamSpec | nodes.TypeVarTuple] = ( [] ) """PEP 695 (Python 3.12+) type params, e.g. class MyClass[T]: ...""" super().__init__( lineno=lineno, col_offset=col_offset, end_lineno=end_lineno, end_col_offset=end_col_offset, parent=parent, ) if parent and not isinstance(parent, Unknown): parent.frame().set_local(name, self) for local_name, node in self.implicit_locals(): self.add_local_node(node, local_name) infer_binary_op: ClassVar[InferBinaryOp[ClassDef]] = ( protocols.instance_class_infer_binary_op ) def implicit_parameters(self) -> Literal[1]: return 1 def implicit_locals(self): """Get implicitly defined class definition locals. :returns: the the name and Const pair for each local :rtype: tuple(tuple(str, node_classes.Const), ...) 
""" locals_ = (("__module__", self.special_attributes.attr___module__),) # __qualname__ is defined in PEP3155 locals_ += (("__qualname__", self.special_attributes.attr___qualname__),) return locals_ # pylint: disable=redefined-outer-name def postinit( self, bases: list[SuccessfulInferenceResult], body: list[NodeNG], decorators: node_classes.Decorators | None, newstyle: bool | None = None, metaclass: NodeNG | None = None, keywords: list[node_classes.Keyword] | None = None, *, position: Position | None = None, doc_node: Const | None = None, type_params: ( list[nodes.TypeVar | nodes.ParamSpec | nodes.TypeVarTuple] | None ) = None, ) -> None: if keywords is not None: self.keywords = keywords self.bases = bases self.body = body self.decorators = decorators self._newstyle = newstyle self._metaclass = metaclass self.position = position self.doc_node = doc_node self.type_params = type_params or [] def _newstyle_impl(self, context: InferenceContext | None = None): if context is None: context = InferenceContext() if self._newstyle is not None: return self._newstyle for base in self.ancestors(recurs=False, context=context): if base._newstyle_impl(context): self._newstyle = True break klass = self.declared_metaclass() # could be any callable, we'd need to infer the result of klass(name, # bases, dict). punt if it's not a class node. if klass is not None and isinstance(klass, ClassDef): self._newstyle = klass._newstyle_impl(context) if self._newstyle is None: self._newstyle = False return self._newstyle _newstyle = None newstyle = property( _newstyle_impl, doc=("Whether this is a new style class or not\n\n" ":type: bool or None"), ) @cached_property def fromlineno(self) -> int: """The first line that this node appears on in the source code. Can also return 0 if the line can not be determined. """ if IS_PYPY and PY38 and not PYPY_7_3_11_PLUS: # For Python < 3.8 the lineno is the line number of the first decorator. # We want the class statement lineno. 
Similar to 'FunctionDef.fromlineno' # PyPy (3.8): Fixed with version v7.3.11 lineno = self.lineno or 0 if self.decorators is not None: lineno += sum( node.tolineno - (node.lineno or 0) + 1 for node in self.decorators.nodes ) return lineno or 0 return super().fromlineno @cached_property def blockstart_tolineno(self): """The line on which the beginning of this block ends. :type: int """ if self.bases: return self.bases[-1].tolineno return self.fromlineno def block_range(self, lineno: int) -> tuple[int, int]: """Get a range from the given line number to where this node ends. :param lineno: Unused. :returns: The range of line numbers that this node belongs to, """ return self.fromlineno, self.tolineno def pytype(self) -> Literal["builtins.type", "builtins.classobj"]: """Get the name of the type that this node represents. :returns: The name of the type. """ if self.newstyle: return "builtins.type" return "builtins.classobj" def display_type(self) -> str: """A human readable type of this node. :returns: The type of this node. :rtype: str """ return "Class" def callable(self) -> bool: """Whether this node defines something that is callable. :returns: Whether this defines something that is callable. For a :class:`ClassDef` this is always ``True``. """ return True def is_subtype_of(self, type_name, context: InferenceContext | None = None) -> bool: """Whether this class is a subtype of the given type. :param type_name: The name of the type of check against. :type type_name: str :returns: Whether this class is a subtype of the given type. 
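
        For example (doctest-style, illustrative)::

            >>> import astroid
            >>> node = astroid.extract_node('''
            class Error(Exception):
                pass
            ''')
            >>> node.is_subtype_of('builtins.Exception')
            True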
""" if self.qname() == type_name: return True return any(anc.qname() == type_name for anc in self.ancestors(context=context)) def _infer_type_call(self, caller, context): try: name_node = next(caller.args[0].infer(context)) except StopIteration as e: raise InferenceError(node=caller.args[0], context=context) from e if isinstance(name_node, node_classes.Const) and isinstance( name_node.value, str ): name = name_node.value else: return util.Uninferable result = ClassDef( name, lineno=0, col_offset=0, end_lineno=0, end_col_offset=0, parent=Unknown(), ) # Get the bases of the class. try: class_bases = next(caller.args[1].infer(context)) except StopIteration as e: raise InferenceError(node=caller.args[1], context=context) from e if isinstance(class_bases, (node_classes.Tuple, node_classes.List)): bases = [] for base in class_bases.itered(): inferred = next(base.infer(context=context), None) if inferred: bases.append( node_classes.EvaluatedObject(original=base, value=inferred) ) result.bases = bases else: # There is currently no AST node that can represent an 'unknown' # node (Uninferable is not an AST node), therefore we simply return Uninferable here # although we know at least the name of the class. 
return util.Uninferable # Get the members of the class try: members = next(caller.args[2].infer(context)) except (InferenceError, StopIteration): members = None if members and isinstance(members, node_classes.Dict): for attr, value in members.items: if isinstance(attr, node_classes.Const) and isinstance(attr.value, str): result.locals[attr.value] = [value] result.parent = caller.parent return result def infer_call_result( self, caller: SuccessfulInferenceResult | None, context: InferenceContext | None = None, ) -> Iterator[InferenceResult]: """infer what a class is returning when called""" if self.is_subtype_of("builtins.type", context) and len(caller.args) == 3: result = self._infer_type_call(caller, context) yield result return dunder_call = None try: metaclass = self.metaclass(context=context) if metaclass is not None: # Only get __call__ if it's defined locally for the metaclass. # Otherwise we will find ObjectModel.__call__ which will # return an instance of the metaclass. Instantiating the class is # handled later. if "__call__" in metaclass.locals: dunder_call = next(metaclass.igetattr("__call__", context)) except (AttributeInferenceError, StopIteration): pass if dunder_call and dunder_call.qname() != "builtins.type.__call__": # Call type.__call__ if not set metaclass # (since type is the default metaclass) context = bind_context_to_node(context, self) context.callcontext.callee = dunder_call yield from dunder_call.infer_call_result(caller, context) else: yield self.instantiate_class() def scope_lookup( self, node: LookupMixIn, name: str, offset: int = 0 ) -> tuple[LocalsDictNodeNG, list[nodes.NodeNG]]: """Lookup where the given name is assigned. :param node: The node to look for assignments up to. Any assignments after the given node are ignored. :param name: The name to find assignments for. :param offset: The line offset to filter statements up to. 
:returns: This scope node and the list of assignments associated to the
            given name according to the scope where it has been found (locals,
            globals or builtin).
        """
        # If the name looks like a builtin name, just try to look
        # into the upper scope of this class. We might have a
        # decorator that is poorly named after a builtin object
        # inside this class.
        lookup_upper_frame = (
            isinstance(node.parent, node_classes.Decorators)
            and name in AstroidManager().builtins_module
        )
        if (
            any(
                node == base or base.parent_of(node) and not self.type_params
                for base in self.bases
            )
            or lookup_upper_frame
        ):
            # Handle the case where we have either a name
            # in the bases of a class, which exists before
            # the actual definition or the case where we have
            # a Getattr node, with that name.
            #
            # name = ...
            # class A(name):
            #     def name(self): ...
            #
            # import name
            # class A(name.Name):
            #     def name(self): ...
            if not self.parent:
                raise ParentMissingError(target=self)
            frame = self.parent.frame()
            # line offset so that class A(A) does not resolve the ancestor to
            # the defined class
            offset = -1
        else:
            frame = self
        return frame._scope_lookup(node, name, offset)

    @property
    def basenames(self):
        """The names of the parent classes.

        Names are given in the order they appear in the class
        definition.

        :type: list(str)
        """
        return [bnode.as_string() for bnode in self.bases]

    def ancestors(
        self, recurs: bool = True, context: InferenceContext | None = None
    ) -> Generator[ClassDef, None, None]:
        """Iterate over the base classes in prefixed depth first order.

        :param recurs: Whether to recurse or return direct ancestors only.
:returns: The base classes """ # FIXME: should be possible to choose the resolution order # FIXME: inference make infinite loops possible here yielded = {self} if context is None: context = InferenceContext() if not self.bases and self.qname() != "builtins.object": # This should always be a ClassDef (which we don't assert for) yield builtin_lookup("object")[1][0] # type: ignore[misc] return for stmt in self.bases: with context.restore_path(): try: for baseobj in stmt.infer(context): if not isinstance(baseobj, ClassDef): if isinstance(baseobj, bases.Instance): baseobj = baseobj._proxied else: continue if not baseobj.hide: if baseobj in yielded: continue yielded.add(baseobj) yield baseobj if not recurs: continue for grandpa in baseobj.ancestors(recurs=True, context=context): if grandpa is self: # This class is the ancestor of itself. break if grandpa in yielded: continue yielded.add(grandpa) yield grandpa except InferenceError: continue def local_attr_ancestors(self, name, context: InferenceContext | None = None): """Iterate over the parents that define the given name. :param name: The name to find definitions for. :type name: str :returns: The parents that define the given name. :rtype: iterable(NodeNG) """ # Look up in the mro if we can. This will result in the # attribute being looked up just as Python does it. try: ancestors: Iterable[ClassDef] = self.mro(context)[1:] except MroError: # Fallback to use ancestors, we can't determine # a sane MRO. ancestors = self.ancestors(context=context) for astroid in ancestors: if name in astroid: yield astroid def instance_attr_ancestors(self, name, context: InferenceContext | None = None): """Iterate over the parents that define the given name as an attribute. :param name: The name to find definitions for. :type name: str :returns: The parents that define the given name as an instance attribute. 
:rtype: iterable(NodeNG) """ for astroid in self.ancestors(context=context): if name in astroid.instance_attrs: yield astroid def has_base(self, node) -> bool: """Whether this class directly inherits from the given node. :param node: The node to check for. :type node: NodeNG :returns: Whether this class directly inherits from the given node. """ return node in self.bases def local_attr(self, name, context: InferenceContext | None = None): """Get the list of assign nodes associated to the given name. Assignments are looked for in both this class and in parents. :returns: The list of assignments to the given name. :rtype: list(NodeNG) :raises AttributeInferenceError: If no attribute with this name can be found in this class or parent classes. """ result = [] if name in self.locals: result = self.locals[name] else: class_node = next(self.local_attr_ancestors(name, context), None) if class_node: result = class_node.locals[name] result = [n for n in result if not isinstance(n, node_classes.DelAttr)] if result: return result raise AttributeInferenceError(target=self, attribute=name, context=context) def instance_attr(self, name, context: InferenceContext | None = None): """Get the list of nodes associated to the given attribute name. Assignments are looked for in both this class and in parents. :returns: The list of assignments to the given name. :rtype: list(NodeNG) :raises AttributeInferenceError: If no attribute with this name can be found in this class or parent classes. """ # Return a copy, so we don't modify self.instance_attrs, # which could lead to infinite loop. 
values = list(self.instance_attrs.get(name, [])) # get all values from parents for class_node in self.instance_attr_ancestors(name, context): values += class_node.instance_attrs[name] values = [n for n in values if not isinstance(n, node_classes.DelAttr)] if values: return values raise AttributeInferenceError(target=self, attribute=name, context=context) def instantiate_class(self) -> bases.Instance: """Get an :class:`Instance` of the :class:`ClassDef` node. :returns: An :class:`Instance` of the :class:`ClassDef` node """ from astroid import objects # pylint: disable=import-outside-toplevel try: if any(cls.name in EXCEPTION_BASE_CLASSES for cls in self.mro()): # Subclasses of exceptions can be exception instances return objects.ExceptionInstance(self) except MroError: pass return bases.Instance(self) def getattr( self, name: str, context: InferenceContext | None = None, class_context: bool = True, ) -> list[InferenceResult]: """Get an attribute from this class, using Python's attribute semantic. This method doesn't look in the :attr:`instance_attrs` dictionary since it is done by an :class:`Instance` proxy at inference time. It may return an :class:`Uninferable` object if the attribute has not been found, but a ``__getattr__`` or ``__getattribute__`` method is defined. If ``class_context`` is given, then it is considered that the attribute is accessed from a class context, e.g. ClassDef.attribute, otherwise it might have been accessed from an instance as well. If ``class_context`` is used in that case, then a lookup in the implicit metaclass and the explicit metaclass will be done. :param name: The attribute to look for. :param class_context: Whether the attribute can be accessed statically. :returns: The attribute. :raises AttributeInferenceError: If the attribute cannot be inferred. """ if not name: raise AttributeInferenceError(target=self, attribute=name, context=context) # don't modify the list in self.locals! 
values: list[InferenceResult] = list(self.locals.get(name, [])) for classnode in self.ancestors(recurs=True, context=context): values += classnode.locals.get(name, []) if name in self.special_attributes and class_context and not values: result = [self.special_attributes.lookup(name)] if name == "__bases__": # Need special treatment, since they are mutable # and we need to return all the values. result += values return result if class_context: values += self._metaclass_lookup_attribute(name, context) # Remove AnnAssigns without value, which are not attributes in the purest sense. for value in values.copy(): if isinstance(value, node_classes.AssignName): stmt = value.statement() if isinstance(stmt, node_classes.AnnAssign) and stmt.value is None: values.pop(values.index(value)) if not values: raise AttributeInferenceError(target=self, attribute=name, context=context) return values @lru_cache(maxsize=1024) # noqa def _metaclass_lookup_attribute(self, name, context): """Search the given name in the implicit and the explicit metaclass.""" attrs = set() implicit_meta = self.implicit_metaclass() context = copy_context(context) metaclass = self.metaclass(context=context) for cls in (implicit_meta, metaclass): if cls and cls != self and isinstance(cls, ClassDef): cls_attributes = self._get_attribute_from_metaclass(cls, name, context) attrs.update(set(cls_attributes)) return attrs def _get_attribute_from_metaclass(self, cls, name, context): from astroid import objects # pylint: disable=import-outside-toplevel try: attrs = cls.getattr(name, context=context, class_context=True) except AttributeInferenceError: return for attr in bases._infer_stmts(attrs, context, frame=cls): if not isinstance(attr, FunctionDef): yield attr continue if isinstance(attr, objects.Property): yield attr continue if attr.type == "classmethod": # If the method is a classmethod, then it will # be bound to the metaclass, not to the class # from where the attribute is retrieved. 
# get_wrapping_class could return None, so just # default to the current class. frame = get_wrapping_class(attr) or self yield bases.BoundMethod(attr, frame) elif attr.type == "staticmethod": yield attr else: yield bases.BoundMethod(attr, self) def igetattr( self, name: str, context: InferenceContext | None = None, class_context: bool = True, ) -> Iterator[InferenceResult]: """Infer the possible values of the given variable. :param name: The name of the variable to infer. :returns: The inferred possible values. """ from astroid import objects # pylint: disable=import-outside-toplevel # set lookup name since this is necessary to infer on import nodes for # instance context = copy_context(context) context.lookupname = name metaclass = self.metaclass(context=context) try: attributes = self.getattr(name, context, class_context=class_context) # If we have more than one attribute, make sure that those starting from # the second one are from the same scope. This is to account for modifications # to the attribute happening *after* the attribute's definition (e.g. AugAssigns on lists) if len(attributes) > 1: first_attr, attributes = attributes[0], attributes[1:] first_scope = first_attr.parent.scope() attributes = [first_attr] + [ attr for attr in attributes if attr.parent and attr.parent.scope() == first_scope ] functions = [attr for attr in attributes if isinstance(attr, FunctionDef)] if functions: # Prefer only the last function, unless a property is involved. 
            last_function = functions[-1]
            attributes = [
                a
                for a in attributes
                if a not in functions or a is last_function or bases._is_property(a)
            ]

            for inferred in bases._infer_stmts(attributes, context, frame=self):
                # yield Uninferable object instead of descriptors when necessary
                if not isinstance(inferred, node_classes.Const) and isinstance(
                    inferred, bases.Instance
                ):
                    try:
                        inferred._proxied.getattr("__get__", context)
                    except AttributeInferenceError:
                        yield inferred
                    else:
                        yield util.Uninferable
                elif isinstance(inferred, objects.Property):
                    function = inferred.function
                    if not class_context:
                        # Through an instance so we can solve the property
                        yield from function.infer_call_result(
                            caller=self, context=context
                        )
                    # If we're in a class context, we need to determine if the property
                    # was defined in the metaclass (a derived class must be a subclass of
                    # the metaclass of all its bases), in which case we can resolve the
                    # property. If not, i.e. the property is defined in some base class
                    # instead, then we return the property object
                    elif metaclass and function.parent.scope() is metaclass:
                        # Resolve a property as long as it is not accessed through
                        # the class itself.
                        yield from function.infer_call_result(
                            caller=self, context=context
                        )
                    else:
                        yield inferred
                else:
                    yield function_to_method(inferred, self)
        except AttributeInferenceError as error:
            if not name.startswith("__") and self.has_dynamic_getattr(context):
                # The class handles some dynamic attributes; return an
                # Uninferable object.
                yield util.Uninferable
            else:
                raise InferenceError(
                    str(error), target=self, attribute=name, context=context
                ) from error

    def has_dynamic_getattr(self, context: InferenceContext | None = None) -> bool:
        """Check if the class has a custom __getattr__ or __getattribute__.

        If any such method is found and it is not from builtins, nor from
        an extension module, then the function will return True.

        :returns: Whether the class has a custom __getattr__ or __getattribute__.
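
A minimal runtime illustration of the pattern this check looks for
(plain Python, not an astroid API; the class name is invented for the
example):

```python
class Dynamic:
    # A class whose attributes cannot be inferred statically:
    # every lookup that fails normally is answered by __getattr__.
    def __getattr__(self, name):
        return f"dynamic:{name}"

d = Dynamic()
assert d.anything == "dynamic:anything"
```

Because any attribute name "exists" at runtime on such a class,
astroid falls back to yielding ``Uninferable`` rather than guessing.
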
""" def _valid_getattr(node): root = node.root() return root.name != "builtins" and getattr(root, "pure_python", None) try: return _valid_getattr(self.getattr("__getattr__", context)[0]) except AttributeInferenceError: # if self.newstyle: XXX cause an infinite recursion error try: getattribute = self.getattr("__getattribute__", context)[0] return _valid_getattr(getattribute) except AttributeInferenceError: pass return False def getitem(self, index, context: InferenceContext | None = None): """Return the inference of a subscript. This is basically looking up the method in the metaclass and calling it. :returns: The inferred value of a subscript to this class. :rtype: NodeNG :raises AstroidTypeError: If this class does not define a ``__getitem__`` method. """ try: methods = lookup(self, "__getitem__", context=context) except AttributeInferenceError as exc: if isinstance(self, ClassDef): # subscripting a class definition may be # achieved thanks to __class_getitem__ method # which is a classmethod defined in the class # that supports subscript and not in the metaclass try: methods = self.getattr("__class_getitem__") # Here it is assumed that the __class_getitem__ node is # a FunctionDef. One possible improvement would be to deal # with more generic inference. except AttributeInferenceError: raise AstroidTypeError(node=self, context=context) from exc else: raise AstroidTypeError(node=self, context=context) from exc method = methods[0] # Create a new callcontext for providing index as an argument. new_context = bind_context_to_node(context, self) new_context.callcontext = CallContext(args=[index], callee=method) try: return next(method.infer_call_result(self, new_context), util.Uninferable) except AttributeError: # Starting with python3.9, builtin types list, dict etc... # are subscriptable thanks to __class_getitem___ classmethod. 
            # However, in such a case the method is bound to an EmptyNode, and
            # EmptyNode doesn't have an infer_call_result method, leading to
            # an AttributeError.
            if (
                isinstance(method, node_classes.EmptyNode)
                and self.pytype() == "builtins.type"
                and PY39_PLUS
            ):
                return self
            raise
        except InferenceError:
            return util.Uninferable

    def methods(self):
        """Iterate over all of the methods defined in this class and its parents.

        :returns: The methods defined on the class.
        :rtype: iterable(FunctionDef)
        """
        done = {}
        for astroid in itertools.chain(iter((self,)), self.ancestors()):
            for meth in astroid.mymethods():
                if meth.name in done:
                    continue
                done[meth.name] = None
                yield meth

    def mymethods(self):
        """Iterate over all of the methods defined in this class only.

        :returns: The methods defined on the class.
        :rtype: iterable(FunctionDef)
        """
        for member in self.values():
            if isinstance(member, FunctionDef):
                yield member

    def implicit_metaclass(self):
        """Get the implicit metaclass of the current class.

        For newstyle classes, this will return an instance of builtins.type.
        For oldstyle classes, it will simply return None, since there's
        no implicit metaclass there.

        :returns: The metaclass.
        :rtype: builtins.type or None
        """
        if self.newstyle:
            return builtin_lookup("type")[1][0]
        return None

    def declared_metaclass(
        self, context: InferenceContext | None = None
    ) -> SuccessfulInferenceResult | None:
        """Return the explicitly declared metaclass for the current class.

        An explicitly declared metaclass is defined either by passing the
        ``metaclass`` keyword argument in the class definition line (Python 3)
        or (Python 2) by having a ``__metaclass__`` class attribute, or if
        there are no explicit bases but there is a global ``__metaclass__``
        variable.

        :returns: The metaclass of this class,
            or None if one could not be found.
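
As a plain-Python illustration of what counts as an explicitly declared
metaclass (the class names are invented for the example):

```python
class Meta(type):
    pass

# Declared explicitly with the `metaclass` keyword argument.
class Configured(metaclass=Meta):
    pass

# A class without the keyword only has the implicit metaclass, `type`.
class Plain:
    pass

assert type(Configured) is Meta
assert type(Plain) is type
```
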
""" for base in self.bases: try: for baseobj in base.infer(context=context): if isinstance(baseobj, ClassDef) and baseobj.hide: self._metaclass = baseobj._metaclass self._metaclass_hack = True break except InferenceError: pass if self._metaclass: # Expects this from Py3k TreeRebuilder try: return next( node for node in self._metaclass.infer(context=context) if not isinstance(node, util.UninferableBase) ) except (InferenceError, StopIteration): return None return None def _find_metaclass( self, seen: set[ClassDef] | None = None, context: InferenceContext | None = None ) -> SuccessfulInferenceResult | None: if seen is None: seen = set() seen.add(self) klass = self.declared_metaclass(context=context) if klass is None: for parent in self.ancestors(context=context): if parent not in seen: klass = parent._find_metaclass(seen) if klass is not None: break return klass def metaclass( self, context: InferenceContext | None = None ) -> SuccessfulInferenceResult | None: """Get the metaclass of this class. If this class does not define explicitly a metaclass, then the first defined metaclass in ancestors will be used instead. :returns: The metaclass of this class. """ return self._find_metaclass(context=context) def has_metaclass_hack(self): return self._metaclass_hack def _islots(self): """Return an iterator with the inferred slots.""" if "__slots__" not in self.locals: return None for slots in self.igetattr("__slots__"): # check if __slots__ is a valid type for meth in ITER_METHODS: try: slots.getattr(meth) break except AttributeInferenceError: continue else: continue if isinstance(slots, node_classes.Const): # a string. Ignore the following checks, # but yield the node, only if it has a value if slots.value: yield slots continue if not hasattr(slots, "itered"): # we can't obtain the values, maybe a .deque? 
                continue

            if isinstance(slots, node_classes.Dict):
                values = [item[0] for item in slots.items]
            else:
                values = slots.itered()
            if isinstance(values, util.UninferableBase):
                continue
            if not values:
                # Stop the iteration, because the class
                # has an empty list of slots.
                return values
            for elt in values:
                try:
                    for inferred in elt.infer():
                        if not isinstance(
                            inferred, node_classes.Const
                        ) or not isinstance(inferred.value, str):
                            continue
                        if not inferred.value:
                            continue
                        yield inferred
                except InferenceError:
                    continue

        return None

    def _slots(self):
        if not self.newstyle:
            raise NotImplementedError(
                "The concept of slots is undefined for old-style classes."
            )
        slots = self._islots()
        try:
            first = next(slots)
        except StopIteration as exc:
            # The class doesn't have a __slots__ definition or empty slots.
            if exc.args and exc.args[0] not in ("", None):
                return exc.args[0]
            return None
        return [first, *slots]

    # Cached, because inferring them all the time is expensive
    @cached_property
    def _all_slots(self):
        """Get all the slots for this node.

        :returns: The names of slots for this class. If the class doesn't
            define any slots through the ``__slots__`` variable, then this
            function will return None. It will also return None if the slots
            could not be inferred.
        :rtype: list(str) or None
        """

        def grouped_slots(
            mro: list[ClassDef],
        ) -> Iterator[node_classes.NodeNG | None]:
            for cls in mro:
                # Not interested in object, since it can't have slots.
                if cls.qname() == "builtins.object":
                    continue
                try:
                    cls_slots = cls._slots()
                except NotImplementedError:
                    continue
                if cls_slots is not None:
                    yield from cls_slots
                else:
                    yield None

        if not self.newstyle:
            raise NotImplementedError(
                "The concept of slots is undefined for old-style classes."
            )

        try:
            mro = self.mro()
        except MroError as e:
            raise NotImplementedError(
                "Cannot get slots while parsing mro fails."
            ) from e

        slots = list(grouped_slots(mro))
        if not all(slot is not None for slot in slots):
            return None

        return sorted(set(slots), key=lambda item: item.value)

    def slots(self):
        return self._all_slots

    def _inferred_bases(self, context: InferenceContext | None = None):
        # Similar to .ancestors, but the difference is that when one base
        # is inferred, only the first object is wanted. That's because
        # we aren't interested in superclasses, as in the following
        # example:
        #
        # class SomeSuperClass(object): pass
        # class SomeClass(SomeSuperClass): pass
        # class Test(SomeClass): pass
        #
        # Inferring SomeClass from Test's bases will give
        # us both SomeClass and SomeSuperClass, but we are interested
        # only in SomeClass.

        if context is None:
            context = InferenceContext()
        if not self.bases and self.qname() != "builtins.object":
            yield builtin_lookup("object")[1][0]
            return

        for stmt in self.bases:
            try:
                # Find the first non-None inferred base value
                baseobj = next(
                    b
                    for b in stmt.infer(context=context.clone())
                    if not (isinstance(b, Const) and b.value is None)
                )
            except (InferenceError, StopIteration):
                continue
            if isinstance(baseobj, bases.Instance):
                baseobj = baseobj._proxied
            if not isinstance(baseobj, ClassDef):
                continue
            if not baseobj.hide:
                yield baseobj
            else:
                yield from baseobj.bases

    def _compute_mro(self, context: InferenceContext | None = None):
        if self.qname() == "builtins.object":
            return [self]

        inferred_bases = list(self._inferred_bases(context=context))
        bases_mro = []
        for base in inferred_bases:
            if base is self:
                continue

            try:
                mro = base._compute_mro(context=context)
                bases_mro.append(mro)
            except NotImplementedError:
                # Some classes have in their ancestors both newstyle and
                # old style classes. For these we can't retrieve the .mro,
                # although in Python it's possible, since the class we are
                # currently working on is in fact new style.
                # So, we fall back to ancestors here.
ancestors = list(base.ancestors(context=context)) bases_mro.append(ancestors) unmerged_mro: list[list[ClassDef]] = [[self], *bases_mro, inferred_bases] unmerged_mro = clean_duplicates_mro(unmerged_mro, self, context) clean_typing_generic_mro(unmerged_mro) return _c3_merge(unmerged_mro, self, context) def mro(self, context: InferenceContext | None = None) -> list[ClassDef]: """Get the method resolution order, using C3 linearization. :returns: The list of ancestors, sorted by the mro. :rtype: list(NodeNG) :raises DuplicateBasesError: Duplicate bases in the same class base :raises InconsistentMroError: A class' MRO is inconsistent """ return self._compute_mro(context=context) def bool_value(self, context: InferenceContext | None = None) -> Literal[True]: """Determine the boolean value of this node. :returns: The boolean value of this node. For a :class:`ClassDef` this is always ``True``. """ return True def get_children(self): if self.decorators is not None: yield self.decorators yield from self.bases if self.keywords is not None: yield from self.keywords yield from self.type_params yield from self.body @cached_property def _assign_nodes_in_scope(self): children_assign_nodes = ( child_node._assign_nodes_in_scope for child_node in self.body ) return list(itertools.chain.from_iterable(children_assign_nodes)) def frame(self: _T, *, future: Literal[None, True] = None) -> _T: """The node's frame node. A frame node is a :class:`Module`, :class:`FunctionDef`, :class:`ClassDef` or :class:`Lambda`. :returns: The node itself. 
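
For example (a sketch assuming ``astroid`` itself is importable, as in
the doctest examples elsewhere in this package):

```python
import astroid

# A ClassDef node is its own frame.
cls = astroid.extract_node("class A: ...")
assert cls.frame() is cls
assert cls.name == "A"
```
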
""" return self def _infer( self, context: InferenceContext | None = None, **kwargs: Any ) -> Generator[ClassDef, None, None]: yield self astroid-3.2.2/astroid/nodes/scoped_nodes/utils.py0000664000175000017500000000223514622475517022044 0ustar epsilonepsilon# Licensed under the LGPL: https://www.gnu.org/licenses/old-licenses/lgpl-2.1.en.html # For details: https://github.com/pylint-dev/astroid/blob/main/LICENSE # Copyright (c) https://github.com/pylint-dev/astroid/blob/main/CONTRIBUTORS.txt """This module contains utility functions for scoped nodes.""" from __future__ import annotations from typing import TYPE_CHECKING from astroid.manager import AstroidManager if TYPE_CHECKING: from astroid import nodes def builtin_lookup(name: str) -> tuple[nodes.Module, list[nodes.NodeNG]]: """Lookup a name in the builtin module. Return the list of matching statements and the ast for the builtin module """ manager = AstroidManager() try: _builtin_astroid = manager.builtins_module except KeyError: # User manipulated the astroid cache directly! Rebuild everything. manager.clear_cache() _builtin_astroid = manager.builtins_module if name == "__dict__": return _builtin_astroid, () try: stmts: list[nodes.NodeNG] = _builtin_astroid.locals[name] # type: ignore[assignment] except KeyError: stmts = [] return _builtin_astroid, stmts astroid-3.2.2/astroid/nodes/node_classes.py0000664000175000017500000051204514622475517020706 0ustar epsilonepsilon# Licensed under the LGPL: https://www.gnu.org/licenses/old-licenses/lgpl-2.1.en.html # For details: https://github.com/pylint-dev/astroid/blob/main/LICENSE # Copyright (c) https://github.com/pylint-dev/astroid/blob/main/CONTRIBUTORS.txt """Module for some node classes. 
More nodes in scoped_nodes.py""" from __future__ import annotations import abc import ast import itertools import operator import sys import typing import warnings from collections.abc import Generator, Iterable, Iterator, Mapping from functools import cached_property from typing import ( TYPE_CHECKING, Any, Callable, ClassVar, Literal, Optional, Union, ) from astroid import decorators, protocols, util from astroid.bases import Instance, _infer_stmts from astroid.const import _EMPTY_OBJECT_MARKER, Context from astroid.context import CallContext, InferenceContext, copy_context from astroid.exceptions import ( AstroidBuildingError, AstroidError, AstroidIndexError, AstroidTypeError, AstroidValueError, AttributeInferenceError, InferenceError, NameInferenceError, NoDefault, ParentMissingError, _NonDeducibleTypeHierarchy, ) from astroid.interpreter import dunder_lookup from astroid.manager import AstroidManager from astroid.nodes import _base_nodes from astroid.nodes.const import OP_PRECEDENCE from astroid.nodes.node_ng import NodeNG from astroid.typing import ( ConstFactoryResult, InferenceErrorInfo, InferenceResult, SuccessfulInferenceResult, ) if sys.version_info >= (3, 11): from typing import Self else: from typing_extensions import Self if TYPE_CHECKING: from astroid import nodes from astroid.nodes import LocalsDictNodeNG def _is_const(value) -> bool: return isinstance(value, tuple(CONST_CLS)) _NodesT = typing.TypeVar("_NodesT", bound=NodeNG) _BadOpMessageT = typing.TypeVar("_BadOpMessageT", bound=util.BadOperationMessage) AssignedStmtsPossibleNode = Union["List", "Tuple", "AssignName", "AssignAttr", None] AssignedStmtsCall = Callable[ [ _NodesT, AssignedStmtsPossibleNode, Optional[InferenceContext], Optional[typing.List[int]], ], Any, ] InferBinaryOperation = Callable[ [_NodesT, Optional[InferenceContext]], typing.Generator[Union[InferenceResult, _BadOpMessageT], None, None], ] InferLHS = Callable[ [_NodesT, Optional[InferenceContext]], 
    typing.Generator[InferenceResult, None, Optional[InferenceErrorInfo]],
]
InferUnaryOp = Callable[[_NodesT, str], ConstFactoryResult]


@decorators.raise_if_nothing_inferred
def unpack_infer(stmt, context: InferenceContext | None = None):
    """Recursively generate nodes inferred by the given statement.

    If the inferred value is a list or a tuple, recurse on the elements.
    """
    if isinstance(stmt, (List, Tuple)):
        for elt in stmt.elts:
            if elt is util.Uninferable:
                yield elt
                continue
            yield from unpack_infer(elt, context)
        return {"node": stmt, "context": context}
    # if inferred is a final node, return it and stop
    inferred = next(stmt.infer(context), util.Uninferable)
    if inferred is stmt:
        yield inferred
        return {"node": stmt, "context": context}
    # else, infer recursively, except Uninferable object that should be returned as is
    for inferred in stmt.infer(context):
        if isinstance(inferred, util.UninferableBase):
            yield inferred
        else:
            yield from unpack_infer(inferred, context)

    return {"node": stmt, "context": context}


def are_exclusive(stmt1, stmt2, exceptions: list[str] | None = None) -> bool:
    """Return True if the two given statements are mutually exclusive.

    `exceptions` may be a list of exception names. If specified, discard If
    branches and check whether one of the statements is in an exception
    handler catching one of the given exceptions.
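
A runtime sketch of the exclusivity being detected (plain Python; the
helper name is invented for the example):

```python
def pick(flag: bool) -> int:
    if flag:
        result = 1  # lives in `If.body`
    else:
        result = 2  # lives in `If.orelse`, exclusive with the line above
    return result

# The two assignments can never both execute in the same run.
assert (pick(True), pick(False)) == (1, 2)
```
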
algorithm : 1) index stmt1's parents 2) climb among stmt2's parents until we find a common parent 3) if the common parent is a If or Try statement, look if nodes are in exclusive branches """ # index stmt1's parents stmt1_parents = {} children = {} previous = stmt1 for node in stmt1.node_ancestors(): stmt1_parents[node] = 1 children[node] = previous previous = node # climb among stmt2's parents until we find a common parent previous = stmt2 for node in stmt2.node_ancestors(): if node in stmt1_parents: # if the common parent is a If or Try statement, look if # nodes are in exclusive branches if isinstance(node, If) and exceptions is None: c2attr, c2node = node.locate_child(previous) c1attr, c1node = node.locate_child(children[node]) if "test" in (c1attr, c2attr): # If any node is `If.test`, then it must be inclusive with # the other node (`If.body` and `If.orelse`) return False if c1attr != c2attr: # different `If` branches (`If.body` and `If.orelse`) return True elif isinstance(node, Try): c2attr, c2node = node.locate_child(previous) c1attr, c1node = node.locate_child(children[node]) if c1node is not c2node: first_in_body_caught_by_handlers = ( c2attr == "handlers" and c1attr == "body" and previous.catch(exceptions) ) second_in_body_caught_by_handlers = ( c2attr == "body" and c1attr == "handlers" and children[node].catch(exceptions) ) first_in_else_other_in_handlers = ( c2attr == "handlers" and c1attr == "orelse" ) second_in_else_other_in_handlers = ( c2attr == "orelse" and c1attr == "handlers" ) if any( ( first_in_body_caught_by_handlers, second_in_body_caught_by_handlers, first_in_else_other_in_handlers, second_in_else_other_in_handlers, ) ): return True elif c2attr == "handlers" and c1attr == "handlers": return previous is not children[node] return False previous = node return False # getitem() helpers. 
_SLICE_SENTINEL = object() def _slice_value(index, context: InferenceContext | None = None): """Get the value of the given slice index.""" if isinstance(index, Const): if isinstance(index.value, (int, type(None))): return index.value elif index is None: return None else: # Try to infer what the index actually is. # Since we can't return all the possible values, # we'll stop at the first possible value. try: inferred = next(index.infer(context=context)) except (InferenceError, StopIteration): pass else: if isinstance(inferred, Const): if isinstance(inferred.value, (int, type(None))): return inferred.value # Use a sentinel, because None can be a valid # value that this function can return, # as it is the case for unspecified bounds. return _SLICE_SENTINEL def _infer_slice(node, context: InferenceContext | None = None): lower = _slice_value(node.lower, context) upper = _slice_value(node.upper, context) step = _slice_value(node.step, context) if all(elem is not _SLICE_SENTINEL for elem in (lower, upper, step)): return slice(lower, upper, step) raise AstroidTypeError( message="Could not infer slice used in subscript", node=node, index=node.parent, context=context, ) def _container_getitem(instance, elts, index, context: InferenceContext | None = None): """Get a slice or an item, using the given *index*, for the given sequence.""" try: if isinstance(index, Slice): index_slice = _infer_slice(index, context=context) new_cls = instance.__class__() new_cls.elts = elts[index_slice] new_cls.parent = instance.parent return new_cls if isinstance(index, Const): return elts[index.value] except ValueError as exc: raise AstroidValueError( message="Slice {index!r} cannot index container", node=instance, index=index, context=context, ) from exc except IndexError as exc: raise AstroidIndexError( message="Index {index!s} out of range", node=instance, index=index, context=context, ) from exc except TypeError as exc: raise AstroidTypeError( message="Type error {error!r}", node=instance, 
index=index, context=context ) from exc raise AstroidTypeError(f"Could not use {index} as subscript index") class BaseContainer(_base_nodes.ParentAssignNode, Instance, metaclass=abc.ABCMeta): """Base class for Set, FrozenSet, Tuple and List.""" _astroid_fields = ("elts",) def __init__( self, lineno: int | None, col_offset: int | None, parent: NodeNG | None, *, end_lineno: int | None, end_col_offset: int | None, ) -> None: self.elts: list[SuccessfulInferenceResult] = [] """The elements in the node.""" super().__init__( lineno=lineno, col_offset=col_offset, end_lineno=end_lineno, end_col_offset=end_col_offset, parent=parent, ) def postinit(self, elts: list[SuccessfulInferenceResult]) -> None: self.elts = elts @classmethod def from_elements(cls, elts: Iterable[Any]) -> Self: """Create a node of this type from the given list of elements. :param elts: The list of elements that the node should contain. :returns: A new node containing the given elements. """ node = cls( lineno=None, col_offset=None, parent=None, end_lineno=None, end_col_offset=None, ) node.elts = [const_factory(e) if _is_const(e) else e for e in elts] return node def itered(self): """An iterator over the elements this node contains. :returns: The contents of this node. :rtype: iterable(NodeNG) """ return self.elts def bool_value(self, context: InferenceContext | None = None) -> bool: """Determine the boolean value of this node. :returns: The boolean value of this node. """ return bool(self.elts) @abc.abstractmethod def pytype(self) -> str: """Get the name of the type that this node represents. :returns: The name of the type. 
""" def get_children(self): yield from self.elts @decorators.raise_if_nothing_inferred def _infer( self, context: InferenceContext | None = None, **kwargs: Any, ) -> Iterator[Self]: has_starred_named_expr = any( isinstance(e, (Starred, NamedExpr)) for e in self.elts ) if has_starred_named_expr: values = self._infer_sequence_helper(context) new_seq = type(self)( lineno=self.lineno, col_offset=self.col_offset, parent=self.parent, end_lineno=self.end_lineno, end_col_offset=self.end_col_offset, ) new_seq.postinit(values) yield new_seq else: yield self def _infer_sequence_helper( self, context: InferenceContext | None = None ) -> list[SuccessfulInferenceResult]: """Infer all values based on BaseContainer.elts.""" values = [] for elt in self.elts: if isinstance(elt, Starred): starred = util.safe_infer(elt.value, context) if not starred: raise InferenceError(node=self, context=context) if not hasattr(starred, "elts"): raise InferenceError(node=self, context=context) # TODO: fresh context? values.extend(starred._infer_sequence_helper(context)) elif isinstance(elt, NamedExpr): value = util.safe_infer(elt.value, context) if not value: raise InferenceError(node=self, context=context) values.append(value) else: values.append(elt) return values # Name classes class AssignName( _base_nodes.NoChildrenNode, _base_nodes.LookupMixIn, _base_nodes.ParentAssignNode, ): """Variation of :class:`ast.Assign` representing assignment to a name. An :class:`AssignName` is the name of something that is assigned to. This includes variables defined in a function signature or in a loop. 
    >>> import astroid
    >>> node = astroid.extract_node('variable = range(10)')
    >>> node
    <Assign l.1 at 0x...>
    >>> list(node.get_children())
    [<AssignName.variable l.1 at 0x...>, <Call l.1 at 0x...>]
    >>> list(node.get_children())[0].as_string()
    'variable'
    """

    _other_fields = ("name",)

    def __init__(
        self,
        name: str,
        lineno: int,
        col_offset: int,
        parent: NodeNG,
        *,
        end_lineno: int | None,
        end_col_offset: int | None,
    ) -> None:
        self.name = name
        """The name that is assigned to."""

        super().__init__(
            lineno=lineno,
            col_offset=col_offset,
            end_lineno=end_lineno,
            end_col_offset=end_col_offset,
            parent=parent,
        )

    assigned_stmts = protocols.assend_assigned_stmts
    """Returns the assigned statement (non inferred) according to the assignment type.
    See astroid/protocols.py for actual implementation.
    """

    @decorators.raise_if_nothing_inferred
    @decorators.path_wrapper
    def _infer(
        self, context: InferenceContext | None = None, **kwargs: Any
    ) -> Generator[InferenceResult, None, InferenceErrorInfo | None]:
        """Infer an AssignName: need to inspect the RHS part of the
        assign node.
        """
        if isinstance(self.parent, AugAssign):
            return self.parent.infer(context)

        stmts = list(self.assigned_stmts(context=context))
        return _infer_stmts(stmts, context)

    @decorators.raise_if_nothing_inferred
    def infer_lhs(
        self, context: InferenceContext | None = None, **kwargs: Any
    ) -> Generator[InferenceResult, None, InferenceErrorInfo | None]:
        """Infer a Name: use name lookup rules.

        Same implementation as Name._infer."""
        # pylint: disable=import-outside-toplevel
        from astroid.constraint import get_constraints
        from astroid.helpers import _higher_function_scope

        frame, stmts = self.lookup(self.name)
        if not stmts:
            # Try to see if the name is enclosed in a nested function
            # and use the higher (first function) scope for searching.
            parent_function = _higher_function_scope(self.scope())
            if parent_function:
                _, stmts = parent_function.lookup(self.name)

            if not stmts:
                raise NameInferenceError(
                    name=self.name, scope=self.scope(), context=context
                )
        context = copy_context(context)
        context.lookupname = self.name
        context.constraints[self.name] = get_constraints(self, frame)

        return _infer_stmts(stmts, context, frame)


class DelName(
    _base_nodes.NoChildrenNode, _base_nodes.LookupMixIn, _base_nodes.ParentAssignNode
):
    """Variation of :class:`ast.Delete` representing deletion of a name.

    A :class:`DelName` is the name of something that is deleted.

    >>> import astroid
    >>> node = astroid.extract_node("del variable #@")
    >>> list(node.get_children())
    [<DelName.variable l.1 at 0x...>]
    >>> list(node.get_children())[0].as_string()
    'variable'
    """

    _other_fields = ("name",)

    def __init__(
        self,
        name: str,
        lineno: int,
        col_offset: int,
        parent: NodeNG,
        *,
        end_lineno: int | None,
        end_col_offset: int | None,
    ) -> None:
        self.name = name
        """The name that is being deleted."""

        super().__init__(
            lineno=lineno,
            col_offset=col_offset,
            end_lineno=end_lineno,
            end_col_offset=end_col_offset,
            parent=parent,
        )


class Name(_base_nodes.LookupMixIn, _base_nodes.NoChildrenNode):
    """Class representing an :class:`ast.Name` node.

    A :class:`Name` node is something that is named, but not covered by
    :class:`AssignName` or :class:`DelName`.
    >>> import astroid
    >>> node = astroid.extract_node('range(10)')
    >>> node
    <Call l.1 at 0x...>
    >>> list(node.get_children())
    [<Name.range l.1 at 0x...>, <Const.int l.1 at 0x...>]
    >>> list(node.get_children())[0].as_string()
    'range'
    """

    _other_fields = ("name",)

    def __init__(
        self,
        name: str,
        lineno: int,
        col_offset: int,
        parent: NodeNG,
        *,
        end_lineno: int | None,
        end_col_offset: int | None,
    ) -> None:
        self.name = name
        """The name that this node refers to."""

        super().__init__(
            lineno=lineno,
            col_offset=col_offset,
            end_lineno=end_lineno,
            end_col_offset=end_col_offset,
            parent=parent,
        )

    def _get_name_nodes(self):
        yield self

        for child_node in self.get_children():
            yield from child_node._get_name_nodes()

    @decorators.raise_if_nothing_inferred
    @decorators.path_wrapper
    def _infer(
        self, context: InferenceContext | None = None, **kwargs: Any
    ) -> Generator[InferenceResult, None, InferenceErrorInfo | None]:
        """Infer a Name: use name lookup rules.

        Same implementation as AssignName.infer_lhs."""
        # pylint: disable=import-outside-toplevel
        from astroid.constraint import get_constraints
        from astroid.helpers import _higher_function_scope

        frame, stmts = self.lookup(self.name)
        if not stmts:
            # Try to see if the name is enclosed in a nested function
            # and use the higher (first function) scope for searching.
            parent_function = _higher_function_scope(self.scope())
            if parent_function:
                _, stmts = parent_function.lookup(self.name)

            if not stmts:
                raise NameInferenceError(
                    name=self.name, scope=self.scope(), context=context
                )
        context = copy_context(context)
        context.lookupname = self.name
        context.constraints[self.name] = get_constraints(self, frame)

        return _infer_stmts(stmts, context, frame)


DEPRECATED_ARGUMENT_DEFAULT = "DEPRECATED_ARGUMENT_DEFAULT"


class Arguments(
    _base_nodes.AssignTypeNode
):  # pylint: disable=too-many-instance-attributes
    """Class representing an :class:`ast.arguments` node.

    An :class:`Arguments` node represents the arguments in a function
    definition.
    >>> import astroid
    >>> node = astroid.extract_node('def foo(bar): pass')
    >>> node
    <FunctionDef.foo l.1 at 0x...>
    >>> node.args
    <Arguments l.1 at 0x...>
    """

    # Python 3.4+ uses a different approach regarding annotations,
    # each argument is a new class, _ast.arg, which exposes an
    # 'annotation' attribute. In astroid though, arguments are exposed
    # as is in the Arguments node and the only way to expose annotations
    # is by using something similar with Python 3.3:
    #  - we expose 'varargannotation' and 'kwargannotation' of annotations
    #    of varargs and kwargs.
    #  - we expose 'annotation', a list with annotations for
    #    each normal argument. If an argument doesn't have an
    #    annotation, its value will be None.
    _astroid_fields = (
        "args",
        "defaults",
        "kwonlyargs",
        "posonlyargs",
        "posonlyargs_annotations",
        "kw_defaults",
        "annotations",
        "varargannotation",
        "kwargannotation",
        "kwonlyargs_annotations",
        "type_comment_args",
        "type_comment_kwonlyargs",
        "type_comment_posonlyargs",
    )

    _other_fields = ("vararg", "kwarg")

    args: list[AssignName] | None
    """The names of the required arguments.

    Can be None if the associated function does not have a retrievable
    signature and the arguments are therefore unknown.
    This can happen with (builtin) functions implemented in C that have
    incomplete signature information.
""" defaults: list[NodeNG] | None """The default values for arguments that can be passed positionally.""" kwonlyargs: list[AssignName] """The keyword arguments that cannot be passed positionally.""" posonlyargs: list[AssignName] """The arguments that can only be passed positionally.""" kw_defaults: list[NodeNG | None] | None """The default values for keyword arguments that cannot be passed positionally.""" annotations: list[NodeNG | None] """The type annotations of arguments that can be passed positionally.""" posonlyargs_annotations: list[NodeNG | None] """The type annotations of arguments that can only be passed positionally.""" kwonlyargs_annotations: list[NodeNG | None] """The type annotations of arguments that cannot be passed positionally.""" type_comment_args: list[NodeNG | None] """The type annotation, passed by a type comment, of each argument. If an argument does not have a type comment, the value for that argument will be None. """ type_comment_kwonlyargs: list[NodeNG | None] """The type annotation, passed by a type comment, of each keyword only argument. If an argument does not have a type comment, the value for that argument will be None. """ type_comment_posonlyargs: list[NodeNG | None] """The type annotation, passed by a type comment, of each positional argument. If an argument does not have a type comment, the value for that argument will be None. 
""" varargannotation: NodeNG | None """The type annotation for the variable length arguments.""" kwargannotation: NodeNG | None """The type annotation for the variable length keyword arguments.""" vararg_node: AssignName | None """The node for variable length arguments""" kwarg_node: AssignName | None """The node for variable keyword arguments""" def __init__( self, vararg: str | None, kwarg: str | None, parent: NodeNG, vararg_node: AssignName | None = None, kwarg_node: AssignName | None = None, ) -> None: """Almost all attributes can be None for living objects where introspection failed.""" super().__init__( parent=parent, lineno=None, col_offset=None, end_lineno=None, end_col_offset=None, ) self.vararg = vararg """The name of the variable length arguments.""" self.kwarg = kwarg """The name of the variable length keyword arguments.""" self.vararg_node = vararg_node self.kwarg_node = kwarg_node # pylint: disable=too-many-arguments def postinit( self, args: list[AssignName] | None, defaults: list[NodeNG] | None, kwonlyargs: list[AssignName], kw_defaults: list[NodeNG | None] | None, annotations: list[NodeNG | None], posonlyargs: list[AssignName], kwonlyargs_annotations: list[NodeNG | None], posonlyargs_annotations: list[NodeNG | None], varargannotation: NodeNG | None = None, kwargannotation: NodeNG | None = None, type_comment_args: list[NodeNG | None] | None = None, type_comment_kwonlyargs: list[NodeNG | None] | None = None, type_comment_posonlyargs: list[NodeNG | None] | None = None, ) -> None: self.args = args self.defaults = defaults self.kwonlyargs = kwonlyargs self.posonlyargs = posonlyargs self.kw_defaults = kw_defaults self.annotations = annotations self.kwonlyargs_annotations = kwonlyargs_annotations self.posonlyargs_annotations = posonlyargs_annotations # Parameters that got added later and need a default self.varargannotation = varargannotation self.kwargannotation = kwargannotation if type_comment_args is None: type_comment_args = [] self.type_comment_args 
= type_comment_args

        if type_comment_kwonlyargs is None:
            type_comment_kwonlyargs = []
        self.type_comment_kwonlyargs = type_comment_kwonlyargs

        if type_comment_posonlyargs is None:
            type_comment_posonlyargs = []
        self.type_comment_posonlyargs = type_comment_posonlyargs

    assigned_stmts = protocols.arguments_assigned_stmts
    """Returns the assigned statement (non inferred) according to the assignment type.

    See astroid/protocols.py for actual implementation.
    """

    def _infer_name(self, frame, name):
        if self.parent is frame:
            return name
        return None

    @cached_property
    def fromlineno(self) -> int:
        """The first line that this node appears on in the source code.

        Can also return 0 if the line can not be determined.
        """
        lineno = super().fromlineno
        return max(lineno, self.parent.fromlineno or 0)

    @cached_property
    def arguments(self):
        """Get all the arguments for this node. This includes:
        * Positional only arguments
        * Positional arguments
        * Keyword arguments
        * Variable arguments (e.g. *args)
        * Variable keyword arguments (e.g. **kwargs)
        """
        retval = list(itertools.chain((self.posonlyargs or ()), (self.args or ())))
        if self.vararg_node:
            retval.append(self.vararg_node)
        retval += self.kwonlyargs or ()
        if self.kwarg_node:
            retval.append(self.kwarg_node)

        return retval

    def format_args(self, *, skippable_names: set[str] | None = None) -> str:
        """Get the arguments formatted as string.

        :returns: The formatted arguments.
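        The single ``defaults`` list covers positional-only and
        positional-or-keyword parameters together; the split below aligns
        defaults with the *tail* of the combined parameter list. A
        stdlib-free sketch of that split, using hypothetical parameter
        names:

        ```python
        # For a signature like: def f(a=0, /, b=1, c=2): ...
        posonlyargs = ["a"]
        args = ["b", "c"]
        defaults = ["0", "1", "2"]  # one flat list for both groups

        # The last len(args) defaults belong to the positional-or-keyword
        # parameters; whatever precedes them belongs to positional-only ones.
        positional_or_keyword_defaults = defaults[-len(args):]
        positional_only_defaults = defaults[: len(defaults) - len(args)]

        assert positional_or_keyword_defaults == ["1", "2"]
        assert positional_only_defaults == ["0"]
        ```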
:rtype: str """ result = [] positional_only_defaults = [] positional_or_keyword_defaults = self.defaults if self.defaults: args = self.args or [] positional_or_keyword_defaults = self.defaults[-len(args) :] positional_only_defaults = self.defaults[: len(self.defaults) - len(args)] if self.posonlyargs: result.append( _format_args( self.posonlyargs, positional_only_defaults, self.posonlyargs_annotations, skippable_names=skippable_names, ) ) result.append("/") if self.args: result.append( _format_args( self.args, positional_or_keyword_defaults, getattr(self, "annotations", None), skippable_names=skippable_names, ) ) if self.vararg: result.append(f"*{self.vararg}") if self.kwonlyargs: if not self.vararg: result.append("*") result.append( _format_args( self.kwonlyargs, self.kw_defaults, self.kwonlyargs_annotations, skippable_names=skippable_names, ) ) if self.kwarg: result.append(f"**{self.kwarg}") return ", ".join(result) def _get_arguments_data( self, ) -> tuple[ dict[str, tuple[str | None, str | None]], dict[str, tuple[str | None, str | None]], ]: """Get the arguments as dictionary with information about typing and defaults. The return tuple contains a dictionary for positional and keyword arguments with their typing and their default value, if any. The method follows a similar order as format_args but instead of formatting into a string it returns the data that is used to do so. 
""" pos_only: dict[str, tuple[str | None, str | None]] = {} kw_only: dict[str, tuple[str | None, str | None]] = {} # Setup and match defaults with arguments positional_only_defaults = [] positional_or_keyword_defaults = self.defaults if self.defaults: args = self.args or [] positional_or_keyword_defaults = self.defaults[-len(args) :] positional_only_defaults = self.defaults[: len(self.defaults) - len(args)] for index, posonly in enumerate(self.posonlyargs): annotation, default = self.posonlyargs_annotations[index], None if annotation is not None: annotation = annotation.as_string() if positional_only_defaults: default = positional_only_defaults[index].as_string() pos_only[posonly.name] = (annotation, default) for index, arg in enumerate(self.args): annotation, default = self.annotations[index], None if annotation is not None: annotation = annotation.as_string() if positional_or_keyword_defaults: defaults_offset = len(self.args) - len(positional_or_keyword_defaults) default_index = index - defaults_offset if ( default_index > -1 and positional_or_keyword_defaults[default_index] is not None ): default = positional_or_keyword_defaults[default_index].as_string() pos_only[arg.name] = (annotation, default) if self.vararg: annotation = self.varargannotation if annotation is not None: annotation = annotation.as_string() pos_only[self.vararg] = (annotation, None) for index, kwarg in enumerate(self.kwonlyargs): annotation = self.kwonlyargs_annotations[index] if annotation is not None: annotation = annotation.as_string() default = self.kw_defaults[index] if default is not None: default = default.as_string() kw_only[kwarg.name] = (annotation, default) if self.kwarg: annotation = self.kwargannotation if annotation is not None: annotation = annotation.as_string() kw_only[self.kwarg] = (annotation, None) return pos_only, kw_only def default_value(self, argname): """Get the default value for an argument. :param argname: The name of the argument to get the default value for. 
:type argname: str :raises NoDefault: If there is no default value defined for the given argument. """ args = [ arg for arg in self.arguments if arg.name not in [self.vararg, self.kwarg] ] index = _find_arg(argname, self.kwonlyargs)[0] if (index is not None) and (len(self.kw_defaults) > index): if self.kw_defaults[index] is not None: return self.kw_defaults[index] raise NoDefault(func=self.parent, name=argname) index = _find_arg(argname, args)[0] if index is not None: idx = index - (len(args) - len(self.defaults) - len(self.kw_defaults)) if idx >= 0: return self.defaults[idx] raise NoDefault(func=self.parent, name=argname) def is_argument(self, name) -> bool: """Check if the given name is defined in the arguments. :param name: The name to check for. :type name: str :returns: Whether the given name is defined in the arguments, """ if name == self.vararg: return True if name == self.kwarg: return True return self.find_argname(name)[1] is not None def find_argname(self, argname, rec=DEPRECATED_ARGUMENT_DEFAULT): """Get the index and :class:`AssignName` node for given name. :param argname: The name of the argument to search for. :type argname: str :returns: The index and node for the argument. 
:rtype: tuple(str or None, AssignName or None) """ if rec != DEPRECATED_ARGUMENT_DEFAULT: # pragma: no cover warnings.warn( "The rec argument will be removed in astroid 3.1.", DeprecationWarning, stacklevel=2, ) if self.arguments: index, argument = _find_arg(argname, self.arguments) if argument: return index, argument return None, None def get_children(self): yield from self.posonlyargs or () for elt in self.posonlyargs_annotations: if elt is not None: yield elt yield from self.args or () if self.defaults is not None: yield from self.defaults yield from self.kwonlyargs for elt in self.kw_defaults or (): if elt is not None: yield elt for elt in self.annotations: if elt is not None: yield elt if self.varargannotation is not None: yield self.varargannotation if self.kwargannotation is not None: yield self.kwargannotation for elt in self.kwonlyargs_annotations: if elt is not None: yield elt @decorators.raise_if_nothing_inferred def _infer( self: nodes.Arguments, context: InferenceContext | None = None, **kwargs: Any ) -> Generator[InferenceResult, None, None]: # pylint: disable-next=import-outside-toplevel from astroid.protocols import _arguments_infer_argname if context is None or context.lookupname is None: raise InferenceError(node=self, context=context) return _arguments_infer_argname(self, context.lookupname, context) def _find_arg(argname, args): for i, arg in enumerate(args): if arg.name == argname: return i, arg return None, None def _format_args( args, defaults=None, annotations=None, skippable_names: set[str] | None = None ) -> str: if skippable_names is None: skippable_names = set() values = [] if args is None: return "" if annotations is None: annotations = [] if defaults is not None: default_offset = len(args) - len(defaults) packed = itertools.zip_longest(args, annotations) for i, (arg, annotation) in enumerate(packed): if arg.name in skippable_names: continue if isinstance(arg, Tuple): values.append(f"({_format_args(arg.elts)})") else: argname = arg.name 
            default_sep = "="
            if annotation is not None:
                argname += ": " + annotation.as_string()
                default_sep = " = "
            values.append(argname)

            if defaults is not None and i >= default_offset:
                if defaults[i - default_offset] is not None:
                    values[-1] += default_sep + defaults[i - default_offset].as_string()
    return ", ".join(values)


def _infer_attribute(
    node: nodes.AssignAttr | nodes.Attribute,
    context: InferenceContext | None = None,
    **kwargs: Any,
) -> Generator[InferenceResult, None, InferenceErrorInfo]:
    """Infer an AssignAttr/Attribute node by using getattr on the associated object."""
    # pylint: disable=import-outside-toplevel
    from astroid.constraint import get_constraints
    from astroid.nodes import ClassDef

    for owner in node.expr.infer(context):
        if isinstance(owner, util.UninferableBase):
            yield owner
            continue

        context = copy_context(context)
        old_boundnode = context.boundnode
        try:
            context.boundnode = owner
            if isinstance(owner, (ClassDef, Instance)):
                frame = owner if isinstance(owner, ClassDef) else owner._proxied
                context.constraints[node.attrname] = get_constraints(node, frame=frame)
            if node.attrname == "argv" and owner.name == "sys":
                # sys.argv will never be inferable during static analysis
                # Its value would be the args passed to the linter itself
                yield util.Uninferable
            else:
                yield from owner.igetattr(node.attrname, context)
        except (
            AttributeInferenceError,
            InferenceError,
            AttributeError,
        ):
            pass
        finally:
            context.boundnode = old_boundnode

    return InferenceErrorInfo(node=node, context=context)


class AssignAttr(_base_nodes.LookupMixIn, _base_nodes.ParentAssignNode):
    """Variation of :class:`ast.Assign` representing assignment to an attribute.
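    In the builtin ``ast`` module the same construct is an ``ast.Attribute``
    target in a ``Store`` context; astroid gives this case its own node
    class. A stdlib-only sketch:

    ```python
    import ast

    # The assignment target is an ast.Attribute with ctx=Store();
    # astroid models it with the dedicated AssignAttr node instead.
    node = ast.parse("self.attribute = range(10)").body[0]
    target = node.targets[0]
    assert isinstance(node, ast.Assign)
    assert isinstance(target, ast.Attribute)
    assert isinstance(target.ctx, ast.Store)
    assert target.attr == "attribute"
    ```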
    >>> import astroid
    >>> node = astroid.extract_node('self.attribute = range(10)')
    >>> node
    <Assign l.1 at 0x...>
    >>> list(node.get_children())
    [<AssignAttr.attribute l.1 at 0x...>, <Call l.1 at 0x...>]
    >>> list(node.get_children())[0].as_string()
    'self.attribute'
    """

    expr: NodeNG

    _astroid_fields = ("expr",)
    _other_fields = ("attrname",)

    def __init__(
        self,
        attrname: str,
        lineno: int,
        col_offset: int,
        parent: NodeNG,
        *,
        end_lineno: int | None,
        end_col_offset: int | None,
    ) -> None:
        self.attrname = attrname
        """The name of the attribute being assigned to."""

        super().__init__(
            lineno=lineno,
            col_offset=col_offset,
            end_lineno=end_lineno,
            end_col_offset=end_col_offset,
            parent=parent,
        )

    def postinit(self, expr: NodeNG) -> None:
        self.expr = expr

    assigned_stmts = protocols.assend_assigned_stmts
    """Returns the assigned statement (non inferred) according to the assignment type.

    See astroid/protocols.py for actual implementation.
    """

    def get_children(self):
        yield self.expr

    @decorators.raise_if_nothing_inferred
    @decorators.path_wrapper
    def _infer(
        self, context: InferenceContext | None = None, **kwargs: Any
    ) -> Generator[InferenceResult, None, InferenceErrorInfo | None]:
        """Infer an AssignAttr: need to inspect the RHS part of the
        assign node.
        """
        if isinstance(self.parent, AugAssign):
            return self.parent.infer(context)

        stmts = list(self.assigned_stmts(context=context))
        return _infer_stmts(stmts, context)

    @decorators.raise_if_nothing_inferred
    @decorators.path_wrapper
    def infer_lhs(
        self, context: InferenceContext | None = None, **kwargs: Any
    ) -> Generator[InferenceResult, None, InferenceErrorInfo | None]:
        return _infer_attribute(self, context, **kwargs)


class Assert(_base_nodes.Statement):
    """Class representing an :class:`ast.Assert` node.

    An :class:`Assert` node represents an assert statement.
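    The builtin ``ast.Assert`` stores the condition in ``test`` and the
    message in ``msg``; astroid exposes the message under the name ``fail``
    instead. A stdlib-only sketch:

    ```python
    import ast

    # ast.Assert holds the condition in 'test' and the message in 'msg';
    # astroid's Assert node names the message attribute 'fail'.
    node = ast.parse('assert len(things) == 10, "Not enough things"').body[0]
    assert isinstance(node, ast.Assert)
    assert isinstance(node.test, ast.Compare)
    assert isinstance(node.msg, ast.Constant)
    assert node.msg.value == "Not enough things"
    ```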
    >>> import astroid
    >>> node = astroid.extract_node('assert len(things) == 10, "Not enough things"')
    >>> node
    <Assert l.1 at 0x...>
    """

    _astroid_fields = ("test", "fail")

    test: NodeNG
    """The test that passes or fails the assertion."""

    fail: NodeNG | None
    """The message shown when the assertion fails."""

    def postinit(self, test: NodeNG, fail: NodeNG | None) -> None:
        self.fail = fail
        self.test = test

    def get_children(self):
        yield self.test

        if self.fail is not None:
            yield self.fail


class Assign(_base_nodes.AssignTypeNode, _base_nodes.Statement):
    """Class representing an :class:`ast.Assign` node.

    An :class:`Assign` is a statement where something is explicitly assigned to.

    >>> import astroid
    >>> node = astroid.extract_node('variable = range(10)')
    >>> node
    <Assign l.1 at 0x...>
    """

    targets: list[NodeNG]
    """What is being assigned to."""

    value: NodeNG
    """The value being assigned to the variables."""

    type_annotation: NodeNG | None
    """If present, this will contain the type annotation passed by a type comment"""

    _astroid_fields = ("targets", "value")
    _other_other_fields = ("type_annotation",)

    def postinit(
        self,
        targets: list[NodeNG],
        value: NodeNG,
        type_annotation: NodeNG | None,
    ) -> None:
        self.targets = targets
        self.value = value
        self.type_annotation = type_annotation

    assigned_stmts = protocols.assign_assigned_stmts
    """Returns the assigned statement (non inferred) according to the assignment type.

    See astroid/protocols.py for actual implementation.
    """

    def get_children(self):
        yield from self.targets

        yield self.value

    @cached_property
    def _assign_nodes_in_scope(self) -> list[nodes.Assign]:
        return [self, *self.value._assign_nodes_in_scope]

    def _get_yield_nodes_skip_functions(self):
        yield from self.value._get_yield_nodes_skip_functions()

    def _get_yield_nodes_skip_lambdas(self):
        yield from self.value._get_yield_nodes_skip_lambdas()


class AnnAssign(_base_nodes.AssignTypeNode, _base_nodes.Statement):
    """Class representing an :class:`ast.AnnAssign` node.

    An :class:`AnnAssign` is an assignment with a type annotation.
>>> import astroid >>> node = astroid.extract_node('variable: List[int] = range(10)') >>> node """ _astroid_fields = ("target", "annotation", "value") _other_fields = ("simple",) target: Name | Attribute | Subscript """What is being assigned to.""" annotation: NodeNG """The type annotation of what is being assigned to.""" value: NodeNG | None """The value being assigned to the variables.""" simple: int """Whether :attr:`target` is a pure name or a complex statement.""" def postinit( self, target: Name | Attribute | Subscript, annotation: NodeNG, simple: int, value: NodeNG | None, ) -> None: self.target = target self.annotation = annotation self.value = value self.simple = simple assigned_stmts = protocols.assign_annassigned_stmts """Returns the assigned statement (non inferred) according to the assignment type. See astroid/protocols.py for actual implementation. """ def get_children(self): yield self.target yield self.annotation if self.value is not None: yield self.value class AugAssign( _base_nodes.AssignTypeNode, _base_nodes.OperatorNode, _base_nodes.Statement ): """Class representing an :class:`ast.AugAssign` node. An :class:`AugAssign` is an assignment paired with an operator. >>> import astroid >>> node = astroid.extract_node('variable += 1') >>> node """ _astroid_fields = ("target", "value") _other_fields = ("op",) target: Name | Attribute | Subscript """What is being assigned to.""" value: NodeNG """The value being assigned to the variable.""" def __init__( self, op: str, lineno: int, col_offset: int, parent: NodeNG, *, end_lineno: int | None, end_col_offset: int | None, ) -> None: self.op = op """The operator that is being combined with the assignment. This includes the equals sign. 
""" super().__init__( lineno=lineno, col_offset=col_offset, end_lineno=end_lineno, end_col_offset=end_col_offset, parent=parent, ) def postinit(self, target: Name | Attribute | Subscript, value: NodeNG) -> None: self.target = target self.value = value assigned_stmts = protocols.assign_assigned_stmts """Returns the assigned statement (non inferred) according to the assignment type. See astroid/protocols.py for actual implementation. """ def type_errors(self, context: InferenceContext | None = None): """Get a list of type errors which can occur during inference. Each TypeError is represented by a :class:`BadBinaryOperationMessage` , which holds the original exception. :returns: The list of possible type errors. :rtype: list(BadBinaryOperationMessage) """ try: results = self._infer_augassign(context=context) return [ result for result in results if isinstance(result, util.BadBinaryOperationMessage) ] except InferenceError: return [] def get_children(self): yield self.target yield self.value def _get_yield_nodes_skip_functions(self): """An AugAssign node can contain a Yield node in the value""" yield from self.value._get_yield_nodes_skip_functions() yield from super()._get_yield_nodes_skip_functions() def _get_yield_nodes_skip_lambdas(self): """An AugAssign node can contain a Yield node in the value""" yield from self.value._get_yield_nodes_skip_lambdas() yield from super()._get_yield_nodes_skip_lambdas() def _infer_augassign( self, context: InferenceContext | None = None ) -> Generator[InferenceResult | util.BadBinaryOperationMessage, None, None]: """Inference logic for augmented binary operations.""" context = context or InferenceContext() rhs_context = context.clone() lhs_iter = self.target.infer_lhs(context=context) rhs_iter = self.value.infer(context=rhs_context) for lhs, rhs in itertools.product(lhs_iter, rhs_iter): if any(isinstance(value, util.UninferableBase) for value in (rhs, lhs)): # Don't know how to process this. 
yield util.Uninferable return try: yield from self._infer_binary_operation( left=lhs, right=rhs, binary_opnode=self, context=context, flow_factory=self._get_aug_flow, ) except _NonDeducibleTypeHierarchy: yield util.Uninferable @decorators.raise_if_nothing_inferred @decorators.path_wrapper def _infer( self: nodes.AugAssign, context: InferenceContext | None = None, **kwargs: Any ) -> Generator[InferenceResult, None, None]: return self._filter_operation_errors( self._infer_augassign, context, util.BadBinaryOperationMessage ) class BinOp(_base_nodes.OperatorNode): """Class representing an :class:`ast.BinOp` node. A :class:`BinOp` node is an application of a binary operator. >>> import astroid >>> node = astroid.extract_node('a + b') >>> node """ _astroid_fields = ("left", "right") _other_fields = ("op",) left: NodeNG """What is being applied to the operator on the left side.""" right: NodeNG """What is being applied to the operator on the right side.""" def __init__( self, op: str, lineno: int, col_offset: int, parent: NodeNG, *, end_lineno: int | None, end_col_offset: int | None, ) -> None: self.op = op """The operator.""" super().__init__( lineno=lineno, col_offset=col_offset, end_lineno=end_lineno, end_col_offset=end_col_offset, parent=parent, ) def postinit(self, left: NodeNG, right: NodeNG) -> None: self.left = left self.right = right def type_errors(self, context: InferenceContext | None = None): """Get a list of type errors which can occur during inference. Each TypeError is represented by a :class:`BadBinaryOperationMessage`, which holds the original exception. :returns: The list of possible type errors. 
:rtype: list(BadBinaryOperationMessage) """ try: results = self._infer_binop(context=context) return [ result for result in results if isinstance(result, util.BadBinaryOperationMessage) ] except InferenceError: return [] def get_children(self): yield self.left yield self.right def op_precedence(self): return OP_PRECEDENCE[self.op] def op_left_associative(self) -> bool: # 2**3**4 == 2**(3**4) return self.op != "**" def _infer_binop( self, context: InferenceContext | None = None, **kwargs: Any ) -> Generator[InferenceResult, None, None]: """Binary operation inference logic.""" left = self.left right = self.right # we use two separate contexts for evaluating lhs and rhs because # 1. evaluating lhs may leave some undesired entries in context.path # which may not let us infer right value of rhs context = context or InferenceContext() lhs_context = copy_context(context) rhs_context = copy_context(context) lhs_iter = left.infer(context=lhs_context) rhs_iter = right.infer(context=rhs_context) for lhs, rhs in itertools.product(lhs_iter, rhs_iter): if any(isinstance(value, util.UninferableBase) for value in (rhs, lhs)): # Don't know how to process this. yield util.Uninferable return try: yield from self._infer_binary_operation( lhs, rhs, self, context, self._get_binop_flow ) except _NonDeducibleTypeHierarchy: yield util.Uninferable @decorators.yes_if_nothing_inferred @decorators.path_wrapper def _infer( self: nodes.BinOp, context: InferenceContext | None = None, **kwargs: Any ) -> Generator[InferenceResult, None, None]: return self._filter_operation_errors( self._infer_binop, context, util.BadBinaryOperationMessage ) class BoolOp(NodeNG): """Class representing an :class:`ast.BoolOp` node. A :class:`BoolOp` is an application of a boolean operator. 
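    ``and``/``or`` evaluate to one of their operands, not to a bool, and
    astroid's ``BoolOp`` inference yields values following these same
    semantics. A plain-Python illustration:

    ```python
    # 'and' returns its first falsy operand, else the last operand;
    # 'or' returns its first truthy operand, else the last operand.
    assert (1 and 0) == 0
    assert (0 and 1) == 0
    assert (1 or 0) == 1
    assert (0 or 1) == 1
    assert ("" or "fallback") == "fallback"
    ```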
>>> import astroid >>> node = astroid.extract_node('a and b') >>> node """ _astroid_fields = ("values",) _other_fields = ("op",) def __init__( self, op: str, lineno: int | None = None, col_offset: int | None = None, parent: NodeNG | None = None, *, end_lineno: int | None = None, end_col_offset: int | None = None, ) -> None: """ :param op: The operator. :param lineno: The line that this node appears on in the source code. :param col_offset: The column that this node appears on in the source code. :param parent: The parent node in the syntax tree. :param end_lineno: The last line this node appears on in the source code. :param end_col_offset: The end column this node appears on in the source code. Note: This is after the last symbol. """ self.op: str = op """The operator.""" self.values: list[NodeNG] = [] """The values being applied to the operator.""" super().__init__( lineno=lineno, col_offset=col_offset, end_lineno=end_lineno, end_col_offset=end_col_offset, parent=parent, ) def postinit(self, values: list[NodeNG] | None = None) -> None: """Do some setup after initialisation. :param values: The values being applied to the operator. """ if values is not None: self.values = values def get_children(self): yield from self.values def op_precedence(self): return OP_PRECEDENCE[self.op] @decorators.raise_if_nothing_inferred @decorators.path_wrapper def _infer( self: nodes.BoolOp, context: InferenceContext | None = None, **kwargs: Any ) -> Generator[InferenceResult, None, InferenceErrorInfo | None]: """Infer a boolean operation (and / or / not). The function will calculate the boolean operation for all pairs generated through inference for each component node. 
""" values = self.values if self.op == "or": predicate = operator.truth else: predicate = operator.not_ try: inferred_values = [value.infer(context=context) for value in values] except InferenceError: yield util.Uninferable return None for pair in itertools.product(*inferred_values): if any(isinstance(item, util.UninferableBase) for item in pair): # Can't infer the final result, just yield Uninferable. yield util.Uninferable continue bool_values = [item.bool_value() for item in pair] if any(isinstance(item, util.UninferableBase) for item in bool_values): # Can't infer the final result, just yield Uninferable. yield util.Uninferable continue # Since the boolean operations are short circuited operations, # this code yields the first value for which the predicate is True # and if no value respected the predicate, then the last value will # be returned (or Uninferable if there was no last value). # This is conforming to the semantics of `and` and `or`: # 1 and 0 -> 1 # 0 and 1 -> 0 # 1 or 0 -> 1 # 0 or 1 -> 1 value = util.Uninferable for value, bool_value in zip(pair, bool_values): if predicate(bool_value): yield value break else: yield value return InferenceErrorInfo(node=self, context=context) class Break(_base_nodes.NoChildrenNode, _base_nodes.Statement): """Class representing an :class:`ast.Break` node. >>> import astroid >>> node = astroid.extract_node('break') >>> node """ class Call(NodeNG): """Class representing an :class:`ast.Call` node. A :class:`Call` node is a call to a function, method, etc. 
>>> import astroid >>> node = astroid.extract_node('function()') >>> node """ _astroid_fields = ("func", "args", "keywords") func: NodeNG """What is being called.""" args: list[NodeNG] """The positional arguments being given to the call.""" keywords: list[Keyword] """The keyword arguments being given to the call.""" def postinit( self, func: NodeNG, args: list[NodeNG], keywords: list[Keyword] ) -> None: self.func = func self.args = args self.keywords = keywords @property def starargs(self) -> list[Starred]: """The positional arguments that unpack something.""" return [arg for arg in self.args if isinstance(arg, Starred)] @property def kwargs(self) -> list[Keyword]: """The keyword arguments that unpack something.""" return [keyword for keyword in self.keywords if keyword.arg is None] def get_children(self): yield self.func yield from self.args yield from self.keywords @decorators.raise_if_nothing_inferred @decorators.path_wrapper def _infer( self, context: InferenceContext | None = None, **kwargs: Any ) -> Generator[InferenceResult, None, InferenceErrorInfo]: """Infer a Call node by trying to guess what the function returns.""" callcontext = copy_context(context) callcontext.boundnode = None if context is not None: callcontext.extra_context = self._populate_context_lookup(context.clone()) for callee in self.func.infer(context): if isinstance(callee, util.UninferableBase): yield callee continue try: if hasattr(callee, "infer_call_result"): callcontext.callcontext = CallContext( args=self.args, keywords=self.keywords, callee=callee ) yield from callee.infer_call_result( caller=self, context=callcontext ) except InferenceError: continue return InferenceErrorInfo(node=self, context=context) def _populate_context_lookup(self, context: InferenceContext | None): """Allows context to be saved for later for inference inside a function.""" context_lookup: dict[InferenceResult, InferenceContext] = {} if context is None: return context_lookup for arg in self.args: if 
isinstance(arg, Starred):
                context_lookup[arg.value] = context
            else:
                context_lookup[arg] = context
        keywords = self.keywords if self.keywords is not None else []
        for keyword in keywords:
            context_lookup[keyword.value] = context
        return context_lookup


COMPARE_OPS: dict[str, Callable[[Any, Any], bool]] = {
    "==": operator.eq,
    "!=": operator.ne,
    "<": operator.lt,
    "<=": operator.le,
    ">": operator.gt,
    ">=": operator.ge,
    "in": lambda a, b: a in b,
    "not in": lambda a, b: a not in b,
}
UNINFERABLE_OPS = {
    "is",
    "is not",
}


class Compare(NodeNG):
    """Class representing an :class:`ast.Compare` node.

    A :class:`Compare` node indicates a comparison.

    >>> import astroid
    >>> node = astroid.extract_node('a <= b <= c')
    >>> node
    <Compare l.1 at 0x...>
    >>> node.ops
    [('<=', <Name.b l.1 at 0x...>), ('<=', <Name.c l.1 at 0x...>)]
    """

    _astroid_fields = ("left", "ops")

    left: NodeNG
    """The value at the left being applied to a comparison operator."""

    ops: list[tuple[str, NodeNG]]
    """The remainder of the operators and their relevant right hand value."""

    def postinit(self, left: NodeNG, ops: list[tuple[str, NodeNG]]) -> None:
        self.left = left
        self.ops = ops

    def get_children(self):
        """Get the child nodes below this node.

        Overridden to handle the tuple fields and skip returning the
        operator strings.

        :returns: The children.
        :rtype: iterable(NodeNG)
        """
        yield self.left
        for _, comparator in self.ops:
            yield comparator  # we don't want the 'op'

    def last_child(self):
        """An optimized version of list(get_children())[-1]

        :returns: The last child.
        :rtype: NodeNG
        """
        # XXX maybe if self.ops:
        return self.ops[-1][1]
        # return self.left

    # TODO: move to util?
    @staticmethod
    def _to_literal(node: SuccessfulInferenceResult) -> Any:
        # Can raise SyntaxError or ValueError from ast.literal_eval
        # Can raise AttributeError from node.as_string() as not all nodes have a visitor
        # Is this the stupidest idea or the simplest idea?
        return ast.literal_eval(node.as_string())

    def _do_compare(
        self,
        left_iter: Iterable[InferenceResult],
        op: str,
        right_iter: Iterable[InferenceResult],
    ) -> bool | util.UninferableBase:
        """
        If all possible combinations are either True or False, return that:
        >>> _do_compare([1, 2], '<=', [3, 4])
        True
        >>> _do_compare([1, 2], '==', [3, 4])
        False

        If any item is uninferable, or if some combinations are True and some
        are False, return Uninferable:
        >>> _do_compare([1, 3], '<=', [2, 4])
        util.Uninferable
        """
        retval: bool | None = None
        if op in UNINFERABLE_OPS:
            return util.Uninferable
        op_func = COMPARE_OPS[op]

        for left, right in itertools.product(left_iter, right_iter):
            if isinstance(left, util.UninferableBase) or isinstance(
                right, util.UninferableBase
            ):
                return util.Uninferable

            try:
                left, right = self._to_literal(left), self._to_literal(right)
            except (SyntaxError, ValueError, AttributeError):
                return util.Uninferable

            try:
                expr = op_func(left, right)
            except TypeError as exc:
                raise AstroidTypeError from exc

            if retval is None:
                retval = expr
            elif retval != expr:
                return util.Uninferable
                # (or both, but "True | False" is basically the same)

        assert retval is not None
        return retval  # it was all the same value

    def _infer(
        self, context: InferenceContext | None = None, **kwargs: Any
    ) -> Generator[nodes.Const | util.UninferableBase, None, None]:
        """Chained comparison inference logic."""
        retval: bool | util.UninferableBase = True

        ops = self.ops
        left_node = self.left
        lhs = list(left_node.infer(context=context))
        # should we break early if first element is uninferable?
        for op, right_node in ops:
            # eagerly evaluate rhs so that values can be re-used as lhs
            rhs = list(right_node.infer(context=context))
            try:
                retval = self._do_compare(lhs, op, rhs)
            except AstroidTypeError:
                retval = util.Uninferable
                break
            if retval is not True:
                break  # short-circuit
            lhs = rhs  # continue
        if retval is util.Uninferable:
            yield retval  # type: ignore[misc]
        else:
            yield Const(retval)


class Comprehension(NodeNG):
    """Class representing an :class:`ast.comprehension` node.

    A :class:`Comprehension` indicates the loop inside any type of
    comprehension including generator expressions.

    >>> import astroid
    >>> node = astroid.extract_node('[x for x in some_values]')
    >>> list(node.get_children())
    [, ]
    >>> list(node.get_children())[1].as_string()
    'for x in some_values'
    """

    _astroid_fields = ("target", "iter", "ifs")
    _other_fields = ("is_async",)

    optional_assign = True
    """Whether this node optionally assigns a variable."""

    target: NodeNG
    """What is assigned to by the comprehension."""

    iter: NodeNG
    """What is iterated over by the comprehension."""

    ifs: list[NodeNG]
    """The contents of any if statements that filter the comprehension."""

    is_async: bool
    """Whether this is an asynchronous comprehension or not."""

    def postinit(
        self,
        target: NodeNG,
        iter: NodeNG,  # pylint: disable = redefined-builtin
        ifs: list[NodeNG],
        is_async: bool,
    ) -> None:
        self.target = target
        self.iter = iter
        self.ifs = ifs
        self.is_async = is_async

    assigned_stmts = protocols.for_assigned_stmts
    """Returns the assigned statement (non inferred) according to the assignment type.

    See astroid/protocols.py for actual implementation.
    """

    def assign_type(self):
        """The type of assignment that this node performs.

        :returns: The assignment type.
        :rtype: NodeNG
        """
        return self

    def _get_filtered_stmts(
        self, lookup_node, node, stmts, mystmt: _base_nodes.Statement | None
    ):
        """method used in filter_stmts"""
        if self is mystmt:
            if isinstance(lookup_node, (Const, Name)):
                return [lookup_node], True
        elif self.statement() is mystmt:
            # original node's statement is the assignment, only keeps
            # current node (gen exp, list comp)
            return [node], True
        return stmts, False

    def get_children(self):
        yield self.target
        yield self.iter
        yield from self.ifs


class Const(_base_nodes.NoChildrenNode, Instance):
    """Class representing any constant including num, str, bool, None, bytes.

    >>> import astroid
    >>> node = astroid.extract_node('(5, "This is a string.", True, None, b"bytes")')
    >>> node
    >>> list(node.get_children())
    [, , , , ]
    """

    _other_fields = ("value", "kind")

    def __init__(
        self,
        value: Any,
        lineno: int | None = None,
        col_offset: int | None = None,
        parent: NodeNG | None = None,
        kind: str | None = None,
        *,
        end_lineno: int | None = None,
        end_col_offset: int | None = None,
    ) -> None:
        """
        :param value: The value that the constant represents.

        :param lineno: The line that this node appears on in the source code.

        :param col_offset: The column that this node appears on in the
            source code.

        :param parent: The parent node in the syntax tree.

        :param kind: The string prefix. "u" for u-prefixed strings and ``None``
            otherwise. Python 3.8+ only.

        :param end_lineno: The last line this node appears on in the source code.

        :param end_col_offset: The end column this node appears on in the
            source code. Note: This is after the last symbol.
        """
        self.value: Any = value
        """The value that the constant represents."""

        self.kind: str | None = kind  # can be None
        """The string prefix. "u" for u-prefixed strings and ``None`` otherwise.
        Python 3.8+ only."""

        super().__init__(
            lineno=lineno,
            col_offset=col_offset,
            end_lineno=end_lineno,
            end_col_offset=end_col_offset,
            parent=parent,
        )

        Instance.__init__(self, None)

    infer_unary_op = protocols.const_infer_unary_op
    infer_binary_op = protocols.const_infer_binary_op

    def __getattr__(self, name):
        # This is needed because of Proxy's __getattr__ method.
        # Calling object.__new__ on this class without calling
        # __init__ would result in an infinite loop otherwise
        # since __getattr__ is called when an attribute doesn't
        # exist and self._proxied indirectly calls self.value
        # and Proxy __getattr__ calls self.value
        if name == "value":
            raise AttributeError
        return super().__getattr__(name)

    def getitem(self, index, context: InferenceContext | None = None):
        """Get an item from this node if subscriptable.

        :param index: The node to use as a subscript index.
        :type index: Const or Slice

        :raises AstroidTypeError: When the given index cannot be used as a
            subscript index, or if this node is not subscriptable.
        """
        if isinstance(index, Const):
            index_value = index.value
        elif isinstance(index, Slice):
            index_value = _infer_slice(index, context=context)
        else:
            raise AstroidTypeError(
                f"Could not use type {type(index)} as subscript index"
            )

        try:
            if isinstance(self.value, (str, bytes)):
                return Const(self.value[index_value])
        except ValueError as exc:
            raise AstroidValueError(
                f"Could not index {self.value!r} with {index_value!r}"
            ) from exc
        except IndexError as exc:
            raise AstroidIndexError(
                message="Index {index!r} out of range",
                node=self,
                index=index,
                context=context,
            ) from exc
        except TypeError as exc:
            raise AstroidTypeError(
                message="Type error {error!r}", node=self, index=index, context=context
            ) from exc

        raise AstroidTypeError(f"{self!r} (value={self.value})")

    def has_dynamic_getattr(self) -> bool:
        """Check if the node has a custom __getattr__ or __getattribute__.

        :returns: Whether the class has a custom __getattr__ or __getattribute__.
            For a :class:`Const` this is always ``False``.
""" return False def itered(self): """An iterator over the elements this node contains. :returns: The contents of this node. :rtype: iterable(Const) :raises TypeError: If this node does not represent something that is iterable. """ if isinstance(self.value, str): return [const_factory(elem) for elem in self.value] raise TypeError(f"Cannot iterate over type {type(self.value)!r}") def pytype(self) -> str: """Get the name of the type that this node represents. :returns: The name of the type. """ return self._proxied.qname() def bool_value(self, context: InferenceContext | None = None): """Determine the boolean value of this node. :returns: The boolean value of this node. :rtype: bool """ return bool(self.value) def _infer( self, context: InferenceContext | None = None, **kwargs: Any ) -> Iterator[Const]: yield self class Continue(_base_nodes.NoChildrenNode, _base_nodes.Statement): """Class representing an :class:`ast.Continue` node. >>> import astroid >>> node = astroid.extract_node('continue') >>> node """ class Decorators(NodeNG): """A node representing a list of decorators. A :class:`Decorators` is the decorators that are applied to a method or function. >>> import astroid >>> node = astroid.extract_node(''' @property def my_property(self): return 3 ''') >>> node >>> list(node.get_children())[0] """ _astroid_fields = ("nodes",) nodes: list[NodeNG] """The decorators that this node contains.""" def postinit(self, nodes: list[NodeNG]) -> None: self.nodes = nodes def scope(self) -> LocalsDictNodeNG: """The first parent node defining a new scope. These can be Module, FunctionDef, ClassDef, Lambda, or GeneratorExp nodes. :returns: The first parent scope node. 
""" # skip the function node to go directly to the upper level scope if not self.parent: raise ParentMissingError(target=self) if not self.parent.parent: raise ParentMissingError(target=self.parent) return self.parent.parent.scope() def get_children(self): yield from self.nodes class DelAttr(_base_nodes.ParentAssignNode): """Variation of :class:`ast.Delete` representing deletion of an attribute. >>> import astroid >>> node = astroid.extract_node('del self.attr') >>> node >>> list(node.get_children())[0] """ _astroid_fields = ("expr",) _other_fields = ("attrname",) expr: NodeNG """The name that this node represents.""" def __init__( self, attrname: str, lineno: int, col_offset: int, parent: NodeNG, *, end_lineno: int | None, end_col_offset: int | None, ) -> None: self.attrname = attrname """The name of the attribute that is being deleted.""" super().__init__( lineno=lineno, col_offset=col_offset, end_lineno=end_lineno, end_col_offset=end_col_offset, parent=parent, ) def postinit(self, expr: NodeNG) -> None: self.expr = expr def get_children(self): yield self.expr class Delete(_base_nodes.AssignTypeNode, _base_nodes.Statement): """Class representing an :class:`ast.Delete` node. A :class:`Delete` is a ``del`` statement this is deleting something. >>> import astroid >>> node = astroid.extract_node('del self.attr') >>> node """ _astroid_fields = ("targets",) def __init__( self, lineno: int, col_offset: int, parent: NodeNG, *, end_lineno: int | None, end_col_offset: int | None, ) -> None: self.targets: list[NodeNG] = [] """What is being deleted.""" super().__init__( lineno=lineno, col_offset=col_offset, end_lineno=end_lineno, end_col_offset=end_col_offset, parent=parent, ) def postinit(self, targets: list[NodeNG]) -> None: self.targets = targets def get_children(self): yield from self.targets class Dict(NodeNG, Instance): """Class representing an :class:`ast.Dict` node. A :class:`Dict` is a dictionary that is created with ``{}`` syntax. 
>>> import astroid >>> node = astroid.extract_node('{1: "1"}') >>> node """ _astroid_fields = ("items",) def __init__( self, lineno: int | None, col_offset: int | None, parent: NodeNG | None, *, end_lineno: int | None, end_col_offset: int | None, ) -> None: self.items: list[tuple[InferenceResult, InferenceResult]] = [] """The key-value pairs contained in the dictionary.""" super().__init__( lineno=lineno, col_offset=col_offset, end_lineno=end_lineno, end_col_offset=end_col_offset, parent=parent, ) def postinit(self, items: list[tuple[InferenceResult, InferenceResult]]) -> None: """Do some setup after initialisation. :param items: The key-value pairs contained in the dictionary. """ self.items = items infer_unary_op = protocols.dict_infer_unary_op def pytype(self) -> Literal["builtins.dict"]: """Get the name of the type that this node represents. :returns: The name of the type. """ return "builtins.dict" def get_children(self): """Get the key and value nodes below this node. Children are returned in the order that they are defined in the source code, key first then the value. :returns: The children. :rtype: iterable(NodeNG) """ for key, value in self.items: yield key yield value def last_child(self): """An optimized version of list(get_children())[-1] :returns: The last child, or None if no children exist. :rtype: NodeNG or None """ if self.items: return self.items[-1][1] return None def itered(self): """An iterator over the keys this node contains. :returns: The keys of this node. :rtype: iterable(NodeNG) """ return [key for (key, _) in self.items] def getitem( self, index: Const | Slice, context: InferenceContext | None = None ) -> NodeNG: """Get an item from this node. :param index: The node to use as a subscript index. :raises AstroidTypeError: When the given index cannot be used as a subscript index, or if this node is not subscriptable. :raises AstroidIndexError: If the given index does not exist in the dictionary. 
""" for key, value in self.items: # TODO(cpopa): no support for overriding yet, {1:2, **{1: 3}}. if isinstance(key, DictUnpack): inferred_value = util.safe_infer(value, context) if not isinstance(inferred_value, Dict): continue try: return inferred_value.getitem(index, context) except (AstroidTypeError, AstroidIndexError): continue for inferredkey in key.infer(context): if isinstance(inferredkey, util.UninferableBase): continue if isinstance(inferredkey, Const) and isinstance(index, Const): if inferredkey.value == index.value: return value raise AstroidIndexError(index) def bool_value(self, context: InferenceContext | None = None): """Determine the boolean value of this node. :returns: The boolean value of this node. :rtype: bool """ return bool(self.items) def _infer( self, context: InferenceContext | None = None, **kwargs: Any ) -> Iterator[nodes.Dict]: if not any(isinstance(k, DictUnpack) for k, _ in self.items): yield self else: items = self._infer_map(context) new_seq = type(self)( lineno=self.lineno, col_offset=self.col_offset, parent=self.parent, end_lineno=self.end_lineno, end_col_offset=self.end_col_offset, ) new_seq.postinit(list(items.items())) yield new_seq @staticmethod def _update_with_replacement( lhs_dict: dict[SuccessfulInferenceResult, SuccessfulInferenceResult], rhs_dict: dict[SuccessfulInferenceResult, SuccessfulInferenceResult], ) -> dict[SuccessfulInferenceResult, SuccessfulInferenceResult]: """Delete nodes that equate to duplicate keys. 
Since an astroid node doesn't 'equal' another node with the same value, this function uses the as_string method to make sure duplicate keys don't get through Note that both the key and the value are astroid nodes Fixes issue with DictUnpack causing duplicate keys in inferred Dict items :param lhs_dict: Dictionary to 'merge' nodes into :param rhs_dict: Dictionary with nodes to pull from :return : merged dictionary of nodes """ combined_dict = itertools.chain(lhs_dict.items(), rhs_dict.items()) # Overwrite keys which have the same string values string_map = {key.as_string(): (key, value) for key, value in combined_dict} # Return to dictionary return dict(string_map.values()) def _infer_map( self, context: InferenceContext | None ) -> dict[SuccessfulInferenceResult, SuccessfulInferenceResult]: """Infer all values based on Dict.items.""" values: dict[SuccessfulInferenceResult, SuccessfulInferenceResult] = {} for name, value in self.items: if isinstance(name, DictUnpack): double_starred = util.safe_infer(value, context) if not double_starred: raise InferenceError if not isinstance(double_starred, Dict): raise InferenceError(node=self, context=context) unpack_items = double_starred._infer_map(context) values = self._update_with_replacement(values, unpack_items) else: key = util.safe_infer(name, context=context) safe_value = util.safe_infer(value, context=context) if any(not elem for elem in (key, safe_value)): raise InferenceError(node=self, context=context) # safe_value is SuccessfulInferenceResult as bool(Uninferable) == False values = self._update_with_replacement(values, {key: safe_value}) return values class Expr(_base_nodes.Statement): """Class representing an :class:`ast.Expr` node. An :class:`Expr` is any expression that does not have its value used or stored. 
>>> import astroid >>> node = astroid.extract_node('method()') >>> node >>> node.parent """ _astroid_fields = ("value",) value: NodeNG """What the expression does.""" def postinit(self, value: NodeNG) -> None: self.value = value def get_children(self): yield self.value def _get_yield_nodes_skip_functions(self): if not self.value.is_function: yield from self.value._get_yield_nodes_skip_functions() def _get_yield_nodes_skip_lambdas(self): if not self.value.is_lambda: yield from self.value._get_yield_nodes_skip_lambdas() class EmptyNode(_base_nodes.NoChildrenNode): """Holds an arbitrary object in the :attr:`LocalsDictNodeNG.locals`.""" object = None def __init__( self, lineno: None = None, col_offset: None = None, parent: None = None, *, end_lineno: None = None, end_col_offset: None = None, ) -> None: super().__init__( lineno=lineno, col_offset=col_offset, end_lineno=end_lineno, end_col_offset=end_col_offset, parent=parent, ) def has_underlying_object(self) -> bool: return self.object is not None and self.object is not _EMPTY_OBJECT_MARKER @decorators.raise_if_nothing_inferred @decorators.path_wrapper def _infer( self, context: InferenceContext | None = None, **kwargs: Any ) -> Generator[InferenceResult, None, None]: if not self.has_underlying_object(): yield util.Uninferable else: try: yield from AstroidManager().infer_ast_from_something( self.object, context=context ) except AstroidError: yield util.Uninferable class ExceptHandler( _base_nodes.MultiLineBlockNode, _base_nodes.AssignTypeNode, _base_nodes.Statement ): """Class representing an :class:`ast.ExceptHandler`. node. An :class:`ExceptHandler` is an ``except`` block on a try-except. 
>>> import astroid >>> node = astroid.extract_node(''' try: do_something() except Exception as error: print("Error!") ''') >>> node >>> node.handlers [] """ _astroid_fields = ("type", "name", "body") _multi_line_block_fields = ("body",) type: NodeNG | None """The types that the block handles.""" name: AssignName | None """The name that the caught exception is assigned to.""" body: list[NodeNG] """The contents of the block.""" assigned_stmts = protocols.excepthandler_assigned_stmts """Returns the assigned statement (non inferred) according to the assignment type. See astroid/protocols.py for actual implementation. """ def postinit( self, type: NodeNG | None, # pylint: disable = redefined-builtin name: AssignName | None, body: list[NodeNG], ) -> None: self.type = type self.name = name self.body = body def get_children(self): if self.type is not None: yield self.type if self.name is not None: yield self.name yield from self.body @cached_property def blockstart_tolineno(self): """The line on which the beginning of this block ends. :type: int """ if self.name: return self.name.tolineno if self.type: return self.type.tolineno return self.lineno def catch(self, exceptions: list[str] | None) -> bool: """Check if this node handles any of the given :param exceptions: The names of the exceptions to check for. """ if self.type is None or exceptions is None: return True return any(node.name in exceptions for node in self.type._get_name_nodes()) class For( _base_nodes.MultiLineWithElseBlockNode, _base_nodes.AssignTypeNode, _base_nodes.Statement, ): """Class representing an :class:`ast.For` node. >>> import astroid >>> node = astroid.extract_node('for thing in things: print(thing)') >>> node """ _astroid_fields = ("target", "iter", "body", "orelse") _other_other_fields = ("type_annotation",) _multi_line_block_fields = ("body", "orelse") optional_assign = True """Whether this node optionally assigns a variable. This is always ``True`` for :class:`For` nodes. 
""" target: NodeNG """What the loop assigns to.""" iter: NodeNG """What the loop iterates over.""" body: list[NodeNG] """The contents of the body of the loop.""" orelse: list[NodeNG] """The contents of the ``else`` block of the loop.""" type_annotation: NodeNG | None """If present, this will contain the type annotation passed by a type comment""" def postinit( self, target: NodeNG, iter: NodeNG, # pylint: disable = redefined-builtin body: list[NodeNG], orelse: list[NodeNG], type_annotation: NodeNG | None, ) -> None: self.target = target self.iter = iter self.body = body self.orelse = orelse self.type_annotation = type_annotation assigned_stmts = protocols.for_assigned_stmts """Returns the assigned statement (non inferred) according to the assignment type. See astroid/protocols.py for actual implementation. """ @cached_property def blockstart_tolineno(self): """The line on which the beginning of this block ends. :type: int """ return self.iter.tolineno def get_children(self): yield self.target yield self.iter yield from self.body yield from self.orelse class AsyncFor(For): """Class representing an :class:`ast.AsyncFor` node. An :class:`AsyncFor` is an asynchronous :class:`For` built with the ``async`` keyword. >>> import astroid >>> node = astroid.extract_node(''' async def func(things): async for thing in things: print(thing) ''') >>> node >>> node.body[0] """ class Await(NodeNG): """Class representing an :class:`ast.Await` node. An :class:`Await` is the ``await`` keyword. >>> import astroid >>> node = astroid.extract_node(''' async def func(things): await other_func() ''') >>> node >>> node.body[0] >>> list(node.body[0].get_children())[0] """ _astroid_fields = ("value",) value: NodeNG """What to wait for.""" def postinit(self, value: NodeNG) -> None: self.value = value def get_children(self): yield self.value class ImportFrom(_base_nodes.ImportNode): """Class representing an :class:`ast.ImportFrom` node. 
>>> import astroid >>> node = astroid.extract_node('from my_package import my_module') >>> node """ _other_fields = ("modname", "names", "level") def __init__( self, fromname: str | None, names: list[tuple[str, str | None]], level: int | None = 0, lineno: int | None = None, col_offset: int | None = None, parent: NodeNG | None = None, *, end_lineno: int | None = None, end_col_offset: int | None = None, ) -> None: """ :param fromname: The module that is being imported from. :param names: What is being imported from the module. :param level: The level of relative import. :param lineno: The line that this node appears on in the source code. :param col_offset: The column that this node appears on in the source code. :param parent: The parent node in the syntax tree. :param end_lineno: The last line this node appears on in the source code. :param end_col_offset: The end column this node appears on in the source code. Note: This is after the last symbol. """ self.modname: str | None = fromname # can be None """The module that is being imported from. This is ``None`` for relative imports. """ self.names: list[tuple[str, str | None]] = names """What is being imported from the module. Each entry is a :class:`tuple` of the name being imported, and the alias that the name is assigned to (if any). """ # TODO When is 'level' None? self.level: int | None = level # can be None """The level of relative import. Essentially this is the number of dots in the import. This is always 0 for absolute imports. 
""" super().__init__( lineno=lineno, col_offset=col_offset, end_lineno=end_lineno, end_col_offset=end_col_offset, parent=parent, ) @decorators.raise_if_nothing_inferred @decorators.path_wrapper def _infer( self, context: InferenceContext | None = None, asname: bool = True, **kwargs: Any, ) -> Generator[InferenceResult, None, None]: """Infer a ImportFrom node: return the imported module/object.""" context = context or InferenceContext() name = context.lookupname if name is None: raise InferenceError(node=self, context=context) if asname: try: name = self.real_name(name) except AttributeInferenceError as exc: # See https://github.com/pylint-dev/pylint/issues/4692 raise InferenceError(node=self, context=context) from exc try: module = self.do_import_module() except AstroidBuildingError as exc: raise InferenceError(node=self, context=context) from exc try: context = copy_context(context) context.lookupname = name stmts = module.getattr(name, ignore_locals=module is self.root()) return _infer_stmts(stmts, context) except AttributeInferenceError as error: raise InferenceError( str(error), target=self, attribute=name, context=context ) from error class Attribute(NodeNG): """Class representing an :class:`ast.Attribute` node.""" expr: NodeNG _astroid_fields = ("expr",) _other_fields = ("attrname",) def __init__( self, attrname: str, lineno: int, col_offset: int, parent: NodeNG, *, end_lineno: int | None, end_col_offset: int | None, ) -> None: self.attrname = attrname """The name of the attribute.""" super().__init__( lineno=lineno, col_offset=col_offset, end_lineno=end_lineno, end_col_offset=end_col_offset, parent=parent, ) def postinit(self, expr: NodeNG) -> None: self.expr = expr def get_children(self): yield self.expr @decorators.raise_if_nothing_inferred @decorators.path_wrapper def _infer( self, context: InferenceContext | None = None, **kwargs: Any ) -> Generator[InferenceResult, None, InferenceErrorInfo]: return _infer_attribute(self, context, **kwargs) class 
Global(_base_nodes.NoChildrenNode, _base_nodes.Statement): """Class representing an :class:`ast.Global` node. >>> import astroid >>> node = astroid.extract_node('global a_global') >>> node """ _other_fields = ("names",) def __init__( self, names: list[str], lineno: int | None = None, col_offset: int | None = None, parent: NodeNG | None = None, *, end_lineno: int | None = None, end_col_offset: int | None = None, ) -> None: """ :param names: The names being declared as global. :param lineno: The line that this node appears on in the source code. :param col_offset: The column that this node appears on in the source code. :param parent: The parent node in the syntax tree. :param end_lineno: The last line this node appears on in the source code. :param end_col_offset: The end column this node appears on in the source code. Note: This is after the last symbol. """ self.names: list[str] = names """The names being declared as global.""" super().__init__( lineno=lineno, col_offset=col_offset, end_lineno=end_lineno, end_col_offset=end_col_offset, parent=parent, ) def _infer_name(self, frame, name): return name @decorators.raise_if_nothing_inferred @decorators.path_wrapper def _infer( self, context: InferenceContext | None = None, **kwargs: Any ) -> Generator[InferenceResult, None, None]: if context is None or context.lookupname is None: raise InferenceError(node=self, context=context) try: # pylint: disable-next=no-member return _infer_stmts(self.root().getattr(context.lookupname), context) except AttributeInferenceError as error: raise InferenceError( str(error), target=self, attribute=context.lookupname, context=context ) from error class If(_base_nodes.MultiLineWithElseBlockNode, _base_nodes.Statement): """Class representing an :class:`ast.If` node. 
>>> import astroid >>> node = astroid.extract_node('if condition: print(True)') >>> node """ _astroid_fields = ("test", "body", "orelse") _multi_line_block_fields = ("body", "orelse") test: NodeNG """The condition that the statement tests.""" body: list[NodeNG] """The contents of the block.""" orelse: list[NodeNG] """The contents of the ``else`` block.""" def postinit(self, test: NodeNG, body: list[NodeNG], orelse: list[NodeNG]) -> None: self.test = test self.body = body self.orelse = orelse @cached_property def blockstart_tolineno(self): """The line on which the beginning of this block ends. :type: int """ return self.test.tolineno def block_range(self, lineno: int) -> tuple[int, int]: """Get a range from the given line number to where this node ends. :param lineno: The line number to start the range at. :returns: The range of line numbers that this node belongs to, starting at the given line number. """ if lineno == self.body[0].fromlineno: return lineno, lineno if lineno <= self.body[-1].tolineno: return lineno, self.body[-1].tolineno return self._elsed_block_range(lineno, self.orelse, self.body[0].fromlineno - 1) def get_children(self): yield self.test yield from self.body yield from self.orelse def has_elif_block(self): return len(self.orelse) == 1 and isinstance(self.orelse[0], If) def _get_yield_nodes_skip_functions(self): """An If node can contain a Yield node in the test""" yield from self.test._get_yield_nodes_skip_functions() yield from super()._get_yield_nodes_skip_functions() def _get_yield_nodes_skip_lambdas(self): """An If node can contain a Yield node in the test""" yield from self.test._get_yield_nodes_skip_lambdas() yield from super()._get_yield_nodes_skip_lambdas() class IfExp(NodeNG): """Class representing an :class:`ast.IfExp` node. 
>>> import astroid >>> node = astroid.extract_node('value if condition else other') >>> node """ _astroid_fields = ("test", "body", "orelse") test: NodeNG """The condition that the statement tests.""" body: NodeNG """The contents of the block.""" orelse: NodeNG """The contents of the ``else`` block.""" def postinit(self, test: NodeNG, body: NodeNG, orelse: NodeNG) -> None: self.test = test self.body = body self.orelse = orelse def get_children(self): yield self.test yield self.body yield self.orelse def op_left_associative(self) -> Literal[False]: # `1 if True else 2 if False else 3` is parsed as # `1 if True else (2 if False else 3)` return False @decorators.raise_if_nothing_inferred def _infer( self, context: InferenceContext | None = None, **kwargs: Any ) -> Generator[InferenceResult, None, None]: """Support IfExp inference. If we can't infer the truthiness of the condition, we default to inferring both branches. Otherwise, we infer either branch depending on the condition. """ both_branches = False # We use two separate contexts for evaluating lhs and rhs because # evaluating lhs may leave some undesired entries in context.path # which may not let us infer right value of rhs. context = context or InferenceContext() lhs_context = copy_context(context) rhs_context = copy_context(context) try: test = next(self.test.infer(context=context.clone())) except (InferenceError, StopIteration): both_branches = True else: if not isinstance(test, util.UninferableBase): if test.bool_value(): yield from self.body.infer(context=lhs_context) else: yield from self.orelse.infer(context=rhs_context) else: both_branches = True if both_branches: yield from self.body.infer(context=lhs_context) yield from self.orelse.infer(context=rhs_context) class Import(_base_nodes.ImportNode): """Class representing an :class:`ast.Import` node. 
>>> import astroid >>> node = astroid.extract_node('import astroid') >>> node """ _other_fields = ("names",) def __init__( self, names: list[tuple[str, str | None]], lineno: int | None = None, col_offset: int | None = None, parent: NodeNG | None = None, *, end_lineno: int | None = None, end_col_offset: int | None = None, ) -> None: """ :param names: The names being imported. :param lineno: The line that this node appears on in the source code. :param col_offset: The column that this node appears on in the source code. :param parent: The parent node in the syntax tree. :param end_lineno: The last line this node appears on in the source code. :param end_col_offset: The end column this node appears on in the source code. Note: This is after the last symbol. """ self.names: list[tuple[str, str | None]] = names """The names being imported. Each entry is a :class:`tuple` of the name being imported, and the alias that the name is assigned to (if any). """ super().__init__( lineno=lineno, col_offset=col_offset, end_lineno=end_lineno, end_col_offset=end_col_offset, parent=parent, ) @decorators.raise_if_nothing_inferred @decorators.path_wrapper def _infer( self, context: InferenceContext | None = None, asname: bool = True, **kwargs: Any, ) -> Generator[nodes.Module, None, None]: """Infer an Import node: return the imported module/object.""" context = context or InferenceContext() name = context.lookupname if name is None: raise InferenceError(node=self, context=context) try: if asname: yield self.do_import_module(self.real_name(name)) else: yield self.do_import_module(name) except AstroidBuildingError as exc: raise InferenceError(node=self, context=context) from exc class Keyword(NodeNG): """Class representing an :class:`ast.keyword` node. 
>>> import astroid >>> node = astroid.extract_node('function(a_kwarg=True)') >>> node >>> node.keywords [] """ _astroid_fields = ("value",) _other_fields = ("arg",) value: NodeNG """The value being assigned to the keyword argument.""" def __init__( self, arg: str | None, lineno: int | None, col_offset: int | None, parent: NodeNG, *, end_lineno: int | None, end_col_offset: int | None, ) -> None: self.arg = arg """The argument being assigned to.""" super().__init__( lineno=lineno, col_offset=col_offset, end_lineno=end_lineno, end_col_offset=end_col_offset, parent=parent, ) def postinit(self, value: NodeNG) -> None: self.value = value def get_children(self): yield self.value class List(BaseContainer): """Class representing an :class:`ast.List` node. >>> import astroid >>> node = astroid.extract_node('[1, 2, 3]') >>> node """ _other_fields = ("ctx",) def __init__( self, ctx: Context | None = None, lineno: int | None = None, col_offset: int | None = None, parent: NodeNG | None = None, *, end_lineno: int | None = None, end_col_offset: int | None = None, ) -> None: """ :param ctx: Whether the list is assigned to or loaded from. :param lineno: The line that this node appears on in the source code. :param col_offset: The column that this node appears on in the source code. :param parent: The parent node in the syntax tree. :param end_lineno: The last line this node appears on in the source code. :param end_col_offset: The end column this node appears on in the source code. Note: This is after the last symbol. """ self.ctx: Context | None = ctx """Whether the list is assigned to or loaded from.""" super().__init__( lineno=lineno, col_offset=col_offset, end_lineno=end_lineno, end_col_offset=end_col_offset, parent=parent, ) assigned_stmts = protocols.sequence_assigned_stmts """Returns the assigned statement (non inferred) according to the assignment type. See astroid/protocols.py for actual implementation. 
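The `Keyword` node's `arg`/`value` split mirrors the builtin `ast.keyword`; a stdlib sketch of the same shape:

```python
import ast

call = ast.parse("function(a_kwarg=True)", mode="eval").body
kw = call.keywords[0]
# Mirrors astroid's Keyword node: `arg` is the parameter name being
# assigned to, `value` the node passed (arg would be None for **kwargs).
print(kw.arg)                      # a_kwarg
print(ast.literal_eval(kw.value))  # True
```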
""" infer_unary_op = protocols.list_infer_unary_op infer_binary_op = protocols.tl_infer_binary_op def pytype(self) -> Literal["builtins.list"]: """Get the name of the type that this node represents. :returns: The name of the type. """ return "builtins.list" def getitem(self, index, context: InferenceContext | None = None): """Get an item from this node. :param index: The node to use as a subscript index. :type index: Const or Slice """ return _container_getitem(self, self.elts, index, context=context) class Nonlocal(_base_nodes.NoChildrenNode, _base_nodes.Statement): """Class representing an :class:`ast.Nonlocal` node. >>> import astroid >>> node = astroid.extract_node(''' def function(): nonlocal var ''') >>> node >>> node.body[0] """ _other_fields = ("names",) def __init__( self, names: list[str], lineno: int | None = None, col_offset: int | None = None, parent: NodeNG | None = None, *, end_lineno: int | None = None, end_col_offset: int | None = None, ) -> None: """ :param names: The names being declared as not local. :param lineno: The line that this node appears on in the source code. :param col_offset: The column that this node appears on in the source code. :param parent: The parent node in the syntax tree. :param end_lineno: The last line this node appears on in the source code. :param end_col_offset: The end column this node appears on in the source code. Note: This is after the last symbol. """ self.names: list[str] = names """The names being declared as not local.""" super().__init__( lineno=lineno, col_offset=col_offset, end_lineno=end_lineno, end_col_offset=end_col_offset, parent=parent, ) def _infer_name(self, frame, name): return name class ParamSpec(_base_nodes.AssignTypeNode): """Class representing a :class:`ast.ParamSpec` node. 
>>> import astroid >>> node = astroid.extract_node('type Alias[**P] = Callable[P, int]') >>> node.type_params[0] """ _astroid_fields = ("name",) name: AssignName def __init__( self, lineno: int, col_offset: int, parent: NodeNG, *, end_lineno: int, end_col_offset: int, ) -> None: super().__init__( lineno=lineno, col_offset=col_offset, end_lineno=end_lineno, end_col_offset=end_col_offset, parent=parent, ) def postinit(self, *, name: AssignName) -> None: self.name = name def _infer( self, context: InferenceContext | None = None, **kwargs: Any ) -> Iterator[ParamSpec]: yield self assigned_stmts = protocols.generic_type_assigned_stmts """Returns the assigned statement (non inferred) according to the assignment type. See astroid/protocols.py for actual implementation. """ class Pass(_base_nodes.NoChildrenNode, _base_nodes.Statement): """Class representing an :class:`ast.Pass` node. >>> import astroid >>> node = astroid.extract_node('pass') >>> node """ class Raise(_base_nodes.Statement): """Class representing an :class:`ast.Raise` node. >>> import astroid >>> node = astroid.extract_node('raise RuntimeError("Something bad happened!")') >>> node """ _astroid_fields = ("exc", "cause") exc: NodeNG | None """What is being raised.""" cause: NodeNG | None """The exception being used to raise this one.""" def postinit( self, exc: NodeNG | None, cause: NodeNG | None, ) -> None: self.exc = exc self.cause = cause def raises_not_implemented(self) -> bool: """Check if this node raises a :class:`NotImplementedError`. :returns: Whether this node raises a :class:`NotImplementedError`. """ if not self.exc: return False return any( name.name == "NotImplementedError" for name in self.exc._get_name_nodes() ) def get_children(self): if self.exc is not None: yield self.exc if self.cause is not None: yield self.cause class Return(_base_nodes.Statement): """Class representing an :class:`ast.Return` node. 
>>> import astroid >>> node = astroid.extract_node('return True') >>> node """ _astroid_fields = ("value",) value: NodeNG | None """The value being returned.""" def postinit(self, value: NodeNG | None) -> None: self.value = value def get_children(self): if self.value is not None: yield self.value def is_tuple_return(self): return isinstance(self.value, Tuple) def _get_return_nodes_skip_functions(self): yield self class Set(BaseContainer): """Class representing an :class:`ast.Set` node. >>> import astroid >>> node = astroid.extract_node('{1, 2, 3}') >>> node """ infer_unary_op = protocols.set_infer_unary_op def pytype(self) -> Literal["builtins.set"]: """Get the name of the type that this node represents. :returns: The name of the type. """ return "builtins.set" class Slice(NodeNG): """Class representing an :class:`ast.Slice` node. >>> import astroid >>> node = astroid.extract_node('things[1:3]') >>> node >>> node.slice """ _astroid_fields = ("lower", "upper", "step") lower: NodeNG | None """The lower index in the slice.""" upper: NodeNG | None """The upper index in the slice.""" step: NodeNG | None """The step to take between indexes.""" def postinit( self, lower: NodeNG | None, upper: NodeNG | None, step: NodeNG | None, ) -> None: self.lower = lower self.upper = upper self.step = step def _wrap_attribute(self, attr): """Wrap the empty attributes of the Slice in a Const node.""" if not attr: const = const_factory(attr) const.parent = self return const return attr @cached_property def _proxied(self) -> nodes.ClassDef: builtins = AstroidManager().builtins_module return builtins.getattr("slice")[0] def pytype(self) -> Literal["builtins.slice"]: """Get the name of the type that this node represents. :returns: The name of the type. """ return "builtins.slice" def display_type(self) -> Literal["Slice"]: """A human readable type of this node. :returns: The type of this node. 
""" return "Slice" def igetattr( self, attrname: str, context: InferenceContext | None = None ) -> Iterator[SuccessfulInferenceResult]: """Infer the possible values of the given attribute on the slice. :param attrname: The name of the attribute to infer. :returns: The inferred possible values. """ if attrname == "start": yield self._wrap_attribute(self.lower) elif attrname == "stop": yield self._wrap_attribute(self.upper) elif attrname == "step": yield self._wrap_attribute(self.step) else: yield from self.getattr(attrname, context=context) def getattr(self, attrname, context: InferenceContext | None = None): return self._proxied.getattr(attrname, context) def get_children(self): if self.lower is not None: yield self.lower if self.upper is not None: yield self.upper if self.step is not None: yield self.step def _infer( self, context: InferenceContext | None = None, **kwargs: Any ) -> Iterator[Slice]: yield self class Starred(_base_nodes.ParentAssignNode): """Class representing an :class:`ast.Starred` node. >>> import astroid >>> node = astroid.extract_node('*args') >>> node """ _astroid_fields = ("value",) _other_fields = ("ctx",) value: NodeNG """What is being unpacked.""" def __init__( self, ctx: Context, lineno: int, col_offset: int, parent: NodeNG, *, end_lineno: int | None, end_col_offset: int | None, ) -> None: self.ctx = ctx """Whether the starred item is assigned to or loaded from.""" super().__init__( lineno=lineno, col_offset=col_offset, end_lineno=end_lineno, end_col_offset=end_col_offset, parent=parent, ) def postinit(self, value: NodeNG) -> None: self.value = value assigned_stmts = protocols.starred_assigned_stmts """Returns the assigned statement (non inferred) according to the assignment type. See astroid/protocols.py for actual implementation. """ def get_children(self): yield self.value class Subscript(NodeNG): """Class representing an :class:`ast.Subscript` node. 
>>> import astroid >>> node = astroid.extract_node('things[1:3]') >>> node """ _SUBSCRIPT_SENTINEL = object() _astroid_fields = ("value", "slice") _other_fields = ("ctx",) value: NodeNG """What is being indexed.""" slice: NodeNG """The slice being used to lookup.""" def __init__( self, ctx: Context, lineno: int, col_offset: int, parent: NodeNG, *, end_lineno: int | None, end_col_offset: int | None, ) -> None: self.ctx = ctx """Whether the subscripted item is assigned to or loaded from.""" super().__init__( lineno=lineno, col_offset=col_offset, end_lineno=end_lineno, end_col_offset=end_col_offset, parent=parent, ) # pylint: disable=redefined-builtin; had to use the same name as builtin ast module. def postinit(self, value: NodeNG, slice: NodeNG) -> None: self.value = value self.slice = slice def get_children(self): yield self.value yield self.slice def _infer_subscript( self, context: InferenceContext | None = None, **kwargs: Any ) -> Generator[InferenceResult, None, InferenceErrorInfo | None]: """Inference for subscripts. We're understanding if the index is a Const or a slice, passing the result of inference to the value's `getitem` method, which should handle each supported index type accordingly. """ from astroid import helpers # pylint: disable=import-outside-toplevel found_one = False for value in self.value.infer(context): if isinstance(value, util.UninferableBase): yield util.Uninferable return None for index in self.slice.infer(context): if isinstance(index, util.UninferableBase): yield util.Uninferable return None # Try to deduce the index value. 
index_value = self._SUBSCRIPT_SENTINEL if value.__class__ == Instance: index_value = index elif index.__class__ == Instance: instance_as_index = helpers.class_instance_as_index(index) if instance_as_index: index_value = instance_as_index else: index_value = index if index_value is self._SUBSCRIPT_SENTINEL: raise InferenceError(node=self, context=context) try: assigned = value.getitem(index_value, context) except ( AstroidTypeError, AstroidIndexError, AstroidValueError, AttributeInferenceError, AttributeError, ) as exc: raise InferenceError(node=self, context=context) from exc # Prevent inferring if the inferred subscript # is the same as the original subscripted object. if self is assigned or isinstance(assigned, util.UninferableBase): yield util.Uninferable return None yield from assigned.infer(context) found_one = True if found_one: return InferenceErrorInfo(node=self, context=context) return None @decorators.raise_if_nothing_inferred @decorators.path_wrapper def _infer(self, context: InferenceContext | None = None, **kwargs: Any): return self._infer_subscript(context, **kwargs) @decorators.raise_if_nothing_inferred def infer_lhs(self, context: InferenceContext | None = None, **kwargs: Any): return self._infer_subscript(context, **kwargs) class Try(_base_nodes.MultiLineWithElseBlockNode, _base_nodes.Statement): """Class representing a :class:`ast.Try` node. >>> import astroid >>> node = astroid.extract_node(''' try: do_something() except Exception as error: print("Error!") finally: print("Cleanup!") ''') >>> node """ _astroid_fields = ("body", "handlers", "orelse", "finalbody") _multi_line_block_fields = ("body", "handlers", "orelse", "finalbody") def __init__( self, *, lineno: int, col_offset: int, end_lineno: int, end_col_offset: int, parent: NodeNG, ) -> None: """ :param lineno: The line that this node appears on in the source code. :param col_offset: The column that this node appears on in the source code. :param parent: The parent node in the syntax tree. 
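The flow of `_infer_subscript` above is: infer the container, infer the index, then delegate to the container's `getitem`. A stdlib sketch of that pipeline (`infer_subscript` is a hypothetical helper restricted to literal containers and constant indices):

```python
import ast

def infer_subscript(source: str):
    """Mirror the shape of Subscript._infer_subscript: infer the value,
    infer the index, then delegate to the container's item lookup."""
    node = ast.parse(source, mode="eval").body
    assert isinstance(node, ast.Subscript)
    value = ast.literal_eval(node.value)  # "infer" the container
    index = ast.literal_eval(node.slice)  # "infer" the index
    return value[index]                   # value.getitem(index)

print(infer_subscript("[10, 20, 30][2]"))        # 30
print(infer_subscript("{'a': 1, 'b': 2}['b']"))  # 2
```

Errors from the lookup (a bad index, an unsubscriptable container) correspond to the exceptions the real method converts into `InferenceError`.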
:param end_lineno: The last line this node appears on in the source code. :param end_col_offset: The end column this node appears on in the source code. Note: This is after the last symbol. """ self.body: list[NodeNG] = [] """The contents of the block to catch exceptions from.""" self.handlers: list[ExceptHandler] = [] """The exception handlers.""" self.orelse: list[NodeNG] = [] """The contents of the ``else`` block.""" self.finalbody: list[NodeNG] = [] """The contents of the ``finally`` block.""" super().__init__( lineno=lineno, col_offset=col_offset, end_lineno=end_lineno, end_col_offset=end_col_offset, parent=parent, ) def postinit( self, *, body: list[NodeNG], handlers: list[ExceptHandler], orelse: list[NodeNG], finalbody: list[NodeNG], ) -> None: """Do some setup after initialisation. :param body: The contents of the block to catch exceptions from. :param handlers: The exception handlers. :param orelse: The contents of the ``else`` block. :param finalbody: The contents of the ``finally`` block. 
""" self.body = body self.handlers = handlers self.orelse = orelse self.finalbody = finalbody def _infer_name(self, frame, name): return name def block_range(self, lineno: int) -> tuple[int, int]: """Get a range from a given line number to where this node ends.""" if lineno == self.fromlineno: return lineno, lineno if self.body and self.body[0].fromlineno <= lineno <= self.body[-1].tolineno: # Inside try body - return from lineno till end of try body return lineno, self.body[-1].tolineno for exhandler in self.handlers: if exhandler.type and lineno == exhandler.type.fromlineno: return lineno, lineno if exhandler.body[0].fromlineno <= lineno <= exhandler.body[-1].tolineno: return lineno, exhandler.body[-1].tolineno if self.orelse: if self.orelse[0].fromlineno - 1 == lineno: return lineno, lineno if self.orelse[0].fromlineno <= lineno <= self.orelse[-1].tolineno: return lineno, self.orelse[-1].tolineno if self.finalbody: if self.finalbody[0].fromlineno - 1 == lineno: return lineno, lineno if self.finalbody[0].fromlineno <= lineno <= self.finalbody[-1].tolineno: return lineno, self.finalbody[-1].tolineno return lineno, self.tolineno def get_children(self): yield from self.body yield from self.handlers yield from self.orelse yield from self.finalbody class TryStar(_base_nodes.MultiLineWithElseBlockNode, _base_nodes.Statement): """Class representing an :class:`ast.TryStar` node.""" _astroid_fields = ("body", "handlers", "orelse", "finalbody") _multi_line_block_fields = ("body", "handlers", "orelse", "finalbody") def __init__( self, *, lineno: int | None = None, col_offset: int | None = None, end_lineno: int | None = None, end_col_offset: int | None = None, parent: NodeNG | None = None, ) -> None: """ :param lineno: The line that this node appears on in the source code. :param col_offset: The column that this node appears on in the source code. :param parent: The parent node in the syntax tree. :param end_lineno: The last line this node appears on in the source code. 
:param end_col_offset: The end column this node appears on in the source code. Note: This is after the last symbol. """ self.body: list[NodeNG] = [] """The contents of the block to catch exceptions from.""" self.handlers: list[ExceptHandler] = [] """The exception handlers.""" self.orelse: list[NodeNG] = [] """The contents of the ``else`` block.""" self.finalbody: list[NodeNG] = [] """The contents of the ``finally`` block.""" super().__init__( lineno=lineno, col_offset=col_offset, end_lineno=end_lineno, end_col_offset=end_col_offset, parent=parent, ) def postinit( self, *, body: list[NodeNG] | None = None, handlers: list[ExceptHandler] | None = None, orelse: list[NodeNG] | None = None, finalbody: list[NodeNG] | None = None, ) -> None: """Do some setup after initialisation. :param body: The contents of the block to catch exceptions from. :param handlers: The exception handlers. :param orelse: The contents of the ``else`` block. :param finalbody: The contents of the ``finally`` block. """ if body: self.body = body if handlers: self.handlers = handlers if orelse: self.orelse = orelse if finalbody: self.finalbody = finalbody def _infer_name(self, frame, name): return name def block_range(self, lineno: int) -> tuple[int, int]: """Get a range from a given line number to where this node ends.""" if lineno == self.fromlineno: return lineno, lineno if self.body and self.body[0].fromlineno <= lineno <= self.body[-1].tolineno: # Inside try body - return from lineno till end of try body return lineno, self.body[-1].tolineno for exhandler in self.handlers: if exhandler.type and lineno == exhandler.type.fromlineno: return lineno, lineno if exhandler.body[0].fromlineno <= lineno <= exhandler.body[-1].tolineno: return lineno, exhandler.body[-1].tolineno if self.orelse: if self.orelse[0].fromlineno - 1 == lineno: return lineno, lineno if self.orelse[0].fromlineno <= lineno <= self.orelse[-1].tolineno: return lineno, self.orelse[-1].tolineno if self.finalbody: if 
self.finalbody[0].fromlineno - 1 == lineno: return lineno, lineno if self.finalbody[0].fromlineno <= lineno <= self.finalbody[-1].tolineno: return lineno, self.finalbody[-1].tolineno return lineno, self.tolineno def get_children(self): yield from self.body yield from self.handlers yield from self.orelse yield from self.finalbody class Tuple(BaseContainer): """Class representing an :class:`ast.Tuple` node. >>> import astroid >>> node = astroid.extract_node('(1, 2, 3)') >>> node """ _other_fields = ("ctx",) def __init__( self, ctx: Context | None = None, lineno: int | None = None, col_offset: int | None = None, parent: NodeNG | None = None, *, end_lineno: int | None = None, end_col_offset: int | None = None, ) -> None: """ :param ctx: Whether the tuple is assigned to or loaded from. :param lineno: The line that this node appears on in the source code. :param col_offset: The column that this node appears on in the source code. :param parent: The parent node in the syntax tree. :param end_lineno: The last line this node appears on in the source code. :param end_col_offset: The end column this node appears on in the source code. Note: This is after the last symbol. """ self.ctx: Context | None = ctx """Whether the tuple is assigned to or loaded from.""" super().__init__( lineno=lineno, col_offset=col_offset, end_lineno=end_lineno, end_col_offset=end_col_offset, parent=parent, ) assigned_stmts = protocols.sequence_assigned_stmts """Returns the assigned statement (non inferred) according to the assignment type. See astroid/protocols.py for actual implementation. """ infer_unary_op = protocols.tuple_infer_unary_op infer_binary_op = protocols.tl_infer_binary_op def pytype(self) -> Literal["builtins.tuple"]: """Get the name of the type that this node represents. :returns: The name of the type. """ return "builtins.tuple" def getitem(self, index, context: InferenceContext | None = None): """Get an item from this node. :param index: The node to use as a subscript index. 
:type index: Const or Slice """ return _container_getitem(self, self.elts, index, context=context) class TypeAlias(_base_nodes.AssignTypeNode, _base_nodes.Statement): """Class representing a :class:`ast.TypeAlias` node. >>> import astroid >>> node = astroid.extract_node('type Point = tuple[float, float]') >>> node """ _astroid_fields = ("name", "type_params", "value") name: AssignName type_params: list[TypeVar | ParamSpec | TypeVarTuple] value: NodeNG def __init__( self, lineno: int, col_offset: int, parent: NodeNG, *, end_lineno: int, end_col_offset: int, ) -> None: super().__init__( lineno=lineno, col_offset=col_offset, end_lineno=end_lineno, end_col_offset=end_col_offset, parent=parent, ) def postinit( self, *, name: AssignName, type_params: list[TypeVar | ParamSpec | TypeVarTuple], value: NodeNG, ) -> None: self.name = name self.type_params = type_params self.value = value def _infer( self, context: InferenceContext | None = None, **kwargs: Any ) -> Iterator[TypeAlias]: yield self assigned_stmts: ClassVar[ Callable[ [ TypeAlias, AssignName, InferenceContext | None, None, ], Generator[NodeNG, None, None], ] ] = protocols.assign_assigned_stmts class TypeVar(_base_nodes.AssignTypeNode): """Class representing a :class:`ast.TypeVar` node. 
>>> import astroid >>> node = astroid.extract_node('type Point[T] = tuple[float, float]') >>> node.type_params[0] """ _astroid_fields = ("name", "bound") name: AssignName bound: NodeNG | None def __init__( self, lineno: int, col_offset: int, parent: NodeNG, *, end_lineno: int, end_col_offset: int, ) -> None: super().__init__( lineno=lineno, col_offset=col_offset, end_lineno=end_lineno, end_col_offset=end_col_offset, parent=parent, ) def postinit(self, *, name: AssignName, bound: NodeNG | None) -> None: self.name = name self.bound = bound def _infer( self, context: InferenceContext | None = None, **kwargs: Any ) -> Iterator[TypeVar]: yield self assigned_stmts = protocols.generic_type_assigned_stmts """Returns the assigned statement (non inferred) according to the assignment type. See astroid/protocols.py for actual implementation. """ class TypeVarTuple(_base_nodes.AssignTypeNode): """Class representing a :class:`ast.TypeVarTuple` node. >>> import astroid >>> node = astroid.extract_node('type Alias[*Ts] = tuple[*Ts]') >>> node.type_params[0] """ _astroid_fields = ("name",) name: AssignName def __init__( self, lineno: int, col_offset: int, parent: NodeNG, *, end_lineno: int, end_col_offset: int, ) -> None: super().__init__( lineno=lineno, col_offset=col_offset, end_lineno=end_lineno, end_col_offset=end_col_offset, parent=parent, ) def postinit(self, *, name: AssignName) -> None: self.name = name def _infer( self, context: InferenceContext | None = None, **kwargs: Any ) -> Iterator[TypeVarTuple]: yield self assigned_stmts = protocols.generic_type_assigned_stmts """Returns the assigned statement (non inferred) according to the assignment type. See astroid/protocols.py for actual implementation. """ UNARY_OP_METHOD = { "+": "__pos__", "-": "__neg__", "~": "__invert__", "not": None, # XXX not '__nonzero__' } class UnaryOp(_base_nodes.OperatorNode): """Class representing an :class:`ast.UnaryOp` node. 
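The `UNARY_OP_METHOD` table above mirrors Python's own operator dispatch; each operator except `not` resolves to a dunder method:

```python
# Unary operators dispatch to the dunders named in UNARY_OP_METHOD:
assert (5).__neg__() == -5
assert (5).__pos__() == +5
assert (5).__invert__() == ~5

# `not` maps to None because there is no corresponding dunder; Python
# (and the inference fallback in _infer_unaryop) negates the operand's
# truthiness instead.
print(not [])   # True
print(not [1])  # False
```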
>>> import astroid >>> node = astroid.extract_node('-5') >>> node """ _astroid_fields = ("operand",) _other_fields = ("op",) operand: NodeNG """What the unary operator is applied to.""" def __init__( self, op: str, lineno: int, col_offset: int, parent: NodeNG, *, end_lineno: int | None, end_col_offset: int | None, ) -> None: self.op = op """The operator.""" super().__init__( lineno=lineno, col_offset=col_offset, end_lineno=end_lineno, end_col_offset=end_col_offset, parent=parent, ) def postinit(self, operand: NodeNG) -> None: self.operand = operand def type_errors(self, context: InferenceContext | None = None): """Get a list of type errors which can occur during inference. Each TypeError is represented by a :class:`BadUnaryOperationMessage`, which holds the original exception. :returns: The list of possible type errors. :rtype: list(BadUnaryOperationMessage) """ try: results = self._infer_unaryop(context=context) return [ result for result in results if isinstance(result, util.BadUnaryOperationMessage) ] except InferenceError: return [] def get_children(self): yield self.operand def op_precedence(self): if self.op == "not": return OP_PRECEDENCE[self.op] return super().op_precedence() def _infer_unaryop( self: nodes.UnaryOp, context: InferenceContext | None = None, **kwargs: Any ) -> Generator[ InferenceResult | util.BadUnaryOperationMessage, None, InferenceErrorInfo ]: """Infer what an UnaryOp should return when evaluated.""" from astroid.nodes import ClassDef # pylint: disable=import-outside-toplevel for operand in self.operand.infer(context): try: yield operand.infer_unary_op(self.op) except TypeError as exc: # The operand doesn't support this operation. yield util.BadUnaryOperationMessage(operand, self.op, exc) except AttributeError as exc: meth = UNARY_OP_METHOD[self.op] if meth is None: # `not node`. Determine node's boolean # value and negate its result, unless it is # Uninferable, which will be returned as is. 
bool_value = operand.bool_value() if not isinstance(bool_value, util.UninferableBase): yield const_factory(not bool_value) else: yield util.Uninferable else: if not isinstance(operand, (Instance, ClassDef)): # The operation was used on something which # doesn't support it. yield util.BadUnaryOperationMessage(operand, self.op, exc) continue try: try: methods = dunder_lookup.lookup(operand, meth) except AttributeInferenceError: yield util.BadUnaryOperationMessage(operand, self.op, exc) continue meth = methods[0] inferred = next(meth.infer(context=context), None) if ( isinstance(inferred, util.UninferableBase) or not inferred.callable() ): continue context = copy_context(context) context.boundnode = operand context.callcontext = CallContext(args=[], callee=inferred) call_results = inferred.infer_call_result(self, context=context) result = next(call_results, None) if result is None: # Failed to infer, return the same type. yield operand else: yield result except AttributeInferenceError as inner_exc: # The unary operation special method was not found. yield util.BadUnaryOperationMessage(operand, self.op, inner_exc) except InferenceError: yield util.Uninferable @decorators.raise_if_nothing_inferred @decorators.path_wrapper def _infer( self: nodes.UnaryOp, context: InferenceContext | None = None, **kwargs: Any ) -> Generator[InferenceResult, None, InferenceErrorInfo]: """Infer what an UnaryOp should return when evaluated.""" yield from self._filter_operation_errors( self._infer_unaryop, context, util.BadUnaryOperationMessage ) return InferenceErrorInfo(node=self, context=context) class While(_base_nodes.MultiLineWithElseBlockNode, _base_nodes.Statement): """Class representing an :class:`ast.While` node. 
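The `BadUnaryOperationMessage` branches above correspond to the runtime case where the operand's type lacks the required dunder; a sketch of that failure mode (`unary_minus_fails` is a hypothetical helper):

```python
def unary_minus_fails(value) -> bool:
    """Return True when unary minus raises TypeError at runtime - the
    situation astroid reports as a BadUnaryOperationMessage."""
    try:
        -value
    except TypeError:
        return True
    return False

print(unary_minus_fails("abc"))  # True: str defines no __neg__
print(unary_minus_fails(3.5))    # False
```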
>>> import astroid >>> node = astroid.extract_node(''' while condition(): print("True") ''') >>> node """ _astroid_fields = ("test", "body", "orelse") _multi_line_block_fields = ("body", "orelse") test: NodeNG """The condition that the loop tests.""" body: list[NodeNG] """The contents of the loop.""" orelse: list[NodeNG] """The contents of the ``else`` block.""" def postinit( self, test: NodeNG, body: list[NodeNG], orelse: list[NodeNG], ) -> None: self.test = test self.body = body self.orelse = orelse @cached_property def blockstart_tolineno(self): """The line on which the beginning of this block ends. :type: int """ return self.test.tolineno def block_range(self, lineno: int) -> tuple[int, int]: """Get a range from the given line number to where this node ends. :param lineno: The line number to start the range at. :returns: The range of line numbers that this node belongs to, starting at the given line number. """ return self._elsed_block_range(lineno, self.orelse) def get_children(self): yield self.test yield from self.body yield from self.orelse def _get_yield_nodes_skip_functions(self): """A While node can contain a Yield node in the test""" yield from self.test._get_yield_nodes_skip_functions() yield from super()._get_yield_nodes_skip_functions() def _get_yield_nodes_skip_lambdas(self): """A While node can contain a Yield node in the test""" yield from self.test._get_yield_nodes_skip_lambdas() yield from super()._get_yield_nodes_skip_lambdas() class With( _base_nodes.MultiLineWithElseBlockNode, _base_nodes.AssignTypeNode, _base_nodes.Statement, ): """Class representing an :class:`ast.With` node. 
>>> import astroid >>> node = astroid.extract_node(''' with open(file_path) as file_: print(file_.read()) ''') >>> node """ _astroid_fields = ("items", "body") _other_other_fields = ("type_annotation",) _multi_line_block_fields = ("body",) def __init__( self, lineno: int | None = None, col_offset: int | None = None, parent: NodeNG | None = None, *, end_lineno: int | None = None, end_col_offset: int | None = None, ) -> None: """ :param lineno: The line that this node appears on in the source code. :param col_offset: The column that this node appears on in the source code. :param parent: The parent node in the syntax tree. :param end_lineno: The last line this node appears on in the source code. :param end_col_offset: The end column this node appears on in the source code. Note: This is after the last symbol. """ self.items: list[tuple[NodeNG, NodeNG | None]] = [] """The pairs of context managers and the names they are assigned to.""" self.body: list[NodeNG] = [] """The contents of the ``with`` block.""" self.type_annotation: NodeNG | None = None # can be None """If present, this will contain the type annotation passed by a type comment""" super().__init__( lineno=lineno, col_offset=col_offset, end_lineno=end_lineno, end_col_offset=end_col_offset, parent=parent, ) def postinit( self, items: list[tuple[NodeNG, NodeNG | None]] | None = None, body: list[NodeNG] | None = None, type_annotation: NodeNG | None = None, ) -> None: """Do some setup after initialisation. :param items: The pairs of context managers and the names they are assigned to. :param body: The contents of the ``with`` block. """ if items is not None: self.items = items if body is not None: self.body = body self.type_annotation = type_annotation assigned_stmts = protocols.with_assigned_stmts """Returns the assigned statement (non inferred) according to the assignment type. See astroid/protocols.py for actual implementation. 
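The `items` attribute above stores (context manager, assigned name) pairs. The builtin `ast.With` keeps the same data in `withitem` objects; a stdlib sketch:

```python
import ast

node = ast.parse("with open(path) as fh:\n    pass").body[0]
assert isinstance(node, ast.With)
item = node.items[0]
# The stdlib groups each pair in a withitem; astroid flattens them into
# (context manager, assigned name) tuples on `items`.
print(item.context_expr.func.id)  # open
print(item.optional_vars.id)      # fh
```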
""" @cached_property def blockstart_tolineno(self): """The line on which the beginning of this block ends. :type: int """ return self.items[-1][0].tolineno def get_children(self): """Get the child nodes below this node. :returns: The children. :rtype: iterable(NodeNG) """ for expr, var in self.items: yield expr if var: yield var yield from self.body class AsyncWith(With): """Asynchronous ``with`` built with the ``async`` keyword.""" class Yield(NodeNG): """Class representing an :class:`ast.Yield` node. >>> import astroid >>> node = astroid.extract_node('yield True') >>> node """ _astroid_fields = ("value",) value: NodeNG | None """The value to yield.""" def postinit(self, value: NodeNG | None) -> None: self.value = value def get_children(self): if self.value is not None: yield self.value def _get_yield_nodes_skip_functions(self): yield self def _get_yield_nodes_skip_lambdas(self): yield self class YieldFrom(Yield): # TODO value is required, not optional """Class representing an :class:`ast.YieldFrom` node.""" class DictUnpack(_base_nodes.NoChildrenNode): """Represents the unpacking of dicts into dicts using :pep:`448`.""" class FormattedValue(NodeNG): """Class representing an :class:`ast.FormattedValue` node. Represents a :pep:`498` format string. >>> import astroid >>> node = astroid.extract_node('f"Format {type_}"') >>> node >>> node.values [, ] """ _astroid_fields = ("value", "format_spec") _other_fields = ("conversion",) def __init__( self, lineno: int | None = None, col_offset: int | None = None, parent: NodeNG | None = None, *, end_lineno: int | None = None, end_col_offset: int | None = None, ) -> None: """ :param lineno: The line that this node appears on in the source code. :param col_offset: The column that this node appears on in the source code. :param parent: The parent node in the syntax tree. :param end_lineno: The last line this node appears on in the source code. :param end_col_offset: The end column this node appears on in the source code. 
Note: This is after the last symbol. """ self.value: NodeNG """The value to be formatted into the string.""" self.conversion: int """The type of formatting to be applied to the value. .. seealso:: :class:`ast.FormattedValue` """ self.format_spec: JoinedStr | None = None """The formatting to be applied to the value. .. seealso:: :class:`ast.FormattedValue` """ super().__init__( lineno=lineno, col_offset=col_offset, end_lineno=end_lineno, end_col_offset=end_col_offset, parent=parent, ) def postinit( self, *, value: NodeNG, conversion: int, format_spec: JoinedStr | None = None, ) -> None: """Do some setup after initialisation. :param value: The value to be formatted into the string. :param conversion: The type of formatting to be applied to the value. :param format_spec: The formatting to be applied to the value. :type format_spec: JoinedStr or None """ self.value = value self.conversion = conversion self.format_spec = format_spec def get_children(self): yield self.value if self.format_spec is not None: yield self.format_spec class JoinedStr(NodeNG): """Represents a list of string expressions to be joined. >>> import astroid >>> node = astroid.extract_node('f"Format {type_}"') >>> node """ _astroid_fields = ("values",) def __init__( self, lineno: int | None = None, col_offset: int | None = None, parent: NodeNG | None = None, *, end_lineno: int | None = None, end_col_offset: int | None = None, ) -> None: """ :param lineno: The line that this node appears on in the source code. :param col_offset: The column that this node appears on in the source code. :param parent: The parent node in the syntax tree. :param end_lineno: The last line this node appears on in the source code. :param end_col_offset: The end column this node appears on in the source code. Note: This is after the last symbol. """ self.values: list[NodeNG] = [] """The string expressions to be joined. 
:type: list(FormattedValue or Const) """ super().__init__( lineno=lineno, col_offset=col_offset, end_lineno=end_lineno, end_col_offset=end_col_offset, parent=parent, ) def postinit(self, values: list[NodeNG] | None = None) -> None: """Do some setup after initialisation. :param value: The string expressions to be joined. :type: list(FormattedValue or Const) """ if values is not None: self.values = values def get_children(self): yield from self.values class NamedExpr(_base_nodes.AssignTypeNode): """Represents the assignment from the assignment expression >>> import astroid >>> module = astroid.parse('if a := 1: pass') >>> module.body[0].test """ _astroid_fields = ("target", "value") optional_assign = True """Whether this node optionally assigns a variable. Since NamedExpr are not always called they do not always assign.""" def __init__( self, lineno: int | None = None, col_offset: int | None = None, parent: NodeNG | None = None, *, end_lineno: int | None = None, end_col_offset: int | None = None, ) -> None: """ :param lineno: The line that this node appears on in the source code. :param col_offset: The column that this node appears on in the source code. :param parent: The parent node in the syntax tree. :param end_lineno: The last line this node appears on in the source code. :param end_col_offset: The end column this node appears on in the source code. Note: This is after the last symbol. """ self.target: NodeNG """The assignment target :type: Name """ self.value: NodeNG """The value that gets assigned in the expression""" super().__init__( lineno=lineno, col_offset=col_offset, end_lineno=end_lineno, end_col_offset=end_col_offset, parent=parent, ) def postinit(self, target: NodeNG, value: NodeNG) -> None: self.target = target self.value = value assigned_stmts = protocols.named_expr_assigned_stmts """Returns the assigned statement (non inferred) according to the assignment type. See astroid/protocols.py for actual implementation. 
""" def frame( self, *, future: Literal[None, True] = None ) -> nodes.FunctionDef | nodes.Module | nodes.ClassDef | nodes.Lambda: """The first parent frame node. A frame node is a :class:`Module`, :class:`FunctionDef`, or :class:`ClassDef`. :returns: The first parent frame node. """ if future is not None: warnings.warn( "The future arg will be removed in astroid 4.0.", DeprecationWarning, stacklevel=2, ) if not self.parent: raise ParentMissingError(target=self) # For certain parents NamedExpr evaluate to the scope of the parent if isinstance(self.parent, (Arguments, Keyword, Comprehension)): if not self.parent.parent: raise ParentMissingError(target=self.parent) if not self.parent.parent.parent: raise ParentMissingError(target=self.parent.parent) return self.parent.parent.parent.frame() return self.parent.frame() def scope(self) -> LocalsDictNodeNG: """The first parent node defining a new scope. These can be Module, FunctionDef, ClassDef, Lambda, or GeneratorExp nodes. :returns: The first parent scope node. """ if not self.parent: raise ParentMissingError(target=self) # For certain parents NamedExpr evaluate to the scope of the parent if isinstance(self.parent, (Arguments, Keyword, Comprehension)): if not self.parent.parent: raise ParentMissingError(target=self.parent) if not self.parent.parent.parent: raise ParentMissingError(target=self.parent.parent) return self.parent.parent.parent.scope() return self.parent.scope() def set_local(self, name: str, stmt: NodeNG) -> None: """Define that the given name is declared in the given statement node. NamedExpr's in Arguments, Keyword or Comprehension are evaluated in their parent's parent scope. So we add to their frame's locals. .. seealso:: :meth:`scope` :param name: The name that is being defined. :param stmt: The statement that defines the given name. """ self.frame().set_local(name, stmt) class Unknown(_base_nodes.AssignTypeNode): """This node represents a node in a constructed AST where introspection is not possible. 
At the moment, it's only used in the args attribute of FunctionDef nodes where function signature introspection failed. """ name = "Unknown" def __init__( self, lineno: None = None, col_offset: None = None, parent: None = None, *, end_lineno: None = None, end_col_offset: None = None, ) -> None: super().__init__( lineno=lineno, col_offset=col_offset, end_lineno=end_lineno, end_col_offset=end_col_offset, parent=parent, ) def qname(self) -> Literal["Unknown"]: return "Unknown" def _infer(self, context: InferenceContext | None = None, **kwargs): """Inference on an Unknown node immediately terminates.""" yield util.Uninferable class EvaluatedObject(NodeNG): """Contains an object that has already been inferred This class is useful to pre-evaluate a particular node, with the resulting class acting as the non-evaluated node. """ name = "EvaluatedObject" _astroid_fields = ("original",) _other_fields = ("value",) def __init__( self, original: SuccessfulInferenceResult, value: InferenceResult ) -> None: self.original: SuccessfulInferenceResult = original """The original node that has already been evaluated""" self.value: InferenceResult = value """The inferred value""" super().__init__( lineno=self.original.lineno, col_offset=self.original.col_offset, parent=self.original.parent, end_lineno=self.original.end_lineno, end_col_offset=self.original.end_col_offset, ) def _infer( self, context: InferenceContext | None = None, **kwargs: Any ) -> Generator[NodeNG | util.UninferableBase, None, None]: yield self.value # Pattern matching ####################################################### class Match(_base_nodes.Statement, _base_nodes.MultiLineBlockNode): """Class representing a :class:`ast.Match` node. >>> import astroid >>> node = astroid.extract_node(''' match x: case 200: ... case _: ... 
''') >>> node """ _astroid_fields = ("subject", "cases") _multi_line_block_fields = ("cases",) def __init__( self, lineno: int | None = None, col_offset: int | None = None, parent: NodeNG | None = None, *, end_lineno: int | None = None, end_col_offset: int | None = None, ) -> None: self.subject: NodeNG self.cases: list[MatchCase] super().__init__( lineno=lineno, col_offset=col_offset, end_lineno=end_lineno, end_col_offset=end_col_offset, parent=parent, ) def postinit( self, *, subject: NodeNG, cases: list[MatchCase], ) -> None: self.subject = subject self.cases = cases class Pattern(NodeNG): """Base class for all Pattern nodes.""" class MatchCase(_base_nodes.MultiLineBlockNode): """Class representing a :class:`ast.match_case` node. >>> import astroid >>> node = astroid.extract_node(''' match x: case 200: ... ''') >>> node.cases[0] """ _astroid_fields = ("pattern", "guard", "body") _multi_line_block_fields = ("body",) lineno: None col_offset: None end_lineno: None end_col_offset: None def __init__(self, *, parent: NodeNG | None = None) -> None: self.pattern: Pattern self.guard: NodeNG | None self.body: list[NodeNG] super().__init__( parent=parent, lineno=None, col_offset=None, end_lineno=None, end_col_offset=None, ) def postinit( self, *, pattern: Pattern, guard: NodeNG | None, body: list[NodeNG], ) -> None: self.pattern = pattern self.guard = guard self.body = body class MatchValue(Pattern): """Class representing a :class:`ast.MatchValue` node. >>> import astroid >>> node = astroid.extract_node(''' match x: case 200: ... 
''') >>> node.cases[0].pattern """ _astroid_fields = ("value",) def __init__( self, lineno: int | None = None, col_offset: int | None = None, parent: NodeNG | None = None, *, end_lineno: int | None = None, end_col_offset: int | None = None, ) -> None: self.value: NodeNG super().__init__( lineno=lineno, col_offset=col_offset, end_lineno=end_lineno, end_col_offset=end_col_offset, parent=parent, ) def postinit(self, *, value: NodeNG) -> None: self.value = value class MatchSingleton(Pattern): """Class representing a :class:`ast.MatchSingleton` node. >>> import astroid >>> node = astroid.extract_node(''' match x: case True: ... case False: ... case None: ... ''') >>> node.cases[0].pattern >>> node.cases[1].pattern >>> node.cases[2].pattern """ _other_fields = ("value",) def __init__( self, *, value: Literal[True, False, None], lineno: int | None = None, col_offset: int | None = None, end_lineno: int | None = None, end_col_offset: int | None = None, parent: NodeNG | None = None, ) -> None: self.value = value super().__init__( lineno=lineno, col_offset=col_offset, end_lineno=end_lineno, end_col_offset=end_col_offset, parent=parent, ) class MatchSequence(Pattern): """Class representing a :class:`ast.MatchSequence` node. >>> import astroid >>> node = astroid.extract_node(''' match x: case [1, 2]: ... case (1, 2, *_): ... ''') >>> node.cases[0].pattern >>> node.cases[1].pattern """ _astroid_fields = ("patterns",) def __init__( self, lineno: int | None = None, col_offset: int | None = None, parent: NodeNG | None = None, *, end_lineno: int | None = None, end_col_offset: int | None = None, ) -> None: self.patterns: list[Pattern] super().__init__( lineno=lineno, col_offset=col_offset, end_lineno=end_lineno, end_col_offset=end_col_offset, parent=parent, ) def postinit(self, *, patterns: list[Pattern]) -> None: self.patterns = patterns class MatchMapping(_base_nodes.AssignTypeNode, Pattern): """Class representing a :class:`ast.MatchMapping` node. 
>>> import astroid >>> node = astroid.extract_node(''' match x: case {1: "Hello", 2: "World", 3: _, **rest}: ... ''') >>> node.cases[0].pattern """ _astroid_fields = ("keys", "patterns", "rest") def __init__( self, lineno: int | None = None, col_offset: int | None = None, parent: NodeNG | None = None, *, end_lineno: int | None = None, end_col_offset: int | None = None, ) -> None: self.keys: list[NodeNG] self.patterns: list[Pattern] self.rest: AssignName | None super().__init__( lineno=lineno, col_offset=col_offset, end_lineno=end_lineno, end_col_offset=end_col_offset, parent=parent, ) def postinit( self, *, keys: list[NodeNG], patterns: list[Pattern], rest: AssignName | None, ) -> None: self.keys = keys self.patterns = patterns self.rest = rest assigned_stmts = protocols.match_mapping_assigned_stmts """Returns the assigned statement (non inferred) according to the assignment type. See astroid/protocols.py for actual implementation. """ class MatchClass(Pattern): """Class representing a :class:`ast.MatchClass` node. >>> import astroid >>> node = astroid.extract_node(''' match x: case Point2D(0, 0): ... case Point3D(x=0, y=0, z=0): ... 
''') >>> node.cases[0].pattern >>> node.cases[1].pattern """ _astroid_fields = ("cls", "patterns", "kwd_patterns") _other_fields = ("kwd_attrs",) def __init__( self, lineno: int | None = None, col_offset: int | None = None, parent: NodeNG | None = None, *, end_lineno: int | None = None, end_col_offset: int | None = None, ) -> None: self.cls: NodeNG self.patterns: list[Pattern] self.kwd_attrs: list[str] self.kwd_patterns: list[Pattern] super().__init__( lineno=lineno, col_offset=col_offset, end_lineno=end_lineno, end_col_offset=end_col_offset, parent=parent, ) def postinit( self, *, cls: NodeNG, patterns: list[Pattern], kwd_attrs: list[str], kwd_patterns: list[Pattern], ) -> None: self.cls = cls self.patterns = patterns self.kwd_attrs = kwd_attrs self.kwd_patterns = kwd_patterns class MatchStar(_base_nodes.AssignTypeNode, Pattern): """Class representing a :class:`ast.MatchStar` node. >>> import astroid >>> node = astroid.extract_node(''' match x: case [1, *_]: ... ''') >>> node.cases[0].pattern.patterns[1] """ _astroid_fields = ("name",) def __init__( self, lineno: int | None = None, col_offset: int | None = None, parent: NodeNG | None = None, *, end_lineno: int | None = None, end_col_offset: int | None = None, ) -> None: self.name: AssignName | None super().__init__( lineno=lineno, col_offset=col_offset, end_lineno=end_lineno, end_col_offset=end_col_offset, parent=parent, ) def postinit(self, *, name: AssignName | None) -> None: self.name = name assigned_stmts = protocols.match_star_assigned_stmts """Returns the assigned statement (non inferred) according to the assignment type. See astroid/protocols.py for actual implementation. """ class MatchAs(_base_nodes.AssignTypeNode, Pattern): """Class representing a :class:`ast.MatchAs` node. >>> import astroid >>> node = astroid.extract_node(''' match x: case [1, a]: ... case {'key': b}: ... case Point2D(0, 0) as c: ... case d: ... 
''') >>> node.cases[0].pattern.patterns[1] >>> node.cases[1].pattern.patterns[0] >>> node.cases[2].pattern >>> node.cases[3].pattern """ _astroid_fields = ("pattern", "name") def __init__( self, lineno: int | None = None, col_offset: int | None = None, parent: NodeNG | None = None, *, end_lineno: int | None = None, end_col_offset: int | None = None, ) -> None: self.pattern: Pattern | None self.name: AssignName | None super().__init__( lineno=lineno, col_offset=col_offset, end_lineno=end_lineno, end_col_offset=end_col_offset, parent=parent, ) def postinit( self, *, pattern: Pattern | None, name: AssignName | None, ) -> None: self.pattern = pattern self.name = name assigned_stmts = protocols.match_as_assigned_stmts """Returns the assigned statement (non inferred) according to the assignment type. See astroid/protocols.py for actual implementation. """ class MatchOr(Pattern): """Class representing a :class:`ast.MatchOr` node. >>> import astroid >>> node = astroid.extract_node(''' match x: case 400 | 401 | 402: ... ''') >>> node.cases[0].pattern """ _astroid_fields = ("patterns",) def __init__( self, lineno: int | None = None, col_offset: int | None = None, parent: NodeNG | None = None, *, end_lineno: int | None = None, end_col_offset: int | None = None, ) -> None: self.patterns: list[Pattern] super().__init__( lineno=lineno, col_offset=col_offset, end_lineno=end_lineno, end_col_offset=end_col_offset, parent=parent, ) def postinit(self, *, patterns: list[Pattern]) -> None: self.patterns = patterns # constants ############################################################## # The _proxied attribute of all container types (List, Tuple, etc.) # are set during bootstrapping by _astroid_bootstrapping(). 
CONST_CLS: dict[type, type[NodeNG]] = { list: List, tuple: Tuple, dict: Dict, set: Set, type(None): Const, type(NotImplemented): Const, type(...): Const, bool: Const, int: Const, float: Const, complex: Const, str: Const, bytes: Const, } def _create_basic_elements( value: Iterable[Any], node: List | Set | Tuple ) -> list[NodeNG]: """Create a list of nodes to function as the elements of a new node.""" elements: list[NodeNG] = [] for element in value: element_node = const_factory(element) element_node.parent = node elements.append(element_node) return elements def _create_dict_items( values: Mapping[Any, Any], node: Dict ) -> list[tuple[SuccessfulInferenceResult, SuccessfulInferenceResult]]: """Create a list of node pairs to function as the items of a new dict node.""" elements: list[tuple[SuccessfulInferenceResult, SuccessfulInferenceResult]] = [] for key, value in values.items(): key_node = const_factory(key) key_node.parent = node value_node = const_factory(value) value_node.parent = node elements.append((key_node, value_node)) return elements def const_factory(value: Any) -> ConstFactoryResult: """Return an astroid node for a python value.""" assert not isinstance(value, NodeNG) # This only handles instances of the CONST types. Any # subclasses get inferred as EmptyNode. # TODO: See if we should revisit these with the normal builder. 
if value.__class__ not in CONST_CLS: node = EmptyNode() node.object = value return node instance: List | Set | Tuple | Dict initializer_cls = CONST_CLS[value.__class__] if issubclass(initializer_cls, (List, Set, Tuple)): instance = initializer_cls( lineno=None, col_offset=None, parent=None, end_lineno=None, end_col_offset=None, ) instance.postinit(_create_basic_elements(value, instance)) return instance if issubclass(initializer_cls, Dict): instance = initializer_cls( lineno=None, col_offset=None, parent=None, end_lineno=None, end_col_offset=None, ) instance.postinit(_create_dict_items(value, instance)) return instance return Const(value) astroid-3.2.2/astroid/nodes/__init__.py0000664000175000017500000001121014622475517017767 0ustar epsilonepsilon# Licensed under the LGPL: https://www.gnu.org/licenses/old-licenses/lgpl-2.1.en.html # For details: https://github.com/pylint-dev/astroid/blob/main/LICENSE # Copyright (c) https://github.com/pylint-dev/astroid/blob/main/CONTRIBUTORS.txt """Every available node class. .. seealso:: :doc:`ast documentation ` All nodes inherit from :class:`~astroid.nodes.node_classes.NodeNG`. """ # Nodes not present in the builtin ast module: DictUnpack, Unknown, and EvaluatedObject. 
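The `const_factory` helper above maps plain Python values to astroid nodes through `CONST_CLS`, recursively converting container elements and setting their parents. A quick sketch of its behaviour (not part of the source):

```python
# const_factory turns runtime values into astroid nodes.
from astroid import nodes

assert isinstance(nodes.const_factory(42), nodes.Const)
assert isinstance(nodes.const_factory([1, 2]), nodes.List)
assert isinstance(nodes.const_factory({"a": 1}), nodes.Dict)

# Container elements are converted via _create_basic_elements and
# parented to the new container node.
tup = nodes.const_factory((1, "two"))
assert isinstance(tup, nodes.Tuple)
assert all(isinstance(elt, nodes.Const) for elt in tup.elts)
assert all(elt.parent is tup for elt in tup.elts)
```

Values of types not listed in `CONST_CLS` fall through to an `EmptyNode` whose `object` attribute holds the original value.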
from astroid.nodes.node_classes import ( CONST_CLS, AnnAssign, Arguments, Assert, Assign, AssignAttr, AssignName, AsyncFor, AsyncWith, Attribute, AugAssign, Await, BaseContainer, BinOp, BoolOp, Break, Call, Compare, Comprehension, Const, Continue, Decorators, DelAttr, Delete, DelName, Dict, DictUnpack, EmptyNode, EvaluatedObject, ExceptHandler, Expr, For, FormattedValue, Global, If, IfExp, Import, ImportFrom, JoinedStr, Keyword, List, Match, MatchAs, MatchCase, MatchClass, MatchMapping, MatchOr, MatchSequence, MatchSingleton, MatchStar, MatchValue, Name, NamedExpr, NodeNG, Nonlocal, ParamSpec, Pass, Pattern, Raise, Return, Set, Slice, Starred, Subscript, Try, TryStar, Tuple, TypeAlias, TypeVar, TypeVarTuple, UnaryOp, Unknown, While, With, Yield, YieldFrom, are_exclusive, const_factory, unpack_infer, ) from astroid.nodes.scoped_nodes import ( AsyncFunctionDef, ClassDef, ComprehensionScope, DictComp, FunctionDef, GeneratorExp, Lambda, ListComp, LocalsDictNodeNG, Module, SetComp, builtin_lookup, function_to_method, get_wrapping_class, ) from astroid.nodes.utils import Position ALL_NODE_CLASSES = ( BaseContainer, AnnAssign, Arguments, Assert, Assign, AssignAttr, AssignName, AsyncFor, AsyncFunctionDef, AsyncWith, Attribute, AugAssign, Await, BinOp, BoolOp, Break, Call, ClassDef, Compare, Comprehension, ComprehensionScope, Const, const_factory, Continue, Decorators, DelAttr, Delete, DelName, Dict, DictComp, DictUnpack, EmptyNode, EvaluatedObject, ExceptHandler, Expr, For, FormattedValue, FunctionDef, GeneratorExp, Global, If, IfExp, Import, ImportFrom, JoinedStr, Keyword, Lambda, List, ListComp, LocalsDictNodeNG, Match, MatchAs, MatchCase, MatchClass, MatchMapping, MatchOr, MatchSequence, MatchSingleton, MatchStar, MatchValue, Module, Name, NamedExpr, NodeNG, Nonlocal, ParamSpec, Pass, Pattern, Raise, Return, Set, SetComp, Slice, Starred, Subscript, Try, TryStar, Tuple, TypeAlias, TypeVar, TypeVarTuple, UnaryOp, Unknown, While, With, Yield, YieldFrom, ) __all__ = ( 
"AnnAssign", "are_exclusive", "Arguments", "Assert", "Assign", "AssignAttr", "AssignName", "AsyncFor", "AsyncFunctionDef", "AsyncWith", "Attribute", "AugAssign", "Await", "BaseContainer", "BinOp", "BoolOp", "Break", "builtin_lookup", "Call", "ClassDef", "CONST_CLS", "Compare", "Comprehension", "ComprehensionScope", "Const", "const_factory", "Continue", "Decorators", "DelAttr", "Delete", "DelName", "Dict", "DictComp", "DictUnpack", "EmptyNode", "EvaluatedObject", "ExceptHandler", "Expr", "For", "FormattedValue", "FunctionDef", "function_to_method", "GeneratorExp", "get_wrapping_class", "Global", "If", "IfExp", "Import", "ImportFrom", "JoinedStr", "Keyword", "Lambda", "List", "ListComp", "LocalsDictNodeNG", "Match", "MatchAs", "MatchCase", "MatchClass", "MatchMapping", "MatchOr", "MatchSequence", "MatchSingleton", "MatchStar", "MatchValue", "Module", "Name", "NamedExpr", "NodeNG", "Nonlocal", "ParamSpec", "Pass", "Position", "Raise", "Return", "Set", "SetComp", "Slice", "Starred", "Subscript", "Try", "TryStar", "Tuple", "TypeAlias", "TypeVar", "TypeVarTuple", "UnaryOp", "Unknown", "unpack_infer", "While", "With", "Yield", "YieldFrom", ) astroid-3.2.2/astroid/nodes/as_string.py0000664000175000017500000006331514622475517020236 0ustar epsilonepsilon# Licensed under the LGPL: https://www.gnu.org/licenses/old-licenses/lgpl-2.1.en.html # For details: https://github.com/pylint-dev/astroid/blob/main/LICENSE # Copyright (c) https://github.com/pylint-dev/astroid/blob/main/CONTRIBUTORS.txt """This module renders Astroid nodes as string""" from __future__ import annotations import warnings from collections.abc import Iterator from typing import TYPE_CHECKING from astroid import nodes if TYPE_CHECKING: from astroid import objects from astroid.nodes import Const from astroid.nodes.node_classes import ( Match, MatchAs, MatchCase, MatchClass, MatchMapping, MatchOr, MatchSequence, MatchSingleton, MatchStar, MatchValue, Unknown, ) # pylint: disable=unused-argument DOC_NEWLINE = "\0" # 
Visitor pattern require argument all the time and is not better with staticmethod # noinspection PyUnusedLocal,PyMethodMayBeStatic class AsStringVisitor: """Visitor to render an Astroid node as a valid python code string""" def __init__(self, indent: str = " "): self.indent: str = indent def __call__(self, node) -> str: """Makes this visitor behave as a simple function""" return node.accept(self).replace(DOC_NEWLINE, "\n") def _docs_dedent(self, doc_node: Const | None) -> str: """Stop newlines in docs being indented by self._stmt_list""" if not doc_node: return "" return '\n{}"""{}"""'.format( self.indent, doc_node.value.replace("\n", DOC_NEWLINE) ) def _stmt_list(self, stmts: list, indent: bool = True) -> str: """return a list of nodes to string""" stmts_str: str = "\n".join( nstr for nstr in [n.accept(self) for n in stmts] if nstr ) if not indent: return stmts_str return self.indent + stmts_str.replace("\n", "\n" + self.indent) def _precedence_parens(self, node, child, is_left: bool = True) -> str: """Wrap child in parens only if required to keep same semantics""" if self._should_wrap(node, child, is_left): return f"({child.accept(self)})" return child.accept(self) def _should_wrap(self, node, child, is_left: bool) -> bool: """Wrap child if: - it has lower precedence - same precedence with position opposite to associativity direction """ node_precedence = node.op_precedence() child_precedence = child.op_precedence() if node_precedence > child_precedence: # 3 * (4 + 5) return True if ( node_precedence == child_precedence and is_left != node.op_left_associative() ): # 3 - (4 - 5) # (2**3)**4 return True return False # visit_ methods ########################################### def visit_await(self, node) -> str: return f"await {node.value.accept(self)}" def visit_asyncwith(self, node) -> str: return f"async {self.visit_with(node)}" def visit_asyncfor(self, node) -> str: return f"async {self.visit_for(node)}" def visit_arguments(self, node) -> str: """return an 
astroid.Function node as string""" return node.format_args() def visit_assignattr(self, node) -> str: """return an astroid.AssAttr node as string""" return self.visit_attribute(node) def visit_assert(self, node) -> str: """return an astroid.Assert node as string""" if node.fail: return f"assert {node.test.accept(self)}, {node.fail.accept(self)}" return f"assert {node.test.accept(self)}" def visit_assignname(self, node) -> str: """return an astroid.AssName node as string""" return node.name def visit_assign(self, node) -> str: """return an astroid.Assign node as string""" lhs = " = ".join(n.accept(self) for n in node.targets) return f"{lhs} = {node.value.accept(self)}" def visit_augassign(self, node) -> str: """return an astroid.AugAssign node as string""" return f"{node.target.accept(self)} {node.op} {node.value.accept(self)}" def visit_annassign(self, node) -> str: """Return an astroid.AugAssign node as string""" target = node.target.accept(self) annotation = node.annotation.accept(self) if node.value is None: return f"{target}: {annotation}" return f"{target}: {annotation} = {node.value.accept(self)}" def visit_binop(self, node) -> str: """return an astroid.BinOp node as string""" left = self._precedence_parens(node, node.left) right = self._precedence_parens(node, node.right, is_left=False) if node.op == "**": return f"{left}{node.op}{right}" return f"{left} {node.op} {right}" def visit_boolop(self, node) -> str: """return an astroid.BoolOp node as string""" values = [f"{self._precedence_parens(node, n)}" for n in node.values] return (f" {node.op} ").join(values) def visit_break(self, node) -> str: """return an astroid.Break node as string""" return "break" def visit_call(self, node) -> str: """return an astroid.Call node as string""" expr_str = self._precedence_parens(node, node.func) args = [arg.accept(self) for arg in node.args] if node.keywords: keywords = [kwarg.accept(self) for kwarg in node.keywords] else: keywords = [] args.extend(keywords) return 
f"{expr_str}({', '.join(args)})" def visit_classdef(self, node) -> str: """return an astroid.ClassDef node as string""" decorate = node.decorators.accept(self) if node.decorators else "" args = [n.accept(self) for n in node.bases] if node._metaclass and not node.has_metaclass_hack(): args.append("metaclass=" + node._metaclass.accept(self)) args += [n.accept(self) for n in node.keywords] args_str = f"({', '.join(args)})" if args else "" docs = self._docs_dedent(node.doc_node) # TODO: handle type_params return "\n\n{}class {}{}:{}\n{}\n".format( decorate, node.name, args_str, docs, self._stmt_list(node.body) ) def visit_compare(self, node) -> str: """return an astroid.Compare node as string""" rhs_str = " ".join( f"{op} {self._precedence_parens(node, expr, is_left=False)}" for op, expr in node.ops ) return f"{self._precedence_parens(node, node.left)} {rhs_str}" def visit_comprehension(self, node) -> str: """return an astroid.Comprehension node as string""" ifs = "".join(f" if {n.accept(self)}" for n in node.ifs) generated = f"for {node.target.accept(self)} in {node.iter.accept(self)}{ifs}" return f"{'async ' if node.is_async else ''}{generated}" def visit_const(self, node) -> str: """return an astroid.Const node as string""" if node.value is Ellipsis: return "..." 
return repr(node.value) def visit_continue(self, node) -> str: """return an astroid.Continue node as string""" return "continue" def visit_delete(self, node) -> str: # XXX check if correct """return an astroid.Delete node as string""" return f"del {', '.join(child.accept(self) for child in node.targets)}" def visit_delattr(self, node) -> str: """return an astroid.DelAttr node as string""" return self.visit_attribute(node) def visit_delname(self, node) -> str: """return an astroid.DelName node as string""" return node.name def visit_decorators(self, node) -> str: """return an astroid.Decorators node as string""" return "@%s\n" % "\n@".join(item.accept(self) for item in node.nodes) def visit_dict(self, node) -> str: """return an astroid.Dict node as string""" return "{%s}" % ", ".join(self._visit_dict(node)) def _visit_dict(self, node) -> Iterator[str]: for key, value in node.items: key = key.accept(self) value = value.accept(self) if key == "**": # It can only be a DictUnpack node. yield key + value else: yield f"{key}: {value}" def visit_dictunpack(self, node) -> str: return "**" def visit_dictcomp(self, node) -> str: """return an astroid.DictComp node as string""" return "{{{}: {} {}}}".format( node.key.accept(self), node.value.accept(self), " ".join(n.accept(self) for n in node.generators), ) def visit_expr(self, node) -> str: """return an astroid.Discard node as string""" return node.value.accept(self) def visit_emptynode(self, node) -> str: """dummy method for visiting an Empty node""" return "" def visit_excepthandler(self, node) -> str: n = "except" if isinstance(getattr(node, "parent", None), nodes.TryStar): n = "except*" if node.type: if node.name: excs = f"{n} {node.type.accept(self)} as {node.name.accept(self)}" else: excs = f"{n} {node.type.accept(self)}" else: excs = f"{n}" return f"{excs}:\n{self._stmt_list(node.body)}" def visit_empty(self, node) -> str: """return an Empty node as string""" return "" def visit_for(self, node) -> str: """return an 
astroid.For node as string""" fors = "for {} in {}:\n{}".format( node.target.accept(self), node.iter.accept(self), self._stmt_list(node.body) ) if node.orelse: fors = f"{fors}\nelse:\n{self._stmt_list(node.orelse)}" return fors def visit_importfrom(self, node) -> str: """return an astroid.ImportFrom node as string""" return "from {} import {}".format( "." * (node.level or 0) + node.modname, _import_string(node.names) ) def visit_joinedstr(self, node) -> str: string = "".join( # Use repr on the string literal parts # to get proper escapes, e.g. \n, \\, \" # But strip the quotes off the ends # (they will always be one character: ' or ") ( repr(value.value)[1:-1] # Literal braces must be doubled to escape them .replace("{", "{{").replace("}", "}}") # Each value in values is either a string literal (Const) # or a FormattedValue if type(value).__name__ == "Const" else value.accept(self) ) for value in node.values ) # Try to find surrounding quotes that don't appear at all in the string. # Because the formatted values inside {} can't contain backslash (\) # using a triple quote is sometimes necessary for quote in ("'", '"', '"""', "'''"): if quote not in string: break return "f" + quote + string + quote def visit_formattedvalue(self, node) -> str: result = node.value.accept(self) if node.conversion and node.conversion >= 0: # e.g. if node.conversion == 114: result += "!r" result += "!" + chr(node.conversion) if node.format_spec: # The format spec is itself a JoinedString, i.e. 
an f-string # We strip the f and quotes of the ends result += ":" + node.format_spec.accept(self)[2:-1] return "{%s}" % result def handle_functiondef(self, node: nodes.FunctionDef, keyword: str) -> str: """return a (possibly async) function definition node as string""" decorate = node.decorators.accept(self) if node.decorators else "" docs = self._docs_dedent(node.doc_node) trailer = ":" if node.returns: return_annotation = " -> " + node.returns.as_string() trailer = return_annotation + ":" # TODO: handle type_params def_format = "\n%s%s %s(%s)%s%s\n%s" return def_format % ( decorate, keyword, node.name, node.args.accept(self), trailer, docs, self._stmt_list(node.body), ) def visit_functiondef(self, node: nodes.FunctionDef) -> str: """return an astroid.FunctionDef node as string""" return self.handle_functiondef(node, "def") def visit_asyncfunctiondef(self, node: nodes.AsyncFunctionDef) -> str: """return an astroid.AsyncFunction node as string""" return self.handle_functiondef(node, "async def") def visit_generatorexp(self, node) -> str: """return an astroid.GeneratorExp node as string""" return "({} {})".format( node.elt.accept(self), " ".join(n.accept(self) for n in node.generators) ) def visit_attribute(self, node) -> str: """return an astroid.Getattr node as string""" try: left = self._precedence_parens(node, node.expr) except RecursionError: warnings.warn( "Recursion limit exhausted; defaulting to adding parentheses.", UserWarning, stacklevel=2, ) left = f"({node.expr.accept(self)})" if left.isdigit(): left = f"({left})" return f"{left}.{node.attrname}" def visit_global(self, node) -> str: """return an astroid.Global node as string""" return f"global {', '.join(node.names)}" def visit_if(self, node) -> str: """return an astroid.If node as string""" ifs = [f"if {node.test.accept(self)}:\n{self._stmt_list(node.body)}"] if node.has_elif_block(): ifs.append(f"el{self._stmt_list(node.orelse, indent=False)}") elif node.orelse: 
ifs.append(f"else:\n{self._stmt_list(node.orelse)}") return "\n".join(ifs) def visit_ifexp(self, node) -> str: """return an astroid.IfExp node as string""" return "{} if {} else {}".format( self._precedence_parens(node, node.body, is_left=True), self._precedence_parens(node, node.test, is_left=True), self._precedence_parens(node, node.orelse, is_left=False), ) def visit_import(self, node) -> str: """return an astroid.Import node as string""" return f"import {_import_string(node.names)}" def visit_keyword(self, node) -> str: """return an astroid.Keyword node as string""" if node.arg is None: return f"**{node.value.accept(self)}" return f"{node.arg}={node.value.accept(self)}" def visit_lambda(self, node) -> str: """return an astroid.Lambda node as string""" args = node.args.accept(self) body = node.body.accept(self) if args: return f"lambda {args}: {body}" return f"lambda: {body}" def visit_list(self, node) -> str: """return an astroid.List node as string""" return f"[{', '.join(child.accept(self) for child in node.elts)}]" def visit_listcomp(self, node) -> str: """return an astroid.ListComp node as string""" return "[{} {}]".format( node.elt.accept(self), " ".join(n.accept(self) for n in node.generators) ) def visit_module(self, node) -> str: """return an astroid.Module node as string""" docs = f'"""{node.doc_node.value}"""\n\n' if node.doc_node else "" return docs + "\n".join(n.accept(self) for n in node.body) + "\n\n" def visit_name(self, node) -> str: """return an astroid.Name node as string""" return node.name def visit_namedexpr(self, node) -> str: """Return an assignment expression node as string""" target = node.target.accept(self) value = node.value.accept(self) return f"{target} := {value}" def visit_nonlocal(self, node) -> str: """return an astroid.Nonlocal node as string""" return f"nonlocal {', '.join(node.names)}" def visit_paramspec(self, node: nodes.ParamSpec) -> str: """return an astroid.ParamSpec node as string""" return node.name.accept(self) def 
visit_pass(self, node) -> str: """return an astroid.Pass node as string""" return "pass" def visit_partialfunction(self, node: objects.PartialFunction) -> str: """Return an objects.PartialFunction as string.""" return self.visit_functiondef(node) def visit_raise(self, node) -> str: """return an astroid.Raise node as string""" if node.exc: if node.cause: return f"raise {node.exc.accept(self)} from {node.cause.accept(self)}" return f"raise {node.exc.accept(self)}" return "raise" def visit_return(self, node) -> str: """return an astroid.Return node as string""" if node.is_tuple_return() and len(node.value.elts) > 1: elts = [child.accept(self) for child in node.value.elts] return f"return {', '.join(elts)}" if node.value: return f"return {node.value.accept(self)}" return "return" def visit_set(self, node) -> str: """return an astroid.Set node as string""" return "{%s}" % ", ".join(child.accept(self) for child in node.elts) def visit_setcomp(self, node) -> str: """return an astroid.SetComp node as string""" return "{{{} {}}}".format( node.elt.accept(self), " ".join(n.accept(self) for n in node.generators) ) def visit_slice(self, node) -> str: """return an astroid.Slice node as string""" lower = node.lower.accept(self) if node.lower else "" upper = node.upper.accept(self) if node.upper else "" step = node.step.accept(self) if node.step else "" if step: return f"{lower}:{upper}:{step}" return f"{lower}:{upper}" def visit_subscript(self, node) -> str: """return an astroid.Subscript node as string""" idx = node.slice if idx.__class__.__name__.lower() == "index": idx = idx.value idxstr = idx.accept(self) if idx.__class__.__name__.lower() == "tuple" and idx.elts: # Remove parenthesis in tuple and extended slice. # a[(::1, 1:)] is not valid syntax. 
idxstr = idxstr[1:-1] return f"{self._precedence_parens(node, node.value)}[{idxstr}]" def visit_try(self, node) -> str: """return an astroid.Try node as string""" trys = [f"try:\n{self._stmt_list(node.body)}"] for handler in node.handlers: trys.append(handler.accept(self)) if node.orelse: trys.append(f"else:\n{self._stmt_list(node.orelse)}") if node.finalbody: trys.append(f"finally:\n{self._stmt_list(node.finalbody)}") return "\n".join(trys) def visit_trystar(self, node) -> str: """return an astroid.TryStar node as string""" trys = [f"try:\n{self._stmt_list(node.body)}"] for handler in node.handlers: trys.append(handler.accept(self)) if node.orelse: trys.append(f"else:\n{self._stmt_list(node.orelse)}") if node.finalbody: trys.append(f"finally:\n{self._stmt_list(node.finalbody)}") return "\n".join(trys) def visit_tuple(self, node) -> str: """return an astroid.Tuple node as string""" if len(node.elts) == 1: return f"({node.elts[0].accept(self)}, )" return f"({', '.join(child.accept(self) for child in node.elts)})" def visit_typealias(self, node: nodes.TypeAlias) -> str: """return an astroid.TypeAlias node as string""" return node.name.accept(self) if node.name else "_" def visit_typevar(self, node: nodes.TypeVar) -> str: """return an astroid.TypeVar node as string""" return node.name.accept(self) if node.name else "_" def visit_typevartuple(self, node: nodes.TypeVarTuple) -> str: """return an astroid.TypeVarTuple node as string""" return "*" + node.name.accept(self) if node.name else "" def visit_unaryop(self, node) -> str: """return an astroid.UnaryOp node as string""" if node.op == "not": operator = "not " else: operator = node.op return f"{operator}{self._precedence_parens(node, node.operand)}" def visit_while(self, node) -> str: """return an astroid.While node as string""" whiles = f"while {node.test.accept(self)}:\n{self._stmt_list(node.body)}" if node.orelse: whiles = f"{whiles}\nelse:\n{self._stmt_list(node.orelse)}" return whiles def visit_with(self, node) -> 
str: # 'with' without 'as' is possible """return an astroid.With node as string""" items = ", ".join( f"{expr.accept(self)}" + (v and f" as {v.accept(self)}" or "") for expr, v in node.items ) return f"with {items}:\n{self._stmt_list(node.body)}" def visit_yield(self, node) -> str: """yield an ast.Yield node as string""" yi_val = (" " + node.value.accept(self)) if node.value else "" expr = "yield" + yi_val if node.parent.is_statement: return expr return f"({expr})" def visit_yieldfrom(self, node) -> str: """Return an astroid.YieldFrom node as string.""" yi_val = (" " + node.value.accept(self)) if node.value else "" expr = "yield from" + yi_val if node.parent.is_statement: return expr return f"({expr})" def visit_starred(self, node) -> str: """return Starred node as string""" return "*" + node.value.accept(self) def visit_match(self, node: Match) -> str: """Return an astroid.Match node as string.""" return f"match {node.subject.accept(self)}:\n{self._stmt_list(node.cases)}" def visit_matchcase(self, node: MatchCase) -> str: """Return an astroid.MatchCase node as string.""" guard_str = f" if {node.guard.accept(self)}" if node.guard else "" return ( f"case {node.pattern.accept(self)}{guard_str}:\n" f"{self._stmt_list(node.body)}" ) def visit_matchvalue(self, node: MatchValue) -> str: """Return an astroid.MatchValue node as string.""" return node.value.accept(self) @staticmethod def visit_matchsingleton(node: MatchSingleton) -> str: """Return an astroid.MatchSingleton node as string.""" return str(node.value) def visit_matchsequence(self, node: MatchSequence) -> str: """Return an astroid.MatchSequence node as string.""" if node.patterns is None: return "[]" return f"[{', '.join(p.accept(self) for p in node.patterns)}]" def visit_matchmapping(self, node: MatchMapping) -> str: """Return an astroid.MatchMapping node as string.""" mapping_strings: list[str] = [] if node.keys and node.patterns: mapping_strings.extend( f"{key.accept(self)}: {p.accept(self)}" for key, p in 
zip(node.keys, node.patterns) ) if node.rest: mapping_strings.append(f"**{node.rest.accept(self)}") return f"{'{'}{', '.join(mapping_strings)}{'}'}" def visit_matchclass(self, node: MatchClass) -> str: """Return an astroid.MatchClass node as string.""" if node.cls is None: raise AssertionError(f"{node} does not have a 'cls' node") class_strings: list[str] = [] if node.patterns: class_strings.extend(p.accept(self) for p in node.patterns) if node.kwd_attrs and node.kwd_patterns: for attr, pattern in zip(node.kwd_attrs, node.kwd_patterns): class_strings.append(f"{attr}={pattern.accept(self)}") return f"{node.cls.accept(self)}({', '.join(class_strings)})" def visit_matchstar(self, node: MatchStar) -> str: """Return an astroid.MatchStar node as string.""" return f"*{node.name.accept(self) if node.name else '_'}" def visit_matchas(self, node: MatchAs) -> str: """Return an astroid.MatchAs node as string.""" # pylint: disable=import-outside-toplevel # Prevent circular dependency from astroid.nodes.node_classes import MatchClass, MatchMapping, MatchSequence if isinstance(node.parent, (MatchSequence, MatchMapping, MatchClass)): return node.name.accept(self) if node.name else "_" return ( f"{node.pattern.accept(self) if node.pattern else '_'}" f"{f' as {node.name.accept(self)}' if node.name else ''}" ) def visit_matchor(self, node: MatchOr) -> str: """Return an astroid.MatchOr node as string.""" if node.patterns is None: raise AssertionError(f"{node} does not have pattern nodes") return " | ".join(p.accept(self) for p in node.patterns) # These aren't for real AST nodes, but for inference objects. 
def visit_frozenset(self, node): return node.parent.accept(self) def visit_super(self, node): return node.parent.accept(self) def visit_uninferable(self, node): return str(node) def visit_property(self, node): return node.function.accept(self) def visit_evaluatedobject(self, node): return node.original.accept(self) def visit_unknown(self, node: Unknown) -> str: return str(node) def _import_string(names) -> str: """return a list of (name, asname) formatted as a string""" _names = [] for name, asname in names: if asname is not None: _names.append(f"{name} as {asname}") else: _names.append(name) return ", ".join(_names) # This sets the default indent to 4 spaces. to_code = AsStringVisitor(" ") astroid-3.2.2/astroid/nodes/const.py0000664000175000017500000000144714622475517017371 0ustar epsilonepsilon# Licensed under the LGPL: https://www.gnu.org/licenses/old-licenses/lgpl-2.1.en.html # For details: https://github.com/pylint-dev/astroid/blob/main/LICENSE # Copyright (c) https://github.com/pylint-dev/astroid/blob/main/CONTRIBUTORS.txt OP_PRECEDENCE = { op: precedence for precedence, ops in enumerate( [ ["Lambda"], # lambda x: x + 1 ["IfExp"], # 1 if True else 2 ["or"], ["and"], ["not"], ["Compare"], # in, not in, is, is not, <, <=, >, >=, !=, == ["|"], ["^"], ["&"], ["<<", ">>"], ["+", "-"], ["*", "@", "/", "//", "%"], ["UnaryOp"], # +, -, ~ ["**"], ["Await"], ] ) for op in ops } astroid-3.2.2/astroid/nodes/node_ng.py0000664000175000017500000006464514622475517017665 0ustar epsilonepsilon# Licensed under the LGPL: https://www.gnu.org/licenses/old-licenses/lgpl-2.1.en.html # For details: https://github.com/pylint-dev/astroid/blob/main/LICENSE # Copyright (c) https://github.com/pylint-dev/astroid/blob/main/CONTRIBUTORS.txt from __future__ import annotations import pprint import sys import warnings from collections.abc import Generator, Iterator from functools import cached_property from functools import singledispatch as _singledispatch from typing import ( TYPE_CHECKING, 
Any, ClassVar, Literal, Tuple, Type, TypeVar, Union, cast, overload, ) from astroid import util from astroid.context import InferenceContext from astroid.exceptions import ( AstroidError, InferenceError, ParentMissingError, StatementMissing, UseInferenceDefault, ) from astroid.manager import AstroidManager from astroid.nodes.as_string import AsStringVisitor from astroid.nodes.const import OP_PRECEDENCE from astroid.nodes.utils import Position from astroid.typing import InferenceErrorInfo, InferenceResult, InferFn if sys.version_info >= (3, 11): from typing import Self else: from typing_extensions import Self if TYPE_CHECKING: from astroid import nodes from astroid.nodes import _base_nodes # Types for 'NodeNG.nodes_of_class()' _NodesT = TypeVar("_NodesT", bound="NodeNG") _NodesT2 = TypeVar("_NodesT2", bound="NodeNG") _NodesT3 = TypeVar("_NodesT3", bound="NodeNG") SkipKlassT = Union[None, Type["NodeNG"], Tuple[Type["NodeNG"], ...]] class NodeNG: """A node of the new Abstract Syntax Tree (AST). This is the base class for all Astroid node classes. """ is_statement: ClassVar[bool] = False """Whether this node indicates a statement.""" optional_assign: ClassVar[bool] = ( False # True for For (and for Comprehension if py <3.0) ) """Whether this node optionally assigns a variable. This is for loop assignments because loop won't necessarily perform an assignment if the loop has no iterations. This is also the case from comprehensions in Python 2. """ is_function: ClassVar[bool] = False # True for FunctionDef nodes """Whether this node indicates a function.""" is_lambda: ClassVar[bool] = False # Attributes below are set by the builder module or by raw factories _astroid_fields: ClassVar[tuple[str, ...]] = () """Node attributes that contain child nodes. This is redefined in most concrete classes. 
""" _other_fields: ClassVar[tuple[str, ...]] = () """Node attributes that do not contain child nodes.""" _other_other_fields: ClassVar[tuple[str, ...]] = () """Attributes that contain AST-dependent fields.""" # instance specific inference function infer(node, context) _explicit_inference: InferFn[Self] | None = None def __init__( self, lineno: int | None, col_offset: int | None, parent: NodeNG | None, *, end_lineno: int | None, end_col_offset: int | None, ) -> None: self.lineno = lineno """The line that this node appears on in the source code.""" self.col_offset = col_offset """The column that this node appears on in the source code.""" self.parent = parent """The parent node in the syntax tree.""" self.end_lineno = end_lineno """The last line this node appears on in the source code.""" self.end_col_offset = end_col_offset """The end column this node appears on in the source code. Note: This is after the last symbol. """ self.position: Position | None = None """Position of keyword(s) and name. Used as fallback for block nodes which might not provide good enough positional information. E.g. ClassDef, FunctionDef. """ def infer( self, context: InferenceContext | None = None, **kwargs: Any ) -> Generator[InferenceResult, None, None]: """Get a generator of the inferred values. This is the main entry point to the inference system. .. seealso:: :ref:`inference` If the instance has some explicit inference function set, it will be called instead of the default interface. :returns: The inferred values. 
:rtype: iterable """ if context is None: context = InferenceContext() else: context = context.extra_context.get(self, context) if self._explicit_inference is not None: # explicit_inference is not bound, give it self explicitly try: for result in self._explicit_inference( self, # type: ignore[arg-type] context, **kwargs, ): context.nodes_inferred += 1 yield result return except UseInferenceDefault: pass key = (self, context.lookupname, context.callcontext, context.boundnode) if key in context.inferred: yield from context.inferred[key] return results = [] # Limit inference amount to help with performance issues with # exponentially exploding possible results. limit = AstroidManager().max_inferable_values for i, result in enumerate(self._infer(context=context, **kwargs)): if i >= limit or (context.nodes_inferred > context.max_inferred): results.append(util.Uninferable) yield util.Uninferable break results.append(result) yield result context.nodes_inferred += 1 # Cache generated results for subsequent inferences of the # same node using the same context context.inferred[key] = tuple(results) return def repr_name(self) -> str: """Get a name for nice representation. This is either :attr:`name`, :attr:`attrname`, or the empty string. 
""" if all(name not in self._astroid_fields for name in ("name", "attrname")): return getattr(self, "name", "") or getattr(self, "attrname", "") return "" def __str__(self) -> str: rname = self.repr_name() cname = type(self).__name__ if rname: string = "%(cname)s.%(rname)s(%(fields)s)" alignment = len(cname) + len(rname) + 2 else: string = "%(cname)s(%(fields)s)" alignment = len(cname) + 1 result = [] for field in self._other_fields + self._astroid_fields: value = getattr(self, field, "Unknown") width = 80 - len(field) - alignment lines = pprint.pformat(value, indent=2, width=width).splitlines(True) inner = [lines[0]] for line in lines[1:]: inner.append(" " * alignment + line) result.append(f"{field}={''.join(inner)}") return string % { "cname": cname, "rname": rname, "fields": (",\n" + " " * alignment).join(result), } def __repr__(self) -> str: rname = self.repr_name() # The dependencies used to calculate fromlineno (if not cached) may not exist at the time try: lineno = self.fromlineno except AttributeError: lineno = 0 if rname: string = "<%(cname)s.%(rname)s l.%(lineno)s at 0x%(id)x>" else: string = "<%(cname)s l.%(lineno)s at 0x%(id)x>" return string % { "cname": type(self).__name__, "rname": rname, "lineno": lineno, "id": id(self), } def accept(self, visitor: AsStringVisitor) -> str: """Visit this node using the given visitor.""" func = getattr(visitor, "visit_" + self.__class__.__name__.lower()) return func(self) def get_children(self) -> Iterator[NodeNG]: """Get the child nodes below this node.""" for field in self._astroid_fields: attr = getattr(self, field) if attr is None: continue if isinstance(attr, (list, tuple)): yield from attr else: yield attr yield from () def last_child(self) -> NodeNG | None: """An optimized version of list(get_children())[-1].""" for field in self._astroid_fields[::-1]: attr = getattr(self, field) if not attr: # None or empty list / tuple continue if isinstance(attr, (list, tuple)): return attr[-1] return attr return None def 
node_ancestors(self) -> Iterator[NodeNG]: """Yield parent, grandparent, etc until there are no more.""" parent = self.parent while parent is not None: yield parent parent = parent.parent def parent_of(self, node) -> bool: """Check if this node is the parent of the given node. :param node: The node to check if it is the child. :type node: NodeNG :returns: Whether this node is the parent of the given node. """ return any(self is parent for parent in node.node_ancestors()) def statement(self, *, future: Literal[None, True] = None) -> _base_nodes.Statement: """The first parent node, including self, marked as statement node. :raises StatementMissing: If self has no parent attribute. """ if future is not None: warnings.warn( "The future arg will be removed in astroid 4.0.", DeprecationWarning, stacklevel=2, ) if self.is_statement: return cast("_base_nodes.Statement", self) if not self.parent: raise StatementMissing(target=self) return self.parent.statement() def frame( self, *, future: Literal[None, True] = None ) -> nodes.FunctionDef | nodes.Module | nodes.ClassDef | nodes.Lambda: """The first parent frame node. A frame node is a :class:`Module`, :class:`FunctionDef`, :class:`ClassDef` or :class:`Lambda`. :returns: The first parent frame node. :raises ParentMissingError: If self has no parent attribute. """ if future is not None: warnings.warn( "The future arg will be removed in astroid 4.0.", DeprecationWarning, stacklevel=2, ) if self.parent is None: raise ParentMissingError(target=self) return self.parent.frame(future=future) def scope(self) -> nodes.LocalsDictNodeNG: """The first parent node defining a new scope. These can be Module, FunctionDef, ClassDef, Lambda, or GeneratorExp nodes. :returns: The first parent scope node. """ if not self.parent: raise ParentMissingError(target=self) return self.parent.scope() def root(self) -> nodes.Module: """Return the root node of the syntax tree. :returns: The root node. 
""" if not (parent := self.parent): return self # type: ignore[return-value] # Only 'Module' does not have a parent node. while parent.parent: parent = parent.parent return parent # type: ignore[return-value] # Only 'Module' does not have a parent node. def child_sequence(self, child): """Search for the sequence that contains this child. :param child: The child node to search sequences for. :type child: NodeNG :returns: The sequence containing the given child node. :rtype: iterable(NodeNG) :raises AstroidError: If no sequence could be found that contains the given child. """ for field in self._astroid_fields: node_or_sequence = getattr(self, field) if node_or_sequence is child: return [node_or_sequence] # /!\ compiler.ast Nodes have an __iter__ walking over child nodes if ( isinstance(node_or_sequence, (tuple, list)) and child in node_or_sequence ): return node_or_sequence msg = "Could not find %s in %s's children" raise AstroidError(msg % (repr(child), repr(self))) def locate_child(self, child): """Find the field of this node that contains the given child. :param child: The child node to search fields for. :type child: NodeNG :returns: A tuple of the name of the field that contains the child, and the sequence or node that contains the child node. :rtype: tuple(str, iterable(NodeNG) or NodeNG) :raises AstroidError: If no field could be found that contains the given child. """ for field in self._astroid_fields: node_or_sequence = getattr(self, field) # /!\ compiler.ast Nodes have an __iter__ walking over child nodes if child is node_or_sequence: return field, child if ( isinstance(node_or_sequence, (tuple, list)) and child in node_or_sequence ): return field, node_or_sequence msg = "Could not find %s in %s's children" raise AstroidError(msg % (repr(child), repr(self))) # FIXME : should we merge child_sequence and locate_child ? locate_child # is only used in are_exclusive, child_sequence one time in pylint. def next_sibling(self): """The next sibling statement node. 
:returns: The next sibling statement node. :rtype: NodeNG or None """ return self.parent.next_sibling() def previous_sibling(self): """The previous sibling statement. :returns: The previous sibling statement node. :rtype: NodeNG or None """ return self.parent.previous_sibling() # these are lazy because they're relatively expensive to compute for every # single node, and they rarely get looked at @cached_property def fromlineno(self) -> int: """The first line that this node appears on in the source code. Can also return 0 if the line can not be determined. """ if self.lineno is None: return self._fixed_source_line() return self.lineno @cached_property def tolineno(self) -> int: """The last line that this node appears on in the source code. Can also return 0 if the line can not be determined. """ if self.end_lineno is not None: return self.end_lineno if not self._astroid_fields: # can't have children last_child = None else: last_child = self.last_child() if last_child is None: return self.fromlineno return last_child.tolineno def _fixed_source_line(self) -> int: """Attempt to find the line that this node appears on. We need this method since not all nodes have :attr:`lineno` set. Will return 0 if the line number can not be determined. """ line = self.lineno _node = self try: while line is None: _node = next(_node.get_children()) line = _node.lineno except StopIteration: parent = self.parent while parent and line is None: line = parent.lineno parent = parent.parent return line or 0 def block_range(self, lineno: int) -> tuple[int, int]: """Get a range from the given line number to where this node ends. :param lineno: The line number to start the range at. :returns: The range of line numbers that this node belongs to, starting at the given line number. """ return lineno, self.tolineno def set_local(self, name: str, stmt: NodeNG) -> None: """Define that the given name is declared in the given statement node. This definition is stored on the parent scope node. .. 
seealso:: :meth:`scope` :param name: The name that is being defined. :param stmt: The statement that defines the given name. """ assert self.parent self.parent.set_local(name, stmt) @overload def nodes_of_class( self, klass: type[_NodesT], skip_klass: SkipKlassT = ..., ) -> Iterator[_NodesT]: ... @overload def nodes_of_class( self, klass: tuple[type[_NodesT], type[_NodesT2]], skip_klass: SkipKlassT = ..., ) -> Iterator[_NodesT] | Iterator[_NodesT2]: ... @overload def nodes_of_class( self, klass: tuple[type[_NodesT], type[_NodesT2], type[_NodesT3]], skip_klass: SkipKlassT = ..., ) -> Iterator[_NodesT] | Iterator[_NodesT2] | Iterator[_NodesT3]: ... @overload def nodes_of_class( self, klass: tuple[type[_NodesT], ...], skip_klass: SkipKlassT = ..., ) -> Iterator[_NodesT]: ... def nodes_of_class( # type: ignore[misc] # mypy doesn't correctly recognize the overloads self, klass: ( type[_NodesT] | tuple[type[_NodesT], type[_NodesT2]] | tuple[type[_NodesT], type[_NodesT2], type[_NodesT3]] | tuple[type[_NodesT], ...] ), skip_klass: SkipKlassT = None, ) -> Iterator[_NodesT] | Iterator[_NodesT2] | Iterator[_NodesT3]: """Get the nodes (including this one or below) of the given types. :param klass: The types of node to search for. :param skip_klass: The types of node to ignore. This is useful to ignore subclasses of :attr:`klass`. :returns: The node of the given types. 
""" if isinstance(self, klass): yield self if skip_klass is None: for child_node in self.get_children(): yield from child_node.nodes_of_class(klass, skip_klass) return for child_node in self.get_children(): if isinstance(child_node, skip_klass): continue yield from child_node.nodes_of_class(klass, skip_klass) @cached_property def _assign_nodes_in_scope(self) -> list[nodes.Assign]: return [] def _get_name_nodes(self): for child_node in self.get_children(): yield from child_node._get_name_nodes() def _get_return_nodes_skip_functions(self): yield from () def _get_yield_nodes_skip_functions(self): yield from () def _get_yield_nodes_skip_lambdas(self): yield from () def _infer_name(self, frame, name): # overridden for ImportFrom, Import, Global, Try, TryStar and Arguments pass def _infer( self, context: InferenceContext | None = None, **kwargs: Any ) -> Generator[InferenceResult, None, InferenceErrorInfo | None]: """We don't know how to resolve a statement by default.""" # this method is overridden by most concrete classes raise InferenceError( "No inference function for {node!r}.", node=self, context=context ) def inferred(self): """Get a list of the inferred values. .. seealso:: :ref:`inference` :returns: The inferred values. :rtype: list """ return list(self.infer()) def instantiate_class(self): """Instantiate an instance of the defined class. .. note:: On anything other than a :class:`ClassDef` this will return self. :returns: An instance of the defined class. :rtype: object """ return self def has_base(self, node) -> bool: """Check if this node inherits from the given type. :param node: The node defining the base to look for. Usually this is a :class:`Name` node. :type node: NodeNG """ return False def callable(self) -> bool: """Whether this node defines something that is callable. :returns: Whether this defines something that is callable. 
""" return False def eq(self, value) -> bool: return False def as_string(self) -> str: """Get the source code that this node represents.""" return AsStringVisitor()(self) def repr_tree( self, ids=False, include_linenos=False, ast_state=False, indent=" ", max_depth=0, max_width=80, ) -> str: """Get a string representation of the AST from this node. :param ids: If true, includes the ids with the node type names. :type ids: bool :param include_linenos: If true, includes the line numbers and column offsets. :type include_linenos: bool :param ast_state: If true, includes information derived from the whole AST like local and global variables. :type ast_state: bool :param indent: A string to use to indent the output string. :type indent: str :param max_depth: If set to a positive integer, won't return nodes deeper than max_depth in the string. :type max_depth: int :param max_width: Attempt to format the output string to stay within this number of characters, but can exceed it under some circumstances. Only positive integer values are valid, the default is 80. :type max_width: int :returns: The string representation of the AST. :rtype: str """ @_singledispatch def _repr_tree(node, result, done, cur_indent="", depth=1): """Outputs a representation of a non-tuple/list, non-node that's contained within an AST, including strings. """ lines = pprint.pformat( node, width=max(max_width - len(cur_indent), 1) ).splitlines(True) result.append(lines[0]) result.extend([cur_indent + line for line in lines[1:]]) return len(lines) != 1 # pylint: disable=unused-variable,useless-suppression; doesn't understand singledispatch @_repr_tree.register(tuple) @_repr_tree.register(list) def _repr_seq(node, result, done, cur_indent="", depth=1): """Outputs a representation of a sequence that's contained within an AST. 
""" cur_indent += indent result.append("[") if not node: broken = False elif len(node) == 1: broken = _repr_tree(node[0], result, done, cur_indent, depth) elif len(node) == 2: broken = _repr_tree(node[0], result, done, cur_indent, depth) if not broken: result.append(", ") else: result.append(",\n") result.append(cur_indent) broken = _repr_tree(node[1], result, done, cur_indent, depth) or broken else: result.append("\n") result.append(cur_indent) for child in node[:-1]: _repr_tree(child, result, done, cur_indent, depth) result.append(",\n") result.append(cur_indent) _repr_tree(node[-1], result, done, cur_indent, depth) broken = True result.append("]") return broken # pylint: disable=unused-variable,useless-suppression; doesn't understand singledispatch @_repr_tree.register(NodeNG) def _repr_node(node, result, done, cur_indent="", depth=1): """Outputs a strings representation of an astroid node.""" if node in done: result.append( indent + f" max_depth: result.append("...") return False depth += 1 cur_indent += indent if ids: result.append(f"{type(node).__name__}<0x{id(node):x}>(\n") else: result.append(f"{type(node).__name__}(") fields = [] if include_linenos: fields.extend(("lineno", "col_offset")) fields.extend(node._other_fields) fields.extend(node._astroid_fields) if ast_state: fields.extend(node._other_other_fields) if not fields: broken = False elif len(fields) == 1: result.append(f"{fields[0]}=") broken = _repr_tree( getattr(node, fields[0]), result, done, cur_indent, depth ) else: result.append("\n") result.append(cur_indent) for field in fields[:-1]: # TODO: Remove this after removal of the 'doc' attribute if field == "doc": continue result.append(f"{field}=") _repr_tree(getattr(node, field), result, done, cur_indent, depth) result.append(",\n") result.append(cur_indent) result.append(f"{fields[-1]}=") _repr_tree(getattr(node, fields[-1]), result, done, cur_indent, depth) broken = True result.append(")") return broken result: list[str] = [] _repr_tree(self, 
result, set()) return "".join(result) def bool_value(self, context: InferenceContext | None = None): """Determine the boolean value of this node. The boolean value of a node can have three possible values: * False: For instance, empty data structures, False, empty strings, instances which return explicitly False from the __nonzero__ / __bool__ method. * True: Most of constructs are True by default: classes, functions, modules etc * Uninferable: The inference engine is uncertain of the node's value. :returns: The boolean value of this node. :rtype: bool or Uninferable """ return util.Uninferable def op_precedence(self): # Look up by class name or default to highest precedence return OP_PRECEDENCE.get(self.__class__.__name__, len(OP_PRECEDENCE)) def op_left_associative(self) -> bool: # Everything is left associative except `**` and IfExp return True astroid-3.2.2/astroid/nodes/utils.py0000664000175000017500000000066114622475517017400 0ustar epsilonepsilon# Licensed under the LGPL: https://www.gnu.org/licenses/old-licenses/lgpl-2.1.en.html # For details: https://github.com/pylint-dev/astroid/blob/main/LICENSE # Copyright (c) https://github.com/pylint-dev/astroid/blob/main/CONTRIBUTORS.txt from typing import NamedTuple class Position(NamedTuple): """Position with line and column information.""" lineno: int col_offset: int end_lineno: int end_col_offset: int astroid-3.2.2/astroid/nodes/_base_nodes.py0000664000175000017500000005705714622475517020514 0ustar epsilonepsilon# Licensed under the LGPL: https://www.gnu.org/licenses/old-licenses/lgpl-2.1.en.html # For details: https://github.com/pylint-dev/astroid/blob/main/LICENSE # Copyright (c) https://github.com/pylint-dev/astroid/blob/main/CONTRIBUTORS.txt """This module contains some base nodes that can be inherited for the different nodes. Previously these were called Mixin nodes. 
""" from __future__ import annotations import itertools from collections.abc import Generator, Iterator from functools import cached_property, lru_cache, partial from typing import TYPE_CHECKING, Any, Callable, ClassVar, Optional, Union from astroid import bases, nodes, util from astroid.const import PY310_PLUS from astroid.context import ( CallContext, InferenceContext, bind_context_to_node, ) from astroid.exceptions import ( AttributeInferenceError, InferenceError, ) from astroid.interpreter import dunder_lookup from astroid.nodes.node_ng import NodeNG from astroid.typing import InferenceResult if TYPE_CHECKING: from astroid.nodes.node_classes import LocalsDictNodeNG GetFlowFactory = Callable[ [ InferenceResult, Optional[InferenceResult], Union[nodes.AugAssign, nodes.BinOp], InferenceResult, Optional[InferenceResult], InferenceContext, InferenceContext, ], list[partial[Generator[InferenceResult, None, None]]], ] class Statement(NodeNG): """Statement node adding a few attributes. NOTE: This class is part of the public API of 'astroid.nodes'. """ is_statement = True """Whether this node indicates a statement.""" def next_sibling(self): """The next sibling statement node. :returns: The next sibling statement node. :rtype: NodeNG or None """ stmts = self.parent.child_sequence(self) index = stmts.index(self) try: return stmts[index + 1] except IndexError: return None def previous_sibling(self): """The previous sibling statement. :returns: The previous sibling statement node. :rtype: NodeNG or None """ stmts = self.parent.child_sequence(self) index = stmts.index(self) if index >= 1: return stmts[index - 1] return None class NoChildrenNode(NodeNG): """Base nodes for nodes with no children, e.g. 
Pass.""" def get_children(self) -> Iterator[NodeNG]: yield from () class FilterStmtsBaseNode(NodeNG): """Base node for statement filtering and assignment type.""" def _get_filtered_stmts(self, _, node, _stmts, mystmt: Statement | None): """Method used in _filter_stmts to get statements and trigger break.""" if self.statement() is mystmt: # original node's statement is the assignment, only keep # current node (gen exp, list comp) return [node], True return _stmts, False def assign_type(self): return self class AssignTypeNode(NodeNG): """Base node for nodes that can 'assign' such as AnnAssign.""" def assign_type(self): return self def _get_filtered_stmts(self, lookup_node, node, _stmts, mystmt: Statement | None): """Method used in filter_stmts.""" if self is mystmt: return _stmts, True if self.statement() is mystmt: # original node's statement is the assignment, only keep # current node (gen exp, list comp) return [node], True return _stmts, False class ParentAssignNode(AssignTypeNode): """Base node for nodes whose assign_type is determined by the parent node.""" def assign_type(self): return self.parent.assign_type() class ImportNode(FilterStmtsBaseNode, NoChildrenNode, Statement): """Base node for From and Import Nodes.""" modname: str | None """The module that is being imported from. This is ``None`` for relative imports. """ names: list[tuple[str, str | None]] """What is being imported from the module. Each entry is a :class:`tuple` of the name being imported, and the alias that the name is assigned to (if any). 
""" def _infer_name(self, frame, name): return name def do_import_module(self, modname: str | None = None) -> nodes.Module: """Return the ast for a module whose name is imported by .""" mymodule = self.root() level: int | None = getattr(self, "level", None) # Import has no level if modname is None: modname = self.modname # If the module ImportNode is importing is a module with the same name # as the file that contains the ImportNode we don't want to use the cache # to make sure we use the import system to get the correct module. if ( modname # pylint: disable-next=no-member # pylint doesn't recognize type of mymodule and mymodule.relative_to_absolute_name(modname, level) == mymodule.name ): use_cache = False else: use_cache = True # pylint: disable-next=no-member # pylint doesn't recognize type of mymodule return mymodule.import_module( modname, level=level, relative_only=bool(level and level >= 1), use_cache=use_cache, ) def real_name(self, asname: str) -> str: """Get name from 'as' name.""" for name, _asname in self.names: if name == "*": return asname if not _asname: name = name.split(".", 1)[0] _asname = name if asname == _asname: return name raise AttributeInferenceError( "Could not find original name for {attribute} in {target!r}", target=self, attribute=asname, ) class MultiLineBlockNode(NodeNG): """Base node for multi-line blocks, e.g. For and FunctionDef. Note that this does not apply to every node with a `body` field. For instance, an If node has a multi-line body, but the body of an IfExpr is not multi-line, and hence cannot contain Return nodes, Assign nodes, etc. 
""" _multi_line_block_fields: ClassVar[tuple[str, ...]] = () @cached_property def _multi_line_blocks(self): return tuple(getattr(self, field) for field in self._multi_line_block_fields) def _get_return_nodes_skip_functions(self): for block in self._multi_line_blocks: for child_node in block: if child_node.is_function: continue yield from child_node._get_return_nodes_skip_functions() def _get_yield_nodes_skip_functions(self): for block in self._multi_line_blocks: for child_node in block: if child_node.is_function: continue yield from child_node._get_yield_nodes_skip_functions() def _get_yield_nodes_skip_lambdas(self): for block in self._multi_line_blocks: for child_node in block: if child_node.is_lambda: continue yield from child_node._get_yield_nodes_skip_lambdas() @cached_property def _assign_nodes_in_scope(self) -> list[nodes.Assign]: children_assign_nodes = ( child_node._assign_nodes_in_scope for block in self._multi_line_blocks for child_node in block ) return list(itertools.chain.from_iterable(children_assign_nodes)) class MultiLineWithElseBlockNode(MultiLineBlockNode): """Base node for multi-line blocks that can have else statements.""" @cached_property def blockstart_tolineno(self): return self.lineno def _elsed_block_range( self, lineno: int, orelse: list[nodes.NodeNG], last: int | None = None ) -> tuple[int, int]: """Handle block line numbers range for try/finally, for, if and while statements. """ if lineno == self.fromlineno: return lineno, lineno if orelse: if lineno >= orelse[0].fromlineno: return lineno, orelse[-1].tolineno return lineno, orelse[0].fromlineno - 1 return lineno, last or self.tolineno class LookupMixIn(NodeNG): """Mixin to look up a name in the right scope.""" @lru_cache # noqa def lookup(self, name: str) -> tuple[LocalsDictNodeNG, list[NodeNG]]: """Lookup where the given variable is assigned. The lookup starts from self's scope. 
If self is not a frame itself and the name is found in the inner frame locals, statements will be filtered to remove ignorable statements according to self's location. :param name: The name of the variable to find assignments for. :returns: The scope node and the list of assignments associated to the given name according to the scope where it has been found (locals, globals or builtin). """ return self.scope().scope_lookup(self, name) def ilookup(self, name): """Lookup the inferred values of the given variable. :param name: The variable name to find values for. :type name: str :returns: The inferred values of the statements returned from :meth:`lookup`. :rtype: iterable """ frame, stmts = self.lookup(name) context = InferenceContext() return bases._infer_stmts(stmts, context, frame) def _reflected_name(name) -> str: return "__r" + name[2:] def _augmented_name(name) -> str: return "__i" + name[2:] BIN_OP_METHOD = { "+": "__add__", "-": "__sub__", "/": "__truediv__", "//": "__floordiv__", "*": "__mul__", "**": "__pow__", "%": "__mod__", "&": "__and__", "|": "__or__", "^": "__xor__", "<<": "__lshift__", ">>": "__rshift__", "@": "__matmul__", } REFLECTED_BIN_OP_METHOD = { key: _reflected_name(value) for (key, value) in BIN_OP_METHOD.items() } AUGMENTED_OP_METHOD = { key + "=": _augmented_name(value) for (key, value) in BIN_OP_METHOD.items() } class OperatorNode(NodeNG): @staticmethod def _filter_operation_errors( infer_callable: Callable[ [InferenceContext | None], Generator[InferenceResult | util.BadOperationMessage, None, None], ], context: InferenceContext | None, error: type[util.BadOperationMessage], ) -> Generator[InferenceResult, None, None]: for result in infer_callable(context): if isinstance(result, error): # For the sake of .infer(), we don't care about operation # errors, which is the job of a linter. So return something # which shows that we can't infer the result. 
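The `BIN_OP_METHOD` table above, together with `_reflected_name` and `_augmented_name`, derives the reflected (`__radd__`) and augmented (`__iadd__`) dunder tables mechanically from the plain ones. A runnable sketch over a subset of the operators:

```python
# How REFLECTED_BIN_OP_METHOD and AUGMENTED_OP_METHOD are derived from
# BIN_OP_METHOD: "__add__" -> "__radd__" / "__iadd__". Subset table shown.
BIN_OP_METHOD = {"+": "__add__", "-": "__sub__", "*": "__mul__"}


def _reflected_name(name: str) -> str:
    # "__add__" -> "__radd__": insert "r" after the leading underscores
    return "__r" + name[2:]


def _augmented_name(name: str) -> str:
    # "__add__" -> "__iadd__": augmented assignment methods get an "i"
    return "__i" + name[2:]


REFLECTED_BIN_OP_METHOD = {op: _reflected_name(m) for op, m in BIN_OP_METHOD.items()}
# Augmented operators are keyed by the "op=" spelling: "+=" -> "__iadd__"
AUGMENTED_OP_METHOD = {op + "=": _augmented_name(m) for op, m in BIN_OP_METHOD.items()}
```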
yield util.Uninferable else: yield result @staticmethod def _is_not_implemented(const) -> bool: """Check if the given const node is NotImplemented.""" return isinstance(const, nodes.Const) and const.value is NotImplemented @staticmethod def _infer_old_style_string_formatting( instance: nodes.Const, other: nodes.NodeNG, context: InferenceContext ) -> tuple[util.UninferableBase | nodes.Const]: """Infer the result of '"string" % ...'. TODO: Instead of returning Uninferable we should rely on the call to '%' to see if the result is actually uninferable. """ if isinstance(other, nodes.Tuple): if util.Uninferable in other.elts: return (util.Uninferable,) inferred_positional = [util.safe_infer(i, context) for i in other.elts] if all(isinstance(i, nodes.Const) for i in inferred_positional): values = tuple(i.value for i in inferred_positional) else: values = None elif isinstance(other, nodes.Dict): values: dict[Any, Any] = {} for pair in other.items: key = util.safe_infer(pair[0], context) if not isinstance(key, nodes.Const): return (util.Uninferable,) value = util.safe_infer(pair[1], context) if not isinstance(value, nodes.Const): return (util.Uninferable,) values[key.value] = value.value elif isinstance(other, nodes.Const): values = other.value else: return (util.Uninferable,) try: return (nodes.const_factory(instance.value % values),) except (TypeError, KeyError, ValueError): return (util.Uninferable,) @staticmethod def _invoke_binop_inference( instance: InferenceResult, opnode: nodes.AugAssign | nodes.BinOp, op: str, other: InferenceResult, context: InferenceContext, method_name: str, ) -> Generator[InferenceResult, None, None]: """Invoke binary operation inference on the given instance.""" methods = dunder_lookup.lookup(instance, method_name) context = bind_context_to_node(context, instance) method = methods[0] context.callcontext.callee = method if ( isinstance(instance, nodes.Const) and isinstance(instance.value, str) and op == "%" ): return iter( 
OperatorNode._infer_old_style_string_formatting( instance, other, context ) ) try: inferred = next(method.infer(context=context)) except StopIteration as e: raise InferenceError(node=method, context=context) from e if isinstance(inferred, util.UninferableBase): raise InferenceError if not isinstance( instance, (nodes.Const, nodes.Tuple, nodes.List, nodes.ClassDef, bases.Instance), ): raise InferenceError # pragma: no cover # Used as a failsafe return instance.infer_binary_op(opnode, op, other, context, inferred) @staticmethod def _aug_op( instance: InferenceResult, opnode: nodes.AugAssign, op: str, other: InferenceResult, context: InferenceContext, reverse: bool = False, ) -> partial[Generator[InferenceResult, None, None]]: """Get an inference callable for an augmented binary operation.""" method_name = AUGMENTED_OP_METHOD[op] return partial( OperatorNode._invoke_binop_inference, instance=instance, op=op, opnode=opnode, other=other, context=context, method_name=method_name, ) @staticmethod def _bin_op( instance: InferenceResult, opnode: nodes.AugAssign | nodes.BinOp, op: str, other: InferenceResult, context: InferenceContext, reverse: bool = False, ) -> partial[Generator[InferenceResult, None, None]]: """Get an inference callable for a normal binary operation. If *reverse* is True, then the reflected method will be used instead. """ if reverse: method_name = REFLECTED_BIN_OP_METHOD[op] else: method_name = BIN_OP_METHOD[op] return partial( OperatorNode._invoke_binop_inference, instance=instance, op=op, opnode=opnode, other=other, context=context, method_name=method_name, ) @staticmethod def _bin_op_or_union_type( left: bases.UnionType | nodes.ClassDef | nodes.Const, right: bases.UnionType | nodes.ClassDef | nodes.Const, ) -> Generator[InferenceResult, None, None]: """Create a new UnionType instance for binary or, e.g. int | str.""" yield bases.UnionType(left, right) @staticmethod def _get_binop_contexts(context, left, right): """Get contexts for binary operations. 
This will return two inference contexts, the first one for x.__op__(y), the other one for y.__rop__(x), where only the arguments are inversed. """ # The order is important, since the first one should be # left.__op__(right). for arg in (right, left): new_context = context.clone() new_context.callcontext = CallContext(args=[arg]) new_context.boundnode = None yield new_context @staticmethod def _same_type(type1, type2) -> bool: """Check if type1 is the same as type2.""" return type1.qname() == type2.qname() @staticmethod def _get_aug_flow( left: InferenceResult, left_type: InferenceResult | None, aug_opnode: nodes.AugAssign, right: InferenceResult, right_type: InferenceResult | None, context: InferenceContext, reverse_context: InferenceContext, ) -> list[partial[Generator[InferenceResult, None, None]]]: """Get the flow for augmented binary operations. The rules are a bit messy: * if left and right have the same type, then left.__augop__(right) is first tried and then left.__op__(right). * if left and right are unrelated typewise, then left.__augop__(right) is tried, then left.__op__(right) is tried and then right.__rop__(left) is tried. * if left is a subtype of right, then left.__augop__(right) is tried and then left.__op__(right). 
* if left is a supertype of right, then left.__augop__(right) is tried, then right.__rop__(left) and then left.__op__(right) """ from astroid import helpers # pylint: disable=import-outside-toplevel bin_op = aug_opnode.op.strip("=") aug_op = aug_opnode.op if OperatorNode._same_type(left_type, right_type): methods = [ OperatorNode._aug_op(left, aug_opnode, aug_op, right, context), OperatorNode._bin_op(left, aug_opnode, bin_op, right, context), ] elif helpers.is_subtype(left_type, right_type): methods = [ OperatorNode._aug_op(left, aug_opnode, aug_op, right, context), OperatorNode._bin_op(left, aug_opnode, bin_op, right, context), ] elif helpers.is_supertype(left_type, right_type): methods = [ OperatorNode._aug_op(left, aug_opnode, aug_op, right, context), OperatorNode._bin_op( right, aug_opnode, bin_op, left, reverse_context, reverse=True ), OperatorNode._bin_op(left, aug_opnode, bin_op, right, context), ] else: methods = [ OperatorNode._aug_op(left, aug_opnode, aug_op, right, context), OperatorNode._bin_op(left, aug_opnode, bin_op, right, context), OperatorNode._bin_op( right, aug_opnode, bin_op, left, reverse_context, reverse=True ), ] return methods @staticmethod def _get_binop_flow( left: InferenceResult, left_type: InferenceResult | None, binary_opnode: nodes.AugAssign | nodes.BinOp, right: InferenceResult, right_type: InferenceResult | None, context: InferenceContext, reverse_context: InferenceContext, ) -> list[partial[Generator[InferenceResult, None, None]]]: """Get the flow for binary operations. The rules are a bit messy: * if left and right have the same type, then only one method will be called, left.__op__(right) * if left and right are unrelated typewise, then first left.__op__(right) is tried and if this does not exist or returns NotImplemented, then right.__rop__(left) is tried. * if left is a subtype of right, then only left.__op__(right) is tried. 
* if left is a supertype of right, then right.__rop__(left) is first tried and then left.__op__(right) """ from astroid import helpers # pylint: disable=import-outside-toplevel op = binary_opnode.op if OperatorNode._same_type(left_type, right_type): methods = [OperatorNode._bin_op(left, binary_opnode, op, right, context)] elif helpers.is_subtype(left_type, right_type): methods = [OperatorNode._bin_op(left, binary_opnode, op, right, context)] elif helpers.is_supertype(left_type, right_type): methods = [ OperatorNode._bin_op( right, binary_opnode, op, left, reverse_context, reverse=True ), OperatorNode._bin_op(left, binary_opnode, op, right, context), ] else: methods = [ OperatorNode._bin_op(left, binary_opnode, op, right, context), OperatorNode._bin_op( right, binary_opnode, op, left, reverse_context, reverse=True ), ] if ( PY310_PLUS and op == "|" and ( isinstance(left, (bases.UnionType, nodes.ClassDef)) or isinstance(left, nodes.Const) and left.value is None ) and ( isinstance(right, (bases.UnionType, nodes.ClassDef)) or isinstance(right, nodes.Const) and right.value is None ) ): methods.extend([partial(OperatorNode._bin_op_or_union_type, left, right)]) return methods @staticmethod def _infer_binary_operation( left: InferenceResult, right: InferenceResult, binary_opnode: nodes.AugAssign | nodes.BinOp, context: InferenceContext, flow_factory: GetFlowFactory, ) -> Generator[InferenceResult | util.BadBinaryOperationMessage, None, None]: """Infer a binary operation between a left operand and a right operand. This is used by both normal binary operations and augmented binary operations, the only difference is the flow factory used. 
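The flow rules documented in `_get_binop_flow` mirror CPython's own binary-operator protocol: try `left.__op__(right)`, fall back to `right.__rop__(left)` on `NotImplemented`, and give the reflected method priority when the right operand's type is a proper subclass of the left's. A self-contained sketch of that dispatch (for `+` only, independent of astroid):

```python
# CPython-style dispatch that _get_binop_flow models for inference:
# reflected method first when the right type is a proper subclass.
def binary_add(left, right):
    left_t, right_t = type(left), type(right)
    if issubclass(right_t, left_t) and right_t is not left_t:
        order = [(right, "__radd__", left), (left, "__add__", right)]
    else:
        order = [(left, "__add__", right), (right, "__radd__", left)]
    for obj, meth, arg in order:
        method = getattr(type(obj), meth, lambda s, o: NotImplemented)
        result = method(obj, arg)
        if result is not NotImplemented:
            return result
    raise TypeError("unsupported operand types")


class Base:
    def __add__(self, other):
        return "Base.__add__"


class Sub(Base):
    def __radd__(self, other):
        return "Sub.__radd__"
```

`Base() + Sub()` resolves through `Sub.__radd__` even though `Base.__add__` exists, which is exactly the supertype rule the docstring above describes.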
""" from astroid import helpers # pylint: disable=import-outside-toplevel context, reverse_context = OperatorNode._get_binop_contexts( context, left, right ) left_type = helpers.object_type(left) right_type = helpers.object_type(right) methods = flow_factory( left, left_type, binary_opnode, right, right_type, context, reverse_context ) for method in methods: try: results = list(method()) except AttributeError: continue except AttributeInferenceError: continue except InferenceError: yield util.Uninferable return else: if any(isinstance(result, util.UninferableBase) for result in results): yield util.Uninferable return if all(map(OperatorNode._is_not_implemented, results)): continue not_implemented = sum( 1 for result in results if OperatorNode._is_not_implemented(result) ) if not_implemented and not_implemented != len(results): # Can't infer yet what this is. yield util.Uninferable return yield from results return # The operation doesn't seem to be supported so let the caller know about it yield util.BadBinaryOperationMessage(left_type, binary_opnode.op, right_type) astroid-3.2.2/astroid/util.py0000664000175000017500000001135614622475517016110 0ustar epsilonepsilon# Licensed under the LGPL: https://www.gnu.org/licenses/old-licenses/lgpl-2.1.en.html # For details: https://github.com/pylint-dev/astroid/blob/main/LICENSE # Copyright (c) https://github.com/pylint-dev/astroid/blob/main/CONTRIBUTORS.txt from __future__ import annotations import warnings from typing import TYPE_CHECKING, Any, Final, Literal from astroid.exceptions import InferenceError if TYPE_CHECKING: from astroid import bases, nodes from astroid.context import InferenceContext from astroid.typing import InferenceResult class UninferableBase: """Special inference object, which is returned when inference fails. This is meant to be used as a singleton. Use astroid.util.Uninferable to access it. 
""" def __repr__(self) -> Literal["Uninferable"]: return "Uninferable" __str__ = __repr__ def __getattribute__(self, name: str) -> Any: if name == "next": raise AttributeError("next method should not be called") if name.startswith("__") and name.endswith("__"): return object.__getattribute__(self, name) if name == "accept": return object.__getattribute__(self, name) return self def __call__(self, *args: Any, **kwargs: Any) -> UninferableBase: return self def __bool__(self) -> Literal[False]: return False __nonzero__ = __bool__ def accept(self, visitor): return visitor.visit_uninferable(self) Uninferable: Final = UninferableBase() class BadOperationMessage: """Object which describes a TypeError occurred somewhere in the inference chain. This is not an exception, but a container object which holds the types and the error which occurred. """ class BadUnaryOperationMessage(BadOperationMessage): """Object which describes operational failures on UnaryOps.""" def __init__(self, operand, op, error): self.operand = operand self.op = op self.error = error @property def _object_type_helper(self): from astroid import helpers # pylint: disable=import-outside-toplevel return helpers.object_type def _object_type(self, obj): objtype = self._object_type_helper(obj) if isinstance(objtype, UninferableBase): return None return objtype def __str__(self) -> str: if hasattr(self.operand, "name"): operand_type = self.operand.name else: object_type = self._object_type(self.operand) if hasattr(object_type, "name"): operand_type = object_type.name else: # Just fallback to as_string operand_type = object_type.as_string() msg = "bad operand type for unary {}: {}" return msg.format(self.op, operand_type) class BadBinaryOperationMessage(BadOperationMessage): """Object which describes type errors for BinOps.""" def __init__(self, left_type, op, right_type): self.left_type = left_type self.right_type = right_type self.op = op def __str__(self) -> str: msg = "unsupported operand type(s) for {}: 
{!r} and {!r}" return msg.format(self.op, self.left_type.name, self.right_type.name) def _instancecheck(cls, other) -> bool: wrapped = cls.__wrapped__ other_cls = other.__class__ is_instance_of = wrapped is other_cls or issubclass(other_cls, wrapped) warnings.warn( "%r is deprecated and slated for removal in astroid " "2.0, use %r instead" % (cls.__class__.__name__, wrapped.__name__), PendingDeprecationWarning, stacklevel=2, ) return is_instance_of def check_warnings_filter() -> bool: """Return True if any other than the default DeprecationWarning filter is enabled. https://docs.python.org/3/library/warnings.html#default-warning-filter """ return any( issubclass(DeprecationWarning, filter[2]) and filter[0] != "ignore" and filter[3] != "__main__" for filter in warnings.filters ) def safe_infer( node: nodes.NodeNG | bases.Proxy | UninferableBase, context: InferenceContext | None = None, ) -> InferenceResult | None: """Return the inferred value for the given node. Return None if inference failed or if there is some ambiguity (more than one node has been inferred). 
""" if isinstance(node, UninferableBase): return node try: inferit = node.infer(context=context) value = next(inferit) except (InferenceError, StopIteration): return None try: next(inferit) return None # None if there is ambiguity on the inferred node except InferenceError: return None # there is some kind of ambiguity except StopIteration: return value astroid-3.2.2/astroid/context.py0000664000175000017500000001426414622475517016620 0ustar epsilonepsilon# Licensed under the LGPL: https://www.gnu.org/licenses/old-licenses/lgpl-2.1.en.html # For details: https://github.com/pylint-dev/astroid/blob/main/LICENSE # Copyright (c) https://github.com/pylint-dev/astroid/blob/main/CONTRIBUTORS.txt """Various context related utilities, including inference and call contexts.""" from __future__ import annotations import contextlib import pprint from collections.abc import Iterator from typing import TYPE_CHECKING, Dict, Optional, Sequence, Tuple from astroid.typing import InferenceResult, SuccessfulInferenceResult if TYPE_CHECKING: from astroid import constraint, nodes from astroid.nodes.node_classes import Keyword, NodeNG _InferenceCache = Dict[ Tuple["NodeNG", Optional[str], Optional[str], Optional[str]], Sequence["NodeNG"] ] _INFERENCE_CACHE: _InferenceCache = {} def _invalidate_cache() -> None: _INFERENCE_CACHE.clear() class InferenceContext: """Provide context for inference. Store already inferred nodes to save time Account for already visited nodes to stop infinite recursion """ __slots__ = ( "path", "lookupname", "callcontext", "boundnode", "extra_context", "constraints", "_nodes_inferred", ) max_inferred = 100 def __init__( self, path: set[tuple[nodes.NodeNG, str | None]] | None = None, nodes_inferred: list[int] | None = None, ) -> None: if nodes_inferred is None: self._nodes_inferred = [0] else: self._nodes_inferred = nodes_inferred self.path = path or set() """Path of visited nodes and their lookupname. 
Currently this key is ``(node, context.lookupname)`` """ self.lookupname: str | None = None """The original name of the node. e.g. foo = 1 The inference of 'foo' is nodes.Const(1) but the lookup name is 'foo' """ self.callcontext: CallContext | None = None """The call arguments and keywords for the given context.""" self.boundnode: SuccessfulInferenceResult | None = None """The bound node of the given context. e.g. the bound node of object.__new__(cls) is the object node """ self.extra_context: dict[SuccessfulInferenceResult, InferenceContext] = {} """Context that needs to be passed down through call stacks for call arguments.""" self.constraints: dict[str, dict[nodes.If, set[constraint.Constraint]]] = {} """The constraints on nodes.""" @property def nodes_inferred(self) -> int: """ Number of nodes inferred in this context and all its clones/descendents. Wrap inner value in a mutable cell to allow for mutating a class variable in the presence of __slots__ """ return self._nodes_inferred[0] @nodes_inferred.setter def nodes_inferred(self, value: int) -> None: self._nodes_inferred[0] = value @property def inferred(self) -> _InferenceCache: """ Inferred node contexts to their mapped results. Currently the key is ``(node, lookupname, callcontext, boundnode)`` and the value is tuple of the inferred results """ return _INFERENCE_CACHE def push(self, node: nodes.NodeNG) -> bool: """Push node into inference path. Allows one to see if the given node has already been looked at for this inference context """ name = self.lookupname if (node, name) in self.path: return True self.path.add((node, name)) return False def clone(self) -> InferenceContext: """Clone inference path. For example, each side of a binary operation (BinOp) starts with the same context but diverge as each side is inferred so the InferenceContext will need be cloned """ # XXX copy lookupname/callcontext ? 
clone = InferenceContext(self.path.copy(), nodes_inferred=self._nodes_inferred) clone.callcontext = self.callcontext clone.boundnode = self.boundnode clone.extra_context = self.extra_context clone.constraints = self.constraints.copy() return clone @contextlib.contextmanager def restore_path(self) -> Iterator[None]: path = set(self.path) yield self.path = path def is_empty(self) -> bool: return ( not self.path and not self.nodes_inferred and not self.callcontext and not self.boundnode and not self.lookupname and not self.callcontext and not self.extra_context and not self.constraints ) def __str__(self) -> str: state = ( f"{field}={pprint.pformat(getattr(self, field), width=80 - len(field))}" for field in self.__slots__ ) return "{}({})".format(type(self).__name__, ",\n ".join(state)) class CallContext: """Holds information for a call site.""" __slots__ = ("args", "keywords", "callee") def __init__( self, args: list[NodeNG], keywords: list[Keyword] | None = None, callee: InferenceResult | None = None, ): self.args = args # Call positional arguments if keywords: arg_value_pairs = [(arg.arg, arg.value) for arg in keywords] else: arg_value_pairs = [] self.keywords = arg_value_pairs # Call keyword arguments self.callee = callee # Function being called def copy_context(context: InferenceContext | None) -> InferenceContext: """Clone a context if given, or return a fresh context.""" if context is not None: return context.clone() return InferenceContext() def bind_context_to_node( context: InferenceContext | None, node: SuccessfulInferenceResult ) -> InferenceContext: """Give a context a boundnode to retrieve the correct function name or attribute value with from further inference. Do not use an existing context since the boundnode could then be incorrectly propagated higher up in the call stack. 
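`InferenceContext.restore_path` above snapshots the visited-node set on entry and restores it on exit, so speculative inference inside the `with` block cannot pollute the path. The same pattern in isolation:

```python
import contextlib


# Sketch of InferenceContext.restore_path(): snapshot a mutable set on
# entry, restore it on exit.
class PathHolder:
    def __init__(self):
        self.path = set()

    @contextlib.contextmanager
    def restore_path(self):
        saved = set(self.path)  # copy, not an alias
        yield
        self.path = saved
```

Note that, like the original, this restores only on normal exit; an exception inside the block would skip the restore (there is no `try/finally`).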
""" context = copy_context(context) context.boundnode = node return context astroid-3.2.2/astroid/exceptions.py0000664000175000017500000003076214622475517017316 0ustar epsilonepsilon# Licensed under the LGPL: https://www.gnu.org/licenses/old-licenses/lgpl-2.1.en.html # For details: https://github.com/pylint-dev/astroid/blob/main/LICENSE # Copyright (c) https://github.com/pylint-dev/astroid/blob/main/CONTRIBUTORS.txt """This module contains exceptions used in the astroid library.""" from __future__ import annotations from collections.abc import Iterable, Iterator from typing import TYPE_CHECKING, Any from astroid.typing import InferenceResult, SuccessfulInferenceResult if TYPE_CHECKING: from astroid import arguments, bases, nodes, objects from astroid.context import InferenceContext __all__ = ( "AstroidBuildingError", "AstroidError", "AstroidImportError", "AstroidIndexError", "AstroidSyntaxError", "AstroidTypeError", "AstroidValueError", "AttributeInferenceError", "DuplicateBasesError", "InconsistentMroError", "InferenceError", "InferenceOverwriteError", "MroError", "NameInferenceError", "NoDefault", "NotFoundError", "ParentMissingError", "ResolveError", "StatementMissing", "SuperArgumentTypeError", "SuperError", "TooManyLevelsError", "UnresolvableName", "UseInferenceDefault", ) class AstroidError(Exception): """Base exception class for all astroid related exceptions. AstroidError and its subclasses are structured, intended to hold objects representing state when the exception is thrown. Field values are passed to the constructor as keyword-only arguments. Each subclass has its own set of standard fields, but use your best judgment to decide whether a specific exception instance needs more or fewer fields for debugging. Field values may be used to lazily generate the error message: self.message.format() will be called with the field names and values supplied as keyword arguments. 
""" def __init__(self, message: str = "", **kws: Any) -> None: super().__init__(message) self.message = message for key, value in kws.items(): setattr(self, key, value) def __str__(self) -> str: return self.message.format(**vars(self)) class AstroidBuildingError(AstroidError): """Exception class when we are unable to build an astroid representation. Standard attributes: modname: Name of the module that AST construction failed for. error: Exception raised during construction. """ def __init__( self, message: str = "Failed to import module {modname}.", modname: str | None = None, error: Exception | None = None, source: str | None = None, path: str | None = None, cls: type | None = None, class_repr: str | None = None, **kws: Any, ) -> None: self.modname = modname self.error = error self.source = source self.path = path self.cls = cls self.class_repr = class_repr super().__init__(message, **kws) class AstroidImportError(AstroidBuildingError): """Exception class used when a module can't be imported by astroid.""" class TooManyLevelsError(AstroidImportError): """Exception class which is raised when a relative import was beyond the top-level. Standard attributes: level: The level which was attempted. name: the name of the module on which the relative import was attempted. """ def __init__( self, message: str = "Relative import with too many levels " "({level}) for module {name!r}", level: int | None = None, name: str | None = None, **kws: Any, ) -> None: self.level = level self.name = name super().__init__(message, **kws) class AstroidSyntaxError(AstroidBuildingError): """Exception class used when a module can't be parsed.""" def __init__( self, message: str, modname: str | None, error: Exception, path: str | None, source: str | None = None, ) -> None: super().__init__(message, modname, error, source, path) class NoDefault(AstroidError): """Raised by function's `default_value` method when an argument has no default value. Standard attributes: func: Function node. 
name: Name of argument without a default. """ def __init__( self, message: str = "{func!r} has no default for {name!r}.", func: nodes.FunctionDef | None = None, name: str | None = None, **kws: Any, ) -> None: self.func = func self.name = name super().__init__(message, **kws) class ResolveError(AstroidError): """Base class of astroid resolution/inference error. ResolveError is not intended to be raised. Standard attributes: context: InferenceContext object. """ def __init__( self, message: str = "", context: InferenceContext | None = None, **kws: Any ) -> None: self.context = context super().__init__(message, **kws) class MroError(ResolveError): """Error raised when there is a problem with method resolution of a class. Standard attributes: mros: A sequence of sequences containing ClassDef nodes. cls: ClassDef node whose MRO resolution failed. context: InferenceContext object. """ def __init__( self, message: str, mros: Iterable[Iterable[nodes.ClassDef]], cls: nodes.ClassDef, context: InferenceContext | None = None, **kws: Any, ) -> None: self.mros = mros self.cls = cls self.context = context super().__init__(message, **kws) def __str__(self) -> str: mro_names = ", ".join(f"({', '.join(b.name for b in m)})" for m in self.mros) return self.message.format(mros=mro_names, cls=self.cls) class DuplicateBasesError(MroError): """Error raised when there are duplicate bases in the same class bases.""" class InconsistentMroError(MroError): """Error raised when a class's MRO is inconsistent.""" class SuperError(ResolveError): """Error raised when there is a problem with a *super* call. Standard attributes: *super_*: The Super instance that raised the exception. context: InferenceContext object. 
""" def __init__(self, message: str, super_: objects.Super, **kws: Any) -> None: self.super_ = super_ super().__init__(message, **kws) def __str__(self) -> str: return self.message.format(**vars(self.super_)) class InferenceError(ResolveError): # pylint: disable=too-many-instance-attributes """Raised when we are unable to infer a node. Standard attributes: node: The node inference was called on. context: InferenceContext object. """ def __init__( # pylint: disable=too-many-arguments self, message: str = "Inference failed for {node!r}.", node: InferenceResult | None = None, context: InferenceContext | None = None, target: InferenceResult | None = None, targets: InferenceResult | None = None, attribute: str | None = None, unknown: InferenceResult | None = None, assign_path: list[int] | None = None, caller: SuccessfulInferenceResult | None = None, stmts: Iterator[InferenceResult] | None = None, frame: InferenceResult | None = None, call_site: arguments.CallSite | None = None, func: InferenceResult | None = None, arg: str | None = None, positional_arguments: list | None = None, unpacked_args: list | None = None, keyword_arguments: dict | None = None, unpacked_kwargs: dict | None = None, **kws: Any, ) -> None: self.node = node self.context = context self.target = target self.targets = targets self.attribute = attribute self.unknown = unknown self.assign_path = assign_path self.caller = caller self.stmts = stmts self.frame = frame self.call_site = call_site self.func = func self.arg = arg self.positional_arguments = positional_arguments self.unpacked_args = unpacked_args self.keyword_arguments = keyword_arguments self.unpacked_kwargs = unpacked_kwargs super().__init__(message, **kws) # Why does this inherit from InferenceError rather than ResolveError? # Changing it causes some inference tests to fail. class NameInferenceError(InferenceError): """Raised when a name lookup fails, corresponds to NameError. 
Standard attributes: name: The name for which lookup failed, as a string. scope: The node representing the scope in which the lookup occurred. context: InferenceContext object. """ def __init__( self, message: str = "{name!r} not found in {scope!r}.", name: str | None = None, scope: nodes.LocalsDictNodeNG | None = None, context: InferenceContext | None = None, **kws: Any, ) -> None: self.name = name self.scope = scope self.context = context super().__init__(message, **kws) class AttributeInferenceError(ResolveError): """Raised when an attribute lookup fails, corresponds to AttributeError. Standard attributes: target: The node for which lookup failed. attribute: The attribute for which lookup failed, as a string. context: InferenceContext object. """ def __init__( self, message: str = "{attribute!r} not found on {target!r}.", attribute: str = "", target: nodes.NodeNG | bases.BaseInstance | None = None, context: InferenceContext | None = None, mros: list[nodes.ClassDef] | None = None, super_: nodes.ClassDef | None = None, cls: nodes.ClassDef | None = None, **kws: Any, ) -> None: self.attribute = attribute self.target = target self.context = context self.mros = mros self.super_ = super_ self.cls = cls super().__init__(message, **kws) class UseInferenceDefault(Exception): """Exception to be raised in custom inference function to indicate that it should go back to the default behaviour. """ class _NonDeducibleTypeHierarchy(Exception): """Raised when is_subtype / is_supertype can't deduce the relation between two types. 
""" class AstroidIndexError(AstroidError): """Raised when an Indexable / Mapping does not have an index / key.""" def __init__( self, message: str = "", node: nodes.NodeNG | bases.Instance | None = None, index: nodes.Subscript | None = None, context: InferenceContext | None = None, **kws: Any, ) -> None: self.node = node self.index = index self.context = context super().__init__(message, **kws) class AstroidTypeError(AstroidError): """Raised when a TypeError would be expected in Python code.""" def __init__( self, message: str = "", node: nodes.NodeNG | bases.Instance | None = None, index: nodes.Subscript | None = None, context: InferenceContext | None = None, **kws: Any, ) -> None: self.node = node self.index = index self.context = context super().__init__(message, **kws) class AstroidValueError(AstroidError): """Raised when a ValueError would be expected in Python code.""" class InferenceOverwriteError(AstroidError): """Raised when an inference tip is overwritten. Currently only used for debugging. """ class ParentMissingError(AstroidError): """Raised when a node which is expected to have a parent attribute is missing one. Standard attributes: target: The node for which the parent lookup failed. """ def __init__(self, target: nodes.NodeNG) -> None: self.target = target super().__init__(message=f"Parent not found on {target!r}.") class StatementMissing(ParentMissingError): """Raised when a call to node.statement() does not return a node. This is because a node in the chain does not have a parent attribute and therefore does not return a node for statement(). Standard attributes: target: The node for which the parent lookup failed. 
""" def __init__(self, target: nodes.NodeNG) -> None: super(ParentMissingError, self).__init__( message=f"Statement not found on {target!r}" ) SuperArgumentTypeError = SuperError UnresolvableName = NameInferenceError NotFoundError = AttributeInferenceError astroid-3.2.2/astroid/_ast.py0000664000175000017500000000573114622475517016061 0ustar epsilonepsilon# Licensed under the LGPL: https://www.gnu.org/licenses/old-licenses/lgpl-2.1.en.html # For details: https://github.com/pylint-dev/astroid/blob/main/LICENSE # Copyright (c) https://github.com/pylint-dev/astroid/blob/main/CONTRIBUTORS.txt from __future__ import annotations import ast from typing import NamedTuple from astroid.const import Context class FunctionType(NamedTuple): argtypes: list[ast.expr] returns: ast.expr class ParserModule(NamedTuple): unary_op_classes: dict[type[ast.unaryop], str] cmp_op_classes: dict[type[ast.cmpop], str] bool_op_classes: dict[type[ast.boolop], str] bin_op_classes: dict[type[ast.operator], str] context_classes: dict[type[ast.expr_context], Context] def parse( self, string: str, type_comments: bool = True, filename: str | None = None ) -> ast.Module: if filename: return ast.parse(string, filename=filename, type_comments=type_comments) return ast.parse(string, type_comments=type_comments) def parse_function_type_comment(type_comment: str) -> FunctionType | None: """Given a correct type comment, obtain a FunctionType object.""" func_type = ast.parse(type_comment, "", "func_type") # type: ignore[attr-defined] return FunctionType(argtypes=func_type.argtypes, returns=func_type.returns) def get_parser_module(type_comments: bool = True) -> ParserModule: unary_op_classes = _unary_operators_from_module() cmp_op_classes = _compare_operators_from_module() bool_op_classes = _bool_operators_from_module() bin_op_classes = _binary_operators_from_module() context_classes = _contexts_from_module() return ParserModule( unary_op_classes, cmp_op_classes, bool_op_classes, bin_op_classes, 
        context_classes,
    )


def _unary_operators_from_module() -> dict[type[ast.unaryop], str]:
    return {ast.UAdd: "+", ast.USub: "-", ast.Not: "not", ast.Invert: "~"}


def _binary_operators_from_module() -> dict[type[ast.operator], str]:
    return {
        ast.Add: "+",
        ast.BitAnd: "&",
        ast.BitOr: "|",
        ast.BitXor: "^",
        ast.Div: "/",
        ast.FloorDiv: "//",
        ast.MatMult: "@",
        ast.Mod: "%",
        ast.Mult: "*",
        ast.Pow: "**",
        ast.Sub: "-",
        ast.LShift: "<<",
        ast.RShift: ">>",
    }


def _bool_operators_from_module() -> dict[type[ast.boolop], str]:
    return {ast.And: "and", ast.Or: "or"}


def _compare_operators_from_module() -> dict[type[ast.cmpop], str]:
    return {
        ast.Eq: "==",
        ast.Gt: ">",
        ast.GtE: ">=",
        ast.In: "in",
        ast.Is: "is",
        ast.IsNot: "is not",
        ast.Lt: "<",
        ast.LtE: "<=",
        ast.NotEq: "!=",
        ast.NotIn: "not in",
    }


def _contexts_from_module() -> dict[type[ast.expr_context], Context]:
    return {
        ast.Load: Context.Load,
        ast.Store: Context.Store,
        ast.Del: Context.Del,
        ast.Param: Context.Store,
    }

astroid-3.2.2/pyproject.toml

[build-system]
requires = ["setuptools>=64.0", "wheel>=0.37.1"]
build-backend = "setuptools.build_meta"

[project]
name = "astroid"
license = {text = "LGPL-2.1-or-later"}
description = "An abstract syntax tree for Python with inference support."
readme = "README.rst"
keywords = ["static code analysis", "python", "abstract syntax tree"]
classifiers = [
    "Development Status :: 6 - Mature",
    "Environment :: Console",
    "Intended Audience :: Developers",
    "License :: OSI Approved :: GNU Lesser General Public License v2 (LGPLv2)",
    "Operating System :: OS Independent",
    "Programming Language :: Python",
    "Programming Language :: Python :: 3",
    "Programming Language :: Python :: 3 :: Only",
    "Programming Language :: Python :: 3.8",
    "Programming Language :: Python :: 3.9",
    "Programming Language :: Python :: 3.10",
    "Programming Language :: Python :: 3.11",
    "Programming Language :: Python :: 3.12",
    "Programming Language :: Python :: Implementation :: CPython",
    "Programming Language :: Python :: Implementation :: PyPy",
    "Topic :: Software Development :: Libraries :: Python Modules",
    "Topic :: Software Development :: Quality Assurance",
    "Topic :: Software Development :: Testing",
]
requires-python = ">=3.8.0"
dependencies = [
    "typing-extensions>=4.0.0;python_version<'3.11'",
]
dynamic = ["version"]

[project.urls]
"Docs" = "https://pylint.readthedocs.io/projects/astroid/en/latest/"
"Source Code" = "https://github.com/pylint-dev/astroid"
"Bug tracker" = "https://github.com/pylint-dev/astroid/issues"
"Discord server" = "https://discord.gg/Egy6P8AMB5"

[tool.setuptools]
license-files = ["LICENSE", "CONTRIBUTORS.txt"]  # Keep in sync with setup.cfg

[tool.setuptools.package-dir]
"" = "."
[tool.setuptools.packages.find] include = ["astroid*"] [tool.setuptools.dynamic] version = {attr = "astroid.__pkginfo__.__version__"} [tool.aliases] test = "pytest" [tool.pytest.ini_options] addopts = '-m "not acceptance"' python_files = ["*test_*.py"] testpaths = ["tests"] filterwarnings = "error" [tool.mypy] enable_error_code = "ignore-without-code" no_implicit_optional = true scripts_are_modules = true show_error_codes = true warn_redundant_casts = true [[tool.mypy.overrides]] # Importlib typeshed stubs do not include the private functions we use module = [ "_io.*", "gi.*", "importlib.*", "nose.*", "numpy.*", "pytest", "setuptools", ] ignore_missing_imports = true [tool.ruff] # ruff is less lenient than pylint and does not make any exceptions # (for docstrings, strings and comments in particular). line-length = 110 select = [ "E", # pycodestyle "F", # pyflakes "W", # pycodestyle "B", # bugbear "I", # isort "RUF", # ruff ] ignore = [ "B905", # `zip()` without an explicit `strict=` parameter "F401", # API "RUF100", # ruff does not understand pylint's directive usefulness ] fixable = [ "E", # pycodestyle "F", # pyflakes "W", # pycodestyle "B", # bugbear "I", # isort "RUF", # ruff ] unfixable = ["RUF001"] target-version = "py38" [tool.ruff.per-file-ignores] # Ruff is autofixing a tests with a voluntarily sneaky unicode "tests/test_regrtest.py" = ["RUF001"] astroid-3.2.2/ChangeLog0000664000175000017500000037717314622475517014702 0ustar epsilonepsilon=================== astroid's ChangeLog =================== What's New in astroid 3.3.0? ============================ Release date: TBA What's New in astroid 3.2.3? ============================ Release date: TBA What's New in astroid 3.2.2? ============================ Release date: 2024-05-20 * Improve inference for generic classes using the PEP 695 syntax (Python 3.12). Closes pylint-dev/pylint#9406 What's New in astroid 3.2.1? 
============================ Release date: 2024-05-16 * Fix ``RecursionError`` in ``infer_call_result()`` for certain ``__call__`` methods. Closes pylint-dev/pylint#9139 * Add ``AstroidManager.prefer_stubs`` attribute to control the astroid 3.2.0 feature that prefers stubs. Refs pylint-dev/pylint#9626 Refs pylint-dev/pylint#9623 What's New in astroid 3.2.0? ============================ Release date: 2024-05-07 * ``.pyi`` stub files are now preferred over ``.py`` files when resolving imports, (except for numpy). Closes pylint-dev/#9185 * ``igetattr()`` returns the last same-named function in a class (instead of the first). This avoids false positives in pylint with ``@overload``. Closes #1015 Refs pylint-dev/pylint#4696 * Adds ``module_denylist`` to ``AstroidManager`` for modules to be skipped during AST generation. Modules in this list will cause an ``AstroidImportError`` to be raised when an AST for them is requested. Refs pylint-dev/pylint#9442 * Make ``astroid.interpreter._import.util.is_namespace`` only consider modules using a loader set to ``NamespaceLoader`` or ``None`` as namespaces. This fixes a problem that ``six.moves`` brain was not effective if ``six.moves`` was already imported. Closes #1107 What's New in astroid 3.1.0? ============================ Release date: 2024-02-23 * Include PEP 695 (Python 3.12) generic type syntax nodes in ``get_children()``, allowing checkers to visit them. Refs pylint-dev/pylint#9193 * Add ``__main__`` as a possible inferred value for ``__name__`` to improve control flow inference around ``if __name__ == "__main__":`` guards. Closes #2071 * Following a deprecation period, the ``names`` arg to the ``Import`` constructor and the ``op`` arg to the ``BoolOp`` constructor are now required, and the ``doc`` args to the ``PartialFunction`` and ``Property`` constructors have been removed (call ``postinit(doc_node=...)`` instead.) 
* Following a deprecation announced in astroid 1.5.0, the alias ``AstroidBuildingException`` is removed in favor of ``AstroidBuildingError``. * Include modname in AST warnings. Useful for ``invalid escape sequence`` warnings with Python 3.12. * ``RecursionError`` is now trapped and logged out as ``UserWarning`` during astroid node transformations with instructions about raising the system recursion limit. Closes pylint-dev/pylint#8842 * Suppress ``SyntaxWarning`` for invalid escape sequences on Python 3.12 when parsing modules. Closes pylint-dev/pylint#9322 What's New in astroid 3.0.3? ============================ Release date: 2024-02-04 * Fix type of ``UnicodeDecodeError.object`` inferred as ``str`` instead of ``bytes``. Closes pylint-dev/pylint#9342 * Fix ``no-member`` false positives for ``args`` and ``kwargs`` on ``ParamSpec`` under Python 3.12. Closes pylint-dev/pylint#9401 What's New in astroid 3.0.2? ============================ Release date: 2023-12-12 * Avoid duplicate inference results for some uses of ``typing.X`` constructs like ``Tuple[Optional[int], ...]``. This was causing pylint to occasionally omit messages like ``deprecated-typing-alias``. Closes pylint-dev/pylint#9220 What's New in astroid 3.0.1? ============================ Release date: 2023-10-15 * Fix crashes linting code using PEP 695 (Python 3.12) generic type syntax. Closes pylint-dev/pylint#9098 What's New in astroid 3.0.0? ============================= Release date: 2023-09-26 * Add support for Python 3.12, including PEP 695 type parameter syntax. Closes #2201 * Remove support for Python 3.7. Refs #2137 * Use the global inference cache when inferring, even without an explicit ``InferenceContext``. This is a significant performance improvement given how often methods default to ``None`` for the context argument. (Linting ``astroid`` itself now takes ~5% less time on Python 3.12; other projects requiring more complex inference calculations will see greater speedups.) 
Refs #529 * Following a deprecation period starting in astroid 2.7.0, the ``astroid.node_classes`` and ``astroid.scoped_nodes`` modules have been removed in favor of ``astroid.nodes.node_classes`` and ``astroid.nodes.scoped_nodes``. Closes #1072 * Following a deprecation period starting in astroid 2.12.0, the ``astroid.mixins`` module has been removed in favor of ``astroid.nodes._base_nodes`` (private). Refs #1633 * Return all existing arguments when calling ``Arguments.arguments()``. This also means ``find_argname`` will now use the whole list of arguments for its search. Closes #2213 * Exclude class attributes from the ``__members__`` container of an ``Enum`` class when they are ``nodes.AnnAssign`` nodes with no assigned value. Refs pylint-dev/pylint#7402 * Remove ``@cached`` and ``@cachedproperty`` decorator (just use ``@cached_property`` from the stdlib). Closes #1780 Refs #2140 * Remove the ``inference`` module. Node inference methods are now in the module defining the node, rather than being associated to the node afterward. Closes #679 * Move ``LookupMixIn`` to ``astroid.nodes._base_nodes`` and make it private. * Remove the shims for ``OperationError``, ``BinaryOperationError``, and ``UnaryOperationError`` in ``exceptions``. They were moved to ``util`` in astroid 1.5.0. * Move ``safe_infer()`` from ``helpers`` to ``util``. This avoids some circular imports. * Reduce file system access in ``ast_from_file()``. * Reduce time to ``import astroid`` by delaying ``astroid_bootstrapping()`` until the first instantiation of ``AstroidBuilder``. Closes #2161 * Make ``igetattr()`` idempotent. This addresses some reports of varying results when running pylint with ``--jobs``. Closes pylint-dev/pylint#4356 Refs #7 * Fix incorrect cache keys for inference results, thereby correctly inferring types for calls instantiating types dynamically. 
Closes #1828 Closes pylint-dev/pylint#7464 Closes pylint-dev/pylint#8074 * Fix interrupted ``InferenceContext`` call chains, thereby addressing performance problems when linting ``sqlalchemy``. Closes pylint-dev/pylint#8150 * ``nodes.FunctionDef`` no longer inherits from ``nodes.Lambda``. This is a breaking change but considered a bug fix as the nodes did not share the same API and were not interchangeable. We have tried to minimize the amount of breaking changes caused by this change but some are unavoidable. * ``infer_call_result`` now shares the same interface across all implementations. Namely: ```python def infer_call_result( self, caller: SuccessfulInferenceResult | None, context: InferenceContext | None = None, ) -> Iterator[InferenceResult]: ``` This is a breaking change for ``nodes.FunctionDef`` where previously ``caller`` had a default of ``None``. Passing ``None`` again will not create a behaviour change. The breaking change allows us to better type and re-use the method within ``astroid``. * Improved signature of the ``__init__`` and ``__postinit__`` methods of most nodes. This includes making ``lineno``, ``col_offset``, ``end_lineno``, ``end_col_offset`` and ``parent`` required arguments for ``nodes.NodeNG`` and its subclasses. For most other nodes, arguments of their ``__postinit__`` methods have been made required to better represent how they would normally be constructed by the standard library ``ast`` module. 
The following nodes were changed or updated: - ``nodes.AnnAssign`` - ``nodes.Arguments`` - ``nodes.Assign`` - ``nodes.AssignAttr`` - ``nodes.AssignName`` - ``nodes.Attribute`` - ``nodes.AugAssign`` - ``nodes.Await`` - ``nodes.BaseContainer`` - ``nodes.BinOp`` - ``nodes.Call`` - ``nodes.ClassDef`` - ``nodes.Compare`` - ``nodes.Comprehension`` - ``nodes.Decorators`` - ``nodes.Delete`` - ``nodes.DelAttr`` - ``nodes.DelName`` - ``nodes.Dict`` - ``nodes.DictComp`` - ``nodes.ExceptHandler`` - ``nodes.Expr`` - ``nodes.For`` - ``nodes.FunctionDef`` - ``nodes.GeneratorExp`` - ``nodes.If`` - ``nodes.IfExp`` - ``nodes.Keyword`` - ``nodes.Lambda`` - ``nodes.ListComp`` - ``nodes.Module`` - ``nodes.Name`` - ``nodes.NodeNG`` - ``nodes.Raise`` - ``nodes.Return`` - ``nodes.SetComp`` - ``nodes.Slice`` - ``nodes.Starred`` - ``objects.Super``, we also added the ``call`` parameter to its ``__init__`` method. - ``nodes.Subscript`` - ``nodes.UnaryOp`` - ``nodes.While`` - ``nodes.Yield`` These changes involve breaking changes to their API but should be considered bug fixes. We now make arguments required when they are instead of always providing defaults. * ``nodes.If.self.is_orelse`` has been removed as it was never set correctly and therefore provided a false value. * Remove dependency on ``wrapt``. * Remove dependency on ``lazy_object_proxy``. This includes the removal of the associated ``lazy_import``, ``lazy_descriptor`` and ``proxy_alias`` utility functions. * ``CallSite._unpack_args`` and ``CallSite._unpack_keywords`` now use ``safe_infer()`` for better inference and fewer false positives. Closes pylint-dev/pylint#8544 * Add ``attr.Factory`` to the recognized class attributes for classes decorated with ``attrs``. Closes pylint-dev/pylint#4341 * ``infer_property()`` now observes the same property-specific workaround as ``infer_functiondef``. 
Refs #1490 * Remove unused and / or deprecated constants: - ``astroid.bases.BOOL_SPECIAL_METHOD`` - ``astroid.bases.BUILTINS`` - ``astroid.const.BUILTINS`` - ``astroid.const.PY38_PLUS`` - ``astroid.const.Load`` - ``astroid.const.Store`` - ``astroid.const.Del`` Refs #2141 * ``frame()`` raises ``ParentMissingError`` and ``statement()`` raises ``StatementMissing`` for missing parents regardless of the value of the ``future`` argument (which gave this behavior already). The ``future`` argument to each method is deprecated and will be removed in astroid 4.0. Refs #1217 * Remove deprecated ``Ellipsis``, ``ExtSlice``, ``Index`` nodes. Refs #2152 * Remove deprecated ``is_sys_guard`` and ``is_typing_guard`` methods. Refs #2153 * Remove deprecated ``doc`` attribute for ``Module``, ``ClassDef``, and ``FunctionDef``. Use the ``doc_node`` attribute instead. Refs #2154 * Add new ``nodes.Try`` to better match Python AST. Replaces the ``TryExcept`` and ``TryFinally`` nodes which have been removed. * Publicize ``NodeNG.repr_name()`` to facilitate finding a node's nice name. Refs pylint-dev/pylint#8598 * Fix false positives for ``no-member`` and ``invalid-name`` when using the ``_name_``, ``_value_`` and ``_ignore_`` sunders in Enums. Closes pylint-dev/pylint#9015 What's New in astroid 2.15.8? ============================= Release date: 2023-09-26 * Fix a regression in 2.15.7 for ``unsubscriptable-object``. Closes #2305 Closes pylint-dev/pylint#9069 * Fix a regression in 2.15.7 for ``unsubscriptable-object``. Closes #2305 Closes pylint-dev/pylint#9069 What's New in astroid 2.15.7? ============================= Release date: 2023-09-23 * Fix a crash when inferring a ``typing.TypeVar`` call. Closes pylint-dev/pylint#8802 * Infer user-defined enum classes by checking if the class is a subtype of ``enum.Enum``. Closes pylint-dev/pylint#8897 * Fix inference of functions with ``@functools.lru_cache`` decorators without parentheses. 
Closes pylint-dev/pylint#8868 * Make ``sys.argv`` uninferable because it never is. (It's impossible to infer the value it will have outside of static analysis where it's our own value.) Refs pylint-dev/pylint#7710 What's New in astroid 2.15.6? ============================= Release date: 2023-07-08 * Harden ``get_module_part()`` against ``"."``. Closes pylint-dev/pylint#8749 * Allow ``AsStringVisitor`` to visit ``objects.PartialFunction``. Closes pylint-dev/pylint#8881 * Avoid expensive list/tuple multiplication operations that would result in ``MemoryError``. Closes pylint-dev/pylint#8748 * Fix a regression in 2.12.0 where settings in AstroidManager would be ignored. Most notably this addresses pylint-dev/pylint#7433. Refs #2204 What's New in astroid 2.15.5? ============================= Release date: 2023-05-14 * Handle ``objects.Super`` in ``helpers.object_type()``. Refs pylint-dev/pylint#8554 * Recognize stub ``pyi`` Python files. Refs pylint-dev/pylint#4987 What's New in astroid 2.15.4? ============================= Release date: 2023-04-24 * Add visitor function for ``TryStar`` to ``AsStringVisitor`` and add ``TryStar`` to ``astroid.nodes.ALL_NODE_CLASSES``. Refs #2142 What's New in astroid 2.15.3? ============================= Release date: 2023-04-16 * Fix ``infer_call_result()`` crash on methods called ``with_metaclass()``. Closes #1735 * Suppress ``UserWarning`` when finding module specs. Closes pylint-dev/pylint#7906 What's New in astroid 2.15.2? ============================= Release date: 2023-04-03 * Support more possible usages of ``attrs`` decorators. Closes pylint-dev/pylint#7884 What's New in astroid 2.15.1? ============================= Release date: 2023-03-26 * Restore behavior of setting a Call as a base for classes created using ``six.with_metaclass()``, and harden support for using enums as metaclasses in this case. Reverts #1622 Refs pylint-dev/pylint#5935 Refs pylint-dev/pylint#7506 What's New in astroid 2.15.0? 
============================= Release date: 2023-03-06 * astroid now supports ``TryStar`` nodes from python 3.11 and should be fully compatible with python 3.11. Closes #2028 * ``Formattedvalue.postinit`` is now keyword only. This is to allow correct typing of the ``Formattedvalue`` class. Refs #1516 * ``Astroid`` now supports custom import hooks. Refs pylint-dev/pylint#7306 * ``astroid`` now infers return values from match cases. Refs pylint-dev/pylint#5288 * ``AstroidManager.clear_cache`` now also clears the inference context cache. Refs #1780 * ``max_inferable_values`` can now be set on ``AstroidManager`` instances, e.g. ``astroid.MANAGER`` besides just the ``AstroidManager`` class itself. Closes #2280 * ``Astroid`` now retrieves the default values of keyword only arguments and sets them on ``Arguments.kw_defaults``. * ``Uninferable`` now has the type ``UninferableBase``. This is to facilitate correctly type annotating code that uses this singleton. Closes #1680 * Deprecate ``modutils.is_standard_module()``. It will be removed in the next minor release. Functionality has been replaced by two new functions, ``modutils.is_stdlib_module()`` and ``modutils.module_in_path()``. Closes #2012 * Fix ``are_exclusive`` function when a walrus operator is used inside ``IfExp.test`` field. Closes #2022 What's New in astroid 2.14.2? ============================= Release date: 2023-02-12 * '_infer_str_format_call' won't crash anymore when the string it analyses are uninferable. Closes pylint-dev/pylint#8109 What's New in astroid 2.14.1? ============================= Release date: 2023-01-31 * Revert ``CallContext`` change as it caused a ``RecursionError`` regression. What's New in astroid 2.14.0? ============================= Release date: 2023-01-31 * Add support for inferring binary union types added in Python 3.10. Refs pylint-dev/pylint#8119 * Capture and log messages emitted when inspecting a module for astroid. Closes #1904 What's New in astroid 2.13.5? 
============================= Release date: 2023-01-31 * Revert ``CallContext`` change as it caused a ``RecursionError`` regression. What's New in astroid 2.13.4? ============================= Release date: 2023-01-31 * Fix issues with ``typing_extensions.TypeVar``. * Fix ``ClassDef.fromlino`` for PyPy 3.8 (v7.3.11) if class is wrapped by a decorator. * Preserve parent CallContext when inferring nested functions. Closes pylint-dev/pylint#8074 * Add ``Lock`` to the ``multiprocessing`` brain. Closes pylint-dev/pylint#3313 What's New in astroid 2.13.3? ============================= Release date: 2023-01-20 * Fix a regression in 2.13.2 where a RunTimeError could be raised unexpectedly. Closes #1958 * Fix overwritten attributes in inherited dataclasses not being ordered correctly. Closes pylint-dev/pylint#7881 * Fix a false positive when an attribute named ``Enum`` was confused with ``enum.Enum``. Calls to ``Enum`` are now inferred & the qualified name is checked. Refs pylint-dev/pylint#5719 * Remove unnecessary typing_extensions dependency on Python 3.11 and newer What's New in astroid 2.13.2? ============================= Release date: 2023-01-08 * Removed version conditions on typing_extensions dependency. Removed typing_extensions from our tests requirements as it was preventing issues to appear in our continuous integration. Closes #1945 What's New in astroid 2.13.1? ============================= Release date: 2023-01-08 * Bumping typing_extensions to 4.0.0 that is required when using ``Self`` Closes #1942 What's New in astroid 2.13.0? ============================= Release date: 2023-01-07 * Fixed importing of modules that have the same name as the file that is importing. ``astroid`` will now correctly handle an ``import math`` statement in a file called ``math.py`` by relying on the import system. Refs pylint-dev/pylint#5151 * Create ``ContextManagerModel`` and let ``GeneratorModel`` inherit from it. Refs pylint-dev/pylint#2567 * Added a ``regex`` brain. 
Refs pylint-dev/pylint#1911 * Support "is None" constraints from if statements during inference. Ref #791 Ref pylint-dev/pylint#157 Ref pylint-dev/pylint#1472 Ref pylint-dev/pylint#2016 Ref pylint-dev/pylint#2631 Ref pylint-dev/pylint#2880 What's New in astroid 2.12.14? ============================== Release date: 2023-01-06 * Handle the effect of properties on the ``__init__`` of a dataclass correctly. Closes pylint-dev/pylint#5225 * Handle the effect of ``kw_only=True`` in dataclass fields correctly. Closes pylint-dev/pylint#7623 * Handle the effect of ``init=False`` in dataclass fields correctly. Closes pylint-dev/pylint#7291 * Fix crash if ``numpy`` module doesn't have ``version`` attribute. Refs pylint-dev/pylint#7868 * Handle ``AttributeError`` during ``str.format`` template inference tip evaluation Closes pylint-dev/pylint#1902 * Add the ``masked_invalid`` function in the ``numpy.ma`` brain. Closes pylint-dev/pylint#5715 What's New in astroid 2.12.13? ============================== Release date: 2022-11-19 * Prevent returning an empty list for ``ClassDef.slots()`` when the mro list contains one class & it is not ``object``. Refs pylint-dev/pylint#5099 * Prevent a crash when inferring calls to ``str.format`` with inferred arguments that would be invalid. Closes #1856 * Infer the `length` argument of the ``random.sample`` function. Refs pylint-dev/pylint#7706 * Catch ``ValueError`` when indexing some builtin containers and sequences during inference. Closes #1843 What's New in astroid 2.12.12? ============================== Release date: 2022-10-19 * Add the ``length`` parameter to ``hash.digest`` & ``hash.hexdigest`` in the ``hashlib`` brain. Refs pylint-dev/pylint#4039 * Prevent a crash when a module's ``__path__`` attribute is unexpectedly missing. Refs pylint-dev/pylint#7592 * Fix inferring attributes with empty annotation assignments if parent class contains valid assignment. Refs pylint-dev/pylint#7631 What's New in astroid 2.12.11? 
============================== Release date: 2022-10-10 * Add ``_value2member_map_`` member to the ``enum`` brain. Refs pylint-dev/pylint#3941 * Improve detection of namespace packages for the modules with ``__spec__`` set to None. Closes pylint-dev/pylint#7488. * Fixed a regression in the creation of the ``__init__`` of dataclasses with multiple inheritance. Closes pylint-dev/pylint#7434 What's New in astroid 2.12.10? ============================== Release date: 2022-09-17 * Fixed a crash when introspecting modules compiled by `cffi`. Closes #1776 Closes pylint-dev/pylint#7399 * ``decorators.cached`` now gets its cache cleared by calling ``AstroidManager.clear_cache``. Refs #1780 What's New in astroid 2.12.9? ============================= Release date: 2022-09-07 * Fixed creation of the ``__init__`` of ``dataclassess`` with multiple inheritance. Closes pylint-dev/pylint#7427 * Fixed a crash on ``namedtuples`` that use ``typename`` to specify their name. Closes pylint-dev/pylint#7429 What's New in astroid 2.12.8? ============================= Release date: 2022-09-06 * Fixed a crash in the ``dataclass`` brain for ``InitVars`` without subscript typing. Closes pylint-dev/pylint#7422 * Fixed parsing of default values in ``dataclass`` attributes. Closes pylint-dev/pylint#7425 What's New in astroid 2.12.7? ============================= Release date: 2022-09-06 * Fixed a crash in the ``dataclass`` brain for uninferable bases. Closes pylint-dev/pylint#7418 What's New in astroid 2.12.6? ============================= Release date: 2022-09-05 * Fix a crash involving ``Uninferable`` arguments to ``namedtuple()``. Closes pylint-dev/pylint#7375 * The ``dataclass`` brain now understands the ``kw_only`` keyword in dataclass decorators. Closes pylint-dev/pylint#7290 What's New in astroid 2.12.5? ============================= Release date: 2022-08-29 * Prevent first-party imports from being resolved to `site-packages`. 
  Refs pylint-dev/pylint#7365

* Fix ``astroid.interpreter._import.util.is_namespace()`` incorrectly returning
  ``True`` for frozen stdlib modules on PyPy.

  Closes #1755


What's New in astroid 2.12.4?
=============================
Release date: 2022-08-25

* Fixed a crash involving non-standard type comments such as
  ``# type: # any comment``.

  Refs pylint-dev/pylint#7347


What's New in astroid 2.12.3?
=============================
Release date: 2022-08-23

* Fixed crash in ``ExplicitNamespacePackageFinder`` involving
  ``_SixMetaPathImporter``.

  Closes #1708

* Fix unhandled `FutureWarning` from pandas import in cython modules.

  Closes #1717

* Fix false positive with inference of type-annotated Enum classes.

  Refs pylint-dev/pylint#7265

* Fix crash with inference of type-annotated Enum classes where the member has
  no value.

* Fix a crash inferring invalid old-style string formatting with `%`.

  Closes #1737

* Fix false positive with inference of ``http`` module when iterating
  ``HTTPStatus``.

  Refs pylint-dev/pylint#7307

* Bumped minimum requirement of ``wrapt`` to 1.14 on Python 3.11.

* Don't add dataclass fields annotated with ``KW_ONLY`` to the list of fields.

  Refs pylint-dev/pylint#5767


What's New in astroid 2.12.2?
=============================
Release date: 2022-07-12

* Fixed crash in modulo operations for divisions by zero.

  Closes #1700

* Fixed crash with recursion limits during inference.

  Closes #1646


What's New in astroid 2.12.1?
=============================
Release date: 2022-07-10

* Fix a crash when inferring old-style string formatting (``%``) using tuples.

* Fix a crash when ``None`` (or a value inferred as ``None``) participates in a
  ``**`` expression.

* Fix a crash involving properties within ``if`` blocks.


What's New in astroid 2.12.0?
=============================
Release date: 2022-07-09

* Fix signal has no ``connect`` member for PySide2 5.15.2+ and PySide6.

  Closes #4040, #5378

* ``astroid`` now requires Python 3.7.2 to run.
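Several entries above concern inference of old-style ``%`` string formatting. As a plain-Python reference (astroid itself is not required), this sketch shows the runtime semantics those fixes model; the names are illustrative only:

```python
# Old-style ``%`` formatting with a tuple of arguments: the runtime behavior
# that the 2.12.x inference fixes model.
template = "%s scored %d points"
formatted = template % ("Ada", 42)
assert formatted == "Ada scored 42 points"

# Invalid operands raise TypeError at runtime; the 2.12.3 fix makes astroid
# survive *inferring* such invalid expressions instead of crashing.
try:
    bad = "%d" % "not a number"
except TypeError:
    bad = None
assert bad is None
```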
* Avoid setting a Call as a base for classes created using
  ``six.with_metaclass()``.

  Refs pylint-dev/pylint#5935

* Fix detection of builtins on ``PyPy`` 3.9.

* Fix ``re`` brain on Python ``3.11``. The flags now come from ``re._compile``.

* Build ``nodes.Module`` for frozen modules which have location information in
  their ``ModuleSpec``.

  Closes #1512

* The ``astroid.mixins`` module has been deprecated and marked for removal in
  3.0.0.

  Closes #1633

* Capture and log messages emitted by C extensions when importing them. This
  prevents contaminating programmatic output, e.g. pylint's JSON reporter.

  Closes pylint-dev/pylint#3518

* Calls to ``str.format`` are now correctly inferred.

  Closes #104, Closes #1611

* ``__new__`` and ``__init__`` have been added to the ``ObjectModel`` and are
  now inferred as ``BoundMethods``.

* Old style string formatting (using ``%`` operators) is now correctly inferred.

  Closes #151

* Adds missing enums from ``ssl`` module.

  Closes pylint-dev/pylint#3691

* Remove dependency on ``pkg_resources`` from ``setuptools``.

  Closes #1103

* Allowed ``AstroidManager.clear_cache`` to reload necessary brain plugins.

* Fixed incorrect inferences after rebuilding the builtins module, e.g. by
  calling ``AstroidManager.clear_cache``.

  Closes #1559

* ``Arguments.defaults`` is now ``None`` for uninferable signatures.

* On Python versions >= 3.9, ``astroid`` now understands subscripting builtin
  classes such as ``enumerate`` or ``staticmethod``.

* Fixed inference of ``Enums`` when they are imported under an alias.

  Closes pylint-dev/pylint#5776

* Rename ``ModuleSpec`` -> ``module_type`` constructor parameter to match
  attribute name and improve typing. Use ``type`` instead.

* ``ObjectModel`` and ``ClassModel`` now know about their ``__new__`` and
  ``__call__`` attributes.

* Fixed pylint ``not-callable`` false positive with nested-tuple assignment in
  a for-loop.

  Refs pylint-dev/pylint#5113

* Instances of builtins created with ``__new__(cls, value)`` are now inferred.
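One entry above covers instances of builtins created with ``__new__(cls, value)``. The hypothetical ``UserId`` class below (not from the astroid codebase) shows the plain-Python pattern in question:

```python
# A builtin subclass whose instances are created through __new__(cls, value):
# the construct astroid 2.12 now infers as an instance rather than Uninferable.
class UserId(int):
    def __new__(cls, value):
        # Delegate to int.__new__ with an explicit value argument.
        return super().__new__(cls, value)

uid = UserId(42)
assert isinstance(uid, UserId)
assert uid + 1 == 43
```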
* Infer the return value of the ``.copy()`` method on ``dict``, ``list``,
  ``set``, and ``frozenset``.

  Closes #1403

* Fixed inference of elements of living container objects such as tuples and
  sets in the ``sys`` and ``ssl`` modules.

* Add ``pathlib`` brain to handle ``pathlib.PurePath.parents`` inference.

  Closes pylint-dev/pylint#5783

* Avoid inferring the results of ``**`` operations involving values greater
  than ``1e5`` to avoid expensive computation.

  Closes pylint-dev/pylint#6745

* Fix test for Python ``3.11``. In some instances ``err.__traceback__`` will be
  uninferable now.

* Add brain for numpy core module ``einsumfunc``.

  Closes pylint-dev/pylint#5821

* Infer the ``DictUnpack`` value for ``Dict.getitem`` calls.

  Closes #1195

* Fix a crash involving properties within ``try ... except`` blocks.

  Closes pylint-dev/pylint#6592

* Prevent creating ``Instance`` objects that proxy other ``Instance``s when
  there is ambiguity (or user error) in calling ``__new__(cls)``.

  Refs pylint-dev/pylint#7109


What's New in astroid 2.11.7?
=============================
Release date: 2022-07-09

* Added support for ``usedforsecurity`` keyword to ``hashlib`` constructors.

  Closes pylint-dev/pylint#6017

* Updated the stdlib brain for ``subprocess.Popen`` to accommodate Python 3.9+.

  Closes pylint-dev/pylint#7092


What's New in astroid 2.11.6?
=============================
Release date: 2022-06-13

* The Qt brain now correctly treats calling ``.disconnect()`` (with no
  arguments) on a slot as valid.

* The argparse brain no longer incorrectly adds ``"Namespace"`` to the locals
  of functions that return an ``argparse.Namespace`` object.

  Refs pylint-dev/pylint#6895


What's New in astroid 2.11.5?
=============================
Release date: 2022-05-09

* Fix crash while obtaining ``object_type()`` of an ``Unknown`` node.

  Refs pylint-dev/pylint#6539

* Fix a bug where in attempting to handle the patching of ``distutils`` by
  ``virtualenv``, library submodules called ``distutils`` (e.g.
  ``numpy.distutils``) were also included.

  Refs pylint-dev/pylint#6497


What's New in astroid 2.11.4?
=============================
Release date: 2022-05-02

* Fix ``col_offset`` attribute for nodes involving ``with`` on ``PyPy``.

* Fixed a crash involving two starred expressions: one inside a comprehension,
  both inside a call.

  Refs pylint-dev/pylint#6372

* Made ``FunctionDef.implicit_parameters`` return 1 for methods by making
  ``FunctionDef.is_bound`` return ``True``, as it does for class methods.

  Closes pylint-dev/pylint#6464

* Fixed a crash when ``_filter_stmts`` encounters an ``EmptyNode``.

  Closes pylint-dev/pylint#6438


What's New in astroid 2.11.3?
=============================
Release date: 2022-04-19

* Fixed an error in the Qt brain when building ``instance_attrs``.

  Closes pylint-dev/pylint#6221

* Fixed a crash in the ``gi`` brain.

  Closes pylint-dev/pylint#6371


What's New in astroid 2.11.2?
=============================
Release date: 2022-03-26

* Avoided adding the name of a parent namedtuple to its child's locals.

  Refs pylint-dev/pylint#5982


What's New in astroid 2.11.1?
=============================
Release date: 2022-03-22

* Promoted ``getattr()`` from ``astroid.scoped_nodes.FunctionDef`` to its
  parent ``astroid.scoped_nodes.Lambda``.

* Fixed crash on direct inference via ``nodes.FunctionDef._infer``.

  Closes #817


What's New in astroid 2.11.0?
=============================
Release date: 2022-03-12

* Add new (optional) ``doc_node`` attribute to ``nodes.Module``,
  ``nodes.ClassDef``, and ``nodes.FunctionDef``.

* Accessing the ``doc`` attribute of ``nodes.Module``, ``nodes.ClassDef``, and
  ``nodes.FunctionDef`` has been deprecated in favour of the ``doc_node``
  attribute.
  Note: ``doc_node`` is an (optional) ``nodes.Const`` whereas ``doc`` was an
  (optional) ``str``.

* Passing the ``doc`` argument to the ``__init__`` of ``nodes.Module``,
  ``nodes.ClassDef``, and ``nodes.FunctionDef`` has been deprecated in favour
  of the ``postinit`` ``doc_node`` attribute.
  Note: ``doc_node`` is an (optional) ``nodes.Const`` whereas ``doc`` was an
  (optional) ``str``.

* Replace custom ``cachedproperty`` with ``functools.cached_property`` and
  deprecate it for Python 3.8+.

  Closes #1410

* Set ``end_lineno`` and ``end_col_offset`` attributes to ``None`` for all
  nodes with PyPy 3.8. PyPy 3.8 assigns these attributes inconsistently which
  could lead to unexpected errors. Overwriting them with ``None`` will cause a
  fallback to the already supported way of PyPy 3.7.

* Add missing ``shape`` parameter to numpy ``zeros_like``, ``ones_like``, and
  ``full_like`` methods.

  Closes pylint-dev/pylint#5871

* Only pin ``wrapt`` on the major version.


What's New in astroid 2.10.0?
=============================
Release date: 2022-02-27

* Fixed inference of ``self`` in binary operations in which ``self`` is part
  of a list or tuple.

  Closes pylint-dev/pylint#4826

* Fixed builtin inference on `property` calls not calling the `postinit` of
  the new node, which resulted in instance arguments missing on these nodes.

* Fixed a crash on ``Super.getattr`` when the attribute was previously
  uninferable due to a cache limit size. This limit can be hit when the
  inheritance pattern of a class (and therefore of the ``__init__`` attribute)
  is very large.

  Closes pylint-dev/pylint#5679

* Include names of keyword-only arguments in
  ``astroid.scoped_nodes.Lambda.argnames``.

  Closes pylint-dev/pylint#5771

* Fixed a crash inferring on a ``NewType`` named with an f-string.

  Closes pylint-dev/pylint#5770

* Add support for [attrs v21.3.0](https://github.com/python-attrs/attrs/releases/tag/21.3.0)
  which added a new `attrs` module alongside the existing `attr`.

  Closes #1330

* Use the ``end_lineno`` attribute for the ``NodeNG.tolineno`` property when
  it is available.

  Closes #1350

* Add ``is_dataclass`` attribute to ``ClassDef`` nodes.

* Use ``sysconfig`` instead of ``distutils`` to determine the location of
  python stdlib files and packages.
  Related pull requests: #1322, #1323, #1324

  Closes #1282
  Ref #1103

* Fixed crash with recursion error for inference of class attributes that
  referenced the class itself.

  Closes pylint-dev/pylint#5408

* Fixed crash when trying to infer ``items()`` on the ``__dict__`` attribute
  of an imported module.

  Closes #1085

* Add optional ``NodeNG.position`` attribute. Used for block nodes to highlight
  the position of keyword(s) and name in cases where the AST doesn't provide
  good enough positional information, e.g. ``nodes.ClassDef`` and
  ``nodes.FunctionDef``.

* Fix ``ClassDef.fromlineno``. For Python < 3.8 the ``lineno`` attribute
  includes decorators; ``fromlineno`` should return the line of the ``class``
  statement itself.

* Performance improvements. Only run expensive decorator functions when
  non-default Deprecation warnings are enabled, e.g. during a pytest run.

  Closes #1383


What's New in astroid 2.9.3?
============================
Release date: 2022-01-09

* Fixed regression where packages without a ``__init__.py`` file were not
  recognized or imported correctly.

  Closes #1327


What's New in astroid 2.9.2?
============================
Release date: 2022-01-04

* Fixed regression in ``astroid.scoped_nodes`` where ``_is_metaclass`` was no
  longer accessible.

  Closes #1325


What's New in astroid 2.9.1?
============================
Release date: 2021-12-31

* ``NodeNG.frame()`` and ``NodeNG.statement()`` will start raising
  ``ParentMissingError`` instead of ``AttributeError`` in astroid 3.0. This
  behaviour can already be triggered by passing ``future=True`` to a
  ``frame()`` or ``statement()`` call.

* Prefer the module loader ``get_source()`` method in AstroidBuilder's
  ``module_build()`` when possible, to avoid assumptions about source code
  being available on a filesystem. Otherwise the source cannot be found and
  application behavior changes when running within an embedded hermetic
  interpreter environment (pyoxidizer, etc.).

* Require Python 3.6.2 to use astroid.
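An earlier 2.11.0 entry replaces astroid's custom ``cachedproperty`` with the standard library's ``functools.cached_property`` (Python 3.8+). A short reference of the behavior both share, with an illustrative ``Report`` class of my own invention:

```python
from functools import cached_property

class Report:
    def __init__(self, rows):
        self.rows = rows
        self.computed = 0  # counts how many times the property body runs

    @cached_property
    def total(self):
        # Runs once; the result is then stored in the instance __dict__ and
        # returned directly on later accesses.
        self.computed += 1
        return sum(self.rows)

r = Report([1, 2, 3])
assert r.total == 6
assert r.total == 6
assert r.computed == 1          # the body ran only once
assert "total" in r.__dict__    # cached on the instance itself
```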
* Removed custom ``distutils`` handling for resolving paths to submodules.

  Ref #1321

* Restore custom ``distutils`` handling for resolving paths to submodules.

  Closes pylint-dev/pylint#5645

* Fix ``deque.insert()`` signature in ``collections`` brain.

  Closes #1260

* Fix ``Module`` nodes not having ``col_offset``, ``end_lineno``, and
  ``end_col_offset`` attributes.

* Fix typing and update explanation for ``Arguments.args`` being ``None``.

* Fix crash if a variable named ``type`` is accessed with an index operator
  (``[]``) in a generator expression.

  Closes pylint-dev/pylint#5461

* Enable inference of dataclass import from marshmallow_dataclass. This allows
  the dataclasses brain to recognize dataclasses annotated by
  marshmallow_dataclass.

* Resolve symlinks in the import path. Fixes inference errors when the import
  path includes symlinks (e.g. Python installed on macOS via Homebrew).

  Closes #823
  Closes pylint-dev/pylint#3499
  Closes pylint-dev/pylint#4302
  Closes pylint-dev/pylint#4798
  Closes pylint-dev/pylint#5081


What's New in astroid 2.9.0?
============================
Release date: 2021-11-21

* Add ``end_lineno`` and ``end_col_offset`` attributes to astroid nodes.

* Always treat ``__class_getitem__`` as a classmethod.

* Add missing ``as_string`` visitor method for ``Unknown`` node.

  Closes #1264


What's New in astroid 2.8.6?
============================
Release date: 2021-11-21

* Fix crash on inference of subclasses created from ``Class().__subclasses__``.

  Closes pylint-dev/pylint#4982

* Fix bug with Python 3.7.0 / 3.7.1 and ``typing.NoReturn``.

  Closes #1239


What's New in astroid 2.8.5?
============================
Release date: 2021-11-12

* Use more permissive versions for the ``typed-ast`` dependency (<2.0 instead
  of <1.5).

  Closes #1237

* Fix crash on inference of ``__len__``.

  Closes pylint-dev/pylint#5244

* Added missing ``kind`` (for ``Const``) and ``conversion`` (for
  ``FormattedValue``) fields to repr.
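The 2.9.0 entry "Always treat ``__class_getitem__`` as a classmethod" mirrors documented Python behavior: when defined in a class body, ``__class_getitem__`` is implicitly a class method. A minimal plain-Python sketch (the ``Registry`` class is invented for illustration):

```python
class Registry:
    # No @classmethod decorator needed: Python implicitly treats
    # __class_getitem__ as a classmethod, receiving the class as `cls`.
    def __class_getitem__(cls, item):
        return (cls.__name__, item)

assert Registry[int] == ("Registry", int)
assert Registry["users"] == ("Registry", "users")
```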
* Fix crash with assignment expressions, nested if expressions and filtering
  of statements.

  Closes pylint-dev/pylint#5178

* Fix incorrect filtering of assignment-expression statements.


What's New in astroid 2.8.4?
============================
Release date: 2021-10-25

* Fix the ``scope()`` and ``frame()`` methods of ``NamedExpr`` nodes. When
  these nodes occur in ``Arguments``, ``Keyword`` or ``Comprehension`` nodes
  these methods now correctly point to the outer-scope of the ``FunctionDef``,
  ``ClassDef``, or ``Comprehension``.

* Fix the ``set_local`` function for ``NamedExpr`` nodes. When these nodes
  occur in ``Arguments``, ``Keyword``, or ``Comprehension`` nodes these nodes
  are now correctly added to the locals of the ``FunctionDef``, ``ClassDef``,
  or ``Comprehension``.


What's New in astroid 2.8.3?
============================
Release date: 2021-10-17

* Add support for wrapt 1.13.

* Fixes handling of nested partial functions.

  Closes pylint-dev/pylint#2462
  Closes #1208

* Fix regression with the import resolver.

  Closes pylint-dev/pylint#5131

* Fix crash with invalid dataclass field call.

  Closes pylint-dev/pylint#5153


What's New in astroid 2.8.2?
============================
Release date: 2021-10-07

Same content as 2.8.2-dev0 / 2.8.1, released in order to fix a mistake when
creating the tag.


What's New in astroid 2.8.1?
============================
Release date: 2021-10-06

* Adds support for type hints inside numpy's brains.

  Closes pylint-dev/pylint#4326

* Enable inference of dataclass import from pydantic.dataclasses. This allows
  the dataclasses brain to recognize pydantic dataclasses.

  Closes pylint-dev/pylint#4899

* Fix regression on ClassDef inference.

  Closes pylint-dev/pylint#5030
  Closes pylint-dev/pylint#5036

* Fix regression on Compare node inference.

  Closes pylint-dev/pylint#5048

* Extended attrs brain to support the provisional APIs.

* Astroid does not trigger its own deprecation warning anymore.

* Improve brain for ``typing.Callable`` and ``typing.Type``.
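The 2.8.3 entry about nested partial functions refers to ``functools.partial`` objects that wrap other partials. A plain-Python sketch of the pattern (the ``power`` helper is invented for illustration):

```python
from functools import partial

def power(base, exponent, *, offset=0):
    return base ** exponent + offset

# A partial wrapping another partial: the nesting astroid 2.8.3 learned to
# unwrap during inference.
square = partial(power, exponent=2)
shifted_square = partial(square, offset=1)

assert square(3) == 9
assert shifted_square(3) == 10
```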
* Fix bug with importing namespace packages with relative imports.

  Closes pylint-dev/pylint#5059

* The ``is_typing_guard`` and ``is_sys_guard`` functions are deprecated and
  will be removed in 3.0.0. They are complex meta-inference functions that are
  better suited for pylint. Import them from ``pylint.checkers.utils`` instead
  (requires pylint ``2.12``).

* Suppress the conditional between applied brains and dynamic import authorized
  modules. (Revert the "The transforms related to a module are applied only if
  this module has not been explicitly authorized to be imported" of version
  2.7.3)

* Adds a brain to infer the ``numpy.ma.masked_where`` function.

  Closes pylint-dev/pylint#3342


What's New in astroid 2.8.0?
============================
Release date: 2021-09-14

* Add additional deprecation warnings in preparation for astroid 3.0.

* Require attributes for some node classes with ``__init__`` call:

  * ``name`` (``str``) for ``Name``, ``AssignName``, ``DelName``
  * ``attrname`` (``str``) for ``Attribute``, ``AssignAttr``, ``DelAttr``
  * ``op`` (``str``) for ``AugAssign``, ``BinOp``, ``BoolOp``, ``UnaryOp``
  * ``names`` (``list[tuple[str, str | None]]``) for ``Import``

* Support pyz imports.

  Closes pylint-dev/pylint#3887

* Add ``node_ancestors`` method to ``NodeNG`` for obtaining the ancestors of
  nodes.

* It's now possible to infer the value of comparison nodes.

  Closes #846

* Fixed bug in inference of dataclass field calls.

  Closes pylint-dev/pylint#4963


What's New in astroid 2.7.3?
============================
Release date: 2021-08-30

* The transforms related to a module are applied only if this module has not
  been explicitly authorized to be imported (i.e. is not in
  AstroidManager.extension_package_whitelist). Solves the following issues if
  numpy is authorized to be imported through the `extension-pkg-allow-list`
  option.

  Closes pylint-dev/pylint#3342
  Closes pylint-dev/pylint#4326

* Fixed bug in attribute inference from inside method calls.
  Closes pylint-dev/pylint#400

* Fixed bug in inference for superclass instance methods called from the class
  rather than an instance.

  Closes #1008
  Closes pylint-dev/pylint#4377

* Fixed bug in inference of chained attributes where a subclass had an
  attribute that was an instance of its superclass.

  Closes pylint-dev/pylint#4220

* Adds a brain for the ctypes module.

  Closes pylint-dev/pylint#4896

* When processing dataclass attributes, exclude the same type hints from
  abc.collections as from typing.

  Closes pylint-dev/pylint#4895

* Apply dataclass inference to pydantic's dataclasses.

  Closes pylint-dev/pylint#4899


What's New in astroid 2.7.2?
============================
Release date: 2021-08-20

* ``BaseContainer`` is now public, and will replace ``_BaseContainer``
  completely in astroid 3.0.

* The call cache used by inference functions produced by ``inference_tip`` can
  now be cleared via ``clear_inference_tip_cache``.

* ``astroid.const.BUILTINS`` and ``astroid.bases.BUILTINS`` are not used
  internally anymore and will be removed in astroid 3.0. Simply replace this
  by the string 'builtins' for better performance and clarity.

* Add inference for dataclass initializer method.

  Closes pylint-dev/pylint#3201


What's New in astroid 2.7.1?
============================
Release date: 2021-08-16

* When processing dataclass attributes, only do typing inference on collection
  types. Support for instantiating other typing types is left for the future,
  if desired.

  Closes #1129

* Fixed LookupMixIn missing from ``astroid.node_classes``.


What's New in astroid 2.7.0?
============================
Release date: 2021-08-15

* Import from ``astroid.node_classes`` and ``astroid.scoped_nodes`` has been
  deprecated in favor of ``astroid.nodes``. Only the imports from
  ``astroid.nodes`` will work in astroid 3.0.0.
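The 2.7.2 entry "Add inference for dataclass initializer method" concerns the ``__init__`` that the ``dataclasses`` module generates. As a plain-Python reference (the ``Order`` class is invented for illustration):

```python
from dataclasses import dataclass, field

# The generated initializer here is __init__(self, item, quantity=1, tags=...),
# which is what the dataclass brain learned to infer.
@dataclass
class Order:
    item: str
    quantity: int = 1
    tags: list = field(default_factory=list)

o = Order("widget", quantity=3)
assert (o.item, o.quantity, o.tags) == ("widget", 3, [])
```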
* Add support for arbitrary Enum subclass hierarchies.

  Closes pylint-dev/pylint#533
  Closes pylint-dev/pylint#2224
  Closes pylint-dev/pylint#2626

* Add inference tips for dataclass attributes, including dataclasses.field
  calls. Also add support for InitVar.

  Closes pylint-dev/pylint#2600
  Closes pylint-dev/pylint#2698
  Closes pylint-dev/pylint#3405
  Closes pylint-dev/pylint#3794

* Adds a brain that deals with dynamic import of `IsolatedAsyncioTestCase`
  class of the `unittest` module.

  Closes pylint-dev/pylint#4060


What's New in astroid 2.6.6?
============================
Release date: 2021-08-03

* Added support to infer return type of ``typing.cast()``.

* Fix variable lookup handling of exclusive statements.

  Closes pylint-dev/pylint#3711

* Fix variable lookup handling of function parameters.

  Closes pylint-dev/astroid#180

* Fix variable lookup's handling of except clause variables.

* Fix handling of classes with duplicated bases with the same name.

  Closes pylint-dev/astroid#1088


What's New in astroid 2.6.5?
============================
Release date: 2021-07-21

* Fix a crash when there would be a 'TypeError object does not support item
  assignment' in the code we parse.

  Closes pylint-dev/pylint#4439

* Fix a crash when an ``AttributeInferenceError`` was raised when failing to
  find the real name in ``infer_import_from``.

  Closes pylint-dev/pylint#4692


What's New in astroid 2.6.4?
============================
Release date: 2021-07-19

* Fix a crash when a ``StopIteration`` was raised when inferring a faulty
  function in a context manager.

  Closes pylint-dev/pylint#4723


What's New in astroid 2.6.3?
============================
Release date: 2021-07-19

* Added ``If.is_sys_guard`` and ``If.is_typing_guard`` helper methods.

* Fix a bad inference type for yield values inside of a derived class.
  Closes pylint-dev/astroid#1090

* Fix a crash when the node is a 'Module' in the brain builtin inference.

  Closes pylint-dev/pylint#4671

* Fix issues when inferring match variables.

  Closes pylint-dev/pylint#4685

* Fix lookup for nested non-function scopes.

* Fix issue that ``TypedDict`` instance wasn't callable.

  Closes pylint-dev/pylint#4715

* Add dependency on setuptools and a guard to prevent related exceptions.


What's New in astroid 2.6.2?
============================
Release date: 2021-06-30

* Fix a crash when the inference of the length of a node failed.

  Closes pylint-dev/pylint#4633

* Fix unhandled StopIteration during inference, following the implementation
  of PEP 479 in Python 3.7+.

  Closes pylint-dev/pylint#4631
  Closes #1080


What's New in astroid 2.6.1?
============================
Release date: 2021-06-29

* Fix issue with ``TypedDict`` for Python 3.9+.

  Closes pylint-dev/pylint#4610


What's New in astroid 2.6.0?
============================
Release date: 2021-06-22

* Appveyor and travis are no longer used in the continuous integration.

* ``setuptools_scm`` has been removed and replaced by ``tbump`` in order to
  not have hidden runtime dependencies on setuptools.

* ``NodeNG``, the base node class, is now accessible from ``astroid`` or
  ``astroid.nodes`` as it can be used for typing.

* Update enum brain to improve inference of ``.name`` and ``.value`` dynamic
  class attributes.

  Closes pylint-dev/pylint#1932
  Closes pylint-dev/pylint#2062
  Closes pylint-dev/pylint#2306

* Removed ``Repr``, ``Exec``, and ``Print`` nodes, as the ``ast`` nodes they
  represented have been removed with the change to Python 3.

* Deprecate ``Ellipsis`` node. It will be removed with the next minor release.
  Checkers that already support Python 3.8+ work without issues. It's only
  necessary to remove all references to the ``astroid.Ellipsis`` node. This
  change will make development of checkers easier as the resulting tree for
  Ellipsis will no longer depend on the python version.
  **Background**: With Python 3.8 the ``ast.Ellipsis`` node, along with
  ``ast.Str``, ``ast.Bytes``, ``ast.Num``, and ``ast.NamedConstant``, were
  merged into ``ast.Constant``.

* Deprecated ``Index`` and ``ExtSlice`` nodes. They will be removed with the
  next minor release. Both are now part of the ``Subscript`` node. Checkers
  that already support Python 3.9+ work without issues. It's only necessary to
  remove all references to the ``astroid.Index`` and ``astroid.ExtSlice``
  nodes. This change will make development of checkers easier as the resulting
  tree for ``ast.Subscript`` nodes will no longer depend on the python version.

  **Background**: With Python 3.9 ``ast.Index`` and ``ast.ExtSlice`` were
  merged into the ``ast.Subscript`` node.

* Updated all Match nodes to be internally consistent.

* Add ``Pattern`` base class.


What's New in astroid 2.5.8?
============================
Release date: 2021-06-07

* Improve support for Pattern Matching.

* Add lineno and col_offset for ``Keyword`` nodes and Python 3.9+.

* Add global inference cache to speed up inference of long statement blocks.

* Add a limit to the total number of nodes inferred indirectly as a result of
  inferring some node.


What's New in astroid 2.5.7?
============================
Release date: 2021-05-09

* Fix six.with_metaclass transformation so it doesn't break user defined
  transformations.

* Fix detection of relative imports.
  Closes #930
  Closes pylint-dev/pylint#4186

* Fix inference of instance attributes defined in base classes.

  Closes #932

* Update `infer_named_tuple` brain to reject namedtuple definitions that would
  raise ValueError.

  Closes #920

* Do not set instance attributes on builtin object().

  Closes #945
  Closes pylint-dev/pylint#4232
  Closes pylint-dev/pylint#4221
  Closes pylint-dev/pylint#3970
  Closes pylint-dev/pylint#3595

* Fix some spurious cycles detected in ``context.path`` leading to more cases
  that can now be inferred.

  Closes #926

* Add ``kind`` field to ``Const`` nodes, matching the structure of the built-in
  ast Const. The kind field is "u" if the literal is a u-prefixed string, and
  ``None`` otherwise.

  Closes #898

* Fix property inference in class contexts for properties defined on the
  metaclass.

  Closes #940

* Update enum brain to fix definition of __members__ for subclass-defined
  Enums.

  Closes pylint-dev/pylint#3535
  Closes pylint-dev/pylint#4358

* Update random brain to fix a crash with inference of some sequence elements.

  Closes #922

* Fix inference of attributes defined in a base class that is an inner class.

  Closes #904

* Allow inferring a return value of None for non-abstract empty functions and
  functions with no return statements (implicitly returning None).

  Closes #485

* scm_setuptools has been added to the packaging.

* Astroid's tags are now the standard form ``vX.Y.Z`` and not
  ``astroid-X.Y.Z`` anymore.

* Add initial support for Pattern Matching in Python 3.10.


What's New in astroid 2.5.6?
============================
Release date: 2021-04-25

* Fix retro-compatibility issues with old versions of pylint.

  Closes pylint-dev/pylint#4402


What's New in astroid 2.5.5?
============================
Release date: 2021-04-24

* Fixes the discord link in the project urls of the package.

  Closes pylint-dev/pylint#4393


What's New in astroid 2.5.4?
============================
Release date: 2021-04-24

* The packaging is now done via setuptools exclusively.
  ``doc``, ``tests``, and ``Changelog`` are not packaged anymore - reducing the
  size of the package greatly.

* Debian packaging is now (officially) done in
  https://salsa.debian.org/python-team/packages/astroid.

* ``__pkginfo__`` now only contains ``__version__`` (also accessible with
  ``astroid.__version__``); other meta-information is still accessible with
  ``from importlib import metadata;metadata.metadata('astroid')``.

* Added inference tip for ``typing.Tuple`` alias.

* Fix crash when evaluating ``typing.NamedTuple``.

  Closes pylint-dev/pylint#4383

* COPYING was removed in favor of COPYING.LESSER and the latter was renamed to
  LICENSE to make more apparent that the code is licensed under LGPLv2 or
  later.

* Moved from appveyor and travis to Github Actions for continuous integration.


What's New in astroid 2.5.3?
============================
Release date: 2021-04-10

* Takes into account the fact that subscript inferring for a ClassDef may
  involve the __class_getitem__ method.

* Reworks the ``collections`` and ``typing`` brain so that pylint's acceptance
  tests are fine.

  Closes pylint-dev/pylint#4206

* Use ``inference_tip`` for ``typing.TypedDict`` brain.

* Fix mro for classes that inherit from typing.Generic.

* Add inference tip for typing.Generic and typing.Annotated with
  ``__class_getitem__``.

  Closes pylint-dev/pylint#2822


What's New in astroid 2.5.2?
============================
Release date: 2021-03-28

* Detects `import numpy` as a valid `numpy` import.

  Closes pylint-dev/pylint#3974

* Iterate over ``Keywords`` when using ``ClassDef.get_children``.

  Closes pylint-dev/pylint#3202


What's New in astroid 2.5.1?
============================
Release date: 2021-02-28

* The ``context.path`` is reverted to a set because otherwise it leads to
  false positives for non `numpy` functions.

  Closes #895
  Closes #899

* Don't transform dataclass ClassVars.

* Improve typing.TypedDict inference.

* Fix the `Duplicates found in MROs` false positive.
  Closes #905
  Closes pylint-dev/pylint#2717
  Closes pylint-dev/pylint#3247
  Closes pylint-dev/pylint#4093
  Closes pylint-dev/pylint#4131
  Closes pylint-dev/pylint#4145


What's New in astroid 2.5?
============================
Release date: 2021-02-15

* Adds `attr_fset` in the `PropertyModel` class.

  Fixes pylint-dev/pylint#3480

* Remove support for Python 3.5.

* Remove the runtime dependency on ``six``. The ``six`` brain remains in
  astroid.

  Fixes pylint-dev/astroid#863

* Enrich the ``brain_collection`` module so that the ``__class_getitem__``
  method is added to `deque` for Python versions 3.9 and above.

* The ``context.path`` is now a ``dict`` and the ``context.push`` method
  returns ``True`` if the node has been visited a certain number of times.

  Close #669

* Adds a brain for type object so that it is possible to write `type[int]` in
  annotation.

  Fixes pylint-dev/pylint#4001

* Add ``__class_getitem__`` method to ``subprocess.Popen`` brain under Python
  3.9 so that it is seen as subscriptable by pylint.

  Fixes pylint-dev/pylint#4034

* Adds the `numpy ufunc` functions `degrees` and `radians` to the `numpy`
  brain. Adds the `random` function to the `numpy.random` brain.

  Fixes pylint-dev/pylint#3856

* Fix deprecated importlib methods.

  Closes #703

* Fix a crash in inference caused by `Uninferable` container elements.

  Close #866

* Add `python 3.9` support.

* The flat attribute of ``numpy.ndarray`` is now inferred as a
  ``numpy.ndarray`` itself. It should be a ``numpy.flatiter`` instance, but
  this class is not yet available in the numpy brain.

  Fixes pylint-dev/pylint#3640

* Fix a bug for dunder methods inference of function objects.

  Fixes #819

* Fixes a bug in the signature of the ``ndarray.__or__`` method, in the
  ``brain_numpy_ndarray.py`` module.

  Fixes #815

* Fixes a to-list cast bug in the ``starred_assigned_stmts`` method, in the
  ``protocols.py`` module.
* Added a brain for ``hypothesis.strategies.composite``.

* The transpose of a ``numpy.ndarray`` is also a ``numpy.ndarray``.

  Fixes pylint-dev/pylint#3387

* Added a brain for ``sqlalchemy.orm.session``.

* Separate string and bytes classes patching.

  Fixes pylint-dev/pylint#3599

* Prevent recursion error for self referential length calls.

  Close #777

* Added missing methods to the brain for ``mechanize``, to fix pylint false
  positives.

  Close #793

* Added more supported parameters to ``subprocess.check_output``.

* Fix recursion errors with pandas.

  Fixes pylint-dev/pylint#2843
  Fixes pylint-dev/pylint#2811

* Added exception inference for `UnicodeDecodeError`.

  Close pylint-dev/pylint#3639

* `FunctionDef.is_generator` properly handles `yield` nodes in `If` tests.

  Close pylint-dev/pylint#3583

* Fixed exception-chaining error messages.

* Fix failure to infer base class type with multiple inheritance and qualified
  names.

  Fixes #843

* Fix interpretation of ``six.with_metaclass`` class definitions.

  Fixes #713

* Reduce memory usage of astroid's module cache.

* Remove dependency on `imp`.

  Close #594
  Close #681

* Do not crash when encountering starred assignments in enums.

  Close #835

* Fix a crash in functools.partial inference when the arguments cannot be
  determined.

  Close pylint-dev/pylint#3776

* Fix a crash caused by a lookup of a monkey-patched method.

  Close pylint-dev/pylint#3686

* ``is_generator`` correctly considers `Yield` nodes in `AugAssign` nodes.
  This fixes a false positive with the `assignment-from-no-return` pylint
  check.

  Close pylint-dev/pylint#3904

* Corrected the parent of function type comment nodes. These nodes used to be
  parented to their original ast.FunctionDef parent but are now correctly
  parented to their astroid.FunctionDef parent.

  Close pylint-dev/astroid#851


What's New in astroid 2.4.2?
============================
Release date: 2020-06-08

* `FunctionDef.is_generator` properly handles `yield` nodes in `While` tests

  Close pylint-dev/pylint#3519

* Properly construct the arguments of inferred property descriptors

  Close pylint-dev/pylint#3648


What's New in astroid 2.4.1?
============================
Release date: 2020-05-05

* Handle the case where the raw builder fails to retrieve the ``__all__``
  attribute

  Close #772

* Restructure the AST parsing heuristic to always pick the same module

  Close pylint-dev/pylint#3540
  Close #773

* Changed setup.py to work with [distlib](https://pypi.org/project/distlib)

  Close #779

* Do not crash with SyntaxError when parsing namedtuples with an invalid label

  Close pylint-dev/pylint#3549

* Protect against ``infer_call_result`` failing with `InferenceError` in
  `Super.getattr()`

  Close pylint-dev/pylint#3529


What's New in astroid 2.4.0?
============================
Release date: 2020-04-27

* Expose an ast_from_string method in AstroidManager, which will accept
  source code as a string and return the corresponding astroid object

  Closes pylint-dev/astroid#725

* ``BoundMethod.implicit_parameters`` returns a proper value for ``__new__``

  Close pylint-dev/pylint#2335

* Allow slots added dynamically to a class to still be inferred

  Close pylint-dev/pylint#2334

* Allow `FunctionDef.getattr` to look into both instance attrs and special
  attributes

  Close pylint-dev/pylint#1078

* Infer qualified ``classmethod`` as a classmethod.

  Close pylint-dev/pylint#3417

* Prevent a recursion error from happening when inferring the declared
  metaclass of a class

  Close #749

* Raise ``AttributeInferenceError`` when ``getattr()`` receives an empty name

  Close pylint-dev/pylint#2991

* Prevent a recursion error for self-referential variables and `type()` calls.
  Close #199

* Do not infer the first argument of a staticmethod in a metaclass as the
  class itself

  Close pylint-dev/pylint#3032

* ``NodeNG.bool_value()`` gained an optional ``context`` parameter

  We need to pass an inference context downstream when inferring the boolean
  value of a node in order to prevent recursion errors and double inference.
  This fix prevents a recursion error with the dask library.

  Close pylint-dev/pylint#2985

* Pass a context argument to ``astroid.Arguments`` to prevent recursion errors

  Close pylint-dev/pylint#3414

* Better inference of class and static methods decorated with custom methods

  Close pylint-dev/pylint#3209

* Reverse the order of decorators for `infer_subscript`

  `path_wrapper` needs to come first, followed by `raise_if_nothing_inferred`,
  otherwise we won't handle `StopIteration` correctly.

  Close #762

* Prevent a recursion error when inferring self-referential variables
  without definition

  Close pylint-dev/pylint#1285

* Numpy `datetime64.astype` return value is inferred as a `ndarray`.

  Close pylint-dev/pylint#3332

* Skip non-``Assign`` and ``AnnAssign`` nodes from enum reinterpretation

  Closes pylint-dev/pylint#3365

* Numpy ``ndarray`` attributes ``imag`` and ``real`` are now inferred as
  ``ndarray``.

  Close pylint-dev/pylint#3322

* Added a call to ``register_transform`` for all functions of the
  ``brain_numpy_core_multiarray`` module in case the current node is an
  instance of ``astroid.Name``

  Close #666

* Use the parent of the node when inferring aug assign nodes instead of the
  statement

  Close pylint-dev/pylint#2911
  Close pylint-dev/pylint#3214

* Added some functions to the ``brain_numpy_core_umath`` module

  Close pylint-dev/pylint#3319

* Added some functions from the ``numpy.core.multiarray`` module

  Close pylint-dev/pylint#3208

* All the ``numpy ufunc`` functions now derive from a common class that
  implements the specific ``reduce``, ``accumulate``, ``reduceat``,
  ``outer`` and ``at`` methods.
Close pylint-dev/pylint#2885 * ``nodes.Const.itered`` returns a list of ``Const`` nodes, not strings Close pylint-dev/pylint#3306 * The ``shape`` attribute of a ``numpy ndarray`` is now a ``ndarray`` Close pylint-dev/pylint#3139 * Don't ignore special methods when inspecting gi classes Close #728 * Added transform for ``scipy.gaussian`` * Add support for inferring properties. * Added a brain for ``responses`` * Allow inferring positional only arguments. * Retry parsing a module that has invalid type comments It is possible for a module to use comments that might be interpreted as type comments by the `ast` library. We do not want to completely crash on those invalid type comments. Close #708 * Scope the inference to the current bound node when inferring instances of classes When inferring instances of classes from arguments, such as ``self`` in a bound method, we could use as a hint the context's ``boundnode``, which indicates the instance from which the inference originated. As an example, a subclass that uses a parent's method which returns ``self``, will override the ``self`` to point to it instead of pointing to the parent class. Close pylint-dev/pylint#3157 * Add support for inferring exception instances in all contexts We were able to infer exception instances as ``ExceptionInstance`` only for a handful of cases, but not all. ``ExceptionInstance`` has support for better inference of `.args` and other exception related attributes that normal instances do not have. This additional support should remove certain false positives related to ``.args`` and other exception attributes in ``pylint``. Close pylint-dev/pylint#2333 * Add more supported parameters to ``subprocess.check_output`` Close #722 * Infer args unpacking of ``self`` Certain stdlib modules use ``*args`` to encapsulate the ``self`` parameter, which results in uninferable instances given we rely on the presence of the ``self`` argument to figure out the instance where we should be setting attributes. 
  Close pylint-dev/pylint#3216

* Clean up setup.py

  Make pytest-runner a requirement only if running tests, similar to what was
  done with McCabe. Clean up the setup.py file, resolving a handful of minor
  warnings with it.

* Handle StopIteration error in infer_int.

  Close pylint-dev/pylint#3274

* Can access per-argument type comments for positional-only and keyword-only
  arguments.

  The comments are accessed through the new
  ``Arguments.type_comment_posonlyargs`` and
  ``Arguments.type_comment_kwonlyargs`` attributes respectively.

* Relax upper bound on `wrapt`

  Close #755

* Properly analyze CFFI compiled extensions.


What's New in astroid 2.3.2?
============================
Release date: 2019-10-18

* All type comments have as parent the corresponding `astroid` node

  Until now they had as parent the builtin `ast` node, which meant we were
  operating with primitive objects instead of our own.

  Close pylint-dev/pylint#3174

* Pass an inference context to `metaclass()` when inferring an object type

  This should prevent a bunch of recursion errors happening in pylint. Also
  refactor the inference of `IfExp` nodes to use separate contexts for each
  potential branch.

  Close pylint-dev/pylint#3152
  Close pylint-dev/pylint#3159


What's New in astroid 2.3.1?
============================
Release date: 2019-09-30

* A transform for the builtin `dataclasses` module was added.

  This should address various `dataclasses` issues that were surfaced even
  more after the release of pylint 2.4.0. In previous versions of `astroid`,
  annotated assign nodes were allowed to be retrieved via `getattr()` but
  that no longer happens with the latest `astroid` release, as those
  attributes are not actual attributes, but rather virtual ones, thus an
  operation such as `getattr()` does not make sense for them.

* Update attr brain to partly understand annotated attributes

  Close #656


What's New in astroid 2.3.0?
============================
Release date: 2019-09-24

* Add a brain tip for ``subprocess.check_output``

  Close #689

* Remove NodeNG.nearest method because of lack of usage in astroid and pylint.

  Close #691

* Allow importing wheel files.

  Close #541

* Annotated AST follows PEP8 coding style when converted to string.

* Fix a bug where defining a class using type() could cause a
  DuplicateBasesError.

  Close #644

* Dropped support for Python 3.4.

* Numpy brain support is improved.

  Numpy's fundamental type ``numpy.ndarray`` has its own brain,
  ``brain_numpy_ndarray``, and each numpy module that requires brain action
  now has its own numpy brain:

  - ``numpy.core.numeric``
  - ``numpy.core.function_base``
  - ``numpy.core.multiarray``
  - ``numpy.core.numerictypes``
  - ``numpy.core.umath``
  - ``numpy.random.mtrand``

  Close pylint-dev/pylint#2865
  Close pylint-dev/pylint#2747
  Close pylint-dev/pylint#2721
  Close pylint-dev/pylint#2326
  Close pylint-dev/pylint#2021

* ``assert``-only functions are properly inferred as returning ``None``

  Close #668

* Add support for Python 3.8's `NamedExpr` nodes, which are part of
  assignment expressions.

  Close #674

* Added support for inferring `IfExp` nodes.

* Instances of exceptions are inferred as such when inferring in a
  non-exception context

  This allows special inference support for exception attributes such as
  `.args`.

  Close pylint-dev/pylint#2333

* Drop a superfluous and wrong callcontext when inferring the result of a
  context manager

  Close pylint-dev/pylint#2859

* ``igetattr`` raises ``InferenceError`` on re-inference of the same object

  This prevents ``StopIteration`` from leaking when we encounter the same
  object in the current context, which could result in various
  ``RuntimeErrors`` leaking in other parts of the inference. Until we get a
  global context per inference, the solution is sort of a hack, as with the
  suggested global context improvement, we could theoretically reuse the same
  inference object.
Close #663 * Variable annotations can no longer be retrieved with `ClassDef.getattr` Unless they have an attached value, class variable annotations can no longer be retrieved with `ClassDef.getattr.` * Improved builtin inference for ``tuple``, ``set``, ``frozenset``, ``list`` and ``dict`` We were properly inferring these callables *only* if they had consts as values, but that is not the case most of the time. Instead we try to infer the values that their arguments can be and use them instead of assuming Const nodes all the time. Close pylint-dev/pylint#2841 * The last except handler wins when inferring variables bound in an except handler. Close pylint-dev/pylint#2777 * ``threading.Lock.locked()`` is properly recognized as a member of ``threading.Lock`` Close pylint-dev/pylint#2791 * Fix recursion error involving ``len`` and self referential attributes Close pylint-dev/pylint#2736 Close pylint-dev/pylint#2734 Close pylint-dev/pylint#2740 * Can access per argument type comments through new ``Arguments.type_comment_args`` attribute. Close #665 * Fix being unable to access class attributes on a NamedTuple. Close pylint-dev/pylint#1628 * Fixed being unable to find distutils submodules by name when in a virtualenv. Close pylint-dev/pylint#73 What's New in astroid 2.2.0? ============================ Release date: 2019-02-27 * Fix a bug concerning inference of calls to numpy function that should not return Tuple or List instances. Close pylint-dev/pylint#2436 * Fix a bug where a method, which is a lambda built from a function, is not inferred as ``BoundMethod`` Close pylint-dev/pylint#2594 * ``typed_ast`` gets installed for Python 3.7, meaning type comments can now work on 3.7. * Fix a bug concerning inference of unary operators on numpy types. Close pylint-dev/pylint#2436 (first part) * Fix a crash with ``typing.NamedTuple`` and empty fields. Close pylint-dev/pylint#2745 * Add a proper ``strerror`` inference to the ``OSError`` exceptions. 
Close pylint-dev/pylint#2553 * Support non-const nodes as values of Enum attributes. Close #612 * Fix a crash in the ``enum`` brain tip caused by non-assign members in class definitions. Close pylint-dev/pylint#2719 * ``brain_numpy`` returns an undefined type for ``numpy`` methods to avoid ``assignment-from-no-return`` Close pylint-dev/pylint#2694 * Fix a bug where a call to a function that has been previously called via functools.partial was wrongly inferred Close pylint-dev/pylint#2588 * Fix a recursion error caused by inferring the ``slice`` builtin. Close pylint-dev/pylint#2667 * Remove the restriction that "old style classes" cannot have a MRO. This does not make sense any longer given that we run against Python 3 code. Close pylint-dev/pylint#2701 * Added more builtin exceptions attributes. Close #580 * Add a registry for builtin exception models. Close pylint-dev/pylint#1432 * Add brain tips for `http.client`. Close pylint-dev/pylint#2687 * Prevent crashing when processing ``enums`` with mixed single and double quotes. Close pylint-dev/pylint#2676 * ``typing`` types have the `__args__` property. Close pylint-dev/pylint#2419 * Fix a bug where an Attribute used as a base class was triggering a crash Close #626 * Added special support for `enum.IntFlag` Close pylint-dev/pylint#2534 * Extend detection of data classes defined with attr Close #628 * Fix typo in description for brain_attrs What's New in astroid 2.1.0? ============================ Release date: 2018-11-25 * ``threading.Lock.acquire`` has the ``timeout`` parameter now. Close pylint-dev/pylint#2457 * Pass parameters by keyword name when inferring sequences. Close pylint-dev/pylint#2526 * Correct line numbering for f-strings for complex embedded expressions When a f-string contained a complex expression, such as an attribute access, we weren't cloning all the subtree of the f-string expression for attaching the correct line number. 
  This problem comes from the builtin AST parser, which gives the f-string
  and its underlying elements line number 1, causing all sorts of bugs and
  problems in pylint, which expects correct line numbering.

  Close pylint-dev/pylint#2449

* Add support for `argparse.Namespace`

  Close pylint-dev/pylint#2413

* `async` functions are now inferred as `AsyncGenerator` when inferring
  their call result.

* Filter out ``Uninferable`` when inferring the call result of a class with
  an uninferable ``__call__`` method.

  Close pylint-dev/pylint#2434

* Make compatible with AST changes in Python 3.8.

* Subscript inference (e.g. "`a[i]`") now pays attention to multiple inferred
  values for value (e.g. "`a`") and slice (e.g. "`i`")

  Close #614


What's New in astroid 2.0.4?
============================
Release date: 2018-08-10

* Make sure that assign nodes can find ``yield`` statements in their values

  Close pylint-dev/pylint#2400


What's New in astroid 2.0.3?
============================
Release date: 2018-08-08

* The environment markers for PyPy were invalid.


What's New in astroid 2.0.2?
============================
Release date: 2018-08-01

* Stop repeat inference attempt causing a RuntimeError in Python 3.7

  Close pylint-dev/pylint#2317

* infer_call_result can raise InferenceError so make sure to handle that for
  the call sites where it is used

  infer_call_result recently started raising InferenceError for objects for
  which it could not find any returns. Previously it was silently raising a
  StopIteration, which was especially leaking when calling builtin methods.
  Since it is after all an inference method, it is expected that it could
  raise an InferenceError rather than returning nothing.

  Close pylint-dev/pylint#2350


What's New in astroid 2.0.1?
============================
Release date: 2018-07-19

* Released to clear an old wheel package on PyPI


What's New in astroid 2.0?
==========================
Release date: 2018-07-15

* String representation of nodes takes into account precedence and
  associativity rules of operators.

* Fix loading files with `modutils.load_from_module` when the path that
  contains it in `sys.path` is a symlink and the file is contained in a
  symlinked folder.

  Close #583

* Reworking of the numpy brain dealing with numerictypes (use of inspect
  module to determine the class hierarchy of the numpy.core.numerictypes
  module)

  Close pylint-dev/pylint#2140

* Added inference support for starred nodes in for loops

  Close #146

* Support unpacking for dicts in assignments

  Close #268

* Add support for inferring functools.partial

  Close #125

* Inference support for `dict.fromkeys`

  Close #110

* `int()` builtin is inferred as returning integers.

  Close #150

* `str()` builtin is inferred as returning strings.

  Close #148

* DescriptorBoundMethod has the correct number of arguments defined.

* Improvement of the numpy numeric types definition.

  Close pylint-dev/pylint#1971

* Subclasses of *property* are now interpreted as properties

  Close pylint-dev/pylint#1601

* AsStringRegexpPredicate has been removed. Use transform predicates instead.

* Switched to using typed_ast for getting access to type comments

  As a side effect of this change, some nodes gained a new `type_annotation`
  attribute, which, if the type comments were correctly parsed, should
  contain a node object with the corresponding objects from the type comment.

* typing.X[...] and typing.NewType are inferred as classes instead of
  instances.

* Module.__path__ is now a list

  It used to be a string containing the path, which didn't reflect the
  situation in Python, where it is actually a list.

* Fix a bug with namespace package's __path__ attribute.

  Close #528

* Added brain tips for random.sample

  Part of pylint-dev/pylint#811

* Add brain tip for `issubclass` builtin

  Close #101.
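Several of the 2.0 inference additions above model plain runtime behaviour of builtins; as a reminder of what is being inferred, a quick runtime sketch:

```python
from functools import partial

# dict.fromkeys and int()/str() calls like these are what 2.0 learned to infer
assert dict.fromkeys(["a", "b"]) == {"a": None, "b": None}
assert int("42") == 42
assert str(42) == "42"

# functools.partial: astroid 2.0 can follow through to the wrapped function
def add(a, b):
    return a + b

add_three = partial(add, 3)
assert add_three(4) == 7
```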
* Fix submodule imports from six

  Close pylint-dev/pylint#1640

* Fix missing __module__ and __qualname__ from class definition locals

  Close pylint-dev/pylint#1753

* Fix a crash when __annotations__ accesses a parent's __init__ that does not
  have arguments

  Close #473

* Fix multiple objects sharing the same InferenceContext.path causing
  uninferable results

  Close #483

* Fix improper modification of col_offset, lineno upon inference of builtin
  functions

  Close pylint-dev/pylint#1839

* Subprocess.Popen brain now knows of the args member

  Close pylint-dev/pylint#1860

* Add move_to_end method to collections.OrderedDict brain

  Close pylint-dev/pylint#1872

* Include new hashlib classes added in Python 3.6

* Fix RecursionError for augmented assign

  Close #437, #447, #313, pylint-dev/pylint#1642, pylint-dev/pylint#1805,
  pylint-dev/pylint#1854, pylint-dev/pylint#1452

* Add missing attrs special attribute

  Close pylint-dev/pylint#1884

* Inference now understands the 'isinstance' builtin

  Close #98

* Stop duplicate nodes with the same key values from appearing in
  dictionaries from dictionary unpacking.

  Close pylint-dev/pylint#1843

* Fix ``contextlib.contextmanager`` inference for nested context managers

  Close #1699

* Implement inference for len builtin

  Close #112

* Add qname method to Super object preventing potential errors in upstream
  pylint

  Close #533

* Stop astroid from getting stuck in an infinite loop if a function shares
  its name with its decorator

  Close #375

* Fix issue with inherited __call__ improperly inferring self

  Close pylint-dev/pylint#2199

* Fix __call__ precedence for classes with custom metaclasses

  Close pylint-dev/pylint#2159

* Limit the maximum number of inferable results in a NodeNG.infer() call to
  100 by default, for performance reasons with variables that have large
  numbers of possible values. The maximum number of inferable values can be
  tuned by setting the `max_inferable_values` flag on astroid.MANAGER.


What's New in astroid 1.6.0?
============================ Release date: 2017-12-15 * When verifying duplicates classes in MRO, ignore on-the-fly generated classes Close pylint-dev/pylint#1706 * Add brain tip for attrs library to prevent unsupported-assignment-operation false positives Close pylint-dev/pylint#1698 * file_stream was removed, since it was deprecated for three releases Instead one should use the .stream() method. * Vast improvements to numpy support * Add brain tips for curses Close pylint-dev/pylint#1703 * Add brain tips for UUID.int Close pylint-dev/pylint#961 * The result of using object.__new__ as class decorator is correctly inferred as instance Close #172 * Enums created with functional syntax are now iterable * Enums created with functional syntax are now subscriptable * Don't crash when getting the string representation of BadUnaryOperationMessage In some cases, when the operand does not have a .name attribute, getting the string representation of a BadUnaryOperationMessage leads to a crash. Close pylint-dev/pylint#1563 * Don't raise DuplicateBaseError when classes at different locations are used For instance, one can implement a namedtuple base class, which gets reused on a class with the same name later on in the file. Until now, we considered these two classes as being the same, because they shared the name, but in fact they are different, being created at different locations and through different means. Close pylint-dev/pylint#1458 * The func form of namedtuples with keywords is now understood Close pylint-dev/pylint#1530 * Fix inference for nested calls * Dunder class at method level is now inferred as the class of the method Close pylint-dev/pylint#1328 * Stop most inference tip overwrites from happening by using predicates on existing inference_tip transforms. 
Close #472 * Fix object.__new__(cls) calls in classmethods by using a context which has the proper boundnode for the given argument Close #404 * Fix Pathlib type inference Close pylint-dev/pylint#224 Close pylint-dev/pylint#1660 What's New in astroid 1.5.3? ============================ Release date: 2017-06-03 * enum34 dependency is forced to be at least version 1.1.3. Fixes spurious bug related to enum classes being falsy in boolean context, which caused ``_Inconsistent Hierarchy_`` ``RuntimeError`` in ``singledispatch`` module. See links below for details: - http://bugs.python.org/issue26748 - https://bitbucket.org/ambv/singledispatch/issues/8/inconsistent-hierarchy-with-enum - https://bitbucket.org/stoneleaf/enum34/commits/da50803651ab644e6fce66ebc85562f1117c344b * Do not raise an exception when uninferable value is unpacked in ``with`` statement. * Lock objects from ``threading`` module are now correctly recognised as context managers. What's New in astroid 1.5.2? ============================ Release date: 2017-04-17 * Basic support for the class form of typing.NamedTuple * mro() can be computed for classes with old style classes in the hierarchy What's New in astroid 1.5.0? ============================ Release date: 2017-04-13 * Arguments node gained a new attribute, ``kwonlyargs_annotations`` This new attribute holds the annotations for the keyword-only arguments. * `namedtuple` inference now understands `rename` keyword argument * Classes can now know their definition-time arguments. Classes can support keyword arguments, which are passed when a class is constructed using ``__new__``. * Add support for inferring typing.NamedTuple. * ClassDef now supports __getitem__ inference through the metaclass. * getitem() method accepts nodes now, instead of Python objects. * Add support for explicit namespace packages, created with pkg_resources. * Add brain tips for _io.TextIOWrapper's buffer and raw attributes. 
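The functional namedtuple and enum behaviour covered in the 1.6.0 and 1.5.0 entries above can be observed directly at runtime:

```python
from collections import namedtuple
from enum import Enum

# functional form of namedtuple with keyword arguments; rename=True
# replaces the invalid field name 'class' with a positional placeholder
P = namedtuple(typename="P", field_names=["x", "class"], rename=True)
assert P._fields == ("x", "_1")

# enums created with the functional syntax are iterable and subscriptable
Color = Enum("Color", ["RED", "GREEN"])
assert [member.name for member in Color] == ["RED", "GREEN"]
assert Color["RED"] is Color.RED
```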
* Add `returns` into the proper order in FunctionDef._astroid_fields

  The order is important, since it determines the last child, which in turn
  determines the last line number of a scoped node.

* Add brain tips for functools.lru_cache.

* New function, astroid.extract_node, exported out from astroid.test_utils.

* Stop saving assignment locals in ExceptHandlers, when the context is a
  store.

  This fixes a tripping case, where the RHS of an ExceptHandler can be
  redefined by the LHS, leading to a local save. For instance,
  ``except KeyError, exceptions.IndexError`` could result in a local save
  for IndexError as KeyError, resulting in potential unexpected inferences.
  Since we don't lose a lot, this syntax gets prohibited.

* Fix a crash which occurred when the class of a namedtuple could not be
  inferred.

* Add support for implicit namespace packages (PEP 420)

  This change involves a couple of modifications. First, we're relying on a
  spec finder protocol, inspired by importlib's ModuleSpec, for finding where
  a file or package is, using importlib's PathFinder as well, which enables
  us to discover namespace packages. This discovery is the centerpiece of
  the namespace package support, the other part being the construction of a
  dummy Module node whenever a namespace package root directory is requested
  during astroid's import references.

* Introduce a special attributes model

  Through this model, astroid starts knowing special attributes of certain
  Python objects, such as functions, classes, super objects and so on. This
  was possible before, but now the lookup and the attributes themselves are
  separated into a new module, objectmodel.py, which describes, in a more
  comprehensive way, the data model of each object.

* Exceptions have their own object model

  Some of exceptions' attributes, such as .args and .message, can't be
  inferred correctly since they are descriptors that get transformed into
  the proper objects at runtime.
  This can cause issues with the static analysis, since they are inferred
  differently from what's expected. Now when we're creating instances of
  exceptions, we're inferring a special object that knows how to transform
  those runtime attributes into the proper objects via a custom object model.

  Closes issue #81

* dict.values, dict.keys and dict.items are properly inferred to their
  corresponding type, which also includes the proper containers for Python 3.

* Fix a crash which occurred when a method had the same name as a builtin
  object, decorated at the same time by that builtin object (a property,
  for instance)

* The inference can handle the case where the attribute is accessed through
  a subclass of a base class and the attribute is defined at the base class's
  level, by taking into consideration a redefinition in the subclass.

  This should fix https://github.com/pylint-dev/pylint/issues/432

* Calling lambda methods (defined at class level) can be understood.

* Don't take into consideration invalid assignments, especially when the
  __slots__ declaration forbids them.

  Close issue #332

* Functional form of enums support accessing values through __call__.

* Brain tips for the ssl library.

* decoratornames() does not leak InferenceError anymore.

* wildcard_imported_names() got replaced by _public_names()

  Our understanding of wildcard imports through __all__ was half-baked to
  say the least, since we couldn't account for modifications of the list,
  which resulted in tons of false positives. Instead, we replaced it with
  _public_names(), a method which returns all the names that are publicly
  available in a module, that is, those that don't start with an underscore,
  even though this means other names may leak out despite not being present
  in the __all__ variable. The method is private in 1.4.X.

* unpack_infer raises InferenceError if it can't operate with the given
  sequences of nodes.

* Support accessing properties with super().
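The last entry above, property access through super(), corresponds to this runtime pattern:

```python
class Base:
    @property
    def label(self):
        return "base"

class Child(Base):
    @property
    def label(self):
        # accessing the parent's property through super(), the
        # pattern astroid learned to understand
        return super().label + "/child"

assert Child().label == "base/child"
```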
* Enforce strong updates per frames.

  When looking up a name in a scope, Scope.lookup will return only the values
  which will be reachable after execution, as seen in the following code::

      a = 1
      a = 2

  In this case it doesn't make sense to return two values, but only the last
  one.

* Add support for inference on threading.Lock

  As a matter of fact, astroid can infer on threading.RLock and
  threading.Semaphore, but can't do it on threading.Lock (because it comes
  from an extension module).

* pkg_resources brain tips are a bit more specific, by specifying proper
  returns.

* The slots() method conflates all the slots from the ancestors into a list
  of current and parent slots.

  We're doing this because this is the right semantics of slots: they get
  inherited, as long as each parent defines a __slots__ entry.

* Some nodes got a new attribute, 'ctx', which tells the context in which
  the node was used.

  The possible values for the contexts are `Load` ('a'), `Del` ('del a'),
  `Store` ('a = 4') and the nodes that got the new attribute are Starred,
  Subscript, List and Tuple. Closes issue #267.

* relative_to_absolute_name or methods calling it will now raise
  TooManyLevelsError when a relative import was trying to access something
  beyond the top-level package.

* AstroidBuildingException is now AstroidBuildingError. The first name will
  exist until astroid 2.0.

* Add two new exceptions, AstroidImportError and AstroidSyntaxError. They are
  subclasses of AstroidBuildingException and are raised when a module can't
  be imported for various reasons. Also do_import_module lets the errors
  bubble up without converting them to InferenceError. This particular
  conversion happens only during inference.

* Revert to using printf-style formatting in as_string, in order to avoid a
  potential problem with encodings when using .format. Closes issue #273.
  Patch by notsqrt.

* assigned_stmts methods have the same signature from now on.
  They used to have different signatures and each one made assumptions about
  what could be passed to other implementations, leading to various possible
  crashes when one or more arguments weren't given. Closes issue #277.

* Fix metaclass detection, when multiple keyword arguments are used in a
  class definition.

* Add support for annotated variable assignments (PEP 526)

* Starred expressions are now inferred correctly for tuple, list, set, and
  dictionary literals.

* Support for asynchronous comprehensions introduced in Python 3.6.

  Fixes #399. See PEP 530 for details.


What's New in astroid 1.4.1?
============================
Release date: 2015-11-29

* Add support for handling Uninferable nodes when calling as_string

  Some objects, for instance List or Tuple, can have Uninferable as their
  elements after inference, which happens when their components couldn't be
  inferred properly. This means that as_string needs to cope with
  Uninferable nodes being part of the other nodes coming up for a string
  transformation. The patch adds a visit method in AsString and ``accept``
  on Yes / Uninferable nodes. Closes issue #270.


What's New in astroid 1.4.0?
============================
Release date: 2015-11-29

* Class.getattr('__mro__') returns the actual MRO. Closes issue #128.

* The logilab-common dependency is not needed anymore as the needed code was
  integrated into astroid.

* Generated enum member stubs now support IntEnum and multiple base classes.

* astroid.builder.AstroidBuilder.string_build and
  astroid.builder.AstroidBuilder.file_build are now raising
  AstroidBuildingException when the parsing of the string raises a
  SyntaxError.

* Add brain tips for multiprocessing.Manager and
  multiprocessing.managers.SyncManager.

* Add some fixes which enhance the Jython support. The fix mostly includes
  updates to modutils, which is modified in order to properly look up paths
  from live objects, which end in $py.class, not .pyc as for Python 2.
  Closes issue #83.
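The Class.getattr('__mro__') entry above mirrors the runtime attribute, which looks like this:

```python
class A:
    pass

class B(A):
    pass

# the actual method resolution order that astroid's
# Class.getattr('__mro__') now returns for its ClassDef nodes
assert B.__mro__ == (B, A, object)
assert B.mro() == [B, A, object]
```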
* The Generator objects inferred with `infer_call_result` from functions have as parent the function from which they are returned. * Add brain tips for multiprocessing post Python 3.4+, where the module level functions are retrieved with getattr from a context object, leading to many no-member errors in Pylint. * Understand partially the 3-argument form of `type`. The only change is that astroid understands members passed in as dictionaries as the third argument. * .slots() will return an empty list for classes with empty slots. Previously it returned None, which is the same value for classes without slots at all. This was changed in order to better reflect what's actually happening. * Improve the inference of Getattr nodes when dealing with abstract properties from the abc module. In astroid.bases.Instance._wrap_attr we had a detection code for properties, which basically inferred whatever a property returned, passing the results up the stack, to the igetattr() method. It handled only the builtin property but the new patch also handles a couple of other properties, such as abc.abstractproperty. * UnboundMethod.getattr calls the getattr of its _proxied object and doesn't call super(...) anymore. It previously crashed, since the first ancestor in its mro was bases.Proxy and bases.Proxy doesn't implement the .getattr method. Closes issue #91. * Don't hard fail when calling .mro() on a class which has combined both newstyle and old style classes. The class in question is actually newstyle (and the __mro__ can be retrieved using Python). .mro() fallbacks to using .ancestors() in that case. * Class.local_attr and Class.local_attr_ancestors uses internally a mro lookup, using .mro() method, if they can. That means for newstyle classes, when trying to lookup a member using one of these functions, the first one according to the mro will be returned. 
  This nicely reflects reality, but it has the drawback of being a
  behaviour change (the previous behaviour was incorrect though). Also,
  having bases which can return multiple values when inferred will not
  work with the new approach, because .mro() only retrieves the first
  value inferred from a base.

* Expose an implicit_metaclass() method in Class. This will return a
  builtins.type instance for newstyle classes.

* Add two new exceptions for handling MRO error cases.
  DuplicateBasesError is emitted when duplicate bases are found in a
  class, InconsistentMroError is raised when the method resolution is
  determined to be inconsistent. They share a common class, MroError,
  which is a subclass of ResolveError, meaning that this change is
  backwards compatible.

* Classes aren't marked as interfaces anymore, in the `type` attribute.

* Class.has_dynamic_getattr doesn't return True for special methods
  which aren't implemented in pure Python, as is the case for extension
  modules. Since most likely the methods were coming from a live object,
  this implies that all of them will have __getattr__ and
  __getattribute__ present and it is wrong to consider that those
  methods were actually implemented.

* Add basic support for understanding context managers. Previously,
  there was no way to understand what __enter__ returns in a context
  manager and what is bound using the ``as`` keyword. With these
  changes, we can understand ``bar`` in ``with foo() as bar``, which
  will be the result of __enter__.

* Add a new type of node, called *inference objects*. Inference objects
  are similar to AST nodes, but they can be obtained only after
  inference, so they can't be found inside the original AST tree. Their
  purpose is to handle at astroid level some operations which can't be
  handled when using brain transforms. For instance, the first object
  added is FrozenSet, which can be manipulated at astroid's level
  (inferred, itered etc).
  Code such as ``frozenset((1, 2))`` will no longer return an Instance
  of frozenset without access to its content, but a new
  objects.FrozenSet, which can be used just as a nodes.Set.

* Add a new *inference object* called Super, which also adds support for
  understanding super calls. astroid understands the zero-argument form
  of super, specific to Python 3, where the interpreter fills in the
  arguments of the call itself. Also, we understand the 2-argument form
  of super, both for bound lookups (super(X, instance)) as well as for
  unbound lookups (super(X, Y)), with support for validating that the
  object-or-type is a subtype of the first argument. The unbound form of
  super (one argument) is not understood, since it's useless in practice
  and should be removed from Python's specification. Closes issue #89.

* Add inference support for the getattr builtin. Now getattr builtins
  are properly understood. Closes issue #103.

* Add inference support for the hasattr builtin. Closes issue #102.

* Add 'assert_equals' method in nose.tools's brain plugin.

* Don't leak StopIteration when inferring invalid UnaryOps (+[], +None
  etc.).

* Improve the inference of unary operands. When inferring unary
  operands, astroid looks up the return value of __pos__, __neg__ and
  __invert__ to determine the inferred value of ``~node``, ``+node`` or
  ``-node``.

* Improve the inference of six.moves, especially when using `from ...
  import ...` syntax. Also, we added a new fail import hook for
  six.moves, which fixes the import-error false positive from pylint.
  Closes issue #107.

* Make the first steps towards detecting type errors for unary and
  binary operations. In exceptions, one object was added for holding
  information about a possible UnaryOp TypeError, an object called
  `UnaryOperationError`. Even though the name suggests it's an
  exception, it's actually not one.
  When inferring UnaryOps, we use this special object to mark a possible
  TypeError, an object which can be interpreted by pylint in order to
  emit a new warning. We are also exposing a new method for UnaryOps,
  called `type_errors`, which returns a list of UnaryOperationError
  objects.

* A new method was added to the AST nodes, 'bool_value'. It is used to
  deduce the value of a node when used in a boolean context, which is
  useful for both inference, as well as for data flow analysis, where we
  are interested in what branches will be followed when the program will
  be executed. `bool_value` returns True, False or YES, if the node's
  boolean value can't be deduced. The method is used when inferring the
  unary operand `not`. Thus, `not something` will result in calling
  `something.bool_value` and negating the result, if it is a boolean.

* Add inference support for boolean operations (`and` and `or`).

* Add inference support for the builtin `callable`.

* astroid.inspector was moved to pylint.pyreverse, since it is the only
  known client of this module. No other change was made to the exported
  API.

* astroid.utils.ASTWalker and astroid.utils.LocalsVisitor were moved to
  pylint.pyreverse.utils.

* Add inference support for the builtin `bool`.

* Add `igetattr` method to scoped_nodes.Function.

* Add support for Python 3.5's MatMul operation: see PEP 465 for more
  details.

* NotImplemented is detected properly now as being part of the builtins
  module. Previously trying to infer the Name(NotImplemented) returned a
  YES object.

* Add astroid.helpers, a module of various useful utilities which don't
  yet belong in other components. Added *object_type*, a function which
  can be used to obtain the type of almost any astroid object, similar
  to how the builtin *type* works.

* Understand the one-argument form of the builtin *type*. This uses the
  recently added *astroid.helpers.object_type* in order to retrieve the
  Python type of the first argument of the call.
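The one-argument and 3-argument forms of the builtin *type* mentioned in this release behave as follows at runtime (plain Python, no astroid required; the `Point` name is only for the example):

```python
# One-argument form: returns an object's type, which astroid's
# object_type helper mirrors.
print(type(42).__name__)  # int

# Three-argument form: name, bases, and a members dictionary. The
# dictionary entries become class members, which astroid now
# understands when inferring such a call.
Point = type("Point", (object,), {"x": 1, "y": 2})
print(Point.x, Point().y)  # 1 2
```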
* Add helpers.is_supertype and helpers.is_subtype, two functions for
  checking if an object is a super/sub type of another.

* Improve the inference of binary arithmetic operations (normal and
  augmented).

* Add support for retrieving TypeErrors for binary arithmetic
  operations. The change is similar to what was added for UnaryOps: a
  new method called *type_errors* for both AugAssign and BinOp, which
  can be used to retrieve type errors that occurred during inference.
  Also, a new exception object was added, BinaryOperationError.

* Lambdas found at class level, which have a `self` argument, are
  considered BoundMethods when accessing them from instances of their
  class.

* Add support for multiplication of tuples and lists with instances
  which provide an __index__ method returning an int.

* Add support for indexing containers with instances which provide an
  __index__ method returning an int.

* Star unpacking in assignments properly returns a list, not the
  individual components. Closes issue #138.

* Add annotation support for function.as_string(). Closes issue #37.

* Add support for indexing bytes on Python 3.

* Add support for inferring subscript on instances, which will use
  __getitem__. Closes issue #124.

* Add support for pkg_resources.declare_namespaces.

* Move pyreverse specific modules and functionality back into pyreverse
  (astroid.manager.Project, astroid.manager.Manager.project_from_files).

* Understand metaclasses added with six.add_metaclass decorator. Closes
  issue #129.

* Add a new convenience API, `astroid.parse`, which can be used to
  retrieve an astroid AST from a source code string, similar to how
  ast.parse can be used to obtain a Python AST from a source string.
  This is test_utils.build_module promoted to a public API.

* do_import_module passes the proper relative_only flag if the level is
  higher than 1. This has the side effect that using `from .something
  import something` in a non-package will finally result in an
  import-error on Pylint's side.
  Until now relative_only was ignored, leading to the import of
  `something`, if it was globally available.

* Add get_wrapping_class API to scoped_nodes, which can be used to
  retrieve the class that wraps a node.

* Class.getattr looks by default in the implicit and the explicit
  metaclasses, which is `type` on Python 3. Closes issue #114.

* There's a new separate step for transforms. Until now, the transforms
  were applied at the same time the tree was being built. This was
  problematic if the transform functions were using inference, since the
  inference was executed on a partially constructed tree, which led to
  failures when post-building information was needed (such as setting
  the _from_names for the From imports). Now there's a separate step for
  transforms, which are applied using transform.TransformVisitor.
  There are a couple of other related changes:

    * astroid.parse and AstroidBuilder gained a new parameter
      `apply_transforms`, a boolean flag which controls if the
      transforms are applied. We do this because there are use cases
      where the vanilla tree is wanted, without any implicit
      modification.

    * the transforms are also applied for builtin modules, as a side
      effect of the fact that transform visiting was moved in
      AstroidBuilder._post_build from AstroidBuilder._data_build.

  Closes issue #116.

* Class._explicit_metaclass is now a public API, in the form of
  Class.declared_metaclass. Class.metaclass() remains the de facto
  method for retrieving the metaclass of a class, which will also do an
  evaluation of what declared_metaclass returns.

* Understand slices of tuples, lists, strings and instances with support
  for slices. Closes issue #137.

* Add proper grammatical names for the `infered` and `ass_type` methods,
  namely `inferred` and `assign_type`. The old methods will raise
  PendingDeprecationWarning, being slated for removal in astroid 2.0.

* Add new AST names in order to be similar to the ones from the builtin
  ast module.
  With this change, Getattr becomes Attribute, Backquote becomes Repr,
  Class is ClassDef, Function is FunctionDef, Discard is Expr, CallFunc
  is Call, From is ImportFrom, AssName is AssignName and AssAttr is
  AssignAttr. The old names are maintained for backwards compatibility
  and they are interchangeable, in the sense that using Discard will use
  Expr under the hood and the implemented visit_discard in checkers will
  be called with Expr nodes instead. The AST does not contain the old
  nodes, only the interoperability between them hides this fact.
  Recommendations to move to the new nodes are emitted accordingly; the
  old names will be removed in astroid 2.0.

* Add support for understanding class creation using
  ``type.__new__(mcs, name, bases, attrs)``. Until now, inferring this
  kind of call resulted in Instances, not in classes, since astroid
  didn't understand that the presence of the metaclass in the call leads
  to a class creation, not to an instance creation.

* Understand the `slice` builtin. Closes issue #184.

* Add brain tips for numpy.core, which should fix Pylint's #453.

* Add a new node, DictUnpack, which is used to represent the unpacking
  of a dictionary into another dictionary, using PEP 448 specific syntax
  ``{1: 2, **{2: 3}}``. This is a different approach than what the
  builtin ast module does, since it just uses None to represent this
  kind of operation, which seems conceptually wrong, due to the fact the
  AST contains non-AST nodes. Closes issue #206.

What's New in astroid 1.3.6?
============================
Release date: 2015-03-14

* Class.slots raises NotImplementedError for old style classes. Closes
  issue #67.

* Add a new option to AstroidManager, `optimize_ast`, which controls if
  the peephole optimizer should be enabled or not. This prevents a
  regression, where the visit_binop method wasn't called anymore with
  astroid 1.3.5, due to the differences in the resulting AST. Closes
  issue #82.

What's New in astroid 1.3.5?
============================
Release date: 2015-03-11

* Add the ability to optimize small ast subtrees, with the first use in
  the optimization of multiple BinOp nodes. This removes recursion in
  the rebuilder when dealing with a lot of small strings joined by the
  addition operator. Closes issue #59.

* Obtain the methods for the nose brain tip through an unittest.TestCase
  instance. Closes Pylint issue #457.

* Fix a crash which occurred when a class was the ancestor of itself.
  Closes issue #78.

* Improve the scope_lookup method for Classes regarding qualified
  objects, with an attribute name exactly as one provided in the class
  itself. For example, given a class containing an attribute 'first'
  which was also an import, and which had as a base a qualified name or
  a Getattr node of the form 'module.first', Pylint would have inferred
  the `first` name as the function from the Class, not the import.
  Closes Pylint issue #466.

* Implement the assigned_stmts operation for Starred nodes, which was
  omitted when support for Python 3 was added in astroid. Closes issue
  #36.

What's New in astroid 1.3.4?
============================
Release date: 2015-01-17

* Get the first element from the method list when obtaining the
  functions from nose.tools.trivial. Closes Pylint issue #448.

What's New in astroid 1.3.3?
============================
Release date: 2015-01-16

* Restore file_stream to a property, but deprecate it in favour of the
  newly added method Module.stream. By using a method instead of a
  property, it will be easier to properly close the file right after it
  is used, which will ensure that no file descriptors are leaked. Until
  now, due to the fact that a module was cached, it was not possible to
  close the file_stream anywhere. file_stream will start emitting
  PendingDeprecationWarnings in astroid 1.4, DeprecationWarnings in
  astroid 1.5 and it will finally be removed in astroid 1.6.

* Add inference tips for 'tuple', 'list', 'dict' and 'set' builtins.
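The Starred handling added in 1.3.5 above models Python's own star unpacking, where the starred name is always bound to a list; a minimal plain-Python reminder:

```python
# Star unpacking: the starred target collects the remainder as a list,
# which is the behaviour assigned_stmts for Starred nodes has to model.
first, *rest = (1, 2, 3, 4)
print(first, rest)  # 1 [2, 3, 4]
```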
* Add brain definitions for most string and unicode methods.

* Changed the API for Class.slots. It returns None when the class
  doesn't define any slots. Previously, for both the case where the
  class didn't have slots defined and the case where it had an empty
  list of slots, Class.slots returned an empty list.

* Add a new method to Class nodes, 'mro', for obtaining the method
  resolution order of the class.

* Add brain tips for six.moves. Closes issue #63.

* Improve the detection of functions decorated with decorators which
  return static or class methods.

* .slots() can contain unicode strings on Python 2.

* Add inference tips for nose.tools.

What's New in astroid 1.3.2?
============================
Release date: 2014-11-22

* Fixed a crash with invalid subscript index.

* Implement proper base class semantics for Python 3, where every class
  derives from object.

* Allow more fine-grained control over C extension loading in the
  manager.

What's New in astroid 1.3.1?
============================
Release date: 2014-11-21

* Fixed a crash issue with the pytest brain module.

What's New in astroid 1.3.0?
============================
Release date: 2014-11-20

* Fix a maximum recursion error that occurred during inference, where
  statements with the same name weren't filtered properly. Closes pylint
  issue #295.

* Check that EmptyNode has an underlying object in
  EmptyNode.has_underlying_object.

* Simplify the understanding of enum members.

* Fix an infinite loop with decorator call chain inference, where the
  decorator returns itself. Closes issue #50.

* Various speed improvements. Patch by Alex Munroe.

* Add pytest brain plugin. Patch by Robbie Coomber.

* Support for Python versions < 2.7 has been dropped, and the source has
  been made compatible with Python 2 and 3. Running 2to3 on installation
  for Python 3 is not needed anymore.

* astroid now depends on six.

* modutils._module_file opens __init__.py in binary mode. Closes issues
  #51 and #13.
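The Class.slots changes above track the runtime semantics of ``__slots__``; a short plain-Python sketch of the behaviour being modelled (the class name is illustrative):

```python
# __slots__ restricts instances to the declared attribute names --
# the runtime behaviour that Class.slots() reports on.
class WithSlots:
    __slots__ = ("a", "b")

w = WithSlots()
w.a = 1          # declared slot: fine
try:
    w.c = 3      # not in __slots__: rejected at runtime
except AttributeError as err:
    print("rejected:", err)
```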
* Only C extensions from trusted sources (the standard library) are
  loaded into the examining Python process to build an AST from the live
  module.

* Path names on case-insensitive filesystems are now properly handled.
  This fixes the stdlib detection code on Windows.

* Metaclass-generating functions like six.with_metaclass are now
  supported via some explicit detection code.

* astroid.register_module_extender has been added to generalize the
  support for module extenders as used by many brain plugins.

* brain plugins can now register hooks to handle failed imports, as done
  by the gobject-introspection plugin.

* The modules have been moved to a separate package directory;
  `setup.py develop` now works correctly.

What's New in astroid 1.2.1?
============================
Release date: 2014-08-24

* Fix a crash that occurred when inferring a decorator call chain.
  Closes issue #42.

* Set the parent of vararg and kwarg nodes when inferring them. Closes
  issue #43.

* namedtuple inference knows about the '_fields' attribute.

* enum members know about the methods from the enum class.

* Name inference will look up the parent function of the current scope,
  in case searching in the current scope fails.

* Inference of the functional form of enums takes into consideration the
  various inputs that enums accept.

* The inference engine handles binary operations (add, mul etc.)
  between instances.

* Fix an infinite loop in the inference, by returning a copy of instance
  attributes, when calling 'instance_attr'. Closes issue #34 (patch by
  Emile Anclin).

* Don't crash when trying to infer an unbound object.__new__ call.
  Closes issue #11.

What's New in astroid 1.2.0?
============================
Release date: 2014-07-25

* Function nodes can detect decorator call chains and see if they are
  decorated with builtin descriptors (`classmethod` and `staticmethod`).

* infer_call_result called on a subtype of the builtin type will now
  return a new `Class` rather than an `Instance`.
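The 1.2.1 entry above about binary operations between instances refers to the usual dunder protocol; a plain-Python sketch with a hypothetical `Money` class shows the runtime dispatch the inference engine follows:

```python
# Binary operations between instances dispatch to methods like __add__,
# which is what astroid's inference engine traces.
class Money:
    def __init__(self, cents):
        self.cents = cents

    def __add__(self, other):
        return Money(self.cents + other.cents)

total = Money(150) + Money(250)
print(total.cents)  # 400
```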
* `Class.metaclass()` now handles module-level __metaclass__ declaration on python 2, and no longer looks at the __metaclass__ class attribute on python 3. * Function nodes can detect if they are decorated with subclasses of builtin descriptors when determining their type (`classmethod` and `staticmethod`). * Add `slots` method to `Class` nodes, for retrieving the list of valid slots it defines. * Expose function annotation to astroid: `Arguments` node exposes 'varargannotation', 'kwargannotation' and 'annotations' attributes, while `Function` node has the 'returns' attribute. * Backported most of the logilab.common.modutils module there, as most things there are for pylint/astroid only and we want to be able to fix them without requiring a new logilab.common release * Fix names grabbed using wildcard import in "absolute import mode" (ie with absolute_import activated from the __future__ or with python 3). Fix pylint issue #58. * Add support in pylint-brain for understanding enum classes. What's New in astroid 1.1.1? ============================ Release date: 2014-04-30 * `Class.metaclass()` looks in ancestors when the current class does not define explicitly a metaclass. * Do not cache modules if a module with the same qname is already known, and only return cached modules if both name and filepath match. Fixes pylint Bitbucket issue #136. What's New in astroid 1.1.0? ============================ Release date: 2014-04-18 * All class nodes are marked as new style classes for Py3k. * Add a `metaclass` function to `Class` nodes to retrieve their metaclass. * Add a new YieldFrom node. * Add support for inferring arguments to namedtuple invocations. * Make sure that objects returned for namedtuple inference have parents. * Don't crash when inferring nodes from `with` clauses with multiple context managers. Closes #18. * Don't crash when a class has some __call__ method that is not inferable. Closes #17. 
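The namedtuple inference entries above model the class that collections.namedtuple generates at runtime; for instance (the `Pair` name is illustrative):

```python
# The generated namedtuple class carries _fields and named accessors --
# the attributes that astroid's namedtuple inference reproduces.
from collections import namedtuple

Pair = namedtuple("Pair", ["x", "y"])
p = Pair(1, 2)
print(Pair._fields, p.x, p.y)  # ('x', 'y') 1 2
```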
* Unwrap instances found in `.ancestors()`, by using their _proxied
  class.

What's New in astroid 1.0.1?
============================
Release date: 2013-10-18

* fix py3k/windows installation issue (issue #4)

* fix bug with namedtuple inference (issue #3)

* get back gobject introspection from pylint-brain

* fix some test failures under pypy and py3.3, though there is one
  remaining on each of these platforms (2.7 tests are all green)

What's New in astroid 1.0.0?
=============================
Release date: 2013-07-29

* Fix some omissions in py2stdlib's version of hashlib and add a small
  test for it.

* Properly recognize methods annotated with
  abc.abstract{property,method} as abstract.

* Allow transformation functions on any node, providing a
  ``register_transform`` function on the manager instead of the
  ``register_transformer`` to make it more flexible wrt node selection

* Use the new transformation API to provide support for namedtuple
  (actually in pylint-brain, closes #8766)

* Added the test_utils module for building ASTs and extracting deeply
  nested nodes for easier testing.

* Add support for py3k's keyword only arguments (PEP 3102)

* RENAME THE PROJECT to astroid

What's New in astroid 0.24.3?
=============================
Release date: 2013-04-16

* #124360 [py3.3]: Don't crash on 'yield from' nodes

* #123062 [pylint-brain]: Use correct names for keywords for urlparse

* #123056 [pylint-brain]: Add missing methods for hashlib

* #123068: Fix inference for generator methods to correctly handle
  yields in lambdas.

* #123068: Make sure .as_string() returns valid code for yields in
  expressions.

* #47957: Set literals are now correctly treated as inference leaves.

* #123074: Add support for inference of subscript operations on dict
  literals.

What's New in astroid 0.24.2?
=============================
Release date: 2013-02-27

* pylint-brain: more subprocess.Popen faking (see #46273)

* #109562 [jython]: java modules have no __doc__, causing crash

* #120646 [py3]: fix for python3.3 _ast changes which may cause crash

* #109988 [py3]: test fixes

What's New in astroid 0.24.1?
=============================
Release date: 2012-10-05

* #106191: fix __future__ absolute import w/ From node

* #50395: fix function fromlineno when some decorator is split on
  multiple lines (patch by Mark Gius)

* #92362: fix pyreverse crash on relative import

* #104041: fix crash 'module object has no file_encoding attribute'

* #4294 (pylint-brain): bad inference on mechanize.Browser.open

* #46273 (pylint-brain): bad inference subprocess.Popen.communicate

What's New in astroid 0.24.0?
=============================
Release date: 2012-07-18

* include the pylint brain extension, describing some stuff not properly
  understood until then. (#100013, #53049, #23986, #72355)

* #99583: fix raw_building.object_build for the pypy implementation

* use `open` rather than `file` in scoped_nodes as 2to3 misses it

What's New in astroid 0.23.1?
=============================
Release date: 2011-12-08

* #62295: avoid "OSError: Too many open files" by moving .file_stream as
  a Module property opening the file only when needed

* Lambda nodes should have a `name` attribute

* only call transformers if modname specified

What's New in astroid 0.23.0?
============================= Release date: 2011-10-07 * #77187: ancestor() only returns the first class when inheriting from two classes coming from the same module * #76159: putting module's parent directory on the path causes problems linting when file names clash * #74746: should return empty module when __main__ is imported (patch by google) * #74748: getitem protocol return constant value instead of a Const node (patch by google) * #77188: support lgc.decorators.classproperty * #77253: provide a way for user code to register astng "transformers" using manager.register_transformer(callable) where callable will be called after an astng has been built and given the related module node as argument What's New in astroid 0.22.0? ============================= Release date: 2011-07-18 * added column offset information on nodes (patch by fawce) * #70497: Crash on AttributeError: 'NoneType' object has no attribute '_infer_name' * #70381: IndentationError in import causes crash * #70565: absolute imports treated as relative (patch by Jacek Konieczny) * #70494: fix file encoding detection with python2.x * py3k: __builtin__ module renamed to builtins, we should consider this to properly build ast for builtin objects What's New in astroid 0.21.1? ============================= Release date: 2011-01-11 * python3: handle file encoding; fix a lot of tests * fix #52006: "True" and "False" can be assigned as variable in Python2x * fix #8847: pylint doesn't understand function attributes at all * fix #8774: iterator / generator / next method * fix bad building of ast from living object w/ container classes (eg dict, set, list, tuple): contained elements should be turned to ast as well (not doing it will much probably cause crash later) * somewhat fix #57299 and other similar issue: Exception when trying to validate file using PyQt's PyQt4.QtCore module: we can't do much about it but at least catch such exception to avoid crash What's New in astroid 0.21.0? 
=============================
Release date: 2010-11-15

* python3.x: first python3.x release

* fix #37105: Crash on AttributeError: 'NoneType' object has no
  attribute '_infer_name'

* python2.4: drop python < 2.5 support

What's New in astroid 0.20.4?
=============================
Release date: 2010-10-27

* fix #37868 #37665 #33638 #37909: import problems with
  absolute_import_activated

* fix #8969: false positive when importing from zip-safe eggs

* fix #46131: minimal class decorator support

* minimal python2.7 support (dict and set comprehension)

* important progress on Py3k compatibility

What's New in astroid 0.20.3?
=============================
Release date: 2010-09-28

* restored python 2.3 compatibility

* fix #45959: AttributeError: 'NoneType' object has no attribute
  'frame', due to handling of __class__ when importing from a living
  object (because of missing source code or C-compiled object)

What's New in astroid 0.20.2?
=============================
Release date: 2010-09-10

* fix astng building bug: we've to set the module.package flag at node
  creation time, otherwise we'll miss this information when inferring
  relative imports during the build process (this should fix for
  instance some problems with numpy)

* added __subclasses__ to special class attributes

* fix Class.interfaces so that no InferenceError is raised on empty
  __implements__

* yield YES on multiplication of tuple/list with non valid operand

What's New in astroid 0.20.1?
=============================
Release date: 2010-05-11

* fix licensing to LGPL

* add ALL_NODES_CLASSES constant to nodes module

* nodes redirection cleanup (possible since refactoring)

* bug fix for python < 2.5: add Delete node on Subscript nodes if we are
  in a del context

What's New in astroid 0.20.0?
=============================
Release date: 2010-03-22

* fix #20464: raises "TypeError: '_Yes' object is not iterable"
  on list inference

* fix #19882: pylint hangs

* fix #20759: crash on pyreverse UNARY_OP_METHOD KeyError '~'

* fix #20760: crash on pyreverse : AttributeError: 'Subscript' object
  has no attribute 'infer_lhs'

* fix #21980: [Python-modules-team] Bug#573229 : Pylint hangs; improving
  the cache yields a speed improvement on big projects

* major refactoring: rebuild the tree instead of modify / monkey
  patching

* fix #19641: "maximum recursion depth exceeded" messages w/ python 2.6;
  this was introduced by a refactoring

* Ned Batchelder patch to properly import eggs with Windows line
  endings. This fixes a problem with pylint not being able to import
  setuptools.

* Winfried Plapper patches fixing .op attribute value for AugAssign
  nodes, visit_ifexp in nodes_as_string

* Edward K. Ream / Tom Fleck patch closes #19641 ("maximum recursion
  depth exceeded" messages w/ python 2.6), see
  https://bugs.launchpad.net/pylint/+bug/456870

What's New in astroid 0.19.3?
=============================
Release date: 2009-12-18

* fix name error making 0.19.2 almost useless

What's New in astroid 0.19.2?
=============================
Release date: 2009-12-18

* fix #18773: inference bug on class member (due to bad handling of
  instance / class nodes "bounded" to method calls)

* fix #9515: strange message for non-class "Class baz has no egg member"
  (due to bad inference of function call)

* fix #18953: inference fails with augmented assignment (special case
  for augmented assignment in infer_ass method)

* fix #13944: false positive for class/instance attributes
  (Instance.getattr should return assign nodes on instance classes as
  well as instances)

* include spelling fixes provided by Dotan Barak

What's New in astroid 0.19.1?
=============================
Release date: 2009-08-27

* fix #8771: crash on yield expression

* fix #10024: line numbering bug with try/except/finally

* fix #10020: when building from living object, __name__ may be None

* fix #9891: help(logilab.astng) throws TypeError

* fix #9588: false positive E1101 for augmented assignment

What's New in astroid 0.19.0?
=============================
Release date: 2009-03-25

* fixed python 2.6 issue (tests ok w/ 2.4, 2.5, 2.6. Anyone using 2.2 /
  2.3 to tell us if it works?)

* some understanding of the __builtin__.property decorator

* inference: introduce UnboundMethod / rename InstanceMethod to
  BoundMethod

What's New in astroid 0.18.0?
=============================
Release date: 2009-03-19

* major api / tree structure changes to make it work with compiler *and*
  the python >= 2.5 _ast module

* cleanup and refactoring on the way

What's New in astroid 0.17.4?
=============================
Release date: 2008-11-19

* fix #6015: filter statements bug triggering W0631 false positive in
  pylint

* fix #5571: Function.is_method() should return False on module level
  functions decorated by staticmethod/classmethod (avoids some crashes
  in pylint)

* fix #5010: understand python 2.5 explicit relative imports

What's New in astroid 0.17.3?
=============================
Release date: 2008-09-10

* fix #5889: astng crash on certain pyreverse projects

* fix bug w/ loop assignment in .lookup

* apply Maarten patch fixing a crash on TryFinally.block_range and
  fixing 'else'/'finally' block line detection

What's New in astroid 0.17.2?
=============================
Release date: 2008-01-14

* "with" statement support, patch provided by Brian Hawthorne

* fixed recursion arguments in nodes_of_class method as notified by Dave
  Borowitz

* new InstanceMethod node introduced to wrap bound methods (e.g.
  Function node), patch provided by Dave Borowitz

What's New in astroid 0.17.1?
============================= Release date: 2007-06-07 * fix #3651: crash when callable as default arg * fix #3670: subscription inference crash in some cases * fix #3673: Lambda instance has no attribute 'pytype' * fix crash with chained "import as" * fix crash on numpy * fix potential InfiniteRecursion error with builtin objects * include patch from Marien Zwart fixing some test / py 2.5 * be more error resilient when accessing living objects from external code in the manager What's New in astroid 0.17.0? ============================= Release date: 2007-02-22 * api change to be able to infer using a context (used to infer function call result only for now) * slightly better inference on astng built from living object by trying to infer dummy nodes (able to infer 'help' builtin for instance) * external attribute definition support * basic math operation inference * new pytype method on possibly inferred node (e.g. module, classes, const...) * fix a living object astng building bug, which was making "open" uninferable * fix lookup of name in method bug (#3289) * fix decorator lookup bug (#3261) What's New in astroid 0.16.3? ============================= Release date: 2006-11-23 * enhance inference for the subscription notation (motivated by a patch from Amaury) and for unary sub/add What's New in astroid 0.16.2? ============================= Release date: 2006-11-15 * grrr, fixed python 2.3 incompatibility introduced by generator expression scope handling * upgrade to avoid warnings with logilab-common 0.21.0 (on which now depends so) * backported astutils module from logilab-common What's New in astroid 0.16.1? 
=============================
Release date: 2006-09-25

* python 2.5 support, patch provided by Marien Zwart

* fix [Class|Module].block_range method (this fixes pylint's inline
  disabling of messages on classes/modules)

* handle class.__bases__ and class.__mro__ (proper metaclass handling
  still needed though)

* drop python2.2 support: remove code that was working around python2.2

* fixed generator expression scope bug

* patch transformer to extract correct line information

What's New in astroid 0.16.0?
=============================
Release date: 2006-04-19

* fix living object building to consider classes such as property as a
  class instead of a data descriptor

* fix multiple assignment inference which was discarding some solutions

* added some line manipulation methods to handle pylint's block messages
  control feature (Node.last_source_line(), Node.block_range(lineno))

What's New in astroid 0.15.1?
=============================
Release date: 2006-03-10

* fix to avoid loading everything from living objects... Thanks Amaury!

* fix a possible NameError in Instance.infer_call_result

What's New in astroid 0.15.0?
=============================
Release date: 2006-03-06

* fix possible infinite recursion on global statements (close #10342)
  and in various other cases...

* fix locals/globals interactions when the global statement is used
  (close #10434)

* multiple inference related bug fixes

* associate List, Tuple, Dict and Const nodes to their respective
  classes

* new .ass_type method on assignment related nodes, returning the
  assignment type node (Assign, For, ListCompFor, GenExprFor, TryExcept)

* more API refactoring...
  The .resolve method has disappeared: now you have .ilookup on every
  node and .getattr/.igetattr on nodes supporting the attribute protocol
* introduced a YES object that may be returned when there is ambiguity
  on an inference path (typically a function call when we don't know the
  arguments' values)
* the builder tries to instantiate builtin exception subclasses to get
  their instance attributes

What's New in astroid 0.14.0?
=============================
Release date: 2006-01-10

* some major inference improvements and refactoring! The drawback is the
  introduction of some non backward compatible changes in the API, but
  it's imho much cleaner and more powerful now :)
* new boolean property .newstyle on Class nodes (implements #10073)
* new .import_module method on Module nodes to help in the .resolve
  refactoring
* .instance_attrs now has a list of assignment nodes as the instance
  attribute dictionary's value instead of a single one
* added missing GenExprIf and GenExprInner nodes, and implemented
  as_string for each generator expression related node
* specifically catch KeyboardInterrupt to reraise it in some places
* fix so that module names are always absolute
* fix .resolve on packages where a subpackage is imported in the
  __init__ file
* fix a bug regarding construction of Function nodes from living objects
  with earlier versions of python 2.4
* fix a NameError in the Import and From self_resolve method
* fix a bug occurring when building an astng from a living object with a
  property
* lint fixes

What's New in astroid 0.13.1?
=============================
Release date: 2005-11-07

* fix bug when building from a living module where the same object is
  encountered more than once (e.g.
  builtins.object) (close #10069)
* fix bug in Class.ancestors() regarding inner classes (close #10072)
* fix .self_resolve() on From and Module nodes to handle package
  precedence over module (close #10066)
* locals dict for packages contains the __path__ definition (close
  #10065)
* astng provides GenExpr and GenExprFor nodes with python >= 2.4 (close
  #10063)
* fix python2.2 compatibility (close #9922)
* link .__contains__ to .has_key on scoped nodes to speed up execution
* remove the no longer necessary .module_object() method on From and
  Module nodes
* normalize parser.ParserError to SyntaxError with python 2.2

What's New in astroid 0.13.0?
=============================
Release date: 2005-10-21

* .locals and .globals on scoped nodes now hold a list of references to
  each assignment statement instead of a single reference to the first
  assignment statement
* fix bug with manager.astng_from_module_name when a context file is
  given (notably fix ZODB 3.4 crash with pylint/pyreverse)
* fix Compare.as_string method
* fix bug with lambda objects missing the "type" attribute
* some minor refactoring
* This package has been extracted from the logilab-common package, which
  will be kept for some time for backward compatibility but will no
  longer be maintained (this explains why this package starts with the
  0.13 version number, since the fork occurred with the version released
  in logilab-common 0.12).
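The 0.13.0 entry above describes mapping each name in a scope to the full list of assignment nodes that bind it, rather than only the first one. A minimal sketch of that idea using only the stdlib `ast` module (the `collect_module_locals` helper is illustrative and not astroid's actual implementation, which also handles function defs, imports, augmented assignments, and nested scopes):

```python
import ast
from collections import defaultdict


def collect_module_locals(source):
    """Map each module-level name to every Assign node that binds it."""
    tree = ast.parse(source)
    locals_map = defaultdict(list)
    for node in tree.body:  # top-level statements only in this sketch
        if isinstance(node, ast.Assign):
            for target in node.targets:
                if isinstance(target, ast.Name):
                    locals_map[target.id].append(node)
    return dict(locals_map)


bindings = collect_module_locals("a = 1\nb = 2\na = 3\n")
# 'a' is bound twice, 'b' once -- every binding is kept, not just the first
print(sorted((name, len(nodes)) for name, nodes in bindings.items()))
# -> [('a', 2), ('b', 1)]
```

Keeping every binding is what lets a later inference step consider all possible values of a name instead of stopping at its first assignment.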
astroid-3.2.2/tests/0000775000175000017500000000000014622475517014250 5ustar epsilonepsilonastroid-3.2.2/tests/test_python3.py0000664000175000017500000003416514622475517017276 0ustar epsilonepsilon# Licensed under the LGPL: https://www.gnu.org/licenses/old-licenses/lgpl-2.1.en.html # For details: https://github.com/pylint-dev/astroid/blob/main/LICENSE # Copyright (c) https://github.com/pylint-dev/astroid/blob/main/CONTRIBUTORS.txt import unittest from textwrap import dedent import pytest from astroid import exceptions, nodes from astroid.builder import AstroidBuilder, extract_node from astroid.test_utils import require_version class Python3TC(unittest.TestCase): @classmethod def setUpClass(cls): cls.builder = AstroidBuilder() def test_starred_notation(self) -> None: astroid = self.builder.string_build("*a, b = [1, 2, 3]", "test", "test") # Get the star node node = next(next(next(astroid.get_children()).get_children()).get_children()) self.assertTrue(isinstance(node.assign_type(), nodes.Assign)) def test_yield_from(self) -> None: body = dedent( """ def func(): yield from iter([1, 2]) """ ) astroid = self.builder.string_build(body) func = astroid.body[0] self.assertIsInstance(func, nodes.FunctionDef) yieldfrom_stmt = func.body[0] self.assertIsInstance(yieldfrom_stmt, nodes.Expr) self.assertIsInstance(yieldfrom_stmt.value, nodes.YieldFrom) self.assertEqual(yieldfrom_stmt.as_string(), "yield from iter([1, 2])") def test_yield_from_is_generator(self) -> None: body = dedent( """ def func(): yield from iter([1, 2]) """ ) astroid = self.builder.string_build(body) func = astroid.body[0] self.assertIsInstance(func, nodes.FunctionDef) self.assertTrue(func.is_generator()) def test_yield_from_as_string(self) -> None: body = dedent( """ def func(): yield from iter([1, 2]) value = yield from other() """ ) astroid = self.builder.string_build(body) func = astroid.body[0] self.assertEqual(func.as_string().strip(), body.strip()) # metaclass tests def test_simple_metaclass(self) -> 
None: astroid = self.builder.string_build("class Test(metaclass=type): pass") klass = astroid.body[0] metaclass = klass.metaclass() self.assertIsInstance(metaclass, nodes.ClassDef) self.assertEqual(metaclass.name, "type") def test_metaclass_error(self) -> None: astroid = self.builder.string_build("class Test(metaclass=typ): pass") klass = astroid.body[0] self.assertFalse(klass.metaclass()) def test_metaclass_imported(self) -> None: astroid = self.builder.string_build( dedent( """ from abc import ABCMeta class Test(metaclass=ABCMeta): pass""" ) ) klass = astroid.body[1] metaclass = klass.metaclass() self.assertIsInstance(metaclass, nodes.ClassDef) self.assertEqual(metaclass.name, "ABCMeta") def test_metaclass_multiple_keywords(self) -> None: astroid = self.builder.string_build( "class Test(magic=None, metaclass=type): pass" ) klass = astroid.body[0] metaclass = klass.metaclass() self.assertIsInstance(metaclass, nodes.ClassDef) self.assertEqual(metaclass.name, "type") def test_as_string(self) -> None: body = dedent( """ from abc import ABCMeta class Test(metaclass=ABCMeta): pass""" ) astroid = self.builder.string_build(body) klass = astroid.body[1] self.assertEqual( klass.as_string(), "\n\nclass Test(metaclass=ABCMeta):\n pass\n" ) def test_old_syntax_works(self) -> None: astroid = self.builder.string_build( dedent( """ class Test: __metaclass__ = type class SubTest(Test): pass """ ) ) klass = astroid["SubTest"] metaclass = klass.metaclass() self.assertIsNone(metaclass) def test_metaclass_yes_leak(self) -> None: astroid = self.builder.string_build( dedent( """ # notice `ab` instead of `abc` from ab import ABCMeta class Meta(metaclass=ABCMeta): pass """ ) ) klass = astroid["Meta"] self.assertIsNone(klass.metaclass()) def test_parent_metaclass(self) -> None: astroid = self.builder.string_build( dedent( """ from abc import ABCMeta class Test(metaclass=ABCMeta): pass class SubTest(Test): pass """ ) ) klass = astroid["SubTest"] self.assertTrue(klass.newstyle) metaclass = 
klass.metaclass() self.assertIsInstance(metaclass, nodes.ClassDef) self.assertEqual(metaclass.name, "ABCMeta") def test_metaclass_ancestors(self) -> None: astroid = self.builder.string_build( dedent( """ from abc import ABCMeta class FirstMeta(metaclass=ABCMeta): pass class SecondMeta(metaclass=type): pass class Simple: pass class FirstImpl(FirstMeta): pass class SecondImpl(FirstImpl): pass class ThirdImpl(Simple, SecondMeta): pass """ ) ) classes = {"ABCMeta": ("FirstImpl", "SecondImpl"), "type": ("ThirdImpl",)} for metaclass, names in classes.items(): for name in names: impl = astroid[name] meta = impl.metaclass() self.assertIsInstance(meta, nodes.ClassDef) self.assertEqual(meta.name, metaclass) def test_annotation_support(self) -> None: astroid = self.builder.string_build( dedent( """ def test(a: int, b: str, c: None, d, e, *args: float, **kwargs: int)->int: pass """ ) ) func = astroid["test"] self.assertIsInstance(func.args.varargannotation, nodes.Name) self.assertEqual(func.args.varargannotation.name, "float") self.assertIsInstance(func.args.kwargannotation, nodes.Name) self.assertEqual(func.args.kwargannotation.name, "int") self.assertIsInstance(func.returns, nodes.Name) self.assertEqual(func.returns.name, "int") arguments = func.args self.assertIsInstance(arguments.annotations[0], nodes.Name) self.assertEqual(arguments.annotations[0].name, "int") self.assertIsInstance(arguments.annotations[1], nodes.Name) self.assertEqual(arguments.annotations[1].name, "str") self.assertIsInstance(arguments.annotations[2], nodes.Const) self.assertIsNone(arguments.annotations[2].value) self.assertIsNone(arguments.annotations[3]) self.assertIsNone(arguments.annotations[4]) astroid = self.builder.string_build( dedent( """ def test(a: int=1, b: str=2): pass """ ) ) func = astroid["test"] self.assertIsInstance(func.args.annotations[0], nodes.Name) self.assertEqual(func.args.annotations[0].name, "int") self.assertIsInstance(func.args.annotations[1], nodes.Name) 
self.assertEqual(func.args.annotations[1].name, "str") self.assertIsNone(func.returns) def test_kwonlyargs_annotations_supper(self) -> None: node = self.builder.string_build( dedent( """ def test(*, a: int, b: str, c: None, d, e): pass """ ) ) func = node["test"] arguments = func.args self.assertIsInstance(arguments.kwonlyargs_annotations[0], nodes.Name) self.assertEqual(arguments.kwonlyargs_annotations[0].name, "int") self.assertIsInstance(arguments.kwonlyargs_annotations[1], nodes.Name) self.assertEqual(arguments.kwonlyargs_annotations[1].name, "str") self.assertIsInstance(arguments.kwonlyargs_annotations[2], nodes.Const) self.assertIsNone(arguments.kwonlyargs_annotations[2].value) self.assertIsNone(arguments.kwonlyargs_annotations[3]) self.assertIsNone(arguments.kwonlyargs_annotations[4]) def test_annotation_as_string(self) -> None: code1 = dedent( """ def test(a, b: int = 4, c=2, f: 'lala' = 4) -> 2: pass""" ) code2 = dedent( """ def test(a: typing.Generic[T], c: typing.Any = 24) -> typing.Iterable: pass""" ) for code in (code1, code2): func = extract_node(code) self.assertEqual(func.as_string(), code) def test_unpacking_in_dicts(self) -> None: code = "{'x': 1, **{'y': 2}}" node = extract_node(code) self.assertEqual(node.as_string(), code) assert isinstance(node, nodes.Dict) keys = [key for (key, _) in node.items] self.assertIsInstance(keys[0], nodes.Const) self.assertIsInstance(keys[1], nodes.DictUnpack) def test_nested_unpacking_in_dicts(self) -> None: code = "{'x': 1, **{'y': 2, **{'z': 3}}}" node = extract_node(code) self.assertEqual(node.as_string(), code) def test_unpacking_in_dict_getitem(self) -> None: node = extract_node("{1:2, **{2:3, 3:4}, **{5: 6}}") for key, expected in ((1, 2), (2, 3), (3, 4), (5, 6)): value = node.getitem(nodes.Const(key)) self.assertIsInstance(value, nodes.Const) self.assertEqual(value.value, expected) @staticmethod def test_unpacking_in_dict_getitem_with_ref() -> None: node = extract_node( """ a = {1: 2} {**a, 2: 3} #@ """ ) 
assert isinstance(node, nodes.Dict) for key, expected in ((1, 2), (2, 3)): value = node.getitem(nodes.Const(key)) assert isinstance(value, nodes.Const) assert value.value == expected @staticmethod def test_unpacking_in_dict_getitem_uninferable() -> None: node = extract_node("{**a, 2: 3}") assert isinstance(node, nodes.Dict) with pytest.raises(exceptions.AstroidIndexError): node.getitem(nodes.Const(1)) value = node.getitem(nodes.Const(2)) assert isinstance(value, nodes.Const) assert value.value == 3 def test_format_string(self) -> None: code = "f'{greetings} {person}'" node = extract_node(code) self.assertEqual(node.as_string(), code) def test_underscores_in_numeral_literal(self) -> None: pairs = [("10_1000", 101000), ("10_000_000", 10000000), ("0x_FF_FF", 65535)] for value, expected in pairs: node = extract_node(value) inferred = next(node.infer()) self.assertIsInstance(inferred, nodes.Const) self.assertEqual(inferred.value, expected) def test_async_comprehensions(self) -> None: async_comprehensions = [ extract_node( "async def f(): return __([i async for i in aiter() if i % 2])" ), extract_node( "async def f(): return __({i async for i in aiter() if i % 2})" ), extract_node( "async def f(): return __((i async for i in aiter() if i % 2))" ), extract_node( "async def f(): return __({i: i async for i in aiter() if i % 2})" ), ] non_async_comprehensions = [ extract_node("async def f(): return __({i: i for i in iter() if i % 2})") ] for comp in async_comprehensions: self.assertTrue(comp.generators[0].is_async) for comp in non_async_comprehensions: self.assertFalse(comp.generators[0].is_async) @require_version("3.7") def test_async_comprehensions_outside_coroutine(self): # When async and await will become keywords, async comprehensions # will be allowed outside of coroutines body comprehensions = [ "[i async for i in aiter() if condition(i)]", "[await fun() async for fun in funcs]", "{await fun() async for fun in funcs}", "{fun: await fun() async for fun in funcs}", 
"[await fun() async for fun in funcs if await smth]", "{await fun() async for fun in funcs if await smth}", "{fun: await fun() async for fun in funcs if await smth}", "[await fun() async for fun in funcs]", "{await fun() async for fun in funcs}", "{fun: await fun() async for fun in funcs}", "[await fun() async for fun in funcs if await smth]", "{await fun() async for fun in funcs if await smth}", "{fun: await fun() async for fun in funcs if await smth}", ] for comp in comprehensions: node = extract_node(comp) self.assertTrue(node.generators[0].is_async) def test_async_comprehensions_as_string(self) -> None: func_bodies = [ "return [i async for i in aiter() if condition(i)]", "return [await fun() for fun in funcs]", "return {await fun() for fun in funcs}", "return {fun: await fun() for fun in funcs}", "return [await fun() for fun in funcs if await smth]", "return {await fun() for fun in funcs if await smth}", "return {fun: await fun() for fun in funcs if await smth}", "return [await fun() async for fun in funcs]", "return {await fun() async for fun in funcs}", "return {fun: await fun() async for fun in funcs}", "return [await fun() async for fun in funcs if await smth]", "return {await fun() async for fun in funcs if await smth}", "return {fun: await fun() async for fun in funcs if await smth}", ] for func_body in func_bodies: code = dedent( f""" async def f(): {func_body}""" ) func = extract_node(code) self.assertEqual(func.as_string().strip(), code.strip()) astroid-3.2.2/tests/test_utils.py0000664000175000017500000001447314622475517017032 0ustar epsilonepsilon# Licensed under the LGPL: https://www.gnu.org/licenses/old-licenses/lgpl-2.1.en.html # For details: https://github.com/pylint-dev/astroid/blob/main/LICENSE # Copyright (c) https://github.com/pylint-dev/astroid/blob/main/CONTRIBUTORS.txt import unittest from astroid import Uninferable, builder, extract_node, nodes from astroid.exceptions import InferenceError class InferenceUtil(unittest.TestCase): def 
test_not_exclusive(self) -> None: module = builder.parse( """ x = 10 for x in range(5): print (x) if x > 0: print ('#' * x) """, __name__, __file__, ) xass1 = module.locals["x"][0] assert xass1.lineno == 2 xnames = [n for n in module.nodes_of_class(nodes.Name) if n.name == "x"] assert len(xnames) == 3 assert xnames[1].lineno == 6 self.assertEqual(nodes.are_exclusive(xass1, xnames[1]), False) self.assertEqual(nodes.are_exclusive(xass1, xnames[2]), False) def test_not_exclusive_walrus_operator(self) -> None: node_if, node_body, node_or_else = extract_node( """ if val := True: #@ print(val) #@ else: print(val) #@ """ ) node_if: nodes.If node_walrus = next(node_if.nodes_of_class(nodes.NamedExpr)) assert nodes.are_exclusive(node_walrus, node_if) is False assert nodes.are_exclusive(node_walrus, node_body) is False assert nodes.are_exclusive(node_walrus, node_or_else) is False assert nodes.are_exclusive(node_if, node_body) is False assert nodes.are_exclusive(node_if, node_or_else) is False assert nodes.are_exclusive(node_body, node_or_else) is True def test_not_exclusive_walrus_multiple(self) -> None: node_if, body_1, body_2, or_else_1, or_else_2 = extract_node( """ if (val := True) or (val_2 := True): #@ print(val) #@ print(val_2) #@ else: print(val) #@ print(val_2) #@ """ ) node_if: nodes.If walruses = list(node_if.nodes_of_class(nodes.NamedExpr)) assert nodes.are_exclusive(node_if, walruses[0]) is False assert nodes.are_exclusive(node_if, walruses[1]) is False assert nodes.are_exclusive(walruses[0], walruses[1]) is False assert nodes.are_exclusive(walruses[0], body_1) is False assert nodes.are_exclusive(walruses[0], body_2) is False assert nodes.are_exclusive(walruses[1], body_1) is False assert nodes.are_exclusive(walruses[1], body_2) is False assert nodes.are_exclusive(walruses[0], or_else_1) is False assert nodes.are_exclusive(walruses[0], or_else_2) is False assert nodes.are_exclusive(walruses[1], or_else_1) is False assert nodes.are_exclusive(walruses[1], 
or_else_2) is False def test_not_exclusive_walrus_operator_nested(self) -> None: node_if, node_body, node_or_else = extract_node( """ if all((last_val := i) % 2 == 0 for i in range(10)): #@ print(last_val) #@ else: print(last_val) #@ """ ) node_if: nodes.If node_walrus = next(node_if.nodes_of_class(nodes.NamedExpr)) assert nodes.are_exclusive(node_walrus, node_if) is False assert nodes.are_exclusive(node_walrus, node_body) is False assert nodes.are_exclusive(node_walrus, node_or_else) is False assert nodes.are_exclusive(node_if, node_body) is False assert nodes.are_exclusive(node_if, node_or_else) is False assert nodes.are_exclusive(node_body, node_or_else) is True def test_if(self) -> None: module = builder.parse( """ if 1: a = 1 a = 2 elif 2: a = 12 a = 13 else: a = 3 a = 4 """ ) a1 = module.locals["a"][0] a2 = module.locals["a"][1] a3 = module.locals["a"][2] a4 = module.locals["a"][3] a5 = module.locals["a"][4] a6 = module.locals["a"][5] self.assertEqual(nodes.are_exclusive(a1, a2), False) self.assertEqual(nodes.are_exclusive(a1, a3), True) self.assertEqual(nodes.are_exclusive(a1, a5), True) self.assertEqual(nodes.are_exclusive(a3, a5), True) self.assertEqual(nodes.are_exclusive(a3, a4), False) self.assertEqual(nodes.are_exclusive(a5, a6), False) def test_try_except(self) -> None: module = builder.parse( """ try: def exclusive_func2(): "docstring" except TypeError: def exclusive_func2(): "docstring" except: def exclusive_func2(): "docstring" else: def exclusive_func2(): "this one redefine the one defined line 42" """ ) f1 = module.locals["exclusive_func2"][0] f2 = module.locals["exclusive_func2"][1] f3 = module.locals["exclusive_func2"][2] f4 = module.locals["exclusive_func2"][3] self.assertEqual(nodes.are_exclusive(f1, f2), True) self.assertEqual(nodes.are_exclusive(f1, f3), True) self.assertEqual(nodes.are_exclusive(f1, f4), False) self.assertEqual(nodes.are_exclusive(f2, f4), True) self.assertEqual(nodes.are_exclusive(f3, f4), True) 
self.assertEqual(nodes.are_exclusive(f3, f2), True) self.assertEqual(nodes.are_exclusive(f2, f1), True) self.assertEqual(nodes.are_exclusive(f4, f1), False) self.assertEqual(nodes.are_exclusive(f4, f2), True) def test_unpack_infer_uninferable_nodes(self) -> None: node = builder.extract_node( """ x = [A] * 1 f = [x, [A] * 2] f """ ) inferred = next(node.infer()) unpacked = list(nodes.unpack_infer(inferred)) self.assertEqual(len(unpacked), 3) self.assertTrue(all(elt is Uninferable for elt in unpacked)) def test_unpack_infer_empty_tuple(self) -> None: node = builder.extract_node( """ () """ ) inferred = next(node.infer()) with self.assertRaises(InferenceError): list(nodes.unpack_infer(inferred)) astroid-3.2.2/tests/test_group_exceptions.py0000664000175000017500000000671214622475517021264 0ustar epsilonepsilon# Licensed under the LGPL: https://www.gnu.org/licenses/old-licenses/lgpl-2.1.en.html # For details: https://github.com/pylint-dev/astroid/blob/main/LICENSE # Copyright (c) https://github.com/pylint-dev/astroid/blob/main/CONTRIBUTORS.txt import textwrap import pytest from astroid import ( AssignName, ExceptHandler, For, Name, Try, Uninferable, bases, extract_node, ) from astroid.const import PY311_PLUS from astroid.context import InferenceContext from astroid.nodes import Expr, Raise, TryStar @pytest.mark.skipif(not PY311_PLUS, reason="Requires Python 3.11 or higher") def test_group_exceptions() -> None: node = extract_node( textwrap.dedent( """ try: raise ExceptionGroup("group", [ValueError(654)]) except ExceptionGroup as eg: for err in eg.exceptions: if isinstance(err, ValueError): print("Handling ValueError") elif isinstance(err, TypeError): print("Handling TypeError")""" ) ) assert isinstance(node, Try) handler = node.handlers[0] assert node.block_range(lineno=1) == (1, 9) assert node.block_range(lineno=2) == (2, 2) assert node.block_range(lineno=5) == (5, 9) assert isinstance(handler, ExceptHandler) assert handler.type.name == "ExceptionGroup" children = 
list(handler.get_children()) assert len(children) == 3 exception_group, short_name, for_loop = children assert isinstance(exception_group, Name) assert exception_group.block_range(1) == (1, 4) assert isinstance(short_name, AssignName) assert isinstance(for_loop, For) @pytest.mark.skipif(not PY311_PLUS, reason="Requires Python 3.11 or higher") def test_star_exceptions() -> None: code = textwrap.dedent( """ try: raise ExceptionGroup("group", [ValueError(654)]) except* ValueError: print("Handling ValueError") except* TypeError: print("Handling TypeError") else: sys.exit(127) finally: sys.exit(0)""" ) node = extract_node(code) assert isinstance(node, TryStar) assert node.as_string() == code.replace('"', "'").strip() assert isinstance(node.body[0], Raise) assert node.block_range(1) == (1, 11) assert node.block_range(2) == (2, 2) assert node.block_range(3) == (3, 3) assert node.block_range(4) == (4, 4) assert node.block_range(5) == (5, 5) assert node.block_range(6) == (6, 6) assert node.block_range(7) == (7, 7) assert node.block_range(8) == (8, 8) assert node.block_range(9) == (9, 9) assert node.block_range(10) == (10, 10) assert node.block_range(11) == (11, 11) assert node.handlers handler = node.handlers[0] assert isinstance(handler, ExceptHandler) assert handler.type.name == "ValueError" orelse = node.orelse[0] assert isinstance(orelse, Expr) assert orelse.value.args[0].value == 127 final = node.finalbody[0] assert isinstance(final, Expr) assert final.value.args[0].value == 0 @pytest.mark.skipif(not PY311_PLUS, reason="Requires Python 3.11 or higher") def test_star_exceptions_infer_name() -> None: trystar = extract_node( """ try: 1/0 except* ValueError: pass""" ) name = "arbitraryName" context = InferenceContext() context.lookupname = name stmts = bases._infer_stmts([trystar], context) assert list(stmts) == [Uninferable] assert context.lookupname == name astroid-3.2.2/tests/test_object_model.py0000664000175000017500000006631614622475517020323 0ustar epsilonepsilon# 
Licensed under the LGPL: https://www.gnu.org/licenses/old-licenses/lgpl-2.1.en.html # For details: https://github.com/pylint-dev/astroid/blob/main/LICENSE # Copyright (c) https://github.com/pylint-dev/astroid/blob/main/CONTRIBUTORS.txt import unittest import xml import pytest import astroid from astroid import bases, builder, nodes, objects, test_utils, util from astroid.const import PY311_PLUS from astroid.exceptions import InferenceError try: import six # type: ignore[import] # pylint: disable=unused-import HAS_SIX = True except ImportError: HAS_SIX = False class InstanceModelTest(unittest.TestCase): def test_instance_special_model(self) -> None: ast_nodes = builder.extract_node( """ class A: "test" def __init__(self): self.a = 42 a = A() a.__class__ #@ a.__module__ #@ a.__doc__ #@ a.__dict__ #@ """, module_name="fake_module", ) assert isinstance(ast_nodes, list) cls = next(ast_nodes[0].infer()) self.assertIsInstance(cls, astroid.ClassDef) self.assertEqual(cls.name, "A") module = next(ast_nodes[1].infer()) self.assertIsInstance(module, astroid.Const) self.assertEqual(module.value, "fake_module") doc = next(ast_nodes[2].infer()) self.assertIsInstance(doc, astroid.Const) self.assertEqual(doc.value, "test") dunder_dict = next(ast_nodes[3].infer()) self.assertIsInstance(dunder_dict, astroid.Dict) attr = next(dunder_dict.getitem(astroid.Const("a")).infer()) self.assertIsInstance(attr, astroid.Const) self.assertEqual(attr.value, 42) @pytest.mark.xfail(reason="Instance lookup cannot override object model") def test_instance_local_attributes_overrides_object_model(self): # The instance lookup needs to be changed in order for this to work. 
ast_node = builder.extract_node( """ class A: @property def __dict__(self): return [] A().__dict__ """ ) inferred = next(ast_node.infer()) self.assertIsInstance(inferred, astroid.List) self.assertEqual(inferred.elts, []) class BoundMethodModelTest(unittest.TestCase): def test_bound_method_model(self) -> None: ast_nodes = builder.extract_node( """ class A: def test(self): pass a = A() a.test.__func__ #@ a.test.__self__ #@ """ ) assert isinstance(ast_nodes, list) func = next(ast_nodes[0].infer()) self.assertIsInstance(func, astroid.FunctionDef) self.assertEqual(func.name, "test") self_ = next(ast_nodes[1].infer()) self.assertIsInstance(self_, astroid.Instance) self.assertEqual(self_.name, "A") class UnboundMethodModelTest(unittest.TestCase): def test_unbound_method_model(self) -> None: ast_nodes = builder.extract_node( """ class A: def test(self): pass t = A.test t.__class__ #@ t.__func__ #@ t.__self__ #@ t.im_class #@ t.im_func #@ t.im_self #@ """ ) assert isinstance(ast_nodes, list) cls = next(ast_nodes[0].infer()) self.assertIsInstance(cls, astroid.ClassDef) unbound_name = "function" self.assertEqual(cls.name, unbound_name) func = next(ast_nodes[1].infer()) self.assertIsInstance(func, astroid.FunctionDef) self.assertEqual(func.name, "test") self_ = next(ast_nodes[2].infer()) self.assertIsInstance(self_, astroid.Const) self.assertIsNone(self_.value) self.assertEqual(cls.name, next(ast_nodes[3].infer()).name) self.assertEqual(func, next(ast_nodes[4].infer())) self.assertIsNone(next(ast_nodes[5].infer()).value) class ClassModelTest(unittest.TestCase): def test_priority_to_local_defined_values(self) -> None: ast_node = builder.extract_node( """ class A: __doc__ = "first" A.__doc__ #@ """ ) inferred = next(ast_node.infer()) self.assertIsInstance(inferred, astroid.Const) self.assertEqual(inferred.value, "first") def test_class_model_correct_mro_subclasses_proxied(self) -> None: ast_nodes = builder.extract_node( """ class A(object): pass A.mro #@ A.__subclasses__ #@ """ 
) for node in ast_nodes: inferred = next(node.infer()) self.assertIsInstance(inferred, astroid.BoundMethod) self.assertIsInstance(inferred._proxied, astroid.FunctionDef) self.assertIsInstance(inferred.bound, astroid.ClassDef) self.assertEqual(inferred.bound.name, "type") def test_class_model(self) -> None: ast_nodes = builder.extract_node( """ class A(object): "test" class B(A): pass class C(A): pass A.__module__ #@ A.__name__ #@ A.__qualname__ #@ A.__doc__ #@ A.__mro__ #@ A.mro() #@ A.__bases__ #@ A.__class__ #@ A.__dict__ #@ A.__subclasses__() #@ """, module_name="fake_module", ) assert isinstance(ast_nodes, list) module = next(ast_nodes[0].infer()) self.assertIsInstance(module, astroid.Const) self.assertEqual(module.value, "fake_module") name = next(ast_nodes[1].infer()) self.assertIsInstance(name, astroid.Const) self.assertEqual(name.value, "A") qualname = next(ast_nodes[2].infer()) self.assertIsInstance(qualname, astroid.Const) self.assertEqual(qualname.value, "fake_module.A") doc = next(ast_nodes[3].infer()) self.assertIsInstance(doc, astroid.Const) self.assertEqual(doc.value, "test") mro = next(ast_nodes[4].infer()) self.assertIsInstance(mro, astroid.Tuple) self.assertEqual([cls.name for cls in mro.elts], ["A", "object"]) called_mro = next(ast_nodes[5].infer()) self.assertEqual(called_mro.elts, mro.elts) base_nodes = next(ast_nodes[6].infer()) self.assertIsInstance(base_nodes, astroid.Tuple) self.assertEqual([cls.name for cls in base_nodes.elts], ["object"]) cls = next(ast_nodes[7].infer()) self.assertIsInstance(cls, astroid.ClassDef) self.assertEqual(cls.name, "type") cls_dict = next(ast_nodes[8].infer()) self.assertIsInstance(cls_dict, astroid.Dict) subclasses = next(ast_nodes[9].infer()) self.assertIsInstance(subclasses, astroid.List) self.assertEqual([cls.name for cls in subclasses.elts], ["B", "C"]) class ModuleModelTest(unittest.TestCase): def test_priority_to_local_defined_values(self) -> None: ast_node = astroid.parse( """ __file__ = "mine" """ ) 
file_value = next(ast_node.igetattr("__file__")) self.assertIsInstance(file_value, astroid.Const) self.assertEqual(file_value.value, "mine") def test__path__not_a_package(self) -> None: ast_node = builder.extract_node( """ import sys sys.__path__ #@ """ ) with self.assertRaises(InferenceError): next(ast_node.infer()) def test_module_model(self) -> None: ast_nodes = builder.extract_node( """ import xml xml.__path__ #@ xml.__name__ #@ xml.__doc__ #@ xml.__file__ #@ xml.__spec__ #@ xml.__loader__ #@ xml.__cached__ #@ xml.__package__ #@ xml.__dict__ #@ xml.__init__ #@ xml.__new__ #@ xml.__subclasshook__ #@ xml.__str__ #@ xml.__sizeof__ #@ xml.__repr__ #@ xml.__reduce__ #@ xml.__setattr__ #@ xml.__reduce_ex__ #@ xml.__lt__ #@ xml.__eq__ #@ xml.__gt__ #@ xml.__format__ #@ xml.__delattr___ #@ xml.__getattribute__ #@ xml.__hash__ #@ xml.__dir__ #@ xml.__call__ #@ xml.__closure__ #@ """ ) assert isinstance(ast_nodes, list) path = next(ast_nodes[0].infer()) self.assertIsInstance(path, astroid.List) self.assertIsInstance(path.elts[0], astroid.Const) self.assertEqual(path.elts[0].value, xml.__path__[0]) name = next(ast_nodes[1].infer()) self.assertIsInstance(name, astroid.Const) self.assertEqual(name.value, "xml") doc = next(ast_nodes[2].infer()) self.assertIsInstance(doc, astroid.Const) self.assertEqual(doc.value, xml.__doc__) file_ = next(ast_nodes[3].infer()) self.assertIsInstance(file_, astroid.Const) self.assertEqual(file_.value, xml.__file__.replace(".pyc", ".py")) for ast_node in ast_nodes[4:7]: inferred = next(ast_node.infer()) self.assertIs(inferred, astroid.Uninferable) package = next(ast_nodes[7].infer()) self.assertIsInstance(package, astroid.Const) self.assertEqual(package.value, "xml") dict_ = next(ast_nodes[8].infer()) self.assertIsInstance(dict_, astroid.Dict) init_ = next(ast_nodes[9].infer()) assert isinstance(init_, bases.BoundMethod) init_result = next( init_.infer_call_result( nodes.Call( parent=None, lineno=None, col_offset=None, end_lineno=None, 
end_col_offset=None, ) ) ) assert isinstance(init_result, nodes.Const) assert init_result.value is None new_ = next(ast_nodes[10].infer()) assert isinstance(new_, bases.BoundMethod) # The following nodes are just here for theoretical completeness, # and they either return Uninferable or raise InferenceError. for ast_node in ast_nodes[11:28]: with pytest.raises(InferenceError): next(ast_node.infer()) class FunctionModelTest(unittest.TestCase): def test_partial_descriptor_support(self) -> None: bound, result = builder.extract_node( """ class A(object): pass def test(self): return 42 f = test.__get__(A(), A) f #@ f() #@ """ ) bound = next(bound.infer()) self.assertIsInstance(bound, astroid.BoundMethod) self.assertEqual(bound._proxied._proxied.name, "test") result = next(result.infer()) self.assertIsInstance(result, astroid.Const) self.assertEqual(result.value, 42) def test___get__has_extra_params_defined(self) -> None: node = builder.extract_node( """ def test(self): return 42 test.__get__ """ ) inferred = next(node.infer()) self.assertIsInstance(inferred, astroid.BoundMethod) args = inferred.args.args self.assertEqual(len(args), 2) self.assertEqual([arg.name for arg in args], ["self", "type"]) @test_utils.require_version(minver="3.8") def test__get__and_positional_only_args(self): node = builder.extract_node( """ def test(self, a, b, /, c): return a + b + c test.__get__(test)(1, 2, 3) """ ) inferred = next(node.infer()) assert inferred is util.Uninferable @pytest.mark.xfail(reason="Descriptors cannot infer what self is") def test_descriptor_not_inferrring_self(self): # We can't infer __get__(X, Y)() when the bounded function # uses self, because of the tree's parent not being propagating good enough. 
        result = builder.extract_node(
            """
            class A(object):
                x = 42
                def test(self): return self.x
            f = test.__get__(A(), A)
            f() #@
            """
        )
        result = next(result.infer())
        self.assertIsInstance(result, astroid.Const)
        self.assertEqual(result.value, 42)

    def test_descriptors_binding_invalid(self) -> None:
        ast_nodes = builder.extract_node(
            """
            class A: pass
            def test(self): return 42
            test.__get__()() #@
            test.__get__(2, 3, 4) #@
            """
        )
        for node in ast_nodes:
            with self.assertRaises(InferenceError):
                next(node.infer())

    def test_descriptor_error_regression(self) -> None:
        """Make sure the following code does not cause an exception."""
        node = builder.extract_node(
            """
            class MyClass:
                text = "MyText"
                def mymethod1(self):
                    return self.text
                def mymethod2(self):
                    return self.mymethod1.__get__(self, MyClass)

            cl = MyClass().mymethod2()()
            cl #@
            """
        )
        assert isinstance(node, nodes.NodeNG)
        [const] = node.inferred()
        assert const.value == "MyText"

    def test_function_model(self) -> None:
        ast_nodes = builder.extract_node(
            '''
            def func(a=1, b=2):
                """test"""
            func.__name__ #@
            func.__doc__ #@
            func.__qualname__ #@
            func.__module__ #@
            func.__defaults__ #@
            func.__dict__ #@
            func.__globals__ #@
            func.__code__ #@
            func.__closure__ #@
            func.__init__ #@
            func.__new__ #@
            func.__subclasshook__ #@
            func.__str__ #@
            func.__sizeof__ #@
            func.__repr__ #@
            func.__reduce__ #@
            func.__reduce_ex__ #@
            func.__lt__ #@
            func.__eq__ #@
            func.__gt__ #@
            func.__format__ #@
            func.__delattr___ #@
            func.__getattribute__ #@
            func.__hash__ #@
            func.__dir__ #@
            func.__class__ #@
            func.__setattr__ #@
            ''',
            module_name="fake_module",
        )
        assert isinstance(ast_nodes, list)
        name = next(ast_nodes[0].infer())
        self.assertIsInstance(name, astroid.Const)
        self.assertEqual(name.value, "func")
        doc = next(ast_nodes[1].infer())
        self.assertIsInstance(doc, astroid.Const)
        self.assertEqual(doc.value, "test")
        qualname = next(ast_nodes[2].infer())
        self.assertIsInstance(qualname, astroid.Const)
        self.assertEqual(qualname.value, "fake_module.func")
        module = next(ast_nodes[3].infer())
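# As a standalone illustration of the function model exercised by
# test_function_model above, the following sketch (assuming only that
# astroid is installed) infers a function's __name__ special attribute:

```python
import astroid

# Extract the expression marked with #@ and ask astroid to infer it.
node = astroid.extract_node(
    """
def func(a=1, b=2):
    return a + b
func.__name__  #@
"""
)
inferred = next(node.infer())
assert isinstance(inferred, astroid.Const)
print(inferred.value)  # "func"
```

# The same pattern (mark with #@, call .infer()) applies to the other
# special attributes tested here, such as __defaults__ or __qualname__.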
self.assertIsInstance(module, astroid.Const) self.assertEqual(module.value, "fake_module") defaults = next(ast_nodes[4].infer()) self.assertIsInstance(defaults, astroid.Tuple) self.assertEqual([default.value for default in defaults.elts], [1, 2]) dict_ = next(ast_nodes[5].infer()) self.assertIsInstance(dict_, astroid.Dict) globals_ = next(ast_nodes[6].infer()) self.assertIsInstance(globals_, astroid.Dict) for ast_node in ast_nodes[7:9]: self.assertIs(next(ast_node.infer()), astroid.Uninferable) init_ = next(ast_nodes[9].infer()) assert isinstance(init_, bases.BoundMethod) init_result = next( init_.infer_call_result( nodes.Call( parent=None, lineno=None, col_offset=None, end_lineno=None, end_col_offset=None, ) ) ) assert isinstance(init_result, nodes.Const) assert init_result.value is None new_ = next(ast_nodes[10].infer()) assert isinstance(new_, bases.BoundMethod) # The following nodes are just here for theoretical completeness, # and they either return Uninferable or raise InferenceError. 
for ast_node in ast_nodes[11:26]: inferred = next(ast_node.infer()) assert inferred is util.Uninferable for ast_node in ast_nodes[26:27]: with pytest.raises(InferenceError): inferred = next(ast_node.infer()) def test_empty_return_annotation(self) -> None: ast_node = builder.extract_node( """ def test(): pass test.__annotations__ """ ) annotations = next(ast_node.infer()) self.assertIsInstance(annotations, astroid.Dict) self.assertEqual(len(annotations.items), 0) def test_builtin_dunder_init_does_not_crash_when_accessing_annotations( self, ) -> None: ast_node = builder.extract_node( """ class Class: @classmethod def class_method(cls): cls.__init__.__annotations__ #@ """ ) inferred = next(ast_node.infer()) self.assertIsInstance(inferred, astroid.Dict) self.assertEqual(len(inferred.items), 0) def test_annotations_kwdefaults(self) -> None: ast_node = builder.extract_node( """ def test(a: 1, *args: 2, f:4='lala', **kwarg:3)->2: pass test.__annotations__ #@ test.__kwdefaults__ #@ """ ) annotations = next(ast_node[0].infer()) self.assertIsInstance(annotations, astroid.Dict) self.assertIsInstance( annotations.getitem(astroid.Const("return")), astroid.Const ) self.assertEqual(annotations.getitem(astroid.Const("return")).value, 2) self.assertIsInstance(annotations.getitem(astroid.Const("a")), astroid.Const) self.assertEqual(annotations.getitem(astroid.Const("a")).value, 1) self.assertEqual(annotations.getitem(astroid.Const("args")).value, 2) self.assertEqual(annotations.getitem(astroid.Const("kwarg")).value, 3) self.assertEqual(annotations.getitem(astroid.Const("f")).value, 4) kwdefaults = next(ast_node[1].infer()) self.assertIsInstance(kwdefaults, astroid.Dict) # self.assertEqual(kwdefaults.getitem('f').value, 'lala') @test_utils.require_version(minver="3.8") def test_annotation_positional_only(self): ast_node = builder.extract_node( """ def test(a: 1, b: 2, /, c: 3): pass test.__annotations__ #@ """ ) annotations = next(ast_node.infer()) self.assertIsInstance(annotations, 
astroid.Dict)
        self.assertIsInstance(annotations.getitem(astroid.Const("a")), astroid.Const)
        self.assertEqual(annotations.getitem(astroid.Const("a")).value, 1)
        self.assertEqual(annotations.getitem(astroid.Const("b")).value, 2)
        self.assertEqual(annotations.getitem(astroid.Const("c")).value, 3)

    def test_is_not_lambda(self):
        ast_node = builder.extract_node("def func(): pass")
        self.assertIs(ast_node.is_lambda, False)


class TestContextManagerModel:
    def test_model(self) -> None:
        """We use a generator to test this model."""
        ast_nodes = builder.extract_node(
            """
            def test():
                "a"
                yield
            gen = test()
            gen.__enter__ #@
            gen.__exit__ #@
            """
        )
        assert isinstance(ast_nodes, list)
        enter = next(ast_nodes[0].infer())
        assert isinstance(enter, astroid.BoundMethod)
        # Test that the method is correctly bound
        assert isinstance(enter.bound, bases.Generator)
        assert enter.bound._proxied.qname() == "builtins.generator"
        # Test that the FunctionDef accepts no arguments except self
        # NOTE: This probably shouldn't be double proxied, but this is a
        # quirk of the current model implementations.
        assert isinstance(enter._proxied._proxied, nodes.FunctionDef)
        assert len(enter._proxied._proxied.args.args) == 1
        assert enter._proxied._proxied.args.args[0].name == "self"

        exit_node = next(ast_nodes[1].infer())
        assert isinstance(exit_node, astroid.BoundMethod)
        # Test that the FunctionDef accepts the arguments as defined in the ObjectModel
        assert isinstance(exit_node._proxied._proxied, nodes.FunctionDef)
        assert len(exit_node._proxied._proxied.args.args) == 4
        assert exit_node._proxied._proxied.args.args[0].name == "self"
        assert exit_node._proxied._proxied.args.args[1].name == "exc_type"
        assert exit_node._proxied._proxied.args.args[2].name == "exc_value"
        assert exit_node._proxied._proxied.args.args[3].name == "traceback"


class GeneratorModelTest(unittest.TestCase):
    def test_model(self) -> None:
        ast_nodes = builder.extract_node(
            """
            def test():
                "a"
                yield
            gen = test()
            gen.__name__ #@
            gen.__doc__ #@
            gen.gi_code #@
            gen.gi_frame #@
            gen.send #@
            gen.__enter__ #@
            gen.__exit__ #@
            """
        )
        assert isinstance(ast_nodes, list)
        name = next(ast_nodes[0].infer())
        self.assertEqual(name.value, "test")
        doc = next(ast_nodes[1].infer())
        self.assertEqual(doc.value, "a")
        gi_code = next(ast_nodes[2].infer())
        self.assertIsInstance(gi_code, astroid.ClassDef)
        self.assertEqual(gi_code.name, "gi_code")
        gi_frame = next(ast_nodes[3].infer())
        self.assertIsInstance(gi_frame, astroid.ClassDef)
        self.assertEqual(gi_frame.name, "gi_frame")
        send = next(ast_nodes[4].infer())
        self.assertIsInstance(send, astroid.BoundMethod)
        enter = next(ast_nodes[5].infer())
        assert isinstance(enter, astroid.BoundMethod)
        exit_node = next(ast_nodes[6].infer())
        assert isinstance(exit_node, astroid.BoundMethod)


class ExceptionModelTest(unittest.TestCase):
    @staticmethod
    def test_valueerror_py3() -> None:
        ast_nodes = builder.extract_node(
            """
            try:
                x[42]
            except ValueError as err:
                err.args #@
                err.__traceback__ #@
                err.message #@
            """
        )
        assert isinstance(ast_nodes, list)
        args = next(ast_nodes[0].infer())
        assert isinstance(args,
astroid.Tuple) tb = next(ast_nodes[1].infer()) # Python 3.11: If 'contextlib' is loaded, '__traceback__' # could be set inside '__exit__' method in # which case 'err.__traceback__' will be 'Uninferable' try: assert isinstance(tb, astroid.Instance) assert tb.name == "traceback" except AssertionError: if PY311_PLUS: assert tb == util.Uninferable else: raise with pytest.raises(InferenceError): next(ast_nodes[2].infer()) def test_syntax_error(self) -> None: ast_node = builder.extract_node( """ try: x[42] except SyntaxError as err: err.text #@ """ ) inferred = next(ast_node.infer()) assert isinstance(inferred, astroid.Const) @unittest.skipIf(HAS_SIX, "This test fails if the six library is installed") def test_oserror(self) -> None: ast_nodes = builder.extract_node( """ try: raise OSError("a") except OSError as err: err.filename #@ err.filename2 #@ err.errno #@ """ ) expected_values = ["", "", 0] for node, value in zip(ast_nodes, expected_values): inferred = next(node.infer()) assert isinstance(inferred, astroid.Const) assert inferred.value == value def test_unicodedecodeerror(self) -> None: code = """ try: raise UnicodeDecodeError("utf-8", b"blob", 0, 1, "reason") except UnicodeDecodeError as error: error.object #@ """ node = builder.extract_node(code) inferred = next(node.infer()) assert isinstance(inferred, astroid.Const) assert inferred.value == b"" def test_import_error(self) -> None: ast_nodes = builder.extract_node( """ try: raise ImportError("a") except ImportError as err: err.name #@ err.path #@ """ ) for node in ast_nodes: inferred = next(node.infer()) assert isinstance(inferred, astroid.Const) assert inferred.value == "" def test_exception_instance_correctly_instantiated(self) -> None: ast_node = builder.extract_node( """ try: raise ImportError("a") except ImportError as err: err #@ """ ) inferred = next(ast_node.infer()) assert isinstance(inferred, astroid.Instance) cls = next(inferred.igetattr("__class__")) assert isinstance(cls, astroid.ClassDef) class 
DictObjectModelTest(unittest.TestCase): def test__class__(self) -> None: ast_node = builder.extract_node("{}.__class__") inferred = next(ast_node.infer()) self.assertIsInstance(inferred, astroid.ClassDef) self.assertEqual(inferred.name, "dict") def test_attributes_inferred_as_methods(self) -> None: ast_nodes = builder.extract_node( """ {}.values #@ {}.items #@ {}.keys #@ """ ) for node in ast_nodes: inferred = next(node.infer()) self.assertIsInstance(inferred, astroid.BoundMethod) def test_wrapper_objects_for_dict_methods_python3(self) -> None: ast_nodes = builder.extract_node( """ {1:1, 2:3}.values() #@ {1:1, 2:3}.keys() #@ {1:1, 2:3}.items() #@ """ ) assert isinstance(ast_nodes, list) values = next(ast_nodes[0].infer()) self.assertIsInstance(values, objects.DictValues) self.assertEqual([elt.value for elt in values.elts], [1, 3]) keys = next(ast_nodes[1].infer()) self.assertIsInstance(keys, objects.DictKeys) self.assertEqual([elt.value for elt in keys.elts], [1, 2]) items = next(ast_nodes[2].infer()) self.assertIsInstance(items, objects.DictItems) class TestExceptionInstanceModel: """Tests for ExceptionInstanceModel.""" def test_str_argument_not_required(self) -> None: """Test that the first argument to an exception does not need to be a str.""" ast_node = builder.extract_node( """ BaseException() #@ """ ) args: nodes.Tuple = next(ast_node.infer()).getattr("args")[0] # BaseException doesn't have any required args, not even a string assert not args.elts @pytest.mark.parametrize("parentheses", (True, False)) def test_lru_cache(parentheses) -> None: ast_nodes = builder.extract_node( f""" import functools class Foo(object): @functools.lru_cache{"()" if parentheses else ""} def foo(): pass f = Foo() f.foo.cache_clear #@ f.foo.__wrapped__ #@ f.foo.cache_info() #@ """ ) assert isinstance(ast_nodes, list) cache_clear = next(ast_nodes[0].infer()) assert isinstance(cache_clear, astroid.BoundMethod) wrapped = next(ast_nodes[1].infer()) assert isinstance(wrapped, 
astroid.FunctionDef)
    assert wrapped.name == "foo"
    cache_info = next(ast_nodes[2].infer())
    assert isinstance(cache_info, astroid.Instance)


# astroid-3.2.2/tests/test_manager.py

# Licensed under the LGPL: https://www.gnu.org/licenses/old-licenses/lgpl-2.1.en.html
# For details: https://github.com/pylint-dev/astroid/blob/main/LICENSE
# Copyright (c) https://github.com/pylint-dev/astroid/blob/main/CONTRIBUTORS.txt

import os
import site
import sys
import time
import unittest
import warnings
from collections.abc import Iterator
from contextlib import contextmanager
from unittest import mock

import pytest

import astroid
from astroid import manager, test_utils
from astroid.const import IS_JYTHON, IS_PYPY
from astroid.exceptions import (
    AstroidBuildingError,
    AstroidImportError,
    AttributeInferenceError,
)
from astroid.interpreter._import import util
from astroid.modutils import EXT_LIB_DIRS, module_in_path
from astroid.nodes import Const
from astroid.nodes.scoped_nodes import ClassDef, Module

from .
import resources def _get_file_from_object(obj) -> str: if IS_JYTHON: return obj.__file__.split("$py.class")[0] + ".py" return obj.__file__ class AstroidManagerTest(resources.SysPathSetup, unittest.TestCase): def setUp(self) -> None: super().setUp() self.manager = test_utils.brainless_manager() self.manager.clear_cache() def test_ast_from_file(self) -> None: filepath = unittest.__file__ ast = self.manager.ast_from_file(filepath) self.assertEqual(ast.name, "unittest") self.assertIn("unittest", self.manager.astroid_cache) def test_ast_from_file_cache(self) -> None: filepath = unittest.__file__ self.manager.ast_from_file(filepath) ast = self.manager.ast_from_file("unhandledName", "unittest") self.assertEqual(ast.name, "unittest") self.assertIn("unittest", self.manager.astroid_cache) def test_ast_from_file_astro_builder(self) -> None: filepath = unittest.__file__ ast = self.manager.ast_from_file(filepath, None, True, True) self.assertEqual(ast.name, "unittest") self.assertIn("unittest", self.manager.astroid_cache) def test_ast_from_file_name_astro_builder_exception(self) -> None: self.assertRaises( AstroidBuildingError, self.manager.ast_from_file, "unhandledName" ) def test_ast_from_string(self) -> None: filepath = unittest.__file__ dirname = os.path.dirname(filepath) modname = os.path.basename(dirname) with open(filepath, encoding="utf-8") as file: data = file.read() ast = self.manager.ast_from_string(data, modname, filepath) self.assertEqual(ast.name, "unittest") self.assertEqual(ast.file, filepath) self.assertIn("unittest", self.manager.astroid_cache) def test_do_not_expose_main(self) -> None: obj = self.manager.ast_from_module_name("__main__") self.assertEqual(obj.name, "__main__") self.assertEqual(obj.items(), []) def test_ast_from_module_name(self) -> None: ast = self.manager.ast_from_module_name("unittest") self.assertEqual(ast.name, "unittest") self.assertIn("unittest", self.manager.astroid_cache) def test_ast_from_module_name_not_python_source(self) -> None: 
ast = self.manager.ast_from_module_name("time") self.assertEqual(ast.name, "time") self.assertIn("time", self.manager.astroid_cache) self.assertEqual(ast.pure_python, False) def test_ast_from_module_name_astro_builder_exception(self) -> None: self.assertRaises( AstroidBuildingError, self.manager.ast_from_module_name, "unhandledModule", ) def _test_ast_from_old_namespace_package_protocol(self, root: str) -> None: origpath = sys.path[:] paths = [resources.find(f"data/path_{root}_{index}") for index in range(1, 4)] sys.path.extend(paths) try: for name in ("foo", "bar", "baz"): module = self.manager.ast_from_module_name("package." + name) self.assertIsInstance(module, astroid.Module) finally: sys.path = origpath def test_ast_from_namespace_pkgutil(self) -> None: self._test_ast_from_old_namespace_package_protocol("pkgutil") def test_ast_from_namespace_pkg_resources(self) -> None: self._test_ast_from_old_namespace_package_protocol("pkg_resources") def test_identify_old_namespace_package_protocol(self) -> None: # Like the above cases, this package follows the old namespace package protocol # astroid currently assumes such packages are in sys.modules, so import it # pylint: disable-next=import-outside-toplevel import tests.testdata.python3.data.path_pkg_resources_1.package.foo as _ # noqa self.assertTrue( util.is_namespace("tests.testdata.python3.data.path_pkg_resources_1") ) def test_submodule_homonym_with_non_module(self) -> None: self.assertFalse( util.is_namespace("tests.testdata.python3.data.parent_of_homonym.doc") ) def test_module_is_not_namespace(self) -> None: self.assertFalse(util.is_namespace("tests.testdata.python3.data.all")) self.assertFalse(util.is_namespace("__main__")) self.assertFalse( util.is_namespace(next(iter(EXT_LIB_DIRS)).rsplit("/", maxsplit=1)[-1]), ) self.assertFalse(util.is_namespace("importlib._bootstrap")) def test_module_unexpectedly_missing_spec(self) -> None: astroid_module = sys.modules["astroid"] original_spec = astroid_module.__spec__ 
del astroid_module.__spec__ try: self.assertFalse(util.is_namespace("astroid")) finally: astroid_module.__spec__ = original_spec @mock.patch( "astroid.interpreter._import.util._find_spec_from_path", side_effect=AttributeError, ) def test_module_unexpectedly_missing_path(self, mocked) -> None: """Https://github.com/pylint-dev/pylint/issues/7592.""" self.assertFalse(util.is_namespace("astroid")) def test_module_unexpectedly_spec_is_none(self) -> None: astroid_module = sys.modules["astroid"] original_spec = astroid_module.__spec__ astroid_module.__spec__ = None try: self.assertFalse(util.is_namespace("astroid")) finally: astroid_module.__spec__ = original_spec def test_implicit_namespace_package(self) -> None: data_dir = os.path.dirname(resources.find("data/namespace_pep_420")) contribute = os.path.join(data_dir, "contribute_to_namespace") for value in (data_dir, contribute): sys.path.insert(0, value) try: module = self.manager.ast_from_module_name("namespace_pep_420.module") self.assertIsInstance(module, astroid.Module) self.assertEqual(module.name, "namespace_pep_420.module") var = next(module.igetattr("var")) self.assertIsInstance(var, astroid.Const) self.assertEqual(var.value, 42) finally: for _ in range(2): sys.path.pop(0) @pytest.mark.skipif( IS_PYPY, reason="PyPy provides no way to tell apart frozen stdlib from old-style namespace packages", ) def test_namespace_package_pth_support(self) -> None: pth = "foogle_fax-0.12.5-py2.7-nspkg.pth" site.addpackage(resources.RESOURCE_PATH, pth, []) try: module = self.manager.ast_from_module_name("foogle.fax") submodule = next(module.igetattr("a")) value = next(submodule.igetattr("x")) self.assertIsInstance(value, astroid.Const) with self.assertRaises(AstroidImportError): self.manager.ast_from_module_name("foogle.moogle") finally: sys.modules.pop("foogle") @pytest.mark.skipif( IS_PYPY, reason="PyPy provides no way to tell apart frozen stdlib from old-style namespace packages", ) def test_nested_namespace_import(self) -> 
None: pth = "foogle_fax-0.12.5-py2.7-nspkg.pth" site.addpackage(resources.RESOURCE_PATH, pth, []) try: self.manager.ast_from_module_name("foogle.crank") finally: sys.modules.pop("foogle") def test_namespace_and_file_mismatch(self) -> None: filepath = unittest.__file__ ast = self.manager.ast_from_file(filepath) self.assertEqual(ast.name, "unittest") pth = "foogle_fax-0.12.5-py2.7-nspkg.pth" site.addpackage(resources.RESOURCE_PATH, pth, []) try: with self.assertRaises(AstroidImportError): self.manager.ast_from_module_name("unittest.foogle.fax") finally: sys.modules.pop("foogle") def _test_ast_from_zip(self, archive: str) -> None: sys.modules.pop("mypypa", None) archive_path = resources.find(archive) sys.path.insert(0, archive_path) module = self.manager.ast_from_module_name("mypypa") self.assertEqual(module.name, "mypypa") end = os.path.join(archive, "mypypa") self.assertTrue( module.file.endswith(end), f"{module.file} doesn't endswith {end}" ) @contextmanager def _restore_package_cache(self) -> Iterator: orig_path = sys.path[:] orig_pathcache = sys.path_importer_cache.copy() orig_modcache = self.manager.astroid_cache.copy() orig_modfilecache = self.manager._mod_file_cache.copy() orig_importhooks = self.manager._failed_import_hooks[:] yield self.manager._failed_import_hooks = orig_importhooks self.manager._mod_file_cache = orig_modfilecache self.manager.astroid_cache = orig_modcache sys.path_importer_cache = orig_pathcache sys.path = orig_path def test_ast_from_module_name_egg(self) -> None: with self._restore_package_cache(): self._test_ast_from_zip( os.path.sep.join(["data", os.path.normcase("MyPyPa-0.1.0-py2.5.egg")]) ) def test_ast_from_module_name_zip(self) -> None: with self._restore_package_cache(): self._test_ast_from_zip( os.path.sep.join(["data", os.path.normcase("MyPyPa-0.1.0-py2.5.zip")]) ) def test_ast_from_module_name_pyz(self) -> None: try: linked_file_name = os.path.join( resources.RESOURCE_PATH, "MyPyPa-0.1.0-py2.5.pyz" ) os.symlink( 
os.path.join(resources.RESOURCE_PATH, "MyPyPa-0.1.0-py2.5.zip"), linked_file_name, ) with self._restore_package_cache(): self._test_ast_from_zip(linked_file_name) finally: os.remove(linked_file_name) def test_zip_import_data(self) -> None: """Check if zip_import_data works.""" with self._restore_package_cache(): filepath = resources.find("data/MyPyPa-0.1.0-py2.5.zip/mypypa") ast = self.manager.zip_import_data(filepath) self.assertEqual(ast.name, "mypypa") def test_zip_import_data_without_zipimport(self) -> None: """Check if zip_import_data return None without zipimport.""" self.assertEqual(self.manager.zip_import_data("path"), None) def test_file_from_module(self) -> None: """Check if the unittest filepath is equals to the result of the method.""" self.assertEqual( _get_file_from_object(unittest), self.manager.file_from_module_name("unittest", None).location, ) def test_file_from_module_name_astro_building_exception(self) -> None: """Check if the method raises an exception with a wrong module name.""" self.assertRaises( AstroidBuildingError, self.manager.file_from_module_name, "unhandledModule", None, ) def test_ast_from_module(self) -> None: ast = self.manager.ast_from_module(unittest) self.assertEqual(ast.pure_python, True) ast = self.manager.ast_from_module(time) self.assertEqual(ast.pure_python, False) def test_ast_from_module_cache(self) -> None: """Check if the module is in the cache manager.""" ast = self.manager.ast_from_module(unittest) self.assertEqual(ast.name, "unittest") self.assertIn("unittest", self.manager.astroid_cache) def test_ast_from_class(self) -> None: ast = self.manager.ast_from_class(int) self.assertEqual(ast.name, "int") self.assertEqual(ast.parent.frame().name, "builtins") self.assertEqual(ast.parent.frame().name, "builtins") ast = self.manager.ast_from_class(object) self.assertEqual(ast.name, "object") self.assertEqual(ast.parent.frame().name, "builtins") self.assertEqual(ast.parent.frame().name, "builtins") self.assertIn("__setattr__", 
ast) def test_ast_from_class_with_module(self) -> None: """Check if the method works with the module name.""" ast = self.manager.ast_from_class(int, int.__module__) self.assertEqual(ast.name, "int") self.assertEqual(ast.parent.frame().name, "builtins") self.assertEqual(ast.parent.frame().name, "builtins") ast = self.manager.ast_from_class(object, object.__module__) self.assertEqual(ast.name, "object") self.assertEqual(ast.parent.frame().name, "builtins") self.assertEqual(ast.parent.frame().name, "builtins") self.assertIn("__setattr__", ast) def test_ast_from_class_attr_error(self) -> None: """Give a wrong class at the ast_from_class method.""" self.assertRaises(AstroidBuildingError, self.manager.ast_from_class, None) def test_failed_import_hooks(self) -> None: def hook(modname: str): if modname == "foo.bar": return unittest raise AstroidBuildingError() with self.assertRaises(AstroidBuildingError): self.manager.ast_from_module_name("foo.bar") with self._restore_package_cache(): self.manager.register_failed_import_hook(hook) self.assertEqual(unittest, self.manager.ast_from_module_name("foo.bar")) with self.assertRaises(AstroidBuildingError): self.manager.ast_from_module_name("foo.bar.baz") def test_same_name_import_module(self) -> None: """Test inference of an import statement with the same name as the module. See https://github.com/pylint-dev/pylint/issues/5151. """ math_file = resources.find("data/import_conflicting_names/math.py") module = self.manager.ast_from_file(math_file) # Change the cache key and module name to mimic importing the test file # from the root/top level. This creates a clash between math.py and stdlib math. 
        self.manager.astroid_cache["math"] = self.manager.astroid_cache.pop(module.name)
        module.name = "math"

        # Infer the 'import math' statement
        stdlib_math = next(module.body[1].value.args[0].infer())
        assert self.manager.astroid_cache["math"] != stdlib_math

    def test_raises_exception_for_empty_modname(self) -> None:
        with pytest.raises(AstroidBuildingError):
            self.manager.ast_from_module_name(None)

    def test_denied_modules_raise(self) -> None:
        self.manager.module_denylist.add("random")
        with pytest.raises(AstroidImportError, match="random"):
            self.manager.ast_from_module_name("random")
        # and a module not in the deny list shouldn't raise
        self.manager.ast_from_module_name("math")


class IsolatedAstroidManagerTest(unittest.TestCase):
    def test_no_user_warning(self):
        mgr = manager.AstroidManager()
        self.addCleanup(mgr.clear_cache)
        with warnings.catch_warnings():
            warnings.filterwarnings("error", category=UserWarning)
            mgr.ast_from_module_name("setuptools")
            mgr.ast_from_module_name("pip")


class BorgAstroidManagerTC(unittest.TestCase):
    def test_borg(self) -> None:
        """Test that the AstroidManager is really a borg, i.e. that two
        different instances share the same cache.
""" first_manager = manager.AstroidManager() built = first_manager.ast_from_module_name("builtins") second_manager = manager.AstroidManager() second_built = second_manager.ast_from_module_name("builtins") self.assertIs(built, second_built) def test_max_inferable_values(self) -> None: mgr = manager.AstroidManager() original_limit = mgr.max_inferable_values def reset_limit(): nonlocal original_limit manager.AstroidManager().max_inferable_values = original_limit self.addCleanup(reset_limit) mgr.max_inferable_values = 4 self.assertEqual(manager.AstroidManager.brain["max_inferable_values"], 4) class ClearCacheTest(unittest.TestCase): def test_clear_cache_clears_other_lru_caches(self) -> None: lrus = ( astroid.nodes._base_nodes.LookupMixIn.lookup, astroid.modutils._cache_normalize_path_, util.is_namespace, astroid.interpreter.objectmodel.ObjectModel.attributes, ClassDef._metaclass_lookup_attribute, ) # Get a baseline for the size of the cache after simply calling bootstrap() baseline_cache_infos = [lru.cache_info() for lru in lrus] # Generate some hits and misses module = Module("", file="", path=[], package=False) ClassDef( "", lineno=0, col_offset=0, end_lineno=0, end_col_offset=0, parent=module, ).lookup("garbage") module_in_path("unittest", "garbage_path") util.is_namespace("unittest") astroid.interpreter.objectmodel.ObjectModel().attributes() with pytest.raises(AttributeInferenceError): ClassDef( "", lineno=0, col_offset=0, end_lineno=0, end_col_offset=0, parent=module, ).getattr("garbage") # Did the hits or misses actually happen? 
incremented_cache_infos = [lru.cache_info() for lru in lrus] for incremented_cache, baseline_cache in zip( incremented_cache_infos, baseline_cache_infos ): with self.subTest(incremented_cache=incremented_cache): self.assertGreater( incremented_cache.hits + incremented_cache.misses, baseline_cache.hits + baseline_cache.misses, ) astroid.MANAGER.clear_cache() # also calls bootstrap() self.assertEqual(astroid.context._INFERENCE_CACHE, {}) # The cache sizes are now as low or lower than the original baseline cleared_cache_infos = [lru.cache_info() for lru in lrus] for cleared_cache, baseline_cache in zip( cleared_cache_infos, baseline_cache_infos ): with self.subTest(cleared_cache=cleared_cache): # less equal because the "baseline" might have had multiple calls to bootstrap() self.assertLessEqual(cleared_cache.currsize, baseline_cache.currsize) def test_brain_plugins_reloaded_after_clearing_cache(self) -> None: astroid.MANAGER.clear_cache() format_call = astroid.extract_node("''.format()") inferred = next(format_call.infer()) self.assertIsInstance(inferred, Const) def test_builtins_inference_after_clearing_cache(self) -> None: astroid.MANAGER.clear_cache() isinstance_call = astroid.extract_node("isinstance(1, int)") inferred = next(isinstance_call.infer()) self.assertIs(inferred.value, True) def test_builtins_inference_after_clearing_cache_manually(self) -> None: # Not recommended to manipulate this, so we detect it and call clear_cache() instead astroid.MANAGER.brain["astroid_cache"].clear() isinstance_call = astroid.extract_node("isinstance(1, int)") inferred = next(isinstance_call.infer()) self.assertIs(inferred.value, True) astroid-3.2.2/tests/test_constraint.py0000664000175000017500000003611114622475517020047 0ustar epsilonepsilon# Licensed under the LGPL: https://www.gnu.org/licenses/old-licenses/lgpl-2.1.en.html # For details: https://github.com/pylint-dev/astroid/blob/main/LICENSE # Copyright (c) https://github.com/pylint-dev/astroid/blob/main/CONTRIBUTORS.txt 
"""Tests for inference involving constraints.""" from __future__ import annotations import pytest from astroid import builder, nodes from astroid.util import Uninferable def common_params(node: str) -> pytest.MarkDecorator: return pytest.mark.parametrize( ("condition", "satisfy_val", "fail_val"), ( (f"{node} is None", None, 3), (f"{node} is not None", 3, None), ), ) @common_params(node="x") def test_if_single_statement( condition: str, satisfy_val: int | None, fail_val: int | None ) -> None: """Test constraint for a variable that is used in the first statement of an if body.""" node1, node2 = builder.extract_node( f""" def f1(x = {fail_val}): if {condition}: # Filters out default value return ( x #@ ) def f2(x = {satisfy_val}): if {condition}: # Does not filter out default value return ( x #@ ) """ ) inferred = node1.inferred() assert len(inferred) == 1 assert inferred[0] is Uninferable inferred = node2.inferred() assert len(inferred) == 2 assert isinstance(inferred[0], nodes.Const) assert inferred[0].value == satisfy_val assert inferred[1] is Uninferable @common_params(node="x") def test_if_multiple_statements( condition: str, satisfy_val: int | None, fail_val: int | None ) -> None: """Test constraint for a variable that is used in an if body with multiple statements. 
""" node1, node2 = builder.extract_node( f""" def f1(x = {fail_val}): if {condition}: # Filters out default value print(x) return ( x #@ ) def f2(x = {satisfy_val}): if {condition}: # Does not filter out default value print(x) return ( x #@ ) """ ) inferred = node1.inferred() assert len(inferred) == 1 assert inferred[0] is Uninferable inferred = node2.inferred() assert len(inferred) == 2 assert isinstance(inferred[0], nodes.Const) assert inferred[0].value == satisfy_val assert inferred[1] is Uninferable @common_params(node="x") def test_if_irrelevant_condition( condition: str, satisfy_val: int | None, fail_val: int | None ) -> None: """Test that constraint for a different variable doesn't apply.""" nodes_ = builder.extract_node( f""" def f1(x, y = {fail_val}): if {condition}: # Does not filter out fail_val return ( y #@ ) def f2(x, y = {satisfy_val}): if {condition}: return ( y #@ ) """ ) for node, val in zip(nodes_, (fail_val, satisfy_val)): inferred = node.inferred() assert len(inferred) == 2 assert isinstance(inferred[0], nodes.Const) assert inferred[0].value == val assert inferred[1] is Uninferable @common_params(node="x") def test_outside_if( condition: str, satisfy_val: int | None, fail_val: int | None ) -> None: """Test that constraint in an if condition doesn't apply outside of the if.""" nodes_ = builder.extract_node( f""" def f1(x = {fail_val}): if {condition}: pass return ( x #@ ) def f2(x = {satisfy_val}): if {condition}: pass return ( x #@ ) """ ) for node, val in zip(nodes_, (fail_val, satisfy_val)): inferred = node.inferred() assert len(inferred) == 2 assert isinstance(inferred[0], nodes.Const) assert inferred[0].value == val assert inferred[1] is Uninferable @common_params(node="x") def test_nested_if( condition: str, satisfy_val: int | None, fail_val: int | None ) -> None: """Test that constraint in an if condition applies within inner if statements.""" node1, node2 = builder.extract_node( f""" def f1(y, x = {fail_val}): if {condition}: if y is not 
None: return ( x #@ ) def f2(y, x = {satisfy_val}): if {condition}: if y is not None: return ( x #@ ) """ ) inferred = node1.inferred() assert len(inferred) == 1 assert inferred[0] is Uninferable inferred = node2.inferred() assert len(inferred) == 2 assert isinstance(inferred[0], nodes.Const) assert inferred[0].value == satisfy_val assert inferred[1] is Uninferable def test_if_uninferable() -> None: """Test that when no inferred values satisfy all constraints, Uninferable is inferred. """ node1, node2 = builder.extract_node( """ def f1(): x = None if x is not None: x #@ def f2(): x = 1 if x is not None: pass else: x #@ """ ) inferred = node1.inferred() assert len(inferred) == 1 assert inferred[0] is Uninferable inferred = node2.inferred() assert len(inferred) == 1 assert inferred[0] is Uninferable @common_params(node="x") def test_if_reassignment_in_body( condition: str, satisfy_val: int | None, fail_val: int | None ) -> None: """Test that constraint in an if condition doesn't apply when the variable is assigned to a failing value inside the if body. """ node = builder.extract_node( f""" def f(x, y): if {condition}: if y: x = {fail_val} return ( x #@ ) """ ) inferred = node.inferred() assert len(inferred) == 2 assert inferred[0] is Uninferable assert isinstance(inferred[1], nodes.Const) assert inferred[1].value == fail_val @common_params(node="x") def test_if_elif_else_negates( condition: str, satisfy_val: int | None, fail_val: int | None ) -> None: """Test that constraint in an if condition is negated when the variable is used in the elif and else branches. 
""" node1, node2, node3, node4 = builder.extract_node( f""" def f1(y, x = {fail_val}): if {condition}: pass elif y: # Does not filter out default value return ( x #@ ) else: # Does not filter out default value return ( x #@ ) def f2(y, x = {satisfy_val}): if {condition}: pass elif y: # Filters out default value return ( x #@ ) else: # Filters out default value return ( x #@ ) """ ) for node in (node1, node2): inferred = node.inferred() assert len(inferred) == 2 assert isinstance(inferred[0], nodes.Const) assert inferred[0].value == fail_val assert inferred[1] is Uninferable for node in (node3, node4): inferred = node.inferred() assert len(inferred) == 1 assert inferred[0] is Uninferable @common_params(node="x") def test_if_reassignment_in_else( condition: str, satisfy_val: int | None, fail_val: int | None ) -> None: """Test that constraint in an if condition doesn't apply when the variable is assigned to a failing value inside the else branch. """ node = builder.extract_node( f""" def f(x, y): if {condition}: return x else: if y: x = {satisfy_val} return ( x #@ ) """ ) inferred = node.inferred() assert len(inferred) == 2 assert inferred[0] is Uninferable assert isinstance(inferred[1], nodes.Const) assert inferred[1].value == satisfy_val @common_params(node="x") def test_if_comprehension_shadow( condition: str, satisfy_val: int | None, fail_val: int | None ) -> None: """Test that constraint in an if condition doesn't apply when the variable is shadowed by an inner comprehension scope. 
""" node = builder.extract_node( f""" def f(x): if {condition}: return [ x #@ for x in [{satisfy_val}, {fail_val}] ] """ ) inferred = node.inferred() assert len(inferred) == 2 for actual, expected in zip(inferred, (satisfy_val, fail_val)): assert isinstance(actual, nodes.Const) assert actual.value == expected @common_params(node="x") def test_if_function_shadow( condition: str, satisfy_val: int | None, fail_val: int | None ) -> None: """Test that constraint in an if condition doesn't apply when the variable is shadowed by an inner function scope. """ node = builder.extract_node( f""" x = {satisfy_val} if {condition}: def f(x = {fail_val}): return ( x #@ ) """ ) inferred = node.inferred() assert len(inferred) == 2 assert isinstance(inferred[0], nodes.Const) assert inferred[0].value == fail_val assert inferred[1] is Uninferable @common_params(node="x") def test_if_function_call( condition: str, satisfy_val: int | None, fail_val: int | None ) -> None: """Test that constraint in an if condition doesn't apply for a parameter a different function call, but with the same name. 
""" node = builder.extract_node( f""" def f(x = {satisfy_val}): if {condition}: g({fail_val}) #@ def g(x): return x """ ) inferred = node.inferred() assert len(inferred) == 1 assert isinstance(inferred[0], nodes.Const) assert inferred[0].value == fail_val @common_params(node="self.x") def test_if_instance_attr( condition: str, satisfy_val: int | None, fail_val: int | None ) -> None: """Test constraint for an instance attribute in an if statement.""" node1, node2 = builder.extract_node( f""" class A1: def __init__(self, x = {fail_val}): self.x = x def method(self): if {condition}: self.x #@ class A2: def __init__(self, x = {satisfy_val}): self.x = x def method(self): if {condition}: self.x #@ """ ) inferred = node1.inferred() assert len(inferred) == 1 assert inferred[0] is Uninferable inferred = node2.inferred() assert len(inferred) == 2 assert isinstance(inferred[0], nodes.Const) assert inferred[0].value == satisfy_val assert inferred[1] is Uninferable @common_params(node="self.x") def test_if_instance_attr_reassignment_in_body( condition: str, satisfy_val: int | None, fail_val: int | None ) -> None: """Test that constraint in an if condition doesn't apply to an instance attribute when it is assigned inside the if body. 
""" node1, node2 = builder.extract_node( f""" class A1: def __init__(self, x): self.x = x def method1(self): if {condition}: self.x = {satisfy_val} self.x #@ def method2(self): if {condition}: self.x = {fail_val} self.x #@ """ ) inferred = node1.inferred() assert len(inferred) == 2 assert inferred[0] is Uninferable assert isinstance(inferred[1], nodes.Const) assert inferred[1].value == satisfy_val inferred = node2.inferred() assert len(inferred) == 3 assert inferred[0] is Uninferable assert isinstance(inferred[1], nodes.Const) assert inferred[1].value == satisfy_val assert isinstance(inferred[2], nodes.Const) assert inferred[2].value == fail_val @common_params(node="x") def test_if_instance_attr_varname_collision1( condition: str, satisfy_val: int | None, fail_val: int | None ) -> None: """Test that constraint in an if condition doesn't apply to an instance attribute when the constraint refers to a variable with the same name. """ node1, node2 = builder.extract_node( f""" class A1: def __init__(self, x = {fail_val}): self.x = x def method(self, x = {fail_val}): if {condition}: x #@ self.x #@ """ ) inferred = node1.inferred() assert len(inferred) == 1 assert inferred[0] is Uninferable inferred = node2.inferred() assert len(inferred) == 2 assert isinstance(inferred[0], nodes.Const) assert inferred[0].value == fail_val assert inferred[1] is Uninferable @common_params(node="self.x") def test_if_instance_attr_varname_collision2( condition: str, satisfy_val: int | None, fail_val: int | None ) -> None: """Test that constraint in an if condition doesn't apply to a variable with the same name. 
""" node1, node2 = builder.extract_node( f""" class A1: def __init__(self, x = {fail_val}): self.x = x def method(self, x = {fail_val}): if {condition}: x #@ self.x #@ """ ) inferred = node1.inferred() assert len(inferred) == 2 assert isinstance(inferred[0], nodes.Const) assert inferred[0].value == fail_val assert inferred[1] is Uninferable inferred = node2.inferred() assert len(inferred) == 1 assert inferred[0] is Uninferable @common_params(node="self.x") def test_if_instance_attr_varname_collision3( condition: str, satisfy_val: int | None, fail_val: int | None ) -> None: """Test that constraint in an if condition doesn't apply to an instance attribute for an object of a different class. """ node = builder.extract_node( f""" class A1: def __init__(self, x = {fail_val}): self.x = x def method(self): obj = A2() if {condition}: obj.x #@ class A2: def __init__(self): self.x = {fail_val} """ ) inferred = node.inferred() assert len(inferred) == 1 assert isinstance(inferred[0], nodes.Const) assert inferred[0].value == fail_val @common_params(node="self.x") def test_if_instance_attr_varname_collision4( condition: str, satisfy_val: int | None, fail_val: int | None ) -> None: """Test that constraint in an if condition doesn't apply to a variable of the same name, when that variable is used to infer the value of the instance attribute. 
""" node = builder.extract_node( f""" class A1: def __init__(self, x): self.x = x def method(self): x = {fail_val} if {condition}: self.x = x self.x #@ """ ) inferred = node.inferred() assert len(inferred) == 2 assert inferred[0] is Uninferable assert isinstance(inferred[1], nodes.Const) assert inferred[1].value == fail_val astroid-3.2.2/tests/test_raw_building.py0000664000175000017500000001342614622475517020335 0ustar epsilonepsilon""" 'tests.testdata.python3.data.fake_module_with_warnings' and 'tests.testdata.python3.data.fake_module_with_warnings' are fake modules to simulate issues in unittest below """ # Licensed under the LGPL: https://www.gnu.org/licenses/old-licenses/lgpl-2.1.en.html # For details: https://github.com/pylint-dev/astroid/blob/main/LICENSE # Copyright (c) https://github.com/pylint-dev/astroid/blob/main/CONTRIBUTORS.txt from __future__ import annotations import logging import os import sys import types import unittest from typing import Any from unittest import mock import _io import pytest import tests.testdata.python3.data.fake_module_with_broken_getattr as fm_getattr import tests.testdata.python3.data.fake_module_with_warnings as fm from astroid.builder import AstroidBuilder from astroid.const import IS_PYPY, PY312_PLUS from astroid.raw_building import ( attach_dummy_node, build_class, build_from_import, build_function, build_module, ) class RawBuildingTC(unittest.TestCase): def test_attach_dummy_node(self) -> None: node = build_module("MyModule") attach_dummy_node(node, "DummyNode") self.assertEqual(1, len(list(node.get_children()))) def test_build_module(self) -> None: node = build_module("MyModule") self.assertEqual(node.name, "MyModule") self.assertEqual(node.pure_python, False) self.assertEqual(node.package, False) self.assertEqual(node.parent, None) def test_build_class(self) -> None: node = build_class("MyClass") self.assertEqual(node.name, "MyClass") self.assertEqual(node.doc_node, None) def test_build_function(self) -> None: node = 
build_function("MyFunction") self.assertEqual(node.name, "MyFunction") self.assertEqual(node.doc_node, None) def test_build_function_args(self) -> None: args = ["myArgs1", "myArgs2"] node = build_function("MyFunction", args) self.assertEqual("myArgs1", node.args.args[0].name) self.assertEqual("myArgs2", node.args.args[1].name) self.assertEqual(2, len(node.args.args)) def test_build_function_defaults(self) -> None: defaults = ["defaults1", "defaults2"] node = build_function(name="MyFunction", args=None, defaults=defaults) self.assertEqual(2, len(node.args.defaults)) def test_build_function_posonlyargs(self) -> None: node = build_function(name="MyFunction", posonlyargs=["a", "b"]) self.assertEqual(2, len(node.args.posonlyargs)) def test_build_function_kwonlyargs(self) -> None: node = build_function(name="MyFunction", kwonlyargs=["a", "b"]) assert len(node.args.kwonlyargs) == 2 assert node.args.kwonlyargs[0].name == "a" assert node.args.kwonlyargs[1].name == "b" def test_build_from_import(self) -> None: names = ["exceptions, inference, inspector"] node = build_from_import("astroid", names) self.assertEqual(len(names), len(node.names)) @unittest.skipIf(IS_PYPY, "Only affects CPython") def test_io_is__io(self): # _io module calls itself io before Python 3.12. This leads # to cyclic dependencies when astroid tries to resolve # what io.BufferedReader is. The code that handles this # is in astroid.raw_building.imported_member, which verifies # the true name of the module. 
builder = AstroidBuilder() module = builder.inspect_build(_io) buffered_reader = module.getattr("BufferedReader")[0] expected = "_io" if PY312_PLUS else "io" self.assertEqual(buffered_reader.root().name, expected) def test_build_function_deepinspect_deprecation(self) -> None: # Tests https://github.com/pylint-dev/astroid/issues/1717 # When astroid deep inspection of modules raises # attribute errors when getting all attributes # Create a mock module to simulate a Cython module m = types.ModuleType("test") # Attach a mock of pandas with the same behavior m.pd = fm # This should not raise an exception AstroidBuilder().module_build(m, "test") def test_module_object_with_broken_getattr(self) -> None: # Tests https://github.com/pylint-dev/astroid/issues/1958 # When astroid deep inspection of modules raises # errors when using hasattr(). # This should not raise an exception AstroidBuilder().inspect_build(fm_getattr, "test") @pytest.mark.skipif( "posix" not in sys.builtin_module_names, reason="Platform doesn't support posix" ) def test_build_module_getattr_catch_output( capsys: pytest.CaptureFixture[str], caplog: pytest.LogCaptureFixture, ) -> None: """Catch stdout and stderr in module __getattr__ calls when building a module. Usually raised by DeprecationWarning or FutureWarning. """ caplog.set_level(logging.INFO) original_sys = sys.modules original_module = sys.modules["posix"] expected_out = "INFO (TEST): Welcome to posix!" 
expected_err = "WARNING (TEST): Monkey-patched version of posix - module getattr" class CustomGetattr: def __getattr__(self, name: str) -> Any: print(f"{expected_out}") print(expected_err, file=sys.stderr) return getattr(original_module, name) def mocked_sys_modules_getitem(name: str) -> types.ModuleType | CustomGetattr: if name != "posix": return original_sys[name] return CustomGetattr() with mock.patch("astroid.raw_building.sys.modules") as sys_mock: sys_mock.__getitem__.side_effect = mocked_sys_modules_getitem builder = AstroidBuilder() builder.inspect_build(os) out, err = capsys.readouterr() assert expected_out in caplog.text assert expected_err in caplog.text assert not out assert not err astroid-3.2.2/tests/__init__.py0000664000175000017500000000000014622475517016347 0ustar epsilonepsilonastroid-3.2.2/tests/test_lookup.py0000664000175000017500000010405014622475517017172 0ustar epsilonepsilon# Licensed under the LGPL: https://www.gnu.org/licenses/old-licenses/lgpl-2.1.en.html # For details: https://github.com/pylint-dev/astroid/blob/main/LICENSE # Copyright (c) https://github.com/pylint-dev/astroid/blob/main/CONTRIBUTORS.txt """Tests for the astroid variable lookup capabilities.""" import functools import unittest from astroid import builder, nodes, test_utils from astroid.exceptions import ( AttributeInferenceError, InferenceError, NameInferenceError, ) from . 
import resources class LookupTest(resources.SysPathSetup, unittest.TestCase): def setUp(self) -> None: super().setUp() self.module = resources.build_file("data/module.py", "data.module") self.module2 = resources.build_file("data/module2.py", "data.module2") self.nonregr = resources.build_file("data/nonregr.py", "data.nonregr") def test_limit(self) -> None: code = """ l = [a for a,b in list] a = 1 b = a a = None def func(): c = 1 """ astroid = builder.parse(code, __name__) # a & b a = next(astroid.nodes_of_class(nodes.Name)) self.assertEqual(a.lineno, 2) self.assertEqual(len(astroid.lookup("b")[1]), 1) self.assertEqual(len(astroid.lookup("a")[1]), 1) b = astroid.locals["b"][0] stmts = a.lookup("a")[1] self.assertEqual(len(stmts), 1) self.assertEqual(b.lineno, 6) b_infer = b.infer() b_value = next(b_infer) self.assertEqual(b_value.value, 1) # c self.assertRaises(StopIteration, functools.partial(next, b_infer)) func = astroid.locals["func"][0] self.assertEqual(len(func.lookup("c")[1]), 1) def test_module(self) -> None: astroid = builder.parse("pass", __name__) # built-in objects none = next(astroid.ilookup("None")) self.assertIsNone(none.value) obj = next(astroid.ilookup("object")) self.assertIsInstance(obj, nodes.ClassDef) self.assertEqual(obj.name, "object") self.assertRaises( InferenceError, functools.partial(next, astroid.ilookup("YOAA")) ) # XXX self.assertEqual(len(list(self.nonregr.ilookup("enumerate"))), 2) def test_class_ancestor_name(self) -> None: code = """ class A: pass class A(A): pass """ astroid = builder.parse(code, __name__) cls1 = astroid.locals["A"][0] cls2 = astroid.locals["A"][1] name = next(cls2.nodes_of_class(nodes.Name)) self.assertEqual(next(name.infer()), cls1) # backport these tests to inline code def test_method(self) -> None: method = self.module["YOUPI"]["method"] my_dict = next(method.ilookup("MY_DICT")) self.assertTrue(isinstance(my_dict, nodes.Dict), my_dict) none = next(method.ilookup("None")) self.assertIsNone(none.value)
self.assertRaises( InferenceError, functools.partial(next, method.ilookup("YOAA")) ) def test_function_argument_with_default(self) -> None: make_class = self.module2["make_class"] base = next(make_class.ilookup("base")) self.assertTrue(isinstance(base, nodes.ClassDef), base.__class__) self.assertEqual(base.name, "YO") self.assertEqual(base.root().name, "data.module") def test_class(self) -> None: klass = self.module["YOUPI"] my_dict = next(klass.ilookup("MY_DICT")) self.assertIsInstance(my_dict, nodes.Dict) none = next(klass.ilookup("None")) self.assertIsNone(none.value) obj = next(klass.ilookup("object")) self.assertIsInstance(obj, nodes.ClassDef) self.assertEqual(obj.name, "object") self.assertRaises( InferenceError, functools.partial(next, klass.ilookup("YOAA")) ) def test_inner_classes(self) -> None: ddd = list(self.nonregr["Ccc"].ilookup("Ddd")) self.assertEqual(ddd[0].name, "Ddd") def test_loopvar_hiding(self) -> None: astroid = builder.parse( """ x = 10 for x in range(5): print (x) if x > 0: print ('#' * x) """, __name__, ) xnames = [n for n in astroid.nodes_of_class(nodes.Name) if n.name == "x"] # inside the loop, only one possible assignment self.assertEqual(len(xnames[0].lookup("x")[1]), 1) # outside the loop, two possible assignments self.assertEqual(len(xnames[1].lookup("x")[1]), 2) self.assertEqual(len(xnames[2].lookup("x")[1]), 2) def test_list_comps(self) -> None: astroid = builder.parse( """ print ([ i for i in range(10) ]) print ([ i for i in range(10) ]) print ( list( i for i in range(10) ) ) """, __name__, ) xnames = [n for n in astroid.nodes_of_class(nodes.Name) if n.name == "i"] self.assertEqual(len(xnames[0].lookup("i")[1]), 1) self.assertEqual(xnames[0].lookup("i")[1][0].lineno, 2) self.assertEqual(len(xnames[1].lookup("i")[1]), 1) self.assertEqual(xnames[1].lookup("i")[1][0].lineno, 3) self.assertEqual(len(xnames[2].lookup("i")[1]), 1) self.assertEqual(xnames[2].lookup("i")[1][0].lineno, 4) def test_list_comp_target(self) -> None: """Test 
the list comprehension target.""" astroid = builder.parse( """ ten = [ var for var in range(10) ] var """ ) var = astroid.body[1].value self.assertRaises(NameInferenceError, var.inferred) def test_dict_comps(self) -> None: astroid = builder.parse( """ print ({ i: j for i in range(10) for j in range(10) }) print ({ i: j for i in range(10) for j in range(10) }) """, __name__, ) xnames = [n for n in astroid.nodes_of_class(nodes.Name) if n.name == "i"] self.assertEqual(len(xnames[0].lookup("i")[1]), 1) self.assertEqual(xnames[0].lookup("i")[1][0].lineno, 2) self.assertEqual(len(xnames[1].lookup("i")[1]), 1) self.assertEqual(xnames[1].lookup("i")[1][0].lineno, 3) xnames = [n for n in astroid.nodes_of_class(nodes.Name) if n.name == "j"] self.assertEqual(len(xnames[0].lookup("i")[1]), 1) self.assertEqual(xnames[0].lookup("i")[1][0].lineno, 2) self.assertEqual(len(xnames[1].lookup("i")[1]), 1) self.assertEqual(xnames[1].lookup("i")[1][0].lineno, 3) def test_set_comps(self) -> None: astroid = builder.parse( """ print ({ i for i in range(10) }) print ({ i for i in range(10) }) """, __name__, ) xnames = [n for n in astroid.nodes_of_class(nodes.Name) if n.name == "i"] self.assertEqual(len(xnames[0].lookup("i")[1]), 1) self.assertEqual(xnames[0].lookup("i")[1][0].lineno, 2) self.assertEqual(len(xnames[1].lookup("i")[1]), 1) self.assertEqual(xnames[1].lookup("i")[1][0].lineno, 3) def test_set_comp_closure(self) -> None: astroid = builder.parse( """ ten = { var for var in range(10) } var """ ) var = astroid.body[1].value self.assertRaises(NameInferenceError, var.inferred) def test_list_comp_nested(self) -> None: astroid = builder.parse( """ x = [[i + j for j in range(20)] for i in range(10)] """, __name__, ) xnames = [n for n in astroid.nodes_of_class(nodes.Name) if n.name == "i"] self.assertEqual(len(xnames[0].lookup("i")[1]), 1) self.assertEqual(xnames[0].lookup("i")[1][0].lineno, 3) def test_dict_comp_nested(self) -> None: astroid = builder.parse( """ x = {i: {i: j for j in 
range(20)} for i in range(10)} x3 = [{i + j for j in range(20)} # Can't do nested sets for i in range(10)] """, __name__, ) xnames = [n for n in astroid.nodes_of_class(nodes.Name) if n.name == "i"] self.assertEqual(len(xnames[0].lookup("i")[1]), 1) self.assertEqual(xnames[0].lookup("i")[1][0].lineno, 3) self.assertEqual(len(xnames[1].lookup("i")[1]), 1) self.assertEqual(xnames[1].lookup("i")[1][0].lineno, 3) def test_set_comp_nested(self) -> None: astroid = builder.parse( """ x = [{i + j for j in range(20)} # Can't do nested sets for i in range(10)] """, __name__, ) xnames = [n for n in astroid.nodes_of_class(nodes.Name) if n.name == "i"] self.assertEqual(len(xnames[0].lookup("i")[1]), 1) self.assertEqual(xnames[0].lookup("i")[1][0].lineno, 3) def test_lambda_nested(self) -> None: astroid = builder.parse( """ f = lambda x: ( lambda y: x + y) """ ) xnames = [n for n in astroid.nodes_of_class(nodes.Name) if n.name == "x"] self.assertEqual(len(xnames[0].lookup("x")[1]), 1) self.assertEqual(xnames[0].lookup("x")[1][0].lineno, 2) def test_function_nested(self) -> None: astroid = builder.parse( """ def f1(x): def f2(y): return x + y return f2 """ ) xnames = [n for n in astroid.nodes_of_class(nodes.Name) if n.name == "x"] self.assertEqual(len(xnames[0].lookup("x")[1]), 1) self.assertEqual(xnames[0].lookup("x")[1][0].lineno, 2) def test_class_variables(self) -> None: # Class variables are NOT available within nested scopes. 
astroid = builder.parse( """ class A: a = 10 def f1(self): return a # a is not defined f2 = lambda: a # a is not defined b = [a for _ in range(10)] # a is not defined class _Inner: inner_a = a + 1 """ ) names = [n for n in astroid.nodes_of_class(nodes.Name) if n.name == "a"] self.assertEqual(len(names), 4) for name in names: self.assertRaises(NameInferenceError, name.inferred) def test_class_in_function(self) -> None: # Function variables are available within classes, including methods astroid = builder.parse( """ def f(): x = 10 class A: a = x def f1(self): return x f2 = lambda: x b = [x for _ in range(10)] class _Inner: inner_a = x + 1 """ ) names = [n for n in astroid.nodes_of_class(nodes.Name) if n.name == "x"] self.assertEqual(len(names), 5) for name in names: self.assertEqual(len(name.lookup("x")[1]), 1, repr(name)) self.assertEqual(name.lookup("x")[1][0].lineno, 3, repr(name)) def test_generator_attributes(self) -> None: tree = builder.parse( """ def count(): "test" yield 0 iterer = count() num = iterer.next() """ ) next_node = tree.body[2].value.func gener = next_node.expr.inferred()[0] self.assertIsInstance(gener.getattr("__next__")[0], nodes.FunctionDef) self.assertIsInstance(gener.getattr("send")[0], nodes.FunctionDef) self.assertIsInstance(gener.getattr("throw")[0], nodes.FunctionDef) self.assertIsInstance(gener.getattr("close")[0], nodes.FunctionDef) def test_explicit___name__(self) -> None: code = """ class Pouet: __name__ = "pouet" p1 = Pouet() class PouetPouet(Pouet): pass p2 = Pouet() class NoName: pass p3 = NoName() """ astroid = builder.parse(code, __name__) p1 = next(astroid["p1"].infer()) self.assertTrue(p1.getattr("__name__")) p2 = next(astroid["p2"].infer()) self.assertTrue(p2.getattr("__name__")) self.assertTrue(astroid["NoName"].getattr("__name__")) p3 = next(astroid["p3"].infer()) self.assertRaises(AttributeInferenceError, p3.getattr, "__name__") def test_function_module_special(self) -> None: astroid = builder.parse( ''' def 
initialize(linter): """initialize linter with checkers in this package """ package_load(linter, __path__[0]) ''', "data.__init__", ) path = next( n for n in astroid.nodes_of_class(nodes.Name) if n.name == "__path__" ) self.assertEqual(len(path.lookup("__path__")[1]), 1) def test_builtin_lookup(self) -> None: self.assertEqual(nodes.builtin_lookup("__dict__")[1], ()) intstmts = nodes.builtin_lookup("int")[1] self.assertEqual(len(intstmts), 1) self.assertIsInstance(intstmts[0], nodes.ClassDef) self.assertEqual(intstmts[0].name, "int") self.assertIs(intstmts[0], nodes.const_factory(1)._proxied) def test_decorator_arguments_lookup(self) -> None: code = """ def decorator(value): def wrapper(function): return function return wrapper class foo: member = 10 #@ @decorator(member) #This will cause pylint to complain def test(self): pass """ node = builder.extract_node(code, __name__) assert isinstance(node, nodes.Assign) member = node.targets[0] it = member.infer() obj = next(it) self.assertIsInstance(obj, nodes.Const) self.assertEqual(obj.value, 10) self.assertRaises(StopIteration, functools.partial(next, it)) def test_inner_decorator_member_lookup(self) -> None: code = """ class FileA: def decorator(bla): return bla @__(decorator) def funcA(): return 4 """ decname = builder.extract_node(code, __name__) it = decname.infer() obj = next(it) self.assertIsInstance(obj, nodes.FunctionDef) self.assertRaises(StopIteration, functools.partial(next, it)) def test_static_method_lookup(self) -> None: code = """ class FileA: @staticmethod def funcA(): return 4 class Test: FileA = [1,2,3] def __init__(self): print (FileA.funcA()) """ astroid = builder.parse(code, __name__) it = astroid["Test"]["__init__"].ilookup("FileA") obj = next(it) self.assertIsInstance(obj, nodes.ClassDef) self.assertRaises(StopIteration, functools.partial(next, it)) def test_global_delete(self) -> None: code = """ def run2(): f = Frobble() class Frobble: pass Frobble.mumble = True del Frobble def run1(): f = 
Frobble() """ astroid = builder.parse(code, __name__) stmts = astroid["run2"].lookup("Frobbel")[1] self.assertEqual(len(stmts), 0) stmts = astroid["run1"].lookup("Frobbel")[1] self.assertEqual(len(stmts), 0) class LookupControlFlowTest(unittest.TestCase): """Tests for lookup capabilities and control flow.""" def test_consecutive_assign(self) -> None: """When multiple assignment statements are in the same block, only the last one is returned. """ code = """ x = 10 x = 100 print(x) """ astroid = builder.parse(code) x_name = next(n for n in astroid.nodes_of_class(nodes.Name) if n.name == "x") _, stmts = x_name.lookup("x") self.assertEqual(len(stmts), 1) self.assertEqual(stmts[0].lineno, 3) def test_assign_after_use(self) -> None: """An assignment statement appearing after the variable is not returned.""" code = """ print(x) x = 10 """ astroid = builder.parse(code) x_name = next(n for n in astroid.nodes_of_class(nodes.Name) if n.name == "x") _, stmts = x_name.lookup("x") self.assertEqual(len(stmts), 0) def test_del_removes_prior(self) -> None: """Delete statement removes any prior assignments.""" code = """ x = 10 del x print(x) """ astroid = builder.parse(code) x_name = next(n for n in astroid.nodes_of_class(nodes.Name) if n.name == "x") _, stmts = x_name.lookup("x") self.assertEqual(len(stmts), 0) def test_del_no_effect_after(self) -> None: """Delete statement doesn't remove future assignments.""" code = """ x = 10 del x x = 100 print(x) """ astroid = builder.parse(code) x_name = next(n for n in astroid.nodes_of_class(nodes.Name) if n.name == "x") _, stmts = x_name.lookup("x") self.assertEqual(len(stmts), 1) self.assertEqual(stmts[0].lineno, 4) def test_if_assign(self) -> None: """Assignment in if statement is added to lookup results, but does not replace prior assignments. 
""" code = """ def f(b): x = 10 if b: x = 100 print(x) """ astroid = builder.parse(code) x_name = next(n for n in astroid.nodes_of_class(nodes.Name) if n.name == "x") _, stmts = x_name.lookup("x") self.assertEqual(len(stmts), 2) self.assertCountEqual([stmt.lineno for stmt in stmts], [3, 5]) def test_if_assigns_same_branch(self) -> None: """When if branch has multiple assignment statements, only the last one is added. """ code = """ def f(b): x = 10 if b: x = 100 x = 1000 print(x) """ astroid = builder.parse(code) x_name = next(n for n in astroid.nodes_of_class(nodes.Name) if n.name == "x") _, stmts = x_name.lookup("x") self.assertEqual(len(stmts), 2) self.assertCountEqual([stmt.lineno for stmt in stmts], [3, 6]) def test_if_assigns_different_branch(self) -> None: """When different branches have assignment statements, the last one in each branch is added. """ code = """ def f(b): x = 10 if b == 1: x = 100 x = 1000 elif b == 2: x = 3 elif b == 3: x = 4 print(x) """ astroid = builder.parse(code) x_name = next(n for n in astroid.nodes_of_class(nodes.Name) if n.name == "x") _, stmts = x_name.lookup("x") self.assertEqual(len(stmts), 4) self.assertCountEqual([stmt.lineno for stmt in stmts], [3, 6, 8, 10]) def test_assign_exclusive(self) -> None: """When the variable appears inside a branch of an if statement, no assignment statements from other branches are returned. """ code = """ def f(b): x = 10 if b == 1: x = 100 x = 1000 elif b == 2: x = 3 elif b == 3: x = 4 else: print(x) """ astroid = builder.parse(code) x_name = next(n for n in astroid.nodes_of_class(nodes.Name) if n.name == "x") _, stmts = x_name.lookup("x") self.assertEqual(len(stmts), 1) self.assertEqual(stmts[0].lineno, 3) def test_assign_not_exclusive(self) -> None: """When the variable appears inside a branch of an if statement, only the last assignment statement in the same branch is returned. 
""" code = """ def f(b): x = 10 if b == 1: x = 100 x = 1000 elif b == 2: x = 3 elif b == 3: x = 4 print(x) else: x = 5 """ astroid = builder.parse(code) x_name = next(n for n in astroid.nodes_of_class(nodes.Name) if n.name == "x") _, stmts = x_name.lookup("x") self.assertEqual(len(stmts), 1) self.assertEqual(stmts[0].lineno, 10) def test_if_else(self) -> None: """When an assignment statement appears in both an if and else branch, both are added. This does NOT replace an assignment statement appearing before the if statement. (See issue #213) """ code = """ def f(b): x = 10 if b: x = 100 else: x = 1000 print(x) """ astroid = builder.parse(code) x_name = next(n for n in astroid.nodes_of_class(nodes.Name) if n.name == "x") _, stmts = x_name.lookup("x") self.assertEqual(len(stmts), 3) self.assertCountEqual([stmt.lineno for stmt in stmts], [3, 5, 7]) def test_if_variable_in_condition_1(self) -> None: """Test lookup works correctly when a variable appears in an if condition.""" code = """ x = 10 if x > 10: print('a') elif x > 0: print('b') """ astroid = builder.parse(code) x_name1, x_name2 = ( n for n in astroid.nodes_of_class(nodes.Name) if n.name == "x" ) _, stmts1 = x_name1.lookup("x") self.assertEqual(len(stmts1), 1) self.assertEqual(stmts1[0].lineno, 2) _, stmts2 = x_name2.lookup("x") self.assertEqual(len(stmts2), 1) self.assertEqual(stmts2[0].lineno, 2) def test_if_variable_in_condition_2(self) -> None: """Test lookup works correctly when a variable appears in an if condition, and the variable is reassigned in each branch. This is based on pylint-dev/pylint issue #3711. """ code = """ x = 10 if x > 10: x = 100 elif x > 0: x = 200 elif x > -10: x = 300 else: x = 400 """ astroid = builder.parse(code) x_names = [n for n in astroid.nodes_of_class(nodes.Name) if n.name == "x"] # All lookups should refer only to the initial x = 10. 
for x_name in x_names: _, stmts = x_name.lookup("x") self.assertEqual(len(stmts), 1) self.assertEqual(stmts[0].lineno, 2) def test_del_not_exclusive(self) -> None: """A delete statement in an if statement branch removes all previous assignment statements when the delete statement is not exclusive with the variable (e.g., when the variable is used below the if statement). """ code = """ def f(b): x = 10 if b == 1: x = 100 elif b == 2: del x elif b == 3: x = 4 # Only this assignment statement is returned print(x) """ astroid = builder.parse(code) x_name = next(n for n in astroid.nodes_of_class(nodes.Name) if n.name == "x") _, stmts = x_name.lookup("x") self.assertEqual(len(stmts), 1) self.assertEqual(stmts[0].lineno, 9) def test_del_exclusive(self) -> None: """A delete statement in an if statement branch that is exclusive with the variable does not remove previous assignment statements. """ code = """ def f(b): x = 10 if b == 1: x = 100 elif b == 2: del x else: print(x) """ astroid = builder.parse(code) x_name = next(n for n in astroid.nodes_of_class(nodes.Name) if n.name == "x") _, stmts = x_name.lookup("x") self.assertEqual(len(stmts), 1) self.assertEqual(stmts[0].lineno, 3) def test_assign_after_param(self) -> None: """When an assignment statement overwrites a function parameter, only the assignment is returned, even when the variable and assignment do not have the same parent. 
""" code = """ def f1(x): x = 100 print(x) def f2(x): x = 100 if True: print(x) """ astroid = builder.parse(code) x_name1, x_name2 = ( n for n in astroid.nodes_of_class(nodes.Name) if n.name == "x" ) _, stmts1 = x_name1.lookup("x") self.assertEqual(len(stmts1), 1) self.assertEqual(stmts1[0].lineno, 3) _, stmts2 = x_name2.lookup("x") self.assertEqual(len(stmts2), 1) self.assertEqual(stmts2[0].lineno, 7) def test_assign_after_kwonly_param(self) -> None: """When an assignment statement overwrites a function keyword-only parameter, only the assignment is returned, even when the variable and assignment do not have the same parent. """ code = """ def f1(*, x): x = 100 print(x) def f2(*, x): x = 100 if True: print(x) """ astroid = builder.parse(code) x_name1, x_name2 = ( n for n in astroid.nodes_of_class(nodes.Name) if n.name == "x" ) _, stmts1 = x_name1.lookup("x") self.assertEqual(len(stmts1), 1) self.assertEqual(stmts1[0].lineno, 3) _, stmts2 = x_name2.lookup("x") self.assertEqual(len(stmts2), 1) self.assertEqual(stmts2[0].lineno, 7) @test_utils.require_version(minver="3.8") def test_assign_after_posonly_param(self): """When an assignment statement overwrites a function positional-only parameter, only the assignment is returned, even when the variable and assignment do not have the same parent. """ code = """ def f1(x, /): x = 100 print(x) def f2(x, /): x = 100 if True: print(x) """ astroid = builder.parse(code) x_name1, x_name2 = ( n for n in astroid.nodes_of_class(nodes.Name) if n.name == "x" ) _, stmts1 = x_name1.lookup("x") self.assertEqual(len(stmts1), 1) self.assertEqual(stmts1[0].lineno, 3) _, stmts2 = x_name2.lookup("x") self.assertEqual(len(stmts2), 1) self.assertEqual(stmts2[0].lineno, 7) def test_assign_after_args_param(self) -> None: """When an assignment statement overwrites a function parameter, only the assignment is returned. 
""" code = """ def f(*args, **kwargs): args = [100] kwargs = {} if True: print(args, kwargs) """ astroid = builder.parse(code) x_name = next(n for n in astroid.nodes_of_class(nodes.Name) if n.name == "args") _, stmts1 = x_name.lookup("args") self.assertEqual(len(stmts1), 1) self.assertEqual(stmts1[0].lineno, 3) x_name = next( n for n in astroid.nodes_of_class(nodes.Name) if n.name == "kwargs" ) _, stmts2 = x_name.lookup("kwargs") self.assertEqual(len(stmts2), 1) self.assertEqual(stmts2[0].lineno, 4) def test_except_var_in_block(self) -> None: """When the variable bound to an exception in an except clause, it is returned when that variable is used inside the except block. """ code = """ try: 1 / 0 except ZeroDivisionError as e: print(e) """ astroid = builder.parse(code) x_name = next(n for n in astroid.nodes_of_class(nodes.Name) if n.name == "e") _, stmts = x_name.lookup("e") self.assertEqual(len(stmts), 1) self.assertEqual(stmts[0].lineno, 4) def test_except_var_in_block_overwrites(self) -> None: """When the variable bound to an exception in an except clause, it is returned when that variable is used inside the except block, and replaces any previous assignments. """ code = """ e = 0 try: 1 / 0 except ZeroDivisionError as e: print(e) """ astroid = builder.parse(code) x_name = next(n for n in astroid.nodes_of_class(nodes.Name) if n.name == "e") _, stmts = x_name.lookup("e") self.assertEqual(len(stmts), 1) self.assertEqual(stmts[0].lineno, 5) def test_except_var_in_multiple_blocks(self) -> None: """When multiple variables with the same name are bound to an exception in an except clause, and the variable is used inside the except block, only the assignment from the corresponding except clause is returned. 
""" code = """ e = 0 try: 1 / 0 except ZeroDivisionError as e: print(e) except NameError as e: print(e) """ astroid = builder.parse(code) x_names = [n for n in astroid.nodes_of_class(nodes.Name) if n.name == "e"] _, stmts1 = x_names[0].lookup("e") self.assertEqual(len(stmts1), 1) self.assertEqual(stmts1[0].lineno, 5) _, stmts2 = x_names[1].lookup("e") self.assertEqual(len(stmts2), 1) self.assertEqual(stmts2[0].lineno, 7) def test_except_var_after_block_single(self) -> None: """When the variable bound to an exception in an except clause, it is NOT returned when that variable is used after the except block. """ code = """ try: 1 / 0 except NameError as e: pass print(e) """ astroid = builder.parse(code) x_name = next(n for n in astroid.nodes_of_class(nodes.Name) if n.name == "e") _, stmts = x_name.lookup("e") self.assertEqual(len(stmts), 0) def test_except_var_after_block_multiple(self) -> None: """When the variable bound to an exception in multiple except clauses, it is NOT returned when that variable is used after the except blocks. """ code = """ try: 1 / 0 except NameError as e: pass except ZeroDivisionError as e: pass print(e) """ astroid = builder.parse(code) x_name = next(n for n in astroid.nodes_of_class(nodes.Name) if n.name == "e") _, stmts = x_name.lookup("e") self.assertEqual(len(stmts), 0) def test_except_assign_in_block(self) -> None: """When a variable is assigned in an except block, it is returned when that variable is used in the except block. """ code = """ try: 1 / 0 except ZeroDivisionError as e: x = 10 print(x) """ astroid = builder.parse(code) x_name = next(n for n in astroid.nodes_of_class(nodes.Name) if n.name == "x") _, stmts = x_name.lookup("x") self.assertEqual(len(stmts), 1) self.assertEqual(stmts[0].lineno, 5) def test_except_assign_in_block_multiple(self) -> None: """When a variable is assigned in multiple except blocks, and the variable is used in one of the blocks, only the assignments in that block are returned. 
""" code = """ try: 1 / 0 except ZeroDivisionError: x = 10 except NameError: x = 100 print(x) """ astroid = builder.parse(code) x_name = next(n for n in astroid.nodes_of_class(nodes.Name) if n.name == "x") _, stmts = x_name.lookup("x") self.assertEqual(len(stmts), 1) self.assertEqual(stmts[0].lineno, 7) def test_except_assign_after_block(self) -> None: """When a variable is assigned in an except clause, it is returned when that variable is used after the except block. """ code = """ try: 1 / 0 except ZeroDivisionError: x = 10 except NameError: x = 100 print(x) """ astroid = builder.parse(code) x_name = next(n for n in astroid.nodes_of_class(nodes.Name) if n.name == "x") _, stmts = x_name.lookup("x") self.assertEqual(len(stmts), 2) self.assertCountEqual([stmt.lineno for stmt in stmts], [5, 7]) def test_except_assign_after_block_overwritten(self) -> None: """When a variable is assigned in an except clause, it is not returned when it is reassigned and used after the except block. """ code = """ try: 1 / 0 except ZeroDivisionError: x = 10 except NameError: x = 100 x = 1000 print(x) """ astroid = builder.parse(code) x_name = next(n for n in astroid.nodes_of_class(nodes.Name) if n.name == "x") _, stmts = x_name.lookup("x") self.assertEqual(len(stmts), 1) self.assertEqual(stmts[0].lineno, 8) astroid-3.2.2/tests/test_objects.py0000664000175000017500000005153214622475517017320 0ustar epsilonepsilon# Licensed under the LGPL: https://www.gnu.org/licenses/old-licenses/lgpl-2.1.en.html # For details: https://github.com/pylint-dev/astroid/blob/main/LICENSE # Copyright (c) https://github.com/pylint-dev/astroid/blob/main/CONTRIBUTORS.txt from __future__ import annotations import unittest from astroid import bases, builder, nodes, objects, util from astroid.exceptions import AttributeInferenceError, InferenceError, SuperError from astroid.objects import Super class ObjectsTest(unittest.TestCase): def test_frozenset(self) -> None: node = builder.extract_node( """ frozenset({1: 2, 2: 
        3}) #@
        """
        )
        inferred = next(node.infer())
        self.assertIsInstance(inferred, objects.FrozenSet)

        self.assertEqual(inferred.pytype(), "builtins.frozenset")

        itered = inferred.itered()
        self.assertEqual(len(itered), 2)
        self.assertIsInstance(itered[0], nodes.Const)
        self.assertEqual([const.value for const in itered], [1, 2])

        proxied = inferred._proxied
        self.assertEqual(inferred.qname(), "builtins.frozenset")
        self.assertIsInstance(proxied, nodes.ClassDef)

    def test_lookup_regression_slots(self) -> None:
        """Regression test for attr__new__ of ObjectModel.

        ObjectModel._instance is not always a bases.Instance, so we can't
        rely on the ._proxied attribute of an Instance.
        """
        node = builder.extract_node(
            """
        class ClassHavingUnknownAncestors(Unknown):
            __slots__ = ["yo"]

            def test(self):
                self.not_yo = 42
        """
        )
        assert node.getattr("__new__")


class SuperTests(unittest.TestCase):
    def test_inferring_super_outside_methods(self) -> None:
        ast_nodes = builder.extract_node(
            """
        class Module(object):
            pass

        class StaticMethod(object):
            @staticmethod
            def static():
                # valid, but we don't bother with it.
                return super(StaticMethod, StaticMethod) #@

        # super outside methods aren't inferred
        super(Module, Module) #@

        # Zero-argument super is not recognised outside methods either.
        super() #@
        """
        )
        assert isinstance(ast_nodes, list)
        in_static = next(ast_nodes[0].value.infer())
        self.assertIsInstance(in_static, bases.Instance)
        self.assertEqual(in_static.qname(), "builtins.super")

        module_level = next(ast_nodes[1].infer())
        self.assertIsInstance(module_level, bases.Instance)
        self.assertEqual(module_level.qname(), "builtins.super")

        no_arguments = next(ast_nodes[2].infer())
        self.assertIsInstance(no_arguments, bases.Instance)
        self.assertEqual(no_arguments.qname(), "builtins.super")

    def test_inferring_unbound_super_doesnt_work(self) -> None:
        node = builder.extract_node(
            """
        class Test(object):
            def __init__(self):
                super(Test) #@
        """
        )
        unbounded = next(node.infer())
        self.assertIsInstance(unbounded, bases.Instance)
        self.assertEqual(unbounded.qname(), "builtins.super")

    def test_use_default_inference_on_not_inferring_args(self) -> None:
        ast_nodes = builder.extract_node(
            """
        class Test(object):
            def __init__(self):
                super(Lala, self) #@
                super(Test, lala) #@
        """
        )
        assert isinstance(ast_nodes, list)
        first = next(ast_nodes[0].infer())
        self.assertIsInstance(first, bases.Instance)
        self.assertEqual(first.qname(), "builtins.super")

        second = next(ast_nodes[1].infer())
        self.assertIsInstance(second, bases.Instance)
        self.assertEqual(second.qname(), "builtins.super")

    def test_no_arguments_super(self) -> None:
        ast_nodes = builder.extract_node(
            """
        class First(object): pass
        class Second(First):
            def test(self):
                super() #@
            @classmethod
            def test_classmethod(cls):
                super() #@
        """
        )
        assert isinstance(ast_nodes, list)
        first = next(ast_nodes[0].infer())
        self.assertIsInstance(first, objects.Super)
        self.assertIsInstance(first.type, bases.Instance)
        self.assertEqual(first.type.name, "Second")
        self.assertIsInstance(first.mro_pointer, nodes.ClassDef)
        self.assertEqual(first.mro_pointer.name, "Second")

        second = next(ast_nodes[1].infer())
        self.assertIsInstance(second, objects.Super)
        self.assertIsInstance(second.type, nodes.ClassDef)
        self.assertEqual(second.type.name, "Second")
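        # Explanatory sketch (illustration): inside a classmethod the implicit
        # first argument is the class itself, so zero-argument super() there
        # behaves like super(Second, cls):
        #
        #     @classmethod
        #     def test_classmethod(cls):
        #         super()  # .type is the ClassDef, not an Instance
        #
        # which mirrors CPython's runtime behaviour for super() in
        # classmethods.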
        self.assertIsInstance(second.mro_pointer, nodes.ClassDef)
        self.assertEqual(second.mro_pointer.name, "Second")

    def test_super_simple_cases(self) -> None:
        ast_nodes = builder.extract_node(
            """
        class First(object): pass
        class Second(First): pass
        class Third(First):
            def test(self):
                super(Third, self) #@
                super(Second, self) #@

                # mro position and the type
                super(Third, Third) #@
                super(Third, Second) #@
                super(Fourth, Fourth) #@

        class Fourth(Third):
            pass
        """
        )
        # .type is the object which provides the mro.
        # .mro_pointer is the position in the mro from where
        # the lookup should be done.

        # super(Third, self)
        assert isinstance(ast_nodes, list)
        first = next(ast_nodes[0].infer())
        self.assertIsInstance(first, objects.Super)
        self.assertIsInstance(first.type, bases.Instance)
        self.assertEqual(first.type.name, "Third")
        self.assertIsInstance(first.mro_pointer, nodes.ClassDef)
        self.assertEqual(first.mro_pointer.name, "Third")

        # super(Second, self)
        second = next(ast_nodes[1].infer())
        self.assertIsInstance(second, objects.Super)
        self.assertIsInstance(second.type, bases.Instance)
        self.assertEqual(second.type.name, "Third")
        self.assertIsInstance(second.mro_pointer, nodes.ClassDef)
        self.assertEqual(second.mro_pointer.name, "Second")

        # super(Third, Third)
        third = next(ast_nodes[2].infer())
        self.assertIsInstance(third, objects.Super)
        self.assertIsInstance(third.type, nodes.ClassDef)
        self.assertEqual(third.type.name, "Third")
        self.assertIsInstance(third.mro_pointer, nodes.ClassDef)
        self.assertEqual(third.mro_pointer.name, "Third")

        # super(Third, Second)
        fourth = next(ast_nodes[3].infer())
        self.assertIsInstance(fourth, objects.Super)
        self.assertIsInstance(fourth.type, nodes.ClassDef)
        self.assertEqual(fourth.type.name, "Second")
        self.assertIsInstance(fourth.mro_pointer, nodes.ClassDef)
        self.assertEqual(fourth.mro_pointer.name, "Third")

        # super(Fourth, Fourth)
        fifth = next(ast_nodes[4].infer())
        self.assertIsInstance(fifth, objects.Super)
        self.assertIsInstance(fifth.type, nodes.ClassDef)
self.assertEqual(fifth.type.name, "Fourth") self.assertIsInstance(fifth.mro_pointer, nodes.ClassDef) self.assertEqual(fifth.mro_pointer.name, "Fourth") def test_super_infer(self) -> None: node = builder.extract_node( """ class Super(object): def __init__(self): super(Super, self) #@ """ ) inferred = next(node.infer()) self.assertIsInstance(inferred, objects.Super) reinferred = next(inferred.infer()) self.assertIsInstance(reinferred, objects.Super) self.assertIs(inferred, reinferred) def test_inferring_invalid_supers(self) -> None: ast_nodes = builder.extract_node( """ class Super(object): def __init__(self): # MRO pointer is not a type super(1, self) #@ # MRO type is not a subtype super(Super, 1) #@ # self is not a subtype of Bupper super(Bupper, self) #@ class Bupper(Super): pass """ ) assert isinstance(ast_nodes, list) first = next(ast_nodes[0].infer()) self.assertIsInstance(first, objects.Super) with self.assertRaises(SuperError) as cm: first.super_mro() self.assertIsInstance(cm.exception.super_.mro_pointer, nodes.Const) self.assertEqual(cm.exception.super_.mro_pointer.value, 1) for node, invalid_type in zip(ast_nodes[1:], (nodes.Const, bases.Instance)): inferred = next(node.infer()) self.assertIsInstance(inferred, objects.Super, node) with self.assertRaises(SuperError) as cm: inferred.super_mro() self.assertIsInstance(cm.exception.super_.type, invalid_type) def test_proxied(self) -> None: node = builder.extract_node( """ class Super(object): def __init__(self): super(Super, self) #@ """ ) inferred = next(node.infer()) proxied = inferred._proxied self.assertEqual(proxied.qname(), "builtins.super") self.assertIsInstance(proxied, nodes.ClassDef) def test_super_bound_model(self) -> None: ast_nodes = builder.extract_node( """ class First(object): def method(self): pass @classmethod def class_method(cls): pass class Super_Type_Type(First): def method(self): super(Super_Type_Type, Super_Type_Type).method #@ super(Super_Type_Type, Super_Type_Type).class_method #@ 
@classmethod def class_method(cls): super(Super_Type_Type, Super_Type_Type).method #@ super(Super_Type_Type, Super_Type_Type).class_method #@ class Super_Type_Object(First): def method(self): super(Super_Type_Object, self).method #@ super(Super_Type_Object, self).class_method #@ """ ) # Super(type, type) is the same for both functions and classmethods. assert isinstance(ast_nodes, list) first = next(ast_nodes[0].infer()) self.assertIsInstance(first, nodes.FunctionDef) self.assertEqual(first.name, "method") second = next(ast_nodes[1].infer()) self.assertIsInstance(second, bases.BoundMethod) self.assertEqual(second.bound.name, "First") self.assertEqual(second.type, "classmethod") third = next(ast_nodes[2].infer()) self.assertIsInstance(third, nodes.FunctionDef) self.assertEqual(third.name, "method") fourth = next(ast_nodes[3].infer()) self.assertIsInstance(fourth, bases.BoundMethod) self.assertEqual(fourth.bound.name, "First") self.assertEqual(fourth.type, "classmethod") # Super(type, obj) can lead to different attribute bindings # depending on the type of the place where super was called. fifth = next(ast_nodes[4].infer()) self.assertIsInstance(fifth, bases.BoundMethod) self.assertEqual(fifth.bound.name, "First") self.assertEqual(fifth.type, "method") sixth = next(ast_nodes[5].infer()) self.assertIsInstance(sixth, bases.BoundMethod) self.assertEqual(sixth.bound.name, "First") self.assertEqual(sixth.type, "classmethod") def test_super_getattr_single_inheritance(self) -> None: ast_nodes = builder.extract_node( """ class First(object): def test(self): pass class Second(First): def test2(self): pass class Third(Second): test3 = 42 def __init__(self): super(Third, self).test2 #@ super(Third, self).test #@ # test3 is local, no MRO lookup is done. super(Third, self).test3 #@ super(Third, self) #@ # Unbounds. 
super(Third, Third).test2 #@ super(Third, Third).test #@ """ ) assert isinstance(ast_nodes, list) first = next(ast_nodes[0].infer()) self.assertIsInstance(first, bases.BoundMethod) self.assertEqual(first.bound.name, "Second") second = next(ast_nodes[1].infer()) self.assertIsInstance(second, bases.BoundMethod) self.assertEqual(second.bound.name, "First") with self.assertRaises(InferenceError): next(ast_nodes[2].infer()) fourth = next(ast_nodes[3].infer()) with self.assertRaises(AttributeInferenceError): fourth.getattr("test3") with self.assertRaises(AttributeInferenceError): next(fourth.igetattr("test3")) first_unbound = next(ast_nodes[4].infer()) self.assertIsInstance(first_unbound, nodes.FunctionDef) self.assertEqual(first_unbound.name, "test2") self.assertEqual(first_unbound.parent.name, "Second") second_unbound = next(ast_nodes[5].infer()) self.assertIsInstance(second_unbound, nodes.FunctionDef) self.assertEqual(second_unbound.name, "test") self.assertEqual(second_unbound.parent.name, "First") def test_super_invalid_mro(self) -> None: node = builder.extract_node( """ class A(object): test = 42 class Super(A, A): def __init__(self): super(Super, self) #@ """ ) inferred = next(node.infer()) with self.assertRaises(AttributeInferenceError): next(inferred.getattr("test")) def test_super_complex_mro(self) -> None: ast_nodes = builder.extract_node( """ class A(object): def spam(self): return "A" def foo(self): return "A" @staticmethod def static(self): pass class B(A): def boo(self): return "B" def spam(self): return "B" class C(A): def boo(self): return "C" class E(C, B): def __init__(self): super(E, self).boo #@ super(C, self).boo #@ super(E, self).spam #@ super(E, self).foo #@ super(E, self).static #@ """ ) assert isinstance(ast_nodes, list) first = next(ast_nodes[0].infer()) self.assertIsInstance(first, bases.BoundMethod) self.assertEqual(first.bound.name, "C") second = next(ast_nodes[1].infer()) self.assertIsInstance(second, bases.BoundMethod) 
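        # MRO sketch (illustration): E.__mro__ is (E, C, B, A, object), so
        # super(E, self).boo starts the lookup *after* E and finds C.boo,
        # while super(C, self).boo starts after C and finds B.boo; spam and
        # foo resolve further along the same chain.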
self.assertEqual(second.bound.name, "B") third = next(ast_nodes[2].infer()) self.assertIsInstance(third, bases.BoundMethod) self.assertEqual(third.bound.name, "B") fourth = next(ast_nodes[3].infer()) self.assertEqual(fourth.bound.name, "A") static = next(ast_nodes[4].infer()) self.assertIsInstance(static, nodes.FunctionDef) self.assertEqual(static.parent.scope().name, "A") def test_super_data_model(self) -> None: ast_nodes = builder.extract_node( """ class X(object): pass class A(X): def __init__(self): super(A, self) #@ super(A, A) #@ super(X, A) #@ """ ) assert isinstance(ast_nodes, list) first = next(ast_nodes[0].infer()) thisclass = first.getattr("__thisclass__")[0] self.assertIsInstance(thisclass, nodes.ClassDef) self.assertEqual(thisclass.name, "A") selfclass = first.getattr("__self_class__")[0] self.assertIsInstance(selfclass, nodes.ClassDef) self.assertEqual(selfclass.name, "A") self_ = first.getattr("__self__")[0] self.assertIsInstance(self_, bases.Instance) self.assertEqual(self_.name, "A") cls = first.getattr("__class__")[0] self.assertEqual(cls, first._proxied) second = next(ast_nodes[1].infer()) thisclass = second.getattr("__thisclass__")[0] self.assertEqual(thisclass.name, "A") self_ = second.getattr("__self__")[0] self.assertIsInstance(self_, nodes.ClassDef) self.assertEqual(self_.name, "A") third = next(ast_nodes[2].infer()) thisclass = third.getattr("__thisclass__")[0] self.assertEqual(thisclass.name, "X") selfclass = third.getattr("__self_class__")[0] self.assertEqual(selfclass.name, "A") def assertEqualMro(self, klass: Super, expected_mro: list[str]) -> None: self.assertEqual([member.name for member in klass.super_mro()], expected_mro) def test_super_mro(self) -> None: ast_nodes = builder.extract_node( """ class A(object): pass class B(A): pass class C(A): pass class E(C, B): def __init__(self): super(E, self) #@ super(C, self) #@ super(B, self) #@ super(B, 1) #@ super(1, B) #@ """ ) assert isinstance(ast_nodes, list) first = 
next(ast_nodes[0].infer()) self.assertEqualMro(first, ["C", "B", "A", "object"]) second = next(ast_nodes[1].infer()) self.assertEqualMro(second, ["B", "A", "object"]) third = next(ast_nodes[2].infer()) self.assertEqualMro(third, ["A", "object"]) fourth = next(ast_nodes[3].infer()) with self.assertRaises(SuperError): fourth.super_mro() fifth = next(ast_nodes[4].infer()) with self.assertRaises(SuperError): fifth.super_mro() def test_super_yes_objects(self) -> None: ast_nodes = builder.extract_node( """ from collections import Missing class A(object): def __init__(self): super(Missing, self) #@ super(A, Missing) #@ """ ) assert isinstance(ast_nodes, list) first = next(ast_nodes[0].infer()) self.assertIsInstance(first, bases.Instance) second = next(ast_nodes[1].infer()) self.assertIsInstance(second, bases.Instance) def test_super_invalid_types(self) -> None: node = builder.extract_node( """ import collections class A(object): def __init__(self): super(A, collections) #@ """ ) inferred = next(node.infer()) with self.assertRaises(SuperError): inferred.super_mro() with self.assertRaises(SuperError): inferred.super_mro() def test_super_properties(self) -> None: node = builder.extract_node( """ class Foo(object): @property def dict(self): return 42 class Bar(Foo): @property def dict(self): return super(Bar, self).dict Bar().dict """ ) inferred = next(node.infer()) self.assertIsInstance(inferred, nodes.Const) self.assertEqual(inferred.value, 42) def test_super_qname(self) -> None: """Make sure a Super object generates a qname equivalent to super.__qname__. 
""" # See issue 533 code = """ class C: def foo(self): return super() C().foo() #@ """ super_obj = next(builder.extract_node(code).infer()) self.assertEqual(super_obj.qname(), "super") def test_super_new_call(self) -> None: """Test that __new__ returns an object or node and not a (Un)BoundMethod.""" new_call_result: nodes.Name = builder.extract_node( """ import enum class ChoicesMeta(enum.EnumMeta): def __new__(metacls, classname, bases, classdict, **kwds): cls = super().__new__(metacls, "str", (enum.Enum,), enum._EnumDict(), **kwargs) cls #@ """ ) inferred = list(new_call_result.infer()) assert all( isinstance(i, (nodes.NodeNG, type(util.Uninferable))) for i in inferred ) def test_super_init_call(self) -> None: """Test that __init__ is still callable.""" init_node: nodes.Attribute = builder.extract_node( """ class SuperUsingClass: @staticmethod def test(): super(object, 1).__new__ #@ super(object, 1).__init__ #@ class A: pass A().__new__ #@ A().__init__ #@ """ ) assert isinstance(next(init_node[0].infer()), bases.BoundMethod) assert isinstance(next(init_node[1].infer()), bases.BoundMethod) assert isinstance(next(init_node[2].infer()), bases.BoundMethod) assert isinstance(next(init_node[3].infer()), bases.BoundMethod) astroid-3.2.2/tests/test_scoped_nodes.py0000664000175000017500000026662114622475517020343 0ustar epsilonepsilon# Licensed under the LGPL: https://www.gnu.org/licenses/old-licenses/lgpl-2.1.en.html # For details: https://github.com/pylint-dev/astroid/blob/main/LICENSE # Copyright (c) https://github.com/pylint-dev/astroid/blob/main/CONTRIBUTORS.txt """Tests for specific behaviour of astroid scoped nodes (i.e. module, class and function). 
""" from __future__ import annotations import difflib import os import sys import textwrap import unittest from functools import partial from typing import Any from unittest.mock import patch import pytest from astroid import ( MANAGER, builder, extract_node, nodes, objects, parse, test_utils, util, ) from astroid.bases import BoundMethod, Generator, Instance, UnboundMethod from astroid.const import IS_PYPY, PY38, WIN32 from astroid.exceptions import ( AstroidBuildingError, AttributeInferenceError, DuplicateBasesError, InconsistentMroError, InferenceError, MroError, NameInferenceError, NoDefault, ResolveError, TooManyLevelsError, ) from astroid.nodes.scoped_nodes.scoped_nodes import _is_metaclass from . import resources try: import six # type: ignore[import] # pylint: disable=unused-import HAS_SIX = True except ImportError: HAS_SIX = False def _test_dict_interface( self: Any, node: nodes.ClassDef | nodes.FunctionDef | nodes.Module, test_attr: str, ) -> None: self.assertIs(node[test_attr], node[test_attr]) self.assertIn(test_attr, node) node.keys() node.values() node.items() iter(node) class ModuleLoader(resources.SysPathSetup): def setUp(self) -> None: super().setUp() self.module = resources.build_file("data/module.py", "data.module") self.module2 = resources.build_file("data/module2.py", "data.module2") self.nonregr = resources.build_file("data/nonregr.py", "data.nonregr") self.pack = resources.build_file("data/__init__.py", "data") class ModuleNodeTest(ModuleLoader, unittest.TestCase): def test_special_attributes(self) -> None: self.assertEqual(len(self.module.getattr("__name__")), 2) self.assertIsInstance(self.module.getattr("__name__")[0], nodes.Const) self.assertEqual(self.module.getattr("__name__")[0].value, "data.module") self.assertIsInstance(self.module.getattr("__name__")[1], nodes.Const) self.assertEqual(self.module.getattr("__name__")[1].value, "__main__") self.assertEqual(len(self.module.getattr("__doc__")), 1) 
self.assertIsInstance(self.module.getattr("__doc__")[0], nodes.Const) self.assertEqual( self.module.getattr("__doc__")[0].value, "test module for astroid\n" ) self.assertEqual(len(self.module.getattr("__file__")), 1) self.assertIsInstance(self.module.getattr("__file__")[0], nodes.Const) self.assertEqual( self.module.getattr("__file__")[0].value, os.path.abspath(resources.find("data/module.py")), ) self.assertEqual(len(self.module.getattr("__dict__")), 1) self.assertIsInstance(self.module.getattr("__dict__")[0], nodes.Dict) self.assertRaises(AttributeInferenceError, self.module.getattr, "__path__") self.assertEqual(len(self.pack.getattr("__path__")), 1) self.assertIsInstance(self.pack.getattr("__path__")[0], nodes.List) def test_dict_interface(self) -> None: _test_dict_interface(self, self.module, "YO") def test_getattr(self) -> None: yo = self.module.getattr("YO")[0] self.assertIsInstance(yo, nodes.ClassDef) self.assertEqual(yo.name, "YO") red = next(self.module.igetattr("redirect")) self.assertIsInstance(red, nodes.FunctionDef) self.assertEqual(red.name, "four_args") namenode = next(self.module.igetattr("NameNode")) self.assertIsInstance(namenode, nodes.ClassDef) self.assertEqual(namenode.name, "Name") # resolve packageredirection mod = resources.build_file( "data/appl/myConnection.py", "data.appl.myConnection" ) ssl = next(mod.igetattr("SSL1")) cnx = next(ssl.igetattr("Connection")) self.assertEqual(cnx.__class__, nodes.ClassDef) self.assertEqual(cnx.name, "Connection") self.assertEqual(cnx.root().name, "data.SSL1.Connection1") self.assertEqual(len(self.nonregr.getattr("enumerate")), 2) self.assertRaises(InferenceError, self.nonregr.igetattr, "YOAA") def test_wildcard_import_names(self) -> None: m = resources.build_file("data/all.py", "all") self.assertEqual(m.wildcard_import_names(), ["Aaa", "_bla", "name"]) m = resources.build_file("data/notall.py", "notall") res = sorted(m.wildcard_import_names()) self.assertEqual(res, ["Aaa", "func", "name", "other"]) def 
test_public_names(self) -> None: m = builder.parse( """ name = 'a' _bla = 2 other = 'o' class Aaa: pass def func(): print('yo') __all__ = 'Aaa', '_bla', 'name' """ ) values = sorted(["Aaa", "name", "other", "func"]) self.assertEqual(sorted(m.public_names()), values) m = builder.parse( """ name = 'a' _bla = 2 other = 'o' class Aaa: pass def func(): return 'yo' """ ) res = sorted(m.public_names()) self.assertEqual(res, values) m = builder.parse( """ from missing import tzop trop = "test" __all__ = (trop, "test1", tzop, 42) """ ) res = sorted(m.public_names()) self.assertEqual(res, ["trop", "tzop"]) m = builder.parse( """ test = tzop = 42 __all__ = ('test', ) + ('tzop', ) """ ) res = sorted(m.public_names()) self.assertEqual(res, ["test", "tzop"]) def test_module_getattr(self) -> None: data = """ appli = application appli += 2 del appli """ astroid = builder.parse(data, __name__) # test del statement not returned by getattr self.assertEqual(len(astroid.getattr("appli")), 2, astroid.getattr("appli")) def test_relative_to_absolute_name(self) -> None: # package mod = nodes.Module("very.multi.package", package=True) modname = mod.relative_to_absolute_name("utils", 1) self.assertEqual(modname, "very.multi.package.utils") modname = mod.relative_to_absolute_name("utils", 2) self.assertEqual(modname, "very.multi.utils") modname = mod.relative_to_absolute_name("utils", 0) self.assertEqual(modname, "very.multi.package.utils") modname = mod.relative_to_absolute_name("", 1) self.assertEqual(modname, "very.multi.package") # non package mod = nodes.Module("very.multi.module", package=False) modname = mod.relative_to_absolute_name("utils", 0) self.assertEqual(modname, "very.multi.utils") modname = mod.relative_to_absolute_name("utils", 1) self.assertEqual(modname, "very.multi.utils") modname = mod.relative_to_absolute_name("utils", 2) self.assertEqual(modname, "very.utils") modname = mod.relative_to_absolute_name("", 1) self.assertEqual(modname, "very.multi") def 
test_relative_to_absolute_name_beyond_top_level(self) -> None:
        mod = nodes.Module("a.b.c", package=True)
        for level in (5, 4):
            with self.assertRaises(TooManyLevelsError) as cm:
                mod.relative_to_absolute_name("test", level)

            expected = (
                "Relative import with too many levels "
                f"({level-1}) for module {mod.name!r}"
            )
            self.assertEqual(expected, str(cm.exception))

    def test_import_1(self) -> None:
        data = """from . import subpackage"""
        sys.path.insert(0, resources.find("data"))
        astroid = builder.parse(data, "package", "data/package/__init__.py")
        try:
            m = astroid.import_module("", level=1)
            self.assertEqual(m.name, "package")
            inferred = list(astroid.igetattr("subpackage"))
            self.assertEqual(len(inferred), 1)
            self.assertEqual(inferred[0].name, "package.subpackage")
        finally:
            del sys.path[0]

    def test_import_2(self) -> None:
        data = """from . import subpackage as pouet"""
        astroid = builder.parse(data, "package", "data/package/__init__.py")
        sys.path.insert(0, resources.find("data"))
        try:
            m = astroid.import_module("", level=1)
            self.assertEqual(m.name, "package")
            inferred = list(astroid.igetattr("pouet"))
            self.assertEqual(len(inferred), 1)
            self.assertEqual(inferred[0].name, "package.subpackage")
        finally:
            del sys.path[0]

    @patch(
        "astroid.nodes.scoped_nodes.scoped_nodes.AstroidManager.ast_from_module_name"
    )
    def test_import_unavailable_module(self, mock) -> None:
        unavailable_modname = "posixpath" if WIN32 else "ntpath"
        module = builder.parse(f"import {unavailable_modname}")
        mock.side_effect = AstroidBuildingError

        with pytest.raises(AstroidBuildingError):
            module.import_module(unavailable_modname)
        mock.assert_called_once()

    def test_file_stream_in_memory(self) -> None:
        data = """irrelevant_variable is irrelevant"""
        astroid = builder.parse(data, "in_memory")
        with astroid.stream() as stream:
            self.assertEqual(stream.read().decode(), data)

    def test_file_stream_physical(self) -> None:
        path = resources.find("data/all.py")
        astroid = builder.AstroidBuilder().file_build(path, "all")
        with open(path, "rb") as file_io:
            with astroid.stream() as stream:
                self.assertEqual(stream.read(), file_io.read())

    def test_file_stream_api(self) -> None:
        path = resources.find("data/all.py")
        file_build = builder.AstroidBuilder().file_build(path, "all")
        with self.assertRaises(AttributeError):
            # pylint: disable=pointless-statement, no-member
            file_build.file_stream  # noqa: B018

    def test_stream_api(self) -> None:
        path = resources.find("data/all.py")
        astroid = builder.AstroidBuilder().file_build(path, "all")
        stream = astroid.stream()
        self.assertTrue(hasattr(stream, "close"))
        with stream:
            with open(path, "rb") as file_io:
                self.assertEqual(stream.read(), file_io.read())

    @staticmethod
    def test_singleline_docstring() -> None:
        data = textwrap.dedent(
            """\
        '''Hello World'''
        foo = 1
        """
        )
        module = builder.parse(data, __name__)
        assert isinstance(module.doc_node, nodes.Const)
        assert module.doc_node.lineno == 1
        assert module.doc_node.col_offset == 0
        assert module.doc_node.end_lineno == 1
        assert module.doc_node.end_col_offset == 17

    @staticmethod
    def test_multiline_docstring() -> None:
        data = textwrap.dedent(
            """\
        '''Hello World

        Also on this line.
        '''
        foo = 1
        """
        )
        module = builder.parse(data, __name__)
        assert isinstance(module.doc_node, nodes.Const)
        assert module.doc_node.lineno == 1
        assert module.doc_node.col_offset == 0
        assert module.doc_node.end_lineno == 4
        assert module.doc_node.end_col_offset == 3

    @staticmethod
    def test_comment_before_docstring() -> None:
        data = textwrap.dedent(
            """\
        # Some comment
        '''This is

        a multiline docstring.
        '''
        """
        )
        module = builder.parse(data, __name__)
        assert isinstance(module.doc_node, nodes.Const)
        assert module.doc_node.lineno == 2
        assert module.doc_node.col_offset == 0
        assert module.doc_node.end_lineno == 5
        assert module.doc_node.end_col_offset == 3

    @staticmethod
    def test_without_docstring() -> None:
        data = textwrap.dedent(
            """\
        foo = 1
        """
        )
        module = builder.parse(data, __name__)
        assert module.doc_node is None


class FunctionNodeTest(ModuleLoader, unittest.TestCase):
    def test_special_attributes(self) -> None:
        func = self.module2["make_class"]
        self.assertEqual(len(func.getattr("__name__")), 1)
        self.assertIsInstance(func.getattr("__name__")[0], nodes.Const)
        self.assertEqual(func.getattr("__name__")[0].value, "make_class")
        self.assertEqual(len(func.getattr("__doc__")), 1)
        self.assertIsInstance(func.getattr("__doc__")[0], nodes.Const)
        self.assertEqual(
            func.getattr("__doc__")[0].value,
            "check base is correctly resolved to Concrete0",
        )
        self.assertEqual(len(self.module.getattr("__dict__")), 1)
        self.assertIsInstance(self.module.getattr("__dict__")[0], nodes.Dict)

    def test_dict_interface(self) -> None:
        _test_dict_interface(self, self.module["global_access"], "local")

    def test_default_value(self) -> None:
        func = self.module2["make_class"]
        self.assertIsInstance(func.args.default_value("base"), nodes.Attribute)
        self.assertRaises(NoDefault, func.args.default_value, "args")
        self.assertRaises(NoDefault, func.args.default_value, "kwargs")
        self.assertRaises(NoDefault, func.args.default_value, "any")
        # self.assertIsInstance(func.mularg_class('args'), nodes.Tuple)
        # self.assertIsInstance(func.mularg_class('kwargs'), nodes.Dict)
        # self.assertIsNone(func.mularg_class('base'))

    def test_navigation(self) -> None:
        function = self.module["global_access"]
        self.assertEqual(function.statement(), function)
        self.assertEqual(function.statement(), function)
        l_sibling = function.previous_sibling()
        # check taking parent if child is not a stmt
        self.assertIsInstance(l_sibling, nodes.Assign)
        child = function.args.args[0]
        self.assertIs(l_sibling, child.previous_sibling())
        r_sibling = function.next_sibling()
        self.assertIsInstance(r_sibling, nodes.ClassDef)
        self.assertEqual(r_sibling.name, "YO")
        self.assertIs(r_sibling, child.next_sibling())
        last = r_sibling.next_sibling().next_sibling().next_sibling()
        self.assertIsInstance(last, nodes.Assign)
        self.assertIsNone(last.next_sibling())
        first = l_sibling.root().body[0]
        self.assertIsNone(first.previous_sibling())

    def test_four_args(self) -> None:
        func = self.module["four_args"]
        local = sorted(func.keys())
        self.assertEqual(local, ["a", "b", "c", "d"])
        self.assertEqual(func.type, "function")

    def test_format_args(self) -> None:
        func = self.module2["make_class"]
        self.assertEqual(
            func.args.format_args(), "any, base=data.module.YO, *args, **kwargs"
        )
        func = self.module["four_args"]
        self.assertEqual(func.args.format_args(), "a, b, c, d")

    def test_format_args_keyword_only_args(self) -> None:
        node = (
            builder.parse(
                """
        def test(a: int, *, b: dict):
            pass
        """
            )
            .body[-1]
            .args
        )
        formatted = node.format_args()
        self.assertEqual(formatted, "a: int, *, b: dict")

    def test_is_generator(self) -> None:
        self.assertTrue(self.module2["generator"].is_generator())
        self.assertFalse(self.module2["not_a_generator"].is_generator())
        self.assertFalse(self.module2["make_class"].is_generator())

    def test_is_abstract(self) -> None:
        method = self.module2["AbstractClass"]["to_override"]
        self.assertTrue(method.is_abstract(pass_is_abstract=False))
        self.assertEqual(method.qname(), "data.module2.AbstractClass.to_override")
        self.assertEqual(method.pytype(), "builtins.instancemethod")
        method = self.module2["AbstractClass"]["return_something"]
        self.assertFalse(method.is_abstract(pass_is_abstract=False))
        # non regression : test raise "string" doesn't cause an exception in is_abstract
        func = self.module2["raise_string"]
        self.assertFalse(func.is_abstract(pass_is_abstract=False))

    def test_is_abstract_decorated(self) -> None:
        methods = builder.extract_node(
            """
        import abc

        class Klass(object):
            @abc.abstractproperty
            def prop(self):  #@
                pass

            @abc.abstractmethod
            def method1(self):  #@
                pass

            some_other_decorator = lambda x: x
            @some_other_decorator
            def method2(self):  #@
                pass
        """
        )
        assert len(methods) == 3
        prop, method1, method2 = methods
        assert isinstance(prop, nodes.FunctionDef)
        assert prop.is_abstract(pass_is_abstract=False)

        assert isinstance(method1, nodes.FunctionDef)
        assert method1.is_abstract(pass_is_abstract=False)

        assert isinstance(method2, nodes.FunctionDef)
        assert not method2.is_abstract(pass_is_abstract=False)

    # def test_raises(self):
    #     method = self.module2["AbstractClass"]["to_override"]
    #     self.assertEqual(
    #         [str(term) for term in method.raises()],
    #         ["Call(Name('NotImplementedError'), [], None, None)"],
    #     )

    # def test_returns(self):
    #     method = self.module2["AbstractClass"]["return_something"]
    #     # use string comp since Node doesn't handle __cmp__
    #     self.assertEqual(
    #         [str(term) for term in method.returns()], ["Const('toto')", "Const(None)"]
    #     )

    def test_lambda_pytype(self) -> None:
        data = """
            def f():
                g = lambda: None
        """
        astroid = builder.parse(data)
        g = next(iter(astroid["f"].ilookup("g")))
        self.assertEqual(g.pytype(), "builtins.function")

    def test_lambda_qname(self) -> None:
        astroid = builder.parse("lmbd = lambda: None", __name__)
        self.assertEqual(f"{__name__}.<lambda>", astroid["lmbd"].parent.value.qname())

    def test_lambda_getattr(self) -> None:
        astroid = builder.parse("lmbd = lambda: None")
        self.assertIsInstance(
            astroid["lmbd"].parent.value.getattr("__code__")[0], nodes.Unknown
        )

    def test_is_method(self) -> None:
        data = """
            class A:
                def meth1(self):
                    return 1
                @classmethod
                def meth2(cls):
                    return 2
                @staticmethod
                def meth3():
                    return 3

            def function():
                return 0

            @staticmethod
            def sfunction():
                return -1
        """
        astroid = builder.parse(data)
        self.assertTrue(astroid["A"]["meth1"].is_method())
        self.assertTrue(astroid["A"]["meth2"].is_method())
        self.assertTrue(astroid["A"]["meth3"].is_method())
        self.assertFalse(astroid["function"].is_method())
        self.assertFalse(astroid["sfunction"].is_method())

    def test_argnames(self) -> None:
        code = "def f(a, b, c, *args, **kwargs): pass"
        astroid = builder.parse(code, __name__)
        self.assertEqual(astroid["f"].argnames(), ["a", "b", "c", "args", "kwargs"])

        code_with_kwonly_args = "def f(a, b, *args, c=None, d=None, **kwargs): pass"
        astroid = builder.parse(code_with_kwonly_args, __name__)
        self.assertEqual(
            astroid["f"].argnames(), ["a", "b", "args", "c", "d", "kwargs"]
        )

    def test_argnames_lambda(self) -> None:
        lambda_node = extract_node("lambda a, b, c, *args, **kwargs: ...")
        self.assertEqual(lambda_node.argnames(), ["a", "b", "c", "args", "kwargs"])

    def test_positional_only_argnames(self) -> None:
        code = "def f(a, b, /, c=None, *args, d, **kwargs): pass"
        astroid = builder.parse(code, __name__)
        self.assertEqual(
            astroid["f"].argnames(), ["a", "b", "c", "args", "d", "kwargs"]
        )

    def test_return_nothing(self) -> None:
        """Test inferred value on a function with empty return."""
        data = """
            def func():
                return

            a = func()
        """
        astroid = builder.parse(data)
        call = astroid.body[1].value
        func_vals = call.inferred()
        self.assertEqual(len(func_vals), 1)
        self.assertIsInstance(func_vals[0], nodes.Const)
        self.assertIsNone(func_vals[0].value)

    def test_no_returns_is_implicitly_none(self) -> None:
        code = """
            def f():
                print('non-empty, non-pass, no return statements')
            value = f()
            value
        """
        node = builder.extract_node(code)
        inferred = next(node.infer())
        assert isinstance(inferred, nodes.Const)
        assert inferred.value is None

    def test_only_raises_is_not_implicitly_none(self) -> None:
        code = """
            def f():
                raise SystemExit()
            f()
        """
        node = builder.extract_node(code)
        assert isinstance(node, nodes.Call)
        inferred = next(node.infer())
        assert inferred is util.Uninferable

    def test_abstract_methods_are_not_implicitly_none(self) -> None:
        code = """
            from abc import ABCMeta, abstractmethod

            class Abstract(metaclass=ABCMeta):
                @abstractmethod
                def foo(self):
                    pass
                def bar(self):
                    print('non-empty, non-pass, no return statements')
            Abstract().foo()  #@
            Abstract().bar()  #@

            class Concrete(Abstract):
                def foo(self):
                    return 123
            Concrete().foo()  #@
            Concrete().bar()  #@
        """
        afoo, abar, cfoo, cbar = builder.extract_node(code)

        assert next(afoo.infer()) is util.Uninferable
        for node, value in ((abar, None), (cfoo, 123), (cbar, None)):
            inferred = next(node.infer())
            assert isinstance(inferred, nodes.Const)
            assert inferred.value == value

    def test_func_instance_attr(self) -> None:
        """Test instance attributes for functions."""
        data = """
            def test():
                print(test.bar)

            test.bar = 1
            test()
        """
        astroid = builder.parse(data, "mod")
        func = astroid.body[2].value.func.inferred()[0]
        self.assertIsInstance(func, nodes.FunctionDef)
        self.assertEqual(func.name, "test")
        one = func.getattr("bar")[0].inferred()[0]
        self.assertIsInstance(one, nodes.Const)
        self.assertEqual(one.value, 1)

    def test_func_is_bound(self) -> None:
        data = """
        class MyClass:
            def bound():  #@
                pass
        """
        func = builder.extract_node(data)
        self.assertIs(func.is_bound(), True)
        self.assertEqual(func.implicit_parameters(), 1)

        data2 = """
        def not_bound():  #@
            pass
        """
        func2 = builder.extract_node(data2)
        self.assertIs(func2.is_bound(), False)
        self.assertEqual(func2.implicit_parameters(), 0)

    def test_type_builtin_descriptor_subclasses(self) -> None:
        astroid = builder.parse(
            """
            class classonlymethod(classmethod):
                pass
            class staticonlymethod(staticmethod):
                pass

            class Node:
                @classonlymethod
                def clsmethod_subclass(cls):
                    pass
                @classmethod
                def clsmethod(cls):
                    pass
                @staticonlymethod
                def staticmethod_subclass(cls):
                    pass
                @staticmethod
                def stcmethod(cls):
                    pass
        """
        )
        node = astroid.locals["Node"][0]
        self.assertEqual(node.locals["clsmethod_subclass"][0].type, "classmethod")
        self.assertEqual(node.locals["clsmethod"][0].type, "classmethod")
        self.assertEqual(node.locals["staticmethod_subclass"][0].type, "staticmethod")
        self.assertEqual(node.locals["stcmethod"][0].type, "staticmethod")

    def test_decorator_builtin_descriptors(self) -> None:
        astroid = builder.parse(
            """
            def static_decorator(platform=None, order=50):
                def wrapper(f):
                    f.cgm_module = True
                    f.cgm_module_order = order
                    f.cgm_module_platform = platform
                    return staticmethod(f)
                return wrapper

            def long_classmethod_decorator(platform=None, order=50):
                def wrapper(f):
                    def wrapper2(f):
                        def wrapper3(f):
                            f.cgm_module = True
                            f.cgm_module_order = order
                            f.cgm_module_platform = platform
                            return classmethod(f)
                        return wrapper3(f)
                    return wrapper2(f)
                return wrapper

            def classmethod_decorator(platform=None):
                def wrapper(f):
                    f.platform = platform
                    return classmethod(f)
                return wrapper

            def classmethod_wrapper(fn):
                def wrapper(cls, *args, **kwargs):
                    result = fn(cls, *args, **kwargs)
                    return result
                return classmethod(wrapper)

            def staticmethod_wrapper(fn):
                def wrapper(*args, **kwargs):
                    return fn(*args, **kwargs)
                return staticmethod(wrapper)

            class SomeClass(object):
                @static_decorator()
                def static(node, cfg):
                    pass
                @classmethod_decorator()
                def classmethod(cls):
                    pass
                @static_decorator
                def not_so_static(node):
                    pass
                @classmethod_decorator
                def not_so_classmethod(node):
                    pass
                @classmethod_wrapper
                def classmethod_wrapped(cls):
                    pass
                @staticmethod_wrapper
                def staticmethod_wrapped():
                    pass
                @long_classmethod_decorator()
                def long_classmethod(cls):
                    pass
        """
        )
        node = astroid.locals["SomeClass"][0]
        self.assertEqual(node.locals["static"][0].type, "staticmethod")
        self.assertEqual(node.locals["classmethod"][0].type, "classmethod")
        self.assertEqual(node.locals["not_so_static"][0].type, "method")
        self.assertEqual(node.locals["not_so_classmethod"][0].type, "method")
        self.assertEqual(node.locals["classmethod_wrapped"][0].type, "classmethod")
        self.assertEqual(node.locals["staticmethod_wrapped"][0].type, "staticmethod")
        self.assertEqual(node.locals["long_classmethod"][0].type, "classmethod")

    def test_igetattr(self) -> None:
        func = builder.extract_node(
            """
        def test():
            pass
        """
        )
        assert isinstance(func, nodes.FunctionDef)
        func.instance_attrs["value"] = [nodes.Const(42)]
        value = func.getattr("value")
        self.assertEqual(len(value), 1)
        self.assertIsInstance(value[0], nodes.Const)
        self.assertEqual(value[0].value, 42)
        inferred = next(func.igetattr("value"))
        self.assertIsInstance(inferred, nodes.Const)
        self.assertEqual(inferred.value, 42)

    def test_return_annotation_is_not_the_last(self) -> None:
        func = builder.extract_node(
            """
            def test() -> bytes:
                pass
                pass
                return
        """
        )
        last_child = func.last_child()
        self.assertIsInstance(last_child, nodes.Return)
        self.assertEqual(func.tolineno, 5)

    def test_method_init_subclass(self) -> None:
        klass = builder.extract_node(
            """
        class MyClass:
            def __init_subclass__(cls):
                pass
        """
        )
        method = klass["__init_subclass__"]
        self.assertEqual([n.name for n in method.args.args], ["cls"])
        self.assertEqual(method.type, "classmethod")

    def test_dunder_class_local_to_method(self) -> None:
        node = builder.extract_node(
            """
        class MyClass:
            def test(self):
                __class__  #@
        """
        )
        inferred = next(node.infer())
        self.assertIsInstance(inferred, nodes.ClassDef)
        self.assertEqual(inferred.name, "MyClass")

    def test_dunder_class_local_to_function(self) -> None:
        node = builder.extract_node(
            """
        def test(self):
            __class__  #@
        """
        )
        with self.assertRaises(NameInferenceError):
            next(node.infer())

    def test_dunder_class_local_to_classmethod(self) -> None:
        node = builder.extract_node(
            """
        class MyClass:
            @classmethod
            def test(cls):
                __class__  #@
        """
        )
        inferred = next(node.infer())
        self.assertIsInstance(inferred, nodes.ClassDef)
        self.assertEqual(inferred.name, "MyClass")

    @staticmethod
    def test_singleline_docstring() -> None:
        code = textwrap.dedent(
            """\
        def foo():
            '''Hello World'''
            bar = 1
        """
        )
        func: nodes.FunctionDef = builder.extract_node(code)  # type: ignore[assignment]
        assert isinstance(func.doc_node, nodes.Const)
        assert func.doc_node.lineno == 2
        assert func.doc_node.col_offset == 4
        assert func.doc_node.end_lineno == 2
        assert func.doc_node.end_col_offset == 21

    @staticmethod
    def test_multiline_docstring() -> None:
        code = textwrap.dedent(
            """\
        def foo():
            '''Hello World

            Also on this line.
            '''
            bar = 1
        """
        )
        func: nodes.FunctionDef = builder.extract_node(code)  # type: ignore[assignment]
        assert isinstance(func.doc_node, nodes.Const)
        assert func.doc_node.lineno == 2
        assert func.doc_node.col_offset == 4
        assert func.doc_node.end_lineno == 5
        assert func.doc_node.end_col_offset == 7

    @staticmethod
    def test_multiline_docstring_async() -> None:
        code = textwrap.dedent(
            """\
        async def foo(var: tuple = ()):
            '''Hello World


            '''
        """
        )
        func: nodes.FunctionDef = builder.extract_node(code)  # type: ignore[assignment]
        assert isinstance(func.doc_node, nodes.Const)
        assert func.doc_node.lineno == 2
        assert func.doc_node.col_offset == 4
        assert func.doc_node.end_lineno == 5
        assert func.doc_node.end_col_offset == 7

    @staticmethod
    def test_docstring_special_cases() -> None:
        code = textwrap.dedent(
            """\
        def f1(var: tuple = ()):  #@
            'Hello World'

        def f2() -> "just some comment with an open bracket(":  #@
            'Hello World'

        def f3() -> "Another comment with a colon: ":  #@
            'Hello World'

        def f4():  #@
            # It should work with comments too
            'Hello World'
        """
        )
        ast_nodes: list[nodes.FunctionDef] = builder.extract_node(code)  # type: ignore[assignment]
        assert len(ast_nodes) == 4

        assert isinstance(ast_nodes[0].doc_node, nodes.Const)
        assert ast_nodes[0].doc_node.lineno == 2
        assert ast_nodes[0].doc_node.col_offset == 4
        assert ast_nodes[0].doc_node.end_lineno == 2
        assert ast_nodes[0].doc_node.end_col_offset == 17

        assert isinstance(ast_nodes[1].doc_node, nodes.Const)
        assert ast_nodes[1].doc_node.lineno == 5
        assert ast_nodes[1].doc_node.col_offset == 4
        assert ast_nodes[1].doc_node.end_lineno == 5
        assert ast_nodes[1].doc_node.end_col_offset == 17

        assert isinstance(ast_nodes[2].doc_node, nodes.Const)
        assert ast_nodes[2].doc_node.lineno == 8
        assert ast_nodes[2].doc_node.col_offset == 4
        assert ast_nodes[2].doc_node.end_lineno == 8
        assert ast_nodes[2].doc_node.end_col_offset == 17

        assert isinstance(ast_nodes[3].doc_node, nodes.Const)
        assert ast_nodes[3].doc_node.lineno == 12
        assert ast_nodes[3].doc_node.col_offset == 4
        assert ast_nodes[3].doc_node.end_lineno == 12
        assert ast_nodes[3].doc_node.end_col_offset == 17

    @staticmethod
    def test_without_docstring() -> None:
        code = textwrap.dedent(
            """\
        def foo():
            bar = 1
        """
        )
        func: nodes.FunctionDef = builder.extract_node(code)  # type: ignore[assignment]
        assert func.doc_node is None

    @staticmethod
    def test_display_type() -> None:
        code = textwrap.dedent(
            """\
        def foo():
            bar = 1
        """
        )
        func: nodes.FunctionDef = builder.extract_node(code)  # type: ignore[assignment]
        assert func.display_type() == "Function"

        code = textwrap.dedent(
            """\
        class A:
            def foo(self):  #@
                bar = 1
        """
        )
        func: nodes.FunctionDef = builder.extract_node(code)  # type: ignore[assignment]
        assert func.display_type() == "Method"

    @staticmethod
    def test_inference_error() -> None:
        code = textwrap.dedent(
            """\
        def foo():
            bar = 1
        """
        )
        func: nodes.FunctionDef = builder.extract_node(code)  # type: ignore[assignment]
        with pytest.raises(AttributeInferenceError):
            func.getattr("")


class ClassNodeTest(ModuleLoader, unittest.TestCase):
    def test_dict_interface(self) -> None:
        _test_dict_interface(self, self.module["YOUPI"], "method")

    def test_cls_special_attributes_1(self) -> None:
        cls = self.module["YO"]
        self.assertEqual(len(cls.getattr("__bases__")), 1)
        self.assertEqual(len(cls.getattr("__name__")), 1)
        self.assertIsInstance(cls.getattr("__name__")[0], nodes.Const)
        self.assertEqual(cls.getattr("__name__")[0].value, "YO")
        self.assertEqual(len(cls.getattr("__doc__")), 1)
        self.assertIsInstance(cls.getattr("__doc__")[0], nodes.Const)
        self.assertEqual(cls.getattr("__doc__")[0].value, "hehe\n haha")
        # YO is an old styled class for Python 2.7
        # May want to stop locals from referencing namespaced variables in the future
        module_attr_num = 4
        self.assertEqual(len(cls.getattr("__module__")), module_attr_num)
        self.assertIsInstance(cls.getattr("__module__")[0], nodes.Const)
        self.assertEqual(cls.getattr("__module__")[0].value, "data.module")
        self.assertEqual(len(cls.getattr("__dict__")), 1)
        if not cls.newstyle:
            self.assertRaises(AttributeInferenceError, cls.getattr, "__mro__")
        for cls in (nodes.List._proxied, nodes.Const(1)._proxied):
            self.assertEqual(len(cls.getattr("__bases__")), 1)
            self.assertEqual(len(cls.getattr("__name__")), 1)
            self.assertEqual(
                len(cls.getattr("__doc__")), 1, (cls, cls.getattr("__doc__"))
            )
            self.assertEqual(cls.getattr("__doc__")[0].value, cls.doc_node.value)
            self.assertEqual(len(cls.getattr("__module__")), 4)
            self.assertEqual(len(cls.getattr("__dict__")), 1)
            self.assertEqual(len(cls.getattr("__mro__")), 1)

    def test__mro__attribute(self) -> None:
        node = builder.extract_node(
            """
        class A(object): pass
        class B(object): pass
        class C(A, B): pass
        """
        )
        assert isinstance(node, nodes.ClassDef)
        mro = node.getattr("__mro__")[0]
        self.assertIsInstance(mro, nodes.Tuple)
        self.assertEqual(mro.elts, node.mro())

    def test__bases__attribute(self) -> None:
        node = builder.extract_node(
            """
        class A(object): pass
        class B(object): pass
        class C(A, B): pass
        class D(C): pass
        """
        )
        assert isinstance(node, nodes.ClassDef)
        bases = node.getattr("__bases__")[0]
        self.assertIsInstance(bases, nodes.Tuple)
        self.assertEqual(len(bases.elts), 1)
        self.assertIsInstance(bases.elts[0], nodes.ClassDef)
        self.assertEqual(bases.elts[0].name, "C")

    def test_cls_special_attributes_2(self) -> None:
        astroid = builder.parse(
            """
            class A(object): pass
            class B(object): pass

            A.__bases__ += (B,)
        """,
            __name__,
        )
        self.assertEqual(len(astroid["A"].getattr("__bases__")), 2)
        self.assertIsInstance(astroid["A"].getattr("__bases__")[1], nodes.Tuple)
        self.assertIsInstance(astroid["A"].getattr("__bases__")[0], nodes.AssignAttr)

    def test_instance_special_attributes(self) -> None:
        for inst in (Instance(self.module["YO"]), nodes.List(), nodes.Const(1)):
            self.assertRaises(AttributeInferenceError, inst.getattr, "__mro__")
            self.assertRaises(AttributeInferenceError, inst.getattr, "__bases__")
            self.assertRaises(AttributeInferenceError, inst.getattr, "__name__")
            self.assertEqual(len(inst.getattr("__dict__")), 1)
            self.assertEqual(len(inst.getattr("__doc__")), 1)

    def test_navigation(self) -> None:
        klass = self.module["YO"]
        self.assertEqual(klass.statement(), klass)
        self.assertEqual(klass.statement(), klass)
        l_sibling = klass.previous_sibling()
        self.assertTrue(isinstance(l_sibling, nodes.FunctionDef), l_sibling)
        self.assertEqual(l_sibling.name, "global_access")
        r_sibling = klass.next_sibling()
        self.assertIsInstance(r_sibling, nodes.ClassDef)
        self.assertEqual(r_sibling.name, "YOUPI")

    def test_local_attr_ancestors(self) -> None:
        module = builder.parse(
            """
        class A():
            def __init__(self): pass
        class B(A): pass
        class C(B): pass
        class D(object): pass
        class F(): pass
        class E(F, D): pass
        """
        )
        # Test old-style (Python 2) / new-style (Python 3+) ancestors lookups
        klass2 = module["C"]
        it = klass2.local_attr_ancestors("__init__")
        anc_klass = next(it)
        self.assertIsInstance(anc_klass, nodes.ClassDef)
        self.assertEqual(anc_klass.name, "A")
        anc_klass = next(it)
        self.assertIsInstance(anc_klass, nodes.ClassDef)
        self.assertEqual(anc_klass.name, "object")
        self.assertRaises(StopIteration, partial(next, it))

        it = klass2.local_attr_ancestors("method")
        self.assertRaises(StopIteration, partial(next, it))

        # Test mixed-style ancestor lookups
        klass2 = module["E"]
        it = klass2.local_attr_ancestors("__init__")
        anc_klass = next(it)
        self.assertIsInstance(anc_klass, nodes.ClassDef)
        self.assertEqual(anc_klass.name, "object")
        self.assertRaises(StopIteration, partial(next, it))

    def test_local_attr_mro(self) -> None:
        module = builder.parse(
            """
        class A(object):
            def __init__(self): pass
        class B(A):
            def __init__(self, arg, arg2): pass
        class C(A): pass
        class D(C, B): pass
        """
        )
        dclass = module["D"]
        init = dclass.local_attr("__init__")[0]
        self.assertIsInstance(init, nodes.FunctionDef)
        self.assertEqual(init.parent.name, "B")

        cclass = module["C"]
        init = cclass.local_attr("__init__")[0]
        self.assertIsInstance(init, nodes.FunctionDef)
        self.assertEqual(init.parent.name, "A")

        ancestors = list(dclass.local_attr_ancestors("__init__"))
        self.assertEqual([node.name for node in ancestors], ["B", "A", "object"])

    def test_instance_attr_ancestors(self) -> None:
        klass2 = self.module["YOUPI"]
        it = klass2.instance_attr_ancestors("yo")
        anc_klass = next(it)
        self.assertIsInstance(anc_klass, nodes.ClassDef)
        self.assertEqual(anc_klass.name, "YO")
        self.assertRaises(StopIteration, partial(next, it))
        klass2 = self.module["YOUPI"]
        it = klass2.instance_attr_ancestors("member")
        self.assertRaises(StopIteration, partial(next, it))

    def test_methods(self) -> None:
        expected_methods = {"__init__", "class_method", "method", "static_method"}
        klass2 = self.module["YOUPI"]
        methods = {m.name for m in klass2.methods()}
        self.assertTrue(methods.issuperset(expected_methods))
        methods = {m.name for m in klass2.mymethods()}
        self.assertSetEqual(expected_methods, methods)
        klass2 = self.module2["Specialization"]
        methods = {m.name for m in klass2.mymethods()}
        self.assertSetEqual(set(), methods)
        method_locals = klass2.local_attr("method")
        self.assertEqual(len(method_locals), 1)
        self.assertEqual(method_locals[0].name, "method")
        self.assertRaises(AttributeInferenceError, klass2.local_attr, "nonexistent")
        methods = {m.name for m in klass2.methods()}
        self.assertTrue(methods.issuperset(expected_methods))

    # def test_rhs(self):
    #     my_dict = self.module['MY_DICT']
    #     self.assertIsInstance(my_dict.rhs(), nodes.Dict)
    #     a = self.module['YO']['a']
    #     value = a.rhs()
    #     self.assertIsInstance(value, nodes.Const)
    #     self.assertEqual(value.value, 1)

    def test_ancestors(self) -> None:
        klass = self.module["YOUPI"]
        self.assertEqual(["YO", "object"], [a.name for a in klass.ancestors()])
        klass = self.module2["Specialization"]
        self.assertEqual(["YOUPI", "YO", "object"], [a.name for a in klass.ancestors()])

    def test_type(self) -> None:
        klass = self.module["YOUPI"]
        self.assertEqual(klass.type, "class")
        klass = self.module2["Metaclass"]
        self.assertEqual(klass.type, "metaclass")
        klass = self.module2["MyException"]
        self.assertEqual(klass.type, "exception")
        klass = self.module2["MyError"]
        self.assertEqual(klass.type, "exception")
        # the following class used to be detected as a metaclass
        # after the fix which used instance._proxied in .ancestors(),
        # when in fact it is a normal class
        klass = self.module2["NotMetaclass"]
        self.assertEqual(klass.type, "class")

    def test_inner_classes(self) -> None:
        eee = self.nonregr["Ccc"]["Eee"]
        self.assertEqual([n.name for n in eee.ancestors()], ["Ddd", "Aaa", "object"])

    def test_classmethod_attributes(self) -> None:
        data = """
            class WebAppObject(object):
                def registered(cls, application):
                    cls.appli = application
                    cls.schema = application.schema
                    cls.config = application.config
                    return cls
                registered = classmethod(registered)
        """
        astroid = builder.parse(data, __name__)
        cls = astroid["WebAppObject"]
        assert_keys = [
            "__module__",
            "__qualname__",
            "appli",
            "config",
            "registered",
            "schema",
        ]
        self.assertEqual(sorted(cls.locals.keys()), assert_keys)

    def test_class_getattr(self) -> None:
        data = """
            class WebAppObject(object):
                appli = application
                appli += 2
                del self.appli
        """
        astroid = builder.parse(data, __name__)
        cls = astroid["WebAppObject"]
        # test del statement not returned by getattr
        self.assertEqual(len(cls.getattr("appli")), 2)

    def test_instance_getattr(self) -> None:
        data = """
            class WebAppObject(object):
                def __init__(self, application):
                    self.appli = application
                    self.appli += 2
                    del self.appli
        """
        astroid = builder.parse(data)
        inst = Instance(astroid["WebAppObject"])
        # test del statement not returned by getattr
        self.assertEqual(len(inst.getattr("appli")), 2)

    def test_instance_getattr_with_class_attr(self) -> None:
        data = """
            class Parent:
                aa = 1
                cc = 1

            class Klass(Parent):
                aa = 0
                bb = 0

                def incr(self, val):
                    self.cc = self.aa
                    if val > self.aa:
                        val = self.aa
                    if val < self.bb:
                        val = self.bb
                    self.aa += val
        """
        astroid = builder.parse(data)
        inst = Instance(astroid["Klass"])
        self.assertEqual(len(inst.getattr("aa")), 3, inst.getattr("aa"))
        self.assertEqual(len(inst.getattr("bb")), 1, inst.getattr("bb"))
        self.assertEqual(len(inst.getattr("cc")), 2, inst.getattr("cc"))

    def test_getattr_method_transform(self) -> None:
        data = """
            class Clazz(object):

                def m1(self, value):
                    self.value = value
                m2 = m1

            def func(arg1, arg2):
                "function that will be used as a method"
                return arg1.value + arg2

            Clazz.m3 = func
            inst = Clazz()
            inst.m4 = func
        """
        astroid = builder.parse(data)
        cls = astroid["Clazz"]
        # test del statement not returned by getattr
        for method in ("m1", "m2", "m3"):
            inferred = list(cls.igetattr(method))
            self.assertEqual(len(inferred), 1)
            self.assertIsInstance(inferred[0], UnboundMethod)
            inferred = list(Instance(cls).igetattr(method))
            self.assertEqual(len(inferred), 1)
            self.assertIsInstance(inferred[0], BoundMethod)
        inferred = list(Instance(cls).igetattr("m4"))
        self.assertEqual(len(inferred), 1)
        self.assertIsInstance(inferred[0], nodes.FunctionDef)

    def test_getattr_from_grandpa(self) -> None:
        data = """
            class Future:
                attr = 1

            class Present(Future):
                pass

            class Past(Present):
                pass
        """
        astroid = builder.parse(data)
        past = astroid["Past"]
        attr = past.getattr("attr")
        self.assertEqual(len(attr), 1)
        attr1 = attr[0]
        self.assertIsInstance(attr1, nodes.AssignName)
        self.assertEqual(attr1.name, "attr")

    @staticmethod
    def test_getattr_with_enpty_annassign() -> None:
        code = """
            class Parent:
                attr: int = 2

            class Child(Parent):  #@
                attr: int
        """
        child = extract_node(code)
        attr = child.getattr("attr")
        assert len(attr) == 1
        assert isinstance(attr[0], nodes.AssignName)
        assert attr[0].name == "attr"
        assert attr[0].lineno == 3

    def test_function_with_decorator_lineno(self) -> None:
        data = """
            @f(a=2,
               b=3)
            def g1(x):
                print(x)

            @f(a=2,
               b=3,
            )
            def g2():
                pass
        """
        astroid = builder.parse(data)
        self.assertEqual(astroid["g1"].fromlineno, 4)
        self.assertEqual(astroid["g1"].tolineno, 5)
        if PY38 and IS_PYPY:
            self.assertEqual(astroid["g2"].fromlineno, 9)
        else:
            self.assertEqual(astroid["g2"].fromlineno, 10)
        self.assertEqual(astroid["g2"].tolineno, 11)

    def test_metaclass_error(self) -> None:
        astroid = builder.parse(
            """
            class Test(object):
                __metaclass__ = typ
        """
        )
        klass = astroid["Test"]
        self.assertFalse(klass.metaclass())

    def test_metaclass_yes_leak(self) -> None:
        astroid = builder.parse(
            """
            # notice `ab` instead of `abc`
            from ab import ABCMeta

            class Meta(object):
                __metaclass__ = ABCMeta
        """
        )
        klass = astroid["Meta"]
        self.assertIsNone(klass.metaclass())

    def test_metaclass_type(self) -> None:
        klass = builder.extract_node(
            """
            def with_metaclass(meta, base=object):
                return meta("NewBase", (base, ), {})

            class ClassWithMeta(with_metaclass(type)): #@
                pass
        """
        )
        assert isinstance(klass, nodes.ClassDef)
        self.assertEqual(
            ["NewBase", "object"], [base.name for base in klass.ancestors()]
        )

    def test_no_infinite_metaclass_loop(self) -> None:
        klass = builder.extract_node(
            """
            class SSS(object):

                class JJJ(object):
                    pass

                @classmethod
                def Init(cls):
                    cls.JJJ = type('JJJ', (cls.JJJ,), {})

            class AAA(SSS):
                pass

            class BBB(AAA.JJJ):
                pass
        """
        )
        assert isinstance(klass, nodes.ClassDef)
        self.assertFalse(_is_metaclass(klass))
        ancestors = [base.name for base in klass.ancestors()]
        self.assertIn("object", ancestors)
        self.assertIn("JJJ", ancestors)

    def test_no_infinite_metaclass_loop_with_redefine(self) -> None:
        ast_nodes = builder.extract_node(
            """
            import datetime

            class A(datetime.date):  #@
                @classmethod
                def now(cls):
                    return cls()

            class B(datetime.date):  #@
                pass

            datetime.date = A
            datetime.date = B
        """
        )
        for klass in ast_nodes:
            self.assertEqual(None, klass.metaclass())

    @unittest.skipUnless(HAS_SIX, "These tests require the six library")
    def test_metaclass_generator_hack(self):
        klass = builder.extract_node(
            """
            import six

            class WithMeta(six.with_metaclass(type, object)): #@
                pass
        """
        )
        assert isinstance(klass, nodes.ClassDef)
        self.assertEqual(["object"], [base.name for base in klass.ancestors()])
        self.assertEqual("type", klass.metaclass().name)

    @unittest.skipUnless(HAS_SIX, "These tests require the six library")
    def test_metaclass_generator_hack_enum_base(self):
        """Regression test for https://github.com/pylint-dev/pylint/issues/5935"""
        klass = builder.extract_node(
            """
            import six
            from enum import Enum, EnumMeta

            class PetEnumPy2Metaclass(six.with_metaclass(EnumMeta, Enum)): #@
                DOG = "dog"
        """
        )
        self.assertEqual(list(klass.local_attr_ancestors("DOG")), [])

    def test_add_metaclass(self) -> None:
        klass = builder.extract_node(
            """
        import abc

        class WithMeta(object, metaclass=abc.ABCMeta):
            pass
        """
        )
        assert isinstance(klass, nodes.ClassDef)
        inferred = next(klass.infer())
        metaclass = inferred.metaclass()
        self.assertIsInstance(metaclass, nodes.ClassDef)
        self.assertIn(metaclass.qname(), ("abc.ABCMeta", "_py_abc.ABCMeta"))

    @unittest.skipUnless(HAS_SIX, "These tests require the six library")
    def test_using_invalid_six_add_metaclass_call(self):
        klass = builder.extract_node(
            """
        import six
        @six.add_metaclass()
        class Invalid(object):
            pass
        """
        )
        inferred = next(klass.infer())
        self.assertIsNone(inferred.metaclass())

    @staticmethod
    def test_with_invalid_metaclass():
        klass = extract_node(
            """
        class InvalidAsMetaclass: ...

        class Invalid(metaclass=InvalidAsMetaclass()):  #@
            pass
        """
        )
        inferred = next(klass.infer())
        metaclass = inferred.metaclass()
        assert isinstance(metaclass, Instance)

    def test_nonregr_infer_callresult(self) -> None:
        astroid = builder.parse(
            """
            class Delegate(object):
                def __get__(self, obj, cls):
                    return getattr(obj._subject, self.attribute)

            class CompositeBuilder(object):
                __call__ = Delegate()

            builder = CompositeBuilder(result, composite)
            tgts = builder()
        """
        )
        instance = astroid["tgts"]
        # used to raise "'_Yes' object is not iterable", see
        # https://bitbucket.org/logilab/astroid/issue/17
        self.assertEqual(list(instance.infer()), [util.Uninferable])

    def test_slots(self) -> None:
        astroid = builder.parse(
            """
            from collections import deque
            from textwrap import dedent

            class First(object): #@
                __slots__ = ("a", "b", 1)
            class Second(object): #@
                __slots__ = "a"
            class Third(object): #@
                __slots__ = deque(["a", "b", "c"])
            class Fourth(object): #@
                __slots__ = {"a": "a", "b": "b"}
            class Fifth(object): #@
                __slots__ = list
            class Sixth(object): #@
                __slots__ = ""
            class Seventh(object): #@
                __slots__ = dedent.__name__
            class Eight(object): #@
                __slots__ = ("parens")
            class Ninth(object): #@
                pass
            class Ten(object): #@
                __slots__ = dict({"a": "b", "c": "d"})
        """
        )
        expected = [
            ("First", ("a", "b")),
            ("Second", ("a",)),
            ("Third", None),
            ("Fourth", ("a", "b")),
            ("Fifth", None),
            ("Sixth", None),
            ("Seventh", ("dedent",)),
            ("Eight", ("parens",)),
            ("Ninth", None),
            ("Ten", ("a", "c")),
        ]
        for cls, expected_value in expected:
            slots = astroid[cls].slots()
            if expected_value is None:
                self.assertIsNone(slots)
            else:
                self.assertEqual(list(expected_value), [node.value for node in slots])

    def test_slots_for_dict_keys(self) -> None:
        module = builder.parse(
            """
        class Issue(object):
            SlotDefaults = {'id': 0, 'id1':1}
            __slots__ = SlotDefaults.keys()
        """
        )
        cls = module["Issue"]
        slots = cls.slots()
        self.assertEqual(len(slots), 2)
        self.assertEqual(slots[0].value, "id")
        self.assertEqual(slots[1].value, "id1")

    def test_slots_empty_list_of_slots(self) -> None:
        module = builder.parse(
            """
        class Klass(object):
            __slots__ = ()
        """
        )
        cls = module["Klass"]
        self.assertEqual(cls.slots(), [])

    def test_slots_taken_from_parents(self) -> None:
        module = builder.parse(
            """
        class FirstParent(object):
            __slots__ = ('a', 'b', 'c')
        class SecondParent(FirstParent):
            __slots__ = ('d', 'e')
        class Third(SecondParent):
            __slots__ = ('d', )
        """
        )
        cls = module["Third"]
        slots = cls.slots()
        self.assertEqual(
            sorted({slot.value for slot in slots}), ["a", "b", "c", "d", "e"]
        )

    def test_all_ancestors_need_slots(self) -> None:
        module = builder.parse(
            """
        class A(object):
            __slots__ = ('a', )
        class B(A):
            pass
        class C(B):
            __slots__ = ('a', )
        """
        )
        cls = module["C"]
        self.assertIsNone(cls.slots())
        cls = module["B"]
        self.assertIsNone(cls.slots())

    def test_slots_added_dynamically_still_inferred(self) -> None:
        code = """
        class NodeBase(object):
            __slots__ = "a", "b"

            if Options.isFullCompat():
                __slots__ += ("c",)
        """
        node = builder.extract_node(code)
        inferred = next(node.infer())
        slots = inferred.slots()
        assert len(slots) == 3, slots
        assert [slot.value for slot in slots] == ["a", "b", "c"]

    def assertEqualMro(self, klass: nodes.ClassDef, expected_mro: list[str]) -> None:
        self.assertEqual([member.name for member in klass.mro()], expected_mro)

    def assertEqualMroQName(
        self, klass: nodes.ClassDef, expected_mro: list[str]
    ) -> None:
        self.assertEqual([member.qname() for member in klass.mro()], expected_mro)

    @unittest.skipUnless(HAS_SIX, "These tests require the six library")
    def test_with_metaclass_mro(self):
        astroid = builder.parse(
            """
        import six

        class C(object):
            pass
        class B(C):
            pass
        class A(six.with_metaclass(type, B)):
            pass
        """
        )
        self.assertEqualMro(astroid["A"], ["A", "B", "C", "object"])

    def test_mro(self) -> None:
        astroid = builder.parse(
            """
        class C(object): pass
        class D(dict, C): pass

        class A1(object): pass
        class B1(A1): pass
        class C1(A1): pass
        class D1(B1, C1): pass
        class E1(C1, B1): pass
        class F1(D1, E1): pass
        class G1(E1, D1): pass

        class Boat(object): pass
        class DayBoat(Boat): pass
        class WheelBoat(Boat): pass
        class EngineLess(DayBoat): pass
        class SmallMultihull(DayBoat): pass
        class PedalWheelBoat(EngineLess, WheelBoat): pass
        class SmallCatamaran(SmallMultihull): pass
        class Pedalo(PedalWheelBoat, SmallCatamaran): pass

        class OuterA(object):
            class Inner(object):
                pass
        class OuterB(OuterA):
            class Inner(OuterA.Inner):
                pass
        class OuterC(OuterA):
            class Inner(OuterA.Inner):
                pass
        class OuterD(OuterC):
            class Inner(OuterC.Inner, OuterB.Inner):
                pass

        class Duplicates(str, str): pass
        """
        )
        self.assertEqualMro(astroid["D"], ["D", "dict", "C", "object"])
        self.assertEqualMro(astroid["D1"], ["D1", "B1", "C1", "A1", "object"])
        self.assertEqualMro(astroid["E1"], ["E1", "C1", "B1", "A1", "object"])
        with self.assertRaises(InconsistentMroError) as cm:
            astroid["F1"].mro()
        A1 = astroid.getattr("A1")[0]
        B1 = astroid.getattr("B1")[0]
        C1 = astroid.getattr("C1")[0]
        object_ = MANAGER.astroid_cache["builtins"].getattr("object")[0]
        self.assertEqual(
            cm.exception.mros, [[B1, C1, A1, object_], [C1, B1, A1, object_]]
        )
        with self.assertRaises(InconsistentMroError) as cm:
            astroid["G1"].mro()
        self.assertEqual(
            cm.exception.mros, [[C1, B1, A1, object_], [B1, C1, A1, object_]]
        )
        self.assertEqualMro(
            astroid["PedalWheelBoat"],
            ["PedalWheelBoat", "EngineLess", "DayBoat", "WheelBoat", "Boat", "object"],
        )
        self.assertEqualMro(
            astroid["SmallCatamaran"],
            ["SmallCatamaran", "SmallMultihull", "DayBoat", "Boat", "object"],
        )
        self.assertEqualMro(
            astroid["Pedalo"],
            [
                "Pedalo",
                "PedalWheelBoat",
                "EngineLess",
                "SmallCatamaran",
                "SmallMultihull",
                "DayBoat",
                "WheelBoat",
                "Boat",
                "object",
            ],
        )
        self.assertEqualMro(
            astroid["OuterD"]["Inner"], ["Inner", "Inner", "Inner", "Inner", "object"]
        )
        with self.assertRaises(DuplicateBasesError) as cm:
            astroid["Duplicates"].mro()
        Duplicates = astroid.getattr("Duplicates")[0]
        self.assertEqual(cm.exception.cls, Duplicates)
        self.assertIsInstance(cm.exception, MroError)
self.assertIsInstance(cm.exception, ResolveError) def test_mro_with_factories(self) -> None: cls = builder.extract_node( """ def MixinFactory(cls): mixin_name = '{}Mixin'.format(cls.__name__) mixin_bases = (object,) mixin_attrs = {} mixin = type(mixin_name, mixin_bases, mixin_attrs) return mixin class MixinA(MixinFactory(int)): pass class MixinB(MixinFactory(str)): pass class Base(object): pass class ClassA(MixinA, Base): pass class ClassB(MixinB, ClassA): pass class FinalClass(ClassB): def __init__(self): self.name = 'x' """ ) assert isinstance(cls, nodes.ClassDef) self.assertEqualMro( cls, [ "FinalClass", "ClassB", "MixinB", "strMixin", "ClassA", "MixinA", "intMixin", "Base", "object", ], ) def test_mro_with_attribute_classes(self) -> None: cls = builder.extract_node( """ class A: pass class B: pass class Scope: pass scope = Scope() scope.A = A scope.B = B class C(scope.A, scope.B): pass """ ) assert isinstance(cls, nodes.ClassDef) self.assertEqualMro(cls, ["C", "A", "B", "object"]) def test_mro_generic_1(self): cls = builder.extract_node( """ import typing T = typing.TypeVar('T') class A(typing.Generic[T]): ... class B: ... class C(A[T], B): ... """ ) assert isinstance(cls, nodes.ClassDef) self.assertEqualMroQName( cls, [".C", ".A", "typing.Generic", ".B", "builtins.object"] ) def test_mro_generic_2(self): cls = builder.extract_node( """ from typing import Generic, TypeVar T = TypeVar('T') class A: ... class B(Generic[T]): ... class C(Generic[T], A, B[T]): ... """ ) assert isinstance(cls, nodes.ClassDef) self.assertEqualMroQName( cls, [".C", ".A", ".B", "typing.Generic", "builtins.object"] ) def test_mro_generic_3(self): cls = builder.extract_node( """ from typing import Generic, TypeVar T = TypeVar('T') class A: ... class B(A, Generic[T]): ... class C(Generic[T]): ... class D(B[T], C[T], Generic[T]): ... 
""" ) assert isinstance(cls, nodes.ClassDef) self.assertEqualMroQName( cls, [".D", ".B", ".A", ".C", "typing.Generic", "builtins.object"] ) def test_mro_generic_4(self): cls = builder.extract_node( """ from typing import Generic, TypeVar T = TypeVar('T') class A: ... class B(Generic[T]): ... class C(A, Generic[T], B[T]): ... """ ) assert isinstance(cls, nodes.ClassDef) self.assertEqualMroQName( cls, [".C", ".A", ".B", "typing.Generic", "builtins.object"] ) def test_mro_generic_5(self): cls = builder.extract_node( """ from typing import Generic, TypeVar T1 = TypeVar('T1') T2 = TypeVar('T2') class A(Generic[T1]): ... class B(Generic[T2]): ... class C(A[T1], B[T2]): ... """ ) assert isinstance(cls, nodes.ClassDef) self.assertEqualMroQName( cls, [".C", ".A", ".B", "typing.Generic", "builtins.object"] ) def test_mro_generic_6(self): cls = builder.extract_node( """ from typing import Generic as TGeneric, TypeVar T = TypeVar('T') class Generic: ... class A(Generic): ... class B(TGeneric[T]): ... class C(A, B[T]): ... """ ) assert isinstance(cls, nodes.ClassDef) self.assertEqualMroQName( cls, [".C", ".A", ".Generic", ".B", "typing.Generic", "builtins.object"] ) def test_mro_generic_7(self): cls = builder.extract_node( """ from typing import Generic, TypeVar T = TypeVar('T') class A(): ... class B(Generic[T]): ... class C(A, B[T]): ... class D: ... class E(C[str], D): ... """ ) assert isinstance(cls, nodes.ClassDef) self.assertEqualMroQName( cls, [".E", ".C", ".A", ".B", "typing.Generic", ".D", "builtins.object"] ) def test_mro_generic_error_1(self): cls = builder.extract_node( """ from typing import Generic, TypeVar T1 = TypeVar('T1') T2 = TypeVar('T2') class A(Generic[T1], Generic[T2]): ... """ ) assert isinstance(cls, nodes.ClassDef) with self.assertRaises(DuplicateBasesError): cls.mro() def test_mro_generic_error_2(self): cls = builder.extract_node( """ from typing import Generic, TypeVar T = TypeVar('T') class A(Generic[T]): ... class B(A[T], A[T]): ... 
""" ) assert isinstance(cls, nodes.ClassDef) with self.assertRaises(DuplicateBasesError): cls.mro() def test_mro_typing_extensions(self): """Regression test for mro() inference on typing_extensions. Regression reported in: https://github.com/pylint-dev/astroid/issues/1124 """ module = parse( """ import abc import typing import dataclasses from typing import Protocol T = typing.TypeVar("T") class MyProtocol(Protocol): pass class EarlyBase(typing.Generic[T], MyProtocol): pass class Base(EarlyBase[T], abc.ABC): pass class Final(Base[object]): pass """ ) class_names = [ "ABC", "Base", "EarlyBase", "Final", "Generic", "MyProtocol", "Protocol", "object", ] final_def = module.body[-1] self.assertEqual(class_names, sorted(i.name for i in final_def.mro())) def test_generator_from_infer_call_result_parent(self) -> None: func = builder.extract_node( """ import contextlib @contextlib.contextmanager def test(): #@ yield """ ) assert isinstance(func, nodes.FunctionDef) result = next(func.infer_call_result(None)) self.assertIsInstance(result, Generator) self.assertEqual(result.parent, func) def test_type_three_arguments(self) -> None: classes = builder.extract_node( """ type('A', (object, ), {"a": 1, "b": 2, missing: 3}) #@ """ ) assert isinstance(classes, nodes.Call) first = next(classes.infer()) self.assertIsInstance(first, nodes.ClassDef) self.assertEqual(first.name, "A") self.assertEqual(first.basenames, ["object"]) self.assertIsInstance(first["a"], nodes.Const) self.assertEqual(first["a"].value, 1) self.assertIsInstance(first["b"], nodes.Const) self.assertEqual(first["b"].value, 2) with self.assertRaises(AttributeInferenceError): first.getattr("missing") def test_implicit_metaclass(self) -> None: cls = builder.extract_node( """ class A(object): pass """ ) assert isinstance(cls, nodes.ClassDef) type_cls = nodes.builtin_lookup("type")[1][0] self.assertEqual(cls.implicit_metaclass(), type_cls) def test_implicit_metaclass_lookup(self) -> None: cls = builder.extract_node( """ 
class A(object): pass """ ) assert isinstance(cls, nodes.ClassDef) instance = cls.instantiate_class() func = cls.getattr("mro") self.assertEqual(len(func), 1) self.assertRaises(AttributeInferenceError, instance.getattr, "mro") def test_metaclass_lookup_using_same_class(self) -> None: """Check that we don't have recursive attribute access for metaclass.""" cls = builder.extract_node( """ class A(object): pass """ ) assert isinstance(cls, nodes.ClassDef) self.assertEqual(len(cls.getattr("mro")), 1) def test_metaclass_lookup_inference_errors(self) -> None: module = builder.parse( """ class Metaclass(type): foo = lala class B(object, metaclass=Metaclass): pass """ ) cls = module["B"] self.assertEqual(util.Uninferable, next(cls.igetattr("foo"))) def test_metaclass_lookup(self) -> None: module = builder.parse( """ class Metaclass(type): foo = 42 @classmethod def class_method(cls): pass def normal_method(cls): pass @property def meta_property(cls): return 42 @staticmethod def static(): pass class A(object, metaclass=Metaclass): pass """ ) acls = module["A"] normal_attr = next(acls.igetattr("foo")) self.assertIsInstance(normal_attr, nodes.Const) self.assertEqual(normal_attr.value, 42) class_method = next(acls.igetattr("class_method")) self.assertIsInstance(class_method, BoundMethod) self.assertEqual(class_method.bound, module["Metaclass"]) normal_method = next(acls.igetattr("normal_method")) self.assertIsInstance(normal_method, BoundMethod) self.assertEqual(normal_method.bound, module["A"]) # Attribute access for properties: # from the metaclass is a property object # from the class that uses the metaclass, the value # of the property property_meta = next(module["Metaclass"].igetattr("meta_property")) self.assertIsInstance(property_meta, objects.Property) wrapping = nodes.get_wrapping_class(property_meta) self.assertEqual(wrapping, module["Metaclass"]) property_class = next(acls.igetattr("meta_property")) self.assertIsInstance(property_class, nodes.Const) 
self.assertEqual(property_class.value, 42) static = next(acls.igetattr("static")) self.assertIsInstance(static, nodes.FunctionDef) def test_local_attr_invalid_mro(self) -> None: cls = builder.extract_node( """ # A has an invalid MRO, local_attr should fallback # to using .ancestors. class A(object, object): test = 42 class B(A): #@ pass """ ) assert isinstance(cls, nodes.ClassDef) local = cls.local_attr("test")[0] inferred = next(local.infer()) self.assertIsInstance(inferred, nodes.Const) self.assertEqual(inferred.value, 42) def test_has_dynamic_getattr(self) -> None: module = builder.parse( """ class Getattr(object): def __getattr__(self, attrname): pass class Getattribute(object): def __getattribute__(self, attrname): pass class ParentGetattr(Getattr): pass """ ) self.assertTrue(module["Getattr"].has_dynamic_getattr()) self.assertTrue(module["Getattribute"].has_dynamic_getattr()) self.assertTrue(module["ParentGetattr"].has_dynamic_getattr()) # Test that objects analyzed through the live introspection # aren't considered to have dynamic getattr implemented. 
astroid_builder = builder.AstroidBuilder() module = astroid_builder.module_build(difflib) self.assertFalse(module["SequenceMatcher"].has_dynamic_getattr()) def test_duplicate_bases_namedtuple(self) -> None: module = builder.parse( """ import collections _A = collections.namedtuple('A', 'a') class A(_A): pass class B(A): pass """ ) names = ["B", "A", "A", "tuple", "object"] mro = module["B"].mro() class_names = [i.name for i in mro] self.assertEqual(names, class_names) def test_instance_bound_method_lambdas(self) -> None: ast_nodes = builder.extract_node( """ class Test(object): #@ lam = lambda self: self not_method = lambda xargs: xargs Test() #@ """ ) assert isinstance(ast_nodes, list) cls = next(ast_nodes[0].infer()) self.assertIsInstance(next(cls.igetattr("lam")), nodes.Lambda) self.assertIsInstance(next(cls.igetattr("not_method")), nodes.Lambda) instance = next(ast_nodes[1].infer()) lam = next(instance.igetattr("lam")) self.assertIsInstance(lam, BoundMethod) not_method = next(instance.igetattr("not_method")) self.assertIsInstance(not_method, nodes.Lambda) def test_instance_bound_method_lambdas_2(self) -> None: """ Test the fact that a method which is a lambda built from a factory is well inferred as a bound method (bug pylint 2594). 
""" ast_nodes = builder.extract_node( """ def lambda_factory(): return lambda self: print("Hello world") class MyClass(object): #@ f2 = lambda_factory() MyClass() #@ """ ) assert isinstance(ast_nodes, list) cls = next(ast_nodes[0].infer()) self.assertIsInstance(next(cls.igetattr("f2")), nodes.Lambda) instance = next(ast_nodes[1].infer()) f2 = next(instance.igetattr("f2")) self.assertIsInstance(f2, BoundMethod) def test_class_extra_decorators_frame_is_not_class(self) -> None: ast_node = builder.extract_node( """ def ala(): def bala(): #@ func = 42 """ ) assert isinstance(ast_node, nodes.FunctionDef) self.assertEqual(ast_node.extra_decorators, []) def test_class_extra_decorators_only_callfunc_are_considered(self) -> None: ast_node = builder.extract_node( """ class Ala(object): def func(self): #@ pass func = 42 """ ) self.assertEqual(ast_node.extra_decorators, []) def test_class_extra_decorators_only_assignment_names_are_considered(self) -> None: ast_node = builder.extract_node( """ class Ala(object): def func(self): #@ pass def __init__(self): self.func = staticmethod(func) """ ) self.assertEqual(ast_node.extra_decorators, []) def test_class_extra_decorators_only_same_name_considered(self) -> None: ast_node = builder.extract_node( """ class Ala(object): def func(self): #@ pass bala = staticmethod(func) """ ) self.assertEqual(ast_node.extra_decorators, []) self.assertEqual(ast_node.type, "method") def test_class_extra_decorators(self) -> None: static_method, clsmethod = builder.extract_node( """ class Ala(object): def static(self): #@ pass def class_method(self): #@ pass class_method = classmethod(class_method) static = staticmethod(static) """ ) self.assertEqual(len(clsmethod.extra_decorators), 1) self.assertEqual(clsmethod.type, "classmethod") self.assertEqual(len(static_method.extra_decorators), 1) self.assertEqual(static_method.type, "staticmethod") def test_extra_decorators_only_class_level_assignments(self) -> None: node = builder.extract_node( """ def 
_bind(arg): return arg.bind class A(object): @property def bind(self): return 42 def irelevant(self): # This is important, because it used to trigger # a maximum recursion error. bind = _bind(self) return bind A() #@ """ ) inferred = next(node.infer()) bind = next(inferred.igetattr("bind")) self.assertIsInstance(bind, nodes.Const) self.assertEqual(bind.value, 42) parent = bind.scope() self.assertEqual(len(parent.extra_decorators), 0) def test_class_keywords(self) -> None: data = """ class TestKlass(object, metaclass=TestMetaKlass, foo=42, bar='baz'): pass """ astroid = builder.parse(data, __name__) cls = astroid["TestKlass"] self.assertEqual(len(cls.keywords), 2) self.assertEqual([x.arg for x in cls.keywords], ["foo", "bar"]) children = list(cls.get_children()) assert len(children) == 4 assert isinstance(children[1], nodes.Keyword) assert isinstance(children[2], nodes.Keyword) assert children[1].arg == "foo" assert children[2].arg == "bar" def test_kite_graph(self) -> None: data = """ A = type('A', (object,), {}) class B1(A): pass class B2(A): pass class C(B1, B2): pass class D(C): def update(self): self.hello = 'hello' """ # Should not crash builder.parse(data) @staticmethod def test_singleline_docstring() -> None: code = textwrap.dedent( """\ class Foo: '''Hello World''' bar = 1 """ ) node: nodes.ClassDef = builder.extract_node(code) # type: ignore[assignment] assert isinstance(node.doc_node, nodes.Const) assert node.doc_node.lineno == 2 assert node.doc_node.col_offset == 4 assert node.doc_node.end_lineno == 2 assert node.doc_node.end_col_offset == 21 @staticmethod def test_multiline_docstring() -> None: code = textwrap.dedent( """\ class Foo: '''Hello World Also on this line. 
''' bar = 1 """ ) node: nodes.ClassDef = builder.extract_node(code) # type: ignore[assignment] assert isinstance(node.doc_node, nodes.Const) assert node.doc_node.lineno == 2 assert node.doc_node.col_offset == 4 assert node.doc_node.end_lineno == 5 assert node.doc_node.end_col_offset == 7 @staticmethod def test_without_docstring() -> None: code = textwrap.dedent( """\ class Foo: bar = 1 """ ) node: nodes.ClassDef = builder.extract_node(code) # type: ignore[assignment] assert node.doc_node is None def test_issue940_metaclass_subclass_property() -> None: node = builder.extract_node( """ class BaseMeta(type): @property def __members__(cls): return ['a', 'property'] class Parent(metaclass=BaseMeta): pass class Derived(Parent): pass Derived.__members__ """ ) inferred = next(node.infer()) assert isinstance(inferred, nodes.List) assert [c.value for c in inferred.elts] == ["a", "property"] def test_issue940_property_grandchild() -> None: node = builder.extract_node( """ class Grandparent: @property def __members__(self): return ['a', 'property'] class Parent(Grandparent): pass class Child(Parent): pass Child().__members__ """ ) inferred = next(node.infer()) assert isinstance(inferred, nodes.List) assert [c.value for c in inferred.elts] == ["a", "property"] def test_issue940_metaclass_property() -> None: node = builder.extract_node( """ class BaseMeta(type): @property def __members__(cls): return ['a', 'property'] class Parent(metaclass=BaseMeta): pass Parent.__members__ """ ) inferred = next(node.infer()) assert isinstance(inferred, nodes.List) assert [c.value for c in inferred.elts] == ["a", "property"] def test_issue940_with_metaclass_class_context_property() -> None: node = builder.extract_node( """ class BaseMeta(type): pass class Parent(metaclass=BaseMeta): @property def __members__(self): return ['a', 'property'] class Derived(Parent): pass Derived.__members__ """ ) inferred = next(node.infer()) assert not isinstance(inferred, nodes.List) assert isinstance(inferred, 
objects.Property) def test_issue940_metaclass_values_funcdef() -> None: node = builder.extract_node( """ class BaseMeta(type): def __members__(cls): return ['a', 'func'] class Parent(metaclass=BaseMeta): pass Parent.__members__() """ ) inferred = next(node.infer()) assert isinstance(inferred, nodes.List) assert [c.value for c in inferred.elts] == ["a", "func"] def test_issue940_metaclass_derived_funcdef() -> None: node = builder.extract_node( """ class BaseMeta(type): def __members__(cls): return ['a', 'func'] class Parent(metaclass=BaseMeta): pass class Derived(Parent): pass Derived.__members__() """ ) inferred_result = next(node.infer()) assert isinstance(inferred_result, nodes.List) assert [c.value for c in inferred_result.elts] == ["a", "func"] def test_issue940_metaclass_funcdef_is_not_datadescriptor() -> None: node = builder.extract_node( """ class BaseMeta(type): def __members__(cls): return ['a', 'property'] class Parent(metaclass=BaseMeta): @property def __members__(cls): return BaseMeta.__members__() class Derived(Parent): pass Derived.__members__ """ ) # Here the function is defined on the metaclass, but the property # is defined on the base class. 
When loading the attribute in a # class context, this should return the property object instead of # resolving the data descriptor inferred = next(node.infer()) assert isinstance(inferred, objects.Property) def test_property_in_body_of_try() -> None: """Regression test for https://github.com/pylint-dev/pylint/issues/6596.""" node: nodes.Return = builder._extract_single_node( """ def myfunc(): try: @property def myfunc(): return None except TypeError: pass @myfunc.setter def myfunc(): pass return myfunc() #@ """ ) next(node.value.infer()) def test_property_in_body_of_if() -> None: node: nodes.Return = builder._extract_single_node( """ def myfunc(): if True: @property def myfunc(): return None @myfunc.setter def myfunc(): pass return myfunc() #@ """ ) next(node.value.infer()) def test_issue940_enums_as_a_real_world_usecase() -> None: node = builder.extract_node( """ from enum import Enum class Sounds(Enum): bee = "buzz" cat = "meow" Sounds.__members__ """ ) inferred_result = next(node.infer()) assert isinstance(inferred_result, nodes.Dict) actual = [k.value for k, _ in inferred_result.items] assert sorted(actual) == ["bee", "cat"] def test_enums_type_annotation_str_member() -> None: """A type-annotated member of an Enum class where: - `member.value` is of type `nodes.Const` & - `member.value.value` is of type `str` is inferred as: `repr(member.value.value)` """ node = builder.extract_node( """ from enum import Enum class Veg(Enum): TOMATO: str = "sweet" Veg.TOMATO.value """ ) inferred_member_value = node.inferred()[0] assert isinstance(inferred_member_value, nodes.Const) assert inferred_member_value.value == "sweet" @pytest.mark.parametrize("annotation", ["bool", "dict", "int", "str"]) def test_enums_type_annotation_no_value(annotation) -> None: """A type-annotated member of an Enum class which has no value where: - `member.value.value` is `None` is not inferred """ node = builder.extract_node( """ from enum import Enum class Veg(Enum): TOMATO: {annotation} 
Veg.TOMATO.value """ ) inferred_member_value = node.inferred()[0] assert inferred_member_value.value is None def test_enums_value2member_map_() -> None: """Check the `_value2member_map_` member is present in an Enum class.""" node = builder.extract_node( """ from enum import Enum class Veg(Enum): TOMATO: 1 Veg """ ) inferred_class = node.inferred()[0] assert "_value2member_map_" in inferred_class.locals @pytest.mark.parametrize("annotation, value", [("int", 42), ("bytes", b"")]) def test_enums_type_annotation_non_str_member(annotation, value) -> None: """A type-annotated member of an Enum class where: - `member.value` is of type `nodes.Const` & - `member.value.value` is not of type `str` is inferred as: `member.value.value` """ node = builder.extract_node( f""" from enum import Enum class Veg(Enum): TOMATO: {annotation} = {value} Veg.TOMATO.value """ ) inferred_member_value = node.inferred()[0] assert isinstance(inferred_member_value, nodes.Const) assert inferred_member_value.value == value @pytest.mark.parametrize( "annotation, value", [ ("dict", {"variety": "beefeater"}), ("list", ["beefeater", "moneymaker"]), ("TypedDict", {"variety": "moneymaker"}), ], ) def test_enums_type_annotations_non_const_member(annotation, value) -> None: """A type-annotated member of an Enum class where: - `member.value` is not of type `nodes.Const` is inferred as: `member.value.as_string()`. 
""" member = builder.extract_node( f""" from enum import Enum class Veg(Enum): TOMATO: {annotation} = {value} Veg.TOMATO.value """ ) inferred_member_value = member.inferred()[0] assert not isinstance(inferred_member_value, nodes.Const) assert inferred_member_value.as_string() == repr(value) def test_metaclass_cannot_infer_call_yields_an_instance() -> None: node = builder.extract_node( """ from undefined import Undefined class Meta(type): __call__ = Undefined class A(metaclass=Meta): pass A() """ ) inferred = next(node.infer()) assert isinstance(inferred, Instance) @pytest.mark.parametrize( "func", [ textwrap.dedent( """ def func(a, b, /, d, e): pass """ ), textwrap.dedent( """ def func(a, b=None, /, d=None, e=None): pass """ ), textwrap.dedent( """ def func(a, other, other, b=None, /, d=None, e=None): pass """ ), textwrap.dedent( """ def func(a, other, other, b=None, /, d=None, e=None, **kwargs): pass """ ), textwrap.dedent( """ def name(p1, p2, /, p_or_kw, *, kw): pass """ ), textwrap.dedent( """ def __init__(self, other=(), /, **kw): pass """ ), textwrap.dedent( """ def __init__(self: int, other: float, /, **kw): pass """ ), ], ) @test_utils.require_version("3.8") def test_posonlyargs_python_38(func): ast_node = builder.extract_node(func) assert ast_node.as_string().strip() == func.strip() @test_utils.require_version("3.8") def test_posonlyargs_default_value() -> None: ast_node = builder.extract_node( """ def func(a, b=1, /, c=2): pass """ ) last_param = ast_node.args.default_value("c") assert isinstance(last_param, nodes.Const) assert last_param.value == 2 first_param = ast_node.args.default_value("b") assert isinstance(first_param, nodes.Const) assert first_param.value == 1 def test_ancestor_with_generic() -> None: # https://github.com/pylint-dev/astroid/issues/942 tree = builder.parse( """ from typing import TypeVar, Generic T = TypeVar("T") class A(Generic[T]): def a_method(self): print("hello") class B(A[T]): pass class C(B[str]): pass """ ) inferred_b = 
next(tree["B"].infer()) assert [cdef.name for cdef in inferred_b.ancestors()] == ["A", "Generic", "object"] inferred_c = next(tree["C"].infer()) assert [cdef.name for cdef in inferred_c.ancestors()] == [ "B", "A", "Generic", "object", ] def test_slots_duplicate_bases_issue_1089() -> None: astroid = builder.parse( """ class First(object, object): #@ pass """ ) with pytest.raises(NotImplementedError): astroid["First"].slots() class TestFrameNodes: @staticmethod def test_frame_node(): """Test if the frame of FunctionDef, ClassDef and Module is correctly set.""" module = builder.parse( """ def func(): var_1 = x return var_1 class MyClass: attribute = 1 def method(): pass VAR = lambda y = (named_expr := "walrus"): print(y) """ ) function = module.body[0] assert function.frame() == function assert function.frame() == function assert function.body[0].frame() == function assert function.body[0].frame() == function class_node = module.body[1] assert class_node.frame() == class_node assert class_node.frame() == class_node assert class_node.body[0].frame() == class_node assert class_node.body[0].frame() == class_node assert class_node.body[1].frame() == class_node.body[1] assert class_node.body[1].frame() == class_node.body[1] lambda_assignment = module.body[2].value assert lambda_assignment.args.args[0].frame() == lambda_assignment assert lambda_assignment.args.args[0].frame() == lambda_assignment assert module.frame() == module assert module.frame() == module @staticmethod def test_non_frame_node(): """Test if the frame of non frame nodes is set correctly.""" module = builder.parse( """ VAR_ONE = 1 VAR_TWO = [x for x in range(1)] """ ) assert module.body[0].frame() == module assert module.body[0].frame() == module assert module.body[1].value.locals["x"][0].frame() == module assert module.body[1].value.locals["x"][0].frame() == module astroid-3.2.2/tests/test_transforms.py0000664000175000017500000002314214622475517020061 0ustar epsilonepsilon# Licensed under the LGPL: 
https://www.gnu.org/licenses/old-licenses/lgpl-2.1.en.html # For details: https://github.com/pylint-dev/astroid/blob/main/LICENSE # Copyright (c) https://github.com/pylint-dev/astroid/blob/main/CONTRIBUTORS.txt from __future__ import annotations import contextlib import sys import time import unittest from collections.abc import Callable, Iterator import pytest from astroid import MANAGER, builder, nodes, parse, transforms from astroid.brain.brain_dataclasses import _looks_like_dataclass_field_call from astroid.const import IS_PYPY from astroid.manager import AstroidManager from astroid.nodes.node_classes import Call, Compare, Const, Name from astroid.nodes.scoped_nodes import FunctionDef, Module from tests.testdata.python3.recursion_error import LONG_CHAINED_METHOD_CALL @contextlib.contextmanager def add_transform( manager: AstroidManager, node: type, transform: Callable, predicate: Callable | None = None, ) -> Iterator: manager.register_transform(node, transform, predicate) try: yield finally: manager.unregister_transform(node, transform, predicate) class TestTransforms(unittest.TestCase): def setUp(self) -> None: self.transformer = transforms.TransformVisitor() def parse_transform(self, code: str) -> Module: module = parse(code, apply_transforms=False) return self.transformer.visit(module) def test_function_inlining_transform(self) -> None: def transform_call(node: Call) -> Const: # Let's do some function inlining inferred = next(node.infer()) return inferred self.transformer.register_transform(nodes.Call, transform_call) module = self.parse_transform( """ def test(): return 42 test() #@ """ ) self.assertIsInstance(module.body[1], nodes.Expr) self.assertIsInstance(module.body[1].value, nodes.Const) self.assertEqual(module.body[1].value.value, 42) def test_recursive_transforms_into_astroid_fields(self) -> None: # Test that the transformer walks properly the tree # by going recursively into the _astroid_fields per each node. 
def transform_compare(node: Compare) -> Const: # Let's check the values of the ops _, right = node.ops[0] # Assume they are Consts and they were transformed before # us. return nodes.const_factory(node.left.value < right.value) def transform_name(node: Name) -> Const: # Should be Consts return next(node.infer()) self.transformer.register_transform(nodes.Compare, transform_compare) self.transformer.register_transform(nodes.Name, transform_name) module = self.parse_transform( """ a = 42 b = 24 a < b """ ) self.assertIsInstance(module.body[2], nodes.Expr) self.assertIsInstance(module.body[2].value, nodes.Const) self.assertFalse(module.body[2].value.value) def test_transform_patches_locals(self) -> None: def transform_function(node: FunctionDef) -> None: assign = nodes.Assign( parent=node, lineno=node.lineno, col_offset=node.col_offset, end_lineno=node.end_lineno, end_col_offset=node.end_col_offset, ) name = nodes.AssignName( name="value", lineno=0, col_offset=0, parent=assign, end_lineno=None, end_col_offset=None, ) assign.targets = [name] assign.value = nodes.const_factory(42) node.body.append(assign) self.transformer.register_transform(nodes.FunctionDef, transform_function) module = self.parse_transform( """ def test(): pass """ ) func = module.body[0] self.assertEqual(len(func.body), 2) self.assertIsInstance(func.body[1], nodes.Assign) self.assertEqual(func.body[1].as_string(), "value = 42") def test_predicates(self) -> None: def transform_call(node: Call) -> Const: inferred = next(node.infer()) return inferred def should_inline(node: Call) -> bool: return node.func.name.startswith("inlineme") self.transformer.register_transform(nodes.Call, transform_call, should_inline) module = self.parse_transform( """ def inlineme_1(): return 24 def dont_inline_me(): return 42 def inlineme_2(): return 2 inlineme_1() dont_inline_me() inlineme_2() """ ) values = module.body[-3:] self.assertIsInstance(values[0], nodes.Expr) self.assertIsInstance(values[0].value, nodes.Const) 
        self.assertEqual(values[0].value.value, 24)
        self.assertIsInstance(values[1], nodes.Expr)
        self.assertIsInstance(values[1].value, nodes.Call)
        self.assertIsInstance(values[2], nodes.Expr)
        self.assertIsInstance(values[2].value, nodes.Const)
        self.assertEqual(values[2].value.value, 2)

    def test_transforms_are_separated(self) -> None:
        # Test that the transforming is done at a separate
        # step, which means that we are not doing inference
        # on a partially constructed tree anymore, which was the
        # source of crashes in the past when certain inference rules
        # were used in a transform.
        def transform_function(node: FunctionDef) -> Const:
            if node.decorators:
                for decorator in node.decorators.nodes:
                    inferred = next(decorator.infer())
                    if inferred.qname() == "abc.abstractmethod":
                        return next(node.infer_call_result(None))
            return None

        manager = MANAGER
        with add_transform(manager, nodes.FunctionDef, transform_function):
            module = builder.parse(
                """
            import abc
            from abc import abstractmethod

            class A(object):
                @abc.abstractmethod
                def ala(self):
                    return 24

                @abstractmethod
                def bala(self):
                    return 42
            """
            )

        cls = module["A"]
        ala = cls.body[0]
        bala = cls.body[1]
        self.assertIsInstance(ala, nodes.Const)
        self.assertEqual(ala.value, 24)
        self.assertIsInstance(bala, nodes.Const)
        self.assertEqual(bala.value, 42)

    def test_transforms_are_called_for_builtin_modules(self) -> None:
        # Test that transforms are called for builtin modules.
        def transform_function(node: FunctionDef) -> FunctionDef:
            name = nodes.AssignName(
                name="value",
                lineno=0,
                col_offset=0,
                parent=node.args,
                end_lineno=None,
                end_col_offset=None,
            )
            node.args.args = [name]
            return node

        manager = MANAGER

        def predicate(node: FunctionDef) -> bool:
            return node.root().name == "time"

        with add_transform(manager, nodes.FunctionDef, transform_function, predicate):
            builder_instance = builder.AstroidBuilder()
            module = builder_instance.module_build(time)

        asctime = module["asctime"]
        self.assertEqual(len(asctime.args.args), 1)
        self.assertIsInstance(asctime.args.args[0], nodes.AssignName)
        self.assertEqual(asctime.args.args[0].name, "value")

    def test_builder_apply_transforms(self) -> None:
        def transform_function(node):
            return nodes.const_factory(42)

        manager = MANAGER
        with add_transform(manager, nodes.FunctionDef, transform_function):
            astroid_builder = builder.AstroidBuilder(apply_transforms=False)
            module = astroid_builder.string_build("""def test(): pass""")

        # The transform wasn't applied.
        self.assertIsInstance(module.body[0], nodes.FunctionDef)

    def test_transform_crashes_on_is_subtype_of(self) -> None:
        # Test that we don't crash when having is_subtype_of
        # in a transform, as per issue #188. This happened
        # before, when the transforms weren't in their own step.
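`module_build(time)` above builds a tree by inspecting the living `time` module, because C extension modules ship no Python source. For pure-Python stdlib modules the source itself can be parsed instead; the sketch below uses `json`, which ships as Python source (the variable names are illustrative):

```python
import ast
import inspect
import json

# json/__init__.py is pure Python, so its source is available to parse.
# A C builtin such as time would raise TypeError in inspect.getsource,
# which is why astroid falls back to object introspection for builtins.
tree = ast.parse(inspect.getsource(json))
top_level_functions = [
    node.name for node in tree.body if isinstance(node, ast.FunctionDef)
]
```

The parsed tree exposes the module's top-level functions the same way `module["asctime"]` looks up a name in the built astroid module above.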
        def transform_class(cls):
            if cls.is_subtype_of("django.db.models.base.Model"):
                return cls
            return cls

        self.transformer.register_transform(nodes.ClassDef, transform_class)

        self.parse_transform(
            """
        # Change environ to automatically call putenv() if it exists
        import os
        putenv = os.putenv
        try:
            # This will fail if there's no putenv
            putenv
        except NameError:
            pass
        else:
            import UserDict
        """
        )

    def test_transform_aborted_if_recursion_limited(self):
        def transform_call(node: Call) -> Const:
            return node

        self.transformer.register_transform(
            nodes.Call, transform_call, _looks_like_dataclass_field_call
        )

        original_limit = sys.getrecursionlimit()
        sys.setrecursionlimit(500 if IS_PYPY else 1000)

        try:
            with pytest.warns(UserWarning) as records:
                self.parse_transform(LONG_CHAINED_METHOD_CALL)
            assert "sys.setrecursionlimit" in records[0].message.args[0]
        finally:
            sys.setrecursionlimit(original_limit)


astroid-3.2.2/tests/test_decorators.py

# Licensed under the LGPL: https://www.gnu.org/licenses/old-licenses/lgpl-2.1.en.html
# For details: https://github.com/pylint-dev/astroid/blob/main/LICENSE
# Copyright (c) https://github.com/pylint-dev/astroid/blob/main/CONTRIBUTORS.txt

import pytest
from _pytest.recwarn import WarningsRecorder

from astroid.decorators import deprecate_default_argument_values


class SomeClass:
    @deprecate_default_argument_values(name="str")
    def __init__(self, name=None, lineno=None): ...

    @deprecate_default_argument_values("3.2", name="str", var="int")
    def func(self, name=None, var=None, type_annotation=None): ...
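`deprecate_default_argument_values` above warns whenever a listed argument is left at `None`. A hypothetical re-implementation of that idea with `functools`, `inspect`, and `warnings` looks like this (the name `deprecate_none_defaults` and the message wording are assumptions; the real decorator's internals may differ):

```python
import functools
import inspect
import warnings


def deprecate_none_defaults(version: str = "3.0", **arguments: str):
    # Hypothetical analogue of deprecate_default_argument_values:
    # `arguments` maps parameter names to their future required types.
    def decorator(func):
        @functools.wraps(func)
        def wrapper(*args, **kwargs):
            bound = inspect.signature(func).bind(*args, **kwargs)
            bound.apply_defaults()
            for name, type_name in arguments.items():
                if bound.arguments.get(name) is None:
                    warnings.warn(
                        f"'{name}' will be a required argument of type "
                        f"'{type_name}' for '{func.__qualname__}' "
                        f"in astroid {version}",
                        DeprecationWarning,
                        stacklevel=2,
                    )
            return func(*args, **kwargs)
        return wrapper
    return decorator


class Demo:
    @deprecate_none_defaults(name="str")
    def __init__(self, name=None, lineno=None):
        self.name = name


with warnings.catch_warnings(record=True) as caught:
    warnings.simplefilter("always")
    Demo()  # 'name' left at None -> one DeprecationWarning
```

Binding the call with `inspect.signature(...).bind(...).apply_defaults()` is what lets the decorator treat `Demo()`, `Demo(None)`, and `Demo(name=None)` uniformly, matching the three cases exercised in the tests.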
class TestDeprecationDecorators:
    @staticmethod
    def test_deprecated_default_argument_values_one_arg() -> None:
        with pytest.warns(DeprecationWarning) as records:
            # No argument passed for 'name'
            SomeClass()
        assert len(records) == 1
        assert "name" in records[0].message.args[0]
        assert "'SomeClass.__init__'" in records[0].message.args[0]

        with pytest.warns(DeprecationWarning) as records:
            # 'None' passed as argument for 'name'
            SomeClass(None)
        assert len(records) == 1
        assert "name" in records[0].message.args[0]

        with pytest.warns(DeprecationWarning) as records:
            # 'None' passed as keyword argument for 'name'
            SomeClass(name=None)
        assert len(records) == 1
        assert "name" in records[0].message.args[0]

        with pytest.warns(DeprecationWarning) as records:
            # No value passed for 'name'
            SomeClass(lineno=42)
        assert len(records) == 1
        assert "name" in records[0].message.args[0]

    @staticmethod
    def test_deprecated_default_argument_values_two_args() -> None:
        instance = SomeClass(name="")

        # No value of 'None' passed for both arguments
        with pytest.warns(DeprecationWarning) as records:
            instance.func()
        assert len(records) == 2
        assert "'SomeClass.func'" in records[0].message.args[0]
        assert "astroid 3.2" in records[0].message.args[0]

        with pytest.warns(DeprecationWarning) as records:
            instance.func(None)
        assert len(records) == 2

        with pytest.warns(DeprecationWarning) as records:
            instance.func(name=None)
        assert len(records) == 2

        with pytest.warns(DeprecationWarning) as records:
            instance.func(var=None)
        assert len(records) == 2

        with pytest.warns(DeprecationWarning) as records:
            instance.func(name=None, var=None)
        assert len(records) == 2

        with pytest.warns(DeprecationWarning) as records:
            instance.func(type_annotation="")
        assert len(records) == 2

        # No value of 'None' for one argument
        with pytest.warns(DeprecationWarning) as records:
            instance.func(42)
        assert len(records) == 1
        assert "var" in records[0].message.args[0]

        with pytest.warns(DeprecationWarning) as records:
            instance.func(name="")
        assert len(records) == 1
        assert "var" in records[0].message.args[0]

        with pytest.warns(DeprecationWarning) as records:
            instance.func(var=42)
        assert len(records) == 1
        assert "name" in records[0].message.args[0]

    @staticmethod
    def test_deprecated_default_argument_values_ok(recwarn: WarningsRecorder) -> None:
        """No DeprecationWarning should be emitted
        if all arguments are passed with not None values.
        """
        instance = SomeClass(name="some_name")
        instance.func(name="", var=42)
        assert len(recwarn) == 0


astroid-3.2.2/tests/test_helpers.py

# Licensed under the LGPL: https://www.gnu.org/licenses/old-licenses/lgpl-2.1.en.html
# For details: https://github.com/pylint-dev/astroid/blob/main/LICENSE
# Copyright (c) https://github.com/pylint-dev/astroid/blob/main/CONTRIBUTORS.txt

import builtins
import unittest

import pytest

from astroid import builder, helpers, manager, nodes, raw_building, util
from astroid.const import IS_PYPY
from astroid.exceptions import _NonDeducibleTypeHierarchy
from astroid.nodes.scoped_nodes import ClassDef


class TestHelpers(unittest.TestCase):
    def setUp(self) -> None:
        builtins_name = builtins.__name__
        astroid_manager = manager.AstroidManager()
        self.builtins = astroid_manager.astroid_cache[builtins_name]
        self.manager = manager.AstroidManager()

    def _extract(self, obj_name: str) -> ClassDef:
        return self.builtins.getattr(obj_name)[0]

    def _build_custom_builtin(self, obj_name: str) -> ClassDef:
        proxy = raw_building.build_class(obj_name)
        proxy.parent = self.builtins
        return proxy

    def assert_classes_equal(self, cls: ClassDef, other: ClassDef) -> None:
        self.assertEqual(cls.name, other.name)
        self.assertEqual(cls.parent, other.parent)
        self.assertEqual(cls.qname(), other.qname())

    def test_object_type(self) -> None:
        pairs = [
            ("1", self._extract("int")),
            ("[]", self._extract("list")),
            ("{1, 2, 3}", self._extract("set")),
            ("{1:2, 4:3}", self._extract("dict")),
            ("type", self._extract("type")),
            ("object",
self._extract("type")), ("object()", self._extract("object")), ("super()", self._extract("super")), ("lambda: None", self._build_custom_builtin("function")), ("len", self._build_custom_builtin("builtin_function_or_method")), ("None", self._build_custom_builtin("NoneType")), ("import sys\nsys#@", self._build_custom_builtin("module")), ] for code, expected in pairs: node = builder.extract_node(code) objtype = helpers.object_type(node) self.assert_classes_equal(objtype, expected) def test_object_type_classes_and_functions(self) -> None: ast_nodes = builder.extract_node( """ def generator(): yield class A(object): def test(self): self #@ @classmethod def cls_method(cls): pass @staticmethod def static_method(): pass A #@ A() #@ A.test #@ A().test #@ A.cls_method #@ A().cls_method #@ A.static_method #@ A().static_method #@ generator() #@ """ ) assert isinstance(ast_nodes, list) from_self = helpers.object_type(ast_nodes[0]) cls = next(ast_nodes[1].infer()) self.assert_classes_equal(from_self, cls) cls_type = helpers.object_type(ast_nodes[1]) self.assert_classes_equal(cls_type, self._extract("type")) instance_type = helpers.object_type(ast_nodes[2]) cls = next(ast_nodes[2].infer())._proxied self.assert_classes_equal(instance_type, cls) expected_method_types = [ (ast_nodes[3], "function"), (ast_nodes[4], "method"), (ast_nodes[5], "method"), (ast_nodes[6], "method"), (ast_nodes[7], "function"), (ast_nodes[8], "function"), (ast_nodes[9], "generator"), ] for node, expected in expected_method_types: node_type = helpers.object_type(node) expected_type = self._build_custom_builtin(expected) self.assert_classes_equal(node_type, expected_type) def test_object_type_metaclasses(self) -> None: module = builder.parse( """ import abc class Meta(metaclass=abc.ABCMeta): pass meta_instance = Meta() """ ) meta_type = helpers.object_type(module["Meta"]) self.assert_classes_equal(meta_type, module["Meta"].metaclass()) meta_instance = next(module["meta_instance"].infer()) instance_type = 
helpers.object_type(meta_instance) self.assert_classes_equal(instance_type, module["Meta"]) def test_object_type_most_derived(self) -> None: node = builder.extract_node( """ class A(type): def __new__(*args, **kwargs): return type.__new__(*args, **kwargs) class B(object): pass class C(object, metaclass=A): pass # The most derived metaclass of D is A rather than type. class D(B , C): #@ pass """ ) assert isinstance(node, nodes.NodeNG) metaclass = node.metaclass() self.assertEqual(metaclass.name, "A") obj_type = helpers.object_type(node) self.assertEqual(metaclass, obj_type) def test_inference_errors(self) -> None: node = builder.extract_node( """ from unknown import Unknown u = Unknown #@ """ ) self.assertEqual(helpers.object_type(node), util.Uninferable) @pytest.mark.skipif(IS_PYPY, reason="__code__ will not be Unknown on PyPy") def test_inference_errors_2(self) -> None: node = builder.extract_node("type(float.__new__.__code__)") self.assertIs(helpers.object_type(node), util.Uninferable) def test_object_type_too_many_types(self) -> None: node = builder.extract_node( """ from unknown import Unknown def test(x): if x: return lambda: None else: return 1 test(Unknown) #@ """ ) self.assertEqual(helpers.object_type(node), util.Uninferable) def test_is_subtype(self) -> None: ast_nodes = builder.extract_node( """ class int_subclass(int): pass class A(object): pass #@ class B(A): pass #@ class C(A): pass #@ int_subclass() #@ """ ) assert isinstance(ast_nodes, list) cls_a = ast_nodes[0] cls_b = ast_nodes[1] cls_c = ast_nodes[2] int_subclass = ast_nodes[3] int_subclass = helpers.object_type(next(int_subclass.infer())) base_int = self._extract("int") self.assertTrue(helpers.is_subtype(int_subclass, base_int)) self.assertTrue(helpers.is_supertype(base_int, int_subclass)) self.assertTrue(helpers.is_supertype(cls_a, cls_b)) self.assertTrue(helpers.is_supertype(cls_a, cls_c)) self.assertTrue(helpers.is_subtype(cls_b, cls_a)) self.assertTrue(helpers.is_subtype(cls_c, cls_a)) 
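`helpers.is_subtype` and `is_supertype`, asserted above, operate on astroid's inferred `ClassDef` nodes. The relationship they test can be sketched at runtime with plain classes and the MRO; this is an illustrative stdlib analogue, not how astroid computes it:

```python
class A:
    pass


class B(A):
    pass


def is_subtype(type1: type, type2: type) -> bool:
    # type2 is an ancestor of type1 -- the runtime counterpart of
    # helpers.is_subtype on ClassDef nodes.
    return type2 in type1.__mro__


def is_supertype(type1: type, type2: type) -> bool:
    # Symmetric helper, mirroring helpers.is_supertype.
    return is_subtype(type2, type1)
```

As in the assertions above, the relation is one-directional: `is_subtype(B, A)` holds while `is_subtype(A, B)` does not. Astroid needs its own implementation because it reasons about classes in parsed source that are never actually created at runtime.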
self.assertFalse(helpers.is_subtype(cls_a, cls_b)) self.assertFalse(helpers.is_subtype(cls_a, cls_b)) def test_is_subtype_supertype_mro_error(self) -> None: cls_e, cls_f = builder.extract_node( """ class A(object): pass class B(A): pass class C(A): pass class D(B, C): pass class E(C, B): pass #@ class F(D, E): pass #@ """ ) self.assertFalse(helpers.is_subtype(cls_e, cls_f)) self.assertFalse(helpers.is_subtype(cls_e, cls_f)) with self.assertRaises(_NonDeducibleTypeHierarchy): helpers.is_subtype(cls_f, cls_e) self.assertFalse(helpers.is_supertype(cls_f, cls_e)) def test_is_subtype_supertype_unknown_bases(self) -> None: cls_a, cls_b = builder.extract_node( """ from unknown import Unknown class A(Unknown): pass #@ class B(A): pass #@ """ ) with self.assertRaises(_NonDeducibleTypeHierarchy): helpers.is_subtype(cls_a, cls_b) with self.assertRaises(_NonDeducibleTypeHierarchy): helpers.is_supertype(cls_a, cls_b) def test_is_subtype_supertype_unrelated_classes(self) -> None: cls_a, cls_b = builder.extract_node( """ class A(object): pass #@ class B(object): pass #@ """ ) self.assertFalse(helpers.is_subtype(cls_a, cls_b)) self.assertFalse(helpers.is_subtype(cls_b, cls_a)) self.assertFalse(helpers.is_supertype(cls_a, cls_b)) self.assertFalse(helpers.is_supertype(cls_b, cls_a)) def test_is_subtype_supertype_classes_no_type_ancestor(self) -> None: cls_a = builder.extract_node( """ class A(object): #@ pass """ ) builtin_type = self._extract("type") self.assertFalse(helpers.is_supertype(builtin_type, cls_a)) self.assertFalse(helpers.is_subtype(cls_a, builtin_type)) def test_is_subtype_supertype_classes_metaclasses(self) -> None: cls_a = builder.extract_node( """ class A(type): #@ pass """ ) builtin_type = self._extract("type") self.assertTrue(helpers.is_supertype(builtin_type, cls_a)) self.assertTrue(helpers.is_subtype(cls_a, builtin_type)) def test_uninferable_for_safe_infer() -> None: uninfer = util.Uninferable assert util.safe_infer(util.Uninferable) == uninfer def 
test_safe_infer_shim() -> None: with pytest.warns(DeprecationWarning) as records: helpers.safe_infer(nodes.Unknown()) assert ( "Import safe_infer from astroid.util; this shim in astroid.helpers will be removed." in records[0].message.args[0] ) astroid-3.2.2/tests/test_modutils.py0000664000175000017500000005472414622475517017535 0ustar epsilonepsilon# Licensed under the LGPL: https://www.gnu.org/licenses/old-licenses/lgpl-2.1.en.html # For details: https://github.com/pylint-dev/astroid/blob/main/LICENSE # Copyright (c) https://github.com/pylint-dev/astroid/blob/main/CONTRIBUTORS.txt """Unit tests for module modutils (module manipulation utilities).""" import email import logging import os import shutil import sys import tempfile import unittest import xml from pathlib import Path from xml import etree from xml.etree import ElementTree import pytest from pytest import CaptureFixture, LogCaptureFixture import astroid from astroid import modutils from astroid.const import PY310_PLUS from astroid.interpreter._import import spec from . 
import resources try: import urllib3 # type: ignore[import] HAS_URLLIB3_V1 = urllib3.__version__.startswith("1") except ImportError: HAS_URLLIB3_V1 = False def _get_file_from_object(obj) -> str: return modutils._path_from_filename(obj.__file__) class ModuleFileTest(unittest.TestCase): package = "mypypa" def tearDown(self) -> None: astroid.MANAGER.clear_cache() for k in list(sys.path_importer_cache): if "MyPyPa" in k: del sys.path_importer_cache[k] def test_find_zipped_module(self) -> None: found_spec = spec.find_spec( [self.package], [resources.find("data/MyPyPa-0.1.0-py2.5.zip")] ) self.assertEqual(found_spec.type, spec.ModuleType.PY_ZIPMODULE) self.assertEqual( found_spec.location.split(os.sep)[-3:], ["data", "MyPyPa-0.1.0-py2.5.zip", self.package], ) def test_find_egg_module(self) -> None: found_spec = spec.find_spec( [self.package], [resources.find("data/MyPyPa-0.1.0-py2.5.egg")] ) self.assertEqual(found_spec.type, spec.ModuleType.PY_ZIPMODULE) self.assertEqual( found_spec.location.split(os.sep)[-3:], ["data", "MyPyPa-0.1.0-py2.5.egg", self.package], ) class LoadModuleFromNameTest(unittest.TestCase): """Load a python module from its name.""" def test_known_values_load_module_from_name_1(self) -> None: self.assertEqual(modutils.load_module_from_name("sys"), sys) def test_known_values_load_module_from_name_2(self) -> None: self.assertEqual(modutils.load_module_from_name("os.path"), os.path) def test_raise_load_module_from_name_1(self) -> None: self.assertRaises( ImportError, modutils.load_module_from_name, "_this_module_does_not_exist_" ) def test_import_dotted_library( capsys: CaptureFixture, caplog: LogCaptureFixture, ) -> None: caplog.set_level(logging.INFO) original_module = sys.modules.pop("xml.etree.ElementTree") expected_out = "INFO (TEST): Welcome to cElementTree!" 
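`test_import_dotted_library` above pops `xml.etree.ElementTree` from `sys.modules` and observes what a re-import does. The observation technique, swapping `importlib.import_module` for a recording stand-in via `unittest.mock.patch`, looks like this in isolation (`fake_import` and `calls` are hypothetical names):

```python
import importlib
import unittest.mock

calls = []


def fake_import(name, package=None):
    # Recording stand-in for the real import machinery; the patch
    # context manager restores importlib.import_module on exit.
    calls.append(name)
    return None


with unittest.mock.patch("importlib.import_module", side_effect=fake_import):
    importlib.import_module("xml.etree.ElementTree")
```

Because the patch target is the attribute looked up at call time, the call inside the `with` block reaches `fake_import`, and any output or logging it produces can then be asserted on, as the test above does with `capsys` and `caplog`.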
expected_err = "WARNING (TEST): Monkey-patched version of cElementTree" def function_with_stdout_and_stderr(expected_out, expected_err): def mocked_function(*args, **kwargs): print(f"{expected_out} args={args} kwargs={kwargs}") print(expected_err, file=sys.stderr) return mocked_function try: with unittest.mock.patch( "importlib.import_module", side_effect=function_with_stdout_and_stderr(expected_out, expected_err), ): modutils.load_module_from_name("xml.etree.ElementTree") out, err = capsys.readouterr() assert expected_out in caplog.text assert expected_err in caplog.text assert not out assert not err finally: sys.modules["xml.etree.ElementTree"] = original_module class GetModulePartTest(unittest.TestCase): """Given a dotted name return the module part of the name.""" def test_known_values_get_module_part_1(self) -> None: self.assertEqual( modutils.get_module_part("astroid.modutils"), "astroid.modutils" ) def test_known_values_get_module_part_2(self) -> None: self.assertEqual( modutils.get_module_part("astroid.modutils.get_module_part"), "astroid.modutils", ) def test_known_values_get_module_part_3(self) -> None: """Relative import from given file.""" self.assertEqual( modutils.get_module_part("nodes.node_classes.AssName", modutils.__file__), "nodes.node_classes", ) def test_known_values_get_compiled_module_part(self) -> None: self.assertEqual(modutils.get_module_part("math.log10"), "math") self.assertEqual(modutils.get_module_part("math.log10", __file__), "math") def test_known_values_get_builtin_module_part(self) -> None: self.assertEqual(modutils.get_module_part("sys.path"), "sys") self.assertEqual(modutils.get_module_part("sys.path", "__file__"), "sys") def test_get_module_part_exception(self) -> None: self.assertRaises( ImportError, modutils.get_module_part, "unknown.module", modutils.__file__ ) def test_get_module_part_only_dot(self) -> None: self.assertEqual(modutils.get_module_part(".", modutils.__file__), ".") class ModPathFromFileTest(unittest.TestCase): 
"""Given an absolute file path return the python module's path as a list.""" def test_known_values_modpath_from_file_1(self) -> None: self.assertEqual( modutils.modpath_from_file(ElementTree.__file__), ["xml", "etree", "ElementTree"], ) def test_raise_modpath_from_file_exception(self) -> None: self.assertRaises(Exception, modutils.modpath_from_file, "/turlututu") def test_import_symlink_with_source_outside_of_path(self) -> None: with tempfile.NamedTemporaryFile() as tmpfile: linked_file_name = "symlinked_file.py" try: os.symlink(tmpfile.name, linked_file_name) self.assertEqual( modutils.modpath_from_file(linked_file_name), ["symlinked_file"] ) finally: os.remove(linked_file_name) def test_import_symlink_both_outside_of_path(self) -> None: with tempfile.NamedTemporaryFile() as tmpfile: linked_file_name = os.path.join(tempfile.gettempdir(), "symlinked_file.py") try: os.symlink(tmpfile.name, linked_file_name) self.assertRaises( ImportError, modutils.modpath_from_file, linked_file_name ) finally: os.remove(linked_file_name) def test_load_from_module_symlink_on_symlinked_paths_in_syspath(self) -> None: # constants tmp = tempfile.gettempdir() deployment_path = os.path.join(tmp, "deployment") path_to_include = os.path.join(tmp, "path_to_include") real_secret_path = os.path.join(tmp, "secret.py") symlink_secret_path = os.path.join(path_to_include, "secret.py") # setup double symlink # /tmp/deployment # /tmp/path_to_include (symlink to /tmp/deployment) # /tmp/secret.py # /tmp/deployment/secret.py (points to /tmp/secret.py) try: os.mkdir(deployment_path) self.addCleanup(shutil.rmtree, deployment_path) os.symlink(deployment_path, path_to_include) self.addCleanup(os.remove, path_to_include) except OSError: pass with open(real_secret_path, "w", encoding="utf-8"): pass os.symlink(real_secret_path, symlink_secret_path) self.addCleanup(os.remove, real_secret_path) # add the symlinked path to sys.path sys.path.append(path_to_include) self.addCleanup(sys.path.pop) # this should be 
equivalent to: import secret self.assertEqual(modutils.modpath_from_file(symlink_secret_path), ["secret"]) def test_load_packages_without_init(self) -> None: """Test that we correctly find packages with an __init__.py file. Regression test for issue reported in: https://github.com/pylint-dev/astroid/issues/1327 """ tmp_dir = Path(tempfile.gettempdir()) self.addCleanup(os.chdir, os.getcwd()) os.chdir(tmp_dir) self.addCleanup(shutil.rmtree, tmp_dir / "src") os.mkdir(tmp_dir / "src") os.mkdir(tmp_dir / "src" / "package") with open(tmp_dir / "src" / "__init__.py", "w", encoding="utf-8"): pass with open(tmp_dir / "src" / "package" / "file.py", "w", encoding="utf-8"): pass # this should be equivalent to: import secret self.assertEqual( modutils.modpath_from_file(str(Path("src") / "package"), ["."]), ["src", "package"], ) class LoadModuleFromPathTest(resources.SysPathSetup, unittest.TestCase): def test_do_not_load_twice(self) -> None: modutils.load_module_from_modpath(["data", "lmfp", "foo"]) modutils.load_module_from_modpath(["data", "lmfp"]) # pylint: disable=no-member; just-once is added by a test file dynamically. self.assertEqual(len(sys.just_once), 1) del sys.just_once class FileFromModPathTest(resources.SysPathSetup, unittest.TestCase): """given a mod path (i.e. 
splited module / package name), return the corresponding file, giving priority to source file over precompiled file if it exists""" def test_site_packages(self) -> None: filename = _get_file_from_object(modutils) result = modutils.file_from_modpath(["astroid", "modutils"]) self.assertEqual(os.path.realpath(result), os.path.realpath(filename)) def test_std_lib(self) -> None: path = modutils.file_from_modpath(["os", "path"]).replace(".pyc", ".py") self.assertEqual( os.path.realpath(path), os.path.realpath(os.path.__file__.replace(".pyc", ".py")), ) def test_builtin(self) -> None: self.assertIsNone(modutils.file_from_modpath(["sys"])) def test_unexisting(self) -> None: self.assertRaises(ImportError, modutils.file_from_modpath, ["turlututu"]) def test_unicode_in_package_init(self) -> None: # file_from_modpath should not crash when reading an __init__ # file with unicode characters. modutils.file_from_modpath(["data", "unicode_package", "core"]) class GetSourceFileTest(unittest.TestCase): def test(self) -> None: filename = _get_file_from_object(os.path) self.assertEqual( modutils.get_source_file(os.path.__file__), os.path.normpath(filename) ) def test_raise(self) -> None: self.assertRaises(modutils.NoSourceFile, modutils.get_source_file, "whatever") def test_pyi(self) -> None: package = resources.find("pyi_data") module = os.path.join(package, "__init__.pyi") self.assertEqual(modutils.get_source_file(module), os.path.normpath(module)) def test_pyi_preferred(self) -> None: package = resources.find("pyi_data/find_test") module = os.path.join(package, "__init__.py") self.assertEqual( modutils.get_source_file(module, prefer_stubs=True), os.path.normpath(module) + "i", ) class IsStandardModuleTest(resources.SysPathSetup, unittest.TestCase): """ Return true if the module may be considered as a module from the standard library. 
""" def test_datetime(self) -> None: # This is an interesting example, since datetime, on pypy, # is under lib_pypy, rather than the usual Lib directory. with pytest.warns(DeprecationWarning): assert modutils.is_standard_module("datetime") def test_builtins(self) -> None: with pytest.warns(DeprecationWarning): assert not modutils.is_standard_module("__builtin__") with pytest.warns(DeprecationWarning): assert modutils.is_standard_module("builtins") def test_builtin(self) -> None: with pytest.warns(DeprecationWarning): assert modutils.is_standard_module("sys") with pytest.warns(DeprecationWarning): assert modutils.is_standard_module("marshal") def test_nonstandard(self) -> None: with pytest.warns(DeprecationWarning): assert not modutils.is_standard_module("astroid") def test_unknown(self) -> None: with pytest.warns(DeprecationWarning): assert not modutils.is_standard_module("unknown") def test_4(self) -> None: with pytest.warns(DeprecationWarning): assert modutils.is_standard_module("hashlib") with pytest.warns(DeprecationWarning): assert modutils.is_standard_module("pickle") with pytest.warns(DeprecationWarning): assert modutils.is_standard_module("email") with pytest.warns(DeprecationWarning): assert modutils.is_standard_module("io") with pytest.warns(DeprecationWarning): assert not modutils.is_standard_module("StringIO") with pytest.warns(DeprecationWarning): assert modutils.is_standard_module("unicodedata") def test_custom_path(self) -> None: datadir = resources.find("") if any(datadir.startswith(p) for p in modutils.EXT_LIB_DIRS): self.skipTest("known breakage of is_standard_module on installed package") with pytest.warns(DeprecationWarning): assert modutils.is_standard_module("data.module", (datadir,)) with pytest.warns(DeprecationWarning): assert modutils.is_standard_module( "data.module", (os.path.abspath(datadir),) ) # "" will evaluate to cwd with pytest.warns(DeprecationWarning): assert modutils.is_standard_module("data.module", ("",)) def 
test_failing_edge_cases(self) -> None: # using a subpackage/submodule path as std_path argument with pytest.warns(DeprecationWarning): assert not modutils.is_standard_module("xml.etree", etree.__path__) # using a module + object name as modname argument with pytest.warns(DeprecationWarning): assert modutils.is_standard_module("sys.path") # this is because only the first package/module is considered with pytest.warns(DeprecationWarning): assert modutils.is_standard_module("sys.whatever") with pytest.warns(DeprecationWarning): assert not modutils.is_standard_module("xml.whatever", etree.__path__) class IsStdLibModuleTest(resources.SysPathSetup, unittest.TestCase): """ Return true if the module is path of the standard library """ def test_datetime(self) -> None: # This is an interesting example, since datetime, on pypy, # is under lib_pypy, rather than the usual Lib directory. assert modutils.is_stdlib_module("datetime") def test_builtins(self) -> None: assert not modutils.is_stdlib_module("__builtin__") assert modutils.is_stdlib_module("builtins") def test_builtin(self) -> None: assert modutils.is_stdlib_module("sys") assert modutils.is_stdlib_module("marshal") def test_nonstandard(self) -> None: assert not modutils.is_stdlib_module("astroid") def test_unknown(self) -> None: assert not modutils.is_stdlib_module("unknown") def test_4(self) -> None: assert modutils.is_stdlib_module("hashlib") assert modutils.is_stdlib_module("pickle") assert modutils.is_stdlib_module("email") assert modutils.is_stdlib_module("io") assert not modutils.is_stdlib_module("StringIO") assert modutils.is_stdlib_module("unicodedata") def test_subpackages(self) -> None: # using a module + object name as modname argument assert modutils.is_stdlib_module("sys.path") # this is because only the first package/module is considered assert modutils.is_stdlib_module("sys.whatever") def test_platform_specific(self) -> None: assert modutils.is_stdlib_module("_curses") assert 
modutils.is_stdlib_module("msvcrt") assert modutils.is_stdlib_module("termios") class ModuleInPathTest(resources.SysPathSetup, unittest.TestCase): """ Return true if the module is imported from the specified path """ def test_success(self) -> None: datadir = resources.find("") assert modutils.module_in_path("data.module", datadir) assert modutils.module_in_path("data.module", (datadir,)) assert modutils.module_in_path("data.module", os.path.abspath(datadir)) assert modutils.module_in_path("pyi_data.module", datadir) assert modutils.module_in_path("pyi_data.module", (datadir,)) assert modutils.module_in_path("pyi_data.module", os.path.abspath(datadir)) # "" will evaluate to cwd assert modutils.module_in_path("data.module", "") assert modutils.module_in_path("pyi_data.module", "") def test_bad_import(self) -> None: datadir = resources.find("") assert not modutils.module_in_path("this_module_is_no_more", datadir) def test_no_filename(self) -> None: datadir = resources.find("") assert not modutils.module_in_path("sys", datadir) def test_failure(self) -> None: datadir = resources.find("") assert not modutils.module_in_path("etree", datadir) assert not modutils.module_in_path("astroid", datadir) class BackportStdlibNamesTest(resources.SysPathSetup, unittest.TestCase): """ Verify backport raises exception on newer versions """ @pytest.mark.skipif(not PY310_PLUS, reason="Backport valid on <=3.9") def test_import_error(self) -> None: with pytest.raises(AssertionError): # pylint: disable-next=import-outside-toplevel, unused-import from astroid import _backport_stdlib_names # noqa class IsRelativeTest(unittest.TestCase): def test_known_values_is_relative_1(self) -> None: self.assertTrue(modutils.is_relative("utils", email.__path__[0])) def test_known_values_is_relative_3(self) -> None: self.assertFalse(modutils.is_relative("astroid", astroid.__path__[0])) def test_known_values_is_relative_4(self) -> None: self.assertTrue( modutils.is_relative("util", 
astroid.interpreter._import.spec.__file__) ) def test_known_values_is_relative_5(self) -> None: self.assertFalse( modutils.is_relative( "objectmodel", astroid.interpreter._import.spec.__file__ ) ) def test_deep_relative(self) -> None: self.assertTrue(modutils.is_relative("ElementTree", xml.etree.__path__[0])) def test_deep_relative2(self) -> None: self.assertFalse(modutils.is_relative("ElementTree", xml.__path__[0])) def test_deep_relative3(self) -> None: self.assertTrue(modutils.is_relative("etree.ElementTree", xml.__path__[0])) def test_deep_relative4(self) -> None: self.assertTrue(modutils.is_relative("etree.gibberish", xml.__path__[0])) def test_is_relative_bad_path(self) -> None: self.assertFalse( modutils.is_relative("ElementTree", os.path.join(xml.__path__[0], "ftree")) ) class GetModuleFilesTest(unittest.TestCase): def test_get_module_files_1(self) -> None: package = resources.find("data/find_test") modules = set(modutils.get_module_files(package, [])) expected = [ "__init__.py", "module.py", "module2.py", "noendingnewline.py", "nonregr.py", ] self.assertEqual(modules, {os.path.join(package, x) for x in expected}) def test_get_module_files_2(self) -> None: package = resources.find("pyi_data/find_test") modules = set(modutils.get_module_files(package, [])) expected = [ "__init__.py", "__init__.pyi", "module.py", "module2.py", "noendingnewline.py", "nonregr.py", ] self.assertEqual(modules, {os.path.join(package, x) for x in expected}) def test_get_all_files(self) -> None: """Test that list_all returns all Python files from given location.""" non_package = resources.find("data/notamodule") modules = modutils.get_module_files(non_package, [], list_all=True) self.assertEqual(modules, [os.path.join(non_package, "file.py")]) def test_load_module_set_attribute(self) -> None: del xml.etree.ElementTree del sys.modules["xml.etree.ElementTree"] m = modutils.load_module_from_modpath(["xml", "etree", "ElementTree"]) self.assertTrue(hasattr(xml, "etree")) 
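The `is_stdlib_module` tests above show that only the first dotted component matters (`"sys.whatever"` counts as stdlib). On Python 3.10+ that check can be sketched with `sys.stdlib_module_names`; this is a simplification of what `modutils` actually does, which also handles older interpreters via a backport:

```python
import sys


def is_stdlib_module(modname: str) -> bool:
    # Only the top-level package decides membership, so "sys.whatever"
    # is reported as stdlib just like in the tests above.
    top_level = modname.partition(".")[0]
    # Fall back to an empty set on interpreters without the attribute
    # (added in Python 3.10), so the function degrades to always-False.
    return top_level in getattr(sys, "stdlib_module_names", frozenset())


if hasattr(sys, "stdlib_module_names"):
    assert is_stdlib_module("datetime")
    assert is_stdlib_module("sys.whatever")
    assert not is_stdlib_module("astroid")
```

Keying off the top-level name is also why platform-specific names like `msvcrt` and `termios` are all reported as stdlib regardless of whether they import on the current platform.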
        self.assertTrue(hasattr(xml.etree, "ElementTree"))
        self.assertTrue(m is xml.etree.ElementTree)


class ExtensionPackageWhitelistTest(unittest.TestCase):
    def test_is_module_name_part_of_extension_package_whitelist_true(self) -> None:
        self.assertTrue(
            modutils.is_module_name_part_of_extension_package_whitelist(
                "numpy", {"numpy"}
            )
        )
        self.assertTrue(
            modutils.is_module_name_part_of_extension_package_whitelist(
                "numpy.core", {"numpy"}
            )
        )
        self.assertTrue(
            modutils.is_module_name_part_of_extension_package_whitelist(
                "numpy.core.umath", {"numpy"}
            )
        )

    def test_is_module_name_part_of_extension_package_whitelist_false(self) -> None:
        self.assertFalse(
            modutils.is_module_name_part_of_extension_package_whitelist(
                "numpy", {"numpy.core"}
            )
        )
        self.assertFalse(
            modutils.is_module_name_part_of_extension_package_whitelist(
                "numpy.core", {"numpy.core.umath"}
            )
        )
        self.assertFalse(
            modutils.is_module_name_part_of_extension_package_whitelist(
                "core.umath", {"numpy"}
            )
        )


@pytest.mark.skipif(not HAS_URLLIB3_V1, reason="This test requires urllib3 < 2.")
def test_file_info_from_modpath__SixMetaPathImporter() -> None:
    """Six is no longer vendored in urllib3 v2.0.0+."""
    assert modutils.file_info_from_modpath(["urllib3.packages.six.moves.http_client"])


def test_find_setuptools_pep660_editable_install():
    """Find the spec for a package installed via setuptools PEP 660 import hooks."""
    # pylint: disable-next=import-outside-toplevel
    from tests.testdata.python3.data.import_setuptools_pep660.__editable___example_0_1_0_finder import (
        _EditableFinder,
    )

    with unittest.mock.patch.object(sys, "meta_path", new=[_EditableFinder]):
        assert spec.find_spec(["example"])
        assert spec.find_spec(["example", "subpackage"])
astroid-3.2.2/tests/test_nodes.py0000664000175000017500000020113014622475517016766 0ustar epsilonepsilon# Licensed under the LGPL: https://www.gnu.org/licenses/old-licenses/lgpl-2.1.en.html
# For details: https://github.com/pylint-dev/astroid/blob/main/LICENSE
# Copyright (c)
https://github.com/pylint-dev/astroid/blob/main/CONTRIBUTORS.txt """Tests for specific behaviour of astroid nodes.""" from __future__ import annotations import copy import inspect import os import random import sys import textwrap import unittest from typing import Any import pytest import astroid from astroid import ( Uninferable, bases, builder, extract_node, nodes, parse, test_utils, transforms, util, ) from astroid.const import IS_PYPY, PY310_PLUS, PY312_PLUS, Context from astroid.context import InferenceContext from astroid.exceptions import ( AstroidBuildingError, AstroidSyntaxError, AttributeInferenceError, ParentMissingError, StatementMissing, ) from astroid.nodes.node_classes import ( AssignAttr, AssignName, Attribute, Call, ImportFrom, Tuple, ) from astroid.nodes.scoped_nodes import ClassDef, FunctionDef, GeneratorExp, Module from tests.testdata.python3.recursion_error import LONG_CHAINED_METHOD_CALL from . import resources abuilder = builder.AstroidBuilder() class AsStringTest(resources.SysPathSetup, unittest.TestCase): def test_tuple_as_string(self) -> None: def build(string: str) -> Tuple: return abuilder.string_build(string).body[0].value self.assertEqual(build("1,").as_string(), "(1, )") self.assertEqual(build("1, 2, 3").as_string(), "(1, 2, 3)") self.assertEqual(build("(1, )").as_string(), "(1, )") self.assertEqual(build("1, 2, 3").as_string(), "(1, 2, 3)") def test_func_signature_issue_185(self) -> None: code = textwrap.dedent( """ def test(a, b, c=42, *, x=42, **kwargs): print(a, b, c, args) """ ) node = parse(code) self.assertEqual(node.as_string().strip(), code.strip()) def test_as_string_for_list_containing_uninferable(self) -> None: node = builder.extract_node( """ def foo(): bar = [arg] * 1 """ ) binop = node.body[0].value inferred = next(binop.infer()) self.assertEqual(inferred.as_string(), "[Uninferable]") self.assertEqual(binop.as_string(), "[arg] * 1") def test_frozenset_as_string(self) -> None: ast_nodes = builder.extract_node( """ 
frozenset((1, 2, 3)) #@ frozenset({1, 2, 3}) #@ frozenset([1, 2, 3,]) #@ frozenset(None) #@ frozenset(1) #@ """ ) ast_nodes = [next(node.infer()) for node in ast_nodes] assert isinstance(ast_nodes, list) self.assertEqual(ast_nodes[0].as_string(), "frozenset((1, 2, 3))") self.assertEqual(ast_nodes[1].as_string(), "frozenset({1, 2, 3})") self.assertEqual(ast_nodes[2].as_string(), "frozenset([1, 2, 3])") self.assertNotEqual(ast_nodes[3].as_string(), "frozenset(None)") self.assertNotEqual(ast_nodes[4].as_string(), "frozenset(1)") def test_varargs_kwargs_as_string(self) -> None: ast = abuilder.string_build("raise_string(*args, **kwargs)").body[0] self.assertEqual(ast.as_string(), "raise_string(*args, **kwargs)") def test_module_as_string(self) -> None: """Check as_string on a whole module prepared to be returned identically.""" module = resources.build_file("data/module.py", "data.module") with open(resources.find("data/module.py"), encoding="utf-8") as fobj: self.assertMultiLineEqual(module.as_string(), fobj.read()) def test_module2_as_string(self) -> None: """Check as_string on a whole module prepared to be returned identically.""" module2 = resources.build_file("data/module2.py", "data.module2") with open(resources.find("data/module2.py"), encoding="utf-8") as fobj: self.assertMultiLineEqual(module2.as_string(), fobj.read()) def test_as_string(self) -> None: """Check as_string for python syntax >= 2.7.""" code = """one_two = {1, 2} b = {v: k for (k, v) in enumerate('string')} cdd = {k for k in b}\n\n""" ast = abuilder.string_build(code) self.assertMultiLineEqual(ast.as_string(), code) def test_3k_as_string(self) -> None: """Check as_string for python 3k syntax.""" code = """print() def function(var): nonlocal counter try: hello except NameError as nexc: (*hell, o) = b'hello' raise AttributeError from nexc \n""" ast = abuilder.string_build(code) self.assertEqual(ast.as_string(), code) def test_3k_annotations_and_metaclass(self) -> None: code = ''' def function(var: 
int): nonlocal counter class Language(metaclass=Natural): """natural language""" ''' code_annotations = textwrap.dedent(code) expected = '''\ def function(var: int): nonlocal counter class Language(metaclass=Natural): """natural language"""''' ast = abuilder.string_build(code_annotations) self.assertEqual(ast.as_string().strip(), expected) def test_ellipsis(self) -> None: ast = abuilder.string_build("a[...]").body[0] self.assertEqual(ast.as_string(), "a[...]") def test_slices(self) -> None: for code in ( "a[0]", "a[1:3]", "a[:-1:step]", "a[:, newaxis]", "a[newaxis, :]", "del L[::2]", "del A[1]", "del Br[:]", ): ast = abuilder.string_build(code).body[0] self.assertEqual(ast.as_string(), code) def test_slice_and_subscripts(self) -> None: code = """a[:1] = bord[2:] a[:1] = bord[2:] del bree[3:d] bord[2:] del av[d::f], a[df:] a[:1] = bord[2:] del SRC[::1, newaxis, 1:] tous[vals] = 1010 del thousand[key] del a[::2], a[:-1:step] del Fee.form[left:] aout.vals = miles.of_stuff del (ccok, (name.thing, foo.attrib.value)), Fee.form[left:] if all[1] == bord[0:]: pass\n\n""" ast = abuilder.string_build(code) self.assertEqual(ast.as_string(), code) def test_int_attribute(self) -> None: code = """ x = (-3).real y = (3).imag """ ast = abuilder.string_build(code) self.assertEqual(ast.as_string().strip(), code.strip()) def test_operator_precedence(self) -> None: with open(resources.find("data/operator_precedence.py"), encoding="utf-8") as f: for code in f: self.check_as_string_ast_equality(code) @staticmethod def check_as_string_ast_equality(code: str) -> None: """ Check that as_string produces source code with exactly the same semantics as the source it was originally parsed from. 
""" pre = builder.parse(code) post = builder.parse(pre.as_string()) pre_repr = pre.repr_tree() post_repr = post.repr_tree() assert pre_repr == post_repr assert pre.as_string().strip() == code.strip() def test_class_def(self) -> None: code = """ import abc from typing import Tuple class A: pass class B(metaclass=A, x=1): pass class C(B): pass class D(metaclass=abc.ABCMeta): pass def func(param: Tuple): pass """ ast = abuilder.string_build(code) self.assertEqual(ast.as_string().strip(), code.strip()) def test_f_strings(self): code = r''' a = f"{'a'}" b = f'{{b}}' c = f""" "{'c'}" """ d = f'{d!r} {d!s} {d!a}' e = f'{e:.3}' f = f'{f:{x}.{y}}' n = f'\n' everything = f""" " \' \r \t \\ {{ }} {'x' + x!r:a} {["'"]!s:{a}}""" ''' ast = abuilder.string_build(code) self.assertEqual(ast.as_string().strip(), code.strip()) @staticmethod def test_as_string_unknown() -> None: assert nodes.Unknown().as_string() == "Unknown.Unknown()" assert nodes.Unknown(lineno=1, col_offset=0).as_string() == "Unknown.Unknown()" @staticmethod @pytest.mark.skipif( IS_PYPY, reason="Test requires manipulating the recursion limit, which cannot " "be undone in a finally block without polluting other tests on PyPy.", ) def test_recursion_error_trapped() -> None: with pytest.warns(UserWarning, match="unable to transform"): ast = abuilder.string_build(LONG_CHAINED_METHOD_CALL) attribute = ast.body[1].value.func with pytest.raises(UserWarning): attribute.as_string() @pytest.mark.skipif(not PY312_PLUS, reason="Uses 3.12 type param nodes") class AsStringTypeParamNodes(unittest.TestCase): @staticmethod def test_as_string_type_alias() -> None: ast = abuilder.string_build("type Point = tuple[float, float]") type_alias = ast.body[0] assert type_alias.as_string().strip() == "Point" @staticmethod def test_as_string_type_var() -> None: ast = abuilder.string_build("type Point[T] = tuple[float, float]") type_var = ast.body[0].type_params[0] assert type_var.as_string().strip() == "T" @staticmethod def 
test_as_string_type_var_tuple() -> None: ast = abuilder.string_build("type Alias[*Ts] = tuple[*Ts]") type_var_tuple = ast.body[0].type_params[0] assert type_var_tuple.as_string().strip() == "*Ts" @staticmethod def test_as_string_param_spec() -> None: ast = abuilder.string_build("type Alias[**P] = Callable[P, int]") param_spec = ast.body[0].type_params[0] assert param_spec.as_string().strip() == "P" class _NodeTest(unittest.TestCase): """Test transformation of If Node.""" CODE = "" @property def astroid(self) -> Module: try: return self.__class__.__dict__["CODE_Astroid"] except KeyError: module = builder.parse(self.CODE) self.__class__.CODE_Astroid = module return module class IfNodeTest(_NodeTest): """Test transformation of If Node.""" CODE = """ if 0: print() if True: print() else: pass if "": print() elif []: raise if 1: print() elif True: print() elif func(): pass else: raise """ def test_if_elif_else_node(self) -> None: """Test transformation for If node.""" self.assertEqual(len(self.astroid.body), 4) for stmt in self.astroid.body: self.assertIsInstance(stmt, nodes.If) self.assertFalse(self.astroid.body[0].orelse) # simple If self.assertIsInstance(self.astroid.body[1].orelse[0], nodes.Pass) # If / else self.assertIsInstance(self.astroid.body[2].orelse[0], nodes.If) # If / elif self.assertIsInstance(self.astroid.body[3].orelse[0].orelse[0], nodes.If) def test_block_range(self) -> None: # XXX ensure expected values self.assertEqual(self.astroid.block_range(1), (0, 22)) self.assertEqual(self.astroid.block_range(10), (0, 22)) # XXX (10, 22) ? 
self.assertEqual(self.astroid.body[1].block_range(5), (5, 6)) self.assertEqual(self.astroid.body[1].block_range(6), (6, 6)) self.assertEqual(self.astroid.body[1].orelse[0].block_range(7), (7, 8)) self.assertEqual(self.astroid.body[1].orelse[0].block_range(8), (8, 8)) class TryNodeTest(_NodeTest): CODE = """ try: # L2 print("Hello") except IOError: pass except UnicodeError: pass else: print() finally: print() """ def test_block_range(self) -> None: try_node = self.astroid.body[0] assert try_node.block_range(1) == (1, 11) assert try_node.block_range(2) == (2, 2) assert try_node.block_range(3) == (3, 3) assert try_node.block_range(4) == (4, 4) assert try_node.block_range(5) == (5, 5) assert try_node.block_range(6) == (6, 6) assert try_node.block_range(7) == (7, 7) assert try_node.block_range(8) == (8, 8) assert try_node.block_range(9) == (9, 9) assert try_node.block_range(10) == (10, 10) assert try_node.block_range(11) == (11, 11) class TryExceptNodeTest(_NodeTest): CODE = """ try: print ('pouet') except IOError: pass except UnicodeError: print() else: print() """ def test_block_range(self) -> None: # XXX ensure expected values self.assertEqual(self.astroid.body[0].block_range(1), (1, 9)) self.assertEqual(self.astroid.body[0].block_range(2), (2, 2)) self.assertEqual(self.astroid.body[0].block_range(3), (3, 3)) self.assertEqual(self.astroid.body[0].block_range(4), (4, 4)) self.assertEqual(self.astroid.body[0].block_range(5), (5, 5)) self.assertEqual(self.astroid.body[0].block_range(6), (6, 6)) self.assertEqual(self.astroid.body[0].block_range(7), (7, 7)) self.assertEqual(self.astroid.body[0].block_range(8), (8, 8)) self.assertEqual(self.astroid.body[0].block_range(9), (9, 9)) class TryFinallyNodeTest(_NodeTest): CODE = """ try: print ('pouet') finally: print ('pouet') """ def test_block_range(self) -> None: # XXX ensure expected values self.assertEqual(self.astroid.body[0].block_range(1), (1, 5)) self.assertEqual(self.astroid.body[0].block_range(2), (2, 2)) 
self.assertEqual(self.astroid.body[0].block_range(3), (3, 3)) self.assertEqual(self.astroid.body[0].block_range(4), (4, 4)) self.assertEqual(self.astroid.body[0].block_range(5), (5, 5)) class TryExceptFinallyNodeTest(_NodeTest): CODE = """ try: print('pouet') except Exception: print ('oops') finally: print ('pouet') """ def test_block_range(self) -> None: # XXX ensure expected values self.assertEqual(self.astroid.body[0].block_range(1), (1, 7)) self.assertEqual(self.astroid.body[0].block_range(2), (2, 2)) self.assertEqual(self.astroid.body[0].block_range(3), (3, 3)) self.assertEqual(self.astroid.body[0].block_range(4), (4, 4)) self.assertEqual(self.astroid.body[0].block_range(5), (5, 5)) self.assertEqual(self.astroid.body[0].block_range(6), (6, 6)) self.assertEqual(self.astroid.body[0].block_range(7), (7, 7)) class ImportNodeTest(resources.SysPathSetup, unittest.TestCase): def setUp(self) -> None: super().setUp() self.module = resources.build_file("data/module.py", "data.module") self.module2 = resources.build_file("data/module2.py", "data.module2") def test_import_self_resolve(self) -> None: myos = next(self.module2.igetattr("myos")) self.assertTrue(isinstance(myos, nodes.Module), myos) self.assertEqual(myos.name, "os") self.assertEqual(myos.qname(), "os") self.assertEqual(myos.pytype(), "builtins.module") def test_from_self_resolve(self) -> None: namenode = next(self.module.igetattr("NameNode")) self.assertTrue(isinstance(namenode, nodes.ClassDef), namenode) self.assertEqual(namenode.root().name, "astroid.nodes.node_classes") self.assertEqual(namenode.qname(), "astroid.nodes.node_classes.Name") self.assertEqual(namenode.pytype(), "builtins.type") abspath = next(self.module2.igetattr("abspath")) self.assertTrue(isinstance(abspath, nodes.FunctionDef), abspath) self.assertEqual(abspath.root().name, "os.path") self.assertEqual(abspath.pytype(), "builtins.function") if sys.platform != "win32": # Not sure what is causing this check to fail on Windows. 
            # For some reason the abspath() inference returns a different
            # path than expected:
            # AssertionError: 'os.path._abspath_fallback' != 'os.path.abspath'
            self.assertEqual(abspath.qname(), "os.path.abspath")

    def test_real_name(self) -> None:
        from_ = self.module["NameNode"]
        self.assertEqual(from_.real_name("NameNode"), "Name")
        imp_ = self.module["os"]
        self.assertEqual(imp_.real_name("os"), "os")
        self.assertRaises(AttributeInferenceError, imp_.real_name, "os.path")
        imp_ = self.module["NameNode"]
        self.assertEqual(imp_.real_name("NameNode"), "Name")
        self.assertRaises(AttributeInferenceError, imp_.real_name, "Name")
        imp_ = self.module2["YO"]
        self.assertEqual(imp_.real_name("YO"), "YO")
        self.assertRaises(AttributeInferenceError, imp_.real_name, "data")

    def test_as_string(self) -> None:
        ast = self.module["modutils"]
        self.assertEqual(ast.as_string(), "from astroid import modutils")
        ast = self.module["NameNode"]
        self.assertEqual(
            ast.as_string(), "from astroid.nodes.node_classes import Name as NameNode"
        )
        ast = self.module["os"]
        self.assertEqual(ast.as_string(), "import os.path")
        code = """from . import here
from .. import door
from .store import bread
from ..cave import wine\n\n"""
        ast = abuilder.string_build(code)
        self.assertMultiLineEqual(ast.as_string(), code)

    def test_bad_import_inference(self) -> None:
        # Explanation of the bug:
        """When we import PickleError from nonexistent, a call to the infer
        method of this From node will be made by unpack_infer.
        inference.infer_from will try to import this module, which will fail
        and raise an InferenceException (by ImportNode.do_import_module). The
        infer_name will catch this exception and yield an Uninferable instead.
""" code = """ try: from pickle import PickleError except ImportError: from nonexistent import PickleError try: pass except PickleError: pass """ module = builder.parse(code) handler_type = module.body[1].handlers[0].type excs = list(nodes.unpack_infer(handler_type)) # The number of returned object can differ on Python 2 # and Python 3. In one version, an additional item will # be returned, from the _pickle module, which is not # present in the other version. self.assertIsInstance(excs[0], nodes.ClassDef) self.assertEqual(excs[0].name, "PickleError") self.assertIs(excs[-1], util.Uninferable) def test_absolute_import(self) -> None: module = resources.build_file("data/absimport.py") ctx = InferenceContext() # will fail if absolute import failed ctx.lookupname = "message" next(module["message"].infer(ctx)) ctx.lookupname = "email" m = next(module["email"].infer(ctx)) self.assertFalse(m.file.startswith(os.path.join("data", "email.py"))) def test_more_absolute_import(self) -> None: module = resources.build_file("data/module1abs/__init__.py", "data.module1abs") self.assertIn("sys", module.locals) _pickle_names = ("dump",) # "dumps", "load", "loads") def test_conditional(self) -> None: module = resources.build_file("data/conditional_import/__init__.py") ctx = InferenceContext() for name in self._pickle_names: ctx.lookupname = name some = list(module[name].infer(ctx)) assert Uninferable not in some, name def test_conditional_import(self) -> None: module = resources.build_file("data/conditional.py") ctx = InferenceContext() for name in self._pickle_names: ctx.lookupname = name some = list(module[name].infer(ctx)) assert Uninferable not in some, name class CmpNodeTest(unittest.TestCase): def test_as_string(self) -> None: ast = abuilder.string_build("a == 2").body[0] self.assertEqual(ast.as_string(), "a == 2") class ConstNodeTest(unittest.TestCase): def _test(self, value: Any) -> None: node = nodes.const_factory(value) self.assertIsInstance(node._proxied, nodes.ClassDef) 
        self.assertEqual(node._proxied.name, value.__class__.__name__)
        self.assertIs(node.value, value)
        self.assertTrue(node._proxied.parent)
        self.assertEqual(node._proxied.root().name, value.__class__.__module__)
        with self.assertRaises(StatementMissing):
            with pytest.warns(DeprecationWarning) as records:
                node.statement(future=True)
                assert len(records) == 1
        with self.assertRaises(StatementMissing):
            node.statement()
        with self.assertRaises(ParentMissingError):
            with pytest.warns(DeprecationWarning) as records:
                node.frame(future=True)
                assert len(records) == 1
        with self.assertRaises(ParentMissingError):
            node.frame()

    def test_none(self) -> None:
        self._test(None)

    def test_bool(self) -> None:
        self._test(True)

    def test_int(self) -> None:
        self._test(1)

    def test_float(self) -> None:
        self._test(1.0)

    def test_complex(self) -> None:
        self._test(1.0j)

    def test_str(self) -> None:
        self._test("a")

    def test_unicode(self) -> None:
        self._test("a")

    def test_str_kind(self):
        node = builder.extract_node(
            """
            const = u"foo"
            """
        )
        assert isinstance(node.value, nodes.Const)
        assert node.value.value == "foo"
        assert node.value.kind == "u"

    def test_copy(self) -> None:
        """Make sure copying a Const object doesn't result in infinite recursion."""
        const = copy.copy(nodes.Const(1))
        assert const.value == 1


class NameNodeTest(unittest.TestCase):
    def test_assign_to_true(self) -> None:
        """Test that True and False assignments don't crash."""
        code = """
            True = False
            def hello(False):
                pass
            del True
            """
        with self.assertRaises(AstroidBuildingError):
            builder.parse(code)


class TestNamedExprNode:
    """Tests for the NamedExpr node."""

    @staticmethod
    def test_frame() -> None:
        """Test if the frame of NamedExpr is correctly set for certain types
        of parent nodes.
""" module = builder.parse( """ def func(var_1): pass def func_two(var_2, var_2 = (named_expr_1 := "walrus")): pass class MyBaseClass: pass class MyInheritedClass(MyBaseClass, var_3=(named_expr_2 := "walrus")): pass VAR = lambda y = (named_expr_3 := "walrus"): print(y) def func_with_lambda( var_5 = ( named_expr_4 := lambda y = (named_expr_5 := "walrus"): y ) ): pass COMPREHENSION = [y for i in (1, 2) if (y := i ** 2)] """ ) function = module.body[0] assert function.args.frame() == function assert function.args.frame() == function function_two = module.body[1] assert function_two.args.args[0].frame() == function_two assert function_two.args.args[0].frame() == function_two assert function_two.args.args[1].frame() == function_two assert function_two.args.args[1].frame() == function_two assert function_two.args.defaults[0].frame() == module assert function_two.args.defaults[0].frame() == module inherited_class = module.body[3] assert inherited_class.keywords[0].frame() == inherited_class assert inherited_class.keywords[0].frame() == inherited_class assert inherited_class.keywords[0].value.frame() == module assert inherited_class.keywords[0].value.frame() == module lambda_assignment = module.body[4].value assert lambda_assignment.args.args[0].frame() == lambda_assignment assert lambda_assignment.args.args[0].frame() == lambda_assignment assert lambda_assignment.args.defaults[0].frame() == module assert lambda_assignment.args.defaults[0].frame() == module lambda_named_expr = module.body[5].args.defaults[0] assert lambda_named_expr.value.args.defaults[0].frame() == module assert lambda_named_expr.value.args.defaults[0].frame() == module comprehension = module.body[6].value assert comprehension.generators[0].ifs[0].frame() == module assert comprehension.generators[0].ifs[0].frame() == module @staticmethod def test_scope() -> None: """Test if the scope of NamedExpr is correctly set for certain types of parent nodes. 
""" module = builder.parse( """ def func(var_1): pass def func_two(var_2, var_2 = (named_expr_1 := "walrus")): pass class MyBaseClass: pass class MyInheritedClass(MyBaseClass, var_3=(named_expr_2 := "walrus")): pass VAR = lambda y = (named_expr_3 := "walrus"): print(y) def func_with_lambda( var_5 = ( named_expr_4 := lambda y = (named_expr_5 := "walrus"): y ) ): pass COMPREHENSION = [y for i in (1, 2) if (y := i ** 2)] """ ) function = module.body[0] assert function.args.scope() == function function_two = module.body[1] assert function_two.args.args[0].scope() == function_two assert function_two.args.args[1].scope() == function_two assert function_two.args.defaults[0].scope() == module inherited_class = module.body[3] assert inherited_class.keywords[0].scope() == inherited_class assert inherited_class.keywords[0].value.scope() == module lambda_assignment = module.body[4].value assert lambda_assignment.args.args[0].scope() == lambda_assignment assert lambda_assignment.args.defaults[0].scope() lambda_named_expr = module.body[5].args.defaults[0] assert lambda_named_expr.value.args.defaults[0].scope() == module comprehension = module.body[6].value assert comprehension.generators[0].ifs[0].scope() == module class AnnAssignNodeTest(unittest.TestCase): def test_primitive(self) -> None: code = textwrap.dedent( """ test: int = 5 """ ) assign = builder.extract_node(code) self.assertIsInstance(assign, nodes.AnnAssign) self.assertEqual(assign.target.name, "test") self.assertEqual(assign.annotation.name, "int") self.assertEqual(assign.value.value, 5) self.assertEqual(assign.simple, 1) def test_primitive_without_initial_value(self) -> None: code = textwrap.dedent( """ test: str """ ) assign = builder.extract_node(code) self.assertIsInstance(assign, nodes.AnnAssign) self.assertEqual(assign.target.name, "test") self.assertEqual(assign.annotation.name, "str") self.assertEqual(assign.value, None) def test_complex(self) -> None: code = textwrap.dedent( """ test: Dict[List[str]] = {} 
""" ) assign = builder.extract_node(code) self.assertIsInstance(assign, nodes.AnnAssign) self.assertEqual(assign.target.name, "test") self.assertIsInstance(assign.annotation, astroid.Subscript) self.assertIsInstance(assign.value, astroid.Dict) def test_as_string(self) -> None: code = textwrap.dedent( """ print() test: int = 5 test2: str test3: List[Dict[str, str]] = [] """ ) ast = abuilder.string_build(code) self.assertEqual(ast.as_string().strip(), code.strip()) class ArgumentsNodeTC(unittest.TestCase): def test_linenumbering(self) -> None: ast = builder.parse( """ def func(a, b): pass x = lambda x: None """ ) self.assertEqual(ast["func"].args.fromlineno, 2) self.assertFalse(ast["func"].args.is_statement) xlambda = next(ast["x"].infer()) self.assertEqual(xlambda.args.fromlineno, 4) self.assertEqual(xlambda.args.tolineno, 4) self.assertFalse(xlambda.args.is_statement) def test_kwoargs(self) -> None: ast = builder.parse( """ def func(*, x): pass """ ) args = ast["func"].args assert isinstance(args, nodes.Arguments) assert args.is_argument("x") assert args.kw_defaults == [None] ast = builder.parse( """ def func(*, x = "default"): pass """ ) args = ast["func"].args assert isinstance(args, nodes.Arguments) assert args.is_argument("x") assert len(args.kw_defaults) == 1 assert isinstance(args.kw_defaults[0], nodes.Const) assert args.kw_defaults[0].value == "default" @test_utils.require_version(minver="3.8") def test_positional_only(self): ast = builder.parse( """ def func(x, /, y): pass """ ) args = ast["func"].args self.assertTrue(args.is_argument("x")) self.assertTrue(args.is_argument("y")) index, node = args.find_argname("x") self.assertEqual(index, 0) self.assertIsNotNone(node) class UnboundMethodNodeTest(unittest.TestCase): def test_no_super_getattr(self) -> None: # This is a test for issue # https://bitbucket.org/logilab/astroid/issue/91, which tests # that UnboundMethod doesn't call super when doing .getattr. 
ast = builder.parse( """ class A(object): def test(self): pass meth = A.test """ ) node = next(ast["meth"].infer()) with self.assertRaises(AttributeInferenceError): node.getattr("__missssing__") name = node.getattr("__name__")[0] self.assertIsInstance(name, nodes.Const) self.assertEqual(name.value, "test") class BoundMethodNodeTest(unittest.TestCase): def test_is_property(self) -> None: ast = builder.parse( """ import abc def cached_property(): # Not a real decorator, but we don't care pass def reify(): # Same as cached_property pass def lazy_property(): pass def lazyproperty(): pass def lazy(): pass class A(object): @property def builtin_property(self): return 42 @abc.abstractproperty def abc_property(self): return 42 @cached_property def cached_property(self): return 42 @reify def reified(self): return 42 @lazy_property def lazy_prop(self): return 42 @lazyproperty def lazyprop(self): return 42 def not_prop(self): pass @lazy def decorated_with_lazy(self): return 42 cls = A() builtin_property = cls.builtin_property abc_property = cls.abc_property cached_p = cls.cached_property reified = cls.reified not_prop = cls.not_prop lazy_prop = cls.lazy_prop lazyprop = cls.lazyprop decorated_with_lazy = cls.decorated_with_lazy """ ) for prop in ( "builtin_property", "abc_property", "cached_p", "reified", "lazy_prop", "lazyprop", "decorated_with_lazy", ): inferred = next(ast[prop].infer()) self.assertIsInstance(inferred, nodes.Const, prop) self.assertEqual(inferred.value, 42, prop) inferred = next(ast["not_prop"].infer()) self.assertIsInstance(inferred, bases.BoundMethod) class AliasesTest(unittest.TestCase): def setUp(self) -> None: self.transformer = transforms.TransformVisitor() def parse_transform(self, code: str) -> Module: module = parse(code, apply_transforms=False) return self.transformer.visit(module) def test_aliases(self) -> None: def test_from(node: ImportFrom) -> ImportFrom: node.names = [*node.names, ("absolute_import", None)] return node def test_class(node: 
ClassDef) -> ClassDef: node.name = "Bar" return node def test_function(node: FunctionDef) -> FunctionDef: node.name = "another_test" return node def test_callfunc(node: Call) -> Call | None: if node.func.name == "Foo": node.func.name = "Bar" return node return None def test_assname(node: AssignName) -> AssignName | None: if node.name == "foo": return nodes.AssignName( "bar", node.lineno, node.col_offset, node.parent, end_lineno=node.end_lineno, end_col_offset=node.end_col_offset, ) return None def test_assattr(node: AssignAttr) -> AssignAttr: if node.attrname == "a": node.attrname = "b" return node return None def test_getattr(node: Attribute) -> Attribute: if node.attrname == "a": node.attrname = "b" return node return None def test_genexpr(node: GeneratorExp) -> GeneratorExp: if node.elt.value == 1: node.elt = nodes.Const(2, node.lineno, node.col_offset, node.parent) return node return None self.transformer.register_transform(nodes.ImportFrom, test_from) self.transformer.register_transform(nodes.ClassDef, test_class) self.transformer.register_transform(nodes.FunctionDef, test_function) self.transformer.register_transform(nodes.Call, test_callfunc) self.transformer.register_transform(nodes.AssignName, test_assname) self.transformer.register_transform(nodes.AssignAttr, test_assattr) self.transformer.register_transform(nodes.Attribute, test_getattr) self.transformer.register_transform(nodes.GeneratorExp, test_genexpr) string = """ from __future__ import print_function class Foo: pass def test(a): return a foo = Foo() foo.a = test(42) foo.a (1 for _ in range(0, 42)) """ module = self.parse_transform(string) self.assertEqual(len(module.body[0].names), 2) self.assertIsInstance(module.body[0], nodes.ImportFrom) self.assertEqual(module.body[1].name, "Bar") self.assertIsInstance(module.body[1], nodes.ClassDef) self.assertEqual(module.body[2].name, "another_test") self.assertIsInstance(module.body[2], nodes.FunctionDef) self.assertEqual(module.body[3].targets[0].name, 
"bar") self.assertIsInstance(module.body[3].targets[0], nodes.AssignName) self.assertEqual(module.body[3].value.func.name, "Bar") self.assertIsInstance(module.body[3].value, nodes.Call) self.assertEqual(module.body[4].targets[0].attrname, "b") self.assertIsInstance(module.body[4].targets[0], nodes.AssignAttr) self.assertIsInstance(module.body[5], nodes.Expr) self.assertEqual(module.body[5].value.attrname, "b") self.assertIsInstance(module.body[5].value, nodes.Attribute) self.assertEqual(module.body[6].value.elt.value, 2) self.assertIsInstance(module.body[6].value, nodes.GeneratorExp) class Python35AsyncTest(unittest.TestCase): def test_async_await_keywords(self) -> None: ( async_def, async_for, async_with, async_for2, async_with2, await_node, ) = builder.extract_node( """ async def func(): #@ async for i in range(10): #@ f = __(await i) async with test(): #@ pass async for i \ in range(10): #@ pass async with test(), \ test2(): #@ pass """ ) assert isinstance(async_def, nodes.AsyncFunctionDef) assert async_def.lineno == 2 assert async_def.col_offset == 0 assert isinstance(async_for, nodes.AsyncFor) assert async_for.lineno == 3 assert async_for.col_offset == 4 assert isinstance(async_with, nodes.AsyncWith) assert async_with.lineno == 5 assert async_with.col_offset == 4 assert isinstance(async_for2, nodes.AsyncFor) assert async_for2.lineno == 7 assert async_for2.col_offset == 4 assert isinstance(async_with2, nodes.AsyncWith) assert async_with2.lineno == 9 assert async_with2.col_offset == 4 assert isinstance(await_node, nodes.Await) assert isinstance(await_node.value, nodes.Name) assert await_node.lineno == 4 assert await_node.col_offset == 15 def _test_await_async_as_string(self, code: str) -> None: ast_node = parse(code) self.assertEqual(ast_node.as_string().strip(), code.strip()) def test_await_as_string(self) -> None: code = textwrap.dedent( """ async def function(): await 42 await x[0] (await x)[0] await (x + y)[0] """ ) self._test_await_async_as_string(code) def 
test_asyncwith_as_string(self) -> None: code = textwrap.dedent( """ async def function(): async with 42: pass """ ) self._test_await_async_as_string(code) def test_asyncfor_as_string(self) -> None: code = textwrap.dedent( """ async def function(): async for i in range(10): await 42 """ ) self._test_await_async_as_string(code) def test_decorated_async_def_as_string(self) -> None: code = textwrap.dedent( """ @decorator async def function(): async for i in range(10): await 42 """ ) self._test_await_async_as_string(code) class ContextTest(unittest.TestCase): def test_subscript_load(self) -> None: node = builder.extract_node("f[1]") self.assertIs(node.ctx, Context.Load) def test_subscript_del(self) -> None: node = builder.extract_node("del f[1]") self.assertIs(node.targets[0].ctx, Context.Del) def test_subscript_store(self) -> None: node = builder.extract_node("f[1] = 2") subscript = node.targets[0] self.assertIs(subscript.ctx, Context.Store) def test_list_load(self) -> None: node = builder.extract_node("[]") self.assertIs(node.ctx, Context.Load) def test_list_del(self) -> None: node = builder.extract_node("del []") self.assertIs(node.targets[0].ctx, Context.Del) def test_list_store(self) -> None: with self.assertRaises(AstroidSyntaxError): builder.extract_node("[0] = 2") def test_tuple_load(self) -> None: node = builder.extract_node("(1, )") self.assertIs(node.ctx, Context.Load) def test_tuple_store(self) -> None: with self.assertRaises(AstroidSyntaxError): builder.extract_node("(1, ) = 3") def test_starred_load(self) -> None: node = builder.extract_node("a = *b") starred = node.value self.assertIs(starred.ctx, Context.Load) def test_starred_store(self) -> None: node = builder.extract_node("a, *b = 1, 2") starred = node.targets[0].elts[1] self.assertIs(starred.ctx, Context.Store) def test_unknown() -> None: """Test Unknown node.""" assert isinstance(next(nodes.Unknown().infer()), type(util.Uninferable)) assert isinstance(nodes.Unknown().name, str) assert 
isinstance(nodes.Unknown().qname(), str) def test_type_comments_with() -> None: module = builder.parse( """ with a as b: # type: int pass with a as b: # type: ignore[name-defined] pass """ ) node = module.body[0] ignored_node = module.body[1] assert isinstance(node.type_annotation, astroid.Name) assert ignored_node.type_annotation is None def test_type_comments_for() -> None: module = builder.parse( """ for a, b in [1, 2, 3]: # type: List[int] pass for a, b in [1, 2, 3]: # type: ignore[name-defined] pass """ ) node = module.body[0] ignored_node = module.body[1] assert isinstance(node.type_annotation, astroid.Subscript) assert node.type_annotation.as_string() == "List[int]" assert ignored_node.type_annotation is None def test_type_comments_assign() -> None: module = builder.parse( """ a, b = [1, 2, 3] # type: List[int] a, b = [1, 2, 3] # type: ignore[name-defined] """ ) node = module.body[0] ignored_node = module.body[1] assert isinstance(node.type_annotation, astroid.Subscript) assert node.type_annotation.as_string() == "List[int]" assert ignored_node.type_annotation is None def test_type_comments_invalid_expression() -> None: module = builder.parse( """ a, b = [1, 2, 3] # type: something completely invalid a, b = [1, 2, 3] # typeee: 2*+4 a, b = [1, 2, 3] # type: List[int """ ) for node in module.body: assert node.type_annotation is None def test_type_comments_invalid_function_comments() -> None: module = builder.parse( """ def func(): # type: something completely invalid pass def func1(): # typeee: 2*+4 pass def func2(): # type: List[int pass """ ) for node in module.body: assert node.type_comment_returns is None assert node.type_comment_args is None def test_type_comments_function() -> None: module = builder.parse( """ def func(): # type: (int) -> str pass def func1(): # type: (int, int, int) -> (str, str) pass def func2(): # type: (int, int, str, List[int]) -> List[int] pass """ ) expected_annotations = [ (["int"], astroid.Name, "str"), (["int", "int", "int"],
astroid.Tuple, "(str, str)"), (["int", "int", "str", "List[int]"], astroid.Subscript, "List[int]"), ] for node, (expected_args, expected_returns_type, expected_returns_string) in zip( module.body, expected_annotations ): assert node.type_comment_returns is not None assert node.type_comment_args is not None for expected_arg, actual_arg in zip(expected_args, node.type_comment_args): assert actual_arg.as_string() == expected_arg assert isinstance(node.type_comment_returns, expected_returns_type) assert node.type_comment_returns.as_string() == expected_returns_string def test_type_comments_arguments() -> None: module = builder.parse( """ def func( a, # type: int ): # type: (...) -> str pass def func1( a, # type: int b, # type: int c, # type: int ): # type: (...) -> (str, str) pass def func2( a, # type: int b, # type: int c, # type: str d, # type: List[int] ): # type: (...) -> List[int] pass """ ) expected_annotations = [ ["int"], ["int", "int", "int"], ["int", "int", "str", "List[int]"], ] for node, expected_args in zip(module.body, expected_annotations): assert len(node.type_comment_args) == 1 assert isinstance(node.type_comment_args[0], astroid.Const) assert node.type_comment_args[0].value == Ellipsis assert len(node.args.type_comment_args) == len(expected_args) for expected_arg, actual_arg in zip(expected_args, node.args.type_comment_args): assert actual_arg.as_string() == expected_arg def test_type_comments_posonly_arguments() -> None: module = builder.parse( """ def f_arg_comment( a, # type: int b, # type: int /, c, # type: Optional[int] d, # type: Optional[int] *, e, # type: float f, # type: float ): # type: (...) 
-> None pass """ ) expected_annotations = [ [["int", "int"], ["Optional[int]", "Optional[int]"], ["float", "float"]] ] for node, expected_types in zip(module.body, expected_annotations): assert len(node.type_comment_args) == 1 assert isinstance(node.type_comment_args[0], astroid.Const) assert node.type_comment_args[0].value == Ellipsis type_comments = [ node.args.type_comment_posonlyargs, node.args.type_comment_args, node.args.type_comment_kwonlyargs, ] for expected_args, actual_args in zip(expected_types, type_comments): assert len(expected_args) == len(actual_args) for expected_arg, actual_arg in zip(expected_args, actual_args): assert actual_arg.as_string() == expected_arg def test_correct_function_type_comment_parent() -> None: data = """ def f(a): # type: (A) -> A pass """ parsed_data = builder.parse(data) f = parsed_data.body[0] assert f.type_comment_args[0].parent is f assert f.type_comment_returns.parent is f def test_is_generator_for_yield_assignments() -> None: node = astroid.extract_node( """ class A: def test(self): a = yield while True: print(a) yield a a = A() a.test """ ) inferred = next(node.infer()) assert isinstance(inferred, astroid.BoundMethod) assert bool(inferred.is_generator()) class AsyncGeneratorTest: def test_async_generator(self): node = astroid.extract_node( """ async def a_iter(n): for i in range(1, n + 1): yield i await asyncio.sleep(1) a_iter(2) #@ """ ) inferred = next(node.infer()) assert isinstance(inferred, bases.AsyncGenerator) assert inferred.getattr("__aiter__") assert inferred.getattr("__anext__") assert inferred.pytype() == "builtins.async_generator" assert inferred.display_type() == "AsyncGenerator" def test_async_generator_is_generator_on_older_python(self): node = astroid.extract_node( """ async def a_iter(n): for i in range(1, n + 1): yield i await asyncio.sleep(1) a_iter(2) #@ """ ) inferred = next(node.infer()) assert isinstance(inferred, bases.Generator) assert inferred.getattr("__iter__") assert 
inferred.getattr("__next__") assert inferred.pytype() == "builtins.generator" assert inferred.display_type() == "Generator" def test_f_string_correct_line_numbering() -> None: """Test that we generate correct line numbers for f-strings.""" node = astroid.extract_node( """ def func_foo(arg_bar, arg_foo): dict_foo = {} f'{arg_bar.attr_bar}' #@ """ ) assert node.lineno == 5 assert node.last_child().lineno == 5 assert node.last_child().last_child().lineno == 5 def test_assignment_expression() -> None: code = """ if __(a := 1): pass if __(b := test): pass """ first, second = astroid.extract_node(code) assert isinstance(first.target, nodes.AssignName) assert first.target.name == "a" assert isinstance(first.value, nodes.Const) assert first.value.value == 1 assert first.as_string() == "a := 1" assert isinstance(second.target, nodes.AssignName) assert second.target.name == "b" assert isinstance(second.value, nodes.Name) assert second.value.name == "test" assert second.as_string() == "b := test" def test_assignment_expression_in_functiondef() -> None: code = """ def function(param = (assignment := "walrus")): def inner_function(inner_param = (inner_assign := "walrus")): pass pass class MyClass(attr = (assignment_two := "walrus")): pass VAR = lambda y = (assignment_three := "walrus"): print(y) def func_with_lambda( param=(named_expr_four := lambda y=(assignment_four := "walrus"): y), ): pass COMPREHENSION = [y for i in (1, 2) if (assignment_five := i ** 2)] def func(): var = lambda y = (assignment_six := 2): print(y) VAR_TWO = [ func(assignment_seven := 2) for _ in (1,) ] LAMBDA = lambda x: print(assignment_eight := x ** 2) class SomeClass: (assignment_nine := 2**2) """ module = astroid.parse(code) assert "assignment" in module.locals assert isinstance(module.locals.get("assignment")[0], nodes.AssignName) function = module.body[0] assert "inner_assign" in function.locals assert "inner_assign" not in module.locals assert isinstance(function.locals.get("inner_assign")[0], 
nodes.AssignName) assert "assignment_two" in module.locals assert isinstance(module.locals.get("assignment_two")[0], nodes.AssignName) assert "assignment_three" in module.locals assert isinstance(module.locals.get("assignment_three")[0], nodes.AssignName) assert "assignment_four" in module.locals assert isinstance(module.locals.get("assignment_four")[0], nodes.AssignName) assert "assignment_five" in module.locals assert isinstance(module.locals.get("assignment_five")[0], nodes.AssignName) func = module.body[5] assert "assignment_six" in func.locals assert "assignment_six" not in module.locals assert isinstance(func.locals.get("assignment_six")[0], nodes.AssignName) assert "assignment_seven" in module.locals assert isinstance(module.locals.get("assignment_seven")[0], nodes.AssignName) lambda_assign = module.body[7] assert "assignment_eight" in lambda_assign.value.locals assert "assignment_eight" not in module.locals assert isinstance( lambda_assign.value.locals.get("assignment_eight")[0], nodes.AssignName ) class_assign = module.body[8] assert "assignment_nine" in class_assign.locals assert "assignment_nine" not in module.locals assert isinstance(class_assign.locals.get("assignment_nine")[0], nodes.AssignName) def test_get_doc() -> None: code = textwrap.dedent( """\ def func(): "Docstring" return 1 """ ) node: nodes.FunctionDef = astroid.extract_node(code) # type: ignore[assignment] assert isinstance(node.doc_node, nodes.Const) assert node.doc_node.value == "Docstring" assert node.doc_node.lineno == 2 assert node.doc_node.col_offset == 4 assert node.doc_node.end_lineno == 2 assert node.doc_node.end_col_offset == 15 code = textwrap.dedent( """\ def func(): ... 
return 1 """ ) node = astroid.extract_node(code) assert node.doc_node is None @test_utils.require_version(minver="3.8") def test_parse_fstring_debug_mode() -> None: node = astroid.extract_node('f"{3=}"') assert isinstance(node, nodes.JoinedStr) assert node.as_string() == "f'3={3!r}'" def test_parse_type_comments_with_proper_parent() -> None: code = """ class D: #@ @staticmethod def g( x # type: np.array ): pass """ node = astroid.extract_node(code) func = node.getattr("g")[0] type_comments = func.args.type_comment_args assert len(type_comments) == 1 type_comment = type_comments[0] assert isinstance(type_comment, astroid.Attribute) assert isinstance(type_comment.parent, astroid.Expr) assert isinstance(type_comment.parent.parent, astroid.Arguments) def test_const_itered() -> None: code = 'a = "string"' node = astroid.extract_node(code).value assert isinstance(node, astroid.Const) itered = node.itered() assert len(itered) == 6 assert [elem.value for elem in itered] == list("string") def test_is_generator_for_yield_in_while() -> None: code = """ def paused_iter(iterable): while True: # Continue to yield the same item until `next(i)` or `i.send(False)` while (yield value): pass """ node = astroid.extract_node(code) assert bool(node.is_generator()) def test_is_generator_for_yield_in_if() -> None: code = """ import asyncio def paused_iter(iterable): if (yield from asyncio.sleep(0.01)): pass return """ node = astroid.extract_node(code) assert bool(node.is_generator()) def test_is_generator_for_yield_in_aug_assign() -> None: code = """ def test(): buf = '' while True: buf += yield """ node = astroid.extract_node(code) assert bool(node.is_generator()) @pytest.mark.skipif(not PY310_PLUS, reason="pattern matching was added in PY310") class TestPatternMatching: @staticmethod def test_match_simple(): code = textwrap.dedent( """ match status: case 200: pass case 401 | 402 | 403: pass case None: pass case _: pass """ ).strip() node = builder.extract_node(code) assert 
node.as_string() == code assert isinstance(node, nodes.Match) assert isinstance(node.subject, nodes.Name) assert node.subject.name == "status" assert isinstance(node.cases, list) and len(node.cases) == 4 case0, case1, case2, case3 = node.cases assert list(node.get_children()) == [node.subject, *node.cases] assert isinstance(case0.pattern, nodes.MatchValue) assert ( isinstance(case0.pattern.value, astroid.Const) and case0.pattern.value.value == 200 ) assert list(case0.pattern.get_children()) == [case0.pattern.value] assert case0.guard is None assert isinstance(case0.body[0], astroid.Pass) assert list(case0.get_children()) == [case0.pattern, case0.body[0]] assert isinstance(case1.pattern, nodes.MatchOr) assert ( isinstance(case1.pattern.patterns, list) and len(case1.pattern.patterns) == 3 ) for i in range(3): match_value = case1.pattern.patterns[i] assert isinstance(match_value, nodes.MatchValue) assert isinstance(match_value.value, nodes.Const) assert match_value.value.value == (401, 402, 403)[i] assert list(case1.pattern.get_children()) == case1.pattern.patterns assert isinstance(case2.pattern, nodes.MatchSingleton) assert case2.pattern.value is None assert not list(case2.pattern.get_children()) assert isinstance(case3.pattern, nodes.MatchAs) assert case3.pattern.name is None assert case3.pattern.pattern is None assert not list(case3.pattern.get_children()) @staticmethod def test_match_sequence(): code = textwrap.dedent( """ match status: case [x, 2, _, *rest] as y if x > 2: pass """ ).strip() node = builder.extract_node(code) assert node.as_string() == code assert isinstance(node, nodes.Match) assert isinstance(node.cases, list) and len(node.cases) == 1 case = node.cases[0] assert isinstance(case.pattern, nodes.MatchAs) assert isinstance(case.pattern.name, nodes.AssignName) assert case.pattern.name.name == "y" assert list(case.pattern.get_children()) == [ case.pattern.pattern, case.pattern.name, ] assert isinstance(case.guard, nodes.Compare) assert 
isinstance(case.body[0], nodes.Pass) assert list(case.get_children()) == [case.pattern, case.guard, case.body[0]] pattern_seq = case.pattern.pattern assert isinstance(pattern_seq, nodes.MatchSequence) assert isinstance(pattern_seq.patterns, list) and len(pattern_seq.patterns) == 4 assert ( isinstance(pattern_seq.patterns[0], nodes.MatchAs) and isinstance(pattern_seq.patterns[0].name, nodes.AssignName) and pattern_seq.patterns[0].name.name == "x" and pattern_seq.patterns[0].pattern is None ) assert ( isinstance(pattern_seq.patterns[1], nodes.MatchValue) and isinstance(pattern_seq.patterns[1].value, nodes.Const) and pattern_seq.patterns[1].value.value == 2 ) assert ( isinstance(pattern_seq.patterns[2], nodes.MatchAs) and pattern_seq.patterns[2].name is None ) assert ( isinstance(pattern_seq.patterns[3], nodes.MatchStar) and isinstance(pattern_seq.patterns[3].name, nodes.AssignName) and pattern_seq.patterns[3].name.name == "rest" ) assert list(pattern_seq.patterns[3].get_children()) == [ pattern_seq.patterns[3].name ] assert list(pattern_seq.get_children()) == pattern_seq.patterns @staticmethod def test_match_mapping(): code = textwrap.dedent( """ match status: case {0: x, 1: _}: pass case {**rest}: pass """ ).strip() node = builder.extract_node(code) assert node.as_string() == code assert isinstance(node, nodes.Match) assert isinstance(node.cases, list) and len(node.cases) == 2 case0, case1 = node.cases assert isinstance(case0.pattern, nodes.MatchMapping) assert case0.pattern.rest is None assert isinstance(case0.pattern.keys, list) and len(case0.pattern.keys) == 2 assert ( isinstance(case0.pattern.patterns, list) and len(case0.pattern.patterns) == 2 ) for i in range(2): key = case0.pattern.keys[i] assert isinstance(key, nodes.Const) assert key.value == i pattern = case0.pattern.patterns[i] assert isinstance(pattern, nodes.MatchAs) if i == 0: assert isinstance(pattern.name, nodes.AssignName) assert pattern.name.name == "x" elif i == 1: assert pattern.name is None 
assert list(case0.pattern.get_children()) == [ *case0.pattern.keys, *case0.pattern.patterns, ] assert isinstance(case1.pattern, nodes.MatchMapping) assert isinstance(case1.pattern.rest, nodes.AssignName) assert case1.pattern.rest.name == "rest" assert isinstance(case1.pattern.keys, list) and len(case1.pattern.keys) == 0 assert ( isinstance(case1.pattern.patterns, list) and len(case1.pattern.patterns) == 0 ) assert list(case1.pattern.get_children()) == [case1.pattern.rest] @staticmethod def test_match_class(): code = textwrap.dedent( """ match x: case Point2D(0, a): pass case Point3D(x=0, y=1, z=b): pass """ ).strip() node = builder.extract_node(code) assert node.as_string() == code assert isinstance(node, nodes.Match) assert isinstance(node.cases, list) and len(node.cases) == 2 case0, case1 = node.cases assert isinstance(case0.pattern, nodes.MatchClass) assert isinstance(case0.pattern.cls, nodes.Name) assert case0.pattern.cls.name == "Point2D" assert ( isinstance(case0.pattern.patterns, list) and len(case0.pattern.patterns) == 2 ) match_value = case0.pattern.patterns[0] assert ( isinstance(match_value, nodes.MatchValue) and isinstance(match_value.value, nodes.Const) and match_value.value.value == 0 ) match_as = case0.pattern.patterns[1] assert ( isinstance(match_as, nodes.MatchAs) and match_as.pattern is None and isinstance(match_as.name, nodes.AssignName) and match_as.name.name == "a" ) assert list(case0.pattern.get_children()) == [ case0.pattern.cls, *case0.pattern.patterns, ] assert isinstance(case1.pattern, nodes.MatchClass) assert isinstance(case1.pattern.cls, nodes.Name) assert case1.pattern.cls.name == "Point3D" assert ( isinstance(case1.pattern.patterns, list) and len(case1.pattern.patterns) == 0 ) assert ( isinstance(case1.pattern.kwd_attrs, list) and len(case1.pattern.kwd_attrs) == 3 ) assert ( isinstance(case1.pattern.kwd_patterns, list) and len(case1.pattern.kwd_patterns) == 3 ) for i in range(2): assert case1.pattern.kwd_attrs[i] == ("x", "y")[i] 
kwd_pattern = case1.pattern.kwd_patterns[i] assert isinstance(kwd_pattern, nodes.MatchValue) assert isinstance(kwd_pattern.value, nodes.Const) assert kwd_pattern.value.value == i assert case1.pattern.kwd_attrs[2] == "z" kwd_pattern = case1.pattern.kwd_patterns[2] assert ( isinstance(kwd_pattern, nodes.MatchAs) and kwd_pattern.pattern is None and isinstance(kwd_pattern.name, nodes.AssignName) and kwd_pattern.name.name == "b" ) assert list(case1.pattern.get_children()) == [ case1.pattern.cls, *case1.pattern.kwd_patterns, ] @staticmethod def test_return_from_match(): code = textwrap.dedent( """ def return_from_match(x): match x: case 10: return 10 case _: return -1 return_from_match(10) #@ """ ).strip() node = builder.extract_node(code) inferred = node.inferred() assert len(inferred) == 2 assert [inf.value for inf in inferred] == [10, -1] @pytest.mark.parametrize( "node", [ node for node in astroid.nodes.ALL_NODE_CLASSES if node.__name__ not in ["BaseContainer", "NodeNG", "const_factory"] ], ) @pytest.mark.filterwarnings("error") def test_str_repr_no_warnings(node): parameters = inspect.signature(node.__init__).parameters args = {} for name, param_type in parameters.items(): if name == "self": continue if "int" in param_type.annotation: args[name] = random.randint(0, 50) elif ( "NodeNG" in param_type.annotation or "SuccessfulInferenceResult" in param_type.annotation ): args[name] = nodes.Unknown() elif "str" in param_type.annotation: args[name] = "" else: args[name] = None test_node = node(**args) str(test_node) repr(test_node) def test_arguments_contains_all(): """Ensure Arguments.arguments actually returns all available arguments""" def manually_get_args(arg_node) -> set: names = set() if arg_node.args.vararg: names.add(arg_node.args.vararg) if arg_node.args.kwarg: names.add(arg_node.args.kwarg) names.update([x.name for x in arg_node.args.args]) names.update([x.name for x in arg_node.args.kwonlyargs]) return names node = extract_node("""def a(fruit: str, *args, 
b=None, c=None, **kwargs): ...""") assert manually_get_args(node) == {x.name for x in node.args.arguments} node = extract_node("""def a(mango: int, b="banana", c=None, **kwargs): ...""") assert manually_get_args(node) == {x.name for x in node.args.arguments} node = extract_node("""def a(self, num = 10, *args): ...""") assert manually_get_args(node) == {x.name for x in node.args.arguments} def test_arguments_default_value(): node = extract_node( "def fruit(eat='please', *, peel='no', trim='yes', **kwargs): ..." ) assert node.args.default_value("eat").value == "please" node = extract_node("def fruit(seeds, flavor='good', *, peel='maybe'): ...") assert node.args.default_value("flavor").value == "good" astroid-3.2.2/tests/test_inference_calls.py0000664000175000017500000003457014622475517021006 0ustar epsilonepsilon# Licensed under the LGPL: https://www.gnu.org/licenses/old-licenses/lgpl-2.1.en.html # For details: https://github.com/pylint-dev/astroid/blob/main/LICENSE # Copyright (c) https://github.com/pylint-dev/astroid/blob/main/CONTRIBUTORS.txt """Tests for function call inference.""" from astroid import bases, builder, nodes from astroid.util import Uninferable def test_no_return() -> None: """Test function with no return statements.""" node = builder.extract_node( """ def f(): pass f() #@ """ ) assert isinstance(node, nodes.NodeNG) inferred = node.inferred() assert len(inferred) == 1 assert inferred[0] is Uninferable def test_one_return() -> None: """Test function with a single return that always executes.""" node = builder.extract_node( """ def f(): return 1 f() #@ """ ) assert isinstance(node, nodes.NodeNG) inferred = node.inferred() assert len(inferred) == 1 assert isinstance(inferred[0], nodes.Const) assert inferred[0].value == 1 def test_one_return_possible() -> None: """Test function with a single return that only sometimes executes. 
Note: currently, inference doesn't handle this type of control flow """ node = builder.extract_node( """ def f(x): if x: return 1 f(1) #@ """ ) assert isinstance(node, nodes.NodeNG) inferred = node.inferred() assert len(inferred) == 1 assert isinstance(inferred[0], nodes.Const) assert inferred[0].value == 1 def test_multiple_returns() -> None: """Test function with multiple returns.""" node = builder.extract_node( """ def f(x): if x > 10: return 1 elif x > 20: return 2 else: return 3 f(100) #@ """ ) assert isinstance(node, nodes.NodeNG) inferred = node.inferred() assert len(inferred) == 3 assert all(isinstance(node, nodes.Const) for node in inferred) assert {node.value for node in inferred} == {1, 2, 3} def test_argument() -> None: """Test function whose return value uses its arguments.""" node = builder.extract_node( """ def f(x, y): return x + y f(1, 2) #@ """ ) assert isinstance(node, nodes.NodeNG) inferred = node.inferred() assert len(inferred) == 1 assert isinstance(inferred[0], nodes.Const) assert inferred[0].value == 3 def test_inner_call() -> None: """Test function where return value is the result of a separate function call.""" node = builder.extract_node( """ def f(): return g() def g(): return 1 f() #@ """ ) assert isinstance(node, nodes.NodeNG) inferred = node.inferred() assert len(inferred) == 1 assert isinstance(inferred[0], nodes.Const) assert inferred[0].value == 1 def test_inner_call_with_const_argument() -> None: """Test function where return value is the result of a separate function call, with a constant value passed to the inner function. 
""" node = builder.extract_node( """ def f(): return g(1) def g(y): return y + 2 f() #@ """ ) assert isinstance(node, nodes.NodeNG) inferred = node.inferred() assert len(inferred) == 1 assert isinstance(inferred[0], nodes.Const) assert inferred[0].value == 3 def test_inner_call_with_dynamic_argument() -> None: """Test function where return value is the result of a separate function call, with a dynamic value passed to the inner function. Currently, this is Uninferable. """ node = builder.extract_node( """ def f(x): return g(x) def g(y): return y + 2 f(1) #@ """ ) assert isinstance(node, nodes.NodeNG) inferred = node.inferred() assert len(inferred) == 1 assert inferred[0] is Uninferable def test_method_const_instance_attr() -> None: """Test method where the return value is based on an instance attribute with a constant value. """ node = builder.extract_node( """ class A: def __init__(self): self.x = 1 def get_x(self): return self.x A().get_x() #@ """ ) assert isinstance(node, nodes.NodeNG) inferred = node.inferred() assert len(inferred) == 1 assert isinstance(inferred[0], nodes.Const) assert inferred[0].value == 1 def test_method_const_instance_attr_multiple() -> None: """Test method where the return value is based on an instance attribute with multiple possible constant values, across different methods. """ node = builder.extract_node( """ class A: def __init__(self, x): if x: self.x = 1 else: self.x = 2 def set_x(self): self.x = 3 def get_x(self): return self.x A().get_x() #@ """ ) assert isinstance(node, nodes.NodeNG) inferred = node.inferred() assert len(inferred) == 3 assert all(isinstance(node, nodes.Const) for node in inferred) assert {node.value for node in inferred} == {1, 2, 3} def test_method_const_instance_attr_same_method() -> None: """Test method where the return value is based on an instance attribute with multiple possible constant values, including in the method being called. 
Note that even with a simple control flow where the assignment in the method body is guaranteed to override any previous assignments, all possible constant values are returned. """ node = builder.extract_node( """ class A: def __init__(self, x): if x: self.x = 1 else: self.x = 2 def set_x(self): self.x = 3 def get_x(self): self.x = 4 return self.x A().get_x() #@ """ ) assert isinstance(node, nodes.NodeNG) inferred = node.inferred() assert len(inferred) == 4 assert all(isinstance(node, nodes.Const) for node in inferred) assert {node.value for node in inferred} == {1, 2, 3, 4} def test_method_dynamic_instance_attr_1() -> None: """Test method where the return value is based on an instance attribute with a dynamically-set value in a different method. In this case, the return value is Uninferable. """ node = builder.extract_node( """ class A: def __init__(self, x): self.x = x def get_x(self): return self.x A(1).get_x() #@ """ ) assert isinstance(node, nodes.NodeNG) inferred = node.inferred() assert len(inferred) == 1 assert inferred[0] is Uninferable def test_method_dynamic_instance_attr_2() -> None: """Test method where the return value is based on an instance attribute with a dynamically-set value in the same method. """ node = builder.extract_node( """ class A: # Note: no initializer, so the only assignment happens in get_x def get_x(self, x): self.x = x return self.x A().get_x(1) #@ """ ) assert isinstance(node, nodes.NodeNG) inferred = node.inferred() assert len(inferred) == 1 assert isinstance(inferred[0], nodes.Const) assert inferred[0].value == 1 def test_method_dynamic_instance_attr_3() -> None: """Test method where the return value is based on an instance attribute with a dynamically-set value in a different method. This is currently Uninferable. 
""" node = builder.extract_node( """ class A: def get_x(self, x): # x is unused return self.x def set_x(self, x): self.x = x A().get_x(10) #@ """ ) assert isinstance(node, nodes.NodeNG) inferred = node.inferred() assert len(inferred) == 1 assert inferred[0] is Uninferable # not 10! def test_method_dynamic_instance_attr_4() -> None: """Test method where the return value is based on an instance attribute with a dynamically-set value in a different method, and is passed a constant value. This is currently Uninferable. """ node = builder.extract_node( """ class A: # Note: no initializer, so the only assignment happens in get_x def get_x(self): self.set_x(10) return self.x def set_x(self, x): self.x = x A().get_x() #@ """ ) assert isinstance(node, nodes.NodeNG) inferred = node.inferred() assert len(inferred) == 1 assert inferred[0] is Uninferable def test_method_dynamic_instance_attr_5() -> None: """Test method where the return value is based on an instance attribute with a dynamically-set value in a different method, and is passed a constant value. But, where the outer and inner functions have the same signature. Inspired by https://github.com/pylint-dev/pylint/issues/400 This is currently Uninferable. """ node = builder.extract_node( """ class A: # Note: no initializer, so the only assignment happens in get_x def get_x(self, x): self.set_x(10) return self.x def set_x(self, x): self.x = x A().get_x(1) #@ """ ) assert isinstance(node, nodes.NodeNG) inferred = node.inferred() assert len(inferred) == 1 assert inferred[0] is Uninferable def test_method_dynamic_instance_attr_6() -> None: """Test method where the return value is based on an instance attribute with a dynamically-set value in a different method, and is passed a dynamic value. This is currently Uninferable. 
""" node = builder.extract_node( """ class A: # Note: no initializer, so the only assignment happens in get_x def get_x(self, x): self.set_x(x + 1) return self.x def set_x(self, x): self.x = x A().get_x(1) #@ """ ) assert isinstance(node, nodes.NodeNG) inferred = node.inferred() assert len(inferred) == 1 assert inferred[0] is Uninferable def test_dunder_getitem() -> None: """Test for the special method __getitem__ (used by Instance.getitem). This is currently Uninferable, until we can infer instance attribute values through constructor calls. """ node = builder.extract_node( """ class A: def __init__(self, x): self.x = x def __getitem__(self, i): return self.x + i A(1)[2] #@ """ ) assert isinstance(node, nodes.NodeNG) inferred = node.inferred() assert len(inferred) == 1 assert inferred[0] is Uninferable def test_instance_method() -> None: """Tests for instance method, both bound and unbound.""" nodes_ = builder.extract_node( """ class A: def method(self, x): return x A().method(42) #@ # In this case, the 1 argument is bound to self, which is ignored in the method A.method(1, 42) #@ """ ) for node in nodes_: assert isinstance(node, nodes.NodeNG) inferred = node.inferred() assert len(inferred) == 1 assert isinstance(inferred[0], nodes.Const) assert inferred[0].value == 42 def test_class_method() -> None: """Tests for class method calls, both instance and with the class.""" nodes_ = builder.extract_node( """ class A: @classmethod def method(cls, x): return x A.method(42) #@ A().method(42) #@ """ ) for node in nodes_: assert isinstance(node, nodes.NodeNG) inferred = node.inferred() assert len(inferred) == 1 assert isinstance(inferred[0], nodes.Const), node assert inferred[0].value == 42 def test_static_method() -> None: """Tests for static method calls, both instance and with the class.""" nodes_ = builder.extract_node( """ class A: @staticmethod def method(x): return x A.method(42) #@ A().method(42) #@ """ ) for node in nodes_: assert isinstance(node, nodes.NodeNG) 
        inferred = node.inferred()
        assert len(inferred) == 1
        assert isinstance(inferred[0], nodes.Const), node
        assert inferred[0].value == 42


def test_instance_method_inherited() -> None:
    """Tests for instance methods that are inherited from a superclass.

    Based on https://github.com/pylint-dev/astroid/issues/1008.
    """
    nodes_ = builder.extract_node(
        """
    class A:
        def method(self):
            return self

    class B(A):
        pass

    A().method()  #@
    A.method(A())  #@
    B().method()  #@
    B.method(B())  #@
    A.method(B())  #@
    """
    )
    expected_names = ["A", "A", "B", "B", "B"]
    for node, expected in zip(nodes_, expected_names):
        assert isinstance(node, nodes.NodeNG)
        inferred = node.inferred()
        assert len(inferred) == 1
        assert isinstance(inferred[0], bases.Instance)
        assert inferred[0].name == expected


def test_class_method_inherited() -> None:
    """Tests for class methods that are inherited from a superclass.

    Based on https://github.com/pylint-dev/astroid/issues/1008.
    """
    nodes_ = builder.extract_node(
        """
    class A:
        @classmethod
        def method(cls):
            return cls

    class B(A):
        pass

    A().method()  #@
    A.method()  #@
    B().method()  #@
    B.method()  #@
    """
    )
    expected_names = ["A", "A", "B", "B"]
    for node, expected in zip(nodes_, expected_names):
        assert isinstance(node, nodes.NodeNG)
        inferred = node.inferred()
        assert len(inferred) == 1
        assert isinstance(inferred[0], nodes.ClassDef)
        assert inferred[0].name == expected


def test_chained_attribute_inherited() -> None:
    """Tests for class methods that are inherited from a superclass.

    Based on https://github.com/pylint-dev/pylint/issues/4220.
""" node = builder.extract_node( """ class A: def f(self): return 42 class B(A): def __init__(self): self.a = A() result = self.a.f() def f(self): pass B().a.f() #@ """ ) assert isinstance(node, nodes.NodeNG) inferred = node.inferred() assert len(inferred) == 1 assert isinstance(inferred[0], nodes.Const) assert inferred[0].value == 42 astroid-3.2.2/tests/test_type_params.py0000664000175000017500000000522714622475517020213 0ustar epsilonepsilon# Licensed under the LGPL: https://www.gnu.org/licenses/old-licenses/lgpl-2.1.en.html # For details: https://github.com/pylint-dev/astroid/blob/main/LICENSE # Copyright (c) https://github.com/pylint-dev/astroid/blob/main/CONTRIBUTORS.txt import pytest from astroid import extract_node from astroid.const import PY312_PLUS from astroid.nodes import ( AssignName, ParamSpec, Subscript, TypeAlias, TypeVar, TypeVarTuple, ) if not PY312_PLUS: pytest.skip("Requires Python 3.12 or higher", allow_module_level=True) def test_type_alias() -> None: node = extract_node("type Point[T] = list[float, float]") assert isinstance(node, TypeAlias) assert isinstance(node.type_params[0], TypeVar) assert isinstance(node.type_params[0].name, AssignName) assert node.type_params[0].name.name == "T" assert node.type_params[0].bound is None assert isinstance(node.value, Subscript) assert node.value.value.name == "list" assert node.value.slice.name == "tuple" assert all(elt.name == "float" for elt in node.value.slice.elts) assert node.inferred()[0] is node assert node.type_params[0].inferred()[0] is node.type_params[0] assert node.statement() is node assigned = next(node.assigned_stmts()) assert assigned is node.value def test_type_param_spec() -> None: node = extract_node("type Alias[**P] = Callable[P, int]") params = node.type_params[0] assert isinstance(params, ParamSpec) assert isinstance(params.name, AssignName) assert params.name.name == "P" assert node.inferred()[0] is node def test_type_var_tuple() -> None: node = extract_node("type Alias[*Ts] = 
tuple[*Ts]") params = node.type_params[0] assert isinstance(params, TypeVarTuple) assert isinstance(params.name, AssignName) assert params.name.name == "Ts" assert node.inferred()[0] is node def test_type_param() -> None: func_node = extract_node("def func[T]() -> T: ...") assert isinstance(func_node.type_params[0], TypeVar) assert func_node.type_params[0].name.name == "T" assert func_node.type_params[0].bound is None class_node = extract_node("class MyClass[T]: ...") assert isinstance(class_node.type_params[0], TypeVar) assert class_node.type_params[0].name.name == "T" assert class_node.type_params[0].bound is None def test_get_children() -> None: func_node = extract_node("def func[T]() -> T: ...") func_children = tuple(func_node.get_children()) assert isinstance(func_children[2], TypeVar) class_node = extract_node("class MyClass[T]: ...") class_children = tuple(class_node.get_children()) assert isinstance(class_children[0], TypeVar) astroid-3.2.2/tests/test_builder.py0000664000175000017500000010500314622475517017306 0ustar epsilonepsilon# Licensed under the LGPL: https://www.gnu.org/licenses/old-licenses/lgpl-2.1.en.html # For details: https://github.com/pylint-dev/astroid/blob/main/LICENSE # Copyright (c) https://github.com/pylint-dev/astroid/blob/main/CONTRIBUTORS.txt """Tests for the astroid builder and rebuilder module.""" import collections import importlib import os import pathlib import py_compile import socket import sys import tempfile import textwrap import unittest import unittest.mock import pytest from astroid import Instance, builder, nodes, test_utils, util from astroid.const import IS_PYPY, PY38, PY39_PLUS, PYPY_7_3_11_PLUS from astroid.exceptions import ( AstroidBuildingError, AstroidSyntaxError, AttributeInferenceError, InferenceError, StatementMissing, ) from astroid.nodes.scoped_nodes import Module from . 
import resources class FromToLineNoTest(unittest.TestCase): def setUp(self) -> None: self.astroid = resources.build_file("data/format.py") def test_callfunc_lineno(self) -> None: stmts = self.astroid.body # on line 4: # function('aeozrijz\ # earzer', hop) discard = stmts[0] self.assertIsInstance(discard, nodes.Expr) self.assertEqual(discard.fromlineno, 4) self.assertEqual(discard.tolineno, 5) callfunc = discard.value self.assertIsInstance(callfunc, nodes.Call) self.assertEqual(callfunc.fromlineno, 4) self.assertEqual(callfunc.tolineno, 5) name = callfunc.func self.assertIsInstance(name, nodes.Name) self.assertEqual(name.fromlineno, 4) self.assertEqual(name.tolineno, 4) strarg = callfunc.args[0] self.assertIsInstance(strarg, nodes.Const) if IS_PYPY: self.assertEqual(strarg.fromlineno, 4) if not PY39_PLUS: self.assertEqual(strarg.tolineno, 4) else: self.assertEqual(strarg.tolineno, 5) else: self.assertEqual(strarg.fromlineno, 4) self.assertEqual(strarg.tolineno, 5) namearg = callfunc.args[1] self.assertIsInstance(namearg, nodes.Name) self.assertEqual(namearg.fromlineno, 5) self.assertEqual(namearg.tolineno, 5) # on line 10: # fonction(1, # 2, # 3, # 4) discard = stmts[2] self.assertIsInstance(discard, nodes.Expr) self.assertEqual(discard.fromlineno, 10) self.assertEqual(discard.tolineno, 13) callfunc = discard.value self.assertIsInstance(callfunc, nodes.Call) self.assertEqual(callfunc.fromlineno, 10) self.assertEqual(callfunc.tolineno, 13) name = callfunc.func self.assertIsInstance(name, nodes.Name) self.assertEqual(name.fromlineno, 10) self.assertEqual(name.tolineno, 10) for i, arg in enumerate(callfunc.args): self.assertIsInstance(arg, nodes.Const) self.assertEqual(arg.fromlineno, 10 + i) self.assertEqual(arg.tolineno, 10 + i) def test_function_lineno(self) -> None: stmts = self.astroid.body # on line 15: # def definition(a, # b, # c): # return a + b + c function = stmts[3] self.assertIsInstance(function, nodes.FunctionDef) self.assertEqual(function.fromlineno, 15) 
self.assertEqual(function.tolineno, 18) return_ = function.body[0] self.assertIsInstance(return_, nodes.Return) self.assertEqual(return_.fromlineno, 18) self.assertEqual(return_.tolineno, 18) def test_decorated_function_lineno(self) -> None: astroid = builder.parse( """ @decorator def function( arg): print (arg) """, __name__, ) function = astroid["function"] # XXX discussable, but that's what is expected by pylint right now, similar to ClassDef self.assertEqual(function.fromlineno, 3) self.assertEqual(function.tolineno, 5) self.assertEqual(function.decorators.fromlineno, 2) self.assertEqual(function.decorators.tolineno, 2) @staticmethod def test_decorated_class_lineno() -> None: code = textwrap.dedent( """ class A: # L2 ... @decorator class B: # L6 ... @deco1 @deco2( var=42 ) class C: # L13 ... """ ) ast_module: nodes.Module = builder.parse(code) # type: ignore[assignment] a = ast_module.body[0] assert isinstance(a, nodes.ClassDef) assert a.fromlineno == 2 assert a.tolineno == 3 b = ast_module.body[1] assert isinstance(b, nodes.ClassDef) assert b.fromlineno == 6 assert b.tolineno == 7 c = ast_module.body[2] assert isinstance(c, nodes.ClassDef) if IS_PYPY and PY38 and not PYPY_7_3_11_PLUS: # Not perfect, but best we can do for PyPy 3.8 (< v7.3.11). # Can't detect closing bracket on new line. assert c.fromlineno == 12 else: assert c.fromlineno == 13 assert c.tolineno == 14 @staticmethod def test_class_with_docstring() -> None: """Test class nodes which only have docstrings.""" code = textwrap.dedent( '''\ class A: """My docstring""" var = 1 class B: """My docstring""" class C: """My docstring is long.""" class D: """My docstring is long. """ class E: ... 
''' ) ast_module = builder.parse(code) a = ast_module.body[0] assert isinstance(a, nodes.ClassDef) assert a.fromlineno == 1 assert a.tolineno == 3 b = ast_module.body[1] assert isinstance(b, nodes.ClassDef) assert b.fromlineno == 5 assert b.tolineno == 6 c = ast_module.body[2] assert isinstance(c, nodes.ClassDef) assert c.fromlineno == 8 assert c.tolineno == 10 d = ast_module.body[3] assert isinstance(d, nodes.ClassDef) assert d.fromlineno == 12 assert d.tolineno == 15 e = ast_module.body[4] assert isinstance(d, nodes.ClassDef) assert e.fromlineno == 17 assert e.tolineno == 18 @staticmethod def test_function_with_docstring() -> None: """Test function defintions with only docstrings.""" code = textwrap.dedent( '''\ def a(): """My docstring""" var = 1 def b(): """My docstring""" def c(): """My docstring is long.""" def d(): """My docstring is long. """ def e(a, b): """My docstring is long. """ ''' ) ast_module = builder.parse(code) a = ast_module.body[0] assert isinstance(a, nodes.FunctionDef) assert a.fromlineno == 1 assert a.tolineno == 3 b = ast_module.body[1] assert isinstance(b, nodes.FunctionDef) assert b.fromlineno == 5 assert b.tolineno == 6 c = ast_module.body[2] assert isinstance(c, nodes.FunctionDef) assert c.fromlineno == 8 assert c.tolineno == 10 d = ast_module.body[3] assert isinstance(d, nodes.FunctionDef) assert d.fromlineno == 12 assert d.tolineno == 15 e = ast_module.body[4] assert isinstance(e, nodes.FunctionDef) assert e.fromlineno == 17 assert e.tolineno == 20 def test_class_lineno(self) -> None: stmts = self.astroid.body # on line 20: # class debile(dict, # object): # pass class_ = stmts[4] self.assertIsInstance(class_, nodes.ClassDef) self.assertEqual(class_.fromlineno, 20) self.assertEqual(class_.tolineno, 22) self.assertEqual(class_.blockstart_tolineno, 21) pass_ = class_.body[0] self.assertIsInstance(pass_, nodes.Pass) self.assertEqual(pass_.fromlineno, 22) self.assertEqual(pass_.tolineno, 22) def test_if_lineno(self) -> None: stmts = 
self.astroid.body # on line 20: # if aaaa: pass # else: # aaaa,bbbb = 1,2 # aaaa,bbbb = bbbb,aaaa if_ = stmts[5] self.assertIsInstance(if_, nodes.If) self.assertEqual(if_.fromlineno, 24) self.assertEqual(if_.tolineno, 27) self.assertEqual(if_.blockstart_tolineno, 24) self.assertEqual(if_.orelse[0].fromlineno, 26) self.assertEqual(if_.orelse[1].tolineno, 27) def test_for_while_lineno(self) -> None: for code in ( """ for a in range(4): print (a) break else: print ("bouh") """, """ while a: print (a) break else: print ("bouh") """, ): astroid = builder.parse(code, __name__) stmt = astroid.body[0] self.assertEqual(stmt.fromlineno, 2) self.assertEqual(stmt.tolineno, 6) self.assertEqual(stmt.blockstart_tolineno, 2) self.assertEqual(stmt.orelse[0].fromlineno, 6) # XXX self.assertEqual(stmt.orelse[0].tolineno, 6) def test_try_except_lineno(self) -> None: astroid = builder.parse( """ try: print (a) except: pass else: print ("bouh") """, __name__, ) try_ = astroid.body[0] self.assertEqual(try_.fromlineno, 2) self.assertEqual(try_.tolineno, 7) self.assertEqual(try_.blockstart_tolineno, 2) self.assertEqual(try_.orelse[0].fromlineno, 7) # XXX self.assertEqual(try_.orelse[0].tolineno, 7) hdlr = try_.handlers[0] self.assertEqual(hdlr.fromlineno, 4) self.assertEqual(hdlr.tolineno, 5) self.assertEqual(hdlr.blockstart_tolineno, 4) def test_try_finally_lineno(self) -> None: astroid = builder.parse( """ try: print (a) finally: print ("bouh") """, __name__, ) try_ = astroid.body[0] self.assertEqual(try_.fromlineno, 2) self.assertEqual(try_.tolineno, 5) self.assertEqual(try_.blockstart_tolineno, 2) self.assertEqual(try_.finalbody[0].fromlineno, 5) # XXX self.assertEqual(try_.finalbody[0].tolineno, 5) def test_try_finally_25_lineno(self) -> None: astroid = builder.parse( """ try: print (a) except: pass finally: print ("bouh") """, __name__, ) try_ = astroid.body[0] self.assertEqual(try_.fromlineno, 2) self.assertEqual(try_.tolineno, 7) self.assertEqual(try_.blockstart_tolineno, 2) 
self.assertEqual(try_.finalbody[0].fromlineno, 7) # XXX self.assertEqual(try_.finalbody[0].tolineno, 7) def test_with_lineno(self) -> None: astroid = builder.parse( """ from __future__ import with_statement with file("/tmp/pouet") as f: print (f) """, __name__, ) with_ = astroid.body[1] self.assertEqual(with_.fromlineno, 3) self.assertEqual(with_.tolineno, 4) self.assertEqual(with_.blockstart_tolineno, 3) class BuilderTest(unittest.TestCase): def setUp(self) -> None: self.manager = test_utils.brainless_manager() self.builder = builder.AstroidBuilder(self.manager) def test_data_build_null_bytes(self) -> None: with self.assertRaises(AstroidSyntaxError): self.builder.string_build("\x00") def test_data_build_invalid_x_escape(self) -> None: with self.assertRaises(AstroidSyntaxError): self.builder.string_build('"\\x1"') def test_data_build_error_filename(self) -> None: """Check that error filename is set to modname if given.""" with pytest.raises(AstroidSyntaxError, match="invalid escape sequence") as ctx: self.builder.string_build("'\\d+\\.\\d+'") assert isinstance(ctx.value.error, SyntaxError) assert ctx.value.error.filename == "" with pytest.raises(AstroidSyntaxError, match="invalid escape sequence") as ctx: self.builder.string_build("'\\d+\\.\\d+'", modname="mymodule") assert isinstance(ctx.value.error, SyntaxError) assert ctx.value.error.filename == "mymodule" def test_missing_newline(self) -> None: """Check that a file with no trailing new line is parseable.""" resources.build_file("data/noendingnewline.py") def test_missing_file(self) -> None: with self.assertRaises(AstroidBuildingError): resources.build_file("data/inexistent.py") def test_inspect_build0(self) -> None: """Test astroid tree build from a living object.""" builtin_ast = self.manager.ast_from_module_name("builtins") # just check type and object are there builtin_ast.getattr("type") objectastroid = builtin_ast.getattr("object")[0] self.assertIsInstance(objectastroid.getattr("__new__")[0], 
nodes.FunctionDef) # check open file alias builtin_ast.getattr("open") # check 'help' is there (defined dynamically by site.py) builtin_ast.getattr("help") # check property has __init__ pclass = builtin_ast["property"] self.assertIn("__init__", pclass) self.assertIsInstance(builtin_ast["None"], nodes.Const) self.assertIsInstance(builtin_ast["True"], nodes.Const) self.assertIsInstance(builtin_ast["False"], nodes.Const) self.assertIsInstance(builtin_ast["Exception"], nodes.ClassDef) self.assertIsInstance(builtin_ast["NotImplementedError"], nodes.ClassDef) def test_inspect_build1(self) -> None: time_ast = self.manager.ast_from_module_name("time") self.assertTrue(time_ast) self.assertEqual(time_ast["time"].args.defaults, None) def test_inspect_build3(self) -> None: self.builder.inspect_build(unittest) def test_inspect_build_type_object(self) -> None: builtin_ast = self.manager.ast_from_module_name("builtins") inferred = list(builtin_ast.igetattr("object")) self.assertEqual(len(inferred), 1) inferred = inferred[0] self.assertEqual(inferred.name, "object") inferred.as_string() # no crash test inferred = list(builtin_ast.igetattr("type")) self.assertEqual(len(inferred), 1) inferred = inferred[0] self.assertEqual(inferred.name, "type") inferred.as_string() # no crash test def test_inspect_transform_module(self) -> None: # ensure no cached version of the time module self.manager._mod_file_cache.pop(("time", None), None) self.manager.astroid_cache.pop("time", None) def transform_time(node: Module) -> None: if node.name == "time": node.transformed = True self.manager.register_transform(nodes.Module, transform_time) try: time_ast = self.manager.ast_from_module_name("time") self.assertTrue(getattr(time_ast, "transformed", False)) finally: self.manager.unregister_transform(nodes.Module, transform_time) def test_package_name(self) -> None: """Test base properties and method of an astroid module.""" datap = resources.build_file("data/__init__.py", "data") 
self.assertEqual(datap.name, "data") self.assertEqual(datap.package, 1) datap = resources.build_file("data/__init__.py", "data.__init__") self.assertEqual(datap.name, "data") self.assertEqual(datap.package, 1) datap = resources.build_file("data/tmp__init__.py", "data.tmp__init__") self.assertEqual(datap.name, "data.tmp__init__") self.assertEqual(datap.package, 0) def test_yield_parent(self) -> None: """Check if we added discard nodes as yield parent (w/ compiler).""" code = """ def yiell(): #@ yield 0 if noe: yield more """ func = builder.extract_node(code) self.assertIsInstance(func, nodes.FunctionDef) stmt = func.body[0] self.assertIsInstance(stmt, nodes.Expr) self.assertIsInstance(stmt.value, nodes.Yield) self.assertIsInstance(func.body[1].body[0], nodes.Expr) self.assertIsInstance(func.body[1].body[0].value, nodes.Yield) def test_object(self) -> None: obj_ast = self.builder.inspect_build(object) self.assertIn("__setattr__", obj_ast) def test_newstyle_detection(self) -> None: data = """ class A: "old style" class B(A): "old style" class C(object): "new style" class D(C): "new style" __metaclass__ = type class E(A): "old style" class F: "new style" """ mod_ast = builder.parse(data, __name__) self.assertTrue(mod_ast["A"].newstyle) self.assertTrue(mod_ast["B"].newstyle) self.assertTrue(mod_ast["E"].newstyle) self.assertTrue(mod_ast["C"].newstyle) self.assertTrue(mod_ast["D"].newstyle) self.assertTrue(mod_ast["F"].newstyle) def test_globals(self) -> None: data = """ CSTE = 1 def update_global(): global CSTE CSTE += 1 def global_no_effect(): global CSTE2 print (CSTE) """ astroid = builder.parse(data, __name__) self.assertEqual(len(astroid.getattr("CSTE")), 2) self.assertIsInstance(astroid.getattr("CSTE")[0], nodes.AssignName) self.assertEqual(astroid.getattr("CSTE")[0].fromlineno, 2) self.assertEqual(astroid.getattr("CSTE")[1].fromlineno, 6) with self.assertRaises(AttributeInferenceError): astroid.getattr("CSTE2") with self.assertRaises(InferenceError): 
next(astroid["global_no_effect"].ilookup("CSTE2")) def test_socket_build(self) -> None: astroid = self.builder.module_build(socket) # XXX just check the first one. Actually 3 objects are inferred (look at # the socket module) but the last one as those attributes dynamically # set and astroid is missing this. for fclass in astroid.igetattr("socket"): self.assertIn("connect", fclass) self.assertIn("send", fclass) self.assertIn("close", fclass) break def test_gen_expr_var_scope(self) -> None: data = "l = list(n for n in range(10))\n" astroid = builder.parse(data, __name__) # n unavailable outside gen expr scope self.assertNotIn("n", astroid) # test n is inferable anyway n = test_utils.get_name_node(astroid, "n") self.assertIsNot(n.scope(), astroid) self.assertEqual([i.__class__ for i in n.infer()], [util.Uninferable.__class__]) def test_no_future_imports(self) -> None: mod = builder.parse("import sys") self.assertEqual(set(), mod.future_imports) def test_future_imports(self) -> None: mod = builder.parse("from __future__ import print_function") self.assertEqual({"print_function"}, mod.future_imports) def test_two_future_imports(self) -> None: mod = builder.parse( """ from __future__ import print_function from __future__ import absolute_import """ ) self.assertEqual({"print_function", "absolute_import"}, mod.future_imports) def test_inferred_build(self) -> None: code = """ class A: pass A.type = "class" def A_assign_type(self): print (self) A.assign_type = A_assign_type """ astroid = builder.parse(code) lclass = list(astroid.igetattr("A")) self.assertEqual(len(lclass), 1) lclass = lclass[0] self.assertIn("assign_type", lclass.locals) self.assertIn("type", lclass.locals) def test_infer_can_assign_regular_object(self) -> None: mod = builder.parse( """ class A: pass a = A() a.value = "is set" a.other = "is set" """ ) obj = list(mod.igetattr("a")) self.assertEqual(len(obj), 1) obj = obj[0] self.assertIsInstance(obj, Instance) self.assertIn("value", obj.instance_attrs) 
self.assertIn("other", obj.instance_attrs) def test_infer_can_assign_has_slots(self) -> None: mod = builder.parse( """ class A: __slots__ = ('value',) a = A() a.value = "is set" a.other = "not set" """ ) obj = list(mod.igetattr("a")) self.assertEqual(len(obj), 1) obj = obj[0] self.assertIsInstance(obj, Instance) self.assertIn("value", obj.instance_attrs) self.assertNotIn("other", obj.instance_attrs) def test_infer_can_assign_no_classdict(self) -> None: mod = builder.parse( """ a = object() a.value = "not set" """ ) obj = list(mod.igetattr("a")) self.assertEqual(len(obj), 1) obj = obj[0] self.assertIsInstance(obj, Instance) self.assertNotIn("value", obj.instance_attrs) def test_augassign_attr(self) -> None: builder.parse( """ class Counter: v = 0 def inc(self): self.v += 1 """, __name__, ) # TODO: Check self.v += 1 generate AugAssign(AssAttr(...)), # not AugAssign(GetAttr(AssName...)) def test_inferred_dont_pollute(self) -> None: code = """ def func(a=None): a.custom_attr = 0 def func2(a={}): a.custom_attr = 0 """ builder.parse(code) # pylint: disable=no-member nonetype = nodes.const_factory(None) self.assertNotIn("custom_attr", nonetype.locals) self.assertNotIn("custom_attr", nonetype.instance_attrs) nonetype = nodes.const_factory({}) self.assertNotIn("custom_attr", nonetype.locals) self.assertNotIn("custom_attr", nonetype.instance_attrs) def test_asstuple(self) -> None: code = "a, b = range(2)" astroid = builder.parse(code) self.assertIn("b", astroid.locals) code = """ def visit_if(self, node): node.test, body = node.tests[0] """ astroid = builder.parse(code) self.assertIn("body", astroid["visit_if"].locals) def test_build_constants(self) -> None: """Test expected values of constants after rebuilding.""" code = """ def func(): return None return return 'None' """ astroid = builder.parse(code) none, nothing, chain = (ret.value for ret in astroid.body[0].body) self.assertIsInstance(none, nodes.Const) self.assertIsNone(none.value) self.assertIsNone(nothing) 
self.assertIsInstance(chain, nodes.Const) self.assertEqual(chain.value, "None") def test_not_implemented(self) -> None: node = builder.extract_node( """ NotImplemented #@ """ ) inferred = next(node.infer()) self.assertIsInstance(inferred, nodes.Const) self.assertEqual(inferred.value, NotImplemented) def test_type_comments_without_content(self) -> None: node = builder.parse( """ a = 1 # type: # any comment """ ) assert node class FileBuildTest(unittest.TestCase): def setUp(self) -> None: self.module = resources.build_file("data/module.py", "data.module") def test_module_base_props(self) -> None: """Test base properties and method of an astroid module.""" module = self.module self.assertEqual(module.name, "data.module") assert isinstance(module.doc_node, nodes.Const) self.assertEqual(module.doc_node.value, "test module for astroid\n") self.assertEqual(module.fromlineno, 0) self.assertIsNone(module.parent) self.assertEqual(module.frame(), module) self.assertEqual(module.frame(), module) self.assertEqual(module.root(), module) self.assertEqual(module.file, os.path.abspath(resources.find("data/module.py"))) self.assertEqual(module.pure_python, 1) self.assertEqual(module.package, 0) self.assertFalse(module.is_statement) with self.assertRaises(StatementMissing): with pytest.warns(DeprecationWarning) as records: self.assertEqual(module.statement(future=True), module) assert len(records) == 1 with self.assertRaises(StatementMissing): module.statement() def test_module_locals(self) -> None: """Test the 'locals' dictionary of an astroid module.""" module = self.module _locals = module.locals self.assertIs(_locals, module.globals) keys = sorted(_locals.keys()) should = [ "MY_DICT", "NameNode", "YO", "YOUPI", "__revision__", "global_access", "modutils", "four_args", "os", "redirect", ] should.sort() self.assertEqual(keys, sorted(should)) def test_function_base_props(self) -> None: """Test base properties and method of an astroid function.""" module = self.module function = 
module["global_access"] self.assertEqual(function.name, "global_access") assert isinstance(function.doc_node, nodes.Const) self.assertEqual(function.doc_node.value, "function test") self.assertEqual(function.fromlineno, 11) self.assertTrue(function.parent) self.assertEqual(function.frame(), function) self.assertEqual(function.parent.frame(), module) self.assertEqual(function.frame(), function) self.assertEqual(function.parent.frame(), module) self.assertEqual(function.root(), module) self.assertEqual([n.name for n in function.args.args], ["key", "val"]) self.assertEqual(function.type, "function") def test_function_locals(self) -> None: """Test the 'locals' dictionary of an astroid function.""" _locals = self.module["global_access"].locals self.assertEqual(len(_locals), 4) keys = sorted(_locals.keys()) self.assertEqual(keys, ["i", "key", "local", "val"]) def test_class_base_props(self) -> None: """Test base properties and method of an astroid class.""" module = self.module klass = module["YO"] self.assertEqual(klass.name, "YO") assert isinstance(klass.doc_node, nodes.Const) self.assertEqual(klass.doc_node.value, "hehe\n haha") self.assertEqual(klass.fromlineno, 25) self.assertTrue(klass.parent) self.assertEqual(klass.frame(), klass) self.assertEqual(klass.parent.frame(), module) self.assertEqual(klass.frame(), klass) self.assertEqual(klass.parent.frame(), module) self.assertEqual(klass.root(), module) self.assertEqual(klass.basenames, []) self.assertTrue(klass.newstyle) def test_class_locals(self) -> None: """Test the 'locals' dictionary of an astroid class.""" module = self.module klass1 = module["YO"] locals1 = klass1.locals keys = sorted(locals1.keys()) assert_keys = ["__init__", "__module__", "__qualname__", "a"] self.assertEqual(keys, assert_keys) klass2 = module["YOUPI"] locals2 = klass2.locals keys = locals2.keys() assert_keys = [ "__init__", "__module__", "__qualname__", "class_attr", "class_method", "method", "static_method", ] 
self.assertEqual(sorted(keys), assert_keys) def test_class_instance_attrs(self) -> None: module = self.module klass1 = module["YO"] klass2 = module["YOUPI"] self.assertEqual(list(klass1.instance_attrs.keys()), ["yo"]) self.assertEqual(list(klass2.instance_attrs.keys()), ["member"]) def test_class_basenames(self) -> None: module = self.module klass1 = module["YO"] klass2 = module["YOUPI"] self.assertEqual(klass1.basenames, []) self.assertEqual(klass2.basenames, ["YO"]) def test_method_base_props(self) -> None: """Test base properties and method of an astroid method.""" klass2 = self.module["YOUPI"] # "normal" method method = klass2["method"] self.assertEqual(method.name, "method") self.assertEqual([n.name for n in method.args.args], ["self"]) assert isinstance(method.doc_node, nodes.Const) self.assertEqual(method.doc_node.value, "method\n test") self.assertEqual(method.fromlineno, 48) self.assertEqual(method.type, "method") # class method method = klass2["class_method"] self.assertEqual([n.name for n in method.args.args], ["cls"]) self.assertEqual(method.type, "classmethod") # static method method = klass2["static_method"] self.assertEqual(method.args.args, []) self.assertEqual(method.type, "staticmethod") def test_method_locals(self) -> None: """Test the 'locals' dictionary of an astroid method.""" method = self.module["YOUPI"]["method"] _locals = method.locals keys = sorted(_locals) # ListComp variables are not accessible outside self.assertEqual(len(_locals), 3) self.assertEqual(keys, ["autre", "local", "self"]) def test_unknown_encoding(self) -> None: with self.assertRaises(AstroidSyntaxError): resources.build_file("data/invalid_encoding.py") def test_module_build_dunder_file() -> None: """Test that module_build() can work with modules that have the *__file__* attribute. 
""" module = builder.AstroidBuilder().module_build(collections) assert module.path[0] == collections.__file__ @pytest.mark.xfail( reason=( "The builtin ast module does not fail with a specific error " "for syntax error caused by invalid type comments." ), ) def test_parse_module_with_invalid_type_comments_does_not_crash(): node = builder.parse( """ # op { # name: "AssignAddVariableOp" # input_arg { # name: "resource" # type: DT_RESOURCE # } # input_arg { # name: "value" # type_attr: "dtype" # } # attr { # name: "dtype" # type: "type" # } # is_stateful: true # } a, b = 2 """ ) assert isinstance(node, nodes.Module) def test_arguments_of_signature() -> None: """Test that arguments is None for function without an inferable signature.""" node = builder.extract_node("int") classdef: nodes.ClassDef = next(node.infer()) assert all(i.args.args is None for i in classdef.getattr("__dir__")) class HermeticInterpreterTest(unittest.TestCase): """Modeled on https://github.com/pylint-dev/astroid/pull/1207#issuecomment-951455588.""" @classmethod def setUpClass(cls): """Simulate a hermetic interpreter environment having no code on the filesystem.""" with tempfile.TemporaryDirectory() as tmp_dir: sys.path.append(tmp_dir) # Write a python file and compile it to .pyc # To make this test have even more value, we would need to come up with some # code that gets inferred differently when we get its "partial representation". # This code is too simple for that. But we can't use builtins either, because we would # have to delete builtins from the filesystem. But even if we engineered that, # the difference might evaporate over time as inference changes. 
cls.code_snippet = "def func(): return 42" with tempfile.NamedTemporaryFile( mode="w", dir=tmp_dir, suffix=".py", delete=False ) as tmp: tmp.write(cls.code_snippet) pyc_file = py_compile.compile(tmp.name) cls.pyc_name = tmp.name.replace(".py", ".pyc") os.remove(tmp.name) os.rename(pyc_file, cls.pyc_name) # Import the module cls.imported_module_path = pathlib.Path(cls.pyc_name) cls.imported_module = importlib.import_module(cls.imported_module_path.stem) # Delete source code from module object, filesystem, and path del cls.imported_module.__file__ os.remove(cls.imported_module_path) sys.path.remove(tmp_dir) def test_build_from_live_module_without_source_file(self) -> None: """Assert that inspect_build() is not called. See comment in module_build() before the call to inspect_build(): "get a partial representation by introspection" This "partial representation" was presumably causing unexpected behavior. """ # Sanity check self.assertIsNone( self.imported_module.__loader__.get_source(self.imported_module_path.stem) ) with self.assertRaises(AttributeError): _ = self.imported_module.__file__ my_builder = builder.AstroidBuilder() with unittest.mock.patch.object( self.imported_module.__loader__, "get_source", return_value=self.code_snippet, ): with unittest.mock.patch.object( my_builder, "inspect_build", side_effect=AssertionError ): my_builder.module_build( self.imported_module, modname=self.imported_module_path.stem ) astroid-3.2.2/tests/test_filter_statements.py0000664000175000017500000000120214622475517021410 0ustar epsilonepsilon# Licensed under the LGPL: https://www.gnu.org/licenses/old-licenses/lgpl-2.1.en.html # For details: https://github.com/pylint-dev/astroid/blob/main/LICENSE # Copyright (c) https://github.com/pylint-dev/astroid/blob/main/CONTRIBUTORS.txt from astroid.builder import extract_node from astroid.filter_statements import _filter_stmts from astroid.nodes import EmptyNode def test_empty_node() -> None: enum_mod = extract_node("import enum") empty = 
EmptyNode(parent=enum_mod) empty.is_statement = True filtered_statements = _filter_stmts(empty, [empty.statement()], empty.frame(), 0) assert filtered_statements[0] is empty astroid-3.2.2/tests/testdata/0000775000175000017500000000000014622475517016061 5ustar epsilonepsilonastroid-3.2.2/tests/testdata/python3/0000775000175000017500000000000014622475517017465 5ustar epsilonepsilonastroid-3.2.2/tests/testdata/python3/data/0000775000175000017500000000000014622475517020376 5ustar epsilonepsilonastroid-3.2.2/tests/testdata/python3/data/conditional.py0000664000175000017500000000012714622475517023253 0ustar epsilonepsilonfrom data.conditional_import import ( dump, # dumps, # load, # loads, )astroid-3.2.2/tests/testdata/python3/data/max_inferable_limit_for_classes/0000775000175000017500000000000014622475517026753 5ustar epsilonepsilonastroid-3.2.2/tests/testdata/python3/data/max_inferable_limit_for_classes/main.py0000664000175000017500000000123314622475517030250 0ustar epsilonepsilon"""This example is based on sqlalchemy. See https://github.com/pylint-dev/pylint/issues/5679 """ from other_funcs import FromClause from .nodes import roles class HasMemoized(object): ... class Generative(HasMemoized): ... class ColumnElement( roles.ColumnArgumentOrKeyRole, roles.BinaryElementRole, roles.OrderByRole, roles.ColumnsClauseRole, roles.LimitOffsetRole, roles.DMLColumnRole, roles.DDLConstraintColumnRole, roles.StatementRole, Generative, ): ... class FunctionElement(ColumnElement, FromClause): ... class months_between(FunctionElement): def __init__(self): super().__init__() astroid-3.2.2/tests/testdata/python3/data/max_inferable_limit_for_classes/other_funcs.py0000664000175000017500000000071314622475517031645 0ustar epsilonepsilonfrom operator import attrgetter from .nodes import roles class HasCacheKey(object): ... class HasMemoized(object): ... class MemoizedHasCacheKey(HasCacheKey, HasMemoized): ... class ClauseElement(MemoizedHasCacheKey): ... 
class ReturnsRows(roles.ReturnsRowsRole, ClauseElement): ... class Selectable(ReturnsRows): ... class FromClause(roles.AnonymizedFromClauseRole, Selectable): c = property(attrgetter("columns")) astroid-3.2.2/tests/testdata/python3/data/max_inferable_limit_for_classes/nodes/0000775000175000017500000000000014622475517030063 5ustar epsilonepsilonastroid-3.2.2/tests/testdata/python3/data/max_inferable_limit_for_classes/nodes/roles.py0000664000175000017500000000206014622475517031557 0ustar epsilonepsilonclass SQLRole(object): ... class UsesInspection(object): ... class AllowsLambdaRole(object): ... class ColumnArgumentRole(SQLRole): ... class ColumnArgumentOrKeyRole(ColumnArgumentRole): ... class ColumnListRole(SQLRole): ... class ColumnsClauseRole(AllowsLambdaRole, UsesInspection, ColumnListRole): ... class LimitOffsetRole(SQLRole): ... class ByOfRole(ColumnListRole): ... class OrderByRole(AllowsLambdaRole, ByOfRole): ... class StructuralRole(SQLRole): ... class ExpressionElementRole(SQLRole): ... class BinaryElementRole(ExpressionElementRole): ... class JoinTargetRole(AllowsLambdaRole, UsesInspection, StructuralRole): ... class FromClauseRole(ColumnsClauseRole, JoinTargetRole): ... class StrictFromClauseRole(FromClauseRole): ... class AnonymizedFromClauseRole(StrictFromClauseRole): ... class ReturnsRowsRole(SQLRole): ... class StatementRole(SQLRole): ... class DMLColumnRole(SQLRole): ... class DDLConstraintColumnRole(SQLRole): ... astroid-3.2.2/tests/testdata/python3/data/MyPyPa-0.1.0-py2.5.zip0000664000175000017500000000230614622475517023607 0ustar epsilonepsilon
c = ~b c = not b d = [c] e = d[:] e = d[a:b:c] raise_string(*args, **kwargs) print('bonjour', file=stream) print('salut', end=' ', file=stream) def make_class(any, base=data.module.YO, *args, **kwargs): """check base is correctly resolved to Concrete0""" class Aaaa(base): """dynamic class""" return Aaaa from os.path import abspath import os as myos class A: pass class A(A): pass def generator(): """A generator.""" yield def not_a_generator(): """A function that contains generator, but is not one.""" def generator(): yield genl = lambda: (yield) def with_metaclass(meta, *bases): return meta('NewBase', bases, {}) class NotMetaclass(with_metaclass(Metaclass)): pass astroid-3.2.2/tests/testdata/python3/data/SSL1/0000775000175000017500000000000014622475517021120 5ustar epsilonepsilonastroid-3.2.2/tests/testdata/python3/data/SSL1/__init__.py0000664000175000017500000000004414622475517023227 0ustar epsilonepsilonfrom .Connection1 import Connection astroid-3.2.2/tests/testdata/python3/data/SSL1/Connection1.py0000664000175000017500000000013514622475517023651 0ustar epsilonepsilon class Connection: def __init__(self, ctx, sock=None): print('init Connection') astroid-3.2.2/tests/testdata/python3/data/tmp__init__.py0000664000175000017500000000000014622475517023216 0ustar epsilonepsilonastroid-3.2.2/tests/testdata/python3/data/metaclass_recursion/0000775000175000017500000000000014622475517024443 5ustar epsilonepsilonastroid-3.2.2/tests/testdata/python3/data/metaclass_recursion/__init__.py0000664000175000017500000000000014622475517026542 0ustar epsilonepsilonastroid-3.2.2/tests/testdata/python3/data/metaclass_recursion/parent.py0000664000175000017500000000012114622475517026300
0ustar epsilonepsilon# https://github.com/pylint-dev/astroid/issues/749 class OriginalClass: pass astroid-3.2.2/tests/testdata/python3/data/metaclass_recursion/monkeypatch.py0000664000175000017500000000114614622475517027341 0ustar epsilonepsilon# https://github.com/pylint-dev/astroid/issues/749 # Not an actual module but allows us to reproduce the issue from tests.testdata.python3.data.metaclass_recursion import parent class MonkeyPatchClass(parent.OriginalClass): _original_class = parent.OriginalClass @classmethod def patch(cls): if parent.OriginalClass != MonkeyPatchClass: cls._original_class = parent.OriginalClass parent.OriginalClass = MonkeyPatchClass @classmethod def unpatch(cls): if parent.OriginalClass == MonkeyPatchClass: parent.OriginalClass = cls._original_class astroid-3.2.2/tests/testdata/python3/data/module.py0000664000175000017500000000337014622475517022240 0ustar epsilonepsilon"""test module for astroid """ __revision__ = '$Id: module.py,v 1.2 2005-11-02 11:56:54 syt Exp $' from astroid.nodes.node_classes import Name as NameNode from astroid import modutils from astroid.utils import * import os.path MY_DICT = {} def global_access(key, val): """function test""" local = 1 MY_DICT[key] = val for i in val: if i: del MY_DICT[i] continue else: break else: return class YO: """hehe haha""" a = 1 def __init__(self): try: self.yo = 1 except ValueError as ex: pass except (NameError, TypeError): raise XXXError() except: raise class YOUPI(YO): class_attr = None def __init__(self): self.member = None def method(self): """method test""" global MY_DICT try: MY_DICT = {} local = None autre = [a for (a, b) in MY_DICT if b] if b in autre: return elif a in autre: return 'hehe' global_access(local, val=autre) finally: return local def static_method(): """static method test""" assert MY_DICT, '???' 
static_method = staticmethod(static_method) def class_method(cls): """class method test""" exec(a, b) class_method = classmethod(class_method) def four_args(a, b, c, d): """four arguments (was nested_args)""" while 1: if a: break a += +1 else: b += -2 if c: d = a and (b or c) else: c = a and b or d list(map(lambda x, y: (y, x), a)) redirect = four_args astroid-3.2.2/tests/testdata/python3/data/email.py0000664000175000017500000000010614622475517022034 0ustar epsilonepsilon"""fake email module to test absolute import doesn't grab this one""" astroid-3.2.2/tests/testdata/python3/data/absimp/0000775000175000017500000000000014622475517021651 5ustar epsilonepsilonastroid-3.2.2/tests/testdata/python3/data/absimp/sidepackage/0000775000175000017500000000000014622475517024111 5ustar epsilonepsilonastroid-3.2.2/tests/testdata/python3/data/absimp/sidepackage/__init__.py0000664000175000017500000000005214622475517026217 0ustar epsilonepsilon"""a side package with nothing in it """ astroid-3.2.2/tests/testdata/python3/data/absimp/__init__.py0000664000175000017500000000013114622475517023755 0ustar epsilonepsilon"""a package with absolute import activated """ from __future__ import absolute_import astroid-3.2.2/tests/testdata/python3/data/absimp/string.py0000664000175000017500000000012314622475517023525 0ustar epsilonepsilonfrom __future__ import absolute_import, print_function import string print(string) astroid-3.2.2/tests/testdata/python3/data/beyond_top_level_two/0000775000175000017500000000000014622475517024620 5ustar epsilonepsilonastroid-3.2.2/tests/testdata/python3/data/beyond_top_level_two/a.py0000664000175000017500000000024314622475517025411 0ustar epsilonepsilon# pylint: disable=missing-docstring from .level1.beyond_top_level_two import func def do_something(var, some_other_var): # error func(var, some_other_var) astroid-3.2.2/tests/testdata/python3/data/beyond_top_level_two/level1/0000775000175000017500000000000014622475517026010 5ustar 
epsilonepsilonastroid-3.2.2/tests/testdata/python3/data/beyond_top_level_two/level1/__init__.py0000664000175000017500000000003014622475517030112 0ustar epsilonepsilondef func(var): pass astroid-3.2.2/tests/testdata/python3/data/beyond_top_level_two/level1/beyond_top_level_two.py0000664000175000017500000000005014622475517032577 0ustar epsilonepsilondef func(var, some_other_var): pass astroid-3.2.2/tests/testdata/python3/data/beyond_top_level_two/__init__.py0000664000175000017500000000000014622475517026717 0ustar epsilonepsilonastroid-3.2.2/tests/testdata/python3/data/module_dict_items_call/0000775000175000017500000000000014622475517025062 5ustar epsilonepsilonastroid-3.2.2/tests/testdata/python3/data/module_dict_items_call/models.py0000664000175000017500000000006314622475517026716 0ustar epsilonepsilonimport re class MyModel: class_attribute = 1 astroid-3.2.2/tests/testdata/python3/data/module_dict_items_call/test.py0000664000175000017500000000022414622475517026411 0ustar epsilonepsilonimport models def func(): for _, value in models.__dict__.items(): if isinstance(value, type): value.class_attribute += 1 astroid-3.2.2/tests/testdata/python3/data/foogle_fax-0.12.5-py2.7-nspkg.pth0000664000175000017500000000116514622475517025745 0ustar epsilonepsilonimport sys, types, os;p = os.path.join(sys._getframe(1).f_locals['sitedir'], *('foogle',));ie = os.path.exists(os.path.join(p,'__init__.py'));m = not ie and sys.modules.setdefault('foogle', types.ModuleType('foogle'));mp = (m or []) and m.__dict__.setdefault('__path__',[]);(p not in mp) and mp.append(p) import sys, types, os;p = os.path.join(sys._getframe(1).f_locals['sitedir'], *('foogle','crank'));ie = os.path.exists(os.path.join(p,'__init__.py'));m = not ie and sys.modules.setdefault('foogle.crank', types.ModuleType('foogle.crank'));mp = (m or []) and m.__dict__.setdefault('__path__',[]);(p not in mp) and mp.append(p) astroid-3.2.2/tests/testdata/python3/data/notamodule/0000775000175000017500000000000014622475517022545 
5ustar epsilonepsilonastroid-3.2.2/tests/testdata/python3/data/notamodule/file.py0000664000175000017500000000000014622475517024024 0ustar epsilonepsilonastroid-3.2.2/tests/testdata/python3/data/path_pkg_resources_3/0000775000175000017500000000000014622475517024507 5ustar epsilonepsilonastroid-3.2.2/tests/testdata/python3/data/path_pkg_resources_3/package/0000775000175000017500000000000014622475517026102 5ustar epsilonepsilonastroid-3.2.2/tests/testdata/python3/data/path_pkg_resources_3/package/__init__.py0000664000175000017500000000006714622475517030216 0ustar epsilonepsilon__import__('pkg_resources').declare_namespace(__name__)astroid-3.2.2/tests/testdata/python3/data/path_pkg_resources_3/package/baz.py0000664000175000017500000000000014622475517027216 0ustar epsilonepsilonastroid-3.2.2/tests/testdata/python3/data/path_pkgutil_2/0000775000175000017500000000000014622475517023312 5ustar epsilonepsilonastroid-3.2.2/tests/testdata/python3/data/path_pkgutil_2/package/0000775000175000017500000000000014622475517024705 5ustar epsilonepsilonastroid-3.2.2/tests/testdata/python3/data/path_pkgutil_2/package/__init__.py0000664000175000017500000000011214622475517027010 0ustar epsilonepsilonfrom pkgutil import extend_path __path__ = extend_path(__path__, __name__)astroid-3.2.2/tests/testdata/python3/data/path_pkgutil_2/package/bar.py0000664000175000017500000000000014622475517026011 0ustar epsilonepsilonastroid-3.2.2/tests/testdata/python3/data/all.py0000664000175000017500000000015314622475517021517 0ustar epsilonepsilon name = 'a' _bla = 2 other = 'o' class Aaa: pass def func(): print('yo') __all__ = 'Aaa', '_bla', 'name' astroid-3.2.2/tests/testdata/python3/pyi_data/0000775000175000017500000000000014622475517021257 5ustar epsilonepsilonastroid-3.2.2/tests/testdata/python3/pyi_data/__init__.pyi0000664000175000017500000000000014622475517023527 0ustar epsilonepsilonastroid-3.2.2/tests/testdata/python3/pyi_data/find_test/0000775000175000017500000000000014622475517023236 5ustar 
epsilonepsilonastroid-3.2.2/tests/testdata/python3/pyi_data/find_test/__init__.pyi0000664000175000017500000000000014622475517025506 0ustar epsilonepsilonastroid-3.2.2/tests/testdata/python3/pyi_data/find_test/noendingnewline.py0000664000175000017500000000000014622475517026761 0ustar epsilonepsilonastroid-3.2.2/tests/testdata/python3/pyi_data/find_test/__init__.py0000664000175000017500000000000014622475517025335 0ustar epsilonepsilonastroid-3.2.2/tests/testdata/python3/pyi_data/find_test/nonregr.py0000664000175000017500000000000014622475517025250 0ustar epsilonepsilonastroid-3.2.2/tests/testdata/python3/pyi_data/find_test/module2.py0000664000175000017500000000000014622475517025145 0ustar epsilonepsilonastroid-3.2.2/tests/testdata/python3/pyi_data/find_test/module.py0000664000175000017500000000000014622475517025063 0ustar epsilonepsilonastroid-3.2.2/tests/testdata/python3/pyi_data/module.py0000664000175000017500000000000014622475517023104 0ustar epsilonepsilonastroid-3.2.2/tests/testdata/python3/recursion_error.py0000664000175000017500000001227214622475517023265 0ustar epsilonepsilonLONG_CHAINED_METHOD_CALL = """ from a import b ( b.builder('name') .add('name', value='value') .add('name', value='value') .add('name', value='value') .add('name', value='value') .add('name', value='value') .add('name', value='value') .add('name', value='value') .add('name', value='value') .add('name', value='value') .add('name', value='value') .add('name', value='value') .add('name', value='value') .add('name', value='value') .add('name', value='value') .add('name', value='value') .add('name', value='value') .add('name', value='value') .add('name', value='value') .add('name', value='value') .add('name', value='value') .add('name', value='value') .add('name', value='value') .add('name', value='value') .add('name', value='value') .add('name', value='value') .add('name', value='value') .add('name', value='value') .add('name', value='value') .add('name', value='value') .add('name', 
value='value') .add('name', value='value') .add('name', value='value') .add('name', value='value') .add('name', value='value') .add('name', value='value') .add('name', value='value') .add('name', value='value') .add('name', value='value') .add('name', value='value') .add('name', value='value') .add('name', value='value') .add('name', value='value') .add('name', value='value') .add('name', value='value') .add('name', value='value') .add('name', value='value') .add('name', value='value') .add('name', value='value') .add('name', value='value') .add('name', value='value') .add('name', value='value') .add('name', value='value') .add('name', value='value') .add('name', value='value') .add('name', value='value') .add('name', value='value') .add('name', value='value') .add('name', value='value') .add('name', value='value') .add('name', value='value') .add('name', value='value') .add('name', value='value') .add('name', value='value') .add('name', value='value') .add('name', value='value') .add('name', value='value') .add('name', value='value') .add('name', value='value') .add('name', value='value') .add('name', value='value') .add('name', value='value') .add('name', value='value') .add('name', value='value') .add('name', value='value') .add('name', value='value') .add('name', value='value') .add('name', value='value') .add('name', value='value') .add('name', value='value') .add('name', value='value') .add('name', value='value') .add('name', value='value') .add('name', value='value') .add('name', value='value') .add('name', value='value') .add('name', value='value') .add('name', value='value') .add('name', value='value') .add('name', value='value') .add('name', value='value') .add('name', value='value') .add('name', value='value') .add('name', value='value') .add('name', value='value') .add('name', value='value') .add('name', value='value') .add('name', value='value') .add('name', value='value') .add('name', value='value') .add('name', value='value') .add('name', 
value='value') .add('name', value='value') .add('name', value='value') .add('name', value='value') .add('name', value='value') .add('name', value='value') .add('name', value='value') .add('name', value='value') .add('name', value='value') .add('name', value='value') .add('name', value='value') .add('name', value='value') .add('name', value='value') .add('name', value='value') .add('name', value='value') .add('name', value='value') .add('name', value='value') .add('name', value='value') .add('name', value='value') .add('name', value='value') .add('name', value='value') .add('name', value='value') .add('name', value='value') .add('name', value='value') .add('name', value='value') .add('name', value='value') .add('name', value='value') .add('name', value='value') .add('name', value='value') .add('name', value='value') .add('name', value='value') .add('name', value='value') .add('name', value='value') .add('name', value='value') .add('name', value='value') .add('name', value='value') .add('name', value='value') .add('name', value='value') .add('name', value='value') .add('name', value='value') .add('name', value='value') .add('name', value='value') .add('name', value='value') .add('name', value='value') .add('name', value='value') .add('name', value='value') .add('name', value='value') .add('name', value='value') .add('name', value='value') .add('name', value='value') .add('name', value='value') .add('name', value='value') .add('name', value='value') .add('name', value='value') .add('name', value='value') .add('name', value='value') .add('name', value='value') .add('name', value='value') .add('name', value='value') .add('name', value='value') .add('name', value='value') .add('name', value='value') .add('name', value='value') .Build() )""" astroid-3.2.2/tests/test_protocols.py0000664000175000017500000004017514622475517017714 0ustar epsilonepsilon# Licensed under the LGPL: https://www.gnu.org/licenses/old-licenses/lgpl-2.1.en.html # For details: 
https://github.com/pylint-dev/astroid/blob/main/LICENSE # Copyright (c) https://github.com/pylint-dev/astroid/blob/main/CONTRIBUTORS.txt from __future__ import annotations import contextlib import unittest from collections.abc import Callable, Iterator from typing import Any import pytest import astroid from astroid import extract_node, nodes from astroid.const import PY310_PLUS, PY312_PLUS from astroid.exceptions import InferenceError from astroid.manager import AstroidManager from astroid.util import Uninferable, UninferableBase @contextlib.contextmanager def _add_transform( manager: AstroidManager, node: type, transform: Callable, predicate: Any | None = None, ) -> Iterator: manager.register_transform(node, transform, predicate) try: yield finally: manager.unregister_transform(node, transform, predicate) class ProtocolTests(unittest.TestCase): def assertConstNodesEqual( self, nodes_list_expected: list[int], nodes_list_got: list[nodes.Const] ) -> None: self.assertEqual(len(nodes_list_expected), len(nodes_list_got)) for node in nodes_list_got: self.assertIsInstance(node, nodes.Const) for node, expected_value in zip(nodes_list_got, nodes_list_expected): self.assertEqual(expected_value, node.value) def assertNameNodesEqual( self, nodes_list_expected: list[str], nodes_list_got: list[nodes.Name] ) -> None: self.assertEqual(len(nodes_list_expected), len(nodes_list_got)) for node in nodes_list_got: self.assertIsInstance(node, nodes.Name) for node, expected_name in zip(nodes_list_got, nodes_list_expected): self.assertEqual(expected_name, node.name) def test_assigned_stmts_simple_for(self) -> None: assign_stmts = extract_node( """ for a in (1, 2, 3): #@ pass for b in range(3): #@ pass """ ) for1_assnode = next(assign_stmts[0].nodes_of_class(nodes.AssignName)) assigned = list(for1_assnode.assigned_stmts()) self.assertConstNodesEqual([1, 2, 3], assigned) for2_assnode = next(assign_stmts[1].nodes_of_class(nodes.AssignName)) self.assertRaises(InferenceError, list, 
for2_assnode.assigned_stmts()) def test_assigned_stmts_nested_for_tuple(self) -> None: assign_stmts = extract_node( """ for a, (b, c) in [(1, (2, 3))]: #@ pass """ ) assign_nodes = assign_stmts.nodes_of_class(nodes.AssignName) for1_assnode = next(assign_nodes) assigned = list(for1_assnode.assigned_stmts()) self.assertConstNodesEqual([1], assigned) for2_assnode = next(assign_nodes) assigned2 = list(for2_assnode.assigned_stmts()) self.assertConstNodesEqual([2], assigned2) def test_assigned_stmts_nested_for_dict(self) -> None: assign_stmts = extract_node( """ for a, (b, c) in {1: ("a", str), 2: ("b", bytes)}.items(): #@ pass """ ) assign_nodes = assign_stmts.nodes_of_class(nodes.AssignName) # assigned: [1, 2] for1_assnode = next(assign_nodes) assigned = list(for1_assnode.assigned_stmts()) self.assertConstNodesEqual([1, 2], assigned) # assigned2: ["a", "b"] for2_assnode = next(assign_nodes) assigned2 = list(for2_assnode.assigned_stmts()) self.assertConstNodesEqual(["a", "b"], assigned2) # assigned3: [str, bytes] for3_assnode = next(assign_nodes) assigned3 = list(for3_assnode.assigned_stmts()) self.assertNameNodesEqual(["str", "bytes"], assigned3) def test_assigned_stmts_starred_for(self) -> None: assign_stmts = extract_node( """ for *a, b in ((1, 2, 3), (4, 5, 6, 7)): #@ pass """ ) for1_starred = next(assign_stmts.nodes_of_class(nodes.Starred)) assigned = next(for1_starred.assigned_stmts()) assert isinstance(assigned, astroid.List) assert assigned.as_string() == "[1, 2]" def _get_starred_stmts(self, code: str) -> list | UninferableBase: assign_stmt = extract_node(f"{code} #@") starred = next(assign_stmt.nodes_of_class(nodes.Starred)) return next(starred.assigned_stmts()) def _helper_starred_expected_const(self, code: str, expected: list[int]) -> None: stmts = self._get_starred_stmts(code) self.assertIsInstance(stmts, nodes.List) stmts = stmts.elts self.assertConstNodesEqual(expected, stmts) def _helper_starred_expected(self, code: str, expected: UninferableBase) -> 
None: stmts = self._get_starred_stmts(code) self.assertEqual(expected, stmts) def _helper_starred_inference_error(self, code: str) -> None: assign_stmt = extract_node(f"{code} #@") starred = next(assign_stmt.nodes_of_class(nodes.Starred)) self.assertRaises(InferenceError, list, starred.assigned_stmts()) def test_assigned_stmts_starred_assnames(self) -> None: self._helper_starred_expected_const("a, *b = (1, 2, 3, 4) #@", [2, 3, 4]) self._helper_starred_expected_const("*a, b = (1, 2, 3) #@", [1, 2]) self._helper_starred_expected_const("a, *b, c = (1, 2, 3, 4, 5) #@", [2, 3, 4]) self._helper_starred_expected_const("a, *b = (1, 2) #@", [2]) self._helper_starred_expected_const("*b, a = (1, 2) #@", [1]) self._helper_starred_expected_const("[*b] = (1, 2) #@", [1, 2]) def test_assigned_stmts_starred_yes(self) -> None: # Not something iterable and known self._helper_starred_expected("a, *b = range(3) #@", Uninferable) # Not something inferable self._helper_starred_expected("a, *b = balou() #@", Uninferable) # In function, unknown. self._helper_starred_expected( """ def test(arg): head, *tail = arg #@""", Uninferable, ) # These cases aren't worth supporting. self._helper_starred_expected( "a, (*b, c), d = (1, (2, 3, 4), 5) #@", Uninferable ) def test_assigned_stmts_starred_inside_call(self) -> None: """Regression test for https://github.com/pylint-dev/pylint/issues/6372.""" code = "string_twos = ''.join(str(*y) for _, *y in [[1, 2], [1, 2]]) #@" stmt = extract_node(code) starred = next(stmt.nodes_of_class(nodes.Starred)) starred_stmts = starred.assigned_stmts() self.assertIs(next(starred_stmts), Uninferable) # Generator exhausted after one call with self.assertRaises(StopIteration): next(starred_stmts) def test_assign_stmts_starred_fails(self) -> None: # Too many starred self._helper_starred_inference_error("a, *b, *c = (1, 2, 3) #@") # This could be solved properly, but it complicates needlessly the # code for assigned_stmts, without offering real benefit. 
self._helper_starred_inference_error( "(*a, b), (c, *d) = (1, 2, 3), (4, 5, 6) #@" ) def test_assigned_stmts_assignments(self) -> None: assign_stmts = extract_node( """ c = a #@ d, e = b, c #@ """ ) simple_assnode = next(assign_stmts[0].nodes_of_class(nodes.AssignName)) assigned = list(simple_assnode.assigned_stmts()) self.assertNameNodesEqual(["a"], assigned) assnames = assign_stmts[1].nodes_of_class(nodes.AssignName) simple_mul_assnode_1 = next(assnames) assigned = list(simple_mul_assnode_1.assigned_stmts()) self.assertNameNodesEqual(["b"], assigned) simple_mul_assnode_2 = next(assnames) assigned = list(simple_mul_assnode_2.assigned_stmts()) self.assertNameNodesEqual(["c"], assigned) def test_assigned_stmts_annassignments(self) -> None: annassign_stmts = extract_node( """ a: str = "abc" #@ b: str #@ """ ) simple_annassign_node = next( annassign_stmts[0].nodes_of_class(nodes.AssignName) ) assigned = list(simple_annassign_node.assigned_stmts()) self.assertEqual(1, len(assigned)) self.assertIsInstance(assigned[0], nodes.Const) self.assertEqual(assigned[0].value, "abc") empty_annassign_node = next(annassign_stmts[1].nodes_of_class(nodes.AssignName)) assigned = list(empty_annassign_node.assigned_stmts()) self.assertEqual(1, len(assigned)) self.assertIs(assigned[0], Uninferable) def test_sequence_assigned_stmts_not_accepting_empty_node(self) -> None: def transform(node: nodes.Assign) -> None: node.root().locals["__all__"] = [node.value] manager = astroid.MANAGER with _add_transform(manager, astroid.Assign, transform): module = astroid.parse( """ __all__ = ['a'] """ ) module.wildcard_import_names() def test_not_passing_uninferable_in_seq_inference(self) -> None: class Visitor: def visit(self, node: nodes.Assign | nodes.BinOp | nodes.List) -> Any: for child in node.get_children(): child.accept(self) visit_module = visit visit_assign = visit visit_binop = visit visit_list = visit visit_const = visit visit_name = visit def visit_assignname(self, node: nodes.AssignName) -> 
None: for _ in node.infer(): pass parsed = extract_node( """ a = [] x = [a*2, a]*2*2 """ ) parsed.accept(Visitor()) @staticmethod def test_uninferable_exponents() -> None: """Attempting to calculate the result is prohibitively expensive.""" parsed = extract_node("15 ** 20220609") assert parsed.inferred() == [Uninferable] # Test a pathological case (more realistic: None as naive inference result) parsed = extract_node("None ** 2") assert parsed.inferred() == [Uninferable] @staticmethod def test_uninferable_list_multiplication() -> None: """Attempting to calculate the result is prohibitively expensive.""" parsed = extract_node("[0] * 123456789") element = parsed.inferred()[0].elts[0] assert element.value is Uninferable def test_named_expr_inference() -> None: code = """ if (a := 2) == 2: a #@ # Test a function call def test(): return 24 if (a := test()): a #@ # Normal assignments in sequences { (a:= 4) } #@ [ (a:= 5) ] #@ # Something more complicated def test(value=(p := 24)): return p [ y:= test()] #@ # Priority assignment (x := 1, 2) x #@ """ ast_nodes = extract_node(code) assert isinstance(ast_nodes, list) node = next(ast_nodes[0].infer()) assert isinstance(node, nodes.Const) assert node.value == 2 node = next(ast_nodes[1].infer()) assert isinstance(node, nodes.Const) assert node.value == 24 node = next(ast_nodes[2].infer()) assert isinstance(node, nodes.Set) assert isinstance(node.elts[0], nodes.Const) assert node.elts[0].value == 4 node = next(ast_nodes[3].infer()) assert isinstance(node, nodes.List) assert isinstance(node.elts[0], nodes.Const) assert node.elts[0].value == 5 node = next(ast_nodes[4].infer()) assert isinstance(node, nodes.List) assert isinstance(node.elts[0], nodes.Const) assert node.elts[0].value == 24 node = next(ast_nodes[5].infer()) assert isinstance(node, nodes.Const) assert node.value == 1 @pytest.mark.skipif(not PY310_PLUS, reason="Match requires python 3.10") class TestPatternMatching: @staticmethod def 
test_assigned_stmts_match_mapping(): """Assigned_stmts for MatchMapping not yet implemented. Test the result is 'Uninferable' and no exception is raised. """ assign_stmts = extract_node( """ var = {1: "Hello", 2: "World"} match var: case {**rest}: #@ pass """ ) match_mapping: nodes.MatchMapping = assign_stmts.pattern # type: ignore[union-attr] assert match_mapping.rest assigned = next(match_mapping.rest.assigned_stmts()) assert assigned == Uninferable @staticmethod def test_assigned_stmts_match_star(): """Assigned_stmts for MatchStar not yet implemented. Test the result is 'Uninferable' and no exception is raised. """ assign_stmts = extract_node( """ var = (0, 1, 2) match var: case (0, 1, *rest): #@ pass """ ) match_sequence: nodes.MatchSequence = assign_stmts.pattern # type: ignore[union-attr] match_star = match_sequence.patterns[2] assert isinstance(match_star, nodes.MatchStar) and match_star.name assigned = next(match_star.name.assigned_stmts()) assert assigned == Uninferable @staticmethod def test_assigned_stmts_match_as(): """Assigned_stmts for MatchAs only implemented for the most basic case (y).""" assign_stmts = extract_node( """ var = 42 match var: #@ case 2 | x: #@ pass case (1, 2) as y: #@ pass case z: #@ pass """ ) subject: nodes.Const = assign_stmts[0].subject # type: ignore[index,union-attr] match_or: nodes.MatchOr = assign_stmts[1].pattern # type: ignore[index,union-attr] match_as_with_pattern: nodes.MatchAs = assign_stmts[2].pattern # type: ignore[index,union-attr] match_as: nodes.MatchAs = assign_stmts[3].pattern # type: ignore[index,union-attr] match_or_1 = match_or.patterns[1] assert isinstance(match_or_1, nodes.MatchAs) and match_or_1.name assigned_match_or_1 = next(match_or_1.name.assigned_stmts()) assert assigned_match_or_1 == Uninferable assert match_as_with_pattern.name and match_as_with_pattern.pattern assigned_match_as_pattern = next(match_as_with_pattern.name.assigned_stmts()) assert assigned_match_as_pattern == Uninferable assert 
match_as.name assigned_match_as = next(match_as.name.assigned_stmts()) assert assigned_match_as == subject @pytest.mark.skipif(not PY312_PLUS, reason="Generic typing syntax requires python 3.12") class TestGenericTypeSyntax: @staticmethod def test_assigned_stmts_type_var(): """The result is 'Uninferable' and no exception is raised.""" assign_stmts = extract_node("type Point[T] = tuple[float, float]") type_var: nodes.TypeVar = assign_stmts.type_params[0] assigned = next(type_var.name.assigned_stmts()) # Hack so inference doesn't fail when evaluating __class_getitem__ # Revert if it's causing issues. assert isinstance(assigned, nodes.Const) assert assigned.value is None @staticmethod def test_assigned_stmts_type_var_tuple(): """The result is 'Uninferable' and no exception is raised.""" assign_stmts = extract_node("type Alias[*Ts] = tuple[*Ts]") type_var_tuple: nodes.TypeVarTuple = assign_stmts.type_params[0] assigned = next(type_var_tuple.name.assigned_stmts()) # Hack so inference doesn't fail when evaluating __class_getitem__ # Revert if it's causing issues. assert isinstance(assigned, nodes.Const) assert assigned.value is None @staticmethod def test_assigned_stmts_param_spec(): """The result is 'Uninferable' and no exception is raised.""" assign_stmts = extract_node("type Alias[**P] = Callable[P, int]") param_spec: nodes.ParamSpec = assign_stmts.type_params[0] assigned = next(param_spec.name.assigned_stmts()) # Hack so inference doesn't fail when evaluating __class_getitem__ # Revert if it's causing issues. 
assert isinstance(assigned, nodes.Const) assert assigned.value is None astroid-3.2.2/tests/brain/0000775000175000017500000000000014622475517015343 5ustar epsilonepsilonastroid-3.2.2/tests/brain/test_enum.py0000664000175000017500000004415514622475517017731 0ustar epsilonepsilon# Licensed under the LGPL: https://www.gnu.org/licenses/old-licenses/lgpl-2.1.en.html # For details: https://github.com/pylint-dev/astroid/blob/main/LICENSE # Copyright (c) https://github.com/pylint-dev/astroid/blob/main/CONTRIBUTORS.txt from __future__ import annotations import unittest import pytest import astroid from astroid import bases, builder, nodes, objects from astroid.exceptions import InferenceError class EnumBrainTest(unittest.TestCase): def test_simple_enum(self) -> None: module = builder.parse( """ import enum class MyEnum(enum.Enum): one = "one" two = "two" def mymethod(self, x): return 5 """ ) enumeration = next(module["MyEnum"].infer()) one = enumeration["one"] self.assertEqual(one.pytype(), ".MyEnum.one") for propname in ("name", "value"): prop = next(iter(one.getattr(propname))) self.assertIn("builtins.property", prop.decoratornames()) meth = one.getattr("mymethod")[0] self.assertIsInstance(meth, astroid.FunctionDef) def test_looks_like_enum_false_positive(self) -> None: # Test that a class named Enumeration is not considered a builtin enum. module = builder.parse( """ class Enumeration(object): def __init__(self, name, enum_list): pass test = 42 """ ) enumeration = module["Enumeration"] test = next(enumeration.igetattr("test")) self.assertEqual(test.value, 42) def test_user_enum_false_positive(self) -> None: # Test that a user-defined class named Enum is not considered a builtin enum. 
ast_node = astroid.extract_node( """ class Enum: pass class Color(Enum): red = 1 Color.red #@ """ ) assert isinstance(ast_node, nodes.NodeNG) inferred = ast_node.inferred() self.assertEqual(len(inferred), 1) self.assertIsInstance(inferred[0], astroid.Const) self.assertEqual(inferred[0].value, 1) def test_ignores_with_nodes_from_body_of_enum(self) -> None: code = """ import enum class Error(enum.Enum): Foo = "foo" Bar = "bar" with "error" as err: pass """ node = builder.extract_node(code) inferred = next(node.infer()) assert "err" in inferred.locals assert len(inferred.locals["err"]) == 1 def test_enum_multiple_base_classes(self) -> None: module = builder.parse( """ import enum class Mixin: pass class MyEnum(Mixin, enum.Enum): one = 1 """ ) enumeration = next(module["MyEnum"].infer()) one = enumeration["one"] clazz = one.getattr("__class__")[0] self.assertTrue( clazz.is_subtype_of(".Mixin"), "Enum instance should share base classes with generating class", ) def test_int_enum(self) -> None: module = builder.parse( """ import enum class MyEnum(enum.IntEnum): one = 1 """ ) enumeration = next(module["MyEnum"].infer()) one = enumeration["one"] clazz = one.getattr("__class__")[0] self.assertTrue( clazz.is_subtype_of("builtins.int"), "IntEnum based enums should be a subtype of int", ) def test_enum_func_form_is_class_not_instance(self) -> None: cls, instance = builder.extract_node( """ from enum import Enum f = Enum('Audience', ['a', 'b', 'c']) f #@ f(1) #@ """ ) inferred_cls = next(cls.infer()) self.assertIsInstance(inferred_cls, bases.Instance) inferred_instance = next(instance.infer()) self.assertIsInstance(inferred_instance, bases.Instance) self.assertIsInstance(next(inferred_instance.igetattr("name")), nodes.Const) self.assertIsInstance(next(inferred_instance.igetattr("value")), nodes.Const) def test_enum_func_form_iterable(self) -> None: instance = builder.extract_node( """ from enum import Enum Animal = Enum('Animal', 'ant bee cat dog') Animal """ ) inferred = 
next(instance.infer()) self.assertIsInstance(inferred, astroid.Instance) self.assertTrue(inferred.getattr("__iter__")) def test_enum_func_form_subscriptable(self) -> None: instance, name = builder.extract_node( """ from enum import Enum Animal = Enum('Animal', 'ant bee cat dog') Animal['ant'] #@ Animal['ant'].name #@ """ ) instance = next(instance.infer()) self.assertIsInstance(instance, astroid.Instance) inferred = next(name.infer()) self.assertIsInstance(inferred, astroid.Const) def test_enum_func_form_has_dunder_members(self) -> None: instance = builder.extract_node( """ from enum import Enum Animal = Enum('Animal', 'ant bee cat dog') for i in Animal.__members__: i #@ """ ) instance = next(instance.infer()) self.assertIsInstance(instance, astroid.Const) self.assertIsInstance(instance.value, str) def test_infer_enum_value_as_the_right_type(self) -> None: string_value, int_value = builder.extract_node( """ from enum import Enum class A(Enum): a = 'a' b = 1 A.a.value #@ A.b.value #@ """ ) inferred_string = string_value.inferred() assert any( isinstance(elem, astroid.Const) and elem.value == "a" for elem in inferred_string ) inferred_int = int_value.inferred() assert any( isinstance(elem, astroid.Const) and elem.value == 1 for elem in inferred_int ) def test_mingled_single_and_double_quotes_does_not_crash(self) -> None: node = builder.extract_node( """ from enum import Enum class A(Enum): a = 'x"y"' A.a.value #@ """ ) inferred_string = next(node.infer()) assert inferred_string.value == 'x"y"' def test_special_characters_does_not_crash(self) -> None: node = builder.extract_node( """ import enum class Example(enum.Enum): NULL = '\\N{NULL}' Example.NULL.value """ ) inferred_string = next(node.infer()) assert inferred_string.value == "\N{NULL}" def test_dont_crash_on_for_loops_in_body(self) -> None: node = builder.extract_node( """ class Commands(IntEnum): _ignore_ = 'Commands index' _init_ = 'value string' BEL = 0x07, 'Bell' Commands = vars() for index in range(4): 
Commands[f'DC{index + 1}'] = 0x11 + index, f'Device Control {index + 1}' Commands """ ) inferred = next(node.infer()) assert isinstance(inferred, astroid.ClassDef) def test_enum_tuple_list_values(self) -> None: tuple_node, list_node = builder.extract_node( """ import enum class MyEnum(enum.Enum): a = (1, 2) b = [2, 4] MyEnum.a.value #@ MyEnum.b.value #@ """ ) inferred_tuple_node = next(tuple_node.infer()) inferred_list_node = next(list_node.infer()) assert isinstance(inferred_tuple_node, astroid.Tuple) assert isinstance(inferred_list_node, astroid.List) assert inferred_tuple_node.as_string() == "(1, 2)" assert inferred_list_node.as_string() == "[2, 4]" def test_enum_starred_is_skipped(self) -> None: code = """ from enum import Enum class ContentType(Enum): TEXT, PHOTO, VIDEO, GIF, YOUTUBE, *_ = [1, 2, 3, 4, 5, 6] ContentType.TEXT #@ """ node = astroid.extract_node(code) next(node.infer()) def test_enum_name_is_str_on_self(self) -> None: code = """ from enum import Enum class TestEnum(Enum): def func(self): self.name #@ self.value #@ TestEnum.name #@ TestEnum.value #@ """ i_name, i_value, c_name, c_value = astroid.extract_node(code) # .name should be a string, .name should be a property (that # forwards the lookup to __getattr__) inferred = next(i_name.infer()) assert isinstance(inferred, nodes.Const) assert inferred.pytype() == "builtins.str" inferred = next(c_name.infer()) assert isinstance(inferred, objects.Property) # Inferring .value should not raise InferenceError. 
It is probably Uninferable # but we don't particularly care next(i_value.infer()) next(c_value.infer()) def test_enum_name_and_value_members_override_dynamicclassattr(self) -> None: code = """ from enum import Enum class TrickyEnum(Enum): name = 1 value = 2 def func(self): self.name #@ self.value #@ TrickyEnum.name #@ TrickyEnum.value #@ """ i_name, i_value, c_name, c_value = astroid.extract_node(code) # All of these cases should be inferred as enum members inferred = next(i_name.infer()) assert isinstance(inferred, bases.Instance) assert inferred.pytype() == ".TrickyEnum.name" inferred = next(c_name.infer()) assert isinstance(inferred, bases.Instance) assert inferred.pytype() == ".TrickyEnum.name" inferred = next(i_value.infer()) assert isinstance(inferred, bases.Instance) assert inferred.pytype() == ".TrickyEnum.value" inferred = next(c_value.infer()) assert isinstance(inferred, bases.Instance) assert inferred.pytype() == ".TrickyEnum.value" def test_enum_subclass_member_name(self) -> None: ast_node = astroid.extract_node( """ from enum import Enum class EnumSubclass(Enum): pass class Color(EnumSubclass): red = 1 Color.red.name #@ """ ) assert isinstance(ast_node, nodes.NodeNG) inferred = ast_node.inferred() self.assertEqual(len(inferred), 1) self.assertIsInstance(inferred[0], astroid.Const) self.assertEqual(inferred[0].value, "red") def test_enum_subclass_member_value(self) -> None: ast_node = astroid.extract_node( """ from enum import Enum class EnumSubclass(Enum): pass class Color(EnumSubclass): red = 1 Color.red.value #@ """ ) assert isinstance(ast_node, nodes.NodeNG) inferred = ast_node.inferred() self.assertEqual(len(inferred), 1) self.assertIsInstance(inferred[0], astroid.Const) self.assertEqual(inferred[0].value, 1) def test_enum_subclass_member_method(self) -> None: # See Pylint issue #2626 ast_node = astroid.extract_node( """ from enum import Enum class EnumSubclass(Enum): def hello_pylint(self) -> str: return self.name class Color(EnumSubclass): red = 
1 Color.red.hello_pylint() #@ """ ) assert isinstance(ast_node, nodes.NodeNG) inferred = ast_node.inferred() self.assertEqual(len(inferred), 1) self.assertIsInstance(inferred[0], astroid.Const) self.assertEqual(inferred[0].value, "red") def test_enum_subclass_different_modules(self) -> None: # See Pylint issue #2626 astroid.extract_node( """ from enum import Enum class EnumSubclass(Enum): pass """, "a", ) ast_node = astroid.extract_node( """ from a import EnumSubclass class Color(EnumSubclass): red = 1 Color.red.value #@ """ ) assert isinstance(ast_node, nodes.NodeNG) inferred = ast_node.inferred() self.assertEqual(len(inferred), 1) self.assertIsInstance(inferred[0], astroid.Const) self.assertEqual(inferred[0].value, 1) def test_members_member_ignored(self) -> None: ast_node = builder.extract_node( """ from enum import Enum class Animal(Enum): a = 1 __members__ = {} Animal.__members__ #@ """ ) inferred = next(ast_node.infer()) self.assertIsInstance(inferred, astroid.Dict) self.assertTrue(inferred.locals) def test_enum_as_renamed_import(self) -> None: """Originally reported in https://github.com/pylint-dev/pylint/issues/5776.""" ast_node: nodes.Attribute = builder.extract_node( """ from enum import Enum as PyEnum class MyEnum(PyEnum): ENUM_KEY = "enum_value" MyEnum.ENUM_KEY """ ) inferred = next(ast_node.infer()) assert isinstance(inferred, bases.Instance) assert inferred._proxied.name == "ENUM_KEY" def test_class_named_enum(self) -> None: """Test that the user-defined class named `Enum` is not inferred as `enum.Enum`""" astroid.extract_node( """ class Enum: def __init__(self, one, two): self.one = one self.two = two def pear(self): ... 
""", "module_with_class_named_enum", ) attribute_nodes = astroid.extract_node( """ import module_with_class_named_enum module_with_class_named_enum.Enum("apple", "orange") #@ typo_module_with_class_named_enum.Enum("apple", "orange") #@ """ ) name_nodes = astroid.extract_node( """ from module_with_class_named_enum import Enum Enum("apple", "orange") #@ TypoEnum("apple", "orange") #@ """ ) # Test that both of the successfully inferred `Name` & `Attribute` # nodes refer to the user-defined Enum class. for inferred in (attribute_nodes[0].inferred()[0], name_nodes[0].inferred()[0]): assert isinstance(inferred, astroid.Instance) assert inferred.name == "Enum" assert inferred.qname() == "module_with_class_named_enum.Enum" assert "pear" in inferred.locals # Test that an `InferenceError` is raised when an attempt is made to # infer a `Name` or `Attribute` node & they cannot be found. for node in (attribute_nodes[1], name_nodes[1]): with pytest.raises(InferenceError): node.inferred() def test_enum_members_uppercase_only(self) -> None: """Originally reported in https://github.com/pylint-dev/pylint/issues/7402. ``nodes.AnnAssign`` nodes with no assigned values do not appear inside ``__members__``. Test that only enum members `MARS` and `radius` appear in the `__members__` container while the attribute `mass` does not. """ enum_class = astroid.extract_node( """ from enum import Enum class Planet(Enum): #@ MARS = (1, 2) radius: int = 1 mass: int def __init__(self, mass, radius): self.mass = mass self.radius = radius Planet.MARS.value """ ) enum_members = next(enum_class.igetattr("__members__")) assert len(enum_members.items) == 2 mars, radius = enum_members.items assert mars[1].name == "MARS" assert radius[1].name == "radius" def test_local_enum_child_class_inference(self) -> None: """Originally reported in https://github.com/pylint-dev/pylint/issues/8897 Test that a user-defined enum class is inferred when it subclasses another user-defined enum class. 
""" enum_class_node, enum_member_value_node = astroid.extract_node( """ import sys from enum import Enum if sys.version_info >= (3, 11): from enum import StrEnum else: class StrEnum(str, Enum): pass class Color(StrEnum): #@ RED = "red" Color.RED.value #@ """ ) assert "RED" in enum_class_node.locals enum_members = enum_class_node.locals["__members__"][0].items assert len(enum_members) == 1 _, name = enum_members[0] assert name.name == "RED" inferred_enum_member_value_node = next(enum_member_value_node.infer()) assert inferred_enum_member_value_node.value == "red" def test_enum_with_ignore(self) -> None: """Exclude ``_ignore_`` from the ``__members__`` container Originally reported in https://github.com/pylint-dev/pylint/issues/9015 """ ast_node: nodes.Attribute = builder.extract_node( """ import enum class MyEnum(enum.Enum): FOO = enum.auto() BAR = enum.auto() _ignore_ = ["BAZ"] BAZ = 42 MyEnum.__members__ """ ) inferred = next(ast_node.infer()) members_names = [const_node.value for const_node, name_obj in inferred.items] assert members_names == ["FOO", "BAR", "BAZ"] def test_enum_sunder_names(self) -> None: """Test that both `_name_` and `_value_` sunder names exist""" sunder_name, sunder_value = builder.extract_node( """ import enum class MyEnum(enum.Enum): APPLE = 42 MyEnum.APPLE._name_ #@ MyEnum.APPLE._value_ #@ """ ) inferred_name = next(sunder_name.infer()) assert inferred_name.value == "APPLE" inferred_value = next(sunder_value.infer()) assert inferred_value.value == 42 astroid-3.2.2/tests/brain/test_regex.py0000664000175000017500000000337414622475517020075 0ustar epsilonepsilon# Licensed under the LGPL: https://www.gnu.org/licenses/old-licenses/lgpl-2.1.en.html # For details: https://github.com/pylint-dev/astroid/blob/main/LICENSE # Copyright (c) https://github.com/pylint-dev/astroid/blob/main/CONTRIBUTORS.txt try: import regex # type: ignore[import] HAS_REGEX = True except ImportError: HAS_REGEX = False import pytest from astroid import MANAGER, builder, 
nodes, test_utils


@pytest.mark.skipif(not HAS_REGEX, reason="This test requires the regex library.")
class TestRegexBrain:
    def test_regex_flags(self) -> None:
        """Test that we have all regex enum flags in the brain."""
        names = [name for name in dir(regex) if name.isupper()]
        re_ast = MANAGER.ast_from_module_name("regex")
        for name in names:
            assert name in re_ast
            assert next(re_ast[name].infer()).value == getattr(regex, name)

    @pytest.mark.xfail(
        reason="Started failing on main, but no one reproduced locally yet"
    )
    @test_utils.require_version(minver="3.9")
    def test_regex_pattern_and_match_subscriptable(self):
        """Test regex.Pattern and regex.Match are subscriptable in PY39+."""
        node1 = builder.extract_node(
            """
        import regex
        regex.Pattern[str]
        """
        )
        inferred1 = next(node1.infer())
        assert isinstance(inferred1, nodes.ClassDef)
        assert isinstance(inferred1.getattr("__class_getitem__")[0], nodes.FunctionDef)

        node2 = builder.extract_node(
            """
        import regex
        regex.Match[str]
        """
        )
        inferred2 = next(node2.infer())
        assert isinstance(inferred2, nodes.ClassDef)
        assert isinstance(inferred2.getattr("__class_getitem__")[0], nodes.FunctionDef)
astroid-3.2.2/tests/brain/numpy/0000775000175000017500000000000014622475517016513 5ustar epsilonepsilon
astroid-3.2.2/tests/brain/numpy/test_core_numeric.py0000664000175000017500000000477414622475517022602 0ustar epsilonepsilon
# Licensed under the LGPL: https://www.gnu.org/licenses/old-licenses/lgpl-2.1.en.html
# For details: https://github.com/pylint-dev/astroid/blob/main/LICENSE
# Copyright (c) https://github.com/pylint-dev/astroid/blob/main/CONTRIBUTORS.txt

from __future__ import annotations

import unittest

import pytest

try:
    import numpy  # pylint: disable=unused-import

    HAS_NUMPY = True
except ImportError:
    HAS_NUMPY = False

from astroid import builder


@unittest.skipUnless(HAS_NUMPY, "This test requires the numpy library.")
class BrainNumpyCoreNumericTest(unittest.TestCase):
    """Test the numpy core numeric brain module."""

    numpy_functions = (
        ("zeros_like", "[1, 2]"),
        ("full_like", "[1, 2]", "4"),
        ("ones_like", "[1, 2]"),
        ("ones", "[1, 2]"),
    )

    def _inferred_numpy_func_call(self, func_name, *func_args):
        node = builder.extract_node(
            f"""
        import numpy as np
        func = np.{func_name:s}
        func({','.join(func_args):s})
        """
        )
        return node.infer()

    def test_numpy_function_calls_inferred_as_ndarray(self):
        """Test that calls to numpy functions are inferred as numpy.ndarray."""
        licit_array_types = (".ndarray",)
        for func_ in self.numpy_functions:
            with self.subTest(typ=func_):
                inferred_values = list(self._inferred_numpy_func_call(*func_))
                self.assertTrue(
                    len(inferred_values) == 1,
                    msg=f"Too much inferred value for {func_[0]:s}",
                )
                self.assertTrue(
                    inferred_values[-1].pytype() in licit_array_types,
                    msg="Illicit type for {:s} ({})".format(
                        func_[0], inferred_values[-1].pytype()
                    ),
                )


@pytest.mark.skipif(not HAS_NUMPY, reason="This test requires the numpy library.")
@pytest.mark.parametrize(
    "method, expected_args",
    [
        ("zeros_like", ["a", "dtype", "order", "subok", "shape"]),
        ("full_like", ["a", "fill_value", "dtype", "order", "subok", "shape"]),
        ("ones_like", ["a", "dtype", "order", "subok", "shape"]),
        ("ones", ["shape", "dtype", "order"]),
    ],
)
def test_function_parameters(method: str, expected_args: list[str]) -> None:
    instance = builder.extract_node(
        f"""
    import numpy
    numpy.{method} #@
    """
    )
    actual_args = instance.inferred()[0].args.args
    assert [arg.name for arg in actual_args] == expected_args
astroid-3.2.2/tests/brain/numpy/test_core_einsumfunc.py0000664000175000017500000000325414622475517023314 0ustar epsilonepsilon
# Licensed under the LGPL: https://www.gnu.org/licenses/old-licenses/lgpl-2.1.en.html
# For details: https://github.com/pylint-dev/astroid/blob/main/LICENSE
# Copyright (c) https://github.com/pylint-dev/astroid/blob/main/CONTRIBUTORS.txt

from __future__ import annotations

import pytest

from astroid import builder, nodes

try:
    import numpy  # pylint: disable=unused-import

    HAS_NUMPY = True
except ImportError:
    HAS_NUMPY = False


def _inferred_numpy_func_call(func_name: str, *func_args: str) -> nodes.FunctionDef:
    node = builder.extract_node(
        f"""
    import numpy as np
    func = np.{func_name:s}
    func({','.join(func_args):s})
    """
    )
    return node.infer()


@pytest.mark.skipif(not HAS_NUMPY, reason="This test requires the numpy library.")
def test_numpy_function_calls_inferred_as_ndarray() -> None:
    """Test that calls to numpy functions are inferred as numpy.ndarray."""
    method = "einsum"
    inferred_values = list(
        _inferred_numpy_func_call(method, "ii, np.arange(25).reshape(5, 5)")
    )
    assert len(inferred_values) == 1, f"Too much inferred value for {method:s}"
    assert (
        inferred_values[-1].pytype() == ".ndarray"
    ), f"Illicit type for {method:s} ({inferred_values[-1].pytype()})"


@pytest.mark.skipif(not HAS_NUMPY, reason="This test requires the numpy library.")
def test_function_parameters() -> None:
    instance = builder.extract_node(
        """
    import numpy
    numpy.einsum #@
    """
    )
    actual_args = instance.inferred()[0].args
    assert actual_args.vararg == "operands"
    assert [arg.name for arg in actual_args.kwonlyargs] == ["out", "optimize"]
    assert actual_args.kwarg == "kwargs"
astroid-3.2.2/tests/brain/numpy/__init__.py0000664000175000017500000000000014622475517020612 0ustar epsilonepsilon
astroid-3.2.2/tests/brain/numpy/test_core_umath.py0000664000175000017500000001710014622475517022251 0ustar epsilonepsilon
# Licensed under the LGPL: https://www.gnu.org/licenses/old-licenses/lgpl-2.1.en.html
# For details: https://github.com/pylint-dev/astroid/blob/main/LICENSE
# Copyright (c) https://github.com/pylint-dev/astroid/blob/main/CONTRIBUTORS.txt

import unittest

try:
    import numpy  # pylint: disable=unused-import

    HAS_NUMPY = True
except ImportError:
    HAS_NUMPY = False

from astroid import bases, builder, nodes


@unittest.skipUnless(HAS_NUMPY, "This test requires the numpy library.")
class NumpyBrainCoreUmathTest(unittest.TestCase):
    """Test of all members of numpy.core.umath module."""

    one_arg_ufunc = (
        "arccos",
"arccosh", "arcsin", "arcsinh", "arctan", "arctanh", "cbrt", "conj", "conjugate", "cosh", "deg2rad", "degrees", "exp2", "expm1", "fabs", "frexp", "isfinite", "isinf", "log", "log1p", "log2", "logical_not", "modf", "negative", "positive", "rad2deg", "radians", "reciprocal", "rint", "sign", "signbit", "spacing", "square", "tan", "tanh", "trunc", ) two_args_ufunc = ( "add", "bitwise_and", "bitwise_or", "bitwise_xor", "copysign", "divide", "divmod", "equal", "float_power", "floor_divide", "fmax", "fmin", "fmod", "gcd", "greater", "heaviside", "hypot", "lcm", "ldexp", "left_shift", "less", "logaddexp", "logaddexp2", "logical_and", "logical_or", "logical_xor", "maximum", "minimum", "multiply", "nextafter", "not_equal", "power", "remainder", "right_shift", "subtract", "true_divide", ) all_ufunc = one_arg_ufunc + two_args_ufunc constants = ("e", "euler_gamma") def _inferred_numpy_attribute(self, func_name): node = builder.extract_node( f""" import numpy.core.umath as tested_module func = tested_module.{func_name:s} func""" ) return next(node.infer()) def test_numpy_core_umath_constants(self): """Test that constants have Const type.""" for const in self.constants: with self.subTest(const=const): inferred = self._inferred_numpy_attribute(const) self.assertIsInstance(inferred, nodes.Const) def test_numpy_core_umath_constants_values(self): """Test the values of the constants.""" exact_values = {"e": 2.718281828459045, "euler_gamma": 0.5772156649015329} for const in self.constants: with self.subTest(const=const): inferred = self._inferred_numpy_attribute(const) self.assertEqual(inferred.value, exact_values[const]) def test_numpy_core_umath_functions(self): """Test that functions have FunctionDef type.""" for func in self.all_ufunc: with self.subTest(func=func): inferred = self._inferred_numpy_attribute(func) self.assertIsInstance(inferred, bases.Instance) def test_numpy_core_umath_functions_one_arg(self): """Test the arguments names of functions.""" exact_arg_names = [ "self", 
"x", "out", "where", "casting", "order", "dtype", "subok", ] for func in self.one_arg_ufunc: with self.subTest(func=func): inferred = self._inferred_numpy_attribute(func) self.assertEqual( inferred.getattr("__call__")[0].argnames(), exact_arg_names ) def test_numpy_core_umath_functions_two_args(self): """Test the arguments names of functions.""" exact_arg_names = [ "self", "x1", "x2", "out", "where", "casting", "order", "dtype", "subok", ] for func in self.two_args_ufunc: with self.subTest(func=func): inferred = self._inferred_numpy_attribute(func) self.assertEqual( inferred.getattr("__call__")[0].argnames(), exact_arg_names ) def test_numpy_core_umath_functions_kwargs_default_values(self): """Test the default values for keyword arguments.""" exact_kwargs_default_values = [None, True, "same_kind", "K", None, True] for func in self.one_arg_ufunc + self.two_args_ufunc: with self.subTest(func=func): inferred = self._inferred_numpy_attribute(func) default_args_values = [ default.value for default in inferred.getattr("__call__")[0].args.defaults ] self.assertEqual(default_args_values, exact_kwargs_default_values) def _inferred_numpy_func_call(self, func_name, *func_args): node = builder.extract_node( f""" import numpy as np func = np.{func_name:s} func() """ ) return node.infer() def test_numpy_core_umath_functions_return_type(self): """Test that functions which should return a ndarray do return it.""" ndarray_returning_func = [ f for f in self.all_ufunc if f not in ("frexp", "modf") ] for func_ in ndarray_returning_func: with self.subTest(typ=func_): inferred_values = list(self._inferred_numpy_func_call(func_)) self.assertTrue( len(inferred_values) == 1, msg="Too much inferred values ({}) for {:s}".format( inferred_values[-1].pytype(), func_ ), ) self.assertTrue( inferred_values[0].pytype() == ".ndarray", msg=f"Illicit type for {func_:s} ({inferred_values[-1].pytype()})", ) def test_numpy_core_umath_functions_return_type_tuple(self): """Test that functions which should 
return a pair of ndarray do return it.""" ndarray_returning_func = ("frexp", "modf") for func_ in ndarray_returning_func: with self.subTest(typ=func_): inferred_values = list(self._inferred_numpy_func_call(func_)) self.assertTrue( len(inferred_values) == 1, msg=f"Too much inferred values ({inferred_values}) for {func_:s}", ) self.assertTrue( inferred_values[-1].pytype() == "builtins.tuple", msg=f"Illicit type for {func_:s} ({inferred_values[-1].pytype()})", ) self.assertTrue( len(inferred_values[0].elts) == 2, msg=f"{func_} should return a pair of values. That's not the case.", ) for array in inferred_values[-1].elts: effective_infer = [m.pytype() for m in array.inferred()] self.assertTrue( ".ndarray" in effective_infer, msg=( f"Each item in the return of {func_} should be inferred" f" as a ndarray and not as {effective_infer}" ), ) astroid-3.2.2/tests/brain/numpy/test_core_numerictypes.py0000664000175000017500000002567714622475517023704 0ustar epsilonepsilon# Licensed under the LGPL: https://www.gnu.org/licenses/old-licenses/lgpl-2.1.en.html # For details: https://github.com/pylint-dev/astroid/blob/main/LICENSE # Copyright (c) https://github.com/pylint-dev/astroid/blob/main/CONTRIBUTORS.txt from __future__ import annotations import unittest from typing import ClassVar try: import numpy # pylint: disable=unused-import HAS_NUMPY = True except ImportError: HAS_NUMPY = False from astroid import Uninferable, builder, nodes from astroid.brain.brain_numpy_utils import ( NUMPY_VERSION_TYPE_HINTS_SUPPORT, _get_numpy_version, numpy_supports_type_hints, ) @unittest.skipUnless(HAS_NUMPY, "This test requires the numpy library.") class NumpyBrainCoreNumericTypesTest(unittest.TestCase): """Test of all the missing types defined in numerictypes module.""" all_types: ClassVar[list[str]] = [ "uint16", "uint32", "uint64", "float16", "float32", "float64", "float96", "complex64", "complex128", "complex192", "timedelta64", "datetime64", "unicode_", "str_", "bool_", "bool8", "byte", 
"int8", "bytes0", "bytes_", "cdouble", "cfloat", "character", "clongdouble", "clongfloat", "complexfloating", "csingle", "double", "flexible", "floating", "half", "inexact", "int0", "longcomplex", "longdouble", "longfloat", "short", "signedinteger", "single", "singlecomplex", "str0", "ubyte", "uint", "uint0", "uintc", "uintp", "ulonglong", "unsignedinteger", "ushort", "void0", ] def _inferred_numpy_attribute(self, attrib): node = builder.extract_node( f""" import numpy.core.numerictypes as tested_module missing_type = tested_module.{attrib:s}""" ) return next(node.value.infer()) def test_numpy_core_types(self): """Test that all defined types have ClassDef type.""" for typ in self.all_types: with self.subTest(typ=typ): inferred = self._inferred_numpy_attribute(typ) self.assertIsInstance(inferred, nodes.ClassDef) def test_generic_types_have_methods(self): """Test that all generic derived types have specified methods.""" generic_methods = [ "all", "any", "argmax", "argmin", "argsort", "astype", "base", "byteswap", "choose", "clip", "compress", "conj", "conjugate", "copy", "cumprod", "cumsum", "data", "diagonal", "dtype", "dump", "dumps", "fill", "flags", "flat", "flatten", "getfield", "imag", "item", "itemset", "itemsize", "max", "mean", "min", "nbytes", "ndim", "newbyteorder", "nonzero", "prod", "ptp", "put", "ravel", "real", "repeat", "reshape", "resize", "round", "searchsorted", "setfield", "setflags", "shape", "size", "sort", "squeeze", "std", "strides", "sum", "swapaxes", "take", "tobytes", "tofile", "tolist", "tostring", "trace", "transpose", "var", "view", ] for type_ in ( "bool_", "bytes_", "character", "complex128", "complex192", "complex64", "complexfloating", "datetime64", "flexible", "float16", "float32", "float64", "float96", "floating", "generic", "inexact", "int16", "int32", "int32", "int64", "int8", "integer", "number", "signedinteger", "str_", "timedelta64", "uint16", "uint32", "uint32", "uint64", "uint8", "unsignedinteger", "void", ): with 
self.subTest(typ=type_): inferred = self._inferred_numpy_attribute(type_) for meth in generic_methods: with self.subTest(meth=meth): self.assertTrue(meth in {m.name for m in inferred.methods()}) def test_generic_types_have_attributes(self): """Test that all generic derived types have specified attributes.""" generic_attr = [ "base", "data", "dtype", "flags", "flat", "imag", "itemsize", "nbytes", "ndim", "real", "size", "strides", ] for type_ in ( "bool_", "bytes_", "character", "complex128", "complex192", "complex64", "complexfloating", "datetime64", "flexible", "float16", "float32", "float64", "float96", "floating", "generic", "inexact", "int16", "int32", "int32", "int64", "int8", "integer", "number", "signedinteger", "str_", "timedelta64", "uint16", "uint32", "uint32", "uint64", "uint8", "unsignedinteger", "void", ): with self.subTest(typ=type_): inferred = self._inferred_numpy_attribute(type_) for attr in generic_attr: with self.subTest(attr=attr): self.assertNotEqual(len(inferred.getattr(attr)), 0) def test_number_types_have_unary_operators(self): """Test that number types have unary operators.""" unary_ops = ("__neg__",) for type_ in ( "float64", "float96", "floating", "int16", "int32", "int32", "int64", "int8", "integer", "number", "signedinteger", "uint16", "uint32", "uint32", "uint64", "uint8", "unsignedinteger", ): with self.subTest(typ=type_): inferred = self._inferred_numpy_attribute(type_) for attr in unary_ops: with self.subTest(attr=attr): self.assertNotEqual(len(inferred.getattr(attr)), 0) def test_array_types_have_unary_operators(self): """Test that array types have unary operators.""" unary_ops = ("__neg__", "__invert__") for type_ in ("ndarray",): with self.subTest(typ=type_): inferred = self._inferred_numpy_attribute(type_) for attr in unary_ops: with self.subTest(attr=attr): self.assertNotEqual(len(inferred.getattr(attr)), 0) def test_datetime_astype_return(self): """ Test that the return of astype method of the datetime object is inferred as a 
ndarray. pylint-dev/pylint#3332 """ node = builder.extract_node( """ import numpy as np import datetime test_array = np.datetime64(1, 'us') test_array.astype(datetime.datetime) """ ) licit_array_types = ".ndarray" inferred_values = list(node.infer()) self.assertTrue( len(inferred_values) == 1, msg="Too much inferred value for datetime64.astype", ) self.assertTrue( inferred_values[-1].pytype() in licit_array_types, msg="Illicit type for {:s} ({})".format( "datetime64.astype", inferred_values[-1].pytype() ), ) @unittest.skipUnless( HAS_NUMPY and numpy_supports_type_hints(), f"This test requires the numpy library with a version above {NUMPY_VERSION_TYPE_HINTS_SUPPORT}", ) def test_generic_types_are_subscriptables(self): """Test that all types deriving from generic are subscriptables.""" for type_ in ( "bool_", "bytes_", "character", "complex128", "complex192", "complex64", "complexfloating", "datetime64", "flexible", "float16", "float32", "float64", "float96", "floating", "generic", "inexact", "int16", "int32", "int32", "int64", "int8", "integer", "number", "signedinteger", "str_", "timedelta64", "uint16", "uint32", "uint32", "uint64", "uint8", "unsignedinteger", "void", ): with self.subTest(type_=type_): src = f""" import numpy as np np.{type_}[int] """ node = builder.extract_node(src) cls_node = node.inferred()[0] self.assertIsInstance(cls_node, nodes.ClassDef) self.assertEqual(cls_node.name, type_) @unittest.skipIf( HAS_NUMPY, "Those tests check that astroid does not crash if numpy is not available" ) class NumpyBrainUtilsTest(unittest.TestCase): """ This class is dedicated to test that astroid does not crash if numpy module is not available. """ def test_get_numpy_version_do_not_crash(self): """ Test that the function _get_numpy_version doesn't crash even if numpy is not installed. 
""" self.assertEqual(_get_numpy_version(), ("0", "0", "0")) def test_numpy_object_uninferable(self): """ Test that in case numpy is not available, then a numpy object is uninferable but the inference doesn't lead to a crash. """ src = """ import numpy as np np.number[int] """ node = builder.extract_node(src) cls_node = node.inferred()[0] self.assertIs(cls_node, Uninferable) astroid-3.2.2/tests/brain/numpy/test_ma.py0000664000175000017500000000272614622475517020530 0ustar epsilonepsilon# Licensed under the LGPL: https://www.gnu.org/licenses/old-licenses/lgpl-2.1.en.html # For details: https://github.com/pylint-dev/astroid/blob/main/LICENSE # Copyright (c) https://github.com/pylint-dev/astroid/blob/main/CONTRIBUTORS.txt import pytest try: import numpy # pylint: disable=unused-import HAS_NUMPY = True except ImportError: HAS_NUMPY = False from astroid import builder @pytest.mark.skipif(HAS_NUMPY is False, reason="This test requires the numpy library.") class TestBrainNumpyMa: """Test the numpy ma brain module.""" def _assert_maskedarray(self, code): node = builder.extract_node(code) cls_node = node.inferred()[0] assert cls_node.pytype() == "numpy.ma.core.MaskedArray" @pytest.mark.parametrize("alias_import", [True, False]) @pytest.mark.parametrize("ma_function", ["masked_invalid", "masked_where"]) def test_numpy_ma_returns_maskedarray(self, alias_import, ma_function): """ Test that calls to numpy ma functions return a MaskedArray object. 
        The `ma_function` node is an Attribute or a Name
        """
        import_str = (
            "import numpy as np"
            if alias_import
            else f"from numpy.ma import {ma_function}"
        )
        func = f"np.ma.{ma_function}" if alias_import else ma_function
        src = f"""
        {import_str}
        data = np.ndarray((1,2))
        {func}([1, 0, 0], data)
        """
        self._assert_maskedarray(src)

astroid-3.2.2/tests/brain/numpy/test_random_mtrand.py

# Licensed under the LGPL: https://www.gnu.org/licenses/old-licenses/lgpl-2.1.en.html
# For details: https://github.com/pylint-dev/astroid/blob/main/LICENSE
# Copyright (c) https://github.com/pylint-dev/astroid/blob/main/CONTRIBUTORS.txt

from __future__ import annotations

import unittest
from typing import ClassVar

try:
    import numpy  # pylint: disable=unused-import

    HAS_NUMPY = True
except ImportError:
    HAS_NUMPY = False

from astroid import builder, nodes


@unittest.skipUnless(HAS_NUMPY, "This test requires the numpy library.")
class NumpyBrainRandomMtrandTest(unittest.TestCase):
    """Test of all the functions of numpy.random.mtrand module."""

    # Map between functions names and arguments names and default values
    all_mtrand: ClassVar[dict[str, tuple]] = {
        "beta": (["a", "b", "size"], [None]),
        "binomial": (["n", "p", "size"], [None]),
        "bytes": (["length"], []),
        "chisquare": (["df", "size"], [None]),
        "choice": (["a", "size", "replace", "p"], [None, True, None]),
        "dirichlet": (["alpha", "size"], [None]),
        "exponential": (["scale", "size"], [1.0, None]),
        "f": (["dfnum", "dfden", "size"], [None]),
        "gamma": (["shape", "scale", "size"], [1.0, None]),
        "geometric": (["p", "size"], [None]),
        "get_state": ([], []),
        "gumbel": (["loc", "scale", "size"], [0.0, 1.0, None]),
        "hypergeometric": (["ngood", "nbad", "nsample", "size"], [None]),
        "laplace": (["loc", "scale", "size"], [0.0, 1.0, None]),
        "logistic": (["loc", "scale", "size"], [0.0, 1.0, None]),
        "lognormal": (["mean", "sigma", "size"], [0.0, 1.0, None]),
        "logseries": (["p", "size"], [None]),
"multinomial": (["n", "pvals", "size"], [None]), "multivariate_normal": (["mean", "cov", "size"], [None]), "negative_binomial": (["n", "p", "size"], [None]), "noncentral_chisquare": (["df", "nonc", "size"], [None]), "noncentral_f": (["dfnum", "dfden", "nonc", "size"], [None]), "normal": (["loc", "scale", "size"], [0.0, 1.0, None]), "pareto": (["a", "size"], [None]), "permutation": (["x"], []), "poisson": (["lam", "size"], [1.0, None]), "power": (["a", "size"], [None]), "rand": (["args"], []), "randint": (["low", "high", "size", "dtype"], [None, None, "l"]), "randn": (["args"], []), "random": (["size"], [None]), "random_integers": (["low", "high", "size"], [None, None]), "random_sample": (["size"], [None]), "rayleigh": (["scale", "size"], [1.0, None]), "seed": (["seed"], [None]), "set_state": (["state"], []), "shuffle": (["x"], []), "standard_cauchy": (["size"], [None]), "standard_exponential": (["size"], [None]), "standard_gamma": (["shape", "size"], [None]), "standard_normal": (["size"], [None]), "standard_t": (["df", "size"], [None]), "triangular": (["left", "mode", "right", "size"], [None]), "uniform": (["low", "high", "size"], [0.0, 1.0, None]), "vonmises": (["mu", "kappa", "size"], [None]), "wald": (["mean", "scale", "size"], [None]), "weibull": (["a", "size"], [None]), "zipf": (["a", "size"], [None]), } def _inferred_numpy_attribute(self, func_name): node = builder.extract_node( f""" import numpy.random.mtrand as tested_module func = tested_module.{func_name:s} func""" ) return next(node.infer()) def test_numpy_random_mtrand_functions(self): """Test that all functions have FunctionDef type.""" for func in self.all_mtrand: with self.subTest(func=func): inferred = self._inferred_numpy_attribute(func) self.assertIsInstance(inferred, nodes.FunctionDef) def test_numpy_random_mtrand_functions_signature(self): """Test the arguments names and default values.""" for ( func, (exact_arg_names, exact_kwargs_default_values), ) in self.all_mtrand.items(): with 
self.subTest(func=func): inferred = self._inferred_numpy_attribute(func) self.assertEqual(inferred.argnames(), exact_arg_names) default_args_values = [ default.value for default in inferred.args.defaults ] self.assertEqual(default_args_values, exact_kwargs_default_values) astroid-3.2.2/tests/brain/numpy/test_ndarray.py0000664000175000017500000001144114622475517021565 0ustar epsilonepsilon# Licensed under the LGPL: https://www.gnu.org/licenses/old-licenses/lgpl-2.1.en.html # For details: https://github.com/pylint-dev/astroid/blob/main/LICENSE # Copyright (c) https://github.com/pylint-dev/astroid/blob/main/CONTRIBUTORS.txt import unittest try: import numpy # pylint: disable=unused-import HAS_NUMPY = True except ImportError: HAS_NUMPY = False from astroid import builder, nodes from astroid.brain.brain_numpy_utils import ( NUMPY_VERSION_TYPE_HINTS_SUPPORT, numpy_supports_type_hints, ) @unittest.skipUnless(HAS_NUMPY, "This test requires the numpy library.") class NumpyBrainNdarrayTest(unittest.TestCase): """Test that calls to numpy functions returning arrays are correctly inferred.""" ndarray_returning_ndarray_methods = ( "__abs__", "__add__", "__and__", "__array__", "__array_wrap__", "__copy__", "__deepcopy__", "__eq__", "__floordiv__", "__ge__", "__gt__", "__iadd__", "__iand__", "__ifloordiv__", "__ilshift__", "__imod__", "__imul__", "__invert__", "__ior__", "__ipow__", "__irshift__", "__isub__", "__itruediv__", "__ixor__", "__le__", "__lshift__", "__lt__", "__matmul__", "__mod__", "__mul__", "__ne__", "__neg__", "__or__", "__pos__", "__pow__", "__rshift__", "__sub__", "__truediv__", "__xor__", "all", "any", "argmax", "argmin", "argpartition", "argsort", "astype", "byteswap", "choose", "clip", "compress", "conj", "conjugate", "copy", "cumprod", "cumsum", "diagonal", "dot", "flatten", "getfield", "max", "mean", "min", "newbyteorder", "prod", "ptp", "ravel", "repeat", "reshape", "round", "searchsorted", "squeeze", "std", "sum", "swapaxes", "take", "trace", "transpose", 
"var", "view", ) def _inferred_ndarray_method_call(self, func_name): node = builder.extract_node( f""" import numpy as np test_array = np.ndarray((2, 2)) test_array.{func_name:s}() """ ) return node.infer() def _inferred_ndarray_attribute(self, attr_name): node = builder.extract_node( f""" import numpy as np test_array = np.ndarray((2, 2)) test_array.{attr_name:s} """ ) return node.infer() def test_numpy_function_calls_inferred_as_ndarray(self): """Test that some calls to numpy functions are inferred as numpy.ndarray.""" licit_array_types = ".ndarray" for func_ in self.ndarray_returning_ndarray_methods: with self.subTest(typ=func_): inferred_values = list(self._inferred_ndarray_method_call(func_)) self.assertTrue( len(inferred_values) == 1, msg=f"Too much inferred value for {func_:s}", ) self.assertTrue( inferred_values[-1].pytype() in licit_array_types, msg=f"Illicit type for {func_:s} ({inferred_values[-1].pytype()})", ) def test_numpy_ndarray_attribute_inferred_as_ndarray(self): """Test that some numpy ndarray attributes are inferred as numpy.ndarray.""" licit_array_types = ".ndarray" for attr_ in ("real", "imag", "shape", "T"): with self.subTest(typ=attr_): inferred_values = list(self._inferred_ndarray_attribute(attr_)) self.assertTrue( len(inferred_values) == 1, msg=f"Too much inferred value for {attr_:s}", ) self.assertTrue( inferred_values[-1].pytype() in licit_array_types, msg=f"Illicit type for {attr_:s} ({inferred_values[-1].pytype()})", ) @unittest.skipUnless( HAS_NUMPY and numpy_supports_type_hints(), f"This test requires the numpy library with a version above {NUMPY_VERSION_TYPE_HINTS_SUPPORT}", ) def test_numpy_ndarray_class_support_type_indexing(self): """Test that numpy ndarray class can be subscripted (type hints).""" src = """ import numpy as np np.ndarray[int] """ node = builder.extract_node(src) cls_node = node.inferred()[0] self.assertIsInstance(cls_node, nodes.ClassDef) self.assertEqual(cls_node.name, "ndarray") 
astroid-3.2.2/tests/brain/numpy/test_core_fromnumeric.py

# Licensed under the LGPL: https://www.gnu.org/licenses/old-licenses/lgpl-2.1.en.html
# For details: https://github.com/pylint-dev/astroid/blob/main/LICENSE
# Copyright (c) https://github.com/pylint-dev/astroid/blob/main/CONTRIBUTORS.txt

import unittest

try:
    import numpy  # pylint: disable=unused-import

    HAS_NUMPY = True
except ImportError:
    HAS_NUMPY = False

from astroid import builder


@unittest.skipUnless(HAS_NUMPY, "This test requires the numpy library.")
class BrainNumpyCoreFromNumericTest(unittest.TestCase):
    """Test the numpy core fromnumeric brain module."""

    numpy_functions = (("sum", "[1, 2]"),)

    def _inferred_numpy_func_call(self, func_name, *func_args):
        node = builder.extract_node(
            f"""
        import numpy as np
        func = np.{func_name:s}
        func({','.join(func_args):s})
        """
        )
        return node.infer()

    def test_numpy_function_calls_inferred_as_ndarray(self):
        """Test that calls to numpy functions are inferred as numpy.ndarray."""
        licit_array_types = (".ndarray",)
        for func_ in self.numpy_functions:
            with self.subTest(typ=func_):
                inferred_values = list(self._inferred_numpy_func_call(*func_))
                self.assertTrue(
                    len(inferred_values) == 1,
                    msg=f"Too many inferred values for {func_[0]:s}",
                )
                self.assertTrue(
                    inferred_values[-1].pytype() in licit_array_types,
                    msg=f"Illicit type for {func_[0]:s} ({inferred_values[-1].pytype()})",
                )

astroid-3.2.2/tests/brain/numpy/test_core_multiarray.py

# Licensed under the LGPL: https://www.gnu.org/licenses/old-licenses/lgpl-2.1.en.html
# For details: https://github.com/pylint-dev/astroid/blob/main/LICENSE
# Copyright (c) https://github.com/pylint-dev/astroid/blob/main/CONTRIBUTORS.txt

import unittest

try:
    import numpy  # pylint: disable=unused-import

    HAS_NUMPY = True
except ImportError:
    HAS_NUMPY = False

from astroid import builder


@unittest.skipUnless(HAS_NUMPY,
"This test requires the numpy library.") class BrainNumpyCoreMultiarrayTest(unittest.TestCase): """Test the numpy core multiarray brain module.""" numpy_functions_returning_array = ( ("array", "[1, 2]"), ("bincount", "[1, 2]"), ("busday_count", "('2011-01', '2011-02')"), ("busday_offset", "'2012-03', -1, roll='forward'"), ("concatenate", "([1, 2], [1, 2])"), ("datetime_as_string", "['2012-02', '2012-03']"), ("dot", "[1, 2]", "[1, 2]"), ("empty_like", "[1, 2]"), ("inner", "[1, 2]", "[1, 2]"), ("is_busday", "['2011-07-01', '2011-07-02', '2011-07-18']"), ("lexsort", "(('toto', 'tutu'), ('riri', 'fifi'))"), ("packbits", "np.array([1, 2])"), ("unpackbits", "np.array([[1], [2], [3]], dtype=np.uint8)"), ("vdot", "[1, 2]", "[1, 2]"), ("where", "[True, False]", "[1, 2]", "[2, 1]"), ("empty", "[1, 2]"), ("zeros", "[1, 2]"), ) numpy_functions_returning_bool = ( ("can_cast", "np.int32, np.int64"), ("may_share_memory", "np.array([1, 2])", "np.array([3, 4])"), ("shares_memory", "np.array([1, 2])", "np.array([3, 4])"), ) numpy_functions_returning_dtype = ( # ("min_scalar_type", "10"), # Not yet tested as it returns np.dtype # ("result_type", "'i4'", "'c8'"), # Not yet tested as it returns np.dtype ) numpy_functions_returning_none = (("copyto", "([1, 2], [1, 3])"),) numpy_functions_returning_tuple = ( ( "unravel_index", "[22, 33, 44]", "(6, 7)", ), # Not yet tested as is returns a tuple ) def _inferred_numpy_func_call(self, func_name, *func_args): node = builder.extract_node( f""" import numpy as np func = np.{func_name:s} func({','.join(func_args):s}) """ ) return node.infer() def _inferred_numpy_no_alias_func_call(self, func_name, *func_args): node = builder.extract_node( f""" import numpy func = numpy.{func_name:s} func({','.join(func_args):s}) """ ) return node.infer() def test_numpy_function_calls_inferred_as_ndarray(self): """Test that calls to numpy functions are inferred as numpy.ndarray.""" for infer_wrapper in ( self._inferred_numpy_func_call, 
            self._inferred_numpy_no_alias_func_call,
        ):
            for func_ in self.numpy_functions_returning_array:
                with self.subTest(typ=func_):
                    inferred_values = list(infer_wrapper(*func_))
                    self.assertTrue(
                        len(inferred_values) == 1,
                        msg="Too many inferred values ({}) for {:s}".format(
                            inferred_values, func_[0]
                        ),
                    )
                    self.assertTrue(
                        inferred_values[-1].pytype() == ".ndarray",
                        msg="Illicit type for {:s} ({})".format(
                            func_[0], inferred_values[-1].pytype()
                        ),
                    )

    def test_numpy_function_calls_inferred_as_bool(self):
        """Test that calls to numpy functions are inferred as bool."""
        for infer_wrapper in (
            self._inferred_numpy_func_call,
            self._inferred_numpy_no_alias_func_call,
        ):
            for func_ in self.numpy_functions_returning_bool:
                with self.subTest(typ=func_):
                    inferred_values = list(infer_wrapper(*func_))
                    self.assertTrue(
                        len(inferred_values) == 1,
                        msg="Too many inferred values ({}) for {:s}".format(
                            inferred_values, func_[0]
                        ),
                    )
                    self.assertTrue(
                        inferred_values[-1].pytype() == "builtins.bool",
                        msg="Illicit type for {:s} ({})".format(
                            func_[0], inferred_values[-1].pytype()
                        ),
                    )

    def test_numpy_function_calls_inferred_as_dtype(self):
        """Test that calls to numpy functions are inferred as numpy.dtype."""
        for infer_wrapper in (
            self._inferred_numpy_func_call,
            self._inferred_numpy_no_alias_func_call,
        ):
            for func_ in self.numpy_functions_returning_dtype:
                with self.subTest(typ=func_):
                    inferred_values = list(infer_wrapper(*func_))
                    self.assertTrue(
                        len(inferred_values) == 1,
                        msg="Too many inferred values ({}) for {:s}".format(
                            inferred_values, func_[0]
                        ),
                    )
                    self.assertTrue(
                        inferred_values[-1].pytype() == "numpy.dtype",
                        msg="Illicit type for {:s} ({})".format(
                            func_[0], inferred_values[-1].pytype()
                        ),
                    )

    def test_numpy_function_calls_inferred_as_none(self):
        """Test that calls to numpy functions are inferred as None."""
        for infer_wrapper in (
            self._inferred_numpy_func_call,
            self._inferred_numpy_no_alias_func_call,
        ):
            for func_ in self.numpy_functions_returning_none:
                with self.subTest(typ=func_):
                    inferred_values = list(infer_wrapper(*func_))
                    self.assertTrue(
                        len(inferred_values) == 1,
                        msg="Too many inferred values ({}) for {:s}".format(
                            inferred_values, func_[0]
                        ),
                    )
                    self.assertTrue(
                        inferred_values[-1].pytype() == "builtins.NoneType",
                        msg="Illicit type for {:s} ({})".format(
                            func_[0], inferred_values[-1].pytype()
                        ),
                    )

    def test_numpy_function_calls_inferred_as_tuple(self):
        """Test that calls to numpy functions are inferred as tuple."""
        for infer_wrapper in (
            self._inferred_numpy_func_call,
            self._inferred_numpy_no_alias_func_call,
        ):
            for func_ in self.numpy_functions_returning_tuple:
                with self.subTest(typ=func_):
                    inferred_values = list(infer_wrapper(*func_))
                    self.assertTrue(
                        len(inferred_values) == 1,
                        msg="Too many inferred values ({}) for {:s}".format(
                            inferred_values, func_[0]
                        ),
                    )
                    self.assertTrue(
                        inferred_values[-1].pytype() == "builtins.tuple",
                        msg="Illicit type for {:s} ({})".format(
                            func_[0], inferred_values[-1].pytype()
                        ),
                    )

astroid-3.2.2/tests/brain/numpy/test_core_function_base.py

# Licensed under the LGPL: https://www.gnu.org/licenses/old-licenses/lgpl-2.1.en.html
# For details: https://github.com/pylint-dev/astroid/blob/main/LICENSE
# Copyright (c) https://github.com/pylint-dev/astroid/blob/main/CONTRIBUTORS.txt

import unittest

try:
    import numpy  # pylint: disable=unused-import

    HAS_NUMPY = True
except ImportError:
    HAS_NUMPY = False

from astroid import builder


@unittest.skipUnless(HAS_NUMPY, "This test requires the numpy library.")
class BrainNumpyCoreFunctionBaseTest(unittest.TestCase):
    """Test the numpy core numeric brain module."""

    numpy_functions = (
        ("linspace", "1, 100"),
        ("logspace", "1, 100"),
        ("geomspace", "1, 100"),
    )

    def _inferred_numpy_func_call(self, func_name, *func_args):
        node = builder.extract_node(
            f"""
        import numpy as np
        func = np.{func_name:s}
        func({','.join(func_args):s})
        """
        )
        return node.infer()

    def test_numpy_function_calls_inferred_as_ndarray(self):
"""Test that calls to numpy functions are inferred as numpy.ndarray.""" licit_array_types = (".ndarray",) for func_ in self.numpy_functions: with self.subTest(typ=func_): inferred_values = list(self._inferred_numpy_func_call(*func_)) self.assertTrue( len(inferred_values) == 1, msg=f"Too much inferred value for {func_[0]:s}", ) self.assertTrue( inferred_values[-1].pytype() in licit_array_types, msg="Illicit type for {:s} ({})".format( func_[0], inferred_values[-1].pytype() ), ) astroid-3.2.2/tests/brain/test_six.py0000664000175000017500000001366314622475517017570 0ustar epsilonepsilon# Licensed under the LGPL: https://www.gnu.org/licenses/old-licenses/lgpl-2.1.en.html # For details: https://github.com/pylint-dev/astroid/blob/main/LICENSE # Copyright (c) https://github.com/pylint-dev/astroid/blob/main/CONTRIBUTORS.txt from __future__ import annotations import unittest from typing import Any import astroid from astroid import MANAGER, builder, nodes from astroid.nodes.scoped_nodes import ClassDef try: import six # type: ignore[import] # pylint: disable=unused-import HAS_SIX = True except ImportError: HAS_SIX = False @unittest.skipUnless(HAS_SIX, "These tests require the six library") class SixBrainTest(unittest.TestCase): def test_attribute_access(self) -> None: ast_nodes = builder.extract_node( """ import six six.moves.http_client #@ six.moves.urllib_parse #@ six.moves.urllib_error #@ six.moves.urllib.request #@ from six.moves import StringIO StringIO #@ """ ) assert isinstance(ast_nodes, list) http_client = next(ast_nodes[0].infer()) self.assertIsInstance(http_client, nodes.Module) self.assertEqual(http_client.name, "http.client") urllib_parse = next(ast_nodes[1].infer()) self.assertIsInstance(urllib_parse, nodes.Module) self.assertEqual(urllib_parse.name, "urllib.parse") urljoin = next(urllib_parse.igetattr("urljoin")) urlencode = next(urllib_parse.igetattr("urlencode")) self.assertIsInstance(urljoin, nodes.FunctionDef) self.assertEqual(urljoin.qname(), 
"urllib.parse.urljoin") self.assertIsInstance(urlencode, nodes.FunctionDef) self.assertEqual(urlencode.qname(), "urllib.parse.urlencode") urllib_error = next(ast_nodes[2].infer()) self.assertIsInstance(urllib_error, nodes.Module) self.assertEqual(urllib_error.name, "urllib.error") urlerror = next(urllib_error.igetattr("URLError")) self.assertIsInstance(urlerror, nodes.ClassDef) content_too_short = next(urllib_error.igetattr("ContentTooShortError")) self.assertIsInstance(content_too_short, nodes.ClassDef) urllib_request = next(ast_nodes[3].infer()) self.assertIsInstance(urllib_request, nodes.Module) self.assertEqual(urllib_request.name, "urllib.request") urlopen = next(urllib_request.igetattr("urlopen")) urlretrieve = next(urllib_request.igetattr("urlretrieve")) self.assertIsInstance(urlopen, nodes.FunctionDef) self.assertEqual(urlopen.qname(), "urllib.request.urlopen") self.assertIsInstance(urlretrieve, nodes.FunctionDef) self.assertEqual(urlretrieve.qname(), "urllib.request.urlretrieve") StringIO = next(ast_nodes[4].infer()) self.assertIsInstance(StringIO, nodes.ClassDef) self.assertEqual(StringIO.qname(), "_io.StringIO") self.assertTrue(StringIO.callable()) def test_attribute_access_with_six_moves_imported(self) -> None: astroid.MANAGER.clear_cache() astroid.MANAGER._mod_file_cache.clear() import six.moves # type: ignore[import] # pylint: disable=import-outside-toplevel,unused-import,redefined-outer-name self.test_attribute_access() def test_from_imports(self) -> None: ast_node = builder.extract_node( """ from six.moves import http_client http_client.HTTPSConnection #@ """ ) inferred = next(ast_node.infer()) self.assertIsInstance(inferred, nodes.ClassDef) qname = "http.client.HTTPSConnection" self.assertEqual(inferred.qname(), qname) def test_from_submodule_imports(self) -> None: """Make sure ulrlib submodules can be imported from See pylint-dev/pylint#1640 for relevant issue """ ast_node = builder.extract_node( """ from six.moves.urllib.parse import urlparse 
urlparse #@ """ ) inferred = next(ast_node.infer()) self.assertIsInstance(inferred, nodes.FunctionDef) def test_with_metaclass_subclasses_inheritance(self) -> None: ast_node = builder.extract_node( """ class A(type): def test(cls): return cls class C: pass import six class B(six.with_metaclass(A, C)): pass B #@ """ ) inferred = next(ast_node.infer()) self.assertIsInstance(inferred, nodes.ClassDef) self.assertEqual(inferred.name, "B") self.assertIsInstance(inferred.bases[0], nodes.Call) ancestors = tuple(inferred.ancestors()) self.assertIsInstance(ancestors[0], nodes.ClassDef) self.assertEqual(ancestors[0].name, "C") self.assertIsInstance(ancestors[1], nodes.ClassDef) self.assertEqual(ancestors[1].name, "object") @staticmethod def test_six_with_metaclass_enum_ancestor() -> None: code = """ import six from enum import Enum, EnumMeta class FooMeta(EnumMeta): pass class Foo(six.with_metaclass(FooMeta, Enum)): #@ bar = 1 """ klass = astroid.extract_node(code) assert next(klass.ancestors()).name == "Enum" def test_six_with_metaclass_with_additional_transform(self) -> None: def transform_class(cls: Any) -> ClassDef: if cls.name == "A": cls._test_transform = 314 return cls MANAGER.register_transform(nodes.ClassDef, transform_class) try: ast_node = builder.extract_node( """ import six class A(six.with_metaclass(type, object)): pass A #@ """ ) inferred = next(ast_node.infer()) assert getattr(inferred, "_test_transform", None) == 314 finally: MANAGER.unregister_transform(nodes.ClassDef, transform_class) astroid-3.2.2/tests/brain/test_unittest.py0000664000175000017500000000202714622475517020634 0ustar epsilonepsilon# Licensed under the LGPL: https://www.gnu.org/licenses/old-licenses/lgpl-2.1.en.html # For details: https://github.com/pylint-dev/astroid/blob/main/LICENSE # Copyright (c) https://github.com/pylint-dev/astroid/blob/main/CONTRIBUTORS.txt import unittest from astroid import builder from astroid.test_utils import require_version class UnittestTest(unittest.TestCase): 
"""A class that tests the brain_unittest module.""" @require_version(minver="3.8.0") def test_isolatedasynciotestcase(self): """ Tests that the IsolatedAsyncioTestCase class is statically imported thanks to the brain_unittest module. """ node = builder.extract_node( """ from unittest import IsolatedAsyncioTestCase class TestClass(IsolatedAsyncioTestCase): pass """ ) assert [n.qname() for n in node.ancestors()] == [ "unittest.async_case.IsolatedAsyncioTestCase", "unittest.case.TestCase", "builtins.object", ] astroid-3.2.2/tests/brain/test_attr.py0000664000175000017500000001157314622475517017735 0ustar epsilonepsilon# Licensed under the LGPL: https://www.gnu.org/licenses/old-licenses/lgpl-2.1.en.html # For details: https://github.com/pylint-dev/astroid/blob/main/LICENSE # Copyright (c) https://github.com/pylint-dev/astroid/blob/main/CONTRIBUTORS.txt from __future__ import annotations import unittest import astroid from astroid import nodes try: import attr # type: ignore[import] # pylint: disable=unused-import HAS_ATTR = True except ImportError: HAS_ATTR = False @unittest.skipUnless(HAS_ATTR, "These tests require the attr library") class AttrsTest(unittest.TestCase): def test_attr_transform(self) -> None: module = astroid.parse( """ import attr from attr import attrs, attrib, field @attr.s class Foo: d = attr.ib(attr.Factory(dict)) f = Foo() f.d['answer'] = 42 @attr.s(slots=True) class Bar: d = attr.ib(attr.Factory(dict)) g = Bar() g.d['answer'] = 42 @attrs class Bah: d = attrib(attr.Factory(dict)) h = Bah() h.d['answer'] = 42 @attr.attrs class Bai: d = attr.attrib(attr.Factory(dict)) i = Bai() i.d['answer'] = 42 @attr.define class Spam: d = field(default=attr.Factory(dict)) j = Spam(d=1) j.d['answer'] = 42 @attr.mutable class Eggs: d = attr.field(default=attr.Factory(dict)) k = Eggs(d=1) k.d['answer'] = 42 @attr.frozen class Eggs: d = attr.field(default=attr.Factory(dict)) l = Eggs(d=1) l.d['answer'] = 42 @attr.attrs(auto_attribs=True) class Eggs: d: int = 
attr.Factory(lambda: 3) m = Eggs(d=1) """ ) for name in ("f", "g", "h", "i", "j", "k", "l", "m"): should_be_unknown = next(module.getattr(name)[0].infer()).getattr("d")[0] self.assertIsInstance(should_be_unknown, astroid.Unknown) def test_attrs_transform(self) -> None: """Test brain for decorators of the 'attrs' package. Package added support for 'attrs' alongside 'attr' in v21.3.0. See: https://github.com/python-attrs/attrs/releases/tag/21.3.0 """ module = astroid.parse( """ import attrs from attrs import field, mutable, frozen, define from attrs import mutable as my_mutable @attrs.define class Foo: d = attrs.field(attrs.Factory(dict)) f = Foo() f.d['answer'] = 42 @attrs.define(slots=True) class Bar: d = field(attrs.Factory(dict)) g = Bar() g.d['answer'] = 42 @attrs.mutable class Bah: d = field(attrs.Factory(dict)) h = Bah() h.d['answer'] = 42 @attrs.frozen class Bai: d = attrs.field(attrs.Factory(dict)) i = Bai() i.d['answer'] = 42 @attrs.define class Spam: d = field(default=attrs.Factory(dict)) j = Spam(d=1) j.d['answer'] = 42 @attrs.mutable class Eggs: d = attrs.field(default=attrs.Factory(dict)) k = Eggs(d=1) k.d['answer'] = 42 @attrs.frozen class Eggs: d = attrs.field(default=attrs.Factory(dict)) l = Eggs(d=1) l.d['answer'] = 42 @frozen class Legs: d = attrs.field(default=attrs.Factory(dict)) """ ) for name in ("f", "g", "h", "i", "j", "k", "l"): should_be_unknown = next(module.getattr(name)[0].infer()).getattr("d")[0] self.assertIsInstance(should_be_unknown, astroid.Unknown, name) def test_special_attributes(self) -> None: """Make sure special attrs attributes exist""" code = """ import attr @attr.s class Foo: pass Foo() """ foo_inst = next(astroid.extract_node(code).infer()) [attr_node] = foo_inst.getattr("__attrs_attrs__") # Prevents https://github.com/pylint-dev/pylint/issues/1884 assert isinstance(attr_node, nodes.Unknown) def test_dont_consider_assignments_but_without_attrs(self) -> None: code = """ import attr class Cls: pass @attr.s class Foo: temp = 
Cls() temp.prop = 5 bar_thing = attr.ib(default=temp) Foo() """ next(astroid.extract_node(code).infer()) def test_attrs_with_annotation(self) -> None: code = """ import attr @attr.s class Foo: bar: int = attr.ib(default=5) Foo() """ should_be_unknown = next(astroid.extract_node(code).infer()).getattr("bar")[0] self.assertIsInstance(should_be_unknown, astroid.Unknown) astroid-3.2.2/tests/brain/test_typing_extensions.py0000664000175000017500000000236114622475517022547 0ustar epsilonepsilon# Licensed under the LGPL: https://www.gnu.org/licenses/old-licenses/lgpl-2.1.en.html # For details: https://github.com/pylint-dev/astroid/blob/main/LICENSE # Copyright (c) https://github.com/pylint-dev/astroid/blob/main/CONTRIBUTORS.txt from __future__ import annotations import pytest from astroid import builder, nodes try: import typing_extensions HAS_TYPING_EXTENSIONS = True HAS_TYPING_EXTENSIONS_TYPEVAR = hasattr(typing_extensions, "TypeVar") except ImportError: HAS_TYPING_EXTENSIONS = False HAS_TYPING_EXTENSIONS_TYPEVAR = False @pytest.mark.skipif( not HAS_TYPING_EXTENSIONS, reason="These tests require the typing_extensions library", ) class TestTypingExtensions: @staticmethod @pytest.mark.skipif( not HAS_TYPING_EXTENSIONS_TYPEVAR, reason="Need typing_extensions>=4.4.0 to test TypeVar", ) def test_typing_extensions_types() -> None: ast_nodes = builder.extract_node( """ from typing_extensions import TypeVar TypeVar('MyTypeVar', int, float, complex) #@ TypeVar('AnyStr', str, bytes) #@ """ ) for node in ast_nodes: inferred = next(node.infer()) assert isinstance(inferred, nodes.ClassDef) astroid-3.2.2/tests/brain/test_named_tuple.py0000664000175000017500000002500214622475517021250 0ustar epsilonepsilon# Licensed under the LGPL: https://www.gnu.org/licenses/old-licenses/lgpl-2.1.en.html # For details: https://github.com/pylint-dev/astroid/blob/main/LICENSE # Copyright (c) https://github.com/pylint-dev/astroid/blob/main/CONTRIBUTORS.txt from __future__ import annotations import 
unittest

import astroid
from astroid import builder, nodes, util
from astroid.exceptions import AttributeInferenceError


class NamedTupleTest(unittest.TestCase):
    def test_namedtuple_base(self) -> None:
        klass = builder.extract_node(
            """
            from collections import namedtuple

            class X(namedtuple("X", ["a", "b", "c"])):
               pass
            """
        )
        assert isinstance(klass, nodes.ClassDef)
        self.assertEqual(
            [anc.name for anc in klass.ancestors()], ["X", "tuple", "object"]
        )
        # See: https://github.com/pylint-dev/pylint/issues/5982
        self.assertNotIn("X", klass.locals)
        for anc in klass.ancestors():
            self.assertFalse(anc.parent is None)

    def test_namedtuple_inference(self) -> None:
        klass = builder.extract_node(
            """
            from collections import namedtuple

            name = "X"
            fields = ["a", "b", "c"]
            class X(namedtuple(name, fields)):
               pass
            """
        )
        assert isinstance(klass, nodes.ClassDef)
        base = next(base for base in klass.ancestors() if base.name == "X")
        self.assertSetEqual({"a", "b", "c"}, set(base.instance_attrs))

    def test_namedtuple_inference_failure(self) -> None:
        klass = builder.extract_node(
            """
            from collections import namedtuple

            def foo(fields):
               return __(namedtuple("foo", fields))
            """
        )
        self.assertIs(util.Uninferable, next(klass.infer()))

    def test_namedtuple_advanced_inference(self) -> None:
        # urlparse returns an object of class ParseResult, which has a
        # namedtuple call and a mixin as base classes
        result = builder.extract_node(
            """
            from urllib.parse import urlparse

            result = __(urlparse('gopher://'))
            """
        )
        instance = next(result.infer())
        self.assertGreaterEqual(len(instance.getattr("scheme")), 1)
        self.assertGreaterEqual(len(instance.getattr("port")), 1)
        with self.assertRaises(AttributeInferenceError):
            instance.getattr("foo")
        self.assertGreaterEqual(len(instance.getattr("geturl")), 1)
        self.assertEqual(instance.name, "ParseResult")

    def test_namedtuple_instance_attrs(self) -> None:
        result = builder.extract_node(
            """
            from collections import namedtuple

            namedtuple('a', 'a b c')(1, 2, 3) #@
            """
        )
        inferred = next(result.infer())
        for name, attr in inferred.instance_attrs.items():
            self.assertEqual(attr[0].attrname, name)

    def test_namedtuple_uninferable_fields(self) -> None:
        node = builder.extract_node(
            """
            x = [A] * 2
            from collections import namedtuple
            l = namedtuple('a', x)
            l(1)
            """
        )
        inferred = next(node.infer())
        self.assertIs(util.Uninferable, inferred)

    def test_namedtuple_access_class_fields(self) -> None:
        node = builder.extract_node(
            """
            from collections import namedtuple

            Tuple = namedtuple("Tuple", "field other")
            Tuple #@
            """
        )
        inferred = next(node.infer())
        self.assertIn("field", inferred.locals)
        self.assertIn("other", inferred.locals)

    def test_namedtuple_rename_keywords(self) -> None:
        node = builder.extract_node(
            """
            from collections import namedtuple

            Tuple = namedtuple("Tuple", "abc def", rename=True)
            Tuple #@
            """
        )
        inferred = next(node.infer())
        self.assertIn("abc", inferred.locals)
        self.assertIn("_1", inferred.locals)

    def test_namedtuple_rename_duplicates(self) -> None:
        node = builder.extract_node(
            """
            from collections import namedtuple

            Tuple = namedtuple("Tuple", "abc abc abc", rename=True)
            Tuple #@
            """
        )
        inferred = next(node.infer())
        self.assertIn("abc", inferred.locals)
        self.assertIn("_1", inferred.locals)
        self.assertIn("_2", inferred.locals)

    def test_namedtuple_rename_uninferable(self) -> None:
        node = builder.extract_node(
            """
            from collections import namedtuple

            Tuple = namedtuple("Tuple", "a b c", rename=UNINFERABLE)
            Tuple #@
            """
        )
        inferred = next(node.infer())
        self.assertIn("a", inferred.locals)
        self.assertIn("b", inferred.locals)
        self.assertIn("c", inferred.locals)

    def test_namedtuple_func_form(self) -> None:
        node = builder.extract_node(
            """
            from collections import namedtuple

            Tuple = namedtuple(typename="Tuple", field_names="a b c", rename=UNINFERABLE)
            Tuple #@
            """
        )
        inferred = next(node.infer())
        self.assertEqual(inferred.name, "Tuple")
        self.assertIn("a", inferred.locals)
        self.assertIn("b", inferred.locals)
        self.assertIn("c", inferred.locals)

    def test_namedtuple_func_form_args_and_kwargs(self) -> None:
        node = builder.extract_node(
            """
            from collections import namedtuple

            Tuple = namedtuple("Tuple", field_names="a b c", rename=UNINFERABLE)
            Tuple #@
            """
        )
        inferred = next(node.infer())
        self.assertEqual(inferred.name, "Tuple")
        self.assertIn("a", inferred.locals)
        self.assertIn("b", inferred.locals)
        self.assertIn("c", inferred.locals)

    def test_namedtuple_bases_are_actually_names_not_nodes(self) -> None:
        node = builder.extract_node(
            """
            from collections import namedtuple

            Tuple = namedtuple("Tuple", field_names="a b c", rename=UNINFERABLE)
            Tuple #@
            """
        )
        inferred = next(node.infer())
        self.assertIsInstance(inferred, astroid.ClassDef)
        self.assertIsInstance(inferred.bases[0], astroid.Name)
        self.assertEqual(inferred.bases[0].name, "tuple")

    def test_invalid_label_does_not_crash_inference(self) -> None:
        code = """
        import collections
        a = collections.namedtuple( 'a', ['b c'] )
        a
        """
        node = builder.extract_node(code)
        inferred = next(node.infer())
        assert isinstance(inferred, astroid.ClassDef)
        assert "b" not in inferred.locals
        assert "c" not in inferred.locals

    def test_no_rename_duplicates_does_not_crash_inference(self) -> None:
        node = builder.extract_node(
            """
            from collections import namedtuple

            Tuple = namedtuple("Tuple", "abc abc")
            Tuple #@
            """
        )
        inferred = next(node.infer())
        self.assertIs(util.Uninferable, inferred)  # would raise ValueError

    def test_no_rename_keywords_does_not_crash_inference(self) -> None:
        node = builder.extract_node(
            """
            from collections import namedtuple

            Tuple = namedtuple("Tuple", "abc def")
            Tuple #@
            """
        )
        inferred = next(node.infer())
        self.assertIs(util.Uninferable, inferred)  # would raise ValueError

    def test_no_rename_nonident_does_not_crash_inference(self) -> None:
        node = builder.extract_node(
            """
            from collections import namedtuple

            Tuple = namedtuple("Tuple", "123 456")
            Tuple #@
            """
        )
        inferred = next(node.infer())
        self.assertIs(util.Uninferable, inferred)  # would raise ValueError

    def test_no_rename_underscore_does_not_crash_inference(self) -> None:
        node = builder.extract_node(
            """
            from collections import namedtuple

            Tuple = namedtuple("Tuple", "_1")
            Tuple #@
            """
        )
        inferred = next(node.infer())
        self.assertIs(util.Uninferable, inferred)  # would raise ValueError

    def test_invalid_typename_does_not_crash_inference(self) -> None:
        node = builder.extract_node(
            """
            from collections import namedtuple

            Tuple = namedtuple("123", "abc")
            Tuple #@
            """
        )
        inferred = next(node.infer())
        self.assertIs(util.Uninferable, inferred)  # would raise ValueError

    def test_keyword_typename_does_not_crash_inference(self) -> None:
        node = builder.extract_node(
            """
            from collections import namedtuple

            Tuple = namedtuple("while", "abc")
            Tuple #@
            """
        )
        inferred = next(node.infer())
        self.assertIs(util.Uninferable, inferred)  # would raise ValueError

    def test_typeerror_does_not_crash_inference(self) -> None:
        node = builder.extract_node(
            """
            from collections import namedtuple

            Tuple = namedtuple("Tuple", [123, 456])
            Tuple #@
            """
        )
        inferred = next(node.infer())
        # namedtuple converts all arguments to strings, so these should be too
        # and get caught by the isidentifier() check
        self.assertIs(util.Uninferable, inferred)

    def test_pathological_str_does_not_crash_inference(self) -> None:
        node = builder.extract_node(
            """
            from collections import namedtuple

            class Invalid:
                def __str__(self):
                    return 123  # will raise TypeError

            Tuple = namedtuple("Tuple", [Invalid()])
            Tuple #@
            """
        )
        inferred = next(node.infer())
        self.assertIs(util.Uninferable, inferred)

    def test_name_as_typename(self) -> None:
        """Reported in https://github.com/pylint-dev/pylint/issues/7429 as a crash."""
        good_node, good_node_two, bad_node = builder.extract_node(
            """
            import collections
            collections.namedtuple(typename="MyTuple", field_names=["birth_date", "city"]) #@
            collections.namedtuple("MyTuple", field_names=["birth_date", "city"]) #@
            collections.namedtuple(["birth_date", "city"], typename="MyTuple") #@
            """
        )
        good_inferred = next(good_node.infer())
        assert isinstance(good_inferred, nodes.ClassDef)
        good_node_two_inferred = next(good_node_two.infer())
        assert isinstance(good_node_two_inferred, nodes.ClassDef)
        bad_node_inferred = next(bad_node.infer())
        assert bad_node_inferred == util.Uninferable


# ==== astroid-3.2.2/tests/brain/test_pytest.py ====
# Licensed under the LGPL: https://www.gnu.org/licenses/old-licenses/lgpl-2.1.en.html
# For details: https://github.com/pylint-dev/astroid/blob/main/LICENSE
# Copyright (c) https://github.com/pylint-dev/astroid/blob/main/CONTRIBUTORS.txt

from __future__ import annotations

from astroid import builder


def test_pytest() -> None:
    ast_node = builder.extract_node(
        """
        import pytest
        pytest #@
        """
    )
    module = next(ast_node.infer())
    attrs = [
        "deprecated_call",
        "warns",
        "exit",
        "fail",
        "skip",
        "importorskip",
        "xfail",
        "mark",
        "raises",
        "freeze_includes",
        "set_trace",
        "fixture",
        "yield_fixture",
    ]
    for attr in attrs:
        assert attr in module


# ==== astroid-3.2.2/tests/brain/__init__.py (empty) ====


# ==== astroid-3.2.2/tests/brain/test_ssl.py ====
# Licensed under the LGPL: https://www.gnu.org/licenses/old-licenses/lgpl-2.1.en.html
# For details: https://github.com/pylint-dev/astroid/blob/main/LICENSE
# Copyright (c) https://github.com/pylint-dev/astroid/blob/main/CONTRIBUTORS.txt

"""Tests for the ssl brain."""

from astroid import bases, nodes, parse


def test_ssl_brain() -> None:
    """Test ssl brain transform."""
    module = parse(
        """
        import ssl
        ssl.PROTOCOL_TLSv1
        ssl.VerifyMode
        ssl.TLSVersion
        ssl.VerifyMode.CERT_REQUIRED
        """
    )
    inferred_protocol = next(module.body[1].value.infer())
    assert isinstance(inferred_protocol, nodes.Const)

    inferred_verifymode = next(module.body[2].value.infer())
    assert isinstance(inferred_verifymode, nodes.ClassDef)
    assert inferred_verifymode.name == "VerifyMode"
    assert
    len(inferred_verifymode.bases) == 1

    # Check that VerifyMode correctly inherits from enum.IntEnum
    int_enum = next(inferred_verifymode.bases[0].infer())
    assert isinstance(int_enum, nodes.ClassDef)
    assert int_enum.name == "IntEnum"
    assert int_enum.parent.name == "enum"

    # TLSVersion is inferred from the main module, not from the brain
    inferred_tlsversion = next(module.body[3].value.infer())
    assert isinstance(inferred_tlsversion, nodes.ClassDef)
    assert inferred_tlsversion.name == "TLSVersion"

    # CERT_REQUIRED is inferred from the main module, not from the brain
    inferred_cert_required = next(module.body[4].value.infer())
    assert isinstance(inferred_cert_required, bases.Instance)
    assert inferred_cert_required._proxied.name == "CERT_REQUIRED"


# ==== astroid-3.2.2/tests/brain/test_ctypes.py ====
# Licensed under the LGPL: https://www.gnu.org/licenses/old-licenses/lgpl-2.1.en.html
# For details: https://github.com/pylint-dev/astroid/blob/main/LICENSE
# Copyright (c) https://github.com/pylint-dev/astroid/blob/main/CONTRIBUTORS.txt

import sys

import pytest

from astroid import extract_node, nodes

pytestmark = pytest.mark.skipif(
    hasattr(sys, "pypy_version_info"),
    reason="pypy has its own implementation of _ctypes module which is different "
    "from the one of cpython",
)


# The parameters of the test define a mapping between the ctypes redefined types
# and the builtin types that the "value" member holds
@pytest.mark.parametrize(
    "c_type,builtin_type,type_code",
    [
        ("c_bool", "bool", "?"),
        ("c_byte", "int", "b"),
        ("c_char", "bytes", "c"),
        ("c_double", "float", "d"),
        pytest.param(
            "c_buffer",
            "bytes",
            "",
            marks=pytest.mark.xfail(
                reason="c_buffer is Uninferable but for now we do not know why"
            ),
        ),
        ("c_float", "float", "f"),
        ("c_int", "int", "i"),
        ("c_int16", "int", "h"),
        ("c_int32", "int", "i"),
        ("c_int64", "int", "l"),
        ("c_int8", "int", "b"),
        ("c_long", "int", "l"),
        ("c_longdouble", "float", "g"),
        ("c_longlong", "int", "l"),
        ("c_short", "int", "h"),
        ("c_size_t", "int", "L"),
        ("c_ssize_t", "int", "l"),
        ("c_ubyte", "int", "B"),
        ("c_uint", "int", "I"),
        ("c_uint16", "int", "H"),
        ("c_uint32", "int", "I"),
        ("c_uint64", "int", "L"),
        ("c_uint8", "int", "B"),
        ("c_ulong", "int", "L"),
        ("c_ulonglong", "int", "L"),
        ("c_ushort", "int", "H"),
        ("c_wchar", "str", "u"),
    ],
)
def test_ctypes_redefined_types_members(c_type, builtin_type, type_code):
    """Test that the "value" and "_type_" members of each redefined type are
    correct.
    """
    src = f"""
    import ctypes
    x=ctypes.{c_type}("toto")
    x.value
    """
    node = extract_node(src)
    assert isinstance(node, nodes.NodeNG)
    node_inf = node.inferred()[0]
    assert node_inf.pytype() == f"builtins.{builtin_type}"

    src = f"""
    import ctypes
    x=ctypes.{c_type}("toto")
    x._type_
    """
    node = extract_node(src)
    assert isinstance(node, nodes.NodeNG)
    node_inf = node.inferred()[0]
    assert isinstance(node_inf, nodes.Const)
    assert node_inf.value == type_code


def test_cdata_member_access() -> None:
    """
    Test that the base members are still accessible.
    Each redefined ctypes type inherits from _SimpleCData, which itself
    inherits from _CData. Checks that _CData members are accessible.
    """
    src = """
    import ctypes
    x=ctypes.c_float(1.0)
    x._objects
    """
    node = extract_node(src)
    assert isinstance(node, nodes.NodeNG)
    node_inf = node.inferred()[0]
    assert node_inf.display_type() == "Class"
    assert node_inf.qname() == "_ctypes._SimpleCData._objects"


def test_other_ctypes_member_untouched() -> None:
    """
    Test that other ctypes members, which are not touched by the brain,
    are correctly inferred.
    """
    src = """
    import ctypes
    ctypes.ARRAY(3, 2)
    """
    node = extract_node(src)
    assert isinstance(node, nodes.NodeNG)
    node_inf = node.inferred()[0]
    assert isinstance(node_inf, nodes.Const)
    assert node_inf.value == 6


# ==== astroid-3.2.2/tests/brain/test_dateutil.py ====
# Licensed under the LGPL: https://www.gnu.org/licenses/old-licenses/lgpl-2.1.en.html
# For details: https://github.com/pylint-dev/astroid/blob/main/LICENSE
# Copyright (c) https://github.com/pylint-dev/astroid/blob/main/CONTRIBUTORS.txt

from __future__ import annotations

import unittest

from astroid import builder

try:
    import dateutil  # type: ignore[import]  # pylint: disable=unused-import

    HAS_DATEUTIL = True
except ImportError:
    HAS_DATEUTIL = False


@unittest.skipUnless(HAS_DATEUTIL, "This test requires the dateutil library.")
class DateutilBrainTest(unittest.TestCase):
    def test_parser(self):
        module = builder.parse(
            """
            from dateutil.parser import parse
            d = parse('2000-01-01')
            """
        )
        d_type = next(module["d"].infer())
        self.assertIn(d_type.qname(), {"_pydatetime.datetime", "datetime.datetime"})


# ==== astroid-3.2.2/tests/brain/test_brain.py ====
# Licensed under the LGPL: https://www.gnu.org/licenses/old-licenses/lgpl-2.1.en.html
# For details: https://github.com/pylint-dev/astroid/blob/main/LICENSE
# Copyright (c) https://github.com/pylint-dev/astroid/blob/main/CONTRIBUTORS.txt

from __future__ import annotations

import io
import re
import sys
import unittest

import pytest

import astroid
from astroid import MANAGER, builder, nodes, objects, test_utils, util
from astroid.bases import Instance
from astroid.brain.brain_namedtuple_enum import _get_namedtuple_fields
from astroid.const import PY312_PLUS, PY313_PLUS
from astroid.exceptions import (
    AttributeInferenceError,
    InferenceError,
    UseInferenceDefault,
)
from astroid.nodes.node_classes import Const
from astroid.nodes.scoped_nodes import
ClassDef


def assertEqualMro(klass: ClassDef, expected_mro: list[str]) -> None:
    """Check mro names."""
    assert [member.qname() for member in klass.mro()] == expected_mro


class CollectionsDequeTests(unittest.TestCase):
    def _inferred_queue_instance(self) -> Instance:
        node = builder.extract_node(
            """
            import collections
            q = collections.deque([])
            q
            """
        )
        return next(node.infer())

    def test_deque(self) -> None:
        inferred = self._inferred_queue_instance()
        self.assertTrue(inferred.getattr("__len__"))

    def test_deque_py35methods(self) -> None:
        inferred = self._inferred_queue_instance()
        self.assertIn("copy", inferred.locals)
        self.assertIn("insert", inferred.locals)
        self.assertIn("index", inferred.locals)

    @test_utils.require_version(maxver="3.8")
    def test_deque_not_py39methods(self):
        inferred = self._inferred_queue_instance()
        with self.assertRaises(AttributeInferenceError):
            inferred.getattr("__class_getitem__")

    @test_utils.require_version(minver="3.9")
    def test_deque_py39methods(self):
        inferred = self._inferred_queue_instance()
        self.assertTrue(inferred.getattr("__class_getitem__"))


class OrderedDictTest(unittest.TestCase):
    def _inferred_ordered_dict_instance(self) -> Instance:
        node = builder.extract_node(
            """
            import collections
            d = collections.OrderedDict()
            d
            """
        )
        return next(node.infer())

    def test_ordered_dict_py34method(self) -> None:
        inferred = self._inferred_ordered_dict_instance()
        self.assertIn("move_to_end", inferred.locals)


class DefaultDictTest(unittest.TestCase):
    def test_1(self) -> None:
        node = builder.extract_node(
            """
            from collections import defaultdict

            X = defaultdict(int)
            X[0]
            """
        )
        inferred = next(node.infer())
        self.assertIs(util.Uninferable, inferred)


class ModuleExtenderTest(unittest.TestCase):
    def test_extension_modules(self) -> None:
        transformer = MANAGER._transform
        for extender, _ in transformer.transforms[nodes.Module]:
            n = nodes.Module("__main__")
            extender(n)


def streams_are_fine():
    """Check if streams are being overwritten, for example, by pytest.

    Stream inference will not work if they are overwritten.

    PY3 only.
    """
    return all(isinstance(s, io.IOBase) for s in (sys.stdout, sys.stderr, sys.stdin))


class IOBrainTest(unittest.TestCase):
    @unittest.skipUnless(
        streams_are_fine(),
        "Needs Python 3 io model / doesn't work with plain pytest. "
        "Use pytest -s for this test to work",
    )
    def test_sys_streams(self):
        for name in ("stdout", "stderr", "stdin"):
            node = astroid.extract_node(
                f"""
                import sys
                sys.{name}
                """
            )
            inferred = next(node.infer())
            buffer_attr = next(inferred.igetattr("buffer"))
            self.assertIsInstance(buffer_attr, astroid.Instance)
            self.assertEqual(buffer_attr.name, "BufferedWriter")
            raw = next(buffer_attr.igetattr("raw"))
            self.assertIsInstance(raw, astroid.Instance)
            self.assertEqual(raw.name, "FileIO")


@test_utils.require_version("3.9")
class TypeBrain(unittest.TestCase):
    def test_type_subscript(self):
        """
        Check that the type object has the __class_getitem__ method
        when it is used as a subscript.
        """
        src = builder.extract_node(
            """
            a: type[int] = int
            """
        )
        val_inf = src.annotation.value.inferred()[0]
        self.assertIsInstance(val_inf, astroid.ClassDef)
        self.assertEqual(val_inf.name, "type")
        meth_inf = val_inf.getattr("__class_getitem__")[0]
        self.assertIsInstance(meth_inf, astroid.FunctionDef)

    def test_invalid_type_subscript(self):
        """
        Check that a type (str for example) that inherits from type does not
        have the __class_getitem__ method even when it is used as a subscript.
        """
        src = builder.extract_node(
            """
            a: str[int] = "abc"
            """
        )
        val_inf = src.annotation.value.inferred()[0]
        self.assertIsInstance(val_inf, astroid.ClassDef)
        self.assertEqual(val_inf.name, "str")
        with self.assertRaises(AttributeInferenceError):
            # pylint: disable=expression-not-assigned
            # noinspection PyStatementEffect
            val_inf.getattr("__class_getitem__")[0]

    @test_utils.require_version(minver="3.9")
    def test_builtin_subscriptable(self):
        """Starting with python3.9 builtin types such as list are subscriptable.
        Any builtin class such as "enumerate" or "staticmethod" also works."""
        for typename in ("tuple", "list", "dict", "set", "frozenset", "enumerate"):
            src = f"""
            {typename:s}[int]
            """
            right_node = builder.extract_node(src)
            inferred = next(right_node.infer())
            self.assertIsInstance(inferred, nodes.ClassDef)
            self.assertIsInstance(inferred.getattr("__iter__")[0], nodes.FunctionDef)


def check_metaclass_is_abc(node: nodes.ClassDef):
    if PY312_PLUS and node.name == "ByteString":
        # .metaclass() finds the first metaclass in the mro(),
        # which, from 3.12, is _DeprecateByteStringMeta (unhelpful)
        # until ByteString is removed in 3.14.
        # Jump over the first two ByteString classes in the mro().
        check_metaclass_is_abc(node.mro()[2])
    else:
        meta = node.metaclass()
        assert isinstance(meta, nodes.ClassDef)
        assert meta.name == "ABCMeta"


class CollectionsBrain(unittest.TestCase):
    def test_collections_object_not_subscriptable(self) -> None:
        """
        Test that unsubscriptable types are detected.
        Hashable is not subscriptable even with python39.
        """
        wrong_node = builder.extract_node(
            """
            import collections.abc
            collections.abc.Hashable[int]
            """
        )
        with self.assertRaises(InferenceError):
            next(wrong_node.infer())
        right_node = builder.extract_node(
            """
            import collections.abc
            collections.abc.Hashable
            """
        )
        inferred = next(right_node.infer())
        check_metaclass_is_abc(inferred)
        assertEqualMro(
            inferred,
            [
                "_collections_abc.Hashable",
                "builtins.object",
            ],
        )
        with self.assertRaises(AttributeInferenceError):
            inferred.getattr("__class_getitem__")

    @test_utils.require_version(minver="3.9")
    def test_collections_object_subscriptable(self):
        """Starting with python39 some objects of the collections module are subscriptable.
        Test one of them."""
        right_node = builder.extract_node(
            """
            import collections.abc
            collections.abc.MutableSet[int]
            """
        )
        inferred = next(right_node.infer())
        check_metaclass_is_abc(inferred)
        assertEqualMro(
            inferred,
            [
                "_collections_abc.MutableSet",
                "_collections_abc.Set",
                "_collections_abc.Collection",
                "_collections_abc.Sized",
                "_collections_abc.Iterable",
                "_collections_abc.Container",
                "builtins.object",
            ],
        )
        self.assertIsInstance(
            inferred.getattr("__class_getitem__")[0], nodes.FunctionDef
        )

    @test_utils.require_version(maxver="3.9")
    def test_collections_object_not_yet_subscriptable(self):
        """
        Test that unsubscriptable types are detected as such.
        Until python39 MutableSet of the collections module is not subscriptable.
        """
        wrong_node = builder.extract_node(
            """
            import collections.abc
            collections.abc.MutableSet[int]
            """
        )
        with self.assertRaises(InferenceError):
            next(wrong_node.infer())
        right_node = builder.extract_node(
            """
            import collections.abc
            collections.abc.MutableSet
            """
        )
        inferred = next(right_node.infer())
        check_metaclass_is_abc(inferred)
        assertEqualMro(
            inferred,
            [
                "_collections_abc.MutableSet",
                "_collections_abc.Set",
                "_collections_abc.Collection",
                "_collections_abc.Sized",
                "_collections_abc.Iterable",
                "_collections_abc.Container",
                "builtins.object",
            ],
        )
        with self.assertRaises(AttributeInferenceError):
            inferred.getattr("__class_getitem__")

    @test_utils.require_version(minver="3.9")
    def test_collections_object_subscriptable_2(self):
        """Starting with python39 Iterator in the collections.abc module is subscriptable."""
        node = builder.extract_node(
            """
            import collections.abc

            class Derived(collections.abc.Iterator[int]):
                pass
            """
        )
        inferred = next(node.infer())
        check_metaclass_is_abc(inferred)
        assertEqualMro(
            inferred,
            [
                ".Derived",
                "_collections_abc.Iterator",
                "_collections_abc.Iterable",
                "builtins.object",
            ],
        )

    @test_utils.require_version(maxver="3.9")
    def test_collections_object_not_yet_subscriptable_2(self):
        """Before python39 Iterator in the collection.abc
        module is not subscriptable"""
        node = builder.extract_node(
            """
            import collections.abc
            collections.abc.Iterator[int]
            """
        )
        with self.assertRaises(InferenceError):
            next(node.infer())

    @test_utils.require_version(minver="3.9")
    def test_collections_object_subscriptable_3(self):
        """With Python 3.9 the ByteString class of the collections module is
        subscriptable (but not the same class from typing module)"""
        right_node = builder.extract_node(
            """
            import collections.abc
            collections.abc.ByteString[int]
            """
        )
        inferred = next(right_node.infer())
        check_metaclass_is_abc(inferred)
        self.assertIsInstance(
            inferred.getattr("__class_getitem__")[0], nodes.FunctionDef
        )

    @test_utils.require_version(minver="3.9")
    def test_collections_object_subscriptable_4(self):
        """Multiple inheritance with subscriptable collection class"""
        node = builder.extract_node(
            """
            import collections.abc

            class Derived(collections.abc.Hashable, collections.abc.Iterator[int]):
                pass
            """
        )
        inferred = next(node.infer())
        assertEqualMro(
            inferred,
            [
                ".Derived",
                "_collections_abc.Hashable",
                "_collections_abc.Iterator",
                "_collections_abc.Iterable",
                "builtins.object",
            ],
        )


class TypingBrain(unittest.TestCase):
    def test_namedtuple_base(self) -> None:
        klass = builder.extract_node(
            """
            from typing import NamedTuple

            class X(NamedTuple("X", [("a", int), ("b", str), ("c", bytes)])):
               pass
            """
        )
        self.assertEqual(
            [anc.name for anc in klass.ancestors()], ["X", "tuple", "object"]
        )
        for anc in klass.ancestors():
            self.assertFalse(anc.parent is None)

    def test_namedtuple_can_correctly_access_methods(self) -> None:
        klass, called = builder.extract_node(
            """
            from typing import NamedTuple

            class X(NamedTuple): #@
                a: int
                b: int
                def as_string(self):
                    return '%s' % self.a
                def as_integer(self):
                    return 2 + 3
            X().as_integer() #@
            """
        )
        self.assertEqual(len(klass.getattr("as_string")), 1)
        inferred = next(called.infer())
        self.assertIsInstance(inferred, astroid.Const)
        self.assertEqual(inferred.value, 5)

    def test_namedtuple_inference(self) -> None:
        klass = builder.extract_node(
            """
            from typing import NamedTuple

            class X(NamedTuple("X", [("a", int), ("b", str), ("c", bytes)])):
               pass
            """
        )
        base = next(base for base in klass.ancestors() if base.name == "X")
        self.assertSetEqual({"a", "b", "c"}, set(base.instance_attrs))

    def test_namedtuple_inference_nonliteral(self) -> None:
        # Note: NamedTuples in mypy only work with literals.
        klass = builder.extract_node(
            """
            from typing import NamedTuple

            name = "X"
            fields = [("a", int), ("b", str), ("c", bytes)]
            NamedTuple(name, fields)
            """
        )
        inferred = next(klass.infer())
        self.assertIsInstance(inferred, astroid.Instance)
        self.assertEqual(inferred.qname(), "typing.NamedTuple")

    def test_namedtuple_instance_attrs(self) -> None:
        result = builder.extract_node(
            """
            from typing import NamedTuple

            NamedTuple("A", [("a", int), ("b", str), ("c", bytes)])(1, 2, 3) #@
            """
        )
        inferred = next(result.infer())
        for name, attr in inferred.instance_attrs.items():
            self.assertEqual(attr[0].attrname, name)

    def test_namedtuple_simple(self) -> None:
        result = builder.extract_node(
            """
            from typing import NamedTuple

            NamedTuple("A", [("a", int), ("b", str), ("c", bytes)])
            """
        )
        inferred = next(result.infer())
        self.assertIsInstance(inferred, nodes.ClassDef)
        self.assertSetEqual({"a", "b", "c"}, set(inferred.instance_attrs))

    def test_namedtuple_few_args(self) -> None:
        result = builder.extract_node(
            """
            from typing import NamedTuple

            NamedTuple("A")
            """
        )
        inferred = next(result.infer())
        self.assertIsInstance(inferred, astroid.Instance)
        self.assertEqual(inferred.qname(), "typing.NamedTuple")

    def test_namedtuple_few_fields(self) -> None:
        result = builder.extract_node(
            """
            from typing import NamedTuple

            NamedTuple("A", [("a",), ("b", str), ("c", bytes)])
            """
        )
        inferred = next(result.infer())
        self.assertIsInstance(inferred, astroid.Instance)
        self.assertEqual(inferred.qname(), "typing.NamedTuple")

    def test_namedtuple_class_form(self) -> None:
        result = builder.extract_node(
            """
            from typing import NamedTuple

            class Example(NamedTuple):
                CLASS_ATTR = "class_attr"
                mything: int

            Example(mything=1)
            """
        )
        inferred = next(result.infer())
        self.assertIsInstance(inferred, astroid.Instance)
        class_attr = inferred.getattr("CLASS_ATTR")[0]
        self.assertIsInstance(class_attr, astroid.AssignName)
        const = next(class_attr.infer())
        self.assertEqual(const.value, "class_attr")

    def test_namedtuple_inferred_as_class(self) -> None:
        node = builder.extract_node(
            """
            from typing import NamedTuple
            NamedTuple
            """
        )
        inferred = next(node.infer())
        assert isinstance(inferred, nodes.ClassDef)
        assert inferred.name == "NamedTuple"

    def test_namedtuple_bug_pylint_4383(self) -> None:
        """Inference of 'NamedTuple' function shouldn't cause InferenceError.

        https://github.com/pylint-dev/pylint/issues/4383
        """
        node = builder.extract_node(
            """
            if True:
                def NamedTuple():
                    pass
            NamedTuple
            """
        )
        next(node.infer())

    def test_namedtuple_uninferable_member(self) -> None:
        call = builder.extract_node(
            """
            from typing import namedtuple
            namedtuple('uninf', {x: x for x in range(0)}) #@"""
        )
        with pytest.raises(UseInferenceDefault):
            _get_namedtuple_fields(call)

        call = builder.extract_node(
            """
            from typing import namedtuple
            uninferable = {x: x for x in range(0)}
            namedtuple('uninferable', uninferable) #@
            """
        )
        with pytest.raises(UseInferenceDefault):
            _get_namedtuple_fields(call)

    def test_typing_types(self) -> None:
        ast_nodes = builder.extract_node(
            """
            from typing import TypeVar, Iterable, Tuple, NewType, Dict, Union
            TypeVar('MyTypeVar', int, float, complex) #@
            Iterable[Tuple[MyTypeVar, MyTypeVar]] #@
            TypeVar('AnyStr', str, bytes) #@
            NewType('UserId', str) #@
            Dict[str, str] #@
            Union[int, str] #@
            """
        )
        for node in ast_nodes:
            inferred = next(node.infer())
            self.assertIsInstance(inferred, nodes.ClassDef, node.as_string())

    def test_typing_type_without_tip(self):
        """Regression test for https://github.com/pylint-dev/pylint/issues/5770"""
        node = builder.extract_node(
            """
            from typing import NewType

            def make_new_type(t):
                new_type = NewType(f'IntRange_{t}', t) #@
            """
        )
        with self.assertRaises(UseInferenceDefault):
            astroid.brain.brain_typing.infer_typing_typevar_or_newtype(node.value)

    def test_namedtuple_nested_class(self):
        result = builder.extract_node(
            """
            from typing import NamedTuple

            class Example(NamedTuple):
                class Foo:
                    bar = "bar"

            Example
            """
        )
        inferred = next(result.infer())
        self.assertIsInstance(inferred, astroid.ClassDef)
        class_def_attr = inferred.getattr("Foo")[0]
        self.assertIsInstance(class_def_attr, astroid.ClassDef)
        attr_def = class_def_attr.getattr("bar")[0]
        attr = next(attr_def.infer())
        self.assertEqual(attr.value, "bar")

    def test_tuple_type(self):
        node = builder.extract_node(
            """
            from typing import Tuple
            Tuple[int, int]
            """
        )
        inferred = next(node.infer())
        assert isinstance(inferred, nodes.ClassDef)
        assert isinstance(inferred.getattr("__class_getitem__")[0], nodes.FunctionDef)
        assert inferred.qname() == "typing.Tuple"

    def test_callable_type(self):
        node = builder.extract_node(
            """
            from typing import Callable, Any
            Callable[..., Any]
            """
        )
        inferred = next(node.infer())
        assert isinstance(inferred, nodes.ClassDef)
        assert isinstance(inferred.getattr("__class_getitem__")[0], nodes.FunctionDef)
        assert inferred.qname() == "typing.Callable"

    def test_typing_generic_subscriptable(self):
        """Test typing.Generic is subscriptable with __class_getitem__ (added in PY37)"""
        node = builder.extract_node(
            """
            from typing import Generic, TypeVar
            T = TypeVar('T')
            Generic[T]
            """
        )
        inferred = next(node.infer())
        assert isinstance(inferred, nodes.ClassDef)
        assert isinstance(inferred.getattr("__class_getitem__")[0], nodes.FunctionDef)

    @test_utils.require_version(minver="3.12")
    def test_typing_generic_subscriptable_pep695(self):
        """Test class using type parameters is subscriptable with __class_getitem__ (added in PY312)"""
        node = builder.extract_node(
            """
            class Foo[T]: ...
            class Bar[T](Foo[T]): ...
""" ) inferred = next(node.infer()) assert isinstance(inferred, nodes.ClassDef) assert inferred.name == "Bar" assert isinstance(inferred.getattr("__class_getitem__")[0], nodes.FunctionDef) ancestors = list(inferred.ancestors()) assert len(ancestors) == 2 assert ancestors[0].name == "Foo" assert ancestors[1].name == "object" @test_utils.require_version(minver="3.9") def test_typing_annotated_subscriptable(self): """Test typing.Annotated is subscriptable with __class_getitem__""" node = builder.extract_node( """ import typing typing.Annotated[str, "data"] """ ) inferred = next(node.infer()) assert isinstance(inferred, nodes.ClassDef) assert isinstance(inferred.getattr("__class_getitem__")[0], nodes.FunctionDef) def test_typing_generic_slots(self): """Test slots for Generic subclass.""" node = builder.extract_node( """ from typing import Generic, TypeVar T = TypeVar('T') class A(Generic[T]): __slots__ = ['value'] def __init__(self, value): self.value = value """ ) inferred = next(node.infer()) slots = inferred.slots() assert len(slots) == 1 assert isinstance(slots[0], nodes.Const) assert slots[0].value == "value" @test_utils.require_version(minver="3.9") def test_typing_no_duplicates(self): node = builder.extract_node( """ from typing import List List[int] """ ) assert len(node.inferred()) == 1 @test_utils.require_version(minver="3.9") def test_typing_no_duplicates_2(self): node = builder.extract_node( """ from typing import Optional, Tuple Tuple[Optional[int], ...] 
""" ) assert len(node.inferred()) == 1 @test_utils.require_version(minver="3.10") def test_typing_param_spec(self): node = builder.extract_node( """ from typing import ParamSpec P = ParamSpec("P") """ ) inferred = next(node.targets[0].infer()) assert next(inferred.igetattr("args")) is not None assert next(inferred.igetattr("kwargs")) is not None def test_collections_generic_alias_slots(self): """Test slots for a class which is a subclass of a generic alias type.""" node = builder.extract_node( """ import collections import typing Type = typing.TypeVar('Type') class A(collections.abc.AsyncIterator[Type]): __slots__ = ('_value',) def __init__(self, value: collections.abc.AsyncIterator[Type]): self._value = value """ ) inferred = next(node.infer()) assert isinstance(inferred, nodes.ClassDef) slots = inferred.slots() assert len(slots) == 1 assert isinstance(slots[0], nodes.Const) assert slots[0].value == "_value" def test_has_dunder_args(self) -> None: ast_node = builder.extract_node( """ from typing import Union NumericTypes = Union[int, float] NumericTypes.__args__ #@ """ ) inferred = next(ast_node.infer()) assert isinstance(inferred, nodes.Tuple) def test_typing_namedtuple_dont_crash_on_no_fields(self) -> None: node = builder.extract_node( """ from typing import NamedTuple Bar = NamedTuple("bar", []) Bar() """ ) inferred = next(node.infer()) self.assertIsInstance(inferred, astroid.Instance) @test_utils.require_version("3.8") def test_typed_dict(self): code = builder.extract_node( """ from typing import TypedDict class CustomTD(TypedDict): #@ var: int CustomTD(var=1) #@ """ ) inferred_base = next(code[0].bases[0].infer()) assert isinstance(inferred_base, nodes.ClassDef) assert inferred_base.qname() == "typing.TypedDict" typedDict_base = next(inferred_base.bases[0].infer()) assert typedDict_base.qname() == "builtins.dict" # Test TypedDict has `__call__` method local_call = inferred_base.locals.get("__call__", None) assert local_call and len(local_call) == 1 assert 
isinstance(local_call[0], nodes.Name) and local_call[0].name == "dict" # Test TypedDict instance is callable assert next(code[1].infer()).callable() is True def test_typing_alias_type(self): """ Test that the type aliased thanks to typing._alias function are correctly inferred. typing_alias function is introduced with python37 """ node = builder.extract_node( """ from typing import TypeVar, MutableSet T = TypeVar("T") MutableSet[T] class Derived1(MutableSet[T]): pass """ ) inferred = next(node.infer()) assertEqualMro( inferred, [ ".Derived1", "typing.MutableSet", "_collections_abc.MutableSet", "_collections_abc.Set", "_collections_abc.Collection", "_collections_abc.Sized", "_collections_abc.Iterable", "_collections_abc.Container", "builtins.object", ], ) def test_typing_alias_type_2(self): """ Test that the type aliased thanks to typing._alias function are correctly inferred. typing_alias function is introduced with python37. OrderedDict in the typing module appears only with python 3.7.2 """ node = builder.extract_node( """ import typing class Derived2(typing.OrderedDict[int, str]): pass """ ) inferred = next(node.infer()) assertEqualMro( inferred, [ ".Derived2", "typing.OrderedDict", "collections.OrderedDict", "builtins.dict", "builtins.object", ], ) def test_typing_object_not_subscriptable(self): """Hashable is not subscriptable""" wrong_node = builder.extract_node( """ import typing typing.Hashable[int] """ ) with self.assertRaises(InferenceError): next(wrong_node.infer()) right_node = builder.extract_node( """ import typing typing.Hashable """ ) inferred = next(right_node.infer()) assertEqualMro( inferred, [ "typing.Hashable", "_collections_abc.Hashable", "builtins.object", ], ) with self.assertRaises(AttributeInferenceError): inferred.getattr("__class_getitem__") def test_typing_object_subscriptable(self): """Test that MutableSet is subscriptable""" right_node = builder.extract_node( """ import typing typing.MutableSet[int] """ ) inferred = 
next(right_node.infer()) assertEqualMro( inferred, [ "typing.MutableSet", "_collections_abc.MutableSet", "_collections_abc.Set", "_collections_abc.Collection", "_collections_abc.Sized", "_collections_abc.Iterable", "_collections_abc.Container", "builtins.object", ], ) self.assertIsInstance( inferred.getattr("__class_getitem__")[0], nodes.FunctionDef ) def test_typing_object_subscriptable_2(self): """Multiple inheritance with subscriptable typing alias""" node = builder.extract_node( """ import typing class Derived(typing.Hashable, typing.Iterator[int]): pass """ ) inferred = next(node.infer()) assertEqualMro( inferred, [ ".Derived", "typing.Hashable", "_collections_abc.Hashable", "typing.Iterator", "_collections_abc.Iterator", "_collections_abc.Iterable", "builtins.object", ], ) def test_typing_object_notsubscriptable_3(self): """Until Python 3.9 the ByteString class of the typing module is not subscriptable (whereas it is in the collections.abc module)""" right_node = builder.extract_node( """ import typing typing.ByteString """ ) inferred = next(right_node.infer()) check_metaclass_is_abc(inferred) with self.assertRaises(AttributeInferenceError): self.assertIsInstance( inferred.getattr("__class_getitem__")[0], nodes.FunctionDef ) @test_utils.require_version(minver="3.9") def test_typing_object_builtin_subscriptable(self): """ Test that builtin aliases, such as typing.List, are subscriptable """ for typename in ("List", "Dict", "Set", "FrozenSet", "Tuple"): src = f""" import typing typing.{typename:s}[int] """ right_node = builder.extract_node(src) inferred = next(right_node.infer()) self.assertIsInstance(inferred, nodes.ClassDef) self.assertIsInstance(inferred.getattr("__iter__")[0], nodes.FunctionDef) @staticmethod @test_utils.require_version(minver="3.9") def test_typing_type_subscriptable(): node = builder.extract_node( """ from typing import Type Type[int] """ ) inferred = next(node.infer()) assert isinstance(inferred, nodes.ClassDef) assert
isinstance(inferred.getattr("__class_getitem__")[0], nodes.FunctionDef) assert inferred.qname() == "typing.Type" def test_typing_cast(self) -> None: node = builder.extract_node( """ from typing import cast class A: pass b = 42 cast(A, b) """ ) inferred = next(node.infer()) assert isinstance(inferred, nodes.Const) assert inferred.value == 42 def test_typing_cast_attribute(self) -> None: node = builder.extract_node( """ import typing class A: pass b = 42 typing.cast(A, b) """ ) inferred = next(node.infer()) assert isinstance(inferred, nodes.Const) assert inferred.value == 42 def test_typing_cast_multiple_inference_calls(self) -> None: """Inference of an outer function should not store the result for cast.""" ast_nodes = builder.extract_node( """ from typing import TypeVar, cast T = TypeVar("T") def ident(var: T) -> T: return cast(T, var) ident(2) #@ ident("Hello") #@ """ ) i0 = next(ast_nodes[0].infer()) assert isinstance(i0, nodes.Const) assert i0.value == 2 i1 = next(ast_nodes[1].infer()) assert isinstance(i1, nodes.Const) assert i1.value == "Hello" class ReBrainTest(unittest.TestCase): def test_regex_flags(self) -> None: names = [name for name in dir(re) if name.isupper()] re_ast = MANAGER.ast_from_module_name("re") for name in names: self.assertIn(name, re_ast) self.assertEqual(next(re_ast[name].infer()).value, getattr(re, name)) @test_utils.require_version(maxver="3.9") def test_re_pattern_unsubscriptable(self): """ re.Pattern and re.Match are unsubscriptable until PY39. 
""" right_node1 = builder.extract_node( """ import re re.Pattern """ ) inferred1 = next(right_node1.infer()) assert isinstance(inferred1, nodes.ClassDef) with self.assertRaises(AttributeInferenceError): assert isinstance( inferred1.getattr("__class_getitem__")[0], nodes.FunctionDef ) right_node2 = builder.extract_node( """ import re re.Pattern """ ) inferred2 = next(right_node2.infer()) assert isinstance(inferred2, nodes.ClassDef) with self.assertRaises(AttributeInferenceError): assert isinstance( inferred2.getattr("__class_getitem__")[0], nodes.FunctionDef ) wrong_node1 = builder.extract_node( """ import re re.Pattern[int] """ ) with self.assertRaises(InferenceError): next(wrong_node1.infer()) wrong_node2 = builder.extract_node( """ import re re.Match[int] """ ) with self.assertRaises(InferenceError): next(wrong_node2.infer()) @test_utils.require_version(minver="3.9") def test_re_pattern_subscriptable(self): """Test re.Pattern and re.Match are subscriptable in PY39+""" node1 = builder.extract_node( """ import re re.Pattern[str] """ ) inferred1 = next(node1.infer()) assert isinstance(inferred1, nodes.ClassDef) assert isinstance(inferred1.getattr("__class_getitem__")[0], nodes.FunctionDef) node2 = builder.extract_node( """ import re re.Match[str] """ ) inferred2 = next(node2.infer()) assert isinstance(inferred2, nodes.ClassDef) assert isinstance(inferred2.getattr("__class_getitem__")[0], nodes.FunctionDef) class BrainFStrings(unittest.TestCase): def test_no_crash_on_const_reconstruction(self) -> None: node = builder.extract_node( """ max_width = 10 test1 = f'{" ":{max_width+4}}' print(f'"{test1}"') test2 = f'[{"7":>{max_width}}:0]' test2 """ ) inferred = next(node.infer()) self.assertIs(inferred, util.Uninferable) class BrainNamedtupleAnnAssignTest(unittest.TestCase): def test_no_crash_on_ann_assign_in_namedtuple(self) -> None: node = builder.extract_node( """ from enum import Enum from typing import Optional class A(Enum): B: str = 'B' """ ) inferred = 
next(node.infer()) self.assertIsInstance(inferred, nodes.ClassDef) class BrainUUIDTest(unittest.TestCase): def test_uuid_has_int_member(self) -> None: node = builder.extract_node( """ import uuid u = uuid.UUID('{12345678-1234-5678-1234-567812345678}') u.int """ ) inferred = next(node.infer()) self.assertIsInstance(inferred, nodes.Const) class RandomSampleTest(unittest.TestCase): def test_inferred_successfully(self) -> None: node = astroid.extract_node( """ import random random.sample([1, 2], 2) #@ """ ) inferred = next(node.infer()) self.assertIsInstance(inferred, astroid.List) elems = sorted(elem.value for elem in inferred.elts) self.assertEqual(elems, [1, 2]) def test_arguments_inferred_successfully(self) -> None: """Test inference of `random.sample` when both arguments are of type `nodes.Call`.""" node = astroid.extract_node( """ import random def sequence(): return [1, 2] random.sample(sequence(), len([1,2])) #@ """ ) # Check that arguments are of type `nodes.Call`. sequence, length = node.args self.assertIsInstance(sequence, astroid.Call) self.assertIsInstance(length, astroid.Call) # Check the inference of `random.sample` call. 
inferred = next(node.infer()) self.assertIsInstance(inferred, astroid.List) elems = sorted(elem.value for elem in inferred.elts) self.assertEqual(elems, [1, 2]) def test_no_crash_on_evaluatedobject(self) -> None: node = astroid.extract_node( """ from random import sample class A: pass sample(list({1: A()}.values()), 1)""" ) inferred = next(node.infer()) assert isinstance(inferred, astroid.List) assert len(inferred.elts) == 1 assert isinstance(inferred.elts[0], nodes.Call) class SubprocessTest(unittest.TestCase): """Test subprocess brain""" def test_subprocess_args(self) -> None: """Make sure the args attribute exists for Popen Test for https://github.com/pylint-dev/pylint/issues/1860""" name = astroid.extract_node( """ import subprocess p = subprocess.Popen(['ls']) p #@ """ ) [inst] = name.inferred() self.assertIsInstance(next(inst.igetattr("args")), nodes.List) def test_subprocess_check_output(self) -> None: code = """ import subprocess subprocess.check_output(['echo', 'hello']) """ node = astroid.extract_node(code) inferred = next(node.infer()) # Can be either str or bytes assert isinstance(inferred, astroid.Const) assert isinstance(inferred.value, (str, bytes)) @test_utils.require_version("3.9") def test_popen_does_not_have_class_getitem(self): code = """import subprocess; subprocess.Popen""" node = astroid.extract_node(code) inferred = next(node.infer()) assert "__class_getitem__" in inferred class TestIsinstanceInference: """Test isinstance builtin inference""" def test_type_type(self) -> None: assert _get_result("isinstance(type, type)") == "True" def test_object_type(self) -> None: assert _get_result("isinstance(object, type)") == "True" def test_type_object(self) -> None: assert _get_result("isinstance(type, object)") == "True" def test_isinstance_int_true(self) -> None: """Make sure isinstance can check builtin int types""" assert _get_result("isinstance(1, int)") == "True" def test_isinstance_int_false(self) -> None: assert _get_result("isinstance('a',
int)") == "False" def test_isinstance_object_true(self) -> None: assert ( _get_result( """ class Bar(object): pass isinstance(Bar(), object) """ ) == "True" ) def test_isinstance_object_true3(self) -> None: assert ( _get_result( """ class Bar(object): pass isinstance(Bar(), Bar) """ ) == "True" ) def test_isinstance_class_false(self) -> None: assert ( _get_result( """ class Foo(object): pass class Bar(object): pass isinstance(Bar(), Foo) """ ) == "False" ) def test_isinstance_type_false(self) -> None: assert ( _get_result( """ class Bar(object): pass isinstance(Bar(), type) """ ) == "False" ) def test_isinstance_str_true(self) -> None: """Make sure isinstance can check builtin str types""" assert _get_result("isinstance('a', str)") == "True" def test_isinstance_str_false(self) -> None: assert _get_result("isinstance(1, str)") == "False" def test_isinstance_tuple_argument(self) -> None: """obj just has to be an instance of ANY class/type on the right""" assert _get_result("isinstance(1, (str, int))") == "True" def test_isinstance_type_false2(self) -> None: assert ( _get_result( """ isinstance(1, type) """ ) == "False" ) def test_isinstance_object_true2(self) -> None: assert ( _get_result( """ class Bar(type): pass mainbar = Bar("Bar", tuple(), {}) isinstance(mainbar, object) """ ) == "True" ) def test_isinstance_type_true(self) -> None: assert ( _get_result( """ class Bar(type): pass mainbar = Bar("Bar", tuple(), {}) isinstance(mainbar, type) """ ) == "True" ) def test_isinstance_edge_case(self) -> None: """isinstance allows bad type short-circuting""" assert _get_result("isinstance(1, (int, 1))") == "True" def test_uninferable_bad_type(self) -> None: """The second argument must be a class or a tuple of classes""" with pytest.raises(InferenceError): _get_result_node("isinstance(int, 1)") def test_uninferable_keywords(self) -> None: """isinstance does not allow keywords""" with pytest.raises(InferenceError): _get_result_node("isinstance(1, class_or_tuple=int)") def 
test_too_many_args(self) -> None: """isinstance must have two arguments""" with pytest.raises(InferenceError): _get_result_node("isinstance(1, int, str)") def test_first_param_is_uninferable(self) -> None: with pytest.raises(InferenceError): _get_result_node("isinstance(something, int)") class TestIssubclassBrain: """Test issubclass() builtin inference""" def test_type_type(self) -> None: assert _get_result("issubclass(type, type)") == "True" def test_object_type(self) -> None: assert _get_result("issubclass(object, type)") == "False" def test_type_object(self) -> None: assert _get_result("issubclass(type, object)") == "True" def test_issubclass_same_class(self) -> None: assert _get_result("issubclass(int, int)") == "True" def test_issubclass_not_the_same_class(self) -> None: assert _get_result("issubclass(str, int)") == "False" def test_issubclass_object_true(self) -> None: assert ( _get_result( """ class Bar(object): pass issubclass(Bar, object) """ ) == "True" ) def test_issubclass_same_user_defined_class(self) -> None: assert ( _get_result( """ class Bar(object): pass issubclass(Bar, Bar) """ ) == "True" ) def test_issubclass_different_user_defined_classes(self) -> None: assert ( _get_result( """ class Foo(object): pass class Bar(object): pass issubclass(Bar, Foo) """ ) == "False" ) def test_issubclass_type_false(self) -> None: assert ( _get_result( """ class Bar(object): pass issubclass(Bar, type) """ ) == "False" ) def test_isinstance_tuple_argument(self) -> None: """obj just has to be a subclass of ANY class/type on the right""" assert _get_result("issubclass(int, (str, int))") == "True" def test_isinstance_object_true2(self) -> None: assert ( _get_result( """ class Bar(type): pass issubclass(Bar, object) """ ) == "True" ) def test_issubclass_short_circuit(self) -> None: """issubclass allows bad type short-circuiting""" assert _get_result("issubclass(int, (int, 1))") == "True" def test_uninferable_bad_type(self) -> None: """The second argument must be a
class or a tuple of classes""" # Should I subclass with pytest.raises(InferenceError): _get_result_node("issubclass(int, 1)") def test_uninferable_keywords(self) -> None: """issubclass does not allow keywords""" with pytest.raises(InferenceError): _get_result_node("issubclass(int, class_or_tuple=int)") def test_too_many_args(self) -> None: """issubclass must have two arguments""" with pytest.raises(InferenceError): _get_result_node("issubclass(int, int, str)") def _get_result_node(code: str) -> Const: node = next(astroid.extract_node(code).infer()) return node def _get_result(code: str) -> str: return _get_result_node(code).as_string() class TestLenBuiltinInference: def test_len_list(self) -> None: # Uses .elts node = astroid.extract_node( """ len(['a','b','c']) """ ) node = next(node.infer()) assert node.as_string() == "3" assert isinstance(node, nodes.Const) def test_len_tuple(self) -> None: node = astroid.extract_node( """ len(('a','b','c')) """ ) node = next(node.infer()) assert node.as_string() == "3" def test_len_var(self) -> None: # Make sure argument is inferred node = astroid.extract_node( """ a = [1,2,'a','b','c'] len(a) """ ) node = next(node.infer()) assert node.as_string() == "5" def test_len_dict(self) -> None: # Uses .items node = astroid.extract_node( """ a = {'a': 1, 'b': 2} len(a) """ ) node = next(node.infer()) assert node.as_string() == "2" def test_len_set(self) -> None: node = astroid.extract_node( """ len({'a'}) """ ) inferred_node = next(node.infer()) assert inferred_node.as_string() == "1" def test_len_object(self) -> None: """Test len with objects that implement the len protocol""" node = astroid.extract_node( """ class A: def __len__(self): return 57 len(A()) """ ) inferred_node = next(node.infer()) assert inferred_node.as_string() == "57" def test_len_class_with_metaclass(self) -> None: """Make sure proper len method is located""" cls_node, inst_node = astroid.extract_node( """ class F2(type): def __new__(cls, name, bases, attrs): return 
super().__new__(cls, name, bases, {}) def __len__(self): return 57 class F(metaclass=F2): def __len__(self): return 4 len(F) #@ len(F()) #@ """ ) assert next(cls_node.infer()).as_string() == "57" assert next(inst_node.infer()).as_string() == "4" def test_len_object_failure(self) -> None: """If taking the length of a class, do not use an instance method""" node = astroid.extract_node( """ class F: def __len__(self): return 57 len(F) """ ) with pytest.raises(InferenceError): next(node.infer()) def test_len_string(self) -> None: node = astroid.extract_node( """ len("uwu") """ ) assert next(node.infer()).as_string() == "3" def test_len_generator_failure(self) -> None: node = astroid.extract_node( """ def gen(): yield 'a' yield 'b' len(gen()) """ ) with pytest.raises(InferenceError): next(node.infer()) def test_len_failure_missing_variable(self) -> None: node = astroid.extract_node( """ len(a) """ ) with pytest.raises(InferenceError): next(node.infer()) def test_len_bytes(self) -> None: node = astroid.extract_node( """ len(b'uwu') """ ) assert next(node.infer()).as_string() == "3" def test_int_subclass_result(self) -> None: """Check that a subclass of an int can still be inferred This test does not properly infer the value passed to the int subclass (5) but still returns a proper integer as we fake the result of the `len()` call. 
""" node = astroid.extract_node( """ class IntSubclass(int): pass class F: def __len__(self): return IntSubclass(5) len(F()) """ ) assert next(node.infer()).as_string() == "0" @pytest.mark.xfail(reason="Can't use list special astroid fields") def test_int_subclass_argument(self): """I am unable to access the length of an object which subclasses list""" node = astroid.extract_node( """ class ListSubclass(list): pass len(ListSubclass([1,2,3,4,4])) """ ) assert next(node.infer()).as_string() == "5" def test_len_builtin_inference_attribute_error_str(self) -> None: """Make sure len builtin doesn't raise an AttributeError on instances of str or bytes See https://github.com/pylint-dev/pylint/issues/1942 """ code = 'len(str("F"))' try: next(astroid.extract_node(code).infer()) except InferenceError: pass def test_len_builtin_inference_recursion_error_self_referential_attribute( self, ) -> None: """Make sure len calls do not trigger recursion errors for self referential assignment See https://github.com/pylint-dev/pylint/issues/2734 """ code = """ class Data: def __init__(self): self.shape = [] data = Data() data.shape = len(data.shape) data.shape #@ """ try: astroid.extract_node(code).inferred() except RecursionError: pytest.fail("Inference call should not trigger a recursion error") def test_infer_str() -> None: ast_nodes = astroid.extract_node( """ str(s) #@ str('a') #@ str(some_object()) #@ """ ) for node in ast_nodes: inferred = next(node.infer()) assert isinstance(inferred, astroid.Const) node = astroid.extract_node( """ str(s='') #@ """ ) inferred = next(node.infer()) assert isinstance(inferred, astroid.Instance) assert inferred.qname() == "builtins.str" def test_infer_int() -> None: ast_nodes = astroid.extract_node( """ int(0) #@ int('1') #@ """ ) for node in ast_nodes: inferred = next(node.infer()) assert isinstance(inferred, astroid.Const) ast_nodes = astroid.extract_node( """ int(s='') #@ int('2.5') #@ int('something else') #@ int(unknown) #@ int(b'a') #@ """ ) 
for node in ast_nodes: inferred = next(node.infer()) assert isinstance(inferred, astroid.Instance) assert inferred.qname() == "builtins.int" def test_infer_dict_from_keys() -> None: bad_nodes = astroid.extract_node( """ dict.fromkeys() #@ dict.fromkeys(1, 2, 3) #@ dict.fromkeys(a=1) #@ """ ) for node in bad_nodes: with pytest.raises(InferenceError): next(node.infer()) # Test uninferable values good_nodes = astroid.extract_node( """ from unknown import Unknown dict.fromkeys(some_value) #@ dict.fromkeys(some_other_value) #@ dict.fromkeys([Unknown(), Unknown()]) #@ dict.fromkeys([Unknown(), Unknown()]) #@ """ ) for node in good_nodes: inferred = next(node.infer()) assert isinstance(inferred, astroid.Dict) assert inferred.items == [] # Test inferable values # from a dictionary's keys from_dict = astroid.extract_node( """ dict.fromkeys({'a':2, 'b': 3, 'c': 3}) #@ """ ) inferred = next(from_dict.infer()) assert isinstance(inferred, astroid.Dict) itered = inferred.itered() assert all(isinstance(elem, astroid.Const) for elem in itered) actual_values = [elem.value for elem in itered] assert sorted(actual_values) == ["a", "b", "c"] # from a string from_string = astroid.extract_node( """ dict.fromkeys('abc') """ ) inferred = next(from_string.infer()) assert isinstance(inferred, astroid.Dict) itered = inferred.itered() assert all(isinstance(elem, astroid.Const) for elem in itered) actual_values = [elem.value for elem in itered] assert sorted(actual_values) == ["a", "b", "c"] # from bytes from_bytes = astroid.extract_node( """ dict.fromkeys(b'abc') """ ) inferred = next(from_bytes.infer()) assert isinstance(inferred, astroid.Dict) itered = inferred.itered() assert all(isinstance(elem, astroid.Const) for elem in itered) actual_values = [elem.value for elem in itered] assert sorted(actual_values) == [97, 98, 99] # From list/set/tuple from_others = astroid.extract_node( """ dict.fromkeys(('a', 'b', 'c')) #@ dict.fromkeys(['a', 'b', 'c']) #@ dict.fromkeys({'a', 'b', 'c'}) #@ """ ) 
for node in from_others: inferred = next(node.infer()) assert isinstance(inferred, astroid.Dict) itered = inferred.itered() assert all(isinstance(elem, astroid.Const) for elem in itered) actual_values = [elem.value for elem in itered] assert sorted(actual_values) == ["a", "b", "c"] class TestFunctoolsPartial: @staticmethod def test_infer_partial() -> None: ast_node = astroid.extract_node( """ from functools import partial def test(a, b): '''Docstring''' return a + b partial(test, 1)(3) #@ """ ) assert isinstance(ast_node.func, nodes.Call) inferred = ast_node.func.inferred() assert len(inferred) == 1 partial = inferred[0] assert isinstance(partial, objects.PartialFunction) assert isinstance(partial.as_string(), str) assert isinstance(partial.doc_node, nodes.Const) assert partial.doc_node.value == "Docstring" assert partial.lineno == 3 assert partial.col_offset == 0 def test_invalid_functools_partial_calls(self) -> None: ast_nodes = astroid.extract_node( """ from functools import partial from unknown import Unknown def test(a, b, c): return a + b + c partial() #@ partial(test) #@ partial(func=test) #@ partial(some_func, a=1) #@ partial(Unknown, a=1) #@ partial(2, a=1) #@ partial(test, unknown=1) #@ """ ) for node in ast_nodes: inferred = next(node.infer()) assert isinstance(inferred, (astroid.FunctionDef, astroid.Instance)) assert inferred.qname() in { "functools.partial", "functools.partial.newfunc", } def test_inferred_partial_function_calls(self) -> None: ast_nodes = astroid.extract_node( """ from functools import partial def test(a, b): return a + b partial(test, 1)(3) #@ partial(test, b=4)(3) #@ partial(test, b=4)(a=3) #@ def other_test(a, b, *, c=1): return (a + b) * c partial(other_test, 1, 2)() #@ partial(other_test, 1, 2)(c=4) #@ partial(other_test, c=4)(1, 3) #@ partial(other_test, 4, c=4)(4) #@ partial(other_test, 4, c=4)(b=5) #@ test(1, 2) #@ partial(other_test, 1, 2)(c=3) #@ partial(test, b=4)(a=3) #@ """ ) expected_values = [4, 7, 7, 3, 12, 16, 32, 36, 
3, 9, 7] for node, expected_value in zip(ast_nodes, expected_values): inferred = next(node.infer()) assert isinstance(inferred, astroid.Const) assert inferred.value == expected_value def test_partial_assignment(self) -> None: """Make sure partials are not assigned to original scope.""" ast_nodes = astroid.extract_node( """ from functools import partial def test(a, b): #@ return a + b test2 = partial(test, 1) test2 #@ def test3_scope(a): test3 = partial(test, a) test3 #@ """ ) func1, func2, func3 = ast_nodes assert func1.parent.scope() == func2.parent.scope() assert func1.parent.scope() != func3.parent.scope() partial_func3 = next(func3.infer()) # use scope of parent, so that it doesn't just refer to self scope = partial_func3.parent.scope() assert scope.name == "test3_scope", "parented by closure" def test_partial_does_not_affect_scope(self) -> None: """Make sure partials are not automatically assigned.""" ast_nodes = astroid.extract_node( """ from functools import partial def test(a, b): return a + b def scope(): test2 = partial(test, 1) test2 #@ """ ) test2 = next(ast_nodes.infer()) mod_scope = test2.root() scope = test2.parent.scope() assert set(mod_scope) == {"test", "scope", "partial"} assert set(scope) == {"test2"} def test_multiple_partial_args(self) -> None: "Make sure partials remember locked-in args." 
ast_node = astroid.extract_node( """ from functools import partial def test(a, b, c, d, e=5): return a + b + c + d + e test1 = partial(test, 1) test2 = partial(test1, 2) test3 = partial(test2, 3) test3(4, e=6) #@ """ ) expected_args = [1, 2, 3, 4] expected_keywords = {"e": 6} call_site = astroid.arguments.CallSite.from_call(ast_node) called_func = next(ast_node.func.infer()) called_args = called_func.filled_args + call_site.positional_arguments called_keywords = {**called_func.filled_keywords, **call_site.keyword_arguments} assert len(called_args) == len(expected_args) assert [arg.value for arg in called_args] == expected_args assert len(called_keywords) == len(expected_keywords) for keyword, value in expected_keywords.items(): assert keyword in called_keywords assert called_keywords[keyword].value == value def test_http_client_brain() -> None: node = astroid.extract_node( """ from http.client import OK OK """ ) inferred = next(node.infer()) assert isinstance(inferred, astroid.Instance) def test_http_status_brain() -> None: node = astroid.extract_node( """ import http http.HTTPStatus.CONTINUE.phrase """ ) inferred = next(node.infer()) # Cannot infer the exact value but the field is there. 
assert inferred.value == "" node = astroid.extract_node( """ import http http.HTTPStatus(200).phrase """ ) inferred = next(node.infer()) assert isinstance(inferred, astroid.Const) def test_http_status_brain_iterable() -> None: """Astroid inference of `http.HTTPStatus` is an iterable subclass of `enum.IntEnum`""" node = astroid.extract_node( """ import http http.HTTPStatus """ ) inferred = next(node.infer()) assert "enum.IntEnum" in [ancestor.qname() for ancestor in inferred.ancestors()] assert inferred.getattr("__iter__") def test_oserror_model() -> None: node = astroid.extract_node( """ try: 1/0 except OSError as exc: exc #@ """ ) inferred = next(node.infer()) strerror = next(inferred.igetattr("strerror")) assert isinstance(strerror, astroid.Const) assert strerror.value == "" @pytest.mark.skipif(PY313_PLUS, reason="Python >= 3.13 no longer has a crypt module") def test_crypt_brain() -> None: module = MANAGER.ast_from_module_name("crypt") dynamic_attrs = [ "METHOD_SHA512", "METHOD_SHA256", "METHOD_BLOWFISH", "METHOD_MD5", "METHOD_CRYPT", ] for attr in dynamic_attrs: assert attr in module @pytest.mark.parametrize( "code,expected_class,expected_value", [ ("'hey'.encode()", astroid.Const, b""), ("b'hey'.decode()", astroid.Const, ""), ("'hey'.encode().decode()", astroid.Const, ""), ], ) def test_str_and_bytes(code, expected_class, expected_value): node = astroid.extract_node(code) inferred = next(node.infer()) assert isinstance(inferred, expected_class) assert inferred.value == expected_value def test_no_recursionerror_on_self_referential_length_check() -> None: """ Regression test for https://github.com/pylint-dev/astroid/issues/777 This test should only raise an InferenceError and no RecursionError. 
""" with pytest.raises(InferenceError): node = astroid.extract_node( """ class Crash: def __len__(self) -> int: return len(self) len(Crash()) #@ """ ) assert isinstance(node, nodes.NodeNG) node.inferred() def test_inference_on_outer_referential_length_check() -> None: """ Regression test for https://github.com/pylint-dev/pylint/issues/5244 See also https://github.com/pylint-dev/astroid/pull/1234 This test should succeed without any error. """ node = astroid.extract_node( """ class A: def __len__(self) -> int: return 42 class Crash: def __len__(self) -> int: a = A() return len(a) len(Crash()) #@ """ ) inferred = node.inferred() assert len(inferred) == 1 assert isinstance(inferred[0], nodes.Const) assert inferred[0].value == 42 def test_no_attributeerror_on_self_referential_length_check() -> None: """ Regression test for https://github.com/pylint-dev/pylint/issues/5244 See also https://github.com/pylint-dev/astroid/pull/1234 This test should only raise an InferenceError and no AttributeError. """ with pytest.raises(InferenceError): node = astroid.extract_node( """ class MyClass: def some_func(self): return lambda: 42 def __len__(self): return len(self.some_func()) len(MyClass()) #@ """ ) assert isinstance(node, nodes.NodeNG) node.inferred() astroid-3.2.2/tests/brain/test_typing.py0000664000175000017500000000133214622475517020265 0ustar epsilonepsilon# Licensed under the LGPL: https://www.gnu.org/licenses/old-licenses/lgpl-2.1.en.html # For details: https://github.com/pylint-dev/astroid/blob/main/LICENSE # Copyright (c) https://github.com/pylint-dev/astroid/blob/main/CONTRIBUTORS.txt import pytest from astroid import builder from astroid.exceptions import InferenceError def test_infer_typevar() -> None: """ Regression test for: https://github.com/pylint-dev/pylint/issues/8802 Test that an inferred `typing.TypeVar()` call produces a `nodes.ClassDef` node. 
""" call_node = builder.extract_node( """ from typing import TypeVar TypeVar('My.Type') """ ) with pytest.raises(InferenceError): call_node.inferred() astroid-3.2.2/tests/brain/test_hashlib.py0000664000175000017500000000522314622475517020370 0ustar epsilonepsilon# Licensed under the LGPL: https://www.gnu.org/licenses/old-licenses/lgpl-2.1.en.html # For details: https://github.com/pylint-dev/astroid/blob/main/LICENSE # Copyright (c) https://github.com/pylint-dev/astroid/blob/main/CONTRIBUTORS.txt from __future__ import annotations import unittest from astroid import MANAGER from astroid.const import PY39_PLUS from astroid.nodes.scoped_nodes import ClassDef class HashlibTest(unittest.TestCase): def _assert_hashlib_class(self, class_obj: ClassDef) -> None: self.assertIn("update", class_obj) self.assertIn("digest", class_obj) self.assertIn("hexdigest", class_obj) self.assertIn("block_size", class_obj) self.assertIn("digest_size", class_obj) # usedforsecurity was added in Python 3.9, see 8e7174a9 self.assertEqual(len(class_obj["__init__"].args.args), 3 if PY39_PLUS else 2) self.assertEqual( len(class_obj["__init__"].args.defaults), 2 if PY39_PLUS else 1 ) self.assertEqual(len(class_obj["update"].args.args), 2) def test_hashlib(self) -> None: """Tests that brain extensions for hashlib work.""" hashlib_module = MANAGER.ast_from_module_name("hashlib") for class_name in ( "md5", "sha1", "sha224", "sha256", "sha384", "sha512", "sha3_224", "sha3_256", "sha3_384", "sha3_512", ): class_obj = hashlib_module[class_name] self._assert_hashlib_class(class_obj) self.assertEqual(len(class_obj["digest"].args.args), 1) self.assertEqual(len(class_obj["hexdigest"].args.args), 1) def test_shake(self) -> None: """Tests that the brain extensions for the hashlib shake algorithms work.""" hashlib_module = MANAGER.ast_from_module_name("hashlib") for class_name in ("shake_128", "shake_256"): class_obj = hashlib_module[class_name] self._assert_hashlib_class(class_obj) 
self.assertEqual(len(class_obj["digest"].args.args), 2) self.assertEqual(len(class_obj["hexdigest"].args.args), 2) def test_blake2(self) -> None: """Tests that the brain extensions for the hashlib blake2 hash functions work.""" hashlib_module = MANAGER.ast_from_module_name("hashlib") for class_name in ("blake2b", "blake2s"): class_obj = hashlib_module[class_name] self.assertEqual(len(class_obj["__init__"].args.args), 2) self.assertEqual(len(class_obj["digest"].args.args), 1) self.assertEqual(len(class_obj["hexdigest"].args.args), 1) astroid-3.2.2/tests/brain/test_dataclasses.py0000664000175000017500000010510314622475517021243 0ustar epsilonepsilon# Licensed under the LGPL: https://www.gnu.org/licenses/old-licenses/lgpl-2.1.en.html # For details: https://github.com/pylint-dev/astroid/blob/main/LICENSE # Copyright (c) https://github.com/pylint-dev/astroid/blob/main/CONTRIBUTORS.txt import pytest import astroid from astroid import bases, nodes from astroid.const import PY310_PLUS from astroid.exceptions import InferenceError from astroid.util import Uninferable parametrize_module = pytest.mark.parametrize( ("module",), (["dataclasses"], ["pydantic.dataclasses"], ["marshmallow_dataclass"]) ) @parametrize_module def test_inference_attribute_no_default(module: str): """Test inference of dataclass attribute with no default. Note that the argument to the constructor is ignored by the inference. 
""" klass, instance = astroid.extract_node( f""" from {module} import dataclass @dataclass class A: name: str A.name #@ A('hi').name #@ """ ) with pytest.raises(InferenceError): klass.inferred() inferred = instance.inferred() assert len(inferred) == 1 assert isinstance(inferred[0], bases.Instance) assert inferred[0].name == "str" @parametrize_module def test_inference_non_field_default(module: str): """Test inference of dataclass attribute with a non-field default.""" klass, instance = astroid.extract_node( f""" from {module} import dataclass @dataclass class A: name: str = 'hi' A.name #@ A().name #@ """ ) inferred = klass.inferred() assert len(inferred) == 1 assert isinstance(inferred[0], nodes.Const) assert inferred[0].value == "hi" inferred = instance.inferred() assert len(inferred) == 2 assert isinstance(inferred[0], nodes.Const) assert inferred[0].value == "hi" assert isinstance(inferred[1], bases.Instance) assert inferred[1].name == "str" @parametrize_module def test_inference_field_default(module: str): """Test inference of dataclass attribute with a field call default (default keyword argument given). """ klass, instance = astroid.extract_node( f""" from {module} import dataclass from dataclasses import field @dataclass class A: name: str = field(default='hi') A.name #@ A().name #@ """ ) inferred = klass.inferred() assert len(inferred) == 1 assert isinstance(inferred[0], nodes.Const) assert inferred[0].value == "hi" inferred = instance.inferred() assert len(inferred) == 2 assert isinstance(inferred[0], nodes.Const) assert inferred[0].value == "hi" assert isinstance(inferred[1], bases.Instance) assert inferred[1].name == "str" @parametrize_module def test_inference_field_default_factory(module: str): """Test inference of dataclass attribute with a field call default (default_factory keyword argument given). 
""" klass, instance = astroid.extract_node( f""" from {module} import dataclass from dataclasses import field @dataclass class A: name: list = field(default_factory=list) A.name #@ A().name #@ """ ) inferred = klass.inferred() assert len(inferred) == 1 assert isinstance(inferred[0], nodes.List) assert inferred[0].elts == [] inferred = instance.inferred() assert len(inferred) == 2 assert isinstance(inferred[0], nodes.List) assert inferred[0].elts == [] assert isinstance(inferred[1], bases.Instance) assert inferred[1].name == "list" @parametrize_module def test_inference_method(module: str): """Test inference of dataclass attribute within a method, with a default_factory field. Based on https://github.com/pylint-dev/pylint/issues/2600 """ node = astroid.extract_node( f""" from typing import Dict from {module} import dataclass from dataclasses import field @dataclass class TestClass: foo: str bar: str baz_dict: Dict[str, str] = field(default_factory=dict) def some_func(self) -> None: f = self.baz_dict.items #@ for key, value in f(): print(key) print(value) """ ) inferred = next(node.value.infer()) assert isinstance(inferred, bases.BoundMethod) @parametrize_module def test_inference_no_annotation(module: str): """Test that class variables without type annotations are not turned into instance attributes. 
""" class_def, klass, instance = astroid.extract_node( f""" from {module} import dataclass @dataclass class A: name = 'hi' A #@ A.name #@ A().name #@ """ ) inferred = next(class_def.infer()) assert isinstance(inferred, nodes.ClassDef) assert inferred.instance_attrs == {} assert inferred.is_dataclass # Both the class and instance can still access the attribute for node in (klass, instance): assert isinstance(node, nodes.NodeNG) inferred = node.inferred() assert len(inferred) == 1 assert isinstance(inferred[0], nodes.Const) assert inferred[0].value == "hi" @parametrize_module def test_inference_class_var(module: str): """Test that class variables with a ClassVar type annotations are not turned into instance attributes. """ class_def, klass, instance = astroid.extract_node( f""" from {module} import dataclass from typing import ClassVar @dataclass class A: name: ClassVar[str] = 'hi' A #@ A.name #@ A().name #@ """ ) inferred = next(class_def.infer()) assert isinstance(inferred, nodes.ClassDef) assert inferred.instance_attrs == {} assert inferred.is_dataclass # Both the class and instance can still access the attribute for node in (klass, instance): assert isinstance(node, nodes.NodeNG) inferred = node.inferred() assert len(inferred) == 1 assert isinstance(inferred[0], nodes.Const) assert inferred[0].value == "hi" @parametrize_module def test_inference_init_var(module: str): """Test that class variables with InitVar type annotations are not turned into instance attributes. 
""" class_def, klass, instance = astroid.extract_node( f""" from {module} import dataclass from dataclasses import InitVar @dataclass class A: name: InitVar[str] = 'hi' A #@ A.name #@ A().name #@ """ ) inferred = next(class_def.infer()) assert isinstance(inferred, nodes.ClassDef) assert inferred.instance_attrs == {} assert inferred.is_dataclass # Both the class and instance can still access the attribute for node in (klass, instance): assert isinstance(node, nodes.NodeNG) inferred = node.inferred() assert len(inferred) == 1 assert isinstance(inferred[0], nodes.Const) assert inferred[0].value == "hi" @parametrize_module def test_inference_generic_collection_attribute(module: str): """Test that an attribute with a generic collection type from the typing module is inferred correctly. """ attr_nodes = astroid.extract_node( f""" from {module} import dataclass from dataclasses import field import typing @dataclass class A: dict_prop: typing.Dict[str, str] frozenset_prop: typing.FrozenSet[str] list_prop: typing.List[str] set_prop: typing.Set[str] tuple_prop: typing.Tuple[int, str] a = A({{}}, frozenset(), [], set(), (1, 'hi')) a.dict_prop #@ a.frozenset_prop #@ a.list_prop #@ a.set_prop #@ a.tuple_prop #@ """ ) names = ( "Dict", "FrozenSet", "List", "Set", "Tuple", ) for node, name in zip(attr_nodes, names): inferred = next(node.infer()) assert isinstance(inferred, bases.Instance) assert inferred.name == name @pytest.mark.parametrize( ("module", "typing_module"), [ ("dataclasses", "typing"), ("pydantic.dataclasses", "typing"), ("pydantic.dataclasses", "collections.abc"), ("marshmallow_dataclass", "typing"), ("marshmallow_dataclass", "collections.abc"), ], ) def test_inference_callable_attribute(module: str, typing_module: str): """Test that an attribute with a Callable annotation is inferred as Uninferable. 
    See issue #1129 and pylint-dev/pylint#4895
    """
    instance = astroid.extract_node(
        f"""
    from {module} import dataclass
    from {typing_module} import Any, Callable

    @dataclass
    class A:
        enabled: Callable[[Any], bool]

    A(lambda x: x == 42).enabled #@
    """
    )
    inferred = next(instance.infer())
    assert inferred is Uninferable


@parametrize_module
def test_inference_inherited(module: str):
    """Test that an attribute is inherited from a superclass dataclass."""
    klass1, instance1, klass2, instance2 = astroid.extract_node(
        f"""
    from {module} import dataclass

    @dataclass
    class A:
        value: int
        name: str = "hi"

    @dataclass
    class B(A):
        new_attr: bool = True

    B.value #@
    B(1).value #@
    B.name #@
    B(1).name #@
    """
    )
    with pytest.raises(InferenceError):  # B.value is not defined
        klass1.inferred()

    inferred = instance1.inferred()
    assert isinstance(inferred[0], bases.Instance)
    assert inferred[0].name == "int"

    inferred = klass2.inferred()
    assert len(inferred) == 1
    assert isinstance(inferred[0], nodes.Const)
    assert inferred[0].value == "hi"

    inferred = instance2.inferred()
    assert len(inferred) == 2
    assert isinstance(inferred[0], nodes.Const)
    assert inferred[0].value == "hi"
    assert isinstance(inferred[1], bases.Instance)
    assert inferred[1].name == "str"


def test_dataclass_order_of_inherited_attributes():
    """Test that an attribute in a child does not get put at the end of the init."""
    child, normal, keyword_only = astroid.extract_node(
        """
    from dataclasses import dataclass

    @dataclass
    class Parent:
        a: str
        b: str

    @dataclass
    class Child(Parent):
        c: str
        a: str

    @dataclass(kw_only=True)
    class KeywordOnlyParent:
        a: int
        b: str

    @dataclass
    class NormalChild(KeywordOnlyParent):
        c: str
        a: str

    @dataclass(kw_only=True)
    class KeywordOnlyChild(KeywordOnlyParent):
        c: str
        a: str

    Child.__init__ #@
    NormalChild.__init__ #@
    KeywordOnlyChild.__init__ #@
    """
    )
    child_init: bases.UnboundMethod = next(child.infer())
    assert [a.name for a in child_init.args.args] == ["self", "a", "b", "c"]

    normal_init: bases.UnboundMethod =
next(normal.infer()) if PY310_PLUS: assert [a.name for a in normal_init.args.args] == ["self", "a", "c"] assert [a.name for a in normal_init.args.kwonlyargs] == ["b"] else: assert [a.name for a in normal_init.args.args] == ["self", "a", "b", "c"] assert [a.name for a in normal_init.args.kwonlyargs] == [] keyword_only_init: bases.UnboundMethod = next(keyword_only.infer()) if PY310_PLUS: assert [a.name for a in keyword_only_init.args.args] == ["self"] assert [a.name for a in keyword_only_init.args.kwonlyargs] == ["a", "b", "c"] else: assert [a.name for a in keyword_only_init.args.args] == ["self", "a", "b", "c"] def test_pydantic_field() -> None: """Test that pydantic.Field attributes are currently Uninferable. (Eventually, we can extend the brain to support pydantic.Field) """ klass, instance = astroid.extract_node( """ from pydantic import Field from pydantic.dataclasses import dataclass @dataclass class A: name: str = Field("hi") A.name #@ A().name #@ """ ) inferred = klass.inferred() assert len(inferred) == 1 assert inferred[0] is Uninferable inferred = instance.inferred() assert len(inferred) == 2 assert inferred[0] is Uninferable assert isinstance(inferred[1], bases.Instance) assert inferred[1].name == "str" @parametrize_module def test_init_empty(module: str): """Test init for a dataclass with no attributes.""" node = astroid.extract_node( f""" from {module} import dataclass @dataclass class A: pass A.__init__ #@ """ ) init = next(node.infer()) assert [a.name for a in init.args.args] == ["self"] @parametrize_module def test_init_no_defaults(module: str): """Test init for a dataclass with attributes and no defaults.""" node = astroid.extract_node( f""" from {module} import dataclass from typing import List @dataclass class A: x: int y: str z: List[bool] A.__init__ #@ """ ) init = next(node.infer()) assert [a.name for a in init.args.args] == ["self", "x", "y", "z"] assert [a.as_string() if a else None for a in init.args.annotations] == [ None, "int", "str", 
"List[bool]", ] @parametrize_module def test_init_defaults(module: str): """Test init for a dataclass with attributes and some defaults.""" node = astroid.extract_node( f""" from {module} import dataclass from dataclasses import field from typing import List @dataclass class A: w: int x: int = 10 y: str = field(default="hi") z: List[bool] = field(default_factory=list) A.__init__ #@ """ ) init = next(node.infer()) assert [a.name for a in init.args.args] == ["self", "w", "x", "y", "z"] assert [a.as_string() if a else None for a in init.args.annotations] == [ None, "int", "int", "str", "List[bool]", ] assert [a.as_string() if a else None for a in init.args.defaults] == [ "10", "'hi'", "_HAS_DEFAULT_FACTORY", ] @parametrize_module def test_init_initvar(module: str): """Test init for a dataclass with attributes and an InitVar.""" node = astroid.extract_node( f""" from {module} import dataclass from dataclasses import InitVar from typing import List @dataclass class A: x: int y: str init_var: InitVar[int] z: List[bool] A.__init__ #@ """ ) init = next(node.infer()) assert [a.name for a in init.args.args] == ["self", "x", "y", "init_var", "z"] assert [a.as_string() if a else None for a in init.args.annotations] == [ None, "int", "str", "int", "List[bool]", ] @parametrize_module def test_init_decorator_init_false(module: str): """Test that no init is generated when init=False is passed to dataclass decorator. """ node = astroid.extract_node( f""" from {module} import dataclass from typing import List @dataclass(init=False) class A: x: int y: str z: List[bool] A.__init__ #@ """ ) init = next(node.infer()) assert init._proxied.parent.name == "object" @parametrize_module def test_init_field_init_false(module: str): """Test init for a dataclass with attributes with a field value where init=False (these attributes should not be included in the initializer). 
""" node = astroid.extract_node( f""" from {module} import dataclass from dataclasses import field from typing import List @dataclass class A: x: int y: str z: List[bool] = field(init=False) A.__init__ #@ """ ) init = next(node.infer()) assert [a.name for a in init.args.args] == ["self", "x", "y"] assert [a.as_string() if a else None for a in init.args.annotations] == [ None, "int", "str", ] @parametrize_module def test_init_override(module: str): """Test init for a dataclass overrides a superclass initializer. Based on https://github.com/pylint-dev/pylint/issues/3201 """ node = astroid.extract_node( f""" from {module} import dataclass from typing import List class A: arg0: str = None def __init__(self, arg0): raise NotImplementedError @dataclass class B(A): arg1: int = None arg2: str = None B.__init__ #@ """ ) init = next(node.infer()) assert [a.name for a in init.args.args] == ["self", "arg1", "arg2"] assert [a.as_string() if a else None for a in init.args.annotations] == [ None, "int", "str", ] @parametrize_module def test_init_attributes_from_superclasses(module: str): """Test init for a dataclass that inherits and overrides attributes from superclasses. Based on https://github.com/pylint-dev/pylint/issues/3201 """ node = astroid.extract_node( f""" from {module} import dataclass from typing import List @dataclass class A: arg0: float arg2: str @dataclass class B(A): arg1: int arg2: list # Overrides arg2 from A B.__init__ #@ """ ) init = next(node.infer()) assert [a.name for a in init.args.args] == ["self", "arg0", "arg2", "arg1"] assert [a.as_string() if a else None for a in init.args.annotations] == [ None, "float", "list", # not str "int", ] @parametrize_module def test_invalid_init(module: str): """Test that astroid doesn't generate an initializer when attribute order is invalid. 
""" node = astroid.extract_node( f""" from {module} import dataclass @dataclass class A: arg1: float = 0.0 arg2: str A.__init__ #@ """ ) init = next(node.infer()) assert init._proxied.parent.name == "object" @parametrize_module def test_annotated_enclosed_field_call(module: str): """Test inference of dataclass attribute with a field call in another function call. """ node = astroid.extract_node( f""" from {module} import dataclass, field from typing import cast @dataclass class A: attribute: int = cast(int, field(default_factory=dict)) """ ) inferred = node.inferred() assert len(inferred) == 1 and isinstance(inferred[0], nodes.ClassDef) assert "attribute" in inferred[0].instance_attrs assert inferred[0].is_dataclass @parametrize_module def test_invalid_field_call(module: str) -> None: """Test inference of invalid field call doesn't crash.""" code = astroid.extract_node( f""" from {module} import dataclass, field @dataclass class A: val: field() """ ) inferred = code.inferred() assert len(inferred) == 1 assert isinstance(inferred[0], nodes.ClassDef) assert inferred[0].is_dataclass def test_non_dataclass_is_not_dataclass() -> None: """Test that something that isn't a dataclass has the correct attribute.""" module = astroid.parse( """ class A: val: field() def dataclass(): return @dataclass class B: val: field() """ ) class_a = module.body[0].inferred() assert len(class_a) == 1 assert isinstance(class_a[0], nodes.ClassDef) assert not class_a[0].is_dataclass class_b = module.body[2].inferred() assert len(class_b) == 1 assert isinstance(class_b[0], nodes.ClassDef) assert not class_b[0].is_dataclass def test_kw_only_sentinel() -> None: """Test that the KW_ONLY sentinel doesn't get added to the fields.""" node_one, node_two = astroid.extract_node( """ from dataclasses import dataclass, KW_ONLY from dataclasses import KW_ONLY as keyword_only @dataclass class A: _: KW_ONLY y: str A.__init__ #@ @dataclass class B: _: keyword_only y: str B.__init__ #@ """ ) if PY310_PLUS: 
expected = ["self", "y"] else: expected = ["self", "_", "y"] init = next(node_one.infer()) assert [a.name for a in init.args.args] == expected init = next(node_two.infer()) assert [a.name for a in init.args.args] == expected def test_kw_only_decorator() -> None: """Test that we update the signature correctly based on the keyword. kw_only was introduced in PY310. """ foodef, bardef, cee, dee = astroid.extract_node( """ from dataclasses import dataclass @dataclass(kw_only=True) class Foo: a: int e: str @dataclass(kw_only=False) class Bar(Foo): c: int @dataclass(kw_only=False) class Cee(Bar): d: int @dataclass(kw_only=True) class Dee(Cee): ee: int Foo.__init__ #@ Bar.__init__ #@ Cee.__init__ #@ Dee.__init__ #@ """ ) foo_init: bases.UnboundMethod = next(foodef.infer()) if PY310_PLUS: assert [a.name for a in foo_init.args.args] == ["self"] assert [a.name for a in foo_init.args.kwonlyargs] == ["a", "e"] else: assert [a.name for a in foo_init.args.args] == ["self", "a", "e"] assert [a.name for a in foo_init.args.kwonlyargs] == [] bar_init: bases.UnboundMethod = next(bardef.infer()) if PY310_PLUS: assert [a.name for a in bar_init.args.args] == ["self", "c"] assert [a.name for a in bar_init.args.kwonlyargs] == ["a", "e"] else: assert [a.name for a in bar_init.args.args] == ["self", "a", "e", "c"] assert [a.name for a in bar_init.args.kwonlyargs] == [] cee_init: bases.UnboundMethod = next(cee.infer()) if PY310_PLUS: assert [a.name for a in cee_init.args.args] == ["self", "c", "d"] assert [a.name for a in cee_init.args.kwonlyargs] == ["a", "e"] else: assert [a.name for a in cee_init.args.args] == ["self", "a", "e", "c", "d"] assert [a.name for a in cee_init.args.kwonlyargs] == [] dee_init: bases.UnboundMethod = next(dee.infer()) if PY310_PLUS: assert [a.name for a in dee_init.args.args] == ["self", "c", "d"] assert [a.name for a in dee_init.args.kwonlyargs] == ["a", "e", "ee"] else: assert [a.name for a in dee_init.args.args] == [ "self", "a", "e", "c", "d", "ee", ] assert 
[a.name for a in dee_init.args.kwonlyargs] == [] def test_kw_only_in_field_call() -> None: """Test that keyword only fields get correctly put at the end of the __init__.""" first, second, third = astroid.extract_node( """ from dataclasses import dataclass, field @dataclass class Parent: p1: int = field(kw_only=True, default=0) @dataclass class Child(Parent): c1: str @dataclass(kw_only=True) class GrandChild(Child): p2: int = field(kw_only=False, default=1) p3: int = field(kw_only=True, default=2) Parent.__init__ #@ Child.__init__ #@ GrandChild.__init__ #@ """ ) first_init: bases.UnboundMethod = next(first.infer()) assert [a.name for a in first_init.args.args] == ["self"] assert [a.name for a in first_init.args.kwonlyargs] == ["p1"] assert [d.value for d in first_init.args.kw_defaults] == [0] second_init: bases.UnboundMethod = next(second.infer()) assert [a.name for a in second_init.args.args] == ["self", "c1"] assert [a.name for a in second_init.args.kwonlyargs] == ["p1"] assert [d.value for d in second_init.args.kw_defaults] == [0] third_init: bases.UnboundMethod = next(third.infer()) assert [a.name for a in third_init.args.args] == ["self", "c1", "p2"] assert [a.name for a in third_init.args.kwonlyargs] == ["p1", "p3"] assert [d.value for d in third_init.args.defaults] == [1] assert [d.value for d in third_init.args.kw_defaults] == [0, 2] def test_dataclass_with_unknown_base() -> None: """Regression test for dataclasses with unknown base classes. Reported in https://github.com/pylint-dev/pylint/issues/7418 """ node = astroid.extract_node( """ import dataclasses from unknown import Unknown @dataclasses.dataclass class MyDataclass(Unknown): pass MyDataclass() """ ) assert next(node.infer()) def test_dataclass_with_unknown_typing() -> None: """Regression test for dataclasses with unknown base classes. 
    Reported in https://github.com/pylint-dev/pylint/issues/7422
    """
    node = astroid.extract_node(
        """
    from dataclasses import dataclass, InitVar

    @dataclass
    class TestClass:
        '''Test Class'''

        config: InitVar = None

    TestClass.__init__ #@
    """
    )
    init_def: bases.UnboundMethod = next(node.infer())
    assert [a.name for a in init_def.args.args] == ["self", "config"]


def test_dataclass_with_default_factory() -> None:
    """Regression test for dataclasses with default values.

    Reported in https://github.com/pylint-dev/pylint/issues/7425
    """
    bad_node, good_node = astroid.extract_node(
        """
    from dataclasses import dataclass
    from typing import Union

    @dataclass
    class BadExampleParentClass:
        xyz: Union[str, int]

    @dataclass
    class BadExampleClass(BadExampleParentClass):
        xyz: str = ""

    BadExampleClass.__init__ #@

    @dataclass
    class GoodExampleParentClass:
        xyz: str

    @dataclass
    class GoodExampleClass(GoodExampleParentClass):
        xyz: str = ""

    GoodExampleClass.__init__ #@
    """
    )
    bad_init: bases.UnboundMethod = next(bad_node.infer())
    assert bad_init.args.defaults
    assert [a.name for a in bad_init.args.args] == ["self", "xyz"]

    good_init: bases.UnboundMethod = next(good_node.infer())
    assert good_init.args.defaults
    assert [a.name for a in good_init.args.args] == ["self", "xyz"]


def test_dataclass_with_multiple_inheritance() -> None:
    """Regression test for dataclasses with multiple inheritance.
Reported in https://github.com/pylint-dev/pylint/issues/7427 Reported in https://github.com/pylint-dev/pylint/issues/7434 """ first, second, overwritten, overwriting, mixed = astroid.extract_node( """ from dataclasses import dataclass @dataclass class BaseParent: _abc: int = 1 @dataclass class AnotherParent: ef: int = 2 @dataclass class FirstChild(BaseParent, AnotherParent): ghi: int = 3 @dataclass class ConvolutedParent(AnotherParent): '''Convoluted Parent''' @dataclass class SecondChild(BaseParent, ConvolutedParent): jkl: int = 4 @dataclass class OverwritingParent: ef: str = "2" @dataclass class OverwrittenChild(OverwritingParent, AnotherParent): '''Overwritten Child''' @dataclass class OverwritingChild(BaseParent, AnotherParent): _abc: float = 1.0 ef: float = 2.0 class NotADataclassParent: ef: int = 2 @dataclass class ChildWithMixedParents(BaseParent, NotADataclassParent): ghi: int = 3 FirstChild.__init__ #@ SecondChild.__init__ #@ OverwrittenChild.__init__ #@ OverwritingChild.__init__ #@ ChildWithMixedParents.__init__ #@ """ ) first_init: bases.UnboundMethod = next(first.infer()) assert [a.name for a in first_init.args.args] == ["self", "ef", "_abc", "ghi"] assert [a.value for a in first_init.args.defaults] == [2, 1, 3] second_init: bases.UnboundMethod = next(second.infer()) assert [a.name for a in second_init.args.args] == ["self", "ef", "_abc", "jkl"] assert [a.value for a in second_init.args.defaults] == [2, 1, 4] overwritten_init: bases.UnboundMethod = next(overwritten.infer()) assert [a.name for a in overwritten_init.args.args] == ["self", "ef"] assert [a.value for a in overwritten_init.args.defaults] == ["2"] overwriting_init: bases.UnboundMethod = next(overwriting.infer()) assert [a.name for a in overwriting_init.args.args] == ["self", "ef", "_abc"] assert [a.value for a in overwriting_init.args.defaults] == [2.0, 1.0] mixed_init: bases.UnboundMethod = next(mixed.infer()) assert [a.name for a in mixed_init.args.args] == ["self", "_abc", "ghi"] assert 
[a.value for a in mixed_init.args.defaults] == [1, 3] first = astroid.extract_node( """ from dataclasses import dataclass @dataclass class BaseParent: required: bool @dataclass class FirstChild(BaseParent): ... @dataclass class SecondChild(BaseParent): optional: bool = False @dataclass class GrandChild(FirstChild, SecondChild): ... GrandChild.__init__ #@ """ ) first_init: bases.UnboundMethod = next(first.infer()) assert [a.name for a in first_init.args.args] == ["self", "required", "optional"] assert [a.value for a in first_init.args.defaults] == [False] @pytest.mark.xfail(reason="Transforms returning Uninferable isn't supported.") def test_dataclass_non_default_argument_after_default() -> None: """Test that a non-default argument after a default argument is not allowed. This should succeed, but the dataclass brain is a transform which currently can't return an Uninferable correctly. Therefore, we can't set the dataclass ClassDef node to be Uninferable currently. Eventually it can be merged into test_dataclass_with_multiple_inheritance. """ impossible = astroid.extract_node( """ from dataclasses import dataclass @dataclass class BaseParent: required: bool @dataclass class FirstChild(BaseParent): ... @dataclass class SecondChild(BaseParent): optional: bool = False @dataclass class ThirdChild: other: bool = False @dataclass class ImpossibleGrandChild(FirstChild, SecondChild, ThirdChild): ... 
ImpossibleGrandChild() #@ """ ) assert next(impossible.infer()) is Uninferable def test_dataclass_with_field_init_is_false() -> None: """When init=False it shouldn't end up in the __init__.""" first, second, second_child, third_child, third = astroid.extract_node( """ from dataclasses import dataclass, field @dataclass class First: a: int @dataclass class Second(First): a: int = field(init=False, default=1) @dataclass class SecondChild(Second): a: float @dataclass class ThirdChild(SecondChild): a: str @dataclass class Third(First): a: str First.__init__ #@ Second.__init__ #@ SecondChild.__init__ #@ ThirdChild.__init__ #@ Third.__init__ #@ """ ) first_init: bases.UnboundMethod = next(first.infer()) assert [a.name for a in first_init.args.args] == ["self", "a"] assert [a.value for a in first_init.args.defaults] == [] second_init: bases.UnboundMethod = next(second.infer()) assert [a.name for a in second_init.args.args] == ["self"] assert [a.value for a in second_init.args.defaults] == [] second_child_init: bases.UnboundMethod = next(second_child.infer()) assert [a.name for a in second_child_init.args.args] == ["self", "a"] assert [a.value for a in second_child_init.args.defaults] == [1] third_child_init: bases.UnboundMethod = next(third_child.infer()) assert [a.name for a in third_child_init.args.args] == ["self", "a"] assert [a.value for a in third_child_init.args.defaults] == [1] third_init: bases.UnboundMethod = next(third.infer()) assert [a.name for a in third_init.args.args] == ["self", "a"] assert [a.value for a in third_init.args.defaults] == [] def test_dataclass_inits_of_non_dataclasses() -> None: """Regression test for __init__ mangling for non dataclasses. 
Regression test against changes tested in test_dataclass_with_multiple_inheritance """ first, second, third = astroid.extract_node( """ from dataclasses import dataclass @dataclass class DataclassParent: _abc: int = 1 class NotADataclassParent: ef: int = 2 class FirstChild(DataclassParent, NotADataclassParent): ghi: int = 3 class SecondChild(DataclassParent, NotADataclassParent): ghi: int = 3 def __init__(self, ef: int = 3): self.ef = ef class ThirdChild(NotADataclassParent, DataclassParent): ghi: int = 3 def __init__(self, ef: int = 3): self.ef = ef FirstChild.__init__ #@ SecondChild.__init__ #@ ThirdChild.__init__ #@ """ ) first_init: bases.UnboundMethod = next(first.infer()) assert [a.name for a in first_init.args.args] == ["self", "_abc"] assert [a.value for a in first_init.args.defaults] == [1] second_init: bases.UnboundMethod = next(second.infer()) assert [a.name for a in second_init.args.args] == ["self", "ef"] assert [a.value for a in second_init.args.defaults] == [3] third_init: bases.UnboundMethod = next(third.infer()) assert [a.name for a in third_init.args.args] == ["self", "ef"] assert [a.value for a in third_init.args.defaults] == [3] def test_dataclass_with_properties() -> None: """Tests for __init__ creation for dataclasses that use properties.""" first, second, third = astroid.extract_node( """ from dataclasses import dataclass @dataclass class Dataclass: attr: int @property def attr(self) -> int: return 1 @attr.setter def attr(self, value: int) -> None: pass class ParentOne(Dataclass): '''Docstring''' @dataclass class ParentTwo(Dataclass): '''Docstring''' Dataclass.__init__ #@ ParentOne.__init__ #@ ParentTwo.__init__ #@ """ ) first_init: bases.UnboundMethod = next(first.infer()) assert [a.name for a in first_init.args.args] == ["self", "attr"] assert [a.value for a in first_init.args.defaults] == [1] second_init: bases.UnboundMethod = next(second.infer()) assert [a.name for a in second_init.args.args] == ["self", "attr"] assert [a.value for a in 
second_init.args.defaults] == [1] third_init: bases.UnboundMethod = next(third.infer()) assert [a.name for a in third_init.args.args] == ["self", "attr"] assert [a.value for a in third_init.args.defaults] == [1] fourth = astroid.extract_node( """ from dataclasses import dataclass @dataclass class Dataclass: other_attr: str attr: str @property def attr(self) -> str: return self.other_attr[-1] @attr.setter def attr(self, value: int) -> None: pass Dataclass.__init__ #@ """ ) fourth_init: bases.UnboundMethod = next(fourth.infer()) assert [a.name for a in fourth_init.args.args] == ["self", "other_attr", "attr"] assert [a.name for a in fourth_init.args.defaults] == ["Uninferable"] astroid-3.2.2/tests/brain/test_nose.py0000664000175000017500000000322714622475517017724 0ustar epsilonepsilon# Licensed under the LGPL: https://www.gnu.org/licenses/old-licenses/lgpl-2.1.en.html # For details: https://github.com/pylint-dev/astroid/blob/main/LICENSE # Copyright (c) https://github.com/pylint-dev/astroid/blob/main/CONTRIBUTORS.txt from __future__ import annotations import unittest import warnings import astroid from astroid import builder try: with warnings.catch_warnings(): warnings.simplefilter("ignore", DeprecationWarning) import nose # pylint: disable=unused-import HAS_NOSE = True except ImportError: HAS_NOSE = False @unittest.skipUnless(HAS_NOSE, "This test requires nose library.") class NoseBrainTest(unittest.TestCase): def test_nose_tools(self): methods = builder.extract_node( """ from nose.tools import assert_equal from nose.tools import assert_equals from nose.tools import assert_true assert_equal = assert_equal #@ assert_true = assert_true #@ assert_equals = assert_equals #@ """ ) assert isinstance(methods, list) assert_equal = next(methods[0].value.infer()) assert_true = next(methods[1].value.infer()) assert_equals = next(methods[2].value.infer()) self.assertIsInstance(assert_equal, astroid.BoundMethod) self.assertIsInstance(assert_true, astroid.BoundMethod) 
        self.assertIsInstance(assert_equals, astroid.BoundMethod)
        self.assertEqual(assert_equal.qname(), "unittest.case.TestCase.assertEqual")
        self.assertEqual(assert_true.qname(), "unittest.case.TestCase.assertTrue")
        self.assertEqual(assert_equals.qname(), "unittest.case.TestCase.assertEqual")

astroid-3.2.2/tests/brain/test_signal.py

# Licensed under the LGPL: https://www.gnu.org/licenses/old-licenses/lgpl-2.1.en.html
# For details: https://github.com/pylint-dev/astroid/blob/main/LICENSE
# Copyright (c) https://github.com/pylint-dev/astroid/blob/main/CONTRIBUTORS.txt

"""Unit Tests for the signal brain module."""

import sys

import pytest

from astroid import builder, nodes

# Define signal enums
ENUMS = ["Signals", "Handlers", "Sigmasks"]
if sys.platform == "win32":
    ENUMS.remove("Sigmasks")  # Sigmasks do not exist on Windows


@pytest.mark.parametrize("enum_name", ENUMS)
def test_enum(enum_name):
    """Tests that the signal module enums are handled by the brain."""
    # Extract node for signal module enum from code
    node = builder.extract_node(
        f"""
    import signal
    signal.{enum_name}
    """
    )

    # Check the extracted node
    assert isinstance(node, nodes.NodeNG)
    node_inf = node.inferred()[0]
    assert isinstance(node_inf, nodes.ClassDef)
    assert node_inf.display_type() == "Class"
    assert node_inf.is_subtype_of("enum.IntEnum")
    assert node_inf.qname() == f"signal.{enum_name}"

    # Check enum members
    for member in node_inf.body:
        assert isinstance(member, nodes.Assign)
        for target in member.targets:
            assert isinstance(target, nodes.AssignName)

astroid-3.2.2/tests/brain/test_threading.py

# Licensed under the LGPL: https://www.gnu.org/licenses/old-licenses/lgpl-2.1.en.html
# For details: https://github.com/pylint-dev/astroid/blob/main/LICENSE
# Copyright (c) https://github.com/pylint-dev/astroid/blob/main/CONTRIBUTORS.txt

from __future__ import annotations

import unittest
import astroid
from astroid import builder
from astroid.bases import Instance


class ThreadingBrainTest(unittest.TestCase):
    def test_lock(self) -> None:
        lock_instance = builder.extract_node(
            """
        import threading
        threading.Lock()
        """
        )
        inferred = next(lock_instance.infer())
        self.assert_is_valid_lock(inferred)

        acquire_method = inferred.getattr("acquire")[0]
        parameters = [param.name for param in acquire_method.args.args[1:]]
        assert parameters == ["blocking", "timeout"]

        assert inferred.getattr("locked")

    def test_rlock(self) -> None:
        self._test_lock_object("RLock")

    def test_semaphore(self) -> None:
        self._test_lock_object("Semaphore")

    def test_boundedsemaphore(self) -> None:
        self._test_lock_object("BoundedSemaphore")

    def _test_lock_object(self, object_name: str) -> None:
        lock_instance = builder.extract_node(
            f"""
        import threading
        threading.{object_name}()
        """
        )
        inferred = next(lock_instance.infer())
        self.assert_is_valid_lock(inferred)

    def assert_is_valid_lock(self, inferred: Instance) -> None:
        self.assertIsInstance(inferred, astroid.Instance)
        self.assertEqual(inferred.root().name, "threading")
        for method in ("acquire", "release", "__enter__", "__exit__"):
            self.assertIsInstance(next(inferred.igetattr(method)), astroid.BoundMethod)

astroid-3.2.2/tests/brain/test_builtin.py

# Licensed under the LGPL: https://www.gnu.org/licenses/old-licenses/lgpl-2.1.en.html
# For details: https://github.com/pylint-dev/astroid/blob/main/LICENSE
# Copyright (c) https://github.com/pylint-dev/astroid/blob/main/CONTRIBUTORS.txt

"""Unit Tests for the builtins brain module."""

import unittest

import pytest

from astroid import nodes, objects, util
from astroid.builder import _extract_single_node, extract_node


class BuiltinsTest(unittest.TestCase):
    def test_infer_property(self):
        class_with_property = _extract_single_node(
            """
        class Something:
            def getter():
                return 5
            asd = property(getter) #@
        """
        )
        inferred_property = next(iter(class_with_property.value.infer()))
        self.assertTrue(isinstance(inferred_property, objects.Property))
        class_parent = inferred_property.parent.parent.parent
        self.assertIsInstance(class_parent, nodes.ClassDef)
        self.assertFalse(
            any(
                isinstance(getter, objects.Property)
                for getter in class_parent.locals["getter"]
            )
        )
        self.assertTrue(hasattr(inferred_property, "args"))


class TestStringNodes:
    @pytest.mark.parametrize(
        "format_string",
        [
            pytest.param(
                """"My name is {}, I'm {}".format("Daniel", 12)""", id="empty-indexes"
            ),
            pytest.param(
                """"My name is {0}, I'm {1}".format("Daniel", 12)""",
                id="numbered-indexes",
            ),
            pytest.param(
                """"My name is {fname}, I'm {age}".format(fname = "Daniel", age = 12)""",
                id="named-indexes",
            ),
            pytest.param(
                """
                name = "Daniel"
                age = 12
                "My name is {0}, I'm {1}".format(name, age)
                """,
                id="numbered-indexes-from-positional",
            ),
            pytest.param(
                """
                name = "Daniel"
                age = 12
                "My name is {fname}, I'm {age}".format(fname = name, age = age)
                """,
                id="named-indexes-from-keyword",
            ),
            pytest.param(
                """
                name = "Daniel"
                age = 12
                "My name is {0}, I'm {age}".format(name, age = age)
                """,
                id="mixed-indexes-from-mixed",
            ),
            pytest.param(
                """
                string = "My name is {}, I'm {}"
                string.format("Daniel", 12)
                """,
                id="empty-indexes-on-variable",
            ),
        ],
    )
    def test_string_format(self, format_string: str) -> None:
        node: nodes.Call = _extract_single_node(format_string)
        inferred = next(node.infer())
        assert isinstance(inferred, nodes.Const)
        assert inferred.value == "My name is Daniel, I'm 12"

    @pytest.mark.parametrize(
        "format_string",
        [
            """
            from missing import Unknown
            name = Unknown
            age = 12
            "My name is {fname}, I'm {age}".format(fname = name, age = age)
            """,
            """
            from missing import Unknown
            age = 12
            "My name is {fname}, I'm {age}".format(fname = Unknown, age = age)
            """,
            """
            from missing import Unknown
            "My name is {}, I'm {}".format(Unknown, 12)
            """,
            """"I am {}".format()""",
            """
            "My name is {fname}, I'm {age}".format(fsname = "Daniel", age = 12)
            """,
            """
            "My unicode character is {:c}".format(None)
            """,
            """
            "My hex format is {:4x}".format('1')
            """,
            """
            daniel_age = 12
            "My name is {0.name}".format(daniel_age)
            """,
        ],
    )
    def test_string_format_uninferable(self, format_string: str) -> None:
        node: nodes.Call = _extract_single_node(format_string)
        inferred = next(node.infer())
        assert inferred is util.Uninferable

    def test_string_format_with_specs(self) -> None:
        node: nodes.Call = _extract_single_node(
            """"My name is {}, I'm {:.2f}".format("Daniel", 12)"""
        )
        inferred = next(node.infer())
        assert isinstance(inferred, nodes.Const)
        assert inferred.value == "My name is Daniel, I'm 12.00"

    def test_string_format_in_dataclass_pylint8109(self) -> None:
        """https://github.com/pylint-dev/pylint/issues/8109"""
        function_def = extract_node(
            """
        from dataclasses import dataclass

        @dataclass
        class Number:
            amount: int | float
            round: int = 2

            def __str__(self): #@
                number_format = "{:,.%sf}" % self.round
                return number_format.format(self.amount).rstrip("0").rstrip(".")
        """
        )
        inferit = function_def.infer_call_result(function_def, context=None)
        assert [a.name for a in inferit] == [util.Uninferable]

astroid-3.2.2/tests/brain/test_qt.py

# Licensed under the LGPL: https://www.gnu.org/licenses/old-licenses/lgpl-2.1.en.html
# For details: https://github.com/pylint-dev/astroid/blob/main/LICENSE
# Copyright (c) https://github.com/pylint-dev/astroid/blob/main/CONTRIBUTORS.txt

from importlib.util import find_spec

import pytest

from astroid import Uninferable, extract_node
from astroid.bases import UnboundMethod
from astroid.const import PY312_PLUS
from astroid.manager import AstroidManager
from astroid.nodes import FunctionDef

HAS_PYQT6 = find_spec("PyQt6")


@pytest.mark.skipif(HAS_PYQT6 is None, reason="These tests require the PyQt6 library.")
# TODO: enable for Python 3.12 as soon as PyQt6 release is compatible
@pytest.mark.skipif(PY312_PLUS, reason="This test was segfaulting with Python 3.12.")
class TestBrainQt:
    AstroidManager.brain["extension_package_whitelist"] = {"PyQt6"}  # noqa: RUF012

    @staticmethod
    def test_value_of_lambda_instance_attrs_is_list():
        """Regression test for https://github.com/pylint-dev/pylint/issues/6221.

        A crash occurred in pylint when a nodes.FunctionDef was iterated directly,
        giving items like "self" instead of iterating a one-element list containing
        the wanted nodes.FunctionDef.
        """
        src = """
        from PyQt6 import QtPrintSupport as printsupport
        printsupport.QPrintPreviewDialog.paintRequested  #@
        """
        node = extract_node(src)
        attribute_node = node.inferred()[0]
        if attribute_node is Uninferable:
            pytest.skip("PyQt6 C bindings may not be installed?")
        assert isinstance(attribute_node, UnboundMethod)
        # scoped_nodes.Lambda.instance_attrs is typed as Dict[str, List[NodeNG]]
        assert isinstance(attribute_node.instance_attrs["connect"][0], FunctionDef)

    @staticmethod
    def test_implicit_parameters() -> None:
        """Regression test for https://github.com/pylint-dev/pylint/issues/6464."""
        src = """
        from PyQt6.QtCore import QTimer
        timer = QTimer()
        timer.timeout.connect  #@
        """
        node = extract_node(src)
        attribute_node = node.inferred()[0]
        if attribute_node is Uninferable:
            pytest.skip("PyQt6 C bindings may not be installed?")
        assert isinstance(attribute_node, FunctionDef)
        assert attribute_node.implicit_parameters() == 1

    @staticmethod
    def test_slot_disconnect_no_args() -> None:
        """Test calling .disconnect() on a signal.

        See https://github.com/pylint-dev/astroid/pull/1531#issuecomment-1111963792
        """
        src = """
        from PyQt6.QtCore import QTimer
        timer = QTimer()
        timer.timeout.disconnect  #@
        """
        node = extract_node(src)
        attribute_node = node.inferred()[0]
        if attribute_node is Uninferable:
            pytest.skip("PyQt6 C bindings may not be installed?")
        assert isinstance(attribute_node, FunctionDef)
        assert attribute_node.args.defaults

astroid-3.2.2/tests/brain/test_argparse.py

# Licensed under the LGPL: https://www.gnu.org/licenses/old-licenses/lgpl-2.1.en.html
# For details: https://github.com/pylint-dev/astroid/blob/main/LICENSE
# Copyright (c) https://github.com/pylint-dev/astroid/blob/main/CONTRIBUTORS.txt

from astroid import bases, extract_node, nodes


class TestBrainArgparse:
    @staticmethod
    def test_infer_namespace() -> None:
        func = extract_node(
            """
        import argparse
        def make_namespace():  #@
            return argparse.Namespace(debug=True)
        """
        )
        assert isinstance(func, nodes.FunctionDef)
        inferred = next(func.infer_call_result(func))
        assert isinstance(inferred, bases.Instance)
        assert not func.locals

astroid-3.2.2/tests/brain/test_pathlib.py

# Licensed under the LGPL: https://www.gnu.org/licenses/old-licenses/lgpl-2.1.en.html
# For details: https://github.com/pylint-dev/astroid/blob/main/LICENSE
# Copyright (c) https://github.com/pylint-dev/astroid/blob/main/CONTRIBUTORS.txt

import astroid
from astroid import bases
from astroid.const import PY310_PLUS
from astroid.util import Uninferable


def test_inference_parents() -> None:
    """Test inference of ``pathlib.Path.parents``."""
    name_node = astroid.extract_node(
        """
        from pathlib import Path

        current_path = Path().resolve()
        path_parents = current_path.parents
        path_parents
        """
    )
    inferred = name_node.inferred()
    assert len(inferred) == 1
    assert isinstance(inferred[0], bases.Instance)
    assert inferred[0].qname() == "pathlib._PathParents"


def test_inference_parents_subscript_index() -> None:
    """Test inference of ``pathlib.Path.parents``, accessed by index."""
    path = astroid.extract_node(
        """
        from pathlib import Path

        current_path = Path().resolve()
        current_path.parents[2]  #@
        """
    )
    inferred = path.inferred()
    assert len(inferred) == 1
    assert isinstance(inferred[0], bases.Instance)
    assert inferred[0].qname() == "pathlib.Path"


def test_inference_parents_subscript_slice() -> None:
    """Test inference of ``pathlib.Path.parents``, accessed by slice."""
    name_node = astroid.extract_node(
        """
        from pathlib import Path

        current_path = Path().resolve()
        parent_path = current_path.parents[:2]
        parent_path
        """
    )
    inferred = name_node.inferred()
    assert len(inferred) == 1
    if PY310_PLUS:
        assert isinstance(inferred[0], bases.Instance)
        assert inferred[0].qname() == "builtins.tuple"
    else:
        assert inferred[0] is Uninferable


def test_inference_parents_subscript_not_path() -> None:
    """Test inference of other ``.parents`` subscripts is unaffected."""
    name_node = astroid.extract_node(
        """
        class A:
            parents = 42

        c = A()
        error = c.parents[:2]
        error
        """
    )
    inferred = name_node.inferred()
    assert len(inferred) == 1
    assert inferred[0] is Uninferable

astroid-3.2.2/tests/brain/test_multiprocessing.py

# Licensed under the LGPL: https://www.gnu.org/licenses/old-licenses/lgpl-2.1.en.html
# For details: https://github.com/pylint-dev/astroid/blob/main/LICENSE
# Copyright (c) https://github.com/pylint-dev/astroid/blob/main/CONTRIBUTORS.txt

from __future__ import annotations

import queue
import sys
import unittest

import astroid
from astroid import builder, nodes

try:
    import multiprocessing  # pylint: disable=unused-import

    HAS_MULTIPROCESSING = True
except ImportError:
    HAS_MULTIPROCESSING = False


@unittest.skipUnless(
    HAS_MULTIPROCESSING,
    "multiprocessing is required for this test, but "
    "on some platforms it is missing "
    "(Jython for instance)",
)
class MultiprocessingBrainTest(unittest.TestCase):
    def test_multiprocessing_module_attributes(self) -> None:
        # Test that module attributes are working,
        # especially on Python 3.4+, where they are obtained
        # from a context.
        module = builder.extract_node(
            """
        import multiprocessing
        """
        )
        assert isinstance(module, nodes.Import)
        module = module.do_import_module("multiprocessing")
        cpu_count = next(module.igetattr("cpu_count"))
        self.assertIsInstance(cpu_count, astroid.BoundMethod)

    def test_module_name(self) -> None:
        module = builder.extract_node(
            """
        import multiprocessing
        multiprocessing.SyncManager()
        """
        )
        inferred_sync_mgr = next(module.infer())
        module = inferred_sync_mgr.root()
        self.assertEqual(module.name, "multiprocessing.managers")

    def test_multiprocessing_manager(self) -> None:
        # Test that we have the proper attributes
        # for a multiprocessing.managers.SyncManager
        module = builder.parse(
            """
        import multiprocessing
        manager = multiprocessing.Manager()
        queue = manager.Queue()
        joinable_queue = manager.JoinableQueue()
        event = manager.Event()
        rlock = manager.RLock()
        lock = manager.Lock()
        bounded_semaphore = manager.BoundedSemaphore()
        condition = manager.Condition()
        barrier = manager.Barrier()
        pool = manager.Pool()
        list = manager.list()
        dict = manager.dict()
        value = manager.Value()
        array = manager.Array()
        namespace = manager.Namespace()
        """
        )
        ast_queue = next(module["queue"].infer())
        self.assertEqual(ast_queue.qname(), f"{queue.__name__}.Queue")

        joinable_queue = next(module["joinable_queue"].infer())
        self.assertEqual(joinable_queue.qname(), f"{queue.__name__}.Queue")

        event = next(module["event"].infer())
        event_name = "threading.Event"
        self.assertEqual(event.qname(), event_name)

        rlock = next(module["rlock"].infer())
        rlock_name = "threading._RLock"
        self.assertEqual(rlock.qname(), rlock_name)

        lock = next(module["lock"].infer())
        lock_name = "threading.lock"
        self.assertEqual(lock.qname(), lock_name)

        bounded_semaphore = next(module["bounded_semaphore"].infer())
        semaphore_name = "threading.BoundedSemaphore"
        self.assertEqual(bounded_semaphore.qname(), semaphore_name)

        pool = next(module["pool"].infer())
        pool_name = "multiprocessing.pool.Pool"
        self.assertEqual(pool.qname(), pool_name)

        for attr in ("list", "dict"):
            obj = next(module[attr].infer())
            self.assertEqual(obj.qname(), f"builtins.{attr}")

        # pypy's implementation of array.__spec__ returns None. This causes problems for this inference.
        if not hasattr(sys, "pypy_version_info"):
            array = next(module["array"].infer())
            self.assertEqual(array.qname(), "array.array")

        manager = next(module["manager"].infer())
        # Verify that we have these attributes
        self.assertTrue(manager.getattr("start"))
        self.assertTrue(manager.getattr("shutdown"))

astroid-3.2.2/tests/test_stdlib.py

# Licensed under the LGPL: https://www.gnu.org/licenses/old-licenses/lgpl-2.1.en.html
# For details: https://github.com/pylint-dev/astroid/blob/main/LICENSE
# Copyright (c) https://github.com/pylint-dev/astroid/blob/main/CONTRIBUTORS.txt

"""Tests for modules in the stdlib."""

from astroid import nodes
from astroid.builder import _extract_single_node


class TestSys:
    """Tests for the sys module."""

    def test_sys_builtin_module_names(self) -> None:
        """Test that we can gather the elements of a living tuple object."""
        node = _extract_single_node(
            """
        import sys
        sys.builtin_module_names
        """
        )
        inferred = list(node.infer())
        assert len(inferred) == 1
        assert isinstance(inferred[0], nodes.Tuple)
        assert inferred[0].elts

    def test_sys_modules(self) -> None:
        """Test that we can gather the items of a living dict object."""
        node = _extract_single_node(
            """
        import sys
        sys.modules
        """
        )
        inferred = list(node.infer())
        assert len(inferred) == 1
        assert isinstance(inferred[0], nodes.Dict)
        assert inferred[0].items

astroid-3.2.2/tests/resources.py

# Licensed under the LGPL:
# https://www.gnu.org/licenses/old-licenses/lgpl-2.1.en.html
# For details: https://github.com/pylint-dev/astroid/blob/main/LICENSE
# Copyright (c) https://github.com/pylint-dev/astroid/blob/main/CONTRIBUTORS.txt

from __future__ import annotations

import os
import sys
from pathlib import Path

from astroid import builder
from astroid.nodes.scoped_nodes import Module

DATA_DIR = Path("testdata") / "python3"
RESOURCE_PATH = Path(__file__).parent / DATA_DIR / "data"


def find(name: str) -> str:
    return os.path.normpath(os.path.join(os.path.dirname(__file__), DATA_DIR, name))


def build_file(path: str, modname: str | None = None) -> Module:
    return builder.AstroidBuilder().file_build(find(path), modname)


class SysPathSetup:
    def setUp(self) -> None:
        sys.path.insert(0, find(""))

    def tearDown(self) -> None:
        del sys.path[0]
        datadir = find("")
        for key in list(sys.path_importer_cache):
            if key.startswith(datadir):
                del sys.path_importer_cache[key]

astroid-3.2.2/tests/test_inference.py

# Licensed under the LGPL: https://www.gnu.org/licenses/old-licenses/lgpl-2.1.en.html
# For details: https://github.com/pylint-dev/astroid/blob/main/LICENSE
# Copyright (c) https://github.com/pylint-dev/astroid/blob/main/CONTRIBUTORS.txt

"""Tests for the astroid inference capabilities."""

from __future__ import annotations

import sys
import textwrap
import unittest
from abc import ABCMeta
from collections.abc import Callable
from functools import partial
from pathlib import Path
from typing import Any
from unittest.mock import patch

import pytest

from astroid import (
    Slice,
    Uninferable,
    arguments,
    manager,
    nodes,
    objects,
    test_utils,
    util,
)
from astroid import decorators as decoratorsmod
from astroid.arguments import CallSite
from astroid.bases import BoundMethod, Generator, Instance, UnboundMethod, UnionType
from astroid.builder import AstroidBuilder, _extract_single_node, extract_node, parse
from astroid.const import IS_PYPY, PY39_PLUS, PY310_PLUS, PY312_PLUS
from astroid.context import CallContext, InferenceContext
from astroid.exceptions import (
    AstroidTypeError,
    AttributeInferenceError,
    InferenceError,
    NoDefault,
    NotFoundError,
)
from astroid.objects import ExceptionInstance

from . import resources

try:
    import six  # type: ignore[import] # pylint: disable=unused-import

    HAS_SIX = True
except ImportError:
    HAS_SIX = False


def get_node_of_class(start_from: nodes.FunctionDef, klass: type) -> nodes.Attribute:
    return next(start_from.nodes_of_class(klass))


builder = AstroidBuilder()

DATA_DIR = Path(__file__).parent / "testdata" / "python3" / "data"


class InferenceUtilsTest(unittest.TestCase):
    def test_path_wrapper(self) -> None:
        def infer_default(self: Any, *args: InferenceContext) -> None:
            raise InferenceError

        infer_default = decoratorsmod.path_wrapper(infer_default)
        infer_end = decoratorsmod.path_wrapper(Slice._infer)
        with self.assertRaises(InferenceError):
            next(infer_default(1))
        self.assertEqual(next(infer_end(1)), 1)


def _assertInferElts(
    node_type: ABCMeta,
    self: InferenceTest,
    node: Any,
    elts: list[int] | list[str],
) -> None:
    inferred = next(node.infer())
    self.assertIsInstance(inferred, node_type)
    self.assertEqual(sorted(elt.value for elt in inferred.elts), elts)


def partialmethod(func, arg):
    """similar to functools.partial but return a lambda instead of
    a class so returned value may be turned into a method.
    """
    return lambda *args, **kwargs: func(arg, *args, **kwargs)


class InferenceTest(resources.SysPathSetup, unittest.TestCase):
    # additional assertInfer* method for builtin types

    def assertInferConst(self, node: nodes.Call, expected: str) -> None:
        inferred = next(node.infer())
        self.assertIsInstance(inferred, nodes.Const)
        self.assertEqual(inferred.value, expected)

    def assertInferDict(
        self, node: nodes.Call | nodes.Dict | nodes.NodeNG, expected: Any
    ) -> None:
        inferred = next(node.infer())
        self.assertIsInstance(inferred, nodes.Dict)
        elts = {(key.value, value.value) for (key, value) in inferred.items}
        self.assertEqual(sorted(elts), sorted(expected.items()))

    assertInferTuple = partialmethod(_assertInferElts, nodes.Tuple)
    assertInferList = partialmethod(_assertInferElts, nodes.List)
    assertInferSet = partialmethod(_assertInferElts, nodes.Set)
    assertInferFrozenSet = partialmethod(_assertInferElts, objects.FrozenSet)

    CODE = """
        class C(object):
            "new style"
            attr = 4

            def meth1(self, arg1, optarg=0):
                var = object()
                print ("yo", arg1, optarg)
                self.iattr = "hop"
                return var

            def meth2(self):
                self.meth1(*self.meth3)

            def meth3(self, d=attr):
                b = self.attr
                c = self.iattr
                return b, c

        ex = Exception("msg")
        v = C().meth1(1)
        m_unbound = C.meth1
        m_bound = C().meth1
        a, b, c = ex, 1, "bonjour"
        [d, e, f] = [ex, 1.0, ("bonjour", v)]
        g, h = f
        i, (j, k) = "glup", f
        a, b= b, a # Gasp !
""" ast = parse(CODE, __name__) def test_arg_keyword_no_default_value(self): node = extract_node( """ class Sensor: def __init__(self, *, description): #@ self._id = description.key """ ) with self.assertRaises(NoDefault): node.args.default_value("description") node = extract_node("def apple(color, *args, name: str, **kwargs): ...") with self.assertRaises(NoDefault): node.args.default_value("name") def test_infer_abstract_property_return_values(self) -> None: module = parse( """ import abc class A(object): @abc.abstractproperty def test(self): return 42 a = A() x = a.test """ ) inferred = next(module["x"].infer()) self.assertIsInstance(inferred, nodes.Const) self.assertEqual(inferred.value, 42) def test_module_inference(self) -> None: inferred = self.ast.infer() obj = next(inferred) self.assertEqual(obj.name, __name__) self.assertEqual(obj.root().name, __name__) self.assertRaises(StopIteration, partial(next, inferred)) def test_class_inference(self) -> None: inferred = self.ast["C"].infer() obj = next(inferred) self.assertEqual(obj.name, "C") self.assertEqual(obj.root().name, __name__) self.assertRaises(StopIteration, partial(next, inferred)) def test_function_inference(self) -> None: inferred = self.ast["C"]["meth1"].infer() obj = next(inferred) self.assertEqual(obj.name, "meth1") self.assertEqual(obj.root().name, __name__) self.assertRaises(StopIteration, partial(next, inferred)) def test_builtin_name_inference(self) -> None: inferred = self.ast["C"]["meth1"]["var"].infer() var = next(inferred) self.assertEqual(var.name, "object") self.assertEqual(var.root().name, "builtins") self.assertRaises(StopIteration, partial(next, inferred)) def test_tupleassign_name_inference(self) -> None: inferred = self.ast["a"].infer() exc = next(inferred) self.assertIsInstance(exc, Instance) self.assertEqual(exc.name, "Exception") self.assertEqual(exc.root().name, "builtins") self.assertRaises(StopIteration, partial(next, inferred)) inferred = self.ast["b"].infer() const = 
next(inferred) self.assertIsInstance(const, nodes.Const) self.assertEqual(const.value, 1) self.assertRaises(StopIteration, partial(next, inferred)) inferred = self.ast["c"].infer() const = next(inferred) self.assertIsInstance(const, nodes.Const) self.assertEqual(const.value, "bonjour") self.assertRaises(StopIteration, partial(next, inferred)) def test_listassign_name_inference(self) -> None: inferred = self.ast["d"].infer() exc = next(inferred) self.assertIsInstance(exc, Instance) self.assertEqual(exc.name, "Exception") self.assertEqual(exc.root().name, "builtins") self.assertRaises(StopIteration, partial(next, inferred)) inferred = self.ast["e"].infer() const = next(inferred) self.assertIsInstance(const, nodes.Const) self.assertEqual(const.value, 1.0) self.assertRaises(StopIteration, partial(next, inferred)) inferred = self.ast["f"].infer() const = next(inferred) self.assertIsInstance(const, nodes.Tuple) self.assertRaises(StopIteration, partial(next, inferred)) def test_advanced_tupleassign_name_inference1(self) -> None: inferred = self.ast["g"].infer() const = next(inferred) self.assertIsInstance(const, nodes.Const) self.assertEqual(const.value, "bonjour") self.assertRaises(StopIteration, partial(next, inferred)) inferred = self.ast["h"].infer() var = next(inferred) self.assertEqual(var.name, "object") self.assertEqual(var.root().name, "builtins") self.assertRaises(StopIteration, partial(next, inferred)) def test_advanced_tupleassign_name_inference2(self) -> None: inferred = self.ast["i"].infer() const = next(inferred) self.assertIsInstance(const, nodes.Const) self.assertEqual(const.value, "glup") self.assertRaises(StopIteration, partial(next, inferred)) inferred = self.ast["j"].infer() const = next(inferred) self.assertIsInstance(const, nodes.Const) self.assertEqual(const.value, "bonjour") self.assertRaises(StopIteration, partial(next, inferred)) inferred = self.ast["k"].infer() var = next(inferred) self.assertEqual(var.name, "object") 
        self.assertEqual(var.root().name, "builtins")
        self.assertRaises(StopIteration, partial(next, inferred))

    def test_swap_assign_inference(self) -> None:
        inferred = self.ast.locals["a"][1].infer()
        const = next(inferred)
        self.assertIsInstance(const, nodes.Const)
        self.assertEqual(const.value, 1)
        self.assertRaises(StopIteration, partial(next, inferred))
        inferred = self.ast.locals["b"][1].infer()
        exc = next(inferred)
        self.assertIsInstance(exc, Instance)
        self.assertEqual(exc.name, "Exception")
        self.assertEqual(exc.root().name, "builtins")
        self.assertRaises(StopIteration, partial(next, inferred))

    def test_getattr_inference1(self) -> None:
        inferred = self.ast["ex"].infer()
        exc = next(inferred)
        self.assertIsInstance(exc, Instance)
        self.assertEqual(exc.name, "Exception")
        self.assertEqual(exc.root().name, "builtins")
        self.assertRaises(StopIteration, partial(next, inferred))

    def test_getattr_inference2(self) -> None:
        inferred = get_node_of_class(self.ast["C"]["meth2"], nodes.Attribute).infer()
        meth1 = next(inferred)
        self.assertEqual(meth1.name, "meth1")
        self.assertEqual(meth1.root().name, __name__)
        self.assertRaises(StopIteration, partial(next, inferred))

    def test_getattr_inference3(self) -> None:
        inferred = self.ast["C"]["meth3"]["b"].infer()
        const = next(inferred)
        self.assertIsInstance(const, nodes.Const)
        self.assertEqual(const.value, 4)
        self.assertRaises(StopIteration, partial(next, inferred))

    def test_getattr_inference4(self) -> None:
        inferred = self.ast["C"]["meth3"]["c"].infer()
        const = next(inferred)
        self.assertIsInstance(const, nodes.Const)
        self.assertEqual(const.value, "hop")
        self.assertRaises(StopIteration, partial(next, inferred))

    def test_callfunc_inference(self) -> None:
        inferred = self.ast["v"].infer()
        meth1 = next(inferred)
        self.assertIsInstance(meth1, Instance)
        self.assertEqual(meth1.name, "object")
        self.assertEqual(meth1.root().name, "builtins")
        self.assertRaises(StopIteration, partial(next, inferred))

    def test_unbound_method_inference(self) -> None:
        inferred = self.ast["m_unbound"].infer()
        meth1 = next(inferred)
        self.assertIsInstance(meth1, UnboundMethod)
        self.assertEqual(meth1.name, "meth1")
        self.assertEqual(meth1.parent.frame().name, "C")
        self.assertEqual(meth1.parent.frame().name, "C")
        self.assertRaises(StopIteration, partial(next, inferred))

    def test_bound_method_inference(self) -> None:
        inferred = self.ast["m_bound"].infer()
        meth1 = next(inferred)
        self.assertIsInstance(meth1, BoundMethod)
        self.assertEqual(meth1.name, "meth1")
        self.assertEqual(meth1.parent.frame().name, "C")
        self.assertEqual(meth1.parent.frame().name, "C")
        self.assertRaises(StopIteration, partial(next, inferred))

    def test_args_default_inference1(self) -> None:
        optarg = test_utils.get_name_node(self.ast["C"]["meth1"], "optarg")
        inferred = optarg.infer()
        obj1 = next(inferred)
        self.assertIsInstance(obj1, nodes.Const)
        self.assertEqual(obj1.value, 0)
        obj1 = next(inferred)
        self.assertIs(obj1, util.Uninferable, obj1)
        self.assertRaises(StopIteration, partial(next, inferred))

    def test_args_default_inference2(self) -> None:
        inferred = self.ast["C"]["meth3"].ilookup("d")
        obj1 = next(inferred)
        self.assertIsInstance(obj1, nodes.Const)
        self.assertEqual(obj1.value, 4)
        obj1 = next(inferred)
        self.assertIs(obj1, util.Uninferable, obj1)
        self.assertRaises(StopIteration, partial(next, inferred))

    def test_inference_restrictions(self) -> None:
        inferred = test_utils.get_name_node(self.ast["C"]["meth1"], "arg1").infer()
        obj1 = next(inferred)
        self.assertIs(obj1, util.Uninferable, obj1)
        self.assertRaises(StopIteration, partial(next, inferred))

    def test_ancestors_inference(self) -> None:
        code = """
            class A(object):  #@
                pass

            class A(A):  #@
                pass
        """
        a1, a2 = extract_node(code, __name__)
        a2_ancestors = list(a2.ancestors())
        self.assertEqual(len(a2_ancestors), 2)
        self.assertIs(a2_ancestors[0], a1)

    def test_ancestors_inference2(self) -> None:
        code = """
            class A(object):  #@
                pass

            class B(A):  #@
                pass

            class A(B):  #@
                pass
        """
        a1, b, a2 = extract_node(code, __name__)
        a2_ancestors = list(a2.ancestors())
        self.assertEqual(len(a2_ancestors), 3)
        self.assertIs(a2_ancestors[0], b)
        self.assertIs(a2_ancestors[1], a1)

    def test_f_arg_f(self) -> None:
        code = """
            def f(f=1):
                return f

            a = f()
        """
        ast = parse(code, __name__)
        a = ast["a"]
        a_inferred = a.inferred()
        self.assertEqual(a_inferred[0].value, 1)
        self.assertEqual(len(a_inferred), 1)

    def test_exc_ancestors(self) -> None:
        code = """
        def f():
            raise __(NotImplementedError)
        """
        error = extract_node(code, __name__)
        nie = error.inferred()[0]
        self.assertIsInstance(nie, nodes.ClassDef)
        nie_ancestors = [c.name for c in nie.ancestors()]
        expected = ["RuntimeError", "Exception", "BaseException", "object"]
        self.assertEqual(nie_ancestors, expected)

    def test_except_inference(self) -> None:
        code = '''
            try:
                print (hop)
            except NameError as ex:
                ex1 = ex
            except Exception as ex:
                ex2 = ex
                raise
        '''
        ast = parse(code, __name__)
        ex1 = ast["ex1"]
        ex1_infer = ex1.infer()
        ex1 = next(ex1_infer)
        self.assertIsInstance(ex1, Instance)
        self.assertEqual(ex1.name, "NameError")
        self.assertRaises(StopIteration, partial(next, ex1_infer))
        ex2 = ast["ex2"]
        ex2_infer = ex2.infer()
        ex2 = next(ex2_infer)
        self.assertIsInstance(ex2, Instance)
        self.assertEqual(ex2.name, "Exception")
        self.assertRaises(StopIteration, partial(next, ex2_infer))

    def test_del1(self) -> None:
        code = """
            del undefined_attr
        """
        delete = extract_node(code, __name__)
        self.assertRaises(InferenceError, next, delete.infer())

    def test_del2(self) -> None:
        code = """
            a = 1
            b = a
            del a
            c = a
            a = 2
            d = a
        """
        ast = parse(code, __name__)
        n = ast["b"]
        n_infer = n.infer()
        inferred = next(n_infer)
        self.assertIsInstance(inferred, nodes.Const)
        self.assertEqual(inferred.value, 1)
        self.assertRaises(StopIteration, partial(next, n_infer))
        n = ast["c"]
        n_infer = n.infer()
        self.assertRaises(InferenceError, partial(next, n_infer))
        n = ast["d"]
        n_infer = n.infer()
        inferred = next(n_infer)
        self.assertIsInstance(inferred, nodes.Const)
        self.assertEqual(inferred.value, 2)
        self.assertRaises(StopIteration, partial(next, n_infer))

    def test_builtin_types(self) -> None:
        code = """
            l = [1]
            t = (2,)
            d = {}
            s = ''
            s2 = '_'
        """
        ast = parse(code, __name__)
        n = ast["l"]
        inferred = next(n.infer())
        self.assertIsInstance(inferred, nodes.List)
        self.assertIsInstance(inferred, Instance)
        self.assertEqual(inferred.getitem(nodes.Const(0)).value, 1)
        self.assertIsInstance(inferred._proxied, nodes.ClassDef)
        self.assertEqual(inferred._proxied.name, "list")
        self.assertIn("append", inferred._proxied.locals)
        n = ast["t"]
        inferred = next(n.infer())
        self.assertIsInstance(inferred, nodes.Tuple)
        self.assertIsInstance(inferred, Instance)
        self.assertEqual(inferred.getitem(nodes.Const(0)).value, 2)
        self.assertIsInstance(inferred._proxied, nodes.ClassDef)
        self.assertEqual(inferred._proxied.name, "tuple")
        n = ast["d"]
        inferred = next(n.infer())
        self.assertIsInstance(inferred, nodes.Dict)
        self.assertIsInstance(inferred, Instance)
        self.assertIsInstance(inferred._proxied, nodes.ClassDef)
        self.assertEqual(inferred._proxied.name, "dict")
        self.assertIn("get", inferred._proxied.locals)
        n = ast["s"]
        inferred = next(n.infer())
        self.assertIsInstance(inferred, nodes.Const)
        self.assertIsInstance(inferred, Instance)
        self.assertEqual(inferred.name, "str")
        self.assertIn("lower", inferred._proxied.locals)
        n = ast["s2"]
        inferred = next(n.infer())
        self.assertEqual(inferred.getitem(nodes.Const(0)).value, "_")
        code = "s = {1}"
        ast = parse(code, __name__)
        n = ast["s"]
        inferred = next(n.infer())
        self.assertIsInstance(inferred, nodes.Set)
        self.assertIsInstance(inferred, Instance)
        self.assertEqual(inferred.name, "set")
        self.assertIn("remove", inferred._proxied.locals)

    @pytest.mark.xfail(reason="Descriptors are not properly inferred as callable")
    def test_descriptor_are_callable(self):
        code = """
            class A:
                statm = staticmethod(open)
                clsm = classmethod('whatever')
        """
        ast = parse(code, __name__)
        statm = next(ast["A"].igetattr("statm"))
        self.assertTrue(statm.callable())
        clsm = next(ast["A"].igetattr("clsm"))
        self.assertFalse(clsm.callable())

    def test_bt_ancestor_crash(self) -> None:
        code = """
            class Warning(Warning):
                pass
        """
        ast = parse(code, __name__)
        w = ast["Warning"]
        ancestors = w.ancestors()
        ancestor = next(ancestors)
        self.assertEqual(ancestor.name, "Warning")
        self.assertEqual(ancestor.root().name, "builtins")
        ancestor = next(ancestors)
        self.assertEqual(ancestor.name, "Exception")
        self.assertEqual(ancestor.root().name, "builtins")
        ancestor = next(ancestors)
        self.assertEqual(ancestor.name, "BaseException")
        self.assertEqual(ancestor.root().name, "builtins")
        ancestor = next(ancestors)
        self.assertEqual(ancestor.name, "object")
        self.assertEqual(ancestor.root().name, "builtins")
        self.assertRaises(StopIteration, partial(next, ancestors))

    def test_method_argument(self) -> None:
        code = '''
            class ErudiEntitySchema:
                """an entity has a type, a set of subject and or object relations"""
                def __init__(self, e_type, **kwargs):
                    kwargs['e_type'] = e_type.capitalize().encode()

                def meth(self, e_type, *args, **kwargs):
                    kwargs['e_type'] = e_type.capitalize().encode()
                    print(args)
        '''
        ast = parse(code, __name__)
        arg = test_utils.get_name_node(ast["ErudiEntitySchema"]["__init__"], "e_type")
        self.assertEqual(
            [n.__class__ for n in arg.infer()], [util.Uninferable.__class__]
        )
        arg = test_utils.get_name_node(ast["ErudiEntitySchema"]["__init__"], "kwargs")
        self.assertEqual([n.__class__ for n in arg.infer()], [nodes.Dict])
        arg = test_utils.get_name_node(ast["ErudiEntitySchema"]["meth"], "e_type")
        self.assertEqual(
            [n.__class__ for n in arg.infer()], [util.Uninferable.__class__]
        )
        arg = test_utils.get_name_node(ast["ErudiEntitySchema"]["meth"], "args")
        self.assertEqual([n.__class__ for n in arg.infer()], [nodes.Tuple])
        arg = test_utils.get_name_node(ast["ErudiEntitySchema"]["meth"], "kwargs")
        self.assertEqual([n.__class__ for n in arg.infer()], [nodes.Dict])

    def test_tuple_then_list(self) -> None:
        code = """
        def test_view(rql, vid, tags=()):
            tags = list(tags)
__(tags).append(vid) """ name = extract_node(code, __name__) it = name.infer() tags = next(it) self.assertIsInstance(tags, nodes.List) self.assertEqual(tags.elts, []) with self.assertRaises(StopIteration): next(it) def test_mulassign_inference(self) -> None: code = ''' def first_word(line): """Return the first word of a line""" return line.split()[0] def last_word(line): """Return last word of a line""" return line.split()[-1] def process_line(word_pos): """Silly function: returns (ok, callable) based on argument. For test purpose only. """ if word_pos > 0: return (True, first_word) elif word_pos < 0: return (True, last_word) else: return (False, None) if __name__ == '__main__': line_number = 0 for a_line in file('test_callable.py'): tupletest = process_line(line_number) (ok, fct) = process_line(line_number) if ok: fct(a_line) ''' ast = parse(code, __name__) self.assertEqual(len(list(ast["process_line"].infer_call_result(None))), 3) self.assertEqual(len(list(ast["tupletest"].infer())), 3) values = [ " None: code = ''' def no_conjugate_member(magic_flag): #@ """should not raise E1101 on something.conjugate""" if magic_flag: something = 1.0 else: something = 1.0j if isinstance(something, float): return something return __(something).conjugate() ''' func, retval = extract_node(code, __name__) self.assertEqual([i.value for i in func.ilookup("something")], [1.0, 1.0j]) self.assertEqual([i.value for i in retval.infer()], [1.0, 1.0j]) def test_lookup_cond_branches(self) -> None: code = ''' def no_conjugate_member(magic_flag): """should not raise E1101 on something.conjugate""" something = 1.0 if magic_flag: something = 1.0j return something.conjugate() ''' ast = parse(code, __name__) values = [ i.value for i in test_utils.get_name_node(ast, "something", -1).infer() ] self.assertEqual(values, [1.0, 1.0j]) def test_simple_subscript(self) -> None: code = """ class A(object): def __getitem__(self, index): return index + 42 [1, 2, 3][0] #@ (1, 2, 3)[1] #@ (1, 2, 3)[-1] #@ [1, 
2, 3][0] + (2, )[0] + (3, )[-1] #@ e = {'key': 'value'} e['key'] #@ "first"[0] #@ list([1, 2, 3])[-1] #@ tuple((4, 5, 6))[2] #@ A()[0] #@ A()[-1] #@ """ ast_nodes = extract_node(code, __name__) expected = [1, 2, 3, 6, "value", "f", 3, 6, 42, 41] for node, expected_value in zip(ast_nodes, expected): inferred = next(node.infer()) self.assertIsInstance(inferred, nodes.Const) self.assertEqual(inferred.value, expected_value) def test_invalid_subscripts(self) -> None: ast_nodes = extract_node( """ class NoGetitem(object): pass class InvalidGetitem(object): def __getitem__(self): pass class InvalidGetitem2(object): __getitem__ = 42 NoGetitem()[4] #@ InvalidGetitem()[5] #@ InvalidGetitem2()[10] #@ [1, 2, 3][None] #@ 'lala'['bala'] #@ """ ) for node in ast_nodes: self.assertRaises(InferenceError, next, node.infer()) def test_bytes_subscript(self) -> None: node = extract_node("""b'a'[0]""") inferred = next(node.infer()) self.assertIsInstance(inferred, nodes.Const) self.assertEqual(inferred.value, 97) def test_subscript_multi_value(self) -> None: code = """ def do_thing_with_subscript(magic_flag): src = [3, 2, 1] if magic_flag: src = [1, 2, 3] something = src[0] return something """ ast = parse(code, __name__) values = [ i.value for i in test_utils.get_name_node(ast, "something", -1).infer() ] self.assertEqual(list(sorted(values)), [1, 3]) def test_subscript_multi_slice(self) -> None: code = """ def zero_or_one(magic_flag): if magic_flag: return 1 return 0 def do_thing_with_subscript(magic_flag): src = [3, 2, 1] index = zero_or_one(magic_flag) something = src[index] return something """ ast = parse(code, __name__) values = [ i.value for i in test_utils.get_name_node(ast, "something", -1).infer() ] self.assertEqual(list(sorted(values)), [2, 3]) def test_simple_tuple(self) -> None: module = parse( """ a = (1,) b = (22,) some = a + b #@ """ ) ast = next(module["some"].infer()) self.assertIsInstance(ast, nodes.Tuple) self.assertEqual(len(ast.elts), 2) 
self.assertEqual(ast.elts[0].value, 1) self.assertEqual(ast.elts[1].value, 22) def test_simple_for(self) -> None: code = """ for a in [1, 2, 3]: print (a) for b,c in [(1,2), (3,4)]: print (b) print (c) print ([(d,e) for e,d in ([1,2], [3,4])]) """ ast = parse(code, __name__) self.assertEqual( [i.value for i in test_utils.get_name_node(ast, "a", -1).infer()], [1, 2, 3] ) self.assertEqual( [i.value for i in test_utils.get_name_node(ast, "b", -1).infer()], [1, 3] ) self.assertEqual( [i.value for i in test_utils.get_name_node(ast, "c", -1).infer()], [2, 4] ) self.assertEqual( [i.value for i in test_utils.get_name_node(ast, "d", -1).infer()], [2, 4] ) self.assertEqual( [i.value for i in test_utils.get_name_node(ast, "e", -1).infer()], [1, 3] ) def test_simple_for_genexpr(self) -> None: code = """ print ((d,e) for e,d in ([1,2], [3,4])) """ ast = parse(code, __name__) self.assertEqual( [i.value for i in test_utils.get_name_node(ast, "d", -1).infer()], [2, 4] ) self.assertEqual( [i.value for i in test_utils.get_name_node(ast, "e", -1).infer()], [1, 3] ) def test_for_dict(self) -> None: code = """ for a, b in {1: 2, 3: 4}.items(): print(a) print(b) for c, (d, e) in {1: (2, 3), 4: (5, 6)}.items(): print(c) print(d) print(e) print([(f, g, h) for f, (g, h) in {1: (2, 3), 4: (5, 6)}.items()]) """ ast = parse(code, __name__) self.assertEqual( [i.value for i in test_utils.get_name_node(ast, "a", -1).infer()], [1, 3] ) self.assertEqual( [i.value for i in test_utils.get_name_node(ast, "b", -1).infer()], [2, 4] ) self.assertEqual( [i.value for i in test_utils.get_name_node(ast, "c", -1).infer()], [1, 4] ) self.assertEqual( [i.value for i in test_utils.get_name_node(ast, "d", -1).infer()], [2, 5] ) self.assertEqual( [i.value for i in test_utils.get_name_node(ast, "e", -1).infer()], [3, 6] ) self.assertEqual( [i.value for i in test_utils.get_name_node(ast, "f", -1).infer()], [1, 4] ) self.assertEqual( [i.value for i in test_utils.get_name_node(ast, "g", -1).infer()], [2, 5] ) 
self.assertEqual( [i.value for i in test_utils.get_name_node(ast, "h", -1).infer()], [3, 6] ) def test_builtin_help(self) -> None: code = """ help() """ # XXX failing since __builtin__.help assignment has # been moved into a function... node = extract_node(code, __name__) inferred = list(node.func.infer()) self.assertEqual(len(inferred), 1, inferred) self.assertIsInstance(inferred[0], Instance) self.assertEqual(inferred[0].name, "_Helper") def test_builtin_open(self) -> None: code = """ open("toto.txt") """ node = extract_node(code, __name__).func inferred = list(node.infer()) self.assertEqual(len(inferred), 1) self.assertIsInstance(inferred[0], nodes.FunctionDef) self.assertEqual(inferred[0].name, "open") def test_callfunc_context_func(self) -> None: code = """ def mirror(arg=None): return arg un = mirror(1) """ ast = parse(code, __name__) inferred = list(ast.igetattr("un")) self.assertEqual(len(inferred), 1) self.assertIsInstance(inferred[0], nodes.Const) self.assertEqual(inferred[0].value, 1) def test_callfunc_context_lambda(self) -> None: code = """ mirror = lambda x=None: x un = mirror(1) """ ast = parse(code, __name__) inferred = list(ast.igetattr("mirror")) self.assertEqual(len(inferred), 1) self.assertIsInstance(inferred[0], nodes.Lambda) inferred = list(ast.igetattr("un")) self.assertEqual(len(inferred), 1) self.assertIsInstance(inferred[0], nodes.Const) self.assertEqual(inferred[0].value, 1) def test_factory_method(self) -> None: code = """ class Super(object): @classmethod def instance(cls): return cls() class Sub(Super): def method(self): print ('method called') sub = Sub.instance() """ ast = parse(code, __name__) inferred = list(ast.igetattr("sub")) self.assertEqual(len(inferred), 1) self.assertIsInstance(inferred[0], Instance) self.assertEqual(inferred[0]._proxied.name, "Sub") def test_factory_methods_cls_call(self) -> None: ast = extract_node( """ class C: @classmethod def factory(cls): return cls() class D(C): pass C.factory() #@ D.factory() #@ """, 
"module", ) should_be_c = list(ast[0].infer()) should_be_d = list(ast[1].infer()) self.assertEqual(1, len(should_be_c)) self.assertEqual(1, len(should_be_d)) self.assertEqual("module.C", should_be_c[0].qname()) self.assertEqual("module.D", should_be_d[0].qname()) def test_factory_methods_object_new_call(self) -> None: ast = extract_node( """ class C: @classmethod def factory(cls): return object.__new__(cls) class D(C): pass C.factory() #@ D.factory() #@ """, "module", ) should_be_c = list(ast[0].infer()) should_be_d = list(ast[1].infer()) self.assertEqual(1, len(should_be_c)) self.assertEqual(1, len(should_be_d)) self.assertEqual("module.C", should_be_c[0].qname()) self.assertEqual("module.D", should_be_d[0].qname()) @pytest.mark.xfail( reason="pathlib.Path cannot be inferred on Python 3.8", ) def test_factory_methods_inside_binary_operation(self): node = extract_node( """ from pathlib import Path h = Path("/home") u = h / "user" u #@ """ ) assert next(node.infer()).qname() == "pathlib.Path" def test_import_as(self) -> None: code = """ import os.path as osp print (osp.dirname(__file__)) from os.path import exists as e assert e(__file__) """ ast = parse(code, __name__) inferred = list(ast.igetattr("osp")) self.assertEqual(len(inferred), 1) self.assertIsInstance(inferred[0], nodes.Module) self.assertEqual(inferred[0].name, "os.path") inferred = list(ast.igetattr("e")) if PY312_PLUS and sys.platform.startswith("win"): # There are two os.path.exists exported, likely due to # https://github.com/python/cpython/pull/101324 self.assertEqual(len(inferred), 2) else: self.assertEqual(len(inferred), 1) self.assertIsInstance(inferred[0], nodes.FunctionDef) self.assertEqual(inferred[0].name, "exists") def test_do_import_module_performance(self) -> None: import_node = extract_node("import importlib") import_node.modname = "" import_node.do_import_module() # calling file_from_module_name() indicates we didn't hit the cache with unittest.mock.patch.object( manager.AstroidManager, 
"file_from_module_name", side_effect=AssertionError ): import_node.do_import_module() def _test_const_inferred(self, node: nodes.AssignName, value: float | str) -> None: inferred = list(node.infer()) self.assertEqual(len(inferred), 1) self.assertIsInstance(inferred[0], nodes.Const) self.assertEqual(inferred[0].value, value) def test_unary_not(self) -> None: for code in ( "a = not (1,); b = not ()", "a = not {1:2}; b = not {}", "a = not [1, 2]; b = not []", "a = not {1, 2}; b = not set()", "a = not 1; b = not 0", 'a = not "a"; b = not ""', 'a = not b"a"; b = not b""', ): ast = builder.string_build(code, __name__, __file__) self._test_const_inferred(ast["a"], False) self._test_const_inferred(ast["b"], True) def test_unary_op_numbers(self) -> None: ast_nodes = extract_node( """ +1 #@ -1 #@ ~1 #@ +2.0 #@ -2.0 #@ """ ) expected = [1, -1, -2, 2.0, -2.0] for node, expected_value in zip(ast_nodes, expected): inferred = next(node.infer()) self.assertEqual(inferred.value, expected_value) def test_matmul(self) -> None: node = extract_node( """ class Array: def __matmul__(self, other): return 42 Array() @ Array() #@ """ ) inferred = next(node.infer()) self.assertIsInstance(inferred, nodes.Const) self.assertEqual(inferred.value, 42) def test_binary_op_int_add(self) -> None: ast = builder.string_build("a = 1 + 2", __name__, __file__) self._test_const_inferred(ast["a"], 3) def test_binary_op_int_sub(self) -> None: ast = builder.string_build("a = 1 - 2", __name__, __file__) self._test_const_inferred(ast["a"], -1) def test_binary_op_float_div(self) -> None: ast = builder.string_build("a = 1 / 2.", __name__, __file__) self._test_const_inferred(ast["a"], 1 / 2.0) def test_binary_op_str_mul(self) -> None: ast = builder.string_build('a = "*" * 40', __name__, __file__) self._test_const_inferred(ast["a"], "*" * 40) def test_binary_op_int_bitand(self) -> None: ast = builder.string_build("a = 23&20", __name__, __file__) self._test_const_inferred(ast["a"], 23 & 20) def 
test_binary_op_int_bitor(self) -> None: ast = builder.string_build("a = 23|8", __name__, __file__) self._test_const_inferred(ast["a"], 23 | 8) def test_binary_op_int_bitxor(self) -> None: ast = builder.string_build("a = 23^9", __name__, __file__) self._test_const_inferred(ast["a"], 23 ^ 9) def test_binary_op_int_shiftright(self) -> None: ast = builder.string_build("a = 23 >>1", __name__, __file__) self._test_const_inferred(ast["a"], 23 >> 1) def test_binary_op_int_shiftleft(self) -> None: ast = builder.string_build("a = 23 <<1", __name__, __file__) self._test_const_inferred(ast["a"], 23 << 1) def test_binary_op_other_type(self) -> None: ast_nodes = extract_node( """ class A: def __add__(self, other): return other + 42 A() + 1 #@ 1 + A() #@ """ ) assert isinstance(ast_nodes, list) first = next(ast_nodes[0].infer()) self.assertIsInstance(first, nodes.Const) self.assertEqual(first.value, 43) second = next(ast_nodes[1].infer()) self.assertEqual(second, util.Uninferable) def test_binary_op_other_type_using_reflected_operands(self) -> None: ast_nodes = extract_node( """ class A(object): def __radd__(self, other): return other + 42 A() + 1 #@ 1 + A() #@ """ ) assert isinstance(ast_nodes, list) first = next(ast_nodes[0].infer()) self.assertEqual(first, util.Uninferable) second = next(ast_nodes[1].infer()) self.assertIsInstance(second, nodes.Const) self.assertEqual(second.value, 43) def test_binary_op_reflected_and_not_implemented_is_type_error(self) -> None: ast_node = extract_node( """ class A(object): def __radd__(self, other): return NotImplemented 1 + A() #@ """ ) first = next(ast_node.infer()) self.assertEqual(first, util.Uninferable) @pytest.mark.filterwarnings("error::DeprecationWarning") def test_binary_op_not_used_in_boolean_context(self) -> None: ast_node = extract_node("not NotImplemented") first = next(ast_node.infer()) self.assertIsInstance(first, nodes.Const) def test_binary_op_list_mul(self) -> None: for code in ("a = [[]] * 2", "a = 2 * [[]]"): ast = 
builder.string_build(code, __name__, __file__) inferred = list(ast["a"].infer()) self.assertEqual(len(inferred), 1) self.assertIsInstance(inferred[0], nodes.List) self.assertEqual(len(inferred[0].elts), 2) self.assertIsInstance(inferred[0].elts[0], nodes.List) self.assertIsInstance(inferred[0].elts[1], nodes.List) def test_binary_op_list_mul_none(self) -> None: """Test correct handling on list multiplied by None.""" ast = builder.string_build('a = [1] * None\nb = [1] * "r"') inferred = ast["a"].inferred() self.assertEqual(len(inferred), 1) self.assertEqual(inferred[0], util.Uninferable) inferred = ast["b"].inferred() self.assertEqual(len(inferred), 1) self.assertEqual(inferred[0], util.Uninferable) def test_binary_op_list_mul_int(self) -> None: """Test correct handling on list multiplied by int when there are more than one.""" code = """ from ctypes import c_int seq = [c_int()] * 4 """ ast = parse(code, __name__) inferred = ast["seq"].inferred() self.assertEqual(len(inferred), 1) listval = inferred[0] self.assertIsInstance(listval, nodes.List) self.assertEqual(len(listval.itered()), 4) def test_binary_op_on_self(self) -> None: """Test correct handling of applying binary operator to self.""" code = """ import sys sys.path = ['foo'] + sys.path sys.path.insert(0, 'bar') path = sys.path """ ast = parse(code, __name__) inferred = ast["path"].inferred() self.assertIsInstance(inferred[0], nodes.List) def test_binary_op_tuple_add(self) -> None: ast = builder.string_build("a = (1,) + (2,)", __name__, __file__) inferred = list(ast["a"].infer()) self.assertEqual(len(inferred), 1) self.assertIsInstance(inferred[0], nodes.Tuple) self.assertEqual(len(inferred[0].elts), 2) self.assertEqual(inferred[0].elts[0].value, 1) self.assertEqual(inferred[0].elts[1].value, 2) def test_binary_op_custom_class(self) -> None: code = """ class myarray: def __init__(self, array): self.array = array def __mul__(self, x): return myarray([2,4,6]) def astype(self): return "ASTYPE" def 
randint(maximum): if maximum is not None: return myarray([1,2,3]) * 2 else: return int(5) x = randint(1) """ ast = parse(code, __name__) inferred = list(ast.igetattr("x")) self.assertEqual(len(inferred), 2) value = [str(v) for v in inferred] # The __name__ trick here makes it work when invoked directly # (__name__ == '__main__') and through pytest (__name__ == # 'unittest_inference') self.assertEqual( value, [ f"Instance of {__name__}.myarray", "Const.int(value=5,\n kind=None)", ], ) def test_binary_op_or_union_type(self) -> None: """Binary or union is only defined for Python 3.10+.""" code = """ class A: ... int | 2 #@ int | "Hello" #@ int | ... #@ int | A() #@ int | None | 2 #@ """ ast_nodes = extract_node(code) for n in ast_nodes: assert n.inferred() == [util.Uninferable] code = """ from typing import List class A: ... class B: ... int | None #@ int | str #@ int | str | None #@ A | B #@ A | None #@ List[int] | int #@ tuple | int #@ """ ast_nodes = extract_node(code) if not PY310_PLUS: for n in ast_nodes: assert n.inferred() == [util.Uninferable] else: i0 = ast_nodes[0].inferred()[0] assert isinstance(i0, UnionType) assert isinstance(i0.left, nodes.ClassDef) assert i0.left.name == "int" assert isinstance(i0.right, nodes.Const) assert i0.right.value is None # Assert basic UnionType properties and methods assert i0.callable() is False assert i0.bool_value() is True assert i0.pytype() == "types.UnionType" assert i0.display_type() == "UnionType" assert str(i0) == "UnionType(UnionType)" assert repr(i0) == f"" i1 = ast_nodes[1].inferred()[0] assert isinstance(i1, UnionType) i2 = ast_nodes[2].inferred()[0] assert isinstance(i2, UnionType) assert isinstance(i2.left, UnionType) assert isinstance(i2.left.left, nodes.ClassDef) assert i2.left.left.name == "int" assert isinstance(i2.left.right, nodes.ClassDef) assert i2.left.right.name == "str" assert isinstance(i2.right, nodes.Const) assert i2.right.value is None i3 = ast_nodes[3].inferred()[0] assert isinstance(i3, 
UnionType) assert isinstance(i3.left, nodes.ClassDef) assert i3.left.name == "A" assert isinstance(i3.right, nodes.ClassDef) assert i3.right.name == "B" i4 = ast_nodes[4].inferred()[0] assert isinstance(i4, UnionType) i5 = ast_nodes[5].inferred()[0] assert isinstance(i5, UnionType) assert isinstance(i5.left, nodes.ClassDef) assert i5.left.name == "List" i6 = ast_nodes[6].inferred()[0] assert isinstance(i6, UnionType) assert isinstance(i6.left, nodes.ClassDef) assert i6.left.name == "tuple" code = """ from typing import List Alias1 = List[int] Alias2 = str | int Alias1 | int #@ Alias2 | int #@ Alias1 | Alias2 #@ """ ast_nodes = extract_node(code) if not PY310_PLUS: for n in ast_nodes: assert n.inferred() == [util.Uninferable] else: i0 = ast_nodes[0].inferred()[0] assert isinstance(i0, UnionType) assert isinstance(i0.left, nodes.ClassDef) assert i0.left.name == "List" i1 = ast_nodes[1].inferred()[0] assert isinstance(i1, UnionType) assert isinstance(i1.left, UnionType) assert isinstance(i1.left.left, nodes.ClassDef) assert i1.left.left.name == "str" i2 = ast_nodes[2].inferred()[0] assert isinstance(i2, UnionType) assert isinstance(i2.left, nodes.ClassDef) assert i2.left.name == "List" assert isinstance(i2.right, UnionType) def test_nonregr_lambda_arg(self) -> None: code = """ def f(g = lambda: None): __(g()).x """ callfuncnode = extract_node(code) inferred = list(callfuncnode.infer()) self.assertEqual(len(inferred), 2, inferred) inferred.remove(util.Uninferable) self.assertIsInstance(inferred[0], nodes.Const) self.assertIsNone(inferred[0].value) def test_nonregr_getitem_empty_tuple(self) -> None: code = """ def f(x): a = ()[x] """ ast = parse(code, __name__) inferred = list(ast["f"].ilookup("a")) self.assertEqual(len(inferred), 1) self.assertEqual(inferred[0], util.Uninferable) def test_nonregr_instance_attrs(self) -> None: """Non regression for instance_attrs infinite loop : pylint / #4.""" code = """ class Foo(object): def set_42(self): self.attr = 42 class 
Bar(Foo): def __init__(self): self.attr = 41 """ ast = parse(code, __name__) foo_class = ast["Foo"] bar_class = ast["Bar"] bar_self = ast["Bar"]["__init__"]["self"] assattr = bar_class.instance_attrs["attr"][0] self.assertEqual(len(foo_class.instance_attrs["attr"]), 1) self.assertEqual(len(bar_class.instance_attrs["attr"]), 1) self.assertEqual(bar_class.instance_attrs, {"attr": [assattr]}) # call 'instance_attr' via 'Instance.getattr' to trigger the bug: instance = bar_self.inferred()[0] instance.getattr("attr") self.assertEqual(len(bar_class.instance_attrs["attr"]), 1) self.assertEqual(len(foo_class.instance_attrs["attr"]), 1) self.assertEqual(bar_class.instance_attrs, {"attr": [assattr]}) def test_nonregr_multi_referential_addition(self) -> None: """Regression test for https://github.com/pylint-dev/astroid/issues/483 Make sure issue where referring to the same variable in the same inferred expression caused an uninferable result. """ code = """ b = 1 a = b + b a #@ """ variable_a = extract_node(code) self.assertEqual(variable_a.inferred()[0].value, 2) def test_nonregr_layed_dictunpack(self) -> None: """Regression test for https://github.com/pylint-dev/astroid/issues/483 Make sure multiple dictunpack references are inferable. """ code = """ base = {'data': 0} new = {**base, 'data': 1} new3 = {**base, **new} new3 #@ """ ass = extract_node(code) self.assertIsInstance(ass.inferred()[0], nodes.Dict) def test_nonregr_inference_modifying_col_offset(self) -> None: """Make sure inference doesn't improperly modify col_offset. Regression test for https://github.com/pylint-dev/pylint/issues/1839 """ code = """ class F: def _(self): return type(self).f """ mod = parse(code) cdef = mod.body[0] call = cdef.body[0].body[0].value.expr orig_offset = cdef.col_offset call.inferred() self.assertEqual(cdef.col_offset, orig_offset) def test_no_runtime_error_in_repeat_inference(self) -> None: """Stop repeat inference attempt causing a RuntimeError in Python3.7. 
See https://github.com/pylint-dev/pylint/issues/2317 """ code = """ class ContextMixin: def get_context_data(self, **kwargs): return kwargs class DVM(ContextMixin): def get_context_data(self, **kwargs): ctx = super().get_context_data(**kwargs) return ctx class IFDVM(DVM): def get_context_data(self, **kwargs): ctx = super().get_context_data(**kwargs) ctx['bar'] = 'foo' ctx #@ return ctx """ node = extract_node(code) assert isinstance(node, nodes.NodeNG) results = node.inferred() assert len(results) == 2 assert all(isinstance(result, nodes.Dict) for result in results) def test_name_repeat_inference(self) -> None: node = extract_node("print") context = InferenceContext() _ = next(node.infer(context=context)) with pytest.raises(InferenceError): next(node.infer(context=context)) def test_python25_no_relative_import(self) -> None: ast = resources.build_file("data/package/absimport.py") self.assertTrue(ast.absolute_import_activated(), True) inferred = next( test_utils.get_name_node(ast, "import_package_subpackage_module").infer() ) # failed to import since absolute_import is activated self.assertIs(inferred, util.Uninferable) def test_nonregr_absolute_import(self) -> None: ast = resources.build_file("data/absimp/string.py", "data.absimp.string") self.assertTrue(ast.absolute_import_activated(), True) inferred = next(test_utils.get_name_node(ast, "string").infer()) self.assertIsInstance(inferred, nodes.Module) self.assertEqual(inferred.name, "string") self.assertIn("ascii_letters", inferred.locals) def test_property(self) -> None: code = """ from smtplib import SMTP class SendMailController(object): @property def smtp(self): return SMTP(mailhost, port) @property def me(self): return self my_smtp = SendMailController().smtp my_me = SendMailController().me """ decorators = {"builtins.property"} ast = parse(code, __name__) self.assertEqual(ast["SendMailController"]["smtp"].decoratornames(), decorators) propinferred = list(ast.body[2].value.infer()) 
self.assertEqual(len(propinferred), 1) propinferred = propinferred[0] self.assertIsInstance(propinferred, Instance) self.assertEqual(propinferred.name, "SMTP") self.assertEqual(propinferred.root().name, "smtplib") self.assertEqual(ast["SendMailController"]["me"].decoratornames(), decorators) propinferred = list(ast.body[3].value.infer()) self.assertEqual(len(propinferred), 1) propinferred = propinferred[0] self.assertIsInstance(propinferred, Instance) self.assertEqual(propinferred.name, "SendMailController") self.assertEqual(propinferred.root().name, __name__) def test_im_func_unwrap(self) -> None: code = """ class EnvBasedTC: def pactions(self): pass pactions = EnvBasedTC.pactions.im_func print (pactions) class EnvBasedTC2: pactions = EnvBasedTC.pactions.im_func print (pactions) """ ast = parse(code, __name__) pactions = test_utils.get_name_node(ast, "pactions") inferred = list(pactions.infer()) self.assertEqual(len(inferred), 1) self.assertIsInstance(inferred[0], nodes.FunctionDef) pactions = test_utils.get_name_node(ast["EnvBasedTC2"], "pactions") inferred = list(pactions.infer()) self.assertEqual(len(inferred), 1) self.assertIsInstance(inferred[0], nodes.FunctionDef) def test_augassign(self) -> None: code = """ a = 1 a += 2 print (a) """ ast = parse(code, __name__) inferred = list(test_utils.get_name_node(ast, "a").infer()) self.assertEqual(len(inferred), 1) self.assertIsInstance(inferred[0], nodes.Const) self.assertEqual(inferred[0].value, 3) def test_nonregr_func_arg(self) -> None: code = """ def foo(self, bar): def baz(): pass def qux(): return baz spam = bar(None, qux) print (spam) """ ast = parse(code, __name__) inferred = list(test_utils.get_name_node(ast["foo"], "spam").infer()) self.assertEqual(len(inferred), 1) self.assertIs(inferred[0], util.Uninferable) def test_nonregr_func_global(self) -> None: code = """ active_application = None def get_active_application(): global active_application return active_application class Application(object): def 
__init__(self): global active_application active_application = self class DataManager(object): def __init__(self, app=None): self.app = get_active_application() def test(self): p = self.app print (p) """ ast = parse(code, __name__) inferred = list(Instance(ast["DataManager"]).igetattr("app")) self.assertEqual(len(inferred), 2, inferred) # None / Instance(Application) inferred = list( test_utils.get_name_node(ast["DataManager"]["test"], "p").infer() ) self.assertEqual(len(inferred), 2, inferred) for node in inferred: if isinstance(node, Instance) and node.name == "Application": break else: self.fail(f"expected to find an instance of Application in {inferred}") def test_list_inference(self) -> None: code = """ from unknown import Unknown A = [] B = [] def test(): xyz = [ Unknown ] + A + B return xyz Z = test() """ ast = parse(code, __name__) inferred = next(ast["Z"].infer()) self.assertIsInstance(inferred, nodes.List) self.assertEqual(len(inferred.elts), 1) self.assertIsInstance(inferred.elts[0], nodes.Unknown) def test__new__(self) -> None: code = """ class NewTest(object): "doc" def __new__(cls, arg): self = object.__new__(cls) self.arg = arg return self n = NewTest() """ ast = parse(code, __name__) self.assertRaises(InferenceError, list, ast["NewTest"].igetattr("arg")) n = next(ast["n"].infer()) inferred = list(n.igetattr("arg")) self.assertEqual(len(inferred), 1, inferred) def test__new__bound_methods(self) -> None: node = extract_node( """ class cls(object): pass cls().__new__(cls) #@ """ ) inferred = next(node.infer()) self.assertIsInstance(inferred, Instance) self.assertEqual(inferred._proxied, node.root()["cls"]) def test_two_parents_from_same_module(self) -> None: code = """ from data import nonregr class Xxx(nonregr.Aaa, nonregr.Ccc): "doc" """ ast = parse(code, __name__) parents = list(ast["Xxx"].ancestors()) self.assertEqual(len(parents), 3, parents) # Aaa, Ccc, object def test_pluggable_inference(self) -> None: code = """ from collections import 
namedtuple A = namedtuple('A', ['a', 'b']) B = namedtuple('B', 'a b') """ ast = parse(code, __name__) aclass = ast["A"].inferred()[0] self.assertIsInstance(aclass, nodes.ClassDef) self.assertIn("a", aclass.instance_attrs) self.assertIn("b", aclass.instance_attrs) bclass = ast["B"].inferred()[0] self.assertIsInstance(bclass, nodes.ClassDef) self.assertIn("a", bclass.instance_attrs) self.assertIn("b", bclass.instance_attrs) def test_infer_arguments(self) -> None: code = """ class A(object): def first(self, arg1, arg2): return arg1 @classmethod def method(cls, arg1, arg2): return arg2 @classmethod def empty(cls): return 2 @staticmethod def static(arg1, arg2): return arg1 def empty_method(self): return [] x = A().first(1, []) y = A.method(1, []) z = A.static(1, []) empty = A.empty() empty_list = A().empty_method() """ ast = parse(code, __name__) int_node = ast["x"].inferred()[0] self.assertIsInstance(int_node, nodes.Const) self.assertEqual(int_node.value, 1) list_node = ast["y"].inferred()[0] self.assertIsInstance(list_node, nodes.List) int_node = ast["z"].inferred()[0] self.assertIsInstance(int_node, nodes.Const) self.assertEqual(int_node.value, 1) empty = ast["empty"].inferred()[0] self.assertIsInstance(empty, nodes.Const) self.assertEqual(empty.value, 2) empty_list = ast["empty_list"].inferred()[0] self.assertIsInstance(empty_list, nodes.List) def test_infer_variable_arguments(self) -> None: code = """ def test(*args, **kwargs): vararg = args kwarg = kwargs """ ast = parse(code, __name__) func = ast["test"] vararg = func.body[0].value kwarg = func.body[1].value kwarg_inferred = kwarg.inferred()[0] self.assertIsInstance(kwarg_inferred, nodes.Dict) self.assertIs(kwarg_inferred.parent, func.args) vararg_inferred = vararg.inferred()[0] self.assertIsInstance(vararg_inferred, nodes.Tuple) self.assertIs(vararg_inferred.parent, func.args) def test_infer_nested(self) -> None: code = """ def nested(): from threading import Thread class NestedThread(Thread): def 
__init__(self): Thread.__init__(self) """ # Test that inferring Thread.__init__ looks up in # the nested scope. ast = parse(code, __name__) callfunc = next(ast.nodes_of_class(nodes.Call)) func = callfunc.func inferred = func.inferred()[0] self.assertIsInstance(inferred, UnboundMethod) def test_instance_binary_operations(self) -> None: code = """ class A(object): def __mul__(self, other): return 42 a = A() b = A() sub = a - b mul = a * b """ ast = parse(code, __name__) sub = ast["sub"].inferred()[0] mul = ast["mul"].inferred()[0] self.assertIs(sub, util.Uninferable) self.assertIsInstance(mul, nodes.Const) self.assertEqual(mul.value, 42) def test_instance_binary_operations_parent(self) -> None: code = """ class A(object): def __mul__(self, other): return 42 class B(A): pass a = B() b = B() sub = a - b mul = a * b """ ast = parse(code, __name__) sub = ast["sub"].inferred()[0] mul = ast["mul"].inferred()[0] self.assertIs(sub, util.Uninferable) self.assertIsInstance(mul, nodes.Const) self.assertEqual(mul.value, 42) def test_instance_binary_operations_multiple_methods(self) -> None: code = """ class A(object): def __mul__(self, other): return 42 class B(A): def __mul__(self, other): return [42] a = B() b = B() sub = a - b mul = a * b """ ast = parse(code, __name__) sub = ast["sub"].inferred()[0] mul = ast["mul"].inferred()[0] self.assertIs(sub, util.Uninferable) self.assertIsInstance(mul, nodes.List) self.assertIsInstance(mul.elts[0], nodes.Const) self.assertEqual(mul.elts[0].value, 42) def test_infer_call_result_crash(self) -> None: code = """ class A(object): def __mul__(self, other): return type.__new__() a = A() b = A() c = a * b """ ast = parse(code, __name__) node = ast["c"] assert isinstance(node, nodes.NodeNG) self.assertEqual(node.inferred(), [util.Uninferable]) def test_infer_empty_nodes(self) -> None: # Should not crash when trying to infer EmptyNodes. 
        node = nodes.EmptyNode()
        assert isinstance(node, nodes.NodeNG)
        self.assertEqual(node.inferred(), [util.Uninferable])

    def test_infinite_loop_for_decorators(self) -> None:
        # Issue https://bitbucket.org/logilab/astroid/issue/50
        # A decorator that returns itself leads to an infinite loop.
        code = """
        def decorator():
            def wrapper():
                return decorator()
            return wrapper

        @decorator()
        def do_a_thing():
            pass
        """
        ast = parse(code, __name__)
        node = ast["do_a_thing"]
        self.assertEqual(node.type, "function")

    def test_no_infinite_ancestor_loop(self) -> None:
        klass = extract_node(
            """
        import datetime

        def method(self):
            datetime.datetime = something()

        class something(datetime.datetime):  #@
            pass
        """
        )
        ancestors = [base.name for base in klass.ancestors()]
        expected_subset = ["datetime", "date"]
        self.assertEqual(expected_subset, ancestors[:2])

    def test_stop_iteration_leak(self) -> None:
        code = """
        class Test:
            def __init__(self):
                self.config = {0: self.config[0]}

        self.config[0].test() #@
        """
        ast = extract_node(code, __name__)
        expr = ast.func.expr
        self.assertIs(next(expr.infer()), util.Uninferable)

    def test_tuple_builtin_inference(self) -> None:
        code = """
        var = (1, 2)
        tuple() #@
        tuple([1]) #@
        tuple({2}) #@
        tuple("abc") #@
        tuple({1: 2}) #@
        tuple(var) #@
        tuple(tuple([1])) #@
        tuple(frozenset((1, 2))) #@

        tuple(None) #@
        tuple(1) #@
        tuple(1, 2) #@
        """
        ast = extract_node(code, __name__)

        self.assertInferTuple(ast[0], [])
        self.assertInferTuple(ast[1], [1])
        self.assertInferTuple(ast[2], [2])
        self.assertInferTuple(ast[3], ["a", "b", "c"])
        self.assertInferTuple(ast[4], [1])
        self.assertInferTuple(ast[5], [1, 2])
        self.assertInferTuple(ast[6], [1])
        self.assertInferTuple(ast[7], [1, 2])

        for node in ast[8:]:
            inferred = next(node.infer())
            self.assertIsInstance(inferred, Instance)
            self.assertEqual(inferred.qname(), "builtins.tuple")

    def test_starred_in_tuple_literal(self) -> None:
        code = """
        var = (1, 2, 3)
        bar = (5, 6, 7)
        foo = [999, 1000, 1001]
        (0, *var) #@
        (0, *var, 4) #@
        (0, *var, 4, *bar) #@
        (0, *var, 4, *(*bar, 8)) #@
        (0, *var, 4, *(*bar, *foo)) #@
        """
        ast = extract_node(code, __name__)
        self.assertInferTuple(ast[0], [0, 1, 2, 3])
        self.assertInferTuple(ast[1], [0, 1, 2, 3, 4])
        self.assertInferTuple(ast[2], [0, 1, 2, 3, 4, 5, 6, 7])
        self.assertInferTuple(ast[3], [0, 1, 2, 3, 4, 5, 6, 7, 8])
        self.assertInferTuple(ast[4], [0, 1, 2, 3, 4, 5, 6, 7, 999, 1000, 1001])

    def test_starred_in_list_literal(self) -> None:
        code = """
        var = (1, 2, 3)
        bar = (5, 6, 7)
        foo = [999, 1000, 1001]
        [0, *var] #@
        [0, *var, 4] #@
        [0, *var, 4, *bar] #@
        [0, *var, 4, *[*bar, 8]] #@
        [0, *var, 4, *[*bar, *foo]] #@
        """
        ast = extract_node(code, __name__)
        self.assertInferList(ast[0], [0, 1, 2, 3])
        self.assertInferList(ast[1], [0, 1, 2, 3, 4])
        self.assertInferList(ast[2], [0, 1, 2, 3, 4, 5, 6, 7])
        self.assertInferList(ast[3], [0, 1, 2, 3, 4, 5, 6, 7, 8])
        self.assertInferList(ast[4], [0, 1, 2, 3, 4, 5, 6, 7, 999, 1000, 1001])

    def test_starred_in_set_literal(self) -> None:
        code = """
        var = (1, 2, 3)
        bar = (5, 6, 7)
        foo = [999, 1000, 1001]
        {0, *var} #@
        {0, *var, 4} #@
        {0, *var, 4, *bar} #@
        {0, *var, 4, *{*bar, 8}} #@
        {0, *var, 4, *{*bar, *foo}} #@
        """
        ast = extract_node(code, __name__)
        self.assertInferSet(ast[0], [0, 1, 2, 3])
        self.assertInferSet(ast[1], [0, 1, 2, 3, 4])
        self.assertInferSet(ast[2], [0, 1, 2, 3, 4, 5, 6, 7])
        self.assertInferSet(ast[3], [0, 1, 2, 3, 4, 5, 6, 7, 8])
        self.assertInferSet(ast[4], [0, 1, 2, 3, 4, 5, 6, 7, 999, 1000, 1001])

    def test_starred_in_literals_inference_issues(self) -> None:
        code = """
        {0, *var} #@
        {0, *var, 4} #@
        {0, *var, 4, *bar} #@
        {0, *var, 4, *{*bar, 8}} #@
        {0, *var, 4, *{*bar, *foo}} #@
        """
        ast = extract_node(code, __name__)
        for node in ast:
            with self.assertRaises(InferenceError):
                next(node.infer())

    def test_starred_in_mapping_literal(self) -> None:
        code = """
        var = {1: 'b', 2: 'c'}
        bar = {4: 'e', 5: 'f'}
        {0: 'a', **var} #@
        {0: 'a', **var, 3: 'd'} #@
        {0: 'a', **var, 3: 'd', **{**bar, 6: 'g'}} #@
        """
        ast = extract_node(code, __name__)
        self.assertInferDict(ast[0], {0: "a", 1: "b", 2: "c"})
        self.assertInferDict(ast[1], {0: "a", 1: "b", 2: "c", 3: "d"})
        self.assertInferDict(
            ast[2], {0: "a", 1: "b", 2: "c", 3: "d", 4: "e", 5: "f", 6: "g"}
        )

    def test_starred_in_mapping_literal_no_inference_possible(self) -> None:
        node = extract_node(
            """
        from unknown import unknown

        def test(a):
            return a + 1

        def func():
            a = {unknown: 'a'}
            return {0: 1, **a}

        test(**func())
        """
        )
        self.assertEqual(next(node.infer()), util.Uninferable)

    def test_starred_in_mapping_inference_issues(self) -> None:
        code = """
        {0: 'a', **var} #@
        {0: 'a', **var, 3: 'd'} #@
        {0: 'a', **var, 3: 'd', **{**bar, 6: 'g'}} #@
        """
        ast = extract_node(code, __name__)
        for node in ast:
            with self.assertRaises(InferenceError):
                next(node.infer())

    def test_starred_in_mapping_literal_non_const_keys_values(self) -> None:
        code = """
        a, b, c, d, e, f, g, h, i, j = "ABCDEFGHIJ"
        var = {c: d, e: f}
        bar = {i: j}
        {a: b, **var} #@
        {a: b, **var, **{g: h, **bar}} #@
        """
        ast = extract_node(code, __name__)
        self.assertInferDict(ast[0], {"A": "B", "C": "D", "E": "F"})
        self.assertInferDict(ast[1], {"A": "B", "C": "D", "E": "F", "G": "H", "I": "J"})

    def test_frozenset_builtin_inference(self) -> None:
        code = """
        var = (1, 2)
        frozenset() #@
        frozenset([1, 2, 1]) #@
        frozenset({2, 3, 1}) #@
        frozenset("abcab") #@
        frozenset({1: 2}) #@
        frozenset(var) #@
        frozenset(tuple([1])) #@

        frozenset(set(tuple([4, 5, set([2])]))) #@
        frozenset(None) #@
        frozenset(1) #@
        frozenset(1, 2) #@
        """
        ast = extract_node(code, __name__)

        self.assertInferFrozenSet(ast[0], [])
        self.assertInferFrozenSet(ast[1], [1, 2])
        self.assertInferFrozenSet(ast[2], [1, 2, 3])
        self.assertInferFrozenSet(ast[3], ["a", "b", "c"])
        self.assertInferFrozenSet(ast[4], [1])
        self.assertInferFrozenSet(ast[5], [1, 2])
        self.assertInferFrozenSet(ast[6], [1])

        for node in ast[7:]:
            inferred = next(node.infer())
            self.assertIsInstance(inferred, Instance)
            self.assertEqual(inferred.qname(), "builtins.frozenset")

    def test_set_builtin_inference(self) -> None:
        code = """
        var = (1, 2)
        set() #@
        set([1, 2, 1]) #@
        set({2, 3, 1}) #@
        set("abcab") #@
        set({1: 2}) #@
        set(var) #@
        set(tuple([1])) #@

        set(set(tuple([4, 5, set([2])]))) #@
        set(None) #@
        set(1) #@
        set(1, 2) #@
        """
        ast = extract_node(code, __name__)

        self.assertInferSet(ast[0], [])
        self.assertInferSet(ast[1], [1, 2])
        self.assertInferSet(ast[2], [1, 2, 3])
        self.assertInferSet(ast[3], ["a", "b", "c"])
        self.assertInferSet(ast[4], [1])
        self.assertInferSet(ast[5], [1, 2])
        self.assertInferSet(ast[6], [1])

        for node in ast[7:]:
            inferred = next(node.infer())
            self.assertIsInstance(inferred, Instance)
            self.assertEqual(inferred.qname(), "builtins.set")

    def test_list_builtin_inference(self) -> None:
        code = """
        var = (1, 2)
        list() #@
        list([1, 2, 1]) #@
        list({2, 3, 1}) #@
        list("abcab") #@
        list({1: 2}) #@
        list(var) #@
        list(tuple([1])) #@

        list(list(tuple([4, 5, list([2])]))) #@
        list(None) #@
        list(1) #@
        list(1, 2) #@
        """
        ast = extract_node(code, __name__)
        self.assertInferList(ast[0], [])
        self.assertInferList(ast[1], [1, 1, 2])
        self.assertInferList(ast[2], [1, 2, 3])
        self.assertInferList(ast[3], ["a", "a", "b", "b", "c"])
        self.assertInferList(ast[4], [1])
        self.assertInferList(ast[5], [1, 2])
        self.assertInferList(ast[6], [1])

        for node in ast[7:]:
            inferred = next(node.infer())
            self.assertIsInstance(inferred, Instance)
            self.assertEqual(inferred.qname(), "builtins.list")

    def test_conversion_of_dict_methods(self) -> None:
        ast_nodes = extract_node(
            """
        list({1:2, 2:3}.values()) #@
        list({1:2, 2:3}.keys()) #@
        tuple({1:2, 2:3}.values()) #@
        tuple({1:2, 3:4}.keys()) #@
        set({1:2, 2:4}.keys()) #@
        """
        )
        assert isinstance(ast_nodes, list)
        self.assertInferList(ast_nodes[0], [2, 3])
        self.assertInferList(ast_nodes[1], [1, 2])
        self.assertInferTuple(ast_nodes[2], [2, 3])
        self.assertInferTuple(ast_nodes[3], [1, 3])
        self.assertInferSet(ast_nodes[4], [1, 2])

    def test_builtin_inference_py3k(self) -> None:
        code = """
        list(b"abc") #@
        tuple(b"abc") #@
        set(b"abc") #@
        """
        ast = extract_node(code, __name__)
        self.assertInferList(ast[0], [97, 98, 99])
        self.assertInferTuple(ast[1], [97, 98, 99])
        self.assertInferSet(ast[2], [97, 98, 99])

    def test_dict_inference(self) -> None:
        code = """
        dict() #@
        dict(a=1, b=2, c=3) #@
        dict([(1, 2), (2, 3)]) #@
        dict([[1, 2], [2, 3]]) #@
        dict([(1, 2), [2, 3]]) #@
        dict([('a', 2)], b=2, c=3) #@
        dict({1: 2}) #@
        dict({'c': 2}, a=4, b=5) #@
        def func():
            return dict(a=1, b=2)
        func() #@
        var = {'x': 2, 'y': 3}
        dict(var, a=1, b=2) #@

        dict([1, 2, 3]) #@
        dict([(1, 2), (1, 2, 3)]) #@
        dict({1: 2}, {1: 2}) #@
        dict({1: 2}, (1, 2)) #@
        dict({1: 2}, (1, 2), a=4) #@
        dict([(1, 2), ([4, 5], 2)]) #@
        dict([None, None]) #@

        def using_unknown_kwargs(**kwargs):
            return dict(**kwargs)
        using_unknown_kwargs(a=1, b=2) #@
        """
        ast = extract_node(code, __name__)
        self.assertInferDict(ast[0], {})
        self.assertInferDict(ast[1], {"a": 1, "b": 2, "c": 3})
        for i in range(2, 5):
            self.assertInferDict(ast[i], {1: 2, 2: 3})
        self.assertInferDict(ast[5], {"a": 2, "b": 2, "c": 3})
        self.assertInferDict(ast[6], {1: 2})
        self.assertInferDict(ast[7], {"c": 2, "a": 4, "b": 5})
        self.assertInferDict(ast[8], {"a": 1, "b": 2})
        self.assertInferDict(ast[9], {"x": 2, "y": 3, "a": 1, "b": 2})
        for node in ast[10:]:
            inferred = next(node.infer())
            self.assertIsInstance(inferred, Instance)
            self.assertEqual(inferred.qname(), "builtins.dict")

    def test_dict_inference_kwargs(self) -> None:
        ast_node = extract_node("""dict(a=1, b=2, **{'c': 3})""")
        self.assertInferDict(ast_node, {"a": 1, "b": 2, "c": 3})

    def test_dict_inference_for_multiple_starred(self) -> None:
        pairs = [
            ('dict(a=1, **{"b": 2}, **{"c":3})', {"a": 1, "b": 2, "c": 3}),
            ('dict(a=1, **{"b": 2}, d=4, **{"c":3})', {"a": 1, "b": 2, "c": 3, "d": 4}),
            ('dict({"a":1}, b=2, **{"c":3})', {"a": 1, "b": 2, "c": 3}),
        ]
        for code, expected_value in pairs:
            node = extract_node(code)
            self.assertInferDict(node, expected_value)

    def test_dict_inference_unpack_repeated_key(self) -> None:
        """Make sure astroid does not infer repeated keys in a dictionary.
        Regression test for https://github.com/pylint-dev/pylint/issues/1843
        """
        code = """
        base = {'data': 0}
        new = {**base, 'data': 1} #@
        new2 = {'data': 1, **base} #@

        # Make sure overwrite works
        a = 'd' + 'ata'
        b3 = {**base, a: 3} #@

        # Make sure keys are properly inferred
        b4 = {a: 3, **base} #@
        """
        ast = extract_node(code)
        final_values = ("{'data': 1}", "{'data': 0}", "{'data': 3}", "{'data': 0}")
        for node, final_value in zip(ast, final_values):
            assert node.targets[0].inferred()[0].as_string() == final_value

    def test_dict_invalid_args(self) -> None:
        invalid_values = ["dict(*1)", "dict(**lala)", "dict(**[])"]
        for invalid in invalid_values:
            ast_node = extract_node(invalid)
            inferred = next(ast_node.infer())
            self.assertIsInstance(inferred, Instance)
            self.assertEqual(inferred.qname(), "builtins.dict")

    def test_copy_method_inference(self) -> None:
        code = """
        a_dict = {"b": 1, "c": 2}
        b_dict = a_dict.copy()
        b_dict #@

        a_list = [1, 2, 3]
        b_list = a_list.copy()
        b_list #@

        a_set = set([1, 2, 3])
        b_set = a_set.copy()
        b_set #@

        a_frozenset = frozenset([1, 2, 3])
        b_frozenset = a_frozenset.copy()
        b_frozenset #@

        a_unknown = unknown()
        b_unknown = a_unknown.copy()
        b_unknown #@
        """
        ast = extract_node(code, __name__)
        self.assertInferDict(ast[0], {"b": 1, "c": 2})
        self.assertInferList(ast[1], [1, 2, 3])
        self.assertInferSet(ast[2], [1, 2, 3])
        self.assertInferFrozenSet(ast[3], [1, 2, 3])

        inferred_unknown = next(ast[4].infer())
        assert inferred_unknown == util.Uninferable

    def test_str_methods(self) -> None:
        code = """
        ' '.decode() #@
        ' '.join('abcd') #@
        ' '.replace('a', 'b') #@
        ' '.capitalize() #@
        ' '.title() #@
        ' '.lower() #@
        ' '.upper() #@
        ' '.swapcase() #@
        ' '.strip() #@
        ' '.rstrip() #@
        ' '.lstrip() #@
        ' '.rjust() #@
        ' '.ljust() #@
        ' '.center() #@

        ' '.index() #@
        ' '.find() #@
        ' '.count() #@

        ' '.format('a') #@
        """
        ast = extract_node(code, __name__)
        self.assertInferConst(ast[0], "")
        for i in range(1, 14):
            self.assertInferConst(ast[i], "")
        for i in range(14, 17):
            self.assertInferConst(ast[i], 0)
        self.assertInferConst(ast[17], " ")

    def test_unicode_methods(self) -> None:
        code = """
        u' '.decode() #@
        u' '.join('abcd') #@
        u' '.replace('a', 'b') #@
        u' '.capitalize() #@
        u' '.title() #@
        u' '.lower() #@
        u' '.upper() #@
        u' '.swapcase() #@
        u' '.strip() #@
        u' '.rstrip() #@
        u' '.lstrip() #@
        u' '.rjust() #@
        u' '.ljust() #@
        u' '.center() #@

        u' '.index() #@
        u' '.find() #@
        u' '.count() #@

        u' '.format('a') #@
        """
        ast = extract_node(code, __name__)
        self.assertInferConst(ast[0], "")
        for i in range(1, 14):
            self.assertInferConst(ast[i], "")
        for i in range(14, 17):
            self.assertInferConst(ast[i], 0)
        self.assertInferConst(ast[17], " ")

    def test_scope_lookup_same_attributes(self) -> None:
        code = """
        import collections
        class Second(collections.Counter):
            def collections(self):
                return "second"
        """
        ast = parse(code, __name__)
        bases = ast["Second"].bases[0]
        inferred = next(bases.infer())
        self.assertTrue(inferred)
        self.assertIsInstance(inferred, nodes.ClassDef)
        self.assertEqual(inferred.qname(), "collections.Counter")

    def test_inferring_with_statement_failures(self) -> None:
        module = parse(
            """
        class NoEnter(object):
            pass
        class NoMethod(object):
            __enter__ = None
        class NoElts(object):
            def __enter__(self):
                return 42

        with NoEnter() as no_enter:
            pass
        with NoMethod() as no_method:
            pass
        with NoElts() as (no_elts, no_elts1):
            pass
        """
        )
        self.assertRaises(InferenceError, next, module["no_enter"].infer())
        self.assertRaises(InferenceError, next, module["no_method"].infer())
        self.assertRaises(InferenceError, next, module["no_elts"].infer())

    def test_inferring_with_statement(self) -> None:
        module = parse(
            """
        class SelfContext(object):
            def __enter__(self):
                return self

        class OtherContext(object):
            def __enter__(self):
                return SelfContext()

        class MultipleReturns(object):
            def __enter__(self):
                return SelfContext(), OtherContext()

        class MultipleReturns2(object):
            def __enter__(self):
                return [1, [2, 3]]

        with SelfContext() as self_context:
            pass
        with OtherContext() as other_context:
            pass
        with MultipleReturns(), OtherContext() as multiple_with:
            pass
        with MultipleReturns2() as (stdout, (stderr, stdin)):
            pass
        """
        )
        self_context = module["self_context"]
        inferred = next(self_context.infer())
        self.assertIsInstance(inferred, Instance)
        self.assertEqual(inferred.name, "SelfContext")

        other_context = module["other_context"]
        inferred = next(other_context.infer())
        self.assertIsInstance(inferred, Instance)
        self.assertEqual(inferred.name, "SelfContext")

        multiple_with = module["multiple_with"]
        inferred = next(multiple_with.infer())
        self.assertIsInstance(inferred, Instance)
        self.assertEqual(inferred.name, "SelfContext")

        stdout = module["stdout"]
        inferred = next(stdout.infer())
        self.assertIsInstance(inferred, nodes.Const)
        self.assertEqual(inferred.value, 1)
        stderr = module["stderr"]
        inferred = next(stderr.infer())
        self.assertIsInstance(inferred, nodes.Const)
        self.assertEqual(inferred.value, 2)

    def test_inferring_with_contextlib_contextmanager(self) -> None:
        module = parse(
            """
        import contextlib
        from contextlib import contextmanager

        @contextlib.contextmanager
        def manager_none():
            try:
                yield
            finally:
                pass

        @contextlib.contextmanager
        def manager_something():
            try:
                yield 42
                yield 24 # This should be ignored.
            finally:
                pass

        @contextmanager
        def manager_multiple():
            with manager_none() as foo:
                with manager_something() as bar:
                    yield foo, bar

        with manager_none() as none:
            pass
        with manager_something() as something:
            pass
        with manager_multiple() as (first, second):
            pass
        """
        )
        none = module["none"]
        inferred = next(none.infer())
        self.assertIsInstance(inferred, nodes.Const)
        self.assertIsNone(inferred.value)

        something = module["something"]
        inferred = something.inferred()
        self.assertEqual(len(inferred), 1)
        inferred = inferred[0]
        self.assertIsInstance(inferred, nodes.Const)
        self.assertEqual(inferred.value, 42)

        first, second = module["first"], module["second"]
        first = next(first.infer())
        second = next(second.infer())
        self.assertIsInstance(first, nodes.Const)
        self.assertIsNone(first.value)
        self.assertIsInstance(second, nodes.Const)
        self.assertEqual(second.value, 42)

    def test_inferring_context_manager_skip_index_error(self) -> None:
        # Raise an InferenceError when having multiple 'as' bindings
        # from a context manager, but its result doesn't have those
        # indices. This is the case of contextlib.nested, where the
        # result is a list, which is mutated later on, so it's
        # undetected by astroid.
        module = parse(
            """
        class Manager(object):
            def __enter__(self):
                return []
        with Manager() as (a, b, c):
            pass
        """
        )
        self.assertRaises(InferenceError, next, module["a"].infer())

    def test_inferring_context_manager_unpacking_inference_error(self) -> None:
        # https://github.com/pylint-dev/pylint/issues/1463
        module = parse(
            """
        import contextlib

        @contextlib.contextmanager
        def _select_source(a=None):
            with _select_source() as result:
                yield result

        result = _select_source()
        with result as (a, b, c):
            pass
        """
        )
        self.assertRaises(InferenceError, next, module["a"].infer())

    def test_inferring_with_contextlib_contextmanager_failures(self) -> None:
        module = parse(
            """
        from contextlib import contextmanager

        def no_decorators_mgr():
            yield
        @no_decorators_mgr
        def other_decorators_mgr():
            yield
        @contextmanager
        def no_yield_mgr():
            pass

        with no_decorators_mgr() as no_decorators:
            pass
        with other_decorators_mgr() as other_decorators:
            pass
        with no_yield_mgr() as no_yield:
            pass
        """
        )
        self.assertRaises(InferenceError, next, module["no_decorators"].infer())
        self.assertRaises(InferenceError, next, module["other_decorators"].infer())
        self.assertRaises(InferenceError, next, module["no_yield"].infer())

    def test_nested_contextmanager(self) -> None:
        """Make sure contextmanager works with nested functions.

        Previously contextmanager would retrieve
        the first yield instead of the yield in the
        proper scope

        Fixes https://github.com/pylint-dev/pylint/issues/1746
        """
        code = """
        from contextlib import contextmanager

        @contextmanager
        def outer():
            @contextmanager
            def inner():
                yield 2
            yield inner

        with outer() as ctx:
            ctx #@
            with ctx() as val:
                val #@
        """
        context_node, value_node = extract_node(code)
        value = next(value_node.infer())
        context = next(context_node.infer())
        assert isinstance(context, nodes.FunctionDef)
        assert isinstance(value, nodes.Const)

    def test_unary_op_leaks_stop_iteration(self) -> None:
        node = extract_node("+[] #@")
        self.assertEqual(util.Uninferable, next(node.infer()))

    def test_unary_operands(self) -> None:
        ast_nodes = extract_node(
            """
        import os
        def func(): pass
        from missing import missing
        class GoodInstance(object):
            def __pos__(self):
                return 42
            def __neg__(self):
                return +self - 41
            def __invert__(self):
                return 42
        class BadInstance(object):
            def __pos__(self):
                return lala
            def __neg__(self):
                return missing
        class LambdaInstance(object):
            __pos__ = lambda self: self.lala
            __neg__ = lambda self: self.lala + 1
            @property
            def lala(self): return 24
        class InstanceWithAttr(object):
            def __init__(self):
                self.x = 42
            def __pos__(self):
                return self.x
            def __neg__(self):
                return +self - 41
            def __invert__(self):
                return self.x + 1
        instance = GoodInstance()
        lambda_instance = LambdaInstance()
        instance_with_attr = InstanceWithAttr()
        +instance #@
        -instance #@
        ~instance #@
        --instance #@
        +lambda_instance #@
        -lambda_instance #@
        +instance_with_attr #@
        -instance_with_attr #@
        ~instance_with_attr #@
        bad_instance = BadInstance()
        +bad_instance #@
        -bad_instance #@
        ~bad_instance #@
        # These should be TypeErrors.
        ~BadInstance #@
        ~os #@
        -func #@
        +BadInstance #@
        """
        )
        expected = [42, 1, 42, -1, 24, 25, 42, 1, 43]
        for node, value in zip(ast_nodes[:9], expected):
            inferred = next(node.infer())
            self.assertIsInstance(inferred, nodes.Const)
            self.assertEqual(inferred.value, value)

        for bad_node in ast_nodes[9:]:
            inferred = next(bad_node.infer())
            self.assertEqual(inferred, util.Uninferable)

    def test_unary_op_instance_method_not_callable(self) -> None:
        ast_node = extract_node(
            """
        class A:
            __pos__ = (i for i in range(10))
        +A() #@
        """
        )
        self.assertRaises(InferenceError, next, ast_node.infer())

    def test_binary_op_type_errors(self) -> None:
        ast_nodes = extract_node(
            """
        import collections
        1 + "a" #@
        1 - [] #@
        1 * {} #@
        1 / collections #@
        1 ** (lambda x: x) #@
        {} * {} #@
        {} - {} #@
        {} >> {} #@
        [] + () #@
        () + [] #@
        [] * 2.0 #@
        () * 2.0 #@
        2.0 >> 2.0 #@
        class A(object): pass
        class B(object): pass
        A() + B() #@
        class A1(object):
            def __add__(self, other): return NotImplemented
        A1() + A1() #@
        class A(object):
            def __add__(self, other): return NotImplemented
        class B(object):
            def __radd__(self, other): return NotImplemented
        A() + B() #@
        class Parent(object):
            pass
        class Child(Parent):
            def __add__(self, other): return NotImplemented
        Child() + Parent() #@
        class A(object):
            def __add__(self, other): return NotImplemented
        class B(A):
            def __radd__(self, other):
                return NotImplemented
        A() + B() #@
        # Augmented
        f = 1
        f+=A() #@
        x = 1
        x+=[] #@
        """
        )
        msg = "unsupported operand type(s) for {op}: {lhs!r} and {rhs!r}"
        expected = [
            msg.format(op="+", lhs="int", rhs="str"),
            msg.format(op="-", lhs="int", rhs="list"),
            msg.format(op="*", lhs="int", rhs="dict"),
            msg.format(op="/", lhs="int", rhs="module"),
            msg.format(op="**", lhs="int", rhs="function"),
            msg.format(op="*", lhs="dict", rhs="dict"),
            msg.format(op="-", lhs="dict", rhs="dict"),
            msg.format(op=">>", lhs="dict", rhs="dict"),
            msg.format(op="+", lhs="list", rhs="tuple"),
            msg.format(op="+", lhs="tuple", rhs="list"),
            msg.format(op="*", lhs="list", rhs="float"),
            msg.format(op="*", lhs="tuple", rhs="float"),
            msg.format(op=">>", lhs="float", rhs="float"),
            msg.format(op="+", lhs="A", rhs="B"),
            msg.format(op="+", lhs="A1", rhs="A1"),
            msg.format(op="+", lhs="A", rhs="B"),
            msg.format(op="+", lhs="Child", rhs="Parent"),
            msg.format(op="+", lhs="A", rhs="B"),
            msg.format(op="+=", lhs="int", rhs="A"),
            msg.format(op="+=", lhs="int", rhs="list"),
        ]

        # PEP-584 supports | for dictionary union
        if not PY39_PLUS:
            ast_nodes.append(extract_node("{} | {} #@"))
            expected.append(msg.format(op="|", lhs="dict", rhs="dict"))

        for node, expected_value in zip(ast_nodes, expected):
            errors = node.type_errors()
            self.assertEqual(len(errors), 1)
            error = errors[0]
            self.assertEqual(str(error), expected_value)

    def test_unary_type_errors(self) -> None:
        ast_nodes = extract_node(
            """
        import collections
        ~[] #@
        ~() #@
        ~dict() #@
        ~{} #@
        ~set() #@
        -set() #@
        -"" #@
        ~"" #@
        +"" #@
        class A(object): pass
        ~(lambda: None) #@
        ~A #@
        ~A() #@
        ~collections #@
        ~2.0 #@
        """
        )
        msg = "bad operand type for unary {op}: {type}"
        expected = [
            msg.format(op="~", type="list"),
            msg.format(op="~", type="tuple"),
            msg.format(op="~", type="dict"),
            msg.format(op="~", type="dict"),
            msg.format(op="~", type="set"),
            msg.format(op="-", type="set"),
            msg.format(op="-", type="str"),
            msg.format(op="~", type="str"),
            msg.format(op="+", type="str"),
            msg.format(op="~", type="<lambda>"),
            msg.format(op="~", type="A"),
            msg.format(op="~", type="A"),
            msg.format(op="~", type="collections"),
            msg.format(op="~", type="float"),
        ]
        for node, expected_value in zip(ast_nodes, expected):
            errors = node.type_errors()
            self.assertEqual(len(errors), 1)
            error = errors[0]
            self.assertEqual(str(error), expected_value)

    def test_unary_empty_type_errors(self) -> None:
        # These aren't supported right now
        ast_nodes = extract_node(
            """
        ~(2 and []) #@
        -(0 or {}) #@
        """
        )
        expected = [
            "bad operand type for unary ~: list",
            "bad operand type for unary -: dict",
        ]
        for node, expected_value in zip(ast_nodes, expected):
            errors = node.type_errors()
self.assertEqual(len(errors), 1, (expected, node)) self.assertEqual(str(errors[0]), expected_value) def test_unary_type_errors_for_non_instance_objects(self) -> None: node = extract_node("~slice(1, 2, 3)") errors = node.type_errors() self.assertEqual(len(errors), 1) self.assertEqual(str(errors[0]), "bad operand type for unary ~: slice") def test_bool_value_recursive(self) -> None: pairs = [ ("{}", False), ("{1:2}", True), ("()", False), ("(1, 2)", True), ("[]", False), ("[1,2]", True), ("frozenset()", False), ("frozenset((1, 2))", True), ] for code, expected in pairs: node = extract_node(code) inferred = next(node.infer()) self.assertEqual(inferred.bool_value(), expected) def test_genexpr_bool_value(self) -> None: node = extract_node("""(x for x in range(10))""") self.assertTrue(node.bool_value()) def test_name_bool_value(self) -> None: node = extract_node( """ x = 42 y = x y """ ) self.assertIs(node.bool_value(), util.Uninferable) def test_bool_value(self) -> None: # Verify the truth value of nodes. 
module = parse( """ import collections collections_module = collections def function(): pass class Class(object): def method(self): pass dict_comp = {x:y for (x, y) in ((1, 2), (2, 3))} set_comp = {x for x in range(10)} list_comp = [x for x in range(10)] lambda_func = lambda: None unbound_method = Class.method instance = Class() bound_method = instance.method def generator_func(): yield def true_value(): return True generator = generator_func() bin_op = 1 + 2 bool_op = x and y callfunc = test() good_callfunc = true_value() compare = 2 < 3 const_str_true = 'testconst' const_str_false = '' """ ) collections_module = next(module["collections_module"].infer()) self.assertTrue(collections_module.bool_value()) function = module["function"] self.assertTrue(function.bool_value()) klass = module["Class"] self.assertTrue(klass.bool_value()) dict_comp = next(module["dict_comp"].infer()) self.assertEqual(dict_comp, util.Uninferable) set_comp = next(module["set_comp"].infer()) self.assertEqual(set_comp, util.Uninferable) list_comp = next(module["list_comp"].infer()) self.assertEqual(list_comp, util.Uninferable) lambda_func = next(module["lambda_func"].infer()) self.assertTrue(lambda_func) unbound_method = next(module["unbound_method"].infer()) self.assertTrue(unbound_method) bound_method = next(module["bound_method"].infer()) self.assertTrue(bound_method) generator = next(module["generator"].infer()) self.assertTrue(generator) bin_op = module["bin_op"].parent.value self.assertIs(bin_op.bool_value(), util.Uninferable) bool_op = module["bool_op"].parent.value self.assertEqual(bool_op.bool_value(), util.Uninferable) callfunc = module["callfunc"].parent.value self.assertEqual(callfunc.bool_value(), util.Uninferable) good_callfunc = next(module["good_callfunc"].infer()) self.assertTrue(good_callfunc.bool_value()) compare = module["compare"].parent.value self.assertEqual(compare.bool_value(), util.Uninferable) def test_bool_value_instances(self) -> None: instances = extract_node( """ 
class FalseBoolInstance(object): def __bool__(self): return False class TrueBoolInstance(object): def __bool__(self): return True class FalseLenInstance(object): def __len__(self): return 0 class TrueLenInstance(object): def __len__(self): return 14 class AlwaysTrueInstance(object): pass class ErrorInstance(object): def __bool__(self): return lala def __len__(self): return lala class NonMethods(object): __bool__ = 1 __len__ = 2 FalseBoolInstance() #@ TrueBoolInstance() #@ FalseLenInstance() #@ TrueLenInstance() #@ AlwaysTrueInstance() #@ ErrorInstance() #@ """ ) expected = (False, True, False, True, True, util.Uninferable, util.Uninferable) for node, expected_value in zip(instances, expected): inferred = next(node.infer()) self.assertEqual(inferred.bool_value(), expected_value) def test_bool_value_variable(self) -> None: instance = extract_node( """ class VariableBoolInstance(object): def __init__(self, value): self.value = value def __bool__(self): return self.value not VariableBoolInstance(True) """ ) inferred = next(instance.infer()) self.assertIs(inferred.bool_value(), util.Uninferable) def test_infer_coercion_rules_for_floats_complex(self) -> None: ast_nodes = extract_node( """ 1 + 1.0 #@ 1 * 1.0 #@ 2 - 1.0 #@ 2 / 2.0 #@ 1 + 1j #@ 2 * 1j #@ 2 - 1j #@ 3 / 1j #@ """ ) expected_values = [2.0, 1.0, 1.0, 1.0, 1 + 1j, 2j, 2 - 1j, -3j] for node, expected in zip(ast_nodes, expected_values): inferred = next(node.infer()) self.assertEqual(inferred.value, expected) def test_binop_list_with_elts(self) -> None: ast_node = extract_node( """ x = [A] * 1 [1] + x """ ) inferred = next(ast_node.infer()) self.assertIsInstance(inferred, nodes.List) self.assertEqual(len(inferred.elts), 2) self.assertIsInstance(inferred.elts[0], nodes.Const) self.assertIsInstance(inferred.elts[1], nodes.Unknown) def test_binop_same_types(self) -> None: ast_nodes = extract_node( """ class A(object): def __add__(self, other): return 42 1 + 1 #@ 1 - 1 #@ "a" + "b" #@ A() + A() #@ """ ) expected_values 
= [2, 0, "ab", 42] for node, expected in zip(ast_nodes, expected_values): inferred = next(node.infer()) self.assertIsInstance(inferred, nodes.Const) self.assertEqual(inferred.value, expected) def test_binop_different_types_reflected_only(self) -> None: node = extract_node( """ class A(object): pass class B(object): def __radd__(self, other): return other A() + B() #@ """ ) inferred = next(node.infer()) self.assertIsInstance(inferred, Instance) self.assertEqual(inferred.name, "A") def test_binop_different_types_unknown_bases(self) -> None: node = extract_node( """ from foo import bar class A(bar): pass class B(object): def __radd__(self, other): return other A() + B() #@ """ ) inferred = next(node.infer()) self.assertIs(inferred, util.Uninferable) def test_binop_different_types_normal_not_implemented_and_reflected(self) -> None: node = extract_node( """ class A(object): def __add__(self, other): return NotImplemented class B(object): def __radd__(self, other): return other A() + B() #@ """ ) inferred = next(node.infer()) self.assertIsInstance(inferred, Instance) self.assertEqual(inferred.name, "A") def test_binop_different_types_no_method_implemented(self) -> None: node = extract_node( """ class A(object): pass class B(object): pass A() + B() #@ """ ) inferred = next(node.infer()) self.assertEqual(inferred, util.Uninferable) def test_binop_different_types_reflected_and_normal_not_implemented(self) -> None: node = extract_node( """ class A(object): def __add__(self, other): return NotImplemented class B(object): def __radd__(self, other): return NotImplemented A() + B() #@ """ ) inferred = next(node.infer()) self.assertEqual(inferred, util.Uninferable) def test_binop_subtype(self) -> None: node = extract_node( """ class A(object): pass class B(A): def __add__(self, other): return other B() + A() #@ """ ) inferred = next(node.infer()) self.assertIsInstance(inferred, Instance) self.assertEqual(inferred.name, "A") def test_binop_subtype_implemented_in_parent(self) -> 
None: node = extract_node( """ class A(object): def __add__(self, other): return other class B(A): pass B() + A() #@ """ ) inferred = next(node.infer()) self.assertIsInstance(inferred, Instance) self.assertEqual(inferred.name, "A") def test_binop_subtype_not_implemented(self) -> None: node = extract_node( """ class A(object): pass class B(A): def __add__(self, other): return NotImplemented B() + A() #@ """ ) inferred = next(node.infer()) self.assertEqual(inferred, util.Uninferable) def test_binop_supertype(self) -> None: node = extract_node( """ class A(object): pass class B(A): def __radd__(self, other): return other A() + B() #@ """ ) inferred = next(node.infer()) self.assertIsInstance(inferred, Instance) self.assertEqual(inferred.name, "A") def test_binop_supertype_rop_not_implemented(self) -> None: node = extract_node( """ class A(object): def __add__(self, other): return other class B(A): def __radd__(self, other): return NotImplemented A() + B() #@ """ ) inferred = next(node.infer()) self.assertIsInstance(inferred, Instance) self.assertEqual(inferred.name, "B") def test_binop_supertype_both_not_implemented(self) -> None: node = extract_node( """ class A(object): def __add__(self): return NotImplemented class B(A): def __radd__(self, other): return NotImplemented A() + B() #@ """ ) inferred = next(node.infer()) self.assertEqual(inferred, util.Uninferable) def test_binop_inference_errors(self) -> None: ast_nodes = extract_node( """ from unknown import Unknown class A(object): def __add__(self, other): return NotImplemented class B(object): def __add__(self, other): return Unknown A() + Unknown #@ Unknown + A() #@ B() + A() #@ A() + B() #@ """ ) for node in ast_nodes: self.assertEqual(next(node.infer()), util.Uninferable) def test_binop_ambiguity(self) -> None: ast_nodes = extract_node( """ class A(object): def __add__(self, other): if isinstance(other, B): return NotImplemented if type(other) is type(self): return 42 return NotImplemented class B(A): pass class 
C(object): def __radd__(self, other): if isinstance(other, B): return 42 return NotImplemented A() + B() #@ B() + A() #@ A() + C() #@ C() + A() #@ """ ) for node in ast_nodes: self.assertEqual(next(node.infer()), util.Uninferable) def test_binop_self_in_list(self) -> None: """If 'self' is referenced within a list it should not be bound by it. Reported in https://github.com/pylint-dev/pylint/issues/4826. """ ast_nodes = extract_node( """ class A: def __init__(self): for a in [self] + []: print(a) #@ class B: def __init__(self): for b in [] + [self]: print(b) #@ """ ) inferred_a = list(ast_nodes[0].args[0].infer()) self.assertEqual(len(inferred_a), 1) self.assertIsInstance(inferred_a[0], Instance) self.assertEqual(inferred_a[0]._proxied.name, "A") inferred_b = list(ast_nodes[1].args[0].infer()) self.assertEqual(len(inferred_b), 1) self.assertIsInstance(inferred_b[0], Instance) self.assertEqual(inferred_b[0]._proxied.name, "B") def test_metaclass__getitem__(self) -> None: ast_node = extract_node( """ class Meta(type): def __getitem__(cls, arg): return 24 class A(object, metaclass=Meta): pass A['Awesome'] #@ """ ) inferred = next(ast_node.infer()) self.assertIsInstance(inferred, nodes.Const) self.assertEqual(inferred.value, 24) @unittest.skipUnless(HAS_SIX, "These tests require the six library") def test_with_metaclass__getitem__(self): ast_node = extract_node( """ class Meta(type): def __getitem__(cls, arg): return 24 import six class A(six.with_metaclass(Meta)): pass A['Awesome'] #@ """ ) inferred = next(ast_node.infer()) self.assertIsInstance(inferred, nodes.Const) self.assertEqual(inferred.value, 24) def test_bin_op_classes(self) -> None: ast_node = extract_node( """ class Meta(type): def __or__(self, other): return 24 class A(object, metaclass=Meta): pass A | A """ ) inferred = next(ast_node.infer()) self.assertIsInstance(inferred, nodes.Const) self.assertEqual(inferred.value, 24) @unittest.skipUnless(HAS_SIX, "These tests require the six library") def 
test_bin_op_classes_with_metaclass(self): ast_node = extract_node( """ class Meta(type): def __or__(self, other): return 24 import six class A(six.with_metaclass(Meta)): pass A | A """ ) inferred = next(ast_node.infer()) self.assertIsInstance(inferred, nodes.Const) self.assertEqual(inferred.value, 24) def test_bin_op_supertype_more_complicated_example(self) -> None: ast_node = extract_node( """ class A(object): def __init__(self): self.foo = 42 def __add__(self, other): return other.bar + self.foo / 2 class B(A): def __init__(self): self.bar = 24 def __radd__(self, other): return NotImplemented A() + B() #@ """ ) inferred = next(ast_node.infer()) self.assertIsInstance(inferred, nodes.Const) self.assertEqual(int(inferred.value), 45) def test_aug_op_same_type_not_implemented(self) -> None: ast_node = extract_node( """ class A(object): def __iadd__(self, other): return NotImplemented def __add__(self, other): return NotImplemented A() + A() #@ """ ) self.assertEqual(next(ast_node.infer()), util.Uninferable) def test_aug_op_same_type_aug_implemented(self) -> None: ast_node = extract_node( """ class A(object): def __iadd__(self, other): return other f = A() f += A() #@ """ ) inferred = next(ast_node.infer()) self.assertIsInstance(inferred, Instance) self.assertEqual(inferred.name, "A") def test_aug_op_same_type_aug_not_implemented_normal_implemented(self) -> None: ast_node = extract_node( """ class A(object): def __iadd__(self, other): return NotImplemented def __add__(self, other): return 42 f = A() f += A() #@ """ ) inferred = next(ast_node.infer()) self.assertIsInstance(inferred, nodes.Const) self.assertEqual(inferred.value, 42) def test_aug_op_subtype_both_not_implemented(self) -> None: ast_node = extract_node( """ class A(object): def __iadd__(self, other): return NotImplemented def __add__(self, other): return NotImplemented class B(A): pass b = B() b+=A() #@ """ ) self.assertEqual(next(ast_node.infer()), util.Uninferable) def 
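The binary and augmented operator tests above mirror CPython's own dispatch rules: for `a + b`, a reflected method defined on a proper subclass of `type(a)` is tried before `a.__add__`, and `a += b` falls back from `__iadd__` to `__add__` to the other operand's `__radd__` whenever `NotImplemented` is returned. A minimal runtime sketch (class names are illustrative, not taken from the test file):

```python
class A:
    def __iadd__(self, other):
        return NotImplemented

    def __add__(self, other):
        return NotImplemented


class B(A):
    def __radd__(self, other):
        return "B.__radd__"


# The subclass's reflected method is tried before A.__add__:
print(A() + B())  # B.__radd__

# Augmented assignment falls through: __iadd__ -> __add__ -> B.__radd__
a = A()
a += B()
print(a)  # B.__radd__
```

When every candidate returns `NotImplemented`, the interpreter raises `TypeError`; astroid models that outcome as `Uninferable` in the tests above.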
test_aug_op_subtype_aug_op_is_implemented(self) -> None: ast_node = extract_node( """ class A(object): def __iadd__(self, other): return 42 class B(A): pass b = B() b+=A() #@ """ ) inferred = next(ast_node.infer()) self.assertIsInstance(inferred, nodes.Const) self.assertEqual(inferred.value, 42) def test_aug_op_subtype_normal_op_is_implemented(self) -> None: ast_node = extract_node( """ class A(object): def __add__(self, other): return 42 class B(A): pass b = B() b+=A() #@ """ ) inferred = next(ast_node.infer()) self.assertIsInstance(inferred, nodes.Const) self.assertEqual(inferred.value, 42) def test_aug_different_types_no_method_implemented(self) -> None: ast_node = extract_node( """ class A(object): pass class B(object): pass f = A() f += B() #@ """ ) self.assertEqual(next(ast_node.infer()), util.Uninferable) def test_aug_different_types_augop_implemented(self) -> None: ast_node = extract_node( """ class A(object): def __iadd__(self, other): return other class B(object): pass f = A() f += B() #@ """ ) inferred = next(ast_node.infer()) self.assertIsInstance(inferred, Instance) self.assertEqual(inferred.name, "B") def test_aug_different_types_aug_not_implemented(self) -> None: ast_node = extract_node( """ class A(object): def __iadd__(self, other): return NotImplemented def __add__(self, other): return other class B(object): pass f = A() f += B() #@ """ ) inferred = next(ast_node.infer()) self.assertIsInstance(inferred, Instance) self.assertEqual(inferred.name, "B") def test_aug_different_types_aug_not_implemented_rop_fallback(self) -> None: ast_node = extract_node( """ class A(object): def __iadd__(self, other): return NotImplemented def __add__(self, other): return NotImplemented class B(object): def __radd__(self, other): return other f = A() f += B() #@ """ ) inferred = next(ast_node.infer()) self.assertIsInstance(inferred, Instance) self.assertEqual(inferred.name, "A") def test_augop_supertypes_none_implemented(self) -> None: ast_node = extract_node( """ 
class A(object): pass class B(object): pass a = A() a += B() #@ """ ) self.assertEqual(next(ast_node.infer()), util.Uninferable) def test_augop_supertypes_not_implemented_returned_for_all(self) -> None: ast_node = extract_node( """ class A(object): def __iadd__(self, other): return NotImplemented def __add__(self, other): return NotImplemented class B(object): def __add__(self, other): return NotImplemented a = A() a += B() #@ """ ) self.assertEqual(next(ast_node.infer()), util.Uninferable) def test_augop_supertypes_augop_implemented(self) -> None: ast_node = extract_node( """ class A(object): def __iadd__(self, other): return other class B(A): pass a = A() a += B() #@ """ ) inferred = next(ast_node.infer()) self.assertIsInstance(inferred, Instance) self.assertEqual(inferred.name, "B") def test_augop_supertypes_reflected_binop_implemented(self) -> None: ast_node = extract_node( """ class A(object): def __iadd__(self, other): return NotImplemented class B(A): def __radd__(self, other): return other a = A() a += B() #@ """ ) inferred = next(ast_node.infer()) self.assertIsInstance(inferred, Instance) self.assertEqual(inferred.name, "A") def test_augop_supertypes_normal_binop_implemented(self) -> None: ast_node = extract_node( """ class A(object): def __iadd__(self, other): return NotImplemented def __add__(self, other): return other class B(A): def __radd__(self, other): return NotImplemented a = A() a += B() #@ """ ) inferred = next(ast_node.infer()) self.assertIsInstance(inferred, Instance) self.assertEqual(inferred.name, "B") def test_string_interpolation(self): ast_nodes = extract_node( """ "a%d%d" % (1, 2) #@ "a%(x)s" % {"x": 42} #@ """ ) expected = ["a12", "a42"] for node, expected_value in zip(ast_nodes, expected): inferred = next(node.infer()) self.assertIsInstance(inferred, nodes.Const) self.assertEqual(inferred.value, expected_value) def test_mul_list_supports__index__(self) -> None: ast_nodes = extract_node( """ class Index(object): def __index__(self): 
return 2 class NotIndex(object): pass class NotIndex2(object): def __index__(self): return None a = [1, 2] a * Index() #@ a * NotIndex() #@ a * NotIndex2() #@ """ ) assert isinstance(ast_nodes, list) first = next(ast_nodes[0].infer()) self.assertIsInstance(first, nodes.List) self.assertEqual([node.value for node in first.itered()], [1, 2, 1, 2]) for rest in ast_nodes[1:]: inferred = next(rest.infer()) self.assertEqual(inferred, util.Uninferable) def test_subscript_supports__index__(self) -> None: ast_nodes = extract_node( """ class Index(object): def __index__(self): return 2 class LambdaIndex(object): __index__ = lambda self: self.foo @property def foo(self): return 1 class NonIndex(object): __index__ = lambda self: None a = [1, 2, 3, 4] a[Index()] #@ a[LambdaIndex()] #@ a[NonIndex()] #@ """ ) assert isinstance(ast_nodes, list) first = next(ast_nodes[0].infer()) self.assertIsInstance(first, nodes.Const) self.assertEqual(first.value, 3) second = next(ast_nodes[1].infer()) self.assertIsInstance(second, nodes.Const) self.assertEqual(second.value, 2) self.assertRaises(InferenceError, next, ast_nodes[2].infer()) def test_special_method_masquerading_as_another(self) -> None: ast_node = extract_node( """ class Info(object): def __add__(self, other): return "lala" __or__ = __add__ f = Info() f | Info() #@ """ ) inferred = next(ast_node.infer()) self.assertIsInstance(inferred, nodes.Const) self.assertEqual(inferred.value, "lala") def test_unary_op_assignment(self) -> None: ast_node = extract_node( """ class A(object): pass def pos(self): return 42 A.__pos__ = pos f = A() +f #@ """ ) inferred = next(ast_node.infer()) self.assertIsInstance(inferred, nodes.Const) self.assertEqual(inferred.value, 42) def test_unary_op_classes(self) -> None: ast_node = extract_node( """ class Meta(type): def __invert__(self): return 42 class A(object, metaclass=Meta): pass ~A """ ) inferred = next(ast_node.infer()) self.assertIsInstance(inferred, nodes.Const) self.assertEqual(inferred.value, 
42) @unittest.skipUnless(HAS_SIX, "These tests require the six library") def test_unary_op_classes_with_metaclass(self): ast_node = extract_node( """ import six class Meta(type): def __invert__(self): return 42 class A(six.with_metaclass(Meta)): pass ~A """ ) inferred = next(ast_node.infer()) self.assertIsInstance(inferred, nodes.Const) self.assertEqual(inferred.value, 42) def _slicing_test_helper( self, pairs: tuple[ tuple[str, list[int] | str], tuple[str, list[int] | str], tuple[str, list[int] | str], tuple[str, list[int] | str], tuple[str, list[int] | str], tuple[str, list[int] | str], tuple[str, list[int] | str], tuple[str, list[int] | str], tuple[str, list[int] | str], tuple[str, list[int] | str], ], cls: ABCMeta | type, get_elts: Callable, ) -> None: for code, expected in pairs: ast_node = extract_node(code) inferred = next(ast_node.infer()) self.assertIsInstance(inferred, cls) self.assertEqual(get_elts(inferred), expected, ast_node.as_string()) def test_slicing_list(self) -> None: pairs = ( ("[1, 2, 3][:] #@", [1, 2, 3]), ("[1, 2, 3][0:] #@", [1, 2, 3]), ("[1, 2, 3][None:] #@", [1, 2, 3]), ("[1, 2, 3][None:None] #@", [1, 2, 3]), ("[1, 2, 3][0:-1] #@", [1, 2]), ("[1, 2, 3][0:2] #@", [1, 2]), ("[1, 2, 3][0:2:None] #@", [1, 2]), ("[1, 2, 3][::] #@", [1, 2, 3]), ("[1, 2, 3][::2] #@", [1, 3]), ("[1, 2, 3][::-1] #@", [3, 2, 1]), ("[1, 2, 3][0:2:2] #@", [1]), ("[1, 2, 3, 4, 5, 6][0:4-1:2+0] #@", [1, 3]), ) self._slicing_test_helper( pairs, nodes.List, lambda inferred: [elt.value for elt in inferred.elts] ) def test_slicing_tuple(self) -> None: pairs = ( ("(1, 2, 3)[:] #@", [1, 2, 3]), ("(1, 2, 3)[0:] #@", [1, 2, 3]), ("(1, 2, 3)[None:] #@", [1, 2, 3]), ("(1, 2, 3)[None:None] #@", [1, 2, 3]), ("(1, 2, 3)[0:-1] #@", [1, 2]), ("(1, 2, 3)[0:2] #@", [1, 2]), ("(1, 2, 3)[0:2:None] #@", [1, 2]), ("(1, 2, 3)[::] #@", [1, 2, 3]), ("(1, 2, 3)[::2] #@", [1, 3]), ("(1, 2, 3)[::-1] #@", [3, 2, 1]), ("(1, 2, 3)[0:2:2] #@", [1]), ("(1, 2, 3, 4, 5, 6)[0:4-1:2+0] #@", [1, 3]), ) 
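The `__index__` and slicing tests above rely on standard Python subscription semantics: sequence repetition and indexing accept any object implementing `__index__`, and a custom `__getitem__` receives the actual `slice` object. A short sketch with illustrative class names:

```python
class Index:
    def __index__(self):
        return 2


class Recorder:
    def __getitem__(self, index):
        return index


# __index__ is honoured by repetition and subscripting:
print([1, 2] * Index())       # [1, 2, 1, 2]
print((10, 20, 30)[Index()])  # 30

# Plain slice semantics the pairs above exercise:
print([1, 2, 3][::-1])          # [3, 2, 1]
print("123456"[0:4 - 1:2 + 0])  # 13

# A custom __getitem__ sees the slice object itself:
s = Recorder()[1:]
print(s.start, s.stop)        # 1 None
```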
self._slicing_test_helper( pairs, nodes.Tuple, lambda inferred: [elt.value for elt in inferred.elts] ) def test_slicing_str(self) -> None: pairs = ( ("'123'[:] #@", "123"), ("'123'[0:] #@", "123"), ("'123'[None:] #@", "123"), ("'123'[None:None] #@", "123"), ("'123'[0:-1] #@", "12"), ("'123'[0:2] #@", "12"), ("'123'[0:2:None] #@", "12"), ("'123'[::] #@", "123"), ("'123'[::2] #@", "13"), ("'123'[::-1] #@", "321"), ("'123'[0:2:2] #@", "1"), ("'123456'[0:4-1:2+0] #@", "13"), ) self._slicing_test_helper(pairs, nodes.Const, lambda inferred: inferred.value) def test_invalid_slicing_primaries(self) -> None: examples = [ "(lambda x: x)[1:2]", "1[2]", "(1, 2, 3)[a:]", "(1, 2, 3)[object:object]", "(1, 2, 3)[1:object]", ] for code in examples: node = extract_node(code) self.assertRaises(InferenceError, next, node.infer()) def test_instance_slicing(self) -> None: ast_nodes = extract_node( """ class A(object): def __getitem__(self, index): return [1, 2, 3, 4, 5][index] A()[1:] #@ A()[:2] #@ A()[1:4] #@ """ ) expected_values = [[2, 3, 4, 5], [1, 2], [2, 3, 4]] for expected, node in zip(expected_values, ast_nodes): inferred = next(node.infer()) self.assertIsInstance(inferred, nodes.List) self.assertEqual([elt.value for elt in inferred.elts], expected) def test_instance_slicing_slices(self) -> None: ast_node = extract_node( """ class A(object): def __getitem__(self, index): return index A()[1:] #@ """ ) inferred = next(ast_node.infer()) self.assertIsInstance(inferred, nodes.Slice) self.assertEqual(inferred.lower.value, 1) self.assertIsNone(inferred.upper) def test_instance_slicing_fails(self) -> None: ast_nodes = extract_node( """ class A(object): def __getitem__(self, index): return 1[index] A()[4:5] #@ A()[2:] #@ """ ) for node in ast_nodes: self.assertEqual(next(node.infer()), util.Uninferable) def test_type__new__with_metaclass(self) -> None: ast_node = extract_node( """ class Metaclass(type): pass class Entity(object): pass type.__new__(Metaclass, 'NewClass', (Entity,), {'a': 
1}) #@ """ ) inferred = next(ast_node.infer()) self.assertIsInstance(inferred, nodes.ClassDef) self.assertEqual(inferred.name, "NewClass") metaclass = inferred.metaclass() self.assertEqual(metaclass, inferred.root()["Metaclass"]) ancestors = list(inferred.ancestors()) self.assertEqual(len(ancestors), 2) self.assertEqual(ancestors[0], inferred.root()["Entity"]) attributes = inferred.getattr("a") self.assertEqual(len(attributes), 1) self.assertIsInstance(attributes[0], nodes.Const) self.assertEqual(attributes[0].value, 1) def test_type__new__not_enough_arguments(self) -> None: ast_nodes = extract_node( """ type.__new__(type, 'foo') #@ type.__new__(type, 'foo', ()) #@ type.__new__(type, 'foo', (), {}, ()) #@ """ ) for node in ast_nodes: with pytest.raises(InferenceError): next(node.infer()) def test_type__new__invalid_mcs_argument(self) -> None: ast_nodes = extract_node( """ class Class(object): pass type.__new__(1, 2, 3, 4) #@ type.__new__(Class, 2, 3, 4) #@ """ ) for node in ast_nodes: with pytest.raises(InferenceError): next(node.infer()) def test_type__new__invalid_name(self) -> None: ast_nodes = extract_node( """ class Class(type): pass type.__new__(Class, object, 1, 2) #@ type.__new__(Class, 1, 1, 2) #@ type.__new__(Class, [], 1, 2) #@ """ ) for node in ast_nodes: with pytest.raises(InferenceError): next(node.infer()) def test_type__new__invalid_bases(self) -> None: ast_nodes = extract_node( """ type.__new__(type, 'a', 1, 2) #@ type.__new__(type, 'a', [], 2) #@ type.__new__(type, 'a', {}, 2) #@ type.__new__(type, 'a', (1, ), 2) #@ type.__new__(type, 'a', (object, 1), 2) #@ """ ) for node in ast_nodes: with pytest.raises(InferenceError): next(node.infer()) def test_type__new__invalid_attrs(self) -> None: type_error_nodes = extract_node( """ type.__new__(type, 'a', (), ()) #@ type.__new__(type, 'a', (), object) #@ type.__new__(type, 'a', (), 1) #@ """ ) for node in type_error_nodes: with pytest.raises(InferenceError): next(node.infer()) # Ignore invalid keys 
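The `type.__new__` tests above follow the runtime three-argument protocol (name, bases, namespace): the result is a real class whose attributes come from the namespace dict, while classes additionally see attributes defined on their metaclass and instances do not. Non-class bases are rejected with `TypeError`, which astroid surfaces as `InferenceError`. A sketch under those assumptions:

```python
class Entity:
    pass


class Metaclass(type):
    attr = 42


# type.__new__(mcs, name, bases, namespace):
NewClass = type.__new__(Metaclass, "NewClass", (Entity,), {"a": 1})
print(NewClass.a)                    # 1, from the class namespace
print(NewClass.attr)                 # 42, classes see metaclass attributes
print(hasattr(NewClass(), "attr"))   # False, instances do not
print(issubclass(NewClass, Entity))  # True

try:
    type.__new__(type, "a", (1,), {})  # non-class base
except TypeError as exc:
    print("rejected:", exc)
```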
ast_nodes = extract_node( """ type.__new__(type, 'a', (), {object: 1}) #@ type.__new__(type, 'a', (), {1:2, "a":5}) #@ """ ) for node in ast_nodes: inferred = next(node.infer()) self.assertIsInstance(inferred, nodes.ClassDef) def test_type__new__metaclass_lookup(self) -> None: ast_node = extract_node( """ class Metaclass(type): def test(cls): pass @classmethod def test1(cls): pass attr = 42 type.__new__(Metaclass, 'A', (), {}) #@ """ ) inferred = next(ast_node.infer()) self.assertIsInstance(inferred, nodes.ClassDef) test = inferred.getattr("test") self.assertEqual(len(test), 1) self.assertIsInstance(test[0], BoundMethod) self.assertIsInstance(test[0].bound, nodes.ClassDef) self.assertEqual(test[0].bound, inferred) test1 = inferred.getattr("test1") self.assertEqual(len(test1), 1) self.assertIsInstance(test1[0], BoundMethod) self.assertIsInstance(test1[0].bound, nodes.ClassDef) self.assertEqual(test1[0].bound, inferred.metaclass()) attr = inferred.getattr("attr") self.assertEqual(len(attr), 1) self.assertIsInstance(attr[0], nodes.Const) self.assertEqual(attr[0].value, 42) def test_type__new__metaclass_and_ancestors_lookup(self) -> None: ast_node = extract_node( """ class Book(object): title = 'Ubik' class MetaBook(type): title = 'Grimus' type.__new__(MetaBook, 'book', (Book, ), {'title':'Catch 22'}) #@ """ ) inferred = next(ast_node.infer()) self.assertIsInstance(inferred, nodes.ClassDef) titles = [ title.value for attr in inferred.getattr("title") for title in attr.inferred() ] self.assertEqual(titles, ["Catch 22", "Ubik", "Grimus"]) @staticmethod def test_builtin_new() -> None: ast_node = extract_node("int.__new__(int, 42)") inferred = next(ast_node.infer()) assert isinstance(inferred, nodes.Const) assert inferred.value == 42 ast_node2 = extract_node("int.__new__(int)") inferred2 = next(ast_node2.infer()) assert isinstance(inferred2, Instance) assert not isinstance(inferred2, nodes.Const) assert inferred2._proxied is inferred._proxied ast_node3 = extract_node( """ 
x = 43 int.__new__(int, x) #@ """ ) inferred3 = next(ast_node3.infer()) assert isinstance(inferred3, nodes.Const) assert inferred3.value == 43 ast_node4 = extract_node("int.__new__()") with pytest.raises(InferenceError): next(ast_node4.infer()) ast_node5 = extract_node( """ class A: pass A.__new__(A()) #@ """ ) with pytest.raises(InferenceError): next(ast_node5.infer()) ast_nodes6 = extract_node( """ class A: pass class B(A): pass class C: pass A.__new__(A) #@ A.__new__(B) #@ B.__new__(A) #@ B.__new__(B) #@ C.__new__(A) #@ """ ) instance_A1 = next(ast_nodes6[0].infer()) assert instance_A1._proxied.name == "A" instance_B1 = next(ast_nodes6[1].infer()) assert instance_B1._proxied.name == "B" instance_A2 = next(ast_nodes6[2].infer()) assert instance_A2._proxied.name == "A" instance_B2 = next(ast_nodes6[3].infer()) assert instance_B2._proxied.name == "B" instance_A3 = next(ast_nodes6[4].infer()) assert instance_A3._proxied.name == "A" ast_nodes7 = extract_node( """ import enum class A(enum.EnumMeta): pass class B(enum.EnumMeta): def __new__(mcs, value, **kwargs): return super().__new__(mcs, "str", (enum.Enum,), enum._EnumDict(), **kwargs) class C(enum.EnumMeta): def __new__(mcs, **kwargs): return super().__new__(A, "str", (enum.Enum,), enum._EnumDict(), **kwargs) B("") #@ C() #@ """ ) instance_B = next(ast_nodes7[0].infer()) assert instance_B._proxied.name == "B" instance_C = next(ast_nodes7[1].infer()) # TODO: This should be A. However, we don't infer EnumMeta.__new__ # correctly. assert instance_C._proxied.name == "C" @pytest.mark.xfail(reason="Does not support function metaclasses") def test_function_metaclasses(self): # These are not supported right now, although # they will be in the future. 
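`test_builtin_new` above tracks how CPython itself treats `int.__new__`: with a value it produces that value, with only the class it produces the same default as `int()`, and with no arguments it raises `TypeError`. A quick runtime check of those three cases:

```python
print(int.__new__(int, 42))  # 42
print(int.__new__(int))      # 0, same default as int()

try:
    int.__new__()            # missing both arguments
except TypeError as exc:
    print("rejected:", exc)
```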
ast_node = extract_node( """ class BookMeta(type): author = 'Rushdie' def metaclass_function(*args): return BookMeta class Book(object, metaclass=metaclass_function): pass Book #@ """ ) inferred = next(ast_node.infer()) metaclass = inferred.metaclass() self.assertIsInstance(metaclass, nodes.ClassDef) self.assertEqual(metaclass.name, "BookMeta") author = next(inferred.igetattr("author")) self.assertIsInstance(author, nodes.Const) self.assertEqual(author.value, "Rushdie") def test_subscript_inference_error(self) -> None: # Used to raise StopIteration ast_node = extract_node( """ class AttributeDict(dict): def __getitem__(self, name): return self flow = AttributeDict() flow['app'] = AttributeDict() flow['app']['config'] = AttributeDict() flow['app']['config']['doffing'] = AttributeDict() #@ """ ) self.assertIsInstance(util.safe_infer(ast_node.targets[0]), Instance) def test_classmethod_inferred_by_context(self) -> None: ast_node = extract_node( """ class Super(object): def instance(cls): return cls() instance = classmethod(instance) class Sub(Super): def method(self): return self # should see the Sub.instance() is returning a Sub # instance, not a Super instance Sub.instance().method() #@ """ ) inferred = next(ast_node.infer()) self.assertIsInstance(inferred, Instance) self.assertEqual(inferred.name, "Sub") def test_infer_call_result_invalid_dunder_call_on_instance(self) -> None: ast_nodes = extract_node( """ class A: __call__ = 42 class B: __call__ = A() class C: __call = None A() #@ B() #@ C() #@ """ ) for node in ast_nodes: inferred = next(node.infer()) self.assertRaises(InferenceError, next, inferred.infer_call_result(node)) def test_infer_call_result_same_proxied_class(self) -> None: node = extract_node( """ class A: __call__ = A() A() #@ """ ) inferred = next(node.infer()) fully_evaluated_inference_results = list(inferred.infer_call_result(node)) assert fully_evaluated_inference_results[0].name == "A" def test_infer_call_result_with_metaclass(self) -> None: node 
= extract_node("def with_metaclass(meta, *bases): return 42") inferred = next(node.infer_call_result(caller=node)) self.assertIsInstance(inferred, nodes.Const) def test_context_call_for_context_managers(self) -> None: ast_nodes = extract_node( """ class A: def __enter__(self): return self class B: __enter__ = lambda self: self class C: @property def a(self): return A() def __enter__(self): return self.a with A() as a: a #@ with B() as b: b #@ with C() as c: c #@ """ ) assert isinstance(ast_nodes, list) first_a = next(ast_nodes[0].infer()) self.assertIsInstance(first_a, Instance) self.assertEqual(first_a.name, "A") second_b = next(ast_nodes[1].infer()) self.assertIsInstance(second_b, Instance) self.assertEqual(second_b.name, "B") third_c = next(ast_nodes[2].infer()) self.assertIsInstance(third_c, Instance) self.assertEqual(third_c.name, "A") def test_metaclass_subclasses_arguments_are_classes_not_instances(self) -> None: ast_node = extract_node( """ class A(type): def test(cls): return cls class B(object, metaclass=A): pass B.test() #@ """ ) inferred = next(ast_node.infer()) self.assertIsInstance(inferred, nodes.ClassDef) self.assertEqual(inferred.name, "B") @unittest.skipUnless(HAS_SIX, "These tests require the six library") def test_with_metaclass_subclasses_arguments_are_classes_not_instances(self): ast_node = extract_node( """ class A(type): def test(cls): return cls import six class B(six.with_metaclass(A)): pass B.test() #@ """ ) inferred = next(ast_node.infer()) self.assertIsInstance(inferred, nodes.ClassDef) self.assertEqual(inferred.name, "B") @unittest.skipUnless(HAS_SIX, "These tests require the six library") def test_with_metaclass_with_partial_imported_name(self): ast_node = extract_node( """ class A(type): def test(cls): return cls from six import with_metaclass class B(with_metaclass(A)): pass B.test() #@ """ ) inferred = next(ast_node.infer()) self.assertIsInstance(inferred, nodes.ClassDef) self.assertEqual(inferred.name, "B") def 
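The context-manager tests above depend on `with` binding the return value of `__enter__`. Note the snippets inside `extract_node` are inference-only: actually executing them would fail because the `with` statement also requires `__exit__`. A runnable version of the simplest case:

```python
class A:
    def __enter__(self):
        return self

    # Required for real execution, though not for astroid's inference:
    def __exit__(self, exc_type, exc, tb):
        return False


with A() as a:
    print(type(a).__name__)  # A
```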
test_infer_cls_in_class_methods(self) -> None: ast_nodes = extract_node( """ class A(type): def __call__(cls): cls #@ class B(object): def __call__(cls): cls #@ """ ) assert isinstance(ast_nodes, list) first = next(ast_nodes[0].infer()) self.assertIsInstance(first, nodes.ClassDef) second = next(ast_nodes[1].infer()) self.assertIsInstance(second, Instance) @pytest.mark.xfail(reason="Metaclass arguments not inferred as classes") def test_metaclass_arguments_are_classes_not_instances(self): ast_node = extract_node( """ class A(type): def test(cls): return cls A.test() #@ """ ) # This is not supported yet inferred = next(ast_node.infer()) self.assertIsInstance(inferred, nodes.ClassDef) self.assertEqual(inferred.name, "A") def test_metaclass_with_keyword_args(self) -> None: ast_node = extract_node( """ class TestMetaKlass(type): def __new__(mcs, name, bases, ns, kwo_arg): return super().__new__(mcs, name, bases, ns) class TestKlass(metaclass=TestMetaKlass, kwo_arg=42): #@ pass """ ) inferred = next(ast_node.infer()) self.assertIsInstance(inferred, nodes.ClassDef) def test_metaclass_custom_dunder_call(self) -> None: """The Metaclass __call__ should take precedence over the default metaclass type call (initialization). 
See https://github.com/pylint-dev/pylint/issues/2159 """ val = ( extract_node( """ class _Meta(type): def __call__(cls): return 1 class Clazz(metaclass=_Meta): def __call__(self): return 5.5 Clazz() #@ """ ) .inferred()[0] .value ) assert val == 1 def test_metaclass_custom_dunder_call_boundnode(self) -> None: """The boundnode should be the calling class.""" cls = extract_node( """ class _Meta(type): def __call__(cls): return cls class Clazz(metaclass=_Meta): pass Clazz() #@ """ ).inferred()[0] assert isinstance(cls, Instance) and cls.name == "Clazz" def test_infer_subclass_attr_outer_class(self) -> None: node = extract_node( """ class Outer: data = 123 class Test(Outer): pass Test.data """ ) inferred = next(node.infer()) assert isinstance(inferred, nodes.Const) assert inferred.value == 123 def test_infer_subclass_attr_inner_class_works_indirectly(self) -> None: node = extract_node( """ class Outer: class Inner: data = 123 Inner = Outer.Inner class Test(Inner): pass Test.data """ ) inferred = next(node.infer()) assert isinstance(inferred, nodes.Const) assert inferred.value == 123 def test_infer_subclass_attr_inner_class(self) -> None: clsdef_node, attr_node = extract_node( """ class Outer: class Inner: data = 123 class Test(Outer.Inner): pass Test #@ Test.data #@ """ ) clsdef = next(clsdef_node.infer()) assert isinstance(clsdef, nodes.ClassDef) inferred = next(clsdef.igetattr("data")) assert isinstance(inferred, nodes.Const) assert inferred.value == 123 # Inferring the value of .data via igetattr() worked before the # old_boundnode fixes in infer_subscript, so it should have been # possible to infer the subscript directly. 
It is the difference # between these two cases that led to the discovery of the cause of the # bug in https://github.com/pylint-dev/astroid/issues/904 inferred = next(attr_node.infer()) assert isinstance(inferred, nodes.Const) assert inferred.value == 123 def test_infer_method_empty_body(self) -> None: # https://github.com/PyCQA/astroid/issues/1015 node = extract_node( """ class A: def foo(self): ... A().foo() #@ """ ) inferred = next(node.infer()) assert isinstance(inferred, nodes.Const) assert inferred.value is None def test_infer_method_overload(self) -> None: # https://github.com/PyCQA/astroid/issues/1015 node = extract_node( """ class A: def foo(self): ... def foo(self): yield A().foo() #@ """ ) inferred = list(node.infer()) assert len(inferred) == 1 assert isinstance(inferred[0], Generator) def test_infer_function_under_if(self) -> None: node = extract_node( """ if 1 in [1]: def func(): return 42 else: def func(): return False func() #@ """ ) inferred = list(node.inferred()) assert [const.value for const in inferred] == [42, False] def test_delayed_attributes_without_slots(self) -> None: ast_node = extract_node( """ class A(object): __slots__ = ('a', ) a = A() a.teta = 24 a.a = 24 a #@ """ ) inferred = next(ast_node.infer()) with self.assertRaises(NotFoundError): inferred.getattr("teta") inferred.getattr("a") def test_lambda_as_methods(self) -> None: ast_node = extract_node( """ class X: m = lambda self, arg: self.z + arg z = 24 X().m(4) #@ """ ) inferred = next(ast_node.infer()) self.assertIsInstance(inferred, nodes.Const) self.assertEqual(inferred.value, 28) def test_inner_value_redefined_by_subclass(self) -> None: ast_node = extract_node( """ class X(object): M = lambda self, arg: "a" x = 24 def __init__(self): x = 24 self.m = self.M(x) class Y(X): M = lambda self, arg: arg + 1 def blurb(self): self.m #@ """ ) inferred = next(ast_node.infer()) self.assertIsInstance(inferred, nodes.Const) self.assertEqual(inferred.value, 25) def 
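`test_delayed_attributes_without_slots` above checks that astroid honours `__slots__`: only the declared names may be assigned on instances, matching CPython's runtime behaviour:

```python
class A:
    __slots__ = ("a",)


a = A()
a.a = 24      # fine: "a" is declared in __slots__
print(a.a)    # 24

try:
    a.teta = 24   # not in __slots__
except AttributeError as exc:
    print("rejected:", exc)
```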
test_inner_value_redefined_by_subclass_with_mro(self) -> None: ast_node = extract_node( """ class X(object): M = lambda self, arg: arg + 1 x = 24 def __init__(self): y = self self.m = y.M(1) + y.z class C(object): z = 24 class Y(X, C): M = lambda self, arg: arg + 1 def blurb(self): self.m #@ """ ) inferred = next(ast_node.infer()) assert isinstance(inferred, nodes.Const) assert inferred.value == 26 def test_getitem_of_class_raised_type_error(self) -> None: # Test that we wrap an AttributeInferenceError # and reraise it as a TypeError in Class.getitem node = extract_node( """ def test(): yield test() """ ) inferred = next(node.infer()) with self.assertRaises(AstroidTypeError): inferred.getitem(nodes.Const("4")) def test_infer_arg_called_type_is_uninferable(self) -> None: node = extract_node( """ def func(type): type #@ """ ) inferred = next(node.infer()) assert inferred is util.Uninferable def test_infer_arg_called_object_when_used_as_index_is_uninferable(self) -> None: node = extract_node( """ def func(object): ['list'][ object #@ ] """ ) inferred = next(node.infer()) assert inferred is util.Uninferable @test_utils.require_version(minver="3.9") def test_infer_arg_called_type_when_used_as_index_is_uninferable(self): # https://github.com/pylint-dev/astroid/pull/958 node = extract_node( """ def func(type): ['list'][ type #@ ] """ ) inferred = next(node.infer()) assert not isinstance(inferred, nodes.ClassDef) # was inferred as builtins.type assert inferred is util.Uninferable @test_utils.require_version(minver="3.9") def test_infer_arg_called_type_when_used_as_subscript_is_uninferable(self): # https://github.com/pylint-dev/astroid/pull/958 node = extract_node( """ def func(type): type[0] #@ """ ) inferred = next(node.infer()) assert not isinstance(inferred, nodes.ClassDef) # was inferred as builtins.type assert inferred is util.Uninferable @test_utils.require_version(minver="3.9") def test_infer_arg_called_type_defined_in_outer_scope_is_uninferable(self): # 
https://github.com/pylint-dev/astroid/pull/958 node = extract_node( """ def outer(type): def inner(): type[0] #@ """ ) inferred = next(node.infer()) assert not isinstance(inferred, nodes.ClassDef) # was inferred as builtins.type assert inferred is util.Uninferable def test_infer_subclass_attr_instance_attr_indirect(self) -> None: node = extract_node( """ class Parent: def __init__(self): self.data = 123 class Test(Parent): pass t = Test() t """ ) inferred = next(node.infer()) assert isinstance(inferred, Instance) const = next(inferred.igetattr("data")) assert isinstance(const, nodes.Const) assert const.value == 123 def test_infer_subclass_attr_instance_attr(self) -> None: node = extract_node( """ class Parent: def __init__(self): self.data = 123 class Test(Parent): pass t = Test() t.data """ ) inferred = next(node.infer()) assert isinstance(inferred, nodes.Const) assert inferred.value == 123 def test_uninferable_type_subscript(self) -> None: node = extract_node("[type for type in [] if type['id']]") with self.assertRaises(InferenceError): _ = next(node.infer()) class GetattrTest(unittest.TestCase): def test_yes_when_unknown(self) -> None: ast_nodes = extract_node( """ from missing import Missing getattr(1, Unknown) #@ getattr(Unknown, 'a') #@ getattr(Unknown, Unknown) #@ getattr(Unknown, Unknown, Unknown) #@ getattr(Missing, 'a') #@ getattr(Missing, Missing) #@ getattr('a', Missing) #@ getattr('a', Missing, Missing) #@ """ ) for node in ast_nodes[:4]: self.assertRaises(InferenceError, next, node.infer()) for node in ast_nodes[4:]: inferred = next(node.infer()) self.assertEqual(inferred, util.Uninferable, node) def test_attrname_not_string(self) -> None: ast_nodes = extract_node( """ getattr(1, 1) #@ c = int getattr(1, c) #@ """ ) for node in ast_nodes: self.assertRaises(InferenceError, next, node.infer()) def test_attribute_missing(self) -> None: ast_nodes = extract_node( """ getattr(1, 'ala') #@ getattr(int, 'ala') #@ getattr(float, 'bala') #@ getattr({}, 
'portocala') #@ """ ) for node in ast_nodes: self.assertRaises(InferenceError, next, node.infer()) def test_default(self) -> None: ast_nodes = extract_node( """ getattr(1, 'ala', None) #@ getattr(int, 'bala', int) #@ getattr(int, 'bala', getattr(int, 'portocala', None)) #@ """ ) assert isinstance(ast_nodes, list) first = next(ast_nodes[0].infer()) self.assertIsInstance(first, nodes.Const) self.assertIsNone(first.value) second = next(ast_nodes[1].infer()) self.assertIsInstance(second, nodes.ClassDef) self.assertEqual(second.qname(), "builtins.int") third = next(ast_nodes[2].infer()) self.assertIsInstance(third, nodes.Const) self.assertIsNone(third.value) def test_lookup(self) -> None: ast_nodes = extract_node( """ class A(object): def test(self): pass class B(A): def test_b(self): pass class C(A): pass class E(C, B): def test_e(self): pass getattr(A(), 'test') #@ getattr(A, 'test') #@ getattr(E(), 'test_b') #@ getattr(E(), 'test') #@ class X(object): def test(self): getattr(self, 'test') #@ """ ) assert isinstance(ast_nodes, list) first = next(ast_nodes[0].infer()) self.assertIsInstance(first, BoundMethod) self.assertEqual(first.bound.name, "A") second = next(ast_nodes[1].infer()) self.assertIsInstance(second, UnboundMethod) self.assertIsInstance(second.parent, nodes.ClassDef) self.assertEqual(second.parent.name, "A") third = next(ast_nodes[2].infer()) self.assertIsInstance(third, BoundMethod) # Bound to E, but the provider is B. 
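The `getattr`/`hasattr` tests above reflect the builtins' runtime behaviour: a missing attribute returns the supplied default (or raises `AttributeError` without one), and attribute access on an instance produces a bound method. One difference worth noting: astroid models `A.test` as an `UnboundMethod`, while Python 3 itself exposes it as a plain function. A quick sketch:

```python
class A:
    def test(self):
        pass


print(getattr(1, "ala", None))            # None, the default
print(getattr(int, "bala", int) is int)   # True
print(type(getattr(A(), "test")).__name__)  # method, i.e. bound
print(type(getattr(A, "test")).__name__)    # function in Python 3
print(hasattr(int, "ala"), hasattr(A(), "test"))  # False True
```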
self.assertEqual(third.bound.name, "E") self.assertEqual(third._proxied._proxied.parent.name, "B") fourth = next(ast_nodes[3].infer()) self.assertIsInstance(fourth, BoundMethod) self.assertEqual(fourth.bound.name, "E") self.assertEqual(third._proxied._proxied.parent.name, "B") fifth = next(ast_nodes[4].infer()) self.assertIsInstance(fifth, BoundMethod) self.assertEqual(fifth.bound.name, "X") def test_lambda(self) -> None: node = extract_node( """ getattr(lambda x: x, 'f') #@ """ ) inferred = next(node.infer()) self.assertEqual(inferred, util.Uninferable) class HasattrTest(unittest.TestCase): def test_inference_errors(self) -> None: ast_nodes = extract_node( """ from missing import Missing hasattr(Unknown, 'ala') #@ hasattr(Missing, 'bala') #@ hasattr('portocala', Missing) #@ """ ) for node in ast_nodes: inferred = next(node.infer()) self.assertEqual(inferred, util.Uninferable) def test_attribute_is_missing(self) -> None: ast_nodes = extract_node( """ class A: pass hasattr(int, 'ala') #@ hasattr({}, 'bala') #@ hasattr(A(), 'portocala') #@ """ ) for node in ast_nodes: inferred = next(node.infer()) self.assertIsInstance(inferred, nodes.Const) self.assertFalse(inferred.value) def test_attribute_is_not_missing(self) -> None: ast_nodes = extract_node( """ class A(object): def test(self): pass class B(A): def test_b(self): pass class C(A): pass class E(C, B): def test_e(self): pass hasattr(A(), 'test') #@ hasattr(A, 'test') #@ hasattr(E(), 'test_b') #@ hasattr(E(), 'test') #@ class X(object): def test(self): hasattr(self, 'test') #@ """ ) for node in ast_nodes: inferred = next(node.infer()) self.assertIsInstance(inferred, nodes.Const) self.assertTrue(inferred.value) def test_lambda(self) -> None: node = extract_node( """ hasattr(lambda x: x, 'f') #@ """ ) inferred = next(node.infer()) self.assertIsInstance(inferred, nodes.Const) self.assertIs(inferred.value, False) class BoolOpTest(unittest.TestCase): def test_bool_ops(self) -> None: expected = [ ("1 and 2", 2), ("0 and 
2", 0), ("1 or 2", 1), ("0 or 2", 2), ("0 or 0 or 1", 1), ("1 and 2 and 3", 3), ("1 and 2 or 3", 2), ("1 and 0 or 3", 3), ("1 or 0 and 2", 1), ("(1 and 2) and (2 and 3)", 3), ("not 2 and 3", False), ("2 and not 3", False), ("not 0 and 3", 3), ("True and False", False), ("not (True or False) and True", False), ] for code, expected_value in expected: node = extract_node(code) inferred = next(node.infer()) self.assertEqual(inferred.value, expected_value) def test_yes_when_unknown(self) -> None: ast_nodes = extract_node( """ from unknown import unknown, any, not_any 0 and unknown #@ unknown or 0 #@ any or not_any and unknown #@ """ ) for node in ast_nodes: inferred = next(node.infer()) self.assertEqual(inferred, util.Uninferable) def test_other_nodes(self) -> None: ast_nodes = extract_node( """ def test(): pass test and 0 #@ 1 and test #@ """ ) assert isinstance(ast_nodes, list) first = next(ast_nodes[0].infer()) self.assertEqual(first.value, 0) second = next(ast_nodes[1].infer()) self.assertIsInstance(second, nodes.FunctionDef) self.assertEqual(second.name, "test") class TestCallable(unittest.TestCase): def test_callable(self) -> None: expected = [ ("callable(len)", True), ('callable("a")', False), ("callable(callable)", True), ("callable(lambda x, y: x+y)", True), ("import os; __(callable(os))", False), ("callable(int)", True), ( """ def test(): pass callable(test) #@""", True, ), ( """ class C1: def meth(self): pass callable(C1) #@""", True, ), ] for code, expected_value in expected: node = extract_node(code) inferred = next(node.infer()) self.assertEqual(inferred.value, expected_value) def test_callable_methods(self) -> None: ast_nodes = extract_node( """ class C: def test(self): pass @staticmethod def static(): pass @classmethod def class_method(cls): pass def __call__(self): pass class D(C): pass class NotReallyCallableDueToPythonMisfeature(object): __call__ = 42 callable(C.test) #@ callable(C.static) #@ callable(C.class_method) #@ callable(C().test) #@ 
callable(C().static) #@ callable(C().class_method) #@ C #@ C() #@ NotReallyCallableDueToPythonMisfeature() #@ staticmethod #@ classmethod #@ property #@ D #@ D() #@ """ ) for node in ast_nodes: inferred = next(node.infer()) self.assertTrue(inferred) def test_inference_errors(self) -> None: ast_nodes = extract_node( """ from unknown import unknown callable(unknown) #@ def test(): return unknown callable(test()) #@ """ ) for node in ast_nodes: inferred = next(node.infer()) self.assertEqual(inferred, util.Uninferable) def test_not_callable(self) -> None: ast_nodes = extract_node( """ callable("") #@ callable(1) #@ callable(True) #@ """ ) for node in ast_nodes: inferred = next(node.infer()) self.assertFalse(inferred.value) class TestBool(unittest.TestCase): def test_bool(self) -> None: pairs = [ ("bool()", False), ("bool(1)", True), ("bool(0)", False), ("bool([])", False), ("bool([1])", True), ("bool({})", False), ("bool(True)", True), ("bool(False)", False), ("bool(None)", False), ("from unknown import Unknown; __(bool(Unknown))", util.Uninferable), ] for code, expected in pairs: node = extract_node(code) inferred = next(node.infer()) if expected is util.Uninferable: self.assertEqual(expected, inferred) else: self.assertEqual(inferred.value, expected) def test_bool_bool_special_method(self) -> None: ast_nodes = extract_node( """ class FalseClass: def __bool__(self): return False class TrueClass: def __bool__(self): return True class C(object): def __call__(self): return False class B(object): __bool__ = C() class LambdaBoolFalse(object): __bool__ = lambda self: self.foo @property def foo(self): return 0 class FalseBoolLen(object): __len__ = lambda self: self.foo @property def foo(self): return 0 bool(FalseClass) #@ bool(TrueClass) #@ bool(FalseClass()) #@ bool(TrueClass()) #@ bool(B()) #@ bool(LambdaBoolFalse()) #@ bool(FalseBoolLen()) #@ """ ) expected = [True, True, False, True, False, False, False] for node, expected_value in zip(ast_nodes, expected): inferred = 
next(node.infer()) self.assertEqual(inferred.value, expected_value) def test_bool_instance_not_callable(self) -> None: ast_nodes = extract_node( """ class BoolInvalid(object): __bool__ = 42 class LenInvalid(object): __len__ = "a" bool(BoolInvalid()) #@ bool(LenInvalid()) #@ """ ) for node in ast_nodes: inferred = next(node.infer()) self.assertEqual(inferred, util.Uninferable) def test_class_subscript(self) -> None: node = extract_node( """ class Foo: def __class_getitem__(cls, *args, **kwargs): return cls Foo[int] """ ) inferred = next(node.infer()) self.assertIsInstance(inferred, nodes.ClassDef) self.assertEqual(inferred.name, "Foo") def test_class_subscript_inference_context(self) -> None: """Context path has a reference to any parents inferred by getitem().""" code = """ class Parent: pass class A(Parent): def __class_getitem__(self, value): return cls """ klass = extract_node(code) context = InferenceContext() # For this test, we want a fresh inference, rather than a cache hit on # the inference done at brain time in _is_enum_subclass() context.lookupname = "Fresh lookup!" 
_ = klass.getitem(0, context=context) assert next(iter(context.path))[0].name == "Parent" class TestType(unittest.TestCase): def test_type(self) -> None: pairs = [ ("type(1)", "int"), ("type(type)", "type"), ("type(None)", "NoneType"), ("type(object)", "type"), ("type(dict())", "dict"), ("type({})", "dict"), ("type(frozenset())", "frozenset"), ] for code, expected in pairs: node = extract_node(code) inferred = next(node.infer()) self.assertIsInstance(inferred, nodes.ClassDef) self.assertEqual(inferred.name, expected) class ArgumentsTest(unittest.TestCase): @staticmethod def _get_dict_value( inferred: dict, ) -> list[tuple[str, int]] | list[tuple[str, str]]: items = inferred.items return sorted((key.value, value.value) for key, value in items) @staticmethod def _get_tuple_value(inferred: tuple) -> tuple[int, ...]: elts = inferred.elts return tuple(elt.value for elt in elts) def test_args(self) -> None: expected_values = [ (), (1,), (2, 3), (4, 5), (3,), (), (3, 4, 5), (), (), (4,), (4, 5), (), (3,), (), (), (3,), (42,), ] ast_nodes = extract_node( """ def func(*args): return args func() #@ func(1) #@ func(2, 3) #@ func(*(4, 5)) #@ def func(a, b, *args): return args func(1, 2, 3) #@ func(1, 2) #@ func(1, 2, 3, 4, 5) #@ def func(a, b, c=42, *args): return args func(1, 2) #@ func(1, 2, 3) #@ func(1, 2, 3, 4) #@ func(1, 2, 3, 4, 5) #@ func = lambda a, b, *args: args func(1, 2) #@ func(1, 2, 3) #@ func = lambda a, b=42, *args: args func(1) #@ func(1, 2) #@ func(1, 2, 3) #@ func(1, 2, *(42, )) #@ """ ) for node, expected_value in zip(ast_nodes, expected_values): inferred = next(node.infer()) self.assertIsInstance(inferred, nodes.Tuple) self.assertEqual(self._get_tuple_value(inferred), expected_value) def test_multiple_starred_args(self) -> None: expected_values = [(1, 2, 3), (1, 4, 2, 3, 5, 6, 7)] ast_nodes = extract_node( """ def func(a, b, *args): return args func(1, 2, *(1, ), *(2, 3)) #@ func(1, 2, *(1, ), 4, *(2, 3), 5, *(6, 7)) #@ """ ) for node, expected_value in 
zip(ast_nodes, expected_values): inferred = next(node.infer()) self.assertIsInstance(inferred, nodes.Tuple) self.assertEqual(self._get_tuple_value(inferred), expected_value) def test_defaults(self) -> None: expected_values = [42, 3, 41, 42] ast_nodes = extract_node( """ def func(a, b, c=42, *args): return c func(1, 2) #@ func(1, 2, 3) #@ func(1, 2, c=41) #@ func(1, 2, 42, 41) #@ """ ) for node, expected_value in zip(ast_nodes, expected_values): inferred = next(node.infer()) self.assertIsInstance(inferred, nodes.Const) self.assertEqual(inferred.value, expected_value) def test_kwonly_args(self) -> None: expected_values = [24, 24, 42, 23, 24, 24, 54] ast_nodes = extract_node( """ def test(*, f, b): return f test(f=24, b=33) #@ def test(a, *, f): return f test(1, f=24) #@ def test(a, *, f=42): return f test(1) #@ test(1, f=23) #@ def test(a, b, c=42, *args, f=24): return f test(1, 2, 3) #@ test(1, 2, 3, 4) #@ test(1, 2, 3, 4, 5, f=54) #@ """ ) for node, expected_value in zip(ast_nodes, expected_values): inferred = next(node.infer()) self.assertIsInstance(inferred, nodes.Const) self.assertEqual(inferred.value, expected_value) def test_kwargs(self) -> None: expected = [[("a", 1), ("b", 2), ("c", 3)], [("a", 1)], [("a", "b")]] ast_nodes = extract_node( """ def test(**kwargs): return kwargs test(a=1, b=2, c=3) #@ test(a=1) #@ test(**{'a': 'b'}) #@ """ ) for node, expected_value in zip(ast_nodes, expected): inferred = next(node.infer()) self.assertIsInstance(inferred, nodes.Dict) value = self._get_dict_value(inferred) self.assertEqual(value, expected_value) def test_kwargs_and_other_named_parameters(self) -> None: ast_nodes = extract_node( """ def test(a=42, b=24, **kwargs): return kwargs test(42, 24, c=3, d=4) #@ test(49, b=24, d=4) #@ test(a=42, b=33, c=3, d=42) #@ test(a=42, **{'c':42}) #@ """ ) expected_values = [ [("c", 3), ("d", 4)], [("d", 4)], [("c", 3), ("d", 42)], [("c", 42)], ] for node, expected_value in zip(ast_nodes, expected_values): inferred = 
next(node.infer()) self.assertIsInstance(inferred, nodes.Dict) value = self._get_dict_value(inferred) self.assertEqual(value, expected_value) def test_kwargs_access_by_name(self) -> None: expected_values = [42, 42, 42, 24] ast_nodes = extract_node( """ def test(**kwargs): return kwargs['f'] test(f=42) #@ test(**{'f': 42}) #@ test(**dict(f=42)) #@ def test(f=42, **kwargs): return kwargs['l'] test(l=24) #@ """ ) for ast_node, value in zip(ast_nodes, expected_values): inferred = next(ast_node.infer()) self.assertIsInstance(inferred, nodes.Const, inferred) self.assertEqual(inferred.value, value) def test_multiple_kwargs(self) -> None: expected_value = [("a", 1), ("b", 2), ("c", 3), ("d", 4), ("f", 42)] ast_node = extract_node( """ def test(**kwargs): return kwargs test(a=1, b=2, **{'c': 3}, **{'d': 4}, f=42) #@ """ ) inferred = next(ast_node.infer()) self.assertIsInstance(inferred, nodes.Dict) value = self._get_dict_value(inferred) self.assertEqual(value, expected_value) def test_kwargs_are_overridden(self) -> None: ast_nodes = extract_node( """ def test(f): return f test(f=23, **{'f': 34}) #@ def test(f=None): return f test(f=23, **{'f':23}) #@ """ ) for ast_node in ast_nodes: inferred = next(ast_node.infer()) self.assertEqual(inferred, util.Uninferable) def test_fail_to_infer_args(self) -> None: ast_nodes = extract_node( """ def test(a, **kwargs): return a test(*missing) #@ test(*object) #@ test(*1) #@ def test(**kwargs): return kwargs test(**miss) #@ test(**(1, 2)) #@ test(**1) #@ test(**{misss:1}) #@ test(**{object:1}) #@ test(**{1:1}) #@ test(**{'a':1, 'a':1}) #@ def test(a): return a test() #@ test(1, 2, 3) #@ from unknown import unknown test(*unknown) #@ def test(*args): return args test(*unknown) #@ """ ) for node in ast_nodes: inferred = next(node.infer()) self.assertEqual(inferred, util.Uninferable) def test_args_overwritten(self) -> None: # https://github.com/pylint-dev/astroid/issues/180 node = extract_node( """ next = 42 def wrapper(next=next): next = 24 
def test(): return next return test wrapper()() #@ """ ) assert isinstance(node, nodes.NodeNG) inferred = node.inferred() self.assertEqual(len(inferred), 1) self.assertIsInstance(inferred[0], nodes.Const, inferred[0]) self.assertEqual(inferred[0].value, 24) class SliceTest(unittest.TestCase): def test_slice(self) -> None: ast_nodes = [ ("[1, 2, 3][slice(None)]", [1, 2, 3]), ("[1, 2, 3][slice(None, None)]", [1, 2, 3]), ("[1, 2, 3][slice(None, None, None)]", [1, 2, 3]), ("[1, 2, 3][slice(1, None)]", [2, 3]), ("[1, 2, 3][slice(None, 1, None)]", [1]), ("[1, 2, 3][slice(0, 1)]", [1]), ("[1, 2, 3][slice(0, 3, 2)]", [1, 3]), ] for node, expected_value in ast_nodes: ast_node = extract_node(f"__({node})") inferred = next(ast_node.infer()) self.assertIsInstance(inferred, nodes.List) self.assertEqual([elt.value for elt in inferred.elts], expected_value) def test_slice_inference_error(self) -> None: ast_nodes = extract_node( """ from unknown import unknown [1, 2, 3][slice(None, unknown, unknown)] #@ [1, 2, 3][slice(None, missing, missing)] #@ [1, 2, 3][slice(object, list, tuple)] #@ [1, 2, 3][slice(b'a')] #@ [1, 2, 3][slice(1, 'aa')] #@ [1, 2, 3][slice(1, 2.0, 3.0)] #@ [1, 2, 3][slice()] #@ [1, 2, 3][slice(1, 2, 3, 4)] #@ """ ) for node in ast_nodes: self.assertRaises(InferenceError, next, node.infer()) def test_slice_attributes(self) -> None: ast_nodes = [ ("slice(2, 3, 4)", (2, 3, 4)), ("slice(None, None, 4)", (None, None, 4)), ("slice(None, 1, None)", (None, 1, None)), ] for code, values in ast_nodes: lower, upper, step = values node = extract_node(code) inferred = next(node.infer()) self.assertIsInstance(inferred, nodes.Slice) lower_value = next(inferred.igetattr("start")) self.assertIsInstance(lower_value, nodes.Const) self.assertEqual(lower_value.value, lower) higher_value = next(inferred.igetattr("stop")) self.assertIsInstance(higher_value, nodes.Const) self.assertEqual(higher_value.value, upper) step_value = next(inferred.igetattr("step")) 
self.assertIsInstance(step_value, nodes.Const) self.assertEqual(step_value.value, step) self.assertEqual(inferred.pytype(), "builtins.slice") def test_slice_type(self) -> None: ast_node = extract_node("type(slice(None, None, None))") inferred = next(ast_node.infer()) self.assertIsInstance(inferred, nodes.ClassDef) self.assertEqual(inferred.name, "slice") class CallSiteTest(unittest.TestCase): @staticmethod def _call_site_from_call(call: nodes.Call) -> CallSite: return arguments.CallSite.from_call(call) def _test_call_site_pair( self, code: str, expected_args: list[int], expected_keywords: dict[str, int] ) -> None: ast_node = extract_node(code) call_site = self._call_site_from_call(ast_node) self.assertEqual(len(call_site.positional_arguments), len(expected_args)) self.assertEqual( [arg.value for arg in call_site.positional_arguments], expected_args ) self.assertEqual(len(call_site.keyword_arguments), len(expected_keywords)) for keyword, value in expected_keywords.items(): self.assertIn(keyword, call_site.keyword_arguments) self.assertEqual(call_site.keyword_arguments[keyword].value, value) def _test_call_site( self, pairs: list[tuple[str, list[int], dict[str, int]]] ) -> None: for pair in pairs: self._test_call_site_pair(*pair) def test_call_site_starred_args(self) -> None: pairs = [ ( "f(*(1, 2), *(2, 3), *(3, 4), **{'a':1}, **{'b': 2})", [1, 2, 2, 3, 3, 4], {"a": 1, "b": 2}, ), ( "f(1, 2, *(3, 4), 5, *(6, 7), f=24, **{'c':3})", [1, 2, 3, 4, 5, 6, 7], {"f": 24, "c": 3}, ), # Too many 'f' keyword arguments passed in.
("f(f=24, **{'f':24})", [], {}), ] self._test_call_site(pairs) def test_call_site(self) -> None: pairs = [ ("f(1, 2)", [1, 2], {}), ("f(1, 2, *(1, 2))", [1, 2, 1, 2], {}), ("f(a=1, b=2, c=3)", [], {"a": 1, "b": 2, "c": 3}), ] self._test_call_site(pairs) def _test_call_site_valid_arguments(self, values: list[str], invalid: bool) -> None: for value in values: ast_node = extract_node(value) call_site = self._call_site_from_call(ast_node) self.assertEqual(call_site.has_invalid_arguments(), invalid) def test_call_site_valid_arguments(self) -> None: values = ["f(*lala)", "f(*1)", "f(*object)"] self._test_call_site_valid_arguments(values, invalid=True) values = ["f()", "f(*(1, ))", "f(1, 2, *(2, 3))"] self._test_call_site_valid_arguments(values, invalid=False) def test_duplicated_keyword_arguments(self) -> None: ast_node = extract_node('f(f=24, **{"f": 25})') site = self._call_site_from_call(ast_node) self.assertIn("f", site.duplicated_keywords) def test_call_site_uninferable(self) -> None: code = """ def get_nums(): nums = () if x == '1': nums = (1, 2) return nums def add(x, y): return x + y nums = get_nums() if x: kwargs = {1: bar} else: kwargs = {} if nums: add(*nums) print(**kwargs) """ # Test that `*nums` argument should be Uninferable ast = parse(code, __name__) *_, add_call, print_call = list(ast.nodes_of_class(nodes.Call)) nums_arg = add_call.args[0] add_call_site = self._call_site_from_call(add_call) self.assertEqual(add_call_site._unpack_args([nums_arg]), [Uninferable]) print_call_site = self._call_site_from_call(print_call) keywords = CallContext(print_call.args, print_call.keywords).keywords self.assertEqual( print_call_site._unpack_keywords(keywords), {None: Uninferable} ) class ObjectDunderNewTest(unittest.TestCase): def test_object_dunder_new_is_inferred_if_decorator(self) -> None: node = extract_node( """ @object.__new__ class instance(object): pass """ ) inferred = next(node.infer()) self.assertIsInstance(inferred, Instance) def test_augassign_recursion() 
-> None: """Make sure inference doesn't throw a RecursionError. Regression test for augmented assign dropping context.path causing recursion errors """ # infinitely recurses in python code = """ def rec(): a = 0 a += rec() return a rec() """ cls_node = extract_node(code) assert next(cls_node.infer()) is util.Uninferable def test_infer_custom_inherit_from_property() -> None: node = extract_node( """ class custom_property(property): pass class MyClass(object): @custom_property def my_prop(self): return 1 MyClass().my_prop """ ) inferred = next(node.infer()) assert isinstance(inferred, nodes.Const) assert inferred.value == 1 def test_cannot_infer_call_result_for_builtin_methods() -> None: node = extract_node( """ a = "fast" a """ ) inferred = next(node.infer()) lenmeth = next(inferred.igetattr("__len__")) with pytest.raises(InferenceError): next(lenmeth.infer_call_result(None, None)) def test_unpack_dicts_in_assignment() -> None: ast_nodes = extract_node( """ a, b = {1:2, 2:3} a #@ b #@ """ ) assert isinstance(ast_nodes, list) first_inferred = next(ast_nodes[0].infer()) second_inferred = next(ast_nodes[1].infer()) assert isinstance(first_inferred, nodes.Const) assert first_inferred.value == 1 assert isinstance(second_inferred, nodes.Const) assert second_inferred.value == 2 def test_slice_inference_in_for_loops() -> None: node = extract_node( """ for a, (c, *b) in [(1, (2, 3, 4)), (4, (5, 6))]: b #@ """ ) inferred = next(node.infer()) assert isinstance(inferred, nodes.List) assert inferred.as_string() == "[3, 4]" node = extract_node( """ for a, *b in [(1, 2, 3, 4)]: b #@ """ ) inferred = next(node.infer()) assert isinstance(inferred, nodes.List) assert inferred.as_string() == "[2, 3, 4]" node = extract_node( """ for a, *b in [(1,)]: b #@ """ ) inferred = next(node.infer()) assert isinstance(inferred, nodes.List) assert inferred.as_string() == "[]" def test_slice_inference_in_for_loops_not_working() -> None: ast_nodes = extract_node( """ from unknown import Unknown for 
a, *b in something: b #@ for a, *b in Unknown: b #@ for a, *b in (1): b #@ """ ) for node in ast_nodes: inferred = next(node.infer()) assert inferred == util.Uninferable def test_slice_zero_step_does_not_raise_ValueError() -> None: node = extract_node("x = [][::0]; x") assert next(node.infer()) == util.Uninferable def test_slice_zero_step_on_str_does_not_raise_ValueError() -> None: node = extract_node('x = ""[::0]; x') assert next(node.infer()) == util.Uninferable def test_unpacking_starred_and_dicts_in_assignment() -> None: node = extract_node( """ a, *b = {1:2, 2:3, 3:4} b """ ) inferred = next(node.infer()) assert isinstance(inferred, nodes.List) assert inferred.as_string() == "[2, 3]" node = extract_node( """ a, *b = {1:2} b """ ) inferred = next(node.infer()) assert isinstance(inferred, nodes.List) assert inferred.as_string() == "[]" def test_unpacking_starred_empty_list_in_assignment() -> None: node = extract_node( """ a, *b, c = [1, 2] b #@ """ ) inferred = next(node.infer()) assert isinstance(inferred, nodes.List) assert inferred.as_string() == "[]" def test_regression_infinite_loop_decorator() -> None: """Make sure decorators with the same names as a decorated method do not cause an infinite loop. 
See https://github.com/pylint-dev/astroid/issues/375 """ code = """ from functools import lru_cache class Foo(): @lru_cache() def lru_cache(self, value): print('Computing {}'.format(value)) return value Foo().lru_cache(1) """ node = extract_node(code) assert isinstance(node, nodes.NodeNG) [result] = node.inferred() assert result.value == 1 def test_stop_iteration_in_int() -> None: """Handle StopIteration error in infer_int.""" code = """ def f(lst): if lst[0]: return f(lst) else: args = lst[:1] return int(args[0]) f([]) """ [first_result, second_result] = extract_node(code).inferred() assert first_result is util.Uninferable assert isinstance(second_result, Instance) assert second_result.name == "int" def test_call_on_instance_with_inherited_dunder_call_method() -> None: """Stop inherited __call__ method from incorrectly returning wrong class. See https://github.com/pylint-dev/pylint/issues/2199 """ node = extract_node( """ class Base: def __call__(self): return self class Sub(Base): pass obj = Sub() val = obj() val #@ """ ) assert isinstance(node, nodes.NodeNG) [val] = node.inferred() assert isinstance(val, Instance) assert val.name == "Sub" class TestInferencePropagation: """Make sure function argument values are properly propagated to sub functions. 
""" @pytest.mark.xfail(reason="Relying on path copy") def test_call_context_propagation(self): n = extract_node( """ def chest(a): return a * a def best(a, b): return chest(a) def test(a, b, c): return best(a, b) test(4, 5, 6) #@ """ ) assert next(n.infer()).as_string() == "16" def test_call_starargs_propagation(self) -> None: code = """ def foo(*args): return args def bar(*args): return foo(*args) bar(4, 5, 6, 7) #@ """ assert next(extract_node(code).infer()).as_string() == "(4, 5, 6, 7)" def test_call_kwargs_propagation(self) -> None: code = """ def b(**kwargs): return kwargs def f(**kwargs): return b(**kwargs) f(**{'f': 1}) #@ """ assert next(extract_node(code).infer()).as_string() == "{'f': 1}" @pytest.mark.parametrize( "op,result", [ ("<", False), ("<=", True), ("==", True), (">=", True), (">", False), ("!=", False), ], ) def test_compare(op, result) -> None: code = f""" 123 {op} 123 """ node = extract_node(code) inferred = next(node.infer()) assert inferred.value == result @pytest.mark.xfail(reason="uninferable") @pytest.mark.parametrize( "op,result", [ ("is", True), ("is not", False), ], ) def test_compare_identity(op, result) -> None: code = f""" obj = object() obj {op} obj """ node = extract_node(code) inferred = next(node.infer()) assert inferred.value == result @pytest.mark.parametrize( "op,result", [ ("in", True), ("not in", False), ], ) def test_compare_membership(op, result) -> None: code = f""" 1 {op} [1, 2, 3] """ node = extract_node(code) inferred = next(node.infer()) assert inferred.value == result @pytest.mark.parametrize( "lhs,rhs,result", [ (1, 1, True), (1, 1.1, True), (1.1, 1, False), (1.0, 1.0, True), ("abc", "def", True), ("abc", "", False), ([], [1], True), ((1, 2), (2, 3), True), ((1, 0), (1,), False), (True, True, True), (True, False, False), (False, 1, True), (1 + 0j, 2 + 0j, util.Uninferable), (+0.0, -0.0, True), (0, "1", util.Uninferable), (b"\x00", b"\x01", True), ], ) def test_compare_lesseq_types(lhs, rhs, result) -> None: code = 
f""" {lhs!r} <= {rhs!r} """ node = extract_node(code) inferred = next(node.infer()) assert inferred.value == result def test_compare_chained() -> None: code = """ 3 < 5 > 3 """ node = extract_node(code) inferred = next(node.infer()) assert inferred.value is True def test_compare_inferred_members() -> None: code = """ a = 11 b = 13 a < b """ node = extract_node(code) inferred = next(node.infer()) assert inferred.value is True def test_compare_instance_members() -> None: code = """ class A: value = 123 class B: @property def value(self): return 456 A().value < B().value """ node = extract_node(code) inferred = next(node.infer()) assert inferred.value is True @pytest.mark.xfail(reason="unimplemented") def test_compare_dynamic() -> None: code = """ class A: def __le__(self, other): return True A() <= None """ node = extract_node(code) inferred = next(node.infer()) assert inferred.value is True def test_compare_uninferable_member() -> None: code = """ from unknown import UNKNOWN 0 <= UNKNOWN """ node = extract_node(code) inferred = next(node.infer()) assert inferred is util.Uninferable def test_compare_chained_comparisons_shortcircuit_on_false() -> None: code = """ from unknown import UNKNOWN 2 < 1 < UNKNOWN """ node = extract_node(code) inferred = next(node.infer()) assert inferred.value is False def test_compare_chained_comparisons_continue_on_true() -> None: code = """ from unknown import UNKNOWN 1 < 2 < UNKNOWN """ node = extract_node(code) inferred = next(node.infer()) assert inferred is util.Uninferable @pytest.mark.xfail(reason="unimplemented") def test_compare_known_false_branch() -> None: code = """ a = 'hello' if 1 < 2: a = 'goodbye' a """ node = extract_node(code) inferred = list(node.infer()) assert len(inferred) == 1 assert isinstance(inferred[0], nodes.Const) assert inferred[0].value == "hello" def test_compare_ifexp_constant() -> None: code = """ a = 'hello' if 1 < 2 else 'goodbye' a """ node = extract_node(code) inferred = list(node.infer()) assert 
len(inferred) == 1 assert isinstance(inferred[0], nodes.Const) assert inferred[0].value == "hello" def test_compare_typeerror() -> None: code = """ 123 <= "abc" """ node = extract_node(code) inferred = list(node.infer()) assert len(inferred) == 1 assert inferred[0] is util.Uninferable def test_compare_multiple_possibilities() -> None: code = """ from unknown import UNKNOWN a = 1 if UNKNOWN: a = 2 b = 3 if UNKNOWN: b = 4 a < b """ node = extract_node(code) inferred = list(node.infer()) assert len(inferred) == 1 # All possible combinations are true: (1 < 3), (1 < 4), (2 < 3), (2 < 4) assert inferred[0].value is True def test_compare_ambiguous_multiple_possibilities() -> None: code = """ from unknown import UNKNOWN a = 1 if UNKNOWN: a = 3 b = 2 if UNKNOWN: b = 4 a < b """ node = extract_node(code) inferred = list(node.infer()) assert len(inferred) == 1 # Not all possible combinations are true: (1 < 2), (1 < 4), (3 !< 2), (3 < 4) assert inferred[0] is util.Uninferable def test_compare_nonliteral() -> None: code = """ def func(a, b): return (a, b) <= (1, 2) #@ """ return_node = extract_node(code) node = return_node.value inferred = list(node.infer()) # should not raise ValueError assert len(inferred) == 1 assert inferred[0] is util.Uninferable def test_compare_unknown() -> None: code = """ def func(a): if tuple() + (a[1],) in set(): raise Exception() """ node = extract_node(code) inferred = list(node.infer()) assert len(inferred) == 1 assert isinstance(inferred[0], nodes.FunctionDef) def test_limit_inference_result_amount() -> None: """Test limiting the number of inference results.""" code = """ args = [] if True: args += ['a'] if True: args += ['b'] if True: args += ['c'] if True: args += ['d'] args #@ """ result = extract_node(code).inferred() assert len(result) == 16 with patch("astroid.manager.AstroidManager.max_inferable_values", 4): result_limited = extract_node(code).inferred() # Can't guarantee exact size assert len(result_limited) < 16 # Will not always be at the
end assert util.Uninferable in result_limited def test_attribute_inference_should_not_access_base_classes() -> None: """Attributes of classes should mask ancestor attributes.""" code = """ type.__new__ #@ """ res = extract_node(code).inferred() assert len(res) == 1 assert res[0].parent.name == "type" def test_attribute_mro_object_inference() -> None: """Inference should only infer results from the first available method.""" inferred = extract_node( """ class A: def foo(self): return 1 class B(A): def foo(self): return 2 B().foo() #@ """ ).inferred() assert len(inferred) == 1 assert inferred[0].value == 2 def test_inferred_sequence_unpacking_works() -> None: inferred = next( extract_node( """ def test(*args): return (1, *args) test(2) #@ """ ).infer() ) assert isinstance(inferred, nodes.Tuple) assert len(inferred.elts) == 2 assert [value.value for value in inferred.elts] == [1, 2] def test_recursion_error_inferring_slice() -> None: node = extract_node( """ class MyClass: def __init__(self): self._slice = slice(0, 10) def incr(self): self._slice = slice(0, self._slice.stop + 1) def test(self): self._slice #@ """ ) inferred = next(node.infer()) assert isinstance(inferred, Slice) def test_exception_lookup_last_except_handler_wins() -> None: node = extract_node( """ try: 1/0 except ValueError as exc: pass try: 1/0 except OSError as exc: exc #@ """ ) assert isinstance(node, nodes.NodeNG) inferred = node.inferred() assert len(inferred) == 1 inferred_exc = inferred[0] assert isinstance(inferred_exc, Instance) assert inferred_exc.name == "OSError" # Two except handlers on the same Try work the same as separate node = extract_node( """ try: 1/0 except ZeroDivisionError as exc: pass except ValueError as exc: exc #@ """ ) assert isinstance(node, nodes.NodeNG) inferred = node.inferred() assert len(inferred) == 1 inferred_exc = inferred[0] assert isinstance(inferred_exc, Instance) assert inferred_exc.name == "ValueError" def test_exception_lookup_name_bound_in_except_handler() 
-> None: node = extract_node( """ try: 1/0 except ValueError: name = 1 try: 1/0 except OSError: name = 2 name #@ """ ) assert isinstance(node, nodes.NodeNG) inferred = node.inferred() assert len(inferred) == 1 inferred_exc = inferred[0] assert isinstance(inferred_exc, nodes.Const) assert inferred_exc.value == 2 def test_builtin_inference_list_of_exceptions() -> None: node = extract_node( """ tuple([ValueError, TypeError]) """ ) inferred = next(node.infer()) assert isinstance(inferred, nodes.Tuple) assert len(inferred.elts) == 2 assert isinstance(inferred.elts[0], nodes.EvaluatedObject) assert isinstance(inferred.elts[0].value, nodes.ClassDef) assert inferred.elts[0].value.name == "ValueError" assert isinstance(inferred.elts[1], nodes.EvaluatedObject) assert isinstance(inferred.elts[1].value, nodes.ClassDef) assert inferred.elts[1].value.name == "TypeError" # Test that inference of evaluated objects returns what is expected first_elem = next(inferred.elts[0].infer()) assert isinstance(first_elem, nodes.ClassDef) assert first_elem.name == "ValueError" second_elem = next(inferred.elts[1].infer()) assert isinstance(second_elem, nodes.ClassDef) assert second_elem.name == "TypeError" # Test that as_string() also works as_string = inferred.as_string() assert as_string.strip() == "(ValueError, TypeError)" def test_cannot_getattr_ann_assigns() -> None: node = extract_node( """ class Cls: ann: int """ ) inferred = next(node.infer()) with pytest.raises(AttributeInferenceError): inferred.getattr("ann") # But if it had a value, then it would be okay. 
node = extract_node( """ class Cls: ann: int = 0 """ ) inferred = next(node.infer()) values = inferred.getattr("ann") assert len(values) == 1 def test_prevent_recursion_error_in_igetattr_and_context_manager_inference() -> None: code = """ class DummyContext(object): def __enter__(self): return self def __exit__(self, ex_type, ex_value, ex_tb): return True if False: with DummyContext() as con: pass with DummyContext() as con: con.__enter__ #@ """ node = extract_node(code) # According to the original issue raised that introduced this test # (https://github.com/pylint-dev/astroid/663, see 55076ca), this test was a # non-regression check for StopIteration leaking out of inference and # causing a RuntimeError. Hence, here just consume the inferred value # without checking it and rely on pytest to fail on raise next(node.infer()) def test_igetattr_idempotent() -> None: code = """ class InferMeTwice: item = 10 InferMeTwice() """ call = extract_node(code) instance = call.inferred()[0] context_to_be_used_twice = InferenceContext() assert util.Uninferable not in instance.igetattr("item", context_to_be_used_twice) assert util.Uninferable not in instance.igetattr("item", context_to_be_used_twice) @patch("astroid.nodes.Call._infer") def test_cache_usage_without_explicit_context(mock) -> None: code = """ class InferMeTwice: item = 10 InferMeTwice() """ call = extract_node(code) mock.return_value = [Uninferable] # no explicit InferenceContext call.inferred() call.inferred() mock.assert_called_once() def test_infer_context_manager_with_unknown_args() -> None: code = """ class client_log(object): def __init__(self, client): self.client = client def __enter__(self): return self.client def __exit__(self, exc_type, exc_value, traceback): pass with client_log(None) as c: c #@ """ node = extract_node(code) assert next(node.infer()) is util.Uninferable # But if we know the argument, then it is easy code = """ class client_log(object): def __init__(self, client=24): self.client = client 
def __enter__(self): return self.client def __exit__(self, exc_type, exc_value, traceback): pass with client_log(None) as c: c #@ """ node = extract_node(code) assert isinstance(next(node.infer()), nodes.Const) @pytest.mark.parametrize( "code", [ """ class Error(Exception): pass a = Error() a #@ """, """ class Error(Exception): def method(self): self #@ """, ], ) def test_subclass_of_exception(code) -> None: inferred = next(extract_node(code).infer()) assert isinstance(inferred, Instance) args = next(inferred.igetattr("args")) assert isinstance(args, nodes.Tuple) def test_ifexp_inference() -> None: code = """ def truth_branch(): return 1 if True else 2 def false_branch(): return 1 if False else 2 def both_branches(): return 1 if unknown() else 2 truth_branch() #@ false_branch() #@ both_branches() #@ """ ast_nodes = extract_node(code) assert isinstance(ast_nodes, list) first = next(ast_nodes[0].infer()) assert isinstance(first, nodes.Const) assert first.value == 1 second = next(ast_nodes[1].infer()) assert isinstance(second, nodes.Const) assert second.value == 2 third = list(ast_nodes[2].infer()) assert isinstance(third, list) assert [third[0].value, third[1].value] == [1, 2] def test_assert_last_function_returns_none_on_inference() -> None: code = """ def check_equal(a, b): res = do_something_with_these(a, b) assert a == b == res check_equal(a, b) """ node = extract_node(code) inferred = next(node.infer()) assert isinstance(inferred, nodes.Const) assert inferred.value is None @test_utils.require_version(minver="3.8") def test_posonlyargs_inference() -> None: code = """ class A: method = lambda self, b, /, c: b + c def __init__(self, other=(), /, **kw): self #@ A() #@ A().method #@ """ self_node, instance, lambda_method = extract_node(code) inferred = next(self_node.infer()) assert isinstance(inferred, Instance) assert inferred.name == "A" inferred = next(instance.infer()) assert isinstance(inferred, Instance) assert inferred.name == "A" inferred = 
next(lambda_method.infer()) assert isinstance(inferred, BoundMethod) assert inferred.type == "method" def test_infer_args_unpacking_of_self() -> None: code = """ class A: def __init__(*args, **kwargs): self, *args = args self.data = {1: 2} self #@ A().data #@ """ self, data = extract_node(code) inferred_self = next(self.infer()) assert isinstance(inferred_self, Instance) assert inferred_self.name == "A" inferred_data = next(data.infer()) assert isinstance(inferred_data, nodes.Dict) assert inferred_data.as_string() == "{1: 2}" def test_infer_exception_instance_attributes() -> None: code = """ class UnsupportedFormatCharacter(Exception): def __init__(self, index): Exception.__init__(self, index) self.index = index try: 1/0 except UnsupportedFormatCharacter as exc: exc #@ """ node = extract_node(code) inferred = next(node.infer()) assert isinstance(inferred, ExceptionInstance) index = inferred.getattr("index") assert len(index) == 1 assert isinstance(index[0], nodes.AssignAttr) def test_infer_assign_attr() -> None: code = """ class Counter: def __init__(self): self.count = 0 def increment(self): self.count += 1 #@ """ node = extract_node(code) inferred = next(node.infer()) assert inferred.value == 1 @pytest.mark.parametrize( "code,instance_name", [ ( """ class A: def __enter__(self): return self def __exit__(self, err_type, err, traceback): return class B(A): pass with B() as b: b #@ """, "B", ), ( """ class A: def __enter__(self): return A() def __exit__(self, err_type, err, traceback): return class B(A): pass with B() as b: b #@ """, "A", ), ( """ class A: def test(self): return A() class B(A): def test(self): return A.test(self) B().test() """, "A", ), ], ) def test_inference_is_limited_to_the_boundnode(code, instance_name) -> None: node = extract_node(code) inferred = next(node.infer()) assert isinstance(inferred, Instance) assert inferred.name == instance_name def test_property_inference() -> None: code = """ class A: @property def test(self): return 42 
@test.setter def test(self, value): return "banco" A.test #@ A().test #@ A.test.fget(A) #@ A.test.fset(A, "a_value") #@ A.test.setter #@ A.test.getter #@ A.test.deleter #@ """ ( prop, prop_result, prop_fget_result, prop_fset_result, prop_setter, prop_getter, prop_deleter, ) = extract_node(code) inferred = next(prop.infer()) assert isinstance(inferred, objects.Property) assert inferred.pytype() == "builtins.property" assert inferred.type == "property" inferred = next(prop_result.infer()) assert isinstance(inferred, nodes.Const) assert inferred.value == 42 inferred = next(prop_fget_result.infer()) assert isinstance(inferred, nodes.Const) assert inferred.value == 42 inferred = next(prop_fset_result.infer()) assert isinstance(inferred, nodes.Const) assert inferred.value == "banco" for prop_func in prop_setter, prop_getter, prop_deleter: inferred = next(prop_func.infer()) assert isinstance(inferred, nodes.FunctionDef) def test_property_as_string() -> None: code = """ class A: @property def test(self): return 42 A.test #@ """ node = extract_node(code) inferred = next(node.infer()) assert isinstance(inferred, objects.Property) property_body = textwrap.dedent( """ @property def test(self): return 42 """ ) assert inferred.as_string().strip() == property_body.strip() def test_property_callable_inference() -> None: code = """ class A: def func(self): return 42 p = property(func) A().p """ property_call = extract_node(code) inferred = next(property_call.infer()) assert isinstance(inferred, nodes.Const) assert inferred.value == 42 # Try with lambda as well code = """ class A: p = property(lambda self: 42) A().p """ property_call = extract_node(code) inferred = next(property_call.infer()) assert isinstance(inferred, nodes.Const) assert inferred.value == 42 def test_property_docstring() -> None: code = """ class A: @property def test(self): '''Docstring''' return 42 A.test #@ """ node = extract_node(code) inferred = next(node.infer()) assert isinstance(inferred, objects.Property) 
    assert isinstance(inferred.doc_node, nodes.Const)
    assert inferred.doc_node.value == "Docstring"


def test_recursion_error_inferring_builtin_containers() -> None:
    node = extract_node(
        """
    class Foo:
        a = "foo"
    inst = Foo()
    b = tuple([inst.a]) #@
    inst.a = b
    """
    )
    util.safe_infer(node.targets[0])


def test_inferaugassign_picking_parent_instead_of_stmt() -> None:
    code = """
    from collections import namedtuple
    SomeClass = namedtuple('SomeClass', ['name'])
    items = [SomeClass(name='some name')]

    some_str = ''
    some_str += ', '.join(__(item) for item in items)
    """
    # item needs to be inferred as `SomeClass` but it was inferred
    # as a string because the entire `AugAssign` node was inferred
    # as a string.
    node = extract_node(code)
    inferred = next(node.infer())
    assert isinstance(inferred, Instance)
    assert inferred.name == "SomeClass"


def test_classmethod_from_builtins_inferred_as_bound() -> None:
    code = """
    import builtins
    class Foo():
        @classmethod
        def bar1(cls, text):
            pass
        @builtins.classmethod
        def bar2(cls, text):
            pass
    Foo.bar1 #@
    Foo.bar2 #@
    """
    first_node, second_node = extract_node(code)
    assert isinstance(next(first_node.infer()), BoundMethod)
    assert isinstance(next(second_node.infer()), BoundMethod)


def test_infer_dict_passes_context() -> None:
    code = """
    k = {}
    (_ for k in __(dict(**k)))
    """
    node = extract_node(code)
    inferred = next(node.infer())
    assert isinstance(inferred, Instance)
    assert inferred.qname() == "builtins.dict"


@pytest.mark.parametrize( "code,obj,obj_type", [ ( """ def klassmethod1(method): @classmethod def inner(cls): return method(cls) return inner class X(object): @klassmethod1 def x(cls): return 'X' X.x """, BoundMethod, "classmethod", ), ( """ def staticmethod1(method): @staticmethod def inner(cls): return method(cls) return inner class X(object): @staticmethod1 def x(cls): return 'X' X.x """, nodes.FunctionDef, "staticmethod", ), ( """ def klassmethod1(method): def inner(cls): return method(cls) return classmethod(inner) class X(object): @klassmethod1 def
x(cls): return 'X' X.x """, BoundMethod, "classmethod", ), ( """ def staticmethod1(method): def inner(cls): return method(cls) return staticmethod(inner) class X(object): @staticmethod1 def x(cls): return 'X' X.x """, nodes.FunctionDef, "staticmethod", ), ], ) def test_custom_decorators_for_classmethod_and_staticmethods(code, obj, obj_type): node = extract_node(code) inferred = next(node.infer()) assert isinstance(inferred, obj) assert inferred.type == obj_type @pytest.mark.skipif( PY39_PLUS, reason="Exact inference with dataclasses (replace function) in python3.9", ) def test_dataclasses_subscript_inference_recursion_error(): code = """ from dataclasses import dataclass, replace @dataclass class ProxyConfig: auth: str = "/auth" a = ProxyConfig("") test_dict = {"proxy" : {"auth" : "", "bla" : "f"}} foo = test_dict['proxy'] replace(a, **test_dict['proxy']) # This fails """ node = extract_node(code) # Reproduces only with safe_infer() assert util.safe_infer(node) is None @pytest.mark.skipif( not PY39_PLUS, reason="Exact inference with dataclasses (replace function) in python3.9", ) def test_dataclasses_subscript_inference_recursion_error_39(): code = """ from dataclasses import dataclass, replace @dataclass class ProxyConfig: auth: str = "/auth" a = ProxyConfig("") test_dict = {"proxy" : {"auth" : "", "bla" : "f"}} foo = test_dict['proxy'] replace(a, **test_dict['proxy']) # This fails """ node = extract_node(code) infer_val = util.safe_infer(node) assert isinstance(infer_val, Instance) assert infer_val.pytype() == ".ProxyConfig" def test_self_reference_infer_does_not_trigger_recursion_error() -> None: # Prevents https://github.com/pylint-dev/pylint/issues/1285 code = """ def func(elems): return elems class BaseModel(object): def __init__(self, *args, **kwargs): self._reference = func(*self._reference.split('.')) BaseModel()._reference """ node = extract_node(code) inferred = next(node.infer()) assert inferred is util.Uninferable def 
test_inferring_properties_multiple_time_does_not_mutate_locals() -> None: code = """ class A: @property def a(self): return 42 A() """ node = extract_node(code) # Infer the class cls = next(node.infer()) (prop,) = cls.getattr("a") # Try to infer the property function *multiple* times. `A.locals` should be modified only once for _ in range(3): prop.inferred() a_locals = cls.locals["a"] # [FunctionDef, Property] assert len(a_locals) == 2 def test_getattr_fails_on_empty_values() -> None: code = """ import collections collections """ node = extract_node(code) inferred = next(node.infer()) with pytest.raises(InferenceError): next(inferred.igetattr("")) with pytest.raises(AttributeInferenceError): inferred.getattr("") def test_infer_first_argument_of_static_method_in_metaclass() -> None: code = """ class My(type): @staticmethod def test(args): args #@ """ node = extract_node(code) inferred = next(node.infer()) assert inferred is util.Uninferable def test_recursion_error_metaclass_monkeypatching() -> None: module = resources.build_file( "data/metaclass_recursion/monkeypatch.py", "data.metaclass_recursion" ) cls = next(module.igetattr("MonkeyPatchClass")) assert isinstance(cls, nodes.ClassDef) assert cls.declared_metaclass() is None @pytest.mark.xfail(reason="Cannot fully infer all the base classes properly.") def test_recursion_error_self_reference_type_call() -> None: # Fix for https://github.com/pylint-dev/astroid/issues/199 code = """ class A(object): pass class SomeClass(object): route_class = A def __init__(self): self.route_class = type('B', (self.route_class, ), {}) self.route_class() #@ """ node = extract_node(code) inferred = next(node.infer()) assert isinstance(inferred, Instance) assert inferred.name == "B" # TODO: Cannot infer [B, A, object] but at least the recursion error is gone. 
assert [cls.name for cls in inferred.mro()] == ["B", "A", "object"] def test_allow_retrieving_instance_attrs_and_special_attrs_for_functions() -> None: code = """ class A: def test(self): "a" # Add `__doc__` to `FunctionDef.instance_attrs` via an `AugAssign` test.__doc__ += 'b' test #@ """ node = extract_node(code) inferred = next(node.infer()) attrs = inferred.getattr("__doc__") # One from the `AugAssign`, one from the special attributes assert len(attrs) == 2 def test_implicit_parameters_bound_method() -> None: code = """ class A(type): @classmethod def test(cls, first): return first def __new__(cls, name, bases, dictionary): return super().__new__(cls, name, bases, dictionary) A.test #@ A.__new__ #@ """ test, dunder_new = extract_node(code) test = next(test.infer()) assert isinstance(test, BoundMethod) assert test.implicit_parameters() == 1 dunder_new = next(dunder_new.infer()) assert isinstance(dunder_new, BoundMethod) assert dunder_new.implicit_parameters() == 0 def test_super_inference_of_abstract_property() -> None: code = """ from abc import abstractmethod class A: @property def test(self): return "super" class C: @property @abstractmethod def test(self): "abstract method" class B(A, C): @property def test(self): super() #@ """ node = extract_node(code) inferred = next(node.infer()) test = inferred.getattr("test") assert len(test) == 2 def test_infer_generated_setter() -> None: code = """ class A: @property def test(self): pass A.test.setter """ node = extract_node(code) inferred = next(node.infer()) assert isinstance(inferred, nodes.FunctionDef) assert isinstance(inferred.args, nodes.Arguments) # This line used to crash because property generated functions # did not have args properly set assert not list(inferred.nodes_of_class(nodes.Const)) def test_infer_list_of_uninferables_does_not_crash() -> None: code = """ x = [A] * 1 f = [x, [A] * 2] x = list(f) + [] # List[Uninferable] tuple(x[0]) """ node = extract_node(code) inferred = next(node.infer()) assert 
isinstance(inferred, nodes.Tuple) # Would not be able to infer the first element. assert not inferred.elts # https://github.com/pylint-dev/astroid/issues/926 def test_issue926_infer_stmts_referencing_same_name_is_not_uninferable() -> None: code = """ pair = [1, 2] ex = pair[0] if 1 + 1 == 2: ex = pair[1] ex """ node = extract_node(code) inferred = list(node.infer()) assert len(inferred) == 2 assert isinstance(inferred[0], nodes.Const) assert inferred[0].value == 1 assert isinstance(inferred[1], nodes.Const) assert inferred[1].value == 2 # https://github.com/pylint-dev/astroid/issues/926 def test_issue926_binop_referencing_same_name_is_not_uninferable() -> None: code = """ pair = [1, 2] ex = pair[0] + pair[1] ex """ node = extract_node(code) inferred = list(node.infer()) assert len(inferred) == 1 assert isinstance(inferred[0], nodes.Const) assert inferred[0].value == 3 def test_pylint_issue_4692_attribute_inference_error_in_infer_import_from() -> None: """Https://github.com/pylint-dev/pylint/issues/4692.""" code = """ import click for name, item in click.__dict__.items(): _ = isinstance(item, click.Command) and item != 'foo' """ node = extract_node(code) with pytest.raises(InferenceError): list(node.infer()) def test_issue_1090_infer_yield_type_base_class() -> None: code = """ import contextlib class A: @contextlib.contextmanager def get(self): yield self class B(A): def play(): pass with B().get() as b: b b """ node = extract_node(code) assert next(node.infer()).pytype() == ".B" def test_namespace_package() -> None: """Check that a file using namespace packages and relative imports is parseable.""" resources.build_file("data/beyond_top_level/import_package.py") def test_namespace_package_same_name() -> None: """Check that a file using namespace packages and relative imports with similar names is parseable. 
""" resources.build_file("data/beyond_top_level_two/a.py") def test_relative_imports_init_package() -> None: """Check that relative imports within a package that uses __init__.py still works. """ resources.build_file( "data/beyond_top_level_three/module/sub_module/sub_sub_module/main.py" ) def test_inference_of_items_on_module_dict() -> None: """Crash test for the inference of items() on a module's dict attribute. Originally reported in https://github.com/pylint-dev/astroid/issues/1085 """ builder.file_build(str(DATA_DIR / "module_dict_items_call" / "test.py"), "models") def test_imported_module_var_inferable() -> None: """ Module variables can be imported and inferred successfully as part of binary operators. """ mod1 = parse( textwrap.dedent( """ from top1.mod import v as z w = [1] + z """ ), module_name="top1", ) parse("v = [2]", module_name="top1.mod") w_val = mod1.body[-1].value i_w_val = next(w_val.infer()) assert i_w_val is not util.Uninferable assert i_w_val.as_string() == "[1, 2]" def test_imported_module_var_inferable2() -> None: """Version list of strings.""" mod2 = parse( textwrap.dedent( """ from top2.mod import v as z w = ['1'] + z """ ), module_name="top2", ) parse("v = ['2']", module_name="top2.mod") w_val = mod2.body[-1].value i_w_val = next(w_val.infer()) assert i_w_val is not util.Uninferable assert i_w_val.as_string() == "['1', '2']" def test_imported_module_var_inferable3() -> None: """Version list of strings with a __dunder__ name.""" mod3 = parse( textwrap.dedent( """ from top3.mod import __dunder_var__ as v __dunder_var__ = ['w'] + v """ ), module_name="top", ) parse("__dunder_var__ = ['v']", module_name="top3.mod") w_val = mod3.body[-1].value i_w_val = next(w_val.infer()) assert i_w_val is not util.Uninferable assert i_w_val.as_string() == "['w', 'v']" @pytest.mark.skipif( IS_PYPY, reason="Test run with coverage on PyPy sometimes raises a RecursionError" ) def test_recursion_on_inference_tip() -> None: """Regression test for recursion in 
    inference tip.

    Originally reported in https://github.com/pylint-dev/pylint/issues/5408.

    When run on PyPy with coverage enabled, the test can sometimes raise a RecursionError
    outside of the code that we actually want to test.
    As the issue seems to be with coverage, skip the test on PyPy.
    https://github.com/pylint-dev/astroid/pull/1984#issuecomment-1407720311
    """
    code = """
    class MyInnerClass: ...


    class MySubClass:
        inner_class = MyInnerClass


    class MyClass:
        sub_class = MySubClass()


    def get_unpatched_class(cls):
        return cls


    def get_unpatched(item):
        lookup = get_unpatched_class if isinstance(item, type) else lambda item: None
        return lookup(item)


    _Child = get_unpatched(MyClass.sub_class.inner_class)


    class Child(_Child):
        def patch(cls):
            MyClass.sub_class.inner_class = cls
    """
    module = parse(code)
    assert module


def test_function_def_cached_generator() -> None:
    """Regression test for https://github.com/pylint-dev/astroid/issues/817."""
    funcdef: nodes.FunctionDef = extract_node("def func(): pass")
    next(funcdef._infer())


class TestOldStyleStringFormatting:
    @pytest.mark.parametrize(
        "format_string",
        [
            pytest.param(
                """"My name is %s, I'm %s" % ("Daniel", 12)""", id="empty-indexes"
            ),
            pytest.param(
                """"My name is %0s, I'm %1s" % ("Daniel", 12)""",
                id="numbered-indexes",
            ),
            pytest.param(
                """
                fname = "Daniel"
                age = 12
                "My name is %s, I'm %s" % (fname, age)
                """,
                id="empty-indexes-from-positional",
            ),
            pytest.param(
                """
                fname = "Daniel"
                age = 12
                "My name is %0s, I'm %1s" % (fname, age)
                """,
                id="numbered-indexes-from-positional",
            ),
            pytest.param(
                """
                fname = "Daniel"
                age = 12
                "My name is %(fname)s, I'm %(age)s" % {"fname": fname, "age": age}
                """,
                id="named-indexes-from-keyword",
            ),
            pytest.param(
                """
                string = "My name is %s, I'm %s"
                string % ("Daniel", 12)
                """,
                id="empty-indexes-on-variable",
            ),
            pytest.param(
                """"My name is Daniel, I'm %s" % 12""", id="empty-indexes-from-variable"
            ),
            pytest.param(
                """
                age = 12
                "My name is Daniel, I'm %s" % age
                """,
                id="empty-indexes-from-variable",
            ),
], ) def test_old_style_string_formatting(self, format_string: str) -> None: node: nodes.Call = _extract_single_node(format_string) inferred = next(node.infer()) assert isinstance(inferred, nodes.Const) assert inferred.value == "My name is Daniel, I'm 12" @pytest.mark.parametrize( "format_string", [ """ from missing import Unknown fname = Unknown age = 12 "My name is %(fname)s, I'm %(age)s" % {"fname": fname, "age": age} """, """ from missing import fname age = 12 "My name is %(fname)s, I'm %(age)s" % {"fname": fname, "age": age} """, """ from missing import fname "My name is %s, I'm %s" % (fname, 12) """, """ "My name is %0s, I'm %1s" % ("Daniel") """, """"I am %s" % ()""", """"I am %s" % Exception()""", """ fsname = "Daniel" "My name is %(fname)s, I'm %(age)s" % {"fsname": fsname, "age": age} """, """ "My name is %(fname)s, I'm %(age)s" % {Exception(): "Daniel", "age": age} """, """ fname = "Daniel" age = 12 "My name is %0s, I'm %(age)s" % (fname, age) """, """ "My name is %s, I'm %s" % ((fname,)*2) """, """20 % 0""", """("%" + str(20)) % 0""", ], ) def test_old_style_string_formatting_uninferable(self, format_string: str) -> None: node: nodes.Call = _extract_single_node(format_string) inferred = next(node.infer()) assert inferred is util.Uninferable def test_old_style_string_formatting_with_specs(self) -> None: node: nodes.Call = _extract_single_node( """"My name is %s, I'm %.2f" % ("Daniel", 12)""" ) inferred = next(node.infer()) assert isinstance(inferred, nodes.Const) assert inferred.value == "My name is Daniel, I'm 12.00" def test_sys_argv_uninferable() -> None: """Regression test for https://github.com/pylint-dev/pylint/issues/7710.""" a: nodes.List = extract_node( textwrap.dedent( """ import sys sys.argv""" ) ) sys_argv_value = list(a._infer()) assert len(sys_argv_value) == 1 assert sys_argv_value[0] is Uninferable astroid-3.2.2/tests/test_regrtest.py0000664000175000017500000003540114622475517017523 0ustar epsilonepsilon# Licensed under the LGPL: 
https://www.gnu.org/licenses/old-licenses/lgpl-2.1.en.html # For details: https://github.com/pylint-dev/astroid/blob/main/LICENSE # Copyright (c) https://github.com/pylint-dev/astroid/blob/main/CONTRIBUTORS.txt import sys import textwrap import unittest from unittest import mock import pytest from astroid import MANAGER, Instance, bases, manager, nodes, parse, test_utils from astroid.builder import AstroidBuilder, _extract_single_node, extract_node from astroid.context import InferenceContext from astroid.exceptions import InferenceError from astroid.raw_building import build_module from astroid.util import Uninferable from . import resources try: import numpy # pylint: disable=unused-import except ImportError: HAS_NUMPY = False else: HAS_NUMPY = True class NonRegressionTests(unittest.TestCase): def setUp(self) -> None: sys.path.insert(0, resources.find("data")) MANAGER.always_load_extensions = True self.addCleanup(MANAGER.clear_cache) def tearDown(self) -> None: MANAGER.always_load_extensions = False sys.path.pop(0) sys.path_importer_cache.pop(resources.find("data"), None) def test_manager_instance_attributes_reference_global_MANAGER(self) -> None: for expected in (True, False): with mock.patch.dict( manager.AstroidManager.brain, values={"always_load_extensions": expected}, ): assert ( MANAGER.always_load_extensions == manager.AstroidManager.brain["always_load_extensions"] ) with mock.patch.dict( manager.AstroidManager.brain, values={"optimize_ast": expected}, ): assert ( MANAGER.optimize_ast == manager.AstroidManager.brain["optimize_ast"] ) def test_module_path(self) -> None: man = test_utils.brainless_manager() mod = man.ast_from_module_name("package.import_package_subpackage_module") package = next(mod.igetattr("package")) self.assertEqual(package.name, "package") subpackage = next(package.igetattr("subpackage")) self.assertIsInstance(subpackage, nodes.Module) self.assertTrue(subpackage.package) self.assertEqual(subpackage.name, "package.subpackage") module = 
next(subpackage.igetattr("module")) self.assertEqual(module.name, "package.subpackage.module") def test_package_sidepackage(self) -> None: brainless_manager = test_utils.brainless_manager() assert "package.sidepackage" not in MANAGER.astroid_cache package = brainless_manager.ast_from_module_name("absimp") self.assertIsInstance(package, nodes.Module) self.assertTrue(package.package) subpackage = next(package.getattr("sidepackage")[0].infer()) self.assertIsInstance(subpackage, nodes.Module) self.assertTrue(subpackage.package) self.assertEqual(subpackage.name, "absimp.sidepackage") def test_living_property(self) -> None: builder = AstroidBuilder() builder._done = {} builder._module = sys.modules[__name__] builder.object_build(build_module("module_name", ""), Whatever) @unittest.skipIf(not HAS_NUMPY, "Needs numpy") def test_numpy_crash(self): """Test don't crash on numpy.""" # a crash occurred somewhere in the past, and an # InferenceError instead of a crash was better, but now we even infer! builder = AstroidBuilder() data = """ from numpy import multiply multiply([1, 2], [3, 4]) """ astroid = builder.string_build(data, __name__, __file__) callfunc = astroid.body[1].value.func inferred = callfunc.inferred() self.assertEqual(len(inferred), 1) @unittest.skipUnless(HAS_NUMPY, "Needs numpy") def test_numpy_distutils(self): """Special handling of virtualenv's patching of distutils shouldn't interfere with numpy.distutils. PY312_PLUS -- This test will likely become unnecessary when Python 3.12 is numpy's minimum version. (numpy.distutils will be removed then.) 
""" node = extract_node( """ from numpy.distutils.misc_util import is_sequence is_sequence("ABC") #@ """ ) inferred = node.inferred() self.assertIsInstance(inferred[0], nodes.Const) def test_nameconstant(self) -> None: # used to fail for Python 3.4 builder = AstroidBuilder() astroid = builder.string_build("def test(x=True): pass") default = astroid.body[0].args.args[0] self.assertEqual(default.name, "x") self.assertEqual(next(default.infer()).value, True) def test_recursion_regression_issue25(self) -> None: builder = AstroidBuilder() data = """ import recursion as base _real_Base = base.Base class Derived(_real_Base): pass def run(): base.Base = Derived """ astroid = builder.string_build(data, __name__, __file__) # Used to crash in _is_metaclass, due to wrong # ancestors chain classes = astroid.nodes_of_class(nodes.ClassDef) for klass in classes: # triggers the _is_metaclass call klass.type # pylint: disable=pointless-statement # noqa: B018 def test_decorator_callchain_issue42(self) -> None: builder = AstroidBuilder() data = """ def test(): def factory(func): def newfunc(): func() return newfunc return factory @test() def crash(): pass """ astroid = builder.string_build(data, __name__, __file__) self.assertEqual(astroid["crash"].type, "function") def test_filter_stmts_scoping(self) -> None: builder = AstroidBuilder() data = """ def test(): compiler = int() class B(compiler.__class__): pass compiler = B() return compiler """ astroid = builder.string_build(data, __name__, __file__) test = astroid["test"] result = next(test.infer_call_result(astroid)) self.assertIsInstance(result, Instance) base = next(result._proxied.bases[0].infer()) self.assertEqual(base.name, "int") def test_filter_stmts_nested_if(self) -> None: builder = AstroidBuilder() data = """ def test(val): variable = None if val == 1: variable = "value" if variable := "value": pass elif val == 2: variable = "value_two" variable = "value_two" return variable """ module = builder.string_build(data, __name__, 
        __file__)
        test_func = module["test"]
        result = list(test_func.infer_call_result(module))
        assert len(result) == 3
        assert isinstance(result[0], nodes.Const)
        assert result[0].value is None
        assert result[0].lineno == 3
        assert isinstance(result[1], nodes.Const)
        assert result[1].value == "value"
        assert result[1].lineno == 7
        assert isinstance(result[2], nodes.Const)
        assert result[2].value == "value_two"
        assert result[2].lineno == 12

    def test_ancestors_patching_class_recursion(self) -> None:
        node = AstroidBuilder().string_build(
            textwrap.dedent(
                """
        import string
        Template = string.Template

        class A(Template):
            pass

        class B(A):
            pass

        def test(x=False):
            if x:
                string.Template = A
            else:
                string.Template = B
        """
            )
        )
        klass = node["A"]
        ancestors = list(klass.ancestors())
        self.assertEqual(ancestors[0].qname(), "string.Template")

    def test_ancestors_yes_in_bases(self) -> None:
        # Test for issue https://bitbucket.org/logilab/astroid/issue/84
        # This used to crash astroid with a TypeError, because an Uninferable
        # node was present in the bases
        node = extract_node(
            """
        def with_metaclass(meta, *bases):
            class metaclass(meta):
                def __new__(cls, name, this_bases, d):
                    return meta(name, bases, d)
            return type.__new__(metaclass, 'temporary_class', (), {})

        import lala

        class A(with_metaclass(object, lala.lala)): #@
            pass
        """
        )
        ancestors = list(node.ancestors())
        self.assertEqual(len(ancestors), 1)
        self.assertEqual(ancestors[0].qname(), "builtins.object")

    def test_ancestors_missing_from_function(self) -> None:
        # Test for https://www.logilab.org/ticket/122793
        node = extract_node(
            """
        def gen(): yield
        GEN = gen()
        next(GEN)
        """
        )
        self.assertRaises(InferenceError, next, node.infer())

    def test_unicode_in_docstring(self) -> None:
        # Crashed for astroid==1.4.1
        # Test for https://bitbucket.org/logilab/astroid/issues/273/
        # In a regular file, "coding: utf-8" would have been used.
node = extract_node( f""" from __future__ import unicode_literals class MyClass(object): def method(self): "With unicode : {'’'} " instance = MyClass() """ ) next(node.value.infer()).as_string() def test_binop_generates_nodes_with_parents(self) -> None: node = extract_node( """ def no_op(*args): pass def foo(*args): def inner(*more_args): args + more_args #@ return inner """ ) inferred = next(node.infer()) self.assertIsInstance(inferred, nodes.Tuple) self.assertIsNotNone(inferred.parent) self.assertIsInstance(inferred.parent, nodes.BinOp) def test_decorator_names_inference_error_leaking(self) -> None: node = extract_node( """ class Parent(object): @property def foo(self): pass class Child(Parent): @Parent.foo.getter def foo(self): #@ return super(Child, self).foo + ['oink'] """ ) inferred = next(node.infer()) self.assertEqual(inferred.decoratornames(), {".Parent.foo.getter"}) def test_recursive_property_method(self) -> None: node = extract_node( """ class APropert(): @property def property(self): return self APropert().property """ ) next(node.infer()) def test_uninferable_string_argument_of_namedtuple(self) -> None: node = extract_node( """ import collections collections.namedtuple('{}'.format("a"), '')() """ ) next(node.infer()) def test_regression_inference_of_self_in_lambda(self) -> None: code = """ class A: @b(lambda self: __(self)) def d(self): pass """ node = extract_node(code) inferred = next(node.infer()) assert isinstance(inferred, Instance) assert inferred.qname() == ".A" def test_inference_context_consideration(self) -> None: """https://github.com/PyCQA/astroid/issues/1828""" code = """ class Base: def return_type(self): return type(self)() class A(Base): def method(self): return self.return_type() class B(Base): def method(self): return self.return_type() A().method() #@ B().method() #@ """ node1, node2 = extract_node(code) inferred1 = next(node1.infer()) assert inferred1.qname() == ".A" inferred2 = next(node2.infer()) assert inferred2.qname() == ".B" 
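The tests above repeatedly extract a node marked with `#@` and run inference on it. As a stdlib-only point of comparison (astroid rebuilds and extends the tree produced by the builtin `ast` module, as noted in the README), here is a minimal sketch of locating the same kind of trailing expression with plain `ast` — the class and attribute names are purely illustrative:

```python
# Stdlib-only sketch: locate the trailing expression that the astroid tests
# mark with `#@`, using the builtin ast module that astroid builds upon.
import ast
import textwrap

source = textwrap.dedent(
    """
    class A:
        value = 42
    A().value
    """
)
tree = ast.parse(source)
# The last top-level statement is the expression under test.
last = tree.body[-1]
assert isinstance(last, ast.Expr)
assert isinstance(last.value, ast.Attribute)
assert last.value.attr == "value"
```

Plain `ast` stops at syntax: it can only report that this is an `Attribute` access, whereas astroid's `infer()` is what resolves it to the `Const` 42, which is exactly what the assertions in these tests exercise.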
class Whatever: a = property(lambda x: x, lambda x: x) # type: ignore[misc] def test_ancestor_looking_up_redefined_function() -> None: code = """ class Foo: def _format(self): pass def format(self): self.format = self._format self.format() Foo """ node = extract_node(code) inferred = next(node.infer()) ancestor = next(inferred.ancestors()) _, found = ancestor.lookup("format") assert len(found) == 1 assert isinstance(found[0], nodes.FunctionDef) def test_crash_in_dunder_inference_prevented() -> None: code = """ class MyClass(): def fu(self, objects): delitem = dict.__delitem__.__get__(self, dict) delitem #@ """ inferred = next(extract_node(code).infer()) assert inferred.qname() == "builtins.dict.__delitem__" def test_regression_crash_classmethod() -> None: """Regression test for a crash reported in https://github.com/pylint-dev/pylint/issues/4982. """ code = """ class Base: @classmethod def get_first_subclass(cls): for subclass in cls.__subclasses__(): return subclass return object subclass = Base.get_first_subclass() class Another(subclass): pass """ parse(code) def test_max_inferred_for_complicated_class_hierarchy() -> None: """Regression test for a crash reported in https://github.com/pylint-dev/pylint/issues/5679. The class hierarchy of 'sqlalchemy' is so intricate that it becomes uninferable with the standard max_inferred of 100. We used to crash when this happened. 
""" # Create module and get relevant nodes module = resources.build_file( str(resources.RESOURCE_PATH / "max_inferable_limit_for_classes" / "main.py") ) init_attr_node = module.body[-1].body[0].body[0].value.func init_object_node = module.body[-1].mro()[-1]["__init__"] super_node = next(init_attr_node.expr.infer()) # Arbitrarily limit the max number of infered nodes per context InferenceContext.max_inferred = -1 context = InferenceContext() # Try to infer 'object.__init__' > because of limit is impossible for inferred in bases._infer_stmts([init_object_node], context, frame=super): assert inferred == Uninferable # Reset inference limit InferenceContext.max_inferred = 100 # Check that we don't crash on a previously uninferable node assert super_node.getattr("__init__", context=context)[0] == Uninferable @mock.patch( "astroid.nodes.ImportFrom._infer", side_effect=RecursionError, ) def test_recursion_during_inference(mocked) -> None: """Check that we don't crash if we hit the recursion limit during inference.""" node: nodes.Call = _extract_single_node( """ from module import something something() """ ) with pytest.raises(InferenceError) as error: next(node.infer()) assert error.value.message.startswith("RecursionError raised") astroid-3.2.2/tests/test_nodes_position.py0000664000175000017500000001010714622475517020714 0ustar epsilonepsilon# Licensed under the LGPL: https://www.gnu.org/licenses/old-licenses/lgpl-2.1.en.html # For details: https://github.com/pylint-dev/astroid/blob/main/LICENSE # Copyright (c) https://github.com/pylint-dev/astroid/blob/main/CONTRIBUTORS.txt from __future__ import annotations import textwrap from astroid import builder, nodes class TestNodePosition: """Test node ``position`` attribute.""" @staticmethod def test_position_class() -> None: """Position should only include keyword and name. >>> class A(Parent): >>> ^^^^^^^ """ code = textwrap.dedent( """ class A: #@ ... class B(A): #@ pass class C: #@ '''Docstring''' class D: #@ ... 
class E: #@ def f(): ... @decorator class F: #@ ... """ ).strip() ast_nodes: list[nodes.NodeNG] = builder.extract_node(code) # type: ignore[assignment] a = ast_nodes[0] assert isinstance(a, nodes.ClassDef) assert a.position == (1, 0, 1, 7) b = ast_nodes[1] assert isinstance(b, nodes.ClassDef) assert b.position == (4, 0, 4, 7) c = ast_nodes[2] assert isinstance(c, nodes.ClassDef) assert c.position == (7, 0, 7, 7) d = ast_nodes[3] assert isinstance(d, nodes.ClassDef) assert d.position == (10, 4, 10, 11) e = ast_nodes[4] assert isinstance(e, nodes.ClassDef) assert e.position == (13, 0, 13, 7) f = ast_nodes[5] assert isinstance(f, nodes.ClassDef) assert f.position == (18, 0, 18, 7) @staticmethod def test_position_function() -> None: """Position should only include keyword and name. >>> def func(var: int = 42): >>> ^^^^^^^^ """ code = textwrap.dedent( """ def a(): #@ ... def b(): #@ '''Docstring''' def c( #@ var: int = 42 ): def d(): #@ ... @decorator def e(): #@ ... """ ).strip() ast_nodes: list[nodes.NodeNG] = builder.extract_node(code) # type: ignore[assignment] a = ast_nodes[0] assert isinstance(a, nodes.FunctionDef) assert a.position == (1, 0, 1, 5) b = ast_nodes[1] assert isinstance(b, nodes.FunctionDef) assert b.position == (4, 0, 4, 5) c = ast_nodes[2] assert isinstance(c, nodes.FunctionDef) assert c.position == (7, 0, 7, 5) d = ast_nodes[3] assert isinstance(d, nodes.FunctionDef) assert d.position == (10, 4, 10, 9) e = ast_nodes[4] assert isinstance(e, nodes.FunctionDef) assert e.position == (14, 0, 14, 5) @staticmethod def test_position_async_function() -> None: """Position should only include keyword and name. >>> async def func(var: int = 42): >>> ^^^^^^^^^^^^^^ """ code = textwrap.dedent( """ async def a(): #@ ... async def b(): #@ '''Docstring''' async def c( #@ var: int = 42 ): async def d(): #@ ... @decorator async def e(): #@ ... 
""" ).strip() ast_nodes: list[nodes.NodeNG] = builder.extract_node(code) # type: ignore[assignment] a = ast_nodes[0] assert isinstance(a, nodes.FunctionDef) assert a.position == (1, 0, 1, 11) b = ast_nodes[1] assert isinstance(b, nodes.FunctionDef) assert b.position == (4, 0, 4, 11) c = ast_nodes[2] assert isinstance(c, nodes.FunctionDef) assert c.position == (7, 0, 7, 11) d = ast_nodes[3] assert isinstance(d, nodes.FunctionDef) assert d.position == (10, 4, 10, 15) e = ast_nodes[4] assert isinstance(e, nodes.FunctionDef) assert e.position == (14, 0, 14, 11) astroid-3.2.2/tests/test_nodes_lineno.py0000664000175000017500000015422514622475517020346 0ustar epsilonepsilon# Licensed under the LGPL: https://www.gnu.org/licenses/old-licenses/lgpl-2.1.en.html # For details: https://github.com/pylint-dev/astroid/blob/main/LICENSE # Copyright (c) https://github.com/pylint-dev/astroid/blob/main/CONTRIBUTORS.txt import textwrap import pytest import astroid from astroid import builder, nodes from astroid.const import IS_PYPY, PY38, PY39_PLUS, PY310_PLUS, PY312_PLUS @pytest.mark.skipif( not (PY38 and IS_PYPY), reason="end_lineno and end_col_offset were added in PY38", ) class TestEndLinenoNotSet: """Test 'end_lineno' and 'end_col_offset' are initialized as 'None' for Python < 3.8. 
""" @staticmethod def test_end_lineno_not_set() -> None: code = textwrap.dedent( """ [1, 2, 3] #@ var #@ """ ).strip() ast_nodes = builder.extract_node(code) assert isinstance(ast_nodes, list) and len(ast_nodes) == 2 n1 = ast_nodes[0] assert isinstance(n1, nodes.List) assert (n1.lineno, n1.col_offset) == (1, 0) assert (n1.end_lineno, n1.end_col_offset) == (None, None) n2 = ast_nodes[1] assert isinstance(n2, nodes.Name) assert (n2.lineno, n2.col_offset) == (2, 0) assert (n2.end_lineno, n2.end_col_offset) == (None, None) @pytest.mark.skipif( PY38 and IS_PYPY, reason="end_lineno and end_col_offset were added in PY38", ) class TestLinenoColOffset: """Test 'lineno', 'col_offset', 'end_lineno', and 'end_col_offset' for all nodes. """ @staticmethod def test_end_lineno_container() -> None: """Container nodes: List, Tuple, Set.""" code = textwrap.dedent( """ [1, 2, 3] #@ [ #@ 1, 2, 3 ] (1, 2, 3) #@ {1, 2, 3} #@ """ ).strip() ast_nodes = builder.extract_node(code) assert isinstance(ast_nodes, list) and len(ast_nodes) == 4 c1 = ast_nodes[0] assert isinstance(c1, nodes.List) assert (c1.lineno, c1.col_offset) == (1, 0) assert (c1.end_lineno, c1.end_col_offset) == (1, 9) c2 = ast_nodes[1] assert isinstance(c2, nodes.List) assert (c2.lineno, c2.col_offset) == (2, 0) assert (c2.end_lineno, c2.end_col_offset) == (4, 1) c3 = ast_nodes[2] assert isinstance(c3, nodes.Tuple) assert (c3.lineno, c3.col_offset) == (5, 0) assert (c3.end_lineno, c3.end_col_offset) == (5, 9) c4 = ast_nodes[3] assert isinstance(c4, nodes.Set) assert (c4.lineno, c4.col_offset) == (6, 0) assert (c4.end_lineno, c4.end_col_offset) == (6, 9) @staticmethod def test_end_lineno_name() -> None: """Name, Assign, AssignName, Delete, DelName.""" code = textwrap.dedent( """ var = 42 #@ var #@ del var #@ var2 = ( #@ 1, 2, 3 ) """ ).strip() ast_nodes = builder.extract_node(code) assert isinstance(ast_nodes, list) and len(ast_nodes) == 4 n1 = ast_nodes[0] assert isinstance(n1, nodes.Assign) assert isinstance(n1.targets[0], 
nodes.AssignName) assert isinstance(n1.value, nodes.Const) assert (n1.lineno, n1.col_offset) == (1, 0) assert (n1.end_lineno, n1.end_col_offset) == (1, 8) assert (n1.targets[0].lineno, n1.targets[0].col_offset) == (1, 0) assert (n1.targets[0].end_lineno, n1.targets[0].end_col_offset) == (1, 3) assert (n1.value.lineno, n1.value.col_offset) == (1, 6) assert (n1.value.end_lineno, n1.value.end_col_offset) == (1, 8) n2 = ast_nodes[1] assert isinstance(n2, nodes.Name) assert (n2.lineno, n2.col_offset) == (2, 0) assert (n2.end_lineno, n2.end_col_offset) == (2, 3) n3 = ast_nodes[2] assert isinstance(n3, nodes.Delete) and isinstance(n3.targets[0], nodes.DelName) assert (n3.lineno, n3.col_offset) == (3, 0) assert (n3.end_lineno, n3.end_col_offset) == (3, 7) assert (n3.targets[0].lineno, n3.targets[0].col_offset) == (3, 4) assert (n3.targets[0].end_lineno, n3.targets[0].end_col_offset) == (3, 7) n4 = ast_nodes[3] assert isinstance(n4, nodes.Assign) assert isinstance(n4.targets[0], nodes.AssignName) assert (n4.lineno, n4.col_offset) == (5, 0) assert (n4.end_lineno, n4.end_col_offset) == (7, 1) assert (n4.targets[0].lineno, n4.targets[0].col_offset) == (5, 0) assert (n4.targets[0].end_lineno, n4.targets[0].end_col_offset) == (5, 4) @staticmethod def test_end_lineno_attribute() -> None: """Attribute, AssignAttr, DelAttr.""" code = textwrap.dedent( """ class X: var = 42 X.var2 = 2 #@ X.var2 #@ del X.var2 #@ """ ).strip() ast_nodes = builder.extract_node(code) assert isinstance(ast_nodes, list) and len(ast_nodes) == 3 a1 = ast_nodes[0] assert isinstance(a1, nodes.Assign) assert isinstance(a1.targets[0], nodes.AssignAttr) assert isinstance(a1.value, nodes.Const) assert (a1.lineno, a1.col_offset) == (4, 0) assert (a1.end_lineno, a1.end_col_offset) == (4, 10) assert (a1.targets[0].lineno, a1.targets[0].col_offset) == (4, 0) assert (a1.targets[0].end_lineno, a1.targets[0].end_col_offset) == (4, 6) assert (a1.value.lineno, a1.value.col_offset) == (4, 9) assert (a1.value.end_lineno, 
a1.value.end_col_offset) == (4, 10) a2 = ast_nodes[1] assert isinstance(a2, nodes.Attribute) and isinstance(a2.expr, nodes.Name) assert (a2.lineno, a2.col_offset) == (5, 0) assert (a2.end_lineno, a2.end_col_offset) == (5, 6) assert (a2.expr.lineno, a2.expr.col_offset) == (5, 0) assert (a2.expr.end_lineno, a2.expr.end_col_offset) == (5, 1) a3 = ast_nodes[2] assert isinstance(a3, nodes.Delete) and isinstance(a3.targets[0], nodes.DelAttr) assert (a3.lineno, a3.col_offset) == (6, 0) assert (a3.end_lineno, a3.end_col_offset) == (6, 10) assert (a3.targets[0].lineno, a3.targets[0].col_offset) == (6, 4) assert (a3.targets[0].end_lineno, a3.targets[0].end_col_offset) == (6, 10) @staticmethod def test_end_lineno_call() -> None: """Call, Keyword.""" code = textwrap.dedent( """ func(arg1, arg2=value) #@ """ ).strip() c1 = builder.extract_node(code) assert isinstance(c1, nodes.Call) assert isinstance(c1.func, nodes.Name) assert isinstance(c1.args[0], nodes.Name) assert isinstance(c1.keywords[0], nodes.Keyword) assert isinstance(c1.keywords[0].value, nodes.Name) assert (c1.lineno, c1.col_offset) == (1, 0) assert (c1.end_lineno, c1.end_col_offset) == (1, 22) assert (c1.func.lineno, c1.func.col_offset) == (1, 0) assert (c1.func.end_lineno, c1.func.end_col_offset) == (1, 4) assert (c1.args[0].lineno, c1.args[0].col_offset) == (1, 5) assert (c1.args[0].end_lineno, c1.args[0].end_col_offset) == (1, 9) # fmt: off if PY39_PLUS: # 'lineno' and 'col_offset' information only added in Python 3.9 assert (c1.keywords[0].lineno, c1.keywords[0].col_offset) == (1, 11) assert (c1.keywords[0].end_lineno, c1.keywords[0].end_col_offset) == (1, 21) else: assert (c1.keywords[0].lineno, c1.keywords[0].col_offset) == (None, None) assert (c1.keywords[0].end_lineno, c1.keywords[0].end_col_offset) == (None, None) assert (c1.keywords[0].value.lineno, c1.keywords[0].value.col_offset) == (1, 16) assert (c1.keywords[0].value.end_lineno, c1.keywords[0].value.end_col_offset) == (1, 21) # fmt: on @staticmethod 
def test_end_lineno_assignment() -> None: """Assign, AnnAssign, AugAssign.""" code = textwrap.dedent( """ var = 2 #@ var2: int = 2 #@ var3 += 2 #@ """ ).strip() ast_nodes = builder.extract_node(code) assert isinstance(ast_nodes, list) and len(ast_nodes) == 3 a1 = ast_nodes[0] assert isinstance(a1, nodes.Assign) assert isinstance(a1.targets[0], nodes.AssignName) assert isinstance(a1.value, nodes.Const) assert (a1.lineno, a1.col_offset) == (1, 0) assert (a1.end_lineno, a1.end_col_offset) == (1, 7) assert (a1.targets[0].lineno, a1.targets[0].col_offset) == (1, 0) assert (a1.targets[0].end_lineno, a1.targets[0].end_col_offset) == (1, 3) assert (a1.value.lineno, a1.value.col_offset) == (1, 6) assert (a1.value.end_lineno, a1.value.end_col_offset) == (1, 7) a2 = ast_nodes[1] assert isinstance(a2, nodes.AnnAssign) assert isinstance(a2.target, nodes.AssignName) assert isinstance(a2.annotation, nodes.Name) assert isinstance(a2.value, nodes.Const) assert (a2.lineno, a2.col_offset) == (2, 0) assert (a2.end_lineno, a2.end_col_offset) == (2, 13) assert (a2.target.lineno, a2.target.col_offset) == (2, 0) assert (a2.target.end_lineno, a2.target.end_col_offset) == (2, 4) assert (a2.annotation.lineno, a2.annotation.col_offset) == (2, 6) assert (a2.annotation.end_lineno, a2.annotation.end_col_offset) == (2, 9) assert (a2.value.lineno, a2.value.col_offset) == (2, 12) assert (a2.value.end_lineno, a2.value.end_col_offset) == (2, 13) a3 = ast_nodes[2] assert isinstance(a3, nodes.AugAssign) assert isinstance(a3.target, nodes.AssignName) assert isinstance(a3.value, nodes.Const) assert (a3.lineno, a3.col_offset) == (3, 0) assert (a3.end_lineno, a3.end_col_offset) == (3, 9) assert (a3.target.lineno, a3.target.col_offset) == (3, 0) assert (a3.target.end_lineno, a3.target.end_col_offset) == (3, 4) assert (a3.value.lineno, a3.value.col_offset) == (3, 8) assert (a3.value.end_lineno, a3.value.end_col_offset) == (3, 9) @staticmethod def test_end_lineno_mix_stmts() -> None: """Assert, Break, 
Continue, Global, Nonlocal, Pass, Raise, Return, Expr.""" code = textwrap.dedent( """ assert True, "Some message" #@ break #@ continue #@ global var #@ nonlocal var #@ pass #@ raise Exception from ex #@ return 42 #@ var #@ """ ).strip() ast_nodes = builder.extract_node(code) assert isinstance(ast_nodes, list) and len(ast_nodes) == 9 s1 = ast_nodes[0] assert isinstance(s1, nodes.Assert) assert isinstance(s1.test, nodes.Const) assert isinstance(s1.fail, nodes.Const) assert (s1.lineno, s1.col_offset) == (1, 0) assert (s1.end_lineno, s1.end_col_offset) == (1, 27) assert (s1.test.lineno, s1.test.col_offset) == (1, 7) assert (s1.test.end_lineno, s1.test.end_col_offset) == (1, 11) assert (s1.fail.lineno, s1.fail.col_offset) == (1, 13) assert (s1.fail.end_lineno, s1.fail.end_col_offset) == (1, 27) s2 = ast_nodes[1] assert isinstance(s2, nodes.Break) assert (s2.lineno, s2.col_offset) == (2, 0) assert (s2.end_lineno, s2.end_col_offset) == (2, 5) s3 = ast_nodes[2] assert isinstance(s3, nodes.Continue) assert (s3.lineno, s3.col_offset) == (3, 0) assert (s3.end_lineno, s3.end_col_offset) == (3, 8) s4 = ast_nodes[3] assert isinstance(s4, nodes.Global) assert (s4.lineno, s4.col_offset) == (4, 0) assert (s4.end_lineno, s4.end_col_offset) == (4, 10) s5 = ast_nodes[4] assert isinstance(s5, nodes.Nonlocal) assert (s5.lineno, s5.col_offset) == (5, 0) assert (s5.end_lineno, s5.end_col_offset) == (5, 12) s6 = ast_nodes[5] assert isinstance(s6, nodes.Pass) assert (s6.lineno, s6.col_offset) == (6, 0) assert (s6.end_lineno, s6.end_col_offset) == (6, 4) s7 = ast_nodes[6] assert isinstance(s7, nodes.Raise) assert isinstance(s7.exc, nodes.Name) assert isinstance(s7.cause, nodes.Name) assert (s7.lineno, s7.col_offset) == (7, 0) assert (s7.end_lineno, s7.end_col_offset) == (7, 23) assert (s7.exc.lineno, s7.exc.col_offset) == (7, 6) assert (s7.exc.end_lineno, s7.exc.end_col_offset) == (7, 15) assert (s7.cause.lineno, s7.cause.col_offset) == (7, 21) assert (s7.cause.end_lineno, 
s7.cause.end_col_offset) == (7, 23) s8 = ast_nodes[7] assert isinstance(s8, nodes.Return) assert isinstance(s8.value, nodes.Const) assert (s8.lineno, s8.col_offset) == (8, 0) assert (s8.end_lineno, s8.end_col_offset) == (8, 9) assert (s8.value.lineno, s8.value.col_offset) == (8, 7) assert (s8.value.end_lineno, s8.value.end_col_offset) == (8, 9) s9 = ast_nodes[8].parent assert isinstance(s9, nodes.Expr) assert isinstance(s9.value, nodes.Name) assert (s9.lineno, s9.col_offset) == (9, 0) assert (s9.end_lineno, s9.end_col_offset) == (9, 3) assert (s9.value.lineno, s9.value.col_offset) == (9, 0) assert (s9.value.end_lineno, s9.value.end_col_offset) == (9, 3) @staticmethod def test_end_lineno_mix_nodes() -> None: """Await, Starred, Yield, YieldFrom.""" code = textwrap.dedent( """ await func #@ *args #@ yield 42 #@ yield from (1, 2) #@ """ ).strip() ast_nodes = builder.extract_node(code) assert isinstance(ast_nodes, list) and len(ast_nodes) == 4 n1 = ast_nodes[0] assert isinstance(n1, nodes.Await) assert isinstance(n1.value, nodes.Name) assert (n1.lineno, n1.col_offset) == (1, 0) assert (n1.end_lineno, n1.end_col_offset) == (1, 10) assert (n1.value.lineno, n1.value.col_offset) == (1, 6) assert (n1.value.end_lineno, n1.value.end_col_offset) == (1, 10) n2 = ast_nodes[1] assert isinstance(n2, nodes.Starred) assert isinstance(n2.value, nodes.Name) assert (n2.lineno, n2.col_offset) == (2, 0) assert (n2.end_lineno, n2.end_col_offset) == (2, 5) assert (n2.value.lineno, n2.value.col_offset) == (2, 1) assert (n2.value.end_lineno, n2.value.end_col_offset) == (2, 5) n3 = ast_nodes[2] assert isinstance(n3, nodes.Yield) assert isinstance(n3.value, nodes.Const) assert (n3.lineno, n3.col_offset) == (3, 0) assert (n3.end_lineno, n3.end_col_offset) == (3, 8) assert (n3.value.lineno, n3.value.col_offset) == (3, 6) assert (n3.value.end_lineno, n3.value.end_col_offset) == (3, 8) n4 = ast_nodes[3] assert isinstance(n4, nodes.YieldFrom) assert isinstance(n4.value, nodes.Tuple) assert 
(n4.lineno, n4.col_offset) == (4, 0) assert (n4.end_lineno, n4.end_col_offset) == (4, 17) assert (n4.value.lineno, n4.value.col_offset) == (4, 11) assert (n4.value.end_lineno, n4.value.end_col_offset) == (4, 17) @staticmethod def test_end_lineno_ops() -> None: """BinOp, BoolOp, UnaryOp, Compare.""" code = textwrap.dedent( """ x + y #@ a and b #@ -var #@ a < b #@ """ ).strip() ast_nodes = builder.extract_node(code) assert isinstance(ast_nodes, list) and len(ast_nodes) == 4 o1 = ast_nodes[0] assert isinstance(o1, nodes.BinOp) assert isinstance(o1.left, nodes.Name) assert isinstance(o1.right, nodes.Name) assert (o1.lineno, o1.col_offset) == (1, 0) assert (o1.end_lineno, o1.end_col_offset) == (1, 5) assert (o1.left.lineno, o1.left.col_offset) == (1, 0) assert (o1.left.end_lineno, o1.left.end_col_offset) == (1, 1) assert (o1.right.lineno, o1.right.col_offset) == (1, 4) assert (o1.right.end_lineno, o1.right.end_col_offset) == (1, 5) o2 = ast_nodes[1] assert isinstance(o2, nodes.BoolOp) assert isinstance(o2.values[0], nodes.Name) assert isinstance(o2.values[1], nodes.Name) assert (o2.lineno, o2.col_offset) == (2, 0) assert (o2.end_lineno, o2.end_col_offset) == (2, 7) assert (o2.values[0].lineno, o2.values[0].col_offset) == (2, 0) assert (o2.values[0].end_lineno, o2.values[0].end_col_offset) == (2, 1) assert (o2.values[1].lineno, o2.values[1].col_offset) == (2, 6) assert (o2.values[1].end_lineno, o2.values[1].end_col_offset) == (2, 7) o3 = ast_nodes[2] assert isinstance(o3, nodes.UnaryOp) assert isinstance(o3.operand, nodes.Name) assert (o3.lineno, o3.col_offset) == (3, 0) assert (o3.end_lineno, o3.end_col_offset) == (3, 4) assert (o3.operand.lineno, o3.operand.col_offset) == (3, 1) assert (o3.operand.end_lineno, o3.operand.end_col_offset) == (3, 4) o4 = ast_nodes[3] assert isinstance(o4, nodes.Compare) assert isinstance(o4.left, nodes.Name) assert isinstance(o4.ops[0][1], nodes.Name) assert (o4.lineno, o4.col_offset) == (4, 0) assert (o4.end_lineno, o4.end_col_offset) == 
(4, 5) assert (o4.left.lineno, o4.left.col_offset) == (4, 0) assert (o4.left.end_lineno, o4.left.end_col_offset) == (4, 1) assert (o4.ops[0][1].lineno, o4.ops[0][1].col_offset) == (4, 4) assert (o4.ops[0][1].end_lineno, o4.ops[0][1].end_col_offset) == (4, 5) @staticmethod def test_end_lineno_if() -> None: """If, IfExp, NamedExpr.""" code = textwrap.dedent( """ if ( #@ var := 2 #@ ): pass else: pass 2 if True else 1 #@ """ ).strip() ast_nodes = builder.extract_node(code) assert isinstance(ast_nodes, list) and len(ast_nodes) == 3 i1 = ast_nodes[0] assert isinstance(i1, nodes.If) assert isinstance(i1.test, nodes.NamedExpr) assert isinstance(i1.body[0], nodes.Pass) assert isinstance(i1.orelse[0], nodes.Pass) assert (i1.lineno, i1.col_offset) == (1, 0) assert (i1.end_lineno, i1.end_col_offset) == (6, 8) assert (i1.test.lineno, i1.test.col_offset) == (2, 4) assert (i1.test.end_lineno, i1.test.end_col_offset) == (2, 12) assert (i1.body[0].lineno, i1.body[0].col_offset) == (4, 4) assert (i1.body[0].end_lineno, i1.body[0].end_col_offset) == (4, 8) assert (i1.orelse[0].lineno, i1.orelse[0].col_offset) == (6, 4) assert (i1.orelse[0].end_lineno, i1.orelse[0].end_col_offset) == (6, 8) i2 = ast_nodes[1] assert isinstance(i2, nodes.NamedExpr) assert isinstance(i2.target, nodes.AssignName) assert isinstance(i2.value, nodes.Const) assert (i2.lineno, i2.col_offset) == (2, 4) assert (i2.end_lineno, i2.end_col_offset) == (2, 12) assert (i2.target.lineno, i2.target.col_offset) == (2, 4) assert (i2.target.end_lineno, i2.target.end_col_offset) == (2, 7) assert (i2.value.lineno, i2.value.col_offset) == (2, 11) assert (i2.value.end_lineno, i2.value.end_col_offset) == (2, 12) i3 = ast_nodes[2] assert isinstance(i3, nodes.IfExp) assert isinstance(i3.test, nodes.Const) assert isinstance(i3.body, nodes.Const) assert isinstance(i3.orelse, nodes.Const) assert (i3.lineno, i3.col_offset) == (8, 0) assert (i3.end_lineno, i3.end_col_offset) == (8, 16) assert (i3.test.lineno, i3.test.col_offset) == 
(8, 5) assert (i3.test.end_lineno, i3.test.end_col_offset) == (8, 9) assert (i3.body.lineno, i3.body.col_offset) == (8, 0) assert (i3.body.end_lineno, i3.body.end_col_offset) == (8, 1) assert (i3.orelse.lineno, i3.orelse.col_offset) == (8, 15) assert (i3.orelse.end_lineno, i3.orelse.end_col_offset) == (8, 16) @staticmethod def test_end_lineno_for() -> None: """For, AsyncFor.""" code = textwrap.dedent( """ for i in lst: #@ pass else: pass async for i in lst: #@ pass """ ).strip() ast_nodes = builder.extract_node(code) assert isinstance(ast_nodes, list) and len(ast_nodes) == 2 f1 = ast_nodes[0] assert isinstance(f1, nodes.For) assert isinstance(f1.target, nodes.AssignName) assert isinstance(f1.iter, nodes.Name) assert isinstance(f1.body[0], nodes.Pass) assert isinstance(f1.orelse[0], nodes.Pass) assert (f1.lineno, f1.col_offset) == (1, 0) assert (f1.end_lineno, f1.end_col_offset) == (4, 8) assert (f1.target.lineno, f1.target.col_offset) == (1, 4) assert (f1.target.end_lineno, f1.target.end_col_offset) == (1, 5) assert (f1.iter.lineno, f1.iter.col_offset) == (1, 9) assert (f1.iter.end_lineno, f1.iter.end_col_offset) == (1, 12) assert (f1.body[0].lineno, f1.body[0].col_offset) == (2, 4) assert (f1.body[0].end_lineno, f1.body[0].end_col_offset) == (2, 8) assert (f1.orelse[0].lineno, f1.orelse[0].col_offset) == (4, 4) assert (f1.orelse[0].end_lineno, f1.orelse[0].end_col_offset) == (4, 8) f2 = ast_nodes[1] assert isinstance(f2, nodes.AsyncFor) assert isinstance(f2.target, nodes.AssignName) assert isinstance(f2.iter, nodes.Name) assert isinstance(f2.body[0], nodes.Pass) assert (f2.lineno, f2.col_offset) == (6, 0) assert (f2.end_lineno, f2.end_col_offset) == (7, 8) assert (f2.target.lineno, f2.target.col_offset) == (6, 10) assert (f2.target.end_lineno, f2.target.end_col_offset) == (6, 11) assert (f2.iter.lineno, f2.iter.col_offset) == (6, 15) assert (f2.iter.end_lineno, f2.iter.end_col_offset) == (6, 18) assert (f2.body[0].lineno, f2.body[0].col_offset) == (7, 4) assert 
(f2.body[0].end_lineno, f2.body[0].end_col_offset) == (7, 8) @staticmethod def test_end_lineno_const() -> None: """Const (int, str, bool, None, bytes, ellipsis).""" code = textwrap.dedent( """ 2 #@ "Hello" #@ True #@ None #@ b"01" #@ ... #@ """ ).strip() ast_nodes = builder.extract_node(code) assert isinstance(ast_nodes, list) and len(ast_nodes) == 6 c1 = ast_nodes[0] assert isinstance(c1, nodes.Const) assert (c1.lineno, c1.col_offset) == (1, 0) assert (c1.end_lineno, c1.end_col_offset) == (1, 1) c2 = ast_nodes[1] assert isinstance(c2, nodes.Const) assert (c2.lineno, c2.col_offset) == (2, 0) assert (c2.end_lineno, c2.end_col_offset) == (2, 7) c3 = ast_nodes[2] assert isinstance(c3, nodes.Const) assert (c3.lineno, c3.col_offset) == (3, 0) assert (c3.end_lineno, c3.end_col_offset) == (3, 4) c4 = ast_nodes[3] assert isinstance(c4, nodes.Const) assert (c4.lineno, c4.col_offset) == (4, 0) assert (c4.end_lineno, c4.end_col_offset) == (4, 4) c5 = ast_nodes[4] assert isinstance(c5, nodes.Const) assert (c5.lineno, c5.col_offset) == (5, 0) assert (c5.end_lineno, c5.end_col_offset) == (5, 5) c6 = ast_nodes[5] assert isinstance(c6, nodes.Const) assert (c6.lineno, c6.col_offset) == (6, 0) assert (c6.end_lineno, c6.end_col_offset) == (6, 3) @staticmethod def test_end_lineno_function() -> None: """FunctionDef, AsyncFunctionDef, Decorators, Lambda, Arguments.""" code = textwrap.dedent( """ def func( #@ a: int = 0, /, var: int = 1, *args: Any, keyword: int = 2, **kwargs: Any ) -> None: pass @decorator1 @decorator2 async def func(): #@ pass lambda x: 2 #@ """ ).strip() ast_nodes = builder.extract_node(code) assert isinstance(ast_nodes, list) and len(ast_nodes) == 3 # fmt: off f1 = ast_nodes[0] assert isinstance(f1, nodes.FunctionDef) assert isinstance(f1.args, nodes.Arguments) assert isinstance(f1.returns, nodes.Const) assert isinstance(f1.body[0], nodes.Pass) assert (f1.lineno, f1.col_offset) == (1, 0) assert (f1.end_lineno, f1.end_col_offset) == (6, 8) assert (f1.returns.lineno, 
f1.returns.col_offset) == (5, 5) assert (f1.returns.end_lineno, f1.returns.end_col_offset) == (5, 9) assert (f1.body[0].lineno, f1.body[0].col_offset) == (6, 4) assert (f1.body[0].end_lineno, f1.body[0].end_col_offset) == (6, 8) # pos only arguments # TODO fix column offset: arg -> arg (AssignName) assert isinstance(f1.args.posonlyargs[0], nodes.AssignName) assert (f1.args.posonlyargs[0].lineno, f1.args.posonlyargs[0].col_offset) == (2, 4) assert (f1.args.posonlyargs[0].end_lineno, f1.args.posonlyargs[0].end_col_offset) == (2, 10) assert isinstance(f1.args.posonlyargs_annotations[0], nodes.Name) assert ( f1.args.posonlyargs_annotations[0].lineno, f1.args.posonlyargs_annotations[0].col_offset ) == (2, 7) assert ( f1.args.posonlyargs_annotations[0].end_lineno, f1.args.posonlyargs_annotations[0].end_col_offset ) == (2, 10) assert (f1.args.defaults[0].lineno, f1.args.defaults[0].col_offset) == (2, 13) assert (f1.args.defaults[0].end_lineno, f1.args.defaults[0].end_col_offset) == (2, 14) # pos or kw arguments assert isinstance(f1.args.args[0], nodes.AssignName) assert (f1.args.args[0].lineno, f1.args.args[0].col_offset) == (3, 4) assert (f1.args.args[0].end_lineno, f1.args.args[0].end_col_offset) == (3, 12) assert isinstance(f1.args.annotations[0], nodes.Name) assert (f1.args.annotations[0].lineno, f1.args.annotations[0].col_offset) == (3, 9) assert (f1.args.annotations[0].end_lineno, f1.args.annotations[0].end_col_offset) == (3, 12) assert isinstance(f1.args.defaults[1], nodes.Const) assert (f1.args.defaults[1].lineno, f1.args.defaults[1].col_offset) == (3, 15) assert (f1.args.defaults[1].end_lineno, f1.args.defaults[1].end_col_offset) == (3, 16) # *args assert isinstance(f1.args.varargannotation, nodes.Name) assert (f1.args.varargannotation.lineno, f1.args.varargannotation.col_offset) == (3, 25) assert (f1.args.varargannotation.end_lineno, f1.args.varargannotation.end_col_offset) == (3, 28) # kw_only arguments assert isinstance(f1.args.kwonlyargs[0], nodes.AssignName) 
assert (f1.args.kwonlyargs[0].lineno, f1.args.kwonlyargs[0].col_offset) == (4, 4) assert (f1.args.kwonlyargs[0].end_lineno, f1.args.kwonlyargs[0].end_col_offset) == (4, 16) annotations = f1.args.kwonlyargs_annotations assert isinstance(annotations[0], nodes.Name) assert (annotations[0].lineno, annotations[0].col_offset) == (4, 13) assert (annotations[0].end_lineno, annotations[0].end_col_offset) == (4, 16) assert isinstance(f1.args.kw_defaults[0], nodes.Const) assert (f1.args.kw_defaults[0].lineno, f1.args.kw_defaults[0].col_offset) == (4, 19) assert (f1.args.kw_defaults[0].end_lineno, f1.args.kw_defaults[0].end_col_offset) == (4, 20) # **kwargs assert isinstance(f1.args.kwargannotation, nodes.Name) assert (f1.args.kwargannotation.lineno, f1.args.kwargannotation.col_offset) == (4, 32) assert (f1.args.kwargannotation.end_lineno, f1.args.kwargannotation.end_col_offset) == (4, 35) f2 = ast_nodes[1] assert isinstance(f2, nodes.AsyncFunctionDef) assert isinstance(f2.decorators, nodes.Decorators) assert isinstance(f2.decorators.nodes[0], nodes.Name) assert isinstance(f2.decorators.nodes[1], nodes.Name) assert (f2.lineno, f2.col_offset) == (8, 0) assert (f2.end_lineno, f2.end_col_offset) == (11, 8) assert (f2.decorators.lineno, f2.decorators.col_offset) == (8, 0) assert (f2.decorators.end_lineno, f2.decorators.end_col_offset) == (9, 11) assert (f2.decorators.nodes[0].lineno, f2.decorators.nodes[0].col_offset) == (8, 1) assert (f2.decorators.nodes[0].end_lineno, f2.decorators.nodes[0].end_col_offset) == (8, 11) assert (f2.decorators.nodes[1].lineno, f2.decorators.nodes[1].col_offset) == (9, 1) assert (f2.decorators.nodes[1].end_lineno, f2.decorators.nodes[1].end_col_offset) == (9, 11) f3 = ast_nodes[2] assert isinstance(f3, nodes.Lambda) assert isinstance(f3.args, nodes.Arguments) assert isinstance(f3.args.args[0], nodes.AssignName) assert isinstance(f3.body, nodes.Const) assert (f3.lineno, f3.col_offset) == (13, 0) assert (f3.end_lineno, f3.end_col_offset) == (13, 11) 
assert (f3.args.args[0].lineno, f3.args.args[0].col_offset) == (13, 7) assert (f3.args.args[0].end_lineno, f3.args.args[0].end_col_offset) == (13, 8) assert (f3.body.lineno, f3.body.col_offset) == (13, 10) assert (f3.body.end_lineno, f3.body.end_col_offset) == (13, 11) # fmt: on @staticmethod def test_end_lineno_dict() -> None: """Dict, DictUnpack.""" code = textwrap.dedent( """ { #@ 1: "Hello", **{2: "World"} #@ } """ ).strip() ast_nodes = builder.extract_node(code) assert isinstance(ast_nodes, list) and len(ast_nodes) == 2 d1 = ast_nodes[0] assert isinstance(d1, nodes.Dict) assert isinstance(d1.items[0][0], nodes.Const) assert isinstance(d1.items[0][1], nodes.Const) assert (d1.lineno, d1.col_offset) == (1, 0) assert (d1.end_lineno, d1.end_col_offset) == (4, 1) assert (d1.items[0][0].lineno, d1.items[0][0].col_offset) == (2, 4) assert (d1.items[0][0].end_lineno, d1.items[0][0].end_col_offset) == (2, 5) assert (d1.items[0][1].lineno, d1.items[0][1].col_offset) == (2, 7) assert (d1.items[0][1].end_lineno, d1.items[0][1].end_col_offset) == (2, 14) d2 = ast_nodes[1] assert isinstance(d2, nodes.DictUnpack) assert (d2.lineno, d2.col_offset) == (3, 6) assert (d2.end_lineno, d2.end_col_offset) == (3, 18) @staticmethod def test_end_lineno_try() -> None: """Try, ExceptHandler.""" code = textwrap.dedent( """ try: #@ pass except KeyError as ex: pass except AttributeError as ex: pass else: pass try: #@ pass except KeyError as ex: pass else: pass finally: pass """ ).strip() ast_nodes = builder.extract_node(code) assert isinstance(ast_nodes, list) and len(ast_nodes) == 2 t1 = ast_nodes[0] assert isinstance(t1, nodes.Try) assert isinstance(t1.body[0], nodes.Pass) assert isinstance(t1.orelse[0], nodes.Pass) assert (t1.lineno, t1.col_offset) == (1, 0) assert (t1.end_lineno, t1.end_col_offset) == (8, 8) assert (t1.body[0].lineno, t1.body[0].col_offset) == (2, 4) assert (t1.body[0].end_lineno, t1.body[0].end_col_offset) == (2, 8) assert (t1.orelse[0].lineno, t1.orelse[0].col_offset) 
== (8, 4) assert (t1.orelse[0].end_lineno, t1.orelse[0].end_col_offset) == (8, 8) t2 = t1.handlers[0] assert isinstance(t2, nodes.ExceptHandler) assert isinstance(t2.type, nodes.Name) assert isinstance(t2.name, nodes.AssignName) assert isinstance(t2.body[0], nodes.Pass) assert (t2.lineno, t2.col_offset) == (3, 0) assert (t2.end_lineno, t2.end_col_offset) == (4, 8) assert (t2.type.lineno, t2.type.col_offset) == (3, 7) assert (t2.type.end_lineno, t2.type.end_col_offset) == (3, 15) # TODO fix column offset: ExceptHandler -> name (AssignName) assert (t2.name.lineno, t2.name.col_offset) == (3, 0) assert (t2.name.end_lineno, t2.name.end_col_offset) == (4, 8) assert (t2.body[0].lineno, t2.body[0].col_offset) == (4, 4) assert (t2.body[0].end_lineno, t2.body[0].end_col_offset) == (4, 8) t3 = ast_nodes[1] assert isinstance(t3, nodes.Try) assert isinstance(t3.finalbody[0], nodes.Pass) assert (t3.lineno, t3.col_offset) == (10, 0) assert (t3.end_lineno, t3.end_col_offset) == (17, 8) assert (t3.body[0].lineno, t3.body[0].col_offset) == (11, 4) assert (t3.body[0].end_lineno, t3.body[0].end_col_offset) == (11, 8) assert (t3.finalbody[0].lineno, t3.finalbody[0].col_offset) == (17, 4) assert (t3.finalbody[0].end_lineno, t3.finalbody[0].end_col_offset) == (17, 8) @staticmethod def test_end_lineno_subscript() -> None: """Subscript, Slice, (ExtSlice, Index).""" code = textwrap.dedent( """ var[0] #@ var[1:2:1] #@ var[1:2, 2] #@ """ ).strip() ast_nodes = builder.extract_node(code) assert isinstance(ast_nodes, list) and len(ast_nodes) == 3 s1 = ast_nodes[0] assert isinstance(s1, nodes.Subscript) assert isinstance(s1.value, nodes.Name) assert isinstance(s1.slice, nodes.Const) assert (s1.lineno, s1.col_offset) == (1, 0) assert (s1.end_lineno, s1.end_col_offset) == (1, 6) assert (s1.value.lineno, s1.value.col_offset) == (1, 0) assert (s1.value.end_lineno, s1.value.end_col_offset) == (1, 3) assert (s1.slice.lineno, s1.slice.col_offset) == (1, 4) assert (s1.slice.end_lineno, 
                s1.slice.end_col_offset) == (1, 5)

        s2 = ast_nodes[1]
        assert isinstance(s2, nodes.Subscript)
        assert isinstance(s2.slice, nodes.Slice)
        assert isinstance(s2.slice.lower, nodes.Const)
        assert isinstance(s2.slice.upper, nodes.Const)
        assert isinstance(s2.slice.step, nodes.Const)
        assert (s2.lineno, s2.col_offset) == (2, 0)
        assert (s2.end_lineno, s2.end_col_offset) == (2, 10)
        assert (s2.slice.lower.lineno, s2.slice.lower.col_offset) == (2, 4)
        assert (s2.slice.lower.end_lineno, s2.slice.lower.end_col_offset) == (2, 5)
        assert (s2.slice.upper.lineno, s2.slice.upper.col_offset) == (2, 6)
        assert (s2.slice.upper.end_lineno, s2.slice.upper.end_col_offset) == (2, 7)
        assert (s2.slice.step.lineno, s2.slice.step.col_offset) == (2, 8)
        assert (s2.slice.step.end_lineno, s2.slice.step.end_col_offset) == (2, 9)

        s3 = ast_nodes[2]
        assert isinstance(s3, nodes.Subscript)
        assert isinstance(s3.slice, nodes.Tuple)
        assert (s3.lineno, s3.col_offset) == (3, 0)
        assert (s3.end_lineno, s3.end_col_offset) == (3, 11)
        if PY39_PLUS:
            # 'lineno' and 'col_offset' information only added in Python 3.9
            assert (s3.slice.lineno, s3.slice.col_offset) == (3, 4)
            assert (s3.slice.end_lineno, s3.slice.end_col_offset) == (3, 10)
        else:
            assert (s3.slice.lineno, s3.slice.col_offset) == (None, None)
            assert (s3.slice.end_lineno, s3.slice.end_col_offset) == (None, None)

    @staticmethod
    def test_end_lineno_import() -> None:
        """Import, ImportFrom."""
        code = textwrap.dedent(
            """
        import a.b #@
        import a as x #@
        from . import x #@
        from .a import y as y #@
        """
        ).strip()
        ast_nodes = builder.extract_node(code)
        assert isinstance(ast_nodes, list) and len(ast_nodes) == 4

        i1 = ast_nodes[0]
        assert isinstance(i1, nodes.Import)
        assert (i1.lineno, i1.col_offset) == (1, 0)
        assert (i1.end_lineno, i1.end_col_offset) == (1, 10)

        i2 = ast_nodes[1]
        assert isinstance(i2, nodes.Import)
        assert (i2.lineno, i2.col_offset) == (2, 0)
        assert (i2.end_lineno, i2.end_col_offset) == (2, 13)

        i3 = ast_nodes[2]
        assert isinstance(i3, nodes.ImportFrom)
        assert (i3.lineno, i3.col_offset) == (3, 0)
        assert (i3.end_lineno, i3.end_col_offset) == (3, 15)

        i4 = ast_nodes[3]
        assert isinstance(i4, nodes.ImportFrom)
        assert (i4.lineno, i4.col_offset) == (4, 0)
        assert (i4.end_lineno, i4.end_col_offset) == (4, 21)

    @staticmethod
    def test_end_lineno_with() -> None:
        """With, AsyncWith."""
        code = textwrap.dedent(
            """
        with open(file) as fp, \\
                open(file2) as fp2: #@
            pass

        async with open(file) as fp: #@
            pass
        """
        ).strip()
        ast_nodes = builder.extract_node(code)
        assert isinstance(ast_nodes, list) and len(ast_nodes) == 2

        w1 = ast_nodes[0].parent
        assert isinstance(w1, nodes.With)
        assert isinstance(w1.items[0][0], nodes.Call)
        assert isinstance(w1.items[0][1], nodes.AssignName)
        assert isinstance(w1.items[1][0], nodes.Call)
        assert isinstance(w1.items[1][1], nodes.AssignName)
        assert isinstance(w1.body[0], nodes.Pass)
        assert (w1.lineno, w1.col_offset) == (1, 0)
        assert (w1.end_lineno, w1.end_col_offset) == (3, 8)
        assert (w1.items[0][0].lineno, w1.items[0][0].col_offset) == (1, 5)
        assert (w1.items[0][0].end_lineno, w1.items[0][0].end_col_offset) == (1, 15)
        assert (w1.items[0][1].lineno, w1.items[0][1].col_offset) == (1, 19)
        assert (w1.items[0][1].end_lineno, w1.items[0][1].end_col_offset) == (1, 21)
        assert (w1.items[1][0].lineno, w1.items[1][0].col_offset) == (2, 8)
        assert (w1.items[1][0].end_lineno, w1.items[1][0].end_col_offset) == (2, 19)
        assert (w1.items[1][1].lineno, w1.items[1][1].col_offset) == (2, 23)
        assert (w1.items[1][1].end_lineno,
                w1.items[1][1].end_col_offset) == (2, 26)
        assert (w1.body[0].lineno, w1.body[0].col_offset) == (3, 4)
        assert (w1.body[0].end_lineno, w1.body[0].end_col_offset) == (3, 8)

        w2 = ast_nodes[1]
        assert isinstance(w2, nodes.AsyncWith)
        assert isinstance(w2.items[0][0], nodes.Call)
        assert isinstance(w2.items[0][1], nodes.AssignName)
        assert isinstance(w2.body[0], nodes.Pass)
        assert (w2.lineno, w2.col_offset) == (5, 0)
        assert (w2.end_lineno, w2.end_col_offset) == (6, 8)
        assert (w2.items[0][0].lineno, w2.items[0][0].col_offset) == (5, 11)
        assert (w2.items[0][0].end_lineno, w2.items[0][0].end_col_offset) == (5, 21)
        assert (w2.items[0][1].lineno, w2.items[0][1].col_offset) == (5, 25)
        assert (w2.items[0][1].end_lineno, w2.items[0][1].end_col_offset) == (5, 27)
        assert (w2.body[0].lineno, w2.body[0].col_offset) == (6, 4)
        assert (w2.body[0].end_lineno, w2.body[0].end_col_offset) == (6, 8)

    @staticmethod
    def test_end_lineno_while() -> None:
        """While."""
        code = textwrap.dedent(
            """
        while 2:
            pass
        else:
            pass
        """
        ).strip()
        w1 = builder.extract_node(code)
        assert isinstance(w1, nodes.While)
        assert isinstance(w1.test, nodes.Const)
        assert isinstance(w1.body[0], nodes.Pass)
        assert isinstance(w1.orelse[0], nodes.Pass)
        assert (w1.lineno, w1.col_offset) == (1, 0)
        assert (w1.end_lineno, w1.end_col_offset) == (4, 8)
        assert (w1.test.lineno, w1.test.col_offset) == (1, 6)
        assert (w1.test.end_lineno, w1.test.end_col_offset) == (1, 7)
        assert (w1.body[0].lineno, w1.body[0].col_offset) == (2, 4)
        assert (w1.body[0].end_lineno, w1.body[0].end_col_offset) == (2, 8)
        assert (w1.orelse[0].lineno, w1.orelse[0].col_offset) == (4, 4)
        assert (w1.orelse[0].end_lineno, w1.orelse[0].end_col_offset) == (4, 8)

    @staticmethod
    def test_end_lineno_string() -> None:
        """FormattedValue, JoinedStr."""
        code = textwrap.dedent(
            """
        f"Hello World: {42.1234:02d}" #@
        f"Hello: {name=}" #@
        """
        ).strip()
        ast_nodes = builder.extract_node(code)
        assert isinstance(ast_nodes, list) and len(ast_nodes) == 2

        s1 = ast_nodes[0]
        assert isinstance(s1, nodes.JoinedStr)
        assert isinstance(s1.values[0], nodes.Const)
        assert (s1.lineno, s1.col_offset) == (1, 0)
        assert (s1.end_lineno, s1.end_col_offset) == (1, 29)
        if PY312_PLUS:
            assert (s1.values[0].lineno, s1.values[0].col_offset) == (1, 2)
            assert (s1.values[0].end_lineno, s1.values[0].end_col_offset) == (1, 15)
        else:
            # Bug in Python 3.11
            # https://github.com/python/cpython/issues/81639
            assert (s1.values[0].lineno, s1.values[0].col_offset) == (1, 0)
            assert (s1.values[0].end_lineno, s1.values[0].end_col_offset) == (1, 29)

        s2 = s1.values[1]
        assert isinstance(s2, nodes.FormattedValue)
        if PY312_PLUS:
            assert (s2.lineno, s2.col_offset) == (1, 15)
            assert (s2.end_lineno, s2.end_col_offset) == (1, 28)
        else:
            assert (s2.lineno, s2.col_offset) == (1, 0)
            assert (s2.end_lineno, s2.end_col_offset) == (1, 29)
        assert isinstance(s2.value, nodes.Const)  # 42.1234
        if PY39_PLUS:
            assert (s2.value.lineno, s2.value.col_offset) == (1, 16)
            assert (s2.value.end_lineno, s2.value.end_col_offset) == (1, 23)
        else:
            # Bug in Python 3.8
            # https://bugs.python.org/issue44885
            assert (s2.value.lineno, s2.value.col_offset) == (1, 1)
            assert (s2.value.end_lineno, s2.value.end_col_offset) == (1, 8)
        assert isinstance(s2.format_spec, nodes.JoinedStr)  # ':02d'
        if PY312_PLUS:
            assert (s2.format_spec.lineno, s2.format_spec.col_offset) == (1, 23)
            assert (s2.format_spec.end_lineno, s2.format_spec.end_col_offset) == (1, 27)
        else:
            assert (s2.format_spec.lineno, s2.format_spec.col_offset) == (1, 0)
            assert (s2.format_spec.end_lineno, s2.format_spec.end_col_offset) == (1, 29)

        s3 = ast_nodes[1]
        assert isinstance(s3, nodes.JoinedStr)
        assert isinstance(s3.values[0], nodes.Const)
        assert (s3.lineno, s3.col_offset) == (2, 0)
        assert (s3.end_lineno, s3.end_col_offset) == (2, 17)
        if PY312_PLUS:
            assert (s3.values[0].lineno, s3.values[0].col_offset) == (2, 2)
            assert (s3.values[0].end_lineno, s3.values[0].end_col_offset) == (2, 15)
        else:
            assert (s3.values[0].lineno, s3.values[0].col_offset) == (2, 0)
            assert (s3.values[0].end_lineno, s3.values[0].end_col_offset) == (2, 17)

        s4 = s3.values[1]
        assert isinstance(s4, nodes.FormattedValue)
        if PY312_PLUS:
            assert (s4.lineno, s4.col_offset) == (2, 9)
            assert (s4.end_lineno, s4.end_col_offset) == (2, 16)
        else:
            assert (s4.lineno, s4.col_offset) == (2, 0)
            assert (s4.end_lineno, s4.end_col_offset) == (2, 17)
        assert isinstance(s4.value, nodes.Name)  # 'name'
        if PY39_PLUS:
            assert (s4.value.lineno, s4.value.col_offset) == (2, 10)
            assert (s4.value.end_lineno, s4.value.end_col_offset) == (2, 14)
        else:
            # Bug in Python 3.8
            # https://bugs.python.org/issue44885
            assert (s4.value.lineno, s4.value.col_offset) == (2, 1)
            assert (s4.value.end_lineno, s4.value.end_col_offset) == (2, 5)

    @staticmethod
    @pytest.mark.skipif(not PY310_PLUS, reason="pattern matching was added in PY310")
    def test_end_lineno_match() -> None:
        """Match, MatchValue, MatchSingleton, MatchSequence,
        MatchMapping, MatchClass, MatchStar, MatchOr, MatchAs.
        """
        code = textwrap.dedent(
            """
        match x: #@
            case 200 if True: #@
                pass
            case True: #@
                pass
            case (1, 2, *args): #@
                pass
            case {1: "Hello", **rest}: #@
                pass
            case Point2d(0, y=0): #@
                pass
            case 200 | 300: #@
                pass
            case 200 as c: #@
                pass
        """
        ).strip()
        ast_nodes = builder.extract_node(code)
        assert isinstance(ast_nodes, list) and len(ast_nodes) == 8

        # fmt: off
        m1 = ast_nodes[0]
        assert isinstance(m1, nodes.Match)
        assert (m1.lineno, m1.col_offset) == (1, 0)
        assert (m1.end_lineno, m1.end_col_offset) == (15, 12)
        assert (m1.subject.lineno, m1.subject.col_offset) == (1, 6)
        assert (m1.subject.end_lineno, m1.subject.end_col_offset) == (1, 7)

        m2 = ast_nodes[1]
        assert isinstance(m2, nodes.MatchCase)
        assert isinstance(m2.pattern, nodes.MatchValue)
        assert isinstance(m2.guard, nodes.Const)
        assert isinstance(m2.body[0], nodes.Pass)
        assert (m2.pattern.lineno, m2.pattern.col_offset) == (2, 9)
        assert (m2.pattern.end_lineno, m2.pattern.end_col_offset) == (2, 12)
        assert (m2.guard.lineno, m2.guard.col_offset) == (2, 16)
        assert (m2.guard.end_lineno, m2.guard.end_col_offset) == (2, 20)
        assert (m2.body[0].lineno, m2.body[0].col_offset) == (3, 8)
        assert (m2.body[0].end_lineno, m2.body[0].end_col_offset) == (3, 12)

        m3 = ast_nodes[2]
        assert isinstance(m3, nodes.MatchCase)
        assert isinstance(m3.pattern, nodes.MatchSingleton)
        assert (m3.pattern.lineno, m3.pattern.col_offset) == (4, 9)
        assert (m3.pattern.end_lineno, m3.pattern.end_col_offset) == (4, 13)

        m4 = ast_nodes[3]
        assert isinstance(m4, nodes.MatchCase)
        assert isinstance(m4.pattern, nodes.MatchSequence)
        assert isinstance(m4.pattern.patterns[0], nodes.MatchValue)
        assert (m4.pattern.lineno, m4.pattern.col_offset) == (6, 9)
        assert (m4.pattern.end_lineno, m4.pattern.end_col_offset) == (6, 22)
        assert (m4.pattern.patterns[0].lineno, m4.pattern.patterns[0].col_offset) == (6, 10)
        assert (m4.pattern.patterns[0].end_lineno, m4.pattern.patterns[0].end_col_offset) == (6, 11)

        m5 = m4.pattern.patterns[2]
        assert isinstance(m5, nodes.MatchStar)
        assert isinstance(m5.name, nodes.AssignName)
        assert (m5.lineno, m5.col_offset) == (6, 16)
        assert (m5.end_lineno, m5.end_col_offset) == (6, 21)
        # TODO fix column offset: MatchStar -> name (AssignName)
        assert (m5.name.lineno, m5.name.col_offset) == (6, 16)
        assert (m5.name.end_lineno, m5.name.end_col_offset) == (6, 21)

        m6 = ast_nodes[4]
        assert isinstance(m6, nodes.MatchCase)
        assert isinstance(m6.pattern, nodes.MatchMapping)
        assert isinstance(m6.pattern.keys[0], nodes.Const)
        assert isinstance(m6.pattern.patterns[0], nodes.MatchValue)
        assert isinstance(m6.pattern.rest, nodes.AssignName)
        assert (m6.pattern.lineno, m6.pattern.col_offset) == (8, 9)
        assert (m6.pattern.end_lineno, m6.pattern.end_col_offset) == (8, 29)
        assert (m6.pattern.keys[0].lineno, m6.pattern.keys[0].col_offset) == (8, 10)
        assert (m6.pattern.keys[0].end_lineno, m6.pattern.keys[0].end_col_offset) == (8, 11)
        assert (m6.pattern.patterns[0].lineno, m6.pattern.patterns[0].col_offset) == (8, 13)
        assert (m6.pattern.patterns[0].end_lineno, m6.pattern.patterns[0].end_col_offset) == (8,
                                                                                                20)
        # TODO fix column offset: MatchMapping -> rest (AssignName)
        assert (m6.pattern.rest.lineno, m6.pattern.rest.col_offset) == (8, 9)
        assert (m6.pattern.rest.end_lineno, m6.pattern.rest.end_col_offset) == (8, 29)

        m7 = ast_nodes[5]
        assert isinstance(m7, nodes.MatchCase)
        assert isinstance(m7.pattern, nodes.MatchClass)
        assert isinstance(m7.pattern.cls, nodes.Name)
        assert isinstance(m7.pattern.patterns[0], nodes.MatchValue)
        assert isinstance(m7.pattern.kwd_patterns[0], nodes.MatchValue)
        assert (m7.pattern.lineno, m7.pattern.col_offset) == (10, 9)
        assert (m7.pattern.end_lineno, m7.pattern.end_col_offset) == (10, 24)
        assert (m7.pattern.cls.lineno, m7.pattern.cls.col_offset) == (10, 9)
        assert (m7.pattern.cls.end_lineno, m7.pattern.cls.end_col_offset) == (10, 16)
        assert (m7.pattern.patterns[0].lineno, m7.pattern.patterns[0].col_offset) == (10, 17)
        assert (m7.pattern.patterns[0].end_lineno, m7.pattern.patterns[0].end_col_offset) == (10, 18)
        assert (m7.pattern.kwd_patterns[0].lineno, m7.pattern.kwd_patterns[0].col_offset) == (10, 22)
        assert (m7.pattern.kwd_patterns[0].end_lineno, m7.pattern.kwd_patterns[0].end_col_offset) == (10, 23)

        m8 = ast_nodes[6]
        assert isinstance(m8, nodes.MatchCase)
        assert isinstance(m8.pattern, nodes.MatchOr)
        assert isinstance(m8.pattern.patterns[0], nodes.MatchValue)
        assert (m8.pattern.lineno, m8.pattern.col_offset) == (12, 9)
        assert (m8.pattern.end_lineno, m8.pattern.end_col_offset) == (12, 18)
        assert (m8.pattern.patterns[0].lineno, m8.pattern.patterns[0].col_offset) == (12, 9)
        assert (m8.pattern.patterns[0].end_lineno, m8.pattern.patterns[0].end_col_offset) == (12, 12)

        m9 = ast_nodes[7]
        assert isinstance(m9, nodes.MatchCase)
        assert isinstance(m9.pattern, nodes.MatchAs)
        assert isinstance(m9.pattern.pattern, nodes.MatchValue)
        assert isinstance(m9.pattern.name, nodes.AssignName)
        assert (m9.pattern.lineno, m9.pattern.col_offset) == (14, 9)
        assert (m9.pattern.end_lineno, m9.pattern.end_col_offset) == (14, 17)
        assert (m9.pattern.pattern.lineno, m9.pattern.pattern.col_offset) == (14, 9)
        assert (m9.pattern.pattern.end_lineno, m9.pattern.pattern.end_col_offset) == (14, 12)
        # TODO fix column offset: MatchAs -> name (AssignName)
        assert (m9.pattern.name.lineno, m9.pattern.name.col_offset) == (14, 9)
        assert (m9.pattern.name.end_lineno, m9.pattern.name.end_col_offset) == (14, 17)
        # fmt: on

    @staticmethod
    def test_end_lineno_comprehension() -> None:
        """ListComp, SetComp, DictComp, GeneratorExpr."""
        code = textwrap.dedent(
            """
        [x for x in var] #@
        {x for x in var} #@
        {x: y for x, y in var} #@
        (x for x in var) #@
        """
        ).strip()
        ast_nodes = builder.extract_node(code)
        assert isinstance(ast_nodes, list) and len(ast_nodes) == 4

        c1 = ast_nodes[0]
        assert isinstance(c1, nodes.ListComp)
        assert isinstance(c1.elt, nodes.Name)
        assert isinstance(c1.generators[0], nodes.Comprehension)  # type: ignore[index]
        assert (c1.lineno, c1.col_offset) == (1, 0)
        assert (c1.end_lineno, c1.end_col_offset) == (1, 16)
        assert (c1.elt.lineno, c1.elt.col_offset) == (1, 1)
        assert (c1.elt.end_lineno, c1.elt.end_col_offset) == (1, 2)

        c2 = ast_nodes[1]
        assert isinstance(c2, nodes.SetComp)
        assert isinstance(c2.elt, nodes.Name)
        assert isinstance(c2.generators[0], nodes.Comprehension)  # type: ignore[index]
        assert (c2.lineno, c2.col_offset) == (2, 0)
        assert (c2.end_lineno, c2.end_col_offset) == (2, 16)
        assert (c2.elt.lineno, c2.elt.col_offset) == (2, 1)
        assert (c2.elt.end_lineno, c2.elt.end_col_offset) == (2, 2)

        c3 = ast_nodes[2]
        assert isinstance(c3, nodes.DictComp)
        assert isinstance(c3.key, nodes.Name)
        assert isinstance(c3.value, nodes.Name)
        assert isinstance(c3.generators[0], nodes.Comprehension)  # type: ignore[index]
        assert (c3.lineno, c3.col_offset) == (3, 0)
        assert (c3.end_lineno, c3.end_col_offset) == (3, 22)
        assert (c3.key.lineno, c3.key.col_offset) == (3, 1)
        assert (c3.key.end_lineno, c3.key.end_col_offset) == (3, 2)
        assert (c3.value.lineno, c3.value.col_offset) == (3, 4)
        assert (c3.value.end_lineno, c3.value.end_col_offset) == (3, 5)

        c4 = ast_nodes[3]
        assert isinstance(c4, nodes.GeneratorExp)
        assert isinstance(c4.elt, nodes.Name)
        assert isinstance(c4.generators[0], nodes.Comprehension)  # type: ignore[index]
        assert (c4.lineno, c4.col_offset) == (4, 0)
        assert (c4.end_lineno, c4.end_col_offset) == (4, 16)
        assert (c4.elt.lineno, c4.elt.col_offset) == (4, 1)
        assert (c4.elt.end_lineno, c4.elt.end_col_offset) == (4, 2)

    @staticmethod
    def test_end_lineno_class() -> None:
        """ClassDef, Keyword."""
        code = textwrap.dedent(
            """
        @decorator1
        @decorator2
        class X(Parent, var=42):
            pass
        """
        ).strip()
        c1 = builder.extract_node(code)
        assert isinstance(c1, nodes.ClassDef)
        assert isinstance(c1.decorators, nodes.Decorators)
        assert isinstance(c1.bases[0], nodes.Name)
        assert isinstance(c1.keywords[0], nodes.Keyword)
        assert isinstance(c1.body[0], nodes.Pass)

        # fmt: off
        assert (c1.lineno, c1.col_offset) == (3, 0)
        assert (c1.end_lineno, c1.end_col_offset) == (4, 8)
        assert (c1.decorators.lineno, c1.decorators.col_offset) == (1, 0)
        assert (c1.decorators.end_lineno, c1.decorators.end_col_offset) == (2, 11)
        assert (c1.bases[0].lineno, c1.bases[0].col_offset) == (3, 8)
        assert (c1.bases[0].end_lineno, c1.bases[0].end_col_offset) == (3, 14)
        if PY39_PLUS:
            # 'lineno' and 'col_offset' information only added in Python 3.9
            assert (c1.keywords[0].lineno, c1.keywords[0].col_offset) == (3, 16)
            assert (c1.keywords[0].end_lineno, c1.keywords[0].end_col_offset) == (3, 22)
        else:
            assert (c1.keywords[0].lineno, c1.keywords[0].col_offset) == (None, None)
            assert (c1.keywords[0].end_lineno, c1.keywords[0].end_col_offset) == (None, None)
        assert (c1.body[0].lineno, c1.body[0].col_offset) == (4, 4)
        assert (c1.body[0].end_lineno, c1.body[0].end_col_offset) == (4, 8)
        # fmt: on

    @staticmethod
    def test_end_lineno_module() -> None:
        """Tests for Module."""
        code = """print()"""
        module = astroid.parse(code)
        assert isinstance(module, nodes.Module)
        assert module.lineno == 0
        assert module.col_offset == 0
        assert module.end_lineno is None
        assert module.end_col_offset is None
astroid-3.2.2/script/
astroid-3.2.2/script/create_contributor_list.py
# Licensed under the LGPL: https://www.gnu.org/licenses/old-licenses/lgpl-2.1.en.html
# For details: https://github.com/pylint-dev/astroid/blob/main/LICENSE
# Copyright (c) https://github.com/pylint-dev/astroid/blob/main/CONTRIBUTORS.txt

from pathlib import Path

from contributors_txt import create_contributors_txt

CWD = Path(".").absolute()
ASTROID_BASE_DIRECTORY = Path(__file__).parent.parent.absolute()
ALIASES_FILE = (
    ASTROID_BASE_DIRECTORY / "script/.contributors_aliases.json"
).relative_to(CWD)
DEFAULT_CONTRIBUTOR_PATH = (ASTROID_BASE_DIRECTORY / "CONTRIBUTORS.txt").relative_to(
    CWD
)


def main():
    create_contributors_txt(
        aliases_file=ALIASES_FILE, output=DEFAULT_CONTRIBUTOR_PATH, verbose=True
    )


if __name__ == "__main__":
    main()
astroid-3.2.2/script/test_bump_changelog.py
# Licensed under the LGPL: https://www.gnu.org/licenses/old-licenses/lgpl-2.1.en.html
# For details: https://github.com/pylint-dev/astroid/blob/main/LICENSE
# Copyright (c) https://github.com/pylint-dev/astroid/blob/main/CONTRIBUTORS.txt

import logging

import pytest
from bump_changelog import (
    VersionType,
    get_next_version,
    get_next_versions,
    transform_content,
)


@pytest.mark.parametrize(
    "version,version_type,expected_version,expected_versions",
    [
        ["2.6.1", VersionType.PATCH, "2.6.2", ["2.6.2"]],
        ["2.10.0", VersionType.MINOR, "2.11.0", ["2.11.0", "2.10.1"]],
        ["10.1.10", VersionType.PATCH, "10.1.11", ["10.1.11"]],
        ["2.6.0", VersionType.MINOR, "2.7.0", ["2.7.0", "2.6.1"]],
        ["2.6.1", VersionType.MAJOR, "3.0.0", ["3.1.0", "3.0.1"]],
        ["2.6.1-dev0", VersionType.PATCH, "2.6.2", ["2.6.2"]],
        ["2.6.1-dev0", VersionType.MINOR, "2.7.0", ["2.7.1", "2.7.0"]],
        ["2.6.1-dev0", VersionType.MAJOR, "3.0.0", ["3.1.0",
"3.0.1"]], ["2.7.0", VersionType.PATCH, "2.7.1", ["2.7.1"]], ["2.7.0", VersionType.MINOR, "2.8.0", ["2.8.0", "2.7.1"]], ["2.7.0", VersionType.MAJOR, "3.0.0", ["3.1.0", "3.0.1"]], ["2.0.0", VersionType.PATCH, "2.0.1", ["2.0.1"]], ["2.0.0", VersionType.MINOR, "2.1.0", ["2.1.0", "2.0.1"]], ["2.0.0", VersionType.MAJOR, "3.0.0", ["3.1.0", "3.0.1"]], ], ) def test_get_next_version(version, version_type, expected_version, expected_versions): assert get_next_version(version, version_type) == expected_version if ( version_type == VersionType.PATCH or version_type == VersionType.MINOR and version.endswith(".0") ): assert get_next_versions(version, version_type) == expected_versions @pytest.mark.parametrize( "old_content,version,expected_error", [ [ """ What's New in astroid 2.7.0? ============================ Release date: TBA What's New in astroid 2.6.1? ============================ Release date: TBA What's New in astroid 2.6.0? ============================ Release date: TBA """, "2.6.1", r"There should be only two release dates 'TBA' \(2.6.1 and 2.7.0\)", ], [ """=================== astroid's ChangeLog =================== What's New in astroid 2.6.0? ============================ Release date: TBA """, "2.6.1", "text for this version '2.6.1' did not exists", ], [ """ What's New in astroid 2.6.2? ============================ Release date: TBA What's New in astroid 2.6.1? ============================ Release date: TBA """, "2.6.1", "The text for the next version '2.6.2' already exists", ], [ """ What's New in astroid 3.0.0? ============================ Release date: TBA What's New in astroid 2.6.10? ============================ Release date: TBA """, "3.0.0", r"There should be only one release date 'TBA' \(3.0.0\)", ], [ """ What's New in astroid 2.7.0? ============================ Release date: TBA What's New in astroid 2.6.10? 
============================ Release date: TBA """, "2.7.0", r"There should be only one release date 'TBA' \(2.7.0\)", ], ], ) def test_update_content_error(old_content, version, expected_error, caplog): caplog.set_level(logging.DEBUG) with pytest.raises(AssertionError, match=expected_error): transform_content(old_content, version) def test_update_content(caplog): caplog.set_level(logging.DEBUG) old_content = """ =================== astroid's ChangeLog =================== What's New in astroid 2.6.1? ============================ Release date: TBA """ expected_beginning = """ =================== astroid's ChangeLog =================== What's New in astroid 2.6.2? ============================ Release date: TBA What's New in astroid 2.6.1? ============================ Release date: 20""" new_content = transform_content(old_content, "2.6.1") assert new_content[: len(expected_beginning)] == expected_beginning def test_update_content_minor(): old_content = """ =================== astroid's ChangeLog =================== What's New in astroid 2.7.0? ============================ Release date: TBA """ expected_beginning = """ =================== astroid's ChangeLog =================== What's New in astroid 2.8.0? ============================ Release date: TBA What's New in astroid 2.7.1? ============================ Release date: TBA What's New in astroid 2.7.0? ============================ Release date: 20""" new_content = transform_content(old_content, "2.7.0") assert new_content[: len(expected_beginning)] == expected_beginning def test_update_content_major(caplog): caplog.set_level(logging.DEBUG) old_content = """ =================== astroid's ChangeLog =================== What's New in astroid 3.0.0? ============================ Release date: TBA What's New in astroid 2.7.1? ============================ Release date: 2020-04-03 What's New in astroid 2.7.0? 
============================ Release date: 2020-04-01 """ expected_beginning = """ =================== astroid's ChangeLog =================== What's New in astroid 3.1.0? ============================ Release date: TBA What's New in astroid 3.0.1? ============================ Release date: TBA What's New in astroid 3.0.0? ============================ Release date: 20""" new_content = transform_content(old_content, "3.0.0") assert new_content[: len(expected_beginning)] == expected_beginning astroid-3.2.2/script/.contributors_aliases.json0000664000175000017500000001245314622475517021626 0ustar epsilonepsilon{ "13665637+DanielNoord@users.noreply.github.com": { "mails": ["13665637+DanielNoord@users.noreply.github.com"], "name": "Daniël van Noord", "team": "Maintainers" }, "15907922+kasium@users.noreply.github.com": { "mails": ["15907922+kasium@users.noreply.github.com"], "name": "Kai Mueller" }, "30130371+cdce8p@users.noreply.github.com": { "mails": ["30130371+cdce8p@users.noreply.github.com"], "name": "Marc Mueller", "team": "Maintainers" }, "31762852+mbyrnepr2@users.noreply.github.com": { "mails": ["31762852+mbyrnepr2@users.noreply.github.com", "mbyrnepr2@gmail.com"], "name": "Mark Byrne", "team": "Maintainers" }, "55152140+jayaddison@users.noreply.github.com": { "mails": ["55152140+jayaddison@users.noreply.github.com", "jay@jp-hosting.net"], "name": "James Addison" }, "adam.grant.hendry@gmail.com": { "mails": ["adam.grant.hendry@gmail.com"], "name": "Adam Hendry" }, "androwiiid@gmail.com": { "mails": ["androwiiid@gmail.com"], "name": "Paligot Gérard" }, "antonio@zoftko.com": { "mails": ["antonio@zoftko.com", "antonioglez-23@hotmail.com"], "name": "Antonio" }, "areveny@protonmail.com": { "mails": ["areveny@protonmail.com", "self@areveny.com"], "name": "Areveny", "team": "Maintainers" }, "ashley@awhetter.co.uk": { "mails": [ "ashley@awhetter.co.uk", "awhetter.2011@my.bristol.ac.uk", "asw@dneg.com", "AWhetter@users.noreply.github.com" ], "name": "Ashley Whetter", 
"team": "Maintainers" }, "bot@noreply.github.com": { "mails": [ "66853113+pre-commit-ci[bot]@users.noreply.github.com", "49699333+dependabot[bot]@users.noreply.github.com", "41898282+github-actions[bot]@users.noreply.github.com" ], "name": "bot" }, "bryce.paul.guinta@gmail.com": { "mails": ["bryce.paul.guinta@gmail.com", "bryce.guinta@protonmail.com"], "name": "Bryce Guinta", "team": "Maintainers" }, "calen.pennington@gmail.com": { "mails": ["cale@edx.org", "calen.pennington@gmail.com"], "name": "Calen Pennington" }, "ceridwenv@gmail.com": { "mails": ["ceridwenv@gmail.com"], "name": "Ceridwen", "team": "Maintainers" }, "dmand@yandex.ru": { "mails": ["dmand@yandex.ru"], "name": "Dimitri Prybysh", "team": "Maintainers" }, "github@euresti.com": { "mails": ["david@dropbox.com", "github@euresti.com"], "name": "David Euresti" }, "guillaume.peillex@gmail.com": { "mails": ["guillaume.peillex@gmail.com"], "name": "Hippo91", "team": "Maintainers" }, "hugovk@users.noreply.github.com": { "mails": ["hugovk@users.noreply.github.com"], "name": "Hugo van Kemenade" }, "jacob@bogdanov.dev": { "mails": ["jacob@bogdanov.dev", "jbogdanov@128technology.com"], "name": "Jacob Bogdanov" }, "jacobtylerwalls@gmail.com": { "mails": ["jacobtylerwalls@gmail.com"], "name": "Jacob Walls", "team": "Maintainers" }, "joshdcannon@gmail.com": { "mails": ["joshdcannon@gmail.com", "joshua.cannon@ni.com"], "name": "Joshua Cannon" }, "kavinsingh@hotmail.com": { "mails": ["kavin.singh@mail.utoronto.ca", "kavinsingh@hotmail.com"], "name": "Kavins Singh" }, "keichi.t@me.com": { "mails": ["hello@keichi.dev", "keichi.t@me.com"], "name": "Keichi Takahashi" }, "mariocj89@gmail.com": { "mails": ["mcorcherojim@bloomberg.net", "mariocj89@gmail.com"], "name": "Mario Corchero" }, "me@the-compiler.org": { "mails": ["me@the-compiler.org"], "name": "Florian Bruhin", "team": "Maintainers" }, "michael-k@users.noreply.github.com": { "mails": ["michael-k@users.noreply.github.com"], "name": "Michael K" }, 
"moylop260@vauxoo.com": { "mails": ["moylop260@vauxoo.com"], "name": "Moises Lopez" }, "no-reply@google.com": { "mails": [ "nathaniel@google.com", "mbp@google.com", "balparda@google.com", "dlindquist@google.com" ], "name": "Google, Inc." }, "pcmanticore@gmail.com": { "mails": ["cpopa@cloudbasesolutions.com", "pcmanticore@gmail.com"], "name": "Claudiu Popa", "team": "Ex-maintainers" }, "pierre.sassoulas@gmail.com": { "mails": ["pierre.sassoulas@gmail.com", "pierre.sassoulas@cea.fr"], "name": "Pierre Sassoulas", "team": "Maintainers" }, "raphael@makeleaps.com": { "mails": ["raphael@rtpg.co", "raphael@makeleaps.com"], "name": "Raphael Gaschignard" }, "rogalski.91@gmail.com": { "mails": ["rogalski.91@gmail.com"], "name": "Łukasz Rogalski", "team": "Maintainers" }, "shlomme@gmail.com": { "mails": ["shlomme@gmail.com", "tmarek@google.com"], "name": "Torsten Marek", "team": "Ex-maintainers" }, "stefan@sofa-rockers.org": { "mails": ["stefan.scherfke@energymeteo.de", "stefan@sofa-rockers.org"], "name": "Stefan Scherfke" }, "thenault@gmail.com": { "mails": ["thenault@gmail.com", "sylvain.thenault@logilab.fr"], "name": "Sylvain Thénault", "team": "Ex-maintainers" }, "tushar.sadhwani000@gmail.com": { "mails": [ "tushar.sadhwani000@gmail.com", "86737547+tushar-deepsource@users.noreply.github.com" ], "name": "Tushar Sadhwani" }, "ville.skytta@iki.fi": { "mails": ["ville.skytta@iki.fi", "ville.skytta@upcloud.com"], "name": "Ville Skyttä" } } astroid-3.2.2/script/copyright.txt0000664000175000017500000000035614622475517017167 0ustar epsilonepsilon# Licensed under the LGPL: https://www.gnu.org/licenses/old-licenses/lgpl-2.1.en.html # For details: https://github.com/pylint-dev/astroid/blob/main/LICENSE # Copyright (c) https://github.com/pylint-dev/astroid/blob/main/CONTRIBUTORS.txt astroid-3.2.2/script/bump_changelog.py0000664000175000017500000001364114622475517017743 0ustar epsilonepsilon# Licensed under the LGPL: https://www.gnu.org/licenses/old-licenses/lgpl-2.1.en.html # For 
details: https://github.com/pylint-dev/astroid/blob/main/LICENSE
# Copyright (c) https://github.com/pylint-dev/astroid/blob/main/CONTRIBUTORS.txt

"""This script upgrades the changelog in astroid or pylint when releasing a
version.
"""

# pylint: disable=logging-fstring-interpolation

from __future__ import annotations

import argparse
import enum
import logging
from datetime import datetime
from pathlib import Path

DEFAULT_CHANGELOG_PATH = Path("ChangeLog")
RELEASE_DATE_TEXT = "Release date: TBA"
WHATS_NEW_TEXT = "What's New in astroid"
TODAY = datetime.now()
FULL_WHATS_NEW_TEXT = WHATS_NEW_TEXT + " {version}?"
NEW_RELEASE_DATE_MESSAGE = f"Release date: {TODAY.strftime('%Y-%m-%d')}"


def main() -> None:
    parser = argparse.ArgumentParser(__doc__)
    parser.add_argument("version", help="The version we want to release")
    parser.add_argument(
        "-v",
        "--verbose",
        action="store_true",
        default=False,
        help="Enable verbose (debug) logging",
    )
    args = parser.parse_args()
    if args.verbose:
        logging.basicConfig(level=logging.DEBUG)
    logging.debug(f"Launching bump_changelog with args: {args}")
    # Pre-releases (dev, alpha, beta) do not get a changelog update.
    if any(s in args.version for s in ("dev", "a", "b")):
        return
    with open(DEFAULT_CHANGELOG_PATH, encoding="utf-8") as f:
        content = f.read()
    content = transform_content(content, args.version)
    with open(DEFAULT_CHANGELOG_PATH, "w", encoding="utf-8") as f:
        f.write(content)


class VersionType(enum.Enum):
    MAJOR = 0
    MINOR = 1
    PATCH = 2


def get_next_version(version: str, version_type: VersionType) -> str:
    new_version = version.split(".")
    part_to_increase = new_version[version_type.value]
    if "-" in part_to_increase:
        part_to_increase = part_to_increase.split("-")[0]
    for i in range(version_type.value, 3):
        new_version[i] = "0"
    new_version[version_type.value] = str(int(part_to_increase) + 1)
    return ".".join(new_version)


def get_next_versions(version: str, version_type: VersionType) -> list[str]:
    if version_type == VersionType.PATCH:
        # "2.6.1" => ["2.6.2"]
        return [get_next_version(version, VersionType.PATCH)]
    if version_type == VersionType.MINOR:
        # "2.6.0" => ["2.7.0", "2.6.1"]
        assert version.endswith(".0"), f"{version} does not look like a minor version"
    else:
        # "3.0.0" => ["3.1.0", "3.0.1"]
        assert version.endswith(".0.0"), f"{version} does not look like a major version"
    next_minor_version = get_next_version(version, VersionType.MINOR)
    next_patch_version = get_next_version(version, VersionType.PATCH)
    logging.debug(f"Getting the new version for {version} - {version_type.name}")
    return [next_minor_version, next_patch_version]


def get_version_type(version: str) -> VersionType:
    if version.endswith(".0.0"):
        version_type = VersionType.MAJOR
    elif version.endswith(".0"):
        version_type = VersionType.MINOR
    else:
        version_type = VersionType.PATCH
    return version_type


def get_whats_new(
    version: str, add_date: bool = False, change_date: bool = False
) -> str:
    whats_new_text = FULL_WHATS_NEW_TEXT.format(version=version)
    result = [whats_new_text, "=" * len(whats_new_text)]
    if add_date and change_date:
        result += [NEW_RELEASE_DATE_MESSAGE]
    elif add_date:
        result += [RELEASE_DATE_TEXT]
    elif change_date:
        raise ValueError("Can't use change_date=True with add_date=False")
    logging.debug(
        f"version='{version}', add_date='{add_date}', change_date='{change_date}': {result}"
    )
    return "\n".join(result)


def get_all_whats_new(version: str, version_type: VersionType) -> str:
    result = ""
    for version_ in get_next_versions(version, version_type=version_type):
        result += get_whats_new(version_, add_date=True) + "\n" * 4
    return result


def transform_content(content: str, version: str) -> str:
    version_type = get_version_type(version)
    next_version = get_next_version(version, version_type)
    old_date = get_whats_new(version, add_date=True)
    new_date = get_whats_new(version, add_date=True, change_date=True)
    next_version_with_date = get_all_whats_new(version, version_type)
    do_checks(content, next_version, version, version_type)
    index = content.find(old_date)
    logging.debug(f"Replacing\n'{old_date}'\nby\n'{new_date}'\n")
    content = content.replace(old_date, new_date)
    end_content = content[index:]
    content = content[:index]
    logging.debug(f"Adding:\n'{next_version_with_date}'\n")
    content += next_version_with_date + end_content
    return content


def do_checks(content, next_version, version, version_type):
    err = "in the changelog, fix that first!"
    NEW_VERSION_ERROR_MSG = (
        "The text for this version '{version}' does not exist %s" % err
    )
    NEXT_VERSION_ERROR_MSG = (
        "The text for the next version '{version}' already exists %s" % err
    )
    wn_next_version = get_whats_new(next_version)
    wn_this_version = get_whats_new(version)
    # There is only one field where the release date is TBA
    if version_type in [VersionType.MAJOR, VersionType.MINOR]:
        assert (
            content.count(RELEASE_DATE_TEXT) <= 1
        ), f"There should be only one release date 'TBA' ({version}) {err}"
    else:
        next_minor_version = get_next_version(version, VersionType.MINOR)
        assert (
            content.count(RELEASE_DATE_TEXT) <= 2
        ), f"There should be only two release dates 'TBA' ({version} and {next_minor_version}) {err}"
    # There is already a release note for the version we want to release
    assert content.count(wn_this_version) == 1, NEW_VERSION_ERROR_MSG.format(
        version=version
    )
    # There are no release notes yet for the next version
    assert content.count(wn_next_version) == 0, NEXT_VERSION_ERROR_MSG.format(
        version=next_version
    )


if __name__ == "__main__":
    main()

astroid-3.2.2/tox.ini

[tox]
envlist = py{38,39,310,311,312}
skip_missing_interpreters = true
isolated_build = true

[testenv]
deps =
    -r requirements_full.txt
commands = pytest --cov {posargs}

[testenv:formatting]
deps =
    -r requirements_dev.txt
commands = pre-commit run --all-files

[testenv:docs]
skipsdist = True
usedevelop = True
changedir = doc/
deps =
    -r doc/requirements.txt
commands = sphinx-build -E -b html . build

astroid-3.2.2/requirements_dev.txt

-r requirements_minimal.txt

# Tools used during development, prefer running these with pre-commit
black
pre-commit
pylint
mypy
ruff

astroid-3.2.2/.coveragerc

[run]
relative_files = true

[report]
omit =
    */tests/*
    */tmp*/*
exclude_lines =
    # Re-enable default pragma
    pragma: no cover
    # Debug-only code
    def __repr__
    # Type checking code not executed during pytest runs
    if TYPE_CHECKING:
    @overload
    # Abstract methods
    raise NotImplementedError

astroid-3.2.2/requirements_minimal.txt

# Tools used when releasing
contributors-txt>=0.7.4
tbump~=6.11

# Tools used to run tests
coverage~=7.5
pytest
pytest-cov~=5.0

astroid-3.2.2/CONTRIBUTORS.txt

# This file is autocompleted by 'contributors-txt',
# using the configuration in 'script/.contributors_aliases.json'.
# Do not add new persons manually; only add information on lines that
# do not start with '-'.
# Please verify that your changes are stable if you modify manually.
Ex-maintainers
--------------

- Claudiu Popa
- Sylvain Thénault
- Torsten Marek


Maintainers
-----------

- Pierre Sassoulas
- Daniël van Noord <13665637+DanielNoord@users.noreply.github.com>
- Hippo91
- Marc Mueller <30130371+cdce8p@users.noreply.github.com>
- Jacob Walls
- Bryce Guinta
- Ceridwen
- Mark Byrne <31762852+mbyrnepr2@users.noreply.github.com>
- Łukasz Rogalski
- Florian Bruhin
- Ashley Whetter
- Dimitri Prybysh
- Areveny


Contributors
------------

- Emile Anclin
- Nick Drozd
- Andrew Haigh
- Julien Cristau
- David Liu
- Alexandre Fayolle
- Eevee (Alex Munroe)
- David Gilman
- Tushar Sadhwani
- Julien Jehannet
- Calen Pennington
- Hugo van Kemenade
- Tim Martin
- Phil Schaf
- Alex Hall
- Raphael Gaschignard
- Radosław Ganczarek
- Paligot Gérard
- Ioana Tagirta
- Derek Gustafson
- David Shea
- Daniel Harding
- Christian Clauss
- Ville Skyttä
- Rene Zhang
- Philip Lorenz
- Nicolas Chauvat
- Michael K
- Mario Corchero
- Marien Zwart
- Laura Médioni
- James Addison <55152140+jayaddison@users.noreply.github.com>
- FELD Boris
- Enji Cooper
- Dani Alcala <112832187+clavedeluna@users.noreply.github.com>
- Antonio
- Adrien Di Mascio
- tristanlatr <19967168+tristanlatr@users.noreply.github.com>
- emile@crater.logilab.fr
- doranid
- brendanator
- Tomas Gavenciak
- Tim Paine
- Thomas Hisch
- Stefan Scherfke
- Sergei Lebedev <185856+superbobry@users.noreply.github.com>
- Saugat Pachhai (सौगात)
- Ram Rachum
- Pierre-Yves David
- Peter Pentchev
- Peter Kolbus
- Omer Katz
- Moises Lopez
- Michal Vasilek
- Keichi Takahashi
- Kavins Singh
- Karthikeyan Singaravelan
- Joshua Cannon
- John Vandenberg
- Jacob Bogdanov
- Google, Inc.
- David Euresti
- David Douard
- David Cain
- Anthony Truchet
- Anthony Sottile
- Alexander Shadchin
- wgehalo
- rr-
- raylu
- plucury
- ostr00000
- noah-weingarden <33741795+noah-weingarden@users.noreply.github.com>
- nathannaveen <42319948+nathannaveen@users.noreply.github.com>
- mathieui
- markmcclain
- ioanatia
- grayjk
- alm
- adam-grant-hendry <59346180+adam-grant-hendry@users.noreply.github.com>
- Zbigniew Jędrzejewski-Szmek
- Zac Hatfield-Dodds
- Vilnis Termanis
- Valentin Valls
- Uilian Ries
- Tomas Novak
- Thirumal Venkat
- SupImDos <62866982+SupImDos@users.noreply.github.com>
- Stanislav Levin
- Simon Hewitt
- Serhiy Storchaka
- Roy Wright
- Robin Jarry
- René Fritze <47802+renefritze@users.noreply.github.com>
- Redoubts
- Philipp Hörist
- Peter de Blanc
- Peter Talley
- Ovidiu Sabou
- Nicolas Noirbent
- Neil Girdhar
- Miro Hrončok
- Michał Masłowski
- Mateusz Bysiek
- Marcelo Trylesinski
- Leandro T. C. Melo
- Konrad Weihmann
- Kian Meng, Ang
- Kai Mueller <15907922+kasium@users.noreply.github.com>
- Jörg Thalheim
- Josef Kemetmüller
- Jonathan Striebel
- John Belmonte
- Jeff Widman
- Jeff Quast
- Jarrad Hope
- Jared Garst
- Jakub Wilk
- Iva Miholic
- Ionel Maries Cristian
- HoverHell
- HQupgradeHQ <18361586+HQupgradeHQ@users.noreply.github.com>
- Grygorii Iermolenko
- Gregory P. Smith
- Giuseppe Scrivano
- Frédéric Chapoton
- Francis Charette Migneault
- Felix Mölder
- Federico Bond
- DudeNr33 <3929834+DudeNr33@users.noreply.github.com>
- Dmitry Shachnev
- Denis Laxalde
- Deepyaman Datta
- David Poirier
- Dave Hirschfeld
- Dave Baum
- Daniel Martin
- Daniel Colascione
- Damien Baty
- Craig Franklin
- Colin Kennedy
- Cole Robinson
- Christoph Reiter
- Chris Philip
- BioGeek
- Bianca Power <30207144+biancapower@users.noreply.github.com>
- Benjamin Elven <25181435+S3ntinelX@users.noreply.github.com>
- Ben Elliston
- Becker Awqatty
- Batuhan Taskaya
- BasPH
- Azeem Bande-Ali
- Avram Lubkin
- Aru Sahni
- Artsiom Kaval
- Anubhav <35621759+anubh-v@users.noreply.github.com>
- Antoine Boellinger
- Alphadelta14
- Alexander Scheel
- Alexander Presnyakov
- Ahmed Azzaoui
- JulianJvn <128477611+JulianJvn@users.noreply.github.com>
- Gwyn Ciesla


Co-Author
---------

The following persons were credited manually but did not commit
themselves under this name, or we did not manage to find their
commits in the history.

- François Mockers
- platings
- carl
- alain lefroy
- Mark Gius
- Jérome Perrin
- Jamie Scott

astroid-3.2.2/.gitattributes

# See http://git-scm.com/docs/gitattributes#_end_of_line_conversion
* text=auto

astroid-3.2.2/tbump.toml

github_url = "https://github.com/pylint-dev/astroid"

[version]
current = "3.2.2"
regex = '''
^(?P<major>0|[1-9]\d*)
\.
(?P<minor>0|[1-9]\d*)
\.
(?P<patch>0|[1-9]\d*)
(?:-?(?P<prerelease>(?:0|[1-9]\d*|\d*[a-zA-Z-][0-9a-zA-Z-]*)(?:\.(?:0|[1-9]\d*|\d*[a-zA-Z-][0-9a-zA-Z-]*))*))?(?:\+(?P<buildmetadata>[0-9a-zA-Z-]+(?:\.[0-9a-zA-Z-]+)*))?$
'''

[git]
message_template = "Bump astroid to {new_version}, update changelog"
tag_template = "v{new_version}"

# For each file to patch, add a [[file]] config
# section containing the path of the file, relative to the
# tbump.toml location.
[[file]]
src = "astroid/__pkginfo__.py"

# You can specify a list of commands to
# run after the files have been patched
# and before the git commit is made

[[before_commit]]
name = "Upgrade the changelog"
cmd = "python3 script/bump_changelog.py {new_version}"

[[before_commit]]
name = "Normalize the contributors-txt configuration"
cmd = "contributors-txt-normalize-configuration -a script/.contributors_aliases.json -o script/.contributors_aliases.json"

[[before_commit]]
name = "Upgrade the contributors list"
cmd = "python3 script/create_contributor_list.py"

[[before_commit]]
name = "Apply pre-commit"
cmd = "pre-commit run --all-files||echo 'Hack so this command does not fail'"

[[before_commit]]
name = "Confirm changes"
cmd = "read -p 'Continue (y)? ' -n 1 -r; echo; [[ ! $REPLY =~ ^[Yy]$ ]] && exit 1 || exit 0"

# Or run some commands after the git tag and the branch
# have been pushed:
#  [[after_push]]
#  name = "publish"
#  cmd = "./publish.sh"
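The bump logic in the changelog script can be sanity-checked in isolation. Below is a minimal standalone re-sketch (not the script itself) mirroring the semantics of its `get_next_version`: zero out every component after the one being bumped, then increment that component, ignoring any `-dev`-style suffix:

```python
import enum


class VersionType(enum.Enum):
    """Index of the version component each bump type increments."""

    MAJOR = 0
    MINOR = 1
    PATCH = 2


def get_next_version(version: str, version_type: VersionType) -> str:
    # Split "X.Y.Z" into components; drop any "-suffix" on the bumped part.
    parts = version.split(".")
    to_increase = parts[version_type.value].split("-")[0]
    # Reset the bumped component and everything after it to zero...
    for i in range(version_type.value, 3):
        parts[i] = "0"
    # ...then increment the bumped component.
    parts[version_type.value] = str(int(to_increase) + 1)
    return ".".join(parts)


# Patch bump: only the patch component moves.
assert get_next_version("2.6.1", VersionType.PATCH) == "2.6.2"
# Minor bump: patch resets to zero.
assert get_next_version("2.6.3", VersionType.MINOR) == "2.7.0"
# Major bump: minor and patch reset to zero.
assert get_next_version("2.6.3", VersionType.MAJOR) == "3.0.0"
```

This is why the script's `get_next_versions` can produce two follow-up entries for a minor or major release ("2.6.0" → ["2.7.0", "2.6.1"]): the next minor and next patch lines both need "Release date: TBA" stubs in the changelog.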
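The multi-line `[version].regex` in tbump.toml is essentially the semver.org reference pattern; tbump compiles it in verbose mode, so the line breaks inside the triple-quoted TOML string are ignored. A quick check of how it splits a release and a pre-release string, assuming the standard semver group names (`major`/`minor`/`patch`/`prerelease`/`buildmetadata`):

```python
import re

# The semver.org reference pattern; re.VERBOSE lets it span multiple
# lines, matching how tbump treats the TOML triple-quoted string.
SEMVER = re.compile(
    r"""
    ^(?P<major>0|[1-9]\d*)
    \.
    (?P<minor>0|[1-9]\d*)
    \.
    (?P<patch>0|[1-9]\d*)
    (?:-?(?P<prerelease>(?:0|[1-9]\d*|\d*[a-zA-Z-][0-9a-zA-Z-]*)
        (?:\.(?:0|[1-9]\d*|\d*[a-zA-Z-][0-9a-zA-Z-]*))*))?
    (?:\+(?P<buildmetadata>[0-9a-zA-Z-]+(?:\.[0-9a-zA-Z-]+)*))?$
    """,
    re.VERBOSE,
)

# A plain release: the three numeric components, no pre-release part.
release = SEMVER.match("3.2.2")
assert release is not None
assert release.group("major", "minor", "patch") == ("3", "2", "2")
assert release.group("prerelease") is None

# A dev pre-release such as "3.3.0-dev0" also matches; the optional
# hyphen feeds the suffix into the 'prerelease' group.
dev = SEMVER.match("3.3.0-dev0")
assert dev is not None
assert dev.group("prerelease") == "dev0"
```

Note how this interacts with bump_changelog.py's early return: a `dev`, `a`, or `b` version is a valid tbump version per this regex, but the script deliberately skips the changelog rewrite for it.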